CN117464725A - Target coordinate acquisition method and system - Google Patents


Info

Publication number
CN117464725A
Authority
CN
China
Prior art keywords
target object
camera
target
coordinates
point
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210863217.XA
Other languages
Chinese (zh)
Inventor
李�杰
谭康强
Current Assignee
Zhongshan Longsys Electronics Co ltd
Original Assignee
Zhongshan Longsys Electronics Co ltd
Priority date
Filing date
Publication date
Application filed by Zhongshan Longsys Electronics Co ltd filed Critical Zhongshan Longsys Electronics Co ltd
Priority to CN202210863217.XA priority Critical patent/CN117464725A/en
Publication of CN117464725A publication Critical patent/CN117464725A/en
Pending legal-status Critical Current

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B25: HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J: MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J19/00: Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
    • B25J19/02: Sensing devices
    • B25J19/04: Viewing devices
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B25: HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J: MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00: Programme-controlled manipulators
    • B25J9/16: Programme controls
    • B25J9/1612: Programme controls characterised by the hand, wrist, grip control
    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P: CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00: Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/02: Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Orthopedic Medicine & Surgery (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

An embodiment of the application provides a target coordinate acquisition method and system. The target coordinate acquisition method comprises the following steps: moving a camera to a target object and aligning the center point of the camera with a preset point of the target object; acquiring the current coordinates of the camera; acquiring the offset between the center point of the camera and the position of the suction head; and obtaining the target coordinates of the target object from the current coordinates of the camera and the offset, so that the suction head can move to the target object according to the target coordinates. By aligning the center point of the camera with the preset point of the target object to obtain the current coordinates, and deriving the target coordinates from the current coordinates and the offset, the method completes the adjustment and calibration of the target object's coordinates; the position of the target object can then be located accurately through the target coordinates, ensuring smooth picking and placing of materials.

Description

Target coordinate acquisition method and system
Technical Field
The application relates to the field of vision measurement positioning, in particular to a target coordinate acquisition method and system.
Background
During chip production, chips must be placed into the slots of a tray, and a manipulator on an automated machine picks and places the chips. For the suction head on the manipulator to pick and place chips accurately, the exact position of each slot must be debugged and calibrated. Equipment currently on the market is debugged by acquiring or correcting slot coordinates through manual visual alignment. However, this approach is inefficient and often requires multiple corrections before a suitable coordinate is reached.
Disclosure of Invention
In view of the foregoing, it is desirable to provide a target coordinate acquiring method and system capable of improving the efficiency of acquiring target coordinates.
A first aspect of the present application provides a target coordinate acquisition method, including:
moving a camera to a target object, and enabling a center point of the camera to be aligned with a preset point of the target object;
acquiring the current coordinates of the camera;
acquiring the offset of the center point of the camera and the position of the suction head; and
and obtaining target coordinates of the target object according to the current coordinates of the camera and the offset, so that the suction head can move to the target object according to the target coordinates.
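For illustration, the four steps of the first aspect can be sketched in Python. The `machine` object and all method names below are hypothetical placeholders, not an API defined in this application:

```python
def acquire_target_coordinates(machine):
    """Sketch of the claimed method; `machine` is a hypothetical controller
    exposing camera/suction-head primitives (names are illustrative only)."""
    machine.move_camera_to_target()          # step 1: align camera centre with preset point
    cx, cy = machine.camera_coordinates()    # step 2: current coordinates of the camera
    dx, dy = machine.camera_to_tip_offset()  # step 3: offset of camera centre vs. suction head
    return (cx + dx, cy + dy)                # step 4: target coordinates for the suction head
```

The suction head can then be driven to the returned coordinates to pick or place material.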
According to a specific embodiment of the first aspect of the present application, the moving the camera to the target object, and aligning the center point of the camera with the preset point of the target object includes: moving the camera to the target object and acquiring an actual image of the target object; comparing the actual image with a preset image to determine the position of the preset point of the target object; and moving the camera according to the position of the preset point, and enabling the center point of the camera to be aligned with the preset point of the target object.
According to a specific embodiment of the first aspect of the present application, the preset point is a central position point of the target object; the suction head is movable to a central position point of the target object according to the target coordinates.
According to a specific embodiment of the first aspect of the present application, the target object has a first position and a second position, and the first position and the second position are located at two ends of a diagonal line of the target object; the preset point is the first position or the second position; the suction head is movable to a central position point of the target object according to the target coordinates.
According to a specific embodiment of the first aspect of the present application, the target object is one of a loading zone slot, a testing zone slot, and an unloading zone slot.
A second aspect of the present application provides an object coordinate acquisition system comprising: a camera; a suction head; the processor is used for realizing instructions and is electrically connected with the camera and the suction head; and a storage device for storing the instructions; wherein the instructions are for loading and executing by the processor: moving a camera to a target object, and enabling a center point of the camera to be aligned with a preset point of the target object; acquiring the current coordinates of the camera; acquiring the offset of the center point of the camera and the position of the suction head; and obtaining target coordinates of the target object according to the current coordinates of the camera and the offset, so that the suction head can move to the target object according to the target coordinates.
According to a specific embodiment of the second aspect of the present application, the instructions are for loading and executing by the processor: moving the camera to the target object and acquiring an actual image of the target object; comparing the actual image with a preset image to determine the position of the preset point of the target object; and moving the camera according to the position of the preset point, and enabling the center point of the camera to be aligned with the preset point of the target object.
According to a specific embodiment of the second aspect of the present application, the preset point is a central position point of the target object; the instructions are for loading and executing by the processor: and controlling the suction head to move to a central position point of the target object according to the target coordinates.
According to a specific embodiment of the second aspect of the present application, the target object has a first position and a second position, and the first position and the second position are located at two ends of a diagonal line of the target object; the preset point is the first position or the second position; the instructions are for loading and executing by the processor: and controlling the suction head to move to a central position point of the target object according to the target coordinates.
According to an embodiment of the second aspect of the present application, the target object is one of a loading zone slot, a testing zone slot, and an unloading zone slot.
Compared with the prior art, the application has at least the following beneficial effects:
the center point of the camera is aligned with a preset point of the target object to obtain the current coordinates, and the target coordinates are obtained from the current coordinates and the camera offset. The debugging and calibration of the target object's coordinates are thereby completed, the position of the target object can be located accurately through the target coordinates, and smooth picking and placing of materials is ensured. Compared with manual visual alignment, this target coordinate acquisition method completes the adjustment and calibration of the target coordinates of the pick-and-place points efficiently, quickly, and accurately; it is simple to operate, takes effect immediately, lowers the skill required for adjustment, and avoids the risk of the manipulator crashing because of incorrectly entered coordinates.
Drawings
Fig. 1 is a flowchart of a target coordinate acquiring method according to an embodiment of the present application.
FIG. 2 is a schematic diagram of a slot and a chip according to an embodiment of the present application.
Fig. 3 is a schematic view of a manipulator, a camera and a suction head according to an embodiment of the present application.
Fig. 4A and 4B are schematic diagrams of a corner of a target object according to an embodiment of the present application.
Fig. 5 is another flow chart of the target coordinate acquiring method according to an embodiment of the present application.
Fig. 6 is another flow chart of the target coordinate acquiring method according to an embodiment of the present application.
Fig. 7 is a schematic block diagram of a target coordinate acquisition system according to an embodiment of the present application.
The following detailed description will further illustrate the application in conjunction with the above-described figures.
Description of the main reference signs
Target coordinate acquisition system 100
Manipulator 10
First manipulator 11
Second manipulator 12
Third manipulator 13
Camera 20
First camera 21
Second camera 22
Third camera 23
Suction head 30
First suction head 31
Second suction head 32
Third suction head 33
Shuttle 40
Processor 50
Storage device 60
Target object 201
Chip 202
Detailed Description
In order that the above objects, features and advantages of the present application may be more clearly understood, the application is described in more detail below with reference to the accompanying drawings and specific embodiments. Embodiments of the present application, and features of those embodiments, may be combined with each other where no conflict arises. In the following description, numerous specific details are set forth to provide a thorough understanding of the present application; the described embodiments are merely some, rather than all, of the embodiments of the present application.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs. The terminology used herein in the description of the application is for the purpose of describing particular embodiments only and is not intended to be limiting of the application. The term "and/or" as used herein includes all and any combination of one or more of the associated listed items.
In various embodiments of the present application, for ease of description and not limitation, the term "coupled" as used in the specification and claims of the present application is not limited to physical or mechanical coupling, whether direct or indirect. "upper", "lower", "left", "right", etc. are used merely to indicate relative positional relationships, which change accordingly when the absolute position of the object to be described changes.
Referring to fig. 1, an embodiment of the present application provides a target coordinate acquiring method, which includes the following steps:
step S11, the camera is moved to the target object, and the center point of the camera is aligned with the preset point of the target object.
Referring to fig. 2, in the present embodiment, a machine is disposed near the target object 201. The machine is connected to a manipulator and can control it, and a camera is disposed on the manipulator; the machine moves the manipulator so as to bring the camera close to the target object 201. The camera can recognize the target object 201 through a preset image, and the manipulator is moved so that the center point (or cross target) of the camera's viewfinder is aligned with the preset point A on the target object 201.
Specifically, this step may include moving the camera to the target object and acquiring an actual image of the target object; comparing the actual image with a preset image to determine the position of a preset point of the target object; and moving the camera according to the position of the preset point, and enabling the center point of the camera to be aligned with the preset point of the target object.
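As a concrete sketch of the comparison step, template matching can locate the preset image inside the actual image. The brute-force sum-of-squared-differences search below is an assumption for illustration only; the patent does not specify the matching algorithm, and the function name is invented:

```python
import numpy as np

def find_preset_point(actual: np.ndarray, template: np.ndarray) -> tuple:
    """Return the (row, col) pixel position of the preset point, taken here as
    the centre of the best-matching placement of `template` inside `actual`
    (minimum sum of squared differences over all placements)."""
    H, W = actual.shape
    h, w = template.shape
    best_ssd, best_pos = float("inf"), (0, 0)
    for y in range(H - h + 1):
        for x in range(W - w + 1):
            ssd = float(np.sum((actual[y:y + h, x:x + w] - template) ** 2))
            if ssd < best_ssd:
                best_ssd, best_pos = ssd, (y, x)
    y, x = best_pos
    return (y + h // 2, x + w // 2)  # centre of the matched region
```

The camera would then be moved by the pixel offset between this point and the viewfinder centre until the two coincide.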
In one possible implementation, the preset point A is the center position point of the target object 201. In another possible implementation, the preset point may be any point on the target object 201, for example a point on an edge of the target object 201, or a corner point of the target object 201.
In one possible implementation, the target object 201 is a slot in which a chip 202 (i.e., material) is placed. The slot may be, for example, a loading area slot of a tray disposed in the loading area, a testing area slot of a test fixture disposed in the testing area, an unloading area slot of a tray disposed in the unloading area, or a shuttle slot disposed in a shuttle. The test fixture is used to perform functional tests or burn-in tests on the chip 202, and the shuttle is used to transport the chip 202.
It can be appreciated that the tray in the loading area, the test fixture in the testing area, the tray in the unloading area, or the shuttle are all provided with a plurality of corresponding slots for placing the chips 202.
In one possible implementation, the camera in this step is a charge coupled device (Charge Coupled Device, CCD) camera.
Step S12, the current coordinates of the camera are acquired.
In one possible implementation, the current coordinates of the camera can be recorded by clicking the modify button in the corresponding column of the console interface.
It will be appreciated that, in one possible implementation, the coordinates are defined in an XY coordinate system established on the surface of the tray, the test fixture, or the shuttle (see fig. 2).
Step S13, obtaining the offset of the center point of the camera and the position of the suction head.
In one possible implementation, the offset is obtained by calculating the relative distance from the center point of the camera's viewfinder (or cross target) to the center of the suction head.
It is understood that the offset is a coordinate difference, and may be, for example, a coordinate offset on the X axis and a coordinate offset on the Y axis.
Referring to fig. 3, in one possible implementation, a plurality of suction heads are disposed on the manipulator. For example, as shown in fig. 3, the manipulator carries eight suction heads at once, which can simultaneously pick up eight chips 202 from eight slots of a tray (e.g., a loading area tray).
Step S14, obtaining the target coordinates of the target object according to the current coordinates and the offset of the camera.
It will be appreciated that by combining the current coordinates with the offset obtained in step S13, the actual position coordinates of the preset point A, i.e. the target coordinates, can be obtained.
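Numerically, steps S13 and S14 amount to a per-axis coordinate difference followed by a per-axis addition. The helper functions and the sign convention below are illustrative assumptions, not definitions from the patent:

```python
def camera_tip_offset(camera_center_xy, tip_center_xy):
    """Step S13 (sketch): the offset is the per-axis coordinate difference
    between the suction head centre and the camera viewfinder centre."""
    return (tip_center_xy[0] - camera_center_xy[0],
            tip_center_xy[1] - camera_center_xy[1])

def target_coordinates(camera_current_xy, offset_xy):
    """Step S14 (sketch): target coordinates = current camera coordinates
    plus the fixed offset, per axis."""
    return (camera_current_xy[0] + offset_xy[0],
            camera_current_xy[1] + offset_xy[1])
```

For example, with the camera at (120, 80) and an offset of (-15, 3), the suction head would be sent to (105, 83).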
It will be appreciated that after the target coordinates are obtained, the suction head can be moved to the target object according to the target coordinates to pick up and place material.
It can be understood that the coordinates of each slot can be obtained by repeating steps S11 to S14. Performing steps S11 to S14 in different areas (such as the feeding area, the testing area, the discharging area, or the shuttle) yields the slot coordinates of those areas. In one possible implementation, the coordinates of the loading zone slot (the first slot coordinates), the testing zone slot (the second slot coordinates), the unloading zone slot (the third slot coordinates), and the shuttle slot (the fourth slot coordinates) can all be obtained this way.
It should be noted that, the first slot position coordinate, the second slot position coordinate, the third slot position coordinate and the fourth slot position coordinate do not refer to the coordinates of a specific slot position, but refer to the coordinates of a slot position located in a certain area. For example, the feeding area is provided with a plurality of feeding area slots, and the coordinates corresponding to each feeding area slot are the first slot coordinates. The second slot position coordinate, the third slot position coordinate and the fourth slot position coordinate are similar, and the description is omitted.
In one possible implementation, a verification test is performed after the first, second, third, and fourth slot coordinates are obtained, to ensure that chips can be picked and placed smoothly according to these coordinates.
It will be appreciated that, in one possible implementation, since the several suction heads on the manipulator can pick up chips 202 from slots simultaneously, only the target coordinates of one slot (e.g., one near the edge) need to be acquired, and the remaining slots can be located from the position of that slot. In another possible implementation, even though the several suction heads on the manipulator can pick up chips 202 simultaneously, the target coordinates of each slot still need to be acquired.
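The patent does not specify how the remaining slots are located from one calibrated slot. One common assumption, sketched below, is a regular grid of slots with a known slot-to-slot pitch; the function name and the grid model itself are illustrative, not from the patent:

```python
def slot_grid(reference_xy, pitch_xy, rows, cols):
    """Generate coordinates for every slot of a rows x cols tray from one
    calibrated reference slot, assuming a uniform pitch between slots."""
    rx, ry = reference_xy
    px, py = pitch_xy
    return [[(rx + c * px, ry + r * py) for c in range(cols)]
            for r in range(rows)]
```

With this model, calibrating a single edge slot fixes the coordinates of the whole tray.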
Referring to fig. 4A and 4B together, in one possible embodiment the viewfinder of the camera cannot accommodate the entire target object, so the center point of the target object cannot be confirmed; this occurs, for example, when the size of the target object exceeds the field of view of the lens.
In such an embodiment, the target object includes a first location B and a second location C. For example, the target object is centrosymmetric, and the first position B and the second position C are each a point of a corner of the target object. In one possible implementation, the second position C and the first position B are two symmetrical points of central symmetry, and the second position C and the first position B are also located at two ends on a diagonal line of the target object.
Referring to fig. 5, another flow chart of the target coordinate acquiring method according to the embodiment of the present application is shown, and the method includes the following steps:
in step S21, after the center point of the camera is aligned to the first position, the current coordinate of the camera is obtained as the first coordinate.
Step S22, after the center point of the camera is aligned to the second position, the current coordinate of the camera is obtained as the second coordinate.
It should be understood that steps S21 and S22 may be sub-flows of step S12 in fig. 1. That is, in one possible implementation the camera only needs to be aligned with the center point A of the target object (refer to fig. 2) to obtain the current coordinates, while in another possible implementation the camera needs to be aligned with the first position B and the second position C on the diagonal of the target object, in order to obtain the current coordinates of the first position B and of the second position C, respectively.
Step S23, obtaining the offset of the center point of the camera and the suction head position.
It is understood that this step may refer to step S13, and will not be described herein.
Step S24, obtaining a first intermediate coordinate according to the first coordinate and the offset.
Step S25, obtaining a second intermediate coordinate according to the second coordinate and the offset.
Step S26, obtaining target coordinates according to the first intermediate coordinates and the second intermediate coordinates.
It should be understood that steps S24, S25, and S26 may be sub-flows of step S14 in fig. 1. That is, in one possible implementation, the two current coordinates, namely the current coordinate of the first position B and the current coordinate of the second position C, are each combined with the offset to obtain two intermediate values (the first intermediate coordinate and the second intermediate coordinate), and the target coordinates are then obtained from these two intermediate values: the target coordinates are the coordinates of the midpoint of the line segment joining the point corresponding to the first intermediate coordinate and the point corresponding to the second intermediate coordinate.
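The midpoint computation described above can be sketched directly; the function name is illustrative:

```python
def target_from_diagonal(first_intermediate, second_intermediate):
    """Steps S24-S26 (sketch): the target coordinate is the midpoint of the
    segment joining the two intermediate coordinates obtained from the two
    diagonal corner positions of a centrosymmetric target object."""
    (x1, y1), (x2, y2) = first_intermediate, second_intermediate
    return ((x1 + x2) / 2, (y1 + y2) / 2)
```

Because the target object is centrosymmetric, this midpoint coincides with its center position point.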
Referring to fig. 6, another flowchart of the target coordinate acquiring method according to the embodiment of the present application is shown.
Step S31, the suction heads located in the feeding area pick material from the feeding area slot according to the first slot coordinates and then place it into the shuttle slot.
It can be understood that the loading area, the testing area, and the unloading area each have a corresponding region for placing a tray, and the tray has corresponding slots. For example, the tray of the feeding area has feeding area slots, the test fixture of the testing area has testing area slots, and the tray of the blanking area has blanking area slots.
In one possible implementation, the suction head located in the loading area places the material into the shuttle according to the fourth slot coordinates.
In this embodiment, the manipulator located in the feeding area is controlled to move according to the position indicated by the first slot position coordinate, so that the suction head moves above the target chip in the slot position of the feeding area, and the target chip to be tested is sucked. And then the manipulator in the feeding area is controlled to move according to the position indicated by the fourth slot position coordinate, so that the suction head moves to the upper part of the shuttle slot position, and the target chip to be tested is placed in the shuttle slot position.
Step S32, the shuttle moves to the test area.
It will be appreciated that, in one possible embodiment, the loading zone and the test zone are separated by a distance, and the shuttle can move between them to transport the loaded target chip to be tested from the loading zone to the test zone. In another possible implementation, the distance between the feeding area and the testing area is short and the shuttle does not need to move far; it only needs to turn and position itself adaptively according to the placement of the test fixture in the testing area, so that the manipulator in the testing area can operate conveniently.
Step S33, after the suction head located in the test area takes the material out of the shuttle slot, it places the material into the test area slot according to the second slot coordinates.
In one possible implementation, the suction head located in the test zone removes the material from the shuttle slot according to the fourth slot coordinate.
In this embodiment, the manipulator located in the test area is controlled to move according to the position indicated by the fourth slot position coordinate, so that the suction head moves to the position above the shuttle slot position, and the target chip to be tested is sucked. And the manipulator positioned in the test area is controlled to move according to the position indicated by the second slot position coordinate, so that the suction head moves to the upper part of the slot position of the test area, and the target chip to be tested is placed in the slot position of the test area.
Step S34, the suction head located in the test area takes the tested material out of the test area slot and places it into the shuttle slot.
In one possible implementation, the suction head located in the test zone places the material into the shuttle according to the fourth slot coordinates.
In this embodiment, the manipulator located in the test area is controlled to move according to the position indicated by the second slot position coordinate, so that the suction head moves to the position above the slot position of the test area, and the target chip after the test is sucked. The manipulator in the test area is controlled to move according to the position indicated by the fourth slot position coordinate, so that the suction head moves to the upper part of the shuttle slot position, and the target chip after the test is placed in the shuttle slot position.
Step S35, the shuttle moves to a blanking area.
Similar to step S32, in one possible embodiment the test area and the blanking area are separated by a distance, and the shuttle can move between them to transport the loaded, tested target chip from the test area to the blanking area. In another possible implementation, the distance between the test area and the blanking area is short and the shuttle does not need to move far; it only needs to turn and position itself adaptively according to the placement of the blanking area tray, so that the manipulator of the blanking area can operate conveniently.
Step S36, after the suction head located in the blanking area takes the material out of the shuttle slot, it places the material into the blanking area slot according to the third slot coordinates.
In one possible implementation, the suction head located in the blanking area takes the material out of the shuttle slot according to the fourth slot coordinate.
In this embodiment, the manipulator in the blanking area is controlled to move according to the position indicated by the fourth slot position coordinate, so that the suction head moves to the position above the shuttle slot position, and the tested target chip is sucked. And the manipulator in the blanking area is controlled to move according to the position indicated by the third slot position coordinate, so that the suction head moves to the upper part of the slot position of the blanking area, and the tested target chip is placed in the slot position of the blanking area.
Obviously, with the first, second, third, and fourth slot coordinates obtained in the above embodiments, the manipulators in the different areas can, under corresponding control, drive the suction heads to accurately pick the target chip from a slot (feeding area slot / testing area slot / blanking area slot / shuttle slot), or place a picked target chip into a slot.
Referring to fig. 7, an embodiment of the present application further provides an object coordinate acquiring system 100, where the object coordinate acquiring system 100 includes: manipulator 10, camera 20, suction head 30, shuttle 40, processor 50, and storage device 60.
The robot 10 is adapted to move under the control of the processor 50. Specifically, the manipulator 10 can accept a movement instruction including the target coordinates of the position to which the manipulator 10 needs to move.
A camera 20 is provided on the robot 10 for recognizing a target object.
The suction head 30 is disposed on the manipulator 10; driven by the manipulator 10, the suction head 30 can move toward or away from the target object and align with the center point of the target object to pick up or release it.
In one possible implementation, the manipulator 10 includes a first manipulator 11 located in the loading zone, a second manipulator 12 located in the testing zone, and a third manipulator 13 located in the unloading zone. The first manipulator 11 is provided with a first camera 21 and a first suction head 31. The second manipulator 12 is provided with a second camera 22 and a second suction head 32. The third manipulator 13 is provided with a third camera 23 and a third suction head 33.
The shuttle 40 is used to transport chips to be tested or to be tested.
The processor 50 is connected to the manipulator 10, the camera 20, the suction head 30 and the shuttle 40. The processor 50 is used to implement various instructions. For example, the manipulator 10 is controlled to move, for controlling the camera 20 to acquire images, for controlling the suction head 30 to suck or release a target object, and for controlling the shuttle 40 to move.
The storage device 60 is used for storing various instructions for loading and executing the above steps S11 to S14, steps S21 to S26, and steps S31 to S36 by the processor 50.
The present application obtains the current coordinates by aligning the center point of the camera 20 with a preset point of the target object, and obtains the target coordinates of the target object from the current coordinates and the offset of the camera 20. The debugging and calibration of the target object's coordinates are thereby completed, the position of the target object can be located accurately through the target coordinates, and smooth picking and placing of materials is ensured. Compared with manual visual alignment, this target coordinate acquisition method completes the adjustment and calibration of the target coordinates of the pick-and-place points efficiently, quickly, and accurately; it is simple to operate, takes effect immediately, lowers the skill required for adjustment, and avoids the risk of the manipulator 10 crashing because of incorrectly entered coordinates.
It will be appreciated by persons skilled in the art that the above embodiments are provided for illustration only and not to limit the application, and that appropriate modifications and variations of the above embodiments fall within the scope of the application as claimed.

Claims (10)

1. A target coordinate acquisition method, characterized in that the target coordinate acquisition method comprises:
moving a camera to a target object, and enabling a center point of the camera to be aligned with a preset point of the target object;
acquiring the current coordinates of the camera;
acquiring the offset between the center point of the camera and the position of the suction head; and
obtaining target coordinates of the target object according to the current coordinates of the camera and the offset, so that the suction head can move to the target object according to the target coordinates.
2. The target coordinate acquisition method according to claim 1, wherein moving the camera to the target object and aligning a center point of the camera to a preset point of the target object includes:
moving the camera to the target object and acquiring an actual image of the target object;
comparing the actual image with a preset image to determine the position of the preset point of the target object; and
moving the camera according to the position of the preset point, so that the center point of the camera is aligned with the preset point of the target object.
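Outside the claim language itself, the image comparison in claim 2 can be sketched as a simple template search: find where the preset image best matches the actual image, then move the camera so its center lands on that point. This is an illustrative assumption (a brute-force sum-of-squared-differences search); the patent does not specify the matching algorithm, and all names here are hypothetical.

```python
import numpy as np

def locate_preset_point(actual, template):
    """Slide the template over the actual image and return the (row, col)
    of the best match (minimum sum of squared differences)."""
    H, W = actual.shape
    h, w = template.shape
    best_ssd, best_pos = None, None
    for r in range(H - h + 1):
        for c in range(W - w + 1):
            ssd = float(np.sum((actual[r:r + h, c:c + w] - template) ** 2))
            if best_ssd is None or ssd < best_ssd:
                best_ssd, best_pos = ssd, (r, c)
    return best_pos

def alignment_move(camera_center, preset_point):
    """Camera displacement (d_row, d_col) that brings its center point
    onto the located preset point."""
    return (preset_point[0] - camera_center[0],
            preset_point[1] - camera_center[1])

# Hypothetical example: a bright 2x2 patch in an 8x8 image.
actual = np.zeros((8, 8))
actual[3:5, 4:6] = 1.0
template = actual[3:5, 4:6].copy()
pos = locate_preset_point(actual, template)   # (3, 4)
move = alignment_move((4, 4), pos)            # (-1, 0)
```

A production system would more likely use a library routine such as OpenCV's template matching, but the principle is the same.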
3. The target coordinate acquisition method according to claim 1, wherein the preset point is a center position point of the target object; the suction head is movable to a central position point of the target object according to the target coordinates.
4. The target coordinate acquisition method according to claim 1, wherein the target object has a first position and a second position, and the first position and the second position are located at both ends of a diagonal line of the target object; the preset point is the first position or the second position; the suction head is movable to a central position point of the target object according to the target coordinates.
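As an illustration of claim 4 (not part of the claim language), when the preset points are the two ends of the target object's diagonal, the center position point the suction head moves to is simply their midpoint. The function name and example values below are assumptions.

```python
def center_from_diagonal(p1, p2):
    """Center position point of the target object, taken as the midpoint
    of the two points p1 and p2 at opposite ends of its diagonal."""
    return ((p1[0] + p2[0]) / 2.0, (p1[1] + p2[1]) / 2.0)

# Hypothetical corner coordinates of a slot:
print(center_from_diagonal((10.0, 20.0), (30.0, 60.0)))  # (20.0, 40.0)
```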
5. The method of claim 1, wherein the target object is one of a loading zone slot, a testing zone slot, and an unloading zone slot.
6. An object coordinate acquisition system, characterized in that the object coordinate acquisition system comprises:
a camera;
a suction head;
a processor for executing instructions, electrically connected with the camera and the suction head; and
a storage device for storing the instructions;
wherein the instructions are for loading and executing by the processor:
moving a camera to a target object, and enabling a center point of the camera to be aligned with a preset point of the target object;
acquiring the current coordinates of the camera;
acquiring the offset between the center point of the camera and the position of the suction head; and
obtaining target coordinates of the target object according to the current coordinates of the camera and the offset, so that the suction head can move to the target object according to the target coordinates.
7. The target coordinate acquisition system of claim 6, wherein the instructions are to be loaded and executed by the processor to:
moving the camera to the target object and acquiring an actual image of the target object;
comparing the actual image with a preset image to determine the position of the preset point of the target object; and
moving the camera according to the position of the preset point, so that the center point of the camera is aligned with the preset point of the target object.
8. The target coordinate acquisition system according to claim 6, wherein the preset point is a center position point of the target object;
the instructions are for loading and executing by the processor: and controlling the suction head to move to a central position point of the target object according to the target coordinates.
9. The target coordinate acquisition system of claim 6, wherein the target object has a first location and a second location, and the first location and the second location are located at two ends of a diagonal of the target object; the preset point is the first position or the second position;
the instructions are for loading and executing by the processor: and controlling the suction head to move to a central position point of the target object according to the target coordinates.
10. The target coordinate acquisition system of claim 6, wherein the target object is one of a loading zone slot, a testing zone slot, and an unloading zone slot.
CN202210863217.XA 2022-07-21 2022-07-21 Target coordinate acquisition method and system Pending CN117464725A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210863217.XA CN117464725A (en) 2022-07-21 2022-07-21 Target coordinate acquisition method and system

Publications (1)

Publication Number Publication Date
CN117464725A true CN117464725A (en) 2024-01-30

Family

ID=89626177

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210863217.XA Pending CN117464725A (en) 2022-07-21 2022-07-21 Target coordinate acquisition method and system

Country Status (1)

Country Link
CN (1) CN117464725A (en)

Similar Documents

Publication Publication Date Title
US8135208B1 (en) Calibrated vision based robotic system utilizing upward and downward looking cameras
US8622198B2 (en) Component transferring apparatus and IC handler
US5644245A (en) Probe apparatus for inspecting electrical characteristics of a microelectronic element
KR20180120647A (en) System and method for tying together machine vision coordinate spaces in a guided assembly environment
JPH03228591A (en) Work holding device, work and storing case thereof
US20010055069A1 (en) One camera system for component to substrate registration
US7555831B2 (en) Method of validating component feeder exchanges
US11173608B2 (en) Work robot and work position correction method
US10667449B2 (en) Rotary head type component mounter
US20190133010A1 (en) Contactless electric power supply device
CN113687215B (en) Method and equipment for improving contact precision of probe and wafer test point
US7265536B2 (en) Procedure for reproduction of a calibration position of an aligned and afterwards displaced calibration substrate in a probe station
KR102461013B1 (en) Handler for testing electronic devices and method for adjusting teaching point thereof
CN114441942A (en) Flying probe testing method, system, equipment and storage medium for PCB
CN111486787A (en) Test positioning method and test positioning system
US11317551B2 (en) Component mounter
CN117464725A (en) Target coordinate acquisition method and system
CN113740571A (en) Array testing device capable of realizing automatic precise control of single probe block
US20040257101A1 (en) Probe area setting method and probe device
US11395449B2 (en) Three-dimensional mounting device and three-dimensional mounting method
CN111665250A (en) Appearance detection device and method
US11122719B2 (en) Component mounter
TWI545329B (en) An electronic component operating device, a working method, and a working device for its application
JP2005353750A (en) Maintenance and management apparatus for electronic component mounting apparatus
US20190239400A1 (en) Component determination device and component determination method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination