CN114373131A - Manipulator movement compensation method and device, computer equipment and storage medium


Info

Publication number: CN114373131A
Application number: CN202210016229.9A
Authority: CN (China)
Original language: Chinese (zh)
Inventors: 任将, 熊星
Current and original assignee: Suzhou HYC Technology Co Ltd (application filed by Suzhou HYC Technology Co Ltd)
Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis)

Classifications

    • B25J 9/16 Programme controls for programme-controlled manipulators (B: performing operations; transporting / B25: hand tools; manipulators / B25J: manipulators; chambers provided with manipulation devices)
        • B25J 9/1602 Programme controls characterised by the control system, structure, architecture
        • B25J 9/1664 Programme controls characterised by programming, planning systems for manipulators; motion, path, trajectory planning
        • B25J 9/1697 Vision controlled systems (control using sensors other than normal servo-feedback, e.g. perception control, sensor fusion)
    • G06F 17/16 Matrix or vector computation, e.g. matrix-matrix or matrix-vector multiplication, matrix factorization (G06F 17/10: complex mathematical operations)
    • G06T 7/70 Determining position or orientation of objects or cameras (G06T 7/00: image analysis)


Abstract

The disclosure relates to the technical field of visual alignment, and in particular to a manipulator movement compensation method and device, a computer device, and a storage medium. The method comprises: obtaining first image data of a target position on a sample to be tested in the image coordinate system of a first camera module, and second image data of the same target position in the image coordinate system of a second camera module; mapping the first image data into the manipulator coordinate system according to a predetermined first mapping relationship to obtain first mapping data; mapping the second image data into the manipulator coordinate system according to a predetermined second mapping relationship to obtain second mapping data; and compensating the movement coordinates of the manipulator based on the relative offset data between the first mapping data and the second mapping data. Because the relative offset between the sample to be tested and the carrier is determined in a single common coordinate system and used to compensate the manipulator's movement coordinates, the accuracy with which the sample is crimped into the carrier cavity can be improved.

Description

Manipulator movement compensation method and device, computer equipment and storage medium
Technical Field
The invention relates to the technical field of visual alignment, and in particular to a manipulator movement compensation method and device, computer equipment, and a storage medium.
Background
In large-scale automated AR (Augmented Reality) screen inspection equipment, accurately crimping a screen (DUT, device under test) into a carrier is essential for the subsequent inspection of the screen's various indexes. Before the screen is crimped into the carrier, the carrier itself must be grabbed and placed into a jig (tester), so there are two grabbing processes. Both the mechanical arm that grabs the screen and the mechanical arm that grabs the carrier introduce errors when grabbing, and because an AR screen is small, even a small position deviation may crush it. Moreover, the position of the carrier is not fixed, so the screen cannot be pressed into the carrier simply by driving the manipulator to a fixed point. How to avoid poor crimping when the grabbed screen is docked with the carrier in the jig cavity is therefore an urgent problem.
Disclosure of Invention
Accordingly, it is desirable to provide a manipulator movement compensation method and apparatus, a computer device, and a storage medium that address the problem of poor crimping when a grabbed screen is docked with the carrier in the jig cavity.
A manipulator movement compensation method comprises: acquiring first image data of a target position on a sample to be tested in the image coordinate system of a first camera module; acquiring second image data of the target position on the sample in the image coordinate system of a second camera module; mapping the first image data into the manipulator coordinate system according to a predetermined first mapping relationship to obtain first mapping data, the first mapping relationship being the mapping between the image coordinate system of the first camera module and the manipulator coordinate system; mapping the second image data into the manipulator coordinate system according to a predetermined second mapping relationship to obtain second mapping data, the second mapping relationship being the mapping between the image coordinate system of the second camera module and the manipulator coordinate system; and determining relative offset data between the first mapping data and the second mapping data, and compensating the movement coordinates of the manipulator according to the relative offset data.
In one embodiment, the process of determining the first mapping relationship and the second mapping relationship includes: acquiring the image coordinate point sets recorded when the manipulator grabs the sample to be tested and moves it to different positions in the image acquisition areas of the different camera modules, and acquiring the manipulator coordinate point set of the manipulator at those positions; and obtaining the mapping relationships between the image coordinate systems of the different camera modules and the manipulator coordinate system from the image coordinate point sets and the manipulator coordinate point set.
In one embodiment, acquiring the image coordinate point sets and the manipulator coordinate point set comprises: controlling the manipulator to grab the sample to be tested and move it to a first position in the image acquisition area of the first camera module; acquiring a first image coordinate of a test position on the sample in the first camera module; controlling the manipulator to grab the sample and move it to a second position in the image acquisition area of the second camera module; acquiring a second image coordinate of the test position in the second camera module, and acquiring the manipulator coordinate when the manipulator has grabbed the sample and moved it to the second position; and controlling the manipulator to repeat the grab-and-move operation multiple times, to different positions in the image acquisition areas of both camera modules, so as to acquire multiple sets of first image coordinates, multiple sets of second image coordinates, and multiple sets of manipulator coordinates.
In one embodiment, obtaining the mapping relationships between the image coordinate systems of the different camera modules and the manipulator coordinate system comprises: obtaining a first homography matrix, which maps the image coordinate system of the first camera module to the manipulator coordinate system, from the multiple sets of first image coordinates and the multiple sets of manipulator coordinates; and obtaining a second homography matrix, which maps the image coordinate system of the second camera module to the manipulator coordinate system, from the multiple sets of second image coordinates and the multiple sets of manipulator coordinates.
In one embodiment, after determining the relative offset data between the first mapping data and the second mapping data and compensating the movement coordinates of the manipulator according to the relative offset data, the method further includes controlling the manipulator to crimp the sample to be tested into the carrier according to the compensated movement coordinates.
A manipulator movement compensation device comprises: a manipulator for grabbing a sample to be tested and moving it to different positions; a manipulator control module connected with the manipulator and used for controlling its movement; a first camera module for acquiring first image data of a target position on the sample in the image coordinate system of the first camera module; a second camera module for acquiring second image data of the target position in the image coordinate system of the second camera module; a data mapping module, connected with the first camera module and the second camera module, for mapping the first image data into the manipulator coordinate system according to a predetermined first mapping relationship to obtain first mapping data, and for mapping the second image data into the manipulator coordinate system according to a predetermined second mapping relationship to obtain second mapping data, the first mapping relationship being the mapping between the image coordinate system of the first camera module and the manipulator coordinate system, and the second mapping relationship being the mapping between the image coordinate system of the second camera module and the manipulator coordinate system; and a movement compensation module, connected with the data mapping module and the manipulator control module, for determining relative offset data between the first mapping data and the second mapping data and compensating the movement coordinates of the manipulator according to the relative offset data.
In one embodiment, the manipulator movement compensation device further includes a test data acquisition module, configured to acquire the image coordinate point sets recorded when the manipulator grabs the sample to be tested and moves it to different positions in the image acquisition areas of the different camera modules, and to acquire the manipulator coordinate point set of the manipulator at those positions; and a mapping relation calculation module, configured to obtain the mapping relationships between the image coordinate systems of the different camera modules and the manipulator coordinate system from the image coordinate point sets and the manipulator coordinate point set.
In one embodiment, the manipulator control module is further configured to control the manipulator to crimp the sample to be tested into the carrier according to the compensated movement coordinates.
A computer device comprises a memory storing a computer program and a processor that, when executing the computer program, implements the steps of the manipulator movement compensation method of any of the above embodiments.

A computer-readable storage medium stores a computer program which, when executed by a processor, carries out the steps of the manipulator movement compensation method of any of the above embodiments.

A computer program product comprises a computer program which, when executed by a processor, carries out the steps of the manipulator movement compensation method of any of the above embodiments.
In the above manipulator movement compensation method, after the image data of the target position on the sample to be tested are obtained in the image coordinate systems of the two camera modules, the coordinates from both image coordinate systems are mapped into the manipulator coordinate system using the mapping relationships obtained in advance. The relative offset between the sample to be tested and the carrier is then calculated within this single coordinate system and used to compensate the movement coordinates of the manipulator. This improves the crimping accuracy when the grabbed sample is docked with the carrier in the jig cavity, effectively solves the crimping problem, and prevents the sample from being damaged, and the test result affected, by position deviation during crimping.
Drawings
In order to illustrate the embodiments of the present specification or the technical solutions in the prior art more clearly, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. It is obvious that the drawings in the following description show only some of the embodiments described in the specification, and that those skilled in the art can obtain other drawings from them without inventive labor.
FIG. 1 is a schematic flow chart of a manipulator movement compensation method according to one embodiment of the present disclosure;
FIG. 2 is a flowchart illustrating a method for determining a first mapping relationship and a second mapping relationship according to an embodiment of the disclosure;
FIG. 3 is a schematic flow chart illustrating a method for obtaining coordinate point sets in different coordinate systems according to an embodiment of the present disclosure;
FIG. 4 is a schematic flowchart of a method for obtaining the mapping relationship between different coordinate systems from coordinate point sets in those coordinate systems in an embodiment of the present disclosure;
FIG. 5 is a schematic structural diagram of a manipulator movement compensation apparatus according to an embodiment of the present disclosure;
FIG. 6 is a schematic block diagram of a manipulator movement compensation apparatus or system according to one embodiment of the present disclosure;
FIG. 7 is a diagram of the internal structure of a computer device in one embodiment of the present disclosure.
Detailed Description
To facilitate an understanding of the invention, the invention will now be described more fully with reference to the accompanying drawings. Preferred embodiments of the present invention are shown in the drawings. This invention may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete.
It will be understood that when an element is referred to as being "secured to" another element, it can be directly on the other element or intervening elements may also be present. When an element is referred to as being "connected" to another element, it can be directly connected to the other element or intervening elements may also be present. As used herein, the terms "vertical," "horizontal," "left," "right," "upper," "lower," "front," "rear," "circumferential," and the like are based on the orientation or positional relationship shown in the drawings for ease of description and simplicity of description, and do not indicate or imply that the referenced device or element must have a particular orientation, be constructed and operated in a particular orientation, and are therefore not to be considered limiting of the present invention.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. The terminology used in the description of the invention herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used herein, the term "and/or" includes any and all combinations of one or more of the associated listed items.
In large-scale automated AR screen inspection equipment, accurately crimping a screen (DUT) into a carrier is essential for the subsequent inspection of the screen's various indexes. The main workflow of such equipment is as follows: a carrier-handling manipulator carries a carrier from the carrier tray to a jig in the jig turntable; a screen-handling manipulator carries a screen under test from the screen tray and crimps it into the carrier that has been placed in the jig; a mating component covers the carrier with a cover plate that fixes the screen under test; the jig then tests the required performance of the AR screen. If the screen fully meets the test requirements, a screen-carrier handling manipulator carries the carrier with the tested screen from the jig to a tray for qualified carriers; if it does not, the manipulator carries the carrier with the tested screen to a reject conveyor belt.
The carrier to be crimped with the screen under test is grabbed from the carrier tray and placed into the jig anew each time, so its position is not fixed, and the crimping of screen and carrier cannot be completed by driving the manipulator to a fixed point. Likewise, the screen under test is grabbed from the screen tray and placed into the carrier anew each time, and both the screen-grabbing manipulator and the carrier-grabbing manipulator introduce some position deviation when grabbing. For a screen that requires high-precision crimping, even a slight error of, for example, 0.01 mm when the screen is crimped into the carrier may damage the screen by pressure.
The manipulator movement compensation method provided by the disclosure calculates the relative offset between the screen and the carrier visually, and compensates for it by correcting the movement coordinates of the manipulator, thereby reducing the chance that the screen is crushed by position deviation while being crimped onto the carrier.
Fig. 1 is a schematic flow chart of a manipulator movement compensation method according to one embodiment of the present disclosure. In one embodiment, the manipulator movement compensation method may include the following steps S100 to S500.
Step S100: Acquire first image data of the target position on the sample to be tested in the image coordinate system of the first camera module.
A manipulator grabs the sample to be tested and moves it into the image acquisition area of the first camera module, where the first camera module can photograph the sample to obtain image or video information. The sample to be tested may be a screen under test, and the manipulator may be a DUT-grabbing manipulator that picks the screen up from the tray holding screens under test and moves it to the carrier cavity for crimping. The first camera module may be a DUT camera module used to monitor, in real time, how the manipulator grabs the screen under test, and its image acquisition area may cover the area of the tray holding the screens under test.
The target position on the sample to be tested can be identified by performing image recognition on the image or video information acquired by the first camera module. In some embodiments of the present disclosure, the target position may include two different positions on the sample that are easy to recognize in an image, such as the two left and right corner points of the screen, a marked pattern, a screen structure, or another relatively obvious feature on the screen. After the target position is identified, its coordinate points in the image coordinate system of the first camera module can be obtained; the first image data may include these coordinate points. For example, when the target position is the two left and right corner points of the screen, the first camera module acquires image or video information containing the two corner points, and after the corner points are identified, their image coordinates p1(x, y) and p2(x, y) in the first camera module's coordinate system are obtained.
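The patent does not prescribe a particular recognition algorithm; as one possible illustration, assuming the target marks are located by template matching with OpenCV (the file names and template are hypothetical):

```python
import cv2

# hypothetical capture from the first camera and a template of the corner mark
frame = cv2.imread("dut_camera_frame.png", cv2.IMREAD_GRAYSCALE)
tmpl = cv2.imread("corner_mark_template.png", cv2.IMREAD_GRAYSCALE)

# normalized cross-correlation; the best match locates the mark in pixels
res = cv2.matchTemplate(frame, tmpl, cv2.TM_CCOEFF_NORMED)
_, _, _, max_loc = cv2.minMaxLoc(res)
p1 = (max_loc[0] + tmpl.shape[1] // 2,  # x of the mark center
      max_loc[1] + tmpl.shape[0] // 2)  # y of the mark center
```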
Step S200: Acquire second image data of the target position on the sample to be tested in the image coordinate system of the second camera module.
After step S100 is completed, the manipulator moves the grabbed sample into the image acquisition area of the second camera module, and the second camera module photographs the sample again to obtain image or video information. The sample may again be the screen under test. The second camera module may be a carrier camera module used to monitor, in real time, the process of crimping the screen under test into the carrier cavity; its image acquisition area may cover the area where the carrier cavity is located, which facilitates monitoring of the crimping process.
The target position on the sample can likewise be identified by performing image recognition on the image or video information acquired by the second camera module. In some embodiments of the present disclosure, the target position mentioned in step S200 is the same position as the one mentioned in step S100. After the target position is identified, its coordinate points in the image coordinate system of the second camera module can be obtained; the second image data may include these coordinate points. For example, the second camera module acquires image or video information containing the two corner points, and after the left and right corner points of the screen are identified, their image coordinates p3(x, y) and p4(x, y) in the second camera module's coordinate system are obtained.
Step S300: Map the first image data into the manipulator coordinate system according to a predetermined first mapping relationship to obtain first mapping data, the first mapping relationship being the mapping between the image coordinate system of the first camera module and the manipulator coordinate system.
Step S400: Map the second image data into the manipulator coordinate system according to a predetermined second mapping relationship to obtain second mapping data, the second mapping relationship being the mapping between the image coordinate system of the second camera module and the manipulator coordinate system.
Because the first image data and the second image data lie in different coordinate systems, the deviation between them cannot be determined directly; the two must first be mapped into the same coordinate system so that they can be analyzed together. Since the crimping problem is solved by adjusting the movement of the manipulator to compensate for the deviation introduced while grabbing, the first image data and the second image data are mapped into the manipulator coordinate system according to the pre-acquired mapping relationships between the coordinate systems, and the deviation between the two image data is analyzed in the manipulator coordinate system.
According to the first mapping relationship between the image coordinate system of the first camera module and the manipulator coordinate system, acquired in advance, the first image data can be mapped into the manipulator coordinate system to obtain the first mapping data, i.e. the coordinates of the first image data in the manipulator coordinate system. Similarly, according to the second mapping relationship between the image coordinate system of the second camera module and the manipulator coordinate system, acquired in advance, the second image data can be mapped into the manipulator coordinate system to obtain the second mapping data, i.e. the coordinates of the second image data in the manipulator coordinate system.
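As a minimal sketch of this mapping step, assuming the mapping relationship is a homography matrix as in the embodiments below (the matrix and point values here are placeholders, not calibrated data):

```python
import numpy as np

# placeholder homography from the first camera's image frame to the
# manipulator frame; in practice it comes from the calibration described below
H1 = np.array([[0.05, 0.00, 10.0],
               [0.00, 0.05,  5.0],
               [0.00, 0.00,  1.0]])

def map_to_robot(H, pt):
    """Map an image point (x, y) into the manipulator frame via homography H."""
    v = H @ np.array([pt[0], pt[1], 1.0])
    return v[:2] / v[2]  # dehomogenize

p1 = (120.0, 80.0)         # a target corner in first-camera pixels (example)
k1 = map_to_robot(H1, p1)  # the same corner in manipulator coordinates
```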
In one embodiment, the first mapping relationship may be determined by calculating a homography between the image coordinate system of the first camera module and the manipulator coordinate system, or may be determined by other conversion methods summarized by the tester according to practical application experience. Similarly, the second mapping relationship may be determined by calculating a homography between the image coordinate system of the second camera module and the manipulator coordinate system, or may be determined by other conversion methods summarized by the tester according to practical application experience.
Step S500: relative offset data between the first mapping data and the second mapping data is determined, and the movement coordinates of the manipulator are compensated according to the relative offset data.
After the first image data and the second image data are mapped to the manipulator coordinate system and the first mapping data and the second mapping data are acquired, the first mapping data and the second mapping data can be analyzed in a unified coordinate system to determine relative offset data between the first mapping data and the second mapping data.
When solving the crimping problem, the deviation that appears during grabbing can be compensated by adjusting the movement of the manipulator. Because the first mapping data and the second mapping data are the coordinates of the same positions on the sample, observed in different coordinate systems and now expressed in the manipulator coordinate system, the relative offset introduced while the manipulator grabbed the sample can be determined by analyzing them, and the movement coordinates of the manipulator can be compensated accordingly. Adjusting the movement of the manipulator by compensating its movement coordinates improves the accuracy of the crimping motion and prevents the sample from being damaged, and the test result affected, by position deviation during crimping.
For example, after the image coordinates p1(x, y) and p2(x, y) of the two left and right corner points of the screen in the first camera module's coordinate system are converted into the manipulator coordinate system, the corresponding manipulator coordinates are k1(x, y) and k2(x, y). Similarly, converting p3(x, y) and p4(x, y) from the second camera module's coordinate system yields k3(x, y) and k4(x, y). The tilt angle θ1 and center coordinate k5(x, y) of the product under the DUT camera module can be obtained from k1 and k2, and the tilt angle θ2 and center coordinate k6(x, y) of the product under the carrier camera module from k3 and k4. When compensating the movement coordinates with which the manipulator crimps the DUT (sample under test) into the carrier, the angle compensation is the tilt deviation (θ2 − θ1), the X-axis compensation is the distance deviation (k6.x − k5.x), and the Y-axis compensation is the distance deviation (k6.y − k5.y).
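The following sketch reproduces this offset computation with placeholder values for the mapped corner points (k1, k2 from the DUT camera mapping; k3, k4 from the carrier camera mapping):

```python
import numpy as np

# mapped corner points in the manipulator frame (example values)
k1, k2 = np.array([10.0, 5.0]), np.array([60.0, 5.5])  # via the DUT camera
k3, k4 = np.array([10.4, 5.3]), np.array([60.3, 6.1])  # via the carrier camera

def tilt_and_center(ka, kb):
    """Tilt angle (rad) and midpoint of the segment joining two corners."""
    theta = np.arctan2(kb[1] - ka[1], kb[0] - ka[0])
    return theta, (ka + kb) / 2.0

theta1, k5 = tilt_and_center(k1, k2)  # product pose under the DUT camera
theta2, k6 = tilt_and_center(k3, k4)  # product pose under the carrier camera

d_theta = theta2 - theta1  # angle compensation
dx = k6[0] - k5[0]         # X-axis compensation
dy = k6[1] - k5[1]         # Y-axis compensation
```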
In the manipulator movement compensation method above, after the image data of the target position on the sample are obtained in the image coordinate systems of the two camera modules, both sets of coordinates are mapped into the manipulator coordinate system using the pre-acquired mapping relationships between each camera module's image coordinate system and the manipulator coordinate system. The relative deviation introduced while the manipulator grabbed the sample is then analyzed within this single coordinate system, and the movement coordinates of the manipulator are compensated according to it. Determining and compensating this deviation improves the crimping accuracy when the grabbed sample is docked with the carrier in the jig cavity, effectively solves the crimping problem, and prevents the sample from being damaged, and the test result affected, by position deviation during crimping.
Fig. 2 is a flowchart illustrating a method for determining the first mapping relationship and the second mapping relationship in one embodiment of the disclosure. In one embodiment, the process of determining the first mapping relationship and the second mapping relationship may include the following steps S10 to S20.
Step S10: Obtain the image coordinate point sets recorded when the manipulator grabs the sample to be tested and moves it to different positions in the image acquisition areas of the different camera modules, and obtain the manipulator coordinate point set of the manipulator at those positions.
In practical applications, the image coordinate systems of the different camera modules in the test equipment and the manipulator coordinate system are usually fixed, so multiple sets of coordinate data in each coordinate system can be obtained by testing in advance, and the mapping relationships between the camera modules' image coordinate systems and the manipulator coordinate system can be determined from those data.

The manipulator is controlled to move the sample to be tested to different positions in the image acquisition areas of the different camera modules, and the image coordinate point sets of the test position on the sample in each camera module's image coordinate system are obtained by capturing and recognizing the test position with the respective camera module. Meanwhile, since the actual trajectory of a working manipulator is determined by trajectory coordinates in the manipulator coordinate system, the manipulator display shows the manipulator coordinates whenever the manipulator moves the sample; the manipulator coordinate point set at the different positions can therefore be read from the display.
Step S20: Acquire the mapping relationships between the image coordinate systems of the different camera modules and the manipulator coordinate system from the image coordinate point sets and the manipulator coordinate point set.
By analyzing the image coordinate point sets of the different camera modules together with the manipulator coordinate point set, the relation between each camera module's image coordinate system and the manipulator coordinate system can be determined, and the mapping relationship between them established. For example, the image coordinate point sets and the manipulator coordinate point set are each assembled into matrix form; the homography matrices between the camera modules' image coordinate systems and the manipulator coordinate system can then be determined by matrix transformation, and the mapping relationships are expressed by these homography matrices.
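In practice, this fit need not be coded by hand: OpenCV's findHomography performs the same estimation. The sketch below is a minimal example with hypothetical calibration values; the explicit matrix derivation is given further below.

```python
import numpy as np
import cv2

# four calibration correspondences: test-position pixels from one camera
# module and the matching manipulator coordinates (hypothetical values)
img_pts = np.array([[120, 80], [430, 85], [415, 300], [130, 295]], dtype=np.float32)
rob_pts = np.array([[10.0, 5.0], [60.1, 5.4], [59.8, 40.2], [10.3, 39.9]], dtype=np.float32)

# 3x3 homography mapping this camera's image frame to the manipulator frame
H, _ = cv2.findHomography(img_pts, rob_pts)
```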
By means of the homography matrices between the image coordinate systems of the different camera modules and the manipulator coordinate system, the coordinates of the sample observed by the different camera modules can be mapped into manipulator coordinates simultaneously. In this single coordinate system, the offset introduced during the manipulator's grabbing process can be determined reliably by data analysis, and the movement coordinates of the manipulator compensated according to it. Adjusting the movement of the manipulator by compensating its movement coordinates improves the accuracy of the crimping motion and prevents the sample from being damaged, and the test result affected, by position deviation during crimping.
Fig. 3 is a flowchart illustrating a method for obtaining coordinate point sets in different coordinate systems according to an embodiment of the disclosure. In one embodiment, acquiring the image coordinate point sets recorded when the manipulator grabs the sample to be tested and moves it to different positions in the image acquisition areas of the different camera modules, and acquiring the manipulator coordinate point set at those positions, may include the following steps S11 to S19.
Step S11: Control the manipulator to grab the sample to be tested and move it to a first position in the image acquisition area of the first camera module.
Step S13: Acquire a first image coordinate of a test position on the sample to be tested in the first camera module.
In step S11, the manipulator is controlled to grab the sample to be tested and move it to a first position in the image acquisition area of the first camera module, chosen so that the first camera module can photograph the test position on the sample while it is at the first position; image recognition can then be performed on the captured image or video information to obtain the image coordinate of the test position. In some embodiments of the present disclosure, the test position is one or more relatively obvious, easily recognized features on the sample, such as a particular corner of the screen, a marked pattern, or a screen structure. The test position may be the same position as the target position or a different one, depending on the actual test requirements.

After the test position is identified from the image or video information acquired by the first camera module, the first image coordinate of the test position in the image coordinate system of the first camera module can be acquired. For example, after a certain corner point of the screen is identified, the image coordinate point (a1, b1) of that corner in the first camera module's coordinate system may be acquired.
Step S15: Control the manipulator to grab the sample to be tested and move it to a second position in the image acquisition area of the second camera module.
Step S17: Acquire a second image coordinate of the test position on the sample to be tested in the second camera module, and acquire the manipulator coordinate when the manipulator has grabbed the sample and moved it to the second position.
In step S15, the manipulator is controlled to move the grabbed sample to a second position in the image acquisition area of the second camera module, chosen so that the second camera module can photograph the test position on the sample while it is at the second position; image recognition can then be performed on the captured image or video information to obtain the image coordinate of the test position. In some embodiments of the present disclosure, the test position mentioned in step S17 is the same position as the one mentioned in step S13.

After the test position is identified from the image or video information acquired by the second camera module, the second image coordinate of the test position in the image coordinate system of the second camera module can be acquired. For example, after a certain corner point of the screen is recognized, the image coordinate point (c1, d1) of that corner in the second camera module's coordinate system is acquired.

Meanwhile, since the actual trajectory of a working manipulator is determined by trajectory coordinates in the manipulator coordinate system, the manipulator display shows the manipulator coordinates whenever the manipulator moves the sample, and the coordinates at each position can be read from the display. In particular, when the manipulator grabs the sample and moves it from the first position to the second position, the display provides the manipulator coordinate at the second position.
Step S19: Control the manipulator to repeatedly grab the sample to be tested and move it to different positions in the image acquisition areas of the first and second camera modules, to acquire multiple sets of first image coordinates, multiple sets of second image coordinates, and multiple sets of manipulator coordinates.
Because the offset introduced by each movement of the manipulator is not necessarily fixed, the operations of steps S11 to S17 may be repeated multiple times to ensure the data accuracy of the acquired mapping relationship, yielding multiple sets of first image coordinates, multiple sets of second image coordinates, and multiple sets of manipulator coordinates. From these, a more accurate coordinate-system mapping can be obtained, which in turn guarantees a better position compensation effect. The image coordinate point set captured by the first camera module may include the sets of first image coordinates, the image coordinate point set captured by the second camera module the sets of second image coordinates, and the manipulator coordinate point set the sets of manipulator coordinates.

In one embodiment, the manipulator performs steps S11 to S17 at least four times; that is, the operation of grabbing the sample and moving it to the image acquisition area of the first camera module and then to that of the second camera module is repeated four times, with a different position in each image acquisition area every time, so that different offsets are obtained.
In this embodiment, to better describe the method steps for obtaining coordinate point sets in different coordinate systems, the sample to be tested is taken to be a screen under test (DUT), the manipulator the DUT-grabbing manipulator, the first camera module the DUT camera module, and the second camera module the carrier camera module.

After steps S11 to S17 have been performed at least four times, at least 4 sets of first image coordinates, 4 sets of second image coordinates, and 4 sets of manipulator coordinates are acquired. The image coordinate point set X1 acquired by the DUT camera module may include at least the 4 sets of first image coordinates, the image coordinate point set X2 acquired by the carrier camera module at least the 4 sets of second image coordinates, and the manipulator coordinate point set Y at least the 4 sets of manipulator coordinates.
The DUT-grabbing manipulator grabs the screen under test and moves it below the DUT camera module, ensuring that the test position lies within the DUT camera module's field of view. The DUT camera module captures image or video information that includes the test position on the screen under test. The test position is a relatively obvious feature on the screen, for example one corner point of the screen. Position recognition is performed on the captured image or video information to obtain the first image coordinate (a1, b1) of the test position in the image coordinate system of the DUT camera module.

After the first image coordinate (a1, b1) is acquired, the manipulator grabs the screen under test and moves it below the carrier camera module, ensuring that the test position lies within the carrier camera module's field of view. The carrier camera module captures image or video information including the test position and identifies it to obtain the second image coordinate (c1, d1) of the test position in the image coordinate system of the carrier camera module. The test position identified by the carrier camera module is the same position as the one identified by the DUT camera module. Meanwhile, after the manipulator has moved the screen from the first position to the current second position, the manipulator display shows the manipulator coordinate, from which (e1, f1) is read.

The DUT-grabbing manipulator then grabs the screen under test again and moves it to a first position below the DUT camera module that differs from the first position of the previous acquisition; that is, the two first positions have some relative offset. The DUT camera module again captures image or video information including the test position and identifies it to obtain the first image coordinate (a2, b2). After (a2, b2) is acquired, the manipulator moves the screen to a second position below the carrier camera module, likewise different from the previous second position. The carrier camera module again captures and identifies the test position to obtain the second image coordinate (c2, d2), and the manipulator coordinate (e2, f2) of the move from the first position to the current second position is recorded.

These steps are repeated 4 times in total, acquiring 4 sets of first image coordinates, 4 sets of second image coordinates, and 4 sets of manipulator coordinates. That is, the image coordinate point set X1 collected by the DUT camera module may include the first image coordinates (a1, b1) … (a4, b4); the image coordinate point set X2 collected by the carrier camera module may include the second image coordinates (c1, d1) … (c4, d4); and the manipulator coordinate point set Y may include the manipulator coordinates (e1, f1) … (e4, f4).
Fig. 4 is a flowchart illustrating a method for obtaining the mapping relationship between different coordinate systems from coordinate point sets in those coordinate systems, according to one embodiment of the present disclosure. In one embodiment, obtaining the mapping relationships between the camera modules' image coordinate systems and the manipulator coordinate system from the image coordinate point sets and the manipulator coordinate point set may include the following steps S21 to S23.
Step S21: Acquire a first homography matrix mapping the image coordinate system of the first camera module to the manipulator coordinate system from the multiple sets of first image coordinates and the multiple sets of manipulator coordinates.
Step S23: Acquire a second homography matrix mapping the image coordinate system of the second camera module to the manipulator coordinate system from the multiple sets of second image coordinates and the multiple sets of manipulator coordinates.
Since the sets of first image coordinates, second image coordinates, and manipulator coordinates acquired in steps S11 to S19 correspond to one another, being the coordinates of the same object at corresponding positions expressed in different coordinate systems, the first image coordinates acquired by the first camera module and the second image coordinates acquired by the second camera module can both be mapped to the same manipulator coordinates. The sets of first image coordinates, second image coordinates, and manipulator coordinates are processed into matrix form, and the corresponding homography matrices are obtained by matrix transformation, so that the mapping relationships between the different coordinate systems can be expressed by homography matrices.
Processing the multiple sets of first image coordinates, second image coordinates, and manipulator coordinates into matrix form gives, respectively:

$$X_1 = \begin{bmatrix} a_1 & b_1 & 1 \\ a_2 & b_2 & 1 \\ a_3 & b_3 & 1 \\ a_4 & b_4 & 1 \end{bmatrix}$$

whose entries are the first image coordinate data in the image coordinate point set X1 collected by the DUT camera module;

$$X_2 = \begin{bmatrix} c_1 & d_1 & 1 \\ c_2 & d_2 & 1 \\ c_3 & d_3 & 1 \\ c_4 & d_4 & 1 \end{bmatrix}$$

whose entries are the second image coordinate data in the image coordinate point set X2 collected by the carrier camera module; and

$$Y = \begin{bmatrix} e_1 & f_1 & 1 \\ e_2 & f_2 & 1 \\ e_3 & f_3 & 1 \\ e_4 & f_4 & 1 \end{bmatrix}$$

whose entries are the manipulator coordinate data in the manipulator coordinate point set Y.
The first image coordinates acquired by the DUT camera module and the second image coordinates acquired by the carrier camera module can thus be mapped simultaneously to the same manipulator coordinates; the mapping relationship can be obtained by the following calculation.
in this embodiment, to better illustrate the method steps for obtaining the mapping relationship between different coordinate systems, the matrix transformation process is illustrated by taking the corresponding points of the two point sets (a1, b1,1) and (e1, f1,1) in the image coordinate point set X1 collected by the DUT camera module and the corresponding manipulator coordinate point set Y as an example, but the method is not to be construed as limiting the scope of the invention. Defining the homography matrix H corresponding to the two points of (a1, b1,1) and (e1, f1,1) as:
Figure BDA0003459777720000181
then there are:
Figure BDA0003459777720000182
the following equation (1) can be obtained by transforming the above matrix:
Figure BDA0003459777720000183
and equation (2):
Figure BDA0003459777720000184
meanwhile, constraint conditions are added to the homography matrix H, and the modulus of the homography matrix H becomes 1, so that the following equation (3) can be obtained:
h112+h122+h132+h212+h222+h232+h312+h322+h332=1
the two formulas of equation (1) and equation (2) can be developed:
h11a1+h12b1+h13-h31a1e1-h32b1e1-h33e1=0
h21a1+h22b1+h23-h31a1f1-h32b1f1-h33f1=0
substituting 4 pairs of feature points in the image coordinate point set X1 acquired by the DUT camera module can obtain the following linear equation set (4):
Figure BDA0003459777720000191
the equation (3) and the linear equation set (4) can determine a first homography matrix H1 in which the image coordinate point set X1 acquired by the DUT camera module corresponds to the robot coordinate point set.
The second homography matrix H2, which relates the image coordinate point set X2 acquired by the carrier camera module to the manipulator coordinate point set, is obtained by the same process as the first homography matrix H1 and is not described again here.
The first homography matrix H1 may be used to represent the first mapping relationship between the image coordinate system of the first camera module and the manipulator coordinate system, and the second homography matrix H2 the second mapping relationship between the image coordinate system of the second camera module and the manipulator coordinate system. Using H1 and H2 obtained above, the coordinates of the screen under test observed by the DUT camera module and by the carrier camera module can be mapped into manipulator coordinates simultaneously.
In one embodiment, after determining the relative offset data between the first mapping data and the second mapping data and compensating the movement coordinates of the manipulator according to the relative offset data, the method may further include using the manipulator to crimp the sample to be tested into the carrier according to the compensated movement coordinates.
When one application scenario of the manipulator movement compensation method is the automated inspection of AR screens, the manipulator can accurately crimp the screen under test into the carrier cavity after its movement coordinates have been compensated according to the relative offset data. The method visually acquires the coordinates of the DUT under the DUT camera module and under the carrier camera module, maps both into manipulator coordinates simultaneously, and analyzes the two in the manipulator coordinate system to calculate the error introduced while the manipulator grabbed the product. Once the relative offset between the DUT and the carrier is determined, the movement coordinates with which the manipulator crimps the sample into the carrier cavity are compensated along the θ, X, and Y axes, so that the product is crimped into the carrier cavity accurately.
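As a usage sketch, the compensation values computed earlier are simply added to the taught crimping pose before the move is commanded (the pose values and the controller call are hypothetical):

```python
# offsets from the computation above and a taught nominal pose (example values)
d_theta, dx, dy = 0.004, 0.35, 0.62
nominal_x, nominal_y, nominal_theta = 250.0, 310.0, 0.0

target_x = nominal_x + dx
target_y = nominal_y + dy
target_theta = nominal_theta + d_theta

# robot.move_to(target_x, target_y, target_theta)  # hypothetical controller call
```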
It should be understood that, although the steps in the flowcharts of this specification are shown in the order indicated by the arrows, they are not necessarily performed in that order; unless explicitly stated otherwise, they may be performed in other orders. Moreover, at least some of the steps in the flowcharts may comprise multiple sub-steps or stages, which need not be completed at the same time but may be performed at different times, and whose order of execution need not be sequential: they may be performed in turn, or alternately with other steps or with at least part of the sub-steps or stages of other steps.
Based on the above description of the embodiments of the manipulator movement compensation method, the present disclosure also provides a manipulator movement compensation apparatus. The apparatus may include systems (including distributed systems), software (applications), modules, components, servers, clients, and the like that use the methods described in the embodiments of this specification, together with any hardware necessary for implementation. Based on the same inventive concept, the apparatus provided in the embodiments of the present disclosure is described in one or more of the following embodiments. Since the apparatus solves the problem in a manner similar to the method, the specific implementation of the apparatus may refer to the implementation of the foregoing method, and repeated details are not repeated. As used hereinafter, the term "unit" or "module" may be a combination of software and/or hardware that implements a predetermined function. Although the apparatus described in the following embodiments is preferably implemented in software, an implementation in hardware, or in a combination of software and hardware, is also possible and contemplated.
Fig. 5 is a schematic structural diagram of a manipulator movement compensation apparatus according to an embodiment of the present disclosure. In one embodiment, the manipulator movement compensation apparatus may include a manipulator 100, a manipulator control module 200, a first camera module 300, a second camera module 400, a data mapping module 500, and a movement compensation module 600.
The manipulator 100 may be used to grab a sample to be tested and move the sample to different positions.
The manipulator control module 200 is connected to the manipulator 100 and may be used to control the movement of the manipulator 100.
The first camera module 300 may be configured to acquire first image data of a target position on the sample to be tested in the image coordinate system of the first camera module 300.
The second camera module 400 may be configured to acquire second image data of the target position on the sample to be tested in the image coordinate system of the second camera module 400.
The data mapping module 500 is connected to the first camera module 300 and the second camera module 400, and may be configured to map the first image data into the manipulator coordinate system according to a predetermined first mapping relationship to obtain first mapping data, and to map the second image data into the manipulator coordinate system according to a predetermined second mapping relationship to obtain second mapping data. The first mapping relationship is the mapping relationship between the image coordinate system of the first camera module and the manipulator coordinate system, and the second mapping relationship is the mapping relationship between the image coordinate system of the second camera module and the manipulator coordinate system.
The movement compensation module 600, connected to the data mapping module 500 and to the manipulator control module 200, may be configured to determine the relative offset data between the first mapping data and the second mapping data and to compensate the movement coordinates of the manipulator 100 according to the relative offset data.
In one embodiment, the manipulator movement compensation apparatus may further include a test data acquisition module and a mapping relation calculation module.
The test data acquisition module may be configured to acquire the image coordinate point sets obtained when the manipulator grabs the sample to be tested and moves it to different positions in the image acquisition areas of the different camera modules, and to acquire the manipulator coordinate point set of the manipulator at those positions.
The mapping relation calculation module may be configured to obtain the mapping relationship between the image coordinate system of each camera module and the manipulator coordinate system from that camera module's image coordinate point set and the manipulator coordinate point set.
In one embodiment, the manipulator control module may be further configured to control the manipulator to grab the sample to be tested and move it to a first position in the image acquisition area of the first camera module, and to a second position in the image acquisition area of the second camera module.
The test data acquisition module may be configured to receive the first image coordinate, acquired by the first camera module, of a test position on the sample to be tested in the image coordinate system of the first camera module; to receive the second image coordinate, acquired by the second camera module, of the test position in the image coordinate system of the second camera module; and to acquire the manipulator coordinate when the manipulator grabs the sample to be tested and moves it to the second position.
The manipulator control module may be further configured to control the manipulator to repeatedly grab the sample to be tested and move it to different positions in the image acquisition area of the first camera module and in the image acquisition area of the second camera module.
The test data acquisition module may accordingly be configured to acquire multiple groups of first image coordinates, multiple groups of second image coordinates, and multiple groups of manipulator coordinates.
In one embodiment, the mapping relation calculation module may include a first matrix mapping calculation unit and a second matrix mapping calculation unit.
The first matrix mapping calculation unit may be configured to obtain, from the multiple groups of first image coordinates and the multiple groups of manipulator coordinates, the first homography matrix that maps the image coordinate system of the first camera module to the manipulator coordinate system.
The second matrix mapping calculation unit may be configured to obtain, from the multiple groups of second image coordinates and the multiple groups of manipulator coordinates, the second homography matrix that maps the image coordinate system of the second camera module to the manipulator coordinate system.
In one embodiment, the manipulator control module may be further configured to control the manipulator to press the sample to be tested into the carrier according to the compensated movement coordinates.
The modules in the manipulator movement compensation apparatus may be implemented wholly or partially in software, in hardware, or in a combination of the two. Each module may be embedded in, or independent of, a processor of the computer device in hardware form, or stored in a memory of the computer device in software form, so that the processor can invoke it and execute the operations corresponding to the module.
With regard to the apparatus in the above-described embodiment, the specific manner in which each module performs the operation has been described in detail in the embodiment related to the method, and will not be elaborated here.
It is to be understood that the various embodiments of the methods, apparatus, and the like described above are described in a progressive manner; identical or similar parts may be referred to one another, and each embodiment focuses on its differences from the other embodiments. For the relevant points, reference may be made to the descriptions of the other method embodiments.
Fig. 6 is a schematic block diagram of a manipulator movement compensation device or system in one embodiment of the present disclosure. Referring to Fig. 6, a manipulator movement compensation device or system S00 includes a processing component S20, which further includes one or more processors, and memory resources, represented by a memory S22, for storing instructions executable by the processing component S20, such as an application program. The application program stored in the memory S22 may include one or more modules, each corresponding to a set of instructions. The processing component S20 is configured to execute the instructions to perform the method described above.
The manipulator movement compensation device or system S00 may further include: a power supply module S24 configured to perform power management of the device or system S00; a wired or wireless network interface S26 configured to connect the device or system S00 to a network; and an input/output (I/O) interface S28. The manipulator movement compensation device or system S00 may operate based on an operating system stored in the memory S22, such as Windows Server, Mac OS X, Unix, Linux, FreeBSD, or the like.
In an exemplary embodiment, a computer-readable storage medium including instructions, such as the memory S22, is also provided; the instructions are executable by the processor of the manipulator movement compensation device or system S00 to perform the method described above. The storage medium may be, for example, a ROM, a random access memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, or the like.
In an exemplary embodiment, a computer program product is also provided that includes instructions executable by the processor of the manipulator movement compensation device or system S00 to perform the method described above.
In one embodiment, a computer device is provided. The computer device may be a terminal whose internal structure is shown in Fig. 7, an internal structure diagram of the computer device in one embodiment of the disclosure. The computer device includes a processor, a memory, a communication interface, a display screen, and an input device connected by a system bus. The processor of the computer device provides computing and control capabilities. The memory of the computer device comprises a nonvolatile storage medium and an internal memory; the nonvolatile storage medium stores an operating system and a computer program, and the internal memory provides an environment for their operation. The communication interface of the computer device is used for wired or wireless communication with an external terminal; the wireless communication can be realized through Wi-Fi, a mobile cellular network, near field communication (NFC), or other technologies. The computer program, when executed by the processor, implements the manipulator movement compensation method. The display screen of the computer device may be a liquid crystal display or an electronic ink display, and the input device may be a touch layer covering the display screen, a key, a trackball, or a touchpad arranged on the housing of the computer device, or an external keyboard, touchpad, or mouse.
Those skilled in the art will appreciate that the architecture shown in Fig. 7 is merely a block diagram of some of the structures associated with the disclosed aspects and does not limit the computer devices to which the disclosed aspects apply; a particular computer device may include more or fewer components than shown, combine certain components, or have a different arrangement of components.
It will be understood by those skilled in the art that all or part of the processes of the methods of the embodiments described above can be implemented by a computer program instructing the relevant hardware. The computer program can be stored in a nonvolatile computer-readable storage medium and, when executed, can include the processes of the embodiments of the methods described above. Any reference to memory, database, or other media used in the embodiments provided herein may include at least one of nonvolatile and volatile memory. Nonvolatile memory may include read-only memory (ROM), magnetic tape, floppy disk, flash memory, optical memory, high-density embedded nonvolatile memory, resistive random access memory (ReRAM), magnetoresistive random access memory (MRAM), ferroelectric random access memory (FRAM), phase change memory (PCM), graphene memory, and the like. Volatile memory may include random access memory (RAM), external cache memory, and the like. By way of illustration and not limitation, RAM can take many forms, such as static random access memory (SRAM) or dynamic random access memory (DRAM). The databases referred to in the various embodiments provided herein may include at least one of relational and non-relational databases; the non-relational databases may include, but are not limited to, blockchain-based distributed databases and the like. The processors referred to in the embodiments provided herein may be general-purpose processors, central processing units, graphics processors, digital signal processors, programmable logic devices, data processing logic devices based on quantum computing, and the like, without limitation.
The embodiments in this specification are described in a progressive manner; for identical or similar parts among the embodiments, reference may be made from one to another, and each embodiment focuses on its differences from the others. In particular, the hardware-plus-program embodiments are described simply because they are substantially similar to the method embodiments; for the relevant points, refer to the partial description of the method embodiments.
It should be noted that the descriptions of the apparatus, the electronic device, the server, and the like according to the method embodiments may also include other embodiments, and specific implementations may refer to the descriptions of the related method embodiments. Meanwhile, new embodiments formed by combining features of the method, apparatus, device, and server embodiments with one another still fall within the scope covered by the present disclosure and are not detailed here.
In the description herein, references to "some embodiments," "other embodiments," "desired embodiments," and the like mean that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the invention. In this specification, schematic references to these terms do not necessarily refer to the same embodiment or example.
The technical features of the embodiments described above may be combined arbitrarily. For brevity, not all possible combinations of these technical features are described; nevertheless, as long as a combination contains no contradiction, it should be considered within the scope of this specification.
The above-mentioned embodiments express only several implementations of the present invention, and their descriptions are specific and detailed, but they should not be construed as limiting the scope of the invention. It should be noted that a person skilled in the art can make several variations and modifications without departing from the inventive concept, all of which fall within the protection scope of the present invention. Therefore, the protection scope of this patent shall be subject to the appended claims.

Claims (10)

1. A manipulator movement compensation method, characterized by comprising the following steps:
acquiring first image data of a target position on a sample to be tested in an image coordinate system of a first camera module;
acquiring second image data of the target position on the sample to be tested in an image coordinate system of a second camera module;
mapping the first image data to a manipulator coordinate system according to a predetermined first mapping relation to obtain first mapping data, wherein the first mapping relation is the mapping relation between the image coordinate system of the first camera module and the manipulator coordinate system;
mapping the second image data to the manipulator coordinate system according to a predetermined second mapping relation to obtain second mapping data, wherein the second mapping relation is the mapping relation between the image coordinate system of the second camera module and the manipulator coordinate system;
and determining relative offset data between the first mapping data and the second mapping data, and compensating the movement coordinate of the manipulator according to the relative offset data.
2. The manipulator movement compensation method of claim 1, wherein the determining of the first mapping relation and the second mapping relation comprises:
acquiring an image coordinate point set when the sample to be tested is grabbed by the manipulator to different positions in the image acquisition areas of different camera modules, and acquiring a manipulator coordinate point set of the manipulator when the sample to be tested is grabbed to the different positions;
and acquiring a mapping relation between the image coordinate systems of the different camera modules and the manipulator coordinate system according to the image coordinate point set of each camera module and the manipulator coordinate point set.
3. The manipulator movement compensation method of claim 2, wherein the acquiring of the image coordinate point set when the sample to be tested is grabbed by the manipulator to different positions in the image acquisition areas of different camera modules, and of the manipulator coordinate point set of the manipulator when the sample to be tested is grabbed to the different positions, comprises:
controlling the manipulator to grab the sample to be tested and move to a first position in an image acquisition area of the first camera module;
acquiring a first image coordinate of a test position on the sample to be tested in the first camera module;
controlling the manipulator to grab the sample to be tested and move to a second position in an image acquisition area of the second camera module;
acquiring a second image coordinate of the test position on the sample to be tested in the second camera module, and acquiring a manipulator coordinate when the manipulator grabs the sample to be tested and moves to the second position;
and controlling the manipulator to repeatedly grab the sample to be tested and move to different positions in the image acquisition area of the first camera module and different positions in the image acquisition area of the second camera module, so as to acquire multiple groups of first image coordinates, multiple groups of second image coordinates, and multiple groups of manipulator coordinates.
4. The manipulator movement compensation method of claim 3, wherein the acquiring of the mapping relation between the image coordinate systems of the different camera modules and the manipulator coordinate system according to the image coordinate point set of each camera module and the manipulator coordinate point set comprises:
acquiring, according to the multiple groups of first image coordinates and the multiple groups of manipulator coordinates, a first homography matrix mapping the image coordinate system of the first camera module to the manipulator coordinate system;
and acquiring, according to the multiple groups of second image coordinates and the multiple groups of manipulator coordinates, a second homography matrix mapping the image coordinate system of the second camera module to the manipulator coordinate system.
5. The manipulator movement compensation method according to any one of claims 1 to 4, wherein after determining the relative offset data between the first mapping data and the second mapping data and compensating the movement coordinate of the manipulator according to the relative offset data, the method further comprises:
controlling the manipulator to press the sample to be tested into the carrier according to the compensated movement coordinate.
6. A manipulator movement compensation device, characterized by comprising:
a manipulator configured to grab a sample to be tested and move the sample to different positions;
a manipulator control module, connected with the manipulator, configured to control the movement of the manipulator;
a first camera module configured to acquire first image data of a target position on the sample to be tested in an image coordinate system of the first camera module;
a second camera module configured to acquire second image data of the target position on the sample to be tested in an image coordinate system of the second camera module;
a data mapping module, connected respectively with the first camera module and the second camera module, configured to map the first image data into a manipulator coordinate system according to a predetermined first mapping relation to obtain first mapping data, and to map the second image data into the manipulator coordinate system according to a predetermined second mapping relation to obtain second mapping data, wherein the first mapping relation is a mapping relation between an image coordinate system of the first camera module and the manipulator coordinate system, and the second mapping relation is a mapping relation between an image coordinate system of the second camera module and the manipulator coordinate system;
and a movement compensation module, connected respectively with the data mapping module and the manipulator control module, configured to determine relative offset data between the first mapping data and the second mapping data and to compensate the movement coordinate of the manipulator according to the relative offset data.
7. The manipulator movement compensation device of claim 6, further comprising:
a test data acquisition module configured to acquire an image coordinate point set when the sample to be tested is grabbed by the manipulator to different positions in the image acquisition areas of different camera modules, and to acquire a manipulator coordinate point set of the manipulator when the sample to be tested is grabbed to the different positions;
and a mapping relation calculation module configured to acquire a mapping relation between the image coordinate systems of the different camera modules and the manipulator coordinate system according to the image coordinate point set of each camera module and the manipulator coordinate point set.
8. The manipulator movement compensation device of claim 6 or 7, wherein the manipulator control module is further configured to control the manipulator to press the sample to be tested into the carrier according to the compensated movement coordinate.
9. A computer device comprising a memory and a processor, the memory storing a computer program, characterized in that the processor, when executing the computer program, implements the steps of the manipulator movement compensation method according to any one of claims 1-5.
10. A computer-readable storage medium on which a computer program is stored, characterized in that the computer program, when executed by a processor, implements the steps of the manipulator movement compensation method according to any one of claims 1-5.
CN202210016229.9A 2022-01-07 2022-01-07 Manipulator movement compensation method and device, computer equipment and storage medium Pending CN114373131A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210016229.9A CN114373131A (en) 2022-01-07 2022-01-07 Manipulator movement compensation method and device, computer equipment and storage medium

Publications (1)

Publication Number Publication Date
CN114373131A true CN114373131A (en) 2022-04-19

Family

ID=81144878

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210016229.9A Pending CN114373131A (en) 2022-01-07 2022-01-07 Manipulator movement compensation method and device, computer equipment and storage medium

Country Status (1)

Country Link
CN (1) CN114373131A (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110125926A (en) * 2018-02-08 2019-08-16 比亚迪股份有限公司 The workpiece of automation picks and places method and system
CN111083376A (en) * 2019-12-30 2020-04-28 广东博智林机器人有限公司 Method, system and device for determining installation position of target object and electronic equipment
KR20210022195A (en) * 2019-08-19 2021-03-03 하이윈 테크놀로지스 코포레이션 Calibration method for robot using vision technology
CN113744336A (en) * 2021-09-07 2021-12-03 深圳市睿达科技有限公司 Auxiliary positioning method and device and computer readable storage medium



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination