CN110193849B - Method and device for calibrating hands and eyes of robot - Google Patents
- Publication number: CN110193849B (application CN201810162742.2A)
- Authority: CN (China)
- Prior art keywords: coordinate system, mechanical arm, global camera, calibration plate, tail end
- Legal status: Active
Classifications
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/0009—Constructional details, e.g. manipulator supports, bases
- B25J9/16—Programme controls
- B25J19/00—Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
Landscapes: Engineering & Computer Science; Robotics; Mechanical Engineering; Manipulator
Abstract
The present application provides a method and a device for robot hand-eye calibration, used in a robot system that comprises a mechanical arm, a calibration plate and a global camera fixed relative to the mechanical arm base. The calibration plate is fixed at the tail end of the mechanical arm, and the relative position of the tail end of the mechanical arm and the calibration plate is kept unchanged. The method comprises: obtaining a relative position coordinate B of the tail end of the mechanical arm in the calibration plate coordinate system according to images of the calibration plate shot by the global camera in a plurality of first states; recording position coordinate vectors D of the tail end of the mechanical arm in the mechanical arm base coordinate system in a plurality of second states; and solving a transformation matrix E between the global camera coordinate system and the mechanical arm base coordinate system according to a first conversion matrix A1 between the calibration plate coordinate system and the global camera coordinate system and the relative position coordinate B, thereby guaranteeing precision with a simple and convenient process.
Description
Technical Field
The application relates to the technical field of robot calibration, and in particular to a method and a device for robot hand-eye calibration.
Background
A robot is provided with a vision system, and the robot controls the tail end of the mechanical arm to perform various actions by using images acquired by the vision system. The vision system acts as the eyes of the robot and the tail end of the mechanical arm acts as its hands; preset action tasks are completed through cooperation between the hands and the eyes.
When the robot executes a task with the hand-eye system, the vision system locates objects in the environment in its own coordinate system. To ensure that the robot accurately moves a spatial object to a target position, the conversion relation between the vision system coordinate system and the coordinate system of the tail end of the mechanical arm, or between the vision system coordinate system and the coordinate system of the mechanical arm base, needs to be determined. Determining this relation is called robot hand-eye calibration, and it is what guarantees that the robot accurately moves the spatial object to the target position.
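For illustration only (not part of the original disclosure), the calibrated relation can be written in homogeneous coordinates, where the hand-eye transform maps a point observed in the camera coordinate system into the mechanical arm base coordinate system:

```latex
% Illustrative note: the hand-eye calibration result is a homogeneous transform E
% that maps a point p_cam, observed in the camera coordinate system, into the
% mechanical arm base coordinate system. R is a 3x3 rotation, t a 3x1 translation.
\begin{equation*}
\mathbf{p}_{\text{base}} = E\,\mathbf{p}_{\text{cam}},\qquad
E = \begin{bmatrix} R & \mathbf{t}\\ \mathbf{0}^{\top} & 1 \end{bmatrix}
\end{equation*}
```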
In the prior art, the posture of the robot usually has to be set by manual intervention during calibration before the coordinate conversion relation from the vision system coordinate system to the robot coordinate system can be obtained, so the calibration process is complex and the precision cannot be guaranteed.
Disclosure of Invention
In view of this, the embodiments of the present application provide a method and an apparatus for calibrating a hand-eye of a robot, so as to solve technical defects in the prior art.
The embodiment of the application discloses a method for calibrating a hand eye of a robot, which is used in a robot system, wherein the robot system comprises a mechanical arm, a calibration plate and a global camera which is relatively fixed with a mechanical arm base;
the calibration plate is fixed at the tail end of the mechanical arm, and the relative positions of the tail end of the mechanical arm and the calibration plate are kept unchanged;
the method comprises the following steps:
obtaining a relative position coordinate B of the tail end of the mechanical arm in a calibration plate coordinate system according to calibration plate images shot by the global camera in a plurality of first states;
recording position coordinate vectors D of the tail end of the mechanical arm in the mechanical arm base coordinate system in a plurality of second states, and solving a transformation matrix E between the global camera coordinate system and the mechanical arm base coordinate system according to a first conversion matrix A1 between the calibration plate coordinate system and the global camera coordinate system and the relative position coordinate B of the tail end of the mechanical arm in the calibration plate coordinate system.
In an exemplary embodiment of the present application, the first state is: driving the mechanical arm to act, and changing the posture of the mechanical arm under the condition of keeping the position of the tail end of the mechanical arm unchanged;
the second state is: driving the mechanical arm to act, changing the posture of the mechanical arm and changing the position of the tail end of the mechanical arm.
In an exemplary embodiment of the present application, obtaining the relative position coordinate B of the tail end of the mechanical arm in the calibration plate coordinate system according to the calibration plate images shot by the global camera in the plurality of first states includes:
obtaining a second conversion matrix A2 between the calibration plate coordinate system and the global camera coordinate system according to the calibration plate images shot by the global camera in the plurality of first states;
randomly selecting at least three points on the calibration plate, and calculating the spatial coordinates of the at least three points in the global camera coordinate system according to the second conversion matrix A2 between the calibration plate coordinate system and the global camera coordinate system;
fitting a sphere center to the spatial coordinates of the at least three points in the global camera coordinate system to obtain the spatial position P of the tail end of the mechanical arm in the global camera coordinate system;
obtaining the relative position coordinate B of the tail end of the mechanical arm in the calibration plate coordinate system according to the spatial position P of the tail end of the mechanical arm in the global camera coordinate system and the second conversion matrix A2 between the calibration plate coordinate system and the global camera coordinate system.
In an exemplary embodiment of the present application, recording the position coordinate vectors D of the tail end of the mechanical arm in the mechanical arm base coordinate system in the plurality of second states, and solving the transformation matrix E between the global camera coordinate system and the mechanical arm base coordinate system according to the first conversion matrix A1 between the calibration plate coordinate system and the global camera coordinate system and the relative position coordinate B of the tail end of the mechanical arm in the calibration plate coordinate system, includes:
recording the positions of the calibration plate in at least four second states, and obtaining the first conversion matrix A1 between the calibration plate coordinate system and the global camera coordinate system in each second state;
solving a first position coordinate vector C1 of the tail end of the mechanical arm in the global camera coordinate system in each second state, according to the relative position coordinate B of the tail end of the mechanical arm in the calibration plate coordinate system and the first conversion matrix A1 between the calibration plate coordinate system and the global camera coordinate system in that second state, using the following formula:
C1 = A1·B;
recording the position coordinate vector D of the tail end of the mechanical arm in the mechanical arm base coordinate system in each second state, and substituting it together with the corresponding first position coordinate vector C1 of the tail end of the mechanical arm in the global camera coordinate system into the following equation:
D = E·C1
wherein E represents the transformation matrix between the global camera coordinate system and the mechanical arm base coordinate system;
solving the transformation matrix E according to the position coordinate vectors D of the tail end of the mechanical arm in the mechanical arm base coordinate system in the at least four second states and the corresponding first position coordinate vectors C1 of the tail end of the mechanical arm in the global camera coordinate system.
In an exemplary embodiment of the present application, the robot system further comprises: a local camera fixed at the end of the mechanical arm;
the calibration plate is separated from the tail end of the mechanical arm and is positioned in the visual field range of the local camera and the global camera;
the method further comprises the following steps:
acquiring images of the calibration plate in the global camera and the local camera at the same time, obtaining a third conversion matrix A3 between the calibration plate coordinate system and the global camera coordinate system according to the image of the calibration plate in the global camera, and obtaining a fourth conversion matrix A4 between the calibration plate coordinate system and the local camera coordinate system according to the image of the calibration plate in the local camera;
solving a second position coordinate vector C2 of the tail end of the mechanical arm in the local camera coordinate system according to the first position coordinate vector C1 of the tail end of the mechanical arm in the global camera coordinate system, using the following formula:
C2 = A4⁻¹·A3·C1.
In an exemplary embodiment of the present application, the first conversion matrix A1 between the calibration plate coordinate system and the global camera coordinate system, the second conversion matrix A2, the third conversion matrix A3 and the fourth conversion matrix A4 are obtained by the Zhang Zhengyou calibration method, a least squares calibration method or an open source computer vision library calibration method.
The embodiment of the application also discloses a device for calibrating the hands and the eyes of the robot, which is arranged in a robot system, wherein the robot system comprises a mechanical arm, a calibration plate and a global camera which is relatively fixed with a mechanical arm base;
the calibration plate is fixed at the tail end of the mechanical arm, and the relative positions of the tail end of the mechanical arm and the calibration plate are kept unchanged;
the device comprises:
a first processing module, used for obtaining a relative position coordinate B of the tail end of the mechanical arm in the calibration plate coordinate system according to calibration plate images shot by the global camera in a plurality of first states;
a second processing module, used for recording position coordinate vectors D of the tail end of the mechanical arm in the mechanical arm base coordinate system in a plurality of second states, and for solving a transformation matrix E between the global camera coordinate system and the mechanical arm base coordinate system according to a first conversion matrix A1 between the calibration plate coordinate system and the global camera coordinate system and the relative position coordinate B of the tail end of the mechanical arm in the calibration plate coordinate system.
In an exemplary embodiment of the present application, the first state is: driving the mechanical arm to act, and changing the posture of the mechanical arm under the condition of keeping the position of the tail end of the mechanical arm unchanged;
the second state is: driving the mechanical arm to act, changing the posture of the mechanical arm and changing the position of the tail end of the mechanical arm.
In an exemplary embodiment of the present application, the first processing module includes:
a first parameter obtaining module, configured to obtain a second conversion matrix A2 between the calibration plate coordinate system and the global camera coordinate system according to the calibration plate images shot by the global camera in the plurality of first states;
a second parameter obtaining module, configured to randomly select at least three points on the calibration plate and calculate the spatial coordinates of the at least three points in the global camera coordinate system according to the second conversion matrix A2 between the calibration plate coordinate system and the global camera coordinate system;
a third parameter obtaining module, configured to fit a sphere center to the spatial coordinates of the at least three points in the global camera coordinate system to obtain the spatial position P of the tail end of the mechanical arm in the global camera coordinate system;
a fourth parameter obtaining module, configured to obtain the relative position coordinate B of the tail end of the mechanical arm in the calibration plate coordinate system according to the spatial position P of the tail end of the mechanical arm in the global camera coordinate system and the second conversion matrix A2 between the calibration plate coordinate system and the global camera coordinate system.
In an exemplary embodiment of the present application, the second processing module includes:
a fifth parameter obtaining module, configured to record the positions of the calibration plate in at least four second states and obtain the first conversion matrix A1 between the calibration plate coordinate system and the global camera coordinate system in each second state;
a sixth parameter obtaining module, configured to solve a first position coordinate vector C1 of the tail end of the mechanical arm in the global camera coordinate system in each second state, according to the relative position coordinate B of the tail end of the mechanical arm in the calibration plate coordinate system and the first conversion matrix A1 between the calibration plate coordinate system and the global camera coordinate system in that second state, using the following formula:
C1 = A1·B;
a seventh parameter obtaining module, configured to record the position coordinate vector D of the tail end of the mechanical arm in the mechanical arm base coordinate system in each second state and substitute it together with the corresponding first position coordinate vector C1 of the tail end of the mechanical arm in the global camera coordinate system into the following equation:
D = E·C1
wherein E represents the transformation matrix between the global camera coordinate system and the mechanical arm base coordinate system;
a calculation module, configured to solve the transformation matrix E according to the position coordinate vectors D of the tail end of the mechanical arm in the mechanical arm base coordinate system in the at least four second states and the corresponding first position coordinate vectors C1 of the tail end of the mechanical arm in the global camera coordinate system.
In an exemplary embodiment of the present application, the robot system further comprises: a local camera fixed at the end of the mechanical arm;
the device further comprises:
an eighth parameter obtaining module, configured to, when the calibration plate is detached from the tail end of the mechanical arm and is located within the visual field of the local camera and the global camera, acquire images of the calibration plate in the global camera and the local camera at the same time, obtain a third conversion matrix A3 between the calibration plate coordinate system and the global camera coordinate system according to the image of the calibration plate in the global camera, and obtain a fourth conversion matrix A4 between the calibration plate coordinate system and the local camera coordinate system according to the image of the calibration plate in the local camera;
a ninth parameter obtaining module, configured to solve a second position coordinate vector C2 of the tail end of the mechanical arm in the local camera coordinate system according to the first position coordinate vector C1 of the tail end of the mechanical arm in the global camera coordinate system, using the following formula:
C2 = A4⁻¹·A3·C1.
The embodiment of the application also discloses a computing device, which comprises a memory, a processor and computer instructions stored on the memory and capable of running on the processor, wherein the processor executes the instructions to realize the steps of the robot hand-eye calibration method.
The embodiment of the application also discloses a computer readable storage medium, which stores computer instructions, and is characterized in that the instructions are executed by a processor to realize the steps of the robot hand-eye calibration method.
According to the method and the device for robot hand-eye calibration, the relative position coordinate B of the tail end of the mechanical arm in the calibration plate coordinate system is obtained in the first states, the position coordinate vectors D of the tail end of the mechanical arm in the mechanical arm base coordinate system are then recorded in the second states, and the transformation matrix E between the global camera coordinate system and the mechanical arm base coordinate system is solved according to the first conversion matrix A1 between the calibration plate coordinate system and the global camera coordinate system and the relative position coordinate B of the tail end of the mechanical arm in the calibration plate coordinate system, thereby automatically realizing the hand-eye calibration of the robot under the global camera. The method controls the motion of the mechanical arm automatically, ensures the precision and has a simple and convenient flow.
In addition, the method disclosed by the application can complete the hand-eye calibration tasks for the global camera and the local camera at the same time, without calibrating the global camera and the local camera separately, so that the calibration process is further simplified compared with the prior art.
Drawings
FIG. 1 is a schematic block diagram of a computing device according to an embodiment of the present application;
FIG. 2 is a schematic structural diagram of a robotic system according to an embodiment of the present application;
FIG. 3 is a schematic diagram of a method for calibrating a hand-eye of a robot according to an embodiment of the present application;
FIG. 4 is a schematic diagram of a method for calibrating a hand-eye of a robot according to an embodiment of the present application;
FIG. 5 is a schematic diagram of a method for calibrating a hand-eye of a robot according to an embodiment of the present application;
FIG. 6 is a schematic diagram of a method for calibrating a hand-eye of a robot according to an embodiment of the present application;
FIG. 7 is a schematic structural diagram of a device for robot hand-eye calibration according to an embodiment of the present application.
Detailed Description
In the following description, numerous specific details are set forth in order to provide a thorough understanding of the present application. However, the present application can be implemented in many ways other than those described herein, and those skilled in the art can make similar modifications without departing from the spirit of the application; the application is therefore not limited to the specific implementations disclosed below.
In the present application, a method and an apparatus for robot hand-eye calibration, a robot system and a computer readable storage medium are provided, which are described in detail in the following embodiments one by one.
FIG. 1 is a block diagram illustrating a computing device 100 according to an embodiment of the present application. The components of the computing device 100 include, but are not limited to, a memory 110 and a processor 120, with the processor 120 connected to the memory 110.
Although not shown in fig. 1, it should be appreciated that computing device 100 may also include a network interface that enables computing device 100 to communicate via one or more networks. Examples of such networks include a Local Area Network (LAN), a Wide Area Network (WAN), a Personal Area Network (PAN), or a combination of communication networks such as the Internet. The network interface may include one or more of any type of network interface (e.g., a Network Interface Card (NIC)), whether wired or wireless, such as an IEEE 802.11 Wireless Local Area Network (WLAN) wireless interface, a Worldwide Interoperability for Microwave Access (WiMAX) interface, an Ethernet interface, a Universal Serial Bus (USB) interface, a cellular network interface, a Bluetooth interface, a Near Field Communication (NFC) interface, and so forth. The computing device may access pages through the network interface.
In one embodiment of the present application, the other components of the computing device 100 described above and not shown in fig. 1 may also be connected to each other, for example, by a bus. It should be understood that the block diagram of the computing device architecture shown in FIG. 1 is for purposes of example only and is not limiting as to the scope of the present application. Those skilled in the art may add or replace other components as desired.
Computing device 100 may be any type of stationary or mobile computing device, including a mobile computer or mobile computing device (e.g., tablet, personal digital assistant, laptop, notebook, netbook, etc.), a mobile phone (e.g., smartphone), a wearable computing device (e.g., smartwatch, smartglasses, etc.), or other type of mobile device, or a stationary computing device such as a desktop computer or PC. Computing device 100 may also be a mobile or stationary server.
The computing device 100 of the present embodiment is connected to the robot system and outputs control instructions to the robot system. In order to realize automatic hand-eye calibration of the robot, the computing device 100 of the present embodiment may further be provided with a display interface for interacting with a user, so that the user can perform operations and image data can be displayed to the user.
The following describes the robot system according to an embodiment of the present application in detail. Referring to fig. 2, the robot system of the present embodiment includes: a robot arm 201, a calibration plate 202, a global camera 204 held fixed relative to the robot arm base, and a local camera 203 fixed to the robot arm tip.
In particular, the vision system may capture images by a camera. In a robot system, the installation manner of a camera may be divided into two types: one is that the camera is installed outside the robot arm, fixed relative to the base of the robot arm, and does not move along with the movement of the robot arm (i.e., the global camera in this embodiment is also called eye-to-hand); the other type is that the camera is mounted at the end of the mechanical arm and moves along with the movement of the mechanical arm (i.e. the local camera in this embodiment is also called eye-in-hand). The calibration modes of the two installation modes are slightly different, but the basic principles are similar: the robot system drives a calibration plate at the tail end of the mechanical arm to change the posture, the camera shoots and identifies the calibration plate in the visual field, and a transformation relation matrix of the global camera relative to the base of the mechanical arm or a transformation relation matrix of the local camera relative to the tail end of the mechanical arm is obtained through a series of calculations. With this relationship, the robot can be guided to perform a work such as grasping by converting an object such as a workpiece in the camera coordinate system to the robot coordinate system.
When the global camera 204 is used for calibration, the calibration plate 202 is fixed at the tail end of the mechanical arm 201, and the relative position of the tail end of the mechanical arm 201 and the calibration plate 202 is kept unchanged. When the local camera 203 is used for calibration, since the local camera 203 is fixed at the tail end of the mechanical arm 201, the calibration plate 202 would be too close to the local camera 203 and would fall outside its visual field if it remained at the tail end; the calibration plate 202 therefore needs to be removed and placed on the ground.
It should be noted that, for the accuracy of the calibration result, it is necessary to ensure that the calibration board 202 is completely in the field of view of the global camera 204 or the local camera 203 and a certain margin is left in the whole calibration process.
Processor 120 may perform the steps of the method shown in fig. 3. Fig. 3 is a flowchart illustrating a method for robot hand-eye calibration according to an embodiment of the present application, including steps 301 to 302.
In this embodiment, the first state refers to driving the robot arm to move, and changing the posture of the robot arm while maintaining the position of the end of the robot arm.
Specifically, in step 301, referring to fig. 4, the method includes:
401. Obtaining a second conversion matrix A2 between the calibration plate coordinate system and the global camera coordinate system according to the calibration plate images shot by the global camera in a plurality of first states.
In this step 401, at least three, and preferably six, postures of the tail end of the mechanical arm are used in the first state to increase the accuracy. The second conversion matrix A2 can be obtained by the Zhang Zhengyou calibration method, a least squares calibration method or an open source computer vision library (OpenCV) calibration method.
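As an illustrative sketch only (not taken from the patent), the plate-to-camera conversion matrix can be estimated with OpenCV from a single image of a chessboard-style calibration plate; the board dimensions, camera intrinsics, file name and variable names below are assumptions:

```python
# Illustrative sketch: estimate the calibration-plate-to-camera transform (e.g. A2)
# from one image of a chessboard-style plate using OpenCV. Board size, square size,
# camera intrinsics and the image path are assumed values, not from the patent.
import cv2
import numpy as np

PATTERN = (9, 6)          # inner corners of the assumed chessboard
SQUARE = 0.02             # square size in metres (assumed)
K = np.array([[1200.0, 0, 640.0],
              [0, 1200.0, 360.0],
              [0, 0, 1.0]])           # assumed camera intrinsics
dist = np.zeros(5)                    # assumed zero distortion

# 3-D corner coordinates in the calibration-plate coordinate system (z = 0 plane)
obj = np.zeros((PATTERN[0] * PATTERN[1], 3), np.float32)
obj[:, :2] = np.mgrid[0:PATTERN[0], 0:PATTERN[1]].T.reshape(-1, 2) * SQUARE

img = cv2.imread("global_camera_view.png")          # hypothetical image file
gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
found, corners = cv2.findChessboardCorners(gray, PATTERN)
assert found, "calibration plate not fully visible"

# solvePnP gives the pose of the plate relative to the camera
_, rvec, tvec = cv2.solvePnP(obj, corners, K, dist)
R, _ = cv2.Rodrigues(rvec)

A = np.eye(4)                         # homogeneous plate-to-camera conversion matrix
A[:3, :3], A[:3, 3] = R, tvec.ravel()
print(A)
```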
402. Randomly selecting at least three points on the calibration plate, and calculating the spatial coordinates of the at least three points in the global camera coordinate system according to the second conversion matrix A2 between the calibration plate coordinate system and the global camera coordinate system.
403. Fitting a sphere center to the spatial coordinates of the at least three points in the global camera coordinate system to obtain the spatial position P of the tail end of the mechanical arm in the global camera coordinate system. Because the position of the tail end is kept unchanged in the first states while the posture changes, every point on the calibration plate moves on a sphere centered at the tail end, so the fitted sphere center is the position of the tail end.
404. Obtaining the relative position coordinate B of the tail end of the mechanical arm in the calibration plate coordinate system according to the spatial position P of the tail end of the mechanical arm in the global camera coordinate system and the second conversion matrix A2 between the calibration plate coordinate system and the global camera coordinate system.
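A minimal numpy sketch of steps 402 to 404 follows, for illustration only; it tracks a single plate point across the first states, fits the sphere it traces (the same can be repeated for each sampled point and the centers averaged), and then expresses the fitted center in the calibration plate coordinate system. The algebraic sphere fit, the example data and all names are assumptions, not the patent's prescribed implementation:

```python
# Illustrative sketch of steps 402-404 (assumed implementation, not from the patent).
# A plate point observed across the first states traces a sphere centered at the
# arm tip, because the tip position is fixed while the posture changes.
import numpy as np

def fit_sphere_center(pts):
    """Least-squares sphere center for points pts of shape (N, 3).

    |x - c|^2 = r^2 is linear in (c, r^2 - |c|^2): 2 c.x + (r^2 - |c|^2) = |x|^2.
    """
    A = np.hstack([2.0 * pts, np.ones((len(pts), 1))])
    b = (pts ** 2).sum(axis=1)
    sol, *_ = np.linalg.lstsq(A, b, rcond=None)
    return sol[:3]

# Camera-frame coordinates of one plate point in each first state (assumed data);
# each row would be computed as (A2_i @ [x_plate, y_plate, z_plate, 1])[:3].
pts_cam = np.array([[0.42, 0.11, 0.95],
                    [0.40, 0.15, 0.93],
                    [0.45, 0.13, 0.97],
                    [0.41, 0.09, 0.96]])

P = fit_sphere_center(pts_cam)       # spatial position of the arm tip, camera frame

A2 = np.eye(4)                       # placeholder plate-to-camera matrix from step 401
B = np.linalg.inv(A2) @ np.append(P, 1.0)   # tip position in the plate frame (homogeneous)
print(B[:3])
```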
In this embodiment, the second state is: driving the mechanical arm to act, changing the posture of the mechanical arm and changing the position of the tail end of the mechanical arm.
Specifically, in step 302, referring to fig. 5, the method includes:
501. Recording the positions of the calibration plate in at least four second states, and obtaining the first conversion matrix A1 between the calibration plate coordinate system and the global camera coordinate system in each second state.
In this step, at least four, and preferably six, postures of the tail end of the mechanical arm are used in the second state to increase the accuracy. The first conversion matrix A1 can be obtained by the Zhang Zhengyou calibration method, a least squares calibration method or an open source computer vision library (OpenCV) calibration method.
It should be noted that the requirement of at least four second states is determined by the number of unknown parameters in the transformation matrix E calculated in the subsequent steps: with fewer than four second states there are more unknown parameters than equations, and E cannot be calculated.
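The counting behind the four-state requirement can be made explicit as follows (an illustrative note, not text from the patent), treating E as a general homogeneous transform with twelve unknown entries:

```latex
% Each second state i contributes one correspondence D_i = E C_{1,i}, i.e. three
% scalar equations in homogeneous coordinates. Treating the rotation and translation
% entries of E as 12 unknowns, at least four states give 4 x 3 = 12 equations.
\begin{equation*}
\underbrace{\begin{bmatrix} R & \mathbf{t}\\ \mathbf{0}^{\top} & 1 \end{bmatrix}}_{E,\ 12\ \text{unknowns}}
\begin{bmatrix} C_{1,i}\\ 1 \end{bmatrix}
=
\begin{bmatrix} D_{i}\\ 1 \end{bmatrix},
\qquad i = 1,\dots,N,\quad N \ge 4 .
\end{equation*}
```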
502. Solving a first position coordinate vector C1 of the tail end of the mechanical arm in the global camera coordinate system in each second state, according to the relative position coordinate B of the tail end of the mechanical arm in the calibration plate coordinate system and the first conversion matrix A1 between the calibration plate coordinate system and the global camera coordinate system in that second state.
The first position coordinate vector C1 of the tail end of the mechanical arm in the global camera coordinate system is solved according to the following formula:
C1 = A1·B.
503. Recording the position coordinate vector D of the tail end of the mechanical arm in the mechanical arm base coordinate system in each second state, and substituting it together with the corresponding first position coordinate vector C1 of the tail end of the mechanical arm in the global camera coordinate system into the following equation:
D = E·C1
wherein E represents the transformation matrix between the global camera coordinate system and the mechanical arm base coordinate system.
504. Solving the transformation matrix E according to the position coordinate vectors D of the tail end of the mechanical arm in the mechanical arm base coordinate system in the at least four second states and the corresponding first position coordinate vectors C1 of the tail end of the mechanical arm in the global camera coordinate system.
Optionally, in step 504, the unknown parameters in the transformation matrix E may be solved by using the absolute orientation algorithm, so as to calculate the transformation matrix E.
The following description will be given with reference to a specific example.
The spatial position coordinates (x, y, z) of a point can be expressed as a position coordinate vector.
Consider a point F in space. In a given state its coordinate value in the mechanical arm base coordinate system is F1(x1, y1, z1) and its coordinate value in the global camera coordinate system is F'1(x'1, y'1, z'1); each coordinate value is recorded as a position coordinate vector. In each state the two vectors F1 and F'1 describe the same physical point and therefore satisfy the relation F1 = E·F'1. Because of the number of unknowns in E, corresponding coordinate vectors of such space points in at least four states are needed to solve for E.
specifically, the solution can be performed according to the absolute Orientation algorithm in the technology.
It should be noted that, since the motion of the end of the robot arm is actively controlled, the position coordinate vector D of the end of the robot arm in the robot arm base coordinate system can be obtained by the robot system itself.
For the embodiment, the transformation matrix E between the global camera coordinate system and the robot arm base coordinate system is obtained, that is, the calibration process of the global camera is completed.
According to the robot hand-eye calibration method of this embodiment, the relative position coordinate B of the tail end of the mechanical arm in the calibration plate coordinate system is obtained in a plurality of first states, the position coordinate vectors D of the tail end of the mechanical arm in the mechanical arm base coordinate system are then recorded in a plurality of second states, and the transformation matrix E between the global camera coordinate system and the mechanical arm base coordinate system is solved according to the first conversion matrix A1 between the calibration plate coordinate system and the global camera coordinate system and the relative position coordinate B of the tail end of the mechanical arm in the calibration plate coordinate system, thereby automatically realizing the hand-eye calibration of the robot under the global camera. The method controls the motion of the mechanical arm automatically, ensures the precision and has a simple and convenient flow.
The embodiment of the application also discloses a method for calibrating the hand and the eye of the robot, which is used in a robot system, wherein the robot system comprises a mechanical arm, a calibration plate and a global camera which is relatively fixed with a mechanical arm base.
In an initial state, the calibration plate is fixed at the tail end of the mechanical arm, and the relative position of the tail end of the mechanical arm and the calibration plate is kept unchanged. Referring to fig. 6, the method includes:
601. Obtaining a second conversion matrix A2 between the calibration plate coordinate system and the global camera coordinate system according to the calibration plate images shot by the global camera in a plurality of first states.
In this embodiment, at least three, and preferably six, postures of the tail end of the mechanical arm are used in the first state to increase the accuracy. The second conversion matrix A2 can be obtained by the Zhang Zhengyou calibration method, a least squares calibration method or an open source computer vision library (OpenCV) calibration method.
602. Randomly selecting at least three points on the calibration plate, and calculating the spatial coordinates of the at least three points in the global camera coordinate system according to the second conversion matrix A2 between the calibration plate coordinate system and the global camera coordinate system.
603. Fitting a sphere center to the spatial coordinates of the at least three points in the global camera coordinate system to obtain the spatial position P of the tail end of the mechanical arm in the global camera coordinate system.
604. Obtaining the relative position coordinate B of the tail end of the mechanical arm in the calibration plate coordinate system according to the spatial position P of the tail end of the mechanical arm in the global camera coordinate system and the second conversion matrix A2 between the calibration plate coordinate system and the global camera coordinate system.
605. Recording the positions of the calibration plate in at least four second states, and obtaining the first conversion matrix A1 between the calibration plate coordinate system and the global camera coordinate system in each second state.
606. Solving a first position coordinate vector C1 of the tail end of the mechanical arm in the global camera coordinate system in each second state, according to the relative position coordinate B of the tail end of the mechanical arm in the calibration plate coordinate system and the first conversion matrix A1 between the calibration plate coordinate system and the global camera coordinate system in that second state.
The first position coordinate vector C1 of the tail end of the mechanical arm in the global camera coordinate system is solved according to the following formula:
C1 = A1·B.
607. Recording the position coordinate vector D of the tail end of the mechanical arm in the mechanical arm base coordinate system in each second state, and substituting it together with the corresponding first position coordinate vector C1 of the tail end of the mechanical arm in the global camera coordinate system into the following equation:
D = E·C1
wherein E represents the transformation matrix between the global camera coordinate system and the mechanical arm base coordinate system;
608. Solving the transformation matrix E according to the position coordinate vectors D of the tail end of the mechanical arm in the mechanical arm base coordinate system in the at least four second states and the corresponding first position coordinate vectors C1 of the tail end of the mechanical arm in the global camera coordinate system.
Optionally, in step 608, the unknown parameters in the transformation matrix E may be solved by using the absolute orientation algorithm, so as to calculate the transformation matrix E.
It should be noted that, the steps 601 to 608 complete the calibration of the global camera, and then the calibration board is separated from the end of the robot arm and located in the visual field of the local camera and the global camera, and the following steps 609 to 610 are performed.
It should be noted that the calibration plate may be placed on the ground or on a desktop, so that the local camera and the global camera can simultaneously capture images of the calibration plate.
609. Acquiring images of the calibration plate in the global camera and the local camera at the same time, obtaining a third conversion matrix A3 between the calibration plate coordinate system and the global camera coordinate system according to the image of the calibration plate in the global camera, and obtaining a fourth conversion matrix A4 between the calibration plate coordinate system and the local camera coordinate system according to the image of the calibration plate in the local camera.
The third conversion matrix A3 and the fourth conversion matrix A4 can be obtained by the Zhang Zhengyou calibration method, a least squares calibration method or an open source computer vision library (OpenCV) calibration method.
610. Solving a second position coordinate vector C2 of the tail end of the mechanical arm in the local camera coordinate system according to the first position coordinate vector C1 of the tail end of the mechanical arm in the global camera coordinate system.
The second position coordinate vector C2 of the tail end of the mechanical arm in the local camera coordinate system is solved according to the following formula:
C2 = A4⁻¹·A3·C1.
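A minimal numpy sketch that applies the formula of step 610 is shown for illustration; the placeholder matrices and variable names are assumptions, with A3, A4 and C1 coming in practice from steps 609 and 606:

```python
# Illustrative sketch of step 610 (assumed names and placeholder values).
import numpy as np

A3 = np.eye(4)   # conversion matrix between the plate and global camera coordinate systems (step 609)
A4 = np.eye(4)   # conversion matrix between the plate and local camera coordinate systems (step 609)
C1 = np.array([0.4, 0.1, 0.9, 1.0])   # first position coordinate vector of the arm tip, homogeneous

# Formula of step 610: C2 = A4^-1 . A3 . C1
C2 = np.linalg.inv(A4) @ A3 @ C1
print(C2[:3])
```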
Through steps 609 to 610, the second position coordinate vector C2 of the tail end of the mechanical arm in the local camera coordinate system is obtained, thus completing the calibration of the local camera.
The method disclosed by the application can also realize the hand-eye calibration task under the global camera and the local camera at the same time, and does not need to calibrate the global camera and the local camera respectively, so that compared with the prior art, the calibration process is further simplified.
The embodiment of the application also discloses a device for calibrating the hands and the eyes of the robot, which is arranged in a robot system, wherein the robot system comprises a mechanical arm, a calibration plate and a global camera which is relatively fixed with a mechanical arm base; the calibration plate is fixed at the tail end of the mechanical arm, and the relative positions of the tail end of the mechanical arm and the calibration plate are kept unchanged;
referring to fig. 7, the apparatus 700 includes:
a first processing module 701, configured to obtain, according to calibration board images captured by the global camera in multiple first states, a relative position coordinate B of the end of the mechanical arm in a coordinate system of the calibration board;
specifically, the first state is: and driving the mechanical arm to act, and changing the posture of the mechanical arm under the condition of keeping the position of the tail end of the mechanical arm unchanged.
A second processing module 702, configured to record position coordinate vectors D of the tail end of the mechanical arm in the mechanical arm base coordinate system in a plurality of second states, and to solve a transformation matrix E between the global camera coordinate system and the mechanical arm base coordinate system according to a first conversion matrix A1 between the calibration plate coordinate system and the global camera coordinate system and the relative position coordinate B of the tail end of the mechanical arm in the calibration plate coordinate system.
Specifically, the second state is: driving the mechanical arm to act, changing the posture of the mechanical arm and changing the position of the tail end of the mechanical arm.
Optionally, the first processing module 701 includes:
a first parameter obtaining module, configured to obtain a second conversion matrix A2 between the calibration plate coordinate system and the global camera coordinate system according to the calibration plate images captured by the global camera in the plurality of first states;
a second parameter obtaining module, configured to randomly select at least three points on the calibration plate and calculate the spatial coordinates of the at least three points in the global camera coordinate system according to the second conversion matrix A2 between the calibration plate coordinate system and the global camera coordinate system;
a third parameter obtaining module, configured to fit a sphere center to the spatial coordinates of the at least three points in the global camera coordinate system to obtain the spatial position P of the tail end of the mechanical arm in the global camera coordinate system;
a fourth parameter obtaining module, configured to obtain the relative position coordinate B of the tail end of the mechanical arm in the calibration plate coordinate system according to the spatial position P of the tail end of the mechanical arm in the global camera coordinate system and the second conversion matrix A2 between the calibration plate coordinate system and the global camera coordinate system.
Optionally, the second processing module 702 includes:
a fifth parameter obtaining module, configured to record the positions of the calibration plate in at least four second states and obtain the first conversion matrix A1 between the calibration plate coordinate system and the global camera coordinate system in each second state;
a sixth parameter obtaining module, configured to solve a first position coordinate vector C1 of the tail end of the mechanical arm in the global camera coordinate system in each second state, according to the relative position coordinate B of the tail end of the mechanical arm in the calibration plate coordinate system and the first conversion matrix A1 between the calibration plate coordinate system and the global camera coordinate system in that second state, using the following formula:
C1 = A1·B;
a seventh parameter obtaining module, configured to record the position coordinate vector D of the tail end of the mechanical arm in the mechanical arm base coordinate system in each second state and substitute it together with the corresponding first position coordinate vector C1 of the tail end of the mechanical arm in the global camera coordinate system into the following equation:
D = E·C1
wherein E represents the transformation matrix between the global camera coordinate system and the mechanical arm base coordinate system.
Optionally, the unknown parameters in the transformation matrix E may be solved by using the absolute orientation algorithm, so as to calculate the transformation matrix E.
a calculation module, configured to solve the transformation matrix E according to the position coordinate vectors D of the tail end of the mechanical arm in the mechanical arm base coordinate system in the at least four second states and the corresponding first position coordinate vectors C1 of the tail end of the mechanical arm in the global camera coordinate system.
Optionally, the robotic system further comprises: a local camera fixed at the end of the mechanical arm;
the device for robot hand-eye calibration of the embodiment of the application further includes:
an eighth parameter obtaining module, configured to, when the calibration plate is detached from the tail end of the mechanical arm and is located within the visual field of the local camera and the global camera, acquire images of the calibration plate in the global camera and the local camera at the same time, obtain a third conversion matrix A3 between the calibration plate coordinate system and the global camera coordinate system according to the image of the calibration plate in the global camera, and obtain a fourth conversion matrix A4 between the calibration plate coordinate system and the local camera coordinate system according to the image of the calibration plate in the local camera;
a ninth parameter obtaining module, configured to solve a second position coordinate vector C2 of the tail end of the mechanical arm in the local camera coordinate system according to the first position coordinate vector C1 of the tail end of the mechanical arm in the global camera coordinate system.
The second position coordinate vector C2 of the tail end of the mechanical arm in the local camera coordinate system is solved according to the following formula:
C2 = A4⁻¹·A3·C1.
The above is a schematic solution of the device for robot hand-eye calibration of this embodiment. It should be noted that the technical solution of the device for robot hand-eye calibration and the technical solution of the method for robot hand-eye calibration belong to the same concept; for details not described in the technical solution of the device, reference may be made to the description of the technical solution of the method for robot hand-eye calibration.
An embodiment of the present application also provides a computer readable storage medium storing computer instructions which, when executed by a processor, implement the steps of the method for robot eye calibration as described above.
The above is an illustrative scheme of the computer-readable storage medium of this embodiment. It should be noted that the technical solution of the storage medium and the technical solution of the method for robot hand-eye calibration belong to the same concept; for details not described in the technical solution of the storage medium, reference may be made to the description of the technical solution of the method for robot hand-eye calibration.
The computer instructions comprise computer program code which may be in the form of source code, object code, an executable file or some intermediate form, or the like. The computer-readable medium may include: any entity or device capable of carrying the computer program code, recording medium, usb disk, removable hard disk, magnetic disk, optical disk, computer Memory, Read-Only Memory (ROM), Random Access Memory (RAM), electrical carrier wave signals, telecommunications signals, software distribution medium, and the like. It should be noted that the computer readable medium may contain content that is subject to appropriate increase or decrease as required by legislation and patent practice in jurisdictions, for example, in some jurisdictions, computer readable media does not include electrical carrier signals and telecommunications signals as is required by legislation and patent practice.
It should be noted that, for the sake of simplicity, the above-mentioned method embodiments are described as a series of acts or combinations, but those skilled in the art should understand that the present application is not limited by the described order of acts, as some steps may be performed in other orders or simultaneously according to the present application. Further, those skilled in the art should also appreciate that the embodiments described in the specification are preferred embodiments and that the acts and modules referred to are not necessarily required in this application.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to related descriptions of other embodiments.
The preferred embodiments of the present application disclosed above are intended only to aid in the explanation of the application. Alternative embodiments are not exhaustive and do not limit the invention to the precise embodiments described. Obviously, many modifications and variations are possible in light of the above teaching. The embodiments were chosen and described in order to best explain the principles of the application and the practical application, to thereby enable others skilled in the art to best understand and utilize the application. The application is limited only by the claims and their full scope and equivalents.
Claims (11)
1. A robot hand-eye calibration method is characterized by being used in a robot system, wherein the robot system comprises a mechanical arm, a calibration plate and a global camera which is relatively fixed with a mechanical arm base; the calibration plate is fixed at the tail end of the mechanical arm, and the relative positions of the tail end of the mechanical arm and the calibration plate are kept unchanged;
the method comprises the following steps:
obtaining, according to the calibration plate images shot by the global camera in a plurality of first states, a second conversion matrix A2 between the calibration plate coordinate system and the global camera coordinate system and a spatial position P of the tail end of the mechanical arm in the global camera coordinate system, and obtaining a relative position coordinate B of the tail end of the mechanical arm in the calibration plate coordinate system according to the second conversion matrix A2 between the calibration plate coordinate system and the global camera coordinate system and the spatial position P of the tail end of the mechanical arm in the global camera coordinate system, wherein the first state is driving the mechanical arm to act and changing the posture of the mechanical arm while keeping the position of the tail end of the mechanical arm unchanged;
solving a transformation matrix E between the global camera coordinate system and the mechanical arm base coordinate system according to position coordinate vectors D of the tail end of the mechanical arm in the mechanical arm base coordinate system in a plurality of second states, a first conversion matrix A1 between the calibration plate coordinate system and the global camera coordinate system, and the relative position coordinate B of the tail end of the mechanical arm in the calibration plate coordinate system, wherein the second state is driving the mechanical arm to act, changing the posture of the mechanical arm and changing the position of the tail end of the mechanical arm.
2. The method according to claim 1, wherein obtaining the second conversion matrix A2 between the calibration plate coordinate system and the global camera coordinate system and the spatial position P of the tail end of the mechanical arm in the global camera coordinate system according to the calibration plate images shot by the global camera in the plurality of first states comprises:
obtaining the second conversion matrix A2 between the calibration plate coordinate system and the global camera coordinate system according to the calibration plate images shot by the global camera in the plurality of first states;
randomly selecting at least three points on the calibration plate, and calculating the spatial coordinates of the at least three points in the global camera coordinate system according to the second conversion matrix A2 between the calibration plate coordinate system and the global camera coordinate system;
fitting a sphere center to the spatial coordinates of the at least three points in the global camera coordinate system to obtain the spatial position P of the tail end of the mechanical arm in the global camera coordinate system.
3. The method of claim 1, wherein solving the transformation matrix E between the global camera coordinate system and the mechanical arm base coordinate system according to the position coordinate vectors D of the tail end of the mechanical arm in the mechanical arm base coordinate system in the plurality of second states, the first conversion matrix A1 between the calibration plate coordinate system and the global camera coordinate system, and the relative position coordinate B of the tail end of the mechanical arm in the calibration plate coordinate system comprises:
recording the positions of the calibration plate in at least four second states, and obtaining the first conversion matrix A1 between the calibration plate coordinate system and the global camera coordinate system in each second state;
solving a first position coordinate vector C1 of the tail end of the mechanical arm in the global camera coordinate system in each second state, according to the relative position coordinate B of the tail end of the mechanical arm in the calibration plate coordinate system and the first conversion matrix A1 between the calibration plate coordinate system and the global camera coordinate system in that second state, using the following formula:
C1 = A1·B;
recording the position coordinate vector D of the tail end of the mechanical arm in the mechanical arm base coordinate system in each second state, and substituting it together with the corresponding first position coordinate vector C1 of the tail end of the mechanical arm in the global camera coordinate system into the following equation:
D = E·C1
wherein E represents the transformation matrix between the global camera coordinate system and the mechanical arm base coordinate system;
solving the transformation matrix E according to the position coordinate vectors D of the tail end of the mechanical arm in the mechanical arm base coordinate system in the at least four second states and the corresponding first position coordinate vectors C1 of the tail end of the mechanical arm in the global camera coordinate system.
4. The method of claim 3, wherein the robotic system further comprises: a local camera fixed at the end of the mechanical arm;
the calibration plate is separated from the tail end of the mechanical arm and is positioned in the visual field range of the local camera and the global camera;
the method further comprises the following steps:
acquiring images of the calibration plate in the global camera and the local camera at the same time, and obtaining a third conversion matrix A between the coordinate system of the calibration plate and the coordinate system of the global camera according to the image of the calibration plate in the global camera3According to the local cameraThe image of the calibration plate is obtained, and a fourth conversion matrix A between the coordinate system of the calibration plate and the coordinate system of the local camera is obtained4;
according to the first position coordinate vector C1 of the tail end of the mechanical arm in the global camera coordinate system, solving the following formula to obtain the second position coordinate vector C2 of the tail end of the mechanical arm in the local camera coordinate system:
C2 = A4⁻¹ · A3 · C1.
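Claim 4 chains the two simultaneous plate observations to carry the tail-end position from the global camera frame into the local camera frame. Below is a minimal sketch that follows the claim's formula C2 = A4⁻¹·A3·C1 with 4x4 homogeneous transforms; the direction conventions of A3 and A4 follow the claim, and the numeric transforms are placeholders.

```python
import numpy as np

def tip_in_local_camera(A3, A4, C1):
    """Apply C2 = inv(A4) @ A3 @ C1, with A3 and A4 given as 4x4 homogeneous
    transforms and C1 as a 3-vector (arm-tip position, global camera frame)."""
    C1_h = np.append(np.asarray(C1, dtype=float), 1.0)   # homogeneous point
    C2_h = np.linalg.inv(A4) @ A3 @ C1_h
    return C2_h[:3]

# Placeholder transforms obtained from the two simultaneous plate images.
A3 = np.eye(4); A3[:3, 3] = [0.10, 0.00, 0.50]
A4 = np.eye(4); A4[:3, 3] = [0.00, 0.05, 0.20]
C2 = tip_in_local_camera(A3, A4, [0.20, 0.10, 0.60])
```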
5. The method of claim 4, wherein the first transformation matrix A1 between the calibration plate coordinate system and the global camera coordinate system, the second transformation matrix A2 between the calibration plate coordinate system and the global camera coordinate system, the third transformation matrix A3, and the fourth transformation matrix A4 are obtained by the Zhang Zhengyou calibration method, a least-squares calibration method, or an open source computer vision library calibration method.
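For the open source computer vision library route named in claim 5, a plate-to-camera transform is commonly estimated from one image using OpenCV's chessboard detection followed by solvePnP. Below is a minimal sketch, assuming a chessboard-style calibration plate and already-calibrated intrinsics K and distortion coefficients dist; the function name and parameters are illustrative, not from the patent.

```python
import cv2
import numpy as np

def plate_to_camera_transform(image, board_size, square, K, dist):
    """Estimate a 4x4 transform between the calibration plate frame and a
    camera frame from one image. board_size = (inner corners per row, per
    column), square = edge length of one square, K/dist = camera intrinsics
    and distortion coefficients."""
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
    found, corners = cv2.findChessboardCorners(gray, board_size)
    if not found:
        raise RuntimeError("calibration plate not detected")
    # 3-D corner coordinates in the plate coordinate system (z = 0 plane)
    objp = np.zeros((board_size[0] * board_size[1], 3), np.float32)
    objp[:, :2] = np.mgrid[0:board_size[0], 0:board_size[1]].T.reshape(-1, 2) * square
    _, rvec, tvec = cv2.solvePnP(objp, corners, K, dist)
    R, _ = cv2.Rodrigues(rvec)
    T = np.eye(4)
    T[:3, :3], T[:3, 3] = R, tvec.ravel()
    return T    # used as A1, A2, A3 or A4 depending on the camera and state
```

The same routine serves both the global and the local camera; only the image source and the intrinsics change.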
6. A robot hand-eye calibration device, characterized in that the device is arranged in a robot system, wherein the robot system comprises a mechanical arm, a calibration plate, and a global camera fixed relative to the mechanical arm base;
the calibration plate is fixed at the tail end of the mechanical arm, and the relative positions of the tail end of the mechanical arm and the calibration plate are kept unchanged;
the device comprises:
a first processing module, configured to obtain the second transformation matrix A2 between the calibration plate coordinate system and the global camera coordinate system and the spatial position P of the tail end of the mechanical arm in the global camera coordinate system according to calibration plate images captured by the global camera in a plurality of first states, and to obtain the relative position coordinate B of the tail end of the mechanical arm in the calibration plate coordinate system according to the second transformation matrix A2 and the spatial position P, wherein the first state is a state in which the mechanical arm is driven to move and the posture of the mechanical arm is changed while the position of the tail end of the mechanical arm is kept unchanged;
a second processing module, configured to solve for the transformation matrix E between the global camera coordinate system and the mechanical arm base coordinate system according to the position coordinate vectors D of the tail end of the mechanical arm in the mechanical arm base coordinate system in a plurality of second states and the first transformation matrix A1 between the calibration plate coordinate system and the global camera coordinate system, wherein the second state is a state in which the mechanical arm is driven to move, the posture of the mechanical arm is changed, and the position of the tail end of the mechanical arm is changed.
7. The apparatus of claim 6, wherein the first processing module comprises:
a first parameter obtaining module, configured to obtain the second transformation matrix A2 between the calibration plate coordinate system and the global camera coordinate system according to calibration plate images captured by the global camera in a plurality of first states;
a second parameter obtaining module, configured to randomly select at least three points on the calibration plate and calculate the spatial coordinates of the at least three points in the global camera coordinate system according to the second transformation matrix A2 between the calibration plate coordinate system and the global camera coordinate system;
and a third parameter obtaining module, configured to fit a sphere center to the spatial coordinates of the at least three points in the global camera coordinate system to obtain the spatial position P of the tail end of the mechanical arm in the global camera coordinate system.
8. The apparatus of claim 6, wherein the second processing module comprises:
a fifth parameter obtaining module, configured to record the position of the calibration plate in at least four second states and obtain the first transformation matrix A1 between the calibration plate coordinate system and the global camera coordinate system in each second state;
a sixth parameter obtaining module, configured to solve the following formula, according to the relative position coordinate B of the tail end of the mechanical arm in the calibration plate coordinate system and the first transformation matrix A1 between the calibration plate coordinate system and the global camera coordinate system in each second state, to obtain the first position coordinate vector C1 of the tail end of the mechanical arm in the global camera coordinate system in each second state:
C1=A1·B;
a seventh parameter obtaining module, configured to record the position coordinate vector D of the tail end of the mechanical arm in the mechanical arm base coordinate system in each second state, and substitute D together with the corresponding first position coordinate vector C1 of the tail end of the mechanical arm in the global camera coordinate system in that second state into the following equation:
D=E·C1
wherein E represents a transformation matrix between the global camera coordinate system and the robot arm base coordinate system;
and a calculating module, configured to solve for the transformation matrix E according to the position coordinate vectors D of the tail end of the mechanical arm in the mechanical arm base coordinate system in at least four second states and the corresponding first position coordinate vectors C1 of the tail end of the mechanical arm in the global camera coordinate system in those second states.
9. The apparatus of claim 8, wherein the robotic system further comprises: a local camera fixed at the end of the mechanical arm;
the device further comprises:
an eighth parameter obtaining module, configured to, when the calibration plate is detached from the tail end of the mechanical arm and is located within the field of view of both the local camera and the global camera, acquire images of the calibration plate with the global camera and the local camera at the same time, obtain the third transformation matrix A3 between the calibration plate coordinate system and the global camera coordinate system according to the image of the calibration plate from the global camera, and obtain the fourth transformation matrix A4 between the calibration plate coordinate system and the local camera coordinate system according to the image of the calibration plate from the local camera;
and a ninth parameter obtaining module, configured to solve the following formula, according to the first position coordinate vector C1 of the tail end of the mechanical arm in the global camera coordinate system, to obtain the second position coordinate vector C2 of the tail end of the mechanical arm in the local camera coordinate system:
C2 = A4⁻¹ · A3 · C1.
10. A computing device comprising a memory, a processor, and computer instructions stored on the memory and executable on the processor, the processor implementing the steps of the method of any one of claims 1-5 when executing the instructions.
11. A computer-readable storage medium storing computer instructions, which when executed by a processor, perform the steps of the method of any one of claims 1 to 5.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810162742.2A CN110193849B (en) | 2018-02-27 | 2018-02-27 | Method and device for calibrating hands and eyes of robot |
Publications (2)
Publication Number | Publication Date |
---|---|
CN110193849A CN110193849A (en) | 2019-09-03 |
CN110193849B (en) | 2021-06-29 |
Family
ID=67750887
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201810162742.2A (CN110193849B, active) | Method and device for calibrating hands and eyes of robot | 2018-02-27 | 2018-02-27 |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN110193849B (en) |
Families Citing this family (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110640746B (en) * | 2019-11-07 | 2023-03-24 | 上海电气集团股份有限公司 | Method, system, equipment and medium for calibrating and positioning coordinate system of robot |
CN111002312A (en) * | 2019-12-18 | 2020-04-14 | 江苏集萃微纳自动化系统与装备技术研究所有限公司 | Industrial robot hand-eye calibration method based on calibration ball |
CN110977987B (en) * | 2019-12-25 | 2021-07-20 | 杭州未名信科科技有限公司 | Mechanical arm hand-eye calibration method, device and system |
CN111791227B (en) * | 2019-12-31 | 2022-03-11 | 深圳市豪恩声学股份有限公司 | Robot hand-eye calibration method and device and robot |
CN111973212B (en) * | 2020-08-19 | 2022-05-17 | 杭州三坛医疗科技有限公司 | Parameter calibration method and parameter calibration device |
CN112258589B (en) * | 2020-11-16 | 2024-07-02 | 北京如影智能科技有限公司 | Method and device for calibrating eyes and hands |
CN112489133B (en) * | 2020-11-17 | 2024-10-18 | 北京京东乾石科技有限公司 | Calibration method, device and equipment of hand-eye system |
CN112700505B (en) * | 2020-12-31 | 2022-11-22 | 山东大学 | Binocular three-dimensional tracking-based hand and eye calibration method and device and storage medium |
CN112381894B (en) * | 2021-01-15 | 2021-04-20 | 清华大学 | Adaptive light field imaging calibration method, device and storage medium |
CN113129383B (en) * | 2021-03-15 | 2024-06-14 | 中建科技集团有限公司 | Hand-eye calibration method and device, communication equipment and storage medium |
CN113843792B (en) * | 2021-09-23 | 2024-02-06 | 四川锋准机器人科技有限公司 | Hand-eye calibration method of surgical robot |
CN114310881B (en) * | 2021-12-23 | 2024-09-13 | 中国科学院自动化研究所 | Calibration method and system of mechanical arm quick-change device and electronic equipment |
CN114794667B (en) * | 2022-03-31 | 2023-04-14 | 深圳市如本科技有限公司 | Tool calibration method, system, device, electronic equipment and readable storage medium |
CN115127452B (en) * | 2022-09-02 | 2022-12-09 | 苏州鼎纳自动化技术有限公司 | Notebook computer shell size detection method, system and storage medium |
Family Cites Families (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPS60252913A (en) * | 1984-05-30 | 1985-12-13 | Mitsubishi Electric Corp | Robot controller |
CN1292878C (en) * | 2003-09-03 | 2007-01-03 | 中国科学院自动化研究所 | Pickup camera self calibration method based on robot motion |
CN103558850B (en) * | 2013-07-26 | 2017-10-24 | 无锡信捷电气股份有限公司 | A kind of welding robot full-automatic movement self-calibration method of laser vision guiding |
CN105014678A (en) * | 2015-07-16 | 2015-11-04 | 深圳市得意自动化科技有限公司 | Robot hand-eye calibration method based on laser range finding |
CN105014679A (en) * | 2015-08-03 | 2015-11-04 | 华中科技大学无锡研究院 | Robot hand and eye calibrating method based on scanner |
CN106767393B (en) * | 2015-11-20 | 2020-01-03 | 沈阳新松机器人自动化股份有限公司 | Hand-eye calibration device and method for robot |
CN106426172B (en) * | 2016-10-27 | 2019-04-16 | 深圳元启智能技术有限公司 | A kind of scaling method and system of industrial robot tool coordinates system |
CN107498558A (en) * | 2017-09-19 | 2017-12-22 | 北京阿丘科技有限公司 | Full-automatic hand and eye calibrating method and device |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN110193849B (en) | Method and device for calibrating hands and eyes of robot | |
CN111801198B (en) | Hand-eye calibration method, system and computer storage medium | |
Lee et al. | Camera-to-robot pose estimation from a single image | |
CN112022355B (en) | Hand-eye calibration method and device based on computer vision and storage medium | |
CN113601503B (en) | Hand-eye calibration method, device, computer equipment and storage medium | |
CN106426172B (en) | A kind of scaling method and system of industrial robot tool coordinates system | |
RU2700246C1 (en) | Method and system for capturing an object using a robot device | |
CN109421050B (en) | Robot control method and device | |
KR20200107789A (en) | Automatic calibration for a robot optical sensor | |
CN112171666B (en) | Pose calibration method and device for visual robot, visual robot and medium | |
US9165213B2 (en) | Information processing apparatus, information processing method, and program | |
CN111208783A (en) | Action simulation method, device, terminal and computer storage medium | |
CN109108968A (en) | Exchange method, device, equipment and the storage medium of robot head movement adjustment | |
CN112752025B (en) | Lens switching method and device for virtual scene | |
CN112119427A (en) | Method, system, readable storage medium and movable platform for object following | |
CN113997295B (en) | Hand-eye calibration method and device for mechanical arm, electronic equipment and storage medium | |
CN113246131B (en) | Motion capture method and device, electronic equipment and mechanical arm control system | |
CN113284192A (en) | Motion capture method and device, electronic equipment and mechanical arm control system | |
CN110478903B (en) | Control method and device for virtual camera | |
CN115042184A (en) | Robot hand-eye coordinate conversion method and device, computer equipment and storage medium | |
CN114012718A (en) | Data processing method | |
JP2024503275A (en) | Mobile robot control method, computer-implemented storage medium, and mobile robot | |
JP2015230625A (en) | Image display device | |
US20210156710A1 (en) | Map processing method, device, and computer-readable storage medium | |
CN116922374A (en) | Binocular vision calibration method, calibration device, robot and storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||