CN115922707A - Calibration method and device for visual positioning guide system of mechanical arm


Info

Publication number
CN115922707A
Authority
CN
China
Prior art keywords
calibration, coordinate, visual, determining, mechanical arm
Prior art date
Legal status
Pending
Application number
CN202211500846.2A
Other languages
Chinese (zh)
Inventor
张学钢
刘浩
严力
王骏荣
钟权
Current Assignee
LCFC Hefei Electronics Technology Co Ltd
Original Assignee
LCFC Hefei Electronics Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by LCFC Hefei Electronics Technology Co Ltd filed Critical LCFC Hefei Electronics Technology Co Ltd
Priority to CN202211500846.2A
Publication of CN115922707A


Classifications

    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P: CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P 90/00: Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P 90/02: Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Landscapes

  • Manipulator (AREA)

Abstract

The present disclosure provides a calibration method and device for a visual positioning guidance system of a mechanical arm. The method comprises the following steps: instructing the mechanical arm to move to at least three calibration positions; determining a visual calibration coordinate for each calibration position according to a calibration object fixed at the end of the mechanical arm, the pose of the calibration object relative to the end of the mechanical arm remaining unchanged; performing coordinate conversion on the visual calibration coordinates according to an end coordinate system and a base coordinate system corresponding to the mechanical arm, and determining a coordinate conversion relation; and determining a calibration result corresponding to the mechanical arm and the visual positioning guidance system according to the coordinate conversion relation. With the method provided by the disclosure, no professional calibration tool is required and the mechanical arm does not need to be manually controlled to move to specific calibration positions, so the influence of factors such as the manufacturing precision of a calibration tool and manual control precision on the calibration result is avoided, the accuracy of the calibration result is improved, and the calibration process is simple and easy to operate.

Description

Calibration method and device for visual positioning guide system of mechanical arm
Technical Field
The disclosure relates to the technical field of automation, and in particular relates to a calibration method and device for a visual positioning guide system of a mechanical arm.
Background
A mechanical arm guided by a vision system can replace manual work in high-intensity and high-risk tasks, so it is widely applied in various industries. Calibrating the relative pose relationship between the vision system and the mechanical arm is an important prerequisite for the vision system to guide the operation of the mechanical arm.
The existing "eye-to-hand" (eye outside the hand) calibration method requires calibration objects such as a checkerboard: the coordinates of the coordinate points of the checkerboard in the visual coordinate system are obtained through the vision system, the mechanical arm is then manually controlled so that a tool such as a calibration rod mounted at the end of the mechanical arm touches each coordinate point obtained by the vision system in turn, the coordinates of the corresponding coordinate points in the mechanical arm coordinate system are thereby determined, and the relative pose relationship between the vision system and the mechanical arm is calculated from the coordinates of the same coordinate points in the visual coordinate system and in the mechanical arm coordinate system. This eye-to-hand calibration method requires professional calibration tools such as a checkerboard, and the mechanical arm must be manually controlled to specific coordinate points, so the calibration precision is affected by the manual control precision, the manufacturing precision of the calibration tool and other factors, and the operation is complex.
Disclosure of Invention
The present disclosure provides a calibration method and device for a vision positioning guidance system of a robot arm, so as to at least solve the above technical problems in the prior art.
According to a first aspect of the present disclosure, there is provided a calibration method for a visual positioning guidance system of a robot arm, the method comprising: instructing the mechanical arm to move to at least three calibration positions; determining a visual calibration coordinate aiming at the calibration position according to a calibration object fixed at the tail end of the mechanical arm, wherein the poses of the calibration object and the tail end of the mechanical arm are kept unchanged; performing coordinate conversion on the vision calibration coordinate according to a terminal coordinate system and a base coordinate system corresponding to the mechanical arm, and determining a coordinate conversion relation; and determining a calibration result corresponding to the mechanical arm and the visual positioning guide system according to the coordinate conversion relation.
In one embodiment, the indicating mechanical arm moves to at least three calibration positions, including: determining a vision system field of view corresponding to the vision positioning guidance system; instructing the robotic arm to move within the field of view of the vision system to at least three calibration positions, the three calibration positions being non-collinear.
In an embodiment, the determining the visual calibration coordinates for the calibration position according to the calibration object fixed at the end of the mechanical arm comprises: determining, by a vision sensor, a calibration point of the calibration object in the field of view of the vision system; determining a corresponding visual coordinate system according to the visual positioning guide system; and determining the visual calibration coordinates of the calibration points corresponding to the visual coordinate system.
In an implementation manner, the coordinate transformation of the visual calibration coordinate according to the end coordinate system and the base coordinate system corresponding to the robot arm to determine a coordinate transformation relationship includes: determining first conversion matrix information corresponding to the tail end coordinate system and the base coordinate system based on the tail end calibration coordinate of the calibration object corresponding to the tail end coordinate system and the base calibration coordinate of the calibration object corresponding to the base coordinate system; and determining a coordinate conversion relation according to the visual calibration coordinate, the tail end calibration coordinate, the base calibration coordinate and the first conversion matrix information.
In an embodiment, the determining a coordinate transformation relation according to the visual calibration coordinates, the end calibration coordinates, the base calibration coordinates, and the first transformation matrix information includes: determining a first conversion relation according to the base calibration coordinate, the tail end calibration coordinate and the first conversion matrix information; and determining a second conversion relation corresponding to the second conversion matrix information according to the base calibration coordinate and the visual calibration coordinate.
In an embodiment, the determining calibration results corresponding to the mechanical arm and the visual positioning and guiding system according to the coordinate transformation relationship includes: integrating the first conversion relation and the second conversion relation to determine an equivalent conversion relation; mutually integrating any two calibration positions in the at least three calibration positions to determine an equivalent transformation relation group; determining the second transformation matrix information corresponding to the visual coordinate system and the base coordinate system based on the equivalence transformation relation set; and determining a calibration result corresponding to the mechanical arm and the visual positioning guide system according to the second conversion matrix information.
In one embodiment, the first transformation matrix information includes: a first rotation matrix and a first translation vector of the end coordinate system corresponding to the base coordinate system; the second conversion matrix information includes: a second rotation matrix and a second translation vector of the visual coordinate system corresponding to the base coordinate system.
According to a second aspect of the present disclosure, there is provided a calibration apparatus for a visual positioning guidance system of a robot arm, the apparatus comprising: the indicating module is used for indicating the mechanical arm to move to at least three calibration positions; the first determining module is used for determining a visual calibration coordinate aiming at the calibration position according to a calibration object fixed at the tail end of the mechanical arm, and the positions of the calibration object and the tail end of the mechanical arm are kept unchanged; the second determining module is used for performing coordinate conversion on the visual calibration coordinate according to a terminal coordinate system and a base coordinate system corresponding to the mechanical arm to determine a coordinate conversion relation; and the third determining module is used for determining a calibration result corresponding to the mechanical arm and the visual positioning guide system according to the coordinate conversion relation.
In an embodiment, the indication module includes: a first determining submodule for determining a visual system field of view corresponding to the visual positioning guidance system; an indication sub-module for indicating the mechanical arm to move to at least three calibration positions within the visual system field of view, the three calibration positions being non-collinear.
In one embodiment, the first determining module includes: a second determination submodule for determining, by a vision sensor, a calibration point of the calibration object in the field of view of the vision system; the third determining submodule is used for determining a corresponding visual coordinate system according to the visual positioning guide system; and the fourth determining submodule is used for determining the visual calibration coordinates of the calibration point corresponding to the visual coordinate system.
In an embodiment, the second determining module includes: the fifth determining submodule is used for determining first conversion matrix information corresponding to the tail end coordinate system and the base coordinate system based on the tail end calibration coordinate of the calibration object corresponding to the tail end coordinate system and the base calibration coordinate of the calibration object corresponding to the base coordinate system; and the sixth determining submodule is used for determining a coordinate conversion relation according to the visual calibration coordinate, the tail end calibration coordinate, the base calibration coordinate and the first conversion matrix information.
In one embodiment, the sixth determining sub-module includes: the first determining unit is used for determining a first conversion relation according to the base calibration coordinate, the tail end calibration coordinate and the first conversion matrix information; and the second determining unit is used for determining a second conversion relation corresponding to the second conversion matrix information according to the base calibration coordinate and the visual calibration coordinate.
In an embodiment, the third determining module includes: the first integration submodule is used for integrating the first conversion relation and the second conversion relation and determining an equivalent conversion relation; the second integration submodule is used for mutually integrating any two calibration positions in the at least three calibration positions to determine an equivalent transformation relation group; a seventh determining sub-module, configured to determine, based on the equivalence transformation relation group, the second transformation matrix information corresponding to the visual coordinate system and the base coordinate system; and the eighth determining submodule is used for determining a calibration result corresponding to the mechanical arm and the visual positioning and guiding system according to the second conversion matrix information.
According to a third aspect of the present disclosure, there is provided an electronic device comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the methods of the present disclosure.
According to a fourth aspect of the present disclosure, there is provided a non-transitory computer readable storage medium having stored thereon computer instructions for causing the computer to perform the method of the present disclosure.
According to the calibration method, device, equipment and storage medium for the visual positioning guidance system of the mechanical arm provided by the present disclosure, a calibration object is fixed at the end of the mechanical arm with its pose relative to the end kept fixed, the mechanical arm is then instructed to move to at least three arbitrary calibration positions within the field of view of the vision system, and the calibration result corresponding to the mechanical arm and the visual positioning guidance system is obtained from the conversion relations among the end coordinate system, the base coordinate system and the visual coordinate system corresponding to the mechanical arm. No professional calibration tool is required and the mechanical arm does not need to be manually operated to move to specific calibration positions, so the influence of factors such as the manufacturing precision of a calibration tool and manual operation precision on the calibration precision is avoided, the accuracy of the calibration result is improved, and the calibration process is simple and easy to operate.
It should be understood that the statements in this section do not necessarily identify key or critical features of the embodiments of the present disclosure, nor do they limit the scope of the present disclosure. Other features of the present disclosure will become apparent from the following description.
Drawings
The above and other objects, features and advantages of exemplary embodiments of the present disclosure will become readily apparent from the following detailed description read in conjunction with the accompanying drawings. Several embodiments of the present disclosure are illustrated by way of example, and not by way of limitation, in the figures of the accompanying drawings and in which:
in the drawings, the same or corresponding reference numerals indicate the same or corresponding parts.
Fig. 1 is a schematic flow chart illustrating a first implementation of a calibration method for a vision positioning guidance system of a robot arm according to an embodiment of the present disclosure;
fig. 2 is a schematic flow chart illustrating an implementation of a calibration method for a vision positioning guidance system of a robot arm according to an embodiment of the present disclosure;
Fig. 3 shows a schematic diagram of a calibration apparatus for a vision positioning guidance system of a robotic arm according to an embodiment of the present disclosure;
fig. 4 shows a schematic structural diagram of an electronic device according to an embodiment of the present disclosure.
Detailed Description
In order to make the objects, features and advantages of the present disclosure more obvious and understandable, the technical solutions in the embodiments of the present disclosure will be clearly and completely described below with reference to the drawings in the embodiments of the present disclosure. All other embodiments, which can be derived by a person skilled in the art from the embodiments disclosed herein without making any creative effort, shall fall within the protection scope of the present disclosure.
Fig. 1 shows a first implementation flowchart of a calibration method for a vision positioning guidance system of a robot arm according to an embodiment of the present disclosure.
As shown in fig. 1, according to a first aspect of the embodiments of the present disclosure, there is provided a calibration method for a visual positioning guidance system of a robot arm, the method including: an operation 101 of indicating that the mechanical arm moves to at least three calibration positions; operation 102, determining a visual calibration coordinate for a calibration position according to a calibration object fixed at the tail end of the mechanical arm, wherein the poses of the calibration object and the tail end of the mechanical arm are kept unchanged; operation 103, performing coordinate conversion on the visual calibration coordinates according to the end coordinate system and the base coordinate system corresponding to the mechanical arm, and determining a coordinate conversion relation; and operation 104, determining a calibration result corresponding to the mechanical arm and the visual positioning guide system according to the coordinate conversion relation.
According to the calibration method for the visual positioning guidance system of the mechanical arm provided by this embodiment, a calibration object is fixed at the end of the mechanical arm with its pose relative to the end kept fixed, and the mechanical arm is then instructed to move to at least three arbitrary calibration positions within the field of view of the vision system, so that the calibration result corresponding to the mechanical arm and the visual positioning guidance system can be obtained. By this method, no professional calibration tool is required, the mechanical arm does not need to be manually operated to move to specific calibration positions, and the influence of factors such as the manufacturing precision of a calibration tool and manual operation precision on the calibration result is avoided, so the accuracy of the calibration result is improved and the difficulty of the calibration operation is reduced.
In the method operation 101, the calibration location may be any location within the field of view of the vision system; operation 101 may specifically be instructing the robotic arm to randomly move at least three times within the field of view of the vision system, and determining a calibration position for each movement of the robotic arm.
In operation 102 of the method, the calibration object may be any object with edges and corners, such as a piece of cardboard or a book, and is fixed at the end of the mechanical arm; there is no specific requirement on the fixing manner or pose of the calibration object, as long as the relative pose of the calibration object and the end of the mechanical arm remains unchanged. The visual calibration coordinate for a calibration position is the coordinate value, in the visual coordinate system, of the calibration point when the calibration object is at that calibration position; the calibration point is a point on the calibration object and may be determined according to an end point (corner) of the calibration object.
Specifically, a calibration object is fixed at the end of the mechanical arm and the relative pose of the calibration object and the end of the mechanical arm is kept fixed and unchanged. The mechanical arm is controlled to move randomly, and the first position where it stops is taken as the first calibration position; the calibration object at the first calibration position is photographed by the visual positioning guidance system, which gives the visual calibration coordinate of the calibration point on the calibration object at the first calibration position. At least three visual calibration coordinates of the calibration point, one for each calibration position, are obtained in the same way.
In the method operation 103, the terminal coordinate system corresponding to the mechanical arm is a rectangular coordinate system established based on the terminal of the mechanical arm, and the base coordinate system corresponding to the mechanical arm is a rectangular coordinate system established based on the mounting base of the mechanical arm; the coordinate transformation relation is a first transformation relation and a second transformation relation which are established according to the coordinate relation among the tail end coordinate system, the base coordinate system and the visual coordinate system corresponding to the mechanical arm; the first conversion relation is that the base calibration coordinate of the calibration point when the calibration object is located at the calibration position is obtained after the tail end calibration coordinate of the calibration point when the calibration object is located at the calibration position is subjected to rotary translation, and the second conversion relation is that the base calibration coordinate of the calibration point when the calibration object is located at the calibration position is obtained after the visual calibration coordinate of the calibration point when the calibration object is located at the calibration position is subjected to rotary translation.
In operation 104 of the method, the calibration result corresponding to the mechanical arm and the visual positioning guidance system is the second conversion matrix information between the visual coordinate system and the base coordinate system corresponding to the mechanical arm. The first conversion relation converts the end calibration coordinate of the calibration object at a calibration position into the base calibration coordinate, and the second conversion relation converts the visual calibration coordinate of the calibration object at that calibration position into the base calibration coordinate; since both conversion relations yield the base calibration coordinate of the calibration object at the calibration position, the first conversion relation and the second conversion relation in operation 103 can be combined into an equivalent conversion relation for that calibration position. Equivalent conversion relations of the calibration object at three non-collinear calibration positions are obtained in this way, the three equivalent conversion relations are subtracted pairwise to establish an equivalent conversion relation group, and the group is solved to obtain the second conversion matrix information between the base coordinate system corresponding to the mechanical arm and the visual coordinate system; this second conversion matrix information is the calibration result and can be used to represent the pose conversion relationship between the mechanical arm and the visual positioning guidance system.
According to the method, a professional calibration tool is not needed, and the mechanical arm does not need to be manually controlled to move to a specific calibration position, so that the influence on the calibration precision caused by factors such as the manufacturing precision of the calibration tool, the manual control precision and the like is avoided, the accuracy of the calibration result is improved, and the difficulty of the calibration operation is reduced.
In an embodiment of the present disclosure, the step 101 includes: firstly, determining a visual system view field corresponding to a visual positioning guide system; the robotic arm is then instructed to move within the field of view of the vision system to at least three calibration positions, the three calibration positions being non-collinear.
The visual positioning guide system is a device for guiding the mechanical arm to grab, and can be used for shooting and positioning all objects in a visual field range. The visual positioning guidance system includes a visual sensor, such as a camera; the visual field of the visual system is the maximum visual field range which can be observed by the visual sensor.
In the method, a visual system field of view corresponding to the visual positioning guide system is determined, and the mechanical arm fixed with the calibration object moves at least three times in the visual system field of view in a non-collinear way to determine at least three non-collinear calibration positions.
In an embodiment of the present disclosure, the step 102 includes: firstly, determining a calibration point of a calibration object in a visual field of a visual system through a visual sensor; then, determining a corresponding visual coordinate system according to the visual positioning guide system; and then, determining the visual calibration coordinates of the calibration points corresponding to the visual coordinate system.
The calibration point may be any point on the calibration object, for example, when the calibration object is a piece of cardboard, the calibration point may be one end point of the cardboard; the visual coordinate system is a coordinate system corresponding to the visual sensor in the visual positioning guidance system, for example, when the visual sensor is a camera, the visual coordinate system is correspondingly a camera coordinate system, i.e., a three-dimensional rectangular coordinate system established by taking the focusing center of the camera as an origin and taking the optical axis as a Z-axis.
The embodiment is specifically described by taking a calibration object as a piece of hard paperboard and taking a visual sensor as a camera as an example, one end point of four end points of the piece of hard paperboard is determined as a calibration point, after a mechanical arm fixed with the hard paperboard moves to a certain calibration position, the hard paperboard is shot through the visual sensor, a visual positioning guide system determines the position of the calibration point, a coordinate value of the calibration point under a camera coordinate system is obtained through the visual positioning guide system, and the coordinate value is a visual calibration coordinate.
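For intuition only, and not as part of the patented method, the following sketch shows how a visual calibration coordinate could be computed if the vision sensor were a calibrated pinhole camera that reports the detected calibration point as a pixel location plus a depth value; the intrinsic parameters, the detection values and the function name are hypothetical, and a real visual positioning guidance system may return the 3D coordinate directly.

```python
import numpy as np

def pixel_to_camera_frame(u, v, depth, fx, fy, cx, cy):
    """Back-project a detected calibration-point pixel (u, v) with a measured
    depth (distance along the optical axis) into the camera/visual coordinate
    system, assuming a standard pinhole model with intrinsics fx, fy, cx, cy."""
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return np.array([x, y, depth])

# Hypothetical intrinsics and detection result, purely for illustration.
p_C_i = pixel_to_camera_frame(u=812.0, v=430.0, depth=0.65,
                              fx=1050.0, fy=1050.0, cx=640.0, cy=512.0)
print(p_C_i)  # the visual calibration coordinate of the calibration point
```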
Fig. 2 shows a schematic implementation flow chart of a calibration method for a vision positioning guidance system of a robot arm according to an embodiment of the present disclosure.
As shown in fig. 2, in an embodiment of the present disclosure, the step 103 includes: step 1031, determining first conversion matrix information corresponding to the end coordinate system and the base coordinate system based on the end calibration coordinate of the end coordinate system corresponding to the calibration object and the base calibration coordinate of the base coordinate system corresponding to the calibration object; and step 1032, determining a coordinate conversion relation according to the visual calibration coordinate, the end calibration coordinate, the base calibration coordinate and the first conversion matrix information.
The tail end calibration coordinate is a coordinate value of the calibration point in a tail end coordinate system, the base calibration coordinate is a coordinate value of the calibration point in a base coordinate system, and both the tail end calibration coordinate and the base calibration coordinate are unknown quantities; the first conversion matrix information is a space conversion matrix from the terminal coordinate system to the base coordinate system, the space conversion matrix comprises a first rotation matrix and a first translation vector from the terminal coordinate system to the base coordinate system, the first rotation matrix and the first translation vector are known quantities, and when the mechanical arm is placed at a certain calibration position, the first rotation matrix and the first translation vector can be obtained through the mechanical arm demonstrator.
For example, when the calibration object is moved to the calibration position i, the first conversion matrix information of the calibration point from the end coordinate system to the base coordinate system at this time is determined by the robot arm teach pendant, the first conversion matrix information including a first rotation matrix ${}^{B}_{E}R_{i}$ from the end coordinate system to the base coordinate system and a first translation vector ${}^{B}_{E}t_{i}$, wherein B denotes the base coordinate system, E denotes the end coordinate system, and the subscript i denotes the calibration position i.
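The first rotation matrix and first translation vector are read out through the teach pendant as a pose. As a minimal sketch, assuming the pendant reports an XYZ position in millimetres and Z-Y-X Euler angles in degrees (conventions differ between robot brands, so this is an illustrative assumption), such a readout can be converted into ${}^{B}_{E}R_{i}$ and ${}^{B}_{E}t_{i}$ as follows; the function name and the example numbers are hypothetical.

```python
import numpy as np

def pendant_pose_to_R_t(x, y, z, rx, ry, rz):
    """Convert a teach-pendant pose readout (position in mm, Z-Y-X Euler angles
    in degrees, an assumed convention) into the first rotation matrix and the
    first translation vector (end coordinate system -> base coordinate system)."""
    c, b, a = np.radians([rx, ry, rz])
    Rz = np.array([[np.cos(a), -np.sin(a), 0], [np.sin(a), np.cos(a), 0], [0, 0, 1]])
    Ry = np.array([[np.cos(b), 0, np.sin(b)], [0, 1, 0], [-np.sin(b), 0, np.cos(b)]])
    Rx = np.array([[1, 0, 0], [0, np.cos(c), -np.sin(c)], [0, np.sin(c), np.cos(c)]])
    return Rz @ Ry @ Rx, np.array([x, y, z]) / 1000.0  # rotation matrix, translation in metres

# Example readout at one calibration position (hypothetical numbers).
R_BE_i, t_BE_i = pendant_pose_to_R_t(512.3, -120.8, 430.0, 178.2, -1.5, 92.7)
```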
The second rotation matrix from the visual coordinate system to the base coordinate system and the second translation vector are also unknown quantities, and are set to ${}^{B}_{C}R$ and ${}^{B}_{C}t$ respectively, wherein B denotes the base coordinate system and C denotes the visual coordinate system.
The end calibration coordinate and the base calibration coordinate are likewise unknown quantities; the end calibration coordinate is set to $p^{E}$ and the base calibration coordinate is set to $p^{B}_{i}$, wherein B denotes the base coordinate system, E denotes the end coordinate system, and i denotes the calibration position i.
The visual calibration coordinate obtained in step 102 is denoted $p^{C}_{i}$, wherein C denotes the visual coordinate system and $p^{C}_{i}$ is the visual calibration coordinate of the calibration point at the calibration position i.
A coordinate conversion formula of the coordinate conversion relation is established according to the association among the end coordinate system, the base coordinate system and the visual coordinate system corresponding to the mechanical arm; the coordinate conversion relation expresses how the coordinate value of the calibration point in one coordinate system is rotated and translated to obtain its coordinate value in another coordinate system. Specifically, the end calibration coordinate of the calibration point when the calibration object is located at a calibration position is rotated and translated to obtain the base calibration coordinate of the calibration point at that calibration position; and the visual calibration coordinate of the calibration point when the calibration object is located at the calibration position is rotated and translated to obtain the base calibration coordinate of the calibration point at that calibration position.
In an embodiment of the present disclosure, the step 1032 includes: determining a first conversion relation according to the base calibration coordinate, the tail end calibration coordinate and the first conversion matrix information; and determining a second conversion relation corresponding to the second conversion matrix information according to the base calibration coordinate and the visual calibration coordinate.
The first conversion relation is that when the calibration point is located at the calibration position, the terminal calibration coordinate is converted into the base calibration coordinate, and the first conversion relation can be specifically as follows:
Figure BDA0003967539560000098
the second conversion relation is that when the calibration point is located at the calibration position, the visual calibration coordinate is converted into the base calibration coordinate, and the second conversion relation can be specifically as follows:
Figure BDA0003967539560000101
in an embodiment of the present disclosure, the step 104 includes: firstly, integrating a first conversion relation and a second conversion relation to determine an equivalent conversion relation; then, mutually integrating any two calibration positions in the at least three calibration positions to determine an equivalent transformation relation group; then, determining second transformation matrix information corresponding to the visual coordinate system and the base coordinate system based on the equivalent transformation relation group; and finally, determining the calibration results corresponding to the mechanical arm and the visual positioning guide system according to the second conversion matrix information.
Because the first conversion relation and the second conversion relation both yield the base calibration coordinate, they can be combined to obtain the equivalent conversion relation. That is, $p^{B}_{i} = {}^{B}_{E}R_{i}\,p^{E} + {}^{B}_{E}t_{i}$ and $p^{B}_{i} = {}^{B}_{C}R\,p^{C}_{i} + {}^{B}_{C}t$ are combined to obtain

$${}^{B}_{E}R_{i}\,p^{E} + {}^{B}_{E}t_{i} = {}^{B}_{C}R\,p^{C}_{i} + {}^{B}_{C}t$$

Because the relative pose of the calibration object and the end of the mechanical arm is fixed, the end calibration coordinate does not change with the calibration position and is the fixed value $p^{E}$; because the calibration is of the eye-to-hand (eye outside the hand) type, the rotation matrix and the translation vector from the visual coordinate system to the base coordinate system likewise do not change with the calibration position and are the fixed values ${}^{B}_{C}R$ and ${}^{B}_{C}t$. Similarly, the equivalent conversion relations of the calibration point at the calibration position j and the calibration position k are obtained.
The calibration position i, the calibration position j and the calibration position k are not collinear.
Further sorting the equivalent conversion relations when the calibration object is located at the three calibration positions gives the following equations:

$${}^{B}_{E}R_{i}\,p^{E} + {}^{B}_{E}t_{i} = {}^{B}_{C}R\,p^{C}_{i} + {}^{B}_{C}t$$
$${}^{B}_{E}R_{j}\,p^{E} + {}^{B}_{E}t_{j} = {}^{B}_{C}R\,p^{C}_{j} + {}^{B}_{C}t$$
$${}^{B}_{E}R_{k}\,p^{E} + {}^{B}_{E}t_{k} = {}^{B}_{C}R\,p^{C}_{k} + {}^{B}_{C}t$$

Subtracting these three equivalent conversion relations pairwise eliminates ${}^{B}_{C}t$ and gives the following equivalent conversion relation group:

$$\bigl({}^{B}_{E}R_{i} - {}^{B}_{E}R_{j}\bigr)\,p^{E} + \bigl({}^{B}_{E}t_{i} - {}^{B}_{E}t_{j}\bigr) = {}^{B}_{C}R\,\bigl(p^{C}_{i} - p^{C}_{j}\bigr)$$
$$\bigl({}^{B}_{E}R_{j} - {}^{B}_{E}R_{k}\bigr)\,p^{E} + \bigl({}^{B}_{E}t_{j} - {}^{B}_{E}t_{k}\bigr) = {}^{B}_{C}R\,\bigl(p^{C}_{j} - p^{C}_{k}\bigr)$$
$$\bigl({}^{B}_{E}R_{i} - {}^{B}_{E}R_{k}\bigr)\,p^{E} + \bigl({}^{B}_{E}t_{i} - {}^{B}_{E}t_{k}\bigr) = {}^{B}_{C}R\,\bigl(p^{C}_{i} - p^{C}_{k}\bigr)$$

In the above equivalent conversion relation group, ${}^{B}_{E}R_{i}$, ${}^{B}_{E}t_{i}$ and $p^{C}_{i}$ (and the corresponding quantities for positions j and k) are all known quantities. The group is solved to obtain the second rotation matrix ${}^{B}_{C}R$ (together with $p^{E}$), and the second translation vector ${}^{B}_{C}t$ is then obtained by substituting the solution back into any one of the equivalent conversion relations.
The second conversion matrix information is the space conversion matrix between the visual coordinate system and the base coordinate system corresponding to the mechanical arm, namely the calibration result corresponding to the mechanical arm and the visual positioning guidance system, and the calibration result can be used to represent the pose conversion relationship between the mechanical arm and the visual positioning guidance system.
In one embodiment of the present disclosure, the first rotation matrix is a rotation matrix from an end coordinate system to a base coordinate system, the first translation vector is a translation vector from the end coordinate system to the base coordinate system, and the first rotation matrix and the first translation vector are known quantities and can be obtained by a robot teach pendant; the second rotation matrix is a rotation matrix from the visual coordinate system to the base coordinate system, the second translation vector is a translation vector from the visual coordinate system to the base coordinate system, and the second rotation matrix and the second translation vector are quantities to be solved in the method, namely calibration results corresponding to the mechanical arm and the visual positioning guide system.
In an embodiment of the disclosure, the mechanical arm is indicated to move to six to nine calibration positions, and the calibration result is obtained through coordinate data of the six to nine calibration positions, so that the influence of factors such as measurement noise can be reduced, and the calibration result is more accurate.
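A rough numerical sketch of this solve is given below; it is not the patent's own implementation. Instead of forming pairwise differences, the sketch stacks the equivalent conversion relations from several calibration positions into one linear least-squares problem in the unknowns ${}^{B}_{C}R$, ${}^{B}_{C}t$ and $p^{E}$, which also absorbs measurement noise when six to nine positions are used; for noise-free data it recovers the same calibration result. The function names, the least-squares formulation and the SVD projection onto a rotation matrix are illustrative choices.

```python
import numpy as np

def solve_eye_to_hand(R_BE, t_BE, p_C):
    """Solve  R_BE[m] @ p_E + t_BE[m] = R_BC @ p_C[m] + t_BC  for R_BC, t_BC, p_E.

    R_BE : (N, 3, 3) first rotation matrices (end -> base, from the teach pendant)
    t_BE : (N, 3)    first translation vectors (end -> base, from the teach pendant)
    p_C  : (N, 3)    visual calibration coordinates of the calibration point
    """
    N = len(p_C)
    # Unknown vector x = [row-major vec(R_BC) (9), t_BC (3), p_E (3)] -> 15 entries.
    A = np.zeros((3 * N, 15))
    b = np.zeros(3 * N)
    for m in range(N):
        rows = slice(3 * m, 3 * m + 3)
        A[rows, 0:9] = np.kron(np.eye(3), p_C[m].reshape(1, 3))  # R_BC @ p_C[m]
        A[rows, 9:12] = np.eye(3)                                 # + t_BC
        A[rows, 12:15] = -R_BE[m]                                 # - R_BE[m] @ p_E
        b[rows] = t_BE[m]
    x = np.linalg.lstsq(A, b, rcond=None)[0]
    # Project the recovered 3x3 block onto the closest proper rotation matrix.
    U, _, Vt = np.linalg.svd(x[0:9].reshape(3, 3))
    R_BC = U @ np.diag([1.0, 1.0, np.linalg.det(U @ Vt)]) @ Vt
    return R_BC, x[9:12], x[12:15]  # second rotation, second translation, p_E

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Ground-truth camera->base transform and fixed calibration point in the end frame.
    R_true = np.array([[0.0, -1.0, 0.0], [1.0, 0.0, 0.0], [0.0, 0.0, 1.0]])
    t_true = np.array([0.5, -0.2, 0.8])
    p_E_true = np.array([0.03, 0.01, 0.12])
    R_BE, t_BE, p_C = [], [], []
    for _ in range(8):  # six to nine positions, as suggested above
        ax, ay, az = rng.uniform(-0.4, 0.4, 3)
        Rz = np.array([[np.cos(az), -np.sin(az), 0], [np.sin(az), np.cos(az), 0], [0, 0, 1]])
        Ry = np.array([[np.cos(ay), 0, np.sin(ay)], [0, 1, 0], [-np.sin(ay), 0, np.cos(ay)]])
        Rx = np.array([[1, 0, 0], [0, np.cos(ax), -np.sin(ax)], [0, np.sin(ax), np.cos(ax)]])
        R_m = Rz @ Ry @ Rx                    # simulated end -> base rotation
        t_m = rng.uniform(-0.3, 0.3, 3)       # simulated end -> base translation
        p_B = R_m @ p_E_true + t_m            # calibration point in the base frame
        R_BE.append(R_m)
        t_BE.append(t_m)
        p_C.append(R_true.T @ (p_B - t_true))  # the same point as seen by the camera
    R_BC, t_BC, _ = solve_eye_to_hand(np.array(R_BE), np.array(t_BE), np.array(p_C))
    print(np.allclose(R_BC, R_true, atol=1e-6), np.allclose(t_BC, t_true, atol=1e-6))
```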
To facilitate an understanding of the above embodiments, a specific implementation scenario is provided below.
1. The calibration object is fixed at the tail end of the mechanical arm, the fixed pose of the calibration object has no special requirement, and only the relative pose of the calibration object and the tail end of the mechanical arm in the calibration process needs to be ensured to be unchanged;
2. controlling the mechanical arm fixed with the calibration object to move at least three times in a visual system field of view in a non-collinear manner to determine at least three non-collinear calibration positions, and selecting any end point on the calibration object as a calibration point;
taking the calibration object in one of the calibration positions as an example:
obtaining the coordinate value of the calibration point in the visual coordinate system and recording it as the visual calibration coordinate $p^{C}_{i}$; then obtaining the space conversion matrix of the calibration point from the end coordinate system to the base coordinate system, the space conversion matrix comprising a first rotation matrix ${}^{B}_{E}R_{i}$ and a first translation vector ${}^{B}_{E}t_{i}$; letting the coordinate value of the calibration point in the end coordinate system and its coordinate value in the base coordinate system be recorded as the end calibration coordinate $p^{E}$ and the base calibration coordinate $p^{B}_{i}$; and assuming a space conversion matrix of the calibration point from the visual coordinate system to the base coordinate system, the space conversion matrix comprising a second rotation matrix ${}^{B}_{C}R$ and a second translation vector ${}^{B}_{C}t$.
3. Establishing the coordinate conversion relation, which comprises a first conversion relation and a second conversion relation; wherein the first conversion relation is:

$$p^{B}_{i} = {}^{B}_{E}R_{i}\,p^{E} + {}^{B}_{E}t_{i}$$

and the second conversion relation is:

$$p^{B}_{i} = {}^{B}_{C}R\,p^{C}_{i} + {}^{B}_{C}t$$

Merging the first conversion relation and the second conversion relation yields the equivalent conversion relation:

$${}^{B}_{E}R_{i}\,p^{E} + {}^{B}_{E}t_{i} = {}^{B}_{C}R\,p^{C}_{i} + {}^{B}_{C}t$$
Obtaining the equivalent conversion relations of the calibration point at the other two calibration positions in the same way, subtracting the equivalent conversion relations pairwise to obtain an equivalent conversion relation group containing the six parameters to be solved, and solving the equivalent conversion relation group to obtain ${}^{B}_{C}R$ and ${}^{B}_{C}t$, i.e. the pose conversion relation between the mechanical arm and the vision positioning guide system is obtained.
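Once the calibration result is available it can be used for guidance. As a small illustration with hypothetical numbers (not taken from the patent), a point detected by the visual positioning guidance system is mapped into the base coordinate system of the mechanical arm as follows:

```python
import numpy as np

def camera_point_to_base(p_C, R_BC, t_BC):
    """Apply the calibration result: map a point from the visual coordinate
    system into the base coordinate system of the mechanical arm."""
    return R_BC @ np.asarray(p_C, dtype=float) + t_BC

# Hypothetical calibration result and detection, purely for illustration.
R_BC = np.eye(3)
t_BC = np.array([0.5, -0.2, 0.8])
target_in_base = camera_point_to_base([0.12, 0.04, 0.60], R_BC, t_BC)
print(target_in_base)  # coordinate the arm should be guided to, in the base frame
```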
Fig. 3 shows a schematic diagram of a calibration apparatus for a vision positioning guidance system of a robotic arm according to an embodiment of the present disclosure.
as shown in fig. 3, an embodiment of the present disclosure provides a calibration apparatus for a visual positioning guidance system of a robot arm, where the apparatus includes: the indicating module 301 is used for indicating the mechanical arm to move to at least three calibration positions; a first determining module 302, configured to determine a visual calibration coordinate for a calibration position according to a calibration object fixed at the end of a mechanical arm, where the poses of the calibration object and the end of the mechanical arm remain unchanged; the second determining module 303 is configured to perform coordinate conversion on the visual calibration coordinate according to the end coordinate system and the base coordinate system corresponding to the mechanical arm, and determine a coordinate conversion relationship; and a third determining module 304, configured to determine, according to the coordinate transformation relationship, a calibration result corresponding to the mechanical arm and the visual positioning guidance system.
In one embodiment, the indication module 301 includes: a first determining sub-module 3011 configured to determine a visual system field of view corresponding to a visual positioning guidance system; and the indicating sub-module 3012 is configured to indicate that the mechanical arm moves to at least three calibration positions within the field of view of the vision system, where the three calibration positions are not collinear.
In an embodiment, the first determining module 302 includes: a second determination submodule 3021 for determining a calibration point of the calibration object in the visual system field of view by means of the visual sensor; a third determining submodule 3022, configured to determine a corresponding visual coordinate system according to the visual positioning guidance system; and the fourth determining submodule 3023 is configured to determine the visual calibration coordinates of the calibration point corresponding to the visual coordinate system.
In one implementation, the second determining module 303 includes: a fifth determining submodule 3031, configured to determine, based on the terminal calibration coordinate of the terminal coordinate system corresponding to the calibration object and the base calibration coordinate of the base coordinate system corresponding to the calibration object, first conversion matrix information corresponding to the terminal coordinate system and the base coordinate system; and a sixth determining submodule 3032, configured to determine a coordinate conversion relationship according to the visual calibration coordinate, the end calibration coordinate, the base calibration coordinate, and the first conversion matrix information.
In an implementation, the sixth determining submodule 3032 includes: a first determining unit 30321, configured to determine a first conversion relationship according to the base calibration coordinate, the terminal calibration coordinate, and the first conversion matrix information; and a second determining unit 30322, configured to determine a second transformation relationship corresponding to the second transformation matrix information according to the base calibration coordinates and the vision calibration coordinates.
In an embodiment, the third determining module 304 includes: a first integration submodule 3041, configured to integrate the first conversion relation and the second conversion relation, and determine an equivalent conversion relation; a second integration submodule 3042, configured to integrate any two calibration positions of the at least three calibration positions with each other, and determine an equivalent transformation relationship group; a seventh determining sub-module 3043, configured to determine, based on the equivalence transformation relation group, second transformation matrix information corresponding to the visual coordinate system and the base coordinate system; the eighth determining submodule 3044 is configured to determine, according to the second conversion matrix information, a calibration result corresponding to the mechanical arm and the visual positioning guidance system.
The present disclosure also provides an electronic device and a readable storage medium according to an embodiment of the present disclosure.
FIG. 4 shows a schematic block diagram of an example electronic device 400 that may be used to implement embodiments of the present disclosure. Electronic devices are intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other appropriate computers. The electronic device may also represent various forms of mobile devices, such as personal digital processing, cellular phones, smart phones, wearable devices, and other similar computing devices. The components shown herein, their connections and relationships, and their functions, are meant to be examples only, and are not meant to limit implementations of the disclosure described and/or claimed herein.
As shown in fig. 4, the apparatus 400 includes a computing unit 401 that can perform various appropriate actions and processes according to a computer program stored in a Read Only Memory (ROM) 402 or a computer program loaded from a storage unit 408 into a Random Access Memory (RAM) 403. In the RAM 403, various programs and data required for the operation of the device 400 can also be stored. The computing unit 401, ROM 402, and RAM 403 are connected to each other via a bus 404. An input/output (I/O) interface 405 is also connected to bus 404.
A number of components in device 400 are connected to I/O interface 405, including: an input unit 406 such as a keyboard, a mouse, or the like; an output unit 407 such as various types of displays, speakers, and the like; a storage unit 408, such as a magnetic disk, optical disk, or the like; and a communication unit 409 such as a network card, modem, wireless communication transceiver, etc. The communication unit 409 allows the device 400 to exchange information/data with other devices via a computer network, such as the internet, and/or various telecommunication networks.
Computing unit 401 may be a variety of general and/or special purpose processing components with processing and computing capabilities. Some examples of the computing unit 401 include, but are not limited to, a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), various dedicated Artificial Intelligence (AI) computing chips, various computing units running machine learning model algorithms, a Digital Signal Processor (DSP), and any suitable processor, controller, microcontroller, and so forth. The computing unit 401 performs the various methods and processes described above, such as a calibration method for a visual positioning guidance system of a robotic arm. For example, in some embodiments, a calibration method for a visual positioning guidance system for a robotic arm may be implemented as a computer software program tangibly embodied on a machine-readable medium, such as storage unit 408. In some embodiments, part or all of the computer program may be loaded and/or installed onto the device 400 via the ROM 402 and/or the communication unit 409. When the computer program is loaded into the RAM 403 and executed by the computing unit 401, one or more steps of a calibration method for a visual positioning guidance system of a robot arm as described above may be performed. Alternatively, in other embodiments, the computing unit 401 may be configured in any other suitable manner (e.g., by firmware) to perform a calibration method for a vision positioning guidance system of a robotic arm.
Various implementations of the systems and techniques described above may be implemented in digital electronic circuitry, integrated circuitry, Field Programmable Gate Arrays (FPGAs), Application Specific Integrated Circuits (ASICs), Application Specific Standard Products (ASSPs), Systems on a Chip (SOCs), Complex Programmable Logic Devices (CPLDs), computer hardware, firmware, software, and/or combinations thereof. These various embodiments may include: implemented in one or more computer programs that are executable and/or interpretable on a programmable system including at least one programmable processor, which may be special or general purpose, receiving data and instructions from, and transmitting data and instructions to, a storage system, at least one input device, and at least one output device.
Program code for implementing the methods of the present disclosure may be written in any combination of one or more programming languages. These program code may be provided to a processor or controller of a general purpose computer, special purpose computer, or other programmable data processing apparatus, such that the program code, when executed by the processor or controller, causes the functions/acts specified in the flowchart and/or block diagram to be performed. The program code may execute entirely on the machine, partly on the machine, as a stand-alone software package, partly on the machine and partly on a remote machine or entirely on the remote machine or server.
In the context of this disclosure, a machine-readable medium may be a tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. The machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium. A machine-readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
To provide for interaction with a user, the systems and techniques described here can be implemented on a computer having: a display device (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to a user; and a keyboard and a pointing device (e.g., a mouse or a trackball) by which a user can provide input to the computer. Other kinds of devices may also be used to provide for interaction with a user; for example, feedback provided to the user can be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user may be received in any form, including acoustic, speech, or tactile input.
The systems and techniques described here can be implemented in a computing system that includes a back-end component (e.g., as a data server), or that includes a middleware component (e.g., an application server), or that includes a front-end component (e.g., a user computer having a graphical user interface or a web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such back-end, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include: Local Area Networks (LANs), Wide Area Networks (WANs), and the Internet.
The computer system may include clients and servers. A client and server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other. The server may be a cloud server, a server of a distributed system, or a server combining a blockchain.
It should be understood that various forms of the flows shown above may be used, with steps reordered, added, or deleted. For example, the steps described in the present disclosure may be executed in parallel, sequentially, or in different orders, as long as the desired results of the technical solutions disclosed in the present disclosure can be achieved, and the present disclosure is not limited herein.
Furthermore, the terms "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include at least one such feature. In the description of the present disclosure, "a plurality" means two or more unless specifically limited otherwise.
The above description is only for the specific embodiments of the present disclosure, but the scope of the present disclosure is not limited thereto, and any person skilled in the art can easily conceive of the changes or substitutions within the technical scope of the present disclosure, and all the changes or substitutions should be covered within the scope of the present disclosure. Therefore, the protection scope of the present disclosure shall be subject to the protection scope of the claims.

Claims (10)

1. A calibration method for a vision positioning guidance system of a robotic arm, the method comprising:
instructing the mechanical arm to move to at least three calibration positions;
determining a visual calibration coordinate aiming at the calibration position according to a calibration object fixed at the tail end of the mechanical arm, wherein the poses of the calibration object and the tail end of the mechanical arm are kept unchanged;
performing coordinate conversion on the vision calibration coordinate according to a terminal coordinate system and a base coordinate system corresponding to the mechanical arm, and determining a coordinate conversion relation;
and determining a calibration result corresponding to the mechanical arm and the visual positioning guide system according to the coordinate conversion relation.
2. The method of claim 1, wherein the instructing the mechanical arm to move to at least three calibration positions comprises:
determining a vision system field of view corresponding to the visual positioning guide system;
instructing the mechanical arm to move to at least three calibration positions within the vision system field of view, wherein the at least three calibration positions are non-collinear.
3. The method according to claim 2, wherein the determining a visual calibration coordinate for the calibration position according to the calibration object fixed at the end of the mechanical arm comprises:
determining, by a vision sensor, a calibration point of the calibration object in the field of view of the vision system;
determining a corresponding visual coordinate system according to the visual positioning guide system;
and determining the visual calibration coordinate of the calibration point corresponding to the visual coordinate system.
4. The method according to claim 1, wherein the performing coordinate conversion on the visual calibration coordinate according to the end coordinate system and the base coordinate system corresponding to the mechanical arm, and determining the coordinate conversion relation comprises:
determining first conversion matrix information corresponding to the end coordinate system and the base coordinate system based on an end calibration coordinate of the calibration object corresponding to the end coordinate system and a base calibration coordinate of the calibration object corresponding to the base coordinate system;
and determining a coordinate conversion relation according to the visual calibration coordinate, the end calibration coordinate, the base calibration coordinate and the first conversion matrix information.
5. The method of claim 4, wherein the determining a coordinate conversion relation according to the visual calibration coordinate, the end calibration coordinate, the base calibration coordinate and the first conversion matrix information comprises:
determining a first conversion relation according to the base calibration coordinate, the end calibration coordinate and the first conversion matrix information;
and determining a second conversion relation corresponding to second conversion matrix information according to the base calibration coordinate and the visual calibration coordinate.
6. The method according to claim 5, wherein the determining the calibration result corresponding to the mechanical arm and the visual positioning guide system according to the coordinate conversion relation comprises:
combining the first conversion relation and the second conversion relation to determine an equivalent conversion relation;
combining the equivalent conversion relations of any two of the at least three calibration positions with each other to determine an equivalent conversion relation set;
determining the second conversion matrix information corresponding to the visual coordinate system and the base coordinate system based on the equivalent conversion relation set;
and determining a calibration result corresponding to the mechanical arm and the visual positioning guide system according to the second conversion matrix information.
7. The method of claim 6, wherein:
the first conversion matrix information includes a first rotation matrix and a first translation vector corresponding to the end coordinate system and the base coordinate system; and
the second conversion matrix information includes a second rotation matrix and a second translation vector corresponding to the visual coordinate system and the base coordinate system.
8. A calibration device for a visual positioning guide system of a mechanical arm, the device comprising:
an instructing module, configured to instruct the mechanical arm to move to at least three calibration positions;
a first determining module, configured to determine a visual calibration coordinate for the calibration position according to a calibration object fixed at the end of the mechanical arm, wherein the poses of the calibration object and the end of the mechanical arm remain unchanged;
a second determining module, configured to perform coordinate conversion on the visual calibration coordinate according to an end coordinate system and a base coordinate system corresponding to the mechanical arm, and determine a coordinate conversion relation;
and a third determining module, configured to determine a calibration result corresponding to the mechanical arm and the visual positioning guide system according to the coordinate conversion relation.
9. An electronic device, comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method of any one of claims 1-7.
10. A non-transitory computer readable storage medium having stored thereon computer instructions for causing a computer to perform the method of any one of claims 1-7.
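For readers who want a concrete sense of the computation described in claims 1 to 7, the following minimal Python sketch shows one possible way to recover a second rotation matrix and second translation vector (a vision-to-base transform) from at least three non-collinear calibration positions. It is an illustrative assumption, not the patented implementation: the function name solve_vision_to_base, the synthetic data, and the choice of an SVD-based least-squares (Kabsch-style) fit are hypothetical, and the sketch assumes the vision system reports three-dimensional coordinates of the calibration point while the end pose of the mechanical arm stays unchanged, so the constant offset between the end and the calibration point is absorbed into the translation vector.

# Minimal illustrative sketch (not the patented implementation): estimate the
# vision-to-base transform (rotation R and translation t) from N >= 3
# non-collinear calibration positions, assuming 3-D vision coordinates and an
# unchanged end pose, so the constant end-to-calibration-point offset is
# absorbed into t.
import numpy as np

def solve_vision_to_base(vision_pts, base_pts):
    """Least-squares fit of base_pts ~= R @ vision_pts + t (SVD/Kabsch)."""
    vision_pts = np.asarray(vision_pts, dtype=float)
    base_pts = np.asarray(base_pts, dtype=float)
    if vision_pts.shape[0] < 3:
        raise ValueError("at least three calibration positions are required")

    # Centring both point sets cancels the unknown constant offset;
    # only coordinate differences between calibration positions remain.
    v_mean, b_mean = vision_pts.mean(axis=0), base_pts.mean(axis=0)
    V, B = vision_pts - v_mean, base_pts - b_mean

    # Rotation from the SVD of the cross-covariance matrix, with a sign
    # correction so that det(R) = +1 (a proper rotation, no reflection).
    U, _, Vt = np.linalg.svd(B.T @ V)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(U @ Vt))])
    R = U @ D @ Vt

    # Translation that maps the vision-frame centroid onto the base-frame one.
    t = b_mean - R @ v_mean
    return R, t

if __name__ == "__main__":
    # Synthetic check: three non-collinear positions and a known transform.
    R_true = np.array([[0.0, -1.0, 0.0], [1.0, 0.0, 0.0], [0.0, 0.0, 1.0]])
    t_true = np.array([0.50, 0.20, 0.10])
    vision = np.array([[0.0, 0.0, 0.0], [0.1, 0.0, 0.0], [0.0, 0.1, 0.0]])
    base = vision @ R_true.T + t_true
    R, t = solve_vision_to_base(vision, base)
    print(np.allclose(R, R_true), np.allclose(t, t_true))  # True True

In a real setup, vision_pts would come from the vision sensor's detections of the calibration point at each calibration position and base_pts from the robot controller's report of the end position in the base coordinate system. The pairwise differences between calibration positions are what actually constrain the rotation, which is why collinear positions would leave the fit under-determined.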
CN202211500846.2A 2022-11-28 2022-11-28 Calibration method and device for visual positioning guide system of mechanical arm Pending CN115922707A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211500846.2A CN115922707A (en) 2022-11-28 2022-11-28 Calibration method and device for visual positioning guide system of mechanical arm

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211500846.2A CN115922707A (en) 2022-11-28 2022-11-28 Calibration method and device for visual positioning guide system of mechanical arm

Publications (1)

Publication Number Publication Date
CN115922707A true CN115922707A (en) 2023-04-07

Family

ID=86551519

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211500846.2A Pending CN115922707A (en) 2022-11-28 2022-11-28 Calibration method and device for visual positioning guide system of mechanical arm

Country Status (1)

Country Link
CN (1) CN115922707A (en)

Similar Documents

Publication Publication Date Title
CN106289379B (en) Industrial robot performance measurement method
CN110640747B (en) Hand-eye calibration method and system for robot, electronic equipment and storage medium
CN110640746A (en) Method, system, equipment and medium for calibrating and positioning coordinate system of robot
CN113211445B (en) Robot parameter calibration method, device, equipment and storage medium
CN115946120A (en) Mechanical arm control method, device, equipment and medium
CN114387352A (en) External parameter calibration method, device, equipment and storage medium
CN112414340B (en) Three-coordinate measuring method, device and equipment of workpiece and storage medium
CN115922707A (en) Calibration method and device for visual positioning guide system of mechanical arm
CN117047771A (en) Flexible control method and device for robot and electronic equipment
CN116298785A (en) Electronic signal testing method and device, electronic equipment and storage medium
CN111971529A (en) Method and apparatus for managing robot system
CN114812576A (en) Map matching method and device and electronic equipment
CN118003331A (en) Calibration method and device of mechanical arm, electronic equipment and storage medium
CN117817655A (en) Equipment calibration method, device, equipment and storage medium
CN114972511A (en) Method and device for determining pose of calibration object, electronic equipment and storage medium
CN118061206B (en) Robot TCP calibration device, method, computer equipment and storage medium
CN117798929A (en) Calibration method and device for mechanical arm and storage medium
CN118143947A (en) Pose determining method, pose determining device, pose determining equipment and storage medium
CN115781665B (en) Mechanical arm control method and device based on monocular camera and storage medium
WO2023157235A1 (en) Arithmetic device
CN117584122A (en) Mechanical arm grabbing method, device, equipment and storage medium
WO2023035100A1 (en) Method, electronic device and computer readable storage medium for calibrating robot
CN117283559A (en) Visual guiding method and device for mechanical arm, electronic equipment and storage medium
CN115502976A (en) Compliant control method and device based on mechanical arm, electronic equipment and storage medium
CN115752469A (en) Part detection method and device, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination