CN118342520A - Mechanical arm calibration method and device, intelligent equipment and storage medium - Google Patents
- Publication number: CN118342520A
- Application number: CN202410774213.3A
- Authority: CN (China)
- Legal status: Granted (the legal status is an assumption and is not a legal conclusion)
Landscapes
- Manipulator (AREA)
Abstract
The application relates to a mechanical arm calibration method and device, an intelligent device, and a storage medium. The method comprises the following steps: determining a calibration parameter value and an offset error based on a plurality of sets of moving end positions and moving image positions, and a plurality of sets of rotating end positions and rotating image positions; the moving end position is the position of the end of the mechanical arm in the base coordinate system after the mechanical arm moves, and the moving image position is the position of the calibration object in the captured image after the mechanical arm moves; the rotating end position is the position of the end of the mechanical arm in the base coordinate system after the mechanical arm rotates, and the rotating image position is the position of the calibration object in the captured image after the mechanical arm rotates; acquiring a target image position of an object to be operated, and determining a conversion position corresponding to the object to be operated based on the target image position and the calibration parameter value; and obtaining a target operation position corresponding to the object to be operated based on the conversion position and the offset error. By adopting the method, the accuracy of the operation position can be improved.
Description
Technical Field
The application relates to the technical field of artificial intelligence, in particular to a mechanical arm calibration method, a mechanical arm calibration device, intelligent equipment and a storage medium.
Background
With the development of artificial intelligence technology, the mechanical arm is applied in more and more fields, such as automated production and manufacturing, the medical field, and the smart home field. The use of the mechanical arm improves the degree of automation of intelligent devices and lays a foundation for them to execute more complex and intelligent tasks.
In the conventional technology, when using the mechanical arm, a nine-point calibration method is used to determine a calibration parameter value, and the operation position corresponding to an object to be operated is then determined from that calibration parameter value; this results in low accuracy of the operation position.
Disclosure of Invention
In view of the foregoing, it is desirable to provide a method, an apparatus, an intelligent device, and a computer-readable storage medium for calibrating a manipulator that can improve accuracy of an operation position.
In a first aspect, the application provides a method for calibrating a mechanical arm. The method comprises the following steps:
Determining a calibration parameter value and an offset error based on a plurality of sets of moving end positions and moving image positions, and a plurality of sets of rotating end positions and rotating image positions; the moving end position is the position of the end of the mechanical arm in the base coordinate system after the mechanical arm moves, and the moving image position is the position of the calibration object in the captured image after the mechanical arm moves; the rotating end position is the position of the end of the mechanical arm in the base coordinate system after the mechanical arm rotates, and the rotating image position is the position of the calibration object in the captured image after the mechanical arm rotates;
Acquiring a target image position of an object to be operated, and determining a conversion position corresponding to the object to be operated based on the target image position and the calibration parameter value;
And obtaining a target operation position corresponding to the object to be operated based on the conversion position and the offset error.
In a second aspect, the application further provides a mechanical arm calibration device. The device comprises:
a determining module for determining a calibration parameter value and an offset error based on a plurality of sets of moving end positions and moving image positions, and a plurality of sets of rotating end positions and rotating image positions; the moving end position is the position of the end of the mechanical arm in the base coordinate system after the mechanical arm moves, and the moving image position is the position of the calibration object in the captured image after the mechanical arm moves; the rotating end position is the position of the end of the mechanical arm in the base coordinate system after the mechanical arm rotates, and the rotating image position is the position of the calibration object in the captured image after the mechanical arm rotates;
the acquisition module is used for acquiring a target image position of an object to be operated and determining a conversion position corresponding to the object to be operated based on the target image position and the calibration parameter value;
And the correction module is used for obtaining a target operation position corresponding to the object to be operated based on the conversion position and the offset error.
In a third aspect, the present application also provides a smart device comprising a memory storing a computer program and a processor implementing the steps of the method of any of the first aspects when the processor executes the computer program.
In a fourth aspect, the present application also provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the steps of the method of any of the first aspects.
In a fifth aspect, the application also provides a computer program product comprising a computer program which, when executed by a processor, implements the steps of the method of any of the first aspects.
The mechanical arm calibration method and device, the intelligent device, the storage medium, and the computer program product determine a calibration parameter value and an offset error based on a plurality of sets of moving end positions and moving image positions, and a plurality of sets of rotating end positions and rotating image positions; the moving end position is the position of the end of the mechanical arm in the base coordinate system after the mechanical arm moves, and the moving image position is the position of the calibration object in the captured image after the mechanical arm moves; the rotating end position is the position of the end of the mechanical arm in the base coordinate system after the mechanical arm rotates, and the rotating image position is the position of the calibration object in the captured image after the mechanical arm rotates. A target image position of an object to be operated is acquired, and a conversion position corresponding to the object is determined based on the target image position and the calibration parameter value; a target operation position corresponding to the object is then obtained based on the conversion position and the offset error.
Before determining the target operation position of the object to be operated, the calibration parameter value and the offset error are determined from the plurality of sets of moving end positions and moving image positions together with the plurality of sets of rotating end positions and rotating image positions. Compared with determining the calibration parameter value from the moving data alone, combining the sets of data obtained by moving the mechanical arm with the sets of data obtained by rotating it improves the accuracy of the calibration parameter value. In determining the target operation position of the object to be operated, this more accurate calibration parameter value is used to determine the conversion position corresponding to the object, improving the accuracy of the conversion position, and the determined offset error is then used to correct the conversion position to obtain the target operation position corresponding to the object, further improving the accuracy of the operation position.
Drawings
In order to more clearly illustrate the embodiments of the application or the technical solutions in the prior art, the drawings that are required in the embodiments or the description of the prior art will be briefly described, it being obvious that the drawings in the following description are only some embodiments of the application, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a flow chart of a method for calibrating a robot arm according to an embodiment;
FIG. 2 is a flow chart diagram of the process of determining parameter values and offset errors in one embodiment;
FIG. 3 is a flow chart of an offset error determination step in one embodiment;
FIG. 4 is a flow chart of the calibration parameter values and offset error determination steps in another embodiment;
FIG. 5 is a flowchart illustrating an offset error determination step according to another embodiment;
FIG. 6 is a flowchart illustrating steps for determining and using a planar translation relationship in one embodiment;
FIG. 7 is a block diagram of a mechanical arm calibration device according to one embodiment;
Fig. 8 is an internal structural diagram of the smart device in one embodiment.
Detailed Description
The present application will be described in further detail with reference to the drawings and examples, in order to make the objects, technical solutions and advantages of the present application more apparent. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the scope of the application.
In one embodiment, as shown in fig. 1, a mechanical arm calibration method is provided. The method is described as applied to an intelligent device, which may be, but is not limited to, various intelligent robots, household intelligent devices, education and scientific research intelligent devices, intelligent manufacturing devices, environment detection and maintenance intelligent devices, and the like; the intelligent robot may be an industrial robot, a service robot, a medical robot, an agricultural robot, or the like. The intelligent device comprises a mechanical arm and a camera, and two configurations can be distinguished: eye-in-hand and eye-to-hand. In the eye-in-hand configuration, the camera (eye) is mounted directly on the end of the mechanical arm (hand); in the eye-to-hand configuration, the camera is mounted at a position outside the mechanical arm and may be fixed elsewhere in the working scene, for example on a bracket or the ceiling above a production line. Before the intelligent device controls the mechanical arm to operate an object to be operated, it needs to determine the operation position corresponding to the object, and then controls the mechanical arm to operate the object according to that operation position. In this embodiment, the mechanical arm calibration method includes steps 102 to 106, where:
Step 102, determining calibration parameter values and offset errors based on a plurality of groups of mobile terminal positions and mobile image positions, and a plurality of groups of rotating terminal positions and rotating image positions; the position of the moving tail end is the position of the tail end of the mechanical arm in the base coordinate system after the mechanical arm moves, and the position of the moving image is the position of the calibration object in the shot image after the mechanical arm moves; the rotating tail end position is the position of the tail end of the mechanical arm in the base coordinate system after the mechanical arm rotates, and the rotating image position is the position of the calibration object in the photographed image after the mechanical arm rotates.
The moving end position refers to a position of the tail end of the mechanical arm in a base coordinate system after the mechanical arm moves. The mobile end position may be represented in two-dimensional coordinates. The base coordinate system refers to a mechanical arm base coordinate system, and the base coordinate system generally takes a fixed point of the mechanical arm as a coordinate origin. The moving image position refers to the position of the calibration object in the photographed image after the mechanical arm moves, and can be understood as the position coordinate of the calibration object in the photographed image photographed by the camera, and the moving image position can be represented by the two-dimensional coordinate of the pixel point. The rotation end position refers to a position of the end of the mechanical arm in the base coordinate system after the mechanical arm rotates. The rotating image position refers to the position of the calibration object in the photographed image after the mechanical arm rotates. The plurality of sets of moving end positions and moving image positions may be at least 9 sets of moving end positions and moving image positions, and the plurality of sets of rotating end positions and rotating image positions may be at least 5 sets of rotating end positions and rotating image positions.
The calibration parameter value refers to a parameter for converting the target image position of the object to be operated, and it is understood that the calibration parameter value is a conversion parameter for determining the operation position of the object to be operated. Under the condition that the height of the mechanical arm is fixed, the calibration parameter value comprises a rotation translation matrix; under the condition that the height of the mechanical arm is variable, the calibration parameter values comprise a rotation translation matrix and a projection distortion matrix; the rotational-translational matrix characterizes the geometric relationship between the camera and the end of the robotic arm, and the projection distortion matrix characterizes the perspective projection of the captured image and the effects of lens distortion. The offset error is a value for correcting the initial operation position, and it can be understood that the initial operation position of the object to be operated is determined according to the target image position and the calibration parameter value corresponding to the object to be operated, and the initial operation position is corrected by using the offset error, so that the target operation position of the object to be operated can be obtained.
Illustratively, when the intelligent device needs to perform hand-eye calibration, it controls the mechanical arm to move over a calibration plane bearing a calibration object; for each movement, the intelligent device obtains one set of moving end position and moving image position, so that after several movements a plurality of such sets are obtained. The intelligent device also controls the mechanical arm to rotate several times; for each rotation, it obtains one set of rotating end position and rotating image position, so that after several rotations a plurality of such sets are obtained. The intelligent device determines the calibration parameter value based on the plurality of sets of moving end positions and moving image positions, and the plurality of sets of rotating end positions and rotating image positions; it then determines the offset error based on the plurality of sets of rotating end positions and rotating image positions together with the calibration parameter value. Hand-eye calibration may be required before determining the target operation position of the object to be operated, that is, the intelligent device performs hand-eye calibration before each such determination; alternatively, the intelligent device may perform hand-eye calibration once every preset time period.
Step 104, obtaining a target image position of the object to be operated, and determining a conversion position corresponding to the object to be operated based on the target image position and the calibration parameter value.
The object to be operated refers to an object needing to be operated by the mechanical arm, and the object to be operated is located in an operation plane of the mechanical arm. The target image position refers to the position coordinates of the object to be operated in the captured image captured by the camera. The conversion position refers to position coordinates obtained by converting the position of the target image according to the calibration parameter value.
Illustratively, after determining the calibration parameter value and the offset error, when the mechanical arm needs to be controlled to operate the object to be operated, the intelligent device obtains the target image position of the object in the operation plane through the camera corresponding to the mechanical arm. When the calibration parameter value comprises only the rotation translation matrix, the projection distortion matrix is obtained separately, and the product of the target image position, the calibration parameter value, and the projection distortion matrix is determined as the conversion position corresponding to the object to be operated; when the calibration parameter value comprises both the rotation translation matrix and the projection distortion matrix, the product of the target image position and the calibration parameter value is determined as the conversion position corresponding to the object to be operated.
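The conversion step above can be sketched numerically. This is a minimal illustration, not the patent's implementation: the patent does not give matrix shapes, so we assume the combined calibration value (rotation-translation matrix times projection distortion matrix) acts as a 3x3 homography on homogeneous pixel coordinates [u, v, 1].

```python
import numpy as np

def image_to_converted_position(H, image_pos):
    """Map a pixel position (u, v) to an operation-plane position via a
    3x3 homography H (an assumed shape for the combined calibration value)."""
    u, v = image_pos
    x, y, w = H @ np.array([u, v, 1.0])
    return np.array([x / w, y / w])  # normalize the homogeneous result

# The identity homography leaves the point unchanged.
H = np.eye(3)
print(image_to_converted_position(H, (10.0, 20.0)))  # -> [10. 20.]
```

A real calibration value would be estimated from the collected position pairs rather than assumed, but the per-point conversion is the same matrix-vector product.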
And 106, obtaining a target operation position corresponding to the object to be operated based on the conversion position and the offset error.
The intelligent device adds an offset error to a conversion position corresponding to the object to be operated under the condition that the working plane and the calibration plane of the mechanical arm are on the same plane, so as to obtain a target operation position corresponding to the object to be operated; under the condition that the working plane and the calibration plane of the mechanical arm are not in the same plane, the intelligent equipment adds an offset error to the conversion position corresponding to the object to be operated to obtain an initial operation position corresponding to the object to be operated, and determines a target operation position corresponding to the object to be operated based on the initial operation position.
According to the mechanical arm calibration method, before the target operation position of the object to be operated is determined, the calibration parameter value and the offset error are determined through the plurality of groups of the mobile terminal position and the mobile image position and the plurality of groups of the rotating terminal position and the rotating image position, compared with the calibration parameter value determined only through the plurality of groups of the mobile terminal position and the mobile image position, the calibration parameter value is determined through the plurality of groups of the mobile terminal position and the mobile image position and the plurality of groups of the rotating terminal position and the rotating image position, and the plurality of groups of data for moving the mechanical arm and the plurality of groups of data for rotating the mechanical arm are combined, so that the accuracy of the calibration parameter value is improved; in the process of determining the target operation position of the object to be operated, a calibration parameter value with higher accuracy is used for determining the conversion position corresponding to the object to be operated, the accuracy of the conversion position is improved, and the determined offset error is used for correcting the conversion position to obtain the target operation position corresponding to the object to be operated, so that the accuracy of the operation position is further improved.
In one embodiment, as shown in fig. 2, in the case where the camera is located on the mechanical arm (eye-in-hand), determining the calibration parameter value and the offset error based on the plurality of sets of moving end positions and moving image positions, and the plurality of sets of rotating end positions and rotating image positions, includes:
Step 202, averaging the plurality of groups of mobile terminal positions to obtain an average position.
The average position refers to a position coordinate obtained by averaging a plurality of groups of mobile terminal positions.
For example, in the case where the photographing device is located on the mechanical arm, that is, the case where the eye is on the hand, the smart device averages the plurality of sets of moving end positions to obtain an average position.
Step 204, determining a movement difference position between the movement end position and the average position, and a rotation difference position between the rotation end position and the average position.
The movement difference position is a position coordinate obtained by subtracting the average position from the movement end position. The rotation difference position means a position coordinate obtained by subtracting the average position from the rotation end position.
Illustratively, for each moving end position, the smart device subtracts the average position from the moving end position to obtain a movement difference position; for each rotating end position, it subtracts the average position from the rotating end position to obtain a rotation difference position.
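Steps 202 and 204 reduce to an average and two subtractions; a minimal NumPy sketch with toy coordinates (the actual positions would come from the robot controller):

```python
import numpy as np

# Toy end positions in the base coordinate system (assumed values).
move_end = np.array([[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]])  # moving end positions
rot_end = np.array([[2.0, 2.0], [4.0, 4.0]])               # rotating end positions

avg = move_end.mean(axis=0)    # step 202: average position
move_diff = move_end - avg     # step 204: movement difference positions
rot_diff = rot_end - avg       # step 204: rotation difference positions

print(avg)           # -> [3. 4.]
print(move_diff[0])  # -> [-2. -2.]
```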
Step 206, determining calibration parameter values based on the plurality of sets of movement difference positions and movement image positions, and the plurality of sets of rotation difference positions and rotation image positions.
Illustratively, the smart device performs straight line fitting on the plurality of sets of movement difference positions and movement image positions, and the plurality of sets of rotation difference positions and rotation image positions to obtain calibration parameter values.
In one embodiment, the intelligent device performs straight line fitting on the plurality of groups of movement difference positions and movement image positions, and the plurality of groups of rotation difference positions and rotation image positions to obtain calibration parameter values, and the straight line fitting method includes, but is not limited to, least square method, robust regression, regularized regression, bayesian regression and the like, and the formula of straight line fitting is as follows:
p_n − P̄ = R · P · q_n    Formula (1)
Where P̄ = (x̄, ȳ) is the average position, x̄ being the X-axis coordinate value and ȳ the Y-axis coordinate value of the average position; p_n is the nth moving end position or rotating end position; q_n is the nth moving image position or rotating image position; R is the rotation translation matrix; and P is the projection distortion matrix.
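The least-squares variant of this fit can be sketched as follows. The shapes are an assumption for illustration: R and P are folded into a single 2x3 matrix M mapping homogeneous image coordinates [u, v, 1] to end-position differences, and M is recovered with `numpy.linalg.lstsq`.

```python
import numpy as np

# Synthetic ground-truth combined matrix M = R * P (assumed 2x3 affine form).
M_true = np.array([[0.5, 0.0, 1.0],
                   [0.0, 0.5, -1.0]])

rng = np.random.default_rng(0)
# Homogeneous image positions [u, v, 1] for 12 samples.
img = np.column_stack([rng.uniform(0, 100, 12),
                       rng.uniform(0, 100, 12),
                       np.ones(12)])
diff = img @ M_true.T  # corresponding end-position differences

# Least-squares fit: solves img @ X = diff, with X = M^T.
M_fit, *_ = np.linalg.lstsq(img, diff, rcond=None)
print(np.allclose(M_fit.T, M_true))  # -> True
```

The patent also lists robust regression, regularized regression, and Bayesian regression as admissible fitting methods; only the solver changes, not the data layout.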
Step 208, determining an offset error based on the calibration parameter value and the average position.
Illustratively, the smart device determines an offset error based on the first set of rotational end position and rotational image position, the calibration parameter values, and the average position; the rotation angle corresponding to the first group of rotation end positions and the rotation image positions is zero, namely when the mechanical arm rotates for the first time, the rotation angle of the mechanical arm is zero, the position of the tail end of the mechanical arm in the base coordinate system is the rotation end position, and the position of the calibration object in the shot image is the rotation image position.
In this embodiment, when the photographing device is located on the mechanical arm, that is, when the eye is on the hand, the calibration parameter value is determined by the plurality of sets of moving difference positions and moving image positions, and the plurality of sets of rotating difference positions and rotating image positions, compared with the calibration parameter value determined by the plurality of sets of moving end positions and moving image positions, the calibration parameter value is determined by the plurality of sets of moving difference positions and moving image positions, and the plurality of sets of rotating difference positions and rotating image positions, and the plurality of sets of data for moving the mechanical arm and the plurality of sets of data for rotating the mechanical arm are combined, thereby improving the accuracy of the calibration parameter value; and then determining an offset error based on the calibration parameter value, and providing basic data for the subsequent correction of the initial operation position.
In one embodiment, as shown in FIG. 3, determining the offset error based on the calibration parameter values and the average position includes:
Step 302, for each rotated image position, obtaining a first image position corresponding to the rotated image position under the end coordinate system based on the rotated image position and the calibration parameter value.
The end coordinate system refers to a coordinate system of the end effector of the mechanical arm, and the end coordinate system usually takes the center of the end effector of the mechanical arm as an origin of coordinates.
For each rotated image position, the smart device multiplies the rotated image position by the calibration parameter value to obtain a corresponding first image position of the rotated image position in the end coordinate system.
And step 304, performing curve fitting on the plurality of first image positions to obtain a fitting circle center position.
Wherein, curve fitting refers to a process of fitting a plurality of points into a circle. The fitting circle center position refers to the position coordinates of the circle center of the circle obtained by curve fitting.
The intelligent device performs curve fitting on the plurality of first image positions to obtain a fitting circle, and determines the position coordinates of the central point of the fitting circle as the fitting circle center position.
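The circle fit in step 304 is not specified further; one common choice is the algebraic (Kåsa) fit, shown here as a hedged stand-in. It solves x² + y² + a·x + b·y + c = 0 in least squares, from which the center is (−a/2, −b/2).

```python
import numpy as np

def fit_circle_center(points):
    """Algebraic (Kasa) circle fit: returns the fitted circle center."""
    pts = np.asarray(points, dtype=float)
    A = np.column_stack([pts[:, 0], pts[:, 1], np.ones(len(pts))])
    rhs = -(pts[:, 0] ** 2 + pts[:, 1] ** 2)
    a, b, c = np.linalg.lstsq(A, rhs, rcond=None)[0]
    return np.array([-a / 2.0, -b / 2.0])

# Points sampled on a circle of radius 2 centered at (3, 4).
theta = np.linspace(0, 2 * np.pi, 8, endpoint=False)
pts = np.column_stack([3 + 2 * np.cos(theta), 4 + 2 * np.sin(theta)])
print(fit_circle_center(pts))  # -> approximately [3. 4.]
```

For noise-free points on a circle the Kåsa fit recovers the center exactly; with noisy first image positions it remains a reasonable closed-form choice.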
Step 306, determining an offset error based on the set of rotational end position and rotational image position, the fitted center position, and the average position.
The smart device determines a first difference value between the rotational end position and the average position, and a second difference value between the rotational image position and the fitted circle center position in the same group as the rotational end position, and adds the first difference value to the second difference value to obtain an offset error.
In one embodiment, the offset error diff is as follows:
diff = (P_rotating end − P_average) + (P_rotating image − P_fitted center)    Formula (2)
Where P_rotating end is the coordinates of the rotating end position and P_rotating image is the coordinates of the rotating image position, P_rotating end and P_rotating image being one set of rotating end position and rotating image position (generally the first set is used, for which the rotation angle of the mechanical arm is zero); P_average is the coordinates of the average position obtained by averaging the plurality of sets of moving end positions; and P_fitted center is the coordinates of the fitted circle center position obtained by curve fitting the plurality of first image positions.
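Formula (2) is a sum of two coordinate differences and can be evaluated directly; the numbers below are toy values for illustration only.

```python
import numpy as np

def offset_error(rot_end, rot_img, avg_pos, center_pos):
    """Offset error per formula (2): (rotating end - average) +
    (rotating image - fitted circle center), all as 2D coordinates."""
    rot_end, rot_img = np.asarray(rot_end), np.asarray(rot_img)
    avg_pos, center_pos = np.asarray(avg_pos), np.asarray(center_pos)
    return (rot_end - avg_pos) + (rot_img - center_pos)

diff = offset_error([5.0, 5.0], [2.0, 2.0], [4.0, 4.0], [1.0, 1.0])
print(diff)  # -> [2. 2.]
```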
In this embodiment, when the photographing device is located on the mechanical arm (eye-in-hand), if the calibration parameter value determined in steps 202 to 206 were used directly to calculate the target operation position of the object to be operated, the calculated position would be offset from the actual operation position by two distances: the distance between the rotating image position and the rotation center position of the mechanical arm, and the distance between the rotation center position and the average position. The fitted circle center position determined in steps 302 and 304 is the rotation center position of the mechanical arm, so the offset error is determined from a set of rotating end position and rotating image position, the fitted circle center position, and the average position, providing basic data for subsequently correcting the initial operation position.
In one embodiment, as shown in fig. 4, in a case where the photographing device is located outside the robot arm, determining the calibration parameter value and the offset error based on the plurality of sets of the moving end position and the moving image position, and the plurality of sets of the rotating end position and the rotating image position includes:
In step 402, calibration parameter values are determined based on a plurality of sets of mobile end positions and mobile image positions, and a plurality of sets of rotating end positions and rotating image positions.
In an exemplary embodiment, when the photographing device is located outside the mechanical arm, that is, when the eye is located outside the hand, the intelligent device performs straight line fitting on the plurality of sets of moving end positions and moving image positions, and the plurality of sets of rotating end positions and rotating image positions, to obtain the calibration parameter value. Methods of straight line fitting include, but are not limited to, least squares, robust regression, regularized regression, bayesian regression, and the like.
In one embodiment, the formula for the straight line fit is as follows:
(x_n, y_n)^T = R · P · (ximage_n, yimage_n)^T    Formula (3)
Wherein, (x_n, y_n) is the nth moving end position or rotating end position; (ximage_n, yimage_n) is the nth moving image position or rotating image position; R is the rotation translation matrix; P is the projection distortion matrix.
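The straight-line fit can be sketched with ordinary least squares. Here the product of the rotation translation matrix and the projection distortion matrix is treated as a single 2x3 affine matrix acting on homogeneous image coordinates, which is an assumption about the model's form:

```python
import numpy as np

def fit_calibration(img_pts, end_pts):
    """Least-squares fit of the combined R x P mapping from image positions
    to base-coordinate end positions (assumed affine form):
    [x, y]^T ~= A @ [u, v, 1]^T."""
    img = np.asarray(img_pts, float)
    end = np.asarray(end_pts, float)
    X = np.hstack([img, np.ones((len(img), 1))])   # homogeneous image coords
    A, *_ = np.linalg.lstsq(X, end, rcond=None)    # (3, 2) solution
    return A.T                                     # (2, 3) calibration matrix
```

The robust, regularized, or Bayesian regression variants mentioned above would replace `lstsq` with the corresponding estimator.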
Step 404, for each rotated image position, obtaining a second image position corresponding to the rotated image position under the base coordinate system based on the rotated image position and the calibration parameter value.
For each rotated image position, the smart device multiplies the rotated image position by the calibration parameter value to obtain a corresponding second image position of the rotated image position under the base coordinate system.
And step 406, performing curve fitting on the plurality of second image positions to obtain a fitting circle center position and a fitting radius.
The fitting circle center position refers to the circle center position of a fitting circle obtained by performing curve fitting on the plurality of second image positions. The fitting radius refers to the radius of a fitting circle obtained by curve fitting the plurality of second image positions.
The intelligent device performs curve fitting on the plurality of second image positions to obtain a fitting circle, determines the position coordinates of the circle center of the fitting circle as the fitting circle center position, and determines the radius of the fitting circle as the fitting radius.
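The circle fit of steps 404 to 406 can be sketched with the algebraic (Kasa) method, one of several standard curve-fitting choices; the function name is hypothetical:

```python
import numpy as np

def fit_circle(pts):
    """Algebraic circle fit: solve x^2 + y^2 + D*x + E*y + F = 0 by least
    squares, then recover the fitted circle center and fitting radius."""
    pts = np.asarray(pts, float)
    x, y = pts[:, 0], pts[:, 1]
    A = np.column_stack([x, y, np.ones_like(x)])
    b = -(x ** 2 + y ** 2)
    D, E, F = np.linalg.lstsq(A, b, rcond=None)[0]
    center = (-D / 2, -E / 2)
    radius = np.sqrt(center[0] ** 2 + center[1] ** 2 - F)
    return center, radius
```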
Step 408, determining an offset error based on the set of rotational end position and rotational image position, the fitting center position, and the fitting radius.
Illustratively, the smart device computes a set of rotational end position and rotational image position, a fitting center position, and a fitting radius to obtain an offset error.
In this embodiment, when the photographing device is located outside the mechanical arm, that is, when the eye is located outside the hand, the calibration parameter value is determined by the plurality of sets of moving end positions and moving image positions, and the plurality of sets of rotating end positions and rotating image positions, and compared with the calibration parameter value determined by the plurality of sets of moving end positions and moving image positions alone, the calibration parameter value is determined by the plurality of sets of moving end positions and moving image positions, and the plurality of sets of rotating end positions and rotating image positions, and the calibration parameter value is combined with the plurality of sets of data for moving the mechanical arm and the plurality of sets of data for rotating the mechanical arm, thereby improving the accuracy of the calibration parameter value; and then determining an offset error based on the calibration parameter value, and providing basic data for the subsequent correction of the initial operation position.
In one embodiment, as shown in FIG. 5, determining the offset error based on a set of rotational end position and rotational image position, fitting center position, and fitting radius, includes:
Step 502, determining the inclination angle of the straight line where the rotation end position and the fitting circle center position are located based on the rotation end position and the fitting circle center position.
The inclination angle refers to an included angle between a straight line where the rotation end position and the fitting circle center position are located and the positive direction of the Y axis.
The intelligent device determines an X-axis difference value between an X-axis coordinate value of the rotation end position and an X-axis coordinate value of the fitting circle center position, determines a Y-axis difference value between a Y-axis coordinate value of the rotation end position and a Y-axis coordinate value of the fitting circle center position, determines a difference ratio between the X-axis difference value and the Y-axis difference value, and performs arctangent operation on the difference ratio to obtain an inclination angle of a straight line where the rotation end position and the fitting circle center position are located.
In one embodiment, the inclination angle θ is as follows:
θ = arctan((x_1 - x_circle) / (y_1 - y_circle))    Formula (4)
Wherein, (x_1, y_1) is the rotation end position; generally the first set of rotation end position and rotation image position is used, for which the rotation angle of the mechanical arm is zero. (x_circle, y_circle) is the fitted circle center position.
In step 504, an offset error is obtained based on the sine value of the tilt angle, the cosine value of the tilt angle, and the fitting radius.
Illustratively, the smart device multiplies the sine value of the tilt angle by the fitting radius to obtain a sine error, multiplies the cosine value of the tilt angle by the fitting radius to obtain a cosine error, and obtains an offset error based on the sine error and the cosine error.
In one embodiment, the offset error diff is as follows:
diff = (circle_r · sinθ, circle_r · cosθ)    Formula (5)
Wherein, circle_r is the fitting radius; cosθ is the cosine value of the inclination angle; sinθ is the sine value of the inclination angle.
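Formulas (4) and (5) together can be sketched as follows; `atan2` is used so the quadrant of the inclination angle is preserved, and the component order of the returned error is an assumption:

```python
import math

def offset_error_eye_to_hand(rot_end, center, radius):
    """Tilt angle of the line through the rotation end position and the
    fitted circle center, measured from the positive Y axis (formula (4)),
    then diff = (radius * sin(theta), radius * cos(theta)) (formula (5))."""
    theta = math.atan2(rot_end[0] - center[0], rot_end[1] - center[1])
    return radius * math.sin(theta), radius * math.cos(theta)
```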
In this embodiment, when the photographing device is located outside the mechanical arm, that is, when the eye is located outside the hand, the offset error is determined through a set of the rotation end position and the rotation image position, the fitting center position and the fitting radius, so that basic data is provided for correcting the initial operation position subsequently.
In one embodiment, as shown in fig. 6, in the case that the calibration object is located on the calibration plane and the calibration plane and the working plane are different planes, obtaining the target operation position corresponding to the object to be operated based on the conversion position and the offset error includes:
step 602, obtaining a plurality of groups of corrected end positions and corrected image positions; the position of the corrected tail end is the position of the tail end of the mechanical arm in the base coordinate system after the mechanical arm moves to the corrected calibration object positioned on the working plane; the corrected image position is the position of the corrected calibration object in the photographed image.
The corrected end position refers to the position of the tail end of the mechanical arm in the base coordinate system after the mechanical arm moves to the correction calibration object located on the working plane. The calibration plane is the plane in which the calibration object is located, and can be understood as the plane used for determining the calibration parameter value. The working plane refers to the plane in which the mechanical arm performs operations. The correction calibration object is used for determining the plane conversion relationship between the calibration plane and the working plane, and is located on the working plane. The corrected image position refers to the position of the correction calibration object in the photographed image.
In an exemplary embodiment, when the calibration object is located on the calibration plane and the calibration plane and the working plane are different planes, after determining the calibration parameter value and before determining the target operation position corresponding to the object to be operated, the smart device controls the mechanical arm to move to the correction calibration object located on the working plane to obtain one set of corrected end position and corrected image position; after performing multiple such movements, multiple sets of corrected end positions and corrected image positions are obtained.
Step 604, determining a corrected conversion position corresponding to the corrected image position based on the corrected image position and the calibration parameter value.
For each corrected image position, the intelligent device multiplies the corrected image position by the calibration parameter value to obtain a corrected conversion position corresponding to the corrected image position.
Step 606, determining a plane conversion relationship between the calibration plane and the working plane based on the plurality of sets of corrected end positions and corrected conversion positions.
The plane conversion relation refers to conversion parameter values for converting position coordinates located on a calibration plane into a working plane. The plane conversion relation comprises a scaling matrix and a translation matrix; the scaling matrix may comprise a first scaling parameter value representing a scaling factor in the X-axis and a second scaling parameter value representing a scaling factor in the Y-axis; the translation matrix may include first translation parameter values that characterize the translation distance in the X-axis and second translation parameter values that characterize the translation distance in the Y-axis.
Illustratively, the intelligent device fits the plurality of sets of corrected end positions and corrected conversion positions to obtain a planar conversion relationship between the calibration plane and the working plane.
In one embodiment, the fitting equation for fitting the sets of corrected end positions and corrected transition positions is as follows:
(x_n, y_n)^T = (scale_x · (x'_n - origin_x), scale_y · (y'_n - origin_y))^T    Formula (6)
Wherein, (x'_n, y'_n) is the nth corrected conversion position; (x_n, y_n) is the nth corrected end position; scale_x is the scaling multiple corresponding to the X axis, i.e. the first scaling parameter; scale_y is the scaling multiple corresponding to the Y axis, i.e. the second scaling parameter; origin_x is the translation distance of the X axis, i.e. the first translation parameter; origin_y is the translation distance of the Y axis, i.e. the second translation parameter.
Step 608, obtaining an initial operation position corresponding to the object to be operated based on the conversion position and the offset error.
The intelligent device adds an offset error to the conversion position corresponding to the object to be operated to obtain an initial operation position corresponding to the object to be operated.
Step 610, obtaining a target operation position corresponding to the object to be operated based on the initial operation position and the plane conversion relation.
The intelligent device subtracts a translation matrix in the planar conversion relation from the initial operation position to obtain a translation difference position, and multiplies the translation difference position by a scaling matrix in the planar conversion relation to obtain a target operation position corresponding to the object to be operated.
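Steps 606 to 610 can be sketched as a per-axis linear fit followed by the subtract-then-scale mapping; the parameterisation end = scale * (conversion - origin) is an assumption consistent with the correction applied above:

```python
import numpy as np

def fit_plane_conversion(conv_pts, end_pts):
    """Fit a scale and origin per axis so that end ~= scale * (conv - origin),
    matching the scaling and translation parameters of formula (6)."""
    conv = np.asarray(conv_pts, float)
    end = np.asarray(end_pts, float)
    scale, origin = np.empty(2), np.empty(2)
    for axis in range(2):
        # end = k*conv + b  with  k = scale  and  b = -scale*origin
        k, b = np.polyfit(conv[:, axis], end[:, axis], 1)
        scale[axis], origin[axis] = k, -b / k
    return scale, origin

def to_working_plane(initial_pos, scale, origin):
    """Step 610: subtract the translation, then apply the scaling."""
    return scale * (np.asarray(initial_pos, float) - origin)
```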
In this embodiment, when the calibration object is located on the calibration plane and the calibration plane and the working plane are different planes, the plane conversion relationship between the two planes is determined from the multiple sets of corrected end positions and corrected conversion positions, and the initial operation position located on the calibration plane is then converted by this relationship into the target operation position located on the working plane. This ensures that the target operation position lies on the working plane of the mechanical arm and improves its accuracy.
In an exemplary embodiment, a mechanical arm calibration method for an object to be operated is provided, divided into the two cases of the eye being on the hand and the eye being outside the hand:
1. Eye is on hand
Under the condition that the point is needed and the calibration plane and the operation plane are positioned on the same plane, the method comprises the following steps:
1. Move the mechanical arm onto the calibration object, and record the moving end position (x, y) of the calibration object under the base coordinate system;
2. Move the mechanical arm, ensuring that the calibration object is in the field of view of the camera, and record the current moving end position (x_n, y_n) of the tail end of the mechanical arm under the base coordinate system and the corresponding moving image position (ximage_n, yimage_n); repeat nine times to obtain nine sets of moving end positions (x_n, y_n) and moving image positions (ximage_n, yimage_n);
3. Based on the nine sets of moving end positions (x_n, y_n) and moving image positions (ximage_n, yimage_n), fit formula (7) to obtain the rotation translation matrix R and the projection distortion matrix P.
(x - x_n, y - y_n)^T = R · P · (ximage_n, yimage_n)^T    Formula (7)
4. Obtain the target end position and the target image position corresponding to the object to be operated, multiply the target image position by the rotation translation matrix and the projection distortion matrix to obtain the conversion position of the object to be operated under the end coordinate system, and add the conversion position to the target end position to obtain the target operation position of the object to be operated under the base coordinate system.
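The final step above can be sketched as below, treating the fitted product of the rotation translation matrix and the projection distortion matrix as a single 2x3 matrix `cal` (a hypothetical affine form) that maps an image position to an offset in the end coordinate system:

```python
import numpy as np

def target_operation_position(target_img, target_end, cal):
    """Eye-in-hand, point-needed case: map the target image position through
    the calibration matrix, then add the target end position (base coords)."""
    u, v = target_img
    conversion = cal @ np.array([u, v, 1.0])   # offset in end coordinates
    return conversion + np.asarray(target_end, float)
```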
Under the condition that the point is not needed and the calibration plane and the operation plane are positioned on the same plane, the method comprises the following steps:
1. Move the mechanical arm, ensuring that the calibration object is in the field of view of the camera, and record the current moving end position (x_n, y_n) of the tail end of the mechanical arm under the base coordinate system and the corresponding moving image position (ximage_n, yimage_n); repeat nine times to obtain nine sets of moving end positions (x_n, y_n) and moving image positions (ximage_n, yimage_n);
2. Rotate the mechanical arm, ensuring that the calibration object is in the field of view of the camera, and record the current rotating end position (x_m, y_m) of the tail end of the mechanical arm under the base coordinate system and the corresponding rotating image position (ximage_m, yimage_m); repeat five times to obtain five sets of rotating end positions (x_m, y_m) and rotating image positions (ximage_m, yimage_m); the first rotation angle of the mechanical arm is zero;
3. Average the multiple sets of moving end positions to obtain the average position (x_avg, y_avg);
4. Based on the average position (x_avg, y_avg), the nine sets of moving end positions (x_n, y_n) and moving image positions (ximage_n, yimage_n), and the five sets of rotating end positions (x_m, y_m) and rotating image positions (ximage_m, yimage_m), fit formula (1) to obtain the rotation translation matrix R and the projection distortion matrix P;
5. For each rotating image position, the smart device multiplies the rotating image position by the calibration parameter value to obtain the first image position corresponding to the rotating image position under the end coordinate system; curve fitting is performed on the five first image positions to obtain a fitted circle, and the position coordinates of the center of the fitted circle are determined as the fitted circle center position; the first set of rotation end position and rotation image position, the fitted circle center position, and the average position are substituted into formula (2) to obtain the offset error;
6. Obtain the target image position corresponding to the object to be operated, multiply the target image position by the rotation translation matrix and the projection distortion matrix to obtain the conversion position of the object to be operated under the end coordinate system, and add the offset error to the conversion position to obtain the target operation position of the object to be operated under the base coordinate system.
2. The eyes are outside the hand
Under the condition that the point is needed and the calibration plane and the operation plane are positioned on the same plane, the method comprises the following steps:
1. Place nine calibration objects on the calibration plane, move the mechanical arm onto the last one (the target calibration object), and record the moving end position (x, y) of the target calibration object under the base coordinate system and the moving image positions (ximage_n, yimage_n) corresponding to the nine calibration objects;
2. Based on the moving end position (x, y) and the nine moving image positions (ximage_n, yimage_n), fit formula (8) to obtain the rotation translation matrix R and the projection distortion matrix P.
(x_n, y_n)^T = R · P · (ximage_n, yimage_n)^T    Formula (8)
3. Obtain the target image position corresponding to the object to be operated, and multiply the target image position by the rotation translation matrix and the projection distortion matrix to obtain the target operation position of the object to be operated under the base coordinate system.
Under the condition that the point is not needed and the calibration plane and the operation plane are positioned on the same plane, the method comprises the following steps:
1. Move the mechanical arm, ensuring that the calibration object is in the field of view of the camera, and record the current moving end position (x_n, y_n) of the tail end of the mechanical arm under the base coordinate system and the corresponding moving image position (ximage_n, yimage_n); repeat nine times to obtain nine sets of moving end positions (x_n, y_n) and moving image positions (ximage_n, yimage_n);
2. Rotate the mechanical arm, ensuring that the calibration object is in the field of view of the camera, and record the current rotating end position (x_m, y_m) of the tail end of the mechanical arm under the base coordinate system and the corresponding rotating image position (ximage_m, yimage_m); repeat five times to obtain five sets of rotating end positions (x_m, y_m) and rotating image positions (ximage_m, yimage_m); the first rotation angle of the mechanical arm is zero;
3. Based on the nine sets of moving end positions (x_n, y_n) and moving image positions (ximage_n, yimage_n), and the five sets of rotating end positions (x_m, y_m) and rotating image positions (ximage_m, yimage_m), fit formula (8) to obtain the rotation translation matrix R and the projection distortion matrix P.
4. For each rotating image position, multiply the rotating image position by the calibration parameter value to obtain the second image position corresponding to the rotating image position under the base coordinate system; perform curve fitting on the plurality of second image positions to obtain a fitted circle, determine the position coordinates of the center of the fitted circle as the fitted circle center position, and determine the radius of the fitted circle as the fitting radius;
5. Substitute the first rotating end position and the fitted circle center position into formula (4) to obtain the inclination angle of the straight line on which the rotating end position and the fitted circle center position lie; substitute the fitting radius and the inclination angle into formula (5) to obtain the offset error;
6. Obtain the target image position corresponding to the object to be operated, multiply the target image position by the rotation translation matrix and the projection distortion matrix to obtain the conversion position of the object to be operated under the base coordinate system, and add the offset error to the conversion position to obtain the target operation position of the object to be operated under the base coordinate system.
In the above method for determining the operation position, the calibration object is located on the calibration plane and the calibration plane and the working plane are the same plane. If the calibration object is located on the calibration plane but the calibration plane and the working plane are not the same plane, then after determining the calibration parameter value and before determining the target operation position corresponding to the object to be operated, the smart device controls the mechanical arm to move to the correction calibration object located on the working plane to obtain one set of corrected end position and corrected image position, and after multiple movements obtains multiple sets of corrected end positions and corrected image positions. For each corrected image position, the smart device multiplies the corrected image position by the calibration parameter value to obtain the corrected conversion position corresponding to that corrected image position, and then fits formula (6) based on the multiple sets of corrected end positions and corrected conversion positions to obtain the plane conversion relationship between the calibration plane and the working plane, which comprises a scaling matrix and a translation matrix.
After the plane conversion relationship is determined, the determined rotation translation matrix, the projection distortion matrix, and the target image position corresponding to the object to be operated are acquired; the target image position is multiplied by the rotation translation matrix and the projection distortion matrix to obtain the conversion position of the object to be operated, and the offset error is added to the conversion position to obtain the initial operation position of the object to be operated under the base coordinate system. The translation matrix in the plane conversion relationship is then subtracted from the initial operation position to obtain a translation difference position, and the translation difference position is multiplied by the scaling matrix in the plane conversion relationship to obtain the target operation position corresponding to the object to be operated.
According to the mechanical arm calibration method above, before the target operation position of the object to be operated is determined, the calibration parameter value and the offset error are determined from the multiple sets of moving end positions and moving image positions together with the multiple sets of rotating end positions and rotating image positions. Compared with determining the calibration parameter value from the moving data alone, this combines the data from moving the mechanical arm with the data from rotating it, thereby improving the accuracy of the calibration parameter value. In determining the target operation position of the object to be operated, this more accurate calibration parameter value is used to determine the conversion position corresponding to the object, improving the accuracy of the conversion position, and the determined offset error is used to correct the conversion position into the target operation position, further improving the accuracy of the operation position.
Under the condition that the calibration object is positioned on the calibration plane and the working plane are positioned on different planes, determining the plane conversion relation between the calibration plane and the working plane according to the plurality of groups of corrected tail end positions and corrected conversion positions, and then converting the initial operation position positioned on the calibration plane into the target operation position positioned on the operation plane by using the plane conversion relation, thereby ensuring that the target operation position is positioned on the working plane of the mechanical arm and improving the accuracy of the target operation position.
It should be understood that, although the steps in the flowcharts related to the above embodiments are shown sequentially as indicated by the arrows, these steps are not necessarily performed in that order. Unless explicitly stated herein, there is no strict restriction on the order of execution, and the steps may be performed in other orders. Moreover, at least some of the steps in these flowcharts may include multiple sub-steps or stages, which are not necessarily performed at the same time but may be performed at different times, and which are not necessarily performed sequentially but may be performed in turn or alternately with at least some of the other steps, sub-steps, or stages.
Based on the same inventive concept, an embodiment of the application also provides a mechanical arm calibration device for implementing the mechanical arm calibration method described above. The implementation of the solution provided by the device is similar to that described for the method, so for the specific limitations in the one or more mechanical arm calibration device embodiments below, reference may be made to the limitations of the mechanical arm calibration method above, which are not repeated here.
In one embodiment, as shown in fig. 7, there is provided a robot calibration device, including: a determination module 702, an acquisition module 704, and a correction module 706, wherein:
A determining module 702 for determining a calibration parameter value and an offset error based on the plurality of sets of mobile end positions and mobile image positions, and the plurality of sets of rotating end positions and rotating image positions; the position of the moving tail end is the position of the tail end of the mechanical arm in the base coordinate system after the mechanical arm moves, and the position of the moving image is the position of the calibration object in the shot image after the mechanical arm moves; the rotating tail end position is the position of the tail end of the mechanical arm in the base coordinate system after the mechanical arm rotates, and the rotating image position is the position of the calibration object in the photographed image after the mechanical arm rotates;
the acquiring module 704 is configured to acquire a target image position of an object to be operated, and determine a conversion position corresponding to the object to be operated based on the target image position and the calibration parameter value;
And the correction module 706 is configured to obtain a target operation position corresponding to the object to be operated based on the conversion position and the offset error.
In one embodiment, the determining module 702 is further configured to: averaging the positions of the multiple groups of mobile terminals to obtain an average position; determining a movement difference position between the movement end position and the average position, and a rotation difference position between the rotation end position and the average position; determining a calibration parameter value based on the plurality of sets of movement difference positions and movement image positions, and the plurality of sets of rotation difference positions and rotation image positions; an offset error is determined based on the calibration parameter values and the average position.
In one embodiment, the determining module 702 is further configured to: for each rotating image position, obtaining a first image position corresponding to the rotating image position under the terminal coordinate system based on the rotating image position and the calibration parameter value; curve fitting is carried out on the plurality of first image positions, and a fitting circle center position is obtained; an offset error is determined based on a set of rotational end positions and rotational image positions, the fitted center position, and the average position.
In one embodiment, the determining module 702 is further configured to: determining calibration parameter values based on the plurality of sets of mobile end positions and mobile image positions, and the plurality of sets of rotating end positions and rotating image positions; for each rotating image position, obtaining a second image position corresponding to the rotating image position under a base coordinate system based on the rotating image position and the calibration parameter value; curve fitting is carried out on the plurality of second image positions, and a fitting circle center position and a fitting radius are obtained; an offset error is determined based on a set of rotational end positions and rotational image positions, a fitting center position, and a fitting radius.
In one embodiment, the determining module 702 is further configured to: determining the inclination angle of a straight line where the rotating tail end position and the fitting circle center position are located based on the rotating tail end position and the fitting circle center position; and obtaining an offset error based on the sine value of the inclination angle, the cosine value of the inclination angle and the fitting radius.
In one embodiment, the correction module 706 is further configured to: acquiring a plurality of groups of corrected tail end positions and corrected image positions; the position of the corrected tail end is the position of the tail end of the mechanical arm in the base coordinate system after the mechanical arm moves to the corrected calibration object positioned on the working plane; the corrected image position is the position of the corrected calibration object in the shot image; determining a corrected conversion position corresponding to the corrected image position based on the corrected image position and the calibration parameter value; determining a plane conversion relationship between the calibration plane and the working plane based on the plurality of groups of corrected end positions and corrected conversion positions; obtaining an initial operation position corresponding to the object to be operated based on the conversion position and the offset error; and obtaining a target operation position corresponding to the object to be operated based on the initial operation position and the plane conversion relation.
All or part of the modules in the mechanical arm calibration device can be realized by software, hardware and a combination thereof. The above modules can be embedded in hardware or independent from a processor in the intelligent device, or can be stored in a memory in the intelligent device in software, so that the processor can call and execute the operations corresponding to the above modules.
In one embodiment, a smart device is provided, which may be a terminal; its internal structure may be as shown in fig. 8. The smart device includes a processor, a memory, an input/output interface, a communication interface, a display unit, and an input device. The processor, the memory, and the input/output interface are connected through a system bus, and the communication interface, the display unit, and the input device are connected to the system bus through the input/output interface. The processor of the smart device is configured to provide computing and control capabilities. The memory of the smart device includes a non-volatile storage medium and an internal memory. The non-volatile storage medium stores an operating system and a computer program, and the internal memory provides an environment for running the operating system and the computer program in the non-volatile storage medium. The input/output interface of the smart device is used to exchange information between the processor and external devices. The communication interface of the smart device is used for wired or wireless communication with external terminals; wireless communication may be implemented by WIFI, a mobile cellular network, NFC (Near Field Communication), or other technologies. The computer program, when executed by the processor, implements a mechanical arm calibration method. The display unit of the smart device is used to form a visual picture and may be a display screen, a projection device, or a virtual reality imaging device; the display screen may be a liquid crystal display screen or an electronic ink display screen. The input device of the smart device may be a touch layer covering the display screen, keys, a trackball, or a touchpad provided on the housing of the smart device, or an external keyboard, touchpad, mouse, or the like.
It will be appreciated by those skilled in the art that the structure shown in fig. 8 is merely a block diagram of part of the structure related to the solution of the present application and does not limit the smart device to which the solution is applied; a particular smart device may include more or fewer components than shown, combine certain components, or arrange the components differently.
In one embodiment, a smart device is provided, including a memory and a processor, the memory storing a computer program; when the processor executes the computer program, the steps of the method embodiments described above are implemented.
In one embodiment, a computer-readable storage medium is provided, on which a computer program is stored which, when executed by a processor, carries out the steps of the method embodiments described above.
In an embodiment, a computer program product is provided, comprising a computer program which, when executed by a processor, implements the steps of the method embodiments described above.
The user information (including but not limited to user device information, user personal information, etc.) and data (including but not limited to data used for analysis, stored data, displayed data, etc.) involved in the present application are information and data authorized by the user or fully authorized by all parties concerned.
Those skilled in the art will appreciate that all or part of the methods described above may be implemented by a computer program stored on a non-volatile computer-readable storage medium; when executed, the computer program may include the processes of the method embodiments described above. Any reference to memory, database, or other medium used in the embodiments provided herein may include at least one of non-volatile and volatile memory. Non-volatile memory may include Read-Only Memory (ROM), magnetic tape, floppy disk, flash memory, optical memory, high-density embedded non-volatile memory, Resistive Random Access Memory (ReRAM), Magnetoresistive Random Access Memory (MRAM), Ferroelectric Random Access Memory (FRAM), Phase Change Memory (PCM), graphene memory, and the like. Volatile memory may include Random Access Memory (RAM), external cache memory, and the like. By way of illustration and not limitation, RAM may take various forms, such as Static Random Access Memory (SRAM) or Dynamic Random Access Memory (DRAM). The databases referred to in the embodiments provided herein may include at least one of a relational database and a non-relational database; non-relational databases may include, but are not limited to, blockchain-based distributed databases and the like. The processors referred to in the embodiments provided herein may be, but are not limited to, general-purpose processors, central processing units, graphics processors, digital signal processors, programmable logic units, data processing logic units based on quantum computing, and the like.
The technical features of the above embodiments may be combined arbitrarily. For brevity, not all possible combinations of these technical features are described; however, as long as a combination of technical features contains no contradiction, it should be considered within the scope of this description.
The foregoing embodiments illustrate only a few implementations of the present application and are described in relative detail, but they should not be construed as limiting the scope of the application. It should be noted that those skilled in the art may make several variations and modifications without departing from the concept of the present application, all of which fall within the protection scope of the application. Accordingly, the protection scope of the application shall be subject to the appended claims.
Claims (10)
1. A mechanical arm calibration method, characterized by comprising:
determining a calibration parameter value and an offset error based on a plurality of sets of moving end positions and moving image positions and a plurality of sets of rotating end positions and rotating image positions, wherein a moving end position is the position of the end of the mechanical arm in a base coordinate system after the mechanical arm moves, a moving image position is the position of a calibration object in a captured image after the mechanical arm moves, a rotating end position is the position of the end of the mechanical arm in the base coordinate system after the mechanical arm rotates, and a rotating image position is the position of the calibration object in a captured image after the mechanical arm rotates;
acquiring a target image position of an object to be operated, and determining a conversion position corresponding to the object to be operated based on the target image position and the calibration parameter value; and
obtaining a target operation position corresponding to the object to be operated based on the conversion position and the offset error.
2. The method of claim 1, wherein, in a case where a camera is located on the mechanical arm, the determining a calibration parameter value and an offset error based on the plurality of sets of moving end positions and moving image positions and the plurality of sets of rotating end positions and rotating image positions comprises:
averaging the plurality of moving end positions to obtain an average position;
determining a movement difference position between each moving end position and the average position, and a rotation difference position between each rotating end position and the average position;
determining the calibration parameter value based on a plurality of sets of the movement difference positions and moving image positions and a plurality of sets of the rotation difference positions and rotating image positions; and
determining the offset error based on the calibration parameter value and the average position.
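As one illustrative reading of the steps of claim 2, the calibration parameter value can be sketched as a least-squares 2D affine map from image positions to movement difference positions; this modeling choice and all names below are the illustration's assumptions, not the claim's definitive form.

```python
import numpy as np

def calibrate(end_positions, image_positions):
    """Average the moving end positions, form difference positions, and fit
    a least-squares affine map from image positions to those differences."""
    ends = np.asarray(end_positions, dtype=float)
    imgs = np.asarray(image_positions, dtype=float)
    avg = ends.mean(axis=0)   # average position
    diffs = ends - avg        # movement difference positions
    A = np.column_stack([imgs, np.ones(len(imgs))])  # rows [u, v, 1]
    M, *_ = np.linalg.lstsq(A, diffs, rcond=None)    # 3x2 calibration parameters
    return M, avg

# Synthetic data: image positions related to end positions by scale 2
# and a shift of (10, 20).
imgs = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
ends = 2.0 * imgs + np.array([10.0, 20.0])
M, avg = calibrate(ends, imgs)
mapped = np.array([0.5, 0.5, 1.0]) @ M  # difference position of the mean image point
```

On this synthetic data the average position recovers (11, 21), and the mean image point maps to the zero difference position, as expected of a centered fit.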
3. The method of claim 2, wherein the determining the offset error based on the calibration parameter value and the average position comprises:
for each rotating image position, obtaining a first image position in an end coordinate system corresponding to the rotating image position, based on the rotating image position and the calibration parameter value;
performing curve fitting on the plurality of first image positions to obtain a fitting circle center position; and
determining the offset error based on one set of the rotating end position and rotating image position, the fitting circle center position, and the average position.
4. The method of claim 1, wherein, in a case where a camera is located outside the mechanical arm, the determining a calibration parameter value and an offset error based on the plurality of sets of moving end positions and moving image positions and the plurality of sets of rotating end positions and rotating image positions comprises:
determining the calibration parameter value based on the plurality of sets of moving end positions and moving image positions and the plurality of sets of rotating end positions and rotating image positions;
for each rotating image position, obtaining a second image position in the base coordinate system corresponding to the rotating image position, based on the rotating image position and the calibration parameter value;
performing curve fitting on the plurality of second image positions to obtain a fitting circle center position and a fitting radius; and
determining the offset error based on one set of the rotating end position and rotating image position, the fitting circle center position, and the fitting radius.
5. The method of claim 4, wherein the determining the offset error based on one set of the rotating end position and rotating image position, the fitting circle center position, and the fitting radius comprises:
determining the inclination angle of the straight line on which the rotating end position and the fitting circle center position lie, based on the rotating end position and the fitting circle center position; and
obtaining the offset error based on the sine of the inclination angle, the cosine of the inclination angle, and the fitting radius.
6. The method of claim 1, wherein, in a case where the calibration plane on which the calibration object is located and the working plane are different planes, before the obtaining a target operation position corresponding to the object to be operated based on the conversion position and the offset error, the method further comprises:
acquiring a plurality of sets of corrected end positions and corrected image positions, wherein a corrected end position is the position of the end of the mechanical arm in the base coordinate system after the mechanical arm moves to a corrected calibration object located on the working plane, and a corrected image position is the position of the corrected calibration object in a captured image;
determining a corrected conversion position corresponding to each corrected image position based on the corrected image position and the calibration parameter value; and
determining a plane conversion relationship between the calibration plane and the working plane based on the plurality of sets of corrected end positions and corrected conversion positions;
and wherein the obtaining a target operation position corresponding to the object to be operated based on the conversion position and the offset error comprises:
obtaining an initial operation position corresponding to the object to be operated based on the conversion position and the offset error; and
obtaining the target operation position corresponding to the object to be operated based on the initial operation position and the plane conversion relationship.
7. A mechanical arm calibration apparatus, characterized by comprising:
a determining module, configured to determine a calibration parameter value and an offset error based on a plurality of sets of moving end positions and moving image positions and a plurality of sets of rotating end positions and rotating image positions, wherein a moving end position is the position of the end of the mechanical arm in a base coordinate system after the mechanical arm moves, a moving image position is the position of a calibration object in a captured image after the mechanical arm moves, a rotating end position is the position of the end of the mechanical arm in the base coordinate system after the mechanical arm rotates, and a rotating image position is the position of the calibration object in a captured image after the mechanical arm rotates;
an acquisition module, configured to acquire a target image position of an object to be operated and determine a conversion position corresponding to the object to be operated based on the target image position and the calibration parameter value; and
a correction module, configured to obtain a target operation position corresponding to the object to be operated based on the conversion position and the offset error.
8. A smart device, comprising a memory and a processor, the memory storing a computer program, characterized in that the processor implements the steps of the method of any one of claims 1 to 6 when executing the computer program.
9. A computer readable storage medium, on which a computer program is stored, characterized in that the computer program, when being executed by a processor, implements the steps of the method of any of claims 1 to 6.
10. A computer program product comprising a computer program, characterized in that the computer program, when being executed by a processor, implements the steps of the method of any of claims 1 to 6.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202410774213.3A CN118342520B (en) | 2024-06-17 | Mechanical arm calibration method and device, intelligent equipment and storage medium |
Publications (2)
Publication Number | Publication Date |
---|---|
CN118342520A true CN118342520A (en) | 2024-07-16 |
CN118342520B CN118342520B (en) | 2024-10-29 |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20180214167A1 (en) * | 2017-02-02 | 2018-08-02 | Ethicon Llc | Calibration of a robotic surgical tool |
CN112880562A (en) * | 2021-01-19 | 2021-06-01 | 佛山职业技术学院 | Method and system for measuring pose error of tail end of mechanical arm |
CN114012731A (en) * | 2021-11-23 | 2022-02-08 | 深圳市如本科技有限公司 | Hand-eye calibration method and device, computer equipment and storage medium |
CN116091619A (en) * | 2022-12-27 | 2023-05-09 | 北京纳通医用机器人科技有限公司 | Calibration method, device, equipment and medium |
CN117301052A (en) * | 2023-09-19 | 2023-12-29 | 杭州海康机器人股份有限公司 | Pose conversion method, device, equipment and storage medium |
CN117817660A (en) * | 2023-12-26 | 2024-04-05 | 同方威视技术股份有限公司 | Mechanical arm calibration method, device and system, electronic equipment and storage medium |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN108346165B (en) | Robot and three-dimensional sensing assembly combined calibration method and device | |
CN111445533B (en) | Binocular camera calibration method, device, equipment and medium | |
CN111127422A (en) | Image annotation method, device, system and host | |
CN111801198A (en) | Hand-eye calibration method, system and computer storage medium | |
US20190096050A1 (en) | Method and device for three-dimensional reconstruction | |
CN112330752A (en) | Multi-camera combined calibration method and device, terminal equipment and readable storage medium | |
CN112862897B (en) | Phase-shift encoding circle-based rapid calibration method for camera in out-of-focus state | |
CN109215086A (en) | Camera extrinsic scaling method, equipment and system | |
CN109901123A (en) | Transducer calibration method, device, computer equipment and storage medium | |
CN109389642A (en) | Vision system is to the scaling method of robot, system and has store function device | |
CN115042184A (en) | Robot hand-eye coordinate conversion method and device, computer equipment and storage medium | |
CN114640833A (en) | Projection picture adjusting method and device, electronic equipment and storage medium | |
CN109493277A (en) | Probe data joining method, device, computer equipment and storage medium | |
CN115049744A (en) | Robot hand-eye coordinate conversion method and device, computer equipment and storage medium | |
CN117944025A (en) | Robot hand-eye calibration method, device, computer equipment and storage medium | |
CN118247184A (en) | Distortion correction method, device and equipment for projection system and readable storage medium | |
CN112902961B (en) | Calibration method, medium, calibration equipment and system based on machine vision positioning | |
CN118342520B (en) | Mechanical arm calibration method and device, intelligent equipment and storage medium | |
CN118342520A (en) | Mechanical arm calibration method and device, intelligent equipment and storage medium | |
CN113487685B (en) | Calibration method, device, equipment and storage medium of line laser scanning camera | |
CN114833825A (en) | Cooperative robot control method and device, computer equipment and storage medium | |
CN109493388A (en) | Rotating axis calibration method, device, computer equipment and storage medium | |
CN116100564B (en) | High-precision calibration method and device for calibrating manipulator | |
CN115582829B (en) | Position determining method and device for mechanical arm, electronic equipment and storage medium | |
CN118657841B (en) | Camera calibration method and device and electronic equipment |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant |