CN110930458A - Simple Nao robot camera external parameter calibration method - Google Patents

Info

Publication number
CN110930458A
CN110930458A (application CN201911006065.6A)
Authority
CN
China
Prior art keywords
coordinate system
robot
transformation
camera
joint
Prior art date
Legal status
Granted
Application number
CN201911006065.6A
Other languages
Chinese (zh)
Other versions
CN110930458B (en)
Inventor
陈启军 (Chen Qijun)
刘成菊 (Liu Chengju)
孙浩然 (Sun Haoran)
徐子晗 (Xu Zihan)
Current Assignee
Tongji University
Original Assignee
Tongji University
Priority date
Filing date
Publication date
Application filed by Tongji University
Priority to CN201911006065.6A
Publication of CN110930458A
Application granted
Publication of CN110930458B
Legal status: Active

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/80 - Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • Y - GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 - TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02T - CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T10/00 - Road transport of goods or passengers
    • Y02T10/10 - Internal combustion engine [ICE] based vehicles
    • Y02T10/40 - Engine management systems

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Manipulator (AREA)

Abstract

The invention relates to a simple Nao robot camera external parameter calibration method comprising the following steps. A calibration initialization step: a calibration plate containing two-dimensional codes is placed at a fixed position relative to the robot, the coordinates of each two-dimensional code relative to the robot are calibrated, and the coordinates are stored indexed by the two-dimensional code ID values. A data acquisition step: images of the two-dimensional codes are acquired by a camera, and the two-dimensional code ID value, camera information and joint information of each image are stored. A projection transformation step: based on the two-dimensional code ID values, projection transformation parameters are constructed from the camera information and joint information, and each two-dimensional code coordinate is projected into the acquired image plane containing that code. An optimization calculation step: an error function is constructed and the projection transformation parameters are optimized. A calibration result obtaining step: the optimized projection transformation parameters are evaluated and the external parameters of the camera are calibrated. The two-dimensional codes are ArUco two-dimensional codes. Compared with the prior art, the method has the advantages of high calibration efficiency, high accuracy, high reliability and ease of operation.

Description

Simple Nao robot camera external parameter calibration method
Technical Field
The invention relates to the field of robot camera external parameter calibration, in particular to a simple Nao robot camera external parameter calibration method.
Background
External parameter calibration of the Nao robot camera mostly arises in RoboCup competition, where the robot must complete the game relying only on its own sensors, without external sensors or manual assistance. The robot therefore needs its camera to recognize the ball, other robots and the goal on the field, to compute the relative positions of the field feature points and lines, and to estimate its own position on the field for localization. All of these tasks depend entirely on the correctness of the robot camera parameters, and the external parameters chiefly determine the relative pose between the camera coordinate system and the robot coordinate system. Most teams place the robot at a fixed point on the field and use the existing sideline information of the competition field to calibrate the camera external parameters. However, the field environment is complex during competition debugging, so an automatic point-collection method has difficulty acquiring points on the sidelines quickly and accurately, while manual point collection ensures accuracy but greatly increases the time consumed by the whole calibration process. The field-sideline-based Nao robot camera external parameter calibration method therefore cannot achieve both accuracy and speed. In addition, existing calibration methods also require an initial camera external parameter, and if the uncalibrated external parameter deviates too far from the ideal parameter, the calibration process may fail. A Nao robot camera external parameter calibration method independent of the competition field therefore needs to be designed, one that lets a Nao robot complete the camera external parameter calibration simply, conveniently and quickly, with the calibrated camera parameters laying a good foundation for subsequent vision tasks.
In the document, "analysis and calibration of camera parameters of an NAO robot based on image dead points" (hole dimension, inertia. electronic world, 2016(4):170 and 172), the camera shooting process is analyzed on the basis of the NAO robot as hardware, and a single picture of a three-dimensional space is obtained. The image is processed to find parallel lines in three-dimensional space that are mapped to the image scene. And finally, calculating vanishing points in the images, carrying out corresponding three-dimensional coordinate transformation, calibrating internal and external parameters of the camera of the NAO robot, and determining the spatial position of the NAO robot. The method needs complicated image vanishing point calculation steps, is complex, and has low precision of calibrating the external parameters of the robot camera.
Disclosure of Invention
The invention aims to overcome the defects of the prior art and provide a simple Nao robot camera external parameter calibration method which is easy to operate, high in accuracy and capable of being separated from a competition field.
The purpose of the invention can be realized by the following technical scheme:
a simple Nao robot camera external parameter calibration method comprises the following steps:
a calibration initialization step: the method comprises the following steps of fixedly placing a calibration plate containing a plurality of two-dimensional codes relative to the position of a robot, calibrating the coordinates of each two-dimensional code relative to the robot, and storing the coordinates based on the ID values of the two-dimensional codes;
a data acquisition step: acquiring images containing the two-dimensional codes through a camera on the robot, and simultaneously storing the ID value, camera information and joint information of the two-dimensional codes corresponding to each image;
a projection transformation step: based on the ID value of the two-dimension code, a projection transformation parameter is constructed through camera information and joint information, and each two-dimension code coordinate in the calibration initialization step is projected to an image plane which is acquired by a camera and contains the two-dimension code;
an optimization calculation step: constructing an error function based on the projection result of the projection transformation step, and optimizing the projection transformation parameters;
a calibration result obtaining step: evaluating the optimized projective transformation parameters in the optimization calculation step, if the optimized projective transformation parameters meet preset evaluation conditions, calibrating external parameters of the camera based on the projective transformation parameters, and otherwise, executing the calibration process again;
the two-dimensional code is an ArUco two-dimensional code.
Further, in the projective transformation step, the projective transformation parameters include the transformation from the world coordinate system to the camera coordinate system and the transformation from the camera coordinate system to the image plane, and the transformation from the world coordinate system to the camera coordinate system is expressed as:

$${}^{camera}T_{world} = {}^{camera}T_{robot} \cdot {}^{robot}T_{foot} \cdot {}^{foot}T_{world}$$

where ${}^{camera}T_{world}$ is the transformation from the world coordinate system to the camera coordinate system, the world coordinate system being the preset coordinate system of the scene in which the robot is located; ${}^{foot}T_{world}$ is the transformation from the world coordinate system to the robot foot coordinate system; ${}^{robot}T_{foot}$ is the transformation from the robot foot coordinate system to the robot coordinate system; and ${}^{camera}T_{robot}$ is the transformation from the robot coordinate system to the camera coordinate system.
Further, in the projective transformation step, the projective transformation parameters include the transformation from the world coordinate system to the camera coordinate system and the transformation from the camera coordinate system to the image plane. Considering that the upper body may not be vertical when the robot actually stands, the transformation from the world coordinate system to the camera coordinate system is expressed as:

$${}^{camera}T_{world} = {}^{camera}T_{robot} \cdot T(\theta_{torso}) \cdot {}^{robot}T_{foot} \cdot {}^{foot}T_{world}$$
$$T(\theta_{torso}) = Rot_y(\Delta\theta_{ty}) \cdot Rot_x(\Delta\theta_{tx})$$

where ${}^{camera}T_{world}$ is the transformation from the world coordinate system to the camera coordinate system, the world coordinate system being the preset coordinate system of the scene in which the robot is located; ${}^{foot}T_{world}$ is the transformation from the world coordinate system to the robot foot coordinate system; ${}^{robot}T_{foot}$ is the transformation from the robot foot coordinate system to the robot coordinate system; ${}^{camera}T_{robot}$ is the transformation from the robot coordinate system to the camera coordinate system; $T(\theta_{torso})$ is the correction transformation based on the robot torso rotation angle $\theta_{torso}$; $\Delta\theta_{ty}$ is the torso rotation angle about the y-axis; and $\Delta\theta_{tx}$ is the torso rotation angle about the x-axis.
Further, since the robot and the calibration board are placed manually, the error in the robot placement position is taken into account, and the expression of ${}^{foot}T_{world}$ is:

$${}^{foot}T_{world} = trans_x(-110) \cdot T(\Delta p, \Delta\theta)$$
$$T(\Delta p, \Delta\theta) = trans(\Delta p) \cdot Rot_z(\Delta\theta)$$

where $trans_x(-110)$ denotes a 110 mm translation along the negative x-axis of the world coordinate system; $T(\Delta p, \Delta\theta)$ denotes the translation and rotation transformation of the robot foot coordinate system, with $\Delta p$ the translation deviation and $\Delta\theta$ the rotation deviation; $trans(\Delta p)$ is a translation by $\Delta p$; and $Rot_z(\Delta\theta)$ is a rotation by the angle $\Delta\theta$ about the z-axis.
Further, the transformation of the robot coordinate system with respect to the robot foot coordinate system is considered to contain errors related to each joint of the leg, and the expression of ${}^{robot}T_{foot}$ is:

$${}^{robot}T_{foot} = \prod_{i=1}^{n} {}^{i}T_{i-1}\left(q_i + q_{correction_i}\right)$$

where ${}^{i}T_{i-1}$ is the transformation from joint i-1 to joint i of the robot leg, n is the total number of robot leg joints, $q_i$ is the joint angle in the joint information, and $q_{correction_i}$ is the measurement error of the joint angle.
Further, the representation of the robot camera in the robot coordinate system may contain errors, which are taken into account, and the expression of ${}^{camera}T_{robot}$ is:

$${}^{camera}T_{robot} = {}^{camera}T_{neck} \cdot {}^{neck}T_{robot}$$
$${}^{camera}T_{neck} = T(\Delta\theta) \cdot Rot_y(\theta_{cam}) \cdot trans_x(x) \cdot trans_z(z)$$
$$T(\Delta\theta) = Rot_z(\Delta\theta_z) \cdot Rot_y(\Delta\theta_y) \cdot Rot_x(\Delta\theta_x)$$
$${}^{neck}T_{robot} = Rot_y(q_{neckPitch}) \cdot Rot_z(q_{neckYaw}) \cdot trans_z(neckOffsetZ)$$

where ${}^{camera}T_{robot}$ is the transformation of the robot coordinate system into the robot camera coordinate system; ${}^{camera}T_{neck}$ is the transformation from the robot neck joint coordinate system to the robot camera coordinate system; ${}^{neck}T_{robot}$ is the transformation from the robot coordinate system to the robot neck joint coordinate system; $trans_z(z)$ is a translation of length z along the z-axis and $trans_x(x)$ a translation of length x along the x-axis; $T(\theta)$ is a rotation transformation by the angle θ; $\theta_{cam}$ is the inherent installation angle of the robot camera relative to the robot neck coordinate system and $Rot_y(\theta_{cam})$ the rotation about the y-axis by $\theta_{cam}$; $\Delta\theta$ is the correction angle of rotation about each coordinate axis and $T(\Delta\theta)$ the corresponding rotation transformation, with $\Delta\theta_x$, $\Delta\theta_y$ and $\Delta\theta_z$ the corrections of the camera rotation about the x-, y- and z-axes and $Rot_x(\Delta\theta_x)$, $Rot_y(\Delta\theta_y)$, $Rot_z(\Delta\theta_z)$ the corresponding rotation transformations; $neckOffsetZ$ is the translation of the neck yaw joint along the z-axis and $trans_z(neckOffsetZ)$ the corresponding translation along the z-axis; $q_{neckYaw}$ is the neck yaw joint angle and $q_{neckPitch}$ the neck pitch joint angle in the joint information, with $Rot_z(q_{neckYaw})$ and $Rot_y(q_{neckPitch})$ the corresponding rotation transformations.
Further, the error function in the optimization calculation step is constructed based on the pixel distance between corresponding points in the projection result, and its expression is:

$$LOSS = \sum_{s_i \in S} \left\| s_i - \pi\left(R\,m_j + t\right) \right\|$$

where LOSS is the error function; S is the two-dimensional code information saved in the data acquisition step; $s_i$ is the pixel coordinate of the i-th two-dimensional code point in the image in S; $m_j$ is the coordinate in the world coordinate system of the point, in the j-th two-dimensional code saved in the calibration initialization step, corresponding to $s_i$; $s_i$ and $m_j$ have the same two-dimensional code ID; $\pi$ denotes the projection onto the image plane; R is the equivalent 3 × 3 rotation matrix; and t is the equivalent 3 × 1 translation vector.
Further, the projection transformation parameters are iteratively optimized using a Gauss-Newton iterative algorithm combined with SGD momentum.
Further, in the calibration result obtaining step, the projection transformation parameters optimized in the optimization calculation step are evaluated using the ADD metric, which is defined as the average pixel distance between corresponding two-dimensional code points:

$$ADD = \frac{1}{|S|} \sum_{s_i \in S} \left\| s_i - \pi\left(R\,m_j + t\right) \right\|$$

where S is the two-dimensional code information saved in the data acquisition step; $s_i$ is the pixel coordinate of the i-th two-dimensional code point in the image in S; $m_j$ is the coordinate in the world coordinate system of the point, in the j-th two-dimensional code saved in the calibration initialization step, corresponding to $s_i$; $s_i$ and $m_j$ have the same two-dimensional code ID; $\pi$ denotes the projection onto the image plane; R is the rotation matrix and t the translation vector in the projection transformation parameters.

The expression of the evaluation condition is:

$$Status = \begin{cases} OK, & ADD \le threshold \\ NO, & ADD > threshold \end{cases}$$

where threshold is a preset threshold; Status = OK means the evaluation condition is satisfied and Status = NO means it is not.
Furthermore, the robot camera external parameter calibration method is integrated into a user interaction interface in modular form, so the whole calibration process is easy to operate and the Nao robot camera external parameters can be calibrated accurately and efficiently.
Compared with the prior art, the invention has the following advantages:
(1) The camera external parameter calibration method of the invention removes the dependence on field information: by merely setting up the two-dimensional code calibration plate and calibrating the coordinates of the two-dimensional codes relative to the robot, the camera parameters can be calibrated simply and quickly anywhere. During calibration the robot only needs to collect several images while standing, and the subsequent flow is handled automatically by modules in the user interaction interface without the robot's direct participation, greatly reducing the external parameter calibration time. In addition, ArUco two-dimensional codes are adopted, improving the efficiency of code recognition and hence of the camera external parameter calibration.
(2) The invention constructs projection transformation parameters while accounting for the various errors that may occur when the camera external parameters are actually calibrated, including: the error in the robot placement position, the error in the representation of the robot coordinate system in the robot foot coordinate system, the transformation error of the camera relative to the robot neck joint, the transformation error of the robot neck joint relative to the robot body, the camera installation error, camera displacement caused by the robot falling in a collision, and the error from the robot upper body not being vertical. This comprehensive treatment improves the accuracy of the projection transformation parameters and the speed of the subsequent optimization.
(3) The invention performs iterative optimization of the projection transformation parameters with a Gauss-Newton iterative algorithm combined with SGD momentum, which is accurate and efficient; the pixel distance between corresponding points in the projection result is used as the error function, ensuring the reliability of the error function.
(4) The method uses the ADD metric to evaluate the optimized projection transformation parameters, further ensuring their reliability and accuracy.
(5) The method for calibrating the external parameters of the camera of the robot is integrated in a user interaction interface in a modularized manner, so that the whole calibration process is easy to operate, and the calibration of the external parameters of the camera of the Nao robot can be accurately and efficiently completed.
Drawings
FIG. 1 is a schematic flow chart of a Nao robot camera external parameter calibration method of the present invention;
FIG. 2 is a schematic diagram of an ArUco two-dimensional code calibration plate according to the present invention;
FIG. 3 is a schematic diagram of the placement position of the Nao robot and the calibration plate and the world coordinate system according to the present invention;
fig. 4 is a schematic diagram of a Nao robot, (a) is a schematic diagram of positions of upper and lower cameras of the Nao robot, (b) is a schematic diagram of parameters of installation positions of the upper and lower cameras of the Nao robot, and (c) is a schematic diagram of a foot coordinate system and a robot coordinate system of the Nao robot;
fig. 5 is a schematic diagram of projection points and observation points when the optimization parameter is 0, in which dots arranged in an array are projection points, and centers of two-dimensional codes arranged in an array are observation points;
fig. 6 is a schematic diagram of projection points and observation points after optimization is completed, in which dots arranged in an array are projection points, and centers of two-dimensional codes arranged in an array are observation points;
FIG. 7 is a schematic diagram of user interaction interface 1 (main interface) of the present invention;
FIG. 8 is a schematic diagram of user interaction interface 2 (data collection interface) of the present invention;
FIG. 9 is a schematic diagram of user interaction interface 3 (optimization interface) of the present invention;
FIG. 10 is a schematic diagram of user interaction interface 4 (result evaluation interface) of the present invention.
Detailed Description
The invention is described in detail below with reference to the figures and specific embodiments. The present embodiment is implemented on the premise of the technical solution of the present invention, and a detailed implementation manner and a specific operation process are given, but the scope of the present invention is not limited to the following embodiments.
Example 1
The existing calibration technology depends on the field information of the competition field, so having a field available is a precondition for using it. In the RoboCup Standard Platform League, the playing field measures 6 m × 9 m, and in most cases it is not easy to find an open space of suitable size to lay out a field. Without the field information, the robot's external parameters cannot be calibrated from it, which makes the robot difficult to use. In an actual competition the number of teams is large and the debugging field provided to them is small, so time for calibrating on the competition field is precious. Meanwhile, the field environment is complex during competition debugging: an automatic point-collection method has difficulty acquiring sideline points quickly and accurately, while manual point collection ensures accuracy but greatly increases the time consumed by the whole calibration. The field-sideline-based Nao robot camera external parameter calibration method therefore cannot achieve both accuracy and speed. The long calibration time also forces the robot to stand for a long time, which damages its motors.
The camera external parameter calibration method of the invention is free of the dependence on field information: it can calibrate the camera parameters simply and quickly in any suitable place while guaranteeing an accurate calibration result. During calibration the robot only needs to collect several images while standing; the subsequent flow, such as the parameter optimization, requires no direct participation of the robot, greatly reducing the external parameter calibration time.
As shown in fig. 1, the simple method for calibrating external parameters of a Nao robot camera in this embodiment includes the following steps:
a calibration initialization step: the method comprises the following steps of fixedly placing a calibration plate containing a plurality of two-dimensional codes relative to the position of a robot, calibrating the coordinates of each two-dimensional code relative to the robot, and storing the coordinates based on the ID values of the two-dimensional codes;
a data acquisition step: acquiring images containing the two-dimensional codes through a camera on the robot, and simultaneously storing the ID value, camera information and joint information of the two-dimensional codes corresponding to each image;
a projection transformation step: based on the ID value of the two-dimension code, a projection transformation parameter is constructed through camera information and joint information, and each two-dimension code coordinate in the calibration initialization step is projected to an image plane which is acquired by a camera and contains the two-dimension code;
an optimization calculation step: constructing an error function based on the projection result of the projection transformation step, and optimizing the projection transformation parameters;
a calibration result obtaining step: evaluating the optimized projective transformation parameters in the optimization calculation step, if the optimized projective transformation parameters meet preset evaluation conditions, calibrating external parameters of the camera based on the projective transformation parameters, and otherwise, executing the calibration process again;
the two-dimensional code is an ArUco two-dimensional code.
The following describes in detail the steps of the method for calibrating the external parameters of the Nao robot camera in this embodiment:
1. calibration initialization step
As shown in FIG. 2, the calibration board is designed in advance. In this embodiment, the board contains 40 ArUco two-dimensional codes; each code is 4 cm in size and the spacing between codes is 1 cm.
As shown in FIG. 3, $O_W$ denotes the coordinate origin of the world coordinate system and $O_R$ the coordinate origin of the robot coordinate system. The robot is initially placed at the origin of the world coordinate system, and the error between its actual placement position and the coordinate origin is computed in the subsequent optimization. Because the sizes of the calibration plate and the two-dimensional codes are fixed, the coordinate value of each two-dimensional code relative to the robot foot coordinate system can be obtained. The coordinates of each two-dimensional code relative to the robot coordinate system are stored in a set M indexed by the two-dimensional code ID value.
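A minimal sketch of this initialization, assuming Python with numpy: the world coordinate of each marker centre is computed from the board geometry (4 cm markers on a 1 cm pitch, as in this embodiment) and stored in the set M indexed by marker ID. The 5 × 8 grid shape, the ID ordering and the board pose `board_origin_world` are illustrative assumptions, not values fixed by the patent.

```python
import numpy as np

MARKER_SIZE = 0.04          # 4 cm marker side length
SPACING = 0.01              # 1 cm gap between neighbouring markers
ROWS, COLS = 5, 8           # assumed arrangement of the 40 markers

def build_marker_map(board_origin_world):
    """Return the set M: marker ID -> marker-centre coordinate in the world frame."""
    M = {}
    pitch = MARKER_SIZE + SPACING
    for marker_id in range(ROWS * COLS):
        r, c = divmod(marker_id, COLS)
        # Markers are assumed to lie in a vertical plane facing the robot,
        # offset along the world y- (horizontal) and z- (vertical) axes.
        M[marker_id] = board_origin_world + np.array([0.0, c * pitch, r * pitch])
    return M
```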
2. Data acquisition step
As shown in FIG. 4, the Nao robot in this embodiment has an upper camera and a lower camera. The robot collects images containing each two-dimensional code through the upper and lower cameras in turn; during collection, the two-dimensional code ID value, camera information and joint information corresponding to each image are obtained and stored in a set S for subsequent use.
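A minimal sketch of one acquisition, assuming OpenCV's ArUco module (the pre-4.7 `cv2.aruco.detectMarkers` API) and a placeholder `joint_angles` mapping read from the robot's own middleware, which the patent does not specify:

```python
import cv2

ARUCO_DICT = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)  # assumed dictionary

def collect_sample(image, camera_name, joint_angles):
    """Detect markers in one image and return the records added to the set S."""
    corners, ids, _ = cv2.aruco.detectMarkers(image, ARUCO_DICT)
    records = []
    if ids is None:
        return records
    for marker_corners, marker_id in zip(corners, ids.flatten()):
        centre = marker_corners[0].mean(axis=0)    # pixel centre of the marker
        records.append({
            "id": int(marker_id),                  # two-dimensional code ID value
            "pixel": centre,                       # observed pixel coordinate s_i
            "camera": camera_name,                 # "upper" or "lower"
            "joints": dict(joint_angles),          # joint information at capture time
        })
    return records
```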
During collection, the robot's joint calibration data are obtained with the existing robot joint calibration program and used to adjust the robot's state, keeping the upper body as vertical as possible so that the inclination between the trunk and the vertical direction influences the calibration result as little as possible.
3. Step of projective transformation
Projection transformation parameters are constructed from the information stored in the set S; the points in the set M are transformed from the robot coordinate system to the robot camera coordinate system, and then from the robot camera coordinate system to the image coordinate system, giving a projection point set S'. Corresponding point pairs in S and S' are obtained through the two-dimensional code ID index, and an error function is constructed from the pixel distances of the corresponding point pairs in the image, so that a nonlinear optimization can be solved to obtain the result.
Since the points in the set M need to be projected onto the image, the transformation relationship between these coordinate systems must be established and the optimization variables determined. Projecting the points of M onto the image plane divides into two phases: establishing the representation of each point in the robot camera coordinate system, and establishing the mapping from the camera coordinate system to the image plane. The latter is completed using the camera intrinsic parameters and the pinhole imaging principle, as sketched below; computing the former requires considering and introducing some optimization variables.
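The camera-to-image phase is the standard pinhole projection; a minimal sketch, assuming known intrinsics fx, fy, cx, cy and a point already expressed in the camera frame with z along the optical axis:

```python
import numpy as np

def project_pinhole(p_cam, fx, fy, cx, cy):
    """Map a 3D point in the camera frame to pixel coordinates (u, v)."""
    X, Y, Z = p_cam
    if Z <= 0:
        raise ValueError("point is behind the camera")
    return np.array([fx * X / Z + cx, fy * Y / Z + cy])
```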
In this embodiment, the projection transformation parameters include the transformation from the world coordinate system to the camera coordinate system and the transformation from the camera coordinate system to the image plane. In the phase of establishing the representation of a point in the robot camera coordinate system, the transformation from the world coordinate system to the camera coordinate system is established as:

$${}^{camera}T_{world} = {}^{camera}T_{robot} \cdot {}^{robot}T_{foot} \cdot {}^{foot}T_{world}$$

where ${}^{camera}T_{world}$ is the transformation from the world coordinate system to the camera coordinate system, the world coordinate system being the preset coordinate system of the scene in which the robot is located; ${}^{foot}T_{world}$ is the transformation from the world coordinate system to the robot foot coordinate system; ${}^{robot}T_{foot}$ is the transformation from the robot foot coordinate system to the robot coordinate system; and ${}^{camera}T_{robot}$ is the transformation from the robot coordinate system to the camera coordinate system.
Considering that the upper body may not be vertical when the robot actually stands, the expression of the transformation from the world coordinate system to the camera coordinate system is updated to:

$${}^{camera}T_{world} = {}^{camera}T_{robot} \cdot T(\theta_{torso}) \cdot {}^{robot}T_{foot} \cdot {}^{foot}T_{world}$$
$$T(\theta_{torso}) = Rot_y(\Delta\theta_{ty}) \cdot Rot_x(\Delta\theta_{tx})$$

where the coordinate-system transformations are as above; $T(\theta_{torso})$ represents the deviation between the computed and the actual orientation of the robot torso, determined only by the torso rotation angle $\theta_{torso}$; $\Delta\theta_{ty}$ is the torso rotation angle about the y-axis; and $\Delta\theta_{tx}$ is the torso rotation angle about the x-axis.
Because the robot and the calibration board are placed manually, the error in the robot placement position cannot be ignored, so the expression of ${}^{foot}T_{world}$ is:

$${}^{foot}T_{world} = trans_x(-110) \cdot T(\Delta p, \Delta\theta)$$
$$T(\Delta p, \Delta\theta) = trans(\Delta p) \cdot Rot_z(\Delta\theta)$$

where $trans_x(-110)$ denotes a 110 mm translation along the negative x-axis of the world coordinate system; $T(\Delta p, \Delta\theta)$ represents the deviation between the real and theoretical positions of the robot foot coordinate system, with $\Delta p$ the translation deviation and $\Delta\theta$ the rotation deviation; $trans(\Delta p)$ is the translation by $\Delta p$; and $Rot_z(\Delta\theta)$ is the rotation by $\Delta\theta$ about the z-axis.
The representation of the robot coordinate system in the robot foot coordinate system also contains errors; this transformation is related to each joint of the leg, and the error is handled using the joint information acquired beforehand. The expression of ${}^{robot}T_{foot}$ is:

$${}^{robot}T_{foot} = \prod_{i=1}^{n} {}^{i}T_{i-1}\left(q_i + q_{correction_i}\right)$$

where ${}^{i}T_{i-1}$ is the transformation from joint i-1 to joint i of the robot leg, n is the total number of robot leg joints, $q_i$ is the joint angle measured by the joint sensor in the joint information, and $q_{correction_i}$ is the measurement error of the joint angle.
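A minimal sketch of this chain, assuming a `joint_transform(i, angle)` callback that supplies the Nao leg's per-link homogeneous transform (the patent does not spell out the link model):

```python
import numpy as np

def robot_T_foot(q, q_correction, joint_transform):
    """Compose iT_{i-1}(q_i + q_correction_i) along the n leg joints."""
    T = np.eye(4)
    for i, (qi, dqi) in enumerate(zip(q, q_correction), start=1):
        T = T @ joint_transform(i, qi + dqi)
    return T
```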
In addition, the representation of the robot camera in the robot coordinate system may contain errors. It can be split into the transformation from the robot neck joint coordinate system to the robot camera coordinate system and the transformation from the robot coordinate system to the robot neck joint coordinate system. The error of the latter can be eliminated through joint calibration; in the former, however, the camera installation error and the camera displacement caused by the robot falling after a collision must be considered, and this part of the error cannot be ignored. The expression of ${}^{camera}T_{robot}$ is therefore:

$${}^{camera}T_{robot} = {}^{camera}T_{neck} \cdot {}^{neck}T_{robot}$$
$${}^{camera}T_{neck} = T(\Delta\theta) \cdot Rot_y(\theta_{cam}) \cdot trans_x(x) \cdot trans_z(z)$$
$$T(\Delta\theta) = Rot_z(\Delta\theta_z) \cdot Rot_y(\Delta\theta_y) \cdot Rot_x(\Delta\theta_x)$$
$${}^{neck}T_{robot} = Rot_y(q_{neckPitch}) \cdot Rot_z(q_{neckYaw}) \cdot trans_z(neckOffsetZ)$$

where ${}^{camera}T_{robot}$ is the transformation from the robot coordinate system to the camera coordinate system; ${}^{camera}T_{neck}$ is the transformation from the robot neck joint coordinate system to the robot camera coordinate system; ${}^{neck}T_{robot}$ is the transformation from the robot coordinate system to the robot neck joint coordinate system; $trans_z(z)$ is a translation of length z along the z-axis and $trans_x(x)$ a translation of length x along the x-axis; $T(\theta)$ represents the deviation of the computed value of ${}^{camera}T_{neck}$ from its actual value, which is related only to rotation and is denoted by θ; $\theta_{cam}$ is the inherent installation angle of the robot camera relative to the robot neck coordinate system (1.2° for the upper camera and 39.7° for the lower camera) and $Rot_y(\theta_{cam})$ the rotation about the y-axis by $\theta_{cam}$; $\Delta\theta$ collects the correction angles of rotation about each coordinate axis, with $\Delta\theta_x$, $\Delta\theta_y$ and $\Delta\theta_z$ the corrections of the camera rotation about the x-, y- and z-axes and $Rot_x(\Delta\theta_x)$, $Rot_y(\Delta\theta_y)$, $Rot_z(\Delta\theta_z)$ the corresponding rotation transformations; $neckOffsetZ$ is the translation of the neck yaw joint along the z-axis and $trans_z(neckOffsetZ)$ the corresponding translation; $q_{neckYaw}$ and $q_{neckPitch}$ are the neck yaw and neck pitch joint angles measured by the joint sensors in the joint information, with $Rot_z(q_{neckYaw})$ and $Rot_y(q_{neckPitch})$ the corresponding rotation transformations.
In summary, the nine variables in the following table are introduced as optimization variables in this embodiment.

TABLE 1. Optimization variables

1. Upper camera x-axis rotation: the parameter Δθx in the upper-camera transformation
2. Upper camera y-axis rotation: the parameter Δθy in the upper-camera transformation
3. Lower camera x-axis rotation: the parameter Δθx in the lower-camera transformation
4. Lower camera y-axis rotation: the parameter Δθy in the lower-camera transformation
5. Torso x-axis rotation: the torso rotation angle Δθtx
6. Torso y-axis rotation: the torso rotation angle Δθty
7. Robot position deviation along the x-axis: the parameter X
8. Robot position deviation along the y-axis: the parameter Y
9. Robot position deviation along the z-axis: the parameter Z

The translation deviation Δp comprises X, Y and Z.
Based on the above calculation steps, the transformation from the world coordinate system to the camera coordinate system can be written as a composition parameterized by the optimization variables and the joint information:

$${}^{camera}T_{world}(\beta, q) = {}^{camera}T_{robot}(\beta, q) \cdot T(\theta_{torso})(\beta) \cdot {}^{robot}T_{foot}(q) \cdot {}^{foot}T_{world}(\beta)$$

where the parameter vector β holds the optimization variable results to be solved for (e.g., $\beta_{\Delta\theta_x}$ is the value of $\Delta\theta_x$) and q is the joint information.
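A minimal sketch of this composition, assuming 4 × 4 homogeneous transforms; the dictionary keys for β, the exact composition order and the per-camera constants follow the reconstruction above and are illustrative assumptions:

```python
import numpy as np

def rot_x(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[1, 0, 0, 0], [0, c, -s, 0], [0, s, c, 0], [0, 0, 0, 1]])

def rot_y(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, 0, s, 0], [0, 1, 0, 0], [-s, 0, c, 0], [0, 0, 0, 1]])

def rot_z(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0, 0], [s, c, 0, 0], [0, 0, 1, 0], [0, 0, 0, 1]])

def trans(v):
    T = np.eye(4)
    T[:3, 3] = v
    return T

def camera_T_world(beta, q, leg_chain, cam):
    """beta: the nine optimization variables; q: joint information saved at capture
    time; leg_chain: foot-to-robot kinematics (see the earlier sketch); cam:
    per-camera constants (name, install angle, x/z offsets, neckOffsetZ)."""
    T_foot_world = trans([-0.110 + beta["X"], beta["Y"], beta["Z"]])
    T_robot_foot = leg_chain(q["leg"])
    T_torso = rot_y(beta["torso_y"]) @ rot_x(beta["torso_x"])  # T(theta_torso)
    T_neck_robot = (rot_y(q["neckPitch"]) @ rot_z(q["neckYaw"])
                    @ trans([0.0, 0.0, cam["neckOffsetZ"]]))
    T_cam_neck = (rot_y(beta[cam["name"] + "_y"]) @ rot_x(beta[cam["name"] + "_x"])
                  @ rot_y(cam["install"]) @ trans([cam["x"], 0.0, cam["z"]]))
    return T_cam_neck @ T_neck_robot @ T_torso @ T_robot_foot @ T_foot_world
```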
4. Optimization calculation step
Projecting the points of the set M onto the image plane using the current values of the optimization variables gives a set of projection points; a pixel distance exists between each corresponding point pair, and this distance defines the error function LOSS:

$$LOSS = \sum_{s_i \in S} \left\| s_i - \pi\left(R\,m_j + t\right) \right\|$$

where LOSS is the error function; S is the two-dimensional code information saved in the data acquisition step; $s_i$ is the pixel coordinate in the image of the centre point of the i-th two-dimensional code in S; $m_j$ is the coordinate in the world coordinate system of the centre point of the j-th two-dimensional code saved in the calibration initialization step, with $s_i$ and $m_j$ sharing the same two-dimensional code ID; during the optimization calculation the point $m_j$ is projected onto the image plane (denoted π) through a series of transformation matrices containing the optimization parameters; R is the equivalent 3 × 3 rotation matrix and t the equivalent 3 × 1 translation vector.
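A minimal sketch of LOSS, assuming the record layout from the acquisition sketch, the marker map M, and the pinhole `project` from the earlier sketch:

```python
import numpy as np

def loss(S, M, R, t, project):
    """Summed pixel distance between observed centres s_i and projected world points m_j."""
    total = 0.0
    for rec in S:
        m_j = M[rec["id"]]              # world point with the same marker ID
        p_cam = R @ m_j + t             # world -> camera coordinates
        total += np.linalg.norm(rec["pixel"] - project(p_cam))
    return total
```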
Using this error function, the optimization variable values of the projection transformation step are solved with a Gauss-Newton iterative algorithm combined with SGD (stochastic gradient descent) momentum.
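A minimal sketch of the momentum-accelerated iteration. The patent combines a Gauss-Newton step with SGD momentum; here, as a simplification, the descent direction is a finite-difference gradient of LOSS over the nine variables rather than the full Gauss-Newton normal equations:

```python
import numpy as np

def optimize(beta0, loss_fn, lr=1e-3, momentum=0.9, iters=500, eps=1e-6):
    beta = np.asarray(beta0, dtype=float)
    v = np.zeros_like(beta)
    for _ in range(iters):
        f0 = loss_fn(beta)
        grad = np.zeros_like(beta)
        for k in range(beta.size):      # finite-difference gradient estimate
            b = beta.copy()
            b[k] += eps
            grad[k] = (loss_fn(b) - f0) / eps
        v = momentum * v - lr * grad    # momentum update
        beta = beta + v
    return beta
```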
5. Calibration result obtaining step
As shown in FIG. 5 and FIG. 6, once a set of optimized parameters is obtained, the points of the set M are projected into the image plane using the pose transformation matrix corrected by the optimized parameters, giving a new projection point set S'. The projection transformation parameters optimized in the optimization calculation step are then evaluated with the ADD metric, whose expression is:

$$ADD = \frac{1}{|S|} \sum_{s_i \in S} \left\| s_i - \pi\left(R\,m_j + t\right) \right\|$$

where S is the two-dimensional code information saved in the data acquisition step; $s_i$ is the pixel coordinate in the image of the centre point of the i-th two-dimensional code in S; $m_j$ is the coordinate in the world coordinate system of the centre point of the j-th two-dimensional code saved in the calibration initialization step, with $s_i$ and $m_j$ sharing the same two-dimensional code ID; $m_j$ is projected onto the image plane through a series of transformation matrices containing the optimization parameters; R is the 3 × 3 rotation matrix and t the 3 × 1 translation vector in the projection transformation parameters.
The expression of the evaluation condition is:

$$Status = \begin{cases} OK, & ADD \le threshold \\ NO, & ADD > threshold \end{cases}$$

where threshold is a preset threshold, set to 2 pixels in this embodiment; Status = OK means the evaluation condition is satisfied and Status = NO means it is not.
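A minimal sketch of the evaluation, reusing the names from the earlier sketches and the 2-pixel threshold of this embodiment:

```python
import numpy as np

def evaluate(S, M, R, t, project, threshold=2.0):
    """Return (Status, ADD): Status is "OK" if the mean pixel distance is within threshold."""
    dists = [np.linalg.norm(rec["pixel"] - project(R @ M[rec["id"]] + t)) for rec in S]
    add = float(np.mean(dists))
    return ("OK" if add <= threshold else "NO"), add
```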
In this embodiment, the simple Nao robot camera external parameter calibration method is realized through a data acquisition module, an optimization calculation module and a result evaluation module: the data acquisition module runs the data acquisition step and part of the calibration initialization step; the optimization calculation module runs the projection transformation step and the optimization calculation step; and the result evaluation module runs the calibration result obtaining step.
The data acquisition module, optimization calculation module and result evaluation module are integrated into a Qt-based user interaction interface, as shown in FIGS. 7 to 10. The whole Nao robot camera external parameter calibration task is carried out through this interface, making the calibration process easy to operate, accurate and efficient.
The foregoing detailed description of the preferred embodiments of the invention has been presented. It should be understood that numerous modifications and variations could be devised by those skilled in the art in light of the present teachings without departing from the inventive concepts. Therefore, the technical solutions available to those skilled in the art through logic analysis, reasoning and limited experiments based on the prior art according to the concept of the present invention should be within the scope of protection defined by the claims.

Claims (10)

1. A simple Nao robot camera external parameter calibration method is characterized by comprising the following steps:
a calibration initialization step: the method comprises the following steps of fixedly placing a calibration plate containing a plurality of two-dimensional codes relative to the position of a robot, calibrating the coordinates of each two-dimensional code relative to the robot, and storing the coordinates based on the ID values of the two-dimensional codes;
a data acquisition step: acquiring images containing the two-dimensional codes through a camera on the robot, and simultaneously storing the ID value, camera information and joint information of the two-dimensional codes corresponding to each image;
a projection transformation step: based on the ID value of the two-dimension code, a projection transformation parameter is constructed through camera information and joint information, and each two-dimension code coordinate in the calibration initialization step is projected to an image plane which is acquired by a camera and contains the two-dimension code;
an optimization calculation step: constructing an error function based on the projection result of the projection transformation step, and optimizing the projection transformation parameters;
a calibration result obtaining step: evaluating the optimized projective transformation parameters in the optimization calculation step, if the optimized projective transformation parameters meet preset evaluation conditions, calibrating external parameters of the camera based on the projective transformation parameters, and otherwise, executing the calibration process again;
the two-dimensional code is an ArUco two-dimensional code.
2. The simple Nao robot camera extrinsic parameter calibration method according to claim 1, wherein in the projective transformation step, the projective transformation parameters include a transformation from a world coordinate system to a camera coordinate system and a transformation from the camera coordinate system to an image plane, and an expression of the transformation from the world coordinate system to the camera coordinate system is as follows:
$${}^{camera}T_{world} = {}^{camera}T_{robot} \cdot {}^{robot}T_{foot} \cdot {}^{foot}T_{world}$$

where ${}^{camera}T_{world}$ is the transformation from the world coordinate system to the camera coordinate system, the world coordinate system being the preset coordinate system of the scene in which the robot is located; ${}^{foot}T_{world}$ is the transformation from the world coordinate system to the robot foot coordinate system; ${}^{robot}T_{foot}$ is the transformation from the robot foot coordinate system to the robot coordinate system; and ${}^{camera}T_{robot}$ is the transformation from the robot coordinate system to the camera coordinate system.
3. The simple Nao robot camera extrinsic parameter calibration method according to claim 1, wherein in the projective transformation step, the projective transformation parameters include a transformation from a world coordinate system to a camera coordinate system and a transformation from the camera coordinate system to an image plane, and an expression of the transformation from the world coordinate system to the camera coordinate system is as follows:
$${}^{camera}T_{world} = {}^{camera}T_{robot} \cdot T(\theta_{torso}) \cdot {}^{robot}T_{foot} \cdot {}^{foot}T_{world}$$
$$T(\theta_{torso}) = Rot_y(\Delta\theta_{ty}) \cdot Rot_x(\Delta\theta_{tx})$$

where ${}^{camera}T_{world}$ is the transformation from the world coordinate system to the camera coordinate system, the world coordinate system being the preset coordinate system of the scene in which the robot is located; ${}^{foot}T_{world}$ is the transformation from the world coordinate system to the robot foot coordinate system; ${}^{robot}T_{foot}$ is the transformation from the robot foot coordinate system to the robot coordinate system; ${}^{camera}T_{robot}$ is the transformation from the robot coordinate system to the camera coordinate system; $T(\theta_{torso})$ is the correction transformation based on the robot torso rotation angle $\theta_{torso}$; $\Delta\theta_{ty}$ is the torso rotation angle about the y-axis; and $\Delta\theta_{tx}$ is the torso rotation angle about the x-axis.
4. The simple Nao robot camera external parameter calibration method according to claim 2 or 3, wherein the expression of ${}^{foot}T_{world}$ is:

$${}^{foot}T_{world} = trans_x(-110) \cdot T(\Delta p, \Delta\theta)$$
$$T(\Delta p, \Delta\theta) = trans(\Delta p) \cdot Rot_z(\Delta\theta)$$

where $trans_x(-110)$ denotes a 110 mm translation along the negative x-axis of the world coordinate system; $T(\Delta p, \Delta\theta)$ denotes the translation and rotation transformation of the robot foot coordinate system, with $\Delta p$ the translation deviation and $\Delta\theta$ the rotation deviation; $trans(\Delta p)$ is a translation by $\Delta p$; and $Rot_z(\Delta\theta)$ is a rotation by the angle $\Delta\theta$ about the z-axis.
5. The simple Nao robot camera external parameter calibration method according to claim 2 or 3, wherein the expression of ${}^{robot}T_{foot}$ is:

$${}^{robot}T_{foot} = \prod_{i=1}^{n} {}^{i}T_{i-1}\left(q_i + q_{correction_i}\right)$$

where ${}^{i}T_{i-1}$ is the transformation from joint i-1 to joint i of the robot leg, n is the total number of robot leg joints, $q_i$ is the joint angle in the joint information, and $q_{correction_i}$ is the measurement error of the joint angle.
6. The simple Nao robot camera external parameter calibration method according to claim 2 or 3, wherein the expression of ${}^{camera}T_{robot}$ is:

$${}^{camera}T_{robot} = {}^{camera}T_{neck} \cdot {}^{neck}T_{robot}$$
$${}^{camera}T_{neck} = T(\Delta\theta) \cdot Rot_y(\theta_{cam}) \cdot trans_x(x) \cdot trans_z(z)$$
$$T(\Delta\theta) = Rot_z(\Delta\theta_z) \cdot Rot_y(\Delta\theta_y) \cdot Rot_x(\Delta\theta_x)$$
$${}^{neck}T_{robot} = Rot_y(q_{neckPitch}) \cdot Rot_z(q_{neckYaw}) \cdot trans_z(neckOffsetZ)$$

where ${}^{camera}T_{robot}$ is the transformation of the robot coordinate system into the robot camera coordinate system; ${}^{camera}T_{neck}$ is the transformation from the robot neck joint coordinate system to the robot camera coordinate system; ${}^{neck}T_{robot}$ is the transformation from the robot coordinate system to the robot neck joint coordinate system; $trans_z(z)$ is a translation of length z along the z-axis and $trans_x(x)$ a translation of length x along the x-axis; $T(\theta)$ is a rotation transformation by the angle θ; $\theta_{cam}$ is the inherent installation angle of the robot camera relative to the robot neck coordinate system and $Rot_y(\theta_{cam})$ the rotation about the y-axis by $\theta_{cam}$; $\Delta\theta$ is the correction angle of rotation about each coordinate axis and $T(\Delta\theta)$ the corresponding rotation transformation, with $\Delta\theta_x$, $\Delta\theta_y$ and $\Delta\theta_z$ the corrections of the camera rotation about the x-, y- and z-axes and $Rot_x(\Delta\theta_x)$, $Rot_y(\Delta\theta_y)$, $Rot_z(\Delta\theta_z)$ the corresponding rotation transformations; $neckOffsetZ$ is the translation of the neck yaw joint along the z-axis and $trans_z(neckOffsetZ)$ the corresponding translation along the z-axis; $q_{neckYaw}$ is the neck yaw joint angle and $q_{neckPitch}$ the neck pitch joint angle in the joint information, with $Rot_z(q_{neckYaw})$ and $Rot_y(q_{neckPitch})$ the corresponding rotation transformations.
7. The simple Nao robot camera extrinsic parameter calibration method according to claim 1, wherein in the optimization calculation step, an error function is constructed based on pixel distances between corresponding points in the projection result, and an expression of the error function is:
$$LOSS = \sum_{s_i \in S} \left\| s_i - \pi\left(R\,m_j + t\right) \right\|$$

where LOSS is the error function; S is the two-dimensional code information saved in the data acquisition step; $s_i$ is the pixel coordinate of the i-th two-dimensional code point in the image in S; $m_j$ is the coordinate in the world coordinate system of the point, in the j-th two-dimensional code saved in the calibration initialization step, corresponding to $s_i$; $s_i$ and $m_j$ have the same two-dimensional code ID; $\pi$ denotes the projection onto the image plane; R is the equivalent 3 × 3 rotation matrix; and t is the equivalent 3 × 1 translation vector.
8. The simple Nao robot camera external parameter calibration method according to claim 1, wherein the optimization of the projection transformation parameters specifically adopts a Gauss-Newton iterative algorithm combined with SGD momentum to iteratively optimize the projection transformation parameters.
9. The simple Nao robot camera external parameter calibration method according to claim 1, wherein in the calibration result obtaining step, the projection transformation parameters optimized in the optimization calculation step are evaluated using the ADD metric, the ADD metric being defined as the average pixel distance between corresponding two-dimensional code points, with the expression:

$$ADD = \frac{1}{|S|} \sum_{s_i \in S} \left\| s_i - \pi\left(R\,m_j + t\right) \right\|$$

where S is the two-dimensional code information saved in the data acquisition step; $s_i$ is the pixel coordinate of the i-th two-dimensional code point in the image in S; $m_j$ is the coordinate in the world coordinate system of the point, in the j-th two-dimensional code saved in the calibration initialization step, corresponding to $s_i$; $s_i$ and $m_j$ have the same two-dimensional code ID; $\pi$ denotes the projection onto the image plane; R is the rotation matrix and t the translation vector in the projection transformation parameters.

The expression of the evaluation condition is:

$$Status = \begin{cases} OK, & ADD \le threshold \\ NO, & ADD > threshold \end{cases}$$

where threshold is a preset threshold, Status = OK means the evaluation condition is satisfied, and Status = NO means the evaluation condition is not satisfied.
10. The simple Nao robot camera extrinsic parameter calibration method according to claim 1, wherein the robot camera extrinsic parameter calibration method is integrated in a user interaction interface in a modular form.
CN201911006065.6A 2019-10-22 2019-10-22 Simple Nao robot camera external parameter calibration method Active CN110930458B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911006065.6A CN110930458B (en) 2019-10-22 2019-10-22 Simple Nao robot camera external parameter calibration method


Publications (2)

Publication Number Publication Date
CN110930458A true CN110930458A (en) 2020-03-27
CN110930458B CN110930458B (en) 2023-05-02

Family

ID=69849488

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911006065.6A Active CN110930458B (en) 2019-10-22 2019-10-22 Simple Nao robot camera external parameter calibration method

Country Status (1)

Country Link
CN (1) CN110930458B (en)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107492127A (en) * 2017-09-18 2017-12-19 丁志宇 Light-field camera parameter calibration method, device, storage medium and computer equipment
CN108154536A (en) * 2017-12-13 2018-06-12 南京航空航天大学 The camera calibration method of two dimensional surface iteration
CN109118545A (en) * 2018-07-26 2019-01-01 深圳市易尚展示股份有限公司 3-D imaging system scaling method and system based on rotary shaft and binocular camera
CN110276808A (en) * 2019-06-11 2019-09-24 合肥工业大学 A kind of method of one camera combination two dimensional code measurement glass plate unevenness

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
TOBIAS KASTNER et al.: "Automatic Robot Calibration for the NAO" *
JIANG XIAOJUAN et al.: "Separate calibration of the internal and external parameters of a GRB-400 robot camera" (GRB-400机器人摄像机内外参数的分离标定) *

Also Published As

Publication number Publication date
CN110930458B (en) 2023-05-02

Similar Documents

Publication Publication Date Title
US11049280B2 (en) System and method for tying together machine vision coordinate spaces in a guided assembly environment
JP4914039B2 (en) Information processing method and apparatus
US20050256395A1 (en) Information processing method and device
CN106624709A (en) Assembly system and method based on binocular vision
JP3138080B2 (en) Automatic calibration device for vision sensor
CN112476489B (en) Flexible mechanical arm synchronous measurement method and system based on natural characteristics
CN109272555B (en) External parameter obtaining and calibrating method for RGB-D camera
CN113409285A (en) Method and system for monitoring three-dimensional deformation of immersed tunnel joint
CN113870366B (en) Calibration method and calibration system of three-dimensional scanning system based on pose sensor
CN110653819A (en) System and method for generating kicking action of humanoid robot
CN111438688A (en) Robot correction method, robot correction device, computer equipment and storage medium
CN112233184B (en) Laser radar and camera calibration parameter correction method and device based on image registration
CN114952856A (en) Mechanical arm hand-eye calibration method, system, computer and readable storage medium
CN115205286B (en) Method for identifying and positioning bolts of mechanical arm of tower-climbing robot, storage medium and terminal
CN114310901A (en) Coordinate system calibration method, apparatus, system and medium for robot
CN115187612A (en) Plane area measuring method, device and system based on machine vision
CN109406525B (en) Bridge apparent disease detection system and detection method thereof
CN112837314B (en) Fruit tree canopy parameter detection system and method based on 2D-LiDAR and Kinect
CN116740187A (en) Multi-camera combined calibration method without overlapping view fields
CN110930458A (en) Simple Nao robot camera external parameter calibration method
CN112102415A (en) Depth camera external parameter calibration method, device and equipment based on calibration ball
CN111696141A (en) Three-dimensional panoramic scanning acquisition method and device and storage device
CN111735447A (en) Satellite-sensitive-simulation type indoor relative pose measurement system and working method thereof
CN116021519A (en) TOF camera-based picking robot hand-eye calibration method and device
Zhang et al. Research on object panoramic 3D point cloud reconstruction system based on structure from motion

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant