CN109920006B - Calibration method for pose transformation matrix of automatic throwing system of green feeder - Google Patents

Calibration method for pose transformation matrix of automatic throwing system of green feeder

Info

Publication number
CN109920006B
CN109920006B (application CN201910054811.2A)
Authority
CN
China
Prior art keywords
pose
mechanical arm
transformation matrix
coordinate system
cab
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910054811.2A
Other languages
Chinese (zh)
Other versions
CN109920006A (en)
Inventor
何创新
班训康
苗中华
刘成良
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
University of Shanghai for Science and Technology
Original Assignee
University of Shanghai for Science and Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by University of Shanghai for Science and Technology filed Critical University of Shanghai for Science and Technology
Priority to CN201910054811.2A priority Critical patent/CN109920006B/en
Publication of CN109920006A publication Critical patent/CN109920006A/en
Application granted granted Critical
Publication of CN109920006B publication Critical patent/CN109920006B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02A TECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
    • Y02A 40/00 Adaptation technologies in agriculture, forestry, livestock or agroalimentary production
    • Y02A 40/80 Adaptation technologies in agriculture, forestry, livestock or agroalimentary production in fisheries management
    • Y02A 40/81 Aquaculture, e.g. of fish

Landscapes

  • Length Measuring Devices By Optical Means (AREA)
  • Manipulator (AREA)

Abstract

A calibration method for the pose transformation matrix of the automatic throwing system of a green feeder comprises the following steps: S1, setting a plurality of marker points within the camera field of view and keeping them motionless relative to the mechanical arm turntable throughout the calibration; S2, rotating the mechanical arm to a pose, acquiring the Euler angle information of the mechanical arm at this pose, and acquiring the color map and the depth point cloud map of the camera at this pose; S3, finding the three-dimensional information of the centers of all the marker points in the camera coordinate system from the color map and the depth point cloud map; S4, repeating the steps S2 to S3 to obtain the three-dimensional information of the centers of all the marker points in the camera coordinate system and the Euler angles of the mechanical arm at a plurality of poses; S5, according to the three-dimensional information in the camera coordinate system obtained in the step S4, calculating the pose transformation matrices between the camera coordinate systems at different poses; and S6, according to the Euler angles of the mechanical arm obtained in the step S4, calculating the pose transformation matrices between the mechanical arm coordinate systems at different poses.

Description

Calibration method for pose transformation matrix of automatic throwing system of green feeder
Technical Field
The invention relates to the technical field of machine vision and agricultural automation, in particular to a calibration method of a pose transformation matrix of an automatic throwing system of a green feeding machine.
Background
The green feeder is a common agricultural harvesting machine, mainly used for harvesting low-growing green fodder crops such as forage grass, oats, and beet stems and leaves. It consists mainly of a front header, a machine-body feeding device, and a shredding and throwing device. The front header carries a high-speed rotary cutter with multiple flail knives for harvesting the crop; the machine body collects and shreds the crop, which is then expelled by the throwing device and received by a transport vehicle travelling alongside or behind in the same direction. In a green feeder with an automatic throwing system, an RGB-D camera fixed on the mechanical arm identifies the position and posture of the truck hopper, and the mechanical arm is adjusted automatically, based on the coordinate transformation matrix between the camera and the arm, to throw the material precisely. An RGB-D camera (hereinafter simply referred to as a camera) is a camera that can obtain color information and depth information at the same time.
The automatic throwing system of the green feeder uses the camera to identify and locate the truck hopper automatically; a prerequisite is that the pose transformation matrix between the camera coordinate system and the mechanical arm coordinate system is known. The process of solving this pose transformation matrix is called calibration. A green feeder with an automatic throwing system must have this matrix calibrated before leaving the factory. At present, the three-dimensional coordinates of the marker points on a calibration object are measured manually in the mechanical arm coordinate system, the corresponding coordinates in the camera coordinate system are obtained by an algorithm, and the pose transformation matrix between the two coordinate systems is then computed from the coordinates of the same marker points in both systems by a 3D-3D pose estimation method. This manual-measurement-based approach is time-consuming and laborious and introduces large calibration errors.
Disclosure of Invention
The invention aims to overcome the defects of the prior art and provides a calibration method for the pose transformation matrix of the automatic throwing system of a green feeder.
The technical scheme adopted by the invention is as follows:
an attitude sensor is fixed on the mechanical arm of the green feeding machine;
a marking plate with a gray background is manufactured, the marking plate comprising four circular or rectangular red marker points, with the distance between any two marker points being greater than 80 cm.
The calibration process of the pose transformation matrix of the automatic throwing system of the green feeder comprises the following steps:
step 1: placing a calibration plate on the side or back of the green feeding machine and in the field of view of the camera, and keeping the calibration plate motionless in the calibration process;
step 2: rotating the mechanical arm to a pose;
step 3: acquiring Euler angle information of the mechanical arm under the pose through a pose sensor;
step 4: acquiring a color map and a depth point cloud map of the camera under the pose;
step 5: separating the marker point set from the color map by a color filtering method;
step 6: obtaining the pixel coordinates of each marker center from the separated marker point set by a clustering algorithm, mapping all marker centers into the depth point cloud map, and finding the three-dimensional coordinates of each marker center in the camera coordinate system;
Steps 2 to 6 are carried out for four different mechanical arm poses, denoted the first pose A, the second pose B, the third pose C and the fourth pose D, to obtain the three-dimensional coordinates of all marker centers in the camera coordinate system and the Euler angles of the mechanical arm at each of the four poses. Every pose must keep all marker points within the camera field of view, and between any two poses the rotation angle of the mechanical arm in the horizontal or vertical direction must differ by more than 10 degrees;
step 7: from the three-dimensional coordinates of all marker centers in the camera coordinate system at the four different mechanical arm poses, the pose transformation matrix Hc between the camera coordinate systems of any two different poses is calculated; each pose has its own camera coordinate system, i.e., every pose is associated with both a mechanical arm coordinate system and a camera coordinate system.
The camera coordinate system refers to a three-dimensional coordinate system with a camera as an origin.
In a preferred example, the pose transformation matrix H_cAB between the camera coordinate systems at the first pose A and the second pose B, the pose transformation matrix H_cAC between the camera coordinate systems at the first pose A and the third pose C, and the pose transformation matrix H_cAD between the camera coordinate systems at the first pose A and the fourth pose D are solved.
Step 8: from the Euler angles of the mechanical arm at the four different poses, the pose transformation matrix Hg between the mechanical arm coordinate systems of any two different poses is calculated. Each pose has its own mechanical arm coordinate system, i.e., every pose is associated with both a mechanical arm coordinate system and a camera coordinate system.
The mechanical arm coordinate system refers to a three-dimensional coordinate system that takes the center of the mechanical arm turntable as its origin and is parallel to the attitude sensor coordinate system.
In a preferred example, the pose transformation matrix H_gAB between the mechanical arm coordinate systems at the first pose A and the second pose B, the pose transformation matrix H_gAC between the mechanical arm coordinate systems at the first pose A and the third pose C, and the pose transformation matrix H_gAD between the mechanical arm coordinate systems at the first pose A and the fourth pose D are solved.
Step 9: a Tsai-Lenz calibration algorithm is adopted to obtain the pose transformation matrix between the camera coordinate system and the mechanical arm coordinate system. Since the camera and the mechanical arm are fixed relative to each other, this matrix does not change as the pose of the mechanical arm changes, so a single fixed pose transformation matrix is finally obtained. In a preferred embodiment, the pose transformation matrix H_cg is solved from H_cAB, H_cAC, H_cAD and H_gAB, H_gAC, H_gAD.
The technical effect of the invention is that the pose transformation matrix between the camera coordinate system and the mechanical arm coordinate system of the automatic throwing system is computed simply by reading the Euler angles of the attitude sensor at several mechanical arm poses and the three-dimensional coordinates of the calibration points in the camera coordinate system. Compared with the existing calibration method based on manual measurement, the method is simple and fast to operate, highly accurate, highly automated, and inexpensive to implement.
Drawings
FIG. 1 is a schematic view of an embodiment of the present invention.
FIG. 2 is a schematic diagram of a calibration plate according to an embodiment of the present invention.
FIG. 3 is a schematic top view of the mechanical arm in the first pose A and the second pose B according to an embodiment of the present invention.
FIG. 4 is a schematic top view of the mechanical arm in the third pose C and the fourth pose D according to an embodiment of the present invention.
FIG. 5 is a schematic top view of the mechanical arm in the first pose A, the second pose B, the third pose C, and the fourth pose D according to an embodiment of the present invention.
FIG. 6 is a schematic side view of the mechanical arm in the first pose A, the second pose B, the third pose C, and the fourth pose D according to an embodiment of the present invention.
FIG. 7 is a flow chart of a calibration process according to an embodiment of the present invention.
Detailed Description
The invention will be further described with reference to the drawings and examples. It will be apparent that the described embodiments are only some, but not all, embodiments of the invention. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
As shown in Fig. 1, a camera 4 and an attitude sensor 3 are fixed on the mechanical arm 2 of the green feeder 1; the color image and the depth point cloud image of the marking plate 5 are acquired through the camera 4, and the Euler angles of the mechanical arm are acquired through the attitude sensor 3; reference numeral 6 denotes the spray head.
As shown in Fig. 2, a marking plate measuring 1.0 m by 1.5 m was manufactured. The background of the plate is gray, its four corners carry four circular red marker points 5 cm in diameter, and the spacing between the marker points is as shown in Fig. 2.
The specific process comprises the following steps:
step 1: placing a calibration plate behind the green feeding machine, keeping the calibration plate motionless in the calibration process, wherein the linear distance between the calibration plate and the mechanical arm turntable is 3m and is parallel to the vehicle body;
step 2: rotating the mechanical arm to a first pose A;
step 3: acquiring Euler angle information of the mechanical arm under the pose through a pose sensor;
step 4: acquiring a color map and a depth point cloud map of a camera in the current pose;
step 5: based on the difference between the background of the calibration plate and the color characteristics of the mark points, filtering out the irrelevant points in the color map by a color filtering method, and separating the mark point set from the color map. The marking point set is a set of pixel coordinates of marking points in the color map information;
step 6: and obtaining the pixel coordinates of the centers of the four marking points from the separated marking point set through a K-means clustering algorithm. Mapping the pixel coordinates of the centers of the four mark points into a depth point cloud image, and finding out three-dimensional information of the centers of the four mark points under a camera coordinate system;
as shown in fig. 3 to 6, the present embodiment has four poses A, B, C, D in total. The mechanical arm of the first pose A is parallel to the vehicle body; the second pose B is the same as the first pose A in height but is different in rotation angle in the horizontal direction by 15 degrees, the third pose C and the fourth pose D are different in rotation angle in the vertical direction by 10 degrees from the first pose A, and the fourth pose D and the third pose C are different in rotation angle in the horizontal direction by 20 degrees from each other;
Steps 2 to 6 are repeated three more times to acquire the three-dimensional coordinates of the marker points in the camera coordinate system and the Euler angles of the mechanical arm at the second pose B, the third pose C and the fourth pose D, so that this information is obtained for all four mechanical arm poses;
Step 7: from the three-dimensional coordinates of the marker centers in the camera coordinate system at the four mechanical arm poses, a 3D-3D pose estimation method is used to calculate the pose transformation matrix H_cAB between the camera coordinate systems at the first pose A and the second pose B, the matrix H_cAC between the camera coordinate systems at the first pose A and the third pose C, and the matrix H_cAD between the camera coordinate systems at the first pose A and the fourth pose D.
Assume that the set of 3D points of the four marker points on the marking plate in the camera coordinate system at the first pose A is P = {p_1, p_2, p_3, p_4}, and that the corresponding set in the camera coordinate system at the second pose B is P' = {p'_1, p'_2, p'_3, p'_4}. A Euclidean transformation R_cAB, T_cAB must be found such that
P = R_cAB P' + T_cAB    (1);
R_cAB and T_cAB in equation (1) can be solved by the SVD method:
(1) Calculating centroid positions p, p' of the two sets of points;
p = (1/4) Σ_{i=1}^{4} p_i ,    p' = (1/4) Σ_{i=1}^{4} p'_i    (2)
(2) Calculating the de-centroid coordinates of each point;
q i =p i -p,q′ i =p′ i -p′ (3)
(3) Calculating the 3×3 matrix W;
W = Σ_{i=1}^{4} q_i q'_i^T    (4)
(4) SVD decomposition is carried out on W to obtain:
W = U ε V^T    (5)
where ε is a diagonal matrix of singular values with the diagonal elements arranged from largest to smallest, and U and V are orthogonal matrices.
When W has full rank, R_cAB and T_cAB are calculated as:
R_cAB = U V^T    (6)
T_cAB = p - R_cAB p'    (7)
From R_cAB and T_cAB, the pose transformation matrix H_cAB between the camera coordinate systems at the first pose A and the second pose B is formed:
H_cAB = [ R_cAB  T_cAB ; 0  1 ]    (8)
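Equations (1)-(8) can be sketched in Python/NumPy as follows (illustrative only and not part of the patent; the marker points in the two sets are assumed to be listed in corresponding order):

import numpy as np

def pose_3d3d(P, P_prime):
    """Solve P = R @ P' + T for corresponding point sets of shape (N, 3)."""
    c, c_prime = P.mean(axis=0), P_prime.mean(axis=0)    # centroids, eq. (2)
    Q, Q_prime = P - c, P_prime - c_prime                # de-centroid coordinates, eq. (3)
    W = Q.T @ Q_prime                                    # 3x3 matrix, eq. (4)
    U, _, Vt = np.linalg.svd(W)                          # SVD decomposition, eq. (5)
    R = U @ Vt                                           # rotation, eq. (6)
    if np.linalg.det(R) < 0:                             # guard against a reflected solution
        U[:, -1] *= -1
        R = U @ Vt
    T = c - R @ c_prime                                  # translation, eq. (7)
    H = np.eye(4)                                        # homogeneous matrix, eq. (8)
    H[:3, :3], H[:3, 3] = R, T
    return H

Calling pose_3d3d(P_A, P_B) on the 4x3 arrays of marker coordinates measured at the first pose A and the second pose B yields H_cAB.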
In the same way, the pose transformation matrices for the other pose pairs are obtained:
H_cAC = [ R_cAC  T_cAC ; 0  1 ]    (9)
H_cAD = [ R_cAD  T_cAD ; 0  1 ]    (10)
Step 8: from the Euler angles of the mechanical arm at the four poses, the conversion between Euler angles and rotation matrices is used to solve the pose transformation matrix H_gAB between the mechanical arm coordinate systems at the first pose A and the second pose B, the matrix H_gAC between the mechanical arm coordinate systems at the first pose A and the third pose C, and the matrix H_gAD between the mechanical arm coordinate systems at the first pose A and the fourth pose D.
The conversion between the mechanical arm coordinate system at the first pose A and that at the second pose B is realized by finding a pose transformation matrix. Since the mechanical arm coordinate systems at the first pose A and the second pose B are both obtained from the virtual mechanical arm coordinate system (the pose of the mechanical arm when the Euler angles of the attitude sensor are zero, namely pose O) by rotations that keep the origin unchanged, the two rotations can be expressed through the Z-Y-X Euler angles read by the attitude sensor at the two poses.
Let coordinate system II be obtained from coordinate system I by a rotation that keeps the origin unchanged; then coordinate systems I and II satisfy the following relations:
R_Z = [ cos γ  -sin γ  0 ; sin γ  cos γ  0 ; 0  0  1 ]    (11)
R_Y = [ cos α  0  sin α ; 0  1  0 ; -sin α  0  cos α ]    (12)
R_X = [ 1  0  0 ; 0  cos β  -sin β ; 0  sin β  cos β ]    (13)
R = R_Z R_Y R_X    (14),
H = [ R  0 ; 0  1 ]    (15),
I H = II    (16),
wherein γ is the yaw angle from coordinate system I to coordinate system II (the rotation about the Z axis in R_Z), α is the pitch angle (the rotation about the Y axis in R_Y), β is the roll angle (the rotation about the X axis in R_X), R is the rotation matrix between coordinate system I and coordinate system II, and H is the pose transformation matrix between coordinate system I and coordinate system II.
Since both the first pose A and the second pose B are obtained by rotation from the pose O, the following relations hold:
O H_OA = A    (17),
O H_OB = B    (18),
A H_OA^(-1) H_OB = B    (19),
wherein O denotes the mechanical arm coordinate system when the Euler angles of the attitude sensor are zero, A denotes the mechanical arm coordinate system at the first pose A, B denotes the mechanical arm coordinate system at the second pose B, H_OA is the pose transformation matrix from the pose-O mechanical arm coordinate system to the first-pose-A mechanical arm coordinate system, and H_OB is the pose transformation matrix from the pose-O mechanical arm coordinate system to the second-pose-B mechanical arm coordinate system.
The pose transformation matrix H_gAB between the first-pose-A mechanical arm coordinate system and the second-pose-B mechanical arm coordinate system is therefore
H_gAB = H_OA^(-1) H_OB    (20)
In the same way, the pose transformation matrix H_gAC between the first-pose-A mechanical arm coordinate system and the third-pose-C mechanical arm coordinate system is
H_gAC = H_OA^(-1) H_OC    (21)
and the pose transformation matrix H_gAD between the first-pose-A mechanical arm coordinate system and the fourth-pose-D mechanical arm coordinate system is
H_gAD = H_OA^(-1) H_OD    (22)
where H_OC and H_OD denote the pose transformation matrices from the pose-O mechanical arm coordinate system to the third-pose-C and fourth-pose-D mechanical arm coordinate systems, respectively.
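An illustrative sketch of equations (11)-(22) follows (not part of the patent; the Z-Y-X order follows the definitions above, function names are assumptions, and the attitude-sensor angles are assumed to be supplied in radians):

import numpy as np

def rotation_zyx(yaw, pitch, roll):
    """R = R_Z(yaw) @ R_Y(pitch) @ R_X(roll), equations (11)-(14)."""
    cz, sz = np.cos(yaw), np.sin(yaw)
    cy, sy = np.cos(pitch), np.sin(pitch)
    cx, sx = np.cos(roll), np.sin(roll)
    Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    return Rz @ Ry @ Rx

def homogeneous(R):
    """Pure-rotation pose transformation matrix, equation (15)."""
    H = np.eye(4)
    H[:3, :3] = R
    return H

def arm_pose_transform(euler_A, euler_B):
    """H_gAB = inv(H_OA) @ H_OB, equation (20): transform between two arm poses."""
    H_OA = homogeneous(rotation_zyx(*euler_A))
    H_OB = homogeneous(rotation_zyx(*euler_B))
    return np.linalg.inv(H_OA) @ H_OB

arm_pose_transform(euler_A, euler_B) then gives H_gAB, and H_gAC, H_gAD follow in the same way from the Euler angles of the third and fourth poses.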
Step 9: the Tsai-Lenz calibration algorithm is adopted to solve the pose transformation matrix H_cg between the camera coordinate system and the mechanical arm coordinate system from H_cAB, H_cAC, H_cAD and H_gAB, H_gAC, H_gAD.
From the first pose A and the second pose B, the following holds:
H_gAB H_cg = H_cg H_cAB    (23),
R_gAB R_cg = R_cg R_cAB    (24),
R_gAB T_cg + T_gAB = R_cg T_cAB + T_cg    (25),
where R_gAB, T_gAB and R_cAB, T_cAB denote the rotation and translation parts of H_gAB and H_cAB, respectively.
Similarly, from the first pose A and the third pose C:
H_gAC H_cg = H_cg H_cAC    (26),
R_gAC R_cg = R_cg R_cAC    (27),
R_gAC T_cg + T_gAC = R_cg T_cAC + T_cg    (28),
Similarly, from the first pose A and the fourth pose D:
H_gAD H_cg = H_cg H_cAD    (29),
R_gAD R_cg = R_cg R_cAD    (30),
R_gAD T_cg + T_gAD = R_cg T_cAD + T_cg    (31),
According to formulas (25), (28) and (31), the following equation set is obtained:
R_gAB T_cg + T_gAB = R_cg T_cAB + T_cg
R_gAC T_cg + T_gAC = R_cg T_cAC + T_cg
R_gAD T_cg + T_gAD = R_cg T_cAD + T_cg    (32)
The pose transformation matrix H_cg between the camera coordinate system and the mechanical arm coordinate system can then be obtained from this equation set, with the rotation part R_cg determined from equations (24), (27) and (30).
In this embodiment there are three kinds of pose transformation matrices: the pose transformation matrices between the camera coordinate systems at two different poses (H_cAB, H_cAC, H_cAD), the pose transformation matrices between the mechanical arm coordinate systems at two different poses (H_gAB, H_gAC, H_gAD), and the pose transformation matrix H_cg between the camera coordinate system and the mechanical arm coordinate system; the three Hc matrices and the three Hg matrices together are needed to solve for the single matrix H_cg.
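The following sketch solves the hand-eye relation Hg * H_cg = H_cg * Hc from the three motion pairs. It is not the exact Tsai-Lenz formulation named in the patent: here the rotation R_cg is recovered by aligning the rotation axes of the arm motions with those of the camera motions (the same SVD recipe as equations (4)-(6)), and the translation T_cg by stacking equations of the form (25) into a least-squares system; the function name is an assumption. OpenCV's cv2.calibrateHandEye also implements Tsai's method, but it expects absolute station poses rather than the relative transforms used here.

import numpy as np
import cv2

def solve_hand_eye(H_g_list, H_c_list):
    """H_g_list, H_c_list: matching lists of 4x4 relative transforms,
    e.g. [H_gAB, H_gAC, H_gAD] and [H_cAB, H_cAC, H_cAD]."""
    # Rotation part: each motion pair relates the arm rotation axis to the camera
    # rotation axis through R_cg, so the two axis sets are aligned with an SVD.
    a = np.array([cv2.Rodrigues(H[:3, :3].copy())[0].ravel() for H in H_g_list])
    b = np.array([cv2.Rodrigues(H[:3, :3].copy())[0].ravel() for H in H_c_list])
    U, _, Vt = np.linalg.svd(a.T @ b)
    R_cg = U @ Vt
    if np.linalg.det(R_cg) < 0:          # guard against a reflected solution
        U[:, -1] *= -1
        R_cg = U @ Vt
    # Translation part: eq. (25) rearranged to (R_g - I) T_cg = R_cg T_c - T_g,
    # stacked over all motion pairs and solved by least squares.
    A = np.vstack([H[:3, :3] - np.eye(3) for H in H_g_list])
    rhs = np.concatenate([R_cg @ Hc[:3, 3] - Hg[:3, 3]
                          for Hg, Hc in zip(H_g_list, H_c_list)])
    T_cg, *_ = np.linalg.lstsq(A, rhs, rcond=None)
    H_cg = np.eye(4)
    H_cg[:3, :3], H_cg[:3, 3] = R_cg, T_cg
    return H_cg

solve_hand_eye([H_gAB, H_gAC, H_gAD], [H_cAB, H_cAC, H_cAD]) then returns the fixed matrix H_cg of step 9.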
Based on the pose transformation matrix H_cg between the camera coordinate system and the mechanical arm coordinate system, the automatic throwing system of the green feeder converts the hopper information from the camera coordinate system into the mechanical arm coordinate system, thereby automatically identifying and positioning the hopper.
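Finally, a short illustration (names and direction convention assumed: H_cg is taken here to map camera-frame coordinates into arm-frame coordinates; use its inverse if the convention is reversed) of how the calibrated matrix is applied to a hopper position detected in the camera frame:

import numpy as np

def camera_point_to_arm(H_cg, p_camera):
    """Map a 3-D point from the camera coordinate system into the mechanical arm coordinate system."""
    p_h = np.append(np.asarray(p_camera, dtype=float), 1.0)   # homogeneous coordinates
    return (H_cg @ p_h)[:3]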

Claims (9)

1. The calibration method of the pose transformation matrix of the automatic throwing system of the green feeder is characterized by comprising the following steps of:
s1, setting a plurality of mark points in a camera view, and keeping the mark points motionless relative to a mechanical arm turntable in a calibration process;
s2, rotating a pose of the mechanical arm, acquiring Euler angle information of the mechanical arm in the pose, and acquiring a color map and a depth point cloud map in a camera in the pose;
s3, according to the color map and the depth point cloud map, three-dimensional information of the centers of all the marking points under a camera coordinate system is found out;
s4, repeating the steps S2 to S3 to obtain three-dimensional information of the centers of all the mark points in a camera coordinate system and Euler angles of the mechanical arm when the positions are multiple;
s5, according to the three-dimensional information under the camera coordinate system obtained in the step S4, a pose transformation matrix between the camera coordinate systems in different poses is calculated; wherein the pose comprises a first pose A and a second pose B, and a pose transformation matrix H between camera coordinate systems when the first pose A and the second pose B are calculated cAB The method comprises the following steps:
find an European transformation R cAB ,T cAB So that the following formula (1) is established,
P = R_cAB P' + T_cAB    (1),
where P is the set of the four marker points in the camera coordinate system at the first pose A, denoted P = {p_1, p_2, p_3, p_4}, and P' is the set of the four marker points in the camera coordinate system at the second pose B, denoted P' = {p'_1, p'_2, p'_3, p'_4};
R_cAB and T_cAB in equation (1) are calculated using the SVD method, which comprises the following steps:
the centroid positions p, p' of the two sets are calculated,
p = (1/4) Σ_{i=1}^{4} p_i ,    p' = (1/4) Σ_{i=1}^{4} p'_i    (2);
the de-centroid coordinates of each marker point are calculated,
q_i = p_i - p ,    q'_i = p'_i - p'    (3);
the 3×3 matrix W is calculated,
W = Σ_{i=1}^{4} q_i q'_i^T    (4);
SVD decomposition is carried out on W to obtain:
W = U ε V^T    (5),
wherein ε is a diagonal matrix formed by the singular values with its diagonal elements arranged from largest to smallest, and U and V are orthogonal matrices;
when W has full rank, R_cAB and T_cAB are obtained by calculation:
R_cAB = U V^T    (6),
T_cAB = p - R_cAB p'    (7);
according to R_cAB and T_cAB, the pose transformation matrix H_cAB between the camera coordinate systems at the pose A and the pose B is formed:
H_cAB = [ R_cAB  T_cAB ; 0  1 ]    (8);
S6, according to the Euler angle of the mechanical arm obtained in the step S4, a pose transformation matrix between the coordinate systems of the mechanical arm in different poses is calculated;
and S7, adopting a Tsai-Lenz calibration algorithm, and solving to obtain a pose transformation matrix between the camera coordinate system and the mechanical arm coordinate system according to the pose transformation matrix between the camera coordinate systems and the pose transformation matrix between the mechanical arm coordinate systems in different poses.
2. The calibration method of the pose transformation matrix of the automatic throwing system of the green feeding machine according to claim 1, wherein the step S3 is characterized by finding out three-dimensional information of the centers of all marking points under a camera coordinate system, and specifically comprising the following steps:
s31, separating a mark point set from the color map by a color filtering method;
s32, obtaining pixel coordinates of each marking point center from the separated marking point set through a clustering algorithm, mapping all marking point centers into a depth point cloud image, and finding out three-dimensional information of each marking point center under a camera coordinate system.
3. The calibration method of the pose transformation matrix of the automatic throwing system of the green feeding machine according to claim 1, wherein in the step S1 the plurality of marker points is provided by manufacturing a marking plate with a gray background, the marking plate comprising four circular or rectangular red marker points, the distance between any two marker points being greater than 80 cm, and the calibration plate being parallel to the green feeding machine body.
4. The calibration method of the pose transformation matrix of the automatic throwing system of the green feeding machine according to claim 3, wherein the number of the poses is four, specifically a first pose A, a second pose B, a third pose C and a fourth pose D, each pose meets the condition that each marking point is in a camera view field, and the difference of the rotation angles of the mechanical arm between any two poses in the horizontal direction or the vertical direction is larger than 10 degrees.
5. The calibration method of the pose transformation matrix of the automatic throwing system of the green feeding machine according to claim 4, wherein the mechanical arm at the first pose A is parallel to the vehicle body; the second pose B is at the same height as the first pose A but rotated 15 degrees from it in the horizontal direction; the third pose C and the fourth pose D are rotated 10 degrees from the first pose A in the vertical direction; and the fourth pose D is rotated 20 degrees from the third pose C in the horizontal direction.
6. The calibration method of the pose transformation matrix of the automatic throwing system of the green feeding machine according to claim 5, which is characterized in that: in the step S5, the pose transformation matrices between the camera coordinate systems at different poses are calculated as follows:
according to the three-dimensional information of the centers of all marker points in the camera coordinate system at the four different mechanical arm poses, the pose transformation matrix H_cAB between the camera coordinate systems at the first pose A and the second pose B, the pose transformation matrix H_cAC between the camera coordinate systems at the first pose A and the third pose C, and the pose transformation matrix H_cAD between the camera coordinate systems at the first pose A and the fourth pose D are calculated.
7. The calibration method of the pose transformation matrix of the automatic throwing system of the green feeding machine according to claim 6, which is characterized in that: the step S6 of calculating the pose transformation matrices between the mechanical arm coordinate systems at different poses comprises:
according to the Euler angles of the mechanical arm at the four different mechanical arm poses, using the conversion between Euler angles and rotation matrices to calculate the pose transformation matrix H_gAB between the mechanical arm coordinate systems at the first pose A and the second pose B, the pose transformation matrix H_gAC between the mechanical arm coordinate systems at the first pose A and the third pose C, and the pose transformation matrix H_gAD between the mechanical arm coordinate systems at the first pose A and the fourth pose D.
8. The calibration method of the pose transformation matrix of the automatic throwing system of the green feeding machine according to claim 7, which is characterized in that: solving the pose transformation matrix H_gAB between the mechanical arm coordinate systems at the first pose A and the second pose B specifically comprises:
the first pose A and the second pose B have the following relation:
O H_OA = A    (17),
O H_OB = B    (18),
A H_OA^(-1) H_OB = B    (19),
wherein O represents the mechanical arm coordinate system when the Euler angles of the attitude sensor are zero, A represents the mechanical arm coordinate system of the first pose A, B represents the mechanical arm coordinate system of the second pose B, H_OA is the pose transformation matrix from the pose-O mechanical arm coordinate system to the first-pose-A mechanical arm coordinate system, and H_OB is the pose transformation matrix from the pose-O mechanical arm coordinate system to the second-pose-B mechanical arm coordinate system;
calculating the pose transformation matrix H_gAB between the mechanical arm coordinate system of the first pose A and the mechanical arm coordinate system of the second pose B according to formulas (17)-(19), as follows:
H_gAB = H_OA^(-1) H_OB    (20).
9. The calibration method of the pose transformation matrix of the automatic throwing system of the green feeding machine according to claim 8, which is characterized in that: solving the pose transformation matrix H_cg between the camera coordinate system and the mechanical arm coordinate system specifically comprises:
the following formula is established:
H_gAB H_cg = H_cg H_cAB    (23),
R_gAB R_cg = R_cg R_cAB    (24),
R_gAB T_cg + T_gAB = R_cg T_cAB + T_cg    (25);
the pose transformation matrix is obtained according to formulas (23)-(25):
H_cg = [ R_cg  T_cg ; 0  1 ].
CN201910054811.2A 2019-01-21 2019-01-21 Calibration method for pose transformation matrix of automatic throwing system of green feeder Active CN109920006B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910054811.2A CN109920006B (en) 2019-01-21 2019-01-21 Calibration method for pose transformation matrix of automatic throwing system of green feeder

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910054811.2A CN109920006B (en) 2019-01-21 2019-01-21 Calibration method for pose transformation matrix of automatic throwing system of green feeder

Publications (2)

Publication Number Publication Date
CN109920006A CN109920006A (en) 2019-06-21
CN109920006B true CN109920006B (en) 2023-06-20

Family

ID=66960536

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910054811.2A Active CN109920006B (en) 2019-01-21 2019-01-21 Calibration method for pose transformation matrix of automatic throwing system of green feeder

Country Status (1)

Country Link
CN (1) CN109920006B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111090258B (en) * 2019-12-09 2021-05-25 上海大学 Motion control method for mechanical arm of automatic throwing system of green feeding machine
CN111890356A (en) * 2020-06-30 2020-11-06 深圳瀚维智能医疗科技有限公司 Mechanical arm coordinate system and camera coordinate system calibration method, device, equipment and medium
CN112767479A (en) * 2021-01-13 2021-05-07 深圳瀚维智能医疗科技有限公司 Position information detection method, device and system and computer readable storage medium
CN113524201B (en) * 2021-09-07 2022-04-08 杭州柳叶刀机器人有限公司 Active adjusting method and device for pose of mechanical arm, mechanical arm and readable storage medium
CN114842079B (en) * 2022-04-23 2023-09-19 四川大学 Equipment and method for measuring pose of prefabricated intermediate wall in shield tunnel

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7890313B2 (en) * 2003-10-14 2011-02-15 Verseon Method and apparatus for analysis of molecular combination based on computations of shape complementarity using basis expansions
CN103759716A (en) * 2014-01-14 2014-04-30 清华大学 Dynamic target position and attitude measurement method based on monocular vision at tail end of mechanical arm
CN107590835A (en) * 2017-08-24 2018-01-16 中国东方电气集团有限公司 Mechanical arm tool quick change vision positioning system and localization method under a kind of nuclear environment
CN107738254A (en) * 2017-08-25 2018-02-27 中国科学院光电研究院 The conversion scaling method and system of a kind of mechanical arm coordinate system
CN107767423A (en) * 2017-10-10 2018-03-06 大连理工大学 A kind of mechanical arm target positioning grasping means based on binocular vision
CN108645428A (en) * 2018-05-10 2018-10-12 天津大学 The monoblock type scaling method of six degree of freedom laser target
CN109048893A (en) * 2018-07-27 2018-12-21 浙江工业大学 A kind of mechanical arm localization method based on monocular RGB camera

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Kinematics Research and Simulation of a Dual-Arm 6R Service Robot; Li Xianhua et al.; Mechanical Transmission (机械传动); 2018-05-31; full text *

Also Published As

Publication number Publication date
CN109920006A (en) 2019-06-21

Similar Documents

Publication Publication Date Title
CN109920006B (en) Calibration method for pose transformation matrix of automatic throwing system of green feeder
CN110728715B (en) Intelligent inspection robot camera angle self-adaptive adjustment method
CN110426051B (en) Lane line drawing method and device and storage medium
CN110103217B (en) Industrial robot hand-eye calibration method
CN110116411B (en) Robot 3D vision hand-eye calibration method based on spherical target
CN107292927B (en) Binocular vision-based symmetric motion platform pose measurement method
CN105588563B (en) Binocular camera and inertial navigation combined calibrating method in a kind of intelligent driving
CN113436260B (en) Mobile robot pose estimation method and system based on multi-sensor tight coupling
CN110580725A (en) Box sorting method and system based on RGB-D camera
CN108665499B (en) Near distance airplane pose measuring method based on parallax method
CN112907675B (en) Calibration method, device, system, equipment and storage medium of image acquisition equipment
CN109465830B (en) Robot monocular stereoscopic vision calibration system and method
CN110488838B (en) Accurate repeated positioning method for indoor autonomous navigation robot
CN111968074A (en) Method for detecting and harvesting lodging crops of harvester by combining binocular camera and IMU
CN112966571B (en) Standing long jump flight height measurement method based on machine vision
CN108007437B (en) Method for measuring farmland boundary and internal obstacles based on multi-rotor aircraft
CN111486864A (en) Multi-source sensor combined calibration method based on three-dimensional regular octagon structure
CN112614147A (en) Method and system for estimating plant density of crop at seedling stage based on RGB image
CN113379848A (en) Target positioning method based on binocular PTZ camera
CN108759773A (en) A kind of monocular vision distance measuring method applied to robot crawl
CN112837314B (en) Fruit tree canopy parameter detection system and method based on 2D-LiDAR and Kinect
CN110348067A (en) A kind of air-flow characterization physical parameter extracting method and system, medium, equipment
Cho et al. Vision-based uncut crop edge detection for automated guidance of head-feeding combine
CN111598956A (en) Calibration method, device and system
CN111340884A (en) Binocular heterogeneous camera and RFID dual target positioning and identity identification method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant