CN103925879A - Indoor robot vision hand-eye relation calibration method based on 3D image sensor - Google Patents

Indoor robot vision hand-eye relation calibration method based on 3D image sensor

Info

Publication number
CN103925879A
CN103925879A (application CN201410166077.6A)
Authority
CN
China
Prior art keywords
hand
robot
vision
hand-eye
sensor
Prior art date
Application number
CN201410166077.6A
Other languages
Chinese (zh)
Inventor
孔令成
赵江海
张志华
张强
Original Assignee
中国科学院合肥物质科学研究院
常州先进制造技术研究所
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 中国科学院合肥物质科学研究院 (Hefei Institutes of Physical Science, Chinese Academy of Sciences) and 常州先进制造技术研究所 (Changzhou Institute of Advanced Manufacturing Technology)
Priority to CN201410166077.6A priority Critical patent/CN103925879A/en
Publication of CN103925879A publication Critical patent/CN103925879A/en


Abstract

The invention provides an indoor robot vision hand-eye relation calibration method based on a 3D image sensor. The method comprises the following steps: S1, marking a plurality of marker points on the gripper end joint of a robot, acquiring point cloud image information of the gripper end joint through the 3D vision sensor of the robot, and obtaining several groups of three-dimensional coordinate values relative to the vision sensor coordinate system; S2, acquiring the three-dimensional coordinate values of the marker points of the gripper end joint in the world coordinate system through an external three-dimensional measuring device, the values being acquired relative to the arm base coordinate system of the robot; S3, deriving a hand-eye calibration matrix from the coordinate values obtained in steps S1 and S2. The method simplifies the calibration process, offers high measurement precision, and can effectively meet the requirements of indoor robot hand-eye calibration.

Description

Indoor robot vision hand-eye relation calibration method based on 3D image sensor

Technical field

The present invention relates to a robot vision method, and in particular to an indoor robot vision hand-eye relation calibration method based on a 3D image sensor.

Background technology

With the application and development of 3D sensors, more and more robots adopt a 3D sensor as the robot vision system. Unlike a traditional binocular vision system, a 3D sensor captures a point cloud image of the current scene; its image information is a large set of three-dimensional coordinate values, which intuitively reflect both depth and appearance. A traditional binocular vision system uses two industrial cameras to capture planar images of the current scene; its image information consists of pixel values, and depth must be computed from the disparity between the two images, which inevitably introduces computational error and reduces precision. In a robot hand-eye calibration system, the 3D sensor returns the three-dimensional coordinate values of marker points, a three-dimensional dynamic measuring instrument measures the coordinates of the same markers with respect to the arm base coordinate system, and the transition matrix between the two groups of values directly yields the hand-eye calibration matrix; point cloud acquisition is simple and efficient, the calibration operation is simple, and the calibration precision is high. Existing robot hand-eye calibration methods, such as the paper "Algorithm and implementation of hand-eye stereo vision" (Xiong Chunshan et al., Robot, 2001, 23(2), pp. 113-117), are based on the "black box" idea of mapping image coordinates directly to the robot reference frame, which simplifies the calibration process but involves relatively complex computation. The new indoor robot hand-eye calibration method based on a 3D sensor presented herein also applies the "black box" idea; its calibration process is simple, its computational load is small, and its precision is high, so it can effectively meet the demands of robot hand-eye calibration.
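The disparity step described above is exactly where a binocular system accrues error. A minimal sketch of the standard triangulation relation Z = f·B/d (illustrative code, not from the patent; parameter values are ours) shows how depth depends inversely on a matched disparity:

```python
def depth_from_disparity(f_px: float, baseline_m: float, disparity_px: float) -> float:
    """Stereo depth Z = f * B / d for a rectified binocular pair.

    f_px: focal length in pixels; baseline_m: camera separation in metres;
    disparity_px: horizontal pixel offset of a matched feature.
    Any matching error in the disparity propagates into the depth estimate,
    which is the drawback of binocular systems noted in the background.
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return f_px * baseline_m / disparity_px
```

A 3D sensor such as the Kinect returns the three-dimensional coordinate of a point directly, so this matching step and its error are avoided.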

Summary of the invention

The features and advantages of the present invention are set forth in part in the description that follows, will in part be apparent from that description, or may be learned by practicing the invention.

To overcome the problems of the prior art, the invention provides an indoor robot vision hand-eye relation calibration method based on a 3D image sensor, comprising: step S1, marking a plurality of marker points on the gripper end joint of a robot, acquiring a point cloud image of the gripper end joint with the 3D vision sensor of the robot, and obtaining several groups of three-dimensional coordinate values relative to the vision sensor coordinate system; step S2, acquiring, with an external three-dimensional measuring device, the three-dimensional coordinate values of the plurality of marker points on the gripper end joint in the world coordinate system, these values being acquired relative to the arm base coordinate system of the robot; and step S3, deriving the hand-eye calibration matrix from the coordinate values obtained in steps S1 and S2.

According to one embodiment of the present invention, the gripper end joint is marked with four marker points, three located at the fingertips and one at the center of the palm; during measurement their mutual relationship is such that no three points are collinear and the four points are not coplanar.
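The non-coplanarity condition can be checked numerically: four points are coplanar exactly when the 4x4 matrix of their homogeneous coordinates is singular, and non-coplanarity already rules out any three of them being collinear. A small sketch, with function name and tolerance chosen by us:

```python
import numpy as np

def markers_well_posed(points, tol=1e-6):
    """Return True if the four marker points are not coplanar.

    points: sequence of four (x, y, z) coordinates. Four points are
    coplanar iff the determinant of their homogeneous 4x4 matrix is zero;
    non-coplanarity also implies no three of them are collinear.
    The tolerance is an illustrative choice, not from the patent.
    """
    P = np.asarray(points, dtype=float)
    H = np.hstack([P, np.ones((4, 1))])  # append homogeneous 1s
    return abs(np.linalg.det(H)) > tol
```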

According to one embodiment of the present invention, in step S1 the 3D vision sensor obtains the point cloud image of the gripper end joint through the mouse/keyboard trigger-event mechanism of the Point Cloud Library (PCL).

According to one embodiment of the present invention, when the external three-dimensional measuring device acquires the three-dimensional coordinates of the marker points on the gripper end joint, it uses the robot arm base coordinate system as the reference coordinate system.

According to one embodiment of the present invention, in step S3, by changing the mutual position of the plurality of marker points at the gripper end, numerical values are recorded repeatedly and transition matrices are computed; the optimal matrix is then determined and taken as the hand-eye calibration matrix.

The indoor robot vision hand-eye relation calibration method based on a 3D sensor provided by the invention is simple, has high measurement accuracy, and is easy to popularize; it can effectively meet the needs of indoor robot hand-eye calibration while simplifying the calibration process, requires no calibration of the camera's intrinsic and extrinsic parameters, involves little computation, and achieves high hand-eye calibration precision.

By reading the description, those of ordinary skill in the art will better understand the features and content of these technical schemes.

Brief Description of the Drawings

The advantages and implementation of the present invention will become more apparent from the following detailed description, given with reference to the accompanying drawings and in conjunction with examples; the content shown in the drawings is merely illustrative of the invention and does not constitute a limitation of the invention in any sense. In the drawings:

Fig. 1 is a structural schematic diagram of the robot and the external three-dimensional measuring device of an embodiment of the present invention.

Fig. 2 is a flow diagram of the indoor robot vision hand-eye relation calibration method based on a 3D image sensor of an embodiment of the present invention.

Fig. 3 is a detailed flow diagram of the indoor robot vision hand-eye relation calibration method based on a 3D image sensor of an embodiment of the present invention.

Detailed Description of the Embodiments

Please refer to Fig. 1, a structural schematic diagram of the robot and the external three-dimensional measuring device of an embodiment of the present invention. An indoor mobile robot 4 is equipped with a 3D sensor (Kinect) 1, and four black dots 2 are marked on the gripper end joint 5 to serve as marker points, located respectively at the three fingertips and the center of the palm of the right gripper end joint; an NDI three-dimensional dynamic measuring instrument 3 is placed beside the indoor mobile robot. During measurement, each joint of the robot arm and gripper is enabled so that the gripper end is at a suitable position in front of the robot, within the field of view of the Kinect, and the mutual relationship of the four marker points at the fingertips and palm is such that no three are collinear and the four are not coplanar. The robot terminal runs a Kinect point cloud acquisition program whose main function is the mouse/keyboard trigger-event callback of the Point Cloud Library (PCL); clicking a point in the point cloud image returns the three-dimensional coordinate value of that point.
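For illustration, the kind of 3D value such a point-picking callback returns can be reproduced from a depth image with the pinhole model; the intrinsic parameters below are typical Kinect-like values we assume for the sketch, not figures from the patent:

```python
import numpy as np

# Illustrative Kinect-like intrinsics (assumed, not specified in the patent).
FX, FY = 525.0, 525.0   # focal lengths in pixels
CX, CY = 319.5, 239.5   # principal point in pixels

def pixel_to_camera_xyz(u, v, depth_m):
    """Back-project a clicked pixel (u, v) with depth in metres into the
    sensor coordinate frame -- the kind of 3D coordinate the PCL
    point-picking callback returns for a clicked point in the cloud."""
    x = (u - CX) * depth_m / FX
    y = (v - CY) * depth_m / FY
    return np.array([x, y, depth_m])
```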

As shown in Fig. 2, the invention provides an indoor robot vision hand-eye relation calibration method based on a 3D image sensor, comprising: step S1, marking a plurality of marker points 2 on the gripper end joint 5 of the robot 4, acquiring a point cloud image of the gripper end joint with the 3D vision sensor 1 of the robot, and obtaining several groups of three-dimensional coordinate values relative to the vision sensor coordinate system; step S2, acquiring, with the external three-dimensional measuring device 3, the three-dimensional coordinate values of the plurality of marker points 2 on the gripper end joint 5 in the world coordinate system, these values being acquired relative to the arm base coordinate system of the robot; and step S3, deriving the hand-eye calibration matrix from the coordinate values obtained in steps S1 and S2.

In the present embodiment, in step S1 the 3D vision sensor obtains the point cloud image of the gripper end joint through the mouse/keyboard trigger-event mechanism of the Point Cloud Library (PCL); in step S2, the external three-dimensional measuring device uses the robot arm base coordinate system as the reference coordinate system when acquiring the three-dimensional coordinates of the marker points on the gripper end joint; in step S3, by changing the mutual position of the plurality of marker points at the gripper end, numerical values are recorded repeatedly and transition matrices are computed, and the optimal matrix is determined and used as the hand-eye calibration matrix.

Referring again to Fig. 3, in a specific implementation, four feature points are first marked on the gripper end joint of the robot, and the robot arm is enabled so that the gripper end joint lies within the fields of view of both the 3D sensor and the three-dimensional dynamic measuring instrument. The terminal starts the Kinect image acquisition program, which displays a real-time point cloud image stream containing the gripper end; the four marker points in the image are clicked in turn, triggering the mouse/keyboard callback, which returns the three-dimensional coordinate values of the four marker points. In the measurement space of the NDI three-dimensional dynamic measuring instrument, with the robot arm base coordinate system as the source coordinate system, the three-dimensional coordinate values of the four marker points on the gripper end joint are recorded. From these two groups of three-dimensional coordinate values, the hand-eye relation matrix can be obtained.

Specifically, let {x, y, z} be the world coordinate system, let {x_k, y_k, z_k} be the coordinate system of the point cloud image acquired by the 3D sensor Kinect, let {x_n, y_n, z_n} be the robot arm base coordinate system in the NDI three-dimensional dynamic measurement space, and let (x_0, y_0, z_0) be the translation vector of the Kinect with respect to the {x_n, y_n, z_n} coordinate system. The four marker points in the currently acquired point cloud image are clicked in turn and the four returned three-dimensional coordinate values are recorded as {x_k^i, y_k^i, z_k^i}, i = 1, 2, 3, 4. At the same time, in the measurement space of the NDI three-dimensional dynamic measuring instrument, with the robot arm base coordinate system as the source coordinate system, the three-dimensional coordinate values of the four marker points on the current gripper end are recorded as {x_n^i, y_n^i, z_n^i}, i = 1, 2, 3, 4.

By the formula:

$$
\begin{bmatrix}
x_n^1 & x_n^2 & x_n^3 & x_n^4 \\
y_n^1 & y_n^2 & y_n^3 & y_n^4 \\
z_n^1 & z_n^2 & z_n^3 & z_n^4 \\
1 & 1 & 1 & 1
\end{bmatrix}
=
\begin{bmatrix}
x_1 & x_2 & x_3 & x_0 \\
y_1 & y_2 & y_3 & y_0 \\
z_1 & z_2 & z_3 & z_0 \\
0 & 0 & 0 & 1
\end{bmatrix}
\begin{bmatrix}
x_k^1 & x_k^2 & x_k^3 & x_k^4 \\
y_k^1 & y_k^2 & y_k^3 & y_k^4 \\
z_k^1 & z_k^2 & z_k^3 & z_k^4 \\
1 & 1 & 1 & 1
\end{bmatrix}
$$

the transition matrix is derived as

$$
\begin{bmatrix}
x_1 & x_2 & x_3 & x_0 \\
y_1 & y_2 & y_3 & y_0 \\
z_1 & z_2 & z_3 & z_0 \\
0 & 0 & 0 & 1
\end{bmatrix}
=
\begin{bmatrix}
x_n^1 & x_n^2 & x_n^3 & x_n^4 \\
y_n^1 & y_n^2 & y_n^3 & y_n^4 \\
z_n^1 & z_n^2 & z_n^3 & z_n^4 \\
1 & 1 & 1 & 1
\end{bmatrix}
\begin{bmatrix}
x_k^1 & x_k^2 & x_k^3 & x_k^4 \\
y_k^1 & y_k^2 & y_k^3 & y_k^4 \\
z_k^1 & z_k^2 & z_k^3 & z_k^4 \\
1 & 1 & 1 & 1
\end{bmatrix}^{-1}
$$

By changing the mutual position of the four marker points at the gripper end, numerical values are recorded repeatedly and transition matrices are computed, and the optimal matrix is determined. This matrix is the required hand-eye relation calibration transition matrix.
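The computation above can be sketched directly in code: stack the four homogeneous marker coordinates column-wise for each frame and multiply by the inverse. The function names are ours, and the element-wise averaging over trials is one simple reading of "optimal matrix"; the patent does not specify the optimisation:

```python
import numpy as np

def hand_eye_matrix(pts_k, pts_n):
    """Transition matrix T with N = T @ K, where K (resp. N) holds the
    four homogeneous marker coordinates in the Kinect frame (resp. the
    arm base frame) as columns. Requires the four points non-coplanar
    so that K is invertible."""
    K = np.vstack([np.asarray(pts_k, dtype=float).T, np.ones(4)])  # 4x4
    N = np.vstack([np.asarray(pts_n, dtype=float).T, np.ones(4)])  # 4x4
    return N @ np.linalg.inv(K)

def average_transition(trials):
    """Element-wise mean of per-trial transition matrices -- an assumed
    stand-in for the patent's unspecified 'optimal matrix' selection."""
    return np.mean([hand_eye_matrix(k, n) for k, n in trials], axis=0)
```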

The preferred embodiments of the present invention have been described above with reference to the accompanying drawings. Without departing from the scope and spirit of the present invention, those skilled in the art may realize the invention through various variants; for example, a feature illustrated or described as part of one embodiment may be used in another embodiment to yield a further embodiment. The foregoing are only preferred feasible embodiments of the present invention and do not thereby limit its scope of rights; all equivalent changes made using the contents of the description and drawings of the present invention fall within the scope of rights of the present invention.

Claims (5)

1. An indoor robot vision hand-eye relation calibration method based on a 3D image sensor, characterized by comprising the steps of:
S1, marking a plurality of marker points on the gripper end joint of a robot, acquiring a point cloud image of the gripper end joint with the 3D vision sensor of said robot, and obtaining several groups of three-dimensional coordinate values relative to the vision sensor coordinate system;
S2, acquiring, with an external three-dimensional measuring device, the three-dimensional coordinate values of the plurality of marker points on said gripper end joint in the world coordinate system, these values being acquired relative to the arm base coordinate system of said robot;
S3, deriving the hand-eye calibration matrix from the coordinate values obtained in steps S1 and S2.
2. The indoor robot vision hand-eye relation calibration method based on a 3D image sensor as claimed in claim 1, characterized in that said gripper end joint is marked with four marker points, three located at the fingertips and one at the center of the palm, and during measurement their mutual relationship is such that no three points are collinear and the four points are not coplanar.
3. The indoor robot vision hand-eye relation calibration method based on a 3D image sensor as claimed in claim 1, characterized in that, in said step S1, said 3D vision sensor obtains the point cloud image of said gripper end joint through the mouse/keyboard trigger-event mechanism of the Point Cloud Library (PCL).
4. The indoor robot vision hand-eye relation calibration method based on a 3D image sensor as claimed in claim 1, characterized in that said external three-dimensional measuring device uses said robot arm base coordinate system as the reference coordinate system when acquiring the three-dimensional coordinates of the marker points on said gripper end joint.
5. The indoor robot vision hand-eye relation calibration method based on a 3D image sensor as claimed in claim 1, characterized in that, in said step S3, by changing the mutual position of the plurality of marker points at said gripper end, numerical values are recorded repeatedly and transition matrices are computed, the optimal matrix being determined and used as said hand-eye calibration matrix.
CN201410166077.6A 2014-04-24 2014-04-24 Indoor robot vision hand-eye relation calibration method based on 3D image sensor CN103925879A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201410166077.6A CN103925879A (en) 2014-04-24 2014-04-24 Indoor robot vision hand-eye relation calibration method based on 3D image sensor


Publications (1)

Publication Number Publication Date
CN103925879A true CN103925879A (en) 2014-07-16

Family

ID=51144180

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410166077.6A CN103925879A (en) 2014-04-24 2014-04-24 Indoor robot vision hand-eye relation calibration method based on 3D image sensor

Country Status (1)

Country Link
CN (1) CN103925879A (en)

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105818167A (en) * 2015-01-22 2016-08-03 通用汽车环球科技运作有限责任公司 Method for calibrating an articulated end effector employing a remote digital camera
CN104236456B (en) * 2014-09-04 2016-10-26 中国科学院合肥物质科学研究院 A kind of Robotic Hand-Eye Calibration method based on two-freedom 3D vision sensor
CN106679590A (en) * 2016-12-29 2017-05-17 中国科学院长春光学精密机械与物理研究所 Three-dimensional scanning equipment and three-dimensional scanner
CN107993227A (en) * 2017-12-15 2018-05-04 深圳先进技术研究院 A kind of method and apparatus of acquisition 3D laparoscope trick matrixes
CN108942922A (en) * 2018-06-11 2018-12-07 杭州灵西机器人智能科技有限公司 Mechanical arm hand and eye calibrating method, apparatus and system based on circular cone calibration object
CN109470138A (en) * 2018-10-22 2019-03-15 江苏集萃微纳自动化系统与装备技术研究所有限公司 The On-line Measuring Method of part
CN109737871A (en) * 2018-12-29 2019-05-10 南方科技大学 A kind of scaling method of the relative position of three-dimension sensor and mechanical arm
CN109893833A (en) * 2019-03-27 2019-06-18 深圳市瑞源祥橡塑制品有限公司 Aiming spot acquisition methods, device and its application

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1672881A (en) * 2005-04-21 2005-09-28 上海交通大学 On-line robot hand and eye calibrating method based on motion selection
CN1904806A (en) * 2006-07-28 2007-01-31 上海大学 System and method of contactless position input by hand and eye relation guiding
CN101186038A (en) * 2007-12-07 2008-05-28 北京航空航天大学 Method for demarcating robot stretching hand and eye
US20090076655A1 (en) * 2007-09-14 2009-03-19 Zimmer, Inc. Robotic calibration method
CN101630409A (en) * 2009-08-17 2010-01-20 北京航空航天大学 Hand-eye vision calibration method for robot hole boring system


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
QIANG ZHANG ET AL.: "Real-time general object recognition for indoor robot based on PCL", 《INTERNATIONAL CONFERENCE ON ROBOTICS AND BIOMIMETICS》, 31 December 2013 (2013-12-31), pages 651 - 655, XP032579468, DOI: doi:10.1109/ROBIO.2013.6739534 *
WANG JUN et al.: "Research on vision-based localization method for elderly-service robots", Journal of Huazhong University of Science and Technology (Natural Science Edition), vol. 39, 30 November 2011 (2011-11-30), pages 255 - 258 *

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104236456B (en) * 2014-09-04 2016-10-26 中国科学院合肥物质科学研究院 A kind of Robotic Hand-Eye Calibration method based on two-freedom 3D vision sensor
CN105818167A (en) * 2015-01-22 2016-08-03 通用汽车环球科技运作有限责任公司 Method for calibrating an articulated end effector employing a remote digital camera
CN105818167B (en) * 2015-01-22 2018-10-23 通用汽车环球科技运作有限责任公司 The method that hinged end effector is calibrated using long distance digital camera
CN106679590A (en) * 2016-12-29 2017-05-17 中国科学院长春光学精密机械与物理研究所 Three-dimensional scanning equipment and three-dimensional scanner
CN107993227A (en) * 2017-12-15 2018-05-04 深圳先进技术研究院 A kind of method and apparatus of acquisition 3D laparoscope trick matrixes
CN107993227B (en) * 2017-12-15 2020-07-24 深圳先进技术研究院 Method and device for acquiring hand-eye matrix of 3D laparoscope
CN108942922A (en) * 2018-06-11 2018-12-07 杭州灵西机器人智能科技有限公司 Mechanical arm hand and eye calibrating method, apparatus and system based on circular cone calibration object
CN109470138A (en) * 2018-10-22 2019-03-15 江苏集萃微纳自动化系统与装备技术研究所有限公司 The On-line Measuring Method of part
CN109737871A (en) * 2018-12-29 2019-05-10 南方科技大学 A kind of scaling method of the relative position of three-dimension sensor and mechanical arm
CN109737871B (en) * 2018-12-29 2020-11-17 南方科技大学 Calibration method for relative position of three-dimensional sensor and mechanical arm
CN109893833A (en) * 2019-03-27 2019-06-18 深圳市瑞源祥橡塑制品有限公司 Aiming spot acquisition methods, device and its application


Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C02 Deemed withdrawal of patent application after publication (patent law 2001)
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20140716