CN1419104A - Object space position detector - Google Patents

Object space position detector

Info

Publication number
CN1419104A
Authority
CN
China
Prior art keywords
video camera
ultrasonic sensor
robot
pick
camera
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN 02158692
Other languages
Chinese (zh)
Inventor
丁希仑
解玉文
战强
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beihang University
Beijing University of Aeronautics and Astronautics
Original Assignee
Beihang University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beihang University filed Critical Beihang University
Priority to CN 02158692 priority Critical patent/CN1419104A/en
Publication of CN1419104A publication Critical patent/CN1419104A/en
Pending legal-status Critical Current

Landscapes

  • Manipulator (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Numerical Control (AREA)

Abstract

The invention discloses a precise automatic measuring device for detecting the spatial position and attitude of an object. It consists of a camera and an ultrasonic sensor mounted at the end of a robot, and a computer; the camera and the ultrasonic sensor are connected by a data bus to an image-acquisition card and an ultrasonic-information acquisition card in the computer, and the robot is connected to the computer by a control bus. The invention replaces traditional binocular-vision detection: from a single image it obtains the ideal coordinates of an arbitrary image point, which give the direction of the line joining that point to the camera centre; the ultrasonic sensor then measures the length of this line to fix the point's coordinates in the camera reference frame, which are finally transformed into the robot's base reference frame.

Description

Apparatus for detecting the spatial position and attitude of an object
Technical field
The present invention relates to a precise automatic measuring device, and in particular to an apparatus that detects the spatial position and attitude of an object using a robot, a camera, an ultrasonic sensor and a computer.
Background art
With the development of science and technology, robots have found ever wider application, for example in automatic assembly, automatic welding, spray painting, and the inspection of mechanical components. In these applications, detecting the pose of the target object is a prerequisite for operating on it. Conventional detection generally relies on binocular vision, but that method must process a very large amount of data, its image-matching algorithms are still far from perfect, and the resulting errors are large.
Summary of the invention
To overcome the shortcomings of the above method, the present invention proposes a new method of detecting object pose: a camera and an ultrasonic sensor mounted at the end of the robot arm need acquire only a single image, and after processing by the computer the pose of the target object can be detected.
An object position-and-attitude detecting apparatus according to the invention consists of a camera and an ultrasonic sensor mounted at the end of the robot, and a computer in which an object-pose detection program is stored. The camera and the ultrasonic sensor are installed at the end of the robot arm and are connected by a data line to the image-acquisition card and the ultrasonic-information acquisition card in the computer; the robot is connected to the computer by a control bus.
In the described apparatus, the camera and the ultrasonic sensor are fixed in a parallel arrangement whose exact position varies with the robot's structure; they are fixed to the end part with which the robot picks up objects.
In the described apparatus, the information-acquisition unit required for pose detection consists of a single camera and a single ultrasonic sensor.
To detect the pose of an object, the described apparatus need acquire only one image: from the image captured by the camera it obtains the direction of the line joining a point on the object to the camera centre, and the ultrasonic sensor measures the length of that line.
In the described apparatus, the camera acquires one image of the target object and transfers it to the computer over the video and data line. By processing the image, the direction of the projection vector OP from any point P on the object to the camera centre O can be obtained; the ultrasonic sensor then measures the length of OP, so the coordinates of P in the camera coordinate system OXYZ are determined, and these coordinates can finally be transformed into the robot's base coordinate system. The ideal coordinates of the point are obtained from its image coordinates; the line joining the point at these ideal coordinates to the camera centre coincides with the line joining the corresponding point on the object to the camera centre.
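The geometry just described can be sketched in code. This is a minimal illustration under assumptions, not the patent's implementation: the function names are invented here, a standard pinhole model is assumed, and `p_ideal` stands for the ideal (distortion-corrected) image coordinates expressed in the same units as the focal length.

```python
import numpy as np

def ray_direction(p_ideal, f):
    """Unit direction of the projection vector OP in the camera frame OXYZ.

    p_ideal: (x, y) ideal (distortion-corrected) image coordinates, in the
    same units as the focal length f. The ray through the camera centre O
    and the ideal image point has direction (x, y, f).
    """
    x, y = p_ideal
    v = np.array([x, y, f], dtype=float)
    return v / np.linalg.norm(v)

def point_in_camera_frame(p_ideal, f, distance):
    """Coordinates of P in the camera frame: the unit ray direction scaled
    by the range measured along the ray (e.g. by the ultrasonic sensor)."""
    return distance * ray_direction(p_ideal, f)
```

For instance, `point_in_camera_frame((1.0, 2.0), 4.0, 10.0)` returns a point whose distance from the camera centre is exactly the measured 10.0 units, lying on the ray through the image point (1.0, 2.0).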
In the described apparatus, a laser sensor may be used in place of the ultrasonic sensor.
The described apparatus is characterized in that the camera and the ultrasonic sensor may also be installed in a parallel arrangement at the end of the robot's wrist.
Determining the position of a spatial point generally requires two images, using the intersection of two projection lines. With the method proposed here, a single image plus an ultrasonic (or laser) range measurement suffices to determine the position of a spatial point.
Compared with binocular vision, the invention fundamentally avoids the uncertainty and error caused by image matching. Once the image coordinates of a spatial point are determined, its coordinates in the camera coordinate system follow from a simple calculation, whereas binocular vision must first perform image matching and then intersect the projection lines. The invention therefore greatly reduces the amount of computation and increases detection speed.
Description of drawings
Fig. 1 is a structural schematic of the invention.
Fig. 2 is a schematic of determining the direction of the projection vector.
Fig. 3 is a schematic of determining the length of the projection vector.
In the figures: 1. camera; 2. ultrasonic sensor; 3. target object; 4. worktable; 5. robot; 6. computer; 7. data line; 8. control bus; 9. image plane.
Embodiment
The invention is further illustrated below with reference to the drawings and an embodiment.
Referring to Fig. 1, the object position-and-attitude detecting apparatus of the invention consists of a camera and an ultrasonic sensor mounted at the end of the robot, a computer, and an object-pose detection program stored in the computer. A single camera and a single ultrasonic sensor are installed at the end of the robot arm and are connected by a data line to the image-acquisition card and the ultrasonic-information acquisition card in the computer; the robot is connected to the computer by a control bus.
In the present invention, the target object 3 is placed on the worktable 4. The camera 1 acquires one image, which is transferred to the computer 6 over the video and ultrasonic data line 7 and processed by the control software stored in the computer 6. For example, take a point P on the target object 3. To measure its coordinates in the camera coordinate system OXYZ, the direction and length of the projection vector OP must be determined. Accounting for the distortion introduced by the camera lens, the ideal image coordinates P_i (with the lens distortion compensated) are obtained from the measured image coordinates P_u of the point P. Since the distance from the camera centre O to the image plane is known (it is the lens focal length f), the direction of the vector OP_i in the camera coordinate system can be obtained, and the direction of OP is the same as that of OP_i (as shown in Fig. 2). By controlling the motion of the robot 5, the origin of the ultrasonic sensor 2 is moved to the point O with its axis aligned with OP; the reading of the ultrasonic sensor 2 is then exactly the length of OP. In this way the coordinates of the point P in the coordinate system of camera 1 are obtained, and can finally be converted into coordinates in the base coordinate system of robot 5. With this method the spatial coordinates of the centre of gravity of the object's projection can be measured in the base coordinates of robot 5, i.e. the position coordinates of the object. Without loss of generality, the attitude of the object can be represented by the attitude of the long axis of its projection: the spatial coordinates of the axis's two end points are measured by the above procedure, from which the attitude of the axis in the robot's base coordinate system is calculated. The position and attitude of the object are thus fully determined.
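The last two steps above, converting camera-frame coordinates into the robot base frame and deriving the attitude of the object's long axis from its two measured end points, can be sketched as follows. This is an illustrative sketch: the 4x4 homogeneous hand-eye transform `T_base_cam` is assumed known (e.g. from calibration), and the function names are invented here.

```python
import numpy as np

def to_base_frame(p_cam, T_base_cam):
    """Map a point from camera coordinates into the robot base frame using
    a known 4x4 homogeneous hand-eye transform T_base_cam."""
    p_h = np.append(np.asarray(p_cam, dtype=float), 1.0)  # homogeneous point
    return (T_base_cam @ p_h)[:3]

def axis_attitude(p1, p2):
    """Attitude of the object's long axis, expressed as direction angles
    (degrees) with the base-frame X, Y, Z axes, computed from the two
    measured end points of the axis."""
    d = np.asarray(p2, dtype=float) - np.asarray(p1, dtype=float)
    d /= np.linalg.norm(d)          # unit direction of the axis
    return np.degrees(np.arccos(d))  # angles with the three frame axes
```

With the identity rotation and a pure translation for `T_base_cam`, `to_base_frame` simply shifts the point; an axis along the base X direction yields attitude angles of 0°, 90°, 90°.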
For example, for a workpiece placed on the worktable, let its centre-of-gravity point be C. The measured angles between OC and the axes of the camera coordinate system are 68.2694° with the X axis, 60.4186° with the Y axis and 38.1027° with the Z axis, and the measured length of OC is 42.2019 mm, so its coordinates in the camera coordinate system are X = 15.6250 mm, Y = 20.8333 mm, Z = 33.2089 mm. The coordinates of the measured workpiece in the robot's base coordinate system are X = 27.7815 mm, Y = 98.6157 mm, Z = 34.5791 mm, and its attitude angles in the robot's base coordinate system are 62.6605° with the X axis, 74.6356° with the Y axis and 32.0197° with the Z axis.
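The arithmetic of this worked example can be checked from the quoted angles and length alone: the three direction cosines of OC must square-sum to one, and each camera-frame coordinate is the measured length scaled by the corresponding cosine.

```python
import math

L = 42.2019                              # measured length of OC, in mm
angles_deg = (68.2694, 60.4186, 38.1027)  # angles of OC with the camera X, Y, Z axes

# Direction cosines of OC; they must satisfy cos^2(a) + cos^2(b) + cos^2(c) = 1.
cosines = [math.cos(math.radians(a)) for a in angles_deg]
assert abs(sum(c * c for c in cosines) - 1.0) < 1e-3

# Camera-frame coordinates: each direction cosine scaled by the length of OC.
X, Y, Z = (L * c for c in cosines)
```

With these inputs the cosines square-sum to 1 to within a few parts per million, and the coordinates come out as X ≈ 15.6250 mm, Y ≈ 20.8333 mm, Z ≈ 33.2089 mm.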

Claims (8)

1. An apparatus for detecting the spatial position and attitude of an object, consisting of a robot, a camera, an ultrasonic sensor and a computer, characterized in that: an object-pose detection program is stored in the computer; the camera (1) and the ultrasonic sensor (2) are installed in a parallel arrangement at the end of the robot arm; the camera (1) and the ultrasonic sensor (2) are connected by a data line to the image-acquisition card and the ultrasonic-information acquisition card in the computer; and the robot (5) is connected to the computer (6) by a control bus (8).
2. The detecting apparatus according to claim 1, characterized in that the information-acquisition unit required for pose detection consists of a single camera (1) and a single ultrasonic sensor (2).
3. The detecting apparatus according to claim 1, characterized in that to detect the pose of the object (3) only one image need be acquired: the direction of the line joining a point on the object to the centre of the camera (1) is obtained from the image acquired by the camera (1), and the length of that line is obtained by the ultrasonic sensor (2).
4. The detecting apparatus according to claim 1 or 3, characterized in that the camera (1) acquires one image of the target object (3) and transfers it to the computer (6) over the video and data line (7); by processing the image, the direction of the projection vector OP from any point P on the object to the centre O of the camera (1) is obtained, and the ultrasonic sensor (2) then measures the length of OP, so that the coordinates of the point P in the camera coordinate system OXYZ are determined and can finally be transformed into the robot's base coordinate system.
5. The detecting apparatus according to claim 4, characterized in that the ideal coordinates of a point are obtained from its image coordinates, and the line joining the point at these ideal coordinates to the camera centre coincides with the line joining the corresponding point on the object to the camera centre.
6. The detecting apparatus according to claim 1, characterized in that a laser sensor may be used in place of the ultrasonic sensor (2).
7. The detecting apparatus according to claim 1, characterized in that the camera (1) and the ultrasonic sensor (2) are fixed in a parallel arrangement whose position varies with the robot's structure, fixed to the end part with which the robot picks up objects.
8. The detecting apparatus according to claim 1, characterized in that the camera (1) and the ultrasonic sensor (2) may also be installed in a parallel arrangement at the end of the robot's wrist.
CN 02158692 2002-12-26 2002-12-26 Object space position detector Pending CN1419104A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN 02158692 CN1419104A (en) 2002-12-26 2002-12-26 Object space position detector

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN 02158692 CN1419104A (en) 2002-12-26 2002-12-26 Object space position detector

Publications (1)

Publication Number Publication Date
CN1419104A true CN1419104A (en) 2003-05-21

Family

ID=4753152

Family Applications (1)

Application Number Title Priority Date Filing Date
CN 02158692 Pending CN1419104A (en) 2002-12-26 2002-12-26 Object space position detector

Country Status (1)

Country Link
CN (1) CN1419104A (en)

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN100375939C (en) * 2003-05-29 2008-03-19 Fanuc Corporation Robot system
CN102941569A (en) * 2012-11-01 2013-02-27 Li Mu Single-rail robot capable of locating and operating target object and control method thereof
CN104298169A (en) * 2014-08-29 2015-01-21 Shaoguan Institute of Jinan University Data converting method of intelligent vision numerical control system
CN108291803A (en) * 2015-09-17 2018-07-17 Vallourec Tubos do Brasil S.A. Automatic system and method for measuring and machining the ends of tubular elements
CN105750724A (en) * 2016-04-29 2016-07-13 Jiangsu University of Science and Technology Laser calibration device and calibration method for friction stir welding
CN105750723A (en) * 2016-04-29 2016-07-13 Jiangsu University of Science and Technology Friction stir welding tool posture and position calibration device and calibration method
CN110036162A (en) * 2016-09-30 2019-07-19 Singapore-ETH Centre System and method for placing an object on a surface
CN110036162B (en) * 2016-09-30 2021-04-02 Singapore-ETH Centre System and method for placing an object on a surface
CN107255463A (en) * 2017-05-26 2017-10-17 Gree Electric Appliances Inc. of Zhuhai Positioning measurement device and positioning measurement method

Similar Documents

Publication Publication Date Title
CN110977985B (en) Positioning method and device
CN105547153B (en) Plug-in element stitch vision positioning method and device based on binocular vision
JP2012002761A (en) Position attitude measurement instrument, processing method thereof, and program
CN106485746A (en) Visual servo mechanical hand based on image no demarcation and its control method
CN111028340A (en) Three-dimensional reconstruction method, device, equipment and system in precision assembly
CN114714029B (en) Automatic arc welding method and device for aluminum alloy
CN1512134A (en) Contact type object position and gesture measurer
CN112230345A (en) Optical fiber auto-coupling alignment apparatus and method
JPH03213251A (en) Workpiece position detecting device
JP2010276447A (en) Position measuring apparatus, position measuring method and robot system
CN1419104A (en) Object space position detector
CN116393982B (en) Screw locking method and device based on machine vision
JPH04255077A (en) Image analyzing method
CN205482791U (en) Plug -in components component stitch vision positioning device based on binocular vision
CN2645034Y (en) Object space pose detection apparatus
JPH03161223A (en) Fitting of work
Chen et al. Application of visual servoing to an X-ray based welding inspection robot
CN114926531A (en) Binocular vision based method and system for autonomously positioning welding line of workpiece under large visual field
CN115294315A (en) Device, method and system for detecting and picking up foreign matters in mechanical arm in pipeline
JPH0260377A (en) Automatic focus adjustment of industrial or military video camera
WO2021145304A1 (en) Image processing system
TWI761891B (en) Uninterrupted automation system and execution method thereof
Gatla et al. Calibrating pan-tilt cameras in robot hand-eye systems using a single point
CN116758160B (en) Method for detecting pose of optical element assembly process based on orthogonal vision system and assembly method
JPH0829120A (en) Position measuring method of object having curved surface and positioning controller for two objects having curved surface

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C57 Notification of unclear or unknown address
DD01 Delivery of document by public notice

Addressee: Zhou Changqi

Document name: Deemed as a notice of withdrawal (Trial)

C02 Deemed withdrawal of patent application after publication (patent law 2001)
WD01 Invention patent application deemed withdrawn after publication