CN105447856A - Marking point coupling method based on robot motion parameters and characteristic vectors - Google Patents

Marking point coupling method based on robot motion parameters and characteristic vectors

Info

Publication number
CN105447856A
CN105447856A
Authority
CN
China
Prior art keywords
robot
camera
matrix
coordinates
point
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201510784047.6A
Other languages
Chinese (zh)
Other versions
CN105447856B (en
Inventor
肖志涛
郎建业
耿磊
张芳
吴骏
李月龙
李峰
齐旭平
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Baotou Yihui Information Technology Co ltd
Original Assignee
Tianjin Polytechnic University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tianjin Polytechnic University
Priority to CN201510784047.6A priority Critical patent/CN105447856B/en
Publication of CN105447856A publication Critical patent/CN105447856A/en
Application granted granted Critical
Publication of CN105447856B publication Critical patent/CN105447856B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/20Analysis of motion

Abstract

The invention discloses a method for recovering the three-dimensional information of marking points based on robot motion parameters and feature vectors. The method overcomes the slow speed and poor flexibility of conventional orientation methods while achieving a high matching rate and high accuracy. The method comprises: (1) performing hand-eye calibration on the robot; (2) unifying the relative positions of the camera into the robot base coordinate system using the hand-eye relation and the robot motion parameters; (3) realizing automatic orientation of the camera; and (4) performing ambiguity matching correction based on feature vectors. The method has important application value for recovering three-dimensional information during robot motion and is well suited to online precision measurement in intelligent manufacturing.

Description

Marking point matching method based on robot motion parameters and feature vectors
Technical field
The invention belongs to the technical field of image processing. Based on mobile photogrammetry theory, it proposes a marking point matching method based on robot motion parameters and feature vectors, in which self-orientation of the camera is realized from the robot motion parameters. On top of epipolar-constraint matching, the invention adds an ambiguity matching correction based on feature vectors, guaranteeing both the matching rate and the matching accuracy, and is therefore well suited to online precision measurement in intelligent manufacturing.
Background technology
To improve product quality in intelligent manufacturing, precision measurement must be performed on key positions of manufactured parts. Mobile photogrammetry has advantages such as non-contact operation and high precision, and is increasingly widely used in precision engineering. A moving camera images the measured object from different orientations, and the three-dimensional information of marking points is recovered to measure the object's dimensions and shape. Matching the marking points between different images is the key technique. Matching based on the epipolar constraint is a very effective method, but its prerequisite is that the cameras are oriented.
Current camera orientation methods fall mainly into four classes. (1) Orienting the camera by pasting coded marking points on the surface of the measured object; however, this requires tedious pasting and removal of markers, easily damages the object's surface, and the pasted markers can cause loss of position information, directly affecting the measurement result. (2) Placing an orientation target in the common field of view of two adjacent measurements and orienting the camera with respect to it; however, the measurement range is then severely restricted and flexibility is poor. (3) Arranging a control field in the measurement space, calibrating the positions of the control points, and orienting the camera through them; however, at least five common control points are needed across two measurements, which limits the measurement of large objects. (4) Arranging global control points around the measured object and orienting the camera in combination with an orientation target and an orientation camera; however, this method is complicated to operate and easily loses precision.
Summary of the invention
The object of the invention is to overcome the above deficiencies of the prior art by proposing a camera self-orientation method based on robot motion parameters. The method mainly overcomes the slow speed and poor flexibility of traditional orientation methods while also achieving a high matching rate and high accuracy. The technical scheme realizing the object of the invention comprises the following steps:
(1) Using the fact that the camera's position relative to the robot end-effector tool remains unchanged, perform hand-eye calibration on the robot;
(2) Using the hand-eye relation and the robot motion parameters, unify the camera positions into the robot base coordinate system;
(3) Perform camera self-orientation from the position of the camera coordinate system in the robot base coordinate system;
(4) Match the marking points using epipolar-constraint matching;
(5) Perform ambiguity matching correction based on feature vectors.
In step (2), the camera is mounted on the robot end-effector and moves with the robot, its position relative to the end-effector tool remaining unchanged. The basic idea is to drive the robot to different positions so that the camera images a calibration reference in space from different orientations. Then:
P_c1 = C·P_c2 (1)
P_c1 = X·P_t1 (2)
P_t1 = D·P_t2 (3)
P_c2 = X·P_t2 (4)
From (1) and (4):
P_c1 = C·X·P_t2 (5)
From (2) and (3):
P_c1 = X·D·P_t2 (6)
Combining (5) and (6):
C·X = X·D (7)
Moving the robot to different positions yields multiple constraint equations of the form (7); solving them simultaneously gives X.
As shown in Fig. 3(b), C_cal is the calibration board coordinate system; C_cam1 and C_cam2 are the camera coordinate systems; C_tool1 and C_tool2 are the tool coordinate systems. The calibration board is used to calibrate the camera at the two positions C_cam1 and C_cam2, computing the intrinsic and extrinsic parameters and yielding the three matrices. The robot motion parameters can be read from the controller, so C_tool1 and C_tool2 are known parameters, giving matrix D. The relative position X between the tool coordinate system and the camera coordinate system remains unchanged. Suppose a point P in space has coordinates P_c1, P_c2, P_t1 and P_t2 in the coordinate systems C_cam1, C_cam2, C_tool1 and C_tool2 respectively.
In step (5), a proximity matrix is constructed from the point set's own properties, and the point sets are transformed from two-dimensional space into a higher-dimensional space before being matched again. Suppose the two point sets with ambiguous matches are I_i (i = 1, 2, …, m) and J_j (j = 1, 2, …, n). Build the metric matrix:
H_ij = exp(−r_ij²/2σ²)
where r_ij is the distance between point I_i and point J_j, and σ is calculated as follows:
Here M is the number of points. Perform an SVD decomposition of the matrix H:
H_m = V_m·D_m·V_mᵀ
where V is an m-dimensional orthonormal matrix. Obtaining V gives an m-dimensional description of the point set: the m column vectors E_i of V form the basis of this m-dimensional space, and the m row vectors F_i of V are the coordinates of the m points in this basis. The first point set yields H_1 = V_1·D_1·V_1ᵀ and the second yields H_2 = V_2·D_2·V_2ᵀ; denote the descriptions of the two point sets under the new bases as F_i,1 and F_j,2. If m and n differ, take the first |m − n| elements, then build the decision matrix Z:
Z_ij = ||F_i,1 − F_j,2||²
The decision criterion is: point I_i and point J_j are considered a correct match only when Z_ij is the smallest element of both the i-th row and the j-th column of Z.
Compared with the prior art, the present invention has the following advantages:
1. It needs no pasted coded marking points and avoids the use of control fields and orientation targets; the robot replaces manual labor, speeding up camera orientation and increasing the degree of measurement automation;
2. It overcomes the slow speed and poor flexibility of traditional orientation methods;
3. It achieves a high matching rate and high accuracy, and is well suited to online precision measurement in intelligent manufacturing.
Accompanying drawing explanation
Fig. 1: process flow diagram of the present invention.
Fig. 2: Mobile photogrammetry. (a) Mobile photogrammetry schematic; (b) pinhole imaging model. The collinearity condition equation states that the image point m_u, the projection center O_c, and the object point M lie on one straight line. O_c-X_cY_cZ_c is the camera coordinate system, O_w-X_wY_wZ_w the world coordinate system, o_0-uv the image coordinate system, and o_1-xy the image plane coordinate system.
Fig. 3: Hand-eye calibration. (a) Hand-eye calibration schematic; (b) relative positions of the coordinate systems. As shown in (b), A, B, C, D and X are 4×4 matrices representing the relative orientation between two coordinate systems; each can be decomposed into a translation vector and a rotation matrix. C_cal is the calibration board coordinate system, C_cam1 and C_cam2 are the camera coordinate systems, and C_tool1 and C_tool2 are the tool coordinate systems. The calibration board is used to calibrate the camera at the two positions C_cam1 and C_cam2, computing the intrinsic and extrinsic parameters and yielding the three matrices. The robot motion parameters can be read from the controller, so C_tool1 and C_tool2 are known parameters, giving matrix D. The relative position X between the tool coordinate system and the camera coordinate system remains unchanged. Suppose a point P in space has coordinates P_c1, P_c2, P_t1 and P_t2 in the coordinate systems C_cam1, C_cam2, C_tool1 and C_tool2 respectively.
Fig. 4: Epipolar geometric constraint. (a) Epipolar geometry matching; (b) ambiguous matching. As shown in (a), the two image planes are I and I′; m and m′ are the image points corresponding to the spatial point P; C and C′ are the optical centers of the cameras. The line joining the optical centers is the baseline, and its intersections e and e′ with the image planes are the epipoles. The plane formed by the baseline CC′ and the two projection rays CP and C′P is the epipolar plane, and its intersections l_m′ and l_m with the image planes are the epipolar lines. m and m′ are corresponding points: l_m′ is the epipolar line of m on image plane I′, and clearly m′ lies on l_m′; likewise m must lie on the epipolar line l_m.
Fig. 5: Searching for match points along the corresponding epipolar lines.
Fig. 6: Correct matching of the point sets is achieved, verifying the accuracy of the camera orientation results.
Fig. 7: Two adjacent correct matching point pairs, 5-6 and 7-8, are randomly selected to make the structural features of the point set more distinctive. Having additional correct matching pairs participate in the correction guarantees its accuracy. If 5-6 and 7-8 still match after the correction, the correction is judged successful; otherwise it is judged failed and matching pairs are chosen again until the correction succeeds.
Fig. 8: Result of the ambiguity matching correction.
Embodiment
The process flow of the invention is shown in Fig. 1. First, hand-eye calibration is performed on the robot; then, using the hand-eye relation and the robot motion parameters, the camera's relative positions are unified into the robot base coordinate system; finally, self-orientation of the camera is realized. In addition, to eliminate the matching ambiguity caused by points sharing the same epipolar plane, an ambiguity matching correction method based on feature vectors is proposed. The specific implementation of the technical solution is illustrated below with reference to the accompanying drawings.
1. Mobile photogrammetry principle
As shown in Fig. 2(a), a moving camera images the measured object from different orientations; the constraint relation between corresponding points at different stations is established from the collinearity condition equation, and bundle adjustment finally recovers the three-dimensional information of the marking points, realizing measurement of the key positions. To improve measurement automation, speed and flexibility, the camera is fixed on a robot and moved by planning the robot's motion path. In addition, the marking points are projected by a laser projector, which avoids contact with the measured object and enables 100% non-contact measurement.
As shown in Fig. 2(b), the collinearity condition equation states that the image point m_u, the projection center O_c, and the object point M lie on one straight line. O_c-X_cY_cZ_c is the camera coordinate system, O_w-X_wY_wZ_w the world coordinate system, o_0-uv the image coordinate system, and o_1-xy the image plane coordinate system. From this the projection equations follow directly.
Considering lens distortion, distortion correction must be applied to the image points.
In the formulas, (x_u, y_u) are the ideal image point coordinates, (x_d, y_d) the actual image point coordinates, (X_w, Y_w, Z_w) the object point coordinates in the world coordinate system, and (X_c, Y_c, Z_c) the object point coordinates in the camera coordinate system. f is the focal length; K_1, K_2, K_3, P_1 and P_2 are the distortion parameters. The focal length and distortion parameters are the camera's interior orientation parameters and can be calibrated in advance. R and T are the rotation matrix and translation vector of the camera, representing its orientation in the world coordinate system. It follows from the formulas that, to recover the three-dimensional coordinates of a marking point, it must be imaged from different orientations and a system of collinearity condition equations established. Matching the same marking point across different images is therefore a crucial step, and camera orientation is the prerequisite of the matching.
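The distortion equations themselves did not survive the extraction above. As a sketch, under the assumption that the five parameters K1, K2, K3, P1, P2 follow the standard Brown radial-tangential model (the usual reading of these symbols; the patent's own equations may differ in detail), the correction from an actual image point (x_d, y_d) to the ideal point (x_u, y_u) can be written as:

```python
def undistort_point(x_d, y_d, K1, K2, K3, P1, P2):
    """Radial + tangential (Brown) correction mapping the measured image
    point (x_d, y_d) to the ideal point (x_u, y_u).  This is the standard
    five-parameter model matching the symbols K1, K2, K3, P1, P2 in the
    text; the source's own formula was lost in extraction."""
    r2 = x_d**2 + y_d**2                              # squared radius
    radial = 1 + K1*r2 + K2*r2**2 + K3*r2**3          # radial factor
    x_u = x_d*radial + 2*P1*x_d*y_d + P2*(r2 + 2*x_d**2)
    y_u = y_d*radial + P1*(r2 + 2*y_d**2) + 2*P2*x_d*y_d
    return x_u, y_u
```

With all coefficients zero the point is unchanged; nonzero K1 scales the point radially, as expected.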
2. Hand-eye calibration
Camera orientation based on robot motion parameters is built on hand-eye calibration and the robot motion parameters. As shown in Fig. 3(a), the camera is mounted on the robot end-effector and moves with the robot; during the motion, the camera's position relative to the end-effector tool remains unchanged. The basic idea of hand-eye calibration is to drive the robot to different positions so that the camera images a calibration reference in space from different orientations, from which the mathematical relation between the tool coordinate system and the camera coordinate system is derived.
As shown in Fig. 3(b), the camera's position relative to the robot end-effector tool remains unchanged, and the robot is driven to different positions so that the camera images the calibration reference from different orientations. Then:
P_c1 = C·P_c2 (1)
P_c1 = X·P_t1 (2)
P_t1 = D·P_t2 (3)
P_c2 = X·P_t2 (4)
From (1) and (4):
P_c1 = C·X·P_t2 (5)
From (2) and (3):
P_c1 = X·D·P_t2 (6)
Combining (5) and (6):
C·X = X·D (7)
Moving the robot to different positions yields multiple constraint equations of the form (7); solving them simultaneously gives X.
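Equation (7) is the classical hand-eye equation (often written AX = XB). The patent does not specify its solver, so the following is only a minimal numerical sketch: it stacks the constraints C_k X = X D_k via the Kronecker-product identities vec(C X) = (I ⊗ C) vec(X) and vec(X D) = (Dᵀ ⊗ I) vec(X), and recovers vec(X) as the null vector of the stacked system (noise-free transforms assumed).

```python
import numpy as np

def solve_hand_eye(C_list, D_list):
    """Solve the stacked hand-eye constraints C_k X = X D_k (eq. (7)) for
    the 4x4 homogeneous transform X, using column-major vectorization:
    vec(C X - X D) = (I kron C - D^T kron I) vec(X)."""
    rows = [np.kron(np.eye(4), C) - np.kron(D.T, np.eye(4))
            for C, D in zip(C_list, D_list)]
    A = np.vstack(rows)
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1].reshape(4, 4, order="F")   # smallest singular vector -> vec(X)
    X = X / X[3, 3]                       # fix the homogeneous scale (and sign)
    U, _, Wt = np.linalg.svd(X[:3, :3])   # re-orthonormalize the rotation block
    X[:3, :3] = U @ Wt
    return X
```

With two or more robot motions about different rotation axes, the null space is one-dimensional and X is recovered up to the homogeneous scale fixed above.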
3. Camera orientation
After the hand-eye relation is obtained, referring to Fig. 3(a), the matrix X corresponds to the inverse of the transform in the figure. During measurement the robot is moved, the pose of the tool coordinate system in the base coordinate system is read from the controller, and the pose of the camera coordinate system in the robot base coordinate system is then obtained from it.
Unifying the camera poses into the base coordinate system in this way realizes the self-orientation of the camera and prepares for epipolar-constraint matching.
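The self-orientation step is a single composition of transforms. A sketch, assuming X from the hand-eye calibration maps tool coordinates to camera coordinates (P_c = X·P_t, as in equation (2); the source is ambiguous about which direction "the inverse" refers to, so this convention is an assumption):

```python
import numpy as np

def camera_pose_in_base(base_T_tool, X):
    """Camera self-orientation: with X = cam_T_tool (tool -> camera), the
    camera pose in the robot base frame is base_T_cam = base_T_tool @ inv(X).
    base_T_tool is the tool pose read from the robot controller."""
    return base_T_tool @ np.linalg.inv(X)
```

A quick consistency check: a point expressed in the tool frame must land at the same base-frame location whether routed through the tool pose or through the camera pose.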
4. Epipolar-constraint matching
Epipolar geometry describes the correspondence between an image point and its epipolar line, and the fundamental matrix is the algebraic expression of the epipolar geometry. As shown in Fig. 4(a), the two image planes are I and I′; m and m′ are the image points corresponding to the spatial point P; C and C′ are the optical centers of the cameras. The line joining the optical centers is the baseline, and its intersections e and e′ with the image planes are the epipoles. The plane formed by the baseline CC′ and the two projection rays CP and C′P is the epipolar plane, and its intersections l_m′ and l_m with the image planes are the epipolar lines. m and m′ are corresponding points: l_m′ is the epipolar line of m on image plane I′, and clearly m′ lies on l_m′; likewise m must lie on the epipolar line l_m. This constraint relation is described with the fundamental matrix F.
Once the fundamental matrix is computed, match points can be searched along the corresponding epipolar lines.
As in Fig. 4(a), let H and H′ be the poses of the two camera coordinate systems in the robot base coordinate system, i.e. the camera self-orientation results. The relative orientation H_rel of the two cameras is then
H_rel = H⁻¹H′
H_rel can be decomposed into the rotation R_rel and the translation t_rel, and the fundamental matrix is
F = K⁻ᵀ[t_rel]×R_rel·K⁻¹
where K is the intrinsic matrix of the camera, which remains unchanged during the measurement. Computing the fundamental matrix lays the foundation for epipolar-constraint matching.
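Given the two self-oriented camera poses, the relative orientation and the fundamental matrix above can be sketched directly. The sign and direction conventions below are assumptions chosen so that the epipolar constraint closes numerically; the patent does not fix them:

```python
import numpy as np

def skew(t):
    """[t]x, the 3x3 cross-product matrix of t."""
    return np.array([[0.0, -t[2], t[1]],
                     [t[2], 0.0, -t[0]],
                     [-t[1], t[0], 0.0]])

def fundamental_from_poses(H1, H2, K):
    """From the two self-oriented camera poses H, H' (camera frames given
    in the robot base frame) form H_rel = H^-1 H', split it into R_rel and
    t_rel, and build F = K^-T [t_rel]x R_rel K^-1.  With this convention
    the epipolar constraint reads m^T F m' = 0 for homogeneous pixel
    points m (image 1) and m' (image 2)."""
    H_rel = np.linalg.inv(H1) @ H2
    R_rel, t_rel = H_rel[:3, :3], H_rel[:3, 3]
    K_inv = np.linalg.inv(K)
    return K_inv.T @ skew(t_rel) @ R_rel @ K_inv

def epipolar_line(F, m_prime):
    """Epipolar line l_m = F m' in image 1, scaled so that m . l is the
    point-to-line distance in pixels (match search runs along this line)."""
    l = F @ m_prime
    return l / np.linalg.norm(l[:2])
```

A synthetic two-view setup (project one 3-D point into both cameras) verifies that the constraint and the epipolar-line distance both vanish.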
5. Ambiguity matching correction based on feature vectors
As in Fig. 4(b), points P and N lie on the same epipolar plane, so their corresponding epipolar lines are identical, and searching for match points along an epipolar line produces ambiguous matches. An ambiguity matching correction method based on feature vectors is proposed here: a proximity matrix is constructed from the point set's own properties, and the point sets are transformed from two-dimensional space into a higher-dimensional space before being matched again. Suppose the two point sets with ambiguous matches are I_i (i = 1, 2, …, m) and J_j (j = 1, 2, …, n). Build the metric matrix
H_ij = exp(−r_ij²/2σ²)
where r_ij is the distance between point I_i and point J_j, and σ is calculated as follows
Here M is the number of points. Perform an SVD decomposition of the matrix H
H_m = V_m·D_m·V_mᵀ
where V is an m-dimensional orthonormal matrix. Obtaining V gives an m-dimensional description of the point set: the m column vectors E_i of V form the basis of this m-dimensional space, and the m row vectors F_i of V are the coordinates of the m points in this basis. The first point set yields H_1 = V_1·D_1·V_1ᵀ and the second yields H_2 = V_2·D_2·V_2ᵀ; denote the descriptions of the two point sets under the new bases as F_i,1 and F_j,2. If m and n differ, take the first |m − n| elements and build the decision matrix Z
Z_ij = ||F_i,1 − F_j,2||²
The decision criterion is: point I_i and point J_j are considered a correct match only when Z_ij is the smallest element of both the i-th row and the j-th column of Z. Combining the correction result with the epipolar-constraint matching result completes the ambiguity matching correction.
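The correction of step 5 can be sketched numerically. Two caveats: the formula for σ is missing from the extracted text, so the sketch simply takes σ as the mean pairwise distance of the set (an assumption, not the patent's formula); and it resolves the sign ambiguity of singular vectors, which any practical implementation of this SVD method must handle, by forcing the largest-magnitude component of each column positive.

```python
import numpy as np

def mean_pairwise_sigma(P):
    """Stand-in for the sigma formula lost in extraction: the mean
    pairwise distance of the point set (an assumption)."""
    r = np.sqrt(np.sum((P[:, None, :] - P[None, :, :]) ** 2, axis=-1))
    n = len(P)
    return r.sum() / (n * (n - 1))

def spectral_coords(P, sigma):
    """Proximity matrix H_ij = exp(-r_ij^2 / (2 sigma^2)) of one point set,
    SVD H = V D V^T; the rows F_i of V are the points' coordinates in the
    higher-dimensional spectral basis."""
    r2 = np.sum((P[:, None, :] - P[None, :, :]) ** 2, axis=-1)
    H = np.exp(-r2 / (2.0 * sigma ** 2))
    V, _, _ = np.linalg.svd(H)
    # resolve the per-column sign ambiguity of singular vectors
    cols = np.arange(V.shape[1])
    signs = np.sign(V[np.argmax(np.abs(V), axis=0), cols])
    return V * signs

def ambiguity_match(P, Q, sigma):
    """Decision matrix Z_ij = ||F_i,1 - F_j,2||^2; accept (i, j) only when
    Z_ij is minimal in both row i and column j (the decision criterion)."""
    F1, F2 = spectral_coords(P, sigma), spectral_coords(Q, sigma)
    k = min(F1.shape[1], F2.shape[1])          # common spectral dimension
    Z = np.sum((F1[:, None, :k] - F2[None, :, :k]) ** 2, axis=-1)
    return [(i, j)
            for i in range(Z.shape[0]) for j in range(Z.shape[1])
            if Z[i, j] == Z[i].min() and Z[i, j] == Z[:, j].min()]
```

Feeding in a permuted copy of a point set should recover exactly the permutation, since the spectral coordinates are invariant to point ordering up to the resolved signs.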
6. Experimental system
The system configuration adopted here comprises: an industrial robot (ABB IRB1410), a laser projector, an industrial camera (VA-8MC with an AF Nikon 28 mm f/2.8D lens), a computer, and a calibration board (Ti-TIMES CG-150-H-15).
The hand-eye calibration results are listed in the following table:
7. Camera orientation results
The pose of the tool coordinate system in the robot base coordinate system is read from the controller and combined with the hand-eye calibration result to unify the camera poses into the robot base coordinate system. The following table lists the orientation results of two of the cameras.
8. Marking point matching results
Using the camera orientation results in the table above, the epipolar lines on the right image corresponding to the marking points of the left image are computed, as shown in Fig. 5. Searching for match points along the corresponding epipolar lines, as in Fig. 6, achieves correct matching of the point sets and verifies the accuracy of the camera orientation results. However, the epipolar lines corresponding to points 1 and 2 in Fig. 5 both pass through points 3 and 4, so ambiguous matches occur and correction is needed.
As in Fig. 7, based on the matching result of Fig. 6, two adjacent correct matching point pairs, 5-6 and 7-8, are randomly selected to make the structural features of the point set more distinctive. Having additional correct matching pairs participate in the correction guarantees its accuracy. If 5-6 and 7-8 still match after the correction, the correction is judged successful; otherwise it is judged failed and matching pairs are chosen again until the correction succeeds. The coordinates of the point sets to be matched are given in the following table.
The correction procedure is as follows. First the σ parameters of point sets (1, 2, 5, 7) and (3, 4, 6, 8) are computed: σ_1 = 657.6971 and σ_2 = 1290.5428. Then the metric matrices H_1 and H_2 are computed,
followed by the decision matrix Z.
Finding the elements of the decision matrix that are minimal in both their row and their column completes the point set matching; the correction result is shown in Fig. 8.

Claims (6)

1. A marking point matching method based on robot motion parameters and feature vectors, comprising the following steps:
(1) using the fact that the camera's position relative to the robot end-effector tool remains unchanged, performing hand-eye calibration on the robot;
(2) using the hand-eye relation and the robot motion parameters, unifying the camera positions into the robot base coordinate system;
(3) performing camera self-orientation from the position of the camera coordinate system in the robot base coordinate system;
(4) matching the marking points using epipolar-constraint matching;
(5) performing ambiguity matching correction based on feature vectors.
2. The marking point matching method based on robot motion parameters and feature vectors according to claim 1, characterized in that in step (2) the relative position matrix between the camera and the tool is obtained from the robot motion parameters; the camera is mounted on the robot end-effector and moves with the robot, its position relative to the end-effector tool remaining unchanged; the basic idea is to drive the robot to different positions so that the camera images a calibration reference in space from different orientations; moving the robot yields the constraint equation CX = XD, where C is the relative position matrix between the camera positions and D is the relative position matrix between the tool positions, and the relative position matrix X between the camera and the tool is solved from the constraint equation.
3. The marking point matching method based on robot motion parameters and feature vectors according to claim 1, characterized in that in step (3) the position of the camera coordinate system in the robot base coordinate system is obtained from the position of the tool coordinate system in the base coordinate system and the relative position between the tool coordinate system and the camera coordinate system, the relating matrix in the constraint equation being the matrix X; during measurement the robot is moved, and the position of the tool coordinate system in the base coordinate system is read from the controller, giving the position of the camera coordinate system in the robot coordinate system.
4. The marking point matching method based on robot motion parameters and feature vectors according to claim 1, characterized in that in step (4) the constraint between an image point and its epipolar line is described with the fundamental matrix F:
m′ᵀ·F·m = 0, l_m′ = F·m, l_m = Fᵀ·m′
where m and m′ are the image points corresponding to the spatial point P, and l_m and l_m′ are the intersections of the epipolar plane with the image planes, as shown in Fig. 4(a).
5. The marking point matching method based on robot motion parameters and feature vectors according to claim 1, characterized in that in step (5) a proximity matrix is constructed from the point set's own properties, the point sets are transformed from two-dimensional space into a higher-dimensional space and matched again, and a decision matrix Z is built:
Z_ij = ||F_i,1 − F_j,2||²
Point I_i and point J_j are considered a correct match only when Z_ij is the smallest element of both the i-th row and the j-th column of Z.
6. The ambiguity matching correction based on feature vectors according to claim 5, characterized in that a metric matrix is used and an SVD decomposition of the metric matrix is performed; suppose the two point sets with ambiguous matches are I_i (i = 1, 2, …, m) and J_j (j = 1, 2, …, n), and build the metric matrix:
H_ij = exp(−r_ij²/2σ²)
where r_ij is the distance between point I_i and point I_j and M is the number of points; perform an SVD decomposition of the matrix H:
H_m = V_m·D_m·V_mᵀ
where V is an m-dimensional orthonormal matrix; obtaining V gives an m-dimensional description of the point set: the m column vectors E_i of V form the basis of this m-dimensional space, and the m row vectors F_i of V are the coordinates of the m points in this basis; the first point set yields H_1 = V_1·D_1·V_1ᵀ and the second yields H_2 = V_2·D_2·V_2ᵀ; denote the descriptions of the two point sets under the new bases as F_i,1 and F_j,2; if m and n differ, take the first |m − n| elements and build the decision matrix.
CN201510784047.6A 2015-11-17 2015-11-17 Reference points matching method based on robot motion's parameter and feature vector Active CN105447856B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510784047.6A CN105447856B (en) 2015-11-17 2015-11-17 Reference points matching method based on robot motion's parameter and feature vector

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201510784047.6A CN105447856B (en) 2015-11-17 2015-11-17 Reference points matching method based on robot motion's parameter and feature vector

Publications (2)

Publication Number Publication Date
CN105447856A true CN105447856A (en) 2016-03-30
CN105447856B CN105447856B (en) 2019-01-22

Family

ID=55557989

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510784047.6A Active CN105447856B (en) 2015-11-17 2015-11-17 Reference points matching method based on robot motion's parameter and feature vector

Country Status (1)

Country Link
CN (1) CN105447856B (en)


Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101010140B1 (en) * 2008-09-30 2011-01-24 삼성에스디에스 주식회사 Measuring system for a subject using geographic information system
CN102419172A (en) * 2011-08-18 2012-04-18 武汉大学 Stereo image pair automatic relative orientation method with additional non-linear constraint condition
CN102865857A (en) * 2012-09-04 2013-01-09 北京信息科技大学 Photography measurement image matching method

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
张云珠 (Zhang Yunzhu): "Research on hand-eye calibration technology for industrial robots", China Master's Theses Full-text Database, Information Science & Technology *
郭磊 (Guo Lei): "Research on key technologies of mobile vision precision measurement", China Doctoral Dissertations Full-text Database, Information Science & Technology *
鲍文霞 (Bao Wenxia): "Image matching algorithms based on structural features and their applications", China Doctoral Dissertations Full-text Database, Information Science & Technology *

Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107829221B (en) * 2016-09-16 2021-06-08 Juki株式会社 Sewing system
CN107829221A (en) * 2016-09-16 2018-03-23 Juki株式会社 Sewing system
CN106553195B (en) * 2016-11-25 2018-11-27 中国科学技术大学 Object 6DOF localization method and system during industrial robot crawl
CN106553195A (en) * 2016-11-25 2017-04-05 中国科学技术大学 Object 6DOF localization method and system during industrial robot crawl
CN107369184A (en) * 2017-06-23 2017-11-21 中国科学院自动化研究所 Mix binocular industrial robot system's synchronization calibration system, method and other devices
CN107993265A (en) * 2017-11-29 2018-05-04 深圳市沃特沃德股份有限公司 The calibration facility of monocular sweeper, method and device
CN110238848A (en) * 2019-05-30 2019-09-17 埃夫特智能装备股份有限公司 The calculation method of gravitational vectors under a kind of robot coordinate system
CN110238848B (en) * 2019-05-30 2022-07-05 埃夫特智能装备股份有限公司 Method for calculating gravity vector under robot coordinate system
CN110360991A (en) * 2019-06-18 2019-10-22 武汉中观自动化科技有限公司 A kind of photogrammetric survey method, device and storage medium
CN112828878A (en) * 2019-11-22 2021-05-25 中国科学院沈阳自动化研究所 Three-dimensional measurement and tracking method for large-scale equipment in butt joint process
CN113688678A (en) * 2021-07-20 2021-11-23 深圳市普渡科技有限公司 Road sign multi-ambiguity processing method, robot and storage medium
CN113688678B (en) * 2021-07-20 2024-04-12 深圳市普渡科技有限公司 Road sign multi-ambiguity processing method, robot and storage medium
CN114700989A (en) * 2022-04-24 2022-07-05 安吉八塔机器人有限公司 Automatic leveling device for plane camera and control method thereof

Also Published As

Publication number Publication date
CN105447856B (en) 2019-01-22

Similar Documents

Publication Publication Date Title
CN105447856A (en) Marking point coupling method based on robot motion parameters and characteristic vectors
CN108010085B (en) Target identification method based on binocular visible light camera and thermal infrared camera
Hu et al. Extrinsic calibration of 2-D laser rangefinder and camera from single shot based on minimal solution
CN102472609B (en) Position and orientation calibration method and apparatus
CN107883870A (en) Overall calibration method based on binocular vision system and laser tracker measuring system
CN108594245A (en) A kind of object movement monitoring system and method
US7659921B2 (en) Distance measurement apparatus, distance measurement method, and distance measurement program
CN111192235B (en) Image measurement method based on monocular vision model and perspective transformation
CN104517291B (en) Pose measuring method based on target coaxial circles feature
CN105678785A (en) Method for calibrating posture relation of laser and camera
CN105073348A (en) A robot system and method for calibration
CN105096341B (en) Mobile robot position and orientation estimation method based on trifocal tensor and key frame strategy
CN105716542A (en) Method for three-dimensional data registration based on flexible feature points
US10928191B2 (en) Marker, and posture estimation method and position and posture estimation method using marker
CN107990940A (en) A kind of moving object method for tracing based on stereo vision measuring technology
CN110044374A (en) A kind of method and odometer of the monocular vision measurement mileage based on characteristics of image
CN105469389A (en) Grid ball target for visual sensor calibration and corresponding calibration method
CN103983186A (en) Binocular vision system correcting method and device
CN104807405B (en) Three-dimensional coordinate measurement method based on light ray angle calibration
CN102914295A (en) Computer vision cube calibration based three-dimensional measurement method
CN106017321A (en) Binocular vision-based large-dimensional geometric quantity measurement method
CN105809706A (en) Global calibration method of distributed multi-camera system
Huai et al. Continuous-time spatiotemporal calibration of a rolling shutter camera-IMU system
CN208350997U (en) A kind of object movement monitoring system
De Cecco et al. A unified framework for uncertainty, compatibility analysis, and data fusion for multi-stereo 3-d shape estimation

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20220308

Address after: 300450 No. 1, cuipu Road, Yixian science and Industry Park, economic and Technological Development Zone, Binhai New Area, Tianjin

Patentee after: Siteng Heli (Tianjin) Technology Co.,Ltd.

Address before: 300160 Tianjin City Hedong District Forest Road No. 63

Patentee before: TIANJIN POLYTECHNIC University

TR01 Transfer of patent right
TR01 Transfer of patent right

Effective date of registration: 20231216

Address after: 014010 The first floor of A3 factory building in the High tech Industrial Base Park, No. 21 Alatan Khan Street, Rare Earth Development Zone, Baotou City, Inner Mongolia Autonomous Region

Patentee after: Baotou Yihui Information Technology Co.,Ltd.

Address before: 300450 No. 1, cuipu Road, Yixian science and Industry Park, economic and Technological Development Zone, Binhai New Area, Tianjin

Patentee before: Siteng Heli (Tianjin) Technology Co.,Ltd.

TR01 Transfer of patent right