CN103115615A - Fully-automatic calibration method for hand-eye robot based on exponential product model - Google Patents

Fully-automatic calibration method for hand-eye robot based on exponential product model Download PDF

Info

Publication number
CN103115615A
Authority
CN
China
Prior art keywords
robot
camera
joint
hand
surveyor
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN2013100309557A
Other languages
Chinese (zh)
Other versions
CN103115615B (en)
Inventor
Wang Haixia (王海霞)
Lu Xiao (卢晓)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Qingdao Zhuo Xintong Intelligent Technology Co ltd
Original Assignee
Shandong University of Science and Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shandong University of Science and Technology filed Critical Shandong University of Science and Technology
Priority to CN201310030955.7A priority Critical patent/CN103115615B/en
Publication of CN103115615A publication Critical patent/CN103115615A/en
Application granted granted Critical
Publication of CN103115615B publication Critical patent/CN103115615B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Images

Landscapes

  • Image Analysis (AREA)
  • Manipulator (AREA)

Abstract

The invention discloses a fully-automatic calibration method for a hand-eye robot based on an exponential product model, and in particular a fully-automatic method that calibrates the robot body, the hand-eye relationship and the camera at the same time. The method requires only a plane mirror and several beacons, rather than additional auxiliary measuring instruments. Its main features are: first, the mirror-reflection principle is used to transform hand information lying outside the field of view of the end-mounted camera into the camera coordinate system, completing the hand-eye calibration; second, the end-mounted camera carried by the robot is used as the measuring instrument and a checkerboard on the mirror surface is used as the measuring target, which increases the number of times the camera can image the target while each robot joint rotates, so that camera calibration and joint-parameter calibration are achieved under a special motion strategy. The fully-automatic calibration method is simple and effective.

Description

Fully-automatic calibration method for a hand-eye robot based on the exponential product model
Technical field
The present invention relates to the field of hand-eye robot calibration, and more particularly to a fully-automatic calibration method for a hand-eye robot based on the exponential product model, intended to improve the absolute positioning accuracy of hand-eye robots.
Background technology
A hand-eye robot is a robot with a camera mounted on its end effector, where the camera plays the role of the eyes; because of its flexibility it has attracted wide attention. The accuracy of its overall calibration is an important indicator of whether such a robot can be applied in practice. The overall calibration of a hand-eye robot comprises camera calibration, hand-eye calibration and robot-body calibration: camera calibration determines the intrinsic parameters of the camera mounted on the robot end; hand-eye calibration determines the relationship between the robot end-effector tool and the camera mounted on it; robot-body calibration determines the parameters of each robot joint, where different kinematic models define different parameters. Each of these calibrations is an important problem in the robot vision field, and all of them are rather complicated, which brings much inconvenience to the application of hand-eye robots. To simplify the calibration process while guaranteeing accuracy, camera calibration and hand-eye calibration have been carried out simultaneously, hand-eye calibration and robot-body calibration have been carried out simultaneously, and camera calibration and robot-body calibration have also been carried out simultaneously; however, there is as yet no calibration method that performs all three at the same time, and the existing procedures are complicated and require human interaction. There is therefore an urgent need for a simple and effective automatic method that realizes the overall calibration of a hand-eye robot.
Summary of the invention
The task of the present invention is to overcome the deficiencies of existing hand-eye robot calibration methods and to provide a fully-automatic calibration method for a hand-eye robot based on the exponential product model.
Its technical solution is as follows:
A fully-automatic calibration method for a hand-eye robot based on the exponential product model comprises the following steps:
a. Provide an industrial robot, an end-mounted binocular camera, a plane mirror and several beacons; mount the binocular camera on the robot end in a detachable manner, and paste beacons on the mirror surface of the plane mirror and at the tool reference points of the robot end; then go to step b.
b. Determine the initial position of the robot, and place the plane mirror carrying the beacons, at a suitable position, within the field of view of the end-mounted binocular camera; at the same time use the binocular camera to acquire one mirror image of the plane mirror, which contains both the mirror-surface beacon information and the beacon information of the end gripper reflected in the mirror and is used to calibrate the hand-eye relationship; then go to step c.
c. Starting from the end joint, control the joints to rotate one by one; during the rotation of each joint, use the end-mounted binocular camera to photograph the mirror-surface beacons once at every fixed angular interval, until they leave the camera's field of view. Suppose that m_i images are obtained during the motion of joint i of the robot, i = 1, ..., n, where n is the number of robot joints; the Σ_{i=1}^{n} m_i images are then used to calibrate the camera and the robot joint parameters.
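As an illustration of the camera-calibration part of step c, the following is a minimal sketch, not part of the patent, of how the checkerboard images gathered during the joint rotations could be fed to Zhang Zhengyou's planar method as implemented in OpenCV. The file layout, the board size (9×6 inner corners) and the square size are assumptions made only for this example.

```python
# Minimal sketch (assumptions: 9x6 inner-corner checkerboard, 20 mm squares,
# hypothetical image file layout). Estimates the end camera's intrinsic
# parameters with Zhang Zhengyou's planar method via OpenCV.
import glob
import numpy as np
import cv2

board_size = (9, 6)          # inner corners per row / column (assumed)
square_size = 20.0           # mm (assumed)

# Planar object points of the checkerboard in its own frame (Z = 0).
objp = np.zeros((board_size[0] * board_size[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:board_size[0], 0:board_size[1]].T.reshape(-1, 2) * square_size

obj_points, img_points, image_size = [], [], None
for fname in glob.glob("joint_*/img_*.png"):      # hypothetical file layout
    gray = cv2.imread(fname, cv2.IMREAD_GRAYSCALE)
    if gray is None:
        continue
    found, corners = cv2.findChessboardCorners(gray, board_size)
    if not found:
        continue
    corners = cv2.cornerSubPix(
        gray, corners, (11, 11), (-1, -1),
        (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 1e-3))
    obj_points.append(objp)
    img_points.append(corners)
    image_size = gray.shape[::-1]

# Intrinsic matrix K and distortion coefficients from all collected views.
rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
    obj_points, img_points, image_size, None, None)
print("reprojection RMS:", rms)
print("K =\n", K)
```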
In step a, the exponential product kinematic model of the robot is set as:

$$g_{WT}(\theta) = e^{\hat{\xi}_1\theta_1} e^{\hat{\xi}_2\theta_2} \cdots e^{\hat{\xi}_n\theta_n}$$

where g_WT(θ) is the transformation of the end-of-arm tool coordinate system T relative to the world coordinate system W when the joint angles are θ = (θ_1, θ_2, θ_3, ..., θ_n), and e^{ξ̂_i θ_i} is the matrix exponential of the motion twist ξ̂_i θ_i.
In step b, the hand-eye transformation to be calibrated is A_ch, obtained as

$$A_{ch} = A_{oc}^{-1} B A_{oc} A_{ch'}$$

where B = diag(1, 1, -1, 1) is the mirror reflection matrix, and A_oc and A_ch' are known data.
In step c, denote the points measured by the end binocular camera during the rotation of joint i as X_i = (x_{i1}, x_{i2}, ..., x_{i m_i}), i = 1, ..., n, with x_{ij} ∈ R^{4×num} the three-dimensional homogeneous coordinates, m_i the number of camera measurements during the rotation of joint i, and num the number of measured spatial points. The joint parameters are solved from

$$x_{i,a} = e^{\hat{\xi}_{ij}\theta_{ij}} x_{i,b}, \quad i = 1, \ldots, n, \quad a, b \in \{1, \ldots, m_i\}$$

where x_{i,a} and x_{i,b} are the homogeneous coordinate values of any two groups of data in X_i, from which the twist parameters ξ_ij are obtained directly. Since X_i contains m_i groups of measurement data and every pair of groups yields one solution, all pairwise combinations give C_{m_i}^2 solutions, and these are averaged to obtain the final joint parameters:

$$\xi_i = \frac{1}{C_{m_i}^2} \sum_{j=1}^{C_{m_i}^2} \xi_{ij}$$

where ξ_i is the motion twist coordinate of joint i.
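As a worked illustration of the averaging step (the value m_i = 5 is an assumption, not taken from the patent), a joint for which five images are captured yields

$$C_{5}^{2} = \frac{5 \times 4}{2} = 10$$

pairwise solutions ξ_{i1}, ..., ξ_{i10}, whose mean is taken as the final ξ_i.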
The present invention has the following beneficial technical effects:
1. Simple to implement. No extra measuring equipment is needed; the Σ_{i=1}^{n} m_i images obtained by the self-carried end camera during the rotation of the robot joints suffice to calibrate the hand-eye relationship, the camera and the robot body.
2. Small introduced error. The only error source is the measurement accuracy of the end camera.
3. High accuracy. There are few error sources, and the robot joint parameters are calibrated using the same images as the camera calibration, so the accuracy is higher.
Description of the drawings
The present invention is further described below in conjunction with the accompanying drawings and an embodiment:
Fig. 1 is the robot kinematic model diagram of the present invention.
Fig. 2 is the hand-eye relationship calibration principle diagram of the present invention.
Fig. 3 is the calibration principle diagram of the present invention.
Embodiment
Fig. 1 shows the robot kinematic model. At the initial position, the world coordinate system W and the tool coordinate system T are both established on the robot end camera; the world coordinate system then remains fixed at the initial position, while the tool coordinate system moves with the robot. Suppose the robot has n revolute joints connected in series; the exponential product kinematic model of the robot is:
$$g_{WT}(\theta) = e^{\hat{\xi}_1\theta_1} e^{\hat{\xi}_2\theta_2} \cdots e^{\hat{\xi}_n\theta_n} \qquad (1)$$

where g_WT(θ) is the transformation of the end-of-arm tool coordinate system T relative to the world coordinate system W when the joint angles are θ = (θ_1, θ_2, θ_3, ..., θ_n), and e^{ξ̂_i θ_i} is the matrix exponential of the motion twist ξ̂_i θ_i, expressed explicitly as:
$$e^{\hat{\xi}\theta} = \begin{bmatrix} e^{\hat{\omega}\theta} & (I - e^{\hat{\omega}\theta})r \\ 0 & 1 \end{bmatrix} = \begin{bmatrix} R_{3\times 3} & t_{3\times 1} \\ 0 & 1 \end{bmatrix}, \quad \omega \neq 0 \qquad (2)$$

where

$$\hat{\omega} = \begin{bmatrix} 0 & -\omega_3 & \omega_2 \\ \omega_3 & 0 & -\omega_1 \\ -\omega_2 & \omega_1 & 0 \end{bmatrix}$$

is the antisymmetric (skew-symmetric) matrix of ω. ξ_i is the motion twist coordinate, which can be expressed as:

$$\xi_i = \begin{bmatrix} \omega_i \\ \omega_i \times r_i \end{bmatrix} \in \mathbb{R}^6 \qquad (3)$$

where ω_i and r_i ∈ R^3, i = 1, ..., n, are, respectively, the unit direction of the rotation axis of joint i in the world coordinate system and the position vector of a point on that axis; together they are called the twist parameters, also referred to as the joint parameters. The coordinates of a spatial point in the world coordinate system can then be obtained from:
$$X_W = g_{WT}(\theta) X_T \qquad (4)$$

where X_T, X_W ∈ R^{4×1} are the homogeneous coordinates of the spatial point in the tool coordinate system and the world coordinate system, respectively.
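To make formulas (1)–(4) concrete, the following is a minimal sketch, not part of the patent, of the exponential product forward kinematics in Python/NumPy; the two-joint axis directions, axis points and joint angles used in the example are illustrative assumptions.

```python
# Minimal sketch of formulas (1)-(4): twist exponential and POE forward
# kinematics. The joint axes and angles below are illustrative assumptions.
import numpy as np

def hat(w):
    """Antisymmetric matrix of w (the omega-hat of formula (2))."""
    return np.array([[0, -w[2], w[1]],
                     [w[2], 0, -w[0]],
                     [-w[1], w[0], 0]])

def twist_exp(omega, r, theta):
    """e^{xi-hat * theta} for a revolute twist xi = (omega, omega x r), formula (2)."""
    w_hat = hat(omega)
    # Rodrigues formula for the rotation part e^{omega-hat * theta}.
    R = np.eye(3) + np.sin(theta) * w_hat + (1 - np.cos(theta)) * (w_hat @ w_hat)
    g = np.eye(4)
    g[:3, :3] = R
    g[:3, 3] = (np.eye(3) - R) @ r          # translation part (I - e^{omega-hat theta}) r
    return g

def g_wt(joints, thetas):
    """Formula (1): product of the joint exponentials."""
    g = np.eye(4)
    for (omega, r), th in zip(joints, thetas):
        g = g @ twist_exp(np.asarray(omega, float), np.asarray(r, float), th)
    return g

# Illustrative two-joint example (assumed axes): joint 1 about z through the
# origin, joint 2 about y through the point (0.5, 0, 0).
joints = [((0, 0, 1), (0, 0, 0)), ((0, 1, 0), (0.5, 0, 0))]
theta = (np.pi / 4, np.pi / 6)
X_T = np.array([0.1, 0.0, 0.2, 1.0])        # homogeneous point in the tool frame
X_W = g_wt(joints, theta) @ X_T             # formula (4)
print(X_W)
```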
Fig. 2 shows the hand-eye calibration schematic diagram. First a suitable posture is selected as the robot initial position, and the mirror image of the plane mirror within the field of view is acquired; this image contains both the mirror-surface beacon information and the beacon information of the end gripper reflected in the mirror. The hand-eye relationship is obtained from these two parts of information; the concrete method is as follows:
Since the robot end is not within the field of view of the camera, the mirror is used to reflect it into the field of view. As shown in Fig. 2, C, H and O are respectively the camera coordinate system, the hand coordinate system and the mirror coordinate system; C' and H' are the corresponding virtual-image coordinate systems; X is a point in space and X' is its virtual image. Before introducing the hand-eye calibration process, two propositions are given first:
Proposition 1: a coordinate system reflected by a mirror changes handedness; a system that is originally left-handed becomes right-handed after reflection, and vice versa;
Proposition 2: owing to the duality of points and cameras, a camera at C observing the point X' is equivalent to a camera at C' observing the point X.
Suppose the homogeneous coordinates of the points X' and X in the camera coordinate system C and the mirror coordinate system O are denoted X'_c, X'_o and X_c, X_o respectively, and that the coordinate transformations between O and C, C and C', C and H', C' and H', and C and H are A_oc, A_cc', A_ch', A_c'h' and A_ch respectively, where A_oc and A_ch' are known data and A_ch is the hand-eye transformation to be solved. X'_c and X'_o, and X_c and X_o, satisfy the following relations:

$$X'_o = A_{oc} X'_c \qquad (5)$$

$$X_o = A_{oc} X_c \qquad (6)$$
According to Proposition 1:

$$X_o = \begin{bmatrix} 1 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0 \\ 0 & 0 & -1 & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix} X'_o = B X'_o, \quad B = \begin{bmatrix} 1 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0 \\ 0 & 0 & -1 & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix} \qquad (7)$$
Substituting formulas (5) and (6) into (7) and rearranging gives:

$$X_c = A_{oc}^{-1} B A_{oc} X'_c \qquad (8)$$
If X_{c'} denotes the homogeneous coordinates of X in the coordinate system C', then according to Proposition 2, X_{c'} = X'_c, and formula (8) becomes:

$$X_c = A_{oc}^{-1} B A_{oc} X_{c'} \qquad (9)$$
It follows that the transformation A_cc' between the camera coordinate systems C and C' is:

$$A_{cc'} = A_{oc}^{-1} B A_{oc} \qquad (10)$$
Further, from the transformation relations between the coordinate systems in Fig. 2:

$$A_{c'h'} = A_{cc'}^{-1} A_{ch'} \qquad (11)$$
In addition, according to Proposition 2:

$$A_{ch} = A_{c'h'} \qquad (12)$$

Substituting formulas (10) and (11) into formula (12) gives:

$$A_{ch} = A_{oc}^{-1} B A_{oc} A_{ch'} \qquad (13)$$

The hand-eye relative relationship A_ch is thus uniquely determined by formula (13).
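A minimal numeric sketch of formula (13) follows (not part of the patent); the transforms A_oc and A_ch' used here are placeholder assumptions, whereas in practice they would come from the mirror-image measurement described above.

```python
# Minimal sketch of formula (13): A_ch = A_oc^{-1} B A_oc A_ch'.
# A_oc and A_ch_prime below are placeholder values (assumptions), not measured data.
import numpy as np

B = np.diag([1.0, 1.0, -1.0, 1.0])   # mirror reflection matrix of formula (7)

def hand_eye_from_mirror(A_oc, A_ch_prime):
    """Hand-eye transform A_ch recovered from the mirror geometry, formula (13)."""
    return np.linalg.inv(A_oc) @ B @ A_oc @ A_ch_prime

# Placeholder example transforms (assumed): mirror frame 0.8 m in front of the
# camera, virtual hand frame offset to the side of the camera.
A_oc = np.eye(4); A_oc[2, 3] = 0.8
A_ch_prime = np.eye(4); A_ch_prime[:3, 3] = [0.05, 0.0, 0.30]

A_ch = hand_eye_from_mirror(A_oc, A_ch_prime)
print(A_ch)
```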
As shown in Fig. 3, the present invention needs no extra measuring instrument; only a mirror and several beacons are required. Its main features are: first, the reflection principle of the mirror is used to transform the hand information lying outside the field of view of the end camera into the camera coordinate system, so as to complete the hand-eye calibration; second, the end camera carried by the robot is used as the measuring instrument, the checkerboard (beacon) on the mirror surface in space is used as the measurement target, and fixed points within the field of view are used as target points. The target points are then photographed during the rotation of each single robot joint; these images are first used to calibrate the camera intrinsic parameters, and then the three-dimensional space coordinates of the target points are obtained. The intrinsic parameters of the end binocular camera are calibrated with the planar method proposed by Zhang Zhengyou, and the calibration method for the robot joint parameters is as follows:
Suppose the points measured by the end camera during the rotation of joint i are denoted X_i = (x_{i1}, x_{i2}, ..., x_{i m_i}), i = 1, ..., n, with x_{ij} ∈ R^{4×num} the three-dimensional homogeneous coordinates, m_i the number of camera measurements during the rotation of joint i, and num the number of measured spatial points. The joint parameters are solved from:

$$x_{i,a} = e^{\hat{\xi}_{ij}\theta_{ij}} x_{i,b}, \quad i = 1, \ldots, n, \quad a, b \in \{1, \ldots, m_i\} \qquad (14)$$

x_{i,a} and x_{i,b} are the homogeneous coordinate values of any two groups of data in X_i, and the twist parameters ξ_ij can be obtained directly from this formula. Since X_i contains m_i groups of measurement data and every pair of groups yields one solution, all pairwise combinations give C_{m_i}^2 solutions, and these C_{m_i}^2 solutions are averaged to obtain the final joint parameters:

$$\xi_i = \frac{1}{C_{m_i}^2} \sum_{j=1}^{C_{m_i}^2} \xi_{ij} \qquad (15)$$
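The following is a minimal sketch, not part of the patent, of one way formulas (14)–(15) could be carried out for a single joint: the rigid motion between two sets of measured 3-D points is estimated with the standard Kabsch/SVD registration, a revolute twist is extracted from it, and the results of all pairwise combinations are averaged. The simulated point data and the z-axis joint are assumptions used only to exercise the code.

```python
# Minimal sketch of formulas (14)-(15): estimate one joint's twist from pairs of
# 3-D point sets taken at different joint angles, then average over all C(m,2)
# pairs. The point data below are simulated assumptions, not measurements.
import itertools
import numpy as np

def rigid_transform(P, Q):
    """Least-squares R, t with Q ~ R P + t (Kabsch via SVD); P, Q are 3xN."""
    cp, cq = P.mean(axis=1, keepdims=True), Q.mean(axis=1, keepdims=True)
    U, _, Vt = np.linalg.svd((Q - cq) @ (P - cp).T)
    D = np.diag([1.0, 1.0, np.linalg.det(U @ Vt)])   # guard against reflections
    R = U @ D @ Vt
    t = (cq - R @ cp).ravel()
    return R, t

def twist_from_rt(R, t):
    """Unit twist xi = (omega, omega x r) of the revolute motion (R, t)."""
    theta = np.arccos(np.clip((np.trace(R) - 1.0) / 2.0, -1.0, 1.0))
    omega = np.array([R[2, 1] - R[1, 2], R[0, 2] - R[2, 0], R[1, 0] - R[0, 1]]) / (2.0 * np.sin(theta))
    # Solve (I - R) r = t for a point r on the axis (rank-2 system, least squares).
    r, *_ = np.linalg.lstsq(np.eye(3) - R, t, rcond=None)
    return np.hstack([omega, np.cross(omega, r)])

# Simulated measurements (assumption): num = 8 points seen at m_i = 4 angles
# about the z axis through the origin.
rng = np.random.default_rng(0)
pts = rng.uniform(-0.3, 0.3, size=(3, 8))
angles = np.deg2rad([0, 15, 30, 45])
def rot_z(a):
    return np.array([[np.cos(a), -np.sin(a), 0], [np.sin(a), np.cos(a), 0], [0, 0, 1]])
X = [rot_z(a) @ pts for a in angles]

# Formula (15): average the twist over all pairwise combinations (each pair is
# one instance of formula (14), mapping the earlier measurement to the later one).
twists = [twist_from_rt(*rigid_transform(X[a], X[b]))
          for a, b in itertools.combinations(range(len(X)), 2)]
xi_i = np.mean(twists, axis=0)
print(xi_i)   # close to (0, 0, 1, 0, 0, 0) for this simulated z-axis joint
```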
Technical details not addressed in the above description can be realized by adopting or referring to the prior art.
It should be noted that, under the teaching of this specification, those skilled in the art can also make various straightforward variations, such as equivalent forms or obvious modifications. All such variations shall fall within the protection scope of the present invention.

Claims (2)

1. A fully-automatic calibration method for a hand-eye robot based on the exponential product model, characterized by comprising the following steps:
a. Provide an industrial robot, an end-mounted binocular camera, a plane mirror and several beacons; mount the binocular camera on the robot end in a detachable manner, and paste beacons on the mirror surface of the plane mirror and at the tool reference points of the robot end; then go to step b.
b. Determine the initial position of the robot, and place the plane mirror carrying the beacons, at a suitable position, within the field of view of the end-mounted binocular camera; at the same time use the binocular camera to acquire one mirror image of the plane mirror, which contains both the mirror-surface beacon information and the beacon information of the end gripper reflected in the mirror and is used to calibrate the hand-eye relationship; then go to step c.
c. Starting from the end joint, control the joints to rotate one by one; during the rotation of each joint, use the end-mounted binocular camera to photograph the mirror-surface beacons once at every fixed angular interval, until they leave the camera's field of view. Suppose that m_i images are obtained during the motion of joint i of the robot, i = 1, ..., n, where n is the number of robot joints; the Σ_{i=1}^{n} m_i images are then used to calibrate the camera and the robot joint parameters.
2. The fully-automatic calibration method for a hand-eye robot based on the exponential product model according to claim 1, characterized in that:
In said step a, the exponential product kinematic model of the robot is set as:

$$g_{WT}(\theta) = e^{\hat{\xi}_1\theta_1} e^{\hat{\xi}_2\theta_2} \cdots e^{\hat{\xi}_n\theta_n}$$

where g_WT(θ) is the transformation of the end-of-arm tool coordinate system T relative to the world coordinate system W when the joint angles are θ = (θ_1, θ_2, θ_3, ..., θ_n), and e^{ξ̂_i θ_i} is the matrix exponential of the motion twist ξ̂_i θ_i;
In said step b, the hand-eye transformation to be calibrated is A_ch, obtained as A_ch = A_oc^{-1} B A_oc A_ch', where B = diag(1, 1, -1, 1) is the mirror reflection matrix and A_oc and A_ch' are known data;
In said step c, the points measured by the end binocular camera during the rotation of joint i are denoted X_i = (x_{i1}, x_{i2}, ..., x_{i m_i}), i = 1, ..., n, with x_{ij} ∈ R^{4×num} the three-dimensional homogeneous coordinates, m_i the number of camera measurements during the rotation of joint i, and num the number of measured spatial points; the joint parameters are solved from

$$x_{i,a} = e^{\hat{\xi}_{ij}\theta_{ij}} x_{i,b}, \quad i = 1, \ldots, n, \quad a, b \in \{1, \ldots, m_i\}$$

where x_{i,a} and x_{i,b} are the homogeneous coordinate values of any two groups of data in X_i, from which the twist parameters ξ_ij are obtained directly; since X_i contains m_i groups of measurement data and every pair of groups yields one solution, all pairwise combinations give C_{m_i}^2 solutions, which are averaged to obtain the final joint parameters:

$$\xi_i = \frac{1}{C_{m_i}^2} \sum_{j=1}^{C_{m_i}^2} \xi_{ij}$$

where ξ_i is the motion twist coordinate.
CN201310030955.7A 2013-01-28 2013-01-28 Fully-automatic calibration method for hand-eye robot based on exponential product model Active CN103115615B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201310030955.7A CN103115615B (en) 2013-01-28 2013-01-28 Fully-automatic calibration method for hand-eye robot based on exponential product model

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201310030955.7A CN103115615B (en) 2013-01-28 2013-01-28 Fully-automatic calibration method for hand-eye robot based on exponential product model

Publications (2)

Publication Number Publication Date
CN103115615A true CN103115615A (en) 2013-05-22
CN103115615B CN103115615B (en) 2015-01-21

Family

ID=48414035

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201310030955.7A Active CN103115615B (en) 2013-01-28 2013-01-28 Fully-automatic calibration method for hand-eye robot based on exponential product model

Country Status (1)

Country Link
CN (1) CN103115615B (en)

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9211643B1 (en) 2014-06-25 2015-12-15 Microsoft Technology Licensing, Llc Automatic in-situ registration and calibration of robotic arm/sensor/workspace system
CN105598957A (en) * 2016-01-27 2016-05-25 国机集团科学技术研究院有限公司 Industrial robot kinematic modelling method and system
CN106113035A (en) * 2016-06-16 2016-11-16 华中科技大学 A kind of Six-DOF industrial robot end-of-arm tooling coordinate system caliberating device and method
CN107369184A (en) * 2017-06-23 2017-11-21 中国科学院自动化研究所 Mix binocular industrial robot system's synchronization calibration system, method and other devices
CN110246193A (en) * 2019-06-20 2019-09-17 南京博蓝奇智能科技有限公司 Industrial robot end camera online calibration method
CN110370316A (en) * 2019-06-20 2019-10-25 重庆大学 It is a kind of based on the robot TCP scaling method vertically reflected
CN111829492A (en) * 2020-07-24 2020-10-27 中交第二航务工程局有限公司 Laser plummet application-based contact measurement method
CN112809668A (en) * 2020-12-30 2021-05-18 上海媒智科技有限公司 Method, system and terminal for automatic hand-eye calibration of mechanical arm
CN112907673A (en) * 2021-03-19 2021-06-04 深圳创维-Rgb电子有限公司 Positioning method, positioning device, terminal equipment and storage medium

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101096101A (en) * 2006-06-26 2008-01-02 北京航空航天大学 Robot foot-eye calibration method and device
US20090118864A1 (en) * 2007-11-01 2009-05-07 Bryce Eldridge Method and system for finding a tool center point for a robot using an external camera
CN101630409A (en) * 2009-08-17 2010-01-20 北京航空航天大学 Hand-eye vision calibration method for robot hole boring system
CN101733749A (en) * 2009-12-22 2010-06-16 哈尔滨工业大学 Multidomain uniform modeling and emulation system of space robot
CN102022989A (en) * 2010-09-29 2011-04-20 山东科技大学 Robot calibration method based on exponent product model
EP2402124A2 (en) * 2010-06-30 2012-01-04 Deutsches Zentrum für Luft- und Raumfahrt e.V. Method and device for determining structural parameters of a robot

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101096101A (en) * 2006-06-26 2008-01-02 北京航空航天大学 Robot foot-eye calibration method and device
US20090118864A1 (en) * 2007-11-01 2009-05-07 Bryce Eldridge Method and system for finding a tool center point for a robot using an external camera
CN101630409A (en) * 2009-08-17 2010-01-20 北京航空航天大学 Hand-eye vision calibration method for robot hole boring system
CN101733749A (en) * 2009-12-22 2010-06-16 哈尔滨工业大学 Multidomain uniform modeling and emulation system of space robot
EP2402124A2 (en) * 2010-06-30 2012-01-04 Deutsches Zentrum für Luft- und Raumfahrt e.V. Method and device for determining structural parameters of a robot
CN102022989A (en) * 2010-09-29 2011-04-20 山东科技大学 Robot calibration method based on exponent product model

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
WANG HAIXIA et al.: "A practical positioning method for hand-eye vision systems" (一种实用的手眼视觉系统定位方法), Computer Engineering and Applications (计算机工程与应用), vol. 43, no. 24, 31 December 2007 (2007-12-31), pages 235-238 *
WEI ZHENZHONG et al.: "Fast hand-eye calibration method for dual-robot systems" (双机器人系统的快速手眼标定方法), Optics and Precision Engineering (光学精密工程), vol. 19, no. 8, 31 August 2011 (2011-08-31), pages 1895-1902 *

Cited By (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9211643B1 (en) 2014-06-25 2015-12-15 Microsoft Technology Licensing, Llc Automatic in-situ registration and calibration of robotic arm/sensor/workspace system
US10052766B2 (en) 2014-06-25 2018-08-21 Microsoft Technology Licensing, Llc Automatic in-situ registration and calibration of robotic arm/sensor/workspace system
CN105598957A (en) * 2016-01-27 2016-05-25 国机集团科学技术研究院有限公司 Industrial robot kinematic modelling method and system
CN106113035A (en) * 2016-06-16 2016-11-16 华中科技大学 A kind of Six-DOF industrial robot end-of-arm tooling coordinate system caliberating device and method
CN107369184A (en) * 2017-06-23 2017-11-21 中国科学院自动化研究所 Mix binocular industrial robot system's synchronization calibration system, method and other devices
CN110370316A (en) * 2019-06-20 2019-10-25 重庆大学 It is a kind of based on the robot TCP scaling method vertically reflected
CN110246193A (en) * 2019-06-20 2019-09-17 南京博蓝奇智能科技有限公司 Industrial robot end camera online calibration method
CN110246193B (en) * 2019-06-20 2021-05-14 南京博蓝奇智能科技有限公司 Industrial robot end camera online calibration method
CN110370316B (en) * 2019-06-20 2021-12-10 重庆大学 Robot TCP calibration method based on vertical reflection
CN111829492A (en) * 2020-07-24 2020-10-27 中交第二航务工程局有限公司 Laser plummet application-based contact measurement method
CN111829492B (en) * 2020-07-24 2021-11-30 中交第二航务工程局有限公司 Laser plummet application-based contact measurement method
CN112809668A (en) * 2020-12-30 2021-05-18 上海媒智科技有限公司 Method, system and terminal for automatic hand-eye calibration of mechanical arm
CN112907673A (en) * 2021-03-19 2021-06-04 深圳创维-Rgb电子有限公司 Positioning method, positioning device, terminal equipment and storage medium
CN112907673B (en) * 2021-03-19 2021-10-22 深圳创维-Rgb电子有限公司 Positioning method, positioning device, terminal equipment and storage medium

Also Published As

Publication number Publication date
CN103115615B (en) 2015-01-21

Similar Documents

Publication Publication Date Title
CN103115615B (en) Fully-automatic calibration method for hand-eye robot based on exponential product model
CN106338245B (en) A kind of non-contact traverse measurement method of workpiece
CN104299261B (en) Three-dimensional imaging method and system for human body
CN105411490B (en) The real-time location method and mobile robot of mobile robot
CN101435704B (en) Star tracking method of star sensor under high dynamic state
CN103759669B (en) A kind of monocular vision measuring method of heavy parts
CN102692214B (en) Narrow space binocular vision measuring and positioning device and method
EP2993490A1 (en) Operating device, operating system, operating method, and program therefor
CN103591951B (en) A kind of indoor navigation system and method
CN101408422B (en) Traffic accident on-site mapper based on binocular tridimensional all-directional vision
CN106980368A (en) A kind of view-based access control model calculating and the virtual reality interactive device of Inertial Measurement Unit
CN102679961B (en) Portable four-camera three-dimensional photographic measurement system and method
CN105928505A (en) Determination method and apparatus for position and orientation of mobile robot
CN107093195A (en) A kind of locating mark points method that laser ranging is combined with binocular camera
CN105509733A (en) Measuring method for relative pose of non-cooperative spatial circular object
CN104155765A (en) Method and equipment for correcting three-dimensional image in tiled integral imaging display
CN104363438B (en) Full-view stereo making video method
CN108256430A (en) Obstacle information acquisition methods, device and robot
CN101329174A (en) Full field vision self-scanning measurement apparatus
CN108844543A (en) Indoor AGV navigation control method based on UWB positioning and dead reckoning
CN106027887B (en) For the method, apparatus and electronic equipment of the rifle ball linkage control of rotating mirror holder
CN106157322B (en) A kind of camera installation site scaling method based on plane mirror
CN108749601A (en) Electrical changing station, vehicle positioning method, apparatus and system
CN105115560A (en) Non-contact measurement method for cabin capacity
CN103110429A (en) Optical calibration method of ultrasonic probe

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
C53 Correction of patent of invention or patent application
CB03 Change of inventor or designer information

Inventor after: Wang Haixia

Inventor after: Lu Xiao

Inventor after: Li Yuxia

Inventor after: Fan Binghui

Inventor before: Wang Haixia

Inventor before: Lu Xiao

COR Change of bibliographic data

Free format text: CORRECT: INVENTOR; FROM: WANG HAIXIA LU XIAO TO: WANG HAIXIA LU XIAO LI YUXIA FAN BINGHUI

TR01 Transfer of patent right

Effective date of registration: 20220608

Address after: 266590 room 402, industrialization building, 579 qianwangang Road, Huangdao District, Qingdao City, Shandong Province

Patentee after: Qingdao Zhuo Xintong Intelligent Technology Co.,Ltd.

Address before: 266590 No. 579, Qian Wan Gang Road, Qingdao economic and Technological Development Zone, Shandong

Patentee before: SHANDONG University OF SCIENCE AND TECHNOLOGY

TR01 Transfer of patent right