CN105091744A - Pose detection apparatus and method based on visual sensor and laser range finder - Google Patents


Info

Publication number
CN105091744A
CN105091744A (application CN201510229371.1A; granted as CN105091744B)
Authority
CN
China
Prior art keywords
pose
laser range
deviation
current
theta
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201510229371.1A
Other languages
Chinese (zh)
Other versions
CN105091744B (en)
Inventor
卢金燕
王鹏
覃政科
熊召
任超
徐德
袁晓东
邹伟
刘长春
周海
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Institute of Automation of Chinese Academy of Science
Laser Fusion Research Center China Academy of Engineering Physics
Original Assignee
Institute of Automation of Chinese Academy of Science
Laser Fusion Research Center China Academy of Engineering Physics
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Institute of Automation of Chinese Academy of Science and Laser Fusion Research Center, China Academy of Engineering Physics
Priority to CN201510229371.1A
Publication of CN105091744A
Application granted
Publication of CN105091744B
Legal status: Active
Anticipated expiration


Landscapes

  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention discloses a pose detection apparatus and method based on a visual sensor and laser range finders. The apparatus uses one visual sensor and three laser range finders to jointly acquire information about a target. The method comprises: selecting feature points and feature lines from a target image, obtaining the image coordinates of the feature points and the angles of the feature lines, and from these the image deviation of the feature points and the angle deviation of the feature lines; calculating the current pixel equivalent, and obtaining the deviation of the current pose in three degrees of freedom from the current image deviation, the angle deviation, and the pixel equivalent; and obtaining the angle deviation of the current pose in the two corresponding rotational degrees of freedom, and its deviation in the depth direction, from the readings of the three laser range finders and their relative positions. In this way, six-degree-of-freedom pose detection of a target is realized.

Description

Pose detection apparatus and method based on a visual sensor and laser range finders
Technical field
The invention belongs to the field of sensor-based detection, and more particularly relates to a pose detection apparatus and method based on a visual sensor and laser range finders.
Background technology
Pose detection is an important step that is widely used in numerous fields such as robot control, aerospace docking, and automated guidance. Sensors are rich in variety and diverse in function, and are important tools for acquiring information about the external environment. Pose detection based on the information provided by sensors has attracted the attention of many scholars.
A visual sensor can perceive rich environmental information and is easy to install, making it one of the most commonly used external sensors. By processing the image information acquired by a visual sensor, the contour, shape, and color of a target can be obtained, and target motion detection, relative positioning, and so on can be realized. Obtaining target pose information from the image information provided by a visual sensor is a common pose detection approach. Such methods (see, e.g., Sang Joo Kwon, Haemm Jeong, and Jaewoong Hwang. Kalman Filter-Based Coarse-to-Fine Control for Display Visual Alignment Systems [J]. IEEE Transactions on Automation Science and Engineering, 2012, 9(3): 621-628; Biao Zhang, Jianjun Wang, Gregory Rossano, Carlos Martinez and Kock. Vision-guided Robot Alignment for Scalable, Flexible Assembly Automation [C]. IEEE International Conference on Robotics and Biomimetics, Phuket, Thailand, 2011: 944-951.) are generally based on monocular or binocular vision: key features are first extracted from the visual image, and then, combined with vision measurement techniques and through multiple coordinate transformations, the position and attitude of the target are obtained, realizing target pose detection. Such methods use only a visual sensor; the sensor configuration is convenient and the detection process is simple and efficient. However, they require the target to be completely visible throughout the detection process, and are therefore only applicable to pose detection of small-size targets.
When sufficient target information cannot be obtained, for example when the target is large or some visual features are invisible, target detection must be completed with the help of other sensors. At present, common approaches mainly use a laser tracker (see, e.g., Zhi Liu, Ying Xie, Jing Xu, Ken Chen. Laser tracker based robotic assembly system for large scale peg-hole parts [C]. IEEE International Conference on Cyber Technology in Automation, Control and Intelligent Systems, Hong Kong, China, 2014: 574-578.) or laser range finders (see, e.g., Young-Keun Kim, Yonghun Kim, Kyung-Soo Kim, Yun Sub Jung. Developing a Robust Sensing System for Remote Relative 6-DOF Motion Using 1-D Laser Sensors [C]. IEEE International Systems Conference, Vancouver, Canada, 2012.) to realize pose detection of large-size targets. A laser tracker is well suited to pose measurement of large-size parts, but it is expensive. A laser range finder offers high measuring accuracy, stable performance, strong anti-interference capability, and a small volume that is convenient to install. In existing methods that combine laser range finders and a visual sensor, the visual sensor is mounted on the target object, and the laser range finders must first be manually adjusted during detection so that they hit designated positions on the target object; the detection process of such methods is cumbersome and the detection accuracy is low.
Summary of the invention
Against this background, the main purpose of the present invention is to provide a pose detection apparatus and method that combine the advantages of a visual sensor and laser range finders.
To achieve the above object, as one aspect of the present invention, a pose detection method based on a visual sensor and laser range finders is provided, comprising the following steps:
Step S0: jointly acquire target information using one visual sensor and three laser range finders;
Step S1: select, from the target image acquired by the visual sensor, feature points and feature lines to which the visual sensor is sensitive; obtain offline the desired image coordinates of the feature points and the desired angles of the feature lines; detect the feature points and feature lines online to obtain their current image coordinates and current angles; compare the current values with the desired values to obtain the image deviation between the current and desired feature points, and the angle deviation between the current and desired feature lines;
Step S2: obtain the current pixel equivalent according to the online-detected features of the target image and the physical size of the target;
Step S3: obtain the deviation between the current pose and the target pose in the three degrees of freedom to which the visual sensor is sensitive, according to the image deviation and the current pixel equivalent;
Step S4: according to the readings of the three laser range finders and their relative positions, obtain offline the desired normal vector of the plane formed by the three laser range finders, and obtain the current plane normal vector online; project the current and desired normal vectors onto the planes corresponding to the two rotational degrees of freedom to which vision is insensitive, and obtain, from the projected components, the angular deviation between the current pose and the desired pose in the corresponding degrees of freedom;
Step S5: obtain the deviation between the current pose and the desired pose in the depth direction according to the readings of the three laser range finders;
Step S6: combine the three-degree-of-freedom deviation obtained from the target image, the three-degree-of-freedom deviation obtained from the laser range finder readings, and the current pose to obtain the six-degree-of-freedom information of the target, realizing target pose detection.
The visual sensor is mounted at a position convenient for acquiring images of the target object, and the three laser range finders are arranged in an isosceles triangle.
The current image deviation of the feature points and the current angle deviation of the feature lines in step S1 are calculated by:
Δu = u_d − u, Δv = v_d − v, θ_z = θ_d − θ;
where Δu and Δv are the image coordinate deviations of the feature point in the U and V directions respectively, θ_z is the angle deviation of the feature line, (u_d, v_d) are the desired image coordinates of the feature point, (u, v) are its current image coordinates, θ_d is the desired angle of the feature line, and θ is its current angle.
The current pixel equivalent in step S2 is calculated by:
t_s = S / S_0;
where t_s is the current pixel equivalent, S is the size of the target in the image, and S_0 is the physical size of the target.
The deviation between the current pose and the target pose in the three degrees of freedom to which the visual sensor is sensitive, described in step S3, is expressed as:
[Δx, Δy, Δθ_z]^T = [t_s·Δu, t_s·Δv, θ_z]^T;
where Δx and Δy are the position deviations between the current pose and the desired pose in the X and Y directions respectively, and Δθ_z is the rotation angle deviation between the current pose and the desired pose about the optical axis.
The deviation between the current pose and the target pose in the two degrees of freedom to which the visual sensor is insensitive, described in step S4, is expressed as:
cos(Δθ_x) = (V_qyoz · V_yoz) / (|V_qyoz| · |V_yoz|), cos(Δθ_y) = (V_qzox · V_zox) / (|V_qzox| · |V_zox|);
where Δθ_x and Δθ_y are the rotation angle deviations between the current pose and the target pose about the X and Y axes respectively; V_qyoz and V_qzox are the projections of the desired laser-plane normal vector V_q onto the YOZ and ZOX planes, and V_yoz and V_zox are the projections of the current laser-plane normal vector V onto the YOZ and ZOX planes.
The position deviation between the current pose and the target pose in the depth direction, to which the visual sensor is insensitive, described in step S5, is expressed as:
Δz = (d_1 + d_2 + d_3) / 3;
where Δz is the position deviation between the current pose and the target pose in the Z direction, and d_i (i = 1, 2, 3) are the readings of the three laser range finders.
The target pose is expressed as:
[x, y, z, θ_x, θ_y, θ_z]^T = [x_0, y_0, z_0, θ_x0, θ_y0, θ_z0]^T + [Δx, Δy, Δz, Δθ_x, Δθ_y, Δθ_z]^T;
where [x, y, z, θ_x, θ_y, θ_z] is the target pose obtained by detection with the visual sensor and laser range finders, [x_0, y_0, z_0, θ_x0, θ_y0, θ_z0] is the current pose, and [Δx, Δy, Δz, Δθ_x, Δθ_y, Δθ_z] is the deviation between the current pose and the target pose. [Δx, Δy, Δθ_z] is the pose deviation obtained from the visual image, since the visual sensor is sensitive to pose changes in these three degrees of freedom; [Δz, Δθ_x, Δθ_y] is the pose deviation obtained from the laser range finder measurements, which supply the pose information outside the three vision-sensitive degrees of freedom.
As another aspect of the present invention, a pose detection apparatus based on a visual sensor and laser range finders is also provided, comprising one visual sensor, three laser range finders, and a control module, wherein the visual sensor is mounted at a position convenient for acquiring images of the target object, and the three laser range finders are arranged in an isosceles triangle;
the control module performs the pose detection method based on a visual sensor and laser range finders according to any one of claims 1 to 8, controls the inputs of the visual sensor and the laser range finders, and calculates the six-degree-of-freedom information of the target, realizing target pose detection.
The visual sensor is, for example, an AVT GC1600H, and the laser range finders are, for example, CASTALPT50220S.
Based on the above technical solution, the present invention has the following beneficial effects. Traditional pose detection methods based on visual images require the target to be completely visible throughout the detection process and are only applicable to pose detection of small-size targets; in scenes where sufficient target image information cannot be obtained, for example because the target is large or some visual features are invisible, it is difficult for them to accurately detect the pose of the target. The present invention addresses scenes where the visual sensor cannot obtain a sufficient target image: target information is jointly acquired by the visual sensor and the laser range finders, the pose information of the three vision-sensitive degrees of freedom is obtained from the visual image, the pose information of the remaining three degrees of freedom is obtained from the laser distance information, and the detection results of the two kinds of sensors are fused to realize target pose detection. The invention can therefore obtain the six-degree-of-freedom pose of the target from visual image information and laser distance information even when a sufficient target image cannot be obtained. A large amount of trial data also verifies the validity of the invention. It can thus be seen that, through carefully arranged visual and laser sensors, the invention solves the problem that sufficient target information is difficult to obtain with a visual sensor alone; based on the information provided by the visual sensor and the laser sensors, six-degree-of-freedom target pose detection is achieved with high accuracy, good stability, and good real-time performance.
Brief description of the drawings
Fig. 1 is the sensor layout of the pose detection apparatus based on a visual sensor and laser range finders of the present invention, in which the dark-gray squares represent the laser range finders and the dark-gray circle represents the visual sensor;
Fig. 2 is the flow chart of the pose detection method based on a visual sensor and laser range finders of the present invention;
Fig. 3 shows the detection results of the sensors of the pose detection apparatus based on a visual sensor and laser range finders of the present invention.
Embodiments
Embodiments of the invention are described in detail below with reference to the accompanying drawings. The present embodiment is implemented on the premise of the technical solution of the present invention, together with a detailed implementation and concrete operating process, but the protection scope of the present invention is not limited to the following embodiment.
The invention discloses a pose detection apparatus and method based on a visual sensor and laser range finders. The apparatus uses several carefully arranged sensors to jointly collect target information, processes the information provided by the different types of sensors separately, and fuses the results to realize six-degree-of-freedom target pose detection.
More specifically, as a preferred embodiment of the present invention, Fig. 1 illustrates the sensor layout of the pose detection apparatus based on a visual sensor and laser range finders. Four sensors are used in the apparatus: one visual sensor and three laser range finders. The visual sensor is mounted at a position convenient for acquiring images of the target object, and the three laser range finders are arranged in an isosceles triangle. For a square target, the sensor bracket is also square: the three laser range finders are mounted at the midpoints of the left, right, and bottom sides of the square bracket respectively, and the visual sensor is mounted at the top of the square bracket. The visual sensor is, for example, an AVT GC1600H, and the laser range finders are, for example, CASTALPT50220S.
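A quick geometric check of this layout can be sketched as follows (the unit-square coordinates are illustrative, chosen by us and not taken from the patent):

```python
# Rangefinders at the midpoints of the left, right and bottom edges of a
# unit-square bracket (illustrative coordinates).
left, right, bottom = (0.0, 0.5), (1.0, 0.5), (0.5, 0.0)

def dist(p, q):
    """Euclidean distance between two 2-D points."""
    return ((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2) ** 0.5

# The two slanted sides are equal, so the three laser spots do form an
# isosceles triangle, as the layout requires.
is_isosceles = abs(dist(left, bottom) - dist(right, bottom)) < 1e-12
```

Any bracket that is symmetric about its vertical centerline yields the same property, so the unit square is only a convenient special case.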
Fig. 2 shows the flow chart of the pose detection method based on a visual sensor and laser range finders of the present invention. During detection, the image deviation between the current and desired features is obtained from the visual image, from which the deviation between the current pose and the desired pose in the three vision-sensitive degrees of freedom is obtained; the position deviation between the current pose and the desired pose in the depth direction, and the angular deviation between the current and desired normal vectors, are obtained from the laser distance information, giving the deviation between the current pose and the desired pose in the three vision-insensitive degrees of freedom; the visual image results and the laser distance results are then fused to realize target pose detection. The method comprises the following steps:
Step 1: select, from the target image, feature points and feature lines to which the visual sensor is sensitive; obtain offline the desired image coordinates of the feature points and the desired angles of the feature lines; detect the feature points and feature lines online to obtain their current image coordinates and current angles; compare the current values with the desired values to obtain the image deviation between the current and desired feature points, and the angle deviation between the current and desired feature lines;
Step 2: obtain the current pixel equivalent according to the online-detected image features and the physical size of the target;
Step 3: obtain the deviation between the current pose and the target pose in the three vision-sensitive degrees of freedom according to the current image deviation and the pixel equivalent;
Step 4: according to the readings of the three laser range finders and their relative positions, obtain offline the desired normal vector of the plane formed by the three laser range finders, and obtain the current plane normal vector online; project the current and desired normal vectors onto the planes corresponding to the two vision-insensitive rotational degrees of freedom, and obtain, from the projected components, the angular deviation between the current pose and the desired pose in the corresponding degrees of freedom;
Step 5: obtain the deviation between the current pose and the desired pose in the depth direction according to the readings of the three laser range finders;
Step 6: combine the three-degree-of-freedom deviation obtained from the visual image, the three-degree-of-freedom deviation obtained from the laser distance information, and the current pose to obtain the six-degree-of-freedom information of the target, realizing target pose detection.
The first step proceeds as follows.
Key features are extracted from the target image acquired by the visual sensor, and the current image deviation of the feature points and the current angle deviation of the feature lines are obtained as:
Δu = u_d − u, Δv = v_d − v, θ_z = θ_d − θ
where Δu and Δv are the image coordinate deviations of the feature point in the U and V directions respectively, θ_z is the angle deviation of the feature line, (u_d, v_d) are the desired image coordinates of the feature point, (u, v) are its current image coordinates, θ_d is the desired angle of the feature line, and θ is its current angle.
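The first step can be sketched in Python (a hedged illustration; the function name and argument order are ours, not from the patent):

```python
def image_deviation(u_d, v_d, theta_d, u, v, theta):
    """Step 1: deviation between desired and current image features.

    (u_d, v_d): desired feature-point image coordinates (pixels)
    theta_d:    desired feature-line angle (radians)
    (u, v), theta: current values detected online
    """
    du = u_d - u               # image deviation along U
    dv = v_d - v               # image deviation along V
    theta_z = theta_d - theta  # feature-line angle deviation
    return du, dv, theta_z
```

For example, `image_deviation(320.0, 240.0, 0.5, 310.0, 235.0, 0.2)` yields a 10-pixel deviation in U, 5 pixels in V, and a 0.3 rad line-angle deviation.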
The second step proceeds as follows.
According to the size of the target in the image and the physical size of the target, the current pixel equivalent is obtained as:
t_s = S / S_0
where t_s is the current pixel equivalent, S is the size of the target in the image, and S_0 is the physical size of the target.
The third step proceeds as follows.
According to the image deviation and angle deviation obtained in the first step, and the pixel equivalent obtained in the second step, the deviation between the current pose and the target pose in the three vision-sensitive degrees of freedom is obtained as:
[Δx, Δy, Δθ_z]^T = [t_s·Δu, t_s·Δv, θ_z]^T
where Δx and Δy are the position deviations between the current pose and the desired pose in the X and Y directions respectively, and Δθ_z is the rotation angle deviation between the current pose and the desired pose about the optical axis.
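Steps 2 and 3 together can be sketched as follows (a hedged illustration following the patent's convention t_s = S/S_0; the function name is ours):

```python
def vision_dof_deviation(S_img, S_obj, du, dv, theta_z):
    """Steps 2-3: pixel equivalent, then the three vision-sensitive DOF.

    S_img: target size in the image; S_obj: physical target size
    (du, dv, theta_z): image and line-angle deviations from Step 1
    """
    t_s = S_img / S_obj     # current pixel equivalent, t_s = S / S_0
    dx = t_s * du           # position deviation along X
    dy = t_s * dv           # position deviation along Y
    return dx, dy, theta_z  # rotation about the optical axis passes through
```

With `S_img = 200`, `S_obj = 100` the pixel equivalent is 2, so a (10, 5)-pixel image deviation maps to a (20, 10) position deviation while the line-angle deviation is used directly as Δθ_z.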
The fourth step proceeds as follows.
Treating each laser range finder spot as a point, the three points define a plane. Denoting the three points A, B, and C, the normal vector V of the plane they form is:
V = AB⃗ × AC⃗
According to the measured values of the three laser range finders and the positional relationship between them, the desired plane normal vector V_q is obtained offline, and the current plane normal vector V is computed online. V and V_q are projected onto the YOZ and ZOX planes: the angle between the two projections on the YOZ plane is the angular deviation between the current pose and the desired pose about the X axis, and the angle between the two projections on the ZOX plane is the angular deviation about the Y axis. The angular deviations are therefore obtained as:
cos(Δθ_x) = (V_qyoz · V_yoz) / (|V_qyoz| · |V_yoz|), cos(Δθ_y) = (V_qzox · V_zox) / (|V_qzox| · |V_zox|)
where Δθ_x and Δθ_y are the rotation angle deviations between the current pose and the target pose about the X and Y axes respectively; V_qyoz and V_qzox are the projections of the desired laser-plane normal vector V_q onto the YOZ and ZOX planes, and V_yoz and V_zox are the projections of the current laser-plane normal vector V onto the YOZ and ZOX planes.
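The fourth step can be sketched as follows (a hedged illustration; function names and the tuple representation of vectors are our own choices):

```python
import math

def plane_normal(A, B, C):
    """Step 4: normal V = AB x AC of the plane through the three
    laser-spot points A, B, C, each an (x, y, z) tuple."""
    ab = [b - a for a, b in zip(A, B)]
    ac = [c - a for a, c in zip(A, C)]
    return (ab[1] * ac[2] - ab[2] * ac[1],
            ab[2] * ac[0] - ab[0] * ac[2],
            ab[0] * ac[1] - ab[1] * ac[0])

def projected_angle(p, q):
    """Angle between two 2-D projections via cos = p.q / (|p||q|)."""
    dot = p[0] * q[0] + p[1] * q[1]
    norm = math.hypot(p[0], p[1]) * math.hypot(q[0], q[1])
    return math.acos(max(-1.0, min(1.0, dot / norm)))

def rotation_deviation(V_q, V):
    """Project V_q and V onto the YOZ plane (drop x) and the ZOX plane
    (drop y); return (d_theta_x, d_theta_y) in radians."""
    d_theta_x = projected_angle((V_q[1], V_q[2]), (V[1], V[2]))
    d_theta_y = projected_angle((V_q[2], V_q[0]), (V[2], V[0]))
    return d_theta_x, d_theta_y
```

For instance, with a desired normal (0, 0, 1) and a current normal (0, 1, 1), the YOZ projections differ by 45 degrees (a rotation about X) while the ZOX projections coincide, so only Δθ_x is nonzero.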
The fifth step proceeds as follows.
According to the measured values of the three laser range finders, the position deviation between the current pose and the target pose in the depth direction, to which the visual sensor is insensitive, is obtained as:
Δz = (d_1 + d_2 + d_3) / 3
where Δz is the position deviation between the current pose and the target pose in the Z direction, and d_i (i = 1, 2, 3) are the measured values of the three laser range finders.
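The fifth step is a simple average of the three readings; as a minimal sketch (the function name is ours):

```python
def depth_deviation(d1, d2, d3):
    """Step 5: depth deviation as the mean of the three laser range
    finder readings d1, d2, d3."""
    return (d1 + d2 + d3) / 3.0
```

Averaging the three readings makes the depth estimate insensitive to small tilts of the target plane, since the rangefinders are arranged symmetrically.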
The sixth step proceeds as follows.
The target pose is expressed as:
[x, y, z, θ_x, θ_y, θ_z]^T = [x_0, y_0, z_0, θ_x0, θ_y0, θ_z0]^T + [Δx, Δy, Δz, Δθ_x, Δθ_y, Δθ_z]^T
where [x, y, z, θ_x, θ_y, θ_z] is the target pose obtained by detection with the visual sensor and laser range finders, [x_0, y_0, z_0, θ_x0, θ_y0, θ_z0] is the current pose, and [Δx, Δy, Δz, Δθ_x, Δθ_y, Δθ_z] is the deviation between the current pose and the target pose. [Δx, Δy, Δθ_z] is the pose deviation obtained from the visual image, since the visual sensor is sensitive to pose changes in these three degrees of freedom; [Δz, Δθ_x, Δθ_y] is the pose deviation obtained from the laser range finder measurements, which supply the pose information outside the three vision-sensitive degrees of freedom.
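The fusion in the sixth step can be sketched as follows (a hedged illustration; the function name and the ordering of the deviation tuples are ours):

```python
def fuse_pose(current_pose, visual_dev, laser_dev):
    """Step 6: target pose = current pose + fused six-DOF deviation.

    current_pose: [x0, y0, z0, theta_x0, theta_y0, theta_z0]
    visual_dev:   (dx, dy, d_theta_z) from the image (Steps 1-3)
    laser_dev:    (dz, d_theta_x, d_theta_y) from the rangefinders (Steps 4-5)
    """
    dx, dy, dtz = visual_dev
    dz, dtx, dty = laser_dev
    deltas = [dx, dy, dz, dtx, dty, dtz]  # [Δx, Δy, Δz, Δθx, Δθy, Δθz]
    return [c + d for c, d in zip(current_pose, deltas)]
```

Note that each sensor contributes only the degrees of freedom it measures well: the image supplies Δx, Δy, Δθ_z and the rangefinders supply Δz, Δθ_x, Δθ_y, so the two deviation triples interleave into one six-element correction.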
To verify the method of the present invention, pose detection was performed on a target. Fig. 3 shows the sensor detection results: (a) is the target image result, and (b) shows the measured values of the three laser range finders. As can be seen from Fig. 3, the method detects the target image accurately, and the measured values of the three laser range finders are essentially consistent; at this point the pose deviation between the detection apparatus and the target has been essentially eliminated, and the pose information of the target can be obtained from the current pose of the apparatus.
The specific embodiments described above further explain the object, technical solution, and beneficial effects of the present invention. It should be understood that the foregoing are only specific embodiments of the present invention and do not limit it; any modification, equivalent replacement, improvement, etc. made within the spirit and principles of the present invention shall be included within its protection scope.

Claims (10)

1. A pose detection method based on a visual sensor and laser range finders, comprising the following steps:
Step S0: jointly acquire target information using one visual sensor and three laser range finders;
Step S1: select, from the target image acquired by the visual sensor, feature points and feature lines to which the visual sensor is sensitive; obtain offline the desired image coordinates of the feature points and the desired angles of the feature lines; detect the feature points and feature lines online to obtain their current image coordinates and current angles; compare the current values with the desired values to obtain the image deviation between the current and desired feature points, and the angle deviation between the current and desired feature lines;
Step S2: obtain the current pixel equivalent according to the online-detected features of the target image and the physical size of the target;
Step S3: obtain the deviation between the current pose and the target pose in the three degrees of freedom to which the visual sensor is sensitive, according to the image deviation and the current pixel equivalent;
Step S4: according to the readings of the three laser range finders and their relative positions, obtain offline the desired normal vector of the plane formed by the three laser range finders, and obtain the current plane normal vector online; project the current and desired normal vectors onto the planes corresponding to the two rotational degrees of freedom to which vision is insensitive, and obtain, from the projected components, the angular deviation between the current pose and the desired pose in the corresponding degrees of freedom;
Step S5: obtain the deviation between the current pose and the desired pose in the depth direction according to the readings of the three laser range finders;
Step S6: combine the three-degree-of-freedom deviation obtained from the target image, the three-degree-of-freedom deviation obtained from the laser range finder readings, and the current pose to obtain the six-degree-of-freedom information of the target, realizing target pose detection.
2. The pose detection method based on a visual sensor and laser range finders according to claim 1, wherein the visual sensor is mounted at a position convenient for acquiring images of the target object, and the three laser range finders are arranged in an isosceles triangle.
3. The pose detection method based on a visual sensor and laser range finders according to claim 1, wherein the current image deviation of the feature points and the current angle deviation of the feature lines in step S1 are calculated by:
Δu = u_d − u, Δv = v_d − v, θ_z = θ_d − θ;
where Δu and Δv are the image coordinate deviations of the feature point in the U and V directions respectively, θ_z is the angle deviation of the feature line, (u_d, v_d) are the desired image coordinates of the feature point, (u, v) are its current image coordinates, θ_d is the desired angle of the feature line, and θ is its current angle.
4. The pose detection method based on a visual sensor and laser range finders according to claim 1, wherein the current pixel equivalent in step S2 is calculated by:
t_s = S / S_0;
where t_s is the current pixel equivalent, S is the size of the target in the image, and S_0 is the physical size of the target.
5. The pose detection method based on a visual sensor and laser range finders according to claim 1, wherein the deviation between the current pose and the target pose in the three degrees of freedom to which the visual sensor is sensitive, described in step S3, is expressed as:
[Δx, Δy, Δθ_z]^T = [t_s·Δu, t_s·Δv, θ_z]^T;
where Δx and Δy are the position deviations between the current pose and the desired pose in the X and Y directions respectively, and Δθ_z is the rotation angle deviation between the current pose and the desired pose about the optical axis.
6. The pose detection method based on a visual sensor and laser range finders according to claim 1, wherein the deviation between the current pose and the target pose in the two degrees of freedom to which the visual sensor is insensitive, described in step S4, is expressed as:
cos(Δθ_x) = (V_qyoz · V_yoz) / (|V_qyoz| · |V_yoz|), cos(Δθ_y) = (V_qzox · V_zox) / (|V_qzox| · |V_zox|);
where Δθ_x and Δθ_y are the rotation angle deviations between the current pose and the target pose about the X and Y axes respectively; V_qyoz and V_qzox are the projections of the desired laser-plane normal vector V_q onto the YOZ and ZOX planes, and V_yoz and V_zox are the projections of the current laser-plane normal vector V onto the YOZ and ZOX planes.
7. The pose detection method based on a visual sensor and laser range finders according to claim 1, wherein the position deviation between the current pose and the target pose in the depth direction, to which the visual sensor is insensitive, described in step S5, is expressed as:
Δz = (d_1 + d_2 + d_3) / 3;
where Δz is the position deviation between the current pose and the target pose in the Z direction, and d_i (i = 1, 2, 3) are the readings of the three laser range finders.
8. the position and posture detection method of view-based access control model sensor according to claim 1 and laser range finder, wherein said object pose is expressed as follows:
x y z θ x θ y θ z = x 0 y 0 z 0 θ x 0 θ y 0 θ z 0 + Δx Δy Δz Δθ x Δθ y Δ θ z ;
wherein [x, y, z, θ_x, θ_y, θ_z] denotes the target pose obtained by detection with the vision sensor and the laser rangefinders, [x_0, y_0, z_0, θ_x0, θ_y0, θ_z0] denotes the current pose, and [Δx, Δy, Δz, Δθ_x, Δθ_y, Δθ_z] denotes the deviation between the current pose and the target pose. [Δx, Δy, Δθ_z] is the pose deviation obtained from the visual image, covering the three degrees of freedom to which the vision sensor is sensitive; [Δz, Δθ_x, Δθ_y] is the pose deviation obtained from the laser rangefinder measurements, supplying the pose information outside those three degrees of freedom.
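Claims 7 and 8 combine into a single pose update, sketched below (not part of the claim text; function and argument names are illustrative, and the rangefinder readings are assumed to be already expressed as deviations from the target depth):

```python
import numpy as np

def target_pose(current_pose, rangefinder_readings,
                dx, dy, dtheta_x, dtheta_y, dtheta_z):
    """Compose the target pose from the current pose and measured
    deviations (claims 7 and 8 sketch).

    current_pose        : [x0, y0, z0, theta_x0, theta_y0, theta_z0]
    rangefinder_readings: readings d1, d2, d3 of the three rangefinders;
                          their mean gives the depth deviation dz (claim 7)
    """
    dz = sum(rangefinder_readings) / 3.0
    deviation = np.array([dx, dy, dz, dtheta_x, dtheta_y, dtheta_z])
    return np.asarray(current_pose, dtype=float) + deviation
```

The averaging over three rangefinders arranged in an isosceles triangle (claim 9) gives a single depth estimate while the same three readings also determine the laser-plane normal used in claim 6.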
9. A pose detection apparatus based on a vision sensor and laser rangefinders, comprising a vision sensor, three laser rangefinders and a control unit, wherein the vision sensor is mounted at a position convenient for acquiring images of the target object, the three laser rangefinders are arranged in an isosceles triangle, and
the control unit takes the outputs of the vision sensor and the laser rangefinders as input, performs the pose detection method based on a vision sensor and laser rangefinders according to any one of claims 1 to 8, and computes the six-degree-of-freedom information of the target, thereby realizing pose detection of the target.
10. The pose detection apparatus based on a vision sensor and laser rangefinders according to claim 9, wherein the vision sensor is an AVTGC1600H and the laser rangefinders are CASTALPT50220S.
CN201510229371.1A 2015-05-07 2015-05-07 Pose detection apparatus and method based on a vision sensor and laser rangefinders Active CN105091744B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510229371.1A CN105091744B (en) 2015-05-07 2015-05-07 Pose detection apparatus and method based on a vision sensor and laser rangefinders

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201510229371.1A CN105091744B (en) 2015-05-07 2015-05-07 Pose detection apparatus and method based on a vision sensor and laser rangefinders

Publications (2)

Publication Number Publication Date
CN105091744A true CN105091744A (en) 2015-11-25
CN105091744B CN105091744B (en) 2018-06-26

Family

ID=54572707

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510229371.1A Active CN105091744B (en) Pose detection apparatus and method based on a vision sensor and laser rangefinders

Country Status (1)

Country Link
CN (1) CN105091744B (en)

Cited By (24)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106885514A (en) * 2017-02-28 2017-06-23 西南科技大学 A kind of Deep Water Drilling Riser automatic butt position and posture detection method based on machine vision
CN106944818A (en) * 2017-05-08 2017-07-14 成都锦江电子系统工程有限公司 Large Radar Antenna six degree of freedom automatic butt piece-rate system and method
CN108089196A (en) * 2017-12-14 2018-05-29 中国科学院光电技术研究所 The noncooperative target pose measuring apparatus that a kind of optics master is passively merged
CN108177143A (en) * 2017-12-05 2018-06-19 上海工程技术大学 A kind of robot localization grasping means and system based on laser vision guiding
CN108680926A (en) * 2018-04-11 2018-10-19 北京特种机械研究所 Double tabletop relative pose measuring system and method in plane
CN108839027A (en) * 2018-08-31 2018-11-20 河南工程学院 Robot based on laser range sensor is automatically aligned to control method
CN108927807A (en) * 2018-08-14 2018-12-04 河南工程学院 A kind of robot vision control method based on point feature
CN108961254A (en) * 2018-06-26 2018-12-07 天津城建大学 A kind of lower corrective exercise detection method of laser vision fusion
CN109146957A (en) * 2018-08-14 2019-01-04 河南工程学院 A kind of robot vision control method based on triangle character
CN109212497A (en) * 2018-10-30 2019-01-15 哈尔滨工业大学 A kind of measurement of space six degree of freedom vehicle radar antenna pose deviation and interconnection method
CN109541626A (en) * 2018-12-12 2019-03-29 华南农业大学 Objective plane normal direction amount detecting device and detection method
CN109814548A (en) * 2018-12-29 2019-05-28 广州蓝海机器人系统有限公司 A kind of air navigation aid and AGV based on indoor microwave base station
CN109815911A (en) * 2019-01-26 2019-05-28 上海交通大学 Video moving object detection system, method and terminal based on depth integration network
CN110370286A (en) * 2019-08-13 2019-10-25 西北工业大学 Dead axle motion rigid body spatial position recognition methods based on industrial robot and monocular camera
CN110455277A (en) * 2019-08-19 2019-11-15 哈尔滨工业大学 High-precision attitude measuring device and method based on internet of things data fusion
CN111583239A (en) * 2020-05-09 2020-08-25 中南大学 Honeycomb structure geometric regularity image recognition method and system
CN112548554A (en) * 2020-12-19 2021-03-26 北京工业大学 Robot bolt tightening system integrating multi-sensing distance measurement
WO2021115331A1 (en) * 2019-12-13 2021-06-17 深圳市瑞立视多媒体科技有限公司 Triangulation-based coordinate positioning method, apparatus, and device and storage medium
WO2021120911A1 (en) * 2019-12-17 2021-06-24 中兴通讯股份有限公司 Three-dimensional coordinate calibration method for plate-like workpiece
WO2021184859A1 (en) * 2020-03-19 2021-09-23 智美康民(珠海)健康科技有限公司 Tool head posture adjustment method and apparatus, and readable storage medium
CN113516013A (en) * 2021-04-09 2021-10-19 阿波罗智联(北京)科技有限公司 Target detection method and device, electronic equipment, road side equipment and cloud control platform
CN113914880A (en) * 2021-09-01 2022-01-11 中铁九局集团电务工程有限公司 Inclination angle correctable tunnel punching method based on laser ranging and punching robot
CN116007655A (en) * 2022-12-05 2023-04-25 广州阿路比电子科技有限公司 Attitude sensor course angle testing system and method
CN116147521A (en) * 2023-04-18 2023-05-23 菲特(天津)检测技术有限公司 Non-contact workpiece size measuring device and method

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6542249B1 (en) * 1999-07-20 2003-04-01 The University Of Western Ontario Three-dimensional measurement method and apparatus
CN102927908A (en) * 2012-11-06 2013-02-13 中国科学院自动化研究所 Robot eye-on-hand system structured light plane parameter calibration device and method
CN104482924A (en) * 2014-12-11 2015-04-01 中国航天空气动力技术研究院 Revolution body object pose vision measurement method

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
CLIPP B, ET AL.: "Robust 6DOF Motion Estimation for Non-Overlapping, Multi-Camera Systems", 《IEEE WORKSHOP ON APPLICATION OF COMPUTER VISION》 *
KIM Y K,ET AL.: "Developing a robust sensing system for remote relative 6-DOF motion using 1-D laser sensors", 《PROCEEDINGS OF THE 2012 IEEE INTERNATIONAL CONFERENCE ON SYSTEMS》 *
KIM Y,ET AL.: "Structure optimization of 1-D laser sensors assembly for robust 6-DOF measurement", 《PROCEEDINGS OF THE 2012 IEEE INTERNATIONAL CONFERENCE ON SYSTEMS》 *

Cited By (37)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106885514A (en) * 2017-02-28 2017-06-23 西南科技大学 A kind of Deep Water Drilling Riser automatic butt position and posture detection method based on machine vision
CN106944818B (en) * 2017-05-08 2019-10-01 成都锦江电子系统工程有限公司 Large Radar Antenna six degree of freedom automatic butt separation system and method
CN106944818A (en) * 2017-05-08 2017-07-14 成都锦江电子系统工程有限公司 Large Radar Antenna six degree of freedom automatic butt piece-rate system and method
CN108177143A (en) * 2017-12-05 2018-06-19 上海工程技术大学 A kind of robot localization grasping means and system based on laser vision guiding
CN108177143B (en) * 2017-12-05 2021-08-10 上海工程技术大学 Robot positioning and grabbing method and system based on laser vision guidance
CN108089196A (en) * 2017-12-14 2018-05-29 中国科学院光电技术研究所 The noncooperative target pose measuring apparatus that a kind of optics master is passively merged
CN108089196B (en) * 2017-12-14 2021-11-19 中国科学院光电技术研究所 Optics is initiative and is fused non-cooperative target position appearance measuring device passively
CN108680926A (en) * 2018-04-11 2018-10-19 北京特种机械研究所 Double tabletop relative pose measuring system and method in plane
CN108680926B (en) * 2018-04-11 2022-03-25 北京特种机械研究所 In-plane double-platform relative pose measurement system
CN108961254A (en) * 2018-06-26 2018-12-07 天津城建大学 A kind of lower corrective exercise detection method of laser vision fusion
CN109146957B (en) * 2018-08-14 2020-09-25 河南工程学院 Robot vision control method based on triangular features
CN108927807A (en) * 2018-08-14 2018-12-04 河南工程学院 A kind of robot vision control method based on point feature
CN109146957A (en) * 2018-08-14 2019-01-04 河南工程学院 A kind of robot vision control method based on triangle character
CN108927807B (en) * 2018-08-14 2020-08-07 河南工程学院 Robot vision control method based on point characteristics
CN108839027B (en) * 2018-08-31 2020-12-25 河南工程学院 Robot automatic alignment control method based on laser ranging sensor
CN108839027A (en) * 2018-08-31 2018-11-20 河南工程学院 Robot based on laser range sensor is automatically aligned to control method
CN109212497A (en) * 2018-10-30 2019-01-15 哈尔滨工业大学 A kind of measurement of space six degree of freedom vehicle radar antenna pose deviation and interconnection method
CN109541626A (en) * 2018-12-12 2019-03-29 华南农业大学 Objective plane normal direction amount detecting device and detection method
CN109814548B (en) * 2018-12-29 2022-02-15 广州蓝海机器人系统有限公司 Navigation method based on indoor microwave base station and AGV
CN109814548A (en) * 2018-12-29 2019-05-28 广州蓝海机器人系统有限公司 A kind of air navigation aid and AGV based on indoor microwave base station
CN109815911A (en) * 2019-01-26 2019-05-28 上海交通大学 Video moving object detection system, method and terminal based on depth integration network
CN110370286A (en) * 2019-08-13 2019-10-25 西北工业大学 Dead axle motion rigid body spatial position recognition methods based on industrial robot and monocular camera
CN110370286B (en) * 2019-08-13 2022-04-12 西北工业大学 Method for identifying rigid body space position of dead axle motion based on industrial robot and monocular camera
CN110455277A (en) * 2019-08-19 2019-11-15 哈尔滨工业大学 High-precision attitude measuring device and method based on internet of things data fusion
WO2021115331A1 (en) * 2019-12-13 2021-06-17 深圳市瑞立视多媒体科技有限公司 Triangulation-based coordinate positioning method, apparatus, and device and storage medium
WO2021120911A1 (en) * 2019-12-17 2021-06-24 中兴通讯股份有限公司 Three-dimensional coordinate calibration method for plate-like workpiece
WO2021184859A1 (en) * 2020-03-19 2021-09-23 智美康民(珠海)健康科技有限公司 Tool head posture adjustment method and apparatus, and readable storage medium
CN111583239B (en) * 2020-05-09 2021-03-30 中南大学 Honeycomb structure geometric regularity image recognition method and system
CN111583239A (en) * 2020-05-09 2020-08-25 中南大学 Honeycomb structure geometric regularity image recognition method and system
CN112548554A (en) * 2020-12-19 2021-03-26 北京工业大学 Robot bolt tightening system integrating multi-sensing distance measurement
CN113516013A (en) * 2021-04-09 2021-10-19 阿波罗智联(北京)科技有限公司 Target detection method and device, electronic equipment, road side equipment and cloud control platform
CN113516013B (en) * 2021-04-09 2024-05-14 阿波罗智联(北京)科技有限公司 Target detection method, target detection device, electronic equipment, road side equipment and cloud control platform
CN113914880A (en) * 2021-09-01 2022-01-11 中铁九局集团电务工程有限公司 Inclination angle correctable tunnel punching method based on laser ranging and punching robot
CN113914880B (en) * 2021-09-01 2024-02-23 中铁九局集团电务工程有限公司 Tunnel punching method capable of correcting inclination angle based on laser ranging and punching robot
CN116007655A (en) * 2022-12-05 2023-04-25 广州阿路比电子科技有限公司 Attitude sensor course angle testing system and method
CN116007655B (en) * 2022-12-05 2023-09-01 广州阿路比电子科技有限公司 Attitude sensor course angle testing system and method
CN116147521A (en) * 2023-04-18 2023-05-23 菲特(天津)检测技术有限公司 Non-contact workpiece size measuring device and method

Also Published As

Publication number Publication date
CN105091744B (en) 2018-06-26

Similar Documents

Publication Publication Date Title
CN105091744A (en) Pose detection apparatus and method based on visual sensor and laser range finder
CN111156998B (en) Mobile robot positioning method based on RGB-D camera and IMU information fusion
CN105157725B (en) A kind of hand and eye calibrating method of two-dimensional laser visual sensor and robot
Hu et al. Extrinsic calibration of 2-D laser rangefinder and camera from single shot based on minimal solution
Li et al. An algorithm for extrinsic parameters calibration of a camera and a laser range finder using line features
CN102435188B (en) Monocular vision/inertia autonomous navigation method for indoor environment
CN102589530B (en) Method for measuring position and gesture of non-cooperative target based on fusion of two dimension camera and three dimension camera
US20120262455A1 (en) Three-dimensional measurement apparatus, model generation apparatus, processing method thereof, and non-transitory computer-readable storage medium
Tzschichholz et al. Relative pose estimation of satellites using PMD-/CCD-sensor data fusion
CN106157322B (en) A kind of camera installation site scaling method based on plane mirror
Xia et al. Global calibration of non-overlapping cameras: State of the art
CN206990800U (en) A kind of alignment system
CN114608554B (en) Handheld SLAM equipment and robot instant positioning and mapping method
Gao et al. Development and calibration of an accurate 6-degree-of-freedom measurement system with total station
Mashita et al. Calibration method for misaligned catadioptric camera
Liu et al. Precise pose and radius estimation of circular target based on binocular vision
Liu et al. A high-accuracy pose measurement system for robotic automated assembly in large-scale space
Cui et al. A measurement method of motion parameters in aircraft ground tests using computer vision
Bok et al. Extrinsic calibration of a camera and a 2D laser without overlap
Bikmaev et al. Improving the accuracy of supporting mobile objects with the use of the algorithm of complex processing of signals with a monocular camera and LiDAR
Jang et al. Metric localization using a single artificial landmark for indoor mobile robots
Chen et al. Low cost and efficient 3D indoor mapping using multiple consumer RGB-D cameras
Shao et al. Calibration method for a vision guiding-based laser-tracking measurement system
Shmatko et al. Estimation of rotation measurement error of objects using computer simulation
Kupervasser et al. Robust positioning of drones for land use monitoring in strong terrain relief using vision-based navigation

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant