CN103278138B - Method for measuring three-dimensional position and posture of thin component with complex structure - Google Patents

Method for measuring three-dimensional position and posture of thin component with complex structure Download PDF

Info

Publication number
CN103278138B
CN103278138B CN201310159555.6A
Authority
CN
China
Prior art keywords
binocular
dimensional
camera
sift
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201310159555.6A
Other languages
Chinese (zh)
Other versions
CN103278138A (en)
Inventor
贾立好
乔红
苏建华
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Institute of Automation of Chinese Academy of Science
Original Assignee
Institute of Automation of Chinese Academy of Science
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Institute of Automation of Chinese Academy of Science
Priority to CN201310159555.6A
Publication of CN103278138A
Application granted
Publication of CN103278138B
Legal status: Active


Landscapes

  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention discloses a method for measuring the three-dimensional position and posture of a thin component with a complex structure. The method comprises the following steps: establishing a mapping relationship between the three-dimensional world coordinate system and two-dimensional image coordinates in a binocular vision measurement system; extracting features from a binocular image of the thin component in a reference posture to build a prior feature library; extracting features from a binocular image of the thin component in the posture to be measured and matching them against the prior feature library to obtain a set of feature matching pairs equal in number to, and corresponding with, the prior feature library; obtaining the three-dimensional point cloud of the feature points; and calculating the main plane of the three-dimensional point cloud of the reference-posture thin component, calculating the main plane of the three-dimensional point cloud of the to-be-measured thin component, and computing the spatial position relationship between the two main planes to obtain the three-dimensional position and posture of the to-be-measured thin component. The measurement method facilitates accurate positioning of thin components in industrial-robot processes such as automated manufacturing, assembly and welding; it is simple to implement, highly accurate in positioning, low in cost and convenient to operate.

Description

Method for measuring three-dimensional position and posture of thin part with complex structure
Technical Field
The invention belongs to the field of visual measurement in information technology, and particularly relates to a visual measurement method for three-dimensional positions and postures of complex thin parts.
Background
The binocular vision-based measurement method can realize three-dimensional size measurement, positioning and online quality control of parts on an industrial production line. It is an important research field in current information technology and has already been applied to the measurement and positioning of common small-size parts. For the three-dimensional position and attitude measurement of large, structurally complex thin parts (such as the side walls and doors of an automobile body-in-white), however, no established application has yet been reported.
Disclosure of Invention
In order to solve the above problems, the present invention provides a method for measuring a three-dimensional position and posture of a thin component to be measured, comprising:
step 1: calibrating the internal parameters and the external parameters of the binocular camera and the spatial position relationship of the binocular camera, and establishing a mapping relationship between a three-dimensional world coordinate system and a two-dimensional image coordinate in a binocular vision measurement system;
step 2: extracting SIFT feature point pairs in the binocular image of the reference thin component by using an SIFT operator to generate an SIFT prior feature library, and using a three-dimensional plane where the SIFT feature point pairs are located as a reference plane where the reference thin component is located;
step 3: respectively extracting a plurality of SIFT feature point pairs from the binocular stereo-corrected image pair of the thin component to be measured, and matching the extracted SIFT feature point pairs with the SIFT feature point pairs in the SIFT prior feature library to obtain a set of SIFT feature point pairs equal in number to, and corresponding with, the SIFT feature point pairs in the SIFT prior feature library;
step 4: obtaining the three-dimensional point cloud of the thin component to be measured from the obtained image coordinates of the feature points in the SIFT feature point pair set, and hence the spatial plane of the thin component to be measured;
step 5: determining the three-dimensional position and attitude information of the thin component to be measured by solving the transformation matrix between the spatial plane and the reference plane.
The invention adopts a binocular vision measurement technique: (1) Select suitable cameras and lenses according to the application scene and field of view, and build a binocular vision system; calibrate the left and right cameras separately to obtain their internal parameters and lens distortion coefficients; obtain the external parameters of the left and right cameras with respect to a world coordinate system on the reference plane; then perform binocular stereo calibration to obtain the spatial position relationship between the left and right cameras. (2) Perform lens distortion correction and stereo correction on the collected binocular images in turn, obtaining a binocular distortion-corrected image pair and a binocular stereo-corrected image pair; extract SIFT features from the stereo-corrected image pair of the reference-pose thin component, manually select the SIFT features at specific positions of the thin component (such as positioning holes and positioning bolts), and obtain a set of binocular feature matching pairs that serves as the SIFT prior feature library; extract SIFT features from the stereo-corrected image pair of the thin component to be measured and match them against the prior feature library, obtaining a set of feature matching pairs equal in number to, and corresponding with, the library. (3) Obtain the three-dimensional coordinates of the feature matching pairs by binocular measurement, yielding the three-dimensional point cloud of the feature points. (4) Calculate, by the least-squares method, the main planes of the three-dimensional point clouds of the reference-pose thin part and of the thin part to be measured, and obtain the three-dimensional position and pose of the thin part to be measured from the spatial position relationship between the two main planes. This method for measuring the three-dimensional position and posture of complex thin parts facilitates accurate positioning of thin parts in industrial-robot processes such as automated manufacturing, assembly and welding, and offers high positioning accuracy, low cost and convenient operation.
Drawings
FIG. 1 is a general block diagram of a method for measuring three-dimensional position and attitude of a complex thin part according to the present invention;
FIG. 2 is a diagram of a binocular camera imaging model and a physical imaging and normalized imaging coordinate system in accordance with the present invention;
FIG. 3 is a schematic view of the binocular camera stereo calibration of the present invention;
FIG. 4 is a flow chart of feature matching in the present invention;
FIG. 5 is a schematic diagram of the acquisition of missing SIFT features in the present invention;
FIG. 6 is a flow chart of the calculation of the feature point three-dimensional point cloud in the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is described in further detail below with reference to specific embodiments and the accompanying drawings.
The invention discloses a visual measurement method for three-dimensional positions and postures of complex thin parts.
Fig. 1 shows the general flow chart of the visual measurement method for the three-dimensional position and posture of a complex thin part. As shown in FIG. 1, the method is divided into two stages: reference pose measurement and to-be-measured pose estimation. The reference pose measurement stage includes camera calibration, synchronous binocular image acquisition, binocular image distortion correction and stereo correction, SIFT feature extraction from the binocular image pair, prior feature set acquisition, three-dimensional point cloud acquisition of the reference-pose thin component, and three-dimensional pose estimation of the reference-pose thin component. The to-be-measured pose estimation stage includes synchronous binocular image acquisition, distortion correction and stereo correction of the binocular image pair, SIFT feature extraction from the binocular image pair, feature set acquisition of the to-be-measured-pose thin component, three-dimensional point cloud acquisition of the to-be-measured-pose thin component, and pose estimation of the to-be-measured-pose thin component.
Because the corresponding steps of reference pose measurement and to-be-measured pose estimation are implemented in the same way, the two are summarized below as four steps: camera calibration; feature extraction and thin-component characterization; binocular vision measurement and three-dimensional point cloud acquisition; and thin-component three-dimensional position and pose estimation. These steps are described in detail with reference to the relevant drawings.
The invention discloses a visual measurement method for three-dimensional positions and postures of complex thin parts, which specifically comprises the following steps:
step 1: and calibrating the camera. The method comprises the steps of calibrating the internal parameters of a camera, a lens distortion coefficient and the spatial position relation between binocular cameras to establish the mapping relation between a reference plane three-dimensional world coordinate system and two-dimensional image coordinates in a binocular vision measuring system.
The overall working range is set to 5 m × 5 m, and a lens with a suitable field of view is selected accordingly. Taking these factors into consideration, a lens with a focal length of 6 mm (field angle of about 53 degrees) is preferred. The binocular cameras are placed horizontally with a spacing of 60 cm, with the left camera as the reference camera.
Camera calibration mainly involves four coordinate systems: the three-dimensional world coordinate system $X_w Y_w Z_w$ (origin on the reference plane, $Z_w$ axis perpendicular to the reference plane); the three-dimensional camera coordinate system $X_c Y_c Z_c$ (origin at the optical center of the lens, $Z_c$ axis coincident with the optical axis); the two-dimensional image physical coordinate system $xO_1y$ (origin at the center of the image, coordinates in physical units); and the two-dimensional image pixel coordinate system $uOv$ (origin at the top-left corner of the image, coordinates in pixels).
Figure 2 shows a schematic diagram of the various coordinate systems and their normalized coordinate systems. As shown in FIG. 2, using the linear pinhole camera model, the coordinates of a space point $P$ in the three-dimensional world coordinate system are defined as $[X_w\ Y_w\ Z_w]^T$, with corresponding homogeneous coordinate $P = [X_w\ Y_w\ Z_w\ 1]^T$; the coordinates of $P$ in the three-dimensional camera coordinate system are $[X_c\ Y_c\ Z_c]^T$, with homogeneous coordinate $P_c = [X_c\ Y_c\ Z_c\ 1]^T$. The projection of $P$ on the two-dimensional image plane is $p'$, with coordinates $[x\ y]^T$ (unit: mm) in the image physical coordinate system and $[u\ v]^T$ in the image pixel coordinate system; its homogeneous coordinate is $p' = [u\ v\ 1]^T$. The intersection $O_1$ of the camera optical axis with the image plane has pixel coordinates $[u_0\ v_0]^T$, and the physical dimensions of an image pixel in the x- and y-axis directions are $dx$ and $dy$, respectively. The mapping between the coordinates of $P$ in the three-dimensional world coordinate system and those of its projection $p'$ in the two-dimensional image coordinates is:
$$
Z_c \begin{bmatrix} u \\ v \\ 1 \end{bmatrix}
= \begin{bmatrix} \frac{f}{dx} & 0 & u_0 & 0 \\ 0 & \frac{f}{dy} & v_0 & 0 \\ 0 & 0 & 1 & 0 \end{bmatrix}
\begin{bmatrix} R & t \\ O^t & 1 \end{bmatrix}
\begin{bmatrix} X_w \\ Y_w \\ Z_w \\ 1 \end{bmatrix}
= \begin{bmatrix} a_x & 0 & u_0 & 0 \\ 0 & a_y & v_0 & 0 \\ 0 & 0 & 1 & 0 \end{bmatrix}
\begin{bmatrix} R & t \\ O^t & 1 \end{bmatrix}
\begin{bmatrix} X_w \\ Y_w \\ Z_w \\ 1 \end{bmatrix}
= M_1 M_2 P = MP \qquad (1)
$$
where $P = [X_w\ Y_w\ Z_w\ 1]^T$ is the homogeneous coordinate in the three-dimensional world coordinate system; $f$ is the focal length of the lens; $a_x = f/dx$ and $a_y = f/dy$ are the focal ratios of the camera; $R$ and $t$ are the rotation matrix and translation vector between the three-dimensional camera coordinate system and the three-dimensional world coordinate system, respectively; $O^t = [0\ 0\ 0]$; $M_1$ and $M_2$ are the intrinsic parameter matrix and extrinsic parameter matrix (i.e. a homography matrix) of the camera, respectively; and $M$ is the overall projection matrix of the camera.
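For illustration, the projection of Eq. (1) can be written in a few lines of numpy; this is a minimal sketch in which every parameter value is an assumed placeholder, not a value from the invention:

```python
import numpy as np

# Minimal sketch of Eq. (1); all parameter values below are assumed placeholders.
ax, ay, u0, v0 = 800.0, 800.0, 320.0, 240.0   # focal ratios and principal point
M1 = np.array([[ax, 0., u0, 0.],
               [0., ay, v0, 0.],
               [0., 0., 1., 0.]])             # intrinsic parameter matrix M1 (3x4)

R = np.eye(3)                                 # rotation: world -> camera
t = np.array([[0.], [0.], [1000.]])           # translation (mm)
M2 = np.vstack([np.hstack([R, t]),
                [0., 0., 0., 1.]])            # extrinsic parameter matrix M2 (4x4)

P = np.array([100., 50., 0., 1.])             # homogeneous world point [Xw Yw Zw 1]^T
zc_uv = M1 @ M2 @ P                           # = Zc * [u v 1]^T
u, v = zc_uv[:2] / zc_uv[2]                   # pixel coordinates of the projection
print(u, v)
```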
In the whole calibration process, the camera internal parameters $[f\ dx\ dy\ u_0\ v_0]^T$, the camera lens distortion coefficients $[k_1\ k_2\ p_1\ p_2\ k_3]^T$ and the external parameters need to be calibrated, where the internal parameters comprise the focal ratios and the principal-point position of the camera, and the external parameters comprise the rotation matrix $R$ and translation vector $t$ between the three-dimensional camera coordinate system and the three-dimensional world coordinate system. A black-and-white checkerboard calibration plate is used: a small plate for calibrating the internal parameters, with 50 mm × 50 mm squares in an 8 (width) × 12 (length) grid, and a large plate for calibrating the external parameters, with 100 mm × 100 mm squares in a 4 (width) × 6 (length) grid. Each corner on the calibration plate is detected with the Harris corner detection algorithm, and corner positions are obtained at sub-pixel accuracy.
The step 1 specifically comprises the following steps:
step 11: and respectively calibrating the left camera and the right camera to respectively obtain an internal parameter matrix and an external parameter matrix of each camera.
(a) Initial intrinsic parameters of the left and right cameras and the rotation matrix and translation vector of the cameras relative to the small calibration plate are first acquired.
Taking the calibration of the reference camera as an example: to obtain the camera's internal parameter matrix, lens distortion is first ignored and a calibration method based on the planar homography matrix is adopted. The small calibration plate is held by hand and imaged under different postures (at least 3); planar matching between the different viewpoints (the direct linear transformation method) is used to calculate the initial internal parameters of the camera and the rotation matrix and translation vector of the camera relative to the small calibration plate. That is, the internal parameters of the camera and the rotation matrix $R$ and translation vector $t$ of the camera relative to the small calibration plate are obtained under the assumption of zero lens distortion. Let the plane of the small calibration plate be $Z_w = 0$; then:
$$
Z_c \begin{bmatrix} u \\ v \\ 1 \end{bmatrix}
= \begin{bmatrix} a_x & 0 & u_0 & 0 \\ 0 & a_y & v_0 & 0 \\ 0 & 0 & 1 & 0 \end{bmatrix}
\begin{bmatrix} R & t \\ O^t & 1 \end{bmatrix}
\begin{bmatrix} X_w \\ Y_w \\ 0 \\ 1 \end{bmatrix}
= \begin{bmatrix} a_x & 0 & u_0 \\ 0 & a_y & v_0 \\ 0 & 0 & 1 \end{bmatrix}
\begin{bmatrix} r_1 & r_2 & r_3 & t \end{bmatrix}
\begin{bmatrix} X_w \\ Y_w \\ 0 \\ 1 \end{bmatrix}
= \begin{bmatrix} a_x & 0 & u_0 \\ 0 & a_y & v_0 \\ 0 & 0 & 1 \end{bmatrix}
\begin{bmatrix} r_1 & r_2 & t \end{bmatrix}
\begin{bmatrix} X_w \\ Y_w \\ 1 \end{bmatrix}
= H \begin{bmatrix} X_w \\ Y_w \\ 1 \end{bmatrix} \qquad (2)
$$
where $R = [r_1\ r_2\ r_3]$.
Therefore, the calibration of the reference camera can be completed by solving the H matrix, and the internal parameters of the camera and the rotation matrix R and translation vector t of the camera relative to the small calibration plate are obtained.
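As a sketch of this step, the homography $H$ of Eq. (2) can be estimated from plane-to-image correspondences, e.g. with OpenCV; the corner coordinates below are hypothetical placeholders standing in for real detections:

```python
import numpy as np
import cv2

# Board corners on the calibration-plate plane (Zw = 0, units mm) and their
# detected pixel positions in one view; both arrays are hypothetical placeholders.
board_xy = np.float32([[0, 0], [50, 0], [100, 0],
                       [0, 50], [50, 50], [100, 50]])
img_uv = np.float32([[312, 240], [390, 242], [468, 244],
                     [310, 318], [389, 320], [467, 322]])

# Direct linear solution of H in Eq. (2), mapping [Xw Yw 1]^T into the image.
H, _ = cv2.findHomography(board_xy, img_uv)
print(H / H[2, 2])   # normalize so H[2,2] = 1
```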
(b) Based on the initial internal parameters of the camera and the rotation matrix R and translation vector t of the camera relative to the small calibration plate, the distortion coefficient of the camera lens is further obtained by considering the lens distortion, and the internal parameters of the camera are further optimized.
Let the image pixel coordinates of a corner detected in the original lens-distorted image be $[u_{raw}\ v_{raw}]^T$, and the image pixel coordinates of the same corner in the lens-distortion-free image under the ideal pinhole imaging model be $[u_{und}\ v_{und}]^T$, where the original lens-distorted image is the originally collected image containing lens distortion. Then:
(1) Transform the three-dimensional world coordinates of each corner on the small calibration plate into the three-dimensional camera coordinate system, where one corner of the small calibration plate is taken as the origin of the three-dimensional world coordinate system and the three-dimensional world coordinates of each corner on the plate are computed from the side length of the black and white squares. That is,
$$
\begin{bmatrix} X_c \\ Y_c \\ Z_c \end{bmatrix}
= R \begin{bmatrix} X_w \\ Y_w \\ Z_w \end{bmatrix} + t \qquad (3)
$$
Where R and t are the rotation matrix and translation vector of the camera relative to the small calibration plate.
(2) Project further onto the image plane to obtain the corner's undistorted image physical coordinates $[x_{und}\ y_{und}]^T$ and image pixel coordinates $[u_{und}\ v_{und}]^T$ in the image plane coordinate system, i.e.
$$
\begin{bmatrix} x_{und} \\ y_{und} \end{bmatrix}
= \begin{bmatrix} f X_c / Z_c \\ f Y_c / Z_c \end{bmatrix} \qquad (4)
$$

$$
\begin{bmatrix} u_{und} \\ v_{und} \\ 1 \end{bmatrix}
= \begin{bmatrix} a_x & 0 & u_0 \\ 0 & a_y & v_0 \\ 0 & 0 & 1 \end{bmatrix}
\begin{bmatrix} x_{und} \\ y_{und} \\ 1 \end{bmatrix} \qquad (5)
$$
(3) Convert the image pixel coordinates $[u_{raw}\ v_{raw}]^T$ of the corners detected in the original lens-distorted image into image physical coordinates $[x_{raw}\ y_{raw}]^T$, and introduce an initially set lens distortion coefficient $[k_1\ k_2\ p_1\ p_2\ k_3]^T$ to obtain the distortion-corrected corner image pixel coordinates $[u'_{und}\ v'_{und}]^T$ and image physical coordinates $[x'_{und}\ y'_{und}]^T$, i.e.
$$
\begin{bmatrix} x_{raw} \\ y_{raw} \\ 1 \end{bmatrix}
= \begin{bmatrix} dx & 0 & -u_0\,dx \\ 0 & dy & -v_0\,dy \\ 0 & 0 & 1 \end{bmatrix}
\begin{bmatrix} u_{raw} \\ v_{raw} \\ 1 \end{bmatrix} \qquad (6)
$$

$$
\begin{bmatrix} x'_{und} \\ y'_{und} \end{bmatrix}
= \left(1 + k_1 r^2 + k_2 r^4 + k_3 r^6\right)
\begin{bmatrix} x_{raw} \\ y_{raw} \end{bmatrix}
+ \begin{bmatrix} 2 p_1 x_d y_d + p_2 (r^2 + 2 x_d^2) \\ p_1 (r^2 + 2 y_d^2) + 2 p_2 x_d y_d \end{bmatrix} \qquad (7)
$$

$$
\begin{bmatrix} u'_{und} \\ v'_{und} \\ 1 \end{bmatrix}
= \begin{bmatrix} a_x & 0 & u_0 \\ 0 & a_y & v_0 \\ 0 & 0 & 1 \end{bmatrix}
\begin{bmatrix} x'_{und} \\ y'_{und} \\ 1 \end{bmatrix} \qquad (8)
$$
where $x_d$, $y_d$ denote $x_{raw}$, $y_{raw}$ and $r^2 = x_{raw}^2 + y_{raw}^2$. In this process, an equation between the distortion-corrected corner image pixel coordinates and the image pixel coordinates $[u_{raw}\ v_{raw}]^T$ of the corners detected in the original lens-distorted image is established through the internal parameters and distortion parameters of the camera.
(4) For the $N$ corner points used in the calibration, define the objective function:
$$
\min F = \sum_{i=1}^{N}\left[\left(u_{und}^{\,i} - u_{und}^{\,\prime i}\right)^2 + \left(v_{und}^{\,i} - v_{und}^{\,\prime i}\right)^2\right] \qquad (9)
$$
This nonlinear least-squares problem is iterated many times with a nonlinear optimization algorithm to find the parameter values that minimize the objective function, yielding globally optimized camera internal parameters $[f\ dx\ dy\ u_0\ v_0]^T$ and distortion parameters $[k_1\ k_2\ p_1\ p_2\ k_3]^T$.
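In practice, steps (1)–(4) correspond to what OpenCV's `cv2.calibrateCamera` performs internally; the following sketch assumes chessboard images under hypothetical file paths and an 11 × 7 inner-corner count for the 8 × 12-square small plate:

```python
import glob
import cv2
import numpy as np

pattern = (11, 7)                    # inner corners of the 8 x 12-square plate (assumed)
objp = np.zeros((pattern[0] * pattern[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2) * 50.0  # 50 mm squares

obj_pts, img_pts, size = [], [], None
for path in glob.glob("calib/left_*.png"):           # hypothetical image paths
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    found, corners = cv2.findChessboardCorners(gray, pattern)
    if not found:
        continue
    # Sub-pixel corner refinement (the patent detects corners at sub-pixel level).
    corners = cv2.cornerSubPix(
        gray, corners, (11, 11), (-1, -1),
        (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 1e-3))
    obj_pts.append(objp)
    img_pts.append(corners)
    size = gray.shape[::-1]

# Jointly refines [f/dx f/dy u0 v0] and [k1 k2 p1 p2 k3] by minimizing the
# reprojection error of Eq. (9) over all views.
rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(obj_pts, img_pts, size, None, None)
print(rms, K, dist.ravel())
```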
(c) And calibrating external parameters of the camera, namely the spatial position relation between the image plane and the reference plane, and obtaining a rotation matrix and a translation vector of the camera plane relative to the reference plane.
The invention needs to measure the three-dimensional coordinates of feature points at certain specific positions on the complex thin part, so the external parameter matrix of the camera with the reference plane as the world coordinate system must be obtained. The large calibration plate is placed on the reference plane, i.e. the reference plane in the scene is set as the world coordinate system $X_W O_W Y_W$. The specific steps are as follows:
(1) Determine the three-dimensional world coordinate system $X_w Y_w Z_w$: place the large calibration plate on the reference plane, take one corner of the grid on the large calibration plate as the origin of the coordinate system, and let the $Z_w$ axis be perpendicular to the reference plane of the large calibration plate;
(2) acquiring a large calibration plate image at the moment, and carrying out distortion correction by using the previously acquired camera internal parameters and the lens distortion coefficient to acquire a corrected image;
(3) detecting the pixel coordinates of a two-dimensional image of each corner point in the image after distortion correction by using a Harris corner point detection algorithm;
(4) Calculate the rotation matrix and translation vector of the camera relative to the large calibration plate, i.e. the external parameters of the camera, from the three-dimensional world coordinates of each corner on the large calibration plate and the two-dimensional image coordinates of each corner detected in the distortion-corrected image.
Step 12: Binocular stereo calibration: establish the spatial position relationship between the binocular cameras, i.e. obtain the rotation matrix and translation vector of the right camera relative to the reference camera. The specific steps are as follows:
Fig. 3 shows a schematic diagram of the stereo calibration of the binocular cameras. The spatial position relationship between the right camera and the reference camera is solved from the rotation matrices and translation vectors of the left and right cameras relative to the reference plane when the large calibration plate lies on it, i.e. the rotation matrix and translation vector of the right camera relative to the left camera are obtained with the target object on the horizontal ground. As shown in fig. 3, let a space point $P$ in the three-dimensional world coordinate system have coordinates $P_{Cl}$ and $P_{Cr}$ in the three-dimensional camera coordinate systems of the left and right cameras (optical centers, i.e. origins, $O_{Cl}$ and $O_{Cr}$, respectively), and let the extrinsic parameter matrices of the left and right cameras be $[R_l\ t_l]$ and $[R_r\ t_r]$. Then, from
$$P_{Cl} = R_l P + t_l$$

$$P_{Cr} = R_r P + t_r$$

$$P_{Cl} = R^T (P_{Cr} - t),$$
the spatial position relationship $[R\ t]$ of the right camera relative to the left camera is obtained as
$$R = R_r R_l^T \qquad (10)$$

$$t = t_r - R t_l \qquad (11)$$
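Equations (10)–(11) translate directly into code; a minimal numpy sketch in which the per-camera extrinsics are assumed inputs:

```python
import numpy as np

def stereo_from_extrinsics(Rl, tl, Rr, tr):
    """Pose of the right camera w.r.t. the left camera, Eqs. (10)-(11).

    Rl, Rr: 3x3 rotation matrices of the left/right camera extrinsics;
    tl, tr: the corresponding 3-vector translations.
    """
    R = Rr @ Rl.T          # Eq. (10)
    t = tr - R @ tl        # Eq. (11)
    return R, t
```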
Step 13: Perform stereo correction on the binocular images to achieve coplanarity and row alignment, i.e. make the optical axes of the two cameras parallel so that the re-projected image planes of the left and right cameras fall exactly in the same plane, with the rows of the left and right images aligned. Based on the rotation matrix and translation vector between the left and right cameras, stereo correction is realized with the view-overlap maximization criterion and the Bouguet calibration algorithm, obtaining the stereo-corrected internal matrix $M_{rect\_l}$ and row-alignment matrix $R_{rect\_l}$ of the left camera, and the stereo-corrected internal matrix $M_{rect\_r}$ and row-alignment matrix $R_{rect\_r}$ of the right camera, so as to obtain the binocular stereo-corrected image pair. The specific operation of the stereo correction is as follows:
Let the image coordinates of a point $P$ in the binocular original image pair be $P_{raw\_l}$ and $P_{raw\_r}$; in the binocular distortion-corrected image pair, $P_{und\_l}$ and $P_{und\_r}$; and in the binocular stereo-corrected image pair, $P_{rect\_l}$ and $P_{rect\_r}$. The stereo correction formulas are then:
$$P_{rect\_l} = M_{rect\_l} R_l (M_{1l})^{-1} P_{und\_l} \qquad (12)$$

$$P_{rect\_r} = M_{rect\_r} R_r (M_{1r})^{-1} P_{und\_r} \qquad (13)$$
where $R_l$ and $R_r$ are the rotation matrices of the external parameters of the left and right cameras, respectively, and
$$P_{und\_l} = M_{1l} \cdot \mathrm{UnDistort}\!\left(M_{1l}^{-1} P_{raw\_l}\right)$$

$$P_{und\_r} = M_{1r} \cdot \mathrm{UnDistort}\!\left(M_{1r}^{-1} P_{raw\_r}\right)$$
where $\mathrm{UnDistort}(\cdot)$ is the distortion-correction function applied to a pixel.
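OpenCV implements this step (Bouguet's algorithm) in `cv2.stereoRectify`; a sketch with placeholder intrinsics, distortion coefficients, image size and baseline standing in for the calibrated values:

```python
import cv2
import numpy as np

# Placeholder calibration results; in practice these come from steps 11-12.
K_l = np.array([[800., 0., 640.], [0., 800., 512.], [0., 0., 1.]])
K_r = K_l.copy()
d_l = d_r = np.zeros(5)                       # [k1 k2 p1 p2 k3]
R = np.eye(3)                                 # right-camera rotation w.r.t. left
t = np.array([600., 0., 0.])                  # 60 cm horizontal baseline (mm)
size = (1280, 1024)                           # assumed image size

# R_rect_*: row-alignment rotations; P_rect_*: rectified projection matrices.
R_rect_l, R_rect_r, P_rect_l, P_rect_r, Q, roi_l, roi_r = cv2.stereoRectify(
    K_l, d_l, K_r, d_r, size, R, t, flags=cv2.CALIB_ZERO_DISPARITY, alpha=-1)

# Remap tables; applying them to the raw images yields the row-aligned pair.
map1_l, map2_l = cv2.initUndistortRectifyMap(K_l, d_l, R_rect_l, P_rect_l,
                                             size, cv2.CV_32FC1)
```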
Step 2: feature extraction and thin part characterization. In the thin component binocular stereo correction image pair, SIFT features and neighborhood descriptions are used to represent the thin component images respectively.
FIG. 4 shows a flow chart of a method for feature extraction and thin part characterization in the present invention. Referring to fig. 4, the method specifically includes the following steps:
step 21: acquiring binocular images of the thin component, and respectively carrying out lens distortion correction and stereo correction to obtain a binocular distortion correction image pair and a binocular stereo correction image pair;
step 22: extracting SIFT features in the binocular stereo correction image pair by using an SIFT operator, wherein the SIFT features comprise SIFT feature points and neighborhood descriptions thereof;
step 23: and acquiring a priori feature set and characterizing the thin part.
In the reference pose measurement stage, from the SIFT features detected in the left image of the binocular stereo-corrected image pair of the thin component, several SIFT features (at least 3) near specific positions (such as positioning holes and positioning bolts) are manually selected and matched against the SIFT features in the right image; mismatched pairs are manually rejected, yielding a set of SIFT feature matching pairs that constitutes the prior SIFT feature library of the reference-pose thin component.
In the to-be-measured pose estimation stage, the SIFT features extracted from the binocular stereo-corrected image pair of the thin component to be measured are matched against the prior SIFT feature library, and the RANSAC algorithm is used to eliminate mismatches, forming SIFT feature matching pairs equal in number to, and corresponding with, the prior SIFT feature library. If the number of SIFT feature pairs in the binocular stereo-corrected image pair of the thin component to be measured is less than the number of features in the prior SIFT feature library, the missing SIFT features are obtained by transformation through the homography matrix from the corresponding SIFT features in the prior feature library.
Fig. 5 shows a schematic diagram of how missing SIFT features are acquired. As shown in fig. 5, given a prior SIFT feature library (composed of 8 SIFT feature matching pairs) and 7 SIFT feature matching pairs extracted from the binocular stereo-corrected image pair of the thin component to be measured, the missing matching pair is compensated as follows: from the homography matrix $H_1$ between the two left images and the point $P_6$ in the prior library, the missing SIFT feature point $P_6$ in the left image of the binocular stereo-corrected image of the thin component to be measured is calculated, finally yielding a set of SIFT feature matching pairs of the thin component to be measured equal in number to, and corresponding with, the prior SIFT feature library.
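A sketch of this matching-and-completion step using OpenCV (≥ 4.4, where `cv2.SIFT_create` is available); the image paths, ratio-test threshold and the $P_6$ coordinates are assumptions for illustration:

```python
import cv2
import numpy as np

img_ref = cv2.imread("ref_left.png", cv2.IMREAD_GRAYSCALE)   # reference-pose left image
img_cur = cv2.imread("cur_left.png", cv2.IMREAD_GRAYSCALE)   # to-be-measured left image

sift = cv2.SIFT_create()
kp_ref, des_ref = sift.detectAndCompute(img_ref, None)
kp_cur, des_cur = sift.detectAndCompute(img_cur, None)

# Ratio-test matching of the prior-library features against the current image.
good = [m for m, n in cv2.BFMatcher().knnMatch(des_ref, des_cur, k=2)
        if m.distance < 0.75 * n.distance]
src = np.float32([kp_ref[m.queryIdx].pt for m in good]).reshape(-1, 1, 2)
dst = np.float32([kp_cur[m.trainIdx].pt for m in good]).reshape(-1, 1, 2)

# RANSAC rejects mismatches and yields the homography H1 between the two left images.
H1, inliers = cv2.findHomography(src, dst, cv2.RANSAC, 3.0)

# A prior-library feature P6 missed in the current image is recovered through H1
# (cf. Fig. 5); the coordinates are hypothetical.
P6 = np.float32([[[412.0, 305.0]]])
P6_recovered = cv2.perspectiveTransform(P6, H1)
```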
Step 3: Binocular vision measurement and thin-part three-dimensional point cloud acquisition.
FIG. 6 shows a schematic view of binocular vision measurement and thin part three-dimensional point cloud acquisition in the present invention. In the invention, the three-dimensional world coordinates of the feature point set at the specific position on the thin component are calculated by a binocular measurement method, so that the three-dimensional point cloud of the thin component is obtained.
The calculation process is described by taking the three-dimensional coordinate calculation of a point $P$ at a specific position on the thin component in three-dimensional space as an example, where the imaging of $P$ in the binocular stereo-corrected image pair corresponds to one SIFT feature matching pair. The coordinates of $P$ in the three-dimensional camera coordinate systems of the binocular left and right cameras are $[X_{cl}\ Y_{cl}\ Z_{cl}]^T$ and $[X_{cr}\ Y_{cr}\ Z_{cr}]^T$, respectively; the image coordinates of the SIFT feature matching pair in the binocular stereo-corrected image pair are $[u_{rect\_l}\ v_{rect\_l}]^T$ and $[u_{rect\_r}\ v_{rect\_r}]^T$, with homogeneous coordinates $P_{rect\_l} = [u_{rect\_l}\ v_{rect\_l}\ 1]^T$ and $P_{rect\_r} = [u_{rect\_r}\ v_{rect\_r}\ 1]^T$. As shown in FIG. 6, binocular vision measurement and thin-component three-dimensional point cloud acquisition specifically comprise the following steps:
Step 31: Calculate the image coordinates of the SIFT feature points in the binocular distortion-corrected image pair. In the reference pose measurement stage, the image coordinates of the SIFT feature points of the reference thin component in the binocular distortion-corrected image pair acquired of it are calculated; in the to-be-measured pose measurement stage, the image coordinates of the SIFT feature points of the thin component to be measured in the binocular distortion-corrected image pair acquired of it are calculated.
Let the image coordinates of point $P$ in the binocular left and right distortion-corrected image pair be $[u_{und\_l}\ v_{und\_l}]^T$ and $[u_{und\_r}\ v_{und\_r}]^T$, with corresponding homogeneous coordinates $P_{und\_l} = [u_{und\_l}\ v_{und\_l}\ 1]^T$ and $P_{und\_r} = [u_{und\_r}\ v_{und\_r}\ 1]^T$. From Eq. (1):
$$
Z_{cl} \begin{bmatrix} u_{und\_l} \\ v_{und\_l} \\ 1 \end{bmatrix}
= M_{1l} M_{2l} \begin{bmatrix} X_w \\ Y_w \\ Z_w \\ 1 \end{bmatrix}
= M_l \begin{bmatrix} X_w \\ Y_w \\ Z_w \\ 1 \end{bmatrix} \qquad (14)
$$

$$
Z_{cr} \begin{bmatrix} u_{und\_r} \\ v_{und\_r} \\ 1 \end{bmatrix}
= M_{1r} M_{2r} \begin{bmatrix} X_w \\ Y_w \\ Z_w \\ 1 \end{bmatrix}
= M_r \begin{bmatrix} X_w \\ Y_w \\ Z_w \\ 1 \end{bmatrix} \qquad (15)
$$
where $M_{1l}$, $M_{2l}$ and $M_{1r}$, $M_{2r}$ are the internal and external parameter matrices of the binocular left and right cameras, respectively, $M_l$ and $M_r$ are the overall projection matrices of the binocular left and right cameras, and
$$
M_l = \begin{bmatrix} m_{l11} & m_{l12} & m_{l13} & m_{l14} \\ m_{l21} & m_{l22} & m_{l23} & m_{l24} \\ m_{l31} & m_{l32} & m_{l33} & m_{l34} \end{bmatrix}, \qquad
M_r = \begin{bmatrix} m_{r11} & m_{r12} & m_{r13} & m_{r14} \\ m_{r21} & m_{r22} & m_{r23} & m_{r24} \\ m_{r31} & m_{r32} & m_{r33} & m_{r34} \end{bmatrix}
$$
Knowing the image coordinates $P_{rect\_l}$ and $P_{rect\_r}$ of this point in the binocular stereo-corrected image pair, its image coordinates $P_{und\_l}$ and $P_{und\_r}$ in the binocular distortion-corrected image pair are:

$$P_{und\_l} = M_{1l} (M_{rect\_l} R_l)^{-1} P_{rect\_l}$$

$$P_{und\_r} = M_{1r} (M_{rect\_r} R_r)^{-1} P_{rect\_r}$$

Step 32: Calculate the three-dimensional coordinates of the SIFT feature points in the world coordinate system. In the reference pose measurement stage, the three-dimensional coordinates in the world coordinate system are calculated from the image coordinates of the SIFT feature points of the reference thin component; in the to-be-measured pose measurement stage, they are calculated from the image coordinates of the SIFT feature points of the thin component to be measured.
image coordinate P of the point on the binocular distortion corrected imageund_lAnd Pund_rRespectively substituted into pinhole camera model formulas (14) andequation (15), the overall parameter matrix M combining the left and right cameraslAnd MrElimination of ZclAnd ZcrTo obtain the equation set:
$$
\begin{cases}
(m_{l11} - m_{l31} u_{und\_l}) X_w + (m_{l12} - m_{l32} u_{und\_l}) Y_w + (m_{l13} - m_{l33} u_{und\_l}) Z_w = m_{l34} u_{und\_l} - m_{l14} \\
(m_{l21} - m_{l31} v_{und\_l}) X_w + (m_{l22} - m_{l32} v_{und\_l}) Y_w + (m_{l23} - m_{l33} v_{und\_l}) Z_w = m_{l34} v_{und\_l} - m_{l24} \\
(m_{r11} - m_{r31} u_{und\_r}) X_w + (m_{r12} - m_{r32} u_{und\_r}) Y_w + (m_{r13} - m_{r33} u_{und\_r}) Z_w = m_{r34} u_{und\_r} - m_{r14} \\
(m_{r21} - m_{r31} v_{und\_r}) X_w + (m_{r22} - m_{r32} v_{und\_r}) Y_w + (m_{r23} - m_{r33} v_{und\_r}) Z_w = m_{r34} v_{und\_r} - m_{r24}
\end{cases}
$$
Writing the system of equations in matrix form:

$$
A \begin{bmatrix} X_w \\ Y_w \\ Z_w \end{bmatrix} = B
$$
where

$$
A = \begin{bmatrix}
m_{l11} - m_{l31} u_{und\_l} & m_{l12} - m_{l32} u_{und\_l} & m_{l13} - m_{l33} u_{und\_l} \\
m_{l21} - m_{l31} v_{und\_l} & m_{l22} - m_{l32} v_{und\_l} & m_{l23} - m_{l33} v_{und\_l} \\
m_{r11} - m_{r31} u_{und\_r} & m_{r12} - m_{r32} u_{und\_r} & m_{r13} - m_{r33} u_{und\_r} \\
m_{r21} - m_{r31} v_{und\_r} & m_{r22} - m_{r32} v_{und\_r} & m_{r23} - m_{r33} v_{und\_r}
\end{bmatrix}, \qquad
B = \begin{bmatrix}
m_{l34} u_{und\_l} - m_{l14} \\
m_{l34} v_{und\_l} - m_{l24} \\
m_{r34} u_{und\_r} - m_{r14} \\
m_{r34} v_{und\_r} - m_{r24}
\end{bmatrix}
$$
The three-dimensional coordinates of the point in the world coordinate system are then obtained by the least-squares method:

$$[X_w\ Y_w\ Z_w]^T = (A^T A)^{-1} A^T B \qquad (16)$$
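A numpy sketch of Eq. (16) for a single SIFT matching pair; the projection matrices and image coordinates are assumed inputs:

```python
import numpy as np

def triangulate(Ml, Mr, uv_l, uv_r):
    """Solve A [Xw Yw Zw]^T = B in the least-squares sense, Eq. (16).

    Ml, Mr: 3x4 overall projection matrices of the left/right cameras;
    uv_l, uv_r: distortion-corrected image coordinates (u, v) of one match.
    """
    ul, vl = uv_l
    ur, vr = uv_r
    A = np.array([Ml[0, :3] - ul * Ml[2, :3],
                  Ml[1, :3] - vl * Ml[2, :3],
                  Mr[0, :3] - ur * Mr[2, :3],
                  Mr[1, :3] - vr * Mr[2, :3]])
    B = np.array([ul * Ml[2, 3] - Ml[0, 3],
                  vl * Ml[2, 3] - Ml[1, 3],
                  ur * Mr[2, 3] - Mr[0, 3],
                  vr * Mr[2, 3] - Mr[1, 3]])
    Xw, *_ = np.linalg.lstsq(A, B, rcond=None)   # = (A^T A)^-1 A^T B
    return Xw                                    # [Xw, Yw, Zw]
```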
Step 4: Thin-part three-dimensional position and attitude estimation.
The main planes of the three-dimensional point cloud of the reference-pose thin part and of the three-dimensional point cloud of the thin part to be measured are calculated by the least-squares method, and the three-dimensional position and pose of the thin part to be measured are obtained by computing the spatial position relationship between the two main planes.
Step 41: Calculate the spatial main plane of the three-dimensional point cloud. In the reference pose measurement stage, the three-dimensional plane of the three-dimensional point cloud of the reference thin component's SIFT feature points is taken as the reference spatial main plane $\pi_0$; in the to-be-measured pose measurement stage, the plane of the three-dimensional point cloud of the to-be-measured thin component's SIFT feature points is taken as the to-be-measured spatial main plane $\pi$.
The reference spatial main plane $\pi_0$ of the reference-pose thin part's three-dimensional point cloud is calculated by the least-squares method, and the to-be-measured spatial main plane $\pi$ is calculated from the three-dimensional point cloud of the thin part to be measured. Taking the solution of the to-be-measured main plane $\pi$ as an example: let the equation of the spatial plane $\pi$ be $ax + by + cz + 1 = 0$ with normal vector $n = (a, b, c)$, and let the three-dimensional coordinates of a feature point at a specific position be $P_i = (x_i, y_i, z_i)^T$. The distance of point $P_i$ from the fitted plane is then $d_i = n P_i + 1$. According to the least-squares principle, $S = \sum (d_i)^2$ is minimized, and the normal vector $n$ is found by solving the following system:
$$
\frac{\partial S}{\partial a} = 0, \qquad \frac{\partial S}{\partial b} = 0, \qquad \frac{\partial S}{\partial c} = 0 \qquad (17)
$$
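Setting the derivatives in Eq. (17) to zero yields a linear system; a minimal numpy sketch of the fit, with the point array as an assumed input:

```python
import numpy as np

def fit_principal_plane(points):
    """Least-squares plane ax + by + cz + 1 = 0 through a 3-D point cloud.

    points: (N, 3) array of feature-point world coordinates.
    Minimizing S = sum (n . Pi + 1)^2 (Eq. (17)) is the linear least-squares
    problem  points @ n = -1;  returns the normal vector n = (a, b, c).
    """
    n, *_ = np.linalg.lstsq(points, -np.ones(len(points)), rcond=None)
    return n
```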
Step 42: Solve the transformation matrix between the to-be-measured spatial main plane $\pi$ and the reference spatial main plane $\pi_0$: project the feature points of the reference thin component and of the thin component to be measured onto their respective main planes, and use the projected points on the main planes to calculate the rotation matrix and translation vector between the two planes, thereby accurately determining the three-dimensional position and posture information of the thin component to be measured relative to the reference thin component.
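Given corresponding feature sets of the two poses, the rotation and translation can be recovered with the standard SVD-based rigid alignment (the Kabsch algorithm); the sketch below applies it to raw corresponding points and is an illustrative stand-in for the plane-projection procedure the patent describes:

```python
import numpy as np

def rigid_transform(P_ref, P_cur):
    """SVD-based (Kabsch) estimate of R, t such that P_cur ≈ R @ P_ref + t.

    P_ref, P_cur: (N, 3) corresponding 3-D feature points of the reference
    part and the part under test (ideally first projected onto their fitted
    main planes, as in step 42).
    """
    c_ref, c_cur = P_ref.mean(axis=0), P_cur.mean(axis=0)
    H = (P_ref - c_ref).T @ (P_cur - c_cur)     # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:                    # guard against a reflection
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = c_cur - R @ c_ref
    return R, t
```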
In the above description of the method steps, the reference pose measurement stage and the to-be-measured pose measurement stage were, for convenience, described in parallel. In practical application, camera calibration is carried out first; then reference pose measurement, i.e. acquisition of the prior feature library and calculation of the reference plane; and finally the to-be-measured pose measurement stage, in which the three-dimensional position and attitude information of the actual thin part to be measured is solved with respect to the reference plane.
The above-mentioned embodiments are intended to illustrate the objects, technical solutions and advantages of the present invention in further detail, and it should be understood that the above-mentioned embodiments are only exemplary embodiments of the present invention, and are not intended to limit the present invention, and any modifications, equivalents, improvements and the like made within the spirit and principle of the present invention should be included in the protection scope of the present invention.

Claims (10)

1. A method for measuring the three-dimensional position and posture of a thin part to be measured comprises the following steps:
step 1: calibrating the internal parameters and the external parameters of the binocular camera and the spatial position relationship of the binocular camera, and establishing a mapping relationship between a three-dimensional world coordinate system and a two-dimensional image coordinate in a binocular vision measurement system;
step 2: extracting SIFT feature point pairs in a binocular image of the reference thin component by using an SIFT operator, generating an SIFT prior feature library, and using a three-dimensional plane where the SIFT feature point pairs are located as a reference plane where the reference thin component is located;
step 3: respectively extracting a plurality of SIFT feature point pairs from the binocular stereo-corrected image pair of the thin component to be measured, and matching the extracted SIFT feature point pairs with the SIFT feature point pairs in the SIFT prior feature library to obtain a set of SIFT feature point pairs equal in number to, and corresponding with, the SIFT feature point pairs in the SIFT prior feature library;
step 4: obtaining the three-dimensional point cloud of the thin component to be measured from the obtained image coordinates of the feature points in the SIFT feature point pair set, and hence the spatial plane of the thin component to be measured;
step 5: determining the three-dimensional position and attitude information of the thin component to be measured by solving the transformation matrix between the spatial plane and the reference plane.
2. The measurement method according to claim 1, wherein the step 1 specifically comprises:
step 11: respectively establishing a mapping relation between a single-camera two-dimensional image coordinate system and a three-dimensional world coordinate system;
step 12: calibrating a single camera respectively to obtain internal parameters and external parameters of the camera;
step 13: and carrying out three-dimensional calibration according to the obtained internal parameters and external parameters of the camera, and establishing a spatial position relationship between the binocular cameras.
3. The measurement method of claim 2, wherein the internal parameters of the camera include a focal ratio of the camera and a center point position of the camera, and the external parameters include a rotation matrix and a translation vector between a three-dimensional camera coordinate system and a three-dimensional world coordinate system.
4. The measurement method according to claim 2, wherein the spatial positional relationship between the binocular cameras represents a rotation matrix and a translation vector between the cameras.
5. The measurement method according to claim 1, wherein the specific step of obtaining the SIFT prior feature library in step 2 comprises:
step 21: acquiring a binocular original image pair of the reference thin component, and respectively correcting the binocular original image pair by using a lens distortion coefficient and a stereo correction parameter of a binocular camera to obtain a binocular stereo correction image pair of the reference thin component;
step 22: SIFT features are extracted from the binocular stereo correction image pair of the reference thin component to form an SIFT prior feature library of the reference thin component.
6. The measurement method according to claim 1, wherein step 3 comprises the following steps:
step 31, acquiring a binocular original image pair of the thin part to be detected, and respectively correcting the binocular original image pair by using a lens distortion coefficient and a stereo correction parameter of a binocular camera to obtain a binocular stereo correction image pair of the thin part to be detected;
step 32: and extracting SIFT feature point pairs from the binocular stereo correction image pairs of the thin component to be detected, and matching the extracted SIFT feature point pairs with SIFT feature point pairs in the SIFT prior feature library to obtain SIFT feature point pairs with the same number and corresponding to the SIFT feature point pairs in the SIFT prior feature library.
7. The measurement method according to any one of claims 5 to 6, wherein the stereo correction parameters include a stereo correction internal matrix and a row alignment matrix of the binocular cameras obtained based on a rotation matrix and a translation vector between the binocular cameras.
8. The measurement method according to claim 1, wherein step 4 specifically comprises:
step 41: according to the image coordinates of the SIFT feature point pairs on the thin component to be measured, the three-dimensional world coordinates of the SIFT feature point pairs are obtained by utilizing the mapping relation between the three-dimensional world coordinate system and the two-dimensional image coordinates in the established binocular vision measuring system;
step 42: and obtaining a three-dimensional plane where the SIFT feature point pairs are located according to the three-dimensional world coordinates of the SIFT feature point pairs, and taking the three-dimensional plane as a space plane of the thin component to be measured.
9. The measurement method according to claim 8, wherein the image coordinates of the SIFT feature point pairs are binocular stereo-corrected image coordinates on the thin part to be measured, and the three-dimensional world coordinates of the SIFT feature points are obtained from their image coordinates on a binocular distortion-corrected image pair, wherein the image coordinates on the binocular distortion-corrected image pair are obtained as follows:
$$P_{und\_l} = M_{1l} (M_{rect\_l} R_l)^{-1} P_{rect\_l}$$

$$P_{und\_r} = M_{1r} (M_{rect\_r} R_r)^{-1} P_{rect\_r}$$
where $P_{und\_l} = [u_{und\_l}\ v_{und\_l}\ 1]^T$ and $P_{und\_r} = [u_{und\_r}\ v_{und\_r}\ 1]^T$ are the image coordinates in the binocular distortion-corrected image pair, with $u_{und\_l}$, $v_{und\_l}$, $u_{und\_r}$, $v_{und\_r}$ the pixel coordinates; $P_{rect\_l}$ and $P_{rect\_r}$ are the image coordinates in the binocular stereo-corrected image pair; $M_{1l}$ and $M_{1r}$ are the internal parameter matrices of the binocular cameras; $M_{rect\_l}$ and $M_{rect\_r}$ are the stereo correction parameter matrices of the binocular cameras; and $R_l$ and $R_r$ are the rotation matrices of their external parameters.
10. The measurement method according to claim 9, wherein the three-dimensional coordinates of the SIFT feature points are obtained as follows:
$$[X_w\ Y_w\ Z_w]^T = (A^T A)^{-1} A^T B$$
$$
A = \begin{bmatrix}
m_{l11} - m_{l31} u_{und\_l} & m_{l12} - m_{l32} u_{und\_l} & m_{l13} - m_{l33} u_{und\_l} \\
m_{l21} - m_{l31} v_{und\_l} & m_{l22} - m_{l32} v_{und\_l} & m_{l23} - m_{l33} v_{und\_l} \\
m_{r11} - m_{r31} u_{und\_r} & m_{r12} - m_{r32} u_{und\_r} & m_{r13} - m_{r33} u_{und\_r} \\
m_{r21} - m_{r31} v_{und\_r} & m_{r22} - m_{r32} v_{und\_r} & m_{r23} - m_{r33} v_{und\_r}
\end{bmatrix}, \qquad
B = \begin{bmatrix}
m_{l34} u_{und\_l} - m_{l14} \\
m_{l34} v_{und\_l} - m_{l24} \\
m_{r34} u_{und\_r} - m_{r14} \\
m_{r34} v_{und\_r} - m_{r24}
\end{bmatrix}
$$
where $(X_w, Y_w, Z_w)$ are the three-dimensional world coordinates of the SIFT feature point, and $m_{lij}$ and $m_{rij}$ are the elements of the overall projection matrices of the binocular left and right cameras, formed from the internal parameter matrices $M_{1l}$ and $M_{1r}$ together with the external parameters.
CN201310159555.6A 2013-05-03 2013-05-03 Method for measuring three-dimensional position and posture of thin component with complex structure Active CN103278138B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201310159555.6A CN103278138B (en) 2013-05-03 2013-05-03 Method for measuring three-dimensional position and posture of thin component with complex structure

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201310159555.6A CN103278138B (en) 2013-05-03 2013-05-03 Method for measuring three-dimensional position and posture of thin component with complex structure

Publications (2)

Publication Number Publication Date
CN103278138A CN103278138A (en) 2013-09-04
CN103278138B true CN103278138B (en) 2015-05-06

Family

ID=49060727

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201310159555.6A Active CN103278138B (en) 2013-05-03 2013-05-03 Method for measuring three-dimensional position and posture of thin component with complex structure

Country Status (1)

Country Link
CN (1) CN103278138B (en)

Families Citing this family (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103714571B (en) * 2013-09-23 2016-08-10 西安新拓三维光测科技有限公司 A kind of based on photogrammetric single camera three-dimensional rebuilding method
CN104636743B (en) * 2013-11-06 2021-09-03 北京三星通信技术研究有限公司 Method and device for correcting character image
CN104154875B (en) * 2014-08-20 2017-02-15 深圳大学 Three-dimensional data acquisition system and acquisition method based on two-axis rotation platform
CN104484870B (en) * 2014-11-25 2018-01-12 北京航空航天大学 Verify Plane location method
CN104880176B (en) * 2015-04-15 2017-04-12 大连理工大学 Moving object posture measurement method based on prior knowledge model optimization
CN105157592B (en) * 2015-08-26 2018-03-06 北京航空航天大学 The deformed shape of the deformable wing of flexible trailing edge and the measuring method of speed based on binocular vision
CN105741290B (en) * 2016-01-29 2018-06-19 中国人民解放军国防科学技术大学 A kind of printed circuit board information indicating method and device based on augmented reality
CN106352855A (en) * 2016-09-26 2017-01-25 北京建筑大学 Photographing measurement method and device
CN107423772A (en) * 2017-08-08 2017-12-01 南京理工大学 A kind of new binocular image feature matching method based on RANSAC
CN108010085B (en) * 2017-11-30 2019-12-31 西南科技大学 Target identification method based on binocular visible light camera and thermal infrared camera
CN108955685B (en) * 2018-05-04 2021-11-26 北京航空航天大学 Refueling aircraft taper sleeve pose measuring method based on stereoscopic vision
CN110858403B (en) * 2018-08-22 2022-09-27 杭州萤石软件有限公司 Method for determining scale factor in monocular vision reconstruction and mobile robot
CN109700465A (en) * 2019-01-07 2019-05-03 广东体达康医疗科技有限公司 A kind of mobile three-dimensional wound scanning device and its workflow
CN111179356A (en) * 2019-12-25 2020-05-19 北京中科慧眼科技有限公司 Binocular camera calibration method, device and system based on Aruco code and calibration board
CN113378606A (en) * 2020-03-10 2021-09-10 杭州海康威视数字技术股份有限公司 Method, device and system for determining labeling information
CN111536981B (en) * 2020-04-23 2023-09-12 中国科学院上海技术物理研究所 Embedded binocular non-cooperative target relative pose measurement method
CN114559131A (en) * 2020-11-27 2022-05-31 北京颖捷科技有限公司 Welding control method and device and upper computer
CN113822945A (en) * 2021-09-28 2021-12-21 天津朗硕机器人科技有限公司 Workpiece identification and positioning method based on binocular vision
CN114913246B (en) * 2022-07-15 2022-11-01 齐鲁空天信息研究院 Camera calibration method and device, electronic equipment and storage medium

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103065351A (en) * 2012-12-16 2013-04-24 华南理工大学 Binocular three-dimensional reconstruction method

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH07287764A (en) * 1995-05-08 1995-10-31 Omron Corp Stereoscopic method and solid recognition device using the method

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103065351A (en) * 2012-12-16 2013-04-24 华南理工大学 Binocular three-dimensional reconstruction method

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Dang Le. Research on three-dimensional reconstruction methods based on binocular stereo vision. China Master's Theses Full-text Database (Information Science and Technology), 2010, pp. 72-80. *

Also Published As

Publication number Publication date
CN103278138A (en) 2013-09-04

Similar Documents

Publication Publication Date Title
CN103278138B (en) Method for measuring three-dimensional position and posture of thin component with complex structure
CN110296691B (en) IMU calibration-fused binocular stereo vision measurement method and system
CN108510551B (en) Method and system for calibrating camera parameters under long-distance large-field-of-view condition
CN101581569B (en) Calibrating method of structural parameters of binocular visual sensing system
CN105716542B (en) A kind of three-dimensional data joining method based on flexible characteristic point
CN104616292B (en) Monocular vision measuring method based on global homography matrix
CN103530880B (en) Based on the camera marking method of projection Gaussian network pattern
CN102810205B (en) The scaling method of a kind of shooting or photographic means
CN104392435B (en) Fisheye camera scaling method and caliberating device
CN107886547B (en) Fisheye camera calibration method and system
CN104268876B (en) Camera calibration method based on partitioning
CN110728715A (en) Camera angle self-adaptive adjusting method of intelligent inspection robot
CN111612794B (en) High-precision three-dimensional pose estimation method and system for parts based on multi-2D vision
CN102221331B (en) Measuring method based on asymmetric binocular stereovision technology
CN109272555B (en) External parameter obtaining and calibrating method for RGB-D camera
CN104537707A (en) Image space type stereo vision on-line movement real-time measurement system
CN114705122B (en) Large-view-field stereoscopic vision calibration method
CN111415391A (en) Multi-view camera external orientation parameter calibration method adopting inter-shooting method
CN111220126A (en) Space object pose measurement method based on point features and monocular camera
CN112229323B (en) Six-degree-of-freedom measurement method of checkerboard cooperative target based on monocular vision of mobile phone and application of six-degree-of-freedom measurement method
CN104463791A (en) Fisheye image correction method based on spherical model
CN110349257B (en) Phase pseudo mapping-based binocular measurement missing point cloud interpolation method
CN111879354A (en) Unmanned aerial vehicle measurement system that becomes more meticulous
CN104167001B (en) Large-visual-field camera calibration method based on orthogonal compensation
CN115797461A (en) Flame space positioning system calibration and correction method based on binocular vision

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant