CN102034238A - Multi-camera system calibrating method based on optical imaging test head and visual graph structure - Google Patents


Info

Publication number
CN102034238A
Authority
CN
China
Prior art keywords
camera
cameras
optical imaging
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201010585261.6A
Other languages
Chinese (zh)
Other versions
CN102034238B (en)
Inventor
赵宏 (Zhao Hong)
李进军 (Li Jinjun)
Current Assignee
Suzhou Cartesan Testing Technology Co Ltd
Original Assignee
Xian Jiaotong University
Priority date
Filing date
Publication date
Application filed by Xian Jiaotong University
Priority to CN201010585261.6A
Publication of CN102034238A
Application granted
Publication of CN102034238B
Legal status: Active
Anticipated expiration


Landscapes

  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention provides a multi-camera system calibration method based on an optical imaging measuring head and a visual graph structure. The method comprises the following steps: calibrating each camera independently with the optical imaging measuring head to obtain initial values of its intrinsic and distortion parameters; calibrating the cameras pairwise, obtaining by linear estimation the fundamental matrix, epipolar constraint, rotation matrix and translation vector between every two cameras whose fields of view overlap; building the connection relationships among the cameras as a visual graph structure according to graph theory, and estimating the initial rotation and translation of each camera relative to the reference camera by a shortest-path method; and jointly optimizing all intrinsic and extrinsic parameters of all cameras, together with the acquired three-dimensional marker point set of the optical imaging measuring head, with a sparse bundle adjustment algorithm to obtain a high-precision calibration result. The calibration process is simple, proceeds from local to global and from robust to precise, guarantees high-precision and robust calibration, and is applicable to calibrating multi-camera systems with different measurement ranges and different distribution structures.

Description

Multi-camera system calibration method based on optical imaging measuring head and visual graph structure
Technical Field
The invention belongs to the technical field of vision measurement, and relates to a multi-camera system calibration method based on an optical imaging measuring head and a vision graph structure.
Background
To carry out three-dimensional measurement with a multi-camera system, the intrinsic and distortion parameters of each camera, the relative poses among the cameras, and the pose parameters with respect to a reference coordinate system must first be calibrated. The robustness and precision of this calibration directly determine the measurement precision of the multi-camera system, so robust, high-precision calibration is a prerequisite for high-precision measurement with a multi-camera system.
Camera calibration has a long history, and many effective methods have emerged over the decades. In recent years, owing to its unique advantages for large-scale, wide-range measurement, the multi-camera system has attracted wide attention at home and abroad, and its efficient, high-precision calibration in particular has become a research hotspot. However, some conventional multi-camera calibration methods use electronic theodolites to establish the world coordinate system, increasing the cost and complexity of the system. Others adopt three-dimensional, two-dimensional, one-dimensional or zero-dimensional calibration objects and directly use the intrinsic parameters of each single camera and the pose relationships between cameras to estimate the calibration parameters of the multi-camera system; on the one hand, this requires the calibration object to be visible in all camera fields of view, and on the other hand, the calibration results are affected by noise, computation errors and errors in the initial parameters, so the precision is low. Although some methods apply bundle adjustment optimization in the final calibration stage, the acquisition of the three-dimensional calibration point set is complex and extra points must be discarded, so the efficiency is low and the robustness is poor.
Disclosure of Invention
The invention aims to provide a multi-camera system calibration method based on an optical imaging measuring head and a visual graph structure, addressing the defects and shortcomings of existing calibration techniques. The method can complete both the intrinsic and extrinsic calibration of each single camera and the calibration of the multi-camera system with only one calibrated optical imaging measuring head; it is simple, does not require the measuring head to be visible in all camera fields of view but only in the fields of two or more cameras at a time, and is suitable for various measurement scenarios. The calibration of the multi-camera system is completed from coarse to fine by means of the visual graph structure and bundle adjustment optimization, ensuring a robust and highly precise result.
In order to achieve the purpose, the invention adopts the technical scheme that:
step 1, establishing hardware:
a plurality of distributed cameras and an optical imaging measuring head are adopted; not all cameras are required to have overlapping fields of view, but at least every two cameras must share an overlapping field of view;
step 2, determining the spatial position of each camera and one or more spatial anchor points for each camera, so that, according to the anchor-point selection principle, the optical imaging measuring head lies within the field of view of the camera being calibrated during single-camera or multi-camera calibration. During single-camera calibration, first, the reference positioning block (1) is mounted on the corresponding anchor point, the optical imaging measuring head is placed within the field of view of the camera and seated against the reference positioning block (1), and the measuring head is rotated about its measuring tip (2) as the centre, so that the seven marker points (5) on the measuring-head target body (4) rotate concentrically about the measuring tip (2) while the camera acquires multiple images of the measuring head in different poses; secondly, the sub-pixel centre of each LED spot in each image is extracted by threshold segmentation and ellipse fitting, and the centre coordinates of each marker point (5) are computed by the centre-of-gravity method; finally, the intrinsic matrix and distortion parameters of each camera are recovered from the concentric-rotation relation of the marker points (5) and the calibrated invariant spatial distances;
step 3, reselecting and fixing the spatial anchor points and adjusting the pose of the optical imaging measuring head to ensure that it lies within the overlapping field of view of every two or more cameras; in the overlapping field of view, the optical imaging measuring head is rotated while the cameras with overlapping fields of view simultaneously acquire images of the LED marker points of the measuring head; through image processing, centre recognition and marker-point matching, and according to the geometric constraints between the marker points, the fundamental matrix, essential matrix, epipolar geometry, rotation matrix and translation vector between each pair of local cameras are recovered;
step 4, taking each camera as a node and each pair of cameras with overlapping fields of view as an edge, constructing a visual graph of the multi-camera system, where the direction of each edge is determined by the initially calibrated local rotation matrix and translation vector, pointing from the former camera to the camera obtained after the transformation; in the visual graph, a reference camera is first determined, the connection relations among all cameras are established starting from the reference camera, and the rotation matrix and translation vector of each camera relative to the reference camera are computed by a shortest-path method, achieving the global initial calibration of the multi-camera system;
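The shortest-path search over the visual graph described above can be sketched with a breadth-first search (a minimal stand-in for the patent's shortest-path method, counting path length in edges; the camera names and graph edges below are hypothetical, mirroring an 8-camera layout):

```python
from collections import deque

def bfs_parents(edges, ref):
    """Shortest paths (in number of hops) from the reference camera to every
    camera over the undirected visual graph; returns a parent map from which
    each camera's path back to the reference can be read off."""
    adj = {}
    for a, b in edges:
        adj.setdefault(a, []).append(b)
        adj.setdefault(b, []).append(a)
    parent, queue = {ref: None}, deque([ref])
    while queue:
        u = queue.popleft()
        for v in adj.get(u, []):
            if v not in parent:   # first visit is along a shortest path
                parent[v] = u
                queue.append(v)
    return parent

def path_to_ref(parent, cam):
    """Chain of cameras from cam back to the reference camera."""
    path = [cam]
    while parent[path[-1]] is not None:
        path.append(parent[path[-1]])
    return path

# Hypothetical visual graph over cameras C1..C8 (pairs with overlapping fields):
edges = [("C1", "C2"), ("C2", "C3"), ("C3", "C4"), ("C4", "C8"),
         ("C2", "C5"), ("C5", "C6"), ("C6", "C7"), ("C7", "C8")]
parent = bfs_parents(edges, "C8")
route = path_to_ref(parent, "C1")  # chain C1 -> ... -> C8 along a shortest path
```

The pairwise transformations are then composed along each returned chain to obtain every camera's initial pose relative to the reference camera.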
step 5, back-projecting the marker points acquired and identified in step 2 and step 3 into the reference camera coordinate system according to the initial calibration result, thereby establishing a three-dimensional calibration point set in the world coordinate system; all calibration parameters and the three-dimensional calibration point set are then jointly optimized with a sparse bundle adjustment algorithm to obtain a robust, high-precision calibration result;
step 6, if the calibration result meets the measurement precision requirement, the calibration process ends; if it does not, the number of anchor positions of the optical imaging measuring head is increased, images are acquired again, the newly identified and reconstructed three-dimensional calibration points are added to the original marker point set, and all calibration parameters together with the three-dimensional calibration point set are re-optimized with the sparse bundle adjustment algorithm, recalibrating all parameters.
Each marker point (5) consists of six LEDs, and the centre of gravity of the marker point is taken as its centre.
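The centre-of-gravity estimate can be sketched as follows (a minimal illustration; the LED spot coordinates are hypothetical, standing in for the sub-pixel centres produced by threshold segmentation and ellipse fitting):

```python
import numpy as np

def marker_center(led_centers):
    """Estimate a marker-point centre as the centroid (centre of gravity)
    of the sub-pixel centres of its six LED spots."""
    led_centers = np.asarray(led_centers, dtype=float)
    assert led_centers.shape == (6, 2), "each marker point has six LED spots"
    return led_centers.mean(axis=0)

# Hypothetical LED spot centres distributed symmetrically around (10, 20):
spots = [(9, 20), (11, 20), (10, 19), (10, 21), (9.5, 20.5), (10.5, 19.5)]
center = marker_center(spots)  # centroid recovers (10, 20)
```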
The step 2 comprises the following steps:
step 21, establishing a camera pinhole projection model:
first, a world coordinate system $X_W Y_W Z_W$, a measuring-head coordinate system $X_t Y_t Z_t$, an image coordinate system $xy$ and a pixel coordinate system $uv$ are established. Let $N = 7$ be the number of marker points on the optical imaging measuring head, $X_t^j$ the coordinates of marker point $j$ in the measuring-head coordinate system, and $X_W^j$ its coordinates in the world coordinate system. $(R, T)$ are the rotation and translation of the camera relative to the world coordinate system, and $K$ is the intrinsic matrix of the camera, comprising 5 parameters: the horizontal and vertical focal lengths $(f_x, f_y)$, the image skew factor $s$ and the optical-centre coordinates $(u_0, v_0)$; $(k_1, k_2, d_1, d_2)$ are the radial and tangential distortion parameters, respectively. The 7 marker points (5) on the measuring-head target surface (4) satisfy the projection relation:

$$\lambda^j \tilde{x}^j = K R X_t^j + K T, \qquad j = 1, \ldots, 7 \qquad (1)$$

where $\lambda^j$ is the projective depth of marker point $j$ and $\tilde{x}^j$ denotes its normalized image coordinates.
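The projection relation (1) under the pinhole model can be sketched as follows (distortion omitted for brevity; the intrinsic values are hypothetical):

```python
import numpy as np

def project(K, R, T, X_t):
    """Apply eq. (1): lambda * x~ = K R X_t + K T.
    Returns the pixel coordinates and the projective depth lambda."""
    x = K @ (R @ X_t + T)
    lam = x[2]
    return x[:2] / lam, lam

# Hypothetical intrinsics: fx = fy = 800, zero skew, principal point (320, 240).
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])
R = np.eye(3)                  # camera aligned with the measuring-head frame
T = np.array([0.0, 0.0, 5.0])  # marker 5 units in front of the camera
uv, lam = project(K, R, T, np.array([0.0, 0.0, 0.0]))
# a point on the optical axis projects to the principal point, depth 5
```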
Step 22, calibrating image acquisition and marking point identification: selecting a space positioning point according to a principle of space positioning point selection, fixing a reference positioning block (1), rotating an optical imaging measuring head by taking a measuring tip (2) as a circle center, acquiring I1 images with different postures by using a camera, distributing the acquired images in a field space of the camera as much as possible, extracting a sub-pixel center of each LED light spot by adopting a threshold segmentation and ellipse fitting algorithm, and estimating the center of each mark point by adopting a gravity center method;
step 23, according to the invariance of the concentric-circle radii, establishing an error function for the distance between each marker point and the measuring tip as the optical imaging measuring head rotates about the measuring tip (2). The 7 marker points rotate concentrically about the measuring tip (2) with radii $r_j$; because the positions of all marker points are accurately calibrated, the rotation radii $r_j$ of the 7 concentric circles remain constant, i.e. the distance from each marker point to the measuring tip is invariant during rotation. For the $I_1$ images acquired during rotation, each marker point satisfies the radius error equation:

$$\sum_{i=1}^{I_1} \left( r_j^i - r_j \right) = 0 \qquad (2)$$

where $r_j^i$ denotes the estimated rotation radius of the $j$-th ($j = 1, \ldots, 7$) marker point in the $i$-th image, and $r_j$ the calibrated value of its rotation radius;
step 24, estimating initial values of the intrinsic and extrinsic camera parameters: because of errors, initial values of the intrinsic matrix $K$, rotation matrix $R$ and translation vector $T$ are solved by minimizing the radius error function of equation (3), and initial values of the distortion parameters are estimated by minimizing the back-projection error:

$$\sum_{j=1}^{7} \sum_{i=1}^{I_1} \left( r_j^i - r_j \right) \qquad (3)$$
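The radius residuals of equations (2)-(3) can be sketched as follows (synthetic data; the tip position and radii are hypothetical):

```python
import numpy as np

def radius_residuals(points, tip, r_cal):
    """Residuals r_j^i - r_j: distance of each reconstructed marker point
    to the measuring tip, minus the calibrated rotation radius.

    points: (I, 7, 3) marker positions over I rotated poses
    tip:    (3,)      measuring-tip position (the rotation centre)
    r_cal:  (7,)      calibrated radii of the seven marker points
    """
    r = np.linalg.norm(np.asarray(points) - np.asarray(tip), axis=2)  # (I, 7)
    return (r - np.asarray(r_cal)).ravel()

# Synthetic check: markers on concentric circles of radius 1..7 about the origin.
r_cal = np.arange(1.0, 8.0)
angles = np.linspace(0.0, np.pi, 5)  # five rotated poses of the measuring head
points = np.array([[[r * np.cos(a), r * np.sin(a), 0.0] for r in r_cal]
                   for a in angles])
res = radius_residuals(points, np.zeros(3), r_cal)  # exact data: all zeros
```

An optimizer would adjust the camera parameters (which determine the reconstructed `points`) to drive these residuals toward zero.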
step 25, further improving the calibration precision with a sparse bundle adjustment algorithm.
The step 3 comprises the following steps:
step 31, acquiring measuring-head marker point images in the overlapping field of view: reselecting and fixing spatial anchor points (L1, L2, ...), adjusting the pose of the optical imaging measuring head to ensure it lies within the overlapping field of view of every two or more cameras, rotating the optical imaging measuring head, and simultaneously acquiring stereo images of the measuring-head marker points with the cameras whose fields of view overlap in that area;
step 32, identifying and matching the marker points in the stereo images: extracting the sub-pixel centre of each LED spot by threshold segmentation and ellipse fitting, and estimating the centre of each marker point by the centre-of-gravity method; quickly matching the marker points of the stereo image pair according to the marker-point position constraints and the invariance of geometric moments;
step 33, estimating the fundamental matrix and epipolar geometry:
assuming that the two cameras C1 and C2 have an overlapping field of view and, without loss of generality, placing the origin of the world coordinate system at the centre of camera C1, the pose of camera C1 is $(R_1, T_1) = (I, 0)$ and the pose of camera C2 is $(R_2, T_2)$, so that the relative pose of the two cameras is $(R, T) = (R_2, T_2)$. The image coordinates $\tilde{x}_1^j$ and $\tilde{x}_2^j$ of the same marker point $j$ acquired by the two cameras then satisfy:

$$\lambda_2^j \tilde{x}_2^j = \lambda_1^j K_2 R_2 K_1^{-1} \tilde{x}_1^j + K_2 T_2 \qquad (4)$$

Eliminating the unknown scale parameters $\lambda_1^j$ and $\lambda_2^j$ from this equation yields the relation $\tilde{x}_2^{jT} F \tilde{x}_1^j = 0$ with $F = K_2^{-T} [T_2]_\times R_2 K_1^{-1}$, i.e. the required fundamental matrix; if enough marker-point pairs are available, the 9 parameters of the fundamental matrix are solved linearly. Meanwhile, since the intrinsic matrices $K_1, K_2$ of the two cameras C1 and C2 were calibrated in step 2, the normalized coordinates $\tilde{x}_i^j$ ($i = 1, 2$) are used to estimate the epipolar constraint of the stereo images:
$$\tilde{x}_2^{jT} E \tilde{x}_1^j = 0 \qquad (5)$$

where $E = [T_2]_\times R_2$ is the essential matrix, which can be estimated with a 7-point algorithm;
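The essential-matrix relation $E = [T]_\times R$ and the epipolar constraint (5) can be sketched as follows (a synthetic relative pose and marker point; all numeric values are hypothetical):

```python
import numpy as np

def skew(t):
    """Cross-product matrix [t]x such that skew(t) @ v == np.cross(t, v)."""
    return np.array([[0.0, -t[2], t[1]],
                     [t[2], 0.0, -t[0]],
                     [-t[1], t[0], 0.0]])

def essential(R, T):
    """Essential matrix E = [T]x R for relative pose (R, T)."""
    return skew(T) @ R

# Synthetic relative pose: 30-degree rotation about z, unit baseline along x.
th = np.deg2rad(30.0)
R = np.array([[np.cos(th), -np.sin(th), 0.0],
              [np.sin(th),  np.cos(th), 0.0],
              [0.0,         0.0,        1.0]])
T = np.array([1.0, 0.0, 0.0])
E = essential(R, T)

# A 3-D marker point in camera-1 coordinates and its normalized image in each view:
X1 = np.array([0.3, -0.2, 4.0])
X2 = R @ X1 + T
x1, x2 = X1 / X1[2], X2 / X2[2]
residual = x2 @ E @ x1  # epipolar constraint (5): vanishes for a true match
```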
step 34, estimating the relative pose and scale factor between every two cameras:
Because the positions of the 7 marker points on the measuring head are known, the rotation matrix $R$ and translation vector $T$ between each pair of cameras can be uniquely determined. When estimating the scale factor, the normalized rotation radius $\tilde{r}^j$ of marker point $j$ about the measuring tip and its actual calibrated radius $r^j$ give:

$$\lambda^j = r^j / \tilde{r}^j \qquad (6)$$

Because of errors, the mean of the 7 per-point estimates, or the mean over $N$ acquisitions, is taken as the final scale factor $\lambda$:

$$\lambda = \sum_{j=1}^{7} \lambda^j / 7 \qquad (7)$$

$$\lambda = \sum_{i=1}^{N} \left( \sum_{j=1}^{7} \lambda^j / 7 \right) / N \qquad (8)$$
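The averaging of equations (6)-(8) can be sketched as follows (hypothetical radii; the normalized radii stand in for the scale-free stereo reconstruction):

```python
import numpy as np

def scale_factor(r_cal, r_norm):
    """Eqs. (6)-(8): per-marker scale lambda^j = r^j / r~^j, averaged over
    the 7 marker points and the N acquisitions.

    r_cal:  (7,)   calibrated rotation radii
    r_norm: (N, 7) normalized-reconstruction radii over N acquisitions
    """
    lam = np.asarray(r_cal) / np.asarray(r_norm)  # (N, 7) per-point estimates
    return lam.mean()

# If every normalized radius is exactly 1/3 of the calibrated one, lambda = 3.
r_cal = np.arange(1.0, 8.0)
lam = scale_factor(r_cal, np.tile(r_cal / 3.0, (4, 1)))
```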
the step 4 comprises the following steps:
step 41, constructing a visual graph of the multi-camera system according to graph theory:
constructing a visual graph of the multi-camera system by taking each camera as a node and each pair of cameras with overlapping fields of view as an edge; in the visual graph, the direction of each edge is provisionally determined by the initially calibrated local rotation matrix and translation vector, pointing from the former camera to the camera obtained after the transformation;
step 42, determining the reference camera and re-establishing the connection relations between the cameras:
in the visual graph, a reference camera is determined, taking its relations to the adjacent cameras into account; starting from the reference camera, the connection relations between the cameras are readjusted and re-established;
step 43, determining the rotation and translation amount of each camera in the reference camera coordinate system by adopting a shortest path method;
The absolute pose from the reference camera to each camera is solved along the shortest path. Let $C_i, C_j, C_k$ be mutually connected cameras on a path of the visual graph. For each calibrated camera pair, the transformations $(R_{ij}, T_{ij})$ from $C_i$ to $C_j$ and $(R_{jk}, T_{jk})$ from $C_j$ to $C_k$ are obtained; the transformation from $C_i$ to $C_k$ is then computed as:

$$R_{ik} = R_{ij} R_{jk}, \qquad T_{ik} = T_{ij} + R_{ij} T_{jk} \qquad (9)$$
and if there are more than two nodes on the path to the reference camera, equation (9) is applied repeatedly to complete the absolute calibration of all cameras relative to the reference camera.
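Chaining equation (9) along a path can be sketched as follows (synthetic rotations; the frame convention matches the text, i.e. $(R_{ij}, T_{ij})$ maps $C_j$ coordinates into $C_i$ coordinates):

```python
import numpy as np

def compose(Rij, Tij, Rjk, Tjk):
    """Eq. (9): combine the Ci->Cj and Cj->Ck transformations into Ci->Ck."""
    return Rij @ Rjk, Tij + Rij @ Tjk

def chain(path):
    """Fold eq. (9) over a shortest path given as [(R12, T12), (R23, T23), ...]."""
    R, T = np.eye(3), np.zeros(3)
    for Ri, Ti in path:
        R, T = compose(R, T, Ri, Ti)
    return R, T

def rz(deg):
    """Rotation by `deg` degrees about the z axis (for the synthetic example)."""
    a = np.deg2rad(deg)
    return np.array([[np.cos(a), -np.sin(a), 0.0],
                     [np.sin(a),  np.cos(a), 0.0],
                     [0.0, 0.0, 1.0]])

# Two synthetic edges; chaining must equal applying the transforms in sequence.
R12, T12 = rz(90.0), np.array([1.0, 0.0, 0.0])
R23, T23 = rz(90.0), np.array([0.0, 1.0, 0.0])
R13, T13 = chain([(R12, T12), (R23, T23)])
Xk = np.array([1.0, 2.0, 3.0])
direct = R13 @ Xk + T13
sequential = R12 @ (R23 @ Xk + T23) + T12
```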
The step 5 comprises the following steps:
step 51, obtaining an initial three-dimensional calibration point set
In step 22 and step 32, the image coordinates of the marker points at the different calibration positions are acquired and identified in all cameras, and a sparse three-dimensional calibration point set is obtained by back projection using all calibration parameters obtained in the previous four steps; because only simple chained connection relations of the multiple cameras are used during calibration, errors remain in the reconstructed sparse three-dimensional calibration point set and in the calibration parameters;
step 52, constructing a back projection error function
Supposing there are $n$ three-dimensional points and $m$ cameras, and the projection of the $i$-th point on the $j$-th camera image is $x_{ij}$; the parameters of camera $j$ are represented by a vector $a_j$, and each three-dimensional point by a vector $b_i$; the function $Q(a_j, b_i)$ defines the projection of a three-dimensional point onto the image plane of camera $j$, and $d(x, y)$ denotes the Euclidean distance between image points $x$ and $y$. According to the projective geometric relations, the following back-projection error function is established:
$$\min_{a_j, b_i} \sum_{i=1}^{n} \sum_{j=1}^{m} d\left( Q(a_j, b_i), x_{ij} \right)^2 \qquad (10)$$
and step 53, minimizing the back-projection error function by bundle adjustment to obtain an accurate calibration result.
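The back-projection error of equation (10) with a pinhole $Q(a_j, b_i)$ can be sketched as follows (synthetic cameras and points; a bundle adjustment solver, e.g. sparse Levenberg-Marquardt, would minimize the squared norm of these residuals over the camera parameters and the point set):

```python
import numpy as np

def reprojection_residuals(cams, points, obs):
    """Stacked residuals Q(a_j, b_i) - x_ij of eq. (10) for a pinhole model.

    cams:   list of (K, R, T) tuples, one per camera j
    points: (n, 3) array of three-dimensional calibration points b_i
    obs:    dict {(i, j): (u, v)} of observed image coordinates x_ij
    """
    res = []
    for (i, j), x in obs.items():
        K, R, T = cams[j]
        p = K @ (R @ points[i] + T)
        res.append(p[:2] / p[2] - np.asarray(x, dtype=float))
    return np.concatenate(res)

# Synthetic scene: two cameras and two points; the observations are generated
# from the true parameters, so the residuals of eq. (10) vanish exactly.
K = np.array([[800.0, 0.0, 320.0], [0.0, 800.0, 240.0], [0.0, 0.0, 1.0]])
cams = [(K, np.eye(3), np.array([0.0, 0.0, 5.0])),
        (K, np.eye(3), np.array([0.5, 0.0, 5.0]))]
points = np.array([[0.0, 0.0, 0.0], [0.2, -0.1, 0.3]])
obs = {}
for j, (Kj, Rj, Tj) in enumerate(cams):
    for i, b in enumerate(points):
        p = Kj @ (Rj @ b + Tj)
        obs[(i, j)] = tuple(p[:2] / p[2])
res = reprojection_residuals(cams, points, obs)
```

The sparsity exploited by sparse bundle adjustment comes from the fact that each residual depends only on one camera vector $a_j$ and one point vector $b_i$.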
Minimizing the back-projection error function is a nonlinear minimization problem over a parameter vector $P \in \mathbb{R}^M$, formed by the pose parameters of all cameras together with the three-dimensional measurement point set $X \in \mathbb{R}^N$.
In order to calibrate the intrinsic and extrinsic parameters of each single camera and the relative pose of every camera pair, the invention adopts an optical imaging measuring head as the calibration object. Seven marker points with calibrated positions are arranged on the measuring head (each marker point consists of 6 LEDs, and its centre of gravity is the centre of the marker point), and the intrinsic and extrinsic camera parameters are recovered from the concentric rotation of the marker points on the measuring-head target surface about the measuring tip, achieving rapid calibration. Because the measurement space is large and each camera's field of view is limited, the invention treats the multi-camera system as a visual graph structure: each camera is a node of the visual graph, each edge represents an overlapping field of view between two nodes (cameras), and each edge is directed, representing the pose transformation from the former camera to the other. All initial pose parameters of the multi-camera system are then estimated on the visual graph by a shortest-path method. Meanwhile, because noise and errors in the coordinate transformations affect the overall precision of the system, the invention further adopts a sparse bundle adjustment algorithm to jointly optimize the acquired three-dimensional marker points and all initial calibration parameters, improving the precision of the calibration parameters. With this method, the overall calibration error of the system is greatly reduced, and the robustness of each parameter is markedly improved.
Drawings
Fig. 1 is a schematic diagram of the projection relationship between the optical imaging probe and the camera.
Wherein the reference numerals denote: 1, reference positioning block; 2, measuring tip; 3, extension bar; 4, measuring-head target body; 5, the 7 marker points (each marker point consists of 6 LED spots). $X_W Y_W Z_W$ denotes the world coordinate system, $X_t Y_t Z_t$ the measuring-head coordinate system, $xy$ the image coordinate system and $uv$ the pixel coordinate system; P1-P7 denote the 7 marker points on the optical imaging measuring head, and r1-r7 are the distances from the 7 marker points to the measuring tip.
FIG. 2 is a schematic diagram of relative position calibration of two cameras with overlapping fields of view.
Wherein the reference numbers: C1-C8 represent 8 cameras, L1 and L2 represent the positions of positioning points, and R and T represent relative pose transformation relations.
Fig. 3 is a visual diagram construction diagram of a multi-camera system.
Wherein the reference numbers: C1-C8 represent 8 cameras.
Fig. 4 is a schematic diagram of the change in the connection relations of the visual graph before and after the reference camera is determined.
Wherein the reference numbers: C1-C8 represent 8 cameras.
Fig. 5 is a schematic diagram of global calibration under a reference camera coordinate system.
Wherein the reference numbers: C1-C8 represent 8 cameras, L1 and L2 represent the positions of positioning points, and R and T represent relative pose transformation relations.
Detailed Description
The invention will be further explained with reference to the drawings.
The following description takes a vision system of 8 cameras as a concrete example. The invention uses an optical imaging measuring head as shown in Fig. 1, mainly comprising a measuring tip (2), an extension bar (3), a target body (4) and 7 marker points (5) formed by LED spots. The distribution of the 8 cameras is shown in Fig. 3: not all cameras are required to share overlapping fields of view, but at least every two cameras must have an overlapping field of view. According to the spatial distribution of the 8 cameras, spatial anchor points are selected and determined, and the independent calibration of each single camera and the relative pose calibration of every camera pair are completed using the rotation-invariant relation of the optical imaging measuring head about the measuring tip. Then the connection relations of the multiple cameras are constructed as a visual graph according to graph theory, and the calibration of all cameras in the reference camera coordinate system is completed. Finally, in order to eliminate the initial calibration errors, the three-dimensional geometric structure and all calibration parameters are globally optimized with a sparse bundle adjustment algorithm.
The following describes the steps in a specific order of calibration.
The first stage: independent calibration of the intrinsic parameters and distortion parameters of each single camera.
The numbers of the cameras are C1-C8, as shown in FIG. 2.
Step 11, establishing, according to the projection relation between the optical imaging measuring head and the camera shown in Fig. 1, a world coordinate system $X_W Y_W Z_W$, a measuring-head coordinate system $X_t Y_t Z_t$, an image coordinate system $xy$ and a pixel coordinate system $uv$.
Step 12, determining the spatial position of each camera and one or more spatial anchor points for each camera so that, during single-camera or multi-camera calibration, the optical imaging measuring head lies within the field of view of the camera being calibrated. During single-camera calibration, the reference positioning block (1) is fixed on the corresponding anchor point, the optical imaging measuring head is rotated about the measuring tip (2) as the centre, and the camera acquires $I_1$ images of different poses, distributed over the camera's field of view as widely as possible. The sub-pixel centre of each LED spot is extracted by threshold segmentation and ellipse fitting, and the centre coordinates $\tilde{x}^j_i$ ($i = 1, \ldots, I_1$; $j = 1, \ldots, 7$) of each marker point (consisting of 6 LED spots) are estimated by the centre-of-gravity method. Meanwhile, the coordinates of the $N = 7$ marker points of the optical imaging measuring head in the measuring-head coordinate system are denoted $X_t^j$ and their coordinates in the world coordinate system $X_W^j$, and the projection relation of the camera is established according to equation (1).
Step 13, constructing the radius error equation (2) for each marker point according to the concentric-circle radius invariance constraint.
Step 14, minimizing the radius error function of equation (3) to recover initial values of the intrinsic matrix $K$, rotation matrix $R$ and translation vector $T$ of each camera, and estimating the distortion parameters $(k_1, k_2, d_1, d_2)$ of each camera by minimizing the back-projection error.
Step 15, optimizing the intrinsic and extrinsic parameters of each camera with a sparse bundle adjustment algorithm to improve the calibration precision.
The second stage: calibrating the relative pose of every two cameras with overlapping fields of view.
Camera pairs are selected arbitrarily from the 8 cameras and grouped with the smaller camera number first.
Step 21, as shown in Fig. 2, selecting and fixing suitable spatial anchor points (L1, L2, ...), adjusting the pose of the optical imaging measuring head to ensure it lies within the overlapping field of view of two or more cameras, rotating the optical imaging measuring head, and simultaneously acquiring stereo images of the measuring-head marker points with the cameras whose fields of view overlap in that area.
And step 22, extracting the sub-pixel center of each LED spot by a threshold segmentation and ellipse fitting algorithm, and estimating the center of each mark point by the center-of-gravity method; then rapidly matching the mark points of the stereo image pair using the mark-point position constraints and geometric moment invariance.
Step 23, estimating the unknown scale parameters $\lambda_1^j$ and $\lambda_2^j$ of each camera pair according to formula (4); from the relation $F = K_2^{-T} [T_2]_\times R_2 K_1^{-1}$, recovering the fundamental matrix of each camera pair; and recovering the epipolar constraint of each pair according to formula (5).
And 24, recovering the rotation matrix R and the translation vector T of each camera pair, and computing the scale factor λ according to formulas (7) and (8).
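Recovering R and T for a camera pair is conventionally done by decomposing the essential matrix: the textbook SVD decomposition yields four (R, t) candidates, disambiguated afterwards by requiring reconstructed points to lie in front of both cameras. A minimal sketch of that standard decomposition (our illustration, not the patent's exact procedure; names are ours):

```python
import numpy as np

def skew(t):
    """Cross-product matrix [t]x of a 3-vector t."""
    return np.array([[0.0, -t[2], t[1]],
                     [t[2], 0.0, -t[0]],
                     [-t[1], t[0], 0.0]])

def decompose_essential(E):
    """Return the four (R, t) candidates with E ~ [t]x R (SVD method).

    The correct candidate is selected later by the cheirality test:
    triangulated points must have positive depth in both cameras.
    """
    U, _, Vt = np.linalg.svd(E)
    # Force proper rotations (det = +1); flips only change the sign of E,
    # which the four-candidate set absorbs.
    if np.linalg.det(U) < 0:
        U = -U
    if np.linalg.det(Vt) < 0:
        Vt = -Vt
    W = np.array([[0.0, -1.0, 0.0], [1.0, 0.0, 0.0], [0.0, 0.0, 1.0]])
    t = U[:, 2]  # translation direction = left null vector of E (unit norm)
    return [(U @ W @ Vt, t), (U @ W @ Vt, -t),
            (U @ W.T @ Vt, t), (U @ W.T @ Vt, -t)]
```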
And a third stage: global calibration of the multi-camera system based on the visual graph.
Step 31, as shown in fig. 3, constructing the visual graph of the multi-camera system with the 8 cameras as nodes and each pair of cameras with overlapping fields of view as an edge. In the visual graph, the direction of each edge is provisionally determined by the initially calibrated local rotation matrix and translation vector, yielding the directed visual graph shown in fig. 4(a).
And step 32, in the visual graph, taking C8 as the reference camera, readjusting and re-establishing the connection relations between the cameras, and determining a new directed visual graph by the shortest-path method, as shown in fig. 4(b).
Step 33, determining the rotation and translation of each camera in the reference camera coordinate system according to formula (9), as shown in fig. 5. This initially completes the absolute calibration of all cameras relative to the reference camera.
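The shortest-path propagation of steps 32-33 can be sketched as a breadth-first traversal of the visual graph that composes pairwise transforms with formula (9), $R_{ik} = R_{ij} R_{jk}$, $T_{ik} = T_{ij} + R_{ij} T_{jk}$; the camera indices and the edge dictionary below are illustrative assumptions:

```python
import numpy as np
from collections import deque

def chain_to_reference(edges, ref):
    """Propagate (R, T) from the reference camera over a visual graph.

    `edges[(i, j)] = (Rij, Tij)` holds the pairwise calibration from
    camera i to camera j.  BFS (shortest path in hops) composes the
    transforms with formula (9): Rik = Rij Rjk, Tik = Tij + Rij Tjk.
    """
    adj = {}
    for (i, j), (R, T) in edges.items():
        adj.setdefault(i, []).append((j, R, T))
        # Traversing an edge backwards uses the inverse transform.
        adj.setdefault(j, []).append((i, R.T, -R.T @ T))
    pose = {ref: (np.eye(3), np.zeros(3))}
    queue = deque([ref])
    while queue:
        i = queue.popleft()
        Ri, Ti = pose[i]
        for j, Rij, Tij in adj.get(i, []):
            if j not in pose:
                pose[j] = (Ri @ Rij, Ti + Ri @ Tij)  # formula (9)
                queue.append(j)
    return pose
```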
A fourth stage: global optimization calibration based on sparse bundle adjustment.
Step 41, reconstructing a three-dimensional sparse calibration point set $X \in \mathbb{R}^N$ by back-projection from the initial calibration parameters.
And 42, establishing the minimum back-projection error function (10) from all the calibration data and the three-dimensional point set.
And 43, minimizing the back-projection error function with bundle adjustment to obtain an accurate calibration result.
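The cost that the bundle adjustment of steps 42-43 minimizes, the back-projection (reprojection) error of function (10), can be written directly; this minimal evaluation sketch omits the actual sparse nonlinear solver, and all names are illustrative:

```python
import numpy as np

def reproject(K, R, T, X):
    """Pinhole projection of 3-D points X (n x 3) into pixel coordinates."""
    Xc = (R @ X.T).T + T            # points in the camera frame
    x = (K @ Xc.T).T                # homogeneous pixel coordinates
    return x[:, :2] / x[:, 2:3]     # perspective division

def backprojection_error(cams, points, obs):
    """Sum of squared reprojection distances, the cost of formula (10).

    `cams` is a list of (K, R, T); `obs[(i, j)]` is the observed pixel of
    3-D point i in camera j.  Bundle adjustment minimizes this cost over
    all camera parameters and the point set simultaneously, e.g. with a
    sparse Levenberg-Marquardt solver.
    """
    err = 0.0
    for (i, j), x_obs in obs.items():
        K, R, T = cams[j]
        x_hat = reproject(K, R, T, points[i][None, :])[0]
        err += float(np.sum((x_hat - x_obs) ** 2))
    return err
```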
The fifth stage: further optimizing and estimating the calibration parameters until the accuracy requirement of three-dimensional measurement calibration is met.
The invention is characterized in that:
the calibration method uses an imaging measuring head with fixed optical mark points to calibrate the intrinsic and extrinsic parameters of the cameras and the relative poses between cameras; it needs no other equipment, and is simple and low-cost.
The calibration method does not require the optical imaging measuring head to be visible in all camera fields of view, only in the overlapping field of view of each camera pair; it therefore suits multi-camera systems with different distribution structures and can be used for both small-scale and large-scale space calibration.
The visual graph and the bundle adjustment optimization adopted by the calibration method significantly improve calibration efficiency, robustness and accuracy.

Claims (7)

1. A multi-camera system calibration method based on an optical imaging measuring head and a visual diagram structure is characterized by comprising the following steps:
step 1, establishing hardware:
by adopting a plurality of distributed cameras and an optical imaging measuring head; not all cameras are required to share an overlapping field-of-view space, but at least two cameras are required to have overlapping fields of view;
and 2, determining the spatial position of each camera and one or more spatial positioning points for each camera, and, following the positioning-point selection principle, placing the optical imaging measuring head within the field of view of the camera being calibrated during single-camera or multi-camera calibration; during calibration of a single camera, firstly, the reference positioning block (1) is fixed at the corresponding positioning point, the optical imaging measuring head is placed in the field of view of each camera and aligned with the reference positioning block (1), and the measuring head is rotated about its measuring tip (2), so that the seven mark points (5) on the target body (4) of the measuring head rotate concentrically about the measuring tip (2) while the camera acquires a plurality of images of the measuring head in different poses; secondly, the sub-pixel center coordinate of each LED in each image is extracted by a threshold segmentation and ellipse fitting algorithm, and the center coordinate of each mark point (5) is computed by the center-of-gravity method; finally, the intrinsic parameter matrix and distortion parameters of each camera are recovered from the concentric rotation relation of the mark points (5) and the invariance of their calibrated spatial distances;
step 3, reselecting and positioning the spatial positioning points and adjusting the pose of the optical imaging measuring head so that it lies in the overlapping field of view of every two or more cameras; in the overlapping field of view, rotating the optical imaging measuring head while the cameras with overlapping fields of view simultaneously acquire images of its LED mark points, and, through image processing, center recognition and mark-point matching, and from the geometric constraints between the mark points, recovering the fundamental matrix, essential matrix, epipolar geometric relation, rotation matrix and translation vector between each local camera pair;
step 4, taking each camera as a node and each pair of cameras with overlapping fields of view as an edge, constructing the visual graph of the multi-camera system, in which the direction of each edge is determined by the initially calibrated local rotation matrix and translation vector, pointing from the former camera to the transformed camera; in the visual graph, firstly determining a reference camera, establishing the connection relations among all cameras starting from it, and computing the rotation matrix and translation vector of each camera relative to the reference camera by a shortest-path method, thereby completing the global initial calibration of the multi-camera system;
step 5, according to the initial calibration result, back-projecting the mark points acquired and identified in steps 2 and 3 into the reference camera coordinate system, thereby establishing a three-dimensional calibration point set in the world coordinate system; optimizing all calibration parameters and the three-dimensional calibration point set with a sparse bundle adjustment algorithm to obtain a robust, high-accuracy calibration result;
step 6, if the calibration result meets the measurement accuracy requirement, the calibration process is finished; if not, increasing the number of positioning points of the optical imaging measuring head, acquiring images again, adding the newly identified and reconstructed three-dimensional calibration points to the original point set, optimizing all calibration parameters and the three-dimensional calibration point set again with the sparse bundle adjustment algorithm, and recalibrating all parameters.
2. The method for calibrating a multi-camera system based on an optical imaging measuring head and a visual graph structure according to claim 1, characterized in that: each mark point (5) consists of six LEDs, and the center of the mark point is its center of gravity.
3. The method for calibrating a multi-camera system based on an optical imaging probe and a visual pattern structure according to claim 1, characterized in that: the step 2 comprises the following steps:
step 21, establishing a camera pinhole projection model:
first, a world coordinate system $X_W Y_W Z_W$, a measuring head coordinate system $X_t Y_t Z_t$, an image physical coordinate system $x$-$y$ and a pixel coordinate system $u$-$v$ are established; there are N = 7 mark points on the optical imaging measuring head, with $X_t^j$ denoting the coordinates of mark point j in the measuring head coordinate system and $X_w^j$ its coordinates in the world coordinate system; (R, T) are the rotation and translation of the camera relative to the world coordinate system; K is the intrinsic parameter matrix of the camera, containing 5 parameters: the horizontal and vertical focal lengths $(f_x, f_y)$, the skew factor $s$, and the optical center coordinates $(u_0, v_0)$; $(k_1, k_2, d_1, d_2)$ are the radial and tangential distortion parameters; the 7 mark points (5) on the measuring head target surface (4) satisfy the following projection relation:

$$\lambda^j \tilde{x}^j = K R\, X_t^j + K T, \quad j = 1, \ldots, 7 \qquad (1)$$

where $\lambda^j$ is the projection depth of mark point j and $\tilde{x}^j$ denotes the normalized homogeneous image coordinates of mark point j.
Step 22, calibration image acquisition and mark point identification: selecting spatial positioning points according to the positioning-point selection principle and fixing the reference positioning block (1); rotating the optical imaging measuring head about its measuring tip (2) while the camera acquires I1 images of different poses, distributed as widely as possible over the camera's field of view; extracting the sub-pixel center of each LED spot by a threshold segmentation and ellipse fitting algorithm, and estimating the center of each mark point by the center-of-gravity method;
step 23, according to the invariance of the concentric-circle radii, establishing an error function for the distance between each mark point and the measuring tip as the optical imaging measuring head rotates about the measuring tip (2): the 7 mark points rotate concentrically about the measuring tip (2) with radii $r_j$, and because the positions of all mark points are accurately calibrated, the rotation radii $r_j$ of the 7 concentric circles remain constant, i.e. the distance from each mark point to the measuring tip is invariant during the rotation of the measuring head; for the I1 images acquired during rotation, each mark point satisfies the radius error equation:

$$\sum_{i=1}^{I_1} \left( r_j^i - r_j \right) = 0 \qquad (2)$$

where $r_j^i$ denotes the estimated rotation radius of the j-th (j = 1, …, 7) mark point in the i-th image, and $r_j$ denotes the calibrated value of the rotation radius of the j-th mark point;
step 24, estimating initial values of the intrinsic and extrinsic camera parameters: because of measurement errors, the initial values of the intrinsic parameter matrix K, the rotation matrix R and the translation vector T are solved by minimizing the radius error function of formula (3), and initial values of the distortion parameters are estimated by minimizing the back-projection error:

$$\sum_{j=1}^{7} \sum_{i=1}^{I_1} \left( r_j^i - r_j \right) \qquad (3)$$
And 25, further improving the calibration accuracy with a sparse bundle adjustment algorithm.
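The per-image rotation radius $r_j^i$ entering the radius error equation (2) can be obtained by fitting a circle to a mark point's image trajectory across the I1 rotated poses; below is a minimal least-squares (Kasa) circle-fit sketch, our illustration rather than the patent's exact estimator:

```python
import numpy as np

def fit_circle(pts):
    """Least-squares (Kasa) circle fit: returns center (a, b) and radius r.

    Solves x^2 + y^2 + D x + E y + F = 0 linearly in (D, E, F); a simple
    way to obtain the per-image rotation radius r_j^i used in the
    concentric-circle radius error of formula (2).
    """
    pts = np.asarray(pts, dtype=float)
    x, y = pts[:, 0], pts[:, 1]
    A = np.column_stack([x, y, np.ones_like(x)])
    rhs = -(x ** 2 + y ** 2)
    D, E, F = np.linalg.lstsq(A, rhs, rcond=None)[0]
    a, b = -D / 2.0, -E / 2.0
    r = np.sqrt(a * a + b * b - F)
    return (a, b), r
```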
4. The method for calibrating a multi-camera system based on an optical imaging probe and a visual pattern structure according to claim 1, characterized in that: the step 3 comprises the following steps:
step 31, acquiring the measuring head mark point images in the overlapping field of view: reselecting and positioning spatial positioning points (L1, L2, …), adjusting the pose of the optical imaging measuring head so that it lies in the overlapping field of view of every two or more cameras, rotating the measuring head, and letting the cameras with overlapping fields of view in that region simultaneously acquire stereo images of the measuring head mark points;
step 32, identifying and matching the mark points of the stereo images: extracting the sub-pixel center of each LED spot by a threshold segmentation and ellipse fitting algorithm, and estimating the center of each mark point by the center-of-gravity method; rapidly matching the mark points of the stereo image pair using the mark-point position constraints and geometric moment invariance;
step 33, estimating the fundamental matrix and epipolar geometry:

assuming the two cameras C1 and C2 have an overlapping field-of-view space and, without loss of generality, placing the origin of the world coordinate system at the center of camera C1, the pose of camera C1 is $(R_1, T_1) = (I, 0)$, the pose of camera C2 is $(R_2, T_2)$, and the relative pose between the two cameras is (R, T); the image coordinates $\tilde{x}_1^j, \tilde{x}_2^j$ of the same mark point j acquired by the two cameras satisfy:

$$\lambda_2^j \tilde{x}_2^j = \lambda_1^j K_2 R_2 K_1^{-1} \tilde{x}_1^j + K_2 T_2 \qquad (4)$$

the unknown scale parameters $\lambda_1^j$ and $\lambda_2^j$ are estimated from this equation, and the relation $\tilde{x}_2^{jT} F \tilde{x}_1^j = 0$ is obtained, where $F = K_2^{-T} [T_2]_\times R_2 K_1^{-1}$ is the required fundamental matrix; given enough mark point pairs, the 9 parameters of the fundamental matrix are solved linearly. Meanwhile, since the intrinsic parameter matrices $K_1, K_2$ of the two cameras C1 and C2 were calibrated in step 2, the normalized coordinate values $\hat{x}_i^j = K_i^{-1} \tilde{x}_i^j$ (i = 1, 2) are used to estimate the epipolar constraint of the stereo images:

$$\hat{x}_2^{jT} E \, \hat{x}_1^j = 0 \qquad (5)$$

where $E = [T_2]_\times R_2$ is the essential matrix, which can be estimated with the 7-point algorithm;
step 34, estimating the relative pose and scale factor between each camera pair:

because the positions of the 7 mark points on the measuring head are known, the rotation matrix R and translation vector T between each camera pair can be determined uniquely; for the scale factor, using the normalized rotation radius $\tilde{r}^j$ of mark point j about the measuring tip and its actually calibrated radius $r^j$:

$$\lambda^j = r^j / \tilde{r}^j \qquad (6)$$

because of errors, the average over the 7 points, or the average over N repeated measurements, is taken as the final scale factor λ:

$$\lambda = \sum_{j=1}^{7} \lambda^j / 7 \qquad (7)$$

$$\lambda = \sum_{i=1}^{N} \Big( \sum_{j=1}^{7} \lambda^j / 7 \Big) / N \qquad (8)$$
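Formulas (6)-(8) reduce to a ratio-and-average computation; a minimal sketch with illustrative array shapes:

```python
import numpy as np

def scale_factor(r_calib, r_norm):
    """Scale factor of formulas (6)-(8): lambda^j = r^j / r~^j, averaged.

    `r_calib` holds the calibrated mark-point rotation radii and `r_norm`
    the radii of the normalized (up-to-scale) reconstruction; shapes may
    be (7,) for a single measurement or (N, 7) for N repeated
    measurements, and the mean over all entries is returned.
    """
    lam = np.asarray(r_calib, dtype=float) / np.asarray(r_norm, dtype=float)
    return float(lam.mean())
```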
5. the method for calibrating a multi-camera system based on an optical imaging probe and a visual pattern structure according to claim 1, characterized in that: the step 4 comprises the following steps:
step 41, constructing the visual graph of the multi-camera system according to graph theory:
constructing the visual graph by taking each camera as a node and each pair of cameras with overlapping fields of view as an edge; in the visual graph, the direction of each edge is provisionally determined by the initially calibrated local rotation matrix and translation vector, pointing from the former camera to the transformed camera;
step 42, determining the reference camera and re-establishing the connection relations between the cameras:
in the visual graph, a reference camera is determined, taking its relations to adjacent cameras into account during selection, and the connection relations between the cameras are readjusted and re-established starting from the reference camera;
step 43, determining the rotation and translation of each camera in the reference camera coordinate system by the shortest-path method;
solving the absolute pose from the reference camera to each camera along the shortest path: let $C_i, C_j, C_k$ be mutually connected cameras on a path of the visual graph; from the pairwise calibration, the transformations $(R_{ij}, T_{ij})$ from $C_i$ to $C_j$ and $(R_{jk}, T_{jk})$ from $C_j$ to $C_k$ are obtained, and the transformation from $C_i$ to $C_k$ is computed as:

$$R_{ik} = R_{ij} R_{jk}, \quad T_{ik} = T_{ij} + R_{ij} T_{jk} \qquad (9)$$

if there are more than two nodes on the path from the reference camera, the above formula is applied repeatedly to complete the absolute calibration of all cameras relative to the reference camera.
6. The method for calibrating a multi-camera system based on an optical imaging probe and a visual pattern structure according to claim 1, characterized in that: the step 5 comprises the following steps:
step 51, obtaining an initial three-dimensional calibration point set
In the step 22 and the step 32, image coordinates of different calibration points of the marker point in all cameras are collected and identified, a three-dimensional sparse calibration point set is obtained by back projection according to all calibration parameters obtained in the previous 4 steps, and errors exist in the reconstructed three-dimensional sparse calibration point set and the calibration parameters due to the fact that a simple connection relation of multiple cameras is used in calibration;
step 52, constructing the back-projection error function:
supposing there are n three-dimensional coordinate points and m cameras, the projection coordinate of the i-th point on the j-th camera image is $x_{ij}$, the parameters of each camera are represented by a vector $a_j$, and each three-dimensional point by a vector $b_i$; the function $Q(a_j, b_i)$ gives the projection of a three-dimensional point onto the camera image plane, and $d(x, y)$ is the Euclidean distance between image points x and y; from the projective geometric relations, the following back-projection error function is established:

$$\min_{a_j, b_i} \; \sum_{i=1}^{n} \sum_{j=1}^{m} d\big(Q(a_j, b_i), x_{ij}\big)^2 \qquad (10)$$
and step 53, minimizing the back-projection error function with bundle adjustment to obtain an accurate calibration result.
7. The method for calibrating a multi-camera system based on an optical imaging measuring head and a visual graph structure according to claim 5, wherein: the back-projection error function is a nonlinear minimization problem over a parameter vector $P \in \mathbb{R}^M$, where P is composed of the pose parameters of all cameras and the three-dimensional measurement point set $X \in \mathbb{R}^N$.
CN2010105852616A 2010-12-13 2010-12-13 Multi-camera system calibrating method based on optical imaging probe and visual graph structure Active CN102034238B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN2010105852616A CN102034238B (en) 2010-12-13 2010-12-13 Multi-camera system calibrating method based on optical imaging probe and visual graph structure

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN2010105852616A CN102034238B (en) 2010-12-13 2010-12-13 Multi-camera system calibrating method based on optical imaging probe and visual graph structure

Publications (2)

Publication Number Publication Date
CN102034238A true CN102034238A (en) 2011-04-27
CN102034238B CN102034238B (en) 2012-07-18

Family

ID=43887091

Family Applications (1)

Application Number Title Priority Date Filing Date
CN2010105852616A Active CN102034238B (en) 2010-12-13 2010-12-13 Multi-camera system calibrating method based on optical imaging probe and visual graph structure

Country Status (1)

Country Link
CN (1) CN102034238B (en)

Cited By (37)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103196370A (en) * 2013-04-01 2013-07-10 北京理工大学 Measuring method and measuring device of conduit connector space pose parameters
CN103377471A (en) * 2012-04-16 2013-10-30 株式会社理光 Method and device for object positioning, and method and device for determining optimal camera pair
CN103617622A (en) * 2013-12-10 2014-03-05 云南大学 Pose estimation orthogonal iterative optimization algorithm
CN103792950A (en) * 2014-01-06 2014-05-14 中国航空无线电电子研究所 Three-dimensional shooting optical error correcting device and method based on piezoelectric ceramics
CN104145294A (en) * 2012-03-02 2014-11-12 高通股份有限公司 Scene structure-based self-pose estimation
CN104766291A (en) * 2014-01-02 2015-07-08 株式会社理光 Method and system for calibrating multiple cameras
CN105894505A (en) * 2016-03-30 2016-08-24 南京邮电大学 Quick pedestrian positioning method based on multi-camera geometrical constraint
CN106060524A (en) * 2016-06-30 2016-10-26 北京邮电大学 Method and device for setting camera
CN106408614A (en) * 2016-09-27 2017-02-15 中国船舶工业系统工程研究院 Video camera intrinsic parameter calibration method and system suitable for field application
CN106780630A (en) * 2017-01-09 2017-05-31 上海商泰汽车信息系统有限公司 Demarcate panel assembly, vehicle-mounted camera scaling method and device, system
CN106799732A (en) * 2016-12-07 2017-06-06 中国科学院自动化研究所 For the control system and its localization method of the motion of binocular head eye coordination
CN106843224A (en) * 2017-03-15 2017-06-13 广东工业大学 A kind of method and device of multi-vision visual positioning collaboration guiding transport vehicle
CN108827156A (en) * 2018-08-24 2018-11-16 合肥工业大学 A kind of industrial photogrammetry station meter
CN109099883A (en) * 2018-06-15 2018-12-28 哈尔滨工业大学 The big visual field machine vision metrology of high-precision and caliberating device and method
CN109416576A (en) * 2016-06-30 2019-03-01 微软技术许可有限责任公司 Based on the interaction of determining constraint and virtual objects
CN105072414B (en) * 2015-08-19 2019-03-12 浙江宇视科技有限公司 A kind of target detection and tracking and system
CN109785373A (en) * 2019-01-22 2019-05-21 东北大学 A kind of six-freedom degree pose estimating system and method based on speckle
CN109813335A (en) * 2017-11-21 2019-05-28 武汉四维图新科技有限公司 Scaling method, device, system and the storage medium of data collection system
CN110176035A (en) * 2019-05-08 2019-08-27 深圳市易尚展示股份有限公司 Localization method, device, computer equipment and the storage medium of index point
CN110310337A (en) * 2019-06-24 2019-10-08 西北工业大学 A kind of more view optical field imaging system population parameter estimation methods based on light field fundamental matrix
CN110782498A (en) * 2019-09-26 2020-02-11 北京航空航天大学 Rapid universal calibration method for visual sensing network
CN110889901A (en) * 2019-11-19 2020-03-17 北京航空航天大学青岛研究院 Large-scene sparse point cloud BA optimization method based on distributed system
CN110969662A (en) * 2018-09-28 2020-04-07 杭州海康威视数字技术股份有限公司 Fisheye camera internal reference calibration method and device, calibration device controller and system
CN111127560A (en) * 2019-11-11 2020-05-08 江苏濠汉信息技术有限公司 Calibration method and system for three-dimensional reconstruction binocular vision system
CN111243021A (en) * 2020-01-06 2020-06-05 武汉理工大学 Vehicle-mounted visual positioning method and system based on multiple combined cameras and storage medium
CN111308448A (en) * 2018-12-10 2020-06-19 杭州海康威视数字技术股份有限公司 Image acquisition equipment and radar external parameter determination method and device
CN111598954A (en) * 2020-04-21 2020-08-28 哈尔滨拓博科技有限公司 Rapid high-precision camera parameter calculation method
CN111890354A (en) * 2020-06-29 2020-11-06 北京大学 Robot hand-eye calibration method, device and system
CN112781496A (en) * 2021-01-20 2021-05-11 湘潭大学 Measuring head pose calibration technology of non-contact measuring system
CN113077519A (en) * 2021-03-18 2021-07-06 中国电子科技集团公司第五十四研究所 Multi-phase external parameter automatic calibration method based on human skeleton extraction
CN113963058A (en) * 2021-09-07 2022-01-21 于留青 On-line calibration method and device for CT (computed tomography) of preset track, electronic equipment and storage medium
CN114049324A (en) * 2021-11-15 2022-02-15 天津大学 Associated reference telecentric measurement quick calibration method under super-field scale
CN114581515A (en) * 2020-12-02 2022-06-03 中国科学院沈阳自动化研究所 Multi-camera calibration parameter optimization method based on optimal path conversion
CN114705122A (en) * 2022-04-13 2022-07-05 成都飞机工业(集团)有限责任公司 Large-field stereoscopic vision calibration method
CN115578694A (en) * 2022-11-18 2023-01-06 合肥英特灵达信息技术有限公司 Video analysis computing power scheduling method, system, electronic equipment and storage medium
CN116188602A (en) * 2023-04-26 2023-05-30 西北工业大学青岛研究院 High-precision calibration method for underwater multi-vision three-dimensional imaging system
CN117650844A (en) * 2024-01-29 2024-03-05 江西开放大学 VLC relay system safety performance optimization method

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101231749A (en) * 2007-12-20 2008-07-30 昆山华恒工程技术中心有限公司 Method for calibrating industry robot
US20080240612A1 (en) * 2007-03-30 2008-10-02 Intel Corporation Non-overlap region based automatic global alignment for ring camera image mosaic
CN101285680A (en) * 2007-12-12 2008-10-15 中国海洋大学 Line structure optical sensor outer parameter calibration method
CN101334267A (en) * 2008-07-25 2008-12-31 西安交通大学 Digital image feeler vector coordinate transform calibration and error correction method and its device
US7554575B2 (en) * 2005-10-28 2009-06-30 Seiko Epson Corporation Fast imaging system calibration

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7554575B2 (en) * 2005-10-28 2009-06-30 Seiko Epson Corporation Fast imaging system calibration
US20080240612A1 (en) * 2007-03-30 2008-10-02 Intel Corporation Non-overlap region based automatic global alignment for ring camera image mosaic
CN101285680A (en) * 2007-12-12 2008-10-15 中国海洋大学 Line structure optical sensor outer parameter calibration method
CN101231749A (en) * 2007-12-20 2008-07-30 昆山华恒工程技术中心有限公司 Method for calibrating industry robot
CN101334267A (en) * 2008-07-25 2008-12-31 西安交通大学 Digital image feeler vector coordinate transform calibration and error correction method and its device

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Pattern Recognition, 2001, Bent David Olsen et al., "Calibrating a camera network using a domino grid" *
Metrology & Measurement Technology (《计测技术》), 2009, Vol. 29, No. 1, Sun Jiaxing et al., "Research on a dual-camera light-pen 3D coordinate measurement system" *

Cited By (58)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104145294A (en) * 2012-03-02 2014-11-12 高通股份有限公司 Scene structure-based self-pose estimation
CN104145294B (en) * 2012-03-02 2017-03-08 高通股份有限公司 Self pose estimation based on scene structure
CN103377471B (en) * 2012-04-16 2016-08-03 株式会社理光 Object positioning method and device, optimum video camera are to determining method and apparatus
CN103377471A (en) * 2012-04-16 2013-10-30 株式会社理光 Method and device for object positioning, and method and device for determining optimal camera pair
CN103196370A (en) * 2013-04-01 2013-07-10 北京理工大学 Measuring method and measuring device of conduit connector space pose parameters
CN103617622A (en) * 2013-12-10 2014-03-05 云南大学 Pose estimation orthogonal iterative optimization algorithm
CN104766291A (en) * 2014-01-02 2015-07-08 株式会社理光 Method and system for calibrating multiple cameras
CN104766291B (en) * 2014-01-02 2018-04-10 株式会社理光 Multiple cameras scaling method and system
CN103792950A (en) * 2014-01-06 2014-05-14 中国航空无线电电子研究所 Three-dimensional shooting optical error correcting device and method based on piezoelectric ceramics
CN105072414B (en) * 2015-08-19 2019-03-12 浙江宇视科技有限公司 A kind of target detection and tracking and system
CN105894505A (en) * 2016-03-30 2016-08-24 南京邮电大学 Quick pedestrian positioning method based on multi-camera geometrical constraint
CN106060524A (en) * 2016-06-30 2016-10-26 北京邮电大学 Method and device for setting camera
CN106060524B (en) * 2016-06-30 2017-12-29 北京邮电大学 The method to set up and device of a kind of video camera
CN109416576A (en) * 2016-06-30 2019-03-01 微软技术许可有限责任公司 Based on the interaction of determining constraint and virtual objects
CN109416576B (en) * 2016-06-30 2022-04-29 微软技术许可有限责任公司 Interaction with virtual objects based on determined constraints
CN106408614A (en) * 2016-09-27 2017-02-15 中国船舶工业系统工程研究院 Video camera intrinsic parameter calibration method and system suitable for field application
CN106408614B (en) * 2016-09-27 2019-03-15 中国船舶工业系统工程研究院 Camera intrinsic parameter Calibration Method and system suitable for field application
CN106799732A (en) * 2016-12-07 2017-06-06 中国科学院自动化研究所 Control system for binocular head-eye coordinated motion and localization method thereof
CN106780630A (en) * 2017-01-09 2017-05-31 上海商泰汽车信息系统有限公司 Calibration board assembly, and vehicle-mounted camera calibration method, device and system
CN106843224A (en) * 2017-03-15 2017-06-13 广东工业大学 Method and device for cooperatively guiding a transport vehicle through multi-view visual positioning
CN106843224B (en) * 2017-03-15 2020-03-10 广东工业大学 Method and device for cooperatively guiding transport vehicle through multi-view visual positioning
CN109813335A (en) * 2017-11-21 2019-05-28 武汉四维图新科技有限公司 Scaling method, device, system and the storage medium of data collection system
CN109099883A (en) * 2018-06-15 2018-12-28 哈尔滨工业大学 The big visual field machine vision metrology of high-precision and caliberating device and method
CN108827156A (en) * 2018-08-24 2018-11-16 合肥工业大学 Industrial photogrammetry station meter
CN110969662B (en) * 2018-09-28 2023-09-26 杭州海康威视数字技术股份有限公司 Method and device for calibrating internal parameters of fish-eye camera, calibration device controller and system
CN110969662A (en) * 2018-09-28 2020-04-07 杭州海康威视数字技术股份有限公司 Fisheye camera internal reference calibration method and device, calibration device controller and system
CN111308448B (en) * 2018-12-10 2022-12-06 杭州海康威视数字技术股份有限公司 External parameter determining method and device for image acquisition equipment and radar
CN111308448A (en) * 2018-12-10 2020-06-19 杭州海康威视数字技术股份有限公司 Image acquisition equipment and radar external parameter determination method and device
CN109785373B (en) * 2019-01-22 2022-12-23 东北大学 Speckle-based six-degree-of-freedom pose estimation system and method
CN109785373A (en) * 2019-01-22 2019-05-21 东北大学 Speckle-based six-degree-of-freedom pose estimation system and method
CN110176035A (en) * 2019-05-08 2019-08-27 深圳市易尚展示股份有限公司 Method and device for positioning mark point, computer equipment and storage medium
CN110176035B (en) * 2019-05-08 2021-09-28 深圳市易尚展示股份有限公司 Method and device for positioning mark point, computer equipment and storage medium
CN110310337A (en) * 2019-06-24 2019-10-08 西北工业大学 Multi-view light field imaging system full-parameter estimation method based on light field fundamental matrix
CN110310337B (en) * 2019-06-24 2022-09-06 西北工业大学 Multi-view light field imaging system full-parameter estimation method based on light field fundamental matrix
CN110782498B (en) * 2019-09-26 2022-03-15 北京航空航天大学 Rapid universal calibration method for visual sensing network
CN110782498A (en) * 2019-09-26 2020-02-11 北京航空航天大学 Rapid universal calibration method for visual sensing network
CN111127560A (en) * 2019-11-11 2020-05-08 江苏濠汉信息技术有限公司 Calibration method and system for three-dimensional reconstruction binocular vision system
CN111127560B (en) * 2019-11-11 2022-05-03 江苏濠汉信息技术有限公司 Calibration method and system for three-dimensional reconstruction binocular vision system
CN110889901A (en) * 2019-11-19 2020-03-17 北京航空航天大学青岛研究院 Large-scene sparse point cloud BA optimization method based on distributed system
CN110889901B (en) * 2019-11-19 2023-08-08 北京航空航天大学青岛研究院 Large-scene sparse point cloud BA optimization method based on distributed system
CN111243021A (en) * 2020-01-06 2020-06-05 武汉理工大学 Vehicle-mounted visual positioning method and system based on multiple combined cameras and storage medium
CN111598954A (en) * 2020-04-21 2020-08-28 哈尔滨拓博科技有限公司 Rapid high-precision camera parameter calculation method
CN111890354B (en) * 2020-06-29 2022-01-11 北京大学 Robot hand-eye calibration method, device and system
CN111890354A (en) * 2020-06-29 2020-11-06 北京大学 Robot hand-eye calibration method, device and system
CN114581515A (en) * 2020-12-02 2022-06-03 中国科学院沈阳自动化研究所 Multi-camera calibration parameter optimization method based on optimal path conversion
CN114581515B (en) * 2020-12-02 2024-08-20 中国科学院沈阳自动化研究所 Multi-camera calibration parameter optimization method based on optimal path conversion
CN112781496B (en) * 2021-01-20 2022-03-08 湘潭大学 Measuring head pose calibration method of non-contact measuring system
CN112781496A (en) * 2021-01-20 2021-05-11 湘潭大学 Measuring head pose calibration technology of non-contact measuring system
CN113077519A (en) * 2021-03-18 2021-07-06 中国电子科技集团公司第五十四研究所 Automatic multi-camera extrinsic parameter calibration method based on human skeleton extraction
CN113963058A (en) * 2021-09-07 2022-01-21 于留青 Online calibration method and device for preset-trajectory CT (computed tomography), electronic equipment and storage medium
CN114049324A (en) * 2021-11-15 2022-02-15 天津大学 Rapid calibration method for associated-reference telecentric measurement at super-field-of-view scale
CN114049324B (en) * 2021-11-15 2024-08-23 天津大学 Rapid calibration method for correlated reference telecentric measurement under super-view field scale
CN114705122A (en) * 2022-04-13 2022-07-05 成都飞机工业(集团)有限责任公司 Large-field stereoscopic vision calibration method
CN114705122B (en) * 2022-04-13 2023-05-05 成都飞机工业(集团)有限责任公司 Large-view-field stereoscopic vision calibration method
CN115578694A (en) * 2022-11-18 2023-01-06 合肥英特灵达信息技术有限公司 Video analysis computing power scheduling method, system, electronic equipment and storage medium
CN116188602A (en) * 2023-04-26 2023-05-30 西北工业大学青岛研究院 High-precision calibration method for underwater multi-vision three-dimensional imaging system
CN117650844A (en) * 2024-01-29 2024-03-05 江西开放大学 VLC relay system security performance optimization method
CN117650844B (en) * 2024-01-29 2024-04-26 江西开放大学 VLC relay system security performance optimization method

Also Published As

Publication number Publication date
CN102034238B (en) 2012-07-18

Similar Documents

Publication Publication Date Title
CN102034238B (en) Multi-camera system calibrating method based on optical imaging test head and visual graph structure
CN108510551B (en) Method and system for calibrating camera parameters under long-distance large-field-of-view condition
CN102376089B (en) Target correction method and system
CN102155923B (en) Stitching measurement method and system based on a three-dimensional target
CN103530880B (en) Camera calibration method based on a projected Gaussian grid pattern
CN105716542B (en) Three-dimensional data stitching method based on flexible feature points
CN112396656B (en) Outdoor mobile robot pose estimation method based on fusion of vision and laser radar
CN110031829B (en) Target accurate distance measurement method based on monocular vision
CN107886547B (en) Fisheye camera calibration method and system
Gerke Using horizontal and vertical building structure to constrain indirect sensor orientation
CN105160702A (en) Stereoscopic image dense matching method and system based on LiDAR point cloud assistance
CN102982551B (en) Method for linearly solving intrinsic parameters of a parabolic catadioptric camera by utilizing three non-parallel straight lines in space
CN103198524A (en) Three-dimensional reconstruction method for large-scale outdoor scene
CN105509733A (en) Measuring method for relative pose of non-cooperative spatial circular object
CN109341668B (en) Multi-camera measuring method based on refraction projection model and light beam tracking method
CN103278138A (en) Method for measuring three-dimensional position and posture of thin component with complex structure
CN110763204B (en) Planar coding target and pose measurement method thereof
Hansen et al. Online continuous stereo extrinsic parameter estimation
CN111524195B (en) Camera calibration method in positioning of cutting head of heading machine
CN110782498B (en) Rapid universal calibration method for visual sensing network
CN110827361A (en) Camera group calibration method and device based on global calibration frame
Perdigoto et al. Calibration of mirror position and extrinsic parameters in axial non-central catadioptric systems
CN116563377A (en) Mars rock measurement method based on hemispherical projection model
CN113947638A (en) Image orthorectification method for fisheye camera
CN113963067B (en) Calibration method for calibrating large-view-field visual sensor by using small target

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
ASS Succession or assignment of patent right

Owner name: SUZHOU DIKA TESTING TECHNOLOGY CO., LTD.

Free format text: FORMER OWNER: XI AN JIAOTONG UNIV.

Effective date: 20150528

C41 Transfer of patent application or patent right or utility model
COR Change of bibliographic data

Free format text: CORRECT: ADDRESS; FROM: 710049 XI AN, SHAANXI PROVINCE TO: 215505 SUZHOU, JIANGSU PROVINCE

TR01 Transfer of patent right

Effective date of registration: 20150528

Address after: No. 58 Lianfeng Road, Changshu City, Suzhou, Jiangsu Province, 215505

Patentee after: Suzhou Cartesan Testing Technology Co., Ltd.

Address before: No. 28 Xianning West Road, Xi'an, Shaanxi Province, 710049, China

Patentee before: Xi'an Jiaotong University

EE01 Entry into force of recordation of patent licensing contract

Application publication date: 20110427

Assignee: Xi'an like Photoelectric Technology Co., Ltd.

Assignor: Suzhou Cartesan Testing Technology Co., Ltd.

Contract record no.: 2015610000089

Denomination of invention: Multi-camera system calibrating method based on optical imaging test head and visual graph structure

Granted publication date: 20120718

License type: Exclusive License

Record date: 20150902

LICC Enforcement, change and cancellation of record of contracts on the licence for exploitation of a patent or utility model