CN103854291A - Camera calibration method in a four-degree-of-freedom binocular vision system - Google Patents
Camera calibration method in a four-degree-of-freedom binocular vision system
- Publication number: CN103854291A (application CN201410123114.5A)
- Authority: CN (China)
- Legal status: Granted (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Abstract
The invention discloses a camera calibration method for a four-degree-of-freedom binocular vision system. The method comprises the following steps: estimating the intrinsic parameters of the left and right cameras in the four-degree-of-freedom binocular vision system using a planar-template-based calibration method; computing the extrinsic parameters of the left and right cameras under the initial attitude of the system; estimating the horizontal and vertical rotation axes of the left camera and of the right camera; and computing the extrinsic parameters of the left and right cameras under the current attitude. With this technical scheme, given the angles through which each camera has rotated about its rotation axes, the corresponding intrinsic and extrinsic parameters of the two cameras can be computed effectively.
Description
Technical field
The present invention relates to the technical field of computer vision, and in particular to a camera calibration method in a four-degree-of-freedom stereo vision system.
Background technology
In computer vision, camera calibration is a key problem in applications such as three-dimensional reconstruction, visual navigation, industrial inspection, robot localization, and three-dimensional measurement, and it has long attracted attention. Camera calibration means determining the intrinsic and extrinsic parameters of a camera. The intrinsic parameters are the camera's inherent geometric and optical parameters, independent of its pose, including the image-center (principal-point) coordinates, focal length, scale factors, and distortion coefficients. Extrinsic calibration means determining the transformation of the camera coordinate system with respect to the world coordinate system, which can be represented by a rotation matrix R and a translation vector t. Since the 1970s, researchers at home and abroad have done a great deal of work on this problem and proposed many methods. These methods can be classified by whether a calibration object is used (calibration-object methods versus calibration-free methods); by the dimensionality of the calibration object (3D, 2D, 1D, and 0D methods); by the camera model adopted (linear-model versus nonlinear-model methods); and by the calibration mode (traditional calibration, active calibration, and self-calibration).
Camera calibration methods fall into three broad families: traditional calibration, active-vision calibration, and self-calibration. Traditional calibration uses a calibration object of known structure and high precision as a spatial reference; constraints on the camera-model parameters are established from the correspondences between spatial points and image points, and the parameters are then solved by an optimization algorithm. Its advantage is high accuracy, which suits applications that demand precision and cameras whose parameters rarely change; its drawbacks are a complex calibration procedure and the need for a high-precision calibration object. The most representative work in this class is the planar-template calibration method proposed by Zhang in 1999 (Zhengyou Zhang, Flexible Camera Calibration by Viewing a Plane from Unknown Orientations. IEEE International Conference on Computer Vision, Vol. 1, 666-673, 1999), which is robust, does not require an expensive, finely machined calibration object, and is highly practical. Active-vision methods calibrate the camera from known information about its motion: the camera is made to perform particular motions, and the special structure of those motions is exploited to compute the intrinsic parameters. Such methods use simple algorithms, often admit linear solutions, and are fairly robust; their drawbacks are the high cost of the system, the need for special experimental equipment, and unsuitability when the motion parameters are unknown or uncontrollable. Self-calibration methods need no calibration object and rely only on correspondences across multiple images. They exploit only the constraints inherent in the intrinsic parameters themselves, independent of the scene and the camera motion, and are therefore flexible and convenient; their main weakness is limited accuracy.
In a multi-camera system, several cameras image the same object from different viewpoints, and the relative translation and rotation between cameras can be computed from the calibrated intrinsic parameters. In a four-degree-of-freedom binocular vision system, however, the two cameras can each rotate about a horizontal axis and a vertical axis, so the relative pose between the two cameras is not fixed, which complicates the calibration of the binocular stereo vision system to a certain degree.
Summary of the invention
The object of the present invention is to provide a camera calibration method in a four-degree-of-freedom binocular vision system: given the angles through which each camera has rotated about its two rotation axes under the current attitude, the intrinsic and extrinsic parameters of the two cameras can be computed.
The camera calibration method in a four-degree-of-freedom binocular vision system proposed by the present invention comprises the following steps:
Step 1: estimate the intrinsic parameters of the left and right cameras in the four-degree-of-freedom binocular vision system using the planar-template-based calibration method;
Step 2: compute the extrinsic parameters R10 and t10 of the left camera and R20 and t20 of the right camera under the initial attitude of the binocular vision system;
Step 3: estimate the horizontal rotation axis A1 and vertical rotation axis A2 of the left camera, and the horizontal rotation axis A3 and vertical rotation axis A4 of the right camera;
Step 4: compute the extrinsic parameters of the left and right cameras under the current attitude.
According to the above technical scheme, the camera calibration method in a four-degree-of-freedom binocular vision system proposed by the present invention has the following beneficial effects:
(1) the horizontal and vertical rotation axes of the left and right cameras in the binocular vision system can be computed effectively;
(2) given the horizontal and vertical rotation axes of the left and right cameras and their rotation angles, the extrinsic parameters of the left and right cameras under the current attitude can be computed effectively.
Brief description of the drawings
Fig. 1 is the flowchart of the camera calibration method in a four-degree-of-freedom binocular vision system according to the present invention;
Fig. 2 is a structural diagram of the four-degree-of-freedom binocular vision system;
Fig. 3a-f are the six planar-template images used to calibrate the intrinsic parameters of the left camera;
Fig. 4a-c are the three planar-template images captured with the left camera under the initial attitude;
Fig. 5a-c are the three planar-template images captured with the right camera under the initial attitude;
Fig. 6a-c are the three planar-template images used to estimate the horizontal rotation axis of the left camera;
Fig. 7a-c are the three planar-template images used to estimate the vertical rotation axis of the left camera;
Fig. 8a-b are a pair of images captured with the left and right cameras under an arbitrary attitude.
Embodiment
To make the object, technical solutions, and advantages of the present invention clearer, the present invention is described in more detail below with reference to specific embodiments and the accompanying drawings.
The camera calibration method in a four-degree-of-freedom binocular vision system proposed by the present invention comprises four parts: calibrating the intrinsic parameters of the left (right) camera; calibrating the extrinsic parameters of the two cameras under the initial attitude; estimating the horizontal (vertical) rotation axis of the left (right) camera; and computing the extrinsic parameters of the two cameras under the current attitude. Fig. 1 is the flowchart of the method and Fig. 2 is the structural diagram of the system. As shown in Figs. 1 and 2, the method comprises the following steps:
Step 1: estimate the intrinsic parameters of the left and right cameras in the four-degree-of-freedom binocular vision system using the planar-template-based calibration method.
This step is described below taking the left camera as an example.
Step 11: capture several (>= 3) calibration images of the planar template in different attitudes with the left camera, as shown in Fig. 3a-f.
Step 12: take each template plane as the XOY plane of a world coordinate system; the camera model then gives the relation between the image coordinate system of each calibration image and the world coordinate system:

s [u, v, 1]^T = K [r1  r2  t] [X, Y, 1]^T = H [X, Y, 1]^T        (1)

where s is a scale factor; u and v are the horizontal and vertical coordinates of an image point in the image coordinate system; K is the intrinsic-parameter matrix of the camera; [R t] are the rotation matrix and translation vector of the camera coordinate system with respect to the world coordinate system; X and Y are the X- and Y-axis coordinates of a three-dimensional point in the world coordinate system; H is the homography from the template plane to the corresponding image plane; r1 and r2 are the first and second column vectors of R; and t is the translation vector.
Step 13: from the relation (1) for each calibration image, estimate the homography H corresponding to that image.
Here the homography H of each calibration image is estimated with the normalized least-squares homography estimation method, which is a matrix estimation method standard in the prior art and is not repeated here.
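The normalized least-squares estimator referred to in Step 13 is the standard normalized DLT. The following is a generic sketch of that technique, not the patent's own code; all function names are illustrative:

```python
import numpy as np

def normalize_points(pts):
    """Similarity transform: zero mean, average distance sqrt(2) from the origin."""
    mean = pts.mean(axis=0)
    scale = np.sqrt(2) / np.mean(np.linalg.norm(pts - mean, axis=1))
    T = np.array([[scale, 0.0, -scale * mean[0]],
                  [0.0, scale, -scale * mean[1]],
                  [0.0, 0.0, 1.0]])
    pts_h = np.column_stack([pts, np.ones(len(pts))])
    return (T @ pts_h.T).T, T

def estimate_homography(src, dst):
    """Normalized DLT: solve A h = 0 by SVD for the 3x3 homography mapping src -> dst."""
    src_n, T_src = normalize_points(src)
    dst_n, T_dst = normalize_points(dst)
    A = []
    for (x, y, _), (u, v, _) in zip(src_n, dst_n):
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    _, _, Vt = np.linalg.svd(np.asarray(A))
    Hn = Vt[-1].reshape(3, 3)          # right singular vector of the smallest singular value
    H = np.linalg.inv(T_dst) @ Hn @ T_src   # undo the normalization
    return H / H[2, 2]
```

With at least four point pairs and exact data this recovers the homography up to scale; the normalization step keeps the linear system well-conditioned for real, noisy corner coordinates.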
Step 14: estimate the intrinsic parameters of the left camera from the estimated homographies.
From formula (1), the relation between a homography and the intrinsic-parameter matrix is

H = [h1  h2  h3] = K [r1  r2  t]        (2)

Further, for each calibration image, its homography H provides two constraints on the intrinsic-parameter matrix of the left camera; since r1 and r2 are orthonormal,

h1^T K^-T K^-1 h2 = 0,    h1^T K^-T K^-1 h1 = h2^T K^-T K^-1 h2        (3)

Thus the constraints provided by the homographies of several (>= 3) calibration images suffice to estimate the intrinsic-parameter matrix K of the left camera.
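The recovery of K from these homography constraints is the closed-form step of Zhang's planar-template method. A sketch under the standard parameterization B = K^-T K^-1 (function names are illustrative, and this is not the patent's own code):

```python
import numpy as np

def v_ij(H, i, j):
    """Constraint row so that v_ij . b = hi^T B hj, with b the 6 entries of symmetric B
    and hi the (0-based) i-th column of H."""
    h = H.T  # h[i] is the i-th column of H
    return np.array([h[i][0] * h[j][0],
                     h[i][0] * h[j][1] + h[i][1] * h[j][0],
                     h[i][1] * h[j][1],
                     h[i][2] * h[j][0] + h[i][0] * h[j][2],
                     h[i][2] * h[j][1] + h[i][1] * h[j][2],
                     h[i][2] * h[j][2]])

def intrinsics_from_homographies(Hs):
    """Recover K from >= 3 homographies via the null space of the stacked constraints."""
    V = []
    for H in Hs:
        V.append(v_ij(H, 0, 1))                    # h1^T B h2 = 0
        V.append(v_ij(H, 0, 0) - v_ij(H, 1, 1))    # h1^T B h1 = h2^T B h2
    _, _, Vt = np.linalg.svd(np.asarray(V))
    b11, b12, b22, b13, b23, b33 = Vt[-1]
    # closed-form extraction of the intrinsics from B (estimated up to scale lam)
    v0 = (b12 * b13 - b11 * b23) / (b11 * b22 - b12 ** 2)
    lam = b33 - (b13 ** 2 + v0 * (b12 * b13 - b11 * b23)) / b11
    alpha = np.sqrt(lam / b11)
    beta = np.sqrt(lam * b11 / (b11 * b22 - b12 ** 2))
    gamma = -b12 * alpha ** 2 * beta / lam
    u0 = gamma * v0 / beta - b13 * alpha ** 2 / lam
    return np.array([[alpha, gamma, u0],
                     [0.0, beta, v0],
                     [0.0, 0.0, 1.0]])
```

Each homography contributes two rows, so three template attitudes already determine the five intrinsic parameters; more images simply overdetermine the least-squares system.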
The intrinsic-parameter matrix of the right camera can of course be obtained by the same steps.
Step 2: compute the extrinsic parameters R10 and t10 of the left camera and R20 and t20 of the right camera under the initial attitude of the binocular vision system.
Step 2 further comprises the following steps:
Step 21: under the initial attitude, capture corresponding planar-template images with the left and right cameras, as shown in Fig. 4a-c and Fig. 5a-c.
Step 22: from the images captured by the left and right cameras, compute the extrinsic parameters R1 and t1 of the left camera and R2 and t2 of the right camera with the direct linear transformation method.
The direct linear transformation method is expressed as

s1 m1 = K1 [R1  t1] M,    s2 m2 = K2 [R2  t2] M        (4)

where M denotes a three-dimensional point (in homogeneous coordinates); s1 and s2 are two scale factors; m1 and m2 are the image points of M in the images captured by the left and right cameras under the initial attitude of the binocular vision system; K1 and K2 are the intrinsic-parameter matrices of the left and right cameras; R1 and t1 are the rotation matrix and translation vector of the world coordinate system with respect to the left camera coordinate system; and R2 and t2 are the rotation matrix and translation vector of the world coordinate system with respect to the right camera coordinate system.
In this step, the image points m1 and m2 and the three-dimensional point M are all known, and the intrinsic-parameter matrices K1 and K2 of the left and right cameras have already been estimated by the planar-template-based calibration method, so formula (4) can be solved by the direct linear transformation method for the extrinsic parameters R1, t1 of the left camera and R2, t2 of the right camera under the initial attitude.
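Since the target here is a planar template and K is known from Step 1, the extrinsics of Step 22 can equivalently be obtained by factoring the template-to-image homography H ~ K[r1 r2 t] instead of solving a general DLT system. A sketch of that standard factorization (illustrative names, not the patent's own code):

```python
import numpy as np

def pose_from_homography(K, H):
    """Recover (R, t) of a planar template from its homography H ~ K [r1 r2 t]."""
    A = np.linalg.inv(K) @ H
    lam = 1.0 / np.linalg.norm(A[:, 0])        # fix the scale so that ||r1|| = 1
    r1, r2, t = lam * A[:, 0], lam * A[:, 1], lam * A[:, 2]
    if t[2] < 0:                               # template must lie in front of the camera
        r1, r2, t = -r1, -r2, -t
    R = np.column_stack([r1, r2, np.cross(r1, r2)])
    U, _, Vt = np.linalg.svd(R)                # project onto the nearest rotation matrix
    return U @ Vt, t
```

The final SVD projection matters for noisy corner data, where the recovered r1 and r2 are only approximately orthonormal.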
Step 23: from the extrinsic parameters R1, t1 of the left camera and R2, t2 of the right camera under the initial attitude, compute the rotation matrix R21 and translation vector t21 of the right camera coordinate system with respect to the left camera coordinate system for each planar-template attitude.
Let M1 and M2 be the coordinates of a three-dimensional point M in the left and right camera coordinate systems respectively; then

M1 = R1 M + t1,    M2 = R2 M + t2        (5)

Eliminating M from formula (5) gives the transformation between the two camera coordinate systems:

M2 = R21 M1 + t21        (6)

From formulas (5) and (6), the rotation matrix R21 and translation vector t21 of the right camera coordinate system with respect to the left camera coordinate system are

R21 = R2 R1^T,    t21 = t2 - R2 R1^T t1        (7)
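Formulas (5)-(7) amount to a few lines of linear algebra; a minimal sketch (illustrative names):

```python
import numpy as np

def relative_pose(R1, t1, R2, t2):
    """Pose of the right camera frame relative to the left, so that
    M2 = R21 @ M1 + t21 for any point M expressed in the two frames."""
    R21 = R2 @ R1.T
    t21 = t2 - R21 @ t1
    return R21, t21
```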
Step 24: average the rotation vectors and translation vectors of the right camera coordinate system with respect to the left camera coordinate system over the several planar-template attitudes, obtaining an average rotation vector (and hence an average rotation matrix) and an average translation vector.
This step uses formula (8) to compute the average rotation vector and average translation vector of the right camera coordinate system with respect to the left camera coordinate system:

a21_mean = (1/n) Σ_{i=1..n} a21^(i),    t21_mean = (1/n) Σ_{i=1..n} t21^(i)        (8)

where R21^(i) and t21^(i) (i = 1, 2, ..., n) are the rotation matrix and translation vector of the right camera coordinate system with respect to the left camera coordinate system estimated from the i-th group of images, and a21^(i) is the rotation vector corresponding to R21^(i). Rotation vectors are converted from rotation matrices with the Rodrigues formula, and the Rodrigues formula is then used again to obtain the average rotation matrix corresponding to the average rotation vector a21_mean.
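The averaging of Step 24 needs the Rodrigues conversion in both directions. A minimal numpy sketch, assuming all rotation angles stay below 180 degrees so the matrix-to-vector conversion is unambiguous (function names are illustrative):

```python
import numpy as np

def rodrigues_to_matrix(rvec):
    """Rodrigues formula: rotation vector (axis * angle) -> rotation matrix."""
    theta = np.linalg.norm(rvec)
    if theta < 1e-12:
        return np.eye(3)
    k = rvec / theta
    Kx = np.array([[0, -k[2], k[1]], [k[2], 0, -k[0]], [-k[1], k[0], 0]])
    return np.eye(3) + np.sin(theta) * Kx + (1 - np.cos(theta)) * (Kx @ Kx)

def matrix_to_rodrigues(R):
    """Inverse map: rotation matrix -> rotation vector (valid for angles < pi)."""
    theta = np.arccos(np.clip((np.trace(R) - 1.0) / 2.0, -1.0, 1.0))
    if theta < 1e-12:
        return np.zeros(3)
    axis = np.array([R[2, 1] - R[1, 2],
                     R[0, 2] - R[2, 0],
                     R[1, 0] - R[0, 1]]) / (2.0 * np.sin(theta))
    return theta * axis

def average_relative_pose(Rs, ts):
    """Mean rotation vector (mapped back to a matrix) and mean translation."""
    a_mean = np.mean([matrix_to_rodrigues(R) for R in Rs], axis=0)
    return rodrigues_to_matrix(a_mean), np.mean(ts, axis=0)
```

Averaging in the rotation-vector domain rather than entry-wise on the matrices keeps the result a valid rotation, which is presumably why the method averages rotation vectors instead of rotation matrices.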
Step 25: from the average rotation matrix and average translation vector, compute the extrinsic parameters R10 and t10 of the left camera and R20 and t20 of the right camera under the initial attitude.
In this step, the extrinsic parameters R10, t10 of the left camera and R20, t20 of the right camera under the initial attitude are computed with formula (9).
Step 3: estimate the horizontal rotation axis A1 and vertical rotation axis A2 of the left camera, and the horizontal rotation axis A3 and vertical rotation axis A4 of the right camera.
This step is described below taking the estimation of the horizontal rotation axis of the left camera as an example.
Step 31: let the extrinsic parameters of the left camera under the initial attitude be R10 and t10, and let its extrinsic parameters after rotating by α degrees about the horizontal axis be R11 and t11. The rotation matrix Rα of the rotated left camera coordinate system with respect to the left camera coordinate system under the initial attitude is computed as

Rα = R11 R10^T        (10)

where R10 is the rotation of the left camera under the initial attitude and R11 is its rotation after turning α degrees about the horizontal axis. In this step, the extrinsic parameters R11 and t11 after the rotation of the left camera about its horizontal axis can be estimated with the direct linear transformation method.
Step 32: using the Rodrigues formula, further estimate the horizontal rotation axis A1 of the left camera from the rotation matrix Rα.
In the embodiment of the present invention, the horizontal rotation axis is estimated from several images captured after rotations about the horizontal axis of the left camera, and the mean of the estimates is taken as the final result. The horizontal rotation axis of the right camera and the vertical rotation axes of both cameras can all be estimated by the same method.
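Steps 31-32 can be sketched as follows, assuming the frame convention used throughout (world-to-camera rotations), under which the rotation of the rotated camera frame relative to the initial frame is Rα = R11 R10^T, and the Rodrigues axis of Rα estimates the physical axis (expressed in the initial camera frame). Function names are illustrative:

```python
import numpy as np

def rotation_axis_from_poses(R10, R11):
    """Estimate the physical rotation axis from the camera rotations before (R10)
    and after (R11) turning about that axis; returns (unit axis, angle in degrees)."""
    R_alpha = R11 @ R10.T
    theta = np.arccos(np.clip((np.trace(R_alpha) - 1.0) / 2.0, -1.0, 1.0))
    axis = np.array([R_alpha[2, 1] - R_alpha[1, 2],
                     R_alpha[0, 2] - R_alpha[2, 0],
                     R_alpha[1, 0] - R_alpha[0, 1]]) / (2.0 * np.sin(theta))
    return axis, np.degrees(theta)
```

In practice one axis estimate per rotated image is computed and the mean is taken, as the embodiment describes.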
Step 4: compute the extrinsic parameters of the left and right cameras under the current attitude.
In this step, the extrinsic parameters of the two cameras under the current attitude are computed with the following formula, where α and β are the angles through which the left camera has rotated about its horizontal and vertical axes under the current attitude; θ and φ are the corresponding angles for the right camera; R1c and T1c denote the extrinsic parameters of the left camera under the current attitude; R2c and T2c denote those of the right camera; and Rod() denotes the operation of obtaining a rotation matrix from a rotation axis and a rotation angle via the Rodrigues formula.
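The Rod() operation and the composition of the two axis rotations can be sketched as below. Since the formula image itself is not reproduced in the text, the composition order (horizontal-axis rotation applied first, then the vertical-axis rotation) and the function names are assumptions:

```python
import numpy as np

def rod(axis, angle_deg):
    """Rod(): rotation matrix from a rotation axis and angle via the Rodrigues formula."""
    a = np.radians(angle_deg)
    k = np.asarray(axis, dtype=float)
    k = k / np.linalg.norm(k)
    Kx = np.array([[0, -k[2], k[1]], [k[2], 0, -k[0]], [-k[1], k[0], 0]])
    return np.eye(3) + np.sin(a) * Kx + (1 - np.cos(a)) * (Kx @ Kx)

def current_rotation(R10, A1, A2, alpha, beta):
    """Left-camera rotation after turning alpha degrees about A1 and then beta about A2.
    The order of composition is an assumption, not taken from the patent's formula."""
    return rod(A2, beta) @ rod(A1, alpha) @ R10
```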
Fig. 2 shows the hardware of the four-degree-of-freedom binocular vision system: the system comprises two cameras, each of which can rotate about a horizontal rotation axis and a vertical rotation axis, as shown in the figure.
In the example of the present invention, the experimental data were obtained as follows. With the attitudes of the left and right cameras held fixed, the attitude of the planar template was changed repeatedly to capture the images used for calibrating the intrinsic parameters; part of these images are shown in Fig. 3. Under the initial attitude of the four-degree-of-freedom binocular stereo system, images of the planar template in different attitudes were captured with the left and right cameras respectively; part of these images are shown in Figs. 4 and 5. Starting from the initial attitude, the left and right cameras were each rotated about the horizontal axis to capture planar-template images, and likewise each rotated about the vertical axis to capture planar-template images; part of these images are shown in Figs. 6 and 7. Finally, starting from the initial attitude, the left camera was rotated by -2 degrees about its horizontal axis and -24.95 degrees about its vertical axis, and the right camera by 2 degrees about its horizontal axis and -24.26 degrees about its vertical axis, after which a pair of images was captured, as shown in Fig. 8. Each checkerboard square of the planar template measures 40 mm. The concrete implementation steps are:
(1) Estimate the intrinsic parameters of the left and right cameras with the planar-template-based calibration method.
(2) For each group of images captured with the left and right cameras under the initial attitude, compute the extrinsic parameters R1, t1 of the left camera and R2, t2 of the right camera with the direct linear transformation method according to formula (4); compute the rotation matrix R21 and translation vector t21 of the right camera coordinate system with respect to the left camera coordinate system with formula (7); average the rotation vectors and translation vectors obtained from the several groups of images with formula (8); and finally obtain the extrinsic parameters R10, t10 of the left camera and R20, t20 of the right camera under the initial attitude from formula (9).
(3) For each image captured after rotating the left camera about its horizontal axis, compute the rotation matrix Rα of the rotated camera coordinate system with respect to the initial left camera coordinate system with formula (10); using the Rodrigues formula, further estimate the horizontal rotation axis A1 from Rα. The several images captured after horizontal-axis rotations give several estimates of the axis; their mean is taken as the final result.
(4) Compute the vertical rotation axis A2 of the left camera and the horizontal rotation axis A3 and vertical rotation axis A4 of the right camera by the method analogous to step 3.
(5) Compute the extrinsic parameters of the left and right cameras under the current attitude.
To verify the validity of the method of the invention, checkerboard corners were reconstructed using the estimated intrinsic and extrinsic parameters of the rotated cameras, and the distances between adjacent reconstructed corners were collected. All of these distances fall within 40 ± 4 mm, i.e., the measurement accuracy exceeds 90%, which meets the demands of practical application.
The specific embodiments described above further explain the object, technical solutions, and beneficial effects of the present invention. It should be understood that the foregoing are merely specific embodiments of the present invention and do not limit it; any modification, equivalent replacement, improvement, and the like made within the spirit and principles of the present invention shall fall within the protection scope of the present invention.
Claims (10)
1. A camera calibration method in a four-degree-of-freedom binocular vision system, characterized in that the method comprises the following steps:
Step 1: estimate the intrinsic parameters of the left and right cameras in the four-degree-of-freedom binocular vision system using the planar-template-based calibration method;
Step 2: compute the extrinsic parameters R10 and t10 of the left camera and R20 and t20 of the right camera under the initial attitude of the binocular vision system;
Step 3: estimate the horizontal rotation axis A1 and vertical rotation axis A2 of the left camera, and the horizontal rotation axis A3 and vertical rotation axis A4 of the right camera;
Step 4: compute the extrinsic parameters of the left and right cameras under the current attitude.
2. The method according to claim 1, characterized in that step 1 further comprises the following steps:
Step 11: capture several calibration images of the planar template in different attitudes with the camera;
Step 12: take each template plane as the XOY plane of a world coordinate system; the camera model then gives the relation between the image coordinate system of each calibration image and the world coordinate system;
Step 13: from the relations for the several calibration images, estimate the homography H corresponding to each calibration image;
Step 14: estimate the intrinsic parameters of the camera from the estimated homographies.
3. The method according to claim 2, characterized in that the relation between the image coordinate system of a calibration image and the world coordinate system is expressed as

s [u, v, 1]^T = K [r1  r2  t] [X, Y, 1]^T = H [X, Y, 1]^T

where s is a scale factor; u and v are the horizontal and vertical coordinates of an image point in the image coordinate system; K is the intrinsic-parameter matrix of the camera; [R t] are the rotation matrix and translation vector of the camera coordinate system with respect to the world coordinate system; X and Y are the X- and Y-axis coordinates of a three-dimensional point in the world coordinate system; H is the homography from the template plane to the corresponding image plane; r1 and r2 are the first and second column vectors of R; and t is the translation vector.
4. The method according to claim 2, characterized in that in step 14 the intrinsic-parameter matrix K of the camera is estimated from the constraints that the homographies of the several calibration images place on the intrinsic-parameter matrix, the constraints being expressed as

h1^T K^-T K^-1 h2 = 0,    h1^T K^-T K^-1 h1 = h2^T K^-T K^-1 h2

where h1 and h2 are the first two column vectors of the homography.
5. The method according to claim 1, characterized in that step 2 further comprises the following steps:
Step 21: under the initial attitude, capture corresponding planar-template images with the left and right cameras;
Step 22: from the images captured by the left and right cameras, compute the extrinsic parameters R1 and t1 of the left camera and R2 and t2 of the right camera with the direct linear transformation method;
Step 23: from the extrinsic parameters R1, t1 of the left camera and R2, t2 of the right camera under the initial attitude, compute the rotation matrix R21 and translation vector t21 of the right camera coordinate system with respect to the left camera coordinate system for each planar-template attitude;
Step 24: average the rotation vectors and translation vectors of the right camera coordinate system with respect to the left camera coordinate system over the several planar-template attitudes, obtaining an average rotation vector (and the corresponding average rotation matrix) and an average translation vector;
Step 25: from the average rotation matrix and average translation vector, compute the extrinsic parameters R10 and t10 of the left camera and R20 and t20 of the right camera under the initial attitude.
6. The method according to claim 5, characterized in that the direct linear transformation method is expressed as

s1 m1 = K1 [R1  t1] M,    s2 m2 = K2 [R2  t2] M

where M denotes a three-dimensional point; s1 and s2 are two scale factors; m1 and m2 are the image points of M in the images captured by the left and right cameras under the initial attitude of the binocular vision system; K1 and K2 are the intrinsic-parameter matrices of the left and right cameras; R1 and t1 are the rotation matrix and translation vector of the world coordinate system with respect to the left camera coordinate system; and R2 and t2 are the rotation matrix and translation vector of the world coordinate system with respect to the right camera coordinate system.
7. The method according to claim 5, characterized in that in step 23 the rotation matrix R21 and translation vector t21 of the right camera coordinate system with respect to the left camera coordinate system are computed as

R21 = R2 R1^T,    t21 = t2 - R2 R1^T t1
8. The method according to claim 5, characterized in that in step 25 the extrinsic parameters R10 and t10 of the left camera and R20 and t20 of the right camera under the initial attitude are computed with the following formula:
9. The method according to claim 1, characterized in that step 3 further comprises the following steps:
Step 31: let the extrinsic parameters of the camera under the initial attitude be R10 and t10, and let its extrinsic parameters after rotating by α degrees about a rotation axis be R11 and t11; compute the rotation matrix Rα of the rotated camera coordinate system with respect to the camera coordinate system under the initial attitude as

Rα = R11 R10^T

where R10 is the rotation of the camera under the initial attitude and R11 is its rotation after turning α degrees about the rotation axis;
Step 32: estimate the rotation axis of the camera from the rotation matrix Rα.
10. The method according to claim 1, characterized in that in step 4 the extrinsic parameters of the left and right cameras under the current attitude are computed with the following formula, where α and β are the angles through which the left camera has rotated about its horizontal and vertical axes under the current attitude; θ and φ are the corresponding angles for the right camera; R1c and T1c denote the extrinsic parameters of the left camera under the current attitude; R2c and T2c denote those of the right camera; and Rod() denotes the operation of obtaining a rotation matrix from a rotation axis and a rotation angle via the Rodrigues formula.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201410123114.5A CN103854291B (en) | 2014-03-28 | 2014-03-28 | Camera marking method in four-degree-of-freedom binocular vision system |
Publications (2)
Publication Number | Publication Date |
---|---|
CN103854291A true CN103854291A (en) | 2014-06-11 |
CN103854291B CN103854291B (en) | 2017-08-29 |
Family
ID=50861903
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201410123114.5A Active CN103854291B (en) | 2014-03-28 | 2014-03-28 | Camera marking method in four-degree-of-freedom binocular vision system |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN103854291B (en) |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP2535869A2 (en) * | 2011-06-13 | 2012-12-19 | Alpine Electronics, Inc. | Apparatus and method for detecting posture of camera mounted on vehicle |
CN103473758A (en) * | 2013-05-13 | 2013-12-25 | 中国科学院苏州生物医学工程技术研究所 | Secondary calibration method of binocular stereo vision system |
- 2014-03-28: application CN201410123114.5A (CN), granted as patent CN103854291B, status: Active
Non-Patent Citations (2)
Title |
---|
T. XUE et al.: "Complete calibration of a structure-uniform stereovision sensor with free-position planar pattern", SENSORS AND ACTUATORS A: PHYSICAL, vol. 135, no. 1, 30 March 2007 (2007-03-30), pages 185-191, XP005928265, DOI: 10.1016/j.sna.2006.07.004 * |
ZENG Hui et al.: "Homography matrix estimation based on line correspondences and its application in visual measurement", Acta Automatica Sinica (自动化学报), vol. 33, no. 5, 31 May 2007 (2007-05-31), pages 449-455 * |
Cited By (25)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104236456B (en) * | 2014-09-04 | 2016-10-26 | 中国科学院合肥物质科学研究院 | A kind of Robotic Hand-Eye Calibration method based on two-freedom 3D vision sensor |
CN105447877A (en) * | 2015-12-13 | 2016-03-30 | 大巨龙立体科技有限公司 | Parallel dual-camera stereo calibration method |
CN105785989A (en) * | 2016-02-24 | 2016-07-20 | 中国科学院自动化研究所 | System for calibrating distributed network camera by use of travelling robot, and correlation methods |
CN105785989B (en) * | 2016-02-24 | 2018-12-07 | 中国科学院自动化研究所 | Utilize the system and correlation technique of Robot calibration distributed network video camera in traveling |
CN105913410A (en) * | 2016-03-03 | 2016-08-31 | 华北电力大学(保定) | Long-distance moving object height measurement apparatus and method based on machine vision |
CN106799732A (en) * | 2016-12-07 | 2017-06-06 | 中国科学院自动化研究所 | For the control system and its localization method of the motion of binocular head eye coordination |
CN106903665A (en) * | 2017-04-18 | 2017-06-30 | 中国科学院重庆绿色智能技术研究院 | A kind of master-slave mode telesurgery robot control system based on stereoscopic vision |
CN107367229A (en) * | 2017-04-24 | 2017-11-21 | 天津大学 | Free binocular stereo vision rotating shaft parameter calibration method |
CN109118545A (en) * | 2018-07-26 | 2019-01-01 | 深圳市易尚展示股份有限公司 | 3-D imaging system scaling method and system based on rotary shaft and binocular camera |
CN109118545B (en) * | 2018-07-26 | 2021-04-16 | 深圳市易尚展示股份有限公司 | Three-dimensional imaging system calibration method and system based on rotating shaft and binocular camera |
WO2020061771A1 (en) * | 2018-09-25 | 2020-04-02 | 深圳市大疆创新科技有限公司 | Parameter processing method and device for camera and image processing apparatus |
CN109360243A (en) * | 2018-09-28 | 2019-02-19 | 上海爱观视觉科技有限公司 | A kind of scaling method of the movable vision system of multiple degrees of freedom |
WO2020063058A1 (en) * | 2018-09-28 | 2020-04-02 | 上海爱观视觉科技有限公司 | Calibration method for multi-degree-of-freedom movable vision system |
JP2022500956A (en) * | 2018-09-28 | 2022-01-04 | 上海愛観視覚科技有限公司 | 3D calibration method for movable visual system |
WO2020063059A1 (en) * | 2018-09-28 | 2020-04-02 | 上海爱观视觉科技有限公司 | Stereo calibration method for movable vision system |
CN109242914A (en) * | 2018-09-28 | 2019-01-18 | 上海爱观视觉科技有限公司 | A kind of stereo calibration method of movable vision system |
US11847797B2 (en) | 2018-09-28 | 2023-12-19 | Anhui Eyevolution Technology Co., Ltd. | Calibration method for multi-degree-of-freedom movable vision system |
US11663741B2 (en) | 2018-09-28 | 2023-05-30 | Anhui Eyevolution Technology Co., Ltd. | Stereo calibration method for movable vision system |
JP7185821B2 (en) | 2018-09-28 | 2022-12-08 | 安徽愛観視覚科技有限公司 | Stereocalibration method for movable vision system |
CN109677217A (en) * | 2018-12-27 | 2019-04-26 | 魔视智能科技(上海)有限公司 | The detection method of tractor and trailer yaw angle |
CN110246191A (en) * | 2019-06-13 | 2019-09-17 | 易思维(杭州)科技有限公司 | Camera nonparametric model scaling method and stated accuracy appraisal procedure |
CN113472973A (en) * | 2021-05-11 | 2021-10-01 | 浙江大华技术股份有限公司 | Binocular camera |
CN113379848A (en) * | 2021-06-09 | 2021-09-10 | 中国人民解放军陆军军事交通学院军事交通运输研究所 | Target positioning method based on binocular PTZ camera |
CN113538598B (en) * | 2021-07-21 | 2022-03-25 | 北京能创科技有限公司 | Active stereo vision system calibration method |
CN113538598A (en) * | 2021-07-21 | 2021-10-22 | 北京能创科技有限公司 | Active stereo vision system calibration method |
Also Published As
Publication number | Publication date |
---|---|
CN103854291B (en) | 2017-08-29 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN103854291A (en) | Camera calibration method in four-degree of freedom binocular vision system | |
CN102005039B (en) | Fish-eye camera stereo vision depth measuring method based on Taylor series model | |
CN109360240B (en) | Small unmanned aerial vehicle positioning method based on binocular vision | |
CN101581569B (en) | Calibrating method of structural parameters of binocular visual sensing system | |
CN102842117B (en) | Method for correcting kinematic errors in microscopic vision system | |
CN107358633A (en) | Join scaling method inside and outside a kind of polyphaser based on 3 points of demarcation things | |
CN103278138A (en) | Method for measuring three-dimensional position and posture of thin component with complex structure | |
CN102982551B (en) | Method for solving intrinsic parameters of parabolic catadioptric camera linearly by utilizing three unparallel straight lines in space | |
CN104217435A (en) | Method of determining intrinsic parameters of parabolic catadioptric camera through linearity of two mutually-shielded spheres | |
CN105389543A (en) | Mobile robot obstacle avoidance device based on all-dimensional binocular vision depth information fusion | |
CN102013099A (en) | Interactive calibration method for external parameters of vehicle video camera | |
CN104463791A (en) | Fisheye image correction method based on spherical model | |
CN102567991B (en) | A kind of binocular vision calibration method based on concentric circle composite image matching and system | |
CN104268876A (en) | Camera calibration method based on partitioning | |
CN111091076B (en) | Tunnel limit data measuring method based on stereoscopic vision | |
CN102609977A (en) | Depth integration and curved-surface evolution based multi-viewpoint three-dimensional reconstruction method | |
CN105374067A (en) | Three-dimensional reconstruction method based on PAL cameras and reconstruction system thereof | |
CN104807405B (en) | Three-dimensional coordinate measurement method based on light ray angle calibration | |
CN104794718A (en) | Single-image CT (computed tomography) machine room camera calibration method | |
CN113450416B (en) | TCSC method applied to three-dimensional calibration of three-dimensional camera | |
CN113658266A (en) | Moving axis rotation angle visual measurement method based on fixed camera and single target | |
CN103900504A (en) | Nano-scale real-time three-dimensional visual information feedback method | |
CN107909543A (en) | A kind of flake binocular vision Stereo matching space-location method | |
CN114812558B (en) | Monocular vision unmanned aerial vehicle autonomous positioning method combining laser ranging | |
CN104123726A (en) | Blanking point based large forging measurement system calibration method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
GR01 | Patent grant | ||
TR01 | Transfer of patent right |
Effective date of registration: 20220402
Address after: A14-5, 13/F, Block A, Building J1, Phase II, Innovation Industrial Park, No. 2800 Innovation Avenue, High-tech Zone, Hefei, Anhui Province, 230088
Patentee after: Anhui aiguan Vision Technology Co., Ltd.
Address before: No. 95 Zhongguancun East Road, Haidian District, Beijing, 100190
Patentee before: INSTITUTE OF AUTOMATION, CHINESE ACADEMY OF SCIENCES
|
TR01 | Transfer of patent right |