CN104167001B - Large-visual-field camera calibration method based on orthogonal compensation - Google Patents

Large-visual-field camera calibration method based on orthogonal compensation

Info

Publication number
CN104167001B
Authority
CN
China
Prior art keywords
point
prime
orthogonal
video camera
group
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201410432768.6A
Other languages
Chinese (zh)
Other versions
CN104167001A (en)
Inventor
贾振元
刘巍
王亮
刘阳
杨景豪
王灵丽
张弛
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Dalian University of Technology
Original Assignee
Dalian University of Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Dalian University of Technology filed Critical Dalian University of Technology
Priority to CN201410432768.6A priority Critical patent/CN104167001B/en
Publication of CN104167001A publication Critical patent/CN104167001A/en
Application granted granted Critical
Publication of CN104167001B publication Critical patent/CN104167001B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Abstract

The invention discloses a large-visual-field camera calibration method based on orthogonal compensation. It belongs to the field of computer vision inspection and image processing and relates to a method for calibrating the intrinsic and extrinsic parameters of the cameras in a large-forging dimension measurement system. In the usual active-vision calibration process, the two translation guide rails of the electric-control platform are not perfectly orthogonal, yet calibration is commonly performed by blindly selecting a high-precision electric-control platform and treating its translation rails as approximately orthogonal. In the disclosed method, the intrinsic and extrinsic parameters of a camera are obtained by solving for the principal point, the scale factors, the translation vector and the rotation matrix. The rail orthogonality is taken into account during calibration, i.e., orthogonal compensation is introduced, so that the error introduced into the calibration result by the rail non-orthogonality is reduced.

Description

Large-visual-field camera calibration method based on orthogonal compensation
Technical field
The invention belongs to the field of computer vision detection and image processing and relates to a method for calibrating the intrinsic and extrinsic parameters of the cameras in a large-forging dimension measurement system, specifically a large-visual-field camera calibration method based on orthogonal compensation.
Background technology
One of the basic tasks of vision-based dimension measurement of heavy forgings is to recover three-dimensional geometric information from the two-dimensional image information of the forging during the forging process and to obtain dimensional parameters in real time. Computing the corresponding surface point of a space object from an image point requires the camera imaging geometric model to be determined; the parameters of this geometric model are called the camera parameters. The camera intrinsic parameters are related to the geometric and optical characteristics of the camera itself, while the extrinsic parameters give the three-dimensional position and orientation of the camera with respect to a given world coordinate system.
The main factor affecting camera calibration precision is the accurate and robust extraction of calibration feature points. The process of determining the intrinsic and extrinsic parameters of the camera is called camera calibration, and the precision of the calibration method directly affects the precision of computer vision measurement. Research on fast, simple and accurate camera calibration is therefore undoubtedly significant.
According to the calibration mode, existing methods can be divided into traditional calibration methods, camera self-calibration methods and calibration methods based on active vision. Active-vision calibration has the advantages of high stability and robustness; however, the two translation guide rails of the electric-control platform used in active-vision calibration are generally not perfectly orthogonal, and calibration is usually performed by blindly selecting a high-precision electric-control platform and assuming its rails to be approximately orthogonal. To address this problem, the rail orthogonality is taken into account during calibration, i.e., orthogonal compensation is introduced, to reduce the error that the rail non-orthogonality brings to the calibration result.
Content of the invention
The large-visual-field camera calibration method based on orthogonal compensation of the present invention first obtains the principal point coordinates using the zooming method, then uses four groups of orthogonal motions to acquire 12 images and, taking the non-orthogonality of the four-dimensional electric-control platform into account, solves for the scale factors from the 12 images; at this point the intrinsic parameters have been calibrated. Finally, the extrinsic parameters are obtained using the calibrated intrinsic parameters.
The present invention adopts the following technical scheme:
A large-visual-field camera calibration method based on orthogonal compensation. In the usual active-vision calibration process the two translation guide rails of the electric-control platform are not perfectly orthogonal, and calibration is performed by blindly selecting a high-precision electric-control platform and assuming its rails to be approximately orthogonal. For this reason, a camera calibration method based on orthogonal compensation is proposed, which obtains the intrinsic and extrinsic parameters of the camera by solving for the principal point, the scale factors, the translation vector and the rotation matrix. The steps are as follows:
Step 1: Determine the principal point by the zooming method
The two-dimensional target is photographed with the camera at different focal lengths. When the focal length changes, the optical center moves along the optical axis and the image of the two-dimensional target in the camera field of view is scaled proportionally, yielding target images at different fields of view. One image point remains unchanged across the different images; it is called the scaling center, and the scaling center coincides with the principal point in the images at the different fields of view. The principal point is therefore the least-squares intersection of the lines traced by the individual target feature points as the focal length is varied.
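As an illustration of this least-squares line intersection, the following Python sketch estimates the principal point from the pixel trajectories of the target feature points over the different focal lengths. It is a minimal sketch under assumed inputs; the function name and the track layout are illustrative and not part of the patent.

```python
import numpy as np

def principal_point_from_zoom_tracks(tracks):
    """Estimate the principal point as the least-squares intersection of the
    lines traced by the target feature points while the focal length changes.

    tracks: list of (N_i, 2) arrays; each array holds one feature point's
            pixel positions in the images taken at the different focal lengths.
    Returns (u0, v0).
    """
    A = np.zeros((2, 2))
    b = np.zeros(2)
    for pts in tracks:
        pts = np.asarray(pts, dtype=float)
        p0 = pts.mean(axis=0)                      # a point on the fitted line
        _, _, vt = np.linalg.svd(pts - p0)
        d = vt[0]                                  # unit direction of the fitted line
        P = np.eye(2) - np.outer(d, d)             # projector onto the line normal
        A += P                                     # accumulate normal equations of
        b += P @ p0                                #   min_x  sum ||P_i (x - p0_i)||^2
    return np.linalg.solve(A, b)
```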
Step 2: Acquire target images by active-vision calibration with a four-dimensional electric-control platform
The four-dimensional electric-control platform is controlled to drive the camera through four groups of in-plane orthogonal motions. Each group of orthogonal motions consists of two translations; the camera stops at three positions during each orthogonal motion and thus yields three images. Provided that the target stays within the field of view, the camera is given a pitch or pan angle of not less than 5 degrees between successive groups. The four groups of two pure translations each thus yield 12 target images in total.
Step 3: Solve for the scale factors
Parallel lines intersect the plane at infinity in the same point at infinity, i.e., the vanishing point. The 12 distortion-corrected images are grouped for vanishing-point estimation: in each group of 3 images, the lines joining corresponding points yield 2 vanishing points, giving 8 vanishing points from the 4 groups. The 8 vanishing-point coordinates, the principal point coordinates and the angle between the guide rails are substituted into the following formula:

$$\cos\theta = \frac{v_{i1}^{T} w\, v_{i2}}{\sqrt{v_{i1}^{T} w\, v_{i1}}\;\sqrt{v_{i2}^{T} w\, v_{i2}}},\qquad i = 1,2,3,4 \qquad (1)$$

where θ is the angle between the two translation guide rails of the electric-control platform, v_{i1} and v_{i2} are the pixel coordinates (in homogeneous form) of the vanishing-point pair obtained from the two translational motions of the i-th orthogonal-motion group, and w = K^{-T}K^{-1} is the image of the absolute conic, whose concrete form is:

$$w = \begin{bmatrix} 1/f_x^2 & 0 & -u_0/f_x^2 \\ 0 & 1/f_y^2 & -v_0/f_y^2 \\ -u_0/f_x^2 & -v_0/f_y^2 & u_0^2/f_x^2 + v_0^2/f_y^2 + 1 \end{bmatrix} \qquad (2)$$

The two formulas above, together with the eight vanishing-point coordinates, give a system of four nonlinear equations in two unknowns, which is solved iteratively for the scale factors f_x and f_y by nonlinear least squares.
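The nonlinear least-squares solve for f_x and f_y described above can be sketched as follows, assuming the vanishing-point pairs, the principal point from Step 1 and the measured rail angle θ are already available; the function name and the initial guess are illustrative only.

```python
import numpy as np
from scipy.optimize import least_squares

def solve_scale_factors(vp_pairs, u0, v0, theta, f0=(5000.0, 5000.0)):
    """Solve equations (1)-(2) for the scale factors fx and fy.

    vp_pairs: four pairs (v_i1, v_i2) of vanishing points in homogeneous
              pixel coordinates, one pair per orthogonal-motion group.
    u0, v0:   principal point coordinates from Step 1.
    theta:    measured angle (rad) between the two translation rails; the
              orthogonal compensation lies in allowing theta != pi/2.
    f0:       initial guess for (fx, fy).
    """
    def residuals(f):
        fx, fy = f
        # Image of the absolute conic, equation (2).
        w = np.array([
            [1.0 / fx**2, 0.0,          -u0 / fx**2],
            [0.0,         1.0 / fy**2,  -v0 / fy**2],
            [-u0 / fx**2, -v0 / fy**2,  u0**2 / fx**2 + v0**2 / fy**2 + 1.0],
        ])
        res = []
        for v1, v2 in vp_pairs:
            v1 = np.asarray(v1, dtype=float)
            v2 = np.asarray(v2, dtype=float)
            c = (v1 @ w @ v2) / (np.sqrt(v1 @ w @ v1) * np.sqrt(v2 @ w @ v2))
            res.append(c - np.cos(theta))          # equation (1)
        return res

    return least_squares(residuals, x0=np.asarray(f0)).x
```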
Step 4: Calibrate the extrinsic parameters
Each pair of corresponding image points, expressed in homogeneous coordinates, yields one linear equation:
$$u u' f_{11} + v u' f_{12} + u' f_{13} + u v' f_{21} + v v' f_{22} + v' f_{23} + u f_{31} + v f_{32} + f_{33} = 0 \qquad (3)$$
where f_{ij} is the element in the i-th row and j-th column of the fundamental matrix; in vector form,
$$\left[\,u u',\ v u',\ u',\ u v',\ v v',\ v',\ u,\ v,\ 1\,\right]\mathbf{f} = 0 \qquad (4)$$
where f = [f_{11}, f_{12}, f_{13}, f_{21}, f_{22}, f_{23}, f_{31}, f_{32}, f_{33}]^T is a 9-dimensional vector;
Given n pairs of corresponding points {m_i, m_i'}, the following system of homogeneous linear equations is obtained:

$$A\mathbf{f} = \begin{bmatrix} u_1 u_1' & v_1 u_1' & u_1' & u_1 v_1' & v_1 v_1' & v_1' & u_1 & v_1 & 1 \\ \vdots & \vdots & \vdots & \vdots & \vdots & \vdots & \vdots & \vdots & \vdots \\ u_n u_n' & v_n u_n' & u_n' & u_n v_n' & v_n v_n' & v_n' & u_n & v_n & 1 \end{bmatrix}\mathbf{f} = 0 \qquad (5)$$

Given at least 8 pairs of corresponding image points, the above homogeneous linear system can be solved up to an unknown non-zero scale factor; the constraint ||f|| = 1 is applied here. When the corresponding image points are accurately matched, the fundamental matrix can be recovered from only 8 point pairs. The essential matrix Ê, defined up to an unknown non-zero scale factor s, is then obtained from the fundamental matrix and the camera intrinsic parameters. E* = U diag((σ₁+σ₂)/2, (σ₁+σ₂)/2, 0) V^T is taken as an approximation of Ê, where σ₁ ≥ σ₂ ≥ σ₃ are the singular values of Ê.
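The linear solve under the constraint ||f|| = 1 and the projection onto the essential manifold can be sketched as follows; the relation Ê = K'ᵀFK used to pass from the fundamental to the essential matrix is the standard one and is assumed here, since the patent text does not reproduce that formula, and the names are illustrative.

```python
import numpy as np

def fundamental_from_points(pts, pts_p):
    """Linear estimate of the fundamental matrix from n >= 8 matched points
    (equations (3)-(5)), under the constraint ||f|| = 1.

    pts, pts_p: (n, 2) arrays of pixel coordinates (u, v) and (u', v').
    """
    u, v = pts[:, 0], pts[:, 1]
    up, vp = pts_p[:, 0], pts_p[:, 1]
    A = np.column_stack([u * up, v * up, up,
                         u * vp, v * vp, vp,
                         u, v, np.ones(len(u))])     # one row per equation (3)
    _, _, vt = np.linalg.svd(A)
    return vt[-1].reshape(3, 3)                      # ||f|| = 1 solution

def essential_approx(F, K, K_prime=None):
    """Essential matrix up to scale and its projection onto the essential
    manifold, E* = U diag((s1+s2)/2, (s1+s2)/2, 0) V^T."""
    K_prime = K if K_prime is None else K_prime
    E_hat = K_prime.T @ F @ K                        # assumed standard relation
    U, S, Vt = np.linalg.svd(E_hat)
    s = (S[0] + S[1]) / 2.0
    return U @ np.diag([s, s, 0.0]) @ Vt
```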
From the essential matrix E*, the extrinsic parameters [R | t] between the cameras can be estimated; there are the following 4 possible combinations:
$$[UWV^{T} \mid u_3];\quad [UWV^{T} \mid -u_3];\quad [UW^{T}V^{T} \mid u_3];\quad [UW^{T}V^{T} \mid -u_3] \qquad (6)$$
where u_3 is the third column vector of U and W is the orthogonal matrix used in the standard decomposition of the essential matrix,

$$W = \begin{bmatrix} 0 & -1 & 0 \\ 1 & 0 & 0 \\ 0 & 0 & 1 \end{bmatrix}$$
The correct parameter combination is chosen with the chirality constraint, i.e., by requiring that the yaw angle is a positive acute angle and that the x-direction component of the translation vector is negative. Two feature points whose separation is accurately known are then reconstructed using the obtained camera intrinsic parameters and the extrinsic parameters known up to a scale factor; the ratio of the reconstructed distance to the actual distance is the scale factor s.
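A sketch of the four-candidate decomposition of equation (6) and of the selection rule stated above; the yaw-angle convention used here is an assumption, since the patent does not define it.

```python
import numpy as np

def pose_candidates(E_star):
    """The four candidate extrinsic combinations of equation (6)."""
    U, _, Vt = np.linalg.svd(E_star)
    if np.linalg.det(U) < 0:                 # keep proper rotations
        U = -U
    if np.linalg.det(Vt) < 0:
        Vt = -Vt
    W = np.array([[0.0, -1.0, 0.0],
                  [1.0,  0.0, 0.0],
                  [0.0,  0.0, 1.0]])
    u3 = U[:, 2]
    return [(U @ W @ Vt,   u3), (U @ W @ Vt,   -u3),
            (U @ W.T @ Vt, u3), (U @ W.T @ Vt, -u3)]

def select_pose(candidates):
    """Pick the candidate whose yaw angle is a positive acute angle and whose
    translation has a negative x component, as stated in the text."""
    for R, t in candidates:
        yaw = np.arctan2(R[0, 2], R[2, 2])   # rotation about the camera y axis (assumed convention)
        if 0.0 < yaw < np.pi / 2.0 and t[0] < 0.0:
            return R, t
    return None
```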
Brief description
Fig. 1 is a schematic diagram of determining the principal point by zooming.
Fig. 2 is a structural schematic of the apparatus of the invention.
Fig. 3 is a schematic diagram of the epipolar geometry in binocular stereo vision.
In the figures: 1 platform; 2 left-camera electric-control platform; 3 right-camera electric-control platform; 4 left camera; 5 right camera; 6 two-dimensional checkerboard calibration board.
Specific embodiment
The specific embodiment of the present invention is further described below in conjunction with the accompanying drawings and the technical scheme.
Camera calibration generally adopts the classical pinhole imaging model, whose expression is as follows:

$$\rho_c \begin{bmatrix} u \\ v \\ 1 \end{bmatrix} = K\,[R \mid t]\begin{bmatrix} X_w \\ Y_w \\ Z_w \\ 1 \end{bmatrix},\qquad K = \begin{bmatrix} \alpha_x & 0 & u_0 \\ 0 & \alpha_y & v_0 \\ 0 & 0 & 1 \end{bmatrix}$$

where (X_w, Y_w, Z_w, 1)^T are the homogeneous coordinates of the space point in the world coordinate system, (u, v, 1)^T are the homogeneous coordinates of the corresponding image point in the pixel coordinate system o_0uv, α_x = f/dx is the scale factor along the u axis and α_y = f/dy is the scale factor along the v axis of the o_0uv coordinate system, f is the lens focal length, dx and dy are the horizontal and vertical physical sizes of a pixel, (u_0, v_0) are the principal point coordinates, ρ_c is a proportionality coefficient, K is the camera intrinsic parameter matrix, and [R | t] is the camera extrinsic parameter matrix, in which R is the rotation matrix and t is the translation vector.
The camera intrinsic parameters comprise the principal point coordinates (u_0, v_0), the scale factors α_x and α_y, the radial distortion coefficients k_1 and k_2, and the tangential distortion coefficients p_1 and p_2. The camera extrinsic parameters describe the orientation of the camera coordinate system with respect to the world coordinate system, namely the rotation matrix R and the translation vector t.
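For illustration, the pinhole model with the distortion coefficients listed above can be written as the following sketch; the distortion terms follow the conventional Brown–Conrady form, which the patent does not spell out explicitly.

```python
import numpy as np

def project_point(Xw, K, R, t, dist=(0.0, 0.0, 0.0, 0.0)):
    """Project a world point with the pinhole model, including the radial
    (k1, k2) and tangential (p1, p2) distortion coefficients listed above."""
    k1, k2, p1, p2 = dist
    Xc = R @ np.asarray(Xw, dtype=float) + t     # world -> camera coordinates
    x, y = Xc[0] / Xc[2], Xc[1] / Xc[2]          # ideal normalized coordinates
    r2 = x * x + y * y
    radial = 1.0 + k1 * r2 + k2 * r2 * r2
    xd = x * radial + 2.0 * p1 * x * y + p2 * (r2 + 2.0 * x * x)
    yd = y * radial + p1 * (r2 + 2.0 * y * y) + 2.0 * p2 * x * y
    u = K[0, 0] * xd + K[0, 2]                   # alpha_x * xd + u0
    v = K[1, 1] * yd + K[1, 2]                   # alpha_y * yd + v0
    return np.array([u, v])
```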
Step 1: Determine the principal point by the zooming method
The two-dimensional target is photographed with the camera at different focal lengths, as shown in Fig. 1. When the focal length changes, the optical center moves along the optical axis and the image of the two-dimensional target in the camera field of view is scaled proportionally, yielding target images at different fields of view. One image point remains unchanged across the different images; it is called the scaling center, and it coincides with the principal point in the images at the different fields of view. The principal point is therefore the least-squares intersection of the lines traced by the individual target feature points as the focal length is varied.
Step 2: Acquire target images by active-vision calibration with a four-dimensional electric-control platform
The experimental setup is built as shown in Fig. 2: two cameras are fixed on two four-dimensional electric-control platforms. Each platform is controlled to drive its camera through four groups of in-plane orthogonal motions. Each group of orthogonal motions consists of two translations; the camera stops at three positions during each orthogonal motion and thus yields three images. Provided that the target stays within the field of view, the camera is given a pitch or pan angle of not less than 5 degrees between successive groups. The four groups of two pure translations each yield 12 target images in total.
Step 3: Solve for the scale factors
Parallel lines intersect the plane at infinity in the same point at infinity, i.e., the vanishing point. The 12 distortion-corrected images are grouped for vanishing-point estimation: in each group of 3 images, the lines joining corresponding points yield 2 vanishing points, giving 8 vanishing points from the 4 groups. The 8 vanishing-point coordinates, the principal point coordinates and the angle between the guide rails are substituted into the following formula:

$$\cos\theta = \frac{v_{i1}^{T} w\, v_{i2}}{\sqrt{v_{i1}^{T} w\, v_{i1}}\;\sqrt{v_{i2}^{T} w\, v_{i2}}},\qquad i = 1,2,3,4 \qquad (1)$$

where θ is the angle between the two translation guide rails of the electric-control platform, v_{i1} and v_{i2} are the pixel coordinates (in homogeneous form) of the vanishing-point pair obtained from the two translational motions of the i-th orthogonal-motion group, and w = K^{-T}K^{-1} is the image of the absolute conic, whose concrete form is:

$$w = \begin{bmatrix} 1/f_x^2 & 0 & -u_0/f_x^2 \\ 0 & 1/f_y^2 & -v_0/f_y^2 \\ -u_0/f_x^2 & -v_0/f_y^2 & u_0^2/f_x^2 + v_0^2/f_y^2 + 1 \end{bmatrix} \qquad (2)$$

The two formulas above, together with the eight vanishing-point coordinates, give a system of four nonlinear equations in two unknowns, which is solved iteratively for the scale factors f_x and f_y by nonlinear least squares.
Step 4: Calibrate the extrinsic parameters
Fig. 3 shows the epipolar geometry of the binocular stereo vision system.
Each pair of corresponding image points, expressed in homogeneous coordinates, yields one linear equation:
$$u u' f_{11} + v u' f_{12} + u' f_{13} + u v' f_{21} + v v' f_{22} + v' f_{23} + u f_{31} + v f_{32} + f_{33} = 0 \qquad (3)$$
where f_{ij} is the element in the i-th row and j-th column of the fundamental matrix; in vector form,
$$\left[\,u u',\ v u',\ u',\ u v',\ v v',\ v',\ u,\ v,\ 1\,\right]\mathbf{f} = 0 \qquad (4)$$
where f = [f_{11}, f_{12}, f_{13}, f_{21}, f_{22}, f_{23}, f_{31}, f_{32}, f_{33}]^T is a 9-dimensional vector;
Given n pairs of corresponding points {m_i, m_i'}, the following system of homogeneous linear equations is obtained:

$$A\mathbf{f} = \begin{bmatrix} u_1 u_1' & v_1 u_1' & u_1' & u_1 v_1' & v_1 v_1' & v_1' & u_1 & v_1 & 1 \\ \vdots & \vdots & \vdots & \vdots & \vdots & \vdots & \vdots & \vdots & \vdots \\ u_n u_n' & v_n u_n' & u_n' & u_n v_n' & v_n v_n' & v_n' & u_n & v_n & 1 \end{bmatrix}\mathbf{f} = 0 \qquad (5)$$

Given at least 8 pairs of corresponding image points, the above homogeneous linear system can be solved up to an unknown non-zero scale factor; the constraint ||f|| = 1 is applied here. When the corresponding image points are accurately matched, the fundamental matrix can be recovered from only 8 point pairs. The essential matrix Ê, defined up to an unknown non-zero scale factor s, is then obtained from the fundamental matrix and the camera intrinsic parameters. E* = U diag((σ₁+σ₂)/2, (σ₁+σ₂)/2, 0) V^T is taken as an approximation of Ê, where σ₁ ≥ σ₂ ≥ σ₃ are the singular values of Ê.
From the essential matrix E*, the extrinsic parameters [R | t] between the cameras can be estimated; there are the following 4 possible combinations:
$$[UWV^{T} \mid u_3];\quad [UWV^{T} \mid -u_3];\quad [UW^{T}V^{T} \mid u_3];\quad [UW^{T}V^{T} \mid -u_3] \qquad (6)$$
where u_3 is the third column vector of U and W is the orthogonal matrix used in the standard decomposition of the essential matrix,

$$W = \begin{bmatrix} 0 & -1 & 0 \\ 1 & 0 & 0 \\ 0 & 0 & 1 \end{bmatrix}$$
The correct parameter combination is chosen with the chirality constraint, i.e., by requiring that the yaw angle is a positive acute angle and that the x-direction component of the translation vector is negative. Two feature points whose separation is accurately known are then reconstructed using the obtained camera intrinsic parameters and the extrinsic parameters known up to a scale factor; the ratio of the reconstructed distance to the actual distance is the scale factor s.
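The final scale recovery described above can be sketched as follows, assuming the two feature points with accurately known separation have already been triangulated with the up-to-scale extrinsic parameters; the function name is illustrative.

```python
import numpy as np

def absolute_scale(X1, X2, actual_distance):
    """Scale factor s from two reconstructed points whose true separation is
    known: s = reconstructed distance / actual distance, as in the text.
    The metric reconstruction is then obtained by dividing the up-to-scale
    coordinates (and the translation vector) by s."""
    d = np.linalg.norm(np.asarray(X1, dtype=float) - np.asarray(X2, dtype=float))
    return d / actual_distance
```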

Claims (1)

1. A calibration method based on orthogonal compensation, characterized by comprising the following steps:
Step 1: Determine the principal point by the zooming method
The two-dimensional target is photographed with the camera at different focal lengths. When the focal length changes, the optical center moves along the optical axis and the image of the two-dimensional target in the camera field of view is scaled proportionally, yielding target images at different fields of view. One image point remains unchanged across the different images; it is called the scaling center, and it coincides with the principal point in the images at the different fields of view. The principal point is therefore the least-squares intersection of the lines traced by the individual target feature points under different focal lengths;
Step 2: Acquire target images by active-vision calibration with a four-dimensional electric-control platform
The four-dimensional electric-control platform is controlled to drive the camera through four groups of in-plane orthogonal motions; each group of orthogonal motions consists of two translations, the camera stops at three positions during each orthogonal motion and thus yields three images, and, provided that the target stays within the field of view, the camera is given a pitch or pan angle of not less than 5 degrees between successive groups; the four groups of two pure translations each yield 12 target images in total;
Step 3: Solve for the scale factors
Parallel lines intersect the plane at infinity in the same point at infinity, i.e., the vanishing point; the 12 distortion-corrected images are grouped for vanishing-point estimation, and in each group of 3 images the lines joining corresponding points yield 2 vanishing points, giving 8 vanishing points from the 4 groups; the 8 vanishing-point coordinates, the principal point coordinates and the angle between the guide rails are substituted into the following formula:
$$\cos\theta = \frac{v_{i1}^{T} w\, v_{i2}}{\sqrt{v_{i1}^{T} w\, v_{i1}}\;\sqrt{v_{i2}^{T} w\, v_{i2}}},\qquad i = 1,2,3,4 \qquad (1)$$
where θ is the angle between the two translation guide rails of the electric-control platform, v_{i1} and v_{i2} are the pixel coordinates of the vanishing-point pair obtained from the two translational motions of the i-th orthogonal-motion group, and w = K^{-T}K^{-1} is the image of the absolute conic, whose concrete form is as follows:
$$w = \begin{bmatrix} 1/f_x^2 & 0 & -u_0/f_x^2 \\ 0 & 1/f_y^2 & -v_0/f_y^2 \\ -u_0/f_x^2 & -v_0/f_y^2 & u_0^2/f_x^2 + v_0^2/f_y^2 + 1 \end{bmatrix} \qquad (2)$$
the two formulas above, together with the eight vanishing-point coordinates, give a system of four nonlinear equations in two unknowns, which is solved iteratively for the scale factors f_x and f_y by nonlinear least squares, where u_0, v_0 are the camera principal point coordinates;
Step 4: Calibrate the extrinsic parameters
Each pair of corresponding image points, expressed in homogeneous coordinates, yields one linear equation:
$$u u' f_{11} + v u' f_{12} + u' f_{13} + u v' f_{21} + v v' f_{22} + v' f_{23} + u f_{31} + v f_{32} + f_{33} = 0 \qquad (3)$$
where f_{ij} is the element in the i-th row and j-th column of the fundamental matrix; in vector form,
$$\left[\,u u',\ v u',\ u',\ u v',\ v v',\ v',\ u,\ v,\ 1\,\right]\mathbf{f} = 0 \qquad (4)$$
where f = [f_{11}, f_{12}, f_{13}, f_{21}, f_{22}, f_{23}, f_{31}, f_{32}, f_{33}]^T is a 9-dimensional vector;
given n pairs of corresponding points {m_i, m_i'}, the following system of homogeneous linear equations is obtained:
$$A\mathbf{f} = \begin{bmatrix} u_1 u_1' & v_1 u_1' & u_1' & u_1 v_1' & v_1 v_1' & v_1' & u_1 & v_1 & 1 \\ \vdots & \vdots & \vdots & \vdots & \vdots & \vdots & \vdots & \vdots & \vdots \\ u_n u_n' & v_n u_n' & u_n' & u_n v_n' & v_n v_n' & v_n' & u_n & v_n & 1 \end{bmatrix}\mathbf{f} = 0 \qquad (5)$$
given at least 8 pairs of corresponding image points, the above homogeneous linear system can be solved up to an unknown non-zero scale factor; the constraint ||f|| = 1 is applied here, and when the corresponding image points are accurately matched the fundamental matrix is obtained from only 8 point pairs; the essential matrix Ê, defined up to an unknown non-zero scale factor s, is then obtained from the fundamental matrix and the camera intrinsic parameters, and E* = U diag((σ₁+σ₂)/2, (σ₁+σ₂)/2, 0) V^T is taken as an approximation of Ê, where σ₁ ≥ σ₂ ≥ σ₃ are the singular values of Ê;
from the essential matrix E*, the extrinsic parameters [R | t] between the cameras are estimated; there are the following 4 possible combinations:
$$[UWV^{T} \mid u_3];\quad [UWV^{T} \mid -u_3];\quad [UW^{T}V^{T} \mid u_3];\quad [UW^{T}V^{T} \mid -u_3] \qquad (6)$$
where u_3 is the third column vector of U and W is the orthogonal matrix used in the standard decomposition of the essential matrix,

$$W = \begin{bmatrix} 0 & -1 & 0 \\ 1 & 0 & 0 \\ 0 & 0 & 1 \end{bmatrix}$$
the correct parameter combination is judged with the chirality constraint, i.e., by requiring that the yaw angle is a positive acute angle and that the x-direction component of the translation vector is negative; two feature points whose separation is accurately known are reconstructed using the obtained camera intrinsic parameters and the extrinsic parameters known up to a scale factor, and the ratio of the reconstructed distance to the actual distance is the scale factor s.
CN201410432768.6A 2014-08-27 2014-08-27 Large-visual-field camera calibration method based on orthogonal compensation Active CN104167001B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201410432768.6A CN104167001B (en) 2014-08-27 2014-08-27 Large-visual-field camera calibration method based on orthogonal compensation


Publications (2)

Publication Number Publication Date
CN104167001A CN104167001A (en) 2014-11-26
CN104167001B true CN104167001B (en) 2017-02-15

Family

ID=51910792

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410432768.6A Active CN104167001B (en) 2014-08-27 2014-08-27 Large-visual-field camera calibration method based on orthogonal compensation

Country Status (1)

Country Link
CN (1) CN104167001B (en)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104751452A (en) * 2015-03-08 2015-07-01 大连理工大学 Monocular camera calibration method based on any known movement
CN105716542B (en) * 2016-04-07 2018-04-10 大连理工大学 A kind of three-dimensional data joining method based on flexible characteristic point
CN106157322B (en) * 2016-07-28 2019-04-26 北京航空航天大学 A kind of camera installation site scaling method based on plane mirror
CN109727291B (en) * 2018-12-28 2020-10-23 北京航空航天大学 High-precision online calibration method for zoom camera
CN114152190B (en) * 2021-11-15 2023-10-24 苏州铸正机器人有限公司 Industrial camera precision and working space test platform
CN114663527A (en) * 2022-03-22 2022-06-24 张慧 Camera self-calibration method under general motion

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101285676A (en) * 2008-06-10 2008-10-15 北京航空航天大学 Multi-visual sense sensor calibration method based on one-dimensional target
CN101581569A (en) * 2009-06-17 2009-11-18 北京信息科技大学 Calibrating method of structural parameters of binocular visual sensing system
WO2013111229A1 (en) * 2012-01-23 2013-08-01 日本電気株式会社 Camera calibration device, camera calibration method, and camera calibration program
CN103530880A (en) * 2013-10-16 2014-01-22 大连理工大学 Camera calibration method based on projected Gaussian grid pattern


Also Published As

Publication number Publication date
CN104167001A (en) 2014-11-26

Similar Documents

Publication Publication Date Title
CN104167001B (en) Large-visual-field camera calibration method based on orthogonal compensation
CN103971353B (en) Splicing method for measuring image data with large forgings assisted by lasers
CN105716542B (en) A kind of three-dimensional data joining method based on flexible characteristic point
CN103473771B (en) A kind of camera scaling method
CN105469389B (en) A kind of grid ball target for vision sensor calibration and corresponding scaling method
CN104268876B (en) Camera calibration method based on partitioning
CN108594245A (en) A kind of object movement monitoring system and method
CN105627926A (en) Four-camera group planar array feature point three-dimensional measurement system and measurement method
CN106981083A (en) The substep scaling method of Binocular Stereo Vision System camera parameters
CN104517291B (en) Pose measuring method based on target coaxial circles feature
CN104034305B (en) A kind of monocular vision is the method for location in real time
CN103971378A (en) Three-dimensional reconstruction method of panoramic image in mixed vision system
CN103278138A (en) Method for measuring three-dimensional position and posture of thin component with complex structure
CN108845335A (en) Unmanned aerial vehicle ground target positioning method based on image and navigation information
CN103559711A (en) Motion estimation method based on image features and three-dimensional information of three-dimensional visual system
CN101216296A (en) Binocular vision rotating axis calibration method
CN103852060A (en) Visible light image distance measuring method based on monocular vision
CN104778716B (en) Lorry compartment volume measuring method based on single image
CN110517325A (en) The vehicle body surrounding objects localization method and system of a kind of coordinate transform and coordinate transform
CN109727290A (en) Zoom camera dynamic calibrating method based on monocular vision triangle telemetry
CN103983186A (en) Binocular vision system correcting method and device
CN110044374A (en) A kind of method and odometer of the monocular vision measurement mileage based on characteristics of image
CN104807405B (en) Three-dimensional coordinate measurement method based on light ray angle calibration
CN104848801A (en) Line structure light vision sensor calibration method based on parallel bicylindrical target
CN102914295A (en) Computer vision cube calibration based three-dimensional measurement method

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant