CN105118055A — Camera positioning correction calibration method and system
Publication number: CN105118055A (Application No. CN201510489677.0A)
Authority: CN (China)
Legal status: Granted (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications

 G—PHYSICS
 G06—COMPUTING; CALCULATING; COUNTING
 G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
 G06T2207/00—Indexing scheme for image analysis or image enhancement
 G06T2207/30—Subject of image; Context of image processing
 G06T2207/30244—Camera pose
Abstract
The present invention discloses a camera positioning correction and calibration method, and a system implementing it, in the technical field of virtual production. Using the intrinsic relationship among the lens parameters, the imaging surface and an optical tracking device, together with the world coordinates and image-point coordinates of N mark points on a background screen and the intrinsic and distortion parameters of the camera lens, the method obtains the rotation matrix between the camera coordinate system and the world coordinate system and the translation vector of the camera perspective center in the world coordinate system. Combined with the current position information provided by a camera-attitude external tracking device in the current state, the camera correction calibration information and the field of view are obtained, and a lookup table relating focusing distance and focal length is built. Thus, when the camera position, lens focal length or focusing distance changes, the position of the virtual camera of the virtual production system is located and corrected fully automatically, so that the real video frame and the computer-generated virtual frame match perfectly.
Description
Technical field
The present invention relates to a camera position correction and calibration method and a system implementing the method, belonging to the field of virtual production technology.
Background art
A virtual camera's attitude and field of view are two extremely important pieces of information required by a computer image rendering engine. In a virtual production workflow, to obtain the desired visual effect, the live-action picture must match the virtual picture: the virtual picture must reproduce the shooting effect of the real camera. If the focal length or focusing distance of the lens changes, the field of view of the virtual camera must change correspondingly in real time; when the camera position changes, the position of the virtual camera in the rendering engine must change accordingly.
Newauto (Beijing) Video Technology Co., Ltd. proposed a method and device for acquiring calibration parameters (Application No. 200810223256.3). That method uses a two-plane coordinate calibration method or a linear camera calibration method to compute the camera's internal and external parameters from the spatial coordinates of a sufficient number of known reference points. The internal parameters comprise the optical and geometric properties of the camera, the focal length, the scale factor and the lens distortion; the external parameters comprise the position and orientation of the camera coordinate system relative to the world coordinate system. An ant colony algorithm is then used to optimize the parameters and generate an evaluation function. However, that document provides no concrete embodiment or technical scheme, and does not state which parameters are used in the computation.
The camera parameter calibration method and device proposed by the Material Evidence Identification Center of the Ministry of Public Security (Application No. 201010144764.X) is also a camera calibration method, but the problem it solves differs from that of the present invention: that application builds a virtual grid from the pixel coordinates of the rectilinear region near the principal point in the calibration target image, computes the camera distortion coefficients and completes the intrinsic calibration, without involving the camera's external parameters.
Toyota Industries Co., Ltd. proposed a video image positional relationship correction apparatus, and a steering assistance device having this apparatus together with a video image positional relationship correction method (Application No. 200480041109.4). That application computes coordinate conversion parameters from the deviation between the coordinates of reference points in the video image actually captured and displayed by the camera and the coordinates of virtual target points on a monitor, and thereby derives the monitor coordinates of the virtual targets.
Beijing Jingwei Hirain Technologies Co., Ltd. proposed a simulation-bench camera calibration method and real-time machine (Application No. 201510134249.6). A virtual camera is set up to capture and image a calibration reference, and a relation is established between any point of the reference and the virtual camera's image-plane coordinates; the same imaging is then captured by a real camera, establishing a correspondence between the imaging points of any reference point in the virtual and real camera image planes. From these two relations, the correspondence between any point of the calibration reference and the real camera's image-plane coordinates is solved, eliminating the parameter differences between the virtual camera and the real camera. That method requires a virtual camera to be built first, whereas the present invention directly computes a field of view consistent with the actual camera and delivers the camera's attitude information in the world coordinate system (including rotation angles and camera position) to the image rendering engine.
To achieve this goal, the position, attitude and field of view of the camera under a given coordinate system must be obtained in real time. A common current practice is to mount a camera-attitude external tracking device, such as an optical tracking device, on the camera, and to estimate the camera's position and attitude from the measured sensor position and the offset between the camera and the tracking device. Camera tracking in existing real-time virtual preview systems relies mainly on manual measurement followed by manual adjustment against green-screen mark points or other feature points, and does not account for factors such as focusing distance and field of view. The result is low efficiency and low precision: the video image from the camera and the computer-generated virtual scene become inconsistent, the virtual scene looks false to the viewer, and the composite of the virtual and real scenes is unsatisfactory.
Summary of the invention
To achieve a perfect match between the virtual background and the live-action image, the camera position must be identical to the position of the virtual camera accepted by the computer image rendering engine. The imaging of the rendering engine's virtual camera obeys the pinhole imaging principle, so the matching real camera position is the camera's perspective center, i.e. the intersection of the entrance pupil and the optical axis. The camera correction calibration information finally obtained is therefore the positional difference between the camera-attitude external tracking device (e.g. an optical tracking device) and the camera's perspective center. Based on this principle, the invention provides a method and system for correcting and calibrating the camera position. Through the intrinsic relationship among the lens parameters, the imaging surface and the optical tracking device, the position of the virtual camera of the virtual production system is located and corrected automatically, and the field of view consistent with the actual camera, the camera position and the camera attitude are computed and delivered to the image rendering engine. Thus, when the camera position, lens focal length or focusing distance changes, the camera calibration correction parameters are obtained in real time, the virtual camera coordinates and attitude for the rendering engine are derived from these calibration correction parameters, and the real video frame matches the computer-generated virtual frame perfectly.
To solve the above technical problem, the invention provides a method for correcting and calibrating the camera position, comprising the following steps:
S1. Extract the coordinates, in the world coordinate system, of N mark points A_1, …, A_N on the background screen: A_1(x_1, y_1, z_1), …, A_N(x_N, y_N, z_N), where the coordinate of the i-th mark point in the world coordinate system is A_i(x_i, y_i, z_i), i = 1 … N. The N mark points include at least 3 non-collinear mark points; each mark-point coordinate is the center position of the mark point, and all mark points lie within the picture captured by the camera;
S2. Determine focusing-distance samples in increasing order: FD_1, FD_2, …, FD_j, …, FD_J, where j = 1 … J and FD_1 and FD_J are the minimum and maximum focusing distances of the lens;
If the lens is a zoom lens, also determine focal-length samples in increasing order: FL_1, FL_2, …, FL_k, …, FL_K, where k = 1 … K and FL_1 and FL_K are the minimum and maximum focal lengths of the lens;
S3. If the lens is a prime lens, obtain the lens intrinsic parameters and lens distortion parameters corresponding to each focusing-distance sample FD_j; if the lens is a zoom lens, obtain the lens intrinsic parameters and lens distortion parameters corresponding to each focusing-distance sample FD_j and each focal-length sample FL_k. The lens intrinsic parameters and lens distortion parameters are obtained by a lens calibration process;
S4. If the lens is a prime lens, when the focusing distance takes the j-th sample FD_j, adjust the camera position so that the mark points on the background screen are imaged without blur; if the lens is a zoom lens, when the focusing distance takes the j-th sample FD_j and the focal length is at the k-th sample FL_k, adjust the camera position so that the mark points on the background screen are imaged without blur;
S5. Acquire the current position information provided by the camera-attitude external tracking device in the current state, comprising the rotation Euler angles that, applied successively about the X, Y and Z axes after a translation, bring the world coordinate system into coincidence with the tracking device's own coordinate system, and the position coordinates of the tracking device in the world coordinate system;
S6. Using the world coordinates of the N mark points obtained in S1, the image-point coordinates of the mark points in the camera imaging coordinate system, and the lens intrinsic and distortion parameters obtained in S3, obtain the rotation matrix between the camera coordinate system and the world coordinate system and the translation vector of the camera perspective center in the world coordinate system, i.e. the camera's attitude information in the world coordinate system;
S7. From the current position information provided by the camera-attitude external tracking device in the current state (obtained in S5) and the camera's attitude information in the world coordinate system (obtained in S6), obtain the camera correction calibration information and the field of view;
S8. If the lens is a prime lens: adjust the focusing distance and perform steps S4–S7 for each focusing-distance sample FD_j until all samples have been traversed, obtaining the correction calibration information and field of view corresponding to each focusing-distance sample, and build the lookup table LUT1 (focusing distance → correction calibration information → field of view);
If the lens is a zoom lens: adjust the focusing distance and focal length and perform steps S4–S7 for each focusing-distance sample FD_j and each focal-length sample FL_k until all focusing-distance and focal-length samples have been traversed, obtaining the correction calibration information and field of view corresponding to each sample pair, and build the lookup table LUT2 (focusing distance and focal length → correction calibration information → field of view);
S9. If the lens is a prime lens, obtain the corresponding correction calibration information and field of view from the focusing distance according to the lookup table LUT1; if the lens is a zoom lens, obtain the corresponding correction calibration information and field of view from the focusing distance and focal length according to the lookup table LUT2.
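Steps S1–S9 amount to building a lookup table offline and querying it at run time. The following is a minimal Python sketch of a LUT1-style table for a prime lens; the class name, the sample values, and the restriction to linearly interpolated translation and field of view (quaternion handling is described separately in the preferred schemes below) are illustrative assumptions, not part of the patent text.

```python
import bisect

class CalibrationLUT:
    """Focusing distance -> (translation, field of view) table (LUT1-style).

    Entries would come from steps S4-S7; between samples, translation and
    field of view are linearly interpolated. Outside the calibrated range
    [FD_1, FD_J] the nearest entry is returned.
    """

    def __init__(self):
        self.fd = []       # sorted focusing-distance samples FD_j
        self.entries = []  # (translation xyz, field of view) per sample

    def add_sample(self, fd, translation, fov):
        i = bisect.bisect_left(self.fd, fd)
        self.fd.insert(i, fd)
        self.entries.insert(i, (translation, fov))

    def query(self, fd):
        # Clamp to the calibrated range.
        if fd <= self.fd[0]:
            return self.entries[0]
        if fd >= self.fd[-1]:
            return self.entries[-1]
        j = bisect.bisect_right(self.fd, fd) - 1
        t = (fd - self.fd[j]) / (self.fd[j + 1] - self.fd[j])
        (t0, fov0), (t1, fov1) = self.entries[j], self.entries[j + 1]
        trans = tuple(a + t * (b - a) for a, b in zip(t0, t1))
        return trans, fov0 + t * (fov1 - fov0)
```

For a zoom lens (LUT2), the table would instead be keyed on (focusing distance, focal length) pairs and interpolated along both axes.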
Further, the camera position correction and calibration method proposed by the invention also includes the following preferred technical schemes:
One. Preferably, in step S3:
(1) When the lens is a prime lens:
For the j-th focusing-distance sample FD_j, the corresponding lens intrinsic parameters are f_xj, f_yj, c_xj, c_yj, where:
f_xj: when the focusing distance is FD_j, the ratio of the distance f_j from the intersection of the lens exit pupil with the optical axis to the imaging surface, to the horizontal width dx of each cell of the camera imager, i.e. f_xj = f_j / dx;
f_yj: when the focusing distance is FD_j, the ratio of the distance f_j from the intersection of the lens exit pupil with the optical axis to the imaging surface, to the vertical height dy of each cell of the camera imager, i.e. f_yj = f_j / dy;
c_xj: when the focusing distance is FD_j, the horizontal offset, in pixels, of the intersection of the lens optical axis with the imaging surface from the center of the imaging surface;
c_yj: when the focusing distance is FD_j, the vertical offset, in pixels, of the intersection of the lens optical axis with the imaging surface from the center of the imaging surface;
For the j-th focusing-distance sample FD_j, the corresponding lens distortion parameters are k_1j, k_2j, k_3j, p_1j and p_2j, where k_1j, k_2j, k_3j are the radial distortion parameters at focusing distance FD_j. k_3j is an optional parameter: when the lens is a fisheye lens, all three radial distortion parameters k_1j, k_2j, k_3j are used; when the lens is not a fisheye lens, k_3j = 0, i.e. only k_1j and k_2j are used. p_1j, p_2j are the tangential distortion parameters at focusing distance FD_j;
Further, the intrinsic matrix M_j at each focusing distance FD_j is obtained as:
$M_j = \begin{bmatrix} f_{xj} & 0 & c_{xj} \\ 0 & f_{yj} & c_{yj} \\ 0 & 0 & 1 \end{bmatrix}$
(2) When the lens is a zoom lens:
For the j-th focusing-distance sample FD_j and the k-th focal-length sample FL_k, the corresponding lens intrinsic parameters are f_xjk, f_yjk, c_xjk, c_yjk, where:
f_xjk: when the focusing distance is FD_j and the focal length is FL_k, the ratio of the distance f_jk from the intersection of the lens exit pupil with the optical axis to the imaging surface, to the horizontal width dx of each imager cell, i.e. f_xjk = f_jk / dx;
f_yjk: when the focusing distance is FD_j and the focal length is FL_k, the ratio of the distance f_jk from the intersection of the lens exit pupil with the optical axis to the imaging surface, to the vertical height dy of each imager cell, i.e. f_yjk = f_jk / dy;
c_xjk: when the focusing distance is FD_j and the focal length is FL_k, the horizontal offset, in pixels, of the intersection of the optical axis with the imaging surface from the center of the imaging surface;
c_yjk: when the focusing distance is FD_j and the focal length is FL_k, the vertical offset, in pixels, of the intersection of the optical axis with the imaging surface from the center of the imaging surface;
For the j-th focusing-distance sample FD_j and the k-th focal-length sample FL_k, the corresponding lens distortion parameters are k_1jk, k_2jk, k_3jk, p_1jk and p_2jk, where k_1jk, k_2jk and k_3jk are the radial distortion parameters at focusing distance FD_j and focal length FL_k. k_3jk is an optional parameter: when the lens is a fisheye lens, all three radial distortion parameters k_1jk, k_2jk, k_3jk are used; when the lens is not a fisheye lens, k_3jk = 0, i.e. only k_1jk and k_2jk are used. p_1jk, p_2jk are the tangential distortion parameters at focusing distance FD_j and focal length FL_k;
And the intrinsic matrix M_jk at each focusing distance FD_j and focal length FL_k is obtained as:
$M_{jk} = \begin{bmatrix} f_{xjk} & 0 & c_{xjk} \\ 0 & f_{yjk} & c_{yjk} \\ 0 & 0 & 1 \end{bmatrix}$
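The intrinsic parameters above assemble into the standard pinhole intrinsic matrix. A minimal sketch follows, assuming f, dx and dy share one length unit; the function and argument names are illustrative, not from the patent.

```python
def intrinsic_matrix(f, dx, dy, cx, cy):
    """Build the 3x3 intrinsic matrix M for one (FD_j, FL_k) sample.

    f      : distance f_jk from the exit-pupil / optical-axis intersection
             to the imaging surface (same length unit as dx, dy)
    dx, dy : horizontal width and vertical height of one imager cell
    cx, cy : offset, in pixels, of the optical axis from the image center
    """
    fx = f / dx   # f_xjk = f_jk / dx
    fy = f / dy   # f_yjk = f_jk / dy
    return [[fx, 0.0, cx],
            [0.0, fy, cy],
            [0.0, 0.0, 1.0]]
```

One such matrix would be stored per focusing-distance sample (and per focal-length sample for a zoom lens).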
Two. Preferably, in step S5:
(1) When the lens is a prime lens:
The current position information provided by the camera-attitude external tracking device is [θ_xj, θ_yj, θ_zj, T_xj, T_yj, T_zj], where [θ_xj, θ_yj, θ_zj] are the rotation Euler angles that, applied successively about the X, Y and Z axes after a translation, bring the world coordinate system into coincidence with the tracking device's own coordinate system, and [T_xj, T_yj, T_zj] are the position coordinates of the tracking device in the world coordinate system. The current position information provided by the tracking device is expressed in matrix form as:
$H_{Ij} = \begin{bmatrix} R_{Ij} & T_{Ij} \\ 0 & 1 \end{bmatrix}$
Here R_Ij is the rotation matrix determined by the Euler angles [θ_xj, θ_yj, θ_zj],
T_Ij = [T_xj T_yj T_zj]^T,
0 = [0, 0, 0];
(2) When the lens is a zoom lens:
The current position information provided by the camera-attitude external tracking device is [θ_xjk, θ_yjk, θ_zjk, T_xjk, T_yjk, T_zjk], where [θ_xjk, θ_yjk, θ_zjk] are the rotation Euler angles that, applied successively about the X, Y and Z axes after a translation, bring the world coordinate system into coincidence with the tracking device's own coordinate system, and [T_xjk, T_yjk, T_zjk] are the position coordinates of the tracking device in the world coordinate system. The current position information provided by the tracking device is expressed in matrix form as:
$H_{Ijk} = \begin{bmatrix} R_{Ijk} & T_{Ijk} \\ 0 & 1 \end{bmatrix}$
Here R_Ijk is the rotation matrix determined by the Euler angles [θ_xjk, θ_yjk, θ_zjk],
T_Ijk = [T_xjk T_yjk T_zjk]^T,
0 = [0, 0, 0]
Three. Preferably, in step S6:
(1) When the lens is a prime lens:
When the current focusing distance is FD_j, the rotation matrix R_pj between the camera coordinate system and the world coordinate system, the translation vector T_pj of the camera perspective center in the world coordinate system, and the attitude matrix H_pj of the camera in the world coordinate system are obtained as follows:
From the image-point coordinates (x_ij, y_ij) of the i-th mark point and the lens distortion parameters at the current focusing distance FD_j, obtain the corrected image-point coordinates (x'_ij, y'_ij):
$\begin{aligned} x'_{ij} &= x_{ij}(1 + k_{1j}r^2 + k_{2j}r^4 + k_{3j}r^6) + 2p_{1j}x_{ij}y_{ij} + p_{2j}(r^2 + 2x_{ij}^2) \\ y'_{ij} &= y_{ij}(1 + k_{1j}r^2 + k_{2j}r^4 + k_{3j}r^6) + p_{1j}(r^2 + 2y_{ij}^2) + 2p_{2j}x_{ij}y_{ij} \end{aligned} \qquad (1)$
where $r^2 = x_{ij}^2 + y_{ij}^2$, and k_3j = 0 when the lens is not a fisheye lens;
The relation between the world coordinate (x_i, y_i, z_i) of the i-th mark point and its corrected image-point coordinate (x'_ij, y'_ij) is expressed as:
$\lambda_j \begin{bmatrix} x'_{ij} \\ y'_{ij} \\ 1 \end{bmatrix} = M_j \left[ R_{pj}, T_{pj} \right] \begin{bmatrix} x_i \\ y_i \\ z_i \\ 1 \end{bmatrix} \qquad (2)$
where λ_j is called the scale factor, the rotation matrix R_pj between the camera coordinate system and the world coordinate system is an orthogonal matrix, and T_pj is the translation vector of the camera perspective center in the world coordinate system;
From formulas (1) and (2) and the world coordinates (x_i, y_i, z_i) of each mark point together with the corresponding corrected image-point coordinates (x'_ij, y'_ij), the matrix O_j = λ_j [R_pj, T_pj] and the camera's attitude information in the world coordinate system are obtained, where $[\theta^c_{xj}, \theta^c_{yj}, \theta^c_{zj}]$ are the rotation Euler angles that, applied successively about the X, Y and Z axes after a translation, bring the world coordinate system into coincidence with the camera coordinate system, and ${\left[\begin{array}{ccc}{T}_{\mathrm{xj}}^{c}& {T}_{\mathrm{yj}}^{c}& {T}_{\mathrm{zj}}^{c}\end{array}\right]}^{T}$ is the translation vector of the camera perspective center in the world coordinate system;
Further, because the matrix R_pj is a unit orthogonal matrix:
λ_j = 1 / ‖O_j(:, 1)‖;
where O_j(:, 1) is the first column of the matrix O_j and ‖·‖ denotes the Euclidean norm of a vector;
From the rotation matrix R_pj between the camera coordinate system and the world coordinate system and the translation vector T_pj of the camera perspective center in the world coordinate system so obtained, the camera's attitude in the world coordinate system is expressed as the matrix H_pj:
$H_{pj} = \begin{bmatrix} R_{pj} & T_{pj} \\ 0 & 1 \end{bmatrix}$
(2) When the lens is a zoom lens:
When the current focusing distance is FD_j and the current focal length is FL_k, the rotation matrix R_pjk between the camera coordinate system and the world coordinate system, the translation vector T_pjk of the camera perspective center in the world coordinate system, and the attitude matrix H_pjk of the camera in the world coordinate system are obtained as follows:
From the image-point coordinates (x_ijk, y_ijk) of the i-th mark point at the current focusing distance FD_j and current focal length FL_k, together with the lens distortion parameters, obtain the corrected image-point coordinates (x'_ijk, y'_ijk):
$\begin{aligned} x'_{ijk} &= x_{ijk}(1 + k_{1jk}r^2 + k_{2jk}r^4 + k_{3jk}r^6) + 2p_{1jk}x_{ijk}y_{ijk} + p_{2jk}(r^2 + 2x_{ijk}^2) \\ y'_{ijk} &= y_{ijk}(1 + k_{1jk}r^2 + k_{2jk}r^4 + k_{3jk}r^6) + p_{1jk}(r^2 + 2y_{ijk}^2) + 2p_{2jk}x_{ijk}y_{ijk} \end{aligned} \qquad (3)$
where $r^2 = x_{ijk}^2 + y_{ijk}^2$, and k_3jk = 0 when the lens is not a fisheye lens;
The relation between the world coordinate (x_i, y_i, z_i) of the i-th mark point and its corrected image-point coordinate (x'_ijk, y'_ijk) is expressed as:
$\lambda_{jk} \begin{bmatrix} x'_{ijk} \\ y'_{ijk} \\ 1 \end{bmatrix} = M_{jk} \left[ R_{pjk}, T_{pjk} \right] \begin{bmatrix} x_i \\ y_i \\ z_i \\ 1 \end{bmatrix} \qquad (4)$
where λ_jk is called the scale factor, the rotation matrix R_pjk between the camera coordinate system and the world coordinate system is an orthogonal matrix, and T_pjk is the translation vector of the camera perspective center in the world coordinate system;
From formulas (3) and (4) and the world coordinates (x_i, y_i, z_i) of each mark point together with the corresponding corrected image-point coordinates (x'_ijk, y'_ijk), the matrix O_jk = λ_jk [R_pjk, T_pjk] and the camera's attitude information in the world coordinate system are obtained, where $[\theta^c_{xjk}, \theta^c_{yjk}, \theta^c_{zjk}]$ are the rotation Euler angles that, applied successively about the X, Y and Z axes after a translation, bring the world coordinate system into coincidence with the camera coordinate system, and ${\left[\begin{array}{ccc}{T}_{\mathrm{xjk}}^{c}& {T}_{\mathrm{yjk}}^{c}& {T}_{\mathrm{zjk}}^{c}\end{array}\right]}^{T}$ is the translation vector of the camera perspective center in the world coordinate system;
Because the matrix R_pjk is a unit orthogonal matrix:
λ_jk = 1 / ‖O_jk(:, 1)‖;
where O_jk(:, 1) is the first column of the matrix O_jk and ‖·‖ denotes the Euclidean norm of a vector;
From the rotation matrix R_pjk between the camera coordinate system and the world coordinate system and the translation vector T_pjk of the camera perspective center in the world coordinate system so obtained, the camera's attitude in the world coordinate system is expressed as the matrix H_pjk:
$H_{pjk} = \begin{bmatrix} R_{pjk} & T_{pjk} \\ 0 & 1 \end{bmatrix}$
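The normalization step in the scheme above (recovering [R_p, T_p] from the scaled matrix O) can be sketched as follows. Since R_p is orthonormal, each of its columns has unit length, so dividing O by the norm of its first column recovers [R_p, T_p]; the function name is illustrative, and the input is assumed to have been solved already from the projection relation.

```python
import math

def recover_pose(O):
    """Split O = scale * [R_p | T_p] (3 rows x 4 columns, nested lists)
    into the rotation matrix R_p, the translation vector T_p, and the
    homogeneous attitude matrix H_p = [[R_p, T_p], [0, 1]].

    Because R_p is orthonormal, each of its columns has unit length, so
    the scale equals the Euclidean norm of the first column of O.
    """
    scale = math.sqrt(sum(O[r][0] ** 2 for r in range(3)))
    Rt = [[v / scale for v in row] for row in O]
    R = [row[:3] for row in Rt]
    T = [row[3] for row in Rt]
    H = [R[0] + [T[0]], R[1] + [T[1]], R[2] + [T[2]], [0.0, 0.0, 0.0, 1.0]]
    return R, T, H
```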
Four. Preferably, step S7 comprises the following steps:
(1) When the lens is a prime lens:
S7.1 Compute the camera attitude correction matrix H_j from the pose matrix provided by the tracking device in S5 and the camera attitude matrix H_pj obtained in S6;
S7.2 Transform the camera attitude correction matrix H_j into a correction quaternion (θ_j, n_xj, n_yj, n_zj) and a translation vector T_j:
The camera attitude correction matrix H_j is a 4 × 4 matrix whose effective part is H_j(1:3, 1:4), i.e. the first three rows of H_j. Write H_j(1:3, 1:4) in the form:
H_j(1:3, 1:4) = [R_j, T_j];
where the rotation matrix R_j is a 3 × 3 real matrix and the translation vector T_j = [t_xj, t_yj, t_zj]^T is a 3-dimensional vector;
Convert the rotation matrix R_j into the correction quaternion (θ_j, n_xj, n_yj, n_zj) according to the quaternion relation $q_j = (\cos\theta_j, \stackrel{\rightarrow}{n_j}\sin\theta_j)$, where $\stackrel{\rightarrow}{n_j} = (n_{xj}, n_{yj}, n_{zj})$ is a 3-dimensional row vector;
Thus, storing R_j only requires storing the correction quaternion (θ_j, n_xj, n_yj, n_zj), so the camera attitude correction matrix H_j is converted into the camera correction calibration information (θ_j, n_xj, n_yj, n_zj, t_xj, t_yj, t_zj);
S7.3 Compute the camera field of view: when the focusing distance is FD_j, compute the field of view α_j as:
$\alpha_j = 2\arctan\left(\frac{W}{2 f_{xj}}\right)$
where W and H are the horizontal and vertical resolutions of the camera.
(2) When the lens is a zoom lens:
S7.1 Compute the camera attitude correction matrix H_jk from the pose matrix provided by the tracking device in S5 and the camera attitude matrix H_pjk obtained in S6;
S7.2 Transform the camera attitude correction matrix H_jk into a correction quaternion (θ_jk, n_xjk, n_yjk, n_zjk) and a translation vector T_jk:
The camera attitude correction matrix H_jk is a 4 × 4 matrix whose effective part is H_jk(1:3, 1:4), i.e. the first three rows of H_jk. Write H_jk(1:3, 1:4) in the form:
H_jk(1:3, 1:4) = [R_jk, T_jk];
where the rotation matrix R_jk is a 3 × 3 real matrix and the translation vector T_jk = [t_xjk, t_yjk, t_zjk]^T is a 3-dimensional vector;
Convert the rotation matrix R_jk into the correction quaternion (θ_jk, n_xjk, n_yjk, n_zjk) according to the quaternion relation $q_{jk} = (\cos\theta_{jk}, \stackrel{\rightarrow}{n_{jk}}\sin\theta_{jk})$, where $\stackrel{\rightarrow}{n_{jk}} = (n_{xjk}, n_{yjk}, n_{zjk})$ is a 3-dimensional row vector;
Thus, storing R_jk only requires storing the correction quaternion (θ_jk, n_xjk, n_yjk, n_zjk), so the camera attitude correction matrix H_jk is converted into the camera correction calibration information (θ_jk, n_xjk, n_yjk, n_zjk, t_xjk, t_yjk, t_zjk);
S7.3 Compute the camera field of view: when the focusing distance is FD_j and the focal length is FL_k, compute the field of view α_jk as:
$\alpha_{jk} = 2\arctan\left(\frac{W}{2 f_{xjk}}\right)$
where W and H are the horizontal and vertical resolutions of the camera.
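Scheme Four stores each rotation compactly as a quaternion q = (cos θ, n sin θ) and computes the field of view from the horizontal resolution and f_x. A sketch of both conversions follows, assuming the standard axis-angle extraction from the rotation-matrix trace and the standard pinhole relation α = 2 arctan(W / (2 f_x)); the function names are illustrative.

```python
import math

def matrix_to_quaternion(R):
    """Convert a 3x3 rotation matrix (nested lists) to (theta, nx, ny, nz)
    with q = (cos theta, n*sin theta), i.e. theta is half the rotation angle."""
    # Full rotation angle from the trace: trace(R) = 1 + 2*cos(angle).
    angle = math.acos(max(-1.0, min(1.0, (R[0][0] + R[1][1] + R[2][2] - 1.0) / 2.0)))
    s = 2.0 * math.sin(angle)
    if abs(s) < 1e-12:            # no rotation: axis is arbitrary
        return 0.0, 1.0, 0.0, 0.0
    nx = (R[2][1] - R[1][2]) / s  # rotation axis from the skew part of R
    ny = (R[0][2] - R[2][0]) / s
    nz = (R[1][0] - R[0][1]) / s
    return angle / 2.0, nx, ny, nz

def field_of_view(W, fx):
    """Horizontal field of view alpha = 2*atan(W / (2*f_x)), in degrees."""
    return math.degrees(2.0 * math.atan(W / (2.0 * fx)))
```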
Five. Preferably, in step S9:
When the actual focusing distance FD is not at any focusing-distance sample FD_j, or the actual focal length FL is not at any focal-length sample FL_k, or neither is at a sample point, the corresponding correction calibration information and field of view are obtained by interpolation as follows:
The correction quaternion in the correction calibration information is interpolated by the SLERP method; the translation vector is interpolated linearly; the field of view is interpolated linearly.
Further,
(1) When the lens is a prime lens:
When the actual focusing distance FD is not at any focusing-distance sample FD_j, the correction quaternion is interpolated by the SLERP algorithm as follows:
If FD_j < FD < FD_{j+1}, let the camera attitude correction matrices at focusing distances FD_j and FD_{j+1} be H_j and H_{j+1}, with rotation matrices R_j and R_{j+1} expressed as the correction quaternions ${q}_{j}=(\mathrm{cos}{\mathrm{\theta}}_{j},\stackrel{\rightarrow}{{n}_{j}}\mathrm{sin}{\mathrm{\theta}}_{j})$ and ${q}_{j+1}=(\mathrm{cos}{\mathrm{\theta}}_{j+1},\stackrel{\rightarrow}{{n}_{j+1}}\mathrm{sin}{\mathrm{\theta}}_{j+1})$. The correction quaternion q at the current actual focusing distance FD is then computed as:
$q = q_j \left( q_j^{-1} q_{j+1} \right)^{t}, \qquad t = \frac{FD - FD_j}{FD_{j+1} - FD_j}$
Here $q_j^{-1}$ is the inverse of q_j.
(2) when camera lens is zoom lens:
When actual focusing from FD not any one focusing distance sample FD
_{j}place, or real focal length FL is not at any one focusing distance sample FL
_{k}place, or work as actual focusing from FD not at any one focusing distance sample FD
_{j}place and real focal length FL be not also at any one focusing distance sample FL
_{k}during place, the method using SLERP algorithm to carry out interpolation to correction hypercomplex number is as follows:
If 1. FD
_{j}< FD < FD
_{j+1}and FL
_{k}< FL < FL
_{k+1}, namely when actual focusing from FD not any one focusing distance sample FD
_{j}place, and real focal length FL is not also at any one focusing distance sample FL
_{k}place, adopts and combines (FD with its focusing Distance geometry focal length sampled point closed on most
_{j}, FL
_{k}), (FD
_{j}, FL
_{k+1}), (FD
_{j+1}, FL
_{k}), (FD
_{j+1}, FL
_{k+1}) corresponding to correction hypercomplex number q
_{j, k}, q
_{j, k+1}, q
_{j+1, k}, q
_{j+1, k+1}, carry out interpolation according to the following formula, then for current actual focusing from FD and real focal length is FL time correction hypercomplex number q
_{l, d}calculated by following formula:
Wherein:
2. If FD_j < FD < FD_{j+1} and FL = FL_k, i.e. the actual focus distance FD is not at any focus distance sample FD_j while the actual focal length is the focal length sample FL_k, the correction quaternions q_{j,k} and q_{j+1,k} corresponding to the nearest focus distance sample combinations (FD_j, FL_k) and (FD_{j+1}, FL_k) are interpolated; the correction quaternion q_{l,d} for the current actual focus distance FD and actual focal length FL_k is calculated by:

q_{l,d} = q_{j,k} (q_{j,k}⁻¹ q_{j+1,k})^{t_d},

where t_d = (FD − FD_j) / (FD_{j+1} − FD_j);
3. If FD = FD_j and FL_k < FL < FL_{k+1}, i.e. the actual focus distance is the focus distance sample FD_j while the actual focal length FL is not at any focal length sample FL_k, the correction quaternions q_{j,k} and q_{j,k+1} corresponding to the nearest focal length sample combinations (FD_j, FL_k) and (FD_j, FL_{k+1}) are interpolated; the correction quaternion q_{l,d} for the current focus distance FD_j and actual focal length FL is calculated by:

q_{l,d} = q_{j,k} (q_{j,k}⁻¹ q_{j,k+1})^{t_l},

where t_l = (FL − FL_k) / (FL_{k+1} − FL_k);
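The three interpolation cases above can be sketched together in Python. This is an illustrative sketch, not the patent's implementation: it uses the standard sine form of SLERP, which is equivalent to the quaternion-power form q_j (q_j⁻¹ q_{j+1})^t, and the function names, the quaternion grid layout, and the nested interpolation order (first along FD, then along FL) are assumptions:

```python
import numpy as np

def slerp(q0, q1, t):
    """Spherical linear interpolation between unit quaternions (w, x, y, z)."""
    q0 = q0 / np.linalg.norm(q0)
    q1 = q1 / np.linalg.norm(q1)
    dot = float(np.dot(q0, q1))
    if dot < 0.0:               # take the shorter arc
        q1, dot = -q1, -dot
    if dot > 0.9995:            # nearly parallel: fall back to normalized lerp
        q = q0 + t * (q1 - q0)
        return q / np.linalg.norm(q)
    theta = np.arccos(dot)
    return (np.sin((1 - t) * theta) * q0 + np.sin(t * theta) * q1) / np.sin(theta)

def interp_zoom(q_grid, FDs, FLs, fd, fl, j, k):
    """Bilinear SLERP over the (focus distance, focal length) sample grid:
    first along FD at FL_k and FL_{k+1}, then along FL. Cases 2 and 3 of the
    text reduce to a single slerp when fd or fl hits a sample exactly."""
    t_d = (fd - FDs[j]) / (FDs[j + 1] - FDs[j])
    t_l = (fl - FLs[k]) / (FLs[k + 1] - FLs[k])
    qa = slerp(q_grid[j][k],     q_grid[j + 1][k],     t_d)   # along FD at FL_k
    qb = slerp(q_grid[j][k + 1], q_grid[j + 1][k + 1], t_d)   # along FD at FL_{k+1}
    return slerp(qa, qb, t_l)
```

When fd equals FD_j exactly, t_d is 0 and the first two slerp calls return the grid quaternions unchanged, so the sketch degenerates to case 3 above; the symmetric statement holds for fl = FL_k and case 2.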
In addition, in accordance with the above method, the present invention also proposes a camera positioning correction calibration system, comprising: a servo motor control system, a video camera, a camera attitude external tracking device, a background screen, marker points, space measurement equipment, data processing equipment, and a computer image rendering engine;

The servo motor control system is connected with the video camera to adjust the focal length and focus distance of the lens; it is also connected with the data processing equipment, to which it sends the focal length and focus distance of the lens, so that the data processing equipment can calculate the correction calibration information and field angle and build the look-up table;

The data processing equipment is also connected with the video camera to read the video stream data in real time;

The camera attitude external tracking device is mounted outside the video camera; by measuring the position of the video camera and the offset of the tracking device, the position and attitude information of the camera are estimated; the tracking device is also connected with the data processing equipment, to which it sends its measurement data;

At least 3 non-collinear marker points are arranged on the background screen; the marker points have identical radii and a color that contrasts with the background screen;

The space measurement equipment and the image rendering engine are each connected with the data processing equipment; the space measurement equipment measures the world coordinates of the marker point centers and sends them to the data processing equipment, and the image rendering engine obtains the corresponding correction calibration information and field angle from the look-up table built by the data processing equipment.
Compared with the prior art, the beneficial effects of the present invention are that the camera correction calibration process is fully automated with no manual intervention; the position, attitude and field angle of the camera in a given coordinate system can be calculated in real time; and the computer image generation engine can render images accurately from this information, so that the live-action video frames and the computer-generated virtual frames match perfectly.
Brief description of the drawings

Fig. 1 is a schematic diagram of the camera positioning correction calibration system proposed by the present invention, wherein: 1 - servo motor control system; 2 - video camera; 3 - camera lens; 4 - camera attitude external tracking device; 5 - background screen; 6 - marker point; 7 - data processing equipment;

Fig. 2 is a flow chart of the proposed method for correction calibration of a camera position with a fixed-focus lens;

Fig. 3 is a flow chart of the proposed method for correction calibration of a camera position with a zoom lens.
Detailed description

The present invention is described in detail below in conjunction with the drawings and embodiments, together with the technical problems solved, the principles, and the beneficial effects of the technical solution.
As shown in Fig. 1, the structural diagram of the camera positioning correction calibration system proposed by the present invention, the system comprises: a servo motor control system, a video camera, a camera attitude external tracking device, a background screen, marker points, space measurement equipment (not shown), data processing equipment, and an image rendering engine (not shown);

The servo motor control system is connected with the video camera to adjust the focal length and focus distance of the lens; it is also connected with the data processing equipment, to which it sends the focal length and focus distance of the lens, so that the data processing equipment can calculate the correction calibration information and field angle and build the look-up table;

The data processing equipment is also connected with the video camera to read the video stream data in real time;

The camera attitude external tracking device is mounted outside the video camera; by measuring the position of the video camera and the offset of the tracking device, the position and attitude information of the camera are estimated; the tracking device is also connected with the data processing equipment, to which it sends its measurement data;

At least 3 non-collinear marker points are arranged on the background screen; the marker points have identical radii and a color that contrasts with the background screen;

The space measurement equipment and the image rendering engine are each connected with the data processing equipment; the space measurement equipment measures the world coordinates of the marker point centers and sends them to the data processing equipment, and the image rendering engine obtains the corresponding correction calibration information and field angle from the look-up table built by the data processing equipment.
Preferably, the background screen is a green screen or a blue screen.

Preferably, the camera attitude external tracking device is an optical tracking device or a mechanical arm.
The workflow of the proposed camera positioning correction calibration system comprises the following steps:
S1. Use the space measurement equipment to extract the coordinates of the N marker points on the background screen in the world coordinate system, A_1(x_1, y_1, z_1), ..., A_N(x_N, y_N, z_N), and send the world coordinates of the marker points to the data processing equipment; the N marker points include at least 3 non-collinear marker points;
S2. Determine focus distance samples in increasing order, FD_1, FD_2, ..., FD_j, ..., FD_J (j = 1 to J), where FD_1 and FD_J are the minimum and maximum focus distances of the lens, respectively;
If the lens is a zoom lens, focal length samples in increasing order, FL_1, FL_2, ..., FL_k, ..., FL_K (k = 1 to K), must also be determined, where FL_1 and FL_K are the minimum and maximum focal lengths of the lens, respectively;
Preferably, the focus distance samples and focal length samples adopt the sample points used in the lens calibration process; for lens calibration, refer to the following documents: [1] Duane C B. Close-range camera calibration. Photogrammetric Engineering & Remote Sensing, 1971, 37:855-866. [2] Zhang Z. Flexible camera calibration by viewing a plane from unknown orientations. Proceedings of the Seventh IEEE International Conference on Computer Vision, 1999, 1:666-673.
Focus distance sample selection: divide the rotation interval of the servo motor that adjusts the lens focus (e.g. [0, 1]) into J - 1 segments, giving J endpoints in total; these endpoints correspond to the focus distance sample values. J is generally an integer between 10 and 30.

Focal length sample selection: divide the rotation interval of the servo motor that adjusts the lens zoom (e.g. [0, 1]) into K - 1 segments, giving K endpoints in total; these endpoints correspond to the focal length sample values. K is generally chosen according to the zoom range of the lens.
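The endpoint-based sample selection above can be sketched as follows. The normalized [0, 1] servo interval and J = 20 are illustrative values, and `sample_endpoints` is a hypothetical helper, not a name from the patent:

```python
# Sketch of the sample-point selection described above: the servo's rotation
# interval (here assumed normalized to [0, 1]) is split into J - 1 equal
# segments, and the J segment endpoints become the sample positions.
def sample_endpoints(lo, hi, n_points):
    step = (hi - lo) / (n_points - 1)
    return [lo + i * step for i in range(n_points)]

# J = 20, within the suggested range of 10 to 30 (illustrative choice)
fd_positions = sample_endpoints(0.0, 1.0, 20)
```

The same helper would serve for the K focal length endpoints of a zoom lens.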
S3. The data processing equipment obtains the camera intrinsic parameters and distortion parameters corresponding to each focus distance sample; the lens intrinsic parameters and lens distortion parameters are obtained through the lens calibration process;
S4. The servo motor control system sends the focal length and focus distance of the lens to the data processing equipment; when the focus distance is at the j-th sample FD_j, the camera position is adjusted so that the marker points on the background screen are imaged without blur;

If the lens is a zoom lens, when the focus distance takes the j-th focus distance sample FD_j and the focal length is at the k-th sample FL_k, the camera position is adjusted so that the marker points on the background screen are imaged without blur;
S5. The camera attitude external tracking device, such as an optical tracking device or a mechanical arm, transmits its current positional information to the data processing equipment, comprising the rotation Euler angles that make the world coordinate system coincide with the tracking device's self-defined coordinate system after rotating successively about its X-axis, Y-axis and Z-axis followed by a translation, together with the position coordinates of the tracking device in the world coordinate system;

The optical tracking device may be, for example, a total station;
S6. Using the world coordinates of the N marker points obtained in S1, the image point coordinates of the marker points in the camera imaging coordinate system, and the lens intrinsic parameters and lens distortion parameters obtained in S3, the data processing equipment obtains the rotation matrix between the camera coordinate system and the world coordinate system and the translation vector of the camera perspective center in the world coordinate system, i.e. the attitude information of the camera in the world coordinate system;

Preferably, the origin of the camera imaging coordinate system is set at the center of the field of view.
S7. Using the current positional information provided by the camera attitude external tracking device obtained in S5 and the attitude information of the camera in the world coordinate system obtained in S6, the data processing equipment obtains the camera correction calibration information and field angle;
S8. If the lens is a fixed-focus lens: adjust the focus distance and perform steps S4-S7 for each focus distance sample FD_j until all focus distance samples have been traversed, obtaining the correction calibration information and field angle corresponding to each focus distance sample; the data processing equipment builds the look-up table LUT1 of focus distance - correction calibration information - field angle;

If the lens is a zoom lens: adjust the focus distance and focal length and perform steps S4-S7 for each focus distance sample FD_j and each focal length sample FL_k until all focus distance samples and focal length samples have been traversed, obtaining the correction calibration information and field angle corresponding to each focus distance sample and focal length sample; the data processing equipment builds the look-up table LUT2 of focus distance - focal length - correction calibration information - field angle;
S9. If the lens is a fixed-focus lens: the image rendering engine obtains the corresponding correction calibration information and field angle from the focus distance according to the look-up table LUT1 built by the data processing equipment.

If the lens is a zoom lens: the image rendering engine obtains the corresponding correction calibration information and field angle from the focus distance and focal length according to the look-up table LUT2 built by the data processing equipment.
The camera positioning correction calibration system of the present invention is applicable both to fixed-focus lenses and to zoom lenses; the technical scheme of the camera positioning correction calibration method of the present invention is described below for each case:

1. Fixed-focus lens (see Fig. 2)
S1. Obtain the world coordinates of the marker points:

Use the space measurement equipment to measure the world coordinates of the N marker points on the background screen, A_1(x_1, y_1, z_1), ..., A_N(x_N, y_N, z_N); the N marker points include at least 3 non-collinear marker points; each marker point coordinate is the center position of the marker point, and all marker points lie within the picture captured by the camera; the coordinate of the i-th marker point in the world coordinate system is A_i(x_i, y_i, z_i), i = 1 to N;
S2. Determine the focus distance samples:

Through the servo motor control system, determine focus distance samples in increasing order according to the lens calibration process, FD_1, FD_2, ..., FD_j, ..., FD_J (j = 1 to J), where FD_1 and FD_J are the minimum and maximum focus distances of the lens, respectively;
S3. Obtain and record the camera intrinsic parameters and distortion parameters corresponding to each focus distance sample; the lens intrinsic parameters and lens distortion parameters are obtained through the lens calibration process;
Specifically, for the j-th focus distance sample FD_j, the corresponding lens intrinsic parameters are f_xj, f_yj, c_xj, c_yj, where:

f_xj: when the focus distance is FD_j, the ratio of the distance f_j from the intersection of the lens exit pupil and the lens optical axis to the imaging surface, to the horizontal width dx of each unit of the camera imager, i.e. f_xj = f_j / dx;

f_yj: when the focus distance is FD_j, the ratio of the distance f_j from the intersection of the lens exit pupil and the lens optical axis to the imaging surface, to the vertical height dy of each unit of the camera imager, i.e. f_yj = f_j / dy;
Preferably, the camera imager described herein is a CCD or CMOS sensor.
c_xj: when the focus distance is FD_j, the horizontal offset in pixels of the intersection of the lens optical axis with the imaging surface from the center of the imaging surface;

c_yj: when the focus distance is FD_j, the vertical offset in pixels of the intersection of the lens optical axis with the imaging surface from the center of the imaging surface;
For the j-th focus distance sample FD_j, the corresponding lens distortion parameters are k_1j, k_2j, k_3j, p_1j and p_2j, where k_1j, k_2j, k_3j are the radial distortion parameters when the focus distance is FD_j; k_3j is an optional parameter: when the lens is a fisheye lens all three radial distortion parameters k_1j, k_2j, k_3j are used, while when the lens is a non-fisheye lens k_3j = 0, i.e. other lenses only use the two radial distortion parameters k_1j, k_2j; p_1j, p_2j are the tangential distortion parameters when the focus distance is FD_j;

Further, the intrinsic matrix M_j at each focus distance FD_j is obtained as:

M_j = [ f_xj  0     c_xj
        0     f_yj  c_yj
        0     0     1    ];
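Assembling M_j from the parameters just defined might look as follows; a minimal sketch assuming the standard pinhole arrangement f_x = f/dx, f_y = f/dy with principal point (c_x, c_y) (the function name and sample values are illustrative, not from the patent):

```python
import numpy as np

def intrinsic_matrix(f, dx, dy, cx, cy):
    """Build the intrinsic matrix M_j from the parameters defined above:
    f_x = f/dx and f_y = f/dy (exit-pupil-to-imager distance over pixel
    pitch), plus the principal-point offsets c_x, c_y."""
    return np.array([[f / dx, 0.0,    cx],
                     [0.0,    f / dy, cy],
                     [0.0,    0.0,    1.0]])

# Illustrative values: 8 mm distance, 4 um square pixels, 1920x1080 imager
M = intrinsic_matrix(0.008, 4e-6, 4e-6, 960.0, 540.0)
```

With square pixels (dx = dy), f_xj and f_yj coincide, as in this example.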
S4. When the focus distance takes the j-th focus distance sample FD_j, adjust the camera position so that the marker points on the background screen are imaged without blur;
S5. The data processing equipment obtains the current positional information [θ_xj, θ_yj, θ_zj, T_xj, T_yj, T_zj] transmitted by the camera attitude external tracking device, such as an optical tracking device. Here [θ_xj, θ_yj, θ_zj] are the rotation Euler angles that make the world coordinate system coincide with the tracking device's self-defined coordinate system after rotating successively about the X-axis, Y-axis and Z-axis followed by a translation, and [T_xj, T_yj, T_zj] are the position coordinates of the tracking device in the world coordinate system. The current positional information provided by the tracking device is expressed in matrix form:

H_Ij = [ R_Ij  T_Ij
         0     1    ],

where R_Ij is the rotation matrix composed from the Euler angles [θ_xj, θ_yj, θ_zj],

T_Ij = [T_xj, T_yj, T_zj]^T,

0 = [0, 0, 0];
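The 4 × 4 matrix form H_Ij can be assembled from the tracker readout as sketched below, assuming the X-then-Y-then-Z rotation composition order stated above; the function and variable names are hypothetical:

```python
import numpy as np

def pose_matrix(theta_x, theta_y, theta_z, t):
    """Assemble the 4x4 pose matrix [R, T; 0, 1] from Euler angles
    (rotations about X, then Y, then Z, matching the order stated in
    the text) and a position vector t."""
    cx, sx = np.cos(theta_x), np.sin(theta_x)
    cy, sy = np.cos(theta_y), np.sin(theta_y)
    cz, sz = np.cos(theta_z), np.sin(theta_z)
    Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    H = np.eye(4)
    H[:3, :3] = Rz @ Ry @ Rx        # rotate about X first, then Y, then Z
    H[:3, 3] = t
    return H
```

The same helper would serve for assembling H_pj from the camera's Euler angles and perspective-center translation in S6.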
S6. Using the world coordinates of the N marker points obtained in S1, the image point coordinates of the marker points in the camera imaging coordinate system, and the lens intrinsic parameters and lens distortion parameters obtained in S3, obtain the rotation matrix R_pj between the camera coordinate system and the world coordinate system and the translation vector T_pj of the camera perspective center in the world coordinate system, i.e. the attitude information of the camera in the world coordinate system;

Preferably, when the current focus distance is FD_j, the method of obtaining the rotation matrix R_pj between the camera coordinate system and the world coordinate system, the translation vector T_pj of the camera perspective center in the world coordinate system, and the attitude matrix H_pj of the camera in the world coordinate system is as follows:
From the image point coordinates (x_ij, y_ij) of the i-th marker point and the lens distortion parameters at the current focus distance FD_j, obtain the corrected image point coordinates (x'_ij, y'_ij) of the i-th marker point:

x'_ij = x_ij (1 + k_1j r² + k_2j r⁴ + k_3j r⁶) + 2 p_1j x_ij y_ij + p_2j (r² + 2 x_ij²),
y'_ij = y_ij (1 + k_1j r² + k_2j r⁴ + k_3j r⁶) + p_1j (r² + 2 y_ij²) + 2 p_2j x_ij y_ij,
r² = x_ij² + y_ij²,

where k_3j = 0 when the lens is a non-fisheye lens;
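A sketch of the standard Brown-Conrady radial-plus-tangential model that the parameters k_1j, k_2j, k_3j, p_1j, p_2j describe, in normalized image coordinates; this is the usual form of such a correction and an assumption here, since the patent's own formula image is not reproduced:

```python
def apply_distortion(x, y, k1, k2, k3, p1, p2):
    """Brown-Conrady distortion in normalized image coordinates:
    radial terms k1, k2, k3 and tangential terms p1, p2. For
    non-fisheye lenses k3 = 0, as the text states."""
    r2 = x * x + y * y
    radial = 1 + k1 * r2 + k2 * r2 ** 2 + k3 * r2 ** 3
    xd = x * radial + 2 * p1 * x * y + p2 * (r2 + 2 * x * x)
    yd = y * radial + p1 * (r2 + 2 * y * y) + 2 * p2 * x * y
    return xd, yd
```

With all five coefficients zero the mapping is the identity, which is a quick sanity check when wiring calibration data into the pipeline.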
The relation between the world coordinates (x_i, y_i, z_i) of the i-th marker point and the corresponding corrected image point coordinates (x'_ij, y'_ij) is expressed as:

λ_j [x'_ij, y'_ij, 1]^T = M_j [R_pj, T_pj] [x_i, y_i, z_i, 1]^T    (1)

where λ_j is called the scale factor, the rotation matrix R_pj between the camera coordinate system and the world coordinate system is an orthogonal matrix, and T_pj is the translation vector of the camera perspective center in the world coordinate system:

T_pj = [T^c_xj, T^c_yj, T^c_zj]^T    (2)

From formulas (1), (2) and the world coordinates (x_i, y_i, z_i) of each marker point and the corresponding corrected image point coordinates (x'_ij, y'_ij), obtain the matrix O_j = λ_j [R_pj, T_pj] and the attitude information of the camera in the world coordinate system [θ^c_xj, θ^c_yj, θ^c_zj, T^c_xj, T^c_yj, T^c_zj], where [θ^c_xj, θ^c_yj, θ^c_zj] are the rotation Euler angles that make the world coordinate system coincide with the camera coordinate system after rotating successively about the X-axis, Y-axis and Z-axis followed by a translation, and [T^c_xj, T^c_yj, T^c_zj]^T is the translation vector of the camera perspective center in the world coordinate system;

Since the matrix R_pj is a unit orthogonal matrix:

λ_j = 1 / ||O_j(:, 1)||,

where O_j(:, 1) is the 1st column of the matrix O_j and || · || denotes the Euclidean norm of a vector;

From the obtained rotation matrix R_pj between the camera coordinate system and the world coordinate system and the translation vector T_pj of the camera perspective center in the world coordinate system, the attitude information of the camera in the world coordinate system is expressed as the matrix H_pj:

H_pj = [ R_pj  T_pj
         0     1    ];
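Recovering λ_j, R_pj and T_pj once O_j = λ_j [R_pj, T_pj] is known can be sketched as follows, using λ_j = 1/||O_j(:, 1)|| exactly as in the text (the first column of R_pj is a unit vector, so its scaled copy fixes λ_j). How O_j itself is estimated from the marker correspondences, e.g. by a direct linear transform, is outside this sketch, and the function name is hypothetical:

```python
import numpy as np

def pose_from_O(O):
    """Recover the scale factor and the camera pose matrix H_pj from the
    3x4 matrix O = lambda * [R | T]. The first column of a rotation matrix
    has unit norm, so lambda = 1 / ||O[:, 0]||."""
    lam = 1.0 / np.linalg.norm(O[:, 0])
    R = lam * O[:, :3]          # rotation matrix R_pj
    T = lam * O[:, 3]           # translation vector T_pj
    H = np.eye(4)               # assemble H_pj = [R, T; 0, 1]
    H[:3, :3] = R
    H[:3, 3] = T
    return lam, H
```

In practice one might re-orthogonalize R (e.g. via SVD) to suppress noise before assembling H_pj; the sketch omits that step.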
S7. Using the current positional information provided by the camera attitude external tracking device obtained in S5 and the attitude information of the camera in the world coordinate system obtained in S6, obtain the camera correction calibration information and field angle;
S7.1 First calculate the camera attitude correction matrix. The calculation mainly exploits the invariance of the relative position between the camera and the optical tracking device; this invariance only holds for a single installation, so if the optical tracking device is reinstalled, the camera position must be calibrated again. The camera attitude correction matrix H_j can be calculated by the following formula:

H_j = H_pj H_Ij⁻¹    (5)
S7.2 Transform the camera attitude correction matrix H_j into the correction quaternion (θ_j, n_xj, n_yj, n_zj) and the translation vector T_j:

The camera attitude correction matrix H_j obtained from formula (5) is a 4 × 4 matrix; it is inconvenient to store and inconvenient to interpolate in subsequent operations, and is therefore transformed into a form convenient for storage and computation.
The camera attitude correction matrix H_j is a 4 × 4 matrix whose effective part is H_j(1:3, 1:4), i.e. the first three rows of H_j; H_j(1:3, 1:4) is expressed in the form:

H_j(1:3, 1:4) = [R_j, T_j];

where the rotation matrix R_j is a 3 × 3 real matrix and the translation vector T_j = [t_xj, t_yj, t_zj]^T is a 3-dimensional vector;

Since a rotation matrix and a quaternion are interconvertible, the rotation matrix R_j is converted to the correction quaternion (θ_j, n_xj, n_yj, n_zj) according to:

2θ_j = arccos((tr(R_j) − 1) / 2),
n⃗_j = (n_xj, n_yj, n_zj) = (R_j(3,2) − R_j(2,3), R_j(1,3) − R_j(3,1), R_j(2,1) − R_j(1,2)) / (2 sin 2θ_j),

where n⃗_j = (n_xj, n_yj, n_zj) is a 3-dimensional row vector;
Thus to store R_j, only the correction quaternion (θ_j, n_xj, n_yj, n_zj) needs to be stored; the camera attitude correction matrix H_j is therefore converted to the camera correction calibration information (θ_j, n_xj, n_yj, n_zj, t_xj, t_yj, t_zj), where the correction quaternion and the translation vector together are referred to as the correction calibration information.
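The conversion between the rotation matrix R_j and the (θ_j, n_xj, n_yj, n_zj) form can be sketched with the standard axis-angle relations, noting that with q = (cos θ, n⃗ sin θ) the full rotation angle is 2θ. Degenerate angles (sin 2θ ≈ 0) are not handled in this sketch, and the function names are hypothetical:

```python
import numpy as np

def rot_to_correction_quat(R):
    """Rotation matrix -> (theta, n) with q = (cos theta, n sin theta),
    so the rotation angle about axis n is 2*theta. Axis extraction uses
    the skew-symmetric part of R, the standard conversion."""
    angle = np.arccos(np.clip((np.trace(R) - 1) / 2, -1.0, 1.0))  # = 2*theta
    theta = angle / 2
    n = np.array([R[2, 1] - R[1, 2], R[0, 2] - R[2, 0], R[1, 0] - R[0, 1]])
    return theta, n / (2 * np.sin(angle))

def correction_quat_to_rot(theta, n):
    """Inverse conversion via Rodrigues' formula with rotation angle 2*theta."""
    K = np.array([[0, -n[2], n[1]], [n[2], 0, -n[0]], [-n[1], n[0], 0]])
    a = 2 * theta
    return np.eye(3) + np.sin(a) * K + (1 - np.cos(a)) * (K @ K)
```

Round-tripping through both functions reproduces the original matrix, which is a convenient check when packing the correction calibration information.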
S7.3 Calculate the camera field angle: when the focus distance is FD_j, calculate the camera field angle α_j according to:

α_j = 2 arctan(W / (2 f_xj)),

where W, H are the horizontal and vertical resolution of the camera (the vertical field angle follows analogously from H and f_yj).
S8. Adjust the focus distance and perform steps S4-S7 for each focus distance sample FD_j until all focus distance samples have been traversed, obtaining the correction calibration information (θ_j, n_xj, n_yj, n_zj, t_xj, t_yj, t_zj) and field angle α_j corresponding to each focus distance sample, and build the look-up table LUT1 of focus distance - correction calibration information - field angle;
S9. Export LUT1 to the image rendering engine; according to the look-up table LUT1, the image rendering engine obtains the corresponding correction calibration information and field angle from the focus distance.

Table 1. Format of the focus distance - correction calibration information - field angle look-up table LUT1
When the lens is a fixed-focus lens and the image generation engine uses the look-up table LUT1 to obtain the correction calibration information and field angle corresponding to a focus distance, the focus distance samples recorded in LUT1 are finite, so the chosen focus distance will not always fall exactly on a sample point. For a focus distance at a non-sample point, the corresponding correction calibration information can be obtained by interpolation; since the correction quaternion, the translation vector and the field angle have different mathematical characteristics, different interpolation algorithms can be adopted for each to achieve a more accurate interpolation result. Specifically, the SLERP interpolation method is adopted for the correction quaternion in the correction calibration information, linear interpolation is adopted for the translation vector (t_x, t_y, t_z), and linear interpolation (or another form of interpolation) is adopted for the field angle.
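A LUT1 query with the mixed interpolation scheme just described (SLERP for the quaternion, linear for translation and field angle) might be sketched like this; the dict-based table layout and function names are assumptions for illustration, not the patent's data format:

```python
import numpy as np

def slerp(q0, q1, t):
    # SLERP between unit quaternions (w, x, y, z); shorter arc, lerp fallback.
    dot = float(np.dot(q0, q1))
    if dot < 0.0:
        q1, dot = -q1, -dot
    if dot > 0.9995:
        q = q0 + t * (q1 - q0)
        return q / np.linalg.norm(q)
    th = np.arccos(dot)
    return (np.sin((1 - t) * th) * q0 + np.sin(t * th) * q1) / np.sin(th)

def lut1_lookup(lut, fd):
    """Query a LUT1-style table: entries {fd, q, T, fov} sorted by fd.
    The quaternion is SLERP-interpolated; translation and field angle
    are linearly interpolated, per the scheme described in the text."""
    fds = [e["fd"] for e in lut]
    j = max(0, min(len(lut) - 2, int(np.searchsorted(fds, fd)) - 1))
    a, b = lut[j], lut[j + 1]
    t = (fd - a["fd"]) / (b["fd"] - a["fd"])
    return {
        "q": slerp(np.asarray(a["q"], float), np.asarray(b["q"], float), t),
        "T": (1 - t) * np.asarray(a["T"], float) + t * np.asarray(b["T"], float),
        "fov": (1 - t) * a["fov"] + t * b["fov"],
    }
```

A LUT2 query for a zoom lens would extend this with a second index over focal length samples and nest the interpolation, as in the three cases described earlier.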
Preferably, the method of interpolating the correction quaternion using the SLERP algorithm is as follows:
If FD_j < FD < FD_{j+1}, and the camera attitude correction matrices at focus distances FD_j and FD_{j+1} are H_j and H_{j+1} respectively, with rotation matrices R_j and R_{j+1} expressed as the correction quaternions q_j = (cos θ_j, n⃗_j sin θ_j) and q_{j+1} = (cos θ_{j+1}, n⃗_{j+1} sin θ_{j+1}), then the correction quaternion q at the current actual focus distance FD is calculated by:

q = q_j (q_j⁻¹ q_{j+1})^t,  t = (FD − FD_j) / (FD_{j+1} − FD_j),

where q_j⁻¹ is the inverse of q_j.
Thus the corresponding correction calibration information and field angle can be obtained for any value of the focus distance.
2. Zoom lens (see Fig. 3)

For a camera positioning correction calibration system with a zoom lens, in addition to changes in focus distance, changes in focal length must also be considered, and the correction calibration information and field angle must be obtained for every combination of focus distance sample and focal length sample; see Fig. 3:
S1. Obtain the world coordinates of the marker points:

Use the space measurement equipment to measure the coordinates of the N marker points A_1, ..., A_N on the background screen in the world coordinate system, A_1(x_1, y_1, z_1), ..., A_N(x_N, y_N, z_N); the N marker points include at least 3 non-collinear marker points; each marker point coordinate is the center position of the marker point, and all marker points lie within the picture captured by the camera; the coordinate of the i-th marker point in the world coordinate system is A_i(x_i, y_i, z_i), i = 1 to N;
S2. Determine the focus distance samples and focal length samples:

Through the servo motor control system, determine focus distance samples in increasing order according to the lens calibration process, FD_1, FD_2, ..., FD_j, ..., FD_J (j = 1 to J), where FD_1 and FD_J are the minimum and maximum focus distances of the lens, respectively; in addition, determine focal length samples in increasing order, FL_1, FL_2, ..., FL_k, ..., FL_K (k = 1 to K), where FL_1 and FL_K are the minimum and maximum focal lengths of the lens, respectively.
S3. Obtain and record the lens intrinsic parameters and lens distortion parameters corresponding to each focus distance sample and focal length sample; the lens intrinsic parameters and lens distortion parameters are obtained through the lens calibration process;
Preferably, for the j-th focus distance sample FD_j and the k-th focal length sample FL_k, the corresponding lens intrinsic parameters are f_xjk, f_yjk, c_xjk, c_yjk, where:

f_xjk: when the focus distance is FD_j and the focal length is FL_k, the ratio of the distance f_jk from the intersection of the lens exit pupil and the lens optical axis to the imaging surface, to the horizontal width dx of each unit of the imager, i.e. f_xjk = f_jk / dx;

f_yjk: when the focus distance is FD_j and the focal length is FL_k, the ratio of the distance f_jk from the intersection of the lens exit pupil and the lens optical axis to the imaging surface, to the vertical height dy of each unit of the imager, i.e. f_yjk = f_jk / dy;
c_xjk: when the focus distance is FD_j and the focal length is FL_k, the horizontal offset in pixels of the intersection of the optical axis with the imaging surface from the center of the imaging surface;

c_yjk: when the focus distance is FD_j and the focal length is FL_k, the vertical offset in pixels of the intersection of the optical axis with the imaging surface from the center of the imaging surface;
For the j-th focus distance sample FD_j and the k-th focal length sample FL_k, the corresponding lens distortion parameters are k_1jk, k_2jk, k_3jk, p_1jk and p_2jk, where k_1jk, k_2jk and k_3jk are the radial distortion parameters when the focus distance is FD_j and the focal length is FL_k; k_3jk is an optional parameter: when the lens is a fisheye lens all three radial distortion parameters k_1jk, k_2jk and k_3jk are used, while when the lens is a non-fisheye lens k_3jk = 0, i.e. other lenses only use the two radial distortion parameters k_1jk, k_2jk; p_1jk, p_2jk are the tangential distortion parameters when the focus distance is FD_j and the focal length is FL_k;

The intrinsic matrix M_jk at each focus distance FD_j and focal length FL_k is obtained as:

M_jk = [ f_xjk  0      c_xjk
         0      f_yjk  c_yjk
         0      0      1     ];
S4. When the focus distance takes the j-th focus distance sample FD_j and the focal length is at the k-th sample FL_k, adjust the camera position so that the marker points on the background screen are imaged without blur;
S5. Obtain the current positional information [θ_xjk, θ_yjk, θ_zjk, T_xjk, T_yjk, T_zjk] transmitted by the camera attitude external tracking device, such as an optical tracking device or a mechanical arm. Here [θ_xjk, θ_yjk, θ_zjk] are the rotation Euler angles that make the world coordinate system coincide with the tracking device's self-defined coordinate system after rotating successively about the X-axis, Y-axis and Z-axis followed by a translation, and [T_xjk, T_yjk, T_zjk] are the position coordinates of the tracking device in the world coordinate system. The current positional information provided by the tracking device is expressed in matrix form:

H_Ijk = [ R_Ijk  T_Ijk
          0      1     ],

where R_Ijk is the rotation matrix composed from the Euler angles [θ_xjk, θ_yjk, θ_zjk],

T_Ijk = [T_xjk, T_yjk, T_zjk]^T,

0 = [0, 0, 0]
S6. Using the world coordinates of the N marker points obtained in S1, the image point coordinates of the marker points in the camera imaging coordinate system, and the lens intrinsic parameters and lens distortion parameters obtained in S3, obtain the rotation matrix R_pjk between the camera coordinate system and the world coordinate system and the translation vector T_pjk of the camera perspective center in the world coordinate system, i.e. the attitude information of the camera in the world coordinate system;

When the current focus distance is FD_j and the current focal length is FL_k, the method of obtaining the rotation matrix R_pjk between the camera coordinate system and the world coordinate system, the translation vector T_pjk of the camera perspective center in the world coordinate system, and the attitude matrix H_pjk of the camera in the world coordinate system is as follows:

From the image point coordinates (x_ijk, y_ijk) of the i-th marker point at the current focus distance FD_j and current focal length FL_k, and the lens distortion parameters, obtain the corrected image point coordinates (x'_ijk, y'_ijk):

x'_ijk = x_ijk (1 + k_1jk r² + k_2jk r⁴ + k_3jk r⁶) + 2 p_1jk x_ijk y_ijk + p_2jk (r² + 2 x_ijk²),
y'_ijk = y_ijk (1 + k_1jk r² + k_2jk r⁴ + k_3jk r⁶) + p_1jk (r² + 2 y_ijk²) + 2 p_2jk x_ijk y_ijk,
r² = x_ijk² + y_ijk²,

where k_3jk = 0 when the lens is a non-fisheye lens;
World coordinates (the x of ith gauge point
_{i}, y
_{i}, z
_{i}) picpointed coordinate after the correction corresponding with it (x '
_{ijk}, y '
_{ijk}) between relation be expressed as:
Wherein λ
_{jk}be called scale factor,
Rotation matrix R wherein between camera coordinates system and world coordinate system
_{pjk}for orthogonal matrix, T
_{pjk}for the video camera centre of perspectivity is at the translation vector of world coordinate system:
From equations (3) and (4), the world coordinates (x_i, y_i, z_i) of each marker point and the corresponding corrected image-point coordinates (x'_ijk, y'_ijk) yield the matrix O_jk = λ_jk [R_pjk, T_pjk] and the camera pose in the world coordinate system, where [θ^c_xjk, θ^c_yjk, θ^c_zjk] are the rotation Euler angles that make the world coordinate system coincide with the camera coordinate system after translation followed by rotations about the X-, Y- and Z-axes in turn, and [T^c_xjk, T^c_yjk, T^c_zjk]^T is the translation vector of the camera's perspective center in the world coordinate system;
Since the matrix R_pjk is a unit orthogonal matrix:
λ_jk = 1 / ||O_jk(:, 1)||,
where O_jk(:, 1) is the first column of the matrix O_jk and ||·|| denotes the Euclidean norm of a vector.
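The scale-recovery step above can be sketched as follows. The translation's rendering of O_jk(:, 1) as "the 1st row" is a MATLAB-indexing slip (O(:, 1) is the first column); the sketch follows the stated formula λ_jk = 1/||O_jk(:, 1)||, under the assumption that scaling O by λ restores the unit-orthogonal rotation block:

```python
import numpy as np

def decompose_O(O):
    """Split a 3x4 matrix O_jk into scale factor, rotation R_pjk and
    translation T_pjk. Per the text, lambda_jk = 1/||O_jk(:, 1)||
    (Euclidean norm of the FIRST COLUMN); multiplying O by lambda makes
    the first column of R unit-length, as a unit orthogonal matrix requires."""
    lam = 1.0 / np.linalg.norm(O[:, 0])
    RT = lam * O
    return lam, RT[:, :3], RT[:, 3]
```

A quick check: for O built as a known [R, T] times a constant, the function recovers R and T exactly.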
From the obtained rotation matrix R_pjk between the camera coordinate system and the world coordinate system and the translation vector T_pjk of the camera's perspective center in the world coordinate system, the camera pose in the world coordinate system is expressed as the matrix H_pjk:
S7. Using the current position information provided by the external camera-pose tracking device in the current state (obtained in S5) and the camera pose in the world coordinate system found in S6, compute the camera correction calibration information and the field of view.
S7.1 Compute the camera pose correction matrix:
Computing the camera pose correction matrix relies mainly on the invariance of the relative position between the camera and the optical tracking device. This relative position stays constant only for a single installation; if the optical tracking device is reinstalled, the camera position must be calibrated again. The camera pose correction matrix H_jk can then be computed using equation (6),
where
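Equation (6) survives only as an image in the original. A natural reading — stated here as an assumption, not the patent's confirmed formula — is that H_jk relates the tracker-reported pose H_Ijk to the optically measured camera pose H_pjk:

```python
import numpy as np

def correction_matrix(H_I, H_p):
    """Sketch of the pose-correction matrix of eq. (6), which is lost as
    an image in the original. Assumed reading: H_jk maps the tracker pose
    H_Ijk to the measured camera pose H_pjk, i.e. H_jk = inv(H_Ijk) @ H_pjk.
    Both inputs are 4x4 homogeneous pose matrices."""
    return np.linalg.inv(H_I) @ H_p
```

Under this reading, if the tracker pose equals the world frame (identity), the correction matrix is simply the measured camera pose.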
S7.2 Convert the camera pose correction matrix H_jk into a correction quaternion (θ_jk, n_xjk, n_yjk, n_zjk) and a translation vector T_jk:
The camera pose correction matrix H_jk obtained from equation (6) is a 4 × 4 matrix, which is inconvenient to store and to interpolate in subsequent computation; it therefore needs to be converted into a form that is easier to store and operate on.
The camera pose correction matrix H_jk is a 4 × 4 matrix whose effective part is H_jk(1:3, 1:4), i.e. the first three rows of H_jk, written as:
H_jk(1:3, 1:4) = [R_jk, T_jk],
where the rotation matrix R_jk is a 3 × 3 real matrix and the translation vector T_jk = [t_xjk, t_yjk, t_zjk]^T is a 3-dimensional vector;
Since a rotation matrix and a quaternion are interconvertible, the rotation matrix R_jk is converted into the correction quaternion (θ_jk, n_xjk, n_yjk, n_zjk) according to the following formula:
Here the axis part [n_xjk, n_yjk, n_zjk] is a 3-dimensional row vector.
Thus storing R_jk only requires storing (θ_jk, n_xjk, n_yjk, n_zjk), so the camera pose correction matrix H_jk is converted into the camera correction calibration information (θ_jk, n_xjk, n_yjk, n_zjk, t_xjk, t_yjk, t_zjk); the correction quaternion and the translation vector together are referred to as the correction calibration information.
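The conversion formula itself is an image in the original, but the (θ, n_x, n_y, n_z) parameterization described is the angle-axis form of the rotation. A textbook sketch, which may differ from the patent's own convention:

```python
import math
import numpy as np

def rotation_to_angle_axis(R):
    """Convert a 3x3 rotation matrix to (theta, nx, ny, nz): rotation angle
    plus unit rotation axis, matching the (theta_jk, n_xjk, n_yjk, n_zjk)
    storage form in the text. Standard angle-axis extraction: theta from
    the trace, axis from the skew-symmetric part of R."""
    c = (np.trace(R) - 1.0) / 2.0
    theta = math.acos(max(-1.0, min(1.0, c)))       # clamp for roundoff
    if abs(math.sin(theta)) < 1e-9:                 # theta ~ 0 (or pi)
        return theta, 1.0, 0.0, 0.0                 # axis arbitrary at 0
    axis = np.array([R[2, 1] - R[1, 2],
                     R[0, 2] - R[2, 0],
                     R[1, 0] - R[0, 1]]) / (2.0 * math.sin(theta))
    return theta, axis[0], axis[1], axis[2]
```

Storing these four numbers (plus the 3-vector translation) replaces the twelve entries of H_jk(1:3, 1:4), which is the storage saving the text describes.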
S7.3 Compute the camera field of view: when the focus distance is FD_j and the focal length is FL_k, the field of view α_jk is computed according to the following formula:
where W and H are the horizontal and vertical resolution of the camera.
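The field-of-view formula is an image in the original. Under the pinhole model implied by the intrinsics f_xjk (focal distance in horizontal pixel units), a common form — assumed here, not confirmed by the text — is α = 2·arctan(W / (2·f_x)):

```python
import math

def field_of_view(W, f_x):
    """Horizontal field of view in radians from the image width W (pixels)
    and the intrinsic f_x (pixels). Assumes the usual pinhole relation
    alpha = 2 * atan(W / (2 * f_x)); the patent's own formula is an image."""
    return 2.0 * math.atan(W / (2.0 * f_x))
```

The vertical field of view follows the same pattern with H and f_y.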
S8. Adjust the focus distance and focal length and perform steps S4 ~ S7 for each focus-distance sampling point FD_j and each focal-length sampling point FL_k until all focus-distance and focal-length sampling points are traversed, obtaining the correction calibration information and field of view corresponding to each pair, and building the look-up table LUT2 of focus distance - focal length - correction calibration information - field of view; an example of the table format is shown in Table 2.
Specifically, a focal-length-first strategy may be used: first fix the focus distance and repeat S4 ~ S7 at each focal-length sampling point for that focus distance, then adjust the focus distance and repeat, until all focus-distance sampling points have been traversed. Steps S4 ~ S7 are thus performed for every focus-distance and focal-length sampling point, obtaining the corresponding correction quaternion (θ, n_x, n_y, n_z), translation vector (t_x, t_y, t_z) and field of view α.
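The focal-length-first traversal of S8 can be sketched as a nested loop (the name `calibrate_at` is hypothetical and stands for steps S4 ~ S7 at one sampling point):

```python
def build_lut2(fd_samples, fl_samples, calibrate_at):
    """Build LUT2 keyed by (focus distance, focal length). calibrate_at
    is assumed to perform steps S4-S7 and return the (quaternion,
    translation, fov) tuple -- i.e. the correction calibration
    information and field of view -- for that sampling point."""
    lut2 = {}
    for fd in fd_samples:            # fix the focus distance first
        for fl in fl_samples:        # then sweep every focal length
            lut2[(fd, fl)] = calibrate_at(fd, fl)
    return lut2
```

For a prime lens the inner loop collapses and the same sketch yields LUT1 keyed by focus distance alone.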
S9. Export the LUT to the image generation engine, so that the engine can obtain the corresponding correction calibration information and field of view from the focus distance and focal length.
Table 2: format of the focus distance - focal length - correction calibration information - field of view look-up table LUT2
When the lens is a zoom lens and the image generation engine uses LUT2 to obtain the correction calibration information and field of view for the current focus distance and focal length, the focus-distance and focal-length sampling points recorded in LUT2 are finite, so the chosen focus distance and focal length do not always fall exactly on sampling points. For focus distances and focal lengths away from the sampling points (i.e. when the actual focus distance FD is not at any focus-distance sampling point FD_j, or the actual focal length FL is not at any focal-length sampling point FL_k, or both), the corresponding correction calibration information can be obtained by interpolation. Because the correction quaternion, the translation vector and the field of view have different mathematical characteristics, different interpolation algorithms can be used for each to achieve a more accurate result. Specifically, the correction quaternion is interpolated with the SLERP method, the translation vector T_jk = [t_xjk, t_yjk, t_zjk]^T with linear interpolation, and the field of view with linear interpolation or another scheme.
Preferably, the SLERP interpolation of the correction quaternion proceeds as follows:
1. If FD_j < FD < FD_j+1 and FL_k < FL < FL_k+1, i.e. the actual focus distance FD is not at any focus-distance sampling point FD_j and the actual focal length FL is not at any focal-length sampling point FL_k, the correction quaternions q_(j,k), q_(j,k+1), q_(j+1,k), q_(j+1,k+1) corresponding to the nearest focus-distance and focal-length sampling-point combinations (FD_j, FL_k), (FD_j, FL_k+1), (FD_j+1, FL_k), (FD_j+1, FL_k+1) are interpolated according to the following formula; the correction quaternion q_(l,d) for the current actual focus distance FD and actual focal length FL is then computed by:
where:
2. If FD_j < FD < FD_j+1 and FL = FL_k, i.e. the actual focus distance FD is not at any focus-distance sampling point FD_j while the actual focal length FL is the focal-length sampling point FL_k, the correction quaternions q_(j,k), q_(j+1,k) corresponding to the nearest focus-distance sampling-point combinations (FD_j, FL_k), (FD_j+1, FL_k) are interpolated according to the following formula; the correction quaternion q_(l,d) for the current actual focus distance FD and actual focal length FL_k is then computed by:
where
3. If FD = FD_j and FL_k < FL < FL_k+1, i.e. the actual focus distance FD is the focus-distance sampling point FD_j while the actual focal length FL is not at any focal-length sampling point FL_k, the correction quaternions q_(j,k), q_(j,k+1) corresponding to the nearest focal-length sampling-point combinations (FD_j, FL_k), (FD_j, FL_k+1) are interpolated according to the following formula; the correction quaternion q_(l,d) for the current actual focus distance FD_j and actual focal length FL is then computed by:
where
Thus, for any focus distance and focal length, the corresponding correction calibration information and field of view can be obtained by table look-up and interpolation.
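The SLERP blending used in the three cases above can be sketched as follows. This is textbook SLERP over unit quaternions (w, x, y, z); the patent's (θ, n_x, n_y, n_z) angle-axis storage converts via q = (cos(θ/2), sin(θ/2)·n), and its exact blending formulas are images in the original:

```python
import math

def slerp(q0, q1, t):
    """Spherical linear interpolation between unit quaternions q0 and q1
    (tuples (w, x, y, z)) at parameter t in [0, 1]. Textbook SLERP, used
    here as a stand-in for the correction-quaternion interpolation whose
    exact formulas are lost as images."""
    dot = sum(a * b for a, b in zip(q0, q1))
    if dot < 0.0:                        # take the shorter arc
        q1, dot = tuple(-c for c in q1), -dot
    if dot > 0.9995:                     # nearly parallel: normalized lerp
        q = tuple((1 - t) * a + t * b for a, b in zip(q0, q1))
        n = math.sqrt(sum(c * c for c in q))
        return tuple(c / n for c in q)
    theta = math.acos(dot)
    s0 = math.sin((1 - t) * theta) / math.sin(theta)
    s1 = math.sin(t * theta) / math.sin(theta)
    return tuple(s0 * a + s1 * b for a, b in zip(q0, q1))
```

For case 1 (both FD and FL between sampling points), one plausible composition is to SLERP along the focus-distance axis at FL_k and at FL_k+1, then SLERP the two results along the focal-length axis, with t given by the relative position of FD and FL between their neighboring sampling points.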
The above is only a specific embodiment of the present invention, but the protection scope of the present invention is not limited thereto; any transformation or substitution that a person skilled in the art can readily conceive within the technical scope disclosed herein shall be covered by the scope of the present invention. Therefore, the protection scope of the present invention shall be determined by the protection scope of the appended claims.
Claims (20)
1. A camera position correction calibration method, characterized by comprising the steps of:
S1. extracting the coordinates of N marker points A_1, ..., A_N on the background screen in the world coordinate system: A_1(x_1, y_1, z_1), ..., A_N(x_N, y_N, z_N), where the coordinates of the i-th marker point in the world coordinate system are A_i(x_i, y_i, z_i), i = 1 ~ N; the N marker points include at least 3 non-collinear marker points; the marker-point coordinates are the center positions of the markers, and all marker points lie within the picture captured by the camera;
S2. determining focus-distance sampling points FD_1, FD_2, ..., FD_j, ..., FD_J in increasing numerical order, where j = 1 ~ J and FD_1 and FD_J are respectively the minimum and maximum focus distances of the lens;
if the lens is a zoom lens, also determining focal-length sampling points FL_1, FL_2, ..., FL_k, ..., FL_K in increasing numerical order, where k = 1 ~ K and FL_1 and FL_K are respectively the minimum and maximum focal lengths of the lens;
S3. if the lens is a prime lens, obtaining the lens intrinsic parameters and lens distortion parameters corresponding to each focus-distance sampling point FD_j;
if the lens is a zoom lens, obtaining the lens intrinsic parameters and lens distortion parameters corresponding to each focus-distance sampling point FD_j and each focal-length sampling point FL_k;
the lens intrinsic parameters and lens distortion parameters being obtained through a lens calibration process;
S4. if the lens is a prime lens, when the focus distance takes the j-th focus-distance sampling point FD_j, adjusting the camera position so that the marker points on the background screen can be imaged without blur;
if the lens is a zoom lens, when the focus distance takes the j-th focus-distance sampling point FD_j and the focal length is at the k-th sampling point FL_k, adjusting the camera position so that the marker points on the background screen can be imaged without blur;
S5. obtaining the current position information provided by the external camera-pose tracking device in the current state, comprising the rotation Euler angles that make the world coordinate system coincide with the tracking device's own coordinate system after translation followed by rotations about its X-, Y- and Z-axes in turn, and the position coordinates of the tracking device in the world coordinate system;
S6. using the world coordinates of the N marker points obtained in S1, the image-point coordinates of the marker points in the camera imaging coordinate system, and the lens intrinsic parameters and lens distortion parameters obtained in S3, obtaining the rotation matrix between the camera coordinate system and the world coordinate system and the translation vector of the camera's perspective center in the world coordinate system, i.e. the pose of the camera in the world coordinate system;
S7. using the current position information provided by the external camera-pose tracking device obtained in S5 and the camera pose in the world coordinate system found in S6, obtaining the camera correction calibration information and the field of view;
S8. if the lens is a prime lens: adjusting the focus distance and performing steps S4 ~ S7 for each focus-distance sampling point FD_j until all focus-distance sampling points are traversed, obtaining the correction calibration information and field of view corresponding to each focus-distance sampling point, and building the look-up table LUT1 of focus distance - correction calibration information - field of view;
if the lens is a zoom lens: adjusting the focus distance and focal length and performing steps S4 ~ S7 for each focus-distance sampling point FD_j and each focal-length sampling point FL_k until all focus-distance and focal-length sampling points are traversed, obtaining the correction calibration information and field of view corresponding to each pair, and building the look-up table LUT2 of focus distance - focal length - correction calibration information - field of view;
S9. if the lens is a prime lens: obtaining the corresponding correction calibration information and field of view from the focus distance according to the look-up table LUT1;
if the lens is a zoom lens: obtaining the corresponding correction calibration information and field of view from the focus distance and focal length according to the look-up table LUT2.
2. The camera position correction calibration method according to claim 1, characterized in that, in step S3,
(1) when the lens is a prime lens:
for the j-th focus-distance sampling point FD_j, the corresponding lens intrinsic parameters are f_xj, f_yj, c_xj, c_yj, where:
f_xj: at focus distance FD_j, the ratio of the distance f_j from the intersection of the lens exit pupil with the lens optical axis to the imaging plane, to the horizontal width dx of each cell of the camera imager, namely f_xj = f_j / dx;
f_yj: at focus distance FD_j, the ratio of the distance f_j from the intersection of the lens exit pupil with the lens optical axis to the imaging plane, to the vertical height dy of each cell of the camera imager, namely f_yj = f_j / dy;
c_xj: at focus distance FD_j, the horizontal offset, in pixels, of the intersection of the lens optical axis with the imaging plane from the imaging-plane center;
c_yj: at focus distance FD_j, the vertical offset, in pixels, of the intersection of the lens optical axis with the imaging plane from the imaging-plane center;
for the j-th focus-distance sampling point FD_j, the corresponding lens distortion parameters are k_1j, k_2j, k_3j, p_1j and p_2j, where k_1j, k_2j, k_3j are the radial distortion parameters at focus distance FD_j; k_3j is an optional parameter: fisheye lenses use all three radial distortion parameters k_1j, k_2j, k_3j, while for non-fisheye lenses k_3j = 0, i.e. other lenses use only the two radial distortion parameters k_1j, k_2j; p_1j, p_2j are the tangential distortion parameters at focus distance FD_j;
furthermore, the intrinsic matrix M_j at each focus distance FD_j is obtained as:
(2) when the lens is a zoom lens:
for the j-th focus-distance sampling point FD_j and the k-th focal-length sampling point FL_k, the corresponding lens intrinsic parameters are f_xjk, f_yjk, c_xjk, c_yjk, where:
f_xjk: at focus distance FD_j and focal length FL_k, the ratio of the distance f_jk from the intersection of the lens exit pupil with the lens optical axis to the imaging plane, to the horizontal width dx of each cell of the imager, namely f_xjk = f_jk / dx;
f_yjk: at focus distance FD_j and focal length FL_k, the ratio of the distance f_jk from the intersection of the lens exit pupil with the lens optical axis to the imaging plane, to the vertical height dy of each cell of the imager, namely f_yjk = f_jk / dy;
c_xjk: at focus distance FD_j and focal length FL_k, the horizontal offset, in pixels, of the intersection of the optical axis with the imaging plane from the imaging-plane center;
c_yjk: at focus distance FD_j and focal length FL_k, the vertical offset, in pixels, of the intersection of the optical axis with the imaging plane from the imaging-plane center;
for the j-th focus-distance sampling point FD_j and the k-th focal-length sampling point FL_k, the corresponding lens distortion parameters are k_1jk, k_2jk, k_3jk, p_1jk and p_2jk, where k_1jk, k_2jk and k_3jk are the radial distortion parameters at focus distance FD_j and focal length FL_k; k_3jk is an optional parameter: fisheye lenses use all three radial distortion parameters k_1jk, k_2jk and k_3jk, while for non-fisheye lenses k_3jk = 0, i.e. other lenses use only the two radial distortion parameters k_1jk, k_2jk; p_1jk, p_2jk are the tangential distortion parameters at focus distance FD_j and focal length FL_k;
and the intrinsic matrix M_jk at each focus distance FD_j and focal length FL_k is obtained as:
3. The camera position correction calibration method according to claim 2, characterized in that, in step S5,
(1) when the lens is a prime lens:
the current position information provided by the external camera-pose tracking device is [θ_xj, θ_yj, θ_zj, T_xj, T_yj, T_zj], where [θ_xj, θ_yj, θ_zj] are the rotation Euler angles that make the world coordinate system coincide with the tracking device's own coordinate system after translation followed by rotations about the X-, Y- and Z-axes in turn, and [T_xj, T_yj, T_zj] are the position coordinates of the tracking device in the world coordinate system; the current position information provided by the tracking device is expressed in matrix form:
here T_Ij = [T_xj, T_yj, T_zj]^T and 0 = [0, 0, 0];
(2) when the lens is a zoom lens:
the current position information provided by the external camera-pose tracking device is [θ_xjk, θ_yjk, θ_zjk, T_xjk, T_yjk, T_zjk], where [θ_xjk, θ_yjk, θ_zjk] are the rotation Euler angles that make the world coordinate system coincide with the tracking device's own coordinate system after translation followed by rotations about the X-, Y- and Z-axes in turn, and [T_xjk, T_yjk, T_zjk] are the position coordinates of the tracking device in the world coordinate system; the current position information provided by the tracking device is expressed in matrix form:
here T_Ijk = [T_xjk, T_yjk, T_zjk]^T and 0 = [0, 0, 0].
4. The camera position correction calibration method according to claim 3, characterized in that, in step S6,
(1) when the lens is a prime lens:
when the current focus distance is FD_j, the rotation matrix R_pj between the camera coordinate system and the world coordinate system, the translation vector T_pj of the camera's perspective center in the world coordinate system, and the camera pose matrix H_pj in the world coordinate system are obtained as follows:
from the image-point coordinates of the i-th marker point and the lens distortion parameters at the current focus distance FD_j, the corrected image-point coordinates (x'_ij, y'_ij) of the i-th marker point are obtained:
where, if the lens is not a fisheye lens, k_3j = 0;
the relation between the world coordinates (x_i, y_i, z_i) of the i-th marker point and its corresponding corrected image-point coordinates (x'_ij, y'_ij) is expressed as:
where λ_j is called the scale factor,
and the rotation matrix R_pj between the camera coordinate system and the world coordinate system is an orthogonal matrix, T_pj being the translation vector of the camera's perspective center in the world coordinate system:
from equations (1) and (2), the world coordinates (x_i, y_i, z_i) of each marker point and the corresponding corrected image-point coordinates (x'_ij, y'_ij) yield the matrix O_j = λ_j [R_pj, T_pj] and the camera pose in the world coordinate system, where [θ^c_xj, θ^c_yj, θ^c_zj] are the rotation Euler angles that make the world coordinate system coincide with the camera coordinate system after translation followed by rotations about the X-, Y- and Z-axes in turn, and [T^c_xj, T^c_yj, T^c_zj]^T is the translation vector of the camera's perspective center in the world coordinate system;
further, since the matrix R_pj is a unit orthogonal matrix:
λ_j = 1 / ||O_j(:, 1)||,
where O_j(:, 1) is the first column of the matrix O_j and ||·|| denotes the Euclidean norm of a vector;
from the obtained rotation matrix R_pj between the camera coordinate system and the world coordinate system and the translation vector T_pj of the camera's perspective center in the world coordinate system, the camera pose in the world coordinate system is expressed as the matrix H_pj;
(2) when the lens is a zoom lens:
when the current focus distance is FD_j and the current focal length is FL_k, the rotation matrix R_pjk between the camera coordinate system and the world coordinate system, the translation vector T_pjk of the camera's perspective center in the world coordinate system, and the camera pose matrix H_pjk in the world coordinate system are obtained as follows:
from the image-point coordinates of the i-th marker point at the current focus distance FD_j and current focal length FL_k, together with the lens distortion parameters, its corrected image-point coordinates (x'_ijk, y'_ijk) are obtained:
where, if the lens is not a fisheye lens, k_3jk = 0;
the relation between the world coordinates (x_i, y_i, z_i) of the i-th marker point and its corresponding corrected image-point coordinates (x'_ijk, y'_ijk) is expressed as:
where λ_jk is called the scale factor,
and the rotation matrix R_pjk between the camera coordinate system and the world coordinate system is an orthogonal matrix, T_pjk being the translation vector of the camera's perspective center in the world coordinate system:
from equations (3) and (4), the world coordinates (x_i, y_i, z_i) of each marker point and the corresponding corrected image-point coordinates (x'_ijk, y'_ijk) yield the matrix O_jk = λ_jk [R_pjk, T_pjk] and the camera pose in the world coordinate system, where [θ^c_xjk, θ^c_yjk, θ^c_zjk] are the rotation Euler angles that make the world coordinate system coincide with the camera coordinate system after translation followed by rotations about the X-, Y- and Z-axes in turn, and [T^c_xjk, T^c_yjk, T^c_zjk]^T is the translation vector of the camera's perspective center in the world coordinate system;
since the matrix R_pjk is a unit orthogonal matrix:
λ_jk = 1 / ||O_jk(:, 1)||,
where O_jk(:, 1) is the first column of the matrix O_jk and ||·|| denotes the Euclidean norm of a vector;
from the obtained rotation matrix R_pjk between the camera coordinate system and the world coordinate system and the translation vector T_pjk of the camera's perspective center in the world coordinate system, the camera pose in the world coordinate system is expressed as the matrix H_pjk.
5. The camera position correction calibration method according to claim 4, characterized in that step S7 comprises the steps of:
(1) when the lens is a prime lens:
S7.1 computing the camera pose correction matrix H_j;
S7.2 converting the camera pose correction matrix H_j into a correction quaternion (θ_j, n_xj, n_yj, n_zj) and a translation vector T_j:
the camera pose correction matrix H_j is a 4 × 4 matrix whose effective part is H_j(1:3, 1:4), i.e. the first three rows of H_j, written as:
H_j(1:3, 1:4) = [R_j, T_j];
where the rotation matrix R_j is a 3 × 3 real matrix and the translation vector T_j = [t_xj, t_yj, t_zj]^T is a 3-dimensional vector;
the rotation matrix R_j is converted into the correction quaternion (θ_j, n_xj, n_yj, n_zj) according to the following formula:
here the axis part [n_xj, n_yj, n_zj] is a 3-dimensional row vector;
thus storing R_j only requires storing the correction quaternion (θ_j, n_xj, n_yj, n_zj), so the camera pose correction matrix H_j is converted into the camera correction calibration information (θ_j, n_xj, n_yj, n_zj, t_xj, t_yj, t_zj);
S7.3 computing the camera field of view: when the focus distance is FD_j, computing the field of view α_j according to the following formula:
where W and H are respectively the horizontal and vertical resolution of the camera;
(2) when the lens is a zoom lens:
S7.1 computing the camera pose correction matrix H_jk;
S7.2 converting the camera pose correction matrix H_jk into a correction quaternion (θ_jk, n_xjk, n_yjk, n_zjk) and a translation vector T_jk:
the camera pose correction matrix H_jk is a 4 × 4 matrix whose effective part is H_jk(1:3, 1:4), i.e. the first three rows of H_jk, written as:
H_jk(1:3, 1:4) = [R_jk, T_jk];
where the rotation matrix R_jk is a 3 × 3 real matrix and the translation vector T_jk = [t_xjk, t_yjk, t_zjk]^T is a 3-dimensional vector;
the rotation matrix R_jk is converted into the correction quaternion (θ_jk, n_xjk, n_yjk, n_zjk) according to the following formula:
here the axis part [n_xjk, n_yjk, n_zjk] is a 3-dimensional row vector;
thus storing R_jk only requires storing the correction quaternion (θ_jk, n_xjk, n_yjk, n_zjk), so the camera pose correction matrix H_jk is converted into the camera correction calibration information (θ_jk, n_xjk, n_yjk, n_zjk, t_xjk, t_yjk, t_zjk);
S7.3 computing the camera field of view: when the focus distance is FD_j and the focal length is FL_k, computing the field of view α_jk according to the following formula:
where W and H are respectively the horizontal and vertical resolution of the camera.
6. The camera position correction calibration method according to claim 5, characterized in that, in step S9, when the actual focus distance FD is not at any focus-distance sampling point FD_j, or the actual focal length FL is not at any focal-length sampling point FL_k, or both, the corresponding correction calibration information and field of view are obtained by the following interpolation:
the correction quaternion in the correction calibration information is interpolated with the SLERP method, the translation vector with linear interpolation, and the field of view with linear interpolation.
7. The camera position correction calibration method according to claim 6, characterized in that, in step S9,
(1) when the lens is a prime lens:
when the actual focus distance FD is not at any focus-distance sampling point FD_j, the SLERP interpolation of the correction quaternion proceeds as follows:
if FD_j < FD < FD_j+1, and the camera pose correction matrices at focus distances FD_j and FD_j+1 are respectively H_j and H_j+1, with rotation matrices respectively R_j and R_j+1 expressed as correction quaternions q_j and q_j+1, then the correction quaternion q for the current actual focus distance FD is computed by the following formula:
here q_j^(-1) is the inverse of q_j;
(2) when the lens is a zoom lens:
when the actual focus distance FD is not at any focus-distance sampling point FD_j, or the actual focal length FL is not at any focal-length sampling point FL_k, or both, the SLERP interpolation of the correction quaternion proceeds as follows:
1. if FD_j < FD < FD_j+1 and FL_k < FL < FL_k+1, i.e. the actual focus distance FD is not at any focus-distance sampling point FD_j and the actual focal length FL is not at any focal-length sampling point FL_k, the correction quaternions q_(j,k), q_(j,k+1), q_(j+1,k), q_(j+1,k+1) corresponding to the nearest focus-distance and focal-length sampling-point combinations (FD_j, FL_k), (FD_j, FL_k+1), (FD_j+1, FL_k), (FD_j+1, FL_k+1) are interpolated according to the following formula; the correction quaternion q_(l,d) for the current actual focus distance FD and actual focal length FL is then computed by:
where:
2. if FD_j < FD < FD_j+1 and FL = FL_k, i.e. the actual focus distance FD is not at any focus-distance sampling point FD_j while the actual focal length FL is the focal-length sampling point FL_k, the correction quaternions q_(j,k), q_(j+1,k) corresponding to the nearest focus-distance sampling-point combinations (FD_j, FL_k), (FD_j+1, FL_k) are interpolated according to the following formula; the correction quaternion q_(l,d) for the current actual focus distance FD and actual focal length FL_k is then computed by:
where
3. if FD = FD_j and FL_k < FL < FL_k+1, i.e. the actual focus distance FD is the focus-distance sampling point FD_j while the actual focal length FL is not at any focal-length sampling point FL_k, the correction quaternions q_(j,k), q_(j,k+1) corresponding to the nearest focal-length sampling-point combinations (FD_j, FL_k), (FD_j, FL_k+1) are interpolated according to the following formula; the correction quaternion q_(l,d) for the current actual focus distance FD_j and actual focal length FL is then computed by:
where
8. A camera position correction calibration system, characterized in that it comprises: a servo motor control system, a video camera, an external camera-attitude tracking device, a background screen, marker points, space measurement equipment, a data processing device, and an image rendering engine;
Wherein the servo motor control system is connected to the video camera and is used to adjust the focal length and focusing distance of the lens; the servo motor control system is also connected to the data processing device and sends it the lens focal length and focusing distance information, so that the data processing device calculates the correction calibration information and field of view angle and builds a lookup table;
The data processing device is also connected to the video camera, for reading the video stream data in real time;
The external camera-attitude tracking device is arranged outside the video camera and is used to estimate the position and attitude of the video camera from the measured camera position and the offset of the tracking device; it is also connected to the data processing device, to which it sends its measurement data;
At least three non-collinear marker points are arranged on the background screen; the marker points have identical radii, and their color contrasts with the background screen;
The space measurement equipment and the image rendering engine are each connected to the data processing device; the space measurement equipment measures the world coordinates of the marker points' home positions and sends them to the data processing device, and the image rendering engine obtains the corresponding correction calibration information and field of view angle from the lookup table built by the data processing device.
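The lookup table that the data processing device builds and the image rendering engine queries might be modeled as below. The field names and the nearest-sample query are illustrative assumptions; the patent only specifies that the table maps lens focal length and focusing distance to correction calibration information and field of view angle.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class CalibrationEntry:
    """One row of the lookup table built by the data processing device."""
    focal_length: float     # FL reported by the servo motor control system
    focus_distance: float   # FD reported by the servo motor control system
    correction_quat: tuple  # correction calibration information (w, x, y, z)
    fov_deg: float          # field of view angle handed to the rendering engine

def query(table, fl, fd):
    """Return the entry whose (FL, FD) sample lies nearest the requested pair."""
    return min(table, key=lambda e: (e.focal_length - fl) ** 2 +
                                    (e.focus_distance - fd) ** 2)
```

In practice the rendering engine would interpolate between neighboring entries (as in claim 7) rather than snap to the nearest sample; nearest-neighbor lookup is shown only to keep the sketch short.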
9. The camera position correction calibration method according to claim 1, characterized in that, in S1, the space measurement equipment is used to obtain the coordinates of the N marker points A_{1}, …, A_{N} on the background screen in the world coordinate system.
10. The camera position correction calibration method according to claim 1, characterized in that the external camera-attitude tracking device is an optical tracking device or a mechanical arm.
11. The camera position correction calibration method according to claim 1, characterized in that the origin of the camera imaging coordinate system is set at the center of the field of view.
12. The camera position correction calibration method according to claim 1, characterized in that the focusing-distance sampling points and focal-length sampling points are the sampling points used during the lens calibration process.
Priority Applications (1)
Application Number  Priority Date  Filing Date  Title 

CN201510489677.0A CN105118055B (en)  20150811  20150811  Camera position amendment scaling method and system 
Publications (2)
Publication Number  Publication Date 

CN105118055A true CN105118055A (en)  20151202 
CN105118055B CN105118055B (en)  20171215 
Family
ID=54666029
Family Applications (1)
Application Number  Title  Priority Date  Filing Date 

CN201510489677.0A Active CN105118055B (en)  20150811  20150811  Camera position amendment scaling method and system 
Country Status (1)
Country  Link 

CN (1)  CN105118055B (en) 
Citations (4)
Publication number  Priority date  Publication date  Assignee  Title 

US5991437A (en) *  19960712  19991123  RealTime Geometry Corporation  Modular digital audio system having individualized functional modules 
JP2005003463A (en) *  20030610  20050106  Topcon Corp  Calibration chart image display device, calibration apparatus, and calibration method 
CN1906943A (en) *  20040130  20070131  株式会社丰田自动织机  Video image positional relationship correction apparatus, steering assist apparatus having the video image positional relationship correction apparatus and video image positional relationship correcti 
CN101447073A (en) *  20071126  20090603  新奥特（北京）视频技术有限公司  Zoom lens calibration method 

NonPatent Citations (3)
Title 

C.S. FRASER ET AL: "Zoom-Dependent camera calibration in digital close-range photogrammetry", PHOTOGRAMMETRIC ENGINEERING & REMOTE SENSING *
DONG Mingli et al.: "Distortion model and calibration method varying with focusing state and object distance", Chinese Journal of Scientific Instrument *
YAN Li et al.: "Space resection of linear-array CCD imagery described by quaternions", Geomatics and Information Science of Wuhan University *
Cited By (31)
Publication number  Priority date  Publication date  Assignee  Title 

CN108702450B (en) *  20160219  20201027  快图有限公司  Camera module for image capture device 
CN108702450A (en) *  20160219  20181023  快图有限公司  Stablize the method for image sequence 
CN107306325A (en) *  20160422  20171031  宁波舜宇光电信息有限公司  The device of utilization space measurement of coordinates camera module visual field and its application 
CN107809610B (en) *  20160908  20210611  松下知识产权经营株式会社  Camera parameter set calculation device, camera parameter set calculation method, and recording medium 
US11233983B2 (en)  20160908  20220125  Panasonic Intellectual Property Management Co., Ltd.  Cameraparameterset calculation apparatus, cameraparameterset calculation method, and recording medium 
CN107809610A (en) *  20160908  20180316  松下知识产权经营株式会社  Camera parameter set calculating apparatus, camera parameter set calculation method and program 
CN107223255A (en) *  20161020  20170929  深圳达闼科技控股有限公司  A kind of image preview method and device based on iris recognition 
CN106990776A (en) *  20170227  20170728  广东省智能制造研究所  Robot goes home localization method and system 
CN107562189A (en) *  20170721  20180109  广州励丰文化科技股份有限公司  A kind of spacelocation method and service equipment based on binocular camera 
CN107492126A (en) *  20170803  20171219  厦门云感科技有限公司  Calibration method, device, system, medium and the equipment of camera central shaft 
CN107492126B (en) *  20170803  20191105  厦门云感科技有限公司  Calibration method, device, system, medium and the equipment of camera central axis 
CN109523597A (en) *  20170918  20190326  百度在线网络技术（北京）有限公司  The scaling method and device of Camera extrinsic 
CN107665483B (en) *  20170927  20200505  天津智慧视通科技有限公司  Calibrationfree convenient monocular head fisheye image distortion correction method 
CN107665483A (en) *  20170927  20180206  天津智慧视通科技有限公司  Exempt from calibration easily monocular camera lens fish eye images distortion correction method 
CN107610185A (en) *  20171012  20180119  长沙全度影像科技有限公司  A kind of fisheye camera fast calibration device and scaling method 
CN107959794A (en) *  20171129  20180424  天津聚飞创新科技有限公司  Data Modeling Method, device and data capture method, device and electronic equipment 
CN108282651A (en) *  20171218  20180713  北京小鸟看看科技有限公司  Antidote, device and the virtual reality device of camera parameter 
CN110389349A (en) *  20180417  20191029  北京京东尚科信息技术有限公司  Localization method and device 
CN110389349B (en) *  20180417  20210817  北京京东尚科信息技术有限公司  Positioning method and device 
CN110858403A (en) *  20180822  20200303  杭州萤石软件有限公司  Method for determining scale factor in monocular vision reconstruction and mobile robot 
CN109788277B (en) *  20190108  20200804  浙江大华技术股份有限公司  Method and device for compensating optical axis deviation of antishake movement and storage medium 
CN109788277A (en) *  20190108  20190521  浙江大华技术股份有限公司  Compensation method, device and the storage medium of the optical axis deviation of stabilization machine core 
CN109887041A (en) *  20190305  20190614  中测国检（北京）测绘仪器检测中心  A kind of method of mechanical arm control digital camera photo centre position and posture 
CN110266944A (en) *  20190621  20190920  大庆安瑞达科技开发有限公司  A kind of calibration quick focusing method of remote optical monitoring system 
CN110487249A (en) *  20190717  20191122  广东工业大学  A kind of unmanned plane scaling method for structure threedimensional vibration measurement 
CN111311671A (en) *  20200512  20200619  创新奇智(南京)科技有限公司  Workpiece measuring method and device, electronic equipment and storage medium 
CN111311671B (en) *  20200512  20200807  创新奇智(南京)科技有限公司  Workpiece measuring method and device, electronic equipment and storage medium 
CN113409391A (en) *  20210625  20210917  浙江商汤科技开发有限公司  Visual positioning method and related device, equipment and storage medium 
CN113766131A (en) *  20210915  20211207  广州市明美光电技术有限公司  Multitargetpoint focusing method and application thereof 
CN114299167A (en) *  20220311  20220408  杭州灵西机器人智能科技有限公司  Monocular calibration method, system, device and medium for zoom lens 
CN114299167B (en) *  20220311  20220726  杭州灵西机器人智能科技有限公司  Monocular calibration method, system, device and medium of zoom lens 
Also Published As
Publication number  Publication date 

CN105118055B (en)  20171215 
Similar Documents
Publication  Publication Date  Title 

CN105118055A (en)  Camera positioning correction calibration method and system  
CN106846415B (en)  A kind of multichannel fisheye camera binocular calibration device and method  
CN103198487B (en)  A kind of automatic marking method for video monitoring system  
US7479982B2 (en)  Device and method of measuring data for calibration, program for measuring data for calibration, program recording medium readable with computer, and image data processing device  
CN105243637B (en)  One kind carrying out fullview image joining method based on threedimensional laser point cloud  
TWI555378B (en)  An image calibration, composing and depth rebuilding method of a panoramic fisheye camera and a system thereof  
CN104574425B (en)  A kind of demarcation of the masterslave camera chain based on rotating model and interlock method  
CN107424118A (en)  Based on the spherical panorama mosaic method for improving Lens Distortion Correction  
CN107665483B (en)  Calibrationfree convenient monocular head fisheye image distortion correction method  
CN109961485A (en)  A method of target positioning is carried out based on monocular vision  
CN103729839B (en)  A kind of method and system of sensorbased outdoor camera tracking  
CN105469412A (en)  Calibration method of assembly error of PTZ camera  
KR20090078463A (en)  Distorted image correction apparatus and method  
CN108020175A (en)  A kind of more optical grating projection binocular vision tongue body surface three dimension entirety imaging methods  
CN111461963A (en)  Fisheye image splicing method and device  
CN206460515U (en)  A kind of multichannel fisheye camera caliberating device based on stereo calibration target  
CN206460516U (en)  A kind of multichannel fisheye camera binocular calibration device  
CN107564070B (en)  Largescene binocular ranging and correcting method in overhead power transmission channel image monitoring  
CN112907647B (en)  Threedimensional space size measurement method based on fixed monocular camera  
CN112949478A (en)  Target detection method based on holder camera  
CN113724337A (en)  Camera dynamic external parameter calibration method and device without depending on holder angle  
CN105608706A (en)  Structure design and geometric calibration method for polarization vision sensor  
CN112258581A (en)  Onsite calibration method for multifish glasses head panoramic camera  
CN112785655A (en)  Method, device and equipment for automatically calibrating external parameters of allround camera based on lane line detection and computer storage medium  
CN102968784B (en)  Method for aperture synthesis imaging through multiview shooting 
Legal Events
Date  Code  Title  Description 

C06  Publication  
PB01  Publication  
C10  Entry into substantive examination  
SE01  Entry into force of request for substantive examination  
GR01  Patent grant  