CN102939562B - Object projection method and object projection system - Google Patents

Object projection method and object projection system

Info

Publication number
CN102939562B
CN102939562B CN201080066634.7A CN201080066634A CN102939562B
Authority
CN
China
Prior art keywords
camera
three-dimensional coordinate
matching point pair
projection
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201080066634.7A
Other languages
Chinese (zh)
Other versions
CN102939562A (en)
Inventor
吴迪
陈�光
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Taishan Sports Technology Co.,Ltd.
Original Assignee
SHENZHEN TOL TECHNOLOGY Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by SHENZHEN TOL TECHNOLOGY Co Ltd filed Critical SHENZHEN TOL TECHNOLOGY Co Ltd
Publication of CN102939562A publication Critical patent/CN102939562A/en
Application granted granted Critical
Publication of CN102939562B publication Critical patent/CN102939562B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B35/00Stereoscopic photography
    • G03B35/08Stereoscopic photography by simultaneous recording

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Image Analysis (AREA)

Abstract

An object projection method includes: acquiring the three-dimensional coordinates corresponding to the matching point pair (P0 and P1) of an object in a first camera (21) and a second camera (22); acquiring the affine projection matrix of a third camera (32); calculating the theoretical centroid coordinates of the object on the third camera (32); calculating the theoretical projection size of the object on the third camera (32); acquiring the actual centroid coordinates of the object's projection on the third camera (32); acquiring the actual projection size of the object on the third camera (32); and judging whether the three-dimensional coordinates corresponding to the matching point pair (P0 and P1) are correct. An object projection system using the method is also provided.

Description

Object projection method and system
Technical field
The present invention relates to an object projection method and to a target projection system using the method, and more particularly to an object projection method that includes a verification process and to a target projection system using the method.
Background art
In systems that use cameras to acquire information, correctly extracting useful information from the images is crucial. Judging whether a target has been identified correctly relies mainly on information such as the target's shape, color and size, while matching the targets identified in two cameras relies mainly on the epipolar constraint and the ordinal-consistency constraint. Systems with binocular cameras use the parallax principle to establish the three-dimensional coordinates of a target. The drawback of such systems is that verifying the correctness of target identification, matching and coordinate establishment is relatively difficult.
Summary of the invention
The technical problem to be solved by the present invention is that, in the prior-art target projection system, verifying the correctness of the identification, matching and establishment of the three-dimensional coordinates of a target is relatively difficult. In view of this defect, the invention provides an object projection method, and a target projection system using the method, in which an affine transformation relationship is established between a two-camera pair and a single camera, the information from target identification, matching and reconstruction is concentrated onto a projected image, and the projected image is used to verify the correctness of the three-dimensional coordinate identification, matching and establishment.
The technical solution adopted by the present invention to solve this problem is to construct an object projection method comprising:
obtaining the three-dimensional coordinates corresponding to the matching point pair of an object in a first camera and a second camera;
obtaining the affine projection matrix of a third camera;
a verification process:
calculating the theoretical centroid coordinates of the object on the third camera from the affine projection matrix of the third camera and the three-dimensional coordinates corresponding to the matching point pair of the object in the first camera and the second camera;
calculating the theoretical projection size of the object on the third camera from the three-dimensional coordinates corresponding to the matching point pair;
acquiring the actual centroid coordinates of the object's projection on the third camera;
acquiring the actual projection size of the object on the third camera;
and jointly judging whether the three-dimensional coordinates corresponding to the matching point pair are correct from the comparison of the theoretical centroid coordinates with the actual centroid coordinates and the comparison of the theoretical projection size with the actual projection size.
In the object projection method of the present invention, the three-dimensional coordinates corresponding to the matching point pair are obtained through a calibration process and a three-dimensional coordinate establishing process:
Calibration process: using the calibration parameters of the first camera and the second camera, obtain the first correction matrix of the first camera and the second correction matrix of the second camera that rectify the first camera and the second camera into parallel cameras.
Three-dimensional coordinate establishing process: obtain the matching point pair of the object in the first camera and the second camera by identifying and matching the target in the first camera and the second camera; then calculate the three-dimensional coordinates corresponding to the matching point pair using the first correction matrix, the second correction matrix and the matching point pair.
In the object projection method of the present invention, the calibration parameters comprise internal parameters and external matching parameters. The internal parameters are obtained with a stereo calibration board. The external matching parameters are obtained by matching, in a fixed order, the coordinate data collected by the first camera and the second camera, and then applying the Levenberg-Marquardt algorithm to perform a nonlinear least-squares optimization of the matched coordinate data.
In the object projection method of the present invention, the internal parameters comprise the intrinsic matrix
$A = \begin{bmatrix} \alpha_x & \gamma & u_0 \\ 0 & \alpha_y & v_0 \\ 0 & 0 & 1 \end{bmatrix}$, with $\alpha_x = f/dx$ and $\alpha_y = f/dy$, where $f$ is the camera focal length, $dx$ and $dy$ are the pixel spacings in the horizontal and vertical directions, $(u_0, v_0)$ is the optical center of the camera and $\gamma$ is the skew (non-perpendicularity) factor of the $u$ and $v$ axes; the radial distortion parameters $k_1$, $k_2$; and the tangential distortion parameters $p_1$, $p_2$. The external matching parameters comprise the rotation matrix
$R = \begin{bmatrix} r'_{11} & r'_{12} & r'_{13} \\ r'_{21} & r'_{22} & r'_{23} \\ r'_{31} & r'_{32} & r'_{33} \end{bmatrix}$ and the translation vector $T = \begin{bmatrix} t_x \\ t_y \\ t_z \end{bmatrix}$.
In the object projection method of the present invention, the affine projection matrix is obtained as follows:
collect the three-dimensional coordinates corresponding to the matching point pair at at least six different positions, together with the two-dimensional coordinates of the corresponding projections of the matching point pair on the third camera;
from the three-dimensional coordinates of the matching point pair at the at least six different positions and the two-dimensional coordinates of the corresponding projections on the third camera, compute the affine projection matrix by linear least squares.
In the object projection method of the present invention, the three-dimensional coordinates of the matching point pair at the at least six different positions and the two-dimensional coordinates of the corresponding projections on the third camera are obtained as follows:
take the third camera as the coordinate system and select the two-dimensional coordinates of at least six different positions in that coordinate system;
whenever the object moves so that the two-dimensional coordinates of the projection of the matching point pair on the third camera coincide with one of the selected two-dimensional coordinates, record the three-dimensional coordinates of the matching point pair at that position, until the three-dimensional coordinates corresponding to all of the at least six positions have been obtained.
The present invention also constructs a target projection system comprising a processing unit and an imaging unit, and further comprising a verification unit, wherein the imaging unit comprises a first camera and a second camera, and the verification unit comprises a verification module and a third camera.
The processing unit is configured to obtain the three-dimensional coordinates corresponding to the matching point pair of an object in the first camera and the second camera, and to obtain the affine projection matrix of the third camera.
The verification module is configured to calculate the theoretical centroid coordinates of the object on the third camera from the affine projection matrix of the third camera and the three-dimensional coordinates corresponding to the matching point pair of the object in the first camera and the second camera;
to calculate the theoretical projection size of the object on the third camera from the three-dimensional coordinates corresponding to the matching point pair;
to acquire the actual centroid coordinates of the object's projection on the third camera;
to acquire the actual projection size of the object on the third camera;
and to jointly judge whether the three-dimensional coordinates corresponding to the matching point pair are correct from the comparison of the theoretical centroid coordinates with the actual centroid coordinates and the comparison of the theoretical projection size with the actual projection size.
In the target projection system of the present invention, the processing unit comprises a calibration module and a three-dimensional coordinate establishing module.
The calibration module is configured to use the calibration parameters of the first camera and the second camera to obtain the first correction matrix of the first camera and the second correction matrix of the second camera that rectify the first camera and the second camera into parallel cameras.
The three-dimensional coordinate establishing module is configured to obtain the matching point pair of the object in the first camera and the second camera by identifying and matching the target in the first camera and the second camera, and to calculate the three-dimensional coordinates corresponding to the matching point pair using the first correction matrix, the second correction matrix and the matching point pair.
In the target projection system of the present invention, the calibration parameters comprise internal parameters and external matching parameters. The internal parameters are obtained with a stereo calibration board. The external matching parameters are obtained by matching, in a fixed order, the coordinate data collected by the first camera and the second camera, and then applying the Levenberg-Marquardt algorithm to perform a nonlinear least-squares optimization of the matched coordinate data.
In the target projection system of the present invention, the internal parameters comprise the intrinsic matrix
$A = \begin{bmatrix} \alpha_x & \gamma & u_0 \\ 0 & \alpha_y & v_0 \\ 0 & 0 & 1 \end{bmatrix}$, with $\alpha_x = f/dx$ and $\alpha_y = f/dy$, where $f$ is the camera focal length, $dx$ and $dy$ are the pixel spacings in the horizontal and vertical directions, $(u_0, v_0)$ is the optical center of the camera and $\gamma$ is the skew factor of the $u$ and $v$ axes; the radial distortion parameters $k_1$, $k_2$; and the tangential distortion parameters $p_1$, $p_2$. The external matching parameters comprise the rotation matrix $R = \begin{bmatrix} r'_{11} & r'_{12} & r'_{13} \\ r'_{21} & r'_{22} & r'_{23} \\ r'_{31} & r'_{32} & r'_{33} \end{bmatrix}$ and the translation vector $T = \begin{bmatrix} t_x \\ t_y \\ t_z \end{bmatrix}$.
In the target projection system of the present invention, the affine projection matrix is obtained as follows:
collect the three-dimensional coordinates corresponding to the matching point pair at at least six different positions, together with the two-dimensional coordinates of the corresponding projections of the matching point pair on the third camera;
from the three-dimensional coordinates of the matching point pair at the at least six different positions and the two-dimensional coordinates of the corresponding projections on the third camera, compute the affine projection matrix by linear least squares.
In the target projection system of the present invention, the three-dimensional coordinates of the matching point pair at the at least six different positions and the two-dimensional coordinates of the corresponding projections on the third camera are obtained as follows:
take the third camera as the coordinate system and select the two-dimensional coordinates of at least six different positions in that coordinate system;
whenever the object moves so that the two-dimensional coordinates of the projection of the matching point pair on the third camera coincide with one of the selected two-dimensional coordinates, record the three-dimensional coordinates of the matching point pair at that position, until the three-dimensional coordinates corresponding to all of the at least six positions have been obtained.
In the target projection system of the present invention, the first camera and the second camera are fitted with infrared-pass filters.
In the target projection system of the present invention, the verification unit comprises at least two of the third cameras.
In the target projection system of the present invention, the imaging unit further comprises at least one imaging camera in addition to the first camera and the second camera.
Implementing the object projection method of the present invention and the target projection system using the method has the following beneficial effects: a third camera carrying a verification module is added to the original target projection system, and the established mapping between three-dimensional and two-dimensional coordinates is used to verify the accuracy of the results of three-dimensional coordinate identification, matching and establishment.
Obtaining the calibration parameters in advance makes the target projection process more accurate and simpler. Obtaining the internal parameters and the external matching parameters makes the first correction matrix of the first camera and the second correction matrix of the second camera more accurate, and the rectification of the first and second cameras into parallel cameras more effective. Computing the affine projection matrix makes it possible to accurately calculate the projection size on the third camera, and the affine projection matrix can be obtained by several methods. Fitting the first camera and the second camera with infrared-pass filters eliminates interference from visible light, so the three-dimensional coordinates are established more accurately. Verification with multiple cameras carrying the verification module further improves the judgment of the correctness of the two-camera identification, matching and establishment. The verification unit can also be used in target projection imaging systems with three or more cameras.
Brief description of the drawings
The invention is further described below with reference to the accompanying drawings and embodiments, in which:
Fig. 1 is a schematic structural diagram of a preferred embodiment of the target projection system of the present invention;
Fig. 2A is one calibration image used to calibrate the internal parameters and external matching parameters of the cameras in the preferred embodiment of the target projection system of the present invention;
Fig. 2B is another calibration image used to calibrate the internal parameters and external matching parameters of the cameras in the preferred embodiment of the target projection system of the present invention;
Fig. 3 shows the arrangement and values of the two-dimensional coordinates of the matching points at 8 different positions used to compute the affine projection matrix in the preferred embodiment of the target projection system of the present invention.
Detailed description of the embodiments
Preferred embodiments of the present invention are described in detail below with reference to the drawings.
Fig. 1 is a schematic structural diagram of a preferred embodiment of the target projection system of the present invention. As shown in Fig. 1, the target projection system provided by the invention comprises a processing unit 1 and an imaging unit 2 and, in this embodiment, further comprises a verification unit 3. The imaging unit 2 comprises a first camera 21 and a second camera 22, and the verification unit 3 comprises a verification module 31 and a third camera 32. The processing unit 1 obtains the three-dimensional coordinates corresponding to the matching point pair (P0 and P1) of the object in the first camera 21 and the second camera 22, and obtains the affine projection matrix of the third camera 32. The verification module 31 calculates the theoretical centroid coordinates of the object on the third camera 32 from the affine projection matrix of the third camera 32 and the three-dimensional coordinates corresponding to the matching point pair (P0 and P1) of the object in the first camera 21 and the second camera 22; calculates the theoretical projection size of the object on the third camera 32 from the three-dimensional coordinates corresponding to the matching point pair (P0 and P1); acquires the actual centroid coordinates of the object's projection on the third camera 32; acquires the actual projection size of the object on the third camera 32; and jointly judges whether the three-dimensional coordinates corresponding to the matching point pair (P0 and P1) are correct from the comparison of the theoretical centroid coordinates with the actual centroid coordinates and the comparison of the theoretical projection size with the actual projection size.
As shown in Fig. 1, the processing unit 1 comprises a calibration module 11 and a three-dimensional coordinate establishing module 12. The calibration module 11 uses the calibration parameters of the first camera 21 and the second camera 22 to obtain the first correction matrix of the first camera 21 and the second correction matrix of the second camera 22 that rectify the two cameras into parallel cameras. The three-dimensional coordinate establishing module 12 obtains the matching point pair (P0 and P1) of the object in the first camera 21 and the second camera 22 by identifying and matching the target in the two cameras, and calculates the three-dimensional coordinates corresponding to the matching point pair (P0 and P1) using the first correction matrix, the second correction matrix and the matching point pair (P0 and P1).
The present invention also provides an object projection method. As shown in Fig. 1, the object projection method of the present invention comprises a calibration process, a three-dimensional coordinate establishing process and a verification process. The calibration parameters of the first camera 21 and the second camera 22 used in the three-dimensional coordinate establishing process are obtained as follows.
The calibration parameters of a camera comprise internal parameters and external matching parameters.
Internal camera calibration:
The internal parameters of a camera are fixed for a fixed-focus camera system. The internal parameters comprise the intrinsic matrix
$A = \begin{bmatrix} \alpha_x & \gamma & u_0 \\ 0 & \alpha_y & v_0 \\ 0 & 0 & 1 \end{bmatrix}$, with $\alpha_x = f/dx$ and $\alpha_y = f/dy$, where $f$ is the camera focal length, $dx$ and $dy$ are the pixel spacings in the horizontal and vertical directions, $(u_0, v_0)$ is the optical center of the camera and $\gamma$ is the skew factor of the $u$ and $v$ axes; the radial distortion parameters $k_1$, $k_2$; and the tangential distortion parameters $p_1$, $p_2$.
The internal parameters of all cameras are calibrated with a stereo calibration board. The calibration images are shown in Figs. 2A and 2B. After all cameras have captured the calibration images, connected-component analysis is used to obtain the centroid coordinates of all imaged calibration blocks. These data are then used to calibrate the internal parameters of the cameras according to the flat-board calibration method. This calibration method is a common technique, and different calibration methods may be used.
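As an illustration only (not part of the patent), the sketch below uses Python with OpenCV and NumPy to extract calibration-block centroids by connected-component analysis and to run a planar calibration. The threshold value, the minimum blob area, the board layout passed as board_points_3d and the use of cv2.calibrateCamera are assumptions; the patent only states that a stereo calibration board and a flat-board calibration method are used.

```python
import cv2
import numpy as np

def calibration_block_centroids(gray_image, thresh=200, min_area=20):
    """Centroids of bright calibration blocks via connected-component analysis."""
    _, binary = cv2.threshold(gray_image, thresh, 255, cv2.THRESH_BINARY)
    num, labels, stats, centroids = cv2.connectedComponentsWithStats(binary)
    # Label 0 is the background; keep only reasonably sized blobs.
    keep = [i for i in range(1, num) if stats[i, cv2.CC_STAT_AREA] >= min_area]
    return centroids[keep].astype(np.float32)            # (N, 2) image centroids

def calibrate_intrinsics(images, board_points_3d):
    """Estimate A, k1, k2, p1, p2 from several views of the calibration board.

    board_points_3d: (N, 3) float32 array with the known block positions on the
    board (an assumed layout). A real implementation must also order the
    detected centroids consistently with board_points_3d; that step is omitted.
    """
    obj_pts, img_pts = [], []
    for img in images:
        c = calibration_block_centroids(img)
        if len(c) == len(board_points_3d):                # naive completeness check
            obj_pts.append(board_points_3d)
            img_pts.append(c.reshape(-1, 1, 2))
    h, w = images[0].shape[:2]
    rms, A, dist, _, _ = cv2.calibrateCamera(obj_pts, img_pts, (w, h), None, None)
    return A, dist    # intrinsic matrix and distortion coefficients [k1 k2 p1 p2 k3]
```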
External matching parameter calibration:
The external matching parameters of the cameras mainly describe the relative position between the cameras and comprise
the rotation matrix $R = \begin{bmatrix} r'_{11} & r'_{12} & r'_{13} \\ r'_{21} & r'_{22} & r'_{23} \\ r'_{31} & r'_{32} & r'_{33} \end{bmatrix}$ and the translation vector $T = \begin{bmatrix} t_x \\ t_y \\ t_z \end{bmatrix}$.
The centroid coordinates of the imaged calibration blocks collected above by the first camera 21 and the second camera 22 are used. The coordinate data collected by the two cameras are matched in a fixed order, and the Levenberg-Marquardt algorithm is then used to perform a nonlinear least-squares optimization of the matched coordinate data, yielding the final camera calibration parameters. The final parameters comprise the internal parameters of the first camera 21 and of the second camera 22 and the external matching parameters between the two cameras. The implementation of this step is not unique.
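The patent does not spell out the cost function minimized by the Levenberg-Marquardt step. The sketch below (Python with SciPy and OpenCV, an assumption rather than the patent's implementation) refines the rotation and the direction of the translation by driving the algebraic epipolar residual of the fixed-order matched centroid pairs toward zero; the baseline length is not observable from this residual and is kept at its initial value, which in practice comes from the known calibration-board geometry.

```python
import cv2
import numpy as np
from scipy.optimize import least_squares

def refine_extrinsics(pts1, pts2, A1, A2, R0, T0):
    """Levenberg-Marquardt refinement of the external matching parameters.

    pts1, pts2: (N, 2) matched centroid pairs from camera 1 and camera 2;
    A1, A2: intrinsic matrices; R0, T0: initial rotation and translation
    (convention x2 = R x1 + T). The epipolar cost is an illustrative choice.
    """
    def normalize(pts, A):
        ph = np.column_stack([pts, np.ones(len(pts))])
        return (np.linalg.inv(A) @ ph.T).T               # normalized camera coordinates

    def skew(t):
        return np.array([[0.0, -t[2], t[1]],
                         [t[2], 0.0, -t[0]],
                         [-t[1], t[0], 0.0]])

    x1 = normalize(np.asarray(pts1, float), A1)
    x2 = normalize(np.asarray(pts2, float), A2)
    T0 = np.asarray(T0, float).ravel()
    t_norm = np.linalg.norm(T0)                          # baseline length kept fixed

    def residuals(params):
        R, _ = cv2.Rodrigues(params[:3].reshape(3, 1))   # rotation vector -> matrix
        th, ph = params[3], params[4]                    # translation direction angles
        t = t_norm * np.array([np.sin(th) * np.cos(ph),
                               np.sin(th) * np.sin(ph),
                               np.cos(th)])
        E = skew(t) @ R                                  # essential matrix
        return np.einsum('ij,jk,ik->i', x2, E, x1)       # x2^T E x1 for every pair

    r0, _ = cv2.Rodrigues(np.asarray(R0, float))
    th0 = np.arccos(np.clip(T0[2] / t_norm, -1.0, 1.0))
    ph0 = np.arctan2(T0[1], T0[0])
    sol = least_squares(residuals, np.r_[r0.ravel(), th0, ph0], method='lm')
    R, _ = cv2.Rodrigues(sol.x[:3].reshape(3, 1))
    th, ph = sol.x[3], sol.x[4]
    T = t_norm * np.array([np.sin(th) * np.cos(ph), np.sin(th) * np.sin(ph), np.cos(th)])
    return R, T
```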
Calibration process: the calibration parameters of the first camera 21 and the second camera 22 are used to rectify the images captured by the two cameras so that they correspond to parallel cameras. The implementation of this step is not unique.
A. Compute the first correction matrix H0 of the first camera 21:
Set the calibration virtual point P01 = (0, 0, 2000) in the coordinate system of the first camera 21.
According to the formula
P02 = R^{-1}(P01 - T)   (1)
compute the coordinates of the virtual point in the coordinate system of the first camera 21.
According to the formulas
$h_{01} = \dfrac{R^{-1}T}{\lVert R^{-1}T \rVert}$,
$h_{03} = h_{01} \wedge h_{02}$   (2)
compute the column vectors of the first correction matrix. From formula (2), the first correction matrix of the first camera 21 is
$H_0 = \begin{bmatrix} h_{01}^T \\ h_{02}^T \\ h_{03}^T \end{bmatrix}$   (3).
B. Compute the second correction matrix H1 of the second camera 22:
Set the calibration virtual point P11 = (0, 0, 2000) in the coordinate system of the second camera 22.
According to the formula
P12 = R·P11 + T   (4)
compute the coordinates of the virtual point in the coordinate system of the second camera 22.
According to the formulas
$h_{11} = \dfrac{T}{\lVert T \rVert}$,
$h_{13} = h_{11} \wedge h_{12}$   (5)
compute the column vectors of the second correction matrix. From formula (5), the second correction matrix of the second camera 22 is
$H_1 = \begin{bmatrix} h_{11}^T \\ h_{12}^T \\ h_{13}^T \end{bmatrix}$   (6).
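The printed text does not reproduce the formula for the middle rows h02 and h12. The sketch below (Python/NumPy, an assumption rather than the patent's exact construction) completes them with a common choice for rectifying rotations: the first row is the baseline direction and the middle row is the cross product of the old optical axis with the baseline. It operates on direction vectors only; a full pixel-level correction matrix would additionally fold in the camera intrinsics.

```python
import numpy as np

def rectifying_rotations(R, T):
    """Rectifying rotations for a camera pair with relative pose x2 = R x1 + T.

    Follows the spirit of formulas (2)-(6): the first row of H0 is the
    normalized baseline direction R^{-1} T and the last row is the cross
    product of the first two. The middle row h02 is an assumed completion
    (old optical axis x baseline), not quoted from the patent.
    """
    R = np.asarray(R, float)
    T = np.asarray(T, float).reshape(3)
    h01 = np.linalg.inv(R) @ T
    h01 = h01 / np.linalg.norm(h01)         # cf. h01 in formula (2)
    h02 = np.cross([0.0, 0.0, 1.0], h01)
    h02 = h02 / np.linalg.norm(h02)         # assumed middle row
    h03 = np.cross(h01, h02)                # h03 = h01 ^ h02
    H0 = np.vstack([h01, h02, h03])         # formula (3)
    H1 = H0 @ R.T                           # same rectified orientation for camera 2;
                                            # its first row reduces to T/||T||, cf. h11 in (5)
    return H0, H1
```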
Three-dimensional coordinate establishing process: the matching point pair corresponding to the object is obtained by identifying and matching the target in the first camera 21 and the second camera 22:
A circular reflective object is used as the recognition target. In the first camera 21 and the second camera 22, a support vector machine classifier trained on samples of the circular reflective object is used to identify the object, and the centroid coordinates of the object's image in the first camera 21 and the second camera 22 are computed as the matching point pair P0 and P1. The epipolar constraint and the ordinal-consistency constraint are then used to achieve correct matching of the target.
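As an illustration of the two matching constraints (not taken from the patent), the sketch below pairs candidate centroids from the two rectified views: for parallel cameras the epipolar constraint reduces to a same-row test, and ordinal consistency is enforced by scanning both views in left-to-right order. The row tolerance and the greedy strategy are assumptions.

```python
import numpy as np

def match_centroids(pts_cam1, pts_cam2, row_tol=2.0):
    """Greedy matching of candidate target centroids between two rectified views.

    pts_cam1, pts_cam2: (N, 2) arrays of (x, y) centroids from camera 1 and
    camera 2. After rectification, matched points must lie on (nearly) the
    same image row (epipolar constraint) and matches must preserve the
    left-to-right order of the targets (ordinal consistency).
    """
    a = sorted(map(tuple, np.asarray(pts_cam1, float)), key=lambda p: p[0])
    b = sorted(map(tuple, np.asarray(pts_cam2, float)), key=lambda p: p[0])
    matches, j = [], 0
    for p in a:
        # Search forward (never backward) for a candidate on the same row.
        k = j
        while k < len(b) and abs(b[k][1] - p[1]) > row_tol:
            k += 1
        if k < len(b):
            matches.append((p, b[k]))       # candidate pair (P0, P1)
            j = k + 1                       # keeps the left-to-right order
    return matches
```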
The first correction matrix, the second correction matrix and the matching point pair are then used to calculate the three-dimensional coordinates corresponding to the matching point pair:
A. Correct the pixel coordinate distortion, converting the matching point pair P0 and P1 into the undistorted image coordinates P0-undistort and P1-undistort. The coordinates are corrected here by iterative computation.
B. The rectified image coordinates are calculated according to the formulas
$p_0 = H_0 \begin{bmatrix} P_{0\text{-undistort}} \\ 1 \end{bmatrix}, \quad p_1 = H_1 \begin{bmatrix} P_{1\text{-undistort}} \\ 1 \end{bmatrix}$   (7)
$p_0 = \begin{bmatrix} p_0[0]/p_0[2] \\ p_0[1]/p_0[2] \\ 1 \end{bmatrix}, \quad p_1 = \begin{bmatrix} p_1[0]/p_1[2] \\ p_1[1]/p_1[2] \\ 1 \end{bmatrix}$   (8)
C. The three-dimensional coordinates of the target are established according to the formulas
Disparity = p1[1] - p0[1], where p1[1] and p0[1] are the image x coordinates   (9)
$B = \sqrt{t_x^2 + t_y^2 + t_z^2}$
Zc = B × 1.0 / Disparity
Xc = B × p0[1] / Disparity   (10)
Yc = Zc × p0[1] / 1.0
The implementation of this step is not unique.
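A minimal sketch of formulas (7)-(10) in Python/NumPy, assuming the undistorted image coordinates and the correction matrices are already available; the index conventions follow the patent text as printed, with component [1] treated as the image x coordinate.

```python
import numpy as np

def triangulate_pair(p0_undist, p1_undist, H0, H1, T):
    """Apply the correction matrices and the disparity formulas (7)-(10).

    p0_undist, p1_undist: undistorted image coordinates of the matching point
    pair in camera 1 and camera 2; H0, H1: the 3x3 correction matrices;
    T: translation vector between the cameras (baseline).
    """
    p0 = H0 @ np.array([p0_undist[0], p0_undist[1], 1.0])   # formula (7)
    p1 = H1 @ np.array([p1_undist[0], p1_undist[1], 1.0])
    p0 = p0 / p0[2]                                          # formula (8)
    p1 = p1 / p1[2]
    disparity = p1[1] - p0[1]                                # formula (9)
    B = np.linalg.norm(T)                                    # baseline length
    Zc = B * 1.0 / disparity                                 # formula (10)
    Xc = B * p0[1] / disparity
    Yc = Zc * p0[1] / 1.0                                    # as printed in the patent text
    return np.array([Xc, Yc, Zc])
```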
Verification process: the three-dimensional coordinate establishing process yields the three-dimensional coordinates P corresponding to the matching point pair.
The coordinates Q of the matching point pair P0 and P1 on the third camera 32 are calculated from the affine projection matrix of the third camera 32 and the three-dimensional coordinates P corresponding to the matching point pair P0 and P1 of the object in the first camera 21 and the second camera 22:
$\begin{bmatrix} Q'_x \\ Q'_y \\ Q'_z \end{bmatrix} = \begin{bmatrix} a_1 & a_2 & a_3 & a_4 \\ a_5 & a_6 & a_7 & a_8 \\ a_9 & a_{10} & a_{11} & a_{12} \end{bmatrix} \begin{bmatrix} P_x \\ P_y \\ P_z \\ 1 \end{bmatrix}$   (11)
$Q_x = Q'_x / Q'_z, \quad Q_y = Q'_y / Q'_z$   (12)
The theoretical centroid coordinates of the object on the third camera 32 are given by the coordinates Q of the object's matching point pair on the third camera 32.
The size of the recognition target is known in this method; here it is the diameter D of the circular target. Together with the focal length f of the third camera 32 and the three-dimensional coordinates P of the object, the projection radius of the circular target is calculated according to the formula
Radius = D × f / Pz   (13)
The theoretical projection size of the object on the third camera 32 is thus calculated from the coordinates of the matching point pair on the third camera 32: on the image of the third camera 32, a filled circle of radius Radius is drawn centered at the theoretical centroid coordinates.
The actual centroid coordinates of the object's projection on the third camera 32 are then acquired,
and the actual projection size of the object on the third camera 32 is obtained from the coordinates of the matching point pair on the third camera 32.
Whether the three-dimensional coordinates corresponding to the matching point pair P0 and P1 are correct is judged jointly according to whether the difference between the theoretical centroid coordinates and the actual centroid coordinates of the object's projection on the third camera 32 is smaller than a first set value, and whether the difference between the theoretical projection size and the actual projection size is smaller than a second set value. Only when both differences are smaller than their respective set values are the identification, matching and establishment of the three-dimensional coordinates corresponding to the matching point pair considered correct.
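The verification step reduces to a reprojection through the affine projection matrix followed by two threshold tests. A minimal Python/NumPy sketch, assuming the actual centroid and radius have already been measured in the third camera's image; the two tolerances correspond to the patent's first and second set values, whose numeric choice is application specific.

```python
import numpy as np

def verify_projection(M, P, D, f, actual_centroid, actual_radius,
                      centroid_tol, size_tol):
    """Verification using formulas (11)-(13) and the two set values.

    M: 3x4 affine projection matrix of the third camera (a1..a12);
    P: reconstructed three-dimensional coordinates (Px, Py, Pz) of the target;
    D: known diameter of the circular target; f: focal length of the third camera.
    """
    Q = M @ np.array([P[0], P[1], P[2], 1.0])            # formula (11)
    theo_centroid = Q[:2] / Q[2]                         # formula (12)
    theo_radius = D * f / P[2]                           # formula (13)
    centroid_ok = np.linalg.norm(theo_centroid - np.asarray(actual_centroid, float)) < centroid_tol
    size_ok = abs(theo_radius - actual_radius) < size_tol
    return centroid_ok and size_ok                       # both tests must pass
```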
As shown in Fig. 3 (Fig. 3 uses a resolution of 640×480; for other resolutions the coordinates are adjusted accordingly), the affine projection matrix used in the verification process is obtained as follows:
Collect the three-dimensional coordinates Q corresponding to the matching point pair P0 and P1 at at least six (eight in Fig. 3) different positions, together with the two-dimensional coordinates of the corresponding projections of the matching point pair P0 and P1 on the third camera 32:
Take the third camera 32 as the coordinate system and select the two-dimensional coordinates of eight different positions in that coordinate system.
Whenever the object moves so that the two-dimensional coordinates of the projection of the matching point pair P0 and P1 on the third camera 32 coincide with the two-dimensional coordinates of one of the eight positions defined in the coordinate system of the third camera 32, record the three-dimensional coordinates Q of the matching point pair P0 and P1 at that position, until the three-dimensional coordinates Q corresponding to all eight positions have been obtained.
Set up the system of equations
$\begin{bmatrix} Q_{ix} & Q_{iy} & Q_{iz} & 1 & 0 & 0 & 0 & 0 & -P_{ix}Q_{ix} & -P_{ix}Q_{iy} & -P_{ix}Q_{iz} \\ 0 & 0 & 0 & 0 & Q_{ix} & Q_{iy} & Q_{iz} & 1 & -P_{iy}Q_{ix} & -P_{iy}Q_{iy} & -P_{iy}Q_{iz} \\ \vdots & & & & & & & & & & \vdots \end{bmatrix} \begin{bmatrix} a_1 \\ \vdots \\ a_{11} \end{bmatrix} = \begin{bmatrix} P_{ix} \\ P_{iy} \\ \vdots \end{bmatrix}$   (14)
Ax = b   (15)
where the affine projection matrix parameters are $\begin{bmatrix} a_1 & a_2 & a_3 & a_4 \\ a_5 & a_6 & a_7 & a_8 \\ a_9 & a_{10} & a_{11} & a_{12} \end{bmatrix}$ with $a_{12} = 1$, and $Q_i$ denotes the three-dimensional coordinates and $P_i$ the corresponding two-dimensional projection at the i-th position, so that formula (14) can be abbreviated as formula (15).
The parameter vector x is computed by linear least squares:
$x = (A^T A)^{-1} A^T b$   (16).
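A sketch of building the linear system (14) and solving it in Python/NumPy; np.linalg.lstsq is used in place of forming $(A^TA)^{-1}A^Tb$ explicitly, which gives the same least-squares solution with better numerical behavior.

```python
import numpy as np

def fit_affine_projection(pts3d, pts2d):
    """Estimate the 3x4 projection matrix of the third camera, formulas (14)-(16).

    pts3d: (N, 3) three-dimensional coordinates Q_i (N >= 6);
    pts2d: (N, 2) corresponding projections P_i on the third camera.
    a12 is fixed to 1, as in the patent.
    """
    rows, rhs = [], []
    for (Qx, Qy, Qz), (Px, Py) in zip(np.asarray(pts3d, float), np.asarray(pts2d, float)):
        rows.append([Qx, Qy, Qz, 1, 0, 0, 0, 0, -Px * Qx, -Px * Qy, -Px * Qz])
        rows.append([0, 0, 0, 0, Qx, Qy, Qz, 1, -Py * Qx, -Py * Qy, -Py * Qz])
        rhs.extend([Px, Py])
    A = np.array(rows)                          # 2N x 11 design matrix of formula (14)
    b = np.array(rhs)
    x, *_ = np.linalg.lstsq(A, b, rcond=None)   # linear least squares, formula (16)
    return np.append(x, 1.0).reshape(3, 4)      # a1..a11 plus a12 = 1
```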
The three-dimensional coordinates corresponding to the matching point pair P0 and P1 at the at least six different positions, and the two-dimensional coordinates of the corresponding projections on the third camera 32, can also be collected by correct matching and recognition, establishing at least six corresponding groups of three-dimensional and two-dimensional coordinates.
As a preferred embodiment of the present invention, the invention requires at least three cameras, and the cameras must capture images synchronously. The first camera 21 and the second camera 22 used for establishing the three-dimensional coordinates are fitted with infrared-pass filters, so that the three-dimensional coordinates established from the images captured by the first camera 21 and the second camera 22 are effectively free of interference from visible light and are therefore more accurate; this arrangement can be chosen according to actual needs.
As a preferred embodiment of the present invention, the verification unit 3 comprises at least two third cameras 32 for verifying the established three-dimensional coordinates. Verification with multiple cameras carrying the verification module further improves the judgment of the correctness of the two-camera identification, matching and establishment. The number of verification cameras can be chosen according to the required accuracy.
As a preferred embodiment of the present invention, the imaging unit 2 may be a target projection imaging system with three or more cameras, which additionally allows the correctness of multi-camera identification, matching and establishment to be judged.
The foregoing are only embodiments of the present invention and do not thereby limit the scope of the claims of the present invention. Any equivalent structural transformation made using the contents of the specification and drawings of the present invention, or any direct or indirect application in other related technical fields, is likewise included within the scope of patent protection of the present invention.

Claims (15)

1. An object projection method, characterized by comprising:
obtaining the three-dimensional coordinates corresponding to the matching point pair (P0 and P1) of an object in a first camera (21) and a second camera (22);
obtaining the affine projection matrix of a third camera (32);
a verification process:
calculating the theoretical centroid coordinates of the object on the third camera (32) from the affine projection matrix of the third camera (32) and the three-dimensional coordinates corresponding to the matching point pair (P0 and P1) of the object in the first camera (21) and the second camera (22);
calculating the theoretical projection size of the object on the third camera (32) from the three-dimensional coordinates corresponding to the matching point pair (P0 and P1);
acquiring the actual centroid coordinates of the projection of the object on the third camera (32);
acquiring the actual projection size of the object on the third camera (32);
jointly judging whether the three-dimensional coordinates corresponding to the matching point pair (P0 and P1) are correct from the comparison of the theoretical centroid coordinates with the actual centroid coordinates and the comparison of the theoretical projection size with the actual projection size.
2. The object projection method according to claim 1, characterized in that the three-dimensional coordinates corresponding to the matching point pair (P0 and P1) are obtained through a calibration process and a three-dimensional coordinate establishing process:
calibration process: using the calibration parameters of the first camera (21) and the second camera (22), obtaining the first correction matrix of the first camera (21) and the second correction matrix of the second camera (22) that rectify the first camera (21) and the second camera (22) into parallel cameras;
three-dimensional coordinate establishing process: obtaining the matching point pair (P0 and P1) of the object in the first camera (21) and the second camera (22) by identifying and matching the target in the first camera (21) and the second camera (22); and calculating the three-dimensional coordinates corresponding to the matching point pair (P0 and P1) using the first correction matrix, the second correction matrix and the matching point pair (P0 and P1).
3. The object projection method according to claim 2, characterized in that the calibration parameters comprise internal parameters and external matching parameters, the internal parameters being obtained with a stereo calibration board, and the external matching parameters being obtained by matching, in a fixed order, the coordinate data collected by the first camera (21) and the second camera (22) and then applying the Levenberg-Marquardt algorithm to perform a nonlinear least-squares optimization of the matched coordinate data.
4. The object projection method according to claim 3, characterized in that the internal parameters comprise the intrinsic matrix $A = \begin{bmatrix} \alpha_x & \gamma & u_0 \\ 0 & \alpha_y & v_0 \\ 0 & 0 & 1 \end{bmatrix}$, with $\alpha_x = f/dx$ and $\alpha_y = f/dy$, where $f$ is the camera focal length, $dx$ and $dy$ are the pixel spacings in the horizontal and vertical directions, $(u_0, v_0)$ is the optical center of the camera and $\gamma$ is the skew factor of the $u$ and $v$ axes; the radial distortion parameters $k_1$, $k_2$; and the tangential distortion parameters $p_1$, $p_2$; and the external matching parameters comprise the rotation matrix $R = \begin{bmatrix} r'_{11} & r'_{12} & r'_{13} \\ r'_{21} & r'_{22} & r'_{23} \\ r'_{31} & r'_{32} & r'_{33} \end{bmatrix}$ and the translation vector $T = \begin{bmatrix} t_x \\ t_y \\ t_z \end{bmatrix}$.
5. The object projection method according to claim 1, characterized in that the affine projection matrix is obtained as follows:
collecting the three-dimensional coordinates corresponding to the matching point pair (P0 and P1) at at least six different positions, together with the two-dimensional coordinates of the corresponding projections of the matching point pair (P0 and P1) on the third camera (32);
from the three-dimensional coordinates of the matching point pair (P0 and P1) at the at least six different positions and the two-dimensional coordinates of the corresponding projections on the third camera (32), computing the affine projection matrix by linear least squares.
6. The object projection method according to claim 5, characterized in that the three-dimensional coordinates of the matching point pair (P0 and P1) at the at least six different positions and the two-dimensional coordinates of the corresponding projections on the third camera (32) are obtained as follows:
taking the third camera (32) as the coordinate system and selecting the two-dimensional coordinates of at least six different positions in that coordinate system;
whenever the object moves so that the two-dimensional coordinates of the projection of the matching point pair (P0 and P1) on the third camera (32) coincide with one of the selected two-dimensional coordinates, recording the three-dimensional coordinates of the matching point pair (P0 and P1) at that position, until the three-dimensional coordinates corresponding to all of the at least six positions have been obtained.
7. A target projection system comprising a processing unit (1) and an imaging unit (2), characterized in that it further comprises a verification unit (3), the imaging unit (2) comprises a first camera (21) and a second camera (22), and the verification unit (3) comprises a verification module (31) and a third camera (32):
the processing unit (1) is configured to obtain the three-dimensional coordinates corresponding to the matching point pair (P0 and P1) of an object in the first camera (21) and the second camera (22), and to obtain the affine projection matrix of the third camera (32);
the verification module (31) is configured to calculate the theoretical centroid coordinates of the object on the third camera (32) from the affine projection matrix of the third camera (32) and the three-dimensional coordinates corresponding to the matching point pair (P0 and P1) of the object in the first camera (21) and the second camera (22);
to calculate the theoretical projection size of the object on the third camera (32) from the three-dimensional coordinates corresponding to the matching point pair (P0 and P1);
to acquire the actual centroid coordinates of the projection of the object on the third camera (32);
to acquire the actual projection size of the object on the third camera (32);
and to jointly judge whether the three-dimensional coordinates corresponding to the matching point pair (P0 and P1) are correct from the comparison of the theoretical centroid coordinates with the actual centroid coordinates and the comparison of the theoretical projection size with the actual projection size.
8. The target projection system according to claim 7, characterized in that the processing unit (1) comprises a calibration module (11) and a three-dimensional coordinate establishing module (12):
the calibration module (11) is configured to use the calibration parameters of the first camera (21) and the second camera (22) to obtain the first correction matrix of the first camera (21) and the second correction matrix of the second camera (22) that rectify the first camera (21) and the second camera (22) into parallel cameras;
the three-dimensional coordinate establishing module (12) is configured to obtain the matching point pair (P0 and P1) of the object in the first camera (21) and the second camera (22) by identifying and matching the target in the first camera (21) and the second camera (22), and to calculate the three-dimensional coordinates corresponding to the matching point pair (P0 and P1) using the first correction matrix, the second correction matrix and the matching point pair (P0 and P1).
9. The target projection system according to claim 8, characterized in that the calibration parameters comprise internal parameters and external matching parameters, the internal parameters being obtained with a stereo calibration board, and the external matching parameters being obtained by matching, in a fixed order, the coordinate data collected by the first camera (21) and the second camera (22) and then applying the Levenberg-Marquardt algorithm to perform a nonlinear least-squares optimization of the matched coordinate data.
10. The target projection system according to claim 9, characterized in that the internal parameters comprise the intrinsic matrix $A = \begin{bmatrix} \alpha_x & \gamma & u_0 \\ 0 & \alpha_y & v_0 \\ 0 & 0 & 1 \end{bmatrix}$, with $\alpha_x = f/dx$ and $\alpha_y = f/dy$, where $f$ is the camera focal length, $dx$ and $dy$ are the pixel spacings in the horizontal and vertical directions, $(u_0, v_0)$ is the optical center of the camera and $\gamma$ is the skew factor of the $u$ and $v$ axes; the radial distortion parameters $k_1$, $k_2$; and the tangential distortion parameters $p_1$, $p_2$; and the external matching parameters comprise the rotation matrix $R = \begin{bmatrix} r'_{11} & r'_{12} & r'_{13} \\ r'_{21} & r'_{22} & r'_{23} \\ r'_{31} & r'_{32} & r'_{33} \end{bmatrix}$ and the translation vector $T = \begin{bmatrix} t_x \\ t_y \\ t_z \end{bmatrix}$.
11. The target projection system according to claim 7, characterized in that the affine projection matrix is obtained as follows:
collecting the three-dimensional coordinates corresponding to the matching point pair (P0 and P1) at at least six different positions, together with the two-dimensional coordinates of the corresponding projections of the matching point pair (P0 and P1) on the third camera (32);
from the three-dimensional coordinates of the matching point pair (P0 and P1) at the at least six different positions and the two-dimensional coordinates of the corresponding projections on the third camera (32), computing the affine projection matrix by linear least squares.
12. The target projection system according to claim 11, characterized in that the three-dimensional coordinates of the matching point pair (P0 and P1) at the at least six different positions and the two-dimensional coordinates of the corresponding projections on the third camera (32) are obtained as follows:
taking the third camera (32) as the coordinate system and selecting the two-dimensional coordinates of at least six different positions in that coordinate system;
whenever the object moves so that the two-dimensional coordinates of the projection of the matching point pair (P0 and P1) on the third camera (32) coincide with one of the selected two-dimensional coordinates, recording the three-dimensional coordinates of the matching point pair (P0 and P1) at that position, until the three-dimensional coordinates corresponding to all of the at least six positions have been obtained.
13. The target projection system according to claim 7, characterized in that the first camera (21) and the second camera (22) are fitted with infrared-pass filters.
14. The target projection system according to claim 7, characterized in that the verification unit (3) comprises at least two of the third cameras (32).
15. The target projection system according to claim 7, characterized in that the imaging unit (2) further comprises at least one imaging camera in addition to the first camera (21) and the second camera (22).
CN201080066634.7A 2010-05-19 2010-05-19 Object projection method and object projection system Active CN102939562B (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2010/072907 WO2011143813A1 (en) 2010-05-19 2010-05-19 Object projection method and object projection system

Publications (2)

Publication Number Publication Date
CN102939562A CN102939562A (en) 2013-02-20
CN102939562B true CN102939562B (en) 2015-02-18

Family

ID=44991160

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201080066634.7A Active CN102939562B (en) 2010-05-19 2010-05-19 Object projection method and object projection system

Country Status (2)

Country Link
CN (1) CN102939562B (en)
WO (1) WO2011143813A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109887002A (en) * 2019-02-01 2019-06-14 广州视源电子科技股份有限公司 Image feature point matching method and device, computer equipment and storage medium

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103699274B (en) * 2013-12-27 2017-02-15 三星电子(中国)研发中心 Information input device and method based on projection and infrared sensing
CN109285109B (en) * 2018-09-05 2019-11-26 天目爱视(北京)科技有限公司 A kind of multizone 3D measurement and information acquisition device
CN109754434B (en) * 2018-12-27 2023-08-29 歌尔科技有限公司 Camera calibration method, device, user equipment and storage medium
CN110728715B (en) * 2019-09-06 2023-04-25 南京工程学院 Intelligent inspection robot camera angle self-adaptive adjustment method
CN111127560B (en) * 2019-11-11 2022-05-03 江苏濠汉信息技术有限公司 Calibration method and system for three-dimensional reconstruction binocular vision system
CN112509035A (en) * 2020-11-26 2021-03-16 江苏集萃未来城市应用技术研究所有限公司 Double-lens image pixel point matching method for optical lens and thermal imaging lens
CN112651427B (en) * 2020-12-03 2024-04-12 中国科学院西安光学精密机械研究所 Image point rapid and efficient matching method for wide-baseline optical intersection measurement
CN113610931A (en) * 2021-08-13 2021-11-05 南京航空航天大学 Machine vision method for measuring parachute-shaped state parameters

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1567384A (en) * 2003-06-27 2005-01-19 史中超 Method of image acquisition, digitized measure and reconstruction of three-dimensional object
CN1946195A (en) * 2006-10-26 2007-04-11 上海交通大学 Scene depth restoring and three dimension re-setting method for stereo visual system
CN101689299A (en) * 2007-06-20 2010-03-31 汤姆逊许可证公司 System and method for stereo matching of images
CN101706957A (en) * 2009-10-30 2010-05-12 无锡景象数字技术有限公司 Self-calibration method for binocular stereo vision device

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3991501B2 (en) * 1999-04-16 2007-10-17 コニカミノルタセンシング株式会社 3D input device
US6606406B1 (en) * 2000-05-04 2003-08-12 Microsoft Corporation System and method for progressive stereo matching of digital images
JP2001320652A (en) * 2000-05-11 2001-11-16 Nec Corp Projector
US7103212B2 (en) * 2002-11-22 2006-09-05 Strider Labs, Inc. Acquisition of three-dimensional images by an active stereo technique using locally unique patterns
JP3829819B2 (en) * 2003-05-08 2006-10-04 ソニー株式会社 Holographic stereogram creation device
US20070057946A1 (en) * 2003-07-24 2007-03-15 Dan Albeck Method and system for the three-dimensional surface reconstruction of an object
JP2009047498A (en) * 2007-08-17 2009-03-05 Fujifilm Corp Stereoscopic imaging device, control method of stereoscopic imaging device, and program
CN101571953B (en) * 2009-05-20 2012-04-25 深圳泰山在线科技有限公司 Object detection method, system and stereoscopic vision system


Also Published As

Publication number Publication date
WO2011143813A1 (en) 2011-11-24
CN102939562A (en) 2013-02-20

Similar Documents

Publication Publication Date Title
CN102939562B (en) Object projection method and object projection system
CN103983186B (en) Binocular vision system bearing calibration and calibration equipment
CN103971353B (en) Splicing method for measuring image data with large forgings assisted by lasers
CN104019829B (en) Vehicle-mounted panorama camera based on POS (position and orientation system) and external parameter calibrating method of linear array laser scanner
CN106846415A (en) A kind of multichannel fisheye camera binocular calibration device and method
CN102810205A (en) Method for calibrating camera shooting or photographing device
CN109325981B (en) Geometric parameter calibration method for micro-lens array type optical field camera based on focusing image points
CN111896221B (en) Alignment method of rotating optical measurement system for virtual coordinate system auxiliary camera calibration
CN104089628B (en) Self-adaption geometric calibration method of light field camera
CN104517291B (en) Pose measuring method based on target coaxial circles feature
CN111210468A (en) Image depth information acquisition method and device
CN102567989A (en) Space positioning method based on binocular stereo vision
CN106767533A (en) Efficient phase three-dimensional mapping method and system based on fringe projection technology of profiling
CN1897715A (en) Three-dimensional vision semi-matter simulating system and method
CN103278138A (en) Method for measuring three-dimensional position and posture of thin component with complex structure
CN104613871A (en) Calibration method of coupling position relationship between micro lens array and detector
CN102081798B (en) Epipolar rectification method for fish-eye stereo camera pair
CN105374067A (en) Three-dimensional reconstruction method based on PAL cameras and reconstruction system thereof
CN101424530B (en) Method for generating approximate kernel line of satellite stereo image pairs based on projection reference surface
CN111435540A (en) Annular view splicing method of vehicle-mounted annular view system
CN108230401A (en) 3D four-wheel position finder automatic camera calibration method and system
CN111435539A (en) Multi-camera system external parameter calibration method based on joint optimization
CN112308926A (en) Camera external reference calibration method without public view field
CN108175535A (en) A kind of dentistry spatial digitizer based on microlens array
CN105434046B (en) Based on the surgical navigator localization method for eliminating infrared fileter refraction effect

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
C56 Change in the name or address of the patentee
CP01 Change in the name or title of a patent holder

Address after: 518000, Guangdong, Shenzhen hi tech Industrial Park, South District, science and technology, South Road, twelve square building, 4 floor

Patentee after: SHENZHEN TAISHAN SPORTS TECHNOLOGY CORP., LTD.

Address before: 518000, Guangdong, Shenzhen hi tech Industrial Park, South District, science and technology, South Road, twelve square building, 4 floor

Patentee before: Shenzhen Tol Technology Co., Ltd.

CP01 Change in the name or title of a patent holder

Address after: 518000 4th floor, Fangda building, Keji South 12th Road, South District, high tech Industrial Park, Shenzhen, Guangdong

Patentee after: Shenzhen Taishan Sports Technology Co.,Ltd.

Address before: 518000 4th floor, Fangda building, Keji South 12th Road, South District, high tech Industrial Park, Shenzhen, Guangdong

Patentee before: SHENZHEN TAISHAN SPORTS TECHNOLOGY Corp.,Ltd.

CP01 Change in the name or title of a patent holder