CN104715486B - Simulation bench camera calibration method and real-time machine - Google Patents

Simulation bench camera calibration method and real-time machine Download PDF

Info

Publication number
CN104715486B
CN104715486B CN201510134249.6A CN201510134249A CN104715486B CN 104715486 B CN104715486 B CN 104715486B CN 201510134249 A CN201510134249 A CN 201510134249A CN 104715486 B CN104715486 B CN 104715486B
Authority
CN
China
Prior art keywords
camera
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201510134249.6A
Other languages
Chinese (zh)
Other versions
CN104715486A (en
Inventor
蔡绍晓
李锦明
陈筱婧
李晨
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Jingwei Hirain Tech Co Ltd
Original Assignee
Beijing Jingwei Hirain Tech Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Jingwei Hirain Tech Co Ltd filed Critical Beijing Jingwei Hirain Tech Co Ltd
Priority to CN201510134249.6A priority Critical patent/CN104715486B/en
Publication of CN104715486A publication Critical patent/CN104715486A/en
Application granted granted Critical
Publication of CN104715486B publication Critical patent/CN104715486B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Landscapes

  • Studio Devices (AREA)

Abstract

The application provides a simulation bench camera calibration method and a real-time machine. A virtual camera is established to capture and image an added calibration reference object, and a first correspondence is established between any point of the reference object and the virtual camera's image-plane coordinates. A real camera then captures the virtual camera's image, and a second correspondence is established between the imaging point of that reference point in the virtual camera's image and the real camera's image-plane coordinates. Solving the two correspondences yields a third correspondence between any point of the calibration reference object and the real camera's image-plane coordinates, which is exported to the FAS system. The FAS system thereby obtains the correspondence between the real camera's image-plane coordinates and the test-scene data provided by the real-time machine. This eliminates the parameter differences and relative spatial pose between the virtual camera and the real camera, and solves the problem that the real-time machine cannot otherwise provide accurate test-scene data to the FAS system.

Description

Simulation bench camera calibration method and real-time machine
Technical field
The present invention relates to the field of automation technology, and in particular to a simulation bench camera calibration method and a real-time machine.
Background
In recent years, with the rapid development and growing maturity of machine-vision imaging and image-processing technology, driver-assistance systems based on monocular vision have been applied in more and more high-end vehicles. One important example is the forward-view assistance system (Fronview Assistance Systems, FAS), which integrates multiple functions such as adaptive cruise control and automatic emergency braking. During the development phase, the function logic and performance of an FAS system usually need to be verified and tested on a simulation bench.
Therefore, a virtual camera that provides a camera's view of simulated scenes (people, vehicles, roads, and so on) must be implemented on a real-time machine, providing test objects for development and debugging. A real camera, serving as the video source for the electronic control unit, then captures the traffic environment rendered by the virtual camera and provides the test scene to the unit under test. However, the virtual camera and the real camera each have their own intrinsic parameters, and there is also a relative spatial pose between them. As a result, the simulated scene seen by the real camera does not exactly match the scene provided by the real-time machine, which distorts the simulated scene captured by the real camera and prevents accurate test-scene data from being supplied to the FAS system.
Summary of the invention
In view of this, the present invention provides a simulation bench camera calibration method and a real-time machine, to solve the prior-art problem that, because of the parameter differences and relative spatial pose between the virtual camera and the real camera, the real-time machine cannot provide accurate test-scene data to the FAS system.
To achieve these goals, the technical solution provided by embodiments of the present invention is as follows:
A simulation bench camera calibration method, applied to a real-time machine that performs performance testing on an FAS system, the method comprising:
adding a calibration reference object;
establishing a virtual camera, capturing and imaging the calibration reference object with the virtual camera, and establishing a first correspondence between any point of the calibration reference object and the virtual camera's image-plane coordinates;
capturing the virtual camera's image with a real camera, and establishing a second correspondence between the imaging point of said any point of the calibration reference object in the virtual camera's image and the real camera's image-plane coordinates;
solving the first and second correspondences to obtain a third correspondence between said any point of the calibration reference object and the real camera's image-plane coordinates;
exporting the third correspondence to the FAS system.
Preferably, the first correspondence is:
[u_v, v_v, 1]^T = A_v [x_v, y_v, 1]^T;
[X_vc, Y_vc, Z_vc]^T = [R_v, T_v; 0, 1] [X_w, Y_w, Z_w]^T;
Z_vc [x_v, y_v, 1]^T = B_v [X_vc, Y_vc, Z_vc]^T;
where [X_w, Y_w, Z_w] are the world coordinates of any point of the calibration reference object, [X_vc, Y_vc, Z_vc] are the corresponding coordinates of that point in the virtual camera coordinate system, [x_v, y_v, 1] are its corresponding coordinates in the image coordinate system, and [u_v, v_v, 1] are its corresponding coordinates in the pixel coordinate system;
α_v and β_v are respectively the horizontal and vertical focal lengths of the virtual camera, alpha_c_v is the axis-skew coefficient of the virtual camera, and (u_v0, v_v0) is the virtual camera's principal point; R_v and T_v are respectively the rotation matrix and translation matrix between the calibration reference object and the virtual camera.
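The intrinsic-matrix structure described above (focal lengths α and β, skew coefficient alpha_c, principal point (u0, v0)) can be sketched in a few lines. The numeric values below are illustrative assumptions, not parameters from the patent:

```python
import numpy as np

def intrinsic_matrix(alpha, beta, alpha_c, u0, v0):
    """Intrinsic matrix A as defined in the text: horizontal/vertical
    focal lengths, axis-skew coefficient, and principal point."""
    return np.array([
        [alpha, alpha_c * alpha, u0],
        [0.0,   beta,            v0],
        [0.0,   0.0,             1.0],
    ])

def world_to_pixel(A, R, T, Xw):
    """Project a world point through extrinsics [R | T] and intrinsics A.

    Returns homogeneous pixel coordinates [u, v, 1]."""
    Xc = R @ Xw + T          # world -> camera coordinates
    x = Xc / Xc[2]           # normalised image coordinates [x, y, 1]
    return A @ x             # pixel coordinates

# Illustrative camera: 800 px focal lengths, no skew, 640x480-style centre
A = intrinsic_matrix(800.0, 800.0, 0.0, 320.0, 240.0)
R = np.eye(3)
T = np.array([0.0, 0.0, 1000.0])  # board 1 m in front of the camera (mm)
uv = world_to_pixel(A, R, T, np.array([30.0, 0.0, 0.0]))
print(uv)  # point one 30 mm square right of centre -> [344. 240. 1.]
```
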
Preferably, the second correspondence is:
[u_r, v_r, 1]^T = A_r [x_r, y_r, 1]^T;
[X_rc, Y_rc, Z_rc]^T = [R_r, T_r; 0, 1] [x_v, y_v, 1]^T;
Z_rc [x_r, y_r, 1]^T = B_r [X_rc, Y_rc, Z_rc]^T;
where [X_rc, Y_rc, Z_rc] are the corresponding coordinates of any point of the calibration reference object in the real camera coordinate system, [x_r, y_r, 1] are that point's corresponding coordinates in the image coordinate system, and [u_r, v_r, 1] are its corresponding coordinates in the pixel coordinate system;
α_r and β_r are respectively the horizontal and vertical focal lengths of the real camera, alpha_c_r is the axis-skew coefficient of the real camera, and (u_r0, v_r0) is the real camera's principal point; R_r and T_r are respectively the rotation matrix and translation matrix between the image of the calibration reference object on the display plane and the real camera.
Preferably, the first and second correspondences are solved by the least-squares method.
Preferably, the third correspondence is:
Z_rc Z_vc [x_r, y_r, 1]^T = A_r B_r [R_r, T_r; 0, 1] A_v^{-1} B_v [R_v, T_v; 0, 1] [X_w, Y_w, Z_w]^T.
Preferably, the real-time machine exports the third correspondence to the FAS system over a CAN bus.
A real-time machine that performs performance testing on an FAS system using any of the simulation bench camera calibration methods described above.
In the simulation bench camera calibration method provided by this application, a calibration reference object is added and captured and imaged by the established virtual camera, and a first correspondence is established between any point of the calibration reference object and the virtual camera's image-plane coordinates. The real camera then captures the virtual camera's image, and a second correspondence is established between the imaging point of said any point in the virtual camera's image and the real camera's image-plane coordinates. The first and second correspondences are then solved to obtain a third correspondence between said any point of the calibration reference object and the real camera's image-plane coordinates, which is exported to the FAS system. The FAS system thereby obtains the correspondence between the real camera's image-plane coordinates and the test-scene data provided by the real-time machine. This eliminates the parameter differences and relative spatial pose between the virtual camera and the real camera, and solves the resulting problem that the real-time machine cannot provide accurate test-scene data to the FAS system.
Brief description of the drawings
To explain the embodiments of the present invention or the prior-art technical solutions more clearly, the accompanying drawings needed for describing the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below show only embodiments of the invention; those of ordinary skill in the art can derive other drawings from them without creative effort.
Fig. 1 is a flowchart of a simulation bench camera calibration method provided by an embodiment of this application.
Detailed description of the embodiments
The technical solutions in the embodiments of the present invention are described below clearly and completely with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the invention. All other embodiments obtained by those of ordinary skill in the art based on the embodiments of the invention without creative effort fall within the scope of protection of the invention.
The present invention provides a simulation bench camera calibration method, to solve the prior-art problem that, because of the parameter differences and relative spatial pose between the virtual camera and the real camera, the real-time machine cannot provide accurate test-scene data to the FAS system.
Specifically, the simulation bench camera calibration method is applied to a real-time machine that performs performance testing on an FAS system. As shown in Fig. 1, the method includes:
S101, adding a calibration reference object;
In a specific practical application, a black-and-white 5 × 7 checkerboard can be added, each square having a side length of 30 mm; it can be displayed on the simulated-scene display.
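As a rough illustration of the checkerboard reference object described above, the world coordinates of its corner points might be generated as follows. Note that a 5 × 7 board of squares has 4 × 6 inner corners; that corner-count convention, like the layout below, is an assumption for illustration:

```python
import numpy as np

SQUARE_MM = 30.0          # side length of each checkerboard square (from the text)
CORNERS_X, CORNERS_Y = 6, 4  # inner corners of a 5 x 7 board of squares

def checkerboard_corners():
    """World coordinates of the inner corners. The board is planar,
    so Zw = 0 for every corner."""
    xs, ys = np.meshgrid(np.arange(CORNERS_X), np.arange(CORNERS_Y))
    pts = np.stack([xs.ravel(), ys.ravel(), np.zeros(xs.size)], axis=1)
    return pts * SQUARE_MM

pts = checkerboard_corners()
print(pts.shape)  # (24, 3) -- comfortably more than the 11 points needed later
print(pts[-1])    # farthest corner: [150. 90. 0.]
```
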
S102, establishing a virtual camera, capturing and imaging the calibration reference object with the virtual camera, and establishing the first correspondence between any point of the calibration reference object and the virtual camera's image-plane coordinates;
Specifically, in practical applications, a virtual vehicle can be established at the same time and sized according to a real vehicle. The virtual camera is mounted at the top centre of the virtual vehicle's windshield, and its field of view is set according to the real camera's field of view. The simulated-scene display is switched to the virtual camera's viewpoint, and the checkerboard is positioned so that its area occupies 2/3 of the display's screen. The first correspondence between any point of the calibration reference object and the virtual camera's image-plane coordinates is then established.
S103, capturing the virtual camera's image with the real camera, and establishing the second correspondence between the imaging point of said any point of the calibration reference object in the virtual camera's image and the real camera's image-plane coordinates;
First, the position of the real camera is adjusted: the real camera is placed at a distance d from the simulated-scene display's screen, with its optical axis perpendicular to the display. The value of d is:
d = (d_1 + d_2) / 2
d_1 = w / (2 · tan(T_h / 2))
d_2 = h / (2 · tan(T_v / 2))
where w and h are respectively the width and height of the virtual camera's field of view on the display, and T_h and T_v are respectively the horizontal and vertical field-of-view angles of the real camera.
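A minimal sketch of the distance computation above; the display dimensions and field-of-view angles used below are illustrative assumptions, not values from the patent:

```python
import math

def camera_distance(w_mm, h_mm, th_rad, tv_rad):
    """Distance d from the display, averaging the distance that matches
    the horizontal FOV and the distance that matches the vertical FOV."""
    d1 = w_mm / (2.0 * math.tan(th_rad / 2.0))  # match horizontal FOV
    d2 = h_mm / (2.0 * math.tan(tv_rad / 2.0))  # match vertical FOV
    return (d1 + d2) / 2.0

# Assumed 1600 mm x 900 mm visible area and a 52 deg x 30 deg real camera FOV
d = camera_distance(1600.0, 900.0, math.radians(52.0), math.radians(30.0))
print(round(d), "mm")
```
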
The second correspondence is then established.
S104, solving the first and second correspondences to obtain the third correspondence between said any point of the calibration reference object and the real camera's image-plane coordinates;
The two correspondences obtained in steps S102 and S103 are solved, finally yielding the third correspondence between said any point of the calibration reference object and the real camera's image-plane coordinates.
S105, exporting the third correspondence to the FAS system.
Testing the FAS system with the third correspondence effectively eliminates the influence that the intrinsic and extrinsic parameters of the virtual camera in the simulated scene have on the simulated scene seen by the real camera.
Through the above steps, the simulation bench camera calibration method provided by this application gives the FAS system the correspondence between the real camera's image-plane coordinates and the test-scene data provided by the real-time machine, eliminates the parameter differences and relative spatial pose between the virtual camera and the real camera, and solves the resulting problem that the real-time machine cannot provide accurate test-scene data to the FAS system.
Another embodiment of the present invention provides another simulation bench camera calibration method, which, as shown in Fig. 1, includes:
S101, adding a calibration reference object;
S102, establishing a virtual vehicle and a virtual camera, capturing and imaging the calibration reference object with the virtual camera, and establishing the first correspondence between said any point of the calibration reference object and the virtual camera's image-plane coordinates;
S103, capturing the virtual camera's image with the real camera, and establishing the second correspondence between the imaging point of said any point of the calibration reference object in the virtual camera's image and the real camera's image-plane coordinates;
S104, solving the above two correspondences to obtain the third correspondence between said any point of the calibration reference object and the real camera's image-plane coordinates;
S105, exporting the third correspondence to the FAS system.
Preferably, the first correspondence is:
[u_v, v_v, 1]^T = A_v [x_v, y_v, 1]^T;
[X_vc, Y_vc, Z_vc]^T = [R_v, T_v; 0, 1] [X_w, Y_w, Z_w]^T;
Z_vc [x_v, y_v, 1]^T = B_v [X_vc, Y_vc, Z_vc]^T;
where [X_w, Y_w, Z_w] are the world coordinates of any point of the calibration reference object, [X_vc, Y_vc, Z_vc] are the corresponding coordinates of that point in the virtual camera coordinate system, [x_v, y_v, 1] are its corresponding coordinates in the image coordinate system, and [u_v, v_v, 1] are its corresponding coordinates in the pixel coordinate system;
α_v and β_v are respectively the horizontal and vertical focal lengths of the virtual camera, alpha_c_v is the axis-skew coefficient of the virtual camera, and (u_v0, v_v0) is the virtual camera's principal point; R_v and T_v are respectively the rotation matrix and translation matrix between the calibration reference object and the virtual camera.
Here, the image coordinate system is the coordinate system formed by the two-dimensional plane on the camera's internal sensor after a point in the physical world is imaged by the virtual camera (or the real camera); the pixel coordinate system is the two-dimensional coordinate system formed by the plane of the display.
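The two coordinate systems are linked by the linear map [u, v, 1]^T = A [x, y, 1]^T from the first correspondence, so a pixel coordinate can be mapped back to an image coordinate by inverting the intrinsic matrix A. The intrinsic values below are illustrative assumptions:

```python
import numpy as np

# Assumed intrinsic matrix: 800 px focal lengths, no skew, centre (320, 240)
A = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])

def pixel_to_image(A, uv1):
    """Invert [u, v, 1]^T = A [x, y, 1]^T for the image coordinates.
    Solving the linear system avoids forming A^-1 explicitly."""
    return np.linalg.solve(A, uv1)

xy1 = pixel_to_image(A, np.array([344.0, 240.0, 1.0]))
print(xy1)  # normalised image coordinates [0.03 0. 1.]
```
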
Preferably, the second correspondence is:
[u_r, v_r, 1]^T = A_r [x_r, y_r, 1]^T;
[X_rc, Y_rc, Z_rc]^T = [R_r, T_r; 0, 1] [x_v, y_v, 1]^T;
Z_rc [x_r, y_r, 1]^T = B_r [X_rc, Y_rc, Z_rc]^T;
where [X_rc, Y_rc, Z_rc] are the corresponding coordinates of any point of the calibration reference object in the real camera coordinate system, [x_r, y_r, 1] are that point's corresponding coordinates in the image coordinate system, and [u_r, v_r, 1] are its corresponding coordinates in the pixel coordinate system;
α_r and β_r are respectively the horizontal and vertical focal lengths of the real camera, alpha_c_r is the axis-skew coefficient of the real camera, and (u_r0, v_r0) is the real camera's principal point; R_r and T_r are respectively the rotation matrix and translation matrix between the image of the calibration reference object on the display plane and the real camera.
Preferably, the first and second correspondences are solved by the least-squares method.
Preferably, the third correspondence is:
Z_rc Z_vc [x_r, y_r, 1]^T = A_r B_r [R_r, T_r; 0, 1] A_v^{-1} B_v [R_v, T_v; 0, 1] [X_w, Y_w, Z_w]^T.
The above formula can also be rewritten as:
Z_rc Z_vc [x_r, y_r, 1]^T = M [X_w, Y_w, Z_w]^T;
where M contains the unknowns α_r, β_r, alpha_c_r, u_r0, v_r0, R_r, T_r, α_v, β_v, alpha_c_v, u_v0, v_v0, R_v and T_v. Because any point of the calibration reference object lies in the checkerboard plane, Z_rc Z_vc is eliminated, R_r and R_v are each 2 × 2 matrices, and T_r and T_v are each 2 × 1 matrices, so M contains 22 unknowns in total.
Each solution formula expresses the correspondence between a coordinate point on the checkerboard and a coordinate point in the real camera's image plane, and yields two equations, so at least 11 coordinate points are required. The projection matrix M is obtained by least squares, from which the parameters of the virtual camera and the real camera can be derived.
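The least-squares step can be sketched as follows. Since the reference points are coplanar, the world-to-pixel map is modelled here as a 3 × 3 planar homography solved with a standard DLT-style formulation; this is a simplifying assumption for illustration, not the patent's exact 22-unknown parameterisation (which accordingly needs at least 11 points rather than the homography's 4):

```python
import numpy as np

def estimate_homography(world_xy, pixels):
    """Least-squares estimate of the planar map H with [u, v, 1]^T ~ H [X, Y, 1]^T.
    Each correspondence contributes two linear equations in the entries of H;
    the solution is the right singular vector of the smallest singular value."""
    rows = []
    for (X, Y), (u, v) in zip(world_xy, pixels):
        rows.append([X, Y, 1, 0, 0, 0, -u * X, -u * Y, -u])
        rows.append([0, 0, 0, X, Y, 1, -v * X, -v * Y, -v])
    A = np.array(rows, dtype=float)
    _, _, vt = np.linalg.svd(A)
    H = vt[-1].reshape(3, 3)
    return H / H[2, 2]          # fix the overall scale (and sign)

# Synthetic check: project checkerboard corners through a known H, recover it
H_true = np.array([[1.2, 0.1,  5.0],
                   [0.0, 0.9, -3.0],
                   [1e-4, 0.0, 1.0]])
world = [(x * 30.0, y * 30.0) for y in range(4) for x in range(6)]
pix = []
for X, Y in world:
    p = H_true @ np.array([X, Y, 1.0])
    pix.append((p[0] / p[2], p[1] / p[2]))
H = estimate_homography(world, pix)
print(np.allclose(H, H_true, atol=1e-6))  # True
```
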
Preferably, the real-time machine exports the third correspondence to the FAS system over a CAN bus.
Exporting the third correspondence over the CAN bus improves the accuracy of the FAS system's distance and angle estimates, improving each active and passive safety function.
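The patent does not specify a CAN frame layout, so the following is only a hypothetical sketch of serialising a 3 × 3 mapping into 8-byte CAN payloads; the IDs, byte order, and one-entry-per-frame scheme are all assumptions:

```python
import struct

def matrix_to_can_payloads(M, base_id=0x100):
    """Pack each matrix entry as a little-endian float64 into one
    8-byte payload (the maximum classic CAN data length), paired with
    an arbitrary, assumed arbitration ID."""
    payloads = []
    for i, row in enumerate(M):
        for j, val in enumerate(row):
            payloads.append((base_id + 3 * i + j, struct.pack('<d', val)))
    return payloads

frames = matrix_to_can_payloads([[1.0, 0.0, 0.0],
                                 [0.0, 1.0, 0.0],
                                 [0.0, 0.0, 1.0]])
print(len(frames))        # 9 frames, one per matrix entry
print(len(frames[0][1]))  # 8 bytes each
```
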
Another embodiment of the present invention further provides a real-time machine, which performs performance testing on an FAS system using the simulation bench camera calibration method of any of the above embodiments.
Its specific operating principle is the same as in the preceding embodiments and is not repeated here.
The embodiments in this specification are described in a progressive manner; each embodiment focuses on its differences from the others, and the same or similar parts of the embodiments can be referred to one another. Since the device disclosed in the embodiments corresponds to the method disclosed in the embodiments, its description is relatively brief; see the method description for the relevant parts.
The above are only preferred embodiments of the present invention, enabling those skilled in the art to understand or implement the invention. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the general principles defined herein may be implemented in other embodiments without departing from the spirit or scope of the invention. Therefore, the present invention is not intended to be limited to the embodiments shown herein, but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.

Claims (6)

1. A simulation bench camera calibration method, characterised in that it is applied to a real-time machine that performs performance testing on an FAS system, and comprises:
adding a calibration reference object;
establishing a virtual camera, capturing and imaging the calibration reference object with the virtual camera, and establishing a first correspondence between any point of the calibration reference object and the virtual camera's image-plane coordinates;
capturing the virtual camera's image with a real camera, and establishing a second correspondence between the imaging point of said any point of the calibration reference object in the virtual camera's image and the real camera's image-plane coordinates;
solving the first and second correspondences to obtain a third correspondence between said any point of the calibration reference object and the real camera's image-plane coordinates;
exporting the third correspondence to the FAS system;
the third correspondence being:
Z_rc Z_vc [x_r, y_r, 1]^T = A_r B_r [R_r, T_r; 0, 1] A_v^{-1} B_v [R_v, T_v; 0, 1] [X_w, Y_w, Z_w]^T;
wherein [X_w, Y_w, Z_w] are the world coordinates of any point of the calibration reference object; R_v and T_v are respectively the rotation matrix and translation matrix between the calibration reference object and the virtual camera; R_r and T_r are respectively the rotation matrix and translation matrix between the image of the calibration reference object on the display plane and the real camera;
A_v = [α_v, alpha_c_v · α_v, u_v0; 0, β_v, v_v0; 0, 0, 1];
B_v = [α_v, 0, 0; 0, β_v, 0; 0, 0, 1];
α_v and β_v are respectively the horizontal and vertical focal lengths of the virtual camera, alpha_c_v is the axis-skew coefficient of the virtual camera, and (u_v0, v_v0) is the virtual camera's principal point;
A_r = [α_r, alpha_c_r · α_r, u_r0; 0, β_r, v_r0; 0, 0, 1];
B_r = [α_r, 0, 0; 0, β_r, 0; 0, 0, 1];
α_r and β_r are respectively the horizontal and vertical focal lengths of the real camera, alpha_c_r is the axis-skew coefficient of the real camera, and (u_r0, v_r0) is the real camera's principal point.
2. The simulation bench camera calibration method according to claim 1, characterised in that the first correspondence is:
[u_v, v_v, 1]^T = A_v [x_v, y_v, 1]^T;
[X_vc, Y_vc, Z_vc]^T = [R_v, T_v; 0, 1] [X_w, Y_w, Z_w]^T;
Z_vc [x_v, y_v, 1]^T = B_v [X_vc, Y_vc, Z_vc]^T;
wherein [X_w, Y_w, Z_w] are the world coordinates of any point of the calibration reference object, [X_vc, Y_vc, Z_vc] are that point's corresponding coordinates in the virtual camera coordinate system, [x_v, y_v, 1] are its corresponding coordinates in the image coordinate system, and [u_v, v_v, 1] are its corresponding coordinates in the pixel coordinate system.
3. The simulation bench camera calibration method according to claim 2, characterised in that the second correspondence is:
[u_r, v_r, 1]^T = A_r [x_r, y_r, 1]^T;
[X_rc, Y_rc, Z_rc]^T = [R_r, T_r; 0, 1] [x_v, y_v, 1]^T;
Z_rc [x_r, y_r, 1]^T = B_r [X_rc, Y_rc, Z_rc]^T;
wherein [X_rc, Y_rc, Z_rc] are the corresponding coordinates of any point of the calibration reference object in the real camera coordinate system, [x_r, y_r, 1] are that point's corresponding coordinates in the image coordinate system, and [u_r, v_r, 1] are its corresponding coordinates in the pixel coordinate system.
4. The simulation bench camera calibration method according to claim 3, characterised in that the first and second correspondences are solved by the least-squares method.
5. The simulation bench camera calibration method according to claim 1, characterised in that the real-time machine exports the third correspondence to the FAS system over a CAN bus.
6. A real-time machine, characterised in that it performs performance testing on an FAS system using the simulation bench camera calibration method of any one of claims 1 to 5.
CN201510134249.6A 2015-03-25 2015-03-25 Simulation bench camera calibration method and real-time machine Active CN104715486B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510134249.6A CN104715486B (en) 2015-03-25 2015-03-25 Simulation bench camera calibration method and real-time machine

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201510134249.6A CN104715486B (en) 2015-03-25 2015-03-25 Simulation bench camera calibration method and real-time machine

Publications (2)

Publication Number Publication Date
CN104715486A CN104715486A (en) 2015-06-17
CN104715486B true CN104715486B (en) 2017-12-19

Family

ID=53414780

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510134249.6A Active CN104715486B (en) 2015-03-25 2015-03-25 Simulation bench camera calibration method and real-time machine

Country Status (1)

Country Link
CN (1) CN104715486B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105872319B (en) * 2016-03-29 2018-12-18 深圳迪乐普数码科技有限公司 A depth-of-field measurement method
CN114814758B (en) * 2022-06-24 2022-09-06 国汽智控(北京)科技有限公司 Camera-millimeter wave radar-laser radar combined calibration method and device

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1851618A (en) * 2006-05-31 2006-10-25 北京航空航天大学 Monocular-vision semi-physical simulation system and method
CN1897715A (en) * 2006-05-31 2007-01-17 北京航空航天大学 Three-dimensional-vision semi-physical simulation system and method
CN102426425A (en) * 2011-11-08 2012-04-25 重庆邮电大学 Automobile ABS (Antilock Brake System) virtual reality simulation system
CN103646403A (en) * 2013-12-26 2014-03-19 北京经纬恒润科技有限公司 Vehicle-mounted multi-camera calibration method, system and image processing device
CN103871071A (en) * 2014-04-08 2014-06-18 北京经纬恒润科技有限公司 Method for camera external reference calibration for panoramic parking system

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7773797B2 (en) * 2006-02-06 2010-08-10 Beijing University Of Aeronautics And Astronautics Methods and apparatus for measuring the flapping deformation of insect wings
US8502860B2 (en) * 2009-09-29 2013-08-06 Toyota Motor Engineering & Manufacturing North America (Tema) Electronic control system, electronic control unit and associated methodology of adapting 3D panoramic views of vehicle surroundings by predicting driver intent

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Calibration technique for a single-camera, single-projector three-dimensional measurement system; Wei Zhengliang et al.; Journal of Tsinghua University (Science and Technology); 2009-02-15; Vol. 49, No. 2; pp. 202-205, 209 *
Research on binocular vision camera calibration methods for mobile robots; Gao Zhongguo; China Master's Theses Full-text Database, Information Science and Technology; 2012-07-15; Section 3.2 of the main text *

Also Published As

Publication number Publication date
CN104715486A (en) 2015-06-17

Similar Documents

Publication Publication Date Title
CN112894832B (en) Three-dimensional modeling method, three-dimensional modeling device, electronic equipment and storage medium
US9858639B2 (en) Imaging surface modeling for camera modeling and virtual view synthesis
US8817079B2 (en) Image processing apparatus and computer-readable recording medium
CN100562707C (en) Binocular vision rotating axis calibration method
US7768527B2 (en) Hardware-in-the-loop simulation system and method for computer vision
CN109345596A (en) Multisensor scaling method, device, computer equipment, medium and vehicle
CN102968809B (en) The method of virtual information mark and drafting marking line is realized in augmented reality field
CN100417231C (en) Three-dimensional vision semi-matter simulating system and method
CN109903341A (en) Join dynamic self-calibration method outside a kind of vehicle-mounted vidicon
CN108257183A (en) A kind of camera lens axis calibrating method and device
CN102243764B (en) Motion characteristic point detection method and device
CN109690622A (en) Camera registration in multicamera system
CN102692236A (en) Visual milemeter method based on RGB-D camera
CN104463778A (en) Panoramagram generation method
CN106534670B (en) It is a kind of based on the panoramic video generation method for connecting firmly fish eye lens video camera group
CN110969663A (en) Static calibration method for external parameters of camera
CN109961522A (en) Image projecting method, device, equipment and storage medium
CN109579868A (en) The outer object localization method of vehicle, device and automobile
CN105809729B (en) A kind of spherical panorama rendering method of virtual scene
CN107240065A (en) A kind of 3D full view image generating systems and method
CN108344401A (en) Localization method, device and computer readable storage medium
CN105931261A (en) Method and device for modifying extrinsic parameters of binocular stereo camera
CN105374067A (en) Three-dimensional reconstruction method based on PAL cameras and reconstruction system thereof
CN104715486B (en) Simulation bench camera calibration method and real-time machine
CN104913775A (en) Method for measuring height of transmission line of unmanned aerial vehicle and method and device for positioning unmanned aerial vehicle

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CP03 Change of name, title or address

Address after: 4 / F, building 1, No.14 Jiuxianqiao Road, Chaoyang District, Beijing 100020

Patentee after: Beijing Jingwei Hengrun Technology Co., Ltd

Address before: 8 / F, block B, No. 11, Anxiang Beili, Chaoyang District, Beijing 100101

Patentee before: Beijing Jingwei HiRain Technologies Co.,Ltd.
