CN103019011A - Simulation validation method of projection system effect - Google Patents

Simulation validation method of projection system effect

Info

Publication number
CN103019011A
CN103019011A CN2012105066059A CN201210506605A
Authority
CN
China
Prior art keywords
prime
camera
projection system
coordinate system
curtain
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN2012105066059A
Other languages
Chinese (zh)
Inventor
刘雷
秦聚超
梁晟溟
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Harbin Feiyu Technology Co Ltd
Original Assignee
Harbin Feiyu Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Harbin Feiyu Technology Co Ltd filed Critical Harbin Feiyu Technology Co Ltd
Priority to CN2012105066059A priority Critical patent/CN103019011A/en
Publication of CN103019011A publication Critical patent/CN103019011A/en
Pending legal-status Critical Current

Landscapes

  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention discloses a simulation validation method for the effect of a projection system. The method is implemented as a computer software program, based on the principles of spatial analytic geometry and developed against a specific engineering background. With the method, the relative spatial position of the photographed object and the camera lens optical center, together with the camera attitude during shooting (namely the direction of the lens optical axis), can be set precisely in the program so as to obtain images of the photographed object under all kinds of shooting conditions. The method aims to provide the necessary image data support for the overall design, virtual manufacturing and debugging of the projection system.

Description

A simulation validation method for the effect of a projection system
Technical field
The present invention relates to an optical projection imaging method based on the principles of spatial analytic geometry. It can precisely predict the imaging effect of a projection system, i.e., obtain the image of the photographed object in advance with a software program as the tool, in order to support the overall design, virtual manufacturing and debugging of the projection system.
Background technology
Digital image processing technology has permeated every field of daily life and is playing an ever-increasing role. Its significance is no longer limited to the traditional categories of improving the visual quality of images, extracting image information and manipulating image data; it is also becoming more and more closely linked with the realization of numerous scientific ideas. The present method is intended to provide the necessary virtual image data support for the design, manufacturing and debugging of related digital image processing systems.
For example, suppose a vehicle model recognition system is to be designed. A common design approach is to collect the contour features of each vehicle model and store them in a computer database; after a camera photographs a vehicle of some model, the specific information obtained from the image serves as the basis for selecting the corresponding vehicle model from the database. To build the contour feature database of each vehicle model accurately, it is necessary to know in advance the geometric contour information in the image obtained when the camera photographs each vehicle model from certain specific positions and at certain specific angles. In practice, accurately fixing the coordinates of the camera and the vehicle in a spatial coordinate system and accurately setting the three angles of the camera lens optical axis (horizontal azimuth angle α, vertical pitch angle β and axial rotation angle γ; the optical axis is the straight line connecting the light-admitting point of the lens and the center of the imaging sensor) involves a certain technical difficulty. For this reason, the present invention proposes a simulation method for the effect of a projection imaging system. With this method, the three shooting angles of the camera lens optical axis can be set precisely in a program, and the relative position of the photographed object and the camera light-admitting point (also called the "optical center") can be set arbitrarily and exactly in a spatial rectangular coordinate system; that is, the required image can be obtained almost without error under any chosen shooting configuration.
Summary of the invention
The substance of the method is illustrated here with a practical engineering background. In this project, the image of a certain rectangular screen under different camera shooting configurations must be obtained, and ultimately the corresponding object-point coordinates must be recovered from the image-point coordinates in the image of the rectangular screen. After a world coordinate system is established and a rectangular screen of known size is fixed in place, the coordinates of any point on the screen in the world coordinate system can be obtained. Grid lines are drawn on the rectangular screen (horizontal grid lines parallel to the horizontal edges of the screen and vertical grid lines parallel to the vertical edges), dividing the whole screen into a number of standard rectangular regions, and the four vertices of each rectangular region are marked with black dots. When the camera optical axis photographs the rectangular screen at different angles (the angle between the line containing the optical axis and the plane of the screen is arbitrary and not limited to a right angle), each rectangular region no longer appears in the screen image as a standard rectangle but as an irregular quadrilateral. With the method proposed by the present invention, once the camera position and the shooting angles of the camera optical axis are set, the imaging information of each rectangular region in the screen image, i.e., the exact shape of each quadrilateral image of a rectangular region, can be obtained.
To display the image (lines or points) of each rectangular region of the rectangular screen in the resulting image, a spatial geometric model of the camera system, formed jointly by the screen and the camera, is built. In this model a world coordinate system is established, and the coordinates of the rectangular screen, the camera lens light-admitting point and the camera imaging sensor are each calibrated in this coordinate system; an image coordinate system is established in the plane of the imaging sensor, and the transformation from object point to image point is realized according to the pinhole imaging principle. Specifically, the straight line connecting an object point and the light-admitting point intersects the plane containing the imaging sensor at a point; if this point lies within the rectangular region occupied by the sensor (the sensor is a rectangular device whose size can be set in the program according to the device specification and design requirements), it is the image point of the object point in the photograph; otherwise the object point cannot be imaged in the photograph, i.e., it lies outside the field of view of the camera lens.
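This object-point-to-image-point construction can be illustrated with a short Python sketch. It is only an illustrative reconstruction for the special case of the initial camera pose (optical center at the origin, optical axis along the y axis, sensor of size l1 × l2 centered at (0, -l, 0) and parallel to the xOz plane); the function name is chosen here, and the numerical values are taken, for concreteness, from the embodiment described later.

```python
# Illustrative sketch: pinhole test in the initial pose. The ray through the
# optical center O (origin) and the object point is intersected with the
# sensor plane y = -l; the object is imaged only if the intersection lies
# inside the l1 x l2 sensor rectangle centered at B = (0, -l, 0).
def project_initial(obj, l, l1, l2):
    x, y, z = obj
    if y >= 0:                       # in this model the screen lies at negative y
        return None, False
    t = -l / y                       # parameter where t*(x, y, z) meets the plane y = -l
    px, pz = t * x, t * z            # intersection point is (px, -l, pz)
    inside = abs(px) <= l1 / 2 and abs(pz) <= l2 / 2
    return (px, -l, pz), inside

# Example with the embodiment's values: screen corner A1 = (1.15, -10, 2.7)
point, imaged = project_initial((1.15, -10.0, 2.7), l=0.005866, l1=0.006773, l2=0.005080)
print(point, imaged)                 # the corner falls inside the sensor, so it is imaged
```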
The key problem to be solved in the above process is to obtain a data representation of the imaging sensor in the world coordinate system, given the position of the camera lens light-admitting point and the three rotation angles of the camera optical axis. In the present invention, the world coordinate system is established with the camera lens light-admitting point as the origin. In this coordinate system, the camera optical axis coincides with the y axis, which serves as the initial shooting direction; the imaging sensor is parallel to the xOz plane, with its four edges parallel to the x axis and the z axis respectively, which fixes the initial position of the sensor. When the rectangular screen is photographed by this method, the position of the screen in the world coordinate system is determined relative to the lens light-admitting point (the coordinate origin); and once the shooting direction of the camera lens optical axis is determined, the deviation of the optical axis from its initial shooting direction can be quantitatively calibrated by the three angles mentioned above (horizontal azimuth angle α, vertical pitch angle β and axial rotation angle γ).
The technical solution adopted by the present invention is as follows:
A simulation validation method for the effect of a projection system, characterized in that it is used to obtain the image of a rectangular screen 3; the three deflection angles α, β, γ of the camera during shooting can all be preset, the relative position of the camera and the rectangular screen 3 in the world coordinate system can be preset, and the size of the camera imaging sensor 1a (the lengths l1 and l2, together with the focal length l) can be preset.
The simulation validation method for the effect of a projection system as described, characterized by the algorithm for solving the coordinate values (Bx, By, Bz), (B1x′, B1y′, B1z′), (B2x′, B2y′, B2z′), (B3x′, B3y′, B3z′), (B4x′, B4y′, B4z′) of B″, B1‴, B2‴, B3‴, B4‴ in the world coordinate system, once the three angles α, β, γ, the relative position of the camera and the rectangular screen 3 in the world coordinate system, and the size of the camera imaging sensor 1a have been determined in the simulation validation system.
The simulation validation method for the effect of a projection system as described, characterized by the algorithm for solving the coordinates (P′x, P′y) of the image point 5a in the image coordinate system once the coordinate values of B″, B1‴, B2‴, B3‴, B4‴ in the world coordinate system have been obtained, and in particular by the relation between d1, d2 and the signs of P′x, P′y in the image coordinate system.
The content of the invention is discussed in detail below in connection with the drawings and the specific embodiments.
Description of drawings
Fig. 1 is a simplified schematic diagram of the mathematical model of the camera photographing the rectangular screen;
Fig. 2 is a schematic diagram of the mathematical model of the camera photographing the rectangular screen;
Fig. 3 is a schematic diagram of the transformation of an image point from the world coordinate system to the image coordinate system;
Fig. 4 is a schematic diagram of the rectangular screen divided into 36 rectangular regions (with grid lines and numbering);
Fig. 5 is a schematic diagram of the rectangular screen with the grid intersection points kept and the grid lines removed;
Fig. 6 is the image of the rectangular screen obtained with the camera parameters α, β, γ set to 10°, 20° and 3° respectively (the black rectangle represents the image border; the part outside this region is not imaged);
Fig. 7 is a schematic diagram of the relation between the camera lens focal length, the imaging sensor size and the field-of-view angle.
Symbol description:
1a is the imaging sensor at its initial position; 1b is the imaging sensor after the horizontal azimuth angle α has been set; 1c is the imaging sensor after the vertical pitch angle β has additionally been set on the basis of 1b; 1d is the imaging sensor after rotation by the angle γ about the optical axis on the basis of 1c; 2 is the lens light-admitting point (optical center); 3 is the rectangular screen (the photographed object); 4 is an object point; 5a is an image point (inside the sensor region, imaged); 5b is an "image point" (outside the sensor region, not imaged); 6 is the origin of the image coordinate system; 7 is the camera optical axis (focal length).
Embodiment
The actual layout of the rectangular screen 3 is shown in Fig. 5: 49 points are distributed on it, and every 4 points enclose one standard rectangular region (to distinguish the 36 rectangular regions more clearly, Fig. 4 draws the grid lines on the basis of Fig. 5 and numbers each region). The ratios |A1A12| : |A12A13| : |A13A14| : |A14A15| : |A15A16| : |A16A2| are adjustable, and the ratios |A1A21| : |A21A31| : |A31A41| : |A41A51| : |A51A61| : |A61A4| are also adjustable. When these two groups of ratios are set appropriately, the length-width ratios of the rectangular regions on the screen 3 are all different. Moreover, whatever the angle between the camera optical axis 7 and the plane of the screen (this angle is subject to certain limits imposed by the engineering background), the length-width ratios of the irregular quadrilateral regions in the screen image (shown in Fig. 6; each is the image of the corresponding rectangular region on the original screen) remain distinct. Therefore, on the basis of the 36 distinct length-width ratios, the irregular quadrilateral regions in the screen image (the images) can be put into one-to-one correspondence with the rectangular regions on the original screen (the objects), as sketched below.
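The sketch below illustrates the ratio-based pairing in miniature. It is only an illustration under the assumption that each image quadrilateral is assigned to the screen region whose length-width ratio is nearest; the patent text does not commit to a particular matching rule, and all names and numbers here are invented for the example.

```python
# Illustrative sketch only: match image quadrilaterals to screen regions by the
# nearest length-width ratio. The concrete matching rule is an assumption here.
def nearest_ratio_match(screen_ratios, quad_ratios):
    """screen_ratios: {region_id: length/width ratio of the rectangle on the screen}
    quad_ratios: {quad_id: measured length/width ratio of the quadrilateral in the image}
    Returns {quad_id: region_id}, pairing each quadrilateral with the nearest ratio."""
    return {q: min(screen_ratios, key=lambda r: abs(screen_ratios[r] - qr))
            for q, qr in quad_ratios.items()}

# Toy example with three regions of deliberately different ratios
screen = {1: 1.20, 2: 1.55, 3: 2.10}
quads = {"a": 1.58, "b": 2.02, "c": 1.18}
print(nearest_ratio_match(screen, quads))   # {'a': 2, 'b': 3, 'c': 1}
```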
Once the one-to-one correspondence between the rectangular regions of the screen and the irregular quadrilateral regions in the screen image has been established, a specific algorithm can further be used to place the points inside a rectangular region and the points inside the corresponding irregular quadrilateral in correspondence, thereby realizing the transformation from image-point coordinates (in the image coordinate system) to object-point coordinates (in the world coordinate system) and meeting the engineering requirement.
The main content of the present invention is precisely the algorithms required by this software program. With the software program, the world coordinate system and the image coordinate system can be established, the relative position of the rectangular screen 3 and the camera optical center 2 can be determined, the shooting angles of the camera (the three angles α, β, γ) can be set precisely, and the image of the screen 3 under the current shooting configuration can then be obtained. The ultimate purpose of obtaining the image is to prepare for a reasonable setting of the two groups of ratios (the ratios |A1A12| : |A12A13| : |A13A14| : |A14A15| : |A15A16| : |A16A2| and |A1A21| : |A21A31| : |A31A41| : |A41A51| : |A51A61| : |A61A4|).
The main content of the method described in this patent is explained below.
Fig. 7 shows the relation between the imaging sensor 1a in the camera, the optical axis (focal length) 7 and the field-of-view angle. Let the length of B1B2 be l1 and the length of B2B3 be l2. The point O is the camera lens light-admitting point 2, and OB is its focal length 7 (OB is perpendicular to the plane of the quadrilateral B1B2B3B4, meeting it at the point B); its length is l.
A1 and A2 are the midpoints of B1B2 and B3B4 respectively. Draw the straight lines OA1 and OA2 (shown as dashed lines in Fig. 7); the angle between the two lines is β, which is the field-of-view angle of the camera in the vertical direction (with B1B2 placed horizontally and B2B3 vertically). The relation between β, l and l2 is given by formula (1).
tan(β/2) = (l2 / 2) / l        (1)
Here the value of β is set to 46.8°, i.e., the field-of-view range in the vertical direction is 46.8°. The length l1 is set to 0.006773 m and the width l2 to 0.005080 m, which is the common size of a 1/3-inch imaging sensor. Substituting the values of β and l2 into formula (1) gives l = 0.005866 m. Substituting the values of l and l1 into formula (1) again, the horizontal field-of-view angle α is found to be 60°.
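These numbers can be checked with a few lines of Python; this is only a verification sketch of formula (1) (variable names chosen here), and it reproduces the figures quoted above to within rounding.

```python
import math

# Check of formula (1): tan(beta/2) = (l2/2) / l, with the 1/3-inch sensor values above
l1, l2 = 0.006773, 0.005080            # sensor length and width in metres
beta = math.radians(46.8)              # chosen vertical field-of-view angle
l = (l2 / 2) / math.tan(beta / 2)      # focal length from formula (1)
alpha = 2 * math.degrees(math.atan((l1 / 2) / l))   # horizontal field of view, formula (1) again
print(round(l, 6), round(alpha, 1))    # approximately 0.00587 m and 60.0 degrees
```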
The above part has explained how the internal camera parameters are set. In practice, before the method is applied to obtain the image of the rectangular screen 3, a lens that satisfies the engineering requirement on field of view must be selected in advance; once the lens model to be used is determined, the sensor size and the focal length are determined accordingly. At that point, these parameters only need to be entered into the program.
Returning to the discussion in the preamble, the world coordinate system (called coordinate system I) is first established with the lens optical center 2 as the origin and the y axis perpendicular to the imaging sensor 1a, as shown in Fig. 2. The coordinates of the point B (the center of the imaging sensor 1a) are then (0, -l, 0), and B1B2 and B2B3 are parallel to the x axis and the z axis respectively. The position 1a of the sensor B1B2B3B4 defines the initial shooting direction and camera position. The coordinates of the four vertices A1, A2, A3, A4 of the rectangular screen 3 in coordinate system I are set to (1.15, -10, 2.7), (-1.15, -10, 2.7), (-1.15, -10, -1.3) and (1.15, -10, -1.3) respectively. It should be noted that these four coordinates can be set according to the actual situation, i.e., the size and position of the rectangular screen 3 can be adjusted according to the engineering requirements.
The three deflection angles of the camera lens optical axis 7 (horizontal azimuth angle α, vertical pitch angle β and axial rotation angle γ) are set in the program in the following three steps:
In the first step, as shown in Fig. 2, OB is deflected in the xOy plane by the angle α (measured from its initial position) and becomes OB′ ("2" in Fig. 2). By convention, α > 0 when the x coordinate of B′ is greater than zero, and α < 0 when the x coordinate of B′ is less than zero.
In the second step, OB′ is moved from the position described in the first step to the position OB″ ("3" in Fig. 2); the angle of this change is the pitch angle β. By convention, β > 0 when the z coordinate of the point B″ is greater than zero, and β < 0 when the z coordinate of the point B″ is less than zero.
In the third step, 1d in Fig. 2 illustrates the setting of the axial rotation angle γ. On the basis of the second step, the imaging sensor 1c is rotated by the angle γ about the optical axis 7, and its four vertices B1″, B2″, B3″, B4″ become B1‴, B2‴, B3‴, B4‴ accordingly. By convention, looking from O towards B″, γ is positive when the camera (sensor) rotates counterclockwise and negative when it rotates clockwise.
If B ", B 1", B 2", B 3", B 4" coordinate in coordinate system I is respectively (B x, B y, B z), (B 1x, B 1y, B 1z), (B 2x, B 2y, B 2z), (B 3x, B 3y, B 3z), (B 4x, B 4y, B 4z), then have
Bx = |OB″| sin α cos β        (2)
By = -|OB″| cos α cos β        (3)
Bz = |OB″| sin β        (4)
B1x = Bx - (|B2″B3″| sin α sin β)/2 + (|B1″B2″| cos α)/2        (5)
B1y = By + (|B2″B3″| sin β cos α)/2 + (|B1″B2″| sin α)/2        (6)
B1z = Bz + (|B2″B3″| cos β)/2        (7)
B2x = Bx - (|B2″B3″| sin α sin β)/2 - (|B1″B2″| cos α)/2        (8)
B2y = By + (|B2″B3″| sin β cos α)/2 - (|B1″B2″| sin α)/2        (9)
B2z = Bz + (|B2″B3″| cos β)/2        (10)
B3x = Bx + (|B2″B3″| sin α sin β)/2 - (|B1″B2″| cos α)/2        (11)
B3y = By - (|B2″B3″| sin β cos α)/2 - (|B1″B2″| sin α)/2        (12)
B3z = Bz - (|B2″B3″| cos β)/2        (13)
B4x = Bx + (|B2″B3″| sin α sin β)/2 + (|B1″B2″| cos α)/2        (14)
B4y = By - (|B2″B3″| sin β cos α)/2 + (|B1″B2″| sin α)/2        (15)
B4z = Bz - (|B2″B3″| cos β)/2        (16)
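A direct evaluation of formulas (2)-(16) can be written as the following Python sketch. The function and variable names are chosen here for illustration, with |OB″| = l, |B1″B2″| = l1 and |B2″B3″| = l2 (the identifications used when these formulas are evaluated below).

```python
import math

# Illustrative evaluation of formulas (2)-(16): coordinates of B'' and of the
# sensor corners B1''..B4'' after the azimuth alpha and pitch beta are applied.
def sensor_after_alpha_beta(alpha_deg, beta_deg, l, l1, l2):
    a, b = math.radians(alpha_deg), math.radians(beta_deg)
    w, h = l1, l2                       # w = |B1''B2''|, h = |B2''B3''|
    Bx = l * math.sin(a) * math.cos(b)                              # (2)
    By = -l * math.cos(a) * math.cos(b)                             # (3)
    Bz = l * math.sin(b)                                            # (4)
    B1 = (Bx - h*math.sin(a)*math.sin(b)/2 + w*math.cos(a)/2,       # (5)
          By + h*math.sin(b)*math.cos(a)/2 + w*math.sin(a)/2,       # (6)
          Bz + h*math.cos(b)/2)                                     # (7)
    B2 = (Bx - h*math.sin(a)*math.sin(b)/2 - w*math.cos(a)/2,       # (8)
          By + h*math.sin(b)*math.cos(a)/2 - w*math.sin(a)/2,       # (9)
          Bz + h*math.cos(b)/2)                                     # (10)
    B3 = (Bx + h*math.sin(a)*math.sin(b)/2 - w*math.cos(a)/2,       # (11)
          By - h*math.sin(b)*math.cos(a)/2 - w*math.sin(a)/2,       # (12)
          Bz - h*math.cos(b)/2)                                     # (13)
    B4 = (Bx + h*math.sin(a)*math.sin(b)/2 + w*math.cos(a)/2,       # (14)
          By - h*math.sin(b)*math.cos(a)/2 + w*math.sin(a)/2,       # (15)
          Bz - h*math.cos(b)/2)                                     # (16)
    return (Bx, By, Bz), B1, B2, B3, B4

# Example with the Fig. 6 angles (alpha = 10, beta = 20) and the sensor size above
print(sensor_after_alpha_beta(10, 20, 0.005866, 0.006773, 0.005080))
```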
Let the coordinates of B1‴, B2‴, B3‴, B4‴ after the imaging sensor 1c has been rotated about the optical axis 7 by γ° become, in turn, (B1x′, B1y′, B1z′), (B2x′, B2y′, B2z′), (B3x′, B3y′, B3z′), (B4x′, B4y′, B4z′) (obviously, the coordinates of the point B″ are not affected by the rotation about the optical axis 7 and remain unchanged). These four coordinate values can be obtained by the following process.
Let

ex = Bx / sqrt(Bx² + By² + Bz²)        (17)
ey = By / sqrt(Bx² + By² + Bz²)        (18)
ez = Bz / sqrt(Bx² + By² + Bz²)        (19)

where (ex, ey, ez) is the unit direction vector of OB″. Then:
x1 = x0[ex²(1 - cos γ) + cos γ] + y0[ex·ey(1 - cos γ) - ez·sin γ] + z0[ex·ez(1 - cos γ) + ey·sin γ]        (20)
y1 = x0[ex·ey(1 - cos γ) + ez·sin γ] + y0[ey²(1 - cos γ) + cos γ] + z0[ey·ez(1 - cos γ) - ex·sin γ]        (21)
z1 = x0[ex·ez(1 - cos γ) - ey·sin γ] + y0[ey·ez(1 - cos γ) + ex·sin γ] + z0[ez²(1 - cos γ) + cos γ]        (22)
Combining formulas (2)-(19), substituting the values of α, β, γ, |OB″| (i.e., l), |B1″B2″| (i.e., l1) and |B2″B3″| (i.e., l2), and replacing (x0, y0, z0) in (20)-(22) in turn by (B1x, B1y, B1z), (B2x, B2y, B2z), (B3x, B3y, B3z), (B4x, B4y, B4z), the coordinate values (Bx, By, Bz), (B1x′, B1y′, B1z′), (B2x′, B2y′, B2z′), (B3x′, B3y′, B3z′), (B4x′, B4y′, B4z′) of B″, B1‴, B2‴, B3‴, B4‴ in coordinate system I are obtained.
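Formulas (17)-(22) are the standard axis-angle (Rodrigues) rotation about the direction of OB″. A minimal Python sketch of this step, with names chosen here and the angle supplied in degrees, is:

```python
import math

# Rotate a point (x0, y0, z0) about the axis through the origin in the direction
# of B'' = (Bx, By, Bz) by the angle gamma, exactly as in formulas (17)-(22).
def rotate_about_axis(p, B, gamma_deg):
    x0, y0, z0 = p
    Bx, By, Bz = B
    n = math.sqrt(Bx*Bx + By*By + Bz*Bz)
    ex, ey, ez = Bx/n, By/n, Bz/n                      # unit direction vector, (17)-(19)
    g = math.radians(gamma_deg)
    c, s = math.cos(g), math.sin(g)
    x1 = x0*(ex*ex*(1-c) + c) + y0*(ex*ey*(1-c) - ez*s) + z0*(ex*ez*(1-c) + ey*s)   # (20)
    y1 = x0*(ex*ey*(1-c) + ez*s) + y0*(ey*ey*(1-c) + c) + z0*(ey*ez*(1-c) - ex*s)   # (21)
    z1 = x0*(ex*ez*(1-c) - ey*s) + y0*(ey*ez*(1-c) + ex*s) + z0*(ez*ez*(1-c) + c)   # (22)
    return x1, y1, z1

# Applying this to each of B1'', B2'', B3'', B4'' yields B1''', B2''', B3''', B4'''.
# Simple check: rotating (1, 0, 0) about the z axis by 90 degrees
print(rotate_about_axis((1.0, 0.0, 0.0), (0.0, 0.0, 1.0), 90.0))   # approximately (0, 1, 0)
```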
At this point, the setting of the three deflection angles of the camera optical axis 7 is complete.
" coordinate (the B at known B x, B y, B z) condition under, can obtain plane B 1" ' B 2" ' B 3" ' B 4" ' equation:
Bx·x + By·y + Bz·z - Bx² - By² - Bz² = 0        (23)
Combining this with formulas (2)-(4),
Bx·x + By·y + Bz·z - |OB″|² = 0        (24)
That is
Bx·x + By·y + Bz·z - l² = 0        (25)
Obviously, the straight line connecting the object point 4 and the light-admitting point 2 intersects the plane B1‴B2‴B3‴B4‴ at the image point 5a, and the coordinates (Px, Py, Pz) of the image point 5a (in coordinate system I) can be obtained by solving the system of equations formed by the equation of the straight line OP and the equation of the plane B1‴B2‴B3‴B4‴.
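This line-plane intersection can be sketched in Python as follows. It is an illustrative reconstruction with names chosen here; the sensor plane is taken in the form of formula (25).

```python
# Illustrative sketch: intersect the line through the optical center O (origin)
# and an object point with the sensor plane Bx*x + By*y + Bz*z - l**2 = 0 of (25).
def image_point_in_world(obj, B, l):
    ox, oy, oz = obj
    Bx, By, Bz = B
    denom = Bx*ox + By*oy + Bz*oz
    if abs(denom) < 1e-15:            # line parallel to the sensor plane: no intersection
        return None
    t = (l * l) / denom               # from substituting (t*ox, t*oy, t*oz) into (25)
    return (t*ox, t*oy, t*oz)         # coordinates of image point 5a in coordinate system I

# Example in the initial pose, where B'' = (0, -l, 0), for the screen corner A1
l = 0.005866
print(image_point_in_world((1.15, -10.0, 2.7), (0.0, -l, 0.0), l))
```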
From the coordinates (B1x′, B1y′, B1z′), (B2x′, B2y′, B2z′), (B4x′, B4y′, B4z′) of B1‴, B2‴, B4‴ in coordinate system I, the equations (26) and (27) of the straight lines B1‴B2‴ and B1‴B4‴ are obtained.
(x - B1x′) / (B1x′ - B2x′) = (y - B1y′) / (B1y′ - B2y′) = (z - B1z′) / (B1z′ - B2z′)        (26)
(x - B1x′) / (B1x′ - B4x′) = (y - B1y′) / (B1y′ - B4y′) = (z - B1z′) / (B1z′ - B4z′)        (27)
According to the formula for the distance from a point in space to a straight line,
L = |{X - x0, Y - y0, Z - z0} × {l, m, n}| / sqrt(l² + m² + n²)        (28)
the distances d1 and d2 from 5a to the lines B1‴B4‴ and B1‴B2‴ can be obtained. In formula (28), (X, Y, Z) are the coordinates of the point in space, {l, m, n} is the direction vector of the straight line, (x0, y0, z0) are the coordinates of some point on the straight line, and L is the distance from the point to the line.
Next, referring to Fig. 3, the coordinates (P′x, P′y) of the image point 5a in the image coordinate system (called coordinate system II) are determined. Taking P′x as an example, let the distance from P′ to the straight line B2‴B3‴ be d3 (note that this distance is computed in coordinate system I) and calculate its value by formula (28). Obviously, if d1 and d3 satisfy
d1 < d3 and d3 > |B1‴B2‴|        (29)
then P′x = -d1, i.e., P′x is negative; otherwise P′x = d1, i.e., P′x is positive. The sign of P′y can be obtained similarly, and the coordinates (P′x, P′y) of the image point 5a in coordinate system II are thus determined. This completes the description of the coordinate transformation from the object point 4 to the image point 5a.
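The distance formula (28) and the sign rule (29) can be sketched as follows; the cross-product helper and the function names are chosen here for illustration, and the analogous rule for P′y is omitted.

```python
import math

# Point-to-line distance, formula (28), and the sign rule of formula (29) for P'x.
def cross(u, v):
    return (u[1]*v[2] - u[2]*v[1], u[2]*v[0] - u[0]*v[2], u[0]*v[1] - u[1]*v[0])

def point_line_distance(P, Q, direction):
    """Distance from point P to the line through Q with the given direction vector, as in (28)."""
    w = (P[0] - Q[0], P[1] - Q[1], P[2] - Q[2])
    c = cross(w, direction)
    return math.sqrt(c[0]**2 + c[1]**2 + c[2]**2) / math.sqrt(sum(d*d for d in direction))

def signed_px(d1, d3, l1):
    """Sign rule (29): P'x = -d1 if d1 < d3 and d3 > |B1'''B2'''| (= l1); otherwise P'x = +d1."""
    return -d1 if (d1 < d3 and d3 > l1) else d1

# Small check: distance from (2, 0, 0) to the z axis is 2
print(point_line_distance((2.0, 0.0, 0.0), (0.0, 0.0, 0.0), (0.0, 0.0, 1.0)))
```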
Coding the method discussed above for each of the 49 points on the rectangular screen 3 and obtaining the image-point coordinates of each point on the imaging sensor yields a complete screen image.
Fig. 6 shows the image of the rectangular screen 3 obtained by running the software program with the camera parameters α, β, γ set to 10°, 20° and 3° respectively; the black rectangular frame 1d is the imaging sensor B1B2B3B4 and represents the actual border of the image. The points outside the frame are the "image points" of those of the 49 points on the screen that lie outside the camera field of view; in reality they cannot be imaged on the sensor. For example, under the current shooting configuration, the point A74 on the rectangular screen 3 lies outside the camera field of view, and its "image point" 5b is not inside the black frame.
In connection with the engineering background involved, all the image points inside the black frame in Fig. 6 need to be sorted by a certain method, the four vertices of each quadrilateral grouped and distinguished, and their length-width ratios obtained, so that the corresponding rectangular regions in the original rectangular screen 3 can be found with these ratios as the basis.

Claims (3)

1. A simulation validation method for the effect of a projection system, characterized in that it is used to obtain the image of a rectangular screen 3; the three deflection angles α, β, γ of the camera during shooting can all be preset, the relative position of the camera and the rectangular screen 3 in the world coordinate system can be preset, and the size of the camera imaging sensor 1a (the lengths l1 and l2, together with the focal length l) can be preset.
2. The simulation validation method for the effect of a projection system according to claim 1, characterized by the algorithm for solving the coordinate values (Bx, By, Bz), (B1x′, B1y′, B1z′), (B2x′, B2y′, B2z′), (B3x′, B3y′, B3z′), (B4x′, B4y′, B4z′) of B″, B1‴, B2‴, B3‴, B4‴ in the world coordinate system, once the three angles α, β, γ, the relative position of the camera and the rectangular screen 3 in the world coordinate system, and the size of the camera imaging sensor 1a have been determined in the simulation validation system.
3. The simulation validation method for the effect of a projection system according to claim 1 or 2, characterized by the algorithm for solving the coordinates (P′x, P′y) of the image point 5a in the image coordinate system once the coordinate values of B″, B1‴, B2‴, B3‴, B4‴ in the world coordinate system have been obtained, and in particular by the relation between d1, d2 and the signs of P′x, P′y in the image coordinate system.
CN2012105066059A 2012-12-03 2012-12-03 Simulation validation method of projection system effect Pending CN103019011A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN2012105066059A CN103019011A (en) 2012-12-03 2012-12-03 Simulation validation method of projection system effect

Publications (1)

Publication Number Publication Date
CN103019011A true CN103019011A (en) 2013-04-03

Family

ID=47967777

Family Applications (1)

Application Number Title Priority Date Filing Date
CN2012105066059A Pending CN103019011A (en) 2012-12-03 2012-12-03 Simulation validation method of projection system effect

Country Status (1)

Country Link
CN (1) CN103019011A (en)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN201004150Y (en) * 2007-01-22 2008-01-09 哈尔滨宇宙光电子有限公司 Optical source compensation device for LCD projector
CN101833228A (en) * 2010-03-31 2010-09-15 仇文杰 Double-side visible projection optical film and manufacturing method thereof
CN201837823U (en) * 2010-03-31 2011-05-18 仇文杰 Projection optical film with two visible surfaces

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107255458A (en) * 2017-06-19 2017-10-17 昆明理工大学 A kind of upright projection grating measuring analogue system and its implementation
CN107255458B (en) * 2017-06-19 2020-02-07 昆明理工大学 Resolving method of vertical projection grating measurement simulation system

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20130403