CN103020354A - Design method of spherical curtain projection system for region identification - Google Patents

Design method of spherical curtain projection system for region identification

Info

Publication number
CN103020354A
CN103020354A (application CN2012105354677A)
Authority
CN
China
Prior art keywords
curtain
prime
image
camera
point
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN2012105354677A
Other languages
Chinese (zh)
Inventor
刘雷
胡文龙
梁晟溟
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Harbin Feiyu Technology Co Ltd
Original Assignee
Harbin Feiyu Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Harbin Feiyu Technology Co Ltd filed Critical Harbin Feiyu Technology Co Ltd
Priority to CN2012105354677A priority Critical patent/CN103020354A/en
Publication of CN103020354A publication Critical patent/CN103020354A/en
Pending legal-status Critical Current

Landscapes

  • Image Processing (AREA)

Abstract

The invention discloses a design method of a spherical curtain projection system based on principles of space analytic geometry. Its aim is to divide a spherical curtain into a number of regions and to identify, by monocular imaging, the region to which any projection point on the curtain belongs, thereby providing information support for practical engineering requirements. The method uses a software program as a tool with which images of the spherical curtain and of the projection points on it can be obtained in advance, so that the regions of the spherical curtain can be conveniently and reasonably partitioned and debugged, and a one-to-one mapping can be established between all curved regions of the curtain and all quadrilateral regions in the curtain image.

Description

A design method of a spherical curtain projection system for region identification
Technical field
The present invention relates to a design method for a spherical curtain projection system based on principles of space analytic geometry. Its purpose is to divide the spherical curtain into a number of regions and to identify, by monocular imaging, the region to which any projection point on the curtain belongs, thereby providing information support for practical engineering requirements. The method uses a software program as a tool to carry out reasonable partitioning and debugging of the regions of the spherical curtain, and thereby establishes a one-to-one mapping between all curved regions on the curtain and all quadrilateral regions in the curtain image.
Background art
Digital image processing has developed into an integrated technology spanning many fields. Its significance is no longer limited to traditional categories such as improving the visual quality of images, extracting image information and manipulating image data; it is gradually forming deeper connections with many scientific disciplines. The present method is intended to provide the image data support needed for the design and debugging of a particular digital image processing system.
The proposal of this method arises from a real engineering background. The project requires the design of a spherical curtain projection system consisting of a hemispherical curtain, a beam projector and a camera. The curtain is first divided into a number of non-intersecting curved regions; the projector then emits a fine light beam onto the curtain to form a projection point. The camera captures an image of the spherical curtain and the projection point from an arbitrary shooting angle, and the curved region containing the projection point is finally identified from the captured image.
To achieve this, the curtain is first divided into a specified number of curved regions, each of which is numbered. Each curved region then appears in the camera image as an irregular quadrilateral formed by four points (the image points of the four vertices of that region). Because the curved regions differ from one another, their quadrilaterals in the image also differ, so a one-to-one mapping can be established between all curved regions and all quadrilaterals. Then, by determining the quadrilateral region to which the image point of the projection point belongs, the curved region containing the projection point can be identified from the previously established one-to-one mapping, thus satisfying the engineering requirement.
Since the engineering task does not restrict the position or the shooting angle of the camera when capturing images of the spherical curtain and the projection point, establishing the one-to-one mapping between the curved regions and the quadrilateral regions in the image requires the imaging information of the curtain as shot by the camera at an arbitrary position and at an arbitrary angle. In practice, however, accurately locating the camera and each point on the spherical curtain, and accurately setting the three angles of the lens optical axis (the horizontal azimuth α, the vertical pitch angle β and the axial rotation angle γ; the optical axis is the line connecting the light-admitting point of the lens and the centre of the image sensor), involves considerable technical difficulty. The present invention therefore proposes a software simulation of the spherical curtain projection system. With this method, the three shooting angles of the lens optical axis can be set precisely in a software program, and the relative positions of every point on the spherical curtain and the light-admitting point of the camera (also called the "optical centre") can be set arbitrarily and exactly in a spatial rectangular coordinate system; that is, the required image can be obtained almost without error for any shooting configuration.
Summary of the invention
The method establishes a world coordinate system and sets the coordinates of the camera's light-admitting point in it. Grid lines are drawn on the spherical curtain, each grid cell being one of the required curved regions. The vertices of the curved regions (the intersections of the grid lines) are painted as black dots so that they form easily identified black image points, and the world coordinates of all grid intersections on the spherical curtain are set. When the camera optical axis points at the curtain from different angles, each curved region images as an irregular quadrilateral in the curtain image. With the proposed method, once the camera position and the shooting angles of the optical axis are set, the imaging information of each curved region in the curtain image, that is, the exact shape of each quadrilateral, can be obtained.
To display the image (an irregular quadrilateral) of each curved region of the spherical curtain, a spatial geometric model of the imaging system formed jointly by the curtain and the camera is built. In this model a world coordinate system is established, and the coordinates of the curtain, the light-admitting point of the lens and the image sensor are specified in it; an image coordinate system is established in the sensor plane, and the transformation from object point to image point is realized according to the pinhole imaging principle. Concretely, the line connecting an object point and the light-admitting point intersects the plane of the image sensor at a point; if that point lies within the rectangular sensor area (the sensor is a rectangular device whose size can be set in the program according to the device specification and design requirements), it is the image point of the object point; otherwise the object point cannot be imaged, i.e. it lies outside the field of view of the lens.
The key problem in this process is to obtain a data representation of the image sensor in the world coordinate system once the position of the light-admitting point of the lens and the three rotation angles of the optical axis have been set. In the present invention the world coordinate system takes the light-admitting point of the lens as its origin. The camera optical axis coincides with the y axis, which serves as the initial shooting direction; the image sensor is parallel to the xOz plane with its four edges parallel to the x and z axes, which fixes the initial position of the sensor. When the spherical curtain is shot with this method, the world coordinates of all points on the curtain are determined by the position of the light-admitting point (the coordinate origin); and once the shooting direction of the optical axis is fixed, its deviation from the initial shooting direction is quantitatively calibrated by the three angles mentioned above (horizontal azimuth α, vertical pitch angle β and axial rotation angle γ).
The technical solution adopted by the present invention is as follows:
A design method of a spherical curtain projection system for region identification, characterized in that: the quadrilateral region of the image of the spherical curtain (3) that contains the image point of the projection point (4) is identified, and from it the curved region of the spherical curtain containing the projection point (4); the three camera shooting angles α, β and γ can all be preset; the relative position of the camera and the spherical curtain (3) in the world coordinate system can be preset; and the size of the camera image sensor 1a (the lengths l₁ and l₂, and the focal length l) can be preset.
The design method described above is further characterized by: the method of establishing the one-to-one mapping between each curved region on the spherical curtain (3) and each quadrilateral region in the curtain image; and, once the coordinate values of B″, B₁‴, B₂‴, B₃‴ and B₄‴ in the world coordinate system have been obtained, the algorithm for deriving the coordinates (P_x′, P_y′) of the image point (5) in the image coordinate system, in particular the method of determining d₁, d₂ and the signs of P_x′ and P_y′ in the image coordinate system.
The content of the invention is discussed in detail below in connection with the drawings and the specific embodiments.
Description of drawings
Fig. 1 is a simplified schematic diagram of the mathematical model of the camera when shooting the spherical curtain;
Fig. 2 is a schematic diagram of the mathematical model of the camera when shooting the spherical curtain;
Fig. 3 is a schematic diagram of the transformation of the image point from the world coordinate system to the image coordinate system;
Fig. 4 is the front elevation of the actual spherical curtain, obtained from Fig. 5 by keeping the grid intersections and removing the grid lines;
Fig. 5 is the front elevation of the spherical curtain divided into 30 curved regions (with grid lines and region numbers);
Fig. 6 is the image of the spherical curtain with the camera parameters α, β, γ set to 10°, 20° and 3° respectively (the rectangular frame represents the image boundary; parts outside it are not imaged);
Fig. 7 is a schematic diagram of the relation between the lens focal length, the sensor size and the field-of-view range.
Symbol description:
1a is the image sensor in its initial position; 1b is the sensor after the horizontal azimuth α has been set; 1c is the sensor after the vertical pitch angle β has additionally been set on the basis of 1b; 1d is the sensor after rotation by the angle γ about the optical axis on the basis of 1c; 2 is the light-admitting point (optical centre) of the lens; 3 is the spherical curtain; 4 is the object point (also the projection point in Fig. 2); 5 is the image point; 6 is the origin of the image coordinate system; 7 is the camera optical axis (focal length); 8 is the light beam emitted by the projector; 9 is the beam projector.
Embodiment
The actual spherical curtain (3) is shown in Fig. 4, with 40 points distributed on it; every 4 points enclose one curved region (to distinguish the 30 curved regions more clearly, Fig. 5 adds grid lines and region numbers on the basis of Fig. 4). The arc-length ratios are adjustable, and the ratio of |A₁₁A₂₁|, |A₂₁A₃₁|, |A₃₁A₄₁| is also adjustable. When these two groups of ratios are set appropriately, the ratio of arc length to height differs for every curved region on the spherical curtain (3). Moreover, whatever the values of the shooting deflection angles of the camera optical axis (7), the length-to-width ratios of the irregular quadrilateral regions in the curtain image (each the image of a curved region of the original curtain, as in Fig. 6) remain distinct. Hence, on the basis of 30 distinct groups of ratios, the irregular quadrilateral regions in the curtain image (the images) can be put in one-to-one correspondence with the curved regions on the original curtain (the objects).
Once the one-to-one mapping between each curved region of the curtain and each irregular quadrilateral region of the curtain image has been established, the curved region containing the projection point can be determined by identifying the quadrilateral region containing the image point of the projection point, thereby meeting the engineering requirement.
The main content of the present invention is precisely the algorithms required by the software program. With this program, the world coordinate system and the image coordinate system can be established, the relative position of the spherical curtain (3) and the camera optical centre (2) determined, the camera shooting angles (the three angles α, β, γ) set exactly, and the image of the curtain (3) under the current shooting configuration obtained. The final purpose of obtaining the image is to prepare for a reasonable setting of the two groups of ratios (the arc-length ratios and the ratio of |A₁₁A₂₁|, |A₂₁A₃₁|, |A₃₁A₄₁|).
The main content of the described method is explained below.
Fig. 7 shows the relation between the image sensor 1a in the camera, the optical axis (focal length) 7 and the field-of-view range. Let the length of B₁B₂ be l₁ and the length of B₂B₃ be l₂. Point O is the light-admitting point (2) of the lens, and OB is its focal length 7 (OB is perpendicular to the plane of the quadrilateral B₁B₂B₃B₄, with foot B) of length l.
A₁ and A₂ are the midpoints of B₁B₂ and B₃B₄ respectively. Draw the lines OA₁ and OA₂ (shown dashed in Fig. 7); the angle between them is β, the vertical field of view of the camera (assume B₁B₂ is placed horizontally and B₂B₃ vertically). The relation between β, l and l₂ is given by formula (1):

tan(β/2) = (l₂/2)/l    (1)

Here β is set to 46.8°, i.e. the vertical field-of-view range is 46.8°. The length l₁ is set to 0.006773 m and the width l₂ to 0.005080 m, the common size of a 1/3-inch image sensor. Substituting the values of β and l₂ into (1) gives l = 0.005866 m. Substituting the values of l and l₁ into (1) in turn gives a horizontal field of view α of 60°.
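Formula (1) and the numbers above can be checked with a short calculation; the sketch below (Python; the function names are illustrative and not part of the patent) reproduces the stated focal length and horizontal field of view from the 1/3-inch sensor dimensions.

```python
import math

def focal_from_fov(sensor_len, fov_deg):
    # Formula (1): tan(fov/2) = (sensor_len/2) / focal, solved for the focal length.
    return (sensor_len / 2) / math.tan(math.radians(fov_deg) / 2)

def fov_from_focal(sensor_len, focal):
    # Formula (1) solved for the field of view, in degrees.
    return math.degrees(2 * math.atan((sensor_len / 2) / focal))

l1, l2 = 0.006773, 0.005080        # 1/3-inch sensor length and width, in metres
l = focal_from_fov(l2, 46.8)       # vertical FOV 46.8 deg -> l is about 0.005866 m
alpha_fov = fov_from_focal(l1, l)  # horizontal FOV, about 60 deg
```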
The above explains how the internal camera parameters are set. In practice, before the method is applied to obtain the image of the spherical curtain (3), a lens satisfying the engineering field-of-view requirement must be selected in advance; once the lens model is determined, the sensor size and focal length are determined with it, and these parameters need only be entered into the program.
As discussed above, a world coordinate system (called coordinate system I) is first established with the lens optical centre (2) as the origin and the y axis perpendicular to the sensor 1a, as shown in Fig. 2. The coordinates of point B (the centre of the sensor 1a) are then (0, −l, 0), and B₁B₂, B₂B₃ are parallel to the x and z axes respectively. The sensor B₁B₂B₃B₄ at position 1a defines the initial shooting direction and position. The coordinates, in coordinate system I, of the 40 region vertices (that is, the grid intersections) on the spherical curtain (3) can be set roughly according to the engineering requirement and changed later as the arc-length ratios change.
The three deflection angles of the lens optical axis (7) (horizontal azimuth α, vertical pitch angle β and axial rotation angle γ) are set in the program in the following three steps.
Step 1: as shown in Fig. 2, OB deflects in the xOy plane by the angle α (relative to the initial position) and becomes OB′ ("2" in Fig. 2). It is agreed that α > 0 when the x coordinate of B′ is greater than zero, and α < 0 when the x coordinate of B′ is less than zero.
" position (" 3 " among Fig. 2), angle changing is angle of pitch β to OB by the described position transfer of the first step for second step, OB '.Agreement is as B " the z axial coordinate of point greater than zero the time, β〉0; As B " the z axial coordinate of point less than zero the time, β<0.
Step 3: 1d in Fig. 2 illustrates the setting process of the axial rotation angle γ. On the basis of step 2, the sensor 1c rotates by the angle γ about the optical axis (7), and the four vertices B₁″, B₂″, B₃″, B₄″ become B₁‴, B₂‴, B₃‴, B₄‴ accordingly. It is agreed that, looking from O toward B″, γ is positive when the camera (sensor) rotates counterclockwise and negative when it rotates clockwise.
If B ", B 1", B 2", B 3", B 4" coordinate in the coordinate system I is respectively (B x, B y, B z), (B 1x, B 1y, B 1z), (B 2x, B 2y, B 2z), (B 3x, B 3y, B 3z), (B 4x, B 4y, B 4z), then have
B x=|OB″|sinαcos β (2)
B y=-|OB″|cosαcosβ (3)
B z=|OB″|sinβ (4)
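Equations (2)-(4) can be sketched directly; the short Python check below (function name illustrative) uses two sanity properties: with α = β = 0 the sensor centre returns to its initial position (0, −l, 0), and |OB″| always equals the focal length l.

```python
import math

def sensor_centre(l, alpha_deg, beta_deg):
    # Equations (2)-(4): position of B'' after the azimuth alpha and the pitch
    # beta have been set; the distance |OB''| equals the focal length l.
    a, b = math.radians(alpha_deg), math.radians(beta_deg)
    return (l * math.sin(a) * math.cos(b),
            -l * math.cos(a) * math.cos(b),
            l * math.sin(b))
```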
B_1x = B_x − (|B₂″B₃″|/2) sinα sinβ + (|B₁″B₂″|/2) cosα    (5)
B_1y = B_y + (|B₂″B₃″|/2) sinβ cosα + (|B₁″B₂″|/2) sinα    (6)
B_1z = B_z + (|B₂″B₃″|/2) cosβ    (7)
B_2x = B_x − (|B₂″B₃″|/2) sinα sinβ − (|B₁″B₂″|/2) cosα    (8)
B_2y = B_y + (|B₂″B₃″|/2) sinβ cosα − (|B₁″B₂″|/2) sinα    (9)
B_2z = B_z + (|B₂″B₃″|/2) cosβ    (10)
B_3x = B_x + (|B₂″B₃″|/2) sinα sinβ − (|B₁″B₂″|/2) cosα    (11)
B_3y = B_y − (|B₂″B₃″|/2) sinβ cosα − (|B₁″B₂″|/2) sinα    (12)
B_3z = B_z − (|B₂″B₃″|/2) cosβ    (13)
B_4x = B_x + (|B₂″B₃″|/2) sinα sinβ + (|B₁″B₂″|/2) cosα    (14)
B_4y = B_y − (|B₂″B₃″|/2) sinβ cosα + (|B₁″B₂″|/2) sinα    (15)
B_4z = B_z − (|B₂″B₃″|/2) cosβ    (16)
If, after the image sensor 1c rotates about the optical axis (7) by γ, the coordinates of B₁‴, B₂‴, B₃‴, B₄‴ become in turn (B_1x′, B_1y′, B_1z′), (B_2x′, B_2y′, B_2z′), (B_3x′, B_3y′, B_3z′), (B_4x′, B_4y′, B_4z′) (clearly, the coordinates of B″ are not affected by the rotation about the optical axis and remain unchanged), these four coordinate values can be obtained by the following process.

Let

e_x = B_x / √(B_x² + B_y² + B_z²)    (17)
e_y = B_y / √(B_x² + B_y² + B_z²)    (18)
e_z = B_z / √(B_x² + B_y² + B_z²)    (19)

where (e_x, e_y, e_z) is the unit direction vector of OB″; then

x₁ = x₀[e_x²(1 − cosγ) + cosγ] + y₀[e_x e_y(1 − cosγ) − e_z sinγ] + z₀[e_x e_z(1 − cosγ) + e_y sinγ]    (20)
y₁ = x₀[e_x e_y(1 − cosγ) + e_z sinγ] + y₀[e_y²(1 − cosγ) + cosγ] + z₀[e_y e_z(1 − cosγ) − e_x sinγ]    (21)
z₁ = x₀[e_x e_z(1 − cosγ) − e_y sinγ] + y₀[e_y e_z(1 − cosγ) + e_x sinγ] + z₀[e_z²(1 − cosγ) + cosγ]    (22)
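Equations (20)-(22) are the component form of the axis-angle (Rodrigues) rotation; a minimal runnable sketch (Python, names illustrative):

```python
import math

def rotate_about_axis(p, e, gamma_deg):
    # Equations (20)-(22): rotate the point p = (x0, y0, z0) about the unit
    # axis e = (ex, ey, ez) through the origin by the angle gamma.
    g = math.radians(gamma_deg)
    c, s = math.cos(g), math.sin(g)
    t = 1.0 - c
    ex, ey, ez = e
    x0, y0, z0 = p
    x1 = x0 * (ex * ex * t + c) + y0 * (ex * ey * t - ez * s) + z0 * (ex * ez * t + ey * s)
    y1 = x0 * (ex * ey * t + ez * s) + y0 * (ey * ey * t + c) + z0 * (ey * ez * t - ex * s)
    z1 = x0 * (ex * ez * t - ey * s) + y0 * (ey * ez * t + ex * s) + z0 * (ez * ez * t + c)
    return (x1, y1, z1)
```

Rotating (1, 0, 0) about the z axis by 90° gives (0, 1, 0), matching the counterclockwise sign convention agreed for γ.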
Combining equations (2)–(19), substituting the values of α, β, γ, |OB″| (that is, l), |B₁″B₂″| (that is, l₁) and |B₂″B₃″| (that is, l₂), and replacing (x₀, y₀, z₀) in (20)–(22) in turn by (B_1x, B_1y, B_1z), (B_2x, B_2y, B_2z), (B_3x, B_3y, B_3z), (B_4x, B_4y, B_4z) yields the coordinate values of B″, B₁‴, B₂‴, B₃‴, B₄‴ in coordinate system I: (B_x, B_y, B_z), (B_1x′, B_1y′, B_1z′), (B_2x′, B_2y′, B_2z′), (B_3x′, B_3y′, B_3z′), (B_4x′, B_4y′, B_4z′).
With this, the setting of the three deflection angles of the camera optical axis (7) is complete.
Under the condition that the coordinates (B_x, B_y, B_z) of B″ are known, the equation of the plane B₁‴B₂‴B₃‴B₄‴ can be obtained:

B_x·x + B_y·y + B_z·z − B_x² − B_y² − B_z² = 0    (23)

Combining with (2)–(4),

B_x·x + B_y·y + B_z·z − |OB″|² = 0    (24)

that is,

B_x·x + B_y·y + B_z·z − l² = 0    (25)

Clearly, the line connecting the object point (4) and the light-admitting point (2) meets the plane B₁‴B₂‴B₃‴B₄‴ at the image point (5). The coordinates (P_x, P_y, P_z) of the image point (5) in coordinate system I can be obtained by solving the system formed by the equation of the line OP and the plane equation of B₁‴B₂‴B₃‴B₄‴.
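Because the line OP passes through the origin, solving it together with plane equation (25) reduces to a single scaling: the image point is P = tQ with t = l²/(B″·Q) for object point Q. A sketch under that observation (Python; names illustrative):

```python
def image_point(obj, b):
    # Intersection of the ray from the origin O through the object point obj
    # with the sensor plane Bx*x + By*y + Bz*z = l^2 of equation (25),
    # where b = B'' and |OB''| = l.
    l_sq = b[0] ** 2 + b[1] ** 2 + b[2] ** 2
    denom = b[0] * obj[0] + b[1] * obj[1] + b[2] * obj[2]
    if denom <= 0:
        return None  # the object point is behind the camera and has no image
    t = l_sq / denom
    return (t * obj[0], t * obj[1], t * obj[2])
```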
From the coordinates (B_1x′, B_1y′, B_1z′), (B_2x′, B_2y′, B_2z′), (B_4x′, B_4y′, B_4z′) of B₁‴, B₂‴, B₄‴ in coordinate system I, the equations (26) and (27) of the lines B₁‴B₂‴ and B₁‴B₄‴ follow:

(x − B_1x′)/(B_1x′ − B_2x′) = (y − B_1y′)/(B_1y′ − B_2y′) = (z − B_1z′)/(B_1z′ − B_2z′)    (26)
(x − B_1x′)/(B_1x′ − B_4x′) = (y − B_1y′)/(B_1y′ − B_4y′) = (z − B_1z′)/(B_1z′ − B_4z′)    (27)

According to the formula for the distance from a spatial point to a line,

L = |{X − x₀, Y − y₀, Z − z₀} × {l, m, n}| / √(l² + m² + n²)    (28)

the distances d₁, d₂ from the image point (5) to the lines B₁‴B₄‴ and B₁‴B₂‴ can be obtained. In formula (28), (X, Y, Z) are the coordinates of the spatial point, {l, m, n} is the direction vector of the spatial line, (x₀, y₀, z₀) are the coordinates of some point on that line, and L is the point-to-line distance.
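Formula (28) translates directly into code; a small sketch (Python, name illustrative) of the computation used for distances such as d₁ and d₂:

```python
import math

def point_line_distance(p, p0, v):
    # Formula (28): distance from the point p = (X, Y, Z) to the line through
    # p0 = (x0, y0, z0) with direction vector v = (l, m, n), via the cross product.
    d = (p[0] - p0[0], p[1] - p0[1], p[2] - p0[2])
    cross = (d[1] * v[2] - d[2] * v[1],
             d[2] * v[0] - d[0] * v[2],
             d[0] * v[1] - d[1] * v[0])
    return math.sqrt(sum(c * c for c in cross)) / math.sqrt(sum(c * c for c in v))
```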
The coordinates (P_x′, P_y′) of the image point (5) in the image coordinate system (called coordinate system II) are further determined according to Fig. 3. Taking P_x′ as an example, let d₃ be the distance from P′ to the line B₂‴B₃‴ (note that this distance is computed in coordinate system I), calculated by formula (28). Clearly, if d₁ and d₃ satisfy

d₁ < d₃ and d₃ > |B₁‴B₂‴|    (29)

then P_x′ = −d₁, i.e. P_x′ is negative; otherwise P_x′ = d₁ and P_x′ is positive. The sign of P_y′ is obtained similarly, giving the coordinates (P_x′, P_y′) of the image point (5) in coordinate system II. This completes the coordinate transformation from the object point (4) to the image point (5).
For each of the 40 points on the spherical curtain (3), coding the method discussed above and obtaining the corresponding image-point coordinates on the sensor yields a complete curtain image.
Fig. 6 shows the image of the spherical curtain (3) obtained by running the software program with the camera parameters α, β, γ set to 10°, 20° and 3° respectively; the black rectangular frame 1d is the image sensor B₁B₂B₃B₄ and represents the actual boundary of the image.
After this, the arc-length ratios are adjusted repeatedly until, no matter from which angle the camera shoots, the 30 length-to-width ratios of the 30 quadrilaterals in the image remain distinct (as noted above, some quadrilateral regions may not appear within the rectangular sensor area). The arc-length ratios at that moment are recorded, and the layout of the 40 points on the actual spherical curtain is set with this group of ratios as the standard.
After the spherical curtain is actually photographed with a camera in the engineering application, a curtain image similar to Fig. 6 (now a real image) is obtained. The coordinates of all image points in the real image are obtained by digital-image fitting; the 30 length-to-width ratios of the quadrilaterals are computed from the topological structure of each quadrilateral; and, using these 30 groups of ratios as the basis, the corresponding curved regions of the original spherical curtain (3) are found. Thus, once the quadrilateral region containing the image point of the projection point is determined, the curved region of the original spherical curtain containing the projection point can be identified.
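Deciding which quadrilateral region of the real image contains the image point of the projection point is, in the end, a point-in-convex-quadrilateral test. The patent does not spell this step out; one common sketch (an assumption, in Python) checks that the point lies on the same side of all four directed edges:

```python
def point_in_quad(p, quad):
    # True if the 2-D point p lies inside (or on the boundary of) the convex
    # quadrilateral quad, whose four vertices are listed in order around it.
    side = None
    for i in range(4):
        ax, ay = quad[i]
        bx, by = quad[(i + 1) % 4]
        cross = (bx - ax) * (p[1] - ay) - (by - ay) * (p[0] - ax)
        if cross != 0:
            s = cross > 0
            if side is None:
                side = s
            elif s != side:
                return False  # p is on different sides of two edges: outside
    return True
```

Running this test against each of the 30 quadrilaterals fitted from the real image gives the region index, and the one-to-one mapping then gives the curved region on the spherical curtain.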

Claims (2)

1. A design method of a spherical curtain projection system for region identification, characterized in that: the quadrilateral region of the image of the spherical curtain (3) that contains the image point of the projection point (4) is identified, and from it the curved region of the spherical curtain containing the projection point (4); the three camera shooting angles α, β and γ can all be preset; the relative position of the camera and the spherical curtain (3) in the world coordinate system can be preset; and the size of the camera image sensor 1a (the lengths l₁ and l₂, and the focal length l) can be preset.
2. The design method of a spherical curtain projection system for region identification according to claim 1, characterized by the method of establishing the one-to-one mapping between each curved region on the spherical curtain (3) and each quadrilateral region in the curtain image.
CN2012105354677A 2012-12-12 2012-12-12 Design method of spherical curtain projection system for region identification Pending CN103020354A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN2012105354677A CN103020354A (en) 2012-12-12 2012-12-12 Design method of spherical curtain projection system for region identification

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN2012105354677A CN103020354A (en) 2012-12-12 2012-12-12 Design method of spherical curtain projection system for region identification

Publications (1)

Publication Number Publication Date
CN103020354A true CN103020354A (en) 2013-04-03

Family

ID=47968956

Family Applications (1)

Application Number Title Priority Date Filing Date
CN2012105354677A Pending CN103020354A (en) 2012-12-12 2012-12-12 Design method of spherical curtain projection system for region identification

Country Status (1)

Country Link
CN (1) CN103020354A (en)


Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111566704A (en) * 2017-09-26 2020-08-21 交互数字Ce专利控股公司 Method and network equipment for tiling a sphere representing spherical multimedia content
CN110933280A (en) * 2019-12-23 2020-03-27 吉林省广播电视研究所(吉林省广播电视局科技信息中心) Front-view steering method and steering system for plane oblique image
CN110933280B (en) * 2019-12-23 2021-01-26 吉林省广播电视研究所(吉林省广播电视局科技信息中心) Front-view steering method and steering system for plane oblique image

Similar Documents

Publication Publication Date Title
US9451236B2 (en) Apparatus for synthesizing three-dimensional images to visualize surroundings of vehicle and method thereof
CN109685855B (en) Camera calibration optimization method under road cloud monitoring platform
CN110211043B (en) Registration method based on grid optimization for panoramic image stitching
CN110197466B (en) Wide-angle fisheye image correction method
CN102509261B (en) Distortion correction method for fisheye lens
US7479982B2 (en) Device and method of measuring data for calibration, program for measuring data for calibration, program recording medium readable with computer, and image data processing device
CN101697105B (en) Camera type touch detection positioning method and camera type touch detection system
US6768813B1 (en) Photogrammetric image processing apparatus and method
CN101813465B (en) Monocular vision measuring method of non-contact precision measuring corner
CN101739707B (en) Elliptic fisheye image-based distortion correction method
CN107665483B (en) Calibration-free convenient monocular head fisheye image distortion correction method
CN103258329B (en) A kind of camera marking method based on ball one-dimensional
CN104408689A (en) Holographic-image-based streetscape image fragment optimization method
CN109903227A (en) Full-view image joining method based on camera geometry site
CN109961485A (en) A method of target positioning is carried out based on monocular vision
CN104657982A (en) Calibration method for projector
CN105844584A (en) Method for correcting image distortion of fisheye lens
CN105825470A (en) Fisheye image correction method base on point cloud image
CN105469412A (en) Calibration method of assembly error of PTZ camera
CN106169076A (en) A kind of angle license plate image storehouse based on perspective transform building method
CN106570906A (en) Rectangular pattern-based method for detecting distances under camera angle deflection condition
CN105988224A (en) 3D display device and Moire fringe reducing method and device thereof
CN108195472A (en) A kind of heat transfer method for panoramic imaging based on track mobile robot
CN105374067A (en) Three-dimensional reconstruction method based on PAL cameras and reconstruction system thereof
CN103020354A (en) Design method of spherical curtain projection system for region identification

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C02 Deemed withdrawal of patent application after publication (patent law 2001)
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20130403