WO2009017331A1 - Method and apparatus for obtaining panoramic and rectilinear images using rotationally symmetric wide-angle lens - Google Patents
- Authority
- WO
- WIPO (PCT)
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T3/00—Geometric image transformations in the plane of the image
- G06T3/04—Context-preserving transformations, e.g. by using an importance map
- G06T3/047—Fisheye or wide-angle transformations
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B13/00—Optical objectives specially designed for the purposes specified below
- G02B13/06—Panoramic objectives; So-called "sky lenses" including panoramic objectives having reflecting surfaces
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B13/00—Burglar, theft or intruder alarms
- G08B13/18—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
- G08B13/189—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
- G08B13/194—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
- G08B13/196—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
- G08B13/19617—Surveillance camera constructional details
- G08B13/19626—Surveillance camera constructional details optical details, e.g. lenses, mirrors or multiple lenses
- G08B13/19628—Surveillance camera constructional details optical details, e.g. lenses, mirrors or multiple lenses of wide angled cameras and camera groups, e.g. omni-directional cameras, fish eye, single units having multiple cameras achieving a wide angle view
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/698—Control of cameras or camera modules for achieving an enlarged field of view, e.g. panoramic image capture
Definitions
- the present invention generally relates to mathematically precise image processing methods of extracting distortion-free rectilinear images and panoramic images, which appear most natural to the naked eye, from images acquired using a camera equipped with a wide-angle lens that is rotationally symmetric about an optical axis, as well as devices using the methods.
- A panoramic camera, which captures the 360° view of scenic places such as tourist resorts, is an example of a panoramic imaging system.
- A panoramic imaging system is an imaging system that captures the views one could get by making one complete turn around a given spot.
- An omnidirectional imaging system captures the view of every possible direction from a given position.
- An omnidirectional imaging system provides a view that a person could observe from a given position by turning around as well as looking up and down.
- In this case, the solid angle of the region that can be captured by the imaging system is 4π steradians.
- One method of obtaining a panoramic image is to employ a fisheye lens with a wide field of view (FOV). For example, the entire sky and the horizon can be captured in a single image by pointing a camera equipped with a fisheye lens with 180° FOV toward the zenith (i.e., the optical axis of the camera is aligned perpendicular to the ground plane).
- fisheye lenses have therefore often been referred to as "all-sky lenses".
- a high-end fisheye lens by Nikon, namely the 6mm f/5.6 Fisheye-Nikkor, has a FOV of 220°. Therefore, a camera equipped with this lens can capture a portion of the backside of the camera as well as the front side of the camera. A panoramic image can then be obtained from the fisheye image thus obtained after proper image processing.
- in many cases the imaging system is installed on vertical walls. Imaging systems installed on outside walls of a building for the purpose of monitoring the surroundings, or a rear view camera for monitoring the backside of a passenger car, are such examples. In such cases, it is inefficient if the horizontal field of view is significantly larger than 180°. This is because a wall, which does not need to be monitored, takes up a large space in the monitor screen. Pixels are wasted in this case, and the screen appears dull. Therefore, a horizontal FOV around 180° is more appropriate for such cases. Nevertheless, a fisheye lens with 180° FOV is not desirable for such applications. This is because the barrel distortion that accompanies a fisheye lens evokes psychological discomfort and is abhorred by the consumer.
- An example of an imaging system which can be installed on an interior wall for the purpose of monitoring the entire room is a pan-tilt-zoom camera.
- such a camera is comprised of a video camera, equipped with an optical zoom lens, mounted on a pan-tilt stage.
- Pan is an operation of rotating in the horizontal direction by a given angle,
- tilt is an operation of rotating in the vertical direction by a given angle.
- pan is an operation of changing the longitude,
- tilt is an operation of changing the latitude. Therefore, the theoretical range of the pan operation is 360°, and the theoretical range of the tilt operation is 180°.
- The shortcomings of a pan-tilt-zoom camera include high price, large size and heavy weight. An optical zoom lens is large, heavy and expensive due to the difficulty of its design and its complicated structure. Also, a pan-tilt stage is an expensive device no cheaper than a camera. Therefore, it costs a considerable sum of money to install a pan-tilt-zoom camera. Furthermore, since a pan-tilt-zoom camera is large and heavy, this can become a serious impediment to certain applications. Examples of such cases include airplanes, where the weight of the payload is of critical importance, or cases where a strict size limitation exists in order to install a camera in a confined space. Furthermore, a pan-tilt-zoom operation takes time because it is a mechanical operation. Therefore, depending on the particular application at hand, such a mechanical operation may not be fast enough.
- References 1 and 2 provide fundamental technologies of extracting an image having a particular viewpoint or projection scheme from an image having other than the desirable viewpoint or projection scheme.
- reference 2 provides an example of a cubic panorama.
- a cubic panorama is a special technique of illustration wherein the observer is assumed to be located at the very center of an imaginary cubic room made of glass, and the outside view from the center of the glass room is directly transcribed on the region of the glass wall where the ray vector from the object to the observer meets the glass wall.
- An example of a more advanced technology is provided in the above reference with which reflections from an arbitrarily shaped mirrored surface can be calculated.
- the author of reference 2 created an imaginary lizard having a highly reflective mirror-like skin as if made of a metal surface, then set-up an observer's viewpoint separated from the lizard, and calculated the view of the imaginary environment reflected on the lizard skin from the viewpoint of the imaginary observer.
- the environment was not a real environment captured by an optical lens, but a computer-created imaginary environment captured with an imaginary distortion-free pinhole camera.
- an imaging system is described in reference 3 that is able to perform pan-tilt-zoom operations without a physically moving part.
- the said invention uses a camera equipped with a fisheye lens with more than 180° FOV in order to take a picture of the environment. The user then designates a principal direction of vision using various devices such as a joystick, upon which the computer extracts from the fisheye image the rectilinear image that could be obtained by pointing a distortion-free camera in that particular direction.
- this invention creates a rectilinear image corresponding to the particular direction the user has designated using devices such as a joystick or a computer mouse.
- Such a technology is a core technology in the field of virtual reality, or when it is desirable to replace a mechanical pan-tilt-zoom camera, and the keyword is "interactive picture".
- with this technology there are no physically moving parts in the camera. As a consequence, the system response is fast, and there is less chance of mechanical failure.
- the said invention assumes that the projection scheme of the fisheye lens is an ideal equidistance projection scheme. But the real projection scheme of a fisheye lens generally shows a considerable deviation from an ideal equidistance projection scheme. Since the said invention does not take into account the distortion characteristics of the real lens, images obtained after image processing still show distortion.
- the invention described in reference 4 remedies the shortcoming of the invention described in reference 3, namely the inability of taking into account the real projection scheme of the fisheye lens used in image processing. Nevertheless, the defect of not showing vertical lines as vertical lines in the monitor screen has not been resolved.
- the equi-rectangular projection scheme is the projection scheme most familiar to us when we describe the geography of the earth, or when we draw the celestial sphere in order to make a map of the constellations.
- FIG. 2 is a schematic diagram of a planar map drawn according to the equi-rectangular projection scheme.
- a point Q on the earth's surface having a longitude ψ and a latitude δ has a corresponding point P" on the planar map (234) drawn according to the equi-rectangular projection scheme.
- the rectangular coordinate of this corresponding point is given as (x", y").
- the reference point on the equator having a longitude of 0° and a latitude of 0° has a corresponding point O" on the planar map, and this corresponding point O" is the origin of the rectangular coordinate system.
- the same interval in longitude, i.e., the same angular distance along the equator,
- the lateral coordinate x" on the planar map (234) is proportional to the longitude.
- here, c is the proportionality constant.
- longitudinal coordinate y" is proportional to the latitude, and has the same proportionality constant as the lateral coordinate.
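The equi-rectangular mapping described above can be sketched in a few lines. The function name and the default constant c are illustrative, not from the patent; only the relation itself (both map coordinates proportional to the angles, with the same constant) comes from the text:

```python
import math

def equirectangular(lon_deg, lat_deg, c=1.0):
    """Map a point (longitude, latitude) on the sphere to planar map
    coordinates (x", y") under the equi-rectangular projection.
    Both coordinates are simply proportional to the angles, with the
    same proportionality constant c."""
    psi = math.radians(lon_deg)    # longitude in radians
    delta = math.radians(lat_deg)  # latitude in radians
    return c * psi, c * delta

# The reference point (0 deg, 0 deg) maps to the origin O" of the map.
x0, y0 = equirectangular(0.0, 0.0)
```

Note that equal angular intervals map to equal map intervals everywhere, which is why this projection distorts areas near the poles.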
- Fig. 3 is a conceptual drawing of a cylindrical projection scheme or a panoramic perspective.
- an imaginary observer is located at the center N of a celestial sphere(331) with a radius S, and it is desired to make a map of the celestial sphere centered on the observer, the map covering most of the region excluding the zenith and the nadir.
- the span of the longitude must be 360°, ranging from -180° to +180°, but the range of the latitude can be narrower, as long as it includes the equator within its span.
- the span of the latitude can be assumed as ranging from -Δ to +Δ, where Δ must be smaller than 90°.
- Such a cylindrical projection scheme is the natural projection scheme for a panoramic camera that produces a panoramic image by rotating in the horizontal plane. In particular, if the lens mounted on the rotating panoramic camera is a distortion-free rectilinear lens, then the resulting panoramic image exactly follows a cylindrical projection scheme. In principle, such a cylindrical projection scheme is the most accurate panoramic projection scheme. However, the panoramic image appears unnatural when the latitude range is large, and thus it is not widely used in practice.
- Unwrapped panoramic image thus produced and having a cylindrical projection scheme has a lateral width W given by Eq. 3.
- the range of the latitude is from -Δ to +Δ,
- the longitudinal height of the unwrapped panoramic image is given by Eq. 7.
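Eqs. 3 and 7 are not reproduced in this extraction. Under the usual cylindrical (panoramic perspective) projection from a sphere of radius S, with x" = Sψ and y" = S·tan δ, the unwrapped panorama dimensions work out as in this sketch; the formulas are an assumption consistent with the surrounding text, not a quotation of the patent's equations:

```python
import math

def cylindrical_panorama_size(S, delta_max_deg):
    """Width and height of an unwrapped 360-degree cylindrical panorama.
    Assumes the usual panoramic perspective x" = S*psi, y" = S*tan(delta):
    W = 2*pi*S for the full longitude span, and
    H = 2*S*tan(delta_max) for latitudes in [-delta_max, +delta_max]."""
    W = 2.0 * math.pi * S
    H = 2.0 * S * math.tan(math.radians(delta_max_deg))
    return W, H

# Example: radius 100, latitude span of +/- 45 degrees.
W, H = cylindrical_panorama_size(S=100.0, delta_max_deg=45.0)
```

The tangent in the height term is what makes the image grow without bound as Δ approaches 90°, which is why the zenith and nadir are excluded.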
- the ground plane is fairly perpendicular to the gravitational force, but needless to say, this is not so on slanted ground. Therefore, the term "ground plane" actually refers to the horizontal plane, and the vertical direction is the direction perpendicular to the horizontal plane.
- the ground plane must be understood as the horizontal plane,
- the vertical direction must be understood as the direction perpendicular to the horizontal plane,
- and the horizontal direction must be understood as a direction parallel to the horizontal plane, whenever the exact meaning of a term needs to be clarified.
- Panoramic lenses described in references 7 and 8 take panoramic images in one shot with the optical axis of the panoramic lens aligned vertical to the ground plane.
- a cheaper alternative to the panoramic image acquisition method by the previously described camera with a horizontally rotating lens consists of taking an image with an ordinary camera with the optical axis horizontally aligned, and repeatedly taking pictures after horizontally rotating the optical axis by a certain amount.
- Four to eight pictures are taken in this way, and a panoramic image with a cylindrical projection scheme can be obtained by seamlessly joining the pictures consecutively.
- Such a technique is called stitching.
- QuickTime VR from Apple Computer Inc. is commercial software supporting this stitching technology. This method requires a complex, time-consuming, and elaborate operation of precisely joining several pictures and correcting the lens distortion.
- another method of obtaining a panoramic or an omnidirectional image is to take a hemispherical image by horizontally pointing a camera equipped with a fisheye lens with more than 180° FOV, and then point the camera in the exact opposite direction and take another hemispherical image.
- one omnidirectional image having the view of every direction (i.e., 4π steradians) can then be obtained by combining the two hemispherical images.
- the user can select his own viewpoint from the received omnidirectional image according to his own personal interest, and image processing software on the user's computing device can extract a partial image corresponding to the user-selected viewpoint, so that a perspectively correct planar image can be displayed on the computing device. Therefore, through the image processing software, the user can make a choice of turning around (pan), looking up or down (tilt), or taking a close (zoom in) or remote (zoom out) view, as if the user were actually at the specific place in the image.
- this method has the distinctive advantage that multiple users accessing the same Internet site are able to look along the directions of their own choices. This advantage cannot be enjoyed in a panoramic imaging system employing a motion camera such as a pan-tilt camera.
- References 10 and 11 describe a method of obtaining an omnidirectional image providing the views of every direction centered on the observer.
- the projection scheme provided by the said references is one kind of equidistance projection schemes in essence.
- the techniques described in the documents make it possible to obtain omnidirectional images from a real environment or from a cubic panorama, but the obtained omnidirectional image follows an equidistance projection scheme only, and its usefulness is thus limited.
- reference 12 provides an algorithm for projecting an Omnimax movie on a semi-cylindrical screen using a fisheye lens.
- a method is described for locating the position of the object point on the film corresponding to a certain point on the screen whereon an image point is formed. Therefore, it is possible to calculate what image has to be on the film in order to project a particular image on the screen, and such an image on the film is produced using a computer.
- since the lens distortion is already reflected in the image processing algorithm, a spectator near the movie projector can enjoy a satisfactory panoramic image.
- however, the real projection scheme of the fisheye lens in the said reference is inconvenient to use because it has been modeled with the real image height on the film plane as the independent variable, and the zenith angle of the incident ray as the dependent variable. Furthermore, the real projection scheme of the fisheye lens has, unnecessarily, been modeled only with an odd polynomial.
- Reference 13 provides examples of stereo panoramic images produced by Professor
- each of the panoramic images follows a cylindrical projection scheme, and a panoramic image of an imaginary scene produced by a computer as well as a panoramic image produced by a rotating slit camera are presented.
- the lens distortion is not an important issue.
- a rotating slit camera cannot be used to take a real-time panoramic image (i.e., a movie) of the real world.
- References 14 and 15 provide an example of a fisheye lens with 190° FOV
- reference 16 provides various examples of wide-angle lenses including dioptric and catadioptric fisheye lenses with stereographic projection schemes.
- reference 17 provides various examples of obtaining panoramic images following cylindrical projection schemes, equi-rectangular projection schemes, and Mercator projection schemes from images acquired using rotationally symmetric wide-angle lenses including fisheye lenses.
- Figure 4 is a conceptual drawing illustrating the real projection schemes of rotationally symmetric wide-angle lenses (412) including fisheye lenses.
- Z-axis of the world coordinate system describing objects captured by the wide-angle lens coincides with the optical axis(401) of the wide-angle lens(412).
- An incident ray (405) having a zenith angle θ with respect to the Z-axis is refracted by the lens (412), and, as a refracted ray (406), converges toward an image point P on the focal plane (432).
- the distance between the nodal point N of the lens and the said focal plane is approximately equal to the effective focal length of the lens.
- the sub area on the focal plane whereon real image points have been formed is the image plane(433).
- the said image plane(433) must coincide with the image sensor plane(413) within the camera body(414).
- Said focal plane and the said image sensor plane are perpendicular to the optical axis.
- the intersection point O between the optical axis (401) and the image plane (433) is hereinafter referred to as the first intersection point.
- the distance between the first intersection point and the said image point P is r.
- the unit of the incidence angle θ is the radian,
- and the above function r(θ) is a monotonically increasing function of the zenith angle θ of the incident ray.
- Such a real projection scheme of a lens can be experimentally measured using an actual lens, or can be calculated from the lens prescription using dedicated lens design software such as Code V or Zemax.
- the y-axis coordinate y of the image point on the focal plane by an incident ray having given horizontal and vertical incidence angles can be calculated using the Zemax operator REAY, and the x-axis coordinate x can be similarly calculated using the operator REAX.
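Whether measured on an actual lens or ray-traced in Code V or Zemax, the real projection scheme r(θ) is typically available only as a table of samples. A minimal sketch of interpolating such data; the sample values below are hypothetical, not from any actual lens:

```python
import bisect

# Hypothetical measured projection data for a fisheye lens: zenith angle
# theta (radians) versus real image height r (mm). In practice these pairs
# come from lens measurement or from ray tracing (e.g. Zemax REAY).
theta_samples = [0.0, 0.3, 0.6, 0.9, 1.2, 1.5]
r_samples     = [0.0, 0.45, 0.88, 1.28, 1.62, 1.90]  # monotonically increasing

def r_of_theta(theta):
    """Piecewise-linear interpolation of the real projection scheme r(theta)."""
    if not 0.0 <= theta <= theta_samples[-1]:
        raise ValueError("zenith angle outside measured range")
    i = bisect.bisect_right(theta_samples, theta)
    if i == len(theta_samples):          # theta equals the last sample
        return r_samples[-1]
    t0, t1 = theta_samples[i - 1], theta_samples[i]
    r0, r1 = r_samples[i - 1], r_samples[i]
    return r0 + (r1 - r0) * (theta - t0) / (t1 - t0)
```

Because r(θ) is monotonically increasing, the same table can be inverted (swapping the roles of θ and r) when image processing needs θ as a function of image height.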
- Figure 5 is an imaginary interior scene produced by Professor Paul Bourke using a computer, and it has been assumed that the imaginary lens used to capture the image is a fisheye lens with 180° FOV having an ideal equidistance projection scheme.
- here, r' is not a physical distance, but an image height measured in pixel distance. Since this imaginary fisheye lens follows an equidistance projection scheme, the projection scheme of this lens is given by Eq. 10.
- Math Figure 10
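The body of Eq. 10 is not legible in this extraction, but the surrounding text says the imaginary lens follows an equidistance projection, i.e., pixel image height r′ is proportional to zenith angle θ. A sketch of that relation; the function name and the r_max_px parameter are illustrative assumptions:

```python
import math

def equidistance_pixel(theta, phi, r_max_px, theta_max=math.pi / 2):
    """Pixel offset (dx, dy) from the image centre for an incident ray with
    zenith angle theta and azimuth phi, under an ideal equidistance
    projection: r'(theta) = (r_max_px / theta_max) * theta.
    r_max_px is the pixel radius of the image circle at theta_max
    (here 90 degrees, i.e. a 180-degree FOV lens)."""
    r = (r_max_px / theta_max) * theta
    return r * math.cos(phi), r * math.sin(phi)

# A ray at 45 degrees zenith angle lands exactly halfway out in the
# image circle, regardless of the circle's pixel radius.
dx, dy = equidistance_pixel(math.pi / 4, 0.0, r_max_px=240.0)
```

The linearity in θ is what distinguishes the equidistance scheme from, e.g., the stereographic scheme, where r′ grows as tan(θ/2).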
- Fig. 6 through Fig. 8 show several embodiments of wide-angle lenses presented in reference 16.
- Figure 6 is a dioptric(i.e., refractive) fisheye lens with a stereographic projection scheme
- Fig. 7 is a catadioptric fisheye lens with a stereographic projection scheme
- Fig. 8 is a catadioptric panoramic lens with a rectilinear projection scheme.
- the wide-angle lenses in the said reference and in the current invention are not limited to a fisheye lens with an equidistance projection scheme, but encompass all kinds of wide-angle lenses that are rotationally symmetric about their optical axes.
- the main point of the invention in reference 17 concerns methods of obtaining panoramic images by applying mathematically accurate image processing algorithms to images obtained using rotationally symmetric wide-angle lenses. The numerous embodiments in reference 17 can be summarized as follows.
- the world coordinate system of the said invention takes the nodal point N of a rotationally symmetric wide-angle lens as the origin, and a vertical line passing through the origin as the Y-axis.
- the vertical line is a line perpendicular to the ground plane, or more precisely to the horizontal plane (917).
- the X-axis and the Z-axis of the world coordinate system are contained within the ground plane.
- the optical axis (901) of the said wide-angle lens generally does not coincide with the Y-axis; it can be contained within the ground plane (i.e., parallel to the ground), or not contained within the ground plane.
- the plane (904) containing both the said Y-axis and the said optical axis (901) is referred to as the reference plane.
- the intersection line (902) between this reference plane (904) and the ground plane (917) coincides with the Z-axis of the world coordinate system.
- an incident ray (905) originating from an object point Q having a rectangular coordinate (X, Y, Z) in the world coordinate system has an altitude angle δ from the ground plane, and an azimuth angle ψ with respect to the reference plane.
- the plane (906) containing both the Y-axis and the said incident ray (905) is the incidence plane.
- the horizontal incidence angle ψ of the said incident ray with respect to the said reference plane is given by Eq. 11 (Math Figure 11).
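Eq. 11 itself is not legible in this extraction. For an object point Q = (X, Y, Z) in the world coordinate system defined above (Y vertical, Z along the intersection of the reference and ground planes), a plausible reading is ψ = arctan(X/Z) for the horizontal incidence angle, with the altitude angle δ measured from the ground plane:

```python
import math

def world_angles(X, Y, Z):
    """Altitude angle delta from the ground (X-Z) plane and horizontal
    incidence (azimuth) angle psi from the reference (Y-Z) plane, for an
    object point Q = (X, Y, Z). These formulas are an assumption
    consistent with the coordinate definitions, not a quotation of Eq. 11:
        psi   = atan2(X, Z)
        delta = atan2(Y, sqrt(X^2 + Z^2))
    Angles are returned in degrees."""
    psi = math.atan2(X, Z)
    delta = math.atan2(Y, math.hypot(X, Z))
    return math.degrees(delta), math.degrees(psi)

# A point straight ahead and as high as it is far has altitude 45 degrees.
delta, psi = world_angles(X=0.0, Y=1.0, Z=1.0)
```

Using `atan2` rather than `atan` keeps the azimuth well defined over the full ±180° pan range rather than only ±90°.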
- FIG. 10 is a schematic diagram of a device of the current invention, which also coincides with that of reference 17, having an imaging system which mainly includes an image acquisition means (1010), an image processing means (1016) and image display means (1015, 1017).
- the image acquisition means (1010) includes a rotationally symmetric wide-angle lens (1012) and a camera body (1014) having an image sensor (1013) inside.
- the said wide-angle lens can be a fisheye lens with more than 180° FOV and having an equidistance projection scheme, but it is by no means limited to such a fisheye lens. Rather, it can be any rotationally symmetric wide-angle lens, including a catadioptric fisheye lens.
- hereinafter, such a wide-angle lens is referred to as a fisheye lens.
- Said camera body contains photoelectronic sensors such as CCD or CMOS sensors, and it can acquire either a still image or a movie.
- using the said fisheye lens (1012), a real image of the object plane (1031) is formed on the focal plane (1032).
- to obtain a sharp image, the image sensor plane (1013) must coincide with the focal plane (1032).
- the real image of the objects on the object plane (1031) formed by the fisheye lens (1012) is converted by the image sensor (1013) into electrical signals, and displayed as an uncorrected image plane (1034) on the image display means (1015).
- this uncorrected image plane (1034) contains a barrel distortion introduced by the fisheye lens.
- this distorted image plane can be rectified by the image processing means (1016), and then displayed as a processed image plane (1035) on an image display means (1017) such as a computer monitor or a CCTV monitor.
- Said image processing can be software image processing by a computer, or hardware image processing by Field Programmable Gate Arrays (FPGA) or ARM core processors.
- FIG. 11 is a conceptual drawing of an uncorrected image plane (1134) prior to the image processing stage, which corresponds to the real image on the image sensor plane (1013). If the lateral dimension of the image sensor plane (1013) is B and the longitudinal dimension is V, then the lateral dimension of the uncorrected image plane is gB and the longitudinal dimension is gV, where g is a proportionality constant.
- the uncorrected image plane (1134) can be considered as the image displayed on the image display means without rectification of distortion, and is a magnified image of the real image on the image sensor plane by a magnification ratio g.
- the image sensor plane of a 1/3-inch CCD sensor has a rectangular shape having a lateral dimension of 4.8mm, and a longitudinal dimension of 3.6mm.
- the magnification ratio g is 100.
- the side dimension of a pixel in a digital image is considered as 1.
- a VGA-grade 1/3-inch CCD sensor has pixels in an array form with 640 columns and 480 rows.
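The relation between physical sensor-plane coordinates and the uncorrected image plane described above is a simple scaling by the magnification ratio g. A sketch using the text's 1/3-inch sensor dimensions and the example value g = 100:

```python
def sensor_to_image(x_mm, y_mm, g=100.0):
    """Convert physical coordinates (mm) on the image sensor plane to
    coordinates on the uncorrected image plane, which is simply the
    sensor-plane image magnified by the ratio g (g = 100 in the text's
    example)."""
    return g * x_mm, g * y_mm

# A 1/3-inch CCD measures 4.8 mm x 3.6 mm; with g = 100 the full sensor
# maps to a 480 x 360 region in uncorrected-image-plane units.
w, h = sensor_to_image(4.8, 3.6)
```

Pixel distances on the uncorrected image plane can then be related back to millimetres on the sensor by dividing by g.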
- the uncorrected image plane (1134) is a distorted digital image obtained by converting the real image formed on the image sensor plane into electrical signals.
- the first intersection point O on the image sensor plane is the intersection point between the optical axis and the image sensor plane. Therefore, a ray entering along the optical axis forms an image point at the said first intersection point O.
- the point O' on the uncorrected image plane corresponding to the first intersection point O on the image sensor plane - hereinafter referred to as the second intersection point - corresponds to the image point formed by an incident ray entering along the optical axis.
- the x'-axis is taken as the axis that passes through the second intersection point O' on the uncorrected image plane and is parallel to the lateral side of the uncorrected image plane,
- and the y'-axis is taken as the axis that passes through the said second intersection point O' and is parallel to the longitudinal side of the uncorrected image plane.
- the positive direction of the x'-axis runs from left to right, and the positive direction of the y'-axis runs from top to bottom.
- Figure 12 is a conceptual drawing of a rectified screen of the current invention, wherein the distortion has been removed.
- the processed image plane(1235) has a rectangular shape, of which the lateral side measuring as W and the longitudinal side measuring as H.
- a third rectangular coordinate system is assumed wherein x"-axis is parallel to the lateral side of the processed image plane, and y"-axis is parallel to the longitudinal side of the processed image plane.
- the z"-axis of the third rectangular coordinate system coincides with the z-axis of the first rectangular coordinate system and the z'-axis of the second rectangular coordinate system.
- the intersection point O" between the said z"-axis and the processed image plane - hereinafter referred to as the third intersection point - can take an arbitrary position, and it can even be located outside the processed image plane.
- the positive direction of the x"-axis runs from the left to the right, and the positive direction of the y"-axis runs from the top to the bottom.
- the first and the second intersection points correspond to the location of the optical axis.
- the third intersection point corresponds not to the location of the optical axis but to the principal direction of vision.
- the principal direction of vision may coincide with the optical axis, but it does not need to.
- Principal direction of vision is the direction of the optical axis of an imaginary panoramic or rectilinear camera corresponding to the desired panoramic or rectilinear images.
- the principal direction of vision is referred to as the optical axis direction.
- the lateral coordinate x" of a third point P" on the processed image plane (1235) has a minimum value x"min and a maximum value x"max (i.e., x"min ≤ x" ≤ x"max).
- likewise, the longitudinal coordinate y" of the third point P" has a minimum value y"min and a maximum value y"max (i.e., y"min ≤ y" ≤ y"max).
- the function F(·) is a monotonically increasing function of its argument, passing through the origin. In mathematical terminology, this means that Eqs. 16 and 17 are satisfied.
- Figure 13 is a schematic diagram of an ordinary car rear view camera(1310).
- for a car rear view camera, it is rather common that a wide-angle lens with more than 150° FOV is used, and the optical axis of the lens is typically inclined toward the ground plane(1317) as illustrated in Fig. 13.
- by installing the camera in this way, the parking lane can be easily recognized when backing up the car.
- since the lens surface is oriented downward toward the ground, dust does not settle on it, and partial protection is provided from rain and snow.
- the world coordinate system takes the nodal point N of the imaging system(1310) as the origin, and takes a vertical line that is perpendicular to the ground plane as the Y-axis, and the Z-axis is set parallel to the car(1351) axle.
- the positive direction of the X-axis is the direction directly plunging into the paper in Fig. 13.
- the lens optical axis is inclined below the horizon by an angle α
- a coordinate system fixed to the camera has been rotated around the X-axis of the world coordinate system by angle α.
- This coordinate system is referred to as the first world coordinate system, and the three axes of this first world coordinate system are named as X', Y' and Z'-axis, respectively.
- in Fig. 13 it appears that the first world coordinate system has been rotated around the X-axis clockwise by angle α relative to the world coordinate system.
- seen from the direction of the positive X-axis, however, it has in fact been rotated counterclockwise by angle α. Since the convention takes counterclockwise rotation as the positive direction, the first world coordinate system in Fig. 13 has been rotated by +α around the X-axis of the world coordinate system.
- consider the three-dimensional vector in the world coordinate system starting at the origin and ending at the point Q. The coordinate of a new point obtainable by rotating the point Q in space by an angle of -α around the X-axis is given by multiplying the matrix given in Eq. 23 on the above vector.
- the matrix given in Eq. 24 can be used to find the coordinate of a new point which is obtainable by rotating the point Q by angle -β around the Y-axis
- the matrix given in Eq. 25 can be used to find the coordinate of a new point which is obtainable by rotating the point Q by angle -γ around the Z-axis.
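Eqs. 23 through 25 are the three elementary rotation matrices. A minimal sketch in code, where the exact sign convention (a right-handed rotation by the negative angle, as described above) is an assumption about the patent's equations:

```python
import numpy as np

def rot_x(alpha):
    """Matrix of Eq. 23 (assumed form): rotates a point by -alpha about the X-axis."""
    c, s = np.cos(alpha), np.sin(alpha)
    return np.array([[1.0, 0.0, 0.0],
                     [0.0,   c,   s],
                     [0.0,  -s,   c]])

def rot_y(beta):
    """Matrix of Eq. 24 (assumed form): rotates a point by -beta about the Y-axis."""
    c, s = np.cos(beta), np.sin(beta)
    return np.array([[  c, 0.0,  -s],
                     [0.0, 1.0, 0.0],
                     [  s, 0.0,   c]])

def rot_z(gamma):
    """Matrix of Eq. 25 (assumed form): rotates a point by -gamma about the Z-axis."""
    c, s = np.cos(gamma), np.sin(gamma)
    return np.array([[  c,   s, 0.0],
                     [ -s,   c, 0.0],
                     [0.0, 0.0, 1.0]])

# Rotating the point (0, 1, 0) by -90 degrees about the X-axis moves it to (0, 0, -1).
Q = np.array([0.0, 1.0, 0.0])
assert np.allclose(rot_x(np.pi / 2) @ Q, [0.0, 0.0, -1.0])
```

Multiplying one of these matrices on the column vector of a point yields the coordinate of the rotated point, which is exactly how Eqs. 46 through 48 transform an object point into the first world coordinate system.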
- the range of the lateral coordinate (x"_min ≤ x" ≤ x"_max) on the processed image plane as well as the range of the longitudinal coordinate (y"_min ≤ y" ≤ y"_max) can take arbitrary real numbers.
- the horizontal FOV Δψ of this panoramic image (i.e., the processed image plane) is determined.
- the functional form of F(δ) dictating the desirable projection scheme along the vertical direction is determined, as well.
- the horizontal incidence angle ψ and the vertical incidence angle δ of an incident ray corresponding to the third point on the panoramic image having a rectangular coordinate (x", y") can be obtained using Eqs. 30 through 32.
- the zenith angle θ and the azimuth angle φ of an incident ray having the said horizontal incidence angle and the vertical incidence angle are calculated using Eqs. 37 and 38.
- the real image height r corresponding to the zenith angle θ of the incident ray is obtained using Eq. 9.
- the rectangular coordinate (x', y') of the image point on the uncorrected image plane is obtained using Eqs. 39 and 40. In this procedure, the coordinate of the second intersection point on the uncorrected image plane, or equivalently the first intersection point on the image sensor plane, has to be accurately determined.
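Since Eqs. 30 through 40 are not reproduced in this excerpt, the sketch below substitutes concrete, commonly used choices: a cylindrical panorama for the incidence angles and an equidistance fisheye (r = f·θ) for the real projection scheme. The function name, the parameter names, and the axis sign conventions are all illustrative assumptions:

```python
import numpy as np

def panorama_to_sensor(x2, y2, s2, f):
    """Backward mapping from a point (x", y") on the processed (panoramic)
    image plane to a point (x', y') on the uncorrected image plane.

    Assumed stand-ins for Eqs. 30-40: cylindrical projection
    (psi = x"/s", tan(delta) = -y"/s") and an equidistance fisheye r = f*theta.
    """
    psi = x2 / s2                          # horizontal incidence angle
    delta = np.arctan2(-y2, s2)            # vertical incidence angle (y" grows downward)
    # direction vector of the incident ray (optical axis along +Z)
    X = np.cos(delta) * np.sin(psi)
    Y = np.sin(delta)
    Z = np.cos(delta) * np.cos(psi)
    theta = np.arccos(np.clip(Z, -1.0, 1.0))   # zenith angle of the incident ray
    phi = np.arctan2(Y, X)                     # azimuth angle of the incident ray
    r = f * theta                              # real image height (equidistance scheme)
    return r * np.cos(phi), r * np.sin(phi)    # (x', y') on the uncorrected plane

# the panorama center maps to the distortion center of the fisheye image
assert np.allclose(panorama_to_sensor(0.0, 0.0, 100.0, 100.0), (0.0, 0.0))
```

Running this mapping once per output pixel, then sampling the uncorrected image at the returned (x', y'), is the backward-remapping loop the procedure describes.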
- a desirable size of the processed image plane and the location (I_o, J_o) of the third intersection point are determined.
- the location (I_o, J_o) of the third intersection point refers to the pixel coordinate of the third intersection point O".
- the coordinate of a pixel on the upper left corner of a digitized image is defined as (1, 1) or (0, 0).
- we will assume that the coordinate of the pixel on the upper left corner is given as (1, 1).
- the object distance S does not affect the final outcome and is thus assumed to be 1 for the sake of simplicity. From this coordinate of the object point in the world coordinate system, the coordinate of the object point in the first world coordinate system is calculated from Eqs. 46 through 48.
- Figure 15 is a panoramic image obtained using this method, and a cylindrical projection scheme has been employed. As can be seen from Fig. 15, a perfect panoramic image has been obtained despite the fact that the optical axis is not parallel to the ground plane. Using such a panoramic imaging system as a car rear view camera, the backside of a vehicle can be entirely monitored without any dead spot.
- the video signal S'(I, J) for the pixel with a coordinate (I, J) in a mirrored (i.e., the left and the right sides are exchanged) processed image plane is given by the video signal S(I, J_max - J + 1) from the pixel in the processed image plane with coordinate (I, J_max - J + 1).
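Such a mirrored view matches what the driver sees in a mirror. A minimal sketch of the relation S'(I, J) = S(I, J_max - J + 1), using NumPy's 0-based indexing in place of the patent's 1-based pixel coordinates:

```python
import numpy as np

def mirror_left_right(frame):
    """Exchange the left and the right sides of an image.

    Implements S'(I, J) = S(I, J_max - J + 1), where I is the row
    (longitudinal) index and J the column (lateral) index; the 0-based
    reversed slice is equivalent to the 1-based formula.
    """
    return frame[:, ::-1]

frame = np.arange(12).reshape(3, 4)        # a toy 3 x 4 "video frame"
mirrored = mirror_left_right(frame)
assert mirrored[0, 0] == frame[0, 3]       # leftmost pixel came from the rightmost
assert np.array_equal(mirror_left_right(mirrored), frame)  # mirroring twice restores
```

Because the flip is a pure re-indexing, it can be folded into the same lookup table as the projection remapping at no extra cost.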
- identical image acquisition means can be installed near the rearview mirror, the frontal bumper, or the radiator grille in order to be used as a recording camera connected to a car black box for the purpose of recording the vehicle's driving history.
- the purpose of the present invention is to provide image processing algorithms for extracting natural looking panoramic images and rectilinear images from digitized images acquired using a camera equipped with a wide-angle lens which is rotationally symmetric about an optical axis, and devices implementing such algorithms.
- the present invention provides image processing algorithms that are accurate in principle based on the geometrical optics principle regarding image formation by wide-angle lenses with distortion and mathematical definitions of panoramic and rectilinear images.
- Figure 1 is a conceptual drawing of the latitude and the bngitude.
- Figure 2 is a conceptual drawing of a map with an equi-rectangular projection scheme.
- Figure 3 is a conceptual drawing illustrating a cylindrical projection scheme.
- Figure 4 is a conceptual drawing illustrating the real projection scheme of a general rotationally symmetric lens.
- Figure 5 is an exemplary image produced by a computer assuming that a fisheye lens with an equidistance projection scheme has been used to take the picture of an imaginary scene.
- Figure 6 is a diagram showing the optical structure of a refractive fisheye lens with a stereographic projection scheme along with the traces of rays.
- Figure 7 is a diagram showing the optical structure of a catadioptric fisheye lens with a stereographic projection scheme along with the traces of rays.
- Figure 8 is a diagram showing the optical structure of a catadioptric panoramic lens with a rectilinear projection scheme along with the traces of rays.
- Figure 9 is a conceptual drawing of the world coordinate system of the invention of a prior art.
- Figure 10 is a schematic diagram of a panoramic imaging system of the invention of a prior art.
- Figure 11 is a conceptual drawing of an uncorrected image plane.
- Figure 12 is a conceptual drawing of a processed image plane that can be shown on an image display means.
- Figure 13 is a schematic diagram of a car rear view camera employing a panoramic imaging system of the invention of a prior art.
- Figure 14 is an exemplary fisheye image captured by an inclined imaging system.
- Figure 15 is an exemplary panoramic image extracted from Fig. 14.
- Figure 16 is a conceptual drawing illustrating the rectilinear projection scheme of the first embodiment of the present invention.
- Figure 17 is a conceptual drawing illustrating the change in field of view as the relative position of the processed image plane is changed.
- Figure 18 is the conceptual drawing of a processed image plane according to the first embodiment of the present invention.
- Figure 19 is an exemplary rectilinear image with a horizontal FOV of 120° extracted from Fig. 5.
- Figure 20 is an exemplary rectilinear image after the application of slide and zoom operations.
- Figure 21 is an exemplary image of an interior scene captured using a fisheye lens of the invention of a prior art.
- Figure 22 is a panoramic image with a horizontal FOV of 190° and following a cylindrical projection scheme extracted from the fisheye image given in Fig. 21.
- Figure 23 is a rectilinear image with a horizontal FOV of 60° extracted from the fisheye image given in Fig. 21.
- Figure 24 is a schematic diagram of a panoramic camera phone embodying the conception of the present invention.
- Figure 25 is an exemplary rectilinear image obtained after applying pan-tilt operation to the fisheye image given in Fig. 5.
- Figure 26 is an exemplary rectilinear image obtained after applying pan-tilt operation to the fisheye image given in Fig. 21.
- Figure 27 is another exemplary image of an interior scene captured using a fisheye lens of the invention of a prior art.
- Figure 28 is an exemplary rectilinear image obtained after applying pan-tilt operation to the fisheye image given in Fig. 27.
- Figure 29 is an exemplary rectilinear image obtained after applying tilt-pan operation to the fisheye image given in Fig. 27.
- Figure 30 is a conceptual drawing illustrating the most general second world coordinate system in rectilinear projection schemes.
- Figure 31 is a schematic diagram showing the relation between the second world coordinate system and the processed image plane in rectilinear projection schemes.
- Figure 32 is a schematic diagram of a car rear view camera employing slide-pan-tilt operations.
- Figure 33 is an exemplary rectilinear image obtained after applying tilt operation to the fisheye image given in Fig. 5.
- Figure 34 is a conceptual drawing of an imaging system with large pan-tilt angles in the absence of slide operation.
- Figure 35 is a conceptual drawing of an imaging system with large pan-tilt angles with a proper mix of slide operation.
- Figure 36 is an exemplary rectilinear image obtained after applying slide and tilt operations to the fisheye image given in Fig. 5.
- Figure 37 is an exemplary image of an outdoor scene captured using a fisheye lens of the invention of a prior art.
- Figure 38 is a panoramic image following a Mercator projection scheme extracted from the fisheye image given in Fig. 37.
- Figure 39 is an exemplary rectilinear image obtained after applying tilt operation to the fisheye image given in Fig. 37.
- Figure 40 is an exemplary rectilinear image obtained after applying pan-tilt operations to the fisheye image given in Fig. 37.
- Figure 41 is a schematic diagram of an imaging system for a vehicle embodying the conception of the present invention.
- Figure 42 is a schematic diagram of an imaging system for monitoring the surroundings of a building embodying the conception of the present invention.
- Figure 43 is another exemplary image of an outdoor scene captured using a fisheye lens of the invention of a prior art.
- Figure 44 is an exemplary rectilinear image extracted from the fisheye image given in Fig. 43.
- Figure 45 is a schematic diagram for understanding the rectilinear image given in Fig. 44.
- Figure 46 is a schematic diagram illustrating a desirable method of installing cameras on a vehicle.
- Figure 47 is another exemplary image of an outdoor scene captured using a fisheye lens of the invention of a prior art.
- Figure 48 is an exemplary rectilinear image extracted from the fisheye image given in Fig. 47.
- Figure 49 is another schematic diagram illustrating a desirable method of installing cameras on a vehicle.
- Figure 50 is another schematic diagram of an imaging system for monitoring the surroundings of a building embodying the conception of the present invention.
- Figure 51 is a conceptual drawing of the object plane in a multiple viewpoint panoramic imaging system.
- Figure 52 is an exemplary multiple viewpoint panoramic image extracted from the fisheye image given in Fig. 5.
- Figure 53 is a conceptual drawing illustrating the notion of multiple viewpoint panoramic image.
- Figure 54 is another exemplary multiple viewpoint panoramic image extracted from the fisheye image given in Fig. 5.
- Figure 55 is an exemplary multiple viewpoint panoramic image extracted from the fisheye image given in Fig. 37.
- Figure 56 is an exemplary multiple viewpoint panoramic image extracted from the fisheye image given in Fig. 27.
- Figure 57 is another exemplary image of an interior scene captured using a fisheye lens of the invention of a prior art.
- Figure 58 is an exemplary panoramic image extracted from the fisheye image given in Fig. 57.
- Figure 59 is an exemplary multiple viewpoint panoramic image extracted from the fisheye image given in Fig. 57.
- Figure 60 is a schematic diagram of a desirable embodiment of the image processing means of the present invention.
- Figure 16 is a conceptual drawing illustrating the rectilinear projection scheme of the first embodiment of the present invention.
- a lens with a rectilinear projection scheme is a distortion-free lens, and from a mathematical viewpoint, the characteristics of a rectilinear lens are considered identical to those of a pinhole camera.
- To acquire such an image with a rectilinear projection scheme we assume an object plane(1631) and a processed image plane(1635) in the world coordinate system as shown in Fig. 16.
- the imaging system of the present embodiment is heading in an arbitrary direction, and the third rectangular coordinate system takes the optical axis (1601) of the imaging system as the negative z"-axis, and the nodal point of the lens as the origin.
- the image sensor plane has a rectangular shape with a lateral width B and a longitudinal height V, and the image sensor plane is a plane perpendicular to the optical axis.
- the processed image plane has a rectangular shape with a lateral width W and a longitudinal height H.
- the x-axis of the first rectangular coordinate system, the x'-axis of the second rectangular coordinate system, the x"-axis of the third rectangular coordinate system and the X-axis of the world coordinate system are all parallel to the lateral side of the image sensor plane. Furthermore, the z-axis of the first rectangular coordinate system, the z'-axis of the second rectangular coordinate system, and the z"-axis of the third rectangular coordinate systems are all identical to each other and opposite to the Z-axis of the world coordinate system.
- the processed image plane is assumed to be located at a distance s" from the nodal point of the lens.
- the shape of the object plane(1631) is also a plane perpendicular to the optical axis, and the image of objects on the object plane is faithfully reproduced on the processed image plane(1635) with both the lateral and the longitudinal scales preserved.
- the ideal projection scheme of a rectilinear lens is identical to the projection scheme of a pinhole camera. Considering the simple geometrical characteristics of a pinhole camera, it is convenient to assume that the shape and the size of the object plane(1631) are identical to those of the processed image plane. Therefore, the distance from the object plane(1631) to the nodal point N of the lens is also assumed as s".
- Figure 17 is a conceptual drawing illustrating how the horizontal field of view Δψ of a lens is changed by the relative position of the processed image plane(1735).
- the position of the processed image plane is symmetric with respect to the optical axis. Therefore, the image displayed on the processed image plane has a symmetrical horizontal field of view.
- the processed image plane(1735) has been laterally displaced with respect to the optical axis, and the FOVs are different for the left and the right sides.
- Such an operation is useful when it is desired to change the monitored area without changing the principal direction of vision. Physically, it corresponds to laterally displacing the image sensor with respect to the optical axis. In the present invention, such an operation will be referred to as a slide operation.
- Fig. 17(c) shows the case where the distance s" between the nodal point and the processed image plane has been increased.
- the field of view becomes narrower, and only a small region is monitored. Physically, this corresponds to a zoom operation. Therefore, by changing the relative position of the processed image plane with respect to the optical axis, and the distance to the nodal point, slide and zoom effects can be achieved.
- Figure 18 illustrates the case where the intersection point O between the image sensor plane and the optical axis, or equivalently the third intersection point O" on the processed image plane corresponding to the first intersection point O, does not coincide with the center C" of the processed image plane. Therefore, it corresponds to an imaging system with slide operation.
- the coordinate of the said center C" is given as (x"_c, y"_c).
- the lateral dimension of the processed image plane is W
- the virtual distance s" of the processed image plane having given horizontal or vertical FOVs can be calculated.
- the size (W, H) of the processed image plane and the horizontal FOV Δψ is first determined. Then, the virtual distance of the processed image plane is obtained, and from this distance, the symmetrical vertical FOV Δδ is automatically determined.
- if the coordinate of the center of the processed image plane is given by (x"_c, y"_c) as a result of a slide operation, then the following Eqs. 61 and 62 are satisfied.
- ψ_max and ψ_min are the maximum and the minimum incidence angles in the horizontal direction, and likewise, δ_max and δ_min are the maximum and the minimum incidence angles in the vertical direction. Furthermore, irrespective of the position of the center, the relation given in Eq. 63 must be satisfied at all times.
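As a concrete illustration, assume Eq. 60 has the usual pinhole form s" = W / (2 tan(Δψ/2)); the patent may use W - 1 in place of W to count pixel centers, so this is a sketch under that assumption:

```python
import numpy as np

def virtual_distance(W, hfov_rad):
    """Virtual distance s" of the processed image plane that yields the
    horizontal FOV hfov_rad for a plane of lateral dimension W
    (assumed pinhole form of Eq. 60)."""
    return W / (2.0 * np.tan(hfov_rad / 2.0))

def symmetric_vertical_fov(H, s):
    """Symmetrical vertical FOV that follows automatically once s" is fixed."""
    return 2.0 * np.arctan(H / (2.0 * s))

# Example: a 240 x 180 pixel processed image plane with a 120-degree horizontal FOV.
s = virtual_distance(240.0, np.radians(120.0))
vfov = np.degrees(symmetric_vertical_fov(180.0, s))
assert 104.0 < vfov < 106.0   # roughly 105 degrees follows automatically
```

This makes the coupling explicit: choosing (W, H) and the horizontal FOV leaves no freedom in the vertical FOV, which is exactly why the vertical FOV is said to be "automatically determined".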
- Eq. (65) reflects the convention that the coordinate of the pixel on the upper left corner of a digital image is given as (1, 1).
- the displacement (ΔI, ΔJ) of the said center from the third intersection point is determined.
- the zenith angle given in Eq. 66 and the azimuth angle given in Eq. 67 are calculated for every pixel on the processed image plane.
- the position of the second point on the uncorrected image plane is calculated using the position of the second intersection point on the uncorrected image plane and the magnification ratio g.
- the rectilinear image can be obtained using the previously described interpolation methods.
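The backward mapping generally lands between pixel centers on the uncorrected image plane, so an interpolation step is needed. A minimal bilinear sketch (nearest-neighbour or bicubic sampling could equally be substituted):

```python
import numpy as np

def bilinear_sample(img, x, y):
    """Sample a gray-level image at the fractional position (x, y).

    (x, y) is the non-integer coordinate on the uncorrected image plane
    produced by the backward mapping; x indexes columns, y indexes rows.
    """
    x0, y0 = int(np.floor(x)), int(np.floor(y))
    x1 = min(x0 + 1, img.shape[1] - 1)   # clamp at the image border
    y1 = min(y0 + 1, img.shape[0] - 1)
    dx, dy = x - x0, y - y0
    return ((1 - dx) * (1 - dy) * img[y0, x0] + dx * (1 - dy) * img[y0, x1] +
            (1 - dx) * dy * img[y1, x0] + dx * dy * img[y1, x1])

img = np.array([[0.0, 1.0],
                [2.0, 3.0]])
assert abs(bilinear_sample(img, 0.5, 0.5) - 1.5) < 1e-12   # average of the four pixels
```

For color images the same weights are applied to each channel; the weights depend only on the output pixel, so they can be precomputed into a lookup table for video-rate processing.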
- Figure 19 is a rectilinear image extracted from the fisheye image given in Fig. 5, wherein the lateral dimension of the processed image plane is 240 pixels, the longitudinal dimension is 180 pixels, the horizontal FOV is 120°, and there is no slide operation. As can be seen from Fig. 19, all the straight lines are captured as straight lines. On the other hand, Fig. 20 shows a rectilinear image, of which the parameters are identical to those of Fig. 19 except for the fact that the center of the processed image plane has been slid 70 pixels along the lateral direction and -30 pixels along the longitudinal direction.
- Fig. 21 is an exemplary image of an interior scene, which has been acquired by aligning the optical axis of a fisheye lens with 190° FOV described in references 14 and 15 parallel to the ground plane. The real projection scheme of this fisheye lens is described in detail in the said references.
- Fig. 22 is a panoramic image having a cylindrical projection scheme extracted from the fisheye image given in Fig. 21.
- the width: height ratio of the processed image plane is 16:9
- the position of the third intersection point coincides with the center of the processed image plane
- the horizontal FOV is set as 190°.
- all the vertical lines are captured as vertical lines and all the objects appear natural. Slight errors are due to the error in aligning the optical axis parallel to the ground plane, and the error in experimentally determining the position of the optical axis on the uncorrected image plane.
- Fig. 23 is a rectilinear image extracted from Fig. 21 with the width:height ratio of 4:3, wherein the position of the third intersection point coincides with the center of the processed image plane, and the horizontal FOV is 60°.
- the straight lines in the world coordinate system are captured as straight lines in the processed image plane.
- Figure 24 shows one exemplary application wherein such an invention can be used.
- the built-in camera module is a very important factor in determining the market value of a cellular phone. If a panoramic picture can be taken with a cellular phone(2414), then the customer satisfaction will be greatly increased.
- an ordinary lens currently mounted in camera phones is comprised of 2 to 4 pieces of double aspherical lens elements, and has a mega-pixel grade resolution with a typical FOV of 60°. On the other hand, it is possible to realize a fisheye lens with a FOV between 120° and 180° using 3 to 5 pieces of double aspherical lens elements.
- a panoramic camera phone can be realized by installing a wide-angle lens with more than 120° FOV, and endowing the image processing function to the electronic circuitry of the cellular phone.
- the resolution of the image sensor for such an application is preferably more than 1M pixels.
- a complication arises if it is desired to endow more than 180° FOV to the built-in lens for a cellular phone. If the lens does not protrude from the wall of the cellular phone, then the wall will occlude the view of the lens even if the lens itself has a FOV which is larger than 180°. Therefore, in order to obtain a FOV which is greater than 180°, the lens must protrude from the wall of the cellular phone. However, since a cellular phone is carried by the user all the time, a protruding lens can be scratched or stained. Considering this fact, a lens cover can be procured to cover the lens when the built-in camera is not in use.
- Another method is to make the camera module to come out from the wall of the cellular phone only when the phone camera is to be used.
- the easiest method is to make the FOV of the lens less than 170°.
- the lens need not protrude from the wall, and a fixed lens module is sufficient.
- considering that the horizontal FOV of a panoramic camera with a rotating lens is only 120°, it is apparent that such a FOV can be enough for the customer satisfaction.
- Identical techniques can be used in PC camera or web camera, and the computing power for the necessary image processing can be provided by a PC connected to the PC camera, or a computer of another user connected to the Internet, or a network server.
- a fisheye image can be acquired using a digital camera equipped with a fisheye lens, then a panoramic image or a rectilinear image can be extracted using an image editing software running on PC.
- a camera that can physically provide such an image is a camera equipped with a rectilinear lens and mounted on a pan-tilt stage. Since the camera can be oriented to the direction that needs most attention, a most satisfactory image can be provided. Furthermore, images can be continuously produced while dynamically tracking a moving object such as a cat or an intruder. A method of realizing such functionality with software is provided as follows.
- the size (W, H) of the processed image plane and the horizontal FOV Δψ prior to the slide operation are determined.
- the distance s" of the processed image plane is determined by Eq. 60.
- a proper amount (x"_c, y"_c) of slide operation is determined in order to obtain a desirable horizontal FOV (ψ_min ≤ ψ ≤ ψ_max) and a vertical FOV (δ_min ≤ δ ≤ δ_max).
- pan operation is a rotational operation around the Y-axis
- tilt operation is a rotational operation around the X'-axis.
- the coordinate of a new point which corresponds to an object point in the world coordinate system having a coordinate (X, Y, Z) rotated around the Y-axis by angle - ⁇ is given as (X', Y', Z')
- the coordinate of yet another point which corresponds to the said new point rotated around the X'-axis by angle - ⁇ is given as (X", Y", Z").
- the coordinate of the new point is given by Eq. 75.
- in matrix form, Eq. 75 composes the two elementary rotations: (X", Y", Z")ᵀ = M_X'(-δ) M_Y(-α) (X, Y, Z)ᵀ.
- the rectilinear image can be obtained using interpolation methods identical to those described in the first embodiment.
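The pan-tilt composition can be sketched as follows; the elementary matrices, their order of multiplication, and the sign conventions are reconstructions of Eqs. 73 through 75 rather than verbatim copies:

```python
import numpy as np

def M_x(angle):
    """Rotation of a point by `angle` about the X-axis (right-handed convention)."""
    c, s = np.cos(angle), np.sin(angle)
    return np.array([[1.0, 0.0, 0.0], [0.0, c, -s], [0.0, s, c]])

def M_y(angle):
    """Rotation of a point by `angle` about the Y-axis (right-handed convention)."""
    c, s = np.cos(angle), np.sin(angle)
    return np.array([[c, 0.0, s], [0.0, 1.0, 0.0], [-s, 0.0, c]])

alpha = np.radians(30.0)   # pan angle (illustrative value)
delta = np.radians(20.0)   # tilt angle (illustrative value)

# Pan about the Y-axis by -alpha, then tilt by -delta: the combined matrix
# acting on the world coordinate (X, Y, Z) of an object point.
M = M_x(-delta) @ M_y(-alpha)
P = np.array([0.0, 0.0, 1.0])   # an object point at unit distance on the optical axis
P_new = M @ P
assert abs(np.linalg.norm(P_new) - 1.0) < 1e-12   # a rotation preserves length
```

Once every object point has been rotated this way, the remaining steps (zenith/azimuth angles, real image height, interpolation) are unchanged from the first embodiment.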
- the lateral dimension of the processed image plane is 1280 pixels
- the longitudinal dimension is 960 pixels
- the horizontal FOV prior to the slide-pan-tilt operations is 70°
- Figure 27 is another exemplary image of an interior scene captured using a fisheye lens described in references 14 and 15, wherein the optical axis of the fisheye lens has been inclined downward from the horizontal plane toward the floor (i.e., the nadir) by 45°.
- the method of extracting a rectilinear image presented in the third embodiment is a method wherein slide operation is taken first, and then the pan and the tilt operations follow in sequence.
- the coordinate of a new point obtainable by taking the tilt and the pan operations in sequence is given by Eq. 96.
- the rectilinear image can be obtained using interpolation methods identical to those described in the first and the second embodiments.
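The third and the fourth embodiments differ only in the order of the two rotations, and three-dimensional rotations do not commute, so the two orders aim the imaginary camera differently. A quick check (the matrix forms and sign conventions are assumptions, as the equations are not reproduced here):

```python
import numpy as np

def M_x(angle):
    """Rotation of a point by `angle` about the X-axis (right-handed convention)."""
    c, s = np.cos(angle), np.sin(angle)
    return np.array([[1.0, 0.0, 0.0], [0.0, c, -s], [0.0, s, c]])

def M_y(angle):
    """Rotation of a point by `angle` about the Y-axis (right-handed convention)."""
    c, s = np.cos(angle), np.sin(angle)
    return np.array([[c, 0.0, s], [0.0, 1.0, 0.0], [-s, 0.0, c]])

alpha, delta = np.radians(30.0), np.radians(45.0)   # illustrative pan and tilt angles
pan_then_tilt = M_x(-delta) @ M_y(-alpha)   # third-embodiment order (Eq. 75)
tilt_then_pan = M_y(-alpha) @ M_x(-delta)   # fourth-embodiment order (Eq. 96)

# The resulting matrices differ, so the two procedures yield different views.
assert not np.allclose(pan_then_tilt, tilt_then_pan)
```

This is why Fig. 28 (pan-tilt) and Fig. 29 (tilt-pan), extracted from the same fisheye image, look different.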
- Fig. 30 shows the second world coordinate system (X", Y", Z"), which is obtainable by rotating and translating the world coordinate system (X, Y, Z) shown in Fig. 9.
- the origin N" of the second world coordinate system shown in Fig. 30 is the origin N in the world coordinate system translated by ⁇ X along the X-axis direction, and by ⁇ Y along the Y-axis direction.
- the X"-axis and the Y"-axis are the X-axis and the Y-axis rotated around the Z-axis by angle γ, respectively, after the above translational operations.
- Figure 31 illustrates that the coordinate (X", Y") of the object point in the second world coordinate systems is proportional to the coordinate (x", y") of the image point on the processed image plane. In other words, a relation given in Eq. 115 holds true.
- a rectilinear projection scheme is a scheme wherein a second world coordinate system (X", Y", Z") exists such that the relation given in Eq. 115 holds true for all the image points (x", y") on the processed image plane, where the second world coordinate system (X", Y", Z") is a coordinate system obtainable by rotating and translating the said world coordinate system (X, Y, Z) in the three-dimensional space an arbitrary number of times and in any order.
- Figure 32 is a schematic diagram of a car rear view camera utilizing the wide-angle imaging system of the current invention.
- the panoramic imaging system of the invention of a prior art described in reference 17 can be used as a car rear view camera in order to completely eliminate the dead zones in monitoring.
- said fisheye lens can be installed inside a passenger car(3251) trunk for the purpose of monitoring the backside of a car without a dead zone. It can also be installed at the bumpers or at the rear window.
- the top of the trunk will be the ideal place to install a rear view camera.
- the rear view camera is desirably installed at the top of the rear end of the vehicle.
- the image of the area immediately behind the car will be more helpful than the view of a remote scenery.
- the optical axis(3201) of the image acquisition means(3210) will be aligned parallel to the ground plane(3217). Therefore, in order to visually check the obstacles lying behind the car or the parking lane, the pan-tilt operations presented in the first through the third embodiments of the present invention will be helpful.
- Figure 33 is a rectilinear image extracted from the fisheye image given in Fig. 5.
- the lateral dimension of the processed image plane is 240 pixels
- the longitudinal dimension is 180 pixels
- the horizontal FOV prior to the slide-pan-tilt operations is 120°
- in Fig. 33, only half of the screen contains a meaningful image. The reason can be understood by referring to Fig. 34.
- Fig. 34 has been drawn under the assumption that the FOV of the wide-angle lens of the image acquisition means is 180°.
- the processed image plane(3435) following a rectilinear projection scheme as well as the object plane(3431) are perpendicular to the optical axis, and all the object points on the object plane are within the FOV of the said lens.
- the tilt angle becomes 90° as in Fig. 34(b)
- half of the processed image plane(3435) and half of the object plane(3431) lie outside the FOV of the lens. Therefore, when a tilt operation with a tilt angle of 90° is taken purely in software, a region with no visual information occupies half of the screen.
- the region outside the FOV of the lens is the vehicle's body, and therefore it is not really meaningful to monitor that area.
- Fig. 35 shows a way to resolve such a problem. If it is desired to make the pan or the tilt angle equal to 90° in order to make the principal direction of vision perpendicular to the optical axis, then as schematically shown in Fig. 35(a), it is preferable to apply a slide operation to the processed image plane(3535) first, so that the object plane(3531) lies on the side of the optical axis opposite to the intended direction of rotation. Then, even if the tilt angle becomes 90° as in Fig. 35(b), the object plane remains within the FOV of the lens and a satisfactory rectilinear image is obtained.
- Fig. 36 is an example of a wide-angle image obtained by applying the slide operation before the tilt operation.
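The geometry can be checked numerically. The sketch below models the rays from rows of the object plane, tilts them by 90°, and counts how many still fall inside a 180° lens FOV; all numbers and sign conventions are illustrative assumptions:

```python
import numpy as np

def fraction_inside_fov(y_rows, s, tilt):
    """Fraction of object-plane rows whose rays stay inside a 180-degree lens
    FOV after a software tilt (inside the FOV <=> zenith angle < 90 degrees,
    i.e. the rotated ray still has a positive z-component)."""
    c, t = np.cos(tilt), np.sin(tilt)
    z_after = c * s - t * y_rows          # z-component of each ray after the tilt
    return float(np.mean(z_after > 0))

s, H = 100.0, 180.0                       # virtual distance and plane height (illustrative)
centered = np.linspace(-H / 2, H / 2, 181)   # no slide: plane centered on the axis
slid = np.linspace(-H, 0.0, 181)             # slide: plane entirely on one side

assert fraction_inside_fov(centered, s, np.radians(90)) < 0.51   # about half is lost
assert fraction_inside_fov(slid, s, np.radians(90)) > 0.99       # nearly all remains
```

With the plane centered, a 90° tilt pushes half the rows past the 90° zenith angle, reproducing the half-empty screen of Fig. 33; sliding the plane to one side first keeps essentially all of it inside the FOV, as in Fig. 36.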
- Figure 37 is yet another fisheye image obtained using the said fisheye lens with the optical axis aligned parallel to the ground plane.
- a camera equipped with the said fisheye lens has been set up near the sidewall of a large bus and headed outward from the bus. The height of the camera was comparable to the height of the bus.
- all the objects within a hemisphere including the sidewall of the bus have been captured in the image.
- Figure 38 is a panoramic image having a horizontal FOV of 190° extracted from Fig. 37.
- Fig. 40 is another rectilinear image extracted from Fig. 37.
- FIG. 41 shows a schematic diagram of a car using such an imaging system.
- At least one camera(4110L, 4110R, 4110B) is installed at the outside wall of a vehicle(4151) such as a passenger car or a bus, and preferably identical fisheye lenses are installed on all the cameras.
- this camera can be installed parallel to the ground plane, vertical to the ground plane, or at an angle to the ground plane. Which one among these options is taken depends on the particular application area.
- the fisheye images acquired from these cameras are gathered by the image processing means (4116), and displayed on the image display means (4117) after proper image processing operation has been taken, and it can be simultaneously recorded on an image recording means(4118).
- This image recording means can be a DVR(Digital Video Recorder).
- This image selection means receives signals from the electronic circuitry of the car which indicate whether the car is running forward, backing up, or whether the driver gave the turn signal to change lanes to the left or to the right. Even when the vehicle is moving forward, it may be necessary to differentiate between fast and slow driving. Independent of all these signals, the driver can manually override the signals by manipulating the menus on the dashboard in order to force the projection scheme and the parameters of the image displayed on the image display means. Based upon such signals supplied by the image selection means, the image processing means provides the most appropriate images suitable for the current driving situation, such as the images from the left-side camera(4110L) or the rear view camera(4110B).
- the situation awareness module is still useful.
- panoramic image such as the one given in Fig. 22 or a rectilinear image parallel to the ground plane such as the one given in Fig. 23 can be extracted from the fisheye image acquired by the rear view camera and shown to the driver.
- a pan-tilt image such as the one given in Fig. 40 can be shown to the driver.
- if the back-up signal lamp is lit, then a tilted image such as the one given in Fig. 39 can be shown to the driver in order to avoid possible accidents.
- each of the raw images from the respective camera can be separately image-processed, and then the processed images can be seamlessly joined together to form a single panoramic image or a rectilinear image.
- a more satisfactory image can be shown to the driver, such as a bird's eye view of the vehicle and its surroundings as if taken from the air.
- imaging system for vehicles includes an image acquisition means for acquiring an uncorrected image plane which is equipped with a wide-angle lens rotationally symmetric about an optical axis, image processing means producing a processed image plane from the uncorrected image plane, image selection means determining the projection scheme and the parameters of the processed image plane, and image display means displaying the processed image plane on a screen with a rectangular shape.
- the projection schemes of the said processed image plane include a panoramic projection scheme and a rectilinear projection scheme.
- a vehicle is a device having an imaging system for vehicles.
- the world coordinate system for this imaging system takes the nodal point of the wide-angle lens as the origin and a vertical line passing through the said origin as the Y-axis.
- said panoramic projection scheme is a scheme wherein straight lines parallel to the said Y-axis in the world coordinate system all appear as straight lines parallel to the y"-axis on the said screen, and two object points having identical angular distance on the X-Y plane in the said world coordinate system have an identical distance along the x"-axis on the said screen.
- said rectilinear projection scheme is a scheme wherein an arbitrary straight line in the said world coordinate system appears as a straight line on the said screen.
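The two projection schemes above can be illustrated with a small sketch that maps a ray direction to screen coordinates. This is not the patent's own formula set; the parameterization by azimuth and elevation angles and the focal-length-like scale factor f are assumptions made for illustration only.

```python
import math

def panoramic_xy(azimuth, elevation, f):
    """Cylindrical panoramic projection: the horizontal screen coordinate
    is proportional to the azimuth angle, so object points separated by
    equal angles land at equal horizontal distances, and vertical world
    lines (constant azimuth) stay vertical on the screen."""
    return f * azimuth, f * math.tan(elevation)

def rectilinear_xy(azimuth, elevation, f):
    """Rectilinear (pinhole-like) projection onto the plane z = f: any
    straight line in the world maps to a straight line on the screen."""
    x = f * math.tan(azimuth)
    y = f * math.tan(elevation) / math.cos(azimuth)
    return x, y
```

A vertical world line (fixed azimuth, varying elevation) keeps a constant x in both schemes, but only the rectilinear mapping also preserves the straightness of oblique lines.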
- said device is a camera phone, and said image selection means is a menu button on the camera phone that a user can select.
- said device is a PC camera
- said image acquisition means is a USB CMOS camera with a fisheye lens
- said image processing means is software running on a computer.
- said device is a digital camera
- said image processing means is software running on a computer.
- said image acquisition means and said image processing means are two means which are separated not only physically but also in time.
- image acquisition means is a digital camera equipped with a fisheye lens and image processing means is image editing software running on a computer.
- image selection means is a menu button on the image editing software, and said image display means is of course the PC monitor.
- Figure 42 is a schematic diagram of a building monitoring system using the imaging system of the fourth embodiment. As shown in Fig. 42, one camera is installed at a high point on each wall of the building. Four cameras will be needed in general, since a typical building has a rectangular shape. If the four walls of the building face the east, the west, the north, and the south, respectively, then a rectangular region of the ground plane adjacent to the east wall of the building is assigned as the object plane for the east camera (4210E).
- each camera is installed heading vertically downward toward the ground plane, but a similar effect can be obtained by installing the cameras at an angle. After rectilinear images have been extracted from the images acquired by the installed cameras, the region of each image corresponding to the object plane illustrated in Fig. 42 is trimmed out.
- situation awareness module(4219) displays various images on the image display means(4217) corresponding to the operation of the security personnel.
- the procedure of assembling the rectilinear images extracted from each direction is called image registration.
- the image registration technique is described in detail in reference 18. The basic principle is to calculate a correlation function from two at least partially overlapping sub-images; from the peak of the correlation function, the amount of relative translation or rotation needed for the two images to best match is determined.
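One common way to realize this correlation-peak search is phase correlation via the FFT. The sketch below is an assumption about a possible implementation, not the method of reference 18; it recovers a pure integer translation between two overlapping sub-images.

```python
import numpy as np

def register_translation(a, b):
    """Phase correlation: the normalized cross-power spectrum of the two
    images, transformed back to the spatial domain, peaks at the relative
    (row, col) translation between them."""
    F = np.conj(np.fft.fft2(a)) * np.fft.fft2(b)
    F /= np.abs(F) + 1e-12                      # keep only phase information
    corr = np.fft.ifft2(F).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # unwrap peaks past the midpoint into negative shifts
    return tuple(int(p - s) if p > s // 2 else int(p)
                 for p, s in zip(peak, corr.shape))
```

Rotation can be handled by the same machinery after resampling the images to log-polar coordinates, where a rotation becomes a translation.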
- Figure 43 is an image acquired from a camera equipped with a fisheye lens, which is installed at the top of a passenger vehicle as illustrated in Fig. 13 with a camera tilt angle of -45°.
- the FOV is so wide that even the vehicle's number plate can be read.
- Fig. 44 is a rectilinear image extracted from Fig. 43 having a horizontal FOV of 120° and a -45° tilt angle.
- the parking lane on the ground plane is clearly visible.
- the car rear bumper appears larger than the parking lane.
- Such a rectilinear image will not be of much use for a parking assistance means. The reason the image appears this way can be understood by referring to Fig. 45.
- Figure 45 is a schematic diagram of a case where a car rear view camera (4510) is installed near the top of the trunk of a passenger car (4551).
- the characteristics of a camera providing a rectilinear image are identical to those of a pinhole camera.
- the width of the parking lane (4555) is, needless to say, wider than the width of the passenger car or of the rear bumper (4553). Therefore, the passenger car can be completely contained within the parking lane.
- the bumper appears wider than the parking lane due to the height of the bumper. Viewed from the camera in Fig.
- the boundary (4563) of the parking lane necessarily appears narrower than the boundary of the bumper; this has nothing to do with the projection scheme of the lens, but is purely a result of the viewpoint.
- for this not to happen, the heights of the two objects must be similar, but this is practically impossible.
- FIG. 46 is a schematic diagram of an imaging system for vehicles resolving the aforementioned problems.
- the cameras for vehicles are installed at each corner of the vehicle, and the optical axes are aligned perpendicular to the ground plane.
- it can be installed at the corner of the bumper, which is a protruding end of the car.
- the precise location (4663) of the vehicle with respect to the ground plane is obtained independent of the height of the bumper, and therefore the distance to the parking lane can be accurately estimated while trying to park the vehicle.
- Figure 47 is an image obtained using a camera equipped with a fisheye lens, which is installed near the top of a recreational vehicle with the optical axis perpendicular to the ground plane, and Fig. 48 is a rectilinear image extracted from such a fisheye image. From Fig. 48, it can be seen that the vehicle and the parking lane appear well separated from each other.
- Such an imaging system will be particularly useful for a vehicle with a high stature, such as a bus or a truck, as schematically illustrated in Fig. 49. Since the camera is installed at a height greater than the stature of a man, the rectilinear images appear more natural, the camera is easier to maintain, and there is less chance of breakage or defilement.
- Such an imaging system for vehicles is characterized as comprised of the 1st and the 2nd image acquisition means for acquiring the 1st and the 2nd uncorrected image planes using wide-angle lenses rotationally symmetric about optical axes, an image processing means for extracting the 1st and the 2nd processed image planes following rectilinear projection schemes from the said 1st and the 2nd uncorrected image planes and subsequently generating a registered processed image plane from the 1st and the 2nd processed image planes, and an image display means for displaying the said registered processed image plane on a rectangular shaped screen.
- Said device is an automobile having said imaging systems for vehicles.
- Fig. 50 shows an imaging system for monitoring the outside of a building using the same principle.
- the said device is the building itself.
- Such an imaging system for monitoring the outside of a building is characterized as comprised of the 1st through the 4th image acquisition means (5010NW, 5010NE, 5010SW, 5010SE) for acquiring the 1st through the 4th uncorrected image planes using wide-angle lenses rotationally symmetric about optical axes, an image processing means (5016) for extracting the 1st through the 4th processed image planes following rectilinear projection schemes from the said 1st through the 4th uncorrected image planes and subsequently generating a registered processed image plane from the 1st through the 4th processed image planes, and an image display means (5017) for displaying the said registered processed image plane on a rectangular shaped screen.
- the 1st through the 4th image acquisition means 5010NW, 5010NE, 5010SW, 5010SE
- the said 1st through the 4th image acquisition means are installed at the four corners of the building heading downward toward the ground plane, and the said registered processed image plane contains the view of the said four corners.
- The panoramic imaging system of reference 17 requires a direction sensing means in order to constantly provide natural-looking panoramic images irrespective of the inclination of the device having the imaging system with respect to the ground plane.
- installing a direction sensing means may be difficult in terms of cost, weight, or volume for some devices such as a motorcycle or an unmanned aerial vehicle.
- Figure 51 is a conceptual drawing of an object plane providing a multiple viewpoint panorama that can be advantageously used in such cases.
- The object plane of the current embodiment has a structure where more than two sub object planes are joined together, where each of the sub object planes is a planar surface by itself.
- Although Fig. 51 illustrates a case where three sub object planes, namely 5131-1, 5131-2 and 5131-3, are used, a more general case using n sub object planes can be easily understood as well.
- a sphere with a radius T centered on the nodal point of the lens is assumed. If a folding screen is set up around the sphere while keeping the folding screen in contact with the sphere, then this folding screen corresponds to the object plane of the current embodiment.
- the n sub object planes are all at the same distance T from the nodal point of the lens.
- all the sub object planes have the identical zoom ratio or magnification ratio.
- the principal direction of vision (5101-1) of the 1st sub object plane (5131-1) makes an angle of ψ with the principal direction of vision (5101-2) of the 2nd sub object plane (5131-2), and the principal direction of vision (5101-3) of the 3rd sub object plane (5131-3) makes an angle of ψ with the
- the range of the horizontal FOV of the 1st sub object plane is from a minimum value ψ to a maximum value ψ
- the range of the horizontal FOV of the 2nd sub object plane is from a minimum value ψ to a maximum value ψ.
- Figure 52 is an example of a multiple viewpoint panoramic image extracted from Fig. 5, wherein each of the sub processed image planes is 240 pixels wide along the lateral direction and 180 pixels high along the longitudinal direction. Furthermore, the horizontal FOV of each of the sub object planes or the sub processed image planes is 60°, there is no slide operation for each sub object plane, and the distance to the sub object plane is identical for all the sub object planes.
- the pan angle for the 3rd sub object plane on the left side is -60°
- the pan angle for the 2nd sub object plane in the middle is 0°
- the pan angle for the 1st sub object plane on the right side is 60°.
- each of the three sub object planes has a horizontal FOV of 60°
- the said three sub object planes comprise a multiple viewpoint panoramic image having a horizontal FOV of 180° as a whole.
- all the straight lines appear as straight lines in each sub object plane.
- FIG. 53 is a conceptual drawing illustrating the definition of a multiple viewpoint panoramic image.
- An imaging system providing a multiple viewpoint panoramic image includes an image acquisition means for acquiring an uncorrected image plane which is equipped with a wide-angle lens rotationaUy symmetric about an optical axis, an image processing means for extracting a processed image plane from the said uncorrected image plane, and an image display means for displaying the processed image plane on a screen with a rectangular shape.
- Said processed image plane is a multiple viewpoint panoramic image, wherein the said multiple viewpoint panoramic image is comprised of the 1st through the nth sub rectilinear image planes laid out horizontally on the said screen, n is a natural number larger than 2, an arbitrary straight line in the world coordinate system having the nodal point of the wide-angle lens as the origin appears as a straight line (5331A) on any of the 1st through the nth sub rectilinear image planes, and any straight line in the world coordinate system appearing on more than two adjacent sub rectilinear image planes appears as connected line segments (5381B-1, 5381B-2, 5381B-3).
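The geometry of such a multiple viewpoint panorama can be checked with a short sketch: within each sub rectilinear image the column-to-azimuth mapping follows the tangent law of a pinhole camera, while the principal directions of adjacent sub planes are spaced one horizontal FOV apart. The pixel counts and FOV in the usage note follow the Fig. 52 example; the function itself is an illustrative assumption, not the patent's code.

```python
import math

def column_azimuth(j, n_planes, plane_width, hfov_deg):
    """Azimuth (in degrees) of the ray behind output column j of a
    multiple viewpoint panorama made of n_planes side-by-side
    rectilinear sub-images, each plane_width pixels wide with a
    horizontal FOV of hfov_deg."""
    plane = j // plane_width                        # index of the sub image
    x = (j % plane_width) - (plane_width - 1) / 2   # offset from its center column
    f = (plane_width - 1) / 2 / math.tan(math.radians(hfov_deg / 2))
    pan = (plane - (n_planes - 1) / 2) * hfov_deg   # pan angle of this sub plane
    return pan + math.degrees(math.atan(x / f))     # rectilinear within the plane
```

With n_planes = 3, plane_width = 240 and hfov_deg = 60, the azimuth runs from -90° to +90°, matching the 180° total FOV of Fig. 52, and it is continuous across the seams between sub images.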
- Fig. 54 is another multiple viewpoint panoramic image extracted from Fig. 5.
- This is a multiple viewpoint panoramic image obtained by first tilting the three sub object planes by -55°, and then subsequently panning the sub object planes by the same condition applied to Fig. 52.
- Figure 55 is a multiple viewpoint panoramic image extracted from Fig. 37
- Fig. 56 is a multiple viewpoint panoramic image extracted from Fig. 27.
- each of the three object planes has a horizontal FOV of 190°/3.
- Figure 57 is another example of a fisheye image, and it shows the effect of installing a fisheye lens with 190° FOV on the center of the ceiling of an interior.
- Figure 58 is a panoramic image extracted from the fisheye image given in Fig. 57.
- Fig. 59 is a multiple viewpoint panoramic image extracted from Fig. 57.
- Each sub object plane has a horizontal FOV of 360°/4, and the image has been obtained by tilting all the sub object planes by -90° first, and subsequently panning the sub object planes by the necessary amounts. From Fig. 59, it can be seen that such an imaging system will be useful as an indoor security camera.
- the following is the MATLAB code used to obtain the image in Fig. 59.
- Jmax = round(Lmax / 4); % canvas width
- Jo = (Jmax + 1) / 2;
- T = (Jmax - 1) / 2.0 / tan(Dpsi / 2.0);
- alpha = ALPHA / 180.0 * pi;
- Beta_o = 45.0;
- BETA = Beta_o + 360.0 * S / 4;
- beta = BETA / 180.0 * pi;
- Yp = X * sin_alpha * sin_beta + Y * cos_alpha - Z * sin_alpha * cos_beta;
- Zp = -X * cos_alpha * sin_beta + Y * sin_alpha + Z * cos_alpha * cos_beta;
- phi_p = atan2(Yp, Xp);
- r_p = gain * polyval(coeff, theta_p);
- Km = floor(y_p);
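The core of the listing above, rotating a point on a sub object plane by the tilt angle α and pan angle β and then projecting it through the lens's r(θ) polynomial, can be transcribed into Python roughly as follows. The Xp line is reconstructed from the rotation that produces the Yp and Zp rows shown; the calibration polynomial coeff, the gain, and the optical-axis pixel position (xo, yo) are assumed inputs.

```python
import numpy as np

def fisheye_pixel(X, Y, Z, alpha, beta, coeff, gain, xo, yo):
    """Map a point (X, Y, Z) on a sub object plane to fisheye sensor
    coordinates: pan by beta, tilt by alpha, then apply the lens
    projection r(theta) = gain * polyval(coeff, theta)."""
    sa, ca = np.sin(alpha), np.cos(alpha)
    sb, cb = np.sin(beta), np.cos(beta)
    Xp = X * cb + Z * sb                        # reconstructed rotation row
    Yp = X * sa * sb + Y * ca - Z * sa * cb     # as in the listing
    Zp = -X * ca * sb + Y * sa + Z * ca * cb    # as in the listing
    theta_p = np.arctan2(np.hypot(Xp, Yp), Zp)  # zenith angle from the optical axis
    phi_p = np.arctan2(Yp, Xp)                  # azimuth angle around the axis
    r_p = gain * np.polyval(coeff, theta_p)     # real image height on the sensor
    return xo + r_p * np.cos(phi_p), yo + r_p * np.sin(phi_p)
```

With coeff = [1, 0] this reduces to an ideal equidistance fisheye r = gain·θ; a calibrated lens would supply its own polynomial coefficients.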
- FIG. 60 is a schematic diagram of an exemplary image processing means embodying the conception of the present invention.
- the image processing means (6016) of the present invention has an input frame buffer (6071) storing one frame of image acquired from the image acquisition means (6010). If the image acquisition means (6010) is an analog CCTV, then it is necessary to decode NTSC, PAL, or SECAM signals, and to deinterlace the two interlaced sub frames. On the other hand, this procedure is unnecessary for a digital camera. After these necessary procedures have been taken, the input frame buffer (6071) stores a digital image acquired from the image acquisition means (6010) in the form of a two dimensional array. This digital image is the uncorrected image plane.
- the output frame buffer (6073) stores an output signal in the form of a two dimensional array, which corresponds to the processed image plane (6035) that can be displayed on the image display means (6017).
- a central processing unit (CPU: 6075) further exists, which generates a processed image plane from the uncorrected image plane existing in the input frame buffer and stores it in the output frame buffer.
- the mapping relation between the output frame buffer and the input frame buffer is stored in a non-volatile memory (6079) such as an SDRAM in the form of a lookup table.
- a long list of pixel addresses for the input frame buffer corresponding to particular pixels in the output frame buffer is generated and stored.
- Central processing unit(6075) refers to this list stored in the nonvolatile memory in order to process the image.
- the image selection means (6077) in Fig. 60 receives the signals coming from various sensors and switches connected to the imaging system and sends them to the central processing unit. For example, by recognizing the button pressed by the user, it can dictate whether the original distorted fisheye image is displayed without any processing, a panoramic image with a cylindrical or a Mercator projection scheme is displayed, or a rectilinear image is displayed. Said nonvolatile memory stores a number of lists corresponding to the number of possible options a user can choose.
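The per-frame work implied by this design reduces to a gather operation: each output pixel copies the input-buffer pixel whose address was precomputed in the selected look-up table. A minimal sketch follows; the identity and mirror tables are placeholders, not the patent's panoramic or rectilinear tables.

```python
import numpy as np

def build_identity_lut(height, width):
    """Placeholder LUT builder; a real system would store here the
    fisheye source address for each panoramic/rectilinear output pixel."""
    ys, xs = np.indices((height, width))
    return ys, xs

def remap(uncorrected, lut):
    """Per-frame processing: fetch every output pixel from the input
    frame buffer at the address given by the look-up table."""
    ys, xs = lut
    return uncorrected[ys, xs]
```

Switching projection schemes then amounts to selecting a different stored table, which is why the nonvolatile memory holds one list per user-selectable option.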
- the image processing means of the embodiments of the current invention are preferably implemented on an FPGA (Field Programmable Gate Array) chip.
- the said central processing unit, the said input frame buffer, and the said output frame buffer can all be realized on the FPGA chip.
- the central processing unit can dynamically generate the look-up table referring to the algorithm and the parameters corresponding to the user's selection, and store it in a volatile or a non-volatile memory. Therefore, if the user does not provide a new input by appropriate means such as moving the mouse, then the already generated look-up table is used for the image processing; when a new input is supplied, a corresponding look-up table is promptly generated and stored in the said memory.
- said image acquisition means is an analog CCTV equipped with a fisheye lens with more than 180° FOV
- said image processing means is an independent device using an FPGA chip and storing the image processing algorithm on a non-volatile memory as a look-up table
- the said analog CCTV and the said image processing means are connected by signal and power lines.
- said image acquisition means is a network camera equipped with a fisheye lens with more than 180° FOV
- said image processing means is an independent device using a FPGA chip and storing the image processing algorithm on a non- volatile memory as a look-up table
- the uncorrected image plane acquired by the said network camera is supplied to the said image processing means by the Internet.
- panoramic imaging systems and devices can be used not only in security-surveillance applications for indoor and outdoor environments, but also in diverse areas such as video phones for apartment entrance doors, rear view cameras for vehicles, and visual sensors for robots; they can also be used to obtain panoramic photographs with a digital camera.
Landscapes
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Engineering & Computer Science (AREA)
- Optics & Photonics (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Theoretical Computer Science (AREA)
- Studio Devices (AREA)
Abstract
Description
Claims
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN2008801085480A CN101809991B (en) | 2007-07-29 | 2008-07-24 | Method and apparatus for obtaining panoramic and rectilinear images using rotationally symmetric wide-angle lens |
AU2008283196A AU2008283196B9 (en) | 2007-07-29 | 2008-07-24 | Method and apparatus for obtaining panoramic and rectilinear images using rotationally symmetric wide-angle lens |
US12/670,095 US8553069B2 (en) | 2007-07-29 | 2008-07-24 | Method and apparatus for obtaining panoramic and rectilinear images using rotationally symmetric wide-angle lens |
Applications Claiming Priority (6)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR10-2007-0076113 | 2007-07-29 | ||
KR20070076113 | 2007-07-29 | ||
KR10-2007-0091373 | 2007-09-10 | ||
KR20070091373 | 2007-09-10 | ||
KR10-2008-0071106 | 2008-07-22 | ||
KR1020080071106A KR100898824B1 (en) | 2007-07-29 | 2008-07-22 | Method and imaging system for obtaining panoramic and rectilinear images using rotationally symmetric wide-angle lens |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2009017331A1 true WO2009017331A1 (en) | 2009-02-05 |
Family
ID=40304519
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/KR2008/004338 WO2009017331A1 (en) | 2007-07-29 | 2008-07-24 | Method and apparatus for obtaining panoramic and rectilinear images using rotationally symmetric wide-angle lens |
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2009017331A1 (en) |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6356297B1 (en) * | 1998-01-15 | 2002-03-12 | International Business Machines Corporation | Method and apparatus for displaying panoramas with streaming video |
WO2005013001A2 (en) * | 2003-07-03 | 2005-02-10 | Physical Optics Corporation | Panoramic video system with real-time distortion-free imaging |
JP2005056295A (en) * | 2003-08-07 | 2005-03-03 | Iwane Kenkyusho:Kk | 360-degree image conversion processing apparatus |
WO2006112927A2 (en) * | 2005-04-14 | 2006-10-26 | Microsoft Corporation | System and method for head size equalization in 360 degree panoramic images |
KR20070061033A (en) * | 2005-12-08 | 2007-06-13 | 한국전자통신연구원 | Apparatus and method for generating video signal of multi-view panoramic video, apparatus and method for playing video |
- 2008-07-24 WO PCT/KR2008/004338 patent/WO2009017331A1/en active Application Filing
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP2444932A3 (en) * | 2010-08-31 | 2012-07-25 | Hitachi Information & Communication | Device, method and program for correcting an image |
US8630506B2 (en) | 2010-08-31 | 2014-01-14 | Hitachi Information & Communication Engineering, Ltd. | Image correcting device, method for creating corrected image, correction table creating device, method for creating correction table, program for creating correction table, and program for creating corrected image |
CN104901727A (en) * | 2014-02-26 | 2015-09-09 | 清华大学 | Unmanned aerial vehicle queue formation cooperative communication control system and method |
CN104901727B (en) * | 2014-02-26 | 2018-02-09 | 清华大学 | The communication control system and method for unmanned plane queue formation collaboration |
CN110021044A (en) * | 2018-01-10 | 2019-07-16 | 华晶科技股份有限公司 | The method and image acquiring device of taken the photograph object coordinates are calculated using double fish eye images |
CN110021044B (en) * | 2018-01-10 | 2022-12-20 | 华晶科技股份有限公司 | Method for calculating coordinates of shot object by using double-fisheye image and image acquisition device |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US8553069B2 (en) | Method and apparatus for obtaining panoramic and rectilinear images using rotationally symmetric wide-angle lens | |
US9762795B2 (en) | Method and apparatus for obtaining rectilinear images using rotationally symmetric wide-angle lens | |
US8798451B1 (en) | Methods of obtaining panoramic images using rotationally symmetric wide-angle lenses and devices thereof | |
KR100988872B1 (en) | Method and imaging system for obtaining complex images using rotationally symmetric wide-angle lens and image sensor for hardwired image processing | |
Nayar | Omnidirectional video camera | |
Nayar et al. | 360×360 mosaics |
Nayar | Omnidirectional vision | |
CN101271187B (en) | Non-dead angle binocular solid all-directional vision sensing equipment | |
CN103905792A (en) | 3D positioning method and device based on PTZ surveillance camera | |
KR102126159B1 (en) | Scanning panoramic camera and scanning stereoscopic panoramic camera | |
WO2009017332A1 (en) | Methods of obtaining panoramic images using rotationally symmetric wide-angle lenses and devices thereof | |
WO2009017331A1 (en) | Method and apparatus for obtaining panoramic and rectilinear images using rotationally symmetric wide-angle lens | |
Kweon et al. | Image-processing based panoramic camera employing single fisheye lens | |
Kweon | Panoramic image composed of multiple rectilinear images generated from a single fisheye image | |
KR101889225B1 (en) | Method of obtaining stereoscopic panoramic images, playing the same and stereoscopic panoramic camera | |
WO2016196825A1 (en) | Mobile device-mountable panoramic camera system method of displaying images captured therefrom | |
KR101762475B1 (en) | Image-processing-based digital pan·tilt·zoom·slide camera whereof the principal direction of vision of rectilinear image plane is periodically scanned along a predetermined trace | |
EP4095745B1 (en) | An image processor and a method therein for providing a target image | |
Edwards | 360° camera systems for surveillance and security | |
KR20200116682A (en) | Scanning mono and stereoscopic panoramic image acquisition device | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 200880108548.0 Country of ref document: CN |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 08792886 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2008283196 Country of ref document: AU |
|
WWE | Wipo information: entry into national phase |
Ref document number: 12670095 Country of ref document: US |
|
ENP | Entry into the national phase |
Ref document number: 2008283196 Country of ref document: AU Date of ref document: 20080724 Kind code of ref document: A |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
WWE | Wipo information: entry into national phase |
Ref document number: 586/CHENP/2010 Country of ref document: IN |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 08792886 Country of ref document: EP Kind code of ref document: A1 |