WO1999065245A1 - Method and system for providing a seamless omniview image from fisheye images - Google Patents

Method and system for providing a seamless omniview image from fisheye images

Info

Publication number
WO1999065245A1
WO1999065245A1 PCT/SG1999/000052
Authority
WO
WIPO (PCT)
Prior art keywords
image
sin
cos
angular
fisheye
Prior art date
Application number
PCT/SG1999/000052
Other languages
English (en)
Inventor
See Wan Toong
Original Assignee
Surreal Online Pte Ltd.
Priority date
Filing date
Publication date
Application filed by Surreal Online Pte Ltd. filed Critical Surreal Online Pte Ltd.
Priority to AU43060/99A priority Critical patent/AU4306099A/en
Publication of WO1999065245A1 publication Critical patent/WO1999065245A1/fr

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 3/00 Geometric image transformations in the plane of the image
    • G06T 3/12 Panospheric to cylindrical image transformations

Definitions

  • This invention relates generally to the fields of digital image rendering and image capturing systems, and in particular, to a method and system for providing a normal perspective image with improved resolution at the seams from a set of images captured using a fisheye lens.
  • Camera viewing systems that use a fisheye lens to capture images and convert the images to normal perspective images are known to those skilled in the art.
  • These systems capture images using a fisheye lens, which has the advantage of being able to capture an entire 180 degree hemispherical field-of-view without having to move the camera.
  • the resulting fisheye image which, in its original form, is distorted, is then converted to a normal perspective image using a digital image transformation technique.
  • In FIG. 1, shown schematically at 1 is the fisheye lens that provides an image of the environment with a 180 degree field-of-view.
  • The fisheye lens is attached to a camera 2, which converts the optical image into an electrical signal.
  • These signals are then digitized electronically 3 and stored in an image buffer 4 within the system.
  • An image processing system consisting of an X-MAP and a Y-MAP processor shown as 6 and 7, respectively, performs the two-dimensional transform mapping.
  • the image transform processors are controlled by the microcomputer and control interface 5.
  • the microcomputer control interface provides initialization and transform parameter calculation for the system.
  • the control interface also determines the desired transformation coefficients based on orientation angle, magnification, rotation, and light sensitivity input from an input means such as a joystick controller 12 or computer input means 13.
  • the transformed image is filtered by a 2-dimensional convolution filter 8 and the output of the filtered image is stored in an output image buffer 9.
  • the output image buffer 9 is scanned out by display electronics 10 to a video display device 11 for viewing.
  • a range of lens types can be accommodated to support various fields of view.
  • the lens optics 1 correspond directly with the mathematical coefficients used with the X-MAP and Y-MAP processors 6 and 7 to transform the image.
  • the capability to pan and tilt the output image remains even though a different maximum field of view is provided with a different lens element.
  • This prior art system can be realized by proper combination of a number of optical and electronic devices.
  • the fisheye lens 1 is exemplified by any of a series of wide angle lenses from, for example, Nikon, particularly the 8 mm F2.8.
  • Any video source 2 and image capturing device 3 that converts the optical image into electronic memory can serve as the input for the invention such as a Videk Digital Camera interfaced with Texas Instrument's TMS 34061 integrated circuits.
  • Input and output image buffers 4 and 9 can be constructed using Texas Instrument TMS44C251 video random access memory chips or their equivalents.
  • the control interface can be accomplished with any of a number of microcontrollers including the Intel 80C196.
  • the X-MAP and Y-MAP transform processors 6 and 7 and image filtering 8 can be accomplished with application-specific integrated circuits or other means as will be known to persons skilled in the art.
  • the display driver can also be accomplished with integrated circuits such as the Texas Instruments TMS34061.
  • The output video signal can be of the NTSC RS-170 format, for example, compatible with most commercial television displays in the United States.
  • Remote control 12 and computer control 13 are accomplished via readily available switches and/or computer systems, which also will be well known. These components function as a system to select a portion of the input image (fisheye or wide angle) and then mathematically transform the image to provide the proper perspective for output.
  • the keys to the success of the system include:
  • The image shown in FIG. 2 is a pen and ink rendering of the image of a grid pattern produced by a fisheye lens. This image has a field-of-view of 180 degrees and shows the contents of the environment throughout an entire hemisphere. Notice that the resulting image in FIG. 2 is significantly distorted relative to human perception.
  • Vertical grid lines in the environment appear in the image plane as 14a, 14b, and 14c.
  • Horizontal grid lines in the environment appear in the image plane as 15a, 15b, and 15c.
  • the image of an object is exemplified by 16.
  • A portion of the image in FIG. 2 has been corrected, magnified, and rotated to produce the image shown in FIG. 3.
  • Item 17 shows the corrected representation of the object in the output display.
  • The results shown in the image in FIG. 3 can be produced from any portion of the image of FIG. 2 using the prior art system. Note the perspective correction, as demonstrated by the straightening of the grid pattern displayed in FIG. 3. In the prior art system, these transformations can be performed at real-time video rates (30 times per second), compatible with commercial video standards.
  • This prior art system has the capability to pan and tilt the output image through the entire field of view of the lens element by changing the input means, e.g. the joystick or computer, to the controller.
  • the image can also be rotated through 360 degrees on its axis changing the perceived vertical of the displayed image.
  • This capability provides the ability to align the vertical image with the gravity vector to maintain a proper perspective in the image display regardless of the pan or tilt angle of the image.
  • the system also supports modifications in the magnification used to display the output image. This is commensurate with a zoom function that allows a change in the field of view of the output image. This function is extremely useful for inspection operations.
  • the magnitude of zoom provided is a function of the resolution of the input camera, the resolution of the output display, the clarity of the output display, and the amount of picture element (pixel) averaging that is used in a given display.
  • the system supports all of these functions to provide capabilities associated with traditional mechanical pan (through 180 degrees), tilt (through 180 degrees), rotation (through 360 degrees), and zoom devices.
  • the system also supports image intensity scaling that emulates the functionality of a mechanical iris by shifting the intensity of the displayed image based on commands from the user or an external computer.
  • the postulates and equations that follow are based on this prior art system utilizing a fisheye lens as the optical element.
  • the first property of a fisheye lens is that the lens has a 2π steradian field-of-view and the image it produces is a circle.
  • the second property is that all objects in the field-of-view are in focus, i.e., the perfect fisheye lens has an infinite depth-of-field.
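  • Regarding the first property, a full 180 degree hemispherical field-of-view corresponds to a solid angle of 2π steradians, which can be checked by integrating over the hemisphere:

```latex
\Omega = \int_{0}^{2\pi}\!\int_{0}^{\pi/2} \sin\beta \, d\beta \, d\delta = 2\pi \ \mathrm{sr}
```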
  • The two important postulates of the fisheye lens system (refer to FIGS. 4 and 5) are stated as follows:
  • Postulate 1: Azimuth angle invariability. For object points that lie in a content plane that is perpendicular to the image plane and passes through the image plane origin, all such points are mapped as image points onto the line of intersection between the image plane and the content plane, i.e., along a radial line.
  • the azimuth angle of the image points is therefore invariant to elevation and object distance changes within the content plane.
  • Postulate 2: Equidistant projection rule. The radial distance, r, from the image plane origin along the azimuth angle containing the projection of the object point is linearly proportional to the zenith angle β, where β is defined as the angle between a perpendicular line through the image plane origin and the line from the image plane origin to the object point.
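  • As an illustration of the two postulates, the following sketch (an illustrative helper, not taken from the patent text) maps an object point to fisheye image coordinates for an ideal 180 degree lens: the azimuth δ of the image point is taken directly from the object point (Postulate 1), and the radial distance grows linearly with the zenith angle β, reaching the image circle radius R at β = 90 degrees (Postulate 2).

```python
import math

def fisheye_image_point(X, Y, Z, R):
    """Project an object point (X, Y, Z), with +Z along the optical axis,
    onto fisheye image coordinates (x, y) for an ideal 180 degree lens.

    Postulate 1: the azimuth angle delta is preserved by the projection.
    Postulate 2: the radial distance r is linearly proportional to the
    zenith angle beta (equidistant projection), with r = R at beta = 90 deg.
    """
    beta = math.atan2(math.hypot(X, Y), Z)    # zenith angle from the optical axis
    delta = math.atan2(Y, X)                  # azimuth angle of the object point
    r = R * beta / (math.pi / 2.0)            # equidistant projection rule
    return r * math.cos(delta), r * math.sin(delta)
```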
  • FIG. 4 shows the coordinate reference frames for the object plane and the image plane.
  • the coordinates u, v describe object points within the object plane.
  • the coordinates x,y,z describe points within the image coordinate frame of reference.
  • the object plane shown in FIG. 4 is a typical region of interest to determine the mapping relationship onto the image plane to properly correct the object.
  • the direction of view vector, DOV[x,y,z] determines the zenith and azimuth angles for mapping the object plane, UV, onto the image plane, XY.
  • The object plane is defined to be perpendicular to the vector DOV[x,y,z].
  • The formulas for the transformation used to obtain a perspective corrected image are the following:
  • A = (cos φ cos δ - sin φ sin δ cos β)
  • B = (sin φ cos δ + cos φ sin δ cos β)
  • C = (cos φ sin δ + sin φ cos δ cos β)
  • D = (sin φ sin δ - cos φ cos δ cos β)
  • where:
  • R = radius of the image circle
  • β = zenith angle
  • δ = azimuth angle in the image plane
  • φ = object plane rotation angle
  • x, y = image plane coordinates.
  • the equations 2PA and 3PA provide a direct mapping from the UV space to the XY image space and are the fundamental mathematical result that supports the functioning of the prior art system.
  • the locations of x and y in the imaging array can be determined.
  • This approach provides a means to transform an image from the input video buffer to the output video buffer exactly.
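  • As a reference for the coefficients defined above, the sketch below simply evaluates A through D from the three angles; equations 2PA and 3PA then combine these coefficients with the object plane coordinates (u, v) and the image circle radius R to yield the image coordinates (x, y). The function name and argument order are illustrative.

```python
import math

def perspective_coefficients(phi, delta, beta):
    """Coefficients A, B, C, D of the prior-art perspective-correction
    transform, as defined above (all angles in radians):
      phi   - object plane rotation angle
      delta - azimuth angle in the image plane
      beta  - zenith angle
    Equations 2PA and 3PA combine these with the object plane coordinates
    (u, v) and the image circle radius R to obtain image coordinates (x, y)."""
    A = math.cos(phi) * math.cos(delta) - math.sin(phi) * math.sin(delta) * math.cos(beta)
    B = math.sin(phi) * math.cos(delta) + math.cos(phi) * math.sin(delta) * math.cos(beta)
    C = math.cos(phi) * math.sin(delta) + math.sin(phi) * math.cos(delta) * math.cos(beta)
    D = math.sin(phi) * math.sin(delta) - math.cos(phi) * math.cos(delta) * math.cos(beta)
    return A, B, C, D
```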
  • the system is completely symmetrical about the zenith, therefore, the vector assignments and resulting signs of various components can be chosen differently depending on the desired orientation of the object plane with respect to the image plane.
  • This system has many uses. For instance, it can be used as a surveillance system, U.S. Pat. No.
  • the present invention employs a two-step transformation where the fisheye image is first mathematically transformed into an intermediate angular image, and the angular image is then transformed into a perspective corrected image.
  • the present system begins by first obtaining a set of three fisheye images separated by 120 degrees. This can be accomplished simply by taking a first picture with a camera using the fisheye lens, rotating the camera view by 120 degrees and taking another picture, rotating the view again by 120 degrees and taking the third and the last picture.
  • each of the three fisheye images has a 60 degree image overlap with its neighboring fisheye image.
  • The images are transformed into angular images using a mathematical transformation.
  • The resulting three angular images are then combined to form a single angular image having a 360 degree field-of-view. Because the fisheye images were taken 120 degrees apart using a fisheye lens having a 180 degree (hemispherical) field-of-view, there is 60 degrees of overlap between the adjacent images.
  • the precise boundaries of the overlap are determined basically by comparing the pixels of the overlapping images and looking for a match.
  • the image in the 60 degree overlap of one angular image should be similar to that of the image of its corresponding overlapping angular image, but they are not identical due to variations in the resolution. Therefore, the boundaries are determined by finding the degree of overlap which produces the minimum difference in pixel intensity value between the two overlapping angular images.
  • the light intensities of the duplicative pixels in the overlapping area are adjusted. Because the resolution of the angular image is better near the center than the edges, the intensity of the pixels which are nearer to the center is increased while the intensity of the pixels which are nearer to the edge is lowered. In the preferred embodiment, for each angular image, 30 degrees of the overlap area which is nearest to the center is given 90 percent intensity, while the other 30 degrees of the overlap area which is farthest from the center (and nearest to the edge) is given 10 percent intensity. Once the proper intensity levels have been applied to the sections of the overlap area, much of the picture degradation attributed to the edges can be eliminated to produce a seamless look.
  • The combined angular image is finally transformed into the perspective corrected image. This is done basically by converting the points in the combined angular image and applying a transformation matrix to perform tilt, pan and roll to obtain the points in the perspective image.
  • the present transformation method of first converting the fisheye image to the angular image, and then converting the angular image to the perspective corrected image offers a number of advantages over the prior art.
  • The angular images, due to their mathematical properties, can be easily combined, unlike the fisheye images. Depending on how much overlap there is, much of the resolution degradation can be eliminated.
  • the overlap area provides a reference point to compare light intensities of the respective images, and hence, allows a convenient way to normalize the lighting for the final perspective corrected image.
  • the angular image generally takes up less memory space than the fisheye image of comparable field-of-view.
  • FIG. 1 shows a schematic block diagram of the prior art system illustrating the major components thereof.
  • FIG. 2 (prior art) is an example sketch of a typical fisheye image used as input by the prior art system.
  • FIG. 3 (prior art) is an example sketch of the output image after correction for a desired image orientation and magnification within the original image.
  • FIG. 4 (prior art) is a schematic diagram of the fundamental geometry that the prior art system embodies to accomplish the image transformation.
  • FIG. 5 (prior art) is a schematic diagram demonstrating the projection of the object plane and position vector into image plane coordinates.
  • FIG. 6 is a mathematical representation of the fundamental geometry that the present invention embodies to accomplish the image transformation from the fisheye image to angular image.
  • FIG. 7 is a mathematical representation of the angular image plotted on a (θ, φ) coordinate system.
  • FIG. 8 is a schematic diagram illustrating how the three angular images are combined.
  • FIG. 9 is a mathematical representation of the combined angular image plotted on a (θ, φ) coordinate system.
  • The present invention employs the same system hardware as was used in the prior art system described in the Background section. However, by employing a unique mathematical transformation algorithm, the present invention is able to overcome the shortcomings mentioned above. For convenience in describing the present invention, it shall be assumed that the hardware described above is used, though another similar set of hardware can be employed as well.
  • The present invention employs a two-step transformation where the fisheye image is first mathematically transformed into an intermediate angular image, and the angular image is then transformed into a perspective corrected image. It should be appreciated by one skilled in the art, however, that while the transformation techniques differ, the two basic properties and the two basic postulates discussed in the Background portion still apply to the present invention.
  • fisheye image shall be used to refer to the digitized form of the image directly obtained by taking a picture using the fisheye lens.
  • angular image shall be used to denote an intermediate image obtained by mathematically transforming the digitized fisheye image. It should be understood, however, that the angular image is not really an "image" in the sense that it is not used for viewing; it is simply a mathematical conversion which gives the present invention its flexibility.
  • perspective corrected image shall refer to the final image which is used for viewing. To obtain a perspective corrected image having a 360 degree field- of-view, the present system begins by first obtaining a set of three fisheye images separated by 120 degrees.
  • The fisheye lens preferably has a 180 degree (hemispherical) field-of-view, though it is possible to use a fisheye lens having less than a 180 degree field-of-view (in which case more pictures need to be taken or the degree of overlap will be different).
  • Each of the three fisheye images has a 60 degree image overlap with its neighboring fisheye image. The purpose and usefulness of this overlap will be discussed further below.
  • FIG. 6 shows a mathematical representation of a fisheye image with a point on the fisheye image being represented by (x,y) and its direction vector represented by P[dx,dy,dz].
  • FIG. 7 illustrates the angular image, with a point on the image being represented by (θ, φ). The following equations are used for converting the fisheye image into the angular image via P[dx,dy,dz] coordinates.
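  • The conversion can be sketched as follows (an illustrative reconstruction under the equidistant projection assumption; the function name and the exact angle conventions are assumptions, since the equations are not spelled out above): each fisheye pixel (x, y) is turned into a direction vector P[dx, dy, dz], from which the two angles of the angular image are read off.

```python
import math

def fisheye_to_angular(x, y, R):
    """Illustrative sketch: convert a fisheye image point (x, y), measured
    from the image circle center, into a direction vector P[dx, dy, dz]
    and then into angular-image coordinates (theta, phi) in radians.

    Assumes an ideal equidistant fisheye: the radial distance r maps
    linearly to the zenith angle beta, with beta = 90 deg at r = R."""
    r = math.hypot(x, y)
    delta = math.atan2(y, x)                  # azimuth angle in the image plane
    beta = (r / R) * (math.pi / 2.0)          # zenith angle from the equidistant rule
    dx = math.sin(beta) * math.cos(delta)     # direction vector P[dx, dy, dz]
    dy = math.sin(beta) * math.sin(delta)
    dz = math.cos(beta)
    theta = math.atan2(dx, dz)                # horizontal angle of the angular image
    phi = math.asin(dy)                       # vertical angle of the angular image
    return theta, phi
```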
  • Each of the three fisheye images is converted to an angular image.
  • The angular image is plotted on a (θ, φ) coordinate system, making it easy to combine the images.
  • The resulting three angular images are combined as illustrated in FIG. 8. Because the fisheye images were taken 120 degrees apart using a fisheye lens having a 180 degree field-of-view, there is 60 degrees of overlap between the adjacent images. Hence, as can be seen from FIG. 8, angular image 1 overlaps angular image 2 by about 60 degrees, and angular image 2 overlaps angular image 3 also by about 60 degrees.
  • the precise boundaries of the overlap are determined basically by comparing the pixels of the overlapping images and looking for a match.
  • The image in the 60 degree overlap of one angular image should be similar to the image of its corresponding overlapping angular image, but they are not identical due to variations in the resolution. Therefore, the boundaries are determined by finding the degree of overlap which produces the minimum difference in pixel intensity value, e, between the two overlapping angular images.
  • the degree of overlap which produces the minimum e value is chosen. Since it is known in the beginning approximately how much overlap there will be, in this case 60 degrees, it is more efficient to try out only those values near 60 degrees, e.g., between 40 degrees and 70 degrees, and not all values need to be tested. Once the boundaries are determined, the images are combined.
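  • A minimal sketch of this boundary search follows (the use of a mean squared intensity difference as the error value e and the one-column-per-degree layout are assumptions): candidate overlaps between 40 and 70 degrees are tried, and the width giving the smallest difference between the two overlapping strips is kept.

```python
import numpy as np

def best_overlap(img_a, img_b, min_cols=40, max_cols=70):
    """Illustrative overlap search between two angular images stored as
    2-D intensity arrays with the same number of rows and one column per
    degree (an assumption for simplicity).

    For each candidate overlap width, compare the right edge of img_a with
    the left edge of img_b and keep the width with the smallest mean
    squared intensity difference e."""
    best_width, best_e = None, float("inf")
    for width in range(min_cols, max_cols + 1):
        strip_a = img_a[:, -width:].astype(float)   # trailing columns of image a
        strip_b = img_b[:, :width].astype(float)    # leading columns of image b
        e = np.mean((strip_a - strip_b) ** 2)       # error value for this overlap
        if e < best_e:
            best_width, best_e = width, e
    return best_width, best_e
```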
  • the final angular image after the three angular images are combined is shown in FIG. 9.
  • The combined angular image has the full 360 degree field-of-view.
  • the light intensities of the duplicative pixels in the overlapping area are adjusted. Because the resolution of the angular image is better near the center (at 0 degrees, FIG. 7) than the edges (at 90 and -90 degrees, FIG. 7), the intensity of the pixels which are nearer to the center is increased while the intensity of the pixels which are nearer to the edge is lowered.
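  • The 90 percent / 10 percent weighting can be sketched as follows (a minimal illustration; the column-per-degree layout and the linear combination of the two contributions are assumptions): within the 60 degree overlap, the half nearer an image's own center contributes at 90 percent intensity and the half nearer its edge at 10 percent, so the two weighted contributions together make up the blended overlap.

```python
import numpy as np

def blend_overlap(strip_a, strip_b):
    """Illustrative blend of the 60 degree overlap region of two angular
    images (2-D arrays with one column per degree, an assumption).

    strip_a: the overlap as seen by image A, whose center lies to the left,
             so A's first 30 columns are nearest A's center.
    strip_b: the same overlap as seen by image B, whose center lies to the
             right, so B's last 30 columns are nearest B's center.
    Each image contributes 90 percent intensity over the 30 degrees nearest
    its own center and 10 percent over the 30 degrees nearest its edge."""
    cols = strip_a.shape[1]
    w_a = np.where(np.arange(cols) < cols // 2, 0.9, 0.1)   # weights for image A
    w_b = 1.0 - w_a                                         # complementary weights for B
    return strip_a * w_a + strip_b * w_b
```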
  • the equations 4 and 5 provide a means for mapping from the angular image of FIG. 9 to the perspective corrected image.
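  • Equations 4 and 5 are not reproduced above, but the mapping they describe can be sketched as follows (an illustrative reconstruction; the pinhole projection and the particular pan/tilt/roll matrix composition are assumptions): each pixel of the desired perspective view is traced to a viewing direction, rotated by the requested pan, tilt and roll, and that direction is looked up in the combined angular image.

```python
import numpy as np

def perspective_pixel_to_angular(px, py, focal, pan, tilt, roll):
    """Illustrative mapping from a pixel (px, py) of the desired perspective
    view (measured from the view center, pinhole model with the given focal
    length) to coordinates (theta, phi) in the combined angular image.

    The pan/tilt/roll rotation matrix composition is an assumption; the
    text above only states that a transformation matrix performs tilt, pan
    and roll on points of the combined angular image."""
    cp, sp = np.cos(pan), np.sin(pan)
    ct, st = np.cos(tilt), np.sin(tilt)
    cr, sr = np.cos(roll), np.sin(roll)
    R_pan = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])    # rotation about vertical axis
    R_tilt = np.array([[1, 0, 0], [0, ct, -st], [0, st, ct]])   # rotation about horizontal axis
    R_roll = np.array([[cr, -sr, 0], [sr, cr, 0], [0, 0, 1]])   # rotation about viewing axis
    d = np.array([px, py, focal], dtype=float)
    d = R_pan @ R_tilt @ R_roll @ d           # rotate the viewing direction
    d = d / np.linalg.norm(d)
    theta = np.arctan2(d[0], d[2])            # horizontal angle in the angular image
    phi = np.arcsin(d[1])                     # vertical angle in the angular image
    return theta, phi
```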
  • the perspective corrected image can be further enhanced through the various known image processing techniques.
  • the present transformation method of first converting the fisheye image to the angular image, and then converting the angular image to the perspective corrected image offers a number of advantages over the prior art.
  • The angular images, due to their mathematical properties, can be easily combined, unlike the fisheye images. Depending on how much overlap there is, much of the resolution degradation can be eliminated.
  • the overlap area provides a reference point to compare light intensities of the respective images, and hence, allows a convenient way to normalize the lighting for the final perspective corrected image.
  • the angular image generally takes up less memory space than the fisheye image of comparable field-of-view.
  • the present invention may be embodied in other specific forms without departing from the spirit or essential characteristics thereof.
  • While the preferred embodiment of the transformation method was described in relation to an image having a 360 degree field-of-view, it is possible to use the present method on an image having less than a 360 degree field-of-view by using a fisheye lens having less than a 180 degree field-of-view.
  • While the preferred embodiment combines three angular images, it is possible to combine only two angular images if less than a 360 degree field-of-view is desired for the final perspective image.

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Image Processing (AREA)

Abstract

The invention concerns a method and a system for providing a corrected normal perspective image from a set of images captured with a fisheye lens. Three fisheye images spaced 120° apart are obtained using a fisheye lens with a 180° field-of-view. Each of the fisheye images is transformed into an angular image (1, 2, 3). The three angular images (1, 2, 3) are combined with 60° of overlap (10, 20) so as to form a single combined angular image with a 360° field-of-view. The relative intensity of the pixels in the overlap area is adjusted, and the final combined angular image is then transformed into a perspective corrected image, which can be viewed.
PCT/SG1999/000052 1998-06-11 1999-06-03 Method and system for providing a seamless omniview image from fisheye images WO1999065245A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
AU43060/99A AU4306099A (en) 1998-06-11 1999-06-03 A method and system for providing a seamless omniview image from fisheye images

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
SG1998001398A SG77639A1 (en) 1998-06-11 1998-06-11 A method and system for providing a seamless perspective corrected image from fisheye images
SG9801398-0 1998-06-11

Publications (1)

Publication Number Publication Date
WO1999065245A1 true WO1999065245A1 (fr) 1999-12-16

Family

ID=20430024

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/SG1999/000052 WO1999065245A1 (fr) 1999-06-03 Method and system for providing a seamless omniview image from fisheye images

Country Status (4)

Country Link
AU (1) AU4306099A (fr)
SG (1) SG77639A1 (fr)
TW (1) TW381399B (fr)
WO (1) WO1999065245A1 (fr)

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5563650A (en) * 1992-11-24 1996-10-08 Geeris Holding Nederland B.V. Method and device for producing panoramic images, and a method and device for consulting panoramic images
US5691765A (en) * 1995-07-27 1997-11-25 Sensormatic Electronics Corporation Image forming and processing device and method for use with no moving parts camera

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2360413A (en) * 2000-03-16 2001-09-19 Lee Scott Friend Wide angle parabolic imaging and image mapping apparatus

Also Published As

Publication number Publication date
SG77639A1 (en) 2001-01-16
AU4306099A (en) 1999-12-30
TW381399B (en) 2000-02-01

Similar Documents

Publication Publication Date Title
USRE36207E (en) Omniview motionless camera orientation system
US6201574B1 (en) Motionless camera orientation system distortion correcting sensing element
US7336299B2 (en) Panoramic video system with real-time distortion-free imaging
US8326077B2 (en) Method and apparatus for transforming a non-linear lens-distorted image
US7161615B2 (en) System and method for tracking objects and obscuring fields of view under video surveillance
US7382399B1 (en) Omniview motionless camera orientation system
US6977676B1 (en) Camera control system
Nalwa A true omnidirectional viewer
US6002430A (en) Method and apparatus for simultaneous capture of a spherical image
DE69727052T2 (de) Omnidirectional image pickup device
JP3012142B2 (ja) Full-field-of-view motionless camera surveillance system
JP2005006341A (ja) Panoramic image generator
KR20090012291A (ko) Method for obtaining omnidirectional and rectilinear aberration-corrected images using a rotationally symmetric wide-angle lens, and imaging system therefor
KR19990036920A (ko) Panoramic viewing system having offset virtual optical centers
JP2001136518A (ja) Compact high-resolution panoramic image display system
EP1042697A1 (fr) Dispositif omnidirectionnel servant a capter des images
US6345129B1 (en) Wide-field scanning tv
JP3594225B2 (ja) Wide-field-of-view camera apparatus
WO1999065245A1 (fr) Method and system for providing a seamless omniview image from fisheye images
JPWO2011158344A1 (ja) Image processing method, program, image processing apparatus, and imaging apparatus
WO1996008105A1 (fr) Method for creating image data
JP2003512783A (ja) Camera with peripheral vision
CN110581959A (zh) Multiple imaging device and multiple imaging method
JP2005005816A (ja) Wide-angle camera and wide-angle camera system
JP3934345B2 (ja) Imaging apparatus

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AL AM AT AU AZ BA BB BG BR BY CA CH CN CU CZ DE DK EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MD MG MK MN MW MX NO NZ PL PT RO RU SD SE SI SK SL TJ TM TR TT UA UG US UZ VN YU ZW

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): GH GM KE LS MW SD SL SZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE BF BJ CF CG CI CM GA GN GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)
REG Reference to national code

Ref country code: DE

Ref legal event code: 8642

122 Ep: pct application non-entry in european phase