AU2003214899A1 - Stereoscopic Panoramic Image Capture Device - Google Patents


Info

Publication number
AU2003214899A1
Authority
AU
Australia
Prior art keywords
image
combined
panoramic
imaging system
images
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
AU2003214899A
Inventor
Fred Good
Trent Grover
Steven Herrnstadt
Don Pierce
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Micoy Corp
Original Assignee
Micoy Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Micoy Corp filed Critical Micoy Corp
Publication of AU2003214899A1 publication Critical patent/AU2003214899A1/en
Assigned to MICOY CORPORATION reassignment MICOY CORPORATION Amend patent request/document other than specification (104) Assignors: PRAIRIE LOGIC, INC.
Abandoned legal-status Critical Current


Classifications

    • G02B 13/06 Panoramic objectives; so-called "sky lenses" including panoramic objectives having reflecting surfaces
    • G06T 7/30 Determination of transform parameters for the alignment of images, i.e. image registration
    • H04N 13/243 Image signal generators using stereoscopic image cameras using three or more 2D image sensors
    • H04N 23/698 Control of cameras or camera modules for achieving an enlarged field of view, e.g. panoramic image capture
    • H04N 13/194 Transmission of image signals
    • H04N 13/221 Image signal generators using stereoscopic image cameras using a single 2D image sensor using the relative movement between cameras and objects
    • H04N 13/239 Image signal generators using stereoscopic image cameras using two 2D image sensors having a relative position equal to or related to the interocular distance
    • H04N 13/286 Image signal generators having separate monoscopic and stereoscopic modes
    • H04N 13/344 Displays for viewing with the aid of special glasses or head-mounted displays [HMD] with head-mounted left-right displays
    • H04N 2013/0081 Depth or disparity estimation from stereoscopic image signals
    • H04N 2013/0088 Synthesising a monoscopic image signal from stereoscopic images, e.g. synthesising a panoramic or high resolution monoscopic image

Description

WO 2004/068865 PCT/US2003/002285

STEREOSCOPIC PANORAMIC IMAGE CAPTURE DEVICE

Technical Field

Embodiments of the invention relate in general to a panoramic image capture device and, more specifically, to a panoramic image capture device for producing a stereoscopic panoramic image.

Background

Panoramic cameras are known in the art. Such cameras often use a single, rotatable camera. Although such devices are suitable for stationary images, they typically produce blurred or distorted images when used to capture non-stationary objects. It is also known in the art to utilize an image capture system having a plurality of image capture devices. In this manner, a plurality of images can be captured substantially simultaneously and stitched together using processes known in the art. Although such systems substantially eliminate the problems associated with capturing objects in motion, they do not provide means for producing a stereoscopic image.

It is also known in the art to use a "fish-eye" lens to capture a panoramic image. Such lenses, however, import a large amount of distortion into the resulting image and capture images of relatively low quality. Accordingly, it would be desirable to produce a panoramic image of lower distortion and higher quality.

Typically, to capture a stereoscopic image, two imaging systems are positioned next to one another to capture a particular image. Unfortunately, this method cannot be extrapolated to producing a stereoscopic panoramic image, as one image capture device would necessarily fall into the field of view of the adjacent image capture device. It would be desirable, therefore, to provide a panoramic image capture system which could be utilized to produce a stereoscopic pair of panoramic images for a stereoscopic display of a particular image.
Summary of the Invention

In an advantage provided by this invention, an image capture system produces a stereoscopic pair of panoramic images. Advantageously, this invention provides an image capture system for producing a seamless panoramic image. Advantageously, this invention provides an image capture system for producing a stereoscopic panoramic image in motion. Advantageously, this invention provides a stereoscopic panoramic image of minimal distortion. Advantageously, this invention provides an imaging system for full-motion, real-time, panoramic stereoscopic imaging.

Advantageously, in a preferred example of this invention, an imaging system is provided comprising a first image capture device, a second image capture device, and a third image capture device. Means are also provided for combining at least a first portion of a first image captured using the first image capture device with a portion of a second image captured using the second image capture device to produce a first combined image. Means are also provided for combining at least a second portion of the first image with at least a portion of a third image captured using the third image capture device, to produce a second combined image.

In the preferred embodiment, a plurality of image capture devices are utilized to produce a plurality of images, a portion of each of which is combined with a portion of adjacent images to produce a first combined panoramic image. Similarly, a second portion of each image is combined with separate portions of adjacent images to produce a second combined panoramic image. Preferably, the first combined panoramic image and the second combined panoramic image are displayed in a stereoscopic orientation to produce a stereoscopic panoramic image. The imaging system of the present invention may be utilized to capture a plurality of images, to produce a full-motion stereoscopic panoramic image.
Brief Description of the Drawings

The present invention will now be described, by way of example, with reference to the accompanying drawings, in which:

Fig. 1 illustrates a front elevation of the imaging system of the present invention;

Fig. 2 illustrates a graphic depiction of image capture regions of adjacent image capture devices of the imaging system of the present invention;

Fig. 3 illustrates a bottom perspective view of the final panoramic stereoscopic image displayed on a screen, and the polarized glasses used to view the image;

Figs. 4A-4B illustrate a left panoramic image and a right panoramic image;

Figs. 5A-5B illustrate images associated with a first image buffer and a second image buffer;

Figs. 6A-6C are a flowchart of the transformation process utilized in association with the imaging system of the present device, to transform the plurality of images captured with the imaging devices into a first panoramic image and a second panoramic image to produce a stereoscopic panoramic image; and

Fig. 7 illustrates a perspective view of a 360-degree unobstructed panoramic stereoscopic image, created using the camera of Fig. 1.

Detailed Description of the Preferred Embodiment

Referring to Fig. 1, a camera (10) is shown having a body (12) constructed of plastic or other similar lightweight material. In the preferred embodiment, the body (12) is substantially spherical, having a diameter preferably between about 0.001 and about 500 centimeters and, more preferably, between about 10 and about 50 centimeters. Provided substantially equally spaced across the surface of the body (12) are a plurality of lenses (14). The lenses (14) are preferably circular lenses having a diameter of preferably between about 5 angstroms and about 10 centimeters and, more preferably, between about 0.5 and about 5 centimeters.
In the preferred embodiment, the lenses are model number BCL38C 3.8 millimeter micro lenses, manufactured by CBC America, located at 55 Mall Drive, Commack, NY 11725. As shown in Fig. 2, the lenses (14) are each associated with a charge-coupled device (CCD) assembly (16), such as those well known in the art. Although in the preferred embodiment a GP-CX171/LM CCD color board camera, manufactured by Panasonic and available from Rock2000.com, is used, any known image capture system may be used. Thus, for the purposes of this disclosure, the lenses (14) and/or CCD assemblies (16) can also be described as "image capture units" (26), (40), and (54). As shown in Fig. 2, all of the image capture units (26), (40), and (54) are operationally coupled to a processing unit (CPU) (22), which can thereby receive images from the image capture devices (14) and/or (16). In the preferred embodiment, the CPU (22) is a 900 MHz, Pentium® 4 class personal computer provided with an Oxygen GVX210 graphics card manufactured by 3Dlabs of 480 Pontrero, Sunnyvale, CA 94086. Although the CPU may be of any type known in the art, it is preferably capable of quad buffering and utilizing page-flipping software in a manner such as that known in the art. The CPU (22) may be coupled to a head mounted display (24) which, in the preferred embodiment, is a VFX3D, manufactured by Interactive Imaging Systems, Inc., located at 2166 Brighton Henrietta Townline Road, Rochester, NY 14623.

As shown in Fig. 2, the lenses (14) are divergent from one another, and offset about twenty degrees from one another along the substantially arcuate path or line defined by the body (12). This offset, which may be from about 5 to about 45 degrees along the direction of the substantially arcuate path, allows each lens (14) and/or image capture unit (26), (40), (54) to have a substantially similar focal point.
Each lens (14) has about a fifty-three-degree field of view, which overlaps the field of view of a laterally adjoining lens by between about ten and about ninety percent and, preferably, between about fifty and about sixty-five percent. As shown in Fig. 2, a first image capture unit (26) is associated with an optical axis (28), bisecting the image bordered on one side by a left side plane (30) and on the other side by a right side plane (32). The lens (14) of the image capture unit (26) is focused on a defined image plane (34), divided into a left image plane (36) and a right image plane (38). Similarly, a second image capture unit (40) is also provided with an optical axis (42), a left side plane (44), a right side plane (46), a defined image plane (48), a left image plane (50), and a right image plane (52). A third image capture unit (54), to the right of the first image capture unit (26), is provided with an optical axis (56), a left side plane (58), a right side plane (60), a defined image plane (62), a left image plane (64), and a right image plane (66).

By providing a plurality of image capture units (26), (40) and (54), dividing the defined image planes (34), (48) and (62) into portions, such as halves, and orienting the image capture units (26), (40) and (54) as shown in Fig. 2, every point associated with a final panoramic image is within the defined image plane of at least two adjacent image capture units (26), (40) and/or (54). As shown in Figs. 5A-5B, the defined image planes of adjacent image capture units overlap vertically, preferably about 1-20 percent, more preferably about 5-10 percent, and about 7 percent in the preferred embodiment. Similarly, the defined image planes of adjacent image capture units overlap horizontally, preferably about 1-20 percent, more preferably about 5-10 percent, and about 6 percent in the preferred embodiment. These overlaps aid in the "stitching" process described below.
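The overlap figures quoted above can be sanity-checked from the geometry: with an assumed 53-degree field of view per lens and a 20-degree offset between adjacent lenses, the shared angular region is 53 − 20 = 33 degrees, roughly 62 percent of each lens's view, consistent with the preferred fifty-to-sixty-five percent range. A minimal sketch (variable names are illustrative, not from the patent):

```python
# Illustrative check of the lens-overlap geometry described in the text.
fov_deg = 53.0      # field of view of each lens, per the specification
offset_deg = 20.0   # angular offset between adjacent lenses

overlap_deg = fov_deg - offset_deg        # angular region seen by both lenses
overlap_fraction = overlap_deg / fov_deg  # fraction of each lens's view shared

print(f"overlap: {overlap_deg:.0f} degrees, "
      f"{overlap_fraction:.0%} of each field of view")
# 33 degrees, about 62 percent of each field of view
```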
To produce the final panoramic image (68) of the present invention, which may display about 180 degrees of a scene (e.g., a hemisphere), a first panoramic image (72) and a second panoramic image (74) are created. (Figs. 3 and 4A-4B). Each of the panoramic images (72), (74) may display about 90 degrees of a scene (e.g., about half of a hemisphere). To create the first panoramic image (72), an image (76), associated with the left image plane (36) of the first image capture unit (26), is combined with an image (78), associated with the left image plane (50) of the second image capture unit (40), and an image (80), associated with the left image plane (64) of the third image capture unit (54). (Figs. 2, 4A and 5A). As shown in Fig. 2, the associated image planes (36), (50) and (64) preferably overlap by about 0.5-30 percent, more preferably by about 10-20 percent, and about 13 percent in the preferred embodiment, but are not parallel to one another, and are not necessarily tangent to a curve defining the first panoramic image (72). (Figs. 2 and 4A). Accordingly, the images (76), (78), (80), (82), (84) and (86), associated with these image planes, must be transformed, to remove distortion associated with their non-parallel orientation, before they can be stitched together as described below to form the final panoramic image (68). (Figs. 2, 3, 4A-4B and 5A-5B).

Once the images (76), (78) and (80), associated with the left image planes (36), (50) and (64), and the images (82), (84) and (86), associated with the right image planes (38), (52) and (66), have been collected and received from all of the image capture units (26), (40) and (54), the images may be transmitted via hardwired, wireless, or any desired connection to the CPU (22). The CPU (22) then may operate to transform the images in accordance with the process described in Figs. 6A-6C.
As shown in block (88), source images, which in the preferred embodiment are substantially rectilinear images, but which may, of course, be any type of image, are obtained or received from the image capture units (26), (40) and (54) by the CPU (22). (Figs. 2, 4A, 4B and 6A-6C). As shown in block (94), the CPU (22) may then define registration pixel pairs for the untransformed source images. Thereafter, as shown in block (96), the CPU (22) creates an input file.

The input file includes the height and width of the final panoramic image (68), the source information, and registration point information. The source information includes the file name and path of the source image; the height and width of the source image in pixels; the yaw, pitch and roll angles of the associated image capture unit; the horizontal field of view of the image capture unit, which is preferably between about 1 and about 80 degrees, more preferably between about 30 and about 60 degrees, and about 53 degrees in the preferred embodiment, and is defined by the associated left side plane and right side plane; and the X Shift, Y Shift and zoom values of the source image. The information associated with the registration points includes information regarding the source image associated with the first pixel position of the registration point, a horizontal and vertical pixel position in the first source image, the list index of information regarding the source image associated with the second pixel position, and a horizontal and vertical pixel position in the second source image.

The images (76-86) associated with the image planes (36, 38, 50, 52, 64 and 66) are substantially rectilinear, normal flat-field images, and the panoramic images (72 and 74) are at least partially equirectangular, representing pixel mapping on a spherical surface in a manner such as that shown in Figs. 4A-4B.
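The input file described above can be pictured as a small data structure; the sketch below is an assumed Python rendering (the field names are hypothetical, since the patent specifies only the information content, not a format):

```python
# Hypothetical layout of the transformation input file described in the text.
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class SourceImage:
    path: str                 # file name and path of the source image
    width_px: int             # source image width, in pixels
    height_px: int            # source image height, in pixels
    yaw: float                # rotation angles of the associated
    pitch: float              #   image capture unit, in degrees
    roll: float
    h_fov_deg: float = 53.0   # horizontal field of view (about 53 degrees)
    x_shift: float = 0.0      # X Shift of the source image
    y_shift: float = 0.0      # Y Shift of the source image
    zoom: float = 1.0         # zoom value of the source image

@dataclass
class RegistrationPoint:
    src_a: int                # list index of the first source image
    pos_a: Tuple[int, int]    # horizontal/vertical pixel position in image a
    src_b: int                # list index of the second source image
    pos_b: Tuple[int, int]    # position of the same scene point in image b

@dataclass
class InputFile:
    pano_width: int           # width of the final panoramic image
    pano_height: int          # height of the final panoramic image
    sources: List[SourceImage] = field(default_factory=list)
    registration: List[RegistrationPoint] = field(default_factory=list)
```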
Accordingly, once the CPU (22) creates the input file, as shown in block (98), the registration pixel pairs can be transformed to locate their positions in the final panoramic image (68). Starting with an arbitrary source image, a vector is defined that represents a first pixel of a given registration pixel pair in three-dimensional space, locating it on the final panoramic image (68). This is accomplished by applying the following transformation to each pixel:

Define segment x: Alter the horizontal position of the pixel in the source image so that it is relative to the image's center. Then compensate for the X Shift and zoom variables of the source image.

Define segment y: Alter the vertical position of the pixel in the source image so that it is relative to the image's center. Then compensate for the Y Shift and zoom variables of the source image.

Define segment z: Using various source image variables, determine the z segment that corresponds to the scale provided by the image's size in pixels.

Transform the vector so that it corresponds to the rotation angles of the source camera. This is calculated by transforming the vector by each of three rotational matrices:

Rotation about the x-axis:

    ( 1        0            0         0 )
    ( 0    cos(pitch)   sin(pitch)    0 )
    ( 0   -sin(pitch)   cos(pitch)    0 )
    ( 0        0            0         1 )

Rotation about the y-axis:

    ( cos(yaw)   0   -sin(yaw)   0 )
    (    0       1       0       0 )
    ( sin(yaw)   0    cos(yaw)   0 )
    (    0       0       0       1 )

Rotation about the z-axis:

    (  cos(roll)   sin(roll)   0   0 )
    ( -sin(roll)   cos(roll)   0   0 )
    (      0           0       1   0 )
    (      0           0       0   1 )

Upon matrix transformation, the pixel vector represents the global positions globX, globY, and globZ of that point in three-dimensional space. The CPU (22) then converts these positions into spherical coordinates and applies them directly to the final panoramic coordinates. The vector's yaw angle represents its horizontal panorama position, newX, and its pitch angle represents its vertical panorama position, newY.
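The segment definitions and rotation matrices above can be condensed into a single pixel-to-panorama mapping. The sketch below is one plausible reading, assuming a pinhole model for segment z and degree-valued angles; the function and variable names are illustrative, not taken from the patent:

```python
# Sketch of the registration-pixel transformation: source pixel -> vector ->
# rotated vector -> spherical panorama coordinates (newX, newY).
import math

def pixel_to_pano(px, py, width, height, h_fov_deg,
                  yaw, pitch, roll, x_shift=0.0, y_shift=0.0, zoom=1.0):
    # Segments x and y: position relative to the image center, compensated
    # for the X Shift / Y Shift and zoom of the source image.
    x = (px - width / 2.0 - x_shift) / zoom
    y = (py - height / 2.0 - y_shift) / zoom
    # Segment z: scale implied by the image size (pinhole-model assumption).
    z = (width / 2.0) / math.tan(math.radians(h_fov_deg) / 2.0)

    # Apply the three rotation matrices from the text (pitch, yaw, roll).
    p, w, r = (math.radians(a) for a in (pitch, yaw, roll))
    y, z = (math.cos(p) * y + math.sin(p) * z,
            -math.sin(p) * y + math.cos(p) * z)          # about x-axis
    x, z = (math.cos(w) * x - math.sin(w) * z,
            math.sin(w) * x + math.cos(w) * z)           # about y-axis
    x, y = (math.cos(r) * x + math.sin(r) * y,
            -math.sin(r) * x + math.cos(r) * y)          # about z-axis

    # Convert (globX, globY, globZ) to spherical panorama coordinates:
    # the vector's yaw angle -> newX, its pitch angle -> newY, in degrees.
    new_x = math.degrees(math.atan2(x, z))
    new_y = math.degrees(math.asin(y / math.sqrt(x * x + y * y + z * z)))
    return new_x, new_y
```

A center pixel of an unrotated camera maps to (0, 0), the panorama origin; rotating the camera in yaw shifts newX by the same angle (up to sign convention).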
As shown in block (100), once the registration pixel pairs have been mapped to the final panoramic image (68) in this manner, the CPU (22) calculates the distance between the registration pixel pairs. If the average distance of the registration pixel pairs for a given source image is not yet minimized, as would be the case upon the initial transformation, shown in block (102), the yaw of the source image is altered slightly, whereafter the process returns to block (98) and the registration pixel pairs are again transformed to pixel points in the final panoramic image (68). This process continues, altering the yaw, until the average distance of the source image registration pixel pairs is minimized. Thereafter, the pitch, roll, X Shift, Y Shift and zoom are altered, until the average distance of the associated registration pixel pairs is minimized. Once the yaw, pitch, roll, X Shift, Y Shift and zoom of a particular source image are optimized, as shown in block (104), the transformation procedure is repeated for all source images until they are all thus optimized.

As shown in block (106), once all of the source images have been thus optimized, the average distance of all source image registration pixel pairs is calculated and, if it is not yet minimized, the yaw, pitch, roll, X Shift, Y Shift and zoom of the source images are altered as shown in block (108), and the process returns to block (98), where the process continues until the distance between the registration pixel pairs is minimized across all of the source images. Once the average distance between the registration pixel pairs has been minimized across all source images, as shown in block (110), an output file is created, identifying the height and width of the first panoramic image (72) and the yaw, pitch, roll, X Shift, Y Shift and zoom transformation information relative to each particular source image.
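The alternating minimization in blocks (98) through (108) amounts to coordinate descent over the per-image parameters. The sketch below is an assumed condensation (the step size, convergence test, and names are illustrative; `pair_error` stands in for the registration-pair distance computation described above):

```python
# Coordinate-descent sketch of the alignment loop: nudge each parameter of
# each source image while the average registration-pair distance decreases.
def refine(params, pair_error, step=0.1,
           keys=("yaw", "pitch", "roll", "x_shift", "y_shift", "zoom")):
    """params: one dict of parameters per source image.
    pair_error(params): average distance between the transformed
    registration pixel pairs (smaller is better)."""
    best = pair_error(params)
    improved = True
    while improved:                        # repeat until nothing helps
        improved = False
        for src in params:                 # optimize each source image in turn
            for key in keys:               # yaw first, then pitch, roll, ...
                for delta in (step, -step):
                    src[key] += delta      # alter the parameter slightly
                    err = pair_error(params)   # re-transform and re-measure
                    if err < best - 1e-12:
                        best = err         # keep the improvement
                        improved = True
                    else:
                        src[key] -= delta  # revert
    return params, best
```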
Thereafter, as shown in block (112), for each pixel within a given source image, a vector is defined representing the particular pixel's position in three-dimensional space, using the vector transformation described above. Once a vector has been defined, as shown in block (114), the vector is transformed to reflect the yaw, pitch, roll, X Shift, Y Shift and zoom information associated with the source image as defined in the output file. After completion of the transformation of the vector, as shown in block (116), the transformed vector is associated with a pixel in the final panoramic image (68). As shown in block (118), this process is repeated until all of the pixels in a particular source image have been transformed into vectors, and their positions located on the final panoramic image (68).

As shown in block (120), once all of the pixels of a given source image have been located on the final panoramic image (68), two image buffers (90) and (92) are created, each having a height and width approximately equal to that of the final panoramic image (68). (Figs. 3, 5A-5B and 6B). As shown in block (122), once the image buffers (90) and (92) have been created, vector transformation information associated with a quadrilateral of four adjacent pixels of a particular source image is utilized to draw the quadrilateral of pixels onto the appropriate image buffer (90) or (92). (Figs. 5A-5B and 6C). If the pixel is in the left image planes (36), (50) or (64), the pixel is written to the left image buffer (90). If the pixel is in the right image planes (38), (52) or (66), the pixel is written to the right image buffer (92). (Figs. 2 and 5A-5B).

As shown in block (124), since the transformation is likely to spread out the quadrilateral of pixels on the image buffer, there will likely be gaps between pixels as they are converted from their rectilinear location to their equirectangular location in the associated image buffer (90) or (92).
(Figs. 5A-5B and 6C). When the quadrilateral of pixels is located on the associated image buffer (90) or (92), the gaps thereby created are filled as smoothly as possible, perhaps using a linear gradient of the corner pixel colors, in a manner such as that known in the art. Known linear-gradient fill techniques may also be used to diminish visible seams between images. As additional source image information is applied to the image buffers (90) and (92), for areas of the image buffers (90) and (92) that have already been filled with pixel data, the alpha transparency of the new overlapping pixels is linearly degraded, smoothing the resulting seam as described below.

Accordingly, when mapping the quadrilateral to the associated image buffer (90) or (92), the CPU (22) may eliminate the gaps by interpolating the internal pixel values, using any method known in the art, which may include comparing the gap to the adjacent pixels, or "white boxing" the gap by utilizing the immediately preceding frame and immediately succeeding frame in a motion capture system, to extrapolate the most appropriate pixel value for the gap. As shown in block (126), blocks (122) and (124) are repeated until all of the source image pixels have been mapped to the appropriate image buffer (90) or (92).

Once all of the pixels have been mapped, as shown in block (128), the first image buffer pixels are compared to the first panoramic image pixels. If the first panoramic image (72) does not have a pixel associated with a pixel in the first image buffer (90), the CPU (22) sets the pixel in the first image buffer (90) to maximum visibility. If the first panoramic image (72) already has a pixel associated with a pixel in the first image buffer (90), the existing pixel is compared to the corresponding pixel in the first image buffer (90).
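The linear alpha degradation described above can be sketched as a one-row feathering operation (an illustrative reduction, assuming scalar pixel intensities and an overlap width in pixels; the patent gives no code):

```python
# Feathering sketch: blend an incoming row of pixels over buffer pixels,
# linearly ramping the incoming alpha across the overlap region so the
# seam fades in smoothly. `None` marks an unfilled buffer position.
def feather_row(existing, incoming, overlap):
    out = list(existing)
    for i, new_px in enumerate(incoming):
        if out[i] is None:
            out[i] = new_px                  # unfilled area: take new pixel
        else:
            # overlap area: alpha of the new pixel rises linearly 0 -> 1
            alpha = min(1.0, i / overlap) if overlap else 1.0
            out[i] = (1.0 - alpha) * out[i] + alpha * new_px
    return out
```

For example, blending intensity-20 pixels over a buffer row `[10, 10, None, None]` with a two-pixel overlap yields `[10.0, 15.0, 20, 20]`: the first overlapping pixel keeps the old value, the second is an even mix, and unfilled positions simply take the new data.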
If the pixel in the first image buffer (90) has a shorter distance to the center of its respective source image than does the existing pixel, the pixel in the first image buffer (90) is set to maximum visibility. Conversely, if the pixel in the first image buffer (90) has a greater distance to the center of its respective source image than that of the existing pixel, then the pixel in the first image buffer (90) is set to minimum visibility. The process is repeated to merge pixels of the second image buffer (92) into the second panoramic image (74).

As shown in block (130), the overlapping edges of the images in the image buffers (90) and (92) are feathered by degrading the visibility of the pixels over the area of overlap. This feathering smoothes the overlapping areas of the image once it is merged from the image buffers (90) and (92) into the panoramic images (72) and (74). As shown in block (132), once the image buffer pixels have been set to the appropriate visibility, and the images of the image buffers feathered, the images in the image buffers are merged into the first and second panoramic images (72) and (74). As shown in block (134), blocks (122) through (132) are repeated until all of the source images have been merged into the first and second panoramic images (72) and (74). As noted above, the images (76), (78) and (80) associated with the left image planes (36), (50) and (64) of the image capture units (26), (40) and (54) are used to create the first panoramic image (72), and the images (82), (84) and (86) associated with the right image planes (38), (52) and (66) of the image capture units (26), (40) and (54) are used to create the second panoramic image (74).

Once both of the final panoramic images (72) and (74) have been created, as shown in block (138), the panoramic images (72) and (74) can be displayed together as the final panoramic image (68). (Figs. 3, 4A and 4B).
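The visibility rule in block (128) reduces to a per-pixel comparison of distances to the respective source-image centers. A sketch, under the assumption that "distance" means Euclidean pixel distance and that maximum and minimum visibility are 1.0 and 0.0 (conventions the patent does not fix):

```python
# Visibility selection sketch: a buffer pixel wins (maximum visibility) if
# the panorama has nothing there yet, or if the buffer pixel lies closer to
# the center of its own source image than the existing pixel does to its.
def visibility(buffer_px, existing_px):
    """Each pixel is (value, distance_to_source_center); existing_px is
    None where the panorama has no pixel yet. Returns 1.0 or 0.0."""
    if existing_px is None:
        return 1.0                        # no competitor: maximum visibility
    _, d_new = buffer_px
    _, d_old = existing_px
    return 1.0 if d_new < d_old else 0.0  # closer to its center wins
```

Pixels nearer their own lens's optical axis are the least distorted, which is presumably why the closer-to-center pixel is preferred.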
The panoramic images (72) and (74) may be reverse polarized, shown on a standard panoramic screen (140), such as that shown in Fig. 3, and viewed using glasses (142) having lenses of reversed polarity. Alternatively, the panoramic images (72) and (74) may be conveyed to the head mounted display (24) shown in Fig. 1. The CPU (22) can send multiple panoramic images to the head mounted display (24) using known "page-flipping" techniques to send and separately display the images in the left and right displays of the head mounted display (24), as appropriate. This process can be used to animate the display as a full 24-frame-per-second motion picture, or to display multiple visual images to each display.

Thus, an imaging system (see Fig. 2) may comprise a first image capture unit (26) to capture a first image (36), a second image capture unit (40) to capture a second image (50), and a third image capture unit (54) to capture a third image (64), as well as means (22) for combining a first portion of the first image (36) with a portion of the second image (50) to produce a first combined image, and means (22) for combining a second portion of the first image (36) with a portion of the third image (64) to produce a second combined image. The first portion of the first image can be any amount up to the entire first image, or may be between about 20 percent and about 80 percent of the first image. The portion of the second image can be any amount up to the entire second image, or may be between about 20 percent and about 80 percent of the second image. The first and second combined images can be repeatedly produced and displayed to convey stereoscopic motion.
An imaging system may also comprise an image capture unit (26) for providing an image, means (22) for using a first portion of said image to provide a first stereoscopic image, and means (22) for using a second portion of said image to provide a second stereoscopic image.

The head mounted display unit (24) may also be provided with an orientation sensor (144), such as those well known in the art, to change the image provided to the head mounted display (24) as the sensor (144) moves. In this manner, a user (not shown) may look up, down, and in either direction, to see that portion of the final panoramic image (68) associated with the vector of the user's line of sight, and have the sensation of actually looking around a three-dimensional space.

In an alternative embodiment of the present invention, the camera (10) may be provided with a plurality of image capture pairs having substantially rectilinear capture systems oriented substantially parallel to one another. The pairs may be offset by a predetermined factor to obtain a desired stereoscopic image. Because the images are captured in pairs, the transformation process associated with this embodiment is identical to that described above, albeit instead of dividing the images in half, and sending pixels from each half to a separate image buffer, all of the pixels associated with the images from the "left" image capture devices of each image capture device pair are sent to one image buffer, and all of the pixels associated with images from the "right" image capture devices are sent to another image buffer.

In another alternative embodiment of the present invention, the camera (10) and CPU (22) may be utilized to capture twenty-four or more frames per second, and display the final stereoscopic, panoramic image (68), in real time, as a motion picture.
In yet another alternative embodiment of the present invention, computer generated graphical information, produced in a manner such as that well known in the art, may be combined with the final panoramic images (72) and (74) in the CPU (22) to provide a seamless integration of actual images captured with the camera (10), and digitized virtual reality images (146). (Fig. 3) This combination produces a seamless display of real and virtual panoramic stereoscopic images.

In yet another alternative embodiment of the present invention, the images captured by the camera (10) may be transformed, utilizing the above transformation procedures, to produce a seamless 360-degree panoramic monographic image. As shown in Fig. 1, the camera (10) is provided with a support post (148) and a transportation unit such as a remote control carriage (150), similar to or identical to those used in association with remote control cars and the like. Images associated with the left image planes of the image capture units may be used to produce a combined image, and the images associated with the right image planes of the image capture units may be used to overwrite and fill the combined image to hide the support post (148), carriage (150), and any other camera equipment otherwise visible in the combined image. The foregoing transformation procedures may be utilized for such overwriting and filling. In this manner, only those images which do not include the undesired information are mapped to a final panoramic image (152), which may be displayed on a spherical screen (154) or on the head mounted display (24). (Figs. 1 and 7) For that portion of the image reflecting the area covered by the footprint of the carriage (150), the interpolation and feathering procedures detailed above may be utilized to approximate the image lying beneath the carriage (150), to produce the appearance of a completely unobstructed 360-degree image.
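The feathering procedure referenced above, in which the visibility of pixels in an overlap area is degraded so that overlapping images blend without a visible seam, can be sketched for a single row of alpha (visibility) values. The linear ramp is an assumption; the text specifies only that visibility is degraded across the overlap.

```python
def feather_row(alphas, overlap_width):
    """Apply a linear visibility ramp over `overlap_width` pixels at both
    ends of one row of alpha values (1.0 = fully visible), degrading
    visibility toward the overlapping edges."""
    out = list(alphas)
    n = len(out)
    for i in range(overlap_width):
        ramp = (i + 1) / (overlap_width + 1)  # rises toward the image interior
        out[i] *= ramp              # fade the left overlap edge
        out[n - 1 - i] *= ramp      # fade the right overlap edge
    return out
```

When two images are composited with complementary ramps like this, their contributions sum smoothly across the overlap instead of switching abruptly at a seam.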
Although the invention has been described with respect to the preferred embodiment thereof, it is to be also understood that it is not to be so limited, since changes and modifications can be made therein which are within the full intended scope of this invention as defined by the appended claims.

Claims (46)

1. An imaging system, comprising: a first image capture unit to capture a first image; a second image capture unit to capture a second image; a third image capture unit to capture a third image; means for combining a first portion of the first image with a portion of the second image to produce a first combined image; and means for combining a second portion of the first image with a portion of the third image to produce a second combined image.
2. The imaging system of Claim 1, wherein said first image capture unit, said second image capture unit, and said third image capture unit are located along an arc relative to one another, about five to about forty-five degrees apart.
3. The imaging system of Claim 1, wherein an image plane associated with said first image overlaps an image plane associated with said second image by about 0.5 to about 30 percent.
4. The imaging system of Claim 1, wherein an image plane associated with said first image vertically overlaps an image plane associated with said second image by about 1 to about 20 percent.
5. The imaging system of Claim 1, wherein said first image and said second image are substantially rectilinear.
6. The imaging system of Claim 1, wherein said first combined image and said second combined image are at least partially equirectangular.
7. The imaging system of Claim 1, wherein the first and the second combined images can be displayed to provide a stereoscopic image.
8. The imaging system of Claim 1, further comprising means for displaying said first combined image and said second combined image as a stereoscopic image.
9. The imaging system of Claim 1, further comprising means for sequentially displaying a plurality of the first and the second combined images as a moving stereoscopic image.
10. The imaging system of Claim 1, further comprising means for combining said first combined image with a sufficient plurality of images to produce a first combined panoramic image representing at least about 90 degrees of a scene, and combining said second combined image with a sufficient plurality of other images to produce a second combined panoramic image, representing about 90 degrees of the scene, and means for displaying said first combined panoramic image and said second combined panoramic image to provide a stereoscopic, panoramic image.
11. The imaging system of Claim 10, further comprising a means to display a first set of combined panoramic images and a second set of combined panoramic images in succession to provide a moving stereoscopic, panoramic image.
12. The imaging system of Claim 1, wherein said first combined image and said second combined image are combined with a digital image to produce a stereoscopic image including said digital image.
13. An imaging system, comprising: an image capture unit for providing an image; means for using a first portion of said image to provide a first stereoscopic image; and means for using a second portion of said image to provide a second stereoscopic image.
14. The imaging system of Claim 13, further comprising means for combining said first stereoscopic image and said second stereoscopic image into a panoramic stereoscopic image.
15. The imaging system of Claim 13, further comprising: a plurality of image capture units to provide a plurality of images; means for combining selected ones of the plurality of images with the image to produce combined images; means for combining said combined images into a first panoramic image and a second panoramic image; and means for displaying said first panoramic image and said second panoramic image to provide a panoramic, stereoscopic image.
16. The imaging system of Claim 15, wherein said first panoramic image displays about 90 degrees of a scene.
17. The imaging system of Claim 15, wherein said panoramic, stereoscopic image displays about 180 degrees of a scene.
18. A method, comprising: obtaining a first image; obtaining a second image; obtaining a third image; combining a first portion of said first image with a portion of said second image to produce a first combined image; combining a second portion of said first image with a portion of said third image to produce a second combined image; and displaying said first combined image and said second combined image as a stereoscopic image.
19. The method of Claim 18, further comprising obtaining said first image, said second image, and said third image from a plurality of image capture units located along an arc.
20. The method of Claim 19, wherein said plurality of image capture units are located about 5 degrees to about 45 degrees apart in a single direction along the arc.
21. The method of Claim 18, further comprising displaying a plurality of first combined images in sequence and a plurality of second combined images in sequence to provide a moving stereoscopic image.
22. The method of Claim 21, wherein said moving stereoscopic image represents about 180 degrees of a scene.
23. The method of claim 18, further comprising: defining a plurality of registration pixel pairs.
24. The method of claim 23, further comprising: transforming the plurality of registration pixel pairs to minimize a distance of image registration.
25. The method of claim 23, further comprising: defining a vector to represent a pixel of one of the plurality of registration pixel pairs.
26. The method of claim 25, further comprising: transforming the vector into a transformed vector to correspond to a source rotation angle.
27. The method of claim 26, further comprising: associating the transformed vector with a pixel in a panoramic image.
28. The method of claim 27, further comprising: adjusting a visibility of a pixel in an image buffer by comparing a distance associated with the pixel in the panoramic image with a distance associated with the pixel in the image buffer.
29. The method of claim 21, further comprising: feathering overlapping edges of an image in an image buffer by degrading visibility of pixels in an overlap area.
30. An imaging system comprising: a first image capture unit to capture a first image; a second image capture unit to capture a second image; a third image capture unit to capture a third image; and a processing unit operationally coupled to the first, second, and third image capture units to receive the first, the second, and the third images, wherein a first portion of the first image can be combined with a portion of the second image to provide a first combined image, wherein a second portion of the first image can be combined with a portion of the third image to provide a second combined image, and wherein the first and second combined images can be displayed to provide a stereoscopic image.
31. The imaging system of Claim 30, wherein said first, the second, and the third image capture units are located approximately equidistant from each other along a substantially arcuate path.
32. The imaging system of Claim 31, wherein said substantially arcuate path is defined by a substantially spherical body.
33. The imaging system of Claim 30, wherein said first and said second image capture units are separated from each other by an angular distance of about 5 degrees to about 45 degrees along an arc, and wherein said second and said third image capture units are separated from each other by about the angular distance along the arc.
34. The imaging system of Claim 30, wherein a field of view associated with said first image capture unit overlaps a field of view associated with said second image capture unit by an overlap amount, and wherein the field of view associated with said second image capture unit overlaps a field of view associated with said third image capture unit by the overlap amount.
35. The imaging system of Claim 34, wherein the overlap amount is about 10 percent to about 90 percent.
36. The imaging system of Claim 30, wherein a defined image plane associated with said first image capture unit overlaps a defined image plane associated with said second image capture unit by about 1 to about 20 percent.
37. The imaging system of Claim 30, wherein said first portion of said first image is between about 20 percent and about 80 percent of said first image, and wherein said portion of said second image is between about 20 percent and about 80 percent of said second image.
38. The imaging system of Claim 30, wherein a plurality of said first and said second combined images are displayed in sequence to convey motion.
39. The imaging system of Claim 30, wherein the first combined image is combined with a sufficient plurality of images to produce a first combined panoramic image, representing about 90 degrees of a scene, and wherein the second combined image is combined with a sufficient plurality of other images to produce a second combined panoramic image, representing about 90 degrees of the scene, and wherein said first combined panoramic image and said second combined panoramic image are displayed to provide a stereoscopic, panoramic image.
40. The imaging system of Claim 39, wherein a set of first combined panoramic images and a set of second combined panoramic images are displayed in sequence to provide a moving stereoscopic, panoramic image.
41. An imaging system comprising: an image capture unit to provide an image; a processing unit coupled to the image capture unit to receive a first portion of said image to provide a first stereoscopic image, and to receive a second portion of said image to provide a second stereoscopic image.
42. The imaging system of Claim 41, wherein said first stereoscopic image and said second stereoscopic image are combined into a panoramic stereoscopic image.
43. The imaging system of Claim 41, further comprising: a plurality of image capture units coupled to the processing unit, the plurality of image capture units to provide a plurality of images, wherein selected ones of the plurality of images are combined with at least one other image to produce a plurality of combined images, wherein the plurality of combined images are combined to provide a first panoramic image and a second panoramic image, and wherein the first panoramic image and the second panoramic image are combined to provide a panoramic, stereoscopic image.
44. The imaging system of Claim 43, wherein said first panoramic image displays about 90 degrees of a scene.
45. The imaging system of Claim 43, wherein said panoramic, stereoscopic image displays about 180 degrees of a scene.
46. The imaging system of Claim 41, further comprising an imaging unit coupled to the processing unit.
AU2003214899A 2003-01-24 2003-01-24 Stereoscopic Panoramic Image Capture Device Abandoned AU2003214899A1 (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/US2003/002285 WO2004068865A1 (en) 2003-01-24 2003-01-24 Steroscopic panoramic image capture device

Publications (1)

Publication Number Publication Date
AU2003214899A1 true AU2003214899A1 (en) 2004-08-23

Family

ID=32823170

Family Applications (1)

Application Number Title Priority Date Filing Date
AU2003214899A Abandoned AU2003214899A1 (en) 2003-01-24 2003-01-24 Stereoscopic Panoramic Image Capture Device

Country Status (7)

Country Link
EP (1) EP1586204A1 (en)
JP (1) JP2006515128A (en)
CN (1) CN1771740A (en)
AU (1) AU2003214899A1 (en)
CA (1) CA2513874A1 (en)
IL (1) IL169827A0 (en)
WO (1) WO2004068865A1 (en)

Families Citing this family (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7697839B2 (en) 2006-06-30 2010-04-13 Microsoft Corporation Parametric calibration for panoramic camera systems
US9792012B2 (en) 2009-10-01 2017-10-17 Mobile Imaging In Sweden Ab Method relating to digital images
JP2011082919A (en) * 2009-10-09 2011-04-21 Sony Corp Image processing device and method, and program
SE534551C2 (en) 2010-02-15 2011-10-04 Scalado Ab Digital image manipulation including identification of a target area in a target image and seamless replacement of image information from a source image
CN102959943B (en) * 2010-06-24 2016-03-30 富士胶片株式会社 Stereoscopic panoramic image synthesizer and method and image capture apparatus
CN103189796B (en) 2010-09-20 2015-11-25 瑞典移动影像股份公司 For the formation of the method for image
TWI506595B (en) * 2011-01-11 2015-11-01 Altek Corp Method and apparatus for generating panorama
JP2012199752A (en) * 2011-03-22 2012-10-18 Sony Corp Image processing apparatus, image processing method, and program
SE1150505A1 (en) 2011-05-31 2012-12-01 Mobile Imaging In Sweden Ab Method and apparatus for taking pictures
EP2718896A4 (en) 2011-07-15 2015-07-01 Mobile Imaging In Sweden Ab Method of providing an adjusted digital image representation of a view, and an apparatus
JP5754312B2 (en) * 2011-09-08 2015-07-29 カシオ計算機株式会社 Image processing apparatus, image processing method, and program
KR101804199B1 (en) * 2011-10-12 2017-12-05 삼성전자주식회사 Apparatus and method of creating 3 dimension panorama image
CN103096101B (en) * 2011-11-07 2016-03-30 联想(北京)有限公司 Image synthesizing method, device and electronic equipment
KR20150068298A (en) * 2013-12-09 2015-06-19 씨제이씨지브이 주식회사 Method and system of generating images for multi-surface display
JP6609616B2 (en) * 2014-03-28 2019-11-20 インテュイティブ サージカル オペレーションズ, インコーポレイテッド Quantitative 3D imaging of surgical scenes from a multiport perspective
GB2525170A (en) 2014-04-07 2015-10-21 Nokia Technologies Oy Stereo viewing
US11205305B2 (en) 2014-09-22 2021-12-21 Samsung Electronics Company, Ltd. Presentation of three-dimensional video
US10750153B2 (en) 2014-09-22 2020-08-18 Samsung Electronics Company, Ltd. Camera system for three-dimensional video
US10210660B2 (en) * 2016-04-06 2019-02-19 Facebook, Inc. Removing occlusion in camera views
CN106851240A (en) * 2016-12-26 2017-06-13 网易(杭州)网络有限公司 The method and device of image real time transfer
US11049218B2 (en) 2017-08-11 2021-06-29 Samsung Electronics Company, Ltd. Seamless image stitching
GB2591278A (en) * 2020-01-24 2021-07-28 Bombardier Transp Gmbh A monitoring system of a rail vehicle, a method for monitoring and a rail vehicle

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5023725A (en) * 1989-10-23 1991-06-11 Mccutchen David Method and apparatus for dodecahedral imaging system
KR940017747A (en) * 1992-12-29 1994-07-27 에프. 제이. 스미트 Image processing device
JPH0962861A (en) * 1995-08-21 1997-03-07 Matsushita Electric Ind Co Ltd Panoramic video device
US6141034A (en) * 1995-12-15 2000-10-31 Immersive Media Co. Immersive imaging method and apparatus
US7015954B1 (en) * 1999-08-09 2006-03-21 Fuji Xerox Co., Ltd. Automatic video system using multiple cameras
JP2002094849A (en) * 2000-09-12 2002-03-29 Sanyo Electric Co Ltd Wide view image pickup device
JP2002159019A (en) * 2000-11-16 2002-05-31 Canon Inc Display control device, imaging position estimating device, display system, image pickup system, image positioning method, imaging position estimating method, and recording medium recorded with process program
JP4590754B2 (en) * 2001-02-28 2010-12-01 ソニー株式会社 Image input processing device
JP2003141562A (en) * 2001-10-29 2003-05-16 Sony Corp Image processing apparatus and method for nonplanar image, storage medium, and computer program

Also Published As

Publication number Publication date
CN1771740A (en) 2006-05-10
IL169827A0 (en) 2007-07-04
EP1586204A1 (en) 2005-10-19
CA2513874A1 (en) 2004-08-12
JP2006515128A (en) 2006-05-18
WO2004068865A1 (en) 2004-08-12

Similar Documents

Publication Publication Date Title
US6947059B2 (en) Stereoscopic panoramic image capture device
US11528468B2 (en) System and method for creating a navigable, three-dimensional virtual reality environment having ultra-wide field of view
AU2003214899A1 (en) Stereoscopic Panoramic Image Capture Device
US7429997B2 (en) System and method for spherical stereoscopic photographing
US8581961B2 (en) Stereoscopic panoramic video capture system using surface identification and distance registration technique
US20170280133A1 (en) Stereo image recording and playback
US20100259599A1 (en) Display system and camera system
WO2012166593A2 (en) System and method for creating a navigable, panoramic three-dimensional virtual reality environment having ultra-wide field of view
KR20100105351A (en) Banana codec
KR20060039896A (en) Panoramic video system with real-time distortion-free imaging
US6836286B1 (en) Method and apparatus for producing images in a virtual space, and image pickup system for use therein
US11812009B2 (en) Generating virtual reality content via light fields
WO2018035347A1 (en) Multi-tier camera rig for stereoscopic image capture
JP4406824B2 (en) Image display device, pixel data acquisition method, and program for executing the method
CN109769111A (en) Image display method, device, system, storage medium and processor
KR101670328B1 (en) The appratus and method of immersive media display and image control recognition using real-time image acquisition cameras
US7154528B2 (en) Apparatus for placing primary image in registration with lenticular lens in system for using binocular fusing to produce secondary 3D image from primary image
KR20060015460A (en) Stereoscopic panoramic image capture device
CN109272445A (en) Panoramic video joining method based on Sphere Measurement Model
CN115174805A (en) Panoramic stereo image generation method and device and electronic equipment
JPH11220758A (en) Method and device for stereoscopic image display
CN108122283B (en) Method for editing VR image by coordinate transformation
Coleshill Spherical panoramic video: the space ball
JP2005077808A (en) Image display device, image display method, and image display system
Hutchison The development of a hybrid virtual reality/video view-morphing display system for teleoperation and teleconferencing

Legal Events

Date Code Title Description
DA3 Amendments made section 104

Free format text: THE NATURE OF THE AMENDMENT IS: AMEND THE INVENTION TITLE TO READ STEREOSCOPIC PANORAMIC IMAGE CAPTURE DEVICE

TC Change of applicant's name (sec. 104)

Owner name: MICOY CORPORATION

Free format text: FORMER NAME: PRAIRIE LOGIC, INC.

MK4 Application lapsed section 142(2)(d) - no continuation fee paid for the application