WO2011003168A1 - Method and apparatus for generating three dimensional image information using a single imaging path - Google Patents


Info

Publication number: WO2011003168A1
Application number: PCT/CA2009/000957
Authority: WIPO (PCT)
Other languages: French (fr)
Inventor: Thomas N. Mitchell
Original Assignee: Isee3D Inc.
Prior art keywords: image, imaging path, extent, portions, images
Application filed by Isee3D Inc.
Priority to KR1020127003672A (KR101598653B1)
Priority to CN200980161394.6A (CN102725688B)
Priority to US13/382,895 (US9298078B2)
Priority to EP09846964.6A (EP2452228A4)
Priority to JP2012518701A (JP2012532347A)
Priority to PCT/CA2009/000957 (WO2011003168A1)
Priority to TW099120775A (TWI531209B)
Priority to EP10796640.0A (EP2452224A4)
Priority to PCT/CA2010/001093 (WO2011003208A1)
Priority to KR1020107027161A (KR101758377B1)
Priority to CN201080040644.3A (CN102640036B)
Priority to JP2012518716A (JP5840607B2)
Priority to US13/382,892 (US9442362B2)
Publication of WO2011003168A1

Classifications

    • G03B 35/02 — Stereoscopic photography by sequential recording
    • G03B 35/04 — Stereoscopic photography by sequential recording with movement of beam-selecting members in a system defining two or more viewpoints
    • G03B 35/26 — Stereoscopic photography by simultaneous viewing using polarised or coloured light separating different viewpoint images
    • A61B 1/00193 — Optical arrangements adapted for stereoscopic vision, for endoscopes
    • A61B 1/00194 — Optical arrangements adapted for three-dimensional imaging, for endoscopes
    • A61B 1/04 — Instruments for performing medical examinations of the interior of cavities or tubes of the body, combined with photographic or television appliances
    • H04N 13/211 — Image signal generators using stereoscopic image cameras using a single 2D image sensor using temporal multiplexing
    • H04N 13/218 — Image signal generators using stereoscopic image cameras using a single 2D image sensor using spatial multiplexing

Definitions

  • This invention relates generally to generating three dimensional image information and more particularly to generating three dimensional image information using a single imaging path.
  • Imaging generally involves producing a representation of a scene by receiving radiation emitted or reflected by objects in the scene at a suitable image sensor.
  • Types of radiation that may be imaged include visible light, infrared light or heat, radiofrequency waves, acoustic waves, and ultrasonic waves.
  • A three-dimensional (3D) scene includes depth information, which in many imaging systems is mapped onto a two-dimensional (2D) image plane and is thus not preserved.
  • A conventional camera is an example of an optical imaging system in which depth information is not preserved, resulting in a 2D image representing the scene.
  • Stereoscopic optical systems are capable of producing images that represent depth information by producing separate images from differing perspective viewpoints. The depth information may be used to produce 3D measurements between points in the scene, for example.
  • The separate images may be presented to respective left and right eyes of a user to enable the user to perceive an image view having at least some depth represented in the images.
  • The stereoscopic system thus produces images having spatially separated perspective viewpoints that mimic the operation of the human eyes in viewing a real scene.
  • The images may be viewed using some form of active eyewear or by operating a display to project spatially separated images toward the user's respective left and right eyes.
  • Stereoscopic imaging finds application in surgery, where a 3D endoscope may be used to provide a 3D view to the surgeon.
  • Stereoscopic imaging may also be useful in remote operations, such as undersea exploration, for example, where control of a robotic actuator is facilitated by providing 3D image information to an operator who is located remotely from the actuator.
  • Other applications of stereoscopic imaging may be found in physical measurement systems and in the entertainment industry.
  • A method of generating three dimensional image information using a single imaging path having an associated field of view involves selectively receiving first and second images through respective first and second portions of the single imaging path, the first portion having a first perspective viewpoint within the field of view and the second portion having a second perspective viewpoint within the field of view.
  • The first and second images together are operable to represent three dimensional spatial attributes of objects within the field of view.
  • The method also involves varying an extent of the first and second portions of the imaging path to cause the first and second perspective viewpoints to change location while receiving the first and second images, the change in perspective viewpoint location providing a corresponding change in the representation of the three dimensional spatial attributes.
  • The method further involves compensating for changes in transmission through the first and second portions of the imaging path such that while varying the extent of the first and second portions, an image intensity associated with each of the first and second images is maintained at a generally uniform image intensity level.
  • Selectively receiving the first and second images may involve receiving the first and second images at an image sensor, and compensating for the changes in transmission may involve one of: increasing an exposure associated with the image sensor in response to a reducing extent of the first and second portions of the imaging path; decreasing a gain associated with the image sensor in response to an increasing extent of the first and second portions of the imaging path; increasing overall transmittance through the imaging path in response to a reducing extent of the first and second portions of the imaging path; and reducing overall transmittance through the imaging path in response to an increasing extent of the first and second portions of the imaging path.
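The compensation described above amounts to scaling exposure (or transmittance) inversely with the transmitting area. The following is a minimal sketch, not from the patent: the helper name `exposure_scale` and the normalised `open_fraction` parameter are invented for illustration.

```python
def exposure_scale(open_fraction, base_exposure):
    """Scale exposure inversely with the open fraction of the aperture.

    open_fraction: fraction (0 < f <= 1) of the full imaging path that the
    current extent of the A or B portion leaves transmitting (hypothetical
    normalisation; the patent does not specify units).
    """
    if not 0.0 < open_fraction <= 1.0:
        raise ValueError("open_fraction must be in (0, 1]")
    # Halving the transmitting area admits half the light, so doubling the
    # exposure keeps the recorded image at a generally uniform intensity.
    return base_exposure / open_fraction
```

The same scale factor could equally be applied as a sensor gain or as an overall-transmittance adjustment, matching the alternatives listed in the bullet above.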
  • Selectively receiving the first and second images may involve alternately blocking the first portion of the imaging path while receiving the second image, and blocking the second portion of the imaging path while receiving the first image.
  • Blocking the first and second portions of the imaging path may involve causing a blocking element located proximate an aperture plane of the image path to move between first and second positions in the image path to define the varying extent of the first and second portions of the imaging path.
  • Causing the blocking element to move may involve producing a force operable to alternately move the blocking element toward one of the first and second positions, receiving a position signal representing a position of the blocking element, and controlling a magnitude of the force in response to the position signal to cause the blocking element to come to rest at the one of the first and second positions.
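The force control described above is, in effect, a closed position-feedback loop. The sketch below simulates one plausible realisation: a proportional-plus-damping law with invented gains, unit mass, and arbitrary units. The patent does not specify the control law; this only illustrates "controlling a magnitude of the force in response to the position signal" so the element comes to rest at a target position.

```python
def settle_blocking_element(target, x0=0.0, v0=0.0, k=40.0, c=12.0,
                            dt=0.001, steps=20000):
    """Drive a blocking element toward `target` using a force whose
    magnitude is derived from the position signal (illustrative PD loop).

    Returns the final (position, velocity); with near-critical damping the
    element comes to rest at the target.  All constants are invented.
    """
    x, v = x0, v0
    for _ in range(steps):
        force = k * (target - x) - c * v  # restoring force minus damping
        v += force * dt                   # unit mass assumed
        x += v * dt                       # semi-implicit Euler update
    return x, v
```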
  • Alternately blocking the first and second portions of the imaging path may involve selectively actuating first and second regions of an optical element.
  • The optical element may involve a plurality of elements, and selectively actuating the first and second regions may involve selectively actuating one of a first plurality of elements in the plurality of elements, and a second plurality of elements in the plurality of elements.
  • Each element of the plurality of elements may be operable to be actuated in response to receiving an actuation signal, and varying the extent of the first and second portions of the imaging path may involve generating actuation signals to cause a number of elements in the first and second plurality of elements to be selectively varied to vary the extent of the first and second portions of the imaging path.
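One way to picture the actuation signals is as per-column on/off masks whose populated width sets the extent of each portion. The sketch below assumes a hypothetical columnar layout split about the optical axis; the function name and indexing scheme are illustrative, not from the patent.

```python
def actuation_masks(n_columns, extent):
    """Build per-column actuation signals for a columnar modulator.

    extent: number of columns on each side of centre left transmitting
    (0 .. n_columns // 2).  Returns (mask_a, mask_b), where True means the
    column is actuated to transmit for that image.
    """
    half = n_columns // 2
    if not 0 <= extent <= half:
        raise ValueError("extent out of range")
    # Portion A transmits through `extent` columns ending at the centre;
    # portion B mirrors it on the other side of the optical axis.  Widening
    # `extent` moves each portion's centroid away from the axis.
    mask_a = [half - extent <= i < half for i in range(n_columns)]
    mask_b = [half <= i < half + extent for i in range(n_columns)]
    return mask_a, mask_b
```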
  • Selectively actuating the first and second regions of the optical element may involve selectively actuating first and second regions of a transmissive optical element disposed to transmit light through the respective first and second portions of the single imaging path.
  • Selectively actuating first and second regions of the transmissive optical element may involve selectively actuating first and second regions of one of a liquid crystal element, and a light valve.
  • Selectively actuating the first and second regions of the optical element may involve selectively actuating first and second regions of a reflective optical element disposed to reflect light through the respective first and second portions of the single imaging path.
  • Selectively actuating first and second regions of the reflective optical element may involve selectively actuating first and second regions of a light valve having a plurality of moveable mirror elements.
  • Selectively receiving the first and second images may involve simultaneously receiving a first image having first image attributes and a second image having second image attributes, and separating the first and second images in accordance with the first and second image attributes to produce respective first and second image representations.
  • Receiving the first image may involve receiving a first image having a first state of polarization and receiving the second image may involve receiving a second image having a second state of polarization, and separating the first and second images may involve receiving the first and second images at a sensor array having a first plurality of elements responsive to radiation of the first polarization state and a second plurality of elements responsive to radiation of the second polarization state.
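For the sensor array described above, separating the two simultaneously received images reduces to demultiplexing elements by their polarization response. A toy one-dimensional sketch, under the assumption (illustrative, not stated in the patent) that elements responsive to the two polarization states alternate by index:

```python
def separate_by_polarization(samples):
    """Split an interleaved sensor readout into two image representations.

    Assumes a hypothetical line sensor whose even-indexed elements respond
    to the first polarization state (image A) and odd-indexed elements to
    the second (image B), so both images can be read out simultaneously.
    """
    image_a = samples[0::2]  # elements responsive to polarization state 1
    image_b = samples[1::2]  # elements responsive to polarization state 2
    return image_a, image_b
```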
  • The method may involve generating the first image having the first state of polarization and generating the second image having the second state of polarization.
  • Varying the extent may involve varying the extent of the first and second portions of the imaging path in response to a control signal.
  • The method may involve generating the control signal.
  • A location of the first perspective viewpoint may be defined by a first centroid location and a location of the second perspective viewpoint may be defined by a second centroid location, and generating the control signal may involve generating a control signal operable to cause the first and second centroids to move with respect to each other at a generally constant rate to provide a smooth change in the representation of the three dimensional spatial attributes.
  • A location of the first perspective viewpoint may be defined by a first centroid location and a location of the second perspective viewpoint by a second centroid location, and varying the extent may involve varying the extent of the first and second portions of the imaging path between a first extent, where the first and second centroid locations are proximally located, causing the first and second images to include predominately two-dimensional spatial attributes within the field of view, and a second extent, where the first and second centroid locations are spaced apart, causing the first and second images to include an increasing degree of three dimensional spatial attribute information.
  • Varying the extent of the first and second portions of the imaging path may involve varying the extent to provide a smooth transition from one of the first extent to the second extent to produce a two-dimensional to three-dimensional transition effect, and the second extent to the first extent to produce a three-dimensional to two-dimensional transition effect.
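The smooth transition described above can be driven by a schedule of centroid-separation targets that changes at a generally constant rate per frame. An illustrative sketch; the function name, frame count, and separation values are invented:

```python
def transition_schedule(sep_2d, sep_3d, n_frames):
    """Per-frame centroid-separation targets for a 2D-to-3D transition.

    A linear ramp moves the two centroid locations apart at a constant
    rate, giving a gradual onset of depth; iterating the list in reverse
    gives the 3D-to-2D transition effect.
    """
    if n_frames < 2:
        return [sep_2d]
    step = (sep_3d - sep_2d) / (n_frames - 1)
    return [sep_2d + step * i for i in range(n_frames)]
```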
  • Receiving the first and second images may involve sequentially receiving a plurality of first and second images representing time variations of subject matter within the field of view.
  • An apparatus for generating three dimensional image information using a single imaging path having an associated field of view includes provisions for selectively receiving first and second images through respective first and second portions of the single imaging path, the first portion having a first perspective viewpoint within the field of view and the second portion having a second perspective viewpoint within the field of view, the first and second images together being operable to represent three dimensional spatial attributes of objects within the field of view.
  • The apparatus also includes provisions for varying an extent of the first and second portions of the imaging path to cause the first and second perspective viewpoints to change location while receiving the first and second images, the change in perspective viewpoint location providing a corresponding change in the representation of the three dimensional spatial attributes.
  • The apparatus further includes provisions for compensating for changes in transmission through the first and second portions of the imaging path such that while varying the extent of the first and second portions, an image intensity associated with each of the first and second images is maintained at a generally uniform image intensity level.
  • An apparatus for generating three dimensional image information includes a single imaging path having an associated field of view.
  • The apparatus also includes an image modulator operably configured to cause first and second images to be selectively received through respective first and second portions of the single imaging path, the first portion having a first perspective viewpoint within the field of view and the second portion having a second perspective viewpoint within the field of view.
  • The first and second images together are operable to represent three dimensional spatial attributes of objects within the field of view.
  • The apparatus also includes a controller in communication with the modulator, the controller being operably configured to produce a signal operable to cause the modulator to vary an extent of the first and second portions of the imaging path to cause the first and second perspective viewpoints to change location while receiving the first and second images, the change in perspective viewpoint location providing a corresponding change in the representation of the three dimensional spatial attributes.
  • The apparatus further includes a compensator operably configured to compensate for changes in transmission through the first and second portions of the imaging path such that while varying the extent of the first and second portions, an image intensity associated with each of the first and second images is maintained at a generally uniform image intensity level.
  • The single imaging path may be operably configured to produce the first and second images at an image sensor, and the compensator may be operably configured to compensate for the changes in transmission by one of: increasing an exposure associated with the image sensor in response to a reducing extent of the first and second portions of the imaging path; decreasing an exposure associated with the image sensor in response to an increasing extent of the first and second portions of the imaging path; increasing overall transmittance through the imaging path in response to a reducing extent of the first and second portions of the imaging path; and reducing overall transmittance through the imaging path in response to an increasing extent of the first and second portions of the imaging path.
  • The modulator may be operably configured to alternately block the first portion of the imaging path while receiving the second image, and block the second portion of the imaging path while receiving the first image.
  • The modulator may be operably configured to cause a blocking element located proximate an aperture plane of the image path to move between first and second positions in the image path to define the varying extent of the first and second portions of the imaging path.
  • The modulator may include an actuator for producing a force operable to alternately move the blocking element toward one of the first and second positions, and a position sensor operably configured to produce a position signal representing a position of the blocking element, and the controller may be operably configured to control a magnitude of the force in response to the position signal to cause the blocking element to come to rest at the one of the first and second positions.
  • The modulator may include an optical element having first and second regions, the first and second regions being operably configured to be selectively actuated to selectively block the first and second portions of the imaging path.
  • The optical element may include a plurality of elements, the first region may include a first plurality of elements, and the second region may include a second plurality of elements, the first and second pluralities being selected to vary the extent of the first and second portions of the imaging path.
  • Each element of the plurality of elements may be operable to be actuated in response to receiving an actuation signal, and the apparatus may further include a modulator driver operably configured to generate the actuation signals to cause a number of elements in the first and second pluralities of elements to be selectively varied to vary the extent of the first and second portions of the imaging path.
  • The modulator may be operably configured to selectively actuate first and second regions of a transmissive optical element disposed to transmit light through the respective first and second portions of the single imaging path.
  • The modulator may include one of a liquid crystal element and a light valve.
  • The modulator may be operably configured to selectively actuate first and second regions of a reflective optical element disposed to reflect light received through the respective first and second portions of the single imaging path.
  • The modulator may include a light valve having a plurality of moveable mirror elements.
  • The modulator may be operably configured to simultaneously receive a first image having first image attributes and a second image having second image attributes, and separate the first and second images in accordance with the first and second image attributes to produce respective first and second image representations.
  • The modulator may include a polarizer having first and second polarization regions operably configured to generate a first image having a first state of polarization and the second image having a second state of polarization, and may further include a sensor array having a first plurality of elements responsive to radiation of the first polarization state and a second plurality of elements responsive to radiation of the second polarization state, the sensor array being operable to separate the first and second images.
  • The modulator may be operably configured to vary the extent of the first and second portions of the imaging path in response to a control signal.
  • The controller may be operably configured to generate the control signal.
  • A location of the first perspective viewpoint may be defined by a first centroid location and a location of the second perspective viewpoint by a second centroid location, and the controller may be operably configured to generate the control signal by generating a control signal operable to cause the first and second centroids to move with respect to each other at a generally constant rate to provide a smooth change in the representation of the three dimensional spatial attributes.
  • A location of the first perspective viewpoint may be defined by a first centroid location and a location of the second perspective viewpoint by a second centroid location, and the modulator may be operably configured to vary the extent of the first and second portions of the imaging path between a first extent, where the first and second centroid locations are proximally located, causing the first and second images to include predominately two-dimensional spatial attributes within the field of view, and a second extent, where the first and second centroid locations are spaced apart, causing the first and second images to include an increasing degree of three dimensional spatial attribute information.
  • The modulator may be operably configured to vary the extent of the first and second portions of the imaging path by varying the extent to provide a smooth transition from one of the first extent to the second extent, to produce a two-dimensional to three-dimensional transition effect, and the second extent to the first extent, to produce a three-dimensional to two-dimensional transition effect.
  • The image path may be operably configured to receive the first and second images by sequentially receiving a plurality of first and second images representing time variations of subject matter within the field of view.
  • Figure 1 is a top schematic view of an apparatus for generating three-dimensional image information in accordance with a first embodiment of the invention;
  • Figure 2 is a front schematic view of an imaging path of the apparatus shown in Figure 1;
  • Figure 3 is a perspective view of an optical imaging apparatus for generating three-dimensional image information in accordance with another embodiment of the invention;
  • Figure 4 is a cross-sectional view of the optical imaging apparatus shown in Figure 3, taken along the line 4 - 4;
  • Figure 5 is a representation of first and second images produced by the optical imaging apparatus shown in Figure 3;
  • Figure 6 is a block diagram of a controller for controlling operation of the optical imaging apparatus shown in Figure 3;
  • Figure 7 is a process flow chart depicting a control process implemented by the controller shown in Figure 6;
  • Figures 8A-8D are a series of representations of first and second images produced by the optical imaging apparatus shown in Figure 3;
  • Figure 9 is a perspective view of a liquid crystal modulator used in the optical imaging apparatus shown in Figure 3;
  • Figure 10 is a schematic view of a spatial modulator in accordance with an alternative embodiment of the invention;
  • Figure 11 is a graphical depiction of control signals for controlling the spatial modulator shown in Figure 10;
  • Figure 12 is a perspective view of an alternative embodiment of an actuator for use in the spatial modulator shown in Figure 10; and
  • Figure 13 is a perspective view of an alternative embodiment of an optical imaging apparatus for generating three-dimensional image information.
  • An apparatus according to a first embodiment of the invention for generating three-dimensional image information is shown in schematic top view generally at 100.
  • The apparatus 100 includes a single imaging path 102 having an associated field of view 104, which in this embodiment includes an object 106.
  • The apparatus 100 also includes an image modulator 108 operably configured to cause first and second images (shown schematically as "A" and "B" in Figure 1) to be selectively received through respective first and second portions 112 and 114 of the single imaging path 102.
  • The imaging path is circular, and the first and second portions 112 and 114 each generally comprise a circular segment.
  • The first portion 112 defines a first perspective viewpoint within the field of view 104, which is represented by a first centroid 116.
  • The second portion 114 defines a second perspective viewpoint within the field of view 104, which is represented by a second centroid 118.
  • The imaging path may be non-circular.
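For a circular aperture split by a chord, the centroid of each circular segment (and hence each perspective viewpoint, such as centroids 116 and 118) follows a standard plane-geometry result: the centroid moves away from the optical axis as the chord moves outward. A sketch for reference; the function name and normalisation are illustrative, not from the patent.

```python
import math

def segment_centroid(radius, d):
    """Distance of a circular segment's centroid from the aperture centre.

    The segment is the part of a circular aperture of the given radius cut
    off by a chord at distance d from the centre (0 <= d < radius); d = 0
    gives a half-disc, as when the two portions split the path evenly.
    """
    h = math.sqrt(radius**2 - d**2)            # half the chord length
    area = radius**2 * math.acos(d / radius) - d * h
    # Standard result: centroid distance = (2/3) * h^3 / segment area.
    return (2.0 / 3.0) * h**3 / area
```

For a half-disc (d = 0) this reduces to the familiar 4R/(3π); increasing d pushes the centroid outward, which is how varying the extent of the portions moves the two perspective viewpoints apart.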
  • The apparatus 100 also includes a controller 120 in communication with the modulator 108.
  • The controller 120 includes an output 122 for producing a control signal operable to cause the modulator 108 to vary an extent of the first and second portions 112 and 114 of the imaging path, thereby causing the first and second perspective viewpoints 116 and 118 to change location while receiving the first and second images.
  • The change in location of the perspective viewpoints 116 and 118 provides a corresponding change in the representation of the three dimensional spatial attributes of the object 106 within the field of view 104.
  • The apparatus 100 also includes a compensator 124.
  • The compensator 124 is operably configured to compensate for changes in transmission through the first and second portions 112 and 114 of the imaging path 102 such that while varying the extent of the first and second portions, an image intensity associated with each of the first and second images A and B is maintained at a generally uniform image intensity level.
  • The first and second images A and B are formed at an image plane 126, and together the first and second images are operable to represent three dimensional spatial attributes of the object 106 and other objects within the field of view 104.
  • The controller 120 also includes an input 128 for receiving user input of a desired change in perspective, and the controller is operably configured to produce the control signal at the output 122 in response to the user input.
  • The imaging path 102 may be an optical imaging path operable to receive light radiation for producing the images.
  • The light radiation may have a wavelength range in the infrared, visible, and/or ultraviolet wavelength ranges.
  • The imaging path 102 may be operable to produce images in response to receiving acoustic, ultrasonic, or radio frequency signals.
  • The image at the image plane 126 may be captured by any suitable image capture device using any of a variety of recording methods and/or media.
  • The image capture device may be a still camera or movie camera having a photosensitive film or a charge coupled device (CCD) array for recording the images.
  • A piezoelectric crystal array may be used for acoustic or ultrasonic imaging, and an antenna or antenna array may be used for radio frequency imaging, for example.
  • The single image path 102 produces A and B images from which 3D information can be perceived and/or extracted without requiring any special alignments other than would normally be required in assembling the image path.
  • the optical imaging apparatus 150 includes a single imaging path 152, having a first lens 154 and a second lens 156 disposed to receive light rays from an object 158 within a field of view of the first and second lenses.
  • the optical imaging apparatus 150 also includes a liquid crystal device (LCD) modulator 160 having a plurality of elements 162.
  • Each element 162 defines a columnar portion of a front surface area 164 of the modulator 160 that may be selectively controlled to alternately block a first portion 165 of the imaging path 152 while receiving a first image, and a second portion 167 of the imaging path while receiving a second image.
  • the modulator 160 also includes a plurality of control inputs 166, each element 162 having an associated control input for receiving an actuation signal for selectively actuating the element.
  • the optical imaging apparatus 150 further includes a camera 170 having a third lens 172 and a CCD image sensor 174 located at an image plane of the camera 170.
  • the camera may be a still camera or a video camera and may be sensitive to visible or non-visible light.
  • the third lens 172 gathers light transmitted by the modulator 160 and forms an image on the image sensor
  • the image sensor 174 includes a photo-sensitive area 176, and one or more control inputs 178 for receiving various control signals operable to control operations of the sensor related to capturing the image.
  • the image sensor 174 has a spatial array of photosensitive elements that accumulate charges in proportion to incident light on the element. The accumulated charge may be read out of the image sensor 174 by serially shifting the charges through adjacent coupled elements to a charge amplifier, which converts the charges into a voltage signal representing the light incident on the associated element.
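The serial charge-shifting readout described above can be sketched in a few lines of Python. This is an illustrative model only, not the patent's implementation; the charge-to-voltage gain and the list-based shift register are assumptions.

```python
# Illustrative model of CCD serial readout: charge packets are shifted
# one element at a time into a charge amplifier, which converts each
# packet to a voltage. The gain value is a made-up placeholder.

CHARGE_TO_VOLT = 0.5  # hypothetical amplifier gain, volts per unit charge

def read_out(charges):
    """Serially shift a row of accumulated charges out of the sensor,
    returning the corresponding voltage signal."""
    voltages = []
    row = list(charges)  # copy so the caller's data is untouched
    while row:
        packet = row.pop(0)  # shift the next charge to the amplifier
        voltages.append(packet * CHARGE_TO_VOLT)
    return voltages

print(read_out([4, 2, 8]))  # -> [2.0, 1.0, 4.0]
```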
  • the image sensor 174 may be a complementary metal-oxide-semiconductor (CMOS) active-pixel sensor, or other electronic image sensor.
  • CMOS complementary metal-oxide-semiconductor
  • the image sensor 174 may be a photosensitive film emulsion, such as 35mm film for example.
  • the image sensor 174, third lens 172, liquid crystal modulator 160, first lens 154, second lens 156, and the camera 170 are all aligned along an optical axis 180.
  • the apparatus 150 is shown in cross section in Figure 4, with the imaging path 152 being illuminated by a bundle or cone of rays 190 emanating from an off-axis point on the object 158.
  • a diameter of one of the optical elements will limit which rays in the bundle 190 can pass through the optical system, and this diameter defines the system aperture.
  • the image of the system aperture by optical surfaces located between the aperture and the object 158 defines a location and extent of an entrance pupil for the system.
  • the entrance pupil in turn defines a bundle of rays that are able to pass through the imaging path.
  • the first lens 154 is the system aperture and thus also the entrance pupil, and rays that impinge on the first lens will be transmitted through the imaging path 152.
  • the entrance pupil may be located in front of or behind the first lens 154, depending on the configuration of the lenses.
  • Rays in the bundle 190 that enter the first lens 154 are thus focused through the second lens 156 and impinge on the front surface area 164 of the modulator 160.
  • when a partial occlusion such as the actuated first portion 165 of the modulator 160 is located after the system aperture in the imaging path 152, vignetting of the image occurs.
  • rays 192 in the bundle of rays 190 are blocked by the first portion 165 of the front surface area 164 of the modulator 160 and do not reach the photo-sensitive area 176 of the sensor 174.
  • Rays 194 in the bundle of rays 190 pass through the modulator 160, and are focused onto the photo-sensitive area 176 by the lens 172.
  • Vignetting reduces the overall illumination of the image formed at the photosensitive area 176 of the sensor 174. However, since the rays 194 intersect at the photo-sensitive area 176 of the sensor 174, a real image is formed at the sensor. Furthermore the vignetting caused by the modulator does not change the angle of view at the entrance pupil.
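Because the modulator vignettes rather than crops the image, the main photometric effect is a drop in overall illumination roughly proportional to the blocked fraction of the pupil. A minimal sketch, assuming a uniformly illuminated pupil divided into equal-area columnar elements (a simplification not stated in the patent):

```python
def relative_illumination(total_elements, blocked_elements):
    """Fraction of light reaching the sensor when `blocked_elements`
    of `total_elements` equal-area modulator columns block the pupil.
    Assumes uniform pupil illumination (a simplification)."""
    if not 0 <= blocked_elements <= total_elements:
        raise ValueError("blocked_elements out of range")
    return (total_elements - blocked_elements) / total_elements

print(relative_illumination(10, 4))  # -> 0.6
```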
  • the first image produced by the optical imaging apparatus 150 under the vignetting conditions shown in Figure 4, is shown generally at 200 in Figure 5.
  • the image 200 corresponds to a right perspective viewpoint represented by a centroid 182 (shown in Figure 1).
  • a second image 202 (shown in Figure 5) is produced by the apparatus 150.
  • the second image 202 corresponds to a left perspective viewpoint represented by a centroid 184 (shown in Figure 1).
  • the first and second images 200 and 202 together include information representative of three dimensional spatial attributes of objects within the field of view. For example, a user separately viewing the image 200 using their right eye while viewing the image 202 using their left eye will be able to perceive a similar depth effect that would be perceptible if the user were to view the object directly.
  • the images may be separately directed to the respective left and right eyes of the user using a pair of stereoscopic viewing glasses, for example.
  • the controller 220 includes an output 222 for producing a synchronization signal (SYNC), which typically comprises a pulse train.
  • the output 222 is in communication with the image sensor input 178 for synchronizing the image capture at the image sensor 174.
  • the controller 220 also includes an output 224 for producing a compensation signal (COMP) for controlling image intensity compensation.
  • the output 224 is in communication with the image sensor input 178 and the image sensor acts as the compensator 124 shown in Figure 1.
  • the COMP signal produced at the output 224 may be used to control an aperture stop compensator such as an adjustable iris in the optical system to reduce or increase the bundle of rays accepted by the imaging path.
  • Electronically controlled auto-iris diaphragms are commonly used in cameras that automatically select an aperture size and exposure to ensure correct image exposure.
  • the controller 220 further includes a modulator driver 226 having an output 228 for driving the control input 166 of the modulator 160.
  • the output 228 has "n" output channels corresponding to the number of elements 162 on the modulator 160.
  • the controller 220 also includes an input 230 for receiving a change perspective (CP) user input.
  • the CP input 230 may be provided from a biased single-pole-double-throw switch configured to provide a varying potential at the input.
  • the controller 220 may be implemented using a processor circuit such as a micro-controller, for example.

Controller operation
  • operation of the controller 220 in controlling the optical imaging apparatus 150 is described further with reference to Figure 3, Figure 6, and Figure 7.
  • referring to Figure 7, one embodiment of a control process implemented by the controller 220 is shown generally at 250.
  • the process begins with the controller 220 detecting a signal state associated with the CP signal at the input 230. As shown at 254, if the CP signal has changed state, indicating that the user wishes to change the image perspective, then the process continues at 256.
  • the controller then produces a predicted light loss or gain in response to the CP signal.
  • the predicted light loss or gain may be computed for the detected CP signal state change at the input 230.
  • the predicted light loss or gain may be pre-determined and stored as a look up table in a memory of the processor circuit.
  • the predicted light loss or gain is then used to produce a compensation signal (COMP) at the output 224 of the controller, in a format suitable for driving the particular image sensor 174.
  • COMP compensation signal
  • the amount of light captured by the CCD array may be controlled by a mechanical shutter (not shown) proximate the focal plane and the COMP signal would then be configured to cause the mechanical shutter to operate with a suitable shutter speed to produce a desired image intensity.
  • the COMP signal may be a gating signal for gating a light accumulation phase of image capture such that the CCD elements are only configured to receive light for a portion of the time between successive image captures.
  • Some CCD sensors also permit adjustment of a gain associated with either the analog charge amplification and/or the analog to digital conversion of the charge signals, and this gain may also be controlled by the COMP signal to compensate for the intensity of the first and second images.
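The intensity compensation amounts to scaling exposure or gain inversely with the predicted transmission, so the first and second images keep a uniform intensity as the blocked extent changes. A hedged sketch; the 10-element modulator and the dictionary look-up table are assumptions mirroring the pre-computed table mentioned above:

```python
def compensation_gain(relative_transmission):
    """Exposure/gain multiplier that holds image intensity constant
    as the modulator blocks more or less of the imaging path."""
    if relative_transmission <= 0:
        raise ValueError("no light is transmitted")
    return 1.0 / relative_transmission

# Pre-computed look-up table indexed by number of blocked elements,
# for a hypothetical 10-element modulator (cannot block all 10).
COMP_LUT = {n: compensation_gain((10 - n) / 10) for n in range(10)}

print(compensation_gain(0.5))  # -> 2.0
```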
  • the modulator 160 is configured for the first image capture in accordance with the CP signal state, which involves configuring the modulator to drive a first plurality of the n-channels 228 to cause a first plurality of the elements 162 of the modulator 160 to be controlled to block light.
  • capture of the first image is initiated when the controller produces a SYNC pulse at the output 222.
  • the captured first image may be recorded in analog or digital format on an image storage medium (not shown) such as magnetic tape, a memory, a hard drive, or a photosensitive emulsion, for example.
  • the modulator 160 is then configured for the second image capture, by configuring the modulator to drive a second plurality of the n- channels of the output 228 to cause a second plurality of the elements 162 of the modulator 160 to be controlled to block light.
  • capture of the second image is initiated when the controller produces a second SYNC pulse at the output 222.
  • the SYNC signal would produce first and second time-separated synchronization pulses.
  • the time-separation between pulses is selected to provide sufficient time for the image sensor 174 to accumulate photons sufficient to produce an image.
  • a frame rate may be imposed by a selected video format (e.g. 29.97 frames per second for NTSC video), in which case the SYNC signal may comprise a plurality of pulses at time intervals of about 33.3 milliseconds, for a noninterlaced image capture.
  • the first and second images may be captured at time intervals of 16.7 milliseconds such that each eye of the user receives the respective images at the full NTSC frame rate.
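The pulse timing arithmetic above can be checked with a short sketch; the function name and the millisecond representation are conveniences, not part of the patent:

```python
NTSC_FPS = 29.97  # NTSC frame rate, frames per second

def sync_pulse_times(n_pairs):
    """Times (ms) of alternating A/B SYNC pulses, spaced at half the
    NTSC frame period so each eye sees the full frame rate."""
    half_period_ms = 1000.0 / (2 * NTSC_FPS)  # ~16.7 ms
    return [i * half_period_ms for i in range(2 * n_pairs)]

times = sync_pulse_times(2)
print(round(times[1], 1))  # -> 16.7
```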
  • A series of representations of the imaging path 152 depicting a change in perspective are shown in Figure 8. Referring to Figure 8A, the imaging path 152 is depicted in end view and the centroids 182 and 184 are located on a central axis of the imaging path. Under these conditions, neither the first nor the second portions of the modulator 160 are controlled to block light and the first image (A) and the second image (B) are identical. A user viewing the respective A and B images using respective left and right eyes will perceive only a standard two-dimensional (2D) image, with no three-dimensional (3D) depth perception being possible.
  • referring to Figure 8B, the first and second portions 165 and 167 of the modulator 160 are alternately controlled to block light such that transmission occurs alternately through portions 300 and 302 of the imaging path 152.
  • the resulting A and B images have slightly differing perspective viewpoints and the user viewing the respective A and B images will be able to perceive at least some 3D depth due to the differing perspectives of the images presented to each eye.
  • referring to Figure 8C, the centroids 182 and 184 have moved further outwardly such that transmission occurs alternately through portions 300 and 302 of the imaging path 152.
  • the resulting A and B images have greater differing perspective viewpoints than in Figure 8B and the user viewing the respective A and B images will be able to perceive a greater degree of 3D depth.
  • referring to Figure 8D, the centroids 182 and 184 are spaced apart to an extent where a region 304 of the imaging path 152 is blocked either by the first portion 165 of the modulator 160 or by the second portion 167 of the modulator. Transmission occurs alternately through the portions 300 and 302 of the imaging path 152.
  • the resulting A and B images have even greater differing perspective viewpoints than in Figure 8C, providing an even greater degree of 3D depth perception.
  • the apparatus 150 facilitates a smooth change in perspective from a 2D to a 3D image representation in the resulting images.
  • the captured A and B images may be viewed using a specially adapted 3D display system that uses special eyewear or headgear to present the different A and B images to the user's left and right eyes.
  • the images may be displayed using an auto-stereoscopic display capable of displaying 3D image information that can be viewed without the use of special glasses or headgear.
  • the controller may be configured to cause the first and second centroids 182 and 184 to move with respect to each other at a generally constant rate to provide a smooth change in the representation of the three dimensional spatial attributes.
  • the non-linear relation between the location of the centroids 182 and 184 and the area of the first and second portions 165 and 167 may be stored in the controller as a look-up table, for example.
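The non-linear relation arises because the pupil is typically circular: as a straight blocking edge sweeps across it, the centroid of the open region does not move linearly with the edge position. The sketch below computes that relation numerically and inverts it by bisection, the kind of relation a controller could store as a look-up table; the unit-circle pupil and the numerical method are illustrative assumptions.

```python
import math

def open_region_centroid(c, samples=10000):
    """Centroid (x-coordinate) of the part of a unit-circle pupil
    with x >= c, computed by midpoint-rule integration."""
    area = moment = 0.0
    width = (1.0 - c) / samples
    for i in range(samples):
        x = c + (i + 0.5) * width
        h = 2.0 * math.sqrt(max(0.0, 1.0 - x * x))  # chord height at x
        area += h * width
        moment += x * h * width
    return moment / area if area else 1.0

def edge_for_centroid(target, tol=1e-4):
    """Bisect for the blocking-edge position whose open-region
    centroid equals `target`."""
    lo, hi = -1.0, 1.0 - 1e-6
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if open_region_centroid(mid, 2000) < target:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# centroid of the unblocked half-pupil (c = 0) is 4/(3*pi), ~0.424
print(round(open_region_centroid(0.0), 3))
```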
  • the LCD modulator 160 is shown in greater detail in Figure 9. Referring to
  • the modulator 160 includes a liquid crystal material layer 350 disposed between a first glass plate 352 and a second glass plate 354.
  • the first glass plate 352 includes a plurality of transparent electrodes 356 arranged in columns.
  • the electrodes 356 define an extent of each of the plurality of elements 162 shown in Figure 3.
  • Each electrode 356 has an associated connector 358, which may be a wire-bonded or flexible circuit connection, for example.
  • the connectors 358 connect to a header 360, which in turn facilitates connection to the output 228 of the controller 220 shown in Figure 6.
  • the second glass plate 354 includes a transparent area electrode (not shown) which acts as a common electrode for all elements 162.
  • the modulator 160 also includes a first polarizer 362, having a first linear polarization property (in this case vertical polarization).
  • the first polarizer 362 overlays the first electrodes 356.
  • the modulator 160 further includes a second polarizer 364 overlaying the second electrode and having a second linear polarization property (in this case horizontal polarization).
  • the layers are not shown to scale.
  • the modulator driver 226 provides a drive voltage to each electrode 356 via the header 360 and connectors 358, with the common electrode acting as a ground connection.
  • the drive voltage may be a 50% duty cycle square wave varying between voltages V+ and V−, where the voltages are selected within a range of safe operating voltages to provide sufficient contrast between transmission and blocking of light impinging on the LCD modulator 160.
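A sketch of such a drive waveform; the ±5 V levels are placeholders for whatever safe operating voltages a particular LCD requires, not values from the patent:

```python
V_PLUS, V_MINUS = 5.0, -5.0  # illustrative drive levels, not from the patent

def drive_sample(t, period):
    """50% duty-cycle square wave between V+ and V-; the symmetric AC
    drive avoids leaving a net DC bias across the liquid crystal."""
    return V_PLUS if (t % period) < period / 2 else V_MINUS

print([drive_sample(t, 2.0) for t in range(4)])  # -> [5.0, -5.0, 5.0, -5.0]
```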
  • the first polarizer 362 transmits light having a vertical polarization.
  • the liquid crystal material 350 is selected so that in its relaxed phase (un-actuated) the polarization of light passing through the crystal is unaffected and the second polarizer 364 thus blocks the light.
  • when an electrode 356 is actuated by the drive voltage, a portion of the liquid crystal material underlying the electrode causes the light to undergo a 90° change in polarization, thus passing through the modulator 160.
  • the modulator 160 alternately blocks light at the first and second portions 165 and 167 respectively.
  • an extent of the first and second portions of the imaging path 152 may be varied to cause the first and second perspective viewpoints represented by the centroids 182 and 184 in Figure 3 to change location.
  • providing a sufficient number of electrodes 356 facilitates a generally smooth variation in perspective thereby preventing a visually disturbing transition from 2D to 3D imaging.
  • the polarizers 362 and 364 may both be vertically polarized, such that the LCD Modulator is transmissive when no actuation voltage is applied.
  • when actuated by the drive voltage, the liquid crystal material causes the light to undergo a 90° change in polarization, thus causing the elements 356 to block transmission of light.
  • the modulator 160 shown in Figure 3 may be implemented using the spatial modulator shown generally at 380.
  • the spatial modulator 380 includes an opaque shutter blade 382 mounted on an arm 384.
  • the arm 384 is mounted on a pivot 386 to provide for side-to-side motion.
  • the arm 384 also includes a magnet 390 mounted partway along the arm. The magnet 390 is disposed between first and second electromagnets 392 and 394.
  • the first and second positions 382 and 383 define a varying extent of the first and second portions of the single imaging path 152 shown in Figure 3.
  • the spatial modulator further includes a position sensor 396 located behind the arm 384.
  • the position sensor 396 includes an output 398 for producing a position signal representative of a position of the arm 384 with respect to the position sensor.
  • the position sensor 396 may be implemented using a linear photodiode array where either background stray light or illumination from a source such as a light emitting diode (not shown) casts a shadow on the array.
  • the location of the shadowed array elements may be read out from the photodiode array at the output 398 and various interpolation methods used to determine a center location of the arm 384.
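One simple interpolation of the shadow centre, assuming shadowed elements read below some threshold; both the thresholding scheme and the mean-index estimate are illustrative choices among the "various interpolation methods" mentioned above:

```python
def shadow_center(readings, threshold=0.5):
    """Estimate the arm's position from a linear photodiode array:
    elements reading below `threshold` are taken as shadowed, and
    their mean index gives a sub-element position estimate."""
    shadowed = [i for i, r in enumerate(readings) if r < threshold]
    if not shadowed:
        raise ValueError("no shadow detected")
    return sum(shadowed) / len(shadowed)

# shadow covers elements 2..4, centred on element 3
print(shadow_center([1.0, 1.0, 0.1, 0.0, 0.2, 1.0]))  # -> 3.0
```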
  • the modulator driver 226 shown in Figure 6 may be replaced by the modulator driver 400 shown in Figure 10.
  • the modulator driver 400 includes a first pair of outputs 402 for driving a coil 404 of the first electromagnet 392 and a second pair of outputs 406 for driving a coil 408 of the second electromagnet 394.
  • the modulator driver 400 also includes an input 410 for receiving the position signal from the position sensor 396.
  • the modulator driver 400 further includes an input 412 for receiving a reference signal representing the desired alternate positions of the arm 384.
  • the reference signal defines an alternating target position for the arm 384 and shutter blade 382 and may be generated by the controller 220 in response to the CP signal.
  • the spatial modulator 380 and modulator driver 400 together implement a feedback control loop for producing alternating motion of the arm 384 and shutter blade 382 to vary an extent of blocking of the image path (shown in broken outline at 152).
  • the reference signal received at the input 412 of the modulator driver 400 provides a target position of the arm 384, while the position signal received at the input 410 represents the actual position of the arm and may be used to produce an error signal for driving the modulator driver 400.
  • the feedback control loop thus produces drive signals at the outputs 402 and 406 to cause the electromagnets 392 and 394 to exert drive forces on the arm 384 to move toward a desired position.
  • the drive may be implemented as a push-pull driver where one of the electromagnets 392 and 394 provides an attractive force on the magnet 390, while the other of the electromagnets provides a repulsion force.
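The behaviour of such a loop can be sketched with a toy simulation. The proportional-derivative error term, the unit mass, and the gains are all invented for illustration; the point is only that the drive force tracks the error signal and the velocity-dependent term supplies a decelerating force as the target is approached.

```python
def step_response(target, steps=5000, dt=1e-4, kp=400.0, kd=40.0):
    """Toy push-pull servo: a proportional-derivative error signal
    sets the net force from the two coils (attraction minus a
    velocity-dependent braking term). All constants are invented."""
    pos, vel, mass = 0.0, 0.0, 1.0
    for _ in range(steps):
        error = target - pos
        force = kp * error - kd * vel  # push-pull: accelerate, then brake
        vel += (force / mass) * dt     # semi-implicit Euler integration
        pos += vel * dt
    return pos

final = step_response(1.0)  # settles very close to the target position
```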
  • Exemplary waveforms of a current drive provided to the coils 404 and 408 to cause the arm 384 to move toward the first electromagnet 392 are shown graphically in Figure 11.
  • the current waveform through the coil 404 is shown at 440 and the current waveform through the coil 408 is shown at 442.
  • the alternating target positions provided by the reference signal REF at the input 412 are S1 and S2 respectively.
  • the error signal derived from the difference between the target position and the current position is large, causing the current 440 to increase rapidly to produce an attractive force on the arm 384.
  • the attractive force overcomes the inertia of the arm 384 and causes the arm to accelerate away from the second electromagnet 394.
  • the current 442 is initially at zero and once the arm 384 begins to accelerate, the current 442 increases rapidly to provide a decelerating force as the desired arm position S2 is approached.
  • the arm 384 comes to rest at the position S2 and is held in place at this position by a holding current in each of the coils 404 and 408, which is continuously adjusted by the feedback control loop to maintain the arm 384 in the position S2 for a second period of time 448.
  • the second time period 448 provides sufficient time to complete capture of the first image.
  • the reference signal at the input 412 then changes, defining the position S1 as the new target position.
  • the current 442 changes polarity and increases rapidly to produce an attractive force causing the arm 384 to overcome its inertia and accelerate away from the first electromagnet 392.
  • the current 440 is initially allowed to fall to zero and once the arm 384 begins to accelerate, the current 440 increases rapidly to provide a decelerating force as the target position S1 is approached.
  • the arm 384 comes to rest at the position S1 and is held in place at this position by a holding current in each of the coils 404 and 408, which is continuously adjusted by the feedback control loop to maintain the arm 384 in the position S1 for a fourth period of time 452.
  • the fourth time period 452 provides sufficient time to complete capture of the second image.
  • an alternative embodiment of the actuator portion of the spatial modulator 380 (shown in Figure 11) is shown generally at 500.
  • the actuator 500 includes a motor portion 502 and a rotary position sensor portion 504.
  • a common rotor shaft 506 extends through the motor and position sensor portions 502 and 504.
  • the arm 384 is mounted to the shaft for side-to-side motion.
  • the motor portion 502 provides a drive force for moving the arm 384, while the position sensor portion 504 provides a position signal.
  • the motor portion 502 is implemented using a pair of magnets 508 and 510
  • the sensor portion 504 is implemented using a pair of magnets 512 and 514.
  • the shaft 506 supports an actuator coil 516 between the magnets 508 and 510.
  • the actuator coil 516 is coupled to the modulator output 402 for receiving a drive current, which causes a torque to be generated on the coil and thus applied to the shaft 506.
  • the sensor portion 504 also includes a pickup coil (not shown) located between the magnets 512 and 514. The pickup coil generates a current signal proportional to rotary displacement, which may be used as the position signal at the input 410.
  • the actuator 500 operates in a manner similar to an analogue meter movement.
  • the motor portion 502 may be configured such that the shaft 506 is magnetized and the coil is wound around pole pieces (i.e. 508 and 510). Similarly, the pickup coil of the sensor portion 504 may be wound around pole pieces (i.e. 512 and 514).

Light valve embodiment
  • the apparatus 550 includes a single imaging path 552, having a first lens 554 and a second lens 556 disposed to receive light rays from an object 558 within a field of view of the first and second lenses.
  • the apparatus 550 includes a light valve modulator 560, having a plurality of individually actuated mirror elements 562 disposed to direct a beam of light through a lens 568 when actuated. In an un-actuated state the mirror elements 562 direct the beam of light away from the lens 568.
  • the modulator may be actuated in the alternating manner described earlier in connection with the modulator 160 shown in Figure 3.
  • the second polarizer 364 may be omitted to configure the modulator to selectively change the polarization of the transmitted light.
  • the first polarizer 362 only transmits light having a vertical polarization. Portions of the liquid crystal material 350 underlying un-actuated electrodes leave this polarization unchanged, while portions underlying actuated electrodes cause the transmitted light to undergo a 90° change in polarization.
  • the liquid crystal material 350 of the LCD modulator 160 may be configured to produce a first image having right circular polarized light and a second image having left circular polarized light.
  • the sensor 174 may be configured to simultaneously receive the respective first and second images by adding polarizing elements in front of individual sensor array elements. For example, adjacent sensor pixels may be alternately horizontally polarized and vertically polarized to provide polarization selective pixels that are sensitive to only one polarization orientation. The sensor would thus permit both the first and second images to be simultaneously received.
  • the first and second images may be separated during readout of the array or in a separate processing step.
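Separating the two simultaneously captured images then reduces to de-interleaving the polarization-selective pixels. A sketch assuming the pixels alternate by column; the actual interleave pattern is a design choice not fixed by the patent:

```python
def separate_images(frame):
    """Split a frame from a sensor whose columns alternate between the
    two polarization orientations: even columns form the first image,
    odd columns the second (an assumed layout)."""
    first = [row[0::2] for row in frame]
    second = [row[1::2] for row in frame]
    return first, second

frame = [[10, 90, 12, 88],
         [11, 91, 13, 89]]
a, b = separate_images(frame)
print(a)  # -> [[10, 12], [11, 13]]
```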

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Surgery (AREA)
  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Medical Informatics (AREA)
  • Public Health (AREA)
  • Radiology & Medical Imaging (AREA)
  • Optics & Photonics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Biophysics (AREA)
  • Molecular Biology (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Pathology (AREA)
  • Veterinary Medicine (AREA)
  • General Physics & Mathematics (AREA)
  • Signal Processing (AREA)
  • Multimedia (AREA)
  • Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)
  • Studio Devices (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Stereoscopic And Panoramic Photography (AREA)
  • Instruments For Viewing The Inside Of Hollow Bodies (AREA)

Abstract

A method and apparatus for generating three dimensional image information using a single imaging path having an associated field of view is disclosed. The method involves selectively receiving first and second images through respective first and second portions of the single imaging path, the first portion having a first perspective viewpoint within the field of view and the second portion having a second perspective viewpoint within the field of view. The first and second images together are operable to represent three dimensional spatial attributes of objects within the field of view. The method also involves varying an extent of the first and second portions of the imaging path to cause the first and second perspective viewpoints to change location while receiving the first and second images, the change in perspective viewpoint location providing a corresponding change in the representation of the three dimensional spatial attributes. The method further involves compensating for changes in transmission through the first and second portions of the imaging path such that while varying the extent of the first and second portions, an image intensity associated with each of the first and second images is maintained at a generally uniform image intensity level.

Description

METHOD AND APPARATUS FOR GENERATING THREE DIMENSIONAL IMAGE INFORMATION USING A SINGLE IMAGING PATH
BACKGROUND OF THE INVENTION
1. Field of Invention
This invention relates generally to generating three dimensional image information and more particularly to generating three dimensional image information using a single imaging path.

2. Description of Related Art
Imaging generally involves producing a representation of a scene by receiving radiation emitted or reflected by objects in the scene at a suitable image sensor. Some examples of radiation that may be imaged include visible light, infrared light or heat, radiofrequency waves, acoustic waves, and ultrasonic waves.
A three-dimensional (3D) scene includes depth information, which in many imaging systems is mapped onto a two-dimensional (2D) image plane and is thus not preserved. A conventional camera is an example of an optical imaging system in which depth information is not preserved resulting in a 2D image representing the scene. Stereoscopic optical systems are capable of producing images that represent depth information by producing separate images from differing perspective viewpoints. The depth information may be used to produce 3D measurements between points in the scene, for example. Alternatively, the separate images may be presented to respective left and right eyes of a user to enable the user to perceive an image view having at least some depth represented in the images. The stereoscopic system thus produces images having spatially separated perspective viewpoints that mimic the operation of the human eyes in viewing a real scene. The images may be viewed using some form of active eyewear or by operating a display to project spatially separated images toward the user's respective left and right eyes.
The use of stereoscopic imaging finds application in surgery where a 3D endoscope may be used to provide a 3D view to the surgeon. Stereoscopic imaging may also be useful in remote operations, such as undersea exploration for example, where control of a robotic actuator is facilitated by providing 3D image information to an operator who is located remotely from the actuator. Other applications of stereoscopic imaging may be found in physical measurement systems and in the entertainment industry.
SUMMARY OF THE INVENTION
In accordance with one aspect of the invention there is provided a method of generating three dimensional image information using a single imaging path having an associated field of view. The method involves selectively receiving first and second images through respective first and second portions of the single imaging path, the first portion having a first perspective viewpoint within the field of view and the second portion having a second perspective viewpoint within the field of view. The first and second images together are operable to represent three dimensional spatial attributes of objects within the field of view. The method also involves varying an extent of the first and second portions of the imaging path to cause the first and second perspective viewpoints to change location while receiving the first and second images, the change in perspective viewpoint location providing a corresponding change in the representation of the three dimensional spatial attributes. The method further involves compensating for changes in transmission through the first and second portions of the imaging path such that while varying the extent of the first and second portions, an image intensity associated with each of the first and second images is maintained at a generally uniform image intensity level. 
Selectively receiving the first and second images may involve receiving the first and second images at an image sensor and compensating for the changes in the transmission may involve one of increasing an exposure associated with the image sensor in response to a reducing extent of the first and second portions of the imaging path, decreasing a gain associated with the image sensor in response to an increasing extent of the first and second portions of the imaging path, increasing overall transmittance through the imaging path in response to a reducing extent of the first and second portions of the imaging path, and reducing overall transmittance through the imaging path in response to an increasing extent of the first and second portions of the imaging path.
Selectively receiving the first and second images may involve alternately blocking the first portion of the imaging path while receiving the second image, and blocking the second portion of the imaging path while receiving the first image.
Alternately blocking the first and second portions of the imaging path may involve causing a blocking element located proximate an aperture plane of the image path to move between first and second positions in the image path to define the varying extent of the first and second portions of the imaging path.
Causing the blocking element to move may involve producing a force operable to alternately move the blocking element toward one of the first and second positions, receiving a position signal representing a position of the blocking element, and controlling a magnitude of the force in response to the position signal to cause the blocking element to come to rest at the one of the first and second positions. Alternately blocking the first and second portions of the imaging path may involve selectively actuating first and second regions of an optical element
located proximate an aperture plane of the image path to selectively block the first and second portions of the imaging path.
The optical element may involve a plurality of elements and selectively actuating the first and second regions may involve selectively actuating one of a first plurality of elements in the plurality of elements, and a second plurality of elements in the plurality of elements.
Each element of the plurality of elements may be operable to be actuated in response to receiving an actuation signal, and varying the extent of the first and second portions of the imaging path may involve generating actuation signals to cause a number of elements in the first and second plurality of elements to be selectively varied to vary the extent of the first and second portions of the imaging path.
Selectively actuating the first and second regions of the optical element may involve selectively actuating first and second regions of a transmissive optical element disposed to transmit light through the respective first and second portions of the single imaging path.
Selectively actuating first and second regions of the transmissive optical element may involve selectively actuating first and second regions of one of a liquid crystal element, and a light valve. Selectively actuating the first and second regions of the optical element may involve selectively actuating first and second regions of a reflective optical element disposed to reflect light through the respective first and second portions of the single imaging path. Selectively actuating first and second regions of the reflective optical element may involve selectively actuating first and second regions of a light valve having a plurality of moveable mirror elements. Selectively receiving the first and second images may involve simultaneously receiving a first image having first image attributes and a second image having second image attributes, and separating the first and second images in accordance with the first and second image attributes to produce respective first and second image representations.
Receiving the first image may involve receiving a first image having a first state of polarization and receiving the second image may involve receiving a second image having a second state of polarization, and separating the first and second images may involve receiving the first and second images at a sensor array having a first plurality of elements responsive to radiation of the first polarization state and a second plurality of elements responsive to radiation of the second polarization state.
The method may involve generating the first image having the first state of polarization and generating the second image having the second state of polarization.
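The simultaneous-capture separation described above can be sketched in code. This is a minimal illustration only: it assumes an interleaved-column sensor layout in which even-indexed columns respond to the first state of polarization and odd-indexed columns respond to the second state; the text does not specify how the two pluralities of sensor elements are arranged.

```python
def separate_polarized_images(frame):
    """Separate one simultaneously captured frame into two images.

    Assumes (illustratively; not specified in the text) that even-indexed
    columns of the sensor array respond to the first polarization state and
    odd-indexed columns respond to the second polarization state.
    """
    first_image = [row[0::2] for row in frame]    # columns 0, 2, 4, ...
    second_image = [row[1::2] for row in frame]   # columns 1, 3, 5, ...
    return first_image, second_image
```

Because both images are taken from a single frame, the first and second perspective views are captured at the same instant rather than alternated in time.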
Varying the extent may involve varying the extent of the first and second portions of the imaging path in response to a control signal.
The method may involve generating the control signal.
A location of the first perspective viewpoint may be defined by a first centroid location and a location of the second perspective viewpoint may be defined by a second centroid location, and generating the control signal may involve generating a control signal operable to cause the first and second centroids to move with respect to each other at a generally constant rate to provide a smooth change in the representation of the three dimensional spatial attributes. A location of the first perspective viewpoint may be defined by a first centroid location and a location of the second perspective viewpoint may be defined by a second centroid location, and varying the extent may involve varying the extent of the first and second portions of the imaging path between a first extent where the first and second centroid locations are proximally located, causing the first and second images to include predominantly two-dimensional spatial attributes within the field of view, and a second extent where the first and second centroid locations are spaced apart to cause the first and second images to include an increasing degree of three dimensional spatial attribute information.
Varying the extent of the first and second portions of the imaging path may involve varying the extent to provide a smooth transition from one of the first extent to the second extent to produce a two-dimensional to three-dimensional transition effect, and the second extent to the first extent to produce a three-dimensional to two-dimensional transition effect.
Receiving the first and second images may involve sequentially receiving a plurality of first and second images representing time variations of subject matter within the field of view.
In accordance with another aspect of the invention there is provided an apparatus for generating three dimensional image information using a single imaging path having an associated field of view. The apparatus includes provisions for selectively receiving first and second images through respective first and second portions of the single imaging path, the first portion having a first perspective viewpoint within the field of view and the second portion having a second perspective viewpoint within the field of view, the first and second images together being operable to represent three dimensional spatial attributes of objects within the field of view. The apparatus also includes provisions for varying an extent of the first and second portions of the imaging path to cause the first and second perspective viewpoints to change location while receiving the first and second images, the change in perspective viewpoint location providing a corresponding change in the representation of the three dimensional spatial attributes. The apparatus further includes provisions for compensating for changes in transmission through the first and second portions of the imaging path such that while varying the extent of the first and second portions, an image intensity associated with each of the first and second images is maintained at a generally uniform image intensity level.
In accordance with another aspect of the invention there is provided an apparatus for generating three dimensional image information. The apparatus includes a single imaging path having an associated field of view. The apparatus also includes an image modulator operably configured to cause first and second images to be selectively received through respective first and second portions of the single imaging path, the first portion having a first perspective viewpoint within the field of view and the second portion having a second perspective viewpoint within the field of view. The first and second images together are operable to represent three dimensional spatial attributes of objects within the field of view. The apparatus also includes a controller in communication with the modulator, the controller being operably configured to produce a signal operable to cause the modulator to vary an extent of the first and second portions of the imaging path to cause the first and second perspective viewpoints to change location while receiving the first and second images, the change in perspective viewpoint location providing a corresponding change in the representation of the three dimensional spatial attributes. The apparatus further includes a compensator operably configured to compensate for changes in transmission through the first and second portions of the imaging path such that while varying the extent of the first and second portions, an image intensity associated with each of the first and second images is maintained at a generally uniform image intensity level. 
The single imaging path may be operably configured to produce the first and second images at an image sensor and the compensator may be operably configured to compensate for the changes in the transmission by one of increasing an exposure associated with the image sensor in response to a reducing extent of the first and second portions of the imaging path, decreasing an exposure associated with the image sensor in response to an increasing extent of the first and second portions of the imaging path, increasing overall transmittance through the imaging path in response to a reducing extent of the first and second portions of the imaging path, and reducing overall transmittance through the imaging path in response to an increasing extent of the first and second portions of the imaging path.
The modulator may be operably configured to alternately block the first portion of the imaging path while receiving the second image, and block the second portion of the imaging path while receiving the first image.
The modulator may be operably configured to cause a blocking element located proximate an aperture plane of the image path to move between first and second positions in the image path to define the varying extent of the first and second portions of the imaging path.
The modulator may include an actuator for producing a force operable to alternately move the blocking element toward one of the first and second positions, a position sensor operably configured to produce a position signal representing a position of the blocking element, and the controller may be operably configured to control a magnitude of the force in response to the position signal to cause the blocking element to come to rest at the one of the first and second positions.
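The closed-loop positioning described above — a force actuator, a position sensor, and a controller that scales the force in response to the position signal — can be sketched as a simple proportional feedback loop. This is a hedged illustration only: the function names, the gain, and the crude simulated actuator dynamics below are assumptions for demonstration, not details from the text.

```python
def settle_blocking_element(read_position, apply_force, target,
                            gain=2.0, tolerance=0.01, max_steps=1000):
    """Drive the blocking element until its position signal reaches target.

    The force magnitude is controlled in response to the position signal:
    it is proportional to the remaining error, so it tapers toward zero as
    the element approaches the commanded position and comes to rest there.
    """
    for _ in range(max_steps):
        position = read_position()
        error = target - position
        if abs(error) < tolerance:
            apply_force(0.0)          # element at rest at the target position
            return position
        apply_force(gain * error)     # larger force while far from the target
    return read_position()

# Crude simulated actuator/sensor pair for demonstration only: the element's
# position integrates the applied force over a fixed time step.
_state = {"pos": 0.0, "force": 0.0}

def _read_position():
    _state["pos"] += _state["force"] * 0.01
    return _state["pos"]

def _apply_force(f):
    _state["force"] = f

final_position = settle_blocking_element(_read_position, _apply_force, target=1.0)
```

The same loop structure serves for alternating between the first and second positions: the controller simply commands a new target each time the modulator switches which portion of the imaging path is blocked.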
The modulator may include an optical element having first and second regions, the first and second regions being operably configured to be selectively actuated to selectively block the first and second portions of the imaging path.
The optical element may include a plurality of elements and the first region may include a first plurality of elements and the second region may include a second plurality of elements, the first and second pluralities being selected to vary the extent of the first and second portions of the imaging path.
Each element of the plurality of elements may be operable to be actuated in response to receiving an actuation signal, and may further include a modulator driver operably configured to generate the actuation signals to cause a number of elements in the first and second plurality of elements to be selectively varied to vary the extent of the first and second portions of the imaging path.
The modulator may be operably configured to selectively actuate first and second regions of a transmissive optical element disposed to transmit light through the respective first and second portions of the single imaging path. The modulator may include one of a liquid crystal element, and a light valve.
The modulator may be operably configured to selectively actuate first and second regions of a reflective optical element disposed to reflect light received through the respective first and second portions of the single imaging path. The modulator may include a light valve having a plurality of moveable mirror elements.
The modulator may be operably configured to simultaneously receive a first image having first image attributes and a second image having second image attributes, and separate the first and second images in accordance with the first and second image attributes to produce respective first and second image representations. The modulator may include a polarizer having first and second polarization regions operably configured to generate a first image having a first state of polarization and the second image having a second state of polarization, and may further include a sensor array having a first plurality of elements responsive to radiation of the first polarization state and a second plurality of elements responsive to radiation of the second polarization state, the sensor array being operable to separate the first and second images.
The modulator may be operably configured to vary the extent of the first and second portions of the imaging path in response to a control signal.
The controller may be operably configured to generate the control signal.
A location of the first perspective viewpoint may be defined by a first centroid location and a location of the second perspective viewpoint may be defined by a second centroid location, and the controller may be operably configured to generate the control signal by generating a control signal operable to cause the first and second centroids to move with respect to each other at a generally constant rate to provide a smooth change in the representation of the three dimensional spatial attributes. A location of the first perspective viewpoint may be defined by a first centroid location and a location of the second perspective viewpoint may be defined by a second centroid location, and the modulator may be operably configured to vary the extent of the first and second portions of the imaging path between a first extent where the first and second centroid locations are proximally located, causing the first and second images to include predominantly two-dimensional spatial attributes within the field of view, and a second extent where the first and second centroid locations are spaced apart to cause the first and second images to include an increasing degree of three dimensional spatial attribute information.
The modulator may be operably configured to vary the extent of the first and second portions of the imaging path by varying the extent to provide a smooth transition from one of the first extent to the second extent to produce a two-dimensional to three-dimensional transition effect, and the second extent to the first extent to produce a three-dimensional to two-dimensional transition effect.
The image path may be operably configured to receive the first and second images by sequentially receiving a plurality of first and second images representing time variations of subject matter within the field of view.
Other aspects and features of the present invention will become apparent to those ordinarily skilled in the art upon review of the following description of specific embodiments of the invention in conjunction with the accompanying figures.
BRIEF DESCRIPTION OF THE DRAWINGS
In drawings which illustrate embodiments of the invention,
Figure 1 is a top schematic view of an apparatus for generating three-dimensional image information in accordance with a first embodiment of the invention;
Figure 2 is a front schematic view of an imaging path of the apparatus shown in Figure 1;
Figure 3 is a perspective view of an optical imaging apparatus for generating three-dimensional image information in accordance with another embodiment of the invention;
Figure 4 is a cross-sectional view of the optical imaging apparatus shown in Figure 3, taken along the line 4-4;
Figure 5 is a representation of first and second images produced by the optical imaging apparatus shown in Figure 3;
Figure 6 is a block diagram of a controller for controlling operation of the optical imaging apparatus shown in Figure 3;
Figure 7 is a process flow chart depicting a control process implemented by the controller shown in Figure 6;
Figures 8A-8D are a series of representations of first and second images produced by the optical imaging apparatus shown in Figure 3;
Figure 9 is a perspective view of a liquid crystal modulator used in the optical imaging apparatus shown in Figure 3;
Figure 10 is a schematic view of a spatial modulator in accordance with an alternative embodiment of the invention;
Figure 11 is a graphical depiction of control signals for controlling the spatial modulator shown in Figure 10;
Figure 12 is a perspective view of an alternative embodiment of an actuator for use in the spatial modulator shown in Figure 10; and
Figure 13 is a perspective view of an alternative embodiment of an optical imaging apparatus for generating three-dimensional image information.
DETAILED DESCRIPTION
Referring to Figure 1, an apparatus according to a first embodiment of the invention for generating three-dimensional image information is shown in schematic top view generally at 100. The apparatus 100 includes a single imaging path 102 having an associated field of view 104, which in this embodiment includes an object 106. The apparatus 100 also includes an image modulator 108 operably configured to cause first and second images (shown schematically as "A" and "B" in Figure 1) to be selectively received through respective first and second portions 112 and 114 of the single imaging path 102.
Referring to Figure 2, in this embodiment the imaging path is circular and the first and second portions 112 and 114 each generally comprise a circular segment. The first portion 112 defines a first perspective viewpoint within the field of view 104, which is represented by a first centroid 116. The second portion 114 defines a second perspective viewpoint within the field of view 104, which is represented by a second centroid 118. In other embodiments the imaging path may be non-circular.
The apparatus 100 also includes a controller 120 in communication with the modulator 108. The controller 120 includes an output 122 for producing a control signal operable to cause the modulator 108 to vary an extent of the first and second portions 112 and 114 of the imaging path, thereby causing the first and second perspective viewpoints 116 and 118 to change location while receiving the first and second images. The change in location of perspective viewpoints 116 and 118 provides a corresponding change in the representation of the three dimensional spatial attributes of the object 106 within the field of view 104.
The apparatus 100 also includes a compensator 124. The compensator 124 is operably configured to compensate for changes in transmission through the first and second portions 112 and 114 of the imaging path 102 such that while varying the extent of the first and second portions, an image intensity associated with each of the first and second images A and B is maintained at a generally uniform image intensity level.
The first and second images A and B are formed at an image plane 126, and together the first and second images are operable to represent three dimensional spatial attributes of the object 106, and other objects within the field of view 104. In this embodiment the controller 120 also includes an input 128 for receiving user input of a desired change in perspective and the controller is operably configured to produce the control signal at the output 122 in response to the user input.
In one embodiment the imaging path 102 may be an optical imaging path operable to receive light radiation for producing the images. The light radiation may have a wavelength range in the infrared, visible, and/or ultraviolet wavelength ranges. In other embodiments the imaging path 102 may be operable to produce images in response to receiving acoustic, ultrasonic, or radio frequency signals. The image at the image plane 126 may be captured by any suitable image capture device using any of a variety of recording methods and/or media. For example, the image capture device may be a still camera or movie camera having a photosensitive film or a charge coupled device (CCD) array for recording the images. Alternatively, a piezoelectric crystal array may be used for acoustic or ultrasonic imaging, and an antenna or antenna array may be used for radio frequency imaging, for example.
Advantageously, the single image path 102 produces A and B images from which 3D information can be perceived and/or extracted without requiring any special alignments other than would normally be required in assembling the image path. In contrast, when using separate image paths or an image path that optically divides into two spaced apart image paths, there is a significant alignment challenge and any minor misalignment may cause eyestrain or other uncomfortable effects for users.
Optical imaging embodiment
Referring to Figure 3, an optical imaging apparatus embodiment for generating three-dimensional image information is shown generally at 150. The optical imaging apparatus 150 includes a single imaging path 152, having a first lens 154 and a second lens 156 disposed to receive light rays from an object 158 within a field of view of the first and second lenses.
The optical imaging apparatus 150 also includes a liquid crystal device (LCD) modulator 160 having a plurality of elements 162. Each element 162 defines a columnar portion of a front surface area 164 of the modulator 160 that may be selectively controlled to alternately block a first portion 165 of the imaging path 152 while receiving a first image, and a second portion 167 of the imaging path while receiving a second image. The modulator 160 also includes a plurality of control inputs 166, each element 162 having an associated control input for receiving an actuation signal for selectively actuating the element.
The optical imaging apparatus 150 further includes a camera 170 having a third lens 172 and a CCD image sensor 174 located at an image plane of the camera 170. The camera may be a still camera or a video camera and may be sensitive to visible or non-visible light. The third lens 172 gathers light transmitted by the modulator 160 and forms an image on the image sensor 174. The image sensor 174 includes a photo-sensitive area 176, and one or more control inputs 178 for receiving various control signals operable to control operations of the sensor related to capturing the image. In general the image sensor 174 has a spatial array of photosensitive elements that accumulate charges in proportion to incident light on the element. The accumulated charge may be read out of the image sensor 174 by serially shifting the charges through adjacent coupled elements to a charge amplifier, which converts the charges into a voltage signal representing the light incident on the associated element. In another embodiment, the image sensor 174 may be a complementary metal-oxide-semiconductor (CMOS) active-pixel sensor, or other electronic image sensor. Alternatively, the image sensor 174 may be a photosensitive film emulsion, such as 35mm film for example.
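The serial CCD readout described above can be sketched as follows. This is an illustrative model only: the charge-to-voltage gain and the assumption that the element nearest the amplifier shifts out first are stand-ins, not details taken from the text.

```python
def read_out_ccd_row(row_charges, charge_to_voltage_gain=0.5):
    """Serially read out one row of accumulated CCD charges.

    Charges are shifted through adjacent coupled elements toward a charge
    amplifier, which converts each charge into a voltage proportional to
    the light incident on the associated element. The gain is illustrative.
    """
    shift_register = list(row_charges)
    voltages = []
    while shift_register:
        charge = shift_register.pop()   # element nearest the amplifier shifts out
        voltages.append(charge * charge_to_voltage_gain)
    return voltages
```

Each accumulated charge thus emerges one element at a time, which is why full-frame CCD architectures need a shutter or gating signal to prevent further light accumulation during readout.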
The image sensor 174, third lens 172, liquid crystal modulator 160, first lens 154, second lens 156, and the camera 170 are all aligned along an optical axis 180.
The apparatus 150 is shown in cross section in Figure 4, with the imaging path 152 being illuminated by a bundle or cone of rays 190 emanating from an off-axis point on the object 158. In general, for an optical system such as that shown in Figure 3 and Figure 4, a diameter of one of the optical elements will limit which rays in the bundle 190 can pass through the optical system, and this diameter defines the system aperture. The image of the system aperture by optical surfaces located between the aperture and the object 158 defines a location and extent of an entrance pupil for the system. The entrance pupil in turn defines a bundle of rays that are able to pass through the imaging path.
In this case, it will be assumed that the first lens 154 is the system aperture and thus also the entrance pupil, and rays that impinge on the first lens will be transmitted through the imaging path 152. In other embodiments, the entrance pupil may be located in front of or behind the first lens 154, depending on the configuration of the lenses.
Rays in the bundle 190 that enter the first lens 154 are thus focused through the second lens 156 and impinge on the front surface area 164 of the modulator 160. When a partial occlusion such as the actuated first portion 165 of the modulator 160 is located after the system aperture in the imaging path 152, vignetting of the image occurs. In this case, rays 192 in the bundle of rays 190 are blocked by the first portion 165 of the front surface area 164 of the modulator 160 and do not reach the photo-sensitive area 176 of the sensor 174. Rays 194 in the bundle of rays 190 pass through the modulator 160, and are focused onto the photo-sensitive area 176 by the lens 172.
Vignetting reduces the overall illumination of the image formed at the photosensitive area 176 of the sensor 174. However, since the rays 194 intersect at the photo-sensitive area 176 of the sensor 174, a real image is formed at the sensor. Furthermore, the vignetting caused by the modulator does not change the angle of view at the entrance pupil.
Other points on the object 158 will be similarly imaged to produce a first image of the object 158 on the photo-sensitive area 176 of the sensor 174. The first image produced by the optical imaging apparatus 150 under the vignetting conditions shown in Figure 4 is shown generally at 200 in Figure 5.
The image 200 corresponds to a right perspective viewpoint represented by a centroid 182 (shown in Figure 1). Similarly, by providing control signals to the modulator 160 to cause the first portion 165 to transmit light while controlling a plurality of elements 162 on an opposite side of the modulator 160 to block light, a second image 202 (shown in Figure 5) is produced by the apparatus 150. The second image 202 corresponds to a left perspective viewpoint represented by a centroid 184 (shown in Figure 1).
The first and second images 200 and 202 together include information representative of three dimensional spatial attributes of objects within the field of view. For example, a user separately viewing the image 200 using their right eye while viewing the image 202 using their left eye will be able to perceive a similar depth effect that would be perceptible if the user were to view the object directly. In one embodiment the images may be separately directed to the respective left and right eyes of the user using a pair of stereoscopic viewing glasses, for example.
Controller
A controller for controlling operation of the optical imaging apparatus 150 (shown in Figure 3) is shown at 220 in Figure 6. The controller 220 includes an output 222 for producing a synchronization signal (SYNC), which typically comprises a pulse train. The output 222 is in communication with the image sensor input 178 for synchronizing the image capture at the image sensor 174. The controller 220 also includes an output 224 for producing a compensation signal (COMP) for controlling image intensity compensation. In the embodiment shown, the output 224 is in communication with the image sensor input 178 and the image sensor acts as the compensator 124 shown in Figure 1. In other embodiments the COMP signal produced at the output 224 may be used to control an aperture stop compensator such as an adjustable iris in the optical system to reduce or increase the bundle of rays accepted by the imaging path. Electronically controlled auto-iris diaphragms are commonly used in cameras that automatically select an aperture size and exposure to ensure correct image exposure.
The controller 220 further includes a modulator driver 226 having an output 228 for driving the control input 166 of the modulator 160. In the embodiment shown, the output 228 has "n" output channels corresponding to the number of elements 162 on the modulator 160. The controller 220 also includes an input 230 for receiving a change perspective (CP) user input. For example, the CP input 230 may be provided from a biased single-pole-double-throw switch configured to provide a varying potential at the input.
In one embodiment the controller 220 may be implemented using a processor circuit such as a micro-controller, for example.
Controller operation
The operation of the controller 220 in controlling operation of the optical imaging apparatus 150 is described further with reference to Figure 3, Figure 6, and Figure 7. Referring to Figure 7, one embodiment of a control process implemented by the controller 220 is shown generally at 250.
As shown at 252, the process begins with the controller 220 detecting a signal state associated with the CP signal at the input 230. As shown at 254, if the CP signal has changed state, indicating that the user wishes to change the image perspective, then the process continues at 256.
As shown at 256, the controller then produces a predicted light loss or gain in response to the CP signal. In embodiments where the controller 220 is implemented using a microcontroller, the predicted light loss or gain may be computed for the detected CP signal state change at the input 230. Alternatively, the predicted light loss or gain may be pre-determined and stored as a look-up table in a memory of the processor circuit. The predicted light loss or gain is then used to produce a compensation signal (COMP) at the output 224 of the controller, in a format suitable for driving the particular image sensor 174. For example, in an embodiment where the image sensor 174 comprises a full frame CCD architecture, the amount of light captured by the CCD array may be controlled by a mechanical shutter (not shown) proximate the focal plane and the COMP signal would then be configured to cause the mechanical shutter to operate with a suitable shutter speed to produce a desired image intensity. Alternatively, for frame-transfer or interline transfer CCD devices, the COMP signal may be a gating signal for gating a light accumulation phase of image capture such that the CCD elements are only configured to receive light for a portion of the time between successive image captures. Some CCD sensors also permit adjustment of a gain associated with either the analog charge amplification and/or the analog-to-digital conversion of the charge signals, and this gain may also be controlled by the COMP signal to compensate for the intensity of the first and second images.
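The look-up table approach can be sketched in a few lines. The table values and names below are hypothetical illustrations: the fraction of light transmitted for each modulator state is assumed known in advance, and the exposure (shutter time, gating interval, or gain) is scaled by its reciprocal so image intensity stays uniform.

```python
# Hypothetical look-up table: modulator state index (how far the centroids
# have moved apart) -> fraction of light still reaching the sensor.
TRANSMISSION_TABLE = {0: 1.0, 1: 0.75, 2: 0.5, 3: 0.375}

def comp_exposure(state_index, base_exposure=1.0):
    """Return the exposure setting the COMP signal should command.

    The predicted light loss for a state is (1 - transmitted fraction);
    the exposure is boosted by the reciprocal of the transmitted fraction
    to hold the captured image intensity at a generally uniform level.
    """
    transmitted = TRANSMISSION_TABLE[state_index]
    return base_exposure / transmitted
```

Because effective intensity is (transmitted fraction) x (exposure), the product is the same for every state in the table.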
The process then continues at 258.
If at 254, the CP signal has not changed state then there is no light loss/gain to compensate for and the process continues directly at 258.
As shown at 258, the modulator 160 is configured for the first image capture in accordance with the CP signal state, which involves configuring the modulator to drive a first plurality of the n-channels 228 to cause a first plurality of the elements 162 of the modulator 160 to be controlled to block light. At 260, capture of the first image is initiated when the controller produces a SYNC pulse at the output 222. The captured first image may be recorded in analog or digital format on an image storage medium (not shown) such as magnetic tape, a memory, a hard drive, or a photosensitive emulsion, for example. As shown at 262, the modulator 160 is then configured for the second image capture, by configuring the modulator to drive a second plurality of the n-channels of the output 228 to cause a second plurality of the elements 162 of the modulator 160 to be controlled to block light. At 264, capture of the second image is initiated when the controller produces a second SYNC pulse at the output 222.
For still image capture, only a single image from each of the first and second perspective viewpoints 182 and 184 is required, and in this case the SYNC signal would produce first and second time-separated synchronization pulses.
The time-separation between pulses is selected to provide sufficient time for the image sensor 174 to accumulate photons sufficient to produce an image. For capture of variations in a scene in the form of sequential video images, a frame rate may be imposed by a selected video format (e.g. 29.97 frames per second for NTSC video), in which case the SYNC signal may comprise a plurality of pulses at time intervals of about 33.3 milliseconds, for a non-interlaced image capture. Where the image acquisition rate of a particular camera is sufficiently fast, the first and second images may be captured at time intervals of 16.7 milliseconds such that each eye of the user receives the respective images at the full NTSC frame rate. When capturing successive video frames, following block 264 the process continues at block 252 and the process 250 is repeated.
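The alternating capture sequence of blocks 258-264 can be sketched as follows. The callback names are stand-ins for the modulator driver and SYNC output; the interval constant simply restates the NTSC timing arithmetic from the text (half of one ~33.3 ms frame period per capture).

```python
# Interval between successive A and B captures when each eye is to receive
# images at the full ~29.97 Hz NTSC frame rate: about 16.7 milliseconds.
CAPTURE_INTERVAL_S = 1.0 / 29.97 / 2.0

def capture_image_pair(configure_modulator, emit_sync_pulse):
    """Capture one first/second image pair by alternately blocking each side."""
    sequence = []
    for blocked_side in ("first_portion", "second_portion"):
        configure_modulator(blocked_side)   # drive the n channels for this side
        emit_sync_pulse()                   # trigger capture at the image sensor
        sequence.append(blocked_side)
    return sequence
```

For continuous video the pair capture simply repeats, which mirrors the process returning to block 252 after block 264.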
A series of representations of the imaging path 152 depicting a change in perspective are shown in Figures 8A to 8D. Referring to Figure 8A, the imaging path 152 is depicted in end view and the centroids 182 and 184 are located on a central axis of the imaging path. Under these conditions, neither the first nor the second portions of the modulator 160 are controlled to block light and the first image (A) and the second image (B) are identical. A user viewing the respective A and B images using respective left and right eyes will perceive only a standard two-dimensional (2D) image, with no three-dimensional (3D) depth perception being possible.
Referring to Figure 8B, the centroids 182 and 184 have now moved outwardly and, under these conditions, the first and second portions 165 and 167 of the modulator 160 are alternately controlled to block light such that transmission occurs alternately through portions 300 and 302 of the imaging path 152. The resulting A and B images have slightly differing perspective viewpoints and the user viewing the respective A and B images will be able to perceive at least some 3D depth due to the differing perspectives of the images presented to each eye.
Referring to Figure 8C, the centroids 182 and 184 have moved further outwardly such that transmission occurs alternately through portions 300 and 302 of the imaging path 152. The resulting A and B images have more widely differing perspective viewpoints than in Figure 8B and the user viewing the respective A and B images will be able to perceive a greater degree of 3D depth. Referring to Figure 8D, the centroids 182 and 184 are spaced apart to an extent where a region 304 of the imaging path 152 is blocked either by the first portion 165 of the modulator 160 or by the second portion 167 of the modulator. Transmission occurs alternately through the portions 300 and 302 of the imaging path 152. The resulting A and B images have still more widely differing perspective viewpoints than in Figure 8C, providing an even greater degree of 3D depth perception.
Clearly, between Figure 8A and Figure 8D the amount of light transmitted through the modulator 160 is successively reduced. However, the light reduction is accompanied by a corresponding increase in exposure in response to the COMP signal, thereby producing a perception of a generally uniform image intensity level. Advantageously, the apparatus 150 facilitates a smooth change in perspective from a 2D to a 3D image representation in the resulting images. The captured A and B images may be viewed using a specially adapted 3D display system that uses special eyewear or headgear to present the different A and B images to the user's left and right eyes.
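The exposure compensation can be sketched as scaling exposure inversely with the open fraction of the imaging path; the inverse-proportional relation is an assumption, since the text states only that exposure increases as transmission falls:

```python
# Sketch: exposure compensation keeping perceived image intensity
# uniform as the modulator blocks more of the imaging path. The
# inverse-transmission relation and all names are assumptions.

def compensated_exposure(base_exposure_ms, open_fraction):
    """Scale the exposure time inversely with the fraction of the
    imaging path left open, so the light energy reaching the sensor
    stays roughly constant."""
    if open_fraction <= 0.0:
        raise ValueError("imaging path fully blocked")
    return base_exposure_ms / open_fraction

# Half the path blocked -> exposure doubled.
assert compensated_exposure(10.0, 0.5) == 20.0
```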
Alternatively, the images may be displayed using an auto-stereoscopic display capable of displaying 3D image information that can be viewed without the use of special glasses or headgear. In general, where the modulator 160 has a rectangular cross section, a rate of change in location of the centroids 182 and 184 will not vary linearly with a rate of change of area of the first and second portions 165 and 167. Accordingly, to provide a smooth transition between the images shown in Figure 8A to Figure 8D, the controller may be configured to cause the first and second centroids 182 and 184 to move with respect to each other at a generally constant rate to provide a smooth change in the representation of the three dimensional spatial attributes. The non-linear relation between the location of the centroids 182 and 184 and the area of the first and second portions 165 and 167 may be stored in the controller as a look-up table, for example.
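The non-linear relation between blocking extent and centroid location can be tabulated numerically. The sketch below assumes a circular imaging-path cross-section and a simple strip integration; the names, the discretization, and the unit radius are all illustrative:

```python
# Sketch: building a look-up table from blocking-edge position to the
# centroid of the transmitting portion, assuming a circular
# imaging-path cross-section of unit radius. Names are illustrative.

def centroid_of_open_region(edge, radius=1.0, strips=1000):
    """Centroid x of the aperture region with x > edge, i.e. the
    transmitting portion when everything left of `edge` is blocked."""
    total_area = 0.0
    moment = 0.0
    width = (radius - edge) / strips
    for i in range(strips):
        x = edge + width * (i + 0.5)               # midpoint of strip
        h = 2.0 * (radius * radius - x * x) ** 0.5  # chord height at x
        total_area += h * width
        moment += x * h * width
    return moment / total_area

def build_lut(steps=5):
    """Tabulate edge position -> centroid offset; the spacing of the
    centroids is visibly non-linear in the edge position, which is why
    a look-up table (rather than a linear map) is useful."""
    return [(e / steps, centroid_of_open_region(e / steps))
            for e in range(steps)]
```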
LCD modulator
The LCD modulator 160 is shown in greater detail in Figure 9. Referring to
Figure 9, the modulator 160 includes a liquid crystal material layer 350 disposed between a first glass plate 352 and a second glass plate 354. The first glass plate 352 includes a plurality of transparent electrodes 356 arranged in columns. The electrodes 356 define an extent of each of the plurality of elements 162 shown in Figure 3. Each electrode 356 has an associated connector 358, which may be a wire-bonded or flexible circuit connection, for example. The connectors 358 connect to a header 360, which in turn facilitates connection to the output 228 of the controller 220 shown in Figure 6. The second glass plate 354 includes a transparent area electrode (not shown) which acts as a common electrode for all elements 162. The modulator 160 also includes a first polarizer 362, having a first linear polarization property (in this case vertical polarization). The first polarizer 362 overlays the first electrodes 356. The modulator 160 further includes a second polarizer 364 overlaying the second electrode and having a second linear polarization property (in this case horizontal polarization). In Figure 9 the layers are not shown to scale. The modulator driver 226 provides a drive voltage to each electrode 356 via the header 360 and connectors 358, with the common electrode acting as a ground connection. In one embodiment the drive voltage may be a 50% duty cycle square wave varying between voltages V+ and V-, where the voltages are selected within a range of safe operating voltages to provide sufficient contrast between transmission and blocking of light impinging on the LCD modulator 160.
The first polarizer 362 transmits light having a vertical polarization. In this embodiment the liquid crystal material 350 is selected so that in its relaxed phase (un-actuated) the polarization of light passing through the crystal is unaffected and the second polarizer 364 thus blocks the light. When actuated by the drive voltage applied to any of the electrodes 356, a portion of the liquid crystal material underlying the electrode causes the light to undergo a 90° change in polarization, thus passing through the modulator 160. By alternately generating drive signals for first and second pluralities of the electrodes 356, the modulator 160 alternately blocks light at the first and second portions 165 and 167 respectively. By subsequently changing a number of electrodes 356 that receive actuation signals, an extent of the first and second portions of the imaging path 152 may be varied to cause the first and second perspective viewpoints represented by the centroids 182 and 184 in Figure 3 to change location. Advantageously, providing a sufficient number of electrodes 356 facilitates a generally smooth variation in perspective, thereby preventing a visually disturbing transition from 2D to 3D imaging.
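The alternating actuation described above can be sketched as generating per-column drive states; the electrode count, names, and the convention that True denotes an actuated (transmitting) column are illustrative assumptions:

```python
# Sketch: per-electrode-column actuation states for alternating A/B
# image capture. True = actuated (transmitting in the embodiment
# described above); counts and names are illustrative.

def actuation_pattern(n_electrodes, blocked_columns, frame):
    """Return actuation states for one captured frame.

    For even frames (image A) the leftmost `blocked_columns` columns
    are left un-actuated (blocking); for odd frames (image B) the
    rightmost columns block instead. Increasing `blocked_columns`
    moves the A and B perspective centroids further apart.
    """
    if frame % 2 == 0:
        return [i >= blocked_columns for i in range(n_electrodes)]
    return [i < n_electrodes - blocked_columns for i in range(n_electrodes)]
```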
In an alternative embodiment the polarizers 362 and 364 may both be vertically polarized, such that the LCD modulator is transmissive when no actuation voltage is applied. When actuated by the drive voltage, the liquid crystal material causes the light to undergo a 90° change in polarization, thus causing elements 356 to block transmission of light.

Spatial modulator embodiment
Referring to Figure 10, in an alternative embodiment the modulator 160 shown in Figure 3 may be implemented using the spatial modulator shown generally at 380. The spatial modulator 380 includes an opaque shutter blade 382 mounted on an arm 384. The arm 384 is mounted on a pivot 386 to provide for side-to-side motion. The arm 384 also includes a magnet 390 mounted partway along the arm. The magnet 390 is disposed between first and second electromagnets 392 and 394. The shutter blade 382, arm 384, pivot 386, and the electromagnets 392 and 394, together make up a mechanical actuator operable to produce a force for moving the shutter blade 382 from side-to-side in the direction of the arrow 388 between a first position shown at 382 and a second position shown in broken outline at 383. The first and second positions 382 and 383 define a varying extent of the first and second portions of the single imaging path 152 shown in Figure 3. The spatial modulator further includes a position sensor 396 located behind the arm 384. The position sensor 396 includes an output 398 for producing a position signal representative of a position of the arm 384 with respect to the position sensor. In one embodiment the position sensor 396 may be implemented using a linear photodiode array where either background stray light or illumination from a source such as a light emitting diode (not shown) casts a shadow on the array. The location of the shadowed array elements may be read out from the photodiode array at the output 398 and various interpolation methods used to determine a center location of the arm 384.
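The sub-element interpolation mentioned above can be sketched as an intensity-weighted mean over the shadowed photodiode elements; the threshold and all names are illustrative assumptions:

```python
# Sketch: estimating the arm's center position from a linear
# photodiode array on which the arm casts a shadow (low readings).
# Threshold and names are illustrative.

def shadow_center(readings, threshold=0.5):
    """Intensity-weighted center of the shadowed elements.

    Elements whose reading falls below `threshold` are treated as
    shadowed; the center is interpolated as the weighted mean index,
    giving sub-element resolution.
    """
    weights = [max(threshold - r, 0.0) for r in readings]
    total = sum(weights)
    if total == 0.0:
        return None  # no shadow detected
    return sum(i * w for i, w in enumerate(weights)) / total

# A shadow covering elements 3 and 4 interpolates to 3.5.
center = shadow_center([1, 1, 1, 0.1, 0.1, 1, 1, 1])
```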
For driving the spatial modulator 380, the modulator driver 226 shown in Figure 6 may be replaced by the modulator driver 400 shown in Figure 10.
The modulator driver 400 includes a first pair of outputs 402 for driving a coil 404 of the first electromagnet 392 and a second pair of outputs 406 for driving a coil 408 of the second electromagnet 394. The modulator driver 400 also includes an input 410 for receiving the position signal from the position sensor 396. The modulator driver 400 further includes an input 412 for receiving a reference signal representing the desired alternate positions of the arm 384. The reference signal defines an alternating target position for the arm 384 and shutter blade 382 and may be generated by the controller 220 in response to the CP signal.
The spatial modulator 380 and modulator driver 400 together implement a feedback control loop for producing alternating motion of the arm 384 and shutter blade 382 to vary an extent of blocking of the image path (shown in broken outline at 152). In operation, the reference signal received at the input 412 of the modulator driver 400 provides a target position of the arm 384, while the position signal received at the input 410 represents the actual position of the arm and may be used to produce an error signal for driving the modulator driver 400. The feedback control loop thus produces drive signals at the outputs 402 and 406 to cause the electromagnets 392 and 394 to exert drive forces on the arm 384 to move toward a desired position.
Advantageously, the drive may be implemented as a push-pull driver where one of the electromagnets 392 and 394 provides an attractive force on the magnet 390, while the other of the electromagnets provides a repulsive force. Exemplary waveforms of the current drive provided to the coils 404 and 408 to cause the arm 384 to move toward the first electromagnet 392 are shown graphically in Figure 11. The current waveform through the coil 404 is shown at 440 and the current waveform through the coil 408 is shown at 442. The alternating target positions provided by the reference signal REF at the input 412 are S1 and S2 respectively.
During a first time period 444, the error signal derived from the difference between the target position and the current position is large, causing the current 440 to increase rapidly to produce an attractive force on the arm 384. The attractive force overcomes the inertia of the arm 384 and causes the arm to accelerate away from the second electromagnet 394. The instantaneous position s of the arm 384 produced at the position sensor output 398 is graphically depicted at 446 in Figure 11, where a position midway between the electromagnets 392 and 394 is shown at s = 0 on the graph and the target position is S2. During the time period 444 the current 442 is initially at zero and once the arm 384 begins to accelerate, the current 442 increases rapidly to provide a decelerating force as the desired arm position S2 is approached. The arm 384 comes to rest at the position S2 and is held in place at this position by a holding current in each of the coils 404 and 408, which is continuously adjusted by the feedback control loop to maintain the arm 384 in the position S2 for a second time period 448. The second time period 448 provides sufficient time to complete capture of the first image.
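The behaviour of the feedback loop can be sketched as a small simulation. The proportional-derivative control law, the gains, and the unit-mass dynamics below are all assumptions; the description specifies only that an error signal drives equal-and-opposite push-pull currents that accelerate, decelerate, and then hold the arm:

```python
# Sketch: push-pull feedback loop driving the shutter arm toward a
# target position. The PD law, gains, and unit-mass dynamics are
# assumptions, not taken from the original disclosure.

def pushpull_drive(position, velocity, target, kp=5.0, kd=2.0):
    """Return (current_440, current_442) for coils 404 and 408.

    The currents are equal and opposite: one coil attracts the magnet
    on the arm while the other repels it, mirroring the push-pull
    arrangement described for the modulator driver 400.
    """
    effort = kp * (target - position) - kd * velocity
    return (effort, -effort)

def simulate(target, steps=200, dt=0.01):
    """Integrate the arm motion (semi-implicit Euler, unit mass and
    unity force constant assumed); the arm settles near `target`."""
    pos, vel = 0.0, 0.0
    for _ in range(steps):
        i440, i442 = pushpull_drive(pos, vel, target)
        force = i440 - i442  # net force from the push-pull pair
        vel += force * dt
        pos += vel * dt
    return pos
```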
The reference signal at the input 412 then changes, defining the position S1 as the new target position. During a third time period 450, the current 442 changes polarity and increases rapidly to produce an attractive force causing the arm 384 to overcome its inertia and accelerate away from the first electromagnet 392. During the third time period 450 the current 440 is initially allowed to fall to zero and once the arm 384 begins to accelerate, the current 440 increases rapidly to provide a decelerating force as the target position S1 is approached. The arm 384 comes to rest at the position S1 and is held in place at this position by a holding current in each of the coils 404 and 408, which is continuously adjusted by the feedback control loop to maintain the arm 384 in the position S1 for a fourth time period 452. The fourth time period 452 provides sufficient time to complete capture of the second image.

Referring to Figure 12, an alternative embodiment of the actuator portion of the spatial modulator 380 (shown in Figure 10) is shown generally at 500. The actuator 500 includes a motor portion 502 and a rotary position sensor portion 504. A common rotor shaft 506 extends through the motor and position sensor portions 502 and 504. The arm 384 is mounted to the shaft for side-to-side motion. In general the motor portion 502 provides a drive force for moving the arm 384, while the position sensor portion 504 provides a position signal.
In one embodiment, the motor portion 502 is implemented using a pair of magnets 508 and 510, and the sensor portion 504 is implemented using a pair of magnets 512 and 514. The shaft 506 supports an actuator coil 516 between the magnets 508 and 510. The actuator coil 516 is coupled to the modulator driver output 402 for receiving a drive current, which causes a torque to be generated on the coil and thus applied to the shaft 506. The sensor portion 504 also includes a pickup coil (not shown) located between the magnets 512 and 514. The pickup coil generates a current signal proportional to rotary displacement, which may be used as the position signal at the input 410. In general, the actuator 500 operates in a manner similar to an analogue meter movement.
In other embodiments, the motor portion 502 may be configured such that the shaft 506 is magnetized and the coil is wound around pole pieces (i.e. 508 and 510). Similarly, the pickup coil of the sensor portion 504 may be wound around pole pieces (i.e. 512 and 514).

Light valve embodiment
Referring to Figure 13, an alternative embodiment of the optical image apparatus (shown in Figure 3) is shown generally at 550. The apparatus 550 includes a single imaging path 552, having a first lens 554 and a second lens 556 disposed to receive light rays from an object 558 within a field of view of the first and second lenses. The apparatus 550 includes a light valve modulator 560, having a plurality of individually actuated mirror elements 562 disposed to direct a beam of light through a lens 568 when actuated. In an un-actuated state the mirror elements 562 direct the beam of light away from the lens 568. By providing drive signals to the modulator 560 to activate first and second groups of the mirror elements 562, the modulator may be actuated in the alternating manner described earlier in connection with the modulator 160 shown in Figure 3.

Other embodiments
In an alternative embodiment, in the LCD modulator 160 shown in Figure 9, the second polarizer 364 may be omitted to configure the modulator to selectively change the polarization of the transmitted light. Referring to Figure 9, the first polarizer 362 only transmits light having a vertical polarization. Portions of the liquid crystal material 350 underlying un-actuated electrodes
356 thus have no effect on the polarization of the light, which is transmitted as vertically polarized light. Portions of the liquid crystal material 350 underlying actuated electrodes 356 cause the light to undergo a 90° change in polarization, thus causing transmitted light to have a horizontal polarization.
Using such an alternately configured modulator in the optical imaging apparatus 150 shown in Figure 3 results in a first image having vertical polarization and a second image having horizontal polarization. Alternatively, the liquid crystal material 350 of the LCD modulator 160 may be configured to produce a first image having right circularly polarized light and a second image having left circularly polarized light. The sensor 174 may be configured to simultaneously receive the respective first and second images by adding polarizing elements in front of individual sensor array elements. For example, adjacent sensor pixels may be alternately horizontally polarized and vertically polarized to provide polarization-selective pixels that are sensitive to only one polarization orientation. The sensor would thus permit both the first and second images to be simultaneously received. The first and second images may be separated during readout of the array or in a separate processing step.
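Separating the two simultaneously captured images can be sketched as below; the column-alternating pixel layout is an assumption for illustration, since the exact arrangement of the polarization-selective pixels is not specified:

```python
# Sketch: separating simultaneously captured A and B images from a
# sensor whose pixels alternate between the two polarization
# orientations. A column-alternating layout is assumed here; the
# actual pixel arrangement is not specified in the description.

def separate_images(frame):
    """Split a 2-D list of pixel values into the two polarized
    sub-images by taking alternate columns of each row."""
    image_a = [row[0::2] for row in frame]  # even columns: polarization A
    image_b = [row[1::2] for row in frame]  # odd columns: polarization B
    return image_a, image_b
```

Each sub-image has half the horizontal resolution of the sensor; interpolating the missing columns would be a natural follow-on processing step.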
While specific embodiments of the invention have been described and illustrated, such embodiments should be considered illustrative of the invention only and not as limiting the invention as construed in accordance with the accompanying claims.

Claims

What is claimed is:
1. A method of generating three dimensional image information using a single imaging path having an associated field of view, the method comprising: selectively receiving first and second images through respective first and second portions of the single imaging path, said first portion having a first perspective viewpoint within the field of view and said second portion having a second perspective viewpoint within the field of view, said first and second images together being operable to represent three dimensional spatial attributes of objects within the field of view; varying an extent of said first and second portions of the imaging path to cause said first and second perspective viewpoints to change location while receiving said first and second images, said change in perspective viewpoint location providing a corresponding change in said representation of said three dimensional spatial attributes; and compensating for changes in transmission through said first and second portions of the imaging path such that while varying said extent of the first and second portions, an image intensity associated with each of said first and second images is maintained at a generally uniform image intensity level.
2. The method of claim 1 wherein selectively receiving said first and second images comprises receiving said first and second images at an image sensor and wherein compensating for said changes in said transmission comprises one of: increasing an exposure associated with said image sensor in response to a reducing extent of the first and second portions of the imaging path; decreasing a gain associated with said image sensor in response to an increasing extent of the first and second portions of the imaging path; increasing overall transmittance through the imaging path in response to a reducing extent of the first and second portions of the imaging path; and reducing overall transmittance through the imaging path in response to an increasing extent of the first and second portions of the imaging path.
3. The method of claim 1 wherein selectively receiving said first and second images comprises alternately: blocking said first portion of the imaging path while receiving said second image; and blocking said second portion of the imaging path while receiving said first image.
4. The method of claim 3 wherein alternately blocking said first and second portions of the imaging path comprises causing a blocking element located proximate an aperture plane of the image path to move between first and second positions in said image path to define said varying extent of said first and second portions of the imaging path.
5. The method of claim 4 wherein causing said blocking element to move comprises: producing a force operable to alternately move said blocking element toward one of said first and second positions; receiving a position signal representing a position of said blocking element; and controlling a magnitude of said force in response to said position signal to cause the blocking element to come to rest at said one of said first and second positions.
6. The method of claim 3 wherein alternately blocking said first and second portions of the imaging path comprises selectively actuating first and second regions of an optical element located proximate an aperture plane of the image path to selectively block said first and second portions of the imaging path.
7. The method of claim 6 wherein said optical element comprises a plurality of elements and wherein selectively actuating said first and second regions comprises selectively actuating one of: a first plurality of elements in said plurality of elements; and a second plurality of elements in said plurality of elements.
8. The method of claim 7 wherein each element of said plurality of elements is operable to be actuated in response to receiving an actuation signal, and wherein varying said extent of said first and second portions of the imaging path comprises generating actuation signals to cause a number of elements in said first and second plurality of elements to be selectively varied to vary said extent of said first and second portions of the imaging path.
9. The method of claim 6 wherein selectively actuating said first and second regions of said optical element comprises selectively actuating first and second regions of a transmissive optical element disposed to transmit light through said respective first and second portions of the single imaging path.
10. The method of claim 9 wherein selectively actuating first and second regions of said transmissive optical element comprises selectively actuating first and second regions of one of a liquid crystal element, and a light valve.
11. The method of claim 6 wherein selectively actuating said first and second regions of said optical element comprises selectively actuating first and second regions of a reflective optical element disposed to reflect light through said respective first and second portions of the single imaging path.
12. The method of claim 11 wherein selectively actuating first and second regions of said reflective optical element comprises selectively actuating first and second regions of a light valve having a plurality of moveable mirror elements.
13. The method of claim 1 wherein selectively receiving said first and second images comprises: simultaneously receiving a first image having first image attributes and a second image having second image attributes; and separating said first and second images in accordance with said first and second image attributes to produce respective first and second image representations.
14. The method of claim 13 wherein receiving said first image comprises receiving a first image having a first state of polarization and receiving said second image comprises receiving a second image having a second state of polarization, and wherein separating said first and second images comprises receiving the first and second images at a sensor array having a first plurality of elements responsive to radiation of the first polarization state and a second plurality of elements responsive to radiation of the second polarization state.
15. The method of claim 14 further comprising generating said first image having said first state of polarization and generating said second image having said second state of polarization.
16. The method of claim 1 wherein varying said extent comprises varying said extent of said first and second portions of the imaging path in response to a control signal.
17. The method of claim 16 further comprising generating said control signal.
18. The method of claim 17 wherein a location of said first perspective viewpoint is defined by a first centroid location and a location of said second perspective viewpoint is defined by a second centroid location, and wherein generating said control signal comprises generating a control signal operable to cause said first and second centroids to move with respect to each other at a generally constant rate to provide a smooth change in said representation of said three dimensional spatial attributes.
19. The method of claim 1 wherein a location of said first perspective viewpoint is defined by a first centroid location and a location of said second perspective viewpoint is defined by a second centroid location, and wherein varying said extent comprises varying said extent of said first and second portions of the imaging path between: a first extent wherein said first and second centroid locations are proximally located causing said first and second images to comprise predominately two-dimensional spatial attributes within the field of view; and a second extent wherein said first and second centroid locations are spaced apart to cause said first and second images to comprise an increasing degree of three dimensional spatial attribute information.
20. The method of claim 19 wherein varying said extent of said first and second portions of the imaging path comprises varying said extent to provide a smooth transition from one of: said first extent to said second extent to produce a two- dimensional to three-dimensional transition effect; and said second extent to said first extent to produce a three- dimensional to two-dimensional transition effect.
21. The method of claim 1 wherein receiving said first and second images comprises sequentially receiving a plurality of first and second images representing time variations of subject matter within the field of view.
22. An apparatus for generating three dimensional image information using a single imaging path having an associated field of view, the apparatus comprising: means for selectively receiving first and second images through respective first and second portions of the single imaging path, said first portion having a first perspective viewpoint within the field of view and said second portion having a second perspective viewpoint within the field of view, said first and second images together being operable to represent three dimensional spatial attributes of objects within the field of view; means for varying an extent of said first and second portions of the imaging path to cause said first and second perspective viewpoints to change location while receiving said first and second images, said change in perspective viewpoint location providing a corresponding change in said representation of said three dimensional spatial attributes; and means for compensating for changes in transmission through said first and second portions of the imaging path such that while varying said extent of the first and second portions, an image intensity associated with each of said first and second images is maintained at a generally uniform image intensity level.
23. An apparatus for generating three dimensional image information, the apparatus comprising: a single imaging path having an associated field of view; an image modulator operably configured to cause first and second images to be selectively received through respective first and second portions of the single imaging path, said first portion having a first perspective viewpoint within the field of view and said second portion having a second perspective viewpoint within the field of view, said first and second images together being operable to represent three dimensional spatial attributes of objects within the field of view; a controller in communication with said modulator, said controller being operably configured to produce a signal operable to cause said modulator to vary an extent of said first and second portions of the imaging path to cause said first and second perspective viewpoints to change location while receiving said first and second images, said change in perspective viewpoint location providing a corresponding change in said representation of said three dimensional spatial attributes; and a compensator operably configured to compensate for changes in transmission through said first and second portions of the imaging path such that while varying said extent of the first and second portions, an image intensity associated with each of said first and second images is maintained at a generally uniform image intensity level.
24. The apparatus of claim 23 wherein said single imaging path is operably configured to produce said first and second images at an image sensor and wherein said compensator is operably configured to compensate for said changes in said transmission by one of: increasing an exposure associated with said image sensor in response to a reducing extent of the first and second portions of the imaging path; decreasing an exposure associated with said image sensor in response to an increasing extent of the first and second portions of the imaging path; increasing overall transmittance through the imaging path in response to a reducing extent of the first and second portions of the imaging path; and reducing overall transmittance through the imaging path in response to an increasing extent of the first and second portions of the imaging path.
25. The apparatus of claim 23 wherein said modulator is operably configured to alternately: block said first portion of the imaging path while receiving said second image; and block said second portion of the imaging path while receiving said first image.
26. The apparatus of claim 25 wherein said modulator is operably configured to cause a blocking element located proximate an aperture plane of the image path to move between first and second positions in said image path to define said varying extent of said first and second portions of the imaging path.
27. The apparatus of claim 26 wherein said modulator comprises: an actuator for producing a force operable to alternately move said blocking element toward one of said first and second positions; a position sensor operably configured to produce a position signal representing a position of said blocking element; and wherein said controller is operably configured to control a magnitude of said force in response to said position signal to cause the blocking element to come to rest at said one of said first and second positions.
28. The apparatus of claim 25 wherein said modulator comprises an optical element having first and second regions, said first and second regions being operably configured to be selectively actuated to selectively block said first and second portions of the imaging path.
29. The apparatus of claim 28 wherein said optical element comprises a plurality of elements and wherein said first region comprises a first plurality of elements and said second region comprises a second plurality of elements, said first and second pluralities being selected to vary said extent of said first and second portions of the imaging path.
30. The apparatus of claim 29 wherein each element of said plurality of elements is operable to be actuated in response to receiving an actuation signal, and further comprising a modulator driver operably configured to generate said actuation signals to cause a number of elements in said first and second plurality of elements to be selectively varied to vary said extent of said first and second portions of the imaging path.
31. The apparatus of claim 28 wherein said modulator is operably configured to selectively actuate first and second regions of a transmissive optical element disposed to transmit light through said respective first and second portions of the single imaging path.
32. The apparatus of claim 31 wherein said modulator comprises one of a liquid crystal element and a light valve.
33. The apparatus of claim 28 wherein said modulator is operably configured to selectively actuate first and second regions of a reflective optical element disposed to reflect light received through said respective first and second portions of the single imaging path.
34. The apparatus of claim 33 wherein said modulator comprises a light valve having a plurality of moveable mirror elements.
35. The apparatus of claim 23 wherein said modulator is operably configured to: simultaneously receive a first image having first image attributes and a second image having second image attributes; and separate said first and second images in accordance with said first and second image attributes to produce respective first and second image representations.
36. The apparatus of claim 35 wherein said modulator comprises a polarizer having first and second polarization regions operably configured to generate a first image having a first state of polarization and said second image having a second state of polarization, and further comprising a sensor array having a first plurality of elements responsive to radiation of the first polarization state and a second plurality of elements responsive to radiation of the second polarization state, said sensor array being operable to separate said first and second images.
37. The apparatus of claim 23 wherein said modulator is operably configured to vary said extent of said first and second portions of the imaging path in response to a control signal.
38. The apparatus of claim 37 wherein said controller is operably configured to generate said control signal.
39. The apparatus of claim 38 wherein a location of said first perspective viewpoint is defined by a first centroid location and a location of said second perspective viewpoint is defined by a second centroid location, and wherein said controller is operably configured to generate said control signal by generating a control signal operable to cause said first and second centroids to move with respect to each other at a generally constant rate to provide a smooth change in said representation of said three dimensional spatial attributes.
40. The apparatus of claim 23 wherein a location of said first perspective viewpoint is defined by a first centroid location and a location of said second perspective viewpoint is defined by a second centroid location, and wherein said modulator is operably configured to vary said extent of said first and second portions of the imaging path between: a first extent wherein said first and second centroid locations are proximally located, causing said first and second images to comprise predominantly two-dimensional spatial attributes within the field of view; and a second extent wherein said first and second centroid locations are spaced apart to cause said first and second images to comprise an increasing degree of three-dimensional spatial attribute information.
41. The apparatus of claim 40 wherein said modulator is operably configured to vary said extent of said first and second portions of the imaging path by varying said extent to provide a smooth transition from one of: said first extent to said second extent to produce a two-dimensional to three-dimensional transition effect; and said second extent to said first extent to produce a three-dimensional to two-dimensional transition effect.
42. The apparatus of claim 23 wherein said imaging path is operably configured to receive said first and second images by sequentially receiving a plurality of first and second images representing time variations of subject matter within the field of view.
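Illustrative sketch (not part of the patent text): claims 39-41 describe moving the first and second aperture centroids apart at a generally constant rate to transition smoothly between a predominantly two-dimensional first extent and a stereoscopic second extent. The following hypothetical Python model shows one way such a linear centroid-separation ramp could be expressed; the function names, signatures, and values are assumptions for illustration only, not the patented implementation.

```python
def centroid_separation(t, duration, max_separation):
    """Linear ramp of centroid spacing over `duration` seconds:
    0 at t=0 (first extent, centroids proximally located -> 2D image)
    up to max_separation (second extent, centroids spaced apart -> 3D).
    Time is clamped so the separation holds steady outside the ramp."""
    t = min(max(t, 0.0), duration)
    return max_separation * t / duration


def centroid_locations(t, duration, max_separation):
    """First and second centroid positions, symmetric about the
    optical axis (position 0), for the separation at time t."""
    half = centroid_separation(t, duration, max_separation) / 2.0
    return -half, +half
```

Because the separation changes at a constant rate, the perceived depth cue grows (or shrinks, for the reverse 3D-to-2D transition of claim 41) without visible jumps; a modulator driver would sample `centroid_locations` each frame to select which aperture regions to actuate.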
PCT/CA2009/000957 2009-07-10 2009-07-10 Method and apparatus for generating three dimensional image information using a single imaging path WO2011003168A1 (en)

Priority Applications (13)

Application Number Priority Date Filing Date Title
KR1020127003672A KR101598653B1 (en) 2009-07-10 2009-07-10 Method and apparatus for generating three dimensional image information using a single imaging path
CN200980161394.6A CN102725688B (en) 2009-07-10 2009-07-10 Method and apparatus for generating three dimensional image information using a single imaging path
US13/382,895 US9298078B2 (en) 2009-07-10 2009-07-10 Method and apparatus for generating three-dimensional image information using a single imaging path
EP09846964.6A EP2452228A4 (en) 2009-07-10 2009-07-10 Method and apparatus for generating three dimensional image information using a single imaging path
JP2012518701A JP2012532347A (en) 2009-07-10 2009-07-10 Method and apparatus for generating 3D image information using a single imaging path
PCT/CA2009/000957 WO2011003168A1 (en) 2009-07-10 2009-07-10 Method and apparatus for generating three dimensional image information using a single imaging path
TW099120775A TWI531209B (en) 2009-07-10 2010-06-25 Method and apparatus for generating three dimensional image information using a single imaging path
EP10796640.0A EP2452224A4 (en) 2009-07-10 2010-07-12 Method and apparatus for generating three-dimensional image information
PCT/CA2010/001093 WO2011003208A1 (en) 2009-07-10 2010-07-12 Method and apparatus for generating three-dimensional image information
KR1020107027161A KR101758377B1 (en) 2009-07-10 2010-07-12 Method and apparatus for generating three-dimensional image information
CN201080040644.3A CN102640036B (en) 2009-07-10 2010-07-12 Method and apparatus for generating three-dimensional image information
JP2012518716A JP5840607B2 (en) 2009-07-10 2010-07-12 Method and apparatus for generating three-dimensional image information
US13/382,892 US9442362B2 (en) 2009-07-10 2010-07-12 Method and apparatus for generating three-dimensional image information

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CA2009/000957 WO2011003168A1 (en) 2009-07-10 2009-07-10 Method and apparatus for generating three dimensional image information using a single imaging path

Publications (1)

Publication Number Publication Date
WO2011003168A1 true WO2011003168A1 (en) 2011-01-13

Family

ID=43428692

Family Applications (2)

Application Number Title Priority Date Filing Date
PCT/CA2009/000957 WO2011003168A1 (en) 2009-07-10 2009-07-10 Method and apparatus for generating three dimensional image information using a single imaging path
PCT/CA2010/001093 WO2011003208A1 (en) 2009-07-10 2010-07-12 Method and apparatus for generating three-dimensional image information

Family Applications After (1)

Application Number Title Priority Date Filing Date
PCT/CA2010/001093 WO2011003208A1 (en) 2009-07-10 2010-07-12 Method and apparatus for generating three-dimensional image information

Country Status (7)

Country Link
US (2) US9298078B2 (en)
EP (2) EP2452228A4 (en)
JP (2) JP2012532347A (en)
KR (2) KR101598653B1 (en)
CN (2) CN102725688B (en)
TW (1) TWI531209B (en)
WO (2) WO2011003168A1 (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120249752A1 (en) * 2011-03-28 2012-10-04 Sony Corporation Imaging apparatus and electronic apparatus
WO2012164388A1 (en) * 2011-06-01 2012-12-06 Gvbb Holdings S.A.R.L. Grid modulated single lens 3-d camera
WO2013033811A1 (en) 2011-09-08 2013-03-14 Front Street Investment Management Inc. Method and apparatus for illuminating a field of view of an optical system for generating three dimensional image information
EP2724192A4 (en) * 2011-06-21 2015-09-02 Front Street Diversified Income Class By Its Manager Front Street Invest Man Inc Method and apparatus for generating three-dimensional image information
US9392260B2 (en) 2012-01-27 2016-07-12 Panasonic Intellectual Property Management Co., Ltd. Array optical element, imaging member, imaging element, imaging device, and distance measurement device

Families Citing this family (34)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9298078B2 (en) * 2009-07-10 2016-03-29 Steropes Technologies, Llc Method and apparatus for generating three-dimensional image information using a single imaging path
WO2012174633A1 (en) * 2011-06-21 2012-12-27 Isee3D Inc. Method and apparatus for generating three-dimensional image information
JP2013538360A (en) 2010-06-25 2013-10-10 フロント、ストリート、インベストメント、マネジメント、インコーポレイテッド、アズ、マネジャー、フォー、フロント、ストリート、ダイバーシファイド、インカム、クラス Method and apparatus for generating three-dimensional image information
JP5741683B2 (en) * 2011-03-28 2015-07-01 株式会社Jvcケンウッド 3D image processing apparatus and 3D image processing method
JP5982751B2 (en) * 2011-08-04 2016-08-31 ソニー株式会社 Image processing apparatus, image processing method, and program
US9456735B2 (en) * 2012-09-27 2016-10-04 Shahinian Karnig Hrayr Multi-angle rear-viewing endoscope and method of operation thereof
JP5831105B2 (en) * 2011-09-30 2015-12-09 ソニー株式会社 Imaging apparatus and imaging method
US8937646B1 (en) 2011-10-05 2015-01-20 Amazon Technologies, Inc. Stereo imaging using disparate imaging devices
JP6156787B2 (en) * 2012-07-25 2017-07-05 パナソニックIpマネジメント株式会社 Imaging observation device
JP6044328B2 (en) * 2012-12-26 2016-12-14 株式会社リコー Image processing system, image processing method, and program
US11153696B2 (en) 2017-02-14 2021-10-19 Virtual 3-D Technologies Corp. Ear canal modeling using pattern projection
US9456752B2 (en) 2013-03-14 2016-10-04 Aperture Diagnostics Ltd. Full-field three-dimensional surface measurement
US20210108922A1 (en) * 2013-05-14 2021-04-15 The Charles Stark Draper Laboratory, Inc. Star Tracker with Adjustable Light Shield
US9726617B2 (en) * 2013-06-04 2017-08-08 Kla-Tencor Corporation Apparatus and methods for finding a best aperture and mode to enhance defect detection
US9255887B2 (en) * 2013-06-19 2016-02-09 Kla-Tencor Corporation 2D programmable aperture mechanism
KR101476820B1 (en) * 2014-04-07 2014-12-29 주식회사 썸텍 3D video microscope
US20170055814A1 (en) * 2014-06-01 2017-03-02 Capsovision ,Inc. Reconstruction of Images from an in Vivo Multi-Camera Capsule with Confidence Matching
US20160057405A1 (en) * 2014-08-22 2016-02-25 George E. Duckett, III Compact Stereoscopic Lens System for Medical or Industrial Imaging Device
KR102312273B1 (en) * 2014-11-13 2021-10-12 삼성전자주식회사 Camera for depth image measure and method of operating the same
DE102014017281A1 (en) * 2014-11-21 2016-05-25 e.solutions GmbH Optical detection device and method for controlling the same
KR101639685B1 (en) * 2015-02-12 2016-07-14 한국생산기술연구원 Camera type active filter device, and active filtering method thereof
US11076986B2 (en) * 2015-05-12 2021-08-03 Ikem C Ajaelo Electronic drop dispensing device and method of operation thereof
EP3275359A1 (en) * 2015-05-12 2018-01-31 Olympus Corporation Stereoscopic endoscope device
CN104935915B (en) * 2015-07-17 2018-05-11 珠海康弘发展有限公司 Imaging device, 3-D imaging system and three-D imaging method
JP2018019020A (en) * 2016-07-29 2018-02-01 ソニーセミコンダクタソリューションズ株式会社 Imaging device
JP6589071B2 (en) * 2016-12-28 2019-10-09 オリンパス株式会社 Imaging device, endoscope and endoscope system
US10972643B2 (en) * 2018-03-29 2021-04-06 Microsoft Technology Licensing, Llc Camera comprising an infrared illuminator and a liquid crystal optical filter switchable between a reflection state and a transmission state for infrared imaging and spectral imaging, and method thereof
US10365554B1 (en) 2018-04-04 2019-07-30 Intuitive Surgical Operations, Inc. Dynamic aperture positioning for stereo endoscopic cameras
US10924692B2 (en) 2018-05-08 2021-02-16 Microsoft Technology Licensing, Llc Depth and multi-spectral camera
KR102124623B1 (en) * 2018-08-31 2020-06-18 주식회사 하이딥 Display apparatus capable of fingerprint recognition
CN109597275A (en) * 2018-11-29 2019-04-09 同济大学 A kind of axial Distributed Three-dimensional imaging method based on double-wedge prism
CN112312047B (en) * 2019-08-01 2024-04-16 格科微电子(上海)有限公司 Method for reducing power supply noise of image sensor
WO2022074973A1 (en) * 2020-10-05 2022-04-14 ソニーグループ株式会社 Gaze detection device, and display device
CN114445497A (en) * 2022-03-01 2022-05-06 上海涛影医疗科技有限公司 Image positioning method, image positioning device, dynamic image generating method, dynamic image generating device, dynamic image generating system and storage medium

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5471237A (en) * 1992-06-26 1995-11-28 Apollo Camer, Llc Single lens stereoscopic video camera
WO1997003378A1 (en) * 1995-07-07 1997-01-30 International Telepresence Corporation System with movable lens for producing three-dimensional images
US5914810A (en) 1993-11-23 1999-06-22 Watts; Jonathan Robert Stereoscopic imaging arrangement and viewing arrangement
US6275335B1 (en) * 1999-07-16 2001-08-14 Sl3D, Inc. Single-lens 3D method, microscope, and video adapter
US6348994B1 (en) * 1995-03-02 2002-02-19 Carl Zeiss Jena Gmbh Method for generating a stereoscopic image of an object and an arrangement for stereoscopic viewing
US6624935B2 (en) * 2000-12-06 2003-09-23 Karl Store Imaging, Inc. Single-axis stereoscopic video imaging system with centering capability

Family Cites Families (72)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US2255631A (en) 1940-07-09 1941-09-09 Cyril A Schulman Microscope
US3464766A (en) 1966-10-28 1969-09-02 Us Interior Stereo-image alternator system
US3712199A (en) 1970-09-23 1973-01-23 Video West Inc Three-dimensional color photographic process, apparatus and product
JPS5440937B1 (en) 1970-10-23 1979-12-06
US4021846A (en) 1972-09-25 1977-05-03 The United States Of America As Represented By The Secretary Of The Navy Liquid crystal stereoscopic viewer
US3825328A (en) * 1973-09-10 1974-07-23 W Hoch Optical system for a stereoscopic motion picture camera
GB1502274A (en) 1974-02-14 1978-03-01 Hopkins H Microscope and magnification changer
US4103260A (en) * 1977-01-03 1978-07-25 Hughes Aircraft Company Spatial polarization coding electro-optical transmitter
US4196966A (en) 1978-05-01 1980-04-08 Malis Leonard I Binocular magnification system
US4303316A (en) 1978-10-19 1981-12-01 Mcelveen Robert H Process for recording visual scenes for reproduction in stereopsis
US4392710A (en) 1979-11-22 1983-07-12 Pilkington P. E. Limited Optical apparatus
US4568160A (en) 1983-06-03 1986-02-04 Mgs Incorporated Process and apparatus for 3-dimensional moving pictures
US4651201A (en) 1984-06-01 1987-03-17 Arnold Schoolman Stereoscopic endoscope arrangement
US4601552A (en) 1984-10-22 1986-07-22 Jessmore Floyd E Binocular construction having adjustable filters
US4761066A (en) 1986-01-14 1988-08-02 Carter William J Stereoscopic optical system
US4924853A (en) 1989-05-22 1990-05-15 Medical Dimensions, Inc. Stereoscopic medical viewing device
US5097359A (en) 1990-04-12 1992-03-17 Mckinley Optics, Inc. Endoscope relay lens configuration
US5059009A (en) 1990-04-12 1991-10-22 Mckinley Optics, Incorporated Endoscope relay lens
US5094523A (en) 1990-05-11 1992-03-10 Eye Research Institute Of Retina Foundation Bidirectional light steering apparatus
US5537144A (en) 1990-06-11 1996-07-16 Revfo, Inc. Electro-optical display system for visually displaying polarized spatially multiplexed images of 3-D objects for use in stereoscopically viewing the same with high image quality and resolution
US5198877A (en) 1990-10-15 1993-03-30 Pixsys, Inc. Method and apparatus for three-dimensional non-contact shape sensing
JPH04251239A (en) * 1991-01-09 1992-09-07 Hiroyoshi Sugibuchi Stereoscopic viewing photographing device
US5122650A (en) 1991-04-18 1992-06-16 Mckinley Optics, Inc. Stereo video endoscope objective lens system
US5222477A (en) 1991-09-30 1993-06-29 Welch Allyn, Inc. Endoscope or borescope stereo viewing system
US5588948A (en) 1993-02-17 1996-12-31 Olympus Optical Co. Ltd. Stereoscopic endoscope
CA2177165C (en) 1993-11-23 2006-04-11 Jonathan Robert Watts Stereoscopic imaging arrangement and viewing arrangement
CA2123077C (en) 1994-04-14 2001-09-04 Anthony B. Greening Single lens stereoscopic imaging system
JPH0836229A (en) 1994-07-21 1996-02-06 Canon Inc Stereo adapter
EP0730181B1 (en) * 1995-03-02 2000-12-20 CARL ZEISS JENA GmbH Method of producing a stereoscopic image from an object and device for stereoscopic viewing
US6882473B2 (en) * 1995-03-02 2005-04-19 Carl Zeiss Jena Gmbh Method for generating a stereoscopic image of an object and an arrangement for stereoscopic viewing
US5532777A (en) 1995-06-06 1996-07-02 Zanen; Pieter O. Single lens apparatus for three-dimensional imaging having focus-related convergence compensation
US5703677A (en) 1995-11-14 1997-12-30 The Trustees Of The University Of Pennsylvania Single lens range imaging method and apparatus
US5835133A (en) 1996-01-23 1998-11-10 Silicon Graphics, Inc. Optical system for single camera stereo video
GB9617314D0 (en) * 1996-08-17 1996-09-25 Fryco Ltd Optical images
US6006001A (en) 1996-12-02 1999-12-21 The Research Foundation Of Cuny Fiberoptic assembly useful in optical spectroscopy
HUP9700348A1 (en) * 1997-02-04 1998-12-28 Holografika E.C. Method and device for displaying three-dimensional pictures
KR100261582B1 (en) 1997-11-06 2000-07-15 윤종용 3-dimensional image projection display device
US6239912B1 (en) * 1998-09-11 2001-05-29 Nikon Corporation Focal point detection apparatus
US7683926B2 (en) 1999-02-25 2010-03-23 Visionsense Ltd. Optical device
US6563105B2 (en) 1999-06-08 2003-05-13 University Of Washington Image acquisition with depth enhancement
JP3863319B2 (en) 1999-06-29 2006-12-27 富士フイルムホールディングス株式会社 Parallax image capturing apparatus and camera
GB2371878A (en) 1999-11-22 2002-08-07 Sl3D Inc Stereoscopic telescope with camera
EP1301826A2 (en) 2000-05-24 2003-04-16 Pieter Zanen Device for making 3-d images
JP2002034056A (en) * 2000-07-18 2002-01-31 Scalar Corp Device and method for picking up stereoscopic image
JP3564383B2 (en) * 2000-11-15 2004-09-08 日本電信電話株式会社 3D video input device
US7324279B2 (en) 2000-12-28 2008-01-29 Texas Instruments Incorporated Dual modulator projection system
US20020131170A1 (en) 2001-01-12 2002-09-19 Bryan Costales Stereoscopic aperture valves
JP2004309868A (en) * 2003-04-08 2004-11-04 Sony Corp Imaging device and stereoscopic video generating device
US20070197875A1 (en) 2003-11-14 2007-08-23 Osaka Shoji Endoscope device and imaging method using the same
JP2005159755A (en) * 2003-11-26 2005-06-16 Fuji Photo Film Co Ltd Image processing apparatus and image processing program
US20060279740A1 (en) * 2003-12-31 2006-12-14 Badami Vivek G Optically balanced instrument for high accuracy measurement of dimensional change
US7426039B2 (en) * 2003-12-31 2008-09-16 Corning Incorporated Optically balanced instrument for high accuracy measurement of dimensional change
US7227568B2 (en) 2004-04-03 2007-06-05 Li Sun Dual polarizing light filter for 2-D and 3-D display
WO2005117458A2 (en) * 2004-05-26 2005-12-08 Tibor Balogh Method and apparatus for generating 3d images
FR2889318B1 (en) * 2005-07-26 2007-12-28 Commissariat Energie Atomique RECONFIGURABLE OPTICAL BEAM PROCESSING DEVICE
EP1764644B1 (en) * 2005-09-09 2017-08-30 Viavi Solutions Inc. Optimally oriented trim retarders
US7559653B2 (en) 2005-12-14 2009-07-14 Eastman Kodak Company Stereoscopic display apparatus using LCD panel
US8102413B2 (en) * 2005-12-15 2012-01-24 Unipixel Displays, Inc. Stereoscopic imaging apparatus incorporating a parallax barrier
US7978892B2 (en) 2006-10-25 2011-07-12 D4D Technologies, Llc 3D photogrammetry using projected patterns
WO2008068753A2 (en) * 2006-12-04 2008-06-12 Ben-Gurion University Of The Negev - Research And Development Authority Polarization independent birefringent tunable filters
JP4931668B2 (en) 2007-03-29 2012-05-16 富士フイルム株式会社 Compound eye imaging device
TW200921044A (en) * 2007-11-07 2009-05-16 Lite On Semiconductor Corp 3D position detecting device and detecting method thereof
TW200921042A (en) * 2007-11-07 2009-05-16 Lite On Semiconductor Corp 3D multi-degree of freedom detecting device and detecting method thereof
US8025416B2 (en) * 2008-02-18 2011-09-27 3D4K Displays, Inc. Integrated optical polarization combining prism for projection displays
JP2009169096A (en) * 2008-01-16 2009-07-30 Fujifilm Corp Imaging device
DE102008000467A1 (en) * 2008-02-29 2009-09-10 Seereal Technologies S.A. Device for reading holograms
US8737721B2 (en) * 2008-05-07 2014-05-27 Microsoft Corporation Procedural authoring
US8204299B2 (en) * 2008-06-12 2012-06-19 Microsoft Corporation 3D content aggregation built into devices
US20100103276A1 (en) * 2008-10-28 2010-04-29 Border John N Split aperture capture of rangemap for 3d imaging
CN101588513B (en) * 2009-01-07 2011-05-18 深圳市掌网立体时代视讯技术有限公司 Device and method of stereo camera
US9298078B2 (en) 2009-07-10 2016-03-29 Steropes Technologies, Llc Method and apparatus for generating three-dimensional image information using a single imaging path
DE102010044502A1 (en) * 2010-09-06 2012-03-08 Leica Microsystems (Schweiz) Ag Special lighting Video Operations stereomicroscope

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5471237A (en) * 1992-06-26 1995-11-28 Apollo Camer, Llc Single lens stereoscopic video camera
US5914810A (en) 1993-11-23 1999-06-22 Watts; Jonathan Robert Stereoscopic imaging arrangement and viewing arrangement
US6348994B1 (en) * 1995-03-02 2002-02-19 Carl Zeiss Jena Gmbh Method for generating a stereoscopic image of an object and an arrangement for stereoscopic viewing
WO1997003378A1 (en) * 1995-07-07 1997-01-30 International Telepresence Corporation System with movable lens for producing three-dimensional images
US6275335B1 (en) * 1999-07-16 2001-08-14 Sl3D, Inc. Single-lens 3D method, microscope, and video adapter
US6624935B2 (en) * 2000-12-06 2003-09-23 Karl Store Imaging, Inc. Single-axis stereoscopic video imaging system with centering capability

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP2452228A4

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120249752A1 (en) * 2011-03-28 2012-10-04 Sony Corporation Imaging apparatus and electronic apparatus
US9239514B2 (en) * 2011-03-28 2016-01-19 Sony Corporation Imaging apparatus and electronic device for producing stereoscopic images
WO2012164388A1 (en) * 2011-06-01 2012-12-06 Gvbb Holdings S.A.R.L. Grid modulated single lens 3-d camera
US9338435B2 (en) 2011-06-01 2016-05-10 Gvbb Holdings S.A.R.L. Grid modulated single lens 3-D camera
EP2724192A4 (en) * 2011-06-21 2015-09-02 Front Street Diversified Income Class By Its Manager Front Street Invest Man Inc Method and apparatus for generating three-dimensional image information
WO2013033811A1 (en) 2011-09-08 2013-03-14 Front Street Investment Management Inc. Method and apparatus for illuminating a field of view of an optical system for generating three dimensional image information
EP2764328A1 (en) * 2011-09-08 2014-08-13 Front Street Investment Management Inc. Method and apparatus for illuminating a field of view of an optical system for generating three dimensional image information
EP2764328A4 (en) * 2011-09-08 2015-05-06 Front Street Invest Man Inc Method and apparatus for illuminating a field of view of an optical system for generating three dimensional image information
EP2912995A1 (en) * 2011-09-08 2015-09-02 FRONT STREET INVESTMENT MANAGEMENT INC. as manager for Front Street Diversified Income Class Method and apparatus for illuminating a field of view of an optical system for generating three dimensional image information
US9392260B2 (en) 2012-01-27 2016-07-12 Panasonic Intellectual Property Management Co., Ltd. Array optical element, imaging member, imaging element, imaging device, and distance measurement device

Also Published As

Publication number Publication date
US20120188347A1 (en) 2012-07-26
WO2011003208A1 (en) 2011-01-13
CN102640036A (en) 2012-08-15
EP2452224A4 (en) 2015-06-03
KR101598653B1 (en) 2016-02-29
JP5840607B2 (en) 2016-01-06
CN102725688A (en) 2012-10-10
US20130038690A1 (en) 2013-02-14
TW201108715A (en) 2011-03-01
EP2452228A1 (en) 2012-05-16
US9298078B2 (en) 2016-03-29
KR20120039440A (en) 2012-04-25
CN102725688B (en) 2015-04-01
JP2012532347A (en) 2012-12-13
EP2452224A1 (en) 2012-05-16
EP2452228A4 (en) 2015-06-03
CN102640036B (en) 2016-02-17
US9442362B2 (en) 2016-09-13
JP2012532348A (en) 2012-12-13
KR20120093825A (en) 2012-08-23
KR101758377B1 (en) 2017-07-26
TWI531209B (en) 2016-04-21

Similar Documents

Publication Publication Date Title
US9298078B2 (en) Method and apparatus for generating three-dimensional image information using a single imaging path
KR101194521B1 (en) A system for acquiring and displaying three-dimensional information and a method thereof
JP5346266B2 (en) Image processing apparatus, camera, and image processing method
US9664612B2 (en) Method and apparatus for generating three-dimensional image information
WO2002078324A2 (en) Stereoscopic camera with only one digital image sensor
US9124877B1 (en) Methods for acquiring stereoscopic images of a location
WO2006014469A2 (en) 3d television broadcasting system
WO1995014952A1 (en) Stereoscopic imaging arrangement and viewing arrangement
US20120307016A1 (en) 3d camera
JP4208351B2 (en) Imaging apparatus, convergence distance determination method, and storage medium
JP2005173270A (en) Optical device for stereoscopic photography, photographing device, and system and device for stereoscopic photography
TWI486633B (en) Adapter apparatus and method for generating three-dimensional image information
JP2001016619A (en) Image pickup device, its convergence distance decision method, storage medium and optical device
JP2000152282A (en) Stereoscopic picture photographing device
JP2001016617A (en) Image pickup device, its convergence control method, storage medium and optical device
KR100943949B1 (en) Stereoscopic camera system and driving method of the same
Evens et al. The development of 3-D (stereoscopic) imaging systems for security applications
JP2004126290A (en) Stereoscopic photographing device
JP3578808B2 (en) 3D display device
JP2002027498A (en) Apparatus for imaging three-dimensional video
JP2003092769A (en) Stereoscopic video imaging apparatus
JP2002191059A (en) Stereoscopic photographing device
JP2003092770A (en) Stereoscopic video imaging apparatus
JPS63164596A (en) Image pickup device for stereoscopic vision
JP2003121949A (en) Stereoscopic photography optical device

Legal Events

Date Code Title Description
WWE Wipo information: entry into national phase

Ref document number: 200980161394.6

Country of ref document: CN

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 09846964

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 2012518701

Country of ref document: JP

NENP Non-entry into the national phase

Ref country code: DE

WWE Wipo information: entry into national phase

Ref document number: 2009846964

Country of ref document: EP

ENP Entry into the national phase

Ref document number: 20127003672

Country of ref document: KR

Kind code of ref document: A

WWE Wipo information: entry into national phase

Ref document number: 13382895

Country of ref document: US