WO2005065272A2 - Three-dimensional imaging system using optical pulses, non-linear optical mixers and holographic calibration - Google Patents


Info

Publication number
WO2005065272A2
WO2005065272A2 (Application No. PCT/US2004/043408)
Authority
WO
WIPO (PCT)
Prior art keywords
optical
calibration
dimensional
wavelengths
mixer
Application number
PCT/US2004/043408
Other languages
French (fr)
Other versions
WO2005065272A3 (en)
Inventor
John J. Keating III
Rainer Martini
Original Assignee
Trustees Of Stevens Institute Of Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Application filed by Trustees Of Stevens Institute Of Technology
Priority to CA002550842A (published as CA2550842A1)
Priority to US 10/585,157 (published as US8098275B2)
Priority to EP04815479A (published as EP1709617A2)
Publication of WO2005065272A2
Publication of WO2005065272A3

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20 Image signal generators
    • H04N13/204 Image signal generators using stereoscopic image cameras
    • H04N13/207 Image signal generators using stereoscopic image cameras using a single 2D image sensor
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20 Image signal generators
    • H04N13/204 Image signal generators using stereoscopic image cameras
    • H04N13/246 Calibration of cameras
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20 Image signal generators
    • H04N13/204 Image signal generators using stereoscopic image cameras
    • H04N13/254 Image signal generators using stereoscopic image cameras in combination with electromagnetic radiation sources for illuminating objects
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30 Image reproducers
    • H04N13/388 Volumetric displays, i.e. systems where the image is built up from picture elements distributed through a volume
    • H04N13/39 Volumetric displays, i.e. systems where the image is built up from picture elements distributed through a volume the picture elements emitting light at places where a pair of light beams intersect in a transparent material
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30 Image reproducers
    • H04N13/388 Volumetric displays, i.e. systems where the image is built up from picture elements distributed through a volume
    • H04N13/393 Volumetric displays, i.e. systems where the image is built up from picture elements distributed through a volume the volume being generated by a moving, e.g. vibrating or rotating, surface

Definitions

  • the present invention relates to three-dimensional imaging, and, more particularly, to an apparatus for providing a three-dimensional imaging system which includes a three-dimensional display, a two-dimensional and/or three-dimensional scanning device, and calibration equipment to calibrate the scanning device(s) and to simplify the combination of the images from one or more two-dimensional optical imaging devices into three-dimensional information.
  • the display and the scanning device(s) both employ optical pulses and non-linear optics to display and record, respectively, a three-dimensional image.
  • Three-dimensional images provide the viewer with texture, depth, color, and position information. Three-dimensional images are more natural for humans to appreciate.
  • a volumetric three-dimensional imaging system displays images in a display volume which are acquired by a three-dimensional optical scanner or acquired by one or more two-dimensional optical scanners and converted to a three dimensional representation using holographic calibration. Light rays generated by the display at three-dimensional spatial positions appear as real objects to the viewer.
  • the prior art for three-dimensional displays includes two classes of displays: stereoscopic displays and swept volume "volumetric" displays.
  • Stereoscopic displays are based on holographic or binocular stereoscopic technology that use two-dimensional displays to create a three dimension effect for the viewer.
  • a shortcoming of stereoscopic displays is that they display spatial information from the perspective of only one viewer.
  • Volumetric displays overcome this shortcoming by creating three-dimensional images in the display volume from voxels, the smallest distinguishable three-dimensional spatial element of a three-dimensional image.
  • Volumetric displays satisfy depth cues such as stereo vision and motion parallax. Motion parallax is the phenomenon a driver observes from his car when the terrain closer to him moves by faster than the terrain farther away.
  • the prior art for flat screen volumetric displays includes the LED arrays described in U.S. Patent No. 4,160,973 to Berlin (the Berlin '973 Patent), the cathode ray sphere displays described in U.S. Patent No. 5,703,606 to Blundell (the Blundell '606 Patent), the laser projection displays described in U.S. Patent No. 5,148,301 to Batchko (the Batchko '301 Patent), and the rotating reflector displays described in U.S. Patent No. 6,302,542 to Tsao (the Tsao '542 Patent).
  • the prior art for curved screens includes the helical screen displays and the Archimedes' Spiral displays described in U.S. Patent No. 3,428,393 to de Montebello (the de Montebello '393 Patent).
  • the first class, which uses a laser or electron beam to excite a phosphor to emit light, includes the displays described in the Batchko '301 Patent, the Blundell '606 Patent, and U.S. Patent No. 4,871,231 to Garcia (the Garcia '231 Patent).
  • the second class uses intersecting laser beams to generate illumination on a moving screen using two-stage excitation of photoluminescent media, as described in U.S. Patent No. 5,943,160 to Downing et al. (the Downing et al. '160 Patent), or photoionization of silicate glasses, as described in U.S. Patent No.
  • Fiber optic implementations using this approach include those described in U.S. Patent No. 4,294,523 to Woloshuk et al. (the Woloshuk et al. '523 Patent) and in U.S. Patent No. 3,604,780 to Martin (the Martin '780 Patent), which channel light through optical fibers to the moving screen.
  • the rotating light source approach has difficulty connecting a large number of light sources to a moving screen; high-definition displays of this type are therefore difficult to construct and to maintain in operation.
  • Implementations placing light sources on the moving screen, such as the light-emitting diodes (LEDs) of the Berlin '973 Patent, result in complex implementations in which the light emitters and their associated active control electronics must rotate with the screen.
  • the prior art describes three-dimensional image scanners which capture the shapes of objects in the target space (see, for instance, U.S. Patent No. 5,936,739 to Cameron, U.S. Patent No. 5,585,913 to Hariharan, and U.S. Patent No. 6,445,491 to Sucha).
  • Such three-dimensional image scanners are not able to capture both shape and color.
  • the present invention overcomes the disadvantages and shortcomings of the prior art discussed above by providing a three-dimensional imaging system, which includes a three-dimensional display, an image scanning device for capturing a three-dimensional image to be displayed on the three-dimensional display, and three-dimensional calibration equipment for calibrating the image scanning device.
  • a three-dimensional imaging system which includes a three-dimensional display, an image scanning device for capturing a three-dimensional image to be displayed on the three-dimensional display, and three-dimensional calibration equipment for calibrating the image scanning device.
  • Both the three-dimensional display and the image scanning device employ optical pulses and non-linear optics to display and record, respectively, a three-dimensional image.
  • the image scanning device may be two-dimensional or three-dimensional.
  • the three-dimensional display includes at least three pulsed optical sources; and an optical mixer movable in a display space, wherein the at least three pulsed optical sources are spatially separated so as to permit pulses emanating therefrom to overlap in a voxel within the display space and intersecting the optical mixer at a selected position, whereby a first-order non-linear interaction of the pulses causes the optical mixer to produce at least one pre-determined wavelength of electromagnetic waves.
  • the three-dimensional image scanner captures a three-dimensional image of an object.
  • the three-dimensional image scanner includes a first pulsed optical source for generating an illuminating optical pulse at an illumination wavelength, the first pulsed optical source directing the illuminating optical pulse toward the object; a second pulsed optical source for generating a gating optical pulse at a gating wavelength; an optical mixer positioned to receive light reflected from the object at a single wavelength in response to interaction of the illuminating optical pulse with the object, a portion of the illuminating optical pulse and a portion of the gating optical pulse spatially and temporally overlapping each other within the optical mixer, thereby producing a first optical pulse indicative of the shape of the object and a second optical pulse indicative of the color of the object; and an optical recorder having a plurality of pixels responsive to output light emitted by the optical mixer, a first portion of the plurality of pixels having an associated filter which passes the first optical pulse and which blocks the second optical pulse, and a second portion of the plurality
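The gating pulse acts, in effect, as a time-of-flight depth selector: only reflected light that reaches the optical mixer while the gating pulse is present is converted. The sketch below (function names are illustrative, and the illuminator and recorder are assumed co-located, which the patent does not state) shows how the relative delay of the gating pulse maps to a depth "slice" of the object.

```python
# Time-gating sketch: reflected light from deeper parts of the object
# returns later, so sweeping the gating pulse's delay selects successive
# depth "slices". The co-located source/recorder geometry is an
# illustrative assumption, not taken from the patent text.

C = 299_792_458.0  # speed of light in vacuum, m/s

def slice_depth(gate_delay_s: float) -> float:
    """One-way depth selected by a given gate delay (round trip halved)."""
    return C * gate_delay_s / 2.0

def gate_delay_for_depth(depth_m: float) -> float:
    """Inverse mapping: gate delay needed to capture a slice at a depth."""
    return 2.0 * depth_m / C
```

With femtosecond to picosecond pulses, a delay step of about 0.33 ps corresponds to a depth step of roughly 50 µm, which is why short optical pulses yield a fine depth profile.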
  • the three-dimensional calibration equipment includes acquiring means for acquiring an optical image of a desired object from at least two positions, the acquiring means being either at least two optical recorders placed at at least two different positions or a single optical recorder that is moved between several positions.
  • the three-dimensional calibration equipment also includes a holographic calibration plate placed between the acquiring means and the desired object, and a light source of at least one of a set of calibration wavelengths for illuminating the holographic calibration plate so as to project at least one virtual calibration pattern in the field of view of the acquiring means and in the vicinity of the desired object.
  • An alternative embodiment of the three-dimensional calibration equipment includes at least two optical recorders and a light source of at least one of a set of calibration wavelengths for illuminating at least three reference points relative to the desired object to be recorded by the at least two optical recorders.
  • a method for calibrating the three-dimensional imaging system using the three-dimensional imaging equipment mentioned above includes the steps of projecting a virtual calibration pattern in the field of view of the optical recorder(s); choosing one position of one optical recorder as a reference position; assigning coordinates of a coordinate system relative to either the virtual calibration pattern or the reference position; measuring the differences in the virtual calibration pattern from a second position of the optical recorder(s); calculating calibration corrections relative to the reference position based on the differences measured; and adjusting the optical recorder(s) based on the calibration corrections.
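The measuring and correcting steps of this method can be sketched numerically. Assuming the virtual calibration pattern is observed as a set of 3-D grid points (the data and function names below are hypothetical), a translation-only correction is derived; a full implementation would also estimate rotation and scale.

```python
# Sketch of steps 4-6 of the calibration method: measure per-point
# differences in the virtual calibration pattern between the reference
# position and a second recorder position, derive a correction, and
# apply it. Translation-only; rotation/scale are omitted for brevity.

def calibration_correction(reference_pts, observed_pts):
    """Mean per-point difference between the pattern as seen from the
    reference position and from the second recorder position."""
    diffs = [[o - r for r, o in zip(ref, obs)]
             for ref, obs in zip(reference_pts, observed_pts)]
    n = len(diffs)
    return tuple(sum(d[i] for d in diffs) / n for i in range(3))

def apply_correction(points, correction):
    """Adjust the second recorder's coordinates into the reference frame."""
    return [tuple(p - c for p, c in zip(pt, correction)) for pt in points]

ref = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0)]
obs = [(0.1, -0.2, 0.05), (1.1, -0.2, 0.05), (0.1, 0.8, 0.05)]
corr = calibration_correction(ref, obs)  # approximately (0.1, -0.2, 0.05)
```

Because the virtual calibration pattern is projected into the scene itself, both recorders measure against the same physical grid, which is what makes this simple differencing meaningful.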
  • An alternative method of calibrating a three-dimensional imaging system using the three-dimensional imaging equipment mentioned above for calibrating optical recorder(s) includes the steps of projecting a calibration pattern at a calibration wavelength on a plane that is tangent to the nearest point of a desired object as measured from the optical recorder; labeling an intersection point P between the calibration pattern and the desired object; positioning the end of a laser light beam operating at the calibration wavelength at the point P; measuring the distance from the point P to the calibration pattern; generating a second calibration pattern at a greater distance from the reference optical recorder; and repeating the steps of labeling, positioning, and measuring when the calibration pattern intersects the desired object.
  • Another alternative method of calibrating a three-dimensional imaging system using the three-dimensional imaging equipment mentioned above which includes at least two optical recorders to be calibrated and two holographic calibration plates placed in the field of view of a respective one of the optical recorders where each of the holographic calibration plates contains the same hologram, includes the steps of positioning the calibration plates relative to each other to approximate a monolithic calibration plate; projecting a calibration pattern in the field of view of a desired object through each of the calibration plates; determining the position of at least three reference points in the vicinity of the desired object relative to each of the optical recorders; determining a corresponding position on the calibration pattern corresponding to each reference point; determining the misalignment of the virtual calibration pattern; determining the correction factors as a function of position of the desired object relative to each optical recorder; and applying the correction factors to each optical recorder.
  • FIG. 1 is a block diagram of a three-dimensional imaging system constructed in accordance with an exemplary embodiment of the present invention
  • FIG. 2 is a schematic diagram of an exemplary embodiment of the three-dimensional display described in FIG. 1;
  • FIG. 3 is a schematic diagram showing that if optical pulses used in FIG. 2 are sufficiently short in duration, then the pulses will spatially overlap essentially at a desired position P1;
  • FIG. 4 is a schematic diagram showing that pulses from several optical sources arriving at point P1 at the same point in time minimize the spatial spread of the overlapping pulses;
  • FIG. 5 is a schematic diagram of an optical mixer moving periodically back and forth along the z axis under the control of display electronics such that optical pulse wave fronts will intersect the optical mixer at different points of successive planes, which provides a mechanism to generate an optical mixer output of the desired optical wavelength (color) at any point in the volume of space that is traversed by the optical mixer;
  • FIG. 6 is a schematic diagram of an implementation of a pulsed optical source in which a laser produces a beam of light which is transformed into a cone of light by a concave lens;
  • FIG. 7 is a schematic diagram of another implementation of a pulsed optical source in which a point source of light emits light which is transformed by a convex lens into extended beams with plane wave fronts;
  • FIG. 8 is a schematic diagram showing how a desired wavelength generator can be shared across three pulsed optical sources
  • FIG. 9 is a schematic diagram depicting an optical mixer constructed from a plurality of smaller optical mixing elements
  • FIG. 10 is a schematic diagram showing that each of the smaller optical mixing elements of FIG. 9 can also include optical mixer sub-elements which are optimized for one of the three primary colors in an RGB display;
  • FIG. 11 is a perspective view of optical mixing elements made from a monolithic non-linear optical material depicted as having a cylindrical shape and another optical mixing element depicted as having a truncated conical shape;
  • FIG. 12A is a side view of an optical mixing element that is a combination of a hemispherical lens, a cylindrical non-linear mixing material, and a cylindrical desired wavelength filter;
  • FIG. 12B is an exploded perspective view of the optical mixing element of FIG. 12A;
  • FIG. 13A is a side view of an optical mixing element that is a combination of a triangular lens, a triangular non-linear mixing material, and a triangular desired wavelength filter;
  • FIG. 13B is an exploded perspective view of the optical mixing element of FIG. 13A;
  • FIG. 14A is a side view of an optical mixing element that is a combination of a pyramidal lens, a rectangular non-linear mixing material, and a rectangular diffuser;
  • FIG. 14B is an exploded perspective view of the optical mixing element of FIG. 14A;
  • FIG. 15A is a side view of an optical mixing element that is a combination of a hemispherical lens, a rectangular non-linear mixing material, and a rectangular desired wavelength filter;
  • FIG. 15B is an exploded perspective view of the optical mixing element of FIG. 15A;
  • FIG. 16A is a side view of an optical mixing element that is a combination of a rectangular non-linear mixing material, a pyramidal lens, a rectangular desired wavelength filter, and a pyramidal optical reflector which allows one or more pulsed optical sources to be positioned on the same side of an optical mixer as a viewer;
  • FIG. 16B is an exploded perspective view of the optical mixing element of FIG. 16A;
  • FIG. 17 is a schematic diagram of a mechanical mechanism for moving a planar optical mixer back and forth
  • FIG. 18 is a schematic diagram showing that an optical mixer can be moved using a rotational motion about the X-axis;
  • FIG. 19 is a schematic diagram of the optical mixer of FIG. 18 in which additional pulsed optical sources are placed on a side of the optical mixer opposite the side to which the original pulsed optical sources direct optical pulses, such that the pulsed optical sources arrive at the point P1 on the optical mixer at the same time;
  • FIG. 20 is a schematic diagram depicting the pulsed optical sources of FIG. 19 in which the pulsed optical sources are successively utilized as the optical mixer rotates clockwise around the X-axis;
  • FIG. 21 is a schematic diagram showing that an optical mixer can have other physical shapes in addition to planar;
  • FIG. 22 is an exemplary embodiment of the three-dimensional image scanning device described in FIG. 1;
  • FIG. 23 is a schematic diagram of an arrangement for a second source of optical pulses which includes a pulsed optical source and a mirror;
  • FIG. 24 is a schematic diagram of an arrangement for a second source of optical pulses in which the mirror of FIG. 23 is omitted and in which the timing of the optical sources is controlled by image scanner electronics;
  • FIG. 25 is a schematic diagram showing that the generation and detection of reflected light from a desired object with high temporal resolution will reveal the depth profile of the desired object as a succession of different "slices" of the desired object are captured within the optics of an optical recorder;
  • FIG. 26 is a schematic diagram depicting the process of separating spatial information from color information
  • FIG. 27 is a schematic diagram showing a CCD array in which some of the pixels can be coated with a filtering material that passes only the spatial wavelengths relating to shape while other pixels pass only the desired wavelengths related to color;
  • FIG. 28 is a schematic diagram of a first exemplary embodiment of the three-dimensional calibration equipment depicted in FIG. 1, in which the three-dimensional calibration equipment includes a light source, a holographic calibration plate, and two or more optical recorders;
  • FIG. 29 is a perspective view of a cubical virtual calibration pattern in the form of a "tinker toy"-like grid projected by a holographic calibration plate in the vicinity of a desired object;
  • FIG. 30 is a perspective view of a cubical virtual calibration pattern of FIG. 29 in which individual intersections of the grid of the virtual calibration pattern are labeled with numerals;
  • FIG. 31 is a perspective view of a cubical virtual calibration pattern of FIG. 29 in which individual intersections of the grid of the virtual calibration pattern are labeled with bar codes;
  • FIG. 32 is a schematic diagram showing how different calibration wavelengths can produce different grids for the virtual calibration pattern of varying density around the desired object, in this case a grid of low density;
  • FIG. 33 is similar to FIG. 32 in which the grid is of medium and high density
  • FIG. 34 is a schematic diagram of a second exemplary embodiment of the three-dimensional calibration equipment depicted in FIGS. 1 and 28, in which a mirror is placed in the field of view of the optical recorders;
  • FIG. 35 is a block diagram depicting the optics/electronics/software for use with the embodiments of FIGS. 28 and 34;
  • FIG. 36 is a schematic diagram of a third exemplary embodiment of the three-dimensional calibration equipment depicted in FIGS. 1 and 28, in which the three-dimensional calibration equipment also includes an optical or mechanical shutter;
  • FIG. 37 is a flow chart depicting the method of calibration to be used in conjunction with the embodiments of the three-dimensional calibration equipment depicted in FIGS. 28, 34, and 36;
  • FIG. 38 is a schematic diagram of a variation of the embodiment of FIG. 28 in which the desired object has affixed to it calibration points which are painted with or reflective of the calibration wavelengths which are captured separately in the optical recorders;
  • FIG. 39 is a schematic diagram showing an optical recorder located outside of the cylindrical holographic calibration plate, a position from which the optical recorder views a cylindrical virtual calibration pattern in its field of view;
  • FIG. 40 is a schematic diagram showing an optical recorder located inside a spherical holographic calibration plate from which position a part of the spherical calibration grid is observable;
  • FIG. 41 is a schematic diagram of a fourth exemplary embodiment of the three-dimensional calibration equipment depicted in FIGS. 1 and 28, which includes three optical sources, a holographic calibration plate with an excitation source, and a laser pointer or laser ranging measurement device;
  • FIG. 42 is a flow chart depicting the method of calibration to be used in conjunction with the embodiment of the three-dimensional calibration equipment depicted in FIG. 41;
  • FIG. 43 is a schematic diagram of a fifth exemplary embodiment of the three-dimensional calibration equipment depicted in FIGS. 1 and 28, in which the three-dimensional calibration equipment employs non-continuous, identical holographic calibration plates instead of a single continuous holographic calibration plate;
  • FIG. 44 is a flow chart depicting the method of calibration to be used in conjunction with the embodiment of the three-dimensional calibration equipment depicted in FIG. 43;
  • FIG. 45 is a schematic diagram of a sixth exemplary embodiment of the three-dimensional calibration equipment depicted in FIGS. 1 and 43, in which the three-dimensional calibration equipment also includes reference points identified by the calibration equipment in the field of view of the desired object;
  • FIG. 46 is a schematic diagram of a seventh exemplary embodiment of the three-dimensional calibration equipment depicted in FIGS. 1 and 44, in which the three-dimensional calibration equipment does not utilize fixed calibration plate(s);
  • FIG. 47 is a schematic diagram of an eighth exemplary embodiment of the three-dimensional calibration equipment depicted in FIGS. 1 and 28, in which the three-dimensional calibration equipment also includes a band stop filter;
  • FIG. 48 is a schematic diagram of a ninth exemplary embodiment of the three-dimensional calibration equipment depicted in FIG. 1, in which the three-dimensional calibration equipment is used in conjunction with a stereoscopic microscope;
  • FIG. 49 is a schematic diagram of a ninth exemplary embodiment of the three-dimensional calibration equipment depicted in FIGS. 1 and 28, in which the three-dimensional calibration equipment also includes a desired object imprinted on a plate in which the desired object is identified using a combination of characteristics.
  • the three-dimensional imaging system 10 includes a three-dimensional display 12, a three-dimensional image scanning device 14, and/or one or more two-dimensional image scanning devices 15, and three-dimensional calibration equipment 16.
  • the three-dimensional image scanning device 14 and/or the two-dimensional image scanning device 15 are employed to generate a three-dimensional image (not shown) to be displayed on the three-dimensional display 12, and to provide data for use by the three-dimensional calibration equipment 16.
  • the three-dimensional calibration equipment 16 calibrates the three-dimensional image scanning device 14 and/or the two-dimensional image scanning device 15.
  • the three-dimensional display 12 and the three-dimensional image scanning device 14 both employ optical pulses and non-linear optics (not shown) to display and record, respectively, a three-dimensional image.
  • voxels are volume pixels, the smallest distinguishable three-dimensional spatial element of a three-dimensional image.
  • Desired wavelengths are the visible wavelengths of light to be displayed to the observer.
  • Cone of acceptance or "acceptance angle" refers to the range of angles of incidence on a non-linear optical material, measured from a line perpendicular to the material, within which an incident optical pulse will produce non-linear optical pulses that exit the non-linear optical material. Incident angles outside the cone of acceptance result in no conversion of light and hence no output light emanating from the non-linear optical material.
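The definition above can be restated as a small numeric check (the function names and the half-angle parameterization are illustrative, not from the patent):

```python
import math

def incidence_angle_deg(direction, normal):
    """Angle (degrees) between a pulse's propagation direction and the
    line perpendicular to the non-linear material."""
    dot = sum(d * n for d, n in zip(direction, normal))
    mags = math.hypot(*direction) * math.hypot(*normal)
    return math.degrees(math.acos(abs(dot) / mags))

def produces_output(direction, normal, acceptance_half_angle_deg):
    """True inside the cone of acceptance (light is converted); False
    outside it (no output light leaves the material)."""
    return incidence_angle_deg(direction, normal) <= acceptance_half_angle_deg
```

For example, a pulse arriving along the normal always converts, while a pulse at 45° to the normal produces no output from a material whose acceptance half-angle is 30°.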
  • the three-dimensional display 12 renders for viewing a three-dimensional image of a desired object 17 at a single optical wavelength (monochrome) or multiple optical wavelengths, including the visible wavelengths (colors).
  • the three-dimensional display 12 can be adapted to display still or moving three-dimensional frames of still or moving desired objects 17.
  • the three-dimensional display 12 includes K pulsed optical sources 16a-16k, an optical mixer 18 including one or more optical mixer elements 20a-20i, one or more optical filters 22, and display electronics 24.
  • the three-dimensional display 12 uses the optical mixer 18 to create the desired wavelengths 26 in a display volume 28 under the control of the display electronics 24.
  • the optical mixer 18 produces the desired wavelengths 26 at specific voxels (not shown) at specific times within the display volume 28.
  • the optical mixer 18 creates three-dimensional images, which includes graphics, in the display volume 28. Three-dimensional images are generated as the optical mixer 18 emits specific desired wavelengths 26 in selected voxels (not shown).
  • the optical mixer 18 converts the optical excitation from the K pulsed optical sources 16a-16k into the desired wavelengths 26 observable by the viewer(s) or viewing equipment 30 through one or more optical filters 22 of the three-dimensional display 12.
  • several of the pulsed optical sources 16a-16k may be operable at a given time, producing optical pulses 32a-32k spatially positioned on one or both sides of the optical mixer 18.
  • three pulsed optical sources 16a-16c emit the optical pulses 32a-32c of frequencies F1, F2, and F3 toward the optical mixer 18.
  • the optical pulses 32a-32c are sufficiently separated to enable triangulation of the overlap of the optical pulses 32a-32c by pulse timing under the control of the display electronics 24.
  • the optical pulses 32a-32c arrive at a desired point P1 (voxel(s)) in the three-dimensional display volume at the same time as the optical mixer 18.
  • the optical pulses 32a-32c are sufficiently short in duration so that they temporally coincide (i.e., spatially overlap; see FIG. 3) essentially at the desired position P1.
  • the optical pulses 32a-32c arrive at different times, and do not overlap.
  • the pulse timing is controlled by the display electronics 24.
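The triangulation-by-pulse-timing scheme can be sketched as follows: given source positions (the geometry below is hypothetical), the display electronics delays each source's firing so that all pulses reach the chosen voxel at the same instant, with the farthest source firing first.

```python
import math

C = 299_792_458.0  # speed of light, m/s

def emission_delays(sources, voxel):
    """Delay (s) each pulsed optical source should wait before firing so
    that all pulses arrive at the target voxel simultaneously."""
    flights = [math.dist(s, voxel) / C for s in sources]
    latest = max(flights)
    return [latest - t for t in flights]  # farthest source gets delay 0

# Illustrative geometry (metres): three sources around the display volume.
sources = [(-0.5, 0.0, 0.0), (0.5, 0.0, 0.0), (0.0, 0.5, 0.0)]
voxel = (0.1, 0.2, 1.0)
delays = emission_delays(sources, voxel)
```

Retiming the pulses while sweeping the voxel coordinates lets the same three sources address every point the moving optical mixer passes through.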
  • the optical pulses 32a-32c interact in a non-linear manner when passing through the optical mixer 18. What emanates from the optical mixer 18 is a set of pulses 34 that includes pulses not only of the original frequencies F1, F2, and F3 but also of the sum and difference frequencies of the optical pulses 32a-32c. This set of pulses 34 is transmitted through the optical filter(s) 22 to the viewer 30, which may be a person or optical equipment such as a camera.
  • the pulsed optical sources 16a-16c are frequency-tunable or frequency-selectable pulsed optical sources.
  • a judicious selection of the frequencies F1, F2, and F3 produces an output pulse 34 from the optical mixer 18 of a specified optical frequency (a color, if a visible frequency).
  • the optical filter 22 can be selected to pass only the sum frequency F1 + F2 + F3 such that the viewer 30 "sees" only a pulse 34 of frequency F1 + F2 + F3 illuminated at point P1.
  • selectively choosing F1, F2, and F3 and selectively choosing the upper and lower cutoff frequencies of the optical filter 22 to be within a selected range of the optical spectrum allows the three-dimensional display 12 to display a specific range of frequencies (e.g., the visible colors).
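To illustrate the frequency bookkeeping (the THz values below are hypothetical examples, not from the patent): the mixer emits the original frequencies plus their sums and differences, and the filter's passband determines which one reaches the viewer.

```python
from itertools import product

def mixer_outputs(freqs):
    """Frequencies leaving the non-linear mixer: the originals plus every
    positive combination +/-F1 +/-F2 +/-F3 (sum and difference terms)."""
    outs = set(freqs)
    for signs in product((1, -1), repeat=len(freqs)):
        f = sum(s * v for s, v in zip(signs, freqs))
        if f > 0:
            outs.add(f)
    return outs

def through_filter(outs, low, high):
    """Frequencies the optical filter passes on to the viewer."""
    return {f for f in outs if low <= f <= high}

# Hypothetical example, frequencies in THz: the sum 150 + 180 + 210 = 540 THz
# lies in the visible band (green); a 530-550 THz filter passes only it.
visible = through_filter(mixer_outputs([150, 180, 210]), 530, 550)
```

Note that the three source frequencies themselves can sit outside the filter's passband (here, in the infrared), so the viewer sees only the mixed product at the voxel, not the excitation beams.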
  • Brightness of the three-dimensional display 12 may be increased by increasing the intensity of any of the optical pulses 32a-32c. Brightness may also be increased at the optical mixer by using an intense "gating" pulse of frequency F4 from a fourth pulsed optical source 16d and choosing F1 + F2 + F3 + F4 as the desired wavelength.
  • moving the optical mixer 18 periodically back and forth in a direction normal to its plane, under the control of the display electronics 24 that also controls the timing of the optical pulses 32a-32k, provides a mechanism to generate an optical mixer output of the desired optical wavelength (color) at any point in the display volume traversed by the optical mixer 18.
  • Objects or their representations are displayed by creating the desired optical frequencies in the optical mixer 18 at points, such as P1, as the optical mixer 18 moves periodically back and forth.
  • the three-dimensional display 12 creates pulses of light in the optical mixer 18 at desired points, such as P1, that then travel through the optical filter 22 to the viewer or viewing equipment 30.
  • Each movement through the display volume 28 can show a different image to the viewer.
  • persistence of vision creates the perception of motion.
  • the wave fronts of the optical pulses 32a-32k will converge and intersect the optical mixer 18 at different points 36 of successive planes 38 (see FIG. 5). Since the pulse repetition rate of the pulsed optical sources 16a-16k is very rapid relative to persistence of vision and the optical mixer 18 moves very slowly, the optical mixer elements 20a-20i can be sequentially excited in an interval during which the optical mixer 18 moves a very small distance. Also, since three pulsed optical sources 16a-16c will converge at only one point in the display space and the proposed implementation of the optical display 12 allows for up to K pulsed optical sources 16a-16k, more than one optical mixer element 20a-20i in the optical mixer 18 may be excited in parallel.
  • Each of the optical mixer elements 20a-20i recursively passes through the display volume 28 such that during the recursion period, at least one of the optical mixer elements 20a-20i passes through each voxel 40.
  • at least one of the optical mixer elements 20a-20i is capable of emitting a specific desired wavelength 26 in any selected voxel 40 observable by the viewer(s) 30 of the three-dimensional display 12, and thus a three-dimensional image is created.
  • the three-dimensional display 12 is capable of producing a desired wavelength 26 at any voxel 40 in the display volume 28.
  • the display electronics 24 triggers pulses from several pulsed optical sources 16a-16k that arrive at the desired one of the optical mixer elements 20a-20i simultaneously.
  • the pulsed optical sources' positioning and timing meet the following conditions: (1) pulsed optical sources 16a-16k are sufficiently outside the display volume 28 to illuminate the desired one of the optical mixer elements 20a-20i; (2) the triggered pulsed optical sources 16a-16k are sufficiently separated so as to enable triangulation, by pulse timing, to excite a desired mixer element (e.g., 20a); and (3) the optical pulses 32a-32c are so short in duration that they overlap essentially at the desired one of the optical mixer elements 20a-20i.
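The triangulation-by-pulse-timing condition (2) amounts to scheduling each source's trigger so that all pulses arrive at the target element at the same instant. A minimal sketch under an assumed geometry (the source positions and target point are hypothetical, and propagation is taken as free-space at c):

```python
# Sketch: compute per-source trigger delays so pulses from spatially
# separated sources overlap at one target point.  Sources farther from
# the target must fire earlier, i.e., get a smaller delay.
import math

C = 299_792_458  # m/s

def trigger_delays(source_positions, target):
    """Delays (s), relative to the farthest source firing at t = 0,
    so that all pulses reach `target` at the same instant."""
    times = [math.dist(p, target) / C for p in source_positions]
    t_max = max(times)
    return [t_max - t for t in times]

# Hypothetical layout: three sources (metres) and one target voxel.
sources = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0)]
target = (0.2, 0.3, 0.5)
print(trigger_delays(sources, target))
```

The farthest source gets zero delay; nearer sources wait proportionally longer, so all three wave fronts coincide at the chosen mixer element.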
  • the display electronics 24 controls the pulsed optical sources 16a-16k to generate optical pulses 32a-32c that excite the desired optical mixer elements 20a-20i with a predetermined combination of optical frequencies that produce the desired wavelength 26 in the desired voxel 40.
  • Each of the pulsed optical sources 16a-16k operates at one or more predetermined optical frequencies.
  • the optical display 12 may be constructed using one or more pulsed optical sources emitting short pulses in combination with one or more pulsed optical sources emitting longer pulses (up to infinite pulse widths, i.e., continuous illumination).
  • Using optical pulses with very short pulse widths (called ultra short optical pulses with pulse widths in the femtosecond to nanosecond range) enables the excitation of specific voxels in the display volume 28 with such high accuracy that sharp images are produced.
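A back-of-envelope check (my own illustration, not a figure from the patent) shows why short pulses localize the excitation sharply: the spatial extent of a pulse is roughly c times its duration, which bounds the size of the overlap region along the propagation direction.

```python
# Spatial extent of an optical pulse: length = c * pulse width.
C = 299_792_458  # m/s

def pulse_length_m(pulse_width_s):
    return C * pulse_width_s

print(pulse_length_m(100e-15))  # 100 fs -> ~3.0e-5 m (about 30 micrometres)
print(pulse_length_m(1e-9))     # 1 ns   -> ~0.3 m
```

A femtosecond pulse thus occupies only tens of micrometres in flight, consistent with the claim that voxels can be excited with high enough accuracy for sharp images, while nanosecond pulses spread over tens of centimetres.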
  • Exemplary methods and devices for generating and controlling ultra short optical pulses are disclosed in U.S. Patent Nos.
  • the optical pulses 32a-32k emanating from the pulsed optical sources 16a-16k with appropriate timing arrive at the desired point, P1 , in the display volume 28 together with the optical mixer 18. Furthermore, the pulsed optical sources 16a- 16k are sufficiently separated from each other such that pulses from the sources arriving at point P1 minimize the spatial spread of these overlapping pulses (see FIG. 4).
  • the three-dimensional display 12 is operable with as many pulsed optical sources as is necessary to excite the optical mixer 18 within the cone of acceptance for each voxel in the display volume 28.
  • the particular subset of the optical mixer elements 20a-20i that are illuminated is determined by the display electronics 24, which selects which of the K pulsed optical sources 16a-16k are used, and which selects pulsed optical source parameters such as optical pulse timing, optical pulse width (including continuous illumination) and intensity.
  • the K pulsed optical sources 16a-16k produce pulsed or non-pulsed desired wavelengths 26 under the control of the display electronics 24 and may also include one or more optical elements (not shown) to direct the light from the pulsed optical sources 16a-16k to a subset of the optical mixer elements 20a-20i. When used, the optical elements shape the output from the pulsed optical sources 16a-16k into a cone of light or into light that is essentially parallel to the illuminated subset of the optical mixer elements 20a-20i.
  • a pulsed optical source (e.g., 16a) includes a desired wavelength generator 42 (e.g., a laser) and an optical element (e.g., a concave lens 48).
  • a desired wavelength generator 50 (e.g., a point source of light) emits light 52 which is transformed by an optical element (e.g., a convex lens 54) into extended beams 56 with plane wave fronts which illuminate a subset of the optical mixer elements 20a-20i.
  • Plane wave fronts of light are created when the desired wavelength generator 50 is located at the focal point 58 of the convex lens 54 or at the focal point of a mirror (not shown).
  • Using parallel beams of light allows for a more simplified arrangement of the pulsed optical sources 16a-16k and ensures that the illumination of each of the optical mixer elements 20a-20i is within their acceptance cone for every voxel in the display volume 28. More particularly, the incident angles of the optical pulses 32a-32k on the optical mixer 18 stay more constant as the mixer moves and may provide a more constant conversion efficiency during the movement of the optical mixer 18.
  • the pulsed optical source 16b creates light with a constant orientation relative to the optical mixer 18 which produces a more constant angle of incidence as the optical mixer 18 traverses the display volume 28 as compared to the pulsed optical source 16a which, because it produces a cone of light, has an angle of incidence relative to the optical mixer 18 that changes as the optical mixer 18 traverses the display volume 28.
  • each individual pulsed optical source 16a, 16b contains its own desired wavelength generator 42, 50, respectively.
  • the wavelength generator 42, 50 can be shared across multiple optical sources.
  • a desired wavelength generator 60 can be shared across three optical sources (e.g., 16a-16c).
  • the desired wavelength generator 60 produces a train of pulses 62 under the control of the display electronics 24.
  • the train of pulses 62 passes through an optical splitter 64 which divides the train of pulses 62 into three trains of pulses 66a-66c.
  • Three optical pulse controllers 68, 70, 72 delay and attenuate a respective train of pulses 66a-66c for optically exciting the optical mixer elements 20a-20i.
  • the train of pulses 62 from the desired wavelength generator 60 may be eliminated completely by the display electronics 24 by having the optical pulse controllers 68, 70, 72 increase the attenuation of the train of pulses 62.
  • This attenuation of the train of pulses 62 can be implemented in one device within each of the beam paths of the desired wavelength generator 60 using a spatial light modulator (SLM - not shown).
  • the optical mixer 18 of FIG. 2 can contain as few as a single optical mixer element, i.e. the optical mixer 18 can be constructed as a single contiguous planar sheet of a material.
  • the optical mixer 18 can also be constructed from a plurality of smaller optical mixing elements 20a-20i across the surface defined by the shape (e.g., planar) of the optical mixer 18.
  • the optical mixing elements 20a-20i are placed at the intersection of the curves 74a-74f which delineate the regular arrangement of the optical mixer 18. While FIG. 9 shows the curves 74a-74f as straight lines for the simplicity of the drawing, in general they are of a shape which enhances the desired frequency conversion properties of the optical mixer 18.
  • the optical mixing elements 20a-20i can be of different shapes, sizes, and composition.
  • the optical mixing elements 20a-20i provide non-linear optical mixing.
  • a first order analysis shows that sum and difference frequencies are created.
  • Desired wavelength(s) 26, which result from the non-linear mixing of the pulsed optical source frequencies, are selected by using one or more optical filters within the optical mixer element 20a to prevent all but the desired wavelength(s) 26 from reaching the viewer(s) 30 of the three-dimensional display 12.
  • the optical mixer element 20a also produces additional optical frequencies by higher order non-linear effects.
  • the optical mixer element 20a may produce a frequency component that is double one of the input frequencies. Although the higher order non-linear frequencies generated have lower conversion efficiencies than the first order frequencies produced, and hence lower intensities, the higher order non-linear frequencies are nevertheless undesirable.
  • the frequencies of the pulsed optical sources 16a-16k are chosen such that for a given set of desired wavelength(s) 26 (e.g., the three primary colors for a RGB display), no second, third or higher order non-linear interaction up to order N will generate a new frequency that is also a desired wavelength.
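The frequency-planning constraint above can be checked by brute force: enumerate every sum/difference combination of the source frequencies up to order N and verify that nothing besides the intended product lands on a desired output. A sketch with illustrative integer frequencies in arbitrary units (the values are assumptions, not from the patent):

```python
# Sketch: find all mixing products (sums/differences up to a given order)
# of the source frequencies that equal a desired output frequency.
from itertools import product

def collisions(source_freqs, desired_freqs, max_order):
    """Coefficient tuples (c1, c2, ...) with 2 <= sum|ci| <= max_order whose
    product sum(ci * Fi) equals a desired frequency.  The intended
    combination appears too; a collision-free plan yields only that one."""
    hits = []
    k = len(source_freqs)
    for coeffs in product(range(-max_order, max_order + 1), repeat=k):
        order = sum(abs(c) for c in coeffs)
        if order < 2 or order > max_order:
            continue  # skip the trivial and too-high-order combinations
        f = sum(c * s for c, s in zip(coeffs, source_freqs))
        if f in desired_freqs:
            hits.append(coeffs)
    return hits

# Sources 5, 7, 11 with desired output {23}: the intended third-order
# product is 5 + 7 + 11 = 23; no other combination up to order 3 collides.
print(collisions([5, 7, 11], {23}, max_order=3))
```

In a real design the same search would run over the actual source frequencies, all desired wavelengths at once, and a higher order N.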
  • if the optical mixer elements 20a-20i incorporate filters that pass only the desired wavelength(s) 26, then unintended higher order frequencies created in the optical mixer elements 20a-20i will not reach the viewer(s) 30.
  • the non-linear optical mixing elements 20a-20i include materials that permit the generation of optical sum and difference frequencies at the desired wavelength(s) 26.
  • Typical materials include inorganic ferroelectric materials such as LiNbO3, LiIO3, Tl3AsSe3 (TAS), Hg2Cl2, KH2PO4 (KDP), KD2PO4 (DKDP or D*KDP), NH4H2PO4 (ADP), Hg2Br2 and BaTiO3; quantum well structure semiconductors that include GaAs, etc.; organic single crystals such as 4-nitrobenzylidene-3-acetamino-4-methoxyaniline (MNBA) and 2-methyl-4-nitroaniline (MNA); conjugated organic high molecular compounds such as polydiacetylene and polyarylene vinylene; and semiconductor grain-dispersed glass comprising CdS, CdSSe, etc. dispersed in glass.
  • FIG. 2 depicts each of the optical mixing elements 20a-20i as being made from a single non-linear optical material that can generate all possible frequencies in the visible spectrum.
  • each of the optical mixing elements 20a-20i in an RGB optical mixer can also include optical mixer sub- elements 76a-76c which are composed of non-linear optical materials optimized for one of three desired wavelengths - red, green, or blue.
  • the sub-elements 76a-76c are arranged and spaced such that no two types of sub-elements (optimized for the same desired wavelength) are adjacent. This arrangement and spacing of the sub- elements 76a-76c minimizes the unintended excitation of nearby sub-elements with the same desired wavelength (type).
  • Small spacing is consistent with physically small implementations of the three-dimensional display 12 that are designed for high resolution. Larger spacing between the sub-elements 76a-76c is consistent with physically larger implementations of the three-dimensional display 12 (e.g., a movie screen size display).
  • Each of the sub-elements 76a-76c is composed of a non-linear optical material that generates a desired wavelength 26 when excited by the pulsed optical sources 16a-16k under the control of the display electronics 24.
  • the sub-element 76a that produces the red desired wavelength includes a filter (not shown) that blocks all wavelengths except the red desired wavelength.
  • the blue and green sub-elements 76b, 76c have blue and green filters, respectively.
  • the optical mixer sub-elements 76a-76c are capable of being simultaneously excited since there are K pulsed optical sources 16a-16k and each of the three sub-elements 76a-76c in one of the optical mixer elements 20a-20i may be excited by three of the K pulsed optical sources 16a-16k.
  • An optical mixer 18 composed of a plurality of optical mixer elements 20a-20i has several advantages over an optical mixer 18 composed of only one element. Arrays of small optical mixer elements 20a-20i are more cost efficient than an optical mixer 18 composed of a single optical mixer element for all except very small displays. For example, a Lithium Niobate crystal, LiNbO3, when used as a non-linear optical mixer element, is currently very difficult to produce in sizes beyond tens of centimeters on a side and becomes more expensive as the size of the crystal increases. Discrete optical mixer elements 20a-20i can be designed with spacing between the elements and therefore the unintended excitement of an optical mixer element 20a-20i by optical excitement of an adjacent optical mixer element 20a-20i is reduced by this inter-element spacing.
  • the conversion efficiency for the non-linear materials which make up the optical mixer elements 20a-20i varies by material type and other design parameters such as size, shape and orientation relative to the pulsed optical sources 16a-16k.
  • the optical mixer sub-elements 76a-76c are independently optimized for each of the desired wavelengths 26 used in the three- dimensional display 12. For example, in an optical mixer 18 that use three desired wavelengths 26 to produce three primary colors in an RGB display, the optical mixer sub-element 76a can be optimized for the generation of red desired wavelengths; the optical mixer sub-element 76b can be optimized for the generation of blue desired wavelengths; and the optical mixer sub-element 76c can be optimized for the generation of green desired wavelengths.
  • the design parameters such as type of non-linear material, cross sectional area, thickness, size, acceptance angle, spectral acceptance, walk-off angle, etc. can each be independently optimized for each desired wavelength 26 to achieve a desired conversion efficiency and cost, and to equalize the maximum intensity produced by each type of optical mixer sub-element.
  • Optimizing the optical mixer sub-element design by choosing the design parameters for each desired wavelength permits an equalization of the peak intensity for each desired wavelength relative to the viewer 30 of the three-dimensional display 12.
  • the conversion efficiency of one or more optical mixer elements 20a-20i is a measure of the intensity of the desired wavelength 26 generated by an optical mixer element relative to the excitation of the element by the pulsed optical sources 16a-16k. Improving the conversion efficiency increases the intensity of the desired wavelength(s) 26 transmitted to the viewer(s) 30 of the three-dimensional display 12. The conversion efficiency increases with non-linear mixer length. Conversion efficiency increases with the area of the non-linear mixer on which the optical pulses 32a-32k of the pulsed optical sources 16a-16k impinge. The conversion efficiency increases with increasing excitation level up to a fixed limit (damage threshold) at which power conversion efficiency decreases. The conversion efficiency increases with phase matching (to be discussed below). When an optical mixer element includes a lens to focus the light onto the non-linear mixing material, the local intensity of the light increases, thereby generating a higher electric field and increasing the conversion efficiency.
  • the peak intensity of the output of a given optical mixer element (e.g. 20a) for a given fixed pulse width(s) of the optical pulses 32a-32k emanating from the pulsed optical sources 16a-16k and exciting the discrete elements 20a-20i is varied by adjusting the power output level of one or more of the pulsed optical sources 16a-16k which excite the element (e.g., 20a).
  • the average power output level is reduced when the pulse width of one or more of the pulsed optical sources 16a-16k which is exciting the element 20a is shortened. Very high pulse rates are possible when the pulse widths are very short.
  • the pulse rate of the pulsed optical sources 16a-16k can range from megabit to multi-gigabit rates, thus illustrating that the optical mixer elements 20a-20i can be excited many times by the pulsed optical sources 16a-16k in the time that an optical mixer element, e.g., 20a, takes to move through a voxel.
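The rough numbers behind this claim are easy to verify. The voxel depth, mixer sweep speed, and repetition rate below are illustrative assumptions, not values stated in the patent:

```python
# Sketch: how many excitation opportunities an optical mixer element has
# while it traverses a single voxel.
def pulses_per_voxel(voxel_depth_m, mixer_speed_m_s, pulse_rate_hz):
    transit_time = voxel_depth_m / mixer_speed_m_s  # seconds inside the voxel
    return transit_time * pulse_rate_hz

# A 1 mm deep voxel, a mixer sweeping at 10 m/s, and a 1 GHz pulse rate:
print(round(pulses_per_voxel(1e-3, 10.0, 1e9)))  # ~100000 opportunities
```

Even with these modest mechanical numbers, each voxel can be excited tens of thousands of times per traversal, which is the margin that allows many voxels to be served sequentially.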
  • the optical mixer 18 may use three different types of optical mixing elements 20a-20i to produce the output frequencies F1, F2 and F3.
  • the design parameters of the optical mixing elements 20a-20i will differ in order to achieve higher conversion efficiencies or more cost effective conversion efficiencies at the output frequencies F1, F2 and F3.
  • physical design parameters are appropriately chosen, such as the cross sectional area upon which the optical pulses 32a-32k impinge, or the thickness of the optical mixing elements 20a-20i.
  • Other design parameters to achieve equalization include phase-matching type and angle, damage threshold, acceptance angle, spectral acceptance, crystal size, walk-off angle, group velocity mismatching, and temperature acceptance.
  • an optical mixing element e.g. 20a
  • another optical mixing element e.g. 20b
  • the shape or size of the optical mixing elements 20a-20i in the optical mixer 18 are chosen to compensate for conversion efficiency variations by frequency or position of the optical mixer 18 in the display volume 28, power output variations of pulse sources by frequency, or other attenuation and losses in the system.
  • Two optical frequencies (colors) F1 and F2 impinge upon the cylindrical optical mixing element 20a and one of the frequencies produced in the cylindrical optical mixing element 20a is the sum of these frequencies F1 + F2.
  • Conversion efficiency improves as the diameter of the cylindrical optical mixing element 20a is increased because a greater amount of light enters the cylindrical optical mixing element 20a. Conversion efficiency also improves as the length of the cylinder increases.
  • one end of an optical mixing element e.g. 20c, may be coated with a reflective substance to give it a mirrored surface so that the input optical pulses 32c and the resulting nonlinear output pulses 34 of different frequencies exit from the same end of the optical mixing element 20c that the input optical pulses 32c entered. Traveling through the optical mixing element 20c twice further increases the conversion efficiency.
  • the orientation of the optical mixer elements 20a-20i relative to the optical excitation by the pulsed optical sources 16a-16k is a critical parameter relative to the intensity of the desired wavelengths 26 generated in the optical mixer element and transmitted to the viewer(s) 30 of the three-dimensional display 12.
  • the conversion efficiency of the optical mixer elements 20a-20i of the optical mixer 18 among other properties is dependent upon phase mismatching and hence on the alignment of the incident optical energy relative to the structure of the optical mixer 18.
  • Phase-matching can be implemented by angle tilting, temperature tuning, or other methods. Angle tilting is the most commonly used method of obtaining phase-matching.
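The sensitivity of conversion efficiency to phase matching follows the standard first-order result for collinear mixing in a crystal of length L: efficiency scales as sinc²(Δk·L/2), where Δk is the phase mismatch that angle tilting or temperature tuning drives toward zero. This is textbook nonlinear optics rather than a formula stated in the patent; the numeric values below are illustrative.

```python
# Sketch: relative conversion efficiency vs phase mismatch dk (rad/m)
# for a crystal of length L (m), using the sinc^2 dependence.
import math

def relative_efficiency(dk, length):
    """sinc^2(dk * L / 2); equals 1.0 at perfect phase matching (dk = 0)."""
    x = dk * length / 2.0
    if x == 0.0:
        return 1.0
    return (math.sin(x) / x) ** 2

print(relative_efficiency(0.0, 1e-3))  # perfectly matched
print(relative_efficiency(2e3, 1e-3))  # dk = 2000 rad/m in a 1 mm crystal
```

Because the argument scales with L, longer crystals convert more efficiently when matched but tolerate a proportionally narrower range of angles, which is why the moving mixer's changing incidence angle matters so much.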
  • the orientation of each of the optical mixer elements 20a-20i relative to each of the excitation pulsed optical sources 16a-16k is further complicated because the optical mixer 18 is recurrently moving through the display volume 28; therefore, the angle of excitation of an optical mixer element 20a-20i by a particular combination of pulsed optical sources 16a-16k changes significantly and the conversion efficiency changes accordingly.
  • the three-dimensional display 12 uses alternative pulsed optical sources 16a-16k to maintain optimal conversion efficiency by utilizing the display electronics 24 to select the optimally positioned pulsed optical sources 16a-16k for each of the optical mixer elements 20a-20i and for each set of positions of the optical mixer 18 as it moves recurrently in the display volume 28.
  • the logic employed by the display electronics 24 continually changes the combination of pulsed optical sources 16a-16k that are exciting each of the optical display elements 20a-20i to produce the desired wavelength(s) 26 in each voxel, at P1 in FIG. 1.
  • FIGS. 9-11 depict the optical mixer elements 20a-20i as being constructed from non-linear mixing materials.
  • the optical mixer elements 20a-20i can include additional elements to enhance the production and transmission of the desired wavelengths 26.
  • FIGS. 12A, 12B show a hemispherical lens 78 used in combination with a cylindrical non-linear mixing material 80 and a cylindrical desired wavelength filter 82.
  • FIGS. 13A, 13B show a triangular lens 84 used in combination with a triangular non-linear mixing material 86 and a triangular desired wavelength filter 88.
  • the triangular lens 84 need not be symmetric.
  • FIGS. 14A, 14B show a pyramidal lens 90 used in combination with a rectangular non-linear mixing material 92 and a rectangular diffuser 94.
  • FIGS. 15A, 15B, which depict the preferred embodiment, show a hemispherical lens 96 used in combination with a rectangular non-linear mixing material 98 and a rectangular desired wavelength filter 100.
  • the optical display 12 is not limited to using those shapes or combinations of shapes of lenses, non-linear mixing materials, and desired wavelength filters/optical diffusers depicted in FIGS. 12A-15B. In any of the embodiments of FIGS. 12A-15B, optical diffusers can be used in place of or in addition to the desired wavelength filters and vice versa.
  • the lenses adjust the angle of incidence of light from the pulsed optical sources 16a-16k relative to the optical axes of the non-linear mixing materials (e.g. the material 80).
  • the addition of the lenses focuses light onto the non-linear mixing materials (e.g., the material 80), which increases the local intensity of the optical excitation and the electric field within the non-linear mixing materials, resulting in higher conversion efficiency.
  • the desired wavelength filters ensure that, for all the pulsed optical source frequencies incident upon the non-linear mixing materials (e.g., the material 80) and for all the new wavelengths generated within the optical mixer by non-linear optical interaction, only the desired wavelength(s) 26 are passed to the viewer(s) 30.
  • the non-linear optical interaction occurs in the non-linear mixing materials (e.g., the material 80) for light from the pulsed optical sources 16a-16k that impinges thereupon within the cone of acceptance range of angles. This range is dependent upon certain parameters of the non-linear mixing materials (e.g., the material 80), such as thickness and type of non-linear material.
  • if the pulsed optical sources 16a-16k are positioned too far apart from each other, there will be position points for the optical rays from the pulsed optical sources 16a-16k that, upon passing through the lenses (e.g., the lens 78), will be directed at such a large angle with respect to the peak conversion optic axis of the non-linear mixing materials (e.g., the material 80) that incident light from the pulsed optical sources 16a-16k is not converted.
  • with the lenses (e.g., the lens 78), spacing between pulsed optical sources is limited only by the aperture of the lenses, rather than being determined by the acceptance angle of the non-linear mixing materials (e.g., the material 80).
  • the size of the lenses (e.g., the lens 78) can be made relatively large compared to the size of the non-linear mixing materials (e.g., the material 80).
  • the optical diffusers (e.g., the diffuser 94) greatly reduce the directional intensity variations of the desired wavelength(s) 26 as they exit the optical mixer materials (e.g., the material 92).
  • the desired wavelength filters may be omitted when the non-linear mixing materials (e.g., the material 80) are designed to selectively enhance specific design wavelength(s) 26 when excited by the pulsed optical sources 16a-16k.
  • the peak conversion efficiency varies by orientation as a function of the desired wavelength produced.
  • the orientation of the excitation by the pulsed optical sources 16a-16k is chosen to correspond to the orientation within the non-linear mixing materials (e.g., the material 80) along which the conversion of the desired wavelength(s) 26 is maximized.
  • each desired wavelength orientation has a different cone of acceptance in the non-linear mixing materials (e.g., the material 80)
  • choosing the positions of pulsed optical sources 16a-16k so that the optical pulses 32a-32k emitted are incident upon the non-linear mixing materials (e.g., the material 80) within their cone of acceptance yields only the desired wavelength(s) 26.
  • the optical mixer elements 20a-20i need only contain one non-linear mixing material (e.g., the material 80) comprising a single sub-element, rather than a separate sub-element for each desired wavelength.
  • each desired wavelength produced exits the non-linear mixing material (e.g., the material 80) at a different angle corresponding to the orientation along which the conversion of the desired wavelength 26 is maximized.
  • the direction of peak intensity of light exiting from the non-linear mixing material varies according to the desired wavelength 26.
  • the use of an optical diffuser 94 in place of a desired wavelength filter greatly reduces the directional intensity variations of the desired wavelength 26 observed by the viewer 30.
  • Dynamic equalization of the desired wavelengths 26 is implemented by adjusting the peak power and pulse width of the optical pulses 32a-32k and by choosing conversion efficiency via the selection of alternative pulsed optical sources 16a-16k.
  • the optical mixer elements 20a-20i of FIGS. 16A, 16B include a rectangular non-linear mixing material 102, a pyramidal lens 104, a rectangular desired wavelength filter 106, and an optical reflector 108 which allows one or more of the pulsed optical sources 16a-16k to be positioned on the same side of the optical mixer 18 as the viewer 30.
  • the pulsed optical sources 16a, 16b provide excitation of the rectangular non-linear mixing material 102 in the direction of the viewer 30.
  • the pulsed optical source 16c is on the opposite side of the non-linear mixing material 102 and the direction of excitation from the pulsed optical source 16c is reversed to coincide with the direction of pulsed optical sources 16a, 16b.
  • In FIG. 17, a simple mechanism for moving a planar optical mixer 18 back and forth is depicted.
  • the corners 110a-110d of the optical mixer 18 are supported by tracks 112a-112d which maintain a constant orientation relative to the x-y plane.
  • a rotational source 114 synchronized to the optical pulse generators 16a-16k turns an idler drive gear (not shown) which in turn is connected to top and bottom gear drives (not shown), respectively.
  • top gear drive As the top gear drive turns, linear motion is transferred to the top of the optical mixer 18 by a top gear arm 116.
  • the bottom gear drive linear motion is transferred to the bottom of the optical mixer 18 by a bottom gear arm 118.
  • the optical mixer 18 and its moving mechanism are operable over a range of atmospheric pressures.
  • the optical mixer 18 and its moving mechanism can be placed in a vacuum to minimize the air resistance when moving the optical mixer 18 at high speeds.
  • the optical mixer 18 is depicted in FIG. 2 as moving back and forth in one dimension along the Z-axis. Referring now to FIG. 18, the optical mixer 18 is operated using a rotational motion about the X-axis.
  • the pulsed optical sources 16a, 16b, ..., 16i are positioned on one side 120 of the optical mixer 18 and the pulsed optical sources 16i+1, 16i+2, ..., 16k are placed on the other side 122 of the optical mixer 18. Positioning some of the pulsed optical sources 16i+1, 16i+2, ...
  • the display electronics 24 selects the combinations of pulsed optical sources 16a-16k that non-linearly combine in the optical mixer elements 20a-20i, e.g., the optical mixer element at point P1, which produce the desired wavelengths.
  • the optical filter 22 blocks all wavelengths other than the desired wavelengths 26 produced in the optical mixer elements 20a-20i from reaching the viewer(s) 30 or viewing equipment.
  • if the pulsed optical source 16k is to be placed on the side 120 of the optical mixer 18 opposite the side 122 to which the other pulsed optical sources 16a-16c direct optical pulses, it is preferable that the pulsed optical source 16k generates a pulse 32k that arrives at the plane of the optical mixer 18 at the same time as the axis of the pulsed optical source 16k is aligned with the perpendicular to the optical mixer 18. It is also desirable to coat the side 122 of the optical mixer 18 with a partially transparent and partially reflective material, similar to a half-silvered mirror, and/or to construct the optical mixer 18 of a thin material that will diffuse light.
  • the side 120 of the optical mixer 18 may also include an optical filter to selectively transmit only the desired sum and difference products of the impinging optical pulses 32a-32k.
  • multiple pulsed optical sources 16a-16k are successively utilized as the optical mixer 18 rotates clockwise around the X-axis.
  • This sequential use of pulsed optical sources 16a-16k maintains the alignment of the axes of the direction of the impinging optical pulses 32a-32k with the changing cone of acceptance of the optical mixer 18 and thus helps ensure that the optical pulses 32a-32k impinge on the optical mixer 18 at an angle required for the necessary conversion efficiency.
  • FIG. 21 shows an implementation of the optical mixer 18 that is not planar.
  • the optical mixer 18 has a surface whose shape is known to the display electronics 24.
  • the display electronics 24 selects changing combinations of pulsed optical sources 16a-16k that non-linearly combine in the optical mixer elements 20a-20i, to produce the desired wavelengths at desired voxels.
  • the display electronics 24 stores the alternative possible combinations of the pulsed optical sources 16a-16k which are capable of producing the desired wavelength 26 in each voxel as lists of predetermined pulsed optical source combinations.
  • the display electronics 24 also stores the predetermined lists of alternative possible combinations of pulsed optical sources 16a-16k which achieve different levels of conversion efficiencies.
  • the logic of the display electronics 24 uses these predetermined lists to select appropriate combinations of the pulsed optical sources 16a-16k for each voxel in the three-dimensional image generated by the three-dimensional display 12.
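The voxel-to-source lookup described above can be sketched as an ordered table. Everything here is an assumed illustration: the voxel key, the source names, and the efficiency numbers are hypothetical, and "busy" stands in for whatever availability constraint the real electronics would track.

```python
# Sketch: per-voxel lists of pulsed-source combinations, ordered
# best-conversion-efficiency first; pick the best available combination.
COMBOS = {
    # voxel id -> [(efficiency, (source ids, ...)), ...] sorted best-first
    (3, 5, 7): [(0.92, ("16a", "16b", "16c")),
                (0.71, ("16d", "16e", "16f"))],
}

def choose_sources(voxel, busy):
    """Return the highest-efficiency combination with no busy source,
    or None if every stored combination is blocked this cycle."""
    for eff, sources in COMBOS.get(voxel, []):
        if not any(s in busy for s in sources):
            return sources
    return None

print(choose_sources((3, 5, 7), busy={"16b"}))  # falls back to the second combo
```

Precomputing these lists per voxel is what lets the electronics swap in alternative source combinations as the mixer's position, and hence each combination's conversion efficiency, changes.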
  • the three-dimensional image scanning device 14 is adapted to take three-dimensional frames of still or moving desired objects 134.
  • the three-dimensional image scanning device 14 includes a pulsed optical source 124, a second source of optical pulses 126, an optical mixer 128, an optical recorder 130 (e.g., a two-dimensional camera), and image scanning device electronics 132 for controlling the coordination of the pulsed optical source 124, the second source of optical pulses 126, and the optical recorder 130.
  • the pulsed optical source 124 illuminates the desired object 134 using optical pulses 136 of frequency F1, which are of the same type as used in the three-dimensional display 12, and which can include ultra short optical pulses.
  • the optical pulses 136 reflect from the desired object 134 and impinge on the optical mixer 128.
  • the optical mixer 128 and pulsed optical source 124 can be constructed from the same materials and have the same geometry as those used in the optical display 12.
  • the second source of optical pulses 126 can take on at least two forms to be described below with reference to FIGS. 23, 24.
  • the output of the second source of optical pulses is optical pulses 138 (of the same type as the optical pulses 136) of frequency F2, which impinge on the optical mixer 128.
  • the optical pulses 138 have controlled time delays with respect to the optical pulses 136.
  • the optical pulses 136, 138, arriving and temporally overlapping at the optical mixer 128, interact in a non-linear fashion when passing through the optical mixer 128 as previously described for the optical mixer 18 associated with the three-dimensional display 12.
  • What emanates from the optical mixer 128 is a set of pulses 140 that include pulses of not only the original frequencies F1 and F2, but also pulses of the sum and difference frequencies of the optical pulses 136, 138.
  • This set of pulses 140 are transmitted to the optical recorder 130, which can include, for example, a visible wavelength and/or infra-red camera.
  • the image scanning device electronics 132, which can include a processor, are incorporated within or external to the optical recorder 130, and construct a complete three-dimensional representation of the object (to be discussed below with reference to FIG. 25) from a succession of sets of pulses 140, after appropriate filtering to extract the desired pulses 140.
  • the second source of optical pulses 126 includes a pulsed optical source 142 and a mirror 144.
  • the pulsed optical source 124 is of frequency F1 and the pulsed optical source 142 is of frequency F2.
  • the pulsed optical source 142 transmits the optical pulses 138 of frequency F2 toward a mirror 144 at approximately the same time that the pulsed optical source 124 illuminates the desired object 134 using optical pulses 136 of frequency F1.
  • the mirror 144 reflects the optical pulses 138 towards the optical mixer 128.
  • the plane of the mirror 144 is approximately parallel to the plane of the optical mixer 128 (i.e., the x-y plane).
  • the desired spatial/temporal time delay is implemented by moving the mirror 144 back and forth along the axis perpendicular to the planes of the mirror 144 and the optical mixer 128 (i.e., the z dimension). In the second arrangement of FIG. 24, the mirror 144 is omitted.
  • the image scanning device electronics 132 are temporally linked to the pulsed optical sources 124, 142 in such a way that the optical pulses 138 are delayed a predetermined amount from the optical pulses 136.
  • each of the optical pulses 136 combines with a respective one of the optical pulses 138 at a different time in the optical mixer 128. Since the optical pulses 136, 138 can be of very short duration, they interact with each other in the optical mixer 128 over a small area and over a short interval of time. Thus the successive pulses 140 emanating from the optical mixer 128 will have both high temporal and spatial resolution. Because depth resolution is equivalent to temporal resolution, generating and detecting the reflected light with high temporal resolution therefore reveals the depth profile of the desired object 134 with high spatial resolution.
  • Each of the optical pulses 138 is delayed by a tight succession of increasing time intervals relative to the optical pulses 136, which are captured as a succession of different "slices" 146 of the desired object 134 within the optics of the optical recorder 130. With one delay, the front edge of the desired object 134 is captured, followed by images moving toward the rear of the desired object 134.
  • the time delay between the two sets of optical pulses 136, 138 thus associates each captured slice with a selected range, i.e., distance from the pulsed optical source 142.
  • the individual time delays of the optical pulses 138 do not have to be in ascending or descending order, but can be in any order so long as the image scanning device electronics 132 associated with the optical recorder 130 "combines" each of the resulting "slices" 146 of the desired object 134 to produce a composite three-dimensional image from a single perspective of the desired object 134.
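The slice geometry described above follows from round-trip timing: a gate delay τ selects reflections from depth z = cτ/2, so the delays may be applied in any order as long as each slice is tagged with its delay. A minimal sketch, assuming the source and mixer are co-located:

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def delay_to_depth(delay_s):
    """Round-trip gating: a reflection gated at delay tau originates from
    depth z = c * tau / 2 (co-located source and mixer assumed)."""
    return C * delay_s / 2.0

def depth_slices(delays_s):
    """Map a sequence of gate delays (in any order) to slice depths."""
    return [delay_to_depth(t) for t in delays_s]
```

For example, a 100 fs gate step corresponds to a depth resolution of roughly 15 micrometers, which is why shorter pulses directly translate into finer depth profiles.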
  • a three-dimensional image of the desired object 134 is obtained by generating ranges for a succession of two-dimensional images that encompass all external views of the three-dimensional desired object 134. This is accomplished by either rotating the desired object 134, or moving the optical recorder 130, the optical mixer 128, and optionally the pulsed optical source 124 and the second source of optical pulses 126 around the desired object 134. Alternatively, several sets of the optical recorder 130, the optical mixer 128, the pulsed optical source 124, and the second source of optical pulses 126 can be stationed around the desired object 134 and coordinated by the image scanning device electronics 132 to capture a complete three-dimensional representation of the desired object 134 and provide data to construct a three-dimensional model of the desired object 134.
  • the optical frequencies of the pulsed optical source 124 and the second source of optical pulses 126 can be chosen such that only the desired sum and/or difference frequencies generated in the optical mixer 128 are within the range that are processed by the optical recorder 130.
  • the frequencies F1 and F2 can be chosen to be optical frequencies below the frequency processed by the optical recorder 130 while the sum of these frequencies is within the processing range of the optical recorder 130 (i.e., optical up/down conversion).
  • some of the pixels 154 can be coated with a filtering material that passes only the spatial wavelengths relating to shape, while the other pixels 156 can be coated with a filtering material that passes only the desired wavelengths related to color.
  • the pixels 154 capture only the spatial information, while the pixels 156 capture the color and texture information.
  • Another alternative is for the optical recorder 130 to selectively split the acquired color and spatial image using wavelength selective filters into spatial wavelengths and desired wavelengths and acquire these wavelengths with two separate CCD arrays, one for capturing spatial information and another for capturing color information.
  • the three-dimensional image scanning device 14 can have enhanced precision, since the time resolution is given strictly by the length of the optical pulses 136, 138 and the optical non-linear process within the optical mixer 128.
  • the depth sensitivity and range can easily be adjusted to needed specifications by the selection of optical pulse length and the precision of the control in the time delay of the optical pulses 138.
  • depth information can be obtained even from desired objects 134 of very low reflectivity, since in the non-linear mixing process the signal strength can be enhanced by use of an intense gating pulse (138).
  • the calibration of the three-dimensional image scanning device 14, especially embodiments using multiple optical recorders 130, is a very time consuming, expensive process when using conventional calibration techniques.
  • the three-dimensional calibration equipment 16 includes a light source 158 and a holographic calibration plate 160 that contains holographic encoded calibration information.
  • Two or more optical recorders 162a-162c are placed on one side of the holographic calibration plate 160, through which the desired object 164 is viewed and one or more virtual calibration pattern(s) 166 are viewed in the vicinity of the desired object 164.
  • a single optical recorder 162a is moved to different positions in space to acquire images of the desired object 164 from different perspectives.
  • the optical recorders 162a-162c can be, for example, two-dimensional or three-dimensional analog or digital cameras.
  • “virtual calibration pattern” refers to a holographic projection viewed in the vicinity of the desired object 164
  • “desired wavelengths” refers to the wavelengths of light reflected from the desired object 164 due to ambient illumination
  • “calibration wavelengths” refers to the wavelengths of light produced by the holographic calibration plate 160 for viewing the virtual calibration pattern.
  • Reflected light from the desired object 164 passes through the holographic calibration plate 160 which is transparent to light of the desired wavelengths 168 (e.g., normal visible light reflected from the desired object 164).
  • the desired wavelengths 168 pass through the holographic calibration plate 160 to the optical recorders 162a-162c.
  • the light source 158 which may be on either side of the holographic calibration plate 160, illuminates the holographic calibration plate 160 with light of one or more calibration wavelengths 170.
  • the light source 158 may produce pulses of coherent light, a beam of coherent light, or may produce incoherent light.
  • Light of the calibration wavelengths 170 excites holographic calibration plate 160 in such a way that the optical recorders 162a-162c "see" a virtual calibration pattern 166 which is made recordable and storable by optical sensors within the optical recorders 162a-162c.
  • the three-dimensional calibration equipment 16 is envisioned to use fixed holographic images or variable plate holograms (e.g. using external inputs to phase modulate the holographic calibration plate 160).
  • the virtual calibration pattern 166 can be, for example, a “tinker toy”-like grid or multidimensional wireframe of arbitrary shape (see FIG. 29) located in the vicinity of, overlapping, or enclosing the desired object 164.
  • the virtual calibration pattern 166 can have any desired shape, such as the shape of a cube, portions of concentric cylinders, portions of concentric spheres, etc., depending on the geometry of the desired object 164 and the field of view of the optical recorders 162a-162c. Individual intersections of the grid of the virtual calibration pattern 166 can be labeled with numerals (see FIG. 30) and/or bar codes (see FIG. 31) to aid in the calibration process.
  • Through movement of the holographic calibration plate 160 or by changing the illumination parameters (e.g., the frequency of the calibration wavelength 170 emitted by the light source 158), alternative calibration information can be generated for processing by the optical recorders 162a-162c.
  • Changing the virtual calibration pattern 166 is accomplished by recording multiple, superimposed holograms at different calibration wavelength(s) 170 or different positions on the holographic calibration plate 160 and then choosing a specific calibration pattern by illuminating the holographic calibration plate 160 by the specific calibration wavelength(s) 170 corresponding to the specific virtual calibration pattern 166 or viewing from a specific position.
  • Different calibration wavelengths 170 can produce, for example, different grids 172, 174, 176 for the virtual calibration pattern 166 of varying density around the desired object 164 (see FIGS.
  • the three-dimensional calibration equipment 177 includes a mirror 178, optical recorders 180a-180c, a light source 182, and a holographic calibration plate 184.
  • a mirror 178 is placed in the field of view of the optical recorders 180a-180c.
  • the light source 182 illuminates the holographic calibration plate 184 at a position out of the field of view of the optical recorders 180a-180c.
  • Light of the calibration wavelength(s) 186 excites the holographic calibration plate 184 such that the virtual calibration pattern 190 is reflected in the mirror 178 and viewed by the optical recorders 180a-180c as if the virtual calibration pattern 190 were superimposed on or near the desired object 188.
  • the mirror 178 is reflective at the calibration wavelengths 186 and transparent at the desired wavelengths 192.
  • images from the desired object 188 pass directly to the optical recorders 180a-180c.
  • This embodiment enables the use of wavelengths in the light source 182 that include the desired wavelengths 192, as the desired wavelengths 192 reflected from the desired object 188 pass directly through the mirror 178 and are not reflected to the optical recorders 180a-180c.
  • the virtual calibration pattern 190 can be produced by calibration wavelength(s) 186 in the desired wavelength region of the electromagnetic spectrum for applications in which the calibration wavelength(s) 186 overlays the desired wavelengths 192.
  • When the calibration wavelength(s) 186 overlay the desired wavelengths 192, both the desired object 188 and the virtual calibration pattern 190 are simultaneously observable by the optical recorders 180a-180c without additional post processing.
  • the calibration wavelengths information is stored or processed separately from the desired wavelength information.
  • Calibration wavelength information and desired wavelength information at overlapping wavelengths can also be temporally separated by enabling the calibration wavelength(s) 186 for short times and synchronizing observation of the calibration wavelength(s) 186.
  • the images reaching the optical recorders 180a-180c of FIG. 34 contain both the desired wavelengths 192 and the calibration wavelength(s) 186.
  • the optical recorders 162a-162c, 180a-180c can contain optics/electronics/software corresponding to elements of the block diagram depicted in FIG. 35.
  • the optical recorders 162a-162c, 180a-180c include optics 198, 200, optional wavelength selective filters 202, 204, and calibration electronics 222, 224, respectively.
  • the calibration electronics 222, 224 can contain a processor (not shown) and output storage 218, 220 for storing information gleaned from the calibration wavelengths 170, 186 and output storage 206, 208 for storing information gleaned from the desired wavelengths 168, 192, respectively.
  • An incoming desired image and calibration image 194, 196 enters each of the optical recorders 162a-162c, 180a-180c, respectively, and is separated into the desired wavelengths 168, 192 and the calibration wavelengths 170, 186 using the additional wavelength selective filters 202, 204, respectively.
  • the desired wavelengths 168, 192 and the calibration wavelengths 170, 186 are processed separately in the calibration electronics 222, 224, as represented by the blocks 206, 208 and 210, 212, respectively.
  • the resulting data are stored in separate parts of memory in the output storage 218, 220 and the output storage 206, 208, respectively.
  • Selective separation of the optical wavelengths can be accomplished in at least three ways.
  • One way is to use the wavelength selective filters 202, 204.
  • Another way is for the calibration electronics 222, 224 to be designed to implement optical-to-electronic conversion such as CCD arrays to separate the wavelengths.
  • the pixel array that provides optical-to-electronic conversion uses a planar arrangement of light-sensitive devices that alternates, in two dimensions, devices whose sensitivity peaks at the desired wavelengths 168, 192 with devices whose sensitivity peaks at the calibration wavelengths 170, 186, respectively.
  • Still another way is for the calibration electronics 222, 224 to be designed to implement optical band pass and band stop filters for selecting the desired wavelengths 168, 192 and the calibration wavelengths 170, 186, respectively.
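The alternating-pixel approach above can be sketched as splitting a single captured frame into two sparse channels. The checkerboard layout of the two pixel types is an assumption for illustration; the patent only specifies that the two sensitivities alternate in two dimensions.

```python
def split_checkerboard(frame):
    """Split a 2-D frame from a sensor whose pixels alternate (checkerboard
    layout, assumed for illustration) between desired-wavelength and
    calibration-wavelength sensitivity. Non-selected sites are None."""
    rows, cols = len(frame), len(frame[0])
    desired = [[None] * cols for _ in range(rows)]
    calib = [[None] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            if (r + c) % 2 == 0:
                desired[r][c] = frame[r][c]   # spatial/desired channel
            else:
                calib[r][c] = frame[r][c]     # calibration channel
    return desired, calib
```

In practice each sparse channel would then be interpolated to full resolution, much as a Bayer mosaic is demosaiced.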
  • the three-dimensional calibration equipment 226 includes an optical or mechanical shutter 228, optical recorders 230a-230c, a pulsed light source 232, and a holographic calibration plate 234.
  • the optical or mechanical shutter 228 is placed between the holographic calibration plate 234 and the desired object 236.
  • the calibration wavelength(s) 238 are generated only for special frames. Here, "frames" means that the recording of the desired wavelengths 240 and the calibration wavelengths 238 is divided into discrete units of time during which successive samples of the desired wavelengths 240 and the calibration wavelengths 238 are captured.
  • Special frames store the calibration wavelengths 238.
  • the special frames can be of two types. In one type, the special frames containing the calibration wavelengths 238 are stored or processed separately from the desired frames containing the desired wavelengths 240. In another type, the special frames are interspersed at a periodic or non-periodic rate between the desired frames.
  • the pulsed optical source 232 is pulsed on with the optical or mechanical shutter 228 closed during the special frames.
  • This arrangement enables the recording of desired and calibration information without wavelength separation.
  • the three-dimensional calibration equipment 226 eliminates the need for wavelength selective filters, since multiple calibration patterns based on calibration wavelengths above and below the desired wavelengths 240 can be generated.
  • the use of a pulsed optical source 232 and the optical or mechanical shutter 228 synchronized to the pulses can be adapted to provide pulsed time code information for synchronizing multiple optical recorders 230a-230c and/or to convey synchronized instructions to the multiple three-dimensional imaging systems (not shown) simultaneously, e.g. for special effects and other camera related functions.
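The interleaving of special frames can be sketched as simple demultiplexing of a frame stream. The fixed period and zero phase below are assumptions for illustration; the patent also allows a non-periodic interspersal.

```python
def demultiplex_frames(frames, period):
    """Separate an interleaved stream where every `period`-th frame
    (indices 0, period, 2*period, ...) is a calibration 'special frame'
    captured with the shutter closed; all other frames are desired frames.
    Period and phase are illustrative assumptions."""
    calibration, desired = [], []
    for i, frame in enumerate(frames):
        (calibration if i % period == 0 else desired).append(frame)
    return calibration, desired
```

The same index convention, shared by the shutter controller and the recorder electronics, is what lets desired and calibration information be recorded without wavelength separation.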
  • the combination of synchronization and holographic calibration/alignment across multiple cameras permits a more cost effective implementation of panoramic cameras such as those that are now implemented mechanically in CineMax® systems and the simplified construction of panoramic three-dimensional imaging systems.
  • the calibration method is as follows: At step 241, a virtual calibration pattern is projected in the field of view of a desired object. At step 242, one of the optical recorders is chosen as the reference. At step 244, if the optical recorder includes an electronic image detector, then at step 246, the coordinates of a coordinate system are assigned in parallel or normal to the pixel array in the optical recorder or in alignment with the virtual calibration pattern.
  • the optical recorder is a non-electronic system
  • the coordinates of a coordinate system are assigned arbitrarily or in alignment with the virtual calibration pattern.
  • the same coordinate system is assigned to all other optical recorders.
  • the differences in the virtual calibration pattern of each optical recorder other than the reference optical recorder are measured.
  • the differences measured are used to calculate the calibration corrections for each optical recorder relative to the reference optical recorder.
  • the calibration corrections are used to compensate the desired images either mechanically or electronically.
  • the methodology for calculating the calibration corrections from the measured differences can be found in U.S. Patent No. 6,822,748 to Johnston et al., which is incorporated herein by reference in its entirety. If there is only one optical recorder, then the method is modified such that the first position of the optical recorder becomes the reference and each subsequent position is treated in the calibration method as an additional optical recorder. Steps 241-256 above are then followed.
  • one of the non-reference optical recorders is mechanically or electronically (i) rotated so that the desired object appears to be twisted, (ii) translated so that the desired object appears to be moved left or right, and/or (iii) scaled (i.e. zoomed) so that the desired object appears to be made smaller or larger (equivalent in a single camera view to moving it nearer or farther away).
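The rotation/translation/scaling correction for a non-reference recorder can be estimated from matched calibration-grid intersections. The sketch below uses a standard least-squares 2-D similarity fit via complex arithmetic; it is a generic construction, not the specific methodology of the cited Johnston et al. patent.

```python
import cmath

def fit_similarity(ref_pts, obs_pts):
    """Least-squares 2-D similarity transform w ~= a*z + b mapping reference
    calibration-grid points (x, y) to their observed positions in another
    recorder. Returns (scale, rotation_radians, offset) where
    a = scale * e^(i*rotation) and offset is a complex translation."""
    z = [complex(x, y) for x, y in ref_pts]
    w = [complex(x, y) for x, y in obs_pts]
    mz = sum(z) / len(z)          # centroid of reference points
    mw = sum(w) / len(w)          # centroid of observed points
    zc = [p - mz for p in z]
    wc = [p - mw for p in w]
    # Closed-form least-squares solution for the complex gain a.
    a = sum(q.conjugate() * p for q, p in zip(zc, wc)) / \
        sum(abs(q) ** 2 for q in zc)
    b = mw - a * mz               # translation term
    return abs(a), cmath.phase(a), b
```

The recovered scale, rotation, and shift are exactly the three correction factors steps 252-256 would then apply to that recorder's images.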
  • the reference numbers for the three-dimensional calibration equipment 16 also stand for the corresponding reference numbers of the three-dimensional calibration equipment 177 and 226.
  • the three-dimensional calibration equipment 16 permits the processing of the calibration wavelengths 170 separately from the desired wavelengths 168. If the desired object 164 has affixed to it calibration points 258 which are painted with or reflective of the calibration wavelengths 170 (see FIG. 38), then the calibration points 258 are captured separately in the optical recorders 162a-162c.
  • Specific points on the desired object 164 are uniquely determined by highlighting them with points painted in materials that are reflective at the calibration wavelengths 170. These can be points where an application pattern (e.g., a grid) is projected onto the desired object 164 using the calibration wavelengths 170, or calibration points 258 attached to the object and painted with materials reflective at the calibration wavelengths 170.
  • a cosmetic surgeon may paint a selected pattern on the patient using paint in a calibration wavelength that is not visible to the human eye to allow the surgeon to view normal visible images and the selected pattern of the patient in post processing for either still or full motion images. Surgeons may also use these techniques in conjunction with thermal images when the desired wavelengths are properly selected for radiological treatment applications where heat is generated by radioactive treatment materials.
  • the virtual calibration pattern can have any desired shape, such as the shape of a cube, portions of concentric cylinders, portions of concentric spheres, etc., depending on the geometry of the desired object 164 and the field of view of one or more optical recorders.
  • the holographic calibration plate 160 can have any desired shape.
  • a single optical recorder 162a or multiple optical recorders 162a-162c can use a rectangular, spherical, cylindrical or arbitrarily shaped hologram, illuminated by light source 158 from either the inside or outside of the holographic calibration plate that surrounds or partially surrounds the desired object 164.
  • the optical recorder 162a moves around the outside of the cylindrical holographic calibration plate 260, through which a cylindrical virtual calibration pattern 262 is viewed.
  • Successive images of the desired object 164 are post processed into a stereoscopic image or into a three-dimensional model of the object using algorithms for color and/or edge matching.
  • These stereoscopic images of the desired object 164 can be used to capture the three-dimensional shape of the desired object 164 by means of calculation by triangulation and the spatial position of uniquely determined points on the desired object 164, e.g., points for a wire frame model. Color, texture and shading are then applied to the wire frame model from the captured images of the desired object 164.
  • a spherical-shaped holographic calibration plate 264 is used in generating a partially spherical calibration hologram 266 of a partially-spherical calibration grid, i.e., a spherical coordinate grid, around one or more optical recorders 162a.
  • the three-dimensional calibration equipment 16 of this configuration is well suited to applications requiring a complete field of view.
  • Multiple optical recorders 162a-162c can be placed within the spherical holographic calibration plate 264 to simultaneously cover all directions.
  • Such a spherical calibration hologram 266 provides the mechanism to overcome the problems of adjacent optical recording devices that record two-dimensional images, which include overlapping fields of view and conformal mapping from the planar segment of the optical recording device to a spherical frame of reference.
  • Using a holographic calibration pattern in the form of a spherical coordinate grid in the field of view of the optical recorders enhances the conformal mapping of the multiple optical recorder outputs.
  • the spherical calibration hologram 266 provides a matching point between the fields of view of adjacent optical recorders and provides a uniform coordinate system across all the optical recorders, which simplifies calibration and alignment of the optical recorders and simplifies conformal mapping across the optical recorders, especially when the alignment between such devices varies due to shock, vibration or acceleration.
  • the spherical holographic calibration configuration of FIG. 40 provides the three-dimensional calibration equipment 16 with the necessary data for generating a panoramic three-hundred sixty degree view for remotely piloted vehicles subject to shock or acceleration.
  • the spherical holographic calibration configuration enables real-time compensation for the two major problems when providing a panoramic three-hundred sixty degree view for remotely piloted vehicles: (i) the continuous alignment of multiple "fisheye" optical recorders which are subject to misalignment by shock or vibration, and (ii) conformal mapping of multiple optical recorders into a three-hundred sixty degree panoramic view.
  • the calibration wavelengths 170 can be chosen to include wavelengths used in collision avoidance systems and thus process such information jointly with optical recorder calibration.
  • the three-dimensional calibration equipment 268 includes a laser pointer 270, optical recorders 272a-272c, a light source 274, and a cylindrical holographic calibration plate 276.
  • the optical recorders 272a-272c separate the desired and calibration wavelengths to enable efficient extraction of the desired object's color in addition to its three dimensional shape.
  • the three optical recorders 272a-272c move around the outside of the cylindrical holographic calibration plate 276 in order to capture and triangulate the position in three-dimensional space of a point P on a desired object 278 which is illuminated by a laser pointer 270 operating at a calibration wavelength 280.
  • the position of point P can be inferred from its position relative to a virtual calibration pattern 282. Since the processing of the object color information using the laser 270 is independent of the processing of the spatial information using the virtual calibration pattern 282, the three-dimensional calibration equipment 268 is capable of capturing the object's shape and color simultaneously. Although a desired object's shape and color can be recorded with a single optical recorder 272a, multiple optical recorders 272a-272c provide increased speed and accuracy.
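The triangulation of point P from two recorders' viewing rays can be sketched with the standard midpoint method; this is a generic construction, not the patent's specific procedure.

```python
def closest_point_between_rays(o1, d1, o2, d2):
    """Midpoint-method triangulation: the point midway between two (possibly
    skew) viewing rays origin + t * direction. Directions need not be unit
    length; the rays must not be parallel."""
    def dot(a, b):
        return sum(x * y for x, y in zip(a, b))
    def sub(a, b):
        return [x - y for x, y in zip(a, b)]
    r = sub(o1, o2)
    a, b, c = dot(d1, d1), dot(d1, d2), dot(d2, d2)
    d, e = dot(d1, r), dot(d2, r)
    denom = a * c - b * b          # zero only for parallel rays
    t1 = (b * e - c * d) / denom   # parameter of closest point on ray 1
    t2 = (a * e - b * d) / denom   # parameter of closest point on ray 2
    p1 = [o + t1 * x for o, x in zip(o1, d1)]
    p2 = [o + t2 * x for o, x in zip(o2, d2)]
    return [(u + v) / 2 for u, v in zip(p1, p2)]
```

With three or more recorders, the pairwise midpoints can be averaged, which is one reason additional recorders improve accuracy.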
  • the simple laser pointer 270 can be replaced with a laser ranging measurement device 284.
  • the laser ranging measurement device 284 provides accurate ranges to any point P on the surface of the desired object 278.
  • a wavelength for the laser ranging measurement device 284 that is a calibration wavelength
  • the point P illuminated by the laser ranging measurement device 284 is observable at the same time as the virtual calibration pattern 282.
  • the embodiment of FIG. 41 uses the virtual calibration pattern 282 to position the laser pointer 270 or the laser ranging measurement device 284 at desired points on the virtual calibration pattern 282.
  • the virtual calibration pattern 282 is chosen to be a grid on a plane that intersects the desired object 278.
  • the laser generated point P is to be positioned on the surface of the desired object 278 at grid points nearest to the spatial positions where the grid intersects the desired object 278.
  • the calibration method for use with the three-dimensional calibration equipment 268 can be summarized as follows: At step 286, a virtual calibration pattern (object) 282 is projected on a plane that is tangent to the nearest point of the desired object 278 as measured from an optical recorder 272a-272c.
  • a subset of virtual calibration pattern intersection points defined by those points closest to where the virtual calibration pattern 282 intersects the desired object 278 is labeled in some numerical order.
  • the laser point P is positioned at each numbered point successively, and position data is collected using measurements relative to the virtual calibration pattern 282. Step 290 simplifies the positioning process: since the laser pointer wavelength is a calibration pattern wavelength, the pointer position P and the virtual calibration pattern 282 are simultaneously observable by the optical recorders 272a-272c, and thus only relative positioning corrections, rather than absolute position corrections of the laser pointing system, are required.
  • an attempt is made to generate another virtual calibration pattern that intersects the desired object 278 at a greater distance from the reference optical recorder.
  • if successful, the preceding steps are repeated; otherwise the method stops.
  • the three-dimensional calibration equipment 296 includes optical recorders 298a-298c, a light source 300, and non-contiguous, identical holographic calibration plates 302a-302c. If the holographic calibration plates 302a-302c were held mechanically parallel, then this configuration would effectively be the configuration of the three-dimensional calibration equipment 16. Misalignment of the holographic calibration plates 302a-302c shifts the calibration pattern up or down, left or right, and/or forward or back. Thus the misalignment may be calculated from a set of reference points in the field of view.
  • the calibration method for use with the three-dimensional calibration equipment 296 can be summarized as follows: At step 306, the separate holographic calibration plates 302a-302c are fixed as rigidly and as closely as possible to the configuration of a single fixed plate. For flat calibration plates, this implies initially fixing the non-contiguous plates as parallel to each other as possible.
  • a virtual calibration pattern is projected in the field of view of a desired object 304.
  • the position of each of the reference points in the vicinity of the desired object 304, if not on the desired object 304 itself, relative to each optical recorder, is determined. Illuminated points on the desired object 304 are a subset of the reference points.
  • Additional fixed calibration points are generated by affixing reflecting and/or absorbing colors to the desired object 304 with wavelengths in the calibration range, thus predetermining a fixed set of points for calibration.
  • the corresponding position on the virtual calibration pattern is determined.
  • the misalignment of the virtual calibration pattern is determined.
  • the correction factors (for example, shift, rotation and scaling in an orthogonal coordinate system), as a function of position of the desired object 304 in three-dimensional space, are determined for each optical recorder.
  • the corrections are applied for each optical recorder to both the virtual calibration pattern and the desired object.
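Applying the per-recorder correction factors to an image can be sketched as a 2-D similarity transform of image points. Rotating about the origin and the order of operations (rotate and scale, then shift) are assumptions for illustration; the patent does not fix a convention.

```python
import math

def apply_correction(points, shift, rotation_rad, scale):
    """Apply per-recorder correction factors (shift, rotation, scaling in an
    orthogonal coordinate system) to 2-D image points. Rotation is about the
    origin; points are (x, y) tuples."""
    cos_t, sin_t = math.cos(rotation_rad), math.sin(rotation_rad)
    out = []
    for x, y in points:
        # Rotate and scale first, then translate.
        xr = scale * (cos_t * x - sin_t * y) + shift[0]
        yr = scale * (sin_t * x + cos_t * y) + shift[1]
        out.append((xr, yr))
    return out
```

The same routine would be run on both the virtual calibration pattern and the desired-object image of each recorder, as the step above requires.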
  • the three-dimensional calibration equipment 318 includes optical recorders 320a-320c, a light source 322, non-contiguous, identical holographic calibration plates 324a-324c, and reference points 326a-326c in the field of view of the desired object 328.
  • the three-dimensional calibration equipment 318 is realizable for applications that do not utilize the patterns of the holographic calibration plates 324a-324c.
  • Specific reference points 326a-326c in the field of view of the desired object 328 are illuminated by a remote source (not shown) or self illuminated with calibration wavelengths.
  • the reference points 326a-326c are separated into calibration wavelengths by the electronics (not shown) and are used to provide calibration across the optical recorders 320a-320c and provide the calibration for compensation of the desired object 328, i.e., in some circumstances, fixed calibration points near or on the desired object may replace the holographic calibration plates.
  • the three-dimensional calibration equipment 330 includes optical recorders 332a-332c, a remote light source 334, and reference points 336a-336c in the field of view of the desired object 338.
  • the three-dimensional calibration equipment 330 is applicable to applications that do not utilize the patterns of a fixed calibration plate, or in applications where specific reference points 336a-336c can be illuminated by a remote light source 334 or self illuminated with calibration wavelengths.
  • a holographic calibration plate is not required; only those elements of the three-dimensional calibration equipment 330 that separate the desired object wavelengths from the calibration wavelengths are required.
  • the calibration corrections are calculated from known points 336a-336c illuminated by the calibration wavelengths for each of the optical recorders 332a-332c and calibration corrections are applied to each object in each of the optical recorders 332a-332c.
  • This simplified calibration and compensation method is more likely to be employed in wide angle images that uniformly capture many fixed calibration sources and where the fixed points 336a-336c are affixed to the desired object 338 or placed near stationary desired objects.
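The simplified scheme above ultimately relies on locating known points from two or more recorder positions. As an illustrative sketch only (the patent does not specify this formula), the distance to a fixed calibration point seen by two parallel recorders follows the standard stereo relation depth = focal_length * baseline / disparity; the names below are assumptions for the example.

```python
def depth_from_disparity(focal_px, baseline, disparity_px):
    """Distance to a point from two parallel recorders, by similar
    triangles: same units as `baseline`; focal length and disparity
    are both in pixels."""
    return focal_px * baseline / disparity_px

# Example: 1000 px focal length, 0.1 m baseline, 10 px disparity
depth = depth_from_disparity(1000.0, 0.1, 10.0)  # 10.0 m
```

With the fixed points 336a-336c at known positions, the measured versus expected depths would drive the per-recorder corrections.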
  • the three-dimensional calibration equipment 340 includes a band stop filter 342, optical recorders 344a- 344c, a light source 346, and a holographic calibration plate 348.
  • the band stop filter 342 is added to the calibration equipment 16, which prevents the illumination wavelength(s) of intruding hologram source(s) 350 from traveling to the region in the vicinity of the desired object 352 via the region in the vicinity of the holographic calibration plate 348 and the optical recorders 344a-344c.
  • Such emanations from the intruding hologram source(s) 350 are undesirable when the three-dimensional calibration equipment 340 is used in a security or a surveillance application.
  • the three-dimensional calibration equipment 354 includes a stereoscopic microscope 356, optical recorders 358a, 358b, a light source 360, and a holographic calibration plate 362.
  • FIG. 48 shows an apparatus which generates virtual holograms for the stereoscopic microscope 356. Calibration of the stereoscopic microscope 356 can be useful when multiple optical paths 362a, 362b are employed.
  • calibration for the stereoscopic microscope 356 includes projecting three-dimensional grids in the field of view of the lenses 364a, 364b of the stereoscopic microscope 356 to assist in counting specimens, such as white blood cells on microscope slides 366, or to ascertain the locations of imperfections in diamonds as a means of grading and identification in the case of theft.
  • the use of three-dimensional holograms permits improved analysis as stereoscopic microscope images are scaled, especially in the depth dimension.
  • the multiple virtual calibration patterns are selectively accessed as the wavelength of illumination of the holographic calibration plate 362 is varied or by movement of the holographic calibration plate 362.
  • one calibration wavelength is used to record a virtual calibration pattern and another calibration wavelength is used to symbolically identify the pattern (grid) intersection, e.g. a bar code or alphanumeric sequence. Additional wavelengths are used for illuminating holographic calibration grids (virtual calibration patterns) with finer grid structures for more accurate determination of three-dimensional spatial positioning.
  • the three-dimensional calibration equipment 368 includes a light source 370 which excites the holographic calibration pattern, a holographic calibration plate 372 that contains holographically encoded calibration information, and two or more optical recorders 374a-374c which acquire images in different optical wavelengths.
  • the light source 370 may be placed on either side of the holographic calibration plate 372.
  • This embodiment projects holographic calibration patterns into the field of view of multiple optical recorders which acquire images in different optical wavelengths in order to capture multiple views referenced to common calibration patterns.
  • the three-dimensional calibration equipment 368 is used for identifying the desired object 378 using a combination of characteristics. For example, these characteristics include unique object identifiers (e.g. finger prints), object geometry (e.g. hand geometry) and/or object substructures (e.g. veins viewed at non-visible wavelengths).
  • the wavelength selective filters 202 and 204 enhance selection of identification characteristics. For example, a band stop filter which only passes the infrared wavelengths is most useful for vein identification.
  • the filters may be selected in real time using opto-acoustic implementations for the wavelength selective filters to select a particular eye or hair color corresponding to a specific desired object(s) 378 for which an identification is desired.
  • the three-dimensional calibration equipment 368 minimizes post processing by the use of band pass and band stop optical filters and/or continuous holographic calibration and correction, which speeds up the matching of the desired object 378.
  • the exemplary embodiment of the three-dimensional calibration equipment 368 can be employed in banking where the underlying identification objectives are to: (1) minimize false acceptances and (2) minimize false rejections.
  • the multi-criteria identification system shown in FIG. 49 can apply low false rejection criteria (e.g. hand geometry) to low risk activities (e.g. balance checks) and apply low false acceptance criteria (finger print, vein structure, etc.) to high cost of failure activities (e.g. funds withdrawal).
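The risk-tiered policy described for the banking example can be expressed as a small lookup. This is a hypothetical sketch: the activity names, criterion names, and the policy function itself are invented for illustration and are not part of the patent.

```python
def required_checks(activity):
    """Map an activity to the biometric criteria it requires
    (illustrative policy, not from the source document)."""
    low_risk = {"balance_check"}  # assumed set of low-risk activities
    if activity in low_risk:
        # Low false-rejection criterion: convenient, rarely locks out users
        return ["hand_geometry"]
    # Low false-acceptance criteria: strict, for costly failures
    return ["fingerprint", "vein_structure"]

checks = required_checks("funds_withdrawal")
```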
  • the present invention has several advantages over the prior art three-dimensional imaging system.
  • the ultra short optical pulses converge at very precise locations on the optical mixer 18, thereby creating high precision, high definition images in the display volume 28.
  • the optical mixer elements 20a-20i can be varied in size, allowing the three-dimensional display 12 to be scalable from very small to very large displays. There is no inherent size limitation.
  • the optical mixer element layout can be optimized to prevent unintended pulse overlap from creating unintended outputs.
  • the optical mixer 18 is capable of being viewed at high refresh rates.
  • Since the three-dimensional display 12 is operable using optical pulses 32a-32k that are in visible and non-visible wavelengths, the three-dimensional display 12 can be used as test equipment for both the three-dimensional image scanner 14 and the two-dimensional image scanner 15.
  • the display electronics 24 allows for (i) the simultaneous excitement of voxels in the display volume, (ii) dynamic adjustment of the intensity of the light produced in a voxel in the display volume 28, and (iii) the selection of optical source combinations for each voxel. These three characteristics of the display electronics 24 achieve the needed conversion efficiency in the optical mixer 18 to produce equalization of intensity throughout the display volume 28.
  • Using intense optical pulses 32a-32k to excite the optical mixer 18 increases voxel intensity in the display volume 28 and thereby increases viewing angles, including angles close to the plane of the display 12. Increased viewing angle is also achieved when the optical mixer elements 20a-20i employ lenses, since the cone of acceptance of each of the optical mixer elements 20a-20i increases over an element without a lens. Using optical mixer elements 20a-20i without filters reduces the cost of the overall optical mixer 18.
  • the three-dimensional image scanner 14 captures shape, color and texture simultaneously in real-time.
  • the three-dimensional calibration equipment herein described provides continuous (real time) calibration of stereoscopic information from still or moving objects. It improves the quality of the stereoscopic images and permits the construction of more accurate three-dimensional models from such images.
  • the three-dimensional calibration equipment can operate in real time, for both optical and electronic based optical recorders.
  • the three-dimensional calibration equipment improves the quality of the stereoscopic images, for example, by more accurately combining information from multiple optical recording devices.
  • the three-dimensional calibration equipment is both an apparatus to use with image capture equipment such as cameras and a method to compensate for the absolute and/or differential distortion within and across stereoscopic imaging equipment used together as a system.
  • the three-dimensional calibration equipment automates the calibration process.
  • Because the virtual calibration pattern is viewed through the same lens system of the optical recorders used for stereoscopic imaging, all static distortion is taken into account.
  • the virtual calibration pattern allows explicit calibration in the desired object's space without the limitations of using real calibration patterns that are moved through the image space.
  • One potential use of the higher quality images obtained from the three-dimensional calibration equipment is to more accurately determine the distance to points on the stereoscopically recorded or viewed desired object, for uses which include the construction of three-dimensional images of the desired object.
  • Because the three-dimensional calibration equipment is capable of providing calibration during normal stereoscopic operation without affecting the desired image, it is usable for continuous calibration, e.g., when the optical recorders are equipped with variable zoom lenses that exhibit differing optical properties as the zoom setting is changed, or when mechanical misalignment of equipment occurs in one or more of the optical paths to the optical recorders during normal operation.
  • the three-dimensional calibration equipment continuously operates without interfering with the optical recorders.
  • By varying the virtual calibration patterns (for example, by varying the hologram frequency when alternative holograms are recorded at different frequencies), several ranges and positions of the image field of view can be continuously recalibrated to compensate for variations in image system distortions (e.g., as the degree of telephoto magnification is changed).
  • a virtual calibration pattern can be produced in any place in the field of view of the optical recorders, and the calibration is done quickly.
  • the desired information and the calibration information are recorded through the same lens of each image recorder as the desired image is captured at the same time as the calibration image, which enables real time or post-processing calibration.
  • the three-dimensional calibration equipment herein described can operate in real-time and at high speeds, which makes the equipment suitable for applications in which traditional calibration equipment is not suitable, such as in the cockpit of an aircraft, where it is not possible to place a traditional calibration object in the field of view. This is especially true when the optical recorders are subject to acceleration or other causes of changes in the optical paths between the desired object and the optical recorders. Having a single frame of reference with known points in the calibration pattern observable in each view simplifies combining multiple camera views to produce a three-dimensional representation of a desired object.
  • the three-dimensional calibration equipment of the present invention simplifies conformal mapping from a planar optical recorder to a non-planar coordinate system: holographic projection enables each optical recorder to view the same desired coordinate system (e.g., spherical), simplifying point matching and transformation.
  • Computer-readable labeling of holographic pattern intersections improves performance, as identification of points in the field of view is simplified compared to the searching and matching operations performed in the prior art.
  • the calibration equipment of the present invention can make use of multiple patterns, slightly shifted patterns, or patterns with finer detail to capture different views of the desired object.
  • the calibration equipment 268 which employs laser ranging measurement device 284 is quicker and more precise than traditional triangulation from two optical recorders.
  • the discrete holographic calibration plates of the calibration equipment 318 are lower in cost and less susceptible to vibrations present in moving vehicles compared to using a single calibration plate spanning all optical recorders.
  • the calibration equipment 354 permits the stereoscopic microscope 356 to capture multiple biometric views from optical recorders with a common virtual calibration pattern.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • Electromagnetism (AREA)
  • Holography (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

A three-dimensional imaging system is disclosed which includes a three-dimensional display (12), three-dimensional calibration equipment (16), and one or more two-dimensional (15) or three-dimensional (14) image scanners. The three-dimensional display (12) uses optical pulses (32a-32k) and a non-linear optical mixer (18) to display a three-dimensional image (17). The three-dimensional image (17) is generated in voxels of the display volume (28) as the optical mixer (18) sweeps the display volume (28). The three-dimensional calibration equipment (16) uses a hologram projected proximal to a desired object (164) to calibrate optical imaging devices (162a-162c) and to simplify the combination of the images from one or more optical imaging devices (162a-162c) into three-dimensional information. The three-dimensional image scanner (14) employs optical pulses (136, 138) and a non-linear optical mixer (128) to acquire three-dimensional images of a desired object (134). The three-dimensional image scanner (14) captures both the shape and color of a desired object (134).

Description

THREE-DIMENSIONAL IMAGING SYSTEM USING OPTICAL PULSES, NON-LINEAR OPTICAL MIXERS AND HOLOGRAPHIC CALIBRATION
Cross-Reference to Related Application
This application claims the benefit of U.S. Provisional Application Serial No. 60/533,384 filed December 30, 2003, U.S. Provisional Application Serial No. 60/533,134 filed December 30, 2003, U.S. Provisional Application Serial No. 60/533,305 filed December 30, 2003, and U.S. Provisional Application Serial No. 60/537,773 filed January 20, 2004, the disclosures of which are incorporated herein by reference in their entirety.
Field of the Invention
The present invention relates to three-dimensional imaging, and, more particularly, to an apparatus for providing a three-dimensional imaging system which includes a three-dimensional display, a two-dimensional and/or three-dimensional scanning device, and calibration equipment to calibrate the scanning device(s) and to simplify the combination of the images from one or more two-dimensional optical imaging devices into three-dimensional information. The display and the scanning device(s) both employ optical pulses and non-linear optics to display and record, respectively, a three-dimensional image.
Background of the Invention
In obtaining and displaying images, more information for the viewer can be extracted if the image is three-dimensional rather than two-dimensional. Three-dimensional images provide the viewer with texture, depth, color and position information. Three-dimensional images are more natural for humans to appreciate.
A volumetric three-dimensional imaging system displays images in a display volume which are acquired by a three-dimensional optical scanner or acquired by one or more two-dimensional optical scanners and converted to a three dimensional representation using holographic calibration. Light rays generated by the display at three-dimensional spatial positions appear as real objects to the viewer.
The prior art for three-dimensional displays includes two classes of displays: stereoscopic displays and swept volume "volumetric" displays. Stereoscopic displays are based on holographic or binocular stereoscopic technology that uses two-dimensional displays to create a three-dimensional effect for the viewer. A shortcoming of stereoscopic displays is that they display spatial information from the perspective of only one viewer. Volumetric displays overcome this shortcoming by creating three-dimensional images in the display volume from voxels, the smallest distinguishable three-dimensional spatial element of a three-dimensional image. Volumetric displays satisfy depth cues such as stereo vision and motion parallax. Motion parallax is the phenomenon that a driver observes from his car when the terrain closer to him moves by faster than the terrain farther away.
In volumetric displays, the display volume is swept by a moving screen. The prior art for flat screen volumetric displays includes the LED arrays described in U. S. Patent No. 4,160,973 to Berlin (the Berlin '973 Patent), the cathode ray sphere displays described in U. S. Patent No. 5,703,606 to Blundell (the Blundell '606 Patent), the laser projection displays described in U. S. Patent No. 5,148,301 to Batchko (the Batchko '301 Patent), and the rotating reflector displays described in U. S. Patent No. 6,302,542 to Tsao (the Tsao '542 Patent). The prior art for curved screens includes the helical screen displays and the Archimedes' Spiral displays described in U. S. Patent No. 3,428,393 to de Montebello (the de Montebello '393 Patent).
There are two classes of holographic displays which utilize lasers or electron beams to generate illumination on moving screens of phosphor materials. The first class, which uses a laser or electron beam to excite a phosphor to emit light, includes the displays described in the Batchko '301 Patent, the Blundell '606 Patent, and U. S. Patent No. 4,871,231 to Garcia (the Garcia '231 Patent). The second class uses intersecting laser beams to generate illumination on a moving screen using two-stage excitement of photoluminescent media as described in U. S. Patent No. 5,943,160 to Downing et al. (the Downing et al. '160 Patent) or photoionization of silicate glasses as described in U. S. Patent No. 6,664,501 to Troitski (the Troitski '501 Patent). The problem in both holographic display classes is the need for a high refresh rate of the voxels for real time displays. For a low resolution display of 500 by 500 by 500 voxels, with 20 refreshes each second to enable persistence of vision, data rates of several hundred megahertz to several gigahertz are required to refresh the display. High resolution (high definition) three-dimensional display data rates are greatly reduced when voxels in the display can be accessed randomly without raster scanning and also when multiple voxels in the display can be accessed in parallel.
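The refresh-rate figure quoted above can be verified with a line of arithmetic:

```python
# 500 x 500 x 500 voxel display, refreshed 20 times per second
voxels = 500 * 500 * 500
refreshes_per_second = 20
voxel_rate = voxels * refreshes_per_second
# 2.5e9 voxel updates per second, i.e. in the "several gigahertz" range
# when each update is driven by a raster-scanned data stream
```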
Another approach to color three-dimensional volumetric displays is the use of rotating light sources. Fiber optics implementations using this approach include the implementations described in U. S. Patent No. 4,294,523 to Woloshuk et al. (the Woloshuk et al. '523 Patent) and in U. S. Patent No. 3,604,780 to Martin (the Martin '780 Patent), which channel light using fiber optics to the moving screen. The rotating light source approach has problems connecting a large number of light sources to a moving screen, and therefore such high definition displays are difficult to construct and maintain in operation. Implementations utilizing light sources on the moving screen, such as the light emitting diodes (LEDs) of the Berlin '973 Patent, result in complex implementations of light emitters and their associated active control electronics, which are also included with the rotating screen.
Yet another approach to color three-dimensional volumetric displays uses projection techniques to display whole two-dimensional images on a moving screen, such as a rotating reflector or a reciprocating screen (see the Tsao '542 Patent). While this approach has the advantage of high simultaneous transfer of image data to the moving screen, its moving mechanism becomes mechanically more complicated as the size, and thus the forces for moving the display, increase. Furthermore, the accuracy of positioning of the projections in specific voxels decreases as the size of the display increases because of the increasing forces on the rotating screen and because the pointing error of the projection beams increases as display size increases.

The calibration of three-dimensional image acquisition equipment which includes optical recorders, especially for those configurations using multiple cameras, is a very time consuming process, as it requires calibration for all the points in the target space. The techniques for combining the information from two-dimensional sources into three-dimensional content are well known (see, for instance, U. S. Patent No. 6,816,629 to Redlich). The techniques for moving physical calibration objects to obtain a sufficient number of points to calibrate the target space are also well known (see, for instance, U. S. Patent No. 6,822,748 to Johnston et al.). Current calibration techniques using real objects moved through all the calibration points of the target space do not provide continuous real time calibration, since the optical properties of the optical recorders change with zoom adjustments, wear, or mechanical deformation under acceleration.
The prior art describes three-dimensional image scanners which capture shapes of objects in the target space (see, for instance, U. S. Patent No. 5,936,739 to Cameron, U. S. Patent No. 5,585,913 to Hariharan, and U. S. Patent No. 6,445,491 to Sucha). Such three-dimensional image scanners are not able to capture both shape and color.
Summary of the Invention
The present invention overcomes the disadvantages and shortcomings of the prior art discussed above by providing a three-dimensional imaging system, which includes a three-dimensional display, an image scanning device for capturing a three-dimensional image to be displayed on the three-dimensional display, and three-dimensional calibration equipment for calibrating the image scanning device. Both the three-dimensional display and the image scanning device employ optical pulses and non-linear optics to display and record, respectively, a three-dimensional image. The image scanning device may be two-dimensional or three-dimensional.
The three-dimensional display includes at least three pulsed optical sources; and an optical mixer movable in a display space, wherein the at least three pulsed optical sources are spatially separated so as to permit pulses emanating therefrom to overlap in a voxel within the display space and intersecting the optical mixer at a selected position, whereby a first-order non-linear interaction of the pulses causes the optical mixer to produce at least one pre-determined wavelength of electromagnetic waves.
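For pulses from spatially separated sources to overlap in a single voxel, each source must fire with a delay that compensates for its distance to the target position. The sketch below illustrates that timing constraint only; it is not taken from the patent, and the function and variable names are assumptions.

```python
import math

C = 299_792_458.0  # speed of light in m/s (vacuum approximation)

def emission_delays(sources, voxel):
    """Per-source emission delays (s) so that pulses from all source
    positions arrive at the voxel position simultaneously."""
    travel_times = [math.dist(s, voxel) / C for s in sources]
    t_latest = max(travel_times)
    # The farthest source fires first (zero delay); nearer ones wait.
    return [t_latest - t for t in travel_times]

# Three sources equidistant from the voxel need no relative delay
delays = emission_delays([(0, 0, 0), (2, 0, 0), (0, 2, 0)], (1, 1, 0))
```

In practice the display electronics would also have to steer which mixer position the overlapping pulses intersect, but the arrival-time condition is the core of voxel addressing.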
The three-dimensional image scanner captures a three-dimensional image of an object. The three-dimensional image scanner includes a first pulsed optical source for generating an illuminating optical pulse at an illumination wavelength, the first pulsed optical source directing the illuminating optical pulse toward the object; a second pulsed optical source for generating a gating optical pulse at a gating wavelength; an optical mixer positioned to receive light reflected from the object at a single wavelength in response to interaction of the illuminating optical pulse with the object, a portion of the illuminating optical pulse and a portion of the gating optical pulse spatially and temporally overlapping each other within the optical mixer, thereby producing a first optical pulse indicative of the shape of the object and a second optical pulse indicative of the color of the object; and an optical recorder having a plurality of pixels responsive to output light emitted by the optical mixer, a first portion of the plurality of pixels having an associated filter which passes the first optical pulse and which blocks the second optical pulse, and a second portion of the plurality of pixels being unfiltered.
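The gating idea above ties shape to timing: light returning from nearer surface patches overlaps an earlier gating pulse than light from farther patches. As a hedged sketch of the underlying round-trip relation (not the patent's specific implementation; names are invented):

```python
def range_from_gate_delay(gate_delay_s, c=299_792_458.0):
    """Distance to a surface patch whose reflection overlaps a gating
    pulse delayed by gate_delay_s: range = c * delay / 2 (round trip)."""
    return c * gate_delay_s / 2.0

# A 2 ns gate delay corresponds to roughly 0.3 m of range
r = range_from_gate_delay(2.0e-9)
```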
The three-dimensional calibration equipment includes acquiring means for acquiring an optical image of a desired object from at least two positions, the acquiring means being either at least two optical recorders placed at least two different positions or a single optical recorder that is moved between several positions. The three-dimensional calibration equipment also includes a holographic calibration plate placed between the acquiring means and the desired object, and a light source of at least one of a set of calibration wavelengths for illuminating the holographic calibration plate so as to project at least one virtual calibration pattern in the field of view of the acquiring means and in the vicinity of the desired object. An alternative embodiment of the three-dimensional calibration equipment includes at least two optical recorders and a light source of at least one of a set of calibration wavelengths for illuminating at least three reference points relative to the desired object to be recorded by the at least two optical recorders.
A method for calibrating the three-dimensional imaging system using the three-dimensional imaging equipment mentioned above includes the steps of projecting a virtual calibration pattern in the field of view of the optical recorder(s); choosing one position of one optical recorder as a reference position; assigning coordinates of a coordinate system relative to either the virtual calibration pattern or the reference position; measuring the differences in the virtual calibration pattern from a second position of the optical recorder(s); calculating calibration corrections relative to the reference position based on the differences measured; and adjusting the optical recorder(s) based on the calibration corrections.
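The "measuring the differences" and "calculating calibration corrections" steps can be illustrated with a minimal example: estimating the shift component of a correction as the mean offset between reference pattern points and the points observed from the second recorder position. This is a sketch under assumed 2D image-plane coordinates, not the patent's procedure, and all names are illustrative.

```python
def shift_correction(reference_pts, observed_pts):
    """Mean (dx, dy) offset that maps observed pattern points back
    onto the reference virtual calibration pattern."""
    n = len(reference_pts)
    dx = sum(r[0] - o[0] for r, o in zip(reference_pts, observed_pts)) / n
    dy = sum(r[1] - o[1] for r, o in zip(reference_pts, observed_pts)) / n
    return (dx, dy)

# Observed grid is uniformly displaced by (+0.5, +0.2)
shift = shift_correction([(0, 0), (1, 0), (0, 1)],
                         [(0.5, 0.2), (1.5, 0.2), (0.5, 1.2)])
```

A complete correction would also solve for rotation and scale (e.g. by least squares over many grid intersections), then apply the result to each recorder as in the final step.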
An alternative method of calibrating a three-dimensional imaging system using the three-dimensional imaging equipment mentioned above for calibrating optical recorder(s) includes the steps of projecting a calibration pattern at a calibration wavelength on a plane that is tangent to the nearest point of a desired object as measured from the optical recorder; labeling an intersection point P between the calibration pattern and the desired object; positioning the end of a laser light beam operating at the calibration wavelength at the point P; measuring the distance from the point P to the calibration pattern; generating a second calibration pattern at a greater distance from the reference optical recorder; and repeating the steps of labeling, positioning, and measuring when the calibration pattern intersects the desired object.
Another alternative method of calibrating a three-dimensional imaging system using the three-dimensional imaging equipment mentioned above which includes at least two optical recorders to be calibrated and two holographic calibration plates placed in the field of view of a respective one of the optical recorders where each of the holographic calibration plates contains the same hologram, includes the steps of positioning the calibration plates relative to each other to approximate a monolithic calibration plate; projecting a calibration pattern in the field of view of a desired object through each of the calibration plates; determining the position of at least three reference points in the vicinity of the desired object relative to each of the optical recorders; determining a corresponding position on the calibration pattern corresponding to each reference point; determining the misalignment of the virtual calibration pattern; determining the correction factors as a function of position of the desired object relative to each optical recorder; and applying the correction factors to each optical recorder.
Further features and advantages of the invention will appear more clearly on a reading of the following detailed description of several exemplary embodiments of the invention.
Brief Description of the Drawings
For a more complete understanding of the present invention, reference is made to the following detailed description of several exemplary embodiments considered in conjunction with the accompanying drawings, in which:
FIG. 1 is a block diagram of a three-dimensional imaging system constructed in accordance with an exemplary embodiment of the present invention;
FIG. 2 is a schematic diagram of an exemplary embodiment of the three-dimensional display described in FIG. 1;
FIG. 3 is a schematic diagram showing that if optical pulses used in FIG. 2 are sufficiently short in duration, then the pulses will spatially overlap essentially at a desired position P1;
FIG. 4 is a schematic diagram showing pulses from several optical sources arriving at point P1 at the same point in time, which minimizes the spatial spread of the overlapping pulses;
FIG. 5 is a schematic diagram of an optical mixer moving periodically back and forth along the z axis under the control of display electronics such that optical pulse wave fronts will intersect the optical mixer at different points of successive planes, which provides a mechanism to generate an optical mixer output of the desired optical wavelength (color) at any point in the volume of space that is traversed by the optical mixer;
FIG. 6 is a schematic diagram of an implementation of a pulsed optical source in which a laser produces a beam of light which is transformed into a cone of light by a concave lens;
FIG. 7 is a schematic diagram of another implementation of a pulsed optical source in which a point source of light emits light which is transformed by a convex lens into extended beams with plane wave fronts;
FIG. 8 is a schematic diagram showing how a desired wavelength generator can be shared across three pulsed optical sources;
FIG. 9 is a schematic diagram depicting an optical mixer constructed from a plurality of smaller optical mixing elements;
FIG. 10 is a schematic diagram showing that each of the smaller optical mixing elements of FIG. 9 can also include optical mixer sub-elements which are optimized for one of the three primary colors in an RGB display;
FIG. 11 is a perspective view of optical mixing elements made from a monolithic non-linear optical material depicted as having a cylindrical shape and another optical mixing element depicted as having a truncated conical shape;
FIG. 12A is a side view of an optical mixing element that is a combination of a hemispherical lens, a cylindrical non-linear mixing material, and a cylindrical desired wavelength filter;
FIG. 12B is an exploded perspective view of the optical mixing element of FIG. 12A;
FIG. 13A is a side view of an optical mixing element that is a combination of a triangular lens, a triangular non-linear mixing material, and a triangular desired wavelength filter;
FIG. 13B is an exploded perspective view of the optical mixing element of FIG. 13A;
FIG. 14A is a side view of an optical mixing element that is a combination of a pyramidal lens, a rectangular non-linear mixing material, and a rectangular diffuser;
FIG. 14B is an exploded perspective view of the optical mixing element of FIG. 14A;
FIG. 15A is a side view of an optical mixing element that is a combination of a hemispherical lens, a rectangular non-linear mixing material, and a rectangular desired wavelength filter;
FIG. 15B is an exploded perspective view of the optical mixing element of FIG. 15A;
FIG. 16A is a side view of an optical mixing element that is a combination of a rectangular non-linear mixing material, a pyramidal lens, a rectangular desired wavelength filter, and a pyramidal optical reflector which allows one or more pulsed optical sources to be positioned on the same side of an optical mixer as a viewer;
FIG. 16B is an exploded perspective view of the optical mixing element of FIG. 16A;
FIG. 17 is a schematic diagram of a mechanical mechanism for moving a planar optical mixer back and forth;
FIG. 18 is a schematic diagram showing that an optical mixer can be moved using a rotational motion about the X-axis;
FIG. 19 is a schematic diagram of the optical mixer of FIG. 18 in which additional pulsed optical sources are placed on a side of the optical mixer opposite the side to which the original pulsed optical sources direct optical pulses, such that the pulsed optical sources arrive at the point P1 on the optical mixer at the same time;
FIG. 20 is a schematic diagram depicting the pulsed optical sources of FIG. 19 in which the pulsed optical sources are successively utilized as the optical mixer rotates clockwise around the X-axis;
FIG. 21 is a schematic diagram showing that an optical mixer can have other physical shapes in addition to planar;
FIG. 22 is an exemplary embodiment of the three-dimensional image scanning device described in FIG. 1;
FIG. 23 is a schematic diagram of an arrangement for a second source of optical pulses which includes a pulsed optical source and a mirror;
FIG. 24 is a schematic diagram of an arrangement for a second source of optical pulses in which the mirror of FIG. 23 is omitted and in which the timing of the optical sources is controlled by image scanner electronics;
FIG. 25 is a schematic diagram showing that the generation and detection of reflected light from a desired object with high temporal resolution will reveal the depth profile of the desired object as a succession of different "slices" of the desired object are captured within the optics of an optical recorder;
FIG. 26 is a schematic diagram depicting the process of separating spatial information from color information;
FIG. 27 is a schematic diagram showing a CCD array in which some of the pixels can be coated with a filtering material that passes only the spatial wavelengths relating to shape while other pixels pass only the desired wavelengths related to color;
FIG. 28 is a schematic diagram of a first exemplary embodiment of the three-dimensional calibration equipment depicted in FIG. 1, in which the three-dimensional calibration equipment includes a light source, a holographic calibration plate, and two or more optical recorders;
FIG. 29 is a perspective view of a cubical virtual calibration pattern in the form of a "tinker toy" like grid projected by a holographic calibration plate in the vicinity of a desired object;
FIG. 30 is a perspective view of a cubical virtual calibration pattern of FIG. 29 in which individual intersections of the grid of the virtual calibration pattern are labeled with numerals;
FIG. 31 is a perspective view of a cubical virtual calibration pattern of FIG. 29 in which individual intersections of the grid of the virtual calibration pattern are labeled with bar codes;
FIG. 32 is a schematic diagram showing how different calibration wavelengths can produce different grids for the virtual calibration pattern of varying density around the desired object, in this case a grid of low density;
FIG. 33 is similar to FIG. 32 in which the grid is of medium and high density;
FIG. 34 is a schematic diagram of a second exemplary embodiment of the three-dimensional calibration equipment depicted in FIGS. 1 and 28, in which a mirror is placed in the field of view of the optical recorders;
FIG. 35 is a block diagram depicting the optics/electronics/software for use with the embodiments of FIGS. 28 and 34;
FIG. 36 is a schematic diagram of a third exemplary embodiment of the three-dimensional calibration equipment depicted in FIGS. 1 and 28, in which the three-dimensional calibration equipment also includes an optical or mechanical shutter;
FIG. 37 is a flow chart depicting the method of calibration to be used in conjunction with the embodiments of the three-dimensional calibration equipment depicted in FIGS. 28, 34, and 36;
FIG. 38 is a schematic diagram of a variation of the embodiment of FIG. 28 in which the desired object has affixed to it calibration points which are painted with or reflective of the calibration wavelengths which are captured separately in the optical recorders;
FIG. 39 is a schematic diagram showing an optical recorder located outside of the cylindrical holographic calibration plate, a position from which the optical recorder views a cylindrical virtual calibration pattern in its field of view;
FIG. 40 is a schematic diagram showing an optical recorder located inside a spherical holographic calibration plate from which position a part of the spherical calibration grid is observable;
FIG. 41 is a schematic diagram of a fourth exemplary embodiment of the three-dimensional calibration equipment depicted in FIGS. 1 and 28, which includes three optical sources, a holographic calibration plate with an excitation source, and a laser pointer or laser ranging measurement device;
FIG. 42 is a flow chart depicting the method of calibration to be used in conjunction with the embodiment of the three-dimensional calibration equipment depicted in FIG. 41;
FIG. 43 is a schematic diagram of a fifth exemplary embodiment of the three-dimensional calibration equipment depicted in FIGS. 1 and 28, in which the three-dimensional calibration equipment employs non-continuous, identical holographic calibration plates instead of a single continuous holographic calibration plate;
FIG. 44 is a flow chart depicting the method of calibration to be used in conjunction with the embodiment of the three-dimensional calibration equipment depicted in FIG. 43;
FIG. 45 is a schematic diagram of a sixth exemplary embodiment of the three-dimensional calibration equipment depicted in FIGS. 1 and 43, in which the three-dimensional calibration equipment also includes reference points identified by the calibration equipment in the field of view of the desired object;
FIG. 46 is a schematic diagram of a seventh exemplary embodiment of the three-dimensional calibration equipment depicted in FIGS. 1 and 44, in which the three-dimensional calibration equipment does not utilize fixed calibration plate(s);
FIG. 47 is a schematic diagram of an eighth exemplary embodiment of the three-dimensional calibration equipment depicted in FIGS. 1 and 28, in which the three-dimensional calibration equipment also includes a band stop filter;
FIG. 48 is a schematic diagram of a ninth exemplary embodiment of the three-dimensional calibration equipment depicted in FIG. 1, in which the three-dimensional calibration equipment is used in conjunction with a stereoscopic microscope; and
FIG. 49 is a schematic diagram of a tenth exemplary embodiment of the three-dimensional calibration equipment depicted in FIGS. 1 and 28, in which the three-dimensional calibration equipment also includes a desired object imprinted on a plate, the desired object being identified using a combination of characteristics.
Detailed Description of the Invention
With reference to FIG. 1, a block diagram of a complete three-dimensional imaging system 10 is depicted. The three-dimensional imaging system 10 includes a three-dimensional display 12, a three-dimensional image scanning device 14 and/or one or more two-dimensional image scanning devices 15, and three-dimensional calibration equipment 16. The three-dimensional image scanning device 14 and/or the two-dimensional image scanning device 15 are employed to generate a three-dimensional image (not shown) to be displayed on the three-dimensional display 12, and to provide data for use by the three-dimensional calibration equipment 16. The three-dimensional calibration equipment 16 calibrates the three-dimensional image scanning device 14 and/or the two-dimensional image scanning device 15. The three-dimensional display 12 and the three-dimensional image scanning device 14 both employ optical pulses and non-linear optics (not shown) to display and record, respectively, a three-dimensional image.
For the purposes of the discussion below, "voxels" are volume pixels, the smallest distinguishable three-dimensional spatial elements of a three-dimensional image. "Desired wavelengths" are the visible wavelengths of light to be displayed to the observer. "Cone of acceptance" or "acceptance angle" refers to the range of incident angles, measured from a line perpendicular to a non-linear optical material, within which an incident optical pulse will produce non-linear optical pulses that exit the non-linear optical material. Incident angles outside the cone of acceptance will result in no conversion of light and hence no output light emanating from the non-linear optical material.
With reference to FIG. 2, an exemplary embodiment of the three-dimensional display 12 of the present invention is depicted. The three-dimensional display 12 renders for viewing a three-dimensional image of a desired object 17 at a single optical wavelength (monochrome) or at multiple optical wavelengths, including the visible wavelengths (colors). The three-dimensional display 12 can be adapted to display still or moving three-dimensional frames of still or moving desired objects 17. The three-dimensional display 12 includes K pulsed optical sources 16a-16k, an optical mixer 18 including one or more optical mixer elements 20a-20i, one or more optical filters 22, and display electronics 24.
The three-dimensional display 12 uses the optical mixer 18 to create the desired wavelengths 26 in a display volume 28 under the control of the display electronics 24. The optical mixer 18 produces the desired wavelengths 26 at specific voxels (not shown) at specific times within the display volume 28. The optical mixer 18 creates three-dimensional images, which include graphics, in the display volume 28. Three-dimensional images are generated as the optical mixer 18 emits specific desired wavelengths 26 in selected voxels (not shown). The optical mixer 18 converts the optical excitation from the K pulsed optical sources 16a-16k into the desired wavelengths 26 observable by the viewer(s) or viewing equipment 30 through one or more optical filters 22 of the three-dimensional display 12.
For the viewer 30 to observe the full set of visible colors (at least three primary colors) at the point P1, three or more, up to K, pulsed optical sources 16a-16k may be operable at a given time, producing optical pulses 32a-32k spatially positioned on one or both sides of the optical mixer 18. In the preferred embodiment, three pulsed optical sources 16a-16c emit the optical pulses 32a-32c of frequencies F1, F2, and F3 toward the optical mixer 18. The optical pulses 32a-32c are sufficiently separated to enable triangulation of the overlap of the optical pulses 32a-32c by pulse timing under the control of the display electronics 24. The optical pulses 32a-32c arrive at a desired point P1 (voxel(s)) in the three-dimensional display volume together with the optical mixer 18. The optical pulses 32a-32c are sufficiently short in duration so that they temporally coincide (i.e., spatially overlap; see FIG. 3) essentially at the desired position P1. At all other points on the optical mixer 18, the optical pulses 32a-32c arrive at different times and do not overlap. The pulse timing is controlled by the display electronics 24.
The optical pulses 32a-32c interact in a non-linear manner when passing through the optical mixer 18. What emanates from the optical mixer 18 is a set of pulses 34 that includes pulses of not only the original frequencies F1, F2, and F3 but also pulses of the sum and difference frequencies of the optical pulses 32a-32c. This set of pulses 34 is transmitted through the optical filter(s) 22 to the viewer 30, which may be a person or optical equipment such as a camera. In the case where the pulsed optical sources 16a-16c are frequency tunable or frequency selectable pulsed optical sources, a judicious selection of the frequencies F1, F2, and F3 produces an output pulse 34 from the optical mixer 18 of a specified optical frequency (a color, if a visible frequency). The optical filter 22 can be selected to pass only the sum frequency F1 + F2 + F3 such that the viewer 30 "sees" only a pulse 34 of frequency F1 + F2 + F3 illuminated at point P1. Alternatively, selectively choosing F1, F2, and F3, and selectively choosing the upper and lower cutoff frequencies of the optical filter 22 to be within a selected range of the optical spectrum, allows the three-dimensional display 12 to display a specific range of frequencies (e.g., the visible colors). Brightness of the three-dimensional display 12 may be increased by increasing the intensity of any of the optical pulses 32a-32c. Brightness of the three-dimensional display 12 may also be increased by use of an intense "gating" pulse of frequency F4 from a fourth pulsed optical source 16d and choosing F1 + F2 + F3 + F4 as the desired wavelength.
Now referring to FIGS. 1 and 5, moving the optical mixer 18 periodically back and forth in a direction normal to the plane of the optical mixer, under the control of the display electronics 24 that also controls the pulse timing of the pulsed optical sources 16a-16k, provides a mechanism to generate an optical mixer output of the desired optical wavelength (color) at any point in the display volume of space that is traversed by the optical mixer 18. Objects or their representations are displayed by creating the desired optical frequencies in the optical mixer 18 at points, such as P1, as the optical mixer 18 moves periodically back and forth. The three-dimensional display 12 creates pulses of light in the optical mixer 18 at desired points, such as P1, that then travel through the optical filter 22 to the viewer or viewing equipment 30. Each movement through the display volume 28 can show a different image to the viewer. As the optical mixer 18 moves back and forth at greater than or equal to about twenty traversals per second, persistence of vision creates the perception of motion.
Referring now to FIGS. 2 and 5, for a planar optical mixer, the wave fronts of the optical pulses 32a-32k will converge and intersect the optical mixer 18 at different points 36 of successive planes 38 (see FIG. 5). Since the pulse repetition rate of the pulsed optical sources 16a-16k is very rapid relative to persistence of vision and the optical mixer 18 moves very slowly, the optical mixer elements 20a-20i can be sequentially excited in an interval during which the optical mixer 18 moves a very small distance. Also, since three pulsed optical sources 16a-16c will converge at only one point in the display space and the proposed implementation of the optical display 12 allows for up to K pulsed optical sources 16a-16k, more than one optical mixer element 20a-20i in the optical mixer 18 may be excited in parallel.
Each of the optical mixer elements 20a-20i recursively passes through the display volume 28 such that during the recursion period, at least one of the optical mixer elements 20a-20i passes through each voxel 40. Thus, during each recursion period, at least one of the optical mixer elements 20a-20i is capable of emitting a specific desired wavelength 26 in any selected voxel 40 observable by the viewer(s) 30 of the three-dimensional display 12, and thus a three-dimensional image is created. The three-dimensional display 12 is capable of producing a desired wavelength 26 at any voxel 40 in the display volume 28. For example, at some point in time when one of the optical mixer elements 20a-20i arrives at a position in the display volume 28 where a particular desired wavelength 26 is desired, the display electronics 24 triggers pulses from several pulsed optical sources 16a-16k that arrive at the desired one of the optical mixer elements 20a-20i simultaneously. The positioning and timing of the pulsed optical sources meet the following conditions: (1) the pulsed optical sources 16a-16k are sufficiently outside the display volume 28 to illuminate the desired one of the optical mixer elements 20a-20i; (2) the triggered pulsed optical sources 16a-16k are sufficiently separated so as to enable triangulation, by pulse timing, to excite a desired mixer element (e.g., 20a); and (3) the optical pulses 32a-32c are so short in duration that they overlap essentially at the desired one of the optical mixer elements 20a-20i. In summary, the display electronics 24 controls the pulsed optical sources 16a-16k to generate optical pulses 32a-32c that excite the desired optical mixer elements 20a-20i with a predetermined combination of optical frequencies that produce the desired wavelength 26 in the desired voxel 40.
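The triangulation-by-pulse-timing conditions summarized above can be sketched numerically: each source must fire early by exactly its own time of flight so that all pulses coincide at the target voxel. The following Python sketch is illustrative only; the source coordinates, the voxel position, the arrival time, and the helper name `emission_times` are assumptions for the example, not values from the patent.

```python
# Sketch of pulse-timing triangulation: pulses from spatially separated
# sources must be emitted at staggered times so that they all arrive at the
# target voxel P1 simultaneously. All positions are in metres, times in
# seconds; the specific coordinates below are illustrative assumptions.

C = 299_792_458.0  # speed of light in vacuum, m/s

def emission_times(sources, voxel, arrival_time):
    """For each source position, compute the time it must fire so its pulse
    reaches `voxel` exactly at `arrival_time`."""
    times = []
    for (sx, sy, sz) in sources:
        # Euclidean distance from this source to the voxel
        d = ((sx - voxel[0])**2 + (sy - voxel[1])**2 + (sz - voxel[2])**2) ** 0.5
        times.append(arrival_time - d / C)  # subtract the time of flight
    return times

sources = [(-0.5, 0.0, 1.0), (0.6, 0.0, 1.2), (0.0, 0.8, 0.9)]  # hypothetical
P1 = (0.0, 0.0, 0.0)
t = emission_times(sources, P1, arrival_time=1e-6)
# A source farther from P1 must fire earlier (smaller emission time).
```

In a real display the electronics would additionally quantize these times to its trigger resolution; this sketch only shows the geometric bookkeeping.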
Each of the pulsed optical sources 16a-16k operates at one or more predetermined optical frequencies. The optical display 12 may be constructed using one or more pulsed optical sources emitting short pulses in combination with one or more pulsed optical sources emitting longer pulses (up to infinite pulse widths, i.e., continuous illumination). Using optical pulses with very short pulse widths (called ultra short optical pulses, with pulse widths in the femtosecond to nanosecond range) enables the excitation of specific voxels in the display volume 28 with such high accuracy that sharp images are produced. Exemplary methods and devices for generating and controlling ultra short optical pulses are disclosed in U.S. Patent Nos. 6,603,778, 5,898,714, 5,852,700, and 5,177,752, the contents of which are incorporated herein by reference in their entirety. Further disclosure of methods and devices for generating and controlling ultra short optical pulses is discussed in the technical journal articles of John D. Simon, Reviews of Scientific Instruments, 60 (12), December 1989, G. Steinmeyer, Science, Vol. 286, November 19, 1999, and Roberto Paiella, Science, Vol. 290, December 1, 2000, the contents of which are incorporated herein by reference in their entirety.
The optical pulses 32a-32k emanating from the pulsed optical sources 16a-16k with appropriate timing arrive at the desired point, P1, in the display volume 28 together with the optical mixer 18. Furthermore, the pulsed optical sources 16a-16k are sufficiently separated from each other such that pulses from the sources arriving at point P1 minimize the spatial spread of these overlapping pulses (see FIG. 4). The three-dimensional display 12 is operable with as many pulsed optical sources as is necessary to excite the optical mixer 18 within the cone of acceptance for each voxel in the display volume 28.
The particular subset of the optical mixer elements 20a-20i that is illuminated is determined by the display electronics 24, which selects which of the K pulsed optical sources 16a-16k are used, and which selects pulsed optical source parameters such as optical pulse timing, optical pulse width (including continuous illumination) and intensity. The K pulsed optical sources 16a-16k produce pulsed or non-pulsed desired wavelengths 26 under the control of the display electronics 24 and may also include one or more optical elements (not shown) to direct the light from the pulsed optical sources 16a-16k to a subset of the optical mixer elements 20a-20i. When used, the optical elements shape the output from the pulsed optical sources 16a-16k into a cone of light or into light that is essentially parallel to the illuminated subset of the optical mixer elements 20a-20i. Referring now to FIG. 6, an implementation of a pulsed optical source (e.g., 16a) is depicted. A desired wavelength generator 42 (e.g., a laser) produces a beam of light 44 which is transformed into a cone of light 46 by an optical element (e.g., a concave lens 48). Referring now to FIG. 7, another implementation of a pulsed optical source (e.g., 16b) is depicted. A desired wavelength generator 50 (e.g., a point source of light) emits light 52, which is transformed by an optical element (e.g., a convex lens 54) into extended beams 56 with plane wave fronts which illuminate a subset of the optical mixer elements 20a-20i. Plane wave fronts of light are created when the desired wavelength generator 50 is located at the focal point 58 of the convex lens 54 or at the focal point of a mirror (not shown). Using parallel beams of light allows for a more simplified arrangement of the pulsed optical sources 16a-16k and ensures that each of the optical mixer elements 20a-20i is illuminated within its acceptance cone for every voxel in the display volume 28.
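The collimation condition just described (plane wave fronts when the point source sits at the focal point 58) follows from the thin-lens relation 1/f = 1/d_o + 1/d_i: as the object distance d_o approaches the focal length f, the image distance d_i diverges and the exit rays become parallel. A minimal, hypothetical Python check follows; the 50 mm focal length and the helper name `image_distance` are illustrative assumptions.

```python
# Thin-lens check of the collimation condition: 1/f = 1/d_o + 1/d_i, so
# 1/d_i = 1/f - 1/d_o. When the source sits exactly at the focal point
# (d_o == f), the image distance is infinite, i.e. the beam is collimated.

def image_distance(f, d_o):
    """Thin-lens image distance; returns float('inf') for a collimated beam."""
    if abs(d_o - f) < 1e-12:
        return float('inf')
    return 1.0 / (1.0 / f - 1.0 / d_o)

f = 0.05  # hypothetical 50 mm focal length
# Source slightly beyond the focal point -> very large image distance:
assert image_distance(f, 0.0500001) > 1000
# Source exactly at the focal point -> collimated (plane wave fronts):
assert image_distance(f, f) == float('inf')
```

This is the geometric reason the FIG. 7 arrangement keeps the angle of incidence on the mixer nearly constant as the mixer moves.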
More particularly, the incident angles of the optical pulses 32a-32k on the optical mixer 18 stay more constant as the mixer moves and may provide a more constant conversion efficiency during the movement of the optical mixer 18. The pulsed optical source 16b creates light with a constant orientation relative to the optical mixer 18 which produces a more constant angle of incidence as the optical mixer 18 traverses the display volume 28 as compared to the pulsed optical source 16a which, because it produces a cone of light, has an angle of incidence relative to the optical mixer 18 that changes as the optical mixer 18 traverses the display volume 28.
In FIGS. 6 and 7, each individual pulsed optical source 16a, 16b contains its own desired wavelength generator 42, 50, respectively. The wavelength generator 42, 50 can be shared across multiple optical sources. Referring now to FIG. 8, a desired wavelength generator 60 can be shared across three optical sources (e.g., 16a-16c). The desired wavelength generator 60 produces a train of pulses 62 under the control of the display electronics 24. The train of pulses 62 passes through an optical splitter 64 which divides the train of pulses 62 into three trains of pulses 66a-66c. Three optical pulse controllers 68, 70, 72 delay and attenuate a respective train of pulses 66a-66c for optically exciting the optical mixer elements 20a-20i.
The train of pulses 62 from the desired wavelength generator 60 may be eliminated completely by the display electronics 24 by having the optical pulse controllers 68, 70, 72 increase the attenuation of the train of pulses 62. This attenuation of the train of pulses 62 can be implemented in one device within each of the beam paths of the desired wavelength generator 60 using a spatial light modulator (SLM, not shown).
The optical mixer 18 of FIG. 2 can contain as few as a single optical mixer element, i.e., the optical mixer 18 can be constructed as a single contiguous planar sheet of material. Referring now to FIGS. 2 and 9, the optical mixer 18 can also be constructed from a plurality of smaller optical mixing elements 20a-20i arranged across the surface defined by the shape (e.g., planar) of the optical mixer 18. In the preferred embodiment, the optical mixing elements 20a-20i are placed at the intersections of the curves 74a-74f which delineate the regular arrangement of the optical mixer 18. While FIG. 9 shows the curves 74a-74f as straight lines for simplicity of the drawing, in general they are of a shape which enhances the desired frequency conversion properties of the optical mixer 18. The optical mixing elements 20a-20i can be of different shapes, sizes, and composition.
The optical mixing elements 20a-20i provide non-linear optical mixing. When light of frequencies ω1, ω2, with wavelengths λ1, λ2, respectively, impinges upon an optical mixer element (e.g., 20a), a first order analysis shows that sum and difference frequencies are created. Non-linear mixing of these optical frequencies in the optical mixing element 20a, by a first order analysis, creates new optical frequencies ω3 and ω4.
The sum frequency created is ω1 + ω2 = ω3 (1/λ1 + 1/λ2 = 1/λ3), and the difference frequency is ω1 - ω2 = ω4 (1/λ1 - 1/λ2 = 1/λ4). When more than two frequencies impinge upon the optical mixer element 20a, a first order analysis shows that additional frequencies are created, including the sum and difference of each pair of impinging frequencies and the sum of all the frequencies. Desired wavelength(s) 26, which result from the non-linear mixing of the pulsed optical source frequencies, are selected by using one or more optical filters within the optical mixer element 20a to prevent all but the desired wavelength(s) 26 from reaching the viewer(s) 30 of the three-dimensional display 12.
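The sum- and difference-frequency relations above can be checked numerically in wavelength form. The snippet below is an illustrative sketch; the 1064 nm and 532 nm values are familiar laser wavelengths chosen for the example, not values from the patent.

```python
# Numerical check of the first-order mixing relations:
#   sum-frequency generation:        1/lam3 = 1/lam1 + 1/lam2
#   difference-frequency generation: 1/lam4 = 1/lam1 - 1/lam2  (lam1 < lam2)
# Wavelengths are in nanometres.

def sum_wavelength(lam1, lam2):
    return 1.0 / (1.0 / lam1 + 1.0 / lam2)

def difference_wavelength(lam1, lam2):
    return 1.0 / (1.0 / lam1 - 1.0 / lam2)

# Degenerate case: two 1064 nm pulses mix to 532 nm (second-harmonic green).
assert abs(sum_wavelength(1064.0, 1064.0) - 532.0) < 1e-9
```

Note that the sum-frequency output always has a shorter wavelength (higher frequency) than either input, which is how near-infrared pump pulses can produce visible desired wavelengths.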
The optical mixer element 20a also produces additional optical frequencies by higher order non-linear effects. For example, the optical mixer element 20a may produce a frequency component that is double the frequency of ω1. Although the higher order non-linear frequencies have lower conversion efficiencies than the first order frequencies, and hence are of lower intensity, they are nonetheless undesirable. In the optical display 12, the frequencies of the pulsed optical sources 16a-16k are chosen such that for a given set of desired wavelength(s) 26 (e.g., the three primary colors for an RGB display), no second, third or higher order non-linear interaction up to order N will generate a new frequency that is also a desired wavelength. Because the optical mixer elements 20a-20i incorporate filters that pass only the desired wavelength(s) 26, unintended higher order frequencies created in the optical mixer elements 20a-20i will not reach the viewer(s) 30.
The non-linear optical mixing elements 20a-20i include materials that permit the generation of optical sum and difference frequencies at the desired wavelength(s) 26. Typical materials include inorganic ferroelectric materials such as LiNbO3, LiIO3, Tl3AsSe3 (TAS), Hg2Cl2, KH2PO4 (KDP), KD2PO4 (DKDP or D*KDP), NH4H2PO4 (ADP), Hg2Br2 and BaTiO3; quantum well structure semiconductors that include GaAs, etc.; organic single crystals such as 4-nitrobenzylidene-3-acetamino-4-methoxyaniline (MNBA) and 2-methyl-4-nitroaniline (MNA); conjugated organic high molecular compounds such as polydiacetylene and polyarylene vinylene; and semiconductor grain-dispersed glass comprising CdS, CdSSe, etc. dispersed in glass.
FIG. 2 depicts each of the optical mixing elements 20a-20i as being made from a single non-linear optical material that can generate all possible frequencies in the visible spectrum. Referring now to FIG. 10, each of the optical mixing elements 20a-20i in an RGB optical mixer can also include optical mixer sub-elements 76a-76c which are composed of non-linear optical materials optimized for one of three desired wavelengths: red, green, or blue. The sub-elements 76a-76c are arranged and spaced such that no two sub-elements of the same type (optimized for the same desired wavelength) are adjacent. This arrangement and spacing of the sub-elements 76a-76c minimizes the unintended excitation of nearby sub-elements with the same desired wavelength (type). Small spacing is consistent with physically small implementations of the three-dimensional display 12 that are designed for high resolution. Larger spacing between the sub-elements 76a-76c is consistent with physically larger implementations of the three-dimensional display 12 (e.g., a movie screen size display).
Each of the sub-elements 76a-76c is composed of a non-linear optical material that generates a desired wavelength 26 when excited by the pulsed optical sources 16a-16k under the control of the display electronics 24. In an RGB display, the sub-element 76a that produces the red desired wavelength includes a filter (not shown) that blocks all wavelengths except the red desired wavelength. Similarly, the blue and green sub-elements 76b, 76c have blue and green filters, respectively.
The optical mixer sub-elements 76a-76c are capable of being simultaneously excited since there are K pulsed optical sources 16a-16k and each of the three sub-elements 76a-76c in one of the optical mixer elements 20a-20i may be excited by three of the K pulsed optical sources 16a-16k.
An optical mixer 18 composed of a plurality of optical mixer elements 20a-20i has several advantages over an optical mixer 18 composed of only one element. Arrays of small optical mixer elements 20a-20i are more cost efficient than an optical mixer 18 composed of a single optical mixer element for all except very small displays. For example, a Lithium Niobate crystal, LiNbO3, when used as a non-linear optical mixer element, is currently very difficult to produce in sizes beyond tens of centimeters on a side and becomes more expensive as the size of the crystal increases. Discrete optical mixer elements 20a-20i can be designed with spacing between the elements, and therefore the unintended excitement of an optical mixer element 20a-20i by optical excitement of an adjacent optical mixer element 20a-20i is reduced by this inter-element spacing. The conversion efficiency for the non-linear materials which make up the optical mixer elements 20a-20i varies by material type and other design parameters such as size, shape and orientation relative to the pulsed optical sources 16a-16k. The optical mixer sub-elements 76a-76c are independently optimized for each of the desired wavelengths 26 used in the three-dimensional display 12. For example, in an optical mixer 18 that uses three desired wavelengths 26 to produce three primary colors in an RGB display, the optical mixer sub-element 76a can be optimized for the generation of red desired wavelengths; the optical mixer sub-element 76b can be optimized for the generation of blue desired wavelengths; and the optical mixer sub-element 76c can be optimized for the generation of green desired wavelengths. For each of the optical mixer sub-elements 76a-76c, the design parameters, such as the type of non-linear material, cross sectional area, thickness, size, acceptance angle, spectral acceptance, walk-off angle, etc., can each be independently optimized for each desired wavelength 26 to achieve a desired conversion efficiency and cost, and to equalize the maximum intensity produced by each type of optical mixer sub-element. Optimizing the optical mixer sub-element design by choosing the design parameters for each desired wavelength permits an equalization of the peak intensity for each desired wavelength relative to the viewer 30 of the three-dimensional display 12.
The conversion efficiency of one or more optical mixer elements 20a-20i is a measure of the intensity of the desired wavelength 26 generated by an optical mixer element relative to the excitation of the element by the pulsed optical sources 16a-16k. Improving the conversion efficiency increases the intensity of the desired wavelength(s) 26 transmitted to the viewer(s) 30 of the three-dimensional display 12. The conversion efficiency increases with the length of the non-linear mixer. Conversion efficiency also increases with the area of the non-linear mixer on which the optical pulses 32a-32k of the pulsed optical sources 16a-16k impinge. The conversion efficiency increases with increasing excitation level up to a fixed limit (damage threshold) at which power conversion efficiency decreases. The conversion efficiency increases with phase matching (to be discussed below). When an optical mixer element includes a lens to focus the light onto the non-linear mixing material, the local intensity of the light increases, thereby generating a higher electric field and thereby increasing the conversion efficiency.
The peak intensity of the output of a given optical mixer element (e.g., 20a), for given fixed pulse width(s) of the optical pulses 32a-32k emanating from the pulsed optical sources 16a-16k and exciting the discrete elements 20a-20i, is varied by adjusting the power output level of one or more of the pulsed optical sources 16a-16k which excite the element (e.g., 20a). The average power output level is reduced when the pulse width of one or more of the pulsed optical sources 16a-16k which is exciting the element 20a is shortened. Very high pulse rates are possible when the pulse widths are very short. Therefore, as the optical mixer 18 moves recurrently through the display volume 28 at repetition rates ranging from tens to perhaps thousands of times a second, the pulse rate of the pulsed optical sources 16a-16k can range from megabit to multi-gigabit rates, thus illustrating that the optical mixer elements 20a-20i can be excited many times by the pulsed optical sources 16a-16k in the time that an optical mixer element, e.g., 20a, takes to move through a voxel.
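The relationship between the sweep rate of the optical mixer 18 and the pulse rate of the sources can be illustrated with a rough calculation. The function name and all numbers below are illustrative assumptions, not values taken from the specification:

```python
# Rough arithmetic sketch (illustrative, not from the patent) of why each
# mixer element can be excited many times per voxel: compare the time an
# element spends crossing one voxel plane with the optical pulse period.

def pulses_per_voxel(sweep_hz: float, voxel_planes: int, pulse_rate_hz: float) -> float:
    """Pulses arriving while the mixer crosses a single voxel plane.
    One back-and-forth sweep traverses the stack of voxel planes twice."""
    time_per_voxel_s = 1.0 / (sweep_hz * 2 * voxel_planes)
    return pulse_rate_hz * time_per_voxel_s

# e.g. a 30 Hz sweep over 500 assumed depth planes with a 1 GHz pulse rate
# leaves on the order of tens of thousands of pulses per voxel transit.
print(int(pulses_per_voxel(30, 500, 1e9)))
```

Even with these conservative assumed numbers, the pulse period is orders of magnitude shorter than a voxel transit, consistent with the text above.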
In an implementation where the optical mixer 18 is employed to output three different colors (e.g., the three primary colors), the optical mixer 18 may use three different types of optical mixing elements 20a-20i to produce the output frequencies F1, F2 and F3. The design parameters of the optical mixing elements 20a-20i will differ in order to achieve higher conversion efficiencies or more cost-effective conversion efficiencies at the output frequencies F1, F2 and F3. To equalize the peak output intensity at the output frequencies F1, F2 and F3 from element types with lower conversion efficiencies, physical design parameters are appropriately chosen, such as the cross-sectional area upon which the optical pulses 32a-32k impinge, or the thickness of the optical mixing elements 20a-20i. Other design parameters used to achieve equalization include phase-matching type and angle, damage threshold, acceptance angle, spectral acceptance, crystal size, walk-off angle, group velocity mismatching, and temperature acceptance.
Referring now to FIG. 11, an optical mixing element, e.g., 20a, is depicted as having a cylindrical shape, while another optical mixing element, e.g., 20b, is depicted as having a truncated conical shape. The shape or size of the optical mixing elements 20a-20i in the optical mixer 18 is chosen to compensate for conversion efficiency variations by frequency or by position of the optical mixer 18 in the display volume 28, power output variations of the pulse sources by frequency, or other attenuation and losses in the system. Two optical frequencies (colors) F1 and F2 impinge upon the cylindrical optical mixing element 20a, and one of the frequencies produced in the cylindrical optical mixing element 20a is the sum of these frequencies, F1 + F2. Conversion efficiency improves as the diameter of the cylindrical optical mixing element 20a is increased because a greater amount of light enters the cylindrical optical mixing element 20a. Conversion efficiency also improves as the length of the cylinder increases. Similarly, one end of an optical mixing element, e.g., 20c, may be coated with a reflective substance to give it a mirrored surface so that the input optical pulses 32c and the resulting non-linear output pulses 34 of different frequencies exit from the same end of the optical mixing element 20c that the input optical pulses 32c entered. Traveling through the optical mixing element 20c twice further increases the conversion efficiency.
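The sum-frequency arithmetic referred to above (F1 + F2) can be sketched in wavelength terms: since frequency is inversely proportional to wavelength, the sum-frequency output satisfies 1/λ_sum = 1/λ1 + 1/λ2. The helper function and the wavelengths chosen below are illustrative assumptions:

```python
# Illustrative sum-frequency generation (SFG) arithmetic: two source
# wavelengths combine in a non-linear crystal to produce the sum
# frequency F1 + F2, i.e. 1/lambda_sum = 1/lambda1 + 1/lambda2.

def sum_frequency_wavelength(lambda1_nm: float, lambda2_nm: float) -> float:
    """Wavelength (nm) of the sum-frequency output F1 + F2."""
    return 1.0 / (1.0 / lambda1_nm + 1.0 / lambda2_nm)

# Example (assumed sources): mixing near-infrared pulses at 1064 nm and
# 1550 nm yields an output near 631 nm -- two invisible beams producing
# a visible red voxel only where they overlap in the mixer.
print(round(sum_frequency_wavelength(1064.0, 1550.0), 1))
```

This is why only voxels where pulses from two sources temporally and spatially overlap emit a visible desired wavelength 26.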
The orientation of the optical mixer elements 20a-20i relative to the optical excitation by the pulsed optical sources 16a-16k is a critical parameter with respect to the intensity of the desired wavelengths 26 generated in the optical mixer element and transmitted to the viewer(s) 30 of the three-dimensional display 12. As discussed above, the conversion efficiency of the optical mixer elements 20a-20i of the optical mixer 18, among other properties, depends upon phase matching and hence on the alignment of the incident optical energy relative to the structure of the optical mixer 18. Phase-matching can be implemented by angle tilting, temperature tuning, or other methods; angle tilting is the method most commonly used to obtain phase-matching.
The orientation of each of the optical mixer elements 20a-20i relative to each of the excitation pulsed optical sources 16a-16k is further complicated because the optical mixer 18 is recurrently moving through the display volume 28; the angle of excitation of an optical mixer element 20a-20i by a particular combination of pulsed optical sources 16a-16k therefore changes significantly, and the conversion efficiency changes accordingly. The three-dimensional display 12 uses alternative pulsed optical sources 16a-16k to maintain optimal conversion efficiency by utilizing the display electronics 24 to select the optimally positioned pulsed optical sources 16a-16k for each of the optical mixer elements 20a-20i and for each set of positions of the optical mixer 18 as it moves recurrently in the display volume 28. Thus the logic employed by the display electronics 24 continually changes the combination of pulsed optical sources 16a-16k that are exciting each of the optical display elements 20a-20i to produce the desired wavelength(s) 26 in each voxel, e.g., at P1 in FIG. 1.
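The source-selection logic attributed to the display electronics 24 can be sketched as follows. This is a minimal illustration under assumed geometry: the function names, the 5 degree acceptance half-angle, and the nearest-axis selection rule are all assumptions, not details from the specification.

```python
import math

# Hypothetical sketch of source selection: as the mixer moves, pick, for
# each mixer element, the pulsed source whose ray falls closest to the
# element's optic axis and within the crystal's acceptance cone.

ACCEPTANCE_HALF_ANGLE = math.radians(5.0)  # assumed acceptance cone half-angle

def angle_to_axis(source_pos, element_pos, axis):
    """Angle between the source->element ray and the element's (unit) optic axis."""
    ray = [e - s for s, e in zip(source_pos, element_pos)]
    norm = math.sqrt(sum(r * r for r in ray))
    dot = sum(r * a for r, a in zip(ray, axis))
    return math.acos(max(-1.0, min(1.0, dot / norm)))

def best_source(sources, element_pos, axis):
    """Index of the source with the smallest off-axis angle, or None if
    no source lies within the acceptance cone for this mixer position."""
    best, best_angle = None, ACCEPTANCE_HALF_ANGLE
    for i, pos in enumerate(sources):
        a = angle_to_axis(pos, element_pos, axis)
        if a <= best_angle:
            best, best_angle = i, a
    return best
```

In use, the display electronics would re-run such a selection for each position the mixer occupies during its recurrent sweep, effectively precomputing the lists of alternative source combinations described elsewhere in the specification.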
FIGS. 9-11 depict the optical mixer elements 20a-20i as being constructed from non-linear mixing materials. Referring now to FIGS. 12A-15B, the optical mixer elements 20a-20i can include additional elements to enhance the production and transmission of the desired wavelengths 26. FIGS. 12A, 12B show a hemispherical lens 78 used in combination with a cylindrical non-linear mixing material 80 and a cylindrical desired wavelength filter 82. FIGS. 13A, 13B show a triangular lens 84 used in combination with a triangular non-linear mixing material 86 and a triangular desired wavelength filter 88. The triangular lens 84 need not be symmetric. FIGS. 14A, 14B show a pyramidal lens 90 used in combination with a rectangular non-linear mixing material 92 and a rectangular diffuser 94. FIGS. 15A, 15B, which depict the preferred embodiment, show a hemispherical lens 96 used in combination with a rectangular non-linear mixing material 98 and a rectangular desired wavelength filter 100. The optical display 12 is not limited to using the shapes or combinations of shapes of lenses, non-linear mixing materials, and desired wavelength filters/optical diffusers depicted in FIGS. 12A-15B. In any of the embodiments of FIGS. 12A-15B, optical diffusers can be used in place of or in addition to the desired wavelength filters and vice versa.
The lenses (e.g., the lens 78) adjust the angle of incidence of light from the pulsed optical sources 16a-16k relative to the optical axes of the non-linear mixing materials (e.g., the material 80). The addition of the lenses focuses light onto the non-linear mixing materials (e.g., the material 80), which increases the local intensity of the optical excitation of the non-linear mixing materials, which in turn increases the electric field within them, resulting in higher conversion efficiency. The desired wavelength filters (e.g., the filter 82) ensure that, of all the pulsed optical source frequencies incident upon the non-linear mixing materials (e.g., the material 80) and all the new wavelengths generated within the optical mixer by non-linear optical interaction, only the desired wavelength(s) 26 are passed to the viewer(s) 30. The non-linear optical interaction occurs in the non-linear mixing materials (e.g., the material 80) for light from the pulsed optical sources 16a-16k that impinges thereupon within the cone-of-acceptance range of angles. This range is dependent upon certain parameters of the non-linear mixing materials (e.g., the material 80), such as the thickness and type of non-linear material. If the pulsed optical sources 16a-16k are positioned too far apart from each other, there will be positions from which the optical rays from the pulsed optical sources 16a-16k, upon passing through the lenses (e.g., the lens 78), will be directed at such a large angle with respect to the peak conversion optic axis of the non-linear mixing materials (e.g., the material 80) that the incident light from the pulsed optical sources 16a-16k is not converted.
By using the lenses (e.g., the lens 78), spacing between pulsed optical sources is limited only by the aperture of the lenses (e.g., the lens 78), rather than being determined by the acceptance angle of the non-linear mixing materials (e.g., the material 80). The size of the lenses (e.g., the lens 78) can be made relatively large compared to the size of the non-linear mixing materials (e.g., the material 80). The optical diffusers (e.g. the diffuser 94) greatly reduce the directional intensity variations of the desired wavelength(s) 26 as they exit the optical mixer materials (e.g., the material 92).
The desired wavelength filters (e.g., the filter 82) may be omitted when the non-linear mixing materials (e.g., the material 80) are designed to selectively enhance specific desired wavelength(s) 26 when excited by the pulsed optical sources 16a-16k. In some non-linear materials, such as lithium niobate, the peak conversion efficiency varies by orientation as a function of the desired wavelength produced. The orientation of the excitation by the pulsed optical sources 16a-16k is chosen to correspond to the orientation within the non-linear mixing materials (e.g., the material 80) along which the conversion of the desired wavelength(s) 26 is maximized. Since each desired wavelength orientation has a different cone of acceptance in the non-linear mixing materials (e.g., the material 80), choosing the positions of the pulsed optical sources 16a-16k so that the optical pulses 32a-32k emitted are incident upon the non-linear mixing materials within their cone of acceptance yields only the desired wavelength(s) 26. Under these conditions, the optical mixer elements 20a-20i need only contain one non-linear mixing material (e.g., the material 80) comprising a single sub-element, rather than a separate sub-element for each desired wavelength. However, each desired wavelength produced exits the non-linear mixing material (e.g., the material 80) at a different angle corresponding to the orientation along which the conversion of that desired wavelength 26 is maximized. Thus the direction of peak intensity of light exiting from the non-linear mixing material (e.g., the material 80) varies according to the desired wavelength 26. As depicted in FIGS. 14A, 14B, the use of an optical diffuser 94 in place of a desired wavelength filter greatly reduces the directional intensity variations of the desired wavelength 26 observed by the viewer 30.
Dynamic equalization of the desired wavelengths 26 is implemented by adjusting the peak power and pulse width of the optical pulses 32a-32k and by choosing the conversion efficiency via the selection of alternative pulsed optical sources 16a-16k. Referring now to FIGS. 16A, 16B, another variation of the optical mixer elements 20a-20i of FIGS. 9-11 is depicted. The optical mixer elements 20a-20i of FIGS. 16A, 16B include a rectangular non-linear mixing material 102, a pyramidal lens 104, a rectangular desired wavelength filter 106, and an optical reflector 108 which allows one or more of the pulsed optical sources 16a-16k to be positioned on the same side of the optical mixer 18 as the viewer 30. The pulsed optical sources 16a, 16b provide excitation of the rectangular non-linear mixing material 102 in the direction of the viewer 30. The pulsed optical source 16c is on the opposite side of the non-linear mixing material 102, and the direction of excitation from the pulsed optical source 16c is reversed to coincide with the direction of the pulsed optical sources 16a, 16b.
Referring now to FIG. 17, a simple mechanism for moving a planar optical mixer 18 back and forth is depicted. The corners 110a-110d of the optical mixer 18 are supported by tracks 112a-112d which maintain a constant orientation relative to the x-y plane. A rotational source 114, synchronized to the pulsed optical sources 16a-16k, turns an idler drive gear (not shown), which in turn is connected to top and bottom gear drives (not shown). As the top gear drive turns, linear motion is transferred to the top of the optical mixer 18 by a top gear arm 116. As the bottom gear drive turns, linear motion is transferred to the bottom of the optical mixer 18 by a bottom gear arm 118. The optical mixer 18 and its moving mechanism are operable over a range of atmospheric pressures. Preferably, the optical mixer 18 and its moving mechanism are placed in a vacuum to minimize air resistance when moving the optical mixer 18 at high speeds.
The optical mixer 18 is depicted in FIG. 2 as moving back and forth in one dimension along the Z-axis. Referring now to FIG. 18, the optical mixer 18 is operated using a rotational motion about the X-axis. The pulsed optical sources 16a, 16b, ..., 16i are positioned on one side 120 of the optical mixer 18, and the pulsed optical sources 16i+1, 16i+2, ..., 16k are placed on the other side 122 of the optical mixer 18. Positioning some of the pulsed optical sources 16i+1, 16i+2, ..., 16k on the same side of the optical mixer 18 as the viewer(s) 30 allows the use of very intense pulses that are directed away from the viewer(s) 30 and therefore return to the viewer(s) 30 at a much lower, and visually safer, intensity level after being reduced in intensity by the non-linear conversion in the optical mixer 18. The display electronics 24 selects the combinations of pulsed optical sources 16a-16k that non-linearly combine in the optical mixer elements 20a-20i, e.g., the optical mixer element at point P1, to produce the desired wavelengths. The optical filter 22 blocks all wavelengths other than the desired wavelengths 26 produced in the optical mixer elements 20a-20i from reaching the viewer(s) 30 or viewing equipment.
Referring now to FIG. 19, if the pulsed optical source 16k is to be placed on the side 120 of the optical mixer 18 opposite the side 122 toward which the other pulsed optical sources 16a-16c direct optical pulses, it is preferable that the pulsed optical source 16k generate a pulse 32k that arrives at the plane of the optical mixer 18 at the same time that the axis of the pulsed optical source 16k is aligned with the perpendicular to the optical mixer 18. It is also desirable to coat the side 122 of the optical mixer 18 with a partially transparent and partially reflective material, similar to a half-silvered mirror, and/or to construct the optical mixer 18 of a thin material that will diffuse light. The side 120 of the optical mixer 18 may also include an optical filter to selectively transmit only the desired sum and difference products of the impinging optical pulses 32a-32k.
Referring now to FIG. 20, multiple pulsed optical sources 16a-16k are successively utilized as the optical mixer 18 rotates clockwise around the X-axis. This sequential use of pulsed optical sources 16a-16k maintains the alignment of the axes of the direction of the impinging optical pulses 32a-32k with the changing cone of acceptance of the optical mixer 18 and thus helps ensure that the optical pulses 32a-32k impinge on the optical mixer 18 at an angle required for the necessary conversion efficiency. In a similar fashion, positioning the pulsed optical sources 16a-16k in space to maintain the alignment of their axes with the normal to the optical mixer 18 as the latter rotates (1) improves conversion efficiency, (2) simplifies the design of the display electronics 24 for generating overlapping optical pulses at desired points in the three-dimensional display space, and (3) reduces spatial dispersion of overlapping optical pulses at a desired point on the optical mixer 18 in the three-dimensional display space.
FIG. 21 shows an implementation of the optical mixer 18 that is not planar. In this implementation, the optical mixer 18 has a surface whose shape is known to the display electronics 24. The display electronics 24 selects changing combinations of pulsed optical sources 16a-16k that non-linearly combine in the optical mixer elements 20a-20i to produce the desired wavelengths at desired voxels. The display electronics 24 stores the alternative possible combinations of the pulsed optical sources 16a-16k which are capable of producing the desired wavelength 26 in each voxel as lists of predetermined pulsed optical source combinations. The display electronics 24 also stores predetermined lists of alternative possible combinations of pulsed optical sources 16a-16k which achieve different levels of conversion efficiency. The logic of the display electronics 24 uses these predetermined lists to select appropriate combinations of the pulsed optical sources 16a-16k for each voxel in the three-dimensional image generated by the three-dimensional display 12.
With reference to FIG. 22, an exemplary embodiment of the three-dimensional image scanning device 14 of the present invention is depicted. The three-dimensional image scanning device 14 is adapted to take three-dimensional frames of still or moving desired objects 134. The three-dimensional image scanning device 14 includes a pulsed optical source 124, a second source of optical pulses 126, an optical mixer 128, an optical recorder 130 (e.g., a two-dimensional camera), and image scanning device electronics 132 for controlling the coordination of the pulsed optical source 124, the second source of optical pulses 126, and the optical recorder 130. The pulsed optical source 124 illuminates the desired object 134 using optical pulses 136 of frequency F1, which are of the same type as used in the three-dimensional display 12 and which can include ultra-short optical pulses. The optical pulses 136 reflect from the desired object 134 and impinge on the optical mixer 128. The optical mixer 128 and the pulsed optical source 124 can be constructed from the same materials and have the same geometry as those used in the optical display 12. The second source of optical pulses 126 can take at least two forms, to be described below with reference to FIGS. 23, 24. The output of the second source of optical pulses is optical pulses 138 (of the same type as the optical pulses 136) of frequency F2, which impinge on the optical mixer 128. The optical pulses 138 have controlled time delays with respect to the optical pulses 136. The optical pulses 136, 138, arriving and temporally overlapping at the optical mixer 128, interact in a non-linear fashion when passing through the optical mixer 128, as previously described for the optical mixer 18 associated with the three-dimensional display 12.
What emanates from the optical mixer 128 is a set of pulses 140 that includes pulses not only of the original frequencies F1 and F2, but also of the sum and difference frequencies of the optical pulses 136, 138. This set of pulses 140 is transmitted to the optical recorder 130, which can include, for example, a visible-wavelength and/or infra-red camera. The image scanning device electronics 132, which can include a processor, are incorporated within or external to the optical recorder 130 and construct a complete three-dimensional representation of the object (to be discussed below with reference to FIG. 25) from a succession of sets of pulses 140 (after appropriate filtering to extract the desired pulses 140).
Now referring to FIGS. 23 and 24, two preferred arrangements for the second source of optical pulses 126 are depicted. In FIG. 23, the second source of optical pulses 126 includes a pulsed optical source 142 and a mirror 144. The pulsed optical source 124 is of frequency F1 and the pulsed optical source 142 is of frequency F2. The pulsed optical source 142 transmits the optical pulses 138 of frequency F2 toward the mirror 144 at approximately the same time that the pulsed optical source 124 illuminates the desired object 134 using the optical pulses 136 of frequency F1. The mirror 144 reflects the optical pulses 138 towards the optical mixer 128. The plane of the mirror 144 is approximately parallel to the plane of the optical mixer 128 (i.e., the x-y plane). The desired spatial/temporal time delay is implemented by moving the mirror 144 back and forth along the direction perpendicular to the planes of the mirror 144 and the optical mixer 128 (i.e., the z dimension). In the second arrangement, shown in FIG. 24, the mirror 144 is omitted. The image scanning device electronics 132 are temporally linked to the pulsed optical sources 124, 142 in such a way that the optical pulses 138 are delayed a predetermined amount from the optical pulses 136.
Now referring to FIG. 25, each of the optical pulses 136 combines with a respective one of the optical pulses 138 at a different time in the optical mixer 128. Since the optical pulses 136, 138 can be of very short duration, they interact with each other in the optical mixer 128 over a small area and over a short interval of time. Thus the successive pulses 140 emanating from the optical mixer 128 will have both high temporal and high spatial resolution. Since temporal resolution translates directly into spatial resolution along the depth axis, generating and detecting the reflected light with high temporal resolution reveals the depth profile of the desired object 134 with high spatial resolution.
Each of the optical pulses 138 is delayed by one of a tight succession of increasing time intervals relative to the optical pulses 136, so that a succession of different "slices" 146 of the desired object 134 is captured within the optics of the optical recorder 130. With one delay, the front edge of the desired object 134 is captured, followed by images moving toward the rear of the desired object 134. By varying the time delay between the two sets of optical pulses 136, 138, reflections within selected ranges (i.e., distances from the pulsed optical source 142) are gated into the optical recorder 130. The individual time delays of the optical pulses 138 do not have to be in ascending or descending order, but can be in any order so long as the image scanning device electronics 132 associated with the optical recorder 130 "combines" each of the resulting "slices" 146 of the desired object 134 to produce a composite three-dimensional image from a single perspective of the desired object 134.
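The range-gating relation underlying the "slices" can be sketched as follows, assuming the usual round-trip convention z = c·τ/2 (the gating pulse delay τ selects reflections from depth z). The function names and numbers are illustrative assumptions:

```python
# Hedged sketch of range gating: a gate delay tau selects reflections
# from a depth z = c * tau / 2 (round trip), so stepping tau steps the
# captured slice through the depth of the object.

C = 299_792_458.0  # speed of light, m/s

def slice_depth_m(gate_delay_s: float) -> float:
    """Depth of the gated slice for a given delay of pulses 138 vs. 136."""
    return C * gate_delay_s / 2.0

def delay_for_depth_s(depth_m: float) -> float:
    """Inverse relation: gate delay needed to image a given depth."""
    return 2.0 * depth_m / C

# An assumed 100 fs pulse gates a slice roughly 15 micrometres thick,
# which is the depth-resolution argument made in the text above.
print(round(slice_depth_m(100e-15) * 1e6, 2))
```

Stepping the delay in such increments, in any order, yields the succession of depth slices that the image scanning device electronics 132 combine into a composite image.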
A three-dimensional image of the desired object 134 is obtained by generating ranges for a succession of two-dimensional images that encompass all external views of the three-dimensional desired object 134. This is accomplished either by rotating the desired object 134, or by moving the optical recorder 130, the optical mixer 128, and optionally the pulsed optical source 124 and the second source of optical pulses 126 around the desired object 134. Alternatively, several sets of the optical recorder 130, the optical mixer 128, the pulsed optical source 124, and the second source of optical pulses 126 can be stationed around the desired object 134 and coordinated by the image scanning device electronics 132 to capture a complete three-dimensional representation of the desired object 134 and provide data to construct a three-dimensional model of the desired object 134.
The optical frequencies of the pulsed optical source 124 and the second source of optical pulses 126 can be chosen such that only the desired sum and/or difference frequencies generated in the optical mixer 128 are within the range processed by the optical recorder 130. For example, the frequencies F1 and F2 can be chosen to be optical frequencies below the range processed by the optical recorder 130, while the sum of these frequencies is within the processing range of the optical recorder 130 (i.e., optical up/down conversion).
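This up-conversion choice can be illustrated numerically. The camera band and source wavelengths below are assumptions chosen for illustration (a silicon CCD band and telecom-band sources), not values from the specification:

```python
# Illustrative check that two source wavelengths can each lie outside a
# camera's sensitive band while their sum frequency lies inside it.

def sum_wavelength_nm(l1: float, l2: float) -> float:
    """Wavelength (nm) of the sum-frequency mixing product."""
    return 1.0 / (1.0 / l1 + 1.0 / l2)

CAMERA_BAND_NM = (400.0, 1000.0)   # assumed silicon CCD sensitivity band

l1, l2 = 1550.0, 1310.0            # assumed sources, both beyond 1000 nm
ls = sum_wavelength_nm(l1, l2)     # sum-frequency product near 710 nm
print(CAMERA_BAND_NM[0] <= ls <= CAMERA_BAND_NM[1])
```

Only the mixing product falls in the recorder's band, so the recorder inherently filters out the unconverted source light.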
Acquiring a three-dimensional shape while also acquiring the color of this shape makes use of a novel arrangement of the optical-to-electronic device, e.g., a charge-coupled device (CCD) of a video camera, such that the color is captured at the desired wavelengths. Referring now to FIGS. 26-27, the process of separating the spatial information at the spatial wavelengths from the color information encoded within the set of pulses 140 emanating from the optical mixer 128 is accomplished by optical filtering. In FIG. 26, filter(s) 148 are interposed between the optical mixer 128 and the CCD array 150 having pixels 152 within the optical recorder 130. Alternatively, in FIG. 27, some of the pixels 154 can be coated with a filtering material that passes only the spatial wavelengths relating to shape, while the other pixels 156 can be coated with a filtering material that passes only the desired wavelengths relating to color. The pixels 154 then capture only the spatial information, while the pixels 156 capture the color and textural information. Another alternative is for the optical recorder 130 to split the acquired color and spatial image, using wavelength-selective filters, into spatial wavelengths and desired wavelengths and to acquire these wavelengths with two separate CCD arrays, one for capturing spatial information and another for capturing color information.
Compared to other depth-information systems, the three-dimensional image scanning device 14 can have enhanced precision, since the time resolution is set strictly by the length of the optical pulses 136, 138 and by the optical non-linear process within the optical mixer 128. The depth sensitivity and range can easily be adjusted to the needed specifications by the selection of the optical pulse length and the precision of the control of the time delay of the optical pulses 138. Additionally, depth information can be obtained even from very weakly reflecting desired objects 134, since in the non-linear mixing process the signal strength can be enhanced by use of an intense gating pulse 138.
The calibration of the three-dimensional image scanning device 14, especially in embodiments using multiple optical recorders 130, is a very time-consuming, expensive process when using conventional calibration techniques. With reference to FIGS. 28-33, a first exemplary embodiment of the three-dimensional calibration equipment 16 is depicted. The three-dimensional calibration equipment 16 includes a light source 158 and a holographic calibration plate 160 that contains holographically encoded calibration information. Two or more optical recorders 162a-162c are placed on either side of the holographic calibration plate 160, through which a desired object 164 is viewed and one or more virtual calibration pattern(s) 166 is viewed in the vicinity of the desired object 164. Alternatively, a single optical recorder 162a is moved to different positions in space to acquire images of the desired object 164 from different perspectives. The optical recorders 162a-162c can be, for example, two-dimensional or three-dimensional analog or digital cameras.
For the purposes of the discussion below, "virtual calibration pattern" refers to a holographic projection viewed in the vicinity of the desired object 164, "desired wavelengths" refers to the wavelengths of light reflected from the desired object 164 due to ambient illumination, and "calibration wavelengths" refers to the wavelengths of light produced by the holographic calibration plate 160 for viewing the virtual calibration pattern. Reflected light from the desired object 164 passes through the holographic calibration plate 160, which is transparent to light of the desired wavelengths 168 (e.g., normal visible light reflected from the desired object 164). The desired wavelengths 168 pass through the holographic calibration plate 160 to the optical recorders 162a-162c. The light source 158, which may be on either side of the holographic calibration plate 160, illuminates the holographic calibration plate 160 with light of one or more calibration wavelengths 170. The light source 158 may produce pulses of coherent light, a beam of coherent light, or incoherent light. Light of the calibration wavelengths 170 excites the holographic calibration plate 160 in such a way that the optical recorders 162a-162c "see" a virtual calibration pattern 166 which is made recordable and storable by optical sensors within the optical recorders 162a-162c. The three-dimensional calibration equipment 16 is envisioned to use fixed holographic images or variable-plate holograms (e.g., using external inputs to phase-modulate the holographic calibration plate 160). The virtual calibration pattern 166 can be, for example, a "tinker toy"-like grid or multidimensional wireframe of arbitrary shape (see FIG. 29) located in the vicinity of, overlapping, or enclosing the desired object 164.
The virtual calibration pattern 166 can have any desired shape, such as the shape of a cube, portions of concentric cylinders, portions of concentric spheres, etc., depending on the geometry of the desired object 164 and the field of view of the optical recorders 162a-162c. Individual intersections of the grid of the virtual calibration pattern 166 can be labeled with numerals (see FIG. 30) and/or bar codes (see FIG. 31) to aid in the calibration process. Through movement of the holographic calibration plate 160 or by changing the illumination parameters (e.g., the frequency of the calibration wavelength 170 emitted by the light source 158), alternative calibration information can be generated for processing by the optical recorders 162a-162c. Changing the virtual calibration pattern 166 is accomplished by recording multiple, superimposed holograms at different calibration wavelength(s) 170 or at different positions on the holographic calibration plate 160, and then choosing a specific calibration pattern by illuminating the holographic calibration plate 160 with the specific calibration wavelength(s) 170 corresponding to the specific virtual calibration pattern 166, or by viewing from a specific position. Different calibration wavelengths 170 can produce, for example, different grids 172, 174, 176 for the virtual calibration pattern 166 of varying density around the desired object 164 (see FIGS. 32 and 33) in order to capture a smaller or larger portion of the desired object 164 (such as, for example, the nose, then the face, then the body of a human subject) depending on the desired scaling factors. These various grids or virtual calibration patterns 172, 174, 176 allow for calibration at selectable and assignable positions in the desired object's three-dimensional target space.
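Although the specification does not prescribe a particular calibration algorithm, a virtual calibration pattern with labeled grid intersections provides exactly the 3D-to-2D point correspondences needed by standard camera-calibration methods. A minimal sketch using a direct linear transform (DLT) with synthetic, illustrative data follows; the projection matrix, grid, and all names here are assumptions, not part of the patent:

```python
import numpy as np

# Hedged sketch: grid intersections of the virtual calibration pattern
# at known 3D positions, and their observed 2D pixel positions, let each
# optical recorder be calibrated by a standard direct linear transform.

def calibrate_dlt(points_3d, points_2d):
    """Estimate a 3x4 projection matrix from >= 6 non-coplanar correspondences."""
    rows = []
    for (X, Y, Z), (u, v) in zip(points_3d, points_2d):
        rows.append([X, Y, Z, 1, 0, 0, 0, 0, -u*X, -u*Y, -u*Z, -u])
        rows.append([0, 0, 0, 0, X, Y, Z, 1, -v*X, -v*Y, -v*Z, -v])
    _, _, vt = np.linalg.svd(np.asarray(rows, dtype=float))
    return vt[-1].reshape(3, 4)   # null-space solution, defined up to scale

def project(P, point_3d):
    """Project a 3D point through a 3x4 matrix to pixel coordinates."""
    x = P @ np.append(point_3d, 1.0)
    return x[:2] / x[2]

# Synthetic check: an assumed camera viewing the corners of a unit grid cube.
P_true = np.array([[800., 0., 320., 10.],
                   [0., 800., 240., 20.],
                   [0., 0., 1., 5.]])
grid = [(x, y, z) for x in (0, 1) for y in (0, 1) for z in (0, 1)]
pixels = [project(P_true, g) for g in grid]
P_est = calibrate_dlt(grid, pixels)
print(np.allclose(project(P_est, (0.5, 0.5, 0.5)),
                  project(P_true, (0.5, 0.5, 0.5)), atol=1e-6))
```

In the calibration equipment described above, the numbered or bar-coded grid intersections (FIGS. 30-31) would supply the labeled correspondences, one such solve per optical recorder 162a-162c.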
Referring now to FIG. 34, a second exemplary embodiment of the three-dimensional calibration equipment 177 is depicted. The three-dimensional calibration equipment 177 includes a mirror 178, optical recorders 180a-180c, a light source 182, and a holographic calibration plate 184. In this embodiment, the mirror 178 is placed in the field of view of the optical recorders 180a-180c. The light source 182 illuminates the holographic calibration plate 184 at a position out of the field of view of the optical recorders 180a-180c. Light of calibration wavelength(s) 186 excites the holographic calibration plate 184 such that the virtual calibration pattern 190 is reflected in the mirror 178 and viewed by the optical recorders 180a-180c as if the virtual calibration pattern 190 were superimposed on or near the desired object 188. The mirror 178 is reflective at the calibration wavelengths 186 and transparent at the desired wavelengths 192. Thus images of the desired object 188 pass directly to the optical recorders 180a-180c. This embodiment enables the use of wavelengths in the light source 182 that include the desired wavelengths 192, as the desired wavelengths 192 reflected from the desired object 188 will pass directly through the mirror 178 and are not reflected to the optical recorders 180a-180c.
The virtual calibration pattern 190 can be produced by calibration wavelength(s) 186 in the desired wavelength region of the electromagnetic spectrum for applications in which the calibration wavelength(s) 186 overlays the desired wavelengths 192. When the calibration wavelength(s) 186 overlays the desired wavelengths 192, both the desired object 188 and the virtual calibration pattern 190 are simultaneously observable by the optical recorders 180a-180c without additional post processing. For applications in which the desired wavelengths 192 alone are desired, the calibration wavelengths information is stored or processed separately from the desired wavelength information. Calibration wavelength information and desired wavelength information at overlapping wavelengths can also be temporally separated by enabling the calibration wavelength(s) 186 for short times and synchronizing observation of the calibration wavelength(s) 186. In the case where the calibration wavelength(s) 186 are different from the desired wavelengths 192, the images reaching the optical recorders 180a-180c of FIG. 34 contain both the desired wavelengths 192 and the calibration wavelength(s) 186.
In the embodiments depicted in FIGS. 28 and 34, the optical recorders 162a-162c, 180a-180c can contain optics/electronics/software corresponding to elements of the block diagram depicted in FIG. 35. The optical recorders 162a-162c, 180a-180c include optics 198, 200, optional wavelength selective filters 202, 204, and calibration electronics 222, 224, respectively. The calibration electronics 222, 224 can contain a processor (not shown) and output storage 218, 220 for storing information gleaned from the calibration wavelengths 170, 186 and output storage 206, 208 for storing information gleaned from the desired wavelengths 168, 192, respectively. An incoming desired image and calibration image 194, 196 enters each of the optical recorders 162a-162c, 180a-180c, respectively, and is separated into the desired wavelengths 168, 192 and the calibration wavelengths 170, 186 using the wavelength selective filters 202, 204, respectively. The desired wavelengths 168, 192 and the calibration wavelengths 170, 186 are processed separately in the calibration electronics 222, 224, as represented by the blocks 206, 208 and 210, 212, respectively. The resulting data are stored in separate parts of memory in the output storage 218, 220 and the output storage 206, 208, respectively.
Selective separation of the optical wavelengths can be accomplished in at least three ways. One way is to use the wavelength selective filters 202, 204. Another way is for the calibration electronics 222, 224 to be designed to implement optical-to-electronic conversion, such as with CCD arrays, that separates the wavelengths. In this method, the pixel array that provides optical-to-electronic conversion uses a planar arrangement of light sensitive devices that alternates, in two dimensions, devices whose sensitivity peaks at the desired wavelengths 168, 192 with devices whose sensitivity peaks at the calibration wavelengths 170, 186, respectively. Still another way is for the calibration electronics 222, 224 to be designed to implement optical band pass and band stop filters for selecting the desired wavelengths 168, 192 and the calibration wavelengths 170, 186, respectively.
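The second method, a pixel array alternating two device sensitivities in a checkerboard in two dimensions, can be sketched as follows. This is a minimal demultiplexing sketch assuming an even/odd parity layout; the actual device arrangement is not specified beyond "alternating in two dimensions".

```python
def split_interleaved(frame):
    """Separate a checkerboard-interleaved sensor frame into the
    desired-wavelength channel and the calibration-wavelength channel.
    Assumption: pixels where (row + col) is even are sensitive to the
    desired wavelengths; odd pixels to the calibration wavelengths."""
    desired, calibration = [], []
    for i, row in enumerate(frame):
        d_row, c_row = [], []
        for j, value in enumerate(row):
            if (i + j) % 2 == 0:
                d_row.append(value)   # desired-wavelength sensitive device
            else:
                c_row.append(value)   # calibration-wavelength sensitive device
        desired.append(d_row)
        calibration.append(c_row)
    return desired, calibration
```

Each output channel here has half the horizontal resolution of the sensor; a real implementation would typically interpolate the missing samples, much as color demosaicing does.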
Referring now to FIG. 36, a third exemplary embodiment of the three-dimensional calibration equipment 226 is depicted. The three-dimensional calibration equipment 226 includes an optical or mechanical shutter 228, optical recorders 230a-230c, a pulsed light source 232, and a holographic calibration plate 234. The optical or mechanical shutter 228 is placed between the holographic calibration plate 234 and the desired object 236. In this embodiment, the calibration wavelength(s) 238 are generated only for special frames. The term "frames" implies that the recording of the desired wavelengths 240 and the calibration wavelengths 238 is broken into discrete units of time during which successive samples of the desired wavelengths 240 and the calibration wavelengths 238 are captured. Special frames store the calibration wavelengths 238. The special frames can be of two types. In one type, the special frames containing the calibration wavelengths 238 are stored or processed separately from the desired frames containing the desired wavelengths 240. In another type, the special frames are interspersed at a periodic or non-periodic rate between the desired frames.
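The periodic variant of the special-frame scheme can be sketched as a simple temporal demultiplexer. The period-based labeling and the function names below are illustrative assumptions; a non-periodic schedule would use an explicit set of special-frame indices instead.

```python
def classify_frames(n_frames, special_period):
    """Label each frame index as a 'calibration' (special) frame or a
    'desired' frame, with special frames interspersed periodically."""
    return ['calibration' if k % special_period == 0 else 'desired'
            for k in range(n_frames)]

def demultiplex(frames, labels):
    """Store the special frames separately from the desired frames,
    as described for the first type of special frame."""
    streams = {'desired': [], 'calibration': []}
    for frame, label in zip(frames, labels):
        streams[label].append(frame)
    return streams
```

With a period of 3, every third frame carries only the calibration pattern while the remaining frames carry the desired image, so no wavelength separation is needed.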
During calibration, the pulsed optical source 232 is pulsed on with the optical or mechanical shutter 228 closed during the special frames. This arrangement enables the recording of desired and calibration information without wavelength separation. The three-dimensional calibration equipment 226 eliminates the need for wavelength selective filters, since multiple calibration patterns based on calibration wavelengths above and below the desired wavelengths 240 can be generated. In the embodiment of FIG. 36, the use of a pulsed optical source 232 and the optical or mechanical shutter 228 synchronized to the pulses can be adapted to provide pulsed time code information for synchronizing multiple optical recorders 230a-230c and/or to convey synchronized instructions to multiple three-dimensional imaging systems (not shown) simultaneously, e.g. for special effects and other camera related functions. The combination of synchronization and holographic calibration/alignment across multiple cameras permits a more cost effective implementation of panoramic cameras, such as those that are now implemented mechanically in CineMax® systems, and the simplified construction of panoramic three-dimensional imaging systems.
The embodiments of the three-dimensional calibration equipment 16, 177, and 226 described above generate an image of the desired object and an image of the virtual calibration pattern. Referring now to FIG. 37, for each mechanical configuration of the calibration equipment 16, 177, and 226 (e.g. physical position or optical magnification setting), the calibration method is as follows: At step 241, a virtual calibration pattern is projected in the field of view of a desired object. At step 242, one of the optical recorders is chosen as the reference. At step 244, if the optical recorder includes an electronic image detector, then at step 246, the coordinates of a coordinate system are assigned in parallel or normal to the pixel array in the optical recorder or in alignment with the virtual calibration pattern. If the optical recorder is a non-electronic system, then at step 248, the coordinates of a coordinate system are assigned arbitrarily or in alignment with the virtual calibration pattern. At step 250, the same coordinate system is assigned to all other optical recorders. At step 252, the differences in the virtual calibration pattern of each optical recorder other than the reference optical recorder are measured. At step 254, the differences measured are used to calculate the calibration corrections for each optical recorder relative to the reference optical recorder. At step 256, the calibration corrections are used to compensate the desired images either mechanically or electronically. The methodology of how to calculate the calibration corrections from the differences measured can be found in U.S. Patent No. 6,822,748 to Johnston et al., which is incorporated herein by reference in its entirety. If there is only one optical recorder, then the method is modified such that the first position of the optical recorder becomes the reference and each subsequent position is treated in the calibration method as an additional optical recorder, with steps 241-256 then being followed as described above.
Since a three-dimensional system has three degrees of freedom, three adjustments are necessary: rotation, translation and scaling. In real time or by delayed processing, one of the non-reference optical recorders is mechanically or electronically (i) rotated so that the desired object appears to be twisted, (ii) translated so that the desired object appears to be moved left or right, and/or (iii) scaled (i.e. zoomed) so that the desired object appears to be made smaller or larger (equivalent in a single camera view to moving it nearer or farther away).
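One common way to recover these three adjustments from matched calibration-grid points is a least-squares similarity fit; the sketch below does this in two dimensions for a non-reference recorder's view relative to the reference. This is a generic estimation technique offered as an illustration, not the specific method of the incorporated Johnston et al. patent.

```python
import math

def estimate_similarity(ref_pts, obs_pts):
    """Estimate rotation theta, scale s and translation t such that
    obs ~= s * R(theta) * ref + t, from matched 2-D grid points
    (closed-form least squares)."""
    n = len(ref_pts)
    rcx = sum(p[0] for p in ref_pts) / n   # reference centroid
    rcy = sum(p[1] for p in ref_pts) / n
    ocx = sum(p[0] for p in obs_pts) / n   # observed centroid
    ocy = sum(p[1] for p in obs_pts) / n
    a = b = norm = 0.0
    for (rx, ry), (ox, oy) in zip(ref_pts, obs_pts):
        rx, ry, ox, oy = rx - rcx, ry - rcy, ox - ocx, oy - ocy
        a += rx * ox + ry * oy             # correlation term (cos component)
        b += rx * oy - ry * ox             # cross term (sin component)
        norm += rx * rx + ry * ry
    theta = math.atan2(b, a)
    scale = math.hypot(a, b) / norm
    tx = ocx - scale * (math.cos(theta) * rcx - math.sin(theta) * rcy)
    ty = ocy - scale * (math.sin(theta) * rcx + math.cos(theta) * rcy)
    return theta, scale, (tx, ty)
```

Applying the inverse of the estimated transform to the non-reference recorder's images performs the twist, shift and zoom compensation described above.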
For the discussion which follows with reference to FIGS. 38-40, let the reference numbers for the three-dimensional calibration equipment 16 also stand for the corresponding reference numbers of the three-dimensional calibration equipment 177 and 226. The three-dimensional calibration equipment 16 permits the processing of the calibration wavelengths 170 separately from the desired wavelengths 168. If the desired object 164 has affixed to it calibration points 258 which are painted with or reflective of the calibration wavelengths 170 (see FIG. 38), then the calibration points 258 are captured separately in the optical recorders 162a-162c. Specific points on the desired object 164 are uniquely determined by highlighting the desired points with materials that are reflective at the calibration wavelengths 170, which can be points where an application pattern is projected onto the desired object 164 using the calibration wavelengths 170 or calibration points 258 attached to the object which are painted with such materials. For example, in order to measure a person for shoes or clothing, the person puts on a pair of socks or tights that are painted with an application pattern, e.g. a grid, in one or more calibration wavelengths. The image matching for one or more optical recorders 162a-162c is precise at the calibration grid points. Alternatively, a cosmetic surgeon may paint a selected pattern on the patient using paint in a calibration wavelength that is not visible to the human eye, allowing the surgeon to view normal visible images and the selected pattern of the patient in post processing for either still or full motion images. Surgeons may also use these techniques in conjunction with thermal images, when the desired wavelengths are properly selected, for radiological treatment applications where heat is generated by radioactive treatment materials.
Referring now to FIGS. 39 and 40, as discussed above, the virtual calibration pattern can have any desired shape, such as the shape of a cube, portions of concentric cylinders, portions of concentric spheres, etc., depending on the geometry of the desired object 164 and the field of view of one or more optical recorders. For the same reasons, the holographic calibration plate 160 can have any desired shape. A single optical recorder 162a or multiple optical recorders 162a-162c can use a rectangular, spherical, cylindrical or arbitrarily shaped hologram, illuminated by the light source 158 from either the inside or outside of the holographic calibration plate that surrounds or partially surrounds the desired object 164. In FIG. 39, the optical recorder 162a moves around the outside of the cylindrical holographic calibration plate 260 through which a cylindrical virtual calibration pattern 262 is viewed. Successive images of the desired object 164 are post processed into a stereoscopic image or into a three-dimensional model of the object using algorithms for color and/or edge matching. These stereoscopic images of the desired object 164 can be used to capture the three-dimensional shape of the desired object 164 by means of calculation by triangulation and the spatial position of uniquely determined points on the desired object 164, e.g., points for a wire frame model. Color, texture and shading are then applied to the wire frame model from the captured images of the desired object 164.
Referring now to FIG. 40, a spherical-shaped holographic calibration plate 264 is used in generating a partially spherical calibration hologram 266 of a partially-spherical calibration grid, i.e., a spherical coordinate grid, around one or more optical recorders 162a. The three-dimensional calibration equipment 16 of this configuration is well suited to applications requiring a complete field of view. Multiple optical recorders 162a-162c can be placed within the spherical holographic calibration plate 264 to simultaneously cover all directions. Such a spherical calibration hologram 266 provides the mechanism to overcome the problems of adjacent optical recording devices that record two-dimensional images, which include overlapping fields of view and conformal mapping from the planar segment of the optical recording device to a spherical frame of reference. Using a holographic calibration pattern in the form of a spherical coordinate grid in the field of view of the optical recorders enhances the conformal mapping of the multiple optical recorder outputs. The spherical calibration hologram 266 provides a matching point between the fields of view of the adjacent optical recorders and provides a uniform coordinate system across all the optical recorders, which simplifies calibration and alignment of the optical recorders and simplifies conformal mapping across the optical recorders, especially when the alignment between such devices varies due to shock, vibration or acceleration. The spherical calibration hologram 266, when combined with laser ranging to be discussed below in connection with FIG. 41, provides accurate ranging to a desired object 164.
The spherical holographic calibration configuration of FIG. 40 provides the three-dimensional calibration equipment 16 with the necessary data for generating a panoramic three-hundred sixty degree view for remotely piloted vehicles subject to shock or acceleration. The spherical holographic calibration configuration enables real-time compensation for the two major problems in providing a panoramic three-hundred sixty degree view for remotely piloted vehicles: (i) the continuous alignment of multiple "fisheye" optical recorders which are subject to misalignment by shock or vibration, and (ii) conformal mapping of multiple optical recorders into a three-hundred sixty degree panoramic view. Furthermore, the calibration wavelengths 170 can be chosen to include wavelengths used in collision avoidance systems, so that such information can be processed jointly with optical recorder calibration.
Referring now to FIG. 41, a fourth exemplary embodiment of the three-dimensional calibration equipment 268 is depicted. The three-dimensional calibration equipment 268 includes a laser pointer 270, optical recorders 272a-272c, a light source 274, and a cylindrical holographic calibration plate 276. The optical recorders 272a-272c separate the desired and calibration wavelengths to enable efficient extraction of the desired object's color in addition to its three-dimensional shape. For the calibration equipment 268, the three optical recorders 272a-272c move around the outside of the cylindrical holographic calibration plate 276 in order to capture and triangulate the position in three-dimensional space of a point P on a desired object 278 which is illuminated by the laser pointer 270 operating at a calibration wavelength 280. This is an improvement over stereoscopic color matching or edge matching, as the point P is precise. Furthermore, when combined with the holographic calibration technique of FIG. 38, the position of point P can be inferred from its position relative to a virtual calibration pattern 282. Since the processing of the object color information using the laser pointer 270 is independent of the processing of the spatial information using the virtual calibration pattern 282, the three-dimensional calibration equipment 268 is capable of capturing the object's color simultaneously with its shape. Although a desired object's shape and color can be recorded with a single optical recorder 272a, multiple optical recorders 272a-272c provide increased speed and accuracy.
Still referring to FIG. 41, the simple laser pointer 270 can be replaced with a laser ranging measurement device 284. The laser ranging measurement device 284 provides accurate ranges to any point P on the surface of the desired object 278. By choosing a wavelength for the laser ranging measurement device 284 that is a calibration wavelength, the point P illuminated by the laser ranging measurement device 284 is observable at the same time as the virtual calibration pattern 282. The embodiment of FIG. 41 uses the virtual calibration pattern 282 to position the laser pointer 270 or the laser ranging measurement device 284 at desired points on the virtual calibration pattern 282. In a calibration method for use with the three-dimensional calibration equipment 268 employing either the laser pointer 270 or the laser ranging measurement device 284, the virtual calibration pattern 282 is chosen to be a grid on a plane that intersects the desired object 278. The laser generated point P is to be positioned on the surface of the desired object 278 at grid points nearest to the spatial positions where the grid intersects the desired object 278. Referring now to FIG. 42, the calibration method for use with the three-dimensional calibration equipment 268 can be summarized as follows: At step 286, a virtual calibration pattern (object) 282 is projected on a plane that is tangent to the nearest point of the desired object 278 as measured from an optical recorder 272a-272c. At step 288, a subset of virtual calibration pattern intersection points, defined by those points closest to where the virtual calibration pattern 282 intersects the desired object 278, is labeled in some numerical order. At step 290, starting with the first numbered point and continuing to the last numbered point, the laser point P is positioned at each numbered point successively and position data using measurements to the virtual calibration pattern 282 is collected.
Step 290 simplifies the positioning process: since the laser pointer wavelength is a calibration pattern wavelength, the pointer position P and the virtual calibration pattern 282 are simultaneously observable by the optical recorders 272a-272c, so only relative positioning corrections, rather than absolute position corrections of the laser pointing system, are required. At step 292, an attempt is made to generate another virtual calibration pattern that intersects the desired object 278 at a greater distance from the reference optical recorder. At step 294, if such a pattern can be generated, then the foregoing steps are repeated; otherwise the method stops.
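The scan loop of FIG. 42 can be sketched as follows. The `position_laser` and `measure` callbacks stand in for hardware control of the laser pointer (or ranging device) and for the grid-relative position measurement; both interfaces are hypothetical, since the patent describes the steps but not a software API.

```python
def scan_object(planes, position_laser, measure):
    """Sketch of the FIG. 42 scan loop.  `planes` yields, for each
    successively farther calibration plane (steps 286/292), the numbered
    grid points nearest the object (step 288).  For each numbered point
    the laser is positioned and a grid-relative measurement collected
    (step 290)."""
    surface_points = []
    for numbered_points in planes:
        for label, grid_point in numbered_points:
            position_laser(grid_point)          # relative positioning via the grid
            surface_points.append((label, measure()))
    return surface_points
```

Because the laser spot and the virtual grid are observed together, `position_laser` needs only to null the offset between the spot and the target intersection, which is the simplification step 290 provides.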
Referring now to FIG. 43, a fifth exemplary embodiment of the three-dimensional calibration equipment 296 is depicted. The three-dimensional calibration equipment 296 includes optical recorders 298a-298c, a light source 300, and non-contiguous, identical holographic calibration plates 302a-302c. If the holographic calibration plates 302a-302c were held mechanically parallel, then this configuration would effectively be the configuration of the three-dimensional calibration equipment 16. Misalignment of the holographic calibration plates 302a-302c shifts the calibration pattern up or down, left or right and/or forward or back. Thus the misalignment may be calculated from a set of reference points in the field of view. These points may be known calibration points, such as fixed points in the field of view, points projected by a laser whose wavelength is a calibration wavelength, or some distinctive feature of the desired object 304 such as an edge or point of a distinct color. Referring now to FIG. 44, the calibration method for use with the three-dimensional calibration equipment 296 can be summarized as follows: At step 306, the separate holographic calibration plates 302a-302c are fixed as rigidly and as closely as possible to the configuration of a single fixed plate. For flat calibration plates, this implies initially fixing the non-contiguous plates as parallel to each other as possible. For cylindrical/spherical segmented non-contiguous plates, this implies initially fixing the plates rigidly as close as possible to the location that a contiguous cylindrical/spherical plate would occupy, and so forth for other shapes of holographic plates (e.g., elliptical). At step 307, a virtual calibration pattern is projected in the field of view of a desired object 304. At step 308, the position of each of the reference points in the vicinity of the desired object 304, if not on the desired object 304 itself, relative to each optical recorder, is determined.
Illuminated points on the desired object 304 are a subset of the reference points. Additional fixed calibration points are generated by affixing reflecting and/or absorbing colors to the desired object 304 with wavelengths in the calibration range, thus predetermining a fixed set of points for calibration. At step 310, for each reference point, the corresponding position on the virtual calibration pattern is determined. At step 312, from the data of step 310, the misalignment of the virtual calibration pattern is determined. At step 314, the correction factors, for example, shift, rotation and scaling in an orthogonal coordinate system, as a function of position in the desired object's three-dimensional space, are determined for each optical recorder. At step 316, the corrections are applied for each optical recorder to both the virtual calibration pattern and the desired object.
Referring now to FIG. 45, a sixth exemplary embodiment of the three-dimensional calibration equipment 318 is depicted. The three-dimensional calibration equipment 318 includes optical recorders 320a-320c, a light source 322, non-contiguous, identical holographic calibration plates 324a-324c, and reference points 326a-326c in the field of view of the desired object 328. The three-dimensional calibration equipment 318 is realizable for applications that do not utilize the patterns of the holographic calibration plates 324a-324c. Specific reference points 326a-326c in the field of view of the desired object 328 are illuminated by a remote source (not shown) or self-illuminated with calibration wavelengths. The reference points 326a-326c are separated into calibration wavelengths by the electronics (not shown) and are used to provide calibration across the optical recorders 320a-320c and to provide the calibration for compensation of the desired object 328, i.e., in some circumstances, fixed calibration points near or on the desired object may replace the holographic calibration plates.
Referring now to FIG. 46, a seventh exemplary embodiment of the three-dimensional calibration equipment 330 is depicted. The three-dimensional calibration equipment 330 includes optical recorders 332a-332c, a remote light source 334, and reference points 336a-336c in the field of view of the desired object 338. The three-dimensional calibration equipment 330 is suited to applications that do not utilize the patterns of a fixed calibration plate, or to applications where specific reference points 336a-336c can be illuminated by the remote light source 334 or self-illuminated with calibration wavelengths. A holographic calibration plate is not required; what is required is only those elements of the three-dimensional calibration equipment 330 that separate the desired object wavelengths from the calibration wavelengths. The calibration corrections are calculated from known points 336a-336c illuminated by the calibration wavelengths for each of the optical recorders 332a-332c, and the calibration corrections are applied to each object in each of the optical recorders 332a-332c. This simplified calibration and compensation method is most likely to be employed in wide angle images that uniformly capture many fixed calibration sources and where the fixed points 336a-336c are affixed to the desired object 338 or placed near stationary desired objects.
Referring now to FIG. 47, an eighth exemplary embodiment of the three-dimensional calibration equipment 340 is depicted. The three-dimensional calibration equipment 340 includes a band stop filter 342, optical recorders 344a- 344c, a light source 346, and a holographic calibration plate 348. Essentially, the band stop filter 342 is added to the calibration equipment 16, which prevents the illumination wavelength(s) of intruding hologram source(s) 350 from traveling to the region in the vicinity of the desired object 352 via the region in the vicinity of the holographic calibration plate 348 and the optical recorders 344a-344c. Such emanations from the intruding hologram source(s) 350 are undesirable when the three-dimensional calibration equipment 340 is used in a security or a surveillance application.
Referring now to FIG. 48, a ninth exemplary embodiment of the three-dimensional calibration equipment 354 is depicted. The three-dimensional calibration equipment 354 includes a stereoscopic microscope 356, optical recorders 358a, 358b, a light source 360, and a holographic calibration plate 362. FIG. 48 shows an apparatus which generates virtual holograms for the stereoscopic microscope 356. Calibration of the stereoscopic microscope 356 can be useful when multiple optical paths 362a, 362b are employed. Potential uses of calibration for the stereoscopic microscope 356 include projecting three-dimensional grids in the field of view of the lenses 364a, 364b of the stereoscopic microscope 356 to assist in counting specimens, such as white blood cells on microscope slides 366, or to ascertain the locations of imperfections in diamonds as a means of grading and identification in the case of theft. The use of three-dimensional holograms permits improved analysis as stereoscopic microscope images are scaled, especially in the depth dimension. When multiple virtual calibration patterns (not shown) are recorded on the holographic calibration plate 362, the multiple virtual calibration patterns are selectively accessed by the optical recorders 358a, 358b as the wavelength with which the light source 360 illuminates the holographic calibration plate 362 is varied or by movement of the holographic calibration plate 362. For the three-dimensional calibration equipment 354, one calibration wavelength is used to record a virtual calibration pattern and another calibration wavelength is used to symbolically identify the pattern (grid) intersections, e.g. with a bar code or alphanumeric sequence. Additional wavelengths are used for illuminating holographic calibration grids (virtual calibration patterns) with finer grid structures for more accurate determination of three-dimensional spatial positioning.
Combining the holographic calibration plate 362 with a holographic calibration grid bearing marked symbolic identification, such as a bar code, decreases the time for real-time recognition of grid intersections by commercial software. Referring now to FIG. 49, a tenth exemplary embodiment of the three-dimensional calibration equipment 368 is depicted. The three-dimensional calibration equipment 368 includes a light source 370 which excites the holographic calibration pattern, a holographic calibration plate 372 that contains holographically encoded calibration information, and two or more optical recorders 374a-374c which acquire images in different optical wavelengths. The light source 370 may be placed on either side of the holographic calibration plate 372. This embodiment projects holographic calibration patterns into the field of view of multiple optical recorders which acquire images in different optical wavelengths in order to capture multiple views referenced to common calibration patterns. The three-dimensional calibration equipment 368 is used for identifying the desired object 378 using a combination of characteristics. For example, these characteristics include unique object identifiers (e.g. finger prints), object geometry (e.g. hand geometry) and/or object substructures (e.g. veins viewed at non-visible wavelengths). The wavelength selective filters 202 and 204 enhance selection of identification characteristics. For example, a band pass filter which only passes the infrared wavelengths is most useful for vein identification. The filters may be selected in real time using opto-acoustic implementations for the wavelength selective filters to select a particular eye or hair color corresponding to a specific desired object(s) 378 for which an identification is desired.
The three-dimensional calibration equipment 368 minimizes post processing by the use of band pass and band stop optical filters and/or continuous holographic calibration and correction, which speeds up the matching of the desired object 378.
The exemplary embodiment of the three-dimensional calibration equipment 368 can be employed in banking where the underlying identification objectives are to: (1) minimize false acceptances and (2) minimize false rejections. The multi-criteria identification system shown in FIG. 49 can apply low false rejection criteria (e.g. hand geometry) to low risk activities (e.g. balance checks) and apply low false acceptance criteria (finger print, vein structure, etc.) to high cost of failure activities (e.g. funds withdrawal).
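The risk-tiered decision logic described for the banking example can be sketched as follows. The criteria names, risk labels and set-based matching are illustrative assumptions; a deployed system would use scored biometric matchers and tuned thresholds.

```python
# Hypothetical criteria sets: hand geometry gives low false rejection;
# fingerprint plus vein structure gives low false acceptance.
LOW_RISK_CRITERIA = {'hand_geometry'}
HIGH_RISK_CRITERIA = {'fingerprint', 'vein_structure'}

def authorize(activity_risk, matched_criteria):
    """Approve low-risk activities (e.g. balance checks) on any
    low-false-rejection criterion; approve high-risk activities
    (e.g. funds withdrawal) only when every low-false-acceptance
    criterion has matched."""
    if activity_risk == 'low':
        return bool(LOW_RISK_CRITERIA & matched_criteria)
    return HIGH_RISK_CRITERIA <= matched_criteria
```

So a customer whose hand geometry matches can check a balance, but withdrawing funds additionally requires both the fingerprint and the vein-structure criteria to match.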
The present invention has several advantages over the prior art three-dimensional imaging systems. For the three-dimensional display 12, when ultra short optical pulses are employed, the ultra short optical pulses converge at very precise locations on the optical mixer 18, thereby creating high precision, high definition images in the display volume 28. The optical mixer elements 20a-20i can be varied in size, allowing the three-dimensional display 12 to be scalable from very small to very large displays. There is no inherent size limitation. The optical mixer element layout can be optimized to prevent unintended pulse overlap from creating unintended outputs. The optical mixer 18 is capable of being viewed at high refresh rates. Since the three-dimensional display 12 is operable using optical pulses 32a-32k that are in visible and non-visible wavelengths, the three-dimensional display 12 can be used as test equipment for both the three-dimensional image scanner 14 and the two-dimensional image scanner 15. The display electronics 24 allows for (i) the simultaneous excitation of voxels in the display volume, (ii) dynamic adjustment of the intensity of the light produced in a voxel in the display volume 28, and (iii) the selection of optical source combinations for each voxel. These three characteristics of the display electronics 24 achieve the needed conversion efficiency in the optical mixer 18 to produce equalization of intensity throughout the display volume 28. Using intense optical pulses 32a-32k to excite the optical mixer 18 increases voxel intensity in the display volume 28 and thereby increases viewing angles, including angles close to the plane of the display 12. Increased viewing angle is also achieved when the optical mixer elements 20a-20i employ lenses, since the cone of acceptance of each of the optical mixer elements 20a-20i increases over an element without a lens.
Using optical mixer elements 20a-20i without filters reduces the cost of the overall optical mixer 18.
Compared to three-dimensional image scanners in the prior art, which capture only shape, the three-dimensional image scanner 14 captures shape, color and texture simultaneously in real-time.
The three-dimensional calibration equipment herein described provides continuous (real-time) calibration of stereoscopic information from still or moving objects, for both optical and electronic based optical recorders. It improves the quality of the stereoscopic images, for example, by more accurately combining information from multiple optical recording devices, and permits the construction of more accurate three-dimensional models from such images. The three-dimensional calibration equipment is both an apparatus for use with image capture equipment such as cameras and a method to compensate for the absolute and/or differential distortion within and across stereoscopic imaging equipment used together as a system.
By employing a holographic projection technique that creates a virtual calibration pattern in the desired object space, the three-dimensional calibration equipment automates the calibration process. Because the virtual calibration pattern is viewed through the same lens system of the optical recorders that is used for stereoscopic imaging, all static distortion is taken into account. The virtual calibration pattern allows explicit calibration in the desired object space without the limitations of real calibration patterns that must be physically moved through the image space.
One potential use of the higher quality images obtained from the three-dimensional calibration equipment is to more accurately determine the distance to points on the stereoscopically recorded or viewed desired object, for uses that include the construction of three-dimensional images of the desired object. Because the three-dimensional calibration equipment can provide calibration during normal stereoscopic operation without affecting the desired image, it is usable for continuous calibration, e.g., when the optical recorders are equipped with variable zoom lenses that exhibit differing optical properties as the zoom setting is changed, or when mechanical misalignment occurs in one or more of the optical paths to the optical recorders. The three-dimensional calibration equipment operates continuously without interfering with the optical recorders. By changing the virtual calibration patterns, for example, by varying the hologram illumination frequency when alternative holograms are recorded at different frequencies, several ranges and positions of the image field of view can be continuously recalibrated to compensate for variations in image system distortions (e.g., as the degree of telephoto magnification is changed). With the three-dimensional calibration equipment, a virtual calibration pattern can be produced anywhere in the field of view of the optical recorders, and the calibration is done quickly. Unlike other calibration systems, the desired information and the calibration information are recorded through the same lens of each image recorder, and the desired image is captured at the same time as the calibration image, which enables real-time or post-processing calibration.
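The correction step described above, measuring differences between where calibration-pattern points are observed and where they should appear, then deriving shift, rotation and scaling corrections, can be illustrated as a least-squares 2-D similarity fit. This is only an illustrative sketch under simplifying assumptions (planar pattern, 2-D image coordinates), not the disclosed implementation; the function name and point lists are hypothetical.

```python
import math

def similarity_correction(obs_pts, ref_pts):
    """Least-squares 2-D similarity (Procrustes-style) fit: returns the
    scale s, rotation theta (radians) and translation (tx, ty) that best
    map observed calibration-pattern points onto their reference positions."""
    n = len(obs_pts)
    ox = sum(x for x, _ in obs_pts) / n   # centroid of observed points
    oy = sum(y for _, y in obs_pts) / n
    rx = sum(x for x, _ in ref_pts) / n   # centroid of reference points
    ry = sum(y for _, y in ref_pts) / n
    a = b = norm = 0.0
    for (x, y), (u, v) in zip(obs_pts, ref_pts):
        xc, yc, uc, vc = x - ox, y - oy, u - rx, v - ry
        a += xc * uc + yc * vc   # accumulates s*cos(theta) * norm
        b += xc * vc - yc * uc   # accumulates s*sin(theta) * norm
        norm += xc * xc + yc * yc
    theta = math.atan2(b, a)
    s = math.hypot(a, b) / norm
    # Translation maps the observed centroid onto the reference centroid.
    tx = rx - s * (math.cos(theta) * ox - math.sin(theta) * oy)
    ty = ry - s * (math.sin(theta) * ox + math.cos(theta) * oy)
    return s, theta, (tx, ty)
```

Applying the recovered (s, theta, tx, ty) to a recorder's view corresponds to the adjusting step of the disclosed method, which the disclosure notes may be performed mechanically or electronically.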
The three-dimensional calibration equipment herein described can operate in real time and at high speeds, which makes the equipment suitable for applications in which traditional calibration equipment is not, such as the cockpit of an aircraft, where it is not possible to place a traditional calibration object in the field of view. This is especially true when the optical recorders are subject to acceleration or other causes of changes in the optical paths between the desired object and the optical recorders. Having a single frame of reference, with known points in the calibration pattern observable in each view, simplifies combining multiple camera views to produce a three-dimensional representation of a desired object. The three-dimensional calibration equipment of the present invention simplifies conformal mapping from a planar optical recorder to a non-planar coordinate system: holographic projection enables each optical recorder to view the same desired coordinate system, e.g., spherical, simplifying point matching and transformation. Computer-readable labeling of holographic pattern intersections improves performance, as identification of points in the field of view is simplified compared to the searching and matching operations performed in the prior art. The calibration equipment of the present invention can make use of multiple patterns, slightly shifted patterns, or patterns with finer detail to capture different views of the desired object. The calibration equipment 268, which employs the laser ranging measurement device 284, is quicker and more precise than traditional triangulation from two optical recorders. The discrete holographic calibration plates of the calibration equipment 318 are lower in cost and less susceptible to the vibrations present in moving vehicles than a single calibration plate spanning all optical recorders.
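For context on the comparison above: the "traditional triangulation from two optical recorders" that the laser ranging device 284 is said to outperform reduces, for an idealized rectified stereo pair, to the classic depth-from-disparity relation Z = f·B/d. The sketch below is a textbook illustration of that relation only (parameter names are hypothetical), not anything disclosed here.

```python
def depth_from_disparity(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Idealized rectified-stereo depth: Z = f * B / d.
    focal_px:     focal length expressed in pixels
    baseline_m:   separation between the two recorders, meters
    disparity_px: horizontal pixel offset of the matched point between views"""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive (point at infinity or bad match)")
    return focal_px * baseline_m / disparity_px

# A point matched 4 px apart by recorders 10 cm apart with f = 800 px
# lies 20 m away; because depth varies as 1/d, small disparity errors
# produce large depth errors at range, which is one reason direct laser
# ranging can be quicker and more precise than triangulation.
z = depth_from_disparity(800.0, 0.10, 4.0)
```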
When the calibration equipment 354 is used in combination with the stereoscopic microscope 356, the calibration equipment 354 permits the stereoscopic microscope 356 to capture multiple biometric views from optical recorders with a common virtual calibration pattern.
It will be understood that the embodiments described herein are merely exemplary and that a person skilled in the art may make many variations and modifications without departing from the spirit and scope of the invention. All such variations and modifications are intended to be included within the scope of the present invention as defined in the appended claims.

Claims

WHAT IS CLAIMED: 1. A three-dimensional imaging system, comprising: a three-dimensional display; an image scanning device for capturing a three-dimensional image to be displayed on said three-dimensional display; and three-dimensional calibration equipment for calibrating said image scanning device, wherein both said three-dimensional display and said image scanning device employ optical pulses and non-linear optics to display and record, respectively, a three-dimensional image.
2. The three-dimensional imaging system of Claim 1, wherein said image scanning device is a three-dimensional imaging system.
3. The three-dimensional imaging system of Claim 1, wherein said image scanning device is a two-dimensional imaging system.
4. The three-dimensional imaging system of Claim 3, wherein said two-dimensional imaging system includes at least one two-dimensional camera.
5. A three-dimensional display, comprising: at least three pulsed optical sources; and an optical mixer movable in a display space, wherein said at least three pulsed optical sources are spatially separated so as to permit pulses emanating therefrom to overlap in a voxel within said display space and intersecting said optical mixer at a selected position, whereby a first-order non-linear interaction of said pulses causes said optical mixer to produce at least one pre-determined wavelength of electromagnetic waves.
6. The three-dimensional display of Claim 5, wherein said optical mixer includes a plurality of non-linear mixing elements.
7. The three-dimensional display of Claim 5, wherein said pulses emanating from said at least three pulsed optical sources are ultra short optical pulses.
8. The three-dimensional display of Claim 5, wherein said ultra short optical pulses have a pulse width in the range of femtoseconds to nanoseconds.
9. The three-dimensional display of Claim 5, further comprising at least one optical filter adapted to permit the passage of said at least one predetermined wavelength.
10. The three-dimensional display of Claim 6, wherein said at least three pulsed optical sources includes K pulsed optical sources, where K is greater than or equal to three.
11. The three-dimensional display of Claim 6, wherein said optical mixer recurrently sweeps through every voxel of a plurality of voxels in said display volume.
12. The three-dimensional display of Claim 11, wherein said optical mixer is planar in shape.
13. The three-dimensional display of Claim 11, wherein said optical mixer moves such that the normal to a centroid of said optical mixer maintains a constant direction.
14. The three-dimensional display of Claim 11, wherein said optical mixer rotates about an axis.
15. The three-dimensional display of Claim 11, further comprising display electronics.
16. The three-dimensional display of Claim 13, wherein said optical mixer is of a shape such that said optical mixer is capable of producing desired wavelengths in each voxel of the display volume and a mapping of said shape is known to said display electronics.
17. The three-dimensional display of Claim 16, wherein said display electronics selects combinations of a subset of said K pulsed optical sources to produce desired wavelengths at desired voxels and stores alternative possible combinations of said K pulsed optical sources as lists of predetermined pulsed optical source combinations, and wherein said predetermined lists of alternative possible combinations of pulsed optical sources equalize the peak intensity of the desired wavelengths produced from said combinations of said subset of said K pulsed optical sources.
18. The three-dimensional display of Claim 6, wherein a subset of said K pulsed optical sources operate so as to excite said optical mixer in a plurality of voxels with a predetermined combination of optical frequencies so as to produce a plurality of desired wavelengths in a time interval that is much less than the repetition rate of movement of said optical mixer so that persistence of vision of the viewer makes the illumination of said voxels appear to be simultaneous.
19. The three-dimensional display of Claim 18, wherein different subsets of said K pulsed optical sources are chosen for different voxels and different positions of said optical mixer so as to maintain an approximately constant conversion efficiency.
20. The three-dimensional display of Claim 18, wherein each of said plurality of non-linear mixing elements has a cone of acceptance which is used to select the different subsets of said K pulsed optical sources for different voxels and different positions of said optical mixer.
21. The three-dimensional display of Claim 18, wherein said K pulsed optical sources operate so as to excite said optical mixer in said plurality of voxels to produce said desired wavelengths.
22. The three-dimensional display of Claim 18, wherein one of said K optical sources emits a gating pulse of a pre-selected intensity and pulse width so as to control the brightness of the light produced in said plurality of voxels.
23. The three-dimensional display of Claim 18, wherein at least one of said K pulsed optical sources emits pulses of longer duration than the pulses emitted by the remaining K-1 pulsed optical sources.
24. The three-dimensional display of Claim 5, wherein each of said at least three pulsed optical sources includes a wavelength generator and a lens for focusing a wavelength of light.
25. The three-dimensional display of Claim 24, wherein said wavelength generator is a point source of light and is located at the focal point of said lens.
26. The three-dimensional display of Claim 5, wherein each of said at least three pulsed optical sources includes a wavelength generator, an optical splitter for dividing an optical pulse emitted by said wavelength generator into at least three optical pulses, and a pulse controller for independently delaying and attenuating each of said at least three optical pulses.
27. The three-dimensional display of Claim 5, wherein said optical mixer moves periodically at a rate of repetition of at least twenty frames per second.
28. The three-dimensional display of Claim 6, wherein said plurality of non-linear mixing elements is composed from a non-linear optical material chosen from the group consisting of LiNbO3, LiIO3, KH2PO4, Tl3AsSe3 (TAS), Hg2Cl2, KH2PO4 (KDP), KD2PO4 (DKDP or D*KDP), NH4H2PO4 (ADP), Hg2Br2 and BaTiO3; quantum well structure semiconductors made of GaAs, etc.; organic single crystals made of 4-nitrobenzylidene-3-acetamino-4-methoxyaniline (MNBA); organic single crystals made of 2-methyl-4-nitroaniline (MNA); conjugated organic high molecular compounds made of polydiacetylene; conjugated organic high molecular compounds made of polyarylene vinylene; semiconductor grain-dispersed glass comprising CdS dispersed in glass; and semiconductor grain-dispersed glass comprising CdSSe dispersed in glass.
29. The three-dimensional display of Claim 6, wherein each of said plurality of non-linear mixing elements further includes at least three sub-elements, each of which is made of a non-linear optical material that is optimized for a desired wavelength.
30. The three-dimensional display of Claim 29, wherein each of said sub-elements is optimized to produce a primary color chosen from the group consisting of red, blue, and green.
31. The three-dimensional display of Claim 30, wherein said sub-elements are arranged and spaced such that no two types of sub-elements optimized for the same desired wavelength are adjacent to one another.
32. The three-dimensional display of Claim 29, wherein each of said plurality of non-linear mixing elements further includes a non-linear mixing material.
33. The three-dimensional display of Claim 32, wherein each of said plurality of non-linear mixing elements further includes a lens for improving the cone of acceptance of said optical mixer sub-elements.
34. The three-dimensional display of Claim 33, wherein each of said plurality of non-linear mixing elements has a desired wavelength filter.
35. The three-dimensional display of Claim 33, wherein each of said plurality of non-linear mixing elements has a diffuser for improving the viewing angle of said optical mixer.
36. The three-dimensional display of Claim 34, wherein said lens is hemispherical, said non-linear mixing material is rectangular, and said wavelength filter is rectangular.
37. The three-dimensional display of Claim 34, wherein each of said plurality of non-linear mixing elements has an optical reflector positioned adjacent to said lens.
38. A three-dimensional image scanner for capturing a three-dimensional image of an object, comprising: a first pulsed optical source for generating an illuminating optical pulse at an illumination wavelength, said first pulsed optical source directing said illuminating optical pulse toward the object; a second pulsed optical source for generating a gating optical pulse at a gating wavelength; an optical mixer positioned to receive light reflected from the object at a single wavelength in response to interaction of said illuminating optical pulse with the object, a portion of said illuminating optical pulse and a portion of said gating optical pulse spatially and temporally overlapping each other within the optical mixer, thereby producing a first optical pulse indicative of the shape of the object and a second optical pulse indicative of the color of the object; and an optical recorder having a plurality of pixels responsive to output light emitted by said optical mixer, a first portion of said plurality of pixels having an associated filter which passes said first optical pulse and which blocks said second optical pulse, and a second portion of said plurality of pixels being unfiltered.
39. The three-dimensional image scanner of Claim 38, further comprising display electronics for controlling the relative timing of said first pulsed optical source and said second pulsed optical source.
40. The three-dimensional image scanner of Claim 38, wherein said filter is a coating on said first portion of said plurality of pixels.
41. The three-dimensional image scanner of Claim 38, wherein said optical mixer includes a plurality of non-linear mixing elements, each of which is placed directly in front of a corresponding one of said plurality of pixels and said filter.
42. The three-dimensional image scanner of Claim 38, wherein said optical recorder has a planar shape.
43. The three-dimensional image scanner of Claim 38, wherein pulses emanating from at least two of said pulsed optical sources are ultra short optical pulses.
44. The three-dimensional image scanner of Claim 38, wherein said ultra short optical pulses have a pulse width in the range of femtoseconds to nanoseconds.
45. The three-dimensional image scanner of Claim 41, wherein each of said plurality of non-linear mixing elements is composed from a non-linear optical material chosen from the group consisting of LiNbO3, LiIO3, KH2PO4, Tl3AsSe3 (TAS), Hg2Cl2, KH2PO4 (KDP), KD2PO4 (DKDP or D*KDP), NH4H2PO4 (ADP), Hg2Br2 and BaTiO3; quantum well structure semiconductors made of GaAs, etc.; organic single crystals made of 4-nitrobenzylidene-3-acetamino-4-methoxyaniline (MNBA); organic single crystals made of 2-methyl-4-nitroaniline (MNA); conjugated organic high molecular compounds made of polydiacetylene; conjugated organic high molecular compounds made of polyarylene vinylene; semiconductor grain-dispersed glass comprising CdS dispersed in glass; and semiconductor grain-dispersed glass comprising CdSSe dispersed in glass.
46. A method for calibrating a three-dimensional imaging system having optical apparatus for capturing an optical image of a desired object from at least two positions, comprising the steps of: projecting a virtual calibration pattern in the field of view of the optical apparatus; choosing one position of the optical apparatus as a reference position; assigning coordinates of a coordinate system relative to either the virtual calibration pattern or the reference position; measuring the differences in the virtual calibration pattern from a second position of the optical apparatus; calculating calibration corrections relative to the reference position based on the differences measured; and adjusting the optical apparatus based on the calibration corrections.
47. The method of Claim 46 further including the step of assigning the coordinate system at the second position.
48. The method of Claim 47, wherein the optical apparatus includes a single optical recorder that moves between a reference and a displaced position.
49. The method of Claim 48, wherein said single optical recorder is a three-dimensional camera.
50. The method of Claim 48, wherein said single optical recorder is a two-dimensional camera.
51. The method of Claim 48, wherein said single optical recorder includes an electronic imaging detector comprising a pixel array and said step of assigning coordinates is either in parallel to the pixel array or normal to the pixel array.
52. The method of Claim 47, wherein the optical apparatus includes at least two optical recorders, one of which is located at a reference position and another of which is located at a displaced position.
53. The method of Claim 52, wherein said at least two optical recorders are three-dimensional cameras.
54. The method of Claim 52, wherein said at least two optical recorders are two-dimensional cameras.
55. The method of Claim 52, wherein said at least two optical recorders include an electronic imaging detector comprising a pixel array and said step of assigning coordinates is either in parallel to the pixel array or normal to the pixel array.
56. The method of Claim 47, wherein said step of assigning coordinates is in alignment with the virtual calibration pattern.
57. The method of Claim 47, wherein the coordinates are assigned arbitrarily.
58. The method of Claim 47, wherein said adjusting step is performed mechanically or electronically.
59. A method of calibrating an optical recorder of a three-dimensional imaging system, comprising the steps of: projecting a calibration pattern at a calibration wavelength on a plane that is tangent to the nearest point of a desired object as measured from the optical recorder; labeling an intersection point P between said calibration pattern and the desired object; positioning the end of a laser light beam operating at said calibration wavelength at the point P; measuring the distance from the point P to said calibration pattern; generating a second calibration pattern at a greater distance from the optical recorder; and repeating said steps of labeling, positioning, and measuring when said second calibration pattern intersects the desired object.
60. The method of Claim 59, wherein the intersection of said calibration pattern with the desired object includes a plurality of intersection points, said method further comprising the steps of: labeling a subset of said plurality of intersection points in numerical order starting with a first point and continuing to a last point; and repeating said steps of positioning and measuring starting with the first point and continuing through the subset to the last point.
61. The method of Claim 59, further comprising the steps of: generating a second calibration pattern at a greater distance from the optical recorder; and repeating said steps of labeling, positioning, and measuring when said calibration pattern intersects the desired object.
62. A method of calibrating a three-dimensional imaging system relative to a desired object to be imaged, at least two optical recorders to be calibrated, and two holographic calibration plates placed in the field of view of a respective one of the at least two optical recorders wherein each of said holographic calibration plates contains the same hologram, comprising the steps of: positioning the calibration plates relative to each other to approximate a monolithic calibration plate; projecting a calibration pattern in the field of view of the desired object through each of the calibration plates; determining the position of at least three reference points in the vicinity of the desired object relative to each of the at least two optical recorders; determining a corresponding position on the calibration pattern corresponding to each reference point; determining the misalignment of the virtual calibration pattern; determining the correction factors as a function of position of the desired object relative to each of said at least two optical recorders; and applying the correction factors to each of said at least two optical recorders.
63. The method of Claim 62, wherein said correction factors include shift, rotation and scaling in an orthogonal coordinate system as a function of position of the desired object in three-dimensional space.
64. Apparatus for calibrating a three-dimensional imaging system relative to a desired object, the desired object being illuminated by desired wavelengths, comprising: acquiring means for acquiring an optical image of the desired object from at least two positions; a holographic calibration plate placed between said acquiring means and the desired object; and a light source of at least one of a set of calibration wavelengths for illuminating said holographic calibration plate so as to project at least one virtual calibration pattern in the field of view of said acquiring means and in the vicinity of the desired object.
65. The apparatus of Claim 64, wherein said acquiring means is a single optical recorder that moves between a reference position and a displaced position.
66. The apparatus of Claim 65, wherein said single optical recorder is a three-dimensional camera.
67. The apparatus of Claim 65, wherein said single optical recorder is a two-dimensional camera.
68. The apparatus of Claim 64, wherein said acquiring means is at least two optical recorders, one of which is located at a reference position and another of which is located at a displaced position.
69. The apparatus of Claim 68, wherein said at least two optical recorders are three-dimensional cameras.
70. The apparatus of Claim 68, wherein said at least two optical recorders are two-dimensional cameras.
71. The apparatus of Claim 64, wherein said light source is placed on the side of said holographic calibration plate that contains the desired object.
72. The apparatus of Claim 64, wherein said light source is placed on the side of said holographic calibration plate that contains said acquiring means.
73. The apparatus of Claim 64, wherein said holographic calibration plate has a shape that is planar, cylindrical, elliptical, or spherical, including a subset of these shapes.
74. The apparatus of Claim 64, wherein said calibration pattern has a shape that is cubic, cylindrical, or spherical.
75. The apparatus of Claim 64, wherein said calibration pattern includes a multidimensional wireframe of arbitrary shape with intersecting locations.
76. The apparatus of Claim 75, wherein said intersecting locations are labeled.
77. The apparatus of Claim 76, wherein said intersecting locations are labeled with numerals and/or bar codes.
78. The apparatus of Claim 64, wherein said holographic calibration plate has multiple, superimposed holograms recorded at different calibration wavelengths.
79. The apparatus of Claim 64, wherein said holographic calibration plate has multiple holograms recorded at different calibration wavelengths and at different positions on said holographic calibration plate.
80. The apparatus of Claim 75, wherein illumination of said holographic calibration plate with different calibration wavelengths produces virtual calibration objects comprising multidimensional wireframes of arbitrary shape of varying levels of detail and displacement.
81. The apparatus of Claim 64, further comprising a mirror that is reflective to said calibration wavelengths but is transparent to said desired wavelengths, said mirror being placed in the field of view of said acquiring means.
82. The apparatus of Claim 64, wherein the desired object reflects the desired wavelengths, which are optically separable from said calibration wavelengths.
83. The apparatus of Claim 64, wherein the desired object reflects the desired wavelengths, which are not optically separable from said calibration wavelengths.
84. The apparatus of Claim 64, further including wavelength selection means for separating the desired wavelengths from said calibration wavelengths; calibration electronics for processing said calibration wavelengths and the desired wavelengths; a first memory for storing data related to said calibration wavelengths; and a second memory for storing data related to the desired wavelengths.
85. The apparatus of Claim 84, wherein said wavelength selection means is either a wavelength selection filter or a CCD array.
86. The apparatus of Claim 84, wherein said calibration electronics includes a band pass filter and a band stop filter.
87. The apparatus of Claim 64, further comprising a shutter interposed between said acquiring means and the desired object, said shutter being operable to produce an image of the desired object from said calibration wavelengths for at least one frame and an image of the desired object from the desired wavelengths for at least one other frame.
88. The apparatus of Claim 64, further comprising a material that is reflective of at least one of said calibration wavelengths and that may be applied to the desired object at predetermined points.
89. The apparatus of Claim 64, further comprising a laser pointer which illuminates a point on the desired object with one of said calibration wavelengths.
90. The apparatus of Claim 64, further comprising a laser ranging calibration device which illuminates a point on the desired object with one of said calibration wavelengths.
91. The apparatus of Claim 64, wherein said holographic calibration plate includes a plurality of holographic calibration plates, each of said plurality of holographic calibration plates containing the same recorded hologram.
92. The apparatus of Claim 64, wherein said field of view of said acquiring means includes at least three reference points that are illuminated at said calibration wavelengths.
93. The apparatus of Claim 92, further comprising a beam of light that illuminates said at least three reference points with said calibration wavelengths.
94. The apparatus of Claim 64, further comprising a band stop filter located between the desired object and said holographic calibration plate for preventing an illuminating wavelength from an intruding hologram source from traveling to the vicinity of the desired object.
95. The apparatus of Claim 64, further comprising a stereoscopic microscope placed between the desired object and said holographic calibration plate.
96. The apparatus of Claim 64, further comprising a plate on which is imprinted a desired object to be identified.
97. Apparatus for calibrating a three-dimensional imaging system relative to a desired object to be imaged, comprising: at least two optical recorders; and a light source of at least one calibration wavelength for illuminating at least three reference points relative to the desired object to be recorded by said at least two optical recorders.
PCT/US2004/043408 2003-12-30 2004-12-22 Three-dimensional imaging system using optical pulses, non-linear optical mixers and holographic calibration WO2005065272A2 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
CA002550842A CA2550842A1 (en) 2003-12-30 2004-12-22 Three-dimensional imaging system using optical pulses, non-linear optical mixers and holographic calibration
US10/585,157 US8098275B2 (en) 2003-12-30 2004-12-22 Three-dimensional imaging system using optical pulses, non-linear optical mixers and holographic calibration
EP04815479A EP1709617A2 (en) 2003-12-30 2004-12-22 Three-dimensional imaging system using optical pulses, non-linear optical mixers and holographic calibration

Applications Claiming Priority (8)

Application Number Priority Date Filing Date Title
US53338403P 2003-12-30 2003-12-30
US53330503P 2003-12-30 2003-12-30
US53313403P 2003-12-30 2003-12-30
US60/533,384 2003-12-30
US60/533,134 2003-12-30
US60/533,305 2003-12-30
US53777304P 2004-01-20 2004-01-20
US60/537,773 2004-01-20

Publications (2)

Publication Number Publication Date
WO2005065272A2 true WO2005065272A2 (en) 2005-07-21
WO2005065272A3 WO2005065272A3 (en) 2005-10-06

Family

ID=34753896

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2004/043408 WO2005065272A2 (en) 2003-12-30 2004-12-22 Three-dimensional imaging system using optical pulses, non-linear optical mixers and holographic calibration

Country Status (4)

Country Link
US (1) US8098275B2 (en)
EP (1) EP1709617A2 (en)
CA (1) CA2550842A1 (en)
WO (1) WO2005065272A2 (en)

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2428793A (en) * 2004-06-18 2007-02-07 Japan Aerospace Exploration A fluorescent transparent camera calibration tool
WO2009055195A1 (en) * 2007-10-23 2009-04-30 Gii Acquisition, Llc Dba General Inspection, Llc Calibration device for use in an optical part measuring system
CN102456296A (en) * 2010-10-14 2012-05-16 上海科斗电子科技有限公司 LED three-dimensional display screen
CN102740105A (en) * 2012-06-08 2012-10-17 美信华成(北京)科技有限公司 Method and system for displaying three-dimensional images
TWI451358B (en) * 2007-02-14 2014-09-01 Photint Venture Group Inc Banana codec
US20210146978A1 (en) * 2019-11-20 2021-05-20 Thales Canada Inc. High-integrity object detection system and method
CN118169901A (en) * 2024-05-13 2024-06-11 成都工业学院 Transparent three-dimensional display device based on conjugate viewpoint imaging
RU2821221C1 (en) * 2023-12-13 2024-06-18 Федеральное государственное бюджетное учреждение науки Институт оптики атмосферы им. В.Е. Зуева Сибирского отделения Российской академии наук Digital system for synchronizing operating modes of elements of active optical imaging systems

Families Citing this family (113)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8339379B2 (en) * 2004-04-29 2012-12-25 Neonode Inc. Light-based touch screen
US9052777B2 (en) 2001-11-02 2015-06-09 Neonode Inc. Optical elements with alternating reflective lens facets
US9213443B2 (en) 2009-02-15 2015-12-15 Neonode Inc. Optical touch screen systems using reflected light
US9052771B2 (en) 2002-11-04 2015-06-09 Neonode Inc. Touch screen calibration and update methods
SE0103835L (en) * 2001-11-02 2003-05-03 Neonode Ab Touch screen realized by display unit with light transmitting and light receiving units
WO2009008786A1 (en) * 2007-07-06 2009-01-15 Neonode Inc. Scanning of a touch screen
US9164654B2 (en) 2002-12-10 2015-10-20 Neonode Inc. User interface for mobile computer unit
US9778794B2 (en) 2001-11-02 2017-10-03 Neonode Inc. Light-based touch screen
US9471170B2 (en) 2002-11-04 2016-10-18 Neonode Inc. Light-based touch screen with shift-aligned emitter and receiver lenses
US8674966B2 (en) 2001-11-02 2014-03-18 Neonode Inc. ASIC controller for light-based touch screen
US8896575B2 (en) 2002-11-04 2014-11-25 Neonode Inc. Pressure-sensitive touch screen
US8416217B1 (en) 2002-11-04 2013-04-09 Neonode Inc. Light-based finger gesture user interface
US8587562B2 (en) 2002-11-04 2013-11-19 Neonode Inc. Light-based touch screen using elliptical and parabolic reflectors
US8403203B2 (en) * 2002-12-10 2013-03-26 Neonode Inc. Component bonding using a capillary effect
US8902196B2 (en) 2002-12-10 2014-12-02 Neonode Inc. Methods for determining a touch location on a touch screen
US9389730B2 (en) * 2002-12-10 2016-07-12 Neonode Inc. Light-based touch screen using elongated light guides
US9195344B2 (en) 2002-12-10 2015-11-24 Neonode Inc. Optical surface using a reflected image for determining three-dimensional position information
JP4605507B2 (en) * 2005-12-14 2011-01-05 富士電機デバイステクノロジー株式会社 Three-dimensional stereoscopic image display device
US20090273662A1 (en) * 2006-03-15 2009-11-05 Zebra Imaging, Inc. Systems and Methods for Calibrating a Hogel 3D Display
US9843790B2 (en) 2006-03-15 2017-12-12 Fovi 3D, Inc. Dynamic autostereoscopic displays
US20080243415A1 (en) * 2007-01-30 2008-10-02 Applera Corporation Calibrating the Positions of a Rotating and Translating Two-Dimensional Scanner
EP2122409B1 (en) * 2007-02-25 2016-12-07 Humaneyes Technologies Ltd. A method and a system for calibrating and/or visualizing a multi image display and for reducing ghosting artifacts
WO2009013744A2 (en) 2007-07-23 2009-01-29 Humaneyes Technologies Ltd. Multi view displays and methods for producing the same
US7988297B2 (en) * 2007-10-19 2011-08-02 Look Dynamics, Inc. Non-rigidly coupled, overlapping, non-feedback, optical systems for spatial filtering of fourier transform optical patterns and image shape content characterization
US20090179852A1 (en) * 2008-01-14 2009-07-16 Refai Hakki H Virtual moving screen for rendering three dimensional image
KR100914961B1 (en) * 2008-01-23 2009-09-02 성균관대학교산학협력단 Method and system for determining optimal exposure of structured light based 3d camera
US8666131B2 (en) * 2008-05-15 2014-03-04 David Allburn Biometric self-capture criteria, methodologies, and systems
US9794448B1 (en) * 2008-06-04 2017-10-17 Hao-jan Chang Visible multiple codes system, method and apparatus
US20100079580A1 (en) * 2008-09-30 2010-04-01 Waring Iv George O Apparatus and method for biomedical imaging
EP2351371B1 (en) * 2008-10-28 2017-12-27 Nxp B.V. Method for buffering streaming data and a terminal device
DE102008043621A1 (en) * 2008-11-10 2010-05-12 Seereal Technologies S.A. Holographic color display
JP2012510701A (en) * 2008-11-28 2012-05-10 コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ Display system, control unit, method, and computer program for imparting a three-dimensional sensation to ambient lighting
US8775023B2 (en) 2009-02-15 2014-07-08 Neonode Inc. Light-based touch controls on a steering wheel and dashboard
US9063614B2 (en) 2009-02-15 2015-06-23 Neonode Inc. Optical touch screens
US8693745B2 (en) * 2009-05-04 2014-04-08 Duke University Methods and computer program products for quantitative three-dimensional image correction and clinical parameter computation in optical coherence tomography
DE102009021233A1 (en) * 2009-05-14 2010-11-18 Siemens Aktiengesellschaft Capturing thermal images of an object
US8937641B1 (en) * 2009-05-18 2015-01-20 The United States Of America As Represented By The Secretary Of The Navy Holographic map
US8582824B2 (en) * 2009-06-25 2013-11-12 Twin Coast Metrology Cell feature extraction and labeling thereof
JP5455213B2 (en) * 2009-11-17 2014-03-26 Necシステムテクノロジー株式会社 Image drawing apparatus, image drawing method, and program
DE102010005358B4 (en) * 2010-01-21 2016-01-14 Deutsches Zentrum für Luft- und Raumfahrt e.V. Method and device for calibrating two optical sensor systems
US20110181587A1 (en) * 2010-01-22 2011-07-28 Sony Corporation Image display device having imaging device
US9456204B2 (en) * 2010-03-16 2016-09-27 Universal Electronics Inc. System and method for facilitating configuration of a controlling device via a 3D sync signal
CN102812424B (en) * 2010-03-24 2016-03-16 内奥诺德公司 For the lens combination of the touch-screen based on light
US8994786B2 (en) * 2010-04-08 2015-03-31 City University Of Hong Kong Multiple view display of three-dimensional images
CN201804207U (en) * 2010-04-23 2011-04-20 张瑞聪 Light shift compensation device of multicolor-holographic image composition equipment
US9393694B2 (en) 2010-05-14 2016-07-19 Cognex Corporation System and method for robust calibration between a machine vision system and a robot
GB201008338D0 (en) * 2010-05-19 2010-07-07 Sec Dep For Innovation Univers Infinity image hologram
GB201008281D0 (en) 2010-05-19 2010-06-30 Nikonovas Arkadijus Indirect analysis and manipulation of objects
US9200779B2 (en) 2010-05-25 2015-12-01 Nokia Technologies Oy Three-dimensional display for displaying volumetric images
KR101456112B1 (en) 2010-07-12 2014-11-04 오티스 엘리베이터 컴파니 Speed and position detection system
US9466148B2 (en) * 2010-09-03 2016-10-11 Disney Enterprises, Inc. Systems and methods to dynamically adjust an image on a display monitor represented in a video feed
KR101670927B1 (en) * 2010-11-05 2016-11-01 삼성전자주식회사 Display apparatus and method
JP2012221211A (en) * 2011-04-08 2012-11-12 Nintendo Co Ltd Information processor, information processing program, information processing method and information processing system
EP2715669A4 (en) * 2011-05-25 2015-03-18 Third Dimension Ip Llc Systems and methods for alignment, calibration and rendering for an angular slice true-3d display
US8854424B2 (en) 2011-06-08 2014-10-07 City University Of Hong Kong Generating an aerial display of three-dimensional images from a single two-dimensional image or a sequence of two-dimensional images
KR101087180B1 (en) * 2011-06-22 2011-11-28 동국대학교 경주캠퍼스 산학협력단 Reliable extraction scheme of 3 dimensional shape of metallic surface
US9286658B2 (en) * 2012-03-22 2016-03-15 Qualcomm Incorporated Image enhancement
WO2013163391A1 (en) * 2012-04-25 2013-10-31 The Trustees Of Columbia University In The City Of New York Surgical structured light system
EP2672461A1 (en) * 2012-06-05 2013-12-11 a.tron3d GmbH Method for continuing recordings to detect three-dimensional geometries of objects
JP6074926B2 (en) * 2012-07-05 2017-02-08 カシオ計算機株式会社 Image processing apparatus, image processing method, and program
US9163938B2 (en) * 2012-07-20 2015-10-20 Google Inc. Systems and methods for image acquisition
US10282034B2 (en) 2012-10-14 2019-05-07 Neonode Inc. Touch sensitive curved and flexible displays
US9207800B1 (en) 2014-09-23 2015-12-08 Neonode Inc. Integrated light guide and touch screen frame and multi-touch determination method
US9164625B2 (en) 2012-10-14 2015-10-20 Neonode Inc. Proximity sensor for determining two-dimensional coordinates of a proximal object
US9921661B2 (en) 2012-10-14 2018-03-20 Neonode Inc. Optical proximity sensor and associated user interface
US10674135B2 (en) 2012-10-17 2020-06-02 DotProduct LLC Handheld portable optical scanner and method of using
US9332243B2 (en) * 2012-10-17 2016-05-03 DotProduct LLC Handheld portable optical scanner and method of using
US9117267B2 (en) 2012-10-18 2015-08-25 Google Inc. Systems and methods for marking images for three-dimensional image generation
US9560246B2 (en) * 2012-12-14 2017-01-31 The Trustees Of Columbia University In The City Of New York Displacement monitoring system having vibration cancellation capabilities
US9838824B2 (en) 2012-12-27 2017-12-05 Avaya Inc. Social media processing with three-dimensional audio
US9892743B2 (en) * 2012-12-27 2018-02-13 Avaya Inc. Security surveillance via three-dimensional audio space presentation
US9301069B2 (en) 2012-12-27 2016-03-29 Avaya Inc. Immersive 3D sound space for searching audio
US10203839B2 (en) 2012-12-27 2019-02-12 Avaya Inc. Three-dimensional generalized space
US9039706B2 (en) 2013-03-13 2015-05-26 DePuy Synthes Products, Inc. External bone fixation device
EP3281592B1 (en) 2013-03-13 2021-04-21 DePuy Synthes Products, Inc. External bone fixation device
US8864763B2 (en) 2013-03-13 2014-10-21 DePuy Synthes Products, LLC External bone fixation device
DE102013006994A1 (en) * 2013-04-19 2014-10-23 Carl Zeiss Microscopy Gmbh Digital microscope and method for optimizing the workflow in a digital microscope
US9927457B2 (en) * 2013-08-23 2018-03-27 The United States Of America, As Represented By The Secretary Of The Navy Single beam/detector optical remote cross-flow sensor
US9267893B2 (en) * 2013-10-01 2016-02-23 Wisconsin Alumni Research Foundation Triple sum frequency coherent multidimensional imaging
US9628778B2 (en) * 2013-10-14 2017-04-18 Eys3D Microelectronics, Co. Calibration system of a stereo camera and calibration method of a stereo camera
WO2015102626A1 (en) * 2013-12-31 2015-07-09 Empire Technology Development Llc Three-dimensional display device using fluorescent material
CN104897051B (en) * 2014-03-03 2019-01-11 卡尔蔡司显微镜有限责任公司 For measuring the calibration plate and its application method of calibration to digit microscope
US10198647B2 (en) * 2015-09-25 2019-02-05 Datalogic IP Tech, S.r.l. Compact imaging module with range finder
EP3420535B1 (en) * 2016-02-26 2022-09-07 University Of Southern California Optimized volumetric imaging with selective volume illumination and light field detection
KR102571242B1 (en) * 2016-07-11 2023-08-25 삼성디스플레이 주식회사 Plastic substrate with improved hardness and display device comprising the same
US10136055B2 (en) * 2016-07-29 2018-11-20 Multimedia Image Solution Limited Method for stitching together images taken through fisheye lens in order to produce 360-degree spherical panorama
US10835318B2 (en) 2016-08-25 2020-11-17 DePuy Synthes Products, Inc. Orthopedic fixation control and manipulation
US10834377B2 (en) * 2016-08-29 2020-11-10 Faro Technologies, Inc. Forensic three-dimensional measurement device
AU2016433859A1 (en) * 2016-12-30 2019-07-04 Barco Nv System and method for camera calibration
US10559213B2 (en) * 2017-03-06 2020-02-11 Rosemount Aerospace Inc. Method and system for aircraft taxi strike alerting
US10186051B2 (en) 2017-05-11 2019-01-22 Dantec Dynamics A/S Method and system for calibrating a velocimetry system
CN107133613B (en) * 2017-06-06 2020-06-30 上海天马微电子有限公司 Display panel and display device
KR102389197B1 (en) * 2017-08-16 2022-04-21 엘지디스플레이 주식회사 Display Device
WO2019060645A1 (en) 2017-09-20 2019-03-28 Look Dynamics, Inc. Photonic neural network system
GB201802597D0 (en) * 2018-02-16 2018-04-04 Vision Rt Ltd A calibration object for calibrating a patient monitoring system
US10823664B2 (en) 2018-06-22 2020-11-03 Wisconsin Alumni Research Foundation Ultrafast, multiphoton-pump, multiphoton-probe spectroscopy
JP7163656B2 (en) * 2018-07-30 2022-11-01 株式会社リコー Delivery system, receiving client terminal, delivery method
EP3844949A1 (en) * 2018-08-29 2021-07-07 PCMS Holdings, Inc. Optical method and system for light field displays based on mosaic periodic layer
US10832377B2 (en) * 2019-01-04 2020-11-10 Aspeed Technology Inc. Spherical coordinates calibration method for linking spherical coordinates to texture coordinates
US11439436B2 (en) 2019-03-18 2022-09-13 Synthes Gmbh Orthopedic fixation strut swapping
US11304757B2 (en) 2019-03-28 2022-04-19 Synthes Gmbh Orthopedic fixation control and visualization
CN115039060A (en) 2019-12-31 2022-09-09 内奥诺德公司 Non-contact touch input system
US11360375B1 (en) 2020-03-10 2022-06-14 Rockwell Collins, Inc. Stereoscopic camera alignment via laser projection
US11334997B2 (en) 2020-04-03 2022-05-17 Synthes Gmbh Hinge detection for orthopedic fixation
US11486818B2 (en) 2020-05-26 2022-11-01 Wisconsin Alumni Research Foundation Methods and systems for coherent multidimensional spectroscopy
US11669210B2 (en) 2020-09-30 2023-06-06 Neonode Inc. Optical touch sensor
CN112995432B (en) * 2021-02-05 2022-08-05 杭州叙简科技股份有限公司 Depth image identification method based on 5G double recorders
US20220264072A1 (en) * 2021-02-12 2022-08-18 Sony Group Corporation Auto-calibrating n-configuration volumetric camera capture array
US12111180B2 (en) * 2021-07-01 2024-10-08 Summer Robotics, Inc. Calibration of sensor position offsets based on rotation and translation vectors for matched trajectories
US11704835B2 (en) * 2021-07-29 2023-07-18 Summer Robotics, Inc. Dynamic calibration of 3D acquisition systems
WO2023028226A1 (en) 2021-08-27 2023-03-02 Summer Robotics, Inc. Multi-sensor superresolution scanning and capture system
WO2023177692A1 (en) 2022-03-14 2023-09-21 Summer Robotics, Inc. Stage studio for immersive 3-d video capture
US11974055B1 (en) 2022-10-17 2024-04-30 Summer Robotics, Inc. Perceiving scene features using event sensors and image sensors

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6167295A (en) * 1991-01-28 2000-12-26 Radionics, Inc. Optical and computer graphic stereotactic localizer
US6654490B2 (en) * 2000-08-25 2003-11-25 Limbic Systems, Inc. Method for conducting analysis of two-dimensional images

Family Cites Families (37)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3428393A (en) 1965-11-05 1969-02-18 Roger Lannes De Montebello Optical dissector
US3604780A (en) 1969-05-01 1971-09-14 Ibm Three-dimensional fiber optic display
US4160973A (en) 1977-10-11 1979-07-10 Massachusetts Institute Of Technology Three-dimensional display
DE2825417C2 (en) 1978-06-09 1980-08-07 Max-Planck-Gesellschaft Zur Foerderung Der Wissenschaften E.V., 3400 Goettingen Method and object setting device for setting a specimen slide with respect to the device axis of a corpuscular beam device
US4871231A (en) 1987-10-16 1989-10-03 Texas Instruments Incorporated Three dimensional color display and system
US5013917A (en) 1988-07-07 1991-05-07 Kaman Aerospace Corporation Imaging lidar system using non-visible light
US4979815A (en) 1989-02-17 1990-12-25 Tsikos Constantine J Laser range imaging system based on projective geometry
US5177752A (en) 1989-06-30 1993-01-05 Matsushita Electric Industrial Co., Ltd. Optical pulse generator using gain-switched semiconductor laser
US5148301A (en) 1990-02-27 1992-09-15 Casio Computer Co., Ltd. Liquid crystal display device having a driving circuit inside the seal boundary
IN187926B (en) 1992-09-10 2002-07-27 United Syndicate Insurance Ltd
US5451785A (en) 1994-03-18 1995-09-19 Sri International Upconverting and time-gated two-dimensional infrared transillumination imaging
US5489984A (en) 1994-04-01 1996-02-06 Imra America, Inc. Differential ranging measurement system and method utilizing ultrashort pulses
US5585913A (en) 1994-04-01 1996-12-17 Imra America Inc. Ultrashort pulsewidth laser ranging system employing a time gate producing an autocorrelation and method therefore
US5919140A (en) 1995-02-21 1999-07-06 Massachusetts Institute Of Technology Optical imaging using time gated scattered light
US5684621A (en) 1995-05-08 1997-11-04 Downing; Elizabeth Anne Method and system for three-dimensional display of information based on two-photon upconversion
JP3861306B2 (en) 1996-01-18 2006-12-20 Kddi株式会社 Optical pulse generator
US6108576A (en) 1996-03-18 2000-08-22 The Research Foundation Of City College Of New York Time-resolved diffusion tomographic 2D and 3D imaging in highly scattering turbid media
US5936767A (en) 1996-03-18 1999-08-10 Yale University Multiplanar autostereoscopic imaging system
US5939739A (en) 1996-05-31 1999-08-17 The Whitaker Corporation Separation of thermal and electrical paths in flip chip ballasted power heterojunction bipolar transistors
IT1286140B1 (en) 1996-07-02 1998-07-07 Cselt Centro Studi Lab Telecom PROCEDURE AND DEVICE FOR THE GENERATION OF ULTRA SHORT OPTICAL IMPULSES.
US5870220A (en) 1996-07-12 1999-02-09 Real-Time Geometry Corporation Portable 3-D scanning system and method for rapid shape digitizing and adaptive mesh generation
US6302542B1 (en) 1996-08-23 2001-10-16 Che-Chih Tsao Moving screen projection technique for volumetric three-dimensional display
US5936736A (en) 1996-09-30 1999-08-10 Asahi Seimitsu Kabushiki Kaisha Focusing method and apparatus for a surveying instrument having an AF function, and arrangement of an AF beam splitting optical system therein
US6031511A (en) 1997-06-10 2000-02-29 Deluca; Michael J. Multiple wave guide phosphorous display
US6024496A (en) 1998-01-06 2000-02-15 Delta Electronics, Inc. Shaft coupling arrangement including oil sleeve bearing and oil supply
US6177913B1 (en) 1998-04-23 2001-01-23 The United States Of America As Represented By The Secretary Of The Navy Volumetric display
US6765566B1 (en) 1998-12-22 2004-07-20 Che-Chih Tsao Method and apparatus for displaying volumetric 3D images
US6445491B2 (en) 1999-01-29 2002-09-03 Imra America, Inc. Method and apparatus for optical sectioning and imaging using time-gated parametric image amplification
JP2000221549A (en) 1999-02-02 2000-08-11 Japan Science & Technology Corp Method and device for generation of ultrashort pulse light using raman resonator
US6600168B1 (en) 2000-02-03 2003-07-29 Genex Technologies, Inc. High speed laser three-dimensional imager
JP4032603B2 (en) * 2000-03-31 2008-01-16 コニカミノルタセンシング株式会社 3D measuring device
FR2812741B1 (en) * 2000-08-02 2003-01-17 Ge Med Sys Global Tech Co Llc METHOD AND DEVICE FOR RECONSTRUCTING A DYNAMIC THREE-DIMENSIONAL IMAGE OF AN OBJECT TRAVELED BY A CONTRAST PRODUCT
US6724489B2 (en) 2000-09-22 2004-04-20 Daniel Freifeld Three dimensional scanning camera
US6816629B2 (en) 2001-09-07 2004-11-09 Realty Mapping Llc Method and system for 3-D content creation
JP3984018B2 (en) 2001-10-15 2007-09-26 ペンタックス株式会社 3D image detection apparatus and 3D image detection adapter
US6664501B1 (en) 2002-06-13 2003-12-16 Igor Troitski Method for creating laser-induced color images within three-dimensional transparent media
AU2003285098A1 (en) 2002-10-29 2004-05-25 Metron Systems, Inc. Calibration for 3d measurement system

Cited By (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2415250B (en) * 2004-06-18 2007-05-02 Japan Aerospace Exploration Transparent camera calibration tool for camera calibration and calibration method thereof
GB2428793B (en) * 2004-06-18 2007-05-23 Japan Aerospace Exploration Transparent camera calibration tool for camera calibration and calibration method thereof
GB2428793A (en) * 2004-06-18 2007-02-07 Japan Aerospace Exploration A fluorescent transparent camera calibration tool
TWI451358B (en) * 2007-02-14 2014-09-01 Photint Venture Group Inc Banana codec
WO2009055195A1 (en) * 2007-10-23 2009-04-30 Gii Acquisition, Llc Dba General Inspection, Llc Calibration device for use in an optical part measuring system
CN101836092A (en) * 2007-10-23 2010-09-15 Gii采集有限责任公司,以总检测有限责任公司的名义营业 The correcting device that is used for optical part measuring system
US8013990B2 (en) 2007-10-23 2011-09-06 Gii Acquisition, Llc Calibration device for use in an optical part measuring system
US7755754B2 (en) 2007-10-23 2010-07-13 Gii Acquisition, Llc Calibration device for use in an optical part measuring system
CN102456296A (en) * 2010-10-14 2012-05-16 上海科斗电子科技有限公司 LED three-dimensional display screen
CN102740105A (en) * 2012-06-08 2012-10-17 美信华成(北京)科技有限公司 Method and system for displaying three-dimensional images
US20210146978A1 (en) * 2019-11-20 2021-05-20 Thales Canada Inc. High-integrity object detection system and method
US11945478B2 (en) * 2019-11-20 2024-04-02 Ground Transportation Systems Canada Inc. High-integrity object detection system and method
RU2821221C1 (en) * 2023-12-13 2024-06-18 Федеральное государственное бюджетное учреждение науки Институт оптики атмосферы им. В.Е. Зуева Сибирского отделения Российской академии наук Digital system for synchronizing operating modes of elements of active optical imaging systems
CN118169901A (en) * 2024-05-13 2024-06-11 成都工业学院 Transparent three-dimensional display device based on conjugate viewpoint imaging
CN118169901B (en) * 2024-05-13 2024-07-02 成都工业学院 Transparent three-dimensional display device based on conjugate viewpoint imaging

Also Published As

Publication number Publication date
WO2005065272A3 (en) 2005-10-06
CA2550842A1 (en) 2005-07-21
EP1709617A2 (en) 2006-10-11
US8098275B2 (en) 2012-01-17
US20080012850A1 (en) 2008-01-17

Similar Documents

Publication Publication Date Title
US8098275B2 (en) Three-dimensional imaging system using optical pulses, non-linear optical mixers and holographic calibration
FI113987B (en) Screen Events
US10685492B2 (en) Switchable virtual reality and augmented/mixed reality display device, and light field methods
US20180224552A1 (en) Compressed-sensing ultrafast photography (cup)
US5111313A (en) Real-time electronically modulated cylindrical holographic autostereoscope
US8537204B2 (en) 3D television broadcasting system
US5745197A (en) Three-dimensional real-image volumetric display system and method
EP1706778B1 (en) Three-dimensional display device with optical path length adjuster
US10599098B2 (en) Method and apparatus for light field generation
CN104570382B (en) Device and method for achieving optical image super-resolution using integral shifting
CN105974573B (en) Light field spectrum microscopic imaging method and system based on microlens array
US20040130783A1 (en) Visual display with full accommodation
Nayar Computational cameras: approaches, benefits and limits
CN103777453B (en) True three-dimensional image display system and display method
KR20010093245A (en) Three-dimensional image sensing device and method, three-dimensional image displaying device and method, and three-dimensional image position changing device and method
CN106257919A (en) Full field vision mid-infrared imaging system
US4682029A (en) Stereoscopic infrared imager having a time-shared detector array
US6212007B1 (en) 3D-display including cylindrical lenses and binary coded micro-fields
EP2398235A2 (en) Imaging and projection devices and methods
CN103713463B (en) True three-dimensional image display system and display method
WO2008023196A1 (en) Three-dimensional image recording and display apparatus
CN103809365A (en) True three-dimensional image display system and true three-dimensional image display method
KR100932560B1 (en) 3D parallax image acquisition system
JP2007108626A (en) Stereoscopic image forming system
US20100194865A1 (en) Method of generating and displaying a 3d image and apparatus for performing the method

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A2

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW

AL Designated countries for regional patents

Kind code of ref document: A2

Designated state(s): GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LT LU MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
DPEN Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed from 20040101)
WWE Wipo information: entry into national phase

Ref document number: 2550842

Country of ref document: CA

NENP Non-entry into the national phase

Ref country code: DE

WWW Wipo information: withdrawn in national office

Country of ref document: DE

WWE Wipo information: entry into national phase

Ref document number: 2004815479

Country of ref document: EP

WWP Wipo information: published in national office

Ref document number: 2004815479

Country of ref document: EP

WWE Wipo information: entry into national phase

Ref document number: 10585157

Country of ref document: US

WWP Wipo information: published in national office

Ref document number: 10585157

Country of ref document: US