WO2008120217A2 - Depth mapping using projected patterns - Google Patents

Depth mapping using projected patterns

Info

Publication number
WO2008120217A2
Authority
WO
WIPO (PCT)
Prior art keywords
pattern
transparency
image
spots
uncorrelated
Prior art date
Application number
PCT/IL2008/000458
Other languages
French (fr)
Other versions
WO2008120217A3 (en)
Inventor
Barak Freedman
Alexander Shpunt
Meir Machline
Yoel Arieli
Original Assignee
Prime Sense Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US 11/899,542 (published as US 8,150,142 B2)
Application filed by Prime Sense Ltd.
Priority to US 12/522,171 (published as US 8,493,496 B2)
Publication of WO2008120217A2
Publication of WO2008120217A3
Priority to US 13/931,935 (published as US 9,885,459 B2)
Priority to US 15/841,361 (published as US 10,514,148 B2)

Classifications

    • F: MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F21: LIGHTING
    • F21V: FUNCTIONAL FEATURES OR DETAILS OF LIGHTING DEVICES OR SYSTEMS THEREOF; STRUCTURAL COMBINATIONS OF LIGHTING DEVICES WITH OTHER ARTICLES, NOT OTHERWISE PROVIDED FOR
    • F21V5/00: Refractors for light sources
    • F21V5/04: Refractors for light sources of lens shape
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01B: MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00: Measuring arrangements characterised by the use of optical techniques
    • G01B11/24: Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G01B11/25: Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00: Image analysis
    • G06T7/50: Depth or shape recovery
    • G06T7/521: Depth or shape recovery from laser ranging, e.g. using interferometry; from the projection of structured light
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00: Arrangements for image or video recognition or understanding
    • G06V10/10: Image acquisition
    • G06V10/12: Details of acquisition arrangements; Constructional details thereof
    • G06V10/14: Optical characteristics of the device performing the acquisition or on the illumination arrangements
    • G06V10/145: Illumination specially adapted for pattern recognition, e.g. using gratings
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V2201/00: Indexing scheme relating to image or video recognition or understanding
    • G06V2201/12: Acquisition of 3D measurements of objects
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00: Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30: Image reproducers

Definitions

  • the present invention relates generally to methods and systems for mapping of three- dimensional (3D) objects, and specifically to optical 3D mapping.
  • optical 3D mapping, i.e., generating a 3D profile of the surface of an object by processing an optical image of the object.
  • Some methods are based on projecting a laser speckle pattern onto the object, and then analyzing an image of the pattern on the object.
  • PCT International Publication WO 2007/043036 whose disclosure is incorporated herein by reference, describes a system and method for object reconstruction, in which a coherent light source and a generator of a random speckle pattern projects onto the object a coherent random speckle pattern.
  • An imaging unit detects the light response of the illuminated region and generates image data. Shifts of the pattern in the image of the object relative to a reference image of the pattern are used in real-time reconstruction of a 3D map of the object.
  • a pattern of spots is projected onto an object, and an image of the pattern on the object is processed in order to reconstruct a 3D map of the object.
  • the pattern on the object is created by projecting optical radiation through a transparency containing the pattern.
  • the embodiments disclosed herein differ in this respect from methods of 3D reconstruction that use laser speckle, in which the pattern is created by optical interference using a diffuser.
  • the novel patterns that are used in these embodiments make it possible to perform 3D reconstruction quickly and accurately, using a single, stationary transparency to project the pattern, and a single, stationary image capture assembly to capture images of the object.
  • apparatus for mapping an object including: an illumination assembly, including: a single transparency containing a fixed pattern of spots; and a light source, which is configured to transilluminate the single transparency with optical radiation so as to project the pattern onto the object; an image capture assembly, which is configured to capture an image of the pattern that is projected onto the object using the single transparency; and a processor, which is coupled to process the image captured by the image capture assembly so as to reconstruct a three-dimensional (3D) map of the object.
  • the pattern is uncorrelated over a range of depths that is mapped by the apparatus.
  • the image capture assembly is arranged to capture images of the pattern on the object from a single, fixed location and angle relative to the illumination assembly.
  • the transparency and light source are fixed in respective positions in the illumination assembly, and the processor is configured to reconstruct the 3D map using the images that are captured only from the single, fixed location and angle with the transparency and light source only in the respective positions.
  • the light source includes a point source of the optical radiation.
  • the light source may include a light-emitting diode (LED).
  • the processor is arranged to process a succession of images captured while the object is moving so as to map a 3D movement of the object, wherein the object is a part of a human body, and the 3D movement includes a gesture made by the part of the human body, and wherein the processor is coupled to provide an input to a computer application responsively to the gesture.
  • apparatus for mapping an object including: an illumination assembly, including: a transparency containing an uncorrelated pattern of spots; and a light source, which is configured to transilluminate the transparency with optical radiation so as to project the uncorrelated pattern onto the object; an image capture assembly, which is configured to capture an image of the uncorrelated pattern that is projected onto the object; and a processor, which is coupled to process the image captured by the image capture assembly so as to reconstruct a three-dimensional (3D) map of the object.
  • the uncorrelated pattern has a duty cycle that is less than 1/e.
  • the spots have a local duty cycle that varies across the pattern.
  • the transparency contains a plurality of parallel bands, repeating periodically in a first direction, each band containing a replica of the uncorrelated pattern extending across at least a part of the transparency in a second direction, perpendicular to the first direction.
  • the processor is configured to derive the 3D map by finding respective offsets between the pattern of the spots on multiple areas of the object captured in the image of the pattern that is projected onto the object and a reference image of the pattern, wherein the respective offsets are indicative of respective distances between the areas and the image capture assembly. In one embodiment, the spots have a local duty cycle that varies monotonically along an axis across the pattern, and the processor is configured to determine local gray levels of the multiple areas in the image responsively to the local duty cycle, and to estimate the respective offsets based on the local gray levels.
  • the spots in the transparency comprise micro-lenses arranged in the fixed or uncorrelated pattern.
  • apparatus for mapping an object including: an illumination assembly, including: a transparency containing a plurality of micro-lenses arranged in a non-uniform pattern; and a light source, which is configured to transilluminate the transparency with optical radiation so as to project the pattern onto the object; an image capture assembly, which is configured to capture an image of the pattern that is projected onto the object using the transparency; and a processor, which is coupled to process the image captured by the image capture assembly so as to reconstruct a three-dimensional (3D) map of the object.
  • the micro-lenses are configured to focus the optical radiation to form respective focal spots at a focal plane in the non-uniform pattern, and the light source includes optics for projecting the non-uniform pattern of the focal spots from the focal plane onto the object.
  • the micro-lenses have differing focal lengths, and the light source includes optics for projecting the non-uniform pattern of the focal spots so that the pattern that is projected on the object varies with distance from the illumination assembly.
  • a method for mapping an object including: transilluminating a single transparency containing a fixed pattern of spots so as to project the pattern onto the object; capturing an image of the pattern that is projected onto the object using the single transparency; and processing the captured image so as to reconstruct a three-dimensional (3D) map of the object.
  • a method for mapping an object including: transilluminating a transparency containing an uncorrelated pattern of spots so as to project the uncorrelated pattern onto the object; capturing an image of the uncorrelated pattern that is projected onto the object; and processing the captured image so as to reconstruct a three-dimensional (3D) map of the object.
  • a method for mapping an object including: transilluminating a transparency containing a plurality of micro-lenses arranged in a non-uniform pattern so as to project the non-uniform pattern onto the object; capturing an image of the non-uniform pattern that is projected onto the object; and processing the captured image so as to reconstruct a three-dimensional (3D) map of the object.
  • Fig. 1 is a schematic, pictorial illustration of a system for 3D mapping, in accordance with an embodiment of the present invention
  • Fig. 2 is a schematic top view of an imaging device for use in 3D mapping, in accordance with an embodiment of the present invention
  • Fig. 3 is a schematic top view of an illumination assembly for use in 3D mapping, in accordance with an embodiment of the present invention
  • Figs. 4-6 are schematic representations of patterns for use in 3D mapping, in accordance with embodiments of the present invention.
  • Fig. 7A is a schematic frontal view of a transparency for use in 3D mapping, in accordance with an embodiment of the present invention.
  • Fig. 7B is a schematic side view of the transparency of Fig. 7A, showing passage of optical rays through the transparency, in accordance with an embodiment of the present invention.
  • Fig. 8 is a schematic top view of an illumination assembly for use in 3D mapping, in accordance with another embodiment of the present invention.
  • Fig. 1 is a schematic, pictorial illustration of a system 20 for 3D optical mapping, in accordance with an embodiment of the present invention.
  • System 20 comprises an imaging device 22, which generates and projects a pattern onto an object 28 and captures an image of the pattern appearing on the object. Details of the design and operation of device 22 are shown in the figures that follow and are described hereinbelow with reference thereto.
  • device 22 projects an uncorrelated pattern of spots onto object 28.
  • the term "uncorrelated pattern" refers to a projected pattern of spots (which may be bright or dark), whose positions are uncorrelated in planes transverse to the projection beam axis.
  • the positions are uncorrelated in the sense that the auto-correlation of the pattern as a function of transverse shift is insignificant for any shift larger than the spot size and no greater than the maximum shift that may occur over the range of depths mapped by the system.
  • Random patterns, such as a laser speckle pattern, are uncorrelated in this sense.
  • Synthetic patterns created by human or computer design, such as pseudo-random and quasi-periodic patterns, may also be uncorrelated to the extent specified by the above definition.
  • An image processor 24 processes image data generated by device 22 in order to reconstruct a 3D map of object 28.
  • the term "3D map" refers to a set of 3D coordinates representing the surface of the object. The derivation of such a map based on image data is referred to herein as "3D mapping" or equivalently, "3D reconstruction."
  • Image processor 24 computes the 3D coordinates of points on the surface of object 28 by triangulation, based on the transverse shifts of the spots in an image of the pattern that is projected onto the object relative to a reference pattern at a known distance from device 22.
  • Image processor 24 may comprise a general-purpose computer processor, which is programmed in software to carry out the functions described hereinbelow.
  • the software may be downloaded to processor 24 in electronic form, over a network, for example, or it may alternatively be provided on tangible media, such as optical, magnetic, or electronic memory media.
  • processor 24 may be implemented in dedicated hardware, such as a custom or semi-custom integrated circuit or a programmable digital signal processor (DSP).
  • DSP programmable digital signal processor
  • processor 24 is shown in Fig. 1, by way of example, as a separate unit from imaging device 22, some or all of the processing functions of processor 24 may be performed by suitable dedicated circuitry within the housing of the imaging device or otherwise associated with the imaging device.
  • the 3D map that is generated by processor 24 may be used for a wide range of different purposes.
  • the map may be sent to an output device, such as a display 26, which shows a pseudo-3D image of the object. In the example shown in Fig. 1, object 28 comprises all or a part (such as a hand) of the body of a subject. In this case, system 20 may be used to provide a gesture-based user interface, in which user movements detected by means of device 22 control an interactive computer application, such as a game, in place of tactile interface elements such as a mouse, joystick or other accessory.
  • system 20 may be used to create 3D maps of objects of other types, for substantially any application in which 3D coordinate profiles are needed.
  • Fig. 2 is a schematic top view of device 22, in accordance with an embodiment of the present invention.
  • An illumination assembly 30 in device 22 comprises a light source 34 (which may be a point source, such as a laser, without additional optics, as explained below) and a transparency 36, which are used in combination to project a pattern of spots onto object 28.
  • the term "transparency" is used in its ordinary sense to mean a positive image on a transparent support. Slides and foils are examples of such transparencies.
  • the positive image on transparency 36 is an image of the pattern that is to be projected onto object 28.
  • a single, stationary transparency, fixed in the housing of assembly 30, with a fixed, uncorrelated pattern of spots, is sufficient for the purposes of these embodiments.
  • the illumination assembly may be configured to provide variable patterns, by alternating among different fixed transparencies, or using a variable transparency, such as a programmable spatial light modulator.
  • Transparency 36 may contain various sorts of fixed, uncorrelated patterns of spots.
  • the transparency may contain a pattern of binary (white/black) spots, distributed over the area of the transparency according to the values of a pseudo-random distribution function.
  • Other examples of uncorrelated spot patterns are described hereinbelow with reference to Figs. 4 and 5.
  • it is advantageous that the spot pattern have a low duty cycle, i.e., that the fraction of the area of the pattern with above-average brightness be no greater than 1/e, and desirably less than 1/4 or even 1/10.
  • the low duty cycle is beneficial in enhancing the signal/noise ratio of spot shift detection for 3D mapping.
  • Light source 34 transilluminates transparency 36 with optical radiation so as to project an image of the spot pattern that is contained by the transparency onto object 28.
  • the terms "light” and “optical radiation” in the context of the present patent application refer to any band of optical radiation, including infrared and ultraviolet, as well as visible light. In some applications, however, near-infrared light is preferred on account of the availability of suitable, low-cost sources and detectors and the fact that the spot pattern is thus invisible to human viewers.
  • light source 34 is a point source, meaning that the rays of radiation emitted by the light source emanate from a locus small enough so that the spot pattern on transparency 36 is replicated sharply on object 28.
  • light source 34 may comprise, for example, a coherent source with large angular divergence, such as a laser diode.
  • the illumination assembly may comprise suitable projection optics, as shown in Fig. 3, for example.
  • the light source is typically mounted in the housing of assembly 30 in a fixed position relative to transparency 36.
  • An image capture assembly 32 captures an image of the pattern that is projected by illumination assembly 30 onto object 28.
  • Assembly 32 comprises objective optics 40, which focus the image onto an image sensor 42.
  • sensor 42 comprises a rectilinear array of detector elements 44, such as a CCD or CMOS-based image sensor array.
  • Assembly 32 may also comprise a bandpass filter (not shown in the figures), chosen and positioned so that sensor 42 receives only light in the emission band of light source 34, while filtering out ambient light that might otherwise reduce the contrast of the image of the projected pattern that is captured by the sensor.
  • illumination assembly 30 and image capture assembly 32 are held in a fixed spatial relation.
  • This configuration and the processing techniques used by image processor 24 make it possible to perform 3D mapping using the single image capture assembly, without relative movement between the illumination and image capture assemblies and without moving parts.
  • the techniques of illumination and mapping that are described hereinbelow may be used in conjunction with other sorts of image capture assemblies, in various different configurations, such as those described in the Background section above.
  • the image capture assembly may be movable relative to the illumination assembly.
  • two or more image capture assemblies may be used to capture images of object 28 from different angles.
  • assemblies 30 and 32 may be mounted so that an axis passing through the centers of the entrance pupil of image capture assembly 32 and the spot formed by light source 34 on transparency 36 is parallel to one of the axes of sensor 42 (taken for convenience to be the X-axis, while the Z-axis corresponds to distance from device 22).
  • a Z-direction shift of a point on the object, δZ, will engender a concomitant transverse shift δX in the spot pattern observed in the image.
  • Z-coordinates of points on the object, as well as shifts in the Z-coordinates over time, may thus be determined by measuring shifts in the X-coordinates of the spots in the image captured by assembly 32 relative to a reference image taken at a known distance Z. Y-direction shifts may be disregarded.
  • This sort of triangulation approach is appropriate particularly in 3D mapping using uncorrelated patterns of spots, although aspects of the approach may be adapted for use with other types of patterns, as well.
  • image processor 24 compares the group of spots in each area of the captured image to the reference image in order to find the most closely-matching group of spots in the reference image.
  • the relative shift between the matching groups of spots in the image gives the Z-direction shift of the area of the captured image relative to the reference image.
  • the shift in the spot pattern may be measured using image correlation or other image matching computation methods that are known in the art. Some exemplary methods are described in the above-mentioned PCT patent application and PCT International Publication WO 2007/043036.
  • Fig. 3 is a schematic top view of an illumination assembly 50, which may be used in device 22 in place of assembly 30, in accordance with an alternative embodiment of the present invention.
  • transparency 36 is transilluminated using a non-point light source, such as a light-emitting diode (LED) 52 with suitable optics 54 and 56.
  • The configuration and locations of optics 54 and 56 are arbitrary, and any suitable sort of projection optics may be used to project the image of the pattern from transparency 36 onto object 28 using light from LED 52.
  • the elements of illumination assembly 50 may be fixed within the housing of the assembly.
  • The use of LED 52 in assembly 50 is advantageous in terms of reducing the size, cost and heat dissipation of the assembly, as well as improving the mean time between failures (MTBF) and overall reliability of the assembly. Furthermore, because the LED emits light in a relatively narrow band of wavelengths, the light collected by objective optics 40 (Fig. 2) can be effectively filtered by a suitable bandpass filter in image capture assembly 32, as explained above.
  • Fig. 4 is a schematic representation of a pattern 60 that may be contained in transparency 36, in accordance with an embodiment of the present invention.
  • Pattern 60 is quasi-periodic with an underlying five-fold symmetry. Quasi-periodic patterns are characterized by distinct peaks in the frequency domain (reciprocal space), but contain no unit cell that repeats over the area of the pattern in the spatial domain (real space).
  • pattern 60 belongs to the family of patterns with n-fold symmetry having local intensity I(r) described by the following equation: I(r) = |Σ_{m=1}^{n} exp(i k_m · r)|², wherein k_m = k₀(cos(2πm/n), sin(2πm/n)); for n = 5 or n ≥ 7 (n = 5, 7, 8, ...), these patterns are uncorrelated in the sense defined above.
  • transparency 36 may contain uncorrelated quasi-periodic patterns of other types.
  • The use of quasi-periodic patterns in system 20 is advantageous in that the pattern has a known spatial frequency spectrum, with distinct peaks (as opposed to random and pseudorandom patterns, whose spectrum is flat).
  • Processor 24 may use this spectral information in filtering digital images of the pattern that are captured by image capture assembly 32, and may thus reduce the effects of noise and ambient light in the image correlation computation.
  • Because the pattern is uncorrelated over the range of depths mapped by the system, the likelihood of erroneous mapping results is reduced, since only a correct match between an area of the image of the object and a corresponding area of the reference image will give a high correlation value.
  • Fig. 5 is a schematic representation of a pattern 70 that may be contained in transparency 36, in accordance with another embodiment of the present invention.
  • Pattern 70 comprises a pseudo-random distribution of black pixels 72 interspersed with white pixels 74, with a local duty cycle of white pixels that decreases monotonically along the horizontal (X) axis. In other words, in any local region, the distribution of black and white pixels is random.
  • the number of white pixels relative to the number of black pixels decreases from left to right across the pattern.
  • the gray level similarly decreases monotonically across the pattern.
  • processor 24 may process the image at low resolution in order to determine the gray level of each area in the image of the object. The processor may then compare this gray level to the distribution of gray levels across the reference image in order to make a rough estimate of the depth (Z-coordinate) of each area of the object. For some applications, this rough estimate may be sufficient. Alternatively, the processor may use this initial estimate in choosing, for each area of the image of the object, the appropriate area of the reference image in which to search for a matching part of the spot pattern. By matching the spot pattern, the processor computes more accurate depth values. This two-step processing approach can be advantageous in avoiding erroneous mapping results and possibly in reducing the overall computation time.
  • Although Fig. 5 shows a pseudo-random pattern, the variation of spot density across the transparency may similarly be applied to patterns of other sorts.
  • Fig. 6 is a schematic representation of a pattern 80 that may be contained in transparency 36, in accordance with yet another embodiment of the present invention.
  • Pattern 80 comprises multiple parallel bands 82, repeating periodically, each band comprising a replica of the same pseudo-random distribution of spots.
  • the bands are assumed to extend across the transparency (or across at least a part of the transparency) in the X-direction, which is horizontal in Fig. 6.
  • Assuming device 22 is configured in the manner shown in Fig. 2, and bands 82 in pattern 80 repeat periodically in the Y-direction, processor 24 may use the image of a single band 82 as a reference image in determining the X-direction shift of an area in the image of the object, regardless of the Y-coordinates of the area. Therefore the memory required to store the reference image is reduced. The complexity of the computation may be reduced, as well, since the range of the search for a matching area in the reference image is limited.
  • Bands 82 may alternatively comprise other types of patterns that are uncorrelated in the X-direction, such as types of patterns shown above in Figs. 4 and 5.
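Because a single band serves as the reference for every Y-coordinate, a matcher need only store one band and wrap the row index modulo the band height. The sketch below is illustrative only; the function name and data layout are assumptions, not taken from the patent:

```python
import numpy as np

def band_reference_rows(ref_band, y, block_h):
    """Return the reference rows for an image block whose top row is y.
    Because pattern 80 repeats with period band_h in the Y-direction,
    a single stored band (shape band_h x W) can serve as the reference
    for any Y-coordinate, cutting reference-image memory by the number
    of bands and bounding the X-direction search range."""
    band_h = ref_band.shape[0]
    rows = np.arange(y, y + block_h) % band_h
    return ref_band[rows, :]
```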
  • Figs. 7A and 7B schematically illustrate a transparency 90, which may be used in illumination assembly 50 in place of transparency 36 (Fig. 3), in accordance with an embodiment of the present invention.
  • Fig. 7A is a frontal view of the transparency
  • Fig. 7B is a side view.
  • the spots on transparency 90 comprise micro-lenses 92, which are distributed over a transparent substrate 94 in a non-uniform, uncorrelated pattern, such as a random or pseudo-random pattern.
  • the duty cycle of the pattern is given by the density of the micro-lenses per unit area and the optical properties of the micro-lenses and other projection optics (which define the focal spot size).
  • Micro-lenses 92 may be formed on substrate 94 using a photolithographic process, for example, as is used to produce uniform micro-lens grid arrays that are known in the art. Such processes are capable of fabricating micro-lenses with diameter on the order of 0.1 mm and focal lengths of 5-6 mm. Alternatively, micro-lenses 92 may have larger or smaller dimensions and focal lengths, depending on the process and application requirements.
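As a worked illustration of the duty-cycle relation stated above, the bright fraction of the projected pattern is roughly the micro-lens density times the focal-spot area. The numbers below are hypothetical, chosen only to be plausible alongside the 0.1 mm lens diameter mentioned in the text:

```python
import math

# Hypothetical values: lens placement density and focal-spot size.
lens_density = 25.0    # micro-lenses per mm^2 (assumed)
spot_diameter = 0.02   # focal-spot diameter in mm (assumed)

spot_area = math.pi * (spot_diameter / 2) ** 2
duty_cycle = lens_density * spot_area
print(duty_cycle)  # ~0.008, comfortably below the 1/e bound
```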
  • micro-lenses 92 focus light from a light source, such as LED 52, onto a focal plane 96. Each micro-lens thus forms a corresponding bright focal spot at the focal plane. Optic 56 projects this pattern of bright spots onto object 28.
  • micro-lenses 92 are shown in Figs. 7A and 7B as being sparsely distributed over the area of substrate 94, but in practice the micro-lenses may be more densely packed. This arrangement is advantageous, by comparison with the other embodiments described above, in that substantially all of the light from the light source is projected onto the object: Transparency 90 effectively redistributes the light, rather than blocking a part of the light in the dark areas of the pattern.
  • the micro-lenses may have non-uniform focal lengths.
  • different micro-lenses may have different focal lengths, so that the pattern that is projected on the object varies with distance from the illumination assembly.
  • some or all of the micro-lenses may have multiple different focal lengths.
  • the micro-lenses or the projection optics (such as optic 56 in Fig. 3) may be astigmatic, meaning that they have different focal lengths in different planes, so that the shapes of the spots will vary with distance. (As yet another alternative, an equivalent result may be obtained by making objective optics 40 (Fig. 2) astigmatic.)
  • These sorts of depth-varying patterns may be used in range mapping, as described, for example, in PCT International Publications WO 2007/996893 and WO 2007/105215, whose disclosures are incorporated herein by reference.
  • Fig. 8 is a schematic top view of an illumination assembly 100, which may be used in place of assembly 30 (Fig. 2) or assembly 50 (Fig. 3) in 3D mapping, in accordance with another embodiment of the present invention. In this case, light from LED 52 or from another source is directed by optic 54 through a diffuser 102 followed by a uniform micro-lens array 104. In this configuration, the wavefront variations introduced by the diffuser will give rise to randomization of the locations of the spots formed by the micro-lenses.
  • Optic 56 projects this pattern of spots onto the object that is to be mapped. In this case, too, astigmatic optics or other means may be used to make the pattern vary with distance from the illumination assembly.
  • The patterns of Figs. 4-6 and 7A/B are shown solely by way of example, and transparencies containing other sorts of uncorrelated patterns may similarly be used in system 20 and are considered to be within the scope of the present invention.
  • Although the embodiments described above relate to the specific configuration of system 20 and design of device 22, certain principles of the present invention may similarly be applied in systems and devices of other types for optical 3D mapping.
  • For example, aspects of the embodiments described above may be applied in systems that use multiple image capture assemblies, or in which the image capture assembly and the illumination assembly are movable relative to one another.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • General Engineering & Computer Science (AREA)
  • Optics & Photonics (AREA)
  • Artificial Intelligence (AREA)
  • Signal Processing (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

Apparatus for mapping an object includes an illumination assembly (30, 50), which includes a single transparency (36) containing a fixed pattern (60, 70, 80) of spots. A light source (34, 52) transilluminates the single transparency with optical radiation so as to project the pattern onto the object (28). An image capture assembly (32) captures an image of the pattern that is projected onto the object using the single transparency. A processor (24) processes the image captured by the image capture assembly so as to reconstruct a three-dimensional (3D) map of the object.

Description

DEPTH MAPPING USING PROJECTED PATTERNS
CROSS-REFERENCE TO RELATED APPLICATIONS
This application claims the benefit of U.S. Provisional Patent Application 61/016,832, filed December 27, 2007. This application is also a continuation-in-part of U.S. Patent Application 11/899,542, filed September 6, 2007, which claims the benefit of U.S. Provisional
Patent Application 60/909,487, filed April 2, 2007. All of these related applications are incorporated herein by reference.
FIELD OF THE INVENTION
The present invention relates generally to methods and systems for mapping of three- dimensional (3D) objects, and specifically to optical 3D mapping.
BACKGROUND OF THE INVENTION
Various methods are known in the art for optical 3D mapping, i.e., generating a 3D profile of the surface of an object by processing an optical image of the object.
Some methods are based on projecting a laser speckle pattern onto the object, and then analyzing an image of the pattern on the object. For example, PCT International Publication WO 2007/043036, whose disclosure is incorporated herein by reference, describes a system and method for object reconstruction, in which a coherent light source and a generator of a random speckle pattern projects onto the object a coherent random speckle pattern. An imaging unit detects the light response of the illuminated region and generates image data. Shifts of the pattern in the image of the object relative to a reference image of the pattern are used in real-time reconstruction of a 3D map of the object.
Other methods of optical 3D mapping project different sorts of patterns onto the object to be mapped. For example, PCT International Publication WO 93/03579 describes a three- dimensional vision system in which one or two projectors establish structured light comprising two sets of parallel stripes having different periodicities and angles. As another example, U.S. Patent 6,751,344 describes a method for optically scanning a subject in which the subject is illuminated with a matrix of discrete two-dimensional image objects, such as a grid of dots. Other methods involve projection of a grating pattern, as described, for example, in U.S. Patent 4,802,759. The disclosures of the above-mentioned patents and publications are incorporated herein by reference.
SUMMARY OF THE INVENTION
In embodiments of the present invention, a pattern of spots is projected onto an object, and an image of the pattern on the object is processed in order to reconstruct a 3D map of the object. The pattern on the object is created by projecting optical radiation through a transparency containing the pattern. The embodiments disclosed herein differ in this respect from methods of 3D reconstruction that use laser speckle, in which the pattern is created by optical interference using a diffuser. At the same time, the novel patterns that are used in these embodiments make it possible to perform 3D reconstruction quickly and accurately, using a single, stationary transparency to project the pattern, and a single, stationary image capture assembly to capture images of the object.
There is therefore provided, in accordance with an embodiment of the present invention, apparatus for mapping an object, including: an illumination assembly, including: a single transparency containing a fixed pattern of spots; and a light source, which is configured to transilluminate the single transparency with optical radiation so as to project the pattern onto the object; an image capture assembly, which is configured to capture an image of the pattern that is projected onto the object using the single transparency; and a processor, which is coupled to process the image captured by the image capture assembly so as to reconstruct a three-dimensional (3D) map of the object.
In a disclosed embodiment, the pattern is uncorrelated over a range of depths that is mapped by the apparatus.
In some embodiments, the image capture assembly is arranged to capture images of the pattern on the object from a single, fixed location and angle relative to the illumination assembly. Typically, the transparency and light source are fixed in respective positions in the illumination assembly, and the processor is configured to reconstruct the 3D map using the images that are captured only from the single, fixed location and angle with the transparency and light source only in the respective positions.
In one embodiment, the light source includes a point source of the optical radiation. Alternatively, the light source may include a light-emitting diode (LED).
In a disclosed embodiment, the processor is arranged to process a succession of images captured while the object is moving so as to map a 3D movement of the object, wherein the object is a part of a human body, and the 3D movement includes a gesture made by the part of the human body, and wherein the processor is coupled to provide an input to a computer application responsively to the gesture.
There is also provided, in accordance with an embodiment of the present invention, apparatus for mapping an object, including: an illumination assembly, including: a transparency containing an uncorrelated pattern of spots; and a light source, which is configured to transilluminate the transparency with optical radiation so as to project the uncorrelated pattern onto the object; an image capture assembly, which is configured to capture an image of the uncorrelated pattern that is projected onto the object; and a processor, which is coupled to process the image captured by the image capture assembly so as to reconstruct a three-dimensional (3D) map of the object.
In one embodiment, the uncorrelated pattern includes a pseudo-random pattern. In another embodiment, the uncorrelated pattern includes a quasi-periodic pattern, wherein the quasi-periodic pattern has an n-fold symmetry, with n=5 or n ≥ 7.
Typically, the uncorrelated pattern has a duty cycle that is less than 1/e. Alternatively or additionally, the spots have a local duty cycle that varies across the pattern.
In an alternative embodiment, the transparency contains a plurality of parallel bands, repeating periodically in a first direction, each band containing a replica of the uncorrelated pattern extending across at least a part of the transparency in a second direction, perpendicular to the first direction.
In some embodiments, the processor is configured to derive the 3D map by finding respective offsets between the pattern of the spots on multiple areas of the object captured in the image of the pattern that is projected onto the object and a reference image of the pattern, wherein the respective offsets are indicative of respective distances between the areas and the image capture assembly. In one embodiment, the spots have a local duty cycle that varies monotonically along an axis across the pattern, and the processor is configured to determine local gray levels of the multiple areas in the image responsively to the local duty cycle, and to estimate the respective offsets based on the local gray levels. In an alternative embodiment, the spots in the transparency comprise micro-lenses arranged in the fixed or uncorrelated pattern.
There is furthermore provided, in accordance with an embodiment of the present invention, apparatus for mapping an object, including: an illumination assembly, including: a transparency containing a plurality of micro-lenses arranged in a non-uniform pattern; and a light source, which is configured to transilluminate the transparency with optical radiation so as to project the pattern onto the object; an image capture assembly, which is configured to capture an image of the pattern that is projected onto the object using the transparency; and a processor, which is coupled to process the image captured by the image capture assembly so as to reconstruct a three-dimensional (3D) map of the object.
Typically, the micro-lenses are configured to focus the optical radiation to form respective focal spots at a focal plane in the non-uniform pattern, and the light source includes optics for projecting the non-uniform pattern of the focal spots from the focal plane onto the object. Alternatively, at least some of the micro-lenses have differing focal lengths, and the light source includes optics for projecting the non-uniform pattern of the focal spots so that the pattern that is projected on the object varies with distance from the illumination assembly.
There is additionally provided, in accordance with an embodiment of the present invention, a method for mapping an object, including: transilluminating a single transparency containing a fixed pattern of spots so as to project the pattern onto the object; capturing an image of the pattern that is projected onto the object using the single transparency; and processing the captured image so as to reconstruct a three-dimensional (3D) map of the object.
There is further provided, in accordance with an embodiment of the present invention, a method for mapping an object, including: transilluminating a transparency containing an uncorrelated pattern of spots so as to project the uncorrelated pattern onto the object; capturing an image of the uncorrelated pattern that is projected onto the object; and processing the captured image so as to reconstruct a three-dimensional (3D) map of the object.
There is moreover provided, in accordance with an embodiment of the present invention, a method for mapping an object, including: transilluminating a transparency containing a plurality of micro-lenses arranged in a non-uniform pattern so as to project the non-uniform pattern onto the object; capturing an image of the non-uniform pattern that is projected onto the object; and processing the captured image so as to reconstruct a three-dimensional (3D) map of the object.
The present invention will be more fully understood from the following detailed description of the embodiments thereof, taken together with the drawings in which:
BRIEF DESCRIPTION OF THE DRAWINGS
Fig. 1 is a schematic, pictorial illustration of a system for 3D mapping, in accordance with an embodiment of the present invention;
Fig. 2 is a schematic top view of an imaging device for use in 3D mapping, in accordance with an embodiment of the present invention;
Fig. 3 is a schematic top view of an illumination assembly for use in 3D mapping, in accordance with an embodiment of the present invention;
Figs. 4-6 are schematic representations of patterns for use in 3D mapping, in accordance with embodiments of the present invention;
Fig. 7A is a schematic frontal view of a transparency for use in 3D mapping, in accordance with an embodiment of the present invention;
Fig. 7B is a schematic side view of the transparency of Fig. 7A, showing passage of optical rays through the transparency, in accordance with an embodiment of the present invention; and
Fig. 8 is a schematic top view of an illumination assembly for use in 3D mapping, in accordance with another embodiment of the present invention.
DETAILED DESCRIPTION OF EMBODIMENTS
Fig. 1 is a schematic, pictorial illustration of a system 20 for 3D optical mapping, in accordance with an embodiment of the present invention. System 20 comprises an imaging device 22, which generates and projects a pattern onto an object 28 and captures an image of the pattern appearing on the object. Details of the design and operation of device 22 are shown in the figures that follow and are described hereinbelow with reference thereto.
In some embodiments, device 22 projects an uncorrelated pattern of spots onto object 28. In the context of the present patent application and in the claims, the term "uncorrelated pattern" refers to a projected pattern of spots (which may be bright or dark), whose positions are uncorrelated in planes transverse to the projection beam axis. The positions are uncorrelated in the sense that the auto-correlation of the pattern as a function of transverse shift is insignificant for any shift larger than the spot size and no greater than the maximum shift that may occur over the range of depths mapped by the system. Random patterns, such as a laser speckle pattern, are uncorrelated in this sense. Synthetic patterns, created by human or computer design, such as pseudo-random and quasi-periodic patterns, may also be uncorrelated to the extent specified by the above definition.
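This definition lends itself to a direct numerical check. The sketch below is an illustration, not part of the patent; the function name and the 0.1 "insignificance" threshold are assumptions. It normalizes a candidate pattern and verifies that its autocorrelation stays near zero for every transverse shift larger than the spot size and no greater than the maximum disparity of the mapped depth range:

```python
import numpy as np

def is_uncorrelated(pattern, spot_size, max_shift, threshold=0.1):
    """Test the 'uncorrelated pattern' criterion: the normalized
    autocorrelation must be insignificant for shifts larger than the
    spot size and no greater than max_shift (threshold is an assumed
    notion of 'insignificant')."""
    p = pattern.astype(float) - pattern.mean()
    p /= np.linalg.norm(p)
    # Circular autocorrelation via the Wiener-Khinchin theorem.
    ac = np.fft.fftshift(np.fft.ifft2(np.abs(np.fft.fft2(p)) ** 2).real)
    cy, cx = ac.shape[0] // 2, ac.shape[1] // 2
    dy, dx = np.mgrid[-cy:ac.shape[0] - cy, -cx:ac.shape[1] - cx]
    r = np.hypot(dy, dx)
    band = (r > spot_size) & (r <= max_shift)
    return float(np.abs(ac[band]).max()) < threshold

# A pseudo-random binary spot pattern passes this test.
rng = np.random.default_rng(0)
print(is_uncorrelated(rng.random((256, 256)) < 0.1, spot_size=2, max_shift=64))
```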
An image processor 24 processes image data generated by device 22 in order to reconstruct a 3D map of object 28. The term "3D map" refers to a set of 3D coordinates representing the surface of the object. The derivation of such a map based on image data is referred to herein as "3D mapping" or equivalently, "3D reconstruction." Image processor 24 computes the 3D coordinates of points on the surface of object 28 by triangulation, based on the transverse shifts of the spots in an image of the pattern that is projected onto the object relative to a reference pattern at a known distance from device 22. Methods for this sort of triangulation-based 3D mapping using a projected laser speckle pattern are described in the above-mentioned PCT publication WO 2007/043036 and in PCT Patent Application PCT/IL2007/000306, filed March 8, 2007, and published as WO 2007/105205, which is assigned to the assignee of the present patent application, and whose disclosure is incorporated herein by reference. These methods may be implemented, mutatis mutandis, using synthetic uncorrelated patterns in system 20. Image processor 24 may comprise a general-purpose computer processor, which is programmed in software to carry out the functions described hereinbelow. The software may be downloaded to processor 24 in electronic form, over a network, for example, or it may alternatively be provided on tangible media, such as optical, magnetic, or electronic memory media. Alternatively or additionally, some or all of the functions of the image processor may be implemented in dedicated hardware, such as a custom or semi-custom integrated circuit or a programmable digital signal processor (DSP). Although processor 24 is shown in Fig. 1, by way of example, as a separate unit from imaging device 22, some or all of the processing functions of processor 24 may be performed by suitable dedicated circuitry within the housing of the imaging device or otherwise associated with the imaging device.
The 3D map that is generated by processor 24 may be used for a wide range of different purposes. For example, the map may be sent to an output device, such as a display 26, which shows a pseudo-3D image of the object. In the example shown in Fig. 1, object 28 comprises all or a part (such as a hand) of the body of a subject. In this case, system 20 may be used to provide a gesture-based user interface, in which user movements detected by means of device 22 control an interactive computer application, such as a game, in place of tactile interface elements such as a mouse, joystick or other accessory. Alternatively, system 20 may be used to create 3D maps of objects of other types, for substantially any application in which 3D coordinate profiles are needed.
Fig. 2 is a schematic top view of device 22, in accordance with an embodiment of the present invention. An illumination assembly 30 in device 22 comprises a light source 34 (which may be a point source, such as a laser, without additional optics, as explained below) and a transparency 36, which are used in combination to project a pattern of spots onto object 28. In the context of the present patent application and in the claims, the term "transparency" is used in its ordinary sense to mean a positive image on a transparent support. Slides and foils are examples of such transparencies. In some embodiments of the present invention, the positive image on transparency 36 is an image of the pattern that is to be projected onto object 28. A single, stationary transparency, fixed in the housing of assembly 30, with a fixed, uncorrelated pattern of spots, is sufficient for the purposes of these embodiments. Alternatively, the illumination assembly may be configured to provide variable patterns, by alternating among different fixed transparencies, or using a variable transparency, such as a programmable spatial light modulator. Transparency 36 may contain various sorts of fixed, uncorrelated patterns of spots. For example, the transparency may contain a pattern of binary (white/black) spots, distributed over the area of the transparency according to the values of a pseudo-random distribution function. Other examples of uncorrelated spot patterns are described hereinbelow with reference to Figs. 4 and 5. For good performance in the mapping process, it is advantageous that the spot pattern have a low duty cycle, i.e., that the fraction of the area of the pattern with above-average brightness be no greater than 1/e, and desirably less than 1/4 or even 1/10. The low duty cycle is beneficial in enhancing the signal/noise ratio of spot shift detection for 3D mapping.
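A transparency of the binary kind described above can be specified in a few lines (a minimal sketch; the array dimensions and the 1/10 duty cycle are arbitrary choices within the stated bounds, not values from the patent):

```python
import numpy as np

def make_spot_transparency(height, width, duty_cycle=0.1, seed=0):
    """Binary (white/black) spot pattern distributed according to a
    pseudo-random distribution function. duty_cycle is the bright-area
    fraction, kept no greater than 1/e and here at the 'even 1/10' end
    of the guidance, for good signal/noise in spot-shift detection."""
    rng = np.random.default_rng(seed)
    return (rng.random((height, width)) < duty_cycle).astype(np.uint8)

transparency = make_spot_transparency(480, 640)
print(transparency.mean())  # ~0.1: fraction of bright (spot) pixels
```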
Light source 34 transilluminates transparency 36 with optical radiation so as to project an image of the spot pattern that is contained by the transparency onto object 28. (The terms "light" and "optical radiation" in the context of the present patent application refer to any band of optical radiation, including infrared and ultraviolet, as well as visible light. In some applications, however, near-infrared light is preferred on account of the availability of suitable, low-cost sources and detectors and the fact that the spot pattern is thus invisible to human viewers.) In the configuration shown in Fig. 2, light source 34 is a point source, meaning that the rays of radiation emitted by the light source emanate from a locus small enough so that the spot pattern on transparency 36 is replicated sharply on object 28. For this purpose, light source 34 may comprise, for example, a coherent source with large angular divergence, such as a laser diode. When a point source is used with the transparency in this manner, no other projection optics are required. Alternatively, the illumination assembly may comprise suitable projection optics, as shown in Fig. 3, for example. In either case, the light source is typically mounted in the housing of assembly 30 in a fixed position relative to transparency 36. An image capture assembly 32 captures an image of the pattern that is projected by illumination assembly 30 onto object 28. Assembly 32 comprises objective optics 40, which focus the image onto an image sensor 42. Typically, sensor 42 comprises a rectilinear array of detector elements 44, such as a CCD or CMOS-based image sensor array. Assembly 32 may also comprise a bandpass filter (not shown in the figures), chosen and positioned so that sensor 42 receives only light in the emission band of light source 34, while filtering out ambient light that might otherwise reduce the contrast of the image of the projected pattern that is captured by the sensor.
In the embodiment shown in Fig. 2, illumination assembly 30 and image capture assembly 32 are held in a fixed spatial relation. This configuration and the processing techniques used by image processor 24 make it possible to perform 3D mapping using the single image capture assembly, without relative movement between the illumination and image capture assemblies and without moving parts. Alternatively, the techniques of illumination and mapping that are described hereinbelow may be used in conjunction with other sorts of image capture assemblies, in various different configurations, such as those described in the Background section above. For example, the image capture assembly may be movable relative to the illumination assembly. Additionally or alternatively, two or more image capture assemblies may be used to capture images of object 28 from different angles.
To simplify the computation of the 3D map and of changes in the map due to motion of object 28 in the configuration of Fig. 2, assemblies 30 and 32 may be mounted so that an axis passing through the centers of the entrance pupil of image capture assembly 32 and the spot formed by light source 34 on transparency 36 is parallel to one of the axes of sensor 42 (taken for convenience to be the X-axis, while the Z-axis corresponds to distance from device 22). The advantages of this arrangement are explained further in the above-mentioned PCT patent application PCT/IL2007/000306.
Specifically, by triangulation in this arrangement, a Z-direction shift of a point on the object, δZ, will engender a concomitant transverse shift δX in the spot pattern observed in the image. Z-coordinates of points on the object, as well as shifts in the Z-coordinates over time, may thus be determined by measuring shifts in the X-coordinates of the spots in the image captured by assembly 32 relative to a reference image taken at a known distance Z. Y-direction shifts may be disregarded. This sort of triangulation approach is appropriate particularly in 3D mapping using uncorrelated patterns of spots, although aspects of the approach may be adapted for use with other types of patterns, as well.
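Under a standard pinhole triangulation model, this relation can be inverted to recover depth from the measured X-shift. The sketch below is illustrative; the baseline b, focal length f, pixel pitch, and reference distance are assumed parameters, not values from the patent, and the sign of the shift depends on the direction of the baseline:

```python
def depth_from_shift(dx_pixels, pixel_pitch, f, b, z_ref):
    """Recover depth Z from the X-shift of a spot relative to the
    reference image taken at known distance z_ref, using the standard
    triangulation relation dx = f * b * (1/Z - 1/z_ref).
    f: focal length [m]; b: baseline between the illumination and
    image capture assemblies [m]; pixel_pitch: sensor pitch [m/px]."""
    dx = dx_pixels * pixel_pitch
    return 1.0 / (dx / (f * b) + 1.0 / z_ref)

# Example with hypothetical numbers: a 5-pixel shift on a 6 um-pitch
# sensor, f = 6 mm, b = 75 mm, reference plane at 2 m.
print(depth_from_shift(5, 6e-6, 6e-3, 75e-3, 2.0))  # ~1.76 m
```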
Thus, to generate the 3D map of object 28, image processor 24 (Fig. 1) compares the group of spots in each area of the captured image to the reference image in order to find the most closely-matching group of spots in the reference image. The relative shift between the matching groups of spots in the image gives the Z-direction shift of the area of the captured image relative to the reference image. The shift in the spot pattern may be measured using image correlation or other image matching computation methods that are known in the art. Some exemplary methods are described in the above-mentioned PCT patent application and PCT International Publication WO 2007/043036.
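A minimal version of this search (an illustrative sketch, not the patent's implementation; the search range and the zero-normalized cross-correlation score are choices of this example) slides each block of the captured image along the X-axis of the reference image and keeps the best-correlated shift:

```python
import numpy as np

def find_x_shift(block, ref, x0, y0, max_shift):
    """Slide `block` (h x w), nominally at (x0, y0), along the X-axis
    of reference image `ref` and return the shift with the highest
    zero-normalized cross-correlation. Y-direction shifts are
    disregarded, per the geometry described above."""
    h, w = block.shape
    b = block.astype(float) - block.mean()
    b /= np.linalg.norm(b) + 1e-12
    best_shift, best_score = 0, -np.inf
    for s in range(-max_shift, max_shift + 1):
        x = x0 + s
        if x < 0 or x + w > ref.shape[1]:
            continue
        r = ref[y0:y0 + h, x:x + w].astype(float)
        r -= r.mean()
        r /= np.linalg.norm(r) + 1e-12
        score = float(np.sum(b * r))
        if score > best_score:
            best_shift, best_score = s, score
    return best_shift, best_score
```

The shift found for each area can then be converted to depth with a triangulation relation such as the one sketched above.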
Fig. 3 is a schematic top view of an illumination assembly 50, which may be used in device 22 in place of assembly 30, in accordance with an alternative embodiment of the present invention. In this embodiment, transparency 36 is transilluminated using a non-point light source, such as a light-emitting diode (LED) 52 with suitable optics 54 and 56. The configuration and locations of optics 54 and 56 in Fig. 3 are arbitrary, and any suitable sort of projection optics may be used to project the image of the pattern from transparency 36 onto object 28 using light from LED 52. As in assembly 30 (Fig. 2), the elements of illumination assembly 50 may be fixed within the housing of the assembly. The use of LED 52 in assembly 50 is advantageous in terms of reducing the size, cost and heat dissipation of the assembly, as well as improving the mean time between failures (MTBF) and overall reliability of the assembly. Furthermore, because the LED emits light in a relatively narrow band of wavelengths, the light collected by objective optics 40 (Fig. 2) can be effectively filtered by a suitable bandpass filter in image capture assembly 32, as explained above.
Fig. 4 is a schematic representation of a pattern 60 that may be contained in transparency 36, in accordance with an embodiment of the present invention. Pattern 60 is quasi-periodic with an underlying five-fold symmetry. Quasi-periodic patterns are characterized by distinct peaks in the frequency domain (reciprocal space), but contain no unit cell that repeats over the area of the pattern in the spatial domain (real space). For example, pattern 60 belongs to the family of patterns with n-fold symmetry having local intensity I(r) described by the following equation:

I(r) = |Σ_{m=1}^{n} exp(i k_m · r)|²

wherein k_1, ..., k_n are the n wave vectors of the pattern, spaced at equal angles of 2π/n. For n = 5 or n ≥ 7 (n = 5, 7, 8, ...), these patterns are uncorrelated in the sense defined above. Alternatively, transparency 36 may contain uncorrelated quasi-periodic patterns of other types.
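Under the reconstruction of the equation above, such a pattern is straightforward to render numerically. In this sketch the wave-vector magnitude and the binarization duty cycle are arbitrary illustrative choices:

```python
import numpy as np

def quasi_periodic_pattern(n=5, size=512, k=0.3, duty=0.3):
    """Render I(r) = |sum_{m=1..n} exp(i k_m . r)|^2 with the n wave
    vectors k_m spaced at equal angles of 2*pi/n. The magnitude k
    (radians per pixel) and the duty cycle of the binarized transparency
    are illustrative values only."""
    yy, xx = np.mgrid[0:size, 0:size]
    field = np.zeros((size, size), dtype=complex)
    for m in range(n):
        theta = 2.0 * np.pi * m / n
        field += np.exp(1j * k * (np.cos(theta) * xx + np.sin(theta) * yy))
    intensity = np.abs(field) ** 2
    # Keep the brightest `duty` fraction of pixels as transparent spots
    return intensity > np.percentile(intensity, 100 * (1 - duty))
```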
The use of quasi-periodic patterns in system 20 is advantageous in that the pattern has a known spatial frequency spectrum, with distinct peaks (as opposed to random and pseudo-random patterns, whose spectrum is flat). Processor 24 may use this spectral information in filtering digital images of the pattern that are captured by image capture assembly 32, and may thus reduce the effects of noise and ambient light in the image correlation computation. On the other hand, because the pattern is uncorrelated over the range of depths mapped by the system, the likelihood of erroneous mapping results is reduced, since only a correct match between an area of the image of the object and a corresponding area of the reference image will give a high correlation value.
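As a minimal sketch of this spectral filtering idea (the text does not prescribe a particular filter), one might keep only the frequency components at which the known pattern has significant energy; the threshold below is an arbitrary illustrative value:

```python
import numpy as np

def filter_by_pattern_spectrum(image, pattern, rel_thresh=0.05):
    """Suppress broadband noise and ambient light by retaining only the
    frequency components at the distinct spectral peaks of the known
    projected pattern. `image` and `pattern` are assumed to have the
    same shape; the relative threshold is an illustrative choice."""
    spec = np.abs(np.fft.fft2(pattern.astype(float)))
    mask = spec > rel_thresh * spec.max()   # the pattern's distinct peaks
    return np.real(np.fft.ifft2(np.fft.fft2(image.astype(float)) * mask))
```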
Fig. 5 is a schematic representation of a pattern 70 that may be contained in transparency 36, in accordance with another embodiment of the present invention. Pattern 70 comprises a pseudo-random distribution of black pixels 72 interspersed with white pixels 74, with a local duty cycle of white pixels that decreases monotonically along the horizontal (X) axis. In other words, in any local region, the distribution of black and white pixels is random.
Taken over a larger area, however (for example, a block of 10 x 10 pixels), the number of white pixels relative to the number of black pixels decreases from left to right across the pattern. Taking the sum of pixel values over such a block to be its gray level, the gray level similarly decreases monotonically across the pattern.
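A pattern of this kind is simple to synthesize. In the sketch below, the end-point duty cycles (50% down to 10%) are illustrative values only; the text requires only that the local duty cycle of white pixels decrease monotonically along X.

```python
import numpy as np

def graded_random_pattern(height=480, width=640, seed=0):
    """Pseudo-random binary pattern whose local white-pixel duty cycle
    decreases monotonically along the X-axis, as in pattern 70. The
    end-point duty cycles are illustrative assumptions."""
    rng = np.random.default_rng(seed)
    duty = np.linspace(0.5, 0.1, width)        # per-column white probability
    return rng.random((height, width)) < duty  # broadcasts across all rows
```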
When transparency 36 contains pattern 70, the gray level of the pattern projected onto object 28, when observed at low resolution, will likewise vary across the image of the object. Therefore, in an initial processing phase, processor 24 may process the image at low resolution in order to determine the gray level of each area in the image of the object. The processor may then compare this gray level to the distribution of gray levels across the reference image in order to make a rough estimate of the depth (Z-coordinate) of each area of the object. For some applications, this rough estimate may be sufficient. Alternatively, the processor may use this initial estimate in choosing, for each area of the image of the object, the appropriate area of the reference image in which to search for a matching part of the spot pattern. By matching the spot pattern, the processor computes more accurate depth values. This two-step processing approach can be advantageous in avoiding erroneous mapping results and possibly in reducing the overall computation time. Although Fig. 5 shows a pseudo-random pattern, the variation of spot density across the transparency may similarly be applied to patterns of other sorts.
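The initial low-resolution phase might be sketched as follows, assuming (per pattern 70) that the reference gray level varies monotonically along the X-axis only; the block size and the nearest-gray-level lookup are illustrative choices, not the patent's prescribed method:

```python
import numpy as np

def coarse_x_offsets(image, ref_image, block=10):
    """First phase of the two-step approach: compare low-resolution gray
    levels of the captured image against the reference in order to seed
    the fine spot-pattern matching. Assumes the reference gray level
    varies monotonically along X only (as with pattern 70)."""
    def block_gray(img):
        h, w = img.shape
        return (img[:h - h % block, :w - w % block]
                .reshape(h // block, block, w // block, block)
                .mean(axis=(1, 3)))
    g_img = block_gray(image)
    g_ref = block_gray(ref_image)
    ref_profile = g_ref.mean(axis=0)          # reference gray level vs. X
    offsets = np.empty(g_img.shape, dtype=int)
    for i in range(g_img.shape[0]):
        for j in range(g_img.shape[1]):
            # The nearest gray level in the reference profile gives a rough
            # source column, hence a rough X-offset for this block
            offsets[i, j] = int(np.argmin(np.abs(ref_profile - g_img[i, j]))) - j
    return offsets * block                    # rough offsets in pixels
```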
Fig. 6 is a schematic representation of a pattern 80 that may be contained in transparency 36, in accordance with yet another embodiment of the present invention. Pattern 80 comprises multiple parallel bands 82, repeating periodically, each band comprising a replica of the same pseudo-random distribution of spots. In accordance with the arrangement of axes in Fig. 2, the bands are assumed to extend across the transparency (or across at least a part of the transparency) in the X-direction, which is horizontal in Fig. 6. As noted above, when device 22 is configured in the manner shown in Fig. 2, with the entrance pupil of image capture assembly 32 and the spot formed by light source 34 on transparency 36 aligned parallel to the X-axis, only X-direction shifts of the pattern projected on the object need be measured in order to map the Z-coordinates of the object. Y-direction shifts may be disregarded. Therefore, it is sufficient that the pattern be uncorrelated in the X-direction, while Y-direction correlation is unimportant (for distances greater than the size of the correlation window).
Because bands 82 in pattern 80 repeat periodically in the Y-direction, processor 24 may use the image of a single band 82 as a reference image in determining the X-direction shift of an area in the image of the object, regardless of the Y-coordinates of the area. Therefore, the memory required to store the reference image is reduced. The complexity of the computation may be reduced, as well, since the range of the search for a matching area in the reference image is limited. Bands 82 may alternatively comprise other types of patterns that are uncorrelated in the X-direction, such as the types of patterns shown above in Figs. 4 and 5.
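A sketch of matching against a single stored band: the block's Y-coordinate merely selects rows within the band (modulo the band height), and only X-shifts are searched. The search range is an illustrative assumption.

```python
import numpy as np

def match_in_band(img_block, y0, x0, band_ref, search=(-40, 40)):
    """Match a block of the captured image against a single stored band
    of pattern 80. Because the bands repeat in Y, the block's rows map
    into the band by Y modulo the band height; only X-shifts are tried."""
    rows = np.arange(y0, y0 + img_block.shape[0]) % band_ref.shape[0]
    w = img_block.shape[1]
    best_dx, best_score = 0, -np.inf
    for dx in range(search[0], search[1] + 1):
        xs = x0 + dx
        if xs < 0 or xs + w > band_ref.shape[1]:
            continue                          # candidate window off the band
        cand = band_ref[rows][:, xs:xs + w]
        score = float((img_block * cand).sum())   # unnormalized correlation
        if score > best_score:
            best_score, best_dx = score, dx
    return best_dx
```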
Reference is now made to Figs. 7A and 7B, which schematically illustrate a transparency 90, which may be used in illumination assembly 50 in place of transparency 36 (Fig. 3), in accordance with an embodiment of the present invention. Fig. 7A is a frontal view of the transparency, while Fig. 7B is a side view. In this embodiment, the spots on transparency 90 comprise micro-lenses 92, which are distributed over a transparent substrate 94 in a non-uniform, uncorrelated pattern, such as a random or pseudo-random pattern. The duty cycle of the pattern is given by the density of the micro-lenses per unit area and the optical properties of the micro-lenses and other projection optics (which define the focal spot size). The duty cycle is typically (although not necessarily) less than 1/e, as explained above. Micro-lenses 92 may be formed on substrate 94 using a photolithographic process, for example, as is used to produce uniform micro-lens grid arrays that are known in the art. Such processes are capable of fabricating micro-lenses with diameter on the order of 0.1 mm and focal lengths of 5-6 mm. Alternatively, micro-lenses 92 may have larger or smaller dimensions and focal lengths, depending on the process and application requirements.
As shown in Fig. 7B, micro-lenses 92 focus light from a light source, such as LED 52, onto a focal plane 96. Each micro-lens thus forms a corresponding bright focal spot at the focal plane. Optic 56 projects this pattern of bright spots onto object 28. For clarity of illustration, micro-lenses 92 are shown in Figs. 7A and 7B as being sparsely distributed over the area of substrate 94, but in practice the micro-lenses may be more densely packed. This arrangement is advantageous, by comparison with the other embodiments described above, in that substantially all of the light from the light source is projected onto the object: Transparency 90 effectively redistributes the light, rather than blocking a part of the light in the dark areas of the pattern.
As a further alternative, the micro-lenses may have non-uniform focal lengths. For example, different micro-lenses may have different focal lengths, so that the pattern that is projected on the object varies with distance from the illumination assembly. As another example, some or all of the micro-lenses may have multiple different focal lengths. Alternatively or additionally, the micro-lenses or the projection optics (such as optic 56 in Fig. 3) may be astigmatic, meaning that they have different focal lengths in different planes, so that the shapes of the spots will vary with distance. (As yet another alternative, an equivalent result may be obtained by making objective optics 40 (Fig. 2) astigmatic.) These sorts of depth-varying patterns may be used in range mapping, as described, for example, in PCT International Publications WO 2007/096893 and WO 2007/105215, whose disclosures are incorporated herein by reference.
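For the astigmatic variant, the depth dependence of the spot shape can be illustrated with simple geometric optics; the focal lengths and aperture below are hypothetical values, not figures taken from the text:

```python
def astigmatic_spot_widths(z, fx=5.0e-3, fy=6.0e-3, aperture=1.0e-4):
    """Geometric widths of a focal spot in the two principal planes of an
    astigmatic lens with focal lengths fx and fy (hypothetical values).
    The ratio wx/wy varies with the distance z, which is what makes the
    spot shape encode depth."""
    wx = aperture * abs(z - fx) / fx     # geometric blur width in the x-plane
    wy = aperture * abs(z - fy) / fy     # geometric blur width in the y-plane
    return wx, wy
```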
Fig. 8 is a schematic top view of an illumination assembly 100, which may be used in place of assembly 30 (Fig. 2) or assembly 50 (Fig. 3) in 3D mapping, in accordance with another embodiment of the present invention. In this case, light from LED 52 or from another source is directed by optic 54 through a diffuser 102 followed by a uniform micro-lens array 104. In this configuration, the wavefront variations introduced by the diffuser will give rise to randomization of the locations of the spots formed by the micro-lenses. Optic 56 projects this pattern of spots onto the object that is to be mapped. In this case, too, astigmatic optics or other means may be used to make the pattern vary with distance from the illumination assembly.
The patterns in Figs. 4-6 and 7A/B are shown solely by way of example, and transparencies containing other sorts of uncorrelated patterns may similarly be used in system 20 and are considered to be within the scope of the present invention. Furthermore, although the embodiments described above relate to the specific configuration of system 20 and design of device 22 that are described above, certain principles of the present invention may similarly be applied in systems and devices of other types for optical 3D mapping. For example, aspects of the embodiments described above may be applied in systems that use multiple image capture assemblies, or in which the image capture assembly and the illumination assembly are movable relative to one another.
It will thus be appreciated that the embodiments described above are cited by way of example, and that the present invention is not limited to what has been particularly shown and described hereinabove. Rather, the scope of the present invention includes both combinations and subcombinations of the various features described hereinabove, as well as variations and modifications thereof which would occur to persons skilled in the art upon reading the foregoing description and which are not disclosed in the prior art.

Claims

1. Apparatus for mapping an object, comprising: an illumination assembly, comprising: a single transparency containing a fixed pattern of spots; and a light source, which is configured to transilluminate the single transparency with optical radiation so as to project the pattern onto the object; an image capture assembly, which is configured to capture an image of the pattern that is projected onto the object using the single transparency; and a processor, which is coupled to process the image captured by the image capture assembly so as to reconstruct a three-dimensional (3D) map of the object.
2. The apparatus according to claim 1, wherein the pattern is uncorrelated over a range of depths that is mapped by the apparatus.
3. The apparatus according to claim 1, wherein the processor is arranged to derive the 3D map by finding respective offsets between the pattern of the spots on multiple areas of the object captured in the image of the pattern that is projected onto the object and a reference image of the pattern, wherein the respective offsets are indicative of respective distances between the areas and the image capture assembly.
4. The apparatus according to claim 1, wherein the image capture assembly is arranged to capture images of the pattern on the object from a single, fixed location and angle relative to the illumination assembly.
5. The apparatus according to claim 4, wherein the transparency and light source are fixed in respective positions in the illumination assembly.
6. The apparatus according to claim 5, wherein the processor is configured to reconstruct the 3D map using the images that are captured only from the single, fixed location and angle with the transparency and light source only in the respective positions.
7. The apparatus according to claim 1, wherein the light source comprises a point source of the optical radiation.
8. The apparatus according to claim 1, wherein the light source comprises a light-emitting diode (LED).
9. The apparatus according to any of claims 1-8, wherein the processor is arranged to process a succession of images captured while the object is moving so as to map a 3D movement of the object.
10. The apparatus according to claim 9, wherein the object is a part of a human body, and wherein the 3D movement comprises a gesture made by the part of the human body, and wherein the processor is coupled to provide an input to a computer application responsively to the gesture.
11. The apparatus according to claim 1, wherein the spots in the transparency comprise micro-lenses arranged in the fixed pattern.
12. Apparatus for mapping an object, comprising: an illumination assembly, comprising: a transparency containing an uncorrelated pattern of spots; and a light source, which is configured to transilluminate the transparency with optical radiation so as to project the uncorrelated pattern onto the object; an image capture assembly, which is configured to capture an image of the uncorrelated pattern that is projected onto the object; and a processor, which is coupled to process the image captured by the image capture assembly so as to reconstruct a three-dimensional (3D) map of the object.
13. The apparatus according to claim 12, wherein the uncorrelated pattern comprises a pseudo-random pattern.
14. The apparatus according to claim 12, wherein the uncorrelated pattern comprises a quasi-periodic pattern.
15. The apparatus according to claim 14, wherein the quasi-periodic pattern has an n-fold symmetry, with n=5 or n ≥ 7.
16. The apparatus according to claim 12, wherein the uncorrelated pattern has a duty cycle that is less than 1/e.
17. The apparatus according to claim 12, wherein the spots have a local duty cycle that varies across the pattern.
18. The apparatus according to claim 12, wherein the transparency contains a plurality of parallel bands, repeating periodically in a first direction, each band containing a replica of the uncorrelated pattern extending across at least a part of the transparency in a second direction, perpendicular to the first direction.
19. The apparatus according to any of claims 12-18, wherein the processor is configured to derive the 3D map by finding respective offsets between the pattern of the spots on multiple areas of the object captured in the image of the pattern that is projected onto the object and a reference image of the pattern, wherein the respective offsets are indicative of respective distances between the areas and the image capture assembly.
20. The apparatus according to claim 19, wherein the spots have a local duty cycle that varies monotonically along an axis across the pattern, and wherein the processor is configured to determine local gray levels of the multiple areas in the image responsively to the local duty cycle, and to estimate the respective offsets based on the local gray levels.
21. The apparatus according to claim 12, wherein the spots in the transparency comprise micro-lenses arranged in the uncorrelated pattern.
22. Apparatus for mapping an object, comprising: an illumination assembly, comprising: a transparency containing a plurality of micro-lenses arranged in a non-uniform pattern; and a light source, which is configured to transilluminate the transparency with optical radiation so as to project the pattern onto the object; an image capture assembly, which is configured to capture an image of the pattern that is projected onto the object using the transparency; and a processor, which is coupled to process the image captured by the image capture assembly so as to reconstruct a three-dimensional (3D) map of the object.
23. The apparatus according to claim 22, wherein the micro-lenses are configured to focus the optical radiation to form respective focal spots at a focal plane in the non-uniform pattern, and wherein the light source comprises optics for projecting the non-uniform pattern of the focal spots from the focal plane onto the object.
24. The apparatus according to claim 22, wherein at least some of the micro-lenses have non-uniform focal lengths, and wherein the light source comprises optics for projecting the non-uniform pattern of the focal spots so that the pattern that is projected on the object varies with distance from the illumination assembly.
25. A method for mapping an object, comprising: transilluminating a single transparency containing a fixed pattern of spots so as to project the pattern onto the object; capturing an image of the pattern that is projected onto the object using the single transparency; and processing the captured image so as to reconstruct a three-dimensional (3D) map of the object.
26. The method according to claim 25, wherein the pattern is uncorrelated over a range of depths that is mapped by the method.
27. The method according to claim 25, wherein processing the captured image comprises finding respective offsets between the pattern of the spots on multiple areas of the object captured in the image of the pattern that is projected onto the object and a reference image of the pattern, and determining respective distances to the areas responsively to the respective offsets.
28. The method according to claim 25, wherein capturing the image comprises capturing images of the pattern on the object from a single, fixed location and angle relative to the single transparency.
29. The method according to claim 28, wherein transilluminating the transparency comprises projecting optical radiation through the transparency from a light source, wherein the transparency and light source are fixed in respective positions relative to one another.
30. The method according to claim 28, wherein processing the captured image comprises reconstructing the 3D map without use of further images captured from any location or angle other than the single, fixed location and angle and without movement of the transparency and the light source from the fixed respective positions.
31. The method according to claim 25, wherein transilluminating the single transparency comprises projecting optical radiation from a point source through the single transparency.
32. The method according to claim 25, wherein transilluminating the single transparency comprises projecting optical radiation from a light-emitting diode (LED) through the single transparency.
33. The method according to any of claims 25-32, wherein processing the captured image comprises processing a succession of images captured while the object is moving so as to map a 3D movement of the object.
34. The method according to claim 33, wherein the object is a part of a human body, and wherein the 3D movement comprises a gesture made by the part of the human body, and wherein processing the succession of the images comprises providing an input to a computer application responsively to the gesture.
35. The method according to claim 25, wherein the spots in the transparency comprise micro-lenses arranged in the fixed pattern.
36. A method for mapping an object, comprising: transilluminating a transparency containing an uncorrelated pattern of spots so as to project the uncorrelated pattern onto the object; capturing an image of the uncorrelated pattern that is projected onto the object; and processing the captured image so as to reconstruct a three-dimensional (3D) map of the object.
37. The method according to claim 36, wherein the uncorrelated pattern comprises a pseudo-random pattern.
38. The method according to claim 36, wherein the uncorrelated pattern comprises a quasi-periodic pattern.
39. The method according to claim 38, wherein the quasi-periodic pattern has an n-fold symmetry, with n=5 or n ≥ 7.
40. The method according to claim 36, wherein the uncorrelated pattern has a duty cycle that is less than 1/e.
41. The method according to claim 36, wherein the spots have a local duty cycle that varies across the pattern.
42. The method according to claim 36, wherein the transparency contains a plurality of parallel bands, repeating periodically in a first direction, each band containing a replica of the uncorrelated pattern extending across at least a part of the transparency in a second direction, perpendicular to the first direction.
43. The method according to any of claims 36-42, wherein processing the captured image comprises finding respective offsets between the pattern of the spots on multiple areas of the object captured in the image of the pattern that is projected onto the object and a reference image of the pattern, and determining respective distances to the areas responsively to the respective offsets.
44. The method according to claim 43, wherein the spots have a local duty cycle that varies monotonically along an axis across the pattern, and wherein finding the respective offsets comprises determining local gray levels of the multiple areas in the image responsively to the local duty cycle, and estimating the respective offsets based on the local gray levels.
45. The method according to claim 36, wherein the spots in the transparency comprise micro-lenses arranged in the uncorrelated pattern.
46. A method for mapping an object, comprising: transilluminating a transparency containing a plurality of micro-lenses arranged in a non-uniform pattern so as to project the non-uniform pattern onto the object; capturing an image of the non-uniform pattern that is projected onto the object; and processing the captured image so as to reconstruct a three-dimensional (3D) map of the object.
47. The method according to claim 46, wherein the micro-lenses are configured to focus the optical radiation to form respective focal spots at a focal plane in the non-uniform pattern, and wherein transilluminating the transparency comprises projecting the non-uniform pattern of the focal spots from the focal plane onto the object.
48. The method according to claim 46, wherein at least some of the micro-lenses have non-uniform focal lengths, and wherein transilluminating the transparency comprises projecting the non-uniform pattern of the focal spots so that the pattern that is projected on the object varies with distance from the transparency.
49. Apparatus for mapping an object, comprising: an illumination assembly, comprising: a diffuser; a light source, which is configured to transilluminate the diffuser with optical radiation; a transparency containing an array of micro-lenses, which are arranged to focus the optical radiation transmitted through the diffuser, thereby generating a pattern of spots; and projection optics, which are arranged to project the pattern onto the object; an image capture assembly, which is configured to capture an image of the pattern that is projected onto the object using the transparency; and a processor, which is coupled to process the image captured by the image capture assembly so as to reconstruct a three-dimensional (3D) map of the object.
50. The apparatus according to claim 49, wherein the illumination assembly is configured to project the pattern so that the pattern varies with distance from the illumination assembly.
51. A method for mapping an object, comprising: transilluminating a diffuser with optical radiation; focusing the optical radiation transmitted through the diffuser using an array of micro- lenses, thereby generating a pattern of spots; projecting the pattern onto the object; capturing an image of the pattern that is projected onto the object; and processing the captured image so as to reconstruct a three-dimensional (3D) map of the object.
52. The method according to claim 51, wherein the projected pattern varies with distance from the array of micro-lenses.
PCT/IL2008/000458 2007-04-02 2008-04-02 Depth mapping using projected patterns WO2008120217A2 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
US12/522,171 US8493496B2 (en) 2007-04-02 2008-04-02 Depth mapping using projected patterns
US13/931,935 US9885459B2 (en) 2007-04-02 2013-06-30 Pattern projection using micro-lenses
US15/841,361 US10514148B2 (en) 2007-04-02 2017-12-14 Pattern projection using microlenses

Applications Claiming Priority (6)

Application Number Priority Date Filing Date Title
US90948707P 2007-04-02 2007-04-02
US60/909,487 2007-04-02
US11/899,542 US8150142B2 (en) 2007-04-02 2007-09-06 Depth mapping using projected patterns
US11/899,542 2007-09-06
US1683207P 2007-12-27 2007-12-27
US61/016,832 2007-12-27

Related Child Applications (2)

Application Number Title Priority Date Filing Date
US12/522,171 A-371-Of-International US8493496B2 (en) 2007-04-02 2008-04-02 Depth mapping using projected patterns
US13/931,935 Continuation US9885459B2 (en) 2007-04-02 2013-06-30 Pattern projection using micro-lenses

Publications (2)

Publication Number Publication Date
WO2008120217A2 true WO2008120217A2 (en) 2008-10-09
WO2008120217A3 WO2008120217A3 (en) 2010-02-25

Family

ID=39808782

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IL2008/000458 WO2008120217A2 (en) 2007-04-02 2008-04-02 Depth mapping using projected patterns

Country Status (3)

Country Link
US (3) US8493496B2 (en)
TW (1) TWI433052B (en)
WO (1) WO2008120217A2 (en)

Cited By (82)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2363686A1 (en) 2010-02-02 2011-09-07 Primesense Ltd. Optical apparatus, an imaging system and a method for producing a photonics module
CN102196220A (en) * 2010-03-04 2011-09-21 索尼公司 Information processing apparatus, information processing method and program
US8050461B2 (en) 2005-10-11 2011-11-01 Primesense Ltd. Depth-varying light fields for three dimensional sensing
US8150142B2 (en) 2007-04-02 2012-04-03 Prime Sense Ltd. Depth mapping using projected patterns
US8166421B2 (en) 2008-01-14 2012-04-24 Primesense Ltd. Three-dimensional user interface
US8249334B2 (en) 2006-05-11 2012-08-21 Primesense Ltd. Modeling of humanoid forms from depth maps
WO2012174406A1 (en) * 2011-06-15 2012-12-20 University Of Washington Methods and systems for haptic rendering and creating virtual fixtures from point clouds
US8350847B2 (en) 2007-01-21 2013-01-08 Primesense Ltd Depth mapping using multi-beam illumination
US8384997B2 (en) 2008-01-21 2013-02-26 Primesense Ltd Optical pattern projection
US8390821B2 (en) 2005-10-11 2013-03-05 Primesense Ltd. Three-dimensional sensing using speckle patterns
US8400494B2 (en) 2005-10-11 2013-03-19 Primesense Ltd. Method and system for object reconstruction
WO2013038089A1 (en) 2011-09-16 2013-03-21 Prynel Method and system for acquiring and processing images for the detection of motion
US8456517B2 (en) 2008-07-09 2013-06-04 Primesense Ltd. Integrated processor for 3D mapping
US8462207B2 (en) 2009-02-12 2013-06-11 Primesense Ltd. Depth ranging with Moiré patterns
CN103164995A (en) * 2013-04-03 2013-06-19 湖南第一师范学院 Children somatic sense interactive learning system and method
US8492696B2 (en) 2009-11-15 2013-07-23 Primesense Ltd. Optical projector with beam monitor including mapping apparatus capturing image of pattern projected onto an object
US8493496B2 (en) 2007-04-02 2013-07-23 Primesense Ltd. Depth mapping using projected patterns
US8494252B2 (en) 2007-06-19 2013-07-23 Primesense Ltd. Depth mapping using optical elements having non-uniform focal characteristics
EP2643659A1 (en) * 2010-11-19 2013-10-02 Primesense Ltd. Depth mapping using time-coded illumination
US8565479B2 (en) 2009-08-13 2013-10-22 Primesense Ltd. Extraction of skeletons from 3D maps
US8582867B2 (en) 2010-09-16 2013-11-12 Primesense Ltd Learning-based pose estimation from depth maps
US8594425B2 (en) 2010-05-31 2013-11-26 Primesense Ltd. Analysis of three-dimensional scenes
US8630039B2 (en) 2008-01-21 2014-01-14 Primesense Ltd. Optical designs for zero order reduction
US8717417B2 (en) 2009-04-16 2014-05-06 Primesense Ltd. Three-dimensional mapping and imaging
US8749796B2 (en) 2011-08-09 2014-06-10 Primesense Ltd. Projectors of structured light
US8786682B2 (en) 2009-03-05 2014-07-22 Primesense Ltd. Reference image techniques for three-dimensional sensing
US8787663B2 (en) 2010-03-01 2014-07-22 Primesense Ltd. Tracking body parts by combined color image and depth processing
US8830227B2 (en) 2009-12-06 2014-09-09 Primesense Ltd. Depth-based gain control
US8872762B2 (en) 2010-12-08 2014-10-28 Primesense Ltd. Three dimensional user interface cursor control
US8881051B2 (en) 2011-07-05 2014-11-04 Primesense Ltd Zoom-based gesture user interface
US8908277B2 (en) 2011-08-09 2014-12-09 Apple Inc Lens array projector
US8933876B2 (en) 2010-12-13 2015-01-13 Apple Inc. Three dimensional user interface session control
US8959013B2 (en) 2010-09-27 2015-02-17 Apple Inc. Virtual keyboard for a non-tactile three dimensional user interface
WO2015038443A1 (en) * 2013-09-11 2015-03-19 Microsoft Corporation Optical modules for use with depth cameras
US9002099B2 (en) 2011-09-11 2015-04-07 Apple Inc. Learning-based estimation of hand and finger pose
US9019267B2 (en) 2012-10-30 2015-04-28 Apple Inc. Depth mapping with enhanced resolution
US9030498B2 (en) 2011-08-15 2015-05-12 Apple Inc. Combining explicit select gestures and timeclick in a non-tactile three dimensional user interface
US9030466B2 (en) 2010-10-05 2015-05-12 Empire Technology Development Llc Generation of depth data based on spatial light pattern
US9036158B2 (en) 2010-08-11 2015-05-19 Apple Inc. Pattern projector
US9035876B2 (en) 2008-01-14 2015-05-19 Apple Inc. Three-dimensional user interface session control
US9047507B2 (en) 2012-05-02 2015-06-02 Apple Inc. Upper-body skeleton extraction from depth maps
US9098931B2 (en) 2010-08-11 2015-08-04 Apple Inc. Scanning projectors and image capture modules for 3D mapping
WO2015124780A1 (en) 2014-02-21 2015-08-27 Ipo.Plan Gmbh Device for sensing a three-dimensional object
US9122311B2 (en) 2011-08-24 2015-09-01 Apple Inc. Visual feedback for tactile and non-tactile user interfaces
US9131136B2 (en) 2010-12-06 2015-09-08 Apple Inc. Lens arrays for pattern projection and imaging
US9152234B2 (en) 2012-12-02 2015-10-06 Apple Inc. Detecting user intent to remove a pluggable peripheral device
US9201237B2 (en) 2012-03-22 2015-12-01 Apple Inc. Diffraction-based sensing of mirror position
US9218063B2 (en) 2011-08-24 2015-12-22 Apple Inc. Sessionless pointing user interface
WO2016018550A1 (en) 2014-07-28 2016-02-04 Apple Inc. Overlapping pattern projector
US9348111B2 (en) 2010-08-24 2016-05-24 Apple Inc. Automatic detection of lens deviations
US9443310B2 (en) 2013-10-09 2016-09-13 Microsoft Technology Licensing, Llc Illumination modules that emit structured light
US9477307B2 (en) 2013-01-24 2016-10-25 The University Of Washington Methods and systems for six degree-of-freedom haptic interaction with streaming point data
US9525863B2 (en) 2015-04-29 2016-12-20 Apple Inc. Time-of-flight depth mapping with flexible scan pattern
US9528906B1 (en) 2013-12-19 2016-12-27 Apple Inc. Monitoring DOE performance using total internal reflection
US9582889B2 (en) 2009-07-30 2017-02-28 Apple Inc. Depth mapping based on pattern matching and stereoscopic information
US9595156B2 (en) 2012-01-23 2017-03-14 Novomatic Ag Prize wheel with gesture-based control
CN107206451A (en) * 2014-12-12 2017-09-26 屋罗斯·杜兰强尼 The application of shape bending process of the 3D video cameras on the bending machine of three and four rollers
US9825425B2 (en) 2013-06-19 2017-11-21 Apple Inc. Integrated structured-light projector comprising light-emitting elements on a substrate
US9888225B2 (en) 2011-02-04 2018-02-06 Koninklijke Philips N.V. Method of recording an image and obtaining 3D information from the image, camera system
US9946089B2 (en) 2015-10-21 2018-04-17 Princeton Optronics, Inc. Generation of coded structured light patterns using VCSEL arrays
US10012831B2 (en) 2015-08-03 2018-07-03 Apple Inc. Optical monitoring of scan parameters
US10043279B1 (en) 2015-12-07 2018-08-07 Apple Inc. Robust detection and classification of body parts in a depth map
US10073004B2 (en) 2016-09-19 2018-09-11 Apple Inc. DOE defect monitoring utilizing total internal reflection
US10153614B1 (en) 2017-08-31 2018-12-11 Apple Inc. Creating arbitrary patterns on a 2-D uniform grid VCSEL array
EP3425327A1 (en) * 2017-07-06 2019-01-09 Car-O-Liner Group AB A method for determining wheel alignment parameters
US10226869B2 (en) 2014-03-03 2019-03-12 University Of Washington Haptic virtual fixture tools
CN109477710A (en) * 2016-07-27 2019-03-15 微软技术许可有限责任公司 The reflectance map of structured light system based on point is estimated
US10310281B1 (en) 2017-12-05 2019-06-04 K Laser Technology, Inc. Optical projector with off-axis diffractive element
US10317684B1 (en) 2018-01-24 2019-06-11 K Laser Technology, Inc. Optical projector with on axis hologram and multiple beam splitter
US10349037B2 (en) 2014-04-03 2019-07-09 Ams Sensors Singapore Pte. Ltd. Structured-stereo imaging assembly including separate imagers for different wavelengths
US10366278B2 (en) 2016-09-20 2019-07-30 Apple Inc. Curvature-based face detector
EP3527121A1 (en) 2011-02-09 2019-08-21 Apple Inc. Gesture detection in a 3d mapping environment
US10441176B2 (en) 2012-12-31 2019-10-15 Omni Medsci, Inc. Imaging using near-infrared laser diodes with distributed bragg reflectors
US10517484B2 (en) 2012-12-31 2019-12-31 Omni Medsci, Inc. Semiconductor diodes-based physiological measurement device with improved signal-to-noise ratio
US10545457B2 (en) 2017-12-05 2020-01-28 K Laser Technology, Inc. Optical projector with off-axis diffractive element and conjugate images
WO2020103165A1 (en) * 2018-11-24 2020-05-28 深圳阜时科技有限公司 Light source structure, optical projection module, sensing apparatus, and device
WO2020103166A1 (en) * 2018-11-24 2020-05-28 深圳阜时科技有限公司 Light source structure, optical projection module and sensing device and apparatus
US11422292B1 (en) 2018-06-10 2022-08-23 Apple Inc. Super-blazed diffractive optical elements with sub-wavelength structures
US11506762B1 (en) 2019-09-24 2022-11-22 Apple Inc. Optical module comprising an optical waveguide with reference light path
US11681019B2 (en) 2019-09-18 2023-06-20 Apple Inc. Optical module with stray light baffle
US11754767B1 (en) 2020-03-05 2023-09-12 Apple Inc. Display with overlaid waveguide
US12111421B2 (en) 2021-03-17 2024-10-08 Apple Inc. Waveguide-based transmitters with adjustable lighting

Families Citing this family (221)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9330324B2 (en) 2005-10-11 2016-05-03 Apple Inc. Error compensation in three-dimensional mapping
US20110096182A1 (en) * 2009-10-25 2011-04-28 Prime Sense Ltd Error Compensation in Three-Dimensional Mapping
EP1994503B1 (en) 2006-03-14 2017-07-05 Apple Inc. Depth-varying light fields for three dimensional sensing
CN101627280B (en) * 2006-11-21 2013-09-25 曼蒂斯影像有限公司 3d geometric modeling and 3d video content creation
US8090194B2 (en) 2006-11-21 2012-01-03 Mantis Vision Ltd. 3D geometric modeling and motion capture using both single and dual imaging
US8265793B2 (en) 2007-03-20 2012-09-11 Irobot Corporation Mobile robot for telecommunication
US20090219253A1 (en) * 2008-02-29 2009-09-03 Microsoft Corporation Interactive Surface Computer with Switchable Diffuser
US8866920B2 (en) 2008-05-20 2014-10-21 Pelican Imaging Corporation Capturing and processing of images using monolithic camera array with heterogeneous imagers
US11792538B2 (en) 2008-05-20 2023-10-17 Adeia Imaging Llc Capturing and processing of images including occlusions focused on an image sensor by a lens stack array
EP2289235A4 (en) * 2008-05-20 2011-12-28 Pelican Imaging Corp Capturing and processing of images using monolithic camera array with hetergeneous imagers
US8417385B2 (en) * 2009-07-01 2013-04-09 Pixart Imaging Inc. Home appliance control device
US8514491B2 (en) 2009-11-20 2013-08-20 Pelican Imaging Corporation Capturing and processing of images using monolithic camera array with heterogeneous imagers
JP4783456B2 (en) * 2009-12-22 2011-09-28 株式会社東芝 Video playback apparatus and video playback method
US8982182B2 (en) * 2010-03-01 2015-03-17 Apple Inc. Non-uniform spatial resource allocation for depth mapping
US8649025B2 (en) 2010-03-27 2014-02-11 Micrometric Vision Technologies Methods and apparatus for real-time digitization of three-dimensional scenes
KR101824672B1 (en) 2010-05-12 2018-02-05 포토네이션 케이맨 리미티드 Architectures for imager arrays and array cameras
US8918209B2 (en) 2010-05-20 2014-12-23 Irobot Corporation Mobile human interface robot
US8918213B2 (en) 2010-05-20 2014-12-23 Irobot Corporation Mobile human interface robot
US8935005B2 (en) 2010-05-20 2015-01-13 Irobot Corporation Operating a mobile robot
US9014848B2 (en) 2010-05-20 2015-04-21 Irobot Corporation Mobile robot system
GB2494081B (en) 2010-05-20 2015-11-11 Irobot Corp Mobile human interface robot
EP2593748A1 (en) * 2010-07-16 2013-05-22 Koninklijke Philips Electronics N.V. A light projector and vision system for distance determination
US9201501B2 (en) 2010-07-20 2015-12-01 Apple Inc. Adaptive projector
CN102959616B (en) 2010-07-20 2015-06-10 苹果公司 Interactive reality augmentation for natural interaction
US9232912B2 (en) 2010-08-26 2016-01-12 The Regents Of The University Of California System for evaluating infant movement using gesture recognition
TWI428568B (en) * 2010-09-03 2014-03-01 Pixart Imaging Inc Distance measurement method and system, and processing software thereof
US20140031668A1 (en) * 2010-09-08 2014-01-30 Disruptive Navigational Technologies, Llc Surgical and Medical Instrument Tracking Using a Depth-Sensing Device
US8681255B2 (en) 2010-09-28 2014-03-25 Microsoft Corporation Integrated low power depth camera and projection device
US8878950B2 (en) 2010-12-14 2014-11-04 Pelican Imaging Corporation Systems and methods for synthesizing high resolution images using super-resolution processes
GB2502213A (en) 2010-12-30 2013-11-20 Irobot Corp Mobile Human Interface Robot
US8930019B2 (en) 2010-12-30 2015-01-06 Irobot Corporation Mobile human interface robot
US8963883B2 (en) 2011-03-17 2015-02-24 Symbol Technologies, Inc. Touchless interactive display system
US9030528B2 (en) 2011-04-04 2015-05-12 Apple Inc. Multi-zone imaging sensor and lens array
EP2469300B1 (en) 2011-04-18 2012-08-22 Sick Ag 3D camera and method for three dimensional surveillance of a surveillance area
EP2708019B1 (en) 2011-05-11 2019-10-16 FotoNation Limited Systems and methods for transmitting and receiving array camera image data
EP2772676B1 (en) 2011-05-18 2015-07-08 Sick Ag 3D camera and method for three dimensional surveillance of a surveillance area
US8123622B1 (en) 2011-06-03 2012-02-28 Nyko Technologies, Inc. Lens accessory for video game sensor device
CN103154666B (en) 2011-06-14 2015-03-18 日产自动车株式会社 Distance measurement device and environment map generation apparatus
RU2455676C2 (en) 2011-07-04 2012-07-10 Общество с ограниченной ответственностью "ТРИДИВИ" Method of controlling device using gestures and 3d sensor for realising said method
US9459758B2 (en) * 2011-07-05 2016-10-04 Apple Inc. Gesture-based interface with enhanced features
US9377865B2 (en) 2011-07-05 2016-06-28 Apple Inc. Zoom-based gesture user interface
US8869073B2 (en) * 2011-07-28 2014-10-21 Hewlett-Packard Development Company, L.P. Hand pose interaction
US8971572B1 (en) 2011-08-12 2015-03-03 The Research Foundation For The State University Of New York Hand pointing estimation for human computer interaction
DE102011052802B4 (en) 2011-08-18 2014-03-13 Sick Ag 3D camera and method for monitoring a room area
US20130044912A1 (en) 2011-08-19 2013-02-21 Qualcomm Incorporated Use of association of an object detected in an image to obtain information to display to a user
JP2013070030A (en) 2011-09-06 2013-04-18 Sony Corp Imaging device, electronic apparatus, and information processor
US20130070060A1 (en) 2011-09-19 2013-03-21 Pelican Imaging Corporation Systems and methods for determining depth from multiple views of a scene that include aliasing using hypothesized fusion
WO2013049699A1 (en) 2011-09-28 2013-04-04 Pelican Imaging Corporation Systems and methods for encoding and decoding light field image files
US9679215B2 (en) 2012-01-17 2017-06-13 Leap Motion, Inc. Systems and methods for machine control
US20150253428A1 (en) 2013-03-15 2015-09-10 Leap Motion, Inc. Determining positional information for an object in space
US11493998B2 (en) 2012-01-17 2022-11-08 Ultrahaptics IP Two Limited Systems and methods for machine control
US8693731B2 (en) 2012-01-17 2014-04-08 Leap Motion, Inc. Enhanced contrast for object detection and characterization by optical imaging
US8638989B2 (en) 2012-01-17 2014-01-28 Leap Motion, Inc. Systems and methods for capturing motion in three-dimensional space
US9070019B2 (en) 2012-01-17 2015-06-30 Leap Motion, Inc. Systems and methods for capturing motion in three-dimensional space
US9501152B2 (en) 2013-01-15 2016-11-22 Leap Motion, Inc. Free-space user interface and control using virtual constructs
US10691219B2 (en) 2012-01-17 2020-06-23 Ultrahaptics IP Two Limited Systems and methods for machine control
WO2013121366A1 (en) 2012-02-15 2013-08-22 Primesense Ltd. Scanning depth engine
US9412206B2 (en) 2012-02-21 2016-08-09 Pelican Imaging Corporation Systems and methods for the manipulation of captured light field image data
US9229534B2 (en) 2012-02-28 2016-01-05 Apple Inc. Asymmetric mapping for tactile and non-tactile user interfaces
US8958911B2 (en) 2012-02-29 2015-02-17 Irobot Corporation Mobile robot
WO2013144807A1 (en) 2012-03-26 2013-10-03 Primesense Ltd. Enhanced virtual touchpad and touchscreen
DE102012103766A1 (en) 2012-04-27 2013-10-31 Bircher Reglomat Ag Method for controlling and / or monitoring the areas around resealable building openings
US9210392B2 (en) 2012-05-01 2015-12-08 Pelican Imaging Coporation Camera modules patterned with pi filter groups
DE102012105401B3 (en) * 2012-06-21 2013-04-25 Sick Ag Three-dimensional camera i.e. stereoscopic security camera, for monitoring e.g. press, has classification unit to form measuring curve of correlation quality based on depth maps produced at different exposure times
DE202012102298U1 (en) 2012-06-21 2013-09-25 Sick Ag 3D camera
CN104508681B (en) 2012-06-28 2018-10-30 Fotonation开曼有限公司 For detecting defective camera array, optical device array and the system and method for sensor
US20140002674A1 (en) 2012-06-30 2014-01-02 Pelican Imaging Corporation Systems and Methods for Manufacturing Camera Modules Using Active Alignment of Lens Stack Arrays and Sensors
US10060728B2 (en) 2012-07-26 2018-08-28 Nec Corporation Three-dimensional object-measurement device, medium, and control method
US8619082B1 (en) 2012-08-21 2013-12-31 Pelican Imaging Corporation Systems and methods for parallax detection and correction in images captured using array cameras that contain occlusions using subsets of images to perform depth estimation
US20140055632A1 (en) 2012-08-23 2014-02-27 Pelican Imaging Corporation Feature based high resolution motion estimation from low resolution images captured using an array source
US20140092281A1 (en) 2012-09-28 2014-04-03 Pelican Imaging Corporation Generating Images from Light Fields Utilizing Virtual Viewpoints
DE202012104074U1 (en) 2012-10-23 2014-01-27 Sick Ag 3D camera for three-dimensional monitoring of a surveillance area
JP6061616B2 (en) * 2012-10-29 2017-01-18 キヤノン株式会社 Measuring apparatus, control method therefor, and program
US9285893B2 (en) 2012-11-08 2016-03-15 Leap Motion, Inc. Object detection and tracking with variable-field illumination devices
US9143711B2 (en) 2012-11-13 2015-09-22 Pelican Imaging Corporation Systems and methods for array camera focal plane control
TWI454968B (en) 2012-12-24 2014-10-01 Ind Tech Res Inst Three-dimensional interactive device and operation method thereof
US10660526B2 (en) 2012-12-31 2020-05-26 Omni Medsci, Inc. Near-infrared time-of-flight imaging using laser diodes with Bragg reflectors
WO2014143276A2 (en) 2012-12-31 2014-09-18 Omni Medsci, Inc. Short-wave infrared super-continuum lasers for natural gas leak detection, exploration, and other active remote sensing applications
EP3184038B1 (en) 2012-12-31 2019-02-20 Omni MedSci, Inc. Mouth guard with short-wave infrared super-continuum lasers for early detection of dental caries
US10609285B2 (en) 2013-01-07 2020-03-31 Ultrahaptics IP Two Limited Power consumption in motion-capture systems
US9465461B2 (en) 2013-01-08 2016-10-11 Leap Motion, Inc. Object detection and tracking with audio and optical signals
US9459697B2 (en) 2013-01-15 2016-10-04 Leap Motion, Inc. Dynamic, free-space user interactions for machine control
US9858721B2 (en) 2013-01-15 2018-01-02 The University Of North Carolina At Chapel Hill Methods, systems, and computer readable media for generating an augmented scene display
US10241639B2 (en) 2013-01-15 2019-03-26 Leap Motion, Inc. Dynamic user interactions for display control and manipulation of display objects
NL2010213C2 (en) 2013-01-31 2014-08-04 Lely Patent Nv Camera system, animal related system therewith, and method to create 3d camera images.
WO2014130849A1 (en) 2013-02-21 2014-08-28 Pelican Imaging Corporation Generating compressed light field representation data
US9253380B2 (en) 2013-02-24 2016-02-02 Pelican Imaging Corporation Thin form factor computational array cameras and modular array cameras
US9804576B2 (en) 2013-02-27 2017-10-31 Rockwell Automation Technologies, Inc. Recognition-based industrial automation control with position and derivative decision reference
US9393695B2 (en) 2013-02-27 2016-07-19 Rockwell Automation Technologies, Inc. Recognition-based industrial automation control with person and object discrimination
US9798302B2 (en) 2013-02-27 2017-10-24 Rockwell Automation Technologies, Inc. Recognition-based industrial automation control with redundant system input support
US9498885B2 (en) 2013-02-27 2016-11-22 Rockwell Automation Technologies, Inc. Recognition-based industrial automation control with confidence-based decision support
WO2014138695A1 (en) 2013-03-08 2014-09-12 Pelican Imaging Corporation Systems and methods for measuring scene information while capturing images using array cameras
US8866912B2 (en) 2013-03-10 2014-10-21 Pelican Imaging Corporation System and methods for calibration of an array camera using a single captured image
US9521416B1 (en) 2013-03-11 2016-12-13 Kip Peli P1 Lp Systems and methods for image data compression
US9124831B2 (en) 2013-03-13 2015-09-01 Pelican Imaging Corporation System and methods for calibration of an array camera
WO2014164909A1 (en) 2013-03-13 2014-10-09 Pelican Imaging Corporation Array camera architecture implementing quantum film sensors
US9106784B2 (en) 2013-03-13 2015-08-11 Pelican Imaging Corporation Systems and methods for controlling aliasing in images captured by an array camera for use in super-resolution processing
US9519972B2 (en) 2013-03-13 2016-12-13 Kip Peli P1 Lp Systems and methods for synthesizing images from image data captured by an array camera using restricted depth of field depth maps in which depth estimation precision varies
WO2014153098A1 (en) 2013-03-14 2014-09-25 Pelican Imaging Corporation Photmetric normalization in array cameras
WO2014159779A1 (en) 2013-03-14 2014-10-02 Pelican Imaging Corporation Systems and methods for reducing motion blur in images or video in ultra low light with array cameras
US9497429B2 (en) 2013-03-15 2016-11-15 Pelican Imaging Corporation Extended color processing on pelican array cameras
US9445003B1 (en) 2013-03-15 2016-09-13 Pelican Imaging Corporation Systems and methods for synthesizing high resolution images using image deconvolution based on motion and depth information
WO2014145856A1 (en) 2013-03-15 2014-09-18 Pelican Imaging Corporation Systems and methods for stereo imaging with camera arrays
US10122993B2 (en) 2013-03-15 2018-11-06 Fotonation Limited Autofocus system for a conventional camera that uses depth information from an array camera
WO2014150856A1 (en) 2013-03-15 2014-09-25 Pelican Imaging Corporation Array camera implementing quantum dot color filters
US10620709B2 (en) 2013-04-05 2020-04-14 Ultrahaptics IP Two Limited Customized gesture interpretation
US9294758B2 (en) * 2013-04-18 2016-03-22 Microsoft Technology Licensing, Llc Determining depth data for a captured image
US9916009B2 (en) 2013-04-26 2018-03-13 Leap Motion, Inc. Non-tactile interface systems and methods
US9747696B2 (en) 2013-05-17 2017-08-29 Leap Motion, Inc. Systems and methods for providing normalized parameters of motions of objects in three-dimensional space
JP6221394B2 (en) 2013-06-19 2017-11-01 富士通株式会社 Image processing apparatus, image processing method, and image processing program
US10281987B1 (en) 2013-08-09 2019-05-07 Leap Motion, Inc. Systems and methods of free-space gestural interaction
US10846942B1 (en) 2013-08-29 2020-11-24 Ultrahaptics IP Two Limited Predictive information for free space gesture control and communication
US9898856B2 (en) 2013-09-27 2018-02-20 Fotonation Cayman Limited Systems and methods for depth-assisted perspective distortion correction
US9632572B2 (en) 2013-10-03 2017-04-25 Leap Motion, Inc. Enhanced field of view to augment three-dimensional (3D) sensory space for free-space gesture interpretation
US8917327B1 (en) 2013-10-04 2014-12-23 icClarity, Inc. Method to use array sensors to measure multiple types of data at full resolution of the sensor
US10168873B1 (en) 2013-10-29 2019-01-01 Leap Motion, Inc. Virtual interactions for machine control
US9996638B1 (en) 2013-10-31 2018-06-12 Leap Motion, Inc. Predictive information for free space gesture control and communication
US9264592B2 (en) 2013-11-07 2016-02-16 Pelican Imaging Corporation Array camera modules incorporating independently aligned lens stacks
EP3068301A4 (en) 2013-11-12 2017-07-12 Highland Instruments, Inc. Analysis suite
WO2015074078A1 (en) 2013-11-18 2015-05-21 Pelican Imaging Corporation Estimating depth from projected texture using camera arrays
US9456134B2 (en) 2013-11-26 2016-09-27 Pelican Imaging Corporation Array camera configurations incorporating constituent array cameras and constituent cameras
WO2015095737A2 (en) 2013-12-19 2015-06-25 The University Of North Carolina At Chapel Hill Optical see-through near-eye display using point light source backlight
WO2015099211A1 (en) * 2013-12-24 2015-07-02 엘지전자 주식회사 3d camera module
US9613262B2 (en) 2014-01-15 2017-04-04 Leap Motion, Inc. Object detection and tracking for providing a virtual device experience
WO2015134996A1 (en) 2014-03-07 2015-09-11 Pelican Imaging Corporation System and methods for depth regularization and semiautomatic interactive matting using rgb-d images
CN103888674B (en) * 2014-04-15 2017-08-11 聚晶半导体股份有限公司 Image capture unit and image acquisition method
TWI524050B (en) * 2014-04-15 2016-03-01 聚晶半導體股份有限公司 Image capture device, depth generating device and method thereof
US9589359B2 (en) * 2014-04-24 2017-03-07 Intel Corporation Structured stereo
FR3021205B1 (en) 2014-05-20 2021-12-24 Essilor Int METHOD FOR DETERMINING AT LEAST ONE BEHAVIORAL PARAMETER
FR3021204A1 (en) 2014-05-20 2015-11-27 Essilor Int METHOD FOR DETERMINING AT LEAST ONE PARAMETER OF VISUAL BEHAVIOR OF AN INDIVIDUAL
EP2950268B1 (en) * 2014-05-28 2018-07-04 Wincor Nixdorf International GmbH Method and device for detecting the three-dimensional shape of an object
AU2015287252C1 (en) 2014-07-08 2019-08-22 Facebook Technologies, Llc Method and system for adjusting light pattern for structured light imaging
CN204480228U (en) 2014-08-08 2015-07-15 厉动公司 motion sensing and imaging device
JP6452361B2 (en) 2014-09-10 2019-01-16 キヤノン株式会社 Information processing apparatus, information processing method, and program
USD733141S1 (en) 2014-09-10 2015-06-30 Faro Technologies, Inc. Laser scanner
WO2016054089A1 (en) 2014-09-29 2016-04-07 Pelican Imaging Corporation Systems and methods for dynamic calibration of array cameras
US9482624B2 (en) * 2014-10-29 2016-11-01 SUNMOON UNIVERSITY Industry-University Cooperation Apparatus for inspecting
EP3021072B1 (en) 2014-11-14 2016-09-28 Sick Ag Lighting device and method for projecting an illumination pattern
US9881235B1 (en) 2014-11-21 2018-01-30 Mahmoud Narimanzadeh System, apparatus, and method for determining physical dimensions in digital images
US9841496B2 (en) 2014-11-21 2017-12-12 Microsoft Technology Licensing, Llc Multiple pattern illumination optics for time of flight system
WO2016099527A1 (en) 2014-12-19 2016-06-23 Hewlett-Packard Development Company, L.P. Determine image capture position information based on a quasi-periodic pattern
US9273846B1 (en) 2015-01-29 2016-03-01 Heptagon Micro Optics Pte. Ltd. Apparatus for producing patterned illumination including at least one array of light sources and at least one array of microlenses
US10509147B2 (en) 2015-01-29 2019-12-17 ams Sensors Singapore Pte. Ltd Apparatus for producing patterned illumination using arrays of light sources and lenses
US10429923B1 (en) 2015-02-13 2019-10-01 Ultrahaptics IP Two Limited Interaction engine for creating a realistic experience in virtual reality/augmented reality environments
US9696795B2 (en) 2015-02-13 2017-07-04 Leap Motion, Inc. Systems and methods of creating a realistic grab experience in virtual reality/augmented reality environments
US9858719B2 (en) 2015-03-30 2018-01-02 Amazon Technologies, Inc. Blended reality systems and methods
US9984519B2 (en) 2015-04-10 2018-05-29 Google Llc Method and system for optical user recognition
US9942474B2 (en) 2015-04-17 2018-04-10 Fotonation Cayman Limited Systems and methods for performing high speed video capture and depth estimation using array cameras
US10721912B2 (en) 2015-04-30 2020-07-28 Kevin Hanson Methods and device for pet enrichment
US10666848B2 (en) 2015-05-05 2020-05-26 Microsoft Technology Licensing, Llc Remote depth sensing via relayed depth from diffusion
TWI663377B (en) * 2015-05-15 2019-06-21 高準精密工業股份有限公司 Optical device and light emitting device thereof
JP6566768B2 (en) * 2015-07-30 2019-08-28 キヤノン株式会社 Information processing apparatus, information processing method, and program
US10063849B2 (en) 2015-09-24 2018-08-28 Ouster, Inc. Optical system for collecting distance information within a field
US9992477B2 (en) 2015-09-24 2018-06-05 Ouster, Inc. Optical system for collecting distance information within a field
DE202015105376U1 (en) 2015-10-12 2015-10-19 Sick Ag 3D camera for taking three-dimensional images
US10610133B2 (en) 2015-11-05 2020-04-07 Google Llc Using active IR sensor to monitor sleep
US9971948B1 (en) * 2015-11-12 2018-05-15 Apple Inc. Vein imaging using detection of pulsed radiation
EP3408585B1 (en) * 2016-01-26 2020-12-09 Heptagon Micro Optics Pte. Ltd. Multi-mode illumination module and related method
US10139217B1 (en) * 2016-02-16 2018-11-27 Google Llc Array based patterned illumination projector
US10652489B2 (en) 2016-03-14 2020-05-12 Insightness Ag Vision sensor, a method of vision sensing, and a depth sensor assembly
US10955235B2 (en) * 2016-03-22 2021-03-23 Mitsubishi Electric Corporation Distance measurement apparatus and distance measurement method
KR101733228B1 (en) * 2016-04-28 2017-05-08 주식회사 메디트 Apparatus for three dimensional scanning with structured light
KR101892013B1 (en) 2016-05-27 2018-08-27 엘지전자 주식회사 Mobile terminal
WO2017204498A1 (en) * 2016-05-27 2017-11-30 엘지전자 주식회사 Mobile terminal
US10924638B2 (en) 2016-06-27 2021-02-16 Intel Corporation Compact, low cost VCSEL projector for high performance stereodepth camera
WO2018000036A1 (en) 2016-07-01 2018-01-04 Cylite Pty Ltd Apparatus and method for confocal microscopy using dispersed structured illumination
US10591277B2 (en) 2016-07-28 2020-03-17 Liberty Reach Inc. Method and system for measuring outermost dimension of a vehicle positioned at an inspection station
US10241244B2 (en) 2016-07-29 2019-03-26 Lumentum Operations Llc Thin film total internal reflection diffraction grating for single polarization or dual polarization
US10948572B2 (en) 2016-08-24 2021-03-16 Ouster, Inc. Optical system for collecting distance information within a field
US10049443B2 (en) 2016-08-29 2018-08-14 Liberty Reach Inc. Method and system for determining the presence or absence of a part of an assembly within a work cell
US10270947B2 (en) 2016-09-15 2019-04-23 Microsoft Technology Licensing, Llc Flat digital image sensor
JP6799751B2 (en) * 2016-09-28 2020-12-16 パナソニックIpマネジメント株式会社 Imaging device
US11860292B2 (en) 2016-11-17 2024-01-02 Trinamix Gmbh Detector and methods for authenticating at least one object
US11635486B2 (en) 2016-11-17 2023-04-25 Trinamix Gmbh Detector for optically detecting at least one object
AU2016259442A1 (en) 2016-11-18 2018-06-07 Canon Kabushiki Kaisha Method and system for reproducing visual content
US10158845B2 (en) 2017-01-18 2018-12-18 Facebook Technologies, Llc Tileable structured light projection for wide field-of-view depth sensing
US9983412B1 (en) 2017-02-02 2018-05-29 The University Of North Carolina At Chapel Hill Wide field of view augmented reality see through head mountable display with distance accommodation
JP2020510820A (en) 2017-03-16 2020-04-09 Trinamix Gmbh Detector for optically detecting at least one object
US11150347B2 (en) 2017-05-15 2021-10-19 Ouster, Inc. Micro-optics for optical imager with non-uniform filter
US11182915B2 (en) 2017-07-12 2021-11-23 Gentex Corporation Visual, depth and micro-vibration data extraction using a unified imaging device
US10482618B2 (en) 2017-08-21 2019-11-19 Fotonation Limited Systems and methods for hybrid depth regularization
KR102681300B1 (en) 2017-08-28 2024-07-04 Trinamix Gmbh A detector that determines the position of at least one object
US11448762B2 (en) 2017-08-28 2022-09-20 Trinamix Gmbh Range finder for determining at least one geometric information
US10612912B1 (en) * 2017-10-31 2020-04-07 Facebook Technologies, Llc Tileable structured light projection system
CN107748475 (en) 2017-11-06 2018-03-02 Shenzhen Orbbec Co., Ltd. Structured light projection module, depth camera and method for manufacturing the structured light projection module
US10398855B2 (en) 2017-11-14 2019-09-03 William T. MCCLELLAN Augmented reality based injection therapy
JP7254799B2 (en) 2017-11-17 2023-04-10 Trinamix Gmbh A detector for determining the position of at least one object
US11029713B2 (en) 2017-11-27 2021-06-08 Liberty Reach Inc. Method and system for expanding the range of working environments in which a 3-D or depth sensor can operate without damaging or degrading the measurement performance of the sensor
US11353556B2 (en) 2017-12-07 2022-06-07 Ouster, Inc. Light ranging device with a multi-element bulk lens system
US10521926B1 (en) 2018-03-21 2019-12-31 Facebook Technologies, Llc Tileable non-planar structured light patterns for wide field-of-view depth sensing
CN108594454B (en) 2018-03-23 2019-12-13 Shenzhen Orbbec Co., Ltd. Structured light projection module and depth camera
GB2572831A (en) 2018-04-04 2019-10-16 Cambridge Mechatronics Ltd Apparatus and methods for 3D sensing
US11314220B2 (en) * 2018-04-26 2022-04-26 Liberty Reach Inc. Non-contact method and system for controlling an industrial automation machine
US11875012B2 (en) 2018-05-25 2024-01-16 Ultrahaptics IP Two Limited Throwable interface for augmented reality and virtual reality environments
US11675114B2 (en) 2018-07-23 2023-06-13 II-VI Delaware, Inc. Monolithic structured light projector
US10760957B2 (en) 2018-08-09 2020-09-01 Ouster, Inc. Bulk optics for a scanning array
US10739189B2 (en) 2018-08-09 2020-08-11 Ouster, Inc. Multispectral ranging/imaging sensor arrays and systems
CN110824599B (en) 2018-08-14 2021-09-03 Platinum Optics Technology Inc. Infrared band-pass filter
US12007576B2 (en) 2018-08-27 2024-06-11 Lumentum Operations Llc Lens array to disperse zero-order beams of an emitter array on a diffractive optical element
KR102551261B1 (en) * 2018-10-08 2023-07-05 Samsung Electronics Co., Ltd. Method for generating depth information by using structured light pattern projected to external object, and electronic device using the same
JP2020102014A (en) 2018-12-21 2020-07-02 Fujitsu Limited Biometric authentication device, biometric authentication program, and biometric authentication method
KR20210113637A (en) 2019-01-09 2021-09-16 Trinamix Gmbh A detector for determining a position of at least one object
WO2020144200A1 (en) 2019-01-09 2020-07-16 Trinamix Gmbh Detector for determining a position of at least one object
JP7166445B2 (en) 2019-05-10 2022-11-07 FUJIFILM Corporation Sensor
GB201907188D0 (en) 2019-05-21 2019-07-03 Cambridge Mechatronics Ltd Apparatus
US10805549B1 (en) * 2019-08-20 2020-10-13 Himax Technologies Limited Method and apparatus of auto exposure control based on pattern detection in depth sensing system
CN114600165A (en) 2019-09-17 2022-06-07 Boston Polarimetrics, Inc. System and method for surface modeling using polarization cues
EP4042101A4 (en) 2019-10-07 2023-11-22 Boston Polarimetrics, Inc. Systems and methods for surface normals sensing with polarization
DE102019216813A1 (en) * 2019-10-31 2021-05-06 Robert Bosch Gmbh Transmitter unit for LIDAR devices with near-field beams and far-field beams
KR102558903B1 (en) 2019-11-30 2023-07-24 Boston Polarimetrics, Inc. System and Method for Segmenting Transparent Objects Using Polarized Signals
JP7462769B2 (en) 2020-01-29 2024-04-05 Intrinsic Innovation LLC System and method for characterizing an object pose detection and measurement system
CN115428028A (en) 2020-01-30 2022-12-02 Intrinsic Innovation LLC System and method for synthesizing data for training statistical models in different imaging modalities including polarized images
WO2021243088A1 (en) 2020-05-27 2021-12-02 Boston Polarimetrics, Inc. Multi-aperture polarization optical systems using beam splitters
US12069227B2 (en) 2021-03-10 2024-08-20 Intrinsic Innovation Llc Multi-modal and multi-spectral stereo camera arrays
US12020455B2 (en) 2021-03-10 2024-06-25 Intrinsic Innovation Llc Systems and methods for high dynamic range image reconstruction
US11954886B2 (en) 2021-04-15 2024-04-09 Intrinsic Innovation Llc Systems and methods for six-degree of freedom pose estimation of deformable objects
US11290658B1 (en) 2021-04-15 2022-03-29 Boston Polarimetrics, Inc. Systems and methods for camera exposure control
US12067746B2 (en) 2021-05-07 2024-08-20 Intrinsic Innovation Llc Systems and methods for using computer vision to pick up small objects
US11689813B2 (en) 2021-07-01 2023-06-27 Intrinsic Innovation Llc Systems and methods for high dynamic range imaging using crossed polarizers
CN114858800A (en) * 2022-04-20 2022-08-05 Guangzhou Taihe Concrete Co., Ltd. Visual measurement device and method for slump and spread of concrete
US11856180B1 (en) 2022-06-30 2023-12-26 Apple Inc. Systems and methods for performing temperature-dependent reference image correction for light projectors


Family Cites Families (152)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE2951207A1 (en) * 1978-12-26 1980-07-10 Canon KK Method for the optical production of a spreading plate
US4843568A (en) * 1986-04-11 1989-06-27 Krueger Myron W Real time perception of and response to the actions of an unencumbered participant/user
JPH0615968B2 (en) * 1986-08-11 1994-03-02 Goryo Matsumoto Three-dimensional shape measuring device
US5016173A (en) * 1989-04-13 1991-05-14 Vanguard Imaging Ltd. Apparatus and method for monitoring visually accessible surfaces of the body
US5075562A (en) * 1990-09-20 1991-12-24 Eastman Kodak Company Method and apparatus for absolute Moire distance measurements using a grating printed on or attached to a surface
US5483261A (en) * 1992-02-14 1996-01-09 Itu Research, Inc. Graphical input controller and method with rear screen image detection
EP0559978B1 (en) * 1992-03-12 1998-08-05 International Business Machines Corporation Image processing method
US6041140A (en) * 1994-10-04 2000-03-21 Synthonics, Incorporated Apparatus for interactive image correlation for three dimensional image production
JPH08186845A (en) 1994-12-27 1996-07-16 Nobuaki Yanagisawa Focal distance controlling stereoscopic-vision television receiver
IL114278A (en) * 1995-06-22 2010-06-16 Microsoft International Holdings B.V. Camera and method
CN1196791A (en) * 1995-07-18 1998-10-21 Industrial Technology Research Institute Moiré interferometry system and method with extended image depth
JPH09261535A (en) * 1996-03-25 1997-10-03 Sharp Corp Image pickup device
DE19638727A1 (en) 1996-09-12 1998-03-19 Ruedger Dipl Ing Rubbert Method for increasing the significance of the three-dimensional measurement of objects
JP3402138B2 (en) * 1996-09-27 2003-04-28 Hitachi, Ltd. Liquid crystal display
IL119341A (en) * 1996-10-02 1999-09-22 Univ Ramot Phase-only filter for generating an arbitrary illumination pattern
IL119831A (en) * 1996-12-15 2002-12-01 Cognitens Ltd Apparatus and method for 3d surface geometry reconstruction
EP0946856A1 (en) 1996-12-20 1999-10-06 Pacific Title and Mirage, Inc. Apparatus and method for rapid 3d image parametrization
US5838428A (en) * 1997-02-28 1998-11-17 United States Of America As Represented By The Secretary Of The Navy System and method for high resolution range imaging with split light source and pattern mask
JPH10327433A (en) 1997-05-23 1998-12-08 Minolta Co Ltd Display device for composted image
US6008813A (en) * 1997-08-01 1999-12-28 Mitsubishi Electric Information Technology Center America, Inc. (Ita) Real-time PC based volume rendering system
DE19736169A1 (en) 1997-08-20 1999-04-15 Fhu Hochschule Fuer Technik Method to measure deformation or vibration using electronic speckle pattern interferometry
US6101269A (en) * 1997-12-19 2000-08-08 Lifef/X Networks, Inc. Apparatus and method for rapid 3D image parametrization
DE19815201A1 (en) * 1998-04-04 1999-10-07 Link Johann & Ernst Gmbh & Co Measuring arrangement for detecting dimensions of test specimens, preferably of hollow bodies, in particular of bores in workpieces, and methods for measuring such dimensions
US6731391B1 (en) * 1998-05-13 2004-05-04 The Research Foundation Of State University Of New York Shadow moire surface measurement using Talbot effect
DE19821611A1 (en) * 1998-05-14 1999-11-18 Syrinx Med Tech Gmbh Recording method for spatial structure of three-dimensional surface, e.g. for person recognition
GB2352901A (en) 1999-05-12 2001-02-07 Tricorder Technology Plc Rendering three dimensional representations utilising projected light patterns
US6377700B1 (en) 1998-06-30 2002-04-23 Intel Corporation Method and apparatus for capturing stereoscopic images using image sensors
US6078371A (en) * 1998-10-05 2000-06-20 Canon Kabushiki Kaisha Liquid crystal device and liquid crystal display apparatus
JP3678022B2 (en) 1998-10-23 2005-08-03 Konica Minolta Sensing, Inc. 3D input device
US6084712A (en) * 1998-11-03 2000-07-04 Dynamic Measurement And Inspection, LLC Three dimensional imaging using a refractive optic design
US8965898B2 (en) 1998-11-20 2015-02-24 Intheplay, Inc. Optimizations for live event, real-time, 3D object tracking
GB9828118D0 (en) * 1998-12-21 1999-02-17 Greenagate Limited Trading As Flash unit for 3d photography
DE19903486C2 (en) * 1999-01-29 2003-03-06 Leica Microsystems Method and device for the optical examination of structured surfaces of objects
JP2001166810A (en) * 1999-02-19 2001-06-22 Sanyo Electric Co Ltd Device and method for providing solid model
EP1037069A3 (en) * 1999-03-17 2004-01-14 Matsushita Electric Industrial Co., Ltd. Rangefinder
US6259561B1 (en) * 1999-03-26 2001-07-10 The University Of Rochester Optical system for diffusing light
US6751344B1 (en) * 1999-05-28 2004-06-15 Champion Orthotic Investments, Inc. Enhanced projector system for machine vision
US6512385B1 (en) * 1999-07-26 2003-01-28 Paul Pfaff Method for testing a device under test including the interference of two beams
US6268923B1 (en) * 1999-10-07 2001-07-31 Integral Vision, Inc. Optical method and system for measuring three-dimensional surface topography of an object having a surface contour
JP2001141430A (en) 1999-11-16 2001-05-25 Fuji Photo Film Co Ltd Image pickup device and image processing device
LT4842B (en) * 1999-12-10 2001-09-25 Uab "Geola" Universal digital holographic printer and method
US6301059B1 (en) * 2000-01-07 2001-10-09 Lucent Technologies Inc. Astigmatic compensation for an anamorphic optical system
US6937348B2 (en) * 2000-01-28 2005-08-30 Genex Technologies, Inc. Method and apparatus for generating structural pattern illumination
US6700669B1 (en) * 2000-01-28 2004-03-02 Zheng J. Geng Method and system for three-dimensional imaging using light pattern having multiple sub-patterns
JP4560869B2 (en) * 2000-02-07 2010-10-13 Sony Corporation Glasses-free display system and backlight system
KR100355718B1 (en) * 2000-06-10 2002-10-11 Medison Co., Ltd. System and method for 3-D ultrasound imaging using a steerable probe
US6810135B1 (en) * 2000-06-29 2004-10-26 Trw Inc. Optimized human presence detection through elimination of background interference
TW527518B (en) 2000-07-14 2003-04-11 Massachusetts Inst Technology Method and system for high resolution, ultra fast, 3-D imaging
US7227526B2 (en) * 2000-07-24 2007-06-05 Gesturetek, Inc. Video-based image control system
US6686921B1 (en) * 2000-08-01 2004-02-03 International Business Machines Corporation Method and apparatus for acquiring a set of consistent image maps to represent the color of the surface of an object
US6754370B1 (en) * 2000-08-14 2004-06-22 The Board Of Trustees Of The Leland Stanford Junior University Real-time structured light range scanning of moving scenes
US6639684B1 (en) 2000-09-13 2003-10-28 Nextengine, Inc. Digitizer using intensity gradient to image features of three-dimensional objects
US6765197B2 (en) * 2000-09-27 2004-07-20 Adelphi Technology Inc. Methods of imaging, focusing and conditioning neutrons
US6813440B1 (en) * 2000-10-10 2004-11-02 The Hong Kong Polytechnic University Body scanner
JP3689720B2 (en) 2000-10-16 2005-08-31 Sumitomo Osaka Cement Co., Ltd. 3D shape measuring device
JP2002152776A (en) 2000-11-09 2002-05-24 Nippon Telegraph & Telephone Corp. (NTT) Method and device for encoding and decoding distance image
JP2002191058A (en) * 2000-12-20 2002-07-05 Olympus Optical Co Ltd Three-dimensional image acquisition device and three-dimensional image acquisition method
JP2002213931A (en) 2001-01-17 2002-07-31 Fuji Xerox Co Ltd Instrument and method for measuring three-dimensional shape
US6841780B2 (en) * 2001-01-19 2005-01-11 Honeywell International Inc. Method and apparatus for detecting objects
WO2002084340A1 (en) * 2001-04-10 2002-10-24 President And Fellows Of Harvard College Microlens for projection lithography and method of preparation thereof
JP2002365023A (en) 2001-06-08 2002-12-18 Koji Okamoto Apparatus and method for measurement of liquid level
US6741251B2 (en) * 2001-08-16 2004-05-25 Hewlett-Packard Development Company, L.P. Method and apparatus for varying focus in a scene
US7340077B2 (en) * 2002-02-15 2008-03-04 Canesta, Inc. Gesture recognition system using depth perceptive sensors
US7369685B2 (en) 2002-04-05 2008-05-06 Identix Corporation Vision-based operating method and system
US7811825B2 (en) 2002-04-19 2010-10-12 University Of Washington System and method for processing specimens and images for optical tomography
US7385708B2 (en) 2002-06-07 2008-06-10 The University Of North Carolina At Chapel Hill Methods and systems for laser based real-time structured light depth extraction
US7006709B2 (en) * 2002-06-15 2006-02-28 Microsoft Corporation System and method for deghosting mosaics using multiperspective plane sweep
US6859326B2 (en) * 2002-09-20 2005-02-22 Corning Incorporated Random microlens array for optical beam shaping and homogenization
KR100624405B1 (en) 2002-10-01 2006-09-18 Samsung Electronics Co., Ltd. Substrate for mounting optical component and method for producing the same
US7194105B2 (en) * 2002-10-16 2007-03-20 Hersch Roger D Authentication of documents and articles by moiré patterns
WO2004046645A2 (en) * 2002-11-21 2004-06-03 Solvision Fast 3d height measurement method and system
US20040174770A1 (en) * 2002-11-27 2004-09-09 Rees Frank L. Gauss-Rees parametric ultrawideband system
AU2003208566A1 (en) * 2003-01-08 2004-08-10 Explay Ltd. An image projecting device and method
US7639419B2 (en) * 2003-02-21 2009-12-29 Kla-Tencor Technologies, Inc. Inspection system using small catadioptric objective
US7127101B2 (en) * 2003-03-10 2006-10-24 Cranial Technologies, Inc. Automatic selection of cranial remodeling device trim lines
US20040213463A1 (en) * 2003-04-22 2004-10-28 Morrison Rick Lee Multiplexed, spatially encoded illumination system for determining imaging and range estimation
US7539340B2 (en) 2003-04-25 2009-05-26 Topcon Corporation Apparatus and method for three-dimensional coordinate measurement
CA2435935A1 (en) * 2003-07-24 2005-01-24 Guylain Lemelin Optical 3d digitizer with enlarged non-ambiguity zone
US6934018B2 (en) * 2003-09-10 2005-08-23 Shearographics, Llc Tire inspection apparatus and method
US7187437B2 (en) * 2003-09-10 2007-03-06 Shearographics, Llc Plurality of light sources for inspection apparatus and method
US7874917B2 (en) * 2003-09-15 2011-01-25 Sony Computer Entertainment Inc. Methods and systems for enabling depth and direction detection when interfacing with a computer program
US7112774B2 (en) * 2003-10-09 2006-09-26 Avago Technologies Sensor Ip (Singapore) Pte. Ltd CMOS stereo imaging system and method
JP2005181965A (en) * 2003-11-25 2005-07-07 Ricoh Co Ltd Spatial light modulator, display device, and projection display device
US20050135555A1 (en) 2003-12-23 2005-06-23 Claus Bernhard Erich H. Method and system for simultaneously viewing rendered volumes
US7250949B2 (en) 2003-12-23 2007-07-31 General Electric Company Method and system for visualizing three-dimensional data
US8134637B2 (en) * 2004-01-28 2012-03-13 Microsoft Corporation Method and system to increase X-Y resolution in a depth (Z) camera using red, blue, green (RGB) sensing
US7961909B2 (en) 2006-03-08 2011-06-14 Electronic Scripting Products, Inc. Computer interface employing a manipulated object with absolute pose detection component and a display
US20070165243A1 (en) * 2004-02-09 2007-07-19 Cheol-Gwon Kang Device for measuring 3d shape using irregular pattern and method for the same
US7427981B2 (en) * 2004-04-15 2008-09-23 Avago Technologies General Ip (Singapore) Pte. Ltd. Optical device that measures distance between the device and a surface
US7308112B2 (en) * 2004-05-14 2007-12-11 Honda Motor Co., Ltd. Sign based human-machine interaction
CN101031837B (en) 2004-07-23 2011-06-15 GE Healthcare Niagara Inc. Method and apparatus for fluorescent confocal microscopy
US20060017656A1 (en) * 2004-07-26 2006-01-26 Visteon Global Technologies, Inc. Image intensity control in overland night vision systems
KR101323966B1 (en) 2004-07-30 2013-10-31 Extreme Reality Ltd. A system and method for 3D space-dimension based image processing
US7120228B2 (en) 2004-09-21 2006-10-10 Jordan Valley Applied Radiation Ltd. Combined X-ray reflectometer and diffractometer
JP2006128818A (en) 2004-10-26 2006-05-18 Victor Co Of Japan Ltd Recording program and reproducing program corresponding to stereoscopic video and 3d audio, recording apparatus, reproducing apparatus and recording medium
US7076024B2 (en) * 2004-12-01 2006-07-11 Jordan Valley Applied Radiation, Ltd. X-ray apparatus with dual monochromators
US20060156756A1 (en) * 2005-01-20 2006-07-20 Becke Paul E Phase change and insulating properties container and method of use
US20060221218A1 (en) * 2005-04-05 2006-10-05 Doron Adler Image sensor with improved color filter
US7751063B2 (en) 2005-04-06 2010-07-06 Dimensional Photonics International, Inc. Multiple channel interferometric surface contour measurement system
US7560679B1 (en) * 2005-05-10 2009-07-14 Siimpel, Inc. 3D camera
US8050461B2 (en) 2005-10-11 2011-11-01 Primesense Ltd. Depth-varying light fields for three dimensional sensing
US20110096182A1 (en) * 2009-10-25 2011-04-28 Prime Sense Ltd Error Compensation in Three-Dimensional Mapping
US8400494B2 (en) * 2005-10-11 2013-03-19 Primesense Ltd. Method and system for object reconstruction
US8390821B2 (en) 2005-10-11 2013-03-05 Primesense Ltd. Three-dimensional sensing using speckle patterns
WO2007056711A2 (en) * 2005-11-04 2007-05-18 Clean Earth Technologies, Llc Tracking using an elastic cluster of trackers
US7627161B2 (en) * 2005-11-28 2009-12-01 Fuji Xerox Co., Ltd. Authenticity determination method, apparatus and program
US7856125B2 (en) * 2006-01-31 2010-12-21 University Of Southern California 3D face reconstruction from 2D images
WO2007096893A2 (en) 2006-02-27 2007-08-30 Prime Sense Ltd. Range mapping using speckle decorrelation
EP1994503B1 (en) * 2006-03-14 2017-07-05 Apple Inc. Depth-varying light fields for three dimensional sensing
US7869649B2 (en) 2006-05-08 2011-01-11 Panasonic Corporation Image processing device, image processing method, program, storage medium and integrated circuit
GB2438600B (en) * 2006-05-19 2008-07-09 Exitech Ltd Method for patterning thin films on moving substrates
JP4316668B2 (en) * 2006-05-30 2009-08-19 Panasonic Corporation Pattern projection light source and compound eye distance measuring device
US8488895B2 (en) 2006-05-31 2013-07-16 Indiana University Research And Technology Corp. Laser scanning digital camera with pupil periphery illumination and potential for multiply scattered light imaging
US8139142B2 (en) * 2006-06-01 2012-03-20 Microsoft Corporation Video manipulation of red, green, blue, distance (RGB-Z) data including segmentation, up-sampling, and background substitution techniques
US8411149B2 (en) * 2006-08-03 2013-04-02 Alterface S.A. Method and device for identifying and extracting images of multiple users, and for recognizing user gestures
CN101512601B (en) * 2006-09-04 2013-07-31 Koninklijke Philips Electronics N.V. Method for determining a depth map from images, device for determining a depth map
US7256899B1 (en) * 2006-10-04 2007-08-14 Ivan Faul Wireless methods and systems for three-dimensional non-contact shape sensing
US8090194B2 (en) * 2006-11-21 2012-01-03 Mantis Vision Ltd. 3D geometric modeling and motion capture using both single and dual imaging
US7990545B2 (en) 2006-12-27 2011-08-02 Cambridge Research & Instrumentation, Inc. Surface measurement of in-vivo subjects using spot projector
US7840031B2 (en) 2007-01-12 2010-11-23 International Business Machines Corporation Tracking a range of body movement based on 3D captured image streams of a user
US8350847B2 (en) 2007-01-21 2013-01-08 Primesense Ltd Depth mapping using multi-beam illumination
US20080212835A1 (en) 2007-03-01 2008-09-04 Amon Tavor Object Tracking by 3-Dimensional Modeling
WO2008120217A2 (en) 2007-04-02 2008-10-09 Prime Sense Ltd. Depth mapping using projected patterns
US8150142B2 (en) 2007-04-02 2012-04-03 Prime Sense Ltd. Depth mapping using projected patterns
US8488868B2 (en) 2007-04-03 2013-07-16 Her Majesty The Queen In Right Of Canada, As Represented By The Minister Of Industry, Through The Communications Research Centre Canada Generation of a depth map from a monoscopic color image for rendering stereoscopic still and video images
US7835561B2 (en) 2007-05-18 2010-11-16 Visiongate, Inc. Method for image processing and reconstruction of images for optical tomography
US8494252B2 (en) * 2007-06-19 2013-07-23 Primesense Ltd. Depth mapping using optical elements having non-uniform focal characteristics
EP2168096A1 (en) * 2007-07-12 2010-03-31 Thomson Licensing System and method for three-dimensional object reconstruction from two-dimensional images
US20090060307A1 (en) * 2007-08-27 2009-03-05 Siemens Medical Solutions Usa, Inc. Tensor Voting System and Method
DE102007045332B4 (en) 2007-09-17 2019-01-17 Seereal Technologies S.A. Holographic display for reconstructing a scene
US8166421B2 (en) 2008-01-14 2012-04-24 Primesense Ltd. Three-dimensional user interface
US8176497B2 (en) 2008-01-16 2012-05-08 Dell Products, Lp Method to dynamically provision additional computer resources to handle peak database workloads
JP5588353B2 (en) 2008-01-21 2014-09-10 Primesense Ltd. Optical design for zero order reduction
DE102008011350A1 (en) 2008-02-27 2009-09-03 Loeffler Technology Gmbh Apparatus and method for real-time detection of electromagnetic THz radiation
US8121351B2 (en) * 2008-03-09 2012-02-21 Microsoft International Holdings B.V. Identification of objects in a 3D video using non/over reflective clothing
US8456517B2 (en) 2008-07-09 2013-06-04 Primesense Ltd. Integrated processor for 3D mapping
US8462207B2 (en) * 2009-02-12 2013-06-11 Primesense Ltd. Depth ranging with Moiré patterns
US8786682B2 (en) * 2009-03-05 2014-07-22 Primesense Ltd. Reference image techniques for three-dimensional sensing
US8717417B2 (en) * 2009-04-16 2014-05-06 Primesense Ltd. Three-dimensional mapping and imaging
US8503720B2 (en) * 2009-05-01 2013-08-06 Microsoft Corporation Human body pose estimation
US8744121B2 (en) 2009-05-29 2014-06-03 Microsoft Corporation Device for identifying and tracking multiple humans over time
EP2275990B1 (en) * 2009-07-06 2012-09-26 Sick Ag 3D sensor
WO2011013079A1 (en) * 2009-07-30 2011-02-03 Primesense Ltd. Depth mapping based on pattern matching and stereoscopic information
WO2011031538A2 (en) * 2009-08-27 2011-03-17 California Institute Of Technology Accurate 3d object reconstruction using a handheld device with a projected light pattern
US8830227B2 (en) * 2009-12-06 2014-09-09 Primesense Ltd. Depth-based gain control
US8320621B2 (en) * 2009-12-21 2012-11-27 Microsoft Corporation Depth projector system with integrated VCSEL array
US8982182B2 (en) 2010-03-01 2015-03-17 Apple Inc. Non-uniform spatial resource allocation for depth mapping
US8654152B2 (en) 2010-06-21 2014-02-18 Microsoft Corporation Compartmentalizing focus area within field of view
US20140081459A1 (en) * 2012-09-20 2014-03-20 Marc Dubois Depth mapping vision system with 2d optical pattern for robotic applications
US20140268879A1 (en) * 2013-03-14 2014-09-18 Panasonic Corporation Transparent waveguide diffuser for lighting and methods of manufacturing transparent waveguide diffuser
US20140320605A1 (en) * 2013-04-25 2014-10-30 Philip Martin Johnson Compound structured light projection system for 3-D surface profiling
US20160219266A1 (en) * 2015-01-25 2016-07-28 3dMD Technologies Ltd Anatomical imaging system for product customization and methods of use thereof

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4542376A (en) * 1983-11-03 1985-09-17 Burroughs Corporation System for electronically displaying portions of several different images on a CRT screen through respective prioritized viewports
US5636025A (en) * 1992-04-23 1997-06-03 Medar, Inc. System for optically measuring the surface contour of a part using moiré fringe techniques
US5856871A (en) * 1993-08-18 1999-01-05 Applied Spectral Imaging Ltd. Film thickness mapping using interferometric spectral imaging
US5630043A (en) * 1995-05-11 1997-05-13 Cirrus Logic, Inc. Animated texture map apparatus and method for 3-D image displays
US6825985B2 (en) * 2001-07-13 2004-11-30 Mems Optical, Inc. Autostereoscopic display with rotated microlens and method of displaying multidimensional images, especially color images
US20070057946A1 (en) * 2003-07-24 2007-03-15 Dan Albeck Method and system for the three-dimensional surface reconstruction of an object

Cited By (108)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8390821B2 (en) 2005-10-11 2013-03-05 Primesense Ltd. Three-dimensional sensing using speckle patterns
US8400494B2 (en) 2005-10-11 2013-03-19 Primesense Ltd. Method and system for object reconstruction
US8050461B2 (en) 2005-10-11 2011-11-01 Primesense Ltd. Depth-varying light fields for three dimensional sensing
US8249334B2 (en) 2006-05-11 2012-08-21 Primesense Ltd. Modeling of humanoid forms from depth maps
US8350847B2 (en) 2007-01-21 2013-01-08 Primesense Ltd Depth mapping using multi-beam illumination
US8493496B2 (en) 2007-04-02 2013-07-23 Primesense Ltd. Depth mapping using projected patterns
US8150142B2 (en) 2007-04-02 2012-04-03 Prime Sense Ltd. Depth mapping using projected patterns
US8494252B2 (en) 2007-06-19 2013-07-23 Primesense Ltd. Depth mapping using optical elements having non-uniform focal characteristics
US9035876B2 (en) 2008-01-14 2015-05-19 Apple Inc. Three-dimensional user interface session control
US8166421B2 (en) 2008-01-14 2012-04-24 Primesense Ltd. Three-dimensional user interface
US8384997B2 (en) 2008-01-21 2013-02-26 Primesense Ltd Optical pattern projection
US9239467B2 (en) 2008-01-21 2016-01-19 Apple Inc. Optical pattern projection
US8630039B2 (en) 2008-01-21 2014-01-14 Primesense Ltd. Optical designs for zero order reduction
US8456517B2 (en) 2008-07-09 2013-06-04 Primesense Ltd. Integrated processor for 3D mapping
US8462207B2 (en) 2009-02-12 2013-06-11 Primesense Ltd. Depth ranging with Moiré patterns
US8786682B2 (en) 2009-03-05 2014-07-22 Primesense Ltd. Reference image techniques for three-dimensional sensing
US8717417B2 (en) 2009-04-16 2014-05-06 Primesense Ltd. Three-dimensional mapping and imaging
US9582889B2 (en) 2009-07-30 2017-02-28 Apple Inc. Depth mapping based on pattern matching and stereoscopic information
US8565479B2 (en) 2009-08-13 2013-10-22 Primesense Ltd. Extraction of skeletons from 3D maps
US8492696B2 (en) 2009-11-15 2013-07-23 Primesense Ltd. Optical projector with beam monitor including mapping apparatus capturing image of pattern projected onto an object
US8830227B2 (en) 2009-12-06 2014-09-09 Primesense Ltd. Depth-based gain control
EP2363686A1 (en) 2010-02-02 2011-09-07 Primesense Ltd. Optical apparatus, an imaging system and a method for producing a photonics module
US9736459B2 (en) 2010-02-02 2017-08-15 Apple Inc. Generation of patterned radiation
US8787663B2 (en) 2010-03-01 2014-07-22 Primesense Ltd. Tracking body parts by combined color image and depth processing
EP2364013A3 (en) * 2010-03-04 2014-01-29 Sony Corporation Information processing apparatus, method and program for imaging device
US9516206B2 (en) 2010-03-04 2016-12-06 Sony Corporation Information processing apparatus, information processing method, and program
CN102196220A (en) * 2010-03-04 2011-09-21 Sony Corporation Information processing apparatus, information processing method and program
US11190678B2 (en) 2010-03-04 2021-11-30 Sony Corporation Information processing apparatus, information processing method, and program
US9049376B2 (en) 2010-03-04 2015-06-02 Sony Corporation Information processing apparatus, information processing method, and program
US10659681B2 (en) 2010-03-04 2020-05-19 Sony Corporation Information processing apparatus, information processing method, and program
US10015392B2 (en) 2010-03-04 2018-07-03 Sony Corporation Information processing apparatus, information processing method, and program
CN102196220B (en) * 2010-03-04 2016-05-18 Sony Corporation Information processing apparatus and information processing method
US10306136B2 (en) 2010-03-04 2019-05-28 Sony Corporation Information processing apparatus, information processing method, and program
US8594425B2 (en) 2010-05-31 2013-11-26 Primesense Ltd. Analysis of three-dimensional scenes
US8824737B2 (en) 2010-05-31 2014-09-02 Primesense Ltd. Identifying components of a humanoid form in three-dimensional scenes
US8781217B2 (en) 2010-05-31 2014-07-15 Primesense Ltd. Analysis of three-dimensional scenes with a surface model
US9098931B2 (en) 2010-08-11 2015-08-04 Apple Inc. Scanning projectors and image capture modules for 3D mapping
US9036158B2 (en) 2010-08-11 2015-05-19 Apple Inc. Pattern projector
US9348111B2 (en) 2010-08-24 2016-05-24 Apple Inc. Automatic detection of lens deviations
US8582867B2 (en) 2010-09-16 2013-11-12 Primesense Ltd Learning-based pose estimation from depth maps
US8959013B2 (en) 2010-09-27 2015-02-17 Apple Inc. Virtual keyboard for a non-tactile three dimensional user interface
US9746319B2 (en) 2010-10-05 2017-08-29 Empire Technology Development Llc Generation of depth data based on spatial light pattern
US9030466B2 (en) 2010-10-05 2015-05-12 Empire Technology Development Llc Generation of depth data based on spatial light pattern
EP2643659A4 (en) * 2010-11-19 2017-03-29 Apple Inc. Depth mapping using time-coded illumination
US9066087B2 (en) 2010-11-19 2015-06-23 Apple Inc. Depth mapping using time-coded illumination
EP2643659A1 (en) * 2010-11-19 2013-10-02 Primesense Ltd. Depth mapping using time-coded illumination
US9131136B2 (en) 2010-12-06 2015-09-08 Apple Inc. Lens arrays for pattern projection and imaging
US9167138B2 (en) 2010-12-06 2015-10-20 Apple Inc. Pattern projection and imaging using lens arrays
US8872762B2 (en) 2010-12-08 2014-10-28 Primesense Ltd. Three dimensional user interface cursor control
US8933876B2 (en) 2010-12-13 2015-01-13 Apple Inc. Three dimensional user interface session control
US9888225B2 (en) 2011-02-04 2018-02-06 Koninklijke Philips N.V. Method of recording an image and obtaining 3D information from the image, and camera system
US10469825B2 (en) 2011-02-04 2019-11-05 Koninklijke Philips N.V. Image recording and 3D information acquisition
EP3527121A1 (en) 2011-02-09 2019-08-21 Apple Inc. Gesture detection in a 3d mapping environment
WO2012174406A1 (en) * 2011-06-15 2012-12-20 University Of Washington Methods and systems for haptic rendering and creating virtual fixtures from point clouds
US9471142B2 (en) 2011-06-15 2016-10-18 The University Of Washington Methods and systems for haptic rendering and creating virtual fixtures from point clouds
US8881051B2 (en) 2011-07-05 2014-11-04 Primesense Ltd Zoom-based gesture user interface
US8908277B2 (en) 2011-08-09 2014-12-09 Apple Inc Lens array projector
US8749796B2 (en) 2011-08-09 2014-06-10 Primesense Ltd. Projectors of structured light
US9030498B2 (en) 2011-08-15 2015-05-12 Apple Inc. Combining explicit select gestures and timeclick in a non-tactile three dimensional user interface
US9218063B2 (en) 2011-08-24 2015-12-22 Apple Inc. Sessionless pointing user interface
US9122311B2 (en) 2011-08-24 2015-09-01 Apple Inc. Visual feedback for tactile and non-tactile user interfaces
US9002099B2 (en) 2011-09-11 2015-04-07 Apple Inc. Learning-based estimation of hand and finger pose
WO2013038089A1 (en) 2011-09-16 2013-03-21 Prynel Method and system for acquiring and processing images for the detection of motion
US9595156B2 (en) 2012-01-23 2017-03-14 Novomatic Ag Prize wheel with gesture-based control
US9201237B2 (en) 2012-03-22 2015-12-01 Apple Inc. Diffraction-based sensing of mirror position
US9047507B2 (en) 2012-05-02 2015-06-02 Apple Inc. Upper-body skeleton extraction from depth maps
US9019267B2 (en) 2012-10-30 2015-04-28 Apple Inc. Depth mapping with enhanced resolution
US9152234B2 (en) 2012-12-02 2015-10-06 Apple Inc. Detecting user intent to remove a pluggable peripheral device
US10441176B2 (en) 2012-12-31 2019-10-15 Omni Medsci, Inc. Imaging using near-infrared laser diodes with distributed Bragg reflectors
US10517484B2 (en) 2012-12-31 2019-12-31 Omni Medsci, Inc. Semiconductor diodes-based physiological measurement device with improved signal-to-noise ratio
US9477307B2 (en) 2013-01-24 2016-10-25 The University Of Washington Methods and systems for six degree-of-freedom haptic interaction with streaming point data
US9753542B2 (en) 2013-01-24 2017-09-05 University Of Washington Through Its Center For Commercialization Methods and systems for six-degree-of-freedom haptic interaction with streaming point data
CN103164995A (en) * 2013-04-03 2013-06-19 Hunan First Normal University Somatosensory interactive learning system and method for children
US9825425B2 (en) 2013-06-19 2017-11-21 Apple Inc. Integrated structured-light projector comprising light-emitting elements on a substrate
WO2015038443A1 (en) * 2013-09-11 2015-03-19 Microsoft Corporation Optical modules for use with depth cameras
US9443310B2 (en) 2013-10-09 2016-09-13 Microsoft Technology Licensing, Llc Illumination modules that emit structured light
US9528906B1 (en) 2013-12-19 2016-12-27 Apple Inc. Monitoring DOE performance using total internal reflection
WO2015124780A1 (en) 2014-02-21 2015-08-27 Ipo.Plan Gmbh Device for sensing a three-dimensional object
US10226869B2 (en) 2014-03-03 2019-03-12 University Of Washington Haptic virtual fixture tools
US10349037B2 (en) 2014-04-03 2019-07-09 Ams Sensors Singapore Pte. Ltd. Structured-stereo imaging assembly including separate imagers for different wavelengths
EP3598063A1 (en) 2014-07-28 2020-01-22 Apple Inc. Overlapping pattern projector
WO2016018550A1 (en) 2014-07-28 2016-02-04 Apple Inc. Overlapping pattern projector
CN107206451A (en) * 2014-12-12 2017-09-26 Uroš Turanjanin Application of 3D cameras in the shape bending process on bending machines with three and four rollers
CN107206451B (en) * 2014-12-12 2019-09-03 Uroš Turanjanin Application of 3D cameras in the shape bending process on bending machines with three and four rollers
US9525863B2 (en) 2015-04-29 2016-12-20 Apple Inc. Time-of-flight depth mapping with flexible scan pattern
US10012831B2 (en) 2015-08-03 2018-07-03 Apple Inc. Optical monitoring of scan parameters
US10353215B2 (en) 2015-10-21 2019-07-16 Princeton Optronics, Inc. Generation of coded structured light patterns using VCSEL arrays
US9946089B2 (en) 2015-10-21 2018-04-17 Princeton Optronics, Inc. Generation of coded structured light patterns using VCSEL arrays
US10043279B1 (en) 2015-12-07 2018-08-07 Apple Inc. Robust detection and classification of body parts in a depth map
CN109477710B (en) * 2016-07-27 2021-01-29 Microsoft Technology Licensing, LLC Reflectance map estimation for point-based structured light systems
CN109477710A (en) * 2016-07-27 2019-03-15 Microsoft Technology Licensing, LLC Reflectance map estimation for point-based structured light systems
US10073004B2 (en) 2016-09-19 2018-09-11 Apple Inc. DOE defect monitoring utilizing total internal reflection
US10366278B2 (en) 2016-09-20 2019-07-30 Apple Inc. Curvature-based face detector
CN110770536A (en) * 2017-07-06 2020-02-07 Car-O-Liner Group AB Method for determining wheel alignment parameters
EP3425327A1 (en) * 2017-07-06 2019-01-09 Car-O-Liner Group AB A method for determining wheel alignment parameters
CN110770536B (en) * 2017-07-06 2021-10-29 Car-O-Liner Group AB Method for determining wheel alignment parameters
WO2019007813A1 (en) * 2017-07-06 2019-01-10 Car-O-Liner Group AB A method for determining wheel alignment parameters
US10153614B1 (en) 2017-08-31 2018-12-11 Apple Inc. Creating arbitrary patterns on a 2-D uniform grid VCSEL array
US10310281B1 (en) 2017-12-05 2019-06-04 K Laser Technology, Inc. Optical projector with off-axis diffractive element
US10545457B2 (en) 2017-12-05 2020-01-28 K Laser Technology, Inc. Optical projector with off-axis diffractive element and conjugate images
US10317684B1 (en) 2018-01-24 2019-06-11 K Laser Technology, Inc. Optical projector with on-axis hologram and multiple beam splitter
US11422292B1 (en) 2018-06-10 2022-08-23 Apple Inc. Super-blazed diffractive optical elements with sub-wavelength structures
WO2020103165A1 (en) * 2018-11-24 2020-05-28 Shenzhen Fushi Technology Co., Ltd. Light source structure, optical projection module, sensing apparatus, and device
WO2020103166A1 (en) * 2018-11-24 2020-05-28 Shenzhen Fushi Technology Co., Ltd. Light source structure, optical projection module and sensing device and apparatus
US11681019B2 (en) 2019-09-18 2023-06-20 Apple Inc. Optical module with stray light baffle
US11506762B1 (en) 2019-09-24 2022-11-22 Apple Inc. Optical module comprising an optical waveguide with reference light path
US11754767B1 (en) 2020-03-05 2023-09-12 Apple Inc. Display with overlaid waveguide
US12111421B2 (en) 2021-03-17 2024-10-08 Apple Inc. Waveguide-based transmitters with adjustable lighting

Also Published As

Publication number Publication date
TW200847061A (en) 2008-12-01
US9885459B2 (en) 2018-02-06
US8493496B2 (en) 2013-07-23
US20100118123A1 (en) 2010-05-13
TWI433052B (en) 2014-04-01
WO2008120217A3 (en) 2010-02-25
US20180180248A1 (en) 2018-06-28
US20130294089A1 (en) 2013-11-07
US10514148B2 (en) 2019-12-24

Similar Documents

Publication Publication Date Title
US10514148B2 (en) Pattern projection using microlenses
US8150142B2 (en) Depth mapping using projected patterns
US8761495B2 (en) Distance-varying illumination and imaging techniques for depth mapping
US11310479B2 (en) Non-uniform spatial resource allocation for depth mapping
KR101408959B1 (en) Depth-varying light fields for three dimensional sensing
US8374397B2 (en) Depth-varying light fields for three dimensional sensing
US9392262B2 (en) System and method for 3D reconstruction using multiple multi-channel cameras
US7433024B2 (en) Range mapping using speckle decorrelation
KR101974651B1 (en) Measuring method of 3d image depth and a system for measuring 3d image depth using boundary inheritance based hierarchical orthogonal coding
US8538166B2 (en) 3D geometric modeling and 3D video content creation
US20100020078A1 (en) Depth mapping using multi-beam illumination
US20070263903A1 (en) Enhancing stereo depth measurements with projected texture
US20040105580A1 (en) Acquisition of three-dimensional images by an active stereo technique using locally unique patterns
CN111971525B (en) Method and system for measuring an object with a stereoscope
JP2024055569A (en) Three-dimensional measurement device, three-dimensional measurement method, program, system, and method for manufacturing article

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
Ref document number: 08720068; Country of ref document: EP; Kind code of ref document: A2

NENP Non-entry into the national phase
Ref country code: DE

WWE Wipo information: entry into national phase
Ref document number: 12522171; Country of ref document: US

122 Ep: pct application non-entry in european phase
Ref document number: 08720068; Country of ref document: EP; Kind code of ref document: A2