JP5174684B2 - 3D detection using speckle patterns

Info

Publication number
JP5174684B2
Authority
JP
Japan
Prior art keywords
image
subject
speckle pattern
device
light source
Prior art date
Legal status
Active
Application number
JP2008558981A
Other languages
Japanese (ja)
Other versions
JP2009531655A (en)
Inventor
Shpunt, Alexander
Zalevsky, Zeev
Original Assignee
PrimeSense Ltd.
Priority date
Filing date
Publication date
Priority to PCT/IL2006/000335 (WO2007043036A1)
Priority to US 60/785,187
Application filed by PrimeSense Ltd.
Priority to PCT/IL2007/000306 (WO2007105205A2)
Publication of JP2009531655A
Application granted
Publication of JP5174684B2
Application status: Active

Classifications

    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS, OR APPARATUS
    • G02B5/00 Optical elements other than lenses
    • G02B5/18 Diffraction gratings
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00 Measuring arrangements characterised by the use of optical means
    • G01B11/24 Measuring arrangements characterised by the use of optical means for measuring contours or curvatures
    • G01B11/25 Measuring arrangements characterised by the use of optical means for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06K RECOGNITION OF DATA; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K9/00 Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints
    • G06K9/00624 Recognising scenes, i.e. recognition of a whole field of perception; recognising scene-specific objects
    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/50 Depth or shape recovery
    • G06T7/521 Depth or shape recovery from laser ranging, e.g. using interferometry; from the projection of structured light
    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/50 Depth or shape recovery
    • G06T7/55 Depth or shape recovery from multiple images
    • G06T7/557 Depth or shape recovery from multiple images from light fields, e.g. from plenoptic cameras

Description

(Cross-reference to related applications)
This application claims the benefit of US Provisional Patent Application 60/785,187, filed March 24, 2006. It is also a continuation-in-part of PCT patent application PCT/IL2006/000335, filed March 14, 2006, which claims the benefit of US Provisional Patent Application 60/724,903, filed October 11, 2005. All of these related applications are assigned to the assignee of the present patent application, and their disclosures are incorporated herein by reference.

(Technical field)
The present invention relates generally to methods and systems for mapping three-dimensional (3D) objects, and more particularly to three-dimensional optical imaging using speckle patterns.

When a coherent beam of light passes through a diffuser and is projected onto a surface, a primary speckle pattern can be observed on that surface. The primary speckle is caused by interference among different components of the diffused beam. The term "primary speckle" is used in this sense in the present patent application and in the claims, as distinguished from secondary speckle, which is caused by diffuse reflection of coherent light from the rough surface of a subject.

Hart describes the use of a speckle pattern in a high-speed three-dimensional imaging system in Taiwan patent TW527528B and in US patent application 09/616,606 (Patent Documents 1 and 2 below). The system includes a single-lens camera subsystem with an active imaging element and a CCD element, together with a correlation processing subsystem. The active imaging element may be a rotating aperture, which allows adjustable non-equilateral spacing between defocused images, for greater depth of field and sub-pixel displacement accuracy. The speckle pattern is projected onto a subject, and images of the resulting pattern are acquired from multiple angles. The images are locally cross-correlated using image correlation techniques, and the surface is resolved by using relative camera position information to calculate the three-dimensional coordinates of each locally correlated region.

Hunter et al., in US Pat. No. 6,101,269 (Patent Document 3 below), whose disclosure is incorporated herein by reference, describe another speckle-based three-dimensional imaging technique: a random speckle pattern is projected onto a three-dimensional surface and imaged by a plurality of cameras to obtain a plurality of two-dimensional digital images, and the two-dimensional images are processed to obtain a three-dimensional characterization of the surface.
Patent Document 1: Taiwan patent TW527528B
Patent Document 2: US patent application 09/616,606
Patent Document 3: US Pat. No. 6,101,269

(Summary of the Invention)
Embodiments of the present invention perform accurate, real-time mapping of a three-dimensional subject using a primary speckle pattern. In the methods and systems described in the above-mentioned PCT patent application and in the embodiments below, such three-dimensional mapping can be performed using a single coherent light source and a single image sensor that is held stationary at a fixed angle relative to the light source.

In one aspect of the present invention, a reference image of the speckle pattern is first acquired on a reference surface of known contour. An image of the speckle pattern projected onto the subject is then acquired, and the three-dimensional contour of the subject is determined by comparing this image with the reference image.

In another aspect of the present invention, successive images of the speckle pattern on the subject are acquired as the subject moves. Each image is compared with one or more of the preceding images in order to track the movement of the subject in three dimensions. In one embodiment described below, the light source and the image sensor are arranged along a single straight line, so that rapid and accurate motion tracking can be performed by computing one-dimensional correlation coefficients between successive images.

In some embodiments, novel illumination and image-processing configurations are used to increase the accuracy, depth of field, and computational speed of the three-dimensional mapping system.

Thus, according to one embodiment of the present invention, there is provided apparatus for three-dimensional mapping of a subject, comprising: an illumination device having a coherent light source and a diffuser arranged to project a primary speckle pattern onto the subject; a single image acquisition device arranged to acquire images of the primary speckle pattern on the subject from a single, fixed position and angle relative to the illumination device; and a processor connected to process the images of the primary speckle pattern acquired at the single, fixed angle so as to derive a three-dimensional map of the subject.

In some embodiments, the apparatus has a base to which the illumination device and the image acquisition device are attached so as to hold the image acquisition device in a fixed spatial relation to the illumination device. In one embodiment, the image acquisition device includes a detector array arranged in a rectilinear pattern defining first and second mutually orthogonal axes, and objective optics having an entrance pupil and arranged to focus the image onto the array. The illumination device and the image acquisition device are arranged on the base so as to define a device axis that is parallel to the first axis and passes through the entrance pupil and through the spot at which the beam produced by the coherent light source passes through the diffuser. The processor is then arranged to derive the three-dimensional map by finding shifts between the primary speckle pattern acquired in one or more of the images and a reference image of the primary speckle pattern along the first axis only.

In some embodiments, the processor is arranged to derive the three-dimensional map by finding respective shifts between the primary speckle pattern in a plurality of regions of the subject, acquired in one or more of the images, and a reference image of the primary speckle pattern, each shift being indicative of the respective distance between the region and the image acquisition device. Typically, the image acquisition device is spaced apart from the illumination device by a predetermined spacing, and the shifts are proportional to the distances at a rate determined by this spacing. In a disclosed embodiment, the primary speckle pattern projected by the illumination device comprises speckles having a characteristic dimension; the dimension of the speckles in the image varies across the image within a tolerance that depends on the spacing, and the spacing is selected so that the tolerance is within a predetermined range.

Additionally or alternatively, the processor is arranged to relate the respective shifts to respective coordinates of the three-dimensional map using a parametric model of distortion in the image acquisition device. Additionally or alternatively, the processor is arranged to find the respective shifts by finding an initial match between the primary speckle pattern in a first region of the subject and a corresponding region of the reference image at a first shift, and applying a region-growing procedure, based on the first shift, to find the shifts of pixels adjacent to the first region.

In certain disclosed embodiments, the processor is arranged to process a succession of images acquired while the subject is moving so as to map the three-dimensional movement of the subject. Where the subject is a part of a human body, the three-dimensional movement may be a gesture made by that body part, and the processor is connected to provide an input to a computer application responsively to the gesture.

In some embodiments, the illumination device comprises a beam former, arranged to reduce the variation of the contrast of the speckle pattern created by the diffuser over the detection volume of the apparatus. In one embodiment, the beam former comprises a diffractive optical element (DOE) and a lens arranged to define a Fourier plane of the diffuser, the DOE being positioned in the Fourier plane. The beam former may be arranged to reduce the divergence of the light emitted from the diffuser, or to equalize the intensity of the light emitted from the diffuser across a plane transverse to the optical axis of the illumination device.

In some embodiments, the processor comprises an optical correlator, which includes a diffractive optical element (DOE) containing a reference speckle pattern, and the image acquisition device comprises a lenslet array arranged to project a plurality of sub-images of the subject onto the DOE so as to generate respective correlation peaks indicative of the three-dimensional coordinates of the subject.

In some embodiments, the coherence length of the coherent light source is less than 1 cm. Additionally or alternatively, the primary speckle pattern comprises speckles having a characteristic dimension, and the illumination device is configured so that the characteristic dimension of the speckles can be adjusted by changing the distance between the coherent light source and the diffuser.

There is also provided, in accordance with an embodiment of the present invention, a method for three-dimensional mapping of a subject, comprising: illuminating the subject with a beam of coherent light diffused from a light source so as to project a primary speckle pattern onto the subject; acquiring images of the primary speckle pattern on the subject from a single, fixed position and angle relative to the light source; and processing the images of the primary speckle pattern acquired at the single, fixed angle so as to derive a three-dimensional map of the subject.

There is additionally provided, in accordance with an embodiment of the present invention, apparatus for three-dimensional mapping of a subject, comprising: an illumination device comprising a coherent light source having a coherence length of less than 1 cm and a diffuser, which are arranged to project a primary speckle pattern onto the subject; an image acquisition device arranged to acquire images of the primary speckle pattern on the subject; and a processor connected to process the images of the primary speckle pattern so as to derive a three-dimensional map of the subject.

In certain embodiments, the coherence length of the coherent light source is less than 0.5 mm. Additionally or alternatively, the divergence of the coherent light source is greater than 5°.

The present invention will be more fully understood from the following detailed description of embodiments thereof, taken together with the drawings.

(Brief description of the drawings)
FIG. 1 is a schematic diagram illustrating a three-dimensional mapping system according to an embodiment of the present invention.
FIG. 2 is a schematic top view of a speckle imaging device according to an embodiment of the present invention.
FIG. 3 is a flowchart schematically illustrating one method of three-dimensional mapping according to an embodiment of the present invention.
FIG. 4 is a schematic side view of an illumination device used in a three-dimensional mapping system according to another embodiment of the present invention.
FIG. 5 is a schematic side view of a beam former according to an embodiment of the present invention.
FIG. 6 is a schematic side view of a beam former according to yet another embodiment of the present invention.
FIG. 7 is a schematic side view of an optical correlator used in a three-dimensional mapping system, according to yet another embodiment of the present invention.

FIG. 1 is a schematic illustration of a three-dimensional mapping system 20 according to an embodiment of the present invention. The system 20 comprises a speckle imaging device 22, which generates a primary speckle pattern, projects it onto a subject 28, and acquires images of the primary speckle pattern appearing on the subject. The design and operation of the device 22 are shown in the following figures and are described in detail with reference thereto.

An image processor 24 processes the image data generated by the device 22 so as to derive a three-dimensional map of the subject 28. The term "three-dimensional map", as used in the present patent application and in the claims, refers to a set of three-dimensional coordinates representing the surface of a subject. The derivation of such a map from image data may also be referred to as "three-dimensional reconstruction". The image processor 24 that performs this reconstruction may comprise a general-purpose computer processor, programmed in software to carry out the functions described below. This software may be downloaded to the processor 24 in electronic form, over a network for example, or it may be provided on tangible media such as optical, magnetic, or electronic memory media. Alternatively or additionally, some or all of the functions of the image processor may be implemented in dedicated hardware, such as a custom or semi-custom integrated circuit or a programmable digital signal processor (DSP). Although the processor 24 is shown in FIG. 1, by way of example, as a unit separate from the imaging device 22, some or all of its processing functions may be performed by suitable dedicated circuitry within the housing of the imaging device or otherwise integrated with the imaging device.

The three-dimensional map generated by the processor 24 can be used for a wide range of purposes. For example, the map can be sent to an output device, such as a display 26, that shows a pseudo-3D image of the subject. In the example of FIG. 1, the subject 28 comprises all or part of the body of a person (for example, a hand). In this case, the system 20 may be used to provide a gesture-based user interface, in which movements of the user detected by the device 22 control an interactive computer application, such as a game, in place of tactile interface elements such as a mouse, joystick, or other accessory. Alternatively, the system 20 may be used to generate three-dimensional maps of other types of subjects, for substantially any application in which three-dimensional coordinate profiles are needed.

FIG. 2 is a schematic top view of the device 22, in accordance with an embodiment of the invention. An illumination device 30 comprises a coherent light source 32, typically a laser, and a diffuser 33. (The term "light", in the context of the present patent application, refers to any kind of optical radiation, including infrared and ultraviolet as well as visible light.) The beam of light emitted by the light source 32 passes through the diffuser 33 at a spot 34 of radius w_0 and generates a diverging beam 36. As described in the above-mentioned PCT patent application PCT/IL2006/000335, the primary speckle patterns created by the diffuser 33 at distances Z_obj1 and Z_obj2 are, to a good approximation, linearly scaled versions of one another, provided the two distances lie within a common axial range. The on-axis dimension ΔZ of the speckle pattern at the subject distance Z_obj is
ΔZ = (Z_obj / w_0)² · λ

An image acquisition device 38 acquires images of the speckle pattern projected onto the subject 28. The device 38 comprises objective optics 39, which focus the image onto an image sensor 40. Typically, the sensor 40 comprises a rectilinear array of detector elements 41, such as a CCD- or CMOS-based image sensor array. The objective optics 39 have an entrance pupil 42, which, together with the dimensions of the image sensor, defines a field of view 44 of the image acquisition device. The detection region of the device 22 is the region 46 of overlap between the beam 36 and the field of view 44.

The characteristic transverse dimension of the speckles projected by the illumination device 30 at a distance Z_obj (defined by the secondary statistics of the speckle pattern) is
ΔX = (Z_obj / w_0) · λ
The inventors have found that, for optimal performance of the image processing, the speckles imaged onto the sensor 40 should span one to ten pixels, depending on range and resolution requirements; that is, each speckle imaged onto the sensor 40 by the optics 39 should extend across one to ten detector elements 41 in the transverse direction. In typical applications, a speckle size of two to three pixels gives good results.

From the above formula for ΔX, it can be seen that the speckle dimension can be adjusted by changing the distance between the light source 32 and the diffuser 33, since the radius w_0 of the spot 34 grows with distance from the light source. In this way, the speckle magnification of the illumination device 30 can be controlled simply by translating the light source, without lenses or other optics. The illumination device 30 can thus be adjusted for use with image sensors of different dimensions and resolutions, and with objective optics of different magnifications. Given the small speckle size dictated by the factors above, a low-cost light source with large divergence (5° or more) and short coherence length (less than 1 cm, and possibly even less than 0.5 mm), such as a laser diode, can be used in the system 20.
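By way of illustration, the two speckle-dimension formulas above can be evaluated numerically. The following sketch uses hypothetical parameter values, chosen only to show the orders of magnitude involved; none of these numbers are design values taken from the patent:

```python
# Hypothetical operating parameters (illustrative only).
lam = 850e-9   # wavelength of light source 32 [m], e.g. a near-IR laser diode
w0 = 0.5e-3    # radius of spot 34 on diffuser 33 [m]
z_obj = 1.0    # subject distance Z_obj [m]

dx = (z_obj / w0) * lam          # transverse speckle size: (Z_obj/w_0) * lambda
dz = (z_obj / w0) ** 2 * lam     # axial speckle extent:    (Z_obj/w_0)^2 * lambda
print(f"transverse speckle size at subject: {dx * 1e3:.2f} mm")  # ~1.70 mm
print(f"axial speckle extent:               {dz:.2f} m")         # ~3.40 m

# The optics 39 demagnify dx onto sensor 40; the text calls for roughly
# 1-10 (typically 2-3) detector elements per speckle.
pixel_pitch = 6e-6      # hypothetical detector element pitch [m]
focal_length = 6e-3     # hypothetical focal length of optics 39 [m]
dx_on_sensor = dx * focal_length / z_obj
print(f"speckle size on sensor: {dx_on_sensor / pixel_pitch:.1f} pixels")  # ~1.7
```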

The illumination device 30 and the image acquisition device 38 are held in fixed positions by a base 43. In the embodiment shown in FIG. 2, the base comprises a housing that holds these devices. Alternatively, any other suitable mechanical base may be used to maintain the desired positional relationship between the illumination device and the image acquisition device. The configuration of the device 22 and the processing techniques described below make it possible to perform three-dimensional mapping using a single image acquisition device, with no relative movement between the illumination device and the image acquisition device and with no moving parts. The image acquisition device 38 thus acquires images at a single, fixed angle relative to the illumination device 30.

To simplify the computation of the three-dimensional map, and of map changes due to movement of the subject 28, the base 43 preferably holds the devices 30 and 38 so that the axis passing through the center of the entrance pupil 42 and the spot 34 is parallel to one of the axes of the sensor 40, as explained below. That is, taking the X and Y axes to be the mutually orthogonal axes defined by the rows and columns of the array of detector elements 41 (with the origin on the optical axis of the objective optics 39), the axis through the pupil 42 and the spot 34 should be parallel to one of the array axes, taken for convenience to be the X axis. The advantages of this arrangement are explained further below.

The optical axes of the devices 30 and 38 (which pass through the center of the spot 34 and of the pupil 42, respectively) are separated by a distance S. Therefore, changes in Z_obj give rise to distortion of the speckle pattern in the subject images acquired by the image acquisition device 38. In particular, by triangulation, as can be seen in FIG. 2, a Z-direction shift δZ of a point on the subject produces a concomitant transverse shift δX of the speckle pattern observed in the image, given approximately by
δX ≈ (S / Z_obj) · δZ

Z coordinates of points on the subject, as well as shifts in those Z coordinates over time, can therefore be determined by measuring the shifts in the X coordinates of the speckles in the images acquired by the device 38, relative to a reference image taken at a known distance Z. That is, the group of speckles in each region of the acquired image is compared with the reference image to find the best-matching group of speckles in the reference image. The relative shift between the matching groups of speckles gives the Z-direction shift of the corresponding region of the acquired image relative to the reference image. The speckle shifts may be measured using image correlation or other image-matching computations known in the art, some of which are described in the above-mentioned PCT patent application. Another method that is particularly useful with the device 22 is described in US Provisional Patent Application 60/785,202, filed March 24, 2006, which is assigned to the assignee of the present patent application and whose disclosure is incorporated herein by reference.

Furthermore, in the arrangement shown in FIG. 2, in which the axis through the pupil 42 and the spot 34 is parallel to the X axis of the sensor 40, the speckle shift accompanying δZ is strictly confined to the X direction (up to distortion by the optics 39), with no Y component. The image-matching operation is thereby simplified: it need only find the group of speckles that matches best under a shift in X. That is, to determine the shift δZ of a region of the current image relative to the reference image (which may be any preceding image of the speckle pattern), it suffices to test copies of that region of the current image displaced along the X direction against the reference image, so as to find the shift δX that gives the best match.
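The X-only search that this geometry permits maps directly onto a one-dimensional block-matching routine. Below is a minimal sketch, assuming rectified grayscale frames held as float NumPy arrays; the window size, search range, and triangulation constants are hypothetical stand-ins, not values from the patent:

```python
import numpy as np

def find_shift_1d(window, ref_band, max_shift):
    """Return the X-shift of `window` inside `ref_band` that maximizes
    normalized cross-correlation. The search runs along X only, since the
    aligned geometry of FIG. 2 confines speckle shifts to that axis.
    `ref_band` must have the same height as `window` and width
    `window.shape[1] + 2 * max_shift`."""
    h, w = window.shape
    win = (window - window.mean()) / (window.std() + 1e-9)
    best_dx, best_score = 0, -np.inf
    for dx in range(-max_shift, max_shift + 1):
        cand = ref_band[:, max_shift + dx : max_shift + dx + w]
        cand = (cand - cand.mean()) / (cand.std() + 1e-9)
        score = float((win * cand).mean())
        if score > best_score:
            best_dx, best_score = dx, score
    return best_dx

# Converting a measured pixel shift to depth via deltaX ~ (S / Z_obj) * deltaZ.
s, z_ref = 0.05, 1.0     # hypothetical baseline S [m] and reference Z_obj [m]
pix_at_z = 1.0e-3        # hypothetical footprint of one pixel at Z_obj [m]
dx_pixels = 3            # shift found by find_shift_1d, say
delta_z = dx_pixels * pix_at_z * z_ref / s
print(f"X-shift of {dx_pixels} px -> Z displacement of ~{delta_z * 100:.0f} cm")
```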

Alternatively, when the geometry of the elements of the device 22 deviates from the criteria above, or when lens distortion is significant, the processor can compensate for the deviation using a parametric model. That is, the known deviation is measured or modeled, and the processor then tests copies of each region of the current image that are shifted relative to the reference image by the (X, Y) shifts dictated by the parametric model of the deviation, so as to find the actual three-dimensional coordinates of the subject surface.

Typically, for convenience of fabrication and operation, the operating parameters of the system 20 are chosen so that S << Z_obj. (On the other hand, since the Z-direction resolution of the system 20 depends on the ratio S/Z_obj, S must be made large enough, relative to the intended working distance of the system, to give the desired resolution.) As long as S << Z_obj, the distances from the illumination device and from the image acquisition device to each point on the subject are close to one another, though not in general exactly equal. Consequently, the sizes of the speckles in the speckle-pattern images acquired by the image acquisition device 38 vary across the region 46 within some tolerance γ. Computational methods known in the art, some of which are described in the above-mentioned PCT patent application, can be used to compensate for these variations of scale when matching regions of the current image to the corresponding regions of the reference image.

In general, however, to avoid placing an excessive computational burden on the processor 24, it is desirable to keep γ within a predetermined range that depends on the size of the matching window and on the characteristic speckle size. The inventors have found that γ should generally be limited so that the variation of the speckle dimensions across a matching window does not exceed 30% of one speckle dimension. Denoting by θ the diagonal angular field of view of the image acquisition device 38, the tolerance is approximately
γ ≈ S · sin(θ) / Z_obj
Effective dimensional invariance of the local speckle pattern over a matching window of N pixels therefore holds when
(S · sin(θ) · N) / (2 · Z_obj) < 0.3 · (λ · Z_obj / (w_0 · psize(Z_obj)))
wherein psize(Z_obj) is the size of a pixel projected to the distance Z_obj. Under these conditions, the Z-direction shifts of the subject in successive image frames acquired by the device 38 can generally be computed without taking variations of the speckle dimensions into account.
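The invariance condition above is easy to evaluate when choosing the baseline S. A minimal sketch, with all parameter values hypothetical (the same illustrative numbers as in the earlier sketches):

```python
import math

def speckle_scale_ok(S, theta_deg, N, z_obj, lam, w0, psize):
    """Check the invariance condition above: speckle-size variation across
    an N-pixel matching window must stay below 30% of one speckle."""
    lhs = S * math.sin(math.radians(theta_deg)) * N / (2.0 * z_obj)
    rhs = 0.3 * (lam * z_obj / (w0 * psize))
    return lhs < rhs

# Hypothetical parameters: 5 cm baseline, 30 deg diagonal FOV, 16-px window.
print(speckle_scale_ok(S=0.05, theta_deg=30, N=16, z_obj=1.0,
                       lam=850e-9, w0=0.5e-3, psize=1.7e-3))   # True
```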

FIG. 3 is a flow chart that schematically illustrates a method of three-dimensional mapping using the system 20, in accordance with an embodiment of the present invention. The method is based on the insight that the speckle pattern projected by the illumination device 30 is essentially constant over time. Therefore, a three-dimensional map of the subject can be computed accurately from images of the speckle pattern projected onto the subject, acquired by the image acquisition device 38 at a fixed position and angle relative to the illumination device.

Before the subject is mapped, the device 22 is calibrated, at a calibration step 50, by projecting the speckle pattern from the device 30 onto an object of known spatial profile at a known distance from the device 22. Typically, a flat object extending across the region 46 at a known distance Z_obj is used as the calibration target. The image acquisition device 38 acquires a reference image of this object, which is stored in the memory of the processor 24. The calibration step may be performed at the time of manufacture, and the reference image stored in memory remains usable in the field as long as there is no uncontrolled relative movement between the components of the device 22. To save memory and simplify the subsequent computation, the reference image may be stored in a reduced-data format suited to the matching algorithm to be used, such as a thresholded binary image.

When the system 20 is ready for use, it is actuated, at a first image acquisition step 52, to acquire an image of the subject of interest (in this case, the subject 28) using the device 22. The processor 24 compares this image with the speckle pattern in the stored calibration image, at a map computation step 54. Dark regions of the image, whose pixel values fall below a predetermined threshold (or which do not contain meaningful speckle information), are typically classified as shadow regions, from which no depth (Z) information is extracted. The remaining parts of the image may be binarized or otherwise reduced in data volume, possibly using an adaptive threshold as is known in the art, for efficient matching against the reference image.

The processor 24 selects a window in a non-shadowed part of the image and compares the sub-image within this window with parts of the reference image until the part of the reference image that best matches the sub-image is found. As noted above, when the devices 30 and 38 are arranged along the X axis as shown in FIG. 2, it is sufficient for the processor to compare the sub-image with parts of the reference image that are displaced relative to the sub-image in the X direction (up to variation of the speckle dimensions within the tolerance γ, as noted above). Using the transverse shift of the sub-image relative to the matching part of the reference image, the processor determines, based on the triangulation principles explained above, the Z coordinate of the area of the surface of the subject 28 that appears in the sub-image. If this area of the subject surface is tilted, rather than lying in an X-Y plane, the speckle pattern in the sub-image exhibits distortion. The processor 24 may optionally analyze this distortion of the speckles to estimate the tilt angle, thereby improving the accuracy of the three-dimensional mapping.

The processor 24 may use the map coordinates found for this first window as a starting point for computing the coordinates of adjacent areas of the image. In particular, once the processor has found a high correlation between a region of the image and a corresponding region of the reference image, the shift of that region relative to the reference image serves as a good predictor of the shifts of the adjacent pixels in the image. The processor attempts to match these adjacent pixels to the reference image with shifts equal, or close, to the shift of the initially matched region. In this way, the processor grows the matched region outward until its edges are reached, and thus computes the Z coordinates of the non-shadowed regions of the image until the three-dimensional contour of the subject 28 is complete. This approach has the advantage of giving fast, robust matching even with small windows and with images having poor signal-to-noise ratio. Details of computational methods that can be used for this purpose are presented in the above-mentioned PCT patent application.
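Below is a minimal sketch of one way such a matching-plus-growing pass might be implemented (the patent's own procedure is detailed in the referenced PCT application). It assumes float-valued images; the window size, search slack, and four-neighbour stepping are assumptions for illustration:

```python
from collections import deque
import numpy as np

def ncc(a, b):
    """Normalized cross-correlation of two equal-sized patches."""
    a = (a - a.mean()) / (a.std() + 1e-9)
    b = (b - b.mean()) / (b.std() + 1e-9)
    return float((a * b).mean())

def grow_shift_map(img, ref, seed_yx, seed_shift, win=8, slack=1):
    """Propagate X-shifts outward from a confidently matched seed window.
    Each neighbouring window is matched against the reference over a narrow
    range of shifts centred on its neighbour's shift, which is what keeps
    the search fast and robust even for small windows."""
    h, w = img.shape
    shifts = {}
    queue = deque([(seed_yx, seed_shift)])
    while queue:
        (y, x), centre = queue.popleft()
        if (y, x) in shifts or y < 0 or x < 0 or y + win > h or x + win > w:
            continue
        window = img[y:y + win, x:x + win]
        best_dx, best_score = None, -np.inf
        for dx in range(centre - slack, centre + slack + 1):
            if 0 <= x + dx and x + dx + win <= w:
                score = ncc(window, ref[y:y + win, x + dx:x + dx + win])
                if score > best_score:
                    best_dx, best_score = dx, score
        if best_dx is None:
            continue
        shifts[(y, x)] = best_dx
        for sy, sx in ((-win, 0), (win, 0), (0, -win), (0, win)):
            queue.append(((y + sy, x + sx), best_dx))
    return shifts   # map from window corner to X-shift, convertible to Z
```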

As a result of the above steps, the processor 24 has computed a complete three-dimensional map of the part of the subject surface visible in the first image. The method is easily extended, however, to acquiring and analyzing successive images so as to track the three-dimensional movement of the subject, at a next image step 56. The device 22 acquires successive images at a predetermined frame rate, and the processor 24 updates the three-dimensional map based on each successive image. The three-dimensional map may, if desired, be computed against the stored calibration reference image. Alternatively, since the subject typically does not move much from one image frame to the next, it is often more efficient to use each successive image as the reference image for the frame that follows it.

In this manner, the processor 24 may compare each successive image with the preceding image, at a shift computation step 58, in order to compute, for each sub-image, the X-direction shift of the speckles relative to the same speckles in the preceding image. Usually this shift does not exceed a few pixels, so the computation can be performed quickly and efficiently. After each new image has been processed in this way, the processor 24 outputs an updated three-dimensional map, at a new map output step 60. This process of image acquisition and map updating can thus continue indefinitely. Because the successive three-dimensional maps are easy to compute, the system 20 can compute and output map coordinates in real time, at fast video rates of 30 frames/sec or more, while using simple, low-cost imaging and processing hardware. Furthermore, the efficient image-matching computation and region-growing methods described above allow the system 20 to operate at video rates even when a local shift cannot be computed from the preceding image.
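The frame-to-frame update loop described here can be expressed compactly. In the sketch below, `compute_shift_map` and `to_depth` are hypothetical callables standing in for the matching and triangulation steps sketched above:

```python
def track(frames, calibration_ref, compute_shift_map, to_depth):
    """Yield one depth map per frame. Each successive frame becomes the
    reference for the next, so residual speckle shifts stay within a few
    pixels and the X-search range can be kept very small."""
    ref = calibration_ref
    for frame in frames:
        shifts = compute_shift_map(frame, ref, max_shift=4)
        yield to_depth(shifts)
        ref = frame   # use this image as the reference for the next frame
```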

These capabilities make the system 20 suitable for a wide range of applications, and in particular for machine interfaces based on human movement. In such an interface, a computer (either containing the processor 24 or receiving the three-dimensional map output by the processor) identifies, in the three-dimensional map, one or more volumes corresponding to parts of the user's body (for example, the arms, hands, and/or fingers, and possibly also the head, torso, and other limbs). The computer is programmed to identify gestures corresponding to particular movements of these body parts and to control computer applications in response to these gestures. Examples of such operations and applications include the following:
- Mouse translation and click: the computer interprets the movements of the user's hand and fingers as though the user had moved a mouse on a tabletop and clicked its buttons.
- Freehand pointing at, selection of, and manipulation of objects on a computer screen.
- Computer games, in which real or virtual objects used in the game are struck, grasped, moved, and released by the user's gestures.
- Computer interfaces for disabled users, based on detection of the limited range of movements that the user is able to perform.
- Typing on a virtual keyboard.
Other types of applications will be apparent to those skilled in the art.

Returning to FIG. 2: as the beam 36 extends beyond the Rayleigh distance, the intensity of the illumination falling on the subject 28 decreases roughly in proportion to Z². The contrast of the speckle pattern projected onto the subject decreases accordingly, particularly in the presence of strong ambient light at wavelengths near that of the light source 32. Consequently, the depth (Z-coordinate) range over which the system 20 gives useful results can be limited at large Z by the weak illumination. This problem can be mitigated by adaptive control of the illumination and imaging system, as is known in the art; some suitable methods are described in the above-mentioned PCT patent application PCT/IL2006/000335. Alternatively or additionally, optical beam forming can be used to improve the illumination profile, as described below.

FIG. 4 is a schematic side view of an illumination device 70 that may be used in the system 20 to extend its useful depth range, in accordance with an embodiment of the present invention. The device 70 comprises a beam former 72, together with the light source 32 and the diffuser 33. The beam former is designed to generate a beam 74 whose divergence is reduced in an intermediate region 76, so that the transverse dimensions of the speckle pattern are roughly maintained over the axial extent of this region. As a result, high speckle contrast is maintained in images of the subject 28 throughout the region 76, widening the depth range covered by the three-dimensional mapping system. Several optical designs that can be used to enhance performance in the region 76 are described below.

FIG. 5 is a schematic side view of the beam former 72, in accordance with an embodiment of the present invention. The beam former comprises a diffractive optical element (DOE) 80 and a conical lens 82. The DOE 80 may abut the diffuser 33, or it may be fabricated as an etched or deposited layer on the surface of the diffuser. Various diffractive designs can be used to reduce the divergence of the beam in the region 76. For example, the DOE 80 may carry a pattern of circles, concentric about the optical axis of the light source 32, whose radii are randomly distributed. The conical lens 82 has a cone-shaped profile centered on the optical axis; it is, in effect, a rotationally symmetrical prism. Both the DOE 80 and the conical lens 82 have the effect of creating a long focal region along the optical axis, so that either of these elements alone could be used to create a region of reduced beam divergence; using the two elements together reduces the divergence still further.
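As one purely illustrative reading of the concentric-circle design, such a mask could be generated numerically as follows; the patent does not specify the element at this level of detail, so the grid size, ring count, and binary 0/π phase steps are all assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)
n, n_rings = 512, 40
y, x = np.mgrid[-n // 2:n // 2, -n // 2:n // 2]
r = np.hypot(x, y) / (n // 2)          # normalized radius from the optical axis
edges = np.sort(rng.random(n_rings))   # randomly distributed ring radii

# Alternate a 0 / pi phase step between successive rings: a binary phase
# mask of concentric circles with random radii, as the text describes.
phase_mask = (np.searchsorted(edges, r.clip(0.0, 1.0)) % 2) * np.pi
```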

FIG. 6 is a schematic side view of a beam former 90, in accordance with another embodiment of the present invention. The beam former 90 comprises a DOE 92 and lenses 94 and 96 of focal length F. As shown in the figure, the lenses are spaced from the diffuser 33 and from the DOE 92 by distances equal to the focal length, so that the DOE lies in the Fourier plane of the diffuser. The Fourier transform of the diffuser is therefore multiplied by the transmission function of the DOE, and in the far region the speckle pattern is multiplied by the Fourier transform of the pattern on the DOE.

The DOE pattern can be chosen so that its Fourier transform reduces the divergence of the illumination beam, as shown in FIG. 4, and/or makes the illumination more uniform across the beam. The latter objective can be achieved, for example, by designing the DOE 92 so that its transmission in the central region is lower than in the peripheral region, countering the angular intensity distribution of the beam from the diffuser 33, which is brightest at the center and tends to fall off with increasing angle from the optical axis. Other designs of the DOE 92, or of the DOE 80 (FIG. 5), for obtaining a desired intensity profile or more uniform speckle contrast will be apparent to those skilled in the art and are considered to be within the scope of the present invention.

FIG. 7 is a schematic side view of an optical correlator 110 that may be used in the system 20 to determine the Z coordinates of regions of the subject 28, in accordance with yet another embodiment of the present invention. The correlator 110 uses optical techniques to perform some of the functions of the processor 24 described above. It is very fast, determining the coordinates of multiple regions of the subject in parallel, essentially simultaneously, and is therefore especially useful in applications in which the subject moves quickly.

A lenslet array 116 forms multiple sub-images of the subject 28 under speckle illumination by the device 30. An aperture array 118 limits the field of view of the lenslet array 116 so that each sub-image contains light from only a narrow angular region. A second lenslet array 120 projects the sub-images onto a DOE 122. The lenslet array 120 is spaced from the plane of the sub-images by a distance equal to the focal length of the lenslets in the array, and from the plane of the DOE 122 by the same distance. A further lenslet array 124 is positioned between the DOE 122 and the sensor 40, spaced from each by a distance equal to the focal length of its lenslets.

The DOE 122 carries a reference diffraction pattern, comprising the spatial Fourier transform of the reference speckle pattern against which the speckle images of the subject 28 are to be compared. For example, the reference diffraction pattern may be the Fourier transform of the calibration speckle image formed at the calibration step 50 using a plane at a known distance from the light source, in which case the reference diffraction pattern may be etched or deposited on the surface of the DOE. Alternatively, the DOE 122 may comprise a spatial light modulator (SLM), driven to project the reference diffraction pattern dynamically.

In either case, the correlator 110 multiplies each sub-image of the subject (formed by one of the lenslets in the array 116) by the reference speckle pattern in Fourier space. The intensity distribution projected onto the sensor 40 by the lenslet array 124 therefore corresponds to the cross-correlation of each sub-image with the reference speckle pattern. In general, the intensity distribution on the sensor comprises multiple correlation peaks, each corresponding to one of the sub-images. The transverse shift of each peak relative to the axis of the corresponding sub-image (defined by the corresponding aperture in the array 118) is proportional to the transverse shift of the speckle pattern on the corresponding region of the subject 28. This shift, in turn, is proportional to the Z-direction displacement of that region relative to the reference surface of the reference speckle pattern, as explained above. The output of the sensor 40 can thus be processed to determine the Z coordinate of the region in each sub-image, and hence to compute the three-dimensional map of the subject.
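The optical multiplication in Fourier space is the classical correlation theorem at work. A digital stand-in for what the correlator computes (NumPy, synthetic data; not the optical device itself) shows how the peak position encodes the shift:

```python
import numpy as np

rng = np.random.default_rng(0)
ref = rng.random((64, 64))           # stand-in for the reference speckle pattern
shift = 5
sub = np.roll(ref, shift, axis=1)    # sub-image: the reference shifted 5 px in X

# Cross-correlation via the Fourier domain, FT(sub) * conj(FT(ref)),
# mirroring the multiplication that the DOE 122 performs optically.
corr = np.fft.ifft2(np.fft.fft2(sub) * np.conj(np.fft.fft2(ref))).real
peak_y, peak_x = np.unravel_index(np.argmax(corr), corr.shape)
print(peak_y, peak_x)                # -> 0 5: the peak position encodes the X-shift
```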

Although the embodiments described above relate to the particular configuration of the system 20 and design of the device 22 described above, the principles of the present invention may likewise be applied to other types of speckle-based three-dimensional mapping systems and devices. For example, aspects of the embodiments described above may be applied to systems that use multiple image acquisition devices, or to systems in which the image acquisition device and the illumination device can move relative to one another.

It will thus be appreciated that the embodiments described above are cited by way of example, and that the present invention is not limited to what has been particularly shown and described hereinabove. Rather, the scope of the present invention includes both combinations and subcombinations of the various features described hereinabove, as well as variations and modifications thereof that would occur to persons skilled in the art upon reading the foregoing description and that are not disclosed in the prior art.


20 three-dimensional mapping system
22 speckle imaging device
24 image processor
28 subject
30 illumination device
32 coherent light source
33 diffuser
38 image acquisition device
39 objective optics
40 image sensor

Claims (42)

  1. Apparatus for three-dimensional mapping of a subject, comprising:
    an illumination device, comprising a coherent light source and a diffuser, which are arranged to project a primary speckle pattern onto the subject;
    a single image acquisition device, arranged to acquire images of the primary speckle pattern on the subject from a single, fixed position and angle relative to the illumination device; and
    a processor, connected so as to derive a three-dimensional map of the subject by finding respective shifts between the primary speckle pattern in a plurality of regions of the subject, acquired in one or more of the images at the single, fixed position and angle, and a reference image of the primary speckle pattern acquired on a reference surface of known contour at a known distance from the single image acquisition device,
    wherein the respective shifts are indicative of respective distances between the regions and the image acquisition device.
  2. The apparatus according to claim 1, further comprising a base, to which the illumination device and the image acquisition device are attached so as to hold the image acquisition device in a fixed spatial relation to the illumination device.
  3. The apparatus according to claim 2, wherein the image acquisition device comprises:
    an array of detector elements arranged in a rectilinear pattern defining first and second mutually orthogonal axes; and
    objective optics, which have an entrance pupil and are arranged to focus the image onto the array of detector elements,
    and wherein the illumination device and the image acquisition device are arranged on the base so as to define a device axis that is parallel to the first axis and passes through the entrance pupil and through the spot at which a beam emitted by the coherent light source passes through the diffuser.
  4. The apparatus according to claim 3, wherein the processor is arranged to derive the three-dimensional map by finding shifts between the primary speckle pattern acquired in the one or more images and the reference image of the primary speckle pattern along the first axis only.
  5. The apparatus according to claim 1, wherein the image acquisition device is spaced apart from the illumination device by a predetermined spacing, and wherein the respective shifts are proportional to the respective distances at a rate determined by the spacing.
  6. The apparatus according to claim 5, wherein the primary speckle pattern projected by the illumination device comprises speckles having a characteristic dimension, wherein the dimension of the speckles in the image varies across the image within a tolerance that depends on the spacing, and wherein the spacing is selected so as to keep the tolerance within a predetermined range.
  7. The apparatus according to claim 1, wherein the processor is arranged to relate the respective shifts to respective coordinates of the three-dimensional map using a parametric model of distortion in the image acquisition device.
  8. The apparatus according to claim 1, wherein the processor is arranged to find the respective shifts by finding an initial match between the primary speckle pattern in a first region of the subject and a corresponding region of the reference image at a first shift relative to the first region, and by applying a region-growing procedure, based on the first shift, to find the respective shifts of pixels adjacent to the first region.
  9. The apparatus according to any one of claims 1 to 8, wherein the processor is arranged to process a succession of the images, acquired while the subject is moving, so as to map a three-dimensional movement of the subject.
  10. The apparatus according to claim 9, wherein the subject is a part of a human body, wherein the three-dimensional movement comprises a gesture made by the part of the human body, and wherein the processor is connected so as to provide an input to a computer application responsively to the gesture.
  11. The apparatus according to claim 1, wherein the illumination device comprises a beam former, arranged to reduce the variation of the contrast of the speckle pattern created by the diffuser over the detection volume of the three-dimensional mapping apparatus.
  12. The apparatus according to claim 11, wherein the beam former comprises a diffractive optical element (DOE).
  13. The apparatus according to claim 12, wherein the beam former comprises a lens arranged to form a Fourier plane of the diffuser, and wherein the diffractive optical element (DOE) is positioned in the Fourier plane.
  14. The apparatus according to claim 11, wherein the beam former is arranged to reduce the divergence of the light emitted from the diffuser.
  15. The apparatus according to claim 11, wherein the beam former is arranged to equalize the intensity of the light emitted from the diffuser across a plane orthogonal to an optical axis of the illumination device.
  16. The apparatus according to claim 1, wherein the processor comprises an optical correlator.
  17. The apparatus according to claim 16, wherein the optical correlator comprises a diffractive optical element (DOE) containing a reference speckle pattern, and wherein the image acquisition device comprises a lenslet array, which projects a plurality of sub-images of the subject onto the DOE so as to generate respective correlation peaks indicative of the three-dimensional coordinates of the subject.
  18. The apparatus according to any one of claims 1 to 8, wherein the coherent light source has a coherence length of less than 1 cm.
  19. The apparatus according to any one of claims 1 to 8, wherein the primary speckle pattern comprises speckles having a characteristic dimension, and wherein the illumination device is configured so that the characteristic dimension of the speckles can be adjusted by changing the distance between the coherent light source and the diffuser.
  20. A method for three-dimensional mapping of a subject, comprising:
    illuminating the subject with a beam of coherent light diffused from a light source so as to project a primary speckle pattern onto the subject;
    acquiring images of the primary speckle pattern on the subject, using a single image acquisition device, from a single, fixed position and angle relative to the light source; and
    processing the images of the primary speckle pattern acquired at the single, fixed position and angle so as to derive a three-dimensional map of the subject, by finding respective shifts between the primary speckle pattern in a plurality of regions of the subject, acquired in one or more of the images, and a reference image of the primary speckle pattern acquired on a reference surface of known contour at a known distance from the single image acquisition device,
    wherein the respective shifts are indicative of respective distances between the regions and the single image acquisition device.
  21. The method according to claim 20, wherein the image acquisition device is held in a fixed spatial relation to the light source while acquiring the images.
  22. The method according to claim 21, wherein the image acquisition device comprises an array of detector elements arranged in a rectilinear pattern defining first and second mutually orthogonal axes, wherein the light source comprises a diffuser, and wherein acquiring the images comprises aligning an entrance pupil of the image acquisition device with the spot at which the beam passes through the diffuser, along a device axis that is parallel to the first axis.
  23. The method according to claim 22, wherein processing the images comprises finding shifts between the primary speckle pattern acquired in the one or more images and the reference image of the primary speckle pattern along the first axis only.
  24. The method according to claim 20, wherein the respective shifts are proportional to the respective distances at a rate determined by a spacing between the fixed position of the image acquisition device and the light source.
  25. The method according to claim 24, wherein the primary speckle pattern comprises speckles having a characteristic dimension, wherein the dimension of the speckles in the image varies across the image within a tolerance that depends on the spacing, and wherein acquiring the images comprises selecting the spacing so as to keep the tolerance within a predetermined range.
  26. The method according to claim 20, wherein finding the respective shifts comprises relating the respective shifts to respective coordinates of the three-dimensional map using a parametric model of distortion in the image acquisition device.
  27. The method according to claim 20, wherein finding the respective shifts comprises:
    finding an initial match between the primary speckle pattern in a first region of the subject and a corresponding region of the reference image at a first shift relative to the first region; and
    applying a region-growing procedure, based on the first shift, to find the respective shifts of pixels adjacent to the first region.
  28. The method according to any one of claims 20 to 27, wherein processing the images comprises processing a succession of the images, acquired while the subject is moving, so as to map a three-dimensional movement of the subject.
  29. The method according to claim 28, wherein the subject is a part of a human body, wherein the three-dimensional movement comprises a gesture made by the part of the human body, and wherein processing the images comprises providing an input to a computer application responsively to the gesture.
  30. The method according to any one of claims 20 to 27, wherein illuminating the subject comprises forming the beam so as to reduce the variation of the contrast of the speckle pattern created by the light source over the detection volume used for the three-dimensional mapping.
  31. The method according to claim 30, wherein forming the beam comprises passing the beam through a diffractive optical element (DOE).
  32. The method according to claim 31, wherein the light source comprises a diffuser, and wherein passing the beam comprises positioning the diffractive optical element (DOE) in a Fourier plane of the diffuser.
  33. The method according to claim 30, wherein forming the beam comprises reducing the divergence of the beam.
  34. The method according to claim 30, wherein forming the beam comprises equalizing the intensity of the beam across a plane orthogonal to an optical axis of the light source.
  35. The method according to any one of claims 20 to 27, wherein processing the images comprises applying the images to an optical correlator.
  36. The method according to claim 35, wherein the optical correlator comprises a diffractive optical element (DOE) containing a reference speckle pattern, and wherein acquiring the images comprises projecting a plurality of sub-images of the subject onto the diffractive optical element (DOE) so as to generate respective correlation peaks indicative of the three-dimensional coordinates of the subject.
  37. The method according to any one of claims 20 to 27, wherein the coherent light source has a coherence length of less than 1 cm.
  38. The method according to any one of claims 20 to 27, wherein illuminating the subject comprises passing the light from the coherent light source through a diffuser so as to generate the primary speckle pattern, wherein the primary speckle pattern comprises speckles having a characteristic dimension, and wherein the method comprises adjusting the characteristic dimension of the speckles by changing the distance between the coherent light source and the diffuser.
39. An apparatus for three-dimensional mapping of a subject, comprising:
    an illuminator, comprising a coherent light source having a coherence length of less than 1 cm and a diffuser, arranged to project a primary speckle pattern on the subject;
    an image acquisition device arranged to acquire images of the primary speckle pattern on the subject; and
    a processor coupled to derive a three-dimensional map of the subject by finding respective shifts between the primary speckle pattern in a plurality of regions on the subject, as captured in one or more of the images, and a reference image of the primary speckle pattern acquired on a reference surface of known contour at a known distance from the image acquisition device,
    wherein the respective shifts are indicative of respective distances between the regions and the image acquisition device.
  40.   40. The apparatus of claim 39, wherein the coherence length of the coherent light source is less than 0.5 mm.
41. The apparatus according to claim 39 or 40, wherein the coherent light source has a divergence greater than 5 degrees.
42. The apparatus according to claim 39 or 40, wherein the primary speckle pattern comprises speckles having a characteristic dimension, and wherein the illuminator is arranged to adjust the characteristic dimension of the speckles by varying the distance between the coherent light source and the diffuser.
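Claims 39 to 42 state only that the shifts are indicative of distance. Under a conventional triangulation model for a projector-camera pair (an assumption; the patent does not fix a formula), the relation can be made explicit. The baseline, focal length, and reference distance below are illustrative:

```python
# A conventional triangulation model (an assumption, not a formula from
# the claims) relating the pixel shift of the speckle pattern, measured
# against a reference plane at z_ref, to the region's distance z.
def shift_to_depth(shift_px, z_ref=1.0, baseline=0.075, focal_px=580.0):
    """Depth in metres from a transverse pattern shift in pixels.

    shift_px > 0 is taken to mean the pattern moved in the direction
    corresponding to a region closer than the reference plane.
    """
    # Similar triangles: shift_px / focal_px = baseline * (1/z - 1/z_ref)
    return 1.0 / (1.0 / z_ref + shift_px / (focal_px * baseline))

for s in (0.0, 5.0, 20.0):
    print(f"shift {s:5.1f} px -> z = {shift_to_depth(s):.3f} m")
```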
JP2008558981A 2005-10-11 2007-03-08 3D detection using speckle patterns Active JP5174684B2 (en)

Priority Applications (5)

Application Number Priority Date Filing Date Title
PCT/IL2006/000335 WO2007043036A1 (en) 2005-10-11 2006-03-14 Method and system for object reconstruction
ILPCT/IL2006/000335 2006-03-14
US78518706P true 2006-03-24 2006-03-24
US60/785,187 2006-03-24
PCT/IL2007/000306 WO2007105205A2 (en) 2006-03-14 2007-03-08 Three-dimensional sensing using speckle patterns

Publications (2)

Publication Number Publication Date
JP2009531655A JP2009531655A (en) 2009-09-03
JP5174684B2 true JP5174684B2 (en) 2013-04-03

Family

ID=38509871

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2008558981A Active JP5174684B2 (en) 2005-10-11 2007-03-08 3D detection using speckle patterns

Country Status (5)

Country Link
US (2) US8390821B2 (en)
JP (1) JP5174684B2 (en)
KR (1) KR101331543B1 (en)
CN (1) CN101496033B (en)
WO (1) WO2007105205A2 (en)

Families Citing this family (153)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2007105205A2 (en) 2006-03-14 2007-09-20 Prime Sense Ltd. Three-dimensional sensing using speckle patterns
EP1994503B1 (en) * 2006-03-14 2017-07-05 Apple Inc. Depth-varying light fields for three dimensional sensing
WO2007043036A1 (en) * 2005-10-11 2007-04-19 Prime Sense Ltd. Method and system for object reconstruction
US8050461B2 (en) 2005-10-11 2011-11-01 Primesense Ltd. Depth-varying light fields for three dimensional sensing
US9330324B2 (en) * 2005-10-11 2016-05-03 Apple Inc. Error compensation in three-dimensional mapping
CN103778635B (en) * 2006-05-11 2016-09-28 Apple Inc. Method and apparatus for processing data
US8350847B2 (en) * 2007-01-21 2013-01-08 Primesense Ltd Depth mapping using multi-beam illumination
US8265793B2 (en) 2007-03-20 2012-09-11 Irobot Corporation Mobile robot for telecommunication
US8150142B2 (en) * 2007-04-02 2012-04-03 Prime Sense Ltd. Depth mapping using projected patterns
TWI433052B (en) 2007-04-02 2014-04-01 Primesense Ltd Depth mapping using projected patterns
US8494252B2 (en) * 2007-06-19 2013-07-23 Primesense Ltd. Depth mapping using optical elements having non-uniform focal characteristics
FR2921719B1 (en) * 2007-09-28 2010-03-12 Noomeo Method for constructing a synthesis image of a three-dimensional surface of a physical object
DE102007058590B4 (en) * 2007-12-04 2010-09-16 Sirona Dental Systems Gmbh Recording method for an image of a recording object and recording device
US9035876B2 (en) * 2008-01-14 2015-05-19 Apple Inc. Three-dimensional user interface session control
US8166421B2 (en) 2008-01-14 2012-04-24 Primesense Ltd. Three-dimensional user interface
US8384997B2 (en) * 2008-01-21 2013-02-26 Primesense Ltd Optical pattern projection
WO2009093228A2 (en) * 2008-01-21 2009-07-30 Prime Sense Ltd. Optical designs for zero order reduction
US8456517B2 (en) 2008-07-09 2013-06-04 Primesense Ltd. Integrated processor for 3D mapping
FR2940423B1 (en) * 2008-12-22 2011-05-27 Noomeo Dense reconstruction three-dimensional scanning device
US8462207B2 (en) 2009-02-12 2013-06-11 Primesense Ltd. Depth ranging with Moiré patterns
US8786682B2 (en) 2009-03-05 2014-07-22 Primesense Ltd. Reference image techniques for three-dimensional sensing
US8717417B2 (en) 2009-04-16 2014-05-06 Primesense Ltd. Three-dimensional mapping and imaging
WO2011013079A1 (en) 2009-07-30 2011-02-03 Primesense Ltd. Depth mapping based on pattern matching and stereoscopic information
US8565479B2 (en) * 2009-08-13 2013-10-22 Primesense Ltd. Extraction of skeletons from 3D maps
CN102022979A (en) * 2009-09-21 2011-04-20 Hongfujin Precision Industry (Shenzhen) Co., Ltd. Three-dimensional optical sensing system
US8963829B2 (en) * 2009-10-07 2015-02-24 Microsoft Corporation Methods and systems for determining and tracking extremities of a target
US8564534B2 (en) * 2009-10-07 2013-10-22 Microsoft Corporation Human tracking system
US8867820B2 (en) 2009-10-07 2014-10-21 Microsoft Corporation Systems and methods for removing a background of an image
US7961910B2 (en) 2009-10-07 2011-06-14 Microsoft Corporation Systems and methods for tracking a model
US20110096182A1 (en) * 2009-10-25 2011-04-28 Prime Sense Ltd Error Compensation in Three-Dimensional Mapping
JP5588310B2 (en) 2009-11-15 2014-09-10 PrimeSense Ltd. Optical projector with beam monitor
US8830227B2 (en) 2009-12-06 2014-09-09 Primesense Ltd. Depth-based gain control
JP4783456B2 (en) * 2009-12-22 2011-09-28 Toshiba Corporation Video playback apparatus and video playback method
US20110187878A1 (en) * 2010-02-02 2011-08-04 Primesense Ltd. Synchronization of projected illumination with rolling shutter of image sensor
US20110188054A1 (en) * 2010-02-02 2011-08-04 Primesense Ltd Integrated photonics module for optical projection
US8786757B2 (en) 2010-02-23 2014-07-22 Primesense Ltd. Wideband ambient light rejection
US8982182B2 (en) * 2010-03-01 2015-03-17 Apple Inc. Non-uniform spatial resource allocation for depth mapping
US8787663B2 (en) 2010-03-01 2014-07-22 Primesense Ltd. Tracking body parts by combined color image and depth processing
WO2011127646A1 (en) * 2010-04-13 2011-10-20 Nokia Corporation An apparatus, method, computer program and user interface
US9014848B2 (en) 2010-05-20 2015-04-21 Irobot Corporation Mobile robot system
US8918209B2 (en) 2010-05-20 2014-12-23 Irobot Corporation Mobile human interface robot
US8918213B2 (en) 2010-05-20 2014-12-23 Irobot Corporation Mobile human interface robot
DE112011104645T5 (en) 2010-12-30 2013-10-10 Irobot Corp. Mobile human interface robot
US8935005B2 (en) 2010-05-20 2015-01-13 Irobot Corporation Operating a mobile robot
GB2494081B (en) 2010-05-20 2015-11-11 Irobot Corp Mobile human interface robot
US8594425B2 (en) 2010-05-31 2013-11-26 Primesense Ltd. Analysis of three-dimensional scenes
US8670029B2 (en) * 2010-06-16 2014-03-11 Microsoft Corporation Depth camera illuminator with superluminescent light-emitting diode
JP5791131B2 (en) 2010-07-20 2015-10-07 Apple Inc. Interactive reality extension for natural interactions
US9201501B2 (en) 2010-07-20 2015-12-01 Apple Inc. Adaptive projector
CN103097925B (en) 2010-08-06 2016-04-13 Asahi Glass Co., Ltd. Diffraction optical element and measuring device
JP5834602B2 (en) 2010-08-10 2015-12-24 Asahi Glass Co., Ltd. Diffractive optical element and measuring device
CN103053167B (en) 2010-08-11 2016-01-20 Apple Inc. Scanning projector and image capture module for 3D mapping
US9036158B2 (en) 2010-08-11 2015-05-19 Apple Inc. Pattern projector
US9348111B2 (en) 2010-08-24 2016-05-24 Apple Inc. Automatic detection of lens deviations
US9002099B2 (en) 2011-09-11 2015-04-07 Apple Inc. Learning-based estimation of hand and finger pose
US8582867B2 (en) 2010-09-16 2013-11-12 Primesense Ltd Learning-based pose estimation from depth maps
US8959013B2 (en) 2010-09-27 2015-02-17 Apple Inc. Virtual keyboard for a non-tactile three dimensional user interface
IL208568A (en) * 2010-10-07 2018-06-28 Elbit Systems Ltd Mapping, detecting and tracking objects in an arbitrary outdoor scene using active vision
JP5760391B2 (en) 2010-11-02 2015-08-12 Asahi Glass Co., Ltd. Diffractive optical element and measuring device
KR20120046973A (en) * 2010-11-03 2012-05-11 Samsung Electronics Co., Ltd. Method and apparatus for generating motion information
US9066087B2 (en) 2010-11-19 2015-06-23 Apple Inc. Depth mapping using time-coded illumination
US9167138B2 (en) 2010-12-06 2015-10-20 Apple Inc. Pattern projection and imaging using lens arrays
US8872762B2 (en) 2010-12-08 2014-10-28 Primesense Ltd. Three dimensional user interface cursor control
US8933876B2 (en) 2010-12-13 2015-01-13 Apple Inc. Three dimensional user interface session control
EP2466560A1 (en) 2010-12-20 2012-06-20 Axis AB Method and system for monitoring the accessibility of an emergency exit
US8930019B2 (en) 2010-12-30 2015-01-06 Irobot Corporation Mobile human interface robot
US8717488B2 (en) 2011-01-18 2014-05-06 Primesense Ltd. Objective optics with interference filter
EP3527121A1 (en) 2011-02-09 2019-08-21 Apple Inc. Gesture detection in a 3d mapping environment
US9052512B2 (en) 2011-03-03 2015-06-09 Asahi Glass Company, Limited Diffractive optical element and measuring apparatus
JP5948948B2 (en) * 2011-03-03 2016-07-06 Asahi Glass Co., Ltd. Diffractive optical element and measuring device
US9030528B2 (en) 2011-04-04 2015-05-12 Apple Inc. Multi-zone imaging sensor and lens array
CN102760234B (en) 2011-04-14 2014-08-20 Industrial Technology Research Institute Depth image acquiring device, system and method
WO2012147495A1 (en) * 2011-04-28 2012-11-01 Sanyo Electric Co., Ltd. Information acquisition device and object detection device
WO2012147702A1 (en) 2011-04-28 2012-11-01 Sharp Corporation Head-mounted display
EP2530442A1 (en) 2011-05-30 2012-12-05 Axis AB Methods and apparatus for thermographic measurements.
JP5926500B2 (en) * 2011-06-07 2016-05-25 Sony Corporation Information processing apparatus, information processing method, and program
JP5298161B2 (en) * 2011-06-13 2013-09-25 Sharp Corporation Operating device and image forming apparatus
JP5948949B2 (en) * 2011-06-28 2016-07-06 Asahi Glass Co., Ltd. Diffractive optical element and measuring device
US9459758B2 (en) 2011-07-05 2016-10-04 Apple Inc. Gesture-based interface with enhanced features
US8881051B2 (en) 2011-07-05 2014-11-04 Primesense Ltd Zoom-based gesture user interface
US9377865B2 (en) 2011-07-05 2016-06-28 Apple Inc. Zoom-based gesture user interface
US8869073B2 (en) * 2011-07-28 2014-10-21 Hewlett-Packard Development Company, L.P. Hand pose interaction
US8908277B2 (en) 2011-08-09 2014-12-09 Apple Inc Lens array projector
US8749796B2 (en) 2011-08-09 2014-06-10 Primesense Ltd. Projectors of structured light
US8971572B1 (en) 2011-08-12 2015-03-03 The Research Foundation For The State University Of New York Hand pointing estimation for human computer interaction
US9030498B2 (en) 2011-08-15 2015-05-12 Apple Inc. Combining explicit select gestures and timeclick in a non-tactile three dimensional user interface
US9122311B2 (en) 2011-08-24 2015-09-01 Apple Inc. Visual feedback for tactile and non-tactile user interfaces
US9218063B2 (en) * 2011-08-24 2015-12-22 Apple Inc. Sessionless pointing user interface
FR2980292B1 (en) 2011-09-16 2013-10-11 Prynel Method and system for acquiring and processing images for motion detection
WO2013067526A1 (en) 2011-11-04 2013-05-10 Remote TelePointer, LLC Method and system for user interface for interactive devices using a mobile device
DE102011121696A1 (en) * 2011-12-16 2013-06-20 Friedrich-Schiller-Universität Jena Method for 3D measurement of depth-limited objects
EP2611169A1 (en) 2011-12-27 2013-07-03 Thomson Licensing Device for the acquisition of stereoscopic images
LT2618316T (en) 2012-01-23 2018-11-12 Novomatic Ag Wheel of fortune with gesture control
US9157790B2 (en) 2012-02-15 2015-10-13 Apple Inc. Integrated optoelectronic modules with transmitter, receiver and beam-combining optics for aligning a beam axis with a collection axis
US20130222369A1 (en) 2012-02-23 2013-08-29 Charles D. Huston System and Method for Creating an Environment and for Sharing a Location Based Experience in an Environment
US9229534B2 (en) 2012-02-28 2016-01-05 Apple Inc. Asymmetric mapping for tactile and non-tactile user interfaces
KR101898490B1 (en) * 2012-02-29 2018-09-13 LG Electronics Inc. Holographic display device and method for generating hologram using redundancy of 3-D video
US8958911B2 (en) 2012-02-29 2015-02-17 Irobot Corporation Mobile robot
CN104221059B (en) 2012-03-22 2017-05-10 Apple Inc. Diffraction-based sensing of mirror position
CN104246682B (en) 2012-03-26 2017-08-25 Apple Inc. Enhanced virtual touchpad and touch-screen
US9047507B2 (en) 2012-05-02 2015-06-02 Apple Inc. Upper-body skeleton extraction from depth maps
CN103424077A (en) * 2012-05-23 2013-12-04 Lenovo (Beijing) Co., Ltd. Motion detection device, detection method and electronic equipment
CN102681183B (en) * 2012-05-25 2015-01-07 Hefei Dingchen Photoelectric Technology Co., Ltd. Two-way three-dimensional imaging and naked-eye three-dimensional display system based on lens array
US8896594B2 (en) * 2012-06-30 2014-11-25 Microsoft Corporation Depth sensing with depth-adaptive illumination
WO2014003796A1 (en) * 2012-06-30 2014-01-03 Hewlett-Packard Development Company, L.P. Virtual hand based on combined data
US9904998B2 (en) 2012-08-27 2018-02-27 Koninklijke Philips N.V. Patient-specific and automatic x-ray system adjustment based on optical 3D scene detection and interpretation
US9019267B2 (en) 2012-10-30 2015-04-28 Apple Inc. Depth mapping with enhanced resolution
DE102012110460A1 (en) * 2012-10-31 2014-04-30 Audi Ag A method for entering a control command for a component of a motor vehicle
US9661304B2 (en) * 2012-10-31 2017-05-23 Ricoh Company, Ltd. Pre-calculation of sine waves for pixel values
US9152234B2 (en) 2012-12-02 2015-10-06 Apple Inc. Detecting user intent to remove a pluggable peripheral device
US9217665B2 (en) 2013-01-31 2015-12-22 Hewlett Packard Enterprise Development Lp Viewing-angle imaging using lenslet array
JP6044403B2 (en) * 2013-03-18 2016-12-14 Fujitsu Limited Imaging apparatus, imaging method, and imaging program
US20140307055A1 (en) 2013-04-15 2014-10-16 Microsoft Corporation Intensity-modulated light pattern for active stereo
CN103268608B (en) 2013-05-17 2015-12-02 Tsinghua University Depth estimation method and device based on near-infrared laser speckle
CN105324631B (en) 2013-06-19 2018-11-16 Apple Inc. Integrated structured light projector
CN105358063B (en) 2013-06-19 2018-11-30 Koninklijke Philips N.V. Calibration of imager with dynamic beam reshaper
US9208566B2 (en) 2013-08-09 2015-12-08 Microsoft Technology Licensing, Llc Speckle sensing for motion tracking
JP6387964B2 (en) 2013-09-02 2018-09-12 AGC Inc. Measuring device
TWI485361B (en) * 2013-09-11 2015-05-21 Univ Nat Taiwan Measuring apparatus for three-dimensional profilometry and method thereof
KR20150069771A (en) 2013-12-16 2015-06-24 Samsung Electronics Co., Ltd. Event filtering device and motion recognition device thereof
US9528906B1 (en) 2013-12-19 2016-12-27 Apple Inc. Monitoring DOE performance using total internal reflection
JP6359466B2 (en) * 2014-01-13 2018-07-18 Facebook, Inc. Optical detection at sub-resolution
WO2015148604A1 (en) 2014-03-25 2015-10-01 Massachusetts Institute Of Technology Space-time modulated active 3d imager
WO2015152829A1 (en) 2014-04-03 2015-10-08 Heptagon Micro Optics Pte. Ltd. Structured-stereo imaging assembly including separate imagers for different wavelengths
US10455212B1 (en) * 2014-08-25 2019-10-22 X Development Llc Projected pattern motion/vibration for depth sensing
USD733141S1 (en) 2014-09-10 2015-06-30 Faro Technologies, Inc. Laser scanner
US9881235B1 (en) 2014-11-21 2018-01-30 Mahmoud Narimanzadeh System, apparatus, and method for determining physical dimensions in digital images
US9841496B2 (en) 2014-11-21 2017-12-12 Microsoft Technology Licensing, Llc Multiple pattern illumination optics for time of flight system
TWI564754B (en) * 2014-11-24 2017-01-01 AVerMedia Technologies, Inc. Spatial motion sensing device and spatial motion sensing method
US9858703B2 (en) * 2014-12-18 2018-01-02 Facebook, Inc. System, device and method for providing user interface for a virtual reality environment
CN107864667A (en) * 2014-12-27 2018-03-30 Guardian Optical Technologies Ltd. System and method for detecting multiple vibrations of a surface
US10186034B2 (en) 2015-01-20 2019-01-22 Ricoh Company, Ltd. Image processing apparatus, system, image processing method, calibration method, and computer-readable recording medium
US9958758B2 (en) * 2015-01-21 2018-05-01 Microsoft Technology Licensing, Llc Multiple exposure structured light pattern
US9817159B2 (en) 2015-01-31 2017-11-14 Microsoft Technology Licensing, Llc Structured light pattern generation
JP6575795B2 (en) 2015-03-11 2019-09-18 Panasonic IP Management Co., Ltd. Human detection system
US10001583B2 (en) 2015-04-06 2018-06-19 Heptagon Micro Optics Pte. Ltd. Structured light projection using a compound patterned mask
US9525863B2 (en) 2015-04-29 2016-12-20 Apple Inc. Time-of-flight depth mapping with flexible scan pattern
US9947098B2 (en) * 2015-05-13 2018-04-17 Facebook, Inc. Augmenting a depth map representation with a reflectivity map representation
JP6566768B2 (en) * 2015-07-30 2019-08-28 Canon Inc. Information processing apparatus, information processing method, and program
US10012831B2 (en) 2015-08-03 2018-07-03 Apple Inc. Optical monitoring of scan parameters
US10043279B1 (en) 2015-12-07 2018-08-07 Apple Inc. Robust detection and classification of body parts in a depth map
US10154234B2 (en) * 2016-03-16 2018-12-11 Omnivision Technologies, Inc. Image sensor with peripheral 3A-control sensors and associated imaging system
KR101745651B1 (en) * 2016-03-29 2017-06-09 Korea Electronics Technology Institute System and method for recognizing hand gesture
EP3226042A1 (en) 2016-03-30 2017-10-04 Samsung Electronics Co., Ltd Structured light generator and object recognition apparatus including the same
US10474297B2 (en) 2016-07-20 2019-11-12 Ams Sensors Singapore Pte. Ltd. Projecting a structured light pattern onto a surface and detecting and responding to interactions with the same
US10241244B2 (en) 2016-07-29 2019-03-26 Lumentum Operations Llc Thin film total internal reflection diffraction grating for single polarization or dual polarization
US10481740B2 (en) 2016-08-01 2019-11-19 Ams Sensors Singapore Pte. Ltd. Projecting a structured light pattern onto a surface and detecting and responding to interactions with the same
US10073004B2 (en) 2016-09-19 2018-09-11 Apple Inc. DOE defect monitoring utilizing total internal reflection
US10366278B2 (en) 2016-09-20 2019-07-30 Apple Inc. Curvature-based face detector
TWI587206B (en) * 2016-11-24 2017-06-11 Industrial Technology Research Institute Interactive display device and system thereof
US10158845B2 (en) 2017-01-18 2018-12-18 Facebook Technologies, Llc Tileable structured light projection for wide field-of-view depth sensing
US10310281B1 (en) 2017-12-05 2019-06-04 K Laser Technology, Inc. Optical projector with off-axis diffractive element
US10317684B1 (en) 2018-01-24 2019-06-11 K Laser Technology, Inc. Optical projector with on axis hologram and multiple beam splitter

Family Cites Families (167)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE2951207C2 (en) 1978-12-26 1993-05-27 Canon K.K., Tokio/Tokyo, Jp
US4542376A (en) 1983-11-03 1985-09-17 Burroughs Corporation System for electronically displaying portions of several different images on a CRT screen through respective prioritized viewports
JPS6079108U (en) * 1983-11-08 1985-06-01
JPH0762869B2 (en) 1986-03-07 1995-07-05 Nippon Telegraph and Telephone Corporation Pattern position and shape measuring method by emission projection
US4843568A (en) 1986-04-11 1989-06-27 Krueger Myron W Real time perception of and response to the actions of an unencumbered participant/user
JPH0615968B2 (en) 1986-08-11 1994-03-02 Goro Matsumoto Three-dimensional shape measurement device
JP2714152B2 (en) * 1989-06-28 1998-02-16 Furuno Electric Co., Ltd. Object shape measuring method
US5075562A (en) 1990-09-20 1991-12-24 Eastman Kodak Company Method and apparatus for absolute Moire distance measurements using a grating printed on or attached to a surface
US5856871A (en) 1993-08-18 1999-01-05 Applied Spectral Imaging Ltd. Film thickness mapping using interferometric spectral imaging
GB9116151D0 (en) 1991-07-26 1991-09-11 Isis Innovation Three-dimensional vision system
US5483261A (en) 1992-02-14 1996-01-09 Itu Research, Inc. Graphical input controller and method with rear screen image detection
DE69226512D1 (en) 1992-03-12 1998-09-10 IBM Image processing method
US5636025A (en) 1992-04-23 1997-06-03 Medar, Inc. System for optically measuring the surface contour of a part using more fringe techniques
JP3353365B2 (en) * 1993-03-18 2002-12-03 Hamamatsu Photonics K.K. Displacement and displacement velocity measuring device
JPH10505171A (en) * 1994-09-05 1998-05-19 Mikoh Technology Limited Diffractive surface and methods for their production
US6041140A (en) 1994-10-04 2000-03-21 Synthonics, Incorporated Apparatus for interactive image correlation for three dimensional image production
JPH08186845A (en) 1994-12-27 1996-07-16 Nobuaki Yanagisawa Focal distance controlling stereoscopic-vision television receiver
US5630043A (en) 1995-05-11 1997-05-13 Cirrus Logic, Inc. Animated texture map apparatus and method for 3-D image displays
IL114278A (en) 1995-06-22 2010-06-16 Microsoft International Holdings B.V. Camera and method
WO1997004285A1 (en) 1995-07-18 1997-02-06 The Budd Company Moire interferometry system and method with extended imaging depth
JPH09261535A (en) 1996-03-25 1997-10-03 Sharp Corp Image pickup device
DE19638727A1 (en) 1996-09-12 1998-03-19 Ruedger Dipl Ing Rubbert Method of increasing the significance of the three-dimensional measurement of objects
JP3402138B2 (en) 1996-09-27 2003-04-28 Hitachi, Ltd. Liquid crystal display device
IL119341A (en) 1996-10-02 1999-09-22 Univ Ramot Phase-only filter for generating an arbitrary illumination pattern
IL119831A (en) 1996-12-15 2002-12-01 Cognitens Ltd Apparatus and method for 3d surface geometry reconstruction
JP2001507133A (en) * 1996-12-20 2001-05-29 Lifef/X Networks, Inc. Fast 3D image parameter display apparatus and method
US5838428A (en) 1997-02-28 1998-11-17 United States Of America As Represented By The Secretary Of The Navy System and method for high resolution range imaging with split light source and pattern mask
JPH10327433A (en) 1997-05-23 1998-12-08 Susumu Tachi Display device for composite image
US6008813A (en) 1997-08-01 1999-12-28 Mitsubishi Electric Information Technology Center America, Inc. (Ita) Real-time PC based volume rendering system
DE19736169A1 (en) 1997-08-20 1999-04-15 Fhu Hochschule Fuer Technik Method to measure deformation or vibration using electronic speckle pattern interferometry
US6101269A (en) 1997-12-19 2000-08-08 Lifef/X Networks, Inc. Apparatus and method for rapid 3D image parametrization
US6438272B1 (en) 1997-12-31 2002-08-20 The Research Foundation Of State University Of Ny Method and apparatus for three dimensional surface contouring using a digital video projection system
DE19815201A1 (en) 1998-04-04 1999-10-07 Link Johann & Ernst Gmbh & Co A measuring arrangement for the detection of dimensions of specimens, preferably hollow bodies, in particular of bores in workpieces, and methods for measuring such dimensions
US6731391B1 (en) 1998-05-13 2004-05-04 The Research Foundation Of State University Of New York Shadow moire surface measurement using Talbot effect
DE19821611A1 (en) 1998-05-14 1999-11-18 Syrinx Med Tech Gmbh Recording method for spatial structure of three-dimensional surface, e.g. for person recognition
US6377700B1 (en) 1998-06-30 2002-04-23 Intel Corporation Method and apparatus for capturing stereoscopic images using image sensors
JP3678022B2 (en) 1998-10-23 2005-08-03 Konica Minolta Sensing, Inc. Three-dimensional input device
US6084712A (en) 1998-11-03 2000-07-04 Dynamic Measurement And Inspection,Llc Three dimensional imaging using a refractive optic design
US8965898B2 (en) 1998-11-20 2015-02-24 Intheplay, Inc. Optimizations for live event, real-time, 3D object tracking
US6759646B1 (en) 1998-11-24 2004-07-06 Intel Corporation Color interpolation for a four color mosaic pattern
JP2001166810A (en) 1999-02-19 2001-06-22 Sanyo Electric Co Ltd Device and method for providing solid model
CN2364507Y (en) 1999-03-18 2000-02-16 Hong Kong Productivity Council Small non-contact symmetric input type three-D profile scanning head
US6259561B1 (en) 1999-03-26 2001-07-10 The University Of Rochester Optical system for diffusing light
GB2352901A (en) 1999-05-12 2001-02-07 Tricorder Technology Plc Rendering three dimensional representations utilising projected light patterns
EP1190213A1 (en) 1999-05-14 2002-03-27 3DMetrics, Incorporated Color structured light 3d-imaging system
US6751344B1 (en) 1999-05-28 2004-06-15 Champion Orthotic Investments, Inc. Enhanced projector system for machine vision
US6512385B1 (en) 1999-07-26 2003-01-28 Paul Pfaff Method for testing a device under test including the interference of two beams
US6268923B1 (en) 1999-10-07 2001-07-31 Integral Vision, Inc. Optical method and system for measuring three-dimensional surface topography of an object having a surface contour
JP2001141430A (en) 1999-11-16 2001-05-25 Fuji Photo Film Co Ltd Image pickup device and image processing device
LT4842B (en) * 1999-12-10 2001-09-25 Uab "Geola" Universal digital holographic printer and method
US6301059B1 (en) 2000-01-07 2001-10-09 Lucent Technologies Inc. Astigmatic compensation for an anamorphic optical system
US6700669B1 (en) 2000-01-28 2004-03-02 Zheng J. Geng Method and system for three-dimensional imaging using light pattern having multiple sub-patterns
US6937348B2 (en) 2000-01-28 2005-08-30 Genex Technologies, Inc. Method and apparatus for generating structural pattern illumination
JP4560869B2 (en) 2000-02-07 2010-10-13 Sony Corporation Glasses-free display system and backlight system
JP4265076B2 (en) 2000-03-31 2009-05-20 Oki Electric Industry Co., Ltd. Multi-angle camera and automatic photographing device
JP3423937B2 (en) * 2000-06-10 2003-07-07 Medison Co., Ltd. Distance calculation method between video frames, and three-dimensional image generating system and method
US6810135B1 (en) 2000-06-29 2004-10-26 Trw Inc. Optimized human presence detection through elimination of background interference
TW527518B (en) 2000-07-14 2003-04-11 Massachusetts Inst Technology Method and system for high resolution, ultra fast, 3-D imaging
US7227526B2 (en) 2000-07-24 2007-06-05 Gesturetek, Inc. Video-based image control system
US6686921B1 (en) 2000-08-01 2004-02-03 International Business Machines Corporation Method and apparatus for acquiring a set of consistent image maps to represent the color of the surface of an object
US6754370B1 (en) 2000-08-14 2004-06-22 The Board Of Trustees Of The Leland Stanford Junior University Real-time structured light range scanning of moving scenes
US6639684B1 (en) 2000-09-13 2003-10-28 Nextengine, Inc. Digitizer using intensity gradient to image features of three-dimensional objects
US6813440B1 (en) 2000-10-10 2004-11-02 The Hong Kong Polytechnic University Body scanner
JP3689720B2 (en) 2000-10-16 2005-08-31 Sumitomo Osaka Cement Co., Ltd. Three-dimensional shape measurement device
JP2002152776A (en) 2000-11-09 2002-05-24 Nippon Telegr & Teleph Corp <Ntt> Method and device for encoding and decoding distance image
JP2002191058A (en) 2000-12-20 2002-07-05 Olympus Optical Co Ltd Three-dimensional image acquisition device and three- dimensional image acquisition method
JP2002213931A (en) 2001-01-17 2002-07-31 Fuji Xerox Co Ltd Instrument and method for measuring three-dimensional shape
US6841780B2 (en) 2001-01-19 2005-01-11 Honeywell International Inc. Method and apparatus for detecting objects
JP2002365023A (en) * 2001-06-08 2002-12-18 Koji Okamoto Apparatus and method for measurement of liquid level
US6825985B2 (en) 2001-07-13 2004-11-30 Mems Optical, Inc. Autostereoscopic display with rotated microlens and method of displaying multidimensional images, especially color images
US6741251B2 (en) 2001-08-16 2004-05-25 Hewlett-Packard Development Company, L.P. Method and apparatus for varying focus in a scene
AU2003217587A1 (en) 2002-02-15 2003-09-09 Canesta, Inc. Gesture recognition system using depth perceptive sensors
US7369685B2 (en) 2002-04-05 2008-05-06 Identix Corporation Vision-based operating method and system
US7811825B2 (en) 2002-04-19 2010-10-12 University Of Washington System and method for processing specimens and images for optical tomography
AU2003253626A1 (en) 2002-06-07 2003-12-22 University Of North Carolina At Chapel Hill Methods and systems for laser based real-time structured light depth extraction
US7006709B2 (en) 2002-06-15 2006-02-28 Microsoft Corporation System and method deghosting mosaics using multiperspective plane sweep
US20040001145A1 (en) 2002-06-27 2004-01-01 Abbate Jeffrey A. Method and apparatus for multifield image generation and processing
US7874917B2 (en) 2003-09-15 2011-01-25 Sony Computer Entertainment Inc. Methods and systems for enabling depth and direction detection when interfacing with a computer program
US6859326B2 (en) 2002-09-20 2005-02-22 Corning Incorporated Random microlens array for optical beam shaping and homogenization
KR100624405B1 (en) 2002-10-01 2006-09-18 Samsung Electronics Co., Ltd. Substrate for mounting optical component and method for producing the same
US7194105B2 (en) 2002-10-16 2007-03-20 Hersch Roger D Authentication of documents and articles by moiré patterns
TWI291040B (en) 2002-11-21 2007-12-11 Solvision Inc Fast 3D height measurement method and system
US7103212B2 (en) 2002-11-22 2006-09-05 Strider Labs, Inc. Acquisition of three-dimensional images by an active stereo technique using locally unique patterns
US20040174770A1 (en) 2002-11-27 2004-09-09 Rees Frank L. Gauss-Rees parametric ultrawideband system
US7639419B2 (en) 2003-02-21 2009-12-29 Kla-Tencor Technologies, Inc. Inspection system using small catadioptric objective
US7127101B2 (en) 2003-03-10 2006-10-24 Cranul Technologies, Inc. Automatic selection of cranial remodeling device trim lines
US20040213463A1 (en) 2003-04-22 2004-10-28 Morrison Rick Lee Multiplexed, spatially encoded illumination system for determining imaging and range estimation
US7539340B2 (en) 2003-04-25 2009-05-26 Topcon Corporation Apparatus and method for three-dimensional coordinate measurement
ES2313036T3 (en) 2003-07-24 2009-03-01 Cognitens Ltd. Procedure and system for the reconstruction of the three-dimensional surface of an object.
CA2435935A1 (en) 2003-07-24 2005-01-24 Guylain Lemelin Optical 3d digitizer with enlarged non-ambiguity zone
US20050111705A1 (en) 2003-08-26 2005-05-26 Roman Waupotitsch Passive stereo sensing for 3D facial shape biometrics
US6934018B2 (en) * 2003-09-10 2005-08-23 Shearographics, Llc Tire inspection apparatus and method
US7187437B2 (en) * 2003-09-10 2007-03-06 Shearographics, Llc Plurality of light sources for inspection apparatus and method
US7112774B2 (en) 2003-10-09 2006-09-26 Avago Technologies Sensor Ip (Singapore) Pte. Ltd CMOS stereo imaging system and method
US7250949B2 (en) 2003-12-23 2007-07-31 General Electric Company Method and system for visualizing three-dimensional data
US20050135555A1 (en) 2003-12-23 2005-06-23 Claus Bernhard Erich H. Method and system for simultaneously viewing rendered volumes
US8139142B2 (en) 2006-06-01 2012-03-20 Microsoft Corporation Video manipulation of red, green, blue, distance (RGB-Z) data including segmentation, up-sampling, and background substitution techniques
US8134637B2 (en) 2004-01-28 2012-03-13 Microsoft Corporation Method and system to increase X-Y resolution in a depth (Z) camera using red, blue, green (RGB) sensing
US7961909B2 (en) 2006-03-08 2011-06-14 Electronic Scripting Products, Inc. Computer interface employing a manipulated object with absolute pose detection component and a display
KR100764419B1 (en) 2004-02-09 2007-10-05 Kang Chul-kwon Device for measuring 3d shape using irregular pattern and method for the same
US7427981B2 (en) 2004-04-15 2008-09-23 Avago Technologies General Ip (Singapore) Pte. Ltd. Optical device that measures distance between the device and a surface
US7308112B2 (en) 2004-05-14 2007-12-11 Honda Motor Co., Ltd. Sign based human-machine interaction
WO2006008637A1 (en) 2004-07-23 2006-01-26 Ge Healthcare Niagara, Inc. Method and apparatus for fluorescent confocal microscopy
US20060017656A1 (en) 2004-07-26 2006-01-26 Visteon Global Technologies, Inc. Image intensity control in overland night vision systems
KR101238608B1 (en) 2004-07-30 2013-02-28 Extreme Reality Ltd. A system and method for 3D space-dimension based image processing
US7120228B2 (en) 2004-09-21 2006-10-10 Jordan Valley Applied Radiation Ltd. Combined X-ray reflectometer and diffractometer
JP2006128818A (en) 2004-10-26 2006-05-18 Victor Co Of Japan Ltd Recording program and reproducing program corresponding to stereoscopic video and 3d audio, recording apparatus, reproducing apparatus and recording medium
IL165212A (en) 2004-11-15 2012-05-31 Elbit Systems Electro Optics Elop Ltd Device for scanning light
US7076024B2 (en) 2004-12-01 2006-07-11 Jordan Valley Applied Radiation, Ltd. X-ray apparatus with dual monochromators
US20060156756A1 (en) 2005-01-20 2006-07-20 Becke Paul E Phase change and insulating properties container and method of use
US20060221218A1 (en) 2005-04-05 2006-10-05 Doron Adler Image sensor with improved color filter
US7595892B2 (en) 2005-04-06 2009-09-29 Dimensional Photonics International, Inc. Multiple channel interferometric surface contour measurement system
US7560679B1 (en) 2005-05-10 2009-07-14 Siimpel, Inc. 3D camera
US7609875B2 (en) 2005-05-27 2009-10-27 Orametrix, Inc. Scanner system and method for mapping surface of three-dimensional object
EP1994503B1 (en) 2006-03-14 2017-07-05 Apple Inc. Depth-varying light fields for three dimensional sensing
WO2007105205A2 (en) 2006-03-14 2007-09-20 Prime Sense Ltd. Three-dimensional sensing using speckle patterns
US8050461B2 (en) 2005-10-11 2011-11-01 Primesense Ltd. Depth-varying light fields for three dimensional sensing
WO2007043036A1 (en) 2005-10-11 2007-04-19 Prime Sense Ltd. Method and system for object reconstruction
US8018579B1 (en) 2005-10-21 2011-09-13 Apple Inc. Three-dimensional imaging and display system
US20070133840A1 (en) 2005-11-04 2007-06-14 Clean Earth Technologies, Llc Tracking Using An Elastic Cluster of Trackers
US7856125B2 (en) 2006-01-31 2010-12-21 University Of Southern California 3D face reconstruction from 2D images
US7433024B2 (en) 2006-02-27 2008-10-07 Prime Sense Ltd. Range mapping using speckle decorrelation
US7869649B2 (en) 2006-05-08 2011-01-11 Panasonic Corporation Image processing device, image processing method, program, storage medium and integrated circuit
US8488895B2 (en) 2006-05-31 2013-07-16 Indiana University Research And Technology Corp. Laser scanning digital camera with pupil periphery illumination and potential for multiply scattered light imaging
EP2584530A2 (en) 2006-08-03 2013-04-24 Alterface S.A. Method and device for identifying and extracting images of multiple users, and for recognizing user gestures
US7737394B2 (en) 2006-08-31 2010-06-15 Micron Technology, Inc. Ambient infrared detection in solid state sensors
WO2008029345A1 (en) 2006-09-04 2008-03-13 Koninklijke Philips Electronics N.V. Method for determining a depth map from images, device for determining a depth map
US7256899B1 (en) 2006-10-04 2007-08-14 Ivan Faul Wireless methods and systems for three-dimensional non-contact shape sensing
WO2008061259A2 (en) 2006-11-17 2008-05-22 Celloptic, Inc. System, apparatus and method for extracting three-dimensional information of an object from received electromagnetic radiation
US8090194B2 (en) 2006-11-21 2012-01-03 Mantis Vision Ltd. 3D geometric modeling and motion capture using both single and dual imaging
US7990545B2 (en) 2006-12-27 2011-08-02 Cambridge Research & Instrumentation, Inc. Surface measurement of in-vivo subjects using spot projector
US7840031B2 (en) 2007-01-12 2010-11-23 International Business Machines Corporation Tracking a range of body movement based on 3D captured image streams of a user
US8350847B2 (en) 2007-01-21 2013-01-08 Primesense Ltd Depth mapping using multi-beam illumination
US20080212835A1 (en) 2007-03-01 2008-09-04 Amon Tavor Object Tracking by 3-Dimensional Modeling
TWI433052B (en) 2007-04-02 2014-04-01 Primesense Ltd Depth mapping using projected patterns
US8150142B2 (en) 2007-04-02 2012-04-03 Prime Sense Ltd. Depth mapping using projected patterns
US8488868B2 (en) 2007-04-03 2013-07-16 Her Majesty The Queen In Right Of Canada, As Represented By The Minister Of Industry, Through The Communications Research Centre Canada Generation of a depth map from a monoscopic color image for rendering stereoscopic still and video images
WO2008133959A1 (en) 2007-04-23 2008-11-06 California Institute Of Technology Single-lens 3-d imaging device using a polarization-coded aperture mask combined with a polarization-sensitive sensor
US7835561B2 (en) 2007-05-18 2010-11-16 Visiongate, Inc. Method for image processing and reconstruction of images for optical tomography
US8494252B2 (en) 2007-06-19 2013-07-23 Primesense Ltd. Depth mapping using optical elements having non-uniform focal characteristics
JP5160643B2 (en) 2007-07-12 2013-03-13 Thomson Licensing System and method for recognizing 3D object from 2D image
JP4412362B2 (en) 2007-07-18 2010-02-10 Funai Electric Co., Ltd. Compound eye imaging device
US20090060307A1 (en) 2007-08-27 2009-03-05 Siemens Medical Solutions Usa, Inc. Tensor Voting System and Method
DE102007045332B4 (en) 2007-09-17 2019-01-17 Seereal Technologies S.A. Holographic display for reconstructing a scene
KR100858034B1 (en) 2007-10-18 2008-09-10 SiliconFile Technologies Inc. One chip image sensor for measuring vitality of subject
US8166421B2 (en) 2008-01-14 2012-04-24 Primesense Ltd. Three-dimensional user interface
US8176497B2 (en) 2008-01-16 2012-05-08 Dell Products, Lp Method to dynamically provision additional computer resources to handle peak database workloads
US8384997B2 (en) 2008-01-21 2013-02-26 Primesense Ltd Optical pattern projection
WO2009093228A2 (en) 2008-01-21 2009-07-30 Prime Sense Ltd. Optical designs for zero order reduction
DE102008011350A1 (en) 2008-02-27 2009-09-03 Loeffler Technology Gmbh Apparatus and method for real-time detection of electromagnetic THz radiation
US8121351B2 (en) 2008-03-09 2012-02-21 Microsoft International Holdings B.V. Identification of objects in a 3D video using non/over reflective clothing
US8035806B2 (en) 2008-05-13 2011-10-11 Samsung Electronics Co., Ltd. Distance measuring sensor including double transfer gate and three dimensional color image sensor including the distance measuring sensor
US8456517B2 (en) 2008-07-09 2013-06-04 Primesense Ltd. Integrated processor for 3D mapping
US8462207B2 (en) 2009-02-12 2013-06-11 Primesense Ltd. Depth ranging with Moiré patterns
US8786682B2 (en) 2009-03-05 2014-07-22 Primesense Ltd. Reference image techniques for three-dimensional sensing
US8717417B2 (en) 2009-04-16 2014-05-06 Primesense Ltd. Three-dimensional mapping and imaging
US8503720B2 (en) 2009-05-01 2013-08-06 Microsoft Corporation Human body pose estimation
US8744121B2 (en) 2009-05-29 2014-06-03 Microsoft Corporation Device for identifying and tracking multiple humans over time
EP2275990B1 (en) 2009-07-06 2012-09-26 Sick Ag 3D sensor
WO2011013079A1 (en) 2009-07-30 2011-02-03 Primesense Ltd. Depth mapping based on pattern matching and stereoscopic information
WO2011031538A2 (en) 2009-08-27 2011-03-17 California Institute Of Technology Accurate 3d object reconstruction using a handheld device with a projected light pattern
US20110096182A1 (en) 2009-10-25 2011-04-28 Prime Sense Ltd Error Compensation in Three-Dimensional Mapping
US8830227B2 (en) 2009-12-06 2014-09-09 Primesense Ltd. Depth-based gain control
US8320621B2 (en) 2009-12-21 2012-11-27 Microsoft Corporation Depth projector system with integrated VCSEL array
US8982182B2 (en) 2010-03-01 2015-03-17 Apple Inc. Non-uniform spatial resource allocation for depth mapping
US8330804B2 (en) 2010-05-12 2012-12-11 Microsoft Corporation Scanned-beam depth mapping to 2D image
US8654152B2 (en) 2010-06-21 2014-02-18 Microsoft Corporation Compartmentalizing focus area within field of view

Also Published As

Publication number Publication date
WO2007105205A2 (en) 2007-09-20
WO2007105205A3 (en) 2009-04-23
KR20080111474A (en) 2008-12-23
CN101496033B (en) 2012-03-21
US8390821B2 (en) 2013-03-05
CN101496033A (en) 2009-07-29
JP2009531655A (en) 2009-09-03
US20130136305A1 (en) 2013-05-30
KR101331543B1 (en) 2013-11-20
US9063283B2 (en) 2015-06-23
US20090096783A1 (en) 2009-04-16

Similar Documents

Publication Publication Date Title
US6996291B2 (en) Systems and methods for correlating images in an image correlation system with reduced computational loads
JP5865910B2 (en) Depth camera based on structured light and stereoscopic vision
CN103053167B (en) Scanning projector and image capture module for 3D mapping
JP4709814B2 (en) Lens array imaging with optical diaphragm structure to suppress crosstalk
GB2564794B (en) Image-stitching for dimensioning
DE10241392B4 (en) Apparatus and method for detecting a three-dimensional relative movement
JP5001286B2 (en) Object reconstruction method and system
JP2013505508A (en) Remote control of computer equipment
EP1586857A1 (en) An optical device that measures distance between the device and a surface
US9922249B2 (en) Super-resolving depth map by moving pattern projector
CN101641964B (en) Mid-air video interaction device and its program
KR20110059247A (en) Apparatus and method for processing image using light field data
US20140218281A1 (en) Systems and methods for eye gaze determination
US8558873B2 (en) Use of wavefront coding to create a depth image
CN100432905C (en) Method and device for optical navigation
JP5189725B2 (en) Motion tracking method and system using interference pattern
US8600123B2 (en) System and method for contactless multi-fingerprint collection
KR20150093831A (en) Direct interaction system for mixed reality environments
US6873422B2 (en) Systems and methods for high-accuracy displacement determination in a correlation based position transducer
JP2011185872A (en) Information processor, and processing method and program of the same
KR101601331B1 (en) System and method for three-dimensional measurement of the shape of material object
US9826216B1 (en) Systems and methods for compact space-time stereo three-dimensional depth sensing
US20030226968A1 (en) Apparatus and method for inputting data
US4963017A (en) Variable depth range camera
JP2008241355A (en) Device for deriving distance of object

Legal Events

Date Code Title Description
A621 Written request for application examination

Free format text: JAPANESE INTERMEDIATE CODE: A621

Effective date: 20091225

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20100108

A977 Report on retrieval

Free format text: JAPANESE INTERMEDIATE CODE: A971007

Effective date: 20110930

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20111102

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20120116

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20120704

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20120710

A01 Written decision to grant a patent or to grant a registration (utility model)

Free format text: JAPANESE INTERMEDIATE CODE: A01

Effective date: 20121205

A61 First payment of annual fees (during grant procedure)

Free format text: JAPANESE INTERMEDIATE CODE: A61

Effective date: 20121228

S111 Request for change of ownership or part of ownership

Free format text: JAPANESE INTERMEDIATE CODE: R313113

R350 Written notification of registration of transfer

Free format text: JAPANESE INTERMEDIATE CODE: R350

R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250

R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250

R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250

R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250