US20140307055A1 - Intensity-modulated light pattern for active stereo


Info

Publication number
US20140307055A1
US20140307055A1
Authority
US
Grant status
Application
Patent type
Prior art keywords
pattern
intensity
set
points
fig
Prior art date
Legal status (assumed by Google Patents; not a legal conclusion)
Abandoned
Application number
US13915626
Inventor
Sing Bing Kang
Andreas Georgiou
Richard S. Szeliski
Current Assignee (listed assignee may be inaccurate; not legally verified)
Microsoft Technology Licensing LLC
Original Assignee
Microsoft Technology Licensing LLC
Priority date (assumed; not a legal conclusion)
Filing date
Publication date

Classifications

    • G06K9/00536 Recognising patterns in signals: classification; matching
    • B29C64/00 Additive manufacturing of three-dimensional [3D] objects
    • B29C64/386 Data acquisition or data processing for additive manufacturing
    • G01B11/22 Optical measuring arrangements for measuring depth
    • G01B11/25 Measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes, on the object
    • G01B11/2513 Pattern projection with several lines projected in more than one direction, e.g. grids
    • G01B11/2527 Projection by scanning of the object with phase change by in-plane movement of the pattern
    • G01B11/2545 Pattern projection with one projection direction and several detection directions, e.g. stereo
    • G02B27/4205 Diffraction optics with a diffractive optical element [DOE] contributing to image formation
    • G02B27/4233 Diffraction optics with a DOE contributing to a non-imaging application
    • G02B27/44 Grating systems; zone plate systems
    • G02B5/1895 Diffractive Fresnel lenses structurally combined with optical elements having dioptric power
    • G06F11/3024 Monitoring arrangements where the monitored component is a central processing unit [CPU]
    • G06F12/0207 Memory addressing or allocation with multidimensional access, e.g. row/column, matrix
    • G06F12/0292 User address space allocation using tables or multilevel address translation means
    • G06F3/0653 Monitoring storage devices or systems
    • G06F3/0659 Command handling arrangements, e.g. command buffers, queues, command scheduling
    • G06F3/0683 Storage interfaces: plurality of storage devices
    • G06F9/30043 LOAD or STORE instructions; Clear instruction
    • G06F9/30127 Register windows
    • G06K9/00201 Recognising three-dimensional objects, e.g. using range or tactile information
    • G06K9/0063 Recognising patterns in remote scenes, e.g. aerial images
    • G06K9/62 Methods or arrangements for recognition using electronic means
    • G06T1/60 Image data processing: memory management
    • G06T7/586 Depth or shape recovery from multiple images from multiple light sources, e.g. photometric stereo
    • G06T2207/30244 Camera pose
    • H04N13/0022; H04N13/02; H04N13/0239; H04N13/025; H04N13/0253; H04N13/0271 (legacy stereoscopic-video codes)
    • H04N13/128 Adjusting depth or disparity
    • H04N13/239 Stereoscopic image cameras using two 2D image sensors having a relative position equal to or related to the interocular distance
    • H04N13/25 Stereoscopic image cameras using two or more image sensors with different characteristics
    • H04N13/254 Stereoscopic image cameras in combination with electromagnetic radiation sources for illuminating objects
    • H04N13/271 Image signal generators wherein the generated image signals comprise depth maps or disparity maps
    • H04N17/002 Diagnosis, testing or measuring for television cameras
    • H04N5/2256 Cameras provided with illuminating means
    • H04N5/332 Multispectral imaging comprising at least a part of the infrared region
    • H04N9/045 Picture signal generators using solid-state devices
    • H04N2013/0081 Depth or disparity estimation from stereoscopic image signals
    • A63F13/213 Video game input arrangements comprising photodetecting means, e.g. cameras

Abstract

The subject disclosure is directed towards projecting light in a pattern in which the pattern contains components (e.g., spots) having different intensities. The pattern may be based upon a grid of initial points associated with first intensities and points between the initial points with second intensities, and so on. The pattern may be rotated relative to cameras that capture the pattern, with the captured images used in active depth sensing based upon stereo matching of dots in stereo images.

Description

    CROSS-REFERENCE TO RELATED APPLICATION
  • [0001]
    The present application claims priority to U.S. provisional patent application Ser. No. 61/812,232, filed Apr. 15, 2013.
  • BACKGROUND
  • [0002]
    In active depth sensing, such as used by active stereo systems, a projector projects patterns of light such as infrared (IR) dots or lines to illuminate a scene being sensed. The projected patterns are then captured by a camera/sensor (two or more in stereo systems), with the image (or images) processed to compute a depth map or the like.
  • [0003]
    For example, in stereo systems, stereo cameras capture two images from different viewpoints. Then, for example, one way to perform depth estimation with a stereo pair of images is to find correspondences of local patches between the images, e.g., to correlate each projected and sensed local dot pattern in the left image with a counterpart local dot pattern in the right image. Once matched, the projected patterns within the images may be correlated with one another, and disparities between one or more features of the correlated dots used to estimate (e.g., triangulate) a depth to that particular dot pair.
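The triangulation step described above follows the standard rectified-stereo relation depth = focal length × baseline / disparity. The function below is an illustrative sketch of that relation; the function name and the numeric values in the example are assumptions for demonstration, not values from this disclosure.

```python
def depth_from_disparity(disparity_px, focal_length_px, baseline_m):
    """Triangulate depth for a matched dot pair in a rectified stereo rig.

    disparity_px: horizontal offset (pixels) between the dot's position
                  in the left image and in the right image.
    focal_length_px: camera focal length expressed in pixels.
    baseline_m: distance between the two camera centers, in meters.
    """
    if disparity_px <= 0:
        raise ValueError("matched dots must have positive disparity")
    return focal_length_px * baseline_m / disparity_px

# Example (hypothetical rig): a 50-pixel disparity with a 600-pixel
# focal length and a 7.5 cm baseline triangulates to 0.9 m.
print(depth_from_disparity(50, 600, 0.075))
```

Note how depth varies inversely with disparity: halving the measured disparity doubles the estimated depth, which is why mismatched dot pairs produce large depth errors.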
  • [0004]
IR lasers have been used to produce such patterns. To allow the stereo system to work over a wide range of depths, more powerful lasers (around 1 W or more) are needed. At such power levels, multi-mode lasers are more cost-effective. However, using multi-mode lasers results in the design pattern looking blurrier at closer distances. This is problematic in active stereo depth sensing, because correlating the correct left and right pairs of dots is subject to more errors when the dots are blurred.
  • SUMMARY
  • [0005]
    This Summary is provided to introduce a selection of representative concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used in any way that would limit the scope of the claimed subject matter.
  • [0006]
    Briefly, one or more of various aspects of the subject matter described herein are directed towards an intensity-modulated light pattern for active sensing. A projector including a laser and a diffractive optical component projects a light pattern towards a scene. The diffractive optical component is configured to output the light pattern as a plurality of sets of sub-patterns, with each set corresponding to a different range of intensities.
  • [0007]
    One or more aspects are directed towards generating a grid comprising a first set of points, associating each point in the first set with an intensity value that is within a first intensity range, adding a second set of points between subsets of points of the first set of points and associating each point in the second set with an intensity value that is within a second intensity range. This subdivision process may be repeated if necessary. A diffractive optical component may be encoded based upon the first set of points and the second set of points. Another variant is to generate a random set of points with approximately uniform density throughout, with a random subset of them having a specified range of intensities, and the rest having a different range of intensities.
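The grid-and-subdivision scheme of the preceding paragraph can be sketched as follows. This is a minimal illustration under stated assumptions: a square base grid (the disclosure also describes triangular grids), midpoints inserted only between horizontally adjacent points, and arbitrary example intensity ranges.

```python
import random

def generate_pattern(grid_size=8, spacing=4.0,
                     first_range=(0.8, 1.0), second_range=(0.3, 0.5)):
    """Build a two-level intensity-modulated point pattern.

    First set: points on a coarse grid, each associated with an
    intensity drawn from the first (higher) range.
    Second set: points added midway between adjacent first-set points,
    each associated with an intensity drawn from the second (lower)
    range. Further subdivision levels would repeat this step on the
    combined set. Returns a list of (x, y, intensity) tuples.
    """
    points = []
    # First set: coarse grid, first intensity range.
    for i in range(grid_size):
        for j in range(grid_size):
            points.append((i * spacing, j * spacing,
                           random.uniform(*first_range)))
    # Second set: midpoints between neighboring first-set points,
    # second intensity range.
    for i in range(grid_size):
        for j in range(grid_size - 1):
            points.append((i * spacing, j * spacing + spacing / 2,
                           random.uniform(*second_range)))
    return points

pattern = generate_pattern()
```

Because every second-set point sits between two first-set points, neighboring points along a row alternate between the two intensity ranges, giving the local intensity variation the matcher relies on.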
  • [0008]
One or more aspects are directed towards projecting light through a diffractive optical component to project a pattern comprising a first set of spots corresponding to a first intensity range, and a second set of spots corresponding to a second intensity range. The positions of the spots in the first set are based upon an initial grid layout, and the positions of spots in the second set of spots are based upon the positions of the first set of spots. The first set of spots and the second set of spots are sensed as left and right stereo camera images. The images are processed to correlate spots in the left image with spots in the right image, in which scanlines of the images are not aligned with the initial grid layout.
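The left/right correlation step can be sketched as a scanline search that minimizes a patch-difference cost, here sum of absolute differences (SAD). This is an assumed, simplified matcher for illustration; the disclosure does not prescribe a particular cost function, and the function and parameter names are hypothetical.

```python
def match_along_scanline(left, right, row, col, patch=1, max_disp=8):
    """Find the disparity of the patch centered at (row, col) of the
    left image by searching along the same scanline of the right image
    (rectified stereo), minimizing sum of absolute differences (SAD).
    `left` and `right` are 2D lists of pixel intensities."""
    def sad(c_right):
        total = 0
        for dr in range(-patch, patch + 1):
            for dc in range(-patch, patch + 1):
                total += abs(left[row + dr][col + dc]
                             - right[row + dr][c_right + dc])
        return total

    best_disp, best_cost = 0, float("inf")
    for d in range(0, max_disp + 1):
        c = col - d          # candidate column in the right image
        if c - patch < 0:    # candidate patch would fall off the image
            break
        cost = sad(c)
        if cost < best_cost:
            best_disp, best_cost = d, cost
    return best_disp
```

With an intensity-modulated dot pattern, neighboring patches differ in peak intensity as well as dot placement, so the cost minimum is more distinctive and mismatches are less likely.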
  • [0009]
    Other advantages may become apparent from the following detailed description when taken in conjunction with the drawings.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • [0010]
The present invention is illustrated by way of example and not limitation in the accompanying figures, in which like reference numerals indicate similar elements and in which:
  • [0011]
    FIG. 1 is a block diagram representing example components that may be used to project and capture a light pattern modulated with different intensities, according to one or more example implementations.
  • [0012]
    FIG. 2 is a representation of an example of projecting dots having different intensities into a scene, according to one or more example implementations.
  • [0013]
FIGS. 3A and 3B are representations of how a pattern may be designed based upon a grid, with subdivision of points aligned via the grid, to facilitate having points with different intensities, according to one or more example implementations.
  • [0014]
    FIG. 4 is a representation of further subdivision of points having different intensities, according to one or more example implementations.
  • [0015]
FIG. 5 is a flow diagram representing example steps in laying out points of different intensities, such as for encoding corresponding data into a diffractive optical element, according to one or more example implementations.
  • [0016]
    FIG. 6 is a block diagram representing example components of a device that projects a diffraction pattern of light having different intensities, according to one example implementation.
  • [0017]
    FIGS. 7 and 8 are representations of how non-rotation versus rotation of a projected pattern affects scanning of captured images that include the projected pattern, according to one or more example implementations.
  • [0018]
    FIG. 9 is a representation of how dots of different intensities may be captured in a part of an image, and moved over time, according to one or more example implementations.
  • [0019]
    FIG. 10 is a block diagram representing an exemplary non-limiting computing system or operating environment, in the form of a gaming system, into which one or more aspects of various embodiments described herein can be implemented.
  • DETAILED DESCRIPTION
  • [0020]
    Various aspects of the technology described herein are generally directed towards having a light pattern projected into a scene, in which the light pattern is configured to provide for enhanced pattern matching, including at different depths to illuminated objects. In one aspect, a light pattern includes intermixed points of light (e.g., spots such as dots) of different intensities. The technology also leverages the depth-dependent appearance of the pattern by having the pattern include points that are semi-randomly distributed.
  • [0021]
    As will be understood, the peak intensities of neighboring points are different. This results in local changes in intensity independent of the scene depth, to allow stereo matching to function properly.
  • [0022]
It should be understood that any of the examples herein are non-limiting. For example, the projected light pattern may use spots, generally exemplified herein as dots, but the dots may be of any shape. As another example, the dots are exemplified as arranged according to a triangular grid; however, this is only one example, and other arrangements (e.g., a hexagonal grid) may be implemented. Different rotation angles of the patterns (described below) and different ranges or values of intensity peaks (e.g., for large, medium and small intensities) from those described herein may be used, and so on. As such, the present invention is not limited to any particular embodiments, aspects, concepts, structures, functionalities or examples described herein. Rather, any of the embodiments, aspects, concepts, structures, functionalities or examples described herein are non-limiting, and the present invention may be used in various ways that provide benefits and advantages in active depth sensing and image processing in general.
  • [0023]
FIG. 1 shows an example system in which stereo cameras 102 and 103 of an image capturing system or subsystem 104 capture images synchronized in time (e.g., the cameras are “genlocked”). In one implementation the cameras capture infrared (IR) images, as IR does not affect the visible appearance of the scene (which is highly advantageous, such as in video conferencing and object modeling applications). As can be readily appreciated, in some scenarios such as studio environments, more than two IR depth-sensing cameras may be present. Further, one or more other cameras may be present in a given system, such as RGB cameras, and such other cameras may be used to help correlate dot pairs in different stereo images, for example.
  • [0024]
In FIG. 1, a projector 106 is shown that projects an IR pattern onto a scene, such as a pattern of spots (e.g., dots) or a line pattern, although other spot shapes and/or pattern types may be used. For purposes of brevity, dots are generally described hereinafter. By illuminating the scene with a relatively large number of distributed infrared dots, the cameras 102 and 103 capture texture data as part of the infrared image data. As described herein, to facilitate more accurate dot matching between left and right images, the dots in the pattern are arranged with different intensities, and also the pattern may be rotated relative to the cameras. The pattern with intensity modulation may be designed (e.g., encoded) into a diffractive optical component (a diffractive optical element or combination of elements) that disperses laser light into the scene, e.g., as a dot pattern.
  • [0025]
FIG. 2 exemplifies this projection concept. The projector 106, represented as a circle in between the stereo cameras 102 and 103, projects a dot pattern onto a scene 222. The dot pattern is modulated with different intensities, and the dot pattern may be rotated (e.g., fifteen degrees) relative to the cameras' orientation. The cameras 102 and 103 capture the dots as they reflect off of object surfaces in the scene 222 and (possibly) the background. In general, one or more features of the captured dots are indicative of the distance to the reflective surface. Note that FIG. 2 is not intended to be to scale, nor to convey any sizes, distances, dot distribution pattern, dot density and so on. However, it is understood that different intensities exist in the dot pattern, and that the dot pattern may be rotated relative to the cameras.
  • [0026]
Note that the placement of the projector 106 may be outside the cameras (e.g., FIG. 1), or in between the cameras (FIG. 2), or at another location, such as above or below one or both of the cameras. The examples herein are in no way limiting of where the cameras and/or projector are located relative to one another, and similarly, the cameras may be positioned differently relative to each other.
  • [0027]
    In one implementation the example image capturing system or subsystem 104 includes a controller 108 that via a camera interface 110 controls the operation of the cameras 102 and 103. The exemplified controller via a projector interface 112 also controls the operation of the projector 106. For example, the cameras 102 and 103 are synchronized (genlocked) to capture stereo images at the same time, such as by a controller signal (or different signals for each camera). The projector 106 may be turned on or off, pulsed, and otherwise have one or more parameters controllably varied, for example.
  • [0028]
The images 116 captured by the cameras 102 and 103 are provided to an image processing system or subsystem 118. In some implementations, the image processing system 118 and image capturing system or subsystem 104, or parts thereof, may be combined into a single device. For example, a home entertainment device may include all of the components shown in FIG. 1 (as well as others not shown). In other implementations, parts (or all) of the image capturing system or subsystem 104, such as the cameras and projector, may be a separate device that couples to a gaming console, personal computer, mobile device, dedicated processing device and/or the like. Indeed, a gaming console is exemplified in FIG. 10 as one environment that may be used for processing images into depth data.
  • [0029]
    The image processing system or subsystem 118 includes a processor 120 and a memory 122 containing one or more image processing algorithms 124. One or more depth maps may be obtained via the algorithms 124 such as by extracting matching features (such as dots and/or lines). For example, as is known, such as described in U.S. published patent application no. 20130100256, hereby incorporated by reference, different dots or other projected elements have different features when captured, including intensity (brightness), depending on the distance from the projector to the reflective surfaces and/or the distance from the camera to the reflective surfaces. As is also known, the dots in different images taken at the same time (e.g., with genlocked stereo cameras) may be correlated with one another, such as by matching small (e.g., RGB) patches between RGB color images of the same scene captured at the same instant. Thus, with captured images, known algorithms can determine individual depth-related features (depth maps) by matching projected light components (e.g., dots) in each image, using disparities of certain features between matched dots to determine depths. This is one way in which a depth map may be obtained via stereo image processing.
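The disparity-to-depth relationship referenced above can be illustrated with a minimal sketch; this is the standard rectified-stereo relation Z = f·B/d, not the particular matching algorithm of the incorporated application, and the example focal length and baseline values are illustrative assumptions.

```python
# Illustrative sketch only: for rectified, genlocked stereo cameras, a
# matched dot's horizontal disparity (pixel offset between the left and
# right images) maps to depth via Z = f * B / d.

def depth_from_disparity(disparity_px, focal_length_px, baseline_m):
    """Return depth in meters for one matched feature.

    disparity_px    -- horizontal offset (pixels) between left/right matches
    focal_length_px -- camera focal length expressed in pixels
    baseline_m      -- distance between the two camera centers in meters
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a valid match")
    return focal_length_px * baseline_m / disparity_px

# Example (assumed values): a dot shifted 40 px, f = 800 px, B = 0.1 m
print(depth_from_disparity(40.0, 800.0, 0.1))  # 2.0 (meters)
```

Repeating this per matched dot over the captured images yields a depth map, with larger disparities corresponding to nearer surfaces.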
  • [0030]
Also shown in FIG. 1 is an interface 132 to the image processing system or subsystem 118, such as for connecting a keyboard, game controller, display, pointing device, microphone for speech commands and/or the like as appropriate for a user to interact with an application or the like that uses the depth map.
  • [0031]
FIGS. 3A and 3B, along with FIG. 4, show the concept of subdivision, in which dots of larger intensity (larger dots with an “X” shaped cross therein) are arranged in a triangular grid layout 330 (FIG. 3A). In FIG. 3B, each triangle of the larger intensity dots is subdivided by triangles of lesser intensity dots (circles), providing the pattern 332. In FIG. 4, each of those sub-triangle sub-patterns is further subdivided by even lesser intensity dots (smaller-sized circles relative to those in FIG. 3B). Thus, FIG. 4 represents a triangular pattern 440 of higher intensity dots, medium intensity dots, and lower intensity dots. The dot sizes relative to the distribution pattern and each other are only intended to illustrate distribution of dots of differing relative intensities or intensity ranges, and are not intended to convey any particular intensity levels or ranges.
  • [0032]
FIG. 5 summarizes subdivision, beginning at step 502 where in this example a triangular grid of a specified between-vertex distance is generated, e.g., comprising regular triangles or substantially regular triangles (or other polygons). The intensity peaks are set to a high value; however, rather than being the same intensity value for each point, the high values may be randomly set to be within a high range (step 504), e.g., between 200-255 (with 255 being the maximum intensity). Note that as used herein, an intensity “range” includes a range with as little as one single fixed intensity value, e.g., a range may be from 200 to 200.
  • [0033]
Step 506 represents adding points between the previously generated points, e.g., as smaller sets of triangles (a “subdivision”) such as shown in FIG. 3B. Step 508 randomly sets the intensity peaks of these points to be within a lower range, e.g., between 100-125. Note that these example intensity ranges do not overlap one another, but it is feasible to have different ranges overlap to an extent; if so, weighted random techniques may be used to bias most values in overlapping ranges away from one another.
  • [0034]
Step 510 evaluates whether subdivision has been completed to the lowest desired level, which is configurable. Thus, by returning to step 506, another subdivision of points may optionally be added (such as exemplified in FIG. 4), with an even lower range of intensities, and so on, until the desired pattern and sets of intensities/intensity ranges are reached. The result is a projection pattern that contains sub-patterns, in this example different sets of triangular sub-patterns, such as a larger intensity sub-pattern set and a smaller-intensity sub-pattern set (FIG. 3B), or small, medium and large intensity sub-pattern sets (FIG. 4), and so on. In general, the sets/sub-patterns are interleaved via subdivision.
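The subdivision loop of steps 502-510 can be sketched as follows. The example intensity ranges (200-255 for the first set, 100-125 for the next) come from the text; the triangle-based data layout, the per-level range list, and the function names are illustrative assumptions rather than the encoded design itself.

```python
import random

# Illustrative sketch of the FIG. 5 subdivision procedure (steps 502-510):
# start from a triangular grid cell, assign the first set of points random
# peak intensities in a high range, then repeatedly subdivide each triangle,
# assigning each new (midpoint) set a lower intensity range.

def midpoint(a, b):
    return ((a[0] + b[0]) / 2.0, (a[1] + b[1]) / 2.0)

def generate_pattern(base_triangle, level_ranges, rng=random.Random(0)):
    """Return a {point: peak_intensity} mapping for nested sub-patterns.

    base_triangle -- three (x, y) vertices of an initial grid triangle
    level_ranges  -- one (low, high) intensity range per subdivision level,
                     e.g. [(200, 255), (100, 125)] per the text's example
    """
    intensities = {}
    triangles = [tuple(base_triangle)]
    # Step 504: first set of points gets values in the high range.
    for v in base_triangle:
        intensities[v] = rng.randint(*level_ranges[0])
    # Steps 506-510: one pass per remaining (lower) intensity range.
    for low, high in level_ranges[1:]:
        next_tris = []
        for a, b, c in triangles:
            ab, bc, ca = midpoint(a, b), midpoint(b, c), midpoint(c, a)
            for p in (ab, bc, ca):  # step 508: lower-range intensities
                intensities.setdefault(p, rng.randint(low, high))
            next_tris += [(a, ab, ca), (ab, b, bc), (ca, bc, c), (ab, bc, ca)]
        triangles = next_tris
    return intensities
```

Applying this to a full grid rather than a single triangle, and adding a third range, yields the small/medium/large interleaved sets of FIG. 4.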
  • [0035]
    Note that once the intensity-modulated pattern is designed, such as via the example steps of FIG. 5, the diffractive optical element or elements may be manufactured in known ways to output that pattern. Various eye safe diffractive optical element arrangements are described in the aforementioned provisional patent application Ser. No. 61/812,232. However, as another (optional) step, step 512 represents pseudo-randomly rearranging (e.g., slightly “jittering”) at least some of the points into modified positions, such as to further reduce repetition intervals. Typically this repositioning of a point is small relative to its respective triangle (or other grid pattern), whereby the regular polygon or substantially regular polygon is now modified to be only generally/approximately regular.
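The optional jitter of step 512 can be sketched as a small pseudo-random displacement of each point. The displacement bound (ten percent of the grid spacing here) is an illustrative assumption; the text specifies only that the repositioning is small relative to the respective triangle.

```python
import random

# Illustrative sketch of step 512: perturb each point by a small fraction
# of the grid spacing so repetition intervals in the pattern grow, while
# each triangle stays only approximately regular. max_frac is assumed.

def jitter_points(points, spacing, rng=random.Random(1), max_frac=0.1):
    """Return points displaced by at most max_frac * spacing per axis."""
    out = []
    for x, y in points:
        dx = rng.uniform(-max_frac, max_frac) * spacing
        dy = rng.uniform(-max_frac, max_frac) * spacing
        out.append((x + dx, y + dy))
    return out
```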
  • [0036]
FIG. 6 is one such example configuration in which a diffractive optical component 660 (e.g., one or more diffractive optical elements) is configured to output an intensity-modulated illumination pattern. The component 660 may be built into or coupled to a device 662, such as built into or part of a home entertainment device. A laser 664 (e.g., multimode) provides the light source. Stereo cameras 666A and 666B capture the reflection from an illuminated object (e.g., person 668) and use the captured images as desired; note that a single camera may be used in a given implementation.
  • [0037]
As represented in FIG. 6, the diffractive optical component 660 disperses the laser light into a large number of spots based upon the pattern designed as described herein, such as on the order of 100,000 dots. Some of the pattern is represented in FIG. 6 by the solid lines coming from the element and by the dots on the object/person 668 and image plane 670. Note that as with any of the figures herein, neither FIG. 6 nor its components are intended to be to scale or convey any particular distance, distribution and/or the like.
  • [0038]
FIGS. 7 and 8 illustrate another aspect, namely rotation of the modulated patterns. In FIG. 7, camera-captured dots of part of a left pattern 770L are shown alongside parts of a right pattern 770R. In general, dot A correlates with dot C, and is supposed to be matched with it. However, when scanning a line of the (e.g., right) image of pixels from left-to-right to match patterns, there is significant repetition, whereby dot B (or possibly dot D) may be erroneously matched with dot A.
  • [0039]
    In FIG. 8, camera-captured dots of part of a rotated left pattern 880L are shown alongside parts of a rotated right pattern 880R. As can be seen, when scanning a line of pixels to match dot A, for example, neither dot B nor dot D will be encountered. In this way, the rotation (e.g., by fifteen degrees in this example, although other rotational angles may be used) helps to provide a larger repetition interval along the scanline (x-direction).
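The effect of rotating the pattern can be sketched numerically: in an axis-aligned grid, many dots of the same set fall on one scanline (same y coordinate), inviting false matches, whereas after rotation they spread across scanlines. The fifteen-degree angle follows the text's example; the five-dot row is an illustrative assumption.

```python
import math

# Illustrative sketch of the FIGS. 7-8 rotation idea: rotating the grid
# moves dots that shared a scanline onto distinct scanlines, enlarging
# the repetition interval along the x-direction.

def rotate(points, degrees):
    """Rotate (x, y) points about the origin by the given angle."""
    t = math.radians(degrees)
    c, s = math.cos(t), math.sin(t)
    return [(x * c - y * s, x * s + y * c) for x, y in points]

row = [(float(i), 0.0) for i in range(5)]       # five dots on one scanline
rotated = rotate(row, 15.0)                     # rotate per the example
scanlines_before = len({round(y, 6) for _, y in row})
scanlines_after = len({round(y, 6) for _, y in rotated})
print(scanlines_before, scanlines_after)  # 1 5
```

Each dot's y coordinate becomes x·sin(15°), so dots that previously repeated along one scanline no longer compete during a horizontal scan.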
  • [0040]
Rotation and intensity distribution are generally shown in the partial image representation 990 of FIG. 9, where the dots are illustrated by concentric circles, and (some relative) intensity by the sizes thereof. The pixels are represented by the square blocks behind the dots. Note that in FIG. 9 the different diameters of the circles only suggest changes in intensity; the size of the circles and the grid squares are not intended to convey any particular scale, resolution, or the like, nor any particular intensity value or relative intensity values (other than within at least two different ranges). Further, the density of the dots and/or their sizes or distribution are not intended to represent any actual density and/or distribution.
  • [0041]
    As can be seen, there is provided a light pattern modulated with different intensities. The pattern may be based upon a grid, and projected such that the cameras that capture the light pattern are not aligned with the grid on which the pattern was based. The intensity-modulated pattern provides for more robust stereo matching/depth sensing.
  • Example Operating Environment
  • [0042]
    It can be readily appreciated that the above-described implementation and its alternatives may be implemented on any suitable computing device, including a gaming system, personal computer, tablet, DVR, set-top box, smartphone and/or the like. Combinations of such devices are also feasible when multiple such devices are linked together. For purposes of description, a gaming (including media) system is described as one exemplary operating environment hereinafter.
  • [0043]
    FIG. 10 is a functional block diagram of an example gaming and media system 1000 and shows functional components in more detail. Console 1001 has a central processing unit (CPU) 1002, and a memory controller 1003 that facilitates processor access to various types of memory, including a flash Read Only Memory (ROM) 1004, a Random Access Memory (RAM) 1006, a hard disk drive 1008, and portable media drive 1009. In one implementation, the CPU 1002 includes a level 1 cache 1010, and a level 2 cache 1012 to temporarily store data and hence reduce the number of memory access cycles made to the hard drive, thereby improving processing speed and throughput.
  • [0044]
    The CPU 1002, the memory controller 1003, and various memory devices are interconnected via one or more buses (not shown). The details of the bus that is used in this implementation are not particularly relevant to understanding the subject matter of interest being discussed herein. However, it will be understood that such a bus may include one or more of serial and parallel buses, a memory bus, a peripheral bus, and a processor or local bus, using any of a variety of bus architectures. By way of example, such architectures can include an Industry Standard Architecture (ISA) bus, a Micro Channel Architecture (MCA) bus, an Enhanced ISA (EISA) bus, a Video Electronics Standards Association (VESA) local bus, and a Peripheral Component Interconnects (PCI) bus also known as a Mezzanine bus.
  • [0045]
    In one implementation, the CPU 1002, the memory controller 1003, the ROM 1004, and the RAM 1006 are integrated onto a common module 1014. In this implementation, the ROM 1004 is configured as a flash ROM that is connected to the memory controller 1003 via a Peripheral Component Interconnect (PCI) bus or the like and a ROM bus or the like (neither of which are shown). The RAM 1006 may be configured as multiple Double Data Rate Synchronous Dynamic RAM (DDR SDRAM) modules that are independently controlled by the memory controller 1003 via separate buses (not shown). The hard disk drive 1008 and the portable media drive 1009 are shown connected to the memory controller 1003 via the PCI bus and an AT Attachment (ATA) bus 1016. However, in other implementations, dedicated data bus structures of different types can also be applied in the alternative.
  • [0046]
    A three-dimensional graphics processing unit 1020 and a video encoder 1022 form a video processing pipeline for high speed and high resolution (e.g., High Definition) graphics processing. Data are carried from the graphics processing unit 1020 to the video encoder 1022 via a digital video bus (not shown). An audio processing unit 1024 and an audio codec (coder/decoder) 1026 form a corresponding audio processing pipeline for multi-channel audio processing of various digital audio formats. Audio data are carried between the audio processing unit 1024 and the audio codec 1026 via a communication link (not shown). The video and audio processing pipelines output data to an A/V (audio/video) port 1028 for transmission to a television or other display/speakers. In the illustrated implementation, the video and audio processing components 1020, 1022, 1024, 1026 and 1028 are mounted on the module 1014.
  • [0047]
    FIG. 10 shows the module 1014 including a USB host controller 1030 and a network interface (NW I/F) 1032, which may include wired and/or wireless components. The USB host controller 1030 is shown in communication with the CPU 1002 and the memory controller 1003 via a bus (e.g., PCI bus) and serves as host for peripheral controllers 1034. The network interface 1032 provides access to a network (e.g., Internet, home network, etc.) and may be any of a wide variety of various wire or wireless interface components including an Ethernet card or interface module, a modem, a Bluetooth module, a cable modem, and the like.
  • [0048]
    In the example implementation depicted in FIG. 10, the console 1001 includes a controller support subassembly 1040, for supporting four game controllers 1041(1)-1041(4). The controller support subassembly 1040 includes any hardware and software components needed to support wired and/or wireless operation with an external control device, such as for example, a media and game controller. A front panel I/O subassembly 1042 supports the multiple functionalities of a power button 1043, an eject button 1044, as well as any other buttons and any LEDs (light emitting diodes) or other indicators exposed on the outer surface of the console 1001. The subassemblies 1040 and 1042 are in communication with the module 1014 via one or more cable assemblies 1046 or the like. In other implementations, the console 1001 can include additional controller subassemblies. The illustrated implementation also shows an optical I/O interface 1048 that is configured to send and receive signals (e.g., from a remote control 1049) that can be communicated to the module 1014.
  • [0049]
    Memory units (MUs) 1050(1) and 1050(2) are illustrated as being connectable to MU ports “A” 1052(1) and “B” 1052(2), respectively. Each MU 1050 offers additional storage on which games, game parameters, and other data may be stored. In some implementations, the other data can include one or more of a digital game component, an executable gaming application, an instruction set for expanding a gaming application, and a media file. When inserted into the console 1001, each MU 1050 can be accessed by the memory controller 1003.
  • [0050]
    A system power supply module 1054 provides power to the components of the gaming system 1000. A fan 1056 cools the circuitry within the console 1001.
  • [0051]
    An application 1060 comprising machine instructions is typically stored on the hard disk drive 1008. When the console 1001 is powered on, various portions of the application 1060 are loaded into the RAM 1006, and/or the caches 1010 and 1012, for execution on the CPU 1002. In general, the application 1060 can include one or more program modules for performing various display functions, such as controlling dialog screens for presentation on a display (e.g., high definition monitor), controlling transactions based on user inputs and controlling data transmission and reception between the console 1001 and externally connected devices.
  • [0052]
The gaming system 1000 may be operated as a standalone system by connecting the system to a high definition monitor, a television, a video projector, or other display device. In this standalone mode, the gaming system 1000 enables one or more players to play games, or enjoy digital media, e.g., by watching movies, or listening to music. However, with the integration of broadband connectivity made available through the network interface 1032, the gaming system 1000 may further be operated as a participating component in a larger network gaming community or system.
  • CONCLUSION
  • [0053]
    While the invention is susceptible to various modifications and alternative constructions, certain illustrated embodiments thereof are shown in the drawings and have been described above in detail. It should be understood, however, that there is no intention to limit the invention to the specific forms disclosed, but on the contrary, the intention is to cover all modifications, alternative constructions, and equivalents falling within the spirit and scope of the invention.

Claims (20)

    What is claimed is:
  1. A system comprising, a projector that projects a light pattern towards a scene, the projector including a laser and a diffractive optical component, the diffractive optical component configured to output the light pattern as a plurality of sets of sub-patterns, each set corresponding to a different range of intensities.
  2. The system of claim 1 wherein the sets comprise differently-sized sub-patterns, including large-sized sub-patterns and at least one set comprising smaller-sized sub-patterns.
  3. The system of claim 2 wherein the sets of differently-sized sub-patterns are interleaved via subdivision.
  4. The system of claim 1 wherein at least some of the sub-patterns comprise triangles corresponding to points.
  5. The system of claim 1 wherein one or more of the sub-patterns comprise points corresponding to positions that form an approximately regular polygon, based upon repositioning points of a substantially regular polygon with additional random repositioning of at least some points thereof.
  6. The system of claim 1 wherein at least one of the points that form a sub-pattern is randomly or pseudo-randomly assigned an intensity value within a range of intensities associated with that sub-pattern.
  7. The system of claim 1 further comprising a pair of stereo cameras that capture the projected pattern in stereo images, and further comprising an image processing component configured to process the stereo images into depth maps.
  8. The system of claim 7 wherein the light pattern comprises infrared light for sensing by infrared stereo cameras, or visible light for sensing by color stereo cameras.
  9. The system of claim 7 wherein the stereo cameras and the projector are incorporated into a single device.
  10. The system of claim 7 wherein the diffractive optical component is configured to project a rotated pattern relative to an orientation of the stereo cameras.
  11. A method comprising, generating a grid comprising a first set of points, associating each point in the first set with an intensity value that is within a first intensity range, adding a second set of points between subsets of points of the first set of points, associating each point in the second set with an intensity value that is within a second intensity range, and encoding a diffractive optical component based upon the first set of points and the second set of points.
  12. The method of claim 11 further comprising, projecting light through the diffractive optical component to project a pattern comprising a first set of spots each within the first intensity range, and a second set of spots each within the second intensity range.
  13. The method of claim 12 wherein projecting the light comprises projecting the pattern rotated relative to an orientation of a camera.
  14. The method of claim 11 further comprising, adding a third set of points between subsets of points of the second set of points, associating each point in the third set with an intensity value that is within a third intensity range, and wherein encoding the diffractive optical component further comprises encoding the diffractive optical component based upon the third set of points.
  15. The method of claim 11 wherein the grid is triangular, and wherein generating the grid comprises arranging the first set of points into triangles based upon the grid, and wherein adding the second set of points comprises subdividing at least some of the triangles.
  16. The method of claim 11 wherein associating each point in the first set with an intensity value that is within a first intensity range comprises randomly or pseudo-randomly selecting an intensity value within the first range for at least some of the points in the first set.
  17. The method of claim 11 wherein associating each point in the second set with an intensity value that is within a second intensity range comprises randomly or pseudo-randomly selecting an intensity value within the second range for at least some of the points in the second set.
  18. The method of claim 11 further comprising, randomly or pseudo-randomly positioning at least some of the points in the first set relative to the grid.
  19. The method of claim 11 further comprising, randomly or pseudo-randomly positioning at least some of the points in the second set relative to their positions between subsets of points of the first set of points.
  20. A method comprising, projecting light through a diffractive optical component to project a pattern comprising a first set of spots corresponding to a first intensity range, and a second set of spots corresponding to a second intensity range, wherein the positions of the spots of the first set are based upon an initial grid layout and the positions of the spots of the second set are based upon the positions of the spots of the first set, sensing the first set of spots and the second set of spots as left and right stereo camera images, and processing the images to correlate spots in the left image with spots in the right image, in which scanlines of the images are not aligned with the initial grid layout.
US13915626 2013-04-15 2013-06-11 Intensity-modulated light pattern for active stereo Abandoned US20140307055A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US201361812232 true 2013-04-15 2013-04-15
US13915626 US20140307055A1 (en) 2013-04-15 2013-06-11 Intensity-modulated light pattern for active stereo

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US13915626 US20140307055A1 (en) 2013-04-15 2013-06-11 Intensity-modulated light pattern for active stereo
PCT/US2014/033910 WO2014172222A1 (en) 2013-04-15 2014-04-14 Intensity-modulated light pattern for active stereo
CN 201480021199 CN105229412A (en) 2013-04-15 2014-04-14 Intensity-modulated light pattern for active stereo
EP20140725312 EP2986935A1 (en) 2013-04-15 2014-04-14 Intensity-modulated light pattern for active stereo

Publications (1)

Publication Number Publication Date
US20140307055A1 (en) 2014-10-16

Family

ID=51686521

Family Applications (10)

Application Number Title Priority Date Filing Date
US13915626 Abandoned US20140307055A1 (en) 2013-04-15 2013-06-11 Intensity-modulated light pattern for active stereo
US13915622 Pending US20140307098A1 (en) 2013-04-15 2013-06-11 Extracting true color from a color and infrared sensor
US13918892 Active 2034-04-20 US9760770B2 (en) 2013-04-15 2013-06-14 Parallel memories for multidimensional data access
US13923135 Active US9959465B2 (en) 2013-04-15 2013-06-20 Diffractive optical element with undiffracted light expansion for eye safe operation
US13924475 Active 2034-03-31 US9697424B2 (en) 2013-04-15 2013-06-21 Active stereo with satellite device or devices
US13924485 Active 2035-04-10 US9922249B2 (en) 2013-04-15 2013-06-21 Super-resolving depth map by moving pattern projector
US13924464 Pending US20140307047A1 (en) 2013-04-15 2013-06-21 Active stereo with adaptive support weights from a separate image
US13925762 Active 2034-09-26 US9928420B2 (en) 2013-04-15 2013-06-24 Depth imaging system based on stereo vision and infrared radiation
US14088408 Abandoned US20140309764A1 (en) 2013-04-15 2013-11-24 Adaptive material deposition in three-dimensional fabrication
US14253696 Active US9508003B2 (en) 2013-04-15 2014-04-15 Hardware-amenable connected components labeling

Family Applications After (9)

Application Number Title Priority Date Filing Date
US13915622 Pending US20140307098A1 (en) 2013-04-15 2013-06-11 Extracting true color from a color and infrared sensor
US13918892 Active 2034-04-20 US9760770B2 (en) 2013-04-15 2013-06-14 Parallel memories for multidimensional data access
US13923135 Active US9959465B2 (en) 2013-04-15 2013-06-20 Diffractive optical element with undiffracted light expansion for eye safe operation
US13924475 Active 2034-03-31 US9697424B2 (en) 2013-04-15 2013-06-21 Active stereo with satellite device or devices
US13924485 Active 2035-04-10 US9922249B2 (en) 2013-04-15 2013-06-21 Super-resolving depth map by moving pattern projector
US13924464 Pending US20140307047A1 (en) 2013-04-15 2013-06-21 Active stereo with adaptive support weights from a separate image
US13925762 Active 2034-09-26 US9928420B2 (en) 2013-04-15 2013-06-24 Depth imaging system based on stereo vision and infrared radiation
US14088408 Abandoned US20140309764A1 (en) 2013-04-15 2013-11-24 Adaptive material deposition in three-dimensional fabrication
US14253696 Active US9508003B2 (en) 2013-04-15 2014-04-15 Hardware-amenable connected components labeling

Country Status (7)

Country Link
US (10) US20140307055A1 (en)
EP (8) EP2986931A1 (en)
KR (2) KR20150140841A (en)
CN (8) CN105229412A (en)
CA (1) CA2907895A1 (en)
RU (1) RU2015143654A (en)
WO (8) WO2014172222A1 (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150036023A1 (en) * 2012-03-13 2015-02-05 Dolby Laboratories Licensing Corporation Lighting system and method for image and object enhancement
US20150229915A1 (en) * 2014-02-08 2015-08-13 Microsoft Corporation Environment-dependent active illumination for stereo matching
US20160371845A1 (en) * 2015-06-18 2016-12-22 Apple Inc. Monitoring doe performance using software scene evaluation
US9699394B2 (en) 2015-03-09 2017-07-04 Microsoft Technology Licensing, Llc Filter arrangement for image sensor

Families Citing this family (23)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20120072245A (en) * 2010-12-23 2012-07-03 한국전자통신연구원 Apparatus and method for stereo matching
US20140307055A1 (en) * 2013-04-15 2014-10-16 Microsoft Corporation Intensity-modulated light pattern for active stereo
EP3088839A4 (en) * 2013-12-27 2017-11-29 Sony Corp Image processing device and image processing method
US9720506B2 (en) * 2014-01-14 2017-08-01 Microsoft Technology Licensing, Llc 3D silhouette sensing system
US9842424B2 (en) * 2014-02-10 2017-12-12 Pixar Volume rendering using adaptive buckets
US9674493B2 (en) * 2014-03-24 2017-06-06 Omnivision Technologies, Inc. Color image sensor with metal mesh to detect infrared light
GB201407270D0 (en) * 2014-04-24 2014-06-11 Cathx Res Ltd 3D data in underwater surveys
US9823842B2 (en) 2014-05-12 2017-11-21 The Research Foundation For The State University Of New York Gang migration of virtual machines using cluster-wide deduplication
JP2016051918A (en) * 2014-08-28 2016-04-11 任天堂株式会社 Information processing terminal, information processing program, information processing terminal system, and information processing method
US9507995B2 (en) * 2014-08-29 2016-11-29 X Development Llc Combination of stereo and structured-light processing
DE102014113389A1 (en) * 2014-09-17 2016-03-17 Pilz Gmbh & Co. Kg A method and apparatus for identifying structural elements of a projected pattern structure in camera images
EP3040941B1 (en) * 2014-12-29 2017-08-02 Dassault Systèmes Method for calibrating a depth camera
DE102015202182A1 (en) * 2015-02-06 2016-08-11 Siemens Aktiengesellschaft Apparatus and method for sequential, diffractive pattern projection
JP2016166811A (en) * 2015-03-10 2016-09-15 アルプス電気株式会社 Object detection device
US20160309134A1 (en) * 2015-04-19 2016-10-20 Pelican Imaging Corporation Multi-baseline camera array system architectures for depth augmentation in vr/ar applications
US9751263B2 (en) * 2015-04-20 2017-09-05 Xerox Corporation Injection molding to finish parts printed with a three-dimensional object printer
US9683834B2 (en) * 2015-05-27 2017-06-20 Intel Corporation Adaptable depth sensing system
US9495584B1 (en) * 2015-06-05 2016-11-15 Digital Signal Corporation System and method for facial recognition using images captured from a target illuminated with infrared light
WO2017014691A1 (en) * 2015-07-17 2017-01-26 Heptagon Micro Optics Pte. Ltd. Generating a distance map based on captured images of a scene
US9800795B2 (en) * 2015-12-21 2017-10-24 Intel Corporation Auto range control for active illumination depth camera
WO2017193013A1 (en) * 2016-05-06 2017-11-09 Zhang, Yunbo Determining manufacturable models
US20170374352A1 (en) * 2016-06-22 2017-12-28 Intel Corporation Depth image provision apparatus and method
CN106375740B (en) * 2016-09-28 2018-02-06 华为技术有限公司 Rgb image generating method, apparatus and system

Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6751344B1 (en) * 1999-05-28 2004-06-15 Champion Orthotic Investments, Inc. Enhanced projector system for machine vision
US20070263903A1 (en) * 2006-03-23 2007-11-15 Tyzx, Inc. Enhancing stereo depth measurements with projected texture
US7315383B1 (en) * 2004-07-09 2008-01-01 Mohsen Abdollahi Scanning 3D measurement technique using structured lighting and high-speed CMOS imager
US20080205748A1 (en) * 2007-02-28 2008-08-28 Sungkyunkwan University Structural light based depth imaging method and system using signal separation coding, and error correction thereof
US8077034B2 (en) * 2006-09-28 2011-12-13 Bea Sa Sensor for presence detection
US20120038986A1 (en) * 2010-08-11 2012-02-16 Primesense Ltd. Pattern projector
US20120056982A1 (en) * 2010-09-08 2012-03-08 Microsoft Corporation Depth camera based on structured light and stereo vision
US20120307075A1 (en) * 2011-06-01 2012-12-06 Empire Technology Development, Llc Structured light projection for motion detection in augmented reality
US20130229396A1 (en) * 2012-03-05 2013-09-05 Kenneth J. Huebner Surface aware, object aware, and image aware handheld projector
US20140168380A1 (en) * 2012-12-14 2014-06-19 Faro Technologies, Inc. Device for optically scanning and measuring an environment
US20140293011A1 (en) * 2013-03-28 2014-10-02 Phasica, LLC Scanner System for Determining the Three Dimensional Shape of an Object and Method for Using
US20150316368A1 (en) * 2012-11-29 2015-11-05 Koninklijke Philips N.V. Laser device for projecting a structured light pattern onto a scene

Family Cites Families (117)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3938102A (en) 1974-08-19 1976-02-10 International Business Machines Corporation Method and apparatus for accessing horizontal sequences and rectangular sub-arrays from an array stored in a modified word organized random access memory system
EP0085210A1 (en) 1982-01-29 1983-08-10 International Business Machines Corporation Image processing system
US5351152A (en) * 1991-07-23 1994-09-27 The Board Of Governers Of Wayne State University Direct-view stereoscopic confocal microscope
US5471326A (en) * 1993-04-30 1995-11-28 Northrop Grumman Corporation Holographic laser scanner and rangefinder
US5586200A (en) * 1994-01-07 1996-12-17 Panasonic Technologies, Inc. Segmentation based image compression system
US5739906A (en) * 1996-06-07 1998-04-14 The United States Of America As Represented By The Secretary Of Commerce Interferometric thickness variation test method for windows and silicon wafers using a diverging wavefront
US6105139A (en) * 1998-06-03 2000-08-15 Nec Usa, Inc. Controller-based power management for low-power sequential circuits
US6414930B1 (en) 1998-08-03 2002-07-02 Matsushita Electric Industrial Co., Ltd. Optical head
GB0008303D0 (en) * 2000-04-06 2000-05-24 British Aerospace Measurement system and method
US6826299B2 (en) 2000-07-31 2004-11-30 Geodetic Services, Inc. Photogrammetric image correlation and measurement system and method
US7554737B2 (en) * 2000-12-20 2009-06-30 Riake Corporation Illumination device and method using adaptable source and output format
US6850872B1 (en) * 2000-08-30 2005-02-01 Microsoft Corporation Facial image processing methods and systems
WO2003005733A1 (en) * 2001-07-06 2003-01-16 Explay Ltd. An image projecting device and method
US6940538B2 (en) * 2001-08-29 2005-09-06 Sony Corporation Extracting a depth map from known camera and model tracking data
US7762964B2 (en) * 2001-12-10 2010-07-27 Candela Corporation Method and apparatus for improving safety during exposure to a monochromatic light source
JP4075418B2 (en) 2002-03-15 2008-04-16 ソニー株式会社 Image processing apparatus and method, printed matter manufacturing apparatus and method, and printed matter manufacturing system
US6771271B2 (en) * 2002-06-13 2004-08-03 Analog Devices, Inc. Apparatus and method of processing image data
US7399220B2 (en) * 2002-08-02 2008-07-15 Kriesel Marshall S Apparatus and methods for the volumetric and dimensional measurement of livestock
CN1176351C (en) * 2002-10-09 2004-11-17 天津大学 Method and device for dynamic multi-resolution three-dimensional digital imaging
CN1186671C (en) * 2002-10-09 2005-01-26 天津大学 Method and device for producing projected structured light
GB0226242D0 (en) 2002-11-11 2002-12-18 Qinetiq Ltd Ranging apparatus
US7103212B2 (en) * 2002-11-22 2006-09-05 Strider Labs, Inc. Acquisition of three-dimensional images by an active stereo technique using locally unique patterns
US7154157B2 (en) * 2002-12-30 2006-12-26 Intel Corporation Stacked semiconductor radiation sensors having color component and infrared sensing capability
JP3938120B2 (en) * 2003-09-17 2007-06-27 ノーリツ鋼機株式会社 Image processing apparatus, method, and program
FR2870621B1 (en) * 2004-05-21 2006-10-27 Inst Francais Du Petrole Method for generating a three-dimensional conforming hybrid grid of a heterogeneous formation crossed by one or more geometric discontinuities, in order to perform simulations
JP4011039B2 (en) * 2004-05-31 2007-11-21 三菱電機株式会社 Imaging device and signal processing method
DE102004029552A1 (en) * 2004-06-18 2006-01-05 Peter Mäckel Method for visualizing and measuring deformations of vibrating objects by combining synchronized stroboscopic image recording with image correlation
EP1779321A2 (en) 2004-08-11 2007-05-02 Philips Electronics N.V. Stripe-based image data storage
JPWO2006025271A1 (en) 2004-09-03 2008-05-08 コニカミノルタオプト株式会社 Coupling lens and optical pickup device
JP4883517B2 (en) 2004-11-19 2012-02-22 学校法人福岡工業大学 Three-dimensional measuring apparatus, three-dimensional measuring method, and three-dimensional measuring program
US7719533B2 (en) * 2004-11-24 2010-05-18 General Electric Company Graph extraction labelling and visualization
US7367682B2 (en) * 2004-12-07 2008-05-06 Symbol Technologies, Inc. Color image projection arrangement and method
WO2006074310A3 (en) * 2005-01-07 2008-02-21 Gesturetek Inc Creating 3d images of objects by illuminating with infrared patterns
US7295771B2 (en) * 2005-04-25 2007-11-13 Delphi Technologies, Inc. Method and apparatus for minimizing ambient illumination effects in a vision system
JP4577126B2 (en) * 2005-07-08 2010-11-10 オムロン株式会社 Apparatus and method for generating a projection pattern for stereo correspondence
CN101496033B (en) * 2006-03-14 2012-03-21 普莱姆森斯有限公司 Depth-varying light fields for three dimensional sensing
JP5592070B2 (en) * 2006-03-14 2014-09-17 プライム センス リミティド Depth-varying light fields for three-dimensional sensing
EP1934945A4 (en) * 2005-10-11 2016-01-20 Apple Inc Method and system for object reconstruction
US20070145273A1 (en) * 2005-12-22 2007-06-28 Chang Edward T High-sensitivity infrared color camera
US7821552B2 (en) * 2005-12-27 2010-10-26 Sanyo Electric Co., Ltd. Imaging apparatus provided with imaging device having sensitivity in visible and infrared regions
JP4466569B2 (en) * 2006-01-10 2010-05-26 株式会社豊田中央研究所 Color image reproducing apparatus
DE102006007170B4 (en) * 2006-02-08 2009-06-10 Sirona Dental Systems Gmbh Method and apparatus for fast and robust chromatic confocal 3D measurement technology
WO2007132399A1 (en) 2006-05-09 2007-11-22 Koninklijke Philips Electronics N.V. Programmable data processing circuit
WO2008133650A3 (en) * 2006-11-07 2008-12-24 Rvsi Inspection Llc Method and system for providing a high definition triangulation system
US8090194B2 (en) * 2006-11-21 2012-01-03 Mantis Vision Ltd. 3D geometric modeling and motion capture using both single and dual imaging
JP5129267B2 (en) * 2007-01-10 2013-01-30 スリーディー システムズ インコーポレーテッド Three-dimensional printing material system with improved color, article performance, and ease of use
US7683962B2 (en) * 2007-03-09 2010-03-23 Eastman Kodak Company Camera using multiple lenses and image sensors in a rangefinder configuration to provide a range map
FR2914422B1 (en) 2007-03-28 2009-07-03 Soitec Silicon On Insulator Method for detecting surface defects of a substrate and device implementing the method
JP2008288629A (en) * 2007-05-15 2008-11-27 Sony Corp Image signal processing apparatus, imaging device, image signal processing method, and computer program
EP2186337A4 (en) * 2007-08-08 2011-09-28 Tony Mayer Non-retro-reflective license plate imaging system
GB0718706D0 (en) * 2007-09-25 2007-11-07 Creative Physics Ltd Method and apparatus for reducing laser speckle
US7933056B2 (en) * 2007-09-26 2011-04-26 Che-Chih Tsao Methods and systems of rapid focusing and zooming for volumetric 3D displays and cameras
GB201006202D0 (en) * 2007-10-02 2010-06-02 Doubleshot Inc Laser beam pattern projector
US8446470B2 (en) * 2007-10-04 2013-05-21 Magna Electronics, Inc. Combined RGB and IR imaging sensor
US7958468B2 (en) * 2008-02-21 2011-06-07 Oracle America, Inc. Unidirectional relabeling for subcircuit recognition
US7861193B2 (en) * 2008-02-21 2010-12-28 Oracle America, Inc. Reuse of circuit labels for verification of circuit recognition
US8788990B2 (en) * 2008-02-21 2014-07-22 Oracle America, Inc. Reuse of circuit labels in subcircuit recognition
US8368753B2 (en) * 2008-03-17 2013-02-05 Sony Computer Entertainment America Llc Controller with an integrated depth camera
EP3117765A1 (en) * 2008-03-18 2017-01-18 Novadaq Technologies Inc. Imaging system for combined full-colour reflectance and near-infrared imaging
US8405727B2 (en) * 2008-05-01 2013-03-26 Apple Inc. Apparatus and method for calibrating image capture devices
JP5317169B2 (en) * 2008-06-13 2013-10-16 洋 川崎 Image processing apparatus, image processing method, and program
KR101530930B1 (en) * 2008-08-19 2015-06-24 삼성전자주식회사 Pattern projection apparatus, three-dimensional image forming apparatus having the same, and variable-focus liquid lens used therein
US8442940B1 (en) * 2008-11-18 2013-05-14 Semantic Research, Inc. Systems and methods for pairing of a semantic network and a natural language processing information extraction system
CN101509764A (en) 2009-02-27 2009-08-19 东南大学 Method for rapidly acquiring the three-dimensional form of an object
DE102009001889A1 (en) 2009-03-26 2010-09-30 Robert Bosch Gmbh Laser marking with the coordinate system
US8823775B2 (en) * 2009-04-30 2014-09-02 Board Of Regents, The University Of Texas System Body surface imaging
US8204904B2 (en) * 2009-09-30 2012-06-19 Yahoo! Inc. Network graph evolution rule generation
US8630509B2 (en) * 2009-11-03 2014-01-14 Samsung Electronics Co., Ltd. Structured grids for label propagation on a finite number of layers
KR101377325B1 (en) * 2009-12-21 2014-03-25 한국전자통신연구원 Camera device for acquiring stereo images, multi-view images, and depth images, and control method thereof
US20130278631A1 (en) * 2010-02-28 2013-10-24 Osterhout Group, Inc. 3d positioning of augmented reality information
US20110222757A1 (en) * 2010-03-10 2011-09-15 Gbo 3D Technology Pte. Ltd. Systems and methods for 2D image and spatial data capture for 3D stereo imaging
WO2011114571A1 (en) 2010-03-16 2011-09-22 三洋電機株式会社 Object detecting apparatus and information acquiring apparatus
US8619143B2 (en) * 2010-03-19 2013-12-31 Pixim, Inc. Image sensor including color and infrared pixels
KR20110132260A (en) 2010-05-29 2011-12-07 이문기 Monitor based augmented reality system
US8670029B2 (en) 2010-06-16 2014-03-11 Microsoft Corporation Depth camera illuminator with superluminescent light-emitting diode
EP2400261A1 (en) * 2010-06-21 2011-12-28 Leica Geosystems AG Optical measurement method and system for determining 3D coordination in a measuring object surface
GB2481459B (en) 2010-06-25 2017-05-03 Fraunhofer-Gesellschaft Zur Forderung Der Angewandten Forschung E V Capturing a surface structure of an object surface
US8357899B2 (en) * 2010-07-30 2013-01-22 Aptina Imaging Corporation Color correction circuitry and methods for dual-band imaging systems
US8903119B2 (en) * 2010-10-11 2014-12-02 Texas Instruments Incorporated Use of three-dimensional top-down views for business analytics
JP5787508B2 (en) 2010-11-11 2015-09-30 キヤノン株式会社 Diffractive optical element and optical system
US20120154397A1 (en) * 2010-12-03 2012-06-21 Old Dominion University Research Foundation Method and system for generating mesh from images
KR101694292B1 (en) * 2010-12-17 2017-01-09 한국전자통신연구원 Apparatus for matching stereo image and method thereof
US9247238B2 (en) 2011-01-31 2016-01-26 Microsoft Technology Licensing, Llc Reducing interference between multiple infra-red depth cameras
KR101289595B1 (en) * 2011-02-28 2013-07-24 이경자 Grid pattern projection device
KR101792501B1 (en) * 2011-03-16 2017-11-21 한국전자통신연구원 Method and apparatus for feature-based stereo matching
KR101801355B1 (en) * 2011-03-25 2017-11-24 엘지전자 주식회사 Apparatus for recognizing distance of object using diffracting element and light source
US8718748B2 (en) * 2011-03-29 2014-05-06 Kaliber Imaging Inc. System and methods for monitoring and assessing mobility
JP5979500B2 (en) * 2011-04-07 2016-08-24 パナソニックIpマネジメント株式会社 Stereoscopic imaging apparatus
CN102760234B (en) * 2011-04-14 2014-08-20 财团法人工业技术研究院 Depth image acquiring device, system and method
US8760499B2 (en) * 2011-04-29 2014-06-24 Austin Russell Three-dimensional imager and projection device
US20120281087A1 (en) * 2011-05-02 2012-11-08 Faro Technologies, Inc. Three-dimensional scanner for hand-held phones
US9536312B2 (en) * 2011-05-16 2017-01-03 Microsoft Corporation Depth reconstruction using plural depth capture units
CN102831380A (en) 2011-06-15 2012-12-19 康佳集团股份有限公司 Body action recognition method and system based on depth image sensing
WO2013019255A1 (en) * 2011-07-29 2013-02-07 Hewlett-Packard Development Company, L.P. Projection capture system, programming and method
US20140225988A1 (en) * 2011-09-07 2014-08-14 Commonwealth Scientific And Industrial Research Organisation System and method for three-dimensional surface imaging
US9285871B2 (en) 2011-09-30 2016-03-15 Microsoft Technology Licensing, Llc Personal audio/visual system for providing an adaptable augmented reality environment
US9248623B2 (en) * 2011-10-14 2016-02-02 Makerbot Industries, Llc Grayscale rendering in 3D printing
US9098908B2 (en) * 2011-10-21 2015-08-04 Microsoft Technology Licensing, Llc Generating a depth map
US20140098342A1 (en) * 2011-11-04 2014-04-10 The General Hospital Corporation System and method for corneal irradiation
JP5910043B2 (en) * 2011-12-02 2016-04-27 富士通株式会社 Imaging device, image processing program, image processing method, and image processing device
JP5898484B2 (en) * 2011-12-19 2016-04-06 キヤノン株式会社 Information processing apparatus, method of controlling an information processing apparatus, and program
CN102572485B (en) * 2012-02-02 2015-04-22 北京大学 Adaptive weighted stereo matching algorithm, and stereoscopic display and acquisition device and system
JP5994715B2 (en) * 2012-04-10 2016-09-21 パナソニックIpマネジメント株式会社 Computer-generated hologram-type display device
KR20130120730A (en) * 2012-04-26 2013-11-05 한국전자통신연구원 Method for processing disparity space image
US9514522B2 (en) * 2012-08-24 2016-12-06 Microsoft Technology Licensing, Llc Depth data processing and compression
US9332243B2 (en) * 2012-10-17 2016-05-03 DotProduct LLC Handheld portable optical scanner and method of using
US20140225985A1 (en) * 2012-10-17 2014-08-14 DotProduct LLC Handheld portable optical scanner and method of using
US9117267B2 (en) * 2012-10-18 2015-08-25 Google Inc. Systems and methods for marking images for three-dimensional image generation
US20140120319A1 (en) * 2012-11-01 2014-05-01 Benjamin E. Joseph 3d mapping using structured light and formation of custom surface contours
US20140132728A1 (en) * 2012-11-12 2014-05-15 Shopperception, Inc. Methods and systems for measuring human interaction
KR20140075163A (en) * 2012-12-11 2014-06-19 한국전자통신연구원 Method and apparatus for projecting pattern using structured-light
US9298945B2 (en) * 2012-12-26 2016-03-29 Elwha Llc Ad-hoc wireless sensor package
US9292927B2 (en) * 2012-12-27 2016-03-22 Intel Corporation Adaptive support windows for stereoscopic image correlation
US9251590B2 (en) * 2013-01-24 2016-02-02 Microsoft Technology Licensing, Llc Camera pose estimation for 3D reconstruction
US9191643B2 (en) * 2013-04-15 2015-11-17 Microsoft Technology Licensing, Llc Mixing infrared and color component data point clouds
US20140307055A1 (en) * 2013-04-15 2014-10-16 Microsoft Corporation Intensity-modulated light pattern for active stereo
US20140320605A1 (en) * 2013-04-25 2014-10-30 Philip Martin Johnson Compound structured light projection system for 3-D surface profiling

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150036023A1 (en) * 2012-03-13 2015-02-05 Dolby Laboratories Licensing Corporation Lighting system and method for image and object enhancement
US9438813B2 (en) * 2012-03-13 2016-09-06 Dolby Laboratories Licensing Corporation Lighting system and method for image and object enhancement
US20150229915A1 (en) * 2014-02-08 2015-08-13 Microsoft Corporation Environment-dependent active illumination for stereo matching
US9699394B2 (en) 2015-03-09 2017-07-04 Microsoft Technology Licensing, Llc Filter arrangement for image sensor
US20160371845A1 (en) * 2015-06-18 2016-12-22 Apple Inc. Monitoring doe performance using software scene evaluation

Also Published As

Publication number Publication date Type
CN105229411A (en) 2016-01-06 application
US20140309764A1 (en) 2014-10-16 application
EP2987138A1 (en) 2016-02-24 application
WO2014172221A1 (en) 2014-10-23 application
KR20150140838A (en) 2015-12-16 application
WO2014172228A1 (en) 2014-10-23 application
CN105229412A (en) 2016-01-06 application
US20140307047A1 (en) 2014-10-16 application
RU2015143654A (en) 2017-04-28 application
US9760770B2 (en) 2017-09-12 grant
EP2987132A1 (en) 2016-02-24 application
CN105230003A (en) 2016-01-06 application
US9508003B2 (en) 2016-11-29 grant
CA2907895A1 (en) 2014-10-23 application
US20140307098A1 (en) 2014-10-16 application
WO2014172229A1 (en) 2014-10-23 application
EP2986931A1 (en) 2016-02-24 application
US9697424B2 (en) 2017-07-04 grant
US20140307057A1 (en) 2014-10-16 application
WO2014172276A1 (en) 2014-10-23 application
US20140307953A1 (en) 2014-10-16 application
CN105247859A (en) 2016-01-13 application
CN105143817A (en) 2015-12-09 application
EP2987323A1 (en) 2016-02-24 application
EP2986936A1 (en) 2016-02-24 application
US20140310496A1 (en) 2014-10-16 application
CN105229696A (en) 2016-01-06 application
EP2987131A1 (en) 2016-02-24 application
US20140307058A1 (en) 2014-10-16 application
EP2986935A1 (en) 2016-02-24 application
KR20150140841A (en) 2015-12-16 application
WO2014172222A1 (en) 2014-10-23 application
US20140307307A1 (en) 2014-10-16 application
JP2016522889A (en) 2016-08-04 application
US9928420B2 (en) 2018-03-27 grant
WO2014172227A1 (en) 2014-10-23 application
EP2987132B1 (en) 2017-11-01 grant
CN105308650A (en) 2016-02-03 application
WO2014172231A1 (en) 2014-10-23 application
US20150078672A1 (en) 2015-03-19 application
US9959465B2 (en) 2018-05-01 grant
WO2014172223A1 (en) 2014-10-23 application
CN105210112A (en) 2015-12-30 application
US9922249B2 (en) 2018-03-20 grant
EP2987320A1 (en) 2016-02-24 application

Similar Documents

Publication Publication Date Title
US8259163B2 (en) Display with built in 3D sensing
US20120056982A1 (en) Depth camera based on structured light and stereo vision
US20110084983A1 (en) Systems and Methods for Interaction With a Virtual Environment
US20070236485A1 (en) Object Illumination in a Virtual Environment
US20130141434A1 (en) Virtual light in augmented reality
US20120194517A1 (en) Using a Three-Dimensional Environment Model in Gameplay
US6664531B2 (en) Combined stereovision, color 3D digitizing and motion capture system
US20100060632A1 (en) Method and devices for the real-time embedding of virtual objects in an image stream using data from a real scene represented by said images
US20130326364A1 (en) Position relative hologram interactions
US20110211044A1 (en) Non-Uniform Spatial Resource Allocation for Depth Mapping
US20110124410A1 (en) Controller for interfacing with a computing program using position, orientation, or motion
US20080044079A1 (en) Object-based 3-dimensional stereo information generation apparatus and method, and interactive system using the same
US20120274745A1 (en) Three-dimensional imager and projection device
US8405680B1 (en) Various methods and apparatuses for achieving augmented reality
US20060072076A1 (en) Interactive projection system and method
US7967451B2 (en) Multi-directional image displaying device
US20120242795A1 (en) Digital 3d camera using periodic illumination
US20130335531A1 (en) Apparatus for projecting grid pattern
US20130050426A1 (en) Method to extend laser depth map range
US7703924B2 (en) Systems and methods for displaying three-dimensional images
US20140347391A1 (en) Hologram anchoring and dynamic positioning
CN1604014A (en) Image display apparatus and method
US20120082346A1 (en) Time-of-flight depth imaging
CN102970548A (en) Image depth sensing device
US20130207962A1 (en) User interactive kiosk with three-dimensional display

Legal Events

Date Code Title Description
AS Assignment

Owner name: MICROSOFT CORPORATION, WASHINGTON

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:KANG, SING BING;GEORGIOU, ANDREAS;SZELISKI, RICHARD S.;SIGNING DATES FROM 20130608 TO 20130611;REEL/FRAME:030592/0181

AS Assignment

Owner name: MICROSOFT TECHNOLOGY LICENSING, LLC, WASHINGTON

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:MICROSOFT CORPORATION;REEL/FRAME:034747/0417

Effective date: 20141014

Owner name: MICROSOFT TECHNOLOGY LICENSING, LLC, WASHINGTON

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:MICROSOFT CORPORATION;REEL/FRAME:039025/0454

Effective date: 20141014