US20140307055A1 - Intensity-modulated light pattern for active stereo - Google Patents
Intensity-modulated light pattern for active stereo
- Publication number
- US20140307055A1 (application US13/915,626; US201313915626A)
- Authority
- US
- United States
- Prior art keywords
- points
- pattern
- spots
- intensity
- randomly
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B11/00—Measuring arrangements characterised by the use of optical techniques
- G01B11/24—Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
- G01B11/25—Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
- G01B11/2513—Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object with several lines being projected in more than one direction, e.g. grids, patterns
-
- H04N13/0239—
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B29—WORKING OF PLASTICS; WORKING OF SUBSTANCES IN A PLASTIC STATE IN GENERAL
- B29C—SHAPING OR JOINING OF PLASTICS; SHAPING OF MATERIAL IN A PLASTIC STATE, NOT OTHERWISE PROVIDED FOR; AFTER-TREATMENT OF THE SHAPED PRODUCTS, e.g. REPAIRING
- B29C64/00—Additive manufacturing, i.e. manufacturing of three-dimensional [3D] objects by additive deposition, additive agglomeration or additive layering, e.g. by 3D printing, stereolithography or selective laser sintering
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B29—WORKING OF PLASTICS; WORKING OF SUBSTANCES IN A PLASTIC STATE IN GENERAL
- B29C—SHAPING OR JOINING OF PLASTICS; SHAPING OF MATERIAL IN A PLASTIC STATE, NOT OTHERWISE PROVIDED FOR; AFTER-TREATMENT OF THE SHAPED PRODUCTS, e.g. REPAIRING
- B29C64/00—Additive manufacturing, i.e. manufacturing of three-dimensional [3D] objects by additive deposition, additive agglomeration or additive layering, e.g. by 3D printing, stereolithography or selective laser sintering
- B29C64/30—Auxiliary operations or equipment
- B29C64/386—Data acquisition or data processing for additive manufacturing
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B11/00—Measuring arrangements characterised by the use of optical techniques
- G01B11/22—Measuring arrangements characterised by the use of optical techniques for measuring depth
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B11/00—Measuring arrangements characterised by the use of optical techniques
- G01B11/24—Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
- G01B11/25—Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B11/00—Measuring arrangements characterised by the use of optical techniques
- G01B11/24—Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
- G01B11/25—Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
- G01B11/2518—Projection by scanning of the object
- G01B11/2527—Projection by scanning of the object with phase change by in-plane movement of the patern
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B11/00—Measuring arrangements characterised by the use of optical techniques
- G01B11/24—Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
- G01B11/25—Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
- G01B11/2545—Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object with one projection direction and several detection directions, e.g. stereo
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/42—Diffraction optics, i.e. systems including a diffractive element being designed for providing a diffractive effect
- G02B27/4205—Diffraction optics, i.e. systems including a diffractive element being designed for providing a diffractive effect having a diffractive optical element [DOE] contributing to image formation, e.g. whereby modulation transfer function MTF or optical aberrations are relevant
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/42—Diffraction optics, i.e. systems including a diffractive element being designed for providing a diffractive effect
- G02B27/44—Grating systems; Zone plate systems
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B5/00—Optical elements other than lenses
- G02B5/18—Diffraction gratings
- G02B5/1876—Diffractive Fresnel lenses; Zone plates; Kinoforms
- G02B5/189—Structurally combined with optical elements not having diffractive power
- G02B5/1895—Structurally combined with optical elements not having diffractive power such optical elements having dioptric power
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F11/00—Error detection; Error correction; Monitoring
- G06F11/30—Monitoring
- G06F11/3003—Monitoring arrangements specially adapted to the computing system or computing system component being monitored
- G06F11/3024—Monitoring arrangements specially adapted to the computing system or computing system component being monitored where the computing system component is a central processing unit [CPU]
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F12/00—Accessing, addressing or allocating within memory systems or architectures
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F12/00—Accessing, addressing or allocating within memory systems or architectures
- G06F12/02—Addressing or allocation; Relocation
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F12/00—Accessing, addressing or allocating within memory systems or architectures
- G06F12/02—Addressing or allocation; Relocation
- G06F12/0207—Addressing or allocation; Relocation with multidimensional access, e.g. row/column, matrix
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F12/00—Accessing, addressing or allocating within memory systems or architectures
- G06F12/02—Addressing or allocation; Relocation
- G06F12/0223—User address space allocation, e.g. contiguous or non contiguous base addressing
- G06F12/0292—User address space allocation, e.g. contiguous or non contiguous base addressing using tables or multilevel address translation means
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/06—Digital input from, or digital output to, record carriers, e.g. RAID, emulated record carriers or networked record carriers
- G06F3/0601—Interfaces specially adapted for storage systems
- G06F3/0628—Interfaces specially adapted for storage systems making use of a particular technique
- G06F3/0653—Monitoring storage devices or systems
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/06—Digital input from, or digital output to, record carriers, e.g. RAID, emulated record carriers or networked record carriers
- G06F3/0601—Interfaces specially adapted for storage systems
- G06F3/0628—Interfaces specially adapted for storage systems making use of a particular technique
- G06F3/0655—Vertical data movement, i.e. input-output transfer; data movement between one or more hosts and one or more storage devices
- G06F3/0659—Command handling arrangements, e.g. command buffers, queues, command scheduling
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/06—Digital input from, or digital output to, record carriers, e.g. RAID, emulated record carriers or networked record carriers
- G06F3/0601—Interfaces specially adapted for storage systems
- G06F3/0668—Interfaces specially adapted for storage systems adopting a particular infrastructure
- G06F3/0671—In-line storage system
- G06F3/0683—Plurality of storage devices
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F9/00—Arrangements for program control, e.g. control units
- G06F9/06—Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
- G06F9/30—Arrangements for executing machine instructions, e.g. instruction decode
- G06F9/30003—Arrangements for executing specific machine instructions
- G06F9/3004—Arrangements for executing specific machine instructions to perform operations on memory
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F9/00—Arrangements for program control, e.g. control units
- G06F9/06—Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
- G06F9/30—Arrangements for executing machine instructions, e.g. instruction decode
- G06F9/30003—Arrangements for executing specific machine instructions
- G06F9/3004—Arrangements for executing specific machine instructions to perform operations on memory
- G06F9/30043—LOAD or STORE instructions; Clear instruction
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F9/00—Arrangements for program control, e.g. control units
- G06F9/06—Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
- G06F9/30—Arrangements for executing machine instructions, e.g. instruction decode
- G06F9/30098—Register arrangements
- G06F9/3012—Organisation of register space, e.g. banked or distributed register file
- G06F9/30123—Organisation of register space, e.g. banked or distributed register file according to context, e.g. thread buffers
- G06F9/30127—Register windows
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T1/00—General purpose image data processing
- G06T1/60—Memory management
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/50—Depth or shape recovery
- G06T7/55—Depth or shape recovery from multiple images
- G06T7/586—Depth or shape recovery from multiple images from multiple light sources, e.g. photometric stereo
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/60—Type of objects
- G06V20/64—Three-dimensional objects
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/10—Processing, recording or transmission of stereoscopic or multi-view image signals
- H04N13/106—Processing image signals
- H04N13/128—Adjusting depth or disparity
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/204—Image signal generators using stereoscopic image cameras
- H04N13/239—Image signal generators using stereoscopic image cameras using two 2D image sensors having a relative position equal to or related to the interocular distance
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/204—Image signal generators using stereoscopic image cameras
- H04N13/25—Image signal generators using stereoscopic image cameras using two or more image sensors with different characteristics other than in their location or field of view, e.g. having different resolutions or colour pickup characteristics; using image signals from one sensor to control the characteristics of another sensor
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/204—Image signal generators using stereoscopic image cameras
- H04N13/254—Image signal generators using stereoscopic image cameras in combination with electromagnetic radiation sources for illuminating objects
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/271—Image signal generators wherein the generated image signals comprise depth maps or disparity maps
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N17/00—Diagnosis, testing or measuring for television systems or their details
- H04N17/002—Diagnosis, testing or measuring for television systems or their details for television cameras
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/10—Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from different wavelengths
- H04N23/11—Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from different wavelengths for generating image signals from visible and infrared light wavelengths
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/56—Cameras or camera modules comprising electronic image sensors; Control thereof provided with illuminating means
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N25/00—Circuitry of solid-state image sensors [SSIS]; Control thereof
- H04N25/10—Circuitry of solid-state image sensors [SSIS]; Control thereof for transforming different wavelengths into image signals
- H04N25/11—Arrangement of colour filter arrays [CFA]; Filter mosaics
- H04N25/13—Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements
- H04N25/131—Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements including elements passing infrared wavelengths
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N25/00—Circuitry of solid-state image sensors [SSIS]; Control thereof
- H04N25/60—Noise processing, e.g. detecting, correcting, reducing or removing noise
- H04N25/61—Noise processing, e.g. detecting, correcting, reducing or removing noise the noise originating only from the lens unit, e.g. flare, shading, vignetting or "cos4"
- H04N25/611—Correction of chromatic aberration
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/30—Transforming light or analogous information into electric information
- H04N5/33—Transforming infrared radiation
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/20—Input arrangements for video game devices
- A63F13/21—Input arrangements for video game devices characterised by their sensors, purposes or types
- A63F13/213—Input arrangements for video game devices characterised by their sensors, purposes or types comprising photodetecting means, e.g. cameras, photodiodes or infrared cells
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/42—Diffraction optics, i.e. systems including a diffractive element being designed for providing a diffractive effect
- G02B27/4233—Diffraction optics, i.e. systems including a diffractive element being designed for providing a diffractive effect having a diffractive element [DOE] contributing to a non-imaging application
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2218/00—Aspects of pattern recognition specially adapted for signal processing
- G06F2218/12—Classification; Matching
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30244—Camera pose
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N2013/0074—Stereoscopic image analysis
- H04N2013/0081—Depth or disparity estimation from stereoscopic image signals
Definitions
- a projector projects patterns of light such as infrared (IR) dots or lines to illuminate a scene being sensed.
- the projected patterns are then captured by a camera/sensor (two or more in stereo systems), with the image (or images) processed to compute a depth map or the like.
- stereo cameras capture two images from different viewpoints. Then, for example, one way to perform depth estimation with a stereo pair of images is to find correspondences of local patches between the images, e.g., to correlate each projected and sensed local dot pattern in the left image with a counterpart local dot pattern in the right image. Once matched, the projected patterns within the images may be correlated with one another, and disparities between one or more features of the correlated dots used to estimate (e.g., triangulate) a depth to that particular dot pair.
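For illustration only (not part of the patent disclosure), the following minimal Python sketch shows one conventional way the local-patch correspondence search described above can be performed on a rectified stereo pair: a patch around a dot in the left image is compared against candidate positions along the same scanline of the right image using normalized cross-correlation. The names and parameters (`left`, `right`, `patch_radius`, `max_disparity`) are illustrative assumptions.

```python
import numpy as np

def match_patch_along_scanline(left, right, y, x, patch_radius=3, max_disparity=64):
    """Return (disparity, score) for the best match of a left-image patch
    against the same scanline of the right image, using normalized
    cross-correlation. `left` and `right` are 2-D arrays (rectified pair)."""
    r = patch_radius
    template = left[y - r:y + r + 1, x - r:x + r + 1].astype(np.float64)
    template -= template.mean()
    best_disparity, best_score = 0, -np.inf
    for d in range(max_disparity + 1):
        xc = x - d  # candidate column in the right image
        if xc - r < 0:
            break
        candidate = right[y - r:y + r + 1, xc - r:xc + r + 1].astype(np.float64)
        candidate -= candidate.mean()
        denom = np.sqrt((template ** 2).sum() * (candidate ** 2).sum()) + 1e-12
        score = float((template * candidate).sum() / denom)  # NCC in [-1, 1]
        if score > best_score:
            best_score, best_disparity = score, d
    return best_disparity, best_score
```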
- IR lasers have been used to produce such patterns.
- more powerful lasers, e.g., around 1 W or more, may be used to project the pattern; at such power levels, multi-mode lasers are more cost-effective than single-mode lasers.
- however, using multi-mode lasers results in the designed pattern looking blurrier at closer distances. This is problematic in active stereo depth sensing, because correlating the correct left and right pairs of dots is subject to more errors when the dots are blurred.
- a projector including a laser and a diffractive optical component projects a light pattern towards a scene.
- the diffractive optical component is configured to output the light pattern as a plurality of sets of sub-patterns, with each set corresponding to a different range of intensities.
- One or more aspects are directed towards generating a grid comprising a first set of points, associating each point in the first set with an intensity value that is within a first intensity range, adding a second set of points between subsets of points of the first set of points and associating each point in the second set with an intensity value that is within a second intensity range. This subdivision process may be repeated if necessary.
- a diffractive optical component may be encoded based upon the first set of points and the second set of points.
- Another variant is to generate a random set of points with approximately uniform density throughout, with a random subset of them having a specified range of intensities, and the rest having a different range of intensities.
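As a rough, non-normative sketch of this random-layout variant (NumPy, the parameter names, the example intensity ranges, and the use of a plain uniform draw to approximate "approximately uniform density" are all assumptions for illustration):

```python
import numpy as np

def random_intensity_pattern(num_points=100_000, width=1.0, height=1.0,
                             high_fraction=0.5, high_range=(200, 255),
                             low_range=(100, 125), seed=0):
    """Generate points with roughly uniform density; a random subset gets peak
    intensities in one range and the remainder gets intensities in another."""
    rng = np.random.default_rng(seed)
    xy = rng.uniform([0, 0], [width, height], size=(num_points, 2))
    high_mask = rng.random(num_points) < high_fraction
    intensity = np.where(high_mask,
                         rng.integers(*high_range, size=num_points, endpoint=True),
                         rng.integers(*low_range, size=num_points, endpoint=True))
    return xy, intensity
```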
- One or more aspects are directed towards projecting light through a diffractive optical component to project a pattern comprising a first set of spots corresponding to a first intensity range, and a second set of spots corresponding to a second intensity range.
- the positions of the spots in the first set are based upon an initial grid layout, and the positions of spots in the second set of spots are based upon the positions of the first set of spots.
- the first set of spots and the second set of spots are sensed as left and right stereo camera images.
- the images are processed to correlate spots in the left image with spots in the right image, in which scanlines of the images are not aligned with the initial grid layout.
- FIG. 1 is a block diagram representing example components that may be used to project and capture a light pattern modulated with different intensities, according to one or more example implementations.
- FIG. 2 is a representation of an example of projecting dots having different intensities into a scene, according to one or more example implementations.
- FIGS. 3A and 3B are representations of how a pattern may be designed based upon a grid, with subdivision of points aligned via the grid, to facilitate having points with different intensities, according to one or more example implementations.
- FIG. 4 is a representation of further subdivision of points having different intensities, according to one or more example implementations.
- FIG. 5 is a flow diagram representing example steps in laying out points of different intensities, such as for encoding corresponding data into a diffractive optical element, according to one or more example implementations.
- FIG. 6 is a block diagram representing example components of a device that projects a diffraction pattern of light having different intensities, according to one example implementation.
- FIGS. 7 and 8 are representations of how non-rotation versus rotation of a projected pattern affects scanning of captured images that include the projected pattern, according to one or more example implementations.
- FIG. 9 is a representation of how dots of different intensities may be captured in a part of an image, and moved over time, according to one or more example implementations.
- FIG. 10 is a block diagram representing an exemplary non-limiting computing system or operating environment, in the form of a gaming system, into which one or more aspects of various embodiments described herein can be implemented.
- a light pattern projected into a scene, in which the light pattern is configured to provide for enhanced pattern matching, including at different depths to illuminated objects.
- a light pattern includes intermixed points of light (e.g., spots such as dots) of different intensities.
- the technology also leverages the depth-dependent appearance of the pattern by having the pattern include points that are semi-randomly distributed.
- the peak intensities of neighboring points are different. This results in local changes in intensity independent of the scene depth, to allow stereo matching to function properly.
- the projected light pattern may use spots, generally exemplified herein as dots, but the dots may be of any shape.
- the dots are exemplified as arranged according to a triangular grid; however, this is only one example, and other arrangements (e.g., a hexagonal grid) may be implemented.
- the rotation angles of the patterns described below, as well as the different ranges or values of intensity peaks (e.g., for large, medium and small intensities), are only examples and may be varied.
- the present invention is not limited to any particular embodiments, aspects, concepts, structures, functionalities or examples described herein. Rather, any of the embodiments, aspects, concepts, structures, functionalities or examples described herein are non-limiting, and the present invention may be used in various ways that provide benefits and advantages in active depth sensing and image processing in general.
- FIG. 1 shows an example system in which stereo cameras 102 and 103 of an image capturing system or subsystem 104 capture images synchronized in time (e.g., the cameras are “genlocked”).
- the cameras capture infrared (IR) images, as IR does not affect the visible appearance of the scene (which is highly advantageous, such as in video conferencing and object modeling applications).
- more than two IR depth-sensing cameras may be present.
- one or more other cameras may be present in a given system, such as RGB cameras, and such other cameras may be used to help correlate dot pairs in different stereo images, for example.
- a projector 106 projects an IR pattern onto a scene, such as a pattern of spots (e.g., dots) or a line pattern, although other spot shapes and/or pattern types may be used.
- dots are generally described hereinafter.
- the cameras 102 and 103 capture texture data as part of the infrared image data.
- the dots in the pattern are arranged with different intensities, and also the pattern may be rotated relative to the cameras.
- the pattern with intensity modulation may be designed (e.g., encoded) into a diffractive optical component (a diffractive optical element or combination of elements) that disperses laser light into the scene, e.g., as a dot pattern.
- FIG. 2 exemplifies this projection concept.
- the projector 106, represented as a circle in between the stereo cameras 102 and 103, projects a dot pattern onto a scene 222.
- the dot pattern is modulated with different intensities, and the dot pattern may be rotated (e.g., fifteen degrees) relative to the cameras' orientation.
- the cameras 102 and 103 capture the dots as they reflect off of object surfaces in the scene 222 and (possibly) the background. In general, one or more features of the captured dots are indicative of the distance to the reflective surface.
- FIG. 2 is not intended to be to scale, nor convey any sizes, distance, dot distribution pattern, dot density and so on. However, it is understood that different intensities exist in the dot pattern, and that the dot pattern may be rotated relative to the cameras.
- the placement of the projector 106 may be outside the cameras (e.g., FIG. 1 ), or in between the cameras ( FIG. 2 ) or at another location, such as above or below one or both of the cameras.
- the examples herein are in no way limiting of where the cameras and/or projector are located relative to one another, and similarly, the cameras may be positioned at different positions relative to each other.
- the example image capturing system or subsystem 104 includes a controller 108 that via a camera interface 110 controls the operation of the cameras 102 and 103 .
- the exemplified controller via a projector interface 112 also controls the operation of the projector 106 .
- the cameras 102 and 103 are synchronized (genlocked) to capture stereo images at the same time, such as by a controller signal (or different signals for each camera).
- the projector 106 may be turned on or off, pulsed, and otherwise have one or more parameters controllably varied, for example.
- the images 116 captured by the cameras 102 and 103 are provided to an image processing system or subsystem 118 .
- the image processing system 118 and image capturing system or subsystem 104 may be combined into a single device.
- a home entertainment device may include all of the components shown in FIG. 1 (as well as others not shown).
- parts (or all) of the image capturing system or subsystem 104 such as the cameras and projector, may be a separate device that couples to a gaming console, personal computer, mobile device, dedicated processing device and/or the like.
- a gaming console is exemplified in FIG. 10 as one environment that may be used for processing images into depth data.
- the image processing system or subsystem 118 includes a processor 120 and a memory 122 containing one or more image processing algorithms 124 .
- One or more depth maps may be obtained via the algorithms 124 such as by extracting matching features (such as dots and/or lines).
- different dots or other projected elements have different features when captured, including intensity (brightness), depending on the distance from the projector to the reflective surfaces and/or the distance from the camera to the reflective surfaces.
- the dots in different images taken at the same time may be correlated with one another, such as by matching small (e.g., RGB) patches between RGB color images of the same scene captured at the same instant.
- known algorithms can determine individual depth-related features (depth maps) by matching projected light components (e.g., dots) in each image, using disparities of certain features between matched dots to determine depths. This is one way in which a depth map may be obtained via stereo image processing.
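For reference, the triangulation step mentioned above reduces, for a rectified stereo pair, to the standard relation Z = f·B/d; a one-line helper follows (the calibration names `focal_length_px` and `baseline_m` are illustrative assumptions, not values from the patent):

```python
def depth_from_disparity(disparity_px, focal_length_px, baseline_m):
    """Depth in meters from horizontal disparity (pixels) for a rectified pair:
    Z = f * B / d."""
    if disparity_px <= 0:
        return float("inf")  # no measurable disparity => effectively at infinity
    return focal_length_px * baseline_m / disparity_px
```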
- an interface 132 to the image processing system or subsystem 118 may be provided, such as for connecting a keyboard, game controller, display, pointing device, microphone for speech commands and/or the like, as appropriate for a user to interact with an application or the like that uses the depth map.
- FIGS. 3A and 3B along with FIG. 4 show the concept of subdivision, in which dots of larger intensity (larger dots with an “X” shaped cross therein) are arranged in a triangular grid layout 330 ( FIG. 3A ).
- each triangle of the larger intensity dots is subdivided by triangles of lesser intensity dots (circles), providing the pattern 332 .
- each of those sub-triangle sub-patterns is further subdivided by even lesser intensity dots (smaller-sized circles relative to those in FIG. 3B ).
- FIG. 4 represents a triangular pattern 440 of higher intensity dots, medium intensity dots, and lower intensity dots.
- the dot sizes relative to the distribution pattern and each other are only intended to illustrate distribution of dots of differing relative intensities or intensity ranges, and are not intended to convey any particular intensity levels or ranges.
- FIG. 5 summarizes subdivision, beginning at step 502 where in this example a triangular grid of a specified between-vertex distance is generated, e.g., comprising regular triangles or substantially regular triangles (or other polygons).
- the intensity peaks are set to a high value; however, rather than being the same intensity value for each point, the high values may be randomly set to be within a high range (step 504), e.g., between 200 and 255 (with 255 being the maximum intensity).
- an intensity “range” includes a range with as little as one single fixed intensity value, e.g., a range may be from 200 to 200.
- Step 506 represents adding points between the previously generated points, e.g., as smaller sets of triangles (a “subdivision”) such as shown in FIG. 3B .
- Step 508 randomly sets the intensity peaks of these points to be within a lower range, e.g., between 100 and 125. Note that these example intensity ranges do not overlap one another, but it is feasible to have different ranges overlap to an extent; if so, weighted random techniques may be used to bias most values in overlapping ranges away from one another.
- Step 510 evaluates whether subdivision has been completed to the lowest desired level, which is configurable. Thus, by returning to step 506, another subdivision of points may optionally be added (such as exemplified in FIG. 4), with an even lower range of intensities, and so on, until the desired pattern and sets of intensities/intensity ranges are reached.
- the result is a projection pattern that contains sub-patterns, in this example different sets of triangular sub-patterns, such as a larger intensity sub-pattern set and a smaller-intensity sub-pattern set ( FIG. 3B ), or small, medium and large intensity sub-pattern sets ( FIG. 4 ) and so on.
- the sets/sub-patterns are interleaved via subdivision.
- step 512 represents pseudo-randomly rearranging (e.g., slightly “jittering”) at least some of the points into modified positions, such as to further reduce repetition intervals.
- this repositioning of a point is small relative to its respective triangle (or other grid pattern), whereby the regular polygon or substantially regular polygon is now modified to be only generally/approximately regular.
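A compact Python sketch of steps 502-512 as just described, for illustration only: it lays out a coarse triangular grid with peak intensities drawn from a high range, adds one level of subdivision points (edge midpoints) with intensities from a lower range, and then jitters every point slightly. The grid size, spacing, ranges and jitter amount are assumptions, and a real pattern would subsequently be encoded into a diffractive optical element rather than drawn directly.

```python
import math
import random

def triangular_pattern(rows=20, cols=20, spacing=1.0,
                       high_range=(200, 255), low_range=(100, 125),
                       jitter=0.05, seed=0):
    """Sketch of FIG. 5: coarse triangular grid with high-range peak
    intensities (steps 502/504), one subdivision level with a lower range
    (steps 506/508), and a small pseudo-random jitter (step 512).
    Returns a list of (x, y, peak_intensity) tuples."""
    rng = random.Random(seed)
    sq3 = math.sqrt(3.0)
    points = []
    # Enumerate the half-spacing triangular lattice; indices that are both
    # even are the original coarse grid points, the rest are the added
    # subdivision (edge-midpoint) points.
    for j in range(2 * rows):
        for i in range(2 * cols):
            x = i * spacing / 2.0 + j * spacing / 4.0
            y = j * spacing * sq3 / 4.0
            is_coarse = (i % 2 == 0) and (j % 2 == 0)
            lo, hi = high_range if is_coarse else low_range
            intensity = rng.randint(lo, hi)
            # Jitter each point slightly relative to its grid position.
            x += rng.uniform(-jitter, jitter) * spacing
            y += rng.uniform(-jitter, jitter) * spacing
            points.append((x, y, intensity))
    return points
```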
- FIG. 6 is one such example configuration in which a diffractive optical component 660 (e.g., one or more diffractive optical elements) is configured to output an intensity-modulated illumination pattern.
- the component 660 may be built into or coupled to a device 662, such as being built into or part of a home entertainment device.
- a laser 664 (e.g., a multi-mode laser) provides the light that the diffractive optical component 660 disperses into the pattern.
- Stereo cameras 666A and 666B capture the reflection from an illuminated object (e.g., person 668), and the captured images are used as desired; note that a single camera may be used in a given implementation.
- the diffractive optical component 660 disperses the laser light into a large number of spots based upon the pattern designed as described herein, such as on the order of 100,000 dots. Some of the pattern is represented in FIG. 6 by the solid lines coming from the element and by the dots on the object/person 668 and image plane 670. Note that as with any of the figures herein, neither FIG. 6 nor its components are intended to be to scale or convey any particular distance, distribution and/or the like.
- FIGS. 7 and 8 illustrate another aspect, namely rotation of the modulated patterns.
- in FIG. 7, camera-captured dots of part of a left pattern 770L are shown alongside parts of a right pattern 770R.
- dot A correlates with dot C and is supposed to be matched with it; however, because similar arrangements of dots repeat along the scanline, dot B or possibly dot D may be erroneously matched with dot A instead.
- in FIG. 8, camera-captured dots of part of a rotated left pattern 880L are shown alongside parts of a rotated right pattern 880R.
- the rotation (e.g., by fifteen degrees in this example, although other rotation angles may be used) helps to provide a larger repetition interval along the scanline (x-direction).
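A trivial sketch of this rotation step (the fifteen-degree value is the example angle from the text; everything else is an illustrative assumption): rotating the pattern coordinates before projection means a horizontal camera scanline cuts across the lattice rows instead of running along them, lengthening the interval before similar dot arrangements repeat.

```python
import math

def rotate_pattern(points, angle_degrees=15.0):
    """Rotate (x, y) pattern points about the origin so that lattice rows are
    no longer parallel to the cameras' horizontal scanlines."""
    theta = math.radians(angle_degrees)
    c, s = math.cos(theta), math.sin(theta)
    return [(x * c - y * s, x * s + y * c) for x, y in points]
```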
- Rotation and intensity distribution are generally shown in the partial image representation 990 of FIG. 9, where the dots are illustrated by concentric circles, and (some relative) intensity by the sizes thereof.
- the pixels are represented by the square blocks behind the dots. Note that in FIG. 9 the different diameters of the circles only suggest changes in intensity; the size of the circles and the grid squares are not intended to convey any particular scale, resolution, or the like, nor any particular intensity value or relative intensity values (other than within at least two different ranges). Further, the density of the dots and/or their sizes or distribution are not intended to represent any actual density and/or distribution.
- a light pattern modulated with different intensities may be based upon a grid, and projected such that the cameras that capture the light pattern are not aligned with the grid on which the pattern was based.
- the intensity-modulated pattern provides for more robust stereo matching/depth sensing.
- FIG. 10 is a functional block diagram of an example gaming and media system 1000 and shows functional components in more detail.
- Console 1001 has a central processing unit (CPU) 1002 , and a memory controller 1003 that facilitates processor access to various types of memory, including a flash Read Only Memory (ROM) 1004 , a Random Access Memory (RAM) 1006 , a hard disk drive 1008 , and portable media drive 1009 .
- the CPU 1002 includes a level 1 cache 1010 , and a level 2 cache 1012 to temporarily store data and hence reduce the number of memory access cycles made to the hard drive, thereby improving processing speed and throughput.
- the CPU 1002 , the memory controller 1003 , and various memory devices are interconnected via one or more buses (not shown).
- the details of the bus that is used in this implementation are not particularly relevant to understanding the subject matter of interest being discussed herein.
- a bus may include one or more of serial and parallel buses, a memory bus, a peripheral bus, and a processor or local bus, using any of a variety of bus architectures.
- bus architectures can include an Industry Standard Architecture (ISA) bus, a Micro Channel Architecture (MCA) bus, an Enhanced ISA (EISA) bus, a Video Electronics Standards Association (VESA) local bus, and a Peripheral Component Interconnects (PCI) bus also known as a Mezzanine bus.
- the CPU 1002 , the memory controller 1003 , the ROM 1004 , and the RAM 1006 are integrated onto a common module 1014 .
- the ROM 1004 is configured as a flash ROM that is connected to the memory controller 1003 via a Peripheral Component Interconnect (PCI) bus or the like and a ROM bus or the like (neither of which are shown).
- the RAM 1006 may be configured as multiple Double Data Rate Synchronous Dynamic RAM (DDR SDRAM) modules that are independently controlled by the memory controller 1003 via separate buses (not shown).
- the hard disk drive 1008 and the portable media drive 1009 are shown connected to the memory controller 1003 via the PCI bus and an AT Attachment (ATA) bus 1016 .
- dedicated data bus structures of different types can also be applied in the alternative.
- a three-dimensional graphics processing unit 1020 and a video encoder 1022 form a video processing pipeline for high speed and high resolution (e.g., High Definition) graphics processing.
- Data are carried from the graphics processing unit 1020 to the video encoder 1022 via a digital video bus (not shown).
- An audio processing unit 1024 and an audio codec (coder/decoder) 1026 form a corresponding audio processing pipeline for multi-channel audio processing of various digital audio formats. Audio data are carried between the audio processing unit 1024 and the audio codec 1026 via a communication link (not shown).
- the video and audio processing pipelines output data to an A/V (audio/video) port 1028 for transmission to a television or other display/speakers.
- the video and audio processing components 1020 , 1022 , 1024 , 1026 and 1028 are mounted on the module 1014 .
- FIG. 10 shows the module 1014 including a USB host controller 1030 and a network interface (NW I/F) 1032 , which may include wired and/or wireless components.
- the USB host controller 1030 is shown in communication with the CPU 1002 and the memory controller 1003 via a bus (e.g., PCI bus) and serves as host for peripheral controllers 1034 .
- the network interface 1032 provides access to a network (e.g., Internet, home network, etc.) and may be any of a wide variety of various wire or wireless interface components including an Ethernet card or interface module, a modem, a Bluetooth module, a cable modem, and the like.
- the console 1001 includes a controller support subassembly 1040, for supporting four game controllers 1041(1)-1041(4).
- the controller support subassembly 1040 includes any hardware and software components needed to support wired and/or wireless operation with an external control device, such as for example, a media and game controller.
- a front panel I/O subassembly 1042 supports the multiple functionalities of a power button 1043 , an eject button 1044 , as well as any other buttons and any LEDs (light emitting diodes) or other indicators exposed on the outer surface of the console 1001 .
- the subassemblies 1040 and 1042 are in communication with the module 1014 via one or more cable assemblies 1046 or the like.
- the console 1001 can include additional controller subassemblies.
- the illustrated implementation also shows an optical I/O interface 1048 that is configured to send and receive signals (e.g., from a remote control 1049 ) that can be communicated to the module 1014 .
- Memory units (MUs) 1050(1) and 1050(2) are illustrated as being connectable to MU ports “A” 1052(1) and “B” 1052(2), respectively.
- Each MU 1050 offers additional storage on which games, game parameters, and other data may be stored.
- the other data can include one or more of a digital game component, an executable gaming application, an instruction set for expanding a gaming application, and a media file.
- each MU 1050 can be accessed by the memory controller 1003 .
- a system power supply module 1054 provides power to the components of the gaming system 1000 .
- a fan 1056 cools the circuitry within the console 1001 .
- An application 1060 comprising machine instructions is typically stored on the hard disk drive 1008 .
- various portions of the application 1060 are loaded into the RAM 1006 , and/or the caches 1010 and 1012 , for execution on the CPU 1002 .
- the application 1060 can include one or more program modules for performing various display functions, such as controlling dialog screens for presentation on a display (e.g., high definition monitor), controlling transactions based on user inputs and controlling data transmission and reception between the console 1001 and externally connected devices.
- the gaming system 1000 may be operated as a standalone system by connecting the system to a high definition monitor, a television, a video projector, or other display device. In this standalone mode, the gaming system 1000 enables one or more players to play games, or enjoy digital media, e.g., by watching movies, or listening to music. However, with the integration of broadband connectivity made available through the network interface 1032, the gaming system 1000 may further be operated as a participating component in a larger network gaming community or system.
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/915,626 US20140307055A1 (en) | 2013-04-15 | 2013-06-11 | Intensity-modulated light pattern for active stereo |
EP14725312.4A EP2986935B1 (en) | 2013-04-15 | 2014-04-14 | Intensity-modulated light pattern for active stereo |
CN201480021199.4A CN105229412B (zh) | 2013-04-15 | 2014-04-14 | 用于主动立体的强度调制光图案 |
PCT/US2014/033910 WO2014172222A1 (en) | 2013-04-15 | 2014-04-14 | Intensity-modulated light pattern for active stereo |
US15/912,555 US10928189B2 (en) | 2013-04-15 | 2018-03-05 | Intensity-modulated light pattern for active stereo |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201361812232P | 2013-04-15 | 2013-04-15 | |
US13/915,626 US20140307055A1 (en) | 2013-04-15 | 2013-06-11 | Intensity-modulated light pattern for active stereo |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/912,555 Continuation US10928189B2 (en) | 2013-04-15 | 2018-03-05 | Intensity-modulated light pattern for active stereo |
Publications (1)
Publication Number | Publication Date |
---|---|
US20140307055A1 (en) | 2014-10-16
Family
ID=51686521
Family Applications (14)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/915,626 Abandoned US20140307055A1 (en) | 2013-04-15 | 2013-06-11 | Intensity-modulated light pattern for active stereo |
US13/915,622 Active 2033-12-30 US10268885B2 (en) | 2013-04-15 | 2013-06-11 | Extracting true color from a color and infrared sensor |
US13/918,892 Active 2034-04-20 US9760770B2 (en) | 2013-04-15 | 2013-06-14 | Parallel memories for multidimensional data access |
US13/923,135 Active US9959465B2 (en) | 2013-04-15 | 2013-06-20 | Diffractive optical element with undiffracted light expansion for eye safe operation |
US13/924,464 Active 2035-04-28 US10929658B2 (en) | 2013-04-15 | 2013-06-21 | Active stereo with adaptive support weights from a separate image |
US13/924,485 Active 2035-04-10 US9922249B2 (en) | 2013-04-15 | 2013-06-21 | Super-resolving depth map by moving pattern projector |
US13/924,475 Expired - Fee Related US9697424B2 (en) | 2013-04-15 | 2013-06-21 | Active stereo with satellite device or devices |
US13/925,762 Active 2034-09-26 US9928420B2 (en) | 2013-04-15 | 2013-06-24 | Depth imaging system based on stereo vision and infrared radiation |
US14/088,408 Abandoned US20140309764A1 (en) | 2013-04-15 | 2013-11-24 | Adaptive material deposition in three-dimensional fabrication |
US14/253,696 Active US9508003B2 (en) | 2013-04-15 | 2014-04-15 | Hardware-amenable connected components labeling |
US15/889,188 Active US10816331B2 (en) | 2013-04-15 | 2018-02-05 | Super-resolving depth map by moving pattern projector |
US15/912,555 Active US10928189B2 (en) | 2013-04-15 | 2018-03-05 | Intensity-modulated light pattern for active stereo |
US15/937,851 Abandoned US20180218210A1 (en) | 2013-04-15 | 2018-03-27 | Diffractive optical element with undiffracted light expansion for eye safe operation |
US18/186,933 Active US12305974B2 (en) | 2013-04-15 | 2023-03-20 | Diffractive optical element with undiffracted light expansion for eye safe operation |
Family Applications After (13)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/915,622 Active 2033-12-30 US10268885B2 (en) | 2013-04-15 | 2013-06-11 | Extracting true color from a color and infrared sensor |
US13/918,892 Active 2034-04-20 US9760770B2 (en) | 2013-04-15 | 2013-06-14 | Parallel memories for multidimensional data access |
US13/923,135 Active US9959465B2 (en) | 2013-04-15 | 2013-06-20 | Diffractive optical element with undiffracted light expansion for eye safe operation |
US13/924,464 Active 2035-04-28 US10929658B2 (en) | 2013-04-15 | 2013-06-21 | Active stereo with adaptive support weights from a separate image |
US13/924,485 Active 2035-04-10 US9922249B2 (en) | 2013-04-15 | 2013-06-21 | Super-resolving depth map by moving pattern projector |
US13/924,475 Expired - Fee Related US9697424B2 (en) | 2013-04-15 | 2013-06-21 | Active stereo with satellite device or devices |
US13/925,762 Active 2034-09-26 US9928420B2 (en) | 2013-04-15 | 2013-06-24 | Depth imaging system based on stereo vision and infrared radiation |
US14/088,408 Abandoned US20140309764A1 (en) | 2013-04-15 | 2013-11-24 | Adaptive material deposition in three-dimensional fabrication |
US14/253,696 Active US9508003B2 (en) | 2013-04-15 | 2014-04-15 | Hardware-amenable connected components labeling |
US15/889,188 Active US10816331B2 (en) | 2013-04-15 | 2018-02-05 | Super-resolving depth map by moving pattern projector |
US15/912,555 Active US10928189B2 (en) | 2013-04-15 | 2018-03-05 | Intensity-modulated light pattern for active stereo |
US15/937,851 Abandoned US20180218210A1 (en) | 2013-04-15 | 2018-03-27 | Diffractive optical element with undiffracted light expansion for eye safe operation |
US18/186,933 Active US12305974B2 (en) | 2013-04-15 | 2023-03-20 | Diffractive optical element with undiffracted light expansion for eye safe operation |
Country Status (11)
Cited By (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20150036023A1 (en) * | 2012-03-13 | 2015-02-05 | Dolby Laboratories Licensing Corporation | Lighting system and method for image and object enhancement |
US20150229915A1 (en) * | 2014-02-08 | 2015-08-13 | Microsoft Corporation | Environment-dependent active illumination for stereo matching |
US20160371845A1 (en) * | 2015-06-18 | 2016-12-22 | Apple Inc. | Monitoring doe performance using software scene evaluation |
US9699394B2 (en) | 2015-03-09 | 2017-07-04 | Microsoft Technology Licensing, Llc | Filter arrangement for image sensor |
DE102017215850A1 (de) * | 2017-09-08 | 2019-03-14 | Robert Bosch Gmbh | Diffraktives optisches Element und Verfahren zu dessen Herstellung |
US10268885B2 (en) | 2013-04-15 | 2019-04-23 | Microsoft Technology Licensing, Llc | Extracting true color from a color and infrared sensor |
WO2019182840A1 (en) * | 2018-03-22 | 2019-09-26 | Microsoft Technology Licensing, Llc | Replicated dot maps for simplified depth computation using machine learning |
US10565720B2 (en) | 2018-03-27 | 2020-02-18 | Microsoft Technology Licensing, Llc | External IR illuminator enabling improved head tracking and surface reconstruction for virtual reality |
US10890839B1 (en) * | 2019-11-06 | 2021-01-12 | Himax Technologies Limited | Structured light imaging device |
US20230334701A1 (en) * | 2021-06-28 | 2023-10-19 | Motional Ad Llc | Systems and methods for camera alignment using pre-distorted targets |
US20240005538A1 (en) * | 2019-11-27 | 2024-01-04 | Trinamix Gmbh | Depth measurement through display |
US12288362B2 (en) | 2022-01-21 | 2025-04-29 | Motional Ad Llc | Active alignment of an optical assembly with intrinsic calibration |
Families Citing this family (188)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20120072245A (ko) * | 2010-12-23 | 2012-07-03 | 한국전자통신연구원 | 스테레오 영상 정합 장치 및 방법 |
EP2700920B1 (en) | 2012-08-23 | 2016-06-22 | ams AG | Light sensor system and method for processing light sensor signals |
US9467680B2 (en) | 2013-12-12 | 2016-10-11 | Intel Corporation | Calibration of a three-dimensional acquisition system |
CN105829829B (zh) * | 2013-12-27 | 2019-08-23 | 索尼公司 | 图像处理装置和图像处理方法 |
US9720506B2 (en) * | 2014-01-14 | 2017-08-01 | Microsoft Technology Licensing, Llc | 3D silhouette sensing system |
CN105939837B (zh) * | 2014-01-16 | 2019-05-10 | 惠普发展公司,有限责任合伙企业 | 对用于增材制造系统的切片数据进行处理 |
US9842424B2 (en) * | 2014-02-10 | 2017-12-12 | Pixar | Volume rendering using adaptive buckets |
US20170078649A1 (en) | 2014-03-07 | 2017-03-16 | Brown University | Method and system for unsynchronized structured lighting |
US10005126B2 (en) * | 2014-03-19 | 2018-06-26 | Autodesk, Inc. | Systems and methods for improved 3D printing |
US9674493B2 (en) * | 2014-03-24 | 2017-06-06 | Omnivision Technologies, Inc. | Color image sensor with metal mesh to detect infrared light |
WO2015152829A1 (en) * | 2014-04-03 | 2015-10-08 | Heptagon Micro Optics Pte. Ltd. | Structured-stereo imaging assembly including separate imagers for different wavelengths |
GB201407270D0 (en) * | 2014-04-24 | 2014-06-11 | Cathx Res Ltd | 3D data in underwater surveys |
US9823842B2 (en) | 2014-05-12 | 2017-11-21 | The Research Foundation For The State University Of New York | Gang migration of virtual machines using cluster-wide deduplication |
US9533449B2 (en) | 2014-06-19 | 2017-01-03 | Autodesk, Inc. | Material deposition systems with four or more axes |
US10252466B2 (en) | 2014-07-28 | 2019-04-09 | Massachusetts Institute Of Technology | Systems and methods of machine vision assisted additive fabrication |
WO2016020073A1 (en) * | 2014-08-08 | 2016-02-11 | Cemb S.P.A. | Vehicle equipment with scanning system for contactless measurement |
US10455212B1 (en) * | 2014-08-25 | 2019-10-22 | X Development Llc | Projected pattern motion/vibration for depth sensing |
JP6397698B2 (ja) * | 2014-08-28 | 2018-09-26 | 任天堂株式会社 | 情報処理端末、情報処理プログラム、情報処理端末システム、および情報処理方法 |
US9507995B2 (en) * | 2014-08-29 | 2016-11-29 | X Development Llc | Combination of stereo and structured-light processing |
DE102014113389A1 (de) * | 2014-09-17 | 2016-03-17 | Pilz Gmbh & Co. Kg | Verfahren und Vorrichtung zum Identifizieren von Strukturelementen eines projizierten Strukturmusters in Kamerabildern |
US20170305066A1 (en) | 2014-09-26 | 2017-10-26 | Hewlett-Packard Development Company, L.P. | Lighting for additive manufacturing |
EP3018587B1 (en) * | 2014-11-05 | 2018-08-29 | Renesas Electronics Europe GmbH | Memory access unit |
EP3043159B1 (en) * | 2015-01-08 | 2019-12-18 | ams AG | Method for processing light sensor signals and light sensor system |
CN107003116A (zh) * | 2014-12-15 | 2017-08-01 | 索尼公司 | 图像捕捉装置组件、三维形状测量装置和运动检测装置 |
EP3040941B1 (en) * | 2014-12-29 | 2017-08-02 | Dassault Systèmes | Method for calibrating a depth camera |
DE102015202182A1 (de) * | 2015-02-06 | 2016-08-11 | Siemens Aktiengesellschaft | Vorrichtung und Verfahren zur sequentiellen, diffraktiven Musterprojektion |
US11562286B2 (en) * | 2015-02-06 | 2023-01-24 | Box, Inc. | Method and system for implementing machine learning analysis of documents for classifying documents by associating label values to the documents |
JP6484071B2 (ja) * | 2015-03-10 | 2019-03-13 | アルプスアルパイン株式会社 | 物体検出装置 |
CN106032059B (zh) * | 2015-03-13 | 2019-11-26 | 三纬国际立体列印科技股份有限公司 | 立体打印方法与立体打印装置 |
KR102238794B1 (ko) * | 2015-03-25 | 2021-04-09 | 한국전자통신연구원 | 영상 촬영 장치의 촬영 속도 증가 방법 |
DE112015006245B4 (de) | 2015-03-30 | 2019-05-23 | Fujifilm Corporation | Abstandsbild-Erfassungsvorrichtung und Abstandsbild-Erfassungsverfahren |
EP3081384B1 (en) * | 2015-04-17 | 2019-11-13 | Canon Kabushiki Kaisha | Image processing apparatus, image processing method, and program |
KR20250035038A (ko) | 2015-04-19 | 2025-03-11 | Fotonation Limited | Multi-baseline camera array system architectures for depth augmentation in VR/AR applications |
US9751263B2 (en) * | 2015-04-20 | 2017-09-05 | Xerox Corporation | Injection molding to finish parts printed with a three-dimensional object printer |
US10459114B2 (en) * | 2015-05-18 | 2019-10-29 | Lasermotive, Inc. | Wireless power transmitter and receiver |
US9683834B2 (en) * | 2015-05-27 | 2017-06-20 | Intel Corporation | Adaptable depth sensing system |
US9495584B1 (en) * | 2015-06-05 | 2016-11-15 | Digital Signal Corporation | System and method for facial recognition using images captured from a target illuminated with infrared light |
US9824278B2 (en) * | 2015-06-24 | 2017-11-21 | Netflix, Inc. | Determining native resolutions of video sequences |
KR102660109B1 (ko) * | 2015-07-13 | 2024-04-24 | Koninklijke Philips N.V. | Method and apparatus for determining a depth map for an image |
WO2017014691A1 (en) * | 2015-07-17 | 2017-01-26 | Heptagon Micro Optics Pte. Ltd. | Generating a distance map based on captured images of a scene |
US10699476B2 (en) | 2015-08-06 | 2020-06-30 | Ams Sensors Singapore Pte. Ltd. | Generating a merged, fused three-dimensional point cloud based on captured images of a scene |
WO2017030507A1 (en) | 2015-08-19 | 2017-02-23 | Heptagon Micro Optics Pte. Ltd. | Generating a disparity map having reduced over-smoothing |
CN106550228B (zh) * | 2015-09-16 | 2019-10-15 | Shanghai Tumeng Information Technology Co., Ltd. | Device for acquiring a depth map of a three-dimensional scene |
US20170116779A1 (en) * | 2015-10-26 | 2017-04-27 | Microsoft Technology Licensing, Llc | Volumetric representation of objects |
US10554956B2 (en) | 2015-10-29 | 2020-02-04 | Dell Products, Lp | Depth masks for image segmentation for depth-based computational photography |
US10021371B2 (en) | 2015-11-24 | 2018-07-10 | Dell Products, Lp | Method and apparatus for gross-level user and input detection using similar or dissimilar camera pair |
KR102323217B1 (ko) * | 2015-12-21 | 2021-11-08 | Samsung Electronics Co., Ltd. | Depth sensor controlling noise of macro pixels, three-dimensional camera, and control method |
US9800795B2 (en) | 2015-12-21 | 2017-10-24 | Intel Corporation | Auto range control for active illumination depth camera |
US10761497B2 (en) | 2016-01-14 | 2020-09-01 | Microsoft Technology Licensing, Llc | Printing 3D objects with automatic dimensional accuracy compensation |
CN106980630B (zh) * | 2016-01-19 | 2020-03-10 | Cainiao Smart Logistics Holding Limited | Data rotation display method and apparatus |
EP3417337A4 (en) * | 2016-02-18 | 2019-10-23 | Apple Inc. | HEAD-MOUNTED DISPLAY FOR VIRTUAL AND MIXED REALITY WITH INSIDE OUT POSITION, USER BODY AND ENVIRONMENTAL TRACKING |
JP6860000B2 (ja) * | 2016-03-03 | 2021-04-14 | Sony Corporation | Medical image processing apparatus, system, method, program, image processing system, and medical image processing system |
DE102016106121A1 (de) | 2016-04-04 | 2017-10-05 | Carl Zeiss Ag | Method and device for determining parameters for spectacle fitting |
US20190196449A1 (en) * | 2016-05-06 | 2019-06-27 | Yunbo ZHANG | Determining manufacturable models |
EP3273685A4 (en) * | 2016-06-08 | 2018-01-24 | Panasonic Intellectual Property Management Co., Ltd. | Projection system |
US10659764B2 (en) * | 2016-06-20 | 2020-05-19 | Intel Corporation | Depth image provision apparatus and method |
US10609359B2 (en) * | 2016-06-22 | 2020-03-31 | Intel Corporation | Depth image provision apparatus and method |
US10638060B2 (en) * | 2016-06-28 | 2020-04-28 | Intel Corporation | Color correction of RGBIR sensor stream based on resolution recovery of RGB and IR channels |
CN106210568A (zh) * | 2016-07-15 | 2016-12-07 | Shenzhen Orbbec Co., Ltd. | Image processing method and apparatus |
US10241244B2 (en) | 2016-07-29 | 2019-03-26 | Lumentum Operations Llc | Thin film total internal reflection diffraction grating for single polarization or dual polarization |
CN106204414A (zh) * | 2016-08-05 | 2016-12-07 | Lanpu Jinjing (Beijing) Technology Co., Ltd. | Method and system for dynamic image caching |
US10192311B2 (en) * | 2016-08-05 | 2019-01-29 | Qualcomm Incorporated | Methods and apparatus for codeword boundary detection for generating depth maps |
CN106375740B (zh) * | 2016-09-28 | 2018-02-06 | Huawei Technologies Co., Ltd. | Method, apparatus, and system for generating an RGB image |
CN106447588A (zh) * | 2016-09-30 | 2017-02-22 | Tianjin University | Optical image encryption method using chaotic double random phase encoding in the Fresnel transform domain |
JP6645394B2 (ja) * | 2016-10-03 | 2020-02-14 | Denso Corporation | Image sensor |
US10456984B2 (en) | 2016-12-16 | 2019-10-29 | Massachusetts Institute Of Technology | Adaptive material deposition for additive manufacturing |
EP3565259A1 (en) * | 2016-12-28 | 2019-11-06 | Panasonic Intellectual Property Corporation of America | Three-dimensional model distribution method, three-dimensional model receiving method, three-dimensional model distribution device, and three-dimensional model receiving device |
US10372974B2 (en) | 2017-01-11 | 2019-08-06 | Microsoft Technology Licensing, Llc | 3D imaging recognition by stereo matching of RGB and infrared images |
CN108399633A (zh) * | 2017-02-06 | 2018-08-14 | Roboteam Home Ltd. | Method and apparatus for stereo vision |
CN106908391A (zh) * | 2017-02-10 | 2017-06-30 | Guangdong Oppo Mobile Telecommunications Corp., Ltd. | Method and apparatus for recognizing the color of cover glass in a terminal |
CN106909320B (zh) * | 2017-02-20 | 2020-01-21 | Beijing Zhongke Ruixin Technology Co., Ltd. | Method, apparatus, and system for multi-dimensional data expansion and transmission |
US10827129B2 (en) * | 2017-02-24 | 2020-11-03 | Sony Corporation | Image processing apparatus and imaging apparatus |
US10955814B2 (en) | 2017-04-24 | 2021-03-23 | Autodesk, Inc. | Closed-loop robotic deposition of material |
US11181886B2 (en) * | 2017-04-24 | 2021-11-23 | Autodesk, Inc. | Closed-loop robotic deposition of material |
CN107084686B (zh) * | 2017-04-26 | 2019-04-30 | Xi'an Jiaotong University | Dynamic multi-light-stripe scanning measurement method without moving parts |
US11300402B2 (en) * | 2017-05-31 | 2022-04-12 | Hewlett-Packard Development Company, L.P. | Deriving topology information of a scene |
US20180347967A1 (en) * | 2017-06-01 | 2018-12-06 | RGBDsense Information Technology Ltd. | Method and apparatus for generating a random coding pattern for coding structured light |
US10817493B2 (en) | 2017-07-07 | 2020-10-27 | Raytheon Company | Data interpolation |
KR102346031B1 (ko) | 2017-07-25 | 2022-01-03 | Samsung Display Co., Ltd. | Display device and driving method thereof |
KR102402477B1 (ko) * | 2017-08-04 | 2022-05-27 | LG Innotek Co., Ltd. | ToF module |
US10586342B2 (en) * | 2017-08-31 | 2020-03-10 | Facebook Technologies, Llc | Shifting diffractive optical element for adjustable depth sensing resolution |
US20190072771A1 (en) * | 2017-09-05 | 2019-03-07 | Facebook Technologies, Llc | Depth measurement using multiple pulsed structured light projectors |
US10962790B2 (en) | 2017-09-05 | 2021-03-30 | Facebook Technologies, Llc | Depth measurement using a pulsed structured light projector |
CN107884066A (zh) * | 2017-09-29 | 2018-04-06 | Shenzhen Orbbec Co., Ltd. | Flood-illumination-based light sensor and 3D imaging device thereof |
US10545457B2 (en) | 2017-12-05 | 2020-01-28 | K Laser Technology, Inc. | Optical projector with off-axis diffractive element and conjugate images |
US10310281B1 (en) * | 2017-12-05 | 2019-06-04 | K Laser Technology, Inc. | Optical projector with off-axis diffractive element |
CN109889799B (zh) * | 2017-12-06 | 2020-08-25 | Xi'an Jiaotong University | Monocular structured-light depth perception method and apparatus based on an RGB-IR camera |
US10628952B2 (en) | 2017-12-11 | 2020-04-21 | Google Llc | Dual-band stereo depth sensing system |
DE102017222708A1 (de) * | 2017-12-14 | 2019-06-19 | Conti Temic Microelectronic Gmbh | 3D environment sensing by means of projector and camera modules |
US11227371B2 (en) * | 2017-12-14 | 2022-01-18 | Nec Corporation | Image processing device, image processing method, and image processing program |
JP6939501B2 (ja) * | 2017-12-15 | 2021-09-22 | Omron Corporation | Image processing system, image processing program, and image processing method |
CN108133494A (zh) * | 2018-01-17 | 2018-06-08 | Nanjing Huajie IMI Software Technology Co., Ltd. | System and method for simultaneously generating a depth map and a color image using RGB-IR |
DE102019000272B4 | 2018-01-19 | 2023-11-16 | Cognex Corporation | System for forming a homogenized illumination line which can be imaged as a low-speckle line |
US10317684B1 (en) | 2018-01-24 | 2019-06-11 | K Laser Technology, Inc. | Optical projector with on axis hologram and multiple beam splitter |
CN108319437B (zh) * | 2018-02-28 | 2019-01-11 | Shanghai Xixiang Yixiang E-Commerce Co., Ltd. | Content big data density analysis platform |
CN108490632B (zh) * | 2018-03-12 | 2020-01-10 | Guangdong Oppo Mobile Telecommunications Corp., Ltd. | Laser projection module, depth camera, and electronic device |
WO2019182881A1 (en) * | 2018-03-20 | 2019-09-26 | Magik Eye Inc. | Distance measurement using projection patterns of varying densities |
US10771766B2 (en) * | 2018-03-30 | 2020-09-08 | Mediatek Inc. | Method and apparatus for active stereo vision |
CN108564613A (zh) * | 2018-04-12 | 2018-09-21 | Vivo Mobile Communication Co., Ltd. | Depth data acquisition method and mobile terminal |
CN112236289B (zh) | 2018-05-22 | 2023-02-21 | Mantle Inc. | Method and system for automatic toolpath generation |
US10878590B2 (en) * | 2018-05-25 | 2020-12-29 | Microsoft Technology Licensing, Llc | Fusing disparity proposals in stereo matching |
CN108917640A (zh) * | 2018-06-06 | 2018-11-30 | Foshan University | Laser blind-hole depth detection method and system |
FI128523B (en) | 2018-06-07 | 2020-07-15 | Ladimo Oy | Modeling of topography of a 3D surface |
JP7297891B2 (ja) | 2018-07-19 | 2023-06-26 | Activ Surgical, Inc. | Systems and methods for multi-modal sensing of depth in vision systems for automated surgical robots |
US11067820B2 (en) | 2018-07-31 | 2021-07-20 | Himax Technologies Limited | Structured light projector and three-dimensional image sensing module |
CN109102540B (zh) * | 2018-08-16 | 2022-01-28 | Hangzhou Dianzi University | FPGA-based lower-limit separation and lane-division method for labeled area blocks |
TWI676781B (zh) * | 2018-08-17 | 2019-11-11 | Jianwei Technology Co., Ltd. | Three-dimensional scanning system |
US10761337B2 (en) * | 2018-08-24 | 2020-09-01 | Himax Technologies Limited | Projecting apparatus for spreading non-diffracted light |
JP6907277B2 (ja) * | 2018-08-30 | 2021-07-21 | Cognex Corporation | Method and apparatus for generating a three-dimensional reconstruction of an object with reduced distortion |
US11039122B2 (en) * | 2018-09-04 | 2021-06-15 | Google Llc | Dark flash photography with a stereo camera |
CN109146953B (zh) * | 2018-09-11 | 2021-12-10 | Hangzhou Dianzi University | FPGA-based upper-limit separation and lane-division method for labeled area blocks |
US10791277B2 (en) * | 2018-09-11 | 2020-09-29 | Cognex Corporation | Methods and apparatus for optimizing image acquisition of objects subject to illumination patterns |
US20200082160A1 (en) * | 2018-09-12 | 2020-03-12 | Kneron (Taiwan) Co., Ltd. | Face recognition module with artificial intelligence models |
KR102562360B1 (ko) * | 2018-10-05 | 2023-08-02 | LG Innotek Co., Ltd. | Method for acquiring depth information and camera module |
CN109532021B (zh) * | 2018-10-10 | 2020-08-25 | Zhejiang University | Layer-by-layer detection method for fused-deposition defects in 3D printing based on structured-light linear anomaly points |
US11176694B2 (en) * | 2018-10-19 | 2021-11-16 | Samsung Electronics Co., Ltd | Method and apparatus for active depth sensing and calibration method thereof |
US11480793B2 (en) * | 2018-10-24 | 2022-10-25 | Google Llc | Systems, devices, and methods for aligning a lens in a laser projector |
JP7146576B2 (ja) * | 2018-10-29 | 2022-10-04 | Shibaura Machine Co., Ltd. | Additive manufacturing apparatus, additive manufacturing method, and program |
WO2020091764A1 (en) | 2018-10-31 | 2020-05-07 | Hewlett-Packard Development Company, L.P. | Recovering perspective distortions |
US11024037B2 (en) | 2018-11-15 | 2021-06-01 | Samsung Electronics Co., Ltd. | Foreground-background-aware atrous multiscale network for disparity estimation |
US10628968B1 (en) * | 2018-12-05 | 2020-04-21 | Toyota Research Institute, Inc. | Systems and methods of calibrating a depth-IR image offset |
CN112912896B (zh) * | 2018-12-14 | 2024-06-28 | Apple Inc. | Machine-learning-assisted image prediction |
CN109798838B (zh) * | 2018-12-19 | 2020-10-27 | Xi'an Jiaotong University | ToF depth sensor based on laser speckle projection and ranging method thereof |
CN109741386B (zh) * | 2018-12-26 | 2020-07-31 | OmniVision Technologies (Wuhan) Co., Ltd. | Enhancement method for a stereo vision system and stereo vision system |
CA3125166A1 (en) | 2018-12-28 | 2020-07-02 | Activ Surgical, Inc. | User interface elements for orientation of remote camera during surgery |
WO2020140056A1 (en) | 2018-12-28 | 2020-07-02 | Activ Surgical, Inc. | Systems and methods to optimize reachability, workspace, and dexterity in minimally invasive surgery |
US10917568B2 (en) * | 2018-12-28 | 2021-02-09 | Microsoft Technology Licensing, Llc | Low-power surface reconstruction |
US11333895B1 (en) | 2019-01-11 | 2022-05-17 | Facebook Technologies, Llc | Systems and methods for structured light projector operational safety |
JP7211835B2 (ja) * | 2019-02-04 | 2023-01-24 | i-PRO Co., Ltd. | Imaging system and synchronization control method |
CN110087057B (zh) * | 2019-03-11 | 2021-10-12 | Goertek Inc. | Depth image acquisition method and apparatus for a projector |
US20200292297A1 (en) * | 2019-03-15 | 2020-09-17 | Faro Technologies, Inc. | Three-dimensional measurement device |
JP2022526626A (ja) | 2019-04-08 | 2022-05-25 | Activ Surgical, Inc. | Systems and methods for medical imaging |
US12292564B2 (en) | 2019-04-08 | 2025-05-06 | Activ Surgical, Inc. | Systems and methods for medical imaging |
US11039118B2 (en) | 2019-04-17 | 2021-06-15 | XRSpace CO., LTD. | Interactive image processing system using infrared cameras |
WO2020214821A1 (en) | 2019-04-19 | 2020-10-22 | Activ Surgical, Inc. | Systems and methods for trocar kinematics |
EP3731175A1 (en) * | 2019-04-26 | 2020-10-28 | XRSpace CO., LTD. | Interactive image processing system using infrared cameras |
CN110111390A (zh) * | 2019-05-15 | 2019-08-09 | Hunan University of Science and Technology | Omnidirectional vibration measurement method and system for thin-walled parts based on binocular-vision optical-flow tracking |
CN110012206A (zh) * | 2019-05-24 | 2019-07-12 | Guangdong Oppo Mobile Telecommunications Corp., Ltd. | Image acquisition method, image acquisition apparatus, electronic device, and readable storage medium |
CN110209363A (zh) * | 2019-05-30 | 2019-09-06 | Dalian University of Technology | Intelligent 3D printing path planning method based on a genetic algorithm |
EP3760966B1 (en) * | 2019-07-02 | 2024-06-26 | Topcon Corporation | Method of optical coherence tomography imaging and method of processing oct data |
WO2021035094A1 (en) | 2019-08-21 | 2021-02-25 | Activ Surgical, Inc. | Systems and methods for medical imaging |
CN110524874B (zh) * | 2019-08-23 | 2022-03-08 | Yuanzhi Technology (Shanghai) Co., Ltd. | Photocuring 3D printing apparatus and printing method thereof |
BR112022004811A2 (pt) | 2019-09-17 | 2022-06-21 | Boston Polarimetrics Inc | Systems and methods for surface modeling using polarization cues |
CN112559037B (zh) * | 2019-09-25 | 2024-04-12 | Alibaba Group Holding Limited | Instruction execution method, unit, apparatus, and system |
EP4042366A4 (en) | 2019-10-07 | 2023-11-15 | Boston Polarimetrics, Inc. | SYSTEMS AND METHODS FOR AUGMENTING SENSOR SYSTEMS AND IMAGING SYSTEMS WITH POLARIZATION |
US11796829B1 (en) * | 2019-10-31 | 2023-10-24 | Meta Platforms Technologies, Llc | In-field illuminator for eye depth sensing |
KR20230116068A (ko) | 2019-11-30 | 2023-08-03 | Boston Polarimetrics, Inc. | Systems and methods for transparent object segmentation using polarization signals |
CN113009705A (zh) * | 2019-12-19 | 2021-06-22 | Suzhou SVG Tech Group Co., Ltd. | Structured-light assembly that eliminates the influence of zero-order diffraction |
US11132804B2 (en) * | 2020-01-07 | 2021-09-28 | Himax Technologies Limited | Hybrid depth estimation system |
EP4081933A4 (en) | 2020-01-29 | 2024-03-20 | Intrinsic Innovation LLC | SYSTEMS AND METHODS FOR CHARACTERIZING OBJECT POSE DETECTION AND MEASUREMENT SYSTEMS |
US11797863B2 (en) | 2020-01-30 | 2023-10-24 | Intrinsic Innovation Llc | Systems and methods for synthesizing data for training statistical models on different imaging modalities including polarized images |
WO2021171695A1 (ja) * | 2020-02-28 | 2021-09-02 | Fujifilm Corporation | Imaging system, imaging system control method, and program |
CN113365035B (zh) * | 2020-03-04 | 2022-10-21 | Hefei Ingenic Technology Co., Ltd. | Calibration system for image color restoration |
US11503266B2 (en) * | 2020-03-06 | 2022-11-15 | Samsung Electronics Co., Ltd. | Super-resolution depth map generation for multi-camera or other environments |
CN111246073B (zh) * | 2020-03-23 | 2022-03-25 | Vivo Mobile Communication Co., Ltd. | Imaging apparatus, method, and electronic device |
EP4144085A4 (en) | 2020-04-30 | 2023-10-25 | Siemens Healthcare Diagnostics, Inc. | APPARATUS, METHOD FOR CALIBRATING AN APPARATUS AND ASSOCIATED DEVICE |
CN111678457B (zh) * | 2020-05-08 | 2021-10-01 | Xi'an Jiaotong University | ToF device under an OLED transparent screen and ranging method |
US11953700B2 (en) | 2020-05-27 | 2024-04-09 | Intrinsic Innovation Llc | Multi-aperture polarization optical systems using beam splitters |
CN111787084A (zh) * | 2020-06-23 | 2020-10-16 | Hangzhou Shulan Technology Co., Ltd. | Method and apparatus for circle-selecting objects |
KR102788915B1 (ko) | 2020-09-10 | 2025-03-31 | Samsung Electronics Co., Ltd. | Augmented reality device and control method thereof |
CN114268774A (zh) * | 2020-09-16 | 2022-04-01 | Guangdong Oppo Mobile Telecommunications Corp., Ltd. | Image acquisition method, image sensor, apparatus, device, and storage medium |
US11657529B2 (en) * | 2020-10-12 | 2023-05-23 | Black Sesame Technologies Inc. | Multiple camera system with flash for depth map generation |
DE102020133085A1 | 2020-12-11 | 2022-06-15 | Dürr Assembly Products GmbH | Method for measuring the fender edge of a vehicle in a test stand |
CN112959661B (zh) * | 2021-01-26 | 2024-02-02 | Shenzhen Chuangbide Technology Co., Ltd. | Light-uniformity optimization and compensation method and apparatus for LCD photocuring 3D printing |
WO2022164720A1 (en) * | 2021-01-29 | 2022-08-04 | Essentium, Inc. | Contour smoothing for material extrusion three-dimensionally printed parts |
CN112859330B (zh) * | 2021-02-18 | 2025-03-25 | Jiaxing Uphoton Optoelectronics Technology Co., Ltd. | Diffractive optical element and design method, optical projection apparatus, and vehicle |
US12020455B2 (en) | 2021-03-10 | 2024-06-25 | Intrinsic Innovation Llc | Systems and methods for high dynamic range image reconstruction |
US12069227B2 (en) | 2021-03-10 | 2024-08-20 | Intrinsic Innovation Llc | Multi-modal and multi-spectral stereo camera arrays |
US11290658B1 (en) | 2021-04-15 | 2022-03-29 | Boston Polarimetrics, Inc. | Systems and methods for camera exposure control |
US11954886B2 (en) | 2021-04-15 | 2024-04-09 | Intrinsic Innovation Llc | Systems and methods for six-degree of freedom pose estimation of deformable objects |
US12067746B2 (en) | 2021-05-07 | 2024-08-20 | Intrinsic Innovation Llc | Systems and methods for using computer vision to pick up small objects |
WO2022249321A1 (ja) * | 2021-05-26 | 2022-12-01 | Nippon Telegraph and Telephone Corporation | Crack image inspection system and method |
US12175741B2 (en) | 2021-06-22 | 2024-12-24 | Intrinsic Innovation Llc | Systems and methods for a vision guided end effector |
US12340538B2 (en) | 2021-06-25 | 2025-06-24 | Intrinsic Innovation Llc | Systems and methods for generating and using visual datasets for training computer vision models |
US12172310B2 (en) | 2021-06-29 | 2024-12-24 | Intrinsic Innovation Llc | Systems and methods for picking objects using 3-D geometry and segmentation |
US11689813B2 (en) | 2021-07-01 | 2023-06-27 | Intrinsic Innovation Llc | Systems and methods for high dynamic range imaging using crossed polarizers |
US12293535B2 (en) | 2021-08-03 | 2025-05-06 | Intrinsic Innovation Llc | Systems and methods for training pose estimators in computer vision |
JP2024530158A (ja) * | 2021-08-06 | 2024-08-16 | PPG Industries Ohio, Inc. | Systems and methods for 3D printing non-planar surfaces |
US11852439B2 (en) * | 2021-11-24 | 2023-12-26 | Wrap Technologies, Inc. | Systems and methods for generating optical beam arrays |
CN114371554B (zh) * | 2021-12-31 | 2024-08-13 | Jiaxing Uphoton Optoelectronics Technology Co., Ltd. | Diffractive optical element for beam splitting, design method thereof, and structured-light projector |
CN116800947A (zh) * | 2022-03-16 | 2023-09-22 | Ambarella International LP | Rapid RGB-IR calibration verification for mass-production processes |
CN118922685A (zh) | 2022-03-30 | 2024-11-08 | Sony Group Corporation | Information processing device, information processing method, and information processing program |
KR20230174621A (ko) * | 2022-06-21 | 2023-12-28 | Samsung Electronics Co., Ltd. | Electronic device for generating a depth map and operating method thereof |
US12244937B2 (en) * | 2022-07-29 | 2025-03-04 | Texas Instruments Incorporated | RGB-IR pixel pattern conversion via conversion engine |
US11972504B2 (en) * | 2022-08-10 | 2024-04-30 | Zhejiang Lab | Method and system for overlapping sliding window segmentation of image based on FPGA |
KR102674408B1 (ko) * | 2022-12-28 | 2024-06-12 | AIDICOM Co., Ltd. | Non-contact medical image control system |
US12207003B2 (en) | 2023-03-02 | 2025-01-21 | e-con Systems India Private | System and method for IR subtraction in an RGB-IR image sensor using FPGA |
CN116448250A (zh) * | 2023-06-14 | 2023-07-18 | State Grid Shanxi Electric Power Company Extra-High Voltage Substation Branch | Infrared thermal imaging auxiliary positioning device and method for power equipment |
Family Cites Families (151)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US3938102A (en) | 1974-08-19 | 1976-02-10 | International Business Machines Corporation | Method and apparatus for accessing horizontal sequences and rectangular sub-arrays from an array stored in a modified word organized random access memory system |
EP0085210A1 (en) | 1982-01-29 | 1983-08-10 | International Business Machines Corporation | Image processing system |
US5351152A (en) | 1991-07-23 | 1994-09-27 | The Board Of Governers Of Wayne State University | Direct-view stereoscopic confocal microscope |
US5471326A (en) | 1993-04-30 | 1995-11-28 | Northrop Grumman Corporation | Holographic laser scanner and rangefinder |
US5586200A (en) | 1994-01-07 | 1996-12-17 | Panasonic Technologies, Inc. | Segmentation based image compression system |
US5739906A (en) | 1996-06-07 | 1998-04-14 | The United States Of America As Represented By The Secretary Of Commerce | Interferometric thickness variation test method for windows and silicon wafers using a diverging wavefront |
US6105139A (en) | 1998-06-03 | 2000-08-15 | Nec Usa, Inc. | Controller-based power management for low-power sequential circuits |
TW495749B (en) | 1998-08-03 | 2002-07-21 | Matsushita Electric Ind Co Ltd | Optical head |
JP3450792B2 (ja) | 1999-03-25 | 2003-09-29 | Canon Kabushiki Kaisha | Depth image measuring apparatus and method, and mixed reality presentation system |
GB0008303D0 (en) | 2000-04-06 | 2000-05-24 | British Aerospace | Measurement system and method |
US6826299B2 (en) | 2000-07-31 | 2004-11-30 | Geodetic Services, Inc. | Photogrammetric image correlation and measurement system and method |
US6850872B1 (en) | 2000-08-30 | 2005-02-01 | Microsoft Corporation | Facial image processing methods and systems |
US7554737B2 (en) | 2000-12-20 | 2009-06-30 | Riake Corporation | Illumination device and method using adaptable source and output format |
US6895115B2 (en) | 2001-04-23 | 2005-05-17 | The United States Of America As Represented By The United States National Aeronautics And Space Administration | Method for implementation of recursive hierarchical segmentation on parallel computers |
CN1298175C (zh) | 2001-07-06 | 2007-01-31 | Explay Ltd. | Image projection apparatus and method |
JP4635392B2 (ja) | 2001-08-09 | 2011-02-23 | Konica Minolta Holdings, Inc. | Surface shape modeling apparatus for a three-dimensional object, and program |
US6940538B2 (en) | 2001-08-29 | 2005-09-06 | Sony Corporation | Extracting a depth map from known camera and model tracking data |
RU2237284C2 (ru) | 2001-11-27 | 2004-09-27 | Samsung Electronics Co., Ltd. | Method for generating a structure of nodes for representing three-dimensional objects using depth images |
US7762964B2 (en) | 2001-12-10 | 2010-07-27 | Candela Corporation | Method and apparatus for improving safety during exposure to a monochromatic light source |
JP4075418B2 (ja) | 2002-03-15 | 2008-04-16 | Sony Corporation | Image processing apparatus and image processing method, printed matter manufacturing apparatus and printed matter manufacturing method, and printed matter manufacturing system |
US6771271B2 (en) | 2002-06-13 | 2004-08-03 | Analog Devices, Inc. | Apparatus and method of processing image data |
US7399220B2 (en) | 2002-08-02 | 2008-07-15 | Kriesel Marshall S | Apparatus and methods for the volumetric and dimensional measurement of livestock |
CN1176351C (zh) | 2002-10-09 | 2004-11-17 | Tianjin University | Method and apparatus for dynamic multi-resolution three-dimensional digital imaging |
CN1186671C (zh) | 2002-10-09 | 2005-01-26 | Tianjin University | Method and apparatus for generating projected structured light |
JP2004135209A (ja) | 2002-10-15 | 2004-04-30 | Hitachi Ltd | Apparatus and method for generating wide-field, high-resolution video |
GB2395261A (en) | 2002-11-11 | 2004-05-19 | Qinetiq Ltd | Ranging apparatus |
US7103212B2 (en) | 2002-11-22 | 2006-09-05 | Strider Labs, Inc. | Acquisition of three-dimensional images by an active stereo technique using locally unique patterns |
US7154157B2 (en) | 2002-12-30 | 2006-12-26 | Intel Corporation | Stacked semiconductor radiation sensors having color component and infrared sensing capability |
JP3938120B2 (ja) | 2003-09-17 | 2007-06-27 | Noritsu Koki Co., Ltd. | Image processing apparatus, method, and program |
FR2870621B1 (fr) | 2004-05-21 | 2006-10-27 | Inst Francais Du Petrole | Method for generating a three-dimensional conforming hybrid mesh of a heterogeneous formation crossed by one or more geometric discontinuities for the purpose of performing simulations |
JP4011039B2 (ja) | 2004-05-31 | 2007-11-21 | Mitsubishi Electric Corporation | Imaging apparatus and signal processing method |
DE102004029552A1 (de) | 2004-06-18 | 2006-01-05 | Peter Mäckel | Method for visualizing and measuring deformations of vibrating objects by combining synchronized stroboscopic image recording with image correlation techniques |
JP2008510213A (ja) | 2004-08-11 | 2008-04-03 | Koninklijke Philips Electronics N.V. | Stripe-based image data storage method |
WO2006025271A1 (ja) | 2004-09-03 | 2006-03-09 | Konica Minolta Opto, Inc. | Coupling lens and optical pickup device |
JP4883517B2 (ja) | 2004-11-19 | 2012-02-22 | Fukuoka Institute of Technology | Three-dimensional measuring apparatus, three-dimensional measuring method, and three-dimensional measuring program |
US7719533B2 (en) | 2004-11-24 | 2010-05-18 | General Electric Company | Graph extraction labelling and visualization |
US7367682B2 (en) | 2004-12-07 | 2008-05-06 | Symbol Technologies, Inc. | Color image projection arrangement and method |
JP2008537190A (ja) | 2005-01-07 | 2008-09-11 | GestureTek, Inc. | Generating a three-dimensional image of an object by projecting an infrared pattern |
JP4506501B2 (ja) | 2005-02-21 | 2010-07-21 | Hitachi, Ltd. | Image synthesis apparatus and imaging system |
US7512262B2 (en) | 2005-02-25 | 2009-03-31 | Microsoft Corporation | Stereo-based image processing |
US7295771B2 (en) | 2005-04-25 | 2007-11-13 | Delphi Technologies, Inc. | Method and apparatus for minimizing ambient illumination effects in a vision system |
JP4577126B2 (ja) | 2005-07-08 | 2010-11-10 | Omron Corporation | Apparatus and method for generating a projection pattern for stereo correspondence |
EP1934945A4 (en) | 2005-10-11 | 2016-01-20 | Apple Inc | METHOD AND SYSTEM FOR OBJECT RECONSTRUCTION |
WO2007105205A2 (en) | 2006-03-14 | 2007-09-20 | Prime Sense Ltd. | Three-dimensional sensing using speckle patterns |
US20070145273A1 (en) | 2005-12-22 | 2007-06-28 | Chang Edward T | High-sensitivity infrared color camera |
US7821552B2 (en) | 2005-12-27 | 2010-10-26 | Sanyo Electric Co., Ltd. | Imaging apparatus provided with imaging device having sensitivity in visible and infrared regions |
JP4466569B2 (ja) | 2006-01-10 | 2010-05-26 | Toyota Central R&D Labs., Inc. | Color image reproduction apparatus |
DE102006007170B4 (de) | 2006-02-08 | 2009-06-10 | Sirona Dental Systems Gmbh | Method and arrangement for fast and robust chromatic confocal 3D measurement |
EP1994503B1 (en) | 2006-03-14 | 2017-07-05 | Apple Inc. | Depth-varying light fields for three dimensional sensing |
GB0718706D0 (en) | 2007-09-25 | 2007-11-07 | Creative Physics Ltd | Method and apparatus for reducing laser speckle |
WO2007132399A1 (en) | 2006-05-09 | 2007-11-22 | Koninklijke Philips Electronics N.V. | Programmable data processing circuit |
WO2008036092A1 (en) | 2006-09-21 | 2008-03-27 | Thomson Licensing | A method and system for three-dimensional model acquisition |
WO2008133650A2 (en) | 2006-11-07 | 2008-11-06 | Rudolph Technologies, Inc. | Method and system for providing a high definition triangulation system |
EP2618102A2 (en) | 2006-11-21 | 2013-07-24 | Mantisvision Ltd. | 3d geometric modeling and 3d video content creation |
US8090194B2 (en) | 2006-11-21 | 2012-01-03 | Mantis Vision Ltd. | 3D geometric modeling and motion capture using both single and dual imaging |
CN101616785B (zh) | 2007-01-10 | 2014-01-08 | 3D Systems, Inc. | Three-dimensional printing material system with improved color, article properties, and ease of use |
US7683962B2 (en) | 2007-03-09 | 2010-03-23 | Eastman Kodak Company | Camera using multiple lenses and image sensors in a rangefinder configuration to provide a range map |
FR2914422B1 (fr) | 2007-03-28 | 2009-07-03 | Soitec Silicon On Insulator | Method for detecting surface defects of a substrate and device implementing said method |
AU2008244495A1 (en) | 2007-04-23 | 2008-11-06 | California Institute Of Technology | An aperture system with spatially-biased aperture shapes for 3-D defocusing-based imaging |
JP2008288629A (ja) | 2007-05-15 | 2008-11-27 | Sony Corp | Image signal processing apparatus, imaging element, image signal processing method, and computer program |
WO2008142846A1 (ja) * | 2007-05-18 | 2008-11-27 | Panasonic Corporation | Stereoscopic image display device |
JP5018282B2 (ja) | 2007-07-04 | 2012-09-05 | Mazda Motor Corporation | Method for creating three-dimensional shape model data of a product |
US20120002045A1 (en) | 2007-08-08 | 2012-01-05 | Mayer Tony | Non-retro-reflective license plate imaging system |
US7933056B2 (en) | 2007-09-26 | 2011-04-26 | Che-Chih Tsao | Methods and systems of rapid focusing and zooming for volumetric 3D displays and cameras |
WO2009045439A1 (en) | 2007-10-02 | 2009-04-09 | Doubleshot, Inc . | Laser beam pattern projector |
US8446470B2 (en) * | 2007-10-04 | 2013-05-21 | Magna Electronics, Inc. | Combined RGB and IR imaging sensor |
IL191615A (en) | 2007-10-23 | 2015-05-31 | Israel Aerospace Ind Ltd | A method and system for producing tie points for use in stereo adjustment of stereoscopic images and a method for identifying differences in the landscape taken between two time points |
US8384997B2 (en) | 2008-01-21 | 2013-02-26 | Primesense Ltd | Optical pattern projection |
US8788990B2 (en) | 2008-02-21 | 2014-07-22 | Oracle America, Inc. | Reuse of circuit labels in subcircuit recognition |
US7861193B2 (en) | 2008-02-21 | 2010-12-28 | Oracle America, Inc. | Reuse of circuit labels for verification of circuit recognition |
US7958468B2 (en) | 2008-02-21 | 2011-06-07 | Oracle America, Inc. | Unidirectional relabeling for subcircuit recognition |
US8368753B2 (en) | 2008-03-17 | 2013-02-05 | Sony Computer Entertainment America Llc | Controller with an integrated depth camera |
RU2510235C2 (ru) | 2008-03-18 | 2014-03-27 | Novadaq Technologies Inc. | Imaging system for obtaining a combined image from a full-color reflected-light image and a near-infrared image |
US8405727B2 (en) * | 2008-05-01 | 2013-03-26 | Apple Inc. | Apparatus and method for calibrating image capture devices |
NZ567986A (en) | 2008-05-02 | 2010-08-27 | Auckland Uniservices Ltd | Real-time stereo image matching system |
US8866920B2 (en) | 2008-05-20 | 2014-10-21 | Pelican Imaging Corporation | Capturing and processing of images using monolithic camera array with heterogeneous imagers |
JP5317169B2 (ja) | 2008-06-13 | 2013-10-16 | Hiroshi Kawasaki | Image processing apparatus, image processing method, and program |
JP4513905B2 (ja) | 2008-06-27 | 2010-07-28 | Sony Corporation | Signal processing apparatus, signal processing method, program, and recording medium |
KR101530930B1 (ko) | 2008-08-19 | 2015-06-24 | Samsung Electronics Co., Ltd. | Pattern projection apparatus, three-dimensional image forming apparatus including the same, and variable-focus liquid lens used therein |
EP2166304A1 (de) * | 2008-09-23 | 2010-03-24 | Sick Ag | Illumination unit and method for generating a self-dissimilar pattern |
US8442940B1 (en) | 2008-11-18 | 2013-05-14 | Semantic Research, Inc. | Systems and methods for pairing of a semantic network and a natural language processing information extraction system |
JP5430138B2 (ja) | 2008-12-17 | 2014-02-26 | Topcon Corporation | Shape measuring apparatus and program |
CN101509764A (zh) | 2009-02-27 | 2009-08-19 | Southeast University | Method for rapidly acquiring the three-dimensional shape of an object |
DE102009001889A1 (de) | 2009-03-26 | 2010-09-30 | Robert Bosch Gmbh | Laser marking with a coordinate system |
US8823775B2 (en) | 2009-04-30 | 2014-09-02 | Board Of Regents, The University Of Texas System | Body surface imaging |
WO2011013079A1 (en) | 2009-07-30 | 2011-02-03 | Primesense Ltd. | Depth mapping based on pattern matching and stereoscopic information |
US8204904B2 (en) | 2009-09-30 | 2012-06-19 | Yahoo! Inc. | Network graph evolution rule generation |
KR101173668B1 (ko) | 2009-10-27 | 2012-08-20 | Seoul National University R&DB Foundation | Method and apparatus for measuring the depth of a three-dimensional object using multiple spatial frequencies |
US9047674B2 (en) | 2009-11-03 | 2015-06-02 | Samsung Electronics Co., Ltd. | Structured grids and graph traversal for image processing |
KR101377325B1 (ko) | 2009-12-21 | 2014-03-25 | Electronics and Telecommunications Research Institute | Camera apparatus for acquiring stereo images, multi-view images, and depth images, and control method thereof |
US20130278631A1 (en) | 2010-02-28 | 2013-10-24 | Osterhout Group, Inc. | 3d positioning of augmented reality information |
US20110222757A1 (en) | 2010-03-10 | 2011-09-15 | Gbo 3D Technology Pte. Ltd. | Systems and methods for 2D image and spatial data capture for 3D stereo imaging |
JP2011191221A (ja) | 2010-03-16 | 2011-09-29 | Sanyo Electric Co Ltd | Object detection device and information acquisition device |
US8619143B2 (en) | 2010-03-19 | 2013-12-31 | Pixim, Inc. | Image sensor including color and infrared pixels |
WO2011152634A2 (ko) | 2010-05-29 | 2011-12-08 | Lee Moon Key | Monitor-based augmented reality system |
US8670029B2 (en) | 2010-06-16 | 2014-03-11 | Microsoft Corporation | Depth camera illuminator with superluminescent light-emitting diode |
EP2400261A1 (de) | 2010-06-21 | 2011-12-28 | Leica Geosystems AG | Optical measuring method and measuring system for determining 3D coordinates on a measurement object surface |
GB2481459B (en) | 2010-06-25 | 2017-05-03 | Fraunhofer-Gesellschaft Zur Forderung Der Angewandten Forschung E V | Capturing a surface structure of an object surface |
US8357899B2 (en) | 2010-07-30 | 2013-01-22 | Aptina Imaging Corporation | Color correction circuitry and methods for dual-band imaging systems |
DE102010039246A1 (de) | 2010-08-12 | 2012-02-16 | Robert Bosch Gmbh | Method for calibrating a measuring system and device for carrying out the method |
US8903119B2 (en) | 2010-10-11 | 2014-12-02 | Texas Instruments Incorporated | Use of three-dimensional top-down views for business analytics |
JP5787508B2 (ja) | 2010-11-11 | 2015-09-30 | Canon Kabushiki Kaisha | Diffractive optical element and optical system |
US20120154397A1 (en) | 2010-12-03 | 2012-06-21 | Old Dominion University Research Foundation | Method and system for generating mesh from images |
KR101694292B1 (ko) | 2010-12-17 | 2017-01-09 | Electronics and Telecommunications Research Institute | Stereo image matching apparatus and method |
CN102867328B (zh) | 2011-01-27 | 2014-04-23 | Shenzhen Taishan Online Technology Co., Ltd. | System for reconstructing an object surface |
US9247238B2 (en) | 2011-01-31 | 2016-01-26 | Microsoft Technology Licensing, Llc | Reducing interference between multiple infra-red depth cameras |
DE102011004663B4 (de) * | 2011-02-24 | 2018-11-22 | Robert Bosch Gmbh | Device for vehicle measurement |
KR101289595B1 (ko) | 2011-02-28 | 2013-07-24 | Lee Kyung-Ja | Grid pattern projection apparatus |
KR101792501B1 (ko) | 2011-03-16 | 2017-11-21 | Electronics and Telecommunications Research Institute | Feature-based stereo matching method and apparatus |
KR101801355B1 (ko) | 2011-03-25 | 2017-11-24 | LG Electronics Inc. | Apparatus for recognizing the distance of an object using a diffractive element and a light source |
US8718748B2 (en) | 2011-03-29 | 2014-05-06 | Kaliber Imaging Inc. | System and methods for monitoring and assessing mobility |
WO2012137434A1 (ja) * | 2011-04-07 | 2012-10-11 | Panasonic Corporation | Stereoscopic imaging device |
CN102760234B (zh) | 2011-04-14 | 2014-08-20 | Industrial Technology Research Institute | Depth image acquisition device, system, and method |
US8760499B2 (en) | 2011-04-29 | 2014-06-24 | Austin Russell | Three-dimensional imager and projection device |
US20120281087A1 (en) | 2011-05-02 | 2012-11-08 | Faro Technologies, Inc. | Three-dimensional scanner for hand-held phones |
US9536312B2 (en) | 2011-05-16 | 2017-01-03 | Microsoft Corporation | Depth reconstruction using plural depth capture units |
CN102831380A (zh) | 2011-06-15 | 2012-12-19 | Konka Group Co., Ltd. | Method and system for limb motion recognition based on depth image sensing |
US9530192B2 (en) | 2011-06-30 | 2016-12-27 | Kodak Alaris Inc. | Method for determining stereo quality score and automatically improving the quality of stereo images |
EP2748675B1 (en) | 2011-07-29 | 2018-05-23 | Hewlett-Packard Development Company, L.P. | Projection capture system, programming and method |
DE102011052802B4 (de) * | 2011-08-18 | 2014-03-13 | Sick Ag | 3D camera and method for monitoring a spatial region |
US8867825B2 (en) | 2011-08-30 | 2014-10-21 | Thomson Licensing | Method and apparatus for determining a similarity or dissimilarity measure |
WO2013033787A1 (en) | 2011-09-07 | 2013-03-14 | Commonwealth Scientific And Industrial Research Organisation | System and method for three-dimensional surface imaging |
US9285871B2 (en) | 2011-09-30 | 2016-03-15 | Microsoft Technology Licensing, Llc | Personal audio/visual system for providing an adaptable augmented reality environment |
US20130095920A1 (en) * | 2011-10-13 | 2013-04-18 | Microsoft Corporation | Generating free viewpoint video using stereo imaging |
US9248623B2 (en) | 2011-10-14 | 2016-02-02 | Makerbot Industries, Llc | Grayscale rendering in 3D printing |
US9098908B2 (en) | 2011-10-21 | 2015-08-04 | Microsoft Technology Licensing, Llc | Generating a depth map |
US20140098342A1 (en) | 2011-11-04 | 2014-04-10 | The General Hospital Corporation | System and method for corneal irradiation |
JP5910043B2 (ja) * | 2011-12-02 | 2016-04-27 | Fujitsu Limited | Imaging apparatus, image processing program, image processing method, and image processing apparatus |
JP5898484B2 (ja) | 2011-12-19 | 2016-04-06 | Canon Kabushiki Kaisha | Information processing apparatus, control method for an information processing apparatus, and program |
CN102572485B (zh) | 2012-02-02 | 2015-04-22 | Peking University | Adaptive weighted stereo matching algorithm, stereoscopic display acquisition device, and system |
JP5994715B2 (ja) | 2012-04-10 | 2016-09-21 | Panasonic Intellectual Property Management Co., Ltd. | Computer-generated hologram display device |
KR20130120730A (ko) | 2012-04-26 | 2013-11-05 | Electronics and Telecommunications Research Institute | Method for processing disparity space images |
US9514522B2 (en) | 2012-08-24 | 2016-12-06 | Microsoft Technology Licensing, Llc | Depth data processing and compression |
US10674135B2 (en) | 2012-10-17 | 2020-06-02 | DotProduct LLC | Handheld portable optical scanner and method of using |
US9332243B2 (en) | 2012-10-17 | 2016-05-03 | DotProduct LLC | Handheld portable optical scanner and method of using |
US9117267B2 (en) | 2012-10-18 | 2015-08-25 | Google Inc. | Systems and methods for marking images for three-dimensional image generation |
US20140120319A1 (en) | 2012-11-01 | 2014-05-01 | Benjamin E. Joseph | 3d mapping using structured light and formation of custom surface contours |
US10049281B2 (en) | 2012-11-12 | 2018-08-14 | Shopperception, Inc. | Methods and systems for measuring human interaction |
KR20140075163A (ko) | 2012-12-11 | 2014-06-19 | Electronics and Telecommunications Research Institute | Pattern projecting method and apparatus using a structured-light scheme |
US9298945B2 (en) | 2012-12-26 | 2016-03-29 | Elwha Llc | Ad-hoc wireless sensor package |
US9292927B2 (en) | 2012-12-27 | 2016-03-22 | Intel Corporation | Adaptive support windows for stereoscopic image correlation |
US9251590B2 (en) | 2013-01-24 | 2016-02-02 | Microsoft Technology Licensing, Llc | Camera pose estimation for 3D reconstruction |
US20140241612A1 (en) | 2013-02-23 | 2014-08-28 | Microsoft Corporation | Real time stereo matching |
US20140307055A1 (en) | 2013-04-15 | 2014-10-16 | Microsoft Corporation | Intensity-modulated light pattern for active stereo |
US9191643B2 (en) | 2013-04-15 | 2015-11-17 | Microsoft Technology Licensing, Llc | Mixing infrared and color component data point clouds |
US20140320605A1 (en) | 2013-04-25 | 2014-10-30 | Philip Martin Johnson | Compound structured light projection system for 3-D surface profiling |
CN103308517B (zh) | 2013-05-21 | 2015-09-30 | Xie Shaopeng | Method for objectifying the color of traditional Chinese medicine and apparatus for acquiring traditional Chinese medicine images |
US10311746B2 (en) | 2016-06-14 | 2019-06-04 | Orcam Technologies Ltd. | Wearable apparatus and method for monitoring posture |
WO2020210004A1 (en) | 2019-04-12 | 2020-10-15 | University Of Iowa Research Foundation | System and method to predict, prevent, and mitigate workplace injuries |
CN113345069B (zh) | 2020-03-02 | 2025-02-18 | BOE Technology Group Co., Ltd. | Modeling method, apparatus, and system for a three-dimensional human body model, and storage medium |
CN115211683A (zh) | 2022-06-10 | 2022-10-21 | Chongqing University of Education | Sitting posture correction method, system, device, and medium based on a smart seat |
2013
- 2013-06-11 US US13/915,626 patent/US20140307055A1/en not_active Abandoned
- 2013-06-11 US US13/915,622 patent/US10268885B2/en active Active
- 2013-06-14 US US13/918,892 patent/US9760770B2/en active Active
- 2013-06-20 US US13/923,135 patent/US9959465B2/en active Active
- 2013-06-21 US US13/924,464 patent/US10929658B2/en active Active
- 2013-06-21 US US13/924,485 patent/US9922249B2/en active Active
- 2013-06-21 US US13/924,475 patent/US9697424B2/en not_active Expired - Fee Related
- 2013-06-24 US US13/925,762 patent/US9928420B2/en active Active
- 2013-11-24 US US14/088,408 patent/US20140309764A1/en not_active Abandoned
2014
- 2014-04-14 CN CN201480021199.4A patent/CN105229412B/zh active Active
- 2014-04-14 EP EP14725861.0A patent/EP2986931A1/en not_active Ceased
- 2014-04-14 EP EP14724942.9A patent/EP2987138B1/en active Active
- 2014-04-14 WO PCT/US2014/033996 patent/WO2014172276A1/en active Application Filing
- 2014-04-14 WO PCT/US2014/033915 patent/WO2014172227A1/en active Application Filing
- 2014-04-14 KR KR1020157032651A patent/KR102130187B1/ko active Active
- 2014-04-14 WO PCT/US2014/033919 patent/WO2014172231A1/en active Application Filing
- 2014-04-14 EP EP20184935.3A patent/EP3757510B1/en active Active
- 2014-04-14 EP EP14723271.4A patent/EP2987131A1/en not_active Ceased
- 2014-04-14 CN CN201480021528.5A patent/CN105210112B/zh active Active
- 2014-04-14 EP EP14724600.3A patent/EP2987132B1/en active Active
- 2014-04-14 EP EP14725312.4A patent/EP2986935B1/en active Active
- 2014-04-14 WO PCT/US2014/033909 patent/WO2014172221A1/en active Application Filing
- 2014-04-14 WO PCT/US2014/033916 patent/WO2014172228A1/en active Application Filing
- 2014-04-14 CN CN201480021493.5A patent/CN105229696A/zh active Pending
- 2014-04-14 EP EP14726261.2A patent/EP2987320B1/en active Active
- 2014-04-14 MX MX2015014577A patent/MX357307B/es active IP Right Grant
- 2014-04-14 CN CN201480021422.5A patent/CN105308650B/zh active Active
- 2014-04-14 CA CA2907895A patent/CA2907895C/en active Active
- 2014-04-14 BR BR112015025819A patent/BR112015025819A8/pt not_active Application Discontinuation
- 2014-04-14 WO PCT/US2014/033911 patent/WO2014172223A1/en active Application Filing
- 2014-04-14 WO PCT/US2014/033910 patent/WO2014172222A1/en active Application Filing
- 2014-04-14 CN CN201480021460.0A patent/CN105229411B/zh active Active
- 2014-04-14 JP JP2016508993A patent/JP6469080B2/ja active Active
- 2014-04-14 AU AU2014254219A patent/AU2014254219B2/en active Active
- 2014-04-14 KR KR1020157032633A patent/KR102207768B1/ko active Active
- 2014-04-14 CN CN201480021487.XA patent/CN105143817B/zh active Active
- 2014-04-14 WO PCT/US2014/033917 patent/WO2014172229A1/en active Application Filing
- 2014-04-14 EP EP14727992.1A patent/EP2986936B1/en active Active
- 2014-04-14 EP EP14724934.6A patent/EP2987323B1/en active Active
- 2014-04-14 CN CN201480021519.6A patent/CN105247859B/zh active Active
- 2014-04-14 RU RU2015143654A patent/RU2663329C2/ru not_active IP Right Cessation
- 2014-04-14 CN CN201480021958.7A patent/CN105230003B/zh active Active
- 2014-04-15 US US14/253,696 patent/US9508003B2/en active Active
2018
- 2018-02-05 US US15/889,188 patent/US10816331B2/en active Active
- 2018-03-05 US US15/912,555 patent/US10928189B2/en active Active
- 2018-03-27 US US15/937,851 patent/US20180218210A1/en not_active Abandoned
2023
- 2023-03-20 US US18/186,933 patent/US12305974B2/en active Active
Patent Citations (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6751344B1 (en) * | 1999-05-28 | 2004-06-15 | Champion Orthotic Investments, Inc. | Enhanced projector system for machine vision |
US7315383B1 (en) * | 2004-07-09 | 2008-01-01 | Mohsen Abdollahi | Scanning 3D measurement technique using structured lighting and high-speed CMOS imager |
US20070263903A1 (en) * | 2006-03-23 | 2007-11-15 | Tyzx, Inc. | Enhancing stereo depth measurements with projected texture |
US8077034B2 (en) * | 2006-09-28 | 2011-12-13 | Bea Sa | Sensor for presence detection |
US20080205748A1 (en) * | 2007-02-28 | 2008-08-28 | Sungkyunkwan University | Structural light based depth imaging method and system using signal separation coding, and error correction thereof |
US20120038986A1 (en) * | 2010-08-11 | 2012-02-16 | Primesense Ltd. | Pattern projector |
US20120056982A1 (en) * | 2010-09-08 | 2012-03-08 | Microsoft Corporation | Depth camera based on structured light and stereo vision |
US20120307075A1 (en) * | 2011-06-01 | 2012-12-06 | Empire Technology Development, Llc | Structured light projection for motion detection in augmented reality |
US20130229396A1 (en) * | 2012-03-05 | 2013-09-05 | Kenneth J. Huebner | Surface aware, object aware, and image aware handheld projector |
US20150316368A1 (en) * | 2012-11-29 | 2015-11-05 | Koninklijke Philips N.V. | Laser device for projecting a structured light pattern onto a scene |
US20140168380A1 (en) * | 2012-12-14 | 2014-06-19 | Faro Technologies, Inc. | Device for optically scanning and measuring an environment |
US20140293011A1 (en) * | 2013-03-28 | 2014-10-02 | Phasica, LLC | Scanner System for Determining the Three Dimensional Shape of an Object and Method for Using |
Cited By (23)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9438813B2 (en) * | 2012-03-13 | 2016-09-06 | Dolby Laboratories Licensing Corporation | Lighting system and method for image and object enhancement |
US20150036023A1 (en) * | 2012-03-13 | 2015-02-05 | Dolby Laboratories Licensing Corporation | Lighting system and method for image and object enhancement |
US10816331B2 (en) | 2013-04-15 | 2020-10-27 | Microsoft Technology Licensing, Llc | Super-resolving depth map by moving pattern projector |
US10268885B2 (en) | 2013-04-15 | 2019-04-23 | Microsoft Technology Licensing, Llc | Extracting true color from a color and infrared sensor |
US10928189B2 (en) | 2013-04-15 | 2021-02-23 | Microsoft Technology Licensing, Llc | Intensity-modulated light pattern for active stereo |
US10929658B2 (en) | 2013-04-15 | 2021-02-23 | Microsoft Technology Licensing, Llc | Active stereo with adaptive support weights from a separate image |
US20150229915A1 (en) * | 2014-02-08 | 2015-08-13 | Microsoft Corporation | Environment-dependent active illumination for stereo matching |
US11265534B2 (en) * | 2014-02-08 | 2022-03-01 | Microsoft Technology Licensing, Llc | Environment-dependent active illumination for stereo matching |
US9699394B2 (en) | 2015-03-09 | 2017-07-04 | Microsoft Technology Licensing, Llc | Filter arrangement for image sensor |
US20160371845A1 (en) * | 2015-06-18 | 2016-12-22 | Apple Inc. | Monitoring doe performance using software scene evaluation |
US11054664B2 (en) * | 2015-06-18 | 2021-07-06 | Apple Inc. | Monitoring DOE performance using software scene evaluation |
DE102017215850A1 (de) * | 2017-09-08 | 2019-03-14 | Robert Bosch Gmbh | Diffractive optical element and method for its manufacture |
DE102017215850B4 (de) * | 2017-09-08 | 2019-12-24 | Robert Bosch Gmbh | Method for manufacturing a diffractive optical element, LIDAR system having a diffractive optical element, and motor vehicle having a LIDAR system |
US11143801B2 (en) | 2017-09-08 | 2021-10-12 | Robert Bosch Gmbh | Diffractive optical element and method for the manufacture thereof |
US10643341B2 (en) | 2018-03-22 | 2020-05-05 | Microsoft Technology Licensing, Llc | Replicated dot maps for simplified depth computation using machine learning |
WO2019182840A1 (en) * | 2018-03-22 | 2019-09-26 | Microsoft Technology Licensing, Llc | Replicated dot maps for simplified depth computation using machine learning |
US10565720B2 (en) | 2018-03-27 | 2020-02-18 | Microsoft Technology Licensing, Llc | External IR illuminator enabling improved head tracking and surface reconstruction for virtual reality |
US10890839B1 (en) * | 2019-11-06 | 2021-01-12 | Himax Technologies Limited | Structured light imaging device |
US20240005538A1 (en) * | 2019-11-27 | 2024-01-04 | Trinamix Gmbh | Depth measurement through display |
US12406384B2 (en) * | 2019-11-27 | 2025-09-02 | Trinamix Gmbh | Depth measurement through display |
US20230334701A1 (en) * | 2021-06-28 | 2023-10-19 | Motional Ad Llc | Systems and methods for camera alignment using pre-distorted targets |
US12051224B2 (en) * | 2021-06-28 | 2024-07-30 | Motional Ad Llc | Systems and methods for camera alignment using pre-distorted targets |
US12288362B2 (en) | 2022-01-21 | 2025-04-29 | Motional Ad Llc | Active alignment of an optical assembly with intrinsic calibration |
Also Published As
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10928189B2 (en) | Intensity-modulated light pattern for active stereo | |
US9491441B2 (en) | Method to extend laser depth map range | |
EP3018903B1 (en) | Method and system for projector calibration | |
CN106062862A (zh) | System and method for immersive and interactive multimedia generation |
CN107469355A (zh) | Game character image creation method and apparatus, and terminal device |
CN107493411A (zh) | Image processing system and method |
KR20200063937A (ko) | Position detection system using an infrared stereo camera |
KR101451792B1 (ko) | Image rendering apparatus and method thereof |
CN107610240A (zh) | Avatar replacement method and apparatus, and mobile terminal |
Luo | Study on three dimensions body reconstruction and measurement by using kinect | |
JP2011215919A (ja) | Program, information storage medium, and image generation system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: MICROSOFT CORPORATION, WASHINGTON
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:KANG, SING BING;GEORGIOU, ANDREAS;SZELISKI, RICHARD S.;SIGNING DATES FROM 20130608 TO 20130611;REEL/FRAME:030592/0181
|
AS | Assignment |
Owner name: MICROSOFT TECHNOLOGY LICENSING, LLC, WASHINGTON
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:MICROSOFT CORPORATION;REEL/FRAME:034747/0417
Effective date: 20141014
Owner name: MICROSOFT TECHNOLOGY LICENSING, LLC, WASHINGTON
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:MICROSOFT CORPORATION;REEL/FRAME:039025/0454
Effective date: 20141014
|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO PAY ISSUE FEE |