WO2014011182A1 - Convergence/divergence based depth determination techniques and uses with defocusing imaging - Google Patents

Convergence/divergence based depth determination techniques and uses with defocusing imaging Download PDF

Info

Publication number
WO2014011182A1
WO2014011182A1 (PCT/US2012/046557)
Authority
WO
WIPO (PCT)
Prior art keywords
light
defocusing
depth
camera
pattern
Prior art date
Application number
PCT/US2012/046557
Other languages
English (en)
Inventor
Morteza Gharib
Original Assignee
California Institute of Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by California Institute of Technology filed Critical California Institute of Technology
Priority to PCT/US2012/046557 priority Critical patent/WO2014011182A1/fr
Publication of WO2014011182A1 publication Critical patent/WO2014011182A1/fr

Links

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • G01B11/02Measuring arrangements characterised by the use of optical techniques for measuring length, width or thickness
    • G01B11/026Measuring arrangements characterised by the use of optical techniques for measuring length, width or thickness by measuring distance between sensor and object
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/50Depth or shape recovery
    • G06T7/521Depth or shape recovery from laser ranging, e.g. using interferometry; from the projection of structured light
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/50Depth or shape recovery
    • G06T7/55Depth or shape recovery from multiple images
    • G06T7/593Depth or shape recovery from multiple images from stereo images

Definitions

  • Active stereo imaging is a 3D imaging method that operates by projecting a known light pattern onto the surface to be imaged.
  • the system is calibrated in advance; the characteristics of the pattern projection for the system are obtained using a known calibration sample.
  • the light pattern is projected onto the surface from an angle relative to the camera such that the light appears to be at different locations based on the distance to the object or to a point on the object. Based on the calibration set, the position of the surface, and hence the contour of the object, can be determined.
  • use of non-uniform patterns limits the resolution the system can achieve due to unused space in the pattern.
  • a type of "defocusing" system described in US Publication No. 2009/0295908 employs a projected pattern of laser dots or "markers” for the purpose of generating high accuracy local 3D object data from the reflected light.
  • z-axis or depth image information is derived from the spacing observed between matched sets of points, each point in a given matched set (e.g., as arranged in a triangle) derived from a different aperture from a multi-aperture mask.
  • the 3D data determined at each of the different times or positions can be combined to stitch together multiple different scenes to complete a model for an object larger than each individual imaged area.
  • the '908 application describes one manner of determining the 3D data; another is presented in PCT/US10/57532.
  • US Publication No. 2011/0074932 also projects a pattern of laser dots onto a surface to be imaged.
  • the pattern is projected at an angular offset from the camera. Data recorded by the camera is used in two different modes.
  • Information obtained through another aperture is used for resolving depth (z) information from the deformation of the planar (x, y) coordinates of the pattern.
  • the '932 reference uses the depth information from defocusing to identify the correspondence of dots between the deformed and original pattern for stereo imaging depth determination. This approach reportedly offers greatly increased working depth of the system and allows the active stereo imaging to be used in more applications including at lower angles and over greater working depths. Stated otherwise, the '932 system (via its processor) uses defocused information from the projected dots to determine a correspondence of the optical dots recorded on a surface to the optical dots of the projected pattern to determine an approximate z-axis position for use in performing active stereo imaging.
  • the process of "defocus-based" imaging is one in which large data structures are created by light capture (with a CMOS, CCD or other imaging apparatus) through restricted areas positioned at different radial locations along a common optical axis.
  • Corresponding x and y value points (or features) from a given imaged scene are matched and z-values calculated from the separation of the points (or features) such as by equations of the type described in US Patent Nos. 6,278,847 and 7,006,132.
  • defocus-based imaging employs a technique called aperture-coded imaging. Suitable hardware for such purposes is described in PCT/US10/57532 as well as USPNs 6,278,847; 7,006,132; 7,612,869 and 7,612,870.
  • This technique uses off-axis apertures to measure the depth and location of a scattering site in separated (color, time-wise, etc.) channels. The shifts in the images caused by these off-axis apertures are monitored to determine the three-dimensional position of the site or sites.
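  • For orientation, the sketch below illustrates a relation of this kind under a simplified thin-lens, pinhole-aperture assumption (focal length f, lens focused at distance L, apertures separated by d); the constants and function names are illustrative and are not taken from the cited patents.

```python
# Minimal sketch of a two-aperture defocusing relation under a thin-lens,
# pinhole-aperture assumption (not the exact equations of the cited patents).
def image_shift(z, L=1.0, f=0.05, d=0.02):
    """Separation b (on the sensor) between the two aperture images of a point
    at depth z, for a lens of focal length f focused at distance L with
    apertures separated by d:  b = M * d * L * |1/L - 1/z|, with M = f/(L - f)."""
    M = f / (L - f)                        # magnification at the focused plane
    return M * d * L * abs(1.0 / L - 1.0 / z)

def depth_from_shift(b, L=1.0, f=0.05, d=0.02, nearer=False):
    """Invert the relation above; 'nearer' selects the branch in front of L."""
    M = f / (L - f)
    sign = 1.0 if nearer else -1.0
    return 1.0 / (1.0 / L + sign * b / (M * d * L))

if __name__ == "__main__":
    for z in (0.8, 1.0, 1.25):
        b = image_shift(z)
        print(f"z = {z:.2f} m -> shift = {b * 1e3:.3f} mm -> "
              f"recovered z = {depth_from_shift(b, nearer=z < 1.0):.2f} m")
```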
  • the '932 reference instead utilized defocusing for gross (i.e., less accurate) depth determination of recorded x, y location dots in order to facilitate arriving at image correspondence for active stereo determination of a more accurate position. Moreover, multiple images can be combined or "stitched" together to fully image a 3D object. Yet, there exists opportunity for other advantageous systems employing a projected pattern for 3-D imaging in connection with defocusing techniques.
  • a first embodiment combines a diverging light projection with a defocusing camera system.
  • depth can be estimated by measuring the distance between points on the projection pattern as imaged through a single aperture.
  • the distance between neighboring points on the light pattern will be linearly related to the distance from the camera. Stated otherwise, the apparent spacing between the imaged dots varies with depth, depending on the relative distance from their respective origins.
  • the approach according to this embodiment of the invention that first approximates image depth using divergence, and then refines with defocusing, offers advantages in terms of hardware and computational simplicity. Namely, knowing - basically - a priori at which depth various imaged points should be located for matching offset defocusing images provides an advantage. With such an advantage, signal separation (i.e., by color, time, polarization, pattern, etc.) with associated hardware need not be employed in some cases. Stated otherwise, image crowding problems (e.g., as commented upon in the '870 patent above) are reduced or eliminated depending on setup.
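  • The coarse-then-refine idea can be sketched as follows, assuming a hypothetical linear calibration that maps projected-dot spacing to a coarse depth, which is then used to limit the defocusing refinement; all constants are placeholders, not values from the disclosure.

```python
# Sketch of the coarse-then-refine idea: a divergence-based depth from the
# projected-dot spacing (assumed linear, per a hypothetical calibration),
# used to keep only the defocusing solutions consistent with that estimate.
def coarse_depth_from_spacing(spacing_px, gain=0.002, offset=0.1):
    """Hypothetical linear calibration: depth [m] = gain * spacing [px] + offset."""
    return gain * spacing_px + offset

def consistent_defocus_solutions(coarse_z, defocus_zs, tol=0.05):
    """Discard defocusing depth candidates far from the coarse estimate."""
    return [z for z in defocus_zs if abs(z - coarse_z) <= tol]

if __name__ == "__main__":
    z0 = coarse_depth_from_spacing(420)        # dot spacing of 420 px -> ~0.94 m
    print("coarse depth:", round(z0, 3))
    print("surviving candidates:", consistent_defocus_solutions(z0, [0.31, 0.92, 1.60]))
```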
  • Another embodiment employs converging structured light projections that focus on one plane.
  • one camera can be used to measure depth by comparing the distance between corresponding points.
  • the approach works as "reverse" defocusing (i.e., by using multiple light projections to produce doublets, triplets, etc., the positions of which are compared instead of using multiple apertures).
  • the projections can be configured as a grid of varying intensity and/or varying color to help identify corresponding points.
  • a red and blue projection pattern - where the respective grids register at one plane - can be used to infer depth by looking at the distance between red and blue points.
  • multiple projection systems are used that come from different origins, but focus on the same plane. The apparent position of the multiple dots relates to the distance from the focal plane.
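  • A minimal sketch of this "reverse defocusing" reading, assuming a hypothetical linear calibration relating the red-blue dot separation to distance from the registration plane (the constants and sign convention are illustrative only):

```python
# Sketch of "reverse defocusing" with two converging projections (e.g. red and
# blue grids) registering at a reference plane; the red-blue separation is
# taken to grow linearly with distance from that plane (hypothetical numbers).
def depth_from_color_separation(sep_px, z_ref=1.0, px_per_m=900.0, in_front=False):
    """Separation is zero at the reference plane; which side of the plane the
    point lies on is assumed known (e.g. from which color appears on which side)."""
    dz = sep_px / px_per_m
    return z_ref - dz if in_front else z_ref + dz

if __name__ == "__main__":
    print(depth_from_color_separation(0.0))                  # 1.0, on the plane
    print(depth_from_color_separation(45.0))                 # 1.05, behind it
    print(depth_from_color_separation(45.0, in_front=True))  # 0.95, in front of it
```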
  • an imaging device with three lasers placed at the vertices of an equilateral triangle on the periphery of the lens can measure the exact distance between a particular point and a reference plane.
  • the reference plane is chosen with the design as the plane at which the beams intersect, though if necessary it could be changed during use by retuning the wavelength of the laser (such that the focal distance through its lens will change slightly) or by mechanically moving the laser.
  • Points on planes between the imaging device and the reference plane will form a pattern of an upright equilateral triangle (identical in orientation to the layout of the lasers). At a plane farther away than the reference plane, the pattern is inverted.
  • the orientation identifies the point of interest as being behind or in front of the reference plane while the size of the pattern projected correlates directly to the distance from the plane in question to the reference plane.
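  • The orientation-plus-size reading can be sketched as below, where the signed area of the imaged dot triangle supplies the front/behind decision and its size, through a stand-in calibration factor, supplies the distance from the reference plane (all numbers are hypothetical):

```python
import math

# Sketch of the tri-laser reading: the imaged dot triangle's orientation gives
# the side of the reference plane, and its size (via a stand-in calibration
# factor k) gives the distance from that plane.
def signed_offset_from_triangle(vertices, k=0.5):
    """vertices: three (x, y) dot centroids in the image.
    Returns a signed distance from the reference plane: negative for an upright
    triangle (in front of the plane), positive for an inverted one (behind)."""
    (x1, y1), (x2, y2), (x3, y3) = vertices
    signed_area = 0.5 * ((x2 - x1) * (y3 - y1) - (x3 - x1) * (y2 - y1))
    perimeter = (math.dist((x1, y1), (x2, y2)) + math.dist((x2, y2), (x3, y3))
                 + math.dist((x3, y3), (x1, y1)))             # simple size measure
    side = -1.0 if signed_area > 0 else 1.0   # sign convention fixed at calibration
    return side * k * perimeter

if __name__ == "__main__":
    upright = [(0.0, 0.0), (2.0, 0.0), (1.0, 1.7)]
    inverted = [(0.0, 1.7), (2.0, 1.7), (1.0, 0.0)]
    print(signed_offset_from_triangle(upright), signed_offset_from_triangle(inverted))
```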
  • the three lasers are used so that it is known
  • such systems are adaptable to existing imagers such as cameras, photodetectors, microscopes, and others, and need not be at wavelengths in the visible range.
  • the beams need not be separate lasers; they can be one beam split into the appropriate quantity of beams that the system requires via a diffraction grating, prism, thin-film filters, etc.
  • Yet another arrangement employs converging light focused at a reference plane in which depth measurement can be extrapolated from the intensity of the scattered light since such intensity will vary as the inverse of the area of the spot size.
  • the advantage of this arrangement is that the intensity can be measured with a single photocell and thus acquisition can be very fast - no post-processing is needed, other than any desired calibration.
  • this arrangement has strong potential for miniaturization into all-inclusive microchip sensors to develop small, portable, hand-held devices. It also has the advantage that a multitude of light sources can be used as a light cone that can easily be formed with simple lenses and fiber optics.
  • One embodiment employs a single light source projected as a ring onto a surface to be measured.
  • Telecentricity is optional but has advantages as discussed below. As long as all of the light from the scattered image of the ring is received by the receptor, a depth measurement can be obtained.
  • the intensity may be calibrated for the material being imaged simply by scanning X, Y for the reference plane, establishing a maximum value, and scanning X, Y out of range of the receptor where the received intensity should drop rapidly.
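  • A sketch of this intensity-to-depth conversion, assuming the spot radius grows linearly away from the reference plane and the intensity scales as the inverse of the spot area; the calibration constants stand in for the X, Y scan described above:

```python
import math

# Sketch of the intensity reading: scattered intensity taken to vary as the
# inverse of the illuminated spot area, with the spot radius growing linearly
# away from the reference plane (constants stand in for the X, Y calibration).
R0 = 0.2e-3      # spot radius at the reference plane [m]
GROWTH = 0.05    # radius growth per metre of defocus [m/m]
I_MAX = 1.0      # normalized intensity measured at the reference plane

def intensity_at(z, z_ref=1.0):
    r = R0 + GROWTH * abs(z - z_ref)
    return I_MAX * (R0 / r) ** 2               # inverse-of-area scaling

def offset_from_intensity(i):
    """Unsigned distance from the reference plane; the side (front/behind)
    must come from crossing the plane or from a short scan, as noted above."""
    r = R0 * math.sqrt(I_MAX / i)
    return (r - R0) / GROWTH

if __name__ == "__main__":
    for z in (0.95, 1.00, 1.08):
        i = intensity_at(z)
        print(f"z = {z:.2f} m  intensity = {i:.4f}  |z - z_ref| = {offset_from_intensity(i):.3f} m")
```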
  • the light source is focused at the reference plane, thus it converges as it
  • the usable range of the sensor is the intersection of the field of view cylinder and the light source cone(s).
  • Capture is fast enough that detection of which region is being used (aft of or in front of the reference plane) may be confirmed by checking to see if the reference plane is crossed at any point in time.
  • a fast acquisition device may be used, for example, to measure the out-of-plane deformation of a material in a highly dynamic stress situation (such as impact) in a small region (point).
  • detector components are moved away from or towards the point in question until the reference plane is found.
  • a phototransistor may be used, for example, as part of already-present accurate measurement devices.
  • This type of device has many applications, including but not limited to precise focusing of optical devices. By using a carefully tuned laser beam as a light source, the focal distance variation with wavelength can be measured. All these devices could be connected to a piezo-electric actuator for rapid scanning in depth (i.e., the Z direction).
  • the Z-scanning of a point will increase the accuracy of the measurement. Rapid Z-scanning could also expand the depth range of the devices which is a factor of the design of the light source, and, in the case of the on-off device, could convert it into a full depth measuring device in which a surface would be scanned in Z by the piezo-actuator and the depths at which the reference plane coincides with the object are recorded.
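  • A sketch of that scanning conversion is given below; the actuator and detector calls are hypothetical placeholders (simulated here in software) for whatever piezo driver and on-off sensor a real device would expose:

```python
# Sketch of turning an "on-off" reference-plane detector into a depth scanner:
# step the optics in Z and record where the reference plane meets the surface.
# The move/detect callables are hypothetical stand-ins for a piezo driver and
# an on-off sensor; here they are simulated in software.
def scan_depths(move_to_z, at_reference_plane, z_min, z_max, step=1e-5):
    """Return the Z positions at which the detector reports coincidence."""
    hits, z = [], z_min
    while z <= z_max:
        move_to_z(z)
        if at_reference_plane():
            hits.append(round(z, 6))
        z += step
    return hits

if __name__ == "__main__":
    surface_z = 0.00042                     # simulated surface location [m]
    state = {"z": 0.0}
    move = lambda z: state.update(z=z)
    detect = lambda: abs(state["z"] - surface_z) < 0.5e-5    # simulated detector
    print(scan_depths(move, detect, 0.0, 0.001))             # -> [0.00042]
```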
  • the piezo device could be replaced with a tunable light source, as by changing the wavelength of the light the focal distance through its optics will change in a predictable or measurable manner.
  • these single-point measurement devices can be extended into array form where multiple points in the X, Y plane can be imaged simultaneously by multiplying the number of devices (as in the case of the light-intensity based unit) or by multiplying the apparent number of light sources (in the tri-laser or the multicolor/intensity, etc. arrangements already described).
  • Multiplying the apparent number of light sources can be done by using beam splitters and/or holographic lenses or diffraction gratings so that a single beam can be split into multiple beams and thus be emitted from several different points. The same holds true for generating the light patterns in any of the embodiments herein.
  • the illumination and the sensor need not be on the same side of the surface, for example, if measuring on a semi-opaque film. It should also be clear from the embodiments where the depth is extracted from the position of projected dots that the detector need not be concentric with the central axis of the illumination, as any offset can be compensated for in the initial calibration. Further, the approaches described may be used alone or in combination.
  • Fig. 1 illustrates a light beam divergence-based imaging system
  • Fig. 2 illustrates an imaging system employing converging light beams
  • Fig. 3 illustrates a system employing light beams in a pattern converging and diverging across a focal plane
  • Fig. 4 illustrates a system employing "ring" or "flood” lighting converging and diverging across a focal plane
  • Figs. 5A and 5B are flowcharts illustrating methods of system operation.
  • System 100 combines the concepts of a diverging structured light projection with a defocusing camera system.
  • a system includes a camera 110 and a projector 120.
  • the camera view is taken as lines 112.
  • the projector provides a structured light pattern illustrated with lines 122 incident upon an object.
  • at an object plane 130, the apparent spacing between the dots as imaged by the camera is illustrated as image capture 140; at object position 132, the apparent spacing between the dots is illustrated as image capture 142; and at object position 134, the apparent spacing between the dots is illustrated as image capture 144.
  • the apparent spacing between the imaged dots varies with depth, depending on the relative distance from their respective origins.
  • a complex mapping of an object and/or object image tracking can be performed in real time.
  • by real time what is meant - in connection with this system and others - is that a user experiences no appreciable or apparent delay as a computer system provides display or responsiveness to action.
  • depth can be estimated by measuring the distance between points on the projected light pattern.
  • the distance between neighboring points on the light pattern will be linearly related to the distance from the camera in the setup shown.
  • the estimated depth is then advantageously refined by using defocusing to achieve final position accuracy.
  • the divergence-based calculations regarding point depth (z-axis) can be very useful.
  • the divergence-based calculations are very simple and thus fast. As a "starting point" they offer an advantage in terms of completing the more complex defocusing refinement calculations while still maintaining activity in real time (for display purposes, active imaging, etc.) even with a relatively simple or low-powered processor.
  • the first set of calculations (i.e., the divergence-based calculations) provides data that can prove useful for point match-up between images captured with an offset for defocusing.
  • such a system can operate very effectively, even at higher point densities and/or without coding the apertures.
  • these and other embodiments may use a laser, an LED or any other kind of light generator for the light source.
  • the light patterns may be provided by multiplying these entities or using beam splitters and/or holographic lenses or diffraction gratings (i.e., as in a diffractive optical element (DOE)) as mentioned above.
  • a projected "grid” may be used in different embodiments that extends (in reference to the noted figures) into and out of the page, such that the captured images and associated data structures that are processed measure between parts vertically (as well as horizontally as shown and described in connection with the image capture screens 140, 142, etc.).
  • if the camera system is of a low enough frame rate, one can also devise a system using a scanning laser, similar to CRTs, so that successive line-by-line capture and comparison is made.
  • comparison across the captured lines in a scan-type system may be employed.
  • the examples are provided in a non-limiting sense as should be understood with respect to the other inventive variations described.
  • Fig. 2 illustrates a system 200 in which a converging structured light projection is employed.
  • the projection can have a grid of varying intensity or varying color to help identify corresponding points - for example as described in above-referenced copending PCT Application No. PCT/US12/46484, filed July 12, 2012 and incorporated by reference in its entirety.
  • red and blue projections may be provided where the respective grids register at one plane 210. Then depth can be calculated in connection with images captured for an object (e.g., by pixel-by-pixel comparison and data transformation) as related to the distance measured between red and blue points in different depth (z-axis) planes 212, 214. While such activity will typically be populated across an entire x,y grid (optionally as noted above), a linear representation of the corresponding image frames 222, 224 captured by the camera is shown. In this case, the relative position of the multiple dots relates to the distance from the focal plane as determined by a computer running appropriate software or by an application specific integrated circuit (ASIC) or chip set in reference to an image calibration set.
  • Profilometry of an entire surface, or tracking the changing shape of a surface or body can be accomplished in this fashion.
  • Another option is to employ such a projection system 200 in connection with a multiple-aperture camera employing an aperture mask 230 (or independent camera system aligned to achieve the same basic optical result).
  • a first depth measurement can be made employing the offset beams (optionally through a central camera or aperture - though only two suitably-used offset apertures are shown in the figure), then subsequent refinement performed comparing the offset of image points within each color. For this purpose, only the red points can be used or only the blue points can be used.
  • one or more of the offset aperture(s) may include a color filter. Otherwise, a Bayer or other filter associated with the camera CMOS or CCD can be employed to eliminate the color channel not used for defocusing purposes.
  • system 300 uses multiple angles of illumination in conjunction with an imaging device to extrapolate the third dimension of measurement in connection with a computer processor.
  • an imaging device 302 including optics and a sensor, etc.
  • three lasers 310, 312, 314, placed at the vertices of an equilateral triangle on the periphery of a lens, are used to extract the distance between one or more points and a reference plane 320.
  • the reference plane is chosen in the design as the plane at which the beams (or multiple projected beams) meet at a focal point 340 along an optical axis 350, although the location of this plane can be changed during use by retuning the wavelength of the laser(s) - such that the focal distance through any associated lens, grating, etc. will change slightly - or mechanically by moving the laser(s).
  • points on a plane 322 between the imaging device and the reference plane will form a pattern of an upright equilateral triangle 330 (identical in orientation to the layout of the lasers).
  • the pattern 330' is inverted.
  • the orientation identifies the point of interest as being behind or in front of the reference plane while the size of the pattern projected correlates directly to the distance from the plane in question to the reference plane.
  • the referenced approach can be multiplied/multiplexed and/or used in conjunction with defocusing techniques, especially as a preceding/precedent matter, to simplify defocus-imaging feature match-up (be they dot, point, SIFT- or SURF-resolved, etc. features) for subsequent calculations.
  • FIG. 4 illustrates a related imaging or depth-finding system 400 in which a camera or simplified photo-intensity detector 402 is associated with a light 404 and an associated optics focusing beam 406 so it converges to a focal point 410 at a reference plane 412 and diverges beyond that point.
  • a field of view 420 for the sensor is (at least) substantially cylindrical.
  • the depth of the field 422 which may be imaged is then the intersection of the bi-cylindrical beam and cylinder 420.
  • the vision system's image magnification is (at least substantially) constant.
  • a measure of depth can be obtained simply by measuring reflected light intensity where the intensity of the light is expected to vary in inverse-square relation to distance from the source (i.e., to follow "Lambert's Law" as qualified/compared against calibration results for such an object or surface).
  • magnification-dependent intensity differences can be accounted for with system calibration. In any case, operation can be limited to a zone at or about (i.e., scanning around to identify, as discussed as an option above) the converging/diverging light focal point.
  • such a system 400 with defocusing hardware and software control as described above can be incorporated such that by using captured images from multiple offset apertures (e.g., as by using an optional aperture plate 430 with offset apertures 432 for collecting defocusing information and a central 434 aperture used for the intensity-based depth determination), distances between features in the captured image identified on an object (e.g., as applied as laser dots, by way of contrast medium, etc.) can be resolved for more accurate depth measurement.
  • various "active shutter” arrangements could instead be used.
  • an LCD rapidly switches between collecting intensity data in a fully “open” configuration and defocusing data by "blacking out” all but at least two offset windows for additional image capture.
  • the intensity-based depth measurement can be used to inform and speed, and/or improve the accuracy of the defocusing calculations - especially in connection with feature matching.
  • FIGs. 5A and 5B are flowcharts illustrating operation possibilities for the subject imaging systems.
  • the systems above may operate according to these methods or the methods described can implicate alternative hardware options.
  • the system obtains image information as indicated at 500.
  • a computer processor utilizes the image data captured as associated with the selected mode of converging/diverging light projection to make a determination of depth or distance of the point or points.
  • the sub-process outputs an initial depth determination 520 for the feature(s) of interest. With this initial depth determination and imaging data already obtained at 500, or with image data acquired specifically for defocusing at 530 (such as with a system 400 collecting different types of image data through its various apertures), a more refined depth determination 540 is made based on defocusing principles.
  • the refined depth determination results 550 of the defocusing sub-process may be accomplished as described in connection with the teachings of any of the above-referenced patents or applications, as in employing defocusing equations and/or in reference to a calibration set generated for the system optics (especially as further described in PCT/US10/57532).
  • the system obtains the various image data to process.
  • depth determinations are made from the same based on the converging/diverging projections.
  • potential point matches are identified for defocusing.
  • the projection-based depth determination is used as an estimate of expected neighbor point position depth to select which one is the most likely defocusing point pair match. So-matched, a defocusing-based depth determination is output at 600 in reference to a calibration set and/or defocusing equations per the references cited above.
  • feature matching for defocusing is aided by the information from the initial depth determination. Possession of such information a priori limits the range where expected matching points should be found within the possibilities for defocusing processing - thus reducing the computational intensity of the problem to be solved. In other words, with a given range or set of possible locations of z-position for an identified point in x, y space from one aperture/channel, its match from another aperture/channel is more easily identified. Once so-matched, extremely accurate defocusing-based imaging proceeds apace with reduced possibilities or even the elimination of mismatches.
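  • As a concrete illustration of that narrowing, the sketch below keeps only the candidate matches from the offset aperture whose implied depth falls near the a-priori estimate; the shift-to-depth mapping is a hypothetical stand-in for a real calibration set or defocusing equation.

```python
# Sketch of the narrowing step: given an a-priori depth estimate for a dot seen
# through one aperture, keep only candidates from the offset aperture whose
# implied depth is consistent with it.  The shift-to-depth mapping below is a
# hypothetical stand-in for a real calibration set or defocusing equation.
def plausible_matches(dot_x, candidate_xs, depth_of_shift, z_estimate, z_tol=0.05):
    """Return (candidate_x, implied_depth) pairs consistent with z_estimate."""
    kept = []
    for cx in candidate_xs:
        z = depth_of_shift(abs(cx - dot_x))
        if abs(z - z_estimate) <= z_tol:
            kept.append((cx, z))
    return kept

if __name__ == "__main__":
    depth_of_shift = lambda s: 1.0 - 0.01 * s    # toy linear mapping (placeholder)
    print(plausible_matches(100.0, [103.0, 118.0, 131.0], depth_of_shift, z_estimate=0.97))
```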
  • the process stops at the initial depth determination in Fig.
  • the depth determination for a given imaged "scene" may be aggregated with other related image scene data taken for an object larger than the imager field of view.
  • teachings for approaches to calculating camera pose, and to transforming and aggregating image data, as presented in PCT/US10/57532 may be applied, or others as may be apparent to those with skill in the art.
  • the cameras described herein can be handheld portable units, or machine vision cameras, or underwater units. Or the camera may be mounted in a stationary position with an object moved relative to it, or otherwise configured. Still further, the camera may be worn by a user to record facial expressions or gestures to be blended with animation. Other possibilities exist as well. Those of skill would further appreciate that the various illustrative logical blocks, modules, circuits, and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, computer software, or combinations of both. Indeed, given the type of pixel-to-pixel matching and associated calculations required with the data structures recorded and manipulated, computer use is necessary.
  • DSP Digital Signal Processor
  • ASIC Application Specific Integrated Circuit
  • FPGA Field Programmable Gate Array
  • a general purpose processor may be a microprocessor, but in the alternative, the processor may be any conventional processor, controller, microcontroller, or state machine.
  • the processor can be part of a computer system that also has a user interface port that communicates with a user interface, and which receives commands entered by a user, has at least one memory (e.g., hard drive or other comparable storage, and random access memory) that stores electronic information including a program that operates under control of the processor and with communication via the user interface port, and a video output that produces its output via any kind of video output format, e.g., VGA, DVI, HDMI, displayport, or any other form.
  • a memory e.g., hard drive or other comparable storage, and random access memory
  • a processor may also be implemented as a combination of computing devices, e.g., a combination of a DSP and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a DSP core, or any other such configuration.
  • a software module may reside in Random Access Memory (RAM), flash memory, Read Only Memory (ROM), Electrically Programmable ROM (EPROM), Electrically Erasable Programmable ROM (EEPROM), registers, hard disk, a removable disk, a CD-ROM, or any other form of storage medium known in the art.
  • An exemplary storage medium is coupled to the processor such that the processor can read information from, and write information to, the storage medium.
  • the storage medium may be integral to the processor.
  • the processor and the storage medium may reside in an ASIC.
  • the ASIC may reside in a user terminal.
  • the processor and the storage medium may reside as discrete components in a user terminal.
  • Computer-readable media includes both computer storage media and communication media including any medium that facilitates transfer of a computer program from one place to another.
  • a storage media may be any available media that can be accessed by a computer.
  • such computer-readable media can comprise RAM, ROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer.
  • the memory storage can also be rotating magnetic hard disk drives, optical disk drives, or flash memory based storage drives or other such solid state, magnetic, or optical storage devices.
  • any connection is properly termed a computer-readable medium.
  • if the software is transmitted from a website, server, or other remote source using a coaxial cable, fiber optic cable, twisted pair, digital subscriber line (DSL), or wireless technologies such as infrared, radio, and microwave, then the coaxial cable, fiber optic cable, twisted pair, DSL, or wireless technologies such as infrared, radio, and microwave are included in the definition of medium.
  • Disk and disc include compact disc (CD), laser disc, optical disc, digital versatile disc (DVD), floppy disk and Blu-ray disc, where disks usually reproduce data magnetically, while discs reproduce data optically with lasers.
  • the computer readable media can be an article comprising a machine-readable non-transitory tangible medium embodying information indicative of instructions that when performed by one or more machines result in computer implemented operations comprising the actions described throughout this specification.
  • Operations as described herein can be carried out on or over a website.
  • the website can be operated on a server computer, or operated locally, e.g., by being downloaded to the client computer, or operated via a server farm.
  • the website can be accessed over a mobile phone or a PDA, or on any other client.
  • the website can use HTML code in any form, e.g., MHTML, or XML, and via any form such as cascading style sheets (“CSS”) or other.
  • the computers described herein may be any kind of computer, either general purpose, or some specific purpose computer such as a workstation.
  • the programs may be written in C, or Java, Brew or any other programming language.
  • the programs may be resident on a storage medium, e.g., magnetic or optical, e.g. the computer hard drive, a removable disk or media such as a memory stick or SD media, or other removable medium.
  • the programs may also be run over a network, for example, with a server or other machine sending signals to the local machine, which allows the local machine to carry out the operations described herein.

Abstract

According to the present invention, light is projected toward a surface and a depth measurement at points is determined from convergence or divergence attributes of the light captured by a camera system. In defocusing applications, the optical system includes at least two off-axis apertures or camera portions arranged to image the projected light patterns including defocused information. The camera may be movable between different positions to image the surface from those different positions.
PCT/US2012/046557 2012-07-12 2012-07-12 Convergence/divergence based depth determination techniques and uses with defocusing imaging WO2014011182A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/US2012/046557 WO2014011182A1 (fr) 2012-07-12 2012-07-12 Convergence/divergence based depth determination techniques and uses with defocusing imaging

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/US2012/046557 WO2014011182A1 (fr) 2012-07-12 2012-07-12 Convergence/divergence based depth determination techniques and uses with defocusing imaging

Publications (1)

Publication Number Publication Date
WO2014011182A1 true WO2014011182A1 (fr) 2014-01-16

Family

ID=49916444

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2012/046557 WO2014011182A1 (fr) 2012-07-12 2012-07-12 Convergence/divergence based depth determination techniques and uses with defocusing imaging

Country Status (1)

Country Link
WO (1) WO2014011182A1 (fr)

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9736463B2 (en) 2007-04-23 2017-08-15 California Institute Of Technology Single-lens, single-sensor 3-D imaging device with a central aperture for obtaining camera position
DE102016203306A1 (de) * 2016-03-01 2017-09-07 Robert Bosch Gmbh Profiltiefenmessgerät
US9928592B2 (en) 2016-03-14 2018-03-27 Sensors Unlimited, Inc. Image-based signal detection for object metrology
US10007971B2 (en) 2016-03-14 2018-06-26 Sensors Unlimited, Inc. Systems and methods for user machine interaction for image-based metrology
EP3435026A1 (fr) 2017-07-24 2019-01-30 Hand Held Products, Inc. Dimensionnement 3d optique à double motif
TWI661232B (zh) * 2018-05-10 2019-06-01 視銳光科技股份有限公司 泛光照射器與點陣投影器的整合結構
CN110470219A (zh) * 2019-08-16 2019-11-19 福建农林大学 基于边缘频谱保留的散焦图像测距方法及装置
CN110651166A (zh) * 2017-03-13 2020-01-03 赫普塔冈微光有限公司 用于收集三维数据的光电装置
CN111854575A (zh) * 2020-07-30 2020-10-30 唐山市德龙钢铁有限公司 辊环环槽检测装置
US11450083B2 (en) 2019-09-27 2022-09-20 Honeywell International Inc. Dual-pattern optical 3D dimensioning
US11639846B2 (en) 2019-09-27 2023-05-02 Honeywell International Inc. Dual-pattern optical 3D dimensioning

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4879664A (en) * 1985-05-23 1989-11-07 Kabushiki Kaisha Toshiba Three-dimensional position sensor and three-dimensional position setting system
US6229913B1 (en) * 1995-06-07 2001-05-08 The Trustees Of Columbia University In The City Of New York Apparatus and methods for determining the three-dimensional shape of an object using active illumination and relative blurring in two-images due to defocus
US20030160970A1 (en) * 2002-01-30 2003-08-28 Anup Basu Method and apparatus for high resolution 3D scanning
US20090129667A1 (en) * 2007-11-16 2009-05-21 Gwangju Institute Of Science And Technology Device and method for estimatiming depth map, and method for generating intermediate image and method for encoding multi-view video using the same
US20090295908A1 (en) * 2008-01-22 2009-12-03 Morteza Gharib Method and device for high-resolution three-dimensional imaging which obtains camera pose using defocusing

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4879664A (en) * 1985-05-23 1989-11-07 Kabushiki Kaisha Toshiba Three-dimensional position sensor and three-dimensional position setting system
US6229913B1 (en) * 1995-06-07 2001-05-08 The Trustees Of Columbia University In The City Of New York Apparatus and methods for determining the three-dimensional shape of an object using active illumination and relative blurring in two-images due to defocus
US20030160970A1 (en) * 2002-01-30 2003-08-28 Anup Basu Method and apparatus for high resolution 3D scanning
US20090129667A1 (en) * 2007-11-16 2009-05-21 Gwangju Institute Of Science And Technology Device and method for estimatiming depth map, and method for generating intermediate image and method for encoding multi-view video using the same
US20090295908A1 (en) * 2008-01-22 2009-12-03 Morteza Gharib Method and device for high-resolution three-dimensional imaging which obtains camera pose using defocusing

Cited By (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9736463B2 (en) 2007-04-23 2017-08-15 California Institute Of Technology Single-lens, single-sensor 3-D imaging device with a central aperture for obtaining camera position
DE102016203306A1 (de) * 2016-03-01 2017-09-07 Robert Bosch Gmbh Profiltiefenmessgerät
US9928592B2 (en) 2016-03-14 2018-03-27 Sensors Unlimited, Inc. Image-based signal detection for object metrology
US10007971B2 (en) 2016-03-14 2018-06-26 Sensors Unlimited, Inc. Systems and methods for user machine interaction for image-based metrology
EP3596425A4 (fr) * 2017-03-13 2020-12-16 Heptagon Micro Optics Pte. Ltd. Dispositifs optoélectroniques pour collecter des données tridimensionnelles
CN110651166A (zh) * 2017-03-13 2020-01-03 赫普塔冈微光有限公司 用于收集三维数据的光电装置
CN110651166B (zh) * 2017-03-13 2021-09-24 赫普塔冈微光有限公司 用于收集三维数据的光电装置
EP3435026A1 (fr) 2017-07-24 2019-01-30 Hand Held Products, Inc. Dimensionnement 3d optique à double motif
US10733748B2 (en) 2017-07-24 2020-08-04 Hand Held Products, Inc. Dual-pattern optical 3D dimensioning
TWI661232B (zh) * 2018-05-10 2019-06-01 視銳光科技股份有限公司 泛光照射器與點陣投影器的整合結構
CN110470219A (zh) * 2019-08-16 2019-11-19 福建农林大学 基于边缘频谱保留的散焦图像测距方法及装置
US11450083B2 (en) 2019-09-27 2022-09-20 Honeywell International Inc. Dual-pattern optical 3D dimensioning
US11639846B2 (en) 2019-09-27 2023-05-02 Honeywell International Inc. Dual-pattern optical 3D dimensioning
CN111854575A (zh) * 2020-07-30 2020-10-30 唐山市德龙钢铁有限公司 辊环环槽检测装置
CN111854575B (zh) * 2020-07-30 2022-05-10 唐山市德龙钢铁有限公司 辊环环槽检测装置

Similar Documents

Publication Publication Date Title
WO2014011182A1 (fr) Convergence/divergence based depth determination techniques and uses with defocusing imaging
US10742957B2 (en) Three-dimensional imaging system
JP7043085B2 (ja) 視点から距離情報を取得するための装置及び方法
RU2668404C2 (ru) Устройство для записи изображения в трехмерном масштабе, способ создания 3D-изображения и способ формирования устройства для записи изображения в трехмерном масштабе
JP5929553B2 (ja) 画像処理装置、撮像装置、画像処理方法およびプログラム
US8773514B2 (en) Accurate 3D object reconstruction using a handheld device with a projected light pattern
US20140184748A1 (en) Single-sensor system for extracting depth information from image blur
KR20100019455A (ko) 단일-렌즈, 단일-개구, 단일-센서의 3-cd 영상화 장치
KR20090107536A (ko) 정량적 3-d 이미징을 위한 방법
WO2022126870A1 (fr) Procédé imageur tridimensionnel et procédé sur la base d'un appareil photographique plénoptique et ligne de production de mesure d'imagerie tridimensionnelle
US11538193B2 (en) Methods and systems for calibrating a camera
CN103486979A (zh) 混合系统
CN111811431A (zh) 一种三维扫描仪、三维扫描系统及方法
US10096113B2 (en) Method for designing a passive single-channel imager capable of estimating depth of field
JP2010066155A (ja) 形状測定装置
US20210256729A1 (en) Methods and systems for determining calibration quality metrics for a multicamera imaging system
CN111272101A (zh) 一种四维高光谱深度成像系统
JP2023547699A (ja) 重なる視野を有するセンサを有する三次元スキャナ
JP4651550B2 (ja) 三次元座標計測装置および方法
CN211205210U (zh) 四维高光谱深度成像系统
CN115514877A (zh) 用于从多视角图像降噪的装置和方法

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 12880823

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 12880823

Country of ref document: EP

Kind code of ref document: A1