WO2009067223A2 - Method and system for fast three-dimensional imaging using defocusing and feature recognition - Google Patents

Method and system for fast three-dimensional imaging using defocusing and feature recognition

Info

Publication number
WO2009067223A2
Authority
WO
WIPO (PCT)
Prior art keywords
sensor
images
segments
positions
interest
Prior art date
Application number
PCT/US2008/012947
Other languages
English (en)
Other versions
WO2009067223A3 (fr)
Inventor
Morteza Gharib
Emilio Castano Graff
Francisco Pereira
Original Assignee
California Institute Of Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US12/011,016 (US8576381B2)
Priority claimed from US12/150,237 (US7916309B2)
Application filed by California Institute Of Technology
Publication of WO2009067223A2
Publication of WO2009067223A3

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/50 Depth or shape recovery
    • G06T 7/55 Depth or shape recovery from multiple images
    • G06T 7/571 Depth or shape recovery from multiple images from focus

Definitions

  • the present invention relates to three-dimensional imaging and, more particularly, to a method and system for fast three-dimensional imaging using defocusing and feature recognition.
  • the present invention relates to three-dimensional imaging.
  • the various methods for imaging objects in three-dimensions can be divided into scanning methods and instantaneous methods.
  • Scanning systems are generally suitable for imaging immobile objects. Scanning methods include laser scanning, ultrasound (tomography), magnetic resonance imaging (tomography), and confocal scanning microscopy.
  • Instantaneous systems can be used to capture moving objects and thus analyze motion, such as in defocusing DPIV (Digital Particle Image Velocimetry).
  • Other instantaneous methods include ultrasound, three-dimensional photogrammetry, correlation surface mapping, and time-of-flight systems.
  • the present invention relates to three-dimensional imaging and, more particularly, to a method and system for fast three-dimensional imaging using defocusing and feature recognition.
  • a first aspect of the method comprises acts of capturing a plurality of defocused images of an object on a sensor, identifying segments of interest in each of the plurality of images using a feature recognition algorithm, and matching the segments with three-dimensional coordinates according to the positions of the images of the segments on the sensor. Thus, a three-dimensional position of each segment of interest is produced. (An illustrative end-to-end code sketch of this pipeline appears after this list.)
  • the images of the object are obtained by an instantaneous imaging method selected from the group consisting of defocusing, ultrasound, three-dimensional photogrammetry, correlation surface mapping, and time-of-flight methods.
  • Another aspect of the present invention is a data processing system comprising a sensor for capturing a plurality of defocused images of an object substantially simultaneously.
  • the system also includes one or more processors configured to receive a plurality of defocused images of an object captured on the sensor, identify segments of interest in each of the plurality of images using a feature recognition algorithm, and match the segments with three-dimensional coordinates according to the positions of the images of the segments on the sensor.
  • the data processing system further comprises a camera having a lens obstructed by a mask with a plurality of off-axis apertures.
  • the sensor is configured to capture a plurality of defocused images of an object by receiving a signal transmitted by the object through the plurality of off-axis apertures and through the lens.
  • the data processing system is further configured to ascertain sub-pixel positions of the segments on the sensor using a cross-correlation type algorithm. (An illustrative sub-pixel localization sketch appears after this list.)
  • an additional aspect of the present invention is a computer program product.
  • the computer program product comprises computer-readable instruction means stored on a computer-readable medium that are executable by a computer for causing the computer to receive a plurality of defocused images of an object on a sensor, identify segments of interest in each of the plurality of images using a feature recognition algorithm, and match the segments with three-dimensional coordinates according to the positions of the images of the segments on the sensor.
  • FIG. 1A is a flow diagram showing the acts of a "dumb" imaging method as currently exists in the art;
  • FIG. 1B is a flow diagram showing the acts of an "aware" imaging method of the present invention.
  • FIG. 2 is an illustration of a 3-aperture camera mask for use with the present invention, and groups of defocused images produced by such a mask;
  • FIG. 3A is an illustration showing an outline of a person from an input image;
  • FIG. 3B is an illustration showing a point cloud representing a person, as output by a "dumb" imaging method;
  • FIG. 3C is an illustration showing segments of interest corresponding to a person as output by the "aware" imaging method of the present invention.
  • FIG. 4 is a block diagram showing a generic data processing system for use with the present invention.
  • FIG. 5 is an illustration showing a computer program product for use with the present invention.
  • the present invention relates to three-dimensional imaging and, more particularly, to a method and system for fast three-dimensional imaging using defocusing and feature recognition.
  • the following description is presented to enable one of ordinary skill in the art to make and use the invention and to incorporate it in the context of particular applications.
  • Various modifications, as well as a variety of uses in different applications will be readily apparent to those skilled in the art, and the general principles defined herein may be applied to a wide range of embodiments.
  • the present invention is not intended to be limited to the embodiments presented, but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.
  • the labels left, right, front, back, top, bottom, forward, reverse, clockwise and counter clockwise have been used for convenience purposes only and are not intended to imply any particular fixed direction. Instead, they are used to reflect relative locations and/or directions between various portions of an object.
  • the present invention relates to three-dimensional imaging and, more particularly, to a method and system for fast three-dimensional imaging using defocusing and feature recognition.
  • defocusing is known in the art and was first introduced by Willert, C.E. and Gharib, M. in Three-dimensional particle imaging with a single camera. Experiments in Fluids 12, 353-358 (1992), which is incorporated by reference as though fully set forth herein.
  • FIG. 1A is a flow diagram illustrating a "dumb" method for three-dimensional imaging as currently exists in the art.
  • the "dumb” method is initiated by capturing a plurality of defocused images 100 of an object.
  • the object contains a set of either natural or projected marker points.
  • the pixel or sub-pixel centers are then identified 102 for each marker point in the plurality of images.
  • FIG. 2 shows an example of a plurality of defocused marker images produced by an imager with a 3-aperture mask 200, resulting in a triangular arrangement of defocused images 202, or "triads," corresponding to each feature or marker on the target object.
  • Determining a sub-pixel center 204 of each triad will yield the x and y-coordinates of that marker point on the object.
  • the camera may be optionally calibrated 104 (in FIG. 1A) to account for lens distortions, assembly precision, and experimental setup in preparation for matching the marker images with their z-coordinates (three-dimensional coordinates) 106.
  • the physical size of the triangle 206 (in FIG. 2) produced by the triad will yield the z, or depth, coordinate of the marker point.
  • the depth coordinate can be generated through use of defocusing equations known in the art, a non-limiting example of which was published by Willert, C.E. and Gharib, M. in Three-dimensional particle imaging with a single camera. (One common form of this defocusing relation is reproduced for reference after this list.)
  • a second optional camera calibration may be performed 108.
  • the end result of the "dumb" method is the output of a three-dimensional point cloud 110 representing the three-dimensional shape of the object.
  • an outline of a person from an input image is shown in FIG. 3A; an illustration of a point cloud 301 produced by a "dumb" method is shown in FIG. 3B.
  • FIG. 1B is a flow diagram showing the acts of the "aware" imaging method of the present invention.
  • the first act is capturing a plurality of defocused images of an object on a sensor 112.
  • the images captured are not limited to visual images, as the defocusing techniques of the present invention are also applicable to acoustics.
  • a non-limiting example of how the defocusing can be achieved is by a camera having a lens obscured by a mask with a plurality of off-axis apertures.
  • FIG. 2 shows a defocusing mask containing three off-axis apertures 200 arranged in a triangular shape.
  • the present invention can utilize any of a wide variety of defocusing masks and camera assemblies, including but not limited to those described in U.S. Patent Application No. 12/011,023, filed January 22, 2008, entitled "METHOD AND APPARATUS FOR QUANTITATIVE 3-D IMAGING;" U.S. Patent Application No. 12/011,016, filed January 22, 2008, entitled "METHOD AND APPARATUS FOR QUANTITATIVE 3-D IMAGING;" U.S. Patent Application No.
  • the sensor is configured to capture the plurality of defocused images of the object by receiving a signal transmitted from the object through the plurality of off-axis apertures and through the lens of the camera.
  • "signal" is meant to encompass both electromagnetic radiation and sound waves.
  • "transmitted from an object" is meant to encompass both reflection and emission of the signal from the object.
  • Non-limiting examples of transmission by emission include, but are not limited to, radiation from a light bulb or from an object undergoing radioactive decay.
  • Non-limiting examples of transmission by reflection include, but are not limited to, reflection of light off of an object from a light bulb or from a laser.
  • the result of the defocusing is to produce a plurality of defocused images 202 on the sensor.
  • the next act is to identify segments of interest in the plurality of defocused images using a feature recognition algorithm 114.
  • such an "aware" imaging method searches for a small number of a priori known features in the images. The segments of interest searched for will depend on the type of object being viewed.
  • FIG. 3C shows examples of segments of interest 302 used in human feature recognition.
  • the present invention can use any feature recognition algorithm known in the art to find the segments of interest.
  • the camera can be optionally calibrated 116 (in FIG. 1B) to account for lens distortions, assembly precision, and experimental setup.
  • the feature segments are then matched with their three-dimensional coordinates 118 based on the position of the images of the segments of interest on the sensor.
  • the defocused images that are produced will form triads 202.
  • the position of the center of the triad 204 on the sensor gives the x and y locations of the segment in the image.
  • the physical size of the triad 206 gives the z coordinate through the use of defocusing equations first described by Willert, C.E. and Gharib, M. in Three-dimensional particle imaging with a single camera. Experiments in Fluids 12, 353-358 (1992).
  • the physical size of the triad can be represented as any of a variety of measurements including but not limited to the area of the triad, the circumference of the triad, the distance from a center point to any vertex of the triad, or the circumference of a circle encompassing and tangent with the vertices of the triad. (An illustrative sketch of these triad measurements appears after this list.)
  • the end result of the "aware" imaging method is an image containing three-dimensional locations of the segments of interest 302.
  • FIG. 4 is a block diagram showing a generic data processing system for use with the present invention.
  • the data processing system 400 comprises a memory 402 and a processor 404 (or a plurality of processors).
  • the processor(s) 404 is configured to receive a defocused image input 406 and output three-dimensional locations of the segments of interest 408.
  • the system is further configured to perform the acts of the method of the present invention, including: capturing a plurality of defocused images of an object on a sensor, identifying segments of interest in each of the plurality of images using a feature recognition algorithm, and matching the segments with three-dimensional coordinates according to the positions of the images of the segments on the sensor to produce a three-dimensional position of each segment of interest.
  • the present invention also comprises a computer program product 500.
  • the computer program product 500 comprises computer readable instruction means encoded thereon for causing the data processing system to perform the operations described herein.
  • instruction means as used with respect to this invention generally indicates a set of operations to be performed on a computer (or computers), and may represent pieces of a whole program or individual, separable, software modules.
  • Non-limiting examples of "instruction means" include computer program code (source or object code) and "hard-coded" electronics (i.e., computer operations coded into a computer chip).
  • the "instruction means” may be stored in the memory of a computer or on a computer-readable medium such as a floppy disk, a CD-ROM, and a flash drive.
  • the computer program product 500 shown in FIG. 5 is an optical disk such as a CD or DVD. However, the computer program product 500 generally represents computer-readable instructions stored on any compatible computer-readable medium.
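
For reference, the defocusing geometry invoked above is commonly summarized by a relation of the following form. This is a standard single-aperture-pair form given here as background; the notation and sign convention below are not taken from this publication and depend on the particular optical setup:

\[ b = M\,d\left(1 - \frac{L}{Z}\right) \qquad\Longleftrightarrow\qquad Z = \frac{L}{1 - b/(M\,d)} \]

where b is the separation on the sensor between the two images of a single point formed through one aperture pair (proportional to the triad size), d is the spacing between the apertures, L is the distance from the lens to the focused plane, M is the magnification at the focused plane, and Z is the depth of the point. A point on the focused plane (Z = L) gives b = 0, and the triad grows as the point moves away from that plane.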
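
The cross-correlation-based sub-pixel localization mentioned above can be illustrated with a short sketch. This is a minimal example rather than the algorithm specified in this publication: the Gaussian dot template and the three-point parabolic peak fit are assumptions chosen for illustration.

    # Minimal sub-pixel dot localization by cross-correlation (illustrative only).
    import numpy as np
    from scipy.signal import correlate2d

    def gaussian_template(size=9, sigma=1.5):
        """Synthetic dot template used as the correlation kernel."""
        r = np.arange(size) - size // 2
        xx, yy = np.meshgrid(r, r)
        return np.exp(-(xx**2 + yy**2) / (2.0 * sigma**2))

    def subpixel_center(patch, template=None):
        """Return (x, y) of the single bright dot in `patch`, to sub-pixel precision."""
        if template is None:
            template = gaussian_template()
        # Zero-mean both sides so a bright background does not bias the peak.
        c = correlate2d(patch - patch.mean(), template - template.mean(), mode="same")
        iy, ix = np.unravel_index(np.argmax(c), c.shape)

        def parabolic(cm, c0, cp):
            # Vertex of the parabola through three equally spaced samples.
            denom = cm - 2.0 * c0 + cp
            return 0.0 if denom == 0 else 0.5 * (cm - cp) / denom

        dx = parabolic(c[iy, ix - 1], c[iy, ix], c[iy, ix + 1]) if 0 < ix < c.shape[1] - 1 else 0.0
        dy = parabolic(c[iy - 1, ix], c[iy, ix], c[iy + 1, ix]) if 0 < iy < c.shape[0] - 1 else 0.0
        return ix + dx, iy + dy

In practice a patch would be cut around each candidate dot returned by the feature recognition step, and the same routine applied to each of the three dots of a triad.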
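
The triad measurements listed above (area, circumference, center-to-vertex distance, circumscribed circle) and the mapping from triad size to depth can likewise be sketched. The calibration constants L, M, d and pixel_pitch are placeholders, and the depth formula is the hedged defocusing relation given above, not an equation quoted from this publication.

    # Minimal triad geometry: centroid, size measures, and a depth mapping (illustrative only).
    import numpy as np

    def triad_measures(vertices):
        """vertices: (3, 2) array of sub-pixel (x, y) triad corners on the sensor."""
        v = np.asarray(vertices, dtype=float)
        center = v.mean(axis=0)  # triad centroid -> x, y of the feature image
        sides = [np.linalg.norm(v[i] - v[(i + 1) % 3]) for i in range(3)]
        perimeter = sum(sides)
        # Shoelace formula for the triangle area.
        area = 0.5 * abs((v[1, 0] - v[0, 0]) * (v[2, 1] - v[0, 1])
                         - (v[1, 1] - v[0, 1]) * (v[2, 0] - v[0, 0]))
        circumradius = sides[0] * sides[1] * sides[2] / (4.0 * area) if area > 0 else float("nan")
        return center, {"perimeter": perimeter, "area": area, "circumradius": circumradius}

    def depth_from_separation(b_pixels, pixel_pitch, L, M, d):
        """Map an image-plane pair separation b (in pixels) to a depth Z (in the units of L)."""
        b = b_pixels * pixel_pitch  # convert pixels to the metric units of d and L
        return L / (1.0 - b / (M * d))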
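
Finally, the overall "aware" pipeline (capture, feature recognition, triad matching, three-dimensional output) can be summarized in a skeleton that reuses the two helpers above. Here detect_features and find_triad stand in for a feature recognition algorithm and a triad-grouping step; they are assumptions made for illustration, not components prescribed by this publication.

    # Skeleton of the "aware" reconstruction: only a priori known features are processed.
    import numpy as np

    def aware_reconstruction(image, detect_features, find_triad, pixel_pitch, L, M, d):
        """Return an (N, 3) array of (x, y, z) positions for N segments of interest."""
        points = []
        for feature in detect_features(image):            # segments of interest only
            vertices = find_triad(image, feature)          # (3, 2) sub-pixel triad corners
            center, measures = triad_measures(vertices)    # x, y from the triad centroid
            b = measures["perimeter"] / 3.0                # mean pair separation as the triad "size"
            z = depth_from_separation(b, pixel_pitch, L, M, d)
            points.append((center[0], center[1], z))
        return np.array(points)

Because only the small set of a priori known features is processed, the per-frame cost scales with the number of segments of interest rather than with the total number of marker points, which is the speed advantage described above.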

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Image Processing (AREA)
  • Image Analysis (AREA)

Abstract

A method and system for fast three-dimensional imaging using defocusing and feature recognition. The method comprises capturing a plurality of defocused images of an object on a sensor, identifying segments of interest in each of the plurality of images using a feature recognition algorithm, and matching the segments with three-dimensional coordinates according to the positions of the segment images on the sensor in order to produce a three-dimensional position of each segment of interest. The imaging method of the present invention is "aware" in that it uses a priori knowledge of a small number of object features to reduce computation time as compared with the "dumb" methods known in the art, which exhaustively compute the positions of a large number of marker points.
PCT/US2008/012947 2007-11-19 2008-11-19 Method and system for fast three-dimensional imaging using defocusing and feature recognition WO2009067223A2 (fr)

Applications Claiming Priority (14)

Application Number Priority Date Filing Date Title
US369807P 2007-11-19 2007-11-19
US61/003,698 2007-11-19
US12/011,016 US8576381B2 (en) 2007-01-22 2008-01-22 Method and apparatus for quantitative 3-D imaging
US12/011,016 2008-01-22
US12/011,023 US7826067B2 (en) 2007-01-22 2008-01-22 Method and apparatus for quantitative 3-D imaging
US12/011,023 2008-01-22
US12/150,237 US7916309B2 (en) 2007-04-23 2008-04-23 Single-lens, single-aperture, single-sensor 3-D imaging device
US12/150,237 2008-04-23
US12/150,238 2008-04-23
US12/150,236 US8619126B2 (en) 2007-04-23 2008-04-23 Single-lens, single-sensor 3-D imaging device with a central aperture for obtaining camera position
US12/150,238 US7894078B2 (en) 2007-04-23 2008-04-23 Single-lens 3-D imaging device using a polarization-coded aperture masks combined with a polarization-sensitive sensor
US12/150,236 2008-04-23
US12/150,239 US20080278572A1 (en) 2007-04-23 2008-04-23 Aperture system with spatially-biased aperture shapes and positions (SBPSP) for static and dynamic 3-D defocusing-based imaging
US12/150,239 2008-04-23

Publications (2)

Publication Number Publication Date
WO2009067223A2 (fr) 2009-05-28
WO2009067223A3 WO2009067223A3 (fr) 2009-08-27

Family

ID=40668055

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2008/012947 WO2009067223A2 (fr) 2007-11-19 2008-11-19 Method and system for fast three-dimensional imaging using defocusing and feature recognition

Country Status (1)

Country Link
WO (1) WO2009067223A2 (fr)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO1996041304A1 * 1995-06-07 1996-12-19 The Trustees Of Columbia University In The City Of New York Apparatus and methods for determining the three-dimensional shape of an object by means of dynamic illumination and a relative decrease in sharpness in two images due to defocusing
US20080031513A1 (en) * 2000-07-14 2008-02-07 Massachusetts Institute Of Technology Method and system for high resolution, ultra fast 3-D imaging
US20040136567A1 (en) * 2002-10-22 2004-07-15 Billinghurst Mark N. Tracking a surface in a 3-dimensional scene using natural visual features of the surface
US20060098872A1 (en) * 2004-03-22 2006-05-11 Stereo Display, Inc. Three-dimensional imaging system for pattern recognition
WO2007130122A2 * 2006-05-05 2007-11-15 Thomson Licensing System and method for three-dimensional object reconstruction from two-dimensional images
US20080259354A1 (en) * 2007-04-23 2008-10-23 Morteza Gharib Single-lens, single-aperture, single-sensor 3-D imaging device

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
FRANCISCO PEREIRA ET AL: "Two-frame 3D particle tracking" MEASUREMENT SCIENCE AND TECHNOLOGY, IOP, BRISTOL, GB, vol. 17, no. 7, 1 July 2006 (2006-07-01), pages 1680-1692, XP020103583 ISSN: 0957-0233 *
WILLERT C E ET AL: "THREE-DIMENSIONAL PARTICLE IMAGING WITH A SINGLE CAMERA" EXPERIMENTS IN FLUIDS, SPRINGER, HEIDELBERG, DE, vol. 12, no. 6, 1 April 1992 (1992-04-01), pages 353-358, XP000287710 ISSN: 0723-4864 cited in the application *

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9736463B2 (en) 2007-04-23 2017-08-15 California Institute Of Technology Single-lens, single-sensor 3-D imaging device with a central aperture for obtaining camera position

Also Published As

Publication number Publication date
WO2009067223A3 (fr) 2009-08-27

Similar Documents

Publication Publication Date Title
US8456645B2 (en) Method and system for fast three-dimensional imaging using defocusing and feature recognition
JP5467404B2 (ja) 3D imaging system
US8988317B1 (en) Depth determination for light field images
US9210404B2 (en) Calibration and registration of camera arrays using a single circular grid optical target
US9053547B2 (en) Three-dimensional point cloud position data processing device, three-dimensional point cloud position data processing system, and three-dimensional point cloud position data processing method and program
JP5337243B2 (ja) Adaptive three-dimensional scanning system for surface features
JP6573419B1 (ja) Positioning method, robot, and computer storage medium
KR20160116075A (ko) Image processing apparatus having an automatic correction function for images acquired from a camera, and method therefor
WO2004044522A1 (fr) Method and device for measuring a three-dimensional shape
JP2007129709A (ja) Method for calibrating an imaging device, method for calibrating an imaging system including an array of imaging devices, and imaging system
JP2012058076A (ja) Three-dimensional measuring device and three-dimensional measuring method
CN102810205A (zh) Calibration method for a video or photographic camera device
JP2017509986A (ja) Optical flow imaging system and method using ultrasound depth sensing
JP6580761B1 (ja) Depth acquisition device using a polarization stereo camera, and method therefor
KR20200067641A (ko) Calibration method for three-dimensional augmented reality, and apparatus therefor
US20180240241A1 (en) Three-dimensional imager
Shen et al. Calibrating light sources by using a planar mirror
JP2009545265A (ja) Processing method for a coded aperture sensor
KR102668245B1 (ko) Three-dimensional depth measurement apparatus and method
CN104200456A (zh) Decoding method for line-structured-light three-dimensional measurement
KR20180054737A (ko) Apparatus and method for generating data representing a pixel beam
TW201830338A (zh) Method and optical system for obtaining the tomographic distribution of the wavefront of an electromagnetic field
RU2729698C2 (ru) Device and method for encoding an image captured by an optical data acquisition system
Shim et al. Performance evaluation of time-of-flight and structured light depth sensors in radiometric/geometric variations
WO2009067223A2 (fr) Method and system for fast three-dimensional imaging using defocusing and feature recognition

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 08851329

Country of ref document: EP

Kind code of ref document: A2

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 08851329

Country of ref document: EP

Kind code of ref document: A2