EP1903299A1 - Method and system for acquiring a 3-D image of a scene - Google Patents

Method and system for acquiring a 3-D image of a scene

Info

Publication number
EP1903299A1
Authority
EP
European Patent Office
Prior art keywords
light
lock
emitting devices
modulation phase
light emitting
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP06120996A
Other languages
English (en)
French (fr)
Inventor
Marie-Astrid Cauquy
Laurent Lamesch
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
IEE International Electronics and Engineering SA
Original Assignee
IEE International Electronics and Engineering SA
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by IEE International Electronics and Engineering SA filed Critical IEE International Electronics and Engineering SA
Priority to EP06120996A priority Critical patent/EP1903299A1/de
Priority to US12/441,827 priority patent/US7952690B2/en
Priority to CNA2007800348145A priority patent/CN101517355A/zh
Priority to PCT/EP2007/059507 priority patent/WO2008034738A1/en
Priority to EP07803393.3A priority patent/EP2074377B1/de
Priority to JP2009528678A priority patent/JP2010504509A/ja
Publication of EP1903299A1 publication Critical patent/EP1903299A1/de
Withdrawn legal-status Critical Current

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01B - MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00 - Measuring arrangements characterised by the use of optical techniques
    • G01B11/24 - Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 - Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02 - Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/06 - Systems determining position data of a target
    • G01S17/08 - Systems determining position data of a target for measuring distance only
    • G01S17/32 - Systems determining position data of a target for measuring distance only using transmission of continuous waves, whether amplitude-, frequency-, or phase-modulated, or unmodulated
    • G01S17/36 - Systems determining position data of a target for measuring distance only using transmission of continuous waves, whether amplitude-, frequency-, or phase-modulated, or unmodulated with phase comparison between the received signal and the contemporaneously transmitted signal
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 - Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88 - Lidar systems specially adapted for specific applications
    • G01S17/89 - Lidar systems specially adapted for specific applications for mapping or imaging
    • G01S17/894 - 3D imaging with simultaneous measurement of time-of-flight at a 2D array of receiver pixels, e.g. time-of-flight cameras or flash lidar

Definitions

  • the present invention relates to a 3-D (three-dimensional) imaging system.
  • Systems for creating a 3-D representation of a given portion of space have a variety of potential applications in many different fields. Examples are automotive sensor technology (e.g. vehicle occupant detection and classification), robotic sensor technology (e.g. object identification) or safety engineering (e.g. plant monitoring) to name only a few.
  • a 3-D imaging system requires depth information about the target scene. In other words, the distances between one or more observed objects and an optical receiver of the system need to be determined.
  • A well-known approach to distance measurement, used e.g. in radar applications, consists in timing the interval between emission and echo return of a measurement signal; this approach is known as the time-of-flight (TOF) method.
  • Here, the measurement signal consists of light waves.
  • the term "light” is to be understood as including visible, infrared (IR) and ultraviolet (UV) light.
  • The TOF method can be implemented using e.g. the phase-shift technique or the pulse technique.
  • In the phase-shift technique, the amplitude of the emitted light is periodically modulated (e.g. by sinusoidal modulation) and the phase of the modulation at emission is compared to the phase of the modulation at reception.
  • In the pulse technique, light is emitted in discrete pulses without the requirement of periodicity.
  • the modulation period is normally in the order of twice the difference between the maximum measurement distance and the minimum measurement distance divided by the velocity of light.
  • The propagation time interval is determined as a phase difference by means of a phase comparison between the emitted and the received light signal.
  • Such a phase comparison requires synchronization of the demodulation signal with the emitted light signal. Due to the high propagation speed given by the velocity of light, a fundamental difficulty encountered in distance measurements based on the pulse technique or the phase-shift technique resides in the required temporal resolution of the measurement device. In fact, a spatial resolution on the order of centimetres requires a temporal resolution on the order of 10⁻¹¹ seconds (10 ps); a short numerical sketch of these relations follows this list.
  • a 3-D imaging system comprising an illumination unit for emitting light into a scene and an imaging sensor for imaging the scene by detecting scattered light.
  • the system also comprises an evaluation unit, for determining distance information related to the scene on the basis of light propagation time, and synchronization means for providing synchronization information to the evaluation unit.
  • the synchronization means comprises means for generating an electrical reference signal in the illumination unit, the reference signal being directly derived from the emitted light.
  • the scene is illuminated with modulated light emitted by an illumination unit and imaged onto an array of lock-in pixel sensor cells, which detect the previously emitted light after it has been scattered or reflected by an object or a living being in the scene.
  • We herein use the term "modulation phase" to designate the phase of the modulation and to avoid confusion with the phase of the electromagnetic waves forming the light.
  • the modulated light is emitted by a plurality of individual light emitting devices of the illumination unit, each of the light emitting devices emitting a part of the modulated light, and the reference modulation phase is provided as an average of the modulation phases of the parts of modulated light emitted by the light emitting devices.
  • There may be variations in the delay between the driving signal driving the light emitting devices and the emitted optical power.
  • an accuracy of the 3-D imaging system in the cm range is required. This implies that the required accuracy in the delay would have to be about 30 ps (out of 10 ns). If only one light emitting device were used as a source for the reference modulation phase, this accuracy would thus imply extremely severe requirements with regard to production tolerances, component ageing and temperature drifts.
  • one provides the reference modulation phase by gathering modulated light from the individual light emitting devices in the illumination unit.
  • the modulation phase of the gathered modulated light as a whole is then detected and used as reference modulation phase.
  • one detects the modulation phases of the parts of modulated light emitted by the light emitting devices individually; one then calculates an average of these modulation phases and uses it as reference modulation phase.
  • The average of the modulation phases of the parts of modulated light emitted by the light emitting devices preferably takes into account how much each light emitting device contributes to the illumination of the scene. If the individual light emitting devices contribute approximately equally to illuminating the scene, the modulation phases of the different parts of modulated light preferably contribute approximately equally to the average. If the individual light emitting devices contribute by different extents to illuminating the scene, the modulation phases of the different parts of modulated light preferably contribute by corresponding extents to the average (see the weighted-averaging sketch after this list).
  • the invention also concerns a 3-D imaging system configured for implementing the method described above.
  • a 3-D imaging system preferably comprises an illumination unit with a plurality of individual light emitting devices for emitting modulated light into the scene, at least one reference light sensor for detecting a reference modulation phase of light emitted at the illumination unit and a camera, e.g. a CMOS or CCD camera.
  • the camera includes an array of lock-in pixel sensor cells and an optical arrangement for imaging the scene onto the array of lock-in pixel sensor cells and determines, in operation, the modulation phase of light detected at the lock-in pixel sensor cells.
  • The imaging system further comprises an evaluation unit operationally connected to the at least one reference light sensor and the lock-in pixel sensor cell array for calculating depth information on the scene based upon the reference modulation phase and the modulation phase of the light detected at the lock-in pixel sensor cells (a sketch of such a per-pixel computation follows this list).
  • each individual light emitting device emits a part of the modulated light and the reference light sensor provides the reference modulation phase as an average of the modulation phases of the parts of modulated light emitted by the light emitting devices.
  • The reference light sensor is arranged in the illumination unit in the vicinity of the individual light emitting devices so as to be able to gather light from the individual light emitting devices.
  • the system may comprise a light guide arranged for gathering light from the individual light emitting devices and guiding the gathered light from the individual light emitting devices to the reference light sensor.
  • a light guide may, for instance, comprise an optical fibre.
  • the light guide comprises a substantially transparent plate portion (e.g. made of a thermoplastic material) arranged in front of the light emitting devices and a tubular light pipe portion connected to the plate portion and preferably made of the same material as the plate portion and formed as a single piece together therewith.
  • the 3-D imaging system may comprise a plurality of light guides arranged each for guiding light from an individual light emitting device to the reference light sensor.
  • the reference modulation phase is derived from the gathered light as a whole, which intrinsically provides the averaging.
  • the reference light sensor might e.g. comprise a photodiode, but preferably, it comprises one or more lock-in pixel sensor cells. These may be part of the camera, which implies that the same hardware is used for retrieving the reference modulation phase and the scene information. This is, of course, considered as particularly advantageous since the pixels used as reference light sensor thus are subject to the same ageing and temperature effects (thermal noise) as the pixels used for detecting light from the scene. The result of the method is thus expected to be even more accurate in this particular embodiment.
  • the evaluation unit may be configured for calculating the reference modulation phase as an average of modulation phases of light detected at the individual lock-in pixel sensor cells of the reference light sensor.
  • Fig.1 shows a first embodiment of a 3-D imaging system, generally identified by reference numeral 10.
  • the 3-D imaging system 10 comprises an illumination unit 12 emitting light onto a scene, and an imaging sensor 14 imaging the scene.
  • the imaging sensor 14 comprises, in a manner known per se, the required optical accessories such as a focusing lens (not shown) and an electronic camera chip executed in any suitable technology, such as CCD, CMOS and/or TFA. Accordingly, the imaging sensor 14 comprises a two-dimensional array of individual lock-in pixel sensor cells 16 (each of which receives light from a small portion of a target scene) for creating a pixel-by-pixel image thereof.
  • the illumination unit 12 comprises several individual light emitting devices 18 such as light emitting diodes (LEDs), which are collectively driven by means of an illumination driver 20.
  • a signal source 22 provides the input signals for the illumination driver 20 and a photo gate driver 24.
  • the output of the photo gate driver 24 is connected to the imaging sensor 14.
  • An evaluation unit 26 comprising a suitable electronic calculation device, e.g. a digital signal processor (DSP), is connected to the output of the imaging sensor 14.
  • In operation, the signal source 22 generates a modulation signal E1 on its output and feeds this modulation signal E1 to the illumination driver 20.
  • the latter drives the illumination unit 12 with a drive signal E2 to emit a temporally modulated light signal L1 onto a target scene comprising an object 30 (for illustration purposes).
  • Examples for temporal modulation of the light signal L1 are a sinusoidal amplitude modulation or a periodically pulsed emission scheme.
  • the modulated light signal L1 is reflected or scattered by the object 30 so as to form a returning light signal L2 which is received as incident light by the imaging sensor 14.
  • the modulation signal E1 is also fed to the photo gate driver 24 which transforms the modulation signal E1 into a demodulation signal E3.
  • the imaging sensor 14 receives this demodulation signal E3. By means of this demodulation signal E3 and the detected light signal L2, the imaging sensor 14 produces a phase information signal E4 which is fed to the evaluation unit 26 for extraction of the distance information concerning the object 30.
  • the 3-D imaging system 10 comprises a photodiode 28 mounted in the illumination unit 12 contiguous to the light emitting devices 18 in such a way that it gathers light from all the light emitting devices 18.
  • the photodiode 28 derives an output signal E5 from the emitted light by photoelectric conversion.
  • the electrical output signal E5 thus corresponds to an average of the optical signals received by the photodiode 28 from the individual light emitting devices.
  • a mixer element 32 receives the output signal E5 of the photodiode 28 and the demodulation signal E3 at its input and provides a reference modulation phase signal E6 at its output.
  • The mixer element 32 is designed to imitate, together with the photodiode 28, the function of a sensor cell 16 of the imaging sensor 14 (a small simulation of such a mixer-based reference path follows this list).
  • Fig. 2 shows another embodiment of a 3-D imaging system 10.
  • a single independent sensor cell 228 is placed in the illumination unit 12 in close proximity to its light emitting devices 18, in such a way that it gathers light from all the light emitting devices 18.
  • The sensor cell 228 is of essentially the same construction as the sensor cells 16 of the imaging sensor 14. Like the sensor cells 16, the sensor cell 228 is fed with the demodulation signal E3.
  • the instant 3-D imaging system does not require a mixer element since the sensor cell 228 directly provides an averaged electrical reference signal with the function of the synchronization signal E6.
  • the independent sensor cell 228 placed inside the illumination unit 12 provides a modulation phase reference essentially corresponding to a "zero" distance measurement.
  • the evaluation unit 26 determines distance information on the basis of the synchronization signal E6 from the sensor cell 228 and the modulation phase information signal E4 from the imaging sensor 14.
  • Fig. 3 shows yet another embodiment of a 3-D imaging system 10.
  • In this embodiment, a light guide 34 (e.g. an optical fibre, mirror and/or prism optics) gathers light from the individual light emitting devices 18 and guides it onto a dedicated area 328 of lock-in pixel sensor cells of the imaging sensor 14.
  • the lock-in pixel sensor cells of area 328 thus provide the averaged reference modulation phase signal E6.
  • the sensor cells of area 328 provide a modulation phase reference that essentially corresponds to the length of the light path in light guide 34.
  • the evaluation unit 26 determines distance information on the basis of the synchronization signal E6 from the sensor cells of area 328 and the modulation phase information signal E4 from the imaging sensor 14.
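
The following sketch (Python, with assumed numerical values that are not taken from this document) illustrates the phase-shift relations referred to above: the unambiguous range set by the modulation period, the conversion of a modulation phase difference into a one-way distance, and the round-trip time corresponding to one centimetre of distance.

```python
import math

C = 299_792_458.0  # speed of light in m/s


def unambiguous_range(f_mod_hz: float) -> float:
    """Maximum unambiguous one-way distance for a given modulation frequency."""
    return C / (2.0 * f_mod_hz)


def distance_from_phase(delta_phi_rad: float, f_mod_hz: float) -> float:
    """One-way distance corresponding to a modulation phase difference.

    The light travels to the target and back, so the round-trip delay is
    delta_phi / (2*pi*f_mod) and the one-way distance is half of c times that delay.
    """
    return C * delta_phi_rad / (4.0 * math.pi * f_mod_hz)


if __name__ == "__main__":
    f_mod = 20e6  # 20 MHz modulation frequency (assumed value for illustration)
    print(f"unambiguous range: {unambiguous_range(f_mod):.2f} m")
    print(f"distance at 90 deg phase shift: {distance_from_phase(math.pi / 2, f_mod):.3f} m")
    # The round trip over 1 cm takes 2 * 0.01 m / c, i.e. roughly 10^-11 s, which is
    # why timing accuracy of a few tens of picoseconds is needed for cm-level resolution.
    print(f"round-trip time over 1 cm: {2 * 0.01 / C * 1e12:.0f} ps")
```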
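
The weighted averaging of the per-LED modulation phases can be sketched as a circular (phasor) mean, with each light emitting device weighted by its share of the scene illumination; the phases and weights below are purely hypothetical.

```python
import cmath
import math


def reference_modulation_phase(phases_rad, weights=None):
    """Weighted average of per-LED modulation phases (phasor mean).

    Each LED contributes a unit phasor exp(j*phase) scaled by its weight; taking
    the angle of the summed phasor avoids wrap-around at the 0/2*pi boundary.
    """
    if weights is None:
        weights = [1.0] * len(phases_rad)  # equal contribution to the illumination
    total = sum(w * cmath.exp(1j * p) for p, w in zip(phases_rad, weights))
    return cmath.phase(total) % (2 * math.pi)


if __name__ == "__main__":
    phases = [0.110, 0.118, 0.102, 0.125]   # hypothetical per-LED modulation phases (rad)
    weights = [1.0, 1.0, 0.8, 1.2]          # hypothetical illumination contributions
    print(f"reference modulation phase: {reference_modulation_phase(phases, weights):.4f} rad")
```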
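
The document does not spell out how the modulation phase is obtained at an individual lock-in pixel; a common scheme (assumed here) samples the correlation at four phase offsets of 0, 90, 180 and 270 degrees. The distance then follows from the difference between the pixel phase and the averaged reference modulation phase, plus a calibration offset for any known internal light path such as the light guide 34. The modulation frequency and sample values below are assumptions.

```python
import math

C = 299_792_458.0  # speed of light in m/s


def phase_from_taps(a0, a1, a2, a3):
    """Modulation phase from four lock-in samples at 0, 90, 180 and 270 degrees."""
    return math.atan2(a3 - a1, a0 - a2) % (2 * math.pi)


def pixel_distance(pixel_phase, reference_phase, f_mod_hz, internal_offset_m=0.0):
    """Distance for one pixel from its phase and the averaged reference phase.

    internal_offset_m is a calibration constant compensating the known internal
    path of the reference light (e.g. the light guide feeding the reference cells).
    """
    delta = (pixel_phase - reference_phase) % (2 * math.pi)
    return C * delta / (4.0 * math.pi * f_mod_hz) + internal_offset_m


if __name__ == "__main__":
    f_mod = 20e6                                    # assumed modulation frequency
    ref = phase_from_taps(1.00, 0.70, 0.40, 0.70)   # hypothetical reference-cell samples
    pix = phase_from_taps(0.90, 0.50, 0.40, 0.80)   # hypothetical scene-pixel samples
    print(f"pixel distance: {pixel_distance(pix, ref, f_mod):.3f} m")
```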
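
Finally, the Fig. 1 reference path (photodiode 28 plus mixer element 32) can be simulated: each light emitting device contributes sinusoidally modulated optical power with its own driver-to-light delay, the photodiode sums these contributions, and I/Q mixing against the demodulation signal recovers a phase corresponding to the average of the individual delays. All delays, the modulation frequency and the noise level are illustrative assumptions.

```python
import numpy as np


def reference_phase_via_mixer(led_delays_s, f_mod_hz, n_samples=4096, seed=0):
    """Simulated photodiode-plus-mixer reference path.

    The photodiode output is the sum of the individually delayed LED modulation
    signals; mixing it with cosine and sine copies of the demodulation signal and
    averaging over whole modulation periods yields the averaged reference phase.
    """
    rng = np.random.default_rng(seed)
    t = np.arange(n_samples) / (f_mod_hz * 64)  # 64 samples per period, 64 whole periods
    optical = sum(1.0 + np.cos(2 * np.pi * f_mod_hz * (t - d)) for d in led_delays_s)
    optical = optical + rng.normal(0.0, 0.01, n_samples)  # a little detector noise
    i = np.mean(optical * np.cos(2 * np.pi * f_mod_hz * t))
    q = np.mean(optical * np.sin(2 * np.pi * f_mod_hz * t))
    return float(np.arctan2(q, i) % (2 * np.pi))


if __name__ == "__main__":
    f_mod = 20e6                                   # assumed modulation frequency
    delays = [9.8e-9, 10.1e-9, 10.3e-9, 9.9e-9]    # hypothetical per-LED delays (s)
    phi = reference_phase_via_mixer(delays, f_mod)
    ideal = 2 * np.pi * f_mod * np.mean(delays)
    print(f"recovered reference phase: {phi:.4f} rad (mean-delay phase: {ideal:.4f} rad)")
```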

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Electromagnetism (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Optical Radar Systems And Details Thereof (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Measurement Of Optical Distance (AREA)

Priority Applications (6)

Application Number Priority Date Filing Date Title
EP06120996A EP1903299A1 (de) 2006-09-20 2006-09-20 Verfahren und System zur Erfassung eines 3D-Bildes einer Szene
US12/441,827 US7952690B2 (en) 2006-09-20 2007-09-11 Method and system for acquiring a 3-D image of a scene
CNA2007800348145A CN101517355A (zh) 2006-09-20 2007-09-11 用于获取景物的3-d图像的方法和系统
PCT/EP2007/059507 WO2008034738A1 (en) 2006-09-20 2007-09-11 Method and system for acquiring a 3-d image of a scene
EP07803393.3A EP2074377B1 (de) 2006-09-20 2007-09-11 Verfahren und system zur aufnahme eines 3d-bildes einer szene
JP2009528678A JP2010504509A (ja) 2006-09-20 2007-09-11 シーン(情景)の3d画像を捕捉する方法及びシステム

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
EP06120996A EP1903299A1 (de) 2006-09-20 2006-09-20 Verfahren und System zur Erfassung eines 3D-Bildes einer Szene

Publications (1)

Publication Number Publication Date
EP1903299A1 true EP1903299A1 (de) 2008-03-26

Family

ID=37814471

Family Applications (2)

Application Number Title Priority Date Filing Date
EP06120996A Withdrawn EP1903299A1 (de) 2006-09-20 2006-09-20 Verfahren und System zur Erfassung eines 3D-Bildes einer Szene
EP07803393.3A Active EP2074377B1 (de) 2006-09-20 2007-09-11 Verfahren und system zur aufnahme eines 3d-bildes einer szene

Family Applications After (1)

Application Number Title Priority Date Filing Date
EP07803393.3A Active EP2074377B1 (de) 2006-09-20 2007-09-11 Verfahren und system zur aufnahme eines 3d-bildes einer szene

Country Status (5)

Country Link
US (1) US7952690B2 (de)
EP (2) EP1903299A1 (de)
JP (1) JP2010504509A (de)
CN (1) CN101517355A (de)
WO (1) WO2008034738A1 (de)

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2011145109A (ja) * 2010-01-12 2011-07-28 Topcon Corp 光波距離測定装置
WO2011144458A1 (en) * 2010-05-17 2011-11-24 Iee International Electronics & Engineering S.A. Imaging device for imaging a scene using time - of - flight and thermographic measurements
LU91715B1 (en) * 2010-08-04 2012-02-06 Iee Sarl Imaging device for imaging a scene operating according to the time-of-flight principle
WO2012123152A1 (de) * 2011-03-17 2012-09-20 Robert Bosch Gmbh Messvorrichtung zur messung einer entfernung zwischen der messvorrichtung und einem zielobjekt mit hilfe optischer messstrahlung
WO2013026678A1 (de) * 2011-08-25 2013-02-28 Ifm Electronic Gmbh Lichtlaufzeitkamerasystem mit signalpfadüberwachung
EP3007441A1 (de) * 2014-09-30 2016-04-13 Shenzhen Estar Technology Group Co., Ltd Interaktives Anzeigeverfahren, Steuerungsverfahren und System zur Erzielung der Anzeige eines holografischen Bildes
CN106652463A (zh) * 2016-11-02 2017-05-10 沈阳大学 一种3d摄像智能识别车辆装置
EP3182156A1 (de) * 2015-12-18 2017-06-21 STMicroelectronics (Research & Development) Limited Sortiervorrichtung

Families Citing this family (25)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10532617B2 (en) 2004-03-04 2020-01-14 Infineon Technologies Ag Apparatus and method for determining a state parameter of an object to be monitored
US9566836B2 (en) 2004-03-04 2017-02-14 Infineon Technologies Ag Apparatus and method for determining a state parameter of an object to be monitored
EP2264481A1 (de) 2009-06-04 2010-12-22 IEE International Electronics & Engineering S.A. Verfahren und Vorrichtung zum Erzeugen eines entfernungsaufgelösten Bildes
EP2275833A1 (de) 2009-07-01 2011-01-19 IEE International Electronics & Engineering S.A.R.L. Entfernungskamera und Entfernungsbildaufnahmeverfahren
KR101565969B1 (ko) 2009-09-01 2015-11-05 삼성전자주식회사 깊이 정보를 추정할 수 있는 방법과 장치, 및 상기 장치를 포함하는 신호 처리 장치
US20110210237A1 (en) * 2010-01-21 2011-09-01 Shmuel Sternklar Methods and systems for measuring the frequency response and impulse response of objects and media
DE102010043768B3 (de) 2010-09-30 2011-12-15 Ifm Electronic Gmbh Lichtlaufzeitkamera
KR101669412B1 (ko) * 2010-11-01 2016-10-26 삼성전자주식회사 3d 카메라를 위한 깊이 정보 측정 방법 및 장치
US8698084B2 (en) * 2011-03-10 2014-04-15 Sionyx, Inc. Three dimensional sensors, systems, and associated methods
WO2014043587A1 (en) * 2012-09-13 2014-03-20 California Institute Of Technology Coherent camera
US9879995B2 (en) 2012-09-13 2018-01-30 Laser Technology, Inc. System and method for superimposing a virtual aiming mechanism with a projected system beam in a compact laser-based rangefinding instrument
US9354051B2 (en) * 2012-09-13 2016-05-31 Laser Technology, Inc. System and method for a rangefinding instrument incorporating pulse and continuous wave signal generating and processing techniques for increased distance measurement accuracy
US9213101B2 (en) 2012-09-13 2015-12-15 Laser Technology, Inc. Self-aligned aiming system and technique for a laser rangefinder incorporating a retroreflector
CN103424748B (zh) * 2013-08-31 2015-02-11 王敏 光敏恒流电容积分面阵列距离传感器
US10203399B2 (en) 2013-11-12 2019-02-12 Big Sky Financial Corporation Methods and apparatus for array based LiDAR systems with reduced interference
US9360554B2 (en) 2014-04-11 2016-06-07 Facet Technology Corp. Methods and apparatus for object detection and identification in a multiple detector lidar array
DE102014111431A1 (de) 2014-08-11 2016-02-11 Infineon Technologies Ag Flugzeitvorrichtungen und eine Beleuchtungsquelle
US9897690B2 (en) 2014-10-27 2018-02-20 Laser Technology, Inc. Technique for a pulse/phase based laser rangefinder utilizing a single photodiode in conjunction with separate pulse and phase receiver circuits
US10036801B2 (en) 2015-03-05 2018-07-31 Big Sky Financial Corporation Methods and apparatus for increased precision and improved range in a multiple detector LiDAR array
CN107431080B (zh) * 2015-03-30 2020-07-07 株式会社尼康 拍摄元件及拍摄装置
US9866816B2 (en) 2016-03-03 2018-01-09 4D Intellectual Properties, Llc Methods and apparatus for an active pulsed 4D camera for image acquisition and analysis
US10890649B2 (en) * 2016-08-11 2021-01-12 Qualcomm Incorporated System and method for measuring reference and returned light beams in an optical system
DE102017101945A1 (de) * 2017-02-01 2018-08-02 Osram Opto Semiconductors Gmbh Messanordnung mit einem optischen Sender und einem optischen Empfänger
JP7082404B2 (ja) * 2018-06-26 2022-06-08 国立大学法人静岡大学 距離計測装置
WO2020243901A1 (en) 2019-06-04 2020-12-10 Texas Instruments Incorporated An optical time of flight sensor for navigation systems in robotic applications

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3813165A (en) * 1971-09-20 1974-05-28 Laser Syst & Electronics Inc Digital distance measuring apparatus employing modulated light beam
DE3219452A1 (de) * 1981-06-09 1983-03-17 Bölkow, Ludwig, Dipl-Ing. Dr., 8022 Grünwald Dynamik-steuerungsanordnung fuer ein entfernungsmessgeraet
DE4439298A1 (de) * 1994-11-07 1996-06-13 Rudolf Prof Dr Ing Schwarte 3D-Kamera nach Laufzeitverfahren
WO1998010255A1 (de) * 1996-09-05 1998-03-12 Rudolf Schwarte Verfahren und vorrichtung zur bestimmung der phasen- und/oder amplitudeninformation einer elektromagnetischen welle
WO2006097406A2 (en) * 2005-03-17 2006-09-21 Iee International Electronics & Engineering S.A. 3-d imaging system

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS61272677A (ja) * 1985-05-29 1986-12-02 Toshiba Corp 光路長測定器
JPS62115388A (ja) * 1985-11-14 1987-05-27 Matsushita Electric Works Ltd 測距型異常監視センサ
JPH0962985A (ja) * 1995-08-21 1997-03-07 Nippon Signal Co Ltd:The 車両検出装置
US5877851A (en) * 1997-09-24 1999-03-02 The United States Of America As Represented By The Secretary Of The Army Scannerless ladar architecture employing focal plane detector arrays and FM-CW ranging theory
JP3643794B2 (ja) * 2001-07-25 2005-04-27 三菱重工業株式会社 レーザを用いた形状計測方法及びシステム
EP1762862A1 (de) * 2005-09-09 2007-03-14 IEE INTERNATIONAL ELECTRONICS & ENGINEERING S.A. Verfahren und Vorrichtung zur 3D-Abbildung

Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8908156B2 (en) 2010-01-12 2014-12-09 Kabushiki Kaisha Topcon Electro-optical distance measuring device
JP2011145109A (ja) * 2010-01-12 2011-07-28 Topcon Corp 光波距離測定装置
WO2011144458A1 (en) * 2010-05-17 2011-11-24 Iee International Electronics & Engineering S.A. Imaging device for imaging a scene using time - of - flight and thermographic measurements
LU91715B1 (en) * 2010-08-04 2012-02-06 Iee Sarl Imaging device for imaging a scene operating according to the time-of-flight principle
WO2012123152A1 (de) * 2011-03-17 2012-09-20 Robert Bosch Gmbh Messvorrichtung zur messung einer entfernung zwischen der messvorrichtung und einem zielobjekt mit hilfe optischer messstrahlung
CN103443648A (zh) * 2011-03-17 2013-12-11 罗伯特·博世有限公司 用于借助光学测量射线来测量在测量装置和目标对象之间的距离的测量装置
US9348018B2 (en) 2011-03-17 2016-05-24 Robert Bosch Gmbh Measurement device for measuring a distance between the measurement device and a target object using an optical measurement beam
CN103443648B (zh) * 2011-03-17 2016-08-17 罗伯特·博世有限公司 用于借助光学测量射线来测量在测量装置和目标对象之间的距离的测量装置
WO2013026678A1 (de) * 2011-08-25 2013-02-28 Ifm Electronic Gmbh Lichtlaufzeitkamerasystem mit signalpfadüberwachung
EP3007441A1 (de) * 2014-09-30 2016-04-13 Shenzhen Estar Technology Group Co., Ltd Interaktives Anzeigeverfahren, Steuerungsverfahren und System zur Erzielung der Anzeige eines holografischen Bildes
EP3182156A1 (de) * 2015-12-18 2017-06-21 STMicroelectronics (Research & Development) Limited Sortiervorrichtung
US10261175B2 (en) 2015-12-18 2019-04-16 Stmicroelectronics (Research & Development) Limited Ranging apparatus
CN106652463A (zh) * 2016-11-02 2017-05-10 沈阳大学 一种3d摄像智能识别车辆装置

Also Published As

Publication number Publication date
EP2074377B1 (de) 2015-07-15
EP2074377A1 (de) 2009-07-01
US7952690B2 (en) 2011-05-31
JP2010504509A (ja) 2010-02-12
WO2008034738A1 (en) 2008-03-27
US20100045966A1 (en) 2010-02-25
CN101517355A (zh) 2009-08-26

Similar Documents

Publication Publication Date Title
EP2074377B1 (de) Verfahren und system zur aufnahme eines 3d-bildes einer szene
US7924411B2 (en) 3-D imaging system
US10261175B2 (en) Ranging apparatus
US7834985B2 (en) Surface profile measurement
EP2815251B1 (de) Tof-kamera mit streifenbeleuchtung
JP4405154B2 (ja) イメージングシステムおよび物体の画像を取得する方法
CA2650235C (en) Distance measuring method and distance measuring element for detecting the spatial dimension of a target
US20040118624A1 (en) CMOS camera with integral laser ranging and velocity measurement
US20120236290A1 (en) Optical Distance Measuring Device with Calibration Device
KR102559910B1 (ko) 차량 주변 환경을 특성화하기 위한 시스템
KR20160111030A (ko) 거리 측정 장치 및 시차 연산 시스템
EP2717069B1 (de) Verfahren zur Bestimmung und/oder Kompensation des Abstandsversatzes eines Abstandssensors
EP2264481A1 (de) Verfahren und Vorrichtung zum Erzeugen eines entfernungsaufgelösten Bildes
CA2716980C (en) Light-integrating rangefinding device and method
JP2018066609A (ja) 測距装置、監視カメラ、3次元計測装置、移動体、ロボット及び測距方法
EP2275833A1 (de) Entfernungskamera und Entfernungsbildaufnahmeverfahren
JP6864599B2 (ja) レーザー距離計測装置
GB2374743A (en) Surface profile measurement
US20210055406A1 (en) Object sensing device
US20180246192A1 (en) Information processing device, control method, program, and storage medium
EP3069167B1 (de) Verfahren zur beleuchtung eines objekts und zur bestimmung einer entfernung
CN111902733A (zh) 距离测量装置、距离测量系统、距离测量方法和程序

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LI LT LU LV MC NL PL PT RO SE SI SK TR

AX Request for extension of the european patent

Extension state: AL BA HR MK YU

AKX Designation fees paid
REG Reference to a national code

Ref country code: DE

Ref legal event code: 8566

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20080927