WO2012013486A1 - Method and system for calibrating a multi-view three-dimensional camera - Google Patents

Method and system for calibrating a multi-view three-dimensional camera

Info

Publication number
WO2012013486A1
Authority
WO
WIPO (PCT)
Prior art keywords
depth map
camera
hollow body
inner shape
correction parameters
Prior art date
Application number
PCT/EP2011/061819
Other languages
English (en)
Inventor
Varun Akur Venkatesan
Antony Louis Piriyakumar Douglas
Original Assignee
Siemens Aktiengesellschaft
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.): 2010-07-27
Filing date: 2011-07-12
Publication date: 2012-02-02
Application filed by Siemens Aktiengesellschaft
Publication of WO2012013486A1


Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/06Systems determining position data of a target
    • G01S17/08Systems determining position data of a target for measuring distance only
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88Lidar systems specially adapted for specific applications
    • G01S17/89Lidar systems specially adapted for specific applications for mapping or imaging
    • G01S17/8943D imaging with simultaneous measurement of time-of-flight at a 2D array of receiver pixels, e.g. time-of-flight cameras or flash lidar
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/483Details of pulse systems
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/204Image signal generators using stereoscopic image cameras
    • H04N13/246Calibration of cameras
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/271Image signal generators wherein the generated image signals comprise depth maps or disparity maps

Definitions

  • This invention relates to a method and a system for calibrating a multi-view three dimensional camera.
  • a lens of the camera gathers the reflected electromagnetic pulses from the various surfaces of the object in the scene and generates a depth map of the whole scene. Depending on the distance travelled, each electromagnetic pulse experiences a delay when captured by the lens of the camera.
  • a 360 degree lens system contains lenses as well as curved mirrors, combining refraction through the lenses and reflection off the curved mirrors to gather the reflected electromagnetic pulses from all angles of the scene.
  • a multi-view three dimensional camera based on the time-of-flight principle is used.
  • the multi-view three dimensional camera uses the 360 degree lens system to gather the reflected electromagnetic pulses from various angles of the scene to generate the depth map.
  • the 360 degree lens system tends to distort the path length of the reflected electromagnetic pulse as the pulse passes through the 360 degree lens system.
  • the depth map generated by the time-of-flight camera using the 360 degree lens system therefore gets distorted.
  • the object of the invention is achieved by a method according to claim 1 and a system according to claim 10.
  • the underlying idea of the invention is to generate a depth map using a 360 degree lens system representing a multi-view of a hollow body with a known inner shape, and to compare this depth map with the known inner shape in order to determine, for a set of undetermined parameters of the depth map, correction parameters that compensate the distortion of the electromagnetic pulse path length caused by the 360 degree lens system. This allows the multi-view three dimensional camera to be calibrated easily.
  • the correction parameters are determined such that the depth map corrected with the correction parameters corresponds to the known inner shape; the compensation parameters thus follow directly from the comparison of the depth map with the known inner shape.
  • Such compensation parameters calibrate the multi-view three dimensional camera more precisely.
  • the correction parameters compensate a difference offset of the path length of an electromagnetic pulse for each pixel of the depth map, the difference offset being the difference between the path length measured with the 360 degree lens system and the path length without it.
  • the depth map generated from the electromagnetic pulse after correction will correspond to the known inner shape.
  • the method includes determining the correction parameters for every pixel of the depth map, so that correction parameters are available for the entire depth map and the entire depth map can be corrected at once, yielding a corrected depth map that corresponds to the inner shape of the hollow body.
  • the matching algorithm is based on a linear shift or a spline transformation of the depth map, or a combination thereof.
  • Such matching algorithms are easy to implement, as the concepts of linear shift and spline transformation are generally known.
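As an illustration of the two transformations named above, the following is a minimal sketch in Python/NumPy. It assumes the depth map is a 2-D array of per-pixel path lengths and that the optical axis passes through the image centre; the function names and the SciPy spline choice are illustrative assumptions, not part of the patent.

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

def linear_shift(depth_map, offset):
    """Shift every path length in the depth map by a constant offset (same units as the map)."""
    return depth_map + offset

def radial_spline_correction(depth_map, knot_radii, knot_offsets):
    """Correct the depth map with a smooth spline of the radial pixel coordinate.

    knot_radii / knot_offsets define the correction at a few sample radii;
    intermediate radii are interpolated by a cubic spline.
    """
    h, w = depth_map.shape
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0        # assumed position of the optical axis
    yy, xx = np.mgrid[0:h, 0:w]
    radius = np.hypot(yy - cy, xx - cx)          # radial distance of each pixel, in pixels
    spline = UnivariateSpline(knot_radii, knot_offsets, k=3, s=0)
    correction = spline(radius.ravel()).reshape(depth_map.shape)
    return depth_map + correction
```

For example, `radial_spline_correction(dm, [0, 100, 200, 300], [0.0, 0.02, 0.05, 0.04])` would add a 2 cm path-length correction to pixels lying 100 pixels away from the image centre (assuming metre units).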
  • the inner shape of the hollow body is a cylinder or a hemisphere. This makes the method easy to implement and use, as hollow bodies with such inner shapes are readily available.
  • the hollow body is a room of known shape, so the room itself is used as the hollow body for calibrating the camera. This removes the need for a separate hollow body and also provides a way to calibrate the camera when the separate hollow body normally used for calibration is lost.
  • placing the camera in a predetermined position relative to the hollow body helps to determine the correction parameters easily and quickly, as the time the processor would otherwise spend calculating the position of the camera with respect to the surrounding hollow body is saved.
  • the shape of the hollow body has a symmetry axis and the hollow body is placed coaxially with the optical axis of the camera.
  • the symmetry axis is easy to locate, so a user can easily place the camera relative to the surrounding hollow body.
  • the system includes a holder for receiving the hollow body, so that the hollow body can be supported during the calibration of the camera.
  • FIG 1 shows a schematic diagram of a system for calibrating a multi-view three dimensional camera.
  • FIG 2 shows a flowchart for a matching algorithm used for determining the correction parameters by matching the depth map to the known inner shape.
  • a camera includes an electromagnetic pulse source which emits an electromagnetic pulse that is captured back by the camera after being reflected by a surface of a known inner shape, so as to generate a multi-view depth map of the known inner shape representing views from various angles of that shape. As the electromagnetic pulse passes through the 360 degree lens system, however, the path length of the electromagnetic pulse gets distorted, resulting in a change in the path length. To calibrate the camera so that the generated depth map corresponds to the known inner shape once the path-length distortion is compensated, a system is illustrated in FIG 1.
  • in FIG 1, a system is exemplified showing a multi-view three dimensional camera 4 with a 360 degree lens system 8 for generating a multi-view depth map 6, and a hollow body 10 with a known inner shape 12.
  • the camera 4 generates the multi-view depth map 6 of the inner shape 12 and a processor 22 receives the depth map 6 from the camera 4 to calibrate the camera 4 by determining correction parameters 14 for a set of undetermined parameters 26 for each pixel 16 of the depth map 6, such that the depth map 6, once corrected using the correction parameters 14, corresponds to the known inner shape 12.
  • the undetermined parameters 26 are based on various physical properties of the camera 4, the hollow body 10, the known inner shape 12 of the hollow body 10, the interrelation between the hollow body 10 and the camera 4, or a combination thereof.
  • the camera 4 also includes a source of electromagnetic pulses which emits an electromagnetic pulse such as a light pulse, a laser pulse or any other pulse of electromagnetic nature.
  • the 360 degree lens system 8 distorts the path length of the electromagnetic pulse by elongating or shortening it.
  • the processor 22 takes the depth map 6 as an input, processes the depth map 6 and determines the correction parameters 14.
  • the depth map 6 on correction using the correction parameters 14 represents the known inner shape 12.
  • the correction parameters 14 on being used with the depth map 6 compensate a difference offset of the path length of the electromagnetic pulse for each pixel 16 of the depth map 6, wherein the difference offset is defined by a difference in the path lengths of the electromagnetic pulse when the 360 degree lens system 8 is used and when the 360 degree lens system 8 is not used.
  • the correction parameters 14 compensate a difference offset of the path length of the electromagnetic pulse for each pixel 16 of the depth map 6 by compensating intensity of each pixel 16 of the depth map 6.
  • the correction parameters 14 can also compensate the difference offset of the path length of the electromagnetic pulse for each pixel 16 of the depth map 6 by compensating the frequency or wavelength, or a combination of intensity, wavelength and frequency, of each pixel 16 of the depth map 6.
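The difference offset described in the preceding bullets can be illustrated with a short sketch. It assumes the depth map stores metric path lengths per pixel; compensating raw time-of-flight intensity, frequency or wavelength values would follow the same pattern.

```python
import numpy as np

def difference_offset(depth_with_lens, depth_without_lens):
    """Per-pixel difference offset: path length measured through the 360 degree lens
    system minus the path length measured without it (or the known true path length)."""
    return np.asarray(depth_with_lens) - np.asarray(depth_without_lens)

def apply_correction(depth_map, correction_parameters):
    """Remove the stored per-pixel offsets so the corrected map matches the known inner shape."""
    return np.asarray(depth_map) - np.asarray(correction_parameters)
```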
  • the depth map 6 comprises various pixels 16, and each pixel 16 is generated by a different electromagnetic pulse. In general, the electromagnetic pulse source emits the electromagnetic pulses in different directions.
  • each electromagnetic pulse travels a different distance and generates a pixel 16 on the depth map 6 on the basis of the distance it has travelled.
  • the correction parameters 14 can be determined for each of the pixels 16 on the depth map 6, as the electromagnetic pulses corresponding to each of the pixels 16 have been distorted differently by the 360 degree lens system 8.
  • the correction parameters 14 can be determined as a function to represent correction parameters 14 for each of the pixels 16 generated from the electromagnetic pulses distorted differently by the 360 degree lens system 8.
  • the function for the correction parameters 14 can be an algebraic function, a vector function, or another suitable function.
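Instead of storing one correction value per pixel, a single function can represent the correction parameters for all pixels, as the bullets above suggest. A possible sketch follows, assuming a polynomial in the radial pixel coordinate; the polynomial model is an illustrative choice, not prescribed by the patent.

```python
import numpy as np

def fit_radial_correction(radius, offsets, degree=4):
    """Fit a polynomial correction(r) to the per-pixel difference offsets.

    radius  : radial pixel coordinate of each pixel (any shape)
    offsets : measured difference offset of each pixel (same shape)
    Returns a callable that yields the correction for any radius.
    """
    r, o = np.ravel(radius), np.ravel(offsets)
    keep = np.isfinite(o)                      # ignore pixels with no valid offset
    coeffs = np.polyfit(r[keep], o[keep], deg=degree)
    return np.poly1d(coeffs)
```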
  • the correction parameters 14 are determined by matching the depth map 6 to the known inner shape 12 by means of a matching algorithm.
  • the matching algorithm takes the depth map 6 and spatial coordinates referring to the known inner shape 12 as inputs and matches those spatial coordinates with the pixels 16 in the depth map 6 that correspond to them.
  • Using the matching algorithm, the processor 22 determines the correction parameters 14.
  • the known inner shape 12 of the hollow body 10 is a cylinder.
  • the inner shape 12 of the cylinder is known through the dimensions of the cylinder, i.e., its radius and height.
  • the cylinder 10 has a regular shape, which makes the calculations fast while determining the correction parameters 14.
  • Location and orientation of the camera 4 can be calculated by measuring the spatial distances travelled by the electromagnetic pulse and using these spatial distances along with the dimensions of the cylinder.
  • the processor 22 calculates the correction parameters 14.
  • the known inner shape 12 can be a hemisphere.
  • the shape of the hemisphere is known through a dimension of the hemisphere, i.e., a radius of the hemisphere.
  • the processor 22 takes the radius into consideration to determine the correction parameters 14, using the spatial distance travelled by the electromagnetic pulse along with the radius of the hemisphere.
  • the orientation and the location of the camera 4 into the surrounding of the hemisphere can also be determined using the spatial distance travelled by the electromagnetic pulse and the radius of the hemisphere.
  • the known inner shape 12 can be a cuboid, a cube, a trapezium or any other known shape whose dimensions are known; the correction parameters 14 can then be determined by the processor 22 using the dimensions of the inner shape 12 and the spatial distances travelled by the electromagnetic pulse to various parts of the inner shape 12 of the hollow body 10.
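For such known inner shapes the processor 22 needs reference path lengths to compare against. The following is a sketch of the ideal values for a cylinder and a hemisphere, assuming the camera sits on the symmetry axis (on the cylinder axis, or at the centre of the hemisphere's flat face) and each pixel maps to a known polar angle theta from the optical axis; end caps and the floor are ignored in this simplification.

```python
import numpy as np

def expected_depth_cylinder(theta, radius):
    """Ideal path length from a camera on the cylinder axis to the side wall,
    for a ray at polar angle theta (radians) from the symmetry/optical axis.
    Rays nearly parallel to the axis never reach the side wall and are masked out."""
    theta = np.asarray(theta, dtype=float)
    sin_t = np.sin(theta)
    with np.errstate(divide="ignore"):
        d = radius / sin_t
    return np.where(sin_t > 1e-6, d, np.nan)

def expected_depth_hemisphere(theta, radius):
    """With the camera at the centre of the hemisphere, every ray travels exactly the radius."""
    return np.full_like(np.asarray(theta, dtype=float), float(radius))
```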
  • the known inner shape 12 can be a room of known dimension and the camera 4 can be placed inside the room to determine the correction parameters 14.
  • the dimensions of the room can be obtained from an architectural plan of the room, and the data relating to the dimensions of the room can be fed into the processor 22.
  • the processor 22 then determines the correction parameters 14 taking the dimensions of the room into consideration.
  • the orientation and the location of the camera 4 within the surrounding room can also be determined.
  • the camera 4 is placed in a predetermined position relative to the hollow body 10. If the predetermined position is known, the determination of the correction parameters 14 by the processor 22 becomes even faster, because the orientation and the location of the camera 4 do not need to be computed; these data are already available from knowledge of the predetermined position.
  • the camera 4 need not be placed in a predetermined position; it can instead be placed arbitrarily with respect to the surrounding hollow body 10, and the correction parameters 14 are calculated by the processor 22 using the matching algorithm.
  • the hollow body 10 has a symmetry axis 18 and the camera 4 is placed in the surrounding of the hollow body 10 so that the optical axis 20 of the camera 4 and the symmetry axis 18 are coaxial. Placing the camera 4 in such a way helps to place the camera 4 in the predetermined position, since when the camera 4 is placed coaxially with the hollow body 10, the orientation and the location of the camera 4 are easily known due to the symmetry of the hollow body 10.
  • the hollow body 10 need not have a symmetry axis 18; the camera 4 is then placed in a predetermined position relative to the hollow body 10.
  • the hollow body 10 and the camera 4 are both movable, so that a desired position of the camera 4 and the hollow body 10 relative to each other can be attained.
  • alternatively, only one of the camera 4 and the hollow body 10 is movable to attain the desired relative position of the camera 4 and the hollow body 10.
  • the hollow body 10 is placed on a holder 24, so that the hollow body 10 can be easily placed and retained in a desired position.
  • the holder 24 can instead be used to hold the camera 4 when the camera 4 is kept fixed and the hollow body 10 is movable. Yet alternatively, the holder 24 can be provided to keep both the hollow body 10 and the camera 4 in the desired position relative to each other.
  • the holder 24 provides the flexibility to move the hollow body 10 rotationally and translationally in three dimensional space. While moving the hollow body 10, once the hollow body 10 has attained the desired position with respect to the camera 4, the movement of the hollow body 10 is locked using a locking mechanism.
  • the holder 24 can provide resistive movement of the hollow body 10 by means of a resistive movement mechanism, so that the hollow body 10 can be moved into the desired position easily and quickly with greater precision.
  • the 360 degree lens system 8 includes a combination of lenses and curved mirrors to provide the multi-view depth map 6 of the known inner shape 12, so that a sectional view of a part of the known inner shape 12 is generated. In an alternate embodiment, the 360 degree lens system 8 generates a 360 degree view depth map 6 of the known inner shape 12.
  • the processor 22 receives the multi-view depth map 6.
  • the processor 22 can be a general purpose computer, for example one with a central processing unit (CPU), and compares the depth map 6 with the known inner shape 12 to determine the correction parameters 14. While generating the correction parameters 14, the processor 22 uses the matching algorithm.
  • FIG 2 illustrates a matching algorithm used for determining the correction parameters by matching a depth map of a known inner shape, while calibrating a multi-view three dimensional camera.
  • the intensities of the pixels on the depth map are compensated using a linear shift or a spline transform, or any other such transform, on the basis of the path length of the electromagnetic pulse.
  • the matching algorithm comprises the following steps; a code sketch follows the step listing.
  • In step 102, a set of undetermined parameters, based on various physical properties of the camera, the hollow body, the known inner shape of the hollow body, the interrelation between the hollow body and the camera, or a combination thereof, is chosen for each of the pixels on the depth map of the known inner shape of the hollow body.
  • In step 104, the intensities of the pixels are transformed radially for the set of undetermined parameters.
  • In step 106, a transformed depth map is produced using the intensities transformed in step 104.
  • In step 108, the depth maps of the known inner shape of the hollow body are compared at various relative positions of the camera with respect to the hollow body to obtain the best match.
  • In step 110, the undetermined parameters are changed if the match between the depth map and the known inner shape of the hollow body is not appropriate, and steps 104 to 110 are iterated until the match is appropriate.
  • In step 112, on finding an appropriate match, the undetermined parameters are saved as the correction parameters used to transform the intensities of the pixels on the depth map, yielding an accurate depth map.
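A toy version of the matching loop of FIG 2 is sketched below. It assumes the correction is modelled as a polynomial in the radial pixel coordinate and that the camera pose is already fixed (e.g. coaxial placement), so the search over relative camera positions in step 108 reduces to a single comparison; the use of SciPy's Nelder-Mead optimiser to iterate steps 104 to 110 is an implementation choice, not part of the patent.

```python
import numpy as np
from scipy.optimize import minimize

def calibrate(measured_depth, radius_px, expected_depth, n_params=4):
    """Determine correction parameters by matching the depth map to the known inner shape.

    measured_depth : 2-D array of per-pixel path lengths from the camera (depth map 6)
    radius_px      : 2-D array of radial pixel coordinates (distance from the optical axis)
    expected_depth : 2-D array of ideal path lengths for the known inner shape 12
    Returns polynomial coefficients playing the role of the correction parameters 14.
    """
    valid = np.isfinite(expected_depth)

    def corrected(params):
        # steps 104/106: radially transform the measured depths with the current guess
        return measured_depth - np.polyval(params, radius_px)

    def mismatch(params):
        # step 108: compare the transformed depth map with the known inner shape
        residual = corrected(params)[valid] - expected_depth[valid]
        return float(np.mean(residual ** 2))

    initial_guess = np.zeros(n_params)            # step 102: choose undetermined parameters
    result = minimize(mismatch, initial_guess,    # step 110: change them until the match fits
                      method="Nelder-Mead")
    return result.x                               # step 112: save them as correction parameters
```

A reference map such as the one produced by `expected_depth_cylinder` in the earlier sketch could serve as `expected_depth` here.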

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Electromagnetism (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention relates to a method for calibrating a multi-view three-dimensional camera (4) having a time-of-flight camera (4) for generating a three-dimensional depth map (6) via a 360-degree lens system (8), which method comprises arranging the camera (4) relative to a surrounding hollow body (10) of a known inner shape (12), generating the depth map (6) of the known inner shape (12) of the hollow body (10), and comparing the depth map (6) with the known inner shape (12) in order to determine correction parameters (14) for the depth map (6).
PCT/EP2011/061819 2010-07-27 2011-07-12 Method and system for calibrating a multi-view three-dimensional camera WO2012013486A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
IN828KO2010 2010-07-27
IN828/KOL/2010 2010-07-27

Publications (1)

Publication Number Publication Date
WO2012013486A1 true WO2012013486A1 (fr) 2012-02-02

Family

ID=44318180

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2011/061819 WO2012013486A1 (fr) 2010-07-27 2011-07-12 Method and system for calibrating a multi-view three-dimensional camera

Country Status (1)

Country Link
WO (1) WO2012013486A1 (fr)


Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070061040A1 (en) * 2005-09-02 2007-03-15 Home Robots, Inc. Multi-function robotic device
EP2073035A1 (fr) * 2007-12-18 2009-06-24 IEE INTERNATIONAL ELECTRONICS & ENGINEERING S.A. Enregistrement d'images 3D d'une scène

Non-Patent Citations (5)

* Cited by examiner, † Cited by third party
Title
ALLESSANDRO BEVILACQUA ET AL: "People Tracking Using a Time-of-Flight Depth Sensor", IEEE INTERNATIONAL CONFERENCE ON VIDEO AND SIGNAL BASED SURVEILLANCE, 2006. AVSS '06, IEEE, 1 November 2006 (2006-11-01), pages 1 - 5, XP002509695, ISBN: 978-0-7695-2688-1 *
J.A. BERALDIN, S.F. EL HAKIM, L. COURNOYER: "practical range camera calibration", SPIE VIDEOMETRICS II, 1993, pages 21 - 31, XP002656448 *
KAHLMANN T ET AL: "Calibration of the fast range imaging camera SwissRanger(TM) for the use in the surveillance of the environment", PROCEEDINGS OF SPIE, THE INTERNATIONAL SOCIETY FOR OPTICAL ENGINEERING SPIE, USA, vol. 6396, 1 January 2006 (2006-01-01), pages 639605 - 1, XP002539116, ISSN: 0277-786X, DOI: 10.1117/12.684458 *
STEFAN MAY, DAVID DROESCHEL,DIRK HOLZ,CHRISTOPH WIESEN: "3D pose estimation and mapping with time-of-flight cameras", IEEE/RSJ INT. CONF. ON INTELLIGENT ROBOTS AND SYSTEMS IROS, 2008 - 2008, Nice, France, XP002656447, Retrieved from the Internet <URL:www.robotic.de/fileadmin/robotic/fuchs/iros08_3dcam.pdf> [retrieved on 20110804] *
XIAOFENG LIAN ET AL: "Reconstructing indoor environmental 3D model using laser range scanners and omnidirectional camera", INTELLIGENT CONTROL AND AUTOMATION, 2008. WCICA 2008. 7TH WORLD CONGRESS ON, IEEE, PISCATAWAY, NJ, USA, 25 June 2008 (2008-06-25), pages 1640 - 1644, XP031302428, ISBN: 978-1-4244-2113-8 *

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104284173A (zh) * 2013-07-10 2015-01-14 HTC Corporation Method and electronic device for generating multi-viewpoint video
US10141022B2 (en) 2013-07-10 2018-11-27 Htc Corporation Method and electronic device for generating multiple point of view video
CN110506297A (zh) * 2017-04-17 2019-11-26 Cognex Corporation High-precision calibration system and method
CN110506297B (zh) * 2017-04-17 2023-08-11 Cognex Corporation High-precision calibration system and method
US11682131B2 (en) * 2017-10-27 2023-06-20 Canon Kabushiki Kaisha Image capturing apparatus and method of controlling image capturing apparatus

Similar Documents

Publication Publication Date Title
JP7519126B2 (ja) Augmented reality display with active alignment and corresponding method
US11002537B2 (en) Distance sensor including adjustable focus imaging sensor
US10582188B2 (en) System and method for adjusting a baseline of an imaging system with microlens array
US9211643B1 (en) Automatic in-situ registration and calibration of robotic arm/sensor/workspace system
US9594167B2 (en) Geodetic referencing of point clouds
US11307031B2 (en) Surveying device, and calibration checking method and calibration checking program for surveying device
US9377298B2 (en) Surface determination for objects by means of geodetically precise single point determination and scanning
CN105424005B (zh) Measuring device having a function for calibrating focusing optical unit positions to be set in a distance-dependent manner
US20160103209A1 (en) Imaging device and three-dimensional-measurement device
CN109729721A (zh) Optical ranging method and optical ranging device
EP3123445A1 (fr) Computational array camera with dynamic illumination for eye tracking
JP2019100915A (ja) Surveying device, calibration method for surveying device, and calibration program for surveying device
WO2012013486A1 (fr) Method and system for calibrating a multi-view three-dimensional camera
She et al. Adjustment and calibration of dome port camera systems for underwater vision
CN112686961A (zh) Method and device for correcting depth camera calibration parameters
RU2567126C1 (ru) Device for forming an infrared image
JP2019500606A5 (fr)
CN111044039A (zh) IMU-based monocular target-region adaptive high-precision ranging device and method
JP2014178124A (ja) Three-dimensional measurement system, program and method
JP7228294B2 (ja) Projector control device, projector, projection system, projection method and program
RU152545U1 (ru) Device for forming an infrared image
CN115077468B (zh) Zoom ranging method and device
JP2018063161A (ja) Information processing apparatus, control method for information processing apparatus, and program
KR101630916B1 (ko) Stereo mobile 3D camera
NL2002406C2 (en) Optical range finder and imaging apparatus.

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 11735624

Country of ref document: EP

Kind code of ref document: A1

DPE1 Request for preliminary examination filed after expiration of 19th month from priority date (pct application filed from 20040101)
NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 11735624

Country of ref document: EP

Kind code of ref document: A1