WO2015178575A1 - Appareil pour l'acquisition d'une image de temps de vol en trois dimensions - Google Patents

Appareil pour l'acquisition d'une image de temps de vol en trois dimensions

Info

Publication number
WO2015178575A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
viewing angle
light source
time
dimensional
Prior art date
Application number
PCT/KR2015/002711
Other languages
English (en)
Korean (ko)
Inventor
이민구
한승헌
Original Assignee
주식회사 더에스
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 주식회사 더에스
Publication of WO2015178575A1

Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B7/00Mountings, adjusting means, or light-tight connections, for optical elements
    • G02B7/02Mountings, adjusting means, or light-tight connections, for optical elements for lenses
    • G02B7/04Mountings, adjusting means, or light-tight connections, for optical elements for lenses with mechanism for focusing or varying magnification
    • G02B7/09Mountings, adjusting means, or light-tight connections, for optical elements for lenses with mechanism for focusing or varying magnification adapted for automatic focusing or varying magnification

Definitions

  • the present invention relates to a three-dimensional time-of-flight image acquisition apparatus, and more particularly, to a three-dimensional time-of-flight image acquisition apparatus that can automatically adjust the viewing angle of the light source and the focus of the light reflected from the object.
  • the conventional stereoscopic three-dimensional scanner uses two cameras: the two captured images are compared against two reference coordinate systems to correct radial distortion over the field of view, and stereo matching and triangulation-based image processing are then used to construct a 3D depth image map.
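  • For context on why this stereoscopic approach is computationally heavy, the depth recovered by a two-camera setup follows the standard triangulation relation below (a textbook formula, not part of the present disclosure), where $B$ is the baseline between the two cameras, $f$ the focal length, and $d$ the disparity found by stereo matching:

$$Z = \frac{f \cdot B}{d}$$

The per-pixel search for the disparity $d$ (stereo matching) is what dominates the processing time mentioned below.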
  • the Kinect-sensor-based 3D scanner projects a pattern of small light spots from its light source and measures distance using the difference between a reference pattern and the measured pattern.
  • the stereoscopic 3D scanner, however, takes a long time to process data, and many factors must be considered, such as power consumption, the influence of external light sources, and the need for initial correction based on surface fringes.
  • the Kinect-sensor-based 3D scanner is also inconvenient in that a correction process for generating a reference pattern must be performed for each environment, and, because it relies on an algorithm that converts light-intensity information into distance information, its use is restricted outdoors where the ambient light changes severely.
  • the technical problem to be solved by the present invention is to provide a three-dimensional time-of-flight image acquisition apparatus that can increase the accuracy of the acquired three-dimensional time-of-flight image.
  • Another technical problem to be solved by the present invention is to provide a compact, handheld three-dimensional time-of-flight image acquisition apparatus that is easy to use and can be used outdoors.
  • Another technical problem to be solved by the present invention is to provide a three-dimensional time-of-flight image acquisition apparatus whose light can reach distant objects by adjusting the viewing angle of the light source.
  • according to an embodiment of the present invention, a three-dimensional time-of-flight image acquisition apparatus for obtaining a three-dimensional time-of-flight image of an object includes: a light source for irradiating light toward the object; a viewing angle adjusting unit configured to adjust the viewing angle of the light source according to the distance between the object and the apparatus; an adjustment lens unit for adjusting focus according to the distance between the object and the apparatus, wherein the light reflected from the object is received by the imaging device via the adjustment lens unit; an imaging device configured to receive the light reflected from the object and generate raw image data; and a signal processor configured to receive the raw image data from the imaging device and process it to generate a three-dimensional depth image and a three-dimensional intensity image.
  • since the viewing angle of the light source can be adjusted, and the focus can be automatically adjusted so that the light reflected from the object is focused on the imaging device, the accuracy of the obtained three-dimensional time-of-flight image can be increased.
  • since the time-of-flight method is applied, a three-dimensional time-of-flight image of the object can be obtained with a simpler configuration than conventional methods.
  • FIG. 1 is a view showing the configuration of a three-dimensional time-of-flight image acquisition apparatus according to an embodiment of the present invention.
  • FIG. 2 is a view illustrating the viewing angle of the light source being adjusted by the viewing angle adjusting unit in the 3D time-of-flight image acquisition apparatus of FIG. 1.
  • FIG. 3 is a view illustrating the focus of the reflected light being adjusted by the adjustment lens unit in the 3D time-of-flight image acquisition apparatus of FIG. 1.
  • FIG. 4 is a flowchart illustrating a viewing angle adjustment function of a 3D time-of-flight image acquisition apparatus according to an embodiment of the present invention.
  • FIG. 5 is a flowchart illustrating a focus adjustment function of the 3D time-of-flight image acquisition apparatus according to an embodiment of the present invention.
  • FIG. 6 is a diagram illustrating the configuration of a 3D time-of-flight image acquisition system including the 3D time-of-flight image acquisition apparatus of FIG. 1.
  • FIG. 7 is a diagram illustrating a 3D depth image displayed on a display unit of the 3D time-of-flight image acquisition system of FIG. 6.
  • FIG. 8 is a diagram illustrating a 3D intensity image displayed on a display unit of the 3D time-of-flight image acquisition system of FIG. 6.
  • FIG. 9 illustrates an example of an electronic device to which the 3D time-of-flight image acquisition apparatus of FIG. 1 is applied.
  • Referring to FIG. 1, a configuration of a three-dimensional time-of-flight image acquisition apparatus according to an embodiment of the present invention is disclosed.
  • Referring to FIG. 2, it is disclosed that the viewing angle of the light source is adjusted by the viewing angle adjusting unit in the apparatus of FIG. 1, and referring to FIG. 3, it is disclosed that the focus of the light reflected from the object is adjusted by the adjustment lens unit in the same apparatus.
  • the 3D time-of-flight image acquisition apparatus 1 may include a light source 10, a viewing angle adjusting unit 11, an adjustment lens unit 21, an imaging device 20, and a signal processor 30.
  • however, the present invention is not limited thereto, and the apparatus 1 may further include a light source lens 12, a light source driver 13, a filter 22, a light-receiving lens 23, a modulation driver 24, a controller 40, and a motion sensor 50.
  • the light source 10 is controlled by the light source driver 13, and the light irradiated from the light source 10 may reach the object 100 via the viewing angle adjusting unit 11 and the light source lens 12.
  • the light source 10 may irradiate light toward the object 100.
  • the light source 10 may be, for example, a light emitting diode (LED) or a laser diode (LD), and specifically, the light emitting diode or the laser diode may be arranged in an array form, but is not limited thereto.
  • the light source 10 may irradiate near-infrared (NIR) light having a wavelength of about 850 nm so as not to be harmful to a person, but is not limited thereto.
  • since near-infrared light is used, the three-dimensional time-of-flight image acquisition apparatus 1 can be used both day and night. Furthermore, when the light source 10 irradiates near-infrared light with a wavelength of about 850 nm, the reception sensitivity of the imaging device 20, which is a silicon-based CMOS image device, is excellent.
  • the light source 10 may be controlled by the light source driver 13, and the light irradiated toward the object 100 may have the form of a periodic continuous function with a predetermined period, but is not limited thereto.
  • the irradiated light may have a specifically defined waveform such as a sine wave, a triangular wave, a sawtooth wave, a ramp wave, or a square wave, but is not limited thereto and may have a general waveform that is not specifically defined.
  • the light source driver 13, under the control of the controller 40, may drive the light source 10 by modulating it.
  • for example, the light source driver 13 may drive the light source 10 by modulating the amplitude, pulse, phase, or frequency of the irradiated light, as well as whether the pulse is encrypted, the pulse width, and the rise time.
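  • As an illustration only, the continuous-wave case described above can be modeled as an amplitude-modulated drive signal; the 20 MHz modulation frequency and the modulation index below are assumptions, since the disclosure does not specify drive parameters.

```python
import numpy as np

def modulated_drive(t, amplitude=1.0, mod_index=0.5, f_mod=20e6, phase=0.0):
    """Amplitude-modulated continuous-wave drive signal for the light source.

    The 20 MHz modulation frequency is an illustrative assumption; the patent
    only states that amplitude, pulse, phase, frequency, pulse encryption,
    pulse width, and rise time may be modulated.
    """
    return amplitude * (1.0 + mod_index * np.cos(2 * np.pi * f_mod * t + phase))

t = np.linspace(0.0, 1e-6, 1000)   # one microsecond of drive signal
drive = modulated_drive(t)         # samples of the emitted waveform
```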
  • the viewing angle adjusting unit 11 may change the viewing angle of the light source 10. Specifically, the viewing angle may be adjusted according to the distance between the object 100 and the apparatus 1, and the viewing angle adjusting unit 11 may be disposed between the light source 10 and the light source lens 12 and implemented in various ways.
  • in this way, a sufficient amount of light can reach the object 100.
  • the light reflected from the object 100 may be received by the imaging device 20 through the light-receiving lens 23 and the filter 22, and the imaging device 20 may be driven by the modulation driver 24.
  • the imaging device 20 may generate raw image data from the received reflected light, and the signal processor 30 may process the raw image data received from the imaging device 20.
  • the light receiving lens 23 may collect light reflected from the object 100 to a region where the imaging device 20 is located.
  • the filter 22 may transmit only light having a predetermined wavelength, and may be, for example, a bandpass optical filter.
  • the adjustment lens unit 21 may be positioned between the light-receiving lens 23 and the imaging device 20, and may adjust the focus according to the distance between the object 100 and the 3D time-of-flight image acquisition apparatus 1.
  • the focal position may be adjusted by adjusting an adjustment medium (not shown) included in the adjustment lens unit 21 under the control of the controller 40. Therefore, a three-dimensional image focused on the corresponding object 100 may be obtained through the three-dimensional time-of-flight image acquisition apparatus 1.
  • the distance between the object 100 and the three-dimensional time-of-flight image acquisition apparatus 1 (more specifically, the distance between the object 100 and the imaging device 20) is represented by D1 to D3.
  • it can be assumed that the object 100 is located at each of D1 to D3.
  • when a 3D image focused on the object 100 located at the closest distance D1 is to be acquired, the light reflected from the object 100 at D1 is received and analyzed to obtain information about the distance between the object 100 and the apparatus 1, and the focus may be adjusted by adjusting the adjustment medium included in the adjustment lens unit 21 using that information.
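  • A minimal way to see why the focal position must track the object distance (a textbook thin-lens relation, not taken from the disclosure) is:

$$\frac{1}{f} = \frac{1}{d_o} + \frac{1}{d_i}$$

As the object distance $d_o$ changes between D1 and D3, the image distance $d_i$ at which a sharp image forms on the imaging device 20 changes as well, so the adjustment medium must shift accordingly.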
  • the adjustment lens unit 21 may adjust a path of the light reflected from the object 100 so that a specific portion of the object 100 within the viewing angle of the light source 10 may be enlarged to form an image on the imaging device 20.
  • the lens may be mechanically driven to zoom in or zoom out, but the driving scheme is not limited thereto.
  • the selection of the object used as the focusing reference may be performed by receiving information about a reference object from a user via an input means, but the present invention is not limited thereto, and a specific object satisfying a predetermined criterion may be selected automatically as the reference. Therefore, according to the three-dimensional time-of-flight image acquisition apparatus 1 according to an embodiment of the present invention, the accuracy of the obtained three-dimensional image can be improved by the automatic adjustment of the focus.
  • the adjustment of the focus may be performed to correct the error rate of the distance value.
  • the imaging device 20 may receive light reflected from the object 100 and generate raw image data from the reflected light.
  • the imaging device 20 may include a 2D image sensor and may also include, but is not limited to, an RGB camera sensor.
  • the imaging device 20 may have a structure in which the reflected light is split and simultaneously received by the 2D image sensor and the RGB sensor, but is not limited thereto.
  • the imaging device 20 may be configured as a set of unit pixels having an optical synthesis function for 3D time-of-flight operation, but is not limited thereto.
  • the modulation driver 24 may drive the imaging device 20, and the imaging device 20 may use a signal having the same phase as the light emitted from the light source 10, or a phase differing by 90°, 180°, or 270°, to modulate the waveform output from the imaging device 20.
  • when a direct time-of-flight (direct TOF) method is used, a light source 10 emitting discontinuous pulses is used, so the driving may differ.
  • the signal processor 30 may receive the raw image data from the imaging device 20 and may process the raw data to generate a 3D depth image and a 3D intensity image.
  • the signal processor 30 may compare the modulated waveform and phase of the light irradiated from the light source 10 with the waveform of the reflected light to obtain the depth and intensity values that form the three-dimensional depth image and the three-dimensional intensity image.
  • the depth value and the intensity value obtained from the signal processor 30 may be used to generate a 3D image having a volume similar to a real object through a point cloud method, a voxel, or a color code map method in a post-processing process.
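  • A minimal sketch of the point-cloud conversion mentioned above, assuming a pinhole camera model; the intrinsic parameters (fx, fy, cx, cy) and the sensor resolution are hypothetical and not specified in the disclosure.

```python
import numpy as np

def depth_map_to_point_cloud(depth, fx, fy, cx, cy):
    """Back-project a 2-D depth map (in meters) into an N x 3 point cloud."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))   # pixel coordinates
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    points = np.stack([x, y, depth], axis=-1).reshape(-1, 3)
    return points[points[:, 2] > 0]                  # drop pixels with no depth

# Example with hypothetical intrinsics for a 320 x 240 ToF sensor
cloud = depth_map_to_point_cloud(np.ones((240, 320)), fx=250.0, fy=250.0,
                                 cx=160.0, cy=120.0)
```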
  • as a method of generating the three-dimensional depth image and the three-dimensional intensity image, a phase-modulation method using the phase difference may be used, or a direct time-of-flight method may also be used.
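  • One common realization of the phase-difference method is four-sample demodulation at 0°, 90°, 180°, and 270°; the relations below are standard textbook formulas, shown only as one possible reading of the disclosure, and the 20 MHz modulation frequency is an assumption.

```python
import numpy as np

C = 299_792_458.0   # speed of light in m/s

def four_phase_depth(q0, q90, q180, q270, f_mod=20e6):
    """Depth and intensity from four phase-shifted correlation samples.

    f_mod is an assumed modulation frequency; the unambiguous range of the
    measurement is c / (2 * f_mod).
    """
    phase = np.mod(np.arctan2(q90 - q270, q0 - q180), 2 * np.pi)  # echo phase delay
    depth = C * phase / (4 * np.pi * f_mod)                       # meters
    intensity = 0.5 * np.hypot(q90 - q270, q0 - q180)             # reflected amplitude
    return depth, intensity
```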
  • the controller 40 controls the overall operation of the three-dimensional time-of-flight image acquisition apparatus 1; for example, the viewing angle adjusting unit 11, the light source driver 13, the adjustment lens unit 21, the modulation driver 24, and the signal processor 30 may be controlled by the controller 40.
  • the motion sensor 50 may increase the accuracy of three-dimensional image generation by detecting information such as the movement, tilt, and orientation of the three-dimensional time-of-flight image acquisition apparatus 1, and this information may be stored together with the three-dimensional image and used when displaying it.
  • the absolute coordinate values of the 3D image may also be detected and generated.
  • for the absolute coordinate values of the three-dimensional image, for example, three axes (the X axis, the Y axis, and the Z axis) may be used.
  • as the motion sensor 50, at least one of a gyro sensor that detects tilt by sensing the rotation of the apparatus 1 about three axes, an acceleration sensor that senses the movement of the apparatus 1 along three axes, and a geomagnetic sensor that senses magnetic field strength along three axes may be used, but the present invention is not limited thereto.
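  • A minimal sketch, assuming a roll/pitch/yaw orientation estimate fused from such sensors, of how camera-frame points could be rotated into absolute X/Y/Z coordinates; the fusion step itself is omitted and the function names are hypothetical.

```python
import numpy as np

def rpy_to_rotation(roll, pitch, yaw):
    """Rotation matrix from roll/pitch/yaw angles in radians (Z-Y-X order)."""
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    return rz @ ry @ rx

def to_world(points_cam, roll, pitch, yaw, device_position):
    """Transform an N x 3 camera-frame point cloud into world coordinates."""
    return points_cam @ rpy_to_rotation(roll, pitch, yaw).T + device_position
```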
  • Referring to FIG. 4, the viewing angle adjustment function of the 3D time-of-flight image acquisition apparatus 1 according to an embodiment of the present invention will be described.
  • FIG. 4 shows a flowchart explaining the viewing angle adjustment function of the apparatus 1.
  • the signal processor 30 may analyze the 3D depth image and the 3D intensity image to detect an object and calculate a depth value and an intensity value of the detected object (S1).
  • the depth value may be a distance value, that is, a value representing the distance between the 3D time-of-flight image acquisition apparatus 1 and the object 100 calculated from the 3D depth image.
  • the intensity value may be the intensity of the light reflected from the object 100.
  • the three-dimensional depth image and the three-dimensional intensity image may be matched to each other to calculate the depth value and the intensity value of a detected object.
  • a relatively close object may have a large intensity value and a small depth value, while a relatively distant object may have a small intensity value and a large depth value.
  • the signal processor 30 may correct the calculated depth value and the intensity value (S2).
  • when the object 100 whose 3D image is to be acquired is transparent and excessively reflects light, such as glass, a distorted 3D depth image and a distorted 3D intensity image may be generated.
  • in the 3D depth image of such a transparent, reflective object the glass does not appear clearly, whereas in the 3D intensity image it appears clearly; therefore, in order to correct the distortion, correction may be performed on the calculated depth value and intensity value.
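  • A hedged sketch of one way such a correction could be implemented: depth pixels whose intensity indicates a saturated or implausible return are flagged and filled from valid neighbours; the thresholds and the neighbourhood rule are illustrative assumptions, not values from the disclosure.

```python
import numpy as np

def correct_transparent_regions(depth, intensity, depth_max=10.0, sat_level=0.95):
    """Mask depth pixels distorted by transparent / highly reflective surfaces.

    Pixels with saturated intensity or out-of-range depth are treated as
    unreliable and replaced by the median of their valid 3 x 3 neighbours.
    """
    bad = (intensity >= sat_level) | (depth <= 0) | (depth > depth_max)
    corrected = depth.copy()
    h, w = depth.shape
    for y, x in zip(*np.nonzero(bad)):
        y0, y1 = max(0, y - 1), min(h, y + 2)
        x0, x1 = max(0, x - 1), min(w, x + 2)
        patch = depth[y0:y1, x0:x1][~bad[y0:y1, x0:x1]]
        if patch.size:
            corrected[y, x] = np.median(patch)
    return corrected
```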
  • the signal processor 30 may calculate information on the size of the detected object and the distance between the detected object and the 3D time-of-flight image acquisition apparatus 1, based on the calculated depth value and intensity value (S3).
  • information about a size and a distance may be calculated for the plurality of objects.
  • the signal processor 30 may calculate the maximum viewing angle of the light source 10 based on the calculated size, distance, and position information (S4). That is, in consideration of the size and distance of the detected objects, the maximum viewing angle of the light source 10 that can cover all the detected objects while delivering a sufficient amount of light to the object 100 can be derived. To this end, the signal processor 30 may have a database in which the required amount of light is recorded according to the size or position of the object 100.
  • the maximum viewing angle of the light source 10 may be calculated based on the object having the largest size or the farthest distance from the three-dimensional time-of-flight image acquisition apparatus 1 among the detected objects.
  • the maximum viewing angle may be calculated based on the outermost objects.
  • the signal processor 30 may determine the validity of the current viewing angle by comparing the current viewing angle of the light source 10 with the calculated maximum viewing angle (S5). When the current viewing angle and the calculated maximum viewing angle do not coincide, it may be determined that the current viewing angle is not valid; when they coincide, it may be determined that the current viewing angle is valid.
  • when the current viewing angle is determined not to be valid, the controller 40 may control the viewing angle adjusting unit 11 to adjust the viewing angle of the light source 10 to the maximum viewing angle (S6); when it is determined to be valid, the controller 40 may control the viewing angle adjusting unit 11 to maintain the current viewing angle (S7).
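  • Steps S1 to S7 can be summarized by the sketch below; the geometric rule used to derive the maximum viewing angle from object size and distance, the margin, and the tolerance are assumptions, since the disclosure defers the required light amounts to a stored database.

```python
import math

def required_viewing_angle(object_size, object_distance, margin=1.1):
    """Full cone angle (degrees) needed to cover an object of the given size
    at the given distance, with a hypothetical 10 % margin."""
    return 2 * math.degrees(math.atan2(margin * object_size / 2, object_distance))

def adjust_viewing_angle(detected_objects, current_angle, set_angle, tol=0.5):
    """S3-S7: derive the maximum required viewing angle and update the light
    source only when the current viewing angle is no longer valid.
    set_angle is a hypothetical callback into the viewing angle adjusting unit."""
    max_angle = max(required_viewing_angle(o["size"], o["distance"])
                    for o in detected_objects)
    if abs(current_angle - max_angle) > tol:   # S5: current angle not valid
        set_angle(max_angle)                   # S6: adjust the viewing angle
        return max_angle
    return current_angle                       # S7: keep the current angle
```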
  • since the viewing angle is automatically adjusted according to the viewing angle adjustment function described above, the 3D time-of-flight image acquisition apparatus according to an embodiment of the present invention can obtain a high-quality 3D image while minimizing power consumption.
  • the power consumption can be minimized by simultaneously adjusting the viewing angle of the light source and the intensity of the light source according to the distance of the object, and as a result a power consumption reduction effect can be obtained.
  • Referring to FIG. 5, the focus adjustment function of the 3D time-of-flight image acquisition apparatus according to an embodiment of the present invention will be described.
  • FIG. 5 shows a flowchart describing the focus adjustment function of the apparatus.
  • the signal processor 30 may detect an object by analyzing a 3D depth image and a 3D intensity image, and calculate a depth value and an intensity value of the detected object (S11).
  • the depth value may be a distance value, that is, a value representing the distance between the 3D time-of-flight image acquisition apparatus 1 and the object 100 calculated from the 3D depth image.
  • the intensity value may be the intensity of the light reflected from the object 100.
  • the signal processor 30 may correct the calculated depth value and the intensity value (S12).
  • when the object 100 whose 3D image is to be acquired is transparent and excessively reflects light, such as glass, a distorted 3D depth image and a distorted 3D intensity image may be generated.
  • in the 3D depth image of such a transparent, reflective object the glass does not appear clearly, whereas in the 3D intensity image it appears clearly; therefore, in order to correct the distortion, correction may be performed on the calculated depth value and intensity value.
  • the signal processor 30 may calculate information about the boundary of the detected object and the distance between the detected object and the 3D time-of-flight image acquisition apparatus 1, based on the calculated depth value and intensity value (S13).
  • the signal processor 30 may calculate a focus coefficient required to obtain a 3D time-of-flight image focused on the detected object, based on the calculated boundary and distance information (S14).
  • the selection of the object used as the focusing reference may be performed by receiving information about a reference object from the user, but the present invention is not limited thereto, and a specific object may be selected automatically as the reference.
  • the signal processor 30 may determine whether focus adjustment is necessary by comparing the current focus coefficient with the calculated coefficient (S15). If the current focus coefficient and the calculated focus coefficient do not match, it may be determined that focus adjustment is necessary. If the current focus coefficient and the calculated focus coefficient match, it may be determined that focus adjustment is not necessary.
  • when it is determined that focus adjustment is necessary, the controller 40 may control the adjustment lens unit 21 to adjust the focus position to the calculated focus coefficient (S16); when it is determined that focus adjustment is not necessary, the controller 40 may control the adjustment lens unit 21 to maintain the current focus position (S17).
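  • Steps S14 to S17 follow the same compare-and-update pattern; the sketch below assumes the focus coefficient can be derived from the reference object's distance via the thin-lens relation, and the 8 mm focal length and tolerance are hypothetical values.

```python
def focus_coefficient(object_distance, focal_length=0.008):
    """Image-side distance (m) at which the reference object is in focus,
    from the thin-lens relation; the 8 mm focal length is an assumption."""
    return 1.0 / (1.0 / focal_length - 1.0 / object_distance)

def adjust_focus(reference_distance, current_coeff, set_focus, tol=1e-5):
    """S14-S17: recompute the focus coefficient and move the adjustment lens
    unit only when it differs from the current setting.
    set_focus is a hypothetical callback into the adjustment lens unit."""
    target = focus_coefficient(reference_distance)
    if abs(current_coeff - target) > tol:   # S15: focus adjustment needed
        set_focus(target)                   # S16: adjust the focus position
        return target
    return current_coeff                    # S17: keep the current focus
```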
  • since the focus is automatically adjusted according to the focus adjustment function described above, the 3D time-of-flight image acquisition apparatus according to an embodiment of the present invention can improve the accuracy of the obtained 3D image.
  • Referring to FIG. 6, a 3D time-of-flight image acquisition system including the 3D time-of-flight image acquisition apparatus of FIG. 1 according to an embodiment of the present invention will be described; its configuration is disclosed in FIG. 6.
  • the 3D time-of-flight image acquisition system may include the 3D time-of-flight image acquisition apparatus 1, a display unit 2 for displaying an image transmitted from the apparatus 1, an input unit 3 for receiving input from a user, a data storage unit 4 capable of storing data generated by the apparatus 1, a power supply unit 5 for supplying power to the system including the apparatus 1, and a communication unit 6 for communicating with an external device.
  • the communication unit 6 may use a wireless communication method such as WPAN (Wireless Personal Area Networks, including Bluetooth, ZigBee, and NFC), Wireless LAN (WLAN), Wi-Fi, WiBro, WiMAX, High Speed Downlink Packet Access (HSDPA), or Global Positioning System (GPS), or a wired communication method such as Ethernet, xDSL (ADSL, VDSL), Hybrid Fiber Coax (HFC), Fiber To The Curb (FTTC), Fiber To The Home (FTTH), or USB (Universal Serial Bus).
  • the communication method of the communication unit 6 is not limited to those described above, and any other well-known or future-developed communication scheme may be included.
  • Referring to FIG. 7, a diagram illustrating a 3D depth image displayed on the display unit of the 3D time-of-flight image acquisition system of FIG. 6 is disclosed.
  • Referring to FIG. 8, a diagram illustrating a 3D intensity image displayed on the display unit of the 3D time-of-flight image acquisition system of FIG. 6 is disclosed.
  • the display unit 2 may include a resolution display area 110 in which the resolution of the 3D image is displayed.
  • the resolution may be displayed as Full-HD, but is not limited thereto.
  • the display unit 2 may include an auto-focus region 120 that displays the position of the region on which the adjustment lens unit 21 forms the focus, but is not limited thereto.
  • the display unit 2 may display the three-dimensional depth image with a color corresponding to the magnitude of each distance value, and a color-distance correspondence display area 130 expressing the relationship between distance value and color in the form of a bar may be included in the display unit 2.
  • the display unit 2 may include an auto-focus display area 140 that indicates, in the form of an icon, whether auto-focus control is performed by the adjustment lens unit 21, and an automatic viewing angle adjustment display area 150 that indicates, in the form of an icon, whether the viewing angle is automatically adjusted by the viewing angle adjusting unit 11.
  • the display unit 2 may display the three-dimensional intensity image with a shade corresponding to the magnitude of each intensity value, and a shade-intensity correspondence display area 160 expressing the relationship between intensity value and shade in the form of a bar may be included in the display unit 2.
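  • A minimal sketch of the distance-to-color and intensity-to-shade mappings used by the display areas 130 and 160, assuming a simple linear normalization onto a matplotlib colormap; the colormap choice and the value ranges are hypothetical.

```python
import numpy as np
from matplotlib import cm

def depth_to_color(depth, d_min=0.5, d_max=10.0, colormap=cm.jet):
    """Map a depth image (meters) to RGB for the depth display; range assumed."""
    normalized = np.clip((depth - d_min) / (d_max - d_min), 0.0, 1.0)
    return (colormap(normalized)[..., :3] * 255).astype(np.uint8)

def intensity_to_shade(intensity, i_max=1.0):
    """Map an intensity image to 8-bit gray shades for the intensity display."""
    return (np.clip(intensity / i_max, 0.0, 1.0) * 255).astype(np.uint8)
```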
  • Referring to FIG. 9, an application of the 3D time-of-flight image acquisition apparatus according to an embodiment of the present invention will be described.
  • FIG. 9 shows an example of an electronic device to which the 3D time-of-flight image acquisition apparatus of FIG. 1 is applied.
  • the three-dimensional time-of-flight image acquisition apparatus 1 may be used as the image acquisition device of various electronic devices.
  • for example, the apparatus 1 may be applied to a smartphone 200, and also to a tablet PC, a printer, a handheld game console, a portable notebook, a navigation device, a car, or household appliances.

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Measurement Of Optical Distance (AREA)
  • Optical Radar Systems And Details Thereof (AREA)
  • Automatic Focus Adjustment (AREA)

Abstract

The present invention relates to an apparatus for acquiring a three-dimensional (3D) time-of-flight image. According to an embodiment of the present invention, the apparatus for acquiring a 3D time-of-flight image of an object comprises: a light source for emitting light toward the object; a viewing angle controller for controlling a viewing angle of the light source according to a distance between the object and the apparatus; an adjustment lens unit for controlling focus according to the distance between the object and the apparatus, the light reflected from the object passing through the adjustment lens unit to be received by an imaging device; the imaging device receiving the light reflected from the object so as to generate raw image data; and a signal processing unit for receiving the raw image data from the imaging device and processing the raw image data so as to generate a 3D depth image and a 3D intensity image.
PCT/KR2015/002711 2014-05-21 2015-03-20 Appareil pour l'acquisition d'une image de temps de vol en trois dimensions WO2015178575A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR10-2014-0061191 2014-05-21
KR1020140061191A KR101557295B1 (ko) 2014-05-21 2014-05-21 3차원 시간 비행 이미지 획득 장치

Publications (1)

Publication Number Publication Date
WO2015178575A1 true WO2015178575A1 (fr) 2015-11-26

Family

ID=54344616

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2015/002711 WO2015178575A1 (fr) 2014-05-21 2015-03-20 Appareil pour l'acquisition d'une image de temps de vol en trois dimensions

Country Status (2)

Country Link
KR (1) KR101557295B1 (fr)
WO (1) WO2015178575A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112950694A (zh) * 2021-02-08 2021-06-11 Oppo广东移动通信有限公司 图像融合的方法、单颗摄像头模组、拍摄装置及存储介质

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102476404B1 (ko) * 2017-07-18 2022-12-12 엘지이노텍 주식회사 ToF 모듈 및 그 ToF 모듈을 이용한 피사체 인식장치
WO2023055223A1 (fr) * 2021-10-01 2023-04-06 설윤호 Procédé de calcul d'informations de site de terrain tridimensionnelles d'un objet de commande

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004077262A (ja) * 2002-08-16 2004-03-11 Fuji Xerox Co Ltd 三次元撮像装置および方法
JP2004170305A (ja) * 2002-11-21 2004-06-17 Nippon Telegr & Teleph Corp <Ntt> 3次元形状計測方法および3次元形状計測装置
KR20120043843A (ko) * 2010-10-27 2012-05-07 한국과학기술원 3차원 영상화 펄스 레이저 레이더 시스템 및 이 시스템에서의 자동 촛점 방법
KR20130102400A (ko) * 2012-03-07 2013-09-17 삼성전자주식회사 티오에프 센서 및 티오에프 카메라

Also Published As

Publication number Publication date
KR101557295B1 (ko) 2015-10-05

Similar Documents

Publication Publication Date Title
CN110998223B (zh) 用于确定至少一个对像的位置的检测器
US9900517B2 (en) Infrared binocular system with dual diopter adjustment
JP6646652B2 (ja) 走査型レーザ平面度検出
KR101951318B1 (ko) 컬러 영상과 깊이 영상을 동시에 얻을 수 있는 3차원 영상 획득 장치 및 3차원 영상 획득 방법
US20150002649A1 (en) Device for detecting the three-dimensional geometry of objects and method for the operation thereof
US20150138325A1 (en) Camera integrated with light source
US20170050555A1 (en) Light control systems and methods
US10652513B2 (en) Display device, display system and three-dimension display method
US20140098192A1 (en) Imaging optical system and 3d image acquisition apparatus including the imaging optical system
WO2013138082A3 (fr) Otoscanner vidéo avec sonde de ligne de visée et écran
CN105279490A (zh) 一种人机交互式虹膜图像自动采集装置
JP2012252091A5 (fr)
WO2015178575A1 (fr) Appareil pour l'acquisition d'une image de temps de vol en trois dimensions
JP2006280938A (ja) 安全な眼球検出
CN109990757A (zh) 激光测距和照明
CN104657970B (zh) 一种全自动双目内窥镜的标定方法及标定系统
KR102099935B1 (ko) Tof 카메라 장치
WO2015115799A1 (fr) Dispositif d'appareil photo
WO2021096328A2 (fr) Dispositif de localisation laser présentant une fonction destinée à la détection de position initiale d'une cible et procédé de localisation
CN110491316A (zh) 一种投影仪及其投影控制方法
CN102685537A (zh) 显示装置、显示系统以及显示装置的控制方法
KR20120056441A (ko) 적외선 레이저 프로젝션 디스플레이를 이용한 3차원 깊이 카메라
JP4930115B2 (ja) 画像表示システム
CA2848879A1 (fr) Dispositif de localisation d'elements mecaniques
CN105278228B (zh) 激光投影显示器及其颜色对准方法

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15796702

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205 DATED 27.01.2017)

122 Ep: pct application non-entry in european phase

Ref document number: 15796702

Country of ref document: EP

Kind code of ref document: A1