WO2021114497A1 - Time-of-flight-based depth camera and three-dimensional imaging method - Google Patents

Time-of-flight-based depth camera and three-dimensional imaging method

Info

Publication number
WO2021114497A1
Authority
WO
WIPO (PCT)
Prior art keywords
light source
light
illumination
type
target
Prior art date
Application number
PCT/CN2020/077847
Other languages
French (fr)
Chinese (zh)
Inventor
孙瑞
周兴
王兆民
孙飞
马宣
杨神武
方兆翔
Original Assignee
深圳奥比中光科技有限公司
Priority date
Filing date
Publication date
Application filed by 深圳奥比中光科技有限公司
Publication of WO2021114497A1

Classifications

    • G: Physics
    • G01: Measuring; testing
    • G01S: Radio direction-finding; radio navigation; determining distance or velocity by use of radio waves; locating or presence-detecting by use of the reflection or reradiation of radio waves; analogous arrangements using other waves
    • G01S17/00: Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88: Lidar systems specially adapted for specific applications
    • G01S17/89: Lidar systems specially adapted for specific applications for mapping or imaging
    • G01S7/00: Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48: Details of systems according to group G01S17/00
    • G01S7/4804: Auxiliary means for detecting or identifying lidar signals or the like, e.g. laser illuminators
    • G01S7/481: Constructional features, e.g. arrangements of optical elements
    • G01S7/4814: Constructional features of transmitters alone
    • G01S7/4815: Constructional features of transmitters alone using multiple transmitters
    • G01S7/4816: Constructional features of receivers alone
    • G01S7/4817: Constructional features relating to scanning

Definitions

  • TOF: time of flight
  • FOV: field of view
  • iTOF: indirect time-of-flight ranging system
  • VCSEL: vertical-cavity surface-emitting laser

Abstract

A time-of-flight-based depth camera (100) and a three-dimensional imaging method. The depth camera (100) comprises: a transmitting module (101) comprising at least two light sources for emitting an optical signal, the optical signal being provided with a certain field of view; a receiving module (102) comprising an image processor composed of at least one pixel, for collecting the reflected optical signal; and a control and processing module (103) for controlling the transmitting module (101) and the receiving module (102) and processing the optical signal reflected back from a full field-of-view region (104), determining from that signal a target region (106) in which a target object (105) is located, controlling the transmitting module (101) to emit an optical signal only toward the target region (106), and controlling the receiving module (102) to collect the optical signal reflected back from the target region (106) and process it to obtain depth information of the target object (105). The transmitting module (101) has a first type of illumination, which refers to emitting an optical signal to scan the full field-of-view region (104), and a second type of illumination, which refers to emitting an optical signal to scan only the target region (106). The depth camera and the three-dimensional imaging method can reduce the power consumption of the light sources, reduce the power consumption of the depth calculation, and increase the calculation rate.

Description

A time-of-flight-based depth camera and three-dimensional imaging method
Technical Field
The present invention relates to the field of optical and electronic technology, and in particular to a time-of-flight-based depth camera and a three-dimensional imaging method.
Background
In three-dimensional imaging based on time-of-flight (TOF) ranging, a vertical-cavity surface-emitting laser (VCSEL) illuminates the surface of a target within a certain field of view (FOV). When the full field of view is illuminated at the same time, the peak power of the light source usually has to be raised in order to improve the signal-to-noise ratio and the ranging and imaging accuracy, so the power consumption is relatively large, which makes the power supply system and the heat dissipation system harder to design. Within the FOV, the target may occupy only a very small region, or only part of the region, and the data from the rest of the FOV is unnecessary; it wastes a great deal of redundant work on illumination and depth calculation, which adds considerable power consumption and computation. How to increase the depth calculation rate and reduce power consumption is therefore a problem that needs to be solved in TOF system design, and it is especially critical for indirect time-of-flight ranging (iTOF) systems.
Summary of the Invention
In order to solve the existing technical problems, the present invention provides a time-of-flight-based depth camera and a three-dimensional imaging method.
To solve the above problems, the technical solution adopted by the present invention is as follows:
A time-of-flight-based depth camera includes a transmitting module, a receiving module, and a control and processing module. The transmitting module includes at least two light sources for emitting optical signals, the optical signals being provided with a certain field of view. The receiving module includes an image processor composed of at least one pixel and is used to collect the optical signals reflected back from the full field-of-view region and from the target region. The control and processing module is used to control the transmitting module and the receiving module and to process the optical signal reflected back from the full field-of-view region; it determines from that optical signal the target region in which the target object is located and then controls the transmitting module to emit optical signals only toward the target region, and it controls the receiving module to collect the optical signal reflected back from the target region and processes it to obtain the depth information of the target object. The transmitting module has a first type of illumination and a second type of illumination: the first type of illumination refers to emitting optical signals to scan the full field-of-view region, and the second type of illumination refers to emitting optical signals to scan only the target region in which the target object is located.
In one embodiment of the present invention, the transmitting module includes: a first light source, a second light source, and a third light source for emitting optical signals outward; and a first diffuser, a second diffuser, and a third diffuser, corresponding respectively to the first light source, the second light source, and the third light source, for modulating the field of view of the optical signals.
In another embodiment of the present invention, the transmitting module includes: a first light source, a second light source, and a third light source for emitting optical signals outward; and a first liquid crystal modulation element, a second liquid crystal modulation element, and a third liquid crystal modulation element, corresponding respectively to the first light source, the second light source, and the third light source, for controlling the divergence angle of the optical signals.
In a further embodiment of the present invention, the transmitting module includes: a substrate providing at least three tilt angles for carrying the light sources; and a first light source, a second light source, and a third light source placed on the substrate at different tilt angles for emitting optical signals outward. The substrate is semicircular or an isosceles trapezoid. The first type of illumination includes turning on the first light source, the second light source, and the third light source simultaneously; the second type of illumination includes: turning on only the first light source; or turning on only the second light source; or turning on only the third light source; or turning on the first and second light sources simultaneously; or turning on the second and third light sources simultaneously; or turning on the first, second, and third light sources simultaneously.
The present invention further provides a three-dimensional imaging method, including the following steps: S1: using the first type of illumination to acquire a first frame image of the full field-of-view region; S2: determining, from the first frame image, the target region in which the target object is located; S3: using the second type of illumination to illuminate the target region; S4: acquiring a second frame image to obtain the depth information of the target object.
In one embodiment of the present invention, the first type of illumination refers to turning on all light sources to scan the full field-of-view region, and the second type of illumination refers to turning on only the light source corresponding to the target region to scan the target region.
In another embodiment of the present invention, the first type of illumination includes using an infrared light source to scan the full field-of-view region to obtain the first frame as a depth image.
In a further embodiment of the present invention, the first type of illumination includes using an RGB sensor, under ambient light or LED illumination of the full field-of-view region, to capture a two-dimensional image of the full field-of-view region as the first frame image.
The beneficial effects of the present invention are as follows: a time-of-flight-based depth camera and a three-dimensional imaging method are provided in which multiple light sources illuminate different regions respectively; the region in which the target object is located is delimited from an image obtained by coarsely scanning the full field-of-view region, and then only the light source corresponding to the delimited target region is turned on to finely scan the target and obtain a depth map. This reduces the power consumption of the light sources, reduces the power consumption of the depth calculation, eases the heat dissipation design, and increases the calculation rate.
Brief Description of the Drawings
Fig. 1 is a schematic diagram of a time-of-flight-based depth camera provided according to the present invention.
Fig. 2 is a schematic structural diagram of a transmitting module provided according to the present invention.
Fig. 3 is a schematic structural diagram of another transmitting module provided according to the present invention.
Fig. 4 is a structural diagram of a further transmitting module provided according to the present invention.
Fig. 5 is a schematic diagram of a three-dimensional imaging method based on adaptive illumination of a time-of-flight depth camera provided according to the present invention.
Reference numerals: 100 - depth camera; 101 - transmitting module; 102 - receiving module; 103 - control and processing module; 104 - full field-of-view region; 105 - target object; 106 - target region; 11 - optical signal; 12 - optical signal; 200 - transmitting module; 201 - first light source; 202 - second light source; 203 - third light source; 204 - first diffuser; 205 - second diffuser; 206 - third diffuser; 300 - transmitting module; 301 - first light source; 302 - second light source; 303 - third light source; 304 - first liquid crystal modulation element; 305 - second liquid crystal modulation element; 306 - third liquid crystal modulation element; 400 - transmitting module; 401 - substrate; 402 - first light source; 403 - second light source; 404 - third light source.
Detailed Description
In order to make the technical problems to be solved, the technical solutions, and the beneficial effects of the embodiments of the present invention clearer, the present invention is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described here are intended only to explain the present invention and not to limit it.
It should be noted that when an element is described as being "fixed to" or "disposed on" another element, it may be directly on the other element or indirectly on the other element. When an element is described as being "connected to" another element, it may be directly connected to the other element or indirectly connected to it. In addition, a connection may serve a fixing function or an electrical connection function.
It should be understood that orientation or position terms such as "length", "width", "upper", "lower", "front", "rear", "left", "right", "vertical", "horizontal", "top", "bottom", "inner", and "outer" are based on the orientations or positions shown in the drawings and are used only to facilitate and simplify the description of the embodiments of the present invention; they do not indicate or imply that the device or element referred to must have a particular orientation or be constructed and operated in a particular orientation, and therefore cannot be understood as limiting the present invention.
In addition, the terms "first" and "second" are used only for descriptive purposes and cannot be understood as indicating or implying relative importance or implicitly indicating the number of technical features referred to. Accordingly, a feature defined with "first" or "second" may explicitly or implicitly include one or more such features. In the description of the embodiments of the present invention, "a plurality of" means two or more, unless specifically defined otherwise.
Fig. 1 is a schematic diagram of a time-of-flight-based depth camera provided according to the present invention. The depth camera 100 is a TOF depth camera and includes a transmitting module 101, a receiving module 102, and a control and processing module 103, with the control and processing module 103 connected to the transmitting module 101 and the receiving module 102. The transmitting module 101 includes at least two light sources for emitting optical signals (for example, pulsed laser beams) toward the full field-of-view region 104. The receiving module 102 includes an image processor composed of at least one pixel to collect the optical signal reflected back from the full field-of-view region 104. The control and processing module 103 is used to control the transmitting module 101 and the receiving module 102 and to process the optical signal reflected back from the full field-of-view region 104; from the reflected optical signal it determines the target region 106 in which the target object 105 is located, and then controls the transmitting module 101 to emit optical signals only toward the target region 106 where the target 105 is located. The transmitting module has a first type of illumination and a second type of illumination: the first type of illumination refers to emitting optical signals to uniformly illuminate the region within the full field of view, and the second type of illumination refers to emitting optical signals only toward the region in which the target object is located. It should be understood that the depth camera 100 may also include a circuit module, a power module, a housing, and other components not fully shown here. The depth camera 100 may be provided as a standalone device or integrated into an electronic device such as a mobile phone, a tablet, or a computer; no limitation is imposed here.
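For orientation only, the component split above can be summarized in a short sketch. This is not part of the patent; the class names, field names, and angle convention are illustrative assumptions.

```python
from dataclasses import dataclass
from enum import Enum, auto
from typing import List, Tuple

class IlluminationType(Enum):
    """The two illumination modes described for the transmitting module 101."""
    FULL_FIELD = auto()   # first type: scan the full field-of-view region 104
    TARGET_ONLY = auto()  # second type: scan only the target region 106

@dataclass
class LightSource:
    """One emitter of the transmitting module; the FOV interval is an assumed
    horizontal angular coverage in degrees (not specified in the patent)."""
    name: str
    fov_deg: Tuple[float, float]  # (start angle, end angle)
    power_w: float
    enabled: bool = False

@dataclass
class TransmitterModule:
    sources: List[LightSource]

    def set_illumination(self, mode: IlluminationType, active: List[str]) -> None:
        """Enable all sources for FULL_FIELD, or only the named ones for TARGET_ONLY."""
        for s in self.sources:
            s.enabled = (mode is IlluminationType.FULL_FIELD) or (s.name in active)
```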
In one embodiment, the light sources of the transmitting module 101 are chosen as any suitable light source as required. For example, a source may be a single light source such as an edge-emitting laser, or an array of multiple light sources such as a light source chip array composed of multiple vertical-cavity surface-emitting lasers; edge-emitting laser diodes, infrared lasers, LEDs, and so on may also be used, configured according to specific requirements, with no limitation imposed here.
In one embodiment, the image sensor of the receiving module 102 is an image sensor dedicated to time-of-flight (TOF) measurement, for example a CMOS (complementary metal-oxide-semiconductor), APD (avalanche photodiode), or SPAD (single-photon avalanche diode) image sensor. The pixels of the image sensor may be arranged as a single point, a line array, or an area array.
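The patent does not prescribe how depth is computed from the collected signal; as background for the iTOF case mentioned earlier, a common four-phase continuous-wave scheme, assumed here purely for illustration, recovers per-pixel depth as follows.

```python
import numpy as np

C = 299_792_458.0  # speed of light, m/s

def itof_depth(a0, a1, a2, a3, f_mod):
    """Per-pixel depth from four correlation samples taken at 0, 90, 180 and 270
    degrees of the modulation period (a standard 4-phase CW iTOF scheme, assumed
    here, not mandated by the patent). Inputs are same-shaped numpy arrays."""
    phase = np.arctan2(a3 - a1, a0 - a2)      # wrapped phase, -pi..pi
    phase = np.mod(phase, 2 * np.pi)          # 0..2*pi
    return C * phase / (4 * np.pi * f_mod)    # half the round-trip distance

# Example: with a 20 MHz modulation frequency the unambiguous range is
# C / (2 * f_mod), i.e. about 7.5 m.
```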
In one embodiment, the control and processing module 103 is connected to the transmitting module 101 and to the receiving module 102 and is used to send control signals to each module so that the corresponding control operations are carried out, and to perform the relevant calculation and processing on the received optical signals. The functions of the control and processing module 103 include providing the periodic modulation signal required by the transmitting module 101 to emit optical signals and providing the acquisition signal with which the receiving module 102 collects optical signals; it may also provide auxiliary monitoring signals, including temperature sensing, overcurrent protection, overvoltage protection, and detachment protection.
In one embodiment, the control and processing module 103 sends a control signal to the transmitting module 101; the transmitting module 101 uses the first type of illumination and turns on all light sources to coarsely scan the full field-of-view region 104, while an acquisition signal is sent to the receiving module 102 and the receiving module 102 collects the optical signal reflected back from the full field-of-view region 104. The control and processing module 103 processes the collected optical signal, determines the target region 106 of the target object 105, and again sends a control signal to the transmitting module 101; the transmitting module 101 then uses the second type of illumination, turning on only some of the light sources so that the optical signal 11 is emitted only toward the target region 106 of the target 105 for a fine scan, while the receiving module 102, driven by the acquisition signal, collects the optical signal 12 reflected back from the target region. The control and processing module 103 processes the collected optical signal 12 to obtain the depth information of the target 105.
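A minimal sketch of this two-pass sequence is given below, assuming placeholder callables for frame capture, source switching, and target-region detection; none of these names come from the patent.

```python
from typing import Callable, List, Sequence
import numpy as np

def adaptive_depth_capture(
    capture_frame: Callable[[], np.ndarray],                 # receiving module 102 under an acquisition signal
    set_active_sources: Callable[[Sequence[int]], None],     # source switching in transmitting module 101
    find_target_sources: Callable[[np.ndarray], List[int]],  # maps a first frame to source indices
    all_sources: Sequence[int],
) -> np.ndarray:
    """Two-pass flow of Fig. 1: coarse scan with every source (first type of
    illumination), pick the sources covering the target region, then fine scan
    with only those sources (second type of illumination)."""
    # Pass 1: first type of illumination - all sources on, coarse scan of region 104.
    set_active_sources(all_sources)
    first_frame = capture_frame()

    # Decide which emitters cover the target region 106 of the target 105.
    active = find_target_sources(first_frame)

    # Pass 2: second type of illumination - only the selected sources, fine scan.
    set_active_sources(active)
    second_frame = capture_frame()  # processed into the depth map of the target
    return second_frame
```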
Fig. 2 is a structural diagram of a transmitting module provided according to the present invention. The transmitting module 200 includes a first light source 201, a second light source 202, a third light source 203, a first diffuser 204, a second diffuser 205, and a third diffuser 206. The first light source 201, the second light source 202, and the third light source 203 are used to emit optical signals outward. The surfaces of the first diffuser 204, the second diffuser 205, and the third diffuser 206 carry micro-nano structures; by changing the size, period, and shape of these micro-nano structures according to different requirements, the field of view of the emitted optical signal is modulated so that the first light source 201, the second light source 202, and the third light source 203 each have a certain field of view.
Fig. 3 is a structural diagram of another transmitting module provided according to the present invention. The transmitting module 300 includes a first light source 301, a second light source 302, a third light source 303, a first liquid crystal modulation element 304, a second liquid crystal modulation element 305, and a third liquid crystal modulation element 306. The first light source 301, the second light source 302, and the third light source 303 are used to emit optical signals outward. Under the control of different voltages, the liquid crystal in the first liquid crystal modulation element 304, the second liquid crystal modulation element 305, and the third liquid crystal modulation element 306 can be deflected by different angles, which in turn controls the divergence angle of the emitted optical signal, so that the first light source 301, the second light source 302, and the third light source 303 each have a different field of view.
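The patent does not specify the relation between drive voltage and divergence angle for the liquid crystal modulation elements; one plausible approach, sketched here with an invented calibration table, interpolates a measured voltage-to-divergence curve at run time.

```python
import numpy as np

# Hypothetical calibration of one liquid crystal modulation element:
# drive voltage (V) versus resulting divergence angle (degrees).
CAL_VOLTAGE_V = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
CAL_DIVERGENCE_DEG = np.array([5.0, 10.0, 20.0, 35.0, 60.0])

def voltage_for_divergence(target_divergence_deg: float) -> float:
    """Pick the drive voltage that gives the requested divergence angle by
    linear interpolation of the (assumed) calibration table."""
    return float(np.interp(target_divergence_deg, CAL_DIVERGENCE_DEG, CAL_VOLTAGE_V))

# e.g. voltage_for_divergence(30.0) -> about 2.7 V with this placeholder table
```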
In one embodiment, the first diffuser 204, the second diffuser 205, the third diffuser 206, the first liquid crystal modulation element 304, the second liquid crystal modulation element 305, and the third liquid crystal modulation element 306 may be freely combined to modulate the field of view of the optical signals emitted by the light sources.
Fig. 4 is a structural diagram of a further transmitting module provided according to the present invention. The transmitting module 400 includes a substrate 401, a first light source 402, a second light source 403, and a third light source 404. The first light source 402, the second light source 403, and the third light source 404 are used to emit optical signals outward. The substrate 401 is shaped as an isosceles trapezoid so that the first light source 402, the second light source 403, and the third light source 404 are placed at different tilt angles, and the optical signals they emit therefore have different fields of view. It should be understood that the substrate 401 may also be semicircular; it only needs to provide at least three angles at which light sources can be placed, and the angles can be adjusted according to specific requirements, with no limitation imposed here.
In one embodiment, the control and processing module sends a control signal to the transmitting module to turn on the first light source 201, the second light source 202, and the third light source 203 so that optical signals are emitted toward the full field-of-view region and the full field-of-view region is coarsely scanned; the receiving module collects the optical signal reflected back from the full field-of-view region, and the control and processing module processes the optical signal and determines the target region in which the target object is located. When the control and processing module again causes optical signals to be emitted toward the target region where the target object is located: if the target is within the field of view of the optical signal emitted by the first light source 201, only the first light source 201 is turned on to emit an optical signal toward the target region, and the second light source 202 and the third light source 203 remain off; if the target is within the field of view of the second light source 202, only the second light source 202 is turned on, and the first light source 201 and the third light source 203 remain off; if the target is within the field of view of the third light source 203, only the third light source 203 is turned on, and the first light source 201 and the second light source 202 remain off; if the target occupies the fields of view of the first light source 201 and the second light source 202 at the same time, the first light source 201 and the second light source 202 are turned on together and the third light source 203 remains off; if the target occupies the fields of view of the second light source 202 and the third light source 203 at the same time, the first light source 201 is not turned on; and if the target occupies the fields of view of the first light source 201, the second light source 202, and the third light source 203 at the same time, all three light sources are turned on together.
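The case analysis above amounts to enabling every light source whose field of view overlaps the angular extent of the target region. A sketch of that rule is shown below; the angle intervals and source names are purely illustrative assumptions.

```python
from typing import Dict, List, Tuple

def select_sources(
    target_extent_deg: Tuple[float, float],
    source_fov_deg: Dict[str, Tuple[float, float]],
) -> List[str]:
    """Return the light sources whose (assumed) horizontal angular coverage
    overlaps the angular extent of the detected target region, so that one,
    two, or all three sources are enabled as in the cases above. The patent
    does not fix a coordinate convention; the intervals here are illustrative."""
    t_lo, t_hi = target_extent_deg
    active = []
    for name, (s_lo, s_hi) in source_fov_deg.items():
        if s_lo < t_hi and t_lo < s_hi:  # the two intervals overlap
            active.append(name)
    return active

# Example with three emitters covering adjacent 40-degree sectors (placeholder values):
fovs = {"source_201": (-60.0, -20.0), "source_202": (-20.0, 20.0), "source_203": (20.0, 60.0)}
print(select_sources((-5.0, 30.0), fovs))  # -> ['source_202', 'source_203']
```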
It should be noted that the transmitting module may also include two light sources, four light sources, five light sources, and so on; this embodiment is described with three light sources only, but is not limited thereto. By providing multiple light sources, the power consumption of the light sources and of the depth calculation can be greatly reduced and the subsequent depth calculation rate increased. For example, if illuminating the full field-of-view region with a single light source requires 2.5 W, then with multiple light sources arranged according to the present invention and switched according to the target region, two light sources of 1.25 W each or three of about 0.83 W each can be used instead, so that only the light source covering the region where the target is located is turned on, significantly reducing power consumption.
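The saving can be made concrete with the figures quoted above; the single-sector duty assumption in the comments is illustrative, not taken from the patent.

```python
# Figures from the example above: one 2.5 W source covering the full field of view,
# versus three ~0.83 W sources of which only the one covering the target stays on
# during the fine scan.
full_field_power_w = 2.5
per_source_power_w = 2.5 / 3          # ~0.83 W per source when split three ways

# Fine-scan phase with a target confined to a single source's sector (assumed case):
fine_scan_power_w = 1 * per_source_power_w
saving = 1 - fine_scan_power_w / full_field_power_w
print(f"fine-scan emitter power: {fine_scan_power_w:.2f} W "
      f"({saving:.0%} lower than full-field illumination)")
# -> fine-scan emitter power: 0.83 W (67% lower than full-field illumination)
```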
Fig. 5 shows a three-dimensional imaging method based on adaptive illumination of a time-of-flight depth camera provided according to the present invention. It is described in four steps:
Step S1: use the first type of illumination to acquire a first frame image of the full field-of-view region;
Step S2: determine, from the first frame image, the target region in which the target object is located;
Step S3: use the second type of illumination to illuminate the target region;
Step S4: acquire a second frame image to obtain the depth information of the target object.
More specifically, in step S1 the first type of illumination means that all light sources are turned on to coarsely scan the full field-of-view region; the receiving module collects the optical signal reflected back from the full field-of-view region, and the control and processing module processes it to obtain a first frame image of the full field-of-view region. In step S2, the region in which the target object is located is determined from the first frame image and is delimited. In step S3, the target region delimited in step S2 is finely scanned using the second type of illumination, where the second type of illumination means that, under the control of the control and processing module, only the light source corresponding to the target region delimited in step S2 is turned on. In step S4, under the second type of illumination, the receiving module collects the optical signal reflected back from the delimited target region to obtain a second frame image, which is processed by the control and processing module to obtain the depth information of the target; this improves the scanning accuracy of the target. It should be understood that the size of the target region is determined by the size of the target object.
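The patent leaves open how the target region is delimited in step S2; one simple possibility, sketched below with an assumed foreground depth threshold, is to take the bounding box of the near pixels in the first depth frame.

```python
import numpy as np
from typing import Optional, Tuple

def delimit_target_region(
    first_frame: np.ndarray,      # coarse depth image from step S1, in metres
    max_depth_m: float = 1.5,     # assumed "foreground" cut-off, not from the patent
) -> Optional[Tuple[int, int, int, int]]:
    """Step S2, sketched as a threshold plus bounding box: pixels closer than
    max_depth_m are treated as the target, and the enclosing (row0, row1, col0, col1)
    box delimits the target region. The patent leaves the detection method open."""
    mask = (first_frame > 0) & (first_frame < max_depth_m)
    if not mask.any():
        return None               # no target found in the full field of view
    rows, cols = np.nonzero(mask)
    return int(rows.min()), int(rows.max()), int(cols.min()), int(cols.max())

# The column span of the box can then be mapped onto the emitters' fields of view
# to decide which light sources to enable for the fine scan of step S3.
```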
在一个实施例中,第一类型照明包括利用红外光源对全视场区域照明,进行粗略扫描,得到第一帧深度图,根据第一帧深度图对目标物所在区域进行界定,再开启界定目标物所在区域的对应光源进行精细扫描,经过处理从而得到目标物的深度信息。In one embodiment, the first type of illumination includes using an infrared light source to illuminate the entire field of view area, perform a rough scan to obtain the first frame of depth map, define the area where the target is located according to the first frame of depth map, and then turn on the define target The corresponding light source in the area where the object is located is finely scanned and processed to obtain the depth information of the object.
在一个实施例中,第一类型照明包括在环境光或LED对全视场区域的照明下,利用RGB传感器对全视场区域采集二维图像,通过采集到的二维图像判断目标物所在区域,再开启目标物区域所对应的光源对目标物进行精细扫描,经过处理从而得到目标物的深度信息。In one embodiment, the first type of illumination includes the use of RGB sensors to collect two-dimensional images of the full field of view under ambient light or LED illumination of the full field of view, and determine the area where the target object is located through the collected two-dimensional images , And then turn on the light source corresponding to the target area to perform fine scanning of the target, and obtain the depth information of the target after processing.
The beneficial effects achieved by the present invention are as follows: a plurality of light sources are arranged to illuminate different areas respectively; the area where the target object is located is delimited from the image obtained by coarsely scanning the full field-of-view area, and then only the light source corresponding to the delimited target area is turned on to finely scan the target object and obtain a depth map. This reduces the power consumption of the light sources, reduces the power consumption of the depth computation, eases the heat-dissipation design, and increases the computation rate.
A person of ordinary skill in the art can understand that all or part of the steps in the above embodiments may be implemented by hardware, or by a program instructing the relevant hardware; the program may be stored in a computer-readable storage medium, and the storage medium mentioned above may be a read-only memory, a magnetic disk, an optical disc, or the like.
The above content is a further detailed description of the present invention in conjunction with specific preferred embodiments, and the specific implementation of the present invention should not be regarded as being limited to these descriptions. For those skilled in the art to which the present invention belongs, several equivalent substitutions or obvious modifications of the same performance or use may be made without departing from the concept of the present invention, and all of them should be regarded as falling within the protection scope of the present invention.

Claims (10)

  1. A time-of-flight-based depth camera, characterized by comprising a transmitting module, a receiving module, and a control and processing module, wherein:
    the transmitting module comprises at least two light sources for emitting light signals, and the light signals are set with a certain field-of-view angle;
    the receiving module comprises an image sensor composed of at least one pixel, configured to collect the light signals reflected back from the full field-of-view area and from the target area;
    the control and processing module is configured to control the transmitting module and the receiving module, to process the light signal reflected back from the full field-of-view area, to determine, according to the light signal, the target area where a target object is located and thereby control the transmitting module to emit light signals only toward the target area, and to control the receiving module to collect the light signal reflected back from the target area and process the light signal to obtain the depth information of the target object;
    wherein the transmitting module has a first type of illumination and a second type of illumination, the first type of illumination referring to emitting light signals to scan the full field-of-view area, and the second type of illumination referring to emitting light signals to scan only the target area where the target object is located.
  2. The time-of-flight-based depth camera according to claim 1, wherein the transmitting module comprises:
    a first light source, a second light source, and a third light source, configured to emit light signals outward;
    a first diffuser, a second diffuser, and a third diffuser, corresponding respectively to the first light source, the second light source, and the third light source, and configured to modulate the field-of-view angle of the light signals.
  3. The time-of-flight-based depth camera according to claim 1, wherein the transmitting module comprises:
    a first light source, a second light source, and a third light source, configured to emit light signals outward;
    a first liquid crystal modulation element, a second liquid crystal modulation element, and a third liquid crystal modulation element, corresponding respectively to the first light source, the second light source, and the third light source, and configured to control the divergence angle of the light signals.
  4. The time-of-flight-based depth camera according to claim 1, wherein the transmitting module comprises:
    a base configured to provide at least three tilt angles for carrying light sources;
    a first light source, a second light source, and a third light source, placed on the base at different ones of the tilt angles and configured to emit light signals outward.
  5. The time-of-flight-based depth camera according to claim 4, wherein the base is semicircular or an isosceles trapezoid.
  6. The time-of-flight-based depth camera according to any one of claims 2-5, wherein the first type of illumination comprises simultaneously turning on the first light source, the second light source, and the third light source;
    and the second type of illumination comprises: turning on only the first light source; or turning on only the second light source; or turning on only the third light source; or simultaneously turning on the first light source and the second light source; or simultaneously turning on the second light source and the third light source; or simultaneously turning on the first light source, the second light source, and the third light source.
  7. A three-dimensional imaging method, characterized by comprising the following steps:
    S1: using a first type of illumination to acquire a first frame of image of a full field-of-view area;
    S2: determining, according to the first frame of image, a target area where a target object is located;
    S3: using a second type of illumination to illuminate the target area;
    S4: acquiring a second frame of image to obtain depth information of the target object.
  8. The three-dimensional imaging method according to claim 7, wherein the first type of illumination refers to turning on all light sources to scan the full field-of-view area, and the second type of illumination refers to turning on only the light source corresponding to the target area to scan the target area.
  9. The three-dimensional imaging method according to claim 7, wherein the first type of illumination comprises scanning the full field-of-view area with an infrared light source to obtain the first frame of depth image.
  10. The three-dimensional imaging method according to claim 7, wherein the first type of illumination comprises, under ambient light or LED illumination of the full field-of-view area, using an RGB sensor to capture a two-dimensional image of the full field-of-view area as the first frame of image.
PCT/CN2020/077847 2019-12-12 2020-03-04 Time-of-flight-based depth camera and three-dimensional imaging method WO2021114497A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201911276033.8 2019-12-12
CN201911276033.8A CN111025329A (en) 2019-12-12 2019-12-12 Depth camera based on flight time and three-dimensional imaging method

Publications (1)

Publication Number Publication Date
WO2021114497A1 (en) 2021-06-17

Family

ID=70206365

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/077847 WO2021114497A1 (en) 2019-12-12 2020-03-04 Time-of-flight-based depth camera and three-dimensional imaging method

Country Status (2)

Country Link
CN (1) CN111025329A (en)
WO (1) WO2021114497A1 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114258176A (en) * 2021-12-31 2022-03-29 欧普照明股份有限公司 Lamp and lamp control method
JP2023144238A (en) * 2022-03-28 2023-10-11 ソニーセミコンダクタソリューションズ株式会社 ranging module

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140055771A1 (en) * 2012-02-15 2014-02-27 Mesa Imaging Ag Time of Flight Camera with Stripe Illumination
US8761594B1 (en) * 2013-02-28 2014-06-24 Apple Inc. Spatially dynamic illumination for camera systems
CN106226977A (en) * 2016-08-24 2016-12-14 深圳奥比中光科技有限公司 Laser projection module, image capturing system and control method thereof and device
CN106574964A (en) * 2014-12-22 2017-04-19 谷歌公司 Integrated camera system having two dimensional image capture and three dimensional time-of-flight capture with a partitioned field of view
CN106662640A (en) * 2014-12-22 2017-05-10 谷歌公司 Smart illumination time of flight system and method
WO2019017692A1 (en) * 2017-07-18 2019-01-24 엘지이노텍 주식회사 Tof module and object recognition device using tof module
CN208874676U (en) * 2018-07-30 2019-05-17 深圳阜时科技有限公司 A kind of image acquiring device, identity recognition device and electronic equipment

Family Cites Families (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6079772B2 (en) * 2012-03-28 2017-02-15 富士通株式会社 Imaging device
CN102927500A (en) * 2012-11-01 2013-02-13 深圳市通用科技有限公司 Elongated light-distributing light-emitting diode (LED) lamp
US20140139632A1 (en) * 2012-11-21 2014-05-22 Lsi Corporation Depth imaging method and apparatus with adaptive illumination of an object of interest
EP2941772B1 (en) * 2013-01-07 2020-07-01 Ascentia Imaging, Inc. Optical guidance system using mutually distinct signal-modifying sensors
CN103256527A (en) * 2013-01-30 2013-08-21 深圳市通用科技有限公司 All-round light-distribution LED lamp
CN103364961B (en) * 2013-08-02 2016-03-09 浙江大学 Based on the 3 D displaying method of many projected array and multilayer liquid crystal complex modulated
CN203550113U (en) * 2013-09-02 2014-04-16 广东志高暖通设备股份有限公司 Air conditioning and indoor line controller of air conditioning
CN103926699B (en) * 2014-01-17 2016-08-17 吉林大学 A kind of light emission angle modulation device that can be used for three-dimensional display pixel
KR102376134B1 (en) * 2014-05-30 2022-03-18 쓰리엠 이노베이티브 프로퍼티즈 컴파니 Optical systems having variable viewing angles
US9674415B2 (en) * 2014-12-22 2017-06-06 Google Inc. Time-of-flight camera system with scanning illuminator
CN204347348U (en) * 2014-12-26 2015-05-20 南京中科神光科技有限公司 A kind of many field detection camera lens
US9894257B2 (en) * 2015-05-13 2018-02-13 Apple Inc. Light source module with adjustable diffusion
CN105849628A (en) * 2016-03-23 2016-08-10 香港应用科技研究院有限公司 Phase modulator for holographic perspective display
DE102016124612A1 (en) * 2016-12-16 2018-06-21 Carl Zeiss Microscopy Gmbh Segmented optics for a lighting module for angle-selective lighting
CN107037577A (en) * 2017-05-10 2017-08-11 中国科学院苏州生物医学工程技术研究所 A kind of Structured Illumination optical system and method
CN109375237B (en) * 2018-12-12 2019-11-19 北京华科博创科技有限公司 A kind of all solid state face array three-dimensional imaging laser radar system
CN109782254A (en) * 2019-01-28 2019-05-21 上海禾赛光电科技有限公司 Scanning means and its scan method, laser radar
CN109991584A (en) * 2019-03-14 2019-07-09 深圳奥比中光科技有限公司 A kind of jamproof distance measurement method and depth camera
CN109991583A (en) * 2019-03-14 2019-07-09 深圳奥比中光科技有限公司 A kind of jamproof distance measurement method and depth camera
CN110346780A (en) * 2019-07-31 2019-10-18 炬佑智能科技(苏州)有限公司 Flight time sensing cameras and its local detection accuracy method of adjustment
CN110441785A (en) * 2019-08-19 2019-11-12 深圳奥锐达科技有限公司 Time flying distance measuring system


Also Published As

Publication number Publication date
CN111025329A (en) 2020-04-17

Similar Documents

Publication Publication Date Title
US11575843B2 (en) Image sensor modules including primary high-resolution imagers and secondary imagers
WO2021128587A1 (en) Adjustable depth measuring device and measuring method
EP3185037B1 (en) Depth imaging system
JP7290571B2 (en) Integrated LIDAR lighting output control
CN111142088B (en) Light emitting unit, depth measuring device and method
CN107885023B (en) Time-of-flight sensing for brightness and autofocus control in image projection devices
WO2021072802A1 (en) Distance measurement system and method
WO2021120403A1 (en) Depth measurement device and method
WO2021114497A1 (en) Time-of-flight-based depth camera and three-dimensional imaging method
US11762151B2 (en) Optical radar device
WO2020256924A8 (en) Hyperspectral imaging with minimal area monolithic image sensor
US20210041534A1 (en) Distance measurement module, distance measurement method, and electronic apparatus
JPWO2018211831A1 (en) Photodetectors and portable electronics
US20230258809A9 (en) Patterned illumination for three dimensional imaging
US11393183B2 (en) Integrated electronic module for 3D sensing applications, and 3D scanning device including the integrated electronic module
EP3226024B1 (en) Optical 3-dimensional sensing system and method of operation
KR101458696B1 (en) A fully or partially electronic-scanned high speed 3-dimensional laser scanner system using laser diode arrays
CN111766594A (en) Distance measuring device, distance measuring method, and signal processing method
JP2023532676A (en) Projector for diffuse illumination and structured light
US20220228857A1 (en) Projecting a structured light pattern from an apparatus having an oled display screen
US20220291502A1 (en) Optical scanning system
US11557054B1 (en) Method for performing region-of-interest-based depth detection with aid of pattern-adjustable projector, and associated apparatus
US20220413107A1 (en) Distance measurement apparatus and distance measurement system
EP4001989A1 (en) Electronic device and method for an electronic device
US11828587B2 (en) 3D sensing device, lighting module and control method thereof

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20900289

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20900289

Country of ref document: EP

Kind code of ref document: A1