WO2021120403A1 - Depth measurement device and measurement method - Google Patents

Depth measurement device and measurement method Download PDF

Info

Publication number
WO2021120403A1
WO2021120403A1 PCT/CN2020/077863 CN2020077863W
Authority
WO
WIPO (PCT)
Prior art keywords
light source
spot pattern
target object
light source array
Prior art date
Application number
PCT/CN2020/077863
Other languages
English (en)
French (fr)
Inventor
许星
Original Assignee
深圳奥比中光科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳奥比中光科技有限公司 filed Critical 深圳奥比中光科技有限公司
Publication of WO2021120403A1 publication Critical patent/WO2021120403A1/zh

Links

Images

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88 Lidar systems specially adapted for specific applications
    • G01S17/89 Lidar systems specially adapted for specific applications for mapping or imaging
    • G01S7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/4804 Auxiliary means for detecting or identifying lidar signals or the like, e.g. laser illuminators
    • G01S7/481 Constructional features, e.g. arrangements of optical elements
    • G01S7/4811 Constructional features, e.g. arrangements of optical elements common to transmitter and receiver
    • G01S7/4813 Housing arrangements
    • G01S7/51 Display arrangements

Definitions

  • This application relates to the field of optical measurement technology, and in particular to a depth measurement device and measurement method.
  • the depth measurement device can be used to obtain the depth image of the object, and can further perform 3D modeling, skeleton extraction, face recognition, etc., and has a very wide range of applications in the fields of 3D measurement and human-computer interaction.
  • the current depth measurement technology mainly includes TOF ranging technology and structured light ranging technology.
  • TOF (Time-of-Flight) ranging technology achieves precise ranging by measuring the round-trip flight time of light pulses between the transmitting/receiving device and the target object, and is divided into direct ranging technology and indirect ranging technology. Indirect ranging measures the phase delay of the reflected light signal relative to the emitted light signal and then calculates the flight time from that phase delay; depending on the type of modulation and demodulation, it can be divided into continuous-wave (CW) modulation-demodulation methods and pulse-modulated (PM) modulation-demodulation methods. TOF ranging does not require complex image processing calculations, offers a relatively long detection range, and maintains high accuracy.
  • CW continuous wave
  • PM Pulse Modulated
  • the structured light ranging technology emits a structured light beam to a space object, and then collects the structured light pattern formed by the structured light beam modulated and reflected by the object, and finally uses the triangulation method to perform depth calculation to obtain the depth data of the object.
  • Commonly used structured light patterns include irregular spot patterns, stripe patterns, phase shift patterns and so on.
  • Structured light technology has very high accuracy in short-range measurement and performs well in low-light environments, but it is easily affected in strong-light environments; by comparison, the anti-interference performance of TOF technology in strong-light environments is better than that of structured light technology.
  • the purpose of the present application is to provide a depth measurement device and measurement method to solve at least one of the above-mentioned background technical problems.
  • A depth measuring device includes a transmitting module, a receiving module, and a control and processing circuit respectively connected to the transmitting module and the receiving module; wherein the transmitting module includes a light source array, the light source array including at least two sub-light source arrays for emitting a first spot pattern beam or a second spot pattern beam toward a target object; wherein the first and second spot pattern beams are beams whose amplitude is modulated in time sequence; the receiving module includes a TOF image sensor, the TOF image sensor includes a pixel array, and the pixel array obtains intensity information after receiving the first spot pattern beam reflected by the target object, or obtains phase information after receiving the second spot pattern beam reflected by the target object; the control and processing circuit uses the intensity information to form a grayscale image and uses the grayscale image to calculate a structured light depth image of the target object, or uses the phase information to calculate a phase difference and calculates a TOF depth image of the target object based on the phase difference.
  • The transmitting module includes a light source array.
  • a driving circuit is further included, and the light source array emits light in groups or as a whole under the control of the driving circuit.
  • the light source array includes first and second sub-light source arrays; wherein, the first sub-light source array is a sparse light source array for emitting light beams with a sparse spot pattern; the second sub-light source array It is a dense light source array for emitting dense spot pattern beams.
  • the TOF image sensor includes at least one pixel, and each pixel includes two or more taps.
  • The taps are sequentially switched in a certain order within a single frame period to collect the corresponding photons, receive the light signal, and convert it into an electrical signal.
  • the amplitude of the light beam corresponding to each spot in the first and second spot pattern light beams is modulated by at least one of a continuous wave, a square wave, or a pulse mode in time sequence.
  • a depth measurement method includes the following steps:
  • Using a light source array, a first spot pattern beam or a second spot pattern beam is emitted toward a target object; wherein the first and second spot pattern beams are light beams whose amplitude is modulated in time sequence; the light source array includes at least two sub-light source arrays to respectively emit the first spot pattern beam and the second spot pattern beam;
  • the light source array includes at least one sparse light source array and at least one dense light source array; the sparse light source array is used to emit a sparse spot pattern light beam, and the dense light source array is used to emit a dense spot pattern light beam.
  • the light source arrays can group or collectively emit spot pattern light beams toward the target area.
  • the amplitude of the light beam corresponding to each spot in the spot pattern light beam is modulated by at least one of a continuous wave, a square wave, or a pulse mode in time sequence.
  • An electronic device comprising: a housing, a screen, and a depth measuring device; the depth measuring device includes a transmitting module, a receiving module, and a control and processing circuit respectively connected to the transmitting module and the receiving module; the transmitting module includes a light source array, the light source array including at least two sub-light source arrays for emitting a first spot pattern beam or a second spot pattern beam toward a target object; wherein the first and second spot pattern beams are beams whose amplitude is modulated in time sequence.
  • The receiving module includes a TOF image sensor, the TOF image sensor includes a pixel array, and the pixel array obtains intensity information after receiving the first spot pattern beam reflected by the target object, or obtains phase information after receiving the second spot pattern beam reflected by the target object; the control and processing circuit uses the intensity information to form a grayscale image and uses the grayscale image to calculate a structured light depth image of the target object, or uses the phase information to calculate a phase difference and calculates a TOF depth image of the target object based on the phase difference;
  • The transmitting module and the receiving module of the depth measuring device are arranged on the first plane of the electronic device, for emitting toward the target object a spot pattern beam whose amplitude is modulated in time sequence and for receiving the spot pattern beam reflected by the target object; the screen is installed on the second plane of the electronic device for displaying information such as images or text; the first plane and the second plane are the same plane, or the first plane and the second plane are opposite planes.
  • An embodiment of the application provides a depth measurement device, including: a transmitting module, which includes a light source array, the light source array including at least two sub-light source arrays to respectively emit first and second spot pattern beams, wherein the first and second spot pattern beams are beams whose amplitude is modulated in time sequence; a receiving module, which includes a TOF image sensor, the sensor including a pixel array, the pixel array receiving the first spot pattern beam reflected by the target object to obtain intensity information, or receiving the second spot pattern beam reflected by the target object to obtain phase information; and a control and processing circuit, which uses the intensity information to form a grayscale image and uses the grayscale image to calculate a structured light depth image, or uses the phase information to calculate a phase difference and calculates a TOF depth image based on the phase difference.
  • The depth measurement device of the present application can be reasonably modulated based on different requirements such as measurement range, test environment, and test accuracy, so as to meet the needs of different application scenarios while ensuring the miniaturization and simplicity of the device.
  • Fig. 1 is a schematic diagram of a depth measuring device according to an embodiment of the present application.
  • Fig. 2 is a schematic diagram of a light source array of a depth measuring device according to an embodiment of the present application.
  • Fig. 3 is a schematic diagram of the principle of a depth measuring device according to an embodiment of the present application.
  • Fig. 4 is a flowchart of a depth measurement method according to an embodiment of the present application.
  • Fig. 5 is a schematic diagram of an electronic device integrated with the depth measuring device of Fig. 1 according to an embodiment of the present application.
  • In addition, a connection can serve either a fixing function or a circuit connection function.
  • The terms "first" and "second" are used only for descriptive purposes and cannot be understood as indicating or implying relative importance or implicitly indicating the number of the indicated technical features. Therefore, the features defined with "first" and "second" may explicitly or implicitly include one or more of these features.
  • “plurality” means two or more, unless otherwise specifically defined.
  • the depth measuring device 10 includes a transmitting module 11, a receiving module 12, and a control and processing circuit 13.
  • the transmitting module 11 is configured to emit a light beam 30 to the target object 20, the transmitted light beam 30 is a spot pattern light beam whose amplitude is modulated in time sequence, and the spot pattern light beam is emitted into the target space to illuminate the target object 20 in the space; At least part of the emitted light beam 30 is reflected by the target object 20 to form a reflected light beam 40, and at least part of the reflected light beam 40 is received by the receiving module 12;
  • The control and processing circuit 13 is connected to the transmitting module 11 and the receiving module 12, respectively, to control the emission and reception of the light beam, to receive the information generated when the receiving module 12 receives the reflected light beam, and to compute the depth information of the target object from that information.
  • the emission module 11 includes a light source array 111, an optical element 112, a light source driver (not shown in the figure), and the like.
  • the light source array 111 may be a light source array composed of multiple light sources such as light emitting diodes (LED), edge emitting lasers (EEL), vertical cavity surface emitting lasers (VCSEL), etc.
  • The light beams emitted by the light sources may be visible light, infrared light, ultraviolet light, and so on.
  • Preferably, the light source array 111 is an irregularly arranged VCSEL array for emitting an irregular spot pattern beam.
  • The light source array 111 is controlled by the light source driver (which may in turn be controlled by the control and processing circuit 13) to emit light beams whose amplitude is modulated in a certain time sequence.
  • For example, in one embodiment, the light source array 111, under the control of the light source driver, emits pulse-modulated beams, square-wave-modulated beams, sine-wave-modulated beams, and other beams at a certain frequency.
  • the amplitude of the beam corresponding to each spot in the irregular spot pattern beam is modulated in a continuous wave, square wave or pulse manner in time sequence. It is understandable that, in some embodiments, a part of the control and processing circuit 13 or a sub-circuit independent of the control and processing circuit 13 may be used to control the light source array 111 to emit related light beams, such as a pulse signal generator.
  • The optical element 112 receives the light beam from the light source array 111 and projects the spot pattern beam outward. In some embodiments, the optical element 112 is also used to expand the received beam so as to enlarge the field of view. It can be understood that the amplitude of the beam shaped by the optical element 112 is still modulated in the same time sequence; that is, if the incident beam is sine-wave modulated, the outgoing beam is still sine-wave modulated.
  • the optical element 112 may be one or a combination of a lens unit, a diffractive optical element (DOE), a microlens array, and a liquid crystal.
  • DOE diffractive optical element
  • The receiving module 12 includes a TOF image sensor 121, a filter unit 122, and a lens unit 123; the lens unit 123 receives at least part of the spot pattern beam reflected by the target object and images it onto at least part of the TOF image sensor 121, and the filter unit 122 is a narrow-band filter matched to the wavelength of the light source, used to suppress background light noise in other wavelength bands.
  • the TOF image sensor 121 can be an image sensor array composed of charge coupled devices (CCD), complementary metal oxide semiconductors (CMOS), avalanche diodes (AD), single photon avalanche diodes (SPAD), etc.
  • The size of the array represents the resolution of the depth camera, such as 320×240.
  • The TOF image sensor 121 is connected to a readout circuit (not shown in the figure) composed of one or more of a signal amplifier, a time-to-digital converter (TDC), an analog-to-digital converter (ADC), and other devices.
  • Generally, the TOF image sensor 121 includes at least one pixel. Compared with a traditional image sensor used only for photography, each pixel of the TOF image sensor 121 contains two or more taps (a tap stores and reads out, or discharges, the charge signal generated by incident photons under the control of the corresponding electrode). For example, with 2 taps, the taps are switched in a certain order within a single frame period (or single exposure time) to collect the corresponding photons, receiving the optical signal and converting it into an electrical signal.
  • The control and processing circuit 13 can be an independent dedicated circuit, such as a dedicated SoC chip, FPGA chip, or ASIC chip composed of a CPU, memory, bus, etc., or it can include a general-purpose processing circuit; for example, when the depth measuring device is integrated into a smart terminal such as a mobile phone, TV, or computer, the processing circuit in the terminal can serve as at least part of the control and processing circuit 13.
  • the control and processing circuit 13 is used to provide a modulation signal (transmission signal) required when the light source array 111 emits laser light, and the light source emits a light beam to a target object under the control of the modulation signal.
  • For example, in one embodiment, the modulation signal is a square-wave signal or a pulse signal.
  • Under this modulation, the amplitude of the light source is modulated in time sequence so as to emit a square-wave or pulsed optical signal.
  • The control and processing circuit 13 also provides the demodulation signal (collection signal) for each tap in each pixel of the TOF image sensor 121, and the taps, under the control of the demodulation signal, collect the electrical signal generated by the reflected light beam returned from the target object.
  • the electrical signal contains the intensity information of the reflected light beam.
  • The control and processing circuit 13 then processes the electrical signal, calculates the intensity information reflecting the intensity of the reflected light beam to form a gray-scale pattern, and finally performs image matching, triangulation, and other calculations based on the gray-scale pattern to obtain the depth image of the target object.
  • In addition, each tap in each pixel of the TOF image sensor 121 collects, under the control of the demodulation signal, the electrical signal generated by the reflected light beam returned from the target object.
  • The electrical signal contains the phase information of the reflected light beam; the control and processing circuit 13 then processes the electrical signal, calculates the phase difference of the light beam from emission to reception, calculates the flight time of the light beam based on the phase difference, and further obtains the depth image of the target object.
  • the depth measurement device 10 may also include a drive circuit, a power supply, a color camera, an infrared camera, an IMU, and other devices, which are not shown in the figure.
  • The combination with these devices can achieve richer functions, such as 3D texture modeling, infrared face recognition, and SLAM.
  • the depth measuring device 10 may be embedded in electronic products such as mobile phones, tablet computers, and computers.
  • Fig. 2 is a schematic diagram of a light source array according to an embodiment of the present application.
  • the light source array 111 is composed of a plurality of sub-light sources arranged on a single substrate (or multiple substrates), and the sub-light sources are arranged on the substrate in an irregular form.
  • the substrate may be a semiconductor substrate, a metal substrate, etc.
  • The sub-light source may be a light-emitting diode, an edge-emitting laser, a vertical-cavity surface-emitting laser (VCSEL), etc.; preferably, the light source array 111 is a VCSEL array chip composed of a plurality of VCSEL sub-light sources arranged on a semiconductor substrate.
  • the sub-light source is used to emit light beams of any wavelength, such as visible light, infrared light, and ultraviolet light.
  • the light source array 111 emits light under the modulation drive of the driving circuit (which may be a part of the processing circuit 13), such as continuous wave modulation, pulse modulation, and the like.
  • the light source array 111 emits light in groups or as a whole under the control of the driving circuit (the driving circuit may also be controlled by the control and processing circuit 13).
  • The light source array 111 includes a first sub-light source array 201 (represented by hollow circles in FIG. 2), a second sub-light source array 202 (represented by circles with vertical lines in FIG. 2), etc.; the driving circuit includes a first driving circuit and a second driving circuit.
  • the first sub-light source array 201 is a sparse light source array, which emits sparse spot pattern light beams toward the target area under the control of the first driving circuit;
  • The second sub-light source array 202 is a dense light source array, which emits a dense spot pattern beam toward the target area under the control of the second driving circuit.
  • the light source array 111 may also include a third sub-light source array, a fourth sub-light source array, etc., which are not particularly limited in the example of the present application.
  • Fig. 3 is a schematic diagram of the principle of a depth measuring device according to an embodiment of the present application.
  • The control and processing circuit 13 controls the light source array 111 to emit, in groups or as a whole, a spot pattern beam 30 whose amplitude is modulated by a square wave or pulses toward the target object. It is understandable that the light sources in the light source array 111 are modulated in the same manner. In some other embodiments, a sine wave may also be used to modulate the amplitude of the emitted light beam.
  • control and processing circuit 13 controls the first light source array 201 to emit the sparse spot pattern light beam 301 to the target object, and the amplitude of each light spot 302 is modulated by a square wave or pulse in time sequence.
  • Each pixel of the TOF image sensor in the receiving module 12 includes 4 taps, which are used to collect the optical signal four times within a single frame period and convert it into electrical signals C1, C2, C3, and C4, respectively; the four acquisitions have the same duration and interval.
  • The control and processing circuit 13 receives the electrical signals C1, C2, C3, and C4 to calculate the intensity information of the spot pattern beam.
  • In one embodiment, the intensity information is calculated according to formula (1) of the original filing (reproduced there only as an image).
  • After the intensity information of all pixels has been obtained, the gray-scale pattern can be formed; finally, the grayscale image is used for matching calculation to obtain the parallax, and the depth image is calculated from the parallax.
  • When an ambient light signal is present, this way of calculating the beam intensity, like the traditional approach, cannot eliminate it, resulting in a low signal-to-noise ratio of the final gray-scale pattern. Therefore, in one embodiment, the intensity information is calculated according to formula (2) of the original filing (likewise reproduced only as an image).
  • A gray-scale image is generated from the intensity information of the spot pattern beam calculated by formula (2); matching calculation is then performed on the gray-scale image to obtain the parallax, and the depth image is calculated from the parallax.
  • The aforementioned gray-pattern acquisition scheme based on a 4-tap TOF image sensor and square-wave or pulse-modulated light emission signals is also applicable to TOF image sensors with other numbers of taps and to other types of depth measurement devices that modulate the light emission signal. It is understandable that, compared with traditional structured light depth measurement, the present application uses the transmitting end to emit a time-sequence-modulated speckle projection beam and adopts a multi-tap pixel collection method at the receiving end, so that this method can offer more functions than the traditional solution, such as achieving resistance to environmental interference in depth measurement, which is difficult for traditional solutions.
  • The control and processing circuit 13 can also control the second light source array 202 to emit a dense spot pattern beam toward the target object, wherein the amplitude of each spot is also modulated in time sequence by a square wave or pulses, and the spot pattern beam reflected by the target object is received by the TOF image sensor in the receiving module 12.
  • Each tap in each pixel of the TOF image sensor 121 outputs the electrical signal generated by collecting the reflected light beam under the control of the demodulation signal.
  • the electrical signal is related to the phase of the reflected light beam.
  • The control and processing circuit 13 then processes the electrical signal to calculate the phase difference, calculates from the phase difference the flight time of the reflected light beam from the transmitting end to the receiving end, and further calculates the depth image of the target object based on the flight time.
  • The first light source array 201 and the second light source array 202 can also be controlled to simultaneously emit an even denser spot pattern beam toward the target object.
  • Each tap in each pixel of the TOF image sensor collects the electrical signal caused by the reflected beam; the electrical signal is processed to calculate the phase difference of the beam from emission to reception, and the flight time and the depth map of the target object are calculated based on the phase difference, which can effectively improve the system resolution.
  • In yet another embodiment, when the target object to be measured is close, the light source array is controlled to emit a sparse spot pattern beam toward the target object.
  • The TOF image sensor generates a grayscale image after receiving the reflected beam, and the control and processing circuit calculates the structured light depth image from the grayscale image.
  • When the target object to be measured is far away, the light source array is controlled to emit a dense spot pattern beam toward the target object.
  • The TOF image sensor receives the reflected beam to obtain the phase information, and the control and processing circuit calculates the phase difference of the beam from emission to reception and computes the TOF depth image.
  • Figure 4 is a flow chart of the depth measurement method of this application, which specifically includes the following steps:
  • The light source array is used to emit a light beam whose amplitude is modulated in time sequence, and after receiving the light beam, the optical element emits a spot-patterned beam toward the target object; the light source array includes at least two sub-light source arrays that respectively emit the first spot pattern beam and the second spot pattern beam toward the target object.
  • the light source array includes at least one sparse light source array and at least one dense light source array.
  • Under the control of the control and processing circuit, the light source arrays can emit spot pattern beams toward the target area in groups or collectively; the sparse light source array emits a sparse spot pattern beam, and the dense light source array emits a dense spot pattern beam; the light beam emitted by the light source array has its amplitude modulated in a certain time sequence under the control of the control and processing circuit.
  • Specifically, the amplitude of the light beam corresponding to each spot in the modulated spot pattern beam is modulated in time sequence by at least one of a continuous wave, a square wave, or a pulse.
  • each pixel in the pixel array includes at least two taps; preferably, each pixel includes 4 taps.
  • the taps are sequentially switched in a certain order to collect corresponding photons, which are used to receive optical signals and convert them into electrical signals.
  • the control and processing circuit provides the demodulated signal (collected signal) of each tap in each pixel.
  • the TOF image sensor receives the sparsely patterned light beam reflected by the target object, and the tap receives the optical signal under the control of the demodulation signal and generates an electrical signal, the electrical signal includes the intensity information of the reflected light; or, the TOF image sensor receives the target The dense patterned light beam reflected by the object, the tap receives the optical signal under the control of the demodulation signal and generates an electrical signal, which includes the phase information of the reflected light.
  • control and processing circuit receives the electrical signal containing the intensity information to obtain a grayscale image corresponding to the sparse spot patterned beam, and uses the grayscale image combined with matching algorithms, triangulation, etc. to obtain a structured light depth image of the target object; or , The control and processing circuit receives the electrical signal containing phase information to calculate the phase difference of the dense spot patterned beam from emission to reception, and calculates the flight time according to the phase difference, and further obtains the TOF depth map of the target object according to the flight time.
  • the depth measurement device is reasonably modulated based on different requirements such as measurement range, test environment, test accuracy, etc., to meet the needs of different application scenarios, while ensuring the miniaturization and simplicity of the equipment.
  • an electronic device is also provided.
  • the electronic device may be a mobile phone, a tablet, a computer, a TV, a smart helmet, smart glasses, a robot, and so on.
  • the electronic device 500 includes a housing 51, a screen 52, and the depth measuring device described in the foregoing embodiment; the screen 52 is used for information display;
  • the housing 51 can provide protection functions such as dustproof, waterproof, and drop-proof for the electronic device.
  • The transmitting module 11 and the receiving module 12 of the depth measuring device are arranged on the first plane of the electronic device 500, and are used to emit toward the target object a spot pattern beam whose amplitude is modulated in time sequence and to receive the spot pattern beam reflected by the target object; the screen 52 is installed on the second plane of the electronic device for displaying information such as images or text; the first plane and the second plane are the same plane, or the first plane and the second plane are opposite planes.
  • By integrating the depth measuring device into electronic equipment, the functions of the electronic equipment are continuously expanded and its applications become more and more extensive; for example, the modulation can be reasonably configured according to different requirements such as measurement range, test environment, and test accuracy, so that the electronic equipment can meet the needs of different application scenarios.

Abstract

The present application discloses a depth measurement device, comprising: a transmitting module, which includes a light source array, the light source array including at least two sub-light source arrays for respectively emitting first and second spot pattern beams, wherein the first and second spot pattern beams are beams whose amplitude is modulated in time sequence; a receiving module, which includes a TOF image sensor, the sensor including a pixel array, the pixel array receiving the first spot pattern beam reflected by a target object to obtain intensity information, or receiving the second spot pattern beam reflected by the target object to obtain phase information; and a control and processing circuit, which uses the intensity information to form a grayscale image and uses the grayscale image to calculate a structured light depth image, or uses the phase information to calculate a phase difference and calculates a TOF depth image based on the phase difference. Based on different measurement requirements, the present application can perform reasonable modulation, and can therefore meet the needs of different application scenarios.

Description

Depth measurement device and measurement method
This application claims priority to the Chinese patent application No. 201911305980.5, entitled "Depth measurement device and measurement method" and filed with the Chinese Patent Office on December 18, 2019, the entire contents of which are incorporated herein by reference.
Technical Field
This application relates to the field of optical measurement technology, and in particular to a depth measurement device and measurement method.
Background
A depth measurement device can be used to obtain a depth image of an object and further perform 3D modeling, skeleton extraction, face recognition, and the like; it has very wide applications in fields such as 3D measurement and human-computer interaction. Current depth measurement technologies mainly include TOF ranging technology and structured light ranging technology.
TOF stands for Time-of-Flight. TOF ranging technology achieves precise ranging by measuring the round-trip flight time of light pulses between the transmitting/receiving device and the target object, and is divided into direct ranging technology and indirect ranging technology. Indirect ranging measures the phase delay of the reflected light signal relative to the emitted light signal and then calculates the flight time from that phase delay; depending on the type of modulation and demodulation, it can be divided into continuous-wave (Continuous Wave, CW) modulation-demodulation methods and pulse-modulated (Pulse Modulated, PM) modulation-demodulation methods. TOF ranging does not require complex image processing calculations, offers a relatively long detection range, and maintains high accuracy.
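For orientation only (this relation is not reproduced from the original filing, and the symbols below are illustrative assumptions), the indirect ranging principle just described is commonly written as

$$ t = \frac{\Delta\varphi}{2\pi f_{\mathrm{mod}}}, \qquad d = \frac{c\, t}{2} = \frac{c\, \Delta\varphi}{4\pi f_{\mathrm{mod}}}, $$

where Δφ is the measured phase delay of the reflected signal, f_mod is the modulation frequency, c is the speed of light, and the factor of 2 accounts for the round trip between the device and the target.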
Structured light ranging technology emits a structured light beam toward an object in space, then collects the structured light pattern formed by the structured light beam after it has been modulated and reflected by the object, and finally performs depth calculation by triangulation to obtain the depth data of the object. Commonly used structured light patterns include irregular spot patterns, stripe patterns, phase-shift patterns, and so on.
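As background for the triangulation step mentioned above (a textbook relation rather than a formula quoted from this application; the symbols are assumptions), the depth Z of a matched spot is typically recovered from the measured disparity d, the focal length f of the receiving lens, and the baseline B between projector and camera:

$$ Z = \frac{f \cdot B}{d}. $$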
Structured light technology has very high accuracy in short-range measurement and performs well in low-light environments, but it is easily affected in strong-light environments; by comparison, the anti-interference performance of TOF technology in strong-light environments is better than that of structured light technology.
Summary of the Invention
The purpose of the present application is to provide a depth measurement device and measurement method to solve at least one of the problems in the background art described above.
To achieve the above purpose, the technical solutions of the embodiments of the present application are implemented as follows:
A depth measurement device includes a transmitting module, a receiving module, and a control and processing circuit connected to the transmitting module and the receiving module, respectively. The transmitting module includes a light source array, the light source array including at least two sub-light source arrays, for emitting a first spot pattern beam or a second spot pattern beam toward a target object, wherein the first and second spot pattern beams are beams whose amplitude is modulated in time sequence. The receiving module includes a TOF image sensor, the TOF image sensor including a pixel array, the pixel array obtaining intensity information after receiving the first spot pattern beam reflected by the target object, or obtaining phase information after receiving the second spot pattern beam reflected by the target object. The control and processing circuit uses the intensity information to form a grayscale image and uses the grayscale image to calculate a structured light depth image of the target object, or uses the phase information to calculate a phase difference and calculates a TOF depth image of the target object based on the phase difference.
In some embodiments, a driving circuit is further included, and the light source array emits light in groups or as a whole under the control of the driving circuit.
In some embodiments, the light source array includes first and second sub-light source arrays; the first sub-light source array is a sparse light source array for emitting a sparse spot pattern beam, and the second sub-light source array is a dense light source array for emitting a dense spot pattern beam.
In some embodiments, the TOF image sensor includes at least one pixel, each pixel contains two or more taps, and the taps are switched in a certain order within a single frame period to collect the corresponding photons, receive the optical signal, and convert it into an electrical signal.
In some embodiments, the amplitude of the beam corresponding to each spot in the first and second spot pattern beams is modulated in time sequence by at least one of a continuous wave, a square wave, or a pulse.
Another technical solution of the embodiments of the present application is as follows:
A depth measurement method includes the following steps:
S1: using a light source array to emit a first spot pattern beam or a second spot pattern beam toward a target object, wherein the first and second spot pattern beams are beams whose amplitude is modulated in time sequence, and the light source array includes at least two sub-light source arrays for respectively emitting the first spot pattern beam and the second spot pattern beam;
S2: using a pixel array in a TOF image sensor to obtain intensity information after receiving the first spot pattern beam reflected by the target object, or to obtain phase information after receiving the second spot pattern beam reflected by the target object;
S3: receiving the intensity information to form a grayscale image and using the grayscale image to calculate a structured light depth image of the target object, or receiving the phase information, calculating a phase difference, and calculating a TOF depth image of the target object based on the phase difference.
In some embodiments, the light source array includes at least one sparse light source array and at least one dense light source array; the sparse light source array is used to emit a sparse spot pattern beam, and the dense light source array is used to emit a dense spot pattern beam.
In some embodiments, under the control of the control and processing circuit, the light source array can emit spot pattern beams toward the target area in groups or jointly.
In some embodiments, the amplitude of the beam corresponding to each spot in the spot pattern beam is modulated in time sequence by at least one of a continuous wave, a square wave, or a pulse.
A further technical solution of the embodiments of the present application is as follows:
An electronic device includes a housing, a screen, and a depth measurement device. The depth measurement device includes a transmitting module, a receiving module, and a control and processing circuit connected to the transmitting module and the receiving module, respectively. The transmitting module includes a light source array, the light source array including at least two sub-light source arrays, for emitting a first spot pattern beam or a second spot pattern beam toward a target object, wherein the first and second spot pattern beams are beams whose amplitude is modulated in time sequence. The receiving module includes a TOF image sensor, the TOF image sensor including a pixel array, the pixel array obtaining intensity information after receiving the first spot pattern beam reflected by the target object, or obtaining phase information after receiving the second spot pattern beam reflected by the target object. The control and processing circuit uses the intensity information to form a grayscale image and uses the grayscale image to calculate a structured light depth image of the target object, or uses the phase information to calculate a phase difference and calculates a TOF depth image of the target object based on the phase difference.
The transmitting module and the receiving module of the depth measurement device are arranged on a first plane of the electronic device, for emitting toward the target object a spot pattern beam whose amplitude is modulated in time sequence and for receiving the spot pattern beam reflected by the target object; the screen is mounted on a second plane of the electronic device and is used to display information such as images or text; the first plane and the second plane are the same plane, or the first plane and the second plane are opposite planes.
An embodiment of the present application provides a depth measurement device, including: a transmitting module, which includes a light source array, the light source array including at least two sub-light source arrays for respectively emitting first and second spot pattern beams, wherein the first and second spot pattern beams are beams whose amplitude is modulated in time sequence; a receiving module, which includes a TOF image sensor, the sensor including a pixel array, the pixel array receiving the first spot pattern beam reflected by a target object to obtain intensity information, or receiving the second spot pattern beam reflected by the target object to obtain phase information; and a control and processing circuit, which uses the intensity information to form a grayscale image and uses the grayscale image to calculate a structured light depth image, or uses the phase information to calculate a phase difference and calculates a TOF depth image based on the phase difference. The depth measurement device of the present application can be reasonably modulated according to different requirements such as measurement range, test environment, and test accuracy, so as to meet the needs of different application scenarios while keeping the device compact and simple.
Brief Description of the Drawings
In order to explain the technical solutions in the embodiments of the present application or in the prior art more clearly, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below are only some embodiments of the present application; for those of ordinary skill in the art, other drawings can be obtained from these drawings without creative effort.
Fig. 1 is a schematic diagram of a depth measurement device according to an embodiment of the present application.
Fig. 2 is a schematic diagram of the light source array of a depth measurement device according to an embodiment of the present application.
Fig. 3 is a schematic diagram of the principle of a depth measurement device according to an embodiment of the present application.
Fig. 4 is a flowchart of a depth measurement method according to an embodiment of the present application.
Fig. 5 is a schematic diagram of an electronic device integrating the depth measurement device of Fig. 1 according to an embodiment of the present application.
Detailed Description
In order to make the technical problems to be solved by the embodiments of the present application, the technical solutions, and the beneficial effects clearer, the present application is further described in detail below with reference to the drawings and embodiments. It should be understood that the specific embodiments described here are only used to explain the present application and are not intended to limit it.
It should be noted that when an element is described as being "fixed to" or "arranged on" another element, it may be directly on the other element or indirectly on the other element. When an element is described as being "connected to" another element, it may be directly connected to the other element or indirectly connected to it. In addition, a connection may serve a fixing function or an electrical connection function.
It should be understood that orientation or position terms such as "length", "width", "upper", "lower", "front", "rear", "left", "right", "vertical", "horizontal", "top", "bottom", "inner", and "outer" are based on the orientations or positional relationships shown in the drawings; they are only intended to facilitate and simplify the description of the embodiments of the present application, and do not indicate or imply that the referenced device or element must have a specific orientation or be constructed and operated in a specific orientation, and therefore cannot be construed as limiting the present application.
In addition, the terms "first" and "second" are used only for descriptive purposes and cannot be understood as indicating or implying relative importance or implicitly indicating the number of the indicated technical features. Therefore, features defined with "first" and "second" may explicitly or implicitly include one or more of such features. In the description of the embodiments of the present application, "plurality" means two or more, unless otherwise specifically defined.
Referring to Fig. 1, as an embodiment of the present application, a depth measurement device is provided. The depth measurement device 10 includes a transmitting module 11, a receiving module 12, and a control and processing circuit 13. The transmitting module 11 is configured to emit a light beam 30 toward a target object 20; the emitted light beam 30 is a spot pattern beam whose amplitude is modulated in time sequence, and the spot pattern beam is emitted into the target space to illuminate the target object 20 in that space. At least part of the emitted light beam 30 is reflected by the target object 20 to form a reflected light beam 40, and at least part of the reflected light beam 40 is received by the receiving module 12. The control and processing circuit 13 is connected to the transmitting module 11 and the receiving module 12, respectively, to control the emission and reception of the light beam, to receive the information generated when the receiving module 12 receives the reflected light beam, and to compute the depth information of the target object from that information.
The transmitting module 11 includes a light source array 111, an optical element 112, a light source driver (not shown in the figure), and the like. The light source array 111 may be a light source array composed of multiple light sources such as light-emitting diodes (LED), edge-emitting lasers (EEL), or vertical-cavity surface-emitting lasers (VCSEL), and the light beams emitted by the light sources may be visible light, infrared light, ultraviolet light, and so on. Preferably, the light source array 111 is an irregularly arranged VCSEL array for emitting an irregular spot pattern beam. The light source array 111, under the control of the light source driver (which may in turn be controlled by the control and processing circuit 13), emits light beams whose amplitude is modulated in a certain time sequence; for example, in one embodiment, the light source array 111 emits pulse-modulated beams, square-wave-modulated beams, sine-wave-modulated beams, and other beams at a certain frequency under the control of the light source driver. In one embodiment of the present application, the amplitude of the beam corresponding to each spot in the irregular spot pattern beam is modulated in time sequence in a continuous-wave, square-wave, or pulsed manner. It can be understood that, in some embodiments, part of the control and processing circuit 13, or a sub-circuit independent of the control and processing circuit 13, such as a pulse signal generator, may be used to control the light source array 111 to emit the relevant beams.
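The following minimal Python sketch illustrates what "amplitude modulated in a certain time sequence" can look like for the drive signal of such a light source array; it is not code from the application, and the function name, sample rate, and duty cycle are assumptions chosen for illustration.

```python
import numpy as np

def drive_waveform(kind, f_mod, fs, n_samples, duty=0.1):
    """Illustrative amplitude envelopes for the emitted beam.

    kind:  'square', 'pulse', or 'sine' amplitude modulation.
    f_mod: modulation frequency in Hz; fs: sample rate in Hz.
    """
    t = np.arange(n_samples) / fs
    frac = (f_mod * t) % 1.0                 # position within each modulation period
    if kind == "square":
        return (frac < 0.5).astype(float)    # 50% duty square wave
    if kind == "pulse":
        return (frac < duty).astype(float)   # short pulses (duty << 0.5)
    if kind == "sine":
        return 0.5 * (1.0 + np.sin(2 * np.pi * f_mod * t))  # non-negative sine envelope
    raise ValueError(f"unknown modulation kind: {kind}")

# Example: a 100 MHz square-wave envelope sampled at 1 GS/s
envelope = drive_waveform("square", f_mod=100e6, fs=1e9, n_samples=1000)
```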
The optical element 112 receives the light beam from the light source array 111 and projects the spot pattern beam outward. In some embodiments, the optical element 112 is also used to expand the received beam so as to enlarge the field of view. It can be understood that the amplitude of the beam shaped by the optical element 112 is still modulated in the same time sequence; that is, if the incident beam is a sine-wave-modulated beam, the outgoing beam is still a sine-wave-modulated beam. The optical element 112 may be one of, or a combination of, a lens unit, a diffractive optical element (DOE), a microlens array, and a liquid crystal element.
The receiving module 12 includes a TOF image sensor 121, a filter unit 122, and a lens unit 123. The lens unit 123 receives at least part of the spot pattern beam reflected back by the target object and images it onto at least part of the TOF image sensor 121; the filter unit 122 is a narrow-band filter matched to the wavelength of the light source and is used to suppress background light noise in other wavelength bands. The TOF image sensor 121 may be an image sensor array composed of charge-coupled devices (CCD), complementary metal-oxide-semiconductor devices (CMOS), avalanche diodes (AD), single-photon avalanche diodes (SPAD), and the like; the array size represents the resolution of the depth camera, for example 320×240. Generally, the TOF image sensor 121 is also connected to a readout circuit (not shown in the figure) composed of one or more of a signal amplifier, a time-to-digital converter (TDC), an analog-to-digital converter (ADC), and other devices.
Generally, the TOF image sensor 121 includes at least one pixel. Compared with a traditional image sensor used only for photography, each pixel of the TOF image sensor 121 contains two or more taps (a tap stores and reads out, or discharges, the charge signal generated by incident photons under the control of the corresponding electrode). For example, with 2 taps, the taps are switched in a certain order within a single frame period (or single exposure time) to collect the corresponding photons, so as to receive the optical signal and convert it into an electrical signal.
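As a toy model of the tap mechanism described above (assumed behaviour, not taken from the patent), the sketch below shows a 2-tap pixel integrating the incident photon flux into two charges C1 and C2 by opening each tap on alternating halves of the demodulation period within one frame:

```python
import numpy as np

def two_tap_accumulate(photon_flux, f_mod, fs):
    """Accumulate one frame of incident flux into the charges of a 2-tap pixel.

    photon_flux: 1-D array of incident intensity samples for one pixel.
    f_mod: demodulation frequency in Hz; fs: sample rate of photon_flux in Hz.
    """
    t = np.arange(len(photon_flux)) / fs
    tap1_open = ((f_mod * t) % 1.0) < 0.5     # tap 1 open in the first half of each period
    c1 = photon_flux[tap1_open].sum()         # charge collected by tap 1
    c2 = photon_flux[~tap1_open].sum()        # charge collected by tap 2
    return c1, c2
```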
The control and processing circuit 13 may be an independent dedicated circuit, such as a dedicated SoC chip, FPGA chip, or ASIC chip comprising a CPU, memory, bus, and so on, or it may include a general-purpose processing circuit; for example, when the depth measurement device is integrated into a smart terminal such as a mobile phone, TV, or computer, the processing circuit in the terminal can serve as at least part of the control and processing circuit 13.
The control and processing circuit 13 is used to provide the modulation signal (emission signal) required when the light source array 111 emits laser light, and the light source emits a beam toward the target object under the control of the modulation signal. For example, in one embodiment, the modulation signal is a square-wave signal or a pulse signal, and under this modulation the amplitude of the light source is modulated in time sequence so as to emit a square-wave or pulsed optical signal.
The control and processing circuit 13 also provides the demodulation signal (acquisition signal) for each tap in each pixel of the TOF image sensor 121. Under the control of the demodulation signal, the taps collect the electrical signal generated by the reflected beam returned from the target object; this electrical signal contains the intensity information of the reflected beam. The control and processing circuit 13 then processes the electrical signal, calculates the intensity information reflecting the intensity of the reflected beam to form a gray-scale pattern, and finally performs image matching, triangulation, and other calculations based on the gray-scale pattern to obtain the depth image of the target object.
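The structured-light branch described in this paragraph (grayscale image, matching, triangulation) can be sketched as follows; the block-matching strategy, the stored reference pattern, and all parameter names are assumptions for illustration, not the matching algorithm actually used by the application.

```python
import numpy as np

def structured_light_depth(gray, reference, focal_px, baseline_m, block=7, max_disp=64):
    """Match blocks of the captured spot image against a reference pattern along
    the horizontal (epipolar) direction, then triangulate depth from disparity."""
    h, w = gray.shape
    half = block // 2
    depth = np.zeros((h, w))
    for y in range(half, h - half):
        for x in range(half + max_disp, w - half):
            patch = gray[y - half:y + half + 1, x - half:x + half + 1]
            costs = [np.abs(patch - reference[y - half:y + half + 1,
                                              x - d - half:x - d + half + 1]).sum()
                     for d in range(max_disp)]
            d = int(np.argmin(costs))          # disparity with the lowest matching cost
            if d > 0:
                depth[y, x] = focal_px * baseline_m / d   # Z = f * B / d
    return depth
```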
In addition, each tap in each pixel of the TOF image sensor 121 may, under the control of the demodulation signal, collect the electrical signal generated by the reflected beam returned from the target object, where this electrical signal contains the phase information of the reflected beam. The control and processing circuit 13 then processes the electrical signal, calculates the phase difference of the beam from emission to reception, calculates the flight time of the beam based on the phase difference, and further obtains the depth image of the target object.
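A minimal sketch of the TOF branch described here, assuming a generic 4-tap continuous-wave demodulation with taps at 0°, 90°, 180°, and 270° (the actual tap timing and formulas of the application are not specified in this text):

```python
import numpy as np

C_LIGHT = 3.0e8  # speed of light in m/s

def four_tap_depth(c1, c2, c3, c4, f_mod):
    """Per-pixel depth from four tap charges under CW modulation at f_mod (Hz)."""
    phase = np.arctan2(c2 - c4, c1 - c3)      # phase difference between emitted and received signal
    phase = np.mod(phase, 2.0 * np.pi)        # wrap into [0, 2*pi)
    tof = phase / (2.0 * np.pi * f_mod)       # flight time
    return C_LIGHT * tof / 2.0                # halve for the round trip
```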
In some embodiments, the depth measurement device 10 may further include a driving circuit, a power supply, a color camera, an infrared camera, an IMU, and other devices, which are not shown in the figure; combining these devices can achieve richer functions, such as 3D texture modeling, infrared face recognition, and SLAM. The depth measurement device 10 can be embedded in electronic products such as mobile phones, tablets, and computers.
Fig. 2 is a schematic diagram of a light source array according to an embodiment of the present application. The light source array 111 is composed of a plurality of sub-light sources arranged on a single substrate (or on multiple substrates), and the sub-light sources are arranged on the substrate in an irregular pattern. The substrate may be a semiconductor substrate, a metal substrate, or the like, and the sub-light sources may be light-emitting diodes, edge-emitting lasers, vertical-cavity surface-emitting lasers (VCSEL), and the like; preferably, the light source array 111 is a VCSEL array chip composed of a plurality of VCSEL sub-light sources arranged on a semiconductor substrate. The sub-light sources are used to emit light beams of any wavelength, such as visible light, infrared light, or ultraviolet light. The light source array 111 emits light under the modulation drive of the driving circuit (which may be part of the processing circuit 13), for example continuous-wave modulation or pulse modulation.
In an embodiment of the present application, the light source array 111 emits light in groups or as a whole under the control of the driving circuit (the driving circuit may itself be controlled by the control and processing circuit 13). For example, the light source array 111 includes a first sub-light source array 201 (represented by hollow circles in Fig. 2), a second sub-light source array 202 (represented by circles with vertical lines in Fig. 2), and so on, and the driving circuit includes a first driving circuit and a second driving circuit. The first sub-light source array 201 is a sparse light source array, which emits a sparse spot pattern beam toward the target area under the control of the first driving circuit; the second sub-light source array 202 is a dense light source array, which emits a dense spot pattern beam toward the target area under the control of the second driving circuit. It can be understood that the light source array 111 may also include a third sub-light source array, a fourth sub-light source array, and so on, which are not particularly limited in the examples of the present application.
Fig. 3 is a schematic diagram of the principle of a depth measurement device according to an embodiment of the present application. The control and processing circuit 13 controls the light source array 111 to emit, in groups or as a whole, a spot pattern beam 30 whose amplitude is modulated by a square wave or pulses toward the target object. It can be understood that the light sources in the light source array 111 are modulated in the same manner; in some other embodiments, a sine wave may also be used to modulate the amplitude of the emitted beam.
In one embodiment, the control and processing circuit 13 controls the first light source array 201 to emit a sparse spot pattern beam 301 toward the target object, and the amplitude of each light spot 302 is modulated in time sequence by a square wave or pulses. Each pixel of the TOF image sensor in the receiving module 12 includes 4 taps, which are used to collect the optical signal four times within a single frame period and convert it into electrical signals C1, C2, C3, and C4, respectively; the four acquisitions have the same duration and interval.
The control and processing circuit 13 receives the electrical signals C1, C2, C3, and C4 and calculates the intensity information of the spot pattern beam. In one embodiment, the intensity information is calculated according to the following formula:
[Formula (1) is shown only as an image (PCTCN2020077863-appb-000001) in the original document and is not reproduced here.]
After the intensity information of all pixels has been obtained, the gray-scale pattern can be formed; finally, matching calculation is performed on the grayscale image to obtain the parallax, and the depth image is calculated from the parallax.
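Since formula (1) is available only as an image, the expression below is offered purely as a plausible textbook form of the intensity of a 4-tap pixel, an assumption rather than the formula actually used in the application:

$$ I_{(1)} = C_1 + C_2 + C_3 + C_4, $$

a sum that still contains the ambient-light contribution, which is the limitation discussed in the next paragraph.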
When an ambient light signal is present, this way of calculating the beam intensity, like the traditional approach, cannot eliminate it, so the signal-to-noise ratio of the final gray-scale pattern is low. Therefore, in one embodiment, the intensity information is calculated according to the following formula:
[Formula (2) is shown only as an image (PCTCN2020077863-appb-000002) in the original document and is not reproduced here.]
A grayscale image is generated from the intensity information of the spot pattern beam calculated by formula (2); matching calculation is then performed on the grayscale image to obtain the parallax, and the depth image is calculated from the parallax.
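Again as an assumption only (formula (2) itself is shown only as an image), a common ambient-rejecting choice for 4-tap acquisition is the demodulation amplitude

$$ I_{(2)} = \frac{\sqrt{(C_1 - C_3)^2 + (C_2 - C_4)^2}}{2}, $$

in which a constant ambient offset cancels in the differences; this is consistent with the stated purpose of formula (2) but may differ from the exact expression in the original filing.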
The above gray-pattern acquisition scheme based on a 4-tap TOF image sensor and square-wave or pulse-modulated light emission signals is also applicable to TOF image sensors with other numbers of taps and to depth measurement devices with other types of modulated light emission signals. It can be understood that, compared with traditional structured light depth measurement, the present application uses the transmitting end to emit a time-sequence-modulated spot projection beam and uses a multi-tap pixel acquisition method at the receiving end, so that this method can offer more functions than the traditional solution, for example achieving resistance to environmental interference in depth measurement, which is difficult for traditional solutions.
Similarly, the control and processing circuit 13 may also control the second light source array 202 to emit a dense spot pattern beam toward the target object, where the amplitude of each light spot is likewise modulated in time sequence by a square wave or pulses, and the spot pattern beam reflected by the target object is received by the TOF image sensor in the receiving module 12. Each tap in each pixel of the TOF image sensor 121 outputs, under the control of the demodulation signal, the electrical signal generated by collecting the reflected beam; this electrical signal is related to the phase of the reflected beam. The control and processing circuit 13 then processes the electrical signal to calculate the phase difference, calculates from the phase difference the flight time of the beam from emission at the transmitting end to reception at the receiving end, and further calculates the depth image of the target object based on this flight time.
In another embodiment, under the control of the control and processing circuit 13, the first light source array 201 and the second light source array 202 may also be controlled to simultaneously emit an even denser spot pattern beam toward the target object. Each tap in each pixel of the TOF image sensor collects the electrical signal caused by the reflected beam, the electrical signal is processed to calculate the phase difference of the beam from emission to reception, and the flight time and the depth map of the target object are calculated based on the phase difference, which can effectively improve the system resolution.
In yet another embodiment of the present application, when the target object to be measured is close, the light source array is controlled to emit a sparse spot pattern beam toward the target object; the TOF image sensor generates a grayscale image after receiving the reflected beam, and the control and processing circuit calculates a structured light depth image from the grayscale image.
When the target object to be measured is far away, the light source array is controlled to emit a dense spot pattern beam toward the target object; the TOF image sensor receives the reflected beam and obtains the phase information, and the control and processing circuit calculates the phase difference of the beam from emission to reception and computes a TOF depth image.
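The two embodiments above imply a simple mode-selection rule; the sketch below makes it explicit, with the distance threshold being a hypothetical value, since the text only distinguishes "close" from "far":

```python
def choose_mode(estimated_range_m, near_threshold_m=1.0):
    """Pick the emitting sub-array and processing pipeline for one measurement."""
    if estimated_range_m < near_threshold_m:
        return {"sub_array": "sparse", "pipeline": "structured_light"}
    return {"sub_array": "dense", "pipeline": "tof_phase"}
```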
As another embodiment of the present application, a depth measurement method is also provided. Referring to Fig. 4, which is a flowchart of the depth measurement method of the present application, the method specifically includes the following steps:
S1: using the light source array to emit a beam whose amplitude is modulated in time sequence; after receiving the beam, the optical element emits a spot-patterned beam toward the target object; the light source array includes at least two sub-light source arrays, which respectively emit the first spot pattern beam and the second spot pattern beam toward the target object.
Specifically, the light source array includes at least one sparse light source array and at least one dense light source array; under the control of the control and processing circuit, the light source array can emit spot pattern beams toward the target area in groups or jointly, where the sparse light source array emits a sparse spot pattern beam and the dense light source array emits a dense spot pattern beam. The beam emitted by the light source array has its amplitude modulated in a certain time sequence under the control of the control and processing circuit; specifically, the amplitude of the beam corresponding to each spot in the modulated spot pattern beam is modulated in time sequence by at least one of a continuous wave, a square wave, or a pulse.
S2: using the pixel array in the TOF image sensor to obtain intensity information after receiving the first spot pattern beam reflected by the target object, or to obtain phase information after receiving the second spot pattern beam reflected by the target object.
Specifically, each pixel in the pixel array includes at least two taps; preferably, each pixel includes 4 taps. Within a single frame period (or single exposure time), the taps are switched in a certain order to collect the corresponding photons, so as to receive the optical signal and convert it into an electrical signal. The control and processing circuit provides the demodulation signal (acquisition signal) for each tap in each pixel.
Specifically, the TOF image sensor receives the sparsely patterned beam reflected back by the target object, and the taps receive the optical signal and generate an electrical signal under the control of the demodulation signal, where the electrical signal includes the intensity information of the reflected light; or, the TOF image sensor receives the densely patterned beam reflected back by the target object, and the taps receive the optical signal and generate an electrical signal under the control of the demodulation signal, where the electrical signal includes the phase information of the reflected light.
S3: receiving the intensity information to form a grayscale image and using the grayscale image to calculate a structured light depth image of the target object; or receiving the phase information, calculating a phase difference, and calculating a TOF depth image of the target object based on the phase difference.
Specifically, the control and processing circuit receives the electrical signal containing the intensity information to obtain the grayscale image corresponding to the sparse spot-patterned beam, and uses the grayscale image together with matching algorithms, triangulation, and the like to obtain the structured light depth image of the target object; or, the control and processing circuit receives the electrical signal containing the phase information, calculates the phase difference of the dense spot-patterned beam from emission to reception, calculates the flight time from the phase difference, and further obtains the TOF depth map of the target object from the flight time.
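Tying steps S1 to S3 together, the following sketch dispatches one frame of tap data to either branch, reusing the illustrative functions sketched earlier in this description (all names and the frame layout are assumptions, not an interface defined by the application):

```python
def depth_frame(taps, mode, f_mod, focal_px, baseline_m, reference):
    """taps: tuple of per-pixel charge images (C1, C2, C3, C4) for one frame."""
    c1, c2, c3, c4 = taps
    if mode == "structured_light":
        gray = c1 + c2 + c3 + c4                       # intensity image from the sparse pattern
        return structured_light_depth(gray, reference, focal_px, baseline_m)
    return four_tap_depth(c1, c2, c3, c4, f_mod)       # phase-based TOF depth from the dense pattern
```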
In the embodiments of the present application, the depth measurement device performs reasonable modulation according to different requirements such as measurement range, test environment, and test accuracy, so as to meet the needs of different application scenarios while keeping the device compact and simple.
As yet another embodiment of the present application, an electronic device is also provided. The electronic device may be a mobile phone, a tablet, a computer, a TV, a smart helmet, smart glasses, a robot, and so on. Referring to Fig. 5 and taking a mobile phone as an example, the electronic device 500 includes a housing 51, a screen 52, and the depth measurement device described in the foregoing embodiments; the screen 52 is used for information display, and the housing 51 can provide the electronic device with protection functions such as dust-proofing, water-proofing, and drop-proofing.
Specifically, the transmitting module 11 and the receiving module 12 of the depth measurement device are arranged on a first plane of the electronic device 500 and are used to emit toward the target object a spot pattern beam whose amplitude is modulated in time sequence and to receive the spot pattern beam reflected by the target object; the screen 52 is mounted on a second plane of the electronic device and is used to display information such as images or text; the first plane and the second plane are the same plane, or the first plane and the second plane are opposite planes.
By integrating the depth measurement device into electronic devices such as mobile phones, tablets, computers, TVs, smart helmets, smart glasses, and robots, the functions of the electronic devices are continuously expanded and their applications become increasingly widespread; for example, the modulation can be reasonably configured according to different requirements such as measurement range, test environment, and test accuracy, so that the electronic device can meet the needs of different application scenarios.
It can be understood that the above content is a further detailed description of the present application in combination with specific/preferred embodiments, and it cannot be concluded that the specific implementation of the present application is limited to these descriptions. For those of ordinary skill in the art to which the present application belongs, several substitutions or modifications may also be made to the described embodiments without departing from the concept of the present application, and these substitutions or modifications should all be regarded as falling within the protection scope of the present application. In the description of this specification, reference to terms such as "one embodiment", "some embodiments", "preferred embodiment", "example", "specific example", or "some examples" means that a specific feature, structure, material, or characteristic described in connection with that embodiment or example is included in at least one embodiment or example of the present application.
In this specification, the schematic representations of the above terms do not necessarily refer to the same embodiment or example. Moreover, the described specific features, structures, materials, or characteristics may be combined in a suitable manner in any one or more embodiments or examples. In addition, without mutual contradiction, those skilled in the art may combine the different embodiments or examples described in this specification and the features of the different embodiments or examples. Although the embodiments of the present application and their advantages have been described in detail, it should be understood that various changes, substitutions, and alterations can be made herein without departing from the scope defined by the appended claims.
Furthermore, the scope of the present application is not intended to be limited to the specific embodiments of the processes, machines, manufacture, compositions of matter, means, methods, and steps described in the specification. One of ordinary skill in the art will readily appreciate that presently existing or later-developed processes, machines, manufacture, compositions of matter, means, methods, or steps that perform substantially the same function or achieve substantially the same result as the corresponding embodiments described herein may be utilized in accordance with the above disclosure. Accordingly, the appended claims are intended to include such processes, machines, manufacture, compositions of matter, means, methods, or steps within their scope.

Claims (10)

  1. A depth measurement device, characterized in that it comprises a transmitting module, a receiving module, and a control and processing circuit connected to the transmitting module and the receiving module, respectively; wherein,
    the transmitting module includes a light source array, the light source array including at least two sub-light source arrays, for emitting a first spot pattern beam or a second spot pattern beam toward a target object; wherein the first and second spot pattern beams are beams whose amplitude is modulated in time sequence;
    the receiving module includes a TOF image sensor, the TOF image sensor including a pixel array, the pixel array obtaining intensity information after receiving the first spot pattern beam reflected by the target object, or obtaining phase information after receiving the second spot pattern beam reflected by the target object;
    the control and processing circuit uses the intensity information to form a grayscale image and uses the grayscale image to calculate a structured light depth image of the target object; or uses the phase information to calculate a phase difference and calculates a TOF depth image of the target object based on the phase difference.
  2. The depth measurement device according to claim 1, characterized in that it further comprises a driving circuit, and the light source array emits light in groups or as a whole under the control of the driving circuit.
  3. The depth measurement device according to claim 1, characterized in that the light source array includes first and second sub-light source arrays; the first sub-light source array is a sparse light source array for emitting a sparse spot pattern beam, and the second sub-light source array is a dense light source array for emitting a dense spot pattern beam.
  4. The depth measurement device according to claim 1, characterized in that the TOF image sensor includes at least one pixel, each pixel contains two or more taps, and the taps are switched in a certain order within a single frame period to collect the corresponding photons, receive the optical signal, and convert it into an electrical signal.
  5. The depth measurement device according to claim 1, characterized in that the amplitude of the beam corresponding to each spot in the first and second spot pattern beams is modulated in time sequence by at least one of a continuous wave, a square wave, or a pulse.
  6. A depth measurement method, characterized in that it comprises the following steps:
    S1: using a light source array to emit a first spot pattern beam or a second spot pattern beam toward a target object; wherein the first and second spot pattern beams are beams whose amplitude is modulated in time sequence; the light source array includes at least two sub-light source arrays for respectively emitting the first spot pattern beam and the second spot pattern beam;
    S2: using a pixel array in a TOF image sensor to obtain intensity information after receiving the first spot pattern beam reflected by the target object, or to obtain phase information after receiving the second spot pattern beam reflected by the target object;
    S3: receiving the intensity information to form a grayscale image and using the grayscale image to calculate a structured light depth image of the target object; or receiving the phase information, calculating a phase difference, and calculating a TOF depth image of the target object based on the phase difference.
  7. The depth measurement method according to claim 6, characterized in that the light source array includes at least one sparse light source array and at least one dense light source array; the sparse light source array is used to emit a sparse spot pattern beam, and the dense light source array is used to emit a dense spot pattern beam.
  8. The depth measurement method according to claim 6, characterized in that, under the control of a control and processing circuit, the light source array can emit spot pattern beams toward the target area in groups or jointly.
  9. The depth measurement method according to claim 6, characterized in that the amplitude of the beam corresponding to each spot in the spot pattern beam is modulated in time sequence by at least one of a continuous wave, a square wave, or a pulse.
  10. An electronic device, comprising: a housing, a screen, and a depth measurement device; the depth measurement device includes a transmitting module, a receiving module, and a control and processing circuit connected to the transmitting module and the receiving module, respectively; the transmitting module includes a light source array, the light source array including at least two sub-light source arrays, for emitting a first spot pattern beam or a second spot pattern beam toward a target object; wherein the first and second spot pattern beams are beams whose amplitude is modulated in time sequence; the receiving module includes a TOF image sensor, the TOF image sensor including a pixel array, the pixel array obtaining intensity information after receiving the first spot pattern beam reflected by the target object, or obtaining phase information after receiving the second spot pattern beam reflected by the target object; the control and processing circuit uses the intensity information to form a grayscale image and uses the grayscale image to calculate a structured light depth image of the target object, or uses the phase information to calculate a phase difference and calculates a TOF depth image of the target object based on the phase difference;
    wherein the transmitting module and the receiving module of the depth measurement device are arranged on a first plane of the electronic device, for emitting toward the target object a spot pattern beam whose amplitude is modulated in time sequence and for receiving the spot pattern beam reflected by the target object; the screen is mounted on a second plane of the electronic device and is used to display information such as images or text; the first plane and the second plane are the same plane, or the first plane and the second plane are opposite planes.
PCT/CN2020/077863 2019-12-18 2020-03-04 Depth measurement device and measurement method WO2021120403A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201911305980.5A CN111123289B (zh) 2019-12-18 2019-12-18 Depth measurement device and measurement method
CN201911305980.5 2019-12-18

Publications (1)

Publication Number Publication Date
WO2021120403A1 true WO2021120403A1 (zh) 2021-06-24

Family

ID=70498382

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/077863 WO2021120403A1 (zh) 2019-12-18 2020-03-04 一种深度测量装置及测量方法

Country Status (2)

Country Link
CN (1) CN111123289B (zh)
WO (1) WO2021120403A1 (zh)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117607837A (zh) * 2024-01-09 2024-02-27 苏州识光芯科技术有限公司 Sensor array, distance measurement device and method

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111693149A (zh) * 2020-06-23 2020-09-22 广东小天才科技有限公司 Temperature measurement method and apparatus, wearable device, and medium
CN114355384B (zh) * 2020-07-07 2024-01-02 柳州阜民科技有限公司 Time-of-flight (TOF) system and electronic device
CN112255639B (zh) * 2020-12-23 2021-09-03 杭州蓝芯科技有限公司 Region-of-interest depth perception sensor and depth perception sensing module
CN113052887A (zh) * 2021-03-24 2021-06-29 奥比中光科技集团股份有限公司 Depth calculation method and system
CN113534596B (zh) * 2021-07-13 2022-09-27 盛景智能科技(嘉兴)有限公司 RGBD stereo camera and imaging method

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20190014977A (ko) * 2017-08-04 2019-02-13 엘지이노텍 주식회사 ToF module
CN110333501A (zh) * 2019-07-12 2019-10-15 深圳奥比中光科技有限公司 Depth measurement device and distance measurement method
CN110456379A (zh) * 2019-07-12 2019-11-15 深圳奥比中光科技有限公司 Fused depth measurement device and distance measurement method
CN110471080A (zh) * 2019-07-12 2019-11-19 深圳奥比中光科技有限公司 Depth measurement device based on a TOF image sensor

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107424188B (zh) * 2017-05-19 2020-06-30 深圳奥比中光科技有限公司 Structured light projection module based on a VCSEL array light source
CN108490634B (zh) * 2018-03-23 2019-12-13 深圳奥比中光科技有限公司 Structured light projection module and depth camera

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20190014977A (ko) * 2017-08-04 2019-02-13 엘지이노텍 주식회사 ToF module
CN110333501A (zh) * 2019-07-12 2019-10-15 深圳奥比中光科技有限公司 Depth measurement device and distance measurement method
CN110456379A (zh) * 2019-07-12 2019-11-15 深圳奥比中光科技有限公司 Fused depth measurement device and distance measurement method
CN110471080A (zh) * 2019-07-12 2019-11-19 深圳奥比中光科技有限公司 Depth measurement device based on a TOF image sensor

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117607837A (zh) * 2024-01-09 2024-02-27 苏州识光芯科技术有限公司 Sensor array, distance measurement device and method
CN117607837B (zh) * 2024-01-09 2024-04-16 苏州识光芯科技术有限公司 Sensor array, distance measurement device and method

Also Published As

Publication number Publication date
CN111123289A (zh) 2020-05-08
CN111123289B (zh) 2023-03-24

Similar Documents

Publication Publication Date Title
WO2021120403A1 (zh) 一种深度测量装置及测量方法
WO2021128587A1 (zh) 一种可调的深度测量装置及测量方法
WO2021120402A1 (zh) 一种融合的深度测量装置及测量方法
WO2021008209A1 (zh) 深度测量装置及距离测量方法
CN111142088B (zh) 一种光发射单元、深度测量装置和方法
CN110596721B (zh) 双重共享tdc电路的飞行时间距离测量系统及测量方法
WO2021051477A1 (zh) 一种直方图可调的飞行时间距离测量系统及测量方法
EP3185037B1 (en) Depth imaging system
WO2021072802A1 (zh) 一种距离测量系统及方法
US9807369B2 (en) 3D imaging apparatus
CN110596725B (zh) 基于插值的飞行时间测量方法及测量系统
CN110596724B (zh) 动态直方图绘制飞行时间距离测量方法及测量系统
CN111722241B (zh) 一种多线扫描距离测量系统、方法及电子设备
WO2021238212A1 (zh) 一种深度测量装置、方法及电子设备
CN109343070A (zh) 时间飞行深度相机
CN110596723B (zh) 动态直方图绘制飞行时间距离测量方法及测量系统
WO2021169531A1 (zh) 一种ToF深度测量装置、控制ToF深度测量装置的方法及电子设备
WO2021238213A1 (zh) 一种基于tof的深度测量装置、方法及电子设备
CN110221274A (zh) 时间飞行深度相机及多频调制解调的距离测量方法
CN209894976U (zh) 时间飞行深度相机及电子设备
CN110221273A (zh) 时间飞行深度相机及单频调制解调的距离测量方法
WO2020237764A1 (zh) 激光雷达装置
CN109283508A (zh) 飞行时间计算方法
CN110596720A (zh) 距离测量系统
WO2022241942A1 (zh) 一种深度相机及深度计算方法

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20900902

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20900902

Country of ref document: EP

Kind code of ref document: A1