WO2021128587A1 - Adjustable depth measurement device and measurement method - Google Patents

Adjustable depth measurement device and measurement method

Info

Publication number
WO2021128587A1
WO2021128587A1 · PCT/CN2020/077865 · CN2020077865W
Authority
WO
WIPO (PCT)
Prior art keywords
zoom
focal length
light beam
light source
projection
Prior art date
Application number
PCT/CN2020/077865
Other languages
English (en)
French (fr)
Inventor
王兆民
Original Assignee
深圳奥比中光科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳奥比中光科技有限公司 filed Critical 深圳奥比中光科技有限公司
Publication of WO2021128587A1 publication Critical patent/WO2021128587A1/zh

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S 17/02 Systems using the reflection of electromagnetic waves other than radio waves
    • G01S 17/06 Systems determining position data of a target
    • G01S 17/08 Systems determining position data of a target for measuring distance only
    • G01S 17/32 Systems determining position data of a target for measuring distance only using transmission of continuous waves, whether amplitude-, frequency-, or phase-modulated, or unmodulated
    • G01S 17/36 Systems determining position data of a target for measuring distance only using transmission of continuous waves, with phase comparison between the received signal and the contemporaneously transmitted signal
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S 17/88 Lidar systems specially adapted for specific applications
    • G01S 17/89 Lidar systems specially adapted for specific applications for mapping or imaging
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S 7/48 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S 7/481 Constructional features, e.g. arrangements of optical elements
    • G01S 7/4814 Constructional features, e.g. arrangements of optical elements of transmitters alone
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S 7/48 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S 7/481 Constructional features, e.g. arrangements of optical elements
    • G01S 7/4814 Constructional features, e.g. arrangements of optical elements of transmitters alone
    • G01S 7/4815 Constructional features, e.g. arrangements of optical elements of transmitters alone using multiple transmitters
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S 7/48 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S 7/481 Constructional features, e.g. arrangements of optical elements
    • G01S 7/4816 Constructional features, e.g. arrangements of optical elements of receivers alone

Definitions

  • This application relates to the field of optical measurement technology, and in particular to an adjustable depth measurement device and measurement method.
  • A depth measurement device can be used to obtain a depth image of an object, which can then be used for 3D modeling, skeleton extraction, face recognition, and so on; such devices have a very wide range of applications in 3D measurement and human-computer interaction.
  • Current depth measurement technologies mainly include TOF ranging, structured-light ranging, and binocular (stereo) ranging.
  • TOF (time-of-flight) ranging achieves precise distance measurement by measuring the round-trip flight time of light pulses between the transmitting/receiving device and the target object, and is divided into direct ranging and indirect ranging. Direct ranging continuously sends light pulses toward the target object, uses a sensor to receive the light signal reflected back from the object, and obtains the target distance by detecting the (round-trip) flight time of the transmitted and reflected light pulses.
  • Indirect ranging emits toward the target object a light beam whose amplitude is modulated over time, measures the phase delay of the reflected beam relative to the emitted beam, and then calculates the flight time from that phase delay.
  • Depending on the modulation and demodulation scheme, indirect ranging can be divided into continuous-wave (CW) methods and pulse-modulated (PM) methods.
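  • As an illustration of the indirect (continuous-wave) principle described above, the following minimal sketch shows how a phase delay recovered from four phase-stepped samples is converted into a distance; the tap values and the 20 MHz modulation frequency are illustrative assumptions, not values from this application.

```python
import math

C = 299_792_458.0  # speed of light, m/s

def cw_depth(a0, a1, a2, a3, f_mod_hz):
    """Recover one-way distance from four phase-stepped tap values (0/90/180/270 deg)."""
    phase = math.atan2(a1 - a3, a0 - a2) % (2 * math.pi)  # phase delay in [0, 2*pi)
    round_trip_time = phase / (2 * math.pi * f_mod_hz)    # seconds
    return C * round_trip_time / 2                        # metres

# Illustrative numbers: 20 MHz modulation, taps chosen so the phase delay is pi/2,
# i.e. one quarter of the unambiguous range c / (2 * f_mod), which is about 7.5 m.
print(round(cw_depth(10.0, 16.0, 10.0, 4.0, 20e6), 3))    # ~1.874
```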
  • Structured-light ranging projects a structured light beam onto the target area, collects the reflected structured light beam to form a structured-light pattern, and finally calculates the depth image of the target object by triangulation.
  • Commonly used structured light patterns include irregular spot patterns, stripe patterns, phase shift patterns and so on.
  • Structured light technology has the characteristics of high resolution, high precision, and low power consumption.
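  • As an illustration of the triangulation step mentioned above, the following minimal sketch applies the classic depth = baseline x focal length / disparity relation; the baseline, focal length, and disparity values are illustrative assumptions.

```python
def triangulate_depth(baseline_m, focal_px, disparity_px):
    """Classic structured-light / stereo relation: depth = b * f / d."""
    return baseline_m * focal_px / disparity_px

# Example: 50 mm baseline, 800 px focal length, 10 px disparity -> 4 m.
print(triangulate_depth(0.05, 800.0, 10.0))  # 4.0
```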
  • Because depth measurement devices are used in a wide variety of scenarios, very high demands are placed on their measurement range and measurement accuracy; however, the internal structure of the device limits its measurement range, and the measurement accuracy achievable at different ranges is also affected.
  • The purpose of this application is to provide an adjustable depth measurement device and measurement method to solve at least one of the problems described in the background above.
  • An adjustable depth measurement device includes a transmitting unit, a receiving unit, and a control and processing circuit. The transmitting unit includes a light source array and a zoom projection lens; the light source array includes at least two sub-light-source arrays, each of which is used to emit a spot-pattern beam; the zoom projection lens is configured to receive the beam and project it onto a target area, and the field angle of the projected beam is changed by changing the focal length of the zoom projection lens.
  • The receiving unit includes a TOF image sensor and a zoom imaging lens. The TOF image sensor is configured to collect at least part of the light beam reflected back from the target area and form an electrical signal; the zoom imaging lens is configured to project the reflected beam onto the TOF image sensor, and the field angle over which the TOF image sensor collects the reflected beam is changed by changing the focal length of the zoom imaging lens. The control and processing circuit is connected to the transmitting unit and the receiving unit and calculates a depth image of the target area from the electrical signal.
  • A driver is further included, and the control and processing circuit controls the driver to adjust the focal lengths of the zoom projection lens and the zoom imaging lens.
  • The control and processing circuit stores a constraint condition on the relationship between the focal length of the zoom projection lens and the focal length of the zoom imaging lens, and controls the adjustment of both focal lengths according to this constraint condition.
  • The zoom projection lens is configured to have a first projection focal length and a second projection focal length, the first projection focal length being smaller than the second projection focal length; the first projection field angle at which the beam is projected through the zoom projection lens onto the target area is greater than the second projection field angle.
  • The zoom imaging lens is configured to have a first imaging focal length and a second imaging focal length, the first imaging focal length being smaller than the second imaging focal length; the first imaging field angle at which the TOF image sensor collects the reflected beam through the zoom imaging lens is greater than the second imaging field angle.
  • When the first sub-light-source array projects a beam toward the target area, the zoom projection lens and the zoom imaging lens are configured to have the second projection focal length and the second imaging focal length, projecting the beam within the second projection field angle and collecting the reflected beam within the second imaging field angle; when the second sub-light-source array projects a beam toward the target area, the zoom projection lens and the zoom imaging lens are configured to have the first projection focal length and the first imaging focal length, projecting the beam within the first projection field angle and collecting the reflected beam within the first imaging field angle.
  • The numbers of light sources in the sub-light-source arrays are unequal, and each sub-array can be controlled individually; the light sources within a sub-array are arranged irregularly.
  • The TOF image sensor includes at least one pixel, and each pixel includes at least two taps; the taps collect the reflected beam in a fixed order within a single frame period and generate electrical signals.
  • The control and processing circuit receives and processes the electrical signal, calculates the intensity information of the reflected beam, generates a structured-light image, and calculates a depth image of the target area from the structured-light image; alternatively, the control and processing circuit receives and processes the electrical signal, calculates the phase difference between the emitted beam and the received reflected beam, and further calculates the depth image of the target area from that phase difference.
  • A depth measurement method includes the following steps:
  • The transmitting unit is controlled to project a beam onto the target area. The transmitting unit includes a light source array and a zoom projection lens; the light source array includes at least two sub-light-source arrays, each of which is used to emit a spot-pattern beam; the zoom projection lens is configured to receive the beam and project it onto the target area, and the field angle of the projected beam is changed by changing the focal length of the zoom projection lens.
  • The receiving unit is controlled to collect at least part of the beam reflected back from the target area and form an electrical signal. The receiving unit includes a TOF image sensor and a zoom imaging lens; the TOF image sensor is configured to collect at least part of the reflected beam and form an electrical signal; the zoom imaging lens is configured to project the reflected beam onto the TOF image sensor, and the field angle over which the TOF image sensor collects the reflected beam is changed by changing the focal length of the zoom imaging lens.
  • The control and processing circuit receives the electrical signal, calculates the depth image of the target area from the electrical signal, and completes the depth measurement.
  • The zoom projection lens is configured to have a first projection focal length and a second projection focal length, the first projection focal length being smaller than the second projection focal length; the first projection field angle at which the beam is projected through the zoom projection lens onto the target area is greater than the second projection field angle.
  • The zoom imaging lens is configured to have a first imaging focal length and a second imaging focal length, the first imaging focal length being smaller than the second imaging focal length; the first imaging field angle at which the reflected beam is collected through the zoom imaging lens is greater than the second imaging field angle.
  • An embodiment of the application provides an adjustable depth measurement device including a transmitting unit, a receiving unit, and a control and processing circuit. The transmitting unit includes a light source array and a zoom projection lens; the light source array includes at least two sub-light-source arrays used to emit spot-pattern beams; the zoom projection lens receives the beam and projects it onto the target area, and changing the focal length of the zoom projection lens changes the field angle of the projected beam. The receiving unit includes a TOF image sensor and a zoom imaging lens; the zoom imaging lens projects the reflected beam onto the TOF image sensor, where it is collected to form an electrical signal, and changing the focal length of the zoom imaging lens changes the field angle over which the TOF image sensor collects the reflected beam. The control and processing circuit calculates the depth image of the target area from the electrical signal.
  • By adjusting the focal lengths, the device of this application has a more flexible and variable depth of field, thereby achieving a larger depth-measurement range and improving ranging accuracy.
  • Fig. 1 is a schematic structural diagram of an adjustable depth measuring device according to an embodiment of the present application.
  • FIGS. 2a and 2b are schematic diagrams of a light source array of an adjustable depth measurement device according to an embodiment of the present application.
  • Fig. 3 is a schematic diagram of the principle of an adjustable depth measuring device according to an embodiment of the present application.
  • Fig. 4 is a flowchart of a depth measurement method according to another embodiment of the present application.
  • A connection may serve either a fixing purpose or an electrical-connection purpose.
  • first and second are only used for descriptive purposes, and cannot be understood as indicating or implying relative importance or implicitly indicating the number of indicated technical features. Therefore, the features defined with “first” and “second” may explicitly or implicitly include one or more of these features.
  • “plurality” means two or more than two, unless otherwise specifically defined.
  • FIG. 1 is a schematic diagram of an adjustable depth measuring device according to an embodiment of the present application.
  • The depth measurement device 10 includes a transmitting unit 11, a receiving unit 12, and a control and processing circuit 13.
  • The transmitting unit 11 is used to emit a light beam 30 toward the target area 20; the beam is emitted into the target space to illuminate the target object 20 in that space, at least part of the emitted beam 30 is reflected by the target area 20 to form a reflected beam 40, and at least part of the reflected beam 40 is received by the receiving unit 12.
  • The control and processing circuit 13 is connected to the transmitting unit 11 and the receiving unit 12 to control the emission and reception of the beam; it also receives the information generated by the receiving unit 12 from the reflected beam and processes that information to obtain the depth information of the target object.
  • the light emitting unit 11 includes a light source 111, an optical element 112, a zoom projection lens 113, a light source driver (not shown in the figure), and the like.
  • the light source 111 may be a light emitting diode (LED), an edge emitting laser (EEL), a vertical cavity surface emitting laser (VCSEL), etc., or may be a light source array composed of multiple light sources for emitting a spot light beam toward a target area.
  • the arrangement of the light source 111 may be regular or irregular, and the light beam emitted by the light source 111 may be visible light, infrared light, ultraviolet light, or the like.
  • the light source 111 emits light beams outwardly under the control of the light source driver (which may be further controlled by the control and processing circuit 13).
  • the light source 111 emits a light beam whose amplitude is modulated under the control of the control and processing circuit 13.
  • the beam can be a pulse modulated beam, a square wave modulated beam or a sine wave modulated beam. It is understandable that a part of the control and processing circuit 13 or a sub-circuit independent of the control and processing circuit 13 can be used to control the light source 111 to emit related light beams, such as a pulse signal generator.
  • the optical element 112 receives the light beam from the light source 111, shapes it and projects it to the target area.
  • The optical element 112 receives the pulsed beam from the light source 111, optically modulates it, for example by diffraction, refraction, or reflection, and then emits the modulated beam into space, for example as a focused beam, a flood beam, or a spot-pattern beam.
  • the optical element 112 may be a lens, a liquid crystal element, a diffractive optical element, a micro lens array, a meta-surface optical element, a mask, a mirror, a MEMS galvanometer, etc., in one or more combinations.
  • In one embodiment of this application, the spot-pattern beam emitted by the irregularly arranged light source array passes through the optical element 112 and is then projected onto the target area as a flood beam or a spot-pattern beam.
  • the zoom projection lens 113 is configured to receive the light beam emitted by the light source and project the light beam to a target area, and the field angle of the light beam projected by the light source is changed by changing the focal length of the zoom projection lens 113.
  • The zoom projection lens may be a continuous-zoom lens or may have multiple discrete adjustable focal lengths; for example, in some embodiments it is a zoom lens with at least two adjustable focal lengths.
  • In some embodiments, the zoom projection lens realizes zooming by having a driver change the focus of the lens; in other embodiments, the zoom projection lens is a liquid lens, and zooming is achieved by changing the shape of the liquid.
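  • The following sketch shows one hypothetical way such a two-position zoom lens could be abstracted in software for the driver mentioned above; the class name, focal-length values, and interface are assumptions for illustration only.

```python
from dataclasses import dataclass

@dataclass
class TwoPositionZoomLens:
    """Hypothetical abstraction of a zoom lens with two adjustable focal lengths.

    Mirrors the text above: a driver (for example a voice-coil or liquid-lens
    driver) moves the focus between a first and a second position; the focal
    lengths themselves are fixed design parameters of the lens.
    """
    first_focal_mm: float
    second_focal_mm: float
    position: int = 1  # 1 = first position (shorter focal length, wider FOV), 2 = second

    def set_position(self, position: int) -> float:
        if position not in (1, 2):
            raise ValueError("only two focal positions are available")
        self.position = position
        return self.current_focal_mm()

    def current_focal_mm(self) -> float:
        return self.first_focal_mm if self.position == 1 else self.second_focal_mm

projection = TwoPositionZoomLens(first_focal_mm=2.0, second_focal_mm=4.0)
print(projection.set_position(2))  # 4.0 -> narrower projection field of view
```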
  • the receiving unit 12 includes a TOF image sensor 121, a filter 122, and a zoom imaging lens 123.
  • the zoom imaging lens 123 receives and images at least part of the light beam reflected by the target object on at least part of the TOF image sensor 121.
  • The filter 122 is configured as a narrow-band filter matched to the wavelength of the light source, and is used to suppress background light noise in other wavelength bands.
  • the TOF image sensor 121 can be an image sensor composed of charge coupled devices (CCD), complementary metal oxide semiconductors (CMOS), avalanche diodes (AD), single photon avalanche diodes (SPAD), etc.
  • The size of the pixel array represents the resolution of the depth camera, for example 320×240.
  • The TOF image sensor 121 is generally connected to a readout circuit (not shown in the figure) composed of one or more of a signal amplifier, a time-to-digital converter (TDC), an analog-to-digital converter (ADC), and other devices.
  • Similarly, the zoom imaging lens 123 may be a continuous-zoom lens or may have multiple discrete adjustable focal lengths; for example, in some embodiments it is a zoom lens with at least two adjustable focal lengths.
  • In some embodiments, the zoom imaging lens realizes zooming by having a driver change the focus of the lens; in other embodiments, the zoom imaging lens is a liquid lens, and zooming is achieved by changing the shape of the liquid.
  • The TOF image sensor 121 includes at least one pixel. Compared with a conventional image sensor used only for photography, each pixel of the TOF image sensor 121 contains two or more taps (a tap stores and reads out, or discharges, the charge signal generated by incident photons under the control of the corresponding electrode); within a single frame period (or single exposure time), the taps are switched in a fixed order to collect the corresponding photons, receiving the optical signal and converting it into an electrical signal.
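  • The following toy model (an assumption-laden simplification, not the application's pixel design) illustrates the idea of taps gated in a fixed order within one frame period, and shows that the injected phase delay can be recovered from the four tap values.

```python
import math

def four_tap_samples(phase_delay, amplitude=1.0, offset=2.0):
    """Ideal values collected by taps gated at 0, 90, 180 and 270 degrees.

    A real multi-tap pixel integrates photo-generated charge while each tap's
    gate is open; here each tap is reduced to the ideal sample of the received
    sinusoid at its gating phase, which is enough to show the read-out order.
    """
    return [offset + amplitude * math.cos(k * math.pi / 2 - phase_delay)
            for k in range(4)]

taps = four_tap_samples(phase_delay=math.pi / 3)   # one frame period, taps read in order
a0, a1, a2, a3 = taps
print(round(math.atan2(a1 - a3, a0 - a2), 3))      # 1.047, i.e. the injected pi/3 delay
```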
  • The control and processing circuit 13 may be an independent dedicated circuit, such as a dedicated SoC chip, FPGA chip, or ASIC chip composed of a CPU, memory, bus, and so on, or it may include a general-purpose processing circuit; for example, when the depth camera is integrated into a smart terminal such as a mobile phone, TV, or computer, the processing circuit in the terminal can serve as at least part of the control and processing circuit 13.
  • The control and processing circuit 13 synchronizes the modulation of the transmitting unit 11 with the demodulation of the receiving unit 12 and provides the modulation signal (transmit signal) required when the light source 111 emits laser light; under the control of this signal, the light source emits toward the target object a beam whose amplitude is modulated over time. For example, in some embodiments the modulation signal is a sine-wave, square-wave, or pulse signal, and the light source's amplitude is modulated in time accordingly.
  • The control and processing circuit 13 also provides the demodulation signal (acquisition signal) for each tap of each pixel in the TOF image sensor 121; under the control of the demodulation signal, each tap collects the reflected beam and generates an electrical signal.
  • In some embodiments, the control and processing circuit 13 receives the electrical signal and processes it to calculate the intensity information of the reflected beam; preferably, a weighted-average method is used to calculate the beam intensity, a structured-light image is generated from the intensity information, and the depth image of the target area is then obtained from the structured-light image using a matching algorithm and triangulation.
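  • A minimal sketch of the weighted-average intensity estimate described above, assuming equal weights over four tap values; the numbers reuse the illustrative taps from the earlier sketch, and the per-pixel intensities over the whole sensor would form the structured-light (speckle) image used for matching.

```python
def reflected_intensity(taps, weights=None):
    """Weighted average of the tap values as a per-pixel intensity estimate."""
    weights = weights or [1.0 / len(taps)] * len(taps)
    return sum(w * a for w, a in zip(weights, taps))

print(round(reflected_intensity([2.5, 2.866, 1.5, 1.134]), 3))  # 2.0
```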
  • In some embodiments, the control and processing circuit 13 receives and processes the electrical signal, calculates the phase difference between the emitted beam and the received reflected beam, derives from this phase difference the time needed for the beam to travel from emission to reception, and thereby calculates the depth image of the target area.
  • The depth measurement device 10 may also include a drive circuit, a power supply, a color camera, an infrared camera, an IMU, and other components not shown in the figure; combining the device with these components enables richer functions such as 3D texture modeling, infrared face recognition, and SLAM.
  • the depth measuring device 10 may be embedded in electronic products such as mobile phones, tablet computers, and computers.
  • the light source array 111 is composed of a plurality of sub-light sources arranged on a single substrate (or multiple substrates), and the sub-light sources are arranged on the substrate in a certain pattern.
  • the substrate may be a semiconductor substrate, a metal substrate, etc.
  • The sub-light-sources may be light-emitting diodes, edge-emitting lasers, vertical-cavity surface-emitting lasers (VCSELs), and so on; preferably, the light source array 111 is an array VCSEL chip composed of a plurality of VCSEL sub-light-sources arranged irregularly on a semiconductor substrate.
  • the sub-light source is used to emit light beams of any desired wavelength, such as visible light, infrared light, and ultraviolet light.
  • the light source array 111 emits light under the modulation drive of the driving circuit (which may be part of the processing circuit 13), such as continuous wave modulation, pulse modulation, etc.
  • the light source array 111 can also emit light in groups or as a whole under the control of the driving circuit.
  • In one embodiment, as shown in Fig. 2a, the light source array 111 includes a first sub-light-source array 201 (shown as hollow circles in Fig. 2a), a second sub-light-source array 202 (shown as shaded circles in Fig. 2a), and so on.
  • The first sub-light-source array 201 is concentrated in the middle of the light source array 111 and contains fewer light sources than the second sub-light-source array 202.
  • Under the control of the first and second driving circuits, the first and second sub-light-source arrays emit the first spot-pattern beam and the second spot-pattern beam toward the target area, respectively.
  • In another embodiment, as shown in Fig. 2b, the light source array 111 includes a first sub-light-source array 203 (shown as hollow circles in Fig. 2b), another sub-light-source array 204 (shown as shaded circles in Fig. 2b), and so on.
  • The first sub-light-source array 203 is concentrated in the middle of the light source array 111, contains fewer light sources, and emits the first spot-pattern beam toward the target area under the control of the first driving circuit.
  • The second driving circuit controls the first sub-light-source array 203 and the other sub-light-source array 204 to emit light together, forming a second sub-light-source array that emits the second spot-pattern beam toward the target area.
  • the light source array 111 may also include a third sub-light source array, a fourth sub-light source array, etc., which are not particularly limited in the embodiment of the present application.
  • Grouping multiple sub-light-source arrays, or having them emit together, makes it possible to produce emission beams of different densities; the denser the beam, the better suited it is to long-distance measurement, so a measurement range composed of several different measurement intervals can be achieved.
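  • The following short sketch illustrates, with hypothetical names and densities, how grouped or combined sub-light-source arrays yield spot patterns of different densities for different measurement intervals.

```python
# Hypothetical grouping of the emitters into the emission modes described above
# (group names and relative densities are illustrative, not taken from the application).
EMISSION_MODES = {
    "sparse": {"sub_arrays": ["central group"], "relative_spot_density": 1},
    "dense":  {"sub_arrays": ["central group", "outer group"], "relative_spot_density": 4},
}

def spot_density(mode_name):
    return EMISSION_MODES[mode_name]["relative_spot_density"]

print(spot_density("dense") > spot_density("sparse"))  # True: denser pattern suits longer range
```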
  • Fig. 3 is a schematic diagram of the principle of a depth measuring device according to an embodiment of the present application.
  • the depth measuring device includes a driver, and the driver drives the zoom projection lens and the zoom imaging lens to adjust the focal length under the control of the control and processing circuit.
  • the light beam emitted from the light source array 111 is projected into the target area after passing through the zoom projection lens 113, and the field angle of the projected light beam can be adjusted by adjusting the focal length of the zoom projection lens.
  • The zoom projection lens 113 is configured to have at least two adjustable focal lengths. When the focal point of the zoom projection lens 113 is at the first projection position, it has the first projection focal length, and the emitted beam is projected into the target area 20 with the first projection field angle 301; when the focal point of the zoom projection lens is at the second projection position, it has the second projection focal length, and the beam projected into the target area 20 has the second projection field angle 303. As the focal length of the zoom projection lens increases, the field angle of the beam projected into the target area decreases.
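  • The statement that a longer focal length gives a smaller field angle follows from the usual lens relation FOV = 2 * arctan(d / (2 * f)); the sketch below evaluates it with illustrative aperture and focal-length values, which are assumptions and not parameters of this application.

```python
import math

def field_of_view_deg(aperture_mm, focal_mm):
    """Full field of view of a lens: FOV = 2 * atan(d / (2 * f)).

    d is the relevant aperture (emitter-array extent for the projection lens,
    sensor diagonal for the imaging lens); both numbers below are illustrative.
    """
    return math.degrees(2 * math.atan(aperture_mm / (2 * focal_mm)))

print(round(field_of_view_deg(4.0, 2.0), 1))  # 90.0  (first, shorter focal length)
print(round(field_of_view_deg(4.0, 4.0), 1))  # 53.1  (second, longer focal length)
```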
  • the TOF image sensor collects at least part of the light beam reflected back from the target area and forms an electrical signal.
  • The zoom imaging lens projects the reflected beam onto the pixels of the TOF image sensor, and the field angle over which the TOF image sensor collects the reflected beam is changed by changing the focal length of the zoom imaging lens.
  • The zoom imaging lens 123 may be a zoom lens with at least two adjustable focal lengths. When its focal point is at the first imaging position, it has the first imaging focal length, and the TOF image sensor collects the part of the beam reflected by the target area 20 within the first imaging field angle 302; when its focal point is at the second imaging position, it has the second imaging focal length, and the TOF image sensor collects the part of the beam reflected by the target area 20 within the second imaging field angle 304. As the focal length of the zoom imaging lens increases, the field angle over which the TOF image sensor collects the reflected beam in the target area decreases.
  • Generally, during depth measurement the focal lengths of the transmitting unit and the receiving unit should satisfy a specific constraint relationship, also called a constraint condition; under this relationship, the beam projected by the transmitting unit onto the target area is guaranteed to ultimately be imaged with high quality in the receiving unit.
  • In some embodiments, the focal lengths of the transmitting unit and the receiving unit are set to remain equal at all times during zooming: the first projection focal length equals the first imaging focal length, in which case the first projection field of view and the first imaging field of view essentially coincide; the second projection focal length equals the second imaging focal length, in which case the second projection field of view essentially coincides with the second imaging field of view. Other constraint conditions are of course also possible.
  • In one embodiment, these constraint relationships are stored in the control and processing circuit; when the control and processing circuit sends adjustment instructions to the driver, it first retrieves the constraint relationships and converts them into the corresponding control instructions that govern the focal-length adjustment.
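  • A minimal sketch of such a stored constraint, assuming the simplest condition mentioned above (projection and imaging focal lengths kept equal); the focal-length values and the driver interface are hypothetical.

```python
# (f_projection_mm, f_imaging_mm) per focal position; values are illustrative only.
CONSTRAINT = {"first": (2.0, 2.0), "second": (4.0, 4.0)}

def command_driver(position: str):
    """Retrieve the stored constraint and turn it into a driver command."""
    f_proj, f_img = CONSTRAINT[position]
    assert f_proj == f_img, "constraint: keep both focal lengths equal during zooming"
    return {"projection_focal_mm": f_proj, "imaging_focal_mm": f_img}

print(command_driver("second"))  # {'projection_focal_mm': 4.0, 'imaging_focal_mm': 4.0}
```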
  • In some embodiments, having the light source array emit in groups or as a whole, combined with adjustment of the zoom-lens focal lengths, achieves higher-precision imaging for target objects at different distances.
  • For example, when the target object is close to the depth measurement device, the first driving circuit drives the first sub-light-source array to project a beam toward the target area, and the zoom projection lens is adjusted to the second projection position so that the beam is projected onto the target area within the smaller second projection field angle. This ensures that the projected beam is more concentrated even though fewer light sources are used, improving measurement resolution while reducing power consumption and guaranteeing measurement accuracy.
  • Correspondingly, the zoom imaging lens is adjusted to the second imaging position, and the reflected beam is collected within the second imaging field angle corresponding to the second projection field angle. This ensures that all of the projected beam reflected by the target area is received by the sensor while effectively reducing the influence of ambient light.
  • Likewise, when the target object is far from the depth measurement device, the second driving circuit drives the second sub-light-source array to project a beam toward the target area, and the zoom projection lens is adjusted to the first projection position so that the beam is projected onto the target area within the larger first projection field angle. Because the intensity of the reflected light decreases when the target is far away, the sensor might otherwise struggle to receive an effective beam and form an electrical signal; after the dense spot beam is projected onto the target area, the intensity of the projected beam within the first field of view is higher, which increases the light intensity of the reflected beam.
  • Correspondingly, the zoom imaging lens is adjusted to the first imaging position, and the reflected beam is collected within the first imaging field angle corresponding to the first projection field angle.
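  • The near/far pairings described in the two preceding paragraphs can be summarized as a small selection rule; the distance threshold below is an illustrative assumption, not a value from this application.

```python
def configure_for_distance(estimated_distance_m, near_far_threshold_m=1.5):
    """Near target: sparse sub-array 1 + second (longer) focal position.
    Far target: dense sub-array 2 + first (shorter) focal position."""
    if estimated_distance_m < near_far_threshold_m:
        return {"sub_array": 1, "projection_position": "second", "imaging_position": "second"}
    return {"sub_array": 2, "projection_position": "first", "imaging_position": "first"}

print(configure_for_distance(0.8)["projection_position"])  # second
print(configure_for_distance(4.0)["sub_array"])            # 2
```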
  • By adjusting the focal lengths, the adjustable depth measurement device of the present application has a more flexible and variable depth of field, thereby achieving a larger depth-measurement range.
  • In addition, combining this with the grouped (sub-region) working mode of the light source array can also effectively reduce the power consumption of the device and improve accuracy over different ranging ranges.
  • FIG. 4 is a flowchart of a depth measurement method according to another embodiment of the application, and the measurement method includes:
  • S41: Control the transmitting unit to project a beam onto the target area. The transmitting unit includes a light source array and a zoom projection lens; the light source array includes at least two sub-light-source arrays, each used to emit a spot-pattern beam; the zoom projection lens is configured to receive the beam and project it onto the target area, and the field angle of the projected beam is changed by changing the focal length of the zoom projection lens.
  • S42: Control the receiving unit to collect at least part of the beam reflected back from the target area and form an electrical signal. The receiving unit includes a TOF image sensor and a zoom imaging lens; the TOF image sensor is configured to collect at least part of the reflected beam and form an electrical signal; the zoom imaging lens is configured to project the reflected beam onto the TOF image sensor, and the field angle over which the TOF image sensor collects the reflected beam is changed by changing the focal length of the zoom imaging lens.
  • S43: The control and processing circuit receives the electrical signal, calculates the depth image of the target area from the electrical signal, and completes the depth measurement.
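  • To make the control flow of S41-S43 concrete, the following sketch wires hypothetical emitter and receiver stubs to the same four-tap phase demodulation shown earlier; every interface name here is an assumption, not an API defined by this application.

```python
import math

class StubEmitter:                      # hypothetical stand-in for the transmitting unit
    def configure(self, sub_array, projection_position):
        self.mode = (sub_array, projection_position)      # S41: choose pattern and focal position
    def project(self):
        return "spot-pattern beam"

class StubReceiver:                     # hypothetical stand-in for the receiving unit
    def configure(self, imaging_position):
        self.imaging_position = imaging_position          # S42: matching imaging focal position
    def collect(self):
        return [(2.5, 2.866, 1.5, 1.134)]                 # one pixel, four tap values

def measure_depth_frame(emitter, receiver, mode, f_mod_hz=20e6):
    emitter.configure(mode["sub_array"], mode["projection_position"])
    emitter.project()
    receiver.configure(mode["imaging_position"])
    depths = []
    for a0, a1, a2, a3 in receiver.collect():             # S43: per-pixel phase -> depth
        phase = math.atan2(a1 - a3, a0 - a2) % (2 * math.pi)
        depths.append(299_792_458.0 * phase / (4 * math.pi * f_mod_hz))
    return depths

mode = {"sub_array": 1, "projection_position": "second", "imaging_position": "second"}
print([round(d, 2) for d in measure_depth_frame(StubEmitter(), StubReceiver(), mode)])  # [1.25]
```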
  • the light beam emitted from the light source array is projected into the target area after passing through the zoom projection lens, and the field angle of the projected light beam is adjusted by adjusting the focal length of the zoom projection lens.
  • The zoom projection lens is configured to have at least two adjustable focal lengths, or it may be a continuous-zoom lens. When the focal point of the zoom projection lens is at the first projection position, it has the first projection focal length, and the emitted beam is projected into the target area with the first projection field angle; when the focal point is at the second projection position, it has the second projection focal length, and the emitted beam is projected into the target area with the second projection field angle. As the focal length of the zoom projection lens increases, the field angle of the emitted beam projected into the target area decreases.
  • Specifically, the TOF image sensor collects at least part of the beam reflected back from the target area and forms an electrical signal; the zoom imaging lens projects the reflected beam onto the pixels of the TOF image sensor, and the field angle over which the TOF image sensor collects the reflected beam is changed by changing the focal length of the zoom imaging lens.
  • In some embodiments, the zoom imaging lens is configured as a zoom lens with at least two adjustable focal lengths, or it may be a continuous-zoom lens. When its focal point is at the first imaging position, it has the first imaging focal length, and the TOF image sensor collects the part of the beam reflected by the target area within the first imaging field angle; when its focal point is at the second imaging position, it has the second imaging focal length, and the TOF image sensor collects the part of the beam reflected by the target area within the second imaging field angle. As the focal length of the zoom imaging lens increases, the field angle over which the TOF image sensor collects the reflected beam in the target area decreases.
  • In some embodiments, the focal lengths of the transmitting unit and the receiving unit are set to remain equal at all times during zooming: the first projection focal length equals the first imaging focal length, in which case the first projection field of view and the first imaging field of view essentially coincide; the second projection focal length equals the second imaging focal length, in which case the second projection field of view essentially coincides with the second imaging field of view.
  • By adjusting the focal lengths, this method gives the depth measurement device a more flexible and variable depth of field, thereby achieving a larger depth-measurement range.
  • In addition, combining this with the grouped (sub-region) working mode of the light source array can also effectively reduce the power consumption of the device and improve accuracy over different ranging ranges.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Electromagnetism (AREA)
  • Measurement Of Optical Distance (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

Disclosed is an adjustable depth measurement device comprising a transmitting unit, a receiving unit, and a control and processing circuit. The transmitting unit comprises a light source array and a zoom projection lens; the light source array comprises at least two sub-light-source arrays used to emit spot-pattern beams; the zoom projection lens receives the beam and projects it onto a target area, and changing the focal length of the zoom projection lens changes the field angle of the beam projected by the light source. The receiving unit comprises a TOF image sensor and a zoom imaging lens; the zoom imaging lens projects the reflected beam onto the TOF image sensor, where it is collected by the sensor to form an electrical signal, and changing the focal length of the zoom imaging lens changes the field angle over which the TOF image sensor collects the reflected beam. The control and processing circuit calculates a depth image of the target area from the electrical signal. By adjusting the focal lengths, the present application has a more flexible and variable depth of field, thereby achieving a larger depth-measurement range and improving ranging accuracy.

Description

Adjustable depth measurement device and measurement method
This application claims priority to Chinese patent application No. 201911384703.8, entitled "一种可调的深度测量装置及测量方法" (Adjustable depth measurement device and measurement method), filed with the Chinese Patent Office on December 28, 2019, the entire contents of which are incorporated herein by reference.
Technical Field
This application relates to the field of optical measurement technology, and in particular to an adjustable depth measurement device and measurement method.
Background
A depth measurement device can be used to obtain a depth image of an object, which can further be used for 3D modeling, skeleton extraction, face recognition, and so on; such devices are very widely used in 3D measurement, human-computer interaction, and related fields. Current depth measurement technologies mainly include TOF ranging, structured-light ranging, and binocular (stereo) ranging.
TOF stands for time-of-flight. TOF ranging achieves precise distance measurement by measuring the round-trip flight time of light pulses between the transmitting/receiving device and the target object, and is divided into direct ranging and indirect ranging. Direct ranging continuously sends light pulses toward the target object, uses a sensor to receive the light signal reflected back from the object, and obtains the target distance by detecting the (round-trip) flight time of the transmitted and reflected light pulses; indirect ranging emits toward the target object a beam whose amplitude is modulated over time, measures the phase delay of the reflected beam relative to the emitted beam, and then calculates the flight time from that phase delay. Depending on the modulation and demodulation scheme, it can be divided into continuous-wave (Continuous Wave, CW) methods and pulse-modulated (Pulse Modulated, PM) methods.
Structured-light ranging projects a structured light beam onto the target area, collects the reflected structured light beam to form a structured-light pattern, and finally calculates the depth image of the target object by triangulation or similar methods. Commonly used structured-light patterns include irregular spot patterns, stripe patterns, phase-shift patterns, and so on. Structured-light technology is characterized by high resolution, high precision, and low power consumption.
Because depth measurement devices are used in a wide variety of scenarios, very high demands are placed on their measurement range and measurement accuracy; however, the internal structure of the device limits its measurement range, and the measurement accuracy at different measurement ranges is also affected.
Summary of the Invention
The purpose of this application is to provide an adjustable depth measurement device and measurement method to solve at least one of the problems described in the background above.
To achieve the above purpose, the technical solution of the embodiments of this application is implemented as follows:
An adjustable depth measurement device comprises a transmitting unit, a receiving unit, and a control and processing circuit. The transmitting unit comprises a light source array and a zoom projection lens; the light source array comprises at least two sub-light-source arrays, each of which is used to emit a spot-pattern beam; the zoom projection lens is configured to receive the beam and project it onto a target area, and the field angle of the beam projected by the light source is changed by changing the focal length of the zoom projection lens. The receiving unit comprises a TOF image sensor and a zoom imaging lens; the TOF image sensor is configured to collect at least part of the beam reflected back from the target area and form an electrical signal; the zoom imaging lens is configured to project the reflected beam onto the TOF image sensor, and the field angle over which the TOF image sensor collects the reflected beam is changed by changing the focal length of the zoom imaging lens. The control and processing circuit is connected to the transmitting unit and the receiving unit and calculates a depth image of the target area from the electrical signal.
In some embodiments, a driver is further included, and the control and processing circuit controls the driver to adjust the focal lengths of the zoom projection lens and the zoom imaging lens.
In some embodiments, the control and processing circuit stores a constraint condition on the relationship between the focal length of the zoom projection lens and the focal length of the zoom imaging lens, and controls the adjustment of the focal lengths of the zoom projection lens and the zoom imaging lens according to this constraint condition.
In some embodiments, the zoom projection lens is configured to have a first projection focal length and a second projection focal length, the first projection focal length being smaller than the second projection focal length, and the first projection field angle at which the beam is projected through the zoom projection lens onto the target area is greater than the second projection field angle; the zoom imaging lens is configured to have a first imaging focal length and a second imaging focal length, the first imaging focal length being smaller than the second imaging focal length, and the first imaging field angle at which the TOF image sensor collects the reflected beam through the zoom imaging lens is greater than the second imaging field angle.
In some embodiments, when the first sub-light-source array projects a beam toward the target area, the zoom projection lens and the zoom imaging lens are configured to have the second projection focal length and the second imaging focal length, projecting the beam within the second projection field angle and collecting the reflected beam within the second imaging field angle; when the second sub-light-source array projects a beam toward the target area, the zoom projection lens and the zoom imaging lens are configured to have the first projection focal length and the first imaging focal length, projecting the beam within the first projection field angle and collecting the reflected beam within the first imaging field angle.
In some embodiments, the numbers of light sources in the sub-light-source arrays are unequal and can be controlled individually; the light sources in a sub-light-source array are arranged irregularly.
In some embodiments, the TOF image sensor includes at least one pixel; each pixel includes at least two taps, and the taps are used to collect the reflected beam in a fixed order within a single frame period and generate electrical signals.
In some embodiments, the control and processing circuit receives and processes the electrical signal, calculates the intensity information of the reflected beam, generates a structured-light image, and calculates a depth image of the target area from the structured-light image; alternatively, the control and processing circuit receives and processes the electrical signal, calculates the phase difference between emission of the beam and reception of its reflection, and further calculates the depth image of the target area from that phase difference.
Another technical solution of this application is as follows:
A depth measurement method comprises the following steps:
controlling the transmitting unit to project a beam onto a target area, wherein the transmitting unit comprises a light source array and a zoom projection lens; the light source array comprises at least two sub-light-source arrays, each of which is used to emit a spot-pattern beam, and the zoom projection lens is configured to receive the beam and project it onto the target area, the field angle of the beam projected by the light source being changed by changing the focal length of the zoom projection lens;
controlling the receiving unit to collect at least part of the beam reflected back from the target area and form an electrical signal, wherein the receiving unit comprises a TOF image sensor and a zoom imaging lens; the TOF image sensor is configured to collect at least part of the beam reflected back from the target area and form an electrical signal, and the zoom imaging lens is configured to project the reflected beam onto the TOF image sensor, the field angle over which the TOF image sensor collects the reflected beam being changed by changing the focal length of the zoom imaging lens;
the control and processing circuit receiving the electrical signal and calculating a depth image of the target area from the electrical signal, thereby completing the depth measurement.
In some embodiments, the zoom projection lens is configured to have a first projection focal length and a second projection focal length, the first projection focal length being smaller than the second projection focal length, and the first projection field angle at which the beam is projected through the zoom projection lens onto the target area is greater than the second projection field angle; the zoom imaging lens is configured to have a first imaging focal length and a second imaging focal length, the first imaging focal length being smaller than the second imaging focal length, and the first imaging field angle at which the reflected beam is collected through the zoom imaging lens is greater than the second imaging field angle.
The embodiments of this application provide an adjustable depth measurement device comprising a transmitting unit, a receiving unit, and a control and processing circuit. The transmitting unit comprises a light source array and a zoom projection lens; the light source array comprises at least two sub-light-source arrays used to emit spot-pattern beams; the zoom projection lens receives the beam and projects it onto the target area, and changing the focal length of the zoom projection lens changes the field angle of the beam projected by the light source. The receiving unit comprises a TOF image sensor and a zoom imaging lens; the zoom imaging lens projects the reflected beam onto the TOF image sensor, where it is collected by the sensor to form an electrical signal, and changing the focal length of the zoom imaging lens changes the field angle over which the TOF image sensor collects the reflected beam. The control and processing circuit calculates a depth image of the target area from the electrical signal. By adjusting the focal lengths, this application has a more flexible and variable depth of field, thereby achieving a larger depth-measurement range and improving ranging accuracy.
Brief Description of the Drawings
To explain more clearly the technical solutions in the embodiments of this application or in the prior art, the drawings needed to describe the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below are only some embodiments of this application; for a person of ordinary skill in the art, other drawings can be obtained from these drawings without creative effort.
Fig. 1 is a schematic structural diagram of an adjustable depth measurement device according to an embodiment of this application.
Figs. 2a and 2b are schematic diagrams of the light source array of an adjustable depth measurement device according to an embodiment of this application.
Fig. 3 is a schematic diagram of the principle of an adjustable depth measurement device according to an embodiment of this application.
Fig. 4 is a flowchart of a depth measurement method according to another embodiment of this application.
Detailed Description of the Embodiments
To make the technical problems to be solved, the technical solutions, and the beneficial effects of the embodiments of this application clearer, this application is further described in detail below with reference to the drawings and embodiments. It should be understood that the specific embodiments described here are only used to explain this application and are not intended to limit it.
It should be noted that when an element is said to be "fixed to" or "arranged on" another element, it may be directly on that other element or indirectly on it. When an element is said to be "connected to" another element, it may be directly connected to the other element or indirectly connected to it. In addition, a connection may serve either a fixing purpose or an electrical-connection purpose.
It should be understood that orientation or position terms such as "length", "width", "upper", "lower", "front", "rear", "left", "right", "vertical", "horizontal", "top", "bottom", "inner", and "outer" indicate orientations or positional relationships based on those shown in the drawings; they are used only to facilitate and simplify the description of the embodiments of this application, and do not indicate or imply that the device or element referred to must have a specific orientation or be constructed and operated in a specific orientation, and therefore cannot be understood as limiting this application.
In addition, the terms "first" and "second" are used only for descriptive purposes and cannot be understood as indicating or implying relative importance or implicitly indicating the number of technical features referred to. Thus, a feature qualified by "first" or "second" may explicitly or implicitly include one or more such features. In the description of the embodiments of this application, "plurality" means two or more, unless specifically and explicitly defined otherwise.
Referring to Fig. 1, Fig. 1 is a schematic diagram of an adjustable depth measurement device according to an embodiment of this application. The depth measurement device 10 includes a transmitting unit 11, a receiving unit 12, and a control and processing circuit 13. The transmitting unit 11 is used to emit a light beam 30 toward a target area 20; the beam is emitted into the target space to illuminate the target object 20 in that space, at least part of the emitted beam 30 is reflected by the target area 20 to form a reflected beam 40, and at least part of the reflected beam 40 is received by the receiving unit 12. The control and processing circuit 13 is connected to the transmitting unit 11 and the receiving unit 12 respectively, to control the emission and reception of the beam; it also receives the information generated by the receiving unit 12 from the received reflected beam and processes that information to obtain the depth information of the target object.
The transmitting unit 11 includes a light source 111, an optical element 112, a zoom projection lens 113, a light source driver (not shown in the figure), and so on. The light source 111 may be a light-emitting diode (LED), an edge-emitting laser (EEL), a vertical-cavity surface-emitting laser (VCSEL), or the like, or it may be a light source array composed of multiple light sources, used to emit a spot beam toward the target area. The light sources 111 may be arranged regularly or irregularly, and the beam emitted by the light source 111 may be visible light, infrared light, ultraviolet light, or the like. The light source 111 emits a beam under the control of the light source driver (which may in turn be controlled by the control and processing circuit 13); for example, in one embodiment, the light source 111 emits, under the control of the control and processing circuit 13, a beam whose amplitude is modulated over time, which may be a pulse-modulated beam, a square-wave-modulated beam, or a sine-wave-modulated beam. It is understood that a part of the control and processing circuit 13, or a sub-circuit independent of it, such as a pulse signal generator, may be used to control the light source 111 to emit the relevant beam.
The optical element 112 receives the beam from the light source 111, shapes it, and projects it onto the target area. For example, in one embodiment, the optical element 112 receives the pulsed beam from the light source 111, optically modulates it, for example by diffraction, refraction, or reflection, and then emits the modulated beam into space, for example as a focused beam, a flood beam, or a spot-pattern beam. The optical element 112 may be one or a combination of a lens, a liquid-crystal element, a diffractive optical element, a micro-lens array, a meta-surface optical element, a mask, a mirror, a MEMS galvanometer, and so on. In one embodiment of this application, the spot-pattern beam emitted by the irregularly arranged light source array passes through the optical element 112 and is then projected onto the target area as a flood beam or a spot-pattern beam.
The zoom projection lens 113 is configured to receive the beam emitted by the light source and project it onto the target area, and the field angle of the beam projected by the light source is changed by changing the focal length of the zoom projection lens 113. The zoom projection lens may be a continuous-zoom lens or may have multiple adjustable focal lengths; for example, in some embodiments it may be a zoom lens with at least two adjustable focal lengths. In some embodiments, the zoom projection lens realizes zooming by having a driver change the focus of the lens. In other embodiments, the zoom projection lens may be a liquid lens, and zooming is achieved by changing the shape of the liquid.
The receiving unit 12 includes a TOF image sensor 121, a filter 122, and a zoom imaging lens 123. The zoom imaging lens 123 receives at least part of the beam reflected back by the target object and images it onto at least part of the TOF image sensor 121. The filter 122 is a narrow-band filter matched to the wavelength of the light source, used to suppress background light noise in other wavelength bands. The TOF image sensor 121 may be an image sensor composed of charge-coupled devices (CCD), complementary metal-oxide-semiconductor devices (CMOS), avalanche diodes (AD), single-photon avalanche diodes (SPAD), or the like; the size of the array represents the resolution of the depth camera, for example 320×240. Generally, the TOF image sensor 121 is also connected to a readout circuit (not shown in the figure) composed of one or more of a signal amplifier, a time-to-digital converter (TDC), an analog-to-digital converter (ADC), and other devices.
Similarly, the zoom imaging lens 123 may be a continuous-zoom lens or may have multiple adjustable focal lengths; for example, in some embodiments it may be a zoom lens with at least two adjustable focal lengths. In some embodiments, the zoom imaging lens realizes zooming by having a driver change the focus of the lens. In other embodiments, the zoom imaging lens may be a liquid lens, and zooming is achieved by changing the shape of the liquid.
Generally, the TOF image sensor 121 includes at least one pixel. Compared with a conventional image sensor used only for photography, each pixel of the TOF image sensor 121 contains two or more taps (a tap stores and reads out, or discharges, the charge signal generated by incident photons under the control of the corresponding electrode); within a single frame period (or single exposure time), the taps are switched in a fixed order to collect the corresponding photons, receiving the optical signal and converting it into an electrical signal.
The control and processing circuit 13 may be an independent dedicated circuit, for example a dedicated SoC chip, FPGA chip, or ASIC chip composed of a CPU, memory, bus, and so on, or it may include a general-purpose processing circuit; for example, when the depth camera is integrated into a smart terminal such as a mobile phone, TV, or computer, the processing circuit in the terminal may serve as at least part of the control and processing circuit 13.
The control and processing circuit 13 synchronizes the modulation of the transmitting unit 11 with the demodulation of the receiving unit 12 and provides the modulation signal (transmit signal) required when the light source 111 emits laser light; under the control of the modulation signal, the light source emits toward the target object a beam whose amplitude is modulated over time. For example, in some embodiments the modulation signal is a sine-wave, square-wave, or pulse signal, and under this modulation the amplitude of the light source is modulated in time so that a sine-wave, square-wave, or pulse signal is emitted outward. The control and processing circuit 13 also provides the demodulation signal (acquisition signal) for each tap of each pixel in the TOF image sensor 121; under the control of the demodulation signal, each tap collects the reflected beam and generates an electrical signal.
In some embodiments, the control and processing circuit 13 receives the electrical signal and processes it to calculate the intensity information of the reflected beam; preferably, a weighted-average method may be used to calculate the beam intensity, a structured-light image is generated from the intensity information, and the depth image of the target area is obtained from the structured-light image in combination with a matching algorithm and triangulation.
In some embodiments, the control and processing circuit 13 receives and processes the electrical signal, calculates the phase difference between emission of the beam and reception of its reflection, derives from that phase difference the time required for the beam to travel from emission to reception, and thereby calculates the depth image of the target area.
In some embodiments, the depth measurement device 10 may further include a drive circuit, a power supply, a color camera, an infrared camera, an IMU, and other components, which are not shown in the figure; combining the device with these components enables richer functions, such as 3D texture modeling, infrared face recognition, and SLAM. The depth measurement device 10 may be embedded in electronic products such as mobile phones, tablets, and computers.
Figs. 2a and 2b are schematic structural diagrams of the light source array of an adjustable depth measurement device according to an embodiment of this application. The light source array 111 consists of multiple sub-light-sources arranged on a single substrate (or multiple substrates), the sub-light-sources being arranged on the substrate in a certain pattern. The substrate may be a semiconductor substrate, a metal substrate, or the like, and the sub-light-sources may be light-emitting diodes, edge-emitting lasers, vertical-cavity surface-emitting lasers (VCSELs), and so on; preferably, the light source array 111 is an array VCSEL chip composed of multiple VCSEL sub-light-sources arranged irregularly on a semiconductor substrate. The sub-light-sources are used to emit beams of any desired wavelength, such as visible light, infrared light, or ultraviolet light. The light source array 111 emits light under the modulation drive of the driving circuit (which may be part of the processing circuit 13), for example continuous-wave modulation or pulse modulation; the light source array 111 may also emit light in groups or as a whole under the control of the driving circuit.
In one embodiment, as shown in Fig. 2a, the light source array 111 includes a first sub-light-source array 201 (shown as hollow circles in Fig. 2a), a second sub-light-source array 202 (shown as shaded circles in Fig. 2a), and so on. The first sub-light-source array 201 is concentrated in the middle of the light source array 111 and contains fewer light sources than the second sub-light-source array 202; under the control of the first and second driving circuits, the first and second sub-light-source arrays emit the first spot-pattern beam and the second spot-pattern beam toward the target area, respectively.
In another embodiment, as shown in Fig. 2b, the light source array 111 includes a first sub-light-source array 203 (shown as hollow circles in Fig. 2b), another sub-light-source array 204 (shown as shaded circles in Fig. 2b), and so on. The first sub-light-source array 203 is concentrated in the middle of the light source array 111, contains fewer light sources, and emits the first spot-pattern beam toward the target area under the control of the first driving circuit. The second driving circuit controls the first sub-light-source array 203 and the other sub-light-source array 204 to emit light together, forming a second sub-light-source array that emits the second spot-pattern beam toward the target area.
It is understood that the light source array 111 may also include a third sub-light-source array, a fourth sub-light-source array, and so on, which are not particularly limited in the embodiments of this application. This configuration of grouping multiple sub-light-source arrays or having them emit together makes it possible to produce emission beams of different densities; the denser the beam, the better suited it is to long-distance measurement, so a measurement range composed of several different measurement intervals can be achieved.
Fig. 3 is a schematic diagram of the principle of a depth measurement device according to an embodiment of this application. The depth measurement device includes a driver which, under the control of the control and processing circuit, drives the adjustment of the focal lengths of the zoom projection lens and the zoom imaging lens.
The beam emitted by the light source array 111 passes through the zoom projection lens 113 and is projected into the target area; the field angle of the projected beam can be adjusted by adjusting the focal length of the zoom projection lens. The zoom projection lens 113 is configured to have at least two adjustable focal lengths. When the focal point of the zoom projection lens 113 is at the first projection position, it has the first projection focal length, and the emitted beam is projected into the target area 20 with the first projection field angle 301; when the focal point of the zoom projection lens is at the second projection position, it has the second projection focal length, and the emitted beam is projected into the target area 20 with the second projection field angle 303. As the focal length of the zoom projection lens increases, the field angle of the beam projected into the target area decreases.
The TOF image sensor collects at least part of the beam reflected back from the target area and forms an electrical signal; the zoom imaging lens projects the reflected beam onto the pixels of the TOF image sensor, and the field angle over which the TOF image sensor collects the reflected beam is changed by changing the focal length of the zoom imaging lens. The zoom imaging lens 123 may be a zoom lens with at least two adjustable focal lengths. When its focal point is at the first imaging position, it has the first imaging focal length, and the TOF image sensor collects the part of the beam reflected by the target area 20 within the first imaging field angle 302; when its focal point is at the second imaging position, it has the second imaging focal length, and the TOF image sensor collects the part of the beam reflected by the target area 20 within the second imaging field angle 304. As the focal length of the zoom imaging lens increases, the field angle over which the TOF image sensor collects the reflected beam in the target area decreases.
Generally, during depth measurement the focal lengths of the transmitting unit and the receiving unit should satisfy a specific constraint relationship, which may also be called a constraint condition; under this relationship, the beam projected by the transmitting unit onto the target area is guaranteed to ultimately be imaged with high quality in the receiving unit. In some embodiments, the focal lengths of the transmitting unit and the receiving unit may be set to remain equal at all times during zooming: the first projection focal length equals the first imaging focal length, in which case the first projection field of view and the first imaging field of view essentially coincide; the second projection focal length equals the second imaging focal length, in which case the second projection field of view essentially coincides with the second imaging field of view. Of course, other constraint conditions are also possible. In one embodiment, these constraint relationships are stored in the control and processing circuit; when the control and processing circuit sends adjustment instructions to the driver, it first retrieves these constraint relationships and converts them into the corresponding control instructions that govern the focal-length adjustment.
In some embodiments, having the light source array emit in groups or as a whole, combined with adjustment of the zoom-lens focal lengths, achieves higher-precision imaging for target objects at different distances. For example, when the target object is close to the depth measurement device, the first driving circuit drives the first sub-light-source array to project a beam toward the target area, and the zoom projection lens is adjusted to the second projection position so that the beam is projected onto the target area within the smaller second projection field angle. This ensures that the projected beam is more concentrated even though fewer light sources are used, improving measurement resolution while reducing power consumption and guaranteeing measurement accuracy. Correspondingly, the zoom imaging lens is adjusted to the second imaging position, and the reflected beam is collected within the second imaging field angle corresponding to the second projection field angle. This ensures that all of the projected beam reflected by the target area is received by the sensor, while effectively reducing the influence of ambient light.
Likewise, when the target object is far from the depth measurement device, the second driving circuit drives the second sub-light-source array to project a beam toward the target area, and the zoom projection lens is adjusted to the first projection position so that the beam is projected onto the target area within the larger first field angle. Because the intensity of the reflected light decreases when the target is far away, the sensor may have difficulty receiving an effective beam to form an electrical signal; after the dense spot beam is projected onto the target area, the intensity of the projected beam within the first field of view is higher, which increases the light intensity of the reflected beam. Correspondingly, the zoom imaging lens is adjusted to the first imaging position, and the reflected beam is collected within the first imaging field angle corresponding to the first projection field angle.
By adjusting the focal lengths, the adjustable depth measurement device of this application has a more flexible and variable depth of field, thereby achieving a larger depth-measurement range. In addition, combining this with the grouped (sub-region) working mode of the light source array can also effectively reduce the power consumption of the device and improve accuracy over different ranging ranges.
Referring to Fig. 4, Fig. 4 is a flowchart of a depth measurement method according to another embodiment of this application. The measurement method includes:
S41: Control the transmitting unit to project a beam onto a target area. The transmitting unit includes a light source array and a zoom projection lens; the light source array includes at least two sub-light-source arrays, each used to emit a spot-pattern beam; the zoom projection lens is configured to receive the beam and project it onto the target area, and the field angle of the beam projected by the light source is changed by changing the focal length of the zoom projection lens.
S42: Control the receiving unit to collect at least part of the beam reflected back from the target area and form an electrical signal. The receiving unit includes a TOF image sensor and a zoom imaging lens; the TOF image sensor is configured to collect at least part of the beam reflected back from the target area and form an electrical signal, and the zoom imaging lens is configured to project the reflected beam onto the TOF image sensor, the field angle over which the TOF image sensor collects the reflected beam being changed by changing the focal length of the zoom imaging lens.
S43: The control and processing circuit receives the electrical signal, calculates a depth image of the target area from the electrical signal, and completes the depth measurement.
Specifically, the beam emitted by the light source array passes through the zoom projection lens and is projected into the target area, and the field angle of the projected beam is adjusted by adjusting the focal length of the zoom projection lens. The zoom projection lens is configured to have at least two adjustable focal lengths, or it may be a continuous-zoom lens. When the focal point of the zoom projection lens is at the first projection position, it has the first projection focal length, and the emitted beam is projected into the target area with the first projection field angle; when the focal point is at the second projection position, it has the second projection focal length, and the emitted beam is projected into the target area with the second projection field angle. As the focal length of the zoom projection lens increases, the field angle of the emitted beam projected into the target area decreases.
Specifically, the TOF image sensor collects at least part of the beam reflected back from the target area and forms an electrical signal; the zoom imaging lens projects the reflected beam onto the pixels of the TOF image sensor, and the field angle over which the TOF image sensor collects the reflected beam is changed by changing the focal length of the zoom imaging lens. In some embodiments, the zoom imaging lens is configured as a zoom lens with at least two adjustable focal lengths, or it may be a continuous-zoom lens. When the focal point of the zoom imaging lens is at the first imaging position, it has the first imaging focal length, and the TOF image sensor collects the part of the beam reflected by the target area within the first imaging field angle; when its focal point is at the second imaging position, it has the second imaging focal length, and the TOF image sensor collects the part of the beam reflected by the target area within the second imaging field angle. As the focal length of the zoom imaging lens increases, the field angle over which the TOF image sensor collects the reflected beam in the target area decreases.
In some embodiments, the focal lengths of the transmitting unit and the receiving unit may be set to remain equal at all times during zooming: the first projection focal length equals the first imaging focal length, in which case the first projection field of view and the first imaging field of view essentially coincide; the second projection focal length equals the second imaging focal length, in which case the second projection field of view essentially coincides with the second imaging field of view.
By adjusting the focal lengths, this method gives the depth measurement device a more flexible and variable depth of field, thereby achieving a larger depth-measurement range. In addition, combining this with the grouped (sub-region) working mode of the light source array can also effectively reduce the power consumption of the device and improve accuracy over different ranging ranges.
It is understood that the above content is a further detailed description of this application in combination with specific or preferred embodiments, and the specific implementation of this application cannot be considered limited to these descriptions. For a person of ordinary skill in the art to which this application belongs, several substitutions or modifications may be made to the described embodiments without departing from the concept of this application, and all such substitutions or modifications shall be regarded as falling within the scope of protection of this application. In the description of this specification, reference to terms such as "one embodiment", "some embodiments", "preferred embodiment", "example", "specific example", or "some examples" means that a specific feature, structure, material, or characteristic described in connection with that embodiment or example is included in at least one embodiment or example of this application.
In this specification, schematic references to the above terms do not necessarily refer to the same embodiment or example. Moreover, the specific features, structures, materials, or characteristics described may be combined in a suitable manner in any one or more embodiments or examples. In addition, where no contradiction arises, those skilled in the art may combine the different embodiments or examples described in this specification and the features of the different embodiments or examples. Although the embodiments of this application and their advantages have been described in detail, it should be understood that various changes, substitutions, and alterations can be made herein without departing from the scope defined by the appended claims.
Furthermore, the scope of this application is not intended to be limited to the specific embodiments of the processes, machines, manufacture, compositions of matter, means, methods, and steps described in the specification. A person of ordinary skill in the art will readily appreciate that presently existing or later-to-be-developed processes, machines, manufacture, compositions of matter, means, methods, or steps that perform substantially the same function or achieve substantially the same result as the corresponding embodiments described herein may be utilized in accordance with the above disclosure. Accordingly, the appended claims are intended to include such processes, machines, manufacture, compositions of matter, means, methods, or steps within their scope.

Claims (10)

  1. An adjustable depth measurement device, characterized by comprising a transmitting unit, a receiving unit, and a control and processing circuit, wherein:
    the transmitting unit comprises a light source array and a zoom projection lens; the light source array comprises at least two sub-light-source arrays, each of which is used to emit a spot-pattern beam; the zoom projection lens is configured to receive the beam and project it onto a target area, the field angle of the beam projected by the light source being changed by changing the focal length of the zoom projection lens;
    the receiving unit comprises a TOF image sensor and a zoom imaging lens; the TOF image sensor is configured to collect at least part of the beam reflected back from the target area and form an electrical signal; the zoom imaging lens is configured to project the reflected beam onto the TOF image sensor, the field angle over which the TOF image sensor collects the reflected beam being changed by changing the focal length of the zoom imaging lens;
    the control and processing circuit is connected to the transmitting unit and the receiving unit and calculates a depth image of the target area from the electrical signal.
  2. The adjustable depth measurement device according to claim 1, characterized by further comprising a driver, wherein the control and processing circuit controls the driver to adjust the focal lengths of the zoom projection lens and the zoom imaging lens.
  3. The adjustable depth measurement device according to claim 2, characterized in that the control and processing circuit stores a constraint condition on the relationship between the focal length of the zoom projection lens and the focal length of the zoom imaging lens, and controls the adjustment of the focal lengths of the zoom projection lens and the zoom imaging lens according to the constraint condition.
  4. The adjustable depth measurement device according to claim 1, characterized in that the zoom projection lens is configured to have a first projection focal length and a second projection focal length, the first projection focal length being smaller than the second projection focal length, and the first projection field angle at which the beam is projected through the zoom projection lens onto the target area is greater than the second projection field angle;
    the zoom imaging lens is configured to have a first imaging focal length and a second imaging focal length, the first imaging focal length being smaller than the second imaging focal length, and the first imaging field angle at which the TOF image sensor collects the reflected beam through the zoom imaging lens is greater than the second imaging field angle.
  5. The adjustable depth measurement device according to claim 4, characterized in that when the first sub-light-source array projects a beam toward the target area, the zoom projection lens and the zoom imaging lens are configured to have the second projection focal length and the second imaging focal length, projecting the beam within the second projection field angle and collecting the reflected beam within the second imaging field angle;
    when the second sub-light-source array projects a beam toward the target area, the zoom projection lens and the zoom imaging lens are configured to have the first projection focal length and the first imaging focal length, projecting the beam within the first projection field angle and collecting the reflected beam within the first imaging field angle.
  6. The adjustable depth measurement device according to claim 1, characterized in that the numbers of light sources in the sub-light-source arrays are unequal and can be controlled individually, and the light sources in a sub-light-source array are arranged irregularly.
  7. The adjustable depth measurement device according to claim 1, characterized in that the TOF image sensor comprises at least one pixel, each pixel comprising at least two taps, the taps being used to collect the reflected beam in a fixed order within a single frame period and generate electrical signals.
  8. The adjustable depth measurement device according to claim 7, characterized in that the control and processing circuit receives and processes the electrical signal, calculates the intensity information of the reflected beam, generates a structured-light image, and calculates a depth image of the target area from the structured-light image; or the control and processing circuit receives and processes the electrical signal, calculates the phase difference between emission of the beam and reception of its reflection, and further calculates the depth image of the target area from the phase difference.
  9. A depth measurement method, characterized by comprising the following steps:
    controlling a transmitting unit to project a beam onto a target area, wherein the transmitting unit comprises a light source array and a zoom projection lens; the light source array comprises at least two sub-light-source arrays, each of which is used to emit a spot-pattern beam, and the zoom projection lens is configured to receive the beam and project it onto the target area, the field angle of the beam projected by the light source being changed by changing the focal length of the zoom projection lens;
    controlling a receiving unit to collect at least part of the beam reflected back from the target area and form an electrical signal, wherein the receiving unit comprises a TOF image sensor and a zoom imaging lens; the TOF image sensor is configured to collect at least part of the beam reflected back from the target area and form an electrical signal, and the zoom imaging lens is configured to project the reflected beam onto the TOF image sensor, the field angle over which the TOF image sensor collects the reflected beam being changed by changing the focal length of the zoom imaging lens;
    the control and processing circuit receiving the electrical signal and calculating a depth image of the target area from the electrical signal, thereby completing the depth measurement.
  10. The depth measurement method according to claim 9, characterized in that the zoom projection lens is configured to have a first projection focal length and a second projection focal length, the first projection focal length being smaller than the second projection focal length, and the first projection field angle at which the beam is projected through the zoom projection lens onto the target area is greater than the second projection field angle;
    the zoom imaging lens is configured to have a first imaging focal length and a second imaging focal length, the first imaging focal length being smaller than the second imaging focal length, and the first imaging field angle at which the TOF image sensor collects the reflected beam through the zoom imaging lens is greater than the second imaging field angle.
PCT/CN2020/077865 2019-12-28 2020-03-04 Adjustable depth measurement device and measurement method WO2021128587A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201911384703.8 2019-12-28
CN201911384703.8A CN111025317B (zh) 2019-12-28 2019-12-28 Adjustable depth measurement device and measurement method

Publications (1)

Publication Number Publication Date
WO2021128587A1 true WO2021128587A1 (zh) 2021-07-01

Family

ID=70194981

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/077865 WO2021128587A1 (zh) 2019-12-28 2020-03-04 一种可调的深度测量装置及测量方法

Country Status (2)

Country Link
CN (1) CN111025317B (zh)
WO (1) WO2021128587A1 (zh)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115856835A (zh) * 2023-03-01 2023-03-28 常州星宇车灯股份有限公司 实现变焦距扫描成像的激光雷达控制系统及其控制方法
CN116320746A (zh) * 2023-05-16 2023-06-23 武汉昊一源科技有限公司 Tof对焦装置、对焦方法及拍摄设备
CN116342710A (zh) * 2023-02-10 2023-06-27 深圳市中图仪器股份有限公司 用于激光跟踪仪的双目相机的标定方法

Families Citing this family (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111913305B (zh) * 2020-07-28 2022-11-08 Oppo广东移动通信有限公司 发射模组及深度传感器、电子设备
CN114070994B (zh) * 2020-07-30 2023-07-25 宁波舜宇光电信息有限公司 摄像模组装置、摄像系统、电子设备和自动变焦成像方法
CN111812663A (zh) * 2020-08-24 2020-10-23 浙江水晶光电科技股份有限公司 深度测量模组及系统
CN112462528A (zh) * 2020-09-27 2021-03-09 嘉兴驭光光电科技有限公司 分区匀光照明光学系统、包括其的投射系统及电子设备
US20240027617A1 (en) * 2020-12-10 2024-01-25 Maxell, Ltd. Portable terminal and electronic glasses
WO2022198376A1 (zh) * 2021-03-22 2022-09-29 深圳市大疆创新科技有限公司 测距装置、成像装置以及云台
CN114502985A (zh) * 2021-05-21 2022-05-13 深圳市汇顶科技股份有限公司 飞行时间深度检测的发射装置及电子设备
CN113466884B (zh) * 2021-06-30 2022-11-01 深圳市汇顶科技股份有限公司 飞行时间深度测量发射装置及电子设备
CN114486186A (zh) * 2021-12-27 2022-05-13 歌尔股份有限公司 一种镜头的有效焦距的检测设备和方法
JP7413426B2 (ja) * 2022-03-18 2024-01-15 維沃移動通信有限公司 投光装置、測距装置及び電子機器
CN115002307B (zh) * 2022-05-06 2024-03-08 杭州海康威视数字技术股份有限公司 摄像机用补光组件及摄像机用光源系统
CN115016201B (zh) * 2022-06-17 2023-09-29 杭州海康威视数字技术股份有限公司 变焦摄像机用补光系统
CN116046715A (zh) * 2023-02-09 2023-05-02 上海石兆通讯科技有限公司 一种用于野外地表可燃物含水率测量的多波长近红外光源

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106895793A (zh) * 2015-12-21 2017-06-27 财团法人工业技术研究院 双模式深度测量的方法与装置
CN109543660A (zh) * 2018-12-20 2019-03-29 深圳奥比中光科技有限公司 一种调焦装置以及调焦方法
CN209167538U (zh) * 2018-11-21 2019-07-26 深圳奥比中光科技有限公司 时间飞行深度相机
CN110333501A (zh) * 2019-07-12 2019-10-15 深圳奥比中光科技有限公司 深度测量装置及距离测量方法

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7762466B2 (en) * 2008-12-18 2010-07-27 Symbol Technologies, Inc. Two position zoom lens assembly for an imaging-based bar code reader
CN208239772U (zh) * 2017-12-05 2018-12-14 宁波舜宇光电信息有限公司 结构光投影装置、包括其的深度相机及电子设备
CN108718406B (zh) * 2018-05-31 2020-04-03 西安知微传感技术有限公司 一种可变焦3d深度相机及其成像方法
CN110196023B (zh) * 2019-04-08 2024-03-12 奥比中光科技集团股份有限公司 一种双变焦结构光深度相机及变焦方法
CN209783544U (zh) * 2019-04-08 2019-12-13 深圳奥比中光科技有限公司 一种变焦结构光深度相机

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106895793A (zh) * 2015-12-21 2017-06-27 财团法人工业技术研究院 双模式深度测量的方法与装置
CN209167538U (zh) * 2018-11-21 2019-07-26 深圳奥比中光科技有限公司 时间飞行深度相机
CN109543660A (zh) * 2018-12-20 2019-03-29 深圳奥比中光科技有限公司 一种调焦装置以及调焦方法
CN110333501A (zh) * 2019-07-12 2019-10-15 深圳奥比中光科技有限公司 深度测量装置及距离测量方法

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116342710A (zh) * 2023-02-10 2023-06-27 深圳市中图仪器股份有限公司 用于激光跟踪仪的双目相机的标定方法
CN116342710B (zh) * 2023-02-10 2024-01-30 深圳市中图仪器股份有限公司 用于激光跟踪仪的双目相机的标定方法
CN115856835A (zh) * 2023-03-01 2023-03-28 常州星宇车灯股份有限公司 实现变焦距扫描成像的激光雷达控制系统及其控制方法
CN116320746A (zh) * 2023-05-16 2023-06-23 武汉昊一源科技有限公司 Tof对焦装置、对焦方法及拍摄设备

Also Published As

Publication number Publication date
CN111025317A (zh) 2020-04-17
CN111025317B (zh) 2022-04-26

Similar Documents

Publication Publication Date Title
WO2021128587A1 (zh) 一种可调的深度测量装置及测量方法
WO2021120403A1 (zh) 一种深度测量装置及测量方法
US20210181317A1 (en) Time-of-flight-based distance measurement system and method
CN111142088B (zh) 一种光发射单元、深度测量装置和方法
WO2021051478A1 (zh) 一种双重共享tdc电路的飞行时间距离测量系统及测量方法
WO2021008209A1 (zh) 深度测量装置及距离测量方法
CN111025318B (zh) 一种深度测量装置及测量方法
WO2021072802A1 (zh) 一种距离测量系统及方法
WO2021120402A1 (zh) 一种融合的深度测量装置及测量方法
WO2021238212A1 (zh) 一种深度测量装置、方法及电子设备
CN110596725B (zh) 基于插值的飞行时间测量方法及测量系统
CN109343070A (zh) 时间飞行深度相机
WO2021051480A1 (zh) 一种动态直方图绘制飞行时间距离测量方法及测量系统
WO2021051481A1 (zh) 一种动态直方图绘制飞行时间距离测量方法及测量系统
CN110824490B (zh) 一种动态距离测量系统及方法
WO2021212915A1 (zh) 一种激光测距装置及方法
CN109791207A (zh) 用于确定到对象的距离的系统和方法
WO2021238213A1 (zh) 一种基于tof的深度测量装置、方法及电子设备
CN111025321B (zh) 一种可变焦的深度测量装置及测量方法
CN111722241A (zh) 一种多线扫描距离测量系统、方法及电子设备
WO2021169531A1 (zh) 一种ToF深度测量装置、控制ToF深度测量装置的方法及电子设备
CN110221274A (zh) 时间飞行深度相机及多频调制解调的距离测量方法
WO2021056669A1 (zh) 一种集成分束扫描装置及其制造方法
CN110221272A (zh) 时间飞行深度相机及抗干扰的距离测量方法
CN209894976U (zh) 时间飞行深度相机及电子设备

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 20907421; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 20907421; Country of ref document: EP; Kind code of ref document: A1)