CN212341462U - Depth measurement module and system
- Publication number
- CN212341462U (application number CN202021788870.7U)
- Authority
- CN
- China
- Prior art keywords
- light
- light source
- optical
- mode
- receiving
- Legal status
- Active
Landscapes
- Optical Radar Systems And Details Thereof (AREA)
- Measurement Of Optical Distance (AREA)
Abstract
The application provides a depth measurement module and system. The depth measurement module includes: a projection module comprising a light source unit, a first optical structure and a second optical structure, wherein the light source unit has a first light-emitting mode and a second light-emitting mode; light emitted by the light source unit in the first light-emitting mode passes through the first optical structure and projects a first picture into space, and light emitted in the second light-emitting mode passes through the second optical structure and projects a second picture into space; and a receiving module configured to receive the first picture based on a first receiving mode to implement direct time-of-flight measurement of a target object, and to receive the second picture based on a second receiving mode to implement indirect time-of-flight measurement of the target object. In this way, the depth measurement module can select a suitable time-of-flight measurement mode under different distance and sharpness requirements, effectively overcoming the adaptability problem of the depth measurement module in different application scenarios.
Description
Technical Field
The application relates to the technical field of depth measurement, in particular to a depth measurement module and a depth measurement system.
Background
The time-of-flight method calculates the distance to an object by measuring the time light takes to travel through space. There are two main approaches: indirect time of flight (iToF) and direct time of flight (dToF).
In iToF, a modulated optical signal is emitted, the phase delay between the emitted and reflected optical signals is measured, and the time of flight is then calculated from that phase delay to obtain the measured distance. In dToF, a high-precision time counter is started synchronously with the optical signal emitter and stopped when the optical signal reflected from the object is detected; this yields the round-trip time of the optical signal, and since the speed of light is constant, the distance to be measured is obtained.
At present, because their technical principles and hardware structures differ, the iToF and dToF technologies perform differently in various respects. Compared with dToF, iToF has the advantage of high resolution, but it also consumes more power and its measurement distance is limited by the modulation wavelength, so it is generally suited to short-range application scenarios. dToF consumes less power and its measurement distance is not limited by the modulation wavelength, but its receiving-end chip has low resolution, so the reproduction of details and textures of nearby objects is far from sufficient, and dToF is therefore generally used at long range.
In summary, the iToF technique offers higher image resolution but a shorter measurement distance, while the dToF technique measures longer distances but with lower image resolution. The iToF and dToF technologies therefore target different application scenarios, and both face the problem of adapting to the actual application scenario.
SUMMARY OF THE UTILITY MODEL
An object of the embodiments of the present application is to provide a depth measurement module and a depth measurement system, so as to overcome the adaptability problem of the iToF technology and the dToF technology in an application scenario.
In order to achieve the above object, embodiments of the present application are implemented as follows:
in a first aspect, an embodiment of the present application provides a depth measurement module, including: the projection module comprises a light source unit, a first optical structure and a second optical structure, wherein the light source unit comprises a first light-emitting mode and a second light-emitting mode, light rays emitted by the light source unit based on the first light-emitting mode pass through the first optical structure to project a first picture into a space, and light rays emitted by the light source unit based on the second light-emitting mode pass through the second optical structure to project a second picture into the space; the receiving module comprises a first receiving mode and a second receiving mode, and is used for receiving the first picture based on the first receiving mode so as to realize direct flight time measurement of the target object, and is also used for receiving the second picture based on the second receiving mode so as to realize indirect flight time measurement of the target object.
In the embodiment of the present application, the light source unit has two emission modes (the first and second light-emitting modes) and the receiving module has two receiving modes (the first and second receiving modes): light emitted in the first light-emitting mode projects a first picture into space through the first optical structure, and the receiving module can receive the first picture in the first receiving mode to implement direct time-of-flight measurement of the target object; light emitted in the second light-emitting mode projects a second picture into space through the second optical structure, and the receiving module can receive the second picture in the second receiving mode to implement indirect time-of-flight measurement of the target object. In this way, direct time-of-flight measurement and indirect time-of-flight measurement are combined in one module, so the depth measurement module can select a suitable time-of-flight measurement mode under different distance and sharpness requirements, effectively overcoming the adaptability problem of the depth measurement module in different application scenarios.
With reference to the first aspect, in a first possible implementation manner of the first aspect, the light source unit includes a VCSEL light source, and the VCSEL light source is connected to an external control circuit, and the VCSEL light source is configured to emit a first light signal with a preset pulse width under the control of the external control circuit, where the VCSEL light source emits the first light signal with the preset pulse width in the first light emitting mode; or, the VCSEL light source is configured to emit a second optical signal with a preset modulation frequency under the control of the external control circuit, where the VCSEL light source emits the second optical signal with the preset modulation frequency in the second light emitting mode.
In this implementation, the switching of the emission mode of the VCSEL light source can be simply and reliably achieved by controlling the emission mode of the VCSEL light source (the first optical signal with a preset pulse width, or the second optical signal with a preset modulation frequency) by the external control circuit.
With reference to the first possible implementation manner of the first aspect, in a second possible implementation manner of the first aspect, the first optical structure includes a collimating mirror and a diffractive optical element, where the collimating mirror is configured to collimate the first optical signal; the diffractive optical element is configured to project a speckle pattern into a space based on the collimated first optical signal, where the speckle pattern is the first picture.
In this implementation, the first optical structure includes a collimating mirror (which collimates the first optical signal) and a diffractive optical element (which projects a speckle pattern into space based on the collimated first optical signal), which may ensure the reliability of the direct time-of-flight measurement.
With reference to the first possible implementation manner of the first aspect, in a third possible implementation manner of the first aspect, the second optical structure includes an optical diffusion sheet, and the optical diffusion sheet is configured to perform light homogenizing processing on the second optical signal, so as to project an optical pattern into a space based on the second optical signal after the light homogenizing processing, where the optical pattern is the second picture.
In this implementation, the second optical structure includes an optical diffuser (for homogenizing the second optical signal) to project an optical pattern into space, which may ensure reliability of the indirect time-of-flight measurement.
With reference to the first aspect, in a fourth possible implementation manner of the first aspect, the light source unit includes a first light source and a second light source that are disposed on a same substrate, where the first light source and the second light source are respectively connected to an external control circuit, and the first light source is configured to emit a first light signal with a preset pulse width under the control of the external control circuit, where the first light source emits the first light signal with the preset pulse width in the first light emitting mode; the second light source is used for emitting a second optical signal with a preset modulation frequency under the control of the external control circuit, wherein the second optical signal with the preset modulation frequency emitted by the second light source is in the second light emitting mode.
In this implementation, the light source unit includes a first light source and a second light source disposed on the same substrate, the first light source being configured to emit a first light signal (first light emitting mode) with a preset pulse width, and the second light source being configured to emit a second light signal (second light emitting mode) with a preset modulation frequency. In this way, an additional optical structural member is not required to be designed to control the optical path of the light source (the first light source or the second light source) to the corresponding optical structure (the first optical structure or the second optical structure), development cost can be saved, and the structure is simple and high in reliability.
With reference to the fourth possible implementation manner of the first aspect, in a fifth possible implementation manner of the first aspect, the first optical structure includes a first collimating mirror and a diffractive optical element, the second optical structure includes an optical diffusion sheet, the first collimating mirror is disposed on an optical path of the first light source and is configured to collimate the first light signal emitted by the first light source, and the diffractive optical element is disposed on a front side of the first collimating mirror and is configured to project a speckle pattern into a space based on the collimated first light signal, where the front side is a side of the collimating mirror away from the first light source, and the speckle pattern is the first picture; the optical diffusion sheet is arranged on an optical path of the second light source and used for carrying out light homogenizing treatment on the second light signal so as to project an optical pattern to a space based on the second light signal after the light homogenizing treatment, wherein the optical pattern is the second picture.
In this implementation, the first optical structure includes a first collimating mirror and a diffractive optical element, and the second optical structure includes an optical diffuser, so that the first light source and the second light source have their respective corresponding optical structures in their respective light emitting modes to implement their corresponding projection modes, thereby reliably implementing their respective corresponding time-of-flight measurement modes.
With reference to the fifth possible implementation manner of the first aspect, in a sixth possible implementation manner of the first aspect, the second optical structure further includes a second collimating mirror, where the second collimating mirror is disposed on the optical path of the second light source, and is located between the second light source and the optical diffusion sheet, and is configured to receive the second light signal emitted by the second light source, so that the optical diffusion sheet performs light uniformization on the received second light signal, where the light uniformization indicates that a divergence angle of a light ray is reduced.
In this implementation, the second optical structure further includes a second collimating mirror, and the second collimating mirror can collimate light emitted by the second light source, so that crosstalk between the first light source and the second light source is avoided as much as possible, and reliability of direct time-of-flight measurement and indirect time-of-flight measurement is ensured.
With reference to the sixth possible implementation manner of the first aspect, in a seventh possible implementation manner of the first aspect, the first collimating mirror and the second collimating mirror are integrally designed, and the diffractive optical element and the optical diffusion sheet are integrally designed.
In this implementation, the first collimating mirror and the second collimating mirror are integrally designed, and the diffractive optical element and the optical diffusion sheet are integrally designed. This minimizes the number of components used, which on the one hand reduces the complexity of module assembly and on the other hand helps to save cost.
With reference to the first aspect, or with reference to any one of the first to the seventh possible implementation manners of the first aspect, in an eighth possible implementation manner of the first aspect, the receiving module includes an image sensor, and the image sensor is connected to an external control circuit and is configured to receive the first image under the control of the external control circuit to implement direct time-of-flight measurement on the target object, or receive the second image to implement indirect time-of-flight measurement on the target object.
In this implementation, the image sensor is connected to an external control circuit, and switching of the receiving mode may be implemented, for example, receiving a first picture to implement direct time-of-flight measurement of the target object, or receiving a second picture to implement indirect time-of-flight measurement of the target object. The method can effectively simplify the complexity of the receiving module, optimize the structure and the size of the receiving module and is also beneficial to controlling the cost.
With reference to the first aspect, or with reference to any one of the first to seventh possible implementation manners of the first aspect, in a ninth possible implementation manner of the first aspect, the receiving module includes a first image sensor and a second image sensor, where the first image sensor and the second image sensor are respectively connected to an external control circuit, and the first image sensor is configured to receive the first picture under the control of the external control circuit to implement direct time-of-flight measurement on an object; and the second image sensor is used for receiving the second picture under the control of the external control circuit so as to realize indirect flight time measurement of the target object.
In this implementation, the receiving module may include a first image sensor and a second image sensor, which are respectively configured to receive the first image and the second image, so as to respectively implement direct time-of-flight measurement and indirect time-of-flight measurement on the target object. The method is beneficial to simplifying the design complexity of an external control circuit and improving the independence and reliability of two measurement modes.
In a second aspect, an embodiment of the present application provides a depth measurement system, which includes a distance sensor, a controller, and the depth measurement module described in the first aspect or any of its possible implementations, where the distance sensor, the projection module, and the receiving module are respectively connected to the controller, and the distance sensor detects a distance to a target object and sends the distance to the controller; the controller is configured to control the light source unit to emit light through the first light-emitting mode to project a first picture into a space including the target object when the distance is greater than a first distance threshold, and to control the receiving module to receive the first picture based on the first receiving mode, so as to implement direct time-of-flight measurement of the target object; and the controller is further configured to control the light source unit to emit light through the second light-emitting mode to project a second picture into a space including the target object when the distance is smaller than a second distance threshold, and to control the receiving module to receive the second picture based on the second receiving mode, so as to implement indirect time-of-flight measurement of the target object, wherein the first distance threshold is greater than or equal to the second distance threshold.
In the embodiment of the application, the distance of the target object is detected through the distance sensor, and the controller controls the flight time measurement mode of the depth measurement module based on the distance threshold range where the detected distance is located, so that the adaptability measurement of different application scenes (mainly different distances) can be realized, and the adaptability problem of the direct flight time measurement mode and the indirect flight time measurement mode in the application scenes is effectively solved.
With reference to the second aspect, in a first possible implementation manner of the second aspect, when the distance is between the first distance threshold and the second distance threshold, the controller is further configured to: acquiring a current shooting mode, wherein the shooting mode comprises a definition priority mode and a contour priority mode; when the shooting mode is the profile priority mode, controlling the light source unit to emit light rays through the first light emitting mode so as to project a first picture into a space containing the target object, and controlling the receiving module to receive the first picture based on the first receiving mode so as to realize direct flight time measurement of the target object; and when the shooting mode is the definition priority mode, controlling the light source unit to emit light rays through the second light emitting mode so as to project a second picture into a space containing the target object, and controlling the receiving module to receive the second picture based on the second receiving mode so as to realize indirect flight time measurement of the target object.
In this implementation, when the distance is between the first distance threshold and the second distance threshold, both direct and indirect time-of-flight measurement can effectively detect the distance to the target object. In this case, the shooting mode (contour priority mode or sharpness priority mode) can also be taken into account: in the contour priority mode (for example, gesture recognition), very high sharpness is not required, so the distance to the target object can be measured with the direct time-of-flight method, which saves resources; in the sharpness priority mode (for example, face recognition), the captured image can be given higher sharpness. This improves the flexibility of the depth measurement method and allows different requirements to be met.
In order to make the aforementioned objects, features and advantages of the present application more comprehensible, preferred embodiments accompanied with figures are described in detail below.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are required to be used in the embodiments of the present application will be briefly described below, it should be understood that the following drawings only illustrate some embodiments of the present application and therefore should not be considered as limiting the scope, and that those skilled in the art can also obtain other related drawings based on the drawings without inventive efforts.
Fig. 1 is a schematic view of a depth measurement module according to an embodiment of the present disclosure.
Fig. 2 is a schematic structural diagram of a projection module according to an embodiment of the present disclosure.
Fig. 3 is a schematic structural diagram of another projection module according to an embodiment of the present disclosure.
Fig. 4 is a schematic view of a first optical structure and a second optical structure provided in an embodiment of the present application with a baffle added therebetween.
Fig. 5 is a schematic view of an integrated design of a diffractive optical element and an optical diffuser provided in an embodiment of the present application.
Fig. 6 is a schematic structural diagram of a receiving module according to an embodiment of the present disclosure.
Fig. 7 is a schematic structural diagram of another receiving module according to an embodiment of the present disclosure.
Fig. 8 is a schematic structural diagram of a depth measurement system according to an embodiment of the present application.
Reference numerals: 10-a depth measurement system; 100-a depth measurement module; 110-a projection module; 111-a light source unit; 1111-a first light source; 1112-a second light source; 112-a first optical structure; 1121-first collimating mirror; 1122-diffractive optical element; 113-a second optical structure; 1131-a second collimating mirror; 1132-optical diffuser; 120-a receiving module; 121-an image sensor; 1211-a first image sensor; 1212-a second image sensor; 122-lens group; 200-a controller; 300-distance sensor.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application.
In the description of the present application, it should be noted that the terms "inside", "outside", and the like indicate orientations or positional relationships based on the orientations or positional relationships shown in the drawings or orientations or positional relationships that the products of the application usually place when using, and are only used for convenience in describing the present application and simplifying the description, but do not indicate or imply that the devices or elements that are referred to must have a specific orientation, be constructed in a specific orientation, and operate, and thus, should not be construed as limiting the present application. Furthermore, the terms "first," "second," and the like are used merely to distinguish one description from another, and are not to be construed as indicating or implying relative importance.
It should also be noted that, unless expressly stated or limited otherwise, the terms "disposed" and "connected" are to be construed broadly, e.g., as meaning fixedly connected, detachably connected, or integrally connected; they may be connected directly or indirectly through intervening media, or they may be interconnected between two elements. The specific meaning of the above terms in the present application can be understood in a specific case by those of ordinary skill in the art.
Before introducing the depth measurement module provided in the embodiments of the present application, and to facilitate understanding of the solution, the direct time-of-flight measurement method (abbreviated dToF, hereinafter dToF) and the indirect time-of-flight measurement method (abbreviated iToF, hereinafter iToF) are first introduced.
The measurement principle of dToF is: a high-precision time counter is started synchronously with the optical signal emitter; when the optical signal reflected from the object is detected, the time counter is stopped, which yields the round-trip time of the optical signal, and since the speed of light is constant, the distance to be measured can be obtained.
A dToF product structure generally includes a projection module and a receiving module mounted on the same substrate. A VCSEL (Vertical-Cavity Surface-Emitting Laser) light source may be used as the light source of the projection module, and a collimating lens may collimate the beam of the VCSEL light source so that it enters a diffractive optical element (DOE) as parallel light and projects a regular speckle pattern into space. The receiving module is usually composed of Single Photon Avalanche Diodes (SPAD), with a resolution of about 30,000 pixels. During depth measurement, the VCSEL light source emits a high-energy, short-pulse-width optical signal (typically 1 nanosecond); after being collimated by the collimating lens, the signal enters the DOE, where the beam is split and diffracted, and a diffraction pattern with a specific arrangement is projected into space. The receiving module receives the optical signals reflected by objects in the space, and the high-precision time counter records the time the light spends travelling through space, thereby realizing depth measurement. The depth calculation formula is as follows:
d = c × (t1 − t0) / 2
where d is the depth of the object, t0 is the time at which the projection module emits the optical signal, t1 is the time at which the receiving module receives the reflected optical signal, and c is the speed of light.
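For illustration only (not part of the utility model), the dToF relation above can be written as a short computation; the function name and the nanosecond example value are assumptions made here, and times are assumed to be in seconds:

```python
C = 299_792_458.0  # speed of light in m/s

def dtof_depth(t0: float, t1: float) -> float:
    """Direct time-of-flight depth: d = c * (t1 - t0) / 2.

    t0: time at which the projection module emits the pulse (s)
    t1: time at which the receiving module detects the reflected pulse (s)
    """
    return C * (t1 - t0) / 2.0

# A 20 ns round trip corresponds to roughly 3 m of depth.
print(dtof_depth(0.0, 20e-9))  # ~2.998 m
```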
The measurement principle of iToF is: a modulated optical signal is first emitted, the phase delay between the emitted and reflected optical signals is measured, and the time of flight is then calculated from the delayed phase to obtain the measured distance.
The product structure of iToF typically also includes a projection module and a receiving module mounted on the same substrate, and the projection module may include a light source (e.g., a VCSEL or another type of laser light source) and an optical diffuser (Diffuser). The chip used by the receiving module (i.e., the image sensor) is usually a CMOS (Complementary Metal Oxide Semiconductor) chip, with a resolution of about 300,000 pixels. During depth measurement, the light source emits an optical signal with a specific modulation frequency (usually 10-100 MHz); after the Diffuser's light-homogenizing processing, an optical pattern with the specific modulation frequency is projected into space. The receiving module receives the optical signal reflected by objects in the space, and depth measurement is realized by comparing the phase delay between the emitted and reflected optical signals. The depth calculation formula is as follows:
d = c × Δφ / (4π × fm)
where d is the depth of the object to be measured, fm is the modulation frequency of the optical signal, Δφ is the phase difference between the emitted and received signals, and c is the speed of light.
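As a hedged sketch of the iToF relation above (and of why the measurement distance is limited by the modulation wavelength, as noted in the background), the depth and the unambiguous range can be computed as follows; the function names and example values are illustrative assumptions:

```python
import math

C = 299_792_458.0  # speed of light in m/s

def itof_depth(phase_diff: float, f_mod: float) -> float:
    """Indirect time-of-flight depth: d = c * dphi / (4 * pi * f_mod).

    phase_diff: phase delay between emitted and received signal (rad)
    f_mod: modulation frequency of the optical signal (Hz)
    """
    return C * phase_diff / (4.0 * math.pi * f_mod)

def unambiguous_range(f_mod: float) -> float:
    """Maximum depth before the phase wraps around: c / (2 * f_mod)."""
    return C / (2.0 * f_mod)

# At 100 MHz modulation the unambiguous range is only about 1.5 m,
# which is consistent with iToF being suited to short-range scenes.
print(unambiguous_range(100e6))        # ~1.499 m
print(itof_depth(math.pi / 2, 100e6))  # ~0.375 m
```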
The above is an introduction of the direct time-of-flight measurement and the indirect time-of-flight measurement in the present embodiment, and the depth measurement module 100 provided in the embodiment of the present application will be described in detail below.
Referring to fig. 1, fig. 1 is a schematic structural diagram of a depth measurement module 100 according to an embodiment of the present disclosure.
In this embodiment, the depth measurement module 100 may include: a projection module 110 and a receiving module 120, wherein the projection module 110 may include a light source unit 111, a first optical structure 112 and a second optical structure 113. The light source unit 111 may include a first light emitting mode and a second light emitting mode, and the receiving module 120 may include a first receiving mode and a second receiving mode. In the first light emitting mode, the light emitted from the light source unit 111 may project a first image (which may be understood as a light signal) into the space through the first optical structure 112, and the receiving module 120 may receive the first image reflected by the target object in the space through the first receiving mode, and implement direct time-of-flight measurement on the target object in the space through a preset direct time-of-flight measurement manner. In the second light emitting mode, the light emitted from the light source unit 111 may project a second image (which may also be understood as a light signal) into the space through the second optical structure 113, and the receiving module 120 may receive the second image reflected by the target object in the space through the second receiving mode, and implement indirect time-of-flight measurement on the target object in the space through a preset indirect time-of-flight measurement mode.
Two emission modes (first and second light emission modes) by the light source unit 111 and two reception modes (first and second reception modes) by the reception module 120: the light emitted based on the first light emitting mode projects a first image into the space through the first optical structure 112, and the receiving module 120 can receive the first image based on the first receiving mode to achieve direct time-of-flight measurement of the target object; the light emitted based on the second light emitting mode projects a second image into the space through the second optical structure 113, and the receiving module 120 can receive the second image based on the second receiving mode to achieve indirect time-of-flight measurement of the target object. In such a way, direct flight time measurement and indirect flight time measurement can be fused, so that the depth measurement module 100 can select a proper flight time measurement mode under different distance and definition requirements, and the problem of adaptability of the depth measurement module 100 in different application scenes is effectively solved.
Referring to fig. 2, fig. 2 is a schematic structural diagram of a projection module 110 according to an embodiment of the present disclosure.
Illustratively, the light source unit 111 may include a VCSEL light source, and the VCSEL light source is connected to an external control circuit (the external control circuit may control a light emitting mode of the VCSEL light source). The VCSEL light source can emit a first light signal with a preset pulse width (e.g. 1 ns) under the control of the external control circuit, where the first light signal with the preset pulse width emitted by the VCSEL light source is the first light emitting mode of the light source unit 111. Of course, the VCSEL light source may also emit a second light signal with a predetermined modulation frequency (e.g. 100MHz, 30MHz, etc.) under the control of the external control circuit, where the VCSEL light source emits the second light signal with the predetermined modulation frequency, which is the second light emitting mode of the light source unit 111.
The switching of the emission mode of the VCSEL light source can be simply and reliably achieved by controlling the emission mode of the VCSEL light source (the first optical signal with a preset pulse width or the second optical signal with a preset modulation frequency) by an external control circuit.
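As a minimal sketch (not drawn from the patent itself) of how an external control circuit could parameterize the two emission modes, the following structure uses the example values quoted above (1 ns pulse width, 10-100 MHz modulation); the type and field names are assumptions for illustration:

```python
from dataclasses import dataclass
from enum import Enum, auto
from typing import Optional

class EmissionMode(Enum):
    PULSED = auto()     # first light-emitting mode: short, high-energy pulses (dToF)
    MODULATED = auto()  # second light-emitting mode: continuously modulated signal (iToF)

@dataclass
class VcselDriveConfig:
    mode: EmissionMode
    pulse_width_s: Optional[float] = None  # used in PULSED mode, e.g. 1e-9
    mod_freq_hz: Optional[float] = None    # used in MODULATED mode, e.g. 3e7 to 1e8

# Example configurations for the two modes described in the text.
first_mode = VcselDriveConfig(EmissionMode.PULSED, pulse_width_s=1e-9)
second_mode = VcselDriveConfig(EmissionMode.MODULATED, mod_freq_hz=100e6)
```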
Of course, the above-mentioned implementation of the direct time-of-flight measurement mode or the indirect time-of-flight measurement mode by switching the light emitting mode through the VCSEL light source is only one of various implementation schemes, and some other schemes may also be provided in this embodiment to implement switching of the emission mode of the light source unit 111, so as to implement switching of the direct time-of-flight measurement mode and the indirect time-of-flight measurement mode.
Referring to fig. 3, fig. 3 is a schematic structural diagram of another projection module 110 according to an embodiment of the present disclosure.
In this embodiment, the light source unit 111 may include a first light source 1111 and a second light source 1112, and the first light source 1111 and the second light source 1112 may be disposed on the same substrate and respectively connected to an external control circuit. The first light source 1111 may emit a first light signal with a preset pulse width under the control of the external control circuit to implement a first light emitting mode of the light source unit 111, the first light signal projects a first image into the space through the first optical structure 112, and the receiving module 120 receives the light (light signal) reflected by the target object in the space by using the first receiving mode to implement direct time-of-flight measurement of the target object. The second light source 1112 can emit a second light signal with a preset modulation frequency under the control of the external control circuit to realize a second light emitting mode of the light source unit 111, the second light signal projects a second image into the space through the second optical structure 113, and the receiving module 120 receives the light (light signal) reflected by the target object in the space by using the second receiving mode to realize indirect flight time measurement of the target object.
In this way, it is not necessary to design an additional optical structure to control the light path of the light source (the first light source 1111 or the second light source 1112) to the corresponding optical structure (the first optical structure 112 or the second optical structure 113), which can save the development cost and has a simple structure and high reliability.
For example, the first light source 1111 and the second light source 1112 may employ the same type of light source, for example, the first light source 1111 and the second light source 1112 may both be VCSEL light sources; different types of light sources can be used, and the light sources are not limited herein, subject to actual needs.
It should be noted that, since the optical paths of the first optical structure 112 and the second optical structure 113 may be different, if the two light-emitting modes of a single light source are to implement direct time-of-flight measurement and indirect time-of-flight measurement respectively, the optical path of the light source unit 111 in the first light-emitting mode may differ from its optical path in the second light-emitting mode. In this case, structural members for changing the optical path (for example, by reflection or refraction) need to be designed, or the position of the light source needs to be changed (for example, the position of the light source unit 111 may be adjustable: in the first light-emitting mode, the light source unit 111 emits the first light signal at a first position so that the first light signal projects the first picture into space through the first optical structure 112, and in the second light-emitting mode, the light source unit 111 emits the second light signal at a second position so that the second light signal projects the second picture into space through the second optical structure 113). Therefore, if two light sources (the first light source 1111 and the second light source 1112) are used to respectively realize the first light-emitting mode and the second light-emitting mode of the light source unit 111, no additional structural design is required; on the one hand this reduces development difficulty and saves development cost, and on the other hand the projection module 110 gains the advantages of a simple structure and high reliability.
With continued reference to fig. 2, in the present embodiment, the first optical structure 112 may include a collimating mirror and a diffractive optical element 1122.
For example, the collimating mirror may be disposed in front of the light source unit 111 (or the first light source 1111) (i.e., the side of the light source unit 111 that emits the first optical signal) to collimate the first optical signal. The diffractive optical element 1122 may be disposed in front of the collimating mirror (i.e., on the side from which the collimated first optical signal exits) to project a speckle pattern (i.e., a first image) into space based on the collimated first optical signal, thereby ensuring the reliability of the direct time-of-flight measurement.
In this embodiment, the second optical structure 113 may include an optical diffuser 1132.
For example, the optical diffuser 1132 may be disposed in front of the light source unit 111 (or the second light source 1112) (it may share the same baseline as the diffractive optical element 1122 in the first optical structure 112 or sit on a different baseline; this is not limited here) and is used to perform light-homogenizing processing on the second light signal, so as to project an optical pattern (i.e., the second picture) into space based on the homogenized second light signal, thereby ensuring the reliability of the indirect time-of-flight measurement method.
To prevent crosstalk between the different light-emitting modes of the light source unit 111 (i.e., in the first light-emitting mode, the first light signal emitted by the light source unit 111 or the first light source 1111 should enter the first optical structure 112, but part of it may instead enter the second optical structure 113), in this embodiment a collimating mirror may be added between the optical diffusion sheet 1132 of the second optical structure 113 and the light source unit 111 (or the second light source 1112). This collimating mirror collimates the second light signal emitted by the light source unit 111 (or the second light source 1112) and narrows the beam (that is, it reduces the divergence angle of the light), so that the optical diffusion sheet 1132 performs light-homogenizing processing on the received second light signal, thereby avoiding crosstalk as much as possible.
Referring to fig. 3 again, the first optical structure 112 may include a first collimating mirror 1121 and a diffractive optical element 1122, and the second optical structure 113 may include a second collimating mirror 1131 and an optical diffuser 1132.
The first collimating mirror 1121 may be disposed on an optical path of the first light source 1111 for collimating the first light signal emitted from the first light source 1111, and the diffractive optical element 1122 may be disposed at a front side of the first collimating mirror 1121 (i.e., a side of the collimating mirror away from the first light source 1111) and may project a speckle pattern (i.e., a first picture) into a space based on the collimated first light signal. The second collimating mirror 1131 is disposed on the optical path of the second light source 1112, and is located between the second light source 1112 and the optical diffusion sheet 1132, and is configured to receive the second light signal emitted by the second light source 1112 (reduce the divergence angle of the light ray), so that the optical diffusion sheet 1132 performs light homogenizing processing on the received second light signal, and projects an optical pattern (i.e., a second picture) into a space based on the second light signal after the light homogenizing processing.
In this way, the first light source 1111 and the second light source 1112 can have their respective optical structures in their respective light emitting modes to realize their respective projection modes, thereby reliably realizing their respective time-of-flight measurement modes. In addition, the second collimating mirror 1131 of the second optical structure 113 can collimate the light emitted by the second light source 1112, so as to avoid crosstalk between the first light source 1111 and the second light source 1112 as much as possible, and ensure the reliability of direct time-of-flight measurement and indirect time-of-flight measurement.
Of course, the mode of collecting light by adding the collimating lens to prevent light crosstalk is only an exemplary mode, and light crosstalk may be prevented by adding a baffle between the first optical structure 112 and the second optical structure 113 (as shown in fig. 4), which is not limited herein.
Referring to fig. 3, in the present embodiment, the first optical structure 112 may include a first collimating mirror 1121 and a diffractive optical element 1122, and the second optical structure 113 may include a second collimating mirror 1131 and an optical diffuser 1132. In order to reduce the complexity of assembling the depth measurement module 100, in the present embodiment, a single-piece design may be adopted between the partial structural members of the first optical structure 112 and the partial structural members of the second optical structure 113.
For example, the first collimating mirror 1121 and the second collimating mirror 1131 may be integrally designed, regardless of whether they are of the same type. This design can be realized with current processing technology. During design and processing, the focal lengths and positions of the first collimating mirror 1121 and the second collimating mirror 1131 can be taken into account; the specific choices may follow the requirements of the actual design and are not particularly limited here.
For example, the diffractive optical element 1122 and the optical diffuser 1132 may be integrally formed. For example, the diffractive optical element 1122 and the optical diffuser 1132 may be designed on the same glass substrate by etching or nano-imprinting, so as to realize an integrated design of the diffractive optical element 1122 and the optical diffuser 1132, as shown in fig. 5.
By integrally designing the first collimating mirror 1121 with the second collimating mirror 1131, and the diffractive optical element 1122 with the optical diffusion sheet 1132, the number of components used is minimized; on the one hand this reduces the complexity of module assembly, and on the other hand it helps to save cost.
It should be noted that the above description of the projection module 110 is only exemplary. For example, the light source unit 111 may implement the first and second light-emitting modes with a single light source (specifically, by changing the optical path or changing the position of the light source) or with dual light sources. As another example, the composition of the first optical structure 112 and the second optical structure 113 may vary: the first optical structure 112 may include the first collimating mirror 1121 and the diffractive optical element 1122, while the second optical structure 113 may include only the optical diffuser 1132, or the second collimating mirror 1131 together with the optical diffuser 1132; and the first optical structure 112 and the second optical structure 113 may or may not adopt an integrated design. All of these options can be combined arbitrarily; only some of the combinations are described in this embodiment, not all of them, and other combinations are also intended to fall within the scope of protection of this application.
Referring to fig. 6, fig. 6 is a schematic structural diagram of a receiving module 120 according to an embodiment of the present disclosure.
In this embodiment, the receiving module 120 may include an image sensor 121. The image sensor 121 may be connected to the external control circuit, and configured to receive a first picture (i.e., a first receiving mode) to achieve direct time-of-flight measurement of the target object or receive a second picture (i.e., a second receiving mode) to achieve indirect time-of-flight measurement of the target object under the control of the external control circuit. In this way, two receiving modes of the receiving module 120 are realized by using a single image sensor 121, which can effectively simplify the complexity of the receiving module 120, optimize the structure and the size of the receiving module 120, and is also beneficial to controlling the cost.
Illustratively, the image sensor 121 may be an array pixel sensor cell (array size representing the resolution of the camera) composed of a charge coupled device, a Complementary Metal Oxide Semiconductor (CMOS), an Avalanche Diode (AD), a Single Photon Avalanche Diode (SPAD), etc.
In this embodiment, the receiving module 120 may further include a lens group 122, where the lens group 122 is disposed in front of the image sensor 121 (i.e., before the light in the space propagates to the image sensor 121), and processes (e.g., converges) the light to be received by the image sensor 121, so as to improve the efficiency of the image sensor 121 for receiving the light.
Of course, the receiving module 120 may also implement two receiving modes (i.e. the first receiving mode and the second receiving mode) in other manners. For example, referring to fig. 7, the receiving module 120 may include a first image sensor 1211 and a second image sensor 1212, and the first image sensor 1211 and the second image sensor 1212 are respectively connected to the external control circuit. The first image sensor 1211 may receive a first picture under the control of the external control circuit to implement direct time-of-flight measurement of the target object; and the second image sensor 1212 may receive a second picture under the control of the external control circuit to achieve indirect time-of-flight measurement of the target object. The method is beneficial to simplifying the design complexity of an external control circuit and improving the independence and reliability of two measurement modes.
It should be noted that, in this embodiment, the specific structure of the projection module 110 and the specific structure of the receiving module 120 may be arbitrarily combined to implement the present solution, and the present invention is not limited herein. In addition, the external control circuit mentioned in this embodiment is a unified concept regarding the peripheral circuit controlling the entire depth measurement module 100, and the external control circuit may be a mutually associated circuit, or may be a relatively independent circuit (for example, the circuit controlling the first light source 1111 and the circuit controlling the second light source 1112 in the light source unit 111 may be mutually independent; for example, a circuit composed of one or more of a signal amplifier, a time-to-digital converter, an analog-to-digital converter, and a filter connected to the image sensor 121 may be mutually independent from the circuit controlling the light source unit 111, etc.), which may be based on actual needs, and is not limited herein.
Referring to fig. 8, fig. 8 is a schematic structural diagram of a depth measurement system 10 according to an embodiment of the present disclosure.
In this embodiment, the depth measuring system 10 may include a distance sensor 300, a controller 200, and a depth measuring module 100 in this embodiment. The distance sensor 300, the projection module 110 and the receiving module 120 are respectively connected to the controller 200.
The distance sensor 300 may detect the distance to the target object and transmit it to the controller 200. The controller 200 may then evaluate the distance: when the distance is greater than a first distance threshold (e.g., 5 meters), the light source unit 111 is controlled to emit light in the first light-emitting mode to project a first picture into the space containing the target object, and the receiving module 120 is controlled to receive the first picture based on the first receiving mode, so as to implement direct time-of-flight measurement of the target object. When the distance is smaller than a second distance threshold (e.g., 3 meters), the light source unit 111 may be controlled to emit light in the second light-emitting mode to project a second picture into the space containing the target object, and the receiving module 120 may be controlled to receive the second picture based on the second receiving mode, so as to implement indirect time-of-flight measurement of the target object. The first distance threshold is greater than or equal to the second distance threshold.
The distance of the target object is detected by the distance sensor 300, and the controller 200 controls the flight time measurement mode of the depth measurement module 100 based on the distance threshold range of the detected distance, so that adaptive measurement on different application scenes (mainly different distances) can be realized, and the problem of adaptability of the direct flight time measurement mode and the indirect flight time measurement mode in the application scenes is effectively solved.
The controller 200 may also acquire the current photographing mode (including the sharpness priority mode and the contour priority mode) when the distance is between the first distance threshold and the second distance threshold (i.e., between 3 and 5 meters). When the photographing mode is the profile priority mode (e.g., in a gesture recognition scene), the controller 200 may control the light source unit 111 to emit light through the first light emitting mode to project a first picture into a space including the target object, and control the receiving module 120 to receive the first picture based on the first receiving mode, so as to implement direct time-of-flight measurement on the target object. When the photographing mode is the sharpness priority mode (e.g., in a face recognition scene), the controller 200 may control the light source unit 111 to emit light through the second light emitting mode to project a second picture into the space including the target object, and control the receiving module 120 to receive the second picture based on the second receiving mode, so as to achieve indirect time-of-flight measurement of the target object.
When the distance is between the first distance threshold and the second distance threshold, both direct and indirect time-of-flight measurement can effectively detect the distance to the target object. In this case, the shooting mode (contour priority mode or sharpness priority mode) can also be considered: in the contour priority mode (e.g., gesture recognition), very high sharpness is not required, so the distance to the target object can be measured with the direct time-of-flight method, saving resources; in the sharpness priority mode (e.g., face recognition), the captured image can be given higher sharpness. This improves the flexibility of the depth measurement method and allows different requirements to be met.
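A minimal sketch of the selection logic described for the controller 200, assuming the 5 m / 3 m example thresholds quoted above, string labels for the shooting modes, and an illustrative function name (none of which are specified by the utility model):

```python
def select_tof_mode(distance_m: float,
                    shooting_mode: str,
                    near_threshold_m: float = 3.0,
                    far_threshold_m: float = 5.0) -> str:
    """Choose between direct (dToF) and indirect (iToF) time-of-flight measurement.

    distance_m: distance to the target object reported by the distance sensor
    shooting_mode: 'contour_priority' or 'sharpness_priority'
    """
    if distance_m > far_threshold_m:
        return "dToF"  # first light-emitting mode + first receiving mode
    if distance_m < near_threshold_m:
        return "iToF"  # second light-emitting mode + second receiving mode
    # In the overlap band both methods work, so the shooting mode decides.
    return "dToF" if shooting_mode == "contour_priority" else "iToF"

print(select_tof_mode(8.0, "sharpness_priority"))  # dToF: beyond the far threshold
print(select_tof_mode(1.5, "contour_priority"))    # iToF: within the near threshold
print(select_tof_mode(4.0, "contour_priority"))    # dToF: e.g. gesture recognition
print(select_tof_mode(4.0, "sharpness_priority"))  # iToF: e.g. face recognition
```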
In summary, the embodiments of the present application provide a depth measurement module and system that use two emission modes of the light source unit (the first and second light-emitting modes) and two receiving modes of the receiving module (the first and second receiving modes): light emitted in the first light-emitting mode projects a first picture into space through the first optical structure, and the receiving module can receive the first picture in the first receiving mode to implement direct time-of-flight measurement of the target object; light emitted in the second light-emitting mode projects a second picture into space through the second optical structure, and the receiving module can receive the second picture in the second receiving mode to implement indirect time-of-flight measurement of the target object. In this way, direct and indirect time-of-flight measurement are combined, so the depth measurement module can select a suitable time-of-flight measurement mode under different distance and sharpness requirements, effectively overcoming the adaptability problem of the depth measurement module in different application scenarios.
The above description is only an example of the present application and is not intended to limit the scope of the present application, and various modifications and changes may be made by those skilled in the art. Any modification, equivalent replacement, improvement and the like made within the spirit and principle of the present application shall be included in the protection scope of the present application.
Claims (12)
1. A depth measurement module, comprising:
the projection module comprises a light source unit, a first optical structure and a second optical structure, wherein the light source unit comprises a first light-emitting mode and a second light-emitting mode, light rays emitted by the light source unit based on the first light-emitting mode pass through the first optical structure to project a first picture into a space, and light rays emitted by the light source unit based on the second light-emitting mode pass through the second optical structure to project a second picture into the space;
the receiving module comprises a first receiving mode and a second receiving mode, and is used for receiving the first picture based on the first receiving mode so as to realize direct flight time measurement of the target object, and is also used for receiving the second picture based on the second receiving mode so as to realize indirect flight time measurement of the target object.
2. The depth measurement module of claim 1, wherein the light source unit comprises a VCSEL light source, and the VCSEL light source is connected to an external control circuit,
the VCSEL light source is used for emitting a first optical signal with a preset pulse width under the control of the external control circuit, wherein the VCSEL light source emits the first optical signal with the preset pulse width in the first light emitting mode; or,
the VCSEL light source is configured to emit a second optical signal with a preset modulation frequency under the control of the external control circuit, where the VCSEL light source emits the second optical signal with the preset modulation frequency in the second light emitting mode.
3. The depth measurement module of claim 2, wherein the first optical structure comprises a collimating mirror and a diffractive optical element,
the collimating mirror is used for collimating the first optical signal;
the diffractive optical element is configured to project a speckle pattern into a space based on the collimated first optical signal, where the speckle pattern is the first picture.
4. The depth measurement module of claim 2, wherein the second optical structure comprises an optical diffuser,
the optical diffuser is configured to perform light homogenizing processing on the second optical signal, so as to project an optical pattern into a space based on the homogenized second optical signal, wherein the optical pattern is the second picture.
5. The depth measurement module of claim 1, wherein the light source unit comprises a first light source and a second light source disposed on the same substrate, the first light source and the second light source being respectively connected to an external control circuit,
the first light source is used for emitting a first optical signal with a preset pulse width under the control of the external control circuit, wherein the first light source emits the first optical signal with the preset pulse width in the first light emitting mode;
the second light source is used for emitting a second optical signal with a preset modulation frequency under the control of the external control circuit, wherein the second light source emits the second optical signal with the preset modulation frequency in the second light emitting mode.
6. The depth measurement module of claim 5, wherein the first optical structure comprises a first collimating mirror and a diffractive optical element, the second optical structure comprises an optical diffuser,
the first collimating mirror is arranged on a light path of the first light source and is used for collimating the first optical signal emitted by the first light source, and the diffractive optical element is arranged on the front side of the first collimating mirror and is used for projecting a speckle pattern into a space based on the collimated first optical signal, wherein the front side is the side of the first collimating mirror away from the first light source, and the speckle pattern is the first picture;
the optical diffuser is arranged on an optical path of the second light source and is used for performing light homogenizing processing on the second optical signal, so as to project an optical pattern into a space based on the homogenized second optical signal, wherein the optical pattern is the second picture.
7. The depth measurement module of claim 6, wherein the second optical configuration further comprises a second collimating mirror,
the second collimating mirror is arranged on the optical path of the second light source, is located between the second light source and the optical diffuser, and is configured to converge the second optical signal emitted by the second light source so that the optical diffuser performs light homogenizing processing on the converged second optical signal, wherein converging the light means reducing the divergence angle of the light.
8. The depth measurement module of claim 7, wherein the first collimating mirror is integrally designed with the second collimating mirror and the diffractive optical element is integrally designed with the optical diffuser.
9. The depth measurement module of any one of claims 1 to 8, wherein the receiving module comprises an image sensor, the image sensor being connected to an external control circuit and configured to, under the control of the external control circuit, receive the first picture so as to realize direct time-of-flight measurement of the target object, or receive the second picture so as to realize indirect time-of-flight measurement of the target object.
10. The depth measurement module of any one of claims 1 to 8, wherein the receiving module comprises a first image sensor and a second image sensor, the first image sensor and the second image sensor being respectively connected to an external control circuit,
the first image sensor is used for receiving the first picture under the control of the external control circuit so as to realize direct time-of-flight measurement of the target object;
and the second image sensor is used for receiving the second picture under the control of the external control circuit so as to realize indirect time-of-flight measurement of the target object.
11. A depth measurement system comprising a distance sensor, a controller and the depth measurement module of any one of claims 1 to 10, the distance sensor, the projection module and the receiving module being respectively connected to the controller,
the distance sensor detects the distance of a target object and sends the distance to the controller;
the controller is configured to, when the distance is greater than a first distance threshold, control the light source unit to emit light through the first light emitting mode so as to project a first picture into a space including the target object, and control the receiving module to receive the first picture based on the first receiving mode, so as to implement direct time-of-flight measurement of the target object; and
the controller is further configured to control the light source unit to emit light through the second light emitting mode to project a second picture into a space including the target object when the distance is smaller than a second distance threshold, and control the receiving module to receive the second picture based on the second receiving mode to achieve indirect time-of-flight measurement of the target object, where the first distance threshold is greater than or equal to the second distance threshold.
12. The depth measurement system of claim 11, wherein when the distance is between the first distance threshold and the second distance threshold, the controller is further configured to:
acquire a current shooting mode, wherein the shooting mode comprises a definition priority mode and a profile priority mode;
when the shooting mode is the profile priority mode, control the light source unit to emit light through the first light emitting mode so as to project a first picture into a space containing the target object, and control the receiving module to receive the first picture based on the first receiving mode so as to realize direct time-of-flight measurement of the target object; and
when the shooting mode is the definition priority mode, control the light source unit to emit light through the second light emitting mode so as to project a second picture into a space containing the target object, and control the receiving module to receive the second picture based on the second receiving mode so as to realize indirect time-of-flight measurement of the target object.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202021788870.7U CN212341462U (en) | 2020-08-24 | 2020-08-24 | Depth measurement module and system |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202021788870.7U CN212341462U (en) | 2020-08-24 | 2020-08-24 | Depth measurement module and system |
Publications (1)
Publication Number | Publication Date |
---|---|
CN212341462U true CN212341462U (en) | 2021-01-12 |
Family
ID=74071039
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202021788870.7U Active CN212341462U (en) | 2020-08-24 | 2020-08-24 | Depth measurement module and system |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN212341462U (en) |
Cited By (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111812663A (en) * | 2020-08-24 | 2020-10-23 | 浙江水晶光电科技股份有限公司 | Depth measurement module and system |
CN112946604A (en) * | 2021-02-05 | 2021-06-11 | 上海鲲游科技有限公司 | dTOF-based detection device and electronic device and application thereof |
CN112965073A (en) * | 2021-02-05 | 2021-06-15 | 上海鲲游科技有限公司 | Partition projection device and light source unit and application thereof |
CN112526546A (en) * | 2021-02-09 | 2021-03-19 | 深圳市汇顶科技股份有限公司 | Depth information determination method and device |
CN112526546B (en) * | 2021-02-09 | 2021-08-17 | 深圳市汇顶科技股份有限公司 | Depth information determination method and device |
WO2022193888A1 (en) * | 2021-03-18 | 2022-09-22 | 华为技术有限公司 | Detection apparatus and terminal device |
CN115113220A (en) * | 2021-03-18 | 2022-09-27 | 华为技术有限公司 | Detection device and terminal equipment |
TWI820637B (en) * | 2021-03-18 | 2023-11-01 | 大陸商華為技術有限公司 | A detection apparatus and terminal device |
CN112924986A (en) * | 2021-04-15 | 2021-06-08 | 东莞埃科思科技有限公司 | Common substrate module, assembling method thereof and manufacturing and detecting integrated equipment |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN212341462U (en) | Depth measurement module and system | |
CN111812663A (en) | Depth measurement module and system | |
CN111142088B (en) | Light emitting unit, depth measuring device and method | |
WO2019110022A1 (en) | Method and medium for emitting and receiving laser pulse, and laser radar system | |
CN110914705A (en) | Integrated LIDAR lighting power control | |
KR20230126704A (en) | LiDAR system using transmit optical power monitor | |
CN115144842A (en) | Transmitting module, photoelectric detection device, electronic equipment and three-dimensional information detection method | |
WO2021208582A1 (en) | Calibration apparatus, calibration system, electronic device and calibration method | |
WO2020142941A1 (en) | Light emitting method, device and scanning system | |
WO2020237764A1 (en) | Laser radar apparatus | |
CN111694161A (en) | Light emitting module, depth camera and electronic equipment | |
US20220120899A1 (en) | Ranging device and mobile platform | |
CN114549609A (en) | Depth measurement system and method | |
CN114488173A (en) | Distance detection method and system based on flight time | |
CN112924986A (en) | Common substrate module, assembling method thereof and manufacturing and detecting integrated equipment | |
KR101255816B1 (en) | Device and method for optically scanning three dimensional object | |
WO2024050902A1 (en) | Itof camera, calibration method, and related device | |
CN110244310A (en) | A kind of TOF system and image processing method, storage medium | |
CN115480253A (en) | Three-dimensional scanning laser radar based on SPAD linear array detector | |
JP7468661B2 (en) | LIDAR DEVICE AND METHOD FOR CALCULATING DISTANCE TO AN OBJECT | |
CN208596224U (en) | Laser radar apparatus and laser radar system | |
CN116359935B (en) | Gating imaging ranging system and ranging method | |
CN219871762U (en) | Coaxial laser radar and terminal equipment | |
CN221056668U (en) | Distance measuring device and distance measuring system | |
US20240168168A1 (en) | 3d scanning system and method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
GR01 | Patent grant | ||