WO2023048687A1 - High resolution lidar imaging system - Google Patents


Info

Publication number
WO2023048687A1
Authority
WO
WIPO (PCT)
Prior art keywords
high resolution
imaging system
detector
lidar imaging
target
Application number
PCT/TR2022/051024
Other languages
French (fr)
Inventor
Emre YÜCE
Koray ÜRKMEN
Original Assignee
Orta Dogu Teknik Universitesi
Priority claimed from TR2021/015058 external-priority patent/TR2021015058A2/en
Application filed by Orta Dogu Teknik Universitesi filed Critical Orta Dogu Teknik Universitesi
Publication of WO2023048687A1 publication Critical patent/WO2023048687A1/en

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88 Lidar systems specially adapted for specific applications
    • G01S17/89 Lidar systems specially adapted for specific applications for mapping or imaging
    • G01S7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/481 Constructional features, e.g. arrangements of optical elements
    • G01S7/4816 Constructional features, e.g. arrangements of optical elements of receivers alone
    • G01S7/4817 Constructional features, e.g. arrangements of optical elements relating to scanning

Definitions

  • the patent application numbered WO 2008008970, which is in the state of the art, describes a lidar-based 3D point cloud measurement system.
  • the system includes elements such as a target, a light source, a plurality of optical elements and detectors, and a photon detector.
  • a lidar system with 64 elements (32 each, including 2 mounts) was constructed.
  • the system can collect 1 million distance points per second based on “time of flight”.
  • the standard deviations of the time of flight measurements are equal to or less than 5 cm.
  • an inertial navigation system (INS) sensor is used to correct these deviations.
  • the light source produces its own light (laser).
  • the system is also capable of receiving and decoding multiple returns from a single laser emission by digitizing and analyzing the waveform produced by the detector when the signal generated from the emitter returns.
  • the distance measurement method mentioned in the invention is performed by continuous time of flight (TOF) measurement for all points on the target and for each different image frame. Thus, it cannot reach high depth resolution, since the resolution it provides is limited by the pulse duration.
  • the Korean patent numbered KR 20170132977, which is in the known state of the art, describes a lidar imaging system developed for vehicles.
  • the vehicle lidar device is capable of detecting an object, based on the time of flight or phase difference measurements of the transmitted signal and/or a received signal.
  • Said lidar device comprises a transmitter/light signal, a receiver, interface, processor, photodetector, photodiode and power supply.
  • the processor is able to generate an image of the object, based on transmitted and received light. Specifically, the processor can generate an image of the object by comparing the transmitted light and the light corresponding to each pixel, or it can generate an image of the object by calculating the “time of flight” or phase change.
  • the processor is able to edit the image depending on the degree to which the image is tilted horizontally or vertically.
  • a depth map is generated based on transmitted light and reflected light.
  • the processor calculates the time of flight and phase change per pixel to generate a depth map.
  • These data can be accumulated in memory; in this way, the images at the measured points are determined and the imaging system is formed.
  • the distance measurement method mentioned in this patent application is performed by continuous time of flight (TOF) measurement for all points on the target and for each different image frame. Thus, it cannot reach high depth resolution, since the resolution it provides is limited by the pulse duration.
  • WO 2019064062, which is in the state of the art, describes a system and method that uses lidar technology to image objects in the environment.
  • Said system includes elements such as a light source, a processor developed to determine the distance between the vehicle and the object, a sensor developed to detect the reflections of the light source, a moving MEMS mirror, and an actuator.
  • the lidar system can be used to generate depth maps. It is mentioned that depth information for each pixel of the image may also be recorded or may be temporal.
  • the processor controls the light source for different durations during the scanning of the target and determines whether additional light pulses are required based on the image quality.
  • the Time of Flight (TOF) of the closest point on the target can be calculated from the time of flight measured for different pixels of the background data, from the time of flight of the second signal obtained directly, and/or by summing the values of the successive peaks in the multiple reflection signals returned from the target in a group of neighboring pixels.
  • the time of flight measurement is performed indirectly on the background of a single illumination signal by using the signal returned by multiple reflections. Consequently, it cannot reach high depth resolution, since the resolution it provides is limited by the pulse duration.
  • the present invention relates to a high resolution Lidar imaging system that meets the above-mentioned requirements, eliminates all of the disadvantages and provides some additional advantages.
  • the main object of the invention is to provide a Lidar imaging system that provides high-resolution depth information without measuring multiple time-of-flight data.
  • an object of the invention is to perform a distance measurement with a detector time-detecting electronic circuit for the first measurement of the distance, as in the current systems.
  • Another object of the invention is to use the change of focus of the target image on the detector to measure the relative distances of the points on the target to each other (depth information on the target).
  • Another object of the invention is to provide a structure that does not need an individual illumination signal and time of flight calculation for each spatial position, and does not need to wait at least one time of flight for each spatial position on the target; therefore, the depth resolution of the LIDAR system is significantly increased, as is the frame rate.
  • Figure 1 4-staged high resolution and fast LIDAR system flow chart for a close-middle distance target
  • Figure 2 4-staged high resolution and fast LIDAR system for a close-middle distance target
  • Figure 3 4-staged high resolution and fast LIDAR system flow chart for a long-distance target
  • Figure 4 4-staged high resolution and fast LIDAR system for a long-distance target
  • Figure 1 shows a 4-staged high resolution and fast LIDAR system flow chart for a close-middle distance target.
  • Figure 2 shows a 4-staged high resolution and fast LIDAR system for a close-middle distance target.
  • a light source (100) illuminates the target (150) to be imaged.
  • the light source (100) may be single and/or multiple pulsed lasers, CW lasers, LEDs or fluorescent sources, etc. It may comprise optical and electronic systems such as focusers, diffusers, scanners, etc., and signal shaping optical/electronic systems such as modulators, etc.
  • the first detector (111), second detector (112), third detector (113) and fourth detector (114) are the detectors in the system. These detectors can be single or multiple photodiodes, phototransistors, thermal detectors, single photon detectors, avalanche detectors, photon counters and detectors with an array structure.
  • the first reflective/transmissive/absorptive optical element (121), second reflective/transmissive/absorptive optical element (122), third reflective/transmissive/absorptive optical element (123) and fourth reflective/transmissive/absorptive optical element (124) may have a reflective, transmissive or absorptive structure. These optical elements can be an iris, a pinhole, a selective pass filter or a gradient filter.
  • the optical system (130) in the structure may comprise one and/or more refractive and/or reflective optical elements and may be used for spatial scanning.
  • the control system (160) included in the system can be in Analog and/or Digital structure.
  • the control system (160) can control the illumination functions (illumination time, shape, modulation, scan system control, etc.), the optical system scan functions, the detector settings, etc.; process the detected signal; calculate the time of flight; calculate the depth information based on the positions of the detectors (111, 112, 113, 114) on the optical axis; and perform operations such as 3-dimensional image generation using this depth information together with spatial position information such as the scanning system position.
  • the computer may include components such as a power supply, drive circuit, reader circuit, specialized software, etc.
  • the operation of the system is as follows.
  • the light source (100) managed by the analog and/or digital control system (160) illuminates the target (150) to be imaged for a predetermined or undetermined period of time.
  • Other optical components may be present to steer or collect the beam inside the light source (100).
  • Light reflected from the target passes through the optical system (130).
  • With the help of the control system (160), the time of flight measurement is performed between the signal received on one or more of the first, second, third or fourth detectors (111, 112, 113, 114) and the signal of the light source (100). From the time of flight measurement, the distance of the reference surface on the target (150) to the LIDAR system is determined by the control system (160).
  • point A falls on the first detector (111), point B on the second detector (112), point C on the third detector (113) and point D on the fourth detector (114) over the target.
  • the light reflected back from point A on the target passes through the first reflective/transmissive/absorptive optical element (121) and reaches the first detector (111), while a significant part of the rest of the light rays reflected back from points B, C and D on the target is transferred to the next parts.
  • the rest of the light reflected back from point B on the target passes through the second reflective/transmissive/absorptive optical element (122) and reaches the second detector (112), while the light rays reflected back from points C and D on the target are mostly transferred to the next parts.
  • the light reflected back from point C on the target passes through the third reflective/transmissive/absorptive optical element (123) and reaches the third detector (113), while a significant part of the rest of the light rays reflected back from point D on the target is transferred to the next part.
  • a significant part of the light reflected back from point D on the target (150) reaches the fourth detector (114).
  • the images obtained on the first detector (111) generate the images of A depth
  • the images obtained on the second detector (112) generate the images of B depth
  • the images obtained on the third detector (113) generate the images of C depth
  • the images obtained on the fourth detector (114) generate the images of D depth.
  • the depth information of these images is calculated by the control system (160) along with the position information of the detectors.
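The combination step above, in which the images of A, B, C and D depths are merged with the detector position information, can be sketched in Python. This is a hypothetical illustration, not the patent's implementation: the toy detector images, the rule for picking the in-focus detector (a simple per-pixel maximum here), and the depth assigned to each detector position are all assumed values.

```python
import numpy as np

def fuse_depth_planes(images, plane_depths_m):
    """Fuse per-detector images into a single per-pixel depth map.

    images: list of N equal-shape 2D arrays, one per detector.
    plane_depths_m: depth (m) imaged in focus by each detector.

    For every pixel, the detector with the strongest signal is taken as
    the in-focus one, and that detector's depth is assigned, mirroring
    how the control system (160) combines the images of the A, B, C and
    D depths with the detector positions on the optical axis.
    """
    stack = np.stack(images)            # shape (N, H, W)
    best = np.argmax(stack, axis=0)     # in-focus detector index per pixel
    depths = np.asarray(plane_depths_m)
    return depths[best]                 # per-pixel depth map, shape (H, W)

# Four invented 2x2 images; e.g. pixel (0, 0) is brightest on detector 2.
imgs = [np.array([[0.1, 0.9], [0.2, 0.1]]),
        np.array([[0.2, 0.1], [0.8, 0.1]]),
        np.array([[0.9, 0.2], [0.1, 0.1]]),
        np.array([[0.1, 0.1], [0.1, 0.9]])]
depth_map = fuse_depth_planes(imgs, [10.00, 10.05, 10.10, 10.15])
```

The `argmax` focus criterion is only a stand-in; a real system could equally use a sharpness metric or calibrated optical data per detector.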
  • the number of the detectors and the reflective/transmissive/absorptive optical elements in front of them can be increased or reduced according to the desired resolution.
  • In the entire system shown in Figure 2, the light source (100), the detectors (111, 112, 113, 114), the reflective/transmissive/absorptive optical elements (121, 122, 123, 124) or the optical system (130) can be rotated in elevation and azimuth together or independently. In this manner, elevation and azimuth scanning can be performed.
  • the entire system described in Figure 2 can be produced integrated with each other.
  • detectors (111, 112, 113, 114), reflective/transmissive/absorptive optical elements (121, 122, 123, 124) or optical system (130) elements can also be produced integrated with each other.
  • the spatial scan can be performed optically and/or electronically by using the linear array detectors or focal plane array detectors as (111, 112, 113, 114) numbered detectors of the system.
  • Optical and/or electronic scanning position information controlled by the analog and/or digital control system (160) can be used to generate a 3-dimensional image in combination with depth information calculated by the analog and/or digital control system (160).
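Combining the scanning position information with the calculated depth, as described above, amounts to converting angle/range measurements into 3-dimensional points. A minimal sketch follows; the azimuth/elevation convention used (azimuth in the horizontal plane, elevation measured from it) is an assumption, since the text does not fix one.

```python
import math

def spherical_to_xyz(range_m, azimuth_rad, elevation_rad):
    """Convert a scan position (azimuth, elevation) plus a measured
    range into a Cartesian 3-D point for image generation."""
    x = range_m * math.cos(elevation_rad) * math.cos(azimuth_rad)
    y = range_m * math.cos(elevation_rad) * math.sin(azimuth_rad)
    z = range_m * math.sin(elevation_rad)
    return x, y, z

# A point 10 m away, straight ahead, at zero elevation.
pt = spherical_to_xyz(10.0, 0.0, 0.0)
```

Applying this per scan position to every depth value yields the 3-dimensional point cloud that the control system (160) assembles.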
  • the operation of the system is as follows.
  • the light source (100), controlled by the analog and/or digital control system (160), illuminates the target (150) to be imaged for a predetermined or undetermined period of time.
  • Other optical components may be present to steer or collect the beam inside the light source (100).
  • Light reflected from the target passes through the optical system (130) or systems.
  • the time of flight measurement is performed by the control system (160) using the backscattering time of the light at one or more of the first, second, third or fourth detectors (111, 112, 113, 114) and the pulse time of the light source (100). From the time of flight measurement, the distance of the reference surface of the target to the LIDAR system is determined by the control system (160).
  • point A falls on the first detector (111), point B on the second detector (112), point C on the third detector (113) and point D on the fourth detector (114) over the target.
  • the light reflected back from point A on the target passes completely through the first reflective/transmissive/absorptive optical element (121) and reaches the first detector (111), while a significant part of the rest of the light reflected back from points B, C and D on the target is transferred to the next parts.
  • the rest of the light reflected back from point B on the target passes completely through the second reflective/transmissive/absorptive optical element (122) and reaches the second detector (112), while the rest of the light reflected back from points C and D on the target is mostly transferred to the next parts.
  • the light reflected back from point C on the target passes completely through the third reflective/transmissive/absorptive optical element (123) and reaches the third detector (113), while a significant part of the rest of the light reflected back from point D on the target is transferred to the next parts.
  • a significant part of the light reflected back from point D on the target reaches the fourth detector (114).
  • the images obtained on the first detector (111) generate the images of A depth
  • the images obtained on the second detector (112) generate the images of B depth
  • the images obtained on the third detector (113) generate the images of C depth
  • the images obtained on the fourth detector (114) generate the images of D depth.
  • the depth information of these images is calculated by the control system (160) along with the position information of the detectors.
  • the number of the detectors and the reflective/transmissive/absorptive optical elements in front of them can be increased or reduced according to the desired resolution.
  • In the entire system shown in Figure 4, the light source (100), the detectors (111, 112, 113, 114), the reflective/transmissive/absorptive optical elements (121, 122, 123, 124) or the optical system (130) can be rotated in the desired direction together or independently. In this manner, horizontal or vertical scanning can be performed.
  • the entire system described in Figure 4 can be produced as an array.
  • detectors (111, 112, 113, 114), reflective/transmissive/absorptive optical elements (121, 122, 123, 124) or optical system (130) elements can also be produced as a layered array.
  • the spatial scan can be performed optically and/or electronically by using the array detectors or detectors with a focal plane array structure for the detectors (111, 112, 113, 114).
  • Optical and/or electronic scanning position information controlled by the analog and/or digital control system (160) can be used to generate a 3- dimensional image in combination with depth information calculated by the control system (160).
  • the operation of the system is as follows.
  • the light source (100), controlled by the control system (160), illuminates the target (150) to be imaged for a predetermined or undetermined period of time.
  • Other optical components may be present to steer or collect the beam inside the light source (100).
  • Light reflected from the target passes through the optical system (130) or elements.
  • the time of flight measurement is performed by the control system (160) by using the backscattering time of the light coming to the first detector (111) with the pulse time of said light source (100).
  • the first detector (111) can consist of single pixel, array or focal plane array type detectors. From the time of flight measurement, the distance of the reference surface of the target to the LIDAR system is determined by the control system (160).
  • the position of the detectors and the reflective/transmissive/absorptive optical elements on the optical axis are changed by the high resolution positioner (140) in subsequent illuminations and the image of the target is taken.
  • the illumination and imaging process are repeated for a predetermined or undetermined number of different high resolution positioner (140) positions.
  • Each high resolution positioner (140) position will correspond to a different depth on the target.
  • the optical system (130) uses its optical properties to calculate the depth information on the target (150) to which each high resolution positioner (140) position corresponds.
  • the images obtained for N different target depths corresponding to N high resolution positioner (140) positions are processed by the control system (160) along with the high resolution positioner (140) position information to obtain different depth information on the target.
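The mapping from each high resolution positioner (140) position to a target depth can be illustrated with the thin-lens equation. This is only a stand-in for the actual "optical properties" of the optical system (130), and the focal length and detector positions below are assumed example values, not taken from the text; a real system would be calibrated rather than modelled this simply.

```python
def object_distance_m(focal_length_m: float, image_distance_m: float) -> float:
    """Depth plane imaged in focus for a given detector position.

    Thin-lens relation 1/f = 1/d_o + 1/d_i, solved for the object
    distance d_o given the image (detector) distance d_i.
    """
    if image_distance_m <= focal_length_m:
        raise ValueError("image distance must exceed the focal length")
    return 1.0 / (1.0 / focal_length_m - 1.0 / image_distance_m)

# Sweep the detector over N positions behind an assumed 100 mm lens;
# each positioner position corresponds to a different in-focus depth.
f = 0.100
positions = [0.1001, 0.1002, 0.1005, 0.1010]   # metres (assumed values)
depths = [object_distance_m(f, d_i) for d_i in positions]
```

Note the inverse relationship the sweep exploits: moving the detector farther from the lens brings progressively closer target depths into focus, so N positioner positions sample N distinct depth planes.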
  • In the entire system shown in Figure 6, the light source (100) or optical system (130), the first detector (111) and the first reflective/transmissive/absorptive optical element (121) can be rotated in the desired direction together or independently. In this manner, horizontal or vertical scanning can be performed.
  • the entire system described in Figure 6 can be produced as an array.
  • the optical system (130), the first detector (111) and the first reflective/transmissive/absorptive optical element (121) can be produced as a layered array.
  • the spatial scan can be performed optically and/or electronically by using the array detector or a detector with a focal plane array structure for the first detector (111).
  • Optical and/or electronic scanning position information controlled by the control system (160) can be used to generate a 3- dimensional image in combination with depth information calculated by the control system (160).
  • the operation of the system is as follows.
  • the light source (100) controlled by the control system (160) illuminates the target (150) to be imaged for a predetermined or undetermined period of time.
  • Other optical components may be present to steer or collect the beam inside the light source (100).
  • Light reflected from the target passes through the element or elements in the optical system (130).
  • the time of flight measurement is performed by the control system (160) by using the backscattering time of the light coming to the first detector (111) with the pulse time of the light source (100). From the time of flight measurement, the distance of the reference surface of the target to the LIDAR system is determined by the Control System (160).
  • the position of the first detector (111) and the first reflective/transmissive/absorptive optical elements (121) on the optical axis are changed by the high resolution positioner (140) in subsequent illuminations and the image of the target is taken.
  • the illumination and imaging process are repeated for a predetermined or undetermined number of different high resolution positioner positions. Each high resolution positioner position will correspond to a different depth on the target.
  • the optical system (130) uses its optical properties to calculate the depth information on the target to which each high resolution positioner (140) position corresponds.
  • the images obtained for N different target depths corresponding to N High Resolution Positioner (140) positions are processed by the Control System (160) along with the High Resolution Positioner (140) position information to obtain different depth information on the target.
  • In the entire system shown in Figure 8, the light source (100) or the elements (130, 111, 121) can be rotated together or independently. In this manner, horizontal or vertical scanning can be performed.
  • the entire system described in Figure 4B can be produced as an array.
  • the optical system (130), the first detector (111) and the first reflective/transmissive/absorptive optical element (121) can be produced as a layered array.
  • the spatial scan can be performed optically and/or electronically by using the array detector or a detector with a focal plane array structure for the first detector (111).
  • Optical and/or electronic scanning position information controlled by the control system (160) can be used to generate a 3-dimensional image in combination with depth information calculated by the analog and/or digital control system (160).

Abstract

The invention relates to a LIDAR imaging system developed to obtain high resolution depth information, comprising at least one light source (100) which enables illumination of the target (150) to be imaged and which provides the illumination time information.

Description

HIGH RESOLUTION LIDAR IMAGING SYSTEM
Field of the Invention
The invention relates to an imaging system developed by obtaining high resolution depth information, comprising an analog and/or a digital control system.
State of the Art
There are many different types of LIDAR available today. These can be listed as flash LIDAR, solid state LIDARs, optical phased array LIDARs, MEMS-based LIDARs, etc. They all use the principle of illuminating the target with the signal generated by the light source and measuring the time difference (time of flight) between the transmitted and returned signals. A signal-to-noise ratio (SNR) threshold is set to limit the false alarm rate at the detector's time-detecting electronics. The time difference between the transmitted signal and the signal returned from the target defines the distance of the target to the LIDAR system. Basically, measurements made at different spatial positions can be used to calculate the distances of different spatial points on the target, and therefore their relative distances to each other, which are used for gathering depth information. In this method, the depth information depends on the sharpness of the signal rise time and also on the time resolution that can be measured by the detector's time-detecting electronics. The depth resolutions of current LIDAR systems are limited by the rise time of the illumination systems used, and high-cost illumination sources are required for measurements with high depth resolution. Another problem that can be encountered with current systems is that, when target points have different depths within the spatially illuminated area, a sequence of signals is returned from the target instead of a single signal, and the different depths within this spatial resolution cannot be calculated. Another method that can be used for high depth resolution is the indirect Time of Flight measurement technique. In these methods, a phase shift occurs between the transmitted and received signals, and the phase shift is proportional to the time of flight. This phase difference can be measured by comparing the transmitted and received signals using an interferometer system.
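As a hedged illustration of the direct time-of-flight principle described above, the sketch below thresholds a sampled detector signal at an SNR level (to suppress false alarms) and converts the measured time difference into a one-way range. The waveform, threshold and sampling period are invented for the example.

```python
C = 299_792_458.0  # speed of light in vacuum (m/s)

def tof_range_m(time_of_flight_s: float) -> float:
    """One-way target range from a round-trip time of flight."""
    return C * time_of_flight_s / 2.0

def first_crossing_s(samples, threshold, sample_period_s):
    """Time of the first sample exceeding the SNR threshold.

    Thresholding keeps noise from triggering false alarms in the
    time-detecting electronics; sub-sample timing is ignored here.
    """
    for i, value in enumerate(samples):
        if value >= threshold:
            return i * sample_period_s
    return None  # no echo detected

# Invented echo waveform sampled at 1 ns: noise, then a pulse at 66 ns.
echo = [0.05] * 66 + [0.9, 1.0, 0.8] + [0.05] * 10
t = first_crossing_s(echo, threshold=0.5, sample_period_s=1e-9)
range_m = tof_range_m(t)  # roughly 9.9 m for a 66 ns round trip
```

The division by two reflects that the measured delay covers the path to the target and back.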
These methods provide high depth resolution, down to nm levels at optical wavelengths, but the maximum measurement range or depth information is limited by the wavelength of the illumination source used. Since the depth information obtained is only available in multiples of the wavelength, it provides only a relative measurement. In addition, the use of these systems in the field is limited due to their sensitivity.
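The wavelength-limited ambiguity noted above can be made concrete with a small sketch. The 1550 nm source wavelength is an assumed example, not from the text: a phase shift maps to a relative depth only within half a wavelength, after which the measurement aliases.

```python
import math

def phase_to_depth(delta_phi_rad: float, wavelength_m: float) -> float:
    """Relative depth from an interferometric phase shift.

    A full 2*pi phase cycle corresponds to one extra wavelength of
    round-trip path, i.e. lambda/2 of one-way depth, so the result is
    only unambiguous within that interval (the relative measurement
    limitation described above).
    """
    return (delta_phi_rad % (2 * math.pi)) * wavelength_m / (4 * math.pi)

# With a 1550 nm source, a pi/2 phase shift maps to ~194 nm of depth,
# but any depth differing by a multiple of lambda/2 (775 nm) aliases.
d = phase_to_depth(math.pi / 2, 1550e-9)
```

This is why such systems achieve nm-level resolution yet cannot by themselves report an absolute range much larger than the wavelength.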
Another parameter in the analysis of current LIDAR systems is the methods used to create spatial resolution. In 3-dimensional image generation, depth information is measured for each point on the azimuth and elevation axes and combined with the related position information on the azimuth and elevation axes to generate the 3-dimensional image. When the pulse method is used to measure the depth information for each point, the depth resolution is limited by the signal rise time of the illumination system and the time resolution that can be measured by the detector's time-detecting electronics.
According to how position information is gathered in the azimuth and elevation axes, LIDARs can be classified into two main types: mechanically steered and electromagnetically beam-steered systems. As mentioned above, the time of flight calculations are made by stimulating the detectors with the source. Systems that steer the light source itself, and systems that steer the beam with the aid of moving mirrors, are both examples of mechanical systems.
In mechanical steering systems, lasers or detectors are steered with the aid of mirrors. However, these systems scan the spatial field very slowly. Solid-state systems, on the other hand, are a form of beam steering in which one or more micro-mirrors scan a specific solid angle. However, due to the high-tech nature of these systems and their high cost, their deployment in civilian applications has remained limited. LIDAR systems can also be examined in two categories: coherent and non-coherent. Coherent systems measure a phase change that depends on the wavelength of the light used, so their depth measurement is limited by that wavelength. Non-coherent systems, including the systems discussed below, basically measure the target distance, and the distances of different spatial points on the target, by measuring the time of flight of a modulated or pulsed signal transmitted to the target. The gathered distance information may cover different angular positions in the field of view (FOV) depending on the system design used, or the angular position information can be gathered with electronic, mechanical, electro-mechanical, etc. scan systems (including control electronics) for single-point measurement systems.
In all systems in the current state of the art, a single laser pulse or a series of several laser pulses is sent to the target point and the distance is determined from the time of flight between the transmitted and the received signal. The depth resolution of the target area depends on the duration of the transmitted pulse. Pulse durations (signal rise times) must be shortened if more precise measurements are desired. Short-pulse lasers are of limited use in LIDAR applications due to their high cost and large size.
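This limit can be illustrated with the standard pulsed time-of-flight relation (a general textbook formula, not something specific to any cited system): two surfaces can only be separated if their round-trip times differ by at least roughly the pulse duration.

```python
C = 299_792_458.0  # speed of light, m/s

def pulse_limited_depth_resolution(pulse_duration_s: float) -> float:
    """Depth resolution of a pulsed time-of-flight system: the round trip
    doubles the path, so delta_d = c * tau / 2 for pulse duration tau."""
    return C * pulse_duration_s / 2.0
```

For instance, a 1 ns pulse limits the depth resolution to about 15 cm, and even a costly 100 ps source still leaves it near 1.5 cm, which is why shortening pulses alone is an expensive route to high depth resolution.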
The patent application numbered WO 2008008970, which is in the state of the art, describes a lidar-based 3D point cloud measurement system. The system includes elements such as a target, a light source, a plurality of optical elements and detectors, and a photon detector. In one embodiment of the invention, there is a rotary component, and a rotary power connection configured to provide power from an external source to the rotary motor, photon transmitters and photon detectors, and also to provide input and output signals to the unit. In said document, a lidar system with 64 elements (32 each, including 2 mounts) was constructed. The system can collect 1 million distance points per second based on time of flight. The standard deviations of the time of flight measurements are equal to or less than 5 cm. In the system, an inertial navigation system (INS) sensor is used to correct these deviations. The light source produces its own light (laser). The system is also capable of receiving and decoding multiple returns from a single laser emission by digitizing and analyzing the waveform produced by the detector when the signal generated by the emitter returns. However, the distance measurement in said invention is performed by continuous time of flight (TOF) measurement for all points on the target and for each different image frame. Thus, it cannot reach high resolution values, since the achievable resolution is limited by the pulse duration. The complex structure and cost of the system are further disadvantages.
The Korean patent numbered KR 20170132977, which is in the known state of the art, describes a lidar imaging system developed for vehicles. The vehicle lidar device is capable of detecting an object based on time of flight or phase difference measurements of the transmitted and/or received signal. Said lidar device comprises a transmitter/light signal, a receiver, an interface, a processor, a photodetector, a photodiode and a power supply. The processor is able to generate an image of the object based on the transmitted and received light. Specifically, the processor can generate an image of the object by comparing the transmitted light and the light corresponding to each pixel, or by calculating the time of flight or phase change. The processor is also able to correct the image depending on the degree to which it is tilted horizontally or vertically. In said invention, a depth map is generated based on transmitted light and reflected light. By comparing the transmitted light and the reflected light corresponding to each pixel, the processor calculates the time of flight and phase change per pixel to generate the depth map. These data can be accumulated in memory. In this way, the images at those points are determined and the imaging system is realized. However, the distance measurement in this patent application is also performed by continuous time of flight (TOF) measurement for all points on the target and for each different image frame. Thus, it cannot reach high resolution values, since the achievable resolution is limited by the pulse duration.
The patent document numbered WO 2019064062, which is in the state of the art, describes a system and method that uses lidar technology to image objects in the environment. Said system includes elements such as a light source, a processor developed to determine the distance between the vehicle and the object, a sensor developed to detect the reflections of the light source, a moving MEMS mirror, and an actuator. The lidar system can be used to generate depth maps, and depth information for each pixel of the image may be recorded or may be temporal. The processor controls the light source for different durations during the scanning of the target and determines whether additional light pulses are required based on the image quality. In this patent application, the time of flight (TOF) of the closest point on the target can be calculated from the time of flight measured for different pixels of the background data and the time of flight of the second signal, obtained directly and/or by summing the values of the successive peaks in the multiple reflection signals returned from the target in a group of neighboring pixels. With this method, the time of flight measurement is performed indirectly, on the background of a single illumination signal, by using the signal returned by multiple reflections. Consequently, it cannot reach high resolution values, since the achievable resolution is limited by the pulse duration.
Brief Description and Advantages of the Invention
The present invention relates to a high resolution Lidar imaging system that meets the above- mentioned requirements, eliminates all of the disadvantages and provides some additional advantages.
The main object of the invention is to provide a Lidar imaging system that provides high-resolution depth information without measuring multiple time of flight data.
An object of the invention is to perform the first measurement of the distance with a detector time-detection electronic circuit, as in current systems.
Another object of the invention is to use the change of focus of the target image on the detector to measure the relative distances of the points on the target to each other (depth information on the target).
Another object of the invention is to provide a structure that does not need an individual illumination signal and time of flight calculation for each spatial position, and does not need to wait at least one time of flight for each spatial position on the target; therefore the depth resolution of the LIDAR system is significantly increased, as is the frame rate.

Description of the Figures
Figure 1: 4-staged high resolution and fast LIDAR system flow chart for a close- middle distance target
Figure 2: 4-staged high resolution and fast LIDAR system for a close-middle distance target
Figure 3: 4-staged high resolution and fast LIDAR system flow chart for a long-distance target
Figure 4: 4-staged high resolution and fast LIDAR system for a long-distance target
Figure 5: High resolution LIDAR system flow chart for a close-middle distance target
Figure 6: High resolution LIDAR system for a close-middle distance target
Figure 7: High resolution LIDAR system flow chart for a long-distance target
Figure 8: High resolution LIDAR system for a long-distance target
Element Numbers in the Figures
100 - Light source
111 - First detector
112 - Second detector
113 - Third detector
114 - Fourth detector
11N - Nth detector
121 - First reflective/transmissive/absorptive optical element
122 - Second reflective/transmissive/absorptive optical element
123 - Third reflective/transmissive/absorptive optical element
124 - Fourth reflective/transmissive/absorptive optical element
12N - Nth reflective/transmissive/absorptive optical element
130 - Optical system
140 - High resolution positioner
150 - Target
160 - Control system
Detailed Description of the Invention
The present invention relates to a high resolution Lidar imaging system that meets the above- mentioned requirements, eliminates all of the disadvantages and provides some additional advantages.
Figure 1 shows the flow chart of the 4-staged high resolution and fast LIDAR system for a close-middle distance target. Figure 2 shows the 4-staged high resolution and fast LIDAR system for a close-middle distance target.
A light source (100) illuminates the target (150) to be imaged. The light source (100) may be single and/or multiple pulsed lasers, CW lasers, LEDs or fluorescents, etc. It may comprise optical and electronic systems such as focusers, diffusers, scanners, etc., and signal shaping optical/electronic systems such as modulators. The first detector (111), second detector (112), third detector (113) and fourth detector (114) are the detectors in the system. These detectors can be single or multiple photodiodes, phototransistors, thermal detectors, single photon detectors, avalanche detectors, photon counters or detectors with an array structure. The first reflective/transmissive/absorptive optical element (121), second reflective/transmissive/absorptive optical element (122), third reflective/transmissive/absorptive optical element (123) and fourth reflective/transmissive/absorptive optical element (124) may have a reflective, transmissive or absorptive structure. These optical elements can be an iris, a pinhole, a selective pass filter or a gradient filter. The optical system (130) may comprise one or more refractive and/or reflective optical elements and may be used for spatial scanning. The control system (160) can have an analog and/or digital structure. The control system (160) can control the illumination functions (illumination time, shape, modulation, scan system control, etc.), the optical system scan functions, the detector settings, etc.; process the detected signal; calculate the time of flight; calculate the depth information based on the positions of the detectors (111, 112, 113, 114) on the optical axis; and perform 3-dimensional image generation using this depth information together with spatial position information such as the scanning system position.
The computer may include components such as a power supply, drive circuit, readout circuit, specialized software, etc.
According to Figures 1 and 2, the operation of the system is as follows. The light source (100), managed by the analog and/or digital control system (160), illuminates the target (150) to be imaged for a predetermined or undetermined period of time. Other optical components may be present inside the light source (100) to steer or collect the beam. Light reflected from the target passes through the optical system (130). With the help of the control system (160), the time of flight measurement is performed between the signal received on one or more of the first, second, third or fourth detectors (111, 112, 113, 114) and the signal of the light source (100). From the time of flight measurement, the distance of the reference surface on the target (150) to the LIDAR system is determined by the control system (160). In the subsequent illuminations, point A on the target falls on the first detector (111), point B on the second detector (112), point C on the third detector (113) and point D on the fourth detector (114). The light reflected back from point A on the target passes through the first reflective/transmissive/absorptive optical element (121) and reaches the first detector (111), while a significant part of the light rays reflected back from points B, C and D on the target is transferred to the next stages. The rest of the light reflected back from point B on the target passes through the second reflective/transmissive/absorptive optical element (122) and reaches the second detector (112), while the light rays reflected back from points C and D on the target are mostly transferred to the next stages. The light reflected back from point C on the target passes through the third reflective/transmissive/absorptive optical element (123) and reaches the third detector (113), while a significant part of the light reflected back from point D on the target is transferred to the next stage.
A significant part of the light reflected back from point D on the target (150) reaches the fourth detector (114). The images obtained on the first detector (111) generate the images at depth A, the images obtained on the second detector (112) generate the images at depth B, the images obtained on the third detector (113) generate the images at depth C and the images obtained on the fourth detector (114) generate the images at depth D. The depth information of these images is calculated by the control system (160) together with the position information of the detectors. The number of detectors and of the reflective/transmissive/absorptive optical elements in front of them can be increased or reduced according to the desired resolution. The light source (100), the detectors (111, 112, 113, 114), the reflective/transmissive/absorptive optical elements (121, 122, 123, 124) or the optical system (130) of the entire system shown in Figure 2 can be rotated in elevation and azimuth together or independently. In this manner, elevation and azimuth scanning can be performed. The entire system described in Figure 2 can be produced as an integrated unit, or the detectors (111, 112, 113, 114), reflective/transmissive/absorptive optical elements (121, 122, 123, 124) or optical system (130) elements can be produced integrated with each other. Alternatively, the spatial scan can be performed optically and/or electronically by using linear array detectors or focal plane array detectors as the detectors (111, 112, 113, 114) of the system. The optical and/or electronic scanning position information, controlled by the analog and/or digital control system (160), can be used to generate a 3-dimensional image in combination with the depth information calculated by the analog and/or digital control system (160).
According to Figures 3 and 4, the operation of the system is as follows. The light source (100), controlled by the analog and/or digital control system (160), illuminates the target (150) to be imaged for a predetermined or undetermined period of time. Other optical components may be present inside the light source (100) to steer or collect the beam. Light reflected from the target passes through the optical system (130) or systems. The time of flight measurement is performed using the backscattering time of the light at one or more of the first, second, third or fourth detectors (111, 112, 113, 114), the pulse time of the light source (100) and the control system (160). From the time of flight measurement, the distance of the reference surface of the target to the LIDAR system is determined by the control system (160). In the subsequent illuminations, point A on the target falls on the first detector (111), point B on the second detector (112), point C on the third detector (113) and point D on the fourth detector (114). The light reflected back from point A on the target passes completely through the first reflective/transmissive/absorptive optical element (121) and reaches the first detector (111), while a significant part of the light reflected back from points B, C and D on the target is transferred to the next stages. The rest of the light reflected back from point B on the target passes completely through the second reflective/transmissive/absorptive optical element (122) and reaches the second detector (112), while the light reflected back from points C and D on the target is mostly transferred to the next stages. The light reflected back from point C on the target passes completely through the third reflective/transmissive/absorptive optical element (123) and reaches the third detector (113), while a significant part of the light reflected back from point D on the target is transferred to the next stage.
A significant part of the light reflected back from point D on the target reaches the fourth detector (114). The images obtained on the first detector (111) generate the images at depth A, the images obtained on the second detector (112) generate the images at depth B, the images obtained on the third detector (113) generate the images at depth C and the images obtained on the fourth detector (114) generate the images at depth D. The depth information of these images is calculated by the control system (160) together with the position information of the detectors. The number of detectors and of the reflective/transmissive/absorptive optical elements in front of them can be increased or reduced according to the desired resolution. The light source (100), the detectors (111, 112, 113, 114), the reflective/transmissive/absorptive optical elements (121, 122, 123, 124) or the optical system (130) of the entire system shown in Figure 4 can be rotated in the desired direction together or independently. In this manner, horizontal or vertical scanning can be performed. The entire system described in Figure 4 can be produced as an array, or the detectors (111, 112, 113, 114), reflective/transmissive/absorptive optical elements (121, 122, 123, 124) or optical system (130) elements can be produced as a layered array. Alternatively, the spatial scan can be performed optically and/or electronically by using array detectors or detectors with a focal plane array structure for the detectors (111, 112, 113, 114). The optical and/or electronic scanning position information, controlled by the analog and/or digital control system (160), can be used to generate a 3-dimensional image in combination with the depth information calculated by the control system (160).
According to Figures 5 and 6, the operation of the system is as follows. The light source (100), controlled by the control system (160), illuminates the target (150) to be imaged for a predetermined or undetermined period of time. Other optical components may be present inside the light source (100) to steer or collect the beam. Light reflected from the target passes through the optical system (130) or its elements. The time of flight measurement is performed by the control system (160) using the backscattering time of the light arriving at the first detector (111) and the pulse time of said light source (100). The first detector (111) can consist of single pixel, array or focal plane array type detectors. From the time of flight measurement, the distance of the reference surface of the target to the LIDAR system is determined by the control system (160). In the subsequent illuminations, the position of the detectors and of the reflective/transmissive/absorptive optical elements on the optical axis is changed by the high resolution positioner (140) and the image of the target is taken. The illumination and imaging process is repeated for a predetermined or undetermined number of different high resolution positioner (140) positions. Each high resolution positioner (140) position corresponds to a different depth on the target. The optical properties of the optical system (130) are used to calculate the depth on the target (150) to which each high resolution positioner (140) position corresponds.
The images obtained for N different target depths, corresponding to N high resolution positioner (140) positions, are processed by the control system (160) together with the high resolution positioner (140) position information to obtain the different depth information on the target. The light source (100), the optical system (130), the first detector (111) and the first reflective/transmissive/absorptive optical element (121) of the entire system shown in Figure 6 can be rotated in the desired direction together or independently. In this manner, horizontal or vertical scanning can be performed. The entire system described in Figure 6 can be produced as an array, or the optical system (130), the first detector (111) and the first reflective/transmissive/absorptive optical element (121) can be produced as a layered array. Alternatively, the spatial scan can be performed optically and/or electronically by using an array detector or a detector with a focal plane array structure for the first detector (111). The optical and/or electronic scanning position information, controlled by the control system (160), can be used to generate a 3-dimensional image in combination with the depth information calculated by the control system (160).
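The document does not specify the optical model that maps positioner positions to target depths. Assuming a simple thin lens (an assumption made here only for illustration; real embodiments may use more complex optics), each detector position d_i along the optical axis is in sharpest focus for exactly one object depth d_o, which is how a set of N positioner positions can be converted to N depths:

```python
def object_depth_from_detector_position(focal_length_m: float,
                                        image_distance_m: float) -> float:
    """Thin-lens relation 1/f = 1/d_o + 1/d_i, solved for the object
    distance d_o. Each high resolution positioner setting (a detector
    position d_i behind the lens) corresponds to one in-focus target depth."""
    return (focal_length_m * image_distance_m) / (image_distance_m - focal_length_m)
```

For example, with a hypothetical f = 100 mm lens, moving the detector from 110 mm to 105 mm behind the lens shifts the in-focus depth from 1.1 m to 2.1 m, so small positioner steps sweep a useful range of target depths.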
According to Figures 7 and 8, the operation of the system is as follows. The light source (100), controlled by the control system (160), illuminates the target (150) to be imaged for a predetermined or undetermined period of time. Other optical components may be present inside the light source (100) to steer or collect the beam. Light reflected from the target passes through the element or elements of the optical system (130). The time of flight measurement is performed by the control system (160) using the backscattering time of the light arriving at the first detector (111) and the pulse time of the light source (100). From the time of flight measurement, the distance of the reference surface of the target to the LIDAR system is determined by the control system (160). In the subsequent illuminations, the position of the first detector (111) and of the first reflective/transmissive/absorptive optical element (121) on the optical axis is changed by the high resolution positioner (140) and the image of the target is taken. The illumination and imaging process is repeated for a predetermined or undetermined number of different high resolution positioner (140) positions. Each high resolution positioner (140) position corresponds to a different depth on the target. The optical properties of the optical system (130) are used to calculate the depth on the target to which each high resolution positioner (140) position corresponds. The images obtained for N different target depths, corresponding to N high resolution positioner (140) positions, are processed by the control system (160) together with the high resolution positioner (140) position information to obtain the different depth information on the target.
The light source (100) or the elements (130, 111, 121) of the entire system shown in Figure 8 can be rotated together or independently. In this manner, horizontal or vertical scanning can be performed. The entire system described in Figure 8 can be produced as an array, or the optical system (130), the first detector (111) and the first reflective/transmissive/absorptive optical element (121) can be produced as a layered array. Alternatively, the spatial scan can be performed optically and/or electronically by using an array detector or a detector with a focal plane array structure for the first detector (111). The optical and/or electronic scanning position information, controlled by the control system (160), can be used to generate a 3-dimensional image in combination with the depth information calculated by the analog and/or digital control system (160).

Claims

1. A LIDAR imaging system developed to obtain high resolution depth information, comprising at least one light source (100) that enables illumination of the target (150) to be imaged, characterized in that it comprises:
At least one optical system (130) that enables steering or collecting the beam reflected from said target (150),
At least one first reflective/transmissive/absorptive optical element (121), second reflective/transmissive/absorptive optical element (122), third reflective/transmissive/absorptive optical element (123) and fourth reflective/transmissive/absorptive optical element (124) that enable the beams reflected from said target (150) to reach the detectors after passing through the optical system (130),
At least one detector that detects the illumination time of said light source (100),
At least one detector that detects the receiving time of the light reflected back from the target (150),
At least one first detector (111), second detector (112), third detector (113), fourth detector (114) that detect the light reflected back from the target (150),
At least one control system (160) that performs the time of flight measurement by using the return time of the light with the information coming from said detectors and that determines the distance from the reference surface of the target (150) to the LIDAR system.

2. A high resolution LIDAR imaging system according to claim 1, characterized in that the light source (100) has a multiple pulsed laser and/or CW laser and/or LED and/or fluorescent and/or another light source.

3. A high resolution LIDAR imaging system according to claim 1, characterized in that the light source (100) has a structure that provides the pulse time information.

4. A high resolution LIDAR imaging system according to claim 1, characterized in that the light source (100) has a detector that detects the pulse time information.

5. A high resolution LIDAR imaging system according to claim 1, characterized in that it has a focuser and/or diffuser and/or scanner and/or signal shaper.

6. A high resolution LIDAR imaging system according to claim 1, characterized in that the detectors (111, 112, 113, 114 ... 11N) comprise at least one photodiode and/or phototransistor and/or thermal detector and/or single photon detector and/or avalanche detector and/or photon counter and/or detector array and/or focal plane array.

7. A high resolution LIDAR imaging system according to claim 1, characterized in that the reflective/transmissive/absorptive optical elements (121, 122, 123, 124 ... 12N) comprise at least one iris.

8. A high resolution LIDAR imaging system according to claim 1, characterized in that the reflective/transmissive/absorptive optical elements (121, 122, 123, 124 ... 12N) comprise at least one pinhole and/or selective pass filter and/or selective gradient filter.

9. A high resolution LIDAR imaging system according to claim 1, characterized in that the optical system (130) comprises at least one refractive optic element for the spatial scan.
10. A high resolution LIDAR imaging system according to claim 1, characterized in that the optical system (130) comprises at least one reflective optic element for the spatial scan.

11. A high resolution LIDAR imaging system according to claim 1, characterized in that the control system (160) has an analog structure.

12. A high resolution LIDAR imaging system according to claim 1, characterized in that the control system (160) has a digital structure.

13. A high resolution LIDAR imaging system according to claim 1, characterized in that it comprises at least one computer.

14. A high resolution LIDAR imaging system according to claim 1, characterized in that it comprises at least one power source.

15. A high resolution LIDAR imaging system according to claim 1, characterized in that it comprises at least one drive circuit.

16. A high resolution LIDAR imaging system according to claim 1, characterized in that it comprises at least one readout circuit.

17. A LIDAR imaging system developed to obtain high resolution depth information, comprising at least one light source (100) that enables illumination of the target (150) to be imaged, characterized in that it comprises:
At least one optical system (130) that enables the beam reflected from said target (150) to be steered or collected,
At least one first reflective/transmissive/absorptive optical element (121) that enables the beams reflected over the said target (150) to reach the detectors by passing through the optical system (130),
At least one control system (160) that performs the time of flight measurement by using the return time of the light coming to the first detector (111) with the illumination time of said light source (100) and that determines the distance from the reference surface of the target to LIDAR system from the time of flight measurement,
At least one high resolution positioner (140) that allows the measurement of different depths over the target (150) by changing the position of detectors and
reflective/transmissive/absorptive optical elements on the optical axis in subsequent illuminations.

18. A high resolution LIDAR imaging system according to claim 17, characterized in that the light source (100) has at least one multiple pulsed laser and/or at least one CW laser and/or LED and/or fluorescent and/or another light source.

19. A high resolution LIDAR imaging system according to claim 17, characterized in that it has a structure that provides the pulse time information of the light source (100).

20. A high resolution LIDAR imaging system according to claim 17, characterized in that it has a detector that detects the illumination time information of the light source (100).

21. A high resolution LIDAR imaging system according to any of the claims 17-20, characterized in that the light source (100) comprises a focuser and/or diffuser and/or scanner and/or signal shaper and/or photodiode and/or phototransistor and/or thermal detector and/or photon detector and/or avalanche detector and/or photon counter.

22. A high resolution LIDAR imaging system according to claim 17, characterized in that the detectors (111) comprise at least one focal plane array.

23. A high resolution LIDAR imaging system according to claim 17, characterized in that the reflective/transmissive/absorptive optical elements (121) comprise at least one iris and/or selective pass filter and/or pinhole filter and/or selective gradient filter.

24. A high resolution LIDAR imaging system according to claim 17, characterized in that the optical system (130) comprises at least one reflective optic element for the spatial scan.

25. A high resolution LIDAR imaging system according to claim 17, characterized in that the optical system (130) comprises at least one refractive optic element for the spatial scan.
26. A high resolution LIDAR imaging system according to claim 17, characterized in that the control system (160) has an analog structure.
27. A high resolution LIDAR imaging system according to claim 17, characterized in that the control system (160) has a digital structure.
28. A high resolution LIDAR imaging system according to claim 17, characterized in that it comprises at least one computer.
29. A high resolution LIDAR imaging system according to claim 17, characterized in that it comprises at least one power source.
30. A high resolution LIDAR imaging system according to claim 17, characterized in that it comprises at least one drive circuit.
31. A high resolution LIDAR imaging system according to claim 17, characterized in that it comprises at least one readout circuit.
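The time-of-flight principle recited in claim 17 — the control system (160) measures the interval between the illumination time of the light source (100) and the return time at the first detector (111), then converts it to range — can be illustrated with a minimal sketch. This is illustrative only and not part of the claimed system; the function and variable names are assumptions made for the example.

```python
# Minimal sketch of the time-of-flight distance calculation performed by
# the control system (160): light leaves the source (100) at t_emit_s,
# returns to the first detector (111) at t_return_s, and the range to the
# target's reference surface follows from half the round-trip time.
C = 299_792_458.0  # speed of light in vacuum, m/s

def tof_distance(t_emit_s: float, t_return_s: float) -> float:
    """Range (m) to the target's reference surface from a round trip."""
    round_trip = t_return_s - t_emit_s
    if round_trip < 0:
        raise ValueError("return time precedes emission time")
    # Divide by two because the light traverses the path out and back.
    return C * round_trip / 2.0

# Example: a 100 ns round trip corresponds to roughly 15 m of range.
print(tof_distance(0.0, 100e-9))  # ≈ 14.99 m
```

The factor of two reflects the out-and-back path; the depth resolution of such a measurement is set by how precisely the round-trip interval can be timed, which is why the claims separately recite the high resolution positioner (140) for sampling different depths over the target (150).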
PCT/TR2022/051024 2021-09-27 2022-09-21 High resolution lidar imaging system WO2023048687A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
TR2021015058 2021-09-27
TR2021/015058 TR2021015058A2 (en) 2021-09-27 HIGH RESOLUTION LIDAR IMAGING SYSTEM

Publications (1)

Publication Number Publication Date
WO2023048687A1 true WO2023048687A1 (en) 2023-03-30

Family

ID=85721017

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/TR2022/051024 WO2023048687A1 (en) 2021-09-27 2022-09-21 High resolution lidar imaging system

Country Status (1)

Country Link
WO (1) WO2023048687A1 (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2017079483A1 (en) * 2015-11-05 2017-05-11 Luminar Technologies, Inc. Lidar system with improved scanning speed for high-resolution depth mapping
WO2017095817A1 (en) * 2015-11-30 2017-06-08 Luminar Technologies, Inc. Lidar system with distributed laser and multiple sensor heads and pulsed laser for lidar system
CN111398934A (en) * 2018-12-13 2020-07-10 Baidu (USA) LLC LIDAR 3D design using a polygon prism

Similar Documents

Publication Publication Date Title
CN108885263B (en) LIDAR-based 3D imaging with variable pulse repetition
CN108291968B (en) Three-dimensional LIDAR system with target field of view
JP4405154B2 (en) Imaging system and method for acquiring an image of an object
US9335220B2 (en) Calibration of time-of-flight measurement using stray reflections
CN105143820B (en) Depth scan is carried out using multiple transmitters
RU2538418C2 (en) Optical rangefinder
EP2686701B1 (en) System, method and computer program for receiving a light beam
CN109557551A (en) Laser scanner
CN110986756B (en) Measuring device for three-dimensional geometrical acquisition of the surroundings
CN112020660A (en) LIDAR-based distance measurement with layered power control
KR20200097683A (en) LIDAR signal acquisition
KR20220146711A (en) Noise adaptive solid-state lidar system
US6741082B2 (en) Distance information obtaining apparatus and distance information obtaining method
EP2824418A1 (en) Surround sensing system
CN109425324A (en) The total station or theodolite for setting range of receiving with scanning function and receiver
CN101449181A (en) Distance measuring method and distance measuring element for detecting the spatial dimension of a target
US11531104B2 (en) Full waveform multi-pulse optical rangefinder instrument
KR102324449B1 (en) Multi-detector with interleaved photodetector arrays and analog readout circuits for lidar receiver
US20230291885A1 (en) Stereoscopic image capturing systems
GB2374743A (en) Surface profile measurement
US11156716B1 (en) Hybrid LADAR with co-planar scanning and imaging field-of-view
CN210128694U (en) Depth imaging device
CN210835244U (en) 3D imaging device and electronic equipment based on synchronous ToF discrete point cloud
WO2023048687A1 (en) High resolution lidar imaging system
WO2020146493A1 (en) Lidar systems and methods with beam steering and wide angle signal detection

Legal Events

Date Code Title Description
121 EP: the EPO has been informed by WIPO that EP was designated in this application

Ref document number: 22873320

Country of ref document: EP

Kind code of ref document: A1