WO2022017366A1 - Depth imaging method and depth imaging system - Google Patents


Info

Publication number
WO2022017366A1
Authority
WO
WIPO (PCT)
Prior art keywords
laser
scanning angle
photoelectric detection
target object
laser spot
Prior art date
Application number
PCT/CN2021/107301
Other languages
English (en)
Chinese (zh)
Inventor
余恺
俞锋
贝努瓦菲利普
屈丰广
张朝龙
曾佳
Original Assignee
华为技术有限公司 (Huawei Technologies Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 华为技术有限公司 (Huawei Technologies Co., Ltd.)
Publication of WO2022017366A1

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88 Lidar systems specially adapted for specific applications
    • G01S17/89 Lidar systems specially adapted for specific applications for mapping or imaging
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/481 Constructional features, e.g. arrangements of optical elements
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/481 Constructional features, e.g. arrangements of optical elements
    • G01S7/4817 Constructional features, e.g. arrangements of optical elements relating to scanning
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/483 Details of pulse systems
    • G01S7/484 Transmitters
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/483 Details of pulse systems
    • G01S7/486 Receivers

Definitions

  • the present application relates to the field of imaging technologies, and in particular, to a depth imaging method and a depth imaging system.
  • Radar is an important technology for realizing computer vision. Radar includes but is not limited to lidar, millimeter-wave radar, and visible light radar.
  • a 3D camera is an application example of a radar system. The example system consists of a laser pulse transmitter, a laser pulse receiver, a Time to Digital Converter (TDC), and a control system.
  • In the radar working scenario, the laser pulse transmitter generates light pulses and transmits them into the environment. After the light pulses are reflected by target objects in the environment, they are received by the receiver. The receiving end converts the received photons into electrical signals and provides them to the TDC. The TDC quantizes the delay of the returning photons relative to the pulse emission time and places each event into a time bin of a given width. When enough pulses have been fired, the event counts in the time bins form a histogram. The highest peak in the histogram corresponds to the time of flight (TOF) of the pulse, from which the distance of the target object can be calculated.
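The histogram accumulation described above can be sketched in a few lines of Python. The bin width, arrival times, and function names here are illustrative only, not taken from the application:

```python
# Illustrative sketch: photon delays relative to each pulse emission are
# quantized into fixed-width time bins; the fullest bin gives the time of
# flight, from which distance follows via d = c * TOF / 2.

C = 299_792_458.0  # speed of light, m/s

def accumulate_histogram(delays_s, bin_width_s, n_bins):
    """Quantize photon delays (seconds) into a histogram of time bins."""
    hist = [0] * n_bins
    for t in delays_s:
        b = int(t / bin_width_s)
        if 0 <= b < n_bins:
            hist[b] += 1
    return hist

def tof_and_distance(hist, bin_width_s):
    """Peak bin -> time of flight -> target distance (d = c * tof / 2)."""
    peak = max(range(len(hist)), key=lambda b: hist[b])
    tof = (peak + 0.5) * bin_width_s  # use the bin centre
    return tof, C * tof / 2.0

# Hypothetical data: most returns arrive around 66.7 ns (roughly 10 m),
# plus a few sparse ambient-light events.
delays = [66.7e-9] * 50 + [20e-9, 80e-9, 35e-9]
hist = accumulate_histogram(delays, bin_width_s=1e-9, n_bins=100)
tof, dist = tof_and_distance(hist, 1e-9)
```

With the assumed 1 ns bins, the 50 signal events pile into one bin while the noise events stay isolated, so the peak search recovers the signal delay.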
  • Photoelectric sensors are usually used as receivers to detect light pulses. Depth imaging is widely used in fields such as computer vision. With the continuous enrichment of the application requirements of depth imaging, the demand for resolution of depth imaging is also getting higher and higher. However, due to the limitation of semiconductor technology and volume, the resolution of the photodetection unit on the photoelectric sensor is difficult to meet the actual demand.
  • the contradiction between the high-resolution requirement of depth imaging and the low-resolution performance of the photoelectric sensor can be solved through the lattice super-resolution technology.
  • a virtual super-resolution pixel with a smaller size can be constructed, so that the resolution of the photoelectric sensor is effectively improved to meet the high-resolution requirements of depth imaging.
  • the transceiver module of the 3D camera used in depth imaging usually has a certain baseline distance, which leads to a mismatch between the transmitting and receiving positions; the mismatch offset is also called parallax. Parallax is measured in numbers of photodetection units, and it changes with the target distance.
  • because of this parallax, the photoelectric sensor cannot determine in advance the position of the photoelectric detection unit on which the laser spot falls. Therefore, it is often necessary to turn on all the surrounding photodetection units in order to find the unit that has received the pulse.
  • however, the power consumption generated by fully opening the photodetection units in a strong-ambient-light scene is unbearable for the chip.
  • moreover, multiple photodetection units share one TDC. When the photodetection units are fully opened, only one unit receives the target signal while the rest detect noise, which severely reduces the signal-to-noise ratio of the signal received by the TDC and makes it difficult to identify the signal light in the histogram.
  • the position of the light spot can be detected by turning on the photoelectric detection unit in a time-sharing manner.
  • each photoelectric detection unit is exposed in turn for a certain period of time, but this detection method requires many exposures and a long total exposure time, resulting in very high system power consumption, a lower frame rate, and a slower depth imaging speed.
  • alternatively, a plurality of photodetection units can be fully turned on at the same time, but this reduces the signal-to-noise ratio of the histogram and thus the detection probability of the signal light.
  • the present application provides a depth imaging method and a depth imaging system, so as to reduce system power consumption while ensuring a signal-to-noise ratio, achieve high frame rate lattice super-resolution, and improve the speed of depth imaging.
  • the present application provides a depth imaging method, including:
  • the photoelectric detection units on the photoelectric sensor are polled and controlled to turn on and off;
  • the photoelectric sensor includes a plurality of photoelectric detection units, and each photoelectric detection unit is divided into a plurality of super-divided pixels according to the size of the laser spot;
  • the depth information of the laser spot at each scanning angle is spliced to obtain a super-resolution depth image of the target object.
  • polling and controlling the photodetection unit on the photoelectric sensor to turn on and off in each pulse emission period specifically includes:
  • Each of the photodetection units to be polled is polled and controlled to be turned on and off according to the time window.
  • the laser pulses are emitted with a pulse emission period at each scanning angle, which specifically includes:
  • the laser pulses are emitted to the target object in a dot matrix projection manner for a plurality of the pulse emission periods at each scanning angle, so as to simultaneously form a plurality of laser light spots on the photoelectric sensor;
  • the determining of the photoelectric detection units to be polled in each pulse emission period specifically includes:
  • the photodetection unit to be polled corresponding to each laser spot in each pulse emission period is determined.
  • obtaining the minimum interval between two adjacent laser spots in the field of view of the laser specifically includes:
  • according to the minimum detection distance of the photoelectric sensor and the lateral field angle and lateral resolution of the photoelectric detection unit, obtaining the minimum spacing between two adjacent laser spots in the field of view of the laser.
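As a rough illustration of this claim, one plausible reading is that adjacent spots must be farther apart than the largest possible parallax, which occurs at the minimum detection distance. Under that assumption (and with entirely hypothetical numbers), the minimum interval could be computed as:

```python
import math

def min_spot_interval_units(d_baseline_m, d_min_m, fov_h_rad, n_h):
    """Assumed rule: adjacent laser spots must be separated by more than the
    largest possible parallax, which occurs at the minimum detection distance.
    Parallax in photodetection units = baseline / footprint of one unit."""
    d_spad = d_min_m * fov_h_rad / n_h       # small-angle footprint of one unit
    return math.ceil(d_baseline_m / d_spad)  # round up to whole units

# Hypothetical example: 10 mm baseline, 0.1 m minimum detection distance,
# 60-degree lateral field of view across 100 photodetection units.
interval = min_spot_interval_units(0.01, 0.1, math.radians(60.0), 100)
```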
  • determining the time window corresponding to each photoelectric detection unit to be polled specifically includes:
  • according to the offset range and the corresponding relationship between offset and distance, determining the corresponding distance range when the laser spot falls on each of the photoelectric detection units to be polled;
  • according to the distance range and the corresponding relationship between distance and time delay, determining the time-of-flight range in which each photoelectric detection unit to be polled can receive the laser spot; the time-of-flight range is used as the time window.
  • according to the electrical signal provided by each photoelectric detection unit turned on by polling in the pulse emission period, obtaining the depth information of the laser spot at the current scanning angle specifically includes:
  • the distance information of the laser spot at the current scanning angle is obtained as the depth information of the laser spot.
  • the method further includes:
  • the offset corresponding to the distance information is obtained according to the correspondence between the offset and the distance;
  • the integer part of the offset indicates the number of photodetection units by which the target photodetection unit is offset relative to the photodetection unit without parallax,
  • where the target photoelectric detection unit is the photoelectric detection unit on which the laser spot falls, and the fractional part of the offset indicates the super-divided position at which the laser spot is received within the target photoelectric detection unit;
  • according to the offset and the superdivision multiple of the photoelectric sensor, the super-divided pixel at which the laser spot is detected in the target photoelectric detection unit at the current scanning angle is determined, so as to construct the corresponding relationship between the depth information and the super-divided pixel.
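A minimal sketch of this offset decomposition, following the integer/fractional split described above (function and parameter names are illustrative, not from the application):

```python
def locate_superdiv_pixel(offset_units, superdiv_multiple, ref_unit=0):
    """Split the parallax offset: the integer part gives the target
    photodetection unit relative to the no-parallax unit, and the fractional
    part, scaled by the superdivision multiple, gives the super-divided
    column inside that unit."""
    unit_shift = int(offset_units)                # whole photodetection units
    frac = offset_units - unit_shift              # position inside the unit
    superdiv_col = int(frac * superdiv_multiple)  # 0 .. superdiv_multiple - 1
    return ref_unit + unit_shift, superdiv_col

# Example: an offset of 2.75 units with a 4x superdivision multiple lands
# in the unit 2 positions from the reference, super-divided column 3.
unit, col = locate_superdiv_pixel(2.75, 4)
```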
  • the depth information of the laser spot at each scanning angle is spliced, which specifically includes:
  • the depth information of the laser spot at each scanning angle is stitched together by using the corresponding relationship between the depth information and the super-divided pixels.
  • scan the target object at different scanning angles including:
  • the laser light path is adjusted before scanning the target object at the next scanning angle to form the next scanning angle.
  • the present application provides a depth imaging system, including: a laser, a controller, a gating switch, a photoelectric sensor, a time-to-digital converter, and a processor; the controller is connected to the laser; the photoelectric sensor includes multiple photoelectric detection units, and each photoelectric detection unit is divided into a plurality of super-divided pixels according to the size of the laser spot; the gating switch is connected to all the photoelectric detection units of the photoelectric sensor; the controller and the processor are respectively connected to the time-to-digital converter; the time-to-digital converter is also connected to the photodetection units through the gating switch;
  • the controller is used to control the laser to scan the target object with different scanning angles, and emit laser pulses with a pulse emission period under each scanning angle;
  • the gating switch is used to poll and control the photodetection units on the photoelectric sensor to turn on and off in each pulse emission period; the turned-on photodetection unit is used to receive the light signal reflected by the target object and to convert the optical signal into an electrical signal;
  • the time-to-digital converter is used to obtain the flight time according to the electrical signals provided by each photoelectric detection unit turned on by polling in the pulse emission period and the emission time of each pulse at the current scanning angle, and to convert the flight time into a count value;
  • the processor is configured to form a direct time-of-flight histogram corresponding to the current scanning angle according to the count values converted by the time-to-digital converter from the electrical signals, and to obtain from the histogram the depth information of the laser spot; after the laser finishes scanning the target object, the depth information of the laser spot at each scanning angle is spliced to obtain a super-resolution depth image of the target object.
  • the laser includes: a laser light source, a collimating lens, a deflecting mirror, a light splitting element and a driving device;
  • the laser light source is used to emit a laser beam, and the laser beam includes laser pulses emitted according to a pulse emission period;
  • the collimating lens is used for collimating and sending the laser beam to the deflecting mirror;
  • the deflecting mirror is connected with the driving device and is used for reflecting the laser beam from the collimating lens to the beam splitting element; meanwhile, the deflecting mirror is periodically deflected by the driving device;
  • the beam splitting element is used for splitting the received laser beam into multiple beams, and then projecting the multiple laser beams to the target object.
  • the embodiments of the present application at least have the following advantages:
  • the target object is scanned at different scanning angles, and laser pulses are emitted with a pulse emission period at each scanning angle; in each pulse emission period, the photodetection units on the photoelectric sensor are polled and controlled to turn on and off;
  • the photoelectric sensor includes a plurality of photoelectric detection units, and each photoelectric detection unit is divided into a plurality of super-divided pixels according to the size of the laser spot; according to the electrical signal provided by each photoelectric detection unit turned on by polling in the pulse emission period, the depth information of the laser spot at the current scanning angle is obtained; after the target object is scanned, the depth information of the laser spot at each scanning angle is spliced to obtain the super-resolution depth image of the target object.
  • since the photoelectric detection units are turned on by polling in each pulse emission period, the cumulative exposure time of the photoelectric detection units on the photoelectric sensor is greatly shortened compared with time-sharing opening, which in turn reduces power consumption and improves the frame rate and the depth imaging speed.
  • the polling method also ensures that a photodetection unit turned on at a given time is not interfered with by adjacent photodetection units, so the converted signal has a higher signal-to-noise ratio. Ensuring a high effective-signal detection probability in turn benefits the imaging quality of the depth image of the target object.
  • Fig. 1 is a kind of lattice super-resolution schematic diagram
  • FIG. 2 is a schematic diagram of the relationship between a transmitting field of view and a receiving field of view
  • Fig. 3 is a kind of schematic diagram of the relationship between offset and distance
  • FIG. 4 is a schematic diagram of the actual size of the field of view of a single photoelectric detection unit in the RX field of view under various different distances;
  • FIG. 5 is a flowchart of a depth imaging method provided by an embodiment of the present application.
  • Fig. 6 is a kind of laser dot matrix projection schematic diagram
  • FIG. 7 is a schematic diagram of the positional change of a plurality of laser light spots in the super-divided pixel when the scanning angle changes according to an embodiment of the present application;
  • FIG. 8 is a schematic diagram of the offset range of other photodetection units in the lateral direction relative to the photodetection unit without parallax according to an embodiment of the present application;
  • FIG. 9 is a schematic diagram of the polling sequence of each photoelectric detection unit SPAD0 to SPAD4 to be polled in one pulse emission period provided by the embodiment of the present application;
  • FIG. 10 is a schematic diagram of a direct flight time histogram provided by an embodiment of the present application.
  • FIG. 11 is a schematic diagram of a lattice emission parallax provided by an embodiment of the present application.
  • FIG. 13 is a schematic structural diagram of a depth imaging system according to an embodiment of the present application.
  • FIG. 14 is a schematic structural diagram of a laser according to an embodiment of the present application.
  • the dot matrix super-resolution technique can be used to improve the resolution.
  • Figure 1 is a schematic diagram of lattice super-resolution.
  • the size of one photodetection unit corresponds to 4 × 4 super-divided pixels, and 4 × 4 photodetection units share one TDC.
  • the size of the super-divided pixels in the photodetection unit matches the size of the laser spot.
  • if the diameter of the laser spot is K, the photodetection unit is divided into a plurality of super-divided pixels of K × K size.
  • the diameter of the laser spot can also be slightly smaller than the size of the super-pixel.
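This sizing rule can be illustrated as follows (dimensions are hypothetical; integer division models the requirement that each super-divided pixel be at least as large as the spot):

```python
def superdivide(unit_w, unit_h, spot_diameter):
    """Divide one photodetection unit into super-divided pixels whose size
    matches (or slightly exceeds) the laser spot diameter."""
    cols = unit_w // spot_diameter
    rows = unit_h // spot_diameter
    return rows, cols

# Example: a unit whose sides are 4 spot diameters yields a 4 x 4 grid,
# matching the schematic described for Figure 1.
rows, cols = superdivide(unit_w=40, unit_h=40, spot_diameter=10)
```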
  • FIG. 2 is a schematic diagram showing the relationship between the transmitting field of view and the receiving field of view.
  • TX represents a laser
  • RX represents a photoelectric sensor.
  • the dashed line through TX represents the baseline of the laser and the dashed line through RX represents the baseline of the photosensor.
  • the horizontal direction represents the offset
  • the vertical direction represents the distance. Due to the existence of the baseline distance, the sending and receiving positions of light are mismatched, and the offset of the mismatch (also called parallax) is inversely proportional to the distance.
  • FIG. 3 is a schematic diagram of the relationship between the offset and the distance.
  • the offset is measured in numbers of photodetection units; for example, an offset of 3 photodetection units in the lateral direction actually means a lateral shift by the size of 3 photodetection units. As shown in Figure 3, the farther the distance, the smaller the offset; the closer the distance, the greater the offset.
  • Formula (1): offset = d_baseline / d_spad, where offset represents the parallax, measured in numbers of photodetection units
  • d_baseline represents the baseline distance
  • d_spad represents the actual size of the field of view of a single photoelectric detection unit at a given distance.
  • the size in the actual field of view corresponding to a single photodetection unit is inversely proportional to the offset,
  • and is proportional to the distance.
  • Figure 4 shows the actual size of the field of view of a single photodetection unit in the RX field of view at various distances. As shown in Figure 4, at a distance closer to RX, a single photodetection unit corresponds to a smaller size in the RX field of view; at a distance farther from RX, a single photodetection unit corresponds to a larger size in the RX field of view.
  • Formula (2) shows how the field of view of a single photoelectric detection unit maps to a size in the actual field of view: d_spad = Dist · FOV_h / N_h, where
  • d_spad represents the size in the actual field of view corresponding to the field of view of a single photodetection unit
  • Dist represents the distance
  • FOV_h represents the lateral field of view of the photodetection unit
  • N_h represents the lateral resolution of the photodetection unit.
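Combining the per-unit footprint with the definition of the offset gives the inverse relationship between offset and distance shown in Figure 3. A sketch under the small-angle assumption, with hypothetical parameter values:

```python
import math

def unit_footprint_m(dist_m, fov_h_rad, n_h):
    """Lateral size in the actual field of view covered by one photodetection
    unit at a given distance (small-angle form: Dist * FOV_h / N_h)."""
    return dist_m * fov_h_rad / n_h

def parallax_offset_units(dist_m, d_baseline_m, fov_h_rad, n_h):
    """Offset in photodetection units: baseline divided by the unit footprint,
    so the offset shrinks as the distance grows."""
    return d_baseline_m / unit_footprint_m(dist_m, fov_h_rad, n_h)

# Hypothetical setup: 10 mm baseline, 60-degree lateral FOV over 100 units.
# Doubling the distance halves the offset.
near = parallax_offset_units(0.5, 0.01, math.radians(60.0), 100)
far = parallax_offset_units(1.0, 0.01, math.radians(60.0), 100)
```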
  • the premise of lattice super-resolution is knowing in advance the specific position of the light spot within the photoelectric detection unit to be super-resolved, without depending on the output of the photoelectric sensor. That is, the precise position of the light spot formed on the photoelectric sensor by the emitted laser pulse must be predicted.
  • the time-sharing method of turning on the photodetector units requires sequentially turning on the photodetector units that may detect the light spot, and detecting whether there is a laser pulse in the histogram formed by the converted signals of these photodetector units to determine the position of the light spot.
  • however, the time-sharing opening means many exposures and a long total exposure time, resulting in a multiplied increase in system power consumption and a multiplied drop in frame rate.
  • the other method, fully opening the photoelectric detection units at the same time, easily detects a large amount of ambient light, which reduces the signal-to-noise ratio of the histogram; the effective signal is easily buried in noise, making it difficult to detect the light pulse and thus to determine the spot position.
  • the present application provides a depth imaging method and a depth imaging system.
  • the photoelectric detection units on the photoelectric sensor are polled and controlled to turn on and off in each pulse emission period, which shortens the exposure time of the photoelectric detection units and reduces the number of exposures, thus saving system power consumption, improving the frame rate, and improving the depth imaging speed.
  • the signal-to-noise ratio of the histogram is enhanced, and the detection rate of valid signals is improved.
  • FIG. 5 is a flowchart of a depth imaging method provided by an embodiment of the present application. As shown in Figure 5, the depth imaging method includes:
  • Step 501 Scan the target object with different scanning angles, and emit laser pulses with a pulse emission period at each scanning angle.
  • the target object refers to the object that depth imaging needs to present. According to the actual needs of the depth image, the target object may be a person, an animal, a building, etc.
  • the type of the target object is not limited here.
  • a laser is used to transmit light pulses to the target object, and then the photoelectric sensor receives the light pulses reflected from the target object.
  • the collimation of the laser is good, so a light spot is formed when the beam is projected onto the target object, and what the photoelectric sensor detects is specifically this light spot on the target object.
  • various types of lasers can be used, such as lasers operating in the infrared band or lasers operating in the visible light band.
  • the detection band of the photoelectric sensor should match the working band of the laser, so as to realize the effective detection of the formed light spot.
  • a two-dimensional line scan of the target object can be performed. Since the two-dimensional line scan can be divided into several independent one-dimensional line scans, in the following description, the horizontal one-dimensional line scan is taken as an example for description.
  • the scanning angle of the laser to the target object changes constantly.
  • the scanning angle can be changed by adjusting the laser light path inside the laser, or the scanning angle can be changed by adjusting the position of the laser as a whole without changing the laser light path inside the laser.
  • the next scanning angle is formed, and then the scanning is performed at the next scanning angle until the super-division scanning covering the entire receiving field of view is completed.
  • the laser can be made to emit a laser lattice, and then multiple laser spots can be formed on the surface of the target object when the laser pulse is emitted.
  • FIG. 6 is a schematic diagram of a laser dot matrix projection.
  • the photoelectric sensor includes a plurality of photoelectric detection units.
  • the photoelectric detection units on the photoelectric sensor are arranged according to the horizontal and vertical rules.
  • the lateral dimension of each photodetector unit may be the same as the longitudinal dimension, or may be different from the longitudinal dimension.
  • the photodetection unit may be a single photon avalanche diode (Single Photon Avalanche Diode, SPAD) or an avalanche photodiode (Avalanche Photon Diode, APD).
  • SPAD Single Photon Avalanche Diode
  • APD avalanche Photodiode
  • the specific type of the photodetection unit is not limited here.
  • Each photodetection unit is divided into a plurality of super-divided pixels according to the size of the laser spot. For example, if the diameter of the laser spot is D, and the lateral and longitudinal dimensions of the photodetection unit are both 4D, each photodetection unit can be divided into 4 × 4 super-divided pixels.
  • the photodetection unit is divided into a plurality of super-resolution pixels according to the size of the laser spot, in order to perform depth imaging with the depth information recorded when the light spot is detected by each super-resolution pixel, so as to meet the high resolution requirements for depth imaging.
  • the embodiment of the present application provides a schematic diagram of the position change of multiple laser light spots in the super-divided pixel when the scanning angle changes, as shown in FIG. 7 .
  • the left side and the right side of FIG. 7 respectively show the positions of the laser spots 001 to 002 on the photoelectric detection units of the photoelectric sensor at two scanning angles. It can be seen from FIG. 7 that each time the scanning angle changes, each laser spot 001 to 002 is laterally shifted by one super-divided pixel on the photoelectric sensor as a whole.
  • laser pulses are emitted with a preset pulse emission period at each scanning angle, in order to accumulate enough photons to obtain the depth information of the laser spot subsequently.
  • the pulse emission period can be set according to actual requirements, for example, the pulse emission period is set to 100ns.
  • Step 502 polling and controlling the photodetection unit on the photoelectric sensor to turn on and off in each pulse emission period.
  • at a given scanning angle, the light spots formed on the photoelectric sensor by the laser pulses emitted in different pulse emission periods fall on the same super-divided pixel position.
  • however, which photodetection unit receives the light signal reflected from the laser spot, and which super-divided pixel on that photodetection unit receives it, are not known in advance.
  • an implementation scheme of controlling the opening and closing of the photoelectric detection units by polling is proposed. Specifically, as described in this step, in each pulse emission period the photodetection units on the photoelectric sensor are controlled by polling to turn on and off.
  • the photodetection unit when the target distance is infinite and the TX and RX do not have parallax can be pre-calibrated as the photodetection unit without parallax.
  • the position of the photodetection unit without parallax is used as a reference position to measure the offset of each other photodetection unit.
  • FIG. 8 is a schematic diagram of the offset range of the other photodetection units relative to the photodetection unit without parallax in the lateral direction. As shown in FIG. 8, the offset range of the photodetection unit Spad_i is [Δ_{i-1}, Δ_i].
  • the distance of the target object can be obtained by the time of flight of the light pulse emitted by the laser.
  • the relationship between distance and flight time is given by formula (6): d = c · Δt / 2, where
  • Δt represents the flight time
  • c represents the speed of light. Therefore, according to formulas (4) to (6), the flight-time range [t_init_i, t_end_i] corresponding to the distance range [d_{i-1}, d_i] of the target object detected by the photoelectric detection unit can be obtained;
  • the lower and upper bounds of [t_init_i, t_end_i] are t_init_i = 2 · d_{i-1} / c and t_end_i = 2 · d_i / c.
  • Before the photodetection units are polled on during the pulse emission period, the photodetection units to be polled may first be determined. Then, from the offset range of each photodetection unit to be polled relative to the photodetection unit without parallax, combined with the relationship between offset and distance, the corresponding distance range when the laser spot falls on each photodetection unit to be polled is obtained. The time-of-flight range in which each photodetection unit to be polled can receive the laser spot is then determined from the correspondence between distance and time delay, and this time-of-flight range is used as the time window for polling that unit. In this way, the time window corresponding to each photodetection unit to be polled is determined. Finally, each photodetection unit to be polled is controlled to turn on and off by polling according to its time window.
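The time-window derivation above can be sketched as follows, assuming the parallax relation offset = d_baseline / (Dist · FOV_h / N_h) and d = c · Δt / 2; all names and numbers are illustrative:

```python
import math

C = 299_792_458.0  # speed of light, m/s

def distance_from_offset(offset_units, d_baseline_m, fov_h_rad, n_h):
    """Invert the assumed parallax relation to recover distance from offset."""
    return d_baseline_m * n_h / (offset_units * fov_h_rad)

def time_window(offset_lo, offset_hi, d_baseline_m, fov_h_rad, n_h):
    """Time-of-flight window [t_init, t_end] for a unit whose offset range is
    [offset_lo, offset_hi]. A larger offset means a closer target, hence an
    earlier return, so offset_hi sets the lower bound of the window."""
    d_far = distance_from_offset(offset_lo, d_baseline_m, fov_h_rad, n_h)
    d_near = distance_from_offset(offset_hi, d_baseline_m, fov_h_rad, n_h)
    return 2 * d_near / C, 2 * d_far / C

# Hypothetical unit with offset range [1, 2] photodetection units.
t0, t1 = time_window(1.0, 2.0, 0.01, math.radians(60.0), 100)
```

Note that because distance is inversely proportional to offset, halving the offset doubles the distance and therefore doubles the time-of-flight bound.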
  • FIG. 9 exemplarily shows the polling sequence of each of the photodetection units SPAD0 to SPAD4 to be polled in one pulse emission period.
  • within one pulse emission period: SPAD0 on → SPAD0 off → SPAD1 on → SPAD1 off → SPAD2 on → SPAD2 off → SPAD3 on → SPAD3 off → SPAD4 on → SPAD4 off.
  • each photoelectric detection unit to be polled is enabled only within the time window in which it may detect the laser spot at the corresponding distance, so as to avoid wasting exposure time and to reduce the chance of detecting ambient light.
  • Step 503 Obtain the depth information of the laser spot at the current scanning angle according to the electrical signal provided by each photoelectric detection unit that is turned on by polling in the pulse emission period.
  • the photodetection unit when the photodetection unit is turned on, it is possible to collect signal light or ambient light.
  • The photodetection unit on which the laser spot actually falls is hereinafter referred to as the target photodetection unit. It can be understood that, as the scanning angle changes, the target photodetection unit changes accordingly.
  • The target photodetection unit detects signal light (or signal light plus ambient light), while the remaining photodetection units detect only ambient light.
  • A direct time-of-flight histogram corresponding to the current scanning angle is generated from the electrical signals provided by each photodetection unit turned on by polling during the pulse emission periods at the current scanning angle.
  • the horizontal axis of the direct flight time histogram represents time, and the vertical axis represents the count value.
  • Figure 10 is a schematic diagram of a direct time-of-flight histogram. The time corresponding to the bin with the highest count value is obtained by finding the peak of the histogram, and this time is taken as the time of flight of the laser spot at the current scanning angle.
  • The time of flight obtained from the histogram in the previous step is substituted into formula (6) to obtain the distance information of the laser spot at the current scanning angle.
  • the distance information represents the depth information of the laser spot detected by the target photoelectric detection unit.
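The peak search and the time-to-distance conversion above can be sketched as follows. Formula (6) is not reproduced in this excerpt; the sketch assumes the standard direct-ToF relation d = c·t/2, and the 1 ns bin width is our assumption.

```python
from collections import Counter

C = 299_792_458.0   # speed of light (m/s)
BIN_WIDTH = 1e-9    # histogram bin width: 1 ns (assumed, not from the patent)

def depth_from_tofs(tof_samples):
    """Accumulate time-of-flight samples into a histogram, find the peak
    bin, and convert its centre time to a distance with d = c * t / 2."""
    hist = Counter(int(t / BIN_WIDTH) for t in tof_samples)
    peak_bin, _ = max(hist.items(), key=lambda kv: kv[1])
    t_peak = (peak_bin + 0.5) * BIN_WIDTH   # centre of the peak bin
    return C * t_peak / 2.0

# 100 signal returns near 66.7 ns (about 10 m) plus a few ambient counts
samples = [66.7e-9] * 100 + [5e-9, 23e-9, 80e-9]
print(depth_from_tofs(samples))
```

Scattered ambient counts each land in their own bins, so the signal bin dominates the peak search, as in FIG. 10.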
  • As the scanning angle changes, steps 501 to 503 are executed cyclically.
  • Step 504 After the target object is scanned, the depth information of the laser spot at each scanning angle is spliced to obtain a super-resolution depth image of the target object.
  • The scanning position on the target object changes; that is, the position at which the same laser beam is projected onto the target object changes.
  • Correspondingly, the position on the photoelectric sensor of the super-divided pixel receiving the same laser spot shifts laterally, and the scale of the shift is one super-divided pixel.
  • Because the depth information of the laser spot is obtained at each scanning angle, the complete super-resolution depth image of the target object can be obtained by splicing according to the positional relationship between the super-divided pixels that receive the laser spot at the respective scanning angles.
  • To summarize: the target object is scanned at different scanning angles, and laser pulses are emitted at each scanning angle with a pulse emission period; in each pulse emission period, the photodetection units on the photoelectric sensor are polled and controlled to turn on and off; the photoelectric sensor includes a plurality of photodetection units, each of which is divided into a plurality of super-divided pixels according to the size of the laser spot; and the depth information of the laser spot at the current scanning angle is obtained from the electrical signals provided by each photodetection unit turned on by polling in the pulse emission period.
  • the depth information of the laser spot at each scanning angle is spliced to obtain the super-resolution depth image of the target object.
  • Since the photodetection units are turned on by polling in each pulse emission period, the cumulative exposure time of the photodetection units on the photoelectric sensor is greatly shortened compared with turning the units on in a time-sharing manner, which in turn reduces power consumption and improves the frame rate and the depth imaging speed.
  • In addition, the polling approach ensures that a photodetection unit that is turned on at a given time is not interfered with by other adjacent photodetection units, so the converted signal has a higher signal-to-noise ratio. Ensuring a high effective-signal detection probability in this way is beneficial to improving the imaging quality of the depth image of the target object.
  • Moreover, the time window in which each photodetection unit is enabled within a pulse emission period is precisely controlled, so that photons that may return from each distance are accurately received while the interference of ambient light is reduced as much as possible.
  • The number of ambient-light photons collected while the photodetection units are polled on is equivalent to the number collected while a single photodetection unit is kept on in the prior art, and the detection of signal light is equivalent to fully opening all photodetection units; therefore, no signal quantity is lost, and system power consumption is greatly reduced.
  • multiple laser beams can be projected to the target object at the same time. These laser beams may be parallel or non-parallel.
  • parallel output is taken as an example for description.
  • FIG. 11 is a schematic diagram of dot-matrix emission parallax.
  • the solid circles in Figure 11 represent different laser spots in the TX field of view, and the open circles represent different laser spots in the RX field of view.
  • From FIG. 11 it is not difficult to see that, if too many photodetection units are involved in a single round of polling control, the optical signals of two or more valid laser spots may be obtained in one round. In that case, the weaker optical signal is easily mistaken for ambient-light noise and ignored. To avoid this problem, the photodetection units to be polled for a single laser spot can be determined before polling.
  • the depth imaging method includes:
  • Step 1201 Scan the target object at different scanning angles, and emit laser pulses to the target object in a dot matrix projection manner for multiple pulse emission cycles at each scanning angle, so as to form multiple laser spots on the photoelectric sensor at the same time.
  • The multiple laser spots formed at two different scanning angles are shown on the left side of FIG. 7 and on the right side of FIG. 7, respectively.
  • Step 1202 Obtain the minimum interval between two adjacent laser spots in the field of view of the laser according to the baseline distance between the laser and the photoelectric sensor, the minimum detection distance of the photoelectric sensor, and the lateral field angle and lateral resolution of the photodetection unit.
  • the baseline distance between the laser and the photoelectric sensor can be determined by pre-calibration.
  • the detection capability of each photoelectric sensor includes the maximum detection distance and the minimum detection distance.
  • the minimum detection distance can be obtained from the factory parameters of the photoelectric sensor, or obtained through multiple tests.
  • the lateral field of view and lateral resolution of the photodetection unit can also be obtained from the factory parameters.
  • In order to determine the photodetection units to be polled for each laser spot in each pulse emission period, and to prevent more than one laser spot from being detected in one round of polling, the minimum interval between two adjacent laser spots in the field of view of the laser (the TX field of view) is first calculated. For each laser spot, the photodetection units to be polled are then determined using this minimum interval.
  • N spot represents the minimum interval between two adjacent laser spots in the TX field of view
  • d baseline represents the baseline distance between the laser and the photoelectric sensor
  • Dist min represents the minimum detection distance of the photoelectric sensor
  • FOV h represents the lateral field angle of the photodetection unit
  • N h represents the lateral resolution of the photodetection unit.
  • The quantity inside the round-up symbol is the maximum offset, which corresponds to the minimum detection distance.
  • 1 is added to the rounded-up maximum offset, and the resulting value is used as the minimum interval between two adjacent laser spots in the laser field of view.
  • the minimum interval is represented by the number of photodetector units.
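The formula itself is not reproduced in this excerpt, only its symbols. Under a small-angle model, the maximum parallax offset at the minimum detection distance can be written as d_baseline / (Dist_min · (FOV_h / N_h)) photodetection units; the sketch below uses this assumed form, rounds up, and adds 1 as the passage describes. All numeric parameters are hypothetical.

```python
import math

def min_spot_interval(d_baseline, dist_min, fov_h_rad, n_h):
    """Minimum interval (in photodetection units) between two adjacent
    laser spots in the TX field of view: round the maximum parallax
    offset at the minimum detection distance up, then add 1."""
    unit_pitch = fov_h_rad / n_h                       # angular pitch of one unit
    max_offset = (d_baseline / dist_min) / unit_pitch  # small-angle parallax
    return math.ceil(max_offset) + 1

# Hypothetical parameters: 5 mm baseline, 1.0 m minimum detection distance,
# 40 degree lateral field angle split over 320 photodetection units
n_spot = min_spot_interval(0.005, 1.0, math.radians(40.0), 320)
print(n_spot)
```

With these assumed parameters the result is the N spot = 4 used in the worked example below.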
  • Step 1203 Determine the photodetection unit to be polled corresponding to each laser spot in each pulse emission period according to the photodetection unit without parallax and the minimum interval.
  • Taking N spot = 4 as an example, four consecutive photodetection units are used as the photodetection units to be polled for a certain laser spot, and the following four photodetection units are used as the photodetection units to be polled for the adjacent laser spot.
  • In this example, the minimum interval is 4.
  • The photodetection units 701 to 704 are the photodetection units to be polled corresponding to the laser spot 001.
  • The photodetection units 705 to 708 are the photodetection units to be polled corresponding to the laser spot 002.
  • In this way, the N spot photodetection units to be polled for a certain laser spot will not detect more than one laser spot accumulated within one pulse emission period. This reduces the difficulty of identifying the signal light, avoids missed detection of the signal light, and enhances the pertinence and accuracy of signal light detection.
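The grouping in this example (units 701 to 704 for laser spot 001, units 705 to 708 for laser spot 002) can be sketched as follows; the helper name and the choice of unit numbering are ours, not the patent's.

```python
def units_to_poll(first_unit, n_spot, n_spots):
    """Assign each laser spot a group of n_spot consecutive photodetection
    units, starting from the first unit without parallax."""
    return [list(range(first_unit + i * n_spot,
                       first_unit + (i + 1) * n_spot))
            for i in range(n_spots)]

groups = units_to_poll(first_unit=701, n_spot=4, n_spots=2)
print(groups)
```

Because adjacent spots are at least N spot units apart, each group can only ever accumulate returns from its own spot in one pulse emission period.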
  • Step 1204 Determine the time window corresponding to each photoelectric detection unit to be polled.
  • Step 1205 Control each photoelectric detection unit to be polled on and off according to the time window polling.
  • Step 1206 Generate a direct flight time histogram corresponding to the current scanning angle according to the electrical signals provided by each photoelectric detection unit that is turned on by polling in the pulse emission period under the current scanning angle.
  • When S laser spots are projected at the same time (S being an integer greater than 1), S direct time-of-flight histograms can be generated.
  • The time corresponding to the maximum count value of each direct time-of-flight histogram is the time of flight of the corresponding laser spot.
  • An example of a histogram can be found in FIG. 10 .
  • Step 1207 find a peak in the direct flight time histogram, and determine the flight time corresponding to the laser spot at the current scanning angle.
  • Step 1208 According to the flight time and the corresponding relationship between the flight time and the distance, obtain the distance information of the laser spot at the current scanning angle as the depth information of the laser spot.
  • In addition, the following steps can be used to determine the target photodetection unit in which the super-divided pixel that receives the laser beam reflected from the laser spot is located, as well as the position of that super-divided pixel within the target photodetection unit (that is, the super-division position).
  • Step 1209 Obtain the offset corresponding to the distance information according to the correspondence between the offset and the distance.
  • d obj represents the distance information of a certain laser spot under the current scanning angle
  • ⁇ obj represents the offset corresponding to the distance information
  • the offset calculated according to formula (10) usually includes an integer part and a fractional part.
  • The integer part indicates the number of photodetection units by which the target photodetection unit is offset relative to the photodetection unit without parallax.
  • the target photodetector unit is the photodetector unit where the laser spot is located.
  • The fractional part of the offset indicates the super-division position of the received laser spot within the target photodetection unit.
  • Taking an offset of 3.25 as an example, the integer part 3 means that the target photodetection unit is offset by 3 photodetection units relative to the photodetection unit without parallax, and the fractional part 0.25 indicates the super-division position within that unit.
  • To determine the specific super-divided pixel, the following step 1210 needs to be performed.
  • Step 1210 According to the offset and the super-division multiple of the photoelectric sensor, determine the super-divided pixel that detects the laser spot in the target photodetection unit at the current scanning angle, so as to construct the correspondence between the depth information and the super-divided pixels.
  • In the formula, l rx represents the ordinal number of the super-divided pixel in the target photodetection unit that detects the laser spot, N supres represents the super-division multiple of the photoelectric sensor, and the offset term is the offset corresponding to the distance information of the laser spot; ⌊·⌋ is the round-down (floor) symbol and ⌈·⌉ is the round-up (ceiling) symbol.
  • one photodetection unit includes 4 superdivision pixels laterally, so the superdivision multiple is 4.
  • If the ordinal number of the super-divided pixel that detects the laser spot is calculated to be 1, it means that the super-divided pixel receiving the laser spot is the first super-divided pixel of the target photodetection unit along the moving direction of the laser spot.
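The decomposition above can be sketched as follows. The exact formula is not reproduced in this excerpt, so the computation below is reconstructed from the worked example (offset 3.25, super-division multiple 4, ordinal 1) and should be read as an assumption.

```python
import math

def locate_superpixel(offset, n_supres):
    """Split the offset into the whole-unit shift of the target
    photodetection unit and the ordinal of the super-divided pixel
    inside it (counted along the moving direction of the laser spot)."""
    unit_shift = math.floor(offset)          # integer part: unit offset
    frac = offset - unit_shift               # fractional part: sub-unit position
    l_rx = math.ceil(frac * n_supres)        # super-divided pixel ordinal
    return unit_shift, l_rx

print(locate_superpixel(3.25, 4))
```

For an offset of 3.25 and a super-division multiple of 4, this yields a unit shift of 3 and pixel ordinal 1, matching the example in the text.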
  • As the scanning angle changes, steps 1201 to 1210 are performed cyclically. In this way, the correspondence between each super-divided pixel that has received the laser spot and the depth information of that laser spot can be constructed.
  • Step 1211 After the target object is scanned, the depth information of the laser spot at each scanning angle is stitched together by using the corresponding relationship between the depth information and the super-divided pixels.
  • For example, the depth information of the laser spot 001 detected at the first super-divided pixel of the photodetection unit 701 and the depth information detected at the second super-divided pixel of the photodetection unit 701 are stitched together according to the positions of the two super-divided pixels, and so on.
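The splicing step can be sketched as follows for one lateral row; the record layout and unit count are our assumptions for illustration, with unit indices renumbered from 0.

```python
def stitch_depth_row(records, n_supres, n_units):
    """Splice per-angle depth measurements into one super-resolution row.
    Each record is (unit_index, superpixel_ordinal, depth_m); the global
    super-divided pixel position is unit_index * n_supres + ordinal."""
    row = [None] * (n_units * n_supres)
    for unit, l_rx, depth in records:
        row[unit * n_supres + l_rx] = depth
    return row

# Hypothetical measurements for one unit, gathered over several scan angles
measurements = [(0, 0, 1.20), (0, 1, 1.21), (0, 2, 1.19), (0, 3, 1.22)]
row = stitch_depth_row(measurements, n_supres=4, n_units=2)
print(row)
```

Each scanning angle fills a different super-divided pixel, so after the scan the row holds one depth value per super-divided pixel that received the spot.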
  • In this embodiment, the minimum interval is obtained in step 1202, and the photodetection units to be polled for each laser spot in each pulse emission period are then determined based on the minimum interval in step 1203, which reduces the difficulty of identifying the signal light, avoids missed detection of the signal light, and enhances the pertinence and accuracy of signal light detection.
  • In step 1210, the correspondence between the depth information and the super-divided pixels is constructed, which improves the efficiency of splicing the depth information of each position of the target object and increases the speed of depth imaging.
  • the present application also provides a depth imaging system.
  • the system is described below with reference to the embodiments and the accompanying drawings.
  • FIG. 13 is a schematic structural diagram of a depth imaging system provided by an embodiment of the present application. As shown in FIG. 13, the depth imaging system includes:
  • the controller 1302 is connected to the laser 1301 .
  • the controller 1302 is used to control the laser 1301 to scan the target object at different scanning angles, and emit laser pulses with a pulse emission period under each scanning angle.
  • the controller 1302 sets the pulse emission period of the laser 1301, and the laser 1301 emits laser pulses in the above pulse emission period according to the pulse control signal provided by the controller 1302.
  • the controller 1302 may control the scanning angle of the laser 1301 .
  • When the controller 1302 transmits a scanning angle adjustment signal to the laser 1301, the laser 1301 adjusts the scanning angle according to the scanning angle adjustment signal.
  • the photosensor 1304 includes a plurality of photodetection units, and each photodetector unit is divided into a plurality of super-divided pixels according to the size of the laser spot.
  • the photodetection unit can be SPAD or APD.
  • the specific type of the photodetection unit is not limited here. In Fig. 13, only SPAD 1, SPAD 2, SPAD 3...SPADN are used as examples to represent different photoelectric detection units.
  • the gate switch 1303 connects all photodetection units of the photoelectric sensor 1304 .
  • the gate switch 1303 is used to poll and control the photodetection unit on the photoelectric sensor 1304 to turn on and off in each pulse emission period.
  • the turned-on photodetection unit is used to receive the light signal reflected by the target object (ie, the reflected laser beam), and convert the light signal into an electrical signal.
  • the controller 1302 and the processor 1305 are respectively connected to the time-to-digital converter TDC.
  • the time-to-digital converter TDC is also connected to each photodetection unit of the photoelectric sensor 1304 through the gate switch 1303 .
  • The time-to-digital converter TDC is used to obtain the time of flight according to the electrical signal provided by each photodetection unit turned on by polling in the pulse emission period (the electrical signal also provides the time at which the optical signal is detected) and the emission time of each pulse at the current scanning angle, and to convert the time of flight into a count value.
  • the processor 1305 is used to form the direct flight time histogram corresponding to the current scanning angle according to the count value converted by the time-to-digital converter TDC and the electrical signal; obtain the depth information of the laser spot under the current scanning angle according to the direct flight time histogram; After the laser 1301 scans the target object, the depth information of the laser spot at each scanning angle is spliced to obtain a super-resolution depth image of the target object.
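As a minimal sketch of the TDC's time-to-count conversion described above (the 100 ps LSB is an assumption; the actual TDC resolution is device-specific and not stated in the text):

```python
def tdc_count(emit_time_s, detect_time_s, t_lsb_s=100e-12):
    """Quantise the interval between pulse emission and photon detection
    into the TDC's integer count value (LSB assumed to be 100 ps)."""
    return round((detect_time_s - emit_time_s) / t_lsb_s)

# A photon detected 66.7 ns after pulse emission
print(tdc_count(0.0, 66.7e-9))
```

The processor 1305 then accumulates these count values into the direct time-of-flight histogram for the current scanning angle.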
  • In the depth imaging system provided by this embodiment of the present application, since the photodetection units are turned on by polling in each pulse emission period, the cumulative exposure time of the photodetection units on the photoelectric sensor is greatly shortened compared with turning the units on in a time-sharing manner, which in turn reduces power consumption and improves the frame rate and the depth imaging speed.
  • In addition, the polling approach ensures that a photodetection unit that is turned on at a given time is not interfered with by other adjacent photodetection units, so the converted signal has a higher signal-to-noise ratio. A higher effective-signal detection probability is thus ensured, which is beneficial to improving the imaging quality of the depth image of the target object.
  • Moreover, the time window in which each photodetection unit is enabled within a pulse emission period is precisely controlled, so that photons that may return from each distance are accurately received while the interference of ambient light is reduced as much as possible.
  • The number of ambient-light photons collected while the photodetection units are polled on is equivalent to the number collected while a single photodetection unit is kept on in the prior art, and the detection of signal light is equivalent to fully opening all photodetection units; therefore, no signal quantity is lost, and system power consumption is greatly reduced.
  • the depth imaging system may further include a memory connected to the processor for storing the direct time-of-flight histogram.
  • Referring to FIG. 14, the figure illustrates the structure of a laser.
  • the laser includes: a laser light source 13011 , a collimating lens 13012 , a deflection mirror 13013 , a beam splitting element 13014 and a driving device 13015 .
  • the laser light source 13011 is used to emit a laser beam, and the laser beam includes laser pulses emitted according to the pulse emission period;
  • the collimating lens 13012 is used to collimate the laser beam and send it to the deflection mirror;
  • The deflection mirror 13013 is connected to the driving device 13015 and is used to reflect the laser beam from the collimating lens 13012 to the beam splitting element 13014; during scanning, the deflection mirror 13013 is periodically deflected by the driving device 13015.
  • the deflection mirror 13013 is mechanically and/or electrically connected to the driving device 13015 .
  • the beam splitting element 13014 is used to split the received laser beam into multiple beams, and then project the multiple laser beams to the target object.
  • the projected multiple laser beams may be parallel to each other, or may form an included angle.
  • the controller 1302 controls the laser shown in FIG. 14 to adjust the scanning angle, it may specifically send the scanning angle adjustment signal to the driving device 13015, and then the driving device 13015 drives the deflection mirror 13013 to rotate according to the scanning angle adjustment signal.
  • "At least one (item)" refers to one or more, and "a plurality" refers to two or more.
  • "And/or" is used to describe the relationship between associated objects and indicates that three relationships may exist; for example, "A and/or B" can mean: only A exists, only B exists, or both A and B exist, where A and B can be singular or plural.
  • The character "/" generally indicates that the associated objects are in an "or" relationship.
  • "At least one item(s) of the following" or similar expressions refer to any combination of these items, including any combination of a single item or plural items.
  • For example, "at least one (item) of a, b, or c" can mean: a; b; c; "a and b"; "a and c"; "b and c"; or "a and b and c", where a, b, and c can each be singular or plural.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Electromagnetism (AREA)
  • Measurement Of Optical Distance (AREA)
  • Optical Radar Systems And Details Thereof (AREA)

Abstract

Depth imaging method and depth imaging system, the method comprising: scanning a target object at different scanning angles, and emitting laser pulses in pulse emission cycles at each scanning angle (501); in each pulse emission cycle, polling and controlling the turning on and off of photodetection units on a photoelectric sensor (502), each photodetection unit of the photoelectric sensor being divided into multiple super-division pixels according to the size of the laser spots; obtaining depth information of the laser spots at the current scanning angle according to the electrical signals provided by the photodetection units turned on by polling during the pulse emission cycle (503); and, after the target object has been scanned, splicing the depth information of the laser spots at the respective scanning angles to obtain a super-resolution depth image of the target object (504). The cumulative exposure time of the photodetection units on the photoelectric sensor is considerably reduced, which makes it possible to lower power consumption and increase the frame rate and imaging speed, and the converted signals have a high signal-to-noise ratio, which helps improve the imaging quality of the depth image of the target object.
PCT/CN2021/107301 2020-07-23 2021-07-20 Depth imaging method and depth imaging system WO2022017366A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202010716865.3 2020-07-23
CN202010716865.3A CN113970757A (zh) Depth imaging method and depth imaging system

Publications (1)

Publication Number Publication Date
WO2022017366A1 true WO2022017366A1 (fr) 2022-01-27

Family

ID=79585292

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/107301 WO2022017366A1 (fr) Depth imaging method and depth imaging system

Country Status (2)

Country Link
CN (1) CN113970757A (fr)
WO (1) WO2022017366A1 (fr)


Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116917761A (zh) * 2022-02-17 2023-10-20 华为技术有限公司 探测装置及探测方法
CN114994704B (zh) * 2022-08-04 2022-12-27 中国科学院西安光学精密机械研究所 基于圆周扫描路径的非视域成像方法、系统及存储介质

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108431626A (zh) * 2015-12-20 2018-08-21 苹果公司 光检测和测距传感器
WO2019121437A1 (fr) * 2017-12-18 2019-06-27 Robert Bosch Gmbh Système lidar à impulsions multiples pour la détection multidimensionnelle d'objets
CN110609293A (zh) * 2019-09-19 2019-12-24 深圳奥锐达科技有限公司 一种基于飞行时间的距离探测系统和方法
WO2020033001A2 (fr) * 2018-02-13 2020-02-13 Sense Photonics, Inc. Procédés et systèmes pour lidar flash longue portée et haute résolution
WO2020070311A1 (fr) * 2018-10-04 2020-04-09 Iris-Gmbh Infrared & Intelligent Sensors Capteur d'imagerie
CN111427230A (zh) * 2020-03-11 2020-07-17 深圳市安思疆科技有限公司 基于时间飞行的成像方法及3d成像装置


Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115056818A (zh) * 2022-06-22 2022-09-16 中车青岛四方车辆研究所有限公司 3d测量模组异步控制方法、装置和轨道车辆三维检测系统
CN115056818B (zh) * 2022-06-22 2024-04-09 中车青岛四方车辆研究所有限公司 3d测量模组异步控制方法、装置和轨道车辆三维检测系统
WO2024066471A1 (fr) * 2022-09-27 2024-04-04 杭州海康机器人股份有限公司 Dispositif, procédé et appareil de collecte de données, et support de stockage
CN115685242A (zh) * 2022-10-31 2023-02-03 哈尔滨工业大学 一种探测激光延时反馈的控制系统
CN115685242B (zh) * 2022-10-31 2024-05-31 哈尔滨工业大学 一种探测激光延时反馈的控制系统
CN116300377A (zh) * 2023-03-06 2023-06-23 深圳市镭神智能系统有限公司 一种时间数字转换器与激光雷达
CN116300377B (zh) * 2023-03-06 2023-09-08 深圳市镭神智能系统有限公司 一种时间数字转换器与激光雷达
CN116400379A (zh) * 2023-06-08 2023-07-07 合肥国家实验室 单光子激光雷达三维成像系统和三维成像方法
CN116400379B (zh) * 2023-06-08 2023-09-12 合肥国家实验室 单光子激光雷达三维成像系统和三维成像方法
CN117928386A (zh) * 2024-03-22 2024-04-26 四川拓及轨道交通设备股份有限公司 一种便携式双目接触网几何参数检测系统及方法
CN117928386B (zh) * 2024-03-22 2024-05-31 四川拓及轨道交通设备股份有限公司 一种便携式双目接触网几何参数检测系统及方法

Also Published As

Publication number Publication date
CN113970757A (zh) 2022-01-25

Similar Documents

Publication Publication Date Title
WO2022017366A1 (fr) Procédé d'imagerie de profondeur et système d'imagerie de profondeur
CN110609293B (zh) 一种基于飞行时间的距离探测系统和方法
CN110596722B (zh) 直方图可调的飞行时间距离测量系统及测量方法
CN110596721B (zh) 双重共享tdc电路的飞行时间距离测量系统及测量方法
CN108431626B (zh) 光检测和测距传感器
CN109725326B (zh) 飞行时间相机
WO2021072802A1 (fr) Système et procédé de mesure de distance
WO2021103428A1 (fr) Système et procédé de mesure de profondeur
WO2021051479A1 (fr) Procédé et système de mesure de temps de vol fondée sur une interpolation
WO2021196194A1 (fr) Système d'émission et de réception laser, radar laser et appareil d'entraînement automatique
WO2021051481A1 (fr) Procédé de mesure de distance par temps de vol par traçage d'un histogramme dynamique et système de mesure associé
KR102226359B1 (ko) 스캔 모드를 변경하는 3차원 스캐닝 라이다 센서
WO2023015880A1 (fr) Procédé d'acquisition d'un ensemble d'échantillons d'apprentissage, procédé de formation de modèle et appareil associé
CN110780312B (zh) 一种可调距离测量系统及方法
WO2022206031A1 (fr) Procédé de détermination de niveau de bruit, lidar et procédé de télémétrie
KR101145132B1 (ko) 3차원 영상화 펄스 레이저 레이더 시스템 및 이 시스템에서의 자동 촛점 방법
WO2019091004A1 (fr) Dispositif radar-laser à réseau plan à semi-conducteur et procédé de détection
WO2021026709A1 (fr) Système de radar laser
IL269455B2 (en) Time of flight sensor
WO2022241942A1 (fr) Caméra de profondeur et procédé de calcul de profondeur
CN114488173A (zh) 一种基于飞行时间的距离探测方法和系统
CN116559846A (zh) 用于全固态激光雷达的光子探测方法、装置与激光雷达
WO2022083301A1 (fr) Système de télémétrie de capteur d'image 3d et procédé de télémétrie l'utilisant
CN114814880A (zh) 一种激光雷达探测参数调整控制方法及装置
WO2022198386A1 (fr) Appareil de télémétrie laser, procédé de télémétrie laser et plateforme mobile

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21845420

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21845420

Country of ref document: EP

Kind code of ref document: A1