WO2023280647A1 - Method for correcting optical crosstalk between pixels of an optical detection device, and corresponding detection device - Google Patents


Info

Publication number
WO2023280647A1
Authority
WO
WIPO (PCT)
Prior art keywords
detection device
signal
signals
pixels
distance
Prior art date
Application number
PCT/EP2022/067823
Other languages
German (de)
English (en)
Inventor
Thorsten BEUTH
Christoph Parl
Original Assignee
Valeo Schalter Und Sensoren Gmbh
Priority date
Filing date
Publication date
Application filed by Valeo Schalter Und Sensoren Gmbh filed Critical Valeo Schalter Und Sensoren Gmbh
Publication of WO2023280647A1

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48 Details of systems according to group G01S17/00
    • G01S7/483 Details of pulse systems
    • G01S7/486 Receivers
    • G01S7/4861 Circuits for detection, sampling, integration or read-out
    • G01S7/4863 Detector arrays, e.g. charge-transfer gates
    • G01S7/491 Details of non-pulse systems
    • G01S7/4912 Receivers
    • G01S7/4915 Time delay measurement, e.g. operational details for pixel components; Phase measurement
    • G01S7/4918 Controlling received signal intensity, gain or exposure of sensor
    • G01S7/497 Means for monitoring or calibrating
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88 Lidar systems specially adapted for specific applications
    • G01S17/89 Lidar systems specially adapted for mapping or imaging
    • G01S17/894 3D imaging with simultaneous measurement of time-of-flight at a 2D array of receiver pixels, e.g. time-of-flight cameras or flash lidar
    • G01S17/93 Lidar systems specially adapted for anti-collision purposes
    • G01S17/931 Lidar systems specially adapted for anti-collision purposes of land vehicles

Definitions

  • the invention relates to a method for operating an optical detection device for determining at least distance variables which characterize distances of objects detected with the optical detection device, in which at least one electromagnetic scanning signal is generated and sent into a monitoring area of the detection device; at least one electromagnetic echo signal, which originates from at least one electromagnetic scanning signal reflected on an object, is detected with at least some of the pixels of an optical receiver matrix and converted into corresponding electrical reception signals; and, by means of at least some of the reception signals, at least one distance variable is determined which characterizes a distance of at least one detected object from the at least one detection device.
  • the invention further relates to a detection device for determining at least distance variables which characterize distances of objects detected with the detection device, the detection device having at least one transmission device with which at least one electromagnetic scanning signal can be generated and sent into at least one monitoring area of the detection device, and at least one receiving device with at least one optical receiver matrix with a plurality of pixels, with which electromagnetic echo signals, which originate from at least one electromagnetic scanning signal reflected on an object, can be detected and converted into corresponding electrical reception signals.
  • the invention also relates to a vehicle with at least one detection device for determining at least distance variables which characterize distances of objects detected with the detection device, wherein the at least one detection device has at least one transmission device with which at least one electromagnetic scanning signal can be generated and sent into at least one monitoring area of the detection device, and at least one receiving device with at least one optical receiver matrix with a plurality of pixels, with which electromagnetic echo signals, which originate from at least one electromagnetic scanning signal reflected on an object, can be detected and converted into corresponding electrical reception signals.
  • a TOF (time-of-flight) distance sensor and a method for operating a TOF distance sensor are known from EP 2743724 B1.
  • the TOF distance sensor comprises an electronic device for generating a modulation signal and for generating four correlation signals which are phase-shifted with respect to one another and have the same period length as the modulation signal; a radiation source for emitting radiation modulated with the modulation signal; a receiving device which is in a predetermined spatial relationship to the radiation source for receiving radiation reflected from the object; a correlation device for correlating the received radiation or a corresponding quantity with one of the four correlation signals to form four corresponding correlation values; a difference-forming device for forming two differential correlation values from the difference between two of the correlation values in each case; a calculation device which is designed to calculate the distance in a predetermined linear dependence on the two differential correlation values.
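The four-phase correlation scheme of EP 2743724 B1 described above can be sketched in code. This is an illustrative reconstruction of the standard indirect time-of-flight calculation, not the literal implementation from that patent; the function name, the example modulation frequency and the assumption of 90°-spaced correlation values named DCS0 to DCS3 (matching the phase-image naming used later for FIG. 7) are ours:

```python
import math

C = 299_792_458.0  # speed of light in m/s

def itof_distance(dcs0, dcs1, dcs2, dcs3, f_mod):
    """Estimate a distance from four correlation values (phase images)
    sampled at 0, 90, 180 and 270 degrees relative to the modulation.

    The two differential correlation values cancel common-mode offsets
    such as background light; atan2 recovers the phase shift, which
    maps to distance over the round trip (hence 4*pi, not 2*pi, in the
    denominator)."""
    phi = math.atan2(dcs1 - dcs3, dcs0 - dcs2) % (2 * math.pi)
    return C * phi / (4 * math.pi * f_mod)
```

With a modulation frequency of, say, 10 MHz the unambiguous range is c/(2·f) of roughly 15 m; larger distances alias back into this interval.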
  • the invention is based on the object of designing a method, a detection device and a vehicle of the type mentioned at the outset, in which a determination of distance variables in scenes with objects that are reflective to different degrees with respect to the scanning signals can be improved.
  • according to the invention, optical crosstalk between pixels of the receiver matrix is corrected as follows: in at least one short measurement phase, echo signals are received with at least some of the pixels using at least one short integration period and converted into electrical reception signals; for at least one pixel in which at least one amplitude of at least one reception signal is greater than in at least one other pixel, at least one correction variable is determined by means of that reception signal; in at least one long measurement phase, echo signals are received with at least some of the pixels using at least one long integration period, which is longer than the at least one short integration period, and converted into electrical reception signals; and for at least some of the pixels, at least one distance variable, which characterizes a distance of at least one detected object, is determined by means of at least one reception signal from at least one long measurement phase and at least one correction variable from at least one short measurement phase.
  • echo signals are received with a short integration period in at least one short measurement phase.
  • At least one correction variable is determined for at least one pixel.
  • the at least one correction variable is used to correct the effects of optical crosstalk between the pixels in at least one long measurement phase with at least one long integration period.
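The interplay of the two measurement phases can be illustrated as follows. The text does not prescribe a particular arithmetic for applying the correction variable; this sketch assumes a simple model in which each "strong" pixel found in the short phase contributes crosstalk to the other pixels in proportion to its short-phase amplitude, scaled to the long integration period, with a caller-supplied spread weight. The threshold, the ratio and the spread model are illustrative assumptions, not taken from the text:

```python
def correct_crosstalk(long_signals, short_signals, threshold, ratio, spread):
    """Correct long-phase reception signals using the short phase.

    long_signals, short_signals: per-pixel amplitudes from the long and
    short measurement phases (a 1-D pixel row for simplicity).
    threshold: short-phase amplitude above which a pixel is considered
    'strong', i.e. a likely source of optical crosstalk.
    ratio: long/short integration-period ratio, used to scale the
    short-phase amplitude up to the long phase.
    spread: function mapping pixel distance to a crosstalk weight."""
    corrected = list(long_signals)
    for i, amp in enumerate(short_signals):
        if amp <= threshold:
            continue
        # correction variable derived from the strong pixel i
        correction = amp * ratio
        for j in range(len(corrected)):
            if j != i:
                corrected[j] -= correction * spread(abs(j - i))
    return corrected
```

In the terms of the claim language, `correction` plays the role of the correction variable determined in the short measurement phase, and the subtraction is one conceivable way of using it in the long measurement phase.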
  • the at least one correction variable can advantageously be determined for at least one pixel in which at least one amplitude of at least one received signal is greater than in at least one other pixel. In this way, the correction variable can be determined for pixels that encounter strong echo signals that can lead to optical crosstalk.
  • At least one correction variable is determined using the received signals from the identified pixels.
  • the optical crosstalk is reduced in the case of longer integration times, so that weaker reflecting objects can also be detected.
  • the respective distance variables can also be determined in scenes with both strongly reflecting objects and weakly reflecting objects.
  • scenes can contain both strongly reflective objects, such as traffic signs with retroreflective properties, and less reflective objects, such as pedestrians or obstacles such as walls or the like.
  • the signals of the distance measurement can be smeared across the receiver matrix due to internal reflections, in particular in the optical receiving path and/or due to optical crosstalk in the detection device.
  • the echo signals from the highly reflective objects effectively overwhelm the echo signals from the weaker reflective objects, thus becoming the dominant information in the rest of the image - even when the weaker reflective objects are at a different distance from the more reflective objects.
  • distances from strongly reflecting objects whose echo signals dominate are incorrectly determined for all other objects as well. In a corresponding distance image, weaker reflecting objects are displayed as if they were all at the same distance.
  • At least distance variables which characterize distances from objects, can be determined with the method and the detection device.
  • further information about a monitoring area, in particular about objects, in particular directions and/or speeds of objects relative to the detection device and/or a vehicle with at least one detection device, can be determined with the method and the detection device.
  • the at least one detection device can work according to an indirect signal propagation time method.
  • optical detection devices working according to a signal transit time method can be designed and designated as time-of-flight (TOF) systems, light detection and ranging (LiDAR) systems, laser detection and ranging (LaDAR) systems, radar systems or the like.
  • a phase shift of the received signal relative to the transmitted signal, caused by the propagation time of the scanning signal and the corresponding echo signal, can be determined. From the phase shift, the distance of the object from which the corresponding scanning signal was reflected can be determined.
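The phase-to-distance relation just described can be written out as a small helper. This is a minimal sketch assuming sinusoidal amplitude modulation at a single frequency; the function names and the factor-of-two round-trip handling are ours:

```python
import math

C = 299_792_458.0  # speed of light in m/s

def distance_from_phase(phase_shift_rad, f_mod_hz):
    """Convert the measured phase shift between transmitted and received
    signal into a distance. The light travels to the object and back,
    hence the division by 2 for the one-way distance."""
    t_flight = phase_shift_rad / (2 * math.pi * f_mod_hz)  # propagation time
    return C * t_flight / 2

def unambiguous_range(f_mod_hz):
    """Largest distance distinguishable before the phase wraps past 2*pi."""
    return C / (2 * f_mod_hz)
```

A phase shift of π at a given modulation frequency thus corresponds to exactly half the unambiguous range.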
  • optical scanning signals, in particular light signals, in particular laser signals, can be used as electromagnetic scanning signals.
  • Objects can be detected without contact using electromagnetic scanning signals, in particular light signals.
  • the detection device can be an optical detection device.
  • the detection device can advantageously be designed as a laser-based distance measuring system.
  • a laser-based distance measuring system can have at least one laser, in particular a diode laser, as the light source of a transmission device.
  • pulsed light scanning signals can be sent with the at least one laser.
  • the laser can be used to emit scanning signals in wavelength ranges that are visible or invisible to the human eye.
  • at least one receiver matrix can be implemented with at least one detector designed for the wavelength of the emitted light, in particular a CCD sensor, an active pixel sensor, in particular a CMOS sensor or the like.
  • the laser-based distance measuring system can advantageously be a laser scanner.
  • a monitoring area can be scanned with a laser scanner, in particular with a pulsed scanning signal.
  • the invention can advantageously be used in vehicles, in particular motor vehicles.
  • the invention can advantageously be used in land vehicles, in particular passenger cars, trucks, buses, motorcycles or the like, aircraft, in particular drones, and/or water vehicles.
  • the invention can also be used in vehicles that can be operated autonomously or at least partially autonomously.
  • the invention is not limited to vehicles. It can also be used in stationary operation, in robotics and/or in machines, in particular construction or transport machines such as cranes, excavators or the like.
  • the detection device can advantageously be connected to at least one electronic control device of a vehicle or a machine, in particular a driver assistance system and/or a chassis control and/or a driver information device and/or a parking assistance system and/or a gesture recognition system or the like, or be part of such a system. In this way, at least some of the functions of the vehicle or machine can be operated autonomously or partially autonomously.
  • the detection device can be used to detect stationary or moving objects, in particular vehicles, people, animals, plants, debris, bumps in the road, in particular potholes or stones, road boundaries, traffic signs, open spaces, in particular parking spaces, precipitation or the like, and/or movements and/or gestures.
  • At least one correction variable for at least one pixel can be determined from at least one distance variable, which characterizes a distance of at least one detected object, and at least one correction parameter, with at least one correction parameter being determined in advance before operation of the detection device and/or at least one correction parameter being calculated from variables that are determined during operation of the detection device. In this way, optical crosstalk can be corrected even more precisely.
  • the method can be accelerated overall by using correction parameters determined in advance.
  • the correction parameters can be determined in advance, in particular as part of a calibration of the detection device, in particular at the end of a production line, and stored in corresponding storage means of the detection device.
  • the calculation of correction parameters from variables that are determined during operation of the detection device enables a more individual adjustment of the correction parameters and thus an increase in the accuracy of the distance determination.
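Determining a correction variable from a distance variable and a pre-determined correction parameter, e.g. a calibration table recorded at the end of the production line (compare the correction factor Ci over distance DR shown later in FIG. 8), could look like the following sketch; the table layout, the example values and the choice of linear interpolation are our assumptions:

```python
import bisect

def correction_from_calibration(distance, cal_distances, cal_factors):
    """Look up a correction parameter for a given object distance by
    linear interpolation in a calibration table.

    cal_distances: sorted distances at which calibration was recorded.
    cal_factors: correction factors measured at those distances.
    Outside the table, the nearest edge value is used."""
    if distance <= cal_distances[0]:
        return cal_factors[0]
    if distance >= cal_distances[-1]:
        return cal_factors[-1]
    i = bisect.bisect_right(cal_distances, distance)
    d0, d1 = cal_distances[i - 1], cal_distances[i]
    f0, f1 = cal_factors[i - 1], cal_factors[i]
    return f0 + (f1 - f0) * (distance - d0) / (d1 - d0)
```

Such a table would be stored in the storage means of the detection device and consulted at measurement time with the distance variable determined for the strong pixel.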
  • At least one short measurement phase can be carried out before at least one long measurement phase and/or at least one short measurement phase can be carried out after at least one long measurement phase. Overall, the flexibility of the measurements can be improved in this way.
  • At least one short measurement phase can be carried out before at least one long measurement phase.
  • the variables determined as part of the at least one short measurement phase are available more quickly for the at least one long measurement phase.
  • At least one short measurement phase can advantageously be carried out after at least one long measurement phase. In this way, it can already be determined from the results of the at least one long measurement phase whether a scene is present in which there is optical crosstalk between pixels. If this is not the case, the short measurement phase can be omitted. In this way, the measurement can be adjusted as required.
  • At least one short integration period can be set in such a way that optical crosstalk in the optical receiver matrix is minimized. In this way, pixels that receive echo signals from highly reflective objects can be localized more precisely. Furthermore, the correspondingly strong echo signals can be detected more precisely.
  • At least one long integration period can be set longer than at least one short integration period by a factor of approximately 10 to 10,000.
  • at least one short integration period can have a length on the order of microseconds, in particular approximately between 0.5 µs and 2 µs. In this way it can be avoided that echo signals from strongly reflecting objects lead to saturation and/or to optical crosstalk in neighboring pixels.
  • At least one long integration period can have a length of the order of 1000 µs, in particular between approximately 500 µs and 10000 µs. In this way, echo signals from weakly reflecting objects can also be detected.
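Why the two integration periods complement each other can be seen with a toy saturating pixel model; all numbers here are hypothetical and only chosen to match the orders of magnitude above:

```python
FULL_WELL = 1.0  # normalized pixel saturation level (assumed)

def integrate(irradiance, t_int):
    """Idealized pixel response: collected charge grows linearly with
    the integration period until the pixel saturates (clips)."""
    return min(irradiance * t_int, FULL_WELL)

# Hypothetical relative irradiances: a retroreflector return vs. a
# weakly reflecting object, with a 1 us short phase and 1000 us long phase.
strong, weak = 1_000.0, 1.0
t_short, t_long = 1e-6, 1e-3

short_strong = integrate(strong, t_short)  # measurable, not saturated
long_strong = integrate(strong, t_long)    # saturated: crosstalk risk
long_weak = integrate(weak, t_long)        # weak object becomes visible
```

The short phase keeps the retroreflector return in the linear range (so its amplitude can serve as a correction variable), while only the long phase lifts the weak return above the noise.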
  • At least one electromagnetic scanning signal can be generated on the basis of at least one electrical transmission signal.
  • the at least one electrical transmission signal can be generated with corresponding electrical signal generation means.
  • At least one electrical transmission signal can be used to control at least one corresponding electro-optical signal source, in particular a light source, in particular a laser or the like, for emitting optical scanning signals, in particular light signals, in particular laser pulses or the like.
  • At least one modulated electromagnetic scanning signal can be generated from at least one modulated electrical transmission signal.
  • An indirect propagation time determination can be carried out with modulated transmission signals and scanning signals and corresponding modulated echo signals and reception signals.
  • phase shifts between the modulated transmission signals and corresponding reception envelope curves of the reception signals can be determined as distance variables.
  • the phase shifts characterize the respective signal propagation time between the transmission of at least one scanning signal and the reception of the corresponding echo signal.
  • the distance of a reflecting object can be determined from the signal propagation time.
  • At least one transmission signal, and thus at least one scanning signal, can be amplitude-modulated over at least one modulation period.
  • transmission signals can be efficiently defined on the transmitter side.
  • reception envelope curves can be efficiently characterized on the receiver side using the reception variables determined with the pixels.
  • the transmission signals and the reception envelope can be compared directly with one another.
  • At least one scanning signal is generated from at least one transmission signal.
  • the transmission signals are modulated, in particular amplitude modulated.
  • the transmission signals have a modulation period within which the at least one electrical transmission signal is modulated.
  • a modulation period can be specified as a time interval or on the basis of a circle function, in particular as 360° or 2π.
  • the reception envelope is the envelope of the received signals that can be formed from the received echo signal.
  • At least one modulation period of at least one transmission signal can have a period of the order of magnitude of approximately 10 ms to 100 ms, in particular between 40 ms and 50 ms. With such periods, distances from objects in a few tens of meters to a few hundred meters can be detected.
  • At least one signal section of at least one electromagnetic echo signal of at least one scanning signal reflected on at least one object can be detected in at least one defined recording time range with at least one pixel and converted into a corresponding electrical received signal.
  • at least one support point can be defined for a course of a reception envelope.
  • a recording time range can advantageously be defined by a defined starting point, an end point and/or a duration.
  • At least one duration of at least one recording time range can be defined by the integration period during which a corresponding pixel is activated in order to record the incident optical energy of an echo signal and convert it into an electrical reception signal.
  • at least one recording time range can be related to at least one characteristic point of at least one electrical transmission signal and/or at least one scanning signal. In this way, the at least one acquisition time range can be assigned more simply and/or unambiguously.
  • At least one characteristic point of at least one transmission signal and/or at least one scanning signal, to which at least one recording time range can be related, can be a maximum, a minimum, a turning point, a zero crossing, an edge of the at least one transmission signal and/or of the at least one scanning signal, or the like. In this way, the characteristic point can be determined more precisely.
  • respective signal sections of the at least one echo signal can be recorded as electrical reception variables .
  • the reception variables detected with the respective pixels can define respective support points with which a profile of a reception envelope can be approximated.
  • the corresponding signal section of the received echo signal can be recorded in the at least two recording time ranges.
  • a modulation period sequence comprises at least one modulation period of the at least one electrical transmission signal.
  • At least one modulation period sequence can advantageously include a plurality, in particular approximately 1000 or more, modulation periods.
  • the electrical reception variables can be determined in the same modulation period sequence in the same way, in particular with the same activation of the reception areas.
  • At least one defined recording time range can advantageously be specified, which is shorter than a modulation period of the at least one electrical transmission signal.
  • a phase shift of the reception envelope compared to the transmission signal can be determined from the recorded signal sections.
  • the phase shift characterizes the signal propagation time between the transmission of the scanning signal and the reception of the echo signal.
  • the distance of a reflecting object can be determined from the signal propagation time. The phase shift can thus be used as at least one distance quantity.
  • the time interval between at least two recording time ranges can be smaller than the duration of a modulation period of the at least one electrical transmission signal.
  • two support points for at least one reception envelope can be realized within a modulation period.
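Given a handful of support points on the reception envelope within one modulation period, the phase can be recovered generically, for example with a least-squares sinusoid fit. This is one possible technique of our own choosing, not one prescribed by the text; A·cos(wt − φ) + B is rewritten linearly in (A·cosφ, A·sinφ, B) and the 3×3 normal equations are solved by Cramer's rule:

```python
import math

def _det3(m):
    """Determinant of a 3x3 matrix given as nested lists."""
    return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
          - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
          + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

def phase_from_support_points(times, values, f_mod):
    """Least-squares fit of A*cos(w*t - phi) + B to the support points;
    returns phi in [0, 2*pi). Needs at least three support points."""
    w = 2 * math.pi * f_mod
    rows = [(math.cos(w * t), math.sin(w * t), 1.0) for t in times]
    # normal equations M x = b for x = (A*cos(phi), A*sin(phi), B)
    M = [[sum(r[i] * r[j] for r in rows) for j in range(3)] for i in range(3)]
    b = [sum(r[i] * v for r, v in zip(rows, values)) for i in range(3)]
    det = _det3(M)
    x = []
    for col in range(3):  # Cramer's rule, column by column
        Mc = [row[:] for row in M]
        for r in range(3):
            Mc[r][col] = b[r]
        x.append(_det3(Mc) / det)
    return math.atan2(x[1], x[0]) % (2 * math.pi)
```

With exactly four support points spaced a quarter period apart, this reduces to the familiar atan2 of the two differential values.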
  • the object is achieved according to the invention with the detection device in that the detection device has means for carrying out the method according to the invention.
  • the means for carrying out the method according to the invention can be implemented partially or entirely by means of at least one control and evaluation device, at least one transmitting device and/or at least one receiving device.
  • the detection device can have at least one control and evaluation device.
  • the functions of the detection device can be controlled with the control and evaluation device.
  • the control and evaluation device can be used to evaluate received signals which are detected with the detection device.
  • information from the received signals determined with the control and evaluation device can be transmitted to other devices, in particular a driver assistance system.
  • the detection device can have means for correcting optical crosstalk between pixels of the receiver matrix. Scenes with objects that reflect strongly differently can also be detected more precisely with the detection device.
  • the object is achieved according to the invention in the vehicle in that the vehicle has at least one detection device with means for carrying out the method according to the invention. In this way, distances of objects relative to the vehicle can be determined that have greatly differing reflectivities with respect to the scanning signals.
  • the vehicle can advantageously have at least one driver assistance system.
  • the vehicle can be operated autonomously or at least partially autonomously.
  • At least one detection device can be connected in terms of signals to at least one driver assistance system of the vehicle.
  • information about the monitoring area in particular distance variables and/or directional variables, which can be determined with the at least one detection device, can be transmitted to the at least one driver assistance system.
  • with the at least one driver assistance system, the vehicle can be operated autonomously or at least partially autonomously, taking into account the information about the monitoring area.
  • FIG. 1 shows a front view of a vehicle with a driver assistance system and a LiDAR system for determining distances of objects from the vehicle;
  • FIG. 2 shows a functional representation of the vehicle with the driver assistance system and the LiDAR system from FIG. 1;
  • FIG. 3 shows a front view of a reception matrix of a reception device of the LiDAR system from FIGS. 1 and 2, the reception matrix having a multiplicity of linear reception areas, each of which consists of a multiplicity of pixels;
  • FIG. 4 shows a signal strength-time diagram with reception quantities which are determined from an electromagnetic echo signal of a reflected electromagnetic scanning signal of the LiDAR system from FIGS. 1 and 2;
  • FIG. 5 shows a signal strength-time diagram of an electromagnetic scanning signal of the LiDAR system from FIGS. 1 and 2;
  • FIG. 6 shows signal strength-time diagrams of an electromagnetic echo signal (top), which can be received with the receiving device of the LiDAR system from FIGS. 1 and 2, and of a first and a second shutter signal (middle and bottom) for determining a reception variable of the electromagnetic echo signal;
  • FIG. 7 shows an amplitude-time diagram showing the composition of the respective four phase images DCS0 to DCS3, in which echo signals from retroreflective objects lead to optical crosstalk between pixels of the receiving matrix from FIG. 3;
  • FIG. 8 shows a diagram of a relationship between a correction factor Ci and a distance DR of a retroreflective object target, the correction factor Ci being used to correct optical crosstalk between the pixels in a correction method according to a first embodiment;
  • FIG. 9 shows a diagram of a relationship between phase shifts and test factors used in a correction method for correcting optical crosstalk between pixels according to a second embodiment;
  • FIG. 10 shows a temporal control scheme for some reception areas of the reception matrix from FIG. 3, according to which the reception areas are controlled alternately with a short integration period and a long integration period.
  • FIG. 1 shows a front view of a vehicle 10 by way of example in the form of a passenger car.
  • Figure 2 shows a functional representation of part of the vehicle 10.
  • the vehicle 10 has a detection device, for example in the form of a LiDAR system 12.
  • the LiDAR system 12 is arranged in the front bumper of the vehicle 10, for example. With the LiDAR system 12, a monitoring area 14 in the direction of travel 16 in front of the vehicle 10 can be monitored for objects 18, or 18T and 18R.
  • the LiDAR system 12 can also be arranged elsewhere on the vehicle 10 and oriented differently.
  • the LiDAR system 12 can be used to determine object information, for example distances D, or DT and DR, directions and speeds of objects 18 relative to the vehicle 10 or to the LiDAR system 12 .
  • the objects 18 can be stationary or moving objects, for example other vehicles, people, animals, plants, obstacles, bumps in the road, for example potholes or stones, road boundaries, traffic signs, open spaces, for example parking spaces, precipitation or the like.
  • a retroreflective object 18R, for example in the form of a traffic sign, road marking or the like, is shown in Figure 2 at a distance DR, and a less reflective object 18T, such as a pedestrian, a wall or the like, is indicated at a distance DT.
  • each object 18 is equated with a single object target.
  • An object target is a location on an object 18 at which electromagnetic scanning signals 20 transmitted from the LiDAR system 12 into the monitoring area 14 can be reflected.
  • Each object 18 usually has a number of such object targets.
  • the LiDAR system 12 is connected to a driver assistance system 22 .
  • the vehicle 10 can be operated autonomously or partially autonomously with the driver assistance system 22 .
  • the LiDAR system 12 includes, for example, a transmitting device 24, a receiving device 26 and a control and evaluation device 28.
  • control and evaluation device 28 can be implemented centrally or decentrally. Parts of the functions of the control and evaluation device 28 can also be integrated into the transmitting device 24 and/or the receiving device 26 .
  • the control and evaluation device 28 can be used to generate electrical transmission signals 30, such as the square-wave signal indicated by dashed lines in FIG. 5.
  • the transmission device 24 can be controlled with the electrical transmission signals 30, so that it transmits corresponding electromagnetic, for example optical, scanning signals 20 in the form of light signals, for example light pulses, as shown by way of example in Figure 5 in the form of square-wave signals, into the monitoring area 14 .
  • FIG. 5 shows only one modulation period MP of the corresponding scanning signal 20 in a signal strength versus time diagram.
  • the time profile of the corresponding electrical transmission signal 30 is shown in FIG. 5 only for comparison purposes, the unit of the strength of the electrical transmission signal 30 differing from the signal strength Ps of the electromagnetic scanning signal 20.
  • the transmitting device 24 can have, for example, one or more lasers as a light source.
  • the transmission device 24 can optionally have a scanning signal deflection device with which the electromagnetic scanning signals 20 can be correspondingly directed into the monitoring area 14 .
  • the electromagnetic scanning signals 20 reflected on an object 18 in the direction of the receiving device 26 as electromagnetic echo signals 34 can be received with the receiving device 26 .
  • An echo signal 34 is shown as an example, which belongs to the scanning signal 20 from FIG. 5. Like the corresponding scanning signal 20, the echo signal 34 is a square-wave signal.
  • the receiving device 26 can optionally have an echo signal deflection device, with which the electromagnetic echo signals 34 are directed to a receiving matrix 36 of the receiving device 26 shown in the front view in FIG.
  • the reception matrix 36 is implemented, for example, with an area sensor in the form of a CCD sensor with a large number of pixels 38 .
  • the components of the electromagnetic echo signal 34 which are incident in each case can be converted into corresponding electrical reception signals.
  • Each pixel 38 can be activated via suitable shutter means for the detection of electromagnetic echo signals 34 during defined recording time ranges TB.
  • different recording time ranges TB can be provided with different indices, for example i, in the following, ie they can be referred to as recording time range TBi.
  • The pixels 38 can each be activated in four recording time ranges TBi, namely TB0, TB1, TB2 and TB3, which are labeled by way of example in FIG. 4, for detecting echo signals 34.
  • Each recording time range TBi is defined by a start time and an integration period tINT.
  • The integration periods tINT of the recording time ranges TBi are significantly shorter than the period duration tMOD of the modulation period MP of the transmission signal 30 and of the electromagnetic scanning signal 20.
  • The time intervals between two defined recording time ranges TBi are shorter than the period duration tMOD of the modulation period MP.
  • Several consecutive modulation periods MP are provided below, for better distinction, with a respective index, for example k, i.e. they are referred to as modulation period MPk.
  • portions of echo signals 34 striking the respective pixel 38 can be converted into corresponding received electrical signals.
  • DCSi: respective phase images (DCS: Differential Correlation Sample)
  • Ai: amplitudes of the respective phase images
  • The phase images DCS0, DCS1, DCS2 and DCS3 and their amplitudes A0, A1, A2 and A3 characterize the respective amount of light that is collected during the recording time ranges TB0, TB1, TB2 and TB3 with the correspondingly activated pixels 38 of the reception matrix 36.
  • each pixel 38 can be activated and read out individually.
  • The shutter means can be implemented in software and/or hardware. Such shutter means can be realized as a so-called “shutter”.
  • the pixels 38 can be driven with corresponding periodic recording control signals in the form of shutter signals 56-1 and 56-2.
  • The shutter signals 56-1 and 56-2, with which the respective receiver pixels 38 are controlled in order to determine the reception variable DCS0, are shown in FIG. 6 in the middle and at the bottom.
  • the shutter signals 56-1 and 56-2 are square-wave signals with the same period as the transmission signals 30, the scanning signals 20 and the echo signals 34.
  • The shutter signals 56-1 and 56-2 are triggered via the electrical transmission signals 30 or together with them.
  • The received electrical signals can thus be related to the electrical transmission signals 30.
  • the electrical transmission signals 30 can be triggered at a starting time ST, which is indicated in FIG.
  • the receiver pixels 38 are triggered with the shutter signals 56-1 and 56-2, which are offset in time accordingly.
  • the pixels 38 are arranged areally in, for example, more than 100 receiver areas EBi in the form of lines, each with, for example, more than 100 pixels 38 .
  • The pixels 38 of a receiver area EBi are activated simultaneously in a modulation period MPk in the same recording time range TBi.
  • the pixels 38 of adjacent receiver areas EBi can be activated in a modulation period MPk, for example, in different or identical recording time ranges TBi.
  • The receiving device 26 can optionally have optical elements, for example refractive, diffractive and/or reflective elements or the like, with which electromagnetic echo signals 34 coming from the monitoring area 14 are mapped onto respective pixels 38, viewed in the direction of the receiver areas EBi, depending on the direction from which they come.
  • The direction of an object 18 on which the scanning signal 20 is reflected can thus be determined from the position of the illuminated pixels 38 within a receiver area EBi. Viewed in the direction perpendicular to the receiver areas EBi, the echo signals 34 are imaged as uniformly as possible onto the pixels 38 in the same column of all receiver areas EBi.
  • FIG. 4 shows a modulation period MP of a reception envelope curve 42 of the reception variables DCS0, DCS1, DCS2 and DCS3 in a common signal strength/time diagram.
  • the reception envelope curve 42 is offset in time with respect to the start time ST.
  • The time offset in the form of a phase difference Φ characterizes the flight time between the transmission of the electromagnetic scanning signal 20 and the reception of the corresponding electromagnetic echo signal 34.
  • The distance D of the reflecting object 18 can be determined from the phase difference Φ.
  • The phase shift Φ can therefore be used as a distance variable for the distance D.
  • the time of flight is known to be proportional to the distance D of the object 18 relative to the LiDAR system 12.
  • The period duration tMOD of the transmission signals 30 and the scanning signals 20 specifies the maximum distance that can still be unambiguously detected with the LiDAR system 12.
  • the period tMOD is greater than the flight time of the scanning signal 20 and the echo signal 34 in the case of reflections from objects 18 at the maximum distance of interest.
  • the measurement duration of a measurement corresponds to the period duration tMOD.
  • By way of example, the period duration tMOD can be of the order of about 40 ms to 50 ms.
  • Distance measurements can be carried out continuously within the unambiguity range. Distances outside the maximum distance, which are not within the unambiguous range, can also be recorded by appropriate data processing, which is of no further interest here.
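The proportionality between flight time and distance, and the unambiguity limit set by the modulation period, can be sketched as follows (a minimal illustration of the round-trip relation; the function names are ours, not from the patent):

```python
# Speed of light in m/s.
C = 299_792_458.0

def distance_from_tof(t_flight_s):
    """Distance to the object from the measured round-trip flight time.

    The scanning signal travels to the object and back, hence the
    factor 1/2.
    """
    return C * t_flight_s / 2.0

def unambiguous_range(t_mod_s):
    """Maximum distance that maps to a unique flight time within one
    modulation period tMOD; echoes arriving later alias into the next
    period."""
    return C * t_mod_s / 2.0
```

For instance, a round-trip flight time of 2 µs corresponds to a distance of roughly 300 m.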
  • The reception envelope curve 42 can be approximated by four interpolation points in the form of the four phase images DCS0, DCS1, DCS2 and DCS3. Alternatively, the reception envelope curve 42 can also be approximated by more or fewer interpolation points in the form of phase images.
  • The recording time ranges TB0, TB1, TB2 and TB3 are each started in relation to a start event, for example in the form of a trigger signal for the electrical transmission signal 30.
  • The modulation period MPk of the transmission signal 30, and thus of the scanning signal 20, extends over 360°.
  • The recording time ranges TB0, TB1, TB2 and TB3 each start offset by 90° relative to one another within the modulation period MP.
  • A distance D of a detected object 18 can be calculated, for example, from the amplitudes A0, A1, A2 and A3 of the phase images DCS0, DCS1, DCS2 and DCS3 for a respective pixel 38 as follows:
  • c is the speed of light and fs is the modulation frequency of the transmission signal 30.
  • The term formed from the amplitudes A0, A1, A2 and A3 of the phase images DCS0, DCS1, DCS2 and DCS3 represents the phase shift Φ.
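The patent's equation itself is not reproduced in this excerpt. A common four-phase time-of-flight formulation consistent with the description — the distance obtained from an arctangent term over the four amplitudes — can be sketched as follows (the exact sign and index convention of the patent's formula may differ):

```python
import math

# Speed of light in m/s.
C = 299_792_458.0

def phase_from_amplitudes(a0, a1, a2, a3):
    """Phase shift (radians) from the amplitudes A0..A3 of the four
    phase images DCS0..DCS3, using the common four-phase convention
    phi = atan2(A3 - A1, A0 - A2)."""
    return math.atan2(a3 - a1, a0 - a2) % (2.0 * math.pi)

def distance_from_phase(phi, f_s):
    """Distance from the phase shift: D = c * phi / (4 * pi * f_s),
    with f_s the modulation frequency of the transmission signal 30."""
    return C * phi / (4.0 * math.pi * f_s)
```

With amplitudes sampled 90° apart on a cosine envelope, the arctangent term recovers the phase of the envelope, and the distance scales linearly with that phase within the unambiguity range.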
  • The echo signal 34 from the retroreflective object 18R can lead to optical crosstalk between the pixels 38 that are directly hit by the echo signal 34 of the retroreflective object 18R and the neighboring pixels 38 that are hit, for example, by the echo signals 34 of the object 18T.
  • In addition, the measurement for the retroreflective object 18R itself can be overdriven. Consequently, the distances D of the objects 18 cannot be determined, or cannot be determined correctly.
  • the signal from the distance measurement can smear over large parts of the reception matrix 36.
  • the echo signal 34 reflected from the retroreflective object 18R can effectively overlay the echo signals 34 from the other object 18T in the reception matrix 36 and form the dominant information in the rest of the image.
  • the distance DR of the retroreflective object 18R is thus incorrectly assumed for the object 18T.
  • FIG. 7 shows an amplitude/time diagram for the respective amplitudes A0, A1, A2 and A3 of four phase images DCS0, DCS1, DCS2 and DCS3 in an exemplary distance measurement with an exemplary pixel 38, which is referred to below, for better differentiation, as the target pixel 38 for the object of interest 18T.
  • The distance measurement took place in a measurement phase MP in the scene with the object 18T and the retroreflective object 18R shown as an example in FIG. 2.
  • the target pixel 38 is directly illuminated by the echo signal 34 coming from the object 18T.
  • Optical crosstalk from pixels 38 which are illuminated with the strong echo signal 34 coming from the retroreflective object 18R affects the phase images DCS0, DCS1, DCS2 and DCS3 acquired with the target pixel 38 and their amplitudes A0, A1, A2 and A3.
  • Each amplitude Ak, where k is the index of the corresponding phase image DCSk detected with the target pixel 38, is made up of a background noise N (FIG. 7, bottom), an amplitude component AAT,k (FIG. 7, middle) which originates from the echo signal 34 of the object 18T, and an amplitude component AAR,k (FIG. 7, top) which originates from the optical crosstalk of the echo signal 34 of the retroreflective object 18R.
  • the background noise N is almost the same across all phase images DCSk.
  • The amplitude component AAT,k of the object 18T depends on the distance DT at which the object 18T is located and, accordingly, on the phase shift ΦT. The following applies:
  • AT,k is an amplitude parameter of the phase image DCSk for the object 18T
  • fMOD is the modulation frequency of the transmission signal 30
  • tINT is the integration period for the measurement.
  • The amplitude component AAR,k from the crosstalk effect of the retroreflective object 18R on the target pixel 38 is dependent on the distance DR at which the retroreflective object 18R is located and, correspondingly, on the phase shift ΦR. The following applies:
  • Ai,R,k are individual amplitude parameters of the phase image DCSk for the pixels 38 which are illuminated by the echo signal 34 of the retroreflective object 18R and from which optical crosstalk to neighboring pixels 38 emanates.
  • the run parameter i designates the pixels 38, other than the target pixel 38, from which optical crosstalk originates.
  • The corresponding distance DT of the object 18T can be determined from:
  • the background noise N can easily be determined, for example by means of calibration measurements.
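Since the background noise N is nearly constant across the phase images, a calibration estimate can be as simple as averaging readings taken without an active scanning signal. A minimal sketch (the function name and data layout are ours, not from the patent):

```python
def estimate_background_noise(dark_amplitudes):
    """Estimate the background noise N for a pixel from amplitude
    readings taken during a calibration measurement without an active
    scanning signal. N is assumed constant across the phase images."""
    if not dark_amplitudes:
        raise ValueError("need at least one calibration reading")
    return sum(dark_amplitudes) / len(dark_amplitudes)
```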
  • The amplitude component AAR,k from the optical crosstalk effect can be approximated using a correction term KT as follows:
  • At least one distance measurement (short distance measurement) is performed in a short measurement phase in the form of a measurement period MPK with a short integration period tINT,K, and at least one distance measurement (long distance measurement) is performed in a long measurement phase in the form of a measurement period MPL with a long integration period tINT,L.
  • The distance measurements can be carried out, for example, according to a control scheme that is explained further below in connection with FIG. 10.
  • The length of the short integration period tINT,K is chosen such that even the strong echo signals 34 from the retroreflective object 18R do not lead to overdriving in the reception matrix 36.
  • By way of example, the short integration period tINT,K can be about 1 µs.
  • The phase shift ΦR for the retroreflective object 18R, which correlates with the distance DR, is determined from the amplitudes A0, A1, A2 and A3 of the phase images DCS0, DCS1, DCS2 and DCS3 recorded during the short distance measurement.
  • The length of the long integration period tINT,L is chosen such that weaker echo signals 34 from the object 18T can also be detected.
  • By way of example, the long integration period tINT,L can be about 1000 µs.
  • The correction term KT is determined on the basis of the phase shift ΦR from the short distance measurement as follows:
  • Ci is a respective correction factor for the pixels 38 as well as target pixels 38.
  • The correction factor Ci can be specified or determined from measured values of distance measurements. The longer the integration period tINT,L, the greater the correction term KT. Furthermore, the correction term KT becomes all the greater, the greater the number of pixels 38 that are detected in the short distance measurement as having been hit by echo signals 34 of the retroreflective object 18R.
  • The background noise N and the correction term KT are subtracted from the recorded amplitudes Ak, namely A0, A1, A2 and A3, and the corresponding corrected amplitudes Ak,corr, namely A0,corr, A1,corr, A2,corr and A3,corr, are thus determined:
  • The distance DT of the object 18T is determined from the corrected amplitudes A0,corr, A1,corr, A2,corr and A3,corr analogously to equation G5:
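The correction pipeline — subtract the background noise N and a per-phase-image correction term from the recorded amplitudes, then recompute the phase — can be sketched as follows. The cosine model for the correction term is our assumption (the patent's equations G5 to G8 are not reproduced in this excerpt), and `c_sum` stands for the summed correction factors Ci of the crosstalking pixels:

```python
import math

def correction_term(c_sum, phi_r, k):
    """Hypothetical per-phase-image correction term KT_k: the summed
    correction factors Ci, modulated at the retroreflector phase phi_r
    obtained from the short distance measurement."""
    return c_sum * math.cos(phi_r + k * math.pi / 2.0)

def corrected_amplitudes(amps, noise, c_sum, phi_r):
    """A_k,corr = A_k - N - KT_k for the four phase images."""
    return [a - noise - correction_term(c_sum, phi_r, k)
            for k, a in enumerate(amps)]

def phase_from_corrected(amps_corr):
    """Phase of the remaining object signal, using the common
    four-phase convention phi = atan2(A3 - A1, A0 - A2)."""
    return math.atan2(amps_corr[3] - amps_corr[1],
                      amps_corr[0] - amps_corr[2]) % (2.0 * math.pi)
```

On synthetic amplitudes built from an object signal at ΦT plus noise and a crosstalk component at ΦR, the correction removes the crosstalk and the recovered phase equals ΦT.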
  • Two exemplary embodiments for methods for determining the correction term KT are explained below:
  • The correction factor Ci is specified, for example, on the basis of model considerations. It is assumed that the individual amplitude parameters Ai,R,k of the phase image DCSk for the pixels 38 that are illuminated by the echo signal 34 of the retroreflective object 18R depend on the distance DR of the retroreflective object 18R in accordance with equation G3. Correspondingly, the correction factor Ci is specified as a function of the distance DR. For example, there can be an exponential relationship between the correction factor Ci and the distance DR, as shown in FIG.
  • the correction factor Ci is determined from measured values of distance measurements.
  • Using equation G8, the respective test phase shifts ΦTest are calculated, for the pixels 38 whose measured phase shift lies close to the phase shift ΦR for the retroreflective object 18R, for different test factors TFi that are used instead of the correction factor Ci. With a good approximation, the test phase shifts ΦTest approach, at the appropriate test factor TFi, the phase shift ΦT which corresponds to the detected object 18T.
  • The relationship between the test phase shift ΦTest and the test factor TFi is shown in a diagram in FIG. 9. There, the test phase shifts ΦTest are shown for, by way of example, nine different test factors TFi,1 to TFi,9.
  • The pairs of values of the test phase shifts ΦTest for the last three test factors TFi,7 to TFi,9, on the right in FIG. 9, all lie approximately on a second plateau 48 in the region of the phase shift ΦR for the retroreflective object 18R plus 180°.
  • The pairs of values of the test phase shifts ΦTest for the middle test factors TFi,4 to TFi,6 lie approximately on a straight line 50, which extends between the pair of values of the test phase shift ΦTest for the test factor TFi,3 and the pair of values of the test phase shift ΦTest for the test factor TFi,7, i.e. between the first plateau 46 and the second plateau 48.
  • the correction factor Ci can be determined, for example, by a numerical method, an iteration method or the like.
  • If the phase shift ΦT is in the range of the phase shift ΦR anyway, no major change in the phase shift is to be expected.
  • In this way, the correct range of the signal of the object 18T lying beneath the interference signal of the retroreflective object 18R can be calculated out. This method works very well when the spillover from the interference signal is very strong, since the two plateaus 46 and 48 then differ greatly.
  • FIG. 10 shows a timing control scheme, for an example of four reception areas EBn to EBn+3, for measurement cycles each with four modulation periods MP1 to MP4, which can be used for the above-described method for correcting crosstalk.
  • FIG. 10 shows an example of a complete measurement cycle with the modulation periods MP1 to MP4 and a section of a further measurement cycle with the modulation periods MP1 to MP3.
  • The reception areas EBn to EBn+3 are activated with different recording time ranges TB0 to TB3.
  • The reception areas EBn to EBn+3 cover all four phase images DCS0 to DCS3 in the modulation period MP1.
  • the reception area EBn is activated with the recording time range TB0
  • the reception area EBn+1 is activated with the recording time range TB1
  • the reception area EBn+2 is activated with the recording time range TB2
  • the reception area EBn+3 is activated with the recording time range TB3.
  • Overall, each of the reception areas EBn to EBn+3 is controlled with all four recording time ranges TB0 to TB3.
  • The reception area EBn is activated in the four successive modulation periods MP1 to MP4 in succession with the recording time ranges TB0 to TB3.
  • The reception areas EBn to EBn+3 are driven with the short integration period tINT,K, for example with 1 µs.
  • a short distance measurement as described above is carried out in each case in the second modulation periods MP2.
  • the second modulation periods MP2 form short measurement phases and are additionally provided with the index “K” in FIG. 10 for better differentiation.
  • The reception areas EBn to EBn+3 are driven with the long integration period tINT,L, for example with 1000 µs.
  • The long distance measurements described above are carried out in the modulation periods MP1, MP3 and MP4.
  • The modulation periods MP1, MP3 and MP4 form the long measurement phases and are additionally provided with the index “L” in FIG. 10 for better differentiation.
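The activation pattern described above can be written down compactly. The assignment below is one consistent reading of FIG. 10 (the indexing helper is ours): in modulation period MPm, reception area EBn+j is driven with recording time range TB((j+m−1) mod 4), and MP2 is the short measurement phase:

```python
def recording_schedule(n_areas=4, n_periods=4, short_period=2):
    """Map (modulation period m, reception area offset j) to the
    recording time range index TBi and the measurement-phase type:
    "K" (short integration) in MP2, "L" (long integration) otherwise.

    In MP1 the four areas cover TB0..TB3, and each area cycles
    through all four recording time ranges over MP1..MP4.
    """
    schedule = {}
    for m in range(1, n_periods + 1):        # modulation periods MP1..MP4
        for j in range(n_areas):             # reception areas EBn..EBn+3
            schedule[(m, j)] = ((j + m - 1) % 4,
                                "K" if m == short_period else "L")
    return schedule
```

This reproduces both statements from the description: in MP1 the areas EBn to EBn+3 carry TB0 to TB3, and EBn alone runs through TB0 to TB3 over MP1 to MP4.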
  • The distance DR and/or the phase shift ΦR for the retroreflective object 18R can also be estimated instead of being determined using the short distance measurement. In this way, the repetition rate of distance measurements with the LiDAR system 12 can be increased.
  • different modulation frequencies fMOD can also be used for the transmission signal 30 in the short-distance measurement and the long-distance measurement.
  • The modulation frequencies fMOD used can be computed against each other. In this way, the short distance measurement can be extended in order to resolve ambiguity in the phase measurements.
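One common way to compute two modulation frequencies against each other is to pick, among the distance candidates allowed by the phase at one frequency, the candidate most consistent with the phase at the second frequency. This is a generic sketch of that idea under our own assumptions, not the patent's specific procedure:

```python
import math

# Speed of light in m/s.
C = 299_792_458.0

def resolve_distance(phi1, f1, phi2, f2, d_max):
    """Combine phase measurements at two modulation frequencies to
    extend the unambiguity range.

    Each phase phi_i determines the distance only modulo its
    unambiguity range c/(2*f_i); we scan the wrap counts of the first
    frequency and keep the candidate whose predicted phase at the
    second frequency matches best.
    """
    r1 = C / (2.0 * f1)
    r2 = C / (2.0 * f2)
    base = phi1 / (2.0 * math.pi) * r1
    best_d, best_err = None, float("inf")
    m = 0
    while base + m * r1 <= d_max:
        d = base + m * r1
        predicted_phi2 = (d % r2) / r2 * 2.0 * math.pi
        # Smallest angular difference between predicted and measured phase.
        err = abs((predicted_phi2 - phi2 + math.pi) % (2.0 * math.pi)
                  - math.pi)
        if err < best_err:
            best_d, best_err = d, err
        m += 1
    return best_d
```

For two sufficiently different frequencies, the combined unambiguity range grows well beyond that of either frequency alone.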

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Electromagnetism (AREA)
  • Optical Radar Systems And Details Thereof (AREA)

Abstract

The invention relates to a method for operating an optical detection device for determining at least distance variables which characterize distances of objects detected with the optical detection device. In the method, at least one electromagnetic scanning signal is generated and transmitted into a monitoring area. With at least some of the pixels of an optical reception matrix, at least one electromagnetic echo signal is detected and converted into electrical reception signals. In at least one short measurement phase (MPK), echo signals are received with at least some of the pixels with at least one short integration period (tINT,K) and converted into electrical reception signals. For at least one pixel, at least one correction variable is determined by means of the reception signal(s). In a long measurement phase (MPL), echo signals are received with at least some of the pixels with a long integration period (tINT,L) and converted into reception signals. For at least some of the pixels, at least one distance variable is determined by means of at least one reception signal from a long measurement phase (MPL) and at least one correction variable from a short measurement phase (MPK).
PCT/EP2022/067823 2021-07-06 2022-06-29 Procédé de correction de la diaphonie optique entre des pixels d'un dispositif de détection optique et dispositif de détection correspondant WO2023280647A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102021117361.2 2021-07-06
DE102021117361.2A DE102021117361A1 (de) 2021-07-06 2021-07-06 Verfahren zum Betreiben einer optischen Detektionsvorrichtung, Detektionsvorrichtung und Fahrzeug mit wenigstens einer Detektionsvorrichtung

Publications (1)

Publication Number Publication Date
WO2023280647A1 true WO2023280647A1 (fr) 2023-01-12

Family

ID=82608248

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2022/067823 WO2023280647A1 (fr) 2021-07-06 2022-06-29 Procédé de correction de la diaphonie optique entre des pixels d'un dispositif de détection optique et dispositif de détection correspondant

Country Status (2)

Country Link
DE (1) DE102021117361A1 (fr)
WO (1) WO2023280647A1 (fr)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2743724B1 (fr) 2012-12-12 2015-09-23 Espros Photonics AG Capteur de distance TOF et procédé de fonctionnement
US20200072946A1 (en) * 2018-08-29 2020-03-05 Sense Photonics, Inc. Glare mitigation in lidar applications

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102007046562A1 (de) 2007-09-28 2009-04-02 Siemens Ag Verfahren und Vorrichtung zum Bestimmen eines Abstands mittels eines optoelektronischen Bildsensors

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2743724B1 (fr) 2012-12-12 2015-09-23 Espros Photonics AG Capteur de distance TOF et procédé de fonctionnement
US20200072946A1 (en) * 2018-08-29 2020-03-05 Sense Photonics, Inc. Glare mitigation in lidar applications

Also Published As

Publication number Publication date
DE102021117361A1 (de) 2023-01-12

Similar Documents

Publication Publication Date Title
EP2240797B1 (fr) Dispositif de mesure de distance optoélectronique
EP2126607B1 (fr) Détecteur de champ destiné à détecter des objets et procédé d'utilisation de celui-ci
DE102010045657A1 (de) Umfeld-Überwachungssystem für ein Fahrzeug
DE19832800A1 (de) Hinderniserfassungssystem für ein Kraftfahrzeug
WO2019145078A1 (fr) Système de détection de distance, procédé pour un système de détection de distance et véhicule
EP0444402A2 (fr) Méthode et appareil pour indiquer aux automobilistes la limite de visibilité dans le brouillard
DE102016213007A1 (de) Verfahren und System zur Abtastung eines Objekts
EP1628141A1 (fr) Procède de triangulation ayant des diodes laser et une caméra pour determiner la distance pour les applications arret-demarrage des véhicules automobiles
DE102007022372A1 (de) Verfahren und Vorrichtung zur Ermittlung der Fahrzeugklasse von Fahrzeugen
DE102007030823A1 (de) Radargerät
DE102019216085A1 (de) Laserabstandsmessvorrichtung
EP1624278A1 (fr) Dispositif de mesure pour déterminer la distance de pièces d'une véhicule automotive
WO2009141019A1 (fr) Procédé et dispositif pour mesurer un obstacle
EP4139709A1 (fr) Procédé et dispositif d'identification de l'efflorescence dans une mesure lidar
DE102007032997A1 (de) Fahrerassistenzvorrichtung
EP3809157A1 (fr) Capteur optoélectronique de mesure de distance et procédé de détection d'un objet cible
EP3867666B1 (fr) Procédé de détection d'au moins des compositions de particules dans une zone de surveillance comprenant un dispositif de détection optique et dispositif de détection
WO2023280647A1 (fr) Procédé de correction de la diaphonie optique entre des pixels d'un dispositif de détection optique et dispositif de détection correspondant
WO2022112203A1 (fr) Procédé de fonctionnement d'un dispositif de détection pour déterminer des variables de distance avec réglage de température, dispositif de détection correspondant et véhicule comportant au moins un dispositif de détection de ce type
DE102020124017A1 (de) Verfahren zum Betreiben einer optischen Detektionsvorrichtung, optische Detektionsvorrichtung und Fahrzeug mit wenigstens einer optischen Detektionsvorrichtung
WO2022243089A1 (fr) Procédé d'opération de dispositif de détection, dispositif de détection et véhicule doté d'au moins un dispositif de détection
WO2023247395A1 (fr) Procédé de fonctionnement d'un système lidar à correction de lumière parasite, système lidar correspondant et véhicule
DE10149423B4 (de) Verfahren und Vorrichtung zur Messung von Entfernungen in optisch trüben Medien
DE102021129091A1 (de) Verfahren zum Betreiben einer Detektionsvorrichtung zur ortsaufgelösten Überwachung wenigstens eines Überwachungsbereichs, Detektionsvorrichtung, Fahrzeug mit wenigstens einer Detektionsvorrichtung
DE102022108021A1 (de) Verfahren und Vorrichtung zur Messung des Orts und der Geschwindigkeit mehrerer Messpunkte

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22743754

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE