WO2022058155A1 - Method for detecting objects and detection device - Google Patents

Method for detecting objects and detection device

Info

Publication number
WO2022058155A1
WO2022058155A1 (PCT/EP2021/073948, EP2021073948W)
Authority
WO
WIPO (PCT)
Prior art keywords
tber
signals
detection device
electromagnetic
transmission
Prior art date
Application number
PCT/EP2021/073948
Other languages
German (de)
English (en)
Inventor
Jan Christoph SIMON
Christoph Parl
Original Assignee
Valeo Schalter Und Sensoren Gmbh
Priority date
Filing date
Publication date
Application filed by Valeo Schalter Und Sensoren Gmbh filed Critical Valeo Schalter Und Sensoren Gmbh
Priority to EP21769725.9A priority Critical patent/EP4214538A1/fr
Publication of WO2022058155A1 publication Critical patent/WO2022058155A1/fr

Links

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/481 Constructional features, e.g. arrangements of optical elements
    • G01S7/4816 Constructional features, e.g. arrangements of optical elements of receivers alone
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88 Lidar systems specially adapted for specific applications
    • G01S17/93 Lidar systems specially adapted for specific applications for anti-collision purposes
    • G01S17/931 Lidar systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/481 Constructional features, e.g. arrangements of optical elements
    • G01S7/4817 Constructional features, e.g. arrangements of optical elements relating to scanning
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/483 Details of pulse systems
    • G01S7/486 Receivers
    • G01S7/4861 Circuits for detection, sampling, integration or read-out
    • G01S7/4863 Detector arrays, e.g. charge-transfer gates
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/483 Details of pulse systems
    • G01S7/486 Receivers
    • G01S7/4868 Controlling received signal intensity or exposure of sensor
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/483 Details of pulse systems
    • G01S7/486 Receivers
    • G01S7/489 Gain of receiver varied automatically during pulse-recurrence period

Definitions

  • The invention relates to a method for detecting objects in a monitoring area with a detection device by means of electromagnetic signals, in which, during at least one measurement, at least one electromagnetic transmission signal is sent into the monitoring area with at least one transmission device, at least one receiving element of at least one receiving device is activated for receiving electromagnetic signals, and information about the monitoring area is determined from the result of at least one such recording.
  • The invention further relates to a detection device for detecting objects in a monitoring area by means of electromagnetic signals, with at least one transmission device with which electromagnetic transmission signals can be transmitted into the monitoring area, with at least one receiving device with which electromagnetic reception signals originating from transmission signals reflected in the monitoring area can be received and converted into evaluation signals that can be processed with an evaluation device, and with at least one evaluation device with which information about the monitoring area can be determined on the basis of the received electromagnetic reception signals.
  • the invention relates to a vehicle with at least one detection device for detecting objects in a surveillance area by means of electromagnetic signals.
  • DE 10 2018 126 631 A1 discloses a method for determining a distance of an object using an optical detection device and an optical detection device.
  • a light signal pulse is emitted.
  • the light signal pulse reflected by the object is received as an echo light signal pulse.
  • the echo light signal pulse is divided into subsequent discrete reception time windows and converted into corresponding electrical energy.
  • the signal propagation time is used as a measure of the distance from the object to the detection device.
  • the invention is based on the object of designing a method and a detection device of the type mentioned at the outset, in which the detection of objects can be improved.
  • According to the invention, this object is achieved in the method in that at least two receiving elements are activated for a respective predetermined standby period for receiving electromagnetic reception signals which originate from the at least one electromagnetic transmission signal reflected in the monitoring area, wherein the respective standby periods for the at least two receiving elements are specified individually depending on the respective regions of the monitoring area to which the receiving elements are assigned.
  • the receiving elements are each activated for individual standby times.
  • the receiving elements are assigned to respective regions of the surveillance area so that they can receive electromagnetic signals coming from the corresponding region.
  • the standby times depend on the respective regions of the surveillance area to which the receiving elements are assigned.
  • the at least one receiving device can be individually adapted to the existing or expected operating conditions of the detection device, so that objects can be detected more efficiently.
  • the operating conditions can depend on the installation location of the detection device, in particular in or on a vehicle, and/or the orientation of the detection device.
  • distances to boundaries in the monitored area, in particular the ground, can be taken into account.
  • Vehicle parts that are located within the surveillance area and limit the detection range can be taken into account.
  • the information about the monitored area can advantageously be object information about objects in the monitored area, in particular distances, directions and/or speeds of objects relative to the detection device.
  • the information about the surveillance area can also contain the information that no object is detected.
  • The information about the monitored area can also include the information that a detection range is limited, in particular by visibility impairments such as fog, precipitation or the like.
  • The standby period of a receiving element can be the period during which the receiving element is ready to record received electromagnetic signals. If an electromagnetic received signal hits the corresponding receiving element during the standby period, the received signal can be recorded. If no electromagnetic received signal strikes the receiving element during the standby period, nothing is recorded apart from possible noise.
  • the recording of received signals can advantageously include receiving electromagnetic received signals and converting them into evaluation signals that can be processed with an evaluation device.
  • the evaluation signals can advantageously be electrical evaluation signals. Electrical evaluation signals can be processed with an electrical evaluation device.
  • the detection device can work according to a light transit time method, in particular a light pulse transit time method.
  • Optical detection devices working according to the light pulse propagation time method can be designed and referred to as time-of-flight (TOF), light detection and ranging systems (LiDAR), laser detection and ranging systems (LaDAR) or the like.
  • In a direct light propagation time method, the propagation time between the transmission of a transmission signal, in particular a light pulse, with a transmission device and the reception of the correspondingly reflected transmission signal, i.e. a received signal, is measured with a receiving device, and a distance between the detection device and the detected object is determined from it.
  • Alternatively, the detection device can work according to a so-called indirect time-of-flight method. A transit-time-related phase shift of the transmission signal or of the reflected transmission signal can be measured, the transit time derived from it, and the distance determined in this way.
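  • A minimal sketch of these two distance calculations (Python; the transit time, phase shift and modulation frequency below are purely illustrative values, not taken from the description):

```python
import math

C = 299_792_458.0  # speed of light in m/s


def distance_from_transit_time(transit_time_s: float) -> float:
    """Direct time-of-flight: the signal travels to the object and back,
    so the one-way distance is half the round trip."""
    return C * transit_time_s / 2.0


def distance_from_phase_shift(phase_rad: float, mod_freq_hz: float) -> float:
    """Indirect time-of-flight: derive the transit time from the
    transit-time-related phase shift of a modulated transmission signal."""
    transit_time_s = phase_rad / (2.0 * math.pi * mod_freq_hz)
    return distance_from_transit_time(transit_time_s)


# Illustrative values only (not from the patent):
print(distance_from_transit_time(267e-9))             # ~40 m for a ~267 ns round trip
print(distance_from_phase_shift(math.pi / 2, 10e6))   # ~3.75 m at 10 MHz modulation
```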
  • the detection device can advantageously be designed as a so-called flash system, in particular as a flash LiDAR.
  • Corresponding transmission signals can simultaneously illuminate a part of the monitored area or the entire monitored area.
  • the detection device can be designed as a scanning system.
  • A monitoring area can be sampled, i.e. scanned, with transmission signals.
  • the direction of propagation of the transmission signals can be swiveled over the surveillance area.
  • At least one deflection device, in particular a scanning device, a deflection mirror device or the like, can be used here.
  • the detection device can advantageously be designed as a laser-based distance measuring system.
  • the laser-based distance measuring system can have at least one laser, in particular a diode laser, as the light source of the at least one transmission device.
  • pulsed transmission beams can be transmitted as transmission signals with the at least one laser.
  • the laser can be used to emit transmission signals in wavelength ranges that are visible or invisible to the human eye.
  • at least one receiver of at least one receiving device can have a sensor designed for the wavelength of the emitted light, in particular a line sensor or area sensor, in particular an (avalanche) photodiode, a photodiode line, a CCD sensor, an active pixel sensor, in particular a CMOS sensor or the like.
  • the at least one receiver can include a large number of receiving elements, which are also called pixels, which can be controlled at least in part individually, in particular with regard to their standby time.
  • The laser-based distance measuring system can advantageously be a laser scanner. With a laser scanner, a monitoring area can be scanned with an in particular pulsed laser beam.
  • the invention can advantageously be used in a vehicle, in particular a motor vehicle.
  • the invention can advantageously be used in a land vehicle, in particular a passenger car, a truck, a bus, a motorcycle or the like, an aircraft, in particular drones, and/or a watercraft.
  • the invention can also be used in vehicles that can be operated autonomously or at least partially autonomously.
  • the invention is not limited to vehicles. It can also be used in stationary operation and/or in robotics.
  • the detection device can advantageously be connected to at least one electronic control device of the vehicle, in particular a driver assistance system and/or chassis control and/or a driver information device and/or a parking assistance system and/or gesture recognition or the like, or be part of such. In this way, the vehicle can be operated autonomously or semi-autonomously.
  • the detection device can be used to detect stationary or moving objects, in particular vehicles, people, animals, plants, obstacles, bumps in the road, in particular potholes or stones, road boundaries, traffic signs, open spaces, in particular parking spaces, precipitation or the like.
  • the standby time for at least one receiving element can be specified as twice the runtime that a transmission signal requires from transmission to reaching a specified distance limit of the corresponding region of the monitoring area.
  • The detection range of the detection device, i.e. the maximum distance at which objects can still be detected, can be adjusted by adjusting the standby time and thus the recording time. The shorter the standby time, the smaller the detection range of the detection device.
  • In addition, the detection range of the detection device can be specified via the transmission power of the transmission device. Receiving elements that are assigned to regions with closer distance limits can be read out earlier than receiving elements that are assigned to regions with distance limits that are further away. Accordingly, the evaluation of the received signals can be started earlier.
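  • The relationship between a region's distance limit and the standby time of its receiving elements can be illustrated with a short sketch (the 2 m, 10 m and 40 m limits anticipate the example values used for the ground, near-field and far-field regions further below):

```python
C = 299_792_458.0  # speed of light in m/s


def standby_time_s(distance_limit_m: float) -> float:
    """Standby time = twice the one-way transit time to the distance limit,
    i.e. the round-trip time of a transmission signal reflected at that limit."""
    return 2.0 * distance_limit_m / C


# Example distance limits per region (values taken from the embodiment below):
for region, limit_m in [("ground", 2.0), ("near field", 10.0), ("far field", 40.0)]:
    print(f"{region}: {standby_time_s(limit_m) * 1e9:.1f} ns")
# prints roughly 13.3 ns, 66.7 ns and 266.9 ns
```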
  • The overall image frequency, i.e. the frequency at which measurements are carried out, can be increased in this way.
  • Saturation effects in the receiving elements, which can be caused by excessive signal strengths of received signals, can be reduced. It is not necessary to carry out further measurements with a different detection range in order to correct any saturation of the receiving elements.
  • the demands on optical components, in particular optical lenses or the like, which can be arranged in front of a light source of the transmission device can be reduced.
  • the optical components are used to achieve more homogeneous propagation of the electromagnetic transmission signals in the surveillance area.
  • Receiving elements that are assigned to regions with closer distance limits can advantageously be activated for a correspondingly shorter standby time than receiving elements that are assigned to regions with distance limits that are further away. In this way, measurements can be carried out more efficiently overall.
  • the detection device can be used in connection with a vehicle.
  • the maximum distance limit of interest can be specified as a function of an operating situation, in particular a driving situation, of the vehicle.
  • the maximum required detection ranges can be limited for a detection device which is intended to cover a close range around a vehicle in particular.
  • a distance limit of the order of 100 m and more can be specified for a monitoring area in front of the vehicle in the direction of travel.
  • Objects in front of the vehicle, in particular vehicles driving ahead, can be detected at an early stage.
  • a distance limit of less than 10 m can be specified when the vehicle is parked. In this way, the measurements can be carried out more quickly and correspondingly objects in the parking area can be detected more quickly.
  • the distance limit of a region can be predefined by operational obstacles.
  • The corresponding distance limits can be specified as the distance between the detection device and the ground.
  • The distance limits which are assigned to regions of the monitoring area above the vehicle, in particular regions running parallel to the vehicle, can be specified in such a way that the corresponding region of the monitoring area extends at most up to the height of the vehicle. Objects that are above the vehicle height are generally not of interest, since they generally cannot collide with the vehicle.
  • If a critical reception capacity is reached at at least one receiving element during a measurement, the specified standby time for this at least one receiving element can be reduced at least for a subsequent measurement.
  • the sensitivity can be reduced when a receiving element is overdriven.
  • the corresponding object information can also be determined with the method in the event of reflection from highly reflective objects, in particular reflective surfaces.
  • receiving elements that are assigned to regions from which received signals with higher intensities are expected can be activated for shorter standby times than receiving elements that are assigned to regions from which received signals with lower intensities are expected. In this way, overloading of the receiving elements can be avoided and detection can be improved.
  • Received variables which are generated by the received signals in at least one receiving element can be cumulated over the standby time of the at least one receiving element, and information about the monitoring area can be determined from the cumulated received variables.
  • If the detection device works according to an indirect transit time method, the standby time can be characterized by an integration time; the corresponding received quantities are integrated during the integration time. If the detection device works according to a direct transit time method, the standby time can be characterized by a measurement time window. In this way, the integration time or the measurement time window can be adapted to the operating situation that is to be expected.
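  • A minimal sketch of this accumulation, assuming the received variable is available as discrete samples (for example a digitized photocurrent) with a known sampling interval; the sample values, interval and standby times below are illustrative assumptions:

```python
from typing import Sequence


def cumulate_received_variable(samples: Sequence[float],
                               sample_interval_s: float,
                               standby_time_s: float) -> float:
    """Integrate the received variable of one receiving element over its
    standby time (rectangle rule); samples after the window are ignored."""
    n_samples = int(standby_time_s / sample_interval_s)
    return sum(samples[:n_samples]) * sample_interval_s


# Illustrative use: 1 ns sampling, 66.7 ns standby time (near-field region)
samples = [0.0] * 30 + [1.0] * 10 + [0.0] * 60   # a crude 10 ns "echo" arriving at 30 ns
print(cumulate_received_variable(samples, 1e-9, 66.7e-9))  # ~1e-8, echo lies inside the window
print(cumulate_received_variable(samples, 1e-9, 20e-9))    # 0.0, echo lies outside the window
```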
  • the received variables can in particular be electrical variables, in particular electrical charges, electrical currents, electrical capacitances, electrical energies or the like, which are generated by electromagnetic received signals in or with a receiving element.
  • the reception variables characterize the corresponding electromagnetic reception signal, in particular its signal strength, intensity, duration and/or phase shift or the like.
  • A transit time between the transmission of a transmission signal and the receipt of a corresponding reception signal, which originates from the transmission signal reflected by an object in the monitoring area, can be determined and a distance of the object relative to the detection device determined from it, and/or at least one phase shift of a received reception signal, which originates from a transmission signal reflected by an object in the monitoring area, can be determined and a distance of the object relative to the detection device determined from it.
  • a transit time between the transmission of a transmission signal and the receipt of the corresponding reception signal can be determined and a distance can be determined therefrom. In this way, a direct transit time measurement can be carried out.
  • the distance can be determined via a phase shift of a received signal. In this way, an indirect transit time measurement can be carried out.
  • At least two measurements can be carried out, between which the standby time for at least one receiving element can be changed.
  • a quick measurement in the near range and a measurement in the far range can be carried out alternately.
  • the results of the measurements can be put together to form an overall picture of the monitoring area.
  • Transmission signals can be sent at least for the length of the longest standby time of the receiving elements used. In this way, it can be avoided that successive measurements overlap. Those receiving elements that are activated with a shorter standby time can be read out earlier. In this way, the overall readout speed can be increased.
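  • A small sketch of the resulting timing, assuming each receiving element is read out as soon as its own standby time has elapsed while the transmission lasts for the longest standby time (the element labels and times are illustrative):

```python
# Per-element standby times in ns (illustrative, loosely following the
# ground / near-field / far-field values used later in the description)
standby_ns = {"ground pixel": 13.3, "near-field pixel": 66.7, "far-field pixel": 266.9}

transmit_duration_ns = max(standby_ns.values())   # transmit at least this long
print(f"transmission lasts {transmit_duration_ns} ns")

# Elements with shorter standby times become available for readout earlier:
for name, t_ns in sorted(standby_ns.items(), key=lambda item: item[1]):
    print(f"{name} can be read out after {t_ns:.1f} ns")
```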
  • At least one transmission signal can be transmitted simultaneously to a number of regions of the surveillance area.
  • the corresponding regions can be monitored for objects at the same time during a measurement.
  • the detection device can be implemented as a flash LiDAR.
  • the object is achieved according to the invention with the detection device in that the detection device has means for carrying out the method according to the invention.
  • the means for carrying out the method according to the invention can be implemented using software and/or hardware.
  • The at least one receiving device can have at least one line sensor and/or one area sensor, in particular a CCD sensor, an active pixel sensor or the like.
  • An area sensor has a large number of receiving elements which are arranged in two dimensions. In this way, a spatial resolution in two spatial dimensions can be achieved.
  • A line sensor has a large number of receiving elements which are arranged along a line, i.e. in one dimension. In this way, a spatial resolution can be realized in one spatial dimension.
  • CCD sensors, active pixel sensors or the like have a large number of receiving elements, so-called pixels.
  • the receiving elements can be controlled separately.
  • the receiving elements can be adjusted flexibly.
  • At least one transmission device can be a flash transmission device.
  • at least one transmission signal can be sent simultaneously to several regions of the surveillance area. The corresponding regions can thus be checked simultaneously during a measurement.
  • the at least one transmission device can advantageously have at least one light source, in particular at least one laser, with which transmission signals, in particular laser pulses, can be transmitted.
  • the at least one transmission device can advantageously have at least one signal-influencing device, in particular at least one optical lens, with which the transmission signals can be guided into the monitoring area.
  • the signal-influencing device can advantageously expand the transmission signals, so that a correspondingly larger field of view can be illuminated simultaneously.
  • the object is achieved according to the invention in the vehicle in that the vehicle has at least one detection device according to the invention.
  • FIG. 1 shows a front view of a vehicle with a driver assistance system and a LiDAR system for monitoring a monitoring area to the left of the vehicle in the direction of travel;
  • FIG. 2 shows a functional representation of the vehicle from FIG. 1 with the driver assistance system and the LiDAR system;
  • FIG. 3 shows a front view of a receiver of the LiDAR system from FIGS. 1 and 2.
  • FIG. 1 shows a front view of a vehicle 10 in the form of a passenger car.
  • Figure 2 shows a functional representation of vehicle 10.
  • the x-axis extends in the direction of a vehicle longitudinal axis of vehicle 10
  • the y-axis extends along a vehicle transverse axis
  • the z-axis extends spatially upwards perpendicular to the xy plane along a vehicle vertical axis.
  • When the motor vehicle 10 stands ready for operation on a horizontal roadway, the x-axis and y-axis extend horizontally in space and the z-axis extends vertically in space.
  • the vehicle 10 has an optical detection device, for example in the form of a LiDAR system 12.
  • the LiDAR system 12 is arranged, for example, laterally in an upper area of the vehicle 10 and directed into a monitoring area 14 in the direction of travel 16 to the left of the vehicle 10.
  • The monitoring area 14 can be monitored for objects 18 with the LiDAR system 12.
  • the LiDAR system 12 can also be arranged elsewhere on the vehicle 10 and oriented differently.
  • the vehicle 10 can also have a number of different detection devices.
  • the LiDAR system 12 can be used to detect stationary or moving objects 18, for example vehicles, people, animals, plants, obstacles, bumps in the road, in particular potholes or stones, road boundaries, traffic signs, open spaces, in particular parking spaces, precipitation or the like.
  • the vehicle 10 has a driver assistance system 20.
  • the vehicle 10 can be operated autonomously or partially autonomously.
  • The driver assistance system 20 is functionally connected to the LiDAR system 12.
  • Information about the monitored area 14, for example object information about objects 18 in the monitored area 14, which is detected using the LiDAR system 12, can be transmitted to the driver assistance system 20.
  • the information about the monitoring area 14 can be used to support operating functions of the vehicle 10, for example with regard to the drive, steering and braking.
  • the object information of an object 18, which can be determined with the LiDAR system 12, includes, for example, distances, speeds and directions of objects 18 relative to the vehicle 10 or to the LiDAR system 12.
  • the direction of an object 18 can be specified, for example, as an angle in relation to reference axes.
  • To characterize the direction, the azimuth can be specified in relation to the vehicle transverse axis of the vehicle 10 and the elevation in relation to the vehicle vertical axis.
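  • One possible convention for these angles is sketched below (the exact reference and sign conventions are an assumption for illustration, not spelled out in the description): azimuth is measured in the horizontal plane against the vehicle transverse axis, elevation against the horizontal plane toward the vehicle vertical axis.

```python
import math


def direction_angles(x: float, y: float, z: float) -> tuple[float, float]:
    """Return (azimuth, elevation) in degrees for an object at (x, y, z) in the
    vehicle coordinate system (x: longitudinal, y: transverse, z: vertical)."""
    azimuth = math.degrees(math.atan2(x, y))                    # angle against the y-axis in the horizontal plane
    elevation = math.degrees(math.atan2(z, math.hypot(x, y)))   # angle above the horizontal plane
    return azimuth, elevation


# Object 5 m to the left of the vehicle, 2 m ahead, 1 m above sensor height (illustrative):
print(direction_angles(2.0, 5.0, 1.0))  # approximately (21.8, 10.5) degrees
```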
  • the LiDAR system 12 includes, for example, a transmitting device 22, a receiving device 24 and a control and evaluation device 26.
  • Electromagnetic transmission signals 28 can be transmitted with the transmission device 22.
  • The transmission signals 28 are, for example, pulsed laser beams with wavelengths in the near infrared.
  • the transmission device 22 has a laser diode as a light source, for example, with which the transmission signals 28 can be generated.
  • the transmission device 22 can also have more than one light source, for example a number of laser diodes.
  • The LiDAR system 12 is what is known as a flash LiDAR system, in which each transmission signal 28 illuminates the entire monitoring area 14.
  • the transmission device 22 can have corresponding optical components, for example optical lenses, diffusers, diffraction elements, diffractive optical elements, beam-shaping elements or the like, with which the transmission signals 28 are influenced in such a way that they can illuminate the monitored area 14 as uniformly as possible.
  • The transmission signals 28 which are reflected in the monitoring area 14, for example on an object 18, can be received by the receiving device 24 as electromagnetic reception signals 30 and converted into corresponding electrical evaluation signals.
  • the electrical evaluation signals can be transmitted to the electronic control and evaluation device 26 of the LiDAR system 12 and processed with it.
  • the control and evaluation device 26 includes means for controlling the LiDAR system 12 and for processing the electrical evaluation signals.
  • the means for control and the means for evaluation can also be configured separately.
  • A control device and an evaluation device can be implemented separately from one another.
  • the means for control and evaluation are implemented in software and hardware. Parts of the control and evaluation device 26 or the entire control and evaluation device 26 can also be combined with an electronic control device of the vehicle 10, for example also with the driver assistance system 20.
  • the object information about the detected object 18 can be obtained with the LiDAR system 12 from the received signals 30 or the electrical evaluation signals.
  • the distance of the object 18 relative to the LiDAR system 12 can be determined as position information using a light propagation time method, in which the propagation time between the transmission of a transmission signal 28 and the receipt of the corresponding reception signal 30 is determined.
  • the receiving device 24 has an optical imaging system 32, for example in the form of an optical lens, a receiver 34, for example in the form of a CCD chip, and electronic components which are not shown in the figures for the sake of clarity.
  • The optical imaging system 32 is located between the receiver 34 and the monitoring area 14. With the optical imaging system 32, reflecting objects 18 can be imaged with spatial resolution via the corresponding received signals 30 in two dimensions, namely in the xz plane, i.e. vertically and horizontally in space.
  • The receiver 34 is shown in FIG. 3 in a front view as viewed from the optical imaging system 32.
  • The receiver 34 comprises a total of 91 flatly arranged receiving elements 36, which can also be referred to as "pixels".
  • The receiving elements 36 are arranged, for example, in 13 vertical columns 38 and 7 horizontal rows 40.
  • The columns 38 run parallel to the z-axis, for example vertically in space.
  • The rows 40 run parallel to the x-axis, for example horizontally in space.
  • Each receiving element 36 can be activated individually for a respective predetermined standby period Tber, namely Tber_1, Tber_2 and Tber_3, for receiving electromagnetic reception signals 30.
  • the recording of received signals 30 includes the reception of the received signals 30 and their conversion into corresponding electrical evaluation signals.
  • The standby period Tber of a receiving element 36 is the period in which the receiving element 36 is ready to record received electromagnetic signals 30. If an electromagnetic received signal 30 strikes the corresponding receiving element 36 during the standby period Tber, the received signal 30 can be recorded. If no electromagnetic received signal 30 strikes the receiving element 36 during the standby period Tber, nothing is recorded apart from possible noise.
  • The LiDAR system 12 can be operated using a direct time-of-flight method.
  • In this case, the transit time between the transmission of a transmission signal 28 with the transmission device 22 and the reception of the correspondingly reflected reception signal 30 with the receiver 34 is measured, and the distance between the LiDAR system 12 and the detected object 18 is determined from it.
  • Alternatively, the LiDAR system 12 can work according to an indirect time-of-flight method. In this case, a transit-time-related phase shift of the reflected transmission signal 28, i.e. of the reception signal 30, is measured and the transit time and, from it, the distance are determined.
  • The respective standby periods Tber_1, Tber_2 and Tber_3 for the receiving elements 36 are specified individually depending on the respective regions of the monitoring area 14 to which the receiving elements 36 are assigned, namely a ground region 14a, a near-field region 14b, a far-field region 14c and a height region 14d.
  • The respective standby time Tber is specified in such a way that it corresponds to the combined propagation time of a transmission signal 28, which is reflected within the detection range 48a, 48b, 48c or 48d of the receiving elements 36 for the corresponding region 14a, 14b, 14c or 14d of the monitoring area 14, and of the correspondingly reflected received signal 30.
  • The detection ranges 48a, 48b, 48c and 48d are defined by the respective maximum distance limits of interest 44a, 44b, 44c and 44d of the corresponding regions 14a, 14b, 14c and 14d.
  • The distance limits 44a, 44b, 44c and 44d correspond to the respective distances to the LiDAR system 12 for which the detection of any objects 18 is still of interest. For example, the detection of an object 18 is of interest when a collision with the vehicle 10 is imminent.
  • The standby times Tber_1, Tber_2 and Tber_3 for the receiving elements 36 are each specified as twice the transit time that a transmission signal 28 requires from transmission until it reaches the respective maximum distance limit 44a, 44b, 44c or 44d of the corresponding region 14a, 14b, 14c or 14d of the monitoring area 14.
  • The receiving elements 36 of the bottom two rows 40 in FIG. 3 form a ground group 42a.
  • The receiving elements 36 of the ground group 42a are assigned to the ground region 14a.
  • The received signals 30 from any objects 18 in the ground region 14a are imaged onto the receiving elements 36 of the ground group 42a.
  • The ground region 14a of the monitoring area 14 extends directly next to the vehicle 10.
  • The distance limit 44a of the ground region 14a is defined by the ground 46, i.e. the roadway.
  • A reduced ground detection range 48a of approximately 2 m, for example, is sufficient, which corresponds to the distance between the LiDAR system 12 and the ground 46 on the far side of the ground region 14a.
  • The receiving elements 36 of the ground group 42a are activated, for example, with the same first standby time Tber_1.
  • the receiving elements 36 in the third row 40 from the bottom in FIG. 3 form a near-field group 42b.
  • the receiving elements 36 of the near-field group 42b are assigned to the near-field region 14b.
  • the received signals 30 from any objects 18 in the near-field region 14b are imaged onto the receiving elements 36 of the near-field group 42b.
  • The near-field region 14b of the monitoring area 14 extends next to the ground region 14a on the side facing away from the vehicle 10.
  • The near-field distance limit 44b of the near-field region 14b is defined by the ground 46 at a greater distance than that of the ground region 14a.
  • A near-field detection range 48b of approximately 10 m, for example, is sufficient, which corresponds to the distance between the LiDAR system 12 and the ground 46 on the far side of the near-field region 14b.
  • The receiving elements 36 of the near-field group 42b are activated, for example, with the same second standby time Tber_2.
  • The second standby time Tber_2 is longer than the first standby time Tber_1.
  • the receiving elements 36 of the fourth, fifth and sixth row 40 from the bottom in FIG. 3 form a far-field group 42c.
  • the receiving elements 36 of the far-field group 42c are assigned to the far-field region 14c.
  • the received signals 30 from any objects 18 in the far-field region 14c are imaged onto the receiving elements 36 of the far-field group 42c.
  • The far-field region 14c of the monitoring area 14 extends next to the near-field region 14b on the side facing away from the vehicle 10.
  • An imaginary horizon line 50 runs through the far-field region 14c at the level of the roof of the vehicle 10.
  • The far-field distance limit 44c of the far-field region 14c is at a distance of, for example, approximately 40 m. Objects 18 up to a distance of approximately 40 m next to the vehicle 10 can thus still be detected.
  • A far-field detection range 48c of 40 m is correspondingly required.
  • The receiving elements 36 of the far-field group 42c are activated, for example, with the same third standby time Tber_3.
  • The third standby time Tber_3 is significantly longer than the second standby time Tber_2.
  • The receiving elements 36 of the upper row 40 in FIG. 3 form a height group 42d.
  • The receiving elements 36 of the height group 42d are assigned to the height region 14d.
  • The received signals 30 from any objects 18 in the height region 14d are imaged onto the receiving elements 36 of the height group 42d.
  • The height region 14d of the monitoring area 14 extends above the far-field region 14c.
  • The height distance limit 44d of the height region 14d is at a distance of, for example, approximately 10 m.
  • The area above the vehicle height is of less interest, since objects 18 there generally cannot collide with the vehicle 10.
  • A height detection range 48d of 10 m is required, which corresponds to the near-field detection range 48b.
  • The receiving elements 36 of the height group 42d are activated, for example, with the same second standby time Tber_2 as the receiving elements 36 of the near-field group 42b.
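  • The assignment of receiver rows to region groups and standby times described above can be summarized in a short sketch (row 0 is the bottom row of FIG. 3; the concrete times follow from the example distance limits of roughly 2 m, 10 m and 40 m, so they are illustrative):

```python
C = 299_792_458.0  # speed of light in m/s


def standby_time_ns(distance_limit_m: float) -> float:
    """Twice the one-way transit time to the region's distance limit, in ns."""
    return 2.0 * distance_limit_m / C * 1e9


TBER_1 = standby_time_ns(2.0)    # ground group,                ~13.3 ns
TBER_2 = standby_time_ns(10.0)   # near-field and height groups, ~66.7 ns
TBER_3 = standby_time_ns(40.0)   # far-field group,             ~266.9 ns

N_COLUMNS = 13
# Row index counted from the bottom of the receiver in FIG. 3
ROW_GROUPS = {
    0: ("ground",     TBER_1),
    1: ("ground",     TBER_1),
    2: ("near field", TBER_2),
    3: ("far field",  TBER_3),
    4: ("far field",  TBER_3),
    5: ("far field",  TBER_3),
    6: ("height",     TBER_2),
}

# Every receiving element (pixel) of a row gets the standby time of its group:
standby_per_pixel = {
    (row, col): t_ns
    for row, (_, t_ns) in ROW_GROUPS.items()
    for col in range(N_COLUMNS)
}
print(len(standby_per_pixel))      # 91 receiving elements
print(standby_per_pixel[(0, 0)])   # ~13.3 ns (ground group)
print(standby_per_pixel[(4, 6)])   # ~266.9 ns (far-field group)
```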
  • measurements are carried out continuously with the LiDAR system 12, for example.
  • For this purpose, transmission signals 28 are sent into the monitoring area 14 for the length of the longest standby time Tber_3.
  • The receiving elements 36 are activated for their respective standby times Tber_1, Tber_2 and Tber_3.
  • The transmission signals 28 which strike an object 18 in the monitoring area 14 are reflected accordingly.
  • The reflected transmission signals 28 are imaged onto the corresponding receiving elements 36 by means of the optical imaging system 32 of the receiving device 24.
  • The received variables that the received signals 30 generate in the relevant receiving element 36 are integrated over the respective standby time.
  • The corresponding receiving elements 36 are then deactivated and read out.
  • The integrated received variables are output as respective electrical evaluation signals.
  • The object information is transmitted to the driver assistance system 20.
  • With the driver assistance system 20, corresponding operating functions of the vehicle 10 are influenced on the basis of the object information, for example controlled or regulated.
  • the vehicle 10 can thus be operated autonomously or partially autonomously.
  • If a critical reception capacity is reached at a receiving element 36 during a measurement, the predetermined standby time Tber for the relevant receiving element 36 is reduced for the next measurement.
  • The critical reception capacity can be reached, for example, when the transmission signals 28 are reflected on highly reflective objects 18, for example reflective surfaces, and impinge on the corresponding receiving element 36 as received signals 30 of correspondingly high strength. This leads to saturation of the corresponding receiving element 36. Reducing the corresponding standby time Tber counteracts the saturation.
  • In addition, two consecutive measurements can be carried out in pairs. In one of the measurements, for example, the longest standby time Tber_3 can be shortened. In this way, the far-field detection range 48c is reduced. The measurement with the shortened standby time Tber_3 can be carried out faster. The results of the two measurements with standby times Tber_3 of different lengths can be combined to form an overall picture of the monitored area 14.
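  • Both adaptations can be sketched as follows, assuming a simple per-element saturation check with an illustrative threshold and reduction factor; how the two partial measurements are merged is not specified in the description, so the merge below simply lets the far-range result take precedence where both exist:

```python
SATURATION_THRESHOLD = 0.95   # normalized cumulated received variable (illustrative)
REDUCTION_FACTOR = 0.5        # how strongly a saturated element's standby time is shortened (illustrative)


def adapt_standby_times(standby_ns: dict[str, float],
                        measured: dict[str, float]) -> dict[str, float]:
    """Reduce the standby time of every receiving element whose cumulated
    received variable reached a critical level in the last measurement."""
    return {
        pixel: t * REDUCTION_FACTOR if measured[pixel] >= SATURATION_THRESHOLD else t
        for pixel, t in standby_ns.items()
    }


def merge_measurements(near_range: dict[str, float],
                       far_range: dict[str, float]) -> dict[str, float]:
    """Combine a fast near-range measurement with a far-range measurement into
    one overall picture (far-range values take precedence where both exist)."""
    return {**near_range, **far_range}


# Illustrative use with two receiving elements:
standby = {"pixel_a": 266.9, "pixel_b": 266.9}    # standby times in ns
measured = {"pixel_a": 1.0, "pixel_b": 0.4}       # pixel_a is saturated
print(adapt_standby_times(standby, measured))     # pixel_a shortened to ~133 ns
```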

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Electromagnetism (AREA)
  • Optical Radar Systems And Details Thereof (AREA)

Abstract

The invention relates to a method for detecting objects (18) in a monitored area (14) by means of electromagnetic signals (28, 30) using a detection device (12), to a detection device (12), and to a vehicle (10) comprising a detection device (12). In the method, at least one electromagnetic transmission signal is transmitted into the monitored area (14) with at least one transmission device during at least one measurement. At least one receiving element of at least one receiving device is activated to receive electromagnetic signals. Information about the monitored area is determined from the result of at least one reception. At least two receiving elements are activated for a respective predetermined standby period in order to receive electromagnetic reception signals which originate from the electromagnetic transmission signal(s) reflected in the monitored area (14). The respective standby period for each of the at least two receiving elements is specified individually on the basis of the respective regions (14a, 14b, 14c, 14d) of the monitored area (14) assigned to that receiving element.
PCT/EP2021/073948 2020-09-15 2021-08-31 Procédé destiné à la détection d'objets et dispositif de détection WO2022058155A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
EP21769725.9A EP4214538A1 (fr) 2020-09-15 2021-08-31 Procédé destiné à la détection d'objets et dispositif de détection

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102020124023.6A DE102020124023A1 (de) 2020-09-15 2020-09-15 Verfahren zum Detektieren von Objekten und Detektionsvorrichtung
DE102020124023.6 2020-09-15

Publications (1)

Publication Number Publication Date
WO2022058155A1 true WO2022058155A1 (fr) 2022-03-24

Family

ID=77739077

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2021/073948 WO2022058155A1 (fr) 2020-09-15 2021-08-31 Procédé destiné à la détection d'objets et dispositif de détection

Country Status (3)

Country Link
EP (1) EP4214538A1 (fr)
DE (1) DE102020124023A1 (fr)
WO (1) WO2022058155A1 (fr)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090153393A1 (en) * 2007-07-18 2009-06-18 Mazda Motor Corporation Obstacle detecting control device of vehicle
US20190271767A1 (en) * 2016-11-16 2019-09-05 Innoviz Technologies Ltd. Dynamically Allocating Detection Elements to Pixels in LIDAR Systems
DE102018126631A1 (de) 2018-10-25 2020-04-30 Valeo Schalter Und Sensoren Gmbh Verfahren zur Bestimmung einer Entfernung eines Objekts mithilfe einer optischen Detektionsvorrichtung und optische Detektionsvorrichtung

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102019200157A1 (de) 2019-01-09 2020-07-09 Robert Bosch Gmbh LIDAR-Vorrichtung mit winkelbasierter Detektorauswertung

Also Published As

Publication number Publication date
EP4214538A1 (fr) 2023-07-26
DE102020124023A1 (de) 2022-03-17

Similar Documents

Publication Publication Date Title
DE102019217627A1 (de) Lidar mit distanzabhängiger vertikaler Auflösung
EP2306217B1 (fr) Détermination d'un environnement
WO2023247302A1 (fr) Procédé de détermination d'au moins une fonction de correction pour un système lidar, système lidar, véhicule comprenant au moins un système lidar et système de mesure
EP3545338B1 (fr) Dispositif de réception pour un dispositif de détection optique, dispositif de détection et système d'aide à la conduite
WO2022023117A1 (fr) Dispositif de transmission d'un dispositif de détection optique, dispositif de détection, véhicule et procédé
WO2022058155A1 (fr) Procédé destiné à la détection d'objets et dispositif de détection
EP3519858A1 (fr) Unité de balayage d'un dispositif de réception et d'émission optique d'un dispositif de détection optique d'un véhicule
WO2021023423A1 (fr) Système de mesure lidar comprenant deux dispositifs de mesure lidar
DE102020124017A1 (de) Verfahren zum Betreiben einer optischen Detektionsvorrichtung, optische Detektionsvorrichtung und Fahrzeug mit wenigstens einer optischen Detektionsvorrichtung
DE102021126999A1 (de) Verfahren zum Betreiben eines LiDAR-Systems, LiDAR-System und Fahrzeug aufweisend wenigstens eine LiDAR-System
WO2020221619A1 (fr) Appareil de détection optique pour détecter des objets et dispositif de réception pour un dispositif de détection optique
DE102022119584A1 (de) Verfahrensüberprüfung eine Ausrichtung wenigstens einer optischen Einrichtung eines LiDAR-Systems, LiDAR-System, Fahrassistenzsystem und Fahrzeug mit wenigstens einem LiDAR-System
WO2023247304A1 (fr) Procédé de fonctionnement d'un système lidar, système lidar, et véhicule comprenant au moins un système lidar
WO2021078552A1 (fr) Dispositif d'étalonnage pour l'étalonnage d'au moins un dispositif de détection optique
WO2021078511A1 (fr) Procédé de détection d'objets dans une région surveillée à l'aide d'un appareil de détection optique et appareil de détection optique
DE102020119720A1 (de) Verfahren zum Betreiben einer Detektionsvorrichtung, Detektionsvorrichtung und Fahrzeug mit wenigstens einer Detektionsvorrichtung
WO2023247474A1 (fr) Procédé de fonctionnement d'un système lidar, système lidar et véhicule comprenant au moins un système lidar
WO2023099426A1 (fr) Procédé de fonctionnement d'un système lidar, système lidar et véhicule comprenant au moins un système lidar
WO2023247395A1 (fr) Procédé de fonctionnement d'un système lidar à correction de lumière parasite, système lidar correspondant et véhicule
EP3994482A1 (fr) Dispositif de mesure optique pour la détermination d'informations d'objet pour des objets dans au moins une zone de surveillance
DE102021113962A1 (de) Empfangseinrichtung einer Detektionsvorrichtung, Detektionsvorrichtung, Fahrzeug mit wenigstens einer Detektionsvorrichtung und Verfahren zum Betreiben wenigstens einer Detektionsvorrichtung
WO2023247475A1 (fr) Procédé de fonctionnement de système lidar flash pour véhicule, système lidar flash et véhicule
EP3945339A1 (fr) Dispositif de détection optique permettant de surveiller au moins une zone de surveillance au niveau d'objets et procédé de fonctionnement d'un dispositif de détection optique
DE102019124641A1 (de) Detektionsvorrichtung zur Erfassung von Objekten und Verfahren zum Betreiben einer Detektionsvorrichtung
DE102021129091A1 (de) Verfahren zum Betreiben einer Detektionsvorrichtung zur ortsaufgelösten Überwachung wenigstens eines Überwachungsbereichs, Detektionsvorrichtung, Fahrzeug mit wenigstens einer Detektionsvorrichtung

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
    Ref document number: 21769725
    Country of ref document: EP
    Kind code of ref document: A1
NENP Non-entry into the national phase
    Ref country code: DE
ENP Entry into the national phase
    Ref document number: 2021769725
    Country of ref document: EP
    Effective date: 20230417