WO2020065019A1 - Environment detection system and method for an environment detection system - Google Patents

Environment detection system and method for an environment detection system

Info

Publication number
WO2020065019A1
WO2020065019A1
Authority
WO
WIPO (PCT)
Prior art keywords
sensor
environment
information
viewing angle
detection system
Prior art date
Application number
PCT/EP2019/076201
Other languages
German (de)
English (en)
Inventor
Günther Scharnagel
Stefan Hakspiel
Original Assignee
Zf Friedrichshafen Ag
Ibeo Automotive Systems GmbH
Priority date
Filing date
Publication date
Application filed by ZF Friedrichshafen AG and Ibeo Automotive Systems GmbH
Publication of WO2020065019A1

Classifications

    • G01S 7/4972: Alignment of sensor (means for monitoring or calibrating)
    • G01S 13/87: Combinations of radar systems, e.g. primary radar and secondary radar
    • G01S 13/931: Radar or analogous systems adapted for anti-collision purposes of land vehicles
    • G01S 17/42: Simultaneous measurement of distance and other co-ordinates
    • G01S 17/86: Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
    • G01S 17/87: Combinations of systems using electromagnetic waves other than radio waves
    • G01S 17/931: Lidar systems adapted for anti-collision purposes of land vehicles
    • G01S 7/4026: Antenna boresight (means for monitoring or calibrating parts of a radar system)
    • G01S 2013/9323: Alternative operation using light waves

Definitions

  • The present invention relates to systems for detecting the environment, e.g. of a vehicle.
  • The invention further relates to a method for an environment detection system.
  • a LIDAR measuring system can have one or more LIDAR sensors.
  • Each LIDAR sensor preferably comprises a LIDAR transmitter unit and a LIDAR receiver unit.
  • A LIDAR sensor resolves a spatial area with a predetermined resolution at a given viewing angle.
  • A higher resolution of the LIDAR image is sometimes required.
  • The same LIDAR sensor can be used for this if its viewing angle is narrowed by appropriate optics, so that its resolution is increased with the same number of pixels. So that the LIDAR sensor can cover the same image area with the narrowed viewing angle as before, the viewing angle of the LIDAR sensor is deflected (i.e. varied). This can be done, for example, by tilting the sensor (e.g. by ±5°) or by tilting a mirror or prism (e.g. by ±10°).
  • The viewing angle of the LIDAR sensor, i.e. the angle of the tilt, must be sensed very precisely (e.g. with an accuracy of ±0.05°).
  • A rotation angle sensor is usually used for this sensing.
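The gain from narrowing the viewing angle can be made concrete with a small calculation. The following sketch uses purely hypothetical numbers (200 pixels, 30° and 10° viewing angles); it merely illustrates that, at a fixed pixel count, the angular width covered by one pixel shrinks in proportion to the viewing angle.

```python
def angular_resolution_deg(viewing_angle_deg: float, num_pixels: int) -> float:
    """Angular width covered by a single pixel, assuming a uniform
    pixel-to-angle mapping across the viewing angle."""
    return viewing_angle_deg / num_pixels

# Hypothetical sensor with 200 pixels across its viewing angle:
wide = angular_resolution_deg(30.0, 200)    # 0.15 degrees per pixel
narrow = angular_resolution_deg(10.0, 200)  # 0.05 degrees per pixel after narrowing optics
```

Tripling the resolution in this way is exactly what motivates deflecting the narrowed viewing angle across the original image area.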
  • In document DE 10 2013 211 648 A1, the relative position of the individual image of the first camera within the individual image of the second camera is determined. Depending on the relative position determined, the two individual images are then combined to form an overall image.
  • Document DE 10 2008 004 370 A1 proposes a method for calibrating a camera of a night vision system which is attached to a vehicle.
  • The video image of the camera is measured using a reference object in order to determine a relative angle of rotation of the video image relative to the reference object.
  • The reference object can be, for example, the video image of a second camera.
  • A shear for the video image is determined by means of the determined angle of rotation in order to enable a rotation of the video image without mechanical means.
  • The invention relates to an environment detection system.
  • The environment detection system comprises a first environment sensor with a variable viewing angle, which is set up to provide first information about a first spatial area.
  • The first spatial area depends on a current viewing angle of the first environment sensor.
  • The environment detection system further comprises a second environment sensor, which is set up to provide second information about a second spatial area.
  • The first spatial area is a partial area of the second spatial area.
  • Both the first environment sensor and the second environment sensor are sensors which scan or sense their environment in order to obtain information about objects in the environment (for example position, distance, relative movement to the environment sensor, etc.). Because of the variable viewing angle of the first environment sensor, the first spatial area can be adjusted. Accordingly, the first environment sensor can scan different spatial areas.
  • The second environment sensor can, for example, have a fixed, unchangeable viewing angle, so that it always scans the same spatial area. Alternatively, the second environment sensor can also have a variable viewing angle.
  • The environment detection system includes an evaluation element, which is set up to determine the instantaneous viewing angle of the first environment sensor based on a comparison of the first information and the second information.
  • The second environment sensor scans the first spatial area as part of the second spatial area, so that the second information contains information corresponding to the first information.
  • The subset of the second information corresponding to the first information is assigned to a specific partial area of the second spatial area scanned by the second environment sensor. From the position of this partial area within the known second spatial area, it is thus possible to infer the first spatial area scanned by the first environment sensor and thus its viewing angle.
  • The inventive comparison of the information provided by a first environment sensor with a variable viewing angle and a second environment sensor can thus enable the determination of the viewing angle of the first environment sensor without the use of a rotation angle sensor.
  • At least the second information can be arranged in a two-dimensional matrix (likewise the first information) in order to provide a planar representation of the surroundings.
  • The evaluation element is then set up to determine a subset of the second information corresponding to the first information. For example, the evaluation element can check whether an object described in distance information provided by the first environment sensor is also described in distance information provided by the second environment sensor. Similar comparisons can, for example, also be carried out by means of speed or elevation-angle information provided by the two environment sensors. Information from different categories (e.g. distance and speed) provided by the two environment sensors can also be evaluated together. Based on a position of the subset of the second information in the two-dimensional matrix, the evaluation element is further set up to determine the instantaneous viewing angle of the first environment sensor.
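As a sketch of how such a correspondence search could look, the following hypothetical example locates a small matrix of first information (e.g. distance values) inside the larger second-information matrix by exhaustive window comparison. It assumes, purely for illustration, that both sensors report on the same angular grid; a real implementation would also have to handle differing resolutions and measurement noise.

```python
import numpy as np

def locate_submatrix(first_info: np.ndarray, second_info: np.ndarray) -> tuple:
    """Find the offset (row, col) at which first_info best matches a
    window of second_info (minimum mean absolute difference).

    Illustrative only: assumes both sensors report values on the same
    angular grid, so the match reduces to a 2D window search.
    """
    h, w = first_info.shape
    H, W = second_info.shape
    best, best_pos = float("inf"), (0, 0)
    for r in range(H - h + 1):
        for c in range(W - w + 1):
            err = np.abs(second_info[r:r + h, c:c + w] - first_info).mean()
            if err < best:
                best, best_pos = err, (r, c)
    return best_pos

# The found offset maps to the instantaneous viewing angle, e.g. linearly:
# angle = angle_min + col / (W - w) * (angle_max - angle_min)
```

The brute-force search stands in for whatever matching scheme the evaluation element uses; the essential point is that the position of the matched window encodes the viewing angle.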
  • The first environment sensor can, for example, be set up to provide the first information based on received reflections of a signal transmitted by the first environment sensor.
  • The signal emitted by the first environment sensor can be, for example, a high-frequency signal or a light signal.
  • A distance to an object or a speed relative to the object can be calculated from the reflections received from objects in the surroundings of the first environment sensor.
  • The second environment sensor can be designed accordingly.
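The distance and relative-speed calculation mentioned above can be sketched as follows. The 77 GHz carrier frequency is merely an assumed example value for an automotive radar; the formulas themselves are the standard round-trip and Doppler relations.

```python
C = 299_792_458.0  # speed of light in m/s

def distance_m(round_trip_time_s: float) -> float:
    """Range to the reflecting object from the round-trip time of the signal."""
    return C * round_trip_time_s / 2.0

def radial_speed_ms(doppler_shift_hz: float, carrier_hz: float) -> float:
    """Relative radial speed of the object from the Doppler shift of the
    reflected carrier (factor 2 because the shift occurs out and back)."""
    return C * doppler_shift_hz / (2.0 * carrier_hz)

# Example: a 1 microsecond round trip corresponds to roughly 150 m of range;
# a 1 kHz Doppler shift on an assumed 77 GHz carrier to roughly 2 m/s.
```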
  • the first environment sensor is a LIDAR sensor.
  • In order to vary the viewing angle, the entire LIDAR sensor or individual elements of the LIDAR sensor (e.g. its receiving unit and its receiving optics, or an optical element such as a mirror or a prism) can be tilted about a respective tilt axis.
  • the first environment sensor can also be a radar sensor, which provides the first information based on received reflections of a high-frequency signal emitted by the radar sensor. Accordingly, the viewing angle of the radar sensor can be determined or calculated by means of the comparison according to the invention of the information provided by the environment sensors.
  • the first environment sensor can also be a camera with a variable viewing angle, for example, which records the environment.
  • the second environment sensor can be of the same type as the first environment sensor or can be an environment sensor of a different type.
  • both environment sensors can be LIDAR sensors.
  • A LIDAR image can, for example, be compared with a camera image and/or a radar image.
  • the second environment sensor can be a LIDAR sensor, a radar sensor or a camera.
  • one of the two environment sensors can be used to scan a sub-area of the area scanned by the other environment sensor with greater accuracy.
  • The first spatial area can thus have a higher resolution in the first information than in the second information.
  • For this purpose, the first environment sensor can be set up to sense the first spatial area with a higher resolution than the second environment sensor.
  • the present invention also relates to a method for an environment detection system comprising a first environment sensor with a variable viewing angle and a second environment sensor.
  • The method comprises providing first information about a first spatial area by the first environment sensor.
  • The first spatial area depends on a current viewing angle of the first environment sensor.
  • The method further comprises providing second information about a second spatial area by the second environment sensor.
  • The first spatial area is a partial area of the second spatial area.
  • The method further comprises determining the instantaneous viewing angle of the first environment sensor based on a comparison of the first information and the second information.
  • The method according to the invention can thus enable an improved and more cost-effective determination of the viewing angle of the environment sensor with a variable viewing angle.
  • Another aspect of the present invention also relates to a program with a program code for performing the method described herein when the program code runs on a processor or a programmable hardware component or is executed there.
  • the present invention also relates to a vehicle.
  • A vehicle can be thought of as a device that includes one or more motor-driven wheels (and optionally a powertrain system).
  • A vehicle can be, for example, a passenger car, a truck, a motorcycle or a tractor.
  • the vehicle comprises at least one environment detection system according to the present invention in order to detect an environment of the vehicle.
  • the vehicle includes a control element, which is set up to derive a reaction of the vehicle to the first information, depending on the determined instantaneous viewing angle of the first environment sensor.
  • Owing to the environment detection system according to the invention, the vehicle according to the invention can enable an improved and more cost-effective determination of the viewing angle of the environment sensor with a variable viewing angle, so that the cost of manufacturing the vehicle can be reduced and the derivation of the vehicle reaction can be improved.
  • Fig. 1 schematically shows an embodiment of an environment detection system;
  • Fig. 2 schematically shows an exemplary embodiment of an environment detection system in the form of a LIDAR measuring system;
  • Fig. 3 schematically shows a section through the LIDAR sensor shown in Fig. 2; and
  • Fig. 4 shows comparisons of the recordings of two LIDAR sensors.
  • The environment detection system 100 comprises a first environment sensor 110, which is set up to provide first information 111 about a first spatial area.
  • The environment detection system 100 comprises a second environment sensor 120, which is set up to provide second information 121 about a second spatial area.
  • Both the first environment sensor 110 and the second environment sensor 120 can, for example, each scan an object 140 by means of signals 112 and 122, respectively, which are reflected back from the object 140 to the environment sensors 110 and 120 (indicated by reflections 142 and 143).
  • The environment detection system 100 also includes an evaluation element 130.
  • The evaluation element 130 is set up to determine the instantaneous viewing angle of the first environment sensor 110 based on a comparison of the first information 111 and the second information 121.
  • The second environment sensor 120 scans the first spatial area as part of the second spatial area, so that the second information 121 contains information corresponding to the first information 111.
  • This subset of the second information 121, which corresponds to the first information 111, is assigned to a specific partial area of the second spatial area scanned by the second environment sensor 120. From the position of this partial area within the known second spatial area, it is thus possible to infer the first spatial area scanned by the first environment sensor 110 and thus its viewing angle.
  • An exemplary construction of a LIDAR measuring system 200 with a first LIDAR sensor 210 and a second LIDAR sensor 220, which represents a possible implementation of the environment detection system according to the invention, is shown in FIG. 2.
  • a sectional view of the first LIDAR sensor 210 is shown in FIG. 3.
  • the second LIDAR sensor 220 can be configured like the first LIDAR sensor 210.
  • the basic structure of the LIDAR measuring system 200 is designed in accordance with the statements relating to the prior art (WO 2017/081294 A1).
  • the first LIDAR sensor 210 and the second LIDAR sensor 220 are arranged in a housing 230 of the LIDAR measuring system 200.
  • The second LIDAR sensor 220 is immobile, i.e. its viewing angle is fixed.
  • the second LIDAR sensor 220 comprises a LIDAR transmission unit and a LIDAR reception unit.
  • the first LIDAR sensor 210 comprises a LIDAR transmission unit and a LIDAR reception unit.
  • Together with the LIDAR transmitter unit, the receiving optics 213, the transmitting optics 214 and a mirror 216 that can be tilted about a tilt axis 215, the LIDAR receiver unit forms a movable LIDAR unit with a viewing angle that can be changed by tilting the mirror 216.
  • the LIDAR receiving unit and / or the LIDAR transmitting unit are advantageously designed in a focal plane array configuration, as indicated in FIG. 3.
  • The elements of the respective unit are essentially arranged in one plane, preferably on a chip.
  • The respective unit is preferably arranged in a focal point of a corresponding optical system of the LIDAR sensor 210 (transmitting optics 214 or receiving optics 213).
  • The sensor elements 211 and the emitter elements 212 are thus arranged in the focal points of the receiving optics 213 and the transmitting optics 214, respectively.
  • Such optics can be formed, for example, by an optical lens system.
  • The LIDAR receiver unit has a plurality of sensor elements 211, which are preferably designed as SPADs (single-photon avalanche diodes).
  • The LIDAR transmitter unit has several emitter elements 212 for emitting laser light, expediently laser pulses.
  • The emitter elements 212 are advantageously designed as VCSELs (vertical-cavity surface-emitting lasers).
  • the transmitter unit has emitter elements 212 which are distributed over an area of the transmitter chip.
  • the receiving unit has sensor elements 211 which are distributed over an area of the receiving chip.
  • The transmitting optics 214 are assigned to the transmitter chip and the receiving optics 213 are assigned to the receiver chip.
  • The optics image light arriving from a spatial area onto the respective chip.
  • The spatial area corresponds to the viewing area of the measuring system 200, which is examined or sensed for objects.
  • The spatial areas of the transmitting unit and the receiving unit are essentially identical.
  • The transmitting optics map an emitter element 212 onto a solid angle, which represents a partial area of the spatial area.
  • The emitter element 212 accordingly emits laser light into this solid angle.
  • The emitter elements 212 together cover the entire spatial area.
  • The receiving optics 213 likewise map a sensor element 211 onto a solid angle, which represents a partial area of the spatial area.
  • The totality of the sensor elements 211 covers the entire spatial area.
  • Emitter elements 212 and sensor elements 211 which observe the same solid angle are imaged onto one another and are accordingly assigned to one another.
  • Laser light from an emitter element 212 thus normally maps onto the associated sensor element 211. If necessary, a plurality of sensor elements 211 are arranged within the solid angle of an emitter element 212.
  • the measurement system 200 carries out a measurement process to determine objects within the spatial area.
  • a measuring process comprises one or more measuring cycles, depending on the design of the measuring system 200 and its electronics.
  • Preferably, the time-correlated single photon counting (TCSPC) method is used.
  • Individual incoming photons are detected, in particular by the SPADs, and the time at which a sensor element 211 is triggered, i.e. the time of detection, is stored in a memory element.
  • the time of detection is related to a reference time at which the laser light is emitted.
  • the transit time of the laser light can be determined from the difference, from which the distance of the object can be determined.
  • A sensor element 211 can be triggered on the one hand by the laser light and on the other hand by ambient radiation.
  • Laser light reflected by an object at a given distance always arrives at the same time, whereas the ambient radiation triggers a sensor element with the same probability at any point in time.
  • The triggerings of a sensor element 211 by the laser light therefore accumulate at the detection time that corresponds to the transit time of the laser light for the distance of the object, whereas the triggerings by the ambient radiation are distributed uniformly over the measuring duration of a measuring cycle.
  • a measurement corresponds to the emission and subsequent detection of the laser light.
  • The data of the individual measuring cycles of a measuring process stored in the memory element make it possible to evaluate the multiple detection times in order to infer the distance of the object.
  • A sensor element 211 is advantageously connected to a time-to-digital converter (TDC), which stores the point in time at which the sensor element is triggered in the memory element.
  • Such a memory element can be designed, for example, as a short-term memory or as a long-term memory.
  • The TDC fills a memory element with the times at which the sensor elements detected an arrival of a photon. This can be graphically represented by a histogram, which is based on the data of the memory element. In a histogram, the duration of a measuring cycle is divided into short periods of time, so-called bins. If a sensor element 211 is triggered, the TDC increases the value of a bin by one. The bin which corresponds to the transit time of the laser pulse, that is the difference between the detection time and the reference time, is filled up.
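The binning just described can be sketched as follows. The cycle duration, bin count and the peak-based distance estimate are simplified assumptions for illustration; a real system accumulates histograms over many measuring cycles and uses more robust peak detection against ambient-light noise.

```python
C = 299_792_458.0  # speed of light in m/s

def build_histogram(detection_times_s, cycle_s, n_bins):
    """Accumulate sensor-element trigger times into bins, as the TDC
    fills the memory element: one increment per detected photon."""
    bin_width = cycle_s / n_bins
    hist = [0] * n_bins
    for t in detection_times_s:
        if 0.0 <= t < cycle_s:
            hist[int(t / bin_width)] += 1
    return hist

def distance_from_peak(hist, cycle_s):
    """Object distance from the most-filled bin: the bin centre gives the
    transit time (out and back), hence distance = c * t / 2."""
    n_bins = len(hist)
    peak = max(range(n_bins), key=hist.__getitem__)
    transit_time_s = (peak + 0.5) * cycle_s / n_bins
    return C * transit_time_s / 2.0
```

Laser-light triggerings pile up in one bin while ambient triggerings spread uniformly, which is why the peak bin identifies the object's transit time.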
  • An evaluation element according to the invention is not shown in FIGS. 2 and 3 for reasons of clarity.
  • Alternatively, the tiltable mirror 216 can be dispensed with; the first LIDAR sensor 210 can then be tilted as a whole or at least partially about a tilt axis.
  • the second LIDAR sensor 220 can also be tiltable about a further tilt axis.
  • Exemplary information of a LIDAR sensor with a variable viewing angle in comparison to information of a LIDAR sensor with an unchanging viewing angle is shown in FIG. 4.
  • The information provided by the LIDAR sensors on the spatial area covered by them is shown as images in FIG. 4. It should be noted that this representation is chosen purely for illustrative purposes and that the information, as described above, is of a different type.
  • a LIDAR measuring system determines individual reflections, which are also referred to as detections. Each detection can include information such as distance, elevation angle, azimuth angle, speed, intensity and / or other variables.
  • the two LIDAR sensors can be installed in a vehicle 400, for example, in order to detect the surroundings of the vehicle.
  • Image 410 shows the area covered by the LIDAR sensor with a fixed viewing angle.
  • Image 420 shows the area covered by the LIDAR sensor with a variable viewing angle at a first viewing angle of that LIDAR sensor.
  • Image 430 shows the area covered by the LIDAR sensor with a variable viewing angle at a second viewing angle of that LIDAR sensor.
  • Images 420 and 430 of the LIDAR sensor with a variable viewing angle each represent a small image section of the image recorded by the LIDAR sensor with an unchanging viewing angle.
  • Images 420 and 430 show the respective image section with increased resolution.
  • the position of the respective image section in the image 410 can now be determined from the comparison of the images 420 and 430 with the image 410, and the viewing angle of the LIDAR sensor with a variable viewing angle can be calculated therefrom.
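The final conversion from the found image-section position to a viewing angle can be sketched with a linear pixel-to-angle mapping. This is an idealisation (real optics introduce distortion), and the field-of-view and pixel numbers below are assumed example values, not values from the patent.

```python
def viewing_angle_from_offset(col_offset: int, section_cols: int,
                              total_cols: int, fov_deg: float) -> float:
    """Viewing angle of the matched image section (its centre) relative to
    the optical axis of the fixed-viewing-angle sensor, assuming a linear
    pixel-to-angle mapping across that sensor's field of view."""
    deg_per_col = fov_deg / total_cols
    centre_col = col_offset + section_cols / 2.0
    return (centre_col - total_cols / 2.0) * deg_per_col

# Example: a 100-column section found at column offset 450 of a
# 1000-column image with an assumed 60-degree field of view is centred
# on the optical axis, i.e. a viewing angle of 0 degrees.
```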

Abstract

The present invention relates to an environment detection system (100). The environment detection system comprises a first environment sensor (110) with a variable viewing angle, which is set up to provide first information about a first spatial area. The first spatial area depends on an instantaneous viewing angle of the first environment sensor. The environment detection system further comprises a second environment sensor (120), which is set up to provide second information about a second spatial area. The first spatial area is a partial area of the second spatial area. The environment detection system further comprises an evaluation element (130), which is set up to determine the instantaneous viewing angle of the first environment sensor based on a comparison of the first information and the second information.
PCT/EP2019/076201 2018-09-28 2019-09-27 Environment detection system and method for an environment detection system WO2020065019A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102018216707.9A DE102018216707A1 (de) 2018-09-28 2018-09-28 Umfelderkennungssystem sowie Verfahren für ein Umfelderkennungssystem
DE102018216707.9 2018-09-28

Publications (1)

Publication Number Publication Date
WO2020065019A1 2020-04-02

Family

ID=68104628

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2019/076201 WO2020065019A1 (fr) 2019-09-27 Environment detection system and method for an environment detection system

Country Status (2)

Country Link
DE (1) DE102018216707A1 (fr)
WO (1) WO2020065019A1 (fr)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102008004370A1 (de) 2008-01-15 2009-07-16 Robert Bosch Gmbh Bildjustageverfahren für ein Videobild
US20100235129A1 (en) * 2009-03-10 2010-09-16 Honeywell International Inc. Calibration of multi-sensor system
DE102011120535A1 (de) * 2011-12-08 2013-06-13 GM Global Technology Operations LLC (n. d. Gesetzen des Staates Delaware) Verfahren und Vorrichtung zum Einstellen zumindest eines Sensors eines Fahrzeugs
DE102013211648A1 (de) 2013-06-20 2014-12-24 Continental Automotive Gmbh Verfahren und Vorrichtung zum Kalibrieren einer ersten und einer zweiten Kamera
WO2017081294A1 (fr) 2015-11-11 2017-05-18 Ibeo Automotive Systems GmbH Procédé et dispositif de mesure de distance par voie optique
US20170307759A1 (en) * 2016-04-26 2017-10-26 Cepton Technologies, Inc. Multi-Range Three-Dimensional Imaging Systems
EP3422049A1 (fr) * 2017-06-30 2019-01-02 Aptiv Technologies Limited Système d'alignement de capteur lidar

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8089518B2 (en) * 2007-11-16 2012-01-03 Samsung Electronics Co., Ltd. System and method for automatic image capture in a handheld camera with a multiple-axis actuating mechanism
US20170186291A1 (en) * 2015-12-24 2017-06-29 Jakub Wenus Techniques for object acquisition and tracking

Also Published As

Publication number Publication date
DE102018216707A1 (de) 2020-04-02

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19779853

Country of ref document: EP

Kind code of ref document: A1

122 Ep: pct application non-entry in european phase

Ref document number: 19779853

Country of ref document: EP

Kind code of ref document: A1