WO2022122502A1 - Capteur optique - Google Patents

Capteur optique (Optical sensor)

Info

Publication number
WO2022122502A1
WO2022122502A1 (PCT/EP2021/083752)
Authority
WO
WIPO (PCT)
Prior art keywords
optical sensor
angular resolution
test object
sensor
spread function
Prior art date
Application number
PCT/EP2021/083752
Other languages
German (de)
English (en)
Inventor
Johannes Richter
Original Assignee
Robert Bosch GmbH
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Robert Bosch GmbH
Publication of WO2022122502A1

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/497 Means for monitoring or calibrating
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02 Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/06 Systems determining position data of a target
    • G01S17/42 Simultaneous measurement of distance and other co-ordinates
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88 Lidar systems specially adapted for specific applications
    • G01S17/93 Lidar systems specially adapted for specific applications for anti-collision purposes
    • G01S17/931 Lidar systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/50 Context or environment of the image
    • G06V20/56 Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V20/588 Recognition of the road, e.g. of lane markings; Recognition of the vehicle driving pattern in relation to the road
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/497 Means for monitoring or calibrating
    • G01S2007/4975 Means for monitoring or calibrating of sensor obstruction by, e.g. dirt- or ice-coating, e.g. by reflection measurement on front-screen

Definitions

  • Vehicles automated to SAE levels 3-5 will increasingly be used on public roads in the coming years. All known concepts for automated vehicles require a combination of different environment detection sensors known per se, such as cameras, radar, lidar, etc.
  • The latter environment detection sensors are in principle laser scanners that emit laser light pulses and measure and evaluate the laser light reflected at an object.
  • From the measured time-of-flight, the LiDAR sensors can determine the distance to the object.
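  • As a general illustration of the time-of-flight principle (standard physics, not a formula taken from this disclosure): with the speed of light c and the measured round-trip time Δt of a laser pulse, the distance to the object follows as d = c · Δt / 2.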
  • US 2019/0188498 A1 discloses the detection of ground markings, in particular road markings or markings on vehicle parking spaces.
  • Disclosure of Invention
  • The invention provides an optical sensor, comprising: an optically transmissive cover device; a detection device for detecting an object by sensors; and a checking device, by means of which an angular resolution of the optical sensor is checked using a test object detected by the detection device, it being possible for the checking device to output a control signal if the angular resolution is insufficient.
  • The object is also achieved with a method for operating an optical sensor, having the steps described further below.
  • A further advantageous development of the optical sensor is characterized in that an edge between two areas is used as the test object, it being checked how a transition between the two areas is detected.
  • In this way, a kind of "edge resolution" of the optical sensor is advantageously recorded and evaluated (a software sketch of such an edge check follows below).
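  • Purely as an illustration of how such an edge check could be evaluated in software (the function name, sampling and threshold levels are assumptions, not part of the disclosure), the following sketch estimates the angular width of the 10 %-90 % intensity transition across an edge:

```python
import numpy as np

def edge_transition_width(angles_deg, intensities, lo=0.1, hi=0.9):
    """Angular width over which the measured intensity rises from 10 % to
    90 % across an edge between a dark and a bright area (smaller = sharper)."""
    a = np.asarray(angles_deg, dtype=float)
    i = np.asarray(intensities, dtype=float)
    i_norm = (i - i.min()) / (i.max() - i.min())  # normalise edge profile to 0..1
    # np.interp expects a monotonically rising profile (scan from dark to bright).
    return float(np.interp(hi, i_norm, a) - np.interp(lo, i_norm, a))

# Example: a smoothed step profile sampled every 0.1 degrees.
angles = np.arange(-1.0, 1.0, 0.1)
profile = 1.0 / (1.0 + np.exp(-angles / 0.15))
print(edge_transition_width(angles, profile))
```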
  • A further advantageous development of the optical sensor provides that a point spread function applied to the test object is evaluated.
  • The point spread function mentioned is a measure of the angular resolution and, in principle, represents an output variable of the measurement carried out.
  • A further advantageous development of the optical sensor provides that a width of a plateau and a width in a lower region of an intensity curve obtained with the point spread function are compared with one another and evaluated.
  • In this way, an evaluation of the intensity curve that is easy to implement mathematically is used to determine the angular resolution of the optical sensor (see the sketch after this item).
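  • The following sketch (illustrative only; the level values and names are assumed) compares the width of the plateau with the width near the base of such an intensity curve; for a sharp, rectangle-like curve the two widths are similar, while strong blurring makes the base much wider than the plateau:

```python
import numpy as np

def plateau_and_base_width(angles_deg, intensities,
                           plateau_level=0.9, base_level=0.1):
    """Angular widths of the intensity curve above 90 % (plateau) and above
    10 % (base) of its normalised maximum."""
    a = np.asarray(angles_deg, dtype=float)
    i = np.asarray(intensities, dtype=float)
    i_norm = (i - i.min()) / (i.max() - i.min())

    def width(mask):
        sel = a[mask]
        return float(sel.max() - sel.min()) if sel.size else 0.0

    return width(i_norm >= plateau_level), width(i_norm >= base_level)

# A plateau/base ratio close to 1 indicates a nearly rectangular, well
# resolved curve; a small ratio indicates strong blurring (poor resolution).
```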
  • A further advantageous development of the proposed optical sensor is characterized in that the optical sensor has a Gaussian or a rectangular point spread function. Both types of point spread function are present in known optical sensors and represent an intrinsic characteristic of the sensor. The point spread function depends primarily on the optical design of the optical sensor and can be used to carry out the proposed method for determining the angular resolution.
  • A further advantageous embodiment of the proposed optical sensor is characterized in that an analysis of an intensity curve over an angle is carried out using the test object.
  • In this way, the aforementioned edge resolution capability of the optical sensor is determined and evaluated in a simple manner.
  • The detected angle can be a horizontal angle and/or a vertical angle in relation to the optical sensor.
  • The optical sensor is, for example, a LiDAR sensor or a camera.
  • An advantageous embodiment of the proposed method is characterized in that a repetition rate of the measurement of the angular resolution is set in a defined manner.
  • In this way, a measurement intensity can advantageously be defined, it being possible to carry out the measurement at a frequency that corresponds to a frame rate of the LiDAR sensor. In normal operation of the optical sensor, this advantageously results in a particularly low computing outlay for determining the angular resolution (a scheduling sketch follows below).
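  • As a sketch of how such a repetition rate could be set in practice (the class name, frame rate and check rate are illustrative assumptions), the check can simply be triggered on every n-th frame:

```python
class ResolutionCheckScheduler:
    """Trigger the angular-resolution check only on every n-th LiDAR frame."""

    def __init__(self, frame_rate_hz=10.0, check_rate_hz=1.0):
        # e.g. frames arriving at ~10 Hz, one resolution check per second
        self.every_n = max(1, round(frame_rate_hz / check_rate_hz))
        self.count = 0

    def due(self):
        """Return True when the current frame should be used for the check."""
        self.count += 1
        return self.count % self.every_n == 0
```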
  • Fig. 1 is a simplified representation of a brightness distribution of a road marking
  • Fig. 4 is a block diagram of a proposed optical sensor
  • Fig. 5 is a flow chart of a proposed method carried out as part of a detection sequence of an optical sensor
  • Fig. 6 is a flow chart of a proposed method for operating an optical sensor
  • A core idea of the present invention is in particular to provide an optical sensor that is able to determine and evaluate its own angular resolution.
  • Road markings are subject to the "Guidelines for Road Markings" and thus have firmly defined line widths, depending on whether the road is a motorway or another road and whether they are guide lines or lane boundary lines.
  • LiDAR sensors can detect road markings because they are more reflective than the road surface. The points of the road markings appear with a higher intensity in the point cloud and are used, for example, to position the vehicle in the lane (a simple extraction sketch follows below).
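  • A minimal sketch of such an intensity-based extraction of road-marking points from a point cloud (the array layout, ground-slice height and threshold are assumptions chosen for illustration):

```python
import numpy as np

def extract_marking_points(points, intensity_threshold=0.6, ground_height=0.1):
    """Select likely road-marking returns from an (N, 4) array of
    x, y, z, intensity values (intensity normalised to 0..1)."""
    ground = points[np.abs(points[:, 2]) < ground_height]   # crude ground slice near z = 0
    return ground[ground[:, 3] > intensity_threshold]       # high-reflectivity returns
```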
  • The proposed LiDAR sensor preferably works with a frame rate of the order of approximately 10 Hz, the frame rate corresponding to a sum of all detector pixels of a detection device 20.
  • The horizontal reflectivity distribution of a white road marking on a dark road surface can be represented approximately as a rectangular function, as indicated in FIGS. 1a, 1b by the progression of the brightness H over a horizontal angle.
  • In the measured data, the road marking appears as a convolution of this rectangular function with the so-called point spread function (PSF) of the LiDAR sensor, as indicated in FIGS. 3a, 3b. A numerical illustration of such a convolution is sketched below.
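  • The following numerical sketch (the sampling, marking width and PSF widths are assumed example values) illustrates the convolution of a rectangular reflectivity profile with a Gaussian or a rectangular PSF:

```python
import numpy as np

# Horizontal angle axis in degrees, sampled every 0.01 degrees.
angle = np.arange(-2.0, 2.0, 0.01)

# Rectangular reflectivity profile of a road marking (assumed 0.5 degrees wide).
marking = np.where(np.abs(angle) < 0.25, 1.0, 0.0)

# Two example point spread functions, each normalised to unit area.
sigma = 0.1                                           # Gaussian PSF width in degrees
psf_gauss = np.exp(-angle**2 / (2 * sigma**2))
psf_gauss /= psf_gauss.sum()
psf_rect = np.where(np.abs(angle) < 0.1, 1.0, 0.0)    # rectangular PSF, 0.2 degrees wide
psf_rect /= psf_rect.sum()

# The measured intensity curve is the convolution of the profile with the PSF.
measured_gauss = np.convolve(marking, psf_gauss, mode="same")
measured_rect = np.convolve(marking, psf_rect, mode="same")
```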
  • The proposed method can either be applied to a single horizontal LiDAR scan plane, or information from several or all scan planes that contain road markings or other suitable test objects (e.g. a transition from lane to shoulder, a crash barrier, a traffic sign, etc.) can be combined. Furthermore, the resolution estimate can be improved by averaging the data from several consecutive frames of the LiDAR sensor.
  • The method indicated in FIG. 5 is therefore proposed, which can easily be incorporated into a typical detection pipeline. Since a ground plane estimation of detected objects is carried out in most detection pipelines anyway, this data can be used directly for determining the angular resolution. If the optical sensor 100 detects an unexpectedly poor resolution, the autonomous vehicle equipped with it can take appropriate measures.
  • In this case, the detection device 20 outputs a control signal S, with which a defined action can be initiated for an at least partially automated vehicle (not shown) equipped with the optical sensor 100, for example a degradation of a driving function, initiation of a braking manoeuvre, bringing the vehicle into a safe state, etc.
  • FIG. 5 shows an exemplary sequence of how the proposed method can be carried out as part of a detection activity of an optical sensor 100.
  • In a step 200, a cloud of points is detected by means of the optical sensor 100; in a step 210, a roadway is determined (ground estimation) by means of the detected cloud of points.
  • In a step 220, an object segmentation is carried out.
  • In a step 230, an object classification takes place, and in a step 240 the detected and classified object is tracked. Steps 200 to 240 thus represent a perception or detection activity of the optical sensor 100 (a compact sketch of such a pipeline, extended by the proposed resolution check, follows below).
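  • A compact sketch of such a pipeline extended by the proposed resolution check (all names and the ground-slice heuristic are assumptions; segmentation, classification and tracking are only indicated):

```python
import numpy as np

def process_frame(points, resolution_check):
    """Sketch of steps 200-240 extended by the proposed resolution check.

    'points' is assumed to be an (N, 4) array of x, y, z, intensity;
    'resolution_check' is a callable returning True if the angular
    resolution measured on the ground points is sufficient."""
    ground_mask = np.abs(points[:, 2]) < 0.1          # step 210: crude ground estimation
    ground, objects = points[ground_mask], points[~ground_mask]
    # Steps 220-240 (segmentation, classification, tracking) are omitted here;
    # only their place in the sequence matters for this sketch.
    control_signal = None if resolution_check(ground) else "S"
    return objects, control_signal
```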
  • Fig. 6 shows a basic sequence of an embodiment of the proposed method for operating an optical sensor 100.
  • In a first step, a test object is detected by sensors.
  • In a further step, a control signal S is output if the determined angular resolution of the optical sensor 100 is not sufficient (a compact sketch of this sequence is given below).
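  • A compact sketch of this operating sequence (the function names and the resolution threshold are purely illustrative assumptions):

```python
def operate_optical_sensor(frame, detect_test_object, measure_resolution_deg,
                           max_allowed_deg=0.2):
    """Detect a test object, check the angular resolution and return the
    control signal "S" if the resolution is insufficient, otherwise None."""
    test_object = detect_test_object(frame)           # e.g. a road-marking edge
    if test_object is None:
        return None                                   # no suitable test object in this frame
    resolution = measure_resolution_deg(test_object)
    return "S" if resolution > max_allowed_deg else None
```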
  • Although the proposed optical sensor 100 is disclosed above primarily in the form of a LiDAR sensor, it is also conceivable to implement the optical sensor 100 in a different technical way, for example as a camera.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Electromagnetism (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Optical Radar Systems And Details Thereof (AREA)

Abstract

The invention relates to an optical sensor (100), comprising: an optically transparent cover device (10); a detection device (20) for detecting an object by sensors; and a checking device (20), by means of which an angular resolution of the optical sensor (100) is checked using a test object detected by the detection device (20), it being possible for a signal (S) to be output by the checking device (20) if the angular resolution is insufficient.
PCT/EP2021/083752 2020-12-07 2021-12-01 Capteur optique WO2022122502A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102020215401.5 2020-12-07
DE102020215401.5A DE102020215401A1 (de) 2020-12-07 2020-12-07 Optischer Sensor

Publications (1)

Publication Number Publication Date
WO2022122502A1 (fr) 2022-06-16

Family

ID=78845094

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2021/083752 WO2022122502A1 (fr) 2020-12-07 2021-12-01 Capteur optique

Country Status (2)

Country Link
DE (1) DE102020215401A1 (fr)
WO (1) WO2022122502A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2565699A2 (fr) * 2011-09-02 2013-03-06 Sick Ag Capteur optoélectronique et un procédé pour détecter des objets dans une zone de surveillance
DE102018118679A1 (de) 2017-08-02 2019-02-07 GM Global Technology Operations LLC Verfahren und vorrichtung zur parallelen aufnahme in ein lidar-array
US20190188498A1 (en) 2016-05-13 2019-06-20 Institut Français Des Sciences Et Technologies Des Transports, De L'aménagement Et Des Réseaux Image Processing Method For Recognizing Ground Marking And System For Detecting Ground Marking
US20200019160A1 (en) * 2018-07-13 2020-01-16 Waymo Llc Vehicle Sensor Verification and Calibration
US20200096637A1 (en) * 2017-04-04 2020-03-26 pmdtechnologies ag Time-of-flight camera

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10272886B2 (en) 2017-09-08 2019-04-30 Ford Global Technologies, Llc Vehicle sensor system
CN110850391B 2019-10-28 2021-11-26 中国人民解放军63963部队 Laser radar performance test device and test method

Also Published As

Publication number Publication date
DE102020215401A1 (de) 2022-06-09

Similar Documents

Publication Publication Date Title
DE3816392C2 (fr)
DE112008001384B4 (de) Method for detecting contamination in a TOF range-imaging camera
EP1303768B1 (fr) Method for determining the visibility range
EP0785883B1 (fr) Visibility range and rain sensor
EP2936049A1 (fr) Device and method for measuring the tread depth of a tyre
DE102010027647A1 (de) Laser-based method for friction coefficient classification in motor vehicles
EP0444402A2 (fr) Method and apparatus for indicating to drivers the visibility limit in fog
DE102010039092B4 (de) Method and control unit for determining a distance of an object from a vehicle
DE102016223068A1 (de) Method for blindness detection in radar sensors for motor vehicles
DE102011105074A1 (de) Method and device for determining a visibility range for a vehicle
DE102019209846A1 (de) Method for operating a 3D distance sensor device
EP3055682A1 (fr) Device and method for measuring panes, in particular vehicle windscreens
WO2022122502A1 (fr) Optical sensor
DE102020214991A1 (de) Optical sensor
EP3591424A1 (fr) 3D time-of-flight camera and method for capturing three-dimensional image data
DE102018201620B4 (de) Device and method for radar-based classification of road surface conditions
DE102019212877A1 (de) LiDAR system, vehicle and method for detecting weather conditions
DE102019107396A1 (de) Detecting and classifying raised road markings using LIDAR
DE102018126506A1 (de) Rain detection with an environment sensor for point-wise sensing of a vehicle's surroundings, in particular with a LiDAR-based environment sensor
DE102021002904A1 (de) Method for determining ageing-related functional impairments of a lidar sensor of a vehicle
DE102022001878B3 (de) Method for detecting degradation of a lidar sensor
WO2007137835A2 (fr) Method and arrangement for determining the optical quality of a transparent pane
EP4241111A1 (fr) Method for determining a change in the range of a lidar sensor
WO2024089080A1 (fr) Method for detecting and classifying raised road markings
DE102020107450A1 (de) Lidar sensor device for a motor vehicle, method for operating a lidar sensor device, and motor vehicle

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21823862

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21823862

Country of ref document: EP

Kind code of ref document: A1