EP4298600A1 - Method and device for detecting and determining the height of objects (Verfahren und Vorrichtung zur Detektion und Höhenbestimmung von Objekten) - Google Patents

Method and device for detecting and determining the height of objects (Verfahren und Vorrichtung zur Detektion und Höhenbestimmung von Objekten)

Info

Publication number
EP4298600A1
EP4298600A1
Authority
EP
European Patent Office
Prior art keywords
environment
detection sensor
environment detection
camera
height
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
EP22709172.5A
Other languages
German (de)
English (en)
French (fr)
Inventor
Dieter KRÖKEL
Jonathan Wache
Stefan Heinrich
Martin PFITZER
Thomas Fechner
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Continental Autonomous Mobility Germany GmbH
Original Assignee
Continental Autonomous Mobility Germany GmbH
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Continental Autonomous Mobility Germany GmbH filed Critical Continental Autonomous Mobility Germany GmbH
Publication of EP4298600A1
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/60 Analysis of geometric attributes
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/70 Determining position or orientation of objects or cameras
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/10 Segmentation; Edge detection
    • G06T7/12 Edge-based segmentation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/20 Image preprocessing
    • G06V10/25 Determination of region of interest [ROI] or a volume of interest [VOI]
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/74 Image or video pattern matching; Proximity measures in feature spaces
    • G06V10/761 Proximity, similarity or dissimilarity measures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/50 Context or environment of the image
    • G06V20/56 Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00 Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/86 Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
    • G01S13/867 Combination of radar systems with cameras
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/86 Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10016 Video; Image sequence
    • G06T2207/10021 Stereoscopic video; Stereoscopic image sequence
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10028 Range image; Depth image; 3D point clouds
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/20 Special algorithmic details
    • G06T2207/20084 Artificial neural networks [ANN]
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30248 Vehicle exterior or interior
    • G06T2207/30252 Vehicle exterior; Vicinity of vehicle
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V2201/00 Indexing scheme relating to image or video recognition or understanding
    • G06V2201/07 Target detection

Definitions

  • The invention relates to a method and a device for detecting and determining the height of objects in a vehicle environment.
  • LIDAR systems and stereo camera systems are currently being developed for this task, as they make it possible to map the environment in 3D. In this way, the size and distance of obstacles and of the free space can be recorded at the same time.
  • Stereo camera systems offer a significantly higher lateral resolution than LIDAR systems, while LIDAR systems allow very precise distance determination.
  • Previous stereo methods are mainly based on the SGM (Semi-Global Matching) method.
  • In addition, machine learning methods based on deep neural networks are used, which are trained with ground-truth data from LIDAR measurements.
  • High-resolution LIDAR systems still fall short of the required angular resolution by a factor of 3 to 6 and are therefore not sufficiently suited to this task.
  • High-resolution camera systems with an 8 MPix image sensor and a lens with a 30° opening angle offer an angular resolution of 130 pixels/degree and image a 10 cm high object at a distance of 100 m onto about 8 pixels.
  • A stereo camera system consisting of two such high-resolution cameras could therefore, in theory, detect such a small object at a distance of 100 m.
  • However, the signal-to-noise ratio in the depth map obtained from the stereo images is usually too poor to unambiguously detect the object and at the same time determine its height.
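  • To make the resolution argument above concrete, here is a minimal Python sketch of the arithmetic; the numbers are taken from the description, the code itself is merely illustrative:
```python
import math

# Assumed values from the description: an 8 MPix camera with a
# 30 degree opening angle gives roughly 130 pixels per degree.
PX_PER_DEG = 130.0

object_height_m = 0.10  # a 10 cm high object
distance_m = 100.0      # at a distance of 100 m

# Angle subtended by the object, converted to pixels on the sensor.
subtended_deg = math.degrees(math.atan2(object_height_m, distance_m))
print(f"object covers about {subtended_deg * PX_PER_DEG:.1f} pixels")
# -> about 7.4 pixels, i.e. roughly the 8 pixels stated above
```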
  • A method for detecting and determining the height of objects using an environment detection system comprising a first and a second environment detection sensor of a vehicle, at least one of the environment detection sensors being a camera, is therefore proposed; its steps are described below with reference to FIG. 1.
  • The camera is preferably a high-resolution telephoto camera.
  • For example, this camera has a resolution of 8 MPix, an opening angle of 30°, and an angular resolution of 130 pixels/degree.
  • The second environment detection sensor is, for example, a stereo camera.
  • In this embodiment, the environment detection system would be designed as a stereo camera, and the mono camera could be a component of the stereo system. It would also be conceivable to use a single mono camera and a separate stereo camera.
  • The environment representation of the second environment detection sensor would then be a depth image. It would also be conceivable to use a radar or lidar sensor as the second environment detection sensor.
  • In the case of a radar sensor, the environment representation would be an object list or a radar signature list, in which detected objects or radar signatures and their distances from the vehicle are entered. This data can then be merged with the mono image.
  • With a lidar sensor as the second environment detection sensor, the environment representation would be a point cloud.
  • For the sake of readability, the singular is used for the object. It goes without saying, however, that several objects can also be detected.
  • The two object detection steps can take place simultaneously or sequentially in any order. For example, the object can first be detected in the environment representation of the second environment detection sensor and only then in the mono image; the reverse order or a simultaneous process would likewise be conceivable.
  • The detection of the object in the mono image yields the size of the object in pixels.
  • The distance to the object is preferably measured by means of the second environment detection sensor, in particular if this is a radar or lidar sensor.
  • The height of the object is determined from the height of the object in pixels in the mono image, the determined distance of the object, and the known angular resolution of the environment detection sensor.
  • The angular resolution of the respective sensor follows from its specific sensor data.
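  • The following minimal sketch illustrates this relationship; `estimate_object_height` is a hypothetical helper name, and the simple pinhole geometry is an assumption for illustration, not a formula prescribed by the patent:
```python
import math

def estimate_object_height(height_px: float, distance_m: float,
                           px_per_deg: float) -> float:
    """Object height in metres from its height in pixels, its measured
    distance, and the known angular resolution of the camera."""
    subtended_deg = height_px / px_per_deg  # angle covered by the object
    return distance_m * math.tan(math.radians(subtended_deg))

# Example with the values used above: 8 px at 100 m with 130 px/deg.
print(f"{estimate_object_height(8, 100.0, 130.0):.3f} m")  # -> about 0.107 m
```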
  • The object detection in the mono image is carried out using semantic segmentation based on a trained convolutional neural network (CNN).
  • This is advantageous since semantic segmentation detects objects efficiently; a correspondingly trained CNN not only improves the detection of objects but also allows the shape of the object to be determined. This information can help to decide whether an object could, if necessary, be driven over without damaging the vehicle.
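  • A minimal sketch of how the object's pixel height, needed for the height determination above, could be read off a segmentation mask; the numpy-based helper and the example mask are illustrative assumptions:
```python
import numpy as np

def object_height_px(mask: np.ndarray) -> int:
    """Pixel height of the object: number of image rows in which the
    (hypothetical) segmentation mask marks at least one object pixel."""
    rows = np.flatnonzero(mask.any(axis=1))
    return int(rows[-1] - rows[0] + 1) if rows.size else 0

# Example: a 5x5 mask whose object occupies rows 1..3 -> height 3 px.
m = np.zeros((5, 5), dtype=bool)
m[1:4, 2] = True
print(object_height_px(m))  # -> 3
```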
  • The object is confirmed by comparing the mono image and the second environment representation: in this embodiment, it is checked whether the positions of the detections match. If they match, the object is unequivocally confirmed.
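  • A minimal sketch of this confirmation step, assuming both detections carry an azimuth angle relative to the vehicle's longitudinal axis; the function name and tolerance are illustrative assumptions:
```python
def confirm_object(mono_azimuth_deg: float, second_azimuth_deg: float,
                   tolerance_deg: float = 0.5) -> bool:
    """Confirm an object only if the mono-image detection and the detection
    of the second environment detection sensor lie at matching positions."""
    return abs(mono_azimuth_deg - second_azimuth_deg) <= tolerance_deg

# Example: detections at 3.2 and 3.4 degrees azimuth match within tolerance.
print(confirm_object(3.2, 3.4))  # -> True
```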
  • The object detection of one of the environment detection sensors defines an area of interest for the other environment detection sensor.
  • For example, the object detection of the second environment detection sensor, such as a radar sensor, defines a region of interest (ROI), i.e. a search area, for the camera. Thus only this specific area in the mono image is examined for an object, which leads to more efficient object detection. Conversely, it would also be conceivable that the object detection in the mono image allows an increased sensitivity in the radar search field, so that even very weak radar detections can be recognized as objects. Here, too, this leads to more efficient and, above all, more reliable object detection.
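  • One way such a radar-defined search area could be mapped into the mono image, assuming a simple linear angle-to-pixel model; all names and the ROI width are assumptions for illustration:
```python
def radar_to_roi(azimuth_deg: float, image_width_px: int,
                 px_per_deg: float, roi_half_width_px: int = 64) -> tuple:
    """Column range in the mono image to be searched for the object
    reported by the radar sensor (image centre corresponds to 0 degrees)."""
    centre_x = image_width_px / 2 + azimuth_deg * px_per_deg
    left = max(0, int(centre_x) - roi_half_width_px)
    right = min(image_width_px, int(centre_x) + roi_half_width_px)
    return left, right

# Example: a detection at 2 degrees azimuth in a 3840 px wide image.
print(radar_to_roi(2.0, 3840, 130.0))  # -> (2116, 2244)
```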
  • Furthermore, an environment detection system for a vehicle is proposed, comprising a first and a second environment detection sensor with a specific angular resolution, and a computing unit, at least the first environment detection sensor being designed as a camera. A mono image is recorded with the camera, and a further environment representation is recorded with the second environment detection sensor. The computing unit is configured to detect an object in the mono image and in the environment representation of the second environment detection sensor, and is further configured to determine the distance and the height of the object.
  • The first environment detection sensor is a telephoto camera and the second environment detection sensor is a stereo camera, a radar sensor or a lidar sensor.
  • The second environment detection sensor is designed as a stereo camera, with the telephoto camera being a component of the stereo camera.
  • The second environment detection sensor would thus be the stereo camera, and the first would still be a mono camera.
  • The stereo camera serves as the height-determining sensor by determining the height in a depth image.
  • FIG. 2 shows a schematic representation of an environment detection system according to an embodiment of the invention.
  • FIG. 4 shows an example of a depth image of the scene shown in FIG. 3.
  • FIG. 1 shows a schematic flowchart of a preferred embodiment of the method for detecting and determining the height of objects using an environment detection system comprising a first and a second environment detection sensor of a vehicle, at least one of the environment detection sensors being a mono camera.
  • In a step S1, a mono image is recorded by the camera.
  • In a step S2, an environment representation is recorded with the second environment detection sensor.
  • In a step S3, an object detection is carried out in the mono image.
  • In a step S4, an object is detected in the environment representation of the second environment detection sensor.
  • Steps S3 and S4 can be carried out one after the other in any order, i.e. S4 before S3 or S3 before S4, or at the same time.
  • In a step S5, the distance to the object is determined in the environment representation of the second environment detection sensor.
  • In a step S6, the height of the detected object is determined.
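  • Taken together, steps S1 to S6 could be sketched as the following pipeline. Every function and attribute name here is a hypothetical placeholder (the patent specifies the steps, not an implementation); `estimate_object_height` is the helper sketched earlier, and `detect_objects` and `match_detections` stand in for the detection and confirmation logic:
```python
def detect_and_measure(camera, second_sensor, px_per_deg):
    mono_image = camera.capture()                  # S1: record the mono image
    environment = second_sensor.capture()          # S2: record the second
                                                   #     environment representation
    mono_objects = detect_objects(mono_image)      # S3: e.g. semantic segmentation
    sensor_objects = detect_objects(environment)   # S4: may run before, after
                                                   #     or in parallel with S3
    results = []
    for obj, match in match_detections(mono_objects, sensor_objects):
        distance_m = match.distance_m              # S5: distance from second sensor
        height_m = estimate_object_height(         # S6: height from pixel height,
            obj.height_px, distance_m, px_per_deg) #     distance, angular resolution
        results.append((obj, distance_m, height_m))
    return results
```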
  • FIG. 2 shows a schematic representation of an environment detection system 1 according to an embodiment of the invention.
  • The environment detection system 1 includes a first environment detection sensor 2a, which is preferably designed as a camera, and a second environment detection sensor 2b. These two environment detection sensors 2a, 2b are connected to a computing unit 3 via a data connection D.
  • The computing unit 3 is designed to carry out object detection in the mono image and in the environment representation of the second environment detection sensor, and is further designed to determine the distance and the height of the object.
  • the data connection D can be wired or wireless.
  • FIG. 3 shows a representation of an exemplary scene.
  • This representation shows a section of a roadway.
  • The representation here is a mono image M.
  • The objects O1, O2, O3 are detected in the mono image M, for example by means of a semantic segmentation.
  • FIG. 4 shows an exemplary representation of a depth image recorded by means of a stereo camera from the scene shown in FIG. 3.
  • Here, the second environment detection sensor is designed as a stereo camera.
  • In the depth image T, the objects O1, O2 and O3 are visible as elevations.
  • The distance to the objects O1, O2, O3 can also be determined in the depth image T.
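  • A minimal sketch of reading the distance to a detected object off the depth image T; taking the median over the object's mask is one robust choice assumed here for illustration, not a step prescribed by the patent:
```python
import numpy as np

def object_distance_m(depth_image: np.ndarray, mask: np.ndarray) -> float:
    """Distance to an object: median of the depth values (in metres) inside
    the (hypothetical) object mask, a simple robust estimate."""
    return float(np.median(depth_image[mask]))
```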

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Geometry (AREA)
  • Electromagnetism (AREA)
  • Artificial Intelligence (AREA)
  • Health & Medical Sciences (AREA)
  • Computing Systems (AREA)
  • Databases & Information Systems (AREA)
  • Evolutionary Computation (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Software Systems (AREA)
  • Image Processing (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Measurement Of Optical Distance (AREA)
  • Optical Radar Systems And Details Thereof (AREA)
  • Image Analysis (AREA)
EP22709172.5A 2021-02-24 2022-01-27 Verfahren und vorrichtung zur detektion und höhenbestimmung von objekten Pending EP4298600A1 (de)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102021201713.4A DE102021201713A1 (de) 2021-02-24 2021-02-24 Verfahren und Vorrichtung zur Detektion und Höhenbestimmung von Objekten
PCT/DE2022/200009 WO2022179669A1 (de) 2021-02-24 2022-01-27 Verfahren und vorrichtung zur detektion und höhenbestimmung von objekten

Publications (1)

Publication Number Publication Date
EP4298600A1 (de) 2024-01-03

Family

ID=80684868

Family Applications (1)

Application Number Title Priority Date Filing Date
EP22709172.5A Pending EP4298600A1 (de) 2021-02-24 2022-01-27 Verfahren und vorrichtung zur detektion und höhenbestimmung von objekten

Country Status (7)

Country Link
US (1) US20240135720A1 (en)
EP (1) EP4298600A1 (de)
JP (1) JP2024505422A (ja)
KR (1) KR20230118974A (ko)
CN (1) CN116888630A (zh)
DE (1) DE102021201713A1 (de)
WO (1) WO2022179669A1 (de)

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7356408B2 (en) 2003-10-17 2008-04-08 Fuji Jukogyo Kabushiki Kaisha Information display apparatus and information display method
US8605947B2 (en) * 2008-04-24 2013-12-10 GM Global Technology Operations LLC Method for detecting a clear path of travel for a vehicle enhanced by object detection
DE102011054852B4 (de) * 2011-07-30 2024-05-16 Götting KG Verfahren zur Erfassung und Bewertung einer Ebene
CN107563256A (zh) 2016-06-30 2018-01-09 北京旷视科技有限公司 辅助驾驶信息产生方法及装置、辅助驾驶系统
WO2021016596A1 (en) 2019-07-25 2021-01-28 Nvidia Corporation Deep neural network for segmentation of road scenes and animate object instances for autonomous driving applications
DE102020007057A1 (de) * 2020-11-19 2021-01-14 Daimler Ag Fahrzeug und Verfahren zur Bestimmung einer Durchfahrtshöhe

Also Published As

Publication number Publication date
DE102021201713A1 (de) 2022-08-25
US20240135720A1 (en) 2024-04-25
WO2022179669A1 (de) 2022-09-01
KR20230118974A (ko) 2023-08-14
CN116888630A (zh) 2023-10-13
JP2024505422A (ja) 2024-02-06


Legal Events

Date Code Title Description
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: UNKNOWN

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE

PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20230925

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

DAV Request for validation of the european patent (deleted)
DAX Request for extension of the european patent (deleted)