EP2396746A2 - Procédé de détection d'objets - Google Patents

Procédé de détection d'objets

Info

Publication number
EP2396746A2
Authority
EP
European Patent Office
Prior art keywords
determined
segments
segment
height
distance
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP10703018A
Other languages
German (de)
English (en)
Inventor
Hernan Badino
Uwe Franke
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Mercedes-Benz Group AG
Original Assignee
Daimler AG
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Daimler AG filed Critical Daimler AG
Publication of EP2396746A2 publication Critical patent/EP2396746A2/fr

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/56Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V20/58Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads

Definitions

  • the invention relates to a method for object detection according to the preamble of claim 1.
  • a distance image is determined by means of a sensor system via horizontal and vertical angles, wherein a depth map of an environment is determined from the distance image.
  • a free space boundary line is identified that bounds an obstacle-free area of the environment; the depth map is segmented outside of and along the free space boundary line by forming segments of suitable, equal width from pixels of equal or similar distance to a plane, and a height of each segment is estimated as part of an object located outside the obstacle-free area, so that each segment is characterized by the two-dimensional position of its foot point (given, for example, by distance and angle to a vehicle longitudinal axis) and by its height.
  • the three-dimensional environment described by the distance image and the depth map is approximated by the obstacle-free area (also called the free space area).
  • the obstacle-free area is, for example, a drivable area, which, however, does not necessarily have to be planar.
  • the obstacle-free area is bounded by the rod-like segments, which in their entirety model the objects surrounding the obstacle-free area. In the simplest case these segments stand on the ground and approximate a mean height of the object in the region of the respective segment. Objects of varying height, such as a cyclist seen from the side, are thus described by a piecewise constant height function.
  • the resulting segments represent a compact and robust representation of the objects and require only a limited amount of data, regardless of the density of the stereo correspondence analysis used to create the depth map.
  • Position and height are stored for each stixel. This representation is optimally suited for any subsequent steps, such as object formation and scene interpretation.
  • the stixel representation represents an ideal interface between application-independent stereo analysis and application-specific evaluations.
  • Fig. 1 is a two-dimensional representation of an environment with a free space boundary line and a number of segments for modeling objects in the environment.
  • FIG. 1 shows a two-dimensional representation of an environment 1 with a free-space delimitation line 2 and a number of segments 3 for modeling objects 4.1 to 4.6 in the environment 1.
  • the segments 3, or stixels, model the objects 4.1 to 4.6 that bound the free space delimited by the free-space boundary line 2.
  • a method is used in which two images of an environment are recorded and a disparity image is determined by means of stereo image processing.
  • for the stereo image processing, for example, the method described in [H. Hirschmüller: "Accurate and Efficient Stereo Processing by Semi-Global Matching and Mutual Information", CVPR 2005, San Diego, CA, Volume 2, June 2005, pp. 807-814] can be used.
  • a depth map of the environment is determined, for example as described in [H. Badino, U. Franke, R. Mester: "Free Space Computation Using Stochastic Occupancy Grids and Dynamic Programming", Workshop on Dynamical Vision, ICCV 2007, Rio de Janeiro, Brazil].
  • the free space boundary line 2 is identified which delimits the obstacle-free area of the surroundings 1.
  • the depth map is segmented by forming the segments 3 of predetermined width from pixels of equal or similar distance to an image plane of a camera or of multiple cameras.
  • the segmentation may be accomplished, for example, by the method described in [H. Badino, U. Franke, R. Mester: "Free Space Computation Using Stochastic Occupancy Grids and Dynamic Programming", Workshop on Dynamical Vision, ICCV 2007, Rio de Janeiro, Brazil].
  • An approximation of the found free space boundary line 2 by segments 3 (bars, stixels) of predetermined width provides the distance of each segment; with a known orientation of the camera relative to the environment (for example, a road in front of a vehicle on which the camera is mounted) and the known 3D course of the boundary line, the respective base point of each segment 3 in the image follows.
  • each segment 3 is characterized by a two-dimensional position of a foot point and its height.
  • Height estimation is most easily accomplished by a histogram-based analysis of all 3D points in the segment area; this step can also be solved by dynamic programming (a simplified sketch of such a per-segment height estimation is given after this list).
  • Areas that have no segments 3 are those in which no objects were found by the free space analysis.
  • Multiple images can be sequentially acquired and processed, and from changes in the depth map and disparity image, motion information can be extracted and assigned to segments 3.
  • moving scenes can also be represented and, for example, used to predict an expected movement of the objects 4.1 to 4.6.
  • This kind of following objects over time is also referred to as tracking.
  • the vehicle's own motion (ego-motion) can be determined and used for compensation.
  • the compactness and robustness of the segments 3 results from the integration of many pixels in the area of the segment 3 and - in the tracking variant - from the additional integration over time.
  • the membership of each of the segments 3 to one of the objects 4.1 to 4.6 can also be stored with the remaining information about each segment. However, this is not mandatory.
  • the movement information can be obtained, for example, by integration of the optical flow, so that a real movement can be estimated for each of the segments 3.
  • Corresponding methods are known, for example, from work on 6D-Vision published in DE 102005008131 A1. This motion information further simplifies the grouping into objects, since compatible movements can be checked.
  • the position of the foot point, the height and the motion information of the segment 3 can be determined by means of Scene Flow.
  • Scene flow refers to a class of methods that attempt to determine, for as many pixels as possible, the correct motion in space plus the 3D position from at least two consecutive stereo image pairs; see [Sundar Vedula, Simon Baker, Peter Rander, Robert Collins, and Takeo Kanade: "Three-Dimensional Scene Flow", 7th International Conference on Computer Vision (ICCV), Corfu, Greece, September 1999].
  • information for a driver assistance system can be generated in a vehicle on which the cameras are arranged.
  • a remaining time until collision of the vehicle with an object 4.1 to 4.6 formed from segments 3 can be estimated (a simple sketch of such a time-to-collision estimate also follows this list).
  • a driving corridor 5 can be placed in the obstacle-free area to be used by the vehicle, wherein a lateral distance of at least one of the objects 4.1 to 4.6 to the driving corridor 5 is determined.
  • Information from other sensors can be combined with the information associated with the segments 3 in the driver assistance system (sensor fusion).
  • for this purpose, active sensors such as a LIDAR can be used, for example.
  • the segments 3 have unique neighborhood relationships, which makes them very easy to group into objects 4.1 to 4.6. In the simplest case, only distance and height need to be transmitted for each segment 3; with a known segment width, the angle (the column in the image) results from the index (a sketch of this conversion into a foot point follows this list).
  • the distance image can be determined by means of any sensor system over horizontal and vertical angles, the depth map of the environment being determined from the distance image.
  • two images of the surroundings (1) can each be recorded by means of one camera and a disparity image can be determined by means of stereo image processing, the distance image and the depth map being determined from the disparities determined.
  • a photonic mixer device and / or a three-dimensional camera and / or a lidar and / or a radar can be used as the sensor system.
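
The following Python sketch (not part of the patent text; the function name, parameters and thresholds are illustrative assumptions) indicates how the per-segment height estimation referenced above could look in a simplified form: for each fixed-width column band above the free space boundary, pixels whose distance is similar to the distance at the foot point are collected, and the topmost occupied image row yields the segment height.

    import numpy as np

    def estimate_stixels(depth_map, free_space_row, stixel_width=5,
                         depth_tolerance=0.5, min_points=20):
        # depth_map: H x W array of metric distances per pixel (e.g. from stereo).
        # free_space_row: for each image column, the row of the free space
        #                 boundary line, i.e. the foot point of a potential object.
        # Returns one (column_center, foot_row, top_row, distance) tuple per stixel.
        height, width = depth_map.shape
        stixels = []
        for u0 in range(0, width, stixel_width):
            band = slice(u0, min(u0 + stixel_width, width))
            foot_row = int(np.median(free_space_row[band]))
            base_depth = np.nanmedian(depth_map[foot_row, band])
            if not np.isfinite(base_depth):
                continue
            # Pixels above the foot point whose distance is close to the foot
            # point distance are assumed to belong to the same object.
            roi = depth_map[:foot_row, band]
            similar = np.abs(roi - base_depth) < depth_tolerance
            if similar.sum() < min_points:
                continue  # no object found above the free space in this band
            top_row = int(np.where(similar.any(axis=1))[0].min())
            stixels.append((u0 + stixel_width // 2, foot_row, top_row, float(base_depth)))
        return stixels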
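As noted in the list above, with a known segment width the viewing angle of a segment follows from its column index; together with the measured distance this yields the two-dimensional foot point. A minimal sketch under a simple pinhole-camera assumption (focal length and principal point are placeholder values, not taken from the patent):

    import math

    def stixel_foot_point(stixel_index, distance, stixel_width=5,
                          focal_length_px=800.0, principal_point_x=640.0):
        # Image column at the centre of the stixel, derived from index and width.
        u = stixel_index * stixel_width + stixel_width / 2.0
        # Horizontal angle relative to the optical axis (approximately the
        # vehicle longitudinal axis if the camera looks straight ahead).
        angle = math.atan2(u - principal_point_x, focal_length_px)
        # Two-dimensional foot point in vehicle coordinates.
        longitudinal = distance * math.cos(angle)
        lateral = distance * math.sin(angle)
        return angle, longitudinal, lateral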
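Where motion information is available for the tracked segments, a remaining time until collision can be approximated. The following sketch assumes a constant closing speed taken from the segments' motion estimate and only illustrates the idea; it is not the method claimed in the patent.

    def time_to_collision(distance_m, closing_speed_mps, min_closing_speed=0.1):
        # Remaining time until collision with an object modelled by tracked
        # stixels, assuming the closing speed stays constant.
        # Returns None when the object is not noticeably approaching.
        if closing_speed_mps < min_closing_speed:
            return None
        return distance_m / closing_speed_mps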

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Traffic Control Systems (AREA)
  • Image Processing (AREA)
  • Image Analysis (AREA)

Abstract

The invention relates to a method for object detection in which two images of the surroundings (1) are recorded and a disparity image is determined by means of stereo image processing, a depth map of the surroundings (1) being determined from the calculated disparities. In this depth map a free space boundary line (2) is identified which delimits an obstacle-free area of the surroundings (1), and the depth map is segmented outside of and along the free space boundary line (2) by forming segments (3) of a suitable width from pixels of identical or similar distance to an image plane, a height of each segment (3) being estimated as part of an object (4.1 to 4.6) located outside the obstacle-free area, so that each segment (3) is characterized by the two-dimensional position of its foot point (given, for example, by the distance and the angle to the vehicle longitudinal axis) and by its height.
EP10703018A 2009-02-16 2010-02-04 Procédé de détection d'objets Withdrawn EP2396746A2 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102009009047A DE102009009047A1 (de) 2009-02-16 2009-02-16 Verfahren zur Objektdetektion
PCT/EP2010/000671 WO2010091818A2 (fr) 2009-02-16 2010-02-04 Procédé de détection d'objets

Publications (1)

Publication Number Publication Date
EP2396746A2 true EP2396746A2 (fr) 2011-12-21

Family

ID=42338731

Family Applications (1)

Application Number Title Priority Date Filing Date
EP10703018A Withdrawn EP2396746A2 (fr) 2009-02-16 2010-02-04 Procédé de détection d'objets

Country Status (5)

Country Link
US (1) US8548229B2 (fr)
EP (1) EP2396746A2 (fr)
CN (1) CN102317954B (fr)
DE (1) DE102009009047A1 (fr)
WO (1) WO2010091818A2 (fr)

Families Citing this family (55)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10776635B2 (en) 2010-09-21 2020-09-15 Mobileye Vision Technologies Ltd. Monocular cued detection of three-dimensional structures from depth images
JP5316572B2 (ja) * 2011-03-28 2013-10-16 トヨタ自動車株式会社 物体認識装置
DE102011111440A1 (de) 2011-08-30 2012-06-28 Daimler Ag Verfahren zur Umgebungsrepräsentation
CN103164851B (zh) * 2011-12-09 2016-04-20 株式会社理光 道路分割物检测方法和装置
DE102012000459A1 (de) 2012-01-13 2012-07-12 Daimler Ag Verfahren zur Objektdetektion
US20150022664A1 (en) 2012-01-20 2015-01-22 Magna Electronics Inc. Vehicle vision system with positionable virtual viewpoint
US8824733B2 (en) 2012-03-26 2014-09-02 Tk Holdings Inc. Range-cued object segmentation system and method
US8768007B2 (en) 2012-03-26 2014-07-01 Tk Holdings Inc. Method of filtering an image
CN103390164B (zh) * 2012-05-10 2017-03-29 南京理工大学 基于深度图像的对象检测方法及其实现装置
TWI496090B (zh) * 2012-09-05 2015-08-11 Ind Tech Res Inst 使用深度影像的物件定位方法與裝置
US9349058B2 (en) 2012-10-31 2016-05-24 Tk Holdings, Inc. Vehicular path sensing system and method
DE102012021617A1 (de) 2012-11-06 2013-05-16 Daimler Ag Verfahren zur Objektdetektion
US20140139632A1 (en) * 2012-11-21 2014-05-22 Lsi Corporation Depth imaging method and apparatus with adaptive illumination of an object of interest
CN103871042B (zh) * 2012-12-12 2016-12-07 株式会社理光 基于视差图的视差方向上连续型物体检测方法和装置
WO2014152470A2 (fr) 2013-03-15 2014-09-25 Tk Holdings, Inc. Détection de trajectoire utilisant un éclairage structuré
CN104723953A (zh) * 2013-12-18 2015-06-24 青岛盛嘉信息科技有限公司 一种行人检测装置
GB201407643D0 (en) * 2014-04-30 2014-06-11 Tomtom Global Content Bv Improved positioning relatie to a digital map for assisted and automated driving operations
EP3283843B1 (fr) * 2015-04-01 2024-01-10 Vayavision Sensing Ltd. Génération de cartes en trois dimensions d'une scène à l'aide de mesures passives et actives
US9928430B2 (en) * 2015-04-10 2018-03-27 GM Global Technology Operations LLC Dynamic stixel estimation using a single moving camera
EP3998455A1 (fr) 2015-08-03 2022-05-18 TomTom Global Content B.V. Procédés et systèmes de génération et d'utilisation de données de référence de localisation
JP6326641B2 (ja) 2015-08-21 2018-05-23 パナソニックIpマネジメント株式会社 画像処理装置および画像処理方法
US9761000B2 (en) * 2015-09-18 2017-09-12 Qualcomm Incorporated Systems and methods for non-obstacle area detection
US10482331B2 (en) * 2015-11-20 2019-11-19 GM Global Technology Operations LLC Stixel estimation methods and systems
CN106909141A (zh) * 2015-12-23 2017-06-30 北京机电工程研究所 障碍物探测定位装置及避障系统
KR101795270B1 (ko) 2016-06-09 2017-11-07 현대자동차주식회사 장애물의 지면경계 정보를 이용한 물체 측면 검출 방법 및 장치
CN105974938B (zh) * 2016-06-16 2023-10-03 零度智控(北京)智能科技有限公司 避障方法、装置、载体及无人机
US10321114B2 (en) * 2016-08-04 2019-06-11 Google Llc Testing 3D imaging systems
EP3293668B1 (fr) * 2016-09-13 2023-08-30 Arriver Software AB Système de vision et procédé pour véhicule à moteur
US10535142B2 (en) 2017-01-10 2020-01-14 Electronics And Telecommunication Research Institute Method and apparatus for accelerating foreground and background separation in object detection using stereo camera
US10445928B2 (en) 2017-02-11 2019-10-15 Vayavision Ltd. Method and system for generating multidimensional maps of a scene using a plurality of sensors of various types
US10474908B2 (en) * 2017-07-06 2019-11-12 GM Global Technology Operations LLC Unified deep convolutional neural net for free-space estimation, object detection and object pose estimation
JP6970577B2 (ja) * 2017-09-29 2021-11-24 株式会社デンソー 周辺監視装置および周辺監視方法
DE102017123984A1 (de) 2017-10-16 2017-11-30 FEV Europe GmbH Fahrerassistenzsystem mit einem Nanodraht zur Erfassung eines Objektes in einem Umfeld eines Fahrzeugs
DE102017123980A1 (de) 2017-10-16 2017-11-30 FEV Europe GmbH Fahrerassistenzsystem mit einer frequenzgesteuerten Ausrichtung eines Senders zur Erfassung eines Objektes in einem Umfeld eines Fahrzeugs
DE102018202244A1 (de) 2018-02-14 2019-08-14 Robert Bosch Gmbh Verfahren zur Abbildung der Umgebung eines Fahrzeugs
DE102018202753A1 (de) 2018-02-23 2019-08-29 Audi Ag Verfahren zur Ermittlung einer Entfernung zwischen einem Kraftfahrzeug und einem Objekt
DE102018114987A1 (de) 2018-06-21 2018-08-09 FEV Europe GmbH Fahrerassistenzsystem zur Bestimmung einer Farbe eines Objektes in einer Fahrzeugumgebung
DE102018005969A1 (de) 2018-07-27 2020-01-30 Daimler Ag Verfahren zum Betreiben eines Fahrerassistenzsvstems mit zwei Erfassungseinrichtungen
DE102018214875A1 (de) * 2018-08-31 2020-03-05 Audi Ag Verfahren und Anordnung zum Erzeugen einer Umgebungsrepräsentation eines Fahrzeugs und Fahrzeug mit einer solchen Anordnung
DE102018128538A1 (de) 2018-11-14 2019-01-24 FEV Europe GmbH Fahrerassistenzsystem mit einem Sender mit einer frequenzgesteuerten Abstrahlrichtung und einem Konverter zur Frequenzangleichung
DE102019107310A1 (de) 2019-03-21 2019-06-19 FEV Europe GmbH Fahrerassistenzsystem zur Erkennung fremder Signale
DE102019211582A1 (de) * 2019-08-01 2021-02-04 Robert Bosch Gmbh Verfahren zum Erstellen einer Höhenkarte
CN110659578A (zh) * 2019-08-26 2020-01-07 中国电子科技集团公司电子科学研究院 基于检测跟踪技术的客流统计方法、系统及设备
US11669092B2 (en) * 2019-08-29 2023-06-06 Rockwell Automation Technologies, Inc. Time of flight system and method for safety-rated collision avoidance
EP3882813A1 (fr) 2020-03-20 2021-09-22 Aptiv Technologies Limited Procédé de génération d'un réseau d'occupation dynamique
EP3905106A1 (fr) 2020-04-27 2021-11-03 Aptiv Technologies Limited Procédé de détermination d'une zone carrossable
EP3905105A1 (fr) 2020-04-27 2021-11-03 Aptiv Technologies Limited Procédé pour déterminer un espace exempt de collision
DE102020208068A1 (de) 2020-06-30 2021-12-30 Robert Bosch Gesellschaft mit beschränkter Haftung Verfahren zur Erkennung eines in einem Überwachungsbereich erscheinenden Objekts, Computerprogramm, Speichermedium und Steuereinrichtung
DE102020208066B3 (de) 2020-06-30 2021-12-23 Robert Bosch Gesellschaft mit beschränkter Haftung Verfahren Objekterkennung Computerprogramm, Speichermedium und Steuereinrichtung
CA3125623C (fr) 2020-07-21 2023-06-27 Leddartech Inc. Dispositif de pointage de faisceau, en particulier pour des systemes lidar
WO2022016277A1 (fr) 2020-07-21 2022-01-27 Leddartech Inc. Systèmes et procédés pour lidar grand angle faisant appel à une optique de grossissement non uniforme
CA3125718C (fr) 2020-07-21 2023-10-03 Leddartech Inc. Dispositifs de pointage de faisceau et methodes pour des applications lidar
DE102020210816A1 (de) * 2020-08-27 2022-03-03 Robert Bosch Gesellschaft mit beschränkter Haftung Verfahren zur Erkennung dreidimensionaler Objekte, Computerprogramm, Maschinenlesbares Speichermedium, Steuergerät, Fahrzeug und Videoüberwachungssystem
EP4009228A1 (fr) * 2020-12-02 2022-06-08 Aptiv Technologies Limited Méthode pour déterminer un espace libre sémantique
DE102022115447A1 (de) 2022-06-21 2023-12-21 Bayerische Motoren Werke Aktiengesellschaft Verfahren und Assistenzsystem zum Unterstützen einer Fahrzeugführung basierend auf einem Fahrschlauch und einer Begrenzungsschätzung und Kraftfahrzeug

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB9913687D0 (en) * 1999-06-11 1999-08-11 Canon Kk Image processing apparatus
CN1343551A (zh) * 2000-09-21 2002-04-10 上海大学 机器视觉分层模块结构模型
WO2005020147A1 (fr) * 2003-08-21 2005-03-03 Philips Intellectual Property & Standards Gmbh Dispositif et procede permettant de fondre deux images
US8330814B2 (en) * 2004-07-30 2012-12-11 Panasonic Corporation Individual detector and a tailgate detection device
EP1668384B1 (fr) * 2004-09-17 2008-04-16 Matsushita Electric Works, Ltd. Capteur d'images à gamme de longueur
DE102005008131A1 (de) 2005-01-31 2006-08-03 Daimlerchrysler Ag Objektdetektion auf Bildpunktebene in digitalen Bildsequenzen
JP4797794B2 (ja) 2006-05-24 2011-10-19 日産自動車株式会社 歩行者検出装置および歩行者検出方法
US8385599B2 (en) * 2008-10-10 2013-02-26 Sri International System and method of detecting objects

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See references of WO2010091818A2 *

Also Published As

Publication number Publication date
DE102009009047A1 (de) 2010-08-19
US8548229B2 (en) 2013-10-01
CN102317954A (zh) 2012-01-11
WO2010091818A3 (fr) 2011-10-20
CN102317954B (zh) 2014-09-24
US20110311108A1 (en) 2011-12-22
WO2010091818A2 (fr) 2010-08-19

Similar Documents

Publication Publication Date Title
EP2396746A2 (fr) Procédé de détection d'objets
DE102013209415B4 (de) Dynamische Hinweisüberlagerung mit Bildbeschneidung
DE102007002419B4 (de) Fahrzeugumgebungsüberwachungsgerät, -verfahren und -programm
DE69937699T2 (de) Vorrichtung zum Überwachen der Umgebung eines Fahrzeuges
DE69624980T2 (de) Objektüberwachungsverfahren und -gerät mit zwei oder mehreren Kameras
DE10030421B4 (de) Fahrzeugumgebungsüberwachungssystem
EP2394234B1 (fr) Procédé et dispositif de détermination d'un marquage de voie de circulation en vigueur
DE10251880B4 (de) Bilderkennungsvorrichtung
WO2013029722A2 (fr) Procédé de représentation de l'environnement
DE102015203016B4 (de) Verfahren und Vorrichtung zur optischen Selbstlokalisation eines Kraftfahrzeugs in einem Umfeld
DE102012101014A1 (de) Fahrzeugdetektionsvorrichtung
DE102016210254A9 (de) Fahrzeugortung an kreuzungen anhand von visuellen anhaltspunkte, stationären objekten und durch gps
EP3731187A1 (fr) Procédé et dispositif de détermination de la position géographique et de l'orientation d'un véhicule
EP3520024B1 (fr) Détection et validation d'objets provenant d'images séquentielles d'une caméra en utilsant des homographies
DE112018007484T5 (de) Hindernis-Detektionsvorrichtung, automatische Bremsvorrichtung unter Verwendung einer Hindernis-Detektionsvorrichtung, Hindernis-Detektionsverfahren und automatisches Bremsverfahren unter Verwendung eines Hindernis-Detektionsverfahrens
DE102017218366A1 (de) Verfahren und system zur fussgängererfassung in einem fahrzeug
EP3520023B1 (fr) Détection et validation d'objets provenant d'images séquentielles d'une caméra
DE102015115012A1 (de) Verfahren zum Erzeugen einer Umgebungskarte einer Umgebung eines Kraftfahrzeugs anhand eines Bilds einer Kamera, Fahrerassistenzsystem sowie Kraftfahrzeug
DE102014112820A1 (de) Fahrzeugaußenumgebungerkennungsvorrichtung
EP2707862B1 (fr) Détermination de distance au moyen d'un capteur de caméra
DE102012000459A1 (de) Verfahren zur Objektdetektion
DE102018121008A1 (de) Kreuzverkehrserfassung unter verwendung von kameras
DE102009022278A1 (de) Verfahren zur Ermittlung eines hindernisfreien Raums
DE102018123393A1 (de) Erkennung von Parkflächen
DE112021000482T5 (de) Erkennung von fremdkörpern in einem fahrzeugweg

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20110702

AK Designated contracting states

Kind code of ref document: A2

Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO SE SI SK SM TR

DAX Request for extension of the european patent (deleted)
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20160901