EP1920406A1 - Objektdetektion auf bildpunktebene in digitalen bildsequenzen - Google Patents
Objektdetektion auf Bildpunktebene in digitalen Bildsequenzen (Object detection at the pixel level in digital image sequences)
- Publication number
- EP1920406A1 (Application EP06700512A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- pixels
- objektdetektion
- relevant
- pixel
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Links
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
- G06T7/215—Motion-based segmentation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
- G06V20/58—Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
- G06T7/277—Analysis of motion involving stochastic approaches, e.g. using Kalman filters
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
- G06V20/58—Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
- G06V20/582—Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads of traffic signs
Definitions
- the invention relates to a method for object detection at the pixel level in digital image sequences.
- the 3D position of relevant pixels is determined by analyzing a pair of images of a calibrated stereo camera arrangement.
- a method for stereo image analysis is described in "Real-time Stereo Vision for Urban Traffic Scene Understanding" (U. Franke, IEEE Conference on Intelligent Vehicles 2000, October 2000, Dearborn), in which an interest operator first selects pixels at which the stereo disparity can be measured well. A hierarchical correlation method is then used to measure the disparity and thus to determine the 3D position of relevant pixels.
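The disparity-to-depth step referenced above can be sketched as a standard stereo triangulation. This is a minimal illustration, not the patent's implementation; the function name and parameters (focal length `f`, baseline `B`, principal point `cu`, `cv`) are assumptions chosen for clarity.

```python
def disparity_to_3d(u, v, d, f, B, cu, cv):
    """Triangulate the 3D position of a pixel from its stereo disparity.

    u, v   : pixel coordinates in the left image
    d      : measured disparity in pixels (must be > 0)
    f      : focal length in pixels; B: stereo baseline in metres
    cu, cv : principal point of the camera
    All names are illustrative, not taken from the patent.
    """
    if d <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    z = f * B / d          # depth shrinks as disparity grows
    x = (u - cu) * z / f   # lateral offset from the optical axis
    y = (v - cv) * z / f   # vertical offset from the optical axis
    return x, y, z
```

Because depth is inversely proportional to disparity, the depth error grows quadratically with distance, which is one reason the patent restricts itself to "relevant" pixels where disparity can be measured well.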
- objects can be distinguished from the background by grouping adjacent pixels with the same distance from the image sensor into one object.
- the invention is based on the object of providing a novel method for object detection at the pixel level in digital image sequences.
- this object is achieved according to the invention by the proposed method for object detection at the pixel level in digital image sequences.
- according to the invention, the 2D position of relevant pixels is determined in a first image acquisition, and an associated distance value is determined for each of these relevant pixels.
- these pixels are tracked and localized in at least one second image acquisition, whereby the 2D position (or the displacement) and the associated distance value are again determined for each pixel.
- the position and movement of relevant pixels are determined by means of at least one suitable filter.
- relevant pixels are then combined to form objects.
- due to the fusion of spatial and temporal information, the inventive method provides an accurate 3D position and the associated 3D direction of movement for each pixel considered.
- relevant pixels are understood to mean those pixels which are suitable for tracking across at least two consecutive image recordings of an image sequence, e.g. because they have a certain contrast.
- the method described in "Detection and Tracking of Point Features" (C. Tomasi, T. Kanade, School of Computer Science, Carnegie Mellon University, Pittsburgh, PA, April 1991, CMU-CS-91-132) is suitable for the selection of relevant pixels.
- since a 3D position determination is performed, it is also advantageous if the stereo disparity can be reliably measured at these relevant pixels.
- relevant pixels are subsequently tracked and located in the next image. This does not necessarily have to be an image acquisition directly following the first image acquisition.
- the KLT tracker described in the above-mentioned publication is suitable for tracking, for example. A renewed stereoscopic 3D position determination then closes the loop, and the process is continued in the same way.
- the proper motion of the image sensor is taken into account in the determination of position and movement of relevant pixels.
- the objects to be detected may be both stationary and moving objects.
- the positions and movements of relevant pixels detected in the course of object detection can be related either to fixed coordinates or to the co-moving coordinate system of a mobile image sensor which is, e.g., mounted on a vehicle.
- the proper motion of the image sensor is determined on the basis of the image recordings and / or by means of inertial sensor technology.
- modern vehicles, for example, are already equipped with inertial sensors which detect movement, inclination, acceleration, yaw rate, etc. The self-motion of the vehicle, and thus also that of an image sensor attached to the vehicle, is thereby measured and, e.g., provided via the vehicle bus system.
- pixels in the image recordings are tracked for a sufficiently long time and checked to see whether they are at rest and do not move.
- the proper motion of the vehicle, and thus of the image sensor, can then be determined by means of suitable image evaluation methods.
- the at least one filter for determining the position and movement of relevant pixels is a Kalman filter.
- each tracked relevant pixel is assigned a Kalman filter with a state vector [x y z vx vy vz].
- the quantities x, y and z describe the spatial position of the pixel, e.g. in a co-moving, vehicle-fixed coordinate system.
- the quantities vx, vy and vz indicate the velocity in the respective spatial direction.
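The per-pixel filtering described above can be illustrated with a constant-velocity Kalman filter over the state [x y z vx vy vz], updated with stereo-derived 3D position measurements. This is a minimal sketch: the constant-velocity motion model and the noise magnitudes `Q` and `R` are illustrative assumptions, not values from the patent.

```python
import numpy as np

def make_cv_filter(dt):
    """Constant-velocity model matrices for the state [x y z vx vy vz].
    Noise covariances Q and R are illustrative assumptions."""
    F = np.eye(6)
    F[0, 3] = F[1, 4] = F[2, 5] = dt              # position integrates velocity
    H = np.hstack([np.eye(3), np.zeros((3, 3))])  # only 3D position is measured
    Q = np.eye(6) * 1e-3                          # process noise (assumed)
    R = np.eye(3) * 1e-2                          # measurement noise (assumed)
    return F, H, Q, R

def kalman_step(x, P, z, F, H, Q, R):
    """One predict/update cycle for a tracked pixel's state."""
    # predict
    x = F @ x
    P = F @ P @ F.T + Q
    # update with the new 3D position measurement z
    y = z - H @ x                       # innovation
    S = H @ P @ H.T + R                 # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)      # Kalman gain
    x = x + K @ y
    P = (np.eye(6) - K @ H) @ P
    return x, P
```

Feeding the filter a sequence of positions from a pixel moving at constant speed lets the velocity components vx, vy, vz converge toward the true motion, even though only positions are ever measured.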
- relevant pixels can be reliably tracked from two or more images and their spatial position, as well as their direction of movement and movement speed, can be determined.
- by means of the Kalman filter, the spatial and temporal information is integrated, which is what makes a reliable detection of fast-moving objects possible in the first place.
- the mathematical calculations required in connection with such a Kalman-filter-based multi-filter system for vehicle environment analysis are detailed.
- a filter may be based on the hypothesis that the pixel under consideration represents part of a vehicle with a high relative velocity, while another filter is based on the hypothesis that the pixel is part of a vehicle traveling at a similar speed. Taking into account the innovation errors of the individual filters, a decision can be made after only a few image cycles as to whether a hypothesis is correct or not.
- the results of the individual filters are fused into an overall filtering result.
- the results of different filters can be fused by combining them into an overall result as a weighted average.
- a convergence between estimated values and the actual value is achieved much faster, which is particularly advantageous in the case of real-time applications such as collision avoidance.
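The weighted-average fusion of the individual filter results can be sketched as follows. This is an illustration, not the patent's implementation; in particular, deriving the weights from each filter's innovation error (smaller error, larger weight) is an assumption consistent with, but not spelled out in, the text above.

```python
def fuse_estimates(estimates, weights):
    """Fuse individual filter state estimates into an overall result
    as a weighted average.

    estimates : list of state vectors (equal-length lists of floats)
    weights   : one non-negative weight per filter, e.g. derived from
                the filter's innovation error (an assumption here)
    """
    total = sum(weights)
    if total <= 0:
        raise ValueError("weights must have a positive sum")
    n = len(estimates[0])
    # component-wise weighted mean over all filter outputs
    return [sum(w * e[i] for e, w in zip(estimates, weights)) / total
            for i in range(n)]
```

A filter whose motion hypothesis matches the observations will show small innovation errors and thus dominate the fused result, which is how the overall estimate converges faster than any single mismatched filter.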
- in a further advantageous manner, the overall result of the filtering is fed back to the inputs of the individual filters.
- the overall result hereby influences in particular the parameter settings of the individual filters and therefore also has a beneficial effect on the future determination of position and movement of relevant pixels.
- the distance value associated with a pixel is advantageously determined on the basis of image recordings and/or by means of distance-resolving sensor technology.
- the distance associated with a pixel can be determined by means of a stereo image analysis method.
- the 3D position of relevant pixels is determined by analyzing a pair of images of a calibrated stereo camera arrangement.
- a suitable distance-resolving sensor may, for example, be an additional punctiform laser sensor which directly supplies distance values for a specific object point.
- laser scanners or range-imaging cameras are also known, for example, which provide a distance value for each pixel.
- these pixels are grouped into objects which have similar state vectors, wherein, for example, limits for the maximum permissible deviation of individual or several elements of the state vector are predetermined.
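The grouping of pixels with similar state vectors can be sketched with a pairwise similarity test and union-find merging. This is an illustrative implementation under assumptions: per-element deviation limits as described above, and a simple O(n²) scan that the patent does not prescribe.

```python
def group_pixels(states, limits):
    """Group tracked pixels into objects: two pixels are merged when every
    element of their state vectors differs by at most the given limit.
    Union-find keeps merging simple; note that grouping is transitive,
    so chains of similar pixels end up in one object."""
    n = len(states)
    parent = list(range(n))

    def find(i):
        # path-halving find for the union-find structure
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i

    def similar(a, b):
        return all(abs(ai - bi) <= lim for ai, bi, lim in zip(a, b, limits))

    for i in range(n):
        for j in range(i + 1, n):
            if similar(states[i], states[j]):
                ri, rj = find(i), find(j)
                if ri != rj:
                    parent[ri] = rj
    groups = {}
    for i in range(n):
        groups.setdefault(find(i), []).append(i)
    return list(groups.values())
```

Tightening the limits on the velocity components, for example, separates a stationary obstacle from a pixel cloud moving at vehicle speed even when both occupy the same distance band.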
- the object detection may also be limited to certain image areas; e.g., object detection in connection with vehicle applications may be limited to specific lanes. It is furthermore conceivable that only those relevant pixels which have a certain direction of movement are combined into objects.
- it is also of great advantage if already combined objects continue to be tracked in subsequent images by means of filters.
- Methods which track the 3D position of potential objects as an entity after an initial segmentation are already known from the prior art and are preferably based on a simple Kalman filter. This tracking of pixels already combined into objects is also used in connection with the inventive method.
- on the one hand, a very reliable segmentation can be generated and, on the other hand, very good initial values for the filtering are obtained.
- the positions and movements, in particular the state vectors, of combined pixels are used to initialize the filtering.
- the continuously determined positions and movements of individual pixels are used.
- the method according to the invention for object detection at the pixel level can be used, for example, in connection with driver assistance systems.
- numerous applications for driver assistance systems based on image-based object detection are already known, for example systems for traffic sign recognition, parking assistance, lane keeping, etc.
- since the method according to the invention is distinguished in particular by its speed and robustness with regard to the detection results, it is particularly suitable for use in collision detection or collision avoidance.
- the driver can be made aware in time of suddenly approaching road users, or the system can, e.g., actively intervene in the vehicle dynamics.
- the method according to the invention can also be used for object detection at the pixel level in connection with robot systems.
- Future robots will be equipped with imaging sensors. These can be, for example, autonomous transport systems which navigate freely at their place of work, or stationary robots.
- the inventive method can be used in this context, for example, for collision detection or collision avoidance.
- the method in conjunction with a robot is used for the safe gripping of moving objects. The moving objects may, for example, be moving items or a person who is assisted by the robot.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Image Analysis (AREA)
- Traffic Control Systems (AREA)
Abstract
Description
Claims
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE102005004510 | 2005-01-31 | ||
DE102005008131A DE102005008131A1 (de) | 2005-01-31 | 2005-02-21 | Objektdetektion auf Bildpunktebene in digitalen Bildsequenzen |
PCT/EP2006/000013 WO2006081906A1 (de) | 2005-01-31 | 2006-01-03 | Objektdetektion auf bildpunktebene in digitalen bildsequenzen |
Publications (1)
Publication Number | Publication Date |
---|---|
EP1920406A1 true EP1920406A1 (de) | 2008-05-14 |
Family
ID=36577450
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP06700512A Withdrawn EP1920406A1 (de) | 2005-01-31 | 2006-01-03 | Objektdetektion auf bildpunktebene in digitalen bildsequenzen |
Country Status (4)
Country | Link |
---|---|
US (1) | US20090297036A1 (de) |
EP (1) | EP1920406A1 (de) |
DE (1) | DE102005008131A1 (de) |
WO (1) | WO2006081906A1 (de) |
Families Citing this family (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP2048618A1 (de) * | 2007-10-08 | 2009-04-15 | Delphi Technologies, Inc. | Verfahren zur Objekterfassung |
AT506051B1 (de) * | 2007-11-09 | 2013-02-15 | Hopf Richard | Verfahren zum erfassen und/oder auswerten von bewegungsabläufen |
DE102008005993A1 (de) * | 2008-01-24 | 2009-07-30 | Siemens Ag Österreich | Verfahren zum Verfolgen von Objekten mittels einer Kamera |
DE102009009047A1 (de) | 2009-02-16 | 2010-08-19 | Daimler Ag | Verfahren zur Objektdetektion |
DE102009016819B4 (de) | 2009-04-09 | 2011-12-15 | Carl Zeiss Optronics Gmbh | Verfahren zur Detektion wenigstens eines Objekts und/oder wenigstens einer Objektgruppe, Computerprogramm, Computerprogammprodukt, Stereokameraeinrichtung, aktiv Strahlung aussendendes Bildsensorsystem und Überwachungsvorrichtung |
DE102009028742A1 (de) | 2009-08-20 | 2011-02-24 | Robert Bosch Gmbh | Verfahren und Steuergerät zur Bestimmung einer Bewegungsinformation eines Objekts |
US9418556B2 (en) * | 2010-12-30 | 2016-08-16 | Wise Automotive Corporation | Apparatus and method for displaying a blind spot |
DE102011111440A1 (de) | 2011-08-30 | 2012-06-28 | Daimler Ag | Verfahren zur Umgebungsrepräsentation |
DE102012000459A1 (de) | 2012-01-13 | 2012-07-12 | Daimler Ag | Verfahren zur Objektdetektion |
JP5957359B2 (ja) * | 2012-10-19 | 2016-07-27 | 日立オートモティブシステムズ株式会社 | ステレオ画像処理装置及びステレオ画像処理方法 |
DE102013016032A1 (de) | 2013-07-10 | 2014-04-10 | Daimler Ag | Verfahren zur Objektdetektion in stereoskopisch erfassten Bildern |
JP6110256B2 (ja) | 2013-08-21 | 2017-04-05 | 株式会社日本自動車部品総合研究所 | 対象物推定装置および対象物推定方法 |
DE102013020947A1 (de) * | 2013-12-12 | 2015-06-18 | Valeo Schalter Und Sensoren Gmbh | Verfahren zum Verfolgen eines Zielobjekts bei Helligkeitsänderung, Kamerasystem und Kraftfahrzeug |
DE102015213557A1 (de) * | 2015-07-20 | 2017-01-26 | Bayerische Motoren Werke Aktiengesellschaft | Verfahren und System zum Erstellen eines dreidimensionalen Modells einer Produktionsumgebung |
DE102019128219A1 (de) | 2019-10-18 | 2021-04-22 | Connaught Electronics Ltd. | Ein Bildverarbeitungsverfahren |
Family Cites Families (23)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4980762A (en) * | 1989-10-13 | 1990-12-25 | Massachusetts Institute Of Technology | Method and apparatus for image processing to obtain three dimensional motion and depth |
US6009188A (en) * | 1996-02-16 | 1999-12-28 | Microsoft Corporation | Method and system for digital plenoptic imaging |
JP3512992B2 (ja) * | 1997-01-07 | 2004-03-31 | 株式会社東芝 | 画像処理装置および画像処理方法 |
US6124864A (en) * | 1997-04-07 | 2000-09-26 | Synapix, Inc. | Adaptive modeling and segmentation of visual image streams |
US6215898B1 (en) * | 1997-04-15 | 2001-04-10 | Interval Research Corporation | Data processing system and method |
US6295367B1 (en) * | 1997-06-19 | 2001-09-25 | Emtera Corporation | System and method for tracking movement of objects in a scene using correspondence graphs |
US6192156B1 (en) * | 1998-04-03 | 2001-02-20 | Synapix, Inc. | Feature tracking using a dense feature array |
US6236738B1 (en) * | 1998-04-09 | 2001-05-22 | Board Of Trustees Of The Leland Stanford Junior University | Spatiotemporal finite element method for motion analysis with velocity data |
US7116324B2 (en) * | 1998-05-27 | 2006-10-03 | In-Three, Inc. | Method for minimizing visual artifacts converting two-dimensional motion pictures into three-dimensional motion pictures |
US6628819B1 (en) * | 1998-10-09 | 2003-09-30 | Ricoh Company, Ltd. | Estimation of 3-dimensional shape from image sequence |
US6677941B2 (en) * | 2000-08-05 | 2004-01-13 | American Gnc Corporation | Three-dimensional relative positioning and tracking using LDRI |
US7058204B2 (en) * | 2000-10-03 | 2006-06-06 | Gesturetek, Inc. | Multiple camera control system |
US7590264B2 (en) * | 2001-03-08 | 2009-09-15 | Julian Mattes | Quantitative analysis, visualization and movement correction in dynamic processes |
US7177445B2 (en) * | 2002-04-16 | 2007-02-13 | Koninklijke Philips Electronics N.V. | Discriminating between changes in lighting and movement of objects in a series of images using different methods depending on optically detectable surface characteristics |
US7132961B2 (en) * | 2002-08-12 | 2006-11-07 | Bae Systems Information And Electronic Systems Integration Inc. | Passive RF, single fighter aircraft multifunction aperture sensor, air to air geolocation |
US7188048B2 (en) * | 2003-06-25 | 2007-03-06 | Lockheed Martin Corporation | Refining stochastic grid filter |
US20070008210A1 (en) * | 2003-09-11 | 2007-01-11 | Noriko Kibayashi | Radar device |
US7394977B2 (en) * | 2003-10-07 | 2008-07-01 | Openvr Co., Ltd. | Apparatus and method for creating 3-dimensional image |
WO2005036371A2 (en) * | 2003-10-09 | 2005-04-21 | Honda Motor Co., Ltd. | Moving object detection using low illumination depth capable computer vision |
US7831087B2 (en) * | 2003-10-31 | 2010-11-09 | Hewlett-Packard Development Company, L.P. | Method for visual-based recognition of an object |
US7456847B2 (en) * | 2004-08-12 | 2008-11-25 | Russell Steven Krajec | Video with map overlay |
US7447337B2 (en) * | 2004-10-25 | 2008-11-04 | Hewlett-Packard Development Company, L.P. | Video content understanding through real time video motion analysis |
US7583849B2 (en) * | 2005-07-25 | 2009-09-01 | Microsoft Corporation | Lossless image compression with tree coding of magnitude levels |
-
2005
- 2005-02-21 DE DE102005008131A patent/DE102005008131A1/de not_active Withdrawn
-
2006
- 2006-01-03 WO PCT/EP2006/000013 patent/WO2006081906A1/de active Application Filing
- 2006-01-03 EP EP06700512A patent/EP1920406A1/de not_active Withdrawn
- 2006-01-03 US US11/993,398 patent/US20090297036A1/en not_active Abandoned
Non-Patent Citations (1)
Title |
---|
"Serious Games", vol. 2449, 1 January 2002, SPRINGER INTERNATIONAL PUBLISHING, Cham, ISBN: 978-3-642-15171-2, ISSN: 0302-9743, article STEFAN HEINRICH: "Real Time Fusion of Motion and Stereo Using Flow/Depth Constraint for Fast Obstacle Detection", pages: 75 - 82, XP055537281, DOI: 10.1007/3-540-45783-6_10 * |
Also Published As
Publication number | Publication date |
---|---|
WO2006081906A1 (de) | 2006-08-10 |
US20090297036A1 (en) | 2009-12-03 |
DE102005008131A1 (de) | 2006-08-03 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
EP1920406A1 (de) | Objektdetektion auf bildpunktebene in digitalen bildsequenzen | |
EP0541567B1 (de) | Verfahren zur analyse zeitlicher folgen digitaler bilder | |
EP2043045B1 (de) | Verfahren zur Objektverfolgung | |
DE102009012435B4 (de) | Vorrichtung und Verfahren zum monokularen Motion-Stereo-basierten Detektieren von freien Parkplätzen | |
JP6574611B2 (ja) | 立体画像に基づいて距離情報を求めるためのセンサシステム | |
DE602004012962T2 (de) | Echtzeit-hinderniserkennung mit einer kalibrierten kamera und bekannter ego-bewegung | |
EP2757524A1 (de) | Tiefenmessungsverfahren und System für autonome Fahrzeuge | |
EP2372642B1 (de) | Verfahren und System zur Erkennung von bewegenden Objekten | |
WO2017206999A1 (de) | Verfahren zur auswertung von bilddaten einer fahrzeugkamera | |
EP2256690B1 (de) | Objektbewegungsdetektionssystem, das auf der Kombination von dreidimensionalen Krümmungstechniken und einer eigentlichen Objektbewegungsdetektion basiert | |
DE112016000187T5 (de) | Verfahren und Vorrichtung zur Schätzung einer Fahrzeugeigenbewegung ausgehend von Rundumsichtbildern | |
DE102010006828A1 (de) | Verfahren zur automatischen Erstellung eines Modells der Umgebung eines Fahrzeugs sowie Fahrerassistenzsystem und Fahrzeug | |
DE102016104729A1 (de) | Verfahren zur extrinsischen Kalibrierung einer Kamera, Rechenvorrichtung, Fahrerassistenzsystem sowie Kraftfahrzeug | |
DE102012219834A1 (de) | Spurverfolgungssystem | |
EP2033165B1 (de) | Verfahren für die erfassung eines verkehrsraums | |
CN111862673A (zh) | 基于顶视图的停车场车辆自定位及地图构建方法 | |
DE102016104730A1 (de) | Verfahren zum Detektieren eines Objekts entlang einer Straße eines Kraftfahrzeugs, Rechenvorrichtung, Fahrerassistenzsystem sowie Kraftfahrzeug | |
Willersinn et al. | Robust obstacle detection and tracking by motion analysis | |
DE102010013093A1 (de) | Verfahren und System zur Erstellung eines Modells eines Umfelds eines Fahrzeugs | |
EP2394247B1 (de) | Verfahren und vorrichtung zum betrieb eines videobasierten fahrerassistenzsystems in einem fahrzeug | |
Shu et al. | Vision based lane detection in autonomous vehicle | |
DE102017100062A1 (de) | Visuelle Odometrie | |
WO2011020713A1 (de) | Verfahren und steuergerät zur bestimmung einer bewegungsinformation eines objekts | |
Goyat et al. | Tracking of vehicle trajectory by combining a camera and a laser rangefinder | |
Rabie et al. | Mobile active‐vision traffic surveillance system for urban networks |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase |
Free format text: ORIGINAL CODE: 0009012 |
|
17P | Request for examination filed |
Effective date: 20071207 |
|
AK | Designated contracting states |
Kind code of ref document: A1 Designated state(s): DE FR GB |
|
RBV | Designated contracting states (corrected) |
Designated state(s): DE FR GB |
|
17Q | First examination report despatched |
Effective date: 20080618 |
|
RIN1 | Information on inventor provided before grant (corrected) |
Inventor name: RABE, CLEMENS Inventor name: GEHRIG, STEFAN Inventor name: BADINO, HERNAN Inventor name: FRANKE, UWE |
|
R17P | Request for examination filed (corrected) |
Effective date: 20071207 |
|
RAP1 | Party data changed (applicant data changed or rights of an application transferred) |
Owner name: DAIMLER AG |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN |
|
18D | Application deemed to be withdrawn |
Effective date: 20190716 |