EP1920406A1 - Object detection on a pixel plane in a digital image sequence - Google Patents
Info
- Publication number
- EP1920406A1 (Application EP06700512A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- pixels
- objektdetektion
- relevant
- pixel
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Links
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
- G06T7/215—Motion-based segmentation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
- G06V20/58—Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
- G06T7/277—Analysis of motion involving stochastic approaches, e.g. using Kalman filters
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
- G06V20/58—Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
- G06V20/582—Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads of traffic signs
Definitions
- the invention relates to a method for object detection at the pixel level in digital image sequences.
- the 3D position of relevant pixels is determined by analyzing a pair of images of a calibrated stereo camera arrangement.
- a method for stereo image analysis is described in "Real-time Stereo Vision for Urban Traffic Scene Understanding, U. Franke, IEEE Conference on Intelligent Vehicles 2000, October 2000, Dearborn", wherein pixels are initially determined by means of an interest operator, in which the stereo disparity can be measured well. Subsequently, a hierarchical correlation method is then used to measure the disparity and thus to determine the 3D position of relevant pixels.
- objects can be distinguished from the background by grouping adjacent pixels with the same distance from the image sensor into one object.
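The 3D position determination from a calibrated stereo pair can be sketched as follows: for a rectified camera pair, the depth follows directly from the measured disparity, and the lateral position from the pinhole model. All parameter names (focal length `f`, baseline `b`, principal point `cx`, `cy`) are illustrative and not taken from the patent:

```python
import numpy as np

def triangulate(u, v, d, f, b, cx, cy):
    """Recover the 3D position of a pixel (u, v) with measured stereo
    disparity d, for a rectified stereo pair with focal length f (in
    pixels), baseline b (in metres) and principal point (cx, cy).
    Parameter names are illustrative, not from the patent."""
    z = f * b / d          # depth from disparity
    x = (u - cx) * z / f   # lateral offset
    y = (v - cy) * z / f   # vertical offset
    return np.array([x, y, z])
```

A pixel at the principal point with disparity 10 px, f = 800 px and b = 0.5 m, for example, yields a depth of 40 m.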
- the invention is based on the object of providing a novel method for object detection at the pixel level in digital image sequences.
- this object is achieved according to the invention by the following method.
- a method for object detection at the pixel level in digital image sequences is proposed.
- according to the invention, the 2D position of relevant pixels is determined within a first image acquisition, and an associated distance value is determined for each of these relevant pixels.
- These pixels are tracked and localized in at least one second image acquisition, whereby the 2D position or the displacement of the pixel as well as the associated distance value are again determined for each of the pixels.
- the position and movement of relevant pixels are determined by means of at least one suitable filter.
- relevant pixels are then combined to form objects.
- owing to the fusion of spatial and temporal information, the inventive method provides, for each pixel considered, an accurate 3D position and the associated 3D movement direction.
- relevant pixels are understood to mean those pixels which are suitable for tracking across at least two or more consecutive image recordings of an image sequence, e.g. because they have a certain contrast.
- the method described in "Detection and Tracking of Point Features, School of Computer Science, Carnegie Mellon University, Pittsburg, PA, April 1991 (CMU-CS-91-132)" is suitable for selection of relevant pixels.
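The interest operator of the cited KLT work (Shi/Tomasi) selects pixels whose local structure tensor has two large eigenvalues; such pixels track well and also permit reliable disparity measurement. A minimal sketch, with illustrative window size and no thresholding:

```python
import numpy as np

def min_eig_response(img):
    """Minimum-eigenvalue interest operator in the spirit of the cited
    KLT work: a pixel is 'relevant' for tracking when the structure
    tensor of a small window around it has two large eigenvalues.
    The 3x3 window is an illustrative choice."""
    img = img.astype(float)
    Iy, Ix = np.gradient(img)          # image gradients (rows, cols)

    def box3(a):                       # 3x3 window sum via padding
        p = np.pad(a, 1)
        h, w = a.shape
        return sum(p[i:i + h, j:j + w] for i in range(3) for j in range(3))

    Sxx, Syy, Sxy = box3(Ix * Ix), box3(Iy * Iy), box3(Ix * Iy)
    # smaller eigenvalue of the 2x2 structure tensor, per pixel
    return 0.5 * (Sxx + Syy - np.sqrt((Sxx - Syy) ** 2 + 4 * Sxy ** 2))
```

On a synthetic image containing one bright square, the response is largest at the square's corner, small along its edges (where only one eigenvalue is large) and zero in flat regions.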
- since a 3D position determination is performed, it is also advantageous if the stereo disparity can be determined for these relevant pixels.
- relevant pixels are subsequently tracked and located in the next image. This does not necessarily have to be an image acquisition directly following the first image acquisition.
- the KLT tracker presented in the above-mentioned publication, for example, is suitable for this tracking. A renewed stereoscopic 3D position determination then closes the loop, and the process is continued in the same way.
- the proper motion of the image sensor is taken into account in the determination of position and movement of relevant pixels.
- the obj ects to be detected may be both stationary and moving obj ects.
- the positions and movements of relevant pixels detected in the course of object detection can be related either to fixed coordinates or to the co-moving coordinate system of a mobile image sensor, which is located, e.g., on a vehicle.
- the proper motion of the image sensor is determined on the basis of the image recordings and / or by means of inertial sensor technology.
- modern vehicles, for example, are already equipped with inertial sensors which detect movement, inclination, acceleration, yaw rate, etc. The self-motion of the vehicle, and thus also that of an image sensor attached to the vehicle, is measured in this way and provided, e.g., via the vehicle bus system.
- pixels in the image recordings are tracked for a sufficiently long time and checked to see whether they are at rest and do not move.
- the proper motion of the vehicle, and hence of the image sensor, can then be determined from these pixels by means of suitable image evaluation methods.
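The compensation of sensor proper motion described above can be sketched for a planar motion model: given the speed and yaw rate reported by the inertial sensors, one predicts where a static 3D point should appear in the current vehicle frame; a point landing far from this prediction is moving in the world. The planar motion model, the axis convention (x right, y up, z forward) and the sign of the rotation are assumptions of this sketch, not details from the patent:

```python
import numpy as np

def compensate_ego_motion(p_prev, speed, yaw_rate, dt):
    """Predict the position, in the vehicle frame at time t, of a
    static 3D point p_prev given in the vehicle frame at time t - dt,
    using speed (m/s) and yaw rate (rad/s) from inertial sensors.
    Planar motion and axis convention are illustrative assumptions."""
    dpsi = yaw_rate * dt
    # rotation of the vehicle frame about the vertical (y) axis
    R = np.array([[np.cos(dpsi), 0.0, -np.sin(dpsi)],
                  [0.0,          1.0,  0.0],
                  [np.sin(dpsi), 0.0,  np.cos(dpsi)]])
    t = np.array([0.0, 0.0, speed * dt])   # forward translation
    return R @ (p_prev - t)
```

With zero yaw rate, a static point 10 m ahead simply moves 0.5 m closer when the vehicle drives 5 m/s for 0.1 s; with rotation only, distances to static points are preserved.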
- the at least one filter for determining the position and movement of relevant pixels is a Kalman filter.
- each tracked relevant pixel is assigned a Kalman filter with a state vector [x y z vx vy vz].
- the quantities x, y and z describe the spatial position of the pixel, e.g. in a co-moving vehicle-fixed coordinate system.
- the quantities vx, vy and vz indicate the velocity in the respective spatial direction.
- relevant pixels can be reliably tracked from two or more images and their spatial position, as well as their direction of movement and movement speed, can be determined.
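A minimal constant-velocity sketch of such a per-pixel Kalman filter, using the state vector [x y z vx vy vz] described above and observing only the triangulated 3D position of the pixel at each image acquisition. The noise magnitudes `q` and `r` are illustrative placeholders, not values from the patent:

```python
import numpy as np

class PixelKalman:
    """Constant-velocity Kalman filter over [x, y, z, vx, vy, vz]
    for one tracked pixel. q and r are illustrative noise levels."""
    def __init__(self, xyz, dt, q=1e-2, r=1e-1):
        self.x = np.hstack([xyz, np.zeros(3)])            # state estimate
        self.P = np.eye(6)                                # state covariance
        self.F = np.eye(6); self.F[:3, 3:] = dt * np.eye(3)   # motion model
        self.H = np.hstack([np.eye(3), np.zeros((3, 3))])     # observe 3D position
        self.Q = q * np.eye(6)
        self.R = r * np.eye(3)

    def step(self, z):
        # predict with the constant-velocity model
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        # update with the newly triangulated 3D position z
        y = z - self.H @ self.x                            # innovation
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)
        self.x = self.x + K @ y
        self.P = (np.eye(6) - K @ self.H) @ self.P
        return self.x
```

Fed with 3D positions of a point approaching at a constant 2 m/s, the velocity components of the state converge toward the true motion within a few dozen image cycles.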
- by means of the Kalman filter, the spatial and temporal information is integrated, which makes a reliable detection of fast-moving objects possible in the first place.
- the mathematical calculations required in connection with such a Kalman-filter-based multi-filter system for vehicle environment analysis are described in detail there.
- one filter may be based on the hypothesis that the pixel under consideration represents part of a vehicle with a high relative velocity, while another filter is based on the hypothesis that the pixel is part of a vehicle traveling at a similar speed. Taking into account the innovation errors of the individual filters, a decision can be made after only a few image cycles as to whether a hypothesis is correct or not.
- the results of the individual filters are fused into an overall result of the filtering.
- different filters can be fused by combining the individual results as a weighted average into a total result.
- a convergence between estimated values and the actual value is achieved much faster, which is particularly advantageous in the case of real-time applications such as collision avoidance.
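The fusion described above can be sketched as a weighted average of the hypothesis filters' state estimates, where each filter is weighted inversely by its recent innovation error, so a filter whose hypothesis fits the measurements dominates the overall result. The inverse-error weighting scheme is an illustrative choice, not a formula from the patent:

```python
import numpy as np

def fuse_estimates(states, innovation_errors, eps=1e-9):
    """Fuse the state vectors of several hypothesis filters into one
    overall result. A filter with small innovation error (its hypothesis
    matches the data) receives a large weight. Inverse-error weighting
    is an illustrative scheme."""
    w = 1.0 / (np.asarray(innovation_errors, dtype=float) + eps)
    w /= w.sum()                                   # normalize weights
    return (w[:, None] * np.asarray(states, dtype=float)).sum(axis=0)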
- the overall result of the filtering is furthermore advantageously fed back to the inputs of the individual filters.
- the overall result hereby influences in particular the parameter settings of the individual filters and therefore also has an advantageous effect on the future determination of position and movement of relevant pixels.
- the distance value associated with a pixel is advantageously determined on the basis of image recordings and/or by means of distance-resolving sensor technology.
- the distance associated with a pixel can be determined by means of a stereo image analysis method.
- the 3D position of relevant pixels is determined by analyzing a pair of images of a calibrated stereo camera arrangement.
- a suitable distance-resolving sensor may, for example, be an additional punctiform laser sensor which directly supplies distance values for a specific object point.
- furthermore, e.g. laser scanners or range-image cameras are known which provide a distance value for each pixel.
- these pixels are grouped into objects which have similar state vectors, wherein, for example, limits for the maximum permissible deviation of individual or several elements of the state vector are predetermined.
- the object detection may also be limited to certain image areas; e.g., object detection in connection with vehicle applications may be limited to specific lanes. It is furthermore conceivable that only those relevant pixels which have a certain direction of movement are combined to form objects.
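The grouping of pixels into objects by similar state vectors can be sketched as a greedy single-link pass: two pixels belong to the same object if every element of their state vectors differs by less than a predetermined limit. Both the greedy strategy and the threshold values are illustrative assumptions:

```python
import numpy as np

def group_pixels(states, max_dev):
    """Group tracked pixels into objects: pixels whose state vectors
    [x, y, z, vx, vy, vz] deviate by less than max_dev in every element
    from the group's seed pixel share one object label. Greedy
    single-link pass with illustrative per-element limits."""
    states = np.asarray(states, dtype=float)
    labels = -np.ones(len(states), dtype=int)    # -1 = not yet assigned
    next_label = 0
    for i in range(len(states)):
        if labels[i] >= 0:
            continue
        labels[i] = next_label
        # attach every still-unlabelled pixel compatible with pixel i
        for j in range(i + 1, len(states)):
            if labels[j] < 0 and np.all(np.abs(states[j] - states[i]) < max_dev):
                labels[j] = next_label
        next_label += 1
    return labels
```

Two nearby pixels approaching at nearly the same velocity thus receive the same label, while a distant stationary pixel forms a separate object.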
- it is also of great advantage if objects that have already been combined continue to be tracked in subsequent images by means of filters.
- methods which track the 3D position of potential objects as an entity after an initial segmentation are already known from the prior art and are preferably based on a simple Kalman filter. This tracking of pixels already combined into objects is also used in connection with the inventive method.
- in this way, on the one hand, a very reliable segmentation can be generated and, on the other hand, very good initial values for the subsequent filtering are obtained.
- the positions and movements, in particular the state vectors, of combined pixels are used to initialize this filtering.
- in addition, the continuously determined positions and movements of individual pixels are used.
- the method according to the invention for object detection at the pixel level can be used, for example, in connection with driver assistance systems.
- numerous applications for driver assistance systems based on image-based object detection are already known, for example systems for traffic sign recognition, parking and tracking.
- since the method according to the invention is distinguished in particular by its speed and the robustness of its detection results, it is particularly suitable for use in collision detection or collision avoidance.
- the driver can be made aware in time of suddenly approaching road users, or, e.g., the vehicle dynamics can be actively intervened in.
- the method according to the invention can also be used for object detection at the pixel level in connection with robot systems.
- future robots will be equipped with imaging sensors. These can be, for example, autonomous transport systems which navigate freely at their place of work, or stationary robots.
- the inventive method can be used in this context, for example, for collision detection or collision avoidance.
- in conjunction with a robot, the method can be used for the safe gripping of moving objects. These moving objects may be, e.g., objects in motion or a person who is assisted by the robot.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Image Analysis (AREA)
- Traffic Control Systems (AREA)
Abstract
Description
Claims
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE102005004510 | 2005-01-31 | ||
DE102005008131A DE102005008131A1 (en) | 2005-01-31 | 2005-02-21 | Object e.g. road sign, detecting method for use with e.g. driver assistance system, involves determining position and movement of relevant pixels using filter and combining relevant pixels to objects under given terms and conditions |
PCT/EP2006/000013 WO2006081906A1 (en) | 2005-01-31 | 2006-01-03 | Object detection on a pixel plane in a digital image sequence |
Publications (1)
Publication Number | Publication Date |
---|---|
EP1920406A1 true EP1920406A1 (en) | 2008-05-14 |
Family
ID=36577450
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP06700512A Withdrawn EP1920406A1 (en) | 2005-01-31 | 2006-01-03 | Object detection on a pixel plane in a digital image sequence |
Country Status (4)
Country | Link |
---|---|
US (1) | US20090297036A1 (en) |
EP (1) | EP1920406A1 (en) |
DE (1) | DE102005008131A1 (en) |
WO (1) | WO2006081906A1 (en) |
Families Citing this family (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP2048618A1 (en) * | 2007-10-08 | 2009-04-15 | Delphi Technologies, Inc. | Method for detecting an object |
AT506051B1 (en) * | 2007-11-09 | 2013-02-15 | Hopf Richard | METHOD FOR DETECTING AND / OR EVALUATING MOTION FLOWS |
DE102008005993A1 (en) * | 2008-01-24 | 2009-07-30 | Siemens Ag Österreich | Object e.g. vehicle, tracking method, involves evaluating movement pattern of particular movement of camera, and integrating evaluated movement pattern of particular movement of camera in movement model of object |
DE102009009047A1 (en) | 2009-02-16 | 2010-08-19 | Daimler Ag | Method for object detection |
DE102009016819B4 (en) | 2009-04-09 | 2011-12-15 | Carl Zeiss Optronics Gmbh | Method for detecting at least one object and / or at least one object group, computer program, computer program product, stereo camera device, actively radiation-emitting image sensor system and monitoring device |
DE102009028742A1 (en) | 2009-08-20 | 2011-02-24 | Robert Bosch Gmbh | Method and control device for determining a movement information of an object |
JP2013541915A (en) * | 2010-12-30 | 2013-11-14 | ワイズ オートモーティブ コーポレーション | Blind Spot Zone Display Device and Method |
DE102011111440A1 (en) | 2011-08-30 | 2012-06-28 | Daimler Ag | Method for representation of environment of vehicle, involves forming segments of same width from image points of equal distance in one of image planes, and modeling objects present outside free space in environment |
DE102012000459A1 (en) | 2012-01-13 | 2012-07-12 | Daimler Ag | Method for detecting object e.g. vehicle in surrounding area, involves transforming segments with classification surfaces into two-dimensional representation of environment, and searching and classifying segments in representation |
JP5957359B2 (en) * | 2012-10-19 | 2016-07-27 | 日立オートモティブシステムズ株式会社 | Stereo image processing apparatus and stereo image processing method |
DE102013016032A1 (en) | 2013-07-10 | 2014-04-10 | Daimler Ag | Method for detecting e.g. robot in stereoscopically detected images by two different perspectives using image capture unit, involves performing time filtering or optimization for estimation of disparity images by filter |
JP6110256B2 (en) | 2013-08-21 | 2017-04-05 | 株式会社日本自動車部品総合研究所 | Object estimation apparatus and object estimation method |
DE102013020947A1 (en) * | 2013-12-12 | 2015-06-18 | Valeo Schalter Und Sensoren Gmbh | Method for tracking a target object with brightness change, camera system and motor vehicle |
DE102015213557A1 (en) * | 2015-07-20 | 2017-01-26 | Bayerische Motoren Werke Aktiengesellschaft | Method and system for creating a three-dimensional model of a production environment |
DE102019128219A1 (en) | 2019-10-18 | 2021-04-22 | Connaught Electronics Ltd. | An image processing method |
Family Cites Families (23)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4980762A (en) * | 1989-10-13 | 1990-12-25 | Massachusetts Institute Of Technology | Method and apparatus for image processing to obtain three dimensional motion and depth |
US6009188A (en) * | 1996-02-16 | 1999-12-28 | Microsoft Corporation | Method and system for digital plenoptic imaging |
JP3512992B2 (en) * | 1997-01-07 | 2004-03-31 | 株式会社東芝 | Image processing apparatus and image processing method |
US6124864A (en) * | 1997-04-07 | 2000-09-26 | Synapix, Inc. | Adaptive modeling and segmentation of visual image streams |
US6215898B1 (en) * | 1997-04-15 | 2001-04-10 | Interval Research Corporation | Data processing system and method |
US6295367B1 (en) * | 1997-06-19 | 2001-09-25 | Emtera Corporation | System and method for tracking movement of objects in a scene using correspondence graphs |
US6192156B1 (en) * | 1998-04-03 | 2001-02-20 | Synapix, Inc. | Feature tracking using a dense feature array |
US6236738B1 (en) * | 1998-04-09 | 2001-05-22 | Board Of Trustees Of The Leland Stanford Junior University | Spatiotemporal finite element method for motion analysis with velocity data |
US7116324B2 (en) * | 1998-05-27 | 2006-10-03 | In-Three, Inc. | Method for minimizing visual artifacts converting two-dimensional motion pictures into three-dimensional motion pictures |
US6628819B1 (en) * | 1998-10-09 | 2003-09-30 | Ricoh Company, Ltd. | Estimation of 3-dimensional shape from image sequence |
US6677941B2 (en) * | 2000-08-05 | 2004-01-13 | American Gnc Corporation | Three-dimensional relative positioning and tracking using LDRI |
US7058204B2 (en) * | 2000-10-03 | 2006-06-06 | Gesturetek, Inc. | Multiple camera control system |
AU2002235939A1 (en) * | 2001-03-08 | 2002-09-19 | Universite Joseph Fourier | Quantitative analysis, visualization and movement correction in dynamic processes |
US7177445B2 (en) * | 2002-04-16 | 2007-02-13 | Koninklijke Philips Electronics N.V. | Discriminating between changes in lighting and movement of objects in a series of images using different methods depending on optically detectable surface characteristics |
US7132961B2 (en) * | 2002-08-12 | 2006-11-07 | Bae Systems Information And Electronic Systems Integration Inc. | Passive RF, single fighter aircraft multifunction aperture sensor, air to air geolocation |
US7188048B2 (en) * | 2003-06-25 | 2007-03-06 | Lockheed Martin Corporation | Refining stochastic grid filter |
EP1666915A1 (en) * | 2003-09-11 | 2006-06-07 | Mitsubishi Denki Kabushiki Kaisha | Radar device |
WO2005034041A1 (en) * | 2003-10-07 | 2005-04-14 | Openvr Co., Ltd. | Apparatus and method for creating 3-dimensional image |
US7366325B2 (en) * | 2003-10-09 | 2008-04-29 | Honda Motor Co., Ltd. | Moving object detection using low illumination depth capable computer vision |
US7831087B2 (en) * | 2003-10-31 | 2010-11-09 | Hewlett-Packard Development Company, L.P. | Method for visual-based recognition of an object |
US7456847B2 (en) * | 2004-08-12 | 2008-11-25 | Russell Steven Krajec | Video with map overlay |
US7447337B2 (en) * | 2004-10-25 | 2008-11-04 | Hewlett-Packard Development Company, L.P. | Video content understanding through real time video motion analysis |
US7583849B2 (en) * | 2005-07-25 | 2009-09-01 | Microsoft Corporation | Lossless image compression with tree coding of magnitude levels |
-
2005
- 2005-02-21 DE DE102005008131A patent/DE102005008131A1/en not_active Withdrawn
-
2006
- 2006-01-03 WO PCT/EP2006/000013 patent/WO2006081906A1/en active Application Filing
- 2006-01-03 EP EP06700512A patent/EP1920406A1/en not_active Withdrawn
- 2006-01-03 US US11/993,398 patent/US20090297036A1/en not_active Abandoned
Non-Patent Citations (1)
Title |
---|
"Serious Games", vol. 2449, 1 January 2002, SPRINGER INTERNATIONAL PUBLISHING, Cham, ISBN: 978-3-642-15171-2, ISSN: 0302-9743, article STEFAN HEINRICH: "Real Time Fusion of Motion and Stereo Using Flow/Depth Constraint for Fast Obstacle Detection", pages: 75 - 82, XP055537281, DOI: 10.1007/3-540-45783-6_10 * |
Also Published As
Publication number | Publication date |
---|---|
DE102005008131A1 (en) | 2006-08-03 |
US20090297036A1 (en) | 2009-12-03 |
WO2006081906A1 (en) | 2006-08-10 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2006081906A1 (en) | Object detection on a pixel plane in a digital image sequence | |
EP2043045B1 (en) | Method for tracking an object | |
DE102009012435B4 (en) | Apparatus and method for monocular motion stereo-based detection of free parking spaces | |
EP2757524B1 (en) | Depth sensing method and system for autonomous vehicles | |
DE602004012962T2 (en) | REAL-TIME OBSTACLE DETECTION WITH A CALIBRATED CAMERA AND KNOWN EGO MOTION | |
EP2372642B1 (en) | Method and system for detecting moving objects | |
EP2256690B1 (en) | Object motion detection system based on combining 3D warping techniques and a proper object motion detection | |
DE112016000187T5 (en) | Method and apparatus for estimating vehicle intrinsic movement based on panoramic images | |
DE102016209625A1 (en) | Method for evaluating image data of a vehicle camera | |
WO1992002894A1 (en) | Process for analysing time sequences of digital images | |
DE102010006828A1 (en) | Method for creating model of surrounding of vehicle, involves detecting surrounding of vehicle with help of sensor, where two surrounding models are produced automatically on basis of sensor data | |
DE102016104729A1 (en) | Method for extrinsic calibration of a camera, computing device, driver assistance system and motor vehicle | |
CN111862673A (en) | Parking lot vehicle self-positioning and map construction method based on top view | |
DE102012219834A1 (en) | Tracking system | |
EP2033165B1 (en) | Method for picking up a traffic space | |
DE102016104730A1 (en) | Method for detecting an object along a road of a motor vehicle, computing device, driver assistance system and motor vehicle | |
EP2394247B1 (en) | Method and apparatus for operating a video-based driver assistance system in a vehicle | |
DE102017100062A1 (en) | Visual Odometry | |
Willersinn et al. | Robust obstacle detection and tracking by motion analysis | |
DE102010013093A1 (en) | Method for creating model of surrounding area of motor vehicle i.e. car, involves determining whether card cells are loaded with object represented by three- dimensional structures | |
DE112021006799T5 (en) | SIGNAL PROCESSING APPARATUS, SIGNAL PROCESSING METHOD AND SIGNAL PROCESSING SYSTEM | |
JP2007280387A (en) | Method and device for detecting object movement | |
Shu et al. | Vision based lane detection in autonomous vehicle | |
DE102009028742A1 (en) | Method and control device for determining a movement information of an object | |
Goyat et al. | Tracking of vehicle trajectory by combining a camera and a laser rangefinder |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase |
Free format text: ORIGINAL CODE: 0009012 |
|
17P | Request for examination filed |
Effective date: 20071207 |
|
AK | Designated contracting states |
Kind code of ref document: A1 Designated state(s): DE FR GB |
|
RBV | Designated contracting states (corrected) |
Designated state(s): DE FR GB |
|
17Q | First examination report despatched |
Effective date: 20080618 |
|
RIN1 | Information on inventor provided before grant (corrected) |
Inventor name: RABE, CLEMENS Inventor name: GEHRIG, STEFAN Inventor name: BADINO, HERNAN Inventor name: FRANKE, UWE |
|
R17P | Request for examination filed (corrected) |
Effective date: 20071207 |
|
RAP1 | Party data changed (applicant data changed or rights of an application transferred) |
Owner name: DAIMLER AG |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN |
|
18D | Application deemed to be withdrawn |
Effective date: 20190716 |