EP2467826A1 - Method and control unit for determining movement information of an object - Google Patents

Method and control unit for determining movement information of an object

Info

Publication number
EP2467826A1
Authority
EP
European Patent Office
Prior art keywords
coordinate
unfiltered
determined
information
time
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP10742134A
Other languages
German (de)
English (en)
Inventor
Paul-Sebastian Lauer
Richard Arndt
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Robert Bosch GmbH
Original Assignee
Robert Bosch GmbH
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Robert Bosch GmbH filed Critical Robert Bosch GmbH
Publication of EP2467826A1 (fr)
Legal status: Withdrawn

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/20 Analysis of motion
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10016 Video; Image sequence
    • G06T2207/10021 Stereoscopic video; Stereoscopic image sequence
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30248 Vehicle exterior or interior
    • G06T2207/30252 Vehicle exterior; Vicinity of vehicle
    • G06T2207/30261 Obstacle

Definitions

  • the present invention relates to a method according to claim 1, a control device according to claim 13, and a computer program product according to claim 14.
  • the patent application DE 10 2005 008 131 A1 describes such a method. Its main feature is that it determines the 3D position and 3D speed of relevant pixels by means of at least one filter.
  • the computational intensity of the method results from the fact that one filter instance is needed per relevant pixel, e.g. an extended Kalman filter or a multi-hypothesis Kalman filter, i.e. a composite of several extended Kalman filters.
  • the present invention proposes a method for determining motion information of an object located in an environment of a vehicle, furthermore a control device which uses this method and finally a corresponding computer program product according to the independent patent claims.
  • Advantageous embodiments result from the respective subclaims and the following description.
  • the purpose of the invention is to efficiently calculate the 3D flow of a set of relevant points.
  • the calculation can be made in a manner that it can be performed on a cost-effective controller.
  • the 3D flow describes, for a given time, the 3D position of a set of relevant pixels and an estimated 3D speed for each of these points.
  • the position may be a direct, e.g. disparity-map-based, unfiltered 3D coordinate in relation to a defined coordinate system.
  • the 3D velocity vector can be determined in relation to the same coordinate system using the temporally consecutive unfiltered 3D coordinates, by applying a suitable filter.
  • the temporal association of the 3D coordinates can take place via a series of frames that may be provided by a camera system.
  • the 3D coordinates can be projected into the image space of the camera system, resulting in 2D image coordinates. Their displacement vectors between two successive images can then be determined, e.g. by computing the optical flow.
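A minimal sketch of this projection step, assuming a simple pinhole camera model; the function and parameter names (project_to_image, focal_length, principal_point) are illustrative assumptions, not taken from the patent:

```python
import numpy as np

def project_to_image(point_3d, focal_length, principal_point):
    # Pinhole projection of a 3D point, given in the camera coordinate
    # system, onto the 2D image plane.
    x, y, z = point_3d
    u = focal_length * x / z + principal_point[0]
    v = focal_length * y / z + principal_point[1]
    return np.array([u, v])

# The displacement of the projected point between two successive frames
# approximates the optical-flow vector of that pixel.
p_prev = project_to_image((1.2, 0.4, 10.0), 800.0, (320.0, 240.0))
p_curr = project_to_image((1.0, 0.4, 9.5), 800.0, (320.0, 240.0))
flow_2d = p_curr - p_prev
```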
  • the inventive approach enables an efficient 3D flow estimation.
  • the estimate may be based on a recursive computation of 3D flow vectors from direct, unfiltered measurements of 3D coordinates and a derivation of a 3D velocity vector, which may be determined by filtering the successive 3D coordinates.
  • taking the difference between successive 3D coordinates is a simple method that already gives good results. Alternatively, a filter can be used; both the unfiltered case and the filtered case can be used to determine the velocity vectors.
  • filtering can be dispensed with when determining the 3D speed. The proper motion of the sensor system can also be included in the 3D flow estimation, for example to determine relevant pixels (e.g. only those that move relative to the ground) and/or to improve the speed estimate.
  • a reduction of the computational effort is possible because filtering is required only for the velocity vectors of the relevant pixels.
  • an estimation of a 3D state takes place.
  • an easy-to-handle system is created, since only a few manipulated variables are required for parameterization, in contrast to the number required with combined filtering of 3D position and 3D speed.
  • this is because, with combined filtering, a physically plausible motion model must usually be applied to obtain reliable estimates of speed and position.
  • typically, linear models are used for this, which allow the (assumed uniform) speed to change, for example through (parameterizable) acceleration noise.
  • the noise can be fixed or adapted to the given input data (eg the video images to be evaluated).
  • a derivation of robust object hypotheses is possible by combining many individual measurements into point clouds and determining object attributes on the basis of meaningful statistical quantities, e.g. mean and variance. This can reduce the measurement uncertainty compared to the individual measurements.
  • a point cloud that has been combined into an object can provide several statistical quantities. The most important are the mean value and variance of the velocity vector, of the object distance, of the object width and of the object height (see the sketch below).
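As a hedged illustration of such object statistics, the following sketch summarizes a point cloud of N object points into the quantities named above; the function name, the use of the z component as object distance, and the axis extents as width/height are assumptions:

```python
import numpy as np

def object_statistics(positions, velocities):
    # positions, velocities: arrays of shape (N, 3) for one object cloud.
    positions = np.asarray(positions, dtype=float)
    velocities = np.asarray(velocities, dtype=float)
    return {
        "velocity_mean": velocities.mean(axis=0),
        "velocity_var": velocities.var(axis=0),
        "distance": positions[:, 2].mean(),      # z taken as distance
        "distance_var": positions[:, 2].var(),
        "width": np.ptp(positions[:, 0]),        # extent along x
        "height": np.ptp(positions[:, 1]),       # extent along y
    }
```

Variances of the width and height measures could be tracked analogously over time if needed.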
  • the 3D flow estimation according to the invention can be used, for example, for stereo video-based driver assistance systems.
  • the method is particularly suitable for safety functions such as collision-avoiding braking.
  • all of the input data required for the 3D flow estimation can be provided, for example, by driver assistance systems with existing high-performance FPGA implementations, which enable an estimation of the optical flow, a detection of relevant pixels and the calculation of disparity maps.
  • the reaction rate with which an object detector reacts to sudden movement changes can be increased.
  • the position of the vehicle in front can generally be correctly estimated immediately by means of the 3D flow estimation approach according to the invention.
  • the speed estimate may go through a transient phase, depending on whether a filter is used and on the filter setting.
  • the present invention provides a method for determining movement information of an object located in an environment of a vehicle, the method comprising the steps of: receiving, via an interface, a first unfiltered 3D coordinate, wherein the first unfiltered 3D coordinate defines a position of an object point within a predetermined coordinate system at a first time; receiving at least one second unfiltered 3D coordinate via the interface, wherein the at least one second unfiltered 3D coordinate defines a position of the object point within the predetermined coordinate system at at least one second time; and determining velocity information concerning the object point based on the first unfiltered 3D coordinate and the at least one second unfiltered 3D coordinate.
  • the method according to the invention can be implemented, for example, in a control unit of a vehicle.
  • the vehicle may have an environment detection system with which at least a section of the surroundings of the vehicle can be detected.
  • the object may be, for example, an inanimate object, another vehicle, a person or an animal. In particular, it may be an object moving relative to the environment.
  • movement information relating to a single object or a plurality of objects can be determined.
  • the object point may represent a point in the surroundings of the vehicle.
  • the object point can be detected by the surroundings detection system.
  • the method according to the invention can be carried out for all detected object points or for a selection of object points. For example, relevant object points can be selected from all detected object points.
  • the relevant object points may be object points that are in motion in relation to the environment.
  • it may be a point that can be assigned to a relevant object that is located in the environment.
  • the 3D coordinates may indicate spatial positions of the object point at different times.
  • a coordinate system can be spanned and the 3D coordinates can represent an x, y, and z position of the object point.
  • the x and y positions can be determined, for example, by mapping the object point onto an image plane of an optical sensor.
  • the z-position can be determined, for example, by means of a stereo depth measurement or else determined by means of other systems, such as radar measurement, navigation maps or C2X communication. Other coordinate systems and representations of the position of the object point are also possible.
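The stereo case mentioned above can be made concrete with the standard rectified-stereo relations; this is a sketch under the assumption of a calibrated, rectified camera pair, and all names (reconstruct_3d, baseline, etc.) are illustrative:

```python
import numpy as np

def reconstruct_3d(u, v, disparity, focal_length, baseline, principal_point):
    # Depth from disparity, then back-projection of the pixel (u, v)
    # into the camera coordinate system; yields an unfiltered 3D coordinate.
    z = focal_length * baseline / disparity
    x = (u - principal_point[0]) * z / focal_length
    y = (v - principal_point[1]) * z / focal_length
    return np.array([x, y, z])
```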
  • the 3D coordinates can represent directly determined measured values. This means that the 3D coordinates can be determined without using a filter rule. Each 3D coordinate can thus be based exclusively on measured values which have been determined at a time assigned to the specific 3D coordinate. For example, each 3D coordinate may be determined from frames captured by the surround detection system simultaneously or within a detection cycle.
  • the interface may be an interface to the environment detection system or a corresponding evaluation or preprocessing device.
  • the second time may be later than the first time. In the period of time lying between the first and the second time point, the object point may have moved from the first position determined by the first 3D coordinate to the second position determined by the second 3D coordinate.
  • a movement speed and a direction of movement of the object point can be determined.
  • the speed information or motion information may include both the movement speed and the direction of movement or values based thereon.
  • the speed information can be provided to an interface for further processing. For example, the velocity information may be determined as an unfiltered value from the first unfiltered 3D coordinate and the at least one second unfiltered 3D coordinate.
  • the speed information can be determined without using a filter rule.
  • the speed information can be determined solely using the unfiltered 3D coordinate.
  • the computing intensity for determining the speed information can be kept very low.
  • the speed information can be determined using a filtering rule.
  • as a filtering rule, a multi-model filter, for example a Kalman filter, can be used (a minimal sketch follows below).
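One possible filtering rule, sketched here under stated assumptions rather than as the patent's prescribed filter, is a constant-velocity Kalman filter applied per coordinate axis: only the unfiltered position is measured and the velocity is estimated. The class name and noise parameters are assumptions; the multi-model variant mentioned above is not shown:

```python
import numpy as np

class ConstantVelocityKF:
    # Minimal per-axis constant-velocity Kalman filter: the state is
    # x = [position, velocity]; only the position is measured.
    def __init__(self, dt, meas_var=0.05, accel_var=1.0):
        self.F = np.array([[1.0, dt], [0.0, 1.0]])            # state transition
        self.H = np.array([[1.0, 0.0]])                       # position-only measurement
        self.Q = accel_var * np.array([[dt**4 / 4, dt**3 / 2],
                                       [dt**3 / 2, dt**2]])   # acceleration-noise model
        self.R = np.array([[meas_var]])                       # measurement noise
        self.x = np.zeros(2)
        self.P = np.eye(2)

    def step(self, z):
        # predict
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        # correct with the unfiltered position measurement z
        y = z - (self.H @ self.x)[0]
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T / S[0, 0]                       # Kalman gain, shape (2, 1)
        self.x = self.x + K[:, 0] * y
        self.P = (np.eye(2) - K @ self.H) @ self.P
        return self.x[1]                                      # filtered velocity estimate

# Usage: one filter per axis, fed with the unfiltered coordinates frame by frame.
kf_z = ConstantVelocityKF(dt=0.04)
for z_meas in [10.0, 9.5, 9.1, 8.6]:
    vz = kf_z.step(z_meas)
```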
  • the velocity information may be a 3D velocity vector.
  • the speed information can indicate both an amount of the speed and a direction of movement of the object point.
  • the speed information may be information about an expected collision time between the object point and the vehicle.
  • the collision time can be further processed, for example, by a driver assistance system.
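A common way to derive such a collision time, shown as a sketch (the patent text does not spell out a formula), is to divide the distance of the object point by its speed component along the line of sight:

```python
import numpy as np

def time_to_collision(position, velocity):
    # Distance to the object point divided by its closing speed, i.e. the
    # component of its velocity directed towards the sensor origin.
    position = np.asarray(position, dtype=float)
    velocity = np.asarray(velocity, dtype=float)
    distance = np.linalg.norm(position)
    closing_speed = -np.dot(velocity, position) / distance
    if closing_speed <= 0.0:
        return None  # object point is not approaching
    return distance / closing_speed

# e.g. a point 10 m ahead approaching at 5 m/s yields a TTC of 2.0 s
ttc = time_to_collision((0.0, 0.0, 10.0), (0.0, 0.0, -5.0))
```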
  • the speed information can be determined by forming the difference between the first unfiltered 3D coordinate and the at least one second unfiltered 3D coordinate. This enables a quick and uncomplicated determination of the speed information.
  • a plurality of unfiltered 3D coordinates and a plurality of speed information can be determined for a plurality of object points.
  • a set of object points, out of the plurality of object points, can be assigned to an object on the basis of the plurality of unfiltered 3D coordinates and the plurality of speed information, using an assignment rule. If there are a plurality of relevant objects in the surroundings of the vehicle, the plurality of object points can also be divided among a plurality of different objects.
  • the assignment of the set of object points can combine object points into an object cloud.
  • Each object cloud can represent an object.
  • the object can be determined by means of a suitable classification method from the object cloud. For example, adjacent object points that have a common direction of movement and movement speed can be assigned to an object. Thus, a detection and tracking of an object located in the environment of the vehicle is possible.
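A greedy grouping of object points into object clouds along these lines might look as follows; the thresholds and the comparison against a cloud's first member are simplifying assumptions, not the patent's assignment rule:

```python
import numpy as np

def group_into_objects(points, velocities, pos_tol=1.0, vel_tol=0.5):
    # Greedy grouping: a point joins an existing cloud when it is close to
    # the cloud's first member in both position and velocity.
    points = np.asarray(points, dtype=float)
    velocities = np.asarray(velocities, dtype=float)
    clouds = []  # each cloud is a list of point indices
    for i in range(len(points)):
        for cloud in clouds:
            j = cloud[0]
            if (np.linalg.norm(points[i] - points[j]) < pos_tol and
                    np.linalg.norm(velocities[i] - velocities[j]) < vel_tol):
                cloud.append(i)
                break
        else:
            clouds.append([i])
    return clouds
```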
  • a statistical quantity may be determined that may be associated with the set of object points. This may be, for example, a mean value or a variance of values associated with the object points.
  • the statistical quantity can be determined from the velocity information of the object points associated with an object.
  • an object attribute assigned to the set of object points can be determined.
  • the object attribute may represent a distance, width, or velocity of the object.
  • the object attribute can be determined by an evaluation of the 3D coordinates as well as the speed information.
  • the object attribute can be used to classify an object or to evaluate a movement of an object.
  • the 3D coordinates may represent information provided by a surround detection system.
  • the environment detection system may include a plurality of sensors. These may be optical sensors which are designed to detect and image the surroundings of the vehicle.
  • the surround detection system may be a stereo camera arrangement.
  • the surround detection system can provide a stereo depth measurement to determine a distance of an object point.
  • other suitable sensors, measuring methods or evaluation methods for determining the 3D coordinates can also be used.
  • the surroundings detection system can be designed to provide a first image of the environment at the first time and at least one second image at the at least one second time.
  • the first 3D coordinate may be determined based on the first image and the at least one second 3D coordinate may be determined based on the second image.
  • Known image analysis methods can be used to evaluate the images.
  • receiving information about a proper motion of the surroundings detection system can take place via an interface. Accordingly, the speed information can also be determined based on the information about the proper motion. In this way, the speed information can be determined with greater accuracy. In addition, the determination of relevant object points is simplified.
  • the present invention further provides a control device which is designed to carry out or implement the steps of the method according to the invention.
  • by this variant of the invention in the form of a control device, the object underlying the invention can be achieved quickly and efficiently.
  • a control device can be understood as meaning an electrical device which processes sensor signals and outputs control signals in dependence thereon.
  • the control unit may have an interface, which may be formed in hardware and / or software.
  • the interfaces can be part of a so-called system ASIC, for example, which contains various functions of the control unit.
  • alternatively, the interfaces may be dedicated integrated circuits or may at least partially consist of discrete components.
  • the interfaces may be software modules that are present, for example, on a microcontroller in addition to other software modules.
  • Also of advantage is a computer program product with program code which is stored on a machine-readable carrier such as a semiconductor memory, a hard disk memory or an optical memory, and which is used to carry out the method according to one of the embodiments described above when the program is executed on a control unit.
  • Fig. 1 is a flow chart of a first embodiment of the present invention.
  • FIG. 2 shows a sequence of an estimation of a vector of the 3D flow field, according to an exemplary embodiment of the present invention.
  • the same or similar reference numerals are used for elements that are shown in the various figures and act similarly; a repeated description of these elements is omitted.
  • a first unfiltered 3D coordinate can be received, which can be assigned to a first point in time.
  • a second unfiltered 3D coordinate can be received, which can be assigned to a second time.
  • further unfiltered 3D coordinates can also be received, which can be assigned to further points in time.
  • speed information can be determined from the 3D coordinates received with regard to the one object point.
  • Corresponding method steps 102, 104, 106 can be performed simultaneously or offset in time for further object points, so that speed information for different object points can be determined.
  • FIG. 2 shows a flow of an estimation of a vector of a 3D flow field, according to an embodiment of the present invention.
  • the vector can correspond to the velocity information which can be determined according to the invention.
  • the x-axis and the y-axis span an image plane 210 of an optical sensor.
  • the optical sensor may be disposed on a vehicle and configured to image an environment of the vehicle.
  • a focal point 212 of the optical sensor is located at the origin of the coordinate system. Further, a focal length 214 is shown.
  • An object point is located at a first time at a position 221, for example in the environment of a vehicle. At a second time the object point is located at a position 222, and at a third time at a position 223.
  • the positions 221, 222, 223 can be defined with respect to the coordinate system respectively by 3D coordinates, each having an x value, a y value and a z value. Distances between the focal point 212 and the object point at the positions 221, 222, 223 can each be determined via a depth measurement, for example by stereo video.
  • the object point can be imaged on the image plane 210. Thus, the object point can be imaged at position 231 at the first time, at position 232 at the second time, and at position 233 at the third time.
  • the positions 231, 232, 233 can each be described by an x and a y coordinate.
  • the object point performs a movement at a first time, the direction and magnitude of which is represented by a first velocity vector 241.
  • the object point performs a movement represented by a second velocity vector 242
  • the object point performs a motion represented by a third velocity vector 243.
  • the movements of the object point can be imaged onto the image plane 210.
  • a vector 251 represents a 2D trajectory located in the image plane 210 between position 231 and position 232.
  • a vector 252 represents a 2D trajectory between position 232 and position 233.
  • the vector 252 represents a shift of the relevant pixel from (t-1) to (t). This corresponds to the optical flow.
  • the positions 221, 222, 223 thus represent 3D positions of the object point at the respective times.
  • the first velocity vector 241 represents a 3D flow vector at the first time (t-2), the second velocity vector 242 a 3D flow vector at the second time (t-1), and the third velocity vector 243 a 3D flow vector at the third time (t).
  • the lengths of the velocity vectors 241, 242, 243 respectively represent 3D speeds.
  • positions 221, 222, 223 represent a 3D trajectory of the relevant pixel from (t-2) to (t).
  • the real positions 221, 222, 223 of the object point can be represented by 3D coordinates, which are determined via the corresponding 2D positions 231, 232, 233 and the associated depth measurements.
  • the respective 3D coordinates can be determined directly from the respectively corresponding 2D positions 231, 232, 233 and the respectively corresponding depth measurements, without the use of a filter or of temporally preceding or temporally subsequent 2D positions or depth measurements.
  • for example, the 3D coordinate corresponding to position 221 can be determined directly from the 2D position 231 and the depth measurement that provides the distance between position 221 and the focal point 212.
  • a 3D trajectory can be determined, which represents a real movement of the object point in the environment of the vehicle.
  • the 3D flow of a set of relevant pixels based on a digital image sequence and a distance-measuring sensor can be efficiently determined.
  • relevant pixels are first determined within the image sequence.
  • the 3D position of a relevant pixel is determined.
  • the pixel is tracked in the image sequence by a suitable method, e.g. a KLT tracker, and then measured again three-dimensionally.
  • a 3D trajectory of the relevant pixel arises, from which an estimated value for the 3D velocity vector can be derived. In the simplest case, this can be done by subtracting two consecutive 3D positions.
  • the distance traveled can then be translated into a velocity vector, e.g. by including the time elapsed between the observations of the two 3D positions (see the sketch below).
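A minimal sketch of this difference-quotient estimate, assuming the frame interval dt is known; the function name is illustrative:

```python
import numpy as np

def velocity_by_difference(coord_prev, coord_curr, dt):
    # Unfiltered 3D velocity vector from two successive unfiltered
    # 3D coordinates; dt is the time elapsed between the observations.
    return (np.asarray(coord_curr) - np.asarray(coord_prev)) / dt

# e.g. two measurements 40 ms apart (25 fps camera)
v = velocity_by_difference((1.2, 0.4, 10.0), (1.0, 0.4, 9.5), 0.04)
```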
  • for filtering the velocity, various methods are conceivable, e.g. a three-dimensional Kalman filter. Since the initial speed is as a rule unknown, a multi-hypothesis Kalman filter or, more generally, a multi-model filter can also be used. The reliability of the speed estimation as well as the selection of the relevant pixels can be increased if the proper motion of the sensor system is included in the calculation.
  • the proper motion can be determined by external sensors, such as odometry in the vehicle, and / or on the basis of the measured data of the sensor system itself.
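A minimal sketch of such an ego-motion compensation, assuming the proper motion is available as a translational velocity in the sensor coordinate system (rotation is ignored here; the function name is illustrative):

```python
import numpy as np

def compensate_ego_motion(velocity_measured, ego_velocity):
    # Ground-relative velocity of an object point: subtract the sensor
    # system's own (e.g. odometry-derived) velocity from the velocity
    # measured in the sensor coordinate system.
    return np.asarray(velocity_measured) - np.asarray(ego_velocity)

# Points whose compensated speed exceeds a threshold can be selected
# as relevant, i.e. as actually moving over ground.
```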
  • the distance measurement can come from any sensor, e.g. from a stereo camera system or a MEMS LIDAR whose relative orientation to the video sensor providing the digital image sequence is known.
  • the 3D flow field is created by performing the procedure described above for many relevant pixels. Within the flow field, 3D point clouds can then be combined into objects, as described above.
  • if an exemplary embodiment comprises an "and/or" link between a first feature and a second feature, this can be read to mean that the embodiment, according to one variant, has both the first feature and the second feature and, according to another variant, has either only the first feature or only the second feature.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Analysis (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention relates to a method for determining movement information of an object located in an environment of a vehicle, the method comprising a step of receiving a first unfiltered 3D coordinate (221) and a second unfiltered 3D coordinate (222), the 3D coordinates defining positions of an object point within a predefined coordinate system at different times. The method also comprises a step of determining velocity information concerning the object point on the basis of the first unfiltered 3D coordinate (221) and the second unfiltered 3D coordinate (222).
EP10742134A 2009-08-20 2010-08-06 Procédé et appareil de commande pour déterminer une information de déplacement d'un objet Withdrawn EP2467826A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE200910028742 DE102009028742A1 (de) 2009-08-20 2009-08-20 Verfahren und Steuergerät zur Bestimmung einer Bewegungsinformation eines Objekts
PCT/EP2010/061483 WO2011020713A1 (fr) 2009-08-20 2010-08-06 Procédé et appareil de commande pour déterminer une information de déplacement d'un objet

Publications (1)

Publication Number Publication Date
EP2467826A1 true EP2467826A1 (fr) 2012-06-27

Family

ID=42751596

Family Applications (1)

Application Number Title Priority Date Filing Date
EP10742134A Withdrawn EP2467826A1 (fr) 2009-08-20 2010-08-06 Procédé et appareil de commande pour déterminer une information de déplacement d'un objet

Country Status (3)

Country Link
EP (1) EP2467826A1 (fr)
DE (1) DE102009028742A1 (fr)
WO (1) WO2011020713A1 (fr)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102016218849A1 (de) 2016-09-29 2018-03-29 Conti Temic Microelectronic Gmbh Detektion und Tracking von Objekten aus Bildern einer Kamera
DE102016218853A1 (de) 2016-09-29 2018-03-29 Conti Temic Microelectronic Gmbh Detektion und Validierung von Objekten aus Bildern einer Kamera
DE102016218852A1 (de) 2016-09-29 2018-03-29 Conti Temic Microelectronic Gmbh Detektion von Objekten aus Bildern einer Kamera
DE102016223107A1 (de) * 2016-11-23 2018-05-24 Volkswagen Aktiengesellschaft System und Verfahren zur Schätzung der Eigenbewegung eines Fahrzeuges
FR3070658B1 (fr) * 2017-09-06 2019-08-30 IFP Energies Nouvelles Procede de determination d'une vitesse a atteindre pour un premier vehicule precede par un deuxieme vehicule, en particulier pour un vehicule autonome

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6603535B1 (en) * 2002-08-30 2003-08-05 The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration Stereo imaging velocimetry system and method
DE102004018813A1 (de) * 2004-04-19 2006-02-23 Ibeo Automobile Sensor Gmbh Verfahren zur Erkennung und/oder Verfolgung von Objekten
DE102005008131A1 (de) 2005-01-31 2006-08-03 Daimlerchrysler Ag Objektdetektion auf Bildpunktebene in digitalen Bildsequenzen
DE102007018470A1 (de) * 2007-04-19 2008-10-23 Robert Bosch Gmbh Fahrerassistenzsystem und Verfahren zur Objektplausibilisierung
EP2071515A1 (fr) * 2007-12-11 2009-06-17 Honda Research Institute Europe GmbH Suivi visuel d'un objet dans le monde réel à l'aide de l'aspect 2D et des estimations de la profondeur à l'aide de plusieurs indices
DE102008025773A1 (de) * 2008-05-29 2009-01-08 Daimler Ag Verfahren zur Schätzung eines Orts- und Bewegungszustands eines beobachteten Objekts

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
BOTA S ET AL: "Multi-feature walking pedestrians detection for driving assistance systems", IET INTELLIGENT TRANSPORT SYSTEMS,, vol. 2, no. 2, 9 June 2008 (2008-06-09), pages 92 - 104, XP006030989, ISSN: 1751-9578, DOI: 10.1049/IET-ITS:20070039 *
LOVE N S ET AL: "Recognition of 3D compressed images and its traffic monitoring applications", INTELLIGENT VEHICLES SYMPOSIUM, 2000. IV 2000. PROCEEDINGS OF THE IEEE DEARBORN, MI, USA 3-5 OCT. 2000, PISCATAWAY, NJ, USA,IEEE, US, 3 October 2000 (2000-10-03), pages 463 - 467, XP010528980, ISBN: 978-0-7803-6363-2, DOI: 10.1109/IVS.2000.898386 *
See also references of WO2011020713A1 *

Also Published As

Publication number Publication date
DE102009028742A1 (de) 2011-02-24
WO2011020713A1 (fr) 2011-02-24

Similar Documents

Publication Publication Date Title
EP2043045B1 (fr) Procédé destiné au suivi d'objet
DE102014209137B4 (de) Verfahren und Vorrichtung zur Kalibrierung eines Kamerasystems eines Kraftfahrzeugs
EP0541567B1 (fr) Procede d'analyse de sequences chronologiques d'images numeriques
WO2017206999A1 (fr) Procédé d'évaluation de données image d'une caméra de véhicule
EP1531343B1 (fr) Procédé de suivi d'objets
DE602004012962T2 (de) Echtzeit-hinderniserkennung mit einer kalibrierten kamera und bekannter ego-bewegung
DE102017102506B4 (de) Verfahren zum erfassen anderer fahrzeuge relativ zu einem trägerfahrzeug mittels raderfassung
DE102012216386A1 (de) Verfahren zum Betreiben eines Fahrerassistenzsystems eines Fahrzeugs
DE112014002019T5 (de) Kurvenformmodelliereinrichtung, Fahrzeuginformationsverarbeitungssystem, Kurvenformmodellierverfahren und Kurvenformmodellierprogramm
DE102005008131A1 (de) Objektdetektion auf Bildpunktebene in digitalen Bildsequenzen
EP3044727B1 (fr) Procédé et dispositif de détection d'objets d'après des données d'image ayant une résolution de profondeur
WO2013178407A1 (fr) Procédé et dispositif pour traiter des données stéréoscopiques
DE102018123393A1 (de) Erkennung von Parkflächen
DE102007049706A1 (de) Verfahren zur Schätzung der Relativbewegung von Video-Objekten und Fahrerassistenzsystem für Kraftfahrzeuge
WO2011020713A1 (fr) Procédé et appareil de commande pour déterminer une information de déplacement d'un objet
DE102018204451A1 (de) Verfahren und Vorrichtung zur Autokalibrierung eines Fahrzeugkamerasystems
DE102020211970A1 (de) Verfahren zum Steuern eines Fahrzeugs
EP3663881B1 (fr) Procédé de commande d'un véhicule autonome en fonction des vecteurs de mouvement estimés
EP2736017A2 (fr) Procédé de détermination du mouvement d'un véhicule automobile
EP2394247B1 (fr) Procédé et dispositif de mise en oeuvre d'un système d'aide à la conduite par vidéo dans un véhicule
DE102011118171A1 (de) Verfahren und Vorrichtung zur Schätzung einer Fahrbahnebene und zur Klassifikation von 3D-Punkten
DE102014204360A1 (de) Verfahren sowie Vorrichtung zur Abschätzung des Abstandes eines in Bewegung befindlichen Fahrzeuges von einem Objekt
DE102008042631A1 (de) Verfahren und Vorrichtung zur Abstandsermittlung bei einem monokularen Videoassistenzsystem
DE102020105192A1 (de) Verfahren zum Detektieren bewegter Objekte in einer Fahrzeugumgebung und Kraftfahrzeug
DE102016223072B4 (de) Bestimmen einer Eigenbewegung eines Fahrzeugs anhand von ortsfesten Objekten

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20120320

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO SE SI SK SM TR

DAX Request for extension of the european patent (deleted)
17Q First examination report despatched

Effective date: 20130205

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20140124