WO2008058928A1 - Procédé de localisation d'un objet - Google Patents


Info

Publication number
WO2008058928A1
Authority
WO
WIPO (PCT)
Prior art keywords
image features
route
navigation
vehicle
image
Prior art date
Application number
PCT/EP2007/062204
Other languages
German (de)
English (en)
Inventor
Thomas Engelberg
Mario Mueller
Original Assignee
Robert Bosch Gmbh
Priority date
Filing date
Publication date
Application filed by Robert Bosch GmbH
Publication of WO2008058928A1

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/28 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network with correlation of data from several navigational instruments
    • G01C21/30 Map- or contour-matching
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C11/00 Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
    • G01C11/04 Interpretation of pictures
    • G01C11/06 Interpretation of pictures by comparison of two or more pictures of the same area
    • G01C11/08 Interpretation of pictures by comparison of two or more pictures of the same area the pictures not being supported in the same relative position as when they were taken
    • G01C11/10 Interpretation of pictures by comparison of two or more pictures of the same area the pictures not being supported in the same relative position as when they were taken using computers to control the position of the pictures

Definitions

  • The invention relates to a method for locating an object, a device for locating an object, a computer program, and a computer program product.
  • A GPS signal may be corrupted by shadowing and/or multipath propagation. If the GPS signal is missing over longer distances, larger errors accumulate in dead reckoning, which can eventually lead to incorrect route guidance. Especially in densely built-up areas this can result in faulty navigation.
  • In the method according to the invention, first image features are extracted from a 3D navigation and second image features are extracted from video data. These image features are processed jointly. In one embodiment of the method, the image features are evaluated simultaneously. It is also possible for the image features to be fused.
  • The extracted first and second image features can be associated or linked with one another, so that so-called matching, i.e. an alignment or adaptation of the image features, is possible.
  • An embodiment of the invention provides that a route of the object is determined.
  • Image information provided from the video data and a representation of the route as a so-called optically highlighted driving tube assigned to the route can be displayed together, in particular by cross-fading and/or in a contact-analog manner.
  • In one embodiment, an object designed as a vehicle is located.
  • A suitable route to be traveled for reaching a specific destination can thus be determined. Furthermore, the route or driving recommendation can be displayed to the driver of the vehicle in a shared representation with the video images on a display device.
  • The invention also relates to a device for locating an object, which has a processing module designed to jointly process image features extracted from a 3D navigation and image features extracted from video data.
  • In one embodiment, this device has an in particular georeferenced 3D database for providing the image features of the 3D navigation and at least one camera for providing the video data.
  • Furthermore, the device can have a position determination module for locating the object.
  • In one embodiment, the device has a navigation unit for calculating a route of the object.
  • The device may also have a display device for jointly displaying video images provided from the video data and a representation of the route.
  • In one embodiment, this device cooperates with a mobile object designed as a vehicle, the device, and in particular the at least one camera, being designed as a component of the object and thus of the vehicle.
  • In this way, an environment of the object can be sensed from the object itself.
  • A computer program according to the invention with program code means is designed to carry out all the steps of a method according to the invention when the computer program is executed on a computer or a corresponding computing unit, in particular in a device according to the invention.
  • The invention also relates to a computer program product with program code means stored on a computer-readable data carrier, designed to carry out all the steps of a method according to the invention when the computer program is executed on a computer or a corresponding computing unit, in particular in a device according to the invention.
  • The invention thus provides, among other things, a device and thus a system which extracts features, in particular image features, from the 3D database of a three-dimensional navigation; these can be used for localization simultaneously with image features from video data and thus video image data.
  • The invention can therefore help to relieve the driver of the vehicle in confusing traffic situations.
  • In one embodiment, a driving recommendation is displayed contact-analogously in a video image.
  • The optical representation of the driving tube assigned to the recommended route is superimposed, as a color-coded area, in correct perspective on the image of the video camera or in a head-up display, so that this colored surface coincides exactly with the actual course of the road.
  • The GPS signal and the localization must in this case be accurate to about 1 m or better. As a rule, the path suggested by the superimposed driving tube then coincides sufficiently well with the actual course of the road.
  • Camera systems may be used which comprise at least one camera mounted on the vehicle and viewing the vehicle's exterior, this at least one camera displaying suitably processed image data on a display device (e.g., NightView from DC).
  • An extraction of the image features can take place offline. Accordingly, features, in particular image features, can be determined offline before the device is put into operation and stored together with the actual texture data in the 3D database. As a result, the processing time required to execute the method is reduced. If, for example, no display of the 3D data is required, storage of the texture data can be dispensed with, which considerably reduces the required amount of data.
  • Not only image features extracted from individual images, but also image features obtained from the evaluation of successive image pairs may serve for providing the display. Accordingly, the image features can be linked to one another in a temporal sequence via the optical flow.
  • At least one 3D camera may be used instead of at least one ordinary camera which images a scene as a two-dimensional representation.
  • In this case, a depth value can be provided in addition to the gray-value information for each pixel of the video image.
  • The search for correspondences can then take place directly in three-dimensional space, i.e. a conversion of virtual scenes into a two-dimensional image and thus a video image is eliminated.
  • The correspondence search also becomes more robust, since ambiguities can arise in a search in the two-dimensional image: points on the same viewing ray, regardless of their depth value, are always imaged onto the same pixel.
  • The at least one camera, in particular a video camera, can be arranged inside or outside the vehicle.
  • The 3D database with texture data provides a virtual but realistic 3D view of the street scene. These data, including the textures, are used for the localization.
  • Figure 1 shows a schematic representation of an embodiment of a device according to the invention.
  • Figure 2 shows a schematic representation of an embodiment of a video image in which a contact analogue superimposed driving recommendation is shown.
  • The embodiment of the device 2 according to the invention illustrated schematically in Figure 1 comprises at least one camera 4 mounted on a vehicle, a georeferenced 3D database 6, a first module 8 for extracting image features from the 3D database 6, and a second module 10 for extracting image features from the camera 4. Furthermore, a processing module 12 is provided for associating the image features extracted in the first and second modules 8, 10, and thus for matching them, and a position determination module 14 for 3D localization of the vehicle. This 3D localization is based on associated data provided by the processing module 12.
  • The determination of the position, and thus the 3D localization, is supported by GPS data which are provided to a receiving module 16 of the device 2 by satellite and which are suitable for initialization. In the case of missing 3D data, e.g. in undeveloped areas, GPS data are used for 3D localization.
  • The position determination module 14 is provided with a first sensor 18 for detecting
  • Furthermore, the position determination module 14 has a prediction module 24.
  • A position of the vehicle determined by the position determination module 14 is used by a navigation unit 26 to calculate a route of the vehicle.
  • The route calculation requires additional data for conventional digital maps, stored in a 2D database 28.
  • This 2D database 28 includes lanes as connections between certain nodes, road classes, one-way streets, etc.
  • The data to be displayed, which in the present embodiment comprise navigation instructions and a driving tube associated with the route, are displayed on a display device 30, which is designed as a screen or head-up display.
  • The 3D database 6 stores 3D models of buildings, bridges and other artificial structures. These 3D models also include texture data, i.e. photographic data of objects such as house fronts and so forth. From the 3D models and the texture data, artificial views of a scene can be generated for the display. Furthermore, 3D models of road intersections and other road structures, e.g. a roundabout, can be stored.
  • The 3D model of an intersection is parameterized by the number, widths and angles of the intersecting roads as well as the coordinates of a reference point and the number of lanes in each intersecting road.
  • In addition, texture data of the intersection, e.g. stop lines, turn arrows and signs, are stored with the appropriate coordinates.
  • From an initial position, which is determined from GPS data or from a previous calculation step, an artificial view and thus an artificial image of the scene is generated.
  • Image features are then extracted from the artificial image. These can be, for example, edges, i.e. lines with strong gray-value gradients, corners, closed contours or, generally, geometric objects.
  • The same image features are extracted from a video image of the real scene captured by the at least one camera 4.
  • The image features found in the artificial image are associated with those from the real scene in the processing module 12, so that a so-called matching takes place.
  • To speed it up, the search for correspondences can, for example, be performed at different resolution levels of the images, starting at a low resolution.
  • The world position of the image features in the artificial scene is known, since the models originate from the georeferenced 3D database.
  • The exterior orientation of the camera 4, and thus the position and orientation of the vehicle, is determined by spatial resection as known from geodesy, provided the points do not lie on a straight line and the interior orientation of the camera 4, i.e. the principal point and the camera constant, is known.
  • In this way, the 3D coordinates and the rotation angles of the camera 4 with respect to the three coordinate axes are determined according to known calculation rules. From the known installation position of the camera 4 relative to the vehicle, the position and orientation of the vehicle are thus derived. If more than the minimum number of three points is found, the resulting overdetermined system can be solved by an adjustment, which further increases the accuracy.
  • From the steering angle and the turning rate of the vehicle, which are detected via the sensors 18, 20, 22, the route and thus a path is determined along which the vehicle continues to move.
  • The location data obtained in this way again serve as an initial value for the calculation of the artificial view of the scene.
  • With this localization, which is improved compared to pure GPS data, the position relative to the intersection is now determined.
  • From the model of the intersection stored in the 3D database 6 and the additional 2D data from the 2D database 28, which comprise, among other things, representations of different road types, the navigation unit 26 determines a precise course of the route recommended for travel, with the necessary lane changes.
  • The driving recommendation can now be superimposed on the video image on the display device 30 in correct perspective and in a contact-analog manner.
  • An embodiment of such a video image 32 is shown in Figure 2.
  • A road scene is shown, on which a driving recommendation 34, a so-called driving tube, is superimposed in color in a contact-analog manner.
  • Via the display device 30 located in the vehicle, this driving recommendation 34 graphically shows the driver of the vehicle along which route he can travel to reach a desired destination.
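The association ("matching") of image features extracted from the artificial view with features from the real video image can be sketched as a mutual nearest-neighbour search over feature descriptors. This is a minimal illustration under assumed conventions (feature = position plus descriptor tuple, Euclidean descriptor distance, a hypothetical `max_dist` threshold), not the patent's concrete implementation.

```python
# Each feature is (x, y, descriptor); descriptors are plain float tuples.

def dist(d1, d2):
    """Euclidean distance between two descriptor vectors."""
    return sum((a - b) ** 2 for a, b in zip(d1, d2)) ** 0.5

def match_features(artificial, video, max_dist=1.0):
    """Mutual-nearest-neighbour matching: a pair is accepted only if each
    feature is the other's closest match and the distance is small enough."""
    matches = []
    for i, (_, _, da) in enumerate(artificial):
        # nearest video feature for artificial feature i
        j = min(range(len(video)), key=lambda k: dist(da, video[k][2]))
        # mutuality check: is i also the best artificial match for j?
        i_back = min(range(len(artificial)),
                     key=lambda k: dist(artificial[k][2], video[j][2]))
        if i_back == i and dist(da, video[j][2]) <= max_dist:
            matches.append((i, j))
    return matches

art = [(10, 20, (1.0, 0.0)), (30, 40, (0.0, 1.0))]
vid = [(12, 21, (0.9, 0.1)), (31, 39, (0.1, 0.9)), (5, 5, (5.0, 5.0))]
print(match_features(art, vid))  # the outlier feature in vid stays unmatched
```

The mutuality check is one common way to suppress the ambiguous correspondences the description warns about; a production system would add geometric consistency checks on top.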
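The coarse-to-fine correspondence search described above (different resolution levels, starting at a low resolution) can be sketched as follows. A 1-D signal stands in for an image row to keep the example short; all names are illustrative, and real images apply the same idea per axis.

```python
def downsample(signal):
    """Halve the resolution by averaging neighbouring samples."""
    return [(signal[i] + signal[i + 1]) / 2.0
            for i in range(0, len(signal) - 1, 2)]

def ssd(a, b):
    """Sum of squared differences between two equal-length windows."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

def search(signal, template, lo, hi):
    """Best template position in `signal`, restricted to [lo, hi]."""
    hi = min(hi, len(signal) - len(template))
    return min(range(lo, hi + 1),
               key=lambda p: ssd(signal[p:p + len(template)], template))

def coarse_to_fine(signal, template):
    # coarse level: exhaustive search at half resolution
    pos = search(downsample(signal), downsample(template),
                 0, len(signal) // 2)
    # fine level: refine only in a small window around twice the coarse hit
    return search(signal, template, max(0, 2 * pos - 2), 2 * pos + 2)
```

The speed-up comes from the fine level scanning only a few candidate positions instead of the whole signal, exactly the motivation given in the description.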
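The spatial resection step can be illustrated with a pinhole model: given at least three georeferenced points, their image measurements, and the interior orientation (camera constant `f`, principal point `cx`, `cy`), the camera pose is the one minimising the reprojection error. To keep this sketch short the orientation is held fixed and only the position is searched on a coarse grid; a real implementation would estimate all six pose parameters, e.g. by a least-squares adjustment. All names and values here are our own illustration.

```python
def project(point, cam, f=100.0, cx=0.0, cy=0.0):
    """Pinhole projection of a world point for an axis-aligned camera at `cam`."""
    x, y, z = (p - c for p, c in zip(point, cam))
    return (f * x / z + cx, f * y / z + cy)

def reprojection_error(cam, points, measurements):
    """Sum of squared image-plane residuals over all point correspondences."""
    return sum((u - mu) ** 2 + (v - mv) ** 2
               for (u, v), (mu, mv) in
               ((project(p, cam), m) for p, m in zip(points, measurements)))

def resect(points, measurements, search=range(-3, 4)):
    """Brute-force search over a coarse grid of candidate camera positions,
    keeping the camera in front of all points."""
    candidates = [(x, y, z) for x in search for y in search for z in search
                  if z < min(p[2] for p in points)]
    return min(candidates,
               key=lambda c: reprojection_error(c, points, measurements))

points = [(0, 0, 5), (2, 1, 6), (-1, 2, 7)]      # georeferenced 3D points
cam_true = (1, 0, -2)
measurements = [project(p, cam_true) for p in points]
print(resect(points, measurements))               # recovers the true position
```

With three non-collinear points the position is generically unique, which mirrors the condition in the text that the points must not lie on a straight line.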
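Between vision or GPS fixes, the path along which the vehicle continues to move is propagated from the steering-angle and turning-rate sensors (dead reckoning). A minimal kinematic sketch, with illustrative parameter names:

```python
import math

def dead_reckon(pose, speed, yaw_rate, dt):
    """Advance a (x, y, heading) pose by one time step from speed and yaw rate."""
    x, y, heading = pose
    heading += yaw_rate * dt              # integrate the turning rate
    x += speed * dt * math.cos(heading)   # advance along the new heading
    y += speed * dt * math.sin(heading)
    return (x, y, heading)

pose = (0.0, 0.0, 0.0)
pose = dead_reckon(pose, 10.0, 0.0, 1.0)  # 10 m straight ahead
```

Because each step integrates sensor increments, errors accumulate when no GPS or vision fix is available over longer stretches, which is exactly the failure mode the method is designed to counter.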
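The parameterised intersection model (number, widths and angles of the intersecting roads, a georeferenced reference point, and the lane count per road) could be represented like this; the field names are our own illustration, not the patent's data schema.

```python
from dataclasses import dataclass, field

@dataclass
class Road:
    angle_deg: float   # direction of the road arm at the node
    width_m: float     # road width
    lanes: int         # number of lanes in this arm

@dataclass
class Intersection:
    reference_point: tuple          # georeferenced (lat, lon) of the node
    roads: list = field(default_factory=list)

    @property
    def arm_count(self):
        return len(self.roads)

# a hypothetical four-arm crossing
crossing = Intersection((48.77, 9.18),
                        [Road(0, 7.0, 2), Road(90, 6.5, 2),
                         Road(180, 7.0, 2), Road(270, 6.5, 2)])
```

Texture elements such as stop lines or turn arrows would be stored alongside with their own coordinates, as the description notes.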

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Automation & Control Theory (AREA)
  • Navigation (AREA)
  • Instructional Devices (AREA)

Abstract

The invention relates to a method for locating an object, in which first image features are extracted from a three-dimensional navigation and second image features are extracted from video data, and in which these image features are processed jointly.
PCT/EP2007/062204 2006-11-17 2007-11-12 Procédé de localisation d'un objet WO2008058928A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102006054323.8 2006-11-17
DE200610054323 DE102006054323A1 (de) 2006-11-17 2006-11-17 Verfahren zum Orten eines Objekts

Publications (1)

Publication Number Publication Date
WO2008058928A1 true WO2008058928A1 (fr) 2008-05-22

Family

ID=38928039

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2007/062204 WO2008058928A1 (fr) 2006-11-17 2007-11-12 Procédé de localisation d'un objet

Country Status (2)

Country Link
DE (1) DE102006054323A1 (fr)
WO (1) WO2008058928A1 (fr)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102011006347B4 (de) 2011-03-29 2023-02-09 Bayerische Motoren Werke Aktiengesellschaft Verfahren zur Ausgabe von grafischen Fahrhinweisen
US9443429B2 (en) 2012-01-24 2016-09-13 GM Global Technology Operations LLC Optimum gaze location on full windscreen display

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1283406A2 (fr) * 2001-08-01 2003-02-12 Siemens Aktiengesellschaft Dispositif et méthode de traitement d'images pour un véhicule
WO2003017226A2 (fr) * 2001-08-07 2003-02-27 Siemens Aktiengesellschaft Procede et dispositif pour representer des indications de conduite dans des systemes de navigation pour vehicules automobiles
EP1586861A1 (fr) * 2004-04-15 2005-10-19 Robert Bosch Gmbh Procédé et appareil pour afficher des informations pour le conducteur
WO2006037402A1 (fr) * 2004-10-01 2006-04-13 Daimlerchrysler Ag Systeme d'assistance a la conduite destine a afficher la trajectoire ulterieure de la route sur un affichage de vehicule, en position correcte par rapport au champ de vision du conducteur d'un vehicule automobile


Also Published As

Publication number Publication date
DE102006054323A1 (de) 2008-05-21

Similar Documents

Publication Publication Date Title
DE102010042063B4 (de) Verfahren und Vorrichtung zum Bestimmen von aufbereiteten Bilddaten über ein Umfeld eines Fahrzeugs
DE112008003424B4 (de) Navigationsgerät, das Videobilder einer Kamera verwendet
DE112019007451T5 (de) Hochpräzises Verfahren sowie Gerät zur Fahrzeugpositionierung in mehreren Szenen, und fahrzeugmontiertes Terminal
EP3765324B1 (fr) Procédé, dispositif et support d'enregistrement lisible par ordinateur doté d'instructions pour la commande d'un affichage d'un dispositif d'affichage à réalité augmentée pour un véhicule automobile
DE112017003916T5 (de) Head-Up-Display-Vorrichtung, Anzeigesteuerverfahren und Steuerprogramm
DE102011084993A1 (de) Übernahme von Daten aus bilddatenbasierenden Kartendiensten in ein Assistenzsystem
WO2013053438A2 (fr) Procédé d'intégration d'objets virtuels dans des affichages de véhicule
DE202005021607U1 (de) Navigationsvorrichtung mit Kamerainformation
DE102012200731A1 (de) Verfahren und Vorrichtung zum Visualisieren der Umgebung eines Fahrzeugs
DE102014200407A1 (de) Verfahren und Vorrichtung zum Betreiben eines Sichtfeldanzeigegeräts
DE112018004519T5 (de) Anzeigevorrichtung eines überlagerten Bildes und Computerprogramm
DE102021204765A1 (de) Rendern von Augmented-Reality mit Verdeckung
DE102010007091A1 (de) Verfahren zur Positionsermittlung für ein Kraftfahrzeug
WO2019042778A1 (fr) Lunettes à réalité augmentée, procédé de détermination de pose de lunettes à réalité augmentée, véhicule automobile adapté à l'utilisation d'un à réalité augmentéee ou du procédé
WO2021110412A1 (fr) Procédé d'affichage destiné à afficher un modèle d'environnement d'un véhicule, programme informatique, dispositif de commande et véhicule
WO2019154673A1 (fr) Procédé, dispositif et support d'enregistrement lisible par ordinateur doté d'instructions pour la commande d'un affichage d'un dispositif d'affichage tête haute à réalité augmentée d'un véhicule à moteur
DE102010062464A1 (de) Verfahren und Vorrichtung zur Darstellung eines Abschnitts einer Umgebung
DE10137582A1 (de) Bildverarbeitungsvorrichtung für ein Fahrzeug
WO2008058928A1 (fr) Procédé de localisation d'un objet
DE102011084596A1 (de) Verfahren zum Assistieren eines Fahrers in einer fremden Umgebung
EP2145157A1 (fr) Procédé d'affichage d'images vidéo, et système vidéo correspondant
DE102010042314A1 (de) Verfahren zur Ortsbestimmung mit einem Navigationssystem und Navigationssystem hierzu
DE102019201134B4 (de) Verfahren, Computerprogramm mit Instruktionen und System zum Einmessen einer Augmented-Reality-Brille und Augmented-Reality-Brille zur Verwendung in einem Kraftfahrzeug
DE102021123503A1 (de) Ermittlung einer absoluten Initialposition eines Fahrzeugs
WO2020260134A1 (fr) Procédé de localisation d'un véhicule

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 07822490

Country of ref document: EP

Kind code of ref document: A1

122 Ep: pct application non-entry in european phase

Ref document number: 07822490

Country of ref document: EP

Kind code of ref document: A1