EP2145157A1 - Procédé d'affichage d'images vidéo, et système vidéo correspondant - Google Patents
- Publication number
- EP2145157A1 (application EP08718162A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- data
- objects
- video
- transport
- determined
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Links
- method: title, claims, abstract, description (23)
- analysis method: claims, description (7)
- processing: claims, description (5)
- detection method: claims, description (3)
- waters: claims, description (3)
- transmission: description (1)
- correction: description (1)
- data collection: description (1)
- development: description (1)
- function: description (1)
- image analysis: description (1)
- interactive effect: description (1)
- night vision: description (1)
- seasonal effect: description (1)
- visual perception: description (1)
- visualization: description (1)
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/34—Route searching; Route guidance
- G01C21/36—Input/output arrangements for on-board computers
- G01C21/3626—Details of the output of route guidance instructions
- G01C21/3647—Guidance involving output of stored or live camera images or video streams
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/34—Route searching; Route guidance
- G01C21/36—Input/output arrangements for on-board computers
- G01C21/3602—Input other than that of destination using image analysis, e.g. detection of road signs, lanes, buildings, real preceding vehicles using a camera
Definitions
- the present invention relates to a method for displaying video images in a means of transport and to a video system for carrying out the method.
- the display has become more and more accurate and realistic in recent years, made possible by the increasing performance of the microprocessors used for the graphics calculations in such systems.
- navigation systems for motor vehicles are already known which offer three-dimensional or perspective representations of inner cities. For example, this already allows a calculated route to be driven along virtually during trip preparation, in order to prevent possible confusion of the driver in difficult driving situations.
- the graphical representation for such a virtual drive along the route must convey a realistic impression.
- in aircraft, multimedia systems typically show the passengers the current aircraft position and other information about the flight, such as the altitude, the flight speed, the temperature and/or the elapsed or remaining flight time.
- the aircraft position is often visualized not only by an indication of location coordinates, but by placing an aircraft symbol on an underlying map of the flight area.
- when visualizing in the multimedia systems of the aircraft, the passengers expect map presentations whose elements can be recognized easily and quickly when looking out of the window. More accurate and realistic map representations are therefore required.
- the following method steps are provided: acquisition of image data by a video camera arranged in the means of transport, determination of current position data of the means of transport, determination of object data from a database, and linking of the acquired image data, the determined position data and the determined object data such that display data are generated in which additional information from the database is inserted into the video display (a minimal sketch of this processing cycle is given after this list).
- from the database, object data are determined for objects that are shown in the corresponding video image.
- a suitable combination of the object data then enables the generation of the display data.
- the essence of the invention is thus not, as before, to improve the graphical representation of a virtual image with respect to resolution and realism and to insert additional information into that virtual image, but to integrate the additional, graphically generated information into a real video image.
- the displayed video image is thus a combination of real images with graphic image components.
- the method is performed in real time. This allows the viewer to receive, on the video display, additional information about the objects he or she currently sees when looking out of the means of transport.
- the object data are determined for objects which are located in a detection space of the video camera, i.e. in the space in front of the video camera that it can observe.
- the detection space can also be called the "View Frustum" (a simple containment check for this detection space is sketched after this list).
- a graphical analysis of the image data is performed to recognize imaged objects.
- using graphical analysis techniques such as edge analysis, objects present in the video image can be identified (an edge-based detection sketch follows this list).
- the determined objects and the detected objects are compared with each other and, depending on the comparison result, the determined objects and the detected objects are assigned to each other. Since in each case a plurality of objects can come into question for an assignment, it is preferably provided that the comparison is carried out on the basis of position data of the determined and the detected objects.
- the shape, size and/or color of the determined and recognized objects are taken into account in the comparison.
- the possible occlusion by other objects can additionally be taken into account (a matching sketch follows this list).
- the currently determined position data of the means of transport can also be corrected, since a higher accuracy can be achieved by comparing the geographical data from the database with the position data of the sensors.
- this makes it possible to recalculate the position data, in particular if they were originally determined by a satellite-based location system, for example GPS and/or Galileo, which usually has a certain degree of inaccuracy.
- the video system 1 has at least one video camera 2, which is arranged in a means of transport (not shown) and detects an image outside the means of transport.
- the camera 2 is either permanently installed, i.e. mounted immovably, so that the area covered by the camera, i.e. its view frustum, is fixed with respect to the orientation of the means of transport, or it is movably mounted.
- in the latter case, additional sensors may be provided on the video camera 2 which detect the viewing direction of the video camera 2 with respect to the orientation of the means of transport.
- the current position data with information about the location and the orientation of the means of transport are determined by means of position determination sensors 3.
- position determination sensors 3 may be sensors for a satellite-based location system, for example GPS (Global Positioning System) and / or the European Galileo system.
- speed sensors, distance (odometry) sensors and/or curve sensors can also be provided, with which the current geographical position and the orientation of the means of transport are determined (a dead-reckoning sketch follows this list).
- the video system 1 has a memory 4 in which a database with object data is stored; the object data contain geographical and topographical as well as any further suitable information.
- the image data acquired with the video camera 2, the position data determined by the position sensors 3 and the object data retrievable from the memory 4 are made available to a data processing unit 5, for example a suitable microprocessor.
- the data transmission from the video camera 2, the position sensors 3 and the memory 4 to the data processing unit 5 is indicated in the figure by corresponding arrows.
- the actual data processing is carried out within the data processing unit 5 by a plurality of modules.
- in module 6, the current position and orientation of the means of transport are first calculated on the basis of the position data received from the position sensors 3.
- in module 7, the video image of the video camera 2 is evaluated. This can be done by graphical analysis methods, such as edge analysis, with which objects present in the video image are identified. With the aid of the previously calculated position and orientation of the means of transport, as well as knowledge of the orientation and viewing direction of the video camera, a position of the identified objects is determined.
- this position determination can only be carried out with a certain accuracy.
- the direction of a recognized object relative to the means of transport, however, can be determined with sufficient accuracy (a pixel-to-bearing sketch follows this list).
- initially only a rough approximation of the object position is required.
- in module 8, the objects detected in the video image are compared with objects for which corresponding object data are stored in the database.
- the starting point for this comparison is the object position determined in module 7, or the indication of the direction in which the object lies relative to the means of transport.
- in the database, those objects are searched for and determined which lie in the vicinity of the determined position data and thus come into question as candidates for an assignment (a candidate-lookup sketch follows this list).
- furthermore, a correction of the position determined by means of the sensor data can be carried out.
- the direction in which detected objects are located in front of the means of transport can be determined; from these direction indications and the exact positions of the objects, which can be read out of the database stored in the memory 4, the current position of the means of transport can be deduced (a triangulation sketch follows this list).
- the display data are then calculated, i.e. the additional information from the database is inserted into the video image provided by the video camera 2 (an overlay sketch follows this list).
- the assignment of objects in the video image to objects in the database, which took place in the previous step in module 8, plays a decisive role here.
- these objects, such as roads, bodies of water, localities, buildings and/or points of interest, may be labeled with their names or designations. It is also possible to display further additional information.
- even objects not recognized in the video image can be labeled, because their position relative to identified objects follows from the geographic database. This can be useful, for example, when objects are partially obscured by clouds and/or fog.
- direction indicators can be drawn into the video image on assigned roads. Finally, it is also possible to trace edges of objects that are not or only poorly recognizable.
- the multimedia system generally includes a plurality of such monitors 10; in applications with only one monitor 10, this may for example be a TFT screen. If input devices are present, for example a touch screen, interactive output is possible: the user can thus retrieve additional information by selecting a displayed object and have it shown on the monitor (a touch hit-test sketch follows this list).
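As an illustration of the sequence of steps described above (image acquisition, position determination, retrieval of object data from the database and linking into display data), the following is a minimal Python sketch of one real-time processing cycle. All objects and method names used here (`camera.capture()`, `sensors.current_pose()`, `database.objects_near()`, `renderer.overlay()`) are hypothetical stand-ins for the modules of the video system, not an implementation taken from the patent.

```python
from dataclasses import dataclass


@dataclass
class Pose:
    """Current position and orientation of the means of transport."""
    lat: float          # degrees
    lon: float          # degrees
    heading_deg: float  # 0 deg = north, clockwise positive


def process_frame(camera, sensors, database, renderer):
    """One processing cycle: acquire image data, determine the current pose,
    fetch object data from the database and link everything into display data."""
    frame = camera.capture()                 # image data from the on-board camera
    pose = sensors.current_pose()            # position + orientation (Pose)
    objects = database.objects_near(pose.lat, pose.lon, radius_m=2000.0)
    # Linking step: insert additional information from the database
    # (names, designations, direction indicators) into the real video image.
    return renderer.overlay(frame, pose, objects)
```

Run once per video frame, such a loop would realize the real-time behaviour described above.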
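The detection space ("View Frustum") can be approximated, for the purpose of selecting candidate objects, by a horizontal field-of-view and range test. A minimal sketch assuming a flat-earth approximation; the field-of-view and range values are arbitrary example parameters.

```python
import math


def in_view_frustum(obj_lat, obj_lon, cam_lat, cam_lon, cam_heading_deg,
                    fov_deg=60.0, max_range_m=5000.0):
    """Rough check whether an object lies in the camera's detection space:
    within the horizontal field of view and within a maximum range."""
    # local east/north offsets in metres (flat-earth approximation)
    dy = (obj_lat - cam_lat) * 111_320.0
    dx = (obj_lon - cam_lon) * 111_320.0 * math.cos(math.radians(cam_lat))
    if math.hypot(dx, dy) > max_range_m:
        return False
    bearing = math.degrees(math.atan2(dx, dy)) % 360.0          # 0 deg = north
    rel = (bearing - cam_heading_deg + 180.0) % 360.0 - 180.0   # signed offset
    return abs(rel) <= fov_deg / 2.0
```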
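Edge analysis is named above as one possible graphical analysis technique. The following sketch shows one way such a step could look with OpenCV 4.x (Canny edges, external contours, bounding boxes); the thresholds and the minimum contour area are arbitrary example values, not values from the patent.

```python
import cv2


def detect_objects(frame_bgr, min_area=500.0):
    """Edge-based segmentation: Canny edges -> external contours ->
    bounding boxes of candidate objects as (x, y, w, h) in pixels."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    blurred = cv2.GaussianBlur(gray, (5, 5), 0)
    edges = cv2.Canny(blurred, 50, 150)
    contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    return [cv2.boundingRect(c) for c in contours
            if cv2.contourArea(c) >= min_area]
```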
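The comparison and assignment of determined (database) objects and detected (image) objects could, for example, be a simple score-based greedy matching on bearing and apparent size; shape, color and occlusion, which the description also mentions, are left out here for brevity. A sketch under these assumptions, with hypothetical field names:

```python
def match_objects(detected, determined, max_bearing_diff_deg=5.0):
    """Greedily assign image detections to database objects.

    detected:   list of dicts {"bearing_deg": ..., "apparent_size": ...}
    determined: list of dicts {"bearing_deg": ..., "expected_size": ..., "name": ...}
    Returns a mapping {index in detected: index in determined}.
    """
    assignments, used = {}, set()
    for i, det in enumerate(detected):
        best, best_score = None, float("inf")
        for j, db in enumerate(determined):
            if j in used:
                continue
            d_bearing = abs(det["bearing_deg"] - db["bearing_deg"])
            if d_bearing > max_bearing_diff_deg:
                continue                  # position is the primary criterion
            score = d_bearing + 0.1 * abs(det["apparent_size"] - db["expected_size"])
            if score < best_score:
                best, best_score = j, score
        if best is not None:
            assignments[i] = best
            used.add(best)
    return assignments
```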
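The speed, distance and curve sensors mentioned above suggest a dead-reckoning update between (or alongside) satellite fixes. A minimal sketch, assuming a flat-earth approximation and a turn-rate sensor delivering degrees per second; this is an illustration, not the patent's own algorithm.

```python
import math


def dead_reckon(lat, lon, heading_deg, speed_mps, yaw_rate_dps, dt_s):
    """Advance the last known position and orientation by one time step
    using the speed and turn rate measured by the on-board sensors."""
    heading_deg = (heading_deg + yaw_rate_dps * dt_s) % 360.0
    dist = speed_mps * dt_s
    dlat = dist * math.cos(math.radians(heading_deg)) / 111_320.0
    dlon = dist * math.sin(math.radians(heading_deg)) / (
        111_320.0 * math.cos(math.radians(lat)))
    return lat + dlat, lon + dlon, heading_deg
```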
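The direction of a recognized object relative to the means of transport can be derived from its horizontal pixel position, the camera's field of view and the vehicle and camera orientation. A sketch assuming a simple pinhole camera model; the field of view and the mounting (viewing-direction) offset are illustrative parameters.

```python
import math


def pixel_to_bearing(u_px, image_width_px, vehicle_heading_deg,
                     cam_yaw_offset_deg=0.0, hfov_deg=60.0):
    """Absolute bearing of an object seen at horizontal pixel coordinate u_px:
    vehicle heading + camera mounting/viewing offset + angular offset of the
    pixel from the optical axis (pinhole model)."""
    f_px = (image_width_px / 2.0) / math.tan(math.radians(hfov_deg / 2.0))
    offset_deg = math.degrees(math.atan2(u_px - image_width_px / 2.0, f_px))
    return (vehicle_heading_deg + cam_yaw_offset_deg + offset_deg) % 360.0
```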
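The search for database objects in the vicinity of the determined position could be implemented as follows; the SQLite table `objects(name, lat, lon)` is a hypothetical stand-in for the object database held in memory 4.

```python
import math
import sqlite3


def candidates_near(db_path, lat, lon, radius_m=2000.0):
    """Return (name, lat, lon) of objects near the given position: a coarse
    bounding-box query in SQL followed by an exact distance filter."""
    dlat = radius_m / 111_320.0
    dlon = radius_m / (111_320.0 * math.cos(math.radians(lat)))
    conn = sqlite3.connect(db_path)
    rows = conn.execute(
        "SELECT name, lat, lon FROM objects "
        "WHERE lat BETWEEN ? AND ? AND lon BETWEEN ? AND ?",
        (lat - dlat, lat + dlat, lon - dlon, lon + dlon)).fetchall()
    conn.close()

    def dist_m(olat, olon):
        dy = (olat - lat) * 111_320.0
        dx = (olon - lon) * 111_320.0 * math.cos(math.radians(lat))
        return math.hypot(dx, dy)

    return [(n, olat, olon) for n, olat, olon in rows
            if dist_m(olat, olon) <= radius_m]
```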
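The correction of the sensor-based position from the directions to objects whose exact positions are known from the database can be expressed as a least-squares intersection of bearing lines. The following sketch shows that idea in a local east/north metre frame; it requires at least two non-parallel bearings and is only one possible way to realize the correction step.

```python
import math


def correct_position(observations, ref_lat, ref_lon):
    """Least-squares vehicle position from bearings to known objects.

    observations: list of (obj_lat, obj_lon, bearing_deg), bearings measured
    from the vehicle towards the object (0 deg = north, clockwise positive).
    """
    def to_xy(lat, lon):
        x = (lon - ref_lon) * 111_320.0 * math.cos(math.radians(ref_lat))
        y = (lat - ref_lat) * 111_320.0
        return x, y

    a11 = a12 = a22 = b1 = b2 = 0.0
    for obj_lat, obj_lon, bearing_deg in observations:
        ox, oy = to_xy(obj_lat, obj_lon)
        th = math.radians(bearing_deg)
        nx, ny = math.cos(th), -math.sin(th)   # normal to the bearing line
        a11 += nx * nx; a12 += nx * ny; a22 += ny * ny
        b1 += nx * (nx * ox + ny * oy)
        b2 += ny * (nx * ox + ny * oy)
    det = a11 * a22 - a12 * a12                # 0 if all bearings are parallel
    x = (a22 * b1 - a12 * b2) / det
    y = (a11 * b2 - a12 * b1) / det
    lat = ref_lat + y / 111_320.0
    lon = ref_lon + x / (111_320.0 * math.cos(math.radians(ref_lat)))
    return lat, lon
```

In practice such a fix could be blended with the GPS/Galileo estimate, for example in a filter, rather than replacing it outright.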
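Inserting names from the database into the real video image amounts to mapping each assigned object back into image coordinates and drawing its label there. A simplified OpenCV sketch; the vertical placement is fixed to one image row instead of being derived from object height, which a real system would refine.

```python
import math
import cv2


def overlay_labels(frame_bgr, pose, labeled_objects, hfov_deg=60.0):
    """Draw object names into the video frame.

    pose: (lat, lon, heading_deg); labeled_objects: list of (name, lat, lon).
    """
    h, w = frame_bgr.shape[:2]
    f_px = (w / 2.0) / math.tan(math.radians(hfov_deg / 2.0))
    lat, lon, heading = pose
    for name, olat, olon in labeled_objects:
        dy = (olat - lat) * 111_320.0
        dx = (olon - lon) * 111_320.0 * math.cos(math.radians(lat))
        bearing = math.degrees(math.atan2(dx, dy))
        rel = (bearing - heading + 180.0) % 360.0 - 180.0
        if abs(rel) > hfov_deg / 2.0:
            continue                             # outside the view frustum
        u = int(w / 2.0 + f_px * math.tan(math.radians(rel)))
        cv2.putText(frame_bgr, name, (u, h // 3),
                    cv2.FONT_HERSHEY_SIMPLEX, 0.6, (255, 255, 255), 2)
        cv2.line(frame_bgr, (u, h // 3 + 5), (u, h // 2), (255, 255, 255), 1)
    return frame_bgr
```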
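The interactive retrieval via a touch screen can be reduced to a hit test that maps a touch coordinate to the nearest displayed label; the label rectangles are assumed to be recorded when the overlay is drawn. A minimal sketch:

```python
def hit_test(touch_x, touch_y, label_boxes, max_dist_px=40.0):
    """Return the name of the label closest to the touch point, or None.

    label_boxes: list of (name, x, y, w, h) rectangles as drawn on screen.
    """
    best_name, best_dist = None, max_dist_px
    for name, x, y, w, h in label_boxes:
        cx, cy = x + w / 2.0, y + h / 2.0
        dist = ((touch_x - cx) ** 2 + (touch_y - cy) ** 2) ** 0.5
        if dist < best_dist:
            best_name, best_dist = name, dist
    return best_name
```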
Landscapes
- Engineering & Computer Science (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Automation & Control Theory (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Multimedia (AREA)
- Instructional Devices (AREA)
- Navigation (AREA)
Abstract
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE102007022588A DE102007022588A1 (de) | 2007-05-14 | 2007-05-14 | Verfahren zur Anzeige von Videobildern und Videosystemen |
PCT/EP2008/053471 WO2008138670A1 (fr) | 2007-05-14 | 2008-03-25 | Procédé d'affichage d'images vidéo, et système vidéo correspondant |
Publications (1)
Publication Number | Publication Date |
---|---|
EP2145157A1 (fr) | 2010-01-20 |
Family
ID=39434145
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP08718162A Ceased EP2145157A1 (fr) | 2007-05-14 | 2008-03-25 | Procédé d'affichage d'images vidéo, et système vidéo correspondant |
Country Status (3)
Country | Link |
---|---|
EP (1) | EP2145157A1 (fr) |
DE (1) | DE102007022588A1 (fr) |
WO (1) | WO2008138670A1 (fr) |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8994851B2 (en) | 2007-08-07 | 2015-03-31 | Qualcomm Incorporated | Displaying image data and geographic element data |
US9329052B2 (en) | 2007-08-07 | 2016-05-03 | Qualcomm Incorporated | Displaying image data and geographic element data |
DE102011084596A1 (de) | 2011-10-17 | 2013-04-18 | Robert Bosch Gmbh | Verfahren zum Assistieren eines Fahrers in einer fremden Umgebung |
DE102015226178A1 (de) | 2015-12-21 | 2017-06-22 | Bayerische Motoren Werke Aktiengesellschaft | Verfahren und Mobilgerät zur Anzeige einer geografischen Bereichsdarstellung |
Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2005124594A1 (fr) * | 2004-06-16 | 2005-12-29 | Koninklijke Philips Electronics, N.V. | Etiquetage automatique en temps reel de points superposes et d'objets d'interet dans une image visualisee |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6222583B1 (en) * | 1997-03-27 | 2001-04-24 | Nippon Telegraph And Telephone Corporation | Device and system for labeling sight images |
US6208353B1 (en) * | 1997-09-05 | 2001-03-27 | ECOLE POLYTECHNIQUE FEDéRALE DE LAUSANNE | Automated cartographic annotation of digital images |
JP4273119B2 (ja) * | 2003-10-21 | 2009-06-03 | 和郎 岩根 | ナビゲーション装置 |
US20060195858A1 (en) * | 2004-04-15 | 2006-08-31 | Yusuke Takahashi | Video object recognition device and recognition method, video annotation giving device and giving method, and program |
- 2007
- 2007-05-14 DE DE102007022588A patent/DE102007022588A1/de not_active Withdrawn
- 2008
- 2008-03-25 WO PCT/EP2008/053471 patent/WO2008138670A1/fr active Application Filing
- 2008-03-25 EP EP08718162A patent/EP2145157A1/fr not_active Ceased
Patent Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2005124594A1 (fr) * | 2004-06-16 | 2005-12-29 | Koninklijke Philips Electronics, N.V. | Etiquetage automatique en temps reel de points superposes et d'objets d'interet dans une image visualisee |
Also Published As
Publication number | Publication date |
---|---|
DE102007022588A1 (de) | 2008-11-27 |
WO2008138670A1 (fr) | 2008-11-20 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
EP2769373B1 (fr) | Transfert de données de services cartographiques à base de données d'images dans un système d'aide à la conduite | |
DE102016117659B4 (de) | Fahrunterstützungseinrichtung | |
DE112008003424B4 (de) | Navigationsgerät, das Videobilder einer Kamera verwendet | |
DE102011118161B3 (de) | Verfahren zur Positionsbestimmung | |
DE10138719A1 (de) | Verfahren und Vorrichtung zur Darstellung von Fahrhinweisen, insbesondere in Auto-Navigationssystemen | |
DE202005021607U1 (de) | Navigationsvorrichtung mit Kamerainformation | |
DE69815940T2 (de) | Verfahren und Anordnung zur Informationsdarstellung in Form einer Landkarte für Fahrzeugsnavigationsgeräte | |
EP1519152A1 (fr) | Dispositif et procédé pour représenter des instructions de conduite | |
WO2009149960A1 (fr) | Procédé de sortie combinée d'une image et d'une information locale, et véhicule à moteur associé | |
DE102010007091A1 (de) | Verfahren zur Positionsermittlung für ein Kraftfahrzeug | |
WO2019007605A1 (fr) | Procédé pour faire vérifier la carte numérique d'un véhicule très automatisé, dispositif correspondant et programme d'ordinateur | |
DE60121944T2 (de) | Verfahren und vorrichtung zum anzeigen von navigationsinformationen im echtzeitbetrieb | |
DE102010003851A1 (de) | Verfahren und Informationssystem zum Markieren eines Zielorts für ein Fahrzeug | |
WO2021110412A1 (fr) | Procédé d'affichage destiné à afficher un modèle d'environnement d'un véhicule, programme informatique, dispositif de commande et véhicule | |
EP2145157A1 (fr) | Procédé d'affichage d'images vidéo, et système vidéo correspondant | |
DE112016005798T5 (de) | Fahrzeuginternes system und verfahren zur bereitstellung von informationen in bezug auf punkte von interesse | |
DE102008043756B4 (de) | Verfahren und Steuergerät zum Bereitstellen einer Verkehrszeicheninformation | |
EP2813999A2 (fr) | Système à réalité augmentée et procédé de production et d'affichage de représentations d'objet à réalité augmentée pour un véhicule | |
EP1832848B1 (fr) | Procédé et dispositif destinés à l'affichage d'une section de carte numérique | |
WO2013056954A1 (fr) | Procédé pour assister un conducteur dans un environnement non familier | |
DE102017215868A1 (de) | Verfahren und Vorrichtung zum Erstellen einer Karte | |
DE102018121274B4 (de) | Verfahren zum Visualisieren einer Fahrabsicht, Computerprogrammprodukt und Visualisierungssystem | |
DE102010042314A1 (de) | Verfahren zur Ortsbestimmung mit einem Navigationssystem und Navigationssystem hierzu | |
DE602004009975T2 (de) | Fahrerassistenzsystem für Kraftfahrzeuge | |
DE102021116403A1 (de) | Einrichtung für ein fahrzeug zur anzeige von parkplatzinformationen sowie fahrzeug und system mit einer solchen einrichtung |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| PUAI | Public reference made under article 153(3) EPC to a published international application that has entered the European phase | Free format text: ORIGINAL CODE: 0009012 |
| 17P | Request for examination filed | Effective date: 20091214 |
| AK | Designated contracting states | Kind code of ref document: A1. Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MT NL NO PL PT RO SE SI SK TR |
| AX | Request for extension of the european patent | Extension state: AL BA MK RS |
| 17Q | First examination report despatched | Effective date: 20100211 |
| DAX | Request for extension of the european patent (deleted) | |
| REG | Reference to a national code | Ref country code: DE. Ref legal event code: R003 |
| STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: THE APPLICATION HAS BEEN REFUSED |
| 18R | Application refused | Effective date: 20120325 |