EP2800671A1 - Verfahren und Vorrichtung zur Fahrerinformation (Method and device for driver information) - Google Patents
Verfahren und Vorrichtung zur Fahrerinformation (Method and device for driver information)
- Publication number
- EP2800671A1 (application EP12787679.5A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- information
- motor vehicle
- image
- driver
- determining
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/204—Image signal generators using stereoscopic image cameras
- H04N13/239—Image signal generators using stereoscopic image cameras using two 2D image sensors having a relative position equal to or related to the interocular distance
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K35/00—Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
- B60K35/10—Input arrangements, i.e. from user to vehicle, associated with vehicle functions or specially adapted therefor
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T15/00—3D [Three Dimensional] image rendering
- G06T15/10—Geometric effects
- G06T15/20—Perspective computation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/006—Mixed reality
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/271—Image signal generators wherein the generated image signals comprise depth maps or disparity maps
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/282—Image signal generators for generating image signals corresponding to three or more geometrical viewpoints, e.g. multi-view systems
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/30—Image reproducers
- H04N13/332—Displays for viewing with the aid of special glasses or head-mounted displays [HMD]
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K2360/00—Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
- B60K2360/20—Optical features of instruments
- B60K2360/21—Optical features of instruments using cameras
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2200/00—Indexing scheme for image data processing or generation, in general
- G06T2200/04—Indexing scheme for image data processing or generation, in general involving 3D image data
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2200/00—Indexing scheme for image data processing or generation, in general
- G06T2200/21—Indexing scheme for image data processing or generation, in general involving computational photography
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10004—Still image; Photographic image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10016—Video; Image sequence
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10028—Range image; Depth image; 3D point clouds
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30248—Vehicle exterior or interior
- G06T2207/30252—Vehicle exterior; Vicinity of vehicle
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2219/00—Indexing scheme for manipulating 3D models or images for computer graphics
- G06T2219/004—Annotating, labelling
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2219/00—Indexing scheme for manipulating 3D models or images for computer graphics
- G06T2219/20—Indexing scheme for editing of 3D models
- G06T2219/2004—Aligning objects, relative positioning of parts
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2219/00—Indexing scheme for manipulating 3D models or images for computer graphics
- G06T2219/20—Indexing scheme for editing of 3D models
- G06T2219/2008—Assembling, disassembling
Definitions
- The invention relates to a method and a device for driver information.
- The invention relates to a method and an apparatus for outputting personalized information to a driver of a motor vehicle having the features of the independent claims.
- To output information to a driver of a motor vehicle, it is known to use optical, acoustic or haptic output devices. These devices are primarily used to output information about the driving state of the motor vehicle, for example an optical speed display (speedometer) or a haptic warning before leaving a lane. Since outputting such information carries the risk of distracting the driver from what is happening around the motor vehicle, such output must always be used sparingly. In certain cases, statutory requirements must also be observed; for example, in some countries operating a navigation system while driving is not permitted.
- An inventive method for driver information comprises the steps of capturing an image of an exterior of a motor vehicle, determining personalized information directed to a driver of the motor vehicle, determining a contiguous display area in the image, inserting the information into the image in the region of the display area, and outputting the image to the driver.
- The display area comprises a surface of an object depicted in the image.
- The method may perform a recognition of objects in the image and provide a surface of a detected object as the display area. This makes it easy to find a coherent display area.
- The surface of the object may carry little or no information relevant to guiding the motor vehicle, so that overlaying it with the personalized information represents virtually no information loss for the driver.
- Advertising information on the surface of the object may be overlaid by the personalized information.
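- Purely as an illustration of the idea above (not part of the original disclosure), the following Python sketch selects a display area from the output of an upstream object detector. The class labels, the confidence threshold and the minimum area are hypothetical assumptions.

```python
# Minimal sketch: choosing a contiguous display area from detected objects.
# Assumes an upstream detector has already produced (label, box, confidence) tuples.
from typing import List, Optional, Tuple

Box = Tuple[int, int, int, int]  # x, y, width, height in image pixels

# Hypothetical object classes whose visible surface carries little information
# relevant for guiding the vehicle (cf. the guardrail, truck rear and road examples).
LOW_RELEVANCE_CLASSES = {"guardrail", "truck_rear", "road_surface", "billboard"}

def select_display_area(detections: List[Tuple[str, Box, float]],
                        min_area_px: int = 5000) -> Optional[Box]:
    """Pick the largest suitable surface from detector output as the display area."""
    candidates = [box for label, box, conf in detections
                  if label in LOW_RELEVANCE_CLASSES
                  and conf > 0.5
                  and box[2] * box[3] >= min_area_px]
    # Prefer the largest coherent surface so the overlaid text stays legible.
    return max(candidates, key=lambda b: b[2] * b[3], default=None)

# Example: the rear of a truck wins over a small traffic sign.
print(select_display_area([("truck_rear", (400, 180, 220, 200), 0.9),
                           ("traffic_sign", (100, 90, 40, 40), 0.8)]))
```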
- The representation of the inserted information is aligned in perspective with a position and extent of the surface of the object with respect to the motor vehicle.
- The personalized information can thus be output in accordance with the perceptible objects of the environment, so that the driver's uptake of the information is facilitated.
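- The perspective alignment described above can be sketched, for example, with OpenCV: a flat text card is warped onto the four image corners of the detected surface. The corner ordering, the card size and the compositing rule are assumptions made for this illustration, not the patented implementation.

```python
# Minimal sketch of perspective alignment: render an "info card" and warp it
# onto a quadrilateral surface detected in the camera image.
import cv2
import numpy as np

def overlay_on_surface(frame: np.ndarray, surface_corners: np.ndarray,
                       message: str) -> np.ndarray:
    """Warp a rendered text card onto a quadrilateral surface in the camera image."""
    card = np.zeros((120, 360, 3), dtype=np.uint8)
    cv2.putText(card, message, (10, 70), cv2.FONT_HERSHEY_SIMPLEX,
                1.0, (255, 255, 255), 2, cv2.LINE_AA)

    h, w = card.shape[:2]
    src = np.float32([[0, 0], [w, 0], [w, h], [0, h]])
    # surface_corners: 4x2 array ordered top-left, top-right, bottom-right,
    # bottom-left, e.g. taken from the display-area detection step.
    H = cv2.getPerspectiveTransform(src, surface_corners.astype(np.float32))
    warped = cv2.warpPerspective(card, H, (frame.shape[1], frame.shape[0]))

    # Composite only where the warped card has content, leaving the rest of
    # the camera image (and thus the driver's view) untouched.
    mask = warped.sum(axis=2) > 0
    out = frame.copy()
    out[mask] = warped[mask]
    return out
```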
- The information can be obtained from a source outside the motor vehicle.
- The information can be obtained, for example by means of wireless data transfer, from a computer or a computer network which provides the personalized information to the driver.
- Calendar or contact information, reminders or personalized advertising can thus be discreetly brought to the driver's attention.
- The method comprises determining a driving situation of the motor vehicle and selecting the information to be inserted as a function of the driving situation. If, for example, a driving situation demands increased attention from the driver because of its complexity, the personalized information may be limited to brief messages marked as urgent. If, however, the driving situation is simple, such as driving at a constant speed on a low-traffic freeway in daylight and good visibility, the information may also be more complex or changed more frequently.
- The driving situation can be determined, for example, on the basis of a driving purpose, a time of day or a driving speed. In this way an information overload of the driver can be prevented even more effectively.
- The personalized information can be output to suit the driving situation, so that, for example, a current schedule can be output on a trip to work, while personalized weather information for the destination can be output on a holiday trip.
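- A minimal, non-limiting sketch of such a situation-dependent selection is given below; the complexity heuristic (speed and traffic-density thresholds), the priority levels and the trip purposes are illustrative assumptions.

```python
# Minimal sketch of selecting information depending on the driving situation.
from dataclasses import dataclass
from typing import List

@dataclass
class Message:
    text: str
    priority: int      # 1 = urgent, 3 = low
    trip_purpose: str  # e.g. "work", "holiday", "any"

def select_messages(messages: List[Message], speed_kmh: float,
                    traffic_density: float, trip_purpose: str) -> List[Message]:
    """Return only the messages appropriate for the current driving situation."""
    # A crude complexity measure: fast driving or dense traffic demands attention.
    demanding = speed_kmh > 100 or traffic_density > 0.7
    selected = []
    for m in messages:
        if m.trip_purpose not in ("any", trip_purpose):
            continue
        if demanding and m.priority > 1:
            continue  # only brief, urgent messages in complex situations
        selected.append(m)
    return selected

# Example: on a demanding work trip only the urgent reminder gets through.
print(select_messages([Message("Travel booking due", 1, "work"),
                       Message("Weather at destination: sunny", 3, "holiday")],
                      speed_kmh=120, traffic_density=0.8, trip_purpose="work"))
```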
- A computer program product comprises program code means for carrying out the described method when the computer program product runs on a processor or is stored on a computer-readable medium.
- The computer program product can, in particular, run on a processing device which is integrated into the motor vehicle and possibly also controls another system, for example a navigation system.
- The computer program product may also run on a computer that is removable from the motor vehicle; as the output device, either an integrated output device or an output device permanently installed in the vehicle may alternatively be used.
- An inventive system for driver information comprises a recording device for capturing an image of an exterior of a motor vehicle, a determining device for determining personalized information directed to a driver of the motor vehicle, a processing device for determining a contiguous display area in the image and for inserting the information into the image in the region of the display area, and an output device for outputting the image to the driver.
- The system is permanently installed in the motor vehicle.
- It is possible, in particular, to facilitate networking with other systems of the motor vehicle in order, for example, to allow an improved or easier determination of the driving situation of the motor vehicle.
- The output device comprises a so-called head-up display.
- Such an optical output device allows an overlay of the immediately visually perceptible environment with a generated image.
- An overlay may take place only in subregions of the environment, for example by means of an alphanumeric output.
- A context-sensitive overlay of visually perceptible objects with other information is referred to as augmented reality. This technique can be used advantageously to carry out the inventive output to the driver.
- The system preferably comprises a receiving device for receiving the information from a source outside the motor vehicle.
- The receiving device may in particular comprise a unidirectional or bidirectional digital data interface which is connected to a computer or a computer network, for example the Internet.
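- Assuming, purely for illustration, that the external source is reachable over an HTTP interface, a request/response exchange of the kind described above could look like the following sketch; the endpoint URL, the query parameter and the JSON field names are hypothetical.

```python
# Minimal sketch of fetching personalized information from an external source.
# The `requests` library is assumed to be available on the receiving device.
import requests

def fetch_personalized_info(driver_id: str,
                            endpoint: str = "https://example.com/driver-info"):
    """Ask an external server for messages addressed to this driver."""
    try:
        resp = requests.get(endpoint, params={"driver": driver_id}, timeout=2.0)
        resp.raise_for_status()
        return resp.json().get("messages", [])
    except requests.RequestException:
        # The overlay simply stays empty if no data can be fetched in time.
        return []
```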
- FIG. 1 shows a system for driver information;
- FIGS. 2 to 4 show examples of overlaying visually perceptible information with output personalized information; and
- FIG. 5 shows a flowchart of a method for execution on the system of FIG. 1.
- FIG. 1 shows a system 100 for driver information.
- On board a motor vehicle 105 there are a processing device 110, a first camera 115, a second camera 120, an interface 125, a receiving device 130 and an output device 135.
- Not all of the elements shown must be present, as will be explained in more detail below.
- The processing device 110 preferably comprises a programmable microcomputer.
- The processing device 110 is installed permanently on board the motor vehicle 105, wherein the processing device 110, in conjunction with the devices connected to it, can take on further processing and control tasks.
- The processing device 110 may be part of a navigation or entertainment system on board the motor vehicle 105.
- The first camera 115 and the second camera 120, which may also be replaced by a stereo camera, are adapted to provide images or a combined image of an exterior of the motor vehicle 105.
- A viewing angle of the cameras 115 and 120 corresponds as closely as possible to a viewing angle of a driver of the motor vehicle 105.
- The cameras 115 and 120 are preferably aligned forward in the direction of travel and are offset from each other, so that images recorded simultaneously with the two cameras 115, 120 can be superimposed for the purpose of determining depth information.
- The determination of depth information in the combined image can take place either in the cameras 115, 120 or the stereo camera which replaces them, or by means of the processing device 110.
- Alternatively, only a single camera 115 may be provided, and depth information of the image provided by this camera 115 can then be determined, for example, by a geometric distance estimation.
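- As an illustrative sketch of the depth determination from the two offset cameras, the block-matching approach below computes a disparity map from a rectified stereo pair and converts it to depth; the focal length and baseline are made-up calibration values, and OpenCV's StereoBM is only one possible matcher, not necessarily the one meant in the description.

```python
# Minimal sketch: depth from a rectified 8-bit grayscale stereo pair.
import cv2
import numpy as np

def depth_from_stereo(left_gray: np.ndarray, right_gray: np.ndarray,
                      focal_px: float = 800.0, baseline_m: float = 0.30) -> np.ndarray:
    """Return a rough per-pixel depth map in metres via block matching."""
    matcher = cv2.StereoBM_create(numDisparities=64, blockSize=15)
    disparity = matcher.compute(left_gray, right_gray).astype(np.float32) / 16.0
    disparity[disparity <= 0] = np.nan          # invalid / occluded pixels
    return focal_px * baseline_m / disparity    # Z = f * B / d
```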
- The optional interface 125 is configured to provide data indicative of a driving state of the motor vehicle 105.
- These data may include a position, a speed, an acceleration, a planned route, a time of day, an outside temperature, lighting conditions, and other parameters that are significant for the operation of the motor vehicle 105.
- From these data, the processing device 110 can determine the driving state of the motor vehicle 105.
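- The kind of data the interface 125 provides can be pictured as a simple record; the field names, units and defaults below are assumptions for illustration only.

```python
# Minimal sketch of driving-state data as it might be supplied by interface 125.
from dataclasses import dataclass
from typing import List, Optional, Tuple

@dataclass
class DrivingState:
    position: Tuple[float, float]                     # WGS84 latitude / longitude
    speed_kmh: float
    acceleration_ms2: float
    planned_route: Optional[List[Tuple[float, float]]] = None  # upcoming waypoints
    time_of_day_h: float = 12.0
    outside_temp_c: float = 20.0
    lighting: str = "daylight"                        # e.g. "daylight", "dusk", "night"
```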
- The receiving device 130 is also optional and is configured to receive personalized information directed to a driver of the motor vehicle 105.
- The receiving device 130 may be connected, by wire or wirelessly, to a data memory on board the motor vehicle 105, for example a mobile phone or a personal computer used, for instance, for managing appointments.
- The receiving device 130 may also be configured to wirelessly receive data from a network.
- This network may include, for example, a cellular network, which may be connected to the Internet.
- The receiving device 130 is also configured to transmit a request for personalized data from the processing device 110 to another computer, which then provides these data.
- The output device 135 is an optical output device, preferably one supporting multicolor output.
- The output device 135 is installed so that the driver of the motor vehicle 105 can easily read it.
- The output device 135 comprises a freely viewable display, such as a liquid crystal display.
- The output device 135 comprises a so-called head-up display, which is set up to project information into the field of vision of the driver.
- The field of view of the cameras 115 and 120 includes the driver's main field of vision, so that the optical information directly perceptible in the surroundings of the motor vehicle 105 can be superimposed, by means of the output device 135, on the basis of the images provided by the cameras 115, 120.
- The superimposition determines which objects in the environment of the motor vehicle 105 remain visible to the driver and which are completely or partially covered by information of the image.
- FIGS. 2 to 4 show examples of overlaying optically perceptible information with output personalized information.
- Shown in each case is an area 200 which lies in front of the motor vehicle 105 and which can be viewed by the driver of the motor vehicle 105 when looking in the direction of travel.
- Superimposed on this area is an image 205 which was captured by means of the cameras 115, 120 and processed by means of the processing device 110 of FIG. 1.
- The image 205 is preferably displayed in the region 200 in such a way that directly perceptible objects of the region 200 and the representations of these objects in the image 205 are congruent with one another, so that additional information of the image 205 falls on predetermined areas of the driver's field of vision.
- The image 205 is not completely reproduced to the driver; only additional information is superimposed on directly visible objects.
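- One way to picture this is to output only the pixels that the insertion step actually changed, leaving the rest of the head-up display transparent; the sketch below assumes that the augmented and original camera images are available as NumPy arrays of equal shape.

```python
# Minimal sketch: keep only the inserted information for the head-up display,
# rather than reproducing the whole camera image.
import numpy as np

def hud_layer(augmented: np.ndarray, original: np.ndarray) -> np.ndarray:
    """Return a layer containing only the pixels changed by the insertion step."""
    changed = np.any(augmented != original, axis=2)
    layer = np.zeros_like(augmented)   # black = transparent on a typical HUD
    layer[changed] = augmented[changed]
    return layer
```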
- In each representation of FIGS. 2 to 4, an object 210, a surface 215 of the respective object 210, and a graphical representation of personalized information 225 on a display area 220 are shown.
- For this purpose, the processing device 110 performs an object recognition on the image 205 in each case.
- On the basis of the recognized objects 210, one or more surfaces 215 may be determined, from which a contiguous display area 220 is formed.
- In this process, several surfaces 215 of one or more objects 210 can be combined.
- In a first example, the object 210 comprises a guardrail.
- The display area 220 corresponds to the visible surface of the guardrail, and the information 225 shown on the display area 220 relates, for example, to a due travel booking that the driver of the motor vehicle 105 has to make.
- In a further example, the object 210 is a truck and the surface 215 is its rear boundary surface.
- The illustrated personalized information 225 relates, for example, to a due ticket order of the driver of the motor vehicle 105 and is superimposed on this boundary surface.
- In yet another example, the object 210 is an area of the road ahead of the motor vehicle 105, and the surface of the road forms the display area 220.
- The illustrated personalized information 225 relates, for example, to general product information presented to the driver.
- FIG. 5 shows a flowchart of a method 500 for informing the driver of the motor vehicle 105 of FIG. 1.
- The method 500 is set up in particular for execution on the processing device 110 of the system 100 of FIG. 1.
- The method 500 comprises steps 505 to 550, wherein in the simplest embodiment only those steps need to be carried out which are shown with a thick outline; the remaining steps are optional and may be omitted in other embodiments.
- In steps 505 and 510, an image 205 is captured, wherever possible simultaneously, by means of the cameras 115 and 120. If only one camera 115 is used, steps 505 and 510 coincide.
- In a step 515, an object recognition is performed to detect objects 210 depicted in the image 205. If the image 205 was recorded by means of the two cameras 115, 120 or by means of a stereo camera, a determination of depth information in the image 205 can be performed beforehand, and the object recognition in step 515 can additionally be based on this depth information.
- In a step 520, a contiguous display area 220 in the image 205 is determined.
- For this purpose, individual surfaces of the objects 210 detected in step 515 can be used.
- A surface 215 of an object 210, or multiple surfaces 215 of one or more objects 210, may together form the display area 220.
- In a step 525, personalized information directed to the driver of the motor vehicle 105 is obtained.
- This personalized information is received by means of the receiving device 130 of FIG. 1.
- In an optional step 530, a driving state of the motor vehicle 105 may be determined.
- In a step 535, the personalized information to be displayed is selected from the information obtained in step 525.
- The selection may also be made on the basis of the driving state of the motor vehicle 105 determined in step 530.
- In a step 540, the information selected in step 535 for display on the display area 220 may be enlarged, reduced, rotated or distorted in perspective.
- In a step 545, the information is inserted into the image 205 and is subsequently output to the driver of the motor vehicle 105 in a step 550. Examples of possible outputs are shown in FIGS. 2 to 4.
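- Taken together, steps 505 to 550 can be pictured as the control-flow skeleton below; every concrete processing step is reduced to a trivial placeholder or a caller-supplied function, so only the ordering and the optional branches follow the description, and all names and return values are assumptions.

```python
# Minimal, self-contained sketch of the control flow of method 500 (steps 505-550).
def method_500(grab_image, recognize_objects, receive_info,
               determine_driving_state=None):
    image = grab_image()                                   # steps 505/510: capture image(s)
    objects = recognize_objects(image)                     # step 515: object recognition
    display_area = max(objects, key=lambda o: o["area"], default=None)  # step 520
    info = receive_info()                                  # step 525: personalized information
    if determine_driving_state is not None:                # step 530 (optional)
        state = determine_driving_state()
        if state.get("demanding", False):                  # step 535: keep only urgent items
            info = [m for m in info if m.get("urgent", False)]
    if display_area is None or not info:
        return {"frame": image, "overlay": None}           # nothing suitable to insert
    # Steps 540/545 (perspective fitting and insertion) are placeholders here;
    # step 550 would hand the result to the output device.
    return {"frame": image, "overlay": (display_area, info[0]["text"])}

# Example call with dummy callables:
print(method_500(
    grab_image=lambda: "camera frame",
    recognize_objects=lambda img: [{"label": "truck_rear", "area": 44000}],
    receive_info=lambda: [{"text": "Travel booking due", "urgent": True}],
))
```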
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Computer Graphics (AREA)
- Transportation (AREA)
- Chemical & Material Sciences (AREA)
- Combustion & Propulsion (AREA)
- Mechanical Engineering (AREA)
- Geometry (AREA)
- Computer Hardware Design (AREA)
- General Engineering & Computer Science (AREA)
- Software Systems (AREA)
- Optics & Photonics (AREA)
- Computing Systems (AREA)
- Traffic Control Systems (AREA)
- Fittings On The Vehicle Exterior For Carrying Loads, And Devices For Holding Or Mounting Articles (AREA)
- Instrument Panels (AREA)
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE102012200133A DE102012200133A1 (de) | 2012-01-05 | 2012-01-05 | Verfahren und Vorrichtung zur Fahrerinformation |
PCT/EP2012/071925 WO2013102508A1 (de) | 2012-01-05 | 2012-11-06 | Verfahren und vorrichtung zur fahrerinformation |
Publications (1)
Publication Number | Publication Date |
---|---|
EP2800671A1 (de) | 2014-11-12 |
Family
ID=47191714
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP12787679.5A Withdrawn EP2800671A1 (de) | 2012-01-05 | 2012-11-06 | Verfahren und vorrichtung zur fahrerinformation |
Country Status (6)
Country | Link |
---|---|
US (1) | US20150296199A1 (en) |
EP (1) | EP2800671A1 (de) |
JP (1) | JP6104279B2 (ja) |
CN (1) | CN104039580B (zh) |
DE (1) | DE102012200133A1 (de) |
WO (1) | WO2013102508A1 (de) |
Families Citing this family (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9851882B2 (en) * | 2015-12-27 | 2017-12-26 | Thunder Power New Energy Vehicle Development Company Limited | Fully designable vehicle information panel interface |
US10366290B2 (en) * | 2016-05-11 | 2019-07-30 | Baidu Usa Llc | System and method for providing augmented virtual reality content in autonomous vehicles |
DE102017204254A1 (de) * | 2017-03-14 | 2018-09-20 | Bayerische Motoren Werke Aktiengesellschaft | Verfahren und Vorrichtung zur Erinnerung eines Fahrers an ein Anfahren an einer Lichtsignaleinrichtung |
DE102017206312B4 (de) * | 2017-04-12 | 2024-08-01 | Ford Global Technologies, Llc | Unterstützung einer Handhabung eines innerhalb eines Fahrgastinnenraums befindlichen Gegenstands sowie Kraftfahrzeug |
DE112018004847B4 (de) | 2017-08-31 | 2024-02-08 | Sony Corporation | Informationsverarbeitungsvorrichtung, informationsverarbeitungsverfahren, programm und mobiles objekt |
EP3573025A1 (en) * | 2018-05-24 | 2019-11-27 | Honda Research Institute Europe GmbH | Method and system for automatically generating an appealing visual based on an original visual captured by the vehicle mounted camera |
US11762390B1 (en) * | 2019-01-25 | 2023-09-19 | Amazon Technologies, Inc. | Autonomous machine safety management in a dynamic environment |
Family Cites Families (31)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5550677A (en) * | 1993-02-26 | 1996-08-27 | Donnelly Corporation | Automatic rearview mirror system using a photosensor array |
US6498620B2 (en) * | 1993-02-26 | 2002-12-24 | Donnelly Corporation | Vision system for a vehicle including an image capture device and a display system having a long focal length |
JPH1026542A (ja) * | 1996-07-10 | 1998-01-27 | Toyoda Gosei Co Ltd | 自動車用デジタルメータ装置 |
DE10131720B4 (de) * | 2001-06-30 | 2017-02-23 | Robert Bosch Gmbh | Head-Up Display System und Verfahren |
JP2005069776A (ja) * | 2003-08-21 | 2005-03-17 | Denso Corp | 車両用表示方法、車両用表示装置 |
JP2005070231A (ja) * | 2003-08-21 | 2005-03-17 | Denso Corp | 車両における表示方法 |
JP3972366B2 (ja) * | 2003-09-26 | 2007-09-05 | マツダ株式会社 | 車両用情報提供装置 |
JP3931336B2 (ja) * | 2003-09-26 | 2007-06-13 | マツダ株式会社 | 車両用情報提供装置 |
JP3931334B2 (ja) * | 2003-09-26 | 2007-06-13 | マツダ株式会社 | 車両用情報提供装置 |
DE10355322A1 (de) * | 2003-11-27 | 2005-06-23 | Robert Bosch Gmbh | Anzeigegerät |
JP2005182306A (ja) * | 2003-12-17 | 2005-07-07 | Denso Corp | 車両用表示装置 |
US7561966B2 (en) * | 2003-12-17 | 2009-07-14 | Denso Corporation | Vehicle information display system |
DE102004033480A1 (de) * | 2004-07-10 | 2006-02-16 | Robert Bosch Gmbh | Vorrichtung zur Überwachung einer Fahrzeugbedienung |
JP4529735B2 (ja) * | 2005-03-07 | 2010-08-25 | 株式会社デンソー | テレビ放送表示用の表示制御装置および表示制御装置用プログラム |
WO2006121986A2 (en) * | 2005-05-06 | 2006-11-16 | Facet Technology Corp. | Network-based navigation system having virtual drive-thru advertisements integrated with actual imagery from along a physical route |
JP4740689B2 (ja) * | 2005-08-19 | 2011-08-03 | エイディシーテクノロジー株式会社 | 車載用画像表示装置及び車載用装置 |
US20070205963A1 (en) * | 2006-03-03 | 2007-09-06 | Piccionelli Gregory A | Heads-up billboard |
CN201030817Y (zh) * | 2006-07-20 | 2008-03-05 | 张玉枢 | 机动车文字警示交流系统 |
JP4942814B2 (ja) * | 2007-06-05 | 2012-05-30 | 三菱電機株式会社 | 車両用操作装置 |
JP4475308B2 (ja) * | 2007-09-18 | 2010-06-09 | 株式会社デンソー | 表示装置 |
JP2009126249A (ja) * | 2007-11-20 | 2009-06-11 | Honda Motor Co Ltd | 車両用情報表示装置 |
JP2009251968A (ja) * | 2008-04-07 | 2009-10-29 | Toyota Motor Corp | 緊急通報システム、通信管理サーバー、及び車載情報通信装置 |
JP4645675B2 (ja) * | 2008-04-23 | 2011-03-09 | 日本精機株式会社 | 車両用表示装置 |
KR20110102873A (ko) * | 2008-12-19 | 2011-09-19 | 텔레 아틀라스 비. 브이. | 내비게이션 시스템에서 이미지들의 대상체들로의 동적 매핑 |
US8395529B2 (en) * | 2009-04-02 | 2013-03-12 | GM Global Technology Operations LLC | Traffic infrastructure indicator on head-up display |
US8704653B2 (en) * | 2009-04-02 | 2014-04-22 | GM Global Technology Operations LLC | Enhanced road vision on full windshield head-up display |
US8564502B2 (en) * | 2009-04-02 | 2013-10-22 | GM Global Technology Operations LLC | Distortion and perspective correction of vector projection display |
US8503762B2 (en) * | 2009-08-26 | 2013-08-06 | Jacob Ben Tzvi | Projecting location based elements over a heads up display |
JP5158063B2 (ja) * | 2009-12-02 | 2013-03-06 | 株式会社デンソー | 車両用表示装置 |
KR101544524B1 (ko) * | 2010-12-16 | 2015-08-17 | 한국전자통신연구원 | 차량용 증강현실 디스플레이 시스템 및 차량용 증강현실 디스플레이 방법 |
US20120224060A1 (en) * | 2011-02-10 | 2012-09-06 | Integrated Night Vision Systems Inc. | Reducing Driver Distraction Using a Heads-Up Display |
-
2012
- 2012-01-05 DE DE102012200133A patent/DE102012200133A1/de active Pending
- 2012-11-06 CN CN201280066102.2A patent/CN104039580B/zh active Active
- 2012-11-06 EP EP12787679.5A patent/EP2800671A1/de not_active Withdrawn
- 2012-11-06 JP JP2014550658A patent/JP6104279B2/ja active Active
- 2012-11-06 US US14/370,650 patent/US20150296199A1/en not_active Abandoned
- 2012-11-06 WO PCT/EP2012/071925 patent/WO2013102508A1/de active Application Filing
Non-Patent Citations (2)
Title |
---|
None * |
See also references of WO2013102508A1 * |
Also Published As
Publication number | Publication date |
---|---|
DE102012200133A1 (de) | 2013-07-11 |
CN104039580B (zh) | 2019-08-16 |
US20150296199A1 (en) | 2015-10-15 |
WO2013102508A1 (de) | 2013-07-11 |
JP6104279B2 (ja) | 2017-03-29 |
CN104039580A (zh) | 2014-09-10 |
JP2015504815A (ja) | 2015-02-16 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
EP3055650B1 (de) | Verfahren und vorrichtung zur augmentierten darstellung | |
EP1405124B1 (de) | Head-up display system und verfahren zur projektion einer markierung eines verkehrszeichens in bezug auf die blickrichtung des fahrers | |
DE102017221191B4 (de) | Verfahren zur Anzeige des Verlaufs einer Sicherheitszone vor einem Fahrzeug oder einem Objekt mit einer Anzeigeeinheit, Vorrichtung zur Durchführung des Verfahrens sowie Kraftfahrzeug und Computerprogramm | |
DE102012205316B4 (de) | Navigationssystem und Anzeigeverfahren hiervon | |
EP3658976B1 (de) | Verfahren zum bereitstellen einer anzeige in einem kraftfahrzeug, sowie kraftfahrzeug | |
WO2019170387A1 (de) | Einblendung von zusatzinformationen auf einer anzeigeeinheit | |
DE102018207440A1 (de) | Verfahren zur Berechnung einer "augmented reality"-Einblendung für die Darstellung einer Navigationsroute auf einer AR-Anzeigeeinheit, Vorrichtung zur Durchführung des Verfahrens sowie Kraftfahrzeug und Computerprogramm | |
EP3425442B1 (de) | Verfahren und vorrichtung zum anreichern eines sichtfeldes eines fahrers eines fahrzeuges mit zusatzinformationen, vorrichtung zur verwendung in einem beobachter-fahrzeug sowie kraftfahrzeug | |
EP3543059A1 (de) | Verfahren zur berechnung einer einblendung von zusatzinformationen für eine anzeige auf einer anzeigeeinheit, vorrichtung zur durchführung des verfahrens sowie kraftfahrzeug und computerprogramm | |
WO2013102508A1 (de) | Verfahren und vorrichtung zur fahrerinformation | |
WO2019166222A1 (de) | Verfahren zur berechnung einer ar-einblendung von zusatzinformationen für eine anzeige auf einer anzeigeeinheit, vorrichtung zur durchführung des verfahrens sowie kraftfahrzeug und computerprogramm | |
DE102011082398A1 (de) | Verfahren zur Nutzung eines Fahrerassistenzsystems | |
EP3695266B1 (de) | Verfahren zum betrieb einer anzeigeeinrichtung in einem kraftfahrzeug | |
DE102011122616A1 (de) | Verfahren und Vorrichtung zum Bereitstellen einer Einparkhilfe in einem Fahrzeug | |
DE102017221488A1 (de) | Verfahren zur Anzeige des Verlaufs einer Trajektorie vor einem Fahrzeug oder einem Objekt mit einer Anzeigeeinheit, Vorrichtung zur Durchführung des Verfahrens sowie Kraftfahrzeug und Computerprogramm | |
DE102015223248A1 (de) | Verfahren für ein Fahrerassistenzsystem | |
EP3685123A1 (de) | Verfahren, vorrichtung und computerlesbares speichermedium mit instruktionen zur steuerung einer anzeige einer augmented-reality-head-up-display-vorrichtung für ein kraftfahrzeug | |
DE102006040537A1 (de) | Fahrzeugassistenzsystem | |
DE102012018556B4 (de) | Assistenzsystem zur Ermöglichung einer erweiterten Vorausschau für nachfolgende Verkehrsteilnehmer | |
EP3296795A1 (de) | Verfahren zur darstellung eines bildobjekts in einem fahrzeug auf einer fahrzeugintern und einer fahrzeugextern wahrgenommenen anzeige | |
DE102018213745A1 (de) | Verfahren zum Betreiben eines Infotainmentsystems für einen Passagier in einem Ego-Fahrzeug und Infotainmentsystem | |
DE102018207407A1 (de) | Fahrerassistenzsystem, Fortbewegungsmittel und Verfahren zur Anzeige eines Abbildes eines Umgebungsbereiches eines Fortbewegungsmittels | |
DE102014225686A1 (de) | Verfahren und Vorrichtung zur videobasierten Vorschau eines vorausliegenden Straßenabschnitts für ein Fahrzeug sowie Verfahren und Vorrichtung zum videobasierten Aufzeichnen eines Straßenabschnitts für ein Fahrzeug | |
DE102012212016A1 (de) | Verfahren zum Betreiben einer optischen Anzeigevorrichtung eines Fahrzeugs | |
DE102010001716A1 (de) | Anzeigevorrichtung für ein Kraftfahrzeug |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase |
Free format text: ORIGINAL CODE: 0009012 |
|
17P | Request for examination filed |
Effective date: 20140805 |
|
AK | Designated contracting states |
Kind code of ref document: A1 Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
DAX | Request for extension of the european patent (deleted) | ||
17Q | First examination report despatched |
Effective date: 20190718 |
|
RAP1 | Party data changed (applicant data changed or rights of an application transferred) |
Owner name: ROBERT BOSCH GMBH |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN |
|
18D | Application deemed to be withdrawn |
Effective date: 20191129 |