EP3759694A1 - Method for calculating an ar-overlay of additional information for a display on a display unit, device for carrying out the method, as well as motor vehicle and computer program - Google Patents
- Publication number
- EP3759694A1 (application EP19705167.5A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- vehicle
- display
- calculated
- driver
- oncoming
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/006—Mixed reality
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K35/00—Arrangement of adaptations of instruments
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W40/00—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
- B60W40/02—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to ambient conditions
- B60W40/06—Road conditions
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W50/00—Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
- B60W50/08—Interaction between the driver and the control system
- B60W50/14—Means for informing the driver, warning the driver or prompting a driver intervention
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/34—Route searching; Route guidance
- G01C21/36—Input/output arrangements for on-board computers
- G01C21/3626—Details of the output of route guidance instructions
- G01C21/365—Guidance using head up displays or projectors, e.g. virtual vehicles or arrows projected on the windscreen or on the road itself
-
- B60K2360/165—
-
- B60K2360/177—
-
- B60K2360/178—
-
- B60K2360/179—
-
- B60K35/28—
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/20—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of display used
- B60R2300/205—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of display used using a head-up display
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W50/00—Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
- B60W50/08—Interaction between the driver and the control system
- B60W50/14—Means for informing the driver, warning the driver or prompting a driver intervention
- B60W2050/146—Display means
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2554/00—Input parameters relating to objects
- B60W2554/80—Spatial relation or speed relative to objects
Definitions
- the proposal concerns the technical field of driver information systems, also known as infotainment systems. In particular, it involves a method for displaying a safety zone in front of a vehicle or an object on a display unit. Such systems are used primarily in vehicles, but the invention can also be used by pedestrians, cyclists, etc. wearing data glasses. The proposal also covers a correspondingly designed device.
- a first approach is not to relieve the driver of his duties completely, but to ensure that the driver can take over control of the vehicle at any time. The driver also takes on monitoring functions.
- HUD Head-Up Display
- Vehicle sensors that permit observation of the environment include, in particular, the following components: RADAR (Radio Detection and Ranging) devices, LIDAR (Light Detection and Ranging) devices, mainly for distance detection / warning, and cameras with appropriate image processing for object recognition.
- This data on the environment can be used as a basis for system-side driving recommendations, warnings, etc.
- they make it possible, for example, to display / warn in which direction (possibly into one's own trajectory) another, surrounding vehicle intends to turn.
- vehicle-to-vehicle communication is meanwhile also possible by means of mobile communication with systems such as LTE (Long Term Evolution).
- LTE V2X was adopted by the 3GPP organization.
- systems based on WLAN technology for direct vehicle-to-vehicle communication are also available, in particular the system according to WLAN p (IEEE 802.11p).
- Autonomous driving (sometimes also called automatic driving, automated driving or piloted driving) is to be understood as the movement of vehicles, mobile robots and driverless transport systems that behave largely autonomously. There are different gradations of the term autonomous driving. Autonomous driving is also spoken of at certain levels when there is still a driver in the vehicle who, if necessary, only takes over monitoring of the automatic driving process. In Europe, the various ministries of transport (in Germany the Federal Highway Research Institute was involved) cooperated and defined the following levels of autonomy:
- Level 0 Driver only, the driver drives, steers, accelerates, brakes, etc.
- Level 1 Certain assistance systems help with vehicle operation (including a cruise control system - Automatic Cruise Control ACC).
- Level 3 High automation. The driver does not have to monitor the system permanently.
- the vehicle autonomously performs functions such as triggering the turn signal, changing lanes and keeping in lane.
- the driver can turn to other things, but if necessary must take over control within a warning time specified by the system.
- a future vision in the automotive industry is to be able to overlay the windscreen of one's own vehicle with virtual elements in order to offer the driver some advantages.
- for this, the so-called "augmented reality" technology (AR) is used. Less familiar is the corresponding German-language term "erweiterte Realität".
- the real environment is enriched with virtual elements. This has several advantages: looking down at displays other than the windshield is no longer necessary, since much relevant information is displayed on the windshield, so the driver does not have to avert his gaze from the road. In addition, because of the positionally exact localization of the virtual elements in the real environment, less cognitive effort on the part of the driver is likely, since no interpretation of a graphic on a separate display is required. With regard to automatic driving, added value can also be generated.
- HUD Head-Up Displays
- Projection units that project an image on the windshield.
- this image appears, depending on the design of the module, a few meters up to 15 meters in front of the vehicle. This has the advantage that the displayed information is presented in such a way that the driver's eyes are relieved of accommodation activity.
- the "image" is composed as follows: it is less a virtual display than a kind of "keyhole" into the virtual world. The virtual environment is theoretically placed over the real world and contains the virtual objects that support and inform the driver while driving. The limited display area of the HUD has the consequence that only a section of it can be seen.
- from the prior art, a head-up display unit for a vehicle for generating a virtual image in the driver's field of vision is known. This results in a situation-adapted representation of information.
- DE 10 2013 016 241 A1 discloses a method for augmented presentation of at least one additional piece of information in at least one recorded digital image
- the additional information is output, in particular, event-driven and optionally alternately, as a 2D representation and / or as a 3D representation.
- additional information output as an additional object, e.g. a virtual road sign, can be displayed in 3D representation, and / or additional information output as additional text, in particular a caption of a real or virtual object, can be displayed in 2D representation in the output image.
- whereas known representation methods aim at the most accurate contact-analogous placement of the virtual objects, i.e. the closest possible placement (for example by GPS position) of the virtual objects on each associated real object, the invention, by outputting the virtual additional information in a previously determined display region, allows the virtual additional information to be optimally placed.
- from the prior art, a method for displaying an image overlay element in an image of a surrounding area of a motor vehicle is also known.
- the image is displayed on a display surface of the motor vehicle.
- at least one object from the surrounding area is imaged in the image.
- the image overlay element is motion-coupled to the object and is shown carried along with it at the same location on the object; depending on a change of direction and / or size of the object, the direction and / or size of the image overlay element is adjusted to the corresponding change of the object.
- it can thus be provided that the image overlay element is adapted to the surface of the roadway. If the roadway is rising, for example because it goes uphill, this can be detected on the basis of three-dimensional information about the object, and the representation of the image overlay element can be adjusted with respect to orientation and design.
- HUD head-up display
- in doing so, an ambient condition and / or a condition affecting the physiology of the user are taken into account.
- AR augmented reality
- the processing of this information can change situationally and lead to misinterpretations or misunderstandings. This can result in dangerous situations.
- on a conventional display, the navigation route is displayed to the driver; although the route to be traveled is shown, it is not characterized clearly enough.
- the invention sets itself the task of finding such an approach.
- this object is achieved by a method for calculating an overlay of additional information for display on a display unit, a corresponding device, a motor vehicle, and a computer program.
- the insertion of additional information serves the purpose of assisting the driver in the longitudinal and transverse guidance of the vehicle.
- the solution is to animate the elements of the AR overlays, such as lines, surfaces, and other geometric elements, through "physical" behaviors.
- the invention relates to a method for calculating an AR overlay ("augmented reality" insertion) of additional information for display on a display unit, in particular a head-up display (HUD) of an observer vehicle or data glasses, wherein the AR overlay is calculated in a contact-analogous manner to one or more objects in the environment of the observer vehicle.
- the position of an oncoming or preceding vehicle or object is detected.
- a spatially extended animation graphic is calculated, the animation graphic having a raster form consisting of a plurality of raster elements and extending from the observer vehicle to the oncoming or preceding vehicle.
- a special feature is that the spatial extent is calculated in such a way that the driver of the observer vehicle has the impression of a kinematic or dynamic movement of the spatial extent, such as translation and rotation.
- in one variant, the spatial extent is calculated in such a way that the driver of the observer vehicle is given the impression of a wave moving toward him or away from him.
- the animation with a waveform can be designed so that the wave can travel on the X, Y, or Z axis.
- the animation graphic is calculated to periodically repeat the spatial extent of the animation graphic, giving the driver of the observer vehicle the impression that a number of wave trains are moving toward or away from him.
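The traveling-wave impression described in the preceding paragraphs can be illustrated with a small sketch. This is not the patent's implementation; the function and parameter names (`wave_offsets`, `wavelength`, `speed`) and all numeric values are assumptions chosen for illustration. Each raster row is given a height offset from a sine whose phase advances with time, so the crest appears to run from the detected vehicle toward the observer:

```python
import math

def wave_offsets(n_rows, t, wavelength=4.0, speed=6.0,
                 amplitude=0.3, row_spacing=1.0):
    """Height offset for each grid row at animation time t (seconds).

    Row 0 is nearest the observer vehicle; the phase term
    distance + speed * t makes each crest travel toward row 0 as
    t increases, and the pattern repeats with period wavelength/speed.
    """
    offsets = []
    for row in range(n_rows):
        distance = row * row_spacing        # metres ahead of the observer
        phase = 2 * math.pi * (distance + speed * t) / wavelength
        offsets.append(amplitude * math.sin(phase))
    return offsets

# After 1/6 s the whole pattern has shifted one row toward the observer
# (speed 6 m/s, row spacing 1 m).
frame_a = wave_offsets(8, t=0.0)
frame_b = wave_offsets(8, t=1.0 / 6.0)
```

The periodic repetition of the wave trains mentioned above corresponds to evaluating the same function at successive times; nothing about the grid itself has to change between frames.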
- to support the lateral guidance of the vehicle, one or two further spatially extended animation graphics are calculated, which are displayed to the side of the route. These further animation graphics likewise have a grid form consisting of a plurality of raster elements, and the spatial extent is calculated so that, on the side where an obstacle or an oncoming vehicle has been detected, the grid is spatially raised to emphasize a narrowing of the route.
- the goal of the towering elements is to communicate warnings more effectively.
- the piling-up / extrusion of elements can be done along any axis and to any proportions.
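As an illustrative sketch (not taken from the patent; the function name, the linear falloff, and all values are assumptions), the raising / extrusion of the lateral grid near a bottleneck could be modeled as a height profile that peaks at the detected narrowing and subsides on both sides:

```python
def extrude_heights(row_positions, bottleneck_pos,
                    max_height=1.5, influence=10.0):
    """Height (metres) of each lateral-grid element: elements are raised
    most at the detected bottleneck and fall off linearly within
    `influence` metres on either side."""
    heights = []
    for pos in row_positions:
        closeness = max(0.0, 1.0 - abs(pos - bottleneck_pos) / influence)
        heights.append(max_height * closeness)
    return heights

# Grid rows every 5 m along the route; the narrowing sits 10 m ahead.
profile = extrude_heights([0.0, 5.0, 10.0, 15.0, 20.0], bottleneck_pos=10.0)
```

Any other monotone falloff (quadratic, Gaussian) would serve equally well; the linear form just keeps the sketch short.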
- in a further variant, the conversion of the at least one grid-shaped animation graphic is calculated so that the raster elements of the animation graphic supporting the lateral guidance swarm during the conversion phase, and the hint symbol arises from them at the end of the conversion phase.
- the swarming behavior of the lines, surfaces, and geometric elements, which cluster at a coordinate in the real world, creates a superposition and an automatic increase in visual intensity.
- this "bundling" of elements can be used, e.g., for functions of area and object marking, but also for the indication of a navigation path or for attention control.
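The swarm-like conversion of raster points into a hint symbol can be sketched as an eased interpolation with a decaying jitter term. This is purely illustrative; the easing curve, the jitter function, and the grid/arrow coordinates are assumptions, not the patent's animation model:

```python
import math

def swarm_step(points, targets, t, jitter=0.4):
    """Raster-point positions at normalised conversion time t in [0, 1].

    Every point follows its own eased path toward its slot in the target
    symbol; a sine wobble that dies out at t = 1 gives the loose,
    swarm-like motion during the conversion phase.
    """
    frame = []
    for i, ((x0, y0), (x1, y1)) in enumerate(zip(points, targets)):
        ease = t * t * (3 - 2 * t)                      # smoothstep easing
        wobble = jitter * (1 - t) * math.sin(6 * math.pi * t + i)
        frame.append((x0 + (x1 - x0) * ease + wobble,
                      y0 + (y1 - y0) * ease))
    return frame

grid = [(0.0, 0.0), (1.0, 0.0), (2.0, 0.0)]     # points of a lateral grid
arrow = [(1.0, 2.0), (1.5, 1.5), (2.0, 1.0)]    # slots in the hint symbol
converted = swarm_step(grid, arrow, 1.0)        # lands exactly on the symbol
```

Because the wobble vanishes at t = 1, the points accumulate precisely into the symbol, matching the "bundling" effect described above.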
- the display unit of the device is designed as a head-up display.
- instead, data glasses or a monitor on which a camera image with the inserted grid is displayed can be used in the device as the display unit.
- the device according to the invention can be used in a motor vehicle.
- the invention is preferably realized so that the display unit is permanently installed in the vehicle, e.g. in the form of a head-up display.
- the invention can also be used advantageously if the display unit corresponds to data glasses. Then the invention can be used, for example, by pedestrians or cyclists wearing such glasses.
- Fig. 2 shows the typical cockpit of a vehicle
- FIG. 3 shows the block diagram of the infotainment system of the vehicle
- Fig. 4 is an illustration of a wave-shaped arched raster overlay;
- FIG. 5 shows an illustration of a grid overlay for highlighting a narrowing of the travel path;
- FIG. 6 is an illustration of a flock-like conversion of a grid-shaped AR overlay into an instruction to the driver, using the example of the lane narrowing;
- FIG. 7 is an illustration of three basic levels of driver overlay information; and
- FIG. 8 is a flow chart for a program for calculating the AR overlays for the three basic levels.
- Fig. 1 illustrates the basic operation of a head-up display.
- the head-up display 20 is mounted in the vehicle 10 below / behind the instrument cluster in the dashboard area.
- Additional information is displayed in the driver's field of vision.
- This additional information appears as if it were projected onto a projection surface 21 at a distance of 7 - 15 m in front of the vehicle 10.
- the additional information displayed creates a kind of virtual environment.
- the virtual environment is theoretically placed over the real world and contains the virtual objects that support and inform the driver while driving. However, it is only projected onto a part of the windshield, so that the additional information can not be arbitrarily arranged in the field of vision of the driver.
- Fig. 2 shows the cockpit of the vehicle 10. Shown is a passenger car; however, any other vehicle could also be considered as vehicle 10. Examples of other vehicles are: buses, commercial vehicles, in particular trucks, agricultural machinery, construction machinery, rail vehicles, etc. The use of the invention would generally be possible in land vehicles, rail vehicles, watercraft and aircraft.
- three display units of an infotainment system are shown in the cockpit. These are the head-up display 20 and a touch-sensitive screen 30, which is mounted in the center console.
- when driving, the center console is not in the driver's field of vision; therefore, the additional information is not displayed on the display unit 30 while driving.
- the touch-sensitive screen 30 serves in particular for operating functions of the vehicle 10. For example, a radio, among other functions, can be controlled via it.
- infotainment is a portmanteau composed of the words information and entertainment.
- to operate the infotainment system, mainly the touch-sensitive screen 30 ("touch screen") is used. This screen 30 can be viewed and operated well by the driver of the vehicle 10 in particular, but also by a passenger of the vehicle 10.
- mechanical operating elements for example keys, rotary encoders or combinations thereof, such as, for example, a push-dial regulator, can be arranged in an input unit 50.
- This unit is not shown separately, but is considered part of the input unit 50.
- the operating device includes the touch-sensitive display unit 30, a computing device 40, an input unit 50, and a memory 60.
- the display unit 30 includes both a display surface for displaying variable graphic information and a control surface (touch-sensitive layer) disposed above the display surface
- the display unit 30 is connected to the computing device 40 via a data line 70
- the data line can be designed according to the LVDS standard, corresponding to Low Voltage Differential Signaling.
- the display unit 30 receives control data for driving the display surface of the touch screen 30 from the computing device 40 via the data line 70. Control data of the inputs made are likewise transmitted via the data line 70.
- Reference numeral 50 denotes the input unit. Associated with it are the already mentioned control elements such as buttons, knobs, sliders, or rotary pushbuttons, with whose help the operator can make inputs via the menus. An input is generally understood to mean selecting a menu item, as well as changing a parameter, turning a function on and off, and so on.
- the memory device 60 is connected to the computing device 40 via a data line 80.
- in the memory 60, a pictogram directory and / or symbol directory is deposited with the pictograms and / or symbols for the possible overlays of additional information.
- here, the points / symbols that serve as the basis for the calculation of the raster overlay can also be stored.
- the other parts of the infotainment system camera 150, radio 140, navigation device 130, telephone 120 and instrument cluster 110 are connected via the data bus 100 with the device for operating the infotainment system.
- the data bus 100 is the high-speed variant of the CAN bus according to ISO standard 11898-2.
- alternatively, an Ethernet-based bus system such as BroadR-Reach also comes into question.
- Bus systems in which the data is transmitted via optical fibers can also be used. Examples are the MOST Bus (Media Oriented System Transport) or the D2B Bus (Domestic Digital Bus).
- the camera 150 can be designed as a conventional video camera. In this case, it records 25 frames/s, which corresponds to 50 fields/s in the interlace recording mode.
- alternatively, the vehicle 10 can be equipped with a special camera that captures more images/s in order to increase the accuracy of object detection for faster-moving objects.
- Several cameras can be used for monitoring the environment.
- the already mentioned RADAR or LIDAR systems could be used additionally or alternatively, in order to carry out or extend the observation of the environment.
- in addition, the vehicle 10 is equipped with a communication module 160. This module is often referred to as an on-board unit. It can be designed for mobile communication, e.g. according to the LTE standard (Long Term Evolution). It can likewise be designed for WLAN communication (wireless LAN), whether for communication with devices of the occupants in the vehicle or for vehicle-to-vehicle communication, etc.
- a driver assistance system for longitudinal guidance of the vehicle 10 is used.
- examples of such assistance systems are an automatic distance control ACC (Adaptive Cruise Control) and a cruise control system GRA (Geschwindigkeitsregelanlage).
- the invention could also be used in the same way if the vehicle 10 were controlled fully automatically.
- the following describes the steps that are taken when the vehicle 10, with the longitudinal guidance system activated (here an ACC system), approaches the preceding vehicle 300, detects it, and adapts its speed to that of the preceding vehicle 300 so that a previously entered safety distance is maintained.
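The speed adaptation just described can be sketched as a simple proportional follow law. This is an illustrative stand-in, not the patented ACC controller; the function name, the gain `k_gap`, and the example values are all assumptions:

```python
def acc_target_speed(set_speed, lead_speed, gap, safe_gap, k_gap=0.5):
    """Speed setpoint in m/s: follow the preceding vehicle while steering
    the gap toward the entered safety distance, never exceeding the speed
    the driver has set and never going negative."""
    follow = lead_speed + k_gap * (gap - safe_gap)
    return max(0.0, min(set_speed, follow))

# Too close (40 m instead of 50 m): drop below the lead's 25 m/s to fall back.
# At the right gap: match the lead. Lead faster than the set speed: stay capped.
```

A production controller would of course add acceleration limits and filtering; the sketch only shows how the entered safety distance enters the speed choice.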
- a grid-shaped AR overlay is calculated for the path predicted by the navigation system. This displays the route to the driver without obscuring important information of the real scene.
- the basic idea and technique of the raster-shaped AR overlay is shown in applicant's co-pending patent application DE 10 2017 212 367. It is also expressly referred to the co-pending application with respect to the disclosure of the invention described herein.
- the basis of the display according to the invention of the longitudinal and / or transverse guidance function of the vehicle 10 on the HUD 20 is the display of a virtual grid 24 along the route, which is displayed at a distance above the actual road or without distance to the road.
- the road is located as a real road course in the field of vision of the driver.
- the special feature of the new proposal is that not only the track is marked with the grid 24, but also an event that is connected to the track.
- the event, in the example shown in FIG. 4, is that a vehicle 300 is approaching on the roadway.
- the danger potential which leads to the AR overlay of the event consists, in the case shown, in the relative movement of the two vehicles 10, 300 moving toward each other, taking into account possible objects or obstacles at the roadside.
- the driver is made aware of the danger potential. This is done, as shown in Fig. 4, by the calculation of a spatially extended AR overlay.
- the urgency of the danger potential is indicated by the fact that the spatial extent, starting from the oncoming vehicle 300, moves toward the observer vehicle 10. This creates for the driver of the observer vehicle 10 the impression of a wave moving toward him. One wave crest or several wave crests can be shown, which move toward the observer vehicle 10.
- the AR overlay is preferably calculated in perspective. As a result, a wave crest towers up more and more in front of the observer vehicle 10, pointing with increasing urgency to the imminent danger.
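The perspective calculation can be illustrated with a basic pinhole projection of road-plane grid points into normalized image coordinates. This is an assumption-laden sketch (the eye height, focal value, and function name are arbitrary, not the patent's rendering pipeline):

```python
def project(point, eye_height=1.2, focal=1.0):
    """Pinhole projection of a road-plane point (x lateral, z ahead,
    both in metres) into normalised HUD image coordinates."""
    x, z = point
    if z <= 0:
        raise ValueError("point must lie ahead of the eye point")
    return (focal * x / z, -focal * eye_height / z)

# Rows of the grid that lie farther ahead are drawn smaller and closer
# to the horizon; rows approaching the observer grow on screen, which is
# why an approaching crest appears to tower up.
near = project((1.0, 5.0))     # roughly (0.2, -0.24)
far = project((1.0, 20.0))     # roughly (0.05, -0.06)
```

The 1/z scaling is exactly what makes a wave crest moving toward the vehicle loom larger frame by frame.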
- FIG. 5 shows an example of an AR overlay which indicates an imminent narrowing of the travel path. Along the lane course, a left grid 26a and a right grid 28a are displayed. These extend from the observer vehicle 10 to the oncoming or preceding vehicle.
- the grids 26a and 28a raise themselves spatially laterally outward.
- the spatial raising is preferably such that it increases in front of the bottleneck and subsides after the bottleneck.
- Fig. 6 shows the case in which the impending narrowing of the driveway is estimated by the longitudinal guidance system such that the risk of collision or contact with the oncoming vehicle 300 is considered too great.
- in this case, an AR overlay is computed which gives the driver an instruction as to what needs to be done to avoid a collision or contact.
- the instruction is designed as an evasion arrow 28b. So that the driver understands the instruction directly and intuitively, it is not simply superimposed at the position of the evasion point, but is also specially shaped by the sequence of its formation. This is done so that the symbol of the evasion arrow 28b arises from the points of the right grid 28a.
- for this purpose, the points of the right-hand grid 28a are animated in such a way that they move in a swarming manner and finally accumulate so that they form the evasion arrow 28b.
- in Fig. 6 it is also shown that the points of the left grid 26a for the lateral guidance shift so that they mark the presumed route of the oncoming vehicle 300.
- Fig. 7 shows a variant of how the various AR overlays can be combined. It is shown that the grid 24 with the representation of the danger potential of an event is displayed together with the grids 26a and 28a for the lateral guidance. For better distinction, this can be done in different colors.
- as an example, the grid 24 is shown in red and the grids 26a and 28a in yellow.
- a computer program for the calculation of the AR fades is explained with reference to FIG. 8.
- the program is processed in the computing device 40.
- the program start is designated by the reference numeral 405.
- in program step 410, the detection of an oncoming vehicle 300 takes place. For this purpose, the images supplied by the camera 150 are evaluated with the object detection algorithms provided for this purpose.
- in addition, the distance to the oncoming vehicle 300 is estimated, as is the relative speed between the vehicles. The instantaneous speed of the oncoming vehicle can be estimated by continuous evaluation of the images supplied by the camera 150.
- alternatively, the instantaneous speed may be transmitted via car-2-car communication from the oncoming vehicle 300 to the observer vehicle 10. After the oncoming vehicle 300 has been detected and the distance and relative speed have been estimated, the calculation of the grid 24 with the corresponding spatial extent takes place in program step 415.
- the grid 24 is calculated in perspective. The calculation is further such that the grid extends to the oncoming vehicle 300.
- in program step 420, the calculated data for the grid 24 are transmitted to the head-up display 20. This performs the insertion of the grid 24, as seen in Fig. 4.
- in program step 425, objects or obstacles at the roadside are detected. As shown in Figures 4, 5 and 6, parked vehicles are located in parking bays at the right lane edge.
- in program step 430, a dynamic calculation of bottlenecks takes place. This is done as follows: the parked vehicle 310 is still at a distance from the observer vehicle 10. The oncoming vehicle 300 moves such that it arrives at approximately the level of the parked vehicle 310 just as the observer vehicle 10 passes the parked vehicle 310. A bottleneck therefore only arises in the future, through the coincidence of these movements.
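The dynamic bottleneck calculation amounts to asking whether both vehicles reach the parked vehicle 310 at about the same time. A hedged sketch (constant speeds assumed; the function name, parameter names, and the time margin are illustrative, not the patent's algorithm):

```python
def bottleneck_coincides(dist_to_parked, v_own,
                         dist_oncoming_to_parked, v_oncoming, margin=2.0):
    """True if observer vehicle 10 and oncoming vehicle 300 would reach
    the parked vehicle 310 within `margin` seconds of each other,
    i.e. a bottleneck is predicted to arise.

    Distances in metres, speeds in m/s, margin in seconds.
    """
    t_own = dist_to_parked / v_own
    t_oncoming = dist_oncoming_to_parked / v_oncoming
    return abs(t_own - t_oncoming) <= margin

# 100 m at 20 m/s and 80 m at 16 m/s both take 5 s: a bottleneck arises.
meets = bottleneck_coincides(100.0, 20.0, 80.0, 16.0)
```

A real system would update these estimates every frame from the sensor data described in steps 410 and 425, so the prediction sharpens as the vehicles close in.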
- in program step 440, the calculated data for the grids 26a and 28a are transmitted to the head-up display 20.
- program step 445 a calculation of the risk potential of the identified bottleneck takes place.
- the program step 455 a calculation of the animation for the conversion of
- the animation consists in that the dots of the grid 28a move in a swarming manner and at the end of their arrangement form the evasion symbol. If no hazard potential is detected, the program branches back to program step 410. In step 460, the data calculated for the AR overlay animation is transmitted to the HUD 20.
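The swarming motion of the grid dots toward the evasion symbol can be sketched as an interpolation of each dot toward its target position, with optional jitter for the swarm impression. The step fraction and jitter parameters are illustrative assumptions.

```python
import random

def swarm_step(points, targets, alpha=0.2, jitter=0.0, rng=None):
    """One animation frame: each grid dot moves a fraction `alpha`
    toward its target position in the evasion symbol, optionally
    perturbed by random jitter to create the swarming impression."""
    rng = rng or random.Random(0)
    out = []
    for (px, py), (tx, ty) in zip(points, targets):
        nx = px + alpha * (tx - px) + rng.uniform(-jitter, jitter)
        ny = py + alpha * (ty - py) + rng.uniform(-jitter, jitter)
        out.append((nx, ny))
    return out

def animate(points, targets, steps=30, alpha=0.2):
    """Run the swarm until the dots have settled into the symbol."""
    for _ in range(steps):
        points = swarm_step(points, targets, alpha)
    return points
```

At the end of the animation each dot has converged to its slot in the evasion symbol, so the grid 28a visibly reassembles into the arrow.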
- In this way, a loop is formed in the program, which is run through until a state change takes place.
- The state change occurs when the driver intervenes and leaves the comfort function, or when the vehicle is shut down. The program then ends in program step 465.
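The overall loop of Fig. 8 can be summarized as a skeleton. This is a hedged sketch only: the event strings and the simplified step sequence are assumptions used to make the control flow concrete, not the patent's implementation.

```python
def ar_overlay_loop(events):
    """Run the program flow of Fig. 8 over a sequence of illustrative
    events, until the driver intervenes or the vehicle is shut down.
    Returns the log of executed program steps."""
    log = [405]                       # program start
    for event in events:
        if event in ("driver_intervenes", "vehicle_off"):
            break                     # state change leaves the loop
        if event == "no_hazard":
            log.append(410)           # branch back to detection
            continue
        # grid calculation, bottleneck detection, risk and animation
        log.extend([415, 420, 425, 430, 440, 445, 455, 460])
    log.append(465)                   # program end
    return log
```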
- Special purpose processors may include application-specific integrated circuits (ASICs), reduced instruction set computers (RISC) and/or field programmable gate arrays (FPGAs).
- ASICs Application Specific Integrated Circuits
- RISC Reduced Instruction Set Computer
- FPGAs Field Programmable Gate Arrays
- The proposed method and apparatus are preferably implemented as a combination of hardware and software.
- The software is preferably installed as an application program on a program storage device. Typically, this is a machine based on a computer platform that has hardware such as one or more
- CPU Central processing units
- RAM random access memory
- I / O Input / output interface
- An operating system is also typically installed on the computer platform.
- The various processes and functions described herein may be part of the application program, or a part that is executed via the operating system.
- The invention can be used whenever the field of view of a driver, an operator, or simply a person wearing data glasses can be enriched with AR overlays.
Abstract
Description
Claims
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE102018203121.5A DE102018203121B4 (en) | 2018-03-02 | 2018-03-02 | Method for calculating an AR overlay of additional information for a display on a display unit, device for carrying out the method, motor vehicle and computer program |
PCT/EP2019/053461 WO2019166222A1 (en) | 2018-03-02 | 2019-02-12 | Method for calculating an ar-overlay of additional information for a display on a display unit, device for carrying out the method, as well as motor vehicle and computer program |
Publications (1)
Publication Number | Publication Date |
---|---|
EP3759694A1 true EP3759694A1 (en) | 2021-01-06 |
Family
ID=65411880
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP19705167.5A Pending EP3759694A1 (en) | 2018-03-02 | 2019-02-12 | Method for calculating an ar-overlay of additional information for a display on a display unit, device for carrying out the method, as well as motor vehicle and computer program |
Country Status (5)
Country | Link |
---|---|
US (1) | US11904688B2 (en) |
EP (1) | EP3759694A1 (en) |
CN (1) | CN111937044A (en) |
DE (1) | DE102018203121B4 (en) |
WO (1) | WO2019166222A1 (en) |
Families Citing this family (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE102019117689A1 (en) * | 2019-07-01 | 2021-01-07 | Bayerische Motoren Werke Aktiengesellschaft | Method and control unit for displaying a traffic situation by hiding traffic user symbols |
DE102019125958A1 (en) * | 2019-09-26 | 2021-04-01 | Audi Ag | Method for operating a motor vehicle in the vicinity of a bottleneck and a motor vehicle |
KR102408746B1 (en) * | 2020-07-31 | 2022-06-15 | 주식회사 에이치엘클레무브 | Collision risk reduction apparatus and method |
US11577725B2 (en) * | 2020-09-02 | 2023-02-14 | Ford Global Technologies, Llc | Vehicle speed and steering control |
DE102021206771A1 (en) * | 2021-06-29 | 2022-12-29 | Volkswagen Aktiengesellschaft | Method for outputting a starting area for a parking process for a motor vehicle, electronic computing device and motor vehicle |
DE102021129582A1 (en) | 2021-11-12 | 2023-05-17 | Bayerische Motoren Werke Aktiengesellschaft | Procedure for displaying hazard information on smart glasses |
Family Cites Families (35)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE102005062151B4 (en) * | 2005-12-22 | 2007-09-13 | Daimlerchrysler Ag | Method and device for assisting a driver in the passage of constrictions |
US8977489B2 (en) * | 2009-05-18 | 2015-03-10 | GM Global Technology Operations LLC | Turn by turn graphical navigation on full windshield head-up display |
US8781170B2 (en) * | 2011-12-06 | 2014-07-15 | GM Global Technology Operations LLC | Vehicle ghosting on full windshield display |
EP2608153A1 (en) * | 2011-12-21 | 2013-06-26 | Harman Becker Automotive Systems GmbH | Method and system for playing an augmented reality game in a motor vehicle |
DE112013002354T5 (en) * | 2012-05-07 | 2015-04-16 | Honda Motor Co., Ltd. | A method for generating virtual display surfaces from video images of a road-based landscape |
US9047703B2 (en) * | 2013-03-13 | 2015-06-02 | Honda Motor Co., Ltd. | Augmented reality heads up display (HUD) for left turn safety cues |
US20140362195A1 (en) * | 2013-03-15 | 2014-12-11 | Honda Motor, Co., Ltd. | Enhanced 3-dimensional (3-d) navigation |
US9393870B2 (en) * | 2013-03-15 | 2016-07-19 | Honda Motor Co., Ltd. | Volumetric heads-up display with dynamic focal plane |
DE102014219575A1 (en) * | 2013-09-30 | 2015-07-23 | Honda Motor Co., Ltd. | Improved 3-dimensional (3-D) navigation |
DE102013016242A1 (en) | 2013-10-01 | 2015-04-02 | Daimler Ag | Method and device for supporting at least one driver assistance system |
DE102013016251A1 (en) | 2013-10-01 | 2014-06-26 | Daimler Ag | Method for augmented representation of additional information in image in e.g. vehicle environment, involves changing additional information in dependence on current driving and surrounding situation |
DE102013016244A1 (en) * | 2013-10-01 | 2015-04-02 | Daimler Ag | Method and device for augmented presentation |
DE102013016241A1 (en) | 2013-10-01 | 2015-04-02 | Daimler Ag | Method and device for augmented presentation |
JP6273976B2 (en) * | 2014-03-31 | 2018-02-07 | 株式会社デンソー | Display control device for vehicle |
DE102014008152A1 (en) | 2014-05-30 | 2014-10-23 | Daimler Ag | Method and device for augmented presentation of at least one additional information in at least one image of an environment |
US9469248B2 (en) * | 2014-10-10 | 2016-10-18 | Honda Motor Co., Ltd. | System and method for providing situational awareness in a vehicle |
US20160109701A1 (en) | 2014-10-15 | 2016-04-21 | GM Global Technology Operations LLC | Systems and methods for adjusting features within a head-up display |
DE102014119317A1 (en) | 2014-12-22 | 2016-06-23 | Connaught Electronics Ltd. | Method for displaying an image overlay element in an image with 3D information, driver assistance system and motor vehicle |
JP6372402B2 (en) * | 2015-03-16 | 2018-08-15 | 株式会社デンソー | Image generation device |
US9487139B1 (en) * | 2015-05-15 | 2016-11-08 | Honda Motor Co., Ltd. | Determining a driver alert level for a vehicle alert system and method of use |
WO2017013739A1 (en) * | 2015-07-21 | 2017-01-26 | 三菱電機株式会社 | Display control apparatus, display apparatus, and display control method |
KR101714185B1 (en) * | 2015-08-05 | 2017-03-22 | 엘지전자 주식회사 | Driver Assistance Apparatus and Vehicle Having The Same |
DE102015116160B4 (en) | 2015-09-24 | 2022-10-13 | Denso Corporation | Head-up display with situation-based adjustment of the display of virtual image content |
US11004426B2 (en) * | 2015-09-25 | 2021-05-11 | Apple Inc. | Zone identification and indication system |
KR101916993B1 (en) * | 2015-12-24 | 2018-11-08 | 엘지전자 주식회사 | Display apparatus for vehicle and control method thereof |
DE102016203080A1 (en) * | 2016-02-26 | 2017-08-31 | Robert Bosch Gmbh | Method for operating a head-up display, head-up display device |
US10315566B2 (en) * | 2016-03-07 | 2019-06-11 | Lg Electronics Inc. | Vehicle control device mounted on vehicle and method for controlling the vehicle |
US9809165B1 (en) * | 2016-07-12 | 2017-11-07 | Honda Motor Co., Ltd. | System and method for minimizing driver distraction of a head-up display (HUD) in a vehicle |
KR102277852B1 (en) * | 2016-11-08 | 2021-07-16 | 가부시키가이샤 덴소 | vehicle display device |
JP6520905B2 (en) * | 2016-12-19 | 2019-05-29 | トヨタ自動車株式会社 | Vehicle driving support device |
DE112018000309B4 (en) * | 2017-01-04 | 2021-08-26 | Joyson Safety Systems Acquisition Llc | Vehicle lighting systems and methods |
KR20180123354A (en) * | 2017-05-08 | 2018-11-16 | 엘지전자 주식회사 | User interface apparatus for vehicle and Vehicle |
DE102017212367B4 (en) | 2017-07-19 | 2022-12-22 | Volkswagen Aktiengesellschaft | Device for displaying the course of a trajectory in front of a vehicle or an object with a display unit and motor vehicle |
DE102019202588A1 (en) * | 2019-02-26 | 2020-08-27 | Volkswagen Aktiengesellschaft | Method for operating a driver information system in an ego vehicle and driver information system |
US20230322248A1 (en) * | 2022-04-06 | 2023-10-12 | Gm Global Technology Operation Llc | Collision warning system for a motor vehicle having an augmented reality head up display |
-
2018
- 2018-03-02 DE DE102018203121.5A patent/DE102018203121B4/en active Active
-
2019
- 2019-02-12 CN CN201980016739.2A patent/CN111937044A/en active Pending
- 2019-02-12 US US16/977,059 patent/US11904688B2/en active Active
- 2019-02-12 EP EP19705167.5A patent/EP3759694A1/en active Pending
- 2019-02-12 WO PCT/EP2019/053461 patent/WO2019166222A1/en active Application Filing
Also Published As
Publication number | Publication date |
---|---|
US20210046822A1 (en) | 2021-02-18 |
US11904688B2 (en) | 2024-02-20 |
CN111937044A (en) | 2020-11-13 |
WO2019166222A1 (en) | 2019-09-06 |
DE102018203121B4 (en) | 2023-06-22 |
DE102018203121A1 (en) | 2019-09-05 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
EP3543059B1 (en) | Method for calculating the insertion of additional information for display on a display unit, device for carrying out the method, motor vehicle and computer program | |
DE102017221191B4 (en) | Method for displaying the course of a safety zone in front of a vehicle or an object with a display unit, device for carrying out the method and motor vehicle and computer program | |
DE102018203121B4 (en) | Method for calculating an AR overlay of additional information for a display on a display unit, device for carrying out the method, motor vehicle and computer program | |
EP2720929B1 (en) | Method and device for assisting a driver in performing lateral guidance of a vehicle on a carriageway | |
EP3762684A1 (en) | Overlaying additional information on a display unit | |
DE102017212367B4 (en) | Device for displaying the course of a trajectory in front of a vehicle or an object with a display unit and motor vehicle | |
EP2720899B1 (en) | Method and display device for displaying a driving state of a vehicle and corresponding computer program product | |
DE102013200862B4 (en) | Optimal view of the display for the entire windscreen | |
EP3668742B1 (en) | Method for operating a driver assistance system of a motor vehicle and motor vehicle | |
EP3425442B1 (en) | Method for enriching a field of view of a driver of a vehicle with additional information, device for use in an observers' vehicle and motor vehicle | |
EP3717954B1 (en) | Method for displaying the course of a trajectory in front of a vehicle or an object by means of a display unit, and device for carrying out the method | |
EP3931034B1 (en) | Method for operating a driver information system in an ego-vehicle and driver information system | |
EP3269579B1 (en) | Method for operating an information system and information system | |
EP3803275A1 (en) | Method for calculating an "augmented reality" overlay for displaying a navigation route on an ar display unit, device for carrying out the method, motor vehicle and computer program | |
EP3931029B1 (en) | Method for operating a driver information system in an ego-vehicle and driver information system | |
EP3931028B1 (en) | Method for operating a driver information system in an ego-vehicle and driver information system | |
WO2015000882A1 (en) | Assistance system and assistance method for support when controlling a motor vehicle | |
EP3931023B1 (en) | Method for operating a driver information system in an ego-vehicle and driver information system | |
DE102017003399A1 (en) | Technology for issuing vehicle messages | |
EP3931030B1 (en) | Method for operating a driver information system in an ego-vehicle and driver information system | |
EP3931025B1 (en) | Method for operating a driver information system in an ego-vehicle and driver information system | |
EP3343177A1 (en) | Driver assistance system, computer program product, signal sequence, means of locomotion and method for providing information to a user of a means of locomotion | |
DE102020200047A1 (en) | Method and device for displaying virtual navigation elements |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: UNKNOWN |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE |
|
PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase |
Free format text: ORIGINAL CODE: 0009012 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE |
|
17P | Request for examination filed |
Effective date: 20201002 |
|
AK | Designated contracting states |
Kind code of ref document: A1 Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
AX | Request for extension of the european patent |
Extension state: BA ME |
|
DAV | Request for validation of the european patent (deleted) | ||
DAX | Request for extension of the european patent (deleted) | ||
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: EXAMINATION IS IN PROGRESS |
|
17Q | First examination report despatched |
Effective date: 20220707 |