EP3759694A1 - Method for calculating an AR overlay of additional information for a display on a display unit, device for carrying out the method, and motor vehicle and computer program - Google Patents

Method for calculating an AR overlay of additional information for a display on a display unit, device for carrying out the method, and motor vehicle and computer program

Info

Publication number
EP3759694A1
Authority
EP
European Patent Office
Prior art keywords
vehicle
display
calculated
driver
oncoming
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
EP19705167.5A
Other languages
German (de)
English (en)
French (fr)
Inventor
Andro Kleen
Robert Jan Wyszka
Vitalij SADOVITCH
Adrian HAAR
Johannes Tümler
Michael Wittkämper
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Volkswagen AG
Original Assignee
Volkswagen AG
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Volkswagen AG filed Critical Volkswagen AG
Publication of EP3759694A1


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00Manipulating 3D models or images for computer graphics
    • G06T19/006Mixed reality
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W40/00Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
    • B60W40/02Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to ambient conditions
    • B60W40/06Road conditions
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/08Interaction between the driver and the control system
    • B60W50/14Means for informing the driver, warning the driver or prompting a driver intervention
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34Route searching; Route guidance
    • G01C21/36Input/output arrangements for on-board computers
    • G01C21/3626Details of the output of route guidance instructions
    • G01C21/365Guidance using head up displays or projectors, e.g. virtual vehicles or arrows projected on the windscreen or on the road itself
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K2360/00Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
    • B60K2360/16Type of output information
    • B60K2360/165Videos and animations
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K2360/00Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
    • B60K2360/16Type of output information
    • B60K2360/177Augmented reality
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K2360/00Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
    • B60K2360/16Type of output information
    • B60K2360/178Warnings
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K2360/00Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
    • B60K2360/16Type of output information
    • B60K2360/179Distances to obstacles or vehicles
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
    • B60K35/20Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor
    • B60K35/28Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor characterised by the type of the output information, e.g. video entertainment or vehicle dynamics information; characterised by the purpose of the output information, e.g. for attracting the attention of the driver
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R2300/00Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
    • B60R2300/20Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of display used
    • B60R2300/205Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of display used using a head-up display
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/08Interaction between the driver and the control system
    • B60W50/14Means for informing the driver, warning the driver or prompting a driver intervention
    • B60W2050/146Display means
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2554/00Input parameters relating to objects
    • B60W2554/80Spatial relation or speed relative to objects

Definitions

  • The proposal concerns the technical field of driver information systems, also known as infotainment systems. In particular, it involves a method for displaying a safety zone in front of a vehicle or an object on a display unit. Such systems are used primarily in vehicles, but the invention can also be used by pedestrians, cyclists, etc. wearing data glasses. The proposal further covers a correspondingly designed device for carrying out the method, as well as a motor vehicle and a computer program.
  • A first approach is not to completely relieve the driver of his driving tasks, but to ensure that the driver can take over control of the vehicle at any time. The driver also retains monitoring functions.
  • HUD Head-Up Display
  • Vehicle sensors are, in particular, the following components that enable observation of the environment: RADAR devices (Radio Detection and Ranging), LIDAR devices (Light Detection and Ranging), mainly for distance detection / warning, and cameras with appropriate image processing for object recognition.
  • These data on the environment can be used as a basis for system-side driving recommendations, warnings, etc.
  • For example, it can be displayed / warned in which direction (possibly into one's own trajectory) another, surrounding vehicle intends to turn.
  • Vehicle-to-vehicle communication is meanwhile also possible by means of mobile communication, with systems such as LTE (Long Term Evolution).
  • LTE V2X was adopted by the 3GPP organization.
  • As an alternative, systems for direct vehicle communication based on WLAN technology are available, in particular the system according to WLAN p.
  • Autonomous driving (sometimes also called automatic driving, automated driving or piloted driving) is to be understood as the largely autonomous movement of vehicles, mobile robots and driverless transport systems. There are different gradations of the term autonomous driving. At certain levels, autonomous driving is also spoken of when there is still a driver in the vehicle who, if necessary, only takes over monitoring of the automatic driving process. In Europe, the various ministries of transport (in Germany the Federal Highway Research Institute was involved) worked together and defined the following levels of autonomy.
  • Level 0 Driver only, the driver drives, steers, accelerates, brakes, etc.
  • Level 1 Certain assistance systems help with vehicle operation (including a cruise control system - Automatic Cruise Control ACC).
  • Level 3 High automation. The driver does not have to monitor the system permanently.
  • The vehicle autonomously performs functions such as triggering the turn signal, changing lanes and keeping in lane.
  • The driver can turn to other things, but if necessary is requested by the system, within a warning time, to take over vehicle guidance.
  • A future vision in the automotive industry is to be able to populate the windscreen of one's own vehicle with virtual elements in order to offer the driver some advantages.
  • The so-called "augmented reality" technology (AR) is used. Less familiar is the corresponding German-language term "erweiterte Realität".
  • Here, the real environment is enriched with virtual elements. This has several advantages: the downward glance at displays other than the windshield is no longer needed, since much relevant information is displayed on the windshield, so the driver does not have to avert his gaze from the road. In addition, due to the positionally exact localization of the virtual elements in the real environment, less cognitive effort on the part of the driver is likely, because no interpretation of a graphic on a separate display is required. With regard to automatic driving, added value can also be generated.
  • HUD Head-Up Displays
  • Projection units that project an image on the windshield.
  • From the driver's point of view, this image appears a few meters up to 15 meters in front of the vehicle, depending on the design of the module. This has the advantage that the displayed information is presented in such a way that the driver's eyes are relieved of accommodation activity.
  • The "image" is composed as follows: it is less a virtual display than a kind of "keyhole" into the virtual world.
  • The virtual environment is theoretically laid over the real world and contains the virtual objects that support and inform the driver while driving. The limited display area of the HUD has the consequence that only a section of it can be seen.
  • A head-up display unit for a vehicle for generating a virtual image in the field of view of the driver is also known from the prior art. This allows a situation-based, adapted representation of information.
  • DE 10 2013 016 241 A1 discloses a method for the augmented presentation of at least one piece of additional information in at least one recorded digital image of an environment.
  • The additional information is output in particular event-driven and optionally alternately, as a 2D representation and / or as a 3D representation.
  • Additional information output as an additional object, e.g. a virtual road sign, is displayed in 3D representation in the output image, and / or additional information output as additional text, in particular a caption of a real or virtual object, is displayed in 2D representation in the output image.
  • In contrast to a representation method with the most accurate contact-analogous placement of the virtual objects, i.e. the closest possible placement (for example based on GPS position) of the virtual objects on the respectively associated real object, that method makes it possible, by outputting the virtual additional information in a previously determined display region, for the virtual additional information to be placed optimally.
  • A method for displaying an image overlay element in an image of a surrounding area of a motor vehicle is also known from the prior art.
  • the image is displayed on a display surface of the motor vehicle.
  • at least one object from the surrounding area is imaged in the image.
  • The image overlay element is motion-coupled to the object and, when the object moves, is shown carried along with it at the same location on the object; in the event of a change of direction and / or size of the object, the direction and / or size of the image overlay element is adjusted to the corresponding change of the object. It can thus be provided that the image overlay element is adapted to the surface of the roadway. If the roadway is rising, for example because it runs uphill, this can be detected on the basis of three-dimensional information about the object, and the representation of the image overlay element can be adapted accordingly with respect to its orientation and design.
  • HUD head-up display
  • An ambient condition and / or a condition affecting the physiology of the user.
  • AR augmented reality
  • The processing of this information can change situationally and lead to misinterpretations or misunderstandings. This can result in dangerous situations.
  • If the navigation route is shown to the driver on a conventional display, the route to be traveled is indeed marked, but it is not made clear enough that a hazardous situation may lie along it.
  • the invention sets itself the task of finding such an approach.
  • This object is achieved by a method for calculating an AR overlay of additional information for a display on a display unit, by a device for carrying out the method, and by a motor vehicle and a computer program.
  • The overlay of additional information serves the purpose of assisting the driver in the longitudinal and lateral guidance of the vehicle.
  • The solution consists in animating the elements of the AR overlays, such as lines, surfaces and other geometric elements, with "physical" behaviors.
  • The invention relates to a method for calculating an AR overlay (corresponding to an "augmented reality" overlay) of additional information for a display on a display unit, in particular a head-up display (HUD) of an observer vehicle or data glasses, wherein the AR overlay is calculated in a contact-analogous manner with respect to one or more objects in the environment of the observer vehicle.
  • the position of an oncoming or preceding vehicle or object is detected.
  • A spatially extended animation graphic is calculated, the animation graphic having a raster form consisting of a plurality of raster elements, which extends from the observer vehicle to the oncoming or preceding vehicle.
  • a special feature is that the spatial extent is calculated in such a way that the driver of the observer vehicle has the impression of a kinematic or dynamic movement of the spatial extent, such as translation and rotation.
  • The spatial extent is calculated in such a way that the driver of the observer vehicle is given the impression of a wave moving towards him or away from him.
  • the animation with a waveform can be designed so that the wave can travel on the X, Y, or Z axis.
  • the animation graphic is calculated to periodically repeat the spatial extent of the animation graphic, giving the driver of the observer vehicle the impression that a number of wave trains are moving toward or away from him.
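  • The wave-like animation of the raster can be sketched as follows. The snippet below is a minimal, illustrative Python sketch, not taken from the patent; names such as `wave_height`, `amplitude` and `wavelength` are assumptions chosen for the example. It raises each raster element along the Z axis according to a wave that travels from the oncoming vehicle towards the observer vehicle and repeats periodically, producing the impression of several wave trains.

```python
import math

def wave_height(distance_m, t_s, amplitude=0.4, wavelength=8.0, speed=6.0, decay=0.02):
    """Height offset (in metres) of a raster element located `distance_m`
    ahead of the observer vehicle at animation time `t_s`.

    The phase grows with time, so crests move towards the observer vehicle
    (decreasing distance); the cosine makes the pattern periodic, so several
    wave trains follow one another. The optional decay damps the wave with
    distance, so nearby crests tower up more.
    """
    phase = 2.0 * math.pi * (distance_m + speed * t_s) / wavelength
    return amplitude * math.exp(-decay * distance_m) * (0.5 + 0.5 * math.cos(phase))

def animate_raster(raster_distances_m, t_s):
    """Return (distance, height) pairs for all raster elements between the
    observer vehicle and the oncoming vehicle at animation time `t_s`."""
    return [(d, wave_height(d, t_s)) for d in raster_distances_m]

# Example: raster elements every metre up to an oncoming vehicle 40 m ahead.
if __name__ == "__main__":
    raster = [float(d) for d in range(0, 41)]
    frame = animate_raster(raster, t_s=0.5)
    print(frame[:5])
```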
  • To support the lateral guidance of the vehicle, one or two further spatially extended animation graphics are calculated, which are displayed laterally of the route. These further animation graphics likewise have a grid shape consisting of a plurality of raster elements, and their spatial extent is calculated such that, on the side where an obstacle or an oncoming vehicle has been detected, the grid is raised spatially in order to emphasize a narrowing of the route.
  • The goal of the towering elements is to communicate warnings more effectively.
  • The piling up / extrusion of elements can be done along any axis and to any proportions.
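  • A minimal sketch of this raising (extrusion) of the lateral grid, assuming a simple Gaussian bump centred on the bottleneck, is given below. The names `raise_lateral_raster`, `bottleneck_distance_m` and `bump_width_m` are illustrative assumptions, not terms from the patent; the idea shown is only that elements on the side of the detected obstacle are lifted in front of the narrowing and subside again behind it.

```python
import math

def raise_lateral_raster(distances_m, bottleneck_distance_m, obstacle_side,
                         max_height=0.8, bump_width_m=6.0):
    """Compute per-element heights for the left and right lateral grids.

    Elements of the grid on `obstacle_side` ("left" or "right") are raised
    around the predicted bottleneck position; the other grid stays flat.
    """
    def bump(d):
        # Gaussian profile: rises in front of the bottleneck, subsides after it.
        return max_height * math.exp(-((d - bottleneck_distance_m) ** 2)
                                     / (2.0 * bump_width_m ** 2))

    left = [bump(d) if obstacle_side == "left" else 0.0 for d in distances_m]
    right = [bump(d) if obstacle_side == "right" else 0.0 for d in distances_m]
    return left, right

# Example: obstacle (parked vehicle) on the right, bottleneck expected 25 m ahead.
left_h, right_h = raise_lateral_raster(range(0, 41), 25.0, "right")
```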
  • In a further variant, the animation graphic is calculated such that the at least one grid-shaped animation graphic is converted into a hint symbol for the driver.
  • The conversion of the animation graphic is calculated such that the raster elements of the animation graphic supporting the lateral guidance move in a swarm-like manner during the conversion phase, out of which the hint symbol arises at the end of the conversion phase.
  • The swarming behavior of the lines, surfaces and geometric elements that cluster at a coordinate in the real world creates a superposition and an automatic increase in visual intensity.
  • This "bundling" of elements can be used, for example, for functions such as area and object marking, but also for indicating a navigation path or for guiding attention.
  • the display unit of the device is designed as a head-up display.
  • Alternatively, data glasses or a monitor on which a camera image is displayed, into which the grid is inserted, can be used as the display unit of the device.
  • the device according to the invention can be used in a motor vehicle.
  • the invention is preferably realized so that the display unit is permanently installed in the vehicle, e.g. in the form of a head-up display.
  • The invention can also be used advantageously if the display unit corresponds to data glasses. Then the invention can be used even by pedestrians, cyclists, motorcyclists, etc.
  • Fig. 2 shows the typical cockpit of a vehicle;
  • Fig. 3 shows the block diagram of the infotainment system of the vehicle;
  • Fig. 4 is an illustration of a wave-shaped, arched raster overlay for supporting the longitudinal guidance of the vehicle;
  • Fig. 5 is an illustration of a grid overlay for highlighting a narrowing of the travel path;
  • Fig. 6 is an illustration of a swarm-like conversion of a grid-shaped AR overlay into an instruction to the driver, using the example of the lane narrowing;
  • Fig. 7 is an illustration of three basic levels of AR overlays for driver information; and
  • Fig. 8 is a flow chart for a program for calculating the AR overlays for the three basic levels.
  • Fig. 1 illustrates the basic operation of a head-up display.
  • The head-up display 20 is mounted in the vehicle 10 below / behind the instrument cluster in the dashboard area.
  • Additional information is displayed in the driver's field of vision.
  • This additional information appears as if it were projected onto a projection surface 21 at a distance of 7 - 15 m in front of the vehicle 10.
  • the additional information displayed creates a kind of virtual environment.
  • The virtual environment is theoretically laid over the real world and contains the virtual objects that support and inform the driver while driving. However, it is only projected onto a part of the windshield, so that the additional information cannot be arranged arbitrarily in the driver's field of vision.
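  • This "keyhole" effect of the limited display area can be illustrated by a simple field-of-view test: only virtual objects whose direction falls within the horizontal and vertical opening angles of the HUD are actually drawn. The angles and names used below are assumptions made for this sketch, not specifications of the device.

```python
import math

def in_hud_field_of_view(x_forward_m, y_left_m, z_up_m,
                         h_fov_deg=10.0, v_fov_deg=4.0, eye_height_m=1.2):
    """Return True if a virtual object at the given vehicle coordinates lies
    inside the limited display area ('keyhole') of the head-up display."""
    if x_forward_m <= 0.0:
        return False
    azimuth = math.degrees(math.atan2(y_left_m, x_forward_m))
    elevation = math.degrees(math.atan2(z_up_m - eye_height_m, x_forward_m))
    return abs(azimuth) <= h_fov_deg / 2.0 and abs(elevation) <= v_fov_deg / 2.0

# A raster element 40 m ahead, slightly to the left and on the road surface:
visible = in_hud_field_of_view(40.0, -1.0, 0.0)
```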
  • Fig. 2 shows the cockpit of the vehicle 10. A passenger car is shown. However, any other vehicle could also be considered as vehicle 10. Examples of other vehicles are: buses, commercial vehicles, in particular trucks, agricultural machinery, construction machinery, rail vehicles, etc. The invention could generally be used in land vehicles, rail vehicles, watercraft and aircraft.
  • In the cockpit, three display units of an infotainment system are shown. These are the head-up display 20 and a touch-sensitive screen 30, which is mounted in the center console.
  • When driving, the center console is not in the driver's field of vision. Therefore, the additional information is not displayed on the display unit 30 while driving.
  • The touch-sensitive screen 30 serves in particular for operating functions of the vehicle 10. For example, a radio, a navigation system, a playback of stored music pieces and / or an air conditioning system, other electronic devices or other comfort functions or applications of the vehicle 10 can be controlled via it.
  • Infotainment is a portmanteau composed of the words information and entertainment.
  • To operate the infotainment system, mainly the touch-sensitive screen 30 ("touch screen") is used; this screen 30 can be viewed and operated well in particular by a driver of the vehicle 10, but also by a passenger of the vehicle 10.
  • In addition, mechanical operating elements, for example keys, rotary encoders or combinations thereof, such as a push-rotary controller, can be arranged in an input unit 50.
  • This unit is not shown separately, but is considered part of the input unit 50.
  • The operating device includes the touch-sensitive display unit 30, a computing device 40, an input unit 50 and a memory device 60.
  • The display unit 30 includes both a display surface for displaying variable graphic information and a control surface (touch-sensitive layer) arranged above the display surface for the input of commands by a user.
  • the display unit 30 is connected to the computing device 40 via a data line 70
  • the data line can be designed according to the LVDS standard, corresponding to Low Voltage Differential Signaling.
  • Via the data line 70, the display unit 30 receives control data for driving the display surface of the touch screen 30 from the computing device 40. Via the data line 70, control data of the entered commands are also transmitted from the touch screen 30 to the computing device 40.
  • Reference numeral 50 denotes the input unit. Associated with it are the already mentioned control elements such as buttons, knobs, sliders or rotary pushbuttons, with the help of which the operator can make inputs via the menu. Input is generally understood to mean selecting a chosen menu option, as well as changing a parameter, switching a function on and off, and so on.
  • the memory device 60 is connected to the computing device 40 via a data line 80.
  • In the memory device 60, a pictogram directory and / or symbol directory is stored with the pictograms and / or symbols for the possible overlays of additional information.
  • Here, the points / symbols that serve as the basis for the calculation of the raster overlay can also be stored.
  • The other parts of the infotainment system (camera 150, radio 140, navigation device 130, telephone 120 and instrument cluster 110) are connected via the data bus 100 to the device for operating the infotainment system.
  • As the data bus 100, the high-speed variant of the CAN bus according to ISO standard 11898-2 comes into consideration.
  • Alternatively, the use of an Ethernet-based bus system such as BroadR-Reach also comes into question.
  • Bus systems in which the data is transmitted via optical fibers can also be used. Examples are the MOST Bus (Media Oriented System Transport) or the D2B Bus (Domestic Digital Bus).
  • The camera 150 can be designed as a conventional video camera. In this case, it records 25 frames/s, which corresponds to 50 fields/s in the interlace recording mode.
  • Alternatively, the vehicle 10 can be equipped with a special camera that captures more images per second, in order to increase the accuracy of object detection for faster-moving objects.
  • Several cameras can be used for monitoring the environment.
  • In addition or as an alternative, the already mentioned RADAR or LIDAR systems could be used to carry out or extend the observation of the environment.
  • In addition, the vehicle 10 is equipped with a communication module 160. This module is often referred to as an on-board unit. It can be designed for mobile communication, e.g. according to the LTE standard (Long Term Evolution), or for WLAN communication (wireless LAN), be it for communication with devices of the occupants in the vehicle or for vehicle-to-vehicle communication, etc.
  • a driver assistance system for longitudinal guidance of the vehicle 10 is used.
  • Examples of such assistance systems are an automatic distance control ACC (Adaptive Cruise Control) and a cruise control system GRA (Geschwindigkeitsregelanlage).
  • The invention could also be used in the same way if the vehicle 10 were controlled fully automatically.
  • The following describes what steps are taken when the vehicle 10, with the longitudinal guidance system activated (here an ACC system), approaches the preceding vehicle 300, detects it and adapts its speed to that of the preceding vehicle 300. This is done in such a way that a previously entered safety distance is maintained.
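  • The behaviour of such a longitudinal guidance system can be illustrated with a very simple controller sketch. The code below is not part of the patent; under assumed names (`acc_target_speed`, `time_gap_s`) it only shows how a speed setpoint could be reduced when the gap to the preceding vehicle 300 falls below the entered safety distance.

```python
def acc_target_speed(own_speed_mps, lead_speed_mps, gap_m,
                     set_speed_mps=27.8, time_gap_s=1.8, min_gap_m=5.0):
    """Return a speed setpoint that keeps roughly `time_gap_s` to the lead vehicle.

    If there is no lead vehicle (gap_m is None), the driver's set speed is used;
    otherwise the setpoint is reduced proportionally to the gap error, so that
    the previously entered safety distance is maintained.
    """
    if gap_m is None:
        return set_speed_mps
    desired_gap_m = min_gap_m + time_gap_s * own_speed_mps
    gap_error_m = gap_m - desired_gap_m
    k_gap = 0.25  # simple proportional gain, purely illustrative
    setpoint = lead_speed_mps + k_gap * gap_error_m
    return max(0.0, min(set_speed_mps, setpoint))

# Example: own vehicle at 25 m/s, lead vehicle at 22 m/s, current gap 35 m.
print(acc_target_speed(25.0, 22.0, 35.0))
```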
  • In addition, a grid-shaped AR overlay is calculated for the path predicted by the navigation system. This displays the route to the driver without obscuring important information of the real scene.
  • The basic idea and technique of the raster-shaped AR overlay is shown in the applicant's co-pending patent application DE 10 2017 212 367. Express reference is also made to that co-pending application with respect to the disclosure of the invention described herein.
  • the basis of the display according to the invention of the longitudinal and / or transverse guidance function of the vehicle 10 on the HUD 20 is the display of a virtual grid 24 along the route, which is displayed at a distance above the actual road or without distance to the road.
  • the road is located as a real road course in the field of vision of the driver.
  • The special feature of the new proposal is that not only the route is marked with the grid 24, but also an event that is connected to the route.
  • The event, in the example shown in Fig. 4, is that a vehicle 300 is approaching on the roadway, as a result of which, according to the estimation of the longitudinal guidance system, a potentially hazardous situation arises.
  • The danger potential which leads to the AR overlay of the event consists, in the case shown, in the relative movement of the two vehicles 10, 300 moving towards each other, taking into account possible objects or obstacles at the edge of the roadway.
  • The driver is made aware of this danger potential. This is done, as shown in Fig. 4, by the calculation of a spatially extended AR overlay.
  • The urgency of the danger potential is indicated by the fact that the spatial extent, starting from the oncoming vehicle 300, moves towards the observer vehicle 10. This creates, for the driver of the observer vehicle 10, the impression of a wave running towards him. One wave crest or several wave crests can be shown, which move towards the observer vehicle 10.
  • The AR display is preferably calculated in perspective. As a result, a wave crest towers up more and more in front of the observer vehicle 10, which increasingly indicates the urgency of the imminent danger.
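  • The perspective calculation can be sketched with a simple pinhole projection: points of the grid, given in vehicle coordinates (distance ahead, lateral offset, height), are mapped into the virtual image plane of the HUD, so that a wave crest close to the observer vehicle appears larger (towers up) than one far away. All names (`project_to_hud`, `focal_px`) are assumptions for this illustration, not parameters of the patent.

```python
def project_to_hud(x_forward_m, y_left_m, z_up_m,
                   focal_px=1000.0, cam_height_m=1.2,
                   image_w=1280, image_h=480):
    """Project a raster point from vehicle coordinates into HUD image pixels.

    Simple pinhole model: the further away a point is (larger x_forward_m),
    the smaller its pixel offset from the image centre, so nearby wave crests
    appear larger and 'tower up' in front of the observer vehicle.
    Returns None for points too close to or behind the projection centre.
    """
    if x_forward_m <= 0.1:
        return None
    u = image_w / 2.0 - focal_px * (y_left_m / x_forward_m)
    v = image_h / 2.0 - focal_px * ((z_up_m - cam_height_m) / x_forward_m)
    return (u, v)

# Example: the same 0.4 m high wave crest at 10 m and at 40 m ahead.
near = project_to_hud(10.0, 0.0, 0.4)
far = project_to_hud(40.0, 0.0, 0.4)
```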
  • Fig. 5 shows an example of an AR overlay which indicates the imminent narrowing of the travel path.
  • To support the lateral guidance, a left grid 26a and a right grid 28a are displayed along the lane course. These extend from the observer vehicle 10 to the oncoming vehicle 300.
  • The grids 26a and 28a rise up spatially, laterally outward.
  • the spatial arrangement is preferably such that it increases in front of the bottleneck and subsides after the bottleneck.
  • Fig. 6 shows the case in which the impending constriction of the travel path is assessed by the longitudinal guidance system such that the risk of a collision or contact with the oncoming vehicle 300 is estimated to be too great.
  • In this case, an AR overlay is computed which gives the driver an instruction as to what needs to be done to avoid a collision or contact.
  • The action instruction is designed as an evasion arrow 28b. So that the driver understands the instruction directly and intuitively, it is not simply superimposed at the position of the evasion point, but is also given special emphasis by the sequence of its formation. This is done in such a way that the symbol of the evasion arrow 28b arises from the points of the right grid 28a.
  • The points of the right-hand grid 28a are animated in such a way that they move in a swarming manner and finally accumulate so that they form the evasion arrow 28b.
  • In Fig. 6 it is also shown that the points of the left grid 26a for lateral guidance are shifted so that they mark the presumed route of the oncoming vehicle 300.
  • Fig. 7 shows a variant of how the various AR overlays can be combined. It is shown that the grid 24 with the representation of the danger potential of an event is displayed together with the grids 26a and 28a for lateral guidance. For better distinction, this can be done in different colors.
  • As an example, the grid 24 is shown in red and the grids 26a and 28a in yellow.
  • A computer program for the calculation of the AR overlays is explained with reference to Fig. 8.
  • The program is executed in the computing device 40.
  • the program start is designated by the reference numeral 405.
  • In program step 410, the detection of an oncoming vehicle 300 takes place.
  • For this purpose, the images supplied by the camera 150 are evaluated with the object detection algorithms provided for this purpose.
  • The distance to the oncoming vehicle 300 is estimated, as is the relative speed between the two vehicles. The instantaneous speed of the oncoming vehicle can be estimated by continuous image evaluation of the images supplied by the camera 150.
  • Alternatively, the instantaneous speed may be transmitted via car-2-car communication from the oncoming vehicle 300 to the observer vehicle 10. After the oncoming vehicle 300 has been detected and the distance and relative speed have been estimated, the calculation of the grid 24 with the corresponding spatial extent takes place in program step 415.
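  • A minimal sketch of how distance and relative-speed estimates could be turned into an urgency value for the overlay (for example a time-to-collision) is shown below. It is an illustration only; `estimate_relative_speed` and `time_to_collision` are assumed helper names, and a real system would fuse camera, RADAR/LIDAR and car-2-car data.

```python
def estimate_relative_speed(gap_now_m, gap_prev_m, dt_s):
    """Closing speed (m/s, positive when the vehicles approach each other),
    estimated from two consecutive distance measurements."""
    return (gap_prev_m - gap_now_m) / dt_s

def time_to_collision(gap_m, closing_speed_mps):
    """Time until the two vehicles would meet; None if they are not closing."""
    if closing_speed_mps <= 0.0:
        return None
    return gap_m / closing_speed_mps

# Example: the gap shrank from 82 m to 80 m within 0.1 s -> 20 m/s closing speed.
v_rel = estimate_relative_speed(80.0, 82.0, 0.1)
ttc = time_to_collision(80.0, v_rel)  # 4.0 s
```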
  • The grid 24 is calculated in perspective. The calculation is further such that the grid extends up to the oncoming vehicle 300.
  • In program step 420, the calculated data for the grid 24 are transmitted to the head-up display 20. This performs the insertion of the grid 24, as seen in Fig. 4.
  • In program step 425, objects or obstacles at the roadside are detected. As shown in Figures 4, 5 and 6, parked vehicles are located at the right-hand edge of the lane in parking bays.
  • In program step 430, a dynamic calculation of bottlenecks takes place. This is done as follows: the parked vehicle 310 is still at a distance from the observer vehicle 10. The oncoming vehicle 300 moves in such a way that it arrives at approximately the level of the parked vehicle 310 precisely when the observer vehicle 10 also passes the parked vehicle 310. Thus, a bottleneck only arises in the future through the coincidence of the vehicles 10, 300 and 310 at the same location.
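  • The dynamic bottleneck calculation of program step 430 can be illustrated as a simple time-of-arrival comparison: if the observer vehicle 10 and the oncoming vehicle 300 are predicted to reach the level of the parked vehicle 310 at roughly the same time, a future bottleneck is assumed. The function below is an assumed sketch, not the patent's algorithm.

```python
def bottleneck_predicted(dist_own_to_parked_m, own_speed_mps,
                         dist_oncoming_to_parked_m, oncoming_speed_mps,
                         tolerance_s=1.5):
    """Predict whether observer and oncoming vehicle coincide at the parked vehicle.

    Both arrival times at the level of the parked vehicle 310 are estimated
    from current distances and speeds; if they differ by less than
    `tolerance_s`, the narrowing will be occupied by both vehicles at once
    and a bottleneck is flagged.
    """
    if own_speed_mps <= 0.0 or oncoming_speed_mps <= 0.0:
        return False
    t_own = dist_own_to_parked_m / own_speed_mps
    t_oncoming = dist_oncoming_to_parked_m / oncoming_speed_mps
    return abs(t_own - t_oncoming) < tolerance_s

# Example: 50 m at 10 m/s vs. 90 m at 18 m/s -> both arrive after about 5 s.
print(bottleneck_predicted(50.0, 10.0, 90.0, 18.0))  # True
```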
  • In program step 440, the calculated data for the grids 26a and 28a are transmitted to the head-up display 20.
  • In program step 445, a calculation of the danger potential of the identified bottleneck takes place.
  • If a danger potential has been identified, a calculation of the animation for the conversion of the grid 28a into the evasion symbol takes place in program step 455.
  • The animation consists in the points of the grid 28a moving in a swarming manner and, at the end, forming the evasion symbol through their arrangement. If no danger potential is detected, the program branches back to program step 410. In program step 460, the data calculated for the AR overlay animation is transmitted to the HUD 20.
  • In this way, a loop is formed in the program, which is run through until a state change takes place.
  • The state change occurs when the driver intervenes and exits the comfort function, or when he switches off the vehicle. Then the program ends in program step 465.
  • Special purpose processors may include Application Specific Integrated Circuits (ASICs), Reduced Instruction Set Computer (RISC), and / or Field Programmable Gate Arrays (FPGAs).
  • ASICs Application Specific Integrated Circuits
  • RISC Reduced Instruction Set Computer
  • FPGAs Field Programmable Gate Arrays
  • The proposed method and apparatus are implemented as a combination of hardware and software.
  • The software is preferably installed as an application program on a program storage device. Typically, it is a machine based on a computer platform that has hardware such as one or more central processing units (CPU), a random access memory (RAM) and one or more input / output (I/O) interfaces.
  • CPU Central processing units
  • RAM random access memory
  • I / O Input / output interface
  • the computer platform also typically installs an operating system.
  • the various processes and functions described herein may be part of the application program or part that is executed via the operating system.
  • the invention can always be used when the field of view of a driver, an operator or even just a person with data glasses can be enriched with AR impressions.

Landscapes

  • Engineering & Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Mechanical Engineering (AREA)
  • Transportation (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Combustion & Propulsion (AREA)
  • Chemical & Material Sciences (AREA)
  • Human Computer Interaction (AREA)
  • Mathematical Physics (AREA)
  • Theoretical Computer Science (AREA)
  • Software Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Computer Graphics (AREA)
  • Instrument Panels (AREA)
  • Traffic Control Systems (AREA)
EP19705167.5A 2018-03-02 2019-02-12 Verfahren zur berechnung einer ar-einblendung von zusatzinformationen für eine anzeige auf einer anzeigeeinheit, vorrichtung zur durchführung des verfahrens sowie kraftfahrzeug und computerprogramm Pending EP3759694A1 (de)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102018203121.5A DE102018203121B4 (de) 2018-03-02 2018-03-02 Verfahren zur Berechnung einer AR-Einblendung von Zusatzinformationen für eine Anzeige auf einer Anzeigeeinheit, Vorrichtung zur Durchführung des Verfahrens sowie Kraftfahrzeug und Computerprogramm
PCT/EP2019/053461 WO2019166222A1 (de) 2018-03-02 2019-02-12 Verfahren zur berechnung einer ar-einblendung von zusatzinformationen für eine anzeige auf einer anzeigeeinheit, vorrichtung zur durchführung des verfahrens sowie kraftfahrzeug und computerprogramm

Publications (1)

Publication Number Publication Date
EP3759694A1 true EP3759694A1 (de) 2021-01-06

Family

ID=65411880

Family Applications (1)

Application Number Title Priority Date Filing Date
EP19705167.5A Pending EP3759694A1 (de) 2018-03-02 2019-02-12 Verfahren zur berechnung einer ar-einblendung von zusatzinformationen für eine anzeige auf einer anzeigeeinheit, vorrichtung zur durchführung des verfahrens sowie kraftfahrzeug und computerprogramm

Country Status (5)

Country Link
US (1) US11904688B2 (zh)
EP (1) EP3759694A1 (zh)
CN (1) CN111937044A (zh)
DE (1) DE102018203121B4 (zh)
WO (1) WO2019166222A1 (zh)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102019117689A1 (de) * 2019-07-01 2021-01-07 Bayerische Motoren Werke Aktiengesellschaft Verfahren und Steuereinheit zur Darstellung einer Verkehrssituation durch Ausblenden von Verkehrsteilnehmer-Symbolen
DE102019125958A1 (de) * 2019-09-26 2021-04-01 Audi Ag Verfahren zum Betrieb eines Kraftfahrzeugs im Umfeld einer Engstelle und Kraftfahrzeug
KR102408746B1 (ko) * 2020-07-31 2022-06-15 주식회사 에이치엘클레무브 충돌 위험 저감 장치 및 방법
US11577725B2 (en) * 2020-09-02 2023-02-14 Ford Global Technologies, Llc Vehicle speed and steering control
DE102021206771A1 (de) * 2021-06-29 2022-12-29 Volkswagen Aktiengesellschaft Verfahren zum Ausgeben eines Startbereichs für einen Parkvorgang eines Kraftfahrzeugs, elektronische Recheneinrichtung sowie Kraftfahrzeug
DE102021129582A1 (de) 2021-11-12 2023-05-17 Bayerische Motoren Werke Aktiengesellschaft Verfahren zum Anzeigen von Gefahrenhinweisen auf einer Datenbrille

Family Cites Families (35)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102005062151B4 (de) * 2005-12-22 2007-09-13 Daimlerchrysler Ag Verfahren und Vorrichtung zur Unterstützung eines Fahrzeugführers bei der Passage von Fahrwegverengungen
US8977489B2 (en) * 2009-05-18 2015-03-10 GM Global Technology Operations LLC Turn by turn graphical navigation on full windshield head-up display
US8781170B2 (en) 2011-12-06 2014-07-15 GM Global Technology Operations LLC Vehicle ghosting on full windshield display
EP2608153A1 (en) * 2011-12-21 2013-06-26 Harman Becker Automotive Systems GmbH Method and system for playing an augmented reality game in a motor vehicle
US9135754B2 (en) 2012-05-07 2015-09-15 Honda Motor Co., Ltd. Method to generate virtual display surfaces from video imagery of road based scenery
US9047703B2 (en) * 2013-03-13 2015-06-02 Honda Motor Co., Ltd. Augmented reality heads up display (HUD) for left turn safety cues
US20140362195A1 (en) * 2013-03-15 2014-12-11 Honda Motor, Co., Ltd. Enhanced 3-dimensional (3-d) navigation
US9393870B2 (en) * 2013-03-15 2016-07-19 Honda Motor Co., Ltd. Volumetric heads-up display with dynamic focal plane
DE102014219575A1 (de) * 2013-09-30 2015-07-23 Honda Motor Co., Ltd. Verbesserte 3-Dimensionale (3-D) Navigation
DE102013016241A1 (de) 2013-10-01 2015-04-02 Daimler Ag Verfahren und Vorrichtung zur augmentierten Darstellung
DE102013016244A1 (de) * 2013-10-01 2015-04-02 Daimler Ag Verfahren und Vorrichtung zur augmentierten Darstellung
DE102013016251A1 (de) * 2013-10-01 2014-06-26 Daimler Ag Verfahren und Vorrichtung zur augmentierten Darstellung
DE102013016242A1 (de) 2013-10-01 2015-04-02 Daimler Ag Verfahren und Vorrichtung zur Unterstützung mindestens eines Fahrassistenzsystems
JP6273976B2 (ja) * 2014-03-31 2018-02-07 株式会社デンソー 車両用表示制御装置
DE102014008152A1 (de) 2014-05-30 2014-10-23 Daimler Ag Verfahren und Vorrichtung zur augmentierten Darstellung mindstens einer Zusatzinformation in mindestens einem Bild einer Umgebung
US9469248B2 (en) * 2014-10-10 2016-10-18 Honda Motor Co., Ltd. System and method for providing situational awareness in a vehicle
US20160109701A1 (en) 2014-10-15 2016-04-21 GM Global Technology Operations LLC Systems and methods for adjusting features within a head-up display
DE102014119317A1 (de) 2014-12-22 2016-06-23 Connaught Electronics Ltd. Verfahren zur Darstellung eines Bildüberlagerungselements in einem Bild mit 3D-Information, Fahrerassistenzsystem und Kraftfahrzeug
JP6372402B2 (ja) 2015-03-16 2018-08-15 株式会社デンソー 画像生成装置
US9487139B1 (en) * 2015-05-15 2016-11-08 Honda Motor Co., Ltd. Determining a driver alert level for a vehicle alert system and method of use
DE112015006725T5 (de) * 2015-07-21 2018-04-12 Mitsubishi Electric Corporation Anzeigesteuervorrichtung, Anzeigevorrichtung und Anzeigesteuerverfahren
KR101714185B1 (ko) * 2015-08-05 2017-03-22 엘지전자 주식회사 차량 운전 보조장치 및 이를 포함하는 차량
DE102015116160B4 (de) 2015-09-24 2022-10-13 Denso Corporation Head-Up Display mit situationsbasierter Anpassung der Darstellung von virtuellen Bildinhalten
WO2017053616A1 (en) * 2015-09-25 2017-03-30 Nyqamin Dynamics Llc Augmented reality display system
KR101916993B1 (ko) * 2015-12-24 2018-11-08 엘지전자 주식회사 차량용 디스플레이 장치 및 그 제어방법
DE102016203080A1 (de) 2016-02-26 2017-08-31 Robert Bosch Gmbh Verfahren zum Betreiben eines Head-Up-Displays, Head-Up-Display Einrichtung
US10315566B2 (en) * 2016-03-07 2019-06-11 Lg Electronics Inc. Vehicle control device mounted on vehicle and method for controlling the vehicle
US9809165B1 (en) * 2016-07-12 2017-11-07 Honda Motor Co., Ltd. System and method for minimizing driver distraction of a head-up display (HUD) in a vehicle
CN109964101B (zh) * 2016-11-08 2021-10-29 株式会社电装 车辆用显示装置
JP6520905B2 (ja) * 2016-12-19 2019-05-29 トヨタ自動車株式会社 車両用運転支援装置
DE112018000309B4 (de) * 2017-01-04 2021-08-26 Joyson Safety Systems Acquisition Llc Fahrzeugbeleuchtungssysteme und -verfahren
KR20180123354A (ko) * 2017-05-08 2018-11-16 엘지전자 주식회사 차량용 사용자 인터페이스 장치 및 차량
DE102017212367B4 (de) 2017-07-19 2022-12-22 Volkswagen Aktiengesellschaft Vorrichtung zur Anzeige des Verlaufs einer Trajektorie vor einem Fahrzeug oder einem Objekt mit einer Anzeigeeinheit sowie Kraftfahrzeug
DE102019202588A1 (de) * 2019-02-26 2020-08-27 Volkswagen Aktiengesellschaft Verfahren zum Betreiben eines Fahrerinformationssystems in einem Ego-Fahrzeug und Fahrerinformationssystem
US20230322248A1 (en) * 2022-04-06 2023-10-12 Gm Global Technology Operation Llc Collision warning system for a motor vehicle having an augmented reality head up display

Also Published As

Publication number Publication date
CN111937044A (zh) 2020-11-13
DE102018203121A1 (de) 2019-09-05
DE102018203121B4 (de) 2023-06-22
US20210046822A1 (en) 2021-02-18
US11904688B2 (en) 2024-02-20
WO2019166222A1 (de) 2019-09-06

Similar Documents

Publication Publication Date Title
EP3543059B1 (de) Verfahren zur berechnung einer einblendung von zusatzinformationen für eine anzeige auf einer anzeigeeinheit, vorrichtung zur durchführung des verfahrens sowie kraftfahrzeug und computerprogramm
DE102017221191B4 (de) Verfahren zur Anzeige des Verlaufs einer Sicherheitszone vor einem Fahrzeug oder einem Objekt mit einer Anzeigeeinheit, Vorrichtung zur Durchführung des Verfahrens sowie Kraftfahrzeug und Computerprogramm
DE102018203121B4 (de) Verfahren zur Berechnung einer AR-Einblendung von Zusatzinformationen für eine Anzeige auf einer Anzeigeeinheit, Vorrichtung zur Durchführung des Verfahrens sowie Kraftfahrzeug und Computerprogramm
DE102017212367B4 (de) Vorrichtung zur Anzeige des Verlaufs einer Trajektorie vor einem Fahrzeug oder einem Objekt mit einer Anzeigeeinheit sowie Kraftfahrzeug
EP2720929B1 (de) Verfahren und vorrichtung zur unterstützung eines fahrers bei einer spurführung eines fahrzeugs auf einer fahrbahn
WO2019170387A1 (de) Einblendung von zusatzinformationen auf einer anzeigeeinheit
EP2720899B1 (de) Verfahren und anzeigeeinrichtung zum anzeigen eines fahrzustands eines fahrzeugs und entsprechendes computer-programprodukt
DE102013200862B4 (de) Optimaler Blickort an Anzeige für gesamte Frontscheibe
EP3668742B1 (de) Verfahren zum betreiben eines fahrerassistenzsystems eines kraftfahrzeugs sowie kraftfahrzeug
EP3425442B1 (de) Verfahren und vorrichtung zum anreichern eines sichtfeldes eines fahrers eines fahrzeuges mit zusatzinformationen, vorrichtung zur verwendung in einem beobachter-fahrzeug sowie kraftfahrzeug
EP3717954B1 (de) Verfahren zur anzeige des verlaufs einer trajektorie vor einem fahrzeug oder einem objekt mit einer anzeigeeinheit, vorrichtung zur durchführung des verfahrens
EP3269579B1 (de) Verfahren zum betreiben eines informationssystems und informationssystem
EP3931034B1 (de) Verfahren zum betreiben eines fahrerinformationssystems in einem ego-fahrzeug und fahrerinformationssystem
EP3803275A1 (de) Verfahren zur berechnung einer "augmented reality"-einblendung für die darstellung einer navigationsroute auf einer ar-anzeigeeinheit, vorrichtung zur durchführung des verfahrens sowie kraftfahrzeug und computerprogramm
EP3931029B1 (de) Verfahren zum betreiben eines fahrerinformationssystems in einem ego-fahrzeug und fahrerinformationssystem
WO2015000882A1 (de) Assistenzsystem und assistenzverfahren zur unterstützung bei der steuerung eines kraftfahrzeugs
EP3931028B1 (de) Verfahren zum betreiben eines fahrerinformationssystems in einem ego-fahrzeug und fahrerinformationssystem
EP3931023B1 (de) Verfahren zum betreiben eines fahrerinformationssystems in einem ego-fahrzeug und fahrerinformationssystem
DE102017003399A1 (de) Technik zur Ausgabe von Fahrzeugmeldungen
EP3931030B1 (de) Verfahren zum betreiben eines fahrerinformationssystems in einem ego-fahrzeug und fahrerinformationssystem
EP3931025B1 (de) Verfahren zum betreiben eines fahrerinformationssystems in einem ego-fahrzeug und fahrerinformationssystem
EP3343177A1 (de) Fahrerassistenzsystem, computerprogrammprodukt, signalfolge, fortbewegungsmittel und verfahren zur information eines anwenders eines fortbewegungsmittels
DE102020200047A1 (de) Verfahren und Vorrichtung zur Darstellung von virtuellen Navigationselementen

Legal Events

Date Code Title Description
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: UNKNOWN

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE

PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20201002

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

DAV Request for validation of the european patent (deleted)
DAX Request for extension of the european patent (deleted)
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

17Q First examination report despatched

Effective date: 20220707