EP3837594A1 - Method for operating a mobile, portable output device in a motor vehicle, context processing device, mobile output device, and motor vehicle - Google Patents

Method for operating a mobile, portable output device in a motor vehicle, context processing device, mobile output device, and motor vehicle

Info

Publication number
EP3837594A1
Authority
EP
European Patent Office
Prior art keywords
motor vehicle
information
media content
context
environment
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
EP19725969.0A
Other languages
German (de)
English (en)
French (fr)
Inventor
Daniel Profendiner
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Audi AG
Original Assignee
Audi AG
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Audi AG filed Critical Audi AG
Publication of EP3837594A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
    • B60K35/10Input arrangements, i.e. from user to vehicle, associated with vehicle functions or specially adapted therefor
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
    • B60K35/20Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor
    • B60K35/25Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor using haptic output
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
    • B60K35/20Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor
    • B60K35/28Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor characterised by the type of the output information, e.g. video entertainment or vehicle dynamics information; characterised by the purpose of the output information, e.g. for attracting the attention of the driver
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
    • B60K35/20Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor
    • B60K35/29Instruments characterised by the way in which information is handled, e.g. showing information on plural displays or prioritising information according to driving conditions
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
    • B60K35/65Instruments specially adapted for specific vehicle types or users, e.g. for left- or right-hand drive
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/56Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/56Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V20/58Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
    • G06V20/584Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads of vehicle lights or traffic lights
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K2360/00Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
    • B60K2360/16Type of output information
    • B60K2360/164Infotainment
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K2360/00Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
    • B60K2360/18Information management
    • B60K2360/186Displaying information according to relevancy
    • B60K2360/1868Displaying information according to relevancy according to driving situations
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K2360/00Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
    • B60K2360/731Instruments adaptations for specific vehicle types or users by comprising user programmable systems
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/20Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F16/29Geographical information databases

Definitions

  • The invention relates to a method for operating a mobile, portable output device, for example for operating data glasses that provide a virtual reality ("VR glasses", "AR glasses").
  • The method specifically relates to the operation of the output device in a motor vehicle, for example while the motor vehicle is traveling.
  • Systems for the use of virtual reality and augmented reality in the vehicle are currently being developed. Such systems use movement data, obtained by localizing the motor vehicle, in order to represent the movement of a virtual ego object.
  • The movement of the vehicle thus forms the basis for an in-vehicle system for a virtual reality ("in-car VR system", "in-car AR system").
  • If a passenger wears such data glasses in the motor vehicle, the passenger can, for example, play a computer game or watch a film in which the ego object adopts the movements of the motor vehicle.
  • Such a system can preferably be used by a passenger if the motor vehicle is operated in a piloted driving mode.
  • DE 10 2017 005 982 A1 describes a method for avoiding or reducing symptoms of kinetosis when using virtual reality glasses while driving in a vehicle. While driving, the virtual reality glasses display, in addition to the playback content, an additional movable network structure, the network structure being moved in such a way that it corresponds to the current dynamic driving state of the vehicle.
  • DE 10 2015 003 882 A1 discloses a method for operating virtual reality glasses arranged in a motor vehicle.
  • The method includes, among other things, the step of detecting the movement of the motor vehicle and, in addition to displaying a first virtual environment, simultaneously displaying a second virtual environment in a second display area, the second virtual environment being displayed in correspondence with the detected movement of the motor vehicle.
  • DE 101 56 219 C1 describes a method and a device for reducing the kinetosis effect in passengers of means of transport.
  • During the journey, image signals are provided to passengers sensitive to kinetosis via optical playback devices; these signals are modified as a function of travel-specific movement data in such a way that, for the passenger, the viewed images visually coincide with the currently subjectively perceived situation and movement values.
  • An object underlying the invention is to provide a system for an even more immersive virtual reality.
  • The object is achieved by the method according to the invention and the devices according to the invention as set out in the independent claims.
  • Advantageous further developments are given by the subclaims.
  • The invention is based on the idea of overcoming the existing disadvantages by using general, current information about an environment of the motor vehicle and/or location-based information about the environment and integrating it into a predetermined, thematic context of a media content of an output file that is to be output by the output device.
  • This information about an event in an environment of the motor vehicle is independent of a direction of movement of the motor vehicle, preferably independent of the motor vehicle's own movement.
  • This creates a coherent, immersive system in which many more of the sensory impressions acting on the user of the output device, for example during a journey, can be taken up into the virtual reality. Using this information creates an extremely high-quality, immersive overall experience in virtual reality.
  • A context processing device is understood to mean a device or a device component which is set up to evaluate an ambient signal of a detection device, the ambient signal describing an environment of the motor vehicle detected by the detection device.
  • The context processing device can be set up, for example, for image analysis and/or for retrieving information relating to, for example, a geographical position of the motor vehicle, for example via the Internet.
  • The context processing device is set up to determine a predetermined, thematic context of a media content of the output file, for example by an image analysis of the media content, by a text analysis, or by reading out corresponding information on the media content.
  • The context processing device is also set up to change a media content, that is to say to generate an image, a series of images and/or a virtual reality.
  • The context processing device can also use an appropriate algorithm for this purpose, for example.
  • A detection device is understood to be a device or a device component that is set up and configured to detect the environment and, for this purpose, can include, for example, a weather sensor and/or a motion sensor and/or a camera. Alternatively or additionally, the detection device can be set up to determine a current state of the motor vehicle and/or to receive information signals from an infrastructure external to the motor vehicle, for example a traffic light.
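  • For illustration only (this sketch is not part of the patent text): such an ambient signal can be pictured as a small record that bundles the readings of the individual sensors and the received infrastructure signals. The following Python sketch uses invented names (AmbientSignal, build_ambient_signal) and field choices that merely mirror the examples given in this description.

```python
from dataclasses import dataclass
from typing import Optional, Tuple


@dataclass
class AmbientSignal:
    """One ambient signal describing an event in the vehicle's environment."""
    weather: Optional[str] = None                    # e.g. "rain"
    road_surface: Optional[str] = None               # e.g. "cobblestones"
    red_phase_remaining_s: Optional[float] = None    # reported by a traffic light
    object_direction: Optional[str] = None           # e.g. "right_to_left"
    object_speed_mps: Optional[float] = None         # speed of another vehicle
    position: Optional[Tuple[float, float]] = None   # (lat, lon) from GPS/GLONASS


def build_ambient_signal(readings: dict) -> AmbientSignal:
    """Bundle raw readings from camera, weather sensor and communication unit."""
    return AmbientSignal(
        weather=readings.get("weather"),
        road_surface=readings.get("road_surface"),
        red_phase_remaining_s=readings.get("red_phase_remaining_s"),
        object_direction=readings.get("object_direction"),
        object_speed_mps=readings.get("object_speed_mps"),
        position=readings.get("position"),
    )


# Example: a stationary vehicle at a red light while another car crosses.
signal = build_ambient_signal({
    "weather": "rain",
    "red_phase_remaining_s": 180.0,
    "object_direction": "right_to_left",
    "object_speed_mps": 8.0,
})
```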
  • The context processing device receives an ambient signal from the detection device of the motor vehicle, the received ambient signal describing information about an event in the surroundings of the motor vehicle that is independent of a direction of movement of the motor vehicle.
  • The received ambient signal thus at least partially describes the event.
  • The context processing device then determines a predetermined thematic context of a media content of an output file, the output file being the file which is output or is to be output by the output device.
  • The output file is preferably a file for generating a virtual reality.
  • The thematic context of, for example, a virtual computer game or a film can be, for example, the topic "journey through the universe" or "crime".
  • A context-related change can be understood to mean a change in the media content.
  • The specified media content of the output file is changed in such a way that the changed media content takes into account the information about the event in the surroundings of the motor vehicle in a context-related manner.
  • If the detection device detects, for example, another motor vehicle which crosses an intersection from right to left in front of the user's stationary motor vehicle, the predetermined media content can be changed so that, for example, a spaceship crosses the user's field of vision from right to left.
  • The specified media content of the output file is changed by generating and/or changing an image signal described by the output file, the generated and/or changed image signal describing a virtual event that is adapted, within the determined thematic context, to the detected event of the environment.
  • In other words, the media content can be changed by generating or changing an image signal, the generated or changed image signal being able to describe the event of the surroundings of the motor vehicle as a virtual event in the determined thematic context.
  • The real environment can be integrated into the media content as a modified image depending on the determined thematic context; an object and/or a property of the object of the environment can thus be modified and incorporated in dependence on the determined thematic context.
  • A virtual reality is thus not merely generated in which the ego object moves like the motor vehicle; rather, the virtual environment, preferably objects of the virtual environment, can adopt properties, for example movements, of real objects of the environment of the motor vehicle, adapted or added to the virtual reality.
  • The output file with the changed media content is transferred to a display device of the output device for outputting the output file.
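  • These steps can be read as a small processing chain. The following sketch is only an illustration under the assumption that the ambient signal and the output file are plain dictionaries; the class and method names (ContextProcessingDevice, capture, display) are hypothetical and not taken from the patent.

```python
class ContextProcessingDevice:
    """Illustrative sketch of the method steps S1 to S5 (names are assumed)."""

    def run_once(self, detection_device, output_file: dict, output_device) -> None:
        ambient = detection_device.capture()            # S1: receive ambient signal
        theme = output_file.get("theme", "unknown")     # S2/S3: determine thematic context
        changed = self.change_media_content(output_file, ambient, theme)  # S4
        output_device.display(changed)                  # S5: transfer to display device

    def change_media_content(self, output_file: dict, ambient: dict, theme: str) -> dict:
        changed = dict(output_file)
        # Context-related change: the real event is re-expressed inside the theme.
        if theme == "journey_through_the_universe" and ambient.get("weather") == "rain":
            changed["overlay"] = "ion_storm"
        return changed


class StubDetectionDevice:
    def capture(self) -> dict:
        return {"weather": "rain"}


class StubOutputDevice:
    def display(self, output_file: dict) -> None:
        print("displaying:", output_file)


ContextProcessingDevice().run_once(
    StubDetectionDevice(),
    {"theme": "journey_through_the_universe", "media": "space_game"},
    StubOutputDevice(),
)
```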
  • A display device is understood to mean that component of the output device that is specifically designed to display the virtual reality, for example as a screen.
  • The information about the event in the environment can be current information describing the environment, preferably information about the current weather at a current location of the motor vehicle.
  • Real sounds of rain spatter or a real windshield wiper sound can thus be immersively integrated into the virtual reality, for example as a virtual ion storm that hits the ego object or, for example, as a virtual person who wipes fingerprints off an object.
  • The information about the event of the environment can preferably describe a property of a real object in the environment, preferably a property of a road surface and/or information about a traffic light.
  • Real objects in the surroundings of the motor vehicle which act on the motor vehicle and/or the user from the outside and produce real sensations for the user, for example making the motor vehicle rumble while driving on a cobblestone street or stopping the motor vehicle at a red traffic light, can be immersively integrated, for example by a spaceship acting as the ego object being shaken in virtual reality by a meteor shower, or by a meteor shower flying past in virtual reality, crossing the trajectory of the ego spaceship, during a real red phase.
  • The information can describe a movement of the real object, preferably a direction of movement and/or a speed at which the real object moves. In this way, the experience of the virtual reality becomes even more immersive, that is, even more authentic.
  • The information can also describe a geographical position of the real object; for example, location information relating to a landscape in which the motor vehicle is located can be integrated into the virtual reality. This also makes the experience much more authentic and thus increases and improves immersion.
  • The above object is also achieved by a context processing device that is set up to carry out a method according to one of the embodiments described above.
  • The context processing device can be designed, for example, as a control device or control board.
  • The context processing device can preferably have a processor device, that is to say a component or a device component for electronic data processing.
  • The optional processor device can preferably have at least one microcontroller and/or at least one microprocessor.
  • A program code can optionally be stored in a data memory, for example a data memory of the context processing device or of the processor device, which, when executed by the processor device, can cause the context processing device to carry out the above-described embodiments of the method according to the invention.
  • The above object is also achieved by a motor vehicle, which can preferably be designed as a motor car, for example as a passenger car.
  • The motor vehicle according to the invention has an embodiment of the context processing device according to the invention. The advantages already mentioned also result here.
  • The above object is also achieved by a mobile, portable output device which has an embodiment of the context processing device according to the invention.
  • The term "mobile" is understood here to mean that the output device can be operated in a portable manner and without being coupled to the motor vehicle.
  • The advantages already mentioned also result here.
  • The mobile output device is preferably an output device that can be worn on the head, and can particularly preferably be designed as VR data glasses or, alternatively, for example, as a smartphone.
  • FIG. 1 shows a schematic representation of an embodiment of the method according to the invention and of the devices according to the invention.
  • FIG. 1 also shows an optional storage medium 22, that is to say a data memory known to the person skilled in the art, which can serve as an example of a memory of the context processing device 12.
  • On the storage medium 22, a program code for carrying out the method according to the invention and/or one or more output files can be stored, an output file being able to describe a virtual reality with a media content, for example a computer game or a film.
  • The individual components are connected to one another by data communication connections, which can be configured either as wireless or as wired data communication connections.
  • An example of a wireless data communication connection is a WLAN connection or a Bluetooth connection.
  • An example of a wired data communication connection is a data bus of the motor vehicle 16 or a cable.
  • The data communication connections are shown as black connecting lines.
  • The detection device 10 of the motor vehicle 16 can comprise, for example, one or more sensors, that is, it can be designed as a device or device component for detecting, for example, a physically, chemically or optically perceptible property of the environment.
  • For this purpose, an exemplary sensor 24, which can be configured, for example, as a camera, is shown on a roof of the motor vehicle 16.
  • A further exemplary sensor 24, for example on a front of the motor vehicle 16, can be designed as a weather sensor and/or temperature sensor.
  • Sensors 24 for detecting a movement of a real object 18 that is different from the motor vehicle 16, for example another motor vehicle, or for detecting a signal from a traffic light 19, which can for example describe a duration of a red phase, can be designed as sensors 24 known to those skilled in the art.
  • The detection device 10 can have, for example, a communication unit 26 for detecting a traffic light signal, that is to say a component or a device component for receiving signals and optionally also for transmitting signals.
  • The communication unit 26 can receive a signal from the traffic light 19, alternatively, for example, from a server device external to the motor vehicle (not shown in the figure) or from a satellite, for example via a radio link 21, and this signal can describe the red phase.
  • The communication unit 26 can be designed, for example, to receive location signals and can then also be referred to as a navigation element.
  • Location signals can be, for example, GPS signals or GLONASS signals and preferably describe the coordinates of a current location of the motor vehicle 16.
  • In order to acquire associated environmental information about the current location, corresponding information, for example information about whether the motor vehicle 16 is currently located in an urban area or in a particular landscape, can be queried, for example, via an Internet or mobile radio connection to a data server external to the motor vehicle (not shown in the figure).
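  • A conceivable, purely illustrative way to query such location-based environment information is an HTTP request from the vehicle to an external data server; the endpoint URL and the response fields in the following sketch are assumptions and do not describe a real service.

```python
import requests


def query_location_context(lat: float, lon: float) -> dict:
    """Ask a (hypothetical) external data server what surrounds the given position."""
    response = requests.get(
        "https://example.com/environment-info",   # hypothetical backend endpoint
        params={"lat": lat, "lon": lon},
        timeout=2.0,
    )
    response.raise_for_status()
    data = response.json()
    # Assumed response fields: urban area flag, landscape type, road surface.
    return {
        "urban_area": data.get("urban_area", False),
        "landscape": data.get("landscape"),
        "road_surface": data.get("road_surface"),
    }
```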
  • The context processing device 12 can, for example, be configured as a circuit or control board and can optionally have a processor device 28.
  • The detection device 10 is optionally additionally set up to determine a movement speed of the real object 18 in the environment; this can be done, for example, by means of the optional navigation element.
  • A user of the motor vehicle 16, for example a passenger or, if the motor vehicle 16 is driving in a piloted, that is to say fully autonomous, driving mode, any occupant, may have put on the output device 14.
  • The event can be the exemplary red phase of the traffic light 19 or, for example, the other motor vehicle, which can cross a path of the motor vehicle 16 of the user of the output device 14.
  • A current rainy weather can also be detected as an event (S1).
  • The exemplary camera can, for example, transmit a camera image from the motor vehicle to the detection device 10.
  • The detection device 10 can also receive a radio signal from the other motor vehicle, for example via motor-vehicle-to-motor-vehicle communication, which can, for example, describe a direction of movement and a speed of movement of the other motor vehicle.
  • This information can be transmitted to the context processing device 12 as one ambient signal or in separate ambient signals.
  • A corresponding ambient signal can describe the rainy weather.
  • A further, optional ambient signal can be, for example, information determined on the basis of location information, the ambient signal being able to describe a cobblestone pavement as the road surface beneath the motor vehicle 16.
  • The context processing device 12 can be set up, for example, for image analysis and/or image processing.
  • The context processing device 12 can recognize, for example, the contours of the other motor vehicle on the basis of the ambient signal, which can describe the camera image.
  • The exemplary output file can be, for example, a computer game set in a science fiction world.
  • The context processing device 12 can query and thus determine the thematic context of the output file (S2).
  • The output file can, for example, specify a predetermined change as media content.
  • The context processing device 12 can, for example, determine (S3) that, in a current sequence of the computer game, an ego object is flying through the universe from its perspective. The event of the current red phase, with an exemplary duration of three minutes, can then be taken into account in such a way that, in order to change the predetermined media content (S4), the context processing device 12 changes the image signal described by the output file, or generates a new image signal, so that it describes interrupting a flight movement of the virtual environment of the media content for three minutes.
  • The other motor vehicle can be taken into account, for example, in that the generated or changed image signal describes a virtual object in the changed media content which crosses the user's field of vision from left to right on a screen of the output device 14, preferably at the speed of the real object 18, or at a speed such that the passing of the virtual object from left to right across the screen takes, for example, a period of three minutes.
  • The optional information on a current rainy weather can be taken into account, for example, when changing the predetermined media content (S4), in that the virtual image described by the output file can describe a meteor shower or an ion storm.
  • Exemplary information about a real cobblestone pavement can be taken into account, for example, in that the changed media content can describe an image in which the virtual environment of the output file shows a meteor shower around the ego object while the real motor vehicle 16 is traveling over it. In combination with the really perceived shaking of the motor vehicle 16, a particularly intensive immersive experience is created for the user.
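  • One way to picture this change of the media content (S4) is as a small synchronization computation: the red-phase duration determines how long the virtual flight pauses, and the crossing object's on-screen speed is either taken over from the real object or derived so that the crossing fills the waiting time. The following sketch is only an illustration with assumed names and units, not the patent's implementation.

```python
from typing import Optional


def plan_virtual_events(red_phase_s: Optional[float],
                        object_speed_mps: Optional[float],
                        field_of_view_deg: float = 90.0) -> dict:
    """Derive context-related changes for a space-themed media content
    from a detected red phase and a crossing real object (sketch only)."""
    plan = {}
    if red_phase_s:
        # Interrupt the virtual flight movement for the real waiting time.
        plan["pause_flight_s"] = red_phase_s
    if object_speed_mps is not None:
        # Let a virtual spaceship cross at the real object's speed ...
        plan["spaceship_speed_mps"] = object_speed_mps
    elif red_phase_s:
        # ... or stretch the crossing over the whole red phase: angular speed
        # chosen so the spaceship needs exactly red_phase_s to cross the view.
        plan["spaceship_speed_deg_per_s"] = field_of_view_deg / red_phase_s
    return plan


# Example: a three-minute red phase without a speed measurement.
print(plan_virtual_events(red_phase_s=180.0, object_speed_mps=None))
# -> {'pause_flight_s': 180.0, 'spaceship_speed_deg_per_s': 0.5}
```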
  • The changed output file is then transmitted (S5) to a display device (not shown in the figure) of the output device 14, for example to a screen.
  • The media content can be changed (S4) using techniques known to the person skilled in the art.
  • Pre-programmed sequences can be stored in the storage medium 22, which, for example, can be called up in a situation-specific manner and incorporated into the media content.
  • A sequence that is scripted in this way, that is to say predefined, can be used if, for example, the navigation element provides the information that the user is currently driving the motor vehicle 16 from a starting location to Regensburg. From this information it can be deduced that the journey will take, for example, 1.5 hours.
  • A slot can be provided in the output file at one or more locations, for example with a duration of 10 seconds, in order to let the exemplary spaceship fly past from left to right analogously to the other motor vehicle.
  • The output file can comprise, for example, a digital list in which it is specified which predefined sequences can be inserted into the media content for a duration of, for example, 10 seconds or three minutes.
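  • Such a digital list could, for instance, be represented as pairs of slot duration and inserted sequence, filled in a situation-specific way from the pre-programmed sequences on the storage medium 22. The sequence names and durations in the following sketch are invented for illustration.

```python
from typing import List, Optional, Tuple

# Pre-programmed sequences as they might be stored on the storage medium 22
# (names and durations are illustrative), each with a nominal duration in seconds.
SCRIPTED_SEQUENCES = {
    "spaceship_flyby": 10.0,
    "large_spaceship_flyby": 180.0,
    "meteor_shower": 30.0,
}


def pick_sequence_for_slot(slot_duration_s: float) -> Optional[str]:
    """Pick the pre-programmed sequence whose duration best fills a slot."""
    candidates = [(abs(duration - slot_duration_s), name)
                  for name, duration in SCRIPTED_SEQUENCES.items()
                  if duration <= slot_duration_s]
    return min(candidates)[1] if candidates else None


def fill_slots(slot_durations_s: List[float]) -> List[Tuple[float, Optional[str]]]:
    """Build the digital list: which sequence is inserted into which slot."""
    return [(duration, pick_sequence_for_slot(duration)) for duration in slot_durations_s]


# Example: a 10 s slot for a passing car and a 3 min slot for a red phase.
print(fill_slots([10.0, 180.0]))
# -> [(10.0, 'spaceship_flyby'), (180.0, 'large_spaceship_flyby')]
```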
  • The following information can be used, for example:
  • Online traffic light information: for example, information about how long an associated traffic light 19 will still be red and/or green for the user or the motor vehicle 16; this information can be used in the virtual reality, for example, to design standing or waiting phases of a virtual ego object of the media content in a context-related manner.
  • Scripted sequences and/or events can be generated, for example, and synchronized both in terms of location, for example the location of the traffic light 19, and in terms of time, for example the length of the red phase as the duration of the event.
  • For example, a flyby of a large spaceship can be shown, which lasts exactly the length of a red phase; and/or
  • Local weather information: weather impressions (for example rain, thunder) can still be perceived by the user of the output device 14, even though the user is in a virtual world.
  • This information can be taken up and processed in the virtual reality.
  • Rain in reality can be picked up as an ion storm in a virtual space world, or, for example, as flying through a cluster of particles in virtual reality; and/or
  • Road surface: another piece of information that can be perceived by the user of the virtual reality is the current road surface (for example cobblestones, forest paths, gravel, normal asphalt, speed thresholds, so-called "speed bumps", or potholes).
  • These different surfaces create different haptic driving experiences and can be represented in virtual reality.
  • A real pothole can be picked up as a rocket impact in the virtual environment.
  • Cobblestones in the real environment of the motor vehicle 16 can be picked up, for example, in the virtual environment as a field with a vibration, for example a flight through a canopy of trees.
  • The system and/or the method can be implemented, for example, as follows:
  • Streaming of associated vehicle data, for example rain sensor data, traffic light information, and pothole detection,
  • to an output device 14 designed, for example, as VR glasses; and/or
  • connection of the mobile VR glasses via the vehicle interfaces used, for example Bluetooth and/or WiFi; and/or
  • Some of the data can be determined using swarm data, which is why an interface to a backend via the mobile network can also be used.
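  • One conceivable realization of this streaming, sketched here only as an assumption (the address, the transport, and the message fields are invented), is to send the selected vehicle data as JSON messages over a WiFi/TCP link from the vehicle to the VR glasses; swarm data obtained from the backend could be merged into the same messages.

```python
import json
import socket
import time

GLASSES_ADDR = ("192.168.0.42", 5555)   # hypothetical address of the VR glasses


def read_vehicle_data() -> dict:
    """Placeholder for reading from the vehicle bus and the backend (swarm data)."""
    return {"rain": True, "red_phase_remaining_s": 42.0, "pothole_ahead": False}


def stream_vehicle_data(interval_s: float = 0.1, max_messages: int = 3) -> None:
    """Send a few newline-delimited JSON messages to the glasses over TCP."""
    with socket.create_connection(GLASSES_ADDR, timeout=2.0) as conn:
        for _ in range(max_messages):
            message = json.dumps(read_vehicle_data()) + "\n"
            conn.sendall(message.encode("utf-8"))
            time.sleep(interval_s)
```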

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Mechanical Engineering (AREA)
  • Chemical & Material Sciences (AREA)
  • Combustion & Propulsion (AREA)
  • Transportation (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • Navigation (AREA)
  • Processing Or Creating Images (AREA)
EP19725969.0A 2018-08-14 2019-05-21 Verfahren zum betreiben einer mobilen, tragbaren ausgabevorrichtung in einem kraftfahrzeug, kontextbearbeitungseinrichtung, mobile ausgabevorrichtung, und kraftfahrzeug Pending EP3837594A1 (de)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102018213654.8A DE102018213654A1 (de) 2018-08-14 2018-08-14 Verfahren zum Betreiben einer mobilen, tragbaren Ausgabevorrichtung in einem Kraftfahrzeug, Kontextbearbeitungseinrichtung, mobile Ausgabevorrichtung, und Kraftfahrzeug
PCT/EP2019/063012 WO2020035183A1 (de) 2018-08-14 2019-05-21 Verfahren zum betreiben einer mobilen, tragbaren ausgabevorrichtung in einem kraftfahrzeug, kontextbearbeitungseinrichtung, mobile ausgabevorrichtung, und kraftfahrzeug

Publications (1)

Publication Number Publication Date
EP3837594A1 (de) 2021-06-23

Family

ID=66640964

Family Applications (1)

Application Number Title Priority Date Filing Date
EP19725969.0A Pending EP3837594A1 (de) 2018-08-14 2019-05-21 Verfahren zum betreiben einer mobilen, tragbaren ausgabevorrichtung in einem kraftfahrzeug, kontextbearbeitungseinrichtung, mobile ausgabevorrichtung, und kraftfahrzeug

Country Status (5)

Country Link
US (1) US11687149B2 (en)
EP (1) EP3837594A1 (de)
CN (1) CN112585563A (zh)
DE (1) DE102018213654A1 (de)
WO (1) WO2020035183A1 (de)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102018213654A1 (de) 2018-08-14 2020-02-20 Audi Ag Verfahren zum Betreiben einer mobilen, tragbaren Ausgabevorrichtung in einem Kraftfahrzeug, Kontextbearbeitungseinrichtung, mobile Ausgabevorrichtung, und Kraftfahrzeug

Family Cites Families (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE10156219C1 (de) 2001-11-15 2003-08-14 Daimler Chrysler Ag Verfahren und Vorrichtung zur Reduzierung von Kinetose-Störungen
US7128705B2 (en) * 2002-11-26 2006-10-31 Artis Llc Motion-coupled visual environment for prevention or reduction of motion sickness and simulator/virtual environment sickness
JP2007133489A (ja) * 2005-11-08 2007-05-31 Sony Corp 仮想空間画像表示方法、装置、仮想空間画像表示プログラム及び記録媒体
US20090119706A1 (en) * 2005-12-16 2009-05-07 Stepframe Media, Inc. Generation and Delivery of Stepped-Frame Content Via MPEG Transport Streams
US20080310707A1 (en) * 2007-06-15 2008-12-18 Microsoft Corporation Virtual reality enhancement using real world data
DE102013000068A1 (de) * 2013-01-08 2014-07-10 Audi Ag Verfahren zum Synchronisieren von Anzeigeeinrichtungen eines Kraftfahrzeugs
US9630631B2 (en) * 2013-10-03 2017-04-25 Honda Motor Co., Ltd. System and method for dynamic in-vehicle virtual reality
DE102015003948B4 (de) * 2015-03-26 2022-08-11 Audi Ag Verfahren zum Betreiben einer in einem Kraftfahrzeug angeordneten Virtual-Reality-Brille und Virtual-Reality-System
DE102015003882A1 (de) * 2015-03-26 2016-09-29 Audi Ag Verfahren zum Betreiben einer in einem Kraftfahrzeug angeordneten Virtual-Reality-Brille und Virtual-Reality-System
US10724874B2 (en) * 2015-10-13 2020-07-28 Here Global B.V. Virtual reality environment responsive to predictive route navigation
US20170352185A1 (en) * 2016-06-02 2017-12-07 Dennis Rommel BONILLA ACEVEDO System and method for facilitating a vehicle-related virtual reality and/or augmented reality presentation
US10205890B2 (en) * 2016-07-25 2019-02-12 Ford Global Technologies, Llc Systems, methods, and devices for rendering in-vehicle media content based on vehicle sensor data
DE102017200733A1 (de) * 2017-01-18 2018-07-19 Audi Ag Unterhaltungssystem für ein Kraftfahrzeug und Verfahren zum Betreiben eines Unterhaltungssystems
DE102017005982A1 (de) 2017-06-23 2018-02-22 Daimler Ag Verfahren und Vorrichtung zur Vermeidung oder Reduzierung von Kinetose-Symptomen, Fahrzeug mit einer Vorrichtung zur Vermeidung oder Reduzierung von Kinetose-Symptomen
DE102018213654A1 (de) 2018-08-14 2020-02-20 Audi Ag Verfahren zum Betreiben einer mobilen, tragbaren Ausgabevorrichtung in einem Kraftfahrzeug, Kontextbearbeitungseinrichtung, mobile Ausgabevorrichtung, und Kraftfahrzeug

Also Published As

Publication number Publication date
US20210318746A1 (en) 2021-10-14
US11687149B2 (en) 2023-06-27
DE102018213654A1 (de) 2020-02-20
WO2020035183A1 (de) 2020-02-20
CN112585563A (zh) 2021-03-30

Similar Documents

Publication Publication Date Title
DE102008052460B4 (de) Fahrzeugnavigationssystem mit Echtzeit-Verkehrsabbildungsanzeige
EP2724911B1 (de) Fahrassistenzverfahren und Fahrassistenzsystem zur Erhöhung des Fahrkomforts
DE102014109876B4 (de) Verfahren, Systeme und Vorrichtungen zum Bereitstellen einer anwendungserzeugten Information zur Darstellung in einer automobilen Haupteinheit
US9145129B2 (en) Vehicle occupant comfort
DE102012214988B4 (de) Fahrzeug-Spielesystem mit erweiterter Realität für Vorder- und Rück-Sitz zur Unterhaltung und Information von Passagieren
DE102006056874B4 (de) Navigationsgerät
KR102493862B1 (ko) 어려운 운전 조건하에서 랜드 마크를 이용한 내비게이션 명령 강화
DE102017116073A1 (de) System und verfahren zur verbesserung der fahrzeugumgebungserfassung
DE112020000110T5 (de) Nutzung von in fahrzeugen erfassten fahrgastaufmerksamkeitsdaten fürlokalisierungs- und standortbezogene dienste
WO2016000908A1 (de) Verfahren zur parkplatzvermittlung und freier-parkplatz-assistenzsystem
DE102015218830A1 (de) Verfahren für ein Kraftfahrzeug, zum Erkennen schlechter Fahrbahnverhältnisse und ein diesbezügliches System und Kraftfahrzeug
DE102015202859A1 (de) Abtastsystem und -verfahren für autonomes Fahren
WO2011117141A1 (de) Datenverarbeitung in einem fahrzeug
WO2019007605A1 (de) Verfahren zur verifizierung einer digitalen karte eines höher automatisierten fahrzeugs, entsprechende vorrichtung und computerprogramm
DE112017003418T5 (de) Fahrassistenzverfahren und dieses benutzende Fahrassistenzvorrichtung, Fahrassistenzsystem, automatische Fahrsteuerungsvorrichtung, Fahrzeug und Programm
DE112021001082T5 (de) Fahrsteuervorrichtung und hmi-steuervorrichtung
WO2020074280A1 (de) Verfahren und einrichtung zur steuerung von anzeigeinhalten auf einem ausgabemittel eines fahrzeugs
DE102020103633A1 (de) Weg-planungs fusion für ein fahrzeug
DE112014001209T5 (de) Navigation entsprechend zulässigen Fahrzeiten
DE102009053982A1 (de) System zur Berechnung einer verbrauchsoptimierten Route eines Kraftfahrzeugs, Kraftfahrzeug mit einem entsprechenden System sowie Verfahren zur Berechnung einer verbrauchsoptimierten Route
WO2020221575A1 (de) Verfahren zur erfassung von bildmaterial zur überprüfung von bildauswertenden systemen, vorrichtung und fahrzeug zur verwendung bei dem verfahren sowie computerprogramm
DE102019217642A1 (de) Verfahren zur Erfassung von Bildmaterial zur Überprüfung von bildauswertenden Systemen, Vorrichtung und Fahrzeug zur Verwendung bei dem Verfahren sowie Computerprogramm
EP2629054A2 (de) Verfahren zur Bereitstellung von Umgebungsinformationen
EP3837594A1 (de) Verfahren zum betreiben einer mobilen, tragbaren ausgabevorrichtung in einem kraftfahrzeug, kontextbearbeitungseinrichtung, mobile ausgabevorrichtung, und kraftfahrzeug
DE102018222378A1 (de) Vorrichtung und Verfahren Steuerung der Ausgabe von Fahrerinformation und zur Aufrechterhaltung der Aufmerksamkeit eines Fahrers eines automatisierten Fahrzeugs

Legal Events

Date Code Title Description
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: UNKNOWN

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE

PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20210315

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

DAV Request for validation of the european patent (deleted)
DAX Request for extension of the european patent (deleted)
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

17Q First examination report despatched

Effective date: 20230516

P01 Opt-out of the competence of the unified patent court (upc) registered

Effective date: 20230529