EP3999939A1 - Method for operating a mobile terminal by means of a gesture recognition and control device, gesture recognition and control device, motor vehicle, and output device that can be worn on the head - Google Patents

Method for operating a mobile terminal by means of a gesture recognition and control device, gesture recognition and control device, motor vehicle, and output device that can be worn on the head

Info

Publication number
EP3999939A1
EP3999939A1 (application EP20735582.7A)
Authority
EP
European Patent Office
Prior art keywords
mobile terminal
control device
gesture
operating
gesture recognition
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
EP20735582.7A
Other languages
German (de)
English (en)
Inventor
Norbert KULBAT
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Audi AG
Original Assignee
Audi AG
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Audi AG filed Critical Audi AG
Publication of EP3999939A1
Legal status: Pending

Links

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
    • B60K35/10Input arrangements, i.e. from user to vehicle, associated with vehicle functions or specially adapted therefor
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
    • B60K35/20Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor
    • B60K35/28Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor characterised by the type of the output information, e.g. video entertainment or vehicle dynamics information; characterised by the purpose of the output information, e.g. for attracting the attention of the driver
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
    • B60K35/80Arrangements for controlling instruments
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/08Interaction between the driver and the control system
    • B60W50/10Interpretation of driver requests or demands
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W60/00Drive control systems specially adapted for autonomous road vehicles
    • B60W60/001Planning or execution of driving tasks
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04815Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04817Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance using icons
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0482Interaction with lists of selectable items, e.g. menus
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00Geometric image transformations in the plane of the image
    • G06T3/40Scaling of whole images or parts thereof, e.g. expanding or contracting
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/20Movements or behaviour, e.g. gesture recognition
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K2360/00Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
    • B60K2360/146Instrument input by gesture
    • B60K2360/14643D-gesture
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K2360/00Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
    • B60K2360/146Instrument input by gesture
    • B60K2360/1468Touch gesture
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K2360/00Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
    • B60K2360/16Type of output information
    • B60K2360/175Autonomous driving
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K2360/00Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
    • B60K2360/16Type of output information
    • B60K2360/177Augmented reality
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K2360/00Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
    • B60K2360/55Remote control arrangements
    • B60K2360/56Remote control arrangements using mobile devices
    • B60K2360/563Vehicle displaying mobile device information
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K2360/00Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
    • B60K2360/55Remote control arrangements
    • B60K2360/56Remote control arrangements using mobile devices
    • B60K2360/569Vehicle controlling mobile device functions
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K2360/00Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
    • B60K2360/55Remote control arrangements
    • B60K2360/56Remote control arrangements using mobile devices
    • B60K2360/577Mirror link with mobile devices
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
    • B60K35/20Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor
    • B60K35/21Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor using visual output, e.g. blinking lights or matrix displays
    • B60K35/22Display screens
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2420/00Indexing codes relating to the type of sensors based on the principle of their operation
    • B60W2420/40Photo, light or radio wave sensitive means, e.g. infrared sensors
    • B60W2420/403Image sensing, e.g. optical camera
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2540/00Input parameters relating to occupants
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0101Head-up displays characterised by optical features
    • G02B2027/014Head-up displays characterised by optical features comprising information/image processing systems
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted
    • G02B2027/0178Eyeglass type
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2200/00Indexing scheme for image data processing or generation, in general
    • G06T2200/24Indexing scheme for image data processing or generation, in general involving graphical user interfaces [GUIs]

Definitions

  • Method for operating a mobile terminal by means of a gesture recognition and control device, gesture recognition and control device, a motor vehicle, and an output device that can be worn on the head
  • the invention relates to a method for operating a mobile terminal, for example a smartphone or tablet PC, the method being carried out by a gesture recognition and control device.
  • a gesture recognition and control device is to be understood as meaning a device, a device component or a device group which is set up to recognize an operating gesture; and which is also set up to receive and evaluate signals, and to generate a control signal.
  • the gesture recognition and control device can also be designed to capture the operating gesture, for example by means of a camera.
  • a passenger in the rear of modern vehicles can use a mobile device, for example a tablet PC, on which vehicle functions such as the current speed can be displayed.
  • It is a hardware control unit that enables, for example, the setting of vehicle functions, navigation and media content. This is particularly important in those vehicles in which people are often chauffeured, for example in vehicles of a driver service.
  • Such an exemplary tablet PC now looks rather old-fashioned and can only be used by one passenger.
  • the same problem arises when using a mobile device, for example a smartphone.
  • DE 10 2016 207 530 A1 describes a system for displaying a virtual vehicle interior, with one or more vehicle interior components and data glasses which are set up to visually display a virtual vehicle interior to a user and to take the one or more vehicle interior components into account in the visual display.
  • DE 10 2017 211 521 A1 describes a motor vehicle for transporting at least one user, which has a subsystem which is set up to perform a physical function of the motor vehicle.
  • One object of the invention is to simplify the operation of a mobile terminal in the motor vehicle.
  • the invention is based on the idea of mirroring a user interface of a mobile terminal onto an output device that can be worn on the head and of implementing spatial gesture recognition for operating the mirrored user interface.
  • a spatial gesture is understood to mean a gesture, in particular an operating gesture, in which no touch-sensitive surface, for example a touchscreen or a touchpad, has to be touched, but rather is carried out freely in space.
  • the spatial gesture can be, for example, a predetermined finger and / or hand posture and / or a movement of the exemplary hand.
  • the virtual or augmented reality (“augmented reality”, “AR”) that is displayed on the data glasses worn on the head integrates an image of a user interface of the mobile terminal, which is thus mirrored into the motor vehicle.
  • a motor vehicle with a correspondingly configured gesture recognition and control device is compatible with all mobile terminals and output devices that can be worn on the head.
  • the operation of the mobile terminal is not only very modern, but a user can simultaneously operate the mobile terminal while using the output device that can be worn on the head for an entertainment function, for example a VR video game.
  • the user does not have to put down the output device, for example data glasses, every time, for example to answer a call or to look up something in a calendar of the mobile terminal.
  • the mobile device can also be operated in every sitting and / or lying position of the user in which he may not be able to reach the mobile device and take it in his hand, for example to make a swiping gesture on the touchscreen of the mobile device.
  • the method according to the invention and the devices according to the invention also provide a fully configurable and scalable system.
  • the invention can also be used for a driver of the motor vehicle if the motor vehicle is operated in a fully autonomous driving mode (ie a piloted driving mode).
  • remote control is provided that can be used while, for example, a user in the rear is busy with an entertainment system and is experiencing a game, or while, for example, the driver himself attends a VR meeting during an autonomous journey (level 5).
  • it is also advantageous that no software for a virtual reality or an augmented reality has to be installed on the mobile terminal.
  • the inventive method for operating a mobile terminal is carried out by a gesture recognition and control device, in particular a gesture recognition and control device of a motor vehicle.
  • the gesture recognition and control device can be designed, for example, as a control chip or control device and can have, for example, gesture recognition software and / or, for example, a camera for capturing an operating gesture.
  • the mobile terminal device, in particular a mobile terminal device which is located in or on the motor vehicle, is recognized.
  • the mobile terminal can be recognized, for example, by means of a Bluetooth connection.
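  • Purely as an illustration of this recognition step (not part of the patent text), terminal detection over Bluetooth LE might look like the following minimal Python sketch; it assumes the `bleak` library and a known, paired device address, both of which are choices of this sketch rather than of the patent:

```python
import asyncio
from bleak import BleakScanner

# Hypothetical address of the paired mobile terminal (not from the patent).
KNOWN_TERMINAL_ADDRESS = "AA:BB:CC:DD:EE:FF"

async def detect_mobile_terminal(timeout: float = 5.0) -> bool:
    """Scan for nearby Bluetooth LE devices and report whether the known
    mobile terminal is in range (roughly, recognition step S1)."""
    devices = await BleakScanner.discover(timeout=timeout)
    return any(d.address == KNOWN_TERMINAL_ADDRESS for d in devices)

if __name__ == "__main__":
    found = asyncio.run(detect_mobile_terminal())
    print("mobile terminal recognized" if found else "no terminal found")
```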
  • the gesture recognition and control device determines the current graphical user interface generated by a display device of the recognized mobile terminal, which provides an operating option by which an operating function of the mobile terminal can be triggered.
  • the gesture recognition and control device provides an output signal which describes, as display content, the graphical user interface generated by the display device of the mobile terminal.
  • a display device is understood to mean a device component for displaying image content, in particular a touch-sensitive screen.
  • an output signal of the mobile terminal can be forwarded to the gesture recognition and control device, and the output signal provided by the gesture recognition and control device can then be, for example, the output signal of the mobile terminal.
  • the output signal can be provided in that the gesture recognition and control device generates the output signal, wherein the generated output signal can describe an image of the graphical user interface of the mobile terminal.
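  • The two variants just described (forwarding the terminal's own output signal versus generating one) can be summarized in a small sketch; the data structure and function names are invented for illustration and are not defined by the patent:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class OutputSignal:
    """Display content describing the terminal's current graphical user
    interface (steps S2/S3): one raw RGB frame plus its dimensions."""
    width: int
    height: int
    pixels: bytes

def provide_output_signal(terminal_frame: Optional[OutputSignal],
                          generated_frame: OutputSignal) -> OutputSignal:
    # Variant 1: forward the output signal of the mobile terminal unchanged.
    if terminal_frame is not None:
        return terminal_frame
    # Variant 2: the gesture recognition and control device generates the
    # output signal itself, describing an image of the terminal's GUI.
    return generated_frame
```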
  • the output signal provided by the gesture recognition and control device is transmitted by this to an output device which can be worn on the head.
  • an output signal from the mobile terminal is forwarded to the output device that can be worn on the head
  • the graphical user interface is mirrored on a display surface of the output device that is worn on the head, or a display surface of the output device that is worn on the head is synchronized with a screen of the mobile terminal.
  • the output device that can be worn on the head is an output device for outputting an augmented reality and / or a virtual reality.
  • the output device that can be worn on the head can preferably be designed as data glasses or as another “head-mounted display” (“HMD”) known from the prior art, that is, a visual output device to be worn on the head.
  • the output signal provided is transmitted to the output device that can be worn on the head to output the display content as part of an augmented or virtual reality provided or output by the output device in a predetermined output area in the interior of the motor vehicle.
  • the display content is thus not actually displayed in the specified output area, for example an area between the user on the back seat and a backrest of the front seat, but on a display surface of the output device that can be worn on the head, so that the display content appears there when the user looks in the direction of the specified output area.
  • in other words, the display content is output on a portion of the display surface of the output device that can be worn on the head which lies in the direction in which the user looks at the predetermined output area.
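  • A minimal geometric sketch of this gaze-dependent placement (assuming a simplified pinhole projection and a symmetric field of view; a real HMD runtime performs this projection itself):

```python
import numpy as np

def panel_screen_position(head_pos, gaze_dir, up, panel_pos, fov_deg=100.0):
    """Return normalized coordinates on the display surface 28 at which the
    mirrored image 34 must be drawn so that it appears anchored in the
    output area 32, or None if the output area is out of view."""
    head_pos = np.asarray(head_pos, dtype=float)
    forward = np.asarray(gaze_dir, dtype=float)
    forward /= np.linalg.norm(forward)
    right = np.cross(forward, np.asarray(up, dtype=float))
    right /= np.linalg.norm(right)
    true_up = np.cross(right, forward)
    v = np.asarray(panel_pos, dtype=float) - head_pos
    depth = np.dot(v, forward)
    if depth <= 0.0:
        return None  # output area is behind the wearer
    f = 1.0 / np.tan(np.radians(fov_deg) / 2.0)
    x = f * np.dot(v, right) / depth    # normalized device coords in [-1, 1]
    y = f * np.dot(v, true_up) / depth
    if abs(x) > 1.0 or abs(y) > 1.0:
        return None  # out of view: that part of the lens stays transparent
    return x, y
```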
  • a contactless operating gesture by the user is recognized as a function of the operating option made available. If the user interface currently provided by the mobile terminal provides, for example, the operating option that a swipe across the screen can be used to select a currently displayed program,
  • the gesture recognition and control device can recognize a spatial gesture, which can be, for example, a contactless swipe in the air.
  • This recognized contactless operating gesture, that is to say the recognized spatial gesture, can trigger the selection of the program of the mobile terminal without touching the screen of the mobile terminal. Since the graphical user interface continues to be mirrored from the mobile terminal onto the output device that can be worn on the head, the user then sees the opening of the selected program as display content, for example.
  • the gesture recognition and control device has, for example, a camera, for example a time-of-flight camera.
  • the contactless operating gesture can thus be detected before it is recognized.
  • the gesture recognition and control device generates a remote control signal as a function of the detected contactless operating gesture, the generated remote control signal describing a triggering of the operating function of the mobile terminal assigned to the recognized contactless operating gesture.
  • the contactless operating gesture for triggering the operating function can preferably be a spatial gesture that is similar or analogous to the touch gesture assigned to the operating function. If the touch gesture is, for example, a swipe on the screen of the mobile terminal from left to right, the corresponding spatial gesture to trigger the operating function can be moving the user's hand from left to right in the air.
  • the gesture recognition and control device transmits the generated remote control signal to a control device of the recognized mobile terminal to trigger the operating function.
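  • As a sketch of steps S10/S11 (gesture-to-function mapping and transmission), with a JSON-over-TCP wire format and function names that are invented here, not specified in the patent:

```python
import json
import socket

# Hypothetical mapping from recognized spatial gestures to operating
# functions of the mobile terminal.
GESTURE_TO_FUNCTION = {
    "swipe_left_to_right": "select_next_program",
    "point_at_icon": "open_program",
    "pinch": "scale_view",
}

def send_remote_control_signal(gesture: str, host: str, port: int) -> None:
    """Generate a remote control signal for the recognized contactless
    operating gesture (S10) and transmit it to the control device of the
    recognized mobile terminal (S11)."""
    function = GESTURE_TO_FUNCTION.get(gesture)
    if function is None:
        return  # no operating function assigned to this gesture
    payload = json.dumps({"trigger": function}).encode("utf-8")
    with socket.create_connection((host, port), timeout=1.0) as conn:
        conn.sendall(payload)
```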
  • a control device is understood to mean a device component or, for example, a component group for receiving and evaluating signals, as well as for generating control signals.
  • the control device of the recognized mobile terminal can be a control chip, for example.
  • the operating function, which in the mobile terminal is assigned a touch gesture on, for example, a touchscreen of the mobile terminal, is thus assigned a contactless operating gesture when the graphical user interface is mirrored on the output device that can be worn on the head.
  • the graphical user interface provides the operating option in which, when the mobile terminal device is operated directly, the operating function can be triggered by means of an operating gesture, in particular a touch gesture.
  • the operating function is triggered indirectly via the mirrored user interface in virtual or augmented reality by means of a contactless operating gesture, i.e. a spatial gesture that is not recognized by the mobile device but by the gesture recognition and control device.
  • an embodiment of the method according to the invention can provide that the gesture recognition and control device recognizes a further contactless operating gesture, in particular while the display content is being output by the output device that can be worn on the head, the further contactless operating gesture specifying a positioning or placing of the display content (i.e. an image of the mirrored user interface) in the interior of the motor vehicle.
  • the further contactless operating gesture specifies the location or place or position in the motor vehicle at which the display content should be seen or appear when the user wears the output device that can be worn on the head.
  • the gesture recognition and control device can then specify the output area on the basis of the recognized further contactless operating gesture. In other words, a virtual location of the display content can be specified and / or changed.
  • another contactless operating gesture can be recognized by the gesture recognition and control device, which can describe a scaling of the display content (ie an image of the mirrored user interface).
  • the further contactless operating gesture can, for example, specify and / or change a format and / or a size of the display content.
  • the gesture recognition and control device can then scale an image that shows the display content. The image is scaled on the basis of the recognized additional contactless operating gesture.
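  • One plausible way to derive such a scaling from a further contactless gesture, sketched under the assumption that two hand points (for example thumb and index fingertip) are tracked per sensor frame:

```python
import numpy as np

def pinch_scale_factor(prev_a, prev_b, cur_a, cur_b,
                       min_scale=0.25, max_scale=4.0) -> float:
    """Scale factor for the image 34, derived from how the distance between
    two tracked hand points changed between two sensor frames."""
    d_prev = np.linalg.norm(np.asarray(prev_b, float) - np.asarray(prev_a, float))
    d_cur = np.linalg.norm(np.asarray(cur_b, float) - np.asarray(cur_a, float))
    if d_prev < 1e-6:
        return 1.0  # degenerate previous frame, keep current size
    return float(np.clip(d_cur / d_prev, min_scale, max_scale))
```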
  • the respective further contactless operating gesture can optionally be detected by the gesture recognition and control device.
  • a further, ie separate, user interface can be displayed by the output device that can be worn on the head.
  • the user is supported in scaling and positioning.
  • another output signal can be provided by the gesture recognition and control device, which can describe a further graphical user interface as display content, the further graphical user interface providing an operating menu for scaling and/or positioning the display content that describes the graphical user interface generated by the display device of the mobile terminal.
  • the further graphic user interface can therefore optionally be independent of what the mobile terminal is displaying and can therefore be referred to as an additional graphic user interface.
  • the gesture recognition and control device can preferably only perform one or more method steps of the embodiments described above as a function of activating a fully autonomous driving mode and / or as a function of an engine start of the motor vehicle.
  • the gesture recognition and control device can, for example, query the current driving mode (or a future driving mode) from a driver assistance system or receive a corresponding signal from the driver assistance system; and / or the gesture recognition and control device can, for example, receive and evaluate a start signal from an ignition system of the motor vehicle.
  • in this way, the mirrored user interface of the mobile terminal is displayed reliably and immediately by the output device.
  • in the fully autonomous driving mode there is the additional advantage that a driver, too, can wear and use the output device that can be worn on the head.
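  • A minimal sketch of such a gating check (the mode names are invented; the patent only requires that the steps depend on a fully autonomous driving mode and/or an engine start):

```python
def mirroring_enabled(driving_mode: str, engine_started: bool) -> bool:
    """Perform the mirroring-related method steps only as a function of the
    vehicle state, e.g. a fully autonomous (piloted) driving mode and/or a
    completed engine start."""
    return engine_started and driving_mode == "fully_autonomous"
```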
  • the object set above is achieved by a gesture recognition and control device which is set up to carry out a method according to one of the embodiments described above.
  • the gesture recognition and control device can be designed, for example, as a control device or control chip or as a user program (“app”).
  • the gesture recognition and control device can preferably have a processor device, that is to say a component or a device component which is designed and set up for electronic data processing and preferably can have at least one microcontroller and / or a microprocessor.
  • the gesture recognition and control device can have a data memory, for example a memory card or a memory chip, or another data memory, on which a program code can preferably be stored which, when executed by the processor device, causes the gesture recognition and control device to perform an embodiment of the method according to the invention.
  • the gesture recognition and control device can preferably be a gesture recognition and control device of the motor vehicle.
  • the gesture recognition and control device can have one or more sensors, for example one or more cameras.
  • a further aspect of the invention is an output device that can be worn on the head, for example data glasses, which is designed to output an augmented reality and/or a virtual reality, and which has an embodiment of the gesture recognition and control device according to the invention.
  • the invention also includes further developments of the gesture recognition and control device according to the invention and of the motor vehicle according to the invention which have features as they have already been described in connection with the further developments of the method according to the invention. For this reason, the corresponding developments of the gesture recognition and control device according to the invention and the motor vehicle according to the invention are not described again here.
  • the motor vehicle according to the invention is preferably configured as an automobile, in particular as a passenger car or truck, or as a passenger bus or motorcycle.
  • the invention also includes the combinations of the features of the described embodiments. Exemplary embodiments of the invention are described below. In the drawings: Fig. 1 shows a schematic representation of a first exemplary embodiment of the method according to the invention and the devices according to the invention; Fig. 2 shows a schematic representation of a further exemplary embodiment of the method according to the invention and the devices according to the invention;
  • FIG. 3 shows a schematic representation of a further exemplary embodiment of the method according to the invention and the devices according to the invention
  • FIG. 4 shows a further schematic illustration of the further exemplary embodiment from FIG. 3; and
  • Fig. 5 is a schematic representation of a further exemplary embodiment of the method according to the invention and the devices according to the invention.
  • the exemplary embodiments explained below are preferred embodiments of the invention.
  • the described components of the embodiments each represent individual features of the invention that are to be considered independently of one another and that further develop the invention in each case also independently of one another. Therefore, the disclosure is intended to include combinations of the features of the embodiments other than those shown.
  • the described embodiments can also be supplemented by further features of the invention already described.
  • the same reference symbols denote functionally identical elements.
  • Fig. 1 illustrates the principle of the method according to the invention and the devices according to the invention according to a first exemplary embodiment.
  • a gesture recognition and control device 10 can, as shown in the example of FIG. 1, be, for example, a control chip or a control device of a motor vehicle 12, which can be configured, for example, as a passenger vehicle, preferably as a passenger vehicle that can be operated in a piloted or fully autonomous driving mode.
  • the gesture recognition and control device 10 can preferably have a processor device 14 with, for example, a plurality of microprocessors, and / or a data memory 16, for example a memory card or a memory chip.
  • a program code for carrying out the method can preferably be stored on the optional data memory 16.
  • a driver for an operating system of the mobile terminal device 22 can optionally be stored in the data memory 16 of the gesture recognition and control device 10.
  • Communication with an output device 18 that can be worn on the head can preferably take place via a wireless data communication connection 20, for example via a WLAN connection, Bluetooth connection or cellular radio connection.
  • the data communication link 20 can be, for example, a wired data communication link 20, for example a cable.
  • the output device 18 which can be worn on the head can preferably be an output device 18 for an augmented and / or virtual reality. If the gesture recognition and control device 10 is a component of the motor vehicle 12, any output device 18 known to a person skilled in the art, for example any known data glasses, can be used.
  • the gesture recognition and control device 10 can be a component of the output device 18.
  • the gesture recognition and control device 10 can in this variant be located, for example, on a side temple of the glasses.
  • the output device 18 can, for example, have a camera on one end face.
  • Communication with a mobile terminal device 22, for example a smartphone or a tablet PC, can also preferably take place via a wireless data communication connection 20, or by means of a wired data communication connection 20, for example a data bus of the motor vehicle 12 and / or a cable.
  • the mobile terminal 22 has a display device 24, which can preferably include a touch-sensitive screen. Depending on the current graphical user interface, various operating functions of the mobile terminal 22 can be triggered via this exemplary touch-sensitive screen, for example opening a program, switching to a navigation overview with small views of open programs, accepting a phone call, or playing a video.
  • the direct control of the display device 24 is taken over by a control device 26, for example a control board and / or a user program (“app”) or an operating system.
  • the control device 26 of the mobile terminal 22 can also have a processor device and / or a data memory, these components not being shown in FIG. 1 (and in the following figures) for reasons of clarity.
  • a display surface 28 of the output device 18 is shown, which can ideally be in front of the eyes of the user when the user is wearing the output device 18.
  • FIG. 1 shows an optional camera, for example a time-of-flight camera or an infrared sensor, which can be arranged, for example, on a roof lining of the motor vehicle or on a rearview mirror of the motor vehicle. If the gesture recognition and control device 10 is, for example, a component of the output device 18, it can, for example, have a camera on an end face of the output device 18.
  • the gesture recognition and control device 10 can preferably also have a plurality of such sensors 30.
  • the detection of the mobile terminal can take place, for example, as soon as the mobile terminal 22 approaches the motor vehicle 12, for which purpose a Bluetooth LE receiver can be located on the outside of the motor vehicle 12, for example.
  • the mobile terminal 22 can be recognized (S1) if, for example, it is placed in a charging cradle in the motor vehicle 12, wherein the mobile terminal 22 can be recognized (S1) using recognition techniques known from the prior art.
  • the mobile terminal 22 of the example in FIG. 1 can, for example, have just opened a user interface of a program for outputting information about the motor vehicle, for example current operating data.
  • the exemplary program can be another user program, for example a user program for playing films or a game, or a desktop of the mobile terminal device 22 can be displayed.
  • the graphical user interface that is currently being displayed by the display device 24 of the recognized mobile terminal 22 can be determined (S2), for example by transmitting a corresponding output signal from the display device 24 to the gesture recognition and control device 10, or by the gesture recognition and control device 10 querying the mobile terminal 22 as to what is currently being displayed.
  • the output signal can be generated, for example, by the gesture recognition and control device 10, or an output signal from the mobile terminal device 22 can be forwarded to the output device 18.
  • the provision S3 also includes the transmission S4 of the provided output signal to the exemplary data glasses.
  • the predetermined output area 32 can be preset, for example, and, in the example in FIG. 1, for example an area above the passenger seat.
  • a virtual image 34 of the graphical user interface of the mobile terminal 22 in the output area 32 is shown.
  • a camera image of the interior of the motor vehicle can also be displayed on the display surface 28, into which the image 34 of the user interface of the mobile terminal 22 can be embedded.
  • an area of the display area 28 on which the image 34 is not displayed can be switched to transparent so that the user can see the real interior of the motor vehicle 12 through the exemplary data glasses.
  • the output device 18 can, for example, generate a virtual reality that does not show an interior of the motor vehicle, but rather, for example, a landscape of a video game, and can display the image 34 of the graphical user interface.
  • the example in FIG. 1 shows a part of the body of the user, preferably a hand 36, which is currently executing a contactless operating gesture to control the mobile terminal device 22.
  • the contactless operating gesture, which can also be referred to as a spatial gesture, can be, for example, a pointing gesture in which the user points in the air at the specified output area 32 and thus at the image 34 of the user interface, and moves, for example, to where a function to be activated is represented by an icon on the user interface shown in the image 34.
  • an operating area can be specified, that is to say an area within which the user must perform the spatial gesture so that it is detected and / or recognized.
  • the sensor 30 can then, for example, be directed to this exemplary, predetermined operating area, or alternatively the sensor 30 or a plurality of sensors 30 can cover a large part or the entire interior of the motor vehicle 12.
  • the user can be shown a user interface on which various menu items are displayed and / or, for example, values of various vehicle functions. If the user now wants to activate one of the displayed, possible operating functions or, for example, receive more detailed information on one of the vehicle functions, the spatial gesture can provide, for example, that he points with his finger at a corresponding icon or display element.
  • a corresponding sensor signal can be received by the gesture recognition and control device 10 (S6) and the contactless operating gesture can be recognized, for example, to trigger a video playback function (S7).
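  • The pointing recognition described above amounts to a ray test from the tracked fingertip against the icon's rectangle inside the virtual image 34; a sketch of that test, assuming all geometry is expressed in one common interior coordinate frame:

```python
import numpy as np

def finger_ray_hits_icon(finger_tip, finger_dir, icon_center, icon_u, icon_v,
                         half_w=0.05, half_h=0.05) -> bool:
    """Test whether the pointing ray intersects an icon rectangle.
    icon_u/icon_v are orthonormal in-plane axes of the icon rectangle."""
    p = np.asarray(finger_tip, dtype=float)
    d = np.asarray(finger_dir, dtype=float)
    d /= np.linalg.norm(d)
    u = np.asarray(icon_u, dtype=float)
    v = np.asarray(icon_v, dtype=float)
    n = np.cross(u, v)  # plane normal of the icon rectangle
    denom = np.dot(d, n)
    if abs(denom) < 1e-6:
        return False  # pointing ray is parallel to the icon plane
    t = np.dot(np.asarray(icon_center, dtype=float) - p, n) / denom
    if t < 0.0:
        return False  # icon lies behind the fingertip
    offset = p + t * d - np.asarray(icon_center, dtype=float)
    return abs(np.dot(offset, u)) <= half_w and abs(np.dot(offset, v)) <= half_h
```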
  • the graphical user interface can be mirrored, for example, by means of a “grabber”, i.e. a so-called “content grabber” or “frame grabber”, and faded into the virtual reality.
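  • A minimal sketch of such a frame-grabber pipeline (the length-prefixed TCP wire format is an assumption of this sketch, not of the patent):

```python
import socket
import struct
from typing import Iterable

def stream_grabbed_frames(frames: Iterable[bytes], host: str, port: int) -> None:
    """Push each grabbed (encoded) frame of the terminal's GUI over TCP with
    a 4-byte length prefix, so a VR renderer can texture the image 34."""
    with socket.create_connection((host, port)) as conn:
        for frame in frames:
            conn.sendall(struct.pack("!I", len(frame)) + frame)
```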
  • an image of the hand 36 can be superimposed; for this purpose, based on a principle similar to the gesture recognition, the hand 36 of the user can be filmed or tracked, for example, and then mapped onto the display surface 28 by the gesture recognition and control device 10.
  • another display element 38 is shown as an example, on which, for example, a further operating function can be displayed, for example a function for closing the gesture recognition and / or mirroring the graphical user interface and / or for menu navigation.
  • Such an exemplary “back button” can preferably be shown in a predetermined position, in particular in a highlighted position.
  • the gesture recognition and control device 10 can provide a further output signal (S8) that can describe the output of further display content in the form of an image 40 of a further graphical user interface.
  • This can preferably be an operating menu for scaling and / or positioning the first image 34, that is to say the first virtual graphic user interface that is mirrored by the mobile terminal 22.
  • the gesture recognition and control device 10 can use this additional spatial gesture to predetermine the output area 32 at a different position in the interior, in this case in front of the steering wheel (S8).
  • another spatial gesture can scale the image 34 (S9), that is to say the image 34 can, for example, be "drawn out” at the corners, that is, the image 34 can be enlarged, for example.
  • the gesture recognition and control device can generate a remote control signal (S10) that can activate the operating function for displaying the detailed information on an operating parameter or playing a video, for example.
  • the generated remote control signal is transmitted to the control device 26 of the identified mobile terminal 22 via the data communication link 20 (S11).
  • the mobile terminal 22, in particular the control device 26, then triggers the operating function (S12).
  • the gesture recognition and control device 10 can be in communication with a motor vehicle system (not shown in FIG. 1), for example an ignition or a driver assistance system, and the method can be activated on the basis of, for example, a signal describing that the motor vehicle is currently being operated in a piloted driving mode.
  • FIGS. 2 to 5 show further exemplary embodiments of the method and the devices, only the differences from the example of FIG. 1 being discussed below.
  • the image 34 of the graphical user interface of the mobile terminal 22 can, for example, display an operating menu as a segment of a circular ring.
  • One or more operating functions can be selected depending on whether the exemplary hand 36 points into one area of the circular-ring segment, for example to generate a display, or into another area of the circular ring, for example to provide a new display.
  • the corresponding operating functions can preferably be displayed on the graphical user interface, which is mirrored on the display surface 28 of the output device 18, for better orientation.
  • Fig. 2 also shows an exemplary spatial gesture in which the hand 36 has all fingers extended and in which the palm of the hand, for example, points upwards.
  • This exemplary spatial gesture can be, for example, a spatial gesture for navigating back into a home directory.
  • the spatial gesture shown in Fig. 2 can, for example, describe a pivoting of the outstretched hand to the right or left, or up or down, in order to select the corresponding operating functions.
  • the real hand 36 of the user can be seen in an augmented reality, or an image of a virtual hand that moves exactly like the hand 36 of the user is displayed.
  • the mirrored graphical user interface can, for example, show a TV menu, or a desktop of the mobile terminal device 22 can be mirrored, or the graphical user interface can show what is known as a “content streamer”.
  • a mirrored user interface is shown, which can for example be an operating menu of a media program, with the help of which one can select and listen to music albums, for example, or switch to a video and photo function, for example.
  • FIG. 4 shows an extension of the exemplary embodiment in FIG. 3, in which a further graphical user interface, either another one of the mobile terminal 22 or a graphical user interface generated specifically by the gesture recognition and control device 10, through the image 40 can be displayed.
  • a music album stored in the mobile terminal device 22 can be deleted, for example, various tracks or functions can be selected, or the image 34 can be scaled and/or positioned with this additional menu (S8, S9).
  • Fig. 5 shows an embodiment in which the user, for example a driver in a fully autonomous journey (level 5), or a front passenger or a passenger in the rear, generate any display, scale it (S9) and / or place it in the interior (that is, can specify the output area 32, S8), for example to watch a soccer game live.
  • On another image 40, that is to say another display that he generates, scales (S9) and/or positions as required, the user can, for example, display the operating concept of his MMI and operate his mobile terminal 22 and/or the motor vehicle 12, or he can display, for example, combined content such as a speed and navigation information (for example remaining distance and/or arrival time). He may also use another image 40 or display element, that is to say another display, in order to process his emails, for example.
  • the user can design his own display and control surfaces in his virtual environment.
  • This system can be easily connected to an entertainment system that provides virtual reality. During the exemplary game, the user would not have to get out of virtual reality in order to keep an eye on his secondary activities, for example retrieving emails, or to operate his motor vehicle 12.
  • the examples show how the invention enables VR- and / or AR-based remote control.
  • the user can activate a “Flying MMI” or “VR-MMI”, that is, the mirrored, virtual user interface, within a virtual reality, regardless of the context (for example gaming and/or meeting).
  • This can offer the same functionality as a serial MMI.
  • the series MMI and the VR MMI can always be synchronized and show the same display content.
  • the VR-MMI can preferably also be operated as usual, using spatial gestures analogous to series operation (e.g. touch, swipe, pinch).
  • the user can preferably place the VR-MMI, that is to say the mirrored graphic user interface, at a location that makes sense for him, that is to say the output area 32 is predefined.
  • the system is very easy to implement in both VR and AR (augmented reality).
  • a technical implementation can provide that, by tracking, for example, the hands 36 or a hand 36 of the user using one or more sensors 30, for example infrared sensors, preferably using Leap Motion, and by determining the hand coordinates in relation to the coordinates of the output device 18, that is to say of the HMD, the user's scope of action can be completely captured in three dimensions.
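  • Expressing the tracked hand in the coordinate frame of the output device 18 is a standard change of frame; a sketch with 4x4 homogeneous pose matrices, as delivered by typical tracking runtimes:

```python
import numpy as np

def hand_point_in_hmd_frame(p_sensor, T_world_sensor, T_world_hmd):
    """Re-express a tracked hand point from the sensor frame in the frame of
    the HMD (output device 18): sensor -> world -> HMD."""
    p = np.append(np.asarray(p_sensor, dtype=float), 1.0)  # homogeneous point
    p_world = np.asarray(T_world_sensor, dtype=float) @ p
    p_hmd = np.linalg.inv(np.asarray(T_world_hmd, dtype=float)) @ p_world
    return p_hmd[:3]
```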
  • the image shown in the serial MMI, i.e. the image shown by the display device 24 of the mobile terminal 22, can preferably be streamed by means of a grabber to the "Area of Interest", i.e. to the output area 32, or to other areas freely defined by the user within the virtual reality.
  • Actions by a third person on the serial MMI, for example on the screen of the mobile terminal 22, can therefore be displayed in real time in the VR MMI, that is, in the image 34 of the mirrored user interface.
  • the corresponding message can be transmitted to a main unit, for example, which carries out the actions.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Mechanical Engineering (AREA)
  • Transportation (AREA)
  • Combustion & Propulsion (AREA)
  • Chemical & Material Sciences (AREA)
  • Automation & Control Theory (AREA)
  • Optics & Photonics (AREA)
  • Multimedia (AREA)
  • Health & Medical Sciences (AREA)
  • Social Psychology (AREA)
  • Psychiatry (AREA)
  • General Health & Medical Sciences (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention relates to a method for operating a mobile terminal (22). A gesture recognition and control device (10) recognizes a mobile terminal (22, S1) and determines a current graphical user interface (S2) generated by a display device (24) of the recognized mobile terminal (22). The gesture recognition and control device (10) provides an output signal which describes, as display content, the graphical user interface (S3) generated by the display device (24) of the recognized mobile terminal (22), and transmits this signal to an output device (18) that can be worn on the head, in order to output the display content in a predetermined output area (32) in the interior of the motor vehicle (12, S4) as part of an augmented reality or a virtual reality output by the output device (18). While the display content is being output, the gesture recognition and control device (10) recognizes a spatial gesture of the user (S7), generates as a function thereof a remote control signal for triggering an operating function (S12) of the mobile terminal (22, S10), and transmits it to the control device (26) of the recognized mobile terminal (22, S11).
EP20735582.7A 2019-07-15 2020-07-01 Method for operating a mobile terminal by means of a gesture recognition and control device, gesture recognition and control device, motor vehicle, and head-wearable output system Pending EP3999939A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102019210383.9A DE102019210383A1 (de) 2019-07-15 2019-07-15 Method for operating a mobile terminal by means of a gesture recognition and control device, gesture recognition and control device, motor vehicle, and head-wearable output device
PCT/EP2020/068456 WO2021008871A1 (fr) 2019-07-15 2020-07-01 Method for operating a mobile terminal by means of a gesture recognition and control device, gesture recognition and control device, motor vehicle, and head-wearable output system

Publications (1)

Publication Number Publication Date
EP3999939A1 true EP3999939A1 (fr) 2022-05-25

Family

ID=71409426

Family Applications (1)

Application Number Title Priority Date Filing Date
EP20735582.7A Pending EP3999939A1 (fr) 2019-07-15 2020-07-01 Method for operating a mobile terminal by means of a gesture recognition and control device, gesture recognition and control device, motor vehicle, and head-wearable output system

Country Status (5)

Country Link
US (1) US20220244789A1 (fr)
EP (1) EP3999939A1 (fr)
CN (1) CN113994312A (fr)
DE (1) DE102019210383A1 (fr)
WO (1) WO2021008871A1 (fr)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102021119970A1 (de) 2021-08-02 2023-02-02 Bayerische Motoren Werke Aktiengesellschaft Operating method for vehicles with data glasses
US11813528B2 (en) * 2021-11-01 2023-11-14 Snap Inc. AR enhanced gameplay with a personal mobility system
DE102022113343A1 (de) 2022-05-25 2023-11-30 Bayerische Motoren Werke Aktiengesellschaft Method and device for operating a display system in a mobile facility with data glasses
CN114839782B (zh) * 2022-06-07 2023-08-18 上汽大众汽车有限公司 In-vehicle augmented display system for vehicle control and information display
DE102022129409A1 (de) 2022-11-08 2024-05-08 Bayerische Motoren Werke Aktiengesellschaft Device and method for controlling a smart device in a vehicle

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10133342B2 (en) * 2013-02-14 2018-11-20 Qualcomm Incorporated Human-body-gesture-based region and volume selection for HMD
US20150187357A1 (en) * 2013-12-30 2015-07-02 Samsung Electronics Co., Ltd. Natural input based virtual ui system for mobile devices
US10013083B2 (en) * 2014-04-28 2018-07-03 Qualcomm Incorporated Utilizing real world objects for user input
US10353532B1 (en) * 2014-12-18 2019-07-16 Leap Motion, Inc. User interface for integrated gestural interaction and multi-user collaboration in immersive virtual reality environments
EP3079041B1 (fr) 2015-04-10 2018-06-27 Airbus Defence and Space GmbH Method and system for providing a virtual reality environment for passengers of aircraft and land vehicles
DE102016207530A1 (de) 2016-05-02 2017-11-02 Volkswagen Aktiengesellschaft System and method for displaying a virtual vehicle interior
DE102016225268A1 (de) * 2016-12-16 2018-06-21 Bayerische Motoren Werke Aktiengesellschaft Method and device for operating a display system with data glasses
DE102017211521A1 (de) 2017-07-06 2019-01-10 Bayerische Motoren Werke Aktiengesellschaft Control of vehicle functions from a virtual reality
ES2704373B2 (es) * 2017-09-15 2020-05-29 Seat Sa Method and system for displaying virtual reality information in a vehicle
CN109781136A (zh) * 2019-02-01 2019-05-21 谷东科技有限公司 Intelligent navigation method and system based on AR glasses

Also Published As

Publication number Publication date
CN113994312A (zh) 2022-01-28
US20220244789A1 (en) 2022-08-04
WO2021008871A1 (fr) 2021-01-21
DE102019210383A1 (de) 2021-01-21

Similar Documents

Publication Publication Date Title
EP3999939A1 (fr) Method for operating a mobile terminal by means of a gesture recognition and control device, gesture recognition and control device, motor vehicle, and head-wearable output system
EP2451672B1 (fr) Method and device for providing a user interface in a vehicle
US9965169B2 (en) Systems, methods, and apparatus for controlling gesture initiation and termination
DE112012004789T5 (de) Configurable vehicle console
DE102006028046A1 (de) Combined display and operating device for a motor vehicle
DE202013012304U1 (de) Mobile terminal
DE102013010932A1 (de) Method for operating an operating interface, operating interface, and motor vehicle having an operating interface
DE102013004612B4 (de) Method for operating an infotainment system
DE102013227220A1 (de) Blind control system for a vehicle
EP3688515B1 (fr) Method for operating a head-wearable electronic display device and display system for displaying virtual content
DE102013021978A1 (de) Method and mobile terminal for the augmented display of at least one real operating element and/or... a real object
DE102018205664A1 (de) Device for assisting an occupant in the interior of a motor vehicle
EP3254172A1 (fr) Determining a position of a non-vehicle object in a vehicle
DE102014008204A1 (de) Motor vehicle with media playback
WO2014108152A2 (fr) User interface for a motor vehicle having an operating element for detecting an operating action
DE102009056014A1 (de) Method and device for providing an operating interface for a device detachably mounted in a vehicle
DE102015226152A1 (de) Display device and method for controlling a display device
EP3718810A1 (fr) Method and device for operating electronically controllable components of a vehicle
CN115103152A (zh) In-vehicle streaming media display system and method
EP3948493A1 (fr) Method and device for interacting with a surrounding object in the environment of a vehicle
EP3108333B1 (fr) User interface and method for assisting a user when operating a user interface
WO2021043559A1 (fr) Communication system in a vehicle and method for operating a communication system in a vehicle
DE102019133559A1 (de) Determining the use of a mobile device
DE102018204223A1 (de) Mobile, portable operating device for operating a device wirelessly coupled to the operating device, and method for operating a device by means of a mobile, portable operating device
DE102017218780A1 (de) Method for operating a vehicle assistance system and vehicle assistance system for a motor vehicle

Legal Events

Date Code Title Description
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: UNKNOWN

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE

PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20220215

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

DAV Request for validation of the european patent (deleted)
DAX Request for extension of the european patent (deleted)
P01 Opt-out of the competence of the unified patent court (upc) registered

Effective date: 20230529

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

17Q First examination report despatched

Effective date: 20240415