EP3999939A1 - Method for operating a mobile terminal by means of a gesture recognition and control device, gesture recognition and control device, motor vehicle, and an output apparatus that can be worn on the head - Google Patents
Method for operating a mobile terminal by means of a gesture recognition and control device, gesture recognition and control device, motor vehicle, and an output apparatus that can be worn on the head
Info
- Publication number
- EP3999939A1 (application EP20735582.7A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- mobile terminal
- control device
- gesture
- operating
- gesture recognition
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K35/00—Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K35/00—Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
- B60K35/10—Input arrangements, i.e. from user to vehicle, associated with vehicle functions or specially adapted therefor
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K35/00—Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
- B60K35/20—Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor
- B60K35/21—Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor using visual output, e.g. blinking lights or matrix displays
- B60K35/22—Display screens
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K35/00—Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
- B60K35/20—Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor
- B60K35/28—Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor characterised by the type of the output information, e.g. video entertainment or vehicle dynamics information; characterised by the purpose of the output information, e.g. for attracting the attention of the driver
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K35/00—Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
- B60K35/80—Arrangements for controlling instruments
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W50/00—Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
- B60W50/08—Interaction between the driver and the control system
- B60W50/10—Interpretation of driver requests or demands
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W60/00—Drive control systems specially adapted for autonomous road vehicles
- B60W60/001—Planning or execution of driving tasks
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/04815—Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/04817—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance using icons
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/0482—Interaction with lists of selectable items, e.g. menus
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T3/00—Geometric image transformations in the plane of the image
- G06T3/40—Scaling of whole images or parts thereof, e.g. expanding or contracting
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/20—Movements or behaviour, e.g. gesture recognition
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K2360/00—Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
- B60K2360/146—Instrument input by gesture
- B60K2360/1464—3D-gesture
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K2360/00—Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
- B60K2360/146—Instrument input by gesture
- B60K2360/1468—Touch gesture
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K2360/00—Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
- B60K2360/16—Type of output information
- B60K2360/175—Autonomous driving
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K2360/00—Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
- B60K2360/16—Type of output information
- B60K2360/177—Augmented reality
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K2360/00—Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
- B60K2360/55—Remote control arrangements
- B60K2360/56—Remote control arrangements using mobile devices
- B60K2360/563—Vehicle displaying mobile device information
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K2360/00—Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
- B60K2360/55—Remote control arrangements
- B60K2360/56—Remote control arrangements using mobile devices
- B60K2360/569—Vehicle controlling mobile device functions
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K2360/00—Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
- B60K2360/55—Remote control arrangements
- B60K2360/56—Remote control arrangements using mobile devices
- B60K2360/577—Mirror link with mobile devices
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2420/00—Indexing codes relating to the type of sensors based on the principle of their operation
- B60W2420/40—Photo, light or radio wave sensitive means, e.g. infrared sensors
- B60W2420/403—Image sensing, e.g. optical camera
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2540/00—Input parameters relating to occupants
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
- G02B2027/014—Head-up displays characterised by optical features comprising information/image processing systems
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
- G02B2027/0178—Eyeglass type
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2200/00—Indexing scheme for image data processing or generation, in general
- G06T2200/24—Indexing scheme for image data processing or generation, in general involving graphical user interfaces [GUIs]
Definitions
- Method for operating a mobile terminal by means of a gesture recognition and control device, gesture recognition and control device, a motor vehicle, and an output device that can be worn on the head
- the invention relates to a method for operating a mobile terminal, for example a smartphone or tablet PC, the method being carried out by a gesture recognition and control device.
- a gesture recognition and control device is to be understood as meaning a device, a device component or a device group which is set up to recognize an operating gesture; and which is also set up to receive and evaluate signals, and to generate a control signal.
- the gesture recognition and control device can also be designed to capture the operating gesture, for example by means of a camera.
- a passenger in the rear of modern vehicles can use a mobile device, for example a tablet PC on which vehicle functions such as the current speed can be displayed.
- It is a hardware control unit that enables, for example, the setting of vehicle functions, navigation and media content. This is particularly important in those vehicles in which people are often chauffeured, for example in vehicles of a driver service.
- Such an exemplary tablet PC now looks rather old-fashioned and can only be used by one passenger.
- the same problem arises when using a mobile device, for example a smartphone.
- DE 10 2016 207 530 A1 describes a system for displaying a virtual vehicle interior, with one or more vehicle interior components and digital glasses which are set up to visually display a virtual vehicle interior to a user and to take the one or more vehicle interior components into account in the optical display.
- DE 10 2017 211 521 A1 describes a motor vehicle for transporting at least one user, which has a subsystem which is set up to perform a physical function of the motor vehicle.
- One object of the invention is to simplify the operation of a mobile terminal in the motor vehicle.
- the invention is based on the idea of mirroring a user interface of a mobile terminal onto an output device that can be worn on the head and of implementing spatial gesture recognition for operating the mirrored user interface.
- a spatial gesture is understood to mean a gesture, in particular an operating gesture, in which no touch-sensitive surface, for example a touchscreen or a touchpad, has to be touched, but rather is carried out freely in space.
- the spatial gesture can be, for example, a predetermined finger and/or hand posture and/or a movement of the hand.
- the virtual or augmented reality (“augmented reality”, “AR”) displayed on the data glasses that can be worn on the head integrates an image of a user interface of the mobile device and mirrors it into the motor vehicle.
- a motor vehicle with a correspondingly configured gesture recognition and control device is compatible with all mobile terminals and output devices that can be worn on the head.
- the operation of the mobile terminal is not only very modern, but a user can simultaneously operate the mobile terminal while using the output device that can be worn on the head for an entertainment function, for example a VR video game.
- the user does not have to put down the output device, for example data glasses, every time, for example to answer a call or to look up something in a calendar of the mobile terminal.
- VR virtual reality
- the mobile device can also be operated in every sitting and / or lying position of the user in which he may not be able to reach the mobile device and take it in his hand, for example to make a swiping gesture on the touchscreen of the mobile device.
- the method according to the invention and the devices according to the invention also provide a fully configurable and scalable system.
- the invention can also be used for a driver of the motor vehicle if the motor vehicle is operated in a fully autonomous driving mode (ie a piloted driving mode).
- a remote control is provided that can be used while, for example, a user in the rear is busy with an entertainment system and is playing a game, or while a driver attends a VR meeting during an autonomous journey (level 5).
- it is advantageous that no software for a virtual reality or an augmented reality has to be installed on the mobile terminal.
- the inventive method for operating a mobile terminal is carried out by a gesture recognition and control device, in particular a gesture recognition and control device of a motor vehicle.
- the gesture recognition and control device can be designed, for example, as a control chip or control device and can have, for example, gesture recognition software and / or, for example, a camera for capturing an operating gesture.
- the mobile terminal device, in particular a mobile terminal device which is located in or on the motor vehicle, is recognized.
- the mobile terminal can be recognized, for example, by means of a Bluetooth connection.
- the gesture recognition and control device determines a current graphical user interface generated by a display device of the recognized mobile terminal, which provides an operating option by which an operating function of the mobile terminal can be triggered.
- the gesture recognition and control device provides an output signal which describes, as display content, the graphical user interface generated by the display device of the mobile terminal.
- a display device is understood to mean a device component for displaying image content, in particular a touch-sensitive screen.
- an output signal of the mobile terminal can be forwarded to the gesture recognition and control device, and the output signal provided by the gesture recognition and control device can then be, for example, the output signal of the mobile terminal.
- the output signal can be provided in that the gesture recognition and control device generates the output signal, wherein the generated output signal can describe an image of the graphical user interface of the mobile terminal.
- the output signal provided by the gesture recognition and control device is transmitted by this to an output device which can be worn on the head.
- an output signal from the mobile terminal is forwarded to the output device that can be worn on the head
- the graphical user interface is mirrored on a display surface of the output device that is worn on the head, or a display surface of the output device that is worn on the head is synchronized with a screen of the mobile terminal.
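By way of illustration only, and not as part of the patent disclosure, the mirroring described above (determining the terminal's current graphical user interface and forwarding it as display content to the head-worn output device) can be sketched as follows; all class and method names are assumptions:

```python
from dataclasses import dataclass

@dataclass
class OutputSignal:
    """Display content: an image of the terminal's graphical user interface."""
    frame: bytes
    width: int
    height: int

class MobileTerminal:
    """Stand-in for the recognized mobile terminal (e.g. a smartphone)."""
    def capture_screen(self) -> OutputSignal:
        # In practice, the screen image generated by the terminal's
        # display device; here a dummy 4x4 frame.
        return OutputSignal(frame=b"\x00" * 16, width=4, height=4)

class HeadWornOutputDevice:
    """Stand-in for data glasses (an HMD); stores the last mirrored frame."""
    def __init__(self):
        self.last_signal = None

    def show(self, signal: OutputSignal):
        self.last_signal = signal

class GestureRecognitionAndControlDevice:
    def __init__(self, hmd: HeadWornOutputDevice):
        self.hmd = hmd

    def mirror(self, terminal: MobileTerminal) -> OutputSignal:
        # Determine the terminal's current GUI and transmit it as
        # display content to the head-worn output device.
        signal = terminal.capture_screen()
        self.hmd.show(signal)
        return signal
```

Whether the output signal is forwarded from the terminal or regenerated by the gesture recognition and control device (both variants are described above) does not change this basic flow.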
- the output device that can be worn on the head is an output device for outputting an augmented reality and / or a virtual reality.
- the output device that can be worn on the head can preferably be designed as data glasses or as another “head-mounted display” (“HMD”) known from the prior art, that is, a visual output device to be worn on the head.
- HMD head-mounted display
- the output signal provided is transmitted to the output device that can be worn on the head to output the display content as part of an augmented or virtual reality provided or output by the output device in a predetermined output area in the interior of the motor vehicle.
- the display content is not displayed in the specified output area, for example an area between the user on the back seat and a backrest of the front seat, but on a display surface of the output device that can be worn on the head, so that the display content appears there when the user looks in the direction of the specified output area.
- the display content is output on a portion of a display surface of the head-worn output device that lies in the direction in which the user looks toward the predetermined output area.
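A minimal sketch of this gaze-dependent placement, assuming a simple linear mapping from head yaw to a horizontal pixel position (the function name, angles, and resolution are illustrative, not from the patent):

```python
def display_portion_for_output_area(gaze_yaw_deg: float,
                                    area_yaw_deg: float,
                                    fov_deg: float = 90.0,
                                    display_width_px: int = 1920):
    """Map the fixed output area in the vehicle interior to a horizontal
    pixel position on the head-worn display, given the user's head yaw.
    Returns None when the area lies outside the field of view, so the
    mirrored user interface appears only when the user looks toward
    the predetermined output area."""
    offset = area_yaw_deg - gaze_yaw_deg
    if abs(offset) > fov_deg / 2:
        return None  # output area not in view: show nothing
    # Linear angle-to-pixel mapping (a deliberate simplification).
    return int((offset + fov_deg / 2) / fov_deg * display_width_px)
```

Looking straight at the output area places the content at the display center; turning the head away moves it off-screen.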
- a contactless operating gesture by the user is recognized as a function of the operating option made available. If the user interface currently provided by the mobile terminal device provides, for example, the operating option that swiping across the screen can be used to select a current program,
- the gesture recognition and control device can recognize a spatial gesture, which can be, for example, a contactless swipe in the air.
- This recognized contactless operating gesture, that is to say the recognized spatial gesture, can trigger the selection of the program of the mobile terminal without touching the screen of the mobile terminal. Since the graphical user interface continues to be mirrored from the mobile terminal onto the output device that can be worn on the head, the user then sees the opening of the selected program as display content, for example.
- the gesture recognition and control device has, for example, a camera, for example a time-of-flight camera.
- the contactless operating gesture can be detected before the contactless operating gesture is recognized.
- the gesture recognition and control device generates a remote control signal as a function of the detected contactless operating gesture, the generated remote control signal describing a triggering of the operating function of the mobile terminal assigned to the recognized contactless operating gesture.
- the contactless operating gesture for triggering the operating function can preferably be a spatial gesture that is similar or analogous to the touch gesture assigned to the operating function. If the touch gesture is, for example, a swipe on the screen of the mobile terminal from left to right, the corresponding spatial gesture to trigger the operating function can be moving the user's hand from left to right in the air.
- the gesture recognition and control device transmits the generated remote control signal to a control device of the recognized mobile terminal to trigger the operating function.
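The mapping from a recognized contactless (spatial) gesture to the remote control signal that triggers the assigned operating function could be sketched as follows; the gesture names and signal format are assumptions for illustration:

```python
# Each spatial gesture emulates the touch gesture assigned to an
# operating function on the mobile terminal (names are illustrative).
SPATIAL_TO_TOUCH = {
    "swipe_air_left_to_right": "swipe_screen_left_to_right",
    "swipe_air_right_to_left": "swipe_screen_right_to_left",
    "air_tap": "tap",
}

def make_remote_control_signal(spatial_gesture: str) -> dict:
    """Generate the remote control signal describing the triggering of
    the operating function assigned to the recognized contactless
    gesture; the signal would then be transmitted to the control
    device of the recognized mobile terminal."""
    touch_gesture = SPATIAL_TO_TOUCH.get(spatial_gesture)
    if touch_gesture is None:
        raise ValueError(f"unrecognized spatial gesture: {spatial_gesture}")
    return {"trigger": touch_gesture}
```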
- a control device is understood to mean a device component or, for example, a component group for receiving and evaluating signals, as well as for generating control signals.
- the control device of the recognized mobile terminal can be a control chip, for example.
- the operating function, which in the mobile terminal is assigned a touch gesture on, for example, a touchscreen of the mobile terminal, is assigned a contactless operating gesture when the graphical user interface is mirrored on the output device that can be worn on the head.
- the graphical user interface provides the operating option in which, when the mobile terminal device is operated directly, the operating function can be triggered by means of an operating gesture, in particular a touch gesture.
- the operating function is triggered indirectly via the mirrored user interface in virtual or augmented reality by means of a contactless operating gesture, i.e. a spatial gesture that is not recognized by the mobile device but by the gesture recognition and control device.
- an embodiment of the method according to the invention can provide that the gesture recognition and control device recognizes a further contactless operating gesture, in particular while the display content is being output by the output device that can be worn on the head, the further contactless operating gesture specifying a positioning or placement of the display content (i.e., an image of the mirrored user interface) in the interior of the motor vehicle.
- the further contactless operating gesture specifies the location or position in the motor vehicle at which the display content should be seen or appear when the user wears the output device that can be worn on the head.
- the gesture recognition and control device can then specify the output area on the basis of the recognized further contactless operating gesture. In other words, a virtual location of the display content can be specified and / or changed.
- another contactless operating gesture can be recognized by the gesture recognition and control device, which can describe a scaling of the display content (ie an image of the mirrored user interface).
- the further contactless operating gesture can, for example, specify and / or change a format and / or a size of the display content.
- the gesture recognition and control device can then scale an image that shows the display content. The image is scaled on the basis of the recognized additional contactless operating gesture.
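The scaling step can be pictured as a small sketch: the scale factor is taken as the ratio of the hand's distance from the image centre at the end versus the start of a "pull out at the corners" gesture. All names and numeric values below are assumptions for illustration, not taken from the source:

```python
import math

def scale_factor(corner_start, corner_end, centre):
    """Ratio of the hand's distance from the image centre after vs. before
    the drag; dragging a corner outwards enlarges the image."""
    d0 = math.dist(centre, corner_start)
    d1 = math.dist(centre, corner_end)
    return d1 / d0 if d0 else 1.0

def apply_scale(size, factor):
    """Scale a (width, height) pair by the given factor."""
    width, height = size
    return (round(width * factor), round(height * factor))

# Dragging a corner from 10 cm to 20 cm away from the centre doubles the image
factor = scale_factor((0.10, 0.0), (0.20, 0.0), (0.0, 0.0))
print(apply_scale((640, 360), factor))  # (1280, 720)
```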
- the respective further contactless operating gesture can optionally be detected by the gesture recognition and control device.
- a further, ie separate, user interface can be displayed by the output device that can be worn on the head.
- the user is supported in scaling and positioning.
- a further output signal can be provided by the gesture recognition and control device, which can describe a further graphical user interface as display content, the further graphical user interface providing an operating menu for scaling and / or positioning the display content that describes the graphical user interface generated by the display device of the mobile terminal.
- the further graphic user interface can therefore optionally be independent of what the mobile terminal is displaying and can therefore be referred to as an additional graphic user interface.
- the gesture recognition and control device can preferably only perform one or more method steps of the embodiments described above as a function of activating a fully autonomous driving mode and / or as a function of an engine start of the motor vehicle.
- the gesture recognition and control device can, for example, query the current driving mode (or a future driving mode) from a driver assistance system or receive a corresponding signal from the driver assistance system; and / or the gesture recognition and control device can, for example, receive and evaluate a start signal from an ignition system of the motor vehicle.
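The optional gating just described can be pictured as a single predicate over the vehicle state; the mode name and the function `mirroring_enabled` are illustrative assumptions, not identifiers from the source:

```python
def mirroring_enabled(driving_mode: str, engine_start_signalled: bool) -> bool:
    """Allow the method steps only when a fully autonomous driving mode is
    active and/or an engine start has been signalled, as described above."""
    return driving_mode == "fully_autonomous" or engine_start_signalled

print(mirroring_enabled("manual", False))            # False
print(mirroring_enabled("fully_autonomous", False))  # True
print(mirroring_enabled("manual", True))             # True
```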
- the mirrored image of the user interface of the mobile terminal is thus displayed reliably and immediately by the output device.
- in the fully autonomous driving mode there is the additional advantage that a driver can also wear and use the output device that can be worn on the head.
- the object set above is achieved by a gesture recognition and control device which is set up to carry out a method according to one of the embodiments described above.
- the gesture recognition and control device can be designed, for example, as a control device or control chip or as a user program (“app”).
- the gesture recognition and control device can preferably have a processor device, that is to say a component or a device component which is designed and set up for electronic data processing and preferably can have at least one microcontroller and / or a microprocessor.
- the gesture recognition and control device can have a data memory, for example a memory card or a memory chip, or another data memory, on which a program code can preferably be stored which, when executed by the processor device, causes the gesture recognition and control device to perform an embodiment of the method according to the invention.
- the gesture recognition and control device can preferably be a gesture recognition and control device of the motor vehicle.
- the gesture recognition and control device can have one or more sensors, for example one or more cameras.
- the object set above is likewise achieved by an output device that can be worn on the head, for example data glasses, which is designed to output an augmented reality and / or a virtual reality, and which has an embodiment of the gesture recognition and control device according to the invention.
- the invention also includes further developments of the gesture recognition and control device according to the invention and of the motor vehicle according to the invention which have features as they have already been described in connection with the further developments of the method according to the invention. For this reason, the corresponding developments of the gesture recognition and control device according to the invention and the motor vehicle according to the invention are not described again here.
- the motor vehicle according to the invention is preferably designed as an automobile, in particular as a passenger car or truck, or as a passenger bus or motorcycle.
- the invention also includes the combinations of the features of the described embodiments. Exemplary embodiments of the invention are described below. In the drawings:
- Fig. 1 shows a schematic representation of a first exemplary embodiment of the method according to the invention and the devices according to the invention;
- Fig. 2 shows a schematic representation of a further exemplary embodiment of the method according to the invention and the devices according to the invention;
- FIG. 3 shows a schematic representation of a further exemplary embodiment of the method according to the invention and the devices according to the invention
- FIG. 4 shows a further schematic illustration of the further exemplary embodiment from FIG. 3; and
- Fig. 5 shows a schematic representation of a further exemplary embodiment of the method according to the invention and the devices according to the invention.
- the exemplary embodiments explained below are preferred embodiments of the invention.
- the described components of the embodiments each represent individual features of the invention that are to be considered independently of one another and that further develop the invention in each case also independently of one another. Therefore, the disclosure is intended to include combinations of the features of the embodiments other than those shown.
- the described embodiments can also be supplemented by further features of the invention already described.
- the same reference symbols denote functionally identical elements.
- Fig. 1 illustrates the principle of the method according to the invention and the devices according to the invention according to a first exemplary embodiment.
- a gesture recognition and control device 10 can, as shown in the example of FIG. 1, be, for example, a control chip or a control device of a motor vehicle 12, which can be configured, for example, as a passenger vehicle, preferably as a passenger vehicle that can be operated in a piloted or fully autonomous driving mode.
- the gesture recognition and control device 10 can preferably have a processor device 14 with, for example, a plurality of microprocessors, and / or a data memory 16, for example a memory card or a memory chip.
- a program code for carrying out the method can preferably be stored on the optional data memory 16.
- a driver for an operating system of the mobile terminal device 22 can optionally be stored in the data memory 16 of the gesture recognition and control device 10.
- Communication with an output device 18 that can be worn on the head can preferably take place via a wireless data communication connection 20, for example via a WLAN connection, Bluetooth connection or cellular radio connection.
- the data communication link 20 can be, for example, a wired data communication link 20, for example a cable.
- the output device 18 which can be worn on the head can preferably be an output device 18 for an augmented and / or virtual reality. If the gesture recognition and control device 10 is a component of the motor vehicle 12, any output device 18 known to a person skilled in the art, for example any known data glasses, can be used.
- the gesture recognition and control device 10 can be a component of the output device 18.
- in this variant, the gesture recognition and control device 10 can be located, for example, on a side temple of the data glasses.
- the output device 18 can, for example, have a camera on one end face.
- Communication with a mobile terminal device 22, for example a smartphone or a tablet PC, can also preferably take place via a wireless data communication connection 20, or by means of a wired data communication connection 20, for example a data bus of the motor vehicle 12 and / or a cable.
- the mobile terminal 22 has a display device 24, which can preferably include a touch-sensitive screen. Depending on the current graphical user interface, various operating functions of the mobile terminal device 22 can be triggered via this exemplary touch-sensitive screen, for example opening a program, switching to a navigation overview with small views of open programs, accepting a phone call, or playing a video.
- the direct control of the display device 24 is taken over by a control device 26, for example a control board and / or a user program (“app”) or an operating system.
- the control device 26 of the mobile terminal 22 can also have a processor device and / or a data memory, these components not being shown in FIG. 1 (and in the following figures) for reasons of clarity.
- a display surface 28 of the output device 18 is shown, which can ideally be located in front of the user's eyes when the user is wearing the output device 18.
- FIG. 1 shows an optional camera, for example a time-of-flight camera or an infrared sensor, which can be arranged, for example, on a roof lining of the motor vehicle or on a rearview mirror of the motor vehicle. If the gesture recognition and control device 10 is, for example, a component of the output device 18, the camera can, for example, be arranged on an end face of the output device 18.
- the gesture recognition and control device 10 can preferably also have a plurality of such sensors 30.
- the detection of the mobile terminal can take place, for example, as soon as the mobile terminal 22 approaches the motor vehicle 12, for which purpose a Bluetooth LE receiver can be located on the outside of the motor vehicle 12, for example.
- the mobile terminal 22 can be recognized (S1) if, for example, it is placed in a charging cradle in the motor vehicle 12, wherein the mobile terminal 22 can be recognized (S1) using recognition techniques known from the prior art.
- the mobile terminal 22 of the example in FIG. 1 can, for example, have just opened a user interface of a program for outputting information about the motor vehicle, for example current operating data.
- the exemplary program can be another user program, for example a user program for playing films or a game, or a desktop of the mobile terminal device 22 can be displayed.
- the graphical user interface that is currently being displayed by the display device 24 of the recognized mobile terminal 22 can be determined (S2), for example by transmitting a corresponding output signal from the display device 24 to the gesture recognition and control device 10, or by the gesture recognition and control device 10 asking the mobile terminal 22 what is currently being displayed.
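Step S2 (a pushed output signal versus an active query of the terminal) can be sketched as follows; the classes, method names, and screen names are invented for the example and are not part of the source:

```python
class MobileTerminal:
    """Stand-in for the recognized mobile terminal 22."""
    def __init__(self, current_screen: str):
        self._current_screen = current_screen

    def query_display(self) -> str:
        # Answers the question "what are you currently displaying?"
        return self._current_screen

def determine_user_interface(terminal: MobileTerminal, pushed_signal=None) -> str:
    """Prefer an output signal pushed by the terminal's display device;
    otherwise fall back to actively querying the terminal (step S2)."""
    return pushed_signal if pushed_signal is not None else terminal.query_display()

phone = MobileTerminal("vehicle_status_app")
print(determine_user_interface(phone))                    # vehicle_status_app
print(determine_user_interface(phone, "navigation_app"))  # navigation_app
```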
- the output signal can be generated, for example, by the gesture recognition and control device 10, or an output signal from the mobile terminal device 22 can be forwarded to the output device 18.
- the provision S3 also includes the transmission S4 of the provided output signal to the exemplary data glasses.
- the predetermined output area 32 can be preset and can be, in the example in FIG. 1, an area above the passenger seat.
- a virtual image 34 of the graphical user interface of the mobile terminal 22 in the output area 32 is shown.
- a camera image of the interior of the motor vehicle can also be displayed on the display surface 28, into which the image 34 of the user interface of the mobile terminal device 22 can be inserted.
- an area of the display area 28 on which the image 34 is not displayed can be switched to transparent so that the user can see the real interior of the motor vehicle 12 through the exemplary data glasses.
- the output device 18 can, for example, generate a virtual reality that does not show an interior of the motor vehicle, but rather, for example, a landscape of a video game, and can display the image 34 of the graphical user interface.
- the example in FIG. 1 shows a part of the body of the user, preferably a hand 36, which is currently executing a contactless operating gesture to control the mobile terminal device 22.
- the contactless operating gesture, which can also be referred to as a spatial gesture, can be, for example, a pointing gesture in which the user points in the air at the specified output area 32 and thus at the image 34 of the user interface and, for example, moves to where a function to be activated is represented by an icon on the user interface shown in the image 34.
- an operating area can be specified, that is to say an area within which the user must perform the spatial gesture so that it is detected and / or recognized.
- the sensor 30 can then, for example, be directed to this exemplary, predetermined operating area, or alternatively the sensor 30 or a plurality of sensors 30 can cover a large part or the entire interior of the motor vehicle 12.
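The notion of a predetermined operating area can be illustrated as a simple axis-aligned box test in a common vehicle coordinate frame; the box bounds and hand positions below are invented example values:

```python
def in_operating_area(hand_pos, box_min, box_max):
    """True if the tracked hand position lies inside the axis-aligned
    operating area (all coordinates in the same vehicle frame)."""
    return all(lo <= p <= hi for p, lo, hi in zip(hand_pos, box_min, box_max))

# Illustrative area roughly above the centre console (metres, vehicle frame)
BOX_MIN, BOX_MAX = (-0.3, 0.2, 0.8), (0.3, 0.8, 1.4)
print(in_operating_area((0.0, 0.5, 1.0), BOX_MIN, BOX_MAX))  # True
print(in_operating_area((0.5, 0.5, 1.0), BOX_MIN, BOX_MAX))  # False
```

A spatial gesture would only be evaluated further when this test succeeds, which matches the idea of directing the sensor 30 at the predetermined area.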
- the user can be shown a user interface on which various menu items are displayed and / or, for example, values of various vehicle functions. If the user now wants to activate one of the displayed, possible operating functions or, for example, receive more detailed information on one of the vehicle functions, the spatial gesture can provide, for example, that he points with his finger at a corresponding icon or display element.
- a corresponding sensor signal can be received by the gesture recognition and control device 10 (S6) and the contactless operating gesture can be recognized, for example, to trigger a video playback function (S7).
- the graphical user interface can be mirrored, for example, by means of a “grabber”, ie a so-called “content grabber” or “frame grabber”, and faded into virtual reality.
- an image of the hand 36 can be superimposed; for this purpose, based on a principle similar to the gesture recognition, the hand 36 of the user can be filmed or tracked, for example, and can then be mapped onto the display surface 28 by the gesture recognition and control device 10.
- another display element 38 is shown as an example, on which a further operating function can be displayed, for example a function for closing the gesture recognition and / or the mirroring of the graphical user interface and / or for menu navigation.
- Such an exemplary “back button” can preferably be shown in a predetermined position, in particular in a highlighted position.
- the gesture recognition and control device 10 can provide a further output signal (S8) that can describe the output of further display content in the form of an image 40 of a further graphical user interface.
- This can preferably be an operating menu for scaling and / or positioning the first image 34, that is to say the first virtual graphic user interface that is mirrored by the mobile terminal 22.
- the gesture recognition and control device 10 can use this additional spatial gesture to predetermine the output area 32 at a different position in the interior, in this case in front of the steering wheel (S8).
- another spatial gesture can scale the image 34 (S9), that is to say the image 34 can, for example, be "drawn out” at the corners, that is, the image 34 can be enlarged, for example.
- the gesture recognition and control device can generate a remote control signal (S10) that can activate the operating function for displaying the detailed information on an operating parameter or playing a video, for example.
- the generated remote control signal is transmitted to the control device 26 of the identified mobile terminal 22 via the data communication link 20 (S11).
- the mobile terminal 22, in particular the control device 26, then triggers the operating function (S12).
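Steps S10 to S12 can be condensed into a small sketch in which a recognized operating function is wrapped in a remote control signal and handed to a stand-in for the control device 26; all identifiers and the signal format are illustrative assumptions:

```python
class ControlDevice:
    """Stand-in for the control device 26 of the mobile terminal."""
    def __init__(self):
        self.triggered = []

    def receive(self, signal):
        # S12: trigger the operating function named in the signal
        self.triggered.append(signal["function"])

def generate_remote_control_signal(operating_function: str) -> dict:
    # S10: wrap the recognized operating function in a remote control signal
    return {"type": "remote_control", "function": operating_function}

ctrl = ControlDevice()
ctrl.receive(generate_remote_control_signal("play_video"))  # S11: transmission
print(ctrl.triggered)  # ['play_video']
```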
- the gesture recognition and control device 10 can be in communication with a motor vehicle system (not shown in FIG. 1), for example an ignition or a driver assistance system, and on the basis of, for example, a signal that describes that the motor vehicle is currently in a piloted driving mode is operated, the method can be activated.
- FIGS. 2 to 5 show further exemplary embodiments of the method and the devices, only the differences from the example of FIG. 1 being discussed below.
- the image 34 of the graphical user interface of the mobile terminal 22 can, for example, display an operating menu as a segment of a circular ring.
- One or more operating functions can be selected depending on whether the exemplary hand 36 points into one area of the circular-ring segment, for example to generate a display, or into another area of the segment, for example to provide a new display.
- the corresponding operating functions can preferably be displayed on the graphical user interface, which is mirrored on the display surface 28 of the output device 18, for better orientation.
- Fig. 2 also shows an exemplary spatial gesture in which the hand 36 can have all fingers extended and in which the palm, for example, points upwards.
- This exemplary spatial gesture can be, for example, a spatial gesture for navigating back into a home directory.
- the spatial gesture shown in Fig. 2 can describe, for example, a pivoting of the outstretched hand to the right or left, or up or down, in order to select the corresponding operating functions.
- the real hand 36 of the user can be seen in an augmented reality, or an image of a virtual hand that moves exactly like the hand 36 of the user is displayed.
- the mirrored graphical user interface can, for example, show a TV menu, or a desktop of the mobile terminal device 22 can be mirrored, or the graphical user interface can show what is known as a “content streamer”.
- a mirrored user interface is shown, which can for example be an operating menu of a media program, with the help of which one can select and listen to music albums, for example, or switch to a video and photo function, for example.
- FIG. 4 shows an extension of the exemplary embodiment in FIG. 3, in which a further graphical user interface, either another one of the mobile terminal 22 or a graphical user interface generated specifically by the gesture recognition and control device 10, through the image 40 can be displayed.
- with this additional menu, a music album stored in the mobile terminal device 22 can be deleted, various tracks or functions can be selected, or the image 34 can be scaled and / or positioned (S8, S9), for example.
- Fig. 5 shows an embodiment in which the user, for example a driver during a fully autonomous journey (level 5), or a front passenger or a passenger in the rear, can generate any display, scale it (S9) and / or place it in the interior (that is to say, can specify the output area 32, S8), for example in order to watch a soccer game live.
- on another image 40, ie another display that he generates, he can scale (S9) and / or, as required, display, for example, the operating concept of his MMI and operate his mobile terminal 22 and / or the motor vehicle 12, or he can display, for example, combined content such as a speed or navigation information (for example remaining distance and / or arrival time). He may also use another image 40 or display element, that is to say another display, in order to process his emails, for example.
- the user can design his own display and control surfaces in his virtual environment.
- This system can be easily connected to an entertainment system that provides virtual reality. During the exemplary game, the user would not have to get out of virtual reality in order to keep an eye on his secondary activities, for example retrieving emails, or to operate his motor vehicle 12.
- the examples show how the invention enables VR- and / or AR-based remote control.
- the user can activate a “Flying MMI” or “VR-MMI”, that is, the mirrored, virtual user interface, within a virtual reality, regardless of the context (for example gaming and / or meeting).
- This can offer the same functionality as the series MMI.
- the series MMI and the VR MMI can always be synchronized and show the same display content.
- the VR-MMI can preferably also be operated as usual using gestures analogous to series operation (e.g. touch, swipe, pinch).
- the user can preferably place the VR-MMI, that is to say the mirrored graphical user interface, at a location that makes sense for him, thereby specifying the output area 32.
- the system is very easy to implement in both VR (virtual reality) and AR (augmented reality).
- a technical implementation can provide that, by tracking the user's hand 36 or hands 36 using one or more sensors 30, for example infrared sensors, preferably using Leap Motion, and by determining the hand coordinates in relation to the coordinates of the output device 18, that is to say of the HMD, the user's scope of action can be completely captured in three dimensions.
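Relating the measured hand coordinates to the coordinates of the output device 18 amounts to a rigid frame transformation. A minimal sketch, with the rotation and translation as assumed example values rather than values from the source:

```python
def to_hmd_frame(hand_sensor, rotation, translation):
    """Transform a 3D hand position from the sensor frame into the HMD frame:
    p_hmd = R * p_sensor + t, written out without external libraries."""
    return tuple(
        sum(rotation[i][j] * hand_sensor[j] for j in range(3)) + translation[i]
        for i in range(3)
    )

# With the identity rotation and zero translation the two frames coincide
IDENTITY = ((1.0, 0.0, 0.0), (0.0, 1.0, 0.0), (0.0, 0.0, 1.0))
print(to_hmd_frame((0.1, 0.2, 0.5), IDENTITY, (0.0, 0.0, 0.0)))
```

In practice the rotation and translation would come from the calibration between the hand-tracking sensor and the HMD.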
- the image shown in the series MMI, ie the image shown by the display device 24 of the mobile terminal 22, can preferably be streamed by means of a grabber to the “area of interest”, ie to the output area 32, or to other areas freely defined by the user within virtual reality.
- Actions by a third person on the series MMI, for example on the screen of the mobile terminal 22, can therefore be displayed in real time in the VR-MMI, that is to say in the image 34 of the mirrored user interface.
- the corresponding message can be transmitted to a main unit, for example, which carries out the actions.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- Mechanical Engineering (AREA)
- Transportation (AREA)
- Chemical & Material Sciences (AREA)
- Combustion & Propulsion (AREA)
- Automation & Control Theory (AREA)
- Optics & Photonics (AREA)
- General Health & Medical Sciences (AREA)
- Psychiatry (AREA)
- Social Psychology (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Health & Medical Sciences (AREA)
- Multimedia (AREA)
- User Interface Of Digital Computer (AREA)
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE102019210383.9A DE102019210383A1 (en) | 2019-07-15 | 2019-07-15 | Method for operating a mobile terminal by means of a gesture recognition and control device, gesture recognition and control device, motor vehicle, and output device that can be worn on the head |
PCT/EP2020/068456 WO2021008871A1 (en) | 2019-07-15 | 2020-07-01 | Method for operating a mobile terminal by means of a gesture recognition and control device, gesture recognition and control device, motor vehicle, and an output apparatus that can be worn on the head |
Publications (1)
Publication Number | Publication Date |
---|---|
EP3999939A1 true EP3999939A1 (en) | 2022-05-25 |
Family
ID=71409426
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP20735582.7A Withdrawn EP3999939A1 (en) | 2019-07-15 | 2020-07-01 | Method for operating a mobile terminal by means of a gesture recognition and control device, gesture recognition and control device, motor vehicle, and an output apparatus that can be worn on the head |
Country Status (5)
Country | Link |
---|---|
US (1) | US12008168B2 (en) |
EP (1) | EP3999939A1 (en) |
CN (1) | CN113994312B (en) |
DE (1) | DE102019210383A1 (en) |
WO (1) | WO2021008871A1 (en) |
Families Citing this family (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE102019210383A1 (en) | 2019-07-15 | 2021-01-21 | Audi Ag | Method for operating a mobile terminal by means of a gesture recognition and control device, gesture recognition and control device, motor vehicle, and output device that can be worn on the head |
EP4030773B1 (en) * | 2019-09-09 | 2023-11-01 | NISSAN MOTOR Co., Ltd. | Vehicle remote control method and vehicle remote control device |
JP7057393B2 (en) * | 2020-06-24 | 2022-04-19 | 株式会社電通 | Programs, head-mounted displays and information processing equipment |
DE102021119970A1 (en) | 2021-08-02 | 2023-02-02 | Bayerische Motoren Werke Aktiengesellschaft | Operating procedure for vehicles with data glasses |
US12030577B2 (en) | 2021-09-30 | 2024-07-09 | Snap Inc. | AR based performance modulation of a personal mobility system |
US11813528B2 (en) * | 2021-11-01 | 2023-11-14 | Snap Inc. | AR enhanced gameplay with a personal mobility system |
DE102022113343A1 (en) | 2022-05-25 | 2023-11-30 | Bayerische Motoren Werke Aktiengesellschaft | Method and device for operating a display system in a mobile device with data glasses |
CN114839782B (en) * | 2022-06-07 | 2023-08-18 | 上汽大众汽车有限公司 | Vehicle-mounted enhanced display system for vehicle control and information display |
DE102022129409A1 (en) | 2022-11-08 | 2024-05-08 | Bayerische Motoren Werke Aktiengesellschaft | Device and method for controlling a smart device in a vehicle |
FR3144322A1 (en) * | 2022-12-23 | 2024-06-28 | Valeo Comfort And Driving Assistance | Vehicle-mounted immersive reality system |
Family Cites Families (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10133342B2 (en) * | 2013-02-14 | 2018-11-20 | Qualcomm Incorporated | Human-body-gesture-based region and volume selection for HMD |
US20150187357A1 (en) * | 2013-12-30 | 2015-07-02 | Samsung Electronics Co., Ltd. | Natural input based virtual ui system for mobile devices |
US10013083B2 (en) * | 2014-04-28 | 2018-07-03 | Qualcomm Incorporated | Utilizing real world objects for user input |
US10353532B1 (en) * | 2014-12-18 | 2019-07-16 | Leap Motion, Inc. | User interface for integrated gestural interaction and multi-user collaboration in immersive virtual reality environments |
EP3079041B1 (en) | 2015-04-10 | 2018-06-27 | Airbus Defence and Space GmbH | Method and system for the production of a virtual reality environment for passengers of landcraft and aircraft |
DE102016207530A1 (en) | 2016-05-02 | 2017-11-02 | Volkswagen Aktiengesellschaft | System and method for displaying a virtual vehicle interior |
DE102016225268A1 (en) * | 2016-12-16 | 2018-06-21 | Bayerische Motoren Werke Aktiengesellschaft | Method and device for operating a display system with data glasses |
DE102017211521A1 (en) | 2017-07-06 | 2019-01-10 | Bayerische Motoren Werke Aktiengesellschaft | Control of vehicle functions from a virtual reality |
ES2704373B2 (en) * | 2017-09-15 | 2020-05-29 | Seat Sa | Method and system to display virtual reality information in a vehicle |
CN109781136A (en) * | 2019-02-01 | 2019-05-21 | 谷东科技有限公司 | A kind of intelligent navigation method and system based on AR glasses |
DE102019210383A1 (en) | 2019-07-15 | 2021-01-21 | Audi Ag | Method for operating a mobile terminal by means of a gesture recognition and control device, gesture recognition and control device, motor vehicle, and output device that can be worn on the head |
- 2019-07-15: DE application DE102019210383.9A (published as DE102019210383A1), status: active, pending
- 2020-07-01: WO application PCT/EP2020/068456 (published as WO2021008871A1), status: unknown
- 2020-07-01: EP application EP20735582.7A (published as EP3999939A1), status: not active, withdrawn
- 2020-07-01: CN application CN202080044013.2A (published as CN113994312B), status: active
- 2020-07-01: US application US17/627,304 (published as US12008168B2), status: active
Also Published As
Publication number | Publication date |
---|---|
US12008168B2 (en) | 2024-06-11 |
DE102019210383A1 (en) | 2021-01-21 |
CN113994312B (en) | 2024-08-30 |
US20220244789A1 (en) | 2022-08-04 |
CN113994312A (en) | 2022-01-28 |
WO2021008871A1 (en) | 2021-01-21 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| STAA | Information on the status of an ep patent application or granted ep patent | STATUS: UNKNOWN |
| STAA | Information on the status of an ep patent application or granted ep patent | STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE |
| PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase | ORIGINAL CODE: 0009012 |
| STAA | Information on the status of an ep patent application or granted ep patent | STATUS: REQUEST FOR EXAMINATION WAS MADE |
20220215 | 17P | Request for examination filed | Effective date: 20220215 |
| AK | Designated contracting states | Kind code of ref document: A1; Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
| DAV | Request for validation of the european patent (deleted) | |
| DAX | Request for extension of the european patent (deleted) | |
20230529 | P01 | Opt-out of the competence of the unified patent court (upc) registered | Effective date: 20230529 |
| STAA | Information on the status of an ep patent application or granted ep patent | STATUS: EXAMINATION IS IN PROGRESS |
20240415 | 17Q | First examination report despatched | Effective date: 20240415 |
| STAA | Information on the status of an ep patent application or granted ep patent | STATUS: THE APPLICATION HAS BEEN WITHDRAWN |
20240813 | 18W | Application withdrawn | Effective date: 20240813 |