EP2802963A1 - Method and device for controlling functions in a vehicle using gestures performed in three-dimensional space, and corresponding computer program product - Google Patents
Method and device for controlling functions in a vehicle using gestures performed in three-dimensional space, and corresponding computer program product
- Publication number
- EP2802963A1 (Application EP12810080.7A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- gesture
- detected
- function
- dimensional space
- determined
- Prior art date
- 2012-01-10
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K35/00—Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
- B60K35/10—Input arrangements, i.e. from user to vehicle, associated with vehicle functions or specially adapted therefor
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/002—Specific input/output arrangements not covered by G06F3/01 - G06F3/16
- G06F3/005—Input arrangements through a video camera
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/0304—Detection arrangements using opto-electronic means
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/20—Movements or behaviour, e.g. gesture recognition
- G06V40/28—Recognition of hand or arm movements, e.g. recognition of deaf sign language
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K2360/00—Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
- B60K2360/146—Instrument input by gesture
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K2360/00—Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
- B60K2360/146—Instrument input by gesture
- B60K2360/1464—3D-gesture
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K2360/00—Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
- B60K2360/20—Optical features of instruments
- B60K2360/21—Optical features of instruments using cameras
Definitions
- the present invention relates to a method and apparatus for operating functions in a vehicle using gestures executed in three-dimensional space and a related computer program product.
- US 2008/0065291 A1 discloses a method and a device for operating functions in a vehicle using gestures executed in three-dimensional space, in which it is determined whether or not a gesture carried out in three-dimensional space is detected by means of an image-based detection process, it is determined whether or not the detected gesture is a gesture associated with controlling a function, and the function is controlled if it is determined that the detected gesture is the gesture associated with controlling the function.
- in this known method, it is merely determined whether a detected gesture is a gesture assigned to controlling a function
- a movement of, for example, a user's finger or hand performed in a detection area of an image-based gesture detection means may therefore erroneously be determined as the gesture associated with controlling a function. Consequently, in this case, the function is carried out erroneously or unintentionally.
- a method for operating functions in a vehicle using gestures executed in three-dimensional space comprises a) determining whether or not a first gesture performed in three-dimensional space is detected by an image-based detection operation, b) determining, if it is determined that the first gesture has been detected, whether or not the first gesture is a gesture associated with activating a control of a function, c) activating the control of the function if it is determined that the detected first gesture is the gesture associated with activating the control of the function, d) determining whether or not a second gesture performed in three-dimensional space is detected by the image-based detection operation, e) determining, if it is determined that the second gesture has been detected, whether or not the detected second gesture is a gesture associated with controlling the function, and f) controlling the function if it has been determined that the detected first gesture is the gesture associated with activating the control of the function and it is determined that the detected second gesture is the gesture associated with controlling the function.
- steps d) to f) are carried out repeatedly one after the other after steps a) to c) have been carried out.
- it is determined in step b) that the detected first gesture is the gesture associated with activating the control of the function if the detected first gesture is a first predetermined gesture that is static for a first predetermined time period in an interaction region of the three-dimensional space.
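For illustration only (this sketch is not part of the patent disclosure): one way such a static first gesture could be detected, assuming a tracker that delivers one 3D hand position per camera frame; the `region` object, the thresholds and all names are hypothetical.

```python
import math

def is_static_first_gesture(positions, region, hold_time_s, frame_dt_s,
                            max_drift_m=0.03):
    """Return True if the hand has rested inside the interaction region
    for at least hold_time_s seconds with only a small positional drift.

    positions:  recent 3D hand positions (x, y, z), newest last, one per frame.
    region:     object exposing a contains(point) predicate for the
                interaction region of the three-dimensional space.
    """
    needed = max(1, int(hold_time_s / frame_dt_s))   # frames the hand must dwell
    if len(positions) < needed:
        return False
    window = positions[-needed:]
    if not all(region.contains(p) for p in window):  # must stay inside the region
        return False
    ref = window[0]                                  # "static" = little total drift
    return all(math.dist(p, ref) <= max_drift_m for p in window)
```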
- a display element representing the activation of the function is displayed in step c).
- the display element indicating the activation of the control of the function is no longer displayed on the display unit after a fourth predetermined time period in which no gesture is detected.
- it is determined in step e) that the detected second gesture is the gesture associated with controlling the function if the detected second gesture is a second predetermined gesture that is dynamic in the interaction area of the three-dimensional space.
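Again purely as an illustration: the dynamic second gesture could be, for example, a lateral swipe. The patent does not fix a particular dynamic gesture, so the axis choice, travel distance and duration thresholds below are invented.

```python
def is_dynamic_second_gesture(positions, region, frame_dt_s,
                              min_travel_m=0.15, max_duration_s=0.8):
    """Classify a lateral swipe inside the interaction region as the
    dynamic second gesture (one of many conceivable dynamic gestures)."""
    if len(positions) < 2 or not all(region.contains(p) for p in positions):
        return False
    duration = len(positions) * frame_dt_s
    travel = positions[-1][0] - positions[0][0]   # x assumed to be the lateral axis
    return duration <= max_duration_s and abs(travel) >= min_travel_m
```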
- the interaction area is set smaller than a maximum detection area of the image-based detection process.
- the interaction area is set free of interfering objects.
- the interaction area is dynamically adapted in a context-dependent manner.
- the image-based detection process is camera-based and detects a position of an object executing a gesture in three-dimensional space.
- a device for operating functions in a vehicle using gestures executed in three-dimensional space comprises means adapted to carry out the method or its embodiments described above.
- a computer program product for operating functions in a vehicle using gestures executed in three-dimensional space is designed to carry out the method or its embodiments described above in cooperation with a computer or computer system, either directly or, after performing a predetermined routine, indirectly.
- erroneous or unintentional controlling of the function is avoided, since, before the gesture associated with controlling the function is detected, a gesture associated with activating a control of the function must be detected, by means of which the control of the function is activated.
- FIG. 1 is a schematic diagram showing a basic construction of a display unit and a detection concept according to an embodiment of the present invention
- Fig. 2 is a further schematic representation of the basic structure of the display unit and the detection concept according to the embodiment of the present invention
- FIG. 3 is a schematic diagram showing the basic structure of the display unit and a mounting location of a detection device in an overhead operating unit according to the embodiment of the present invention
- Fig. 4 is a schematic representation of the basic structure of the display unit and a mounting location of a detection device in an interior mirror according to the embodiment of the present invention
- FIG. 5 is a flowchart of a method of operating functions in a vehicle using gestures executed in three-dimensional space according to the embodiment of the present invention.
- a display unit is preferably a central display of a vehicle, preferably of a motor vehicle, and a method for operating functions displayed on the display unit using gestures executed in three-dimensional space is performed in the vehicle.
- a gesture described below is a gesture performed by a user of the vehicle by means of a hand or a finger of the user in three-dimensional space, without touching a display, such as a touch screen, or a control element, such as a touchpad.
- the image-based detector described below may be any camera that is capable of detecting a gesture in three-dimensional space, such as a depth camera, a camera with structured light, a stereo camera, a camera based on time-of-flight technology, or an infrared camera combined with a mono camera.
- An infrared camera combined with a mono camera improves detection capability because a mono camera with a high image resolution provides additional intensity information, which offers advantages in background segmentation, and a mono camera is insensitive to extraneous light.
- Fig. 1 shows a schematic representation of a basic structure of a display unit and a detection concept according to the embodiment of the present invention.
- reference numeral 10 denotes a display unit of a vehicle
- reference numeral 20 denotes a valid detection area of an image-based detector
- reference numeral 30 denotes an overhead operating unit of the vehicle
- reference numeral 40 denotes an interior mirror of the vehicle
- reference numeral 50 denotes a center console of the vehicle
- reference numeral 60 denotes a dome of the vehicle.
- the basic operating concept is that a gesture control for operating functions is activated by means of a gesture performed by a user's hand or finger if the performed gesture is detected in the detection area 20 by the image-based detection means as a predetermined gesture.
- the valid detection area 20 is determined by an image-based detection device capable of detecting a three-dimensional position of the user's hand or finger in three-dimensional space.
- the image-based detection device is preferably a depth camera integrated into the vehicle.
- the image-based detection device must be integrated such that gesture control is permitted by a relaxed hand and/or arm of the user at any position in the area above the dome 60 and the center console 50 of the vehicle. Therefore, the valid detection area is limited upward by an upper edge of the display unit 10 and downward by a minimum distance to the dome 60 and the center console 50.
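For illustration only: a valid detection area bounded in this way could be modelled as a simple box check, with the upper bound at the display's upper edge and the lower bound a minimum clearance above console and dome. All coordinates, the vehicle frame convention and the field names are invented.

```python
from dataclasses import dataclass

@dataclass
class ValidDetectionArea:
    """Box model of the valid detection area 20 (illustrative sketch).
    Vehicle frame with z pointing up; all lengths in metres."""
    z_top: float            # height of the upper edge of the display unit 10
    z_surface: float        # height of the center console 50 / dome 60 surface
    min_clearance_m: float  # gap above the surface so resting hands are ignored
    x_min: float
    x_max: float
    y_min: float
    y_max: float

    def contains(self, p) -> bool:
        x, y, z = p
        return (self.x_min <= x <= self.x_max
                and self.y_min <= y <= self.y_max
                and self.z_surface + self.min_clearance_m <= z <= self.z_top)
```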
- a gesture control is activated if a first gesture, which is a first predetermined gesture, is detected in the valid detection area 20.
- the first predetermined gesture is a static gesture made by moving the user's hand or finger into the valid detection area 20 and then briefly holding the user's hand or finger there for a first predetermined time period.
- the gesture control is deactivated by guiding the user's hand or finger out of the valid detection area. Placing the user's hand or arm on the center console 50 and operating components of the vehicle take place below the valid detection area 20, whereby no gesture control is activated.
- FIG. 2 is another schematic diagram showing the basic structure of the display unit and the detection concept according to the embodiment of the present invention.
- the same reference numerals as in Fig. 1 denote the same elements, and the reference numeral 70 denotes an obstructing object existing in or on the center console 50, such as a beverage container in a cup holder.
- a lower limit of the valid detection area 20 is dynamically adapted to the object 70.
- such context-dependent adaptation of the valid detection area as an interaction area is performed in such a way that a depth contour of the valid detection area is determined in real time from depth information of the image-based detection means, such as the depth camera, upon detection of a gesture. This means that a valid gesture must be made above the object 70.
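For illustration only: such a real-time adaptation could raise the lower boundary of the interaction area wherever the depth camera sees an object on the console. The array shapes, units and the clearance value below are assumptions.

```python
import numpy as np

def adapted_floor(height_map, base_height_m, clearance_m=0.05):
    """Compute a per-cell lower boundary for the interaction area.

    height_map:    2D array of surface heights (m) over the console region,
                   derived from the depth camera's depth image.
    base_height_m: nominal console height without objects.

    A gesture sample (x, y, z) is then valid only if z lies above the
    returned floor at its (x, y) cell, i.e. above the object 70, such as
    a beverage container in a cup holder.
    """
    floor = np.maximum(height_map, base_height_m)  # objects locally raise the floor
    return floor + clearance_m
```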
- arranging the image-based detection device in a roof area of the vehicle offers the following advantages: no sunlight is radiated into the optics of the image-based detection device. A complete detection area is covered, even in a vicinity of the display unit 10, as the valid detection area 20. There is a high image resolution in the main interaction directions of the gesture control. The image-based detection device is concealed from the normal field of vision of the driver and front passenger. Roof components can be highly standardized across different vehicle series with few design variants. Low requirements are imposed on the detection range.
- Fig. 3 shows a schematic representation of the basic structure of the display unit and a mounting location of a detection device in an overhead operating unit according to the embodiment of the present invention.
- in Fig. 3, the same reference numerals as in Figs. 1 and 2 denote the same elements, and the reference numeral 100 denotes a maximum detection angle of an image-based detection device integrated in the overhead operating unit 30 of the vehicle.
- the complete valid detection area 20 is covered by the image-based detection device integrated in the overhead operating unit 30.
- another advantage of the image-based detection device integrated in the overhead operating unit 30 is that the greatest possible vertical distance to the valid detection area 20 is achieved.
- Fig. 4 shows a schematic representation of the basic structure of the display unit and a mounting location of a detection device in an interior mirror according to the embodiment of the present invention.
- the same reference numerals as in Figs. 1 and 2 denote the same elements, and the reference numeral 110 denotes a maximum detection angle of an image-based detection device integrated in the interior mirror 40 of the vehicle.
- the complete valid detection area 20 is likewise covered by the image-based detection device integrated in the interior mirror 40.
- in order to compensate for a changing orientation of the image-based detection device due to an adjustment of the interior mirror 40, an alignment offset of the image-based detection device is calibrated on the basis of a contour of the center console 50.
- FIG. 5 shows a flowchart of a method for operating functions in a vehicle using gestures executed in three-dimensional space according to the embodiment of the present invention.
- a flow of processing of the flowchart in Fig. 5 is started after an initialization timing, such as after the ignition of the vehicle is turned on, and is cyclically repeated until a termination timing, such as turning off the ignition of the vehicle, is reached.
- the initialization timing may be, for example, the timing of starting an engine of the vehicle, and/or the termination timing may be the timing of stopping the engine of the vehicle.
- the detected gesture can be both a gesture executed by the driver and a gesture executed by the front passenger.
- the method of the flowchart in Fig. 5 is performed for both the driver and the passenger side.
- the processing sequence shown in Fig. 5 may conveniently be performed, for example, in parallel, serially, or in a nested manner for the driver and passenger sides.
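For illustration only: such a per-frame loop over both sides could look as follows, assuming the hypothetical `GestureControl` class sketched after the step-by-step description below; `camera`, `ignition_on` and `classify_gestures` are invented stand-ins for vehicle and recognizer interfaces.

```python
def run_gesture_loop(camera, ignition_on, classify_gestures, driver, passenger):
    """Per-frame loop evaluating driver and passenger sides in series.

    camera.read() yields a frame; classify_gestures(frame, side) returns the
    per-side observation consumed by GestureControl.process (all hypothetical).
    """
    while ignition_on():                    # initialization/termination timing
        frame = camera.read()
        for ctrl in (driver, passenger):    # serial evaluation; parallel or
            obs = classify_gestures(frame, ctrl.side)   # nested would also work
            ctrl.process(obs)
```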
- in step S100 it is determined whether or not a first gesture is detected. If the first gesture is not detected, which corresponds to a "No" answer in step S100, the processing flow returns to step S100. If the first gesture is detected, which corresponds to a "Yes" answer in step S100, the processing flow advances to step S200.
- in step S200 it is determined whether or not the detected first gesture is a gesture associated with activating a control of a function. If the first gesture is not a gesture associated with activating the control of the function, which corresponds to a "No" answer in step S200, the processing flow returns to step S100. If the first gesture is the gesture associated with activating the control of the function, which corresponds to a "Yes" answer in step S200, the processing flow advances to step S300.
- the gesture associated with activating the control of the function is a first predetermined gesture that is static for a first predetermined period of time in an interaction area of the three-dimensional space.
- the first predetermined gesture is detected as previously described with reference to FIGS. 1 to 3.
- the interaction area corresponds to the valid detection area 20 described above.
- in step S300 the control of the function is activated. After step S300, the processing flow advances to step S400.
- in step S400 it is determined whether or not a predetermined abort condition is satisfied. If the predetermined abort condition is satisfied, which corresponds to a "Yes" answer in step S400, the processing flow returns to step S100. If the abort condition is not satisfied, which corresponds to a "No" answer in step S400, the processing flow advances to step S500.
- the predetermined abort condition may be, for example, that no gesture has been detected for a fourth predetermined period of time. If the predetermined abort condition is satisfied in step S400, the display element indicating the activation of the control of the function is no longer displayed on the display unit.
- in step S500 it is determined whether or not a second gesture is detected. If the second gesture is not detected, which corresponds to a "No" answer in step S500, the processing flow returns to step S500. If the second gesture is detected, which corresponds to a "Yes" answer in step S500, the processing flow advances to step S600.
- in step S600 it is determined whether or not the detected second gesture is a gesture associated with controlling the function. If the second gesture is not a gesture associated with controlling the function, which corresponds to a "No" answer in step S600, the processing flow returns to step S500. If the second gesture is the gesture associated with controlling the function, which corresponds to a "Yes" answer in step S600, the processing flow advances to step S700.
- the gesture associated with controlling the function is a second predetermined gesture that is dynamic in the interaction area of the three-dimensional space.
- in step S700 the function is controlled. After step S700, the processing flow advances to step S800.
- a display element indicating the control of the function may be displayed on the display unit.
- in step S800 it is determined whether or not a predetermined abort condition is satisfied. If the predetermined abort condition is satisfied, which corresponds to a "Yes" answer in step S800, the processing flow returns to step S100. If the abort condition is not satisfied, which corresponds to a "No" answer in step S800, the processing flow returns to step S500.
- the predetermined abort condition may be, for example, that no gesture has been detected for the fourth predetermined time period. If the predetermined abort condition is satisfied in step S800, the display element indicating the controlling of the function is no longer displayed on the display unit.
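For illustration only: the processing flow of Fig. 5 reads naturally as a small state machine. The sketch below assumes per-frame observations whose boolean fields come from predicates like those sketched earlier; all names, the display stubs and the abort timeout value are hypothetical.

```python
import time

IDLE, ACTIVATED = range(2)

class GestureControl:
    """Illustrative state machine for steps S100-S800 of Fig. 5."""

    def __init__(self, side, abort_after_s=3.0):
        self.side = side                    # "driver" or "passenger"
        self.state = IDLE
        self.abort_after_s = abort_after_s  # the "fourth predetermined time period"
        self.last_gesture_ts = time.monotonic()

    def process(self, obs):
        """obs.any_gesture / obs.first_gesture / obs.second_gesture are the
        per-frame results of the gesture predicates (hypothetical fields)."""
        now = time.monotonic()
        if obs.any_gesture:
            self.last_gesture_ts = now

        if self.state == IDLE:
            # S100/S200: wait for the static first gesture, then S300: activate.
            if obs.first_gesture:
                self.show_activation_display()
                self.state = ACTIVATED
        else:
            # S400/S800: abort when no gesture was seen for the timeout period.
            if now - self.last_gesture_ts > self.abort_after_s:
                self.hide_display()
                self.state = IDLE
                return
            # S500/S600: wait for the dynamic second gesture, then S700: control.
            if obs.second_gesture:
                self.control_function()

    # Stubs standing in for actions on the display unit and the controlled function.
    def show_activation_display(self): pass
    def hide_display(self): pass
    def control_function(self): pass
```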
- the method described above can be implemented by means that form a device for operating functions in a vehicle.
- the display unit is preferably a central display of the vehicle, preferably of a motor vehicle.
- an application of the above-described embodiment is, for example, switching back and forth between a menu, such as a main menu, a radio station or a medium, such as a CD, in a central telematics unit of the vehicle by means of gestures, i.e. hand or finger movements of the user, without touching a display, such as a touchscreen, or a control element, such as a touchpad.
- a user's learning process may be assisted by visual and/or audible feedback during gesture operation, allowing the user to operate blind after a learning phase.
- the user can switch off such visual and/or audible feedback manually, or such feedback can be turned off automatically, for example for a predetermined period of time after a correct gesture operation by the user has been detected.
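For illustration only: this automatic switch-off could be a simple mute timer; the duration is an invented value.

```python
class FeedbackGate:
    """Mute the learning feedback for a fixed period after each correct
    gesture operation, enabling 'blind' operation for practised users."""

    def __init__(self, mute_for_s=30.0):
        self.mute_for_s = mute_for_s
        self.muted_until = 0.0

    def on_correct_operation(self, now_s):
        self.muted_until = now_s + self.mute_for_s

    def feedback_enabled(self, now_s):
        return now_s >= self.muted_until
```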
- the image-based gesture operation described above realizes simple and fast operability, which increases operating comfort, operating flexibility and the operating experience for the user, and considerably increases design freedom for the vehicle interior.
- the embodiment described above can be realized as a computer program product, such as a storage medium, which is designed to carry out a method according to the preceding embodiment in cooperation with a computer or a plurality of computers, that is to say computer systems, or other computing units.
- the computer program product may be configured to execute the method only after performing a predetermined routine, such as a setup routine.
Abstract
The invention relates to a method for controlling functions in a vehicle using gestures performed in three-dimensional space. The method comprises: determining (S100) whether or not a first gesture performed in three-dimensional space has been detected by means of an image-based detection procedure; determining (S200) whether or not the first gesture is a gesture assigned to activating a control of a function, if it is determined that the first gesture has been detected; activating (S300) the control of the function if it is determined that the detected first gesture is the gesture assigned to activating the control of the function; determining (S500) whether or not a second gesture performed in three-dimensional space is detected by means of the image-based detection procedure; determining (S600) whether or not the detected second gesture is a gesture assigned to controlling the function, if it is determined that the second gesture has been detected; and controlling (S700) the function if it is determined that the detected first gesture is the gesture assigned to activating the control of the function and if it is determined that the detected second gesture is the gesture assigned to controlling the function. The present invention also relates to a corresponding device and a corresponding computer program product.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE102012000263A DE102012000263A1 (de) | 2012-01-10 | 2012-01-10 | Verfahren und Vorrichtung zum Bedienen von Funktionen in einem Fahrzeug unter Verwendung von im dreidimensionalen Raum ausgeführten Gesten sowie betreffendes Computerprogrammprodukt |
PCT/EP2012/005080 WO2013104389A1 (fr) | 2012-01-10 | 2012-12-08 | Procédé et dispositif de commande de fonctions dans un véhicule à l'aide de gestes effectués dans l'espace tridimensionnel ainsi que produit-programme d'ordinateur correspondant |
Publications (1)
Publication Number | Publication Date |
---|---|
EP2802963A1 true EP2802963A1 (fr) | 2014-11-19 |
Family
ID=47504797
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP12810080.7A Withdrawn EP2802963A1 (fr) | 2012-01-10 | 2012-12-08 | Procédé et dispositif de commande de fonctions dans un véhicule à l'aide de gestes effectués dans l'espace tridimensionnel ainsi que produit-programme d'ordinateur correspondant |
Country Status (5)
Country | Link |
---|---|
US (1) | US20140361989A1 (fr) |
EP (1) | EP2802963A1 (fr) |
CN (1) | CN104040464A (fr) |
DE (1) | DE102012000263A1 (fr) |
WO (1) | WO2013104389A1 (fr) |
Families Citing this family (20)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE102013016490B4 (de) * | 2013-10-02 | 2024-07-25 | Audi Ag | Kraftfahrzeug mit berührungslos aktivierbarem Handschrifterkenner |
DE102013223540A1 (de) | 2013-11-19 | 2015-05-21 | Bayerische Motoren Werke Aktiengesellschaft | Auswahl von Menüeinträgen über Freiraumgesten |
KR20150087544A (ko) * | 2014-01-22 | 2015-07-30 | 엘지이노텍 주식회사 | 제스처 장치, 그 동작 방법 및 이를 구비한 차량 |
DE102014202833A1 (de) * | 2014-02-17 | 2015-08-20 | Volkswagen Aktiengesellschaft | Anwenderschnittstelle und Verfahren zum Wechseln von einem ersten Bedienmodus einer Anwenderschnittstelle in einen 3D-Gesten-Modus |
DE102014202834A1 (de) | 2014-02-17 | 2015-09-03 | Volkswagen Aktiengesellschaft | Anwenderschnittstelle und Verfahren zum berührungslosen Bedienen eines in Hardware ausgestalteten Bedienelementes in einem 3D-Gestenmodus |
DE102014202836A1 (de) | 2014-02-17 | 2015-08-20 | Volkswagen Aktiengesellschaft | Anwenderschnittstelle und Verfahren zur Unterstützung eines Anwenders bei der Bedienung einer Anwenderschnittstelle |
DE102014006945A1 (de) | 2014-05-10 | 2015-11-12 | Audi Ag | Fahrzeugsystem, Fahrzeug und Verfahren zum Reagieren auf Gesten |
DE102014013763A1 (de) | 2014-09-05 | 2016-03-10 | Daimler Ag | Bedienvorrichtung und Verfahren zum Bedienen von Funktionen eines Fahrzeugs, insbesondere eines Kraftwagens |
KR101556521B1 (ko) * | 2014-10-06 | 2015-10-13 | 현대자동차주식회사 | 휴먼 머신 인터페이스 장치, 그를 가지는 차량 및 그 제어 방법 |
DE102014221053B4 (de) | 2014-10-16 | 2022-03-03 | Volkswagen Aktiengesellschaft | Verfahren und Vorrichtung zum Bereitstellen einer Benutzerschnittstelle in einem Fahrzeug |
US9550406B2 (en) | 2015-03-16 | 2017-01-24 | Thunder Power Hong Kong Ltd. | Thermal dissipation system of an electric vehicle |
US9547373B2 (en) * | 2015-03-16 | 2017-01-17 | Thunder Power Hong Kong Ltd. | Vehicle operating system using motion capture |
DE102015006613A1 (de) | 2015-05-21 | 2016-11-24 | Audi Ag | Bediensystem und Verfahren zum Betreiben eines Bediensystems für ein Kraftfahrzeug |
FR3048933B1 (fr) * | 2016-03-21 | 2019-08-02 | Valeo Vision | Dispositif de commande d'eclairage interieur d'un vehicule automobile |
CN106959747B (zh) * | 2017-02-14 | 2020-02-18 | 深圳奥比中光科技有限公司 | 三维人体测量方法及其设备 |
CN106933352A (zh) * | 2017-02-14 | 2017-07-07 | 深圳奥比中光科技有限公司 | 三维人体测量方法和其设备及其计算机可读存储介质 |
KR20210125631A (ko) * | 2020-04-08 | 2021-10-19 | 현대자동차주식회사 | 단말기, 단말기와 통신하는 퍼스널 모빌리티 및 그의 제어 방법 |
CN111880660B (zh) * | 2020-07-31 | 2022-10-21 | Oppo广东移动通信有限公司 | 显示画面的控制方法、装置、计算机设备和存储介质 |
WO2022021432A1 (fr) | 2020-07-31 | 2022-02-03 | Oppo广东移动通信有限公司 | Procédé de commande gestuelle et dispositif associé |
DE102022121742A1 (de) * | 2022-08-29 | 2024-02-29 | Bayerische Motoren Werke Aktiengesellschaft | Steuern einer Funktion an Bord eines Kraftfahrzeugs |
Family Cites Families (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20080065291A1 (en) | 2002-11-04 | 2008-03-13 | Automotive Technologies International, Inc. | Gesture-Based Control of Vehicular Components |
JP3903968B2 (ja) * | 2003-07-30 | 2007-04-11 | 日産自動車株式会社 | 非接触式情報入力装置 |
JP3752246B2 (ja) * | 2003-08-11 | 2006-03-08 | 学校法人慶應義塾 | ハンドパターンスイッチ装置 |
JP4389855B2 (ja) * | 2005-09-05 | 2009-12-24 | トヨタ自動車株式会社 | 車両用操作装置 |
US7834847B2 (en) * | 2005-12-01 | 2010-11-16 | Navisense | Method and system for activating a touchless control |
CN101055193A (zh) * | 2006-04-12 | 2007-10-17 | 株式会社日立制作所 | 车载装置的非接触输入操作装置 |
US8972902B2 (en) * | 2008-08-22 | 2015-03-03 | Northrop Grumman Systems Corporation | Compound gesture recognition |
JP5430572B2 (ja) * | 2007-09-14 | 2014-03-05 | インテレクチュアル ベンチャーズ ホールディング 67 エルエルシー | ジェスチャベースのユーザインタラクションの処理 |
CN102112945B (zh) * | 2008-06-18 | 2016-08-10 | 奥布隆工业有限公司 | 用于交通工具接口的基于姿态的控制系统 |
DE102008048825A1 (de) * | 2008-09-22 | 2010-03-25 | Volkswagen Ag | Anzeige- und Bediensystem in einem Kraftfahrzeug mit nutzerbeeinflussbarer Darstellung von Anzeigeobjekten sowie Verfahren zum Betreiben eines solchen Anzeige- und Bediensystems |
DE102009046376A1 (de) * | 2009-11-04 | 2011-05-05 | Robert Bosch Gmbh | Fahrerassistenzsystem für ein Fahrzeug mit einer Eingabevorrichtung zur Eingabe von Daten |
CN102236409A (zh) * | 2010-04-30 | 2011-11-09 | 宏碁股份有限公司 | 基于影像的动作手势辨识方法及系统 |
CN102221891A (zh) * | 2011-07-13 | 2011-10-19 | 广州视源电子科技有限公司 | 实现视像手势识别的方法和系统 |
CN103782255B (zh) * | 2011-09-09 | 2016-09-28 | 泰利斯航空电子学公司 | 交通工具娱乐系统的眼动追踪控制 |
-
2012
- 2012-01-10 DE DE102012000263A patent/DE102012000263A1/de not_active Withdrawn
- 2012-12-08 US US14/371,090 patent/US20140361989A1/en not_active Abandoned
- 2012-12-08 EP EP12810080.7A patent/EP2802963A1/fr not_active Withdrawn
- 2012-12-08 WO PCT/EP2012/005080 patent/WO2013104389A1/fr active Application Filing
- 2012-12-08 CN CN201280066638.4A patent/CN104040464A/zh active Pending
Non-Patent Citations (1)
Title |
---|
See references of WO2013104389A1 * |
Also Published As
Publication number | Publication date |
---|---|
WO2013104389A1 (fr) | 2013-07-18 |
US20140361989A1 (en) | 2014-12-11 |
DE102012000263A1 (de) | 2013-07-11 |
CN104040464A (zh) | 2014-09-10 |
Legal Events
Date | Code | Title | Description
---|---|---|---
| PUAI | Public reference made under article 153(3) EPC to a published international application that has entered the European phase | Free format text: ORIGINAL CODE: 0009012
| 17P | Request for examination filed | Effective date: 20140605
| AK | Designated contracting states | Kind code of ref document: A1; Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR
| RIN1 | Information on inventor provided before grant (corrected) | Inventor name: ENTENMANN, VOLKER; Inventor name: ZHANG-XU, TINGTING
| DAX | Request for extension of the European patent (deleted) |
| STAA | Information on the status of an EP patent application or granted EP patent | Free format text: STATUS: THE APPLICATION HAS BEEN WITHDRAWN
| 18W | Application withdrawn | Effective date: 20170824