WO2016120251A1 - Method for operating an input device, input device - Google Patents

Method for operating an input device, input device

Info

Publication number
WO2016120251A1
Authority
WO
WIPO (PCT)
Prior art keywords
hand
input
detected
palm
input device
Prior art date
Application number
PCT/EP2016/051531
Other languages
German (de)
English (en)
Inventor
Markus Langenberg
Claus Marberger
Original Assignee
Robert Bosch GmbH
Priority date
Filing date
Publication date
Application filed by Robert Bosch GmbH
Publication of WO2016120251A1

Classifications

    • G06F3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G06F3/042 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
    • B60K35/00 Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
    • B60K35/10 Input arrangements, i.e. from user to vehicle, associated with vehicle functions or specially adapted therefor
    • B60K35/29 Instruments characterised by the way in which information is handled, e.g. showing information on plural displays or prioritising information according to driving conditions
    • B60K2360/141 Activation of instrument input devices by approaching fingers or pens
    • B60K2360/1438 Touch screens
    • B60K2360/146 Instrument input by gesture
    • B60K2360/1464 3D-gesture
    • B60K2360/18 Information management
    • B60K2360/21 Optical features of instruments using cameras

Definitions

  • The invention relates to a method for operating an input device, in particular of a motor vehicle, comprising at least one camera sensor for the contactless detection of the position and/or change of position of at least one finger of a user's hand, wherein an input is recognized as a function of the detected position and/or change of position and is executed.
  • The invention further relates to an input device with such a device, and to a motor vehicle.
  • Display devices are preferably mounted in the upper part of a control panel or dashboard of the motor vehicle, so that the driver does not have to turn his gaze too far away from the traffic in order to read them.
  • It is therefore advantageous to arrange a touch-sensitive sensor in the area of the driver's armrest, while leaving the display device in its usual place in the area of the dashboard.
  • Visual feedback to the driver when operating the sensor can be given in the form of an indicated transparent hand shown by the display device.
  • In this way the driver can conveniently operate the input device while the display is still presented to him at an advantageous viewing angle.
  • It is also conceivable not to design the display device as a screen, but as a head-up display. While classic touch-sensitive sensors or touchpads require a touch by the user for their operation, input devices are also known in which the positions of a user's hand, fingers and/or arm are detected contactlessly in space and evaluated for gesture operation.
  • Finger gestures require a high resolution, which can be achieved with sensors such as time-of-flight sensors, stereo cameras, structured light, or the like.
  • For hand gestures, sensors with a lower resolution can also be used, such as radar sensors.
  • With such input devices, the position or change of position of a user's hand is detected, an input being recognized as a function of the detected position and/or change of position and then executed. This is already described, for example, in the published document DE 20 2012 005 255 U1.
  • The user thus indicates to the input device, with a movement of his hand or of at least one finger of his hand, which input he wants to make.
  • The input device recognizes the desired input on the basis of the finger movement and executes it by implementing the command given by the movement, for example by changing an operating parameter of the motor vehicle. Thus, depending on the position and change of position of a finger, an input can be performed by which, for example, the volume of an entertainment system of the motor vehicle is increased.
  • The method according to the invention with the features of claim 1 has the advantage that a desired input performed in space can easily be distinguished from a mere return movement performed in space.
  • The execution of an intended hand movement is generally associated with a necessary return movement. Because of the similarity of the two movements, it is difficult to recognize correctly the gesture intended by the user.
  • If only one camera sensor designed as a mono camera, i.e. a camera which does not provide depth information, is used for the image evaluation to detect the movement of the hand, it is not possible to define a virtual plane in space in relation to which the user's hand movement could be evaluated.
  • The invention provides that, when a hand with at least substantially flat outstretched fingers is detected, a palm of the hand is determined, and that an input is recognized only when the hand is moved in the direction of the palm of the hand. It is thus provided that, first of all, the palm of the hand is determined from the image captured by the camera sensor, and then a movement of the hand is compared with the orientation of the palm relative to the hand. Only when the hand is moved in the direction of the palm is an input recognized; if the hand is moved in the opposite direction, the movement is evaluated as a return movement and no input is triggered.
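  • The comparison just described can be pictured as a projection of the hand's displacement onto the palm direction. The following is a minimal sketch, assuming both have already been estimated as 2D vectors in image coordinates (the function name and jitter threshold are illustrative, not from the patent):

        import numpy as np

        def classify_movement(displacement, palm_direction, min_distance=15.0):
            # Return 'input' when the hand moves towards its palm side,
            # 'return' when it moves away, None for negligible motion.
            d = np.asarray(displacement, dtype=float)
            p = np.asarray(palm_direction, dtype=float)
            if np.linalg.norm(d) < min_distance:   # ignore jitter
                return None
            # positive projection: the motion has a component towards the palm
            return "input" if float(np.dot(d, p)) > 0.0 else "return"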
  • Preferably, the palm of the hand is determined as a function of an orientation of the thumb of the hand relative to the fingers of the hand.
  • In the image evaluation, individual parts of the hand are thus preferably first determined and, where appropriate, marked.
  • In particular, the thumb of the hand is determined.
  • In the image detected by the camera sensor, it is recognized that the palm faces to the left when the hand is vertically aligned and the thumb points upwards; that the palm faces to the right when the hand is vertically aligned and the thumb is not recognizable; that the palm faces upwards when the hand is oriented horizontally and the thumb points to the right; and that the palm faces downwards when the hand is horizontal and the thumb points to the left.
  • The vertical or horizontal arrangement of the hand can in particular be determined in a simple manner in the image evaluation. By the procedure described, it is thus possible to determine the orientation of the hand, and thus the position of the palm, in a simple manner.
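  • The four cases just listed amount to a small lookup table. A sketch (the names and encoding are illustrative, not from the patent):

        # (hand orientation, thumb observation) -> side the palm faces
        PALM_FROM_THUMB = {
            ("vertical", "thumb_up"):          "left",
            ("vertical", "thumb_not_visible"): "right",
            ("horizontal", "thumb_right"):     "up",
            ("horizontal", "thumb_left"):      "down",
        }

        def palm_side(hand_orientation, thumb_observation):
            return PALM_FROM_THUMB.get((hand_orientation, thumb_observation))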
  • Preferably, the input is recognized only when the thumb of the hand is stretched out.
  • This assumes that the thumb is splayed away from the hand, that is, stretched out.
  • In this way, intended hand movements can be distinguished from unintended ones, so that an erroneous input is avoided.
  • An activation of the gesture detection by the input device takes place in that the user introduces his hand into the detection range of the camera sensor and spreads the thumb. If the hand is oriented horizontally in the detection range of the sensor, or if the fingertips lie one behind the other as seen from the direction of the camera sensor, the extended position of the thumb can be inferred from the position of the thumb tip relative to the hand and/or the remaining fingers of the hand, and/or from a detected movement of the thumb tip which leads to a shortening of the detected thumb in the plan view.
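  • One way to read the criterion of a "shortening of the detected thumb in the plan view" is to track the apparent distance between thumb tip and hand over recent frames. A sketch under that assumption (threshold and names are illustrative):

        import numpy as np

        def thumb_extended(thumb_tip_history, hand_centroid, hand_span,
                           min_ratio=0.35):
            # Thumb counts as extended while the thumb tip stays far from
            # the hand centroid relative to the overall hand size.
            # Assumes a non-empty history of recent (x, y) tip positions.
            c = np.asarray(hand_centroid, dtype=float)
            lengths = [np.linalg.norm(np.asarray(t, dtype=float) - c)
                       for t in thumb_tip_history]
            return min(lengths) > min_ratio * hand_span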
  • Further preferably, the palm of the hand is determined as a function of the visibility of fingernails of the hand.
  • For this purpose, fingernails of the hand are located in the image evaluation. Owing to the typical contours of fingernails on the fingers, these can easily be detected and their position determined. If the hand is vertical, the palm is located on the side of the hand opposite the fingernails. If the hand is aligned horizontally, so that the fingertips are juxtaposed in the image of the camera sensor, it is recognized that the palm of the hand faces away from the sensor when the fingernails of the fingers are detected, and points towards the sensor when no fingernails can be detected.
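  • The fingernail criterion likewise reduces to a few cases. A sketch, using the same illustrative encoding as above:

        def palm_from_nails(hand_orientation, nail_side, nails_visible):
            # vertical hand: the palm lies opposite the detected fingernails
            if hand_orientation == "vertical":
                if nail_side == "right":
                    return "left"
                if nail_side == "left":
                    return "right"
            # horizontal hand, fingertips side by side: visible nails mean
            # the palm faces away from the sensor
            if hand_orientation == "horizontal":
                return "away_from_sensor" if nails_visible else "towards_sensor"
            return None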
  • Furthermore, the input is preferably recognized only when the fingers of the hand are, or become, spread out in a plane.
  • This facilitates the recognition of the individual fingers in the image of the camera sensor. It is preferably also detected from which side the hand is, or was, guided into the detection range of the camera sensor. This offers the possibility of detecting whether a movement performed in the detection range of the sensor is performed in particular by a driver of the motor vehicle or by a front passenger.
  • The device according to the invention with the features of claim 7 is characterized by a control unit which, when used as intended, performs the method according to the invention. This results in the advantages already mentioned.
  • The input device according to the invention with the features of claim 8 is characterized by the device according to the invention, resulting in the advantages already mentioned.
  • The motor vehicle according to the invention with the features of claim 9 is characterized by an input device according to the invention.
  • Figure 1 shows the interior of a motor vehicle with an input device.
  • Figures 2A to 2D show exemplary inputs recognized by the input device.
  • Figure 1 shows a schematic representation of the interior of a motor vehicle 1, not depicted in more detail here, which has an input device 2 for the contactless input of control commands.
  • For this purpose, the input device has a contactlessly operating sensor 3 and a display unit 4. The display unit 4 is arranged in the dashboard of the motor vehicle 1.
  • The display unit 4 is designed as a screen, in particular a display, and can for example be part of a navigation system or an entertainment system of the motor vehicle 1. It is also conceivable to design the display unit 4 alternatively or additionally as a head-up display (HUD).
  • The non-contact sensor 3 is preferably designed as a two-dimensional video camera or camera device having the detection range shown in dashed lines. The video camera is preferably oriented so that it points towards the front end of a central armrest 5 of the motor vehicle 1.
  • The armrest 5 itself has no real input surface on which the driver could enter a command by touching the input surface with a hand 6, shown here only schematically.
  • The input device 2 is designed to detect a position and/or change of position of at least one finger of the hand 6 in space and to recognize and execute an input as a function thereof. For this purpose, it is provided that the input device 2 has a control unit 8 which evaluates the data detected by the camera sensor 3 and recognizes an input of the driver only under certain conditions.
  • In particular, the driver's hand 6 in the detection range of the sensor 3 is monitored as to whether the hand is stretched out, i.e. whether the fingers lie essentially in a plane next to each other, and as to whether the driver performs a movement with the entire hand 6 in the detection range of the sensor 3. It is provided here that, for detected movements of the hand 6, the input device 2 distinguishes between an input and a return movement.
  • Figures 2A to 2D show different examples of hand movements which are recognized as inputs. It is provided that an input is recognized only when the user's hand 6 is moved in the direction of its palm 9, and that a movement of the hand 6 away from the palm 9 is evaluated as a mere return movement.
  • Figure 2A shows a hand 6 in a first exemplary embodiment; shown is, for example, the image captured by the camera sensor.
  • The hand contour of the hand 6 is detected by digital image analysis methods. As discussed in more detail later, for example a background modeling is performed for the hand contour calculation in order to detect the fingers and fingertips of the hand 6 and their orientation.
  • The hand 6 is checked as to whether it is extended, so that the fingers 10, 11, 12, 13 of the hand 6 are aligned side by side in a plane, and as to whether the thumb 14 is stretched out, that is, oriented away from the hand 6.
  • If this is the case, the input device 2 is activated to detect an input.
  • The gesture "open hand with outstretched thumb" is thus a static gesture which is detected after a short dwell time in the detection range of the camera sensor 3 and which activates the gesture recognition of the input device 2.
  • The active state is preferably communicated to the user acoustically or visually.
  • Figure 2A shows by way of example that the user moves his hand 6 downwards in the direction of the palm 9, which there faces downwards; Figure 2B correspondingly shows an upward movement of the hand 6 with the palm 9 facing upwards.
  • Figure 2C shows an exemplary embodiment in which the user moves his hand 6 to the left, in the direction of the palm 9, according to an arrow 17.
  • Figure 2D shows an exemplary embodiment in which the user moves his hand 6 to the right, in the direction of the palm 9, according to an arrow 18.
  • The directional indications upwards, downwards, to the left and to the right used above relate to the image detected by the camera sensor, as shown in Figures 2A to 2D.
  • In each case the user has rotated his hand 6 so that he always moves the hand in the direction of his palm 9; these movements are therefore always recognized as inputs. If the user were to move his hand 6 in the respectively opposite direction, this would be evaluated as a return movement.
  • A downward-pointing thumb 14 can be determined in a simple manner by the fact that it cannot be detected by the camera sensor 3, because it is hidden by the remaining fingers 10 to 13 of the hand 6.
  • An upward orientation of the palm of the hand corresponds to a linear horizontal arrangement of the fingertips of the fingers 10 to 13, as shown in Figure 2B, with the thumb facing to the left.
  • An orientation of the palm 9 downwards, as shown in Figure 2A, likewise corresponds to a linear horizontal arrangement of the fingertips of the fingers 10 to 13. If the fingers are arranged vertically, so that they essentially lie one behind the other as viewed from the camera sensor 3, as shown in Figures 2C and 2D, the fingernails are evaluated instead: if fingernails are recognized on the right-hand side, it is determined that the palm 9 points to the left; if fingernails are detected on the left-hand side, it is determined that the palm 9 points to the right. This is of course only possible if the fingernails can be detected by the camera sensor 3.
  • The interpretation of the movement as a gesture or input is based on the analysis of the movement speed and the direction of movement. Thresholds for the recognition of an intended input may relate to the intensity of the optical flow which the hand generates in the camera image or video image of the camera sensor 3, or to the course of the movement (linear, rotatory, minimum movement distance, or the like).
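  • The optical-flow intensity mentioned here can be estimated densely, for example with the Farnebäck method available in OpenCV; the patent does not prescribe a particular estimator, so the following is only a sketch with illustrative parameters:

        import cv2
        import numpy as np

        def flow_intensity_and_direction(prev_gray, gray, mag_thresh=1.0):
            # Dense Farneback optical flow between two grayscale frames;
            # returns the mean flow magnitude (the "intensity") and the
            # median direction of the moving pixels, in radians.
            flow = cv2.calcOpticalFlowFarneback(prev_gray, gray, None,
                                                0.5, 3, 15, 3, 5, 1.2, 0)
            mag, ang = cv2.cartToPolar(flow[..., 0], flow[..., 1])
            moving = mag > mag_thresh
            direction = float(np.median(ang[moving])) if moving.any() else None
            return float(mag.mean()), direction

    An intended input could then, for example, require the mean magnitude to exceed a tuned threshold over several consecutive frames with a roughly constant direction.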
  • The distinction as to whether the detected hand belongs to the driver or to the front passenger of the motor vehicle 1 is advantageously made as a function of the direction from which the hand 6 enters the detection range of the camera sensor 3. For this purpose, motion-history data of the hand 6 are used or, in the case of a sufficiently large detection range of the camera sensor 3, the relative position of further arm features, such as, for example, the elbow of the arm bearing the hand.
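  • A sketch of the entry-direction heuristic, assuming a left-hand-drive vehicle in which the driver's hand enters from the left of the camera image (the side assignment is an assumption, not from the patent):

        def hand_owner(entry_x, image_width):
            # entry_x: x coordinate where the hand centroid first appeared
            return "driver" if entry_x < image_width / 2 else "passenger"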
  • In order to detect the hand 6 in the image, a background modeling of the image acquired by the sensor 3 is performed according to Zivkovic (Zivkovic, Zoran: "Improved Adaptive Gaussian Mixture Model for Background Subtraction", ICPR, 2004).
  • The background is understood to be the static, non-changing image of the motor vehicle interior detected by the sensor 3.
  • The static image serves as a reference image which is subtracted from a currently acquired image at each point in time, so that only picture elements, or pixels, which have changed remain visible in the difference image.
  • The foreground is the hand 6 of a user moving in the detection range of the sensor 3, as shown by way of example in the figures.
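  • The Zivkovic model is available in OpenCV as the MOG2 background subtractor, so the foreground mask can be sketched as follows (parameter values are illustrative, not from the patent):

        import cv2

        subtractor = cv2.createBackgroundSubtractorMOG2(history=500,
                                                        varThreshold=16,
                                                        detectShadows=False)

        cap = cv2.VideoCapture(0)                # stand-in for the cabin camera
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            fg_mask = subtractor.apply(frame)    # nonzero where pixels changed
            fg_mask = cv2.medianBlur(fg_mask, 5) # suppress sensor noise
            cv2.imshow("foreground", fg_mask)
            if cv2.waitKey(1) == 27:             # Esc to quit
                break
        cap.release()
        cv2.destroyAllWindows()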
  • Alternatively or additionally, a foreground modeling according to Dadgostar and Sarrafzadeh (Dadgostar, Farhad and Sarrafzadeh, Abdolhossein: "An adaptive real-time skin detector based on Hue thresholding: A comparison on two motion tracking methods", Pattern Recognition Letters, 2006, pages 1342-1352) can be performed, in which the foreground regions relevant for the hand recognition are detected by special skin color models, the hand being detected by virtue of the color difference between the hand and the background.
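  • A fixed-threshold variant of such a skin color model can be sketched in a few lines (the cited detector adapts its hue thresholds at runtime; the values below are illustrative and ignore the hue wrap-around of some skin tones):

        import cv2
        import numpy as np

        def skin_mask(frame_bgr, hue_lo=0, hue_hi=20, sat_lo=40, val_lo=60):
            hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
            lower = np.array([hue_lo, sat_lo, val_lo], dtype=np.uint8)
            upper = np.array([hue_hi, 255, 255], dtype=np.uint8)
            mask = cv2.inRange(hsv, lower, upper)
            # close small holes so the hand forms one connected region
            kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (7, 7))
            return cv2.morphologyEx(mask, cv2.MORPH_CLOSE, kernel)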
  • From the foreground determined in this way, a hand contour of the hand 6 is determined and, in particular, the centroid of the hand contour is also calculated.
  • For this purpose, a hand contour segmentation according to Suzuki and Abe (Suzuki, S. and Abe, K.: "Topological Structural Analysis of Digitized Binary Images by Border Following", CVGIP, 1985, pages 32-46) is advantageously performed, from which fingers and fingertips of the hand 6 are detected. Contours are recalculated for the fingertip candidates found in this way, their centroids representing the position of the final fingertip. If several false fingertips were detected as a result of over-segmentation, they are discarded on the basis of geometric heuristics and of the distance to the centroid of the hand contour. In this way, fingers and fingertips can be detected in a simple manner.
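  • OpenCV's findContours implements exactly this Suzuki-Abe border following, so the contour and fingertip step can be sketched as follows (the convex-hull heuristic and all thresholds are illustrative stand-ins for the geometric heuristics described above):

        import cv2
        import numpy as np

        def fingertips_from_mask(fg_mask, min_area=2000):
            contours, _ = cv2.findContours(fg_mask, cv2.RETR_EXTERNAL,
                                           cv2.CHAIN_APPROX_SIMPLE)
            if not contours:
                return []
            hand = max(contours, key=cv2.contourArea)
            if cv2.contourArea(hand) < min_area:
                return []
            m = cv2.moments(hand)                  # centroid of the hand contour
            cx, cy = m["m10"] / m["m00"], m["m01"] / m["m00"]
            tips = []
            for pt in cv2.convexHull(hand).reshape(-1, 2):
                # keep hull points far from the centroid, merge near-duplicates
                if np.hypot(pt[0] - cx, pt[1] - cy) > 80 and \
                   all(np.hypot(pt[0] - t[0], pt[1] - t[1]) > 20 for t in tips):
                    tips.append((int(pt[0]), int(pt[1])))
            return tips[:5]                        # at most five fingertips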
  • The detection of the thumb 14 is advantageously carried out by comparing geometric features, such as the direction and length of the thumb 14, with those of the other recognized fingers 10 to 13.
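  • As a sketch of such a geometric comparison, the thumb can be taken to be the detected fingertip whose direction from the hand centroid deviates most from the average fingertip direction (a heuristic assumption, not the patent's exact rule):

        import numpy as np

        def find_thumb(tips, centroid):
            tips = np.asarray(tips, dtype=float)
            c = np.asarray(centroid, dtype=float)
            dirs = tips - c
            dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
            mean_dir = dirs.mean(axis=0)
            mean_dir /= np.linalg.norm(mean_dir)
            # smallest dot product = largest angle to the mean direction
            idx = int(np.argmin(dirs @ mean_dir))
            return tuple(tips[idx])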
  • The method described is particularly suitable for swipe gestures performed upwards and downwards, as shown in Figures 2A and 2B, because the fingers in the image must not overlap. If all five fingers 10 to 14 are found with a minimum distance from each other, the gesture recognition is considered activated. Alternatively, it can also be provided that the fingers 10 to 13 abut each other and only the thumb 14, as previously described, is spread apart. The position and orientation of the thumb 14 is then decisive for the possible input, as previously discussed. If a movement of all five fingertips in the "allowed" swipe direction then follows in subsequent images, this is recognized as a "permissible" input; if a movement of all five fingertips takes place in the opposite direction, it is evaluated as a return movement and no input is triggered.
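  • The frame-to-frame check on all five fingertips can be sketched as follows (the track format, travel threshold and the roughly 45-degree direction cone are illustrative assumptions):

        import numpy as np

        def validate_swipe(tip_tracks, allowed_dir, min_travel=40.0):
            # tip_tracks: five lists of (x, y) fingertip positions over frames;
            # allowed_dir: 2D vector pointing towards the palm side.
            dirs = []
            for track in tip_tracks:
                d = np.asarray(track[-1], float) - np.asarray(track[0], float)
                n = np.linalg.norm(d)
                if n < min_travel:               # too little motion: no decision
                    return None
                dirs.append(d / n)
            mean_dir = np.mean(dirs, axis=0)
            mean_dir /= np.linalg.norm(mean_dir)
            a = np.asarray(allowed_dir, float)
            a /= np.linalg.norm(a)
            # all fingertips must share the allowed (palm-side) direction
            return "input" if float(mean_dir @ a) > 0.7 else "return"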
  • An alternative or additional system variant can recognize all four directions and preferably offers the following work chain: first, in a training stage, a large amount of training data is recorded, object classes being provided on the basis of the hand, or object classes being determined on the basis of the finger type. With this variant, the activation and the possible swipe gesture/input do not follow directly from the object class; instead, similarly to the recognition based on the hand contour, the gesture/input must be deduced from the presence and geometric constellation of the detected fingertips.
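  • The patent does not specify an architecture for this trained stage; in line with the max-pooling CNN approach cited in the non-patent literature (Nagi et al., ICSIPA 2011), a purely illustrative sketch of a small convolutional classifier for such object classes could look as follows:

        import torch
        import torch.nn as nn

        class FingertipClassifier(nn.Module):
            # Illustrative CNN mapping a 64x64 grayscale patch to one of
            # several object classes (e.g. fingertip types vs. background).
            def __init__(self, num_classes=6):
                super().__init__()
                self.features = nn.Sequential(
                    nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                    nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                    nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                )
                self.head = nn.Linear(64 * 8 * 8, num_classes)

            def forward(self, x):                # x: (batch, 1, 64, 64)
                return self.head(self.features(x).flatten(1))

        # usage: logits = FingertipClassifier()(torch.randn(1, 1, 64, 64))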
  • Advantageously, the orientation of the palm 9 can be derived directly from the direction of the fingers 10 to 14 or of the fingernails.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Chemical & Material Sciences (AREA)
  • Combustion & Propulsion (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention relates to a method for operating an input device (2), in particular of a motor vehicle (1), comprising at least one camera sensor (3) for the contactless detection of the position and/or of a change of position of at least one finger (10-14) of a user's hand (6). An input is recognized as a function of the detected position and/or change of position, and this input is executed. According to the invention, when the hand (6) is detected with the fingers (10-14) extended at least substantially flat, a palm (9) of the hand is determined, and an input is recognized only if the hand (6) is moved in the direction of the palm (9).
PCT/EP2016/051531 2015-01-30 2016-01-26 Procédé pour faire fonctionner un dispositif de saisie, dispositif de saisie WO2016120251A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102015201613.7A DE102015201613A1 (de) 2015-01-30 2015-01-30 Verfahren und Vorrichtung zum Betreiben einer Eingabevorrichtung, Eingabevorrichtung
DE102015201613.7 2015-01-30

Publications (1)

Publication Number Publication Date
WO2016120251A1 true WO2016120251A1 (fr) 2016-08-04

Family

ID=55237642

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2016/051531 WO2016120251A1 (fr) 2015-01-30 2016-01-26 Procédé pour faire fonctionner un dispositif de saisie, dispositif de saisie

Country Status (2)

Country Link
DE (1) DE102015201613A1 (fr)
WO (1) WO2016120251A1 (fr)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10281990B2 (en) * 2016-12-07 2019-05-07 Ford Global Technologies, Llc Vehicle user input control system and method
DE102017201312A1 (de) 2017-01-27 2018-08-02 Audi Ag Verfahren zum Steuern eines Temperierungselements eines Behälterhalters
CN110334561B (zh) * 2018-03-31 2023-05-23 广州卓腾科技有限公司 一种控制对象旋转的手势控制方法

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5030580B2 (ja) * 2006-12-27 2012-09-19 タカタ株式会社 車両用作動システム、車両
US8768006B2 (en) * 2010-10-19 2014-07-01 Hewlett-Packard Development Company, L.P. Hand gesture recognition
JP5969460B2 (ja) * 2011-03-14 2016-08-17 聖 星野 爪領域検出方法、プログラム、記憶媒体、及び爪領域検出装置
AU2013205613B2 (en) * 2012-05-04 2017-12-21 Samsung Electronics Co., Ltd. Terminal and method for controlling the same based on spatial interaction

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130267318A1 (en) * 1997-08-22 2013-10-10 Motion Games, Llc Advanced video gaming methods for education and play using camera based inputs
US20090139778A1 (en) * 2007-11-30 2009-06-04 Microsoft Corporation User Input Using Proximity Sensing
DE102009043798A1 (de) * 2008-12-17 2010-06-24 Northrop Grumman Space & Mission Systems Corporation, Los Angeles Verfahren zur Erkennung von Handgesten
DE202012005255U1 (de) 2012-05-29 2012-06-26 Youse Gmbh Bedienvorrichtung mit einer Gestenüberwachungseinheit
US20140354602A1 (en) * 2013-04-12 2014-12-04 Impression.Pi, Inc. Interactive input system and method

Non-Patent Citations (11)

* Cited by examiner, † Cited by third party
Title
CUTLER, R.; TURK, M.: "View-Based Interpretation of Real-Time Optical Flow for Gesture Recognition", IEEE INTERNATIONAL CONFERENCE ON AUTOMATIC FACE AND GESTURE RECOGNITION, 1998
DADGOSTAR, FARHAD; SARRAFZADEH, ABDOLHOSSEIN: "An adaptive real-time skin detector based on Hue thresholding: A comparison on two motion tracking methods", PATTERN RECOGNITION LETTERS, 2006, pages 1342 - 1352
DO, MARTIN; ASFOUR, TAMIM; DILLMANN, RÜDIGER: "Particle Filter-Based Fingertip Tracking with Circular Hough Transform Features", MVA, 2011
DREUW, PHILIPPE; KEYSERS, DANIEL; DESELAERS, THOMAS; NEY, HERMANN: "Gesture Recognition Using Image Comparison Methods", INTERNATIONAL WORKSHOP ON GESTURE IN HUMAN-COMPUTER INTERACTION AND SIMULATION, 2005, pages 124 - 128
DREUW, PHILIPPE; RYBACH, DAVID; DESELAERS, THOMAS; ZAHEDI, MORTEZA; NEY, HERMANN: "Speech Recognition Techniques for a Sign Language Recognition System", INTERSPEECH, 2007, pages 2513 - 2516
LEE, J.; KUNII, T. L.: "Model-Based Analysis of Hand Posture", IEEE COMPUTER GRAPHICS AND APPLICATIONS, 1995, pages 77 - 86
LIU, NIANJUN; LOVELL, BRIAN C.: "Hand Gesture Extraction by Active Shape Models", PROCEEDINGS OF THE DIGITAL IMAGE COMPUTING ON TECHNIQUES AND APPLICATIONS, 2005
LOCKTON, R.; FITZGIBBON, A. W.: "Real-time gesture recognition using deterministic boosting", BMVC, 2002
MALIK, SHAHZAD: "Real-time Hand Tracking and Finger Tracking for Interaction", TORONTO: CSC2503F PROJECT REPORT, 2003
NAGI, JAWAD ET AL.: "Max-Pooling Convolutional Neural Networks for Vision-based Hand Gesture Recognition", ICSIPA, 2011, pages 342 - 347
SUZUKI, S.; ABE, K.: "Topological Structural Analysis of Digitized Binary Images by Border Following.", CVGIP, 1985, pages 32 - 46

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111051117A (zh) * 2017-09-05 2020-04-21 法国大陆汽车公司 方向盘上用于手指检测的光学效应触摸垫
CN111051117B (zh) * 2017-09-05 2023-01-13 法国大陆汽车公司 方向盘上用于手指检测的光学效应触摸垫
CN110333772A (zh) * 2018-03-31 2019-10-15 广州卓腾科技有限公司 一种控制对象移动的手势控制方法

Also Published As

Publication number Publication date
DE102015201613A1 (de) 2016-08-04

Similar Documents

Publication Publication Date Title
EP3642696B1 (fr) Procédé et dispositif de détection d'une entrée utilisateur en fonction d'un geste
WO2016120251A1 (fr) Procédé pour faire fonctionner un dispositif de saisie, dispositif de saisie
DE102010007455B4 (de) System und Verfahren zum berührungslosen Erfassen und Erkennen von Gesten in einem dreidimensionalen Raum
EP3695293A1 (fr) Procédé de fourniture d'une réponse haptique à un opérateur d'un dispositif d'affichage tactile
DE102014226553A1 (de) Verfahren zum Betreiben einer Eingabevorrichtung, Eingabevorrichtung, Kraftfahrzeug
EP3234736B1 (fr) Dispositif pour faire fonctionner un dispositif de saisie, dispositif de saisie, véhicule automobile
WO2015162058A1 (fr) Interaction de gestes avec un système d'information de conducteur d'un véhicule
DE202015100273U1 (de) Eingabevorrichtung
EP3232372B1 (fr) Procédé, dispositif d'entrée utilisateur et programme informatique pour reconnaître l'orientation d'une main d'un utilisateur
WO2014108150A2 (fr) Interface utilisateur pour une entrée de caractères manuscrite dans un appareil
DE102014224599A1 (de) Verfahren zum Betreiben einer Eingabevorrichtung, Eingabevorrichtung
EP3642697B1 (fr) Procédé et dispositif de détection d'une entrée utilisateur en fonction d'un geste
WO2017140569A1 (fr) Dispositif de commande d'un véhicule automobile et procédé de fonctionnement d'un dispositif de commande pour provoquer un effet de changement entre un plan virtuel de représentation et une main
DE102014224632A1 (de) Verfahren zum Betreiben einer Eingabevorrichtung, Eingabevorrichtung
DE102007041482A1 (de) Verfahren zur automatischen Erkennung wenigstens der Art und/oder der Lage einer mit einer Gliedmaße gebildeten Geste, insbesondere einer Handgeste
EP3025214B1 (fr) Procédé de fonctionnement d'un dispositif d'entrée et dispositif d'entrée
WO2015131954A1 (fr) Interface utilisateur et procédé de commande d'une interface utilisateur par des gestes exécutés librement dans l'espace
DE102018100335B4 (de) Verfahren und Vorrichtung zur 3D-Gestenerkennung
EP3426516B1 (fr) Dispositif de commande et procédé pour détecter la sélection, par l'utilisateur, d'au moins une fonction de commande du dispositif de commande
DE102014224618A1 (de) Verfahren und Vorrichtung zum Betreiben einer Eingabevorrichtung
DE102014224598A1 (de) Verfahren und Vorrichtung zum Betreiben einer Eingabevorrichtung
DE102012218155A1 (de) Erleichtern der Eingabe auf einer berührungsempfindlichen Anzeige in einem Fahrzeug
DE102014224641A1 (de) Verfahren zum Betreiben einer Eingabevorrichtung, Eingabevorrichtung
DE102015211521A1 (de) Verfahren zum Betreiben einer Eingabevorrichtung, Eingabevorrichtung
DE102013004251B4 (de) Verfahren zum Betreiben eines Bediensystems und Kraftwagen mit einem solchen Bediensystem

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16701749

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16701749

Country of ref document: EP

Kind code of ref document: A1