DE202015100273U1 - Input device - Google Patents

Input device

Info

Publication number
DE202015100273U1
Authority
DE
Germany
Prior art keywords
input
hand
area
gesture
neutral
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
DE201520100273
Other languages
German (de)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Robert Bosch GmbH
Original Assignee
Robert Bosch GmbH
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority to DE102014224898.1 priority Critical
Priority to DE102014224898.1A priority patent/DE102014224898A1/en
Application filed by Robert Bosch GmbH filed Critical Robert Bosch GmbH
Priority to DE201520100273 priority patent/DE202015100273U1/en
Publication of DE202015100273U1 publication Critical patent/DE202015100273U1/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60K ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K37/00 Dashboards
    • B60K37/04 Arrangement of fittings on dashboard
    • B60K37/06 Arrangement of fittings on dashboard of controls, e.g. controls knobs
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60K ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K37/00 Dashboards
    • B60K37/04 Arrangement of fittings on dashboard
    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/0304 Detection arrangements using opto-electronic means
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60K ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K2370/00 Details of arrangements or adaptations of instruments specially adapted for vehicles, not covered by groups B60K35/00, B60K37/00
    • B60K2370/10 Input devices or features thereof
    • B60K2370/12 Input devices or input features
    • B60K2370/146 Input by gesture
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60K ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K2370/00 Details of arrangements or adaptations of instruments specially adapted for vehicles, not covered by groups B60K35/00, B60K37/00
    • B60K2370/10 Input devices or features thereof
    • B60K2370/12 Input devices or input features
    • B60K2370/146 Input by gesture
    • B60K2370/1464 3D-gesture
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60K ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K2370/00 Details of arrangements or adaptations of instruments specially adapted for vehicles, not covered by groups B60K35/00, B60K37/00
    • B60K2370/20 Optical features of instruments
    • B60K2370/21 Optical features of instruments using cameras

Abstract

Input device (2), in particular of a motor vehicle (1), with a camera sensor (3) for the contactless detection of a position and/or change of position of at least one finger (8) of a user's hand (6), the input device (2) being designed to recognize and execute an input as a function of the position and/or change of position of the at least one finger (8), characterized in that the input device (2) determines a neutral area (11) in the image captured by the camera sensor (3), adjoined by at least one gesture area (12-15), wherein the input device (2) takes a change of position occurring in the gesture area (12-15) into account as an input only if it comes out of the neutral area (11).

Description

  • The invention relates to an input device, in particular of a motor vehicle, having a camera sensor for the contactless detection of the position and/or change of position of at least one finger of a user's hand, the input device being designed to recognize an input as a function of the detected position and/or change of position of the finger and to execute it.
  • State of the art
  • Operating concepts in which an input device and a display device are arranged close to one another are used in today's motor vehicles. Usually, touch-sensitive screens, so-called touch screens, are provided for this purpose, in which operation and display take place at the same location. Display devices are frequently mounted in the upper part of an instrument panel or dashboard of the motor vehicle, so that the driver does not have to avert his gaze too far from the traffic in order to read them. In other vehicles, a touchpad, i.e. a touch-sensitive sensor, is located in the area of the driver's armrest, while the display device is in its usual place in the area of the dashboard. Visual feedback to the driver during operation of the sensor can be given in the form of a transparent hand shown on the display device. This allows the driver to operate the input device conveniently while the display is still presented to him from an advantageous viewing angle. In this case, it is also conceivable to design the display device not as a screen but as a head-up display.
  • While classical touch-sensitive sensors or touchpads require a touch by the user for their operation, input devices are also known which recognize or register inputs without contact. With the aid of depth sensors, for example, the positions of a user's hand, fingers and/or arm are detected in space and evaluated for gesture operation. Finger gestures require a high resolution, which can be achieved with sensors such as time-of-flight sensors, stereo cameras, structured light, or the like. For hand or body gestures, sensors with a lower resolution, such as radar sensors, can also be used. One or more such sensors thus detect the position or change of position of the user's hand, an input is recognized as a function of the detected position and/or change of position, and this input is executed. The user thus indicates to the input device, with a movement of his hand or of at least one finger of his hand, which input he wishes to make. The input device recognizes the desired input on the basis of the hand or finger movement and executes it by converting the command given by the movement, for example by changing an operating parameter of the motor vehicle. For example, as a function of the position and change of position of a user's finger, the input "increase volume" can be recognized and executed by the input device by increasing the volume of, for example, an entertainment system of the motor vehicle.
  • From the publication US 2012/0105613 A1 it is already known to capture hand or finger gestures by means of a camera sensor and to control functions of a motor vehicle as a function of the recognized gestures. A similar system is also already known from the published patent application EP 2 441 635 A1, which additionally discloses the detection of the temporal course of changes in position of fingertips in free space.
  • Disclosure of the invention
  • The input device according to the invention with the features of claim 1 has the advantage that intentional swipe gestures can be distinguished from unintentional return or wind-up movements of the user in a simple and resource-saving manner. The image captured by the camera sensor is divided into several areas, and a movement of the fingers and/or the hand of the user is only recognized as a gesture, or taken into account for determining the input, if the change of position carried out in the respective area satisfies the conditions defined for that area. According to the invention, a neutral area is determined in the image captured by the camera sensor, adjoined by at least one gesture area, and a change of position taking place in the gesture area is taken into account as an input only if it comes out of the neutral area. The user must therefore first position his hand in the neutral area and then, in particular, move his fingers out of the neutral area into the gesture area in order to perform a swipe gesture, so that his swipe gesture is recognized as an intended swipe gesture and the associated action is executed by the input device. In particular, the position of the user's hand within the detection range of the camera sensor, and thus within the captured image, is shown to the user at least schematically on a display device; expediently, the neutral area and the at least one gesture area are additionally displayed, so that the user can orient himself easily and perform the swipe gesture. The size of the neutral area is preferably chosen such that a wind-up gesture, which is normally smaller than an intended swipe gesture, can take place within the neutral area. This avoids fingers of the hand unintentionally entering a neighboring gesture area during the wind-up gesture and being counted there as an input. The user is thus able to carry out clear swipe gestures in order to operate the input device, or to operate other devices of the motor vehicle by means of the input device.
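  • The gating rule just described can be illustrated with a minimal sketch (not taken from the patent): a movement counts as a swipe only if it starts inside the neutral area and crosses into an adjacent gesture area. The Rect helper, the coordinate values and the area names are assumptions chosen purely for illustration.

```python
# Minimal sketch of the region gating described above, with assumed example coordinates.
from dataclasses import dataclass

@dataclass
class Rect:
    x0: float
    y0: float
    x1: float
    y1: float

    def contains(self, x: float, y: float) -> bool:
        return self.x0 <= x <= self.x1 and self.y0 <= y <= self.y1

NEUTRAL = Rect(200, 100, 440, 380)           # neutral area (11)
GESTURE_AREAS = {                            # gesture areas (12-15)
    "left":  Rect(0, 100, 200, 380),
    "right": Rect(440, 100, 640, 380),
    "up":    Rect(200, 0, 440, 100),
    "down":  Rect(200, 380, 440, 480),
}

def classify(prev_pos, cur_pos):
    """Report a swipe only if it starts in the neutral area and crosses into a gesture area."""
    if not NEUTRAL.contains(*prev_pos):
        return None                          # wind-up/return movements outside (11) are ignored
    for name, area in GESTURE_AREAS.items():
        if area.contains(*cur_pos):
            return name                      # movement coming out of the neutral area
    return None                              # still inside the neutral area

# Example: a fingertip moving from the neutral area into the left area counts as "left",
# while the reverse (return) movement is ignored.
assert classify((320, 240), (150, 240)) == "left"
assert classify((150, 240), (320, 240)) is None
```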
  • According to a preferred embodiment of the invention, the input device is designed such that changes of position in the gesture area are taken into account as an input only if they take place in the direction away from the neutral area. This ensures that return movements of the hand or of the fingers of the hand are likewise not considered as an input gesture.
  • Furthermore, it is preferably provided that the neutral area is determined as a function of a position of the hand in the captured image such that the at least substantially resting hand lies at least substantially within the neutral area. This ensures that movements, in particular of the fingertips, always start in the neutral area and proceed out of it. In particular, the neutral area is rectangular, so that it can completely accommodate an outstretched hand of the user. Preferably, the size of the neutral area is adapted to the size of the hand captured in the image. In this way it can also be taken into account, for example, how close the user's hand is to the camera sensor, which further increases the variability and robustness of the system. Alternatively, in particular if the neutral area is displayed to the user, the neutral area may be defined as a fixed area in the image of the camera sensor.
  • According to a preferred embodiment, the neutral area is set larger than the hand, so that the gesture area is spaced apart from the hand located in the neutral area. As already mentioned above, this ensures that when a swipe gesture is initiated, part of the neutral area must first be crossed before the fingertips reach the gesture area, so that unintentional wind-up movements are not taken into account in the recognition of the input.
  • Particularly preferably, the neutral area is moved along with the hand. As a result, in particular, the display of the hand in the image captured by the camera sensor can be omitted. Because the neutral area is moved along, it is always ensured that the user starts a swipe gesture from the neutral area. For this purpose, the neutral area is preferably moved at a lower speed than a typical swipe gesture, in order to avoid the neutral area following the swipe gesture and the swipe gesture therefore not being recognized as an input. Alternatively or additionally, the moving area can be anchored to a point of the hand which usually moves more slowly than, in particular, the fingertips when a swipe gesture is performed.
  • In particular, a center of gravity of the hand is determined and the neutral area is moved as a function of the position or change of position of this center of gravity. In particular, the wrist of the hand can be determined as the center of gravity of the hand. This can be done by analyzing the image data of the camera sensor in a known manner. Compared to the wrist, the fingers move much faster during a swipe gesture, so that there are speed differences between the fingers and the wrist, or between the fingers and the center of gravity of the hand, which are exploited by the approach described above. While the neutral area is moved at most at the speed of the wrist or of the center of gravity of the hand, the fingers, owing to their higher speed, can leave the neutral area and enter the gesture area, so that the change(s) of position occurring there are taken into account in determining the input.
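  • The speed difference exploited here can be sketched as follows (values assumed for illustration, not from the patent): the neutral-area center follows the hand's center of gravity, but its per-frame displacement is clamped well below a typical fingertip swipe speed, so a fast swipe can leave the area before it catches up.

```python
# Hedged sketch: anchor the neutral area (11) to the center of gravity (10) and
# limit how fast it may follow. MAX_SHIFT_PER_FRAME is an assumed value intended
# to be well below the fingertip speed of a typical swipe gesture.
import numpy as np

MAX_SHIFT_PER_FRAME = 4.0   # pixels per frame, illustrative value

def follow_centroid(prev_center, new_centroid):
    """Move the neutral-area center towards the current hand centroid, speed-limited."""
    prev = np.asarray(prev_center, dtype=float)
    delta = np.asarray(new_centroid, dtype=float) - prev
    dist = float(np.linalg.norm(delta))
    if dist > MAX_SHIFT_PER_FRAME:          # clamp so the area cannot chase a fast swipe
        delta *= MAX_SHIFT_PER_FRAME / dist
    return tuple(prev + delta)
```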
  • Furthermore, it is preferably provided that at least two gesture areas, arranged opposite one another on the neutral area, are set up adjacent to the neutral area. Both gesture areas are used as described above in the evaluation of a swipe gesture. Starting from the neutral area, the user can thus swipe in two directions, for example moving up through a context menu by making the swipe gesture into the one gesture area, or moving down by making the swipe gesture into the opposite gesture area. Expediently, a gesture area adjoining the neutral area is also provided on the side opposite the wrist.
  • Furthermore, it is preferably provided that, when a change of position is detected in several gesture areas, only the change of position in the gesture area in which the most changes of position are detected is used for the input. This ensures that, even if the user should perform a surprisingly large wind-up movement in which his hand enters a gesture area he did not actually intend to reach, the correct input is nevertheless executed. Particularly preferably, the changes of position are calculated or determined on the basis of vector data which are determined from the image captured by the camera sensor. As a result, clear directions of movement can be determined and compared, for example, with the permissible direction of movement of one of the gesture areas, in order to ensure a rapid evaluation of the changes of position and thus a rapid recognition of the desired input.
  • In the following, the invention is explained in more detail with reference to the drawing, in which:
  • 1 shows the interior of a motor vehicle with an advantageous input device, and
  • 2 shows an image captured by a camera sensor of the input device.
  • 1 shows a schematic representation of the interior of a motor vehicle 1, not shown in further detail, which has an input device 2 for the contactless input of control commands. For this purpose, the input device has a contactless camera sensor 3 and a display unit 4. The display unit 4 is arranged in the dashboard or in the control console of the motor vehicle 1. The display unit 4 is designed as a screen, in particular a display, and may, for example, be part of a navigation system or of an entertainment system of the motor vehicle 1. It is also conceivable to design the display unit 4 alternatively or additionally as a head-up display (HUD). The camera sensor 3 is preferably designed as a two-dimensional video camera or camera device having the detection range shown with dashed lines. The video camera is preferably oriented such that it points towards the front end of a central armrest 5 of the motor vehicle 1. The armrest 5 itself has no real input surface on which a driver could enter a command, by means of a hand 6 shown here only schematically, by touching an input interface.
  • The input device 2 is instead adapted to detect a position and/or change of position of at least one finger of the hand 6 in space and to recognize and execute an input as a function thereof. The image data captured by the camera sensor are evaluated as described below and used for the operation of the input device 2.
  • 2 shows, by way of example, the image captured by the sensor 3 and thus the detection range 7. The hand 6 of the user is located in the captured image. The input device 2 allows the user to perform swipe gestures within the detection range of the sensor 3 in order to operate the input device 2, with wind-up and return movements being ignored. To this end, the method described below considers swipe gestures that are performed ergonomically from the wrist. To distinguish intended swipe gestures from unintentional return movements, the method exploits the fact that, when swiping, the wrist 7 of the user's hand 6 remains more or less rigid, while the palm and in particular the fingers 8 of the hand 6 move relative to the wrist. In the illustrated exemplary embodiment, a swipe gesture is performed to the left, as indicated by an arrow 9; the swiping movement takes place on the left side of the wrist 7, from right to left. Consequently, the return movement takes place from left to right and is therefore initially indistinguishable from a swipe to the right. However, because this movement only takes place until the neutral or starting position is reached, on the left side of the wrist 7 rather than on its right, such a return movement can be distinguished from an intended swipe to the right. It is assumed that the swipe gestures are performed relative to a reference point that is as rigid as possible, in this case the wrist 7, and that on returning the hand the neutral position is not significantly overshot.
  • The cross on the hand illustrated in 2 marks the relatively rigid center of the hand, i.e. the center of gravity 10 of the hand 6. A neutral area 11 in the captured image is assigned to the center of gravity 10. In the present exemplary embodiment, the neutral area 11 is rectangular and is aligned and arranged such that the hand 6 lies essentially within the neutral area 11. Changes of position or movements performed within the neutral area 11 are ignored by the input device 2. When a swipe gesture is performed, the center of gravity 10 also moves, but far less in extent and amplitude than the palm with the outstretched fingers 8 of the same hand 6. A sufficiently large neutral area 11 around the center of gravity 10 lends the input device 2 additional robustness, since the neutral position is represented by a spatially extended area, so that unintended swipe gestures are not recognized and do not lead to an input.
  • Adjacent to the neutral area 11, four gesture areas 12, 13, 14 and 15 are presently set up, each adjoining one side of the neutral area 11. The gesture areas 12 to 15 are likewise rectangular in this case.
  • To perform a swipe gesture to the left, the palm with the fingers 8 must be guided from the central region, i.e. from the neutral area 11, into the left image area, i.e. into the gesture area 12. If a change of position is detected in this image area that comes out of the direction of the neutral area 11, according to arrow 9, the swipe gesture is recognized as a swipe to the left. During the return movement, a rightward movement is detected in the same gesture area 12. However, the input device 2 takes into account as an input only those changes of position in the gesture area 12 that come out of the direction of the neutral area 11. A rightward movement in the gesture area 12 is therefore an impermissible movement that is not taken into account by the input device 2 when recognizing an input. Similarly, for a "swipe right", a rightward movement is registered in the gesture area 14, while a leftward movement occurring in the gesture area 14 during the return movement is impermissible and therefore not taken into account.
  • The gesture areas 12 to 15 can thus be understood as a kind of one-way street with regard to the permitted swiping direction. The gesture areas 12 and 14 are arranged on opposite sides of the neutral area 11. If changes of position, each performed in the permissible direction, are detected in several gesture areas, in particular in the opposing gesture areas 12 and 14, then expediently only the changes of position in whichever gesture area 12 or 14 exhibits the most movements or changes of position are taken into account in recognizing the input.
  • To carry out the method, the input device 2 has a computing unit, not shown here, which performs motion detection on the basis of the video image of the sensor 3. For this purpose, the optical flow in the image is determined. The result is a set of vectors which, for specific pixels in the image at a time t, as shown in 2, indicate the offset in the image plane at time t + 1. This set of vectors is referred to below as the 2D flow field and represents the movement of the hand 6 in the detection range 7 of the sensor 3. By means of this movement pattern, the computing unit executes an algorithm, stored in a non-volatile memory, which in each time step determines a reference point of the hand that is as rigid as possible, namely the center of gravity 10 described above. This is preferably done by first band-pass filtering the resulting flow field with respect to the length of the flow vectors (movement intensity). As a result, only flow vectors whose length lies above a first threshold value S1 but below a second threshold value S2 are considered for the further calculations. The threshold values are determined automatically from the statistics of the flow field; for example, they can be chosen to correspond to the 5th and 95th percentile. This means that 5% of the flow vectors in the flow field have a smaller length than the first threshold value S1 and 5% of the flow vectors have a greater length than the second threshold value S2. The first threshold value S1 suppresses movement noise, caused for example by pixel noise. In addition, filtering at the second threshold value S2 eliminates the relatively large movements of the fingertips which, during swiping, occur in relation to the less mobile center of the hand (center of gravity 10). The center of gravity of the filtered flow field is then preferably calculated as follows:
    c = (1/N) · Σ_{i=1}^{N} p(x_i, y_i)
  • Here, N denotes the number of flow vectors in the filtered flow field and p(x_i, y_i) the base point of the respective flow vector i in image coordinates. To increase the robustness or inertia of the calculated center of gravity 10, it is additionally filtered over time. That is, in a fixed sliding window of, for example, 10 time steps, the arithmetic mean of the centers of gravity determined for this time window is formed. Alternatively, more complex filtering methods, such as Gaussian filters, can be used in order to further increase the robustness.
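  • A minimal sketch of this step follows, assuming OpenCV's dense Farnebäck optical flow as the flow estimator, grayscale input frames and the 10-frame sliding window mentioned above; the function and variable names are illustrative, not from the patent.

```python
# Band-pass filtering of the 2D flow field and temporally smoothed center of gravity
# (sketch); S1/S2 are taken as the 5th and 95th percentile of the flow-vector lengths,
# following the example in the text.
from collections import deque

import cv2
import numpy as np

_centroid_window = deque(maxlen=10)          # sliding window of, e.g., 10 time steps

def hand_center_of_gravity(prev_gray, gray):
    flow = cv2.calcOpticalFlowFarneback(prev_gray, gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)   # H x W x 2 flow field
    lengths = np.linalg.norm(flow, axis=2)
    s1, s2 = np.percentile(lengths, (5, 95))          # thresholds from the flow statistics
    mask = (lengths > s1) & (lengths < s2)            # suppress noise and fast fingertips
    if not mask.any():
        return None
    ys, xs = np.nonzero(mask)                         # base points p(x_i, y_i) of kept vectors
    centroid = np.array([xs.mean(), ys.mean()])       # c = (1/N) * sum of base points
    _centroid_window.append(centroid)
    return np.mean(_centroid_window, axis=0)          # arithmetic mean over the time window
```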
  • The image coordinates of the different gesture areas and of the neutral area are then determined relative to the determined center of gravity 10. In doing so, the optimal extent of the neutral area can also be determined dynamically with the aid of the filtered flow field. First, the extent of the filtered flow field is calculated in the form of a bounding box. The neutral area 11 around the center of gravity 10 is then defined in relation to the determined extents of the filtered flow field, for example as 90% of the extent in each direction. In the present exemplary embodiment according to 2, a smaller percentage extent was chosen to the right of the center of gravity 10, that is for the right edge of the neutral area 11, because, owing to the physiology of the hand, an upward swiping movement has a smaller amplitude than swiping in the other three directions. The gesture areas 12 to 15 are defined adjacent to the neutral area as described above.
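  • One way this derivation could look is sketched below; the 90% factor follows the text, while the reduced factor for one edge and the rectangle convention (x0, y0, x1, y1) are assumptions for illustration.

```python
# Derive the neutral area (11) and the four adjoining gesture areas (12-15)
# from the bounding box of the band-pass-filtered flow field (sketch only).
import numpy as np

def build_areas(mask, centroid, img_w, img_h, scale=0.9, reduced_scale=0.7):
    ys, xs = np.nonzero(mask)                      # base points of the filtered flow vectors
    cx, cy = centroid
    left, right = cx - xs.min(), xs.max() - cx     # half-extents of the bounding box
    top, bottom = cy - ys.min(), ys.max() - cy
    # neutral area: a percentage of the extent in each direction; one edge is
    # deliberately kept smaller (reduced_scale), as described in the text
    x0, x1 = cx - scale * left, cx + reduced_scale * right
    y0, y1 = cy - scale * top, cy + scale * bottom
    neutral = (x0, y0, x1, y1)
    gesture_areas = {                              # areas adjoining the neutral area
        "left":  (0.0, y0, x0, y1),
        "right": (x1, y0, float(img_w), y1),
        "up":    (x0, 0.0, x1, y0),
        "down":  (x0, y1, x1, float(img_h)),
    }
    return neutral, gesture_areas
```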
  • Furthermore, the computing unit executes an algorithm which identifies the gesture area with the currently greatest movement, i.e. with the currently greatest number of changes of position. This is advantageously done by calculating, in each time step and for each of the gesture areas 12 to 15, the mean length of all flow vectors. In addition, the number of flow vectors whose length lies above a third threshold value S3 is determined, in order to obtain a measure of how large the proportion of fast movements in this gesture area currently is. The third threshold value S3 is preferably determined as a function of the image resolution, the recording frequency of the sensor 3 and the distance of the sensor 3 from the interaction plane, and is chosen to be constant.
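  • A sketch of this selection step: S3 is assumed constant here, and the area is picked by comparing (mean length, fast-vector count) pairs, which is one simple way to approximate the "both statistics maximal" criterion in the text.

```python
# Per-area flow statistics and selection of the most active gesture area (sketch).
import numpy as np

S3 = 3.0   # third threshold, assumed constant; in the text it depends on image
           # resolution, recording frequency and sensor distance

def rect_mask(rect, shape):
    """Boolean mask for an (x0, y0, x1, y1) rectangle in image coordinates."""
    x0, y0, x1, y1 = (int(round(v)) for v in rect)
    m = np.zeros(shape, dtype=bool)
    m[max(y0, 0):max(y1, 0), max(x0, 0):max(x1, 0)] = True
    return m

def most_active_area(flow, gesture_areas):
    lengths = np.linalg.norm(flow, axis=2)
    stats = {}
    for name, rect in gesture_areas.items():
        l = lengths[rect_mask(rect, lengths.shape)]
        mean_len = float(l.mean()) if l.size else 0.0
        fast_count = int((l > S3).sum())
        stats[name] = (mean_len, fast_count)
    best = max(stats, key=lambda n: stats[n])    # area with the largest statistics
    return best, stats[best]
```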
  • For the gesture area in which both the mean length of the flow vectors and the number of "fast" flow vectors are at their maximum, the movement information is analyzed further. For this purpose, the mean direction of the flow vectors (preferably in degrees) is calculated from all flow vectors in this gesture area and mapped onto one quadrant of the unit circle. For example, a movement to the left corresponds to an angular range from 45° to 135°, swiping down into the gesture area 15 to an angular range from 135° to 225°, swiping right into the gesture area 14 to an angular range from 225° to 315°, and swiping up into the gesture area 13 to an angular range from 315° to 45°.
  • A predefined gesture alphabet, that is, actions associated with the different gestures, assigns one of these four quadrants to each of the four gesture areas 12 to 15. To recognize a swipe gesture to the left, for example, a movement in the angular range from 45° to 135° must be recognized in the gesture area 12. Any other movement in this zone is ignored. This yields the rule described above that only movements coming out of the direction of the neutral area 11 are evaluated as an input. The same applies analogously to the downward, rightward and upward swiping movements and the gesture areas 13, 14 and 15.
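  • This angular gesture alphabet can be sketched as follows; the angle convention of atan2 in image coordinates will generally differ from the example ranges above, so the concrete quadrant boundaries here are assumptions that would have to be parameterized for the actual setup, as the next paragraph notes.

```python
# Mean flow direction in the most active gesture area, mapped onto the quadrant
# assigned to that area by the gesture alphabet (sketch; angle ranges follow the
# example values in the text).
import numpy as np

GESTURE_ALPHABET = {           # gesture area -> permissible angular range in degrees
    "left":  (45.0, 135.0),    # swipe left, detected in area 12
    "down":  (135.0, 225.0),   # swipe down, detected in area 15
    "right": (225.0, 315.0),   # swipe right, detected in area 14
    "up":    (315.0, 45.0),    # swipe up, detected in area 13 (range wraps around 0 degrees)
}

def mean_direction_deg(vectors):
    """Mean direction of a set of 2D flow vectors, in [0, 360)."""
    mx, my = np.asarray(vectors, dtype=float).mean(axis=0)
    return float(np.degrees(np.arctan2(my, mx))) % 360.0

def recognize(area_name, vectors):
    """Accept the swipe only if the mean direction lies in the area's quadrant."""
    lo, hi = GESTURE_ALPHABET[area_name]
    ang = mean_direction_deg(vectors)
    inside = (lo <= ang < hi) if lo < hi else (ang >= lo or ang < hi)
    return area_name if inside else None      # any other movement in this area is ignored
```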
  • In general, the assignment of the swiping directions to quadrants of the unit circle can be freely parameterized and depends in particular on the concrete implementation of the algorithm. In addition, the angular resolutions for different directions can be chosen non-equidistantly and arbitrarily, for example to make the detection of swipe gestures in certain directions more or less sensitive. If, in accordance with the defined gesture alphabet, the permissible direction of movement is determined in the zone with the greatest movement, the input device 2 generates the event corresponding to the gesture, i.e. the action assigned to the gesture is called or started.
  • Instead of calculating the flow direction in a gesture area by pure averaging, it is also conceivable to weight the direction information of the flow vectors with their length (movement intensity) when calculating the mean flow direction. The procedure described assumes that a gesture is started in the neutral area 11. For this purpose, a return movement into the neutral area must take place after the swiping movement, which, however, the majority of users will carry out intuitively anyway. If the amplitude or extent of the return movement is so great that the neutral area 11 is traversed completely, this leads to an undesired detection of the return movement as a gesture in the opposite gesture area. The robustness of the gesture recognition can therefore be increased further by explicitly modelling this return movement as well. In that case, the associated event is only started, or the corresponding input is only recognized, if, following the detected, correct swiping movement, an opposite (return) movement has also been detected within a given time window. This recognition is independent of the different areas 11 to 15. In this way, return movements whose execution extends beyond the neutral area 11 into a neighboring gesture area can be robustly ignored. The size of the time window is preferably to be selected as a function of the recording frequency of the sensor 3 and of the desired temporal proximity of the return movement to the actual gesture. An adaptation of the time window to individual user needs is also conceivable.
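  • A sketch of this optional return-movement check follows; the window length and the opposite-direction table are illustrative assumptions.

```python
# Raise the event for a recognized swipe only if an opposite (return) movement
# is detected within a given time window afterwards (sketch).
import time
from collections import deque

OPPOSITE = {"left": "right", "right": "left", "up": "down", "down": "up"}
WINDOW_S = 0.6               # assumed window; to be chosen from the sensor's frame rate

_pending = deque()           # recognized swipes waiting for their return movement

def on_detected_direction(direction, now=None):
    """Feed every detected movement direction; returns a confirmed gesture or None."""
    now = time.monotonic() if now is None else now
    while _pending and now - _pending[0][1] > WINDOW_S:
        _pending.popleft()                       # expired without a return movement
    for i, (gesture, t0) in enumerate(_pending):
        if OPPOSITE[gesture] == direction:
            del _pending[i]
            return gesture                       # swipe plus return movement: raise the event
    _pending.append((direction, now))
    return None
```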
  • By tracking the center of gravity 10 of the hand 6, in addition to the robust "suppression" of return movements in the recognition of inputs, the recognition of arbitrary swipe gestures at any point in the detection range 7 of the sensor 3 is also covered, because the neutral area 11 is moved along with the center of gravity 10 of the hand 6.
  • As an extension, it is conceivable, in addition to the recognition of simple unidirectional swipe gestures, to extend the gesture alphabet, i.e. the inputs to be recognized, to composite swipe gestures. By dividing the detection range into the areas 11 to 15 coupled to the center of gravity 10, a left-right waving can, for example, be distinguished from a swipe to the left followed by a return movement, since in addition to the pure direction information, the coarsely sampled location information can now also be used. Intentional waving will thus very quickly lead to leftward movements in the gesture area 12 alternating with rightward movements in the gesture area 14, and such waving could be detected with little latency. Analogously, further complex swipe gestures could be defined as the combination of detected movement directions and associated gesture areas. The different gestures to be evaluated for an input, i.e. the gesture alphabet, are expediently stored as a model in a non-volatile memory before commissioning and are used for comparison with the currently detected gestures in order to recognize the respective input.
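  • Such a composite gesture could be sketched as a short sequence matcher over (gesture area, direction) events; the sequence definition and the timing value are assumptions for illustration.

```python
# Sketch of a composite swipe gesture: a wave is modeled as a leftward movement
# in the left gesture area quickly followed by a rightward movement in the
# right gesture area (assumed sequence and timing).
import time

WAVE_SEQUENCE = [("left", "left"), ("right", "right")]   # (gesture area, direction)
MAX_GAP_S = 0.5                                           # assumed maximum gap between steps

_progress = 0
_last_time = 0.0

def on_area_direction(area, direction, now=None):
    """Feed (area, direction) events; returns "wave" once the sequence completes."""
    global _progress, _last_time
    now = time.monotonic() if now is None else now
    if _progress and now - _last_time > MAX_GAP_S:
        _progress = 0                                     # too slow, start over
    if (area, direction) == WAVE_SEQUENCE[_progress]:
        _progress += 1
        _last_time = now
        if _progress == len(WAVE_SEQUENCE):
            _progress = 0
            return "wave"
    return None
```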
  • CITATIONS CONTAINED IN THE DESCRIPTION
  • This list of the documents cited by the applicant has been generated automatically and is included solely for the better information of the reader. The list is not part of the German patent or utility model application. The DPMA assumes no liability for any errors or omissions.
  • Cited patent literature
    • US 20120105613 A1 [0004]
    • EP 2441635 A1 [0004]

Claims (8)

  1. Input device (2), in particular of a motor vehicle (1), with a camera sensor (3) for the contactless detection of a position and/or change of position of at least one finger (8) of a hand (6) of a user, wherein the input device (2) is adapted to recognize and execute an input as a function of the position and/or change of position of the at least one finger (8), characterized in that the input device (2) determines a neutral area (11) in the image captured by the camera sensor (3), adjoined by at least one gesture area (12-15), wherein the input device (2) takes a change of position occurring in the gesture area (12-15) into account as an input only if it comes out of the neutral area (11).
  2. Input device according to claim 1, characterized in that it takes changes of position in the gesture area (12-15) into account as an input only if they take place in the direction coming from the neutral area (11).
  3. Input device according to one of the preceding claims, characterized in that it determines the neutral area (11) as a function of a position of the hand (6) in the captured image such that the at least substantially resting hand (6) lies at least substantially within the neutral area (11).
  4. Input device according to one of the preceding claims, characterized in that it sets the neutral area (11) larger than the hand (6), so that the gesture area (12-15) lies spaced apart from the at least substantially resting hand (6).
  5. Input device according to one of the preceding claims, characterized in that it moves the neutral area (11) along with the hand (6).
  6. Input device according to one of the preceding claims, characterized in that it determines a center of gravity (10) of the hand (6) and moves the neutral area as a function of the position or change of position of the center of gravity (10).
  7. Input device according to one of the preceding claims, characterized in that it sets up, adjacent to the neutral area (11), at least two gesture areas (12-15) arranged opposite one another on the neutral area (11).
  8. Input device according to one of the preceding claims, characterized in that, when it detects a change of position in several gesture areas (12-15), it takes into account for the input only the changes of position in the gesture area (12-15) in which the most changes of position were detected.
DE201520100273 2014-12-04 2015-01-22 Input device Active DE202015100273U1 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
DE102014224898.1 2014-12-04
DE102014224898.1A DE102014224898A1 (en) 2014-12-04 2014-12-04 Method for operating an input device, input device
DE201520100273 DE202015100273U1 (en) 2014-12-04 2015-01-22 Input device

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
DE201520100273 DE202015100273U1 (en) 2014-12-04 2015-01-22 Input device
ITUB2015A006045A ITUB20156045A1 (en) 2014-12-04 2015-12-02 The input device
FR1561811A FR3029655B1 (en) 2014-12-04 2015-12-03 Device for entry in particular from a motor vehicle for non-contact seizure of the position and / or change of position of at least one finger of a user's hand
CN201511035963.6A CN105759955A (en) 2014-12-04 2015-12-04 Input device

Publications (1)

Publication Number Publication Date
DE202015100273U1 true DE202015100273U1 (en) 2015-04-08

Family

ID=52991262

Family Applications (2)

Application Number Title Priority Date Filing Date
DE102014224898.1A Pending DE102014224898A1 (en) 2014-12-04 2014-12-04 Method for operating an input device, input device
DE201520100273 Active DE202015100273U1 (en) 2014-12-04 2015-01-22 Input device

Family Applications Before (1)

Application Number Title Priority Date Filing Date
DE102014224898.1A Pending DE102014224898A1 (en) 2014-12-04 2014-12-04 Method for operating an input device, input device

Country Status (4)

Country Link
CN (1) CN105759955A (en)
DE (2) DE102014224898A1 (en)
FR (1) FR3029655B1 (en)
IT (1) ITUB20156045A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102015015067A1 (en) * 2015-11-20 2017-05-24 Audi Ag Motor vehicle with at least one radar unit
EP3232372A1 (en) 2016-04-13 2017-10-18 Volkswagen Aktiengesellschaft User interface, means of locomotion and method for detecting a hand of a user

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019092386A1 (en) 2017-11-13 2019-05-16 Nicand Patrick Gesture-based control system for actuators
FR3073649A1 (en) * 2017-11-13 2019-05-17 Frederic Delanoue Gesture control system for actuators

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2441635A1 (en) 2010-10-06 2012-04-18 Harman Becker Automotive Systems GmbH Vehicle User Interface System
US20120105613A1 (en) 2010-11-01 2012-05-03 Robert Bosch Gmbh Robust video-based handwriting and gesture recognition for in-car applications

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20110055062A (en) * 2009-11-19 2011-05-25 삼성전자주식회사 Robot system and method for controlling the same
CN102436301B (en) * 2011-08-20 2015-04-15 Tcl集团股份有限公司 Human-machine interaction method and system based on reference region and time domain information
JP5593339B2 (en) * 2012-02-07 2014-09-24 日本システムウエア株式会社 Gesture recognition device using a steering wheel of an automobile, hand recognition method and program thereof
CN102662557B (en) * 2012-03-07 2016-04-13 上海华勤通讯技术有限公司 Mobile terminal and unlock method
US9448635B2 (en) * 2012-04-16 2016-09-20 Qualcomm Incorporated Rapid gesture re-engagement
JP6030430B2 (en) * 2012-12-14 2016-11-24 クラリオン株式会社 Control device, vehicle and portable terminal

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2441635A1 (en) 2010-10-06 2012-04-18 Harman Becker Automotive Systems GmbH Vehicle User Interface System
US20120105613A1 (en) 2010-11-01 2012-05-03 Robert Bosch Gmbh Robust video-based handwriting and gesture recognition for in-car applications

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102015015067A1 (en) * 2015-11-20 2017-05-24 Audi Ag Motor vehicle with at least one radar unit
US10528148B2 (en) 2015-11-20 2020-01-07 Audi Ag Motor vehicle with at least one radar unit
EP3232372A1 (en) 2016-04-13 2017-10-18 Volkswagen Aktiengesellschaft User interface, means of locomotion and method for detecting a hand of a user
DE102016206142A1 (en) 2016-04-13 2017-10-19 Volkswagen Aktiengesellschaft User interface, means of locomotion and method of detecting a hand of a user

Also Published As

Publication number Publication date
FR3029655A1 (en) 2016-06-10
DE102014224898A1 (en) 2016-06-09
ITUB20156045A1 (en) 2017-06-02
FR3029655B1 (en) 2018-11-16
CN105759955A (en) 2016-07-13

Similar Documents

Publication Publication Date Title
US9891716B2 (en) Gesture recognition in vehicles
US10203764B2 (en) Systems and methods for triggering actions based on touch-free gesture detection
US9910498B2 (en) System and method for close-range movement tracking
KR101872426B1 (en) Depth-based user interface gesture control
JP5802667B2 (en) gesture input device and gesture input method
JP2016538780A (en) Method and apparatus for remotely controlling vehicle functions
EP2635953B1 (en) Robust video-based handwriting and gesture recognition for in-car applications
EP2839357B1 (en) Rapid gesture re-engagement
US8593417B2 (en) Operation apparatus for in-vehicle electronic device and method for controlling the same
KR101761050B1 (en) Human-to-computer natural three-dimensional hand gesture based navigation method
US9020194B2 (en) Systems and methods for performing a device action based on a detected gesture
US20150367859A1 (en) Input device for a motor vehicle
US8666115B2 (en) Computer vision gesture based control of a device
KR101646616B1 (en) Apparatus and Method for Controlling Object
US9235269B2 (en) System and method for manipulating user interface in vehicle using finger valleys
JP5275970B2 (en) Interactive operating device and method for operating interactive operating device
US10019843B2 (en) Controlling a near eye display
US8675916B2 (en) User interface apparatus and method using movement recognition
US8867791B2 (en) Gesture recognition method and interactive system using the same
JP5921835B2 (en) Input device
US10459530B2 (en) Cursor mode switching
US8284168B2 (en) User interface device
KR101688355B1 (en) Interaction of multiple perceptual sensing inputs
US10761610B2 (en) Vehicle systems and methods for interaction detection
JP5167523B2 (en) Operation input device, operation determination method, and program

Legal Events

Date Code Title Description
R207 Utility model specification

Effective date: 20150513

R150 Term of protection extended to 6 years