WO2017054894A1 - Système de commande interactif et procédé de réalisation d'une opération de commande sur un système de commande intéractif - Google Patents

Système de commande interactif et procédé de réalisation d'une opération de commande sur un système de commande intéractif

Info

Publication number
WO2017054894A1
WO2017054894A1 (PCT/EP2016/001134)
Authority
WO
WIPO (PCT)
Prior art keywords
displayed
area
hand
display
cursor
Prior art date
Application number
PCT/EP2016/001134
Other languages
German (de)
English (en)
Inventor
Paul Sprickmann Kerkerinck
Original Assignee
Audi Ag
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Audi Ag filed Critical Audi Ag
Publication of WO2017054894A1 publication Critical patent/WO2017054894A1/fr

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04812Interaction techniques based on cursor appearance or behaviour, e.g. being affected by the presence of displayed objects
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04842Selection of displayed objects or displayed text elements
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G5/08Cursor circuits

Definitions

  • the present invention relates to an interactive operating system with the features according to claim 1 and to a method for performing an operating action in an interactive operating system with the features according to claim 9.
  • DE 10 2013 207 528 A1 describes a method for interacting with an object displayed to a user through data glasses, comprising: displaying an object to the user by means of the display of the data glasses; detecting that the user closes his first eye and keeps it closed for a predetermined period of time, in particular using a first camera that captures the user's eye; capturing a hand of the user, in particular with the help of a second camera; determining that the user performs an input action on the object during that period, the input action being that the user's hand assumes a posture and, from the perspective of the user's second eye, a position relative to the object that satisfies a predetermined condition; and performing the action pre-assigned to the input action on the object.
  • DE 10 2013 215 370 A1 discloses a control device for use in a motor vehicle with a head-up display, by means of which information is displayed to an operator on an image surface arranged in the forward direction of travel, and with an operating device having an input device and a feedback device. At least two classes of display objects are displayed or designated on the head-up display, and at least these two classes of display objects are addressed by different classes of operating commands or sequences of operating commands.
  • a feedback symbol (cursor) is displayed, which indicates which display object relates to a subsequently entered operating command, wherein selection commands can be entered, inter alia, by a gesture.
  • In contrast to operation with an input device such as a computer mouse, a trackball or fingers on a touchpad, the hand has no guidance when operating by means of a free gesture. Therefore, only a relatively coarse positioning of an input pointer (cursor) is possible, and it is often difficult for a user of a gesture operating system to "hit" a displayed, selectable graphic element with the input pointer.
  • an operation by means of free gestures is often also used when a display area or operating elements on the display device are not in the gripping area of the operator, for example due to a safety belt applied by a vehicle occupant.
  • the cursor symbols customary today, such as an arrow or a crosshair, are too small to allow relaxed, fluid operation.
  • an interactive operating system which comprises a sensor device for acquiring image data in chronological sequence within a detection range of the sensor device and for transmitting the acquired image data to an evaluation device,
  • an evaluation device for evaluating the image data transmitted by the sensor device and for transmitting data relating to a position of a detected hand within the detection range determined by the evaluation to a control device
  • a display device having a display area in or on which one or more graphic elements which can be selected by means of a displayable cursor symbol can be displayed, and
  • a control device by means of which, on the basis of the data transmitted by the evaluation device, the cursor symbol can be displayed in or on the display surface of the display device as a function of a position of the hand in the detection area,
  • the evaluation device is configured to evaluate the image data transmitted to it by the sensor device in order to determine whether a detected hand within the detection range performs a free gesture in which at least two fingers of the hand move from a first, greater distance to a second, smaller distance from each other, and to transmit corresponding data to the control device, and
  • the control device is adapted to control the display device such that the cursor symbol is displayed in or on the display surface in the form of a closed curve or in the form of the edges of a polygon, and such that, upon transmission of data of such a gesture by the evaluation device, the size of the cursor symbol displayed in or on the display surface is changed so that the area covered by the closed curve or the edges of the polygon is reduced from a first, larger area to a second, smaller area.
  • the interactive operating system has the advantage that a cursor symbol in the form of a closed curve or in the form of the edges of a polygon, used as an input pointer, can be significantly larger than an arrow or a similar cursor symbol without obscuring relevant, in particular selectable, graphic elements (such as objects or controls), because such a cursor symbol does not cover the area within it on or in the display surface but merely "encircles" or encloses it.
  • an operator of the interactive control system receives immediate feedback as to whether he is performing his free gesture within the detection area and whether his free gesture, in which at least two fingers of his hand move toward each other, is recognized, since the cursor symbol in or on the display area is reduced only when both conditions are met.
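  • Purely as an illustration (the patent does not prescribe any particular algorithm), the recognition of such a gesture could be reduced to tracking the distance between two fingertips over successive frames; in the following Python sketch the fingertip coordinates are assumed to come from the evaluation device, and the threshold values and function names are hypothetical:

```python
# Illustrative sketch of the pinch recognition described above; thresholds are assumptions.
import math

PINCH_START_MM = 60.0   # assumed "first, greater distance"
PINCH_END_MM = 20.0     # assumed "second, smaller distance"

def fingertip_distance(tip_a, tip_b):
    """Euclidean distance between two fingertip coordinates (x, y[, z])."""
    return math.dist(tip_a, tip_b)

def pinch_recognized(distance_history):
    """True if at least two fingers have moved from a larger to a smaller distance,
    i.e. the tracked distance fell from above PINCH_START_MM to at or below
    PINCH_END_MM within the observed frames."""
    was_open = any(d >= PINCH_START_MM for d in distance_history)
    now_closed = bool(distance_history) and distance_history[-1] <= PINCH_END_MM
    return was_open and now_closed
```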
  • the control device is further configured to determine the selectable graphic element displayed in or on the display surface whose area centroid has the smallest distance to the area centroid of the cursor symbol displayed in or on the display surface and, in the case where this distance has a first, non-zero value, to automatically shift the position of the centroid of the displayed cursor symbol during the change of its area from the first, larger area to the second, smaller area, such that the distance between the area centroid of the displayed cursor symbol and the area centroid of the determined, displayed graphic element assumes a second value which is closer to zero than the first value and is preferably zero.
  • the control device is further configured to control the second, smaller area covered by the cursor symbol to such a size that the area of the displayed cursor symbol enclosed by the closed curve or the edges of the polygon is in the range of 50% to 200%, preferably in the range of 75% to 175%, particularly preferably in the range of 90% to 150%, of the area of the displayed graphic element.
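  • The centroid snapping and the 50%-200% sizing rule of the two preceding paragraphs could, for example, be combined as follows; this is a minimal sketch under assumed data structures (element centroids and areas in pixels), not an implementation taken from the patent:

```python
# Hypothetical "snap and resize" helper: pull the cursor centroid onto the nearest
# selectable element 3 and shrink the ring to a fraction of that element's area.
import math

def snapped_cursor(cursor_centroid, elements, area_factor=1.2):
    """elements: list of (centroid, area) tuples of the displayed selectable elements.
    area_factor is clamped to the claimed range of 0.5 to 2.0 times the element area."""
    target_centroid, element_area = min(
        elements, key=lambda e: math.dist(cursor_centroid, e[0]))
    new_area = max(0.5, min(2.0, area_factor)) * element_area
    return target_centroid, new_area      # new centroid and area of cursor symbol 2
```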
  • the control device is further configured to display the cursor symbol with the second, smaller area in or on the display area as long as the at least two fingers of the hand performing the gesture have the second, smaller distance from each other within the detection area.
  • the control device is further configured to change the position of the cursor symbol displayed with the second, smaller area in or on the display area according to a movement of the hand performing the gesture within the detection area, wherein it can optionally be provided that a change in the position of the displayed cursor symbol takes place only after this hand has moved over a predetermined minimum path length within the detection area.
  • the control device can also advantageously be configured, after the at least two fingers of the hand performing the gesture have reached the second, smaller distance, to carry out an action with respect to the displayed, selectable graphic element whose centroid has the smallest distance to the centroid of the cursor symbol, which action has been assigned to this graphic element in advance, if the at least two fingers of the hand performing the gesture maintain the second, smaller distance from each other within the detection range for a predeterminable period, or if another operating action is performed by an operator of the interactive control system within a predeterminable period.
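  • One conceivable way of implementing the hold condition mentioned above is a simple dwell timer that fires the pre-assigned action only after the pinch has been held for a predeterminable period; the hold time and class name below are assumptions for illustration only:

```python
# Illustrative dwell trigger for the action assigned to the selected element 3.
import time

HOLD_SECONDS = 0.8   # assumed "predeterminable period"

class DwellTrigger:
    def __init__(self, action):
        self.action = action          # callable pre-assigned to the graphic element
        self.closed_since = None      # moment the pinch was first seen closed

    def update(self, pinch_closed, now=None):
        """Call once per frame; returns True when the action has been performed."""
        now = time.monotonic() if now is None else now
        if not pinch_closed:
            self.closed_since = None
            return False
        if self.closed_since is None:
            self.closed_since = now
        if now - self.closed_since >= HOLD_SECONDS:
            self.action()
            self.closed_since = None
            return True
        return False
```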
  • the display device may advantageously comprise a graphics-capable screen, a head-up display or data glasses (a retina projector), and the sensor device may comprise one or more camera devices for visible light, one or more camera devices for infrared light, one or more radar devices, and/or one or more time-of-flight devices.
  • the sensor device may comprise one or more camera devices for visible light, one or more camera devices for infrared light, one or more radar devices, and/or one or more time-of-flight devices.
  • the present invention also includes a method for performing an operation on an interactive operation system that
  • a sensor device for acquiring image data in chronological sequence within a detection range of the sensor device and for transmitting the acquired image data to an evaluation device
  • an evaluation device for evaluating the image data transmitted by the sensor device and for transmitting data relating to a position of a detected hand within the detection range determined by the evaluation to a control device
  • a display device having a display area in or on which one or more graphic elements which can be selected by means of a displayable cursor symbol can be displayed, and a control device, by means of which, on the basis of the data transmitted by the evaluation device, the cursor symbol can be displayed as a function of a position of the hand in the detection area in or on the display surface of the display device.
  • displaying a cursor symbol, as a function of a position of a hand in the detection area of a sensor device, in the form of a closed curve or in the form of the edges of a polygon in or on a display surface of a display device, such that the surface covered by the closed curve or the edges of the polygon has a first, larger area
  • this further comprises
  • the single figure shows two states of a display area of the interactive control system with at least one displayed, selectable graphic element and a cursor symbol in the form of a ring; an associated, exemplary hand position is shown for each of the two states.
  • the illustrations in the figure are purely schematic and not to scale. Within the figure, the same or similar elements are provided with the same reference numerals.
  • the interactive operating system has a sensor device (not shown in the figure) for acquiring image data in chronological sequence within a detection range of the sensor device and for transmitting the acquired image data to an evaluation device.
  • the sensor device of the interactive operating system preferably has as detection device one or more visible light camera devices, one or more infrared light camera devices, one or more radar devices, and / or one or more time-of-flight devices.
  • the sensor device can, for example, also have a light source which serves to illuminate the detection area of the sensor device. The detection area can thereby be illuminated (for example by infrared LEDs) without any risk of glare for an operator of the interactive control system.
  • Detecting devices of the sensor device based on different technical principles can be used simultaneously or alternatively to capture image data.
  • image data can be acquired simultaneously by technically diverse detection devices, e.g. by a camera device for visible light and a radar device, by a camera device for visible light and a camera device for infrared light, etc.
  • by evaluating the image data of these technically diverse detection devices in the evaluation device, the recognition rate with respect to the hand gesture, for example, can be increased.
  • the interactive operating system preferably has a sensor device with which stereoscopic or three-dimensional images can be detected. This facilitates, for example, the detection of the upper extremities of an operator of the interactive operating system, since it is easier to differentiate between upper extremities, background and foreground within the detection range of the sensor device than is the case with two-dimensional image acquisition.
  • Stereoscopic or three-dimensional image acquisition can, for example, be achieved in a manner known per se by a corresponding stereoscopic arrangement and alignment of the detection axes of two camera devices for visible or infrared light and a corresponding evaluation of the acquired image data by the evaluation device.
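  • For a rectified stereo pair of the kind mentioned above, the depth of a matched point follows the standard pinhole relation Z = f·B/d (focal length times baseline divided by disparity); this is textbook stereo geometry rather than something specified by the patent, and the numeric defaults below are assumptions:

```python
# Standard stereo depth from disparity; parameter values are illustrative only.
def stereo_depth_mm(disparity_px, focal_length_px=700.0, baseline_mm=60.0):
    """Depth Z = f * B / d of a point matched in both camera images."""
    if disparity_px <= 0:
        return float("inf")   # no usable disparity
    return focal_length_px * baseline_mm / disparity_px
```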
  • the position and / or movement of a hand and the fingers within the detection range of the radar device can be detected spatially. Due to the (further) developments of recent years, the sizes of radar devices have been increasingly reduced, so that they can now be used for the interactive control system according to the present invention in many cases even with a limited space available.
  • the sensor device of the interactive operating system according to the present invention has one or more so-called “time-of-flight devices", wherein the interactive operating system can of course have further, technically different image acquisition devices.
  • in a time-of-flight device, a signal emitted by a transmitting device is reflected at an object and the reflected signal is detected by a receiving device. From the propagation speed and the transit time of the signal, the distance of the object from the receiving device can be determined.
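  • The time-of-flight principle just stated amounts to a one-line calculation, since the signal travels to the object and back; a minimal sketch:

```python
# Distance from a time-of-flight measurement: half of (speed of light x round-trip time).
SPEED_OF_LIGHT_M_S = 299_792_458.0

def tof_distance_m(round_trip_seconds):
    """Distance of the reflecting object from the receiving device."""
    return SPEED_OF_LIGHT_M_S * round_trip_seconds / 2.0
```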
  • one of the best-known time-of-flight devices is a PMD sensor or PMD camera system.
  • a PMD sensor has a single-cell or, as a rule, multi-cell array of PMD detector elements and receiving optics.
  • one or more PMD systems or PMD camera systems may preferably be provided in the interactive control system. It should be noted, however, that a sensor device with which stereoscopic or three-dimensional images can be detected is not a mandatory feature of the interactive operating system, and a sensor device with which only two-dimensional images can be detected can therefore be sufficient in many cases. Even where it is advantageous or necessary for the interactive operating system according to the present invention to be able to recognize the position of a hand in space, the acquisition of only two-dimensional images may suffice, since the position of an object in space can also be determined (at least approximately) from two-dimensional images by means of corresponding intelligent algorithms.
  • the movement of the hand is detected by the sensor device with a time sequence at least sufficient for the purpose of the invention, for example in the range of 15 to 40 acquisitions (images) per second. Of course, a sufficient resolution of the captured images must also be provided, so that the gesture of a hand with at least two fingers can be detected by the evaluation device with at least sufficient certainty.
  • the images detected by the sensor device can, of course, be processed with known image enhancement methods (in terms of brightness, contrast, brightness distribution, etc.) in order to improve the detection performance with respect to a free hand gesture performed in space.
  • the detection range of the sensor device is not particularly limited and, in a simple case, may be determined by the structure and orientation (detection angle) of the sensor device.
  • a restriction of the detection range to a subarea of the available space can be realized in a particularly simple manner with the aid of a sensor device with which stereoscopic or three-dimensional images can be detected.
  • the interactive operating system has an evaluation device for evaluating the image data transmitted by the sensor device and for transmitting data relating to a position of a detected hand determined by the evaluation within the detection region to a control device.
  • the evaluation device can be any suitable device.
  • the evaluation device may include an image evaluation system known per se, in which images captured by the sensor device are evaluated by means of a suitable combination of hardware (computer, digital computing device) and software (image evaluation and possibly image enhancement software).
  • there are no special restrictions on the display device, which is another component of the interactive control system according to the present invention and which has a (real or virtual) display area 1 in or on which one or more graphic elements 3 that can be selected by means of a displayable cursor symbol 2 can be displayed.
  • the display device may advantageously comprise, for example, a graphics-capable (if desired also touch-sensitive and/or proximity-sensitive) screen, a head-up display or data glasses (a retina projector).
  • the displays in or on the display area 1 of the display device are controlled by a controller.
  • this control device is adapted to display the cursor symbol 2 in the form of a closed curve or in the form of the edges of a polygon in or on the display surface 1, and, upon transmission by the evaluation device of data of a free gesture in which at least two fingers of a hand move from a first, greater distance to a second, smaller distance to each other, to change the size of the cursor symbol 2 displayed in or on the display surface 1 such that the area covered by the closed curve or the edges of the polygon is reduced from a first, larger area to a second, smaller area.
  • the change in the size of the cursor symbol 2 preferably does not take place suddenly - even if such a sudden change is encompassed by the present invention - but over a predetermined period of time, particularly preferably as synchronously as possible with the execution of the above-defined gesture.
  • the size of the displayed cursor symbol 2 is thus preferably dynamically changeable.
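  • A gesture-synchronous, dynamic size change of the kind described above could, for instance, interpolate the displayed ring radius between an open and a closed value according to the current fingertip distance; the threshold values repeat the assumptions of the earlier pinch sketch and are not taken from the patent:

```python
# Illustrative per-frame mapping from fingertip distance to the radius of cursor symbol 2.
PINCH_START_MM = 60.0   # assumed fully open fingertip distance
PINCH_END_MM = 20.0     # assumed fully closed fingertip distance

def ring_radius_px(finger_distance_mm, radius_open_px, radius_closed_px):
    """Linearly interpolate the ring radius as the fingers close."""
    t = (finger_distance_mm - PINCH_END_MM) / (PINCH_START_MM - PINCH_END_MM)
    t = max(0.0, min(1.0, t))             # 1.0 = fully open, 0.0 = fully closed
    return radius_closed_px + t * (radius_open_px - radius_closed_px)
```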
  • the closed curve forming the cursor symbol 2 is preferably a simple closed curve, such as a circular or oval ring, and the polygon through whose edges the cursor symbol 2 can be formed is preferably a simple polygon, such as a (preferably regular) triangle, square, pentagon, hexagon, heptagon, octagon, etc.
  • the line with which the cursor symbol 2 is displayed may have any suitable line weight. It is also encompassed by the present invention if the line with which the cursor symbol 2 is displayed is not completely closed, i.e. not entirely without interruption. Thus, according to the present invention, the line can also have one or more brief interruptions, so that the displayed line of the cursor symbol 2 amounts to only about 95%, 90%, 85%, 80% or 75% of the corresponding uninterrupted line of the closed curve or of the corresponding continuous edges of the polygon.
  • the manner of displaying the selectable graphic elements 3 displayed in or on the display area 1 is advantageously not particularly influenced by the display of the cursor symbol 2. In particular, this applies to the area of the display area 1, which is enclosed by the cursor icon 2.
  • the cursor symbol 2 can indeed be displayed "above" one or more displayed, selectable graphical elements 3, i.e. the closed curve or the edges of the polygon by which the cursor symbol 2 is formed in or on the display surface 1 may "overlay" one or more "underlying" displayed, selectable graphical elements 3; the remaining area of the one or more selectable graphical elements 3 concerned, however, continues to be displayed in or on the display area 1.
  • the control device of the interactive operating system may likewise be a suitable combination of hardware and software, by means of which control signals for the display device can be generated on the basis of the results of the evaluation device and transmitted to it.
  • the evaluation and control device can be designed as separate components / assemblies or as a common component or common assembly of the system according to the invention.
  • since the operating system according to the present invention serves to operate one or more devices interactively, it is preferred that the transmission of the acquired image data to the evaluation device, the evaluation of the image data transmitted to it, the transmission of the data from the evaluation device to the control device, the activation of the display device by the control device, and the display of the cursor symbol 2 resulting from this activation each take place with little delay.
  • all the devices of the interactive operating system according to the present invention are advantageously set up and cooperate in such a way that, within the shortest possible time interval (of, for example, 0.01 s, 0.02 s, 0.03 s, 0.04 s, 0.05 s, 0.06 s, 0.07 s, 0.08 s, 0.09 s, 0.10 s, 0.15 s, 0.20 s, 0.25 s, 0.30 s, 0.35 s, 0.40 s, 0.45 s or 0.5 s) after a free gesture in which at least two fingers of the hand move from a first, greater distance to a second, smaller distance toward each other is performed, a corresponding reduction of the cursor symbol 2 displayed in or on the display surface 1 of the display device takes place. In a preferred manner, this reduction thus takes place as simultaneously as possible with the movement of the at least two fingers toward one another.
  • the control device of the interactive control system is further adapted to determine the selectable graphic element 3 displayed in or on the display surface 1 whose area centroid is closest to the area centroid of the cursor symbol 2 displayed in or on the display surface 1, and, in the case where this distance has a first, non-zero value, to shift the position of the centroid of the displayed cursor symbol 2 during the change of its area from the first, larger area to the second, smaller area automatically (preferably dynamically, i.e. as synchronously as possible with the above-defined gesture) such that the distance between the centroid of the displayed cursor symbol 2 and the centroid of the determined, displayed element 3 assumes a second value which is closer to zero than the first value and is preferably zero.
  • where a value is referred to above as "non-zero", this is to be understood as a distance in or on the display surface of a non-zero number of pixels, centimeters, millimeters or a similar measure.
  • the distance between the area centroids of the displayed cursor symbol 2 and of the nearest displayed, selectable graphic element 3 is automatically reduced during the displacement of the cursor symbol 2, preferably to zero, that is to say both centroids preferably occupy the same position in or on the display surface 1 at the end of the change.
  • the result of such a displacement, with simultaneous reduction of the area covered by the cursor 2 is shown schematically and exemplarily in the lower part of the figure.
  • the position of the displayed cursor icon 2 is tracked accordingly.
  • the size of the displayed cursor symbol 2 in which the area encompassed by the closed curve or the edges of the polygon has the first, larger area is not particularly limited. However, since a reduction of the area surrounded by the displayed cursor symbol 2 should take place when a free gesture in which at least two fingers of a hand move toward each other is performed in the detection range of the sensor device, and since this reduction should also be perceptible as such to an operator of the interactive control system, it can be provided that the first, larger area is in the range from 250% to 1000% (for example 250%, 300%, 350%, 400%, 450%, 500%, 550%, 600%, 650%, 700%, 750%, 800%, 850%, 900% or 1000%) of the area of the largest selectable element 3 displayed in or on the display area.
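  • Choosing the first, larger area relative to the largest displayed selectable element, as described above, could look like the following sketch for a circular ring; the 400% factor is merely one value picked from the stated range for illustration:

```python
# Illustrative sizing of the initial (first, larger) ring from the largest element area.
import math

def initial_ring_radius_px(element_areas_px2, factor=4.0):
    """Radius of a circular cursor symbol 2 whose enclosed area is `factor` times
    the area of the largest displayed selectable element 3."""
    ring_area = factor * max(element_areas_px2)
    return math.sqrt(ring_area / math.pi)
```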
  • the strength and/or color of the displayed curve or of the edges of the polygon is not particularly limited and will be selected by a person skilled in the art, taking into account the nature of the display device, the presumed or given distance of the operator's eyes from the display surface, the environmental conditions (e.g. brightness) under which the interactive control system is used, etc., so that the cursor symbol 2 can be perceived by a human safely and effortlessly.
  • it can be provided that the displayed line thickness, color and/or display mode of the displayed curve or of the edges of the polygon remains unchanged during a reduction of the area covered by the closed curve or the edges of the polygon from a first, larger area to a second, smaller area, or that at least one of them changes, for example that with such a reduction the line thickness decreases from a first, thicker line thickness to a second, thinner line thickness.
  • it may also be provided, for example, that the color of the displayed curve or of the edges of the polygon, or the display type, changes, for example that the cursor symbol 2 is displayed in a changed color or in a periodically changing color and/or brightness.
  • a square element is provided with the reference numeral 3 and this element represents a selectable graphic element 3 in the sense of the present invention.
  • there may be other selectable graphic elements, or one, several or all of the displayed graphical elements may be elements that are not selectable in a given situation or not selectable in principle.
  • a selectable graphical element 3 may be an icon or other graphical object that is optically distinct from its surroundings and spatially distinct from other graphic elements displayed in or on the display surface.
  • by selecting such an element, a program or app is started, a folder or a file is opened, information is displayed, a device is started or stopped (enabled or disabled), or settings or adjustments can be made to a device.
  • a non-selectable graphical element may, for example, be one that displays only information, such as a temperature or status of a device.
  • the line forming the cursor symbol 2 "covers" certain areas of some displayed graphic elements, while the remaining areas of the elements concerned that lie within the surface enclosed by the line of the cursor symbol 2, and the graphical elements displayed entirely within that surface, are preferably displayed in the same manner as those elements and parts of elements located outside the cursor symbol 2. This ensures that an operator of the interactive operating system can at any time see the displayed graphical elements and recognize their meaning.
  • if the evaluation device recognizes a free gesture of a hand in the detection range of the sensor device in which at least two fingers of the hand move from a first, greater distance to a second, smaller distance toward each other (as symbolized, for example, by the two hand positions represented on the right side of the figure), the displayed cursor symbol 2 is changed by the control device, on the basis of the data transmitted to it by the evaluation device, such that the surface encompassed by the closed curve or the edges of the polygon is reduced from a first, larger area to a second, smaller area.
  • a further advantageous embodiment of the present invention is shown, namely that the second, smaller area covered by the displayed cursor symbol 2 is controlled only to such a size that the area of the displayed cursor symbol 2 enclosed by the closed curve or the edges of the polygon is in the range of 50% to 200%, preferably in the range of 75% to 175%, particularly preferably in the range of 90% to 150%, of the area of the displayed graphic element 3.
  • the graphic element 3 is in a "selected state.”
  • the manner of displaying the selected element 3 may be changed, for example
  • the selected graphic element 3 may be displayed in a different color, different brightness, a periodically changing color and / or a periodically changing brightness than before the "selected state”. It can also be provided, for example, that a signal tone indicative of the "selected state" or a corresponding spoken advice is output to the operator of the interactive operating system.
  • the control device is further configured to display the cursor symbol 2 with the second, smaller area as long as the at least two fingers of the hand performing the free gesture within the detection area have the second, smaller distance from each other. It can also be provided that the position of the cursor symbol 2 displayed with the smaller area in or on the display device is changed according to a movement of the hand performing the gesture within the detection range, where it may optionally be provided that a change in the position of the displayed cursor symbol 2 is carried out only after this hand has moved over a predeterminable minimum path length within the detection range.
  • the automatic shifting of the area centroid of the cursor symbol 2 toward the area centroid of the nearest displayed, selectable graphical element 3 serves to help an operator of the interactive control system "select" (aim at and hit) such a graphical element 3, in that the area centroid of the cursor symbol 2 is "pulled", quasi "magnetically" and often barely perceptibly to the operator, in the direction of the nearest selectable graphic element 3.
  • the interactive operating system is intended to be able to serve all conceivable and possible devices, such as any machine, a plant (a production or work facility with multiple devices), a vehicle, a vehicle-mounted device, a mobile or stationary electronic device such as a mobile phone (smartphone), a tablet computer, a personal computer, a device in the field of consumer electronics, etc., or parts thereof, and/or to be part of such a device.
  • it can further be provided in an advantageous manner that, after the at least two fingers of the hand performing the free gesture have reached the second, smaller distance, an action is performed with respect to the displayed, selectable graphical element 3 whose centroid has the smallest distance to the centroid of the cursor symbol.
  • the "action” taken is not particularly limited and may include, for example, the actions usually associated with the operation of a stationary or mobile computer (such as a personal computer, tablet computer), a mobile phone (such as a smartphone), a machine, or a computer Are such as starting or closing a program or app, opening or closing a file, making settings or selections in a program or app, editing files, changing parameters, and / or the generation and transmission of control commands, etc.
  • the further operating action (the first operating action here being the gesture of the operator in the detection range of the sensor device, in which at least two fingers of a hand move from a first, larger distance to a second, smaller distance toward one another) is not particularly limited.
  • this further operator action may be a further gesture, an actuating action with an operating device (a switch, a rocker, a touchpad, etc.) or a voice command.
  • if a gesture is provided as the further operating action, this may include a turning of the hand, a movement of the hand in the direction of the (real or virtual) display area ("pressing") and/or a movement of the hand away from the (real or virtual) display area ("pulling") - each in the detection range of the sensor device - the latter operating action often being detected most reliably, in an advantageous manner, by the currently available sensor and evaluation devices.
  • the interactive operating system is intended to be able to serve all conceivable and possible devices, such as any machine, a system (a production or work installation with several devices), a vehicle, a vehicle-side device, a mobile or stationary electronic device such as a mobile phone (smartphone), a tablet computer, a personal computer, a device in the field of consumer electronics, etc., or a part thereof, or to be a part of such a device.
  • the interactive control system may be part of a mobile phone, a personal digital assistant, a tablet PC, a mobile or stationary device for playing music and/or video data, a game console, a television, etc., as well as part of any machine (such as a power tool), vehicle or plant, or a component of a device of a machine or plant (such as a driver assistance system).
  • the devices, machines and systems mentioned, etc. comprise all the elements of the interactive operating system, that is to say a sensor device, evaluation device, display device and control device respectively configured according to the present invention.
  • individual or all elements of the interactive operating system can also be present or provided separately from the devices, machines, installations, etc.; that is to say, the sensor device, evaluation device, display device and/or control device may be facilities separate from the device, machine, system, etc. or from a part thereof, or the interactive control system may be a stand-alone device.
  • multiple data are to be transmitted from one device to another device of the interactive operating system.
  • data is often required to be communicated from the interactive control system to other devices, such as a device, machine, equipment, etc. or part thereof.
  • the transmission can take place in all known ways and means, for example wired, wireless (such as by radio waves or light) and/or by means of a bus system.
  • transmission of data via a network is encompassed by the present invention.
  • data is transmitted in digital form.
  • examples of such free gestures include a movement of the fingertips of the thumb and forefinger toward one another, a movement of the fingertips of the thumb, forefinger and middle finger toward one another, a movement of the fingertips of the thumb, forefinger, middle finger and ring finger toward one another, a movement of the fingertips of all fingers of a hand toward one another, or a gripping motion in which the fingers of the hand eventually form a fist.
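  • All of the gesture variants listed above can be covered by a single "grip measure", for example the mean distance of the participating fingertips to the thumb tip, which decreases as the hand closes regardless of how many fingers take part; this is an illustrative assumption, not the patent's method:

```python
# Illustrative grip measure for two-, three-, four- or five-finger closing gestures.
import math

def grip_distance_mm(thumb_tip, other_tips):
    """Mean distance of the participating fingertips (forefinger only, forefinger+middle,
    ..., or all four fingers) to the thumb tip."""
    return sum(math.dist(thumb_tip, tip) for tip in other_tips) / len(other_tips)
```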
  • the operation of graphical user interfaces on screens, head-up displays or data glasses with hand or arm gestures in free space excludes the possibility of haptic feedback, so that other modalities, in particular optical feedback, are of special importance for intuitive and fluid operation.
  • the cursor symbols used today for operation with an input device (such as a mouse, trackball, etc.) or with fingers (touch input) are not well suited for operation by means of a free gesture, as haptic feedback is missing during a movement of the hand in space. Since the hand has no guidance, the positioning is rather coarse compared to mouse or touch control.
  • an input pointer is proposed in the form of a closed curve (for example in the form of a ring) or a polygon whose size can be changed (preferably dynamically).
  • Diameter of the input pointer is thus preferably dynamically changeable.
  • a ring as an input pointer can be significantly larger than an arrow, without concealing relevant objects (such as selectable graphic elements), because the interior (the center) of the ring is not covered, but encircled.
  • the operator's "picking" of the graphical element is directly reflected by a decreasing ring according to the present invention.
  • suitable sensors, such as 2D/3D camera(s), etc., are provided, which cover a gesture control space.
  • the gesture sensor recognizes a "closing of the hand" (the execution of the above-mentioned gesture) and the diameter of the ring is, if possible, reduced in parallel with it. The ring should preferably not become much smaller than the "gripped" graphic object (element), so that this object remains visible within the ring even in the "gripped state".
  • the center of the ring is centered on the object to be gripped, so that this object is exactly encircled in the gripped state. This provides perfect visual feedback during gesture operation.
  • the present invention thus provides an interactive operating system and a method for performing an operating action in an interactive operating system in which the operation itself and the presentation of image content take account of the physical characteristics of the human being to a particular degree and are designed in a particularly expedient manner.
  • the present invention takes into account in particular the fact that, in an operation by means of a free hand gesture in space, the operation can only be performed comparatively inaccurately due to the physical conditions of a person.
  • the present invention also enables a person to perceive displayed information in a particularly expedient manner, in that this information comprises a dynamic resizing of displayed elements, whereby the possibility of perception is improved, preferably also by an immediate correlation between the execution of an action (hand gesture) and a presentation that changes dynamically in or on a display area.

Abstract

The present invention relates to an interactive control system comprising a sensor means for detecting image data, a means for evaluating the image data, a display means provided with a display surface (1), and a control means. The evaluation means is adapted to evaluate image data in order to determine whether a detected hand performs a gesture during which at least two fingers of the hand move toward each other from a first, relatively large distance to a second, relatively small distance, and the control means is adapted to control the display device such that a cursor icon (2) is displayed in the form of a closed curve in or on the display surface (1) and such that, during such a gesture, the size of the cursor icon (2) displayed in or on the display surface (1) is changed so that the area delimited by the closed curve is reduced from a first, relatively large area to a second, relatively small area.
PCT/EP2016/001134 2015-10-01 2016-07-02 Système de commande interactif et procédé de réalisation d'une opération de commande sur un système de commande intéractif WO2017054894A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102015012720.9A DE102015012720A1 (de) 2015-10-01 2015-10-01 Interaktives Bediensystem und Verfahren zum Durchführen einer Bedienhandlung bei einem interaktiven Bediensystem
DE102015012720.9 2015-10-01

Publications (1)

Publication Number Publication Date
WO2017054894A1 true WO2017054894A1 (fr) 2017-04-06

Family

ID=56360352

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2016/001134 WO2017054894A1 (fr) 2015-10-01 2016-07-02 Système de commande interactif et procédé de réalisation d'une opération de commande sur un système de commande intéractif

Country Status (2)

Country Link
DE (1) DE102015012720A1 (fr)
WO (1) WO2017054894A1 (fr)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2568507A (en) * 2017-11-17 2019-05-22 Jaguar Land Rover Ltd Vehicle Controller
GB2568510A (en) * 2017-11-17 2019-05-22 Jaguar Land Rover Ltd Vehicle controller
CN112286407A (zh) * 2019-07-13 2021-01-29 兰州大学 一种域光标

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108475085A (zh) * 2017-05-16 2018-08-31 深圳市柔宇科技有限公司 头戴式显示设备及其交互输入方法

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120051596A1 (en) * 2010-08-31 2012-03-01 Activate Systems, Inc. Methods and apparatus for improved motioin capture
US20130265222A1 (en) * 2011-07-05 2013-10-10 Primesense Ltd. Zoom-based gesture user interface
DE102013207528A1 (de) 2013-04-25 2014-10-30 Bayerische Motoren Werke Aktiengesellschaft Verfahren zum Interagieren mit einem auf einer Datenbrille angezeigten Objekt
DE102013215370A1 (de) 2013-08-05 2015-02-05 Continental Automotive Gmbh Steuerungseinrichtung zur Verwendung in einem Kraftfahrzeug
US20150261408A1 (en) * 2014-03-17 2015-09-17 Microsoft Corporation Multi-stage Cursor Control

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8717318B2 (en) * 2011-03-29 2014-05-06 Intel Corporation Continued virtual links between gestures and user interface elements
JP6159078B2 (ja) * 2011-11-28 2017-07-05 京セラ株式会社 装置、方法、及びプログラム
SE536989C2 (sv) * 2013-01-22 2014-11-25 Crunchfish Ab Förbättrad återkoppling i ett beröringsfritt användargränssnitt

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120051596A1 (en) * 2010-08-31 2012-03-01 Activate Systems, Inc. Methods and apparatus for improved motioin capture
US20130265222A1 (en) * 2011-07-05 2013-10-10 Primesense Ltd. Zoom-based gesture user interface
DE102013207528A1 (de) 2013-04-25 2014-10-30 Bayerische Motoren Werke Aktiengesellschaft Verfahren zum Interagieren mit einem auf einer Datenbrille angezeigten Objekt
DE102013215370A1 (de) 2013-08-05 2015-02-05 Continental Automotive Gmbh Steuerungseinrichtung zur Verwendung in einem Kraftfahrzeug
US20150261408A1 (en) * 2014-03-17 2015-09-17 Microsoft Corporation Multi-stage Cursor Control

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
MATTHIAS SCHWALLER ET AL: "Pointing in the Air: Measuring the Effect of Hand Selection Strategies on Performance and Effort", 1 July 2013, HUMAN FACTORS IN COMPUTING AND INFORMATICS, SPRINGER BERLIN HEIDELBERG, BERLIN, HEIDELBERG, PAGE(S) 732 - 747, ISBN: 978-3-642-39061-6, pages: 732 - 747, XP047033542 *

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2568507A (en) * 2017-11-17 2019-05-22 Jaguar Land Rover Ltd Vehicle Controller
GB2568510A (en) * 2017-11-17 2019-05-22 Jaguar Land Rover Ltd Vehicle controller
GB2568510B (en) * 2017-11-17 2020-04-01 Jaguar Land Rover Ltd Vehicle controller
CN112286407A (zh) * 2019-07-13 2021-01-29 兰州大学 一种域光标

Also Published As

Publication number Publication date
DE102015012720A1 (de) 2017-04-06

Similar Documents

Publication Publication Date Title
EP1998996B1 (fr) Serveur interactif et procédé permettant de faire fonctionner le serveur interactif
EP3507681B1 (fr) Procédé d'interaction avec des contenus d'image qui sont représentés sur un dispositif d'affichage dans un véhicule
EP2701938B1 (fr) Interface pour le transfert de données sans fil dans un véhicule automobile et produit logiciel informatique
EP3116737B1 (fr) Procédé et dispositif servant à fournir une interface graphique d'utilisateurs dans un véhicule
EP2943367A1 (fr) Procédé de synchronisation de dispositifs d'affichage d'un véhicule automobile
WO2017211817A1 (fr) Dispositif de commande pourvu d'une unité de suivi des yeux et procédé d'étalonnage d'une unité de suivi des yeux d'un dispositif de commande
DE102015211358A1 (de) Eingabevorrichtung für fahrzeuge und fahrzeugcockpitmodul
WO2017054894A1 (fr) Système de commande interactif et procédé de réalisation d'une opération de commande sur un système de commande intéractif
DE112015002938T5 (de) Bedienvorrichtung für ein Fahrzeug
WO2013029773A1 (fr) Procédé et dispositif de mise à disposition d'une interface utilisateur, en particulier dans un véhicule
WO2014108147A1 (fr) Zoom et déplacement d'un contenu d'image d'un dispositif d'affichage
EP3573854A1 (fr) Procédé permettant de faire fonctionner un système de commande, système de commande et véhicule comprenant un système de commande
WO2018234147A1 (fr) Procédé servant à faire fonctionner un dispositif d'affichage pour un véhicule automobile, ainsi que véhicule automobile
WO2015162058A1 (fr) Interaction de gestes avec un système d'information de conducteur d'un véhicule
WO2016124473A1 (fr) Procédé de sélection d'un élément de commande d'un véhicule automobile et système de commande pour un véhicule automobile
DE102016218003B4 (de) Verfahren und Vorrichtung zur Optimierung der Darstellung auf mehreren Displays in einem Fahrzeug
DE102014224898A1 (de) Verfahren zum Betreiben einer Eingabevorrichtung, Eingabevorrichtung
DE102017218718A1 (de) Verfahren, Vorrichtung und Fortbewegungsmittel zur Unterstützung einer Gestensteuerung für ein virtuelles Display
WO2014114428A1 (fr) Procédé et système pour commander en fonction de la direction du regard une pluralité d'unités fonctionnelles et véhicule automobile et terminal mobile comprenant un tel système
WO2014040807A1 (fr) Entrées par effleurement le long d'un seuil d'une surface tactile
DE102014217969A1 (de) Verfahren zum Betreiben einer Eingabevorrichtung, Eingabevorrichtung
DE102020106021A1 (de) Verfahren und system zum betreiben eines auswahlmenüs einer grafischen benutzeroberfläche basierend auf dem erfassen einer rotierenden freiraumgeste
WO2021028274A1 (fr) Système utilisateur et procédé pour faire fonctionner un système utilisateur
DE102019207696A1 (de) Kraftfahrzeug mit einer Bedienvorrichtung, Verfahren zum Wechseln einer Position eines Bedienelements mit einer berührungssensitiven Oberfläche, und Steuereinrichtung
DE102018212398A1 (de) Eingabeeinheit und Verfahren zum Eingeben von Steuerbefehlen

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16735579

Country of ref document: EP

Kind code of ref document: A1

122 Ep: pct application non-entry in european phase

Ref document number: 16735579

Country of ref document: EP

Kind code of ref document: A1