EP3044648A1 - Procédé de commande sélective par détection de direction préférentielle - Google Patents

Procédé de commande sélective par détection de direction préférentielle

Info

Publication number
EP3044648A1
Authority
EP
European Patent Office
Prior art keywords
movement
preferred direction
parallel
movements
straight lines
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP13771080.2A
Other languages
German (de)
English (en)
Inventor
Dirk Beckmann
Max SCHUEMANN
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Steinberg Media Technologies GmbH
Original Assignee
Steinberg Media Technologies GmbH
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Steinberg Media Technologies GmbH filed Critical Steinberg Media Technologies GmbH
Publication of EP3044648A1 publication Critical patent/EP3044648A1/fr
Withdrawn legal-status Critical Current

Links

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0416Control or interface arrangements specially adapted for digitisers
    • G06F3/04166Details of scanning methods, e.g. sampling time, grouping of sub areas or time sharing with display driving
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0354Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/038Control and interface arrangements therefor, e.g. drivers or device-embedded control circuitry

Definitions

  • The present application relates to a method for the selective, temporally sequential activation of at least two functions with an input means that is set up to detect at least two-dimensionally coupled movements. It further relates to a method for converting a movement detected with such an input means, as well as to a computer program product, a data carrier and a data processing system.
  • Input means for detecting at least two-dimensionally coupled movements are well known in the art, for example the computer mouse or the trackball, but also camera-based input means. They are to be distinguished from input means such as a scroll wheel, which is set up only for the one-dimensional detection of movement. Input means for detecting at least two-dimensionally coupled movements are characterized by the fact that motion components in different dimensions can be detected from a single movement. This is also the case, for example, with joysticks, sensor gloves or optical detection methods, provided they are not designed purely one-dimensionally, even though the detection in the input means is often carried out by several individual sensors, each detecting movements one-dimensionally.
  • An input means for detecting multidimensionally coupled motions is also involved when at least two such one-dimensional acquisitions are performed in order to detect a coupled multi-dimensional motion. This is the case, for example, when in an opto-mechanical computer mouse two wheels each pick up one-dimensional movements of the mouse ball, and these one-dimensional movements are then first detected individually.
  • While computer mice also include input means for detecting only one-dimensional inputs, such as a scroll wheel or keys, they also have means for capturing coupled, at least two-dimensional motions.
  • Means for capturing coupled, at least two-dimensional motions can be designed in different ways; optical scans as well as mechanical solutions are known.
  • On the one hand, the detected movements can be taken over unfiltered and unsmoothed for further processing; on the other hand, a movement component can also be ignored or a corresponding smoothing performed. It is thus known, for example, to recognize a preferred direction in a gesture that is initiated, for example, by pressing and holding a key, and to ignore, for the duration of the gesture, movement components that deviate from the preferred direction. However, this means that if the user wants movement components in a different direction to be recognized and processed, he must first end the gesture, for example by releasing the button.
  • The object of the present invention is therefore to provide a method, a computer program product, a data carrier or a data processing system that is suitable for the treatment or conversion of multi-dimensionally coupled movements into control instructions.
  • The method according to the invention is a method for the selective, temporally sequential triggering of at least two functions with an input means that is set up to detect at least two-dimensionally coupled movements.
  • Although the functions are controlled successively, each function can be controlled any number of times.
  • In a temporal sequence it is decided which of the at least two available functions is to be controlled at any given time.
  • The movement is detected with the input means, and movement sections of successive time periods of the detected movement are obtained.
  • For each movement section, a preferred direction is determined according to the invention.
  • The preferred direction is directed parallel to, or in one direction along a parallel of, one of at least two predetermined non-parallel straight lines.
  • To determine the preferred direction of each movement section, according to the invention the movement of the respective movement section is used and, if it is not the first movement section, additionally the movement of at least one movement section directly preceding it.
  • Each of the non-parallel straight lines, or each direction along one of the non-parallel straight lines, is assigned exactly one of the at least two functions.
  • The individual movement sections advantageously border directly on each other, but movement sections with gaps in between are also conceivable.
  • The method is characterized by the fact that, without interrupting a control or a gesture, for example without pressing or releasing a button, targeted and selective controls in different directions, or activations of different functions, are possible.
  • Such a method can be used, for example, for navigation within displayed data records, for example image or sound data, as well as for controlling real machines such as cranes, robots or the like.
  • Movement sections in direct sequence are those between which no other detected movement section lies. If movement sections are detected with gaps in between, these gaps remain out of consideration when the movement sections are considered with regard to their direct sequence.
  • Advantageously, the movements of the movement sections used to determine the respective preferred direction are weighted, such that older movement sections enter with a lower weight.
  • The respective preferred direction is determined by comparing the portions of the movements of the movement sections used to determine the preferred direction, or their absolute values, parallel to the predetermined straight lines or directions.
  • The preceding movements contribute in that their portions along the given straight lines or directions are included in the determination of the new preferred direction.
  • The preceding preferred directions do not contribute as such; only the earlier motion components are used to determine the new preferred direction. This allows particularly efficient and accurate control.
  • The preferred direction chosen is the one that coincides with the larger of the motion components. If the motion components are equal, an arbitrary choice can be made or no activation carried out.
  • The portion of the detected movement used in each case for selective activation depends on the speed and direction of the detected movement. Faster movements thus advantageously result in larger detected portions of movement over a period of equal length. The portion of the detected movement used for the control also depends on its direction in relation to the preferred direction determined for the respective movement section.
  • Advantageously, the number of dimensions of the space spanned by the straight lines is equal to the number of dimensions of the space in which the input means is set up to detect coupled movements.
  • This is not a selection rule for the number of specified straight lines, but only for the number of dimensions of the space spanned by them.
  • A classic computer mouse, for example, is set up to capture two-dimensionally coupled motion.
  • In this case it is advisable to choose the given straight lines in such a way that they span a two-dimensional space.
  • For input means which detect motions coupled in three-dimensional space, as is often the case with camera-supported capture, it is advisable to choose the straight lines so that they also span a three-dimensional space.
  • However, the number of preferred directions need not be identical to the number of dimensions in which the input means is set up for coupled detection.
  • For example, one preferred direction can be horizontal and one perpendicular, while a third preferred direction extends at an angle of 45° to these two directions.
  • The reverse case, in which the number of preferred directions is less than the number of dimensions in which the input means is designed for coupled detection, is also conceivable.
  • For example, an arrangement of two preferred directions can be selected while the detection of the input is carried out with an input means that is set up to detect coupled movements in three-dimensional space.
  • The preferred direction of a movement section, and the portion of the movement section directed along it, can be determined by projecting the movement section onto each of the predetermined preferred directions and comparing the lengths of these projections. The preferred direction is then the one in which the projection of the movement section has the greatest length. If several projections with respect to different preferred directions have identical lengths, a different rule can decide. The length of the projection onto the respective preferred direction then corresponds to the portion of the movement section.
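As an illustration, the projection comparison just described can be sketched in Python; the unit direction vectors, the first-wins tie rule and all names are assumptions for illustration, not part of the patent:

```python
import math

def preferred_direction(dx, dy, directions):
    """Project a movement section (dx, dy) onto each unit direction
    vector and return the index of the direction whose projection is
    longest, together with that projection length (the 'portion')."""
    best_index, best_length = 0, 0.0
    for i, (ux, uy) in enumerate(directions):
        length = abs(dx * ux + dy * uy)  # |scalar projection| onto unit vector
        if length > best_length:         # strict '>' keeps the first line on ties
            best_index, best_length = i, length
    return best_index, best_length

# Three preferred directions as in Fig. 1: horizontal, vertical, 45 degrees
dirs = [(1.0, 0.0), (0.0, 1.0), (math.sqrt(0.5), math.sqrt(0.5))]
```

For a movement section (3, 4), the diagonal projection (3 + 4)·√0.5 ≈ 4.95 is the longest, so the diagonal direction would be selected and 4.95 used as the portion.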
  • Advantageously, at least two predetermined straight lines are used for determining the preferred direction, and at least two selectively controllable functions are kept available.
  • Further advantageously, at least three predetermined straight lines or at least three predetermined directions are used to determine the preferred direction, and at least three functions are kept available for selective activation.
  • Advantageously, a time-weighted moving average of the movements, or of the portions of the movements parallel to the predetermined straight lines or directions, or of their absolute values, is generated for the movement sections used in each case, and the determination of the preferred direction is based on it.
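A minimal sketch of such a time-weighted accumulation, assuming an exponential decay factor (the factor 0.5 and all names are illustrative choices, not specified by the patent):

```python
def weighted_components(sections, decay=0.5):
    """Accumulate the absolute x and y portions of past movement
    sections, giving older sections exponentially smaller weight.
    `sections` is ordered oldest to newest."""
    acc_x = acc_y = 0.0
    for age, (dx, dy) in enumerate(reversed(sections)):
        w = decay ** age          # newest section gets weight 1
        acc_x += w * abs(dx)
        acc_y += w * abs(dy)
    return acc_x, acc_y
```

For the sections [(10, 0), (0, 4)] this yields (5.0, 4.0): the older horizontal movement still outweighs the newer vertical one, so the preferred direction would remain horizontal for the moment.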
  • The object is also achieved by a method for converting a movement, acquired with an input means that is set up to detect at least two-dimensionally coupled movements, into a scalar value and information about the selection of one of at least two predetermined non-parallel straight lines or directions.
  • For this purpose, a preferred direction of the detected movement is determined parallel to, or in one direction along a parallel of, one of at least two predetermined non-parallel straight lines, and each straight line or each direction is assigned exactly one value of the selection information.
  • Advantageously, the method also includes the use of the scalar value and the selection information to drive a process, a procedure or a machine.
  • Advantageously, the method comprises the detection of the at least two-dimensional movement using the input means.
  • The object is also achieved by a computer program product set up for carrying out one of the methods explained above.
  • The object is likewise achieved by a data carrier carrying such a computer program product.
  • The object is further achieved by a data processing system, or a system of data processing systems, configured for carrying out a method described above and comprising at least one input means that is set up to detect at least two-dimensionally coupled movements.
  • The figure shows:
  • Fig. 1: a schematic representation of the implementation of the method.
  • Yi represents in each case the position projected onto one of the given straight lines.
  • The weighting factor can also be chosen differently depending on the desired implementation.
  • Advantageously, it adopts smaller values for movements lying further back in time.
  • The absolute values of the weighted accumulated portions along the given straight lines can then be compared. If the portion along the first straight line is greater than the portion along the second, the first straight line is selected as the preferred direction, and vice versa. In the case of equality, an arbitrary straight line can be selected as the preferred direction, or no action performed. Once the preferred direction is determined, the function assigned to it is called with the corresponding value. The value is determined by the portion of the movement along the preferred direction. If, for example, the Y component is greater, the straight line corresponding to this component, here for example the second straight line, is selected as the preferred direction.
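The comparison-and-call sequence just described might look as follows for two straight lines (x and y); the recursive history weighting, the tie rule and the function names are assumptions for illustration:

```python
def drive(sections, fn_x, fn_y, decay=0.5):
    """For each movement section, update the weighted accumulated x and
    y portions (history decays, the newest section counts fully), then
    call only the function assigned to the dominant straight line,
    passing the section's own component along that line as the value."""
    acc_x = acc_y = 0.0
    result = []
    for dx, dy in sections:
        acc_x = decay * acc_x + abs(dx)
        acc_y = decay * acc_y + abs(dy)
        if acc_x >= acc_y:                 # ties resolved toward the x line
            result.append(fn_x(dx))
        else:
            result.append(fn_y(dy))
    return result
```

With fn_x = lambda v: ('x', v) and fn_y = lambda v: ('y', v), the sections [(5, 1), (4, 2), (0, 6)] produce [('x', 5), ('x', 4), ('y', 6)]: the third section switches the preferred direction without any gesture or button having to end.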
  • Corresponding functions are assigned to the individual preferred directions or straight lines.
  • For example, the function assigned to the first straight line could be the function for moving an actuator in the x direction, and the function assigned to the second straight line could be the function for moving the actuator in the y direction.
  • The function for moving in the y direction would then be called with the value, or based on the value.
  • "Based on the value" means that the value used for the control of the function can be changed in a predetermined manner, for example scaled.
  • Alternatively, the absolute values of the movements along the given straight lines can also be summed up.
  • An alternative application of the method is navigation within an image. For example, a first straight line can be assigned the displacement of the image section in the x direction as a function, and a second straight line the displacement of the image section in the y direction.
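Mapped to image navigation, the two assigned functions could simply pan the visible image section. This is a sketch under assumptions: the class and method names are invented, and the history weighting is omitted so that each section is judged by its dominant component alone:

```python
class ImageView:
    """Holds the offset of the visible image section."""
    def __init__(self):
        self.x = self.y = 0

    def pan_x(self, value):   # function assigned to the first straight line
        self.x += value

    def pan_y(self, value):   # function assigned to the second straight line
        self.y += value

view = ImageView()
for dx, dy in [(12, 3), (2, -9)]:
    # preferred direction per section from the dominant component
    if abs(dx) >= abs(dy):
        view.pan_x(dx)
    else:
        view.pan_y(dy)
```

Only one pan function fires per movement section, so diagonal jitter does not drag the image in both axes at once.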
  • A color selection in a representation of an HSV, HSL or HSI color space can also be carried out using a corresponding method.
  • If the input means is suitable for detecting three-dimensionally coupled movements, all components can be controlled using this method.
  • If the input means is set up only to detect two-dimensionally coupled movements, in particular the selection of the H and S components can be controlled by this method.
  • Navigation in a representation of audio data is also possible.
  • For example, a function associated with a first straight line can be used for navigation on the time axis, and a second function, assigned to a second straight line, can be set up for zooming in and out.
  • FIG. 1 shows a schematic representation of the implementation of the method.
  • Three predetermined preferred directions u, v, w are shown as arrows.
  • A movement section from point 1 to point 2 is shown.
  • The projections of the movement section onto the three predetermined preferred directions u, v, w are drawn in dashed lines and denoted by U2, V2, W2.
  • The projection W2 has the greatest length.
  • Therefore, w is the preferred direction of the movement section from point 1 to point 2.
  • The length of the projection W2, or the projection W2 itself, represents the portion of the movement section to be used.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Position Input By Displaying (AREA)

Abstract

The present invention relates to the selective, temporally sequential control of at least two functions using an input means adapted to detect movements coupled in at least two dimensions. For this purpose, a preferred direction of the detected movement is determined for each movement section, parallel to or in one direction along a parallel of at least two predefined non-parallel straight lines; exactly one of the at least two functions is assigned to each straight line or to each direction along one of the straight lines; and only the function assigned to the straight line parallel to the preferred direction, or to the direction corresponding to the preferred direction, is controlled. For the selective, temporally sequential control of the at least two functions, only the portion of the respective movement section that is oriented in the preferred direction is used in each case.
EP13771080.2A 2013-09-13 2013-09-13 Procédé de commande sélective par détection de direction préférentielle Withdrawn EP3044648A1 (fr)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/EP2013/069030 WO2015036036A1 (fr) 2013-09-13 2013-09-13 Procédé de commande sélective par détection de direction préférentielle

Publications (1)

Publication Number Publication Date
EP3044648A1 true EP3044648A1 (fr) 2016-07-20

Family

ID=49293596

Family Applications (1)

Application Number Title Priority Date Filing Date
EP13771080.2A Withdrawn EP3044648A1 (fr) 2013-09-13 2013-09-13 Procédé de commande sélective par détection de direction préférentielle

Country Status (3)

Country Link
US (1) US20160224132A1 (fr)
EP (1) EP3044648A1 (fr)
WO (1) WO2015036036A1 (fr)

Family Cites Families (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5374942A (en) * 1993-02-05 1994-12-20 Gilligan; Federico G. Mouse and method for concurrent cursor position and scrolling control
US5313229A (en) * 1993-02-05 1994-05-17 Gilligan Federico G Mouse and method for concurrent cursor position and scrolling control
US5565887A (en) * 1994-06-29 1996-10-15 Microsoft Corporation Method and apparatus for moving a cursor on a computer screen
AU2003202776A1 (en) * 2002-02-25 2003-09-09 Koninklijke Philips Electronics N.V. Display device and pointing device
DE10325284A1 (de) * 2003-06-04 2005-01-13 3Dconnexion Gmbh Multidimensionales Eingabegerät zur Navigation und Selektion von vituellen Objekten
US20050168443A1 (en) * 2004-01-29 2005-08-04 Ausbeck Paul J.Jr. Method and apparatus for producing one-dimensional signals with a two-dimensional pointing device
US20050212760A1 (en) * 2004-03-23 2005-09-29 Marvit David L Gesture based user interface supporting preexisting symbols
US20090109173A1 (en) * 2007-10-28 2009-04-30 Liang Fu Multi-function computer pointing device
US20090288889A1 (en) * 2008-05-23 2009-11-26 Synaptics Incorporated Proximity sensor device and method with swipethrough data entry
US8212794B2 (en) * 2008-09-30 2012-07-03 Avago Technologies Ecbu Ip (Singapore) Pte. Ltd. Optical finger navigation utilizing quantized movement information
CN101599001B (zh) * 2009-07-13 2012-11-14 青岛海信移动通信技术股份有限公司 触摸屏显示界面更新方法和多媒体电子设备
CN103492986B (zh) * 2011-02-16 2017-08-15 日本电气株式会社 输入设备、输入方法和记录介质
JP5547139B2 (ja) * 2011-07-29 2014-07-09 株式会社東芝 認識装置、方法及びプログラム
JP5845860B2 (ja) * 2011-12-01 2016-01-20 株式会社デンソー 地図表示操作装置
JP2013214275A (ja) * 2012-03-08 2013-10-17 Canon Inc 三次元位置指定方法

Also Published As

Publication number Publication date
US20160224132A1 (en) 2016-08-04
WO2015036036A1 (fr) 2015-03-19

Similar Documents

Publication Publication Date Title
EP2124117B1 (fr) Dispositif de commande destiné à la commande d'une machine-outil
EP3532254A1 (fr) Procédé de planification de déplacement sans collision
DE102010063392A1 (de) Mikroskop mit Sensorbildschirm
DE102006048163A1 (de) Kamerabasierte Überwachung bewegter Maschinen und/oder beweglichen Maschinenelementen zur Kollisionsverhinderung
DE102012212754A1 (de) Verfahren zum Betreiben eines Sensorsystems sowie Sensorsystem
DE102016102902A1 (de) Numerische Steuereinheit zur Steuerung einer Werkzeugmaschine
WO2015018732A1 (fr) Procédé ainsi que dispositif de commande pour commander un appareil électronique par le biais d'un écran tactile
DE102013002830A1 (de) Manuell bedienbare Eingabevorrichtung mit Code-Erfassung
WO2014067774A1 (fr) Procédé et dispositif de fonctionnement d'un système d'entrée
DE60006149T2 (de) Verfahren zur steuerung einer berührungsempfindlichen oberfläche
EP3044648A1 (fr) Procédé de commande sélective par détection de direction préférentielle
WO2023066998A1 (fr) Procédé destiné à faire fonctionner un système d'entraînement plan, et système d'entraînement plan associé
EP2555097A1 (fr) Procédé et dispositif de détermination d'une section d'image et de déclenchement d'une détection d'image à l'aide d'un geste unique basé sur le contact
DE102016001998A1 (de) Kraftfahrzeug-Bedienvorrichtung und Verfahren zum Betreiben einer Bedienvorrichtung, um eine Wechselwirkung zwischen einer virtuellen Darstellungsebene und einer Hand zu bewirken
DE102014224599A1 (de) Verfahren zum Betreiben einer Eingabevorrichtung, Eingabevorrichtung
DE102015006614A1 (de) Verfahren zum Betreiben einer Bedienvorrichtung sowie Bedienvorrichtung für ein Kraftfahrzeug
DE112018007216B4 (de) Eingabesteuerungsvorrichtung, Anzeigeeingabevorrichtung und Eingabesteuerungsverfahren
DE102018117244B3 (de) Verfahren zum Ermitteln einer Grobbahn aus einer vorgegebenen Kontur
EP3268852B1 (fr) Procédé de sélection ciblée d'éléments affichés sur un écran tactile
EP3234733B1 (fr) Procédé de commande d'un système d'un véhicule
DE102013211046A1 (de) Verfahren und Vorrichtung zum Gewinnen eines Stellsignals aus einer Bediengeste
DE102019208605B4 (de) Verfahren zum Erfassen einer Bedienhandlung, Bedienvorrichtung sowie Kraftfahrzeug mit einer Bedienvorrichtung
EP3168673B1 (fr) Système de visualisation de données d'image
DE102020106021A1 (de) Verfahren und system zum betreiben eines auswahlmenüs einer grafischen benutzeroberfläche basierend auf dem erfassen einer rotierenden freiraumgeste
DE102007034141A1 (de) Verfahren zum Steuern einer Bewegung eines Auswahlelements

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20160317

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

DAX Request for extension of the european patent (deleted)
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20170401