WO2015113663A1 - Generieren eines eingabebefehls - Google Patents
Generieren eines Eingabebefehls (Generating an input command)
- Publication number
- WO2015113663A1, PCT/EP2014/074186
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- input
- touch
- processing unit
- elements
- command
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/038—Control and interface arrangements therefor, e.g. drivers or device-embedded control circuitry
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0354—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
- G06F3/03547—Touch pads, in which fingers can move on a surface
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/033—Indexing scheme relating to G06F3/033
- G06F2203/0331—Finger worn pointing device
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/038—Indexing scheme relating to G06F3/038
- G06F2203/0382—Plural input, i.e. interface arrangements in which a plurality of input device of the same type are in communication with a PC
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/038—Indexing scheme relating to G06F3/038
- G06F2203/0384—Wireless input, i.e. hardware and software details of wireless interface arrangements for pointing devices
Definitions
- The invention relates to an arrangement and a method for generating an input command.
- Touch interfaces are often used as a user interface, for example between human and machine. For example, smartphones or tablets often have so-called touch pads or entire touch displays.
- Inputs are made by hand or with one or more fingers via the touch-sensitive surfaces.
- The touch pad, also referred to as a touchpad, may be intended solely for the purpose of entering a gesture.
- Capacitive, resistive and inductive methods are used for the determination of an input gesture.
- Screens or displays are used as combined input and output devices in that the screen surface is made touch-sensitive. For tablet PCs or smartphones, moreover, the use of multi-touch gestures is widespread; here, capacitive methods in particular are common.
- The user requires physical access to the touch-sensitive surface.
- A device, such as a control computer of an industrial control system, may be stowed away inaccessibly, for example in a bag or a case during a work step.
- There are also external input devices such as a mouse, a touch pen or a sensor for detecting gestures; a device is controlled via these external input devices.
- The invention relates to an arrangement for generating an input command, comprising:
- At least two substantially annular and spatially separated input elements for attachment to a respective hand, in particular a finger, wherein the input elements each comprise touch-sensitive areas by means of which a respective input pattern can be determined upon contact; and
- a processing unit for generating an input command from the respective input patterns as a function of predefined interacting input patterns. The arrangement realizes a user interface by means of which an input command is generated in a particularly secure and comfortable manner.
- An input command is assigned to a combination of respective input patterns.
- A combination is thus assigned an effect or an action intended by the combined input.
- A first input pattern can be determined upon a first touch of a first touch-sensitive area of a first input element.
- The first input element is attached to a first hand, for example to a finger such as the index finger of the first hand.
- At least one second input element is furthermore associated with the arrangement for generating the input command.
- The second input element is fastened, for example, to a second hand or to the same hand as the first input element, that is, to the first hand.
- On the first hand, it is attached, for example, to the same finger as the first input element or to another finger of the first hand, in particular the middle finger, ring finger or little finger.
- For example, the second input element is fastened, analogously to the first input element, to a forefinger of the second hand.
- The respective input elements are configured in a substantially annular shape. This means, for example, that the input element is ergonomically shaped with regard to the configuration of a human hand or a finger and may deviate from a symmetrical circular cross-section.
- The hand is in particular a human hand or a robotic hand controllable with the aid of robotic controls, which is modeled in particular on the shape of a human hand.
- As touch-sensitive areas, touch pad surfaces are used, for example. These may in particular be formed as a flat surface or have a curved surface shape.
- The touch-sensitive areas are designed such that they each evaluate touches caused by a hand or a finger, touches caused by several fingers of one hand, or touches caused by several fingers of different hands.
- A touch interface or a multi-touch-sensitive interface is thus provided at the touch-sensitive areas of the respective input elements.
- For example, the touch on the touch-sensitive area is generated with the help of the thumb of the hand to which the respective input element is attached.
- In particular, the contact is performed with the respective thumb of the respective hand on at least two input elements, which are each attached to a different hand.
- For example, raw information is determined which contains an absolute or relative position value of a touch or a time profile of registered touches, or which is independent of a relative or absolute position within the touch-sensitive area. A raw data record is thus created whenever a touch is registered.
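- The patent does not prescribe a data format; purely as an illustration, such a raw data record could be modeled as follows (all names and fields are hypothetical assumptions):

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class RawTouchSample:
    """One registered touch on a touch-sensitive area (illustrative only).

    Position fields are optional: an input element may report only the fact
    and time of a touch, independent of any absolute or relative position
    within the touch-sensitive area.
    """
    element_id: str              # which input element registered the touch
    timestamp_s: float           # time of the touch, e.g. seconds since start
    x: Optional[float] = None    # absolute or relative position, if available
    y: Optional[float] = None

# Example: a position-independent touch and a position-aware touch
samples = [
    RawTouchSample("ring_left", 0.00),
    RawTouchSample("ring_right", 0.05, x=0.42, y=0.51),
]
print(samples)
```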
- For example, a conventional mouse as an input element, or an input command on a keyboard, can be used to select or mark objects on a screen.
- With touch-sensitive surfaces, the corresponding input is possible via a simple, short touch.
- The input pattern on the touch-sensitive area is then correspondingly a single touch within a certain period of time.
- The touch can be evaluated independently of a position on the touch-sensitive area, or depending on the absolute position on the touch-sensitive area. For example, a touch is only evaluated if it occurs in a predetermined section of the touch-sensitive area. This section may in particular be fixed for any display on a screen of a device to be controlled via the input, or vary depending on the input that can be made, i.e. depending on the program that is currently running on the device.
- For example, an input must always be made within a central section of the touch-sensitive area if a "select" command is to be entered by a single, short touch.
- A "select" command may mean, for example, that one parameter is selected from a variety of parameters.
- In another example, an acknowledgment of an action is to be carried out: only if the input takes place in an outer or otherwise fixed section of the touch-sensitive area can the confirmation, for example for starting an action, be entered by a single, short touch.
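- As a sketch of this position-dependent evaluation (the section boundaries, names and commands below are illustrative assumptions, not taken from the patent), a short touch could first be mapped to a section before a command is derived:

```python
def region_of(x: float, y: float) -> str:
    """Map a normalized touch position (0..1, 0..1) to a named section.

    The central square is treated as the "select" section, everything else
    as the outer "confirm" section (illustrative boundaries).
    """
    if 0.3 <= x <= 0.7 and 0.3 <= y <= 0.7:
        return "central"
    return "outer"

def interpret_short_touch(x: float, y: float) -> str:
    """Derive a command from a single short touch, depending on its section."""
    return {"central": "select", "outer": "confirm"}[region_of(x, y)]

print(interpret_short_touch(0.5, 0.5))  # -> select
print(interpret_short_touch(0.9, 0.1))  # -> confirm
```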
- A registered movement sequence can also be analyzed on a respective input element by means of an evaluation unit, and an assignment to a movement class can be carried out.
- For example, a registered temporal and spatial movement can be recognized and evaluated as a gesture.
- The processing unit is accordingly provided with a raw data record of a detected touch or with a data record derived therefrom, which already represents a classification of the registered movement into a movement class and thus comprises an evaluation of the detected touch.
- Mixed forms can also be determined as input patterns, in that parts or individual steps of the evaluation of the detected touch are performed by the input element.
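- A minimal sketch of such a classification step, with purely illustrative movement classes, thresholds and a simple tuple format for the raw samples, could look like this:

```python
from typing import Optional, Sequence, Tuple

# A sample is (timestamp_s, x, y); x and y may be None for position-independent touches.
Sample = Tuple[float, Optional[float], Optional[float]]

def classify_movement(samples: Sequence[Sample],
                      double_tap_window_s: float = 0.4,
                      swipe_min_dist: float = 0.2) -> str:
    """Assign a registered touch sequence of one input element to a movement class.

    Illustrative classes: "swipe" (clear spatial travel), "double_tap"
    (two touches within a short time window), otherwise "tap".
    """
    if len(samples) >= 2 and all(s[1] is not None and s[2] is not None for s in samples):
        dx = samples[-1][1] - samples[0][1]
        dy = samples[-1][2] - samples[0][2]
        if (dx * dx + dy * dy) ** 0.5 >= swipe_min_dist:
            return "swipe"
    if len(samples) == 2 and samples[1][0] - samples[0][0] <= double_tap_window_s:
        return "double_tap"
    return "tap"

print(classify_movement([(0.0, None, None)]))                     # -> tap
print(classify_movement([(0.0, None, None), (0.2, None, None)]))  # -> double_tap
print(classify_movement([(0.0, 0.1, 0.5), (0.1, 0.5, 0.5)]))      # -> swipe
```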
- The arrangement further comprises the processing unit for generating the input command.
- For this purpose, the processing unit has the respective input pattern of each input element available to it.
- Furthermore, the processing unit has information about predefined interacting input patterns. For example, a list is stored in a memory area of the processing unit which specifies which combinations of input patterns an associated input command is assigned to. For example, the processing unit has a list in which possible combinations of the first input pattern with the second input pattern are listed. The generation of the input command thus takes place on the basis of the predefined interacting input patterns as a function of the respective currently registered input patterns, which are provided to the processing unit, for example, by one or more input elements.
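- Such a stored list of interacting input patterns can be pictured as a simple lookup table; the concrete pattern names and commands below are illustrative assumptions only:

```python
from typing import Dict, Optional, Tuple

# Hypothetical list, as it could be stored in a memory area of the processing unit:
# which combination of input patterns is assigned which input command.
COMBINATION_TABLE: Dict[Tuple[str, str], str] = {
    ("tap", "tap"): "select",
    ("double_tap", "tap"): "open",
    ("swipe", "tap"): "scroll",
    ("double_tap", "double_tap"): "confirm_action",
}

def generate_input_command(first_pattern: str, second_pattern: str) -> Optional[str]:
    """Return the input command assigned to the combination of the currently
    registered input patterns, or None if no interacting pattern is predefined."""
    return COMBINATION_TABLE.get((first_pattern, second_pattern))

print(generate_input_command("double_tap", "double_tap"))  # -> confirm_action
print(generate_input_command("tap", "swipe"))              # -> None (not predefined)
```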
- The processing unit has at least one communication interface via which the respective input patterns are received.
- A processor built into the processing unit, such as a microprocessor, a main processor unit or an FPGA, may be provided.
- For example, the processing unit has a plurality of communication interfaces, wherein a respective communication interface is configured for communication with a respective input element.
- For example, the processing unit is integrated into one of the input elements.
- A single input element can thus be equipped with high computing power and fulfill, on the one hand, the function of providing a user interface for determining the respective input pattern and, on the other hand, the function of generating the input command.
- The arrangement can be implemented externally to a device which is to have access to the input command.
- The input command controls a device or configures a system, for example.
- The input as well as the processing of an input pattern for generating the input command thus take place, for example, outside the device and thus independently of possible access to the device.
- Physical access to the device is not necessary for successfully generating an input command.
- Likewise, the availability of the device in a communication network or an existing communication connection between the arrangement and the device is not necessary for the successful generation of the input command. If the device is not switched on, for example, or if a communication connection between the arrangement and the device is not possible, the input command can still be generated with the aid of the arrangement and provided, for example, at a later point in time at which the device is reachable.
- The processing unit is arranged in a computer, for example.
- The computer can be provided specifically for the processing unit or can also be provided for further computing or processing steps within a system.
- An adaptation to, or utilization of, existing computing capacities can thus be achieved in an advantageous manner.
- The realization of the processing unit in a computer advantageously makes it possible to keep the design of the respective input elements as small as possible.
- The respective input elements can thus be realized with a low weight and can be comfortably carried or fastened on a hand.
- Alternatively, the processing unit is arranged in a mobile or permanently installed device that can be controlled via the input elements.
- A device which is controlled by means of inputs via the proposed arrangement is then designed in such a way that the processing unit is realized on it.
- The input patterns determined by means of the input elements are processed in the device which is to be operated, controlled or configured by the input.
- The arrangement is already provided at the time of installation of the device, or the device can be extended with the processing unit, for example, in a maintenance phase.
- This is advantageous if the processing unit is to be protected in a special way against manipulation or misconfiguration, in particular due to safety or security requirements. In such cases, the possibility of easy access to the processing unit or its simple replacement may be undesirable.
- The respective input elements have at least one respective interface for communication with the processing unit and/or with one another.
- For example, the first input element has a first interface for communication with the processing unit, and the second input element has a second interface for communication with the processing unit.
- Alternatively, the first input element has a first interface for communication with the second input element, and the second input element has a second interface for communication with the processing unit.
- A communication connection can thus be adapted in an advantageous manner, in particular by selecting different transmission modes for the different transmission paths within the arrangement. In this case, it is possible to choose between wired and wireless transmission individually for each transmission path, and in turn between different wireless personal area network structures, or WPAN structures for short, such as Bluetooth or ZigBee.
- The respective interfaces are designed as wireless interfaces or as wired interfaces.
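- To make the per-path choice of transmission mode concrete, a small abstraction could look as follows; the class names and the in-memory transport used as a stand-in for a real WPAN (Bluetooth/ZigBee) or wired link are illustrative assumptions only:

```python
from abc import ABC, abstractmethod
from queue import Queue

class PatternTransport(ABC):
    """Transport over which an input element delivers input patterns
    to the processing unit (wired or wireless, chosen per path)."""

    @abstractmethod
    def send(self, pattern: str) -> None: ...

    @abstractmethod
    def receive(self) -> str: ...

class LoopbackTransport(PatternTransport):
    """In-memory stand-in for a real WPAN (e.g. Bluetooth/ZigBee) or wired link."""

    def __init__(self) -> None:
        self._queue = Queue()  # holds str input patterns

    def send(self, pattern: str) -> None:
        self._queue.put(pattern)

    def receive(self) -> str:
        return self._queue.get()

# One transport per transmission path, e.g. first element wireless,
# second element wired -- both hidden behind the same interface.
path_to_element_1 = LoopbackTransport()
path_to_element_2 = LoopbackTransport()
path_to_element_1.send("double_tap")
print(path_to_element_1.receive())  # -> double_tap
```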
- A wireless transmission can be selected by means of the wireless interfaces, so that the input elements can be carried in a simple, safe and comfortable manner by a user on one or both hands, without the user's mobility, for example within an industrial plant, being restricted.
- A wired transmission path can advantageously be selected by means of the wired interfaces, for example if a WPAN- or Wireless Local Area Network (WLAN)-based transmission could be impaired or even disrupted due to existing fields.
- At least one of the input elements has an energy store as an internal voltage source, for example a battery.
- At least one of the input elements is configured with an evaluation unit for evaluating touches and determining input patterns.
- For example, only one of the input elements is equipped with an evaluation unit, which enables the evaluation of raw data acquired from registered touches of the touch-sensitive area.
- The input element equipped with the evaluation unit thus determines, for example as the second input element, a second input pattern as a function of detected touches of the second touch-sensitive area of the second input element.
- A first input element may, for example, have less computing capacity and send detected raw data of a touch on the first touch-sensitive area to the second input element for evaluation, whose evaluation unit then determines the first input pattern in addition to the second input pattern.
- The invention further relates to a method for generating an input command, wherein:
- a respective input pattern is determined in at least two spatially separated input elements upon touch; and
- an associated input command is generated from the respective input patterns as a function of predefined interacting input patterns.
- The presented method allows the input of a plurality of commands via the use of two input elements whose inputs interact in a predefinable manner.
- By generating the input command, a combination of occurring movement patterns is assigned a meaning which exceeds the usual possibilities for interpreting a touch gesture, for example on a single touchpad.
- The touching of the at least two spatially separated input elements is carried out, for example, by a hand or a finger of a hand.
- A user can thereby make an input to the respective input element.
- A respective hand may be used per input element, i.e. for a first input element a first hand, for example the right hand of a user, and for a second input element a second hand, for example the left hand of the user.
- Alternatively, a single hand of a user is used to make inputs to multiple input elements, or multiple users make inputs to one or more input elements with their respective hands or their respective two hands.
- For example, the touch of a respective input element is effected by means of a thumb, wherein a touch-sensitive area of an input element is fastened to an index, middle, ring or little finger or to the metacarpus in such a way that the touch-sensitive area is aligned towards the thumb, and a simplified input by means of the thumb is thus possible.
- The input command is generated as a function of an assignment rule, the assignment rule assigning an input command to a combination of input patterns.
- The assignment rule can be made arbitrarily complex.
- For example, the input patterns are in the form of a single-input command per input element; for instance, there is an "open" single-input command from the first input element due to a registered double click, i.e. briefly touching a first touch-sensitive area of the first input element twice in succession.
- The assignment rule is then generated, for example, from a list of all possible combinations of a registered "open" single-input command on one of the input elements with all the registrable single-input commands on the other existing input elements.
- For example, a tolerance range can be specified within which respective registered touches are deemed to be simultaneous.
- The possibility of a time offset with which a movement pattern is detected can also be deliberately used to open up further possible combinations. For example, a double click on a first input element followed in time by a double click on a second input element can be assigned a different input command than a simultaneous double click on both input elements.
- The amount of the time offset, for example specified in seconds, can also be taken into account.
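- A minimal sketch of such a rule with a simultaneity tolerance (the tolerance value and command names are illustrative assumptions) could combine the two double clicks like this:

```python
def combine_double_clicks(t_first_s: float, t_second_s: float,
                          tolerance_s: float = 0.3) -> str:
    """Combine a double click registered on each of two input elements.

    Touches within the tolerance range count as simultaneous; otherwise the
    chronological succession (and its sign) selects a different command.
    """
    offset = t_second_s - t_first_s
    if abs(offset) <= tolerance_s:
        return "command_simultaneous_double_click"
    return ("command_first_then_second" if offset > 0
            else "command_second_then_first")

print(combine_double_clicks(1.00, 1.10))  # within tolerance -> simultaneous
print(combine_double_clicks(1.00, 2.50))  # first element clearly first
```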
- The respective input pattern is determined by means of the respective touch-sensitive areas of the input elements and provided at a respective interface of the input elements.
- A multi-touch function is thus provided by means of multiple input elements.
- The input of a multi-touch gesture is made possible by means of a respective single-touch gesture on a respective input element.
- The respective input pattern is received by a processing unit, and the input command is generated by the processing unit.
- The generation of the input command can thus be transferred to an external unit, the processing unit.
- Resources can be optimally distributed, and the input elements can be designed in a user-friendly manner.
- The input command can in particular be generated such that it can be provided by the processing unit and used directly for processing by the device on which the input is to be implemented.
- A temporal and/or spatial course of a touch of the respective touch-sensitive areas is determined in order to determine the respective input pattern.
- With capacitive or resistive touch measurement techniques, it is thus possible, for example, to calculate an absolute position of a touch on the touch-sensitive area. Starting from this start position, a time course of the following touches can be detected.
- The information about an absolute position on the touchpad can thus be registered, for example.
- Alternatively, only the relative course of a touch can be determined from temporally successive points of contact in order to detect a gesture.
- In addition, time information can be recorded by the input element, which can be set in relation to an input pattern of a second or further input element that also contains time information.
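- Deriving a gesture from the relative course of temporally successive contact points could be sketched as follows; the direction labels, thresholds and tuple format are illustrative assumptions:

```python
from typing import List, Tuple

Point = Tuple[float, float, float]  # (timestamp_s, x, y) contact point

def relative_course(points: List[Point]) -> Tuple[float, float]:
    """Sum the relative displacements between temporally successive contact points."""
    dx = sum(b[1] - a[1] for a, b in zip(points, points[1:]))
    dy = sum(b[2] - a[2] for a, b in zip(points, points[1:]))
    return dx, dy

def swipe_direction(points: List[Point], min_dist: float = 0.2) -> str:
    """Interpret the relative course as a swipe direction, or "none" if too short."""
    dx, dy = relative_course(points)
    if (dx * dx + dy * dy) ** 0.5 < min_dist:
        return "none"
    if abs(dx) >= abs(dy):
        return "right" if dx > 0 else "left"
    return "up" if dy > 0 else "down"

track = [(0.00, 0.20, 0.50), (0.05, 0.30, 0.50), (0.10, 0.40, 0.50), (0.15, 0.50, 0.50)]
print(swipe_direction(track))  # -> right
```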
- A respective single-input command is determined by means of an evaluation unit of at least one input element from a respective temporal and/or spatial course of a touch of the respective touch-sensitive areas of the at least one input element.
- Figure 1 is a schematic representation of an arrangement for generating an input command according to an embodiment of the invention.
- Figure 2 is a schematic representation of a method for generating an input command according to an embodiment of the invention.
- FIG. 1 schematically shows an arrangement 10 for generating an input command 3 according to an embodiment of the invention, comprising two substantially annular and spatially separated input elements, namely a first input element 11 and a second input element 12.
- The two input elements 11, 12 are each in the form of a ring about 1 to about 5 cm wide which can be worn on a human hand.
- For easier handling, the ring is not closed, for example, but instead has a gap according to FIG. 1.
- Different ring sizes can thus be accommodated if a suitably flexible material is used for the skeleton of the ring.
- In this way, different ring sizes can be set.
- Alternatively, the ring may also be closed.
- The touch-sensitive area 111 is attached to the first input element 11.
- This is, for example, a rectangular area which extends across the width of the annular input element 11 and covers, in length, for example a quarter to half of the outer circumference of the annular input element. Other sizes for the touch-sensitive area are advantageous embodiments depending on the application.
- The touch-sensitive region 111 is curved, in particular in the longitudinal direction, and thus has the shape of an arcuate disk or a ring segment. If the ring is worn on the index finger of a hand, an advantageous input using the thumb of the same hand is possible.
- The second input element 12 is, in particular, constructed mirror-inverted to the first input element 11 and has a correspondingly configured touch-sensitive region 112.
- An evaluation unit 14 is provided within the input elements 11, 12, which is implemented as a computer chip and performs the evaluation or interpretation of the gestures detected at the touch-sensitive areas. As respective interfaces 18, 19, in particular Bluetooth chips are used, which are suitable for communication with the processing unit 13.
- Integrated into the input elements 11, 12 are respective batteries, which can be connected to a voltage source and charged via a respective charging connection 16, 17.
- The processing unit is installed in the device itself, for which the inputs of gestures on the two input elements 11, 12 are to be made.
- For example, the device is connected to a temperature control machine within an industrial plant in order to analyze a manufacturing process, to perform maintenance steps such as testing, or to install a security update.
- The device is installed in a physically inaccessible manner, so that in particular any input interfaces provided on the device are not freely accessible to the user or the service technician.
- Nevertheless, the input of a gesture such as a multi-touch gesture is possible in an advantageous way.
- The transmission methods used, such as Bluetooth technology in this case, can be adapted to the circumstances of the field of application.
- For example, short-range radio technology via Wireless Personal Area Networks, WPAN for short, can be used.
- In another example, an input is made on a maintenance device, for example a mobile tablet.
- Navigation within a maintenance tool can be carried out by means of the input gestures. By so-called scrolling gestures, it is possible to scroll within a document, for example to browse through a list of values displayed on the tablet. Commands such as printing test reports or enlarging a display on the tablet PC can also be generated.
- A method for generating an input command 3 is shown schematically in FIG. 2.
- A processing unit 13 receives the respective input patterns 1, 2, 2' and determines the input command.
- The processing unit 13 is designed to be configurable, so that it is possible to modify or configure the assignment rule which the processing unit uses. For example, in the configuration of the processing unit 13, it is determined which input command 3 results for which combination of respective input patterns 1, 2, 2'.
- For safety-critical industrial equipment, there may be a requirement to classify a gesture as deliberately and not accidentally entered.
- The configuration may further define the meaning of a combination of respective input patterns. For example, a punctual touch is interpreted as a single click to mark an object, or a double touch as a double click to select a marked object.
- Gestures such as swiping across the touch-sensitive area in a predetermined direction may be defined as scrolling up or scrolling down.
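- Since the processing unit 13 is configurable, the assignment rule could, for example, be loaded from a configuration text; the format, keys and command names below are illustrative assumptions only:

```python
import json
from typing import Dict, Tuple

# Hypothetical configuration text, e.g. loaded from a file on the processing unit 13.
CONFIG_JSON = """
{
  "single_tap + none": "mark_object",
  "double_tap + none": "select_object",
  "swipe_up + none": "scroll_up",
  "swipe_down + none": "scroll_down",
  "double_tap + double_tap": "start_action"
}
"""

def load_assignment_rule(text: str) -> Dict[Tuple[str, str], str]:
    """Parse the configured assignment rule into (pattern_1, pattern_2) -> command."""
    rule = {}
    for key, command in json.loads(text).items():
        first, second = (part.strip() for part in key.split("+"))
        rule[(first, second)] = command
    return rule

rule = load_assignment_rule(CONFIG_JSON)
print(rule[("double_tap", "double_tap")])  # -> start_action
```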
- An input option for a device to be controlled via the input is thus provided which requires no intervention on the device, is easy to install and leaves the hands virtually free.
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- User Interface Of Digital Computer (AREA)
- Position Input By Displaying (AREA)
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020167015347A KR20160083110A (ko) | 2014-01-31 | 2014-11-10 | 입력 커맨드 생성 |
US15/033,666 US9952688B2 (en) | 2014-01-31 | 2014-11-10 | Generating an input command |
KR1020177036925A KR20180000350A (ko) | 2014-01-31 | 2014-11-10 | 입력 커맨드 생성 |
CN201480074528.1A CN106415451A (zh) | 2014-01-31 | 2014-11-10 | 输入命令的生成 |
EP14801965.6A EP3042269A1 (de) | 2014-01-31 | 2014-11-10 | Generieren eines eingabebefehls |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE102014201794.7A DE102014201794A1 (de) | 2014-01-31 | 2014-01-31 | Generieren eines Eingabebefehls |
DE102014201794.7 | 2014-01-31 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2015113663A1 true WO2015113663A1 (de) | 2015-08-06 |
Family
ID=51945844
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/EP2014/074186 WO2015113663A1 (de) | 2014-01-31 | 2014-11-10 | Generieren eines eingabebefehls |
Country Status (6)
Country | Link |
---|---|
US (1) | US9952688B2 (de) |
EP (1) | EP3042269A1 (de) |
KR (2) | KR20180000350A (de) |
CN (1) | CN106415451A (de) |
DE (1) | DE102014201794A1 (de) |
WO (1) | WO2015113663A1 (de) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9921687B2 (en) * | 2012-10-02 | 2018-03-20 | Autodesk, Inc. | Always-available input through finger instrumentation |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
NL1018577C2 (nl) * | 2001-07-18 | 2003-01-21 | Glenn Kenneth Leilis | Computeraanwijsinrichting. |
US7109970B1 (en) * | 2000-07-01 | 2006-09-19 | Miller Stephen S | Apparatus for remotely controlling computers and other electronic appliances/devices using a combination of voice commands and finger movements |
DE102008014199A1 (de) * | 2008-03-14 | 2009-09-24 | Michael Philipp | Handbedienbare Vorrichtung zur Steuerung eines Cursors und zur Initialisierung weiterer Befehlsfunktionen eines Computers |
US20110148669A1 (en) * | 2009-12-17 | 2011-06-23 | Electronics And Telecommunications Research Institute | Thimble-type intermediation device and method for recognizing finger gesture using the same |
US20120293410A1 (en) * | 2011-05-18 | 2012-11-22 | Ian Bell | Flexible Input Device Worn on a Finger |
EP2587345A2 (de) * | 2007-08-19 | 2013-05-01 | Ringbow Ltd. | Am Finger getragene Vorrichtungen und zugehörige Verwendungsverfahren |
Family Cites Families (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
IL139387A0 (en) * | 2000-11-01 | 2001-11-25 | Dintex Ltd | Feedback system and method for monitoring and measuring physical exercise related information |
KR100590528B1 (ko) * | 2003-06-28 | 2006-06-15 | 삼성전자주식회사 | 착용형 손가락 움직임 감지장치 및 이를 이용한 손가락의움직임 감지방법 |
US7042438B2 (en) * | 2003-09-06 | 2006-05-09 | Mcrae Michael William | Hand manipulated data apparatus for computers and video games |
US20090251407A1 (en) * | 2008-04-03 | 2009-10-08 | Microsoft Corporation | Device interaction with combination of rings |
CN101853096A (zh) * | 2010-05-21 | 2010-10-06 | 程喜庆 | 触控分离式无线控鼠指环及触摸板 |
KR101824921B1 (ko) * | 2013-06-11 | 2018-02-05 | 삼성전자주식회사 | 제스처 기반 통신 서비스 수행 방법 및 장치 |
US9535646B2 (en) * | 2013-06-18 | 2017-01-03 | Microsoft Technology Licensing, Llc | Methods and systems for electronic ink projection |
-
2014
- 2014-01-31 DE DE102014201794.7A patent/DE102014201794A1/de not_active Ceased
- 2014-11-10 US US15/033,666 patent/US9952688B2/en not_active Expired - Fee Related
- 2014-11-10 WO PCT/EP2014/074186 patent/WO2015113663A1/de active Application Filing
- 2014-11-10 KR KR1020177036925A patent/KR20180000350A/ko not_active Application Discontinuation
- 2014-11-10 CN CN201480074528.1A patent/CN106415451A/zh active Pending
- 2014-11-10 KR KR1020167015347A patent/KR20160083110A/ko active IP Right Grant
- 2014-11-10 EP EP14801965.6A patent/EP3042269A1/de not_active Withdrawn
Also Published As
Publication number | Publication date |
---|---|
US20160328034A1 (en) | 2016-11-10 |
CN106415451A (zh) | 2017-02-15 |
KR20180000350A (ko) | 2018-01-02 |
KR20160083110A (ko) | 2016-07-11 |
DE102014201794A1 (de) | 2015-08-06 |
EP3042269A1 (de) | 2016-07-13 |
US9952688B2 (en) | 2018-04-24 |
Similar Documents
Publication | Title | Publication Date |
---|---|---|
DE102009011687B4 (de) | Berührungsereignismodell | |
EP2962484B1 (de) | Verfahren und endgerät zur sicheren zugangscode-eingabe | |
DE112010002760T5 (de) | Benutzerschnittstelle | |
DE102011107166A1 (de) | Betätigungsvorrichtung und Steuerverfahren dafür | |
DE102011117012A1 (de) | Erfassung von gleitbewegungen von mehreren fingern unter verwendung von fingerabdrücken zur erzeugung verschiedener ereignisse | |
DE102012204921A1 (de) | Fahrzeugbedienvorrichtung | |
EP3234743B1 (de) | Verfahren zum betreiben einer bedienvorrichtung eines kraftfahrzeugs in unterschiedlichen bedienmodi sowie bedienvorrichtung und kraftfahrzeug | |
WO2016096065A1 (de) | Verfahren zum betreiben einer bedienvorrichtung eines kraftfahrzeugs bei einer mehrfingerbedienung | |
EP2551787A1 (de) | Vorrichtung und Verfahren für eine sicherheitsrelevante Eingabe über ein Anzeigegerät mit Berührungseingabe | |
EP3030956A1 (de) | Verfahren sowie bedienvorrichtung zum bedienen eines elektronischen gerätes über einen touchscreen | |
EP2444866B1 (de) | Bedieneinrichtung zur Bedienung einer Maschine aus der Automatisierungstechnik | |
WO2015113663A1 (de) | Generieren eines eingabebefehls | |
EP2030828B9 (de) | Multimodales Bediensystem und Verfahren zum Bedienen von Komponenten und Funktionen in einem Kraftfahrzeug | |
DE102010051639A1 (de) | Steuerungsvorrichtung mit Multi-Touch Funktionalität | |
DE102019206606B4 (de) | Verfahren zur berührungslosen Interaktion mit einem Modul, Computerprogrammprodukt, Modul sowie Kraftfahrzeug | |
DE102012011177A1 (de) | Verfahren zur Bedienung von Funktionen einesFahrzeuges sowie entsprechende Vorrichtung | |
DE102013105507A1 (de) | Verfahren und Vorrichtung zur Bedienung mindestens eines Bildschirms in einem Fahrzeug | |
DE112018007216B4 (de) | Eingabesteuerungsvorrichtung, Anzeigeeingabevorrichtung und Eingabesteuerungsverfahren | |
DE102010009622A1 (de) | Verfahren zum Betreiben einer Benutzerschnittstelle und Vorrichtung dazu, insbesondere in einem Fahrzeug | |
DE102019129395A1 (de) | Grafische Anwenderschnittstelle, Fortbewegungsmittel und Verfahren zum Betrieb einer grafischen Anwenderschnittstelle für ein Fortbewegungsmittel | |
DE102013220830A1 (de) | Eingabevorrichtung mit Touch Screen für Fahrzeuge | |
DE102016011365A1 (de) | Verfahren zur Steuerung eines Kraftfahrzeugmoduls und Kraftfahrzeugmodul | |
WO2018068940A1 (de) | Verfahren zum bedienen eines feldgerätes der automatisierungstechnik | |
AT519427B1 (de) | Verfahren zur Steuerung einer Datenverarbeitungsanlage | |
DE102019103584A1 (de) | Verfahren zur Steuerung einer berührungsempfindlichen Anzeige- und Eingabevorrichtung und Vorrichtung zur Ausführung des Verfahrens |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 14801965 Country of ref document: EP Kind code of ref document: A1 |
|
REEP | Request for entry into the european phase |
Ref document number: 2014801965 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2014801965 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 15033666 Country of ref document: US |
|
ENP | Entry into the national phase |
Ref document number: 20167015347 Country of ref document: KR Kind code of ref document: A |
|
NENP | Non-entry into the national phase |
Ref country code: DE |