WO2010031123A1 - Input apparatus and method - Google Patents

Input apparatus and method

Info

Publication number
WO2010031123A1
Authority
WO
WIPO (PCT)
Prior art keywords
input
sensors
user
sensor
different
Prior art date
Application number
PCT/AU2009/001228
Other languages
English (en)
Inventor
Timothy Michael Walsh
Original Assignee
Timothy Michael Walsh
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from AU2008904794A0
Application filed by Timothy Michael Walsh
Publication of WO2010031123A1

Links

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/02 Input arrangements using manually operated switches, e.g. using keyboards or dials
    • G06F 3/0202 Constructional details or processes of manufacture of the input device
    • G06F 3/021 Arrangements integrating additional peripherals in a keyboard, e.g. card or barcode reader, optical scanner
    • G06F 3/0213 Arrangements providing an integrated pointing device in a keyboard, e.g. trackball, mini-joystick
    • G06F 3/023 Arrangements for converting discrete items of information into a coded form, e.g. arrangements for interpreting keyboard generated codes as alphanumeric codes, operand codes or instruction codes
    • G06F 3/0231 Cordless keyboards

Definitions

  • none of the above examples provide any flexibility for a user to selectively choose between different forms of key-based or motion-based input.
  • the functions corresponding to each key or sensor are predefined, and cannot be changed to a different configuration based on a user's preference. Accordingly, it is desired to address at least some of the above issues, or to at least provide a useful alternative for users to input data.
  • When the user operates one or more keys, the input apparatus generates a control signal that is processed by software to generate instructions or commands that control a processor (or software application) to perform a specific task.
  • Each key may be associated with one or more characters to be provided as input. This makes the input apparatus easier to learn because users only need to remember which letters are activated by each finger rather than which key the finger must activate in order to enter the desired letter.
  • Figures 1A and 1B are block diagrams of the components of a data processing system;
  • Figure 4 is a front view of the input apparatus;
  • Figure 5 is a side view of the input apparatus;
  • Figure 6 is a flowchart of a mode setting process performed by the system
  • the input apparatus 102 includes one or more input devices 106 and 108, each of which has one or more control portions. Each control portion has one or more input sensors that are independently operable by a user for providing input.
  • the input apparatus 102 (at predetermined time intervals) monitors whether any of the input sensors have been operated by the user.
  • the input apparatus 102 generates input data representing the one or more input sensors operated by a user (representing input provided by the user). For example, the input data may be generated in response to the user operating one or more input sensors simultaneously, repeatedly or in accordance with a particular sequence.
  • the input apparatus 102 then transmits the input data to the input control module 110 for analysis.
  • the input control module 110 then generates, based on the analysis of the input data, control signals (which may include one or more commands and instructions) for controlling the functions performed by the processor 112.
  • the functions performed by the input control module 110 can also be performed (in whole or in part) by the input apparatus 102.
  • the finger-operated input sensors 202 and 204 are located at the front of each input device 106 and 108 at positions and angles corresponding to the locations of the user's fingers (or fingertips).
  • the thumb-controlled input sensors 206 and 208 are located at the side of each input device 106 and 108 to correspond to the locations of the user's thumbs.
  • This arrangement of the input sensors 202, 204, 206 and 208 is more ergonomic, and is particularly advantageous as it minimises the degree of movement of the user's thumb and fingers in order to provide input, which helps reduce the risk of the user developing RSI.
  • Each of the input sensors 202, 204, 206 and 208 is independently operable by a user to provide input.
  • an input sensor 202, 204, 206 and 208 may include one or more of the following: i) a contact switch; ii) a touch-sensitive sensor; iii) a light-sensitive sensor; iv) a capacitance sensor; and v) a multidirectional control sensor that is adjustable by a user in two or more (e.g. orthogonal) directions relative to a reference point for providing directional input.
  • each input device 106 and 108 is adapted to be held by (or otherwise attached to, such as by way of a strap or similar device) the user's hands, rather than rested on a flat surface such as a table top.
  • each input device 106 and 108 includes motion sensors for detecting movement of the respective input device 106 and 108 in a three-dimensional space.
  • the processor 112 performs processes under the control of one or more application modules (not shown in Figures 1A and 1B).
  • For example, an application module for a word processing application may control the processor 112 to perform a data entry process.
  • Other forms of application modules can be provided to perform different processes.
  • An application module instructs the processor 112 to perform a different function depending on the input provided by a user.
  • the input control module 110 analyses the input data received from the input apparatus 102, and generates (based on the input data received from the input apparatus 102) the appropriate commands or instructions for controlling the functions performed by the processor 112 (in the context of the other processes to be performed by the processor 112 at that time). For example, the processor 112 performs a data entry process (under the control of an application module) up to the point where user input can be provided.
  • the input control module 110 determines that the input data received from the input apparatus 102 represents a character, number or symbol to be provided as input for the data entry process. Alternatively, the input control module 110 may determine that the input data represents a trigger for performing a particular function (e.g.
  • each input sensor may be associated with a different predefined set of characters, wherein based on the number of times a particular input sensor is operated, a different character is selected from the set of characters corresponding to that particular input sensor (see the multi-tap sketch after this list); iii) A user sequentially operating a predefined one or more of the input sensors 202, 204, 206 and 208.
  • different combinations of sensors may represent the input of different input characters, or instructions for performing different input actions; iv) A user only moving one or more of the input devices 106 and 108.
  • the input apparatus 102 (when used in conjunction with the input control module 110) can provide many different user input modes. These modes (operated by a user) are controlled by the input control module 110 (e.g. by software drivers). The following describes an example of some of the possible user input modes provided by the input control module 110, and it should be understood that the present invention is not limited to the examples described herein.
  • the short names contained in Table 1 are used to describe the different finger positions (and direction of movement - e.g. for a directional input sensor 206 and 208) of the input sensors 202, 204, 206 and 208 of the input apparatus 102. The short names are used in the following description of the user input modes.
  • Figure 6 is a flowchart of a mode setting process 600 performed by the processor 112 under the control of the input control module 110.
  • the mode setting process 600 allows a user to switch between different user input modes by operating a predefined combination of one or more input sensors 202, 204, 206 and 208 of the input apparatus 102 (see the mode-cycling sketch after this list).
  • the mode setting process 600 begins at step 602 by defining a combination of one or more input sensors 202, 204, 206 and 208 that a user operates in order to switch to a different user input mode.
  • the mode setting process 600 switches to a different user input mode by detecting whether the user simultaneously operates a predefined combination of input sensors 202, 204, 206 and 208.
  • Step 602 may define the manner in which the combination of one or more input sensors 202, 204, 206 and 208 are operated in order to switch to a different user input mode.
  • the combination may include one of the directional input sensors 206 and 208, and step 602 may specify the type of directional input required from the directional input sensor 206 and 208 in order to switch the user input mode.
  • the input control module 110 supports one or more user input modes.
  • the mouse input mode, the various character input modes (e.g. the sequential key input mode, dictionary input mode, QWERTY input mode, and chording input mode) and the music mode are described in greater detail below.
  • character refers to a letter of an alphabet (e.g. including an ideographic character), numeral, symbol, punctuation character and stop character (e.g. a space character and a new line character).
  • a text input mode can be used in conjunction with (e.g. at the same time as) the mouse input mode.
  • the user input modes are arranged in a predefined sequence. At step 603, the input control module 110 selects one of the user input modes in the sequence, and performs an input process in accordance with the selected user input mode.
  • the input apparatus 102 provides input data to the input control module 110.
  • the input control module 110 determines whether the input data represents a combination of activated input sensors 202, 204, 206 and 208 that corresponds to the specific combination of sensors (and manner of operation) as defined at step 602. If there is no match, step 606 returns to step 604 to receive further input from the input apparatus 102. Otherwise, step 606 proceeds to step 608.
  • the input control module 110 then selects the next input mode in the sequence, and performs an input process in accordance with the selected user input mode.
  • the input control module 110 generates commands or instructions for controlling the processor 112 to generate a graphical display interface (e.g. a pop-up window) indicating the user input mode has changed successfully, and preferably, also indicating the current user input mode.
  • At least one of the input devices 106 and 108 functions in the same manner as a standard computer mouse (i.e. the movement of the at least one input device 106 and 108 controls the corresponding position of a pointer displayed on a graphical display generated by the processor 112).
  • both input devices 106 and 108 operate in this manner.
  • the user is able to manipulate two pointers on the graphical display.
  • Since each input device 106 and 108 has more buttons than a traditional mouse, it is possible to have more predefined "shortcut" functions assigned to each of the buttons (or combinations of one or more of the buttons), for actions such as cutting, copying, pasting, etc.
  • the movement of each input device 106 and 108 on a control surface is translated by the input control module 110 (e.g. via software) to control the movement of pointers or cursors on a graphical display on a computer screen. Since there are two hand-pieces, the graphical display can have two separate pointers, one corresponding to each input device 106 and 108 controlled by different hands of the user.
  • the first (e.g. index) finger and second (e.g. middle) finger input sensors 202 and 204 on each input device 106 and 108 can be configured so as to correspond to (i.e. perform the same function as) the left and right buttons on a standard computer mouse.
  • a first finger input sensor 202 and 204 can be used for selecting files or objects, or for positioning the cursor at a particular place within text in a word processor or similar application.
  • a second finger input sensor 202 and 204 can be used to bring up a context-sensitive menu of options from which the user may then make a selection. Once a menu has been brought up (e.g. by using the second finger input sensor), the menu items can be scrolled-through by moving a directional input sensor 206 and 208 of the corresponding input device 106 and 108 forwards or backwards.
  • the selection can be confirmed by depressing the lever of the directional input sensor 206 and 208 of the corresponding input device 106 and 108.
  • This method of selecting menu items using the thumb controls can be more efficient (e.g. faster) than moving the entire input device 106 and 108 and then using a first finger key to select the desired menu item (as is currently done with standard mice), and provides an added advantage of leaving the pointer unmoved after the menu item has been selected.
  • a third (e.g. ring) finger input sensor 202 and 204 may be used in the mouse input mode for cutting, or copying files or portions of text (e.g. in a document containing text). Further, a fourth (e.g. pinky) finger input sensor 202 and 204 may be used in the mouse input mode for pasting objects which have just been cut or copied with the third finger input sensor 202 and 204.
  • An advantage of an input apparatus 102 with two separate input devices 106 and 108 is that a user can separately control two different pointers, which makes it more efficient for a user to perform operations like cutting and pasting files (or other objects) from one place to another.
  • a file can be cut from one folder (or file storage location) using a first input device 106 controlled by one of the user's hands, and then pasted into another folder (or file storage location) using a second input device 108 controlled by the other of the user's hands.
  • Data manipulation performed in this way reduces the need to make substantial movement of either input device 106 and 108.
  • the direction control sensors 206 and 208 may be used for providing panning and/or zooming control functionality for the contents displayed on a graphical display.
  • Modern computer mice usually come with a small scroll wheel located between the left and right buttons. Such a scroll wheel is used for vertical panning (or scrolling) through documents which are too long to be displayed entirely on the screen.
  • the direction control sensors 206 and 208 of the input apparatus 102 can also be used for this type of scrolling of documents. Further, since the directional control sensors 206 and 208 can have four lateral directions of movement, such sensors 206 and 208 can be used for horizontal panning as well as vertical panning (scrolling) control purposes.
  • Figure 10 is a flowchart of a motion control process 1000 performed by the input apparatus 102 working in conjunction with the input control module 110.
  • the motion control process 1000 begins at step 1002 with the input apparatus 102 generating (e.g. in real time or at predefined time intervals) motion data based on the movement or displacement of each input device 106 and 108 relative to a control area or space (based on the information provided by the motion sensor for each input device 106 and 108).
  • the input apparatus 102 receives a response signal from those input sensors 202, 204, 206 and 208 in the active state (i.e. those that have been operated by a user).
  • the input apparatus 102 generates input data including the motion data and data representing one or more unique identifiers of the input sensors 202, 204, 206 and 208 in the active state (see the input-data sketch after this list).
  • the motion data may represent a magnitude or change in position relative to a reference point within a two-dimensional region, or alternatively, within a three- dimensional space.
  • the motion data may also represent a magnitude or change in the pitch, roll and/or yaw of each input device 106 and 108 within a three-dimensional space.
  • the input apparatus 102 sends the input data to the input control module 110.
  • the input control module 110 analyses the input data, and based on the analysis, generates commands or instructions for controlling the operation of a process performed by the processor 112 under the control of an application module.
  • the command or instructions may provide data (e.g. characters, symbols, numbers, etc.) to a process performed by the processor 112, or may interact with the process (e.g. to perform data manipulation or other functions).
  • Character input modes
  • switching between the character input modes is performed by the input control module 110 (as described above) and may be triggered by the user pressing a predefined control sequence of input sensors 202, 204, 206 and 208 (for example, by depressing all keys simultaneously).
  • Dictionary input mode: In the dictionary input mode, different characters are each assigned to an input sensor 202, 204, 206 and 208 in the same way as in the sequential key input mode. This mode is analogous to the dictionary mode for entering text on mobile phones (see the predictive-matching sketch after this list).
  • When a word is not yet in the dictionary, the user can train the input control module 110 software to recognise that word. For example, this may involve entering the word (one letter at a time) using the sequential key input mode, and then preferably, selecting an option for storing the completed word into the dictionary.
  • At step 818, the input control module 110 analyses the input data received from the input apparatus 102 and assesses whether the input data represents an action to confirm the selection of a word in the graphical display interface. If so, step 818 proceeds to step 820. Otherwise, step 818 proceeds to step 822.
  • At step 820, the input control module 110 generates commands or instructions providing the characters stored in the input string as input for another process performed by the processor (e.g. under the control of an application module). Step 820 then proceeds to step 804.
  • At step 822, the input control module 110 analyses the input data received from the input apparatus 102 and assesses whether the input data represents an action to end the input process 800. If so, process 800 ends. If not, step 822 proceeds to step 805 to process further input.
  • a desired character is selected by pressing one (or a combination) of the input sensors 202, 204, 206 and 208 together with a slight movement of the corresponding input device 106 and 108 relative to a control surface.
  • the user imagines that there is a virtual QWERTY keyboard underneath his hands, and each key on the virtual keyboard is accessible by operating (or pressing) an appropriate input sensor 202, 204, 206 and 208 (for home row keys), or by a slight movement of the input device 106 and 108 accompanied by operation of an input sensor 202, 204, 206 and 208 (for non-home-row keys).
  • the user presses the L4, L3, L2 and L1 input sensors 202 without moving the input device 106 controlled by the user's left hand.
  • the user presses the L1 input sensor 202 and moves the input device 106 controlled by the user's left hand slightly to the right (relative to the control surface).
  • the user presses the L4 input sensor 202 and moves the input device 106 controlled by the user's left hand slightly forward (relative to the control surface), and so on.
  • An alternative letter arrangement may be suitable for users who are experienced QWERTY touch typists.
  • each of the finger input sensors 202 and 204 may be adapted so as to have three different operable states (e.g. as defined in Table 1).
  • Each finger input sensor 202 and 204 may be associated with a unique letter of an alphabet (e.g. under the control of the input control module 110).
  • in this way, the finger input sensors 202 and 204 (each with three operable states) may collectively be associated with 24 different letters of the English alphabet.
  • the thumb input sensors 206 and 208 may be associated with the remaining two letters of the English alphabet. In this way, different input actions of the input sensors 202, 204, 206 and 208 of the input device 106 and 108 can be used to provide input representing different letters of the English alphabet (or any other alphabet).
  • the number mode provides the most convenient way to do this. It is no coincidence that we use a base 10 number system, given that we have ten digits (fingers and thumbs) on our two hands. In number mode, the ten numerals 1, 2, 3, 4, 5, 6, 7, 8, 9 and 0 may be assigned to each of the user's finger and thumb keys from left to right as shown in Table 4 (see the number-mode sketch after this list).
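
As a rough illustration of the multi-tap selection described above, the following Python sketch cycles through a sensor's character set based on repeated operation within a timeout window. The sensor IDs, character assignments and timeout value are assumptions made for the example; the patent does not fix any of these values.

```python
import time

# Hypothetical multi-tap tables: each sensor ID maps to a predefined set of
# characters, as in mobile-phone text entry. The patent's actual
# letter-to-sensor assignment (Table 1 short names) is not reproduced here.
MULTI_TAP = {"L1": "abc", "L2": "def", "L3": "ghi", "L4": "jkl"}
TAP_TIMEOUT = 0.8  # seconds; an assumed commit window


class MultiTapDecoder:
    """Selects a character from a sensor's character set based on how many
    times that sensor is operated within the timeout window."""

    def __init__(self):
        self.last_sensor = None
        self.tap_count = 0
        self.last_time = 0.0

    def tap(self, sensor_id, now=None):
        now = time.monotonic() if now is None else now
        committed = None
        if sensor_id != self.last_sensor or now - self.last_time > TAP_TIMEOUT:
            # A different sensor (or a pause) commits the pending character.
            committed = self.pending()
            self.last_sensor = sensor_id
            self.tap_count = 1
        else:
            self.tap_count += 1  # cycle to the next character in the set
        self.last_time = now
        return committed

    def pending(self):
        """Character currently selected but not yet committed."""
        if self.last_sensor is None:
            return None
        chars = MULTI_TAP[self.last_sensor]
        return chars[(self.tap_count - 1) % len(chars)]


# e.g. two quick taps of L1 cycle 'a' -> 'b'; a pause or another sensor
# would then commit 'b'.
d = MultiTapDecoder()
d.tap("L1"); d.tap("L1")
assert d.pending() == "b"
```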
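
The mode-cycling logic of the mode setting process 600 can be pictured as the following minimal sketch. The mode names, their ordering, and the switching chord are illustrative assumptions; the patent requires only that some combination of sensors is defined at step 602 and that the user input modes are arranged in a predefined sequence.

```python
# Assumed mode sequence and switching chord; step 602 defines the chord.
MODE_SEQUENCE = ["mouse", "sequential key", "dictionary", "QWERTY", "chording", "number"]
MODE_SWITCH_CHORD = frozenset({"L1", "L2", "L3", "L4", "R1", "R2", "R3", "R4"})


def mode_setting_process(input_events):
    """Steps 603-608: select the first mode in the sequence, then advance to
    the next mode whenever the switching chord is operated simultaneously;
    all other input is dispatched to the currently selected mode."""
    mode_index = 0  # step 603: select a user input mode
    for active_sensors in input_events:  # step 604: receive input data
        if frozenset(active_sensors) == MODE_SWITCH_CHORD:  # step 606: match?
            mode_index = (mode_index + 1) % len(MODE_SEQUENCE)  # step 608
            # analogue of the pop-up window indicating the current mode
            print(f"input mode is now: {MODE_SEQUENCE[mode_index]}")
        else:
            yield MODE_SEQUENCE[mode_index], frozenset(active_sensors)
```

Feeding the generator a stream of active-sensor sets yields (mode, sensors) pairs for the active mode's input handler, which matches the flowchart's loop between steps 604 and 608.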
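
For the motion control process 1000, the input data can be modelled as a record combining the motion data with the unique identifiers of the sensors in the active state. The field names and the motion-sensor read methods below are hypothetical; the patent specifies only the content of the record, not its layout or any driver API.

```python
from dataclasses import dataclass
from typing import List, Tuple


@dataclass
class InputData:
    device_id: int                            # e.g. 106 or 108
    translation: Tuple[float, float, float]   # change in position in 3-D space
    rotation: Tuple[float, float, float]      # change in pitch, roll and yaw
    active_sensors: List[str]                 # unique IDs of operated sensors


def motion_control_step(device_id, motion_sensor, input_sensors):
    """Sample motion data (step 1002), poll the input sensors for a response
    signal, and assemble the input data to send to the input control module
    110. `read_translation()`, `read_rotation()`, `sensor_id` and
    `is_active()` are assumed driver calls, not names from the patent."""
    translation = motion_sensor.read_translation()  # step 1002: motion data
    rotation = motion_sensor.read_rotation()
    active = [s.sensor_id for s in input_sensors if s.is_active()]
    return InputData(device_id, translation, rotation, active)
```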
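
A minimal sketch of the dictionary mode's predictive matching, assuming each operated sensor constrains one letter position and the stored dictionary is filtered down to the consistent words. The letter groups and word list are invented for the example; training a new word then amounts to appending it to the list.

```python
# Assumed sensor-to-letter groups (the real assignment follows the patent's
# Table 1 short names, not reproduced here) and a small trained dictionary.
LETTER_GROUPS = {"L1": "abc", "L2": "def", "L3": "ghi", "L4": "jkl"}
DICTIONARY = ["dig", "fig", "bad", "had"]


def candidates(key_sequence):
    """Return the dictionary words consistent with the operated sensors,
    for display in the word-selection interface."""
    def matches(word):
        return len(word) == len(key_sequence) and all(
            ch in LETTER_GROUPS[key] for ch, key in zip(word, key_sequence)
        )
    return [w for w in DICTIONARY if matches(w)]


def train(word):
    """Store a word completed via the sequential key input mode (the
    training path described above)."""
    if word not in DICTIONARY:
        DICTIONARY.append(word)


# e.g. candidates(["L2", "L3", "L3"]) -> ['dig', 'fig']
```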
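
Finally, the number mode's digit assignment reduces to a simple lookup. Table 4 is not reproduced in this extract, so the left-to-right ordering of short names below is an assumption consistent with the ten-numerals-to-ten-digits description, not the patent's actual table.

```python
# Assumed left-to-right assignment of the ten numerals to the user's finger
# and thumb keys (L4 = left pinky ... LT = left thumb, RT = right thumb ...
# R4 = right pinky); the actual Table 4 mapping may differ.
NUMBER_MODE = {
    "L4": "1", "L3": "2", "L2": "3", "L1": "4", "LT": "5",
    "RT": "6", "R1": "7", "R2": "8", "R3": "9", "R4": "0",
}


def number_input(sensor_id):
    """Translate an operated sensor into its numeral in number mode;
    returns None for sensors with no digit assigned."""
    return NUMBER_MODE.get(sensor_id)
```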

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Position Input By Displaying (AREA)

Abstract

An input apparatus is disclosed, comprising: a first input device adapted for use with one hand of a user; and a second input device adapted for use with the other hand of the user; each input device including a plurality of input sensors spatially arranged for independent operation by a respective different thumb or finger of the user to provide input, such that when the user operates one or more of the sensors to provide input based on a first set of input actions associated with the sensors in a first input mode, at least one of the input devices generates input data representing the input for controlling a processor of a computing device; and, in response to the user operating a predefined combination of one or more of the sensors, the sensors are reconfigurable to provide different input based on a set of input actions associated with the sensors corresponding to a different input mode.
PCT/AU2009/001228 2008-09-16 2009-09-16 Input apparatus and method WO2010031123A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
AU2008904794 2008-09-16
AU2008904794A AU2008904794A0 (en) 2008-09-16 Input apparatus and method

Publications (1)

Publication Number Publication Date
WO2010031123A1 (fr) 2010-03-25

Family

ID=42039012

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/AU2009/001228 WO2010031123A1 (fr) 2009-09-16 Input apparatus and method

Country Status (1)

Country Link
WO (1) WO2010031123A1 (fr)


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
"Power Glove - Wikipedia", Retrieved from the Internet <URL:http://en.wikipedia.org/w/index.php?title=Power_ Glove&oldid=228275141> *
"The Keypaw: ECE 476 Spring 2004 Final Project by Maudie Hampden and Sumul Shah", Retrieved from the Internet <URL:http://web.archive.org/web/20080511162736/http://instructl .cit.cornell.edu/Courses/ee 476/FinalProjects/s2004/sms95/keypaw/index.html> *

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103150028A (zh) * 2013-02-06 2013-06-12 鲍炜 Hand-held keyboard and input method thereof
CN103150028B (zh) * 2013-02-06 2016-04-27 鲍炜 Hand-held keyboard and input method thereof
CN103150030A (zh) * 2013-03-14 2013-06-12 上海市七宝中学 One-handed input keyboard and input method thereof

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 09813890

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 09813890

Country of ref document: EP

Kind code of ref document: A1