WO2016115976A1 - Smart wearable input apparatus - Google Patents

Smart wearable input apparatus

Info

Publication number
WO2016115976A1
Authority
WO
WIPO (PCT)
Prior art keywords
text input
virtual
input system
hand
controller
Application number
PCT/CN2016/070067
Other languages
French (fr)
Inventor
Liang KONG
Original Assignee
Kong Liang
Priority claimed from GB1501018.4A (published as GB2534386A)
Application filed by Kong Liang
Publication of WO2016115976A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/02 Input arrangements using manually operated switches, e.g. using keyboards or dials
    • G06F 3/023 Arrangements for converting discrete items of information into a coded form, e.g. arrangements for interpreting keyboard generated codes as alphanumeric codes, operand codes or instruction codes
    • G06F 3/0233 Character input methods
    • G06F 3/0236 Character input methods using selection techniques to select from displayed items
    • G06F 3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/033 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; accessories therefor
    • G06F 3/0354 Pointing devices with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
    • G06F 3/03545 Pens or stylus
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 Interaction techniques using specific features provided by the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04886 Interaction techniques using a touch-screen or digitiser, by partitioning the display area into independently controllable areas, e.g. virtual keyboards or menus
    • G06F 2203/00 Indexing scheme relating to G06F 3/00 - G06F 3/048
    • G06F 2203/033 Indexing scheme relating to G06F 3/033
    • G06F 2203/0331 Finger-worn pointing device

Definitions

  • the virtual hands 18a, 18b may be used to access functions other than text input.
  • Figure 4 shows an example set of gestures which may be used to provide the functions of the arrow keys (up, down, left and right) without having to move the virtual hands to any particular position on the screen.
  • the up function is shown as a movement of the left middle finger away from the palm.
  • the down function is shown as a movement of the left middle finger towards the palm.
  • the left arrow function is accessed by stretching out the left ring finger
  • the right arrow function is accessed by stretching out the left index finger.
  • Figures 5a and 5b show how the input system may be used in place of a standard mouse.
  • the user’s right hand 100b can be moved to control movement of virtual hand 18b around the screen and, for example, a pressing motion with the index finger can be configured to correspond with a left mouse-click, and a pressing motion with the ring finger can be configured to correspond with a right mouse-click.
  • Figure 6 shows how the input system may be used for scrolling up and down a document.
  • an outstretched index finger gesture of the hand 100b causes scrolling in a ‘hand tracking’ mode, where the hand is simply moved to scroll the page up and down.
  • the virtual hand 18b can be used to operate a scroll bar 50, in the same way as the scroll bar 50 can be operated with a mouse pointer.
  • a third scrolling method is shown, in which the virtual hand 18b is manipulated to scroll just as a real hand would on a touch-sensitive screen.
  • the input system 10 is being used to select items on a screen.
  • the virtual hand 18b is used in exactly the same way as a conventional mouse to draw a box 70 around items 52, 54, 56, 58, 60, 62, to select those items. Items 64, 66, 68 are not selected in this example.
  • Virtual hand 18b is moved to the top left corner of the area to be selected, and held there for a predetermined period (for example, two seconds). The virtual hand 18b is then dragged to the bottom right corner of the area to be selected, to create a box 70 and select the items within the box 70.
  • the input system is extremely flexible, and allows text input at a similar speed to a standard keyboard, without associated constraints on posture.
  • the system can provide high-speed text input in situations where a standard keyboard is not available, such as on a mobile telephone.
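The dwell-then-drag selection described above can be sketched in code. This is an illustration only, not part of the disclosure: the dwell timing and the item representation are invented for the example.

```python
# Illustrative sketch of dwell-then-drag selection: holding the virtual
# hand still for a dwell period anchors one corner of a selection box, and
# dragging to the opposite corner selects the items inside. The timing and
# item representation are invented for the example.

DWELL_SECONDS = 2.0  # 'predetermined period' before the corner is anchored


def select_items(anchor, dwell_time, drag_to, items):
    """items maps name -> (x, y). Return the names inside the box drawn from
    anchor to drag_to, or an empty list if the dwell was too short."""
    if dwell_time < DWELL_SECONDS:
        return []
    x0, x1 = sorted((anchor[0], drag_to[0]))
    y0, y1 = sorted((anchor[1], drag_to[1]))
    return [name for name, (x, y) in items.items()
            if x0 <= x <= x1 and y0 <= y <= y1]
```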

Abstract

A text and pointer input system for a computing device comprises: a wearable sensor device (12a, 12b) for measuring movement of at least one hand (100a, 100b) of a user, the sensor (12a, 12b) being capable of measuring movement of at least one individual finger with respect to the hand (100a, 100b) as well as movement of the hand (100a, 100b) as a whole with respect to its surroundings, and the sensor device (12a, 12b) being arranged to transmit movement data to a controller (14); a display screen (16) arranged to receive display information from the controller (14); and a controller (14) adapted to display an image of a virtual keyboard (20) on the display screen (16), and to display an image of a virtual hand (18a, 18b) on the display screen (16), the virtual hand (18a, 18b) moving on the display screen (16) in response to input from the sensor device (12a, 12b), the controller (14) being further adapted to identify a key press when the at least one finger is moved when the corresponding finger on the virtual hand (18a, 18b) is in a position corresponding with a key on the virtual keyboard (20), and to pass the key press as text input to the computer.

Description

SMART WEARABLE INPUT APPARATUS
Field of the Patent Application
The present invention relates to input apparatus for controlling a computing device, and particularly to wearable input apparatus which allows text input.
Background to the Invention
The most common means of providing text input to a computer system is via a standard keyboard. The keyboard allows for high-speed text input, but does have a number of drawbacks. The use of keyboards is associated with the development of repetitive strain injury and other pains in the palms, wrists, hands, shoulders, neck, and back.
Keyboards are not especially portable, and therefore are not generally used with tablets or mobile telephones. On these devices, the most common mode of text input is to provide a ‘virtual’ keyboard displayed on a touch-sensitive screen. Text may be input by ‘typing’ on the touch-screen keyboard. However, this in itself has a number of problems–for example, it can be difficult to find a good position to place the tablet where the user can easily look at the screen and also easily type on the screen. In the case of a mobile phone, the screen will almost certainly be too small to allow anything other than ‘two finger’ typing, and therefore a high typing speed cannot be achieved.
Motion tracking and gesture recognition devices and systems are also known, for example the Xbox (RTM) Kinect games system. However, these systems are generally not suitable for high-speed text input. Text input is possible in essentially the same way as an on-screen keyboard is used on a touch-screen device, but the typing speed is generally slow.
It is an object of the invention to reduce or substantially obviate the above mentioned problems.
Statement of Invention
According to the present invention, there is provided a text input system for a computer, the system comprising:
a wearable sensor device for measuring movement of at least one hand of a user, the sensor being capable of measuring movement of at least one individual finger with respect to the hand as well as movement of the hand as a whole with respect to its surroundings, and the sensor device being arranged to transmit movement data to a controller;
a display screen arranged to receive display information from the controller; and
a controller adapted to display an image of a virtual keyboard on the display screen, and to display an image of a virtual hand on the display screen, the virtual hand moving on the display screen in response to input from the sensor device, the controller being further adapted to identify a key press when the at least one finger is moved when the corresponding finger on the virtual hand is in a position corresponding with a key on the virtual keyboard, and to pass the key press as text input to the computer.
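The key-press identification above can be sketched in code. This is a minimal illustration only, not part of the disclosure: the key layout, coordinates, and press threshold are invented for the example.

```python
# Illustrative sketch only: the key layout, coordinates and press threshold
# below are invented; the patent does not specify them.

# Virtual keyboard: each key mapped to an on-screen bounding box (x0, y0, x1, y1).
KEY_BOXES = {
    "A": (0, 0, 10, 10),
    "S": (10, 0, 20, 10),
    "D": (20, 0, 30, 10),
}

PRESS_THRESHOLD = 5.0  # finger travel that counts as a 'typing' stroke


def key_under_finger(finger_pos, key_boxes=KEY_BOXES):
    """Return the key whose box contains the virtual fingertip, if any."""
    x, y = finger_pos
    for key, (x0, y0, x1, y1) in key_boxes.items():
        if x0 <= x < x1 and y0 <= y < y1:
            return key
    return None


def detect_key_press(finger_pos, finger_travel):
    """A key press is identified when the finger moves (travel exceeds the
    threshold) while the corresponding virtual finger sits over a key."""
    if finger_travel < PRESS_THRESHOLD:
        return None
    return key_under_finger(finger_pos)
```

In practice the threshold would be tuned so that resting fingers do not trigger presses, and the key map would be generated from the user-adjustable virtual keyboard layout.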
The text input system is advantageous, because it allows high-speed text input to a computer by movement of the hands in exactly the same way as text input is achieved with a standard keyboard. Because a standard keyboard is likely to be familiar to the user, relatively little learning will be required to use the device. At the same time, the user is not constrained to have his hands in a certain position, as with a regular physical keyboard. Since the device is wearable, the user can input text to a computer with his hands in more or less any position, for example on the arms of a chair or by his sides while lying in bed. This may be a particular advantage to users having certain physical disabilities, who may not be able to position their hands correctly to use a standard keyboard.
Preferably, the controller is implemented as software running on the computer to which text input is being provided.
The text input system is suitable for use with desktop or laptop computers, and also with more portable devices such as tablets or smartphones. In the latter case, the text input system of the invention allows for high-speed text input which is simply not possible using the touch screen keyboard which is typically provided with these devices.
Preferably, two wearable sensor devices are provided, so that the user can use one sensor device to measure movement in each of his two hands. Correspondingly, two virtual hands may be displayed on the screen and keys on the virtual keyboard may be pressed by a finger on either hand. In this case, the position of each virtual hand with respect to the real hand may be calibrated independently, allowing the physical hands to be separated by some distance, for example at either side of the user’s body, whilst the virtual hands are displayed relatively close together, both over the virtual keyboard.
The ratio of the distance moved by the or each real hand to the distance moved by the or each virtual hand may be set by the user. This allows the perceived ‘responsiveness’ of the virtual hands to be set to a level which is found to be most natural for the user.
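The independent calibration and the user-set movement ratio described above might be sketched as follows; the class and attribute names are invented for the example and are not part of the disclosure.

```python
# Illustrative sketch: each virtual hand has an independently calibrated
# screen origin, and a user-set gain (the movement ratio described above).
# Names and coordinates are invented for the example.

class VirtualHand:
    def __init__(self, origin, gain=1.0):
        self.origin = origin   # calibrated screen position of the virtual hand
        self.gain = gain       # user-set ratio of virtual to real movement
        self.ref = None        # real-hand position captured at calibration

    def calibrate(self, real_pos):
        """Anchor the current real-hand position to the virtual origin."""
        self.ref = real_pos

    def screen_pos(self, real_pos):
        """Map a real-hand position to a position on the display screen."""
        dx = (real_pos[0] - self.ref[0]) * self.gain
        dy = (real_pos[1] - self.ref[1]) * self.gain
        return (self.origin[0] + dx, self.origin[1] + dy)


# The physical hands may be far apart (e.g. at either side of the body)
# while both virtual hands are displayed over the virtual keyboard:
left = VirtualHand(origin=(300, 400), gain=2.0)
right = VirtualHand(origin=(500, 400), gain=2.0)
left.calibrate((-40, 0))
right.calibrate((40, 0))
```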
Preferably, the or each sensor device may be capable of measuring movement individually in each of the five fingers of a user’s hand. With two such sensor devices, the user is able to use all his fingers to achieve a high typing speed, in exactly the same way as with a standard physical keyboard.
The controller may be adapted to provide for pointer input, as well as text input, to the computer. In this case, when any virtual hand is not positioned over the virtual keyboard, that hand can be used to point and ‘click’ items on the display screen. In this way, the system can replace both keyboard and mouse input, again allowing each hand to remain in a natural position whilst providing for full use of the computer. The controller may also be adapted to recognise certain gestures other than ‘typing’ on the virtual keyboard. For example, gestures for the up, down, left and right keys may be provided to make those functions accessible without the need to move the virtual hands over the corresponding keys on the virtual keyboard. The combination of typing on the virtual keyboard, pointing and clicking with the virtual hand, and gesture control to access common functions makes it possible to operate a computer very quickly, as compared with standard known input devices.
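One way the arrow-key gestures might be dispatched is sketched below. The (hand, finger, motion) encodings follow the example gestures of Figure 4; the encoding itself and the event names are invented for the sketch.

```python
# Illustrative sketch of gesture dispatch for the arrow-key functions,
# following the example gestures of Figure 4. The encoding is invented.

ARROW_GESTURES = {
    ("left", "middle", "extend"): "UP",     # middle finger away from the palm
    ("left", "middle", "curl"):   "DOWN",   # middle finger towards the palm
    ("left", "ring",   "extend"): "LEFT",   # ring finger stretched out
    ("left", "index",  "extend"): "RIGHT",  # index finger stretched out
}


def dispatch_gesture(hand, finger, motion):
    """Translate a recognised finger gesture into an arrow-key event,
    without moving the virtual hand over the virtual keyboard."""
    return ARROW_GESTURES.get((hand, finger, motion))
```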
The size of the virtual keyboard on the display screen may be adjustable by the user. In particular, the size of the keyboard may be adjusted when a ‘pinching’ motion between two fingers is measured by the sensor device when the virtual hand is positioned near an edge of the virtual keyboard. In this way, the user can pinch or grab one or more edges of the keyboard, and move the edge(s) to adjust the size of the keyboard.
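The pinch-to-resize behaviour can be sketched as follows; the geometry and tolerance are invented for the example, and only the right edge is handled to keep the sketch short.

```python
# Illustrative sketch of pinch-to-resize: a 'pinch' measured while the
# virtual hand is near an edge of the keyboard grabs that edge, and the
# dragged position becomes the new edge. Geometry and tolerance are invented.

EDGE_TOLERANCE = 15  # how close (in pixels) a pinch must be to an edge


def resize_keyboard(kb, pinch_pos, drag_to):
    """kb is the keyboard box (x0, y0, x1, y1). If the pinch lands near the
    right edge, drag that edge to the new x position; otherwise no change."""
    x0, y0, x1, y1 = kb
    if abs(pinch_pos[0] - x1) <= EDGE_TOLERANCE:
        return (x0, y0, max(x0 + 1, drag_to[0]), y1)
    return kb
```

A full implementation would treat all four edges and the corners in the same way.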
The size of the virtual hand(s) on the display screen may also be adjustable by the user. In some embodiments, the size of the virtual hand(s) may be adjusted automatically to match the size of the virtual keyboard when the size of the virtual keyboard is adjusted.
Different visual, audio, or any other feedback effects may be provided when a key on the virtual keyboard is pressed by the virtual hand. In some embodiments, the wearable sensor device may incorporate force feedback components, so that the user can ‘feel’ when he has pressed a key.
The text input system of the invention provides for high-speed text input on a computer with any size of screen, whilst allowing the hands of the user to remain in substantially any position. In addition, the system is highly flexible when compared to a standard keyboard. As described above, the size of the virtual keyboard may be adjusted. In addition, the virtual keyboard may be moved to different points on the display screen, and the transparency, design and colour of the virtual hands and keyboard can be set at the user’s discretion. Even the shape of the keys on the virtual keyboard, the spacing between keys, and the arrangement of keys may be changed to suit the individual user. This level of flexibility is impossible with a standard physical keyboard.
Description of the Drawings
For a better understanding of the present invention, and to show more clearly how it may be carried into effect, reference will now be made by way of example only to the accompanying drawings, in which:
Figure 1 is a schematic illustration of a text input system according to the invention;
Figure 2a shows the size of the virtual keyboard being adjusted in the text input system of Figure 1;
Figure 2b shows the position of the virtual keyboard being adjusted in the text input system of Figure 1;
Figure 2c shows the virtual keyboard being temporarily hidden or switched off in the text input system of Figure 1;
Figure 3 shows the arrangement of keys on the virtual keyboard being adjusted in the text input system of Figure 1;
Figure 4 shows example gestures which may be used to access common functions in the text input system of Figure 1;
Figures 5a and 5b show an example gesture which may be used to emulate a mouse click in the text input system of Figure 1;
Figure 6 shows how the system of Figure 1 may be used to scroll up and down a document; 
Figure 7 shows how the system of Figure 1 may be used to select items on a screen; and
Figure 8 shows how the system of Figure 1 may be used to select text on a screen.
Description of the Preferred Embodiment(s)
Referring firstly to Figure 1, an embodiment of a text input system according to the invention is indicated generally at 10. The text input system 10 includes a first wearable sensor device 12a and a second wearable sensor device 12b. Each sensor device 12a, 12b is attached to one of the user’s wrists 100a, 100b. In this embodiment, the sensor devices are worn like a bracelet, but it is envisaged that other forms of sensor device may be used, for example sensors in the form of gloves.
Each sensor device 12a, 12b tracks the movement of the corresponding hand 100a, 100b. In particular, the overall speed and direction of movement of each hand with respect to its surroundings is measured, as well as the movement of each finger on each hand. The measurements from the sensor devices 12a, 12b are transmitted to a controller 14, which in this embodiment is implemented as software running on the computing device which is being controlled.
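The movement data transmitted from a sensor device to the controller might look like the following sketch. The patent does not specify a data format, so the field names and units here are invented for illustration.

```python
# Illustrative sketch of the movement data a sensor device might transmit
# to the controller each frame; the patent does not specify a wire format,
# so the field names and units here are invented.

from dataclasses import dataclass, field
from typing import Dict, Tuple


@dataclass
class MovementFrame:
    hand: str                             # "left" or "right"
    velocity: Tuple[float, float, float]  # hand movement w.r.t. surroundings
    fingers: Dict[str, float] = field(default_factory=dict)
    # per-finger movement relative to the hand, e.g. {"index": 0.7}


frame = MovementFrame(hand="right",
                      velocity=(0.1, 0.0, -0.02),
                      fingers={"index": 0.7, "thumb": 0.1})
```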
The system 10 further includes a display screen 16, which is the display screen which is already used by the computing device for output. The controller 14 is adapted to display items on the display screen 16, superimposed over other items which are being displayed by the computing device. In particular, the controller 14 displays a virtual hand 18a and a virtual hand 18b. The controller moves the virtual hands 18a, 18b around the screen in response to movement of the corresponding real hands 100a, 100b with respect to their surroundings, as measured by the sensor devices 12a, 12b. The controller 14 also moves individual fingers on the virtual hands 18a, 18b in response to movement of the corresponding fingers on the real hands 100a, 100b, the movement of the fingers being measured by the sensor devices 12a, 12b.
The controller also displays a virtual keyboard 20 on the display screen 16. By moving his hands 100a, 100b, the user can control the movement of the virtual hands 18a, 18b and therefore use all ten fingers to ‘type’ on the virtual keyboard 20.
When not typing, the user can also use his hands 100a, 100b to control the virtual hands 18a, 18b outside of the virtual keyboard 20, to access functions other than text input. For example, as shown in Figure 2a, the user can ‘pinch’ with his thumb and forefinger. The virtual hands 18a, 18b will replicate this motion, and the controller 14 can interpret this to allow access to a re-sizing function. In the Figure, it is the virtual keyboard 20 which is being re-sized, by ‘pinching’ at the corners and ‘stretching’ the virtual keyboard 20. However, it will be understood that other on-screen windows or other elements may be re-sized in this way.
In Figure 2b, the virtual keyboard 20 is being moved using virtual hand 18b. To move the virtual keyboard (or again, any other movable on-screen item), the user moves his hand 100b so that the virtual hand 18b is at a corner of the item to be moved. He then makes a ‘grasping’ gesture, i.e. a clenched fist. The controller 14 interprets this to allow access to a moving function. The item is then moved to its new location by moving the hand 100b, before releasing the clenched fist to ‘drop’ the item in place.
As well as moving and resizing the virtual keyboard 20, the virtual keyboard can conveniently be minimized, or temporarily switched off, when text input is not required. This allows more space on the display screen for other items. As shown in Figure 2c, this may very simply be achieved by providing an ‘off’ key 22 on the virtual keyboard 20. When the ‘off’ key is pressed, the keyboard shrinks to leave only one key: an ‘on’ key 24. The ‘on’ key can be pressed to restore the full-size keyboard when required.
As shown in Figure 3, the layout of the keys on the virtual keyboard 20 may be adjusted to suit the user’s preferences. A simple and common adjustment is to change the spacing between keys or the size of the keys, but full flexibility is available if required. In Figure 3 the user is completely rearranging the keys on the virtual keyboard 20.
As already described with reference to Figures 2a and 2b, the virtual hands 18a, 18b (as controlled by real hands 100a, 100b) may be used to access functions other than text input. For high-speed control of a computer, it is useful to combine virtual-keyboard text input as described above with ‘point and click’ input, and with other gestures that quickly allow access to common functions. Figure 4 shows an example set of gestures which may be used to provide the functions of the arrow keys (up, down, left and right) without having to move the virtual hands to any particular position on the screen. On the far left, the up function is shown as a movement of the left middle finger away from the palm. In the next illustration, the down function is shown as a movement of the left middle finger towards the palm. The left arrow function is accessed by stretching out the left ring finger, and the right arrow function is accessed by stretching out the left index finger. These gestures are of course only examples, and the mapping of gestures to common functions may be fully configurable by the user.
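A user-configurable mapping of this kind might be held in a simple lookup table. The sketch below encodes the four Figure 4 gestures under hypothetical gesture identifiers and key names:

```python
# Hypothetical gesture identifiers and key names; per the description, the
# mapping of gestures to common functions may be fully user-configurable.
DEFAULT_GESTURE_MAP = {
    ("left", "middle", "extend"): "up",     # middle finger away from palm
    ("left", "middle", "curl"):   "down",   # middle finger towards palm
    ("left", "ring", "extend"):   "left",   # ring finger stretched out
    ("left", "index", "extend"):  "right",  # index finger stretched out
}

def resolve_gesture(hand, finger, motion, gesture_map=DEFAULT_GESTURE_MAP):
    """Return the arrow-key function bound to a gesture, or None if unmapped."""
    return gesture_map.get((hand, finger, motion))
```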
Figures 5a and 5b show how the input system may be used in place of a standard mouse. The user’s right hand 100b can be moved to control movement of virtual hand 18b around the screen and, for example, a pressing motion with the index finger can be configured to correspond with a left mouse-click, and a pressing motion with the ring finger can be configured to correspond with a right mouse-click.
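As a minimal sketch of that mouse emulation (finger and button names are assumptions, and per the description the bindings would be configurable):

```python
def finger_press_to_click(finger, bindings=None):
    """Translate a pressing motion of a finger on the tracked hand into a
    mouse-button event, in the manner of Figures 5a and 5b.
    Illustrative sketch; default bindings are assumptions.
    """
    if bindings is None:
        bindings = {"index": "left_click", "ring": "right_click"}
    return bindings.get(finger)  # None when the finger has no binding
```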
Figure 6 shows how the input system may be used for scrolling up and down a document. In one scrolling mode (shown on the far left of the Figure), an outstretched index finger gesture of the hand 100b causes scrolling in a ‘hand tracking’ mode, where the hand is simply moved to scroll the page up and down. Alternatively, as shown in the centre of the Figure, the virtual hand 18b can be used to operate a scroll bar 50, in the same way as the scroll bar 50 can be operated with a mouse pointer. On the right of the Figure, a third scrolling method is shown, in which the virtual hand 18b is manipulated to scroll just as a real hand would on a touch-sensitive screen.
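The ‘hand tracking’ and touch-style modes differ mainly in sign convention, while the scroll-bar mode behaves like ordinary pointer input. A sketch of the first two, with assumed mode names and sign conventions:

```python
def scroll_amount(mode, hand_dy, ratio=1.0):
    """Convert vertical hand movement into a page scroll offset for the
    'hand tracking' and touch-style modes of Figure 6.
    Mode names and sign conventions are assumptions.
    """
    if mode == "hand_tracking":
        return hand_dy * ratio    # page follows the hand directly
    if mode == "touch_style":
        return -hand_dy * ratio   # page moves opposite the drag, like a touchscreen
    raise ValueError(f"unknown scroll mode: {mode}")
```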
In Figure 7, the input system 10 is being used to select items on a screen. The virtual hand 18b is used in exactly the same way as a conventional mouse to draw a box 70 around items 52, 54, 56, 58, 60, 62, to select those items. Items 64, 66, 68 are not selected in this example. Virtual hand 18b is moved to the top left corner of the area to be selected, and held there for a predetermined period (for example, two seconds). The virtual hand 18b is then dragged to the bottom right corner of the area to be selected, to create a box 70 and select the items within the box 70.
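Once the two corners of the box are fixed, the selection reduces to a rectangle-containment test. A minimal sketch, assuming items are represented by (x, y) positions (the item representation is an assumption):

```python
def select_items(items, corner_a, corner_b):
    """Return the names of items whose positions fall inside the box
    dragged between the two corners, mirroring the Figure 7 behaviour.
    """
    x0, x1 = sorted((corner_a[0], corner_b[0]))
    y0, y1 = sorted((corner_a[1], corner_b[1]))
    return [name for name, (x, y) in items.items()
            if x0 <= x <= x1 and y0 <= y <= y1]

icons = {"doc": (1, 1), "img": (5, 5), "app": (20, 20)}
```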
The input system is extremely flexible, and allows text input at a similar speed to a standard keyboard, without associated constraints on posture. In addition, the system can provide high-speed text input in situations where a standard keyboard is not available, such as on a mobile telephone.
In Figure 8, like Figure 7, a selection is being made using the input system 10. In this case, a portion of text 70 (the first two lines in the figure) is being selected.
The embodiments described above are provided by way of example only, and various changes and modifications will be apparent to persons skilled in the art without departing from the scope of the present invention as defined by the appended claims.

Claims (19)

  1. A text and pointer input system for a computing device, the system comprising:
    a wearable sensor device for measuring movement of at least one hand of a user, the sensor being capable of measuring movement of at least one individual finger with respect to the hand as well as movement of the hand as a whole with respect to its surroundings, and the sensor device being arranged to transmit movement data to a controller;
    a display screen arranged to receive display information from the controller; and
    a controller adapted to display an image of a virtual keyboard on the display screen, and to display an image of a virtual hand on the display screen, the virtual hand moving on the display screen in response to input from the sensor device, the controller being further adapted to identify a key press when the at least one finger is moved when the corresponding finger on the virtual hand is in a position corresponding with a key on the virtual keyboard, and to pass the key press as text input to the computer,
    the controller being further adapted to emulate mouse or other pointer input when the virtual hands are not in a position corresponding with the virtual keyboard.
  2. The text input system as claimed in claim 1, wherein two wearable sensor devices are provided for measuring movement of two hands.
  3. The text input system as claimed in claim 2, wherein two virtual hands are displayed on the screen, each virtual hand moving in response to input from one of the two sensor devices.
  4. The text input system as claimed in any of the preceding claims, wherein the ratio of the moved distance measured by the or each sensor to the distance moved by the or each virtual hand is adjustable.
  5. The text input system as claimed in any of the preceding claims, wherein the or each sensor device is capable of measuring movement of each of the five fingers on the user’s hand individually.
  6. A text input system as claimed in any of the preceding claims, wherein the controller is adapted to recognise gestures other than key-press gestures, and to activate functions dependent on recognised gestures.
  7. The text input system as claimed in any of the preceding claims, wherein the size of the virtual keyboard on the display screen is adjustable.
  8. The text input system as claimed in any of the preceding claims, wherein the position of the virtual keyboard on the display screen is adjustable.
  9. The text input system as claimed in any of the preceding claims, wherein the size of the or each virtual hand is adjustable.
  10. The text input system as claimed in claim 9, when dependent on claim 7, wherein the size of the or each virtual hand is automatically adjusted when the size of the virtual keyboard is adjusted.
  11. The text input system as claimed in any of the preceding claims, wherein the wearable sensor device(s) include output means for providing feedback when a key is pressed.
  12. The text input system as claimed in claim 11, wherein the output means on the wearable sensor device include a force-feedback output device.
  13. The text input system as claimed in claim 12, wherein the force-feedback output device includes a vibrator.
  14. The text input system as claimed in any of the preceding claims, wherein the sensor device is in the form of a band.
  15. The text input system as claimed in any of the preceding claims, wherein the sensor device is in the form of a glove.
  16. The text input system as claimed in any of the preceding claims, wherein the virtual hand may be used to scroll a page on the screen.
  17. The text input system as claimed in any of the preceding claims, wherein the virtual hand may be used to select from a set of items on the screen.
  18. The text input system as claimed in any of the preceding claims, wherein the virtual hand may be used to select a portion of text on the screen.
  19. The text input system substantially as described herein, with reference to and as illustrated in Figures 1 to 5 of the accompanying drawings.
PCT/CN2016/070067 2015-01-21 2016-01-04 Smart wearable input apparatus WO2016115976A1 (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
GB1501018.4A GB2534386A (en) 2015-01-21 2015-01-21 Smart wearable input apparatus
GB1501018.4 2015-01-21
CN201510988545.2 2015-12-22
CN201510988545.2A CN106445094A (en) 2015-01-21 2015-12-22 Smart wearable input apparatus

Publications (1)

Publication Number Publication Date
WO2016115976A1 true WO2016115976A1 (en) 2016-07-28

Family

ID=56416407

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2016/070067 WO2016115976A1 (en) 2015-01-21 2016-01-04 Smart wearable input apparatus

Country Status (1)

Country Link
WO (1) WO2016115976A1 (en)


Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070103430A1 (en) * 2003-11-25 2007-05-10 Kenji Nishi Information inputting tool, storage device, information inputting device, and information processing equipment
CN1996092A (en) * 2005-12-28 2007-07-11 大学光学科技股份有限公司 Focus-adjustable headset type display system with virtual keyboard and device therefor
CN201638148U (en) * 2009-09-10 2010-11-17 深圳市亿思达显示科技有限公司 Glove-type virtual input device
CN102063183A (en) * 2011-02-12 2011-05-18 深圳市亿思达显示科技有限公司 Virtual input device of grove type
CN103019377A (en) * 2012-12-04 2013-04-03 天津大学 Head-mounted visual display equipment-based input method and device
US20130241927A1 (en) * 2011-07-03 2013-09-19 Neorai Vardi Computer device in form of wearable glasses and user interface thereof
CN104199550A (en) * 2014-08-29 2014-12-10 福州瑞芯微电子有限公司 Man-machine interactive type virtual touch device, system and method


Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20170038830A1 (en) * 2015-08-04 2017-02-09 Google Inc. Context sensitive hand collisions in virtual reality
US10635161B2 (en) * 2015-08-04 2020-04-28 Google Llc Context sensitive hand collisions in virtual reality
US10782793B2 (en) 2017-08-10 2020-09-22 Google Llc Context-sensitive hand interaction
US11181986B2 (en) 2017-08-10 2021-11-23 Google Llc Context-sensitive hand interaction
CN109217005A (en) * 2018-09-07 2019-01-15 昆山龙梦电子科技有限公司 Electric connector

Similar Documents

Publication Publication Date Title
GB2534386A (en) Smart wearable input apparatus
Dube et al. Text entry in virtual reality: A comprehensive review of the literature
US10417880B2 (en) Haptic device incorporating stretch characteristics
US5917476A (en) Cursor feedback text input method
Harrison et al. Using shear as a supplemental two-dimensional input channel for rich touchscreen interaction
US20100164897A1 (en) Virtual keypad systems and methods
US20130275907A1 (en) Virtual keyboard
US10048760B2 (en) Method and apparatus for immersive system interfacing
US8552980B2 (en) Computer input devices and associated computing devices, software, and methods
KR20090096528A (en) Human computer interaction device, electronic device and human computer interaction method
Bachl et al. Challenges for designing the user experience of multi-touch interfaces
JPH0778120A (en) Hand-held arithmetic unit and processing method of input signal in hand-held arithmetic unit
Dobbelstein et al. PocketThumb: A wearable dual-sided touch interface for cursor-based control of smart-eyewear
Bergström et al. Human--Computer interaction on the skin
US20190034070A1 (en) Flexible & customisable human computer interaction (HCI) device that combines the functionality of traditional keyboard and pointing device (mouse/touchpad) on a laptop & desktop computer
WO2016115976A1 (en) Smart wearable input apparatus
US20160328024A1 (en) Method and apparatus for input to electronic devices
US20200168121A1 (en) Device for Interpretation of Digital Content for the Visually Impaired
Ahn et al. Evaluation of edge-based interaction on a square smartwatch
Benko et al. Imprecision, inaccuracy, and frustration: The tale of touch input
US20160378206A1 (en) Circular, hand-held stress mouse
Blaskó et al. Single-handed interaction techniques for multiple pressure-sensitive strips
CN104484073A (en) Hand touch interaction system
JP2012079097A (en) Information apparatus with key input unit disposed on surface invisible during use, input method and program
CN104063046A (en) Input Device And Method Of Switching Input Mode Thereof

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16739723

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the addressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 29/01/2018)

122 Ep: pct application non-entry in european phase

Ref document number: 16739723

Country of ref document: EP

Kind code of ref document: A1