CN106445094A - Smart wearable input apparatus - Google Patents

Smart wearable input apparatus

Info

Publication number
CN106445094A
CN106445094A (application number CN201510988545.2A)
Authority
CN
China
Prior art keywords
wearable
input equipment
user
intelligence
virtual hand
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201510988545.2A
Other languages
Chinese (zh)
Inventor
孔亮
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhenjiang Moneng Network Technology Co Ltd
Original Assignee
Zhenjiang Moneng Network Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhenjiang Moneng Network Technology Co Ltd filed Critical Zhenjiang Moneng Network Technology Co Ltd
Priority to PCT/CN2016/070067 (WO2016115976A1)
Publication of CN106445094A
Legal status: Pending

Links

Classifications

    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/014 Hand-worn input/output arrangements, e.g. data gloves
    • G06F3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G06F3/0236 Character input methods using selection techniques to select from displayed items
    • G06F3/0426 Digitisers using opto-electronic means with a single imaging device tracking fingers with respect to a virtual keyboard projected or printed on a surface
    • G06F3/04845 GUI interaction techniques for the control of specific functions or operations, e.g. for image manipulation such as dragging, rotation, expansion or change of colour
    • G06F3/0485 Scrolling or panning
    • G06F3/04855 Interaction with scrollbars
    • G06F3/04886 Partitioning the display area of a touch-screen or digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
    • G06F2203/04806 Zoom, i.e. interaction techniques or interactors for controlling the zooming operation

Abstract

The invention relates to a smart wearable input apparatus. A text and pointer input system for a computing device comprises: a wearable sensor device (12a, 12b) for measuring movement of at least one hand (100a) of a user, the sensor (12a) being capable of measuring movement of at least one individual finger with respect to the hand as well as movement of the hand as a whole with respect to its surroundings, and the sensor device being arranged to transmit movement data to a controller; a display screen arranged to receive display information from the controller; and a controller adapted to display an image of a virtual keyboard on the display screen, and to display an image of a virtual hand on the display screen, the virtual hand moving on the display screen in response to input from the sensor device, the controller being further adapted to identify a key press when the at least one finger is moved when the corresponding finger on the virtual hand is in a position corresponding with a key on the virtual keyboard, and to pass the key press as text input to the computer.

Description

Smart wearable input apparatus
Technical field
The present invention relates to input devices for controlling computing apparatus, and in particular to a wearable input device capable of text input.
Background art
The most common way of providing text input to a computer system is the QWERTY keyboard. A keyboard allows high-speed text input, but it also has drawbacks: keyboard use has been associated with repetitive strain injury and with pain in the palms, wrists, hands, shoulders, neck and back.
A keyboard is not a very portable device and is therefore rarely used together with a tablet computer or a mobile phone. For such devices the most common text input method is to display a "virtual keyboard" on a touch-sensitive screen, on which the user "types" text. This input method has problems of its own: for example, it is difficult for the user to find a position in which a tablet can be placed so that the screen can be seen and text typed comfortably. On a mobile phone the screen size only allows "two-finger" typing, so high-speed typing is not possible.
Users are also familiar with motion tracking and gesture recognition devices and systems, for example the Xbox (RTM) motion-sensing game system. Such systems are, however, generally unsuitable for high-speed text input: text is entered in the same way as with an on-screen keyboard, but the typing speed is usually even lower.
The present invention aims to reduce or eliminate the problems referred to above.
Summary of the invention
The object of the invention is to reduce or eliminate the above technical problems by providing a smart wearable input apparatus.
The present invention provides a text input system for a computer, achieved by the following technical solution:
a wearable sensor device capable of measuring movement of at least one hand of the user; the sensor can measure the movement of at least one individual finger as well as the movement of the hand as a whole with respect to its surroundings, and the sensor device transmits the movement data to a controller;
a display screen arranged to receive display information from the controller; and
a controller adapted to display an image of a virtual keyboard and an image of a virtual hand on the display screen, the virtual hand moving on the display screen in response to input from the sensor device. When at least one finger of the user is moved while the corresponding finger of the virtual hand is positioned over a key of the virtual keyboard, the controller identifies a key press and passes it to the computer as text input.
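Purely as an illustration of the key-press logic just described, and not as part of the claimed embodiment, the detection step can be sketched in Python as follows. The class name, the flexion threshold and the rectangular hit test are assumptions made for the example.

```python
from dataclasses import dataclass

@dataclass
class Key:
    char: str
    x: float   # left edge of the key on the display (screen units)
    y: float   # top edge
    w: float   # width
    h: float   # height

    def contains(self, px: float, py: float) -> bool:
        return self.x <= px <= self.x + self.w and self.y <= py <= self.y + self.h

def detect_key_press(fingertip_xy, finger_flexion, keys, flex_threshold=0.6):
    """Return the character of the key under a flexing virtual fingertip, or None.

    fingertip_xy   -- (x, y) of the virtual-hand fingertip on the display screen
    finger_flexion -- 0.0 (extended) .. 1.0 (fully bent), from the sensor device
    keys           -- list of Key objects making up the virtual keyboard
    """
    if finger_flexion < flex_threshold:
        return None                      # finger has not moved enough: no key press
    px, py = fingertip_xy
    for key in keys:
        if key.contains(px, py):
            return key.char              # passed to the computer as text input
    return None

# Example: the index fingertip rests over the "j" key and the finger is bent past the threshold.
keyboard = [Key("j", 300, 200, 40, 40), Key("k", 345, 200, 40, 40)]
print(detect_key_press((310, 220), 0.8, keyboard))   # -> j
```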
This text input system is advantageous because it allows high-speed text entry through hand movement, in the same way as text is entered on a QWERTY keyboard. Because users are likely to be very familiar with the QWERTY layout, only a small amount of learning is needed to use the device. At the same time, the user does not have to place the hands in a particular position, as a conventional physical keyboard requires. Since the device is wearable, the user can enter text with the hands in any position, for example resting on armrests, or at the sides of the body while lying in bed. This is especially advantageous for users with physical disabilities who may be unable to place both hands in a position suitable for proper use of a QWERTY keyboard.
Preferably the controller is provided as software running on the computer for which it provides text input. The text input system is suitable for desktop and notebook computers as well as for more portable devices such as tablet computers and smartphones. For the latter, the text input system of the invention enables high-speed text input that users generally cannot achieve with the touchscreen keyboards of such portable devices. Preferably two wearable sensor devices are provided, so that the movement of each hand can be measured. Correspondingly, two virtual hands can be shown on the screen, and the user can press keys of the virtual keyboard with the fingers of either hand. In this case the position of each virtual hand relative to the corresponding real hand can be calibrated separately, so that the real hands can be some distance apart (for example at either side of the user's body) while the virtual hands remain relatively close together over the virtual keyboard.
The user can set the ratio between the distance moved by the real hand and the distance moved by the virtual hand. This allows the user to set the perceived "responsiveness" of the virtual hands to the level that feels most natural. Preferably each sensor device can measure the movement of each of the five fingers of its hand individually. With two sensor devices the user can type with ten fingers, as on a standard physical keyboard, and thereby reach a higher typing speed.
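The separate per-hand calibration and the user-set ratio between real and virtual movement can be expressed as a simple affine mapping. The sketch below is illustrative only; the anchor-point scheme and the default gain value are assumptions, not details taken from the patent.

```python
class HandMapper:
    """Maps real-hand positions (from a wrist-worn sensor device) to virtual-hand positions.

    Each hand is calibrated separately: wherever the real hand happens to rest
    (an armrest, the side of the body), the virtual hand starts from its own
    anchor point near the virtual keyboard. `gain` is the user-set ratio
    between real displacement and virtual displacement ("responsiveness").
    """

    def __init__(self, virtual_anchor, gain=1.5):
        self.virtual_anchor = virtual_anchor   # (x, y) on the display screen
        self.gain = gain
        self.real_origin = None

    def calibrate(self, real_xy):
        """Record the current real-hand position as the rest position."""
        self.real_origin = real_xy

    def to_screen(self, real_xy):
        dx = real_xy[0] - self.real_origin[0]
        dy = real_xy[1] - self.real_origin[1]
        return (self.virtual_anchor[0] + self.gain * dx,
                self.virtual_anchor[1] + self.gain * dy)

# The left hand rests at the user's side and the right hand on an armrest,
# yet both virtual hands can appear close together over the virtual keyboard.
left = HandMapper(virtual_anchor=(400, 500), gain=1.5)
left.calibrate((-300, 0))
print(left.to_screen((-290, 5)))   # a small real movement -> (415.0, 507.5) on screen
```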
The controller can also provide pointer input to the computer in addition to text input. In that case, when the virtual hand is not over the virtual keyboard, the user can use it to point at and "click" items on the display screen. The system can therefore replace both keyboard and mouse input, and the user can operate the computer fully with the hands in a natural position. The controller can also recognise gestures other than the "typing" gestures used on the virtual keyboard. For example, the user could invoke the functions of the up, down, left and right arrow keys by making corresponding gestures, without having to move the virtual hand over the corresponding keys of the virtual keyboard. The combination of typing on the virtual keyboard, pointing and clicking with the virtual hand, and invoking common functions by gesture allows the user to operate the computer quickly compared with known standard input devices.
The user can adjust the size of the virtual keyboard on the display screen. In particular, when the virtual hand is near an edge of the virtual keyboard, the user can resize the keyboard by performing a "pinch" action with two fingers. In this way the user can pinch or grab one or more edges of the keyboard and resize it by moving that edge.
The user can adjust the size of the virtual hands on the display screen. In some embodiments the size of the virtual hands is adjusted automatically when the virtual keyboard is resized, so that the virtual hands remain matched to the size of the virtual keyboard.
The system can provide visual, audio or other feedback when the virtual hand presses a key of the virtual keyboard. In some embodiments the wearable sensor device incorporates a force-feedback element, so that the user can "feel" the key being pressed.
The text input system of the invention allows high-speed text input on a computer of any screen size, while the user can keep the hands in essentially any position. In addition, the system is far more flexible than a QWERTY keyboard: as described above, the user can adjust the size of the virtual keyboard, move it to a different position on the display screen, and freely set the transparency, style and colour of the virtual hands and keyboard. The user can even change the shape of the keys, the spacing between keys and the arrangement of the keys on the virtual keyboard to suit their own needs; a standard physical keyboard cannot offer this level of flexibility.
Brief description of the drawings
Examples of the invention will now be described with reference to the accompanying drawings, in which:
Fig. 1 is a schematic view of a text input system according to the invention;
Fig. 2a shows the virtual keyboard of the text input system of Fig. 1 being resized;
Fig. 2b shows the virtual keyboard of the text input system of Fig. 1 being repositioned;
Fig. 2c shows the virtual keyboard of the text input system of Fig. 1 being temporarily hidden or closed;
Fig. 3 shows the key layout of the virtual keyboard of the text input system of Fig. 1 being adjusted;
Fig. 4 shows example gestures for invoking common functions in the text input system of Fig. 1;
Figs. 5a and 5b show example gestures for simulating mouse clicks in the text input system of Fig. 1;
Fig. 6 illustrates how a document can be scrolled up and down using the system of Fig. 1;
Figs. 7 and 8 illustrate how items on the screen can be selected using the system of Fig. 1.
Detailed description
All features disclosed in this specification, and all steps of any method or process disclosed, may be combined in any way, except for combinations in which the features and/or steps are mutually exclusive.
Any feature disclosed in this specification (including any accompanying claims, abstract and drawings) may, unless expressly stated otherwise, be replaced by an alternative feature serving the same or a similar purpose. That is, unless expressly stated otherwise, each feature is only one example of a generic series of equivalent or similar features.
Referring to Fig. 1, a text input system 10 according to the invention comprises a first wearable sensor device 12a and a second wearable sensor device 12b, worn on the wrists of the user's hands 100a, 100b respectively. The sensor devices are worn on the wrists like bracelets, but they could also take other forms, for example gloves.
The sensor devices 12a and 12b track the movement of the user's hands 100a, 100b respectively. In particular, each sensor device can measure the overall velocity and direction of movement of its hand relative to the surroundings, as well as the movement of each finger. The measurements from the sensor devices 12a, 12b are transmitted to a controller 14, which in this system is software running on the computing device.
The system 10 also includes a display screen 16, which is the display screen used by the computing device for output. The controller 14 can display items on the display screen 16, superimposed on the other items displayed by the computing device. In particular, the controller 14 displays virtual hands 18a and 18b. The controller 14 responds to movement of the user's real hands 100a, 100b relative to their surroundings, as measured by the sensor devices 12a, 12b, by moving the virtual hands 18a, 18b around the screen accordingly. It likewise responds to movement of the individual fingers of the hands 100a, 100b, as measured by the sensor devices 12a, 12b, by moving the corresponding fingers of the virtual hands 18a, 18b.
The controller 14 also displays a virtual keyboard 20 on the display screen 16. The user controls the movement of the virtual hands 18a, 18b through the movement of the hands 100a, 100b, and can "type" information on the virtual keyboard 20 using ten fingers.
When not typing, the user can control the virtual hands 18a, 18b outside the virtual keyboard 20 with the hands 100a, 100b to perform functions other than text input. For example, as shown in Fig. 2a, the user can perform a "pinch" action with thumb and forefinger. The virtual hands 18a, 18b replicate this action, and the controller 14 interprets it as a resize command. In the figure, the user resizes the virtual keyboard 20 by "pinching" one of its corners and "stretching" it. Other windows or screen elements can be resized in the same way.
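The pinch-and-stretch resizing of Fig. 2a can be sketched as below. The distance threshold used to detect a pinch and the corner-based scaling rule are illustrative assumptions, not details from the patent.

```python
import math

def is_pinch(thumb_tip, index_tip, threshold=20.0):
    """True when the thumb and index fingertips of the virtual hand are close together."""
    return math.dist(thumb_tip, index_tip) < threshold

def resize_keyboard(keyboard_rect, pinch_start, pinch_now):
    """Scale the virtual keyboard 20 by stretching the pinched corner.

    keyboard_rect -- (x, y, w, h) of the virtual keyboard on the display screen
    pinch_start   -- fingertip position when the pinch began (at a corner)
    pinch_now     -- current fingertip position while the pinch is held
    """
    x, y, w, h = keyboard_rect
    scale_x = (pinch_now[0] - x) / max(pinch_start[0] - x, 1e-6)
    scale_y = (pinch_now[1] - y) / max(pinch_start[1] - y, 1e-6)
    return (x, y, w * scale_x, h * scale_y)

# Pinching the bottom-right corner of a 400 x 150 keyboard and dragging it outward.
print(resize_keyboard((100, 300, 400, 150), pinch_start=(500, 450), pinch_now=(600, 487)))
# -> approximately (100, 300, 500.0, 187.0)
```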
In Fig. 2b, the user moves the virtual keyboard 20 using the virtual hand 18b. The user moves the hand 100b so that the virtual hand 18b sits at a corner of the item to be moved (here the virtual keyboard, although any other movable item on the screen can be moved in the same way), and then performs a "grab" gesture by clenching the fist. The controller 14 interprets this gesture as a move command. The user moves the item to a new position by moving the hand 100b while the fist remains clenched, and "drops" the item by unclenching.
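A minimal sketch of this grab-and-move interaction follows. The `Movable` item type and the delivery of clench detection as a boolean are assumptions made for the example.

```python
from dataclasses import dataclass

@dataclass
class Movable:
    x: float
    y: float
    w: float
    h: float

    def contains(self, px, py):
        return self.x <= px <= self.x + self.w and self.y <= py <= self.y + self.h

class GrabMover:
    """While the fist is clenched over an item, the item follows the virtual hand;
    opening the fist "drops" it at its current position."""

    def __init__(self):
        self.grabbed = None          # (item, offset of the item origin from the hand)

    def update(self, hand_xy, fist_clenched, items):
        if not fist_clenched:
            self.grabbed = None      # fist opened: drop whatever was held
            return
        if self.grabbed is None:     # fist just closed: try to pick up the item under the hand
            for item in items:
                if item.contains(*hand_xy):
                    self.grabbed = (item, (item.x - hand_xy[0], item.y - hand_xy[1]))
                    break
        else:                        # still clenched: drag the item with the hand
            item, (ox, oy) = self.grabbed
            item.x, item.y = hand_xy[0] + ox, hand_xy[1] + oy

# Drag the virtual keyboard from y=300 towards the top of the screen.
keyboard = Movable(100, 300, 400, 150)
mover = GrabMover()
mover.update((120, 320), True, [keyboard])    # clench over a corner of the keyboard
mover.update((120, 100), True, [keyboard])    # move the hand up while clenched
mover.update((120, 100), False, [keyboard])   # unclench: the keyboard stays where it was dropped
print(keyboard.x, keyboard.y)                 # -> 100 80
```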
As well as moving and resizing the virtual keyboard 20, the user can very conveniently minimise or temporarily close the virtual keyboard when no text input is required, so that the display screen can accommodate more other items. As shown in Fig. 2c, this is done with the "close" key 22 above the virtual keyboard 20. When the "close" key 22 is pressed, the keyboard shrinks until only an "open" key 24 remains. The user can restore the full-size keyboard whenever needed by pressing the "open" key 24.
As shown in Fig. 3, the user can adjust the key layout of the virtual keyboard 20 according to personal preference. Typical simple adjustments include changing the spacing between keys or the size of the keys, but complete flexibility is available when required. In Fig. 3 the user has completely rearranged the keys of the virtual keyboard 20.
As shown in Figs. 2a and 2b, the user can use the virtual hands 18a, 18b (controlled by the real hands 100a, 100b) to perform functions other than text input. For high-speed control of the computer, the combination of text input via the virtual keyboard (as described above) with "click" input and other gestures is very useful, because it lets the user invoke common functions quickly.
Fig. 4 shows example gestures with which the user can perform the arrow-key functions (up, down, left, right) without having to move the virtual hand to any particular position on the screen. On the far left, the user performs the "up" function by moving the left middle finger away from the palm. In the next figure, the "down" function is performed by moving the left middle finger towards the palm. The "left" function is performed by extending the left ring finger, and the "right" function by extending the left index finger. Such gestures are of course only examples, and the user is free to configure the mapping between gestures and common functions.
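These arrow-key gestures amount to a lookup from a (finger, direction of movement) pair to a key event. The sketch below is illustrative only; the finger names, the sign convention for flexion and the threshold are assumptions, and as noted above the mapping is user-configurable.

```python
# Gesture table matching the examples of Fig. 4 (user-configurable in practice).
GESTURE_MAP = {
    ("left_middle", "extend"): "UP",      # middle finger moved away from the palm
    ("left_middle", "flex"):   "DOWN",    # middle finger moved towards the palm
    ("left_ring",   "extend"): "LEFT",
    ("left_index",  "extend"): "RIGHT",
}

def gesture_to_key(finger, flexion_delta, threshold=0.3):
    """Translate an isolated finger movement into an arrow-key event, or None.

    flexion_delta -- change in flexion reported by the sensor device;
                     negative means the finger moved away from the palm.
    """
    if abs(flexion_delta) < threshold:
        return None
    direction = "flex" if flexion_delta > 0 else "extend"
    return GESTURE_MAP.get((finger, direction))

print(gesture_to_key("left_middle", -0.5))   # -> UP
print(gesture_to_key("left_index", -0.4))    # -> RIGHT
```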
Figs. 5a and 5b show how the input system can replace a standard mouse. The user controls the movement of the virtual hand 18b around the screen through the movement of the right hand 100b. For example, the user can configure a key-press action with the index finger to correspond to a left mouse click, and a key-press action with the ring finger to correspond to a right mouse click.
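The mouse-button mapping can be sketched in the same way; the exact finger-to-button assignment below is only an example, since the mapping is described as configurable.

```python
# Example finger-to-button assignment for the right hand (configurable by the user).
CLICK_MAP = {"right_index": "LEFT_CLICK", "right_ring": "RIGHT_CLICK"}

def finger_press_to_click(finger, pressed):
    """Return a mouse-button event when a configured finger performs a key-press action."""
    return CLICK_MAP.get(finger) if pressed else None

print(finger_press_to_click("right_index", True))   # -> LEFT_CLICK
print(finger_press_to_click("right_thumb", True))   # -> None (no button assigned)
```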
Fig. 6 shows how a document can be scrolled up and down using the input system. In one scrolling mode (shown on the far left of the figure), a gesture in which the hand 100b extends the index finger puts the document into a "follow the hand" mode, in which the user scrolls the page up and down by moving the hand. Alternatively, as shown in the middle of the figure, the user can operate a scroll bar 50 with the virtual hand 18b, in the same way as the scroll bar 50 would be operated with a mouse pointer. The right-hand part of the figure shows a third scrolling method, in which the user scrolls with the virtual hand 18b in the same way as a real hand scrolls on a touch screen.
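The "follow the hand" scrolling mode on the left of Fig. 6 can be sketched as below; the gain, the clamping to the document bounds and the gating on an extended index finger are illustrative assumptions.

```python
def follow_hand_scroll(scroll_pos, hand_dy, index_extended,
                       gain=3.0, doc_height=5000, view_height=800):
    """Scroll the document while the index finger is extended ("follow the hand" mode).

    scroll_pos -- current scroll offset of the document in screen units
    hand_dy    -- vertical hand displacement since the last update
    """
    if not index_extended:
        return scroll_pos                 # gesture released: leave the document where it is
    new_pos = scroll_pos + gain * hand_dy
    return max(0, min(new_pos, doc_height - view_height))   # clamp to the document bounds

pos = 0
for dy in (40, 40, -10):                  # hand moves down, down, then slightly up
    pos = follow_hand_scroll(pos, dy, index_extended=True)
print(pos)                                # -> 210.0
```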
In Figs. 7 and 8, the user selects items on the screen using the input system 10. With the virtual hand 18b the user can draw a selection box 70 around items 52, 54, 56, 58, 60, 62, in the same way as with a conventional mouse, thereby selecting those items; in this example items 64, 66, 68 remain unselected. The user moves the virtual hand 18b to the top-left corner of the area to be selected and holds it there for a certain time (for example two seconds), then drags the virtual hand 18b to the bottom-right corner of the area, creating the selection box 70 and selecting the items inside it.
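Once the two corners of the selection box 70 are known (the dwell corner and the drop corner), the selection reduces to a containment test. The sketch below omits the dwell-time detection itself, and the item coordinates are invented for the example.

```python
def selection_box(corner_a, corner_b, items):
    """Return the ids of the items whose centres lie inside the box dragged
    from corner_a (where the virtual hand dwelt) to corner_b (where it was dropped).

    items -- mapping of item id -> (centre_x, centre_y) on the display screen
    """
    (x1, y1), (x2, y2) = corner_a, corner_b
    left, right = min(x1, x2), max(x1, x2)
    top, bottom = min(y1, y2), max(y1, y2)
    return [name for name, (cx, cy) in items.items()
            if left <= cx <= right and top <= cy <= bottom]

# Items 52 and 54 fall inside the box; 64 and 66 are left unselected.
icons = {"52": (120, 110), "54": (180, 110), "64": (400, 300), "66": (460, 300)}
print(selection_box((100, 100), (250, 200), icons))   # -> ['52', '54']
```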
The input system is extremely flexible: it allows the user to reach the text input speed of a QWERTY keyboard without being constrained in posture, and it enables high-speed text input in situations where a QWERTY keyboard cannot be used (for example on a mobile phone).
The specific embodiments described above explain the purpose, technical solution and advantages of the present invention in further detail. It should be understood that they are only specific embodiments of the present invention and do not limit it. The invention extends to any novel feature, or any novel combination of features, disclosed in this specification, and to any novel method or process step, or any novel combination of such steps.

Claims (18)

1. A smart wearable input apparatus providing a text and pointer input system for a computing device, characterised in that the system comprises:
a wearable sensor device capable of measuring movement of at least one hand of the user, the sensor being capable of measuring the movement of at least one individual finger as well as the movement of the whole hand to which the finger belongs with respect to its surroundings, the sensor device transmitting the movement data to a controller;
a display screen capable of receiving display information from the controller; and
a controller adapted to display an image of a virtual keyboard and an image of a virtual hand on the display screen, the virtual hand moving on the display screen according to input information provided by the sensor device, the controller being further adapted to identify a key press when at least one finger of the user is moved while the corresponding finger of the virtual hand is located over a key of the virtual keyboard, and to pass the key press to the computer as text input;
wherein, when the virtual hand is not over the virtual keyboard, the controller can also simulate mouse or other pointer input.
2. The smart wearable input apparatus of claim 1, characterised in that two wearable sensor devices can measure the movement of both hands.
3. The smart wearable input apparatus of claim 2, characterised in that two virtual hands are displayed on the screen, each virtual hand moving according to the input information provided by one of the two sensor devices.
4. The smart wearable input apparatus of any of claims 1 to 3, characterised in that the ratio between the displacement measured by each sensor and the displacement of the corresponding virtual hand is adjustable.
5. The smart wearable input apparatus of any of claims 1 to 4, characterised in that the sensor device can measure the movement of each finger of each hand individually.
6. The smart wearable input apparatus of any of claims 1 to 5, characterised in that the controller can recognise gestures other than key-press gestures and activate a corresponding function according to the recognised gesture.
7. The smart wearable input apparatus of any of claims 1 to 6, characterised in that the size of the virtual keyboard on the display screen is adjustable.
8. The smart wearable input apparatus of any of claims 1 to 7, characterised in that the position of the virtual keyboard on the display screen is adjustable.
9. The smart wearable input apparatus of any of claims 1 to 8, characterised in that the size of each virtual hand is adjustable.
10. The smart wearable input apparatus of claim 9, characterised in that the size of each virtual hand is adjusted automatically when the user adjusts the size of the virtual keyboard.
11. The smart wearable input apparatus of any of claims 1 to 10, characterised in that the wearable sensor includes an output means for providing feedback when a key is pressed.
12. The smart wearable input apparatus of claim 11, characterised in that the output means of the wearable sensor device comprises a force-feedback output device.
13. The smart wearable input apparatus of claim 12, characterised in that the force-feedback output device comprises a vibrator.
14. The smart wearable input apparatus of any of claims 1 to 13, characterised in that the sensor device takes the form of a bracelet.
15. The smart wearable input apparatus of any of claims 1 to 14, characterised in that the sensor device takes the form of a glove.
16. The smart wearable input apparatus of any of claims 1 to 15, characterised in that the user can scroll pages on the screen by means of the virtual hand.
17. The smart wearable input apparatus of any of claims 1 to 16, characterised in that the user can make a selection among a series of items on the screen by means of the virtual hand.
18. A smart wearable input apparatus substantially as described herein.
CN201510988545.2A 2015-01-21 2015-12-22 Smart wearable input apparatus Pending CN106445094A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/CN2016/070067 WO2016115976A1 (en) 2015-01-21 2016-01-04 Smart wearable input apparatus

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
GB1501018.4A GB2534386A (en) 2015-01-21 2015-01-21 Smart wearable input apparatus
GB1501018.4 2015-01-21

Publications (1)

Publication Number Publication Date
CN106445094A (en) 2017-02-22

Family

ID=52630911

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510988545.2A Pending CN106445094A (en) 2015-01-21 2015-12-22 Smart wearable input apparatus

Country Status (2)

Country Link
CN (1) CN106445094A (en)
GB (1) GB2534386A (en)


Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10635161B2 (en) * 2015-08-04 2020-04-28 Google Llc Context sensitive hand collisions in virtual reality
CN107024984A (en) * 2017-01-12 2017-08-08 瑞声科技(新加坡)有限公司 The feedback response method and terminal of a kind of button
EP3542252B1 (en) 2017-08-10 2023-08-02 Google LLC Context-sensitive hand interaction


Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6600480B2 (en) * 1998-12-31 2003-07-29 Anthony James Francis Natoli Virtual reality keyboard system and method
KR100499391B1 (en) * 2001-03-08 2005-07-07 은탁 Virtual input device sensed finger motion and method thereof
US20130265300A1 (en) * 2011-07-03 2013-10-10 Neorai Vardi Computer device in form of wearable glasses and user interface thereof

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1244003A2 (en) * 2001-03-09 2002-09-25 Samsung Electronics Co., Ltd. Information input system using bio feedback and method thereof
TW200535676A (en) * 2003-11-25 2005-11-01 Kenji Nishi Information input unit, storing unit, information input device, and information processing device
CN102063183A (en) * 2011-02-12 2011-05-18 深圳市亿思达显示科技有限公司 Virtual input device of grove type
US20130328769A1 (en) * 2011-02-23 2013-12-12 Lg Innotek Co., Ltd. Apparatus and method for inputting command using gesture
CN102736726A (en) * 2011-04-11 2012-10-17 曾亚东 Stealth technology for keyboard and mouse

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
王培俊 (Wang Peijun) et al.: 《虚拟设计系统》 (Virtual Design System), 31 March 2012, 西南交通大学出版社 (Southwest Jiaotong University Press) *

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107562205A (en) * 2017-09-15 2018-01-09 上海展扬通信技术有限公司 A kind of projected keyboard of intelligent terminal and the operating method of the projected keyboard
CN109683667A (en) * 2018-12-25 2019-04-26 上海萃钛智能科技有限公司 A kind of Wearing-on-head type computer and its data inputting method
CN109782999A (en) * 2019-01-30 2019-05-21 上海摩软通讯技术有限公司 A kind of input method, input equipment and a kind of computer-readable medium
CN109658758A (en) * 2019-02-18 2019-04-19 西安科技大学 A kind of computer accounting teaching simulation System
CN111831110A (en) * 2019-04-15 2020-10-27 苹果公司 Keyboard operation of head-mounted device
CN110728828A (en) * 2019-11-11 2020-01-24 中国地质大学(武汉) Office sitting posture correction instrument and use method thereof
CN113499219A (en) * 2021-07-05 2021-10-15 西安交通大学 Multi-sense organ stimulation hand function rehabilitation system and method based on virtual reality game

Also Published As

Publication number Publication date
GB2534386A (en) 2016-07-27
GB201501018D0 (en) 2015-03-04

Similar Documents

Publication Publication Date Title
CN106445094A (en) Smart wearable input apparatus
Wobbrock et al. The performance of hand postures in front-and back-of-device interaction for mobile computing
Harrison et al. Using shear as a supplemental two-dimensional input channel for rich touchscreen interaction
Wagner et al. BiTouch and BiPad: designing bimanual interaction for hand-held tablets
CN101315593B (en) Touch control type mobile operation device and contact-control method used therein
US20100164897A1 (en) Virtual keypad systems and methods
US20120127070A1 (en) Control signal input device and method using posture recognition
US20150242120A1 (en) Data input peripherals and methods
US20130275907A1 (en) Virtual keyboard
US20040021676A1 (en) Method and apparatus of view window scrolling
KR20090096528A (en) Human computer interaction device, electronic device and human computer interaction method
Bergström et al. Human--Computer interaction on the skin
WO2017019390A1 (en) Universal keyboard
US20140191958A1 (en) Cursor control method for a touch screen
KR20160097410A (en) Method of providing touchless input interface based on gesture recognition and the apparatus applied thereto
WO2016115976A1 (en) Smart wearable input apparatus
Ahn et al. Evaluation of edge-based interaction on a square smartwatch
US20140191992A1 (en) Touch input method, electronic device, system, and readable recording medium by using virtual keys
CN112041795A (en) Wearable data input device and operation method
Huang et al. Differences in muscle activity, kinematics, user performance, and subjective assessment between touchscreen and mid-air interactions on a tablet
CN104484073A (en) Hand touch interaction system
Lepouras Comparing methods for numerical input in immersive virtual environments
CN105242795A (en) Method for inputting English letters by azimuth gesture
Liu et al. Tilt-scrolling: A comparative study of scrolling techniques for mobile devices
Wolf et al. Biomechanics of front and back-of-tablet pointing with grasping hands

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20170222