EP1999547A2 - System and method for inputting data into a computing system - Google Patents

System and method for inputting data into a computing system

Info

Publication number
EP1999547A2
Authority
EP
European Patent Office
Prior art keywords
user
virtual
keyboard
hand
gesture
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP07706117A
Other languages
German (de)
English (en)
Other versions
EP1999547A4 (fr)
Inventor
Harel Cohen
Giora Bar-Sakai
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
FTK Technologies Ltd
Original Assignee
FTK Technologies Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by FTK Technologies Ltd filed Critical FTK Technologies Ltd
Publication of EP1999547A2
Publication of EP1999547A4
Legal status: Withdrawn (current)

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011: Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017: Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/018: Input/output arrangements for oriental characters
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/02: Input arrangements using manually operated switches, e.g. using keyboards or dials
    • G06F3/023: Arrangements for converting discrete items of information into a coded form, e.g. arrangements for interpreting keyboard generated codes as alphanumeric codes, operand codes or instruction codes
    • G06F3/0233: Character input methods
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/02: Input arrangements using manually operated switches, e.g. using keyboards or dials
    • G06F3/023: Arrangements for converting discrete items of information into a coded form, e.g. arrangements for interpreting keyboard generated codes as alphanumeric codes, operand codes or instruction codes
    • G06F3/0233: Character input methods
    • G06F3/0237: Character input methods using prediction or retrieval techniques
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03: Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041: Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/042: Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
    • G06F3/0425: Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means using a single imaging device like a video camera for tracking the absolute position of a single or a plurality of objects with respect to an imaged reference surface, e.g. video camera imaging a display or a projection screen, a table or a wall surface, on which a computer generated image is displayed or projected
    • G06F3/0426: Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means using a single imaging device like a video camera for tracking the absolute position of a single or a plurality of objects with respect to an imaged reference surface, e.g. video camera imaging a display or a projection screen, a table or a wall surface, on which a computer generated image is displayed or projected, tracking fingers with respect to a virtual keyboard projected or printed on the surface
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04886: Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0489: Interaction techniques based on graphical user interfaces [GUI] using dedicated keyboard keys or combinations thereof
    • G06F3/04895: Guidance during keyboard input operation, e.g. prompting

Definitions

  • the present disclosure generally relates to the field of data inputting devices. More specifically, the present disclosure relates to a method for facilitating multilingual data inputting and to a system utilizing the multilingual data inputting method.
  • Keyboards are designed for the input of data such as text and other types of characters (collectively referred to hereinafter as "symbols" or "key labels"), and also to control the operation of the computer.
  • Keyboards are an arrangement of rectangular or near-rectangular buttons, or "keys".
  • Keyboards typically have one or more symbols engraved, printed or otherwise marked on each key; in most cases, each press of a key corresponds to a single symbol being entered into the computer and, in many cases, displayed on the computer's display screen. However, producing some symbols requires pressing and holding several keys simultaneously, or in sequence.
  • Standard keyboards suffer from a number of disadvantages and limitations.
  • Standard keyboards normally contain function keys, for example with symbols F1 to F12, which hide functions that are defined in separate instructions. Often, the user has to learn and memorize these hidden functions or retrieve their meaning from a lookup table, a "Help" directory, or from other sources.
  • Such keyboards are limited in the number of keys, and therefore in the number of key functions. Normally, a nonprofessional typist has to follow the typing action by frequently shifting his gaze between the keyboard placed on a desk and the PC monitor screen, which is normally placed in front of him and higher up on the desk.
  • US Patent No. 6,611,253, by the same inventor as the present disclosure, describes a method and system for a virtual input environment, and creation of an input unit with a changeable keys display.
  • US 6,611,253 does not teach using hand gestures to control the layout of the virtual keyboard, nor does it teach that the appearance of the virtual hands may depend on the virtual keyboard layout being used.
  • US 6,611,253 does not teach evaluating previous commands to predict a current, or future, command or user request.
  • For some languages, for example Russian and Hebrew, hardware keyboards generally have the second language's alphabet symbols etched into the keyboard keys together with the English alphabet. In some countries where the spoken languages have large alphabets and/or many characters (in excess of 50 characters, for example), the keyboards generally include only English letters, as all the letters of the other language cannot be displayed on the physical keyboard. This situation often creates a huge problem for users performing data entry tasks in such languages.
  • Indian scripts typically have 12-15 vowels, 35-40 consonants and a few diacritical marks. Besides this, for each vowel there is a corresponding modifier symbol and for each consonant, there is a corresponding pure consonant form (called half-letter). This makes the total set of symbols required to enter such languages larger than what a normal keyboard could accommodate.
  • Different Indian-language word processors are distributed with hardcopy "maps" indicating the Indian letter hiding behind each English key. Approximately 50 such hardcopy maps are available among different distributors of Indian word processors; however, hardware manufacturers do not generally supply keyboards with Indian-language layouts. As a result, over 95 percent of the Indian population is generally deprived of the benefits of English-based Information Technology.
  • the system disclosed herein may include a controller adapted to set and display a map of an input key(s), or entire (or partial) layout of a keyboard, based on a signal associated with an acquired image; and an image acquisition device functionally coupled to the controller and adapted to provide the controller with a signal relating to, or associated with, the acquired image.
  • the system may capture and identify, recognize or interpret, one or more gestures, for example, gestures generated or performed by a user's hands, fingers, or other body parts, and to execute commands in accordance with the gestures.
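As a rough illustration of the gesture-to-command flow just described, here is a minimal sketch (not part of the patent; the gesture numbers, command bindings and function names are invented for illustration):

```python
# Minimal sketch of gesture-to-command dispatch, assuming an upstream
# recognizer that turns camera frames into gesture numbers.
# All names and bindings here are illustrative, not from the patent.
from typing import Callable, Dict

def switch_layout(language: str) -> None:
    print(f"virtual keyboard layout switched to {language}")

def toggle_mouse_mode() -> None:
    print("mouse-type navigation enabled")

# Each recognized gesture number is bound to a controller command.
GESTURE_COMMANDS: Dict[int, Callable[[], None]] = {
    3: lambda: switch_layout("Hindi"),
    5: lambda: switch_layout("English"),
    7: toggle_mouse_mode,
}

def execute_gesture(gesture_number: int) -> None:
    command = GESTURE_COMMANDS.get(gesture_number)
    if command is not None:
        command()  # unknown gestures are silently ignored

execute_gesture(3)  # -> virtual keyboard layout switched to Hindi
```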
  • the system may include a monitor, or display screen, to display a virtual keyboard and virtual hands that simulate the position(s) and/or movement(s) of the user's physical hands, optionally in real time.
  • the meaning of key(s), or key label(s), on the virtual keyboard may be dynamically updated (changed) according to user's commands.
  • the entire, or only part of the, layout of the virtual keyboard may be dynamically updated (changed) according to user's commands.
  • the virtual keyboard may be dynamically updated (changed) according to the user's hand location(s) and/or movement(s).
  • the system may include an evaluation and predictability software application (that is, the controller may be adapted) to evaluate, predict or otherwise determine, based on a user's previous command, the anticipated key(s) subsequently required on the virtual keyboard and/or the anticipated layout subsequently required on the virtual keyboard.
  • hand movements or other gestures may implement mouse-type navigation.
  • a method for inputting data into a computing system.
  • the method may include acquiring image(s) of parts of a user body and of a physical keyboard, and setting and displaying a mapping of a key(s) based on the acquired image(s).
  • the method may further include processing and interpreting signals relating to, or associated with, acquired image(s), to enable inputting of selected commands and/or symbols according to the signals.
  • the method may further include using a keyboard identification function to identify keys of a physical keyboard placed in the field of view (FOV) of the image acquisition device; processing the images of at least one user hand to determine the hand's position and/or movement relative to the physical keyboard; and displaying the position(s) and/or movement(s) of at least one hand on a virtual keyboard on a corresponding display screen, for example on a computer display, or on a computer display object.
  • the method may include dynamically updating key labels on the virtual keyboard in response to the images processed, and/or dynamically updating the entire or portions of the keyboard layout of the virtual keyboard in response to the images processed.
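A minimal sketch of such dynamic relabeling, assuming the controller keeps one scancode-to-symbol map per layout and swaps or patches maps on command (the layouts, scancodes and class names below are invented):

```python
# Sketch of dynamic key relabeling: whole-layout swaps and single-key
# updates over a scancode-to-symbol map. Contents are placeholders.
LAYOUTS = {
    "english": {30: "a", 31: "s", 32: "d"},
    "hebrew":  {30: "ש", 31: "ד", 32: "ג"},
}

class VirtualKeyboard:
    def __init__(self, layout: str = "english") -> None:
        self.keymap = dict(LAYOUTS[layout])

    def set_layout(self, layout: str) -> None:
        # Replace the entire map; a partial update would merge instead.
        self.keymap = dict(LAYOUTS[layout])

    def relabel_key(self, scancode: int, symbol: str) -> None:
        # Change the meaning of a single key only.
        self.keymap[scancode] = symbol

    def interpret(self, scancode: int) -> str:
        return self.keymap.get(scancode, "?")

vk = VirtualKeyboard()
vk.set_layout("hebrew")
print(vk.interpret(30))  # -> ש
```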
  • the method may include hand movements that are intended to implement mouse-type navigation using a virtual mouse, and/or other body movements that may be interpreted as user's input commands and/or data.
  • other body movements may refer, for example, to hand movements, head movements, eye movements, mouth movements or other types of movements that may indicate user commands and/or data entry.
  • Figs. 1A and 1B graphically exemplify a virtual keyboard according to some embodiments of the present disclosure;
  • Fig. 1C is a graphical example of a keyboard with a limited number of Hindi characters, which may be utilized as a physical and/or virtual keyboard, according to some embodiments;
  • Fig. 1D shows an exemplary set of graphical views of various finger-based signals or gestures which may be used to indicate commands and/or data for input, according to some embodiments;
  • Figs. 1E and 1F show examples of mapping an input key based on a signal associated with an acquired image, and examples of maps of an input key based on such a signal, according to some embodiments of the present disclosure;
  • FIG. 2A shows an exemplary flowchart for operating a data entry system according to some embodiments of the present disclosure
  • FIG. 2B schematically illustrates a general layout and functionality of a data entry system according to some embodiments of the present disclosure.
  • FIG. 3 schematically illustrates a general layout and functionality of a data entry system according to other embodiments of the present disclosure.
  • "gesture", as used herein, may include movement(s) and/or signal(s) and/or indication(s) and/or sign(s) and/or instruction(s) and/or request(s), and the like, made by body part(s) of a person operating a keyboard.
  • by "command" is meant herein using a gesture, or a series or combination of gestures, to instruct, request or order a computer to change the meaning or interpretation (assigning, or reassigning, a symbol) of selected keys, or to change the meaning or interpretation of the entire keyboard layout, in accordance with the gesture or the series or combination of gestures.
  • Embodiments as described herein may alleviate human-computer interaction (HCI) problems associated with, in particular but not only, languages that have many symbols. Such embodiments enhance the speed of data inputting, offer a complete solution to language data entry tasks, substantially reduce the number of typing errors and improve the usability of language word processors by presenting a user-friendly data input environment.
  • Data inputting system 100 may include at least one image acquisition, or capturing, device, such as image acquisition device 110, which may be, for example, a digital camera, video camera, PC camera, Webcam, and so on, which may be located on a computer display monitor 120, for example.
  • image acquisition device 110 may be located at different locations, provided that position(s), location(s), movement(s) and gesture(s) of the user's hand(s), or other parts of the user's body for that matter, are clearly visible to (that is, appear in the FOV of) image acquisition device 110.
  • Data inputting system 100 may further include a controller (not shown) associated with, or functionally coupled to, image acquisition device 110 and adapted to set a map of a key(s) or a map of the entire keyboard layout, based on a signal that is generated and outputted by image acquisition device 110 to the controller, which signal represents an image(s) in the FOV of image acquisition device 110 relating to, and including, a gesture(s) or movement(s).
  • the mapping of a key(s) may include changing the symbolic meaning assigned to the key(s) in accordance with movements or gestures made by, or associated with, a user such as the user whose (real) hands only are shown, at 131 and 132, resting on physical keyboard 130.
  • the controller may be an integral part of, or embedded or incorporated or affiliated into a computer (PC, laptop and the like) that gets input signals from a keyboard such as keyboard 130 and operates a display screen such as display screen 120.
  • the user may move his hands 131 and/or 132 from one position to another, in respect of, or relative to, physical keyboard 130, while signals relating to images of hands 131 and 132, which are acquired by image acquisition device 110, are constantly, or intermittently, forwarded to data inputting system 100 for processing and interpretation.
  • Data inputting system 100 may process and interpret the signal relating to the acquired images to identify gesture(s) and/or movement(s) made by the user by his hand(s) or other body part(s), and execute commands in accordance, or in connection, with the gesture(s) and/or movement(s).
  • Physical keyboard 130 may be a standard keyboard (with symbols marked thereon), blank keyboard (a keyboard with no markings on the keys), paper keyboard (a drawing of a keyboard with any number of keys, for example), touch pad, keypad, imaginary keyboard (flat naked surfaces such as tables and boards), and so on.
  • Data inputting system 100 may also utilize a Word application(s) suitable for processing language(s) being used (for example English, Hindi and German).
  • the controller of data inputting system 100 may utilize digital signal processing ("DSP") techniques, for processing images captured by image acquisition device 110, and simulation techniques for displaying a corresponding virtual keyboard, such as virtual keyboard 140, on a computer screen such as computer screen 120.
  • the number, size and spacing of the keys on virtual keyboard 140 may substantially resemble those of physical keyboard 130 to facilitate the user's orientation. According to other aspects, the number, size or spacing of the keys on virtual keyboard 140 may differ from those of physical keyboard 130.
  • the controller of data inputting system 100 may cause the symbol(s), or meaning, assigned to a key(s), and/or the symbols or meaning assigned to the entire layout of virtual keyboard 140, to change according to a corresponding user's gesture or movement, which may be identified, recognized or interpreted by the controller of data inputting system 100 from the acquired, or captured image(s).
  • the controller of data inputting system 100 may utilize simulation techniques for creating and handling virtual hand(s) and cause virtual hand(s) to appear and move on the display screen in accordance with the user's (real, physical) hand(s) position, location and movement.
  • virtual hands 121 and 122 are shown in Fig. 1A reflecting the user's hands 131 and 132, respectively.
  • Virtual keyboard 140 and/or virtual hands 121 and/or 122 may be likewise or differently scaled to facilitate ease of data inputting.
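The mirroring and scaling described above can be pictured as a simple affine mapping from the physical keyboard's rectangle in the camera image to the virtual keyboard's rectangle on screen. The helper below is a hypothetical sketch; a real system would presumably fit this mapping during calibration:

```python
# Sketch: map a tracked fingertip from camera coordinates onto the
# on-screen virtual keyboard. Rectangles are invented placeholders.
def make_mapper(cam_rect, screen_rect):
    """Each rect is (x, y, width, height): the physical keyboard as seen
    by the camera, and the virtual keyboard as drawn on screen."""
    cx, cy, cw, ch = cam_rect
    sx, sy, sw, sh = screen_rect

    def to_screen(px, py):
        # Normalize within the physical keyboard, then scale to the
        # virtual keyboard; the two may be scaled differently.
        u = (px - cx) / cw
        v = (py - cy) / ch
        return (sx + u * sw, sy + v * sh)

    return to_screen

to_screen = make_mapper(cam_rect=(80, 200, 480, 160),
                        screen_rect=(100, 600, 800, 260))
print(to_screen(320, 280))  # -> (500.0, 730.0): draw the fingertip here
```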
  • a user may place his hands, shown as 131 and 132, and make a gesture, or a series or combination of gestures, in the FOV of image acquisition device 110, which is/are associated with the requested language.
  • the gesture, or series or combination of gestures may then be recognized or interpreted by the controller of data inputting system 100 as being associated with the requested language.
  • the controller of data inputting system 100 may assign symbol(s) constituting the requested language to selected keys on the virtual keyboard 140 and display the layout of virtual keyboard 140 with the currently requested assigned symbol(s).
  • at this stage, the controller of data inputting system 100 has set a map of the keyboard layout which corresponds to the requested language. Once the requested language has been set by the controller responsive to the user's command, the user may enter data into data inputting system 100 by observing keys on virtual keyboard 140 and moving his hand (131 or 132), or certain finger(s) thereof, across physical keyboard 130.
  • the user may move his hand, or certain finger(s) thereof, until the respective virtual hand (121 or 122), or corresponding virtual finger(s) thereof, reaches the vicinity of the next key on virtual keyboard 140 to be depressed and a finger of the virtual hand 121 or 122 overlaps that key (a simple overlap test is sketched below). Then, the user may depress the key on physical keyboard 130 underneath the finger corresponding to, or associated with, the virtual finger overlapping the requested key on virtual keyboard 140. The above-described procedure may be repeated as many times as required for inputting additional symbols. Should the user wish to change to, or to set, a different language, the user may pose a gesture, or series or combination of gestures, that corresponds to the different language.
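The overlap test mentioned above might look like the following sketch, where the key geometry and labels are invented placeholders:

```python
# Sketch of the overlap test: which virtual key, if any, lies under the
# virtual fingertip. Pressing is then safe on the physical key below.
KEYS = {  # key label -> (x, y, width, height) on the virtual keyboard
    "क": (0, 0, 40, 40),
    "ख": (40, 0, 40, 40),
}

def key_under_finger(fx, fy):
    for label, (x, y, w, h) in KEYS.items():
        if x <= fx < x + w and y <= fy < y + h:
            return label
    return None  # fingertip is between keys

print(key_under_finger(50, 10))  # -> ख
```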
  • every time the controller of data inputting system 100 is requested to set a different map of keys, that is, to set a different language, the controller may enable the corresponding WORD application/processor. For example, if the controller is requested to change from French to English, it may disable the French WORD application/processor and enable the English one.
  • Physical keyboard 130 is functionally coupled to the controller of data inputting system 100, or to a computer within which the controller of data inputting system 100 resides, for forwarding to the controller signals representative of default symbols or functions associated with the keys in physical keyboard 130. Nevertheless, the controller of data inputting system 100 is adapted, or configured, to interpret signals forwarded to it from physical keyboard 130 according to a current mapping setting.
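One way to picture the routing described in the two preceding paragraphs is the sketch below, in which the controller interprets raw key signals through the current mapping and forwards symbols to the currently enabled word application (the classes and layout contents are assumptions, not the patent's design):

```python
# Sketch: key signals are interpreted per the current mapping and routed
# to the word application enabled for the current language.
class WordApp:
    def __init__(self, name):
        self.name, self.buffer = name, []

    def type_symbol(self, symbol):
        self.buffer.append(symbol)

class Controller:
    def __init__(self, layouts):
        self.layouts = layouts
        self.active = None
        self.apps = {name: WordApp(name) for name in layouts}

    def set_language(self, name):
        # Implicitly disables the previous app, enables the requested one.
        self.active = name

    def on_key(self, scancode):
        symbol = self.layouts[self.active].get(scancode)
        if symbol is not None:
            self.apps[self.active].type_symbol(symbol)

ctrl = Controller({"french": {30: "à"}, "english": {30: "a"}})
ctrl.set_language("french"); ctrl.on_key(30)
ctrl.set_language("english"); ctrl.on_key(30)
print(ctrl.apps["french"].buffer, ctrl.apps["english"].buffer)  # ['à'] ['a']
```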
  • Data inputting system 100 has several advantages over prior art solutions. For example, a user inputting data into the system does not have to shift gaze, back and forth, between the physical keyboard (keyboard 130, for example) and the screen displaying the resulting typing (display screen 120, for example). Instead, the user need only gaze at the virtual keyboard (virtual keyboard 140, for example), and see virtual hands (virtual hands 121 and 122, for example) positioned and moving in correlation with the position(s) and movement(s) of his (real) hands (hands 131 and 132, for example).
  • a symbol or function may be assigned to a key depending on the language, mode or function that is requested by a user (by performing corresponding movement or gesture), so that a given key, when depressed by the user after it has been assigned the new language, mode or function, will be interpreted by the controller of the data inputting system 100 in a different way.
  • the controller of data inputting system 100 may change the appearance of virtual layouts (such as virtual layout 140) responsive to commands issued, or posed, by the user.
  • the controller may change the keyboard architecture or arrangement of keys, for example by changing the number of keys, size, spacing and/or placing of keys on the virtual keyboard, depending on a desired application(s).
  • a layout of a virtual keyboard may be changed according to a real time simulation of the user's hands and the positioning and movement of the user hands over a physical keyboard, whether the keyboard is real (with actual labels marked on respective keys), blank or a paper keyboard.
  • Another advantage of data inputting system 100 is that the same physical keyboard (for example physical keyboard 130) may be used to enter as many sets of symbols (each of which belonging to a different language) as the number of available WORD applications/processors.
  • the controller of data inputting system 100 may locate, at any given moment and in real time, the position and/or movement of the user's hands and fingers, and mimic them by displaying virtual fingers in the appropriate position over the keys of virtual keyboard 140. This allows the user to view his/her hand positioning and movements on monitor 120, thereby giving the user confidence in his/her finger placements above any key on the physical keyboard 130, regardless of the selected language, at any given moment, before pressing down the key and without having to look down at the physical keyboard (keyboard 130, for example).
  • the controller may cause the virtual hands (hands 121 and 122, for example) to mimic the movement and shape of the finger in respect of the virtual keyboard.
  • data inputting system 100 may enable processing images of one or two hands, as well as other body movements.
  • data inputting system 100 may capture and process head movements, eye movements, mouth movements or other movements to indicate user commands, signals and/or data entry.
  • an additional image acquisition device such as image acquisition device 110 may be used in cases where different parts of a user's body cannot be placed in the FOV of a single image acquisition device. In such cases, each image acquisition device may be spatially located to acquire images associated with different body parts.
  • Virtual keyboard 150, which is shown displayed on computer screen 155, can be adjusted, adapted or modified by the controller of the data inputting system to be as large or small as required or desired. The controller can also change the location of virtual keyboard 150 relative to screen 155.
  • Virtual keyboard 150 is shown displaying a current Indian language keyboard setup, or layout, as may be defined by the language/script set (out of 50 options, for example) being employed, or by a macro defined for a function key or for the configured keyboard.
  • when a physical finger is moved from one key to another on a physical keyboard (such as physical keyboard 130 of Fig. 1A), a corresponding graphical change may be made in respect of virtual keyboard 150, which results in a movement of the hand(s) (shown as 160 in Fig. 1B) from one location to another on virtual keyboard 150 that mimics, or reflects, the movement of the user's hands to the appropriate (desired) physical key.
  • the controller of the data inputting system may change the appearance of user's hands 160 according to a direct command issued by the user, or according to the result of such a command.
  • a data inputting system, such as data inputting system 100 of Fig. 1A, may allow a user to change the labeling and/or layout of a virtual keyboard such as virtual keyboard 150. If a user wants to write a document using an Indian language (Hindi with Devanagari script in the case represented in Fig. 1B), the user may pose a gesture, or a series or combination of gestures, associated with that language, as described above.
  • a user may change keyboard modes or functions; for example, s/he may change between languages, characters, letters, graphics of the keys on the virtual keyboard and so on, by indicating suitable gestures with his/her hands/fingers. Additionally, the appearance and/or degree of transparency of the virtual hand(s) may change according to the actual keyboard being used. For example, virtual hand 172 is shown less transparent than virtual hand 173, as they each relate to a different keyboard layout (to keyboards 170 and 171, respectively). Virtual keyboard 175 is shown having only six selected symbols (collectively designated 174). Virtual keyboard 175 may be shown as semi-transparent.
  • Fig. 1D depicts examples of signals, or gestures, that a user may pose to an image acquisition device, such as image acquisition device 110, to command a data inputting system, such as data inputting system 100, to change languages or modes, enter data, change functions and so on.
  • the user may use an object and/or his/her left hand to create a selected signal/gesture that may be captured by the image acquisition device, causing virtual keyboard keys to be desirably mapped.
  • Fig. 1D depicts ten exemplary hand gestures, each of which is assigned a unique hand gesture number. Each hand gesture number may be associated with a specific command or action to be executed or taken by the controller of the data inputting system.
  • hand gesture number 5, shown at 182 and associated with hand gesture description 181, may indicate to, command or signal the controller of the data inputting system to change the layout, or mapping, of a virtual keyboard (for example the layout of virtual keyboard 150 of Fig. 1B) from one language to another. Thereafter, the user may use the changed virtual layout to enter (type) characters or symbols of the other language by moving his hand(s)/fingers over a physical keyboard, so as to create corresponding virtual hand(s) that move in correlation with the changed virtual keyboard.
  • identifying signals, commands, instructions and the like by a data inputting system such as data inputting system 100 may be implemented by first identifying or recognizing the hand gesture or signal (via an image acquiring device such as image acquiring device 110 of Fig. 1A) and then interpreting the hand gesture into the corresponding hand gesture number, which is used as described hereinbefore.
  • Data inputting system 100 of Fig. IA may be instructed by a user (by displaying corresponding gestures or movements to image acquiring device 110) to receive signals manually, automatically or following a selected command, for example, after depressing the "Reset" button on the physical keyboard.
  • Any number and type of hand gestures, signals and/or movements and/or other suitable signals and/or gestures and/or movements made by body parts and/or objects and so on, may be used as commands to the controller of the data inputting system.
  • left and/or right hand positions and/or movements may be captured, as may facial movements, head movements, finger movements, shoulder movements or other suitable movements which a user may use to indicate a command.
  • the data inputting system may allow a user to change, in a minimal number of keystrokes or other actions, the layout, mode, functions, and so on, of keys in a virtual keyboard such as virtual keyboard 150 of Fig. 1B.
  • the user may make a gesture that is associated with a chosen layout, and then subsequently type one or more keys in order to enter the required data.
  • a data entry which may normally require several key entries in order to change layouts, keys and so on, and arrive at the required layout, may be done by applying a combination of a gesture and typing of the selected key.
  • the user may enter a command using a gesture or signal, for example, to change the keyboard key labels and/or layout on the virtual keyboard.
  • This change in the virtual keyboard may cause the required characters to be displayed on the virtual keyboard, such that a minimal number of keystrokes are required to enter selected keys. Therefore, only one keystroke may be required to enter any selected character from a set of characters of a language with many distinct characters.
  • Other actions and/or combinations of actions may be implemented as well.
  • the data inputting system may include an evaluation and predictability application for helping the controller of the data inputting system determine anticipated keys and/or keyboard layout (map) that may be subsequently required or desired by the user.
  • the predictability may be based on the evaluation of user's previous command(s), for example, commands previously issued by using hand gesture(s), movement(s), mouse movement(s), key entry and so on.
  • the currently used language Word application may interpret a combination of two or more specific keys to be equivalent to entry of selected characters.
  • the predictability application may, for example after hitting the first of the combination keys, automatically update other relevant keys to complete the possible functions resulting from combinations of the first key with various other keys.
  • thus, if the user enters "A" as the first key of such a combination, the virtual keyboard may be immediately changed to display all the relevant commands or keys that may be entered in combination with "A" (see the sketch below). In this way, the user does not need to remember or use a physical table to discover key combinations; rather, the relevant combinations may be dynamically updated on the virtual keyboard in real time.
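A minimal sketch of this combination-key prediction, with an invented combination table:

```python
# Sketch: after the first key of a known combination, compute the set of
# keys that can legally follow and display them on the virtual keyboard.
COMBINATIONS = {  # (first key, second key) -> resulting symbol (invented)
    ("A", "B"): "symbol-1",
    ("A", "C"): "symbol-2",
    ("X", "Y"): "symbol-3",
}

def predicted_next_keys(first_key):
    return {second for (first, second) in COMBINATIONS if first == first_key}

print(predicted_next_keys("A"))  # -> {'B', 'C'} (order may vary)
```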
  • Table 190 depicts some Indian Languages symbols and respective English letters that are to be inputted to obtain the Indian Languages symbols.
  • Indian Languages characters may be represented, used or obtained by using a conventional method according to which a single English letter or sign, or a combination of two, three, four or five, has to be typed (entered, or keyed in).
  • character 193 is obtainable by entering the letter "s” (194)
  • character 195 is obtainable by entering a combination of letters "s/t/r” (196). Accordingly, five keystrokes are required to obtain character 195.
  • Table 191 depicts a way for obtaining the same Indian Languages characters (shown at 197) by entering only one English character (one-strike implementation) in combination with corresponding hand gestures.
  • character 193 is obtained by entering the character "s” (198), as before (194), and without using any hand gesture, because using one character (194 or 198) is simple enough.
  • to obtain character 195, by contrast, only one character may be entered (for example the character "s") in combination with a corresponding hand gesture 199 (Hand Gesture 3, in this example).
  • in Fig. 1F, several examples of mapping are schematically illustrated and described, which correspond to the Indian Languages characters shown in Fig. 1E.
  • Fig. 1F will be described in association with Fig. 1E.
  • the initial or default English character "S" is symbolically shown (at 184) assigned the Indian Languages character 183, since the character "S" was entered, according to this example, without any hand gesture ("Hand Gesture" equal "none", at 185 in both figures).
  • the initial or default English character "S" is symbolically shown (at 186) assigned the Indian Languages character 187, since the character "S" was entered, according to this example, with a hand gesture ("Hand Gesture" equal "3" in this example, at 188 in both figures).
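The one-strike scheme of Table 191 can be pictured as gesture-selected character banks, as in the sketch below (the banks and Devanagari strings are invented placeholders, not the actual table entries):

```python
# Sketch: a hand gesture selects a character bank, so any character in
# the bank is then a single keystroke away.
BANKS = {
    None: {"s": "स"},     # no gesture: the default mapping
    3:    {"s": "स्त्र"},  # Hand Gesture 3 remaps the same key
}

def enter(keystroke, gesture=None):
    return BANKS.get(gesture, {}).get(keystroke, keystroke)

print(enter("s"))             # -> स
print(enter("s", gesture=3))  # -> स्त्र
```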
  • referring to Fig. 2A, a series of operations or processes is schematically illustrated that may be implemented to operate the data entry system (a condensed code sketch of this flow follows the steps below).
  • a user may configure or initialize the data inputting system's software on his/her computer. Once the software is executed and functioning, a calibration screen may be shown, indicating that the system is beginning or has begun operations.
  • the user may place a real (physical) keyboard in the view of the camera.
  • the data inputting system may employ a keyboard identification function to identify the keyboard position and keys, for example, and may subsequently notify the user that the keyboard has been identified and that the data inputting system is ready to operate.
  • the user may place his/her hand(s) in the field of vision of the image acquiring device.
  • the data inputting system may capture and process the images of the hand(s), after which the data inputting system may notify the user that hand(s) identification has been completed.
  • the user may operate a word processing application, according to the actual language being used.
  • the data inputting system may display a virtual keyboard on the data inputting system's display, with a default set of keys, according to the selected word processing application.
  • the user may type commands into the real keyboard, while looking at his/her virtual hand(s) correlated movements on the virtual keyboard, to enter data in a selected language, as required.
  • the virtual keyboard may depict virtual fingers actively moving to depress selected keys, thereby indicating actual entry of commands.
  • the user may make a selected signal by using one or two hands (or other body parts).
  • the signal(s) may be selected from a set of pre-configured signals or actions.
  • the data inputting system may capture and process the hand signal, and enter the required user command(s) or data.
  • the keys and/or layout on the virtual keyboard may be changed in accordance with the user's command(s), for example by entering a function key, combination of keys, mouse action or command, combination of key entry(ies) and mouse action(s) and so on. Any combination of the above steps may be implemented. Further, other steps or series of steps may be used instead of and/or in addition to steps specified hereinbefore.
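Condensing the steps above, a hypothetical main loop might be structured as follows (every function is a stub standing in for a stage the text describes):

```python
# Condensed sketch of the Fig. 2A flow; all stage functions are stubs.
def calibrate(): print("calibration screen shown")
def identify_keyboard(frame): return "keyboard identified"
def identify_hands(frame): return "hands identified"
def next_frame(): return object()         # stand-in for a camera frame
def classify_gesture(frame): return None  # gesture number, or None
def remap_layout(gesture): pass           # update keys/layout per gesture
def render_virtual_scene(): pass          # draw virtual keyboard and hands

def run(max_frames=3):
    calibrate()
    frame = next_frame()
    print(identify_keyboard(frame))
    print(identify_hands(frame))
    for _ in range(max_frames):           # main capture-and-interpret loop
        frame = next_frame()
        gesture = classify_gesture(frame)
        if gesture is not None:
            remap_layout(gesture)
        render_virtual_scene()            # mirror the hands over the keys

run()
```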
  • the method may enable processing of one or two hands, as well as other body movements.
  • the method may capture and process head movements, eye movements, mouth movements or other movements to indicate user commands, signals and/or data entry.
  • image acquiring device 290 may capture the gesture made, or posed, by hand 292 of a user (not shown).
  • the captured gesture may be identified, for example, as gesture number 3 in Fig. 1D.
  • the virtual keyboard (291) may be changed to display layout number 3 (for example), which corresponds to gesture number 3.
  • a Word Processor may change to mode 3 of operation (change language).
  • the user may depress a key(s) on physical keyboard 293.
  • the controller (not shown) of data inputting system 201 may simulate and display the user's finger(s) (shown at 294) hitting a corresponding key(s) on virtual keyboard 291, while it substantially correlates position(s) and movement(s) of virtual hands 294 to position(s) and movement(s) of physical hands 292 of the user using the data inputting system 201.
  • Computer 286 is functionally coupled to physical keyboard 293, from which it may receive signals that represent depressed keys, and to display screen 295, to which it forwards, among other things, an image of the virtual hands and virtual keyboards.
  • a virtual hand may simulate, or mimic, mouse-like navigation; a user may enter data into a Computer (304) and/or operate graphical applications by using a virtual mouse.
  • Computer 304 includes the controller (not shown) of the data inputting system.
  • image acquisition device 301 may capture a user's movement or body part, for example a hand (shown at 302), which may move in order to implement mouse-type navigation.
  • the direction of movement of the hand, for example in an X-Y plane, is observed and forwarded to Computer 304.
  • the movement or gesture(s) of the user may be captured by image acquisition device 301.
  • captured image(s) of gesture(s) may be processed to enter user's command(s) and/or data.
  • the commands and/or data and so on may be entered into Computer 304, where they may be executed accordingly, for example, by navigating on display 303, changing modes and/or functions, entering specific commands and so on.
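A minimal sketch of deriving mouse-type navigation from tracked hand positions, assuming one hand centroid per frame and an invented gain factor:

```python
# Sketch: successive hand positions in the X-Y plane become relative
# cursor displacements, as a virtual mouse would report them.
def cursor_moves(positions, gain=2.0):
    """positions: sequence of (x, y) hand centroids, one per frame."""
    (px, py), moves = positions[0], []
    for x, y in positions[1:]:
        moves.append((gain * (x - px), gain * (y - py)))
        px, py = x, y
    return moves

print(cursor_moves([(100, 100), (104, 98), (110, 101)]))
# -> [(8.0, -4.0), (12.0, 6.0)]
```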
  • references herein to the computer's screen (display screen 295 of Fig. 2B or 303 of Fig. 3, for example), to the physical keyboard (physical keyboard 293 of Fig. 2B, for example) and to the image acquisition device (image acquisition device 290 of Fig. 2B or 301 of Fig. 3, for example) are exemplary.
  • Computer 286 of Fig. 2B or Computer 304 of Fig. 3 may be any suitable conventional computer provided that it includes, in addition to its normal hardware and software components, virtual reality enabling hardware and software applications required for analyzing acquired images to determine movements and gestures made, or posed, by a user, and for generating, and generally handling, a virtual keyboard and virtual hand(s).

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • User Interface Of Digital Computer (AREA)
  • Input From Keyboards Or The Like (AREA)
  • Image Analysis (AREA)
  • Position Input By Displaying (AREA)

Abstract

The invention relates to a system and method for enabling data entry into a computing system. The system may include controller functionality coupled to an image acquisition device and adapted to set a map of an input key, or of an entire keyboard layout, based on the acquired image(s) captured by the image acquisition device. The system may capture images of user movements and/or gestures within a selected field of view, and may process these images to identify and execute commands in accordance with the movements.
EP07706117A 2006-02-16 2007-02-08 System and method for inputting data into a computing system Withdrawn EP1999547A4 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
IN421DE2006 2006-02-16
PCT/IL2007/000174 WO2007093984A2 (fr) 2007-02-08 System and method for inputting data into a computing system

Publications (2)

Publication Number Publication Date
EP1999547A2 true EP1999547A2 (fr) 2008-12-10
EP1999547A4 EP1999547A4 (fr) 2011-10-12

Family

ID=38371891

Family Applications (1)

Application Number Title Priority Date Filing Date
EP07706117A EP1999547A4 (fr) System and method for inputting data into a computing system

Country Status (5)

Country Link
EP (1) EP1999547A4 (fr)
JP (1) JP2009527041A (fr)
KR (1) KR20080106265A (fr)
CN (1) CN101589425A (fr)
WO (1) WO2007093984A2 (fr)

Families Citing this family (41)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2201761B1 (fr) * 2007-09-24 2013-11-20 Qualcomm Incorporated Optimized interface for voice and video communications
FR2921634B1 (fr) * 2007-09-27 2010-03-19 Airbus System and method for accessing personal computing equipment on board an aircraft, and aircraft comprising such a system
KR101352994B1 (ko) * 2007-12-10 2014-01-21 삼성전자 주식회사 Apparatus for providing an adaptive on-screen keyboard and providing method thereof
US20100265182A1 (en) * 2009-04-20 2010-10-21 Microsoft Corporation Context-based state change for an adaptive input device
CN104808821A (zh) * 2009-05-26 2015-07-29 美国智能科技有限公司 Data entry method and apparatus
US9400559B2 (en) 2009-05-29 2016-07-26 Microsoft Technology Licensing, Llc Gesture shortcuts
EP2480951A4 (fr) * 2009-09-21 2014-04-30 Extreme Reality Ltd Methods, circuits, apparatus and systems for human machine interfacing with an electronic appliance
US8621380B2 (en) 2010-01-06 2013-12-31 Apple Inc. Apparatus and method for conditionally enabling or disabling soft buttons
CN102214009A (zh) * 2010-04-08 2011-10-12 深圳市闪联信息技术有限公司 Method and system for implementing keyboard input
CN102289283A (zh) * 2010-06-16 2011-12-21 微软公司 State change of an adaptive device
CN103221912A (zh) * 2010-10-05 2013-07-24 惠普发展公司,有限责任合伙企业 Entering a command
US8587547B2 (en) 2010-11-05 2013-11-19 Apple Inc. Device, method, and graphical user interface for manipulating soft keyboards
US9141285B2 (en) 2010-11-05 2015-09-22 Apple Inc. Device, method, and graphical user interface for manipulating soft keyboards
US9250798B2 (en) 2011-01-24 2016-02-02 Apple Inc. Device, method, and graphical user interface with a dynamic gesture disambiguation threshold
US8842082B2 (en) 2011-01-24 2014-09-23 Apple Inc. Device, method, and graphical user interface for navigating and annotating an electronic document
KR101772979B1 (ko) * 2011-04-06 2017-08-31 엘지전자 주식회사 Mobile terminal and control method thereof
WO2012144666A1 (fr) * 2011-04-19 2012-10-26 Lg Electronics Inc. Display device and method of controlling the same
EP2575007A1 (fr) * 2011-09-27 2013-04-03 Elo Touch Solutions, Inc. Scaling of gesture-based inputs
EP2575006B1 (fr) 2011-09-27 2018-06-13 Elo Touch Solutions, Inc. Touch and non-touch user interaction with a device
WO2013101206A1 (fr) * 2011-12-30 2013-07-04 Intel Corporation Interactive drawing recognition
US8850349B2 (en) * 2012-04-06 2014-09-30 Google Inc. Smart user-customized graphical keyboard
US10664657B2 (en) 2012-12-27 2020-05-26 Touchtype Limited System and method for inputting images or labels into electronic devices
GB201322037D0 (en) * 2013-12-12 2014-01-29 Touchtype Ltd System and method for inputting images/labels into electronic devices
GB201223450D0 (en) 2012-12-27 2013-02-13 Touchtype Ltd Search and corresponding method
KR102040288B1 (ko) * 2013-02-27 2019-11-04 삼성전자주식회사 Display apparatus
KR101489069B1 (ko) * 2013-05-30 2015-02-04 허윤 Motion-based information input method and input device using the same
US20140368434A1 (en) * 2013-06-13 2014-12-18 Microsoft Corporation Generation of text by way of a touchless interface
KR102166330B1 (ko) 2013-08-23 2020-10-15 삼성메디슨 주식회사 Method and apparatus for providing a user interface of a medical diagnostic apparatus
JP5877824B2 (ja) * 2013-09-20 2016-03-08 ヤフー株式会社 Information processing system, information processing method, and information processing program
CN104978016A (zh) * 2014-04-14 2015-10-14 宏碁股份有限公司 Electronic device with virtual input function
CN105224069B (zh) * 2014-07-03 2019-03-19 王登高 Augmented-reality virtual keyboard input method and device using the same
CN104199550B (zh) * 2014-08-29 2017-05-17 福州瑞芯微电子股份有限公司 Virtual keyboard operating device, system and method
JP2016177658A (ja) * 2015-03-20 2016-10-06 カシオ計算機株式会社 Virtual input device, input method, and program
CN107430429B (zh) * 2015-04-07 2022-02-18 英特尔公司 Avatar keyboard
CN106488160A (zh) * 2015-08-24 2017-03-08 中兴通讯股份有限公司 Projection display method and device, and electronic equipment
KR102653267B1 (ko) * 2018-11-28 2024-04-02 삼성전자 주식회사 Key input method for an electronic device and electronic device using the same
CN110007774B (zh) * 2019-03-27 2022-01-14 联想(北京)有限公司 Keyboard device and electronic equipment
CN110414225B (zh) * 2019-07-24 2023-05-26 广东魅视科技股份有限公司 System and method for preventing HID keyboard attacks
CN112684901A (zh) * 2019-10-18 2021-04-20 王光达 Screen key label display method and single-hand chord mobile keyboard therefor
US11617953B2 (en) 2020-10-09 2023-04-04 Contact Control Interfaces, Llc. Virtual object interaction scripts
CN114167997B (zh) * 2022-02-15 2022-05-17 北京所思信息科技有限责任公司 Model display method, apparatus, device and storage medium

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2001059975A2 * 2000-02-11 2001-08-16 Canesta, Inc. Method and apparatus for entering data using a virtual input device
US6433774B1 (en) * 1998-12-04 2002-08-13 Intel Corporation Virtualization of interactive computer input

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100865598B1 (ko) * 2000-05-29 2008-10-27 브이케이비 인코포레이티드 Virtual data entry device and method for inputting alphanumeric and other data
US7042442B1 (en) * 2000-06-27 2006-05-09 International Business Machines Corporation Virtual invisible keyboard
EP1352303A4 (fr) * 2001-01-08 2007-12-12 Vkb Inc Data input device
JP4099117B2 (ja) * 2003-07-22 2008-06-11 シャープ株式会社 Virtual keyboard system
IL161002A0 (en) * 2004-03-22 2004-08-31 Itay Katz Virtual video keyboard system
US20070063979A1 (en) * 2005-09-19 2007-03-22 Available For Licensing Systems and methods to provide input/output for a portable data processing device
US20080018591A1 (en) * 2006-07-20 2008-01-24 Arkady Pittel User Interfacing

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6433774B1 (en) * 1998-12-04 2002-08-13 Intel Corporation Virtualization of interactive computer input
WO2001059975A2 * 2000-02-11 2001-08-16 Canesta, Inc. Method and apparatus for entering data using a virtual input device

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of WO2007093984A2 *

Also Published As

Publication number Publication date
KR20080106265A (ko) 2008-12-04
JP2009527041A (ja) 2009-07-23
WO2007093984A3 (fr) 2009-04-23
WO2007093984A2 (fr) 2007-08-23
CN101589425A (zh) 2009-11-25
EP1999547A4 (fr) 2011-10-12

Similar Documents

Publication Publication Date Title
WO2007093984A2 (fr) System and method for inputting data into a computing system
US5157384A (en) Advanced user interface
US6600480B2 (en) Virtual reality keyboard system and method
US6388657B1 (en) Virtual reality keyboard system and method
EP1383034B1 (fr) Input device with touch keys
AU780674B2 (en) Integrated keypad system
US9529523B2 (en) Method using a finger above a touchpad for controlling a computerized system
EP0769175B9 (fr) Jeu de caracteres a segments multiples et systeme de reconnaissance de l'ecriture
US20160364138A1 (en) Front touchscreen and back touchpad operated user interface employing semi-persistent button groups
US20060119588A1 (en) Apparatus and method of processing information input using a touchpad
US20140337786A1 (en) Method for controlling a virtual keyboard from a touchpad of a computerized device
US9542032B2 (en) Method using a predicted finger location above a touchpad for controlling a computerized system
US20020190946A1 (en) Pointing method
WO2004010276A1 (fr) Method and device for information display input, and information processing device
JP2013515295A (ja) Data input system and method therefor
JP2006524955A (ja) Unambiguous text input method for touch screens and reduced keyboards
JPH05508500A (ja) User interface with pseudo devices
WO2007121673A1 (fr) Method and device for improving the input speed of Chinese characters
JP2009110092A (ja) Input processing device
JP2000112636A (ja) Kana character input device
JP2007510999A (ja) Character conversion for a data input panel
JP2003196007A (ja) Character input device
Hirche et al. Adaptive interface for text input on large-scale interactive surfaces
CN101551701A (zh) Multidimensional control method and device, and optimal or better display input method and device
WO2015013662A1 (fr) Method for controlling a virtual keyboard from a touchpad of a computerized device

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

AK Designated contracting states

Kind code of ref document: A2

Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LI LT LU LV MC NL PL PT RO SE SI SK TR

AX Request for extension of the european patent

Extension state: AL BA HR MK RS

R17D Deferred search report published (corrected)

Effective date: 20090423

RIC1 Information provided on ipc code assigned before grant

Ipc: G06F 3/02 20060101ALI20090518BHEP

Ipc: G09G 5/00 20060101AFI20090518BHEP

17P Request for examination filed

Effective date: 20091023

RBV Designated contracting states (corrected)

Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LI LT LU LV MC NL PL PT RO SE SI SK TR

A4 Supplementary search report drawn up and despatched

Effective date: 20110909

RIC1 Information provided on ipc code assigned before grant

Ipc: G06F 3/03 20060101ALI20110905BHEP

Ipc: G06F 3/048 20060101ALI20110905BHEP

Ipc: G06F 3/01 20060101ALI20110905BHEP

Ipc: G06F 3/023 20060101ALI20110905BHEP

Ipc: G06F 3/02 20060101ALI20110905BHEP

Ipc: G09G 5/00 20060101AFI20110905BHEP

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20110901