US20150261312A1 - Talking multi-surface keyboard - Google Patents

Talking multi-surface keyboard

Info

Publication number
US20150261312A1
Authority
US
United States
Prior art keywords
keys
operator
key
touch
finger
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US14/214,710
Inventor
Hovsep Giragossian
Current Assignee
Individual
Original Assignee
Individual
Priority date
Filing date
Publication date
Application filed by Individual filed Critical Individual
Priority to US14/214,710
Publication of US20150261312A1
Priority claimed by US16/599,835 (published as US10963068B2)
Legal status: Abandoned

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F1/16: Constructional details or arrangements
    • G06F1/1613: Constructional details or arrangements for portable computers
    • G06F1/1626: Portable computers with a single-body enclosure integrating a flat display, e.g. Personal Digital Assistants [PDAs]
    • G06F1/1662: Details related to the integrated keyboard
    • G06F1/1669: Detachable keyboards
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/02: Input arrangements using manually operated switches, e.g. using keyboards or dials
    • G06F3/0216: Arrangements for ergonomically adjusting the disposition of keys of a keyboard
    • G06F3/0219: Special purpose keyboards
    • G06F3/0221: Arrangements for reducing keyboard size for transport or storage, e.g. foldable keyboards
    • G06F3/0236: Character input methods using selection techniques to select from displayed items
    • G06F3/0238: Programmable keyboards
    • G06F3/03547: Touch pads, in which fingers can move on a surface
    • G06F3/038: Control and interface arrangements therefor, e.g. drivers or device-embedded control circuitry
    • G06F3/04886: Virtual keyboards or menus by partitioning the display area of the touch-screen

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • Position Input By Displaying (AREA)

Abstract

An advanced portable keyboard comprising an interface for receiving contact information indicative of points of contact between parts of an operator's hand and one or a plurality of touch-sensitive pads built on the front surface, one or a plurality of back surfaces, and the edges, interacting with one or a plurality of computers or computer-driven appliances.
A mechanism mapping a plurality of interface elements onto touch-sensitive pads and static keys, each element corresponding to a key on a computer keyboard, each element being associated with one of the fingers of the operator, and each element being mapped recurrently and dynamically and activated by the respective finger.
A mechanism to read and/or pronounce letters, words, errors, and actions, to transcribe voice to text, and to switch touch pads into mouse pads in different operational modes, thus providing faster speed, healthier posture, and greater efficiency.

Description

  • This application claims the benefit of PPA Ser. No. 61/789,573 filed 2013 Mar. 15 by the present inventor, which is incorporated by reference.
  • BACKGROUND: Prior Art
  • The following is a tabulation of some prior art that presently appears relevant:
  • U.S. Patents
    Patent Number    Kind Code    Filing Date     Patentee
    US20130275907    A1           2011 Oct. 14    Hannes Lau and Christian Sax
    US20070036603    A1           2004 Sep. 22    Marek Swoboda
  • Smart phones are increasingly used for sending messages through text messaging, email, or alternative methods. However, in most cases the texting interface is limited to either a miniaturized keyboard or a soft keyboard. In either case users are limited to using two thumbs when entering text. This limitation decreases typing speed significantly.
  • There are many different forms of computer keyboards for computers and computer-driven appliances, including those with static hard keys, static soft keys, and dynamically mapped soft keys on a touch-sensitive surface.
  • When using keyboards with mechanical keys or with static soft keys, our fingers have to conform to the position of the keys; the keys are not customizable to the posture of our fingers. Different operators have different hand sizes and hand postures. Keyboards with static keys can't be customized to individuals' needs. These types of keyboards put stress on the wrists of the operator.
  • Computer keyboards with dynamically mapped soft keys provide operators the flexibility to keep their wrists straight. Patent US20130275907 A1 to Hannes Lau and Christian Sax, 2011 Oct. 14, references such keyboards. However, these keyboards with dynamically mapped keys use a flat surface, which forces the hands of the operator to rest on the palm side of the wrists, as if using a conventional flat keyboard. This limitation squeezes the median nerve, which may lead to developing Carpal Tunnel Syndrome.
  • Another form of computer keyboard, described in patent US20070036603 A1 to Marek Swoboda, 2004 Sep. 22, has keys on the back surface. Additional mechanical keys are built on the left and right edges of the keyboard. Both soft keys and mechanical keys are statically mounted or assigned on the back surface and on the left and right edges. Keys on the back of the keyboard are visible through the top transparent surface. The operator looks through the device to see and then manually activate the keys. This limitation prevents proper use of the top surface. For example, a display screen can't be efficiently used to show entered text or display a running program. Activating keys mounted on the left and right edges is not as comfortable as if they were mounted and activated on the front surface.
  • I have found that keyboards with keys on the back, where the back and the front surfaces are parallel with each other, are harder to use than if the two surfaces were at an angle. I will discuss this in more detail later.
  • On the other hand, keyboards with dynamically mapped invisible soft keys can't be used without an external monitor. Operators need to see what they type and be able to correct typos and grammatical errors. Further improvements are required.
  • Advantages
  • Accordingly, several advantages of one or more aspects are as follows:
  • Mobility and portability allow operators to stand, walk, or do other activities while typing on the keyboard. A sound module or other notification method alerts operators when typos or other errors are detected, or while transcribing voice into text and computer commands. Faster speed, greater efficiency, and healthier body and finger posture are achieved when keys are dynamically mapped onto the back of an embodiment and a limited number of keys are assigned to the front of the embodiment, where the thumbs activate the front keys and the other eight fingers activate the back keys.
  • These and other advantages will become apparent from the ensuing description and accompanying drawings.
  • SUMMARY
  • In accordance with one embodiment, a portable keyboard for a computer or a mobile device provides increased speed, functionality, and mobility. Use of dynamically mapped keys on back, front, and edges of an exemplary embodiment provides healthier posture for the hands and fingers of the operator. Sound and other media are incorporated to notify the operator and to transcribe voice into text and computer commands for greater efficiency.
  • DRAWINGS: Figures
  • The embodiments of the invention are described by way of example only, with reference to the accompanying drawings, in which one or more embodiments are illustrated with the same reference numerals referring to the same pieces of an embodiment throughout the drawings. It is understood that the invention is not limited to the embodiments depicted in the drawings herein, but rather is defined by the claims appended hereto and equivalent structures:
  • FIG. 1 shows different elements of the first embodiment, located on the front, top, and edge surfaces.
  • FIG. 2 shows two back surfaces of the first embodiment, each with a touch-sensitive surface or pad.
  • FIG. 3 shows the top surface of the first embodiment, and angles between the front surface and the adjacent surfaces.
  • FIG. 4 shows the back left surface of the first embodiment, the area covered by fingers of the left hand, and relative contact points.
  • FIG. 5 shows interface elements in Numeric Mode of the first embodiment mapped on the left touch pad.
  • FIG. 6 shows the process of remapping adjacent keys when the hand of the operator shifts.
  • FIG. 7 shows the second embodiment as an example of a docking station with a docked embodiment.
  • FIG. 8 shows the front surface and edges of the third embodiment.
  • FIG. 9 shows the elements of the back surfaces and edges of the third embodiment.
  • FIG. 10 shows the fourth embodiment, comprising two split units, one for each hand.
  • FIG. 11 shows the fifth embodiment as an attachment to a mobile device.
  • FIG. 12 shows the sixth embodiment as an enhanced movable mouse.
  • FIG. 13 shows the seventh embodiment with a slide-out extended touch pad.
  • FIG. 14 shows the seventh embodiment with two hinged touch pads.
  • FIG. 15 shows the eighth embodiment with ergonomic sides.
  • FIG. 16 shows the ninth embodiment with character keys mapped on the top pad.
  • FIG. 17 shows the tenth embodiment with collapsible arms.
  • DETAILED DESCRIPTION FIGS. 1 to 6—First Embodiment
  • FIG. 1 shows the front of the first exemplary embodiment. This embodiment provides a touch-sensitive screen or a touch-screen 105, a plurality of mechanical keys 109, 110, 111, 112, and 115, a plurality of soft programmable keys 122, 123, 124, 125, and 128, two speakers 103 and 120, two microphones 108 and 114, a plurality of status or indicator LED lights 117, and one visible light sensor 119 on the front surface 104. Touch pads 102, 106, 116 and 121 located on the right and left curved edges 101 and 118 of the embodiment are capable of detecting ranges of pressure levels from the thumbs and the hands of the operator. Adjustable legs 107 and 113 help keep the embodiment in a comfortable position for use.
  • The touch screen is capable of detecting at least the touch of two thumbs simultaneously and to display a plurality of icons, a plurality of running program windows, as well as text boxes and a plurality of programmable soft keys.
  • FIG. 2 shows the back of the embodiment with two bottom surfaces 201 and 214, each with a touch-sensitive surface or touch pad, 202 and 213, capable of detecting up to 4 fingers each, or a total of 8 fingers, simultaneously. Additionally, there is a plurality of invisible environment, motion, and position sensors (not shown).
  • Both touch pads are capable of detecting different ranges of pressure levels. Each pressure range describes a specific action, for example, tapping with a finger, touching or resting on the touchpad, sliding or moving a finger or a plurality of fingers on a touch pad, squeezing a touch pad with the palm of one hand, pressing harder on a touch pad with a finger, etc.
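  • The pressure-range classification described above can be sketched as follows. The thresholds, action names, and function signature are illustrative assumptions, not values specified in this application; only a subset of the listed actions is shown.

```python
# Sketch of mapping one touch-pad contact to one of the actions listed
# above (tap, rest, slide, hard press). All thresholds are assumed
# placeholder values for illustration only.

def classify_contact(pressure, moved, duration_s):
    """Classify a single contact by pressure range, motion, and duration."""
    if pressure > 0.8:          # highest pressure range
        return "hard-press"
    if moved:                   # finger slid across the pad
        return "slide"
    if duration_s < 0.2:        # brief contact
        return "tap"
    return "rest"               # finger resting on the pad
```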
  • Most computer keyboards have standard alphanumeric keys, including the home keys A, S, D, F, J, K, L, and ;, other alpha keys adjacent to the home keys, numeric keys, punctuation keys including mathematical operators, and a set of special keys including function keys. Examples of special keys are Tab, Caps, Shift, Ctrl, Command, Alt, Scroll, and Delete. Recent computer keyboards, in addition to these keys, include additional keys to activate and control programs running on the host computer. For example, a computer keyboard may have keys to open a web browser or a menu, and start and stop keys for playing music. The adjacent keys of a home key are activated by the same finger that activates the home key. For example, as shown in FIG. 2, key 203 and key 205 are the adjacent keys of the home key 204 on a QWERTY layout. There are several keyboard layouts, including QWERTY, Dvorak, Arabic, etc.
  • The first embodiment provides at least: a housing; a non-volatile storage medium; a volatile storage medium; a source of rechargeable power to provide electricity when operating wirelessly; a method to communicate with local and/or remote hosts; a first program to customize and build new keyboard layouts; a second program for choosing and activating a keyboard layout; a third program for obtaining the initial contact position of fingers when rested on the pads and the contact position of fingers when extended and retracted; a fourth program for detecting normal operation and activities performed by the hands of the operator, and for remapping the interface elements on the touch pads as fingers move or shift from the initial position; a fifth program to activate a sound module and/or other methods of notifying the operator when typos, grammatical, or other errors are detected, and to transcribe voice into text and computer commands; and an electronic module to run those programs, generate codes, display information on the touch screen, and transmit generated codes through wired, wireless, or both mediums to a locally attached host and/or to one or a plurality of remote hosts. Other notification methods include those currently available and future ones, including notification through vibration and through tactile sensation as described at: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC3100092/.
  • The non-volatile storage media holds the programs and information pertaining to all available keyboard layouts and customized keyboard layouts. These programs are responsible for providing at least BIOS support and to initialize and operate the embodiment.
  • The keys of keyboard layouts are divided into a plurality of interface groups. The elements of each interface group are comprised of one home key and none, one, or a plurality of adjacent keys. The elements of each interface group are assigned to and activated by one of the fingers of the operator. For example, in FIG. 2 interface 208, representing interface group D, is comprised of elements 207, 209, and 210, representing the E, D, and C keys, where element 209 is the home key D. Similarly, interface 212, representing interface group F, is comprised of home key 211 and several adjacent keys. Element 206 is an adjacent key of another interface group that represents the Enter key. The Enter key is one of the special keys.
  • TABLE 1
    Interface Group   Home Key      Home and Adjacent Member Keys    Hand    Assigned finger
    A                 A             Q A Z Tab Caps Shift             Left    Pinky finger
    S                 S             W S X                            Left    Ring finger
    D                 D             E D C                            Left    Middle finger
    F                 F             R F V T G B                      Left    Index finger
    J                 J             Y H N U J M                      Right   Index finger
    K                 K             I K ,                            Right   Middle finger
    L                 L             O L .                            Right   Ring finger
    Semicolon         ;             P ; / ' Backspace Enter Shift    Right   Pinky finger
    Left Space        Left Space    Left Space                       Left    Left thumb
    Right Space       Right Space   Right Space                      Right   Right thumb
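  • The finger-to-group association of Table 1 can be sketched as a plain dictionary. The data structure, key names, and helper function are hypothetical illustrations, not part of the application; the home key is listed first in each group.

```python
# Table 1 as a mapping from (hand, finger) to its interface group.
# Structure and names are illustrative assumptions.

QWERTY_GROUPS = {
    ("left", "pinky"):   ["A", "Q", "Z", "Tab", "Caps", "Shift"],
    ("left", "ring"):    ["S", "W", "X"],
    ("left", "middle"):  ["D", "E", "C"],
    ("left", "index"):   ["F", "R", "V", "T", "G", "B"],
    ("right", "index"):  ["J", "Y", "H", "N", "U", "M"],
    ("right", "middle"): ["K", "I", ","],
    ("right", "ring"):   ["L", "O", "."],
    ("right", "pinky"):  [";", "P", "/", "'", "Backspace", "Enter", "Shift"],
    ("left", "thumb"):   ["Space"],
    ("right", "thumb"):  ["Space"],
}

def home_key(hand, finger):
    """Return the home key activated by the given finger."""
    return QWERTY_GROUPS[(hand, finger)][0]
```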
  • Also as shown in FIG. 3, to provide additional comfort for fingers, the back surfaces 201 and 214, and front surface 104 are not parallel with each other. Angle 301 between two back surfaces, and angles 303 and 304 between the front surface 104 and two back surfaces are less than 120 degrees, making the top surface 302 look triangular.
  • FIG. 4 shows touch pad 213 on back surface 214, where the fingers of the left hand are resting. Contact points 401, 403, 405, and 407 are the areas touched by the Index finger, Middle finger, Ring finger, and the Pinky finger of the left hand respectively. Areas 402, 404, 406, and 408 are the areas assigned or mapped to home keys F, D, S, and A (invisible), to be activated by each respective finger.
  • FIG. 5 shows a set of numeric keys and arithmetic operation keys. Interface group 501 contains elements 502, 503, and 504. Element 503 is the digit 5 home key, and elements 502 and 504 are the adjacent digit 8 and digit 2 keys.
  • Operation: First Embodiment
  • Holding the two mechanical keys labeled Space for 3 seconds or longer prompts the operator to choose one of the programs. The operator may call the first program to customize a layout. As part of the customization, the operator may customize each key of a selected keyboard, the functions of touch pads 102, 106, 116 and 121 (FIG. 1) located on the edges of the embodiment, the functions of each mechanical key, and the functions of the soft programmable keys 122, 123, 124, 125, and 126. The operator may choose to assign certain keys to the left or right thumb. The operator may choose to set the number of active fingers on either touch pad to fewer than 4 fingers, excluding thumbs. In such a case the operator will customize and alter the assignment of keys to each active finger. Each customized layout may be saved at one of a plurality of non-volatile storage locations.
  • An operator, at the first use of the keyboard, calls up the second program and chooses and activates one of the available keyboard layouts or one of the customized layouts. The operator should also call the third program at least once to customize the keyboard to his or her hands and fingers.
  • When the third program is called, the operator is prompted to follow a series of instructions to detect and register the contact points of each finger at the resting position on the touch pads, as well as the contact points of each finger when extending or retracting to tap on other areas of the touch pads, as if typing on the adjacent keys of a keyboard.
  • For example, when the operator is prompted to rest all 8 fingers on the two touch pads, the third program detects and registers the contact point of the left Index finger at the resting position and maps the F key to that position. Then, through a sequence of prompts, the operator taps on different areas of the left touch pad by extending or retracting the left Index finger, as if typing on the adjacent keys R, V, T, G, and B on a QWERTY keyboard. The centers of all contact points, or the coordinates of the contact points between the left Index finger and the touch pad, are grouped together as elements of the F interface group. FIG. 2 shows interface group 212 representing the F interface group.
  • As shown in FIG. 4, for example, the small area 402 is assigned on the touch pad 213 to the F key (invisible), and the Index finger of the operator activates it. Area 402 is the center of contact point 401, which is the contact point between the left Index finger and the touch pad 213. Since the contact points are larger than the mapped areas, the operator can easily tap on mapped areas and activate the respective keys. Mapped keys are invisible soft keys. Each special key, for example Ctrl, Shift, Delete, etc., may be mapped to a mechanical key, a static soft key, a dynamically mapped soft key, or to a plurality of keys simultaneously.
  • Once all contact points or coordinates of all fingers are registered, the third program maps each of the contact points to a key. The third program also calculates the distance between each adjacent key and its home key, and the distance between the same adjacent key and the neighbor home key. For example, as shown in FIG. 6, the distance between adjacent key 601, representing key E, and home key 602, representing key D, as well as the distance between key 601 and home key 603, representing key F, are calculated and saved together with their coordinates in the non-volatile storage medium. The information stored in the storage medium is available to the fourth program.
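  • What the third program stores per adjacent key can be sketched as follows. The function and field names are hypothetical, and the 12 mm / 16 mm values match the FIG. 6 example discussed below.

```python
import math

# Sketch of the record the third program saves for one adjacent key:
# its distance to its own home key and to the neighbor home key.
# Names are illustrative assumptions.

def register_adjacent(adj_xy, home_xy, neighbor_xy):
    """Return the two distances stored for one adjacent key."""
    return {
        "d_home": math.dist(adj_xy, home_xy),
        "d_neighbor": math.dist(adj_xy, neighbor_xy),
    }

# Example with the FIG. 6 values: E is 12 mm from D and 16 mm from F.
e_record = register_adjacent((0.0, 0.0), (0.0, 12.0), (16.0, 0.0))
```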
  • The fourth program is the main program that runs automatically all the time to detect activities performed by the hands of the operator and to generate codes relative to each activity on a touch pad or the touch screen. For example, when the operator taps on the mapped location of key D with the left Middle finger, a character code representing the letter d is generated. If the operator is pressing and holding the Shift key with the right Pinky finger while performing the same action, a character code representing the letter D is generated.
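  • The d/D example can be sketched minimally; a real implementation would emit keyboard scan codes, but plain characters are used here for illustration, and the function name is an assumption.

```python
# Sketch of code generation with a modifier held, as in the d/D
# example above. Plain characters stand in for generated key codes.

def emit_character(key, shift_held):
    """Return the character code for a tapped letter key, honoring Shift."""
    return key.upper() if shift_held else key.lower()
```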
  • As the operator's hands move, the positions of the home keys also shift and move. The fourth program recurrently detects the new locations of the home keys by reading the contact points of the active fingers whenever they are at the resting position. The new positions of the home keys are dynamically remapped on the touch pads. With the information pertaining to the adjacent keys and their distances from the two home keys available, the coordinates of the adjacent keys are adjusted based on the coordinates of the new home keys. Therefore there is no need to detect the position of the adjacent keys.
  • For example, in FIG. 6, the existing positions of the two home keys 602 and 603, representing the D and F home keys, are shown. The F key is the neighbor home key relative to the D home key. Key 601 is the adjacent key of the D key and is an element of the D interface group. Let's say that when we ran the third program, the distance between the E and D keys was 12 mm and the distance between the E and F keys was 16 mm.
  • Let's say that the left hand of the operator moved and shifted forward. In this example in FIG. 6, the contact point of the left Middle finger 602 moved forward slightly more than that of the left Index finger 603. The fourth program detects the new contact points of the left Middle finger 605 and the left Index finger 604 and remaps the D and F keys on the left touch pad accordingly. It also calculates and updates the coordinates of the E key 601 and maps it at location 606, which is 12 mm from the new position of the D home key and 16 mm from the F neighbor home key.
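  • The adjustment step can be sketched as a two-circle intersection: the adjacent key is placed at its stored distances from the two remapped home keys. The function name and the choice of which of the two intersection points to keep (the one on a fixed side of the home-key line, assumed to be the adjacent-key row) are assumptions.

```python
import math

def remap_adjacent(home_xy, neighbor_xy, d_home, d_neighbor):
    """Locate an adjacent key at stored distances from two home keys.

    Of the two circle intersections, the point on one fixed side of
    the home-key line is returned (assumed adjacent-key row).
    """
    hx, hy = home_xy
    nx, ny = neighbor_xy
    d = math.dist(home_xy, neighbor_xy)
    # distance from the home key to the foot of the perpendicular
    a = (d_home**2 - d_neighbor**2 + d**2) / (2 * d)
    # clamp so slightly inconsistent measurements still yield a point
    h = math.sqrt(max(d_home**2 - a**2, 0.0))
    mx = hx + a * (nx - hx) / d
    my = hy + a * (ny - hy) / d
    # offset perpendicular to the home-key line
    return (mx - h * (ny - hy) / d, my + h * (nx - hx) / d)
```

With the FIG. 6 values (12 mm and 16 mm), the E key lands 12 mm from the new D position and 16 mm from the new F position however the hand has shifted.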
  • Operator's left and right thumbs activate the mechanical keys located under the front touch screen, and any soft programmable key mapped on the touch screen.
  • The fifth program, at the operator's request, reads every letter or word and notifies the operator with distinct sounds when typos, grammatical errors, or other errors are detected. The program may use other forms of notification, for example vibration or displaying a message on the touch screen. The program may, at the operator's request, record and transcribe voice into text and computer commands, letting the operator only correct errors made during the transcription process.
  • The different modes of operation include Extended Mode, Arrows Mode, Multi-Mice Mode, etc. Other modes of operation may be achieved by creating new customized layouts. The operator may activate any of these modes, one at a time, by either pressing the mode key 111 in FIG. 1 or by touching one of the programmable soft keys located on the touch screen that is programmed to perform the same function. Each mode associates different keys with the fingers of the operator.
  • In Texting Mode, as shown in FIG. 1, mechanical keys 109, 110, 112, and 115 represent Ctrl, left Space, right Space, and Alt keys. Also in Texting Mode as shown in FIG. 2, the elements of each interface represent alpha keys or a combination of alpha and a set of special keys.
  • However, in Numeric Mode, as shown in FIG. 5, the elements of each interface represent numeric keys and arithmetic operation keys. For example, interface group 501 contains keys 8, 5, and 2, where element 503 is the digit 5 home key and elements 502 and 504 are the adjacent digit 8 and digit 2 keys.
  • Table 2 shows interface groups and associated elements in Numeric Mode. In this mode of operation one or both of the touch pads on back surfaces may be mapped with the same interface groups. For example, interface group 5 with elements 8, 5, and 2 may be associated with the left Middle finger, right Middle finger, or both.
  • TABLE 2
    Interface Group | Home Key | Home and Adjacent Member Keys | Hand  | Assigned Finger
    Minus           | Minus    | / - *                         | Left  | Pinky finger
    4               | 4        | 7 4 1                         | Left  | Ring finger
    5               | 5        | 8 5 2                         | Left  | Middle finger
    6               | 6        | 9 6 3 + Enter                 | Left  | Index finger
    4               | 4        | 7 4 1 / Minus *               | Right | Index finger
    5               | 5        | 8 5 2                         | Right | Middle finger
    6               | 6        | 9 6 3                         | Right | Ring finger
    Plus            | Plus     | Enter +                       | Right | Pinky finger
    Zero            | Zero     | 0                             | Left  | Left thumb
    Dot             | Dot      | .                             | Right | Right thumb
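Table 2 can also be read as the lookup structure a keyboard driver might consult in Numeric Mode. The dictionary below is a sketch that transcribes the table; the key names and the lookup function are assumptions, and the optional mirroring of groups onto either back pad (mentioned above) is not implemented here:

```python
# (hand, finger) -> (home key, home and adjacent member keys)
NUMERIC_MODE = {
    ("left", "pinky"):   ("Minus", ["/", "-", "*"]),
    ("left", "ring"):    ("4", ["7", "4", "1"]),
    ("left", "middle"):  ("5", ["8", "5", "2"]),
    ("left", "index"):   ("6", ["9", "6", "3", "+", "Enter"]),
    ("right", "index"):  ("4", ["7", "4", "1", "/", "-", "*"]),
    ("right", "middle"): ("5", ["8", "5", "2"]),
    ("right", "ring"):   ("6", ["9", "6", "3"]),
    ("right", "pinky"):  ("Plus", ["Enter", "+"]),
    ("left", "thumb"):   ("0", ["0"]),
    ("right", "thumb"):  (".", ["."]),
}

def home_key(hand, finger):
    """Home key mapped under a given finger in Numeric Mode."""
    return NUMERIC_MODE[(hand, finger)][0]
```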
  • Extended Mode provides access to different symbols, function keys, numbers, and special keys. Table 3 shows interface groups and associated elements in Extended Mode. For example, interface group D in Texting Mode becomes interface group 3 in Extended Mode. Function keys F3 and F13 and the digit 3 key become the elements of interface group 3.
  • TABLE 3
    Interface Group | Home Key | Home and Adjacent Member Keys | Hand  | Assigned Finger
    1               | 1        | F1 1 F11                      | Left  | Pinky finger
    2               | 2        | F2 2 F12                      | Left  | Ring finger
    3               | 3        | F3 3 F13                      | Left  | Middle finger
    4               | 4        | F4 4 F14                      | Left  | Index finger
    5               | 5        | F5 5 F15                      | Right | Index finger
    6               | 6        | F6 6 F16                      | Right | Middle finger
    7               | 7        | F7 7 F17                      | Right | Ring finger
    8               | 8        | F8 8 F18                      | Right | Pinky finger
    9               | 9        | 9                             | Left  | Left thumb
    0               | 0        | 0                             | Right | Right thumb
    F9              | F9       | F9                            | Left  | Left thumb*
    F10             | F10      | F10                           | Right | Right thumb*
  • Each thumb is assigned to activate at least two mechanical keys located on the front surface under the touch screen. For example, the right thumb can activate the two rightmost mechanical keys, which are mapped to the digit 0 key and the F10 function key.
  • In Multi-Mice Mode, both touch pads located on the back of the embodiment become mouse pads. In this mode, one or a plurality of fingers of each hand may control the movements of an independent mouse or perform other activities. For example, by sliding the Index finger of the left hand on the left touch pad, the operator can move the pointer of the 1st mouse, and can perform another action by tapping with the same finger on the touch pad. Similarly, the Index finger of the right hand may perform the same actions as the left Index finger, but to control a 2nd mouse. Additionally, when using 2, 3, or even 4 fingers simultaneously, the operator may perform one of the common actions of a typical mouse, or simplify and facilitate a process that would otherwise require several steps to accomplish. For example, in a drawing program, by using the two Index fingers, the operator may grab the two ends of a line, then move and stretch it to fit a new location; the same function would otherwise require moving each end of the line one step at a time. In another example, the operator may rotate, zoom, and perform other actions on an object simultaneously by using more fingers of one or both hands.
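A sketch of the two-pointer idea described above, with invented coordinates and method names: each back touch pad drives its own independent pointer, so the line-stretching example collapses into a single step instead of moving one end at a time.

```python
class MultiMice:
    """Track several independent mouse pointers, one per touch pad."""

    def __init__(self, n=2):
        self.pointers = [(0.0, 0.0) for _ in range(n)]

    def slide(self, mouse_id, dx, dy):
        """A finger sliding on pad `mouse_id` drags its pointer."""
        x, y = self.pointers[mouse_id]
        self.pointers[mouse_id] = (x + dx, y + dy)

    def stretch_line(self, line):
        """Grab both ends of a line at once: each end snaps to its
        pointer in one step, rather than one end at a time."""
        return (self.pointers[0], self.pointers[1])

mice = MultiMice()
mice.slide(0, 5, 0)      # left Index finger drags the 1st pointer
mice.slide(1, -3, 4)     # right Index finger drags the 2nd pointer
new_line = mice.stretch_line(((0, 0), (10, 0)))
```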
  • Table 4 shows an example of the functions to be assigned to each finger. As stated earlier, by combining the functions of a plurality of fingers, a new function may be achieved. These new functions may extend the capabilities of currently available computer programs, and of programs that become available in the future to take advantage of this or other embodiments.
  • TABLE 4
    Interface Group | Home Key | Home and Adjacent Member Keys | Hand  | Assigned Finger
    (unassigned)    |          |                               | Left  | Pinky finger
    Zoom            | Zoom     | Zoom-in Zoom-out              | Left  | Ring finger
    Rotate          | Rotate   | Rotate Page-up Page-down      | Left  | Middle finger
    Mouse-1         | Mouse-1  | Mouse-1                       | Left  | Index finger
    Mouse-2         | Mouse-2  | Mouse-2                       | Right | Index finger
    Rotate          | Rotate   | Rotate Page-up Page-down      | Right | Middle finger
    Zoom            | Zoom     | Zoom-in Zoom-out              | Right | Ring finger
    (unassigned)    |          |                               | Right | Pinky finger
    Hold            | Hold     | Hold                          | Left  | Left thumb
    Hold            | Hold     | Hold                          | Right | Right thumb
  • In Arrows Mode, tapping with the right Index finger at its resting position will move the cursor or the selected object to the left, tapping with the right Ring finger at its resting position will move the cursor or the selected object to the right, and tapping with the right Middle finger above or below its resting position will move the cursor or the selected object up or down accordingly. The same functions may be achieved by the fingers of the left hand.
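The tap-to-arrow mapping for the right hand can be sketched as follows; the finger names and the sign convention for "above/below the resting position" are assumptions made for illustration:

```python
def arrow_for_tap(finger, offset_y=0.0):
    """Map a tap in Arrows Mode to a cursor direction.

    Right Index at rest -> left; right Ring at rest -> right;
    right Middle above its resting position (offset_y > 0) -> up,
    below it -> down. Returns None for unmapped fingers.
    """
    if finger == "index":
        return "left"
    if finger == "ring":
        return "right"
    if finger == "middle":
        return "up" if offset_y > 0 else "down"
    return None
```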
  • Functions of the touch pads located on the left and the right edges of the embodiment may vary when selecting different layouts or different modes of operation. In FIG. 1, touch pads 102 and 121 may be programmed to provide status updates on certain programs when pushed by the thumbs, while touch pads 106 and 116 may be programmed to change the thickness of a pen in a drawing application.
  • An operator may hold the first embodiment in the air, lay it on a desk, or dock it on a docking station during the operation. As shown in FIG. 1, adjustable legs 107 and 113 help keep the keyboard at a comfortable angle when set on a desk or docked.
  • Detailed Description FIG. 7—Second Embodiment
  • FIG. 7 shows the second embodiment 702 as an exemplary model of a docking station. Different models of the embodiment can be used for docking the first embodiment or other embodiments. The docking station is adjustable such that it holds an embodiment 701 at a comfortable angle for the operator. This embodiment provides charging power and communication ports, including Ethernet, USB, etc.
  • Operation Second Embodiment
  • When an embodiment is docked into the second embodiment, all the ports of the docking station become available to the docked embodiment and the recharging process begins. Operator may use the docked embodiment while charging.
  • Detailed Description FIGS. 8 and 9—Third Embodiment
  • Another example of the invention is described in the third embodiment: a new smart phone may be built with a touch-sensitive pad on the back. FIG. 8 shows the front of the third embodiment. Speaker 801, touch screen 802, and microphone 806 are the standard parts of a smart phone. Soft keys 803, 804, 805, 807, 808, and 809 are examples of keys which the thumbs of the operator can activate.
  • In FIG. 9, flash LED 901 and camera 902 are parts of a common smart phone. Back surface 903 has a built-in touch-sensitive pad 904. Invisible soft keys 905, 906, 907, and 908 are examples of keys mapped on the touch pad. Similar to the first embodiment, in Texting Mode these keys represent alpha keys and some of the special keys. The same set of programs that is available on the first embodiment is also available on the third embodiment. The third embodiment uses the volatile and non-volatile storage media, voice module, power source, and electronic circuitry of the smart phone. Therefore, the same functions and capabilities of the first embodiment are available with the third embodiment. Additionally, the third embodiment performs as a smart cell phone.
  • Operation Third Embodiment
  • The third embodiment operates like a smart cell phone but with added features and capabilities of the first embodiment.
  • Detailed Description FIG. 10—Fourth Embodiment
  • The fourth embodiment as shown in FIG. 10 provides two separate units, one for each hand. Each unit provides a front touch pad 1005 and a back touch pad 1003. Openings 1004 and 1006 facilitate holding the units in each hand. Each unit is labeled accordingly and is worn on the respective hand. Reattachment parts 1001 and 1002 connect and lock the two units together into one unit. Other forms of the embodiment may be manufactured to provide straps to be fastened to the hands, or to be easily held in the hands when in operation.
  • The same set of volatile and non-volatile storage media, voice module, power source, electronic circuitry, and programs available on the first embodiment is built onto the fourth embodiment. Therefore, all functions and capabilities of the first embodiment, except for the functions of the front touch screen, are available on the fourth embodiment.
  • Operation Fourth Embodiment
  • The fourth embodiment operates like the first embodiment but without a front touch screen, as shown in FIG. 10. The operator may use it while walking, jogging, running, standing, sitting, lying in bed, or in other body postures. When the two units are locked together as one unit, it may be placed on a desk.
  • Detailed Description FIG. 11—Fifth Embodiment
  • The fifth embodiment provides a touch pad and a communication medium. It is attached to a mobile device or other portable devices. FIG. 11 is an example of such a unit, which can be easily snapped onto the back of a smart phone with attachment arms. Attachment arm 1105 is shown in FIG. 11. Other attachment methods may be used. There is an opening 1102 on the surface 1101 making the camera visible and functional. Invisible soft keys are mapped dynamically on touch pad 1103. Soft keys 1104, 1106, 1107, and 1108 are examples of the interface elements in Texting Mode. They perform the same functions as the two touch pads on the first embodiment. A set of programs and drivers providing the same functionality as the programs available on the first embodiment is loaded into the portable host device. They access the resources of the host to save and recall necessary information. The fifth embodiment is powered by the host or by built-in power sources.
  • Operation Fifth Embodiment
  • The operator attaches the fifth embodiment to the back of a mobile device or a smart phone and connects it to the host through a wired medium, for example a USB cable. The operator runs the same set of programs to configure and customize this embodiment. Operation of the fifth embodiment is similar to the operation of the third embodiment.
  • Detailed Description FIG. 12—Sixth Embodiment
  • The sixth embodiment provides two mouse-like portable units, one to be operated with the left hand and the other with the right hand, each labeled accordingly for each hand. Each unit has a plurality of touch pads. An example of such a unit that is operable with the right hand is shown in FIG. 12. The central holding cup 1203 has a pressure sensitive pad 1202 which is activated by the palm of the operator. A touch sensitive surface 1201 is attached to the bottom of the cup 1203. Elements 1204, 1205, and 1206, are the elements of an interface group which are dynamically mapped on the touch pad. Element 1207 represents the Shift special key. There are moving wheels or smooth sliding surfaces under the bottom of each unit. Both units may utilize current or future technologies, including currently available technologies used on a conventional mouse to generate codes when moved or when any of the built-in keys are used. Both units may communicate with a host computer through a wired or wireless medium. Part of the programs performing the same functions as on the first embodiment are loaded on the host computer and part of those programs are built into this embodiment. They access the resources of the host to save and recall necessary information. The sixth embodiment is powered by the host or by built-in power sources.
  • Operation Sixth Embodiment
  • In this example, using only one of the units, the operator rests the right hand on the cup of the unit and rests the fingers on the touch pad. The operator runs the same set of programs on the host computer to customize and configure both units.
  • The mouse pointer is moved either by moving or sliding the device or switching the touch pad into a mouse pad. Operator may call the first, second or the third program by pressing and holding on the touch pad located on the cup for more than 3 seconds. Once configured, operator can use the touch pads to enter text or perform other keyboard functions, or, by moving or sliding each unit, perform the functions of a computer mouse. Since there are two units, each operated by one hand, operator may perform advanced mouse functions with two mice.
  • All embodiments may use wired technologies, wireless technologies, or both, whether currently available or available in the future, to communicate with local or remote hosts. All embodiments may use any source of power, including solar power, magnetic power, batteries, host power, or other power sources not currently available in the market.
  • Detailed Description FIGS. 13 and 14—Seventh Embodiment
  • The seventh embodiment is an example of a mobile device with extended touch pads. FIG. 13 shows slide-out touch pad 1305. Touch pad 1302 is built on the back surface 1301 of said mobile device. Dynamic soft keys 1303 and 1304, as an example, are mapped on touch pads 1302 and 1305. Touch pad 1302 is for the right hand, and slide-out touch pad 1305 is for the left hand of the operator. As an alternative, slide-out touch pad 1305 could be hinged on the side of the mobile device instead (no figure).
  • Another example of an extended touch pad is shown in FIG. 14. There are two touch pads attached to the back surface 1401 of the mobile device with hinges 1404 and 1405.
  • Dynamic soft key 1403, as an example, is on touch pad 1402, which is used with the right hand, and dynamic soft key 1406, as an example, is on touch pad 1407, which is used with the left hand of the operator.
  • The seventh embodiment provides a larger typing surface for the operator's fingers.
  • Detailed Description FIG. 15—Eighth Embodiment
  • The eighth embodiment, as shown in FIG. 15, is an example of a mobile device with rounded sides that is ergonomically designed to contour to the hands of operators. Touch pad 1500 is housed on a mobile device with contoured sides 1501 and 1503. Soft key 150, as an example, is mapped on the touch pad 1500.
  • Detailed Description FIG. 16—Ninth Embodiment
  • The ninth embodiment, as shown in FIG. 16, is very similar to the eighth embodiment except that all the character keys are mapped on the front touch pad 1600 and activated with the thumbs of the operator. Soft keys 1601 and 1602 are examples of a character key and a control key, respectively. The touch pad on the back of the ninth embodiment (no figure) is used with the other fingers of the operator as a dual-mouse pad, or to enter numbers, symbols, or perform other functions.
  • Detailed Description FIG. 17—Tenth Embodiment
  • The tenth embodiment as shown in FIG. 17 is an example of a mobile device with collapsible sides 1700 and 1701 to provide ergonomic and easy holding and typing capabilities. Said sides can be pushed in or out of the body of the mobile device and locked when in operation.
  • Operation Seventh to Tenth Embodiments
  • Embodiments seven, eight, nine, and ten are examples of more ergonomically designed mobile devices. They provide a larger surface for the fingers or ergonomic sides for easy interaction. There are no additional functions or operations beyond what is available with the other embodiments.
  • CONCLUSION, RAMIFICATION, AND SCOPE
  • One or more embodiments provide the following benefits individually or collectively:
  • Mobility and portability give operators the choice to stand, walk, lie in bed, or do other activities while typing on the keyboard. With invisible keys on the back of a portable device that dynamically conform to the fingers of the operator, all the fingers of the operator become active. Hence, increased typing speed, greater efficiency, and healthier posture of the fingers, wrists, hands, and body of the operator are achieved.
  • Versatile and customizable layouts meet the needs of different operators, including operators with fewer than 10 active fingers. The sound module adds another attribute by detecting and announcing errors and by transcribing voice to text and computer commands. Each embodiment, in addition to the functions of an advanced computer keyboard, can activate a plurality of instances of a computer mouse, controlling the actions of several mice in one application program. This enhanced capability provides support for future programs requiring such capability.

Claims (4)

1. A portable data processing device comprising:
a. a plurality of surfaces,
b. detecting means for receiving contact information indicative of points of contact between fingers and said surfaces, and
c. mapping onto said surfaces a plurality of computer keys, each key assigned to one of said points of contact and activated by the associated finger.
2. Data processing device according to claim 1, wherein said device is attached to a portable device.
3. A method of providing error notification, pronunciation, and voice transcription interface for any of the preceding claims.
4. A method of providing docking support for claims 1 and 2.
US14/214,710 2014-03-15 2014-03-15 Talking multi-surface keyboard Abandoned US20150261312A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US14/214,710 US20150261312A1 (en) 2014-03-15 2014-03-15 Talking multi-surface keyboard
US16/599,835 US10963068B2 (en) 2014-03-15 2019-10-11 Talking multi-surface keyboard

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US14/214,710 US20150261312A1 (en) 2014-03-15 2014-03-15 Talking multi-surface keyboard

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US16/599,835 Continuation US10963068B2 (en) 2014-03-15 2019-10-11 Talking multi-surface keyboard

Publications (1)

Publication Number Publication Date
US20150261312A1 true US20150261312A1 (en) 2015-09-17

Family

ID=54068838

Family Applications (2)

Application Number Title Priority Date Filing Date
US14/214,710 Abandoned US20150261312A1 (en) 2014-03-15 2014-03-15 Talking multi-surface keyboard
US16/599,835 Active US10963068B2 (en) 2014-03-15 2019-10-11 Talking multi-surface keyboard

Family Applications After (1)

Application Number Title Priority Date Filing Date
US16/599,835 Active US10963068B2 (en) 2014-03-15 2019-10-11 Talking multi-surface keyboard

Country Status (1)

Country Link
US (2) US20150261312A1 (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107608534A (en) * 2017-08-31 2018-01-19 深圳大学 A kind of acoustic keyboard realizes system, implementation method and application method
USD890756S1 (en) * 2018-10-31 2020-07-21 Christie Scott Wall Keyboard
US11086516B2 (en) 2018-10-31 2021-08-10 Christie Scott Wall Mobile, versatile, transparent, double-sided data input or control device

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2017183923A (en) * 2016-03-29 2017-10-05 京セラ株式会社 Electronic apparatus, character input control method, and character input program
KR101983504B1 (en) * 2018-08-29 2019-05-29 최태홍 Smart phone case combination keyboard for blind
US11304511B2 (en) * 2019-11-12 2022-04-19 Logitech Europe S.A. Ergonomic keyboard system
US11963315B2 (en) * 2021-03-19 2024-04-16 Deere & Company Housing for a portable electronic device
USD995263S1 (en) * 2022-12-08 2023-08-15 Rvlock & Co, Llc Recreational vehicle door handle

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140104180A1 (en) * 2011-08-16 2014-04-17 Mark Schaffer Input Device

Family Cites Families (29)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5128672A (en) 1990-10-30 1992-07-07 Apple Computer, Inc. Dynamic predictive keyboard
US5748177A (en) 1995-06-07 1998-05-05 Semantic Compaction Systems Dynamic keyboard and method for dynamically redefining keys on a keyboard
US6359572B1 (en) 1998-09-03 2002-03-19 Microsoft Corporation Dynamic keyboard
US20070036603A1 (en) 2003-09-22 2007-02-15 Marek Swoboda Portable keyboard
US8036895B2 (en) * 2004-04-02 2011-10-11 K-Nfb Reading Technology, Inc. Cooperative processing for portable reading machine
GB0417069D0 (en) * 2004-07-30 2004-09-01 Hewlett Packard Development Co Methods, apparatus and software for validating entries made on a form
KR100724848B1 (en) * 2004-12-10 2007-06-04 삼성전자주식회사 Method for voice announcing input character in portable terminal
US8077147B2 (en) * 2005-12-30 2011-12-13 Apple Inc. Mouse with optical sensing surface
CN201035502Y (en) 2006-12-28 2008-03-12 上海麦柯信息技术有限公司 Safety accidental dynamic soft keyboard
US8949266B2 (en) 2007-03-07 2015-02-03 Vlingo Corporation Multiple web-based content category searching in mobile search application
US20090174663A1 (en) 2008-01-03 2009-07-09 Electronic Data Systems Corporation Dynamically configurable keyboard for computer
US9552155B2 2008-02-04 2017-01-24 Microsoft Technology Licensing, LLC Dynamic soft keyboard
GB0816222D0 (en) 2008-09-05 2008-10-15 Elliptic Laboratories As Machine interfaces
US8812972B2 (en) 2009-09-30 2014-08-19 At&T Intellectual Property I, L.P. Dynamic generation of soft keyboards for mobile devices
US20110074692A1 (en) 2009-09-30 2011-03-31 At&T Mobility Ii Llc Devices and Methods for Conforming a Virtual Keyboard
US9128610B2 (en) 2009-09-30 2015-09-08 At&T Mobility Ii Llc Virtual predictive keypad
US8432362B2 (en) 2010-03-07 2013-04-30 Ice Computer, Inc. Keyboards and methods thereof
WO2011121103A2 (en) 2010-03-31 2011-10-06 Danmarks Tekniske Universitet A dynamic display keyboard and a key for use in a dynamic display keyboard
US10496714B2 (en) 2010-08-06 2019-12-03 Google Llc State-dependent query response
US9285840B2 (en) * 2010-08-19 2016-03-15 Michael S. Stamer Detachable sensory-interface device for a wireless personal communication device and method
US20130275907A1 (en) 2010-10-14 2013-10-17 University of Technology ,Sydney Virtual keyboard
EP2474890A1 (en) 2010-12-30 2012-07-11 Touch Software BV Virtual keyboard configuration putting fingers in rest positions on a multitouch screen, calibrating key positions thereof
AP2013007206A0 (en) 2011-03-31 2013-10-31 Infosys Ltd System and method for utilizing a dynamic virtual keyboard
WO2012167397A1 (en) 2011-06-07 2012-12-13 Intel Corporation Dynamic soft keyboard for touch screen device
US8599158B2 (en) * 2011-06-29 2013-12-03 Nokia Corporation Multi-surface touch sensitive apparatus and method
CN104509078A (en) 2012-08-09 2015-04-08 李永贵 Keyboard and mouse of cellular phone
US20140082517A1 (en) 2012-09-14 2014-03-20 Salesforce.Com, Inc. Facilitating dynamic creation, customization, and execution of keyboard shortcuts in an on-demand services environment
WO2014076258A1 (en) * 2012-11-15 2014-05-22 Schönleben Oliver Method and device for typing on mobile computing devices
EP3087559B1 (en) * 2013-12-24 2021-05-05 Flexterra, Inc. Support structures for a flexible electronic component

Also Published As

Publication number Publication date
US10963068B2 (en) 2021-03-30
US20200081551A1 (en) 2020-03-12

Similar Documents

Publication Publication Date Title
US20150261312A1 (en) Talking multi-surface keyboard
US9218126B2 (en) Methods circuits apparatus and systems for human machine interfacing with an electronic appliance
JP4981066B2 (en) Keyboard for portable electronic device
US8471689B2 (en) Touch-sensitive motion device
US8175664B2 (en) Angular keyboard for a handheld mobile communication device
US20100045611A1 (en) Touch screen mobile device as graphics tablet input
EP2367094A1 (en) Touch sensitive keypad with tactile feedback
KR20080083057A (en) Inputting information with finger-mounted sensors
CN102822785A (en) An image of a keyboard
US20130127791A1 (en) Thumb or Finger Devices with Electrically Conductive Tips & Other Features for Use with Capacitive Touch Screens and/or Mechanical Keyboards Employed in Smartphones & Other Small Mobile Devices
JP2013089257A (en) Portable electronic equipment
KR101156960B1 (en) Portable terminal with touch screen for the blind and mehtod for operating thereof
US9176631B2 (en) Touch-and-play input device and operating method thereof
WO2015106016A1 (en) Determining input associated with one-to-many key mappings
US20120139858A1 (en) Dual touch pad interface for a computing device
US11054984B2 (en) Gesture-based input command interface, method and system
US20150172430A1 (en) Keypad for mobile terminal
JP2014110480A (en) Information processing apparatus, and method and program for controlling information processing apparatus
US20130069881A1 (en) Electronic device and method of character entry
TWI631484B (en) Direction-based text input method, system and computer-readable recording medium using the same
US11099664B2 (en) Talking multi-surface keyboard
GB2421218A (en) Computer input device
CA2591182C (en) Angular keyboard for a handheld mobile communication device
WO2013111213A1 (en) Character input device, character input method, program and terminal device
US20140285443A1 (en) Method and system for keyglove fingermapping an input device of a computing device

Legal Events

Date Code Title Description
STCB Information on status: application discontinuation

Free format text: ABANDONED -- AFTER EXAMINER'S ANSWER OR BOARD OF APPEALS DECISION