WO2017112714A1 - Combination of computer keyboard and computer pointing device - Google Patents

Combination of computer keyboard and computer pointing device

Info

Publication number
WO2017112714A1
WO2017112714A1 (PCT/US2016/067890)
Authority
WO
WIPO (PCT)
Prior art keywords
touch
keyboard
data packet
key
computer
Prior art date
Application number
PCT/US2016/067890
Other languages
English (en)
Other versions
WO2017112714A8 (fr)
Inventor
Michael Farr
Original Assignee
Michael Farr
Priority date
Filing date
Publication date
Application filed by Michael Farr
Publication of WO2017112714A1
Publication of WO2017112714A8

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 — Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 — Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/02 — Input arrangements using manually operated switches, e.g. using keyboards or dials
    • G06F3/0202 — Constructional details or processes of manufacture of the input device
    • G06F3/021 — Arrangements integrating additional peripherals in a keyboard, e.g. card or barcode reader, optical scanner
    • G06F3/0213 — Arrangements providing an integrated pointing device in a keyboard, e.g. trackball, mini-joystick
    • G06F3/03 — Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033 — Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; accessories therefor
    • G06F3/038 — Control and interface arrangements therefor, e.g. drivers or device-embedded control circuitry
    • G06F3/041 — Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/044 — Digitisers characterised by capacitive transducing means
    • G06F3/0445 — Capacitive digitisers using two or more layers of sensing electrodes, e.g. two layers of electrodes separated by a dielectric layer
    • G06F3/0446 — Capacitive digitisers using a grid-like structure of electrodes in at least two directions, e.g. using row and column electrodes
    • G06F3/048 — Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 — GUI interaction techniques using specific features provided by the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 — GUI interaction techniques using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883 — Touch-screen or digitiser techniques for inputting data by handwriting, e.g. gesture or text
    • G06F3/0489 — GUI interaction techniques using dedicated keyboard keys or combinations thereof
    • G06F3/04892 — Arrangements for controlling cursor position based on codes indicative of cursor displacements from one discrete location to another, e.g. using cursor control keys associated to different directions or using the tab key
    • H — ELECTRICITY
    • H03 — ELECTRONIC CIRCUITRY
    • H03K — PULSE TECHNIQUE
    • H03K17/00 — Electronic switching or gating, i.e. not by contact-making and breaking
    • H03K17/94 — Electronic switching or gating characterised by the way in which the control signals are generated
    • H03K17/945 — Proximity switches
    • H03K17/955 — Proximity switches using a capacitive detector
    • H03K17/96 — Touch switches
    • H03K17/962 — Capacitive touch switches
    • H03K17/9622 — Capacitive touch switches using a plurality of detectors, e.g. keyboard
    • G06F2203/00 — Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/048 — Indexing scheme relating to G06F3/048
    • G06F2203/04806 — Zoom, i.e. interaction techniques or interactors for controlling the zooming operation

Definitions

  • the present embodiment relates generally to computer input devices, and more particularly, to a touch sensitive keyboard configured to perform at least one action by combining a touch screen gesture with a plurality of discrete keys without utilizing a conventional pointing device.
  • the preferred embodiment of the present invention provides a keyboard system for displaying at least one action in response to at least one touch input without utilizing a pointing device.
  • the keyboard system includes a touch sensitive keyboard, a touch event processor and at least one graphical interface device (GID).
  • the touch sensitive keyboard in one embodiment includes a plurality of discrete keys and a plurality of sensor pads.
  • each of the plurality of discrete keys includes a key press switch and a keycap where the top of the key incorporates a capacitive touch sensor pad whereby the proximity or direct touch of a finger can be detected.
  • a plurality of sensor pads is arranged around and between each of the plurality of discrete keys of the touch sensitive keyboard.
  • the touch sensitive surface is implemented by deploying a plurality of driver lines orthogonal to a plurality of sensor lines.
  • the purpose of the sensors is to enable the determination of the location and movement of fingers touching the surface of the keys.
  • the invention employs a separate keypress mechanism that registers when keys are depressed as in the normal course of typing.
  • touch event refers to a user's action of touching, tapping, or sliding one or more fingers on the surface of the keys without depressing the keys.
  • keypress refers to completely depressing a key as would traditionally be done in a conventional keyboard to register a conventional key press.
  • a keyboard touch is generated by a touch on at least one of the plurality of discrete keys and optional motion along the surface of the keys, in other words, a touch sensitive gesture.
  • the plurality of sensor pads is configured to transfer the keyboard touch input signal.
  • Each of the plurality of sensor pads includes a via connected to an underlying circuit board, either rigid or, in some embodiments, flexible, which routes the signals from the plurality of sensors either under or around the keys to a processor where the capacitance of each sensor is measured.
  • the top layer of the circuit board is the sensor layer;
  • the second is a ground plane layer which insulates the traces in the routing layer underneath from background interference.
  • Each of the plurality of sensor pads, whether located on the surface of each key or to the side of each key and incorporated into the frame of the keyboard, is connected through a circuit board ground layer to one or more routing layers where circuit traces connect the sensors to the processor.
  • the raw capacitance values of the plurality of sensors are measured and collected by the touch determination module.
  • This module combines this information to determine touches, much as a touchpad or a multitouch mouse does.
  • the touch determination module may need to be modified for each different hardware configuration in order that the output of this module be consistent across hardware implementations.
  • a data packet describing the touch information is sent to a low-level event module
  • the low-level event module combines the touch information into conventional mouse-touch events understood by existing GID systems. This information is described as a translated data packet and includes meaningful pointing device events such as mouseDown, mouseMove, touchBegin, etc. In practice, there is often a stream of data packets (and a corresponding stream of translated data packets) as the user uses the computer.
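The translation from raw touch data packets to conventional mouse-touch events described above might be sketched as follows. This is an illustrative sketch only; the `TouchPacket` class and `translate_packet` function are assumptions, not names from the specification.

```python
from dataclasses import dataclass

@dataclass
class TouchPacket:
    """Hypothetical data packet from the touch determination module:
    finger position on the key surface plus contact state."""
    x: float          # horizontal position on the keyboard surface
    y: float          # vertical position on the keyboard surface
    touching: bool    # finger in contact (without a full key press)

def translate_packet(packet, prev_packet):
    """Translate a raw packet into a conventional mouse-touch event.

    A new contact yields mouseDown, continued contact yields mouseMove,
    and loss of contact yields mouseUp, mirroring existing GID semantics.
    """
    was_touching = prev_packet is not None and prev_packet.touching
    if packet.touching and not was_touching:
        return ("mouseDown", packet.x, packet.y)
    if packet.touching and was_touching:
        return ("mouseMove", packet.x, packet.y)
    if not packet.touching and was_touching:
        return ("mouseUp", packet.x, packet.y)
    return None  # no state change worth reporting

# A short stream of packets, as produced while a finger slides on the keys.
stream = [TouchPacket(1.0, 2.0, True),
          TouchPacket(1.5, 2.0, True),
          TouchPacket(1.5, 2.0, False)]
events, prev = [], None
for p in stream:
    ev = translate_packet(p, prev)
    if ev:
        events.append(ev[0])
    prev = p
# events is now ["mouseDown", "mouseMove", "mouseUp"]
```

In practice the translated data packets would be emitted continuously as the stream of raw packets arrives, rather than batched as shown here.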
  • the translated data packets are sent to the High-Level Event Module, which monitors the output of the low-level event module and provides to the computer new or additional gestures and commands to be defined using combinations of touches, taps and gestures. For example, tapping 'dw' (that is, quickly touching the 'd' and 'w' keys without fully depressing them) might be mapped by the High-Level Event Module to a command to delete the word in which the text cursor resides. Currently no applications respond to "tap events", so the High-Level Event Module recognizes this sequence and will output a mouse double click followed by the key code indicating the delete key was depressed.
  • the High-Level Event Module translates this tap-touch gesture into a series of conventional events: a mouse down event, several mouse move events, a mouse release event and the command key for the copy command. These conventional pointing device and keyboard events are generated by the High-Level Event Module and sent to the user's computer. Thus, the High-Level Event Module enables the definition of new tap touch gestures that generate multiple conventional mouse and keyboard events. Note that the key upon which the tap-touch gesture began may be significant. Tapping 'd' and sliding might delete while tapping 'c' and sliding might copy.
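The High-Level Event Module's expansion of a tap-touch gesture into a sequence of conventional events can be sketched as a lookup table. The gesture encodings and event names below are illustrative assumptions; the patent does not prescribe a concrete representation.

```python
# Hypothetical mapping from recognized tap-touch gestures to sequences of
# conventional mouse and keyboard events that existing applications already
# understand (see the 'dw' delete-word example in the text).
GESTURE_MAP = {
    ("tap", "d", "w"):  ["mouseDoubleClick", "keyDelete"],
    # The key on which a tap-slide begins selects the command:
    ("tap-slide", "c"): ["mouseDown", "mouseMove", "mouseUp", "cmd+C"],
    ("tap-slide", "d"): ["mouseDown", "mouseMove", "mouseUp", "keyDelete"],
}

def expand_gesture(gesture):
    """Return the conventional event sequence for a recognized gesture,
    or None so the translated data packet passes through unchanged."""
    return GESTURE_MAP.get(gesture)

assert expand_gesture(("tap", "d", "w")) == ["mouseDoubleClick", "keyDelete"]
assert expand_gesture(("tap", "q", "q")) is None  # unrecognized: pass through
```

The pass-through case matters: any touch sequence the module does not recognize is forwarded to the computer as ordinary mouse-touch events.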
  • the at least one graphical interface device is configured to display the at least one action performed by the computer.
  • the keyboard input signal is generated by a touch on at least one of the plurality of discrete keys and a touch sensitive gesture.
  • the touch sensitive keyboard enables a user to provide an input through the plurality of discrete keys by a tap-touch.
  • the touch sensitive keyboard enables the keyboard system to perform the at least one action by combining the touch screen gesture or mouse gesture with the at least one of the plurality of discrete keys without utilizing a separate pointing device.
  • Each of the plurality of sensor pads is connected to the routing layer through the sensor layer and the ground plane layer.
  • the at least one touch input may be a low-level event or a high-level event.
  • the low-level event is processed by a low-level event module 11 and the high-level event is processed by a high-level event module 12.
  • the preferred embodiment includes a method for displaying the at least one action in response to the at least one touch input utilizing the keyboard system.
  • the method includes the steps of: providing at least one touch input to at least one of the plurality of discrete keys of the touch sensitive keyboard; enabling the plurality of sensor pads arranged around each of the plurality of discrete keys to transfer the keyboard input signal to the touch event processor installed with the event translation software; enabling the touch determination module at the touch event processor to interpret the keyboard input signal and to generate a data packet; enabling the low-level event module coupled with the touch determination module to receive the data packet, translate it into a translated data packet, and send it to the High-Level Event Module, which, corresponding to the at least one touch input, either passes said translated data packet to the computer or modifies it and passes the modified translated data packet to the computer, thereby instructing the computer to perform at least one action; and enabling the at least one graphical interface device (GID) connected with the touch event processor to display the at least one action performed by the computer in response.
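The method's pipeline of modules can be sketched end to end. All function names here are illustrative stand-ins for the patent's modules, and the packet formats are assumptions.

```python
# Sketch of the claimed pipeline: sensor signal -> touch determination ->
# low-level translation -> high-level module -> computer/GID.
def process_touch_input(raw_signal, touch_determination,
                        low_level, high_level, computer):
    packet = touch_determination(raw_signal)   # raw capacitances -> data packet
    translated = low_level(packet)             # -> conventional mouse-touch event
    final = high_level(translated)             # pass through, or remap a gesture
    computer(final)                            # the GID displays the action

# Wire the stages with trivial stand-in callables to show the data flow.
actions = []
process_touch_input(
    raw_signal={"key": "d", "contact": True},
    touch_determination=lambda s: ("touch", s["key"]),
    low_level=lambda p: ("mouseDown", p[1]),
    high_level=lambda t: t,        # no gesture recognized: pass through
    computer=actions.append,
)
# actions is now [("mouseDown", "d")]
```

Each stage is independent, which matches the text's point that only the touch determination module needs to change per hardware configuration.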
  • a first objective of the present invention is to provide a keyboard system having a touch sensitive keyboard that enables a user to perform at least one action by combining a touch screen gesture with at least one of a plurality of discrete keys without utilizing a separate pointing device such as a mouse.
  • a second objective of the present invention is to provide a keyboard system that matches the ease of use of a pointing device but without the time penalty and other drawbacks involved in repeatedly moving a hand from the keyboard to the pointing device and back.
  • a third objective of the present invention is to add a new layer of higher level events, incorporating tap-touch and slide gestures with positional information, that provides simple, easy-to-remember, high-level commands that are translated into a sequence of mouse-touch events.
  • a further objective of the present invention is to provide a keyboard system that greatly improves the efficiency of computer use.
  • FIG. 1 is a high-level block diagram for displaying at least one action in response to at least one touch input without utilizing a pointing device in accordance with the preferred embodiment of the present invention
  • FIG. 2 is a diagrammatic top view of a keyboard wherein the top surfaces of the mechanical keys are touch sensitive and when taken altogether make up a large touch sensitive area in accordance with the preferred embodiment of the present invention
  • FIG. 3 is a diagrammatic side view of a typical single key in use in accordance with the preferred embodiment of the present invention
  • FIG. 4 is a block diagram of a typical switch configuration for reporting key presses in accordance with the preferred embodiment of the present invention.
  • FIG. 5A is a block diagram of a plurality of sensor layers of a touch sensitive keyboard;
  • FIG. 5B is a block diagram of another embodiment of the plurality of circuit board layers of the touch sensitive keyboard shown in FIG. 5A;
  • FIG. 6 is a block diagram of the touch sensitive components of a touch sensitive keyboard in accordance with an alternate embodiment of the present invention.
  • FIG. 7 is a perspective front view of two typical keys in use in accordance with the present invention.
  • FIG. 8 is a perspective front view of three typical keys in use in accordance with an alternative embodiment of the present invention.
  • FIG. 9 is a diagrammatic top view of a keyboard according to one embodiment of the present invention wherein the keyboard is implemented using a contiguous touch sensitive surface as opposed to discrete mechanical keys;
  • FIG. 10 is a block diagram showing the logic behind interpreting and translating several low-level events generated by the hardware circuitry into events which can be processed by existing applications in accordance with an embodiment of the present invention
  • FIG. 11A is a top view diagrammatic depiction of motion possible by a user's fingers that does not require repositioning the user's hand, in use according to a preferred method of the present invention;
  • FIG. 11B is a diagrammatic depiction of a keyboard layout with discrete left and right mouse buttons according to an alternative embodiment of the present invention;
  • FIG. 12 is a diagrammatic depiction of a keyboard in use according to a preferred method embodiment of the present invention, "tap-touch";
  • FIG. 13 shows a computer display in use according to a method of the present invention
  • FIG. 14 shows a computer display and an exemplary case of the implied shift key
  • FIG. 15 shows a computer display depicting a text insertion pointer's direct motion through text in magnetic mouse mode
  • FIG. 16 shows a computer display depicting a text insertion pointer's second motion through text in a vertical direction in magnetic mouse mode
  • FIG. 17 shows a computer display depicting a pointer's third motion through text in a direction in magnetic mouse mode alongside a depiction of the user's hand during this motion;
  • FIG. 18 shows a computer display depicting a pointer's fourth motion through text in magnetic mouse mode alongside a depiction of the user's hand during this motion
  • FIG. 19 shows a computer display depicting a pointer's fifth motion through text alongside a depiction of the user's hand during this motion
  • FIG. 20 shows a computer display depicting a pointer's sixth motion through text alongside a depiction of the user's hand during this motion
  • FIG. 21 shows a computer display depicting a pointer's seventh motion through text alongside a depiction of the user's hand during this motion
  • FIG. 22 shows a computer display depicting a pointer's eighth motion through text alongside a depiction of the user's hand during this motion
  • FIG. 23 shows a computer display depicting a pointer's ninth motion through text alongside a depiction of the user's hand during this motion
  • FIG. 24 shows a computer display depicting a pointer's tenth motion through text alongside a depiction of the user's hand during this motion
  • FIG. 25 shows a computer display depicting a pointer's eleventh motion through text alongside a depiction of the user's hand during this motion
  • FIG. 26 is a top diagrammatic depiction of a keyboard showing a user's two hands making a first motion
  • FIG. 27 is a top diagrammatic depiction of a keyboard showing a user's two hands making a second motion
  • FIG. 28 shows a computer display depicting a pointer's twelfth motion through text alongside a depiction of the user's hand during this motion
  • FIG. 29 shows the result on a computer display of the text after the user raises her hand after the tap-drag originating on the 'D' key.
  • FIG. 30 shows another embodiment of the present invention, illustrating the sensors located on every space of the touch sensitive keyboard.
  • "GID" (Graphical Interface Device) shall be understood to include computers running Windows or Macintosh operating systems, devices running mobile operating systems such as Apple's iOS or Google's Android, and any other computing devices with a graphical user interface that accept keyboard input as well as pointing device input in the form of mouse events as defined below.
  • "Modifier keys" are the keys on a standard computer keyboard, such as shift, command, option or alt, function, and control, which do not transmit an ASCII code when pressed but which modify the ASCII code transmitted by the remaining keys.
  • “Mouse-touch events” shall be understood to be a catch-all phrase that includes well known pointing device position and button state events as generated by a computer mouse, track ball or other pointing device as well as touch events as generated by a touchscreen, touchpad or similar input device.
  • Low-level user input events or simply "low-level events" refer to mouse-touch events that are closest to the physical actions of the hardware, such as the click of a button or a touch on a touch sensitive surface.
  • high-level events are often combinations of low-level events. They may comprise a gesture and may be associated with a specific meaning or semantics.
  • a swipe in the art may be composed of a touch followed by a quick motion in a single direction while remaining in contact with the touch sensitive surface and finishing with the lifting of the fingers from the touch surface.
  • the semantics of the swipe may be to progress through an ordered collection such as advancing the current page in a document, scrolling a document up or down within a window, or moving a text cursor to the end or beginning of a line of text.
  • Keyboard events shall be understood to include the hardware and software events typically generated by a mechanical, membrane, or virtual keyboard. Keys may be pressed individually, in sequence or in combination with modifier keys.
  • “User input semantics” or just “semantics” shall be understood to be the meaning of user interactions with a GID within the current context of a software application or operating system. For example, the semantics of pressing a key on a keyboard might be to insert the associated ASCII character into a document while a key pressed in conjunction with a modifier key might move the text cursor through the document, delete a line or another command to the editor.
  • a "Drag event" shall be understood to mean a subset of mouse-touch events as defined above that indicates the moving of the pointing device while the mouse button is held down or, in a touch environment, while in continuous contact with the surface.
  • the semantics of a drag event may be the moving of a selected user interface object, the moving of selected text, or the selecting of text, depending on the current program state when the drag event occurs.
  • a "touch sensitive keyboard" shall be understood to mean either (1) a computer keyboard with discrete keys where the tops of the keys form a single touch sensitive surface that can be used as a pointing device in a similar manner to a touchpad or mouse, or (2) a keyboard represented on a touch sensitive pad or touchscreen display (sometimes referred to as a virtual keyboard) where the pressure necessary to register as a mouse-touch event can be distinguished from a greater pressure that registers as a key press.
  • the touch sensitive keyboard may be implemented using a tablet computer as long as it is capable of distinguishing more than one level of pressure.
  • a keyboard system 1 including a touch sensitive keyboard 2, a touch event processor 3 and at least one graphical interface device (GID) 4 is illustrated.
  • the touch sensitive keyboard 2 includes a plurality of discrete keys 7 and a plurality of sensor pads 1004.
  • Each of the plurality of discrete keys 7 includes a key cap 110, a key press switch 120 and a touch sensitive top layer 100.
  • FIG. 2 shows a keyboard 2 with discrete keys 7 where the top surfaces 100 of the keys 7 are touch sensitive and, when taken altogether, make up a large touch sensitive surface.
  • the surface 100 of the keys 7 is thus akin to a large touchpad.
  • the user may touch the surface with one or more fingers and may easily move his or her fingers across the surface. Doing so will generate touch movement signals in a similar manner to a touchpad or touch screen without depressing any key 7 far enough to generate a key press.
  • FIG. 3 shows a typical single key 7, its keycap 110, and indicates the top of the key 7 may be a touch sensitive surface 100, while at the bottom of the key 7 the switch 120 is triggered when the key 7 is fully depressed.
  • FIG. 4 shows a block diagram of a typical sensor configuration for reporting key presses. Key presses are typically detected by a mechanical or membrane switch 10 located under each key 7. Key presses are detected by the key press sensor 20 and converted to standardized electric signals that are transmitted externally to the keyboard, usually by USB (universal serial bus) 70, or alternatively wirelessly through Bluetooth or other networking technologies.
  • the plurality of self capacitive sensor pads 1004 is arranged around each of the plurality of discrete keys 7 of the touch sensitive keyboard 2 and covered with a top cosmetic coating 1000.
  • the plurality of sensor pads 1004 is configured to transfer a keyboard input signal.
  • Each of the plurality of sensor pads 1004 includes a via 1005 connected to a signal trace line back to the touch event processor 3.
  • a plurality of signal trace layers may be employed.
  • the set of signal trace layers when taken together is referred to as the routing layer 1003.
  • the plurality of circuit board layers includes a top sensor layer 1001, a ground plane layer 1002 and a routing layer 1003.
  • Each of the plurality of sensor pads 1004 appears on the top sensor layer and is connected to the routing layer 1003 through the ground plane layer 1002.
  • the routing layer 1003 connects each of the plurality of sensor pads 1004 with the touch event processor 3.
  • the touch sensitive surface 100 on each of the plurality of discrete keys 7 and the top cosmetic coating 1000 of the plurality of sensor pads 1004 combine to form a touch sensitive surface on the touch sensitive keyboard 2.
  • the sensor pads 2005 are placed on the tops of the keys, just under a thin key cap or in some embodiments, the sensor pads 2005 are fabricated on top of the key and then coated with an insulating layer.
  • the frame of the keyboard is used to house circuit traces 2002 to route the electrical signals from the key sensor pads around the key wells 2006 to be processed.
  • a flexible conducting connector bridges the conductive sensor pad on the key to a connection on the keyboard frame 2004.
  • the end of each flexible connector connects through a via in the ground plane 2001 to the circuit trace layer below.
  • multiple circuit board layers may be utilized to carry the signal given the narrowness of the passage between the keys on the keyboard. In some embodiments, these separate layers are separated by ground planes to minimize or prevent signals from interfering with each other.
  • the touch event processor 3 is coupled to a memory unit wherein the event translation software enables the processor 3 to perform a set of predefined actions corresponding to at least one touch input on the touch sensitive keyboard 2.
  • the touch event processor 3 receives the keyboard input signal from the plurality of sensor pads 1004 through the plurality of sensor layers.
  • the keyboard input signal is generated by a touch on at least one of the plurality of discrete keys 7 and optionally a drag event.
  • the touch event processor 3 includes a touch determination module 5, a low-level event module 11 for generating a data packet that describes the mouse-touch events, and a high-level event module 12 which monitors the stream of data packets and either sends the command directly to the computer or further modifies the translated data packet into a modified translated data packet, which is then sent to the computer.
  • in some cases, the high-level event module 12 passes the translated data packet directly.
  • in other cases, the high-level event module 12 modifies the translated data packet prior to it being sent.
  • the computer is instructed to perform an action, and the at least one graphical interface device 4 is configured to display the at least one action.
  • FIG. 6 shows a block diagram of the touch sensitive components of a touch sensitive keyboard 2.
  • An alternative method of implementing touch sensitive surfaces is by deploying a plurality of driver lines 70 orthogonal to a plurality of sensor lines 60. Each of the plurality of driver lines 70 and each of the plurality of sensor lines 60 is separated by a non-conductive film 80 so that no physical connection between driver 70 and sensor lines 60 is made. And each of the plurality of driver lines 70 in sequence is momentarily connected to a voltage source which induces capacitance in each of the plurality of sensor lines 60 where the driver 70 and sensor lines 60 cross. This capacitance can be measured.
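The scan loop implied above (energizing each driver line in turn and reading every sensor-line crossing) can be sketched as follows. The function names and the example capacitance values are illustrative assumptions; `measure_capacitance` stands in for the analog front end.

```python
# Sketch of a mutual-capacitance grid scan: driver lines orthogonal to
# sensor lines, separated by a non-conductive film, with capacitance
# induced and measured at each crossing.
def scan_grid(n_drivers, n_sensors, measure_capacitance):
    """Return an n_drivers x n_sensors matrix of capacitance readings."""
    frame = []
    for d in range(n_drivers):
        # Momentarily connect driver line d to the voltage source, then
        # sample every sensor line where it crosses driver line d.
        row = [measure_capacitance(d, s) for s in range(n_sensors)]
        frame.append(row)
    return frame

# A finger near a crossing changes the measured capacitance at that cell.
baseline = scan_grid(4, 4, lambda d, s: 100)
touched  = scan_grid(4, 4, lambda d, s: 80 if (d, s) == (1, 2) else 100)
delta = [[baseline[d][s] - touched[d][s] for s in range(4)]
         for d in range(4)]
# delta is nonzero only at the touched crossing (1, 2)
```

Comparing each frame against a baseline frame, as above, is one common way such a grid localizes touches; the patent itself only requires that the induced capacitance "can be measured".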
  • driver and sensor lines may be deployed under each key cap so that the position of a touch within the key can be determined.
  • driver 140 and sensor lines 130 are shown attached to the undersurface of the keycap 150.
  • FIG. 8 depicts an alternate embodiment wherein the driver 160 and sensor lines 170 may be embedded in a flexible cloth or film, 180, that is bonded to the top and sides of the keys 7 with enough slack between keys 7 to allow the keys 7 to be pressed.
  • FIG. 9 shows a functionally similar setup implemented using a touchpad without discrete keys 7.
  • the key positions shown are thus drawn on the pad or displayed on the touchscreen.
  • the keys 7 may be contoured or equipped with force feedback to give the user some tactile sense as to the position, extent, and sensitivity of the key areas.
  • the surface must be capable of distinguishing between a light touch for generating mouse-touch events and a heavier touch to generate key press events. This may be accomplished in a plurality of ways, including a series of dome or membrane switches deployed under the touch surface; alternatively, the touch sensitive surface itself may be capable of detecting a light touch for generating mouse-touch events and a heavier touch to generate key presses.
  • the surface is a contiguous touch sensor, functioning both as a touch surface and keyboard 2.
  • a tablet computer may be used as the keyboard 2 with the keys 7 represented graphically on the screen.
  • the tablet must be able to sense and provide feedback to the user as to the difference between touching the surface and a harder touch that corresponds to a keypress.
  • One way of achieving this is to measure the size of the touch area. A light touch will produce an area in contact with the surface that is smaller than a heavier touch meant to serve as a key press.
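The contact-area heuristic above can be stated in a few lines. A minimal sketch, with an assumed threshold value that would need per-device tuning:

```python
# Sketch of the contact-area heuristic described above: a light touch
# produces a smaller contact patch than a deliberate key press.

KEYPRESS_AREA_MM2 = 60.0  # assumed boundary between touch and press; tune per device

def classify_contact(area_mm2):
    """Classify a contact patch as a mouse-touch or a key press by its size."""
    return "key_press" if area_mm2 >= KEYPRESS_AREA_MM2 else "mouse_touch"
```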
  • the touch sensitive keyboard 2 connects to the computer via a common USB, Bluetooth, or other networking connection.
  • the touch sensitive keyboard 2 enables a user to provide an input through the plurality of discrete keys 7 by a tap-touch.
  • the touch sensitive keyboard 2 enables the keyboard system 1 to perform the at least one action by combining the touch screen gesture with the at least one of the plurality of discrete keys 7 without utilizing an external pointing device.
  • FIG. 10 depicts a block diagram showing the logic behind interpreting and translating the raw sensor outputs from multiple sensors into low-level events and then further to high-level events as defined above. These low-level events are events which can be processed by existing applications in accordance with an embodiment of the present invention.
  • the high-level event module 12 also monitors low-level events output by the low-level event module 11. Additional information provided by the touch
  • the at least one touch input can be a low-level event or a high-level event.
  • the low-level event is processed by a touch determination module 5 and the low-level event module 11 and the high-level event is processed by a high-level event module 12.
  • touch sensors 8 sense touch events, tap-touch, and gesture input, while a key press sensor 9 senses key press inputs.
  • each keypress also generates an initial touch
  • a touch by itself has no semantics and generates no output unless it is part of a tap, tap-touch, double tap, drag, or three-fingered touch.
  • a mouse down is generated by a double tap.
  • a tap consists of a touch followed immediately by a release. Any key press immediately following a tap on the same key removes the touch from consideration. Thereafter, the gesture input and the key press inputs are transmitted to the touch event processor 3.
  • the low-level event module 11 receives a data packet describing the inputs, and using event translation software translates the data packet to a translated data packet and transmits the translated data packet to the high-level event module 12.
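The pipeline described above — raw packet in, translated packet out, with the high-level module passing it through or modifying it — can be sketched as plain functions. All field names and the gesture tag are illustrative assumptions, not taken from the patent:

```python
# Minimal sketch of the event pipeline described above: the touch
# determination module emits a raw data packet, the low-level event module
# translates it, and the high-level event module either passes the
# translated packet through or modifies it before it reaches the computer.

def low_level_translate(packet):
    """Translate a raw sensor packet into a standard event description."""
    kind = "key_press" if packet["pressed"] else "mouse_touch"
    return {"kind": kind, "key": packet["key"], "pos": packet["pos"]}

def high_level_process(event, send_to_computer):
    """Pass a translated event through, or modify it when it forms a gesture."""
    if event["kind"] == "mouse_touch" and event.get("gesture") == "tap_touch":
        event = {**event, "kind": "mouse_down"}   # tap-touch becomes mouse-down
    send_to_computer(event)
```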
  • a keyboard-mouse driver logic 13 coupled with an operating system 14 in the graphical interface device 4 displays the at least one action performed by the high-level event module 12. In this way, the user can enter data through the touch sensitive keyboard 2 without utilizing the pointing device.
  • the low-level events generated by the hardware circuitry employed in a mechanical keyboard 2 design with touch sensitive keys 7 may be quite different from those generated by a keyboard 2 implemented on a touchpad or touchscreen tablet.
  • a touch pad or tablet device often generates absolute position information: a touch at the lower left of the input device is mapped directly to a position on the lower left of the user's display in a linear fashion. Additionally, touches hard enough to trigger a key press event will have to be mapped from an x-y position to the key 7 presented at that location.
  • the present invention translates all low-level events from absolute to relative motion events in the manner of a mouse or track pad and not a touch screen display: that is, a new position is determined by the previous position modified according to the direction and speed of the user's fingers.
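The absolute-to-relative translation above amounts to accumulating finger deltas onto the previous cursor position. A minimal sketch (the class and its gain parameter are illustrative, not from the patent):

```python
# Sketch of the absolute-to-relative translation described above: the new
# cursor position is the previous position plus the finger's movement,
# rather than a direct mapping of the finger's absolute location.

class RelativeMotion:
    def __init__(self, x=0, y=0, gain=1.0):
        self.cursor = (x, y)
        self.last_touch = None
        self.gain = gain            # speed scaling, as on a mouse or trackpad

    def touch(self, tx, ty):
        """Feed an absolute touch sample; returns the updated cursor position."""
        if self.last_touch is not None:
            dx = (tx - self.last_touch[0]) * self.gain
            dy = (ty - self.last_touch[1]) * self.gain
            cx, cy = self.cursor
            self.cursor = (cx + dx, cy + dy)
        self.last_touch = (tx, ty)
        return self.cursor

    def release(self):
        self.last_touch = None       # next touch sets a new reference point
```

Releasing and re-touching anywhere on the surface continues motion from the cursor's current position, exactly as on a trackpad.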
  • the preferred embodiment includes a method for displaying the at least one action in response to the at least one touch input utilizing the touch sensitive keyboard system 1. The method commences by providing the at least one touch input to the at least one of the plurality of discrete keys 7 of the touch sensitive keyboard 2. Next, the plurality of sensor pads 1004 arranged around each of the plurality of discrete keys 7 is enabled to transfer the keyboard input signal to the touch determination module 5 within the touch event processor 3.
  • the touch determination module 5 is enabled to interpret the keyboard input signal and to generate the data packet which is sent to the low-level event module 11 for translation.
  • the translated data packet is sent to the high-level event module, which receives the translated data packet corresponding to the at least one touch input and either passes said translated data packet to the computer or modifies said translated data packet and passes the modified translated data packet to the computer, thereby instructing the computer to perform at least one action.
  • the at least one graphical interface device 4 connected with the touch event processor 3 is enabled to display the at least one action.
  • lateral motion across the tops of keys 7 is not something users generally do today. Such a motion has no existing user interface semantics and is distinct from a key press and so the system is free to interpret it as mouse motion with low chance for error.
  • the low-level event logic therefore ignores random mouse-touch events but responds to movements across the surface of the keys 7 to indicate motion events in the manner of an Apple multi-touch trackpad as used in macOS and other Apple operating systems.
  • the position (or mouse) cursor may move anywhere a conventional mouse or trackpad might move it.
  • hysteresis is employed: small movements that do not occur as part of a longer movement are ignored, that is to say a threshold of distance must be crossed before motion is reported. This is to eliminate random movements generated by a user as he/she types or randomly lifts or rests her fingers on the keyboard surface without intention to generate mouse events.
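The hysteresis rule above can be sketched as a small filter. The threshold value is an assumption for illustration:

```python
# Sketch of the hysteresis rule described above: motion is not reported
# until the finger has moved beyond a distance threshold from where it
# first landed, filtering out incidental contact while typing.

import math

class HysteresisFilter:
    def __init__(self, threshold=5.0):   # threshold distance is an assumption
        self.threshold = threshold
        self.origin = None
        self.active = False

    def sample(self, x, y):
        """Return (x, y) once the threshold is crossed, else None."""
        if self.origin is None:
            self.origin = (x, y)         # first contact: remember where it landed
            return None
        if not self.active:
            dist = math.hypot(x - self.origin[0], y - self.origin[1])
            if dist < self.threshold:
                return None              # small movement: ignore
            self.active = True           # threshold crossed: start reporting
        return (x, y)

    def release(self):
        self.origin, self.active = None, False
```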
  • Method 1 Simply include mouse buttons on the keyboard 2 below the space bar as shown in FIG. 11B. This is common on existing laptops that do not implement mouse clicks through their trackpad.
  • Method 2 The system uses a gesture to indicate the beginning of mouse events.
  • tap-touch we define "tap-touch", as shown in FIG. 12.
  • the user would raise at least one finger off the touch sensitive surface of the keyboard 2, then quickly tap the surface of a key or keys, 240, followed immediately by a touch, 250, in the same position.
  • This gesture can be distinguished from random touches. If tap-touch is followed by a drag, then a selection is created from the position of the cursor as with a three-fingered tap drag in the Apple macOS operating system
  • tap-touch replaces the movement of the hand from the keyboard 2 to the mouse or track pad, the pressing of the mouse button and returning of the hand to position on the keyboard and as such is an improvement in efficiency and speed.
  • the touch event processor 3 of the keyboard 2 recognizes the tap-touch gesture translating it to the equivalent mouse-down event.
  • tap-touch-drag is defined as tap-touch with motion before raising the fingers again.
  • the semantics defined for this gesture are usually selecting text, or selecting and moving a graphical object on the display. This is equivalent, and in most circumstances will be translated to mouse-down and mouse-move events. Raising the fingers at the end of the drag is equivalent to releasing a mouse button.
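The tap-touch and tap-touch-drag translations above can be sketched as a small recognizer over raw contact events. The event tuple format and the timing constant are assumptions for the example:

```python
# Sketch of tap-touch recognition as described above: a tap (touch then
# quick release) followed immediately by a touch in the same position is
# translated to a mouse-down; motion before the final release makes it a
# tap-touch-drag (mouse-move events); raising the fingers is a mouse-up.

TAP_MAX_S = 0.25   # assumed maximum gap between the tap and the re-touch

def recognize(events):
    """events: list of (time, kind, pos) with kind in {'down', 'up', 'move'}.
    Returns the translated mouse event names."""
    out = []
    i = 0
    while i + 2 < len(events):
        t0, k0, p0 = events[i]
        t1, k1, p1 = events[i + 1]
        t2, k2, p2 = events[i + 2]
        if (k0, k1, k2) == ("down", "up", "down") and p0 == p2 \
                and t2 - t1 <= TAP_MAX_S:
            out.append("mouse_down")           # tap-touch recognized
            for t, k, p in events[i + 3:]:
                if k == "move":
                    out.append("mouse_move")   # tap-touch-drag
                elif k == "up":
                    out.append("mouse_up")     # fingers raised: release
                    break
            break
        i += 1
    return out
```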
  • Method 3 Holding the shift key down may be defined to create or extend a selection, when there is a text insertion point in a block of text, from the insertion point to the cursor position, and to continue modifying the selection when followed immediately by a mouse-drag event.
  • the semantics are similar to holding a shift key while using the arrow keys to select text in existing applications, or using the shift key with a trackpad. This is as opposed to holding the shift key followed by a key press, which simply generates the shift value of the pressed key, e.g. 'a' to 'A'.
  • FIG. 13 with the text insert cursor at the position denoted by 300, the shift key is pressed and held down. The user drags across the keyboard until position 310 where the shift key is released and the desired text is selected.
  • Method 5 Using the shift key as a mouse button outside of text is equivalent to a mouse click with the shift key down. This is referred to as an implied shift key.
  • the user has momentarily pressed the shift key. Each time, a new graphic object is added to the selection. Alternatively, the user could have clicked a mouse button if one were provided on the keyboard 2, or tapped two fingers on the keyboard 2 with the shift key held to accomplish the same thing as described in the above-mentioned method.
  • To release an item from the selection the user simply presses the shift key while the mouse is over the item, (or taps with two fingers, or clicks the mouse button) while the shift key is held.
  • To release all items simply tap two fingers, mouse button click, or press and release the shift key while the mouse cursor is over empty space in the document. Note that like many alternatives of the preferred embodiment, this application of the shift key is an optional optimization and is not required.
  • the low-level and high-level event logics receive the keyboard input signals from the touch sensitive keyboard 2 and translate them into standard mouse- touch events and, when appropriate, higher level gestures.
  • Existing applications and operating systems may use these higher-level events from the keyboard 2 with no need for modification to the software.
  • the event translation software is configured to enable the touch sensitive keyboard 2 to operate in a plurality of modes, including but not limited to a magnetic cursor mode and a magnetic mouse mode. The plurality of modes is described below:
  • An additional innovation of the present invention is the capability it presents to translate some mouse-touch events into a combination of mouse-touch events, keyboard command-key events and arrowKey events.
  • the "magnetic cursor" mode is a technique to further reduce the number of mouse events that need to be specifically generated by the user, thereby making interaction quicker and more efficient. With magnetic mode selected, certain events are generated automatically for the user. In this case, when a user has a text insertion cursor in some block of text, the text cursor and mouse cursor are combined. Moving the mouse-pointer across text (by dragging across the surface of the touch sensitive keyboard's 2 keys 7) will advance the text insertion cursor through the text as shown in FIG. 15.
  • a slow but deliberate leftward motion across the touch sensitive keyboard 2 is moving the text insertion cursor leftward through the text one character at a time as shown in FIG. 15. Unlike a typical use of a mouse, no mouse click is required at the end of the mouse movement to set a new insertion point.
  • FIG. 16 shows a similar motion through text in a vertical direction in magnetic mouse mode. A slow upward drag movement from 500 to 510 is illustrated. In fact, the low-level event logic may choose to translate this motion into multiple clicks of the up arrow.
  • FIG. 17: With the magnetic mouse mode active, a faster movement across the surface of the keys 7 is represented using the mouse cursor, which moves smoothly without stopping to show the text insertion cursor at every possible intermediate position, starting from the existing text insertion point at 600 vertically to release at 610.
  • the rapid motion causes the appearance of the mouse cursor.
  • a mouse cursor is added when a touch sensitive keyboard is attached. This may require support from an application running on a tablet or from the tablet operating system itself, but would allow the tablet to be used for serious text processing without the fatigue and inefficiency of moving one's finger from the keyboard to the tablet surface and back for each touch input.
  • Rather than appearing at the previous position of the mouse cursor, the mouse cursor appears at the text insertion point and moves from there, as opposed to the common practice of the day.
  • the preferred embodiment is partially combining and more tightly relating the text insertion cursor and mouse cursor by sharing gestures.
  • the raising of the dragging finger(s) off the surface causes a mouse Up event to be generated at the new position of the cursor.
  • the mouse cursor is past the end of a line when released.
  • the text insertion cursor is placed in the nearest position where text may be entered, the same as it would be for a standard mouse click past the end of the existing line.
  • the position of the mouse cursor is updated to share the position of the text cursor and is made invisible when the text cursor is displayed.
  • Magnetic mouse mode includes efficiencies that save events. Using a conventional pointing device, the user would need to move a hand to the pointing device, move the mouse to the desired location, click the mouse, return the hand to the keyboard 2 and continue typing. Using cursor control with arrow keys would require moving of the hand and an additional six key presses. Using a command key combination might avoid moving a hand if the required modifier keys didn't require repositioning but would still require six key presses. In magnetic mouse mode, the user leaves both hands in place, keeps her eyes on the display and simply swipes upwards until the proper position is reached and continues typing. This requires a single gesture instead of four or more and no loss of attention on the working document.
  • tap-touch-hold over a selection does the same (tap-touch-release would still remove the selection and create a new insertion point in text, or mouseUp while not in text).
  • FIG. 18 shows the result of a rapid diagonal motion from 720 to 730 by the user. This causes the mouse cursor to appear and move from 700 to 710 followed by a click at 730 when the fingers are released which places the text insert cursor at the new location.
  • magnetic mouse mode is most useful in applications such as text editors or word processors where the predominance of the time is spent entering and moving about in text and where pushing a text insert cursor through text makes sense.
  • the low-level event logic 11 may read motion events and then generate arrow key events in the case when the motion is slow. Alternatively, sensing a rapid motion, it may generate motion events and an automatic mouse click event at the end. The effect is to give the user the sense of pushing a text insert cursor through the text and it is this visual metaphor which makes this novel computer interaction intuitive. It is as if the pointing cursor was "magnetic" and with the pause in motion, sticks to the underlying virtual object.
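The slow/fast dispatch described above can be sketched as a single decision. The speed threshold and event names are assumptions for illustration, not values from the patent:

```python
# Sketch of the magnetic-mode dispatch described above: a slow drag is
# translated into arrow-key events that step the text cursor, while a fast
# drag becomes mouse motion followed by an automatic click at the release
# point, "sticking" the cursor to the underlying virtual object.

FAST_SPEED = 200.0   # assumed speed boundary between slow and fast motion

def magnetic_dispatch(dx, dy, speed):
    """Translate one drag segment into a list of low-level events."""
    if speed < FAST_SPEED:
        # slow, deliberate motion: emit one arrow key per unit moved
        key = ("right" if dx > 0 else "left") if abs(dx) >= abs(dy) \
              else ("down" if dy > 0 else "up")
        return [f"arrow_{key}"] * max(abs(int(dx)), abs(int(dy)))
    # fast motion: smooth mouse movement plus an automatic click at the end
    return ["mouse_move", "mouse_click"]
```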
  • FIG. 21 illustrates a longer, very rapid movement across two keys from 920 to 930, which causes the text insert cursor to advance from 900 to the beginning of the next word at 910.
  • FIG. 22 shows a rapid two-key movement vertically from 1000 to 1010, which causes the text insert cursor 1020 to move upward to the previous beginning of a block 1030, in this case the block where the text insert cursor already resided. On repetition of the gesture the text insert cursor would move to the beginning of the next block above.
  • a long swipe right, 1130, when there is a text insert cursor has the meaning of moving the cursor all the way to the right side of the current text line, 1100 to 1110. Similarly defined, a long swipe left will move the text input cursor to the left side of the line.
  • a long, fast swipe in any direction "throws" the text cursor in the direction of the motion.
  • the throw distance is settable by a user preference and may cause the document to scroll if so desired. This enables the user to gesture from 1230 to 1240 and to move the cursor from 1200 to 1210 with a "glide". This allows the user in a large document to throw the cursor a distance greater than he/she could drag the cursor in one stroke. The user can then precisely position the cursor with another stroke.
  • an upward swipe, 1330 where the fingers remain on the top row of keys, 1340, will move the cursor, 1300, to the top of the display, 1310, and continue scrolling upwards with the cursor through the text until the touch is released.
  • downward swipes that are held in the bottom row and horizontal swipes that are held at the left and right extremes of the keyboard 2 for panning right and left.
  • the event translation software is configured to enable the touch sensitive keyboard to operate with a plurality of commands to perform the at least one action.
  • the plurality of commands includes but is not limited to a scrolling and a panning gesture command, a double hand gesture command, a tap key command and a tap gesture command.
  • three-finger gestures are defined to generate the native platform's commands for scrolling or panning and selection.
  • the system may define continuous scrolling or panning of the window contents (rather than the moving of a cursor through the document as described above.) Moving the appropriate number of fingers to the edge of the draggable area (again, typically one more finger than for mouse motion) on the touch sensitive keyboard 2 and pausing there may be defined to cause continuous scrolling in the indicated direction while the hand remains at the edge.
  • Tablet computers may define a two-fingered pinch gesture to zoom in or out, using two fingers of the same hand and drawing them closer together or further apart.
  • a tablet may define a rotation gesture. These may also be deployed by the touch sensitive keyboard 2 but the user's hand position on the keyboard 2 and the possibility of the user resting her fingers on the keyboard 2 makes it easier to define similar gestures by using two hands.
  • the gestures are similar except that one or two fingers of each hand are brought closer together or further apart to indicate zooming as shown in FIG. 26, while moving the fingers of each hand in opposite directions indicates rotation as indicated at FIG. 27.
  • the force of the tap can be measured when the hardware is capable of such measurements, and the immediate raising of the finger inherent in a tap can be detected as well.
  • Tapping 'b' might move backwards by a word.
  • Tapping 'o' might open a new line after the current one without having to first go to the end of the line and hit return.
  • a touch sensitive keyboard 2 can be used to combine key commands with mouse commands. This is possible because the touch sensitive keyboard is both keyboard 2 and mouse, and taps always take place on top of a key. Touching a modifier key while tap-touching distinguishes a mouse-touch event from a tap command, for example, touching but not depressing a shift key. In the same way that depressing the shift key modifies the key code sent when pressing a key (usually to send the uppercase version of the letter associated with the key), touching the shift key modifies the event to send a tap command.
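The three-way distinction above — shift depressed, shift merely touched, or neither — can be sketched directly. The function and its return tags are illustrative assumptions:

```python
# Sketch of the modifier-touch rule described above: a tap is interpreted
# as a tap command when the shift key is being touched (but not depressed)
# at the same time; a depressed shift gives the normal shifted character;
# otherwise the tap is an ordinary mouse tap.

def interpret_tap(key, touched_keys, pressed_keys):
    """touched_keys / pressed_keys: sets of keys currently touched / depressed."""
    if "shift" in pressed_keys:
        return ("shifted_key", key)      # normal shifted character, e.g. 'a' -> 'A'
    if "shift" in touched_keys:
        return ("tap_command", key)      # e.g. shift-tap 'd' as a delete command
    return ("mouse_tap", key)
```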
  • Tapping on a key 7 can also be combined with gestures. For example, tap-touch on the 'd' key while touching the shift key, (shift-tap-touch) with a single finger and sliding left may be defined to delete characters to the left of the text insert cursor. Similarly, as shown in FIG. 28, shift-tap-touch-slide right deletes characters to the right of the text insert cursor.
  • FIG. 29 shows the result of the text after the user raises her hand after the tap-drag.
  • shift-tapping 'd' twice ('dd') is defined to delete a line
  • shift-tapping twice and then dragging up or down might be defined to delete line by line upwards or downwards respectively.
  • Tapping of different letters can be combined. Tapping 'dw' might be defined to delete the word at the text insert cursor; 'dw'-slide might delete by word. Shift-tapping enables an easily accessible and intuitive way to define easy-to-remember commands by leveraging the difference between touching and depressing a key, along with the fact that every mouse-touch event also occurs on a particular key.
  • gestures may be modified to increase functionality or ease of use, while supporting and taking advantage of not having to move one's hand to the pointing device.
  • the following gestures are single-finger gestures unless otherwise noted. These include but are not limited to:
  • a quick double tap generates a double click— (same as Apple trackpad)
  • a quick two-finger tap is a right mouse click— (same as Apple trackpad)
  • option-quick tap is also a right mouse click— (same as Apple trackpad)
  • tap-tap-touch-drag selects a word and continues selecting by word as the touch is dragged (diff from Apple)
  • tap-tap-touch-drag is equivalent to a three-fingered double tap-drag on a Macintosh multi-touch track pad
  • mouseDown (same as Apple)
  • tapping a key while touching the Shift key sends a tap command if one is defined for the key.
  • shift-touch-tap-d can delete in any direction. This command has to be processed by the high-level event module, which issues the appropriate low-level events.
  • g. preceding a tap by a tapped numerical value executes the command the indicated number of times, e.g. tapping 4 before a tap command will execute the tap command 4 times.
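The numeric repeat prefix above can be sketched as a small expander over the tapped characters. The command table passed in is a hypothetical mapping for illustration:

```python
# Sketch of the numeric repeat prefix described above: digits tapped before
# a tap command accumulate into a repeat count, so tapping 4 before a
# command executes that command four times.

def expand_taps(taps, commands):
    """taps: sequence of tapped characters; commands: key -> action name."""
    out, count = [], ""
    for ch in taps:
        if ch.isdigit():
            count += ch                                  # accumulate a repeat count
        elif ch in commands:
            out += [commands[ch]] * (int(count) if count else 1)
            count = ""                                   # count applies to one command
    return out
```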
  • tap events which may be defined for actions typical to a word or text editor.
  • Other suitable tap events may be employed. Below, a letter or letters in single quotes indicates a tap on the associated key or keys.
  • a repeated letter or groups of letters indicates repeated taps on the associated key(s).
  • the symbol '#' indicates a number generated by tapping on the number keys on the keyboard.
  • a single tap or multiple taps while holding the touch on the last tap may be defined to repeat the associated command until released.
  • a. 'h', 'j', 'k', 'l' tap command alternative for cursor motion.
  • Other combinations of letters are possible in software such as games and may be defined or customized.
  • Tap on the associated key to move the cursor in the associated direction. For the given example keys used within text, tapping 'h' moves the text cursor one character to the left; 'k', one line up; 'j', one line down; 'l', one character to the right. Tapping 'j' and 'k' may also be used to move through lists or menus.
  • k. 'd' swipe: delete in the direction of the swipe to the end of a logical unit in that direction, e.g. 'd' swipe right might delete to the end of the line
  • t. '{': move to the beginning of the current section or the beginning of the preceding section if already at the beginning of a section. That which constitutes a section is defined by the application. For example, a programmer's text editor might go to the beginning of the current code block, which is delimited in many programming languages with the character '{'. u. '}': move to the end of the current section
  • Taps on keys may be defined to have application specific meanings.
  • applications define command keys equivalents to use for menu shortcuts or other functions specific to the application.
  • a keyboard utility can be used to map these functions to tap commands.
  • an application can support tap commands directly, as well as tap-gesture commands. For example, in a graphics editor, tapping 'b' and dragging might draw with the brush tool, while tapping 'p' and dragging might draw with the pencil tool.
  • the novelty lies in the ability to distinguish a gesture which begins with one key from the same gesture that begins on a different key.
  • a mode is a state in which keys 7 typed on the keyboard are interpreted differently than the same keys typed while in a different mode. For example, using the text editor VIM, the user switches between "command mode" and "text insert mode". In command mode, each key is interpreted as being part of a command. In text insert mode, keys typed are interpreted as characters to be inserted into the file. In real world editing, commands and text tend to group together: one tends to type a few commands, then some text, then some commands.
  • editing often involves issuing commands to position the text cursor properly, followed by a stream of text that is entered into the file, then more commands to go to another location.
  • a command to enter command mode where all keys are commands.
  • a command to switch to text insert mode can be issued and thereafter all key presses will be interpreted as text.
  • the most frequent and damaging mistake with modal editors is the user being in one mode when he or she thought the other mode was selected.
  • a modified VIM could be created that has but a single mode.
  • the VIM commands are defined as taps on the surface of the keys 7 normally used for commands in command mode. Normal typing is always text insert mode.
  • pressing the 'w' key always enters the 'w' character while tapping 'w' will move the cursor to the right one word.
  • the user is never stuck in the wrong mode as a tap is always a command while a key press always enters text.
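The single-mode dispatch described above reduces to one branch: a press always inserts, a tap always commands. A minimal sketch, with an illustrative command table loosely following the VIM bindings mentioned in the text:

```python
# Sketch of the single-mode editor described above: a key press always
# enters the character, while a tap on the same key always runs the
# associated command, so the user can never be stuck in the wrong mode.

TAP_COMMANDS = {"w": "word_forward", "b": "word_backward"}  # illustrative bindings

def handle_key(kind, key):
    """kind is 'press' (insert text) or 'tap' (run a command)."""
    if kind == "press":
        return ("insert", key)                      # typing always enters text
    return ("command", TAP_COMMANDS.get(key, "none"))
```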
  • the low-level events are those which map closely to the actions of the hardware, e.g. a key press, a mouse click.
  • High-level events are often combinations of low-level events, e.g. a fast swipe to the right initiated on the 'd' key is a combination of several low-level events that might be defined with the semantics "delete one character to the right of the cursor".
  • This high-level event could be sent by the keyboard 2 to the GID 4 in different ways.
  • the simplest event to send might not be a gesture or a series of mouse-touch events but instead it might be more concise and accurate to translate this gesture to a press of the forward delete key.
  • tap 'd' and drag to delete may be directly defined and supported by an application or may be mapped into a sequence of existing mouse-touch and keyboard events with meaning to the operating system or application. For example, shift-tap-dragging from the 'd' key might generate a mouse-down and drag series of events to select text, followed by mouse-up event and a final press of the delete key.
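The second mapping above — expanding the gesture into conventional events that any unmodified application understands — can be sketched as a simple expansion. Event names are illustrative:

```python
# Sketch of mapping a high-level gesture onto existing low-level events, as
# described above: shift-tap-drag from the 'd' key is expanded into a
# mouse-down, drag, mouse-up selection followed by a press of the delete
# key, so applications need no modification to consume it.

def expand_delete_drag(start_pos, end_pos):
    """Expand a 'd'-tap-drag gesture into a sequence of conventional events."""
    return [
        ("mouse_down", start_pos),   # begin the selection at the gesture start
        ("mouse_move", end_pos),     # drag to extend the selection
        ("mouse_up", end_pos),       # finish the selection
        ("key_press", "delete"),     # delete the selected text
    ]
```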
  • the low and high-level event manager software may be able to learn from the user's interaction to extend or contract the distance required to distinguish certain gestures from each other, for example allowing a longer drag for character positioning than simply one key distance.
  • Each user has different sized hands and finger reach. When a gesture is followed by smaller counter gestures to correct it, it may be possible to recognize this as an error in interpretation and correct it to fit the user.
  • gestures may benefit from training or user customization. Having the user repeat each gesture several times can allow the recognition software to adjust to the individual user. For example, a programmer might define a gesture and associate that with a compile and build command.
  • a keyboard 2 may be customized to an individual as well as the tasks that the individual performs.
  • a bicycle is fit to an individual bike racer's size and strength while the design of the bike varies for terrain, e.g. a road bike versus a mountain bike.
  • a person with large hands and thick fingers can certainly benefit from a different keyboard than that tailored to someone with small hands and thin fingers.
  • By eliminating the need for a trackpad entirely, the touch sensitive keyboard opens up space for customization.
  • the touch sensitive keyboard and customization of keyboard layout and key sizes can achieve greater efficiency than ever before.
  • the bracket keys may be moved between the 'g' and 'h' keys.
  • the caps lock key, a large and importantly placed key which nevertheless has little utility in programming, may instead be used as a different, more utilized key, such as the control key, with a different key being used for caps lock, or with a combination such as a double tap on the caps lock key required to activate/deactivate the caps lock function.
  • the sensors are located between the keys of a keyboard.
  • the sensors are shown as vertical lines 1400 between each of the keys, while the wiring for the sensors is shown in horizontal lines 1410 between rows of keys.
  • the sensors are located on every space of the keyboard that has a key on its left and right; in other embodiments, not shown, fewer sensors are placed between keys.
  • sensors may be located on the keys and between the keys. To further improve on these embodiments with sensors between the keys, additional sensors may be present to determine the height of a finger over the keys or for detecting a tap gesture so as to filter out meaningful touch movements versus the user simply resting his or her fingers on the keys.
  • a non-transitory computer-readable medium comprises computer-executable instructions stored therein for causing a computer to implement a program executable on a keyboard system 1 for a method for displaying at least one action in response to at least one touch input.
  • the non-transitory computer readable storage medium may comprise a memory card, USB-Drive, CD-ROM or flash memory to list but a few.
  • a non-transitory computer-readable medium comprises computer-executable instructions stored therein for causing mobility solutions to implement a program executable on a keyboard system 1 that enables the keyboard system 1 to perform at least one action by combining a touch screen gesture with at least one of a plurality of discrete keys 7 without utilizing a pointing device.
  • the present invention provides several benefits. Firstly, a substantial increase in the speed and efficiency of user interaction is provided even with existing software on GIDs 4 due primarily to a reduction of the number of required actions to execute a task on a GID 4 and the elimination of the time it takes for the user to move her hand from the keyboard 2 to a mouse, trackpad, or set of arrow keys to perform an action, and then move her hand back to the keyboard 2 and position it for text entry.
  • a further goal is the elimination of the necessity to move one's eyes from the display in order to locate and use certain keys on the input device which cannot easily be found by touch alone, only then to return one's gaze to the display and refocus on the current task.
  • this invention a combination of the three components: touch sensitive keyboard 2, a simple but complete set of interactions, and the mapping of those actions to events compatible with existing applications— extends a similar increase in efficiency to the modern workflow that includes display, keyboard 2 and pointing device.
  • the preferred embodiment is far easier to learn than existing methodologies for speeding user interactions with a keyboard 2 which require the memorization both mentally and physically of a set of complex and often awkward command key combinations.
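The filtering idea in the bullets above — using finger-height sensing and tap detection to separate deliberate touch movements from fingers merely resting on the keys — can be sketched as follows. This is an illustrative sketch only; the thresholds, field names, and the `classify_touch` function are assumptions for exposition, not details taken from the application:

```python
from dataclasses import dataclass

# Hypothetical thresholds; the application does not specify concrete values.
TAP_MAX_MS = 150.0    # a contact at most this long may be a tap
MOVE_MIN_MM = 3.0     # movement below this is treated as a resting finger
HOVER_MAX_MM = 2.0    # height readings above this mean the finger is hovering

@dataclass
class TouchSample:
    duration_ms: float  # how long the finger stayed in contact
    travel_mm: float    # total distance the touch point moved
    height_mm: float    # finger height reported by the between-key sensors

def classify_touch(sample: TouchSample) -> str:
    """Filter meaningful touch input from fingers merely resting on the keys."""
    if sample.height_mm > HOVER_MAX_MM:
        return "hover"    # finger is above the keys; no contact intent
    if sample.duration_ms <= TAP_MAX_MS and sample.travel_mm < MOVE_MIN_MM:
        return "tap"      # short, stationary contact: a tap gesture
    if sample.travel_mm >= MOVE_MIN_MM:
        return "gesture"  # deliberate touch movement across the sensors
    return "resting"      # long, stationary contact: ignore as a resting finger
```

A long stationary contact classifies as `"resting"` and is discarded, while the same contact with enough travel becomes a `"gesture"`, which is the distinction the embodiments above aim for.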

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Position Input By Displaying (AREA)
  • Input From Keyboards Or The Like (AREA)

Abstract

A keyboard system performs at least one action by combining a touch gesture with a plurality of discrete keys without using an external pointing device. The keyboard system comprises a touch-sensitive keyboard, a touch event processor, and a graphical interface. The touch-sensitive keyboard comprises a plurality of discrete keys, each including a key-press switch and a key top, the key top comprising a capacitive touch sensor pad. A keyboard input signal is generated by a touch on at least one discrete key and by a touch gesture. A touch determination module interprets the keyboard input signal and generates a data packet. A low-level event module receives the data packet corresponding to the touch input, converts the data packet, and sends it to a high-level event module that instructs a computer to perform an action displayed by a graphical interface device.
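The processing chain the abstract describes — keyboard input signal, touch determination module, data packet, low-level event module, high-level event module, displayed action — can be sketched in Python. All class names, function names, and the action mapping below are illustrative assumptions, not details taken from the application:

```python
from dataclasses import dataclass

@dataclass
class DataPacket:
    """Packet emitted by the touch determination module (fields are illustrative)."""
    key: str      # discrete key whose capacitive sensor pad was touched
    gesture: str  # touch gesture interpreted from the input signal, e.g. "swipe_left"

def touch_determination(key: str, gesture: str) -> DataPacket:
    """Interpret the keyboard input signal and generate a data packet."""
    return DataPacket(key=key, gesture=gesture)

def low_level_event(packet: DataPacket) -> dict:
    """Convert the data packet into the form the high-level event module accepts."""
    return {"source": packet.key, "motion": packet.gesture}

def high_level_event(event: dict) -> str:
    """Instruct the computer to perform an action shown by the graphical
    interface device; this mapping table is purely hypothetical."""
    actions = {
        ("j", "swipe_left"): "move_cursor_left",
        ("k", "tap"): "select",
    }
    return actions.get((event["source"], event["motion"]), "no_op")

# A touch on key "j" combined with a leftward swipe moves the cursor,
# with no external pointing device involved.
packet = touch_determination("j", "swipe_left")
action = high_level_event(low_level_event(packet))
```

The point of the staged design is that the low-level module can translate keyboard-specific packets into generic events, so the high-level mapping can target existing applications unchanged.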
PCT/US2016/067890 2015-12-20 2016-12-20 Combination computer keyboard and computer pointing device WO2017112714A1 (fr)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US201562270000P 2015-12-20 2015-12-20
US62/270,000 2015-12-20
US62/270,007 2015-12-20

Publications (2)

Publication Number Publication Date
WO2017112714A1 true WO2017112714A1 (fr) 2017-06-29
WO2017112714A8 WO2017112714A8 (fr) 2017-11-23

Family

ID=59089970

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2016/067890 WO2017112714A1 (fr) 2015-12-20 2016-12-20 Combination computer keyboard and computer pointing device

Country Status (1)

Country Link
WO (1) WO2017112714A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100306427A1 (en) * 2009-05-29 2010-12-02 Aten International Co., Ltd. Ps/2 to usb keyboard adaptor supporting n-key rollover
US20120054671A1 (en) * 2010-08-30 2012-03-01 Vmware, Inc. Multi-touch interface gestures for keyboard and/or mouse inputs
US20130300668A1 (en) * 2012-01-17 2013-11-14 Microsoft Corporation Grip-Based Device Adaptations
US20140118264A1 (en) * 2012-10-30 2014-05-01 Apple Inc. Multi-functional keyboard assemblies
US20150058809A1 (en) * 2013-08-23 2015-02-26 General Electric Company Multi-touch gesture processing

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10908727B2 (en) 2017-11-02 2021-02-02 Blackberry Limited Electronic device including touchpad and fingerprint sensor and method of detecting touch
JP2019192124A (ja) * 2018-04-27 2019-10-31 Tokai Rika Co., Ltd. Switch device and control device
JP7137962B2 (ja) 2018-04-27 2022-09-15 Tokai Rika Co., Ltd. Switch device and control device
CN112384884A (zh) * 2019-05-09 2021-02-19 Microsoft Technology Licensing, LLC Quick menu selection device and method
CN112306248A (zh) * 2019-07-29 2021-02-02 Cirque Corp. Hybrid circuit for touchpad and keyboard
CN112306248B (zh) 2019-07-29 2024-03-05 Cirque Corp. Hybrid circuit for touchpad and keyboard
CN112131076A (zh) * 2020-09-17 2020-12-25 Shanghai Suninfo Information Technology Co., Ltd. Method, device and system for acquiring mouse operation event information
CN116467059A (zh) * 2023-04-21 2023-07-21 Harbin Youchu Technology Co., Ltd. Data processing system and method based on distributed computing

Also Published As

Publication number Publication date
WO2017112714A8 (fr) 2017-11-23

Similar Documents

Publication Publication Date Title
US10061510B2 (en) Gesture multi-function on a physical keyboard
US10444989B2 (en) Information processing apparatus, and input control method and program of information processing apparatus
US8059101B2 (en) Swipe gestures for touch screen keyboards
KR101117481B1 (ko) Multi-touch input control system
JP6115867B2 (ja) Method and computing device for enabling interaction with an electronic device via one or more multi-directional buttons
WO2017112714A1 (fr) Combination computer keyboard and computer pointing device
JP2013527539A5 (fr)
US20150100911A1 (en) Gesture responsive keyboard and interface
EP0660218A1 Graphic keyboard
US20110215914A1 (en) Apparatus for providing touch feedback for user input to a touch sensitive surface
US20050162402A1 (en) Methods of interacting with a computer using a finger(s) touch sensing input device with visual feedback
WO2014047084A1 (fr) Fonctions de clavier lancées par des gestes
JP2000501526A Multi-touch input device, method and system that minimize memory requirements
US10409412B1 (en) Multi-input element for electronic device
US20140317564A1 (en) Navigation and language input using multi-function key
US20090262072A1 (en) Cursor control system and method thereof
US8970498B2 (en) Touch-enabled input device
CN102103461A Method for implementing a shortcut-key mode on a notebook computer touchpad
KR20130002983A Computer keyboard with integrated electrode array
Zhang et al. Gestkeyboard: enabling gesture-based interaction on ordinary physical keyboard
US20140298275A1 (en) Method for recognizing input gestures
TWM486807U Peripheral device with touch control function
CN104503591A Information input method based on polyline gestures
US20170308177A1 (en) Capacitive Keyboard Having Variable Make Points
Hall et al. T-Bars: towards tactile user interfaces for mobile touchscreens

Legal Events

Date Code Title Description
121 Ep: the EPO has been informed by WIPO that EP was designated in this application

Ref document number: 16879997

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: PCT application non-entry in European phase

Ref document number: 16879997

Country of ref document: EP

Kind code of ref document: A1