EP2646893A2 - Multiplexed numeric keypad and touchpad

Multiplexed numeric keypad and touchpad

Info

Publication number
EP2646893A2
Authority
EP
European Patent Office
Prior art keywords
mode
motion
processor
operation
surface
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP11844754.9A
Other languages
German (de)
French (fr)
Inventor
Randal J. Marsden
Steve Hole
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Cleankeys Inc
Original Assignee
Cleankeys Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority to US41827910P (Critical)
Priority to US201161472799P
Application filed by Cleankeys Inc
Priority to PCT/US2011/062723 (WO2012075199A2)
Publication of EP2646893A2
Application status: Withdrawn

Classifications

    • G06F3/016: Input arrangements with force or tactile feedback as computer generated output to the user
    • G06F3/04895: Guidance during keyboard input operation, e.g. prompting
    • G06F3/0237: Character input methods using prediction or retrieval techniques
    • G06F3/0414: Digitisers, e.g. for touch screens or touch pads, using force sensing means to determine a position
    • G06F3/04883: Input of commands through traced gestures, e.g. for entering handwritten data
    • G06F3/04886: Partitioning of the screen or tablet into independently controllable areas, e.g. virtual keyboards or menus
    • G06F3/14: Digital output to display device; cooperation and interconnection of the display device with other functional units
    • G09G2354/00: Aspects of interface with display user

Abstract

A method and system that integrates a numeric keypad with a touchpad in the same physical location on a touch-sensitive display device. The operational mode of the shared location is determined automatically from the user's actions on the display, or set manually by the user. The system operates in at least one mode of operation selected from: numpad mode, touchpad mode, keyboard mode, and auto-detect mode. A visual indicator communicates the current mode to the user.

Description

MULTIPLEXED NUMERIC KEYPAD AND TOUCHPAD

FIELD OF THE INVENTION

[0001] The invention relates to a smooth touch-sensitive surface that allows the user to rest their hands or fingers on the surface without causing an event actuation. More specifically, the touch surface may be made up of both a keypad and a touchpad occupying the same physical space.

BACKGROUND OF THE INVENTION

[0002] The origin of the modern keyboard as the primary method for inputting text and data from a human to a machine dates back to early typewriters in the 19th century. As computers were developed, it was a natural evolution to adapt the typewriter keyboard as the primary method for inputting text and data. While the implementation of the keys on typewriters, and subsequently on computer keyboards, has evolved from mechanical to electrical and finally to electronic, the size, placement, and mechanical nature of the keys themselves have remained largely unchanged.

[0003] As computers evolved and graphical user interfaces were developed, the mouse pointer became a common user input device. With the introduction of portable "laptop" computers, various new pointing devices were invented as alternatives to the mouse, such as trackballs, joysticks, and touchpads (also referred to as "trackpads"). The overwhelming majority of laptop computers now incorporate a touchpad as the primary pointing device.

[0004] Prior to computers, a common office instrument used for performing numerical calculations was the "adding machine". This device incorporated number keys along with keys for common mathematical operations such as add, subtract, multiply, and divide. The operator would perform data entry on these machines, which would then display the result, print it, or both. Experienced operators of adding machines memorized the locations of the keys and could enter data and perform operations very quickly without looking. As computers became common, the need for efficient numeric entry persisted, and the "adding machine" functions were added to computer keyboards in the form of a numeric keypad (or "numpad") typically located to the right of the standard keyboard.

[0005] Combining the three primary user interface devices of keyboard, touchpad, and numpad into a single device results in the device becoming unreasonably large. The problem is further complicated by the fact that many modern keyboards incorporate yet additional keys for page navigation, multimedia controls, gaming, and keyboard settings functions. The result can be a "keyboard" that is often larger than the computer itself.

SUMMARY OF THE INVENTION

[0006] The present invention describes a method and system that solves the space problem by integrating the numeric keypad part of the keyboard and the touchpad in the same physical location.

[0007] Keyboard technology has now evolved to the point of eliminating the traditional mechanical keys, in favor of a touch-sensitive surface that can detect user input through the correlation of touch and vibration sensors (Marsden, U.S. Patent Application Ser. No. 12/234,053). This surface can be used to provide all the functions of the keyboard, numpad, and touchpad, but in a much smaller space since it makes it possible to "multiplex" or use the same physical space on the surface for multiple functions. The touch surface may incorporate either a dynamic or static display beneath it, or a mixture of both.

[0008] In one aspect of the invention, the numeric keypad and the touchpad occupy the same physical space. This is possible because the touch-sensitive surface, unlike traditional mechanical keys, can have the spacing, size, orientation, and function of its "keys" dynamically assigned.

[0009] In another aspect of the invention, the system has three modes of operation: numpad mode, touchpad mode, and auto-detect mode. The user changes the mode by activating a key or key combination on the keyboard, and visual indicators provide feedback to the user as to which mode the device is in.

[0010] In a further aspect of the invention, the system automatically determines which mode the user intends based on their interaction with the touch surface. For example, if the user slides their finger across the surface, they most likely intend for it to act as a touchpad, causing the pointer to move. Similarly, if the user taps their finger on a specific sector of the touch surface assigned to a number key, then they most likely intend for it to be used as a numpad.

BRIEF DESCRIPTION OF THE DRAWINGS

[0011] Preferred and alternative examples of the present invention are described in detail below with reference to the following drawings:

[0012] FIGURE 1 is a hardware block diagram showing the typical hardware components of a system formed in accordance with an embodiment of the present invention;

[0013] FIGURE 2 shows an exemplary process performed by the system shown in FIGURE 1; and

[0014] FIGURE 3 is a schematic partial view of an exemplary touch-sensitive surface formed in accordance with an embodiment of the present invention.

DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENT

[0015] FIGURE 1 shows a block diagram of the hardware components of a device 100 for providing a multiplexed numeric keypad and touchpad. The device 100 includes one or more touch sensors 120 that provide input to a CPU (processor) 110, notifying the processor 110 of contact events when the surface has been touched, typically mediated by a hardware controller that interprets the raw signals received from the touch sensor(s) 120 and communicates the information to the processor 110 using a known communication protocol via an available data port. Similarly, the device 100 includes one or more vibration sensors 130 that communicate with the processor 110 when the surface is tapped, in a manner similar to that of the touch sensor(s) 120. The processor 110 communicates with an optional hardware controller to cause a display 140 to present an appropriate image. A speaker 150 is also coupled to the processor so that any appropriate auditory signals can be passed on to the user as guidance. The processor 110 has access to a memory 160, which may include a combination of temporary and/or permanent storage: random access memory (RAM), read-only memory (ROM), writable non-volatile memory such as flash memory, hard drives, floppy disks, and so forth. The memory 160 includes program memory 170 that contains all programs and software, such as an operating system 171, the User Gesture Recognition software 172, and any other application programs 173. The memory 160 also includes data memory 180 that includes the user options and preferences 181 required by the User Gesture Recognition software 172, and any other data 182 required by any element of the device 100.
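By way of illustration only, and not as part of the original disclosure, the component relationships of FIGURE 1 can be sketched in Python as follows (all names are invented for exposition):

    from dataclasses import dataclass, field

    @dataclass
    class Memory:                                      # memory 160
        program: tuple = ("operating_system",          # 171
                          "user_gesture_recognition",  # 172
                          "applications")              # 173
        data: dict = field(default_factory=dict)       # preferences 181, other data 182

    @dataclass
    class Device:                                      # device 100
        memory: Memory = field(default_factory=Memory)

        def on_touch(self, event):   # touch sensors 120 -> processor 110
            pass

        def on_tap(self, event):     # vibration sensors 130 -> processor 110
            pass

        def show(self, image):       # processor 110 -> display 140
            print("display 140 <-", image)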

[0016] FIGURE 2 shows a flow chart of an exemplary process 200 that allows the same physical area on a touchscreen keyboard to be used to perform the functions of both a numeric keypad and touchpad. The process 200 is not intended to fully detail all the software of the present invention in its entirety, but is provided as an overview and an enabling disclosure of the present invention.

[0017] The process 200 is provided by the User Gesture Recognition Software 172. At block 205, when the process is first started, various system variables are initialized; for example, the event time-out counter is set to zero. At block 210, the process waits to be notified that user contact has occurred within the common area. While the system is waiting in block 210, a counter is incremented with the passage of time. Once user contact has occurred, block 215 determines whether the counter has exceeded the maximum time (threshold) allowed for user input (stored as a user option in Data Memory 181).

[0018] If the maximum time allowed for user input has been exceeded, the system resets the mode of the common area to the default mode in block 220. At decision block 225, the processor 110 determines whether the current mode is the touchpad mode. If it is, the processor 110 interprets the user contact as a touchpad event and outputs the command accordingly in block 230.

[0019] If the current mode is not the touchpad mode, the processor 110 assumes the common area is in number pad (numpad) mode and proceeds to decision block 235. In touchpad operation, the user makes an initial touch followed by a sliding motion with one or more fingers. In numpad operation, the user taps on a number key and typically does not slide their finger. The processor 110 uses this difference to interpret the user's input in decision block 235: if a touch-and-slide motion is detected based on signals provided by the sensors 120, 130, the processor 110 changes the current mode to the touchpad mode in block 240 and outputs the user action as a touchpad event in block 245. If the user action is not a touch-and-slide motion, the user action is output by the processor 110 as a numpad event in block 250. After blocks 230, 245, and 250, the process 200 returns to block 210.
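For concreteness, the logic of blocks 210 through 250 can be sketched in Python. This is a minimal illustration under assumed primitives (a monotonic clock, a contact object carrying is_slide and key attributes, and a two-second threshold are all invented here), not the patent's implementation:

    import time

    TOUCHPAD, NUMPAD = "touchpad", "numpad"

    class CommonArea:
        """Mode arbitration for the shared numpad/touchpad area (process 200)."""

        def __init__(self, default_mode=NUMPAD, threshold_s=2.0):
            self.default_mode = default_mode   # user-settable, paragraph [0022]
            self.mode = default_mode
            self.threshold_s = threshold_s     # maximum idle time, block 215
            self.last_event = None

        def on_contact(self, contact):
            now = time.monotonic()
            # Blocks 215/220: after the idle threshold expires, fall back
            # to the default mode before interpreting the new contact.
            if self.last_event is not None and now - self.last_event > self.threshold_s:
                self.mode = self.default_mode
            self.last_event = now
            # Blocks 225/230: in touchpad mode, every contact is a touchpad event.
            if self.mode == TOUCHPAD:
                return ("touchpad_event", contact)
            # Blocks 235-250: a touch-and-slide switches the area to touchpad
            # mode; a plain tap is reported as a numpad keypress.
            if contact.is_slide:
                self.mode = TOUCHPAD
                return ("touchpad_event", contact)
            return ("numpad_event", contact.key)

Because the timer is only checked when the next contact arrives, a tap shortly after a slide is still delivered as a touchpad event, which matches the "select"/left-button behavior described in paragraph [0020] below.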

[0020] Note that single taps (or multiple taps in succession) are also common when using a touchpad, and are commonly assigned to functions such as "select" or what is commonly referred to as a "mouse left button" action. These types of actions typically occur shortly after a touch-and-slide motion, and so the system will still be in touchpad mode (since the counter will not yet have reached the threshold in block 215).

[0021] Other user gestures on the touchpad are interpreted and assigned to functions, such as multiple finger swipes across the touchpad. While the device 100 is in the touchpad mode, all these gestures are interpreted as touchpad input and sent to the device's operating system as such to be interpreted by whatever system software resides therein. In this way, the system and method of the present invention acts exactly like any other touchpad when in touchpad mode.

[0022] In one embodiment, the default mode is set by the user (typically through control panel software). If the device 100 is at rest with no user input for the user-settable amount of time (threshold), the mode is restored to the default mode.

[0023] FIGURE 3 shows a schematic view representative of a touch and tap-sensitive keyboard 300 that incorporates on its forward-facing surface an area 310 incorporating the functions of both a numeric keypad and a touchpad. The term "keyboard" in this application refers to any keyboard that is implemented on a touch and tap-sensitive surface, including a keyboard presented on a touch-sensitive display. The keyboard 300 includes the outline of the area 310 incorporating the functions of the touchpad, the keys assigned to the numeric keypad, and the selection keys commonly referred to as the "left and right mouse buttons" 330. "Mode" refers to the type of function that is assigned to the commonly-shared area 310. A separate mode key 320 allows the user to manually select between touchpad mode, numeric keypad (or "numpad") mode, and "Auto" mode (whereby the function assigned to the common area 310 is determined by the system according to the actions of the user on the surface of the common area 310).

[0024] In one embodiment, the system of the present invention displays the current mode (touchpad or number pad) with visual indicators 320 along with an "Auto" mode visual indicator. In this way, the user can know which mode the system is in at all times. In one embodiment, a mode key 324 is provided below the indicators 320 on the keyboard. User activation of the mode key 324 causes the processor 110 to switch to another mode.

[0025] In one embodiment, the user may define the default mode to be the touchpad mode by first selecting Auto mode with the mode key 324 immediately followed by a touch-and-slide motion on the common area 310. In the absence of a touch-and-slide motion immediately following the selection of Auto mode, the processor 110 will set the default mode to numpad mode.
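The default-mode rule of paragraph [0025] reduces to a small decision. The sketch below assumes hypothetical event helpers (wait_for_auto_selection, next_contact) and a one-second window standing in for "immediately followed", none of which appear in the patent:

    TOUCHPAD, NUMPAD = "touchpad", "numpad"

    def choose_default_mode(events, window_s=1.0):
        # Paragraph [0025]: selecting Auto mode and then performing a
        # touch-and-slide within the window makes touchpad the default;
        # otherwise the default reverts to numpad.
        events.wait_for_auto_selection()                 # hypothetical helper
        contact = events.next_contact(timeout=window_s)  # hypothetical helper
        if contact is not None and contact.is_slide:
            return TOUCHPAD
        return NUMPAD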

[0026] In another embodiment of the invention, the touch surface is used in a fourth mode: keyboard mode. In this mode, the surface represents a keyboard on which the user may enter text using any of the many methods designed for smaller touch surfaces (such as those developed for smartphones). This mode is selected manually by the user through some scheme implemented on the keyboard or in computer software, or it is selected by functionality provided by the auto-detect mode. The device stays in keyboard mode for as long as the user is typing. To exit keyboard mode and return to touchpad mode, the user performs a predefined gesture, such as pressing and holding all their fingers in the same location for a few seconds. The processor recognizes this unique gesture and changes the mode accordingly. Other gestures could also be recognized.
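As an illustration of the exit gesture just described, a detector might look like the following sketch; the touch attributes, the finger count, and the two-second hold time are assumptions for exposition, not values from the patent:

    HOLD_SECONDS = 2.0   # assumed stand-in for "a few seconds"
    MIN_FINGERS = 4      # assumed stand-in for "all their fingers"

    def is_keyboard_exit_gesture(touches, now):
        # True when enough fingers have been resting in place, without
        # sliding, for at least HOLD_SECONDS.
        return (len(touches) >= MIN_FINGERS and
                all((not t.moved) and (now - t.down_time) >= HOLD_SECONDS
                    for t in touches))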

[0027] In another embodiment of the invention, the touch surface incorporates a dynamic display. The display changes in accordance with the current mode setting to display the appropriate image in the common area. For example, when numpad mode is selected, a numeric keypad is displayed; when touchpad mode is selected, a blank rounded rectangle is displayed; and so on.

Claims

[0028] The embodiments of the invention in which an exclusive property or privilege is claimed are defined as follows:
1. A system comprising:
a surface comprising a multi-mode area;
a plurality of touch sensors coupled to the surface, the plurality of touch sensors configured to generate at least one sense signal based on sensed user contact with the surface;
a plurality of motion sensors, the plurality of motion sensors configured to generate a motion signal based on sensed vibrations of the surface; and
a processor in signal communication with the surface, the plurality of touch sensors, and the plurality of motion sensors, wherein the processor is configured to determine mode of operation associated with the multi-mode area based on interpretation of at least one of the generated at least one sense signal and the motion signal associated with the multi-mode area.
2. The system of Claim 1, wherein the modes of operation comprise at least two of a keyboard mode, a numeric keypad mode, or a touchpad mode.
3. The system of Claim 2, wherein the processor is further configured to determine the mode of operation based on a signal associated with a user selection.
4. The system of Claim 3, wherein the surface comprises a display device coupled to the processor, wherein the user selection comprises activation of a mode key displayed by the processor on the surface.
5. The system of Claim 1, wherein the surface comprises at least one visual indicator, wherein the processor illuminates at least one visual indicator based on the determined mode of operation.
6. The system of Claim 2, wherein the processor identifies a default mode of operation.
7. The system of Claim 6, wherein the processor identifies the default mode of operation to be the touchpad mode after an auto mode selection has occurred followed within a predefined amount of time by a determination of a sliding motion at least on or near the multi-mode area based on the at least one sense signal,
wherein the processor identifies the default mode to be the numeric keypad mode if after the auto mode selection no sliding motion is detected within the predefined amount of time based on the at least one sense signal.
8. The system of Claim 6, wherein the processor determines mode of operation to be the touchpad mode, if the processor detects a touch-and-slide motion at the multi-mode area based on the generated at least one sense signal and the motion signal,
wherein the processor determines mode of operation to be at least one of the numeric keypad mode or the keyboard mode, if the processor detects only a tap motion based on the generated motion signals and the detected tap motion did not occur within a threshold amount of time since the detected touch-and-slide motion.
9. The system of Claim 8, wherein the processor returns interpretation of the generated at least one sense signal and the motion signal associated with the multi-mode area to the default mode after a predefined period of time has expired since a previously generated at least one sense signal and motion signal associated with the multi-mode area.
10. The system of Claim 2, wherein the surface comprises a display device coupled to the processor,
wherein the processor is configured to generate an image and present the generated image in the multi-mode area of the surface, wherein the generated image is associated with current mode of operation.
11. The system of Claim 1, wherein the surface comprises a static representation of at least one of a numeric keypad, keyboard or touchpad.
12. A method comprising:
at a plurality of touch sensors, generating at least one sense signal based on sensed user contact with a surface;
at a plurality of motion sensors, generating a motion signal based on sensed vibrations of the surface; and
at a processor in signal communication with the surface, the plurality of touch sensors, and the plurality of motion sensors,
receiving the generated at least one sense signal and the motion signal; and determining mode of operation associated with a multi-mode area of the surface based on interpretation of at least one of the received at least one sense signal and the motion signal associated with the multi-mode area.
13. The method of Claim 12, wherein the modes of operation comprise at least two of a keyboard mode, a numeric keypad mode, or a touchpad mode.
14. The method of Claim 13, wherein determining the mode of operation comprises determining the mode of operation based on a signal associated with a user selection.
15. The method of Claim 12, further comprising at the processor illuminating at least one visual indicator associated with the surface based on the determined mode of operation.
16. The method of Claim 13, further comprising at the processor identifying a default mode of operation.
17. The method of Claim 16, wherein identifying comprises:
identifying the default mode of operation is the touchpad mode after receiving an auto mode selection followed within a predefined amount of time by receiving at least one sense signal determined to be a sliding motion at least on or near the multi-mode area; and identifying the default mode is the numeric keypad mode if after receiving the auto mode selection no sense signal determined to be a sliding motion is received within the predefined amount of time.
18. The method of Claim 16, wherein determining mode of operation comprises:
determining the mode of operation is the touchpad mode, if a touch-and-slide motion at the multi-mode area has been detected based on the generated at least one sense signal and the motion signal,
determining the mode of operation is at least one of the numeric keypad mode or the keyboard mode, if only a tap motion has been detected based on the generated motion signals and the detected tap motion did not occur within a threshold amount of time since the detected touch-and-slide motion.
19. The method of Claim 18, further comprising at the processor returning interpretation of the generated at least one sense signal and the motion signal associated with the multi-mode area to the default mode after a predefined period of time has expired since a previously generated at least one sense signal and motion signal associated with the multi-mode area.
20. The method of Claim 13, further comprising at the processor:
generating an image based on current mode of operation; and
presenting the generated image in the multi-mode area of the surface.
EP11844754.9A 2010-11-30 2011-11-30 Multiplexed numeric keypad and touchpad Withdrawn EP2646893A2 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
US41827910P 2010-11-30 2010-11-30
US201161472799P 2011-04-07 2011-04-07
PCT/US2011/062723 WO2012075199A2 (en) 2010-11-30 2011-11-30 Multiplexed numeric keypad and touchpad

Publications (1)

Publication Number Publication Date
EP2646893A2 2013-10-09

Family

ID=46172548

Family Applications (2)

Application Number Title Priority Date Filing Date
EP11844775.4A Withdrawn EP2646894A2 (en) 2010-11-30 2011-11-30 Dynamically located onscreen keyboard
EP11844754.9A Withdrawn EP2646893A2 (en) 2010-11-30 2011-11-30 Multiplexed numeric keypad and touchpad

Family Applications Before (1)

Application Number Title Priority Date Filing Date
EP11844775.4A Withdrawn EP2646894A2 (en) 2010-11-30 2011-11-30 Dynamically located onscreen keyboard

Country Status (5)

Country Link
EP (2) EP2646894A2 (en)
JP (2) JP5782133B2 (en)
KR (1) KR101578769B1 (en)
CN (2) CN103443744B (en)
WO (2) WO2012075199A2 (en)

Families Citing this family (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6248635B2 (en) * 2011-11-08 2017-12-20 ソニー株式会社 Sensor device, analysis device, and storage medium
JP2017084404A (en) * 2012-02-23 2017-05-18 Panasonic Intellectual Property Management Co., Ltd. Electronic apparatus
WO2014019085A1 (en) * 2012-08-01 2014-02-06 Whirlscape, Inc. One-dimensional input system and method
US8816985B1 (en) 2012-09-20 2014-08-26 Cypress Semiconductor Corporation Methods and apparatus to detect a touch pattern
EP2913325A4 (en) * 2012-10-25 2016-10-05 Shenyang Sinochem Agrochemicals R & D Co Ltd Substituted pyrimidine compound and uses thereof
US9965179B2 (en) 2012-11-27 2018-05-08 Thomson Licensing Adaptive virtual keyboard
WO2014083368A1 (en) 2012-11-27 2014-06-05 Thomson Licensing Adaptive virtual keyboard
JP6165485B2 (en) * 2013-03-28 2017-07-19 国立大学法人埼玉大学 AR gesture user interface system for mobile terminals
JP5801348B2 (en) 2013-06-10 2015-10-28 レノボ・シンガポール・プライベート・リミテッド Input system, input method, and smartphone
US9483176B2 (en) * 2013-07-08 2016-11-01 Samsung Display Co., Ltd. Method and apparatus to reduce display lag of soft keyboard presses
JP6154690B2 (en) * 2013-07-22 2017-06-28 ローム株式会社 Software keyboard type input device, input method, electronic device
US9335831B2 (en) 2013-10-14 2016-05-10 Adaptable Keys A/S Computer keyboard including a control unit and a keyboard screen
CN103885632B (en) * 2014-02-22 2018-07-06 小米科技有限责任公司 Input method and device
JP6330565B2 (en) * 2014-08-08 2018-05-30 富士通株式会社 Information processing apparatus, information processing method, and information processing program
CN104375647B (en) * 2014-11-25 2017-11-03 杨龙 Exchange method and electronic equipment for electronic equipment
CN105718069B (en) * 2014-12-02 2020-01-31 联想(北京)有限公司 Information processing method and electronic equipment
CN106155502A (en) * 2015-03-25 2016-11-23 联想(北京)有限公司 A kind of information processing method and electronic equipment
JP6153588B2 (en) * 2015-12-21 2017-06-28 レノボ・シンガポール・プライベート・リミテッド Information processing apparatus, sensing layout updating method, and program
KR101682214B1 (en) * 2016-04-27 2016-12-02 김경신 an electric ink keyboard
CN107704186A (en) * 2017-09-01 2018-02-16 联想(北京)有限公司 A kind of control method and electronic equipment
US20190273819A1 (en) * 2018-03-01 2019-09-05 International Business Machines Corporation Repositioning of a display on a touch screen based on touch screen usage statistics

Family Cites Families (25)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4725694A (en) * 1986-05-13 1988-02-16 American Telephone And Telegraph Company, At&T Bell Laboratories Computer interface device
JP3260240B2 (en) * 1994-05-31 2002-02-25 株式会社ワコム Information input method and device
US6278441B1 (en) * 1997-01-09 2001-08-21 Virtouch, Ltd. Tactile interface system for electronic data display system
EP1717682B1 (en) * 1998-01-26 2017-08-16 Apple Inc. Method and apparatus for integrating manual input
US7768501B1 (en) * 1998-05-01 2010-08-03 International Business Machines Corporation Method and system for touch screen keyboard and display space sharing
US6525717B1 (en) * 1999-12-17 2003-02-25 International Business Machines Corporation Input device that analyzes acoustical signatures
JP4176017B2 (en) * 2001-09-21 2008-11-05 International Business Machines Corporation Input device, computer device, input object identification method, and computer program
US6947028B2 (en) * 2001-12-27 2005-09-20 Mark Shkolnikov Active keyboard for handheld electronic gadgets
JP4828826B2 (en) * 2002-07-04 2011-11-30 Koninklijke Philips Electronics N.V. Automatically adaptable virtual keyboard
JP2004341813A (en) * 2003-05-15 2004-12-02 Casio Comput Co Ltd Display control method for input device and input device
KR100537280B1 (en) * 2003-10-29 2005-12-16 삼성전자주식회사 Apparatus and method for inputting character using touch screen in portable terminal
US20050122313A1 (en) * 2003-11-11 2005-06-09 International Business Machines Corporation Versatile, configurable keyboard
US20050190970A1 (en) * 2004-02-27 2005-09-01 Research In Motion Limited Text input system for a mobile electronic device and methods thereof
JP2006127488A (en) * 2004-09-29 2006-05-18 Toshiba Corp Input device, computer device, information processing method, and information processing program
US20060066590A1 (en) * 2004-09-29 2006-03-30 Masanori Ozawa Input device
JP4417224B2 (en) * 2004-10-25 2010-02-17 本田技研工業株式会社 Fuel cell stack
US9019209B2 (en) * 2005-06-08 2015-04-28 3M Innovative Properties Company Touch location determination involving multiple touch location processes
FR2891928B1 (en) * 2005-10-11 2008-12-19 Abderrahim Ennadi Universal multilingual and multifunction touch screen keyboard
US7659887B2 (en) * 2005-10-20 2010-02-09 Microsoft Corp. Keyboard with a touchpad layer on keys
EP2191353A4 (en) * 2007-09-19 2012-04-18 Madentec Ltd Cleanable touch and tap-sensitive surface
KR101352994B1 (en) * 2007-12-10 2014-01-21 삼성전자 주식회사 Apparatus and method for providing an adaptive on-screen keyboard
KR101456490B1 (en) * 2008-03-24 2014-11-03 삼성전자주식회사 Touch screen keyboard display method and apparatus thereof
TWI360762B (en) * 2008-09-05 2012-03-21 Mitake Information Corp On-screen virtual keyboard system
US8633901B2 (en) * 2009-01-30 2014-01-21 Blackberry Limited Handheld electronic device having a touchscreen and a method of using a touchscreen of a handheld electronic device
CN101937313B (en) * 2010-09-13 2019-11-12 中兴通讯股份有限公司 A kind of method and device of touch keyboard dynamic generation and input

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See references of WO2012075199A3 *

Also Published As

Publication number Publication date
JP2014514785A (en) 2014-06-19
WO2012075197A3 (en) 2012-10-04
JP6208718B2 (en) 2017-10-04
WO2012075199A3 (en) 2012-09-27
KR20140116785A (en) 2014-10-06
JP5782133B2 (en) 2015-09-24
WO2012075197A2 (en) 2012-06-07
KR101578769B1 (en) 2015-12-21
CN103443744A (en) 2013-12-11
EP2646894A2 (en) 2013-10-09
CN106201324B (en) 2019-12-13
JP2015232889A (en) 2015-12-24
CN103443744B (en) 2016-06-08
CN106201324A (en) 2016-12-07
WO2012075199A2 (en) 2012-06-07

Similar Documents

Publication Publication Date Title
US10275066B2 (en) Input apparatus, input method and program
US10191573B2 (en) Pointer display device, pointer display/detection method, pointer display/detection program and information apparatus
US20160062467A1 (en) Touch screen control
US9239673B2 (en) Gesturing with a multipoint sensing device
US9459700B2 (en) Keyboard with integrated touch surface
CN102722334B (en) The control method of touch screen and device
US9652146B2 (en) Ergonomic motion detection for receiving character input to electronic devices
CN202649992U (en) Information processing device
AU2011239621B2 (en) Extended keyboard user interface
US9851809B2 (en) User interface control using a keyboard
EP2474896B1 (en) Information processing apparatus, information processing method, and computer program
JP5204286B2 (en) Electronic device and input method
JP6115867B2 (en) Method and computing device for enabling interaction with an electronic device via one or more multi-directional buttons
JP2015232889A (en) Dynamically located onscreen keyboard
US8519977B2 (en) Electronic apparatus, input control program, and input control method
CN102224483B (en) Touch-sensitive display screen with absolute and relative input modes
KR101012598B1 (en) Method and computer readable medium for generating display on touch screen of computer
JP4351599B2 (en) Input device
US9092058B2 (en) Information processing apparatus, information processing method, and program
EP1979804B1 (en) Gesturing with a multipoint sensing device
JP5957834B2 (en) Portable information terminal, touch operation control method, and program
US6489951B1 (en) Method and system for providing touch-sensitive screens for the visually impaired
JP2012053926A (en) Electronic apparatus and electronic apparatus control method
JP5730667B2 (en) Method for dual-screen user gesture and dual-screen device
WO2016098418A1 (en) Input device, wearable terminal, mobile terminal, control method for input device, and control program for controlling operation of input device

Legal Events

Date Code Title Description
17P Request for examination filed

Effective date: 20130606

AK Designated contracting states

Kind code of ref document: A2

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

DAX Request for extension of the european patent (to any country) (deleted)
18D Application deemed to be withdrawn

Effective date: 20140603