EP2646893A2 - Multiplexed numeric keypad and touchpad - Google Patents

Multiplexed numeric keypad and touchpad

Info

Publication number
EP2646893A2
Authority
EP
European Patent Office
Prior art keywords
mode
motion
processor
signal
touchpad
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP11844754.9A
Other languages
German (de)
English (en)
Inventor
Randal J. Marsden
Steve Hole
Current Assignee
Cleankeys Inc
Original Assignee
Cleankeys Inc
Priority date
Filing date
Publication date
Application filed by Cleankeys Inc filed Critical Cleankeys Inc
Publication of EP2646893A2
Legal status: Withdrawn

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/016 - Input arrangements with force or tactile feedback as computer generated output to the user
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0489 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using dedicated keyboard keys or combinations thereof
    • G06F3/04895 - Guidance during keyboard input operation, e.g. prompting
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/02 - Input arrangements using manually operated switches, e.g. using keyboards or dials
    • G06F3/023 - Arrangements for converting discrete items of information into a coded form, e.g. arrangements for interpreting keyboard generated codes as alphanumeric codes, operand codes or instruction codes
    • G06F3/0233 - Character input methods
    • G06F3/0237 - Character input methods using prediction or retrieval techniques
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 - Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041 - Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0414 - Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means using force sensing means to determine a position
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04886 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14 - Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • G - PHYSICS
    • G09 - EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G - ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2354/00 - Aspects of interface with display user

Definitions

  • the invention relates to a smooth touch-sensitive surface that allows the user to rest their hands or fingers on the surface without causing an event actuation. More specifically, the touch surface may be made up of both a keypad and a touchpad occupying the same physical space.
  • the present invention describes a method and system that solves the space problem by integrating the numeric keypad part of the keyboard and the touchpad in the same physical location.
  • the numeric keypad and the touchpad occupy the same physical space. This is possible due to the fact that the touch-sensitive surface, unlike traditional mechanical keys, can have the spacing, size, orientation, and function of its "keys" dynamically assigned.
  • the system has three modes of operation: numpad mode, touchpad mode, and auto-detect mode.
  • a visual indicator communicates to the user which mode the device is in. The user changes the mode via activation of a key or key combination on the keyboard.
  • the system automatically determines which mode the user intends based on their interaction with the touch surface. For example, if the user slides their finger across the surface, they most likely intend for it to act as a touchpad, causing the pointer to move. Similarly, if the user taps their finger on a specific sector of the touch surface assigned to a number key, then they most likely intend for it to be used as a numpad.
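The tap-versus-slide heuristic that drives auto-detect mode can be sketched as follows. This is an illustrative reconstruction, not code from the patent: the function name, the sampled-point representation, and the 10-pixel threshold are all assumptions.

```python
def classify_touch(points, slide_threshold=10.0):
    """Classify one contact as 'touchpad' (slide) or 'numpad' (tap).

    points: list of (x, y) samples recorded during a single contact.
    slide_threshold: illustrative distance beyond which the contact
    is treated as a slide rather than a tap.
    """
    if not points:
        raise ValueError("no contact samples")
    x0, y0 = points[0]
    # A sliding finger travels away from the initial contact point;
    # a tap on a number key stays close to it.
    max_dist = max(((x - x0) ** 2 + (y - y0) ** 2) ** 0.5 for x, y in points)
    return "touchpad" if max_dist > slide_threshold else "numpad"
```

A contact that drifts well past the threshold would be routed to pointer movement; a stationary tap would be mapped to the number key assigned to that sector of the surface.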
  • FIGURE 1 is a hardware block diagram showing the typical hardware components of a system formed in accordance with an embodiment of the present invention
  • FIGURE 2 shows an exemplary process performed by the system shown in FIGURE 1;
  • FIGURE 3 is a schematic partial view of an exemplary touch-sensitive surface formed in accordance with an embodiment of the present invention.
  • FIGURE 1 shows a block diagram of the hardware components of a device 100 for providing a multiplexed numeric keypad and touchpad.
  • the device 100 includes one or more touch sensors 120 that provide input to a CPU (processor) 110, notifying the processor 110 of contact events when the surface has been touched. The touch sensors are typically mediated by a hardware controller that interprets the raw signals received from the touch sensor(s) 120 and communicates the information to the processor 110 using a known communication protocol via an available data port.
  • the device 100 includes one or more vibration sensors 130 that communicate with the processor 110 when the surface is tapped, in a manner similar to that of the touch sensor(s) 120.
  • the processor 110 communicates with an optional hardware controller to cause a display 140 to present an appropriate image.
  • a speaker 150 is also coupled to the processor so that any appropriate auditory signals can be passed on to the user as guidance.
  • the processor 110 has access to a memory 160, which may include a combination of temporary and/or permanent storage: writable memory (random access memory, or RAM), read-only memory (ROM), writable non-volatile memory such as flash memory, hard drives, floppy disks, and so forth.
  • the memory 160 includes program memory 170 that contains all programs and software such as an operating system 171, the User Gesture Recognition software 172, and any other application programs 173.
  • the memory 160 also includes data memory 180 that includes user options and preferences 181 required by the User Gesture Recognition software 172, and any other data 182 required by any element of the device 100.
  • FIGURE 2 shows a flow chart of an exemplary process 200 that allows the same physical area on a touchscreen keyboard to be used to perform the functions of both a numeric keypad and touchpad.
  • the process 200 is not intended to fully detail all the software of the present invention in its entirety, but is provided as an overview and an enabling disclosure of the present invention.
  • the process 200 is provided by the User Gesture Recognition Software 172.
  • various system variables are initialized. For example, event time out (threshold time) is set to zero.
  • the process waits to be notified that user contact has occurred within the common area. While the system is waiting in block 210, a counter is incremented with the passage of time. Once user contact has occurred, block 215 determines if the counter has exceeded the maximum time (threshold) allowed for user input (stored as a user option in Data Memory 181).
  • the system resets the mode of the common area to the default mode in block 220.
  • the processor 110 determines whether the current mode is the touchpad mode. If it is, the processor 110 interprets the user contact as a touchpad event and outputs the command accordingly in block 230. If the current mode is not the touchpad mode, the processor 110 assumes the common area is in number pad (numpad) mode and proceeds to decision block 235. In touchpad operation, the user makes an initial touch followed by a sliding motion with their finger (or multiple fingers). In numpad operation, the user taps on a number key and typically does not slide their finger.
  • the processor 110 uses this difference in typical operation to interpret the user's input in decision block 235. If a touch-and-slide motion is detected by the processor 110 based on signals provided by the sensors 120, 130, the processor 110 changes the current mode to the touchpad mode in block 240 and outputs the user action as a touchpad event in block 245. If the user action is not a touch-and-slide motion, then the user action is output by the processor 110 as a numpad event in block 250. After blocks 230, 245, and 250, the process 200 returns to block 210.
  • the default mode is set by the user (typically through control panel software). If the device 100 is at rest with no user input for the user-settable amount of time (threshold), the mode is restored to the default mode.
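The loop described by process 200 can be sketched as a small state machine. Block numbers from FIGURE 2 appear as comments; the class name, mode strings, event labels, and the 5-second timeout default are illustrative assumptions, not values from the patent.

```python
import time

class CommonArea:
    """Sketch of process 200 for the shared numpad/touchpad area."""

    def __init__(self, default_mode="numpad", timeout_s=5.0):
        self.default_mode = default_mode      # user option (data memory 181)
        self.timeout_s = timeout_s            # threshold time
        self.mode = default_mode
        self.last_contact = time.monotonic()

    def handle_contact(self, is_slide):
        now = time.monotonic()
        # Blocks 215/220: if the device sat idle longer than the
        # threshold, restore the default mode before interpreting.
        if now - self.last_contact > self.timeout_s:
            self.mode = self.default_mode
        self.last_contact = now
        # Blocks 225/230: in touchpad mode, every contact is a touchpad event.
        if self.mode == "touchpad":
            return "touchpad_event"
        # Blocks 235-250: in numpad mode, a touch-and-slide switches the
        # area to touchpad mode; a plain tap is a numpad keypress.
        if is_slide:
            self.mode = "touchpad"            # block 240
            return "touchpad_event"           # block 245
        return "numpad_event"                 # block 250
```

Note how a single slide flips the shared area into touchpad mode, so subsequent taps are interpreted as touchpad clicks until the idle timeout restores the default.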
  • FIGURE 3 shows a schematic view representative of a touch and tap-sensitive keyboard 300 that incorporates on its forward- facing surface an area 310 incorporating the functions of both a numeric keypad and touchpad.
  • keyboard in this application refers to any keyboard that is implemented on a touch and tap sensitive surface, including a keyboard presented on a touch-sensitive display.
  • the keyboard 300 includes the outline of the area 310 incorporating the functions of the touchpad, the keys assigned to the numeric keypad, as well as the selection keys commonly referred to as the “left and right mouse buttons” 330.
  • “Mode” refers to the type of function that is assigned to the commonly-shared area 310.
  • a separate mode key 320 allows the user to manually select among Touchpad mode, numeric keypad (or “numpad”) mode, and “Auto” mode (whereby the function assigned to common area 310 is determined by the system according to the actions of the user on the surface of the common area 310).
  • the system of the present invention displays the current mode (touchpad or number pad) with visual indicators 320 along with an "Auto" mode visual indicator. In this way, the user can know which mode the system is in at all times.
  • a mode key 324 is provided below the indicators 320 on the keyboard. User activation of the mode key 324 causes the processor 110 to switch to another mode.
  • the user may define the default mode to be the touchpad mode by first selecting Auto mode with the mode key 324 immediately followed by a touch-and-slide motion on the common area 310. In the absence of a touch-and-slide motion immediately following the selection of Auto mode, the processor 110 will set the default mode to numpad mode.
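This default-mode selection can be sketched as follows. The function name, the event-tuple representation, and the 2-second window are illustrative assumptions; the patent says only "immediately following" without specifying a duration.

```python
def resolve_default_mode(events, window_s=2.0):
    """Determine the default mode from a sequence of user events.

    events: (timestamp, kind) tuples, where kind is 'auto_key' (the
    user selected Auto mode), 'slide', or 'tap'. A touch-and-slide
    within window_s of selecting Auto mode sets the default to
    touchpad; otherwise the default falls back to numpad.
    """
    default = "numpad"
    auto_at = None
    for t, kind in events:
        if kind == "auto_key":
            auto_at = t
            default = "numpad"   # fallback unless a slide follows soon
        elif kind == "slide" and auto_at is not None and t - auto_at <= window_s:
            default = "touchpad"
            auto_at = None
    return default
```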
  • the touch surface is used in a fourth mode: keyboard.
  • the surface represents a keyboard, on which the user may enter text using a plethora of methods designed for smaller touch surfaces (such as those invented for smartphones).
  • This mode is manually selected by the user through some scheme implemented on the keyboard or computer software, or it is selected by functionality provided by the auto-detect mode.
  • the device stays in keyboard mode for as long as the user is typing.
  • the user changes out of keyboard mode with a predefined gesture, such as pressing and holding all their fingers in the same location for a few seconds. The processor recognizes the unique gesture and changes the mode accordingly. Other gestures could also be recognized.
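A press-and-hold detector along these lines could look like the following sketch. The finger count, hold duration, and drift tolerance are illustrative parameters, not values from the patent.

```python
def is_exit_gesture(touches, min_fingers=4, hold_s=2.0, max_drift=5.0):
    """Detect the press-and-hold mode-change gesture.

    touches: one list per finger of (timestamp, x, y) samples.
    Returns True when at least min_fingers rest nearly still
    (within max_drift) for at least hold_s seconds.
    """
    held = 0
    for samples in touches:
        if not samples:
            continue
        t0, x0, y0 = samples[0]
        duration = samples[-1][0] - t0
        # How far this finger wandered from its initial position.
        drift = max(((x - x0) ** 2 + (y - y0) ** 2) ** 0.5
                    for _, x, y in samples)
        if duration >= hold_s and drift <= max_drift:
            held += 1
    return held >= min_fingers
```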
  • the touch surface incorporates a dynamic display.
  • the display changes in accordance with the current mode setting to display the appropriate image in the common area. For example, when numpad mode is selected, a numeric keypad is displayed; when touchpad mode is selected, a blank rounded rectangle is displayed; and so on.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Input From Keyboards Or The Like (AREA)
  • User Interface Of Digital Computer (AREA)
  • Position Input By Displaying (AREA)

Abstract

Described are a method and system that allow a numeric keypad and a touchpad to coexist in the same physical space on a touch-sensitive display device. The operating mode of this single space is determined automatically based on the actions the user performs with the display, or based on manual input by the user. The system operates in one or more modes chosen from: a numeric keypad (numpad) mode, a touchpad mode, a keyboard mode, and an automatic detection (auto) mode. A visual indicator informs the user of the current mode.
EP11844754.9A 2010-11-30 2011-11-30 Multiplexed numeric keypad and touchpad Withdrawn EP2646893A2 (fr)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US41827910P 2010-11-30 2010-11-30
US201161472799P 2011-04-07 2011-04-07
PCT/US2011/062723 WO2012075199A2 (fr) 2011-11-30 Multiplexed numeric keypad and touchpad

Publications (1)

Publication Number Publication Date
EP2646893A2 (fr) 2013-10-09

Family

ID=46172548

Family Applications (2)

Application Number Title Priority Date Filing Date
EP11844775.4A Withdrawn EP2646894A2 (fr) 2010-11-30 2011-11-30 Dynamically positioned virtual keyboard
EP11844754.9A Withdrawn EP2646893A2 (fr) 2010-11-30 2011-11-30 Multiplexed numeric keypad and touchpad

Family Applications Before (1)

Application Number Title Priority Date Filing Date
EP11844775.4A Withdrawn EP2646894A2 (fr) 2010-11-30 2011-11-30 Dynamically positioned virtual keyboard

Country Status (5)

Country Link
EP (2) EP2646894A2 (fr)
JP (2) JP5782133B2 (fr)
KR (1) KR101578769B1 (fr)
CN (2) CN103443744B (fr)
WO (2) WO2012075199A2 (fr)

Families Citing this family (29)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103917278B (zh) * 2011-11-08 2017-05-17 索尼公司 传感器装置、分析装置与记录介质
JP2017084404A (ja) * 2012-02-23 2017-05-18 パナソニックIpマネジメント株式会社 電子機器
US20150261310A1 (en) * 2012-08-01 2015-09-17 Whirlscape, Inc. One-dimensional input system and method
US8816985B1 (en) 2012-09-20 2014-08-26 Cypress Semiconductor Corporation Methods and apparatus to detect a touch pattern
US9770026B2 (en) * 2012-10-25 2017-09-26 Shenyang Sinochem Agrochemicals R&D Co., Ltd. Substituted pyrimidine compound and uses thereof
US9965179B2 (en) 2012-11-27 2018-05-08 Thomson Licensing Adaptive virtual keyboard
WO2014083368A1 (fr) 2012-11-27 2014-06-05 Thomson Licensing Clavier virtuel adaptatif
JP6165485B2 (ja) * 2013-03-28 2017-07-19 国立大学法人埼玉大学 携帯端末向けarジェスチャユーザインタフェースシステム
JP5801348B2 (ja) 2013-06-10 2015-10-28 レノボ・シンガポール・プライベート・リミテッド 入力システム、入力方法およびスマートフォン
US9483176B2 (en) * 2013-07-08 2016-11-01 Samsung Display Co., Ltd. Method and apparatus to reduce display lag of soft keyboard presses
JP6154690B2 (ja) * 2013-07-22 2017-06-28 ローム株式会社 ソフトウェアキーボード型入力装置、入力方法、電子機器
US9335831B2 (en) 2013-10-14 2016-05-10 Adaptable Keys A/S Computer keyboard including a control unit and a keyboard screen
CN103885632B (zh) * 2014-02-22 2018-07-06 小米科技有限责任公司 输入方法和装置
US10175882B2 (en) * 2014-07-31 2019-01-08 Technologies Humanware Inc. Dynamic calibrating of a touch-screen-implemented virtual braille keyboard
JP6330565B2 (ja) * 2014-08-08 2018-05-30 富士通株式会社 情報処理装置、情報処理方法及び情報処理プログラム
CN104375647B (zh) * 2014-11-25 2017-11-03 杨龙 用于电子设备的交互方法及电子设备
CN105718069B (zh) * 2014-12-02 2020-01-31 联想(北京)有限公司 信息处理方法及电子设备
CN106155502A (zh) * 2015-03-25 2016-11-23 联想(北京)有限公司 一种信息处理方法及电子设备
JP6153588B2 (ja) * 2015-12-21 2017-06-28 レノボ・シンガポール・プライベート・リミテッド 情報処理装置及びセンシングレイアウト更新方法並びにプログラム
KR101682214B1 (ko) * 2016-04-27 2016-12-02 김경신 전자잉크 키보드
US10234985B2 (en) * 2017-02-10 2019-03-19 Google Llc Dynamic space bar
CN107704186B (zh) * 2017-09-01 2022-01-18 联想(北京)有限公司 一种控制方法及电子设备
CN107493365A (zh) * 2017-09-13 2017-12-19 深圳传音通讯有限公司 一种用于智能设备的拨号盘的切换方法及切换装置
US11159673B2 (en) * 2018-03-01 2021-10-26 International Business Machines Corporation Repositioning of a display on a touch screen based on touch screen usage statistics
US10725506B2 (en) * 2018-08-21 2020-07-28 Dell Products, L.P. Context-aware user interface (UI) for multi-form factor information handling systems (IHSs)
CN109582211B (zh) * 2018-12-25 2021-08-03 努比亚技术有限公司 一种触控区适配方法、设备及计算机可读存储介质
JP2020135529A (ja) * 2019-02-21 2020-08-31 シャープ株式会社 タッチパネル、複合機、プログラムおよびタッチパネルの制御方法
US11150751B2 (en) * 2019-05-09 2021-10-19 Dell Products, L.P. Dynamically reconfigurable touchpad
EP4004695A4 (fr) * 2019-09-18 2022-09-28 Samsung Electronics Co., Ltd. Appareil électronique et procédé de commande associé

Family Cites Families (25)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4725694A (en) * 1986-05-13 1988-02-16 American Telephone And Telegraph Company, At&T Bell Laboratories Computer interface device
JP3260240B2 (ja) * 1994-05-31 2002-02-25 株式会社ワコム 情報入力方法およびその装置
US6278441B1 (en) * 1997-01-09 2001-08-21 Virtouch, Ltd. Tactile interface system for electronic data display system
KR100766627B1 (ko) * 1998-01-26 2007-10-15 핑거웍스, 인크. 수동 입력 통합 방법 및 장치
US7768501B1 (en) * 1998-05-01 2010-08-03 International Business Machines Corporation Method and system for touch screen keyboard and display space sharing
US6525717B1 (en) * 1999-12-17 2003-02-25 International Business Machines Corporation Input device that analyzes acoustical signatures
CA2462058A1 (fr) * 2001-09-21 2003-04-03 International Business Machines Corporation Dispositif d'entree, appareil informatique, procede d'identification d'un objet entre, procede d'identification d'un objet entre avec un clavier, et programme informatique
US6947028B2 (en) * 2001-12-27 2005-09-20 Mark Shkolnikov Active keyboard for handheld electronic gadgets
ATE538430T1 (de) * 2002-07-04 2012-01-15 Koninkl Philips Electronics Nv Sich automatisch anpassende virtuelle tastatur
JP2004341813A (ja) * 2003-05-15 2004-12-02 Casio Comput Co Ltd 入力装置表示制御方法及び入力装置
KR100537280B1 (ko) * 2003-10-29 2005-12-16 삼성전자주식회사 휴대용 단말기에서 터치스크린을 이용한 문자 입력 장치및 방법
US20050122313A1 (en) * 2003-11-11 2005-06-09 International Business Machines Corporation Versatile, configurable keyboard
US20050190970A1 (en) * 2004-02-27 2005-09-01 Research In Motion Limited Text input system for a mobile electronic device and methods thereof
US20060066590A1 (en) * 2004-09-29 2006-03-30 Masanori Ozawa Input device
JP2006127488A (ja) * 2004-09-29 2006-05-18 Toshiba Corp 入力装置、コンピュータ装置、情報処理方法及び情報処理プログラム
JP4417224B2 (ja) * 2004-10-25 2010-02-17 本田技研工業株式会社 燃料電池スタック
US9019209B2 (en) * 2005-06-08 2015-04-28 3M Innovative Properties Company Touch location determination involving multiple touch location processes
FR2891928B1 (fr) * 2005-10-11 2008-12-19 Abderrahim Ennadi Clavier a ecran tactile universel multilingue et multifonction
US7659887B2 (en) * 2005-10-20 2010-02-09 Microsoft Corp. Keyboard with a touchpad layer on keys
CA2698737C (fr) * 2007-09-19 2017-03-28 Cleankeys Inc. Surface tactile et sensible a un tapotement nettoyable
KR101352994B1 (ko) * 2007-12-10 2014-01-21 삼성전자 주식회사 적응형 온 스크린 키보드 제공 장치 및 그 제공 방법
KR101456490B1 (ko) * 2008-03-24 2014-11-03 삼성전자주식회사 터치 스크린 키보드 디스플레이 방법 및 그와 같은 기능을갖는 장치
TWI360762B (en) * 2008-09-05 2012-03-21 Mitake Information Corp On-screen virtual keyboard system
US8633901B2 (en) * 2009-01-30 2014-01-21 Blackberry Limited Handheld electronic device having a touchscreen and a method of using a touchscreen of a handheld electronic device
CN101937313B (zh) * 2010-09-13 2019-11-12 中兴通讯股份有限公司 一种触摸键盘动态生成和输入的方法及装置

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See references of WO2012075199A3 *

Also Published As

Publication number Publication date
KR101578769B1 (ko) 2015-12-21
KR20140116785A (ko) 2014-10-06
JP6208718B2 (ja) 2017-10-04
CN103443744B (zh) 2016-06-08
JP5782133B2 (ja) 2015-09-24
WO2012075197A3 (fr) 2012-10-04
CN106201324A (zh) 2016-12-07
WO2012075199A2 (fr) 2012-06-07
JP2015232889A (ja) 2015-12-24
WO2012075199A3 (fr) 2012-09-27
CN103443744A (zh) 2013-12-11
EP2646894A2 (fr) 2013-10-09
WO2012075197A2 (fr) 2012-06-07
JP2014514785A (ja) 2014-06-19
CN106201324B (zh) 2019-12-13

Similar Documents

Publication Publication Date Title
WO2012075199A2 (fr) Multiplexed numeric keypad and touchpad
US20120075193A1 (en) Multiplexed numeric keypad and touchpad
US10126942B2 (en) Systems and methods for detecting a press on a touch-sensitive surface
US20210132796A1 (en) Systems and Methods for Adaptively Presenting a Keyboard on a Touch-Sensitive Display
US20090153495A1 (en) Input method for use in an electronic device having a touch-sensitive screen
JP5721323B2 (ja) 触覚的に生成された基準キーを有するタッチパネル
CN102224483B (zh) 具有绝对及相对输入模式的触敏显示屏幕
TWI416374B (zh) 輸入方法、輸入裝置及電腦系統
US20100259482A1 (en) Keyboard gesturing
US8456433B2 (en) Signal processing apparatus, signal processing method and selection method of user interface icon for multi-touch panel
US8519960B2 (en) Method and apparatus for switching of KVM switch ports using gestures on a touch panel
US20090077493A1 (en) Method for the Selection of Functions with the Aid of a User Interface, and User Interface
US20090066653A1 (en) Systems and methods for using a keyboard as a touch panel
JP2011221640A (ja) 情報処理装置、情報処理方法およびプログラム
JP5556398B2 (ja) 情報処理装置、情報処理方法およびプログラム
US20100220067A1 (en) Portable electronic device with a menu selection interface and method for operating the menu selection interface
JP6162299B1 (ja) 情報処理装置、入力切替方法、及びプログラム
JPWO2011118602A1 (ja) タッチパネル機能付き携帯端末及びその入力方法
CN114690887B (zh) 一种反馈方法以及相关设备
EP2615534A1 (fr) Dispositif électronique et son procédé de contrôle
CN116601586A (zh) 一种虚拟键盘的处理方法以及相关设备
CN101470575B (zh) 电子装置及其输入方法
US20130234997A1 (en) Input processing apparatus, input processing program, and input processing method
WO2010084973A1 (fr) Dispositif d'entrée, dispositif de traitement d'informations, procédé d'entrée et programme
US20100038151A1 (en) Method for automatic switching between a cursor controller and a keyboard of depressible touch panels

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20130606

AK Designated contracting states

Kind code of ref document: A2

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

DAX Request for extension of the european patent (deleted)
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20140603