WO2009034220A1 - Touch screen control system and method - Google Patents

Touch screen control system and method

Info

Publication number
WO2009034220A1
Authority
WO
WIPO (PCT)
Prior art keywords
symbol
touch screen
candidate
input
input section
Prior art date
Application number
PCT/FI2007/050489
Other languages
English (en)
Inventor
Jarkko Vatjus-Anttila
Ilkka Uhari
Original Assignee
Elektrobit Wireless Communications Oy
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Elektrobit Wireless Communications Oy filed Critical Elektrobit Wireless Communications Oy
Priority to PCT/FI2007/050489 priority Critical patent/WO2009034220A1/fr
Publication of WO2009034220A1 publication Critical patent/WO2009034220A1/fr

Links

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/02Input arrangements using manually operated switches, e.g. using keyboards or dials
    • G06F3/023Arrangements for converting discrete items of information into a coded form, e.g. arrangements for interpreting keyboard generated codes as alphanumeric codes, operand codes or instruction codes
    • G06F3/0233Character input methods
    • G06F3/0237Character input methods using prediction or retrieval techniques
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/02Input arrangements using manually operated switches, e.g. using keyboards or dials
    • G06F3/023Arrangements for converting discrete items of information into a coded form, e.g. arrangements for interpreting keyboard generated codes as alphanumeric codes, operand codes or instruction codes
    • G06F3/0233Character input methods
    • G06F3/0236Character input methods using selection techniques to select from displayed items
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04886Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F40/00Handling natural language data
    • G06F40/20Natural language analysis
    • G06F40/274Converting codes to words; Guess-ahead of partial word inputs

Definitions

  • the invention relates to a method of controlling a touch screen and a control system implementing the method.
  • a touch screen typically shows symbols, such as characters, which are pressed by the user in order to provide an input for writing an intended symbol sequence.
  • the symbols may be arranged according to a standard keyboard system, such as the QWERTY system, which may be specific to the applied character system.
  • the size of the touch screen may still be insufficient for reliable user input. Therefore, it is useful to consider techniques for improving the usability of touch screens.
  • An object of the invention is to provide an improved method and a control system for controlling a touch screen.
  • a method of controlling a touch screen comprising generating a display control signal for indicating on the touch screen an input section of at least one candidate symbol of at least one candidate symbol sequence, wherein the at least one candidate symbol sequence matches with a selected symbol selected from the touch screen and wherein the at least one candidate symbol is predicted by using the selected symbol and known symbol sequences.
  • a control system for controlling a touch screen comprising a display controller coupled to a prediction text engine which is configured to predict at least one candidate symbol of at least one candidate symbol sequence which matches with a selected symbol selected from a touch screen by using the selected symbol and known symbol sequences, wherein the display controller is configured to generate a display control signal for indicating on the touch screen an input section of the at least one candidate symbol.
  • the invention provides several advantages.
  • the indication of a candidate symbol on the keyboard section of the touch screen provides guidance for the user on entering characters with the touch screen, and the user only needs to touch the indicated input section to select the intended candidate symbol.
  • Figure 1 shows an example of the layout of a keyboard of a touch screen
  • Figure 2 shows a first example of the structure of an electronic device
  • Figure 3 shows a second example of the structure of an electronic device
  • Figure 4 shows an example of the structure of a control system
  • Figure 5 shows a first embodiment of the invention
  • Figure 6 shows a second embodiment of the invention
  • Figure 7 shows a third embodiment of the invention
  • Figure 8 shows a fourth embodiment of the invention
  • Figure 9 shows a fifth embodiment of the invention
  • Figure 10 shows another embodiment of the invention
  • Figure 11 illustrates a first embodiment of a methodology according to an embodiment of the invention.
  • Figure 12 illustrates a second embodiment of a methodology according to an embodiment of the invention.
  • a touch screen, also referred to as a touch panel or a touch screen panel, is a display overlay which has the ability to display and receive information on one and the same surface.
  • the effect of such an overlay allows a display to be used as an input device and reduces the need for other user interface components, such as a keyboard and/or a mouse.
  • the touch screen typically comprises a layered structure of a display element, such as an LCD (liquid crystal display), and sensors for sensing the touch of the user.
  • the sensors may be based on capacitive sensing, resistive sensing, strain gauge sensing, or optical imaging, for example. Embodiments of the invention are not, however, restricted to the presented input mechanisms.
  • Figure 1 shows a keyboard section 104 and a display section 102 of the touch screen.
  • the keyboard section 104 includes input sections 106 of symbols.
  • the input sections 106 may be displayed to the user on the touch screen.
  • An input section 106 is an area on the touch screen which is associated with a corresponding symbol. When the input section 106 is activated by touching the corresponding area or by bringing a finger to the proximity of the corresponding area, an input of a corresponding symbol is materialized in the electronic device 100.
  • the electronic device 100 may be a portable electronic device, such as a mobile phone, a PDA (Personal Digital Assistant) or a portable game console, for example.
  • the electronic device 100 is a user interface device of a vehicle. Such a user interface device may be associated with the control of the vehicle and/or a navigation system, for example.
  • the symbols may be characters, such as letters of the Roman alphabet and/or numerals.
  • the characters may encode Chinese, Japanese or Korean characters or other character-based symbol systems.
  • the electronic device (ED) 200 comprises a processing unit (PU) 202, a memory unit (MEM) 204, a user interface (Ul) 206, and a user interface control circuit (UI-CC) 208.
  • the processing unit 202 executes computer processes according to a computer program of encoded instructions stored in the memory unit 204.
  • the memory unit 204 may comprise mass storage media, such as flash memory, and dynamic random access memory (DRAM).
  • the user interface 206 comprises the touch screen and possibly other user interface components, such as audio devices and switches.
  • the user interface control circuit 208 comprises analogue circuits for providing analogue control signals for the touch screen, according to which the touch screen configures the liquid crystal segments or other elementary optical switches such that the intended optical effect is obtained.
  • the user interface control circuit 208 may further comprise a sensor control circuit which receives an analogue signal from the touch screen and transforms the analogue signal into digital signals.
  • Figure 3 shows the electronic device 300 comprising the touch screen (TS) 302, a control system 316, a predictive text engine (PTE) 308 and a data base (DB) 310.
  • the control system 316 comprises an input controller (I-CNTL) 304 and a display controller (D-CNTL) 306.
  • the input controller 304 receives an analogue input signal 312 from the touch screen 302, which analogue input signal 312 characterizes the location of the input section 106 of symbol "N" on the touch screen 302.
  • the input controller 304 transforms the analogue input signal 312 into symbol information 314 which represents the selected symbol "N".
  • the symbol information 314 is further supplied to the display controller 306 which may instruct the touch screen 302 to display the selected symbols "KN" in the display section 102 and/or indicate the input section 106 of the selected symbol "N" on the keyboard section 104.
  • the symbol information 314 is supplied to the predictive text engine 308.
  • the predictive text engine 308 fetches candidate symbol sequences 318 from the database 310, which candidate symbol sequences 318 match with the selected symbol and/or symbols already entered.
  • the candidate symbol sequences may be, for example, words which begin with the symbols entered so far.
  • the predictive text engine 308 predicts candidate symbols by using the candidate symbol sequences 318.
  • the candidate symbol is a symbol which succeeds the selected symbol.
  • the candidate symbols succeeding symbol "N" are "E", "O", "I" and "U".
  • the predictive text engine 308 may be based on any prior art prediction method, such as T9 method (Text on 9 Keys), capable of providing candidate symbols for a symbol sequence.
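As an illustration only (not the patent's own implementation), the next-symbol prediction described above can be sketched in a few lines of Python: given the symbols entered so far and a set of known symbol sequences, the candidate symbols are the symbols that immediately follow the matching prefix. The word list below is hypothetical.

```python
def candidate_symbols(entered, known_sequences):
    """Return, in sorted order, the symbols that may follow the entered
    prefix, based on known symbol sequences (e.g. a word list)."""
    n = len(entered)
    return sorted({seq[n] for seq in known_sequences
                   if seq.startswith(entered) and len(seq) > n})

# Hypothetical word list standing in for the database 310.
words = ["KNEE", "KNOB", "KNOCK", "KNIT", "KNUCKLE"]
print(candidate_symbols("KN", words))  # → ['E', 'I', 'O', 'U']
```

With the prefix "KN" this yields the candidate symbols "E", "I", "O" and "U", matching the example in the text; a real predictive text engine such as T9 would add frequency-based ranking on top of this.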
  • the predictive text engine 308 supplies candidate symbol information 320 to the display controller 306.
  • the candidate symbol information 320 includes the list of the candidate symbols "E", "O", "I" and "U".
  • the display controller 306 generates a display control signal 322 for indicating on the touch screen 302 input sections 106 of the candidate symbols "E", "O", "I" and "U" predicted by the predictive text engine 308.
  • the touch screen 302 receives the display control signal 322 and indicates the input sections 106 of the candidate symbols "E", "O", "I" and "U".
  • the user is now capable of observing the candidate symbols and may press the input section of the intended symbol "O" by using the touch screen.
  • the candidate symbol sequences "KNOB" and "KNOCK" are relevant.
  • the input controller 304 receives the analogue input signal 312 from the touch screen 302, which analogue input signal 312 characterizes the location of the input section 106 of symbol "O" on the touch screen 302.
  • the prediction and display process are executed in the same manner as in the case of the previous symbol.
  • the intended symbol sequence "KNOB" may be accepted by pressing an acceptance input section on the touch screen or by lifting the finger or other pointer from the touch screen, depending upon an embodiment of the invention.
  • With reference to Figure 4, the structure of the control system 316 is shown in detail.
  • the input controller 304 comprises a sensor control circuitry (S-CC) 402, which detects the difference in the electric characteristics, such as capacitance, of the sensors of the touch screen 302.
  • the sensor control circuitry 402 transforms the difference into a digital output 410 which is supplied to the input processor (I-PRSR) 404.
  • the sensor control circuitry 402 may comprise a capacitance- sensitive electric circuit and a digitizer.
  • the capacitance-sensitive electric circuit may be based on an RLC circuit whose current-voltage characteristics depend on the capacitance of the sensors. The change in the current-voltage characteristics may be measured, and the result may be digitized in an analogue-to-digital converter.
  • the sensor control circuitry 402 may be implemented with ASIC (Application-Specific Integrated Circuit) and software, for example.
  • the input processor 404 receives the digital output 410 from the sensor control circuitry 402 and generates symbol information from the digital output 410.
  • the control system 316 may comprise a symbol map (SM) 414, which associates each symbol with the corresponding input section 106.
  • the symbol map 414 returns the coordinates of the input section 106 of a symbol as a response to an input of the symbol.
  • the symbol map 414 returns a symbol as a response to an input of a coordinate falling into an input section 106 of the symbol.
  • the processing of the location information of a symbol on the touch screen may be carried out by well known geometry and interpolation algorithms.
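A minimal sketch of such a symbol map is given below. The class name, rectangle layout and coordinates are illustrative assumptions, not taken from the patent; the two lookup directions mirror the two responses described above.

```python
class SymbolMap:
    """Associates each symbol with its input-section rectangle and
    resolves a coordinate back to a symbol (coordinates illustrative)."""

    def __init__(self, sections):
        # sections: {symbol: (x, y, width, height)} in touch-screen coordinates
        self.sections = dict(sections)

    def coordinates_of(self, symbol):
        """Return the input-section rectangle of a symbol."""
        return self.sections[symbol]

    def symbol_at(self, x, y):
        """Return the symbol whose input section contains (x, y), or None."""
        for symbol, (sx, sy, w, h) in self.sections.items():
            if sx <= x < sx + w and sy <= y < sy + h:
                return symbol
        return None

keyboard = SymbolMap({"N": (50, 40, 10, 10), "O": (85, 30, 10, 10)})
print(keyboard.symbol_at(52, 45))  # → N
```

Because the map is a dynamic data structure, the rectangles can be updated at runtime, which is what allows the enlargement and rearrangement embodiments described later.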
  • the display controller 306 includes a display processor (D-PRSR) 406 and a display control circuitry (D-CC) 408.
  • the display processor 406 receives the candidate symbol information 320.
  • the display processor 406 may supply the candidate symbol information 320 to the symbol map 414 and receive coordinates of the corresponding input section 106 as a response.
  • the display processor 406 generates graphics data 412 which defines the graphical expression of the input sections 106 of the candidate symbols.
  • the graphics data 412 further includes information for indicating the input section of the candidate symbols on the touch screen 302.
  • the graphics data 412 is supplied to the display control circuitry 408 which transforms the graphics data 412 into the display control signal 322.
  • the symbol map 414 is a dynamic data structure where the location of the input section 106 of a symbol may be varied according to an embodiment of the invention.
  • the display control circuitry 408 may be implemented with ASIC (Application-Specific Integrated Circuit) and software, for example.
  • the input processor 404 is implemented with a computer program which may be stored in the memory 204 and executed in the processing unit 202.
  • the display processor 406 is implemented with a computer program which may be stored in the memory 204 and executed in the processing unit 202.
  • the symbol map 414 is implemented with a computer program which may be stored in the memory 204 and executed in the processing unit 202.
  • the display controller 306 may generate the display control signal 322 for emphasizing the input sections 502, 504, 506, 508 of the candidate symbols "E", "U", "I" and "O", respectively, on the touch screen.
  • the input sections 502 to 508 may be emphasized with color, shape, or brightness of the input sections 502 to 508.
  • the display processor 406 may include an algorithm which implements the color, shape or brightness of the input sections 502 to 508.
  • the symbol map 414 may also include predefined characteristics of the input sections 502 to 508 to be emphasized.
  • the display controller 306 emphasizes the input sections 602, 604, 606, 608 by enlarging the visual expression of the input sections 602 to 608.
  • the display processor 406 may include an algorithm which implements the enlargement of the input sections 602 to 608.
  • the symbol map 414 may also include predefined characteristics of the enlarged expressions of the input sections 602 to 608 to be emphasized.
  • the display controller 306 de-emphasizes input sections of non-candidate symbols by changing the color, shape, size or brightness of the non-candidate symbols.
  • the non-candidate symbols are symbols which are not selected as candidate symbols by the predictive text engine 308.
  • the de-emphasis may be implemented with an algorithm executed in the display processor 406.
  • the input controller 304 may enlarge the effective input section 602 to 604 (dashed line) of the candidate symbols on the touch screen without enlarging it visually. Alternatively or additionally, the input controller 304 may visually enlarge the input section 606 to 608 (continuous line) of the candidate symbols on the touch screen. Either enlargement of the input sections 602 to 608 of the candidate symbols increases the reliability of a pressing act by the user, since the effective input area corresponding to the candidate symbol is increased in both cases.
  • the enlargement of the input section 602 to 608 may be implemented with an algorithm executed in the input processor 404 or predefined enlarged input section boundary information obtained from the symbol map 414.
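One possible way to realize the invisible enlargement of the effective input sections is sketched below; the margin value, data layout and precedence rule are assumptions for illustration, not the patent's definitive implementation.

```python
def resolve_touch(x, y, sections, candidates, margin=5):
    """Resolve a touch at (x, y) to a symbol. Candidate symbols get an
    effective input section enlarged by `margin` on every side, without
    any visual change; other symbols keep their normal sections."""
    # Candidate symbols are checked first, with enlarged boundaries.
    for symbol in candidates:
        sx, sy, w, h = sections[symbol]
        if (sx - margin <= x < sx + w + margin
                and sy - margin <= y < sy + h + margin):
            return symbol
    # Fall back to the normal (non-enlarged) sections.
    for symbol, (sx, sy, w, h) in sections.items():
        if sx <= x < sx + w and sy <= y < sy + h:
            return symbol
    return None

sections = {"O": (85, 30, 10, 10), "P": (97, 30, 10, 10)}
# A touch slightly outside the visual "O" key still selects "O"
# when "O" is a candidate symbol:
print(resolve_touch(83, 35, sections, candidates=["O"]))  # → O
```

Note that when the enlarged zones of two candidates overlap, the first candidate in the list wins here; a more careful implementation might instead pick the candidate whose section centre is nearest to the touch point.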
  • the input controller 304 arranges the input sections 702, 704, 706, 708 of the candidate symbols to be adjacent to the section 710 of the selected symbol.
  • the proximity of the input sections 702 to 708 of the candidate symbols to the section 710 of the selected symbol allows the user to continue the entry of symbols by sliding his or her finger from the selected symbol "N" to the input section 708 of the intended candidate symbol "O".
  • the intended candidate symbol "O” will be entered and the predictive text engine 308 executes a new prediction process for the next candidate symbol.
  • the display controller 306 generates the display control signal 322 for indicating the input section 702 to 708 adjacent to the section 710 of the selected symbol.
  • the input sections 702 to 708 are arranged in a fan-like configuration.
  • the view following the view of Figure 7 is illustrated in Figure 8.
  • the input sections 802, 804 of new candidate symbols "B" and "C" are arranged adjacent to the previous candidate symbol "O".
  • the user may still slide his or her finger to the input section 802 of the intended candidate symbol "B".
  • the control system 316 may finalize the entering after detecting the absence of the user's finger when the user lifts his or her finger from the touch screen. As a result, the symbol sequence "KNOB" has been entered into the electronic device 100.
  • the arrangement of the input sections 702 to 708 and 802, 804 may be implemented with an algorithm executed in the input processor 404 and the display processor 406.
  • the rules according to which the input sections 702 to 708 and 802, 804 are arranged may depend on the location of the previous candidate symbol on the touch screen. If the previous candidate symbol or selected symbol is located at the edge of the touch screen, the input sections 702 to 708 and 802, 804 of the candidate symbols may be directed such that they fit to the touch screen.
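The fan-like arrangement and the fit-to-screen rule above can be sketched as follows; the radius, arc angles and screen size are illustrative assumptions only.

```python
import math

def fan_positions(center, n, radius=30.0, screen=(320, 240),
                  start_deg=-60.0, end_deg=60.0):
    """Place n candidate input sections on an arc (a "fan") around the
    selected symbol's position, clamping each position to the screen
    area so the fan still fits when the symbol lies near an edge."""
    cx, cy = center
    w, h = screen
    positions = []
    for i in range(n):
        t = i / max(n - 1, 1)  # spread candidates evenly along the arc
        angle = math.radians(start_deg + t * (end_deg - start_deg))
        x = cx + radius * math.sin(angle)  # fan opens upwards from center
        y = cy - radius * math.cos(angle)
        positions.append((min(max(x, 0.0), w), min(max(y, 0.0), h)))
    return positions

# Four candidate sections fanned above a selected symbol at (160, 200):
print(fan_positions((160, 200), 4))
```

A fuller implementation would rotate the whole fan away from a nearby screen edge rather than merely clamping, but the clamping already guarantees every candidate section stays on the touch screen.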
  • the display controller 306 may arrange the input sections 808, 810, 812, 814 of the candidate symbols to the proximity of a trajectory 806 with predefined characteristics.
  • the symbols "K" and "N" have been selected and the input sections 808, 810, 812, 814 of candidate symbols "E", "O", "U" and "I", respectively, are arranged around the trajectory 806.
  • symbol "O" has further been selected, and the input sections 816 and 818 of candidate symbols "C" and "B", respectively, have been arranged to the proximity of the trajectory 806.
  • the display controller 306 may or may not show the trajectory 806 to the user.
  • the trajectory 806 provides predefined guidance for the user so that the user may find the input sections 808 to 818 of the candidate symbols in a predefined area of the touch screen, independent of the location of the symbol on the standard keyboard, such as the QWERTY keyboard.
  • the trajectory 806 has a predefined shape.
  • the shape may be continuous, i.e. free of discontinuities such as sharp corners.
  • the trajectory 806 has a predefined location on the touch screen.
  • the shape of the trajectory can be of any kind that is easy to use for the end-user.
  • the shape may be simple (such as a circle), may take full advantage of the display surface area (such as an ellipse on a standard-shaped LCD display) and may serve both left- and right-handed users equally (such as the shape of the number 8).
  • the purpose of the trajectory is to enable the end-user to follow it easily when writing a word.
  • the underlying implementation must enable shaping of the trajectory in such a way that it approximates the predefined shape as closely as possible (for example an ellipse, which would be a common choice in this kind of implementation).
  • Mathematical methods for accomplishing these requirements include, for example, Bézier curves and patches.
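As a sketch of the trajectory-based arrangement, candidate input sections can be distributed evenly along a cubic Bézier curve; the control points below are illustrative and not taken from the patent.

```python
def bezier_point(p0, p1, p2, p3, t):
    """Point on a cubic Bezier curve at parameter t in [0, 1]."""
    u = 1.0 - t
    return tuple(u**3 * a + 3 * u**2 * t * b + 3 * u * t**2 * c + t**3 * d
                 for a, b, c, d in zip(p0, p1, p2, p3))

def place_on_trajectory(symbols, control_points):
    """Distribute candidate input sections evenly along the smooth
    trajectory defined by four cubic Bezier control points."""
    n = len(symbols)
    return {s: bezier_point(*control_points, i / max(n - 1, 1))
            for i, s in enumerate(symbols)}

# An ellipse-like arc across the display (illustrative control points):
arc = ((20, 200), (20, 40), (300, 40), (300, 200))
print(place_on_trajectory(["E", "O", "U", "I"], arc))
```

Spacing candidates by equal parameter steps is the simplest choice; spacing them by equal arc length, or stitching several curves into a figure-of-eight for left- and right-handed use, would follow the same pattern.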
  • a display control signal is generated for indicating on the touch screen an input section 106 of at least one candidate symbol of at least one candidate symbol sequence, wherein the at least one candidate symbol sequence matches with a selected symbol selected from the touch screen and wherein the at least one candidate symbol is predicted by using the selected symbol and known symbol sequences.
  • a display control signal is generated for de-emphasizing the input sections of non-candidate symbols on the touch screen.
  • it is assessed whether or not input sections of candidate symbols are enlarged.
  • the input section of the at least one candidate symbol is enlarged on the touch screen.
  • the method/computer process starts.
  • the input sections 702 to 708, 802, 804 of the at least one candidate symbol are arranged to be adjacent to the section of the selected symbol or adjacent to the section of a previous candidate symbol on the touch screen.
  • the input section 808 to 818 of the at least one candidate symbol is arranged to the proximity of a trajectory 806 with predefined characteristics.
  • the embodiments of the computer process may be implemented as a computer program comprising instructions for executing the computer process.
  • the computer program may be stored in the memory unit 204 and executed in the processing unit 202 of the electronic device 200.
  • the computer program may be stored on a computer program distribution medium readable by a computer or a processor.
  • the computer program medium may be, for example, but not limited to, an electric, magnetic, optical, infrared or semiconductor system, device or transmission medium.
  • the computer program medium may include at least one of the following media: a computer readable medium, a program storage medium, a record medium, a computer readable memory, a random access memory, an erasable programmable read-only memory, a computer readable software distribution package, a computer readable signal, a computer readable telecommunications signal, computer readable printed matter, and a computer readable compressed software package.

Abstract

The invention proposes a method and a control system for controlling a touch screen, wherein a display control signal is generated for indicating on the touch screen an input section of candidate symbols of candidate symbol sequences. The candidate symbol sequences match a symbol selected from the touch screen, and candidate symbols are predicted by using the selected symbol and known symbol sequences.
PCT/FI2007/050489 2007-09-13 2007-09-13 Système de commande d'écran tactile et procédé WO2009034220A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/FI2007/050489 WO2009034220A1 (fr) 2007-09-13 2007-09-13 Système de commande d'écran tactile et procédé

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/FI2007/050489 WO2009034220A1 (fr) 2007-09-13 2007-09-13 Système de commande d'écran tactile et procédé

Publications (1)

Publication Number Publication Date
WO2009034220A1 true WO2009034220A1 (fr) 2009-03-19

Family

ID=40451612

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/FI2007/050489 WO2009034220A1 (fr) 2007-09-13 2007-09-13 Système de commande d'écran tactile et procédé

Country Status (1)

Country Link
WO (1) WO2009034220A1 (fr)


Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050071778A1 (en) * 2003-09-26 2005-03-31 Nokia Corporation Method for dynamic key size prediction with touch displays and an electronic device using the method
US20060265648A1 (en) * 2005-05-23 2006-11-23 Roope Rainisto Electronic text input involving word completion functionality for predicting word candidates for partial word inputs
US20070046641A1 (en) * 2005-09-01 2007-03-01 Swee Ho Lim Entering a character into an electronic device


Cited By (25)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8605039B2 (en) 2009-03-06 2013-12-10 Zimpl Ab Text input
WO2010099835A1 (fr) * 2009-03-06 2010-09-10 Mobytech Consulting Ab Entrée de texte améliorée
EP2284653A1 (fr) * 2009-08-14 2011-02-16 Research In Motion Limited Dispositif électronique doté d'un affichage sensible au toucher et procédé de facilitation de saisie pour le dispositif électronique
WO2011056610A3 (fr) * 2009-10-26 2011-11-24 Google Inc. Entrée de texte prédictif pour des dispositifs d'entrée
EP2369445A1 (fr) 2010-02-26 2011-09-28 Research In Motion Limited Dispositif électronique doté d'un affichage sensible au toucher et procédé de facilitation de saisie pour le dispositif électronique
US8456435B2 (en) 2010-02-26 2013-06-04 Research In Motion Limited Electronic device with touch-sensitive display and method of facilitating input at the electronic device
US8830200B2 (en) 2010-02-26 2014-09-09 Blackberry Limited Electronic device with touch-sensitive display and method of facilitating input at the electronic device
EP2407859A1 (fr) * 2010-07-16 2012-01-18 Gigaset Communications GmbH Adaptation dynamique d'une surface de commande sur un écran de capteur
US9207865B2 (en) 2011-04-11 2015-12-08 Huawei Device Co., Ltd. Information processing method and terminal device
EP2672372A4 (fr) * 2011-04-11 2015-01-14 Huawei Device Co Ltd Procédé de traitement d'information et dispositif de terminal
EP2672372A2 (fr) * 2011-04-11 2013-12-11 Huawei Device Co., Ltd. Procédé de traitement d'information et dispositif de terminal
US10430054B2 (en) 2011-08-15 2019-10-01 Telefonaktiebolaget Lm Ericsson (Publ) Resizing selection zones on a touch sensitive display responsive to likelihood of selection
WO2013024317A1 (fr) * 2011-08-15 2013-02-21 Telefonaktiebolaget Lm Ericsson (Publ) Resizing selection zones on a touch sensitive display responsive to likelihood of selection
US9098189B2 (en) 2011-08-15 2015-08-04 Telefonaktiebolaget L M Ericsson (Publ) Resizing selection zones on a touch sensitive display responsive to likelihood of selection
US9557890B2 (en) 2012-02-06 2017-01-31 Michael K Colby Completing a word or acronym using a multi-string having two or more words or acronyms
CN104160361A (zh) * 2012-02-06 2014-11-19 Michael K. Colby Character-string completion
EP2812777A4 (fr) * 2012-02-06 2015-11-25 Michael K Colby Character-string completion
US9696877B2 (en) 2012-02-06 2017-07-04 Michael K. Colby Character-string completion
EP2660684A1 (fr) * 2012-04-30 2013-11-06 BlackBerry Limited User interface for changing an input state of a virtual keyboard
WO2014047084A1 (fr) * 2012-09-18 2014-03-27 Microsoft Corporation Gesture-initiated keyboard functions
JP2014087047A (ja) * 2012-10-18 2014-05-12 Samsung Electronics Co Ltd Display apparatus and character input method thereof
US9285953B2 (en) 2012-10-18 2016-03-15 Samsung Electronics Co., Ltd. Display apparatus and method for inputting characters thereof
EP2722731A1 (fr) * 2012-10-18 2014-04-23 Samsung Electronics Co., Ltd. Display apparatus and character input method thereof
US10216409B2 (en) 2013-10-30 2019-02-26 Samsung Electronics Co., Ltd. Display apparatus and user interface providing method thereof
EP3108338A1 (fr) * 2014-02-21 2016-12-28 Drnc Holdings, Inc. Methods for facilitating entry of user input into computing devices

Similar Documents

Publication Publication Date Title
WO2009034220A1 (fr) Touch screen control system and method
JP4518955B2 (ja) User interface using a displaced representation of the contact area
US6104317A (en) Data entry device and method
US10275152B2 (en) Advanced methods and systems for text input error correction
CN102576282B (zh) 操作控制设备和操作控制方法
US5844561A (en) Information search apparatus and information search control method
EP2400373A1 (fr) Entering symbols into an electronic device having a touch screen
US20100225592A1 (en) Apparatus and method for inputting characters/numerals for communication terminal
US9448722B2 (en) Text entry into electronic devices
US20150123928A1 (en) Multi-touch text input
US20130113714A1 (en) Electronic Device Having Single Hand Multi-Touch Surface Keyboard and Method of Inputting to Same
US20120306767A1 (en) Method for editing an electronic image on a touch screen display
US20050240879A1 (en) User input for an electronic device employing a touch-sensor
US20060061542A1 (en) Dynamic character display input device
US20110032200A1 (en) Method and apparatus for inputting a character in a portable terminal having a touch screen
AU2009295791B2 (en) Method and device for inputting texts
WO2007121673A1 (fr) Method and device for improving Chinese character input speed
US11112965B2 (en) Advanced methods and systems for text input error correction
KR20080097114A (ko) Character input apparatus and method
US9753637B2 (en) Information processing apparatus, information processing method, and program
EP2813936A1 (fr) Information processing device, display form control method, and non-transitory computer-readable medium
JP5893491B2 (ja) Operation input device
JP2010257197A (ja) Input processing device
US20130215037A1 (en) Multi-touch surface keyboard with multi-key zones on an adaptable home line and method of inputting to same
EP2400372B1 (fr) Entering symbols into an electronic device having a touch screen

Legal Events

Date Code Title Description
121 Ep: the EPO has been informed by WIPO that EP was designated in this application

Ref document number: 07823126

Country of ref document: EP

Kind code of ref document: A1

DPE1 Request for preliminary examination filed after expiration of 19th month from priority date (PCT application filed from 20040101)
NENP Non-entry into the national phase

Ref country code: DE

122 Ep: PCT application non-entry in European phase

Ref document number: 07823126

Country of ref document: EP

Kind code of ref document: A1