WO2003098417A2 - Object entry into an electronic device - Google Patents

Object entry into an electronic device

Info

Publication number
WO2003098417A2
Authority
WO
WIPO (PCT)
Prior art keywords
field
individually selectable
additional
selectable fields
display
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/IB2003/001710
Other languages
English (en)
French (fr)
Other versions
WO2003098417A3 (en)
Inventor
Marco Van Leeuwen
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Koninklijke Philips NV
Original Assignee
Koninklijke Philips Electronics NV
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Koninklijke Philips Electronics NV filed Critical Koninklijke Philips Electronics NV
Priority to AT03752872T priority Critical patent/ATE436048T1/de
Priority to JP2004505864A priority patent/JP4429898B2/ja
Priority to AU2003225484A priority patent/AU2003225484A1/en
Priority to DE60328284T priority patent/DE60328284D1/de
Priority to EP03752872A priority patent/EP1509832B1/en
Priority to KR1020047018628A priority patent/KR100941948B1/ko
Priority to US10/514,594 priority patent/US7424683B2/en
Publication of WO2003098417A2 publication Critical patent/WO2003098417A2/en
Publication of WO2003098417A3 publication Critical patent/WO2003098417A3/en
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/02Input arrangements using manually operated switches, e.g. using keyboards or dials
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/02Input arrangements using manually operated switches, e.g. using keyboards or dials
    • G06F3/023Arrangements for converting discrete items of information into a coded form, e.g. arrangements for interpreting keyboard generated codes as alphanumeric codes, operand codes or instruction codes
    • G06F3/0233Character input methods
    • G06F3/0236Character input methods using selection techniques to select from displayed items
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0482Interaction with lists of selectable items, e.g. menus
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04886Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus

Definitions

  • the present invention relates to the field of entering objects into an electronic device comprising a display screen, and particularly to a method and apparatus for entering objects such as graphical characters into a small or portable electronic appliance which does not have a conventional alphanumeric keyboard with discrete keys. More specifically, the invention relates to an efficient system for entering objects from a large set of objects when the surface space that is available for entering the objects is relatively limited, such as on a cellular telephone, or in a personal digital assistant. The present invention also relates to a computer program product comprising software code portions for achieving the system and a method for entering objects into a small or portable electronic appliance when said product is run on a computer.
  • Examples of such appliances include personal digital assistants (PDAs), personal communication devices, e.g. mobile phones, laptop personal computers, multifunctional remote controls and web-tablets, to name a few, which help users store and organize information.
  • the trend has been towards an ever-increasing reduction of the size of such devices.
  • this reduction of size has resulted in the problem that data entry into these devices is often very difficult.
  • One prior-art approach to providing a graphical text entry system has been to display a graphical text entry wheel on a graphical text entry screen.
  • a plurality of characters is positioned on the graphical text entry wheel.
  • the system also includes a pointing device for rotating the graphical text entry wheel to allow a user to select one of the characters on the graphical text entry wheel to be entered.
  • the graphical text entry system may provide suggested subsequent characters or words as aids for text entries.
  • a system of this type is disclosed in US 6 011 542, to Sony.
  • An object of the present invention is to provide an improved system for entering objects, such as graphical characters, into a small electronic appliance. Another object of the invention is to provide an improved system for entering graphical characters into a small electronic appliance that also provides word suggestions as aids for text entries.
  • a further object of the invention is to provide an improved system for entering objects into a small electronic appliance in which the objects can be entered by using small movements of a finger or stylus on a touch-sensitive screen.
  • a still further object of the invention is to provide an improved system for entering objects into a small electronic appliance, wherein selection and entry of individual objects is effected on individually selectable fields requiring only a limited amount of space to be depicted on the display screen of the small information appliance.
  • Yet another object of the invention is to provide an improved system for entering objects into a small electronic appliance which allows a user to enter any individual object from a large set of objects.
  • a still further object of the invention is to provide an improved computer program product comprising software code portions for achieving the system and a method for entering objects into a small or portable electronic appliance when said product is run on a computer.
  • Fig. 1 is a top plan view of a highest level of individually selectable fields, each of which represents a subset of objects from a larger set of objects, here illustrated as subsets of graphical characters from a larger character set;
  • Fig. 2 is a top plan view of a subordinate level of individually selectable fields, each of which represents an individual character from the subset of graphical characters represented in the fields of Fig. 1;
  • Fig. 3 is a top plan view illustrating selection of an individual graphical character from the characters represented in the subordinate level of individually selectable fields of Fig. 2;
  • Fig. 4 is a top plan view illustrating a further subordinate level of individually selectable fields, each of which represents one candidate word for entry according to a second embodiment of the present invention;
  • Fig. 5 is a top plan view illustrating selection of an individual word from the candidate words represented in the further subordinate level of individually selectable fields of Fig. 4;
  • Fig. 6 is a flow chart of the method for entering objects of the embodiment of Figs. 1 through 3; and Fig. 7 illustrates an extension of the flow chart of Fig. 6 depicting the method for providing word suggestions as aids for text entries according to the embodiment of Figs. 4 and 5.
  • Fig. 1 shows a graphical display screen 10 displaying a highest level view of a graphical user interface system for entering objects from a set of objects into an electronic device according to an embodiment of the present invention.
  • the graphical display screen 10 of the electronic device comprises display means, where the display means may be any commonly used type of display, such as: a liquid crystal display; a plasma screen display; an electrochromic display; a cathode ray display.
  • the display means are preferably touch-sensitive.
  • the electronic device also comprises user input means (not shown) such as a stylus and touch-pad or touch-sensitive screen; a user's finger and touch-pad or touch-sensitive screen, or any combination of the above.
  • for a screen which is not touch-sensitive, the user input means may alternatively comprise, e.g., the keys of a handset of a wireless communications system; the keys of a remote control appliance; a joystick, trackball or computer mouse, or any combination of the above.
  • This alternative might, however, require some minor adjustments of the supporting graphics in addition to what will be described below.
  • the current embodiment will be described with reference to an electronic device comprising a touch-sensitive display screen and a stylus and/or user's finger for providing user input to the electronic device by performing operations on the graphical user interface of the object entry system.
  • Upon sensing placement by the user of the stylus and/or finger, hereinafter referred to as the pointer, at a position on the screen 10, the display means are operatively arranged for sensing the point of placement to determine a first pointer position 11 and for displaying, surrounding the first pointer position 11, a first plurality of individually selectable fields 12 representing a highest level of the graphical user interface. As shown in Fig. 1, these fields 12 are represented as rectangular or square boxes, but could alternatively take any suitable form or shape, such as circular, triangular or circle sectors. Each of these fields 12 represents a subset of objects from a larger set of objects. For ease of understanding, the objects are here illustrated as graphical characters from a larger set of characters, e.g. as shown in Fig.
  • the objects may alternatively be objects from other kinds of data collections commonly occurring in these types of devices. For example, the set of objects could be a set of contact data, with each individual object being contact data representative of an individual contact; alternatively, the set of objects could represent a top-level folder containing any number of subordinate folders, each with either further subordinate folders or individually selectable objects representative of items or functions which are accessible through the user interface of the electronic device.
  • the input means are then operatively arranged to allow movement of the pointer to either of the first individually selectable fields 12 through the user moving the pointer, kept in contact with the touch-sensitive screen 10, to the field representing the desired subset of characters, illustrated in Fig.
  • said display means are further operatively arranged to alternate between displaying, surrounding a selected field, an additional plurality of individually selectable fields 13, each of which represents either an individual character or a further subset of characters (not shown) from the subset of graphical characters represented in the previously selected field, and allowing movement of the pointer, kept in contact with the touch-sensitive screen, to either of the additional individually selectable fields 13, until the additional field selected represents an individual character, illustrated in Fig. 3 as the blackened field 13a containing the character "h".
  • any number of subordinate levels of individually selectable fields can be provided, even if the embodiment, shown here as an example, only illustrates one subordinate level.
  • In Fig. 2, only those additional fields which are displayed non-overlaid with the fields from the first plurality of individually selectable fields 12 contain a character and are thus selectable. It is of course possible to let all additional fields 13 comprise a character, but the presentation shown is preferred, as it facilitates an overview of the graphical user interface by the user and allows the user to perform a reverse movement of the pointer to the previous level to annul a performed selection.
  • the input means are further operatively arranged to allow entry of the graphical character represented in the selected additional field 13a through the user lifting the pointer from the touch-sensitive screen, whereupon the selected graphical character is entered into a text entry field 14 of the graphical user interface and the displayed fields are removed (a code sketch of this selection and entry flow is given at the end of this section).
  • the text entry field 14 of Figs. 1 to 3 as illustrated contains a previously entered character, here illustrated as the character "s", for the purpose of facilitating the following description.
  • the system is arranged to provide word suggestions as aids for text entries.
  • the system further comprises dictionary storage means (not shown), such as any conventional type of memory, e.g. RAM-memory circuits, ROM-memory circuits, a magnetic memory device (such as a hard-disk drive) or an optical memory device (such as a CD-ROM or DVD reader).
  • the dictionary storage means store a plurality of candidate words.
  • the system further comprises retrieval means (not shown) for retrieving a subset of candidate words from the dictionary storage means.
  • the retrieval means preferably comprise a microprocessor and software code portions for performing the retrieval process when executed on said microprocessor.
  • the subset of candidate words is retrieved on the basis of either the character represented in the selected additional field or the character represented in the selected additional field and at least one previously entered character, as illustrated by the character "s" in the text entry field 14 of Figs. 1, 2 and 3 (a retrieval sketch is likewise given at the end of this section).
  • the display means are operatively arranged to display, surrounding the selected additional field 13a, a further plurality 15 of individually selectable fields representing a further subordinate level of the graphical user interface. Each of the further individually selectable fields 15 represents one candidate word from the subset of candidate words.
  • the input means are also operatively arranged to allow movement of the pointer to either of the further individually selectable fields 15 through the user moving the pointer, kept in contact with the touch-sensitive screen 10, to the field representing the desired candidate word, illustrated in Fig. 5 as the blackened field 15a containing the word "ship", for performing a selection of the indicated further field.
  • the input means are further operatively arranged to allow entry of the candidate word represented in the selected further field 15a through the user lifting the pointer from the touch-sensitive screen 10, whereupon the selected candidate word is entered into the text entry field 14 of the graphical user interface, replacing the previously entered character, shown as the character "s" in Fig. 5.
  • the input means are operatively arranged to allow reverse movement of the pointer through the user moving the pointer, kept in contact with the touch-sensitive screen, from a currently selected field to a previous pointer position for annulling a performed selection. Upon such an annulment, presentation of any fields initiated through the previous selection is cancelled.
  • the display means of this embodiment are operatively arranged to display the individual characters represented in the additional plurality of individually selectable fields with an enlarged font size as compared to the font size of the subset of graphical characters represented in the first plurality of individually selectable fields.
  • the same font size could be used to present all characters.
  • the display means can be operatively arranged to provide a magnified version of the additional plurality of individually selectable fields 13 in response to using the input means to initiate movement of the pointer towards either of the additional individually selectable fields 13.
  • the display means are operatively arranged to display the additional plurality of individually selectable fields 13 at least partially overlaid with the previously displayed first plurality of individually selectable fields 12.
  • the system for entering objects from a set of objects into an electronic device is arranged in a handset of a wireless communication system, such as a mobile phone.
  • pointer movement and entry could be controlled by the user depressing designated keys from the plurality of keys commonly occurring on such devices.
  • the system for entering objects from a set of objects into an electronic device may alternatively be arranged in a personal digital assistant, a remote control appliance or any type of handheld electronic device.
  • movement could be effected by the user using alternative input means, such as the keys of a remote control appliance; a joystick, trackball or computer mouse; a stylus and touch-pad or touch-sensitive screen; a user's finger and touch-pad or touch-sensitive screen, or any combination of the above.
  • Fig. 6 is a flow chart of the method for entering objects from a set of objects, here illustrated as graphical characters from a character set, into an electronic device according to an embodiment of the present invention.
  • the user initiates the input process. In the embodiment described above, this is accomplished by the user placing the point of the pointer on the touch-sensitive screen. In other embodiments, initiation can be accomplished in any number of ways, e.g. simply by turning on the device, by pressing a dedicated key on the device, or by touching a dedicated area on a touch-sensitive screen or touch-pad.
  • the system determines a first pointer position and displays on the display, surrounding the first pointer position, a first plurality of individually selectable fields, each of which represents a subset of graphical characters from the character set (step 21). Thereafter, it is determined if the user has input commands for moving the pointer to either of the first individually selectable fields for selection of the indicated field (step 22), and the selected field is highlighted (step 23).
  • the system alternates between displaying on the display, surrounding a selected field, an additional plurality of individually selectable fields (step 24), each of which represents either an individual character or a further subset of characters from the subset of graphical characters represented in the previously selected field, and determining if the user has input commands for moving the pointer to either of the additional individually selectable fields for selection, highlighting the indicated additional field (step 25), until the additional field selected represents an individual graphical character. It is thereafter determined if the user has input commands for performing an entry of the graphical character represented in the selected additional field (step 26), whereupon the selected graphical character is entered into a text entry field of the graphical user interface (step 27) and the displayed fields are removed. Also envisaged is a further enhanced embodiment for providing word suggestions as aids for text entries, which is illustrated in Fig. 7; this flow chart is coupled to the flow chart of Fig. 6 through the broken-line arrow 28.
  • As shown in Fig. 7, when the user has no input commands for performing an entry of the graphical character represented in the selected additional field, the process flow continues via the broken-line arrow 28 as follows.
  • Dictionary storage means storing a plurality of candidate words are accessed (step 29) and a subset of candidate words is retrieved from the dictionary storage means based on either the character represented in the selected additional field or the character represented in the selected additional field and at least one previously entered character (step 30). Thereafter, the system displays on the display, surrounding the selected additional field, a further plurality of individually selectable fields, each of which represents one candidate word from the subset of candidate words (step 31).
  • As a next step, it is determined if the user has input commands for moving the pointer to either of the further individually selectable fields for performing a selection of the indicated further field (step 32), which field is then highlighted (step 33). It is thereafter determined if the user has input commands for performing an entry of the candidate word represented in the selected further field (step 34), whereupon this word is entered into a text entry field of the graphical user interface (step 35), replacing any previously entered character(s) used for retrieving the candidate words.
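
The selection and entry flow described above (pointer down, drag to a subset field, drag to a character field, lift to enter, reverse movement to annul) can be summarized as a small state machine. The following Python sketch is purely illustrative: the class and method names (FieldEntryModel, pointer_down and so on) and the example character subsets are assumptions made for illustration and are not taken from the patent itself.

    # Illustrative sketch of the two-level field selection described above.
    # All names and the subset layout are assumptions, not the patent's implementation.

    CHARACTER_SUBSETS = [            # first plurality of fields 12: one subset per field
        "abcdefgh", "ijklmnop", "qrstuvwx", "yz.,?!-'",
    ]

    class FieldEntryModel:
        """Pointer down -> level-1 fields -> level-2 fields -> lift to enter."""

        def __init__(self, subsets=CHARACTER_SUBSETS):
            self.subsets = subsets
            self.level1 = None        # index of the selected subset field
            self.level2 = None        # index of the selected character field
            self.text = ""            # contents of the text entry field 14

        def pointer_down(self):
            """Determine the first pointer position and show the level-1 fields around it."""
            self.level1 = self.level2 = None
            return self.subsets                      # fields 12 drawn around the pointer

        def move_to_subset(self, index):
            """Pointer dragged onto a level-1 field: show its characters as level-2 fields."""
            self.level1, self.level2 = index, None
            return list(self.subsets[index])         # additional fields 13, one character each

        def move_to_character(self, index):
            """Pointer dragged onto a level-2 field representing an individual character."""
            self.level2 = index

        def move_back(self):
            """Reverse movement to the previous pointer position annuls the last selection."""
            if self.level2 is not None:
                self.level2 = None
            elif self.level1 is not None:
                self.level1 = None

        def pointer_up(self):
            """Lifting the pointer enters the selected character and removes the fields."""
            if self.level1 is not None and self.level2 is not None:
                self.text += self.subsets[self.level1][self.level2]
            self.level1 = self.level2 = None
            return self.text

    # Entering the character "h" after an earlier "s", as in Figs. 1 to 3:
    ui = FieldEntryModel()
    ui.text = "s"
    ui.pointer_down()                # level-1 fields appear around the touch point
    ui.move_to_subset(0)             # drag onto the "abcdefgh" field
    ui.move_to_character(7)          # drag onto the field containing "h"
    print(ui.pointer_up())           # -> "sh"

On an actual device the same transitions would be driven by touch events on the display means, and deeper subset levels would simply repeat the move_to_subset step until an individual character is reached.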
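
For the word-suggestion embodiment (steps 29 to 31 of the flow chart), the dictionary lookup amounts to retrieving candidate words that begin with the characters entered so far followed by the currently selected character. The short sketch below is only an illustration under that assumption; the word list and the function name retrieve_candidates are invented for the example, and the real dictionary storage means could be any of the memory types described above.

    # Illustrative candidate-word retrieval; the dictionary contents are invented.
    DICTIONARY = ["shape", "share", "she", "shelf", "ship", "shop", "show", "sign"]

    def retrieve_candidates(selected_char, previous_chars="", dictionary=DICTIONARY, limit=8):
        """Return candidate words matching the previously entered characters
        followed by the character represented in the selected additional field."""
        prefix = previous_chars + selected_char
        return [word for word in dictionary if word.startswith(prefix)][:limit]

    # With "s" already entered and "h" selected (Figs. 4 and 5), the further
    # fields 15 would offer words such as "shape", "share", ..., "ship".
    print(retrieve_candidates("h", previous_chars="s"))

In the embodiment of Figs. 4 and 5, selecting one of the offered words and lifting the pointer would then replace the previously entered "s" in the text entry field with the chosen word.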

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Input From Keyboards Or The Like (AREA)
  • User Interface Of Digital Computer (AREA)
  • Position Input By Displaying (AREA)
  • Control Of Vending Devices And Auxiliary Devices For Vending Devices (AREA)
  • Vending Machines For Individual Products (AREA)
  • Devices For Checking Fares Or Tickets At Control Points (AREA)
  • Push-Button Switches (AREA)
  • Electrophonic Musical Instruments (AREA)
PCT/IB2003/001710 2002-05-21 2003-04-23 Object entry into an electronic device Ceased WO2003098417A2 (en)

Priority Applications (7)

Application Number Priority Date Filing Date Title
AT03752872T ATE436048T1 (de) 2002-05-21 2003-04-23 Objekteingabe in ein elektronisches gerät
JP2004505864A JP4429898B2 (ja) 2002-05-21 2003-04-23 電子装置に対するオブジェクト入力
AU2003225484A AU2003225484A1 (en) 2002-05-21 2003-04-23 Object entry into an electronic device
DE60328284T DE60328284D1 (de) 2002-05-21 2003-04-23 Objekteingabe in ein elektronisches Gerät
EP03752872A EP1509832B1 (en) 2002-05-21 2003-04-23 Object entry into an electronic device
KR1020047018628A KR100941948B1 (ko) 2002-05-21 2003-04-23 객체를 선택 및 입력하는 시스템, 객체 세트로부터 객체를 입력하는 방법, 및 이 방법을 구현하기 위한 소프트웨어 코드를 저장하기 위한 컴퓨터 판독가능 매체
US10/514,594 US7424683B2 (en) 2002-05-21 2003-04-23 Object entry into an electronic device

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
EP02076985.7 2002-05-21
EP02076985 2002-05-21

Publications (2)

Publication Number Publication Date
WO2003098417A2 true WO2003098417A2 (en) 2003-11-27
WO2003098417A3 WO2003098417A3 (en) 2004-11-11

Family

ID=29433166

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2003/001710 Ceased WO2003098417A2 (en) 2002-05-21 2003-04-23 Object entry into an electronic device

Country Status (10)

Country Link
US (1) US7424683B2
EP (1) EP1509832B1
JP (1) JP4429898B2
KR (1) KR100941948B1
CN (1) CN1308803C
AT (1) ATE436048T1
AU (1) AU2003225484A1
DE (1) DE60328284D1
ES (1) ES2328921T3
WO (1) WO2003098417A2

Cited By (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1826656A2 (en) * 2006-02-28 2007-08-29 Samsung Electronics Co., Ltd. Portable device and special character input method thereof
WO2007107824A1 (en) * 2006-03-17 2007-09-27 Nokia Corporation Improved mobile communication terminal and method therefore
WO2008085749A3 (en) * 2007-01-07 2008-11-06 Apple Inc Portable multifunction device with soft keyboards
EP1906299A3 (en) * 2006-09-26 2008-11-19 Samsung Electronics Co., Ltd. Portable device and method for displaying menu in portable device
US7694231B2 (en) 2006-01-05 2010-04-06 Apple Inc. Keyboards for portable electronic devices
WO2011112276A1 (en) 2010-03-09 2011-09-15 Alibaba Group Holding Limited Method and apparatus for displaying character selection during user input
JP2012123461A (ja) * 2010-12-06 2012-06-28 Fujitsu Ten Ltd 電子機器
EP2487561A1 (en) * 2011-02-11 2012-08-15 Sony Mobile Communications Japan, Inc. Information input apparatus
EP2547081A4 (en) * 2011-07-29 2013-12-18 Huawei Tech Co Ltd METHOD FOR DISPLAYING CONTACTS LIST AND TERMINAL
WO2014028443A1 (en) * 2012-08-14 2014-02-20 Motorola Mobility Llc Systems and methods for touch-based two-stage text input
US8806362B2 (en) 2010-01-06 2014-08-12 Apple Inc. Device, method, and graphical user interface for accessing alternate keys
US9086802B2 (en) 2008-01-09 2015-07-21 Apple Inc. Method, device, and graphical user interface providing word recommendations for text input
US9189079B2 (en) 2007-01-05 2015-11-17 Apple Inc. Method, system, and graphical user interface for providing word recommendations
US9239677B2 (en) 2004-05-06 2016-01-19 Apple Inc. Operation of a computer with touch screen interface
US9239673B2 (en) 1998-01-26 2016-01-19 Apple Inc. Gesturing with a multipoint sensing device
US9274685B2 (en) 2013-03-15 2016-03-01 Google Technology Holdings LLC Systems and methods for predictive text entry for small-screen devices with touch-based two-stage text input
US9292111B2 (en) 1998-01-26 2016-03-22 Apple Inc. Gesturing with a multipoint sensing device
US9348458B2 (en) 2004-07-30 2016-05-24 Apple Inc. Gestures for touch sensitive input devices
US10025501B2 (en) 2008-06-27 2018-07-17 Apple Inc. Touch screen device, method, and graphical user interface for inserting a character from an alternate keyboard
US10042418B2 (en) 2004-07-30 2018-08-07 Apple Inc. Proximity detector in handheld device

Families Citing this family (33)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2005081894A2 (en) * 2004-02-23 2005-09-09 Hillcrest Laboratories, Inc. Keyboardless text entry
KR100672362B1 (ko) * 2005-02-16 2007-01-24 엘지전자 주식회사 이동통신 단말기의 문자 입력 방법
GB0605386D0 (en) * 2006-03-17 2006-04-26 Malvern Scient Solutions Ltd Character input method
EP1959238B1 (en) * 2007-02-13 2018-05-23 Harman Becker Automotive Systems GmbH Method for inputting a destination in a navigation unit and nagivation system therefor
US8726194B2 (en) 2007-07-27 2014-05-13 Qualcomm Incorporated Item selection using enhanced control
JP5207699B2 (ja) * 2007-09-28 2013-06-12 京セラ株式会社 文字入力装置、文字入力方法および文字入力プログラム
US8839123B2 (en) * 2007-11-19 2014-09-16 Red Hat, Inc. Generating a visual user interface
JP2009169456A (ja) * 2008-01-10 2009-07-30 Nec Corp 電子機器、該電子機器に用いられる情報入力方法及び情報入力制御プログラム、並びに携帯端末装置
US8667413B2 (en) * 2008-02-14 2014-03-04 Creative Technology Ltd Apparatus and method for information input in an electronic device with display
JP2010033254A (ja) * 2008-07-28 2010-02-12 Speedscript Ltd アジア言語の高速入力システム
US20100110002A1 (en) * 2008-11-06 2010-05-06 Sony Ericsson Mobile Communications Ab Communication device with combined input and display device
US8605039B2 (en) * 2009-03-06 2013-12-10 Zimpl Ab Text input
US20100333014A1 (en) * 2009-06-24 2010-12-30 Research In Motion Limited Method and system for rendering data records
US8686955B2 (en) * 2010-03-11 2014-04-01 Apple Inc. Device, method, and graphical user interface for performing character entry
EP2367118A1 (en) * 2010-03-15 2011-09-21 GMC Software AG Method and devices for generating two-dimensional visual objects
US20110285651A1 (en) * 2010-05-24 2011-11-24 Will John Temple Multidirectional button, key, and keyboard
US8487877B2 (en) 2010-06-10 2013-07-16 Michael William Murphy Character specification system and method that uses a limited number of selection keys
KR20120033918A (ko) * 2010-09-30 2012-04-09 삼성전자주식회사 터치스크린을 구비한 휴대용 단말기의 입력 방법 및 장치
CN102455853B (zh) * 2010-11-01 2016-08-03 纬创资通股份有限公司 输入方法、输入装置及计算机系统
US20120124472A1 (en) * 2010-11-15 2012-05-17 Opera Software Asa System and method for providing interactive feedback for mouse gestures
US20120162086A1 (en) * 2010-12-27 2012-06-28 Samsung Electronics Co., Ltd. Character input method and apparatus of terminal
CN104160361A (zh) * 2012-02-06 2014-11-19 迈克尔·K·科尔比 字符串完成
US8701050B1 (en) * 2013-03-08 2014-04-15 Google Inc. Gesture completion path display for gesture-based keyboards
JP6112968B2 (ja) 2013-05-23 2017-04-12 インターナショナル・ビジネス・マシーンズ・コーポレーションInternational Business Machines Corporation コマンド生成方法、装置及びプログラム
JP2015130207A (ja) * 2015-04-15 2015-07-16 スピードスクリプト リミテッド アジア言語の高速入力システム
WO2016176357A1 (en) 2015-04-30 2016-11-03 Murphy Michael William Systems and methods for word identification that use button press type error analysis
KR101678094B1 (ko) * 2015-07-03 2016-11-23 현대자동차주식회사 차량, 및 그 제어방법
US10546053B2 (en) * 2016-05-18 2020-01-28 Citrix Systems, Inc. Semi-automated field placement for electronic forms
US10365823B2 (en) 2017-03-02 2019-07-30 International Business Machines Corporation Simplified text entry user interface for touch devices
US10671181B2 (en) * 2017-04-03 2020-06-02 Microsoft Technology Licensing, Llc Text entry interface
WO2018213805A1 (en) 2017-05-19 2018-11-22 Murphy Michael William An interleaved character selection interface
JP6482622B2 (ja) * 2017-09-14 2019-03-13 スピードスクリプト リミテッド アジア言語の高速入力システム
US11922007B2 (en) 2018-11-29 2024-03-05 Michael William Murphy Apparatus, method and system for inputting characters to an electronic device

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5940076A (en) * 1997-12-01 1999-08-17 Motorola, Inc. Graphical user interface for an electronic device and method therefor
GB2332293A (en) * 1997-12-11 1999-06-16 British Telecomm An Input Device
US6011542A (en) 1998-02-13 2000-01-04 Sony Corporation Graphical text entry wheel
EP1171813B1 (en) 1999-03-18 2003-06-04 602531 British Columbia Ltd. Data entry for personal computing devices
EP1081583A3 (en) * 1999-08-31 2005-07-06 Sony Corporation Menu display system
GB2364493B (en) * 2000-06-30 2004-11-10 Nokia Mobile Phones Ltd Improved data input
US6907581B2 (en) * 2001-04-03 2005-06-14 Ramot At Tel Aviv University Ltd. Method and system for implicitly resolving pointing ambiguities in human-computer interaction (HCI)

Cited By (37)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9239673B2 (en) 1998-01-26 2016-01-19 Apple Inc. Gesturing with a multipoint sensing device
US9292111B2 (en) 1998-01-26 2016-03-22 Apple Inc. Gesturing with a multipoint sensing device
US9606668B2 (en) 2002-02-07 2017-03-28 Apple Inc. Mode-based graphical user interfaces for touch sensitive input devices
US10338789B2 (en) 2004-05-06 2019-07-02 Apple Inc. Operation of a computer with touch screen interface
US9239677B2 (en) 2004-05-06 2016-01-19 Apple Inc. Operation of a computer with touch screen interface
US9348458B2 (en) 2004-07-30 2016-05-24 Apple Inc. Gestures for touch sensitive input devices
US10042418B2 (en) 2004-07-30 2018-08-07 Apple Inc. Proximity detector in handheld device
US11036282B2 (en) 2004-07-30 2021-06-15 Apple Inc. Proximity detector in handheld device
US7694231B2 (en) 2006-01-05 2010-04-06 Apple Inc. Keyboards for portable electronic devices
EP1826656A2 (en) * 2006-02-28 2007-08-29 Samsung Electronics Co., Ltd. Portable device and special character input method thereof
US10521022B2 (en) 2006-03-17 2019-12-31 Conversant Wireless Licensing S.a.r.l. Mobile communication terminal and method therefor
WO2007107824A1 (en) * 2006-03-17 2007-09-27 Nokia Corporation Improved mobile communication terminal and method therefore
EP1906299A3 (en) * 2006-09-26 2008-11-19 Samsung Electronics Co., Ltd. Portable device and method for displaying menu in portable device
US9189079B2 (en) 2007-01-05 2015-11-17 Apple Inc. Method, system, and graphical user interface for providing word recommendations
US10592100B2 (en) 2007-01-05 2020-03-17 Apple Inc. Method, system, and graphical user interface for providing word recommendations
US11112968B2 (en) 2007-01-05 2021-09-07 Apple Inc. Method, system, and graphical user interface for providing word recommendations
US11416141B2 (en) 2007-01-05 2022-08-16 Apple Inc. Method, system, and graphical user interface for providing word recommendations
US9244536B2 (en) 2007-01-05 2016-01-26 Apple Inc. Method, system, and graphical user interface for providing word recommendations
WO2008085749A3 (en) * 2007-01-07 2008-11-06 Apple Inc Portable multifunction device with soft keyboards
US11474695B2 (en) 2008-01-09 2022-10-18 Apple Inc. Method, device, and graphical user interface providing word recommendations for text input
US9086802B2 (en) 2008-01-09 2015-07-21 Apple Inc. Method, device, and graphical user interface providing word recommendations for text input
US11079933B2 (en) 2008-01-09 2021-08-03 Apple Inc. Method, device, and graphical user interface providing word recommendations for text input
US10430078B2 (en) 2008-06-27 2019-10-01 Apple Inc. Touch screen device, and graphical user interface for inserting a character from an alternate keyboard
US10025501B2 (en) 2008-06-27 2018-07-17 Apple Inc. Touch screen device, method, and graphical user interface for inserting a character from an alternate keyboard
US8806362B2 (en) 2010-01-06 2014-08-12 Apple Inc. Device, method, and graphical user interface for accessing alternate keys
EP2545433A4 (en) * 2010-03-09 2014-05-07 Alibaba Group Holding Ltd METHOD AND DEVICE FOR DISPLAYING A CHARACTER DRAWING DURING A USER INPUT
US9582082B2 (en) 2010-03-09 2017-02-28 Alibaba Group Holding Limited Method and apparatus for displaying character selection during user input
WO2011112276A1 (en) 2010-03-09 2011-09-15 Alibaba Group Holding Limited Method and apparatus for displaying character selection during user input
JP2012123461A (ja) * 2010-12-06 2012-06-28 Fujitsu Ten Ltd 電子機器
US9766780B2 (en) 2011-02-11 2017-09-19 Sony Corporation Information input apparatus
US10175858B2 (en) 2011-02-11 2019-01-08 Sony Corporation Information input apparatus
US8704789B2 (en) 2011-02-11 2014-04-22 Sony Corporation Information input apparatus
EP2487561A1 (en) * 2011-02-11 2012-08-15 Sony Mobile Communications Japan, Inc. Information input apparatus
EP2547081A4 (en) * 2011-07-29 2013-12-18 Huawei Tech Co Ltd METHOD FOR DISPLAYING CONTACTS LIST AND TERMINAL
WO2014028443A1 (en) * 2012-08-14 2014-02-20 Motorola Mobility Llc Systems and methods for touch-based two-stage text input
US9256366B2 (en) 2012-08-14 2016-02-09 Google Technology Holdings LLC Systems and methods for touch-based two-stage text input
US9274685B2 (en) 2013-03-15 2016-03-01 Google Technology Holdings LLC Systems and methods for predictive text entry for small-screen devices with touch-based two-stage text input

Also Published As

Publication number Publication date
US7424683B2 (en) 2008-09-09
EP1509832A2 (en) 2005-03-02
DE60328284D1 (de) 2009-08-20
AU2003225484A1 (en) 2003-12-02
WO2003098417A3 (en) 2004-11-11
KR100941948B1 (ko) 2010-02-11
US20060095844A1 (en) 2006-05-04
JP4429898B2 (ja) 2010-03-10
EP1509832B1 (en) 2009-07-08
ATE436048T1 (de) 2009-07-15
JP2005526321A (ja) 2005-09-02
KR20040111642A (ko) 2004-12-31
AU2003225484A8 (en) 2003-12-02
ES2328921T3 (es) 2009-11-19
CN1308803C (zh) 2007-04-04
CN1666168A (zh) 2005-09-07

Similar Documents

Publication Publication Date Title
US7424683B2 (en) Object entry into an electronic device
US7778818B2 (en) Directional input system with automatic correction
US10552037B2 (en) Software keyboard input method for realizing composite key on electronic device screen with precise and ambiguous input
US9063580B2 (en) Keyboardless text entry
USRE42268E1 (en) Method and apparatus for organizing addressing elements
US7443316B2 (en) Entering a character into an electronic device
US5844561A (en) Information search apparatus and information search control method
US8151209B2 (en) User input for an electronic device employing a touch-sensor
US20090249203A1 (en) User interface device, computer program, and its recording medium
US20140123049A1 (en) Keyboard with gesture-redundant keys removed
JP2005235188A (ja) データ入力装置
EP1252618A1 (en) Pointing method
JP5556398B2 (ja) 情報処理装置、情報処理方法およびプログラム
US20110001718A1 (en) Data input device
JP2013097655A (ja) 情報検索装置、情報検索方法、および情報検索プログラム
CA2719387C (en) System and method for facilitating character capitalization in handheld electronic device
JPH11338608A (ja) 情報処理方法及び装置並びに記憶媒体
KR100745978B1 (ko) 문자 입력 장치, 이를 구비한 휴대용 장치, 및 입력 방법
CA2609870A1 (en) Screen object placement optimized for blind selection

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A2

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NI NO NZ OM PH PL PT RO RU SC SD SE SG SK SL TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW

AL Designated countries for regional patents

Kind code of ref document: A2

Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LU MC NL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
WWE Wipo information: entry into national phase

Ref document number: 2003752872

Country of ref document: EP

ENP Entry into the national phase

Ref document number: 2006095844

Country of ref document: US

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 10514594

Country of ref document: US

WWE Wipo information: entry into national phase

Ref document number: 2004505864

Country of ref document: JP

Ref document number: 1020047018628

Country of ref document: KR

WWE Wipo information: entry into national phase

Ref document number: 20038114968

Country of ref document: CN

WWP Wipo information: published in national office

Ref document number: 1020047018628

Country of ref document: KR

WWP Wipo information: published in national office

Ref document number: 2003752872

Country of ref document: EP

WWP Wipo information: published in national office

Ref document number: 10514594

Country of ref document: US