US20190138208A1 - Method and system of multi-variable character input - Google Patents

Method and system of multi-variable character input

Info

Publication number
US20190138208A1
Authority
US
United States
Prior art keywords
button
button press
timer
elapsing
expiration
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US16/242,688
Inventor
Michael William Murphy
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual
Priority to US16/242,688
Publication of US20190138208A1
Legal status: Abandoned

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/02 Input arrangements using manually operated switches, e.g. using keyboards or dials
    • G06F 3/0202 Constructional details or processes of manufacture of the input device
    • G06F 3/0219 Special purpose keyboards
    • G06F 3/023 Arrangements for converting discrete items of information into a coded form, e.g. arrangements for interpreting keyboard generated codes as alphanumeric codes, operand codes or instruction codes
    • G06F 3/0233 Character input methods
    • G06F 3/0236 Character input methods using selection techniques to select from displayed items
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/04847 Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0489 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, using dedicated keyboard keys or combinations thereof
    • G06F 3/04895 Guidance during keyboard input operation, e.g. prompting
    • G06F 17/24
    • G06F 17/273
    • G06F 40/00 Handling natural language data
    • G06F 40/10 Text processing
    • G06F 40/166 Editing, e.g. inserting or deleting
    • G06F 40/20 Natural language analysis
    • G06F 40/232 Orthographic correction, e.g. spell checking or vowelisation
    • G06F 9/00 Arrangements for program control, e.g. control units
    • G06F 9/06 Arrangements for program control using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/44 Arrangements for executing specific programs
    • G06F 9/451 Execution arrangements for user interfaces

Definitions

  • This description generally relates to the field of electronic devices and, more particularly, to user interfaces of electronic devices.
  • a computer processor-implemented method may be summarized as including: receiving, by at least one computer processor, input resulting from actuation of buttons; and selecting characters, by at least one computer processor, based on the received input resulting from the actuation of buttons, wherein the selecting the characters includes, for each character to be selected: determining values, by at least one computer processor, for at least three variables from the received input; and identifying the character, by at least one computer processor, from among a plurality of characters based on the values of the at least three variables.
  • a first variable of the at least three variables may be an assigned value of each actuated button, a second variable of the at least three variables may be whether a second button actuation occurs within a finite-length time period measured from an onset of a first button actuation, and a third variable of the at least three variables may be a duration of the first button actuation measured from the onset of the first button actuation.
  • a value for the first variable may be one of −3, −2, +2 and +3.
  • a value for the second variable may be yes or no.
  • a value for the third variable may be less than the finite-length time period measured from the onset of the first button actuation or greater than the finite-length time period measured from the onset of the first button actuation.
  • the computer processor-implemented method may further include determining values, by at least one computer processor, for at least five variables from the received input.
  • a fourth variable of the at least five variables may identify for any given button actuation which array of two or more button arrays that the actuated button is a member of.
  • a value for the fourth variable may be one of row 1, row 2, row 3, row 4 and row 5.
  • a fifth variable of the at least five variables may identify whether the received input from any given button actuation is from a button that is time-dependent or time-independent.
  • a time-independent button may be a button for which the determined values of the second and third variables are always the same.
  • the determined value for the second variable may be no and the determined value for the third variable may be less than the finite-length time period measured from the onset of the first button actuation.
  • Values of the at least five variables may identify the character at least by identifying which of a plurality of one-dimensional arrays the character is a member of and by a position of the character in the identified one-dimensional array.
  • Values of the at least three variables may identify the character at least by identifying a position of the character in a one-dimensional array.
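The three-variable selection summarized above can be sketched in code. The following Python fragment is a minimal illustration, not the patent's implementation: the character assignments, the button press values, and the length of the finite time period (ETP) are all invented for the example.

```python
# Hypothetical sketch of selecting a character from the values of the
# three variables: (1) the assigned value of the actuated button,
# (2) whether a second actuation occurs within the finite-length time
# period (a "co-press"), and (3) whether the press duration exceeds
# that period. All concrete values below are assumptions.

ETP = 0.25  # finite-length time period in seconds (assumed value)

# One-dimensional character arrays, selected by (co-press?, long press?)
# and indexed by the assigned button press value (positions in the array).
CHAR_ARRAYS = {
    (False, False): {-3: "a", -2: "b", +2: "c", +3: "d"},  # short press
    (False, True):  {-3: "e", -2: "f", +2: "g", +3: "h"},  # long press
    (True,  False): {-3: "i", -2: "j", +2: "k", +3: "l"},  # paired press
}

def identify_character(button_value: int, co_press: bool, duration: float) -> str:
    """Map the values of the three variables onto a single character."""
    if co_press:
        # A paired press is classified by the co-press alone in this sketch.
        key = (True, False)
    else:
        key = (False, duration > ETP)
    return CHAR_ARRAYS[key][button_value]

print(identify_character(-3, False, 0.1))  # short press of button -3 -> a
print(identify_character(+2, False, 0.4))  # long press of button +2 -> g
```

The values of the second and third variables select which one-dimensional array the character belongs to, and the first variable gives its position within that array, mirroring the claim language above.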
  • a system may be summarized as including: at least one computer processor; and at least one memory coupled to the at least one computer processor, the at least one memory having computer executable instructions stored thereon that, when executed, cause the at least one processor to perform: receiving input resulting from actuation of buttons; and selecting characters based on the received input resulting from the actuation of buttons, wherein the selecting the characters includes, for each character to be selected: determining values for at least three variables from the received input; and identifying the character from among a plurality of characters based on the values of the at least three variables.
  • a first variable of the at least three variables may be an assigned value of each actuated button, a second variable of the at least three variables may be whether a second button actuation occurs within a finite-length time period measured from an onset of a first button actuation, and a third variable of the at least three variables may be a duration of the first button actuation measured from the onset of the first button actuation.
  • a value for the first variable may be one of −3, −2, +2 and +3.
  • a value for the second variable may be yes or no.
  • a non-transitory computer-readable medium may be summarized as having computer executable instructions stored thereon that, when executed, cause at least one processor to perform: receiving input resulting from actuation of buttons; and selecting characters based on the received input resulting from the actuation of buttons, wherein the selecting the characters includes, for each character to be selected: determining values for at least three variables from the received input; and identifying the character from among a plurality of characters based on the values of the at least three variables.
  • a first variable of the at least three variables may be an assigned value of each actuated button, a second variable of the at least three variables may be whether a second button actuation occurs within a finite-length time period measured from an onset of a first button actuation, and a third variable of the at least three variables may be a duration of the first button actuation measured from the onset of the first button actuation.
  • a value for the first variable may be one of −3, −2, +2 and +3.
  • FIG. 1 is a schematic view of a system for input of characters with optional time-dependent button presses according to one illustrated embodiment.
  • FIG. 2 is a list of variables for a system to input characters with optional time-dependent button presses according to one illustrated embodiment.
  • FIG. 3 is a schematic view of an example electronic device for input of characters with optional time-dependent button presses according to one illustrated embodiment, the electronic device being a mobile device having a housing, a display, a graphics engine, a central processing unit (CPU), user input device(s), one or more storage mediums having various software modules thereon that are executable by the CPU, input/output (I/O) port(s), network interface(s), wireless receiver(s) and transmitter(s), a power source, an elapsed time counter and a button press value counter.
  • FIG. 4 is a schematic drawing of one embodiment of the electronic device 100 for input of characters. Aspects of the user interface 150 were previously disclosed in FIG. 8 of U.S. Pat. No. 8,487,877, which is hereby incorporated by reference in its entirety.
  • FIG. 5 is a flow diagram of a method for specifying a character from among a plurality of characters according to one illustrated embodiment.
  • FIGS. 6A and 6B are flow diagrams of a method for an electronic device to interpret button presses according to one illustrated embodiment.
  • FIG. 7 is a table of value assignments for one embodiment of a method of character identification.
  • FIG. 8 is an example of an application of a method of character identification.
  • FIG. 9 is a flow diagram of another method for an electronic device to interpret button presses according to one illustrated embodiment.
  • FIG. 10 is a flow diagram of variables of a method to interpret button presses according to one illustrated embodiment.
  • FIG. 11 is a table of value assignments, a user interface and a list of variables for one embodiment of a method of character identification.
  • FIG. 12 is a table of characteristics of two different methods to input characters.
  • FIG. 13 is an embodiment of two different user interfaces for methods to input characters.
  • FIG. 14 is a graphical representation of button press types of a method to input characters according to one illustrated embodiment.
  • FIGS. 15-17 are examples of an application of a method of character input according to one illustrated embodiment.
  • FIGS. 18-21 are additional examples of an application of a method of character input according to one illustrated embodiment.
  • FIG. 22 is a schematic drawing of another embodiment of the electronic device 100 for character input.
  • FIG. 23 is a table of value assignments for another embodiment of a method for character input.
  • FIG. 24 is a schematic drawing of yet another embodiment of the electronic device 100 for character input.
  • FIG. 25 is a table of value assignments for yet another embodiment of a method for character input.
  • FIG. 26 is a schematic drawing of yet another embodiment of the electronic device 100 for character input.
  • FIG. 27 is a table of value assignments for yet another embodiment of a method for character input.
  • FIG. 28 is a drawing of a user interface, a table of values, and a graphical representation of button press types for one embodiment of an electronic device.
  • FIG. 29 is a table of value assignments for another embodiment of a method for character input.
  • FIGS. 30-31 are additional examples of an application of a method of character input according to one illustrated embodiment.
  • FIGS. 32-34 are yet additional examples of an application of a method of character input according to one illustrated embodiment.
  • FIG. 35 is two tables of characters of one embodiment of a method of character input according to one illustrated embodiment.
  • FIG. 36 is a flow diagram of variables of a method to interpret button presses according to one illustrated embodiment.
  • FIG. 1 is a schematic view of a system 101 for input of characters with optional time-dependent button presses.
  • the system 101 includes a device 100 and a user 160 .
  • the device 100 includes at least one processor 108 that operates a word identification algorithm 109 .
  • At least one face of the device 100 has a user interface 150 .
  • the user interface 150 includes at least a display 104 and user input devices 110 .
  • the display 104 includes a menu 240 and a text output area 105 .
  • the user input devices 110 are selection buttons, but in alternative embodiments may be touchscreen buttons or other types of input devices as disclosed in FIG. 3 .
  • the user 160 includes eyes 162 and fingers 164 , although in alternative embodiments of the system 101 designed for accessibility, the user is not required to have use of either.
  • the system 101 enables character input by passing values between its various components.
  • the values change with time, so the values are variables.
  • the system 101 passes a variable ‘available characters’ 200 from the menu 240 of the user interface 150 to the eyes 162 of the user.
  • the system 101 passes six input variables 186 from the user's fingers 164 to the selection buttons 110 of the user interface.
  • the input variables 186 are: ‘number of button presses’ 202 , ‘value of button press’ 222 , ‘row of button press’ 282 , ‘fixed-value press’ 355 , ‘co-press’ 210 and ‘duration of press’ 208 .
  • Values for each input variable 186 are the result of decisions made by the user. The values are contained within the time-dependent button presses received by the selection buttons 110 .
  • the decisions that produce values for the input variables 186 are both conscious and unconscious. For example, selections for the variable ‘value of button press’ 222 are typically conscious because the variable explicitly identifies which buttons a user decides to press. However, while making that decision the user also passively decides values for the other variables ‘number of button presses’ 202 , ‘row of button press’ 282 , ‘fixed-value press’ 355 , ‘co-press’ 210 and ‘duration of press’ 208 . The device 100 acquires values for these additional variables 202 , 282 , 355 , 210 , 208 through the same button presses that identify the variable ‘value of button press’ 222 . These passively acquired variables are useful for interpreting the user's intended input but come at little or no additional effort on the part of the user 160 .
  • the system 101 synthesizes the input variables 186 received by the selection buttons 110 into three intermediate variables 187 : ‘sequence of button row values’ 385 , ‘sequence of button press values’ 380 and ‘sequence of button press types’ 382 .
  • each intermediate variable 187 is an incomplete piece of information, but together the three variables 187 identify a presumed word and a plurality of possible alternatives.
  • the word identification algorithm 109 of the processor 108 interprets the intermediate variables 187 to identify the user's intended input.
  • the processor 108 passes a variable ‘identified word’ 130 to the text output area 105 of the display.
  • the text output area 105 makes the variable ‘identified word’ 130 available to the user's eyes 162 .
  • FIG. 2 lists the variables for the system 101 of FIG. 1 and example values for each variable according to one embodiment of the system.
  • Example values for the variable ‘available characters’ 200 are ‘a, b, c, d, e’ and so on.
  • Example values for the variable ‘number of button presses’ 202 are ‘1’, ‘2’, ‘3’, ‘4’, ‘5’ and so on.
  • Example values for the variable ‘button positions pressed’ 222 are ‘−3, −2, −1, 0, +1, +2, +3’ and so on.
  • Example values for the variable ‘row of button press’ 282 are ‘A, B, C’ and so on.
  • Example values for the variable ‘fixed-value’ 355 are ‘yes’ and ‘no’.
  • Example values for the variable ‘co-press’ 210 are ‘yes’ and ‘no’.
  • Example values for the variable ‘duration’ 208 are ‘<ETP’ and ‘>ETP’, where ETP stands for elapsed time period and refers to the duration of a user-chosen, selectable-length timer.
  • An example value for the variable ‘sequence of button row values’ 385 is ‘B-A-A-A-A-A-B-A-A’.
  • An example value for the variable ‘sequence of button press values’ 380 is ‘−3 −2 0 +2 +3 +2 0 −2 −3 −2 −2’.
  • An example value for the variable ‘sequence of button press types’ 382 is ‘long-short-fixed-pair-short-fixed-short-long-long-short’.
  • Example values for the variable ‘identified word’ 130 are ‘sun’, ‘dog’, ‘sidewalk’, ‘run’ and so on.
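The synthesis of the six input variables of FIG. 2 into the three intermediate sequence variables can be sketched as below. The field names, the ETP value, and the press-type classification rules are assumptions for illustration; the patent describes the variables but not this particular encoding.

```python
# Sketch (assumed names and values) of synthesizing the input variables
# into 'sequence of button row values', 'sequence of button press values'
# and 'sequence of button press types'.

from dataclasses import dataclass

ETP = 0.25  # elapsed time period in seconds (assumed)

@dataclass
class ButtonPress:
    row: str          # 'row of button press', e.g. 'A' or 'B'
    value: int        # 'value of button press', e.g. -3 .. +3
    fixed: bool       # 'fixed-value press' (time-independent button)
    co_press: bool    # 'co-press': second press within ETP of onset
    duration: float   # 'duration of press' in seconds

def press_type(p: ButtonPress) -> str:
    """Classify one press as fixed, pair, long or short."""
    if p.fixed:
        return "fixed"
    if p.co_press:
        return "pair"
    return "long" if p.duration > ETP else "short"

def synthesize(presses: list[ButtonPress]) -> tuple[str, str, str]:
    rows = "-".join(p.row for p in presses)
    values = " ".join(f"{p.value:+d}" if p.value else "0" for p in presses)
    types = "-".join(press_type(p) for p in presses)
    return rows, values, types

presses = [
    ButtonPress("B", -3, False, False, 0.4),  # long press
    ButtonPress("A", -2, False, False, 0.1),  # short press
    ButtonPress("A",  0, True,  False, 0.1),  # fixed-value press
]
print(synthesize(presses))  # ('B-A-A', '-3 -2 0', 'long-short-fixed')
```

Note that each output string matches the format of the example values listed above (row letters joined by hyphens, signed press values, and hyphenated press types).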
  • FIG. 3 is a schematic view of one example electronic device, in this case mobile device 100 , for input of characters with optional time-dependent button presses according to one illustrated embodiment.
  • the mobile device 100 shown in FIG. 3 may have a housing 102 , a display 104 , a graphics engine 106 , a central processing unit (CPU) 108 , one or more user input devices 110 , one or more storage mediums 112 having various software modules 114 stored thereon comprising instructions that are executable by the CPU 108 , input/output (I/O) port(s) 116 , one or more wireless receivers and transmitters 118 , one or more network interfaces 120 , and a power source 122 .
  • some or all of the same, similar or equivalent structure and functionality of the mobile device 100 shown in FIG. 1 and described herein may be that of, part of or operably connected to a communication and/or computing system of another device or machine.
  • the mobile device 100 may be any of a large variety of communications devices such as a cellular telephone, a smartphone, a wearable device, a wristwatch, a portable media player (PMP), a personal digital assistant (PDA), a mobile communications device, a portable computer with built-in or add-on cellular communications, a portable game console, a global positioning system (GPS), a handheld industrial electronic device, or the like, or any combination thereof.
  • the mobile device 100 has at least one central processing unit (CPU) 108 which may be a scalar processor, a digital signal processor (DSP), a reduced instruction set (RISC) processor, or any other suitable processor.
  • the central processing unit (CPU) 108 , display 104 , graphics engine 106 , one or more user input devices 110 , one or more storage mediums 112 , input/output (I/O) port(s) 116 , one or more wireless receivers and transmitters 118 , and one or more network interfaces 120 may all be communicatively connected to each other via a system bus 124 .
  • the system bus 124 can employ any suitable bus structures or architectures, including a memory bus with memory controller, a peripheral bus, and/or a local bus.
  • the mobile device 100 also includes one or more volatile and/or non-volatile storage medium(s) 112 .
  • the storage mediums 112 may be comprised of any single or suitable combination of various types of processor-readable storage media and may store instructions and data acted on by CPU 108 . For example, a particular collection of software instructions comprising software 114 and/or firmware instructions comprising firmware are executed by CPU 108 .
  • the software or firmware instructions generally control many of the operations of the mobile device 100 and a subset of the software and/or firmware instructions may perform functions to operatively configure hardware and other software in the mobile device 100 to provide the initiation, control and maintenance of applicable computer network and telecommunication links from the mobile device 100 to other devices using the wireless receiver(s) and transmitter(s) 118 , network interface(s) 120 , and/or I/O ports 116 .
  • the CPU 108 includes an elapsed time counter 140 .
  • the elapsed time counter 140 may be implemented using a timer circuit operably connected to or as part of the CPU 108 . Alternately some or all of the elapsed time counter 140 may be implemented in computer software as computer executable instructions stored on volatile and/or non-volatile storage medium(s) 112 , for example, that when executed by CPU 108 or a processor of a timer circuit, performs the functions described herein of the elapsed time counter 140 .
  • the CPU 108 includes a button press value counter 142 .
  • some or all of the button press value counter 142 may be implemented in computer software as computer executable instructions stored on volatile and/or non-volatile storage medium(s) 112 , for example, that when executed by CPU 108 , performs the functions described herein of the button press value counter 142 .
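A minimal software sketch of the elapsed time counter 140 and the button press value counter 142 follows. The class and method names are invented for illustration; as the description notes, either counter may equally be implemented in a timer circuit or other hardware.

```python
# Sketch of the two counters as software objects, assuming a monotonic
# clock. Names and interfaces are assumptions, not from the patent.

import time

class ElapsedTimeCounter:
    """Measures time elapsed since the onset of a button actuation."""
    def __init__(self):
        self._start = None

    def start(self):
        self._start = time.monotonic()

    def elapsed(self) -> float:
        return 0.0 if self._start is None else time.monotonic() - self._start

class ButtonPressValueCounter:
    """Accumulates the assigned values of presses within one selection."""
    def __init__(self):
        self.total = 0

    def add(self, button_value: int):
        self.total += button_value

    def reset(self) -> int:
        """Return the accumulated value and clear the counter."""
        value, self.total = self.total, 0
        return value
```

In use, the elapsed time counter would be started at each press onset and compared against the elapsed time period (ETP) to classify the press, while the value counter would sum the press values that jointly specify a character.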
  • the storage medium(s) 112 may be processor-readable storage media which may comprise any combination of computer storage media including volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer readable instructions, data structures, program modules or other data. Combinations of any of the above should also be included within the scope of processor-readable storage media.
  • the storage medium(s) 112 may include system memory which includes computer storage media in the form of volatile and/or nonvolatile memory such as read-only memory (ROM) and random access memory (RAM).
  • RAM typically contains data and/or program modules that are immediately accessible to and/or presently being operated on by CPU 108 .
  • FIG. 1 illustrates software modules 114 including an operating system, application programs and other program modules that implement the processes and methods described herein.
  • the mobile device 100 may also include other removable/non-removable, volatile/nonvolatile computer storage media drives.
  • the storage medium(s) 112 may include a hard disk drive or solid state storage drive that reads from or writes to non-removable, nonvolatile media, an SSD that reads from or writes to a removable, nonvolatile SSD, and/or an optical disk drive that reads from or writes to a removable, nonvolatile optical disk such as a DVD-RW or other optical media.
  • a user may enter commands and information into the mobile device 100 through touch screen display 104 or the one or more other input device(s) 110 such as a keypad, keyboard, tactile buttons, camera, motion sensor, position sensor, light sensor, biometric data sensor, accelerometer, or a pointing device, commonly referred to as a mouse, trackball or touch pad.
  • Other input devices of the mobile device 100 may include a microphone, joystick, thumbstick, game pad, optical scanner, other sensors, or the like.
  • These and other input devices are often connected to the CPU 108 through a user input interface that is coupled to the system bus 124 , but may be connected by other interface and bus structures, such as a parallel port, serial port, wireless port, game port or a universal serial bus (USB).
  • a unique software driver stored in software 114 configures each input mechanism to sense user input, and then the software driver provides data points that are acted on by CPU 108 under the direction of other software 114 .
  • the display is also connected to the system bus 124 via an interface, such as the graphics engine 106 .
  • the mobile device 100 may also include other peripheral output devices such as speakers, a printer, a projector, an external monitor, etc., which may be connected through one or more analog or digital I/O ports 116 , network interface(s) 120 or wireless receiver(s) and transmitter(s) 118 .
  • the mobile device 100 may operate in a networked environment using connections to one or more remote computers or devices, such as a remote computer or device.
  • When used in a LAN or WAN networking environment, the mobile device 100 may be connected via the wireless receiver(s) and transmitter(s) 118 and network interface(s) 120 , which may include, for example, cellular receiver(s) and transmitter(s), Wi-Fi receiver(s) and transmitter(s), and associated network interface(s).
  • When used in a WAN networking environment, the mobile device 100 may include a modem or other means as part of the network interface(s) for establishing communications over the WAN, such as the Internet.
  • the wireless receiver(s) and transmitter(s) 118 and the network interface(s) 120 may be communicatively connected to the system bus 124 .
  • program modules depicted relative to the mobile device 100 may be stored in a remote memory storage device of a remote system.
  • the mobile device 100 has a collection of I/O ports 116 and/or short range wireless receiver(s) and transmitter(s) 118 and network interface(s) 120 for passing data over short distances to and from the mobile device 100 or for coupling additional storage to the mobile device 100 .
  • Serial ports, USB ports, Wi-Fi ports, Bluetooth® ports, IEEE 1394 (i.e., FireWire) ports, and the like can communicatively couple the mobile device 100 to other computing apparatuses.
  • Compact Flash (CF) ports, Secure Digital (SD) ports, and the like can couple a memory device to the mobile device 100 for reading and writing by the CPU 108 or couple the mobile device 100 to other communications interfaces such as Wi-Fi or Bluetooth transmitters/receivers and/or network interfaces.
  • Mobile device 100 also has a power source 122 (e.g., a battery).
  • the power source 122 may supply energy for all the components of the mobile device 100 that require power when a traditional, wired or wireless power source is unavailable or otherwise not connected.
  • Other various suitable system architectures and designs of the mobile device 100 are contemplated and may be utilized which provide the same, similar or equivalent functionality as those described herein.
  • the various techniques, components and modules described herein may be implemented in connection with hardware, software and/or firmware or, where appropriate, with a combination of such.
  • the methods and apparatus of the disclosure, or certain aspects or portions thereof may take the form of program code (i.e., instructions) embodied in tangible media, such as various solid state memory devices, DVD-RW, RAM, hard drives, flash drives, or any other machine-readable or processor-readable storage medium wherein, when the program code is loaded into and executed by a machine, such as a processor of a computer, vehicle or mobile device, the machine becomes an apparatus for practicing various embodiments.
  • In the case of program code execution on programmable computers, vehicles or mobile devices, such generally includes a processor, a storage medium readable by the processor (including volatile and non-volatile memory and/or storage elements), at least one input device, and at least one output device.
  • One or more programs may implement or utilize the processes described in connection with the disclosure, e.g., through the use of an API, reusable controls, or the like.
  • Such programs are preferably implemented in a high-level procedural or object-oriented programming language to communicate with a computer system of mobile device 100 .
  • the program(s) can be implemented in assembly or machine language, if desired. In any case, the language may be a compiled or interpreted language, and combined with hardware implementations.
  • FIG. 4 shows a schematic drawing of one embodiment of the electronic device 100 for input of characters.
  • the device 100 may have some or all the components and functionality described herein with respect to the mobile device 100 of FIG. 1 .
  • the device 100 has aspects previously disclosed in FIG. 8 of U.S. Pat. No. 8,487,877, which is hereby incorporated by reference in its entirety.
  • the electronic device 100 includes the display 104 , a plurality of characters 200 that populate positions 242 in multiple menu rows (or menu arrays) 244 of a character menu 240 , a plurality of selection buttons 110 divided among multiple button rows (or button arrays) 280 that includes both time-dependent selection buttons 127 and at least one time-independent selection button 129 , and a spacebar button 264 , which together make up a user interface 150 of the device 100 .
  • Each of the selection buttons 110 has an assigned row identification (ID) value 282 and an assigned button press value 222 . Included as part of or within proximity to the menu 240 is a reference 258 , additional row ID values 282 , and an offset scale 260 .
  • the display 104 , the plurality of selection buttons 110 , and the spacebar button 264 are communicatively coupled with the CPU 108 , as described in the embodiment of FIG. 1 .
  • the CPU 108 includes the elapsed time counter 140 and the button press value counter 142 , as described in the embodiment of FIG. 1 .
  • the CPU 108 is communicatively coupled with the storage medium 112 and the power source 122 , as described in the embodiment of FIG. 1 .
  • the positions 242 of the menu 240 are arranged in two one-dimensional arrays (or rows) 244 , each array similar to the embodiment in FIG. 8 of U.S. Pat. No. 8,487,877, except that the menu 240 is shown on the display 104 instead of as a physical feature of the user interface 150 .
  • Each menu row 244 is identified by the row ID value 282 that it shares with the row of selection buttons 280 that operate on that menu row.
  • the plurality of selection buttons 110 can be either hard keys (physical buttons) or soft keys (buttons shown on the display 104 ). In the embodiment of FIG. 4 , the selection buttons 110 are shown as physical buttons. In either case, the buttons 110 are communicatively coupled with the CPU 108 .
  • the selection buttons 110 can be arranged in any pattern.
  • the time-dependent buttons 127 and time-independent buttons 129 are interspersed amongst one another on the user interface 150 .
  • the buttons 127 , 129 are interspersed amongst one another in an arrangement on the display 104 .
  • the menu rows 244 and the offset scale 260 are positioned as respective one-dimensional arrays on the user interface 150 of the device 100 .
  • the menu rows 244 and the offset scale 260 are positioned on the user interface 150 so that they lie adjacent to and parallel with one another.
  • the menu rows 244 and the offset scale 260 are programmed in software so that they appear as features on the display 104 of the device 100 .
  • positions 242 of each respective menu row 244 are distributed in a one-dimensional array in evenly spaced increments.
  • values of the offset scale 260 are distributed in a one-dimensional array in spatial increments that equal the increment of the menu rows 244 , so that by referencing the offset scale 260 to the menu rows 244 , characters 200 in the menu rows are effectively numbered.
  • the reference 258 is an indicator located near or on one of the positions 242 of the menu 240 .
  • the offset scale 260 includes a value of zero that is located to correspond with the reference 258 of the menu 240 . Values of the offset scale 260 increase from zero in pre-selected increments as positions of the offset scale get farther from the zero value. In a further embodiment, values of the offset scale 260 decrease from zero in pre-selected increments as positions of the offset scale get farther from the zero value in a direction opposite to the increasing direction. In one embodiment, the pre-selected increment of the offset scale 260 equals one and the values of the offset scale extend from a negative value to a positive value passing through zero.
  • the positions 242 of the menu 240 and the values of the offset scale 260 are distributed in respective one-dimensional arrays positioned adjacent to and parallel with one another, the values of the offset scale 260 count in increments of one and are spaced with respect to one another in their array to correspond with the spacing of positions 242 of the menu 240 , and the zero value of the offset scale 260 corresponds to the reference 258 of the menu 240 so that the values of the offset scale 260 label the positions of each row 244 of the menu 240 according to how many positions a given position 242 of a row 244 is offset from the reference 258 .
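The offset-scale labeling above can be sketched for one menu row. The 13-position row holding a–m is an assumption about the FIG. 4 layout used only for illustration; the point is that the scale's zero sits at the reference and each position is labeled by its signed distance from it.

```python
# Label each position of a menu row by its offset from the reference,
# which sits at the center position of the row.
row_a = "abcdefghijklm"                     # assumed 13-character row
reference_index = len(row_a) // 2           # the reference position ('g')
offset_to_char = {i - reference_index: ch for i, ch in enumerate(row_a)}

print(offset_to_char[-6])   # 'a', six positions left of the reference
print(offset_to_char[0])    # 'g', the reference position itself
print(offset_to_char[+5])   # 'l', five positions right of the reference
```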
  • the plurality of selection buttons 110 lie on the user interface 150 of the device 100 and, as described above, can be either hard or soft keys.
  • the buttons 110 are arranged in button rows 280 so that the number of button rows 280 and the number of menu rows 244 is the same.
  • the button rows 280 are arranged to visually correspond with the arrangement of the menu rows 244 .
  • Each button is communicatively coupled with the CPU 108 .
  • Each button 110 has the function that when pressed, the row ID value 282 and button press value 222 assigned to the button is input to the CPU 108 .
  • the buttons 110 arranged within the same row 280 have the same row ID value 282 and buttons in different rows have different row ID values.
  • the assigned button press values 222 can be either positive or negative.
  • the button press values 222 assigned to the selection buttons 110 of a button row 280 are all unique.
  • there are five selection buttons 110 per button row 280 , some buttons are time-dependent 127 while others are time-independent 129 , and the buttons' assigned values are −3, −2, 0, +2, and +3.
  • there are seven selection buttons 110 per button row 280 , some buttons are time-dependent 127 while others are time-independent 129 , and the buttons' assigned values are −4, −3, −2, 0, +2, +3 and +4.
  • there are six selection buttons 110 per button row 280 , some buttons are time-dependent 127 while others are time-independent 129 , and the buttons' assigned values are −4, −3, −2, 0, +2, and +3.
  • there are six selection buttons 110 per button row 280 , some buttons are time-dependent 127 while others are time-independent 129 , and the buttons' assigned values are −3, −2, −1, +1, +2 and +3.
  • the spacebar 264 also lies in the user interface 150 of the device 100 , can be either a hard or soft key, and is communicatively coupled with the CPU 108 .
  • FIGS. 5 and 6 show flowcharts for, respectively, an embodiment of a method 509 for specifying a character from among a plurality of characters and an embodiment of a method 609 for an electronic device to interpret button presses—both in accordance with the user interface 150 of FIG. 4 .
  • a user views the plurality of characters 200 displayed in the menu 240 .
  • the user selects a character from the menu 240 for input to the electronic device 100 .
  • the user identifies the selected character by (1) which row 244 of the menu 240 the character is in and (2) the position 242 of the character in its row with respect to the reference 258 .
  • a user can identify a selected character as in either a top or bottom row 244 and by a value equal to the number of positions 242 the selected character is offset from the menu's reference 258 .
  • the user can identify the position 242 of the selected character in its row 244 in a number of ways, including by referencing the position to a corresponding value in the offset scale 260 , counting the number of positions 242 that the selected character is offset from the reference 258 , recalling from memory the value that identifies the particular selected character, and recalling by muscle memory the selection button keystrokes that correspond with the selected character or the selected character's position.
  • in a step 544 , the user determines whether the value that identifies the selected character's position 242 in its menu row 244 equals the assigned value 222 of any of the selection buttons 110 .
  • in a step 538 , the user presses the selection button with the assigned value that equals the selected character's position and releases the button before the elapsed time counter expires, if necessary.
  • the aforementioned step 518 inputs the assigned value 222 and the row ID value 282 of the pressed selection button to the CPU 108 , triggers the CPU 108 to start the elapsed time counter 140 , and indicates to the CPU that the type of button press is a SHORT press, or a FIXED press if the button is a time-independent button 129 .
  • the user waits for the elapsed time counter 140 to expire, if necessary.
  • Expiration of the elapsed time period is not required if the selection button pressed in a subsequent cycle of the method is in a different row 280 than the selection button 110 pressed in the current cycle or if the button is a time-independent button 129 .
  • the user views the specified character on the display 104 .
  • step 522 is bypassed.
  • the user determines whether the value that identifies the selected character's position 242 in its menu row 244 equals twice the assigned button press value 222 of any selection button 110 .
  • in a step 540 , the user presses the selection button 110 with the assigned value 222 that equals half the selected character's position and maintains the button press until the elapsed time counter expires.
  • the aforementioned step 540 inputs the assigned value 222 and the row ID value 282 of the pressed selection button to the CPU 108 , triggers the CPU 108 to start the elapsed time counter 140 , and indicates to the processor that the type of button press is a LONG press.
  • in a step 522 , the user views the specified character on the display 104 . In an alternative embodiment, step 522 is bypassed.
  • in a step 524 , the user presses the selection button with the assigned value 222 that is one of two values whose sum equals the selected character's position.
  • the aforementioned step 524 inputs the assigned value 222 and the row ID value 282 of the pressed selection button 110 to the CPU 108 and triggers the CPU to start the elapsed time counter 140 .
  • the user presses the selection button 110 with the assigned value 222 that is the other of two values whose sum equals the selected character's position 242 and does so before the elapsed time counter 140 expires.
  • the aforementioned step 526 inputs the assigned value 222 of the pressed selection button 110 to the CPU 108 and indicates to the processor that the type of button press is PAIR.
  • the CPU 108 may also terminate the elapsed time counter 140 .
  • the character specification method 509 described above is used iteratively to specify series of characters from the character menu 240 .
  • words and sentences are formed on the display 104 by iteratively specifying characters according to the method above, with the spacebar 264 used to input spaces between words on the display.
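The user's decision procedure above (steps 544, 538, 540, 524, 526) can be sketched as follows. This is an illustrative reconstruction for the five-button embodiment with assigned values −3, −2, 0, +2, and +3; the function name `plan_presses` is not from the specification.

```python
# Given a selected character's offset from the reference, decide whether
# a single SHORT press, a single LONG press (double the button value), or
# a PAIR of presses (two values summed) specifies it.
from itertools import combinations

BUTTON_VALUES = (-3, -2, 0, +2, +3)

def plan_presses(position):
    if position in BUTTON_VALUES:                    # step 538: SHORT press
        return ("short", position)
    for v in BUTTON_VALUES:                          # step 540: LONG press
        if v != 0 and 2 * v == position:
            return ("long", v)
    for v1, v2 in combinations(BUTTON_VALUES, 2):    # steps 524/526: PAIR
        if v1 + v2 == position:
            return ("pair", v1, v2)
    return None

print(plan_presses(+2))   # ('short', 2)
print(plan_presses(-6))   # ('long', -3)
print(plan_presses(+5))   # ('pair', 2, 3)
```

Note that every offset from −6 through +6 is reachable by one of the three press types, which is how five buttons cover a 13-position menu row.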
  • FIGS. 6A and 6B show flowcharts of an embodiment of a method 609 for the processor 108 of an electronic device to interpret sequences of button presses.
  • the CPU 108 initializes elements of an array variable ‘sequence of row ID values’ 385 to zero.
  • the CPU 108 initializes elements of an array variable ‘sequence of button press values’ 380 to zero.
  • the CPU 108 initializes elements of an array variable ‘sequence of button press types’ 382 to zero.
  • the CPU 108 initializes a variable ‘number of loops m’ 390 to zero.
  • the CPU 108 initializes a variable ‘number of button presses n’ 392 to zero.
  • the CPU 108 initializes the elapsed time counter 140 to zero.
  • the CPU 108 monitors the selection buttons 110 for a pressed selection button 110 . Once a first selection button press occurs, in another step 656 , the CPU 108 determines if the first pressed selection button 110 is a press of the spacebar 264 . If not, in a next step 658 , the CPU 108 assigns to the n th element of the variable BPV sequence 380 the assigned value 222 of the first pressed selection button 110 .
  • the CPU determines which button row 280 the pressed selection button 110 is a member of.
  • the CPU 108 assigns to the m th element of the variable sequence of row ID values 385 the row ID value 282 of the pressed selection button 110 .
  • the CPU determines if the pressed selection button 110 is a time-dependent button 127 or time-independent button 129 . If the selection button is a time-independent button 129 , then in a subsequent step 684 the CPU 108 assigns to the m th element of the variable BPT sequence 382 the value ‘fixed’ 355 . If the selection button is a time-dependent button 127 , then in another step 618 , the CPU 108 starts the elapsed time counter 140 . In a pair of steps 620 , 622 , the CPU 108 monitors the selection buttons 110 for the occurrence of a second selection button press while comparing the elapsed time counter 140 with a user chosen selectable-length time period.
  • the CPU 108 determines if the first pressed selection button 110 is still pressed.
  • the CPU 108 assigns to the m th element of the variable BPT sequence 382 the value ‘short’ 340 .
  • the CPU 108 assigns to the m th element of the variable BPT sequence 382 the value ‘long’ 345 .
  • the CPU 108 assigns to the m th element of the variable BPT sequence 382 the value ‘pair’ 350 . Then, in a subsequent step 666 the CPU 108 adds 1 to the variable number of button presses n 392 . Then, in a subsequent step 668 the CPU 108 assigns to the n th element of the variable BPV sequence 380 the assigned value 222 of the second pressed selection button 110 . Then, in the subsequent step 666 the CPU 108 again adds 1 to the variable number of button presses n 392 . Then, in a subsequent step 670 the CPU 108 adds 1 to the variable number of loops m 390 .
  • the CPU 108 re-initializes the elapsed time counter 140 to zero and repeats the method in succession until in the step 656 the CPU 108 finds that the selection button pressed in step 614 is a press of the spacebar 264 .
  • the CPU 108 converts the values of the variable BPV sequence 380 to values of a variable ‘total BPV sequence’ 386 by: (1) doubling values of the BPV sequence 380 that coincide with ‘long BPT’ values 345 of the BPT sequence 382 , and (2) adding together values of the BPV sequence 380 that coincide with consecutive ‘pair BPT’ values 350 of the BPT sequence 382 .
  • no value of the BPV sequence 380 is added to more than one other value. Furthermore, additions are made so that every value of the BPV sequence 380 that coincides with a pair BPT 350 gets added to a consecutive value of the BPV sequence that also coincides with a pair BPT and in such a way that leaves no BPV that coincides with a pair BPT un-added.
  • the CPU 108 constructs a character sequence 388 by identifying in order from the menu 240 each character 200 whose position 242 equals a value of the total BPV sequence 386 .
  • while the method 609 of FIGS. 6A and 6B is one embodiment of a method for a processor 108 to interpret sequences of button presses, the scope of the method is not limited by this embodiment, but rather by the scope of the claims.
  • FIG. 7 shows a table of value assignments for variables of one embodiment of a character input system for the English language.
  • the variables passed from the menu 240 to the user 160 are the ‘menu row’ 244 , the ‘menu position’ 242 and the ‘available character’ 200 . Assigned values are shown.
  • the variables passed from the user 160 to the selection buttons 110 are the ‘button row’ 282 , ‘button value’ 222 , ‘fixed-value’ 355 , ‘co-press’ 210 and ‘duration’ 208 . Assigned values are shown.
  • the variables synthesized by the processor 108 from presses of the selection buttons 110 are the ‘button row value’ 282 , ‘button press values’ 222 and ‘button press type’ 224 . Assigned values are shown.
  • FIG. 8 shows two examples of the method 609 of FIGS. 6A and 6B for the user interface 150 of FIG. 4 .
  • Each example includes the variables ‘number of button presses n’ 392 , ‘BPV sequence’ 380 , ‘number of loops m’ 390 , ‘row ID sequence’ 385 , ‘BPT sequence’ 382 and ‘character sequence’ 388 .
  • variable number of button presses n 392 identifies the elements (0-11) of the array variable BPV sequence 380 .
  • the BPV sequence 380 contains the BPV 222 of consecutive button presses (−3 −2 0 +2 +3 +2 0 −2 −3 −2 −2) collected in steps 658 and/or 668 over multiple iterations of the method 609 of FIGS. 6A and 6B .
  • the variable number of loops m 390 identifies the elements (0-10) of the array variable row ID sequence 385 and the array variable BPT sequence 382 .
  • the row ID sequence 385 contains the row ID value 282 collected in steps 678 and 680 with each iteration of the method 609 (B-A-A-A-A-A-B-A-A).
  • the BPT sequence 382 contains the BPT 224 collected in one of steps 660 , 662 , 664 or 684 with each iteration of the method 609 of FIGS. 6A and 6B (long-short-fixed-pair-short-fixed-short-long-long-short).
  • the character sequence 388 contains the selected characters (n e g l i g e n c e). In the first example 198 , values for each of the variables above contribute to select characters of the word 130 ‘negligence’.
  • variable number of button presses n 392 identifies the elements (0-16) of the array variable BPV sequence 380 .
  • the BPV sequence 380 contains the BPV 222 of each consecutive button press (+3 −2 −3 −3 −2 −2 −2 +3 −3 +2 0 −3 −2 +3 −2 −3) collected in steps 658 and/or 668 over multiple iterations of the method 609 of FIGS. 6A and 6B .
  • the variable number of loops m 390 identifies the elements (0-12) of the array variable row ID sequence 385 and the array variable BPT sequence 382 .
  • the row ID sequence 385 contains the row ID value 282 collected in steps 678 and 680 with each iteration of the method 609 (B-B-A-A-A-B-B-B-B-A-A-A).
  • the BPT sequence 382 contains the BPT 224 collected in one of steps 660 , 662 , 664 or 684 with each iteration of the method 609 of FIGS. 6A and 6B (pair-long-long-long-long-pair-pair-fixed-pair-long-short-short).
  • the character sequence 388 contains the selected characters (u n a c c u s t o m e d). In the second example 199 , values for each of the variables above contribute to select characters of the word 130 ‘unaccustomed’.
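The conversion of steps 672 and 674 can be sketched from the two FIG. 8 examples. The 13-character menu rows assumed here (row A holding a–m and row B holding n–z at offsets −6..+6) are inferred from those examples, not quoted from the specification, and the row ID sequence is expanded to one entry per loop of the method.

```python
# Steps 672 and 674 as a sketch: a LONG press doubles a button press
# value, a PAIR sums two consecutive values, and SHORT/FIXED presses pass
# their value through; each total BPV then indexes a menu row.
MENU = {"A": "abcdefghijklm", "B": "nopqrstuvwxyz"}  # assumed layout

def decode(bpvs, bpts, row_ids):
    chars = []
    i = 0                                   # index into the BPV sequence
    for bpt, row in zip(bpts, row_ids):
        if bpt == "pair":                   # two presses, values summed
            total = bpvs[i] + bpvs[i + 1]
            i += 2
        elif bpt == "long":                 # one press, value doubled
            total = 2 * bpvs[i]
            i += 1
        else:                               # 'short' or 'fixed': value as-is
            total = bpvs[i]
            i += 1
        chars.append(MENU[row][total + 6])  # offset -6..+6 -> index 0..12
    return "".join(chars)

word = decode(
    [-3, -2, 0, +2, +3, +2, 0, -2, -3, -2, -2],
    ["long", "short", "fixed", "pair", "short",
     "fixed", "short", "long", "long", "short"],
    ["B", "A", "A", "A", "A", "A", "A", "B", "A", "A"],
)
print(word)  # 'negligence'
```

Under the same assumed layout, the second example's sequences decode to 'unaccustomed'.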
  • FIG. 9 shows a method 709 that uses button press types 224 and row ID values 282 to identify a word from a received sequence of button presses.
  • the first step of the method 709 of FIG. 9 is the method 609 of FIGS. 6A and 6B .
  • the CPU 108 interprets received button presses and from the presses constructs a character sequence 388 .
  • the constructed character sequence 388 is the presumed word 134 .
  • the CPU 108 compares the presumed word 134 with a library 136 of word possibilities. In a next step 712 , the CPU 108 determines whether the presumed word 134 is found in the library 136 or not.
  • the CPU 108 accepts the presumed word as input.
  • the CPU 108 divides the BPT sequence 382 into the BPT sequence segments 428 according to the row ID values 282 .
  • the BPT sequence 382 is divided so that consecutive BPTs 224 that have the same row ID value 282 are in the same BPT sequence segment 428 .
  • Consecutive BPTs 224 that have different row ID values 282 are points where the sequence 382 becomes divided.
  • the CPU 108 further divides the BPT sequence 382 into the BPT sequence segments 428 at the fixed-value BPTs 355 .
  • BPTs 224 separated by fixed-value BPTs 355 are in separate sequence segments 428 .
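The two division rules above (steps 750 and 756) can be sketched together. This is an illustrative reconstruction; the function name and segment representation are assumptions.

```python
# Divide a BPT sequence into segments wherever consecutive presses come
# from different button rows, and additionally at every fixed-value BPT,
# which becomes its own single-element segment.
def segment_bpts(bpts, row_ids):
    segments = []
    current = []
    for i, (bpt, row) in enumerate(zip(bpts, row_ids)):
        row_changed = i > 0 and row != row_ids[i - 1]
        if bpt == "fixed":
            if current:
                segments.append(current)
            segments.append([bpt])          # fixed BPT isolated by itself
            current = []
        else:
            if row_changed and current:
                segments.append(current)
                current = []
            current.append(bpt)
    if current:
        segments.append(current)
    return segments

print(segment_bpts(
    ["long", "short", "fixed", "pair", "short", "long"],
    ["B", "A", "A", "A", "A", "B"],
))
# [['long'], ['short'], ['fixed'], ['pair', 'short'], ['long']]
```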
  • the CPU 108 identifies the possible alternative BPT sequences 420 for each sequence segment 428 .
  • the possible alternative BPT sequences 420 for each segment are combinations of BPTs with the same number of button presses as the corresponding segment of the received BPT sequence, as previously disclosed in FIGS. 15-18 of U.S. Application No. 62/155,372 (Method of Word Identification that uses Button Press Type Error Analysis, herein incorporated by reference in its entirety) except applied to an entire word.
  • the CPU 108 ranks, for each segment 428 , the possible alternative BPT sequences 425 according to the likelihood that each occurs based on the received BPT sequence 382 and a ranking criterion.
  • the ranking criterion is the number of individual BPT errors required to create a received BPT sequence 382 from an intended alternative BPT sequence.
  • in another embodiment, the ranking criterion is the likelihood of each required individual BPT error occurring.
  • in yet another embodiment, the ranking criterion is a composite of the number of individual BPT errors required to create a received BPT sequence 382 from an intended alternative BPT sequence and the likelihood of each required individual BPT error occurring.
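The enumeration of step 752 and the first ranking criterion can be sketched as follows. The error metric here, a simple element-wise mismatch count, is an illustrative stand-in for the error analysis of the incorporated application, and the function names are assumptions.

```python
# Enumerate every alternative BPT sequence that accounts for the same
# number of button presses (a PAIR consumes two presses, SHORT and LONG
# one each), then rank alternatives by how many individual BPT errors
# separate them from the received sequence.
def alternative_bpt_sequences(n_presses):
    results = []
    def build(seq, remaining):
        if remaining == 0:
            results.append(tuple(seq))
            return
        for bpt, presses in (("short", 1), ("long", 1), ("pair", 2)):
            if presses <= remaining:
                build(seq + [bpt], remaining - presses)
    build([], n_presses)
    return results

def rank_alternatives(received, n_presses):
    def error_count(alt):
        mismatches = sum(a != b for a, b in zip(alt, received))
        return mismatches + abs(len(alt) - len(received))
    return sorted(alternative_bpt_sequences(n_presses), key=error_count)

ranked = rank_alternatives(("short", "long"), 2)
print(ranked[0])  # ('short', 'long') -- zero errors, ranked most likely
```

For two button presses there are five alternatives: short-short, short-long, long-short, long-long, and a single pair.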
  • the CPU 108 converts each alternative BPT sequence 420 to an alternative character sequence 445 based on the BPV sequence 380 , and the characters 200 and menu positions 242 of the user interface 150 of FIG. 4 , as previously disclosed in FIGS. 21-26 of U.S. Application No. 62/155,372 (Method of Word Identification that uses Button Press Type Error Analysis) except applied to an entire word.
  • in a next step 758 , the CPU 108 converts each fixed-value BPT 355 of the BPT sequence 382 to a fixed-value character 201 according to the assigned value 222 of the selection buttons 129 of the fixed-value BPTs 355 and the menu 240 of the user interface 150 of FIG. 4 .
  • the CPU 108 connects one ranked character sequence 445 or a presumed sequence 134 from each BPT sequence segment 428 with each fixed-value character 201 in the same order as the BPT sequence segments 428 from which the character sequences, presumed sequence and/or fixed-value characters are derived and, in that way, builds one of the reconnected alternative character sequences 462 .
  • the CPU 108 identifies a plurality of unique reconnected alternative character sequences 462 by connecting, in different combinations, character sequences 445 and/or presumed sequences 134 from each segment 428 with the fixed-value characters 201 . In one embodiment, all possible combinations of reconnected alternative character sequences 462 are identified.
  • the CPU 108 iteratively compares reconnected alternative character sequences 462 with the library 136 of word possibilities. In one embodiment, the CPU 108 compares the reconnected alternative character sequences 462 in order of a composite of the ranked likelihood that each reconnected alternative character sequence 462 occurs based on the received BPT sequence 382 and a ranking criterion.
  • the CPU 108 determines whether any reconnected alternative character sequence 462 is found in the library 136 . If at least one reconnected alternative character sequence 462 is in the library 136 , then in a step 726 the CPU 108 accepts one of the found alternative sequences 462 as input. If no alternative sequence 462 is in the library 136 , then in the step 714 the CPU 108 accepts the presumed word 134 as input.
  • FIG. 10 shows a flowchart 152 of the variables of the method 609 of FIGS. 6A and 6B and the method 709 of FIG. 9 .
  • the flowchart 152 incorporates within it the flowchart 138 of FIG. 5 of U.S. Application No. 62/155,372 (Method of Word Identification that uses Button Press Type Error Analysis) that shows the progression of variables through the method 609 of FIGS. 6A and 6B that leads to the presumed word 134 .
  • the flowchart 152 also shows the progression of variables through the method 709 of FIG. 9 that leads to the possible reconnected alternative character sequences 462 .
  • Sequence of row ID values 385 . Acquisition of row ID values 282 in step 680 of the method 609 of FIGS. 6A and 6B enables BPT sequence segmentation in step 750 of the method 709 of FIG. 9 . Sequence segmentation is beneficial because it reduces the number of alternative BPT sequences 420 possible for a given BPT sequence 382 .
  • row ID values 282 require no extra effort on the part of the user. As described in FIG. 4 , the row ID value 282 is part of the identity of each selection button 110 . No decision or additional actuation recognizable to the user is required for the CPU 108 to receive the row ID value 282 . Acquisition of values for the variable ‘sequence of row ID values’ 385 is transparent to the user.
  • Fixed-value BPT 355 . Acquisition of fixed-value BPTs 355 in step 684 of the method 609 of FIGS. 6A and 6B enables BPT sequence segmentation in step 756 of the method 709 of FIG. 9 . Sequence segmentation is beneficial because it reduces the number of alternative BPT sequences 420 possible for a given sequence 382 .
  • fixed-value selection buttons 129 are interspersed among the selection buttons 110 and to the user appear as just another button. No decision or additional actuation recognizable to the user is required for the CPU 108 to receive the additional BPT 224 . Acquisition of the fixed-value BPT 355 along with the other three BPTs 340 , 345 , 350 is transparent to the user.
  • the flowchart 152 of FIG. 10 has five input variables: (1) ‘sequence of button press values’ 380 , (2) ‘co-press’ 210 , (3) ‘duration’ 208 , (4) ‘sequence of row ID values’ 385 , and (5) ‘fixed-value BPT’ 355 .
  • the variables ‘co-press’ 210 , ‘duration’ 208 and ‘fixed-value’ 355 together determine the variable ‘sequence of button press types’ 382 , which occurs as a result of repeated loops through steps 620 , 640 and 682 of FIGS. 6A and 6B .
  • variables ‘sequence of button press values’ 380 and ‘sequence of button press types’ 382 together determine the variable ‘sequence of total button press values’ 386 , which occurs within step 672 of the method 609 of FIGS. 6A and 6B .
  • the variable ‘sequence of total button press values’ 386 determines the variable ‘presumed word’ 134 which occurs in step 674 of the method 609 and is based on the user interface 150 of FIG. 4 .
  • variable ‘sequence of BPTs’ 382 determines the variable ‘segmented sequence of BPTs’ 428 , which occurs in step 756 of the method 709 .
  • the variable ‘sequence of row ID values’ 385 and the variable ‘segmented sequence of BPTs’ 428 together determine the variable ‘further segmented sequence of BPTs’ (also 428 ), which occurs in a step 750 of the method 709 . Note that the steps 750 and 756 of the method 709 of FIG. 9 can occur in either order.
  • variable ‘further segmented sequence of BPTs’ 428 determines the variable ‘number of button presses per sequence segment’ 202 , which occurs within step 752 of the method 709 .
  • variable ‘number of button presses per sequence segment’ 202 determines the variable ‘possible alternative BPT sequences per segment’ 420 , which also occurs in step 752 .
  • the variables ‘sequence of button press values’ 380 and ‘possible alternative BPT sequences per segment’ 420 together determine the variable ‘possible alternative sequences of total BPVs per segment’ 426 , which occurs within step 720 .
  • variable ‘possible alternative sequences of total BPVs per segment’ 426 determines the variable ‘possible alternative character sequences per segment’ 445 , which also occurs within step 720 .
  • variable ‘possible alternative character sequences per segment’ 445 determines the variable ‘reconnected alternative character sequences’ 462 , which occurs in step 760 of the method 709 .
  • the variables ‘reconnected alternative character sequences’ 462 and ‘presumed word’ 134 are compared with the variable ‘library of words’ 136 to determine the variable ‘identified word’ 130 , which occurs in steps 710 and 762 .
  • FIG. 11 shows the user interface 150 of FIG. 4 , a table 185 of value assignments for variables of the method 709 of FIG. 9 , and a list of input variables 186 for the method 609 of FIGS. 6A and 6B .
  • the user interface 150 , table 185 , and list of variables 186 are examples used to demonstrate the embodiments of FIGS. 4, 6 and 9 .
  • the scope of the invention is not limited by the variables and values shown here, but rather by the scope of the claims.
  • the table 185 is divided into rows and columns. Rows are grouped first by the row ID value 282 and then by the button press type 224 . Each column is one variable: the variable ‘row ID value’ 282 , the variable ‘co-press’ 210 , the variable ‘duration’ 208 , the variable ‘button press type’ 224 , the variable ‘button press values’ 222 , the variable ‘total button press value’ 228 and the variable ‘character’ 200 .
  • Each line of the table 185 is a unique combination of the variables row ID value 282 , button press type 224 , and button press value 222 .
  • two selection button rows 280 with five selection buttons 110 per row enable 26 unique variable combinations.
  • the list 186 highlights which of the variables of the method 709 of FIG. 9 are the input variables.
  • the input variables are: (1) ‘button press values’ 222 , (2) ‘co-press’ 210 , (3) ‘duration’ 208 , (4) ‘row ID value’ 282 and (5) ‘fixed-value’ 355 .
  • the remaining variables of the table 185 (‘button press type’ 224 , ‘total button press value’ 228 , and ‘character’ 200 ) all follow from the input variables 186 and the user interface 150 , as shown by the flowchart of FIG. 10 .
  • FIG. 12 shows a table that compares characteristics of two different input methods.
  • One method shown is the method 709 of FIG. 9 , also known as the reduced-button input method.
  • Another method shown is a 26-button input method 132 .
  • a standard QWERTY keyboard is one example of the 26-button method 132 .
  • the characteristics compared in the table of FIG. 12 are: input variables, possible values for the input variables, level of control, and factor determining the level of control.
  • the reduced-button method 709 has five input variables: (1) button press values 222 , (2) co-press 210 , (3) duration 208 , (4) row ID value 282 and (5) ‘fixed-value’ 355 . These five variables appear as inputs in the flowchart 152 of FIG. 10 and in steps 620 , 640 , 658 , 668 , 678 and 682 of the method 609 of FIGS. 6A and 6B . Possible values of these variables for the user interface 150 of FIG. 4 are: (1) −3, −2, 0, +2 or +3, (2) pair or not, (3) <ETP or >ETP, (4) A or B and (5) fixed-value or not.
  • the 26-button method 132 has one input variable: button press value 222 .
  • Possible values for the button press value 222 in the case of the 26-button method are the characters themselves: a, b, c, d . . . and so on.
  • Level of control over the five variables for the reduced-button method 709 is high for the button press value variable 222 , row ID value 282 and fixed-value 355 , but low for the co-press 210 and duration 208 .
  • the factor that determines the high level of control over the button press value variable 222 , the row ID value 282 and the fixed-value 355 is the button size. Because the reduced-button method 709 requires fewer buttons than there are selectable characters, relative to other input methods there is space available to increase the button size. For example, for the 13:5 ratio of characters to buttons shown in the interface 150 of FIG. 4 , only ten buttons are required to offer every character of the English alphabet. Therefore, even in a compact application like a mobile device, the buttons can be large enough for a human finger to press without error, so the level of control over each of these variables is considered high.
  • the factors determining the low level of control for the variables co-press 210 and duration 208 are the moment of button press and the moment of button release. Both these variables 208 , 210 are time dependent and in a typical application need to be controlled to a precision finer than a tenth of a second. Achieving that level of control is difficult on a routine basis, so the level of control over these variables is considered low. However, due to the predictability of button press timing errors, the low level of control over the variables co-press 210 and duration 208 can be overcome with BPT error analysis.
  • Level of control over the button press value variable 222 for the 26-button method 132 is low.
  • the factor determining the level of control for the button press value 222 is again button size. The difference with the 26-button method 132 is that the requirement to provide 26 buttons means each individual button must be small in a compact application. In use, the small button size leads to button press errors, so the level of control for the button press value variable 222 is considered low for the 26-button method 132 .
  • FIG. 13 shows an example of the interface 150 of FIG. 4 used with the reduced button method 709 and an example of a 26-button user interface 133 used with the 26-button method 132 .
  • the difference in size of the selection buttons 110 for each interface is noticeable.
  • the larger selection buttons 110 of the user interface 150 used with the reduced button method 709 provide a user with a high level of control.
  • the smaller selection buttons 110 of the 26-button interface 133 provide only a low level of control.
  • FIG. 14 shows a graphical representation of an example of each of the three time-dependent button press types 224 : the short BPT 340 , the long BPT 345 and the pair BPT 350 .
  • FIG. 14 also shows a graphical representation of a sequence of two short BPTs 341 .
  • the passage of time is represented by a horizontal bar 326 .
  • a black region 327 within the bar 326 indicates a period of time when a button is pressed.
  • a white region 328 within the bar 326 indicates a period of time when a button is not pressed.
  • a first solid vertical marker indicates a beginning 329 of an elapsed time period 330 .
  • a second solid vertical marker indicates an end (or expiration) 331 of the elapsed time period 330 .
  • Button presses have an onset 320 and a moment of release 322 . The time between the onset 320 and the moment of release 322 is the duration 208 .
  • the elapsed time period 330 commences with the onset 320 of a button press whenever an elapsed time period 330 is not already occurring.
  • a button press with duration 208 less than the elapsed time period 330 is the short BPT 340.
  • a button press with duration 208 longer than the elapsed time period 330 is the long BPT 345 . Because the onset 320 of the button press and the elapsed time period 330 commence together, the moment of button release 322 distinguishes the short BPT 340 from a long BPT 345 .
  • as the graphics of the short BPT 340 and long BPT 345 show, if the moment of button release 322 occurs close (in time) to the expiration 331 of the elapsed time period, then a small difference in the timing of the release can change the determination of the BPT 224.
  • the difference in the moment of release 322 between the short BPT 340 and the long BPT 345 is on the order of 0.02 seconds. This sensitivity is why the level of control over the input variable duration 208 in the table of FIG. 12 is considered low. It also shows how, in the method 709 of FIG. 9, an inaccurately timed button release by a user can lead an intended short BPT 340 to be entered as a long BPT 345, or vice-versa.
  • a similar risk occurs at the onset 320 of a button press, if the expiration 331 of the elapsed time period 330 is close ahead of it in time.
  • the graphical representation of the sequence of two short BPTs 341 shows how the onset 320 of a subsequent button press 332 can nearly overlap (in time) with the expiration 331 of an expiring elapsed time period 330. If the onset 320 of the subsequent button press 332 occurs before the expiration 331 of the elapsed time period 330, then the CPU 108 interprets the subsequent button press 332 as a second button press 333 of the pair BPT 350. Thus the sequence of consecutive short BPTs 341 is interpreted as the pair BPT 350.
  • in the examples of FIG. 14, the difference in the button onset 320 is on the order of 0.02 seconds. This sensitivity is why the level of control over the input variable co-press 210 in the table of FIG. 12 is considered low. It also shows how, in the method 609 of FIGS. 6A and 6B, an inaccurately timed button press by a user can lead button presses intended to be entered as consecutive BPTs to be entered as a pair BPT 350 instead, or vice-versa.
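The timing decisions described above can be sketched in code. The sketch below is an illustrative reimplementation, not the patent's own code; the function name, its signature and the 0.15-second period are assumptions for the sketch. It classifies one selection cycle from the onset 320, the moment of release 322 and the onset of any co-press:

```python
# Hypothetical sketch of the BPT decision described above (not the
# patent's implementation). ELAPSED_PERIOD is the short-duration timer
# that starts at the onset of the first button press of a cycle.
ELAPSED_PERIOD = 0.15  # seconds; illustrative value

def classify_bpt(press_time, release_time, next_press_time=None):
    """Classify one selection cycle as 'short', 'long' or 'pair'.

    press_time      -- onset of the first button press (seconds)
    release_time    -- moment of release of that press, or None if the
                       button is still held when the timer expires
    next_press_time -- onset of a second (co-press) button, if any
    """
    expiration = press_time + ELAPSED_PERIOD
    # A co-press before the timer expires makes the cycle a 'pair'.
    if next_press_time is not None and next_press_time < expiration:
        return "pair"
    # Otherwise the moment of release distinguishes short from long.
    if release_time is not None and release_time < expiration:
        return "short"
    return "long"

# A release just before vs. just after expiration flips the result,
# which is why the variable duration 208 is considered low-control.
print(classify_bpt(0.00, 0.14))        # short (released 0.01 s early)
print(classify_bpt(0.00, 0.16))        # long  (released 0.01 s late)
print(classify_bpt(0.00, 0.05, 0.12))  # pair  (co-press inside period)
```

The same near-miss sensitivity appears on both boundaries: moving the release or the co-press onset by roughly 0.02 seconds across the expiration changes the returned BPT.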
  • FIGS. 15-17 show an example of an application of the method 709 and system 101 for multi-variable character input.
  • the presumed word 134 of the example of FIGS. 15-17 is ‘negligence’.
  • FIG. 15 shows the user interface 150 of FIG. 4 , a table of values for each of the variables ‘character’ 200 , ‘menu row’ 244 , ‘menu position’ 242 , ‘button press values’ 222 and ‘button press type’ 224 , and three sequence variables ‘sequence of row ID values’ 385 , ‘sequence of BPVs’ 380 , and ‘sequence of BPTs’ 382 .
  • Values for the variable ‘character’ 200 derive directly from the presumed word 134 .
  • Values for the variable ‘menu row’ 244 and ‘menu position’ 242 derive from the position of each character 200 in the menu 240 according to the user interface 150 .
  • the value for the variable ‘sequence of row ID values’ 385 derives from iterative cycles through steps 678 and 680 of the method 609 of FIGS. 6A and 6B .
  • the value for the variable ‘sequence of BPVs’ 380 derives from iterative cycles through steps 658 and/or 668 of the method 609 of FIGS. 6A and 6B .
  • the value for the variable ‘sequence of BPTs’ 382 derives from iterative cycles through steps 660 , 662 , 664 and/or 684 of the method 609 .
  • the value for the sequence of row ID values 385 is ‘B-A-A-A-A-A-B-A-A’
  • the value for the sequence of BPVs 380 is ‘−3 −2 0 +2 +3 +2 0 −2 −3 −2 −2’
  • the value for the sequence of BPTs 382 is ‘long-short-fixed-pair-short-fixed-short-long-long-short’.
  • FIGS. 16 and 17 show how the presumed word 134 and a plurality of reconnected alternative character sequences 462 are derived from the sequence of BPVs 380 , the sequence of BPTs 382 , and the sequence of row ID values 385 of FIG. 15 .
  • the BPT sequence 382 is divided into segments 428 according to the sequence of row ID values 385 and fixed-value BPTs 355 .
  • the sequence 382 is segmented at points where individual row ID values 282 change and at the position of each fixed-value BPT 355 .
  • the row ID values 282 change between the first and second positions of the sequence of row ID values 385, back again between the seventh and eighth positions, and then back again between the eighth and ninth positions.
  • Fixed-value BPTs 355 occur in the third and sixth positions of the BPT sequence 382 .
  • the BPT sequence 382 ‘long-short-fixed-pair-short-fixed-short-long-long-short’ is divided into six BPT sequence segments 428 : a first segment 432 ‘long’, a second segment 434 ‘short’, a third segment 436 ‘pair-short’, a fourth segment 438 ‘short’, a fifth segment 440 ‘long’ and a sixth segment 441 ‘long-short’.
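The segmentation rule can be sketched as follows. This is an illustrative reimplementation (all names invented): it splits the ‘negligence’ BPT sequence at each fixed-value BPT and at each row change, reproducing the six segments listed above. The per-position row IDs are aligned here for illustration, with None marking the fixed-value presses:

```python
def segment_bpts(bpts, rows):
    """Split a BPT sequence into segments at row changes and at each
    fixed-value BPT (which always closes the current segment)."""
    segments, current, prev_row = [], [], None
    for bpt, row in zip(bpts, rows):
        if bpt == "fixed":
            if current:
                segments.append(current)
            current, prev_row = [], None  # fixed press closes a segment
            continue
        if prev_row is not None and row != prev_row and current:
            segments.append(current)     # row change closes a segment
            current = []
        current.append(bpt)
        prev_row = row
    if current:
        segments.append(current)
    return segments

# The 'negligence' example: BPT sequence and illustrative row IDs.
bpts = ["long", "short", "fixed", "pair", "short",
        "fixed", "short", "long", "long", "short"]
rows = ["B", "A", None, "A", "A", None, "A", "B", "A", "A"]
print(segment_bpts(bpts, rows))
# [['long'], ['short'], ['pair', 'short'], ['short'], ['long'],
#  ['long', 'short']]
```

The six returned segments correspond to the first through sixth segments 432-441 described above.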
  • the number of button presses 202 in the BPT sequence segment 428 determines the possible alternative BPT sequences 420 for the segment, just as the number of presses determines the possible alternative sequences 420 for an entire BPT sequence 382 .
  • the alternative BPT sequences 420 are determined as shown in FIGS. 15-18 of U.S. Application No. 62/155,372 (Method of Word Identification that uses Button Press Type Error Analysis).
  • the first, second, fourth and fifth BPT sequence segments 432 , 434 , 438 , 440 have one button press and therefore only one possible alternative BPT sequence 420.
  • the third BPT sequence segment 436 has three button presses, and therefore eleven possible alternative BPT sequences 420 .
  • the sixth BPT sequence segment 441 has two button presses, and therefore four possible alternative BPT sequences 420.
  • Each alternative BPT sequence 420 in each BPT sequence segment 428 is converted to a total BPV sequence 386 (not shown) based on the BPV sequence 380 and then to an alternative character sequence 445 according to the user interface 150 of FIG. 4 , as previously disclosed in steps of the method 700 of FIG. 19 of U.S. Application No. 62/155,372 (Method of Word Identification that uses Button Press Type Error Analysis).
  • FIG. 17 lists the alternative character sequences 445 as shown.
  • One character sequence 445 or presumed sequence 134 from each BPT sequence segment 428 is reconnected (indicated by dashed lines 463) with one character sequence or presumed sequence from every other BPT sequence segment, and with the fixed-value character 201. The reconnection preserves the order of the BPT sequence segments 428 from which the character sequences, presumed sequences and/or fixed-value characters are derived and, in that way, builds one of the reconnected alternative character sequences 462.
  • the CPU 108 identifies a plurality of unique reconnected alternative character sequences 462 by connecting, in different combinations, a character sequence 445 , a presumed sequence 134 and/or fixed-value character 201 from each segment 428 .
  • all possible combinations of reconnected alternative character sequences 462 are identified. In a further embodiment, all possible combinations of reconnected alternative character sequences 462 are compared with a library 136 of word possibilities. In a further embodiment, if a reconnected alternative character sequence 462 is found in the library 136 and the presumed word 134 is not found in the library, then the CPU 108 accepts as input the found reconnected alternative character sequence 462 in place of the presumed word 134 .
  • examples of reconnected alternative character sequences 462 are ‘n e g l i g e n e e’, ‘n e g l i g e n e c’, ‘n e g l i g e n c c’, ‘n e g l i g e q c e’ and so on.
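The reconnection and library-comparison steps just described can be sketched as follows. The candidate sets below are invented for illustration (the real sets come from the per-segment BPT error analysis of each segment 428), and the two-word library is a toy stand-in for the library 136:

```python
from itertools import product

# Invented per-segment character candidates for the 'negligence'
# example; the fixed-value character 'g' occupies its own slots.
pieces = [
    ["n", "q"],          # segment 1 alternatives
    ["e", "c"],          # segment 2 alternatives
    ["g"],               # fixed-value character
    ["li", "iv"],        # segment 3 alternatives
    ["g"],               # fixed-value character
    ["e", "c"],          # segment 4 alternatives
    ["n", "q"],          # segment 5 alternatives
    ["ce", "ee", "ec"],  # segment 6 alternatives
]

library = {"negligence", "diligence"}  # toy word library

# Reconnect one candidate from each slot, in order, and keep any
# reconnected alternative character sequence found in the library.
candidates = {"".join(combo) for combo in product(*pieces)}
matches = candidates & library
print(matches)  # {'negligence'}
```

With these candidate sets the reconnection also produces sequences such as ‘negligenee’ and ‘negligeqce’ from the bullet above, but only ‘negligence’ survives the library comparison.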
  • FIGS. 18-21 show an example of an application of the method 709 and system 101 for multi-variable character input.
  • the presumed word 134 of the example of FIGS. 18-21 is ‘rwnaccustojed’.
  • An intended word 135 is ‘unaccustomed’.
  • FIG. 18 shows the user interface 150 of FIG. 4 , a table of values for each of the variables ‘character’ 200 , ‘menu row’ 244 , ‘menu position’ 242 , ‘button press values’ 222 and ‘button press type’ 224 , and three sequence variables ‘sequence of row ID values’ 385 , ‘sequence of BPVs’ 380 , and ‘sequence of BPTs’ 382 .
  • Values for the variable ‘character’ 200 derive directly from the presumed word 134 .
  • Values for the variable ‘menu row’ 244 and ‘menu position’ 242 derive from the position of each character 200 in the menu 240 according to the user interface 150 .
  • the value for the variable ‘sequence of row ID values’ 385 derives from iterative cycles through steps 678 and 680 of the method 609 of FIGS. 6A and 6B .
  • the value for the variable ‘sequence of BPVs’ 380 derives from iterative cycles through steps 658 and/or 668 of the method 609 of FIGS. 6A and 6B .
  • the value for the variable ‘sequence of BPTs’ 382 derives from iterative cycles through steps 660 , 662 , 664 and/or 684 of the method 609 .
  • the value for the sequence of row ID values 385 is ‘B-B-B-A-A-A-B-B-B-A-A’
  • the value for the sequence of BPVs 380 is ‘−2 +3 −3 −3 −2 −2 +3 −3 +2 0 −3 −2 +3 −2 −3’
  • the value for the sequence of BPTs 382 is ‘short-short-long-long-long-long-pair-pair-fixed-pair-short-short-short’.
  • FIGS. 19-21 show how the presumed word 134 and a plurality of reconnected alternative character sequences 462 are derived from the sequence of BPVs 380 , the sequence of BPTs 382 , and the sequence of row ID values 385 of FIG. 18 .
  • the BPT sequence 382 is divided into segments 428 according to the sequence of row ID values 385 and fixed-value BPTs 355 .
  • the sequence 382 is segmented at points where individual row ID values 282 change and at the position of each fixed-value BPT 355 .
  • the row ID value 282 changes between the third and fourth positions of the sequence of row ID values 385, back again between the sixth and seventh positions, and then back again between the tenth and eleventh positions.
  • a fixed-value BPT 355 occurs in the ninth position of the BPT sequence 382 .
  • the BPT sequence 382 ‘short-short-long-long-long-long-pair-pair-fixed-pair-short-short-short’ is divided into five BPT sequence segments 428: a first segment 432 ‘short-short-long’, a second segment 434 ‘long-long-long’, a third segment 436 ‘pair-pair’, a fourth segment 438 ‘pair’ and a fifth segment 440 ‘short-short-short’.
  • the number of button presses 202 in the BPT sequence segment 428 determines the possible alternative BPT sequences 420 for the segment, just as the number of presses determines the possible alternative sequences 420 for an entire BPT sequence 382 .
  • the alternative BPT sequences 420 are determined as shown in FIGS. 15-18 of U.S. Application No. 62/155,372 (Method of Word Identification that uses Button Press Type Error Analysis).
  • the first, second and fifth BPT sequence segments 432 , 434 , 440 have three button presses and therefore eleven alternative BPT sequences 420 .
  • the third BPT sequence segment 436 has four button presses and therefore 28 possible alternative BPT sequences 420 .
  • the fourth BPT sequence segment 438 has two button presses and therefore four possible alternative BPT sequences 420 .
  • Each alternative BPT sequence 420 in each BPT sequence segment 428 is converted to a total BPV sequence 386 (not shown) based on the BPV sequence 380 and then to an alternative character sequence 445 according to the user interface 150 of FIG. 4 , as previously disclosed in steps of the method 700 of FIG. 19 of U.S. Application No. 62/155,372 (Method of Word Identification that uses Button Press Type Error Analysis).
  • FIG. 19 lists the alternative character sequences 445 as shown.
  • One character sequence 445 or presumed sequence 134 from each BPT sequence segment 428 is reconnected (indicated by dashed lines 463) with one character sequence or presumed sequence from every other BPT sequence segment, and with the fixed-value character 201. The reconnection preserves the order of the BPT sequence segments 428 from which the character sequences, presumed sequences and/or fixed-value characters are derived and, in that way, builds one of the reconnected alternative character sequences 462.
  • the CPU 108 identifies a plurality of unique reconnected alternative character sequences 462 by connecting, in different combinations, a character sequence 445 , a presumed sequence 134 and/or fixed-value character 201 from each segment 428 .
  • all possible combinations of reconnected alternative character sequences 462 are identified. In a further embodiment, all possible combinations of reconnected alternative character sequences 462 are compared with a library 136 of word possibilities. In a further embodiment, if a reconnected alternative character sequence 462 is found in the library 136 and the presumed word 134 is not found in the library, then the CPU 108 accepts as input the found reconnected alternative character sequence 462 in place of the presumed word 134 .
  • the BPT sequence 382 includes a ‘pair’ BPT 450 , so there is more than one BPV sequence 380 a user could input to get the intended word 135 ‘unaccustomed’. As a result, more than one set of alternative character sequences 445 exists.
  • FIGS. 20 and 21 show two of eight possible sets of reconnected alternative character sequences 462 that could occur.
  • the received BPV sequence 380 is ‘−2 +3 −3 −3 −2 −2 [−2 +3] [−3 +2] 0 [−3 −2] +3 −2 −3’.
  • the received BPV sequence 380 is ‘−2 +3 −3 −3 −2 −2 [+3 −2] [+2 −3] 0 [−2 −3] +3 −2 −3’.
  • examples of possible reconnected alternative character sequences 462 are ‘r w n a c c u s t o j e a’, ‘r w n a c c u s t o j c d’, ‘r w n a c c u s t o m e d’, ‘r w n a c c r w q v t o j e d’, ‘u n a c c u s t o j e d’, ‘unaccustomed’ and so on.
  • examples of possible reconnected alternative character sequences 462 are ‘r w n a c c u s t o j e a’, ‘r w n a c c u s t o j c d’, ‘r w n a c c u s t o m e d’, ‘r w n a c c w v q o j e d’, ‘u n a c c u s t o j d’, ‘u n a c c u s t o j d’, ‘u n a c c u s t o m e d’ and so on.
  • FIG. 22 shows a schematic drawing of another embodiment of the electronic device 100 for input of characters.
  • the device 100 may have some or all the components and functionality described herein with respect to the mobile device 100 of FIG. 1 .
  • the device 100 has aspects previously disclosed in FIG. 8 of U.S. Pat. No. 8,487,877, which is hereby incorporated by reference in its entirety.
  • the electronic device 100 includes the display 104 , a plurality of characters 200 that populate positions 242 in multiple menu rows (or menu arrays) 244 of a character menu 240 , a plurality of selection buttons 110 divided among multiple button rows (or button arrays) 280 that includes both time-dependent selection buttons 127 and at least one time-independent selection button 129 , and a spacebar button 264 , which together make up a user interface 150 of the device 100 .
  • Each of the selection buttons 110 has an assigned row identification (ID) value 282 and an assigned button press value 222 . Included as part of or within proximity to the menu 240 is a reference 258 , additional row ID values 282 , and an offset scale 260 .
  • the display 104 , the plurality of selection buttons 110 , and the spacebar button 264 are communicatively coupled with the CPU 108 , as described in the embodiment of FIG. 1 .
  • the CPU 108 includes the elapsed time counter 140 and the button press value counter 142 , as described in the embodiment of FIG. 3 .
  • the CPU 108 is communicatively coupled with the storage medium 112 and the power source 122 , as described in the embodiment of FIG. 3 .
  • the menu 240 has 34 menu positions 242 and the plurality of selection buttons includes fourteen buttons with the assigned button press values 222: ‘−4, −3, −2, 0, +2, +3, +4’.
  • the menu positions 242 are populated by 33 characters 200 of the Russian alphabet.
  • FIG. 23 shows an embodiment of the table 185 of value assignments for variables of the method 709 of FIG. 9 for the embodiment of the user interface 150 of FIG. 22 .
  • FIG. 24 shows a schematic drawing of another embodiment of the electronic device 100 for input of characters.
  • the device 100 may have some or all the components and functionality described herein with respect to the mobile device 100 of FIG. 1 .
  • the device 100 has aspects previously disclosed in FIG. 8 of U.S. Pat. No. 8,487,877, which is hereby incorporated by reference in its entirety.
  • the electronic device 100 includes the display 104 , a plurality of characters 200 that populate positions 242 in multiple menu rows (or menu arrays) 244 of a character menu 240 , a plurality of selection buttons 110 divided among multiple button rows (or button arrays) 280 that includes both time-dependent selection buttons 127 and at least one time-independent selection button 129 , and a spacebar button 264 , which together make up a user interface 150 of the device 100 .
  • Each of the selection buttons 110 has an assigned row identification (ID) value 282 and an assigned button press value 222 . Included as part of or within proximity to the menu 240 is a reference 258 , additional row ID values 282 , and an offset scale 260 .
  • the display 104 , the plurality of selection buttons 110 , and the spacebar button 264 are communicatively coupled with the CPU 108 , as described in the embodiment of FIG. 1 .
  • the CPU 108 includes the elapsed time counter 140 and the button press value counter 142 , as described in the embodiment of FIG. 3 .
  • the CPU 108 is communicatively coupled with the storage medium 112 and the power source 122 , as described in the embodiment of FIG. 3 .
  • the menu 240 has 30 menu positions 242 and the plurality of selection buttons includes twelve buttons with the assigned button press values 222: ‘−4, −3, −2, 0, +2, +3’.
  • the menu positions 242 are populated by the 26 characters 200 of the English alphabet, plus characters that represent four of the five tones used in Chinese pinyin.
  • the four tones represented are flat (high level), rising (high-rising), fall-rising (low) and falling (high-falling).
  • the four tones are represented by a macron, acute accent, caron and grave accent, respectively.
  • the four tones are represented by the marks ‘ˉ’, ‘ˊ’, ‘ˇ’ and ‘ˋ’, respectively.
  • FIG. 25 shows an embodiment of the table 185 of value assignments for variables of the method 709 of FIG. 9 for the embodiment of the user interface 150 of FIG. 24 .
  • FIG. 26 shows a schematic drawing of another embodiment of the electronic device 100 for input of characters.
  • the device 100 may have some or all the components and functionality described herein with respect to the mobile device 100 of FIG. 1 .
  • the device 100 has aspects previously disclosed in FIG. 8 of U.S. Pat. No. 8,487,877, which is hereby incorporated by reference in its entirety.
  • the electronic device 100 includes the display 104 , a plurality of characters 200 that populate positions 242 in multiple menu rows (or menu arrays) 244 of a character menu 240 , a plurality of selection buttons 110 divided among multiple button rows (or button arrays) 280 that includes both time-dependent selection buttons 127 and at least one time-independent selection button 129 , and a spacebar button 264 , which together make up a user interface 150 of the device 100 .
  • Each of the selection buttons 110 has an assigned row identification (ID) value 282 and an assigned button press value 222 . Included as part of or within proximity to the menu 240 is a reference 258 , additional row ID values 282 , and an offset scale 260 .
  • the display 104 , the plurality of selection buttons 110 , and the spacebar button 264 are communicatively coupled with the CPU 108 , as described in the embodiment of FIG. 1 .
  • the CPU 108 includes the elapsed time counter 140 and the button press value counter 142 , as described in the embodiment of FIG. 3 .
  • the CPU 108 is communicatively coupled with the storage medium 112 and the power source 122 , as described in the embodiment of FIG. 3 .
  • the menu 240 has 26 menu positions 242 and the plurality of selection buttons includes ten buttons with the assigned button press values 222: ‘−3, −2, 0, +2, +3’.
  • the menu positions 242 are populated by the 26 characters 200 of the English alphabet.
  • the characters 200 populate the positions 242 so that the most frequently used characters in the language ‘e’ and ‘t’ are in positions selected by a time-independent button 129 .
  • the most frequently used characters in the language ‘e, t, a, o, i, n, s, r, h and l’ populate positions of the menu selected by ‘fixed BPTs’ 355 or ‘short BPTs’ 340 .
  • the least frequently used characters in the language ‘y, b, v, k, x, j, q and z’ populate positions of the menu selected by ‘pair BPTs’ 350 .
  • FIG. 27 shows an embodiment of the table 185 of value assignments for variables of the method 709 of FIG. 9 for the embodiment of the user interface 150 of FIG. 26 .
  • the proposed method uses time-dependent button actuations and a time-sensitive error correction algorithm to enable text input.
  • An algorithm corrects errors by identifying alternative character sequences that derive from predictable button actuation timing errors. Errors due to timing are predictable because the number of possible inaccurately timed button actuation combinations is finite. Furthermore, significant variation in the likelihood of different actuation errors makes error prioritization productive.
  • the method increases positional selection accuracy but comes at the expense of a new error type—button press timing inaccuracy. However the trade-off is net positive because positional selection errors are exchanged for more predictable time-based ones.
  • the interface has 10 selection buttons and a 26-position menu.
  • a user selects characters by navigating the menu using time-dependent button actuations. The character highlighted when a short duration timer expires becomes selected.
  • the 13:5 ratio of characters to buttons enables button size to be large compared with a 26-button interface, which increases selection accuracy.
  • a trade-off is that errors due to inaccurately timed button presses become possible.
  • An algorithm corrects selection errors by identifying alternative character sequences that require actuation of the same buttons in the same order as those of an entered word, but that use different time-dependent button actuations. Errors due to inaccurately timed button presses are more predictable than errors due to inaccurately positioned presses because the number of possible alternatives is fewer. Furthermore, the likelihood of the various errant time-dependent button actuations varies significantly, which makes prioritization of identified alternative character sequences useful.
  • the proposed method increases positional selection accuracy but at the expense of a new error type—inaccurately timed button presses.
  • the trade-off is net positive because errors due to inaccurately positioned button presses are exchanged for errors due to inaccurately timed presses, which are more predictable and easier to accurately correct.
  • FIG. 28 discloses one-half of a full 26-character interface 150 , a graphic representation of each of three button press types 224 , and a table of button press types 224 and their corresponding math operations 181 .
  • the selection buttons of the interface 150 have the assigned ‘button press values’ −3, −2, 0, +2 and +3.
  • a button press tentatively selects a character and simultaneously starts a short-duration elapsed timer (in one embodiment, ~0.15 sec).
  • the button press identifies the selected character by the value of its position in the menu.
  • the elapsed timer defines a period during which one of three button actuations completes the selection: (1) a button release (duration less than the elapsed time period), (2) a continued button press (duration greater than the elapsed time period), or (3) an additional selection button press (a co-press).
  • the button actuation underway when the timer expires identifies the ‘button press type’ of the selection cycle (short, long or pair).
  • the button press type 224 determines a math operation 181 applied to the button press value of the tentatively selected character.
  • the result of the math operation is a ‘total button press value’ that identifies a character by its position in the menu, as shown by the table 185 of FIG. 29 .
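The math operations themselves are given in the table 185 of FIG. 29, which is not reproduced in this text. The sketch below therefore assumes one plausible mapping (short: BPV unchanged; long: BPV doubled; pair: sum of the two BPVs; fixed: 0), chosen so that it is consistent with the BPV and BPT sequences of the ‘negligence’ example of FIG. 15; all names are invented:

```python
# Assumed mapping of button press type to math operation (illustrative;
# the actual table is table 185 of FIG. 29):
#   short -> BPV unchanged, long -> BPV doubled,
#   pair  -> sum of the two BPVs, fixed -> 0.
def total_bpv(bpt, bpvs):
    if bpt == "short":
        return bpvs[0]
    if bpt == "long":
        return 2 * bpvs[0]
    if bpt == "pair":
        return bpvs[0] + bpvs[1]
    return 0  # fixed-value BPT

# The 'negligence' input of FIG. 15; a 'pair' consumes two BPVs.
entries = [("long", [-3]), ("short", [-2]), ("fixed", [0]),
           ("pair", [+2, +3]), ("short", [+2]), ("fixed", [0]),
           ("short", [-2]), ("long", [-3]), ("long", [-2]),
           ("short", [-2])]
totals = [total_bpv(bpt, bpvs) for bpt, bpvs in entries]
print(totals)  # [-6, -2, 0, 5, 2, 0, -2, -6, -4, -2]
```

Each total identifies one menu position per selection cycle; note that repeated letters (the two n's, the four e's) produce repeated totals, as expected.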
  • a word is represented by a sequence of button press values (BPVs) and a sequence of button press types (BPTs).
  • BPVs: button press values
  • BPTs: button press types
  • An error correction algorithm finds the correct word by identifying possible alternative BPT sequences for the button presses received. For example, for the word ‘lad’, a possible alternative to the BPT sequence ‘pair-long-short’ is ‘pair-long-long’. The number of possible sequences depends on the number of button presses the sequence has. As shown in FIG. 30 , the word ‘lad’ is a 4-button-press sequence. Therefore, as shown in the left half of FIG. 31 , ‘lad’ has 28 possible alternative BPT sequences 420 .
  • Each alternative BPT sequence 420 is translated to an alternative character sequence 445 according to the BPV sequence received and the menu.
  • the right half of FIG. 31 shows the alternative character sequences for ‘lad’. An errant BPT in the BPT sequence would lead the word ‘lad’ to appear among these 28 possible alternatives.
  • the number of possible alternative BPT sequences 420 compounds as the number of button presses increases. Compounding occurs due to the number of BPT combinations that become possible when consecutive button actuations cannot be conclusively determined.
  • the remedy for this problem is an additional variable that allows button presses, and therefore BPTs, to also be identified by row.
  • two assumptions are made: (1) button presses occur in the order they are intended and (2) button presses occur at least within the row they are intended.
  • these two assumptions are considered valid. If those are true, then only those sequences where the BPTs correctly correspond with the known row value for each position of the sequence are possible.
  • the algorithm divides the received BPT sequence into segments at each row change. The algorithm then identifies possible alternative BPT sequences separately for each segment. To identify possible alternative character sequences, the algorithm reconnects, in order, possible combinations of the received and alternative character sequence segments.
  • a time-independent button is a key with an assigned button press value that does not change as a result of the actuation that selects it.
  • the button is immune to button press timing errors and its value cannot be combined with that of another actuation as part of a ‘pair’. Because a time-independent button press cannot be mistaken for or combined with any other BPT, it conclusively divides a BPT sequence into segments the same way a row change does.
  • FIGS. 32-34 show an example application of the error correction algorithm described above.
  • the example uses the 2-row interface of FIG. 32 .
  • the interface includes time-independent buttons for the positions of the letters ‘g’ and ‘t’.
  • the algorithm also acquires a row ID value.
  • the intended word is ‘negligence’, but the word is incorrectly entered as ‘qegilgenee’.
  • the algorithm correctly identifies the word ‘negligence’ among the possible alternative character sequences, as shown by the dashed line in FIG. 34 .
  • of the button actuations 224 in FIG. 28 , the ‘pair’ takes the longest to execute because it requires initiation of two button presses.
  • the maximum time needed to execute two presses is found by experiment to be ~0.10-0.15 sec.
  • the minimum time needed to execute a button release (‘short’ BPT) or to comfortably wait out a continued button press (‘long’ BPT) is shorter than 0.15 seconds by only a minor amount in comparison with the time needed to execute a ‘pair’ BPT and in comparison with an entire character selection cycle. Therefore the button press actuations devised for the method meet the design condition mentioned above.
  • the number of possible alternative sequences is determined only by the number of button presses in the sequence.
  • the number of BPTs in the sequence is not important.
  • a table 418 in the bottom half of FIG. 35 shows the number of BPT alternatives for each case of a given number of button press values.
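The counts referenced above (one alternative for one button press, four for two, eleven for three, 28 for four) can be reproduced by brute-force enumeration, since each press is either a ‘short’ or a ‘long’, and any two adjacent presses may instead merge into a single ‘pair’. The sketch below is illustrative (names invented), and excludes the received sequence itself from the count of alternatives:

```python
def bpt_sequences(n):
    """All BPT sequences producible by n button presses: each press is
    'short' or 'long', or two adjacent presses merge into one 'pair'."""
    if n == 0:
        return [[]]
    seqs = [[bpt] + rest
            for bpt in ("short", "long")
            for rest in bpt_sequences(n - 1)]
    if n >= 2:
        seqs += [["pair"] + rest for rest in bpt_sequences(n - 2)]
    return seqs

# Alternatives exclude the sequence actually received.
for n in range(1, 5):
    print(n, len(bpt_sequences(n)) - 1)
# 1 1
# 2 4
# 3 11
# 4 28
```

The totals follow the recurrence T(n) = 2·T(n−1) + T(n−2) with T(0) = 1 and T(1) = 2, which shows the compounding described above.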
  • FIG. 36 shows two flowcharts 152. Each flowchart shows the relationship between variables of the method 709 of FIG. 9 .
  • the top flowchart shows the relationship for an un-segmented word or for one segment of a segmented word.
  • the bottom flowchart shows the same relationship as the top flowchart, but also includes variables for the row ID value and time-independent buttons (fixed-value BPTs) that enable BPT sequence segmentation.


Abstract

Systems, devices and methods are disclosed for multi-variable character input. A device distinguishes characters by interpreting button presses relative to a short-duration elapsed time period. From the button presses, the device interprets values for one or more of the variables: (1) number of button presses, (2) value of button pressed, (3) row of button pressed, (4) if button is time-dependent or time-independent, (5) if an additional button press occurs within the same elapsed time period, and (6) the duration of the button press. From these values an algorithm synthesizes intermediate variables (1) sequence of row ID values, (2) sequence of button press values, and (3) sequence of button press types. From the intermediate variables the device identifies a presumed word and a plurality of possible alternative word possibilities. The system enables character input using fewer selection buttons than there are displayed characters and with improved speed and accuracy.

Description

    TECHNICAL FIELD
  • This description generally relates to the field of electronic devices and, more particularly, to user interfaces of electronic devices.
  • BRIEF SUMMARY
  • A computer processor-implemented method may be summarized as including: receiving, by at least one computer processor, input resulting from actuation of buttons; and selecting characters, by at least one computer processor, based on the received input resulting from the actuation of buttons, wherein the selecting the characters includes, for each character to be selected: determining values, by at least one computer processor, for at least three variables from the received input; and identifying the character, by at least one computer processor, from among a plurality of characters based on the values of the at least three variables.
  • A first variable of the at least three variables may be an assigned value of each actuated button, a second variable of the at least three variables may be whether a second button actuation occurs within a finite-length time period measured from an onset of a first button actuation, and a third variable of the at least three variables may be a duration of the first button actuation measured from the onset of the first button actuation. A value for the first variable may be one of −3, −2, +2 and +3. A value for the second variable may be yes or no. A value for the third variable may be less than the finite-length time period measured from the onset of the first button actuation or greater than the finite-length time period measured from the onset of the first button actuation.
  • The computer processor-implemented method may further include determining values, by at least one computer processor, for at least five variables from the received input.
  • A fourth variable of the at least five variables may identify, for any given button actuation, which of two or more button arrays the actuated button is a member of. A value for the fourth variable may be one of row 1, row 2, row 3, row 4 and row 5. A fifth variable of the at least five variables may identify whether the received input from any given button actuation is from a button that is time-dependent or time-independent. A time-independent button may be a button for which the determined values of the second and third variables are always the same. The determined value for the second variable may be no and the determined value for the third variable may be less than the finite-length time period measured from the onset of the first button actuation. Values of the at least five variables may identify the character at least by identifying which of a plurality of one-dimensional arrays the character is a member of and by a position of the character in the identified one-dimensional array. Values of the at least three variables may identify the character at least by identifying a position of the character in a one-dimensional array.
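For illustration only, the identification described above can be sketched in code. The menu contents, names, and the rule applied for a co-press or long-duration press are assumptions of the sketch, not the claimed method; in particular, the one-position shift stands in for whatever time-dependent interpretation an embodiment uses.

```python
# Illustrative sketch only: menu contents, names, and the time-dependent
# selection rule below are assumptions, not the claimed method.
MENU_ROWS = {
    "row 1": ["a", "b", "c", "d", "e", "f", "g"],
    "row 2": ["h", "i", "j", "k", "l", "m", "n"],
}

def identify_character(row_id, button_value, co_press, long_duration):
    """Resolve one character from determined variable values.

    row_id        -- which button array was actuated (fourth variable)
    button_value  -- assigned value of the button, e.g. -3..+3 (first variable)
    co_press      -- second actuation within the time period? (second variable)
    long_duration -- duration greater than the time period? (third variable)
    """
    row = MENU_ROWS[row_id]
    position = len(row) // 2 + button_value   # offset from the center reference
    if co_press or long_duration:
        # Stand-in rule: a time-dependent press shifts the selection one
        # position outward; the actual interpretation is embodiment-specific.
        position += 1 if button_value >= 0 else -1
    position = max(0, min(len(row) - 1, position))  # clamp to the row
    return row[position]
```

Under these assumptions, a short single press of the button assigned −3 in row 1 resolves to the leftmost character of that row.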
  • A system may be summarized as including: at least one computer processor; and at least one memory coupled to the at least one computer processor, the at least one memory having computer executable instructions stored thereon that, when executed, cause the at least one processor to perform: receiving input resulting from actuation of buttons; and selecting characters based on the received input resulting from the actuation of buttons, wherein the selecting the characters includes, for each character to be selected: determining values for at least three variables from the received input; and identifying the character from among a plurality of characters based on the values of the at least three variables.
  • A first variable of the at least three variables may be an assigned value of each actuated button, a second variable of the at least three variables may be whether a second button actuation occurs within a finite-length time period measured from an onset of a first button actuation, and a third variable of the at least three variables may be a duration of the first button actuation measured from the onset of the first button actuation. A value for the first variable may be one of −3, −2, +2 and +3. A value for the second variable may be yes or no.
  • A non-transitory computer-readable medium may be summarized as having computer executable instructions stored thereon that, when executed, cause at least one processor to perform: receiving input resulting from actuation of buttons; and selecting characters based on the received input resulting from the actuation of buttons, wherein the selecting the characters includes, for each character to be selected: determining values for at least three variables from the received input; and identifying the character from among a plurality of characters based on the values of the at least three variables.
  • A first variable of the at least three variables may be an assigned value of each actuated button, a second variable of the at least three variables may be whether a second button actuation occurs within a finite-length time period measured from an onset of a first button actuation, and a third variable of the at least three variables may be a duration of the first button actuation measured from the onset of the first button actuation. A value for the first variable may be one of −3, −2, +2 and +3.
  • BRIEF DESCRIPTION OF THE SEVERAL VIEWS OF THE DRAWINGS
  • In the drawings, identical reference numbers identify similar elements or acts. The sizes and relative positions of elements in the drawings are not necessarily drawn to scale. For example, the shapes of various elements and angles are not drawn to scale, and some of these elements are arbitrarily enlarged and positioned to improve drawing legibility. Further, the particular shapes of the elements as drawn are not intended to convey any information regarding the actual shape of the particular elements, and have been solely selected for ease of recognition in the drawings.
  • FIG. 1 is a schematic view of a system for input of characters with optional time-dependent button presses according to one illustrated embodiment.
  • FIG. 2 is a list of variables for a system to input characters with optional time-dependent button presses according to one illustrated embodiment.
  • FIG. 3 is a schematic view of an example electronic device for input of characters with optional time-dependent button presses according to one illustrated embodiment, the electronic device being a mobile device having a housing, a display, a graphics engine, a central processing unit (CPU), user input device(s), one or more storage mediums having various software modules thereon that are executable by the CPU, input/output (I/O) port(s), network interface(s), wireless receiver(s) and transmitter(s), a power source, an elapsed time counter and a button press value counter.
  • FIG. 4 is a schematic drawing of one embodiment of the electronic device 100 for input of characters. Aspects of the user interface 150 were previously disclosed in FIG. 8 of U.S. Pat. No. 8,487,877, which is hereby incorporated by reference in its entirety.
  • FIG. 5 is a flow diagram of a method for specifying a character from among a plurality of characters according to one illustrated embodiment.
  • FIGS. 6A and 6B are flow diagrams of a method for an electronic device to interpret button presses according to one illustrated embodiment.
  • FIG. 7 is a table of value assignments for one embodiment of a method of character identification.
  • FIG. 8 is an example of an application of a method of character identification.
  • FIG. 9 is a flow diagram of another method for an electronic device to interpret button presses according to one illustrated embodiment.
  • FIG. 10 is a flow diagram of variables of a method to interpret button presses according to one illustrated embodiment.
  • FIG. 11 is a table of value assignments, a user interface and a list of variables for one embodiment of a method of character identification.
  • FIG. 12 is a table of characteristics of two different methods to input characters.
  • FIG. 13 is an embodiment of two different user interfaces for methods to input characters.
  • FIG. 14 is a graphical representation of button press types of a method to input characters according to one illustrated embodiment.
  • FIGS. 15-17 are examples of an application of a method of character input according to one illustrated embodiment.
  • FIGS. 18-21 are additional examples of an application of a method of character input according to one illustrated embodiment.
  • FIG. 22 is a schematic drawing of another embodiment of the electronic device 100 for character input.
  • FIG. 23 is a table of value assignments for another embodiment of a method for character input.
  • FIG. 24 is a schematic drawing of yet another embodiment of the electronic device 100 for character input.
  • FIG. 25 is a table of value assignments for yet another embodiment of a method for character input.
  • FIG. 26 is a schematic drawing of yet another embodiment of the electronic device 100 for character input.
  • FIG. 27 is a table of value assignments for yet another embodiment of a method for character input.
  • FIG. 28 is a drawing of a user interface, a table of values, and a graphical representation of button press types for one embodiment of an electronic device.
  • FIG. 29 is a table of value assignments for another embodiment of a method for character input.
  • FIGS. 30-31 are additional examples of an application of a method of character input according to one illustrated embodiment.
  • FIGS. 32-34 are yet additional examples of an application of a method of character input according to one illustrated embodiment.
  • FIG. 35 is two tables of characters of one embodiment of a method of character input according to one illustrated embodiment.
  • FIG. 36 is a flow diagram of variables of a method to interpret button presses according to one illustrated embodiment.
  • DETAILED DESCRIPTION
  • In the following description, certain specific details are set forth in order to provide a thorough understanding of various disclosed embodiments. However, one skilled in the relevant art will recognize that embodiments may be practiced without one or more of these specific details, or with other methods, components, materials, etc. In other instances, well-known structures associated with computing systems including client and server computing systems, as well as networks, including various types of telecommunications networks, have not been shown or described in detail to avoid unnecessarily obscuring descriptions of the embodiments.
  • Unless the context requires otherwise, throughout the specification and claims which follow, the word “comprise” and variations thereof, such as “comprises” and “comprising,” are to be construed in an open, inclusive sense, that is, as “including, but not limited to.”
  • Reference throughout this specification to “one embodiment” or “an embodiment” means that a particular feature, structure or characteristic described in connection with the embodiment is included in at least one embodiment. Thus, the appearances of the phrases “in one embodiment” or “in an embodiment” in various places throughout this specification are not necessarily all referring to the same embodiment. Furthermore, the particular features, structures, or characteristics may be combined in any suitable manner in one or more embodiments.
  • As used in this specification and the appended claims, the singular forms “a,” “an,” and “the” include plural referents unless the content clearly dictates otherwise. It should also be noted that the term “or” is generally employed in its sense including “and/or” unless the content clearly dictates otherwise.
  • The headings and Abstract of the Disclosure provided herein are for convenience only and do not interpret the scope or meaning of the embodiments.
  • Various embodiments are described herein that provide systems, devices and methods for input of characters with optional time-dependent button presses.
  • FIG. 1 is a schematic view of a system 101 for input of characters with optional time-dependent button presses. The system 101 includes a device 100 and a user 160. The device 100 includes at least one processor 108 that operates a word identification algorithm 109. At least one face of the device 100 has a user interface 150. The user interface 150 includes at least a display 104 and user input devices 110. The display 104 includes a menu 240 and a text output area 105. In one embodiment the user input devices 110 are selection buttons, but in alternative embodiments may be touchscreen buttons or other types of input devices as disclosed in FIG. 3. The user 160 includes eyes 162 and fingers 164, although in alternative embodiments of the system 101 designed for accessibility, the user is not required to have use of either.
  • The system 101 enables character input by passing values between its various components. The values change with time, so the values are variables. The system 101 passes a variable ‘available characters’ 200 from the menu 240 of the user interface 150 to the eyes 162 of the user.
  • The system 101 passes six input variables 186 from the user's fingers 164 to the selection buttons 110 of the user interface. The input variables 186 are: ‘number of button presses’ 202, ‘value of button press’ 222, ‘row of button press’ 282, ‘fixed-value press’ 355, ‘co-press’ 210 and ‘duration of press’ 208. Values for each input variable 186 are the result of decisions made by the user. The values are contained within the time-dependent button presses received by the selection buttons 110.
  • The decisions that produce values for the input variables 186 are both conscious and unconscious. For example, selections for the variable 'value of button press' 222 are typically conscious because the variable explicitly identifies which buttons a user decides to press. However, while deciding that, the user also passively decides values for the other variables 'number of button presses' 202, 'row of button press' 282, 'fixed-value press' 355, 'co-press' 210 and 'duration of press' 208. The device 100 acquires values for these additional variables 202, 282, 355, 210, 208 through the same button presses that identify the variable 'value of button press' 222. These passively acquired variables are useful for interpreting the user's intended input but require little or no additional effort on the part of the user 160.
  • The system 101 synthesizes the input variables 186 received by the selection buttons 110 into three intermediate variables 187: ‘sequence of button row values’ 385, ‘sequence of button press values’ 380 and ‘sequence of button press types’ 382. By itself each intermediate variable 187 is an incomplete piece of information, but together the three variables 187 identify a presumed word and a plurality of possible alternatives. The word identification algorithm 109 of the processor 108 interprets the intermediate variables 187 to identify the user's intended input.
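The synthesis step above can be sketched as follows. The per-press record fields and the output formatting, which mirrors the example values of FIG. 2, are illustrative assumptions:

```python
# Illustrative sketch: build the three intermediate variables from a stream
# of per-press records. Field names and formatting are assumptions.
def synthesize(presses):
    """presses: iterable of dicts with keys 'row', 'value' and 'type',
    where 'type' is e.g. 'short', 'long', 'pair' or 'fixed'."""
    row_seq = "-".join(p["row"] for p in presses)
    value_seq = " ".join(f"{p['value']:+d}" if p["value"] else "0"
                         for p in presses)
    type_seq = "-".join(p["type"] for p in presses)
    # Each sequence alone is an incomplete piece of information; together
    # the three feed the word identification algorithm 109.
    return row_seq, value_seq, type_seq
```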
  • The processor 108 passes a variable ‘identified word’ 130 to the text output area 105 of the display. The text output area 105 makes the variable ‘identified word’ 130 available to the user's eyes 162.
  • FIG. 2 lists the variables for the system 101 of FIG. 1 and example values for each variable according to one embodiment of the system. Example values for the variable 'available characters' 200 are 'a, b, c, d, e . . . ' and so on. Example values for the variable 'number of button presses' 202 are '1', '2', '3', '4', '5' and so on. Example values for the variable 'value of button press' 222 are '−3, −2, −1, 0, +1, +2, +3' and so on. Example values for the variable 'row of button press' 282 are 'A, B, C . . . ' and so on. Example values for the variable 'fixed-value press' 355 are 'yes' and 'no'. Example values for the variable 'co-press' 210 are 'yes' and 'no'. Example values for the variable 'duration of press' 208 are '<ETP' and '>ETP', where ETP stands for elapsed time period and refers to the duration of a selectable-length time period chosen by the user.
  • An example value for the variable 'sequence of button row values' 385 is 'B-A-A-A-A-A-A-B-A-A'. An example value for the variable 'sequence of button press values' 380 is '−3 −2 0 +2 +3 +2 0 −2 −3 −2 −2'. An example value for the variable 'sequence of button press types' 382 is 'long-short-fixed-pair-short-fixed-short-long-long-short'.
  • Example values for the variable 'identified word' 130 are 'sun', 'dog', 'sidewalk', 'run' and so on.
  • FIG. 3 is a schematic view of one example electronic device, in this case mobile device 100, for input of characters with optional time-dependent button presses according to one illustrated embodiment. The mobile device 100 shown in FIG. 3 may have a housing 102, a display 104, a graphics engine 106, a central processing unit (CPU) 108, one or more user input devices 110, one or more storage mediums 112 having various software modules 114 stored thereon comprising instructions that are executable by the CPU 108, input/output (I/O) port(s) 116, one or more wireless receivers and transmitters 118, one or more network interfaces 120, and a power source 122. In some embodiments, some or all of the same, similar or equivalent structure and functionality of the mobile device 100 shown in FIG. 3 and described herein may be that of, part of or operably connected to a communication and/or computing system of another device or machine.
  • The mobile device 100 may be any of a large variety of communications devices such as a cellular telephone, a smartphone, a wearable device, a wristwatch, a portable media player (PMP), a personal digital assistant (PDA), a mobile communications device, a portable computer with built-in or add-on cellular communications, a portable game console, a global positioning system (GPS), a handheld industrial electronic device, or the like, or any combination thereof. The mobile device 100 has at least one central processing unit (CPU) 108 which may be a scalar processor, a digital signal processor (DSP), a reduced instruction set (RISC) processor, or any other suitable processor. The central processing unit (CPU) 108, display 104, graphics engine 106, one or more user input devices 110, one or more storage mediums 112, input/output (I/O) port(s) 116, one or more wireless receivers and transmitters 118, and one or more network interfaces 120 may all be communicatively connected to each other via a system bus 124. The system bus 124 can employ any suitable bus structures or architectures, including a memory bus with memory controller, a peripheral bus, and/or a local bus.
  • The mobile device 100 also includes one or more volatile and/or non-volatile storage medium(s) 112. The storage mediums 112 may be comprised of any single or suitable combination of various types of processor-readable storage media and may store instructions and data acted on by CPU 108. For example, a particular collection of software instructions comprising software 114 and/or firmware instructions comprising firmware are executed by CPU 108. The software or firmware instructions generally control many of the operations of the mobile device 100 and a subset of the software and/or firmware instructions may perform functions to operatively configure hardware and other software in the mobile device 100 to provide the initiation, control and maintenance of applicable computer network and telecommunication links from the mobile device 100 to other devices using the wireless receiver(s) and transmitter(s) 118, network interface(s) 120, and/or I/O ports 116.
  • The CPU 108 includes an elapsed time counter 140. The elapsed time counter 140 may be implemented using a timer circuit operably connected to or as part of the CPU 108. Alternately some or all of the elapsed time counter 140 may be implemented in computer software as computer executable instructions stored on volatile and/or non-volatile storage medium(s) 112, for example, that when executed by CPU 108 or a processor of a timer circuit, performs the functions described herein of the elapsed time counter 140.
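The classification that such a counter enables can be sketched as follows; the ETP length, function name, and timestamp convention are assumptions for illustration only:

```python
# Illustrative sketch: classify one button press against a short elapsed
# time period (ETP). The 0.25 s value and the timestamp convention are
# assumptions; timestamps would come from a monotonic clock.
ETP = 0.25  # seconds; a selectable-length time period

def classify_press(press_down, press_up, next_press_down=None):
    """Return ('<ETP' or '>ETP', co_press?) for one press.

    press_down/press_up -- timestamps of the press onset and release
    next_press_down     -- onset of a second press, if any occurred
    """
    duration = "<ETP" if (press_up - press_down) < ETP else ">ETP"
    co_press = (next_press_down is not None
                and (next_press_down - press_down) < ETP)
    return duration, co_press
```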
  • The CPU 108 includes a button press value counter 142. Alternately, some or all of the button press value counter 142 may be implemented in computer software as computer executable instructions stored on volatile and/or non-volatile storage medium(s) 112, for example, that when executed by CPU 108, performs the functions described herein of the button press value counter 142.
  • By way of example, and not limitation, the storage medium(s) 112 may be processor-readable storage media which may comprise any combination of computer storage media including volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer readable instructions, data structures, program modules or other data. Combinations of any of the above should also be included within the scope of processor-readable storage media.
  • The storage medium(s) 112 may include system memory which includes computer storage media in the form of volatile and/or nonvolatile memory such as read-only memory (ROM) and random access memory (RAM). A basic input/output system (BIOS), containing the basic routines that help to transfer information between elements within mobile device 100, such as during start-up or power-on, is typically stored in ROM. RAM typically contains data and/or program modules that are immediately accessible to and/or presently being operated on by CPU 108. By way of example, and not limitation, FIG. 3 illustrates software modules 114 including an operating system, application programs and other program modules that implement the processes and methods described herein.
  • The mobile device 100 may also include other removable/non-removable, volatile/nonvolatile computer storage media drives. By way of example only, the storage medium(s) 112 may include a hard disk drive or solid state storage drive that reads from or writes to non-removable, nonvolatile media, an SSD that reads from or writes to a removable, nonvolatile SSD, and/or an optical disk drive that reads from or writes to a removable, nonvolatile optical disk such as a DVD-RW or other optical media. Other removable/non-removable, volatile/nonvolatile computer storage media that can be used in an operating environment of the mobile device 100 include, but are not limited to, flash memory cards, other types of digital versatile disks (DVDs), micro-discs, digital video tape, solid state RAM, solid state ROM, and the like. The storage medium(s) are typically connected to the system bus 124 through a non-removable memory interface. The storage medium(s) 112 discussed above and illustrated in FIG. 3 provide storage of computer readable instructions, data structures, program modules and other data for the mobile device 100. In FIG. 3, for example, a storage medium may store software 114 including an operating system, application programs, other program modules, and program data. The storage medium(s) 112 may implement a file system, a flat memory architecture, a database, or any other method or combination capable for storing such information.
  • A user may enter commands and information into the mobile device 100 through touch screen display 104 or the one or more other input device(s) 110 such as a keypad, keyboard, tactile buttons, camera, motion sensor, position sensor, light sensor, biometric data sensor, accelerometer, or a pointing device, commonly referred to as a mouse, trackball or touch pad. Other input devices of the mobile device 100 may include a microphone, joystick, thumbstick, game pad, optical scanner, other sensors, or the like. These and other input devices are often connected to the CPU 108 through a user input interface that is coupled to the system bus 124, but may be connected by other interface and bus structures, such as a parallel port, serial port, wireless port, game port or a universal serial bus (USB). Generally, a unique software driver stored in software 114 configures each input mechanism to sense user input, and then the software driver provides data points that are acted on by CPU 108 under the direction of other software 114. The display is also connected to the system bus 124 via an interface, such as the graphics engine 106. In addition to the display 104, the mobile device 100 may also include other peripheral output devices such as speakers, a printer, a projector, an external monitor, etc., which may be connected through one or more analog or digital I/O ports 116, network interface(s) 120 or wireless receiver(s) and transmitter(s) 118. The mobile device 100 may operate in a networked environment using connections to one or more remote computers or devices, such as a remote computer or device.
  • When used in a LAN or WAN networking environment, the mobile device 100 may be connected via the wireless receiver(s) and transmitter(s) 118 and network interface(s) 120, which may include, for example, cellular receiver(s) and transmitter(s), Wi-Fi receiver(s) and transmitter(s), and associated network interface(s). When used in a WAN networking environment, the mobile device 100 may include a modem or other means as part of the network interface(s) for establishing communications over the WAN, such as the Internet. The wireless receiver(s) and transmitter(s) 118 and the network interface(s) 120 may be communicatively connected to the system bus 124. In a networked environment, program modules depicted relative to the mobile device 100, or portions thereof, may be stored in a remote memory storage device of a remote system.
  • The mobile device 100 has a collection of I/O ports 116 and/or short range wireless receiver(s) and transmitter(s) 118 and network interface(s) 120 for passing data over short distances to and from the mobile device 100 or for coupling additional storage to the mobile device 100. For example, serial ports, USB ports, Wi-Fi ports, Bluetooth® ports, IEEE 1394 (i.e., FireWire), and the like can communicatively couple the mobile device 100 to other computing apparatuses. Compact Flash (CF) ports, Secure Digital (SD) ports, and the like can couple a memory device to the mobile device 100 for reading and writing by the CPU 108 or couple the mobile device 100 to other communications interfaces such as Wi-Fi or Bluetooth transmitters/receivers and/or network interfaces.
  • Mobile device 100 also has a power source 122 (e.g., a battery). The power source 122 may supply energy for all the components of the mobile device 100 that require power when a traditional, wired or wireless power source is unavailable or otherwise not connected. Other various suitable system architectures and designs of the mobile device 100 are contemplated and may be utilized which provide the same, similar or equivalent functionality as those described herein.
  • It should be understood that the various techniques, components and modules described herein may be implemented in connection with hardware, software and/or firmware or, where appropriate, with a combination of such. Thus, the methods and apparatus of the disclosure, or certain aspects or portions thereof, may take the form of program code (i.e., instructions) embodied in tangible media, such as various solid state memory devices, DVD-RW, RAM, hard drives, flash drives, or any other machine-readable or processor-readable storage medium wherein, when the program code is loaded into and executed by a machine, such as a processor of a computer, vehicle or mobile device, the machine becomes an apparatus for practicing various embodiments. In the case of program code execution on programmable computers, vehicles or mobile devices, such generally includes a processor, a storage medium readable by the processor (including volatile and non-volatile memory and/or storage elements), at least one input device, and at least one output device. One or more programs may implement or utilize the processes described in connection with the disclosure, e.g., through the use of an API, reusable controls, or the like. Such programs are preferably implemented in a high level procedural or object oriented programming language to communicate with a computer system of mobile device 100. However, the program(s) can be implemented in assembly or machine language, if desired. In any case, the language may be a compiled or interpreted language, and combined with hardware implementations.
  • FIG. 4 shows a schematic drawing of one embodiment of the electronic device 100 for input of characters. The device 100 may have some or all of the components and functionality described herein with respect to the mobile device 100 of FIG. 3. The device 100 has aspects previously disclosed in FIG. 8 of U.S. Pat. No. 8,487,877, which is hereby incorporated by reference in its entirety.
  • The electronic device 100 includes the display 104, a plurality of characters 200 that populate positions 242 in multiple menu rows (or menu arrays) 244 of a character menu 240, a plurality of selection buttons 110 divided among multiple button rows (or button arrays) 280 that includes both time-dependent selection buttons 127 and at least one time-independent selection button 129, and a spacebar button 264, which together make up a user interface 150 of the device 100. Each of the selection buttons 110 has an assigned row identification (ID) value 282 and an assigned button press value 222. Included as part of or within proximity to the menu 240 are a reference 258, additional row ID values 282, and an offset scale 260. The display 104, the plurality of selection buttons 110, and the spacebar button 264 are communicatively coupled with the CPU 108, as described in the embodiment of FIG. 3. The CPU 108 includes the elapsed time counter 140 and the button press value counter 142, as described in the embodiment of FIG. 3. The CPU 108 is communicatively coupled with the storage medium 112 and the power source 122, as described in the embodiment of FIG. 3.
  • In the embodiment of FIG. 4, the positions 242 of the menu 240 are arranged in two one-dimensional arrays (or rows) 244, each array similar to the embodiment in FIG. 8 of U.S. Pat. No. 8,487,877, except that the menu 240 is shown on the display 104 instead of as a physical feature of the user interface 150. Each menu row 244 is identified by the row ID value 282 that it shares with the row of selection buttons 280 that operate on that menu row. The plurality of selection buttons 110 can be either hard keys (physical buttons) or soft keys (buttons shown on the display 104). In the embodiment of FIG. 4, the selection buttons 110 are shown as physical buttons. In either case, the buttons 110 are communicatively coupled with the CPU 108. The selection buttons 110 can be arranged in any pattern. In one embodiment, the time-dependent buttons 127 and time-independent buttons 129 are interspersed amongst one another on the user interface 150. In another embodiment, the buttons 127, 129 are interspersed amongst one another in an arrangement on the display 104.
  • The menu rows 244 and the offset scale 260 are positioned as respective one-dimensional arrays on the user interface 150 of the device 100. In one embodiment the menu rows 244 and the offset scale 260 are positioned on the user interface 150 so that they lie adjacent to and parallel with one another. In one embodiment, the menu rows 244 and the offset scale 260 are programmed in software so that they appear as features on the display 104 of the device 100.
  • In one embodiment, positions 242 of each respective menu row 244 are distributed in a one-dimensional array in evenly spaced increments. In a further embodiment, values of the offset scale 260 are distributed in a one-dimensional array in spatial increments that equal the increment of the menu rows 244, so that by referencing the offset scale 260 to the menu rows 244, characters 200 in the menu rows are effectively numbered.
  • The reference 258 is an indicator located near or on one of the positions 242 of the menu 240. The offset scale 260 includes a value of zero that is located to correspond with the reference 258 of the menu 240. Values of the offset scale 260 increase from zero in pre-selected increments as positions of the offset scale get farther from the zero value. In a further embodiment, values of the offset scale 260 decrease from zero in pre-selected increments as positions of the offset scale get farther from the zero value in a direction opposite to the increasing direction. In one embodiment, the pre-selected increment of the offset scale 260 equals one and the values of the offset scale extend from a negative value to a positive value passing through zero.
  • In one specific embodiment, the positions 242 of the menu 240 and the values of the offset scale 260 are distributed in respective one-dimensional arrays positioned adjacent to and parallel with one another, the values of the offset scale 260 count in increments of one and are spaced with respect to one another in their array to correspond with the spacing of positions 242 of the menu 240, and the zero value of the offset scale 260 corresponds to the reference 258 of the menu 240 so that the values of the offset scale 260 label the positions of each row 244 of the menu 240 according to how many positions a given position 242 of a row 244 is offset from the reference 258.
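  • The numbering scheme just described can be sketched in a few lines of code. The following Python fragment is purely illustrative: the seven-character row, its contents, and the center reference index are hypothetical values chosen to demonstrate the offset labeling, not values taken from the figures.

```python
def label_positions(row, reference_index):
    """Number the characters of a menu row by their offset from the
    reference: zero at the reference, negative on one side, positive
    on the other (mirroring the offset scale 260)."""
    return {i - reference_index: ch for i, ch in enumerate(row)}

# Hypothetical seven-character row with the reference at its center.
labels = label_positions(["a", "b", "c", "d", "e", "f", "g"], reference_index=3)
# labels[0] is the reference character; labels[-3] is three positions to its left.
```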
  • The plurality of selection buttons 110 lie on the user interface 150 of the device 100 and, as described above, can be either hard or soft keys. In one embodiment, the buttons 110 are arranged in button rows 280 so that the number of button rows 280 and the number of menu rows 244 is the same. In a further embodiment, the button rows 280 are arranged to visually correspond with the arrangement of the menu rows 244.
  • Each button is communicatively coupled with the CPU 108. Each button 110 has the function that when pressed, the row ID value 282 and button press value 222 assigned to the button is input to the CPU 108. In one embodiment, the buttons 110 arranged within the same row 280 have the same row ID value 282 and buttons in different rows have different row ID values.
  • The assigned button press values 222 can be either positive or negative. In one embodiment, the button press values 222 assigned to the selection buttons 110 of a button row 280 are all unique. In one embodiment there are five selection buttons 110 per button row 280, some buttons are time-dependent 127 while others are time-independent 129, and the buttons' assigned values are −3, −2, 0, +2, and +3. In another embodiment there are seven selection buttons 110 per button row 280, some buttons are time-dependent 127 while others are time-independent 129, and the buttons' assigned values are −4, −3, −2, 0, +2, +3, and +4. In still another embodiment there are six selection buttons 110 per button row 280, some buttons are time-dependent 127 while others are time-independent 129, and the buttons' assigned values are −4, −3, −2, 0, +2, and +3. In yet another embodiment there are six selection buttons 110 per button row 280, some buttons are time-dependent 127 while others are time-independent 129, and the buttons' assigned values are −3, −2, −1, +1, +2, and +3.
  • The spacebar 264 also lies in the user interface 150 of the device 100, can be either a hard or soft key, and is communicatively coupled with the CPU 108.
  • FIGS. 5 and 6 show flowcharts for, respectively, an embodiment of a method 509 for specifying a character from among a plurality of characters and an embodiment of a method 609 for an electronic device to interpret button presses—both in accordance with the user interface 150 of FIG. 4.
  • In FIG. 5, in one step 510 of the method 509, a user views the plurality of characters 200 displayed in the menu 240. In another step 512, the user selects a character from the menu 240 for input to the electronic device 100.
  • In another step 542, the user identifies the selected character by (1) which row 244 of the menu 240 the character is in and (2) the position 242 of the character in its row with respect to the reference 258. For example, a user can identify a selected character as in either a top or bottom row 244 and by a value equal to the number of positions 242 the selected character is offset from the menu's reference 258. The user can identify the position 242 of the selected character in its row 244 in a number of ways, including by referencing the position to a corresponding value in the offset scale 260, counting the number of positions 242 that the selected character is offset from the reference 258, recalling from memory the value that identifies the particular selected character, and recalling by muscle memory the selection button keystrokes that correspond with the selected character or the selected character's position.
  • In another step 544, the user determines whether the value that identifies the selected character's position 242 in its menu row 244 equals the assigned value 222 of any of the selection buttons 110.
  • If one of the selection buttons 110 has an assigned value 222 that is equal, in another step 538 the user presses the selection button with the assigned value that equals the selected character's position and releases the button before the elapsed time counter expires, if necessary. The aforementioned step 538 inputs the assigned value 222 and the row ID value 282 of the pressed selection button to the CPU 108, triggers the CPU 108 to start the elapsed time counter 140, and indicates to the CPU that the type of button press is a SHORT press, or a FIXED press if the button is a time-independent button 129. In a subsequent step 520, the user waits for the elapsed time counter 140 to expire, if necessary. Expiration of the elapsed time period is not required if the selection button pressed in a subsequent cycle of the method is in a different row 280 than the selection button 110 pressed in the current cycle or if the button is a time-independent button 129. In an optional step 522, the user views the specified character on the display 104. In an alternative embodiment, step 522 is bypassed.
  • However, if the value that identifies the selected character's position 242 in the menu 240 is not equal to the assigned value of any selection button, then in an alternate step 546, the user determines whether the value that identifies the selected character's position 242 in its menu row 244 equals twice the assigned button press value 222 of any selection button 110.
  • If so, in another step 540 the user presses the selection button 110 with the assigned value 222 that equals half the selected character's position and maintains the button press until the elapsed time counter expires. The aforementioned step 540 inputs the assigned value 222 and the row ID value 282 of the pressed selection button to the CPU 108, triggers the CPU 108 to start the elapsed time counter 140, and indicates to the processor that the type of button press is a LONG press. In an optional step 522, the user views the specified character on the display 104. In an alternative embodiment, step 522 is bypassed.
  • However, if none of the values 222 assigned to the selection buttons 110 equals the selected character's position 242 or is half the selected character's position, in an alternate step 524 the user presses the selection button with the assigned value 222 that is one of two values whose sum equals the selected character's position. The aforementioned step 524 inputs the assigned value 222 and the row ID value 282 of the pressed selection button 110 to the CPU 108 and triggers the CPU to start the elapsed time counter 140. In a subsequent step 526, the user presses the selection button 110 with the assigned value 222 that is the other of two values whose sum equals the selected character's position 242 and does so before the elapsed time counter 140 expires. The aforementioned step 526 inputs the assigned value 222 of the pressed selection button 110 to the CPU 108 and indicates to the processor that the type of button press is PAIR. Optionally, as part of the step 526, the CPU 108 may also terminate the elapsed time counter 140. Once the user has pressed the second selection button, in another step 522 the user views the specified character on the display 104, which is an optional step and in an alternative embodiment is bypassed.
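  • The user's decision procedure in steps 544, 546, 538, 540, 524 and 526 can be summarized as choosing the press type that reaches a target menu position. The sketch below assumes the five-value button set −3, −2, 0, +2, +3 from one embodiment above; it is an illustration of the selection logic, not the claimed method itself.

```python
def choose_presses(target, button_values=(-3, -2, 0, 2, 3)):
    """Return the press(es) that specify a menu position `target`.

    Tries a SHORT press (one button value), then a LONG press (twice a
    button value), then a PAIR (the sum of two button values).
    """
    if target in button_values:
        return ("short", target)
    for v in button_values:
        if 2 * v == target:
            return ("long", v)
    for v1 in button_values:
        for v2 in button_values:
            if v1 + v2 == target:
                return ("pair", v1, v2)
    return None  # position unreachable with this value set
```

With this value set a position of +2 takes a single short press, +6 takes a long press of the +3 button, and +5 takes a pair of presses (+2 then +3).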
  • According to another embodiment of the invention, the character specification method 509 described above is used iteratively to specify series of characters from the character menu 240. In one embodiment, words and sentences are formed on the display 104 by iteratively specifying characters according to the method above, with the spacebar 264 used to input spaces between words on the display.
  • FIGS. 6A and 6B show flowcharts of an embodiment of a method 609 for the processor 108 of an electronic device to interpret sequences of button presses.
  • In one step 676 of the method 609, the CPU 108 initializes elements of an array variable ‘sequence of row ID values’ 385 to zero. In one step 650 of the method 609, the CPU 108 initializes elements of an array variable ‘sequence of button press values’ 380 to zero. In another step 652 the CPU 108 initializes elements of an array variable ‘sequence of button press types’ 382 to zero. In another step 654 the CPU 108 initializes a variable ‘number of loops m’ 390 to zero. In another step 655 the CPU 108 initializes a variable ‘number of button presses n’ 392 to zero.
  • In another step 612 the CPU 108 initializes the elapsed time counter 140 to zero. In another step 614, the CPU 108 monitors the selection buttons 110 for a pressed selection button 110. Once a first selection button press occurs, in another step 656, the CPU 108 determines if the first pressed selection button 110 is a press of the spacebar 264. If not, in a next step 658, the CPU 108 assigns to the nth element of the variable BPV sequence 380 the assigned value 222 of the first pressed selection button 110.
  • In a next step 678, the CPU determines which button row 280 the pressed selection button 110 is a member of. In a next step 680, the CPU 108 assigns to the mth element of the variable sequence of row ID values 385 the row ID value 282 of the pressed selection button 110.
  • In a next step 682, the CPU determines if the pressed selection button 110 is a time-dependent button 127 or a time-independent button 129. If the selection button is a time-independent button 129, then in a subsequent step 684 the CPU 108 assigns to the mth element of the variable BPT sequence 382 the value ‘fixed’ 355. If the selection button is a time-dependent button 127, then in another step 618, the CPU 108 starts the elapsed time counter 140. In a pair of steps 620, 622, the CPU 108 monitors the selection buttons 110 for the occurrence of a second selection button press while comparing the elapsed time counter 140 with a user-chosen, selectable-length time period.
  • If the elapsed time counter 140 exceeds the duration of the elapsed time period (i.e., expires) before an additional selection button press occurs, in a subsequent step 640 the CPU 108 determines if the first button press is still pressed.
  • If the first button press is not still pressed when the elapsed time period expires, then in a subsequent step 660 the CPU 108 assigns to the mth element of the variable BPT sequence 382 the value ‘short’ 340.
  • If, however, the first button press is still pressed when the elapsed time period expires, then in an alternate subsequent step 662 the CPU 108 assigns to the mth element of the variable BPT sequence 382 the value ‘long’ 345.
  • If, however, a second selection button press occurs before the elapsed time counter 140 expires, in another step 664 the CPU 108 assigns to the mth element of the variable BPT sequence 382 the value ‘pair’ 350. Then, in a subsequent step 666 the CPU 108 adds 1 to the variable number of button presses n 392. Then, in a subsequent step 668 the CPU 108 assigns to the nth element of the variable BPV sequence 380 the assigned value 222 of the second pressed selection button 110. Then, in the subsequent step 666 the CPU 108 again adds 1 to the variable number of button presses n 392. Then, in a subsequent step 670 the CPU 108 adds 1 to the variable number of loops m 390.
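  • As a compact sketch of the bookkeeping in steps 658 through 670 (assuming each press has already been classified by the timer logic above): each loop of the method appends one row ID value and one BPT, while a PAIR press contributes two BPVs. The Python fragment below is illustrative only.

```python
def accumulate(presses):
    """Build the BPV, row-ID, and BPT sequences from classified presses.

    `presses` is a list of (row_id, bpt, values) tuples, where `values`
    holds one button value for 'fixed'/'short'/'long' and two for 'pair'.
    """
    bpv_seq, row_seq, bpt_seq = [], [], []
    for row_id, bpt, values in presses:
        row_seq.append(row_id)   # one row ID per loop (step 680)
        bpt_seq.append(bpt)      # one BPT per loop (steps 660/662/664/684)
        bpv_seq.extend(values)   # a pair adds two BPVs (steps 658 and 668)
    return bpv_seq, row_seq, bpt_seq
```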
  • According to one embodiment of the method 609, the CPU 108 re-initializes the elapsed time counter 140 to zero and repeats the method in succession until in the step 656 the CPU 108 finds that the selection button pressed in step 614 is a press of the spacebar 264.
  • Then, in an alternative step 672 the CPU 108 converts the values of the variable BPV sequence 380 to values of a variable ‘total BPV sequence’ 386 by: (1) doubling values of the BPV sequence 380 that coincide with ‘long BPT’ values 345 of the BPT sequence 382, and (2) adding together values of the BPV sequence 380 that coincide with consecutive ‘pair BPT’ values 350 of the BPT sequence 382.
  • In the case of pairs occurring consecutively in the BPT sequence 382 (i.e., pairs of pairs), no value of the BPV sequence 380 is added to more than one other value. Furthermore, additions are made so that every value of the BPV sequence 380 that coincides with a pair BPT 350 gets added to a consecutive value of the BPV sequence that also coincides with a pair BPT and in such a way that leaves no BPV that coincides with a pair BPT un-added.
  • Then, in a subsequent step 674 the CPU 108 constructs a character sequence 388 by identifying in order from the menu 240 each character 200 whose position 242 equals a value of the total BPV sequence 386.
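  • Steps 672 and 674 together decode the collected sequences into characters. The sketch below is an illustrative reconstruction: the row contents (row A holding a-m, row B holding n-z, each centered on the offset-scale zero) are inferred from the worked examples of FIG. 8 and are not stated in this passage.

```python
# Row contents inferred from the examples of FIG. 8: row A holds a-m,
# row B holds n-z, each with the offset-scale zero at its center.
MENU = {"A": "abcdefghijklm", "B": "nopqrstuvwxyz"}

def decode(bpv_seq, bpt_seq, row_seq):
    """Steps 672 and 674: fold BPVs into total BPVs (double a LONG,
    sum a PAIR), then look up each character by its offset from the
    center of its menu row."""
    chars, i = [], 0  # i indexes bpv_seq; a PAIR consumes two BPVs
    for bpt, row in zip(bpt_seq, row_seq):
        if bpt == "pair":
            total, i = bpv_seq[i] + bpv_seq[i + 1], i + 2
        elif bpt == "long":
            total, i = 2 * bpv_seq[i], i + 1
        else:  # 'short' or 'fixed' presses pass their value through
            total, i = bpv_seq[i], i + 1
        row_chars = MENU[row]
        chars.append(row_chars[total + len(row_chars) // 2])
    return "".join(chars)
```

Feeding in the BPV, BPT, and row-ID sequences of the first example of FIG. 8 reproduces the word ‘negligence’.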
  • Although the method 609 of FIGS. 6A and 6B is one embodiment of a method for a processor 108 to interpret sequences of button presses, the scope of the method is not limited by this embodiment, but rather by the scope of the claims.
  • FIG. 7 shows a table of value assignments for variables of one embodiment of a character input system for the English language. The variables passed from the menu 240 to the user 160 are the ‘menu row’ 244, the ‘menu position’ 242 and the ‘available character’ 200. Assigned values are shown.
  • The variables passed from the user 160 to the selection buttons 110 are the ‘button row’ 282, ‘button value’ 222, ‘fixed-value’ 355, ‘co-press’ 210 and ‘duration’ 208. Assigned values are shown.
  • The variables synthesized by the processor 108 from presses of the selection buttons 110 are the ‘button row value’ 282, ‘button press values’ 222 and ‘button press type’ 224. Assigned values are shown.
  • FIG. 8 shows two examples of the method 609 of FIGS. 6A and 6B for the user interface 150 of FIG. 4. Each example includes the variables ‘number of button presses n’ 392, ‘BPV sequence’ 380, ‘number of loops m’ 390, ‘row ID sequence’ 385, ‘BPT sequence’ 382 and ‘character sequence’ 388.
  • In a first example 198, the variable number of button presses n 392 identifies the elements (0-11) of the array variable BPV sequence 380. The BPV sequence 380 contains the BPV 222 of consecutive button presses (−3 −2 0 +2 +3 +2 0 −2 −3 −2 −2) collected in steps 658 and/or 668 over multiple iterations of the method 609 of FIGS. 6A and 6B. The variable number of loops m 390 identifies the elements (0-10) of the array variable row ID sequence 385 and the array variable BPT sequence 382. The row ID sequence 385 contains the row ID value 282 collected in steps 678 and 680 with each iteration of the method 609 (B-A-A-A-A-A-A-B-A-A). The BPT sequence 382 contains the BPT 224 collected in one of steps 660, 662, 664 or 684 with each iteration of the method 609 of FIGS. 6A and 6B (long-short-fixed-pair-short-fixed-short-long-long-short). The character sequence 388 contains the selected characters (n e g l i g e n c e). In the first example 198, values for each of the variables above contribute to select characters of the word 130 ‘negligence’.
  • In a second example 199, the variable number of button presses n 392 identifies the elements (0-16) of the array variable BPV sequence 380. The BPV sequence 380 contains the BPV 222 of each consecutive button press (+3 −2 −3 −3 −2 −2 −2 +3 −3 +2 0 −3 −2 +3 −2 −3) collected in steps 658 and/or 668 over multiple iterations of the method 609 of FIGS. 6A and 6B. The variable number of loops m 390 identifies the elements (0-12) of the array variable row ID sequence 385 and the array variable BPT sequence 382. The row ID sequence 385 contains the row ID value 282 collected in steps 678 and 680 with each iteration of the method 609 (B-B-A-A-A-B-B-B-B-A-A-A). The BPT sequence 382 contains the BPT 224 collected in one of steps 660, 662, 664 or 684 with each iteration of the method 609 of FIGS. 6A and 6B (pair-long-long-long-long-pair-pair-fixed-pair-long-short-short). The character sequence 388 contains the selected characters (u n a c c u s t o m e d). In the second example 199, values for each of the variables above contribute to select characters of the word 130 ‘unaccustomed’.
  • FIG. 9 shows a method 709 that uses button press types 224 and row ID values 282 to identify a word from a received sequence of button presses.
  • The first step of the method 709 of FIG. 9 is the method 609 of FIGS. 6A and 6B. In the method 609 of FIGS. 6A and 6B, the CPU 108 interprets received button presses and from the presses constructs a character sequence 388. When used within the method 709 of FIG. 9, the constructed character sequence 388 is the presumed word 134.
  • In a next step 710 of the method 709, the CPU 108 compares the presumed word 134 with a library 136 of word possibilities. In a next step 712, the CPU 108 determines whether the presumed word 134 is found in the library 136 or not.
  • If the presumed word 134 is in the library 136, then in a next step 714 the CPU 108 accepts the presumed word as input.
  • If, however, the presumed word 134 is not in the library 136, then in a next step 750 the CPU 108 divides the BPT sequence 382 into the BPT sequence segments 428 according to the row ID values 282. The BPT sequence 382 is divided so that consecutive BPTs 224 that have the same row ID value 282 are in the same BPT sequence segment 428. Consecutive BPTs 224 that have different row ID values 282 are points where the sequence 382 becomes divided.
  • In a next step 756 the CPU 108 further divides the BPT sequence 382 into the BPT sequence segments 428 at the fixed-value BPTs 355. BPTs 224 separated by fixed-value BPTs 355 are in separate sequence segments 428.
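  • The segmentation of steps 750 and 756 can be sketched as a single pass over the BPT sequence: a new segment starts wherever the row ID changes, and every fixed-value BPT acts as a segment boundary (the fixed press itself is converted separately in step 758). The fragment below is an illustrative sketch only.

```python
def segment(bpt_seq, row_seq):
    """Split the BPT sequence at row-ID changes (step 750) and at
    fixed-value BPTs (step 756); fixed BPTs join no segment."""
    segments, current, prev_row = [], [], None
    for bpt, row in zip(bpt_seq, row_seq):
        if bpt == "fixed":
            if current:
                segments.append(current)
            current, prev_row = [], None
            continue
        if prev_row is not None and row != prev_row:
            segments.append(current)
            current = []
        current.append(bpt)
        prev_row = row
    if current:
        segments.append(current)
    return segments
```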
  • In a next step 752, the CPU 108 identifies the possible alternative BPT sequences 420 for each sequence segment 428. The possible alternative BPT sequences 420 for each segment are combinations of BPTs with the same number of button presses as the corresponding segment of the received BPT sequence, as previously disclosed in FIGS. 15-18 of U.S. Application No. 62/155,372 (Method of Word Identification that uses Button Press Type Error Analysis, herein incorporated by reference in its entirety) except applied to an entire word.
  • In a next step 736, the CPU 108 ranks, for each segment 428, the possible alternative BPT sequences 420 according to the likelihood that each occurs based on the received BPT sequence 382 and a ranking criterion. In one embodiment, the ranking criterion is the number of individual BPT errors required to create a received BPT sequence 382 from an intended alternative BPT sequence. In another embodiment, the ranking criterion is the likelihood of each required individual BPT error occurring. In yet another embodiment, the ranking criterion is a composite of the number of individual BPT errors required to create a received BPT sequence 382 from an intended alternative BPT sequence and the likelihood of each required individual BPT error occurring.
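  • As a minimal sketch of the first ranking criterion above (the number of individual BPT errors separating a received sequence from an intended alternative), alternatives can be sorted by a simple mismatch count. The fragment below is illustrative and ignores the per-error likelihood weighting of the other embodiments.

```python
def rank_alternatives(received, alternatives):
    """Rank alternative BPT sequences by the number of single-press
    type errors separating each from the received sequence (fewest
    errors first); ties keep their given order."""
    def error_count(alt):
        return sum(a != r for a, r in zip(alt, received))
    return sorted(alternatives, key=error_count)
```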
  • In next steps 720 and 721, the CPU 108 converts each alternative BPT sequence 420 to an alternative character sequence 445 based on the BPV sequence 380, and the characters 200 and menu positions 242 of the user interface 150 of FIG. 4, as previously disclosed in FIGS. 21-26 of U.S. Application No. 62/155,372 (Method of Word Identification that uses Button Press Type Error Analysis) except applied to an entire word.
  • In a next step 758, the CPU 108 converts each fixed-value BPT 355 of the BPT sequence 382 to a fixed-value character 201 according to the assigned value 222 of the selection buttons 129 of the fixed-value BPTs 355 and the menu 240 of the user interface 150 of FIG. 4.
  • In a next step 760, the CPU 108 connects one ranked character sequence 445 or a presumed sequence 134 from each BPT sequence segment 428 with each fixed-value character 201 in the same order as the BPT sequence segments 428 from which the character sequences, presumed sequence and/or fixed-value characters are derived and, in that way, builds one of the reconnected alternative character sequences 462. In one embodiment, the CPU 108 identifies a plurality of unique reconnected alternative character sequences 462 by connecting, in different combinations, character sequences 445 and/or presumed sequences 134 from each segment 428 with the fixed-value characters 201. In one embodiment, all possible combinations of reconnected alternative character sequences 462 are identified.
  • In a next step 762, the CPU 108 iteratively compares reconnected alternative character sequences 462 with the library 136 of word possibilities. In one embodiment, the CPU 108 compares the reconnected alternative character sequences 462 in order of a composite ranking of the likelihood that each sequence occurs, based on the received BPT sequence 382 and a ranking criterion.
  • Next, in a step 724 the CPU 108 determines whether any reconnected alternative character sequence 462 is found in the library 136. If at least one reconnected alternative character sequence 462 is in the library 136, then in a step 726 the CPU 108 accepts one of the found alternative sequences 462 as input. If no alternative sequence 462 is in the library 136, then in the step 714 the CPU 108 accepts the presumed word 134 as input.
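  • A rough sketch of steps 760 through 762: represent the word as an ordered mix of fixed-value characters and per-segment candidate lists, then try the reconnected spellings in ranked order against the library. The segment spellings and the two-word library below are hypothetical examples chosen for illustration.

```python
from itertools import product

def find_word(pieces, library):
    """Reconnect segment candidates and fixed-value characters in
    order (step 760), then test each spelling against the library
    (step 762). `pieces` mixes plain strings (fixed-value characters)
    with lists of candidate segment spellings, ranked best first."""
    option_lists = [p if isinstance(p, list) else [p] for p in pieces]
    for combo in product(*option_lists):  # best-ranked combinations first
        candidate = "".join(combo)
        if candidate in library:
            return candidate
    return None  # caller falls back to the presumed word (step 714)

# Hypothetical segments around two fixed-value 'g' characters.
pieces = [["ne", "nd"], "g", ["li", "lj"], "g", ["ence"]]
word = find_word(pieces, {"negligence", "diligence"})
```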
  • FIG. 10 shows a flowchart 152 of the variables of the method 609 of FIGS. 6A and 6B and the method 709 of FIG. 9. The flowchart 152 incorporates within it the flowchart 138 of FIG. 5 of U.S. Application No. 62/155,372 (Method of Word Identification that uses Button Press Type Error Analysis) that shows the progression of variables through the method 609 of FIGS. 6A and 6B that leads to the presumed word 134. The flowchart 152 also shows the progression of variables through the method 709 of FIG. 9 that leads to the possible reconnected alternative character sequences 462.
  • Of note in the flowchart 152 is the variable ‘sequence of row ID values’ 385. Acquisition of row ID values 282 in step 680 of the method 609 of FIGS. 6A and 6B enables BPT sequence segmentation in step 750 of the method 709 of FIG. 9. Sequence segmentation is beneficial because it reduces the number of alternative BPT sequences 420 possible for a given BPT sequence 382.
  • Furthermore, acquisition of row ID values 282 requires no extra effort on the part of the user. As described in FIG. 4, the row ID value 282 is part of the identity of each selection button 110. No decision or additional actuation recognizable to the user is required for the CPU 108 to receive the row ID value 282. Acquisition of values for the variable ‘sequence of row ID values’ 385 is transparent to the user.
  • Of note in the flowchart 152 is the variable ‘fixed-value BPT’ 355. Acquisition of fixed-value BPTs 355 in step 684 of the method 609 of FIGS. 6A and 6B enables BPT sequence segmentation in step 756 of the method 709 of FIG. 9. Sequence segmentation is beneficial because it reduces the number of alternative BPT sequences 420 possible for a given sequence 382.
  • Furthermore, acquisition of the fixed-value BPTs 355 requires no extra effort on the part of the user. As described in FIG. 4, fixed-value selection buttons 129 are interspersed among the selection buttons 110 and to the user appear as just another button. No decision or additional actuation recognizable to the user is required for the CPU 108 to receive the additional BPT 224. Acquisition of the fixed-value BPT 355 along with the other three BPTs 340, 345, 350 is transparent to the user.
  • The flowchart 152 of FIG. 10 has five input variables: (1) ‘sequence of button press values’ 380, (2) ‘co-press’ 210, (3) ‘duration’ 208, (4) ‘sequence of row ID values’ 385, and (5) ‘fixed-value BPT’ 355. Along one path of the flowchart 152, the variables ‘co-press’ 210, ‘duration’ 208 and ‘fixed-value’ 355 together determine the variable ‘sequence of button press types’ 382, which occurs as a result of repeated loops through steps 620, 640 and 682 of FIGS. 6A and 6B. Next, the variables ‘sequence of button press values’ 380 and ‘sequence of button press types’ 382 together determine the variable ‘sequence of total button press values’ 386, which occurs within step 672 of the method 609 of FIGS. 6A and 6B. Finally, the variable ‘sequence of total button press values’ 386 determines the variable ‘presumed word’ 134 which occurs in step 674 of the method 609 and is based on the user interface 150 of FIG. 4.
  • Along another path of the flowchart 152, the variable ‘sequence of BPTs’ 382 determines the variable ‘segmented sequence of BPTs’ 428, which occurs in step 756 of the method 709. Next the variable ‘sequence of row ID values’ 385 and the variable ‘segmented sequence of BPTs’ 428 together determine the variable ‘further segmented sequence of BPTs’ (also 428), which occurs in a step 750 of the method 709. Note that the steps 750 and 756 of the method 709 of FIG. 9 can occur in either order.
  • Next in the flowchart, the variable ‘further segmented sequence of BPTs’ 428 determines the variable ‘number of button presses per sequence segment’ 202, which occurs within step 752 of the method 709. Next in the flowchart, the variable ‘number of button presses per sequence segment’ 202 determines the variable ‘possible alternative BPT sequences per segment’ 420, which also occurs in step 752. Next, the variables ‘sequence of button press values’ 380 and ‘possible alternative BPT sequences per segment’ 420 together determine the variable ‘possible alternative sequences of total BPVs per segment’ 426, which occurs within step 720. Next, the variable ‘possible alternative sequences of total BPVs per segment’ 426 determines the variable ‘possible alternative character sequences per segment’ 445, which also occurs within step 720. Next, the variable ‘possible alternative character sequences per segment’ 445 determines the variable ‘reconnected alternative character sequences’ 462, which occurs in step 760 of the method 709. Finally, the variables ‘reconnected alternative character sequences’ 462 and ‘presumed word’ 134 are compared with the variable ‘library of words’ 136 to determine the variable ‘identified word’ 130, which occurs in steps 710 and 762.
  • FIG. 11 shows the user interface 150 of FIG. 4, a table 185 of value assignments for variables of the method 709 of FIG. 9, and a list of input variables 186 for the method 609 of FIGS. 6A and 6B. The user interface 150, table 185, and list of variables 186 are examples used to demonstrate the embodiments of FIGS. 4, 6 and 9. The scope of the invention is not limited by the variables and values shown here, but rather by the scope of the claims.
  • The table 185 is divided into rows and columns. Rows are grouped first by the row ID value 282 and then by the button press type 224. Each column is one variable: the variable ‘row ID value’ 282, the variable ‘co-press’ 210, the variable ‘duration’ 208, the variable ‘button press type’ 224, the variable ‘button press values’ 222, the variable ‘total button press value’ 228 and the variable ‘character’ 200.
  • Each line of the table 185 is a unique combination of the variables row ID value 282, button press type 224, and button press value 222. For the embodiment of the user interface 150 of FIG. 4, two selection button rows 280 with five selection buttons 110 per row enable 26 unique variable combinations.
  • The list 186 highlights which of the variables of the method 709 of FIG. 9 are the input variables. The input variables are: (1) ‘button press values’ 222, (2) ‘co-press’ 210, (3) ‘duration’ 208, (4) ‘row ID value’ 282 and (5) ‘fixed-value’ 355. The remaining variables of the table 185 (‘button press type’ 224, ‘total button press value’ 228, and ‘character’ 200) all follow from the input variables 186 and the user interface 150, as shown by the flowchart of FIG. 10.
  • FIG. 12 shows a table that compares characteristics of two different input methods. One method shown is the method 709 of FIG. 9, also known as the reduced-button input method. Another method shown is a 26-button input method 132. A standard QWERTY keyboard is one example of the 26-button method 132.
  • The characteristics compared in the table of FIG. 12 are: input variables, possible values for the input variables, level of control, and factor determining the level of control.
  • The reduced-button method 709 has five input variables: (1) button press values 222, (2) co-press 210, (3) duration 208, (4) row ID value 282 and (5) ‘fixed-value’ 355. These five variables appear as inputs in the flowchart 152 of FIG. 10 and in steps 620, 640, 658, 668, 678 and 682 of the method 609 of FIGS. 6A and 6B. Possible values of these variables for the user interface 150 of FIG. 4 are: (1) −3, −2, 0, +2 or +3, (2) pair or not, (3) <ETP or >ETP, (4) A or B and (5) fixed-value or not.
  • The 26-button method 132 has one input variable: button press value 222. Possible values for the button press value 222 in the case of the 26-button method are the characters themselves: a, b, c, d . . . and so on.
  • Level of control over the five variables for the reduced-button method 709 is high for the button press value variable 222, the row ID value 282, and the fixed-value 355, but low for the co-press 210 and the duration 208. The factor that determines the high level of control over the button press value variable 222, the row ID value 282 and the fixed-value 355 is the button size. Because the reduced-button method 709 requires far fewer buttons than there are selectable characters, space is available, relative to other input methods, to increase the button size. For example, for the 13:5 ratio of characters to buttons shown in the interface 150 of FIG. 4, only ten buttons are required to offer every character of the English alphabet. Therefore, even in a compact application like a mobile device, the buttons can be large enough for a human finger to press without error, so the level of control over each of these variables is considered high.
  • The factors determining the low level of control for the variables co-press 210 and duration 208 are the moment of button press and the moment of button release. Both these variables 208, 210 are time dependent and in a typical application need to be controlled to a precision of less than tenths of a second. Achieving that level of control is difficult on a routine basis, so for that reason the level of control over these variables is considered low. However, due to the predictability of button press timing errors, the low level of control over the variables co-press 210 and duration 208 can be overcome with BPT error analysis.
  • Level of control over the button press value variable 222 for the 26-button method 132 is low. As with the reduced-button method 709, the factor determining the level of control for the button press value 222 is button size. But the difference with the 26-button method 132 is that, due to the requirement to provide 26 buttons, the size of each individual button must be small in a compact application. In use, the small button size leads to button press errors; therefore, the level of control for the button press value variable 222 is considered low for the 26-button method 132.
  • FIG. 13 shows an example of the interface 150 of FIG. 4 used with the reduced button method 709 and an example of a 26-button user interface 133 used with the 26-button method 132. The difference in size of the selection buttons 110 for each interface is noticeable. The larger selection buttons 110 of the user interface 150 used with the reduced button method 709 provide a user with a high level of control. The smaller selection buttons 110 of the 26-button interface 133 provide only a low level of control.
  • FIG. 14 shows a graphical representation of an example of each of the three time-dependent button press types 224: the short BPT 340, the long BPT 345 and the pair BPT 350. FIG. 14 also shows a graphical representation of a sequence of two short BPTs 341.
  • For each example, the passage of time is represented by a horizontal bar 326. A black region 327 within the bar 326 indicates a period of time when a button is pressed. A white region 328 within the bar 326 indicates a period of time when a button is not pressed. A first solid vertical marker indicates a beginning 329 of an elapsed time period 330. A second solid vertical marker indicates an end (or expiration) 331 of the elapsed time period 330. Button presses have an onset 320 and a moment of release 322. The time between the onset 320 and the moment of release 322 is the duration 208. As dictated by steps 612, 614 and 618 of the method 609 of FIGS. 6A and 6B, the elapsed time period 330 commences with the onset 320 of a button press whenever an elapsed time period 330 is not already occurring.
  • As previously defined, a button press with duration 208 less than the elapsed time period 330 is the short BPT 340. A button press with duration 208 longer than the elapsed time period 330 is the long BPT 345. Because the onset 320 of the button press and the elapsed time period 330 commence together, the moment of button release 322 distinguishes the short BPT 340 from the long BPT 345.
  • As the graphic of the short BPT 340 and long BPT 345 shows, if the moment of button release 322 occurs close (in time) to expiration 331 of the elapsed time period, then a small difference in the timing of the release can change the determination of the BPT 224. In the examples of FIG. 14, the difference in moment of release 322 between the short BPT 340 and the long BPT 345 is on the order of 0.02 seconds. This level of sensitivity is why the input variable duration 208 in the table of FIG. 12 is considered to provide only a low level of control. It also shows how in the method 709 of FIG. 9, an inaccurately timed button release by a user can lead an intended short BPT 340 to be entered as a long BPT 345, or vice-versa.
  • A similar risk occurs at the onset 320 of a button press, if the expiration 331 of the elapsed time period 330 is close ahead of it in time. The graphical representation of the sequence of two short BPTs 341 shows how the onset 320 of a subsequent button press 332 can nearly overlap (in time) with the expiration 331 of an expiring elapsed time period 330. If the onset 320 of the subsequent button press 332 occurs before the expiration 331 of the elapsed time period 330, then the CPU 108 interprets the subsequent button press 332 as a second button press 333 of the pair BPT 350. Thus the sequence of consecutive short BPTs 341 is interpreted as the pair BPT 350. In the examples of FIG. 14, the difference in the button onset 320 is on the order of 0.02 seconds. This sensitivity is why the input variable co-press 210 in the table of FIG. 12 is considered to provide only a low level of control. It also shows how in the method 609 of FIGS. 6A and 6B, an inaccurately timed button press by a user can lead button presses intended to be entered as consecutive BPTs to be entered as a pair BPT 350 instead, or vice-versa.
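  • The timing rules described above can be summarized in a short sketch. The following Python fragment is illustrative only and is not part of the disclosed system; the function name, the ~0.15-second default timer value and the time arguments are assumptions chosen for illustration.

```python
def classify_bpt(first_onset, first_release, second_onset=None, etp=0.15):
    """Classify one selection cycle as a 'short', 'long', or 'pair'
    button press type (BPT).

    first_onset / first_release: times (in seconds) of the first
    button's press and release. second_onset: time a second selection
    button is pressed, or None if no second press occurred. etp: the
    elapsed time period, which commences at first_onset.
    """
    expiration = first_onset + etp
    # A second press before the timer expires yields the pair BPT,
    # whether or not the first button was already released.
    if second_onset is not None and second_onset < expiration:
        return "pair"
    # Otherwise the moment of release distinguishes short from long.
    if first_release < expiration:
        return "short"
    return "long"
```

  A release at 0.14 s versus 0.16 s straddles the 0.15 s expiration and flips the result between short and long, which illustrates the ~0.02-second sensitivity discussed above.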
  • FIGS. 15-17 show an example of an application of the method 709 and system 101 for multi-variable character input. The presumed word 134 of the example of FIGS. 15-17 is ‘negligence’.
  • FIG. 15 shows the user interface 150 of FIG. 4, a table of values for each of the variables ‘character’ 200, ‘menu row’ 244, ‘menu position’ 242, ‘button press values’ 222 and ‘button press type’ 224, and three sequence variables ‘sequence of row ID values’ 385, ‘sequence of BPVs’ 380, and ‘sequence of BPTs’ 382.
  • Values for the variable ‘character’ 200 derive directly from the presumed word 134. Values for the variable ‘menu row’ 244 and ‘menu position’ 242 derive from the position of each character 200 in the menu 240 according to the user interface 150. The value for the variable ‘sequence of row ID values’ 385 derives from iterative cycles through steps 678 and 680 of the method 609 of FIGS. 6A and 6B. The value for the variable ‘sequence of BPVs’ 380 derives from iterative cycles through steps 658 and/or 668 of the method 609 of FIGS. 6A and 6B. The value for the variable ‘sequence of BPTs’ 382 derives from iterative cycles through steps 660, 662, 664 and/or 684 of the method 609. For the presumed word 134 ‘negligence’, the value for the sequence of row ID values 385 is ‘B-A-A-A-A-A-A-B-A-A’, the value for the sequence of BPVs 380 is ‘−3 −2 0 +2 +3 +2 0 −2 −3 −2 −2’ and the value for the sequence of BPTs 382 is ‘long-short-fixed-pair-short-fixed-short-long-long-short’.
  • FIGS. 16 and 17 show how the presumed word 134 and a plurality of reconnected alternative character sequences 462 are derived from the sequence of BPVs 380, the sequence of BPTs 382, and the sequence of row ID values 385 of FIG. 15.
  • The BPT sequence 382 is divided into segments 428 according to the sequence of row ID values 385 and fixed-value BPTs 355. The sequence 382 is segmented at points where individual row ID values 282 change and at the position of each fixed-value BPT 355.
  • For the example of FIGS. 15-17, the row ID values 282 change between the first and second positions of the sequence of row ID values 385, back again between the seventh and eighth positions, and then back again between the eighth and ninth positions. Fixed-value BPTs 355 occur in the third and sixth positions of the BPT sequence 382. Therefore the BPT sequence 382 ‘long-short-fixed-pair-short-fixed-short-long-long-short’ is divided into six BPT sequence segments 428: a first segment 432 ‘long’, a second segment 434 ‘short’, a third segment 436 ‘pair-short’, a fourth segment 438 ‘short’, a fifth segment 440 ‘long’ and a sixth segment 441 ‘long-short’.
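  • The segmentation rule just described can be expressed as a short sketch. The Python fragment below is illustrative only; the function name and the list representations of the sequences are assumptions made for illustration, and a fixed-value BPT is treated as a separator that belongs to no segment.

```python
def segment_bpt_sequence(bpts, row_ids):
    """Divide a BPT sequence into segments, cutting wherever the row
    ID value changes and at each fixed-value BPT (the fixed BPT itself
    acts as a separator and is excluded from the segments)."""
    segments, current = [], []
    prev_row = None
    for bpt, row in zip(bpts, row_ids):
        if bpt == "fixed":
            if current:
                segments.append(current)
            current, prev_row = [], row
            continue
        # Cut the sequence at a row ID change.
        if prev_row is not None and row != prev_row and current:
            segments.append(current)
            current = []
        current.append(bpt)
        prev_row = row
    if current:
        segments.append(current)
    return segments
```

  Applied to the sequence of BPTs 382 and row ID values 385 of this example, the sketch reproduces the six segments named in the text: ‘long’, ‘short’, ‘pair-short’, ‘short’, ‘long’ and ‘long-short’.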
  • The number of button presses 202 in the BPT sequence segment 428 determines the possible alternative BPT sequences 420 for the segment, just as the number of presses determines the possible alternative sequences 420 for an entire BPT sequence 382. The alternative BPT sequences 420 are determined as shown in FIGS. 15-18 of U.S. Application No. 62/155,372 (Method of Word Identification that uses Button Press Type Error Analysis). For the example of FIGS. 15-17, the first, second, fourth and fifth BPT sequence segments 432, 434, 438, 440 have one button press and therefore only one possible alternative BPT sequence 420. The third BPT sequence segment 436 has three button presses, and therefore eleven possible alternative BPT sequences 420. The sixth BPT sequence segment 441 has two button presses, and therefore four possible alternative BPT sequences 420.
  • Each alternative BPT sequence 420 in each BPT sequence segment 428 is converted to a total BPV sequence 386 (not shown) based on the BPV sequence 380 and then to an alternative character sequence 445 according to the user interface 150 of FIG. 4, as previously disclosed in steps of the method 700 of FIG. 19 of U.S. Application No. 62/155,372 (Method of Word Identification that uses Button Press Type Error Analysis). FIG. 17 lists the alternative character sequences 445 as shown.
  • One character sequence 445 or presumed sequence 134 from each BPT sequence segment 428 is reconnected (indicated by dashed lines 463) with one character sequence or presumed sequence from every other BPT sequence segment, and with the fixed-value character 201, in the same order as the BPT sequence segments 428 from which the character sequences, presumed sequences and/or fixed-value characters are derived and, in that way, builds one of the reconnected alternative character sequences 462. In one embodiment, the CPU 108 identifies a plurality of unique reconnected alternative character sequences 462 by connecting, in different combinations, a character sequence 445, a presumed sequence 134 and/or fixed-value character 201 from each segment 428. In one embodiment, all possible combinations of reconnected alternative character sequences 462 are identified. In a further embodiment, all possible combinations of reconnected alternative character sequences 462 are compared with a library 136 of word possibilities. In a further embodiment, if a reconnected alternative character sequence 462 is found in the library 136 and the presumed word 134 is not found in the library, then the CPU 108 accepts as input the found reconnected alternative character sequence 462 in place of the presumed word 134.
  • For the example of FIGS. 15-17, examples of reconnected alternative character sequences 462 are ‘n e g l i g e n e e’, ‘n e g l i g e n e c’, ‘n e g l i g e n c c’, ‘n e g l i g e q c e’ and so on.
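  • The reconnection-and-library-lookup procedure above can be sketched as follows. The Python fragment is illustrative only; the function name and the candidate lists in the usage example are hypothetical and are not the actual alternative character sequences of FIG. 17.

```python
from itertools import product

def find_in_library(segment_candidates, library):
    """segment_candidates: one list of candidate character strings per
    BPT sequence segment, in segment order (the presumed sequence plus
    its alternatives; a fixed-value character is a one-element list).
    Reconnects every combination in order and returns the first
    reconnected sequence found in the library, or None."""
    for combo in product(*segment_candidates):
        candidate = "".join(combo)
        if candidate in library:
            return candidate
    return None
```

  For instance, with made-up candidate lists whose segments can concatenate to ‘negligence’, `find_in_library([["q", "n"], ["e"], ["g"], ["li", "lj"], ["g"], ["e"], ["nce", "nee"]], {"negligence"})` returns ‘negligence’, which the CPU 108 would then accept in place of a presumed word not found in the library.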
  • FIGS. 18-21 show an example of an application of the method 709 and system 101 for multi-variable character input. The presumed word 134 of the example of FIGS. 18-21 is ‘rwnaccustojed’. An intended word 135 is ‘unaccustomed’.
  • FIG. 18 shows the user interface 150 of FIG. 4, a table of values for each of the variables ‘character’ 200, ‘menu row’ 244, ‘menu position’ 242, ‘button press values’ 222 and ‘button press type’ 224, and three sequence variables ‘sequence of row ID values’ 385, ‘sequence of BPVs’ 380, and ‘sequence of BPTs’ 382.
  • Values for the variable ‘character’ 200 derive directly from the presumed word 134. Values for the variable ‘menu row’ 244 and ‘menu position’ 242 derive from the position of each character 200 in the menu 240 according to the user interface 150. The value for the variable ‘sequence of row ID values’ 385 derives from iterative cycles through steps 678 and 680 of the method 609 of FIGS. 6A and 6B. The value for the variable ‘sequence of BPVs’ 380 derives from iterative cycles through steps 658 and/or 668 of the method 609 of FIGS. 6A and 6B. The value for the variable ‘sequence of BPTs’ 382 derives from iterative cycles through steps 660, 662, 664 and/or 684 of the method 609. For the presumed word 134 ‘rwnaccustojed’, the value for the sequence of row ID values 385 is ‘B-B-B-A-A-A-B-B-B-B-A-A-A’, the value for the sequence of BPVs 380 is ‘−2 +3 −3 −3 −2 −2 −2 +3 −3 +2 0 −3 −2 +3 −2 −3’ and the value for the sequence of BPTs 382 is ‘short-short-long-long-long-long-pair-pair-fixed-pair-short-short-short’.
  • FIGS. 19-21 show how the presumed word 134 and a plurality of reconnected alternative character sequences 462 are derived from the sequence of BPVs 380, the sequence of BPTs 382, and the sequence of row ID values 385 of FIG. 18.
  • The BPT sequence 382 is divided into segments 428 according to the sequence of row ID values 385 and fixed-value BPTs 355. The sequence 382 is segmented at points where individual row ID values 282 change and at the position of each fixed-value BPT 355.
  • For the example of FIGS. 19-21, the row ID value 282 changes between the third and fourth positions of the sequence of row ID values 385, back again between the sixth and seventh positions, and then back again between the tenth and eleventh positions. A fixed-value BPT 355 occurs in the ninth position of the BPT sequence 382. Therefore the BPT sequence 382 ‘short-short-long-long-long-long-pair-pair-fixed-pair-short-short-short’ is divided into five BPT sequence segments 428: a first segment 432 ‘short-short-long’, a second segment 434 ‘long-long-long’, a third segment 436 ‘pair-pair’, a fourth segment 438 ‘pair’ and a fifth segment 440 ‘short-short-short’.
  • The number of button presses 202 in the BPT sequence segment 428 determines the possible alternative BPT sequences 420 for the segment, just as the number of presses determines the possible alternative sequences 420 for an entire BPT sequence 382. The alternative BPT sequences 420 are determined as shown in FIGS. 15-18 of U.S. Application No. 62/155,372 (Method of Word Identification that uses Button Press Type Error Analysis). For the example of FIGS. 19-21, the first, second and fifth BPT sequence segments 432, 434, 440 have three button presses and therefore eleven possible alternative BPT sequences 420. The third BPT sequence segment 436 has four button presses and therefore 28 possible alternative BPT sequences 420. The fourth BPT sequence segment 438 has two button presses and therefore four possible alternative BPT sequences 420.
  • Each alternative BPT sequence 420 in each BPT sequence segment 428 is converted to a total BPV sequence 386 (not shown) based on the BPV sequence 380 and then to an alternative character sequence 445 according to the user interface 150 of FIG. 4, as previously disclosed in steps of the method 700 of FIG. 19 of U.S. Application No. 62/155,372 (Method of Word Identification that uses Button Press Type Error Analysis). FIG. 19 lists the alternative character sequences 445 as shown.
  • One character sequence 445 or presumed sequence 134 from each BPT sequence segment 428 is reconnected (indicated by dashed lines 463) with one character sequence or presumed sequence from every other BPT sequence segment, and with the fixed-value character 201, in the same order as the BPT sequence segments 428 from which the character sequences, presumed sequences and/or fixed-value characters are derived and, in that way, builds one of the reconnected alternative character sequences 462. In one embodiment, the CPU 108 identifies a plurality of unique reconnected alternative character sequences 462 by connecting, in different combinations, a character sequence 445, a presumed sequence 134 and/or fixed-value character 201 from each segment 428. In one embodiment, all possible combinations of reconnected alternative character sequences 462 are identified. In a further embodiment, all possible combinations of reconnected alternative character sequences 462 are compared with a library 136 of word possibilities. In a further embodiment, if a reconnected alternative character sequence 462 is found in the library 136 and the presumed word 134 is not found in the library, then the CPU 108 accepts as input the found reconnected alternative character sequence 462 in place of the presumed word 134.
  • The BPT sequence 382 includes a ‘pair’ BPT 350, so there is more than one BPV sequence 380 a user could input to get the intended word 135 ‘unaccustomed’. As a result, more than one set of alternative character sequences 445 exists. FIGS. 20 and 21 show two of eight possible sets of reconnected alternative character sequences 462 that could occur. For the example of FIG. 20 the received BPV sequence 380 is ‘−2 +3 −3 −3 −2 −2 [−2 +3] [−3 +2] 0 [−3 −2] +3 −2 −3’. For the example of FIG. 21 the received BPV sequence 380 is ‘−2 +3 −3 −3 −2 −2 [+3 −2] [+2 −3] 0 [−2 −3] +3 −2 −3’.
  • For the BPV sequence 380 of FIG. 20, examples of possible reconnected alternative character sequences 462 are ‘r w n a c c u s t o j e a’, ‘r w n a c c u s t o j c d’, ‘r w n a c c u s t o m e d’, ‘r w n a c c r w q v t o j e d’, ‘u n a c c u s t o j e d’, ‘u n a c c u s t o m e d’ and so on. For the BPV sequence 380 of FIG. 21, examples of possible reconnected alternative character sequences 462 are ‘r w n a c c u s t o j e a’, ‘r w n a c c u s t o j c d’, ‘r w n a c c u s t o m e d’, ‘r w n a c c w r v q t o j e d’, ‘u n a c c u s t o j e d’, ‘u n a c c u s t o m e d’ and so on.
  • FIG. 22 shows a schematic drawing of another embodiment of the electronic device 100 for input of characters. The device 100 may have some or all the components and functionality described herein with respect to the mobile device 100 of FIG. 1. The device 100 has aspects previously disclosed in FIG. 8 of U.S. Pat. No. 8,487,877, which is hereby incorporated by reference in its entirety.
  • The electronic device 100 includes the display 104, a plurality of characters 200 that populate positions 242 in multiple menu rows (or menu arrays) 244 of a character menu 240, a plurality of selection buttons 110 divided among multiple button rows (or button arrays) 280 that includes both time-dependent selection buttons 127 and at least one time-independent selection button 129, and a spacebar button 264, which together make up a user interface 150 of the device 100. Each of the selection buttons 110 has an assigned row identification (ID) value 282 and an assigned button press value 222. Included as part of or within proximity to the menu 240 is a reference 258, additional row ID values 282, and an offset scale 260. The display 104, the plurality of selection buttons 110, and the spacebar button 264 are communicatively coupled with the CPU 108, as described in the embodiment of FIG. 1. The CPU 108 includes the elapsed time counter 140 and the button press value counter 142, as described in the embodiment of FIG. 3. The CPU 108 is communicatively coupled with the storage medium 112 and the power source 122, as described in the embodiment of FIG. 3.
  • In the embodiment of FIG. 22, the menu 240 has 34 menu positions 242 and the plurality of selection buttons includes fourteen buttons with the assigned button press values 222: ‘−4, −3, −2, 0, +2, +3, +4’. In a further embodiment, the menu positions 242 are populated by 33 characters 200 of the Russian alphabet.
  • FIG. 23 shows an embodiment of the table 185 of value assignments for variables of the method 709 of FIG. 9 for the embodiment of the user interface 150 of FIG. 22.
  • FIG. 24 shows a schematic drawing of another embodiment of the electronic device 100 for input of characters. The device 100 may have some or all the components and functionality described herein with respect to the mobile device 100 of FIG. 1. The device 100 has aspects previously disclosed in FIG. 8 of U.S. Pat. No. 8,487,877, which is hereby incorporated by reference in its entirety.
  • The electronic device 100 includes the display 104, a plurality of characters 200 that populate positions 242 in multiple menu rows (or menu arrays) 244 of a character menu 240, a plurality of selection buttons 110 divided among multiple button rows (or button arrays) 280 that includes both time-dependent selection buttons 127 and at least one time-independent selection button 129, and a spacebar button 264, which together make up a user interface 150 of the device 100. Each of the selection buttons 110 has an assigned row identification (ID) value 282 and an assigned button press value 222. Included as part of or within proximity to the menu 240 is a reference 258, additional row ID values 282, and an offset scale 260. The display 104, the plurality of selection buttons 110, and the spacebar button 264 are communicatively coupled with the CPU 108, as described in the embodiment of FIG. 1. The CPU 108 includes the elapsed time counter 140 and the button press value counter 142, as described in the embodiment of FIG. 3. The CPU 108 is communicatively coupled with the storage medium 112 and the power source 122, as described in the embodiment of FIG. 3.
  • In the embodiment of FIG. 24, the menu 240 has 30 menu positions 242 and the plurality of selection buttons includes twelve buttons with the assigned button press values 222: ‘−4, −3, −2, 0, +2, +3’. In a further embodiment, the menu positions 242 are populated by the 26 characters 200 of the English alphabet, plus characters that represent four of the five tones used in Chinese pinyin. In a further embodiment, the four tones represented are flat (high level), rising (high-rising), fall-rising (low) and falling (high-falling). In a further embodiment, the four tones are represented by a macron, acute accent, caron and grave accent, respectively. In an alternative embodiment, the four tones are represented by the marks ‘¯’, ‘´’, ‘ˇ’ and ‘`’, respectively.
  • FIG. 25 shows an embodiment of the table 185 of value assignments for variables of the method 709 of FIG. 9 for the embodiment of the user interface 150 of FIG. 24.
  • FIG. 26 shows a schematic drawing of another embodiment of the electronic device 100 for input of characters. The device 100 may have some or all the components and functionality described herein with respect to the mobile device 100 of FIG. 1. The device 100 has aspects previously disclosed in FIG. 8 of U.S. Pat. No. 8,487,877, which is hereby incorporated by reference in its entirety.
  • The electronic device 100 includes the display 104, a plurality of characters 200 that populate positions 242 in multiple menu rows (or menu arrays) 244 of a character menu 240, a plurality of selection buttons 110 divided among multiple button rows (or button arrays) 280 that includes both time-dependent selection buttons 127 and at least one time-independent selection button 129, and a spacebar button 264, which together make up a user interface 150 of the device 100. Each of the selection buttons 110 has an assigned row identification (ID) value 282 and an assigned button press value 222. Included as part of or within proximity to the menu 240 is a reference 258, additional row ID values 282, and an offset scale 260. The display 104, the plurality of selection buttons 110, and the spacebar button 264 are communicatively coupled with the CPU 108, as described in the embodiment of FIG. 1. The CPU 108 includes the elapsed time counter 140 and the button press value counter 142, as described in the embodiment of FIG. 3. The CPU 108 is communicatively coupled with the storage medium 112 and the power source 122, as described in the embodiment of FIG. 3.
  • In the embodiment of FIG. 26, the menu 240 has 26 menu positions 242 and the plurality of selection buttons includes ten buttons with the assigned button press values 222: ‘−3, −2, 0, +2, +3’. In a further embodiment, the menu positions 242 are populated by the 26 characters 200 of the English alphabet. In a further embodiment, the characters 200 populate the positions 242 so that the most frequently used characters in the language, ‘e’ and ‘t’, are in positions selected by a time-independent button 129. In another embodiment, the most frequently used characters in the language ‘e, t, a, o, i, n, s, r, h and l’ populate positions of the menu selected by ‘fixed BPTs’ 355 or ‘short BPTs’ 340. In another embodiment, the least frequently used characters in the language ‘y, b, v, k, x, j, q and z’ populate positions of the menu selected by ‘pair BPTs’ 350.
  • FIG. 27 shows an embodiment of the table 185 of value assignments for variables of the method 709 of FIG. 9 for the embodiment of the user interface 150 of FIG. 26.
  • The proposed method uses time-dependent button actuations and a time-sensitive error correction algorithm to enable text input. An algorithm corrects errors by identifying alternative character sequences that derive from predictable button actuation timing errors. Errors due to timing are predictable because the number of possible inaccurately timed button actuation combinations is finite. Furthermore, significant variation in the likelihood of different actuation errors makes error prioritization productive. The method increases positional selection accuracy but comes at the expense of a new error type—button press timing inaccuracy. However the trade-off is net positive because positional selection errors are exchanged for more predictable time-based ones.
  • The interface has 10 selection buttons and a 26-position menu. A user selects characters by navigating the menu using time-dependent button actuations. The character highlighted when a short-duration timer expires becomes selected. The 13:5 ratio of characters to buttons enables the buttons to be large compared with those of a 26-button interface, which increases selection accuracy. However, a trade-off is that errors due to inaccurately timed button presses become possible.
  • An algorithm corrects selection errors by identifying alternative character sequences that require actuation of the same buttons in the same order as those of an entered word, but that use different time-dependent button actuations. Errors due to inaccurately timed button presses are more predictable than errors due to inaccurately positioned presses because the number of possible alternatives is fewer. Furthermore, the likelihood of the various errant time-dependent button actuations varies significantly, which makes prioritization of identified alternative character sequences useful.
  • The proposed method increases positional selection accuracy but at the expense of a new error type—inaccurately timed button presses. However the trade-off is net positive because errors due to inaccurately positioned button presses are exchanged for errors due to inaccurately timed presses, which are more predictable and easier to accurately correct.
  • The embodiment of FIG. 28 discloses one-half of a full 26-character interface 150, a graphic representation of each of three button press types 224, and a table of button press types 224 and their corresponding math operations 181.
  • The selection buttons of the interface 150 have the assigned ‘button press values’ −3, −2, 0, +2 and +3. A button press tentatively selects a character and simultaneously starts a short-duration elapsed timer (in one embodiment, ~0.15 sec). The button press identifies the selected character by the value of its position in the menu. The elapsed timer defines a period during which one of three button actuations completes the selection: (1) a button release (duration less than the elapsed time period, or ETP), (2) a continued button press (duration greater than the ETP), or (3) an additional selection button press (a co-press). The button actuation underway when the timer expires identifies the ‘button press type’ of the selection cycle (short, long or pair). The button press type 224 determines a math operation 181 applied to the button press value of the tentatively selected character. The result of the math operation is a ‘total button press value’ that identifies a character by its position in the menu, as shown by the table 185 of FIG. 29.
  • A word is represented by a sequence of button press values (BPVs) and a sequence of button press types (BPTs). As FIG. 30 shows, for the user interface of FIG. 28 the word ‘lad’ is represented by the BPV sequence ‘+2 +3 −3 −3’ and the BPT sequence ‘pair-long-short’.
  • Inaccurately timed button presses result in errant BPTs within a sequence. An error correction algorithm finds the correct word by identifying possible alternative BPT sequences for the button presses received. For example, for the word ‘lad’, a possible alternative to the BPT sequence ‘pair-long-short’ is ‘pair-long-long’. The number of possible sequences depends on the number of button presses the sequence has. As shown in FIG. 30, the word ‘lad’ is a 4-button-press sequence. Therefore, as shown in the left half of FIG. 31, ‘lad’ has 28 possible alternative BPT sequences 420.
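  • The alternatives can be generated by enumerating every BPT sequence that consumes a given number of button presses. The Python sketch below is illustrative only; it assumes, consistent with the description above, that ‘short’ and ‘long’ each consume one press while ‘pair’ consumes two.

```python
def bpt_sequences(n_presses):
    """Enumerate every BPT sequence that consumes exactly n_presses
    button presses ('short' and 'long' consume one press each,
    'pair' consumes two)."""
    if n_presses == 0:
        return [[]]
    sequences = []
    for tail in bpt_sequences(n_presses - 1):
        sequences.append(["short"] + tail)
        sequences.append(["long"] + tail)
    if n_presses >= 2:
        for tail in bpt_sequences(n_presses - 2):
            sequences.append(["pair"] + tail)
    return sequences
```

  For four button presses the sketch yields 29 sequences in total; removing the received sequence ‘pair-long-short’ leaves the 28 possible alternative BPT sequences 420 cited for the word ‘lad’.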
  • Each alternative BPT sequence 420 is translated to an alternative character sequence 445 according to the BPV sequence received and the menu. The right half of FIG. 31 shows the alternative character sequences for ‘lad’. An errant BPT in the BPT sequence would lead the word ‘lad’ to appear among these 28 possible alternatives.
  • The number of possible alternative BPT sequences 420 compounds as the number of button presses increases. Compounding occurs due to the number of BPT combinations that become possible when consecutive button actuations cannot be conclusively determined. The remedy for this problem is an additional variable that allows button presses, and therefore BPTs, to also be identified by row.
  • First, two assumptions are required: (1) button presses occur in the order they are intended and (2) button presses occur at least within the row they are intended. For a user interface of only two rows that is used with only two fingers, these two assumptions are considered valid. If those are true, then only those sequences where the BPTs correctly correspond with the known row value for each position of the sequence are possible. To reveal the possible alternatives, the algorithm divides the received BPT sequence into segments at each row change. The algorithm then identifies possible alternative BPT sequences separately for each segment. To identify possible alternative character sequences, the algorithm reconnects, in order, possible combinations of the received and alternative character sequence segments.
  • Another technique that achieves the same effect is to intersperse time-independent selection buttons among the time-dependent ones. A time-independent button is a key with an assigned button press value that does not change as a result of the actuation that selects it. As a result, the button is immune to button press timing errors and its value cannot be combined with that of another actuation as part of a ‘pair’. Because a time-independent button press cannot be mistaken for or combined with any other BPT, it conclusively divides a BPT sequence into segments the same way a row change does.
  • FIGS. 32-34 show an example application of the error correction algorithm described above. The example uses the 2-row interface of FIG. 32. The interface includes time-independent buttons for the positions of the letters ‘g’ and ‘t’. The algorithm also acquires a row ID value. The intended word is ‘negligence’, but the word is incorrectly entered as ‘qegilgenee’. In the example, the algorithm correctly identifies the word ‘negligence’ among the possible alternative character sequences, as shown by the dashed line in FIG. 34.
  • An important condition of a practical interface design is that each of the three button actuations be comfortably executed in nearly the same time duration. If that is not the case, then the duration of the time period must be set to allow for the most time-consuming actuation, which induces a wait in the other two actuations.
  • Of the three button actuations shown in the graphic of BPTs 224 in FIG. 28, ‘pair’ takes the longest to execute because it requires initiation of two button presses. The maximum time needed to execute two presses is found by experiment to be ~0.10-0.15 sec. Conversely, the minimum time needed to execute a button release (the ‘short’ BPT) or to comfortably wait out a continued button press (the ‘long’ BPT) falls short of 0.15 seconds by only a minor amount, both in comparison with the time needed to execute a ‘pair’ BPT and in comparison with an entire character selection cycle. Therefore the button press actuations devised for the method meet the design condition mentioned above.
  • For errors due to button press timing, ten unique error cases 397 exist, which are shown in the top half of FIG. 35. Any error due to button press timing is a combination of one or more of these. In practice, some error cases are far more likely than others. In searching a field of possible alternative character sequences for a word, character sequences that come from BPT sequences created by the most likely error cases would be searched first.
  • For any given BPT sequence, the number of possible alternative sequences is determined only by the number of button presses in the sequence. The number of BPTs in the sequence is not important. A table 418 in the bottom half of FIG. 35 shows the number of BPT alternatives for each case of a given number of button press values.
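  • The counts in the table 418 can also be computed without enumeration. The Python sketch below is illustrative only; it assumes that ‘short’ and ‘long’ each consume one button press while ‘pair’ consumes two, so the total number of BPT sequences for n presses, f(n), satisfies the recurrence f(n) = 2·f(n−1) + f(n−2).

```python
def alternative_count(n_presses):
    """Count the alternative BPT sequences for n button presses,
    excluding the received sequence itself. With 'short'/'long'
    consuming one press and 'pair' consuming two, the total number of
    sequences f(n) satisfies f(n) = 2*f(n-1) + f(n-2),
    with f(0) = 1 and f(1) = 2."""
    if n_presses == 0:
        return 0
    f_prev, f_curr = 1, 2  # f(0), f(1)
    for _ in range(n_presses - 1):
        f_prev, f_curr = f_curr, 2 * f_curr + f_prev
    return f_curr - 1  # exclude the received sequence
```

  Under these assumptions the recurrence reproduces the counts used in the examples above: 1, 4, 11 and 28 alternatives for one, two, three and four button presses, respectively.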
  • FIG. 36 is two flowcharts 152. Each flowchart shows the relationship between variables of the method 709 of FIG. 9. The top flowchart shows the relationship for an un-segmented word or for one segment of a segmented word. The bottom flowchart shows the same relationship as the top flowchart, but also includes variables for the row ID value and time-independent buttons (fixed-value BPTs) that enable BPT sequence segmentation.
  • The various embodiments described above can be combined to provide further embodiments. All of the U.S. patents, U.S. patent application publications, U.S. patent applications, foreign patents, foreign patent applications and non-patent publications referred to in this specification and/or listed in the Application Data Sheet are incorporated herein by reference, in their entirety. Aspects of the embodiments can be modified, if necessary to employ concepts of the various patents, applications and publications to provide yet further embodiments.
  • These and other changes can be made to the embodiments in light of the above-detailed description. In general, in the following claims, the terms used should not be construed to limit the claims to the specific embodiments disclosed in the specification and the claims, but should be construed to include all possible embodiments along with the full scope of equivalents to which such claims are entitled. Accordingly, the claims are not limited by the disclosure.

Claims (20)

1. A computer processor-implemented method comprising:
monitoring, by at least one computer processor, an actuated first selection button and an elapsing timer;
interpreting, by at least one computer processor, one or more of the following actuation-concluding events: a release of the actuated first button, an expiration of the elapsing timer, and an actuation of a second selection button prior to the expiration of the elapsing timer; and
inferring, by at least one computer processor, a button press type based on which of the following ordered event sequences matches all the interpreted actuation-concluding events:
the release of the actuated first button, then the expiration of the elapsing timer;
the expiration of the elapsing timer;
the actuation of a second selection button prior to the expiration of the elapsing timer, and
the release of the actuated first button, then the actuation of the second selection button prior to the expiration of the elapsing timer.
2. The method of claim 1 wherein the inferring includes the at least one processor inferring the following ordered event sequence as a short button press type: the release of the actuated first button, then the expiration of the elapsing timer.
3. The method of claim 1 wherein the inferring includes the at least one processor inferring the following ordered event sequence as a long button press type: the expiration of the elapsing timer.
4. The method of claim 1 wherein the inferring includes the at least one processor inferring the following ordered event sequences as a pair button press type:
the actuation of a second selection button prior to the expiration of the elapsing timer, and
the release of the actuated first button, then the actuation of the second selection button prior to the expiration of the elapsing timer.
5. The method of claim 1 further comprising calculating, by at least one computer processor, a total button press value based on button press values assigned to the one or more actuated selection buttons and the inferred button press type.
6. The method of claim 5 wherein:
for a short button press type, the calculated total button press value equals the button press value assigned to the actuated first selection button;
for a long button press type, the calculated total button press value equals twice the button press value assigned to the actuated first selection button, and
for a pair button press type, the calculated total button press value equals the sum of the button press values assigned to the first and second actuated selection buttons.
7. The method of claim 6 further comprising selecting a character from among a plurality of displayed characters by matching the calculated total button press value to a value that identifies a position of the selected character among the plurality of displayed characters.
8. The method of claim 7 wherein each of the assigned button press values of the one or more actuated selection buttons is one of −3, −2, +2 and +3.
9. A system comprising:
at least one computer processor; and
at least one memory coupled to the at least one computer processor, the at least one memory having computer-executable instructions stored thereon, that when executed by the at least one computer processor, cause the system to:
monitor an actuated first selection button and an elapsing timer;
interpret one or more of the following actuation-concluding events: a release of the actuated first button, an expiration of the elapsing timer, and an actuation of a second selection button prior to the expiration of the elapsing timer; and
infer a button press type based on which of the following ordered event sequences matches all the interpreted actuation-concluding events:
the release of the actuated first button, then the expiration of the elapsing timer;
the expiration of the elapsing timer;
the actuation of a second selection button prior to the expiration of the elapsing timer, and
the release of the actuated first button, then the actuation of the second selection button prior to the expiration of the elapsing timer.
10. The system of claim 9 wherein the computer-executable instructions, when executed by at least one computer processor, cause the system to infer the following ordered event sequence as a short button press type: the release of the actuated first button, then the expiration of the elapsing timer.
11. The system of claim 9 wherein the computer-executable instructions, when executed by at least one computer processor, cause the system to infer the following ordered event sequence as a long button press type: the expiration of the elapsing timer.
12. The system of claim 9 wherein the computer-executable instructions, when executed by at least one computer processor, cause the system to infer the following ordered event sequences as a pair button press type:
the actuation of a second selection button prior to the expiration of the elapsing timer, and
the release of the actuated first button, then the actuation of the second selection button prior to the expiration of the elapsing timer.
13. The system of claim 9 wherein the computer-executable instructions, when executed by at least one computer processor, further cause the system to calculate a total button press value based on button press values assigned to the one or more actuated selection buttons and the inferred button press type.
14. The system of claim 13 wherein:
for a short button press type, the calculated total button press value equals the button press value assigned to the actuated first selection button;
for a long button press type, the calculated total button press value equals twice the button press value assigned to the actuated first selection button, and
for a pair button press type, the calculated total button press value equals the sum of the button press values assigned to the first and second actuated selection buttons.
15. The system of claim 14 wherein the computer-executable instructions, when executed by at least one computer processor, further cause the system to select a character from among a plurality of displayed characters by matching the calculated total button press value to a value that identifies a position of the selected character among the plurality of displayed characters.
16. The system of claim 15 wherein each of the assigned button press values of the one or more actuated selection buttons is one of −3, −2, +2 and +3.
17. A non-transitory computer-readable storage medium having computer-executable instructions stored thereon, that when executed by at least one computer processor, cause the at least one computer processor to:
monitor an actuated first selection button and an elapsing timer;
interpret one or more of the following actuation-concluding events: a release of the actuated first button, an expiration of the elapsing timer, and an actuation of a second selection button prior to the expiration of the elapsing timer; and
infer a button press type based on which of the following ordered event sequences matches all the interpreted actuation-concluding events:
the release of the actuated first button, then the expiration of the elapsing timer;
the expiration of the elapsing timer;
the actuation of a second selection button prior to the expiration of the elapsing timer, and
the release of the actuated first button, then the actuation of the second selection button prior to the expiration of the elapsing timer.
18. The non-transitory computer-readable storage medium of claim 17 wherein the computer-executable instructions, when executed by at least one computer processor, cause the at least one computer processor to infer the following ordered event sequence as a short button press type: the release of the actuated first button, then the expiration of the elapsing timer.
19. The non-transitory computer-readable storage medium of claim 17 wherein the computer-executable instructions, when executed by at least one computer processor, cause the at least one computer processor to infer the following ordered event sequence as a long button press type: the expiration of the elapsing timer.
20. The non-transitory computer-readable storage medium of claim 17 wherein the computer-executable instructions, when executed by at least one computer processor, cause the at least one computer processor to infer the following ordered event sequences as a pair button press type:
the actuation of a second selection button prior to the expiration of the elapsing timer, and
the release of the actuated first button, then the actuation of the second selection button prior to the expiration of the elapsing timer.
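Claims 5-8 define how a total button press value follows from the inferred BPT and the values assigned to the actuated buttons. A hedged sketch of that arithmetic in Python; the pairing of total values with characters is a hypothetical example, since the claims require only that the total match a value identifying a character's position:

```python
def total_press_value(bpt, first_value, second_value=None):
    # Per claims 5-6: short -> the first button's value; long -> twice
    # the first button's value; pair -> the sum of the first and second
    # buttons' values.
    if bpt == "short":
        return first_value
    if bpt == "long":
        return 2 * first_value
    if bpt == "pair":
        if second_value is None:
            raise ValueError("a pair press requires a second button value")
        return first_value + second_value
    raise ValueError(f"unknown button press type: {bpt}")

# Claim 8 restricts per-button values to -3, -2, +2 and +3, so a long
# press of the +2 button, for example, yields a total of +4. The
# mapping below from totals to characters is purely illustrative.
value_to_char = {-6: 'a', -4: 'b', -3: 'c', -2: 'd',
                 -1: 'e', 2: 'f', 3: 'g', 4: 'h', 6: 'i'}

def select_character(total_value):
    # Claim 7: match the calculated total value to the value that
    # identifies the selected character's position among the
    # displayed characters.
    return value_to_char[total_value]
```

For instance, a pair press of the −3 and +2 buttons totals −1, selecting a different character than either button's short press alone, which is what lets a small set of buttons address a wider range of positions.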
US16/242,688 2015-04-30 2019-01-08 Method and system of multi-variable character input Abandoned US20190138208A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US16/242,688 US20190138208A1 (en) 2015-04-30 2019-01-08 Method and system of multi-variable character input

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US201562155372P 2015-04-30 2015-04-30
US15/139,866 US20160320929A1 (en) 2015-04-30 2016-04-27 Method and system of multi-variable character input
US16/242,688 US20190138208A1 (en) 2015-04-30 2019-01-08 Method and system of multi-variable character input

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US15/139,866 Continuation US20160320929A1 (en) 2015-04-30 2016-04-27 Method and system of multi-variable character input

Publications (1)

Publication Number Publication Date
US20190138208A1 true US20190138208A1 (en) 2019-05-09

Family

ID=57198771

Family Applications (5)

Application Number Title Priority Date Filing Date
US15/139,872 Abandoned US20160321237A1 (en) 2015-04-30 2016-04-27 Method of word identification that uses an array variable
US15/139,858 Active 2038-01-15 US10452264B2 (en) 2015-04-30 2016-04-27 Systems and methods for word identification that use button press type error analysis
US15/139,862 Active 2036-11-29 US10216410B2 (en) 2015-04-30 2016-04-27 Method of word identification that uses interspersed time-independent selection keys
US15/139,866 Abandoned US20160320929A1 (en) 2015-04-30 2016-04-27 Method and system of multi-variable character input
US16/242,688 Abandoned US20190138208A1 (en) 2015-04-30 2019-01-08 Method and system of multi-variable character input

Family Applications Before (4)

Application Number Title Priority Date Filing Date
US15/139,872 Abandoned US20160321237A1 (en) 2015-04-30 2016-04-27 Method of word identification that uses an array variable
US15/139,858 Active 2038-01-15 US10452264B2 (en) 2015-04-30 2016-04-27 Systems and methods for word identification that use button press type error analysis
US15/139,862 Active 2036-11-29 US10216410B2 (en) 2015-04-30 2016-04-27 Method of word identification that uses interspersed time-independent selection keys
US15/139,866 Abandoned US20160320929A1 (en) 2015-04-30 2016-04-27 Method and system of multi-variable character input

Country Status (3)

Country Link
US (5) US20160321237A1 (en)
CN (1) CN107924273A (en)
WO (2) WO2016176359A1 (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107085498A (en) * 2016-11-15 2017-08-22 阿里巴巴集团控股有限公司 The method and apparatus for inputting numerical value
EP3625652B1 (en) 2017-05-19 2022-09-14 Michael William Murphy An interleaved character selection interface
CN107807783A (en) * 2017-10-26 2018-03-16 珠海市魅族科技有限公司 Terminal operation method and device, computer installation and readable storage medium storing program for executing
US11212847B2 (en) * 2018-07-31 2021-12-28 Roku, Inc. More secure device pairing
US11922007B2 (en) 2018-11-29 2024-03-05 Michael William Murphy Apparatus, method and system for inputting characters to an electronic device
CN111191441A (en) * 2020-01-06 2020-05-22 广东博智林机器人有限公司 Text error correction method, device and storage medium
WO2022159565A1 (en) * 2021-01-22 2022-07-28 Rutgers, The State University Of New Jersey Systems for infrastructure degradation modelling and methods of use thereof

Family Cites Families (66)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS57109031A (en) 1980-12-26 1982-07-07 Sharp Corp Input equipment
US4912462A (en) 1982-07-29 1990-03-27 Sharp Kabushiki Kaisha Letter input device for electronic word retrieval device
JP3727399B2 (en) 1996-02-19 2005-12-14 ミサワホーム株式会社 Screen display type key input device
JPH1185362A (en) 1997-09-01 1999-03-30 Nec Corp Keyboard control method and keyboard controller
US6011542A (en) 1998-02-13 2000-01-04 Sony Corporation Graphical text entry wheel
KR100327209B1 (en) 1998-05-12 2002-04-17 윤종용 Software keyboard system using the drawing of stylus and method for recognizing keycode therefor
US6271835B1 (en) 1998-09-03 2001-08-07 Nortel Networks Limited Touch-screen input device
US7712053B2 (en) 1998-12-04 2010-05-04 Tegic Communications, Inc. Explicit character filtering of ambiguous text entry
US6727830B2 (en) 1999-01-05 2004-04-27 Microsoft Corporation Time based hardware button for application launch
US6770572B1 (en) 1999-01-26 2004-08-03 Alliedsignal Inc. Use of multifunctional si-based oligomer/polymer for the surface modification of nanoporous silica films
KR100547767B1 (en) 1999-04-02 2006-02-01 삼성전자주식회사 Using method of multi-function key
US7030863B2 (en) 2000-05-26 2006-04-18 America Online, Incorporated Virtual keyboard system with automatic correction
US7403888B1 (en) 1999-11-05 2008-07-22 Microsoft Corporation Language input user interface
FI19992822A (en) 1999-12-30 2001-07-01 Nokia Mobile Phones Ltd The keyboard arrangement
US6597345B2 (en) 2000-03-03 2003-07-22 Jetway Technologies Ltd. Multifunctional keypad on touch screen
US20020163504A1 (en) * 2001-03-13 2002-11-07 Pallakoff Matthew G. Hand-held device that supports fast text typing
JP4084582B2 (en) 2001-04-27 2008-04-30 俊司 加藤 Touch type key input device
US7809574B2 (en) * 2001-09-05 2010-10-05 Voice Signal Technologies Inc. Word recognition using choice lists
US6765556B2 (en) 2001-11-16 2004-07-20 International Business Machines Corporation Two-key input per character text entry apparatus and method
SG125895A1 (en) 2002-04-04 2006-10-30 Xrgomics Pte Ltd Reduced keyboard system that emulates qwerty-type mapping and typing
KR100941948B1 (en) 2002-05-21 2010-02-11 코닌클리케 필립스 일렉트로닉스 엔.브이. A system for selecting and entering objects and a method for entering objects from a set of objects and compuetr readable medium for storing software code for implementing the method
US8416217B1 (en) 2002-11-04 2013-04-09 Neonode Inc. Light-based finger gesture user interface
US7382358B2 (en) 2003-01-16 2008-06-03 Forword Input, Inc. System and method for continuous stroke word-based text input
US7256769B2 (en) 2003-02-24 2007-08-14 Zi Corporation Of Canada, Inc. System and method for text entry on a reduced keyboard
SG135918A1 (en) 2003-03-03 2007-10-29 Xrgomics Pte Ltd Unambiguous text input method for touch screens and reduced keyboard systems
WO2005043332A2 (en) * 2003-10-31 2005-05-12 Iota Wireless Llc Concurrent data entry for a portable device
US7555732B2 (en) 2004-03-12 2009-06-30 Steven Van der Hoeven Apparatus method and system for a data entry interface
WO2005109644A1 (en) 2004-04-27 2005-11-17 Wildseed Ltd. Reduced keypad for predictive input
US7218249B2 (en) 2004-06-08 2007-05-15 Siemens Communications, Inc. Hand-held communication device having navigation key-based predictive text entry
CA2476046C (en) 2004-07-23 2012-07-10 Bruce Mead Grounding method and grid for a pedestal supported access floor
US20060066583A1 (en) 2004-09-27 2006-03-30 Toutonghi Michael J Text entry method and system using a numeric or non-QWERTY keypad
US20060202865A1 (en) 2005-03-04 2006-09-14 Nguyen Mitchell V Text entry coding system and handheld computing device
US20060213754A1 (en) 2005-03-17 2006-09-28 Microsoft Corporation Method and system for computer application program task switching via a single hardware button
TW200701035A (en) 2005-06-27 2007-01-01 Lite On Technology Corp System and method for inputting character
US7684821B2 (en) 2005-09-27 2010-03-23 Research In Motion Limited Multi-tap keyboard user interface
GB0520287D0 (en) 2005-10-06 2005-11-16 Maber Jonathan Keyboard and method of text entry
US20090201252A1 (en) * 2006-10-02 2009-08-13 Seok Ho Lee Method and apparatus for alphanumeric data entry using a keypad
US7793228B2 (en) 2006-10-13 2010-09-07 Apple Inc. Method, system, and graphical user interface for text entry with partial word display
US8299943B2 (en) 2007-05-22 2012-10-30 Tegic Communications, Inc. Multiple predictions in a reduced keyboard disambiguating system
US8011542B2 (en) 2007-12-18 2011-09-06 Helmet House, Inc. Motorcycle sissy bar luggage mounting system
TWI375162B (en) * 2008-05-02 2012-10-21 Hon Hai Prec Ind Co Ltd Character input method and electronic system utilizing the same
US8319669B2 (en) 2009-04-22 2012-11-27 Jeffrey C Weller Text entry device with radial keypad layout
US8745518B2 (en) 2009-06-30 2014-06-03 Oracle America, Inc. Touch screen input recognition and character selection
US20110009725A1 (en) * 2009-07-09 2011-01-13 Medtronic Minimed, Inc. Providing contextually relevant advertisements and e-commerce features in a personal medical device system
KR101636705B1 (en) 2009-08-06 2016-07-06 삼성전자주식회사 Method and apparatus for inputting letter in portable terminal having a touch screen
US8806362B2 (en) 2010-01-06 2014-08-12 Apple Inc. Device, method, and graphical user interface for accessing alternate keys
US9104312B2 (en) 2010-03-12 2015-08-11 Nuance Communications, Inc. Multimodal text input system, such as for use with touch screens on mobile phones
US20110304483A1 (en) 2010-06-10 2011-12-15 Richard Woocheol Moon Method and apparatus for text data input with an alphanumeric keypad for an electronic device
US8836643B2 (en) * 2010-06-10 2014-09-16 Qualcomm Incorporated Auto-morphing adaptive user interface device and methods
US8576184B2 (en) * 2010-08-19 2013-11-05 Nokia Corporation Method and apparatus for browsing content files
US8423898B2 (en) * 2010-08-23 2013-04-16 Hale Software Concepts, Inc. System and method for performing calculations using a portable electronic device
US8896543B2 (en) * 2010-09-06 2014-11-25 Avi Ettinger Virtual symbols-based keyboard
WO2012037200A2 (en) 2010-09-15 2012-03-22 Spetalnick Jeffrey R Methods of and systems for reducing keyboard data entry errors
US20120102401A1 (en) 2010-10-25 2012-04-26 Nokia Corporation Method and apparatus for providing text selection
US8316319B1 (en) 2011-05-16 2012-11-20 Google Inc. Efficient selection of characters and commands based on movement-inputs at a user-inerface
KR101806934B1 (en) 2011-06-02 2018-01-10 삼성전자 주식회사 Terminal having touch screen and method for displaying key thereof
US8922490B2 (en) 2011-06-03 2014-12-30 Apple Inc. Device, method, and graphical user interface for entering alternate characters with a physical keyboard
US8842057B2 (en) 2011-09-27 2014-09-23 Z124 Detail on triggers: transitional states
KR101978687B1 (en) 2011-11-15 2019-05-16 삼성전자주식회사 Method for inputting a character in touch screen terminal and apparatus thereof
US20150177851A1 (en) * 2012-07-03 2015-06-25 N Sringeri Omprakash User input error detection and correction system
US9256366B2 (en) 2012-08-14 2016-02-09 Google Technology Holdings LLC Systems and methods for touch-based two-stage text input
US9026428B2 (en) * 2012-10-15 2015-05-05 Nuance Communications, Inc. Text/character input system, such as for use with touch screens on mobile phones
US20140173522A1 (en) 2012-12-17 2014-06-19 Michael William Murphy Novel Character Specification System and Method that Uses Remote Selection Menu and Touch Screen Movements
CN103279192A (en) * 2013-04-18 2013-09-04 百度在线网络技术(北京)有限公司 Method and device for conducting input by using multi-meaning keyboard
CN104571584B (en) * 2014-12-30 2017-12-19 北京奇虎科技有限公司 Character input method and device
US20170199661A1 (en) 2016-01-08 2017-07-13 Michael William Murphy Method of character selection that uses mixed ambiguous and unambiguous character identification

Patent Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5202961A (en) * 1990-06-08 1993-04-13 Apple Computer, Inc. Sequential information controller
US20030023473A1 (en) * 1999-05-04 2003-01-30 George Victor Guyan Method and article of manufacture for providing a component based interface to handle tasks during claim processing
US20100094866A1 (en) * 2007-01-29 2010-04-15 Cuttner Craig D Method and system for providing 'what's next' data
US20090187860A1 (en) * 2008-01-23 2009-07-23 David Fleck Radial control menu, graphical user interface, method of controlling variables using a radial control menu, and computer readable medium for performing the method
US20100295789A1 (en) * 2009-05-19 2010-11-25 Samsung Electronics Co., Ltd. Mobile device and method for editing pages used for a home screen
US20110167375A1 (en) * 2010-01-06 2011-07-07 Kocienda Kenneth L Apparatus and Method for Conditionally Enabling or Disabling Soft Buttons
US20110304555A1 (en) * 2010-06-10 2011-12-15 Michael William Murphy Novel Character Specification System and Method that Uses a Limited Number of Selection Keys
US8487877B2 (en) * 2010-06-10 2013-07-16 Michael William Murphy Character specification system and method that uses a limited number of selection keys
US9880638B2 (en) * 2010-06-10 2018-01-30 Michael William Murphy Character specification system and method that uses a limited number of selection keys
US20160063036A1 (en) * 2014-09-03 2016-03-03 Canon Kabushiki Kaisha Communication apparatus capable of communicating with external apparatus in which contents are recorded, and receiving metadata of contents
US20170118383A1 (en) * 2015-10-01 2017-04-27 William BOLLMAN Record booth sessions network
US20200064160A1 (en) * 2018-08-22 2020-02-27 Cirrus Logic International Semiconductor Ltd. Detecting and adapting to changes in a resonant phase sensing system having a resistive-inductive-capacitive sensor

Also Published As

Publication number Publication date
WO2016176357A1 (en) 2016-11-03
CN107924273A (en) 2018-04-17
US20160320929A1 (en) 2016-11-03
US20160321236A1 (en) 2016-11-03
WO2016176359A1 (en) 2016-11-03
US10452264B2 (en) 2019-10-22
US20160320963A1 (en) 2016-11-03
US10216410B2 (en) 2019-02-26
US20160321237A1 (en) 2016-11-03

Similar Documents

Publication Publication Date Title
US20190138208A1 (en) Method and system of multi-variable character input
US20170199661A1 (en) Method of character selection that uses mixed ambiguous and unambiguous character identification
US8878789B2 (en) Character specification system and method that uses a limited number of selection keys
US11853545B2 (en) Interleaved character selection interface
Romano et al. The tap and slide keyboard: A new interaction method for mobile device text entry
US20150234592A1 (en) Systems, methods and devices for input of characters with optional time-based button taps
US20160124535A1 (en) Method of character identification that uses button press types
CN104699402A (en) Sliding input device and method based on touch screen
CN106201003B (en) Virtual keyboard based on touch screen equipment and input method thereof
US11922007B2 (en) Apparatus, method and system for inputting characters to an electronic device
CN104199602A (en) Information processing method and electronic equipment
JP6655331B2 (en) Electronic equipment and methods
US20120059647A1 (en) Touchless Texting Exercise
TWI619027B (en) Dynamically generating a personalized handwriting font system and method thereof
JP6102241B2 (en) Character input program, character input device, and character input method
US20140253447A1 (en) Mouse and inputting method thereof

Legal Events

Date Code Title Description
STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION