US20100231521A1 - Character Input Device, and Method and Program for Inputting Character - Google Patents


Info

Publication number
US20100231521A1
US20100231521A1 (application US 12/680,309)
Authority
US
United States
Prior art keywords
character
selecting
group
detected
moving direction
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US12/680,309
Other languages
English (en)
Inventor
Osamu Nishimura
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Kyocera Corp
Original Assignee
Kyocera Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Kyocera Corp filed Critical Kyocera Corp
Assigned to KYOCERA CORPORATION reassignment KYOCERA CORPORATION ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: NISHIMURA, OSAMU
Publication of US20100231521A1

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/018 - Input/output arrangements for oriental characters
    • G06F 3/02 - Input arrangements using manually operated switches, e.g. using keyboards or dials
    • G06F 3/0202 - Constructional details or processes of manufacture of the input device
    • G06F 3/0219 - Special purpose keyboards
    • G06F 3/023 - Arrangements for converting discrete items of information into a coded form, e.g. arrangements for interpreting keyboard generated codes as alphanumeric codes, operand codes or instruction codes
    • G06F 3/0232 - Manual direct entries, e.g. key to main memory
    • G06F 3/0233 - Character input methods
    • G06F 3/0234 - Character input methods using switches operable in different directions
    • G06F 3/03 - Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/041 - Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F 3/0412 - Digitisers structurally integrated in a display
    • G06F 3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04886 - Interaction techniques using a touch-screen or digitiser, by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
    • H - ELECTRICITY
    • H03 - ELECTRONIC CIRCUITRY
    • H03M - CODING; DECODING; CODE CONVERSION IN GENERAL
    • H03M 11/00 - Coding in connection with keyboards or like devices, i.e. coding of the position of operated keys
    • H03M 11/02 - Details
    • H03M 11/04 - Coding of multifunction keys
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04B - TRANSMISSION
    • H04B 1/00 - Details of transmission systems, not covered by a single one of groups H04B 3/00 - H04B 13/00; Details of transmission systems not characterised by the medium used for transmission
    • H04B 1/38 - Transceivers, i.e. devices in which transmitter and receiver form a structural unit and in which at least one part is used for functions of transmitting and receiving
    • H04B 1/3827 - Portable transceivers
    • H04B 1/3833 - Hand-held transceivers
    • H04B 1/40 - Circuits
    • H04B 1/401 - Circuits for selecting or indicating operating mode

Definitions

  • the present invention relates to a character input device, and a method and a program for inputting a character. More specifically, the present invention relates to a character input device included in a portable device, and a method and a program for inputting a character which are carried out in the character input device.
  • In the related art, a character input device is known in which a plurality of character list display means, each displaying a plurality of pieces of character information consisting of characters of a specific character type, are displayed on display means, and when an arbitrary character in the character list display means is designated by designating means, character type displaying means for displaying a character of at least one character type corresponding to the designated character is displayed.
  • In such a device, the character list needs to be displayed, which occupies a large display area. This poses the problem that the display area available for other content is restricted whenever a character is input.
  • the present invention has been accomplished to solve the above-described problems, and an object of the present invention is to provide a character input device which is capable of readily selecting a character from among a plurality of characters.
  • Another object of the present invention is to provide a character input device which is capable of detecting a variety of inputs without enlarging an area into which a character is input.
  • a further object of the present invention is to provide a method for inputting a character which allows a character to be readily selected from among a plurality of characters.
  • a still further object of the present invention is to provide a program for inputting a character which allows a character to be readily selected from among a plurality of characters.
  • a character input device includes: position detecting means for detecting a designated position; direction detecting means for detecting a moving direction of the position detected by the position detecting means; first selecting means, when the moving direction is detected by the direction detecting means, for selecting one group as a selecting group, from among a plurality of groups into which a plurality of characters have been classified, on the basis of the moving direction; and second selecting means for selecting one of at least one character classified in the selecting group selected.
  • one of a plurality of groups into which a plurality of characters have been classified is selected as a selecting group, on the basis of the moving direction, and one of at least one character classified in the selecting group is selected. That is, in order to select one of the plurality of groups, it is only necessary to select the direction in which the designated position is to be moved. This enables a character to be readily selected from among the plurality of characters. Particularly, a character can be selected using one finger, without using a wide display area. As a result, the character input device which is capable of readily selecting a character from among a plurality of characters can be provided.
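As an illustration of the first and second selecting means described above, the direction-to-group step can be sketched as follows. The four-way quantization, the group names, and the screen-coordinate convention (y grows downward) are all assumptions for the example; the patent does not fix the number of directions.

```python
import math

# Hypothetical mapping from four movement directions to character groups.
DIRECTION_GROUPS = {"up": "G1", "right": "G2", "down": "G3", "left": "G4"}

def quantize_direction(dx, dy):
    """Quantize a movement vector into one of four compass directions.

    Screen coordinates are assumed: y grows downward, so dy is negated
    before computing the angle.
    """
    angle = math.degrees(math.atan2(-dy, dx)) % 360
    if 45 <= angle < 135:
        return "up"
    if 135 <= angle < 225:
        return "left"
    if 225 <= angle < 315:
        return "down"
    return "right"

def select_group(start, end):
    """Select a group from the direction in which the designated position moved."""
    dx, dy = end[0] - start[0], end[1] - start[1]
    return DIRECTION_GROUPS[quantize_direction(dx, dy)]
```

For instance, a position that moves from (10, 10) to (30, 10) travels rightward, so `select_group` returns the group assumed for "right", here "G2".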
  • the device further includes distance detecting means for detecting a distance between a position detected at a first time by the position detecting means and a position detected at a second time by the position detecting means, wherein the second selecting means selects one of the at least one character classified in the selecting group on the basis of the distance detected.
  • one of at least one character is selected on the basis of the distance that the designated position is moved. This makes it possible to select a character by a simple operation of moving the designated position.
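A minimal sketch of the distance-based second selecting means, assuming straight-line distance between the first and second detected positions and an assumed 20-unit step per character:

```python
def select_by_distance(group_chars, start, current, step=20.0):
    """Select a character from the group by how far the position has moved.

    Every `step` units of travel advances to the next character in the
    group, clamping at the last one. `step` is an assumed tuning value.
    """
    dx = current[0] - start[0]
    dy = current[1] - start[1]
    distance = (dx * dx + dy * dy) ** 0.5
    index = min(int(distance // step), len(group_chars) - 1)
    return group_chars[index]
```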
  • the device further includes time measuring means for measuring an elapsed time since the moving direction was detected, wherein the second selecting means selects one of the at least one character classified in the selecting group on the basis of the elapsed time measured.
  • one of at least one character is selected on the basis of the elapsed time since the moving direction was detected. This makes it possible to select a character by a simple operation of continuing the designation.
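The elapsed-time variant can be sketched the same way; the 0.5-second dwell period and the wrap-around behavior are assumptions for the example:

```python
def select_by_elapsed_time(group_chars, elapsed_seconds, period=0.5):
    """Cycle through the characters of the selected group over time.

    Each `period` seconds of continued designation advances the
    selection, wrapping around to the first character of the group.
    """
    index = int(elapsed_seconds // period) % len(group_chars)
    return group_chars[index]
```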
  • the device further includes determination means for determining, when the position detecting means that had been detecting the designated position no longer detects a position, the character that is being selected by the second selecting means from among the at least one character classified in the selecting group to be an input character.
  • the character that is being selected is determined to be the input character. This makes it possible to determine a character by the simple operation of stopping the designation.
  • a character input device includes: position detecting means for detecting a designated position; direction detecting means for detecting a moving direction of the position detected by the position detecting means; first selecting means for selecting, on the basis of the position detected at a predetermined time by the position detecting means, one group as a first selecting group from among a plurality of first type groups into which a plurality of characters have been classified; second selecting means, when the moving direction is detected by the direction detecting means, for selecting one group as a second selecting group, from among a plurality of second type groups into which the plurality of characters have been classified, on the basis of the moving direction; and third selecting means for selecting one of at least one character, among the plurality of characters, that has been classified in both the first selecting group and the second selecting group.
  • one of a plurality of first type groups into which a plurality of characters have been classified is selected as a first selecting group.
  • one of a plurality of second type groups into which the plurality of characters have been classified is selected as a second selecting group.
  • one character is selected from among at least one character that has been classified in both the first selecting group and the second selecting group.
  • one group each is selected from the two types of groups on the basis of the designated position and the direction in which the designated position is moved, respectively, thereby enabling a character to be readily selected from among the plurality of characters.
  • a character can be selected with a finger, without using a wide display area.
  • the character input device which is capable of readily selecting a character from among a plurality of characters can be provided.
  • the device further includes key input detecting means having a plurality of keys arranged under a predetermined rule and for detecting that each one of the plurality of keys has been designated, wherein the position detecting means includes a plurality of areas which correspond respectively to the plurality of keys included in the key input means, and the first selecting means associates the plurality of first type groups with the plurality of areas, respectively, and selects one of the plurality of first type groups that corresponds to one of the plurality of areas that is located nearest to the position detected at the predetermined time by the position detecting means.
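The nearest-area lookup performed by the first selecting means might look like the sketch below; the 3-column by 4-row key grid and the 40-unit key pitch are assumed geometry modeled on a 12-key pad, not values from the patent.

```python
# Assumed 12-key layout: centers of a 3-column x 4-row grid,
# with each key occupying a 40 x 40 area.
KEY_CENTERS = {
    key: (20 + col * 40, 20 + row * 40)
    for row, row_keys in enumerate(["123", "456", "789", "*0#"])
    for col, key in enumerate(row_keys)
}

def nearest_key(position):
    """Return the key whose area center is nearest to the detected position."""
    px, py = position
    return min(
        KEY_CENTERS,
        key=lambda k: (KEY_CENTERS[k][0] - px) ** 2 + (KEY_CENTERS[k][1] - py) ** 2,
    )
```

The first selecting means would then map the returned key to its associated first type group, e.g. the "2" key to group "G2".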
  • the device further includes distance detecting means for detecting a distance between a position detected at a first time by the position detecting means and a position detected at a second time by the position detecting means, wherein the third selecting means selects one of the at least one character among the plurality of characters that has been classified in both the first selecting group and the second selecting group, on the basis of the distance detected.
  • the device further includes time measuring means for measuring an elapsed time since the moving direction was detected, wherein the third selecting means selects one of the at least one character among the plurality of characters that has been classified in both the first selecting group and the second selecting group, on the basis of the elapsed time measured.
  • the device further includes canceling means for canceling the first selecting group and the second selecting group when the key input means detects that one of the plurality of keys has been designated, wherein the predetermined time is a time at which the key input means detects that one of the plurality of keys has been designated.
  • the first selecting group and the second selecting group are canceled, and in response to detection of the event that one of the keys has been designated, the designated position is detected. This makes it possible to select the first selecting group and the second selecting group again.
  • a character input device includes: position detecting means for detecting a designated position; and key input detecting means having a plurality of keys arranged under a predetermined rule and for detecting that each one of the plurality of keys has been designated; wherein the position detecting means includes a plurality of areas which correspond respectively to the plurality of keys included in the key input means.
  • the position detecting means includes a plurality of areas which correspond respectively to the plurality of keys, so that it can simultaneously detect the designated position and the key designated among the plurality of keys.
  • the character input device which is capable of detecting a variety of inputs without enlarging an area into which a character is input can be provided.
  • a character input device includes: position detecting means for detecting a designated position; character selecting means for selecting a character which is associated in advance with a first position detected by the position detecting means; displaying means for displaying a related character which is classified in a same group as the selected character, at a second position around the first position; direction detecting means for detecting a moving direction of the position detected by the position detecting means; and related character selecting means for making selectable the related character that is being displayed at the second position in the case where the direction detecting means detects a moving direction from the first position toward the second position.
  • the character input device which is capable of readily selecting a character from among a plurality of characters can be provided.
  • a method for inputting a character includes the steps of detecting a designated position; detecting a moving direction of the detected position; when the moving direction is detected, selecting one group as a selecting group, from among a plurality of groups into which a plurality of characters have been classified, on the basis of the moving direction; and selecting one of at least one character classified in the selecting group selected.
  • the method for inputting a character which allows a character to be readily selected from among a plurality of characters can be provided.
  • a program for inputting a character causes a computer to perform the steps of: detecting a designated position; detecting a moving direction of the detected position; when the moving direction is detected, selecting one group as a selecting group, from among a plurality of groups into which a plurality of characters have been classified, on the basis of the moving direction; and selecting one of at least one character classified in the selecting group selected.
  • the program for inputting a character which allows a character to be readily selected from among a plurality of characters can be provided.
  • FIG. 1A is a perspective view of a mobile phone in the state of an open style.
  • FIG. 1B is a perspective view of the mobile phone in the state of a closed style.
  • FIG. 2 is a diagram showing a configuration of an operation portion in the mobile phone.
  • FIG. 3 is a functional block diagram showing, by way of example, the functions of the mobile phone according to the present embodiment.
  • FIG. 4 is a diagram showing an example of a character table.
  • FIG. 5 is a functional block diagram showing, by way of example, the functions of a control portion included in the mobile phone.
  • FIG. 6 is a diagram showing an example of a character input screen.
  • FIG. 7 is a flowchart illustrating an example of the flow of a character input process.
  • FIG. 8 is a functional block diagram showing, by way of example, the functions of the mobile phone according to a modification.
  • FIG. 9 is a flowchart illustrating an example of the flow of the character input process according to the modification.
  • the character input device is not limited to a mobile phone, but may be any device, such as a personal digital assistant (PDA), as long as the device is used for inputting characters.
  • FIG. 1A and FIG. 1B are perspective views of a mobile phone according to an embodiment of the present invention.
  • FIG. 1A shows the mobile phone in the state of an open style
  • FIG. 1B shows the mobile phone in the state of a closed style.
  • a mobile phone 1 includes an operation side portion 3 and a display side portion 2 .
  • Operation side portion 3 has an operation portion 14 accepting an operation input by a user and a microphone 13 , which are arranged on its inner surface.
  • Display side portion 2 has a liquid crystal display (LCD) 15 and a first speaker 11 constituting a receiver, which are arranged on its inner surface, and a second speaker 12 arranged on its outer surface.
  • LCD 15 may be replaced with an organic electro-luminescence (EL) display.
  • Operation side portion 3 and display side portion 2 are rotatably connected via a hinge mechanism to be freely opened and closed.
  • the state where mobile phone 1 is folded and operation side portion 3 and display side portion 2 are in the closed position corresponds to the closed style, while the state where mobile phone 1 is open and operation side portion 3 and display side portion 2 are in the open position corresponds to the open style.
  • FIG. 2 is a diagram showing a configuration of an operation portion in the mobile phone.
  • operation portion 14 is made up of: a key rubber 25 A having 12 keys arranged thereon under a predetermined rule; a touch sensor 27 ; a dome sheet 25 B having 12 dome keys arranged thereon in correspondence respectively with the 12 keys; and a key circuit board 25 C having 12 switch patterns arranged thereon in correspondence respectively with the 12 keys, which are stacked on one another in this order.
  • Touch sensor 27 which is arranged under key rubber 25 A, has at least a size covering all the 12 keys included in key rubber 25 A, and has 12 holes 27 A at positions corresponding respectively to the 12 keys arranged on key rubber 25 A.
  • When a user presses a key, the pressed part of key rubber 25 A directly presses down the corresponding dome key in dome sheet 25 B through hole 27 A. This ensures that a key pressing event can be detected even in the case where the key is pressed down with a small force.
  • Meanwhile, no physical force is applied to touch sensor 27, which prevents touch sensor 27 from breaking.
  • FIG. 3 is a functional block diagram showing, by way of example, the functions of the mobile phone of the present embodiment.
  • mobile phone 1 includes a control portion 21 responsible for overall control of mobile phone 1, and also includes a communication circuit 23, a codec portion 29 for processing audio data, a key circuit board 25 C, a touch sensor 27, a card interface (I/F) 37, a liquid crystal display (LCD) 15, a random access memory (RAM) 31 used as a work area for control portion 21, an electrically erasable and programmable ROM (EEPROM) 33 for storing in a nonvolatile manner a program or data to be executed by control portion 21, and a battery 35 for supplying power to the whole of mobile phone 1, which are each connected to control portion 21.
  • key circuit board 25 C detects the key pressing event, and outputs to control portion 21 a signal indicating the key that has been pressed down.
  • Touch sensor 27 is a capacitive touch panel. Touch sensor 27 detects a change in static electricity. When a user touches key rubber 25 A with his or her finger, touch sensor 27 detects the touched position as a designated position. Touch sensor 27 has 12 holes 27 A. When a user designates one of holes 27 A with his or her finger, touch sensor 27 accurately detects that hole 27 A has been designated, by detecting a change in static electricity around that hole 27 A.
  • While a user is touching key rubber 25 A with his or her finger, touch sensor 27 outputs to control portion 21 the designated position being detected. When a user keeps designating the same position, touch sensor 27 continues to output the same designated position to control portion 21.
  • Touch sensor 27 may be configured to output the designated position to control portion 21 at predetermined time intervals while a user is touching key rubber 25 A with his or her finger. In this case as well, control portion 21 is capable of detecting that the user continues to touch key rubber 25 A with his or her finger.
  • Communication circuit 23 connects mobile phone 1 to a network. It is here assumed that wideband code division multiple access (W-CDMA) is used as the communication method in the network.
  • Communication circuit 23 performs radio communication with a base station apparatus connected to the W-CDMA network. A radio signal transmitted from the base station apparatus is received by an antenna 23 A.
  • Communication circuit 23 receives a radio signal received by antenna 23 A, and outputs to control portion 21 a signal acquired by demodulating the radio signal. When the signal acquired by demodulating the radio signal is an audio signal, control portion 21 outputs the audio signal to codec portion 29 .
  • the communication method may be one of other communication methods.
  • When receiving a signal from control portion 21, communication circuit 23 outputs a radio signal acquired by modulating the signal to antenna 23 A. When receiving an audio signal from codec portion 29, control portion 21 outputs the audio signal to communication circuit 23. The radio signal transmitted from antenna 23 A is received by and input into the W-CDMA base station apparatus.
  • Codec portion 29 is connected to microphone 13 , first speaker 11 , and second speaker 12 .
  • Codec portion 29 decodes an audio signal input from control portion 21, converts the decoded digital audio signal to an analog signal, and amplifies the signal to output it to first speaker 11 or second speaker 12. Further, codec portion 29 receives an analog audio signal from microphone 13, converts the audio signal to a digital signal, encodes it, and outputs the encoded audio signal to control portion 21.
  • a removable flash memory 37 A is mounted to card I/F 37 .
  • Control portion 21 is capable of accessing flash memory 37 A via card I/F 37 . While it is here assumed that the program to be executed by control portion 21 is stored in EEPROM 33 in advance, the program may be stored in flash memory 37 A and read therefrom to be executed by control portion 21 .
  • the recording medium for storing the program is not restricted to flash memory 37 A.
  • It may be a flexible disk, a cassette tape, an optical disc (compact disc-ROM (CD-ROM), magneto-optical disc (MO), mini disc (MD), digital versatile disc (DVD)), an IC card, an optical card, or a semiconductor memory such as a mask ROM, an erasable programmable ROM (EPROM), an electrically erasable programmable ROM (EEPROM), or the like.
  • mobile phone 1 may be connected to the Internet via communication circuit 23 and the program may be downloaded from a computer connected to the Internet, to be executed by control portion 21 .
  • the “program” includes, not only the program directly executable by control portion 21 , but also a source program, a compressed program, an encrypted program, and others.
  • Battery 35, which is a secondary battery such as a lithium polymer battery, a nickel-cadmium battery, or a nickel-metal hydride battery, supplies power to the whole of mobile phone 1.
  • FIG. 4 is a diagram showing an example of a character table.
  • the character table includes a “key” field, a “group name” field, and a “character type” field.
  • the “character type” field includes a “hiragana” field, a “katakana” field, an “uppercase alphanumeric” field, a “lowercase alphanumeric” field, and a “number” field.
  • the character table classifies a plurality of characters into a plurality of character types (a plurality of second types of groups).
  • the character table classifies the plurality of characters into ten groups (a plurality of first types of groups).
  • the character table shown in FIG. 4 includes ten groups having group names of “G1” to “G10”, respectively. Furthermore, the character table associates the ten groups respectively with the ten keys included in operation portion 14: group “G1” is associated with the number “1” key, group “G2” with the number “2” key, and so on, through group “G9” with the number “9” key and group “G10” with the number “0” key.
  • Hiragana characters are classified under the “hiragana” field
  • katakana characters are classified under the “katakana” field
  • uppercase alphanumeric characters are classified under the “uppercase alphanumeric” field
  • lowercase alphanumeric characters are classified under the “lowercase alphanumeric” field
  • numbers are classified under the “number” field.
  • group name “G2” is associated with the number “2” key, and the following are classified in or assigned to group “G2”: five hiragana characters belonging to the “ka” gyou (row) as the characters of the character type “hiragana”, five katakana characters belonging to the “ka” gyou as the characters of the character type “katakana”, uppercase alphabetic characters “A”, “B”, and “C” and number “2” as the characters of the character type “uppercase alphanumeric”, lowercase alphabetic characters “a”, “b”, and “c” and number “2” as the characters of the character type “lowercase alphanumeric”, and number “2” as the character of the character type “number”.
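The character table of FIG. 4 can be modeled as a nested mapping from key (first type group) to character type (second type group) to a character list. Only group “G2” is filled in below as an illustration, and the hiragana/katakana rows assume the standard Japanese 12-key assignment for the “2” key:

```python
# Partial, assumed model of the FIG. 4 character table; only the
# "2" key (group "G2") is populated here.
CHARACTER_TABLE = {
    "2": {
        "hiragana": ["か", "き", "く", "け", "こ"],
        "katakana": ["カ", "キ", "ク", "ケ", "コ"],
        "uppercase": ["A", "B", "C", "2"],
        "lowercase": ["a", "b", "c", "2"],
        "number": ["2"],
    },
}

def candidates(key, character_type):
    """Characters classified in both the key's group and the character type."""
    return CHARACTER_TABLE.get(key, {}).get(character_type, [])
```

The third selecting means would then pick one element of `candidates(key, character_type)` by distance or elapsed time, as described above.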
  • FIG. 5 is a functional block diagram showing, by way of example, the functions of the control portion included in the mobile phone.
  • control portion 21 includes a designated-position detecting portion 51 for detecting a designated position which is designated by a user, a moving direction detecting portion 53 for detecting a moving direction of the designated position, a moving distance measuring portion 55 for measuring a moving distance of the designated position, a group selecting portion 57 for selecting one of ten groups defined in a character table on the basis of the designated position, a character type selecting portion 59 for selecting a character type on the basis of the detected moving direction, a character selecting portion 61 for selecting one of a plurality of characters which are included in the group selected and included in the character type selected, and a display control portion 63 for controlling an LCD to display characters.
  • Designated-position detecting portion 51 detects a designated position output from touch sensor 27 .
  • When detecting a designated position, designated-position detecting portion 51 outputs the designated position to group selecting portion 57 , moving direction detecting portion 53 , moving distance measuring portion 55 , and character selecting portion 61 .
  • When a designated position which continued to be detected is no longer detected, designated-position detecting portion 51 outputs a determination signal to character selecting portion 61 , and outputs a reset signal to group selecting portion 57 , moving direction detecting portion 53 , and moving distance measuring portion 55 . For example, during the period in which a designated position is detected repeatedly at predetermined time intervals, designated-position detecting portion 51 determines that the designated position continues to be detected. On the other hand, when no designated position is detected within the predetermined time after the designated position was last detected, designated-position detecting portion 51 determines that the designated position does not continue to be detected.
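The continuation rule above can be sketched as a simple timeout check; the 100 ms sampling interval below is an assumed value, not one given in the text.

```python
# Minimal sketch of the "continues to be detected" rule: a designated
# position is treated as continuing while samples arrive within a
# predetermined interval; a gap longer than that interval means the
# finger was released. The 100 ms interval is an illustrative assumption.
INTERVAL = 0.1  # seconds between sensor samples (assumed)

def is_released(last_sample_time, now, interval=INTERVAL):
    """True when no sample arrived within the predetermined interval,
    i.e. the designated position no longer continues to be detected."""
    return (now - last_sample_time) > interval
```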
  • When receiving a first designated position from designated-position detecting portion 51 , group selecting portion 57 selects one of the ten groups included in the character table, on the basis of the first designated position, and outputs the group name of the selected group to character selecting portion 61 .
  • the first designated position received from designated-position detecting portion 51 refers to a designated position that is received directly after a reset signal is received from designated-position detecting portion 51 , or a designated position that is received directly after mobile phone 1 enters a character input mode. Any designated position that is received following the first designated position prior to input of a reset signal is not a first designated position. Therefore, group selecting portion 57 does not select a group on the basis of that designated position.
  • group selecting portion 57 cancels the group which has been selected till then, and outputs a cancel signal to character selecting portion 61 .
  • a key position table is stored in EEPROM 33 in advance, in which each of the 12 keys arranged on key rubber 25 A is associated with its position on touch sensor 27 .
  • Group selecting portion 57 refers to the key position table to select one of the 12 keys that is associated with the position nearest to the first designated position received from designated-position detecting portion 51 .
  • Group selecting portion 57 then refers to the character table, stored in EEPROM 33 , to select a group associated with the selected key from among the ten groups included in the character table, and outputs the group name of the selected group to character selecting portion 61 and display control portion 63 .
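The nearest-key lookup performed by group selecting portion 57 can be sketched as follows; the key coordinates below are illustrative assumptions standing in for the key position table stored in EEPROM 33.

```python
import math

# Hypothetical key position table: each of the 12 keys mapped to its
# centre on the touch sensor, in millimetres. The coordinates are
# illustrative assumptions; the real table is stored in EEPROM 33.
KEY_POSITIONS = {
    "1": (0, 0),  "2": (10, 0),  "3": (20, 0),
    "4": (0, 10), "5": (10, 10), "6": (20, 10),
    "7": (0, 20), "8": (10, 20), "9": (20, 20),
    "*": (0, 30), "0": (10, 30), "#": (20, 30),
}

def nearest_key(designated_position):
    """Select the key whose table position is nearest to the first
    designated position, as group selecting portion 57 does."""
    x, y = designated_position
    return min(KEY_POSITIONS,
               key=lambda k: math.hypot(KEY_POSITIONS[k][0] - x,
                                        KEY_POSITIONS[k][1] - y))
```

For example, a touch landing slightly off-centre at (9, 1) still resolves to the number “2” key.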
  • group selecting portion 57 selects no group and outputs nothing to character selecting portion 61 .
  • a specific character may be assigned to the “*” key or the “#” key, or a character that cannot be classified may be assigned thereto.
  • moving direction detecting portion 53 waits until a designated position that is different from the first designated position is input next from designated-position detecting portion 51 .
  • the designated position that is input following the first designated position and is different from the first designated position is referred to as a “designated position for direction detection”.
  • Moving direction detecting portion 53 detects, as a moving direction, the direction from the first designated position toward the designated position for direction detection, and outputs the detected moving direction to character type selecting portion 59 . It is here assumed that one of four directions of up, down, left, and right is detected, the number of the directions being the same as the number of character types, or four. If it is configured to detect eight directions including diagonal directions, a selection can be made from among eight character types.
  • When receiving a reset signal from designated-position detecting portion 51 , moving direction detecting portion 53 outputs a cancel signal to character type selecting portion 59 .
  • a designated position whose distance from the first designated position is greater than a predetermined length is preferably used.
  • the predetermined length may be about 2 mm, for example.
  • Character type selecting portion 59 selects one of the four character types included in the character table on the basis of the moving direction received from moving direction detecting portion 53 . Character type selecting portion 59 outputs the selected character type to character selecting portion 61 . When receiving a cancel signal from moving direction detecting portion 53 , character type selecting portion 59 outputs nothing to character selecting portion 61 .
  • the character table includes, as the four character types, “hiragana”, “uppercase alphanumeric”, “lowercase alphanumeric”, and “katakana”.
  • Character type selecting portion 59 associates a character type with each of the four directions in advance, and selects a character type corresponding to the moving direction received from moving direction detecting portion 53 .
  • the character type “uppercase alphanumeric” is selected.
  • the character type “hiragana” is selected.
  • the character type “katakana” is selected.
  • the character type “lowercase alphanumeric” is selected.
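The four-direction detection and the direction-to-character-type assignment described above can be sketched as follows. Resolving a moving vector to the axis with the larger displacement is one plausible reading of "one of four directions of up, down, left, and right", not the patent's stated method.

```python
# A sketch of four-direction detection: the axis with the larger absolute
# displacement wins (ties go to the horizontal axis in this sketch).
# Screen coordinates are assumed, with y increasing downward.
def moving_direction(first, current):
    dx = current[0] - first[0]
    dy = current[1] - first[1]
    if abs(dx) >= abs(dy):
        return "right" if dx > 0 else "left"
    return "down" if dy > 0 else "up"

# The direction-to-type assignment given in the embodiment:
DIRECTION_TO_TYPE = {
    "up": "uppercase alphanumeric",
    "down": "hiragana",
    "left": "katakana",
    "right": "lowercase alphanumeric",
}
```

For example, a drag from (0, 0) to (5, 1) resolves to “right”, which selects the “lowercase alphanumeric” character type.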
  • Moving distance measuring portion 55 uses the first designated position received from designated-position detecting portion 51 as a reference position. For every input of a designated position from designated-position detecting portion 51 , moving distance measuring portion 55 calculates a moving distance from the first designated position to the designated position which was input, and outputs the calculated moving distance to character selecting portion 61 .
  • the designated position corresponding to an endpoint that is used for calculating a moving distance is referred to as a “designated position for distance calculation”.
  • the moving distance is a distance between the first designated position and the designated position for distance calculation. A cumulative total of the calculated moving distances may be obtained whenever a moving distance is calculated. When the position designated by a user moves in a curve, a distance approximating the curve can thereby be calculated as the moving distance.
  • moving distance measuring portion 55 resets the moving distance to “0”.
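The behaviour of moving distance measuring portion 55, including the curve-approximating cumulative total and the reset, can be sketched as a small accumulator; this is an assumed implementation, not the patent's own.

```python
import math

# Sketch of moving distance measuring portion 55: accumulate the
# straight-line distance between successive designated positions, which
# approximates a curved path, and reset to 0 on a reset signal.
class MovingDistanceMeter:
    def __init__(self, first_position):
        self.last = first_position   # reference position
        self.total = 0.0             # cumulative moving distance

    def update(self, position):
        """Add the segment from the previous sample; return the total."""
        self.total += math.dist(self.last, position)
        self.last = position
        return self.total

    def reset(self, position):
        """Reset the moving distance to 0, as done on a reset signal."""
        self.last = position
        self.total = 0.0
```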
  • Character selecting portion 61 selects a group having the group name received from group selecting portion 57 , from among the groups included in the character table stored in EEPROM 33 , while the group name is being received, in other words, from when the group name is received from group selecting portion 57 to when a cancel signal is received therefrom.
  • Upon receipt of the group name from group selecting portion 57 , character selecting portion 61 refers to the character table stored in EEPROM 33 , to select a character that has been classified in the group having the group name input and further classified as the character type “number”, and outputs the selected character to display control portion 63 . For example, when the number “5” key is designated, group name “G 5 ” is received from group selecting portion 57 . In this case, the number “5” is selected, and this character is output to display control portion 63 .
  • character selecting portion 61 receives a character type from character type selecting portion 59 .
  • character selecting portion 61 refers to the character table stored in EEPROM 33 , to select a plurality of characters that have been classified in the group having the previously-input group name and further classified as the character type input. For example, in the case where group name “G 5 ” is input and then “hiragana” is input as the character type, the five hiragana characters classified in group “G 5 ” are selected.
  • character selecting portion 61 selects one of the selected characters on the basis of the moving distance received from moving distance measuring portion 55 , and outputs the selected character to display control portion 63 .
  • Character selecting portion 61 specifies the sequence of the selected characters in advance. Character selecting portion 61 then selects a first character in the specified sequence when the moving distance exceeds 2 mm, for example. By the time the moving distance exceeds 2 mm, a moving direction has been detected, and a character type has been determined by character type selecting portion 59 . Character selecting portion 61 selects a next character in the specified sequence whenever the moving distance increases by a predetermined distance. When the moving distance increases by the predetermined distance after the last character in the specified sequence has been selected, the first character in the specified sequence is selected again.
  • character selecting portion 61 firstly selects a first character in a predetermined sequence.
  • the character to be selected first can be arbitrarily determined in the predetermined sequence. It is assumed that characters are selected in Japanese alphabetical order, alphabetical order, and ascending order of numbers here.
  • the character to be selected first may be determined on the basis of the history of characters selected in the past. Further, it may be configured such that the selection sequence is reversed when a predetermined key provided on operation portion 14 is designated. In this case, even if a designated position is moved too far and thus a character succeeding an intended character has been selected, the preceding, originally intended character can be selected. As such, re-selection can be made with ease and in a short time.
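The distance-driven cycling of character selecting portion 61 can be sketched as an index computation. The 2 mm start threshold comes from the text; the per-character step of 2 mm is an assumption.

```python
# Sketch of the distance-based character cycling performed by character
# selecting portion 61: the first character is selected once the moving
# distance exceeds a start threshold (2 mm in the text), and each further
# step of `step` millimetres advances to the next character, wrapping
# around after the last one. The step size of 2 mm is an assumption.
START_MM = 2.0  # from the text
STEP_MM = 2.0   # per-character step; an assumed value

def select_by_distance(characters, moving_distance_mm,
                       start=START_MM, step=STEP_MM):
    """Return the character selected for a given moving distance,
    or None while the distance has not yet exceeded the threshold."""
    if moving_distance_mm <= start:
        return None
    index = int((moving_distance_mm - start) // step)
    return characters[index % len(characters)]
```

For instance, with the sequence “A”, “B”, “C”, “2”, dragging 4.5 mm selects “B”, and dragging far enough wraps back to “A”.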
  • When receiving a determination signal from designated-position detecting portion 51 , character selecting portion 61 outputs a cursor movement instruction to display control portion 63 .
  • Although moving direction detecting portion 53 is configured here to detect a moving direction on the basis of the first designated position received from designated-position detecting portion 51 and the designated position for direction detection, in the event that a moving direction detected on the basis of two designated positions input after a moving direction has once been detected is different from the previously-detected moving direction, moving direction detecting portion 53 may output the lastly detected moving direction to character type selecting portion 59 . For example, when a user moves a position designated with his or her finger firstly upward and then rightward, moving direction detecting portion 53 outputs the upward moving direction first, and then outputs the rightward moving direction at the point in time when the rightward motion is detected. Accordingly, even when an incorrect character type is selected, the selected character type can be modified immediately.
  • moving distance measuring portion 55 resets the moving distance whenever moving direction detecting portion 53 detects a new moving direction. In this case, moving distance measuring portion 55 resets the moving distance whenever the moving direction is changed, and measures a distance in a straight line between two designated positions as a moving distance. Moving distance measuring portion 55 measures a moving distance from the firstly detected one of the two designated positions that had caused moving direction detecting portion 53 to detect a new moving direction.
  • In the case where character selecting portion 61 receives different character types from character type selecting portion 59 after it receives a group name from group selecting portion 57 and before it receives a determination signal from designated-position detecting portion 51 , character selecting portion 61 refers to the character table stored in EEPROM 33 whenever a character type is input from character type selecting portion 59 , to select a plurality of characters that have been classified in the group having the previously-input group name and further classified as the character type input.
  • Display control portion 63 controls LCD 15 to display a character input screen on LCD 15 .
  • FIG. 6 is a diagram showing an example of a character input screen.
  • a character input screen 80 includes an input character displaying area 81 for displaying a character which is input, and a guidance displaying area 85 .
  • display control portion 63 displays guidance on guidance displaying area 85 .
  • the guidance includes an area 91 on which a key corresponding to the first designated position is displayed, and areas 93 A to 93 D on which a plurality of selectable characters classified according to the moving directions are displayed.
  • the guidance shown here is displayed when a user designates the number “5” key and, hence, the position corresponding to the number “5” key is detected as a first designated position.
  • the number “5” key is graphically displayed on area 91 , and a plurality of characters included in each of the character types which will be selected when one of the upward, downward, leftward, and rightward moving directions is detected are displayed on its top, bottom, left, and right, respectively.
  • area 93 A arranged above area 91 displays a plurality of characters of “J, K, L, 5” which are of the “uppercase alphanumeric” character type selected when the designated position is moved upward.
  • Area 93 B arranged below area 91 displays a plurality of hiragana characters which are of the “hiragana” character type selected when the designated position is moved downward.
  • Area 93 C arranged on the left of area 91 displays a plurality of katakana characters which are of the “katakana” character type selected when the designated position is moved leftward.
  • Area 93 D arranged on the right of area 91 displays a plurality of characters of “j, k, l, 5” which are of the “lowercase alphanumeric” character type selected when the designated position is moved rightward.
  • Guidance displaying area 85 displays the guidance, which can notify a user of the plurality of characters that will be selected when a designated position is moved. This allows the user to determine in which one of the four directions the user needs to move his or her finger in order to select an intended character.
  • Input character displaying area 81 includes an area 83 A displaying a character which is input, and a cursor area 83 B.
  • display control portion 63 displays the input character in a selectable mode on cursor area 83 B.
  • the character being displayed on cursor area 83 B is updated whenever a character is input. In other words, in the case where two or more characters are input before a cursor movement instruction is input, display control portion 63 displays on cursor area 83 B only the character that has been input lastly, among the plurality of input characters.
  • display control portion 63 moves cursor area 83 B by one character to the right, and enlarges input character displaying area 83 A by one character. This causes the character that had been displayed in a selectable mode on cursor area 83 B before the movement to be displayed on input character displaying area 83 A.
  • the position of cursor area 83 B after the movement thereof is reserved as an area for displaying in a selectable mode a character which will be input later from character selecting portion 61 .
  • FIG. 7 is a flowchart illustrating an example of the flow of a character input process.
  • the character input process is executed by control portion 21 executing a program for inputting a character.
  • control portion 21 determines whether a first designated position has been acquired (step S 01 ). While a user keeps touching key rubber 25 A with his or her finger, touch sensor 27 detects the touched position as a designated position, and outputs the detected designated position to control portion 21 . Control portion 21 receives the designated position from touch sensor 27 . When a designated position is input after no designated position has been input for a predetermined time or more, control portion 21 acquires the designated position as the first designated position. Control portion 21 is on standby until it acquires the first designated position, and once the first designated position is acquired, the process proceeds to step S 02 .
  • control portion 21 refers to the key position table to select one of the 12 keys that is associated with a position nearest to the first designated position. Control portion 21 then refers to the character table stored in EEPROM 33 to select one of the ten groups included in the character table that is associated with the selected key, and determines the selected group as a processing object.
  • Control portion 21 selects, from among a plurality of characters classified in the determined group, a character of which character type is “number” (step S 03 ), and displays the selected character in a selectable mode (step S 04 ).
  • Control portion 21 then displays guidance (step S 05 ).
  • the guidance displays graphically a key that is associated with a designated position nearest to the first designated position, and also displays by character type a plurality of characters that have been classified in the group determined in step S 02 within the character table.
  • a plurality of characters of the “uppercase alphanumeric” character type are displayed above the graphically-displayed key, a plurality of characters of the “hiragana” character type are displayed below the graphically-displayed key, a plurality of characters of the “katakana” character type are displayed on the left side of the graphically-displayed key, and a plurality of characters of the “lowercase alphanumeric” character type are displayed on the right side of the graphically-displayed key.
  • step S 06 it is determined whether the movement of the designated position has been detected.
  • a designated position that is different from the first designated position, i.e., the designated position for direction detection
  • if the designated position for direction detection is detected, it is determined that the designated position has been moved, and the process proceeds to step S 07 ; otherwise, the process proceeds to step S 17 .
  • In step S 17 , it is determined whether the designated position is no longer detected. If control portion 21 receives no designated position from touch sensor 27 , the process proceeds to step S 18 ; if control portion 21 continues to receive a designated position, the process returns to step S 06 . In step S 18 , the number that is being displayed in a selectable mode is confirmed as an input character, and the process proceeds to step S 20 . A user only needs to release his or her finger from key rubber 25 A to confirm the character being input, ensuring easy operation.
  • a moving direction is detected.
  • the direction from the first designated position toward the designated position for direction detection is detected as the moving direction.
  • the moving direction to be detected is one of four directions of up, down, left, and right, the number of the directions being the same as the number of character types, or four.
  • a character type that is predetermined corresponding to the detected moving direction is then determined (step S 08 ).
  • the character type is determined to be “uppercase alphanumeric” when the moving direction is upward.
  • the character type is determined to be “hiragana” when the moving direction is downward.
  • the character type is determined to be “katakana” when the moving direction is leftward.
  • the character type is determined to be “lowercase alphanumeric” when the moving direction is rightward.
  • a character table stored in EEPROM 33 is referred to, to select a plurality of characters that have been classified in the group determined in step S 02 and further classified as the character type determined in step S 08 (step S 09 ). For example, when the group is determined to be the one with the group name “G 5 ” and subsequently the character type is determined to be “hiragana”, then the five hiragana characters classified in group “G 5 ” are selected. Then, from among the plurality of selected characters, a first character in a sequence is selected (step S 10 ), and the selected character is displayed in a selectable mode (step S 11 ). Then, the process proceeds to step S 12 .
  • In step S 12 , the moving distance is reset. It is then determined whether the designated position is no longer detected (step S 13 ). If control portion 21 receives no designated position from touch sensor 27 , the process proceeds to step S 19 . If control portion 21 continues to receive a designated position, the process proceeds to step S 14 . In step S 14 , it is determined whether the moving distance is equal to or greater than a threshold value T. If the moving distance is equal to or greater than the threshold value, the process proceeds to step S 15 ; otherwise, the process returns to step S 13 .
  • In other words, if a user moves his or her finger from the initially designated position by a distance of not smaller than the threshold value T with the finger kept in contact with key rubber 25 A, the process proceeds to step S 15 ; if the user releases his or her finger from key rubber 25 A, the process proceeds to step S 19 .
  • In step S 15 , a next character in the sequence is selected.
  • In step S 16 , the character that has been displayed in a selectable mode in step S 11 , or in step S 16 as previously executed, is replaced with the character selected in step S 15 , to switch the display. Thereafter, the process returns to step S 12 .
  • This causes the display of a selectable character to be switched, enabling a user to confirm the character that the user is able to select from among a plurality of characters.
  • In step S 19 , the character that is being displayed in a selectable mode is confirmed as an input character, and the process proceeds to step S 20 .
  • In step S 20 , a cursor is moved, and the process proceeds to step S 21 .
  • In step S 21 , it is determined whether an end instruction has been received.
  • the operation keypad is configured in advance to include a key corresponding to an instruction to terminate the character input process, and it is determined whether the key has been designated. If the end instruction is received, the process is terminated; otherwise, the process returns to step S 01 .
  • In step S 07 , it may be determined whether a change in the moving direction has been detected. If a change in the moving direction is detected, the process may proceed to step S 08 ; otherwise, the process may proceed to step S 12 . As such, when a change in the moving direction is detected, processes in steps S 08 to S 11 are executed, which allows a character type corresponding to the changed moving direction to be selected and one of the plurality of characters classified as that character type to be made selectable.
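The flow of steps S 01 to S 20 can be condensed into a small, self-contained sketch. Everything concrete here (the single-key table, the up-only direction mapping, and the 2 mm threshold reused for both direction detection and character steps) is an illustrative assumption; only the control structure follows the flowchart.

```python
import math

# Condensed sketch of the flowchart: a list of touch samples (None
# meaning the finger was released) drives group selection, direction
# detection, and distance-based character cycling.
GROUP_TABLE = {"number": ["5"],
               "uppercase alphanumeric": ["J", "K", "L", "5"]}
DIRECTION_TO_TYPE = {"up": "uppercase alphanumeric"}
THRESHOLD = 2.0  # mm; assumed for both direction detection and steps

def input_one_character(samples):
    """Process touch samples until release; return the confirmed character."""
    first = samples[0]
    selected = GROUP_TABLE["number"][0]          # steps S 02 to S 04
    chars, travelled, last = None, 0.0, first
    for pos in samples[1:]:
        if pos is None:                          # steps S 13 / S 17: released
            return selected                      # steps S 18 / S 19: confirm
        travelled += math.dist(last, pos)
        last = pos
        if chars is None and travelled > THRESHOLD:   # steps S 07 to S 11
            direction = "up" if pos[1] < first[1] else "down"
            chars = GROUP_TABLE.get(DIRECTION_TO_TYPE.get(direction, ""))
            if chars:
                selected, travelled = chars[0], 0.0
        elif chars and travelled >= THRESHOLD:        # steps S 14 to S 16
            selected = chars[(chars.index(selected) + 1) % len(chars)]
            travelled = 0.0
    return selected
```

An immediate release confirms the number, while an upward drag cycles through “J”, “K”, “L”, “5” as the distance accumulates.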
  • In mobile phone 1 of the present embodiment, when a first designated position is detected, one of ten groups is selected, and when the movement of the designated position is detected, a character type corresponding to the moving direction is selected. Therefore, the group in which a character to be input is classified is selected using the position firstly designated with a finger and the direction in which the finger is moved. Furthermore, one character is selected in accordance with the distance by which the finger is moved or the time during which the position is designated, and when a user releases his or her finger from the key rubber, the character that has been selected is confirmed as an input character. As a result, one of a plurality of characters of a plurality of character types can be selected and input with simple operation.
  • mobile phone 1 described above is configured to use touch sensor 27 to select a character which will be input
  • mobile phone 1 according to a modification is configured to select a character which will be input by additionally using a key input from key circuit board 25 C .
  • the differences from the above-described mobile phone 1 will be described primarily.
  • FIG. 8 is a functional block diagram showing, by way of example, the functions of the mobile phone according to a modification.
  • the functional block diagram of the modification is different from the functional block diagram shown in FIG. 5 in that a key accepting portion 71 has been added and designated-position detecting portion 51 has been replaced with designated-position detecting portion 51 A.
  • Key accepting portion 71 receives from key circuit board 25 C a signal indicating which one of the 12 switch patterns arranged thereon has been shorted, and detects which one of the 12 keys arranged on key rubber 25 A has been pressed down. In response to an input of the key from key circuit board 25 C, key accepting portion 71 outputs a key signal indicating which one of the 12 keys has been pressed down, to designated-position detecting portion 51 A.
  • Designated-position detecting portion 51 A detects a designated position output from touch sensor 27 , and outputs the designated position to group selecting portion 57 , moving direction detecting portion 53 , moving distance measuring portion 55 , and character selecting portion 61 .
  • designated-position detecting portion 51 A outputs a determination signal to character selecting portion 61 , and outputs a reset signal to group selecting portion 57 , moving direction detecting portion 53 , and moving distance measuring portion 55 .
  • When receiving a key signal from key accepting portion 71 , designated-position detecting portion 51 A outputs a reset signal to group selecting portion 57 , moving direction detecting portion 53 , and moving distance measuring portion 55 , without outputting a determination signal to character selecting portion 61 .
  • Designated-position detecting portion 51 A then detects the designated position output from touch sensor 27 , and outputs the designated position to group selecting portion 57 , moving direction detecting portion 53 , moving distance measuring portion 55 , and character selecting portion 61 .
  • the user may designate another key incorrectly, for example the number “5” key.
  • the user may press down the number “2” key with his or her finger kept in contact with key rubber 25 A.
  • designated-position detecting portion 51 A outputs a reset signal to group selecting portion 57 , moving direction detecting portion 53 , and moving distance measuring portion 55 , and then outputs a designated position corresponding to the number “2” key to group selecting portion 57 , moving direction detecting portion 53 , moving distance measuring portion 55 , and character selecting portion 61 .
  • When receiving the reset signal, group selecting portion 57 cancels the group that has been selected till then, and outputs a cancel signal to character selecting portion 61 . Thereafter, it receives the designated position corresponding to the number “2” key.
  • the designated position corresponding to the number “2” key in this case is the first designated position, because it is the designated position firstly input after the reset signal is input. Therefore, group selecting portion 57 selects the group having the group name “G 2 ”, from among the ten groups included in the character table, on the basis of the designated position corresponding to the number “2” key, and outputs the group name of the selected group to character selecting portion 61 .
  • When receiving the reset signal from designated-position detecting portion 51 A, moving direction detecting portion 53 outputs a cancel signal to character type selecting portion 59 . Thereafter, when the designated position corresponding to the number “2” key is input as the first designated position, moving direction detecting portion 53 waits until the designated position for direction detection which is different from the first designated position is input next from designated-position detecting portion 51 A. Moving direction detecting portion 53 detects as a moving direction the direction from the first designated position toward the designated position for direction detection, and outputs the detected moving direction to character type selecting portion 59 .
  • When receiving the cancel signal from moving direction detecting portion 53 , character type selecting portion 59 stops outputting to character selecting portion 61 . Thereafter, when receiving the moving direction from moving direction detecting portion 53 , character type selecting portion 59 selects one of the four character types included in the character table on the basis of the input moving direction, and outputs the selected character type to character selecting portion 61 .
  • When receiving the reset signal from designated-position detecting portion 51 A, moving distance measuring portion 55 resets the moving distance to “0”. Thereafter, when the designated position corresponding to the number “2” key is input as the first designated position, moving distance measuring portion 55 uses the first designated position as a reference position to calculate, for every input of a designated position from designated-position detecting portion 51 A, a moving distance from the first designated position to the input designated position, and outputs the calculated moving distance to character selecting portion 61 .
  • Before the cancel signal is input from group selecting portion 57 , character selecting portion 61 has selected, from among the groups included in the character table stored in EEPROM 33 , the group having the group name “G 5 ” input from group selecting portion 57 .
  • character selecting portion 61 selects, from among the groups included in the character table stored in EEPROM 33 , the group having the group name “G 2 ” that is input from group selecting portion 57 , until a next cancel signal is input.
  • character selecting portion 61 selects a character that is classified as the character type “number” in the group, and outputs the selected character to display control portion 63 .
  • since the number “2” key has been pressed down, the number “2” is selected, and the character “2” is output to display control portion 63 .
  • Character selecting portion 61 refers to the character table stored in EEPROM 33 to select the plurality of characters that are classified in the group having the previously input group name and are further classified as the input character type. In this case, when “hiragana”, for example, is input as the character type, the five characters “か”, “き”, “く”, “け”, and “こ” are selected.
  • Character selecting portion 61 then selects one of the selected characters, on the basis of the moving distance received from moving distance measuring portion 55 , and outputs the selected character to display control portion 63 . Thereafter, when the determination signal is input from designated-position detecting portion 51 , character selecting portion 61 outputs a cursor movement instruction to display control portion 63 .
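How a moving distance picks one of the selected characters is not pinned down numerically in the text; one plausible sketch, assuming a fixed distance step per candidate (the step value is a placeholder, not from the patent):

```python
def select_by_distance(candidates, distance, step=20.0):
    """Advance one candidate per `step` units of moving distance,
    clamping at the last candidate (the step size is an assumption)."""
    index = min(int(distance // step), len(candidates) - 1)
    return candidates[index]

# With candidates "J", "K", "L", "5" (the number "5" key), a growing
# moving distance walks through the candidates in order.
print(select_by_distance(["J", "K", "L", "5"], 45.0))  # L
```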
  • Display control portion 63 controls LCD 15 to display a character input screen on LCD 15 .
  • display control portion 63 displays guidance on guidance displaying area 85 .
  • the group name “G 2 ” is input, and thus, the number “2” key is graphically displayed, and a plurality of characters included in each of the character types which will be selected when one of the upward, downward, leftward, and rightward moving directions is detected are displayed on its top, bottom, left, and right, respectively.
  • a plurality of characters of “A, B, C, 2” which are of the “uppercase alphanumeric” character type selected when the designated position is moved upward are displayed above the graphical display of the number “2” key.
  • A plurality of characters, “か, き, く, け, こ”, which are of the “hiragana” character type selected when the designated position is moved downward, are displayed below the graphical display of the number “2” key.
  • A plurality of characters, “カ, キ, ク, ケ, コ”, which are of the “katakana” character type selected when the designated position is moved to the left, are displayed on the left of the graphical display of the number “2” key.
  • a plurality of characters of “a, b, c, 2” which are of the “lowercase alphanumeric” character type selected when the designated position is moved to the right are displayed on the right of the graphical display of the number “2” key.
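The four-direction guidance described above amounts to a lookup from moving direction to character type and candidates. The table below is a sketch for the number “2” key; the hiragana and katakana candidate sets (か–こ, カ–コ) are assumed from the standard Japanese 12-key layout, since the scanned text does not reproduce them:

```python
# Hypothetical guidance table for the number "2" key: a detected moving
# direction selects a character type and its candidate characters.
GUIDANCE_KEY_2 = {
    "up":    ("uppercase alphanumeric", ["A", "B", "C", "2"]),
    "down":  ("hiragana",               ["か", "き", "く", "け", "こ"]),
    "left":  ("katakana",               ["カ", "キ", "ク", "ケ", "コ"]),
    "right": ("lowercase alphanumeric", ["a", "b", "c", "2"]),
}

def candidates_for(direction):
    """Return (character type, candidates) for a detected moving
    direction, or None if the direction is not one of the four."""
    return GUIDANCE_KEY_2.get(direction)
```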
  • Display control portion 63 displays the character on cursor area 83B. That is, when the number “5” key is first designated, the number “5” is displayed. Thereafter, while the user is moving his or her finger from the number “5” key to the number “2” key, the display on cursor area 83B changes successively to “J”, “K”, “L”, and “5”. Once the number “2” key is pressed down, however, the number “2” is displayed without the cursor moving in cursor area 83B. Thereafter, when the user moves his or her finger upward, downward, leftward, or rightward, the displayed character is switched to a character of the character type corresponding to the designated direction.
  • FIG. 9 is a flowchart illustrating an example of the flow of the character input process according to the modification.
  • The flowchart of the modification differs from the flowchart shown in FIG. 7 in that steps S31 to S33 have been added.
  • the other processes are identical to the processes shown in FIG. 7 , and thus, description thereof will not be repeated here.
  • In step S31, which follows step S12 in which the moving distance is reset, it is determined whether a key has been pressed down. If a key has been pressed down, the process proceeds to step S32; otherwise, the process proceeds to step S13.
  • In step S32, the group determined in step S02 and the character type determined in step S08 are canceled.
  • In step S33, the designated position corresponding to the pressed key is acquired, and the process returns to step S02.
  • When key circuit board 25C detects that one of the 12 keys has been pressed down, the designated position being detected by touch sensor 27 at that time is acquired, and the processes in step S02 and the subsequent steps are executed.
  • A user can move his or her finger to the correct key with the finger kept in contact with key rubber 25A and press down that key, thereby changing the designated key to the correct one. This enables easy correction of a falsely selected character.
  • a correct group can be readily selected even when another group has been falsely selected.
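A sketch of the modification's correction flow (steps S31 to S33 above), using a hypothetical state dictionary rather than the patent's actual internal representation:

```python
def on_key_pressed(state, designated_position):
    """When a key press is detected after the moving-distance reset
    (step S31), cancel the group chosen in step S02 and the character
    type chosen in step S08 (step S32), then re-acquire the designated
    position so that group selection restarts from it (step S33)."""
    state["group"] = None                               # cancel S02 result
    state["char_type"] = None                           # cancel S08 result
    state["designated_position"] = designated_position  # re-acquired in S33
    return state
```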
  • moving distance measuring portion 55 measures a moving distance of the designated position.
  • a designation time measuring portion may be provided which measures, after a moving direction is detected by moving direction detecting portion 53 , a duration in which a designated position is designated. In this case, the designation time measuring portion outputs the measured time to character selecting portion 61 . When receiving a reset signal from designated-position detecting portion 51 , the designation time measuring portion resets the measured time to “0”.
  • Character selecting portion 61 selects one of the selected characters on the basis of the designating time input from the designation time measuring portion, and outputs the selected character to display control portion 63 .
  • Character selecting portion 61 sequences the selected characters in advance. It selects the first character in the specified sequence when the designating time is zero, and selects the next character in the sequence whenever the designating time increases by a predetermined time. When the designating time increases by the predetermined time after the last character in the sequence has been selected, the first character is selected again, so that the characters are cycled through repeatedly. For example, at the point in time when a character type is selected, character selecting portion 61 initially selects the first character in a predetermined sequence. While the sequence may be determined arbitrarily in advance, it is here assumed that characters are selected in Japanese alphabetical order, alphabetical order, and ascending order of numbers. The character to be selected first may also be determined on the basis of the history of characters selected in the past.
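This time-driven cycling with wraparound can be sketched as follows; the period value is a placeholder, since the patent only calls it "a predetermined time":

```python
def select_by_time(candidates, designating_time, period=0.8):
    """Select the first candidate at time 0, advance one candidate per
    `period` seconds, and wrap back to the first after the last
    (the period of 0.8 s is an assumption)."""
    index = int(designating_time // period) % len(candidates)
    return candidates[index]
```

With five hiragana candidates, holding the designated position for four periods reaches the last candidate, and one more period wraps back to the first.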
  • the present invention may of course be understood as a method for inputting a character, for executing the processes shown in FIG. 7 or FIG. 9 , or a program for inputting a character, for causing a computer to execute the method for inputting a character.
  • said second selecting means reverses the order of selecting the one of the at least one character classified in said selecting group when said key input accepting means accepts the input of said predetermined key.
  • a method for inputting a character comprising the steps of
  • a program for inputting a character causing a computer to perform the steps of

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Input From Keyboards Or The Like (AREA)
  • Telephone Function (AREA)
  • User Interface Of Digital Computer (AREA)
  • Character Discrimination (AREA)
US12/680,309 2007-09-28 2008-09-24 Character Input Device, and Method and Program for Inputting Character Abandoned US20100231521A1 (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JP2007255290A JP5207699B2 (ja) 2007-09-28 2007-09-28 Character input device, character input method, and character input program
JP2007-255290 2007-09-28
PCT/JP2008/067162 WO2009041420A1 (ja) 2007-09-28 2008-09-24 Character input device, character input method, and character input program

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2008/067162 A-371-Of-International WO2009041420A1 (ja) 2007-09-28 2008-09-24 Character input device, character input method, and character input program

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US14/474,031 Continuation US9442655B2 (en) 2007-09-28 2014-08-29 Character input device, and method and program for inputting character

Publications (1)

Publication Number Publication Date
US20100231521A1 true US20100231521A1 (en) 2010-09-16

Family

ID=40511304

Family Applications (3)

Application Number Title Priority Date Filing Date
US12/680,309 Abandoned US20100231521A1 (en) 2007-09-28 2008-09-24 Character Input Device, and Method and Program for Inputting Character
US14/474,031 Active US9442655B2 (en) 2007-09-28 2014-08-29 Character input device, and method and program for inputting character
US15/263,232 Abandoned US20170031458A1 (en) 2007-09-28 2016-09-12 Character input device, and method and program for inputting character

Family Applications After (2)

Application Number Title Priority Date Filing Date
US14/474,031 Active US9442655B2 (en) 2007-09-28 2014-08-29 Character input device, and method and program for inputting character
US15/263,232 Abandoned US20170031458A1 (en) 2007-09-28 2016-09-12 Character input device, and method and program for inputting character

Country Status (4)

Country Link
US (3) US20100231521A1 (ja)
JP (1) JP5207699B2 (ja)
KR (2) KR101186784B1 (ja)
WO (1) WO2009041420A1 (ja)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110029901A1 (en) * 2009-07-31 2011-02-03 Brother Kogyo Kabushiki Kaisha Printing apparatus, composite image data generating apparatus, and composite image data generating program
US8316319B1 (en) * 2011-05-16 2012-11-20 Google Inc. Efficient selection of characters and commands based on movement-inputs at a user-interface
EP3001285A4 (en) * 2013-05-22 2016-11-02 Xiaomi Inc Input method and system

Families Citing this family (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2011118507A (ja) * 2009-12-01 2011-06-16 Mitsubishi Electric Corp Character input device
JP5136916B2 (ja) * 2010-05-13 2013-02-06 NEC Infrontia Corporation Character input method, character input device, and character input program
JP5461335B2 (ja) * 2010-07-28 2014-04-02 Kyocera Corporation Electronic device
JP2012083838A (ja) * 2010-10-07 2012-04-26 Nec Casio Mobile Communications Ltd Character input device and character input method
JP2012084086A (ja) * 2010-10-14 2012-04-26 Kyocera Corp Portable electronic device, control method of portable electronic device, and program
JP5891540B2 (ja) * 2011-10-05 2016-03-23 Sharp Corporation Character input device, character input method, and program
KR101323281B1 (ko) 2012-04-06 2013-10-29 Korea University Industry-Academic Cooperation Foundation Input device and character input method
US10671272B2 (en) * 2015-11-06 2020-06-02 International Business Machines Corporation Touchscreen oriented input integrated with enhanced four-corner indexing
TWI629405B (zh) * 2017-07-18 2018-07-11 朕豪工業股份有限公司 Door closer
JP7053317B2 (ja) * 2018-03-14 2022-04-12 Seiko Solutions Inc. Electronic device
WO2023248323A1 (ja) * 2022-06-21 2023-12-28 雄介 山内 Character input method, character input program, and character input device

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6271835B1 (en) * 1998-09-03 2001-08-07 Nortel Networks Limited Touch-screen input device
US20020145587A1 (en) * 1998-06-23 2002-10-10 Mitsuhiro Watanabe Character input device and method
US20050089226A1 (en) * 2003-10-22 2005-04-28 Samsung Electronics Co., Ltd. Apparatus and method for letter recognition
US6980200B2 (en) * 2000-06-13 2005-12-27 Michael Goren Rapid entry of data and information on a reduced size input area
US20060007162A1 (en) * 2001-04-27 2006-01-12 Misawa Homes Co., Ltd. Touch-type key input apparatus
US7136047B2 (en) * 2003-04-09 2006-11-14 Microsoft Corporation Software multi-tap input system and method
US20070152978A1 (en) * 2006-01-05 2007-07-05 Kenneth Kocienda Keyboards for Portable Electronic Devices
JP2007193465A (ja) * 2006-01-18 2007-08-02 Sharp Corp 入力装置
US7554529B2 (en) * 2005-12-15 2009-06-30 Microsoft Corporation Smart soft keyboard
US8405601B1 (en) * 1999-06-09 2013-03-26 Malvern Scientific Solutions Limited Communication system and method

Family Cites Families (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4019512B2 (ja) 1998-08-11 2007-12-12 Sony Corporation Character input device, character input method, and information recording medium storing a program having a character input function
JP2000112636A (ja) 1998-10-07 2000-04-21 Kanazawa Engineering Systems:Kk Kana character input device
US6204848B1 (en) * 1999-04-14 2001-03-20 Motorola, Inc. Data entry apparatus having a limited number of character keys and method
US6597345B2 (en) * 2000-03-03 2003-07-22 Jetway Technologies Ltd. Multifunctional keypad on touch screen
JP2001265498A (ja) * 2000-03-15 2001-09-28 Toshiba Corp Information processing device, input device, and input operation processing method
JP2002000329A (ja) 2000-06-21 2002-01-08 Casio Comput Co Ltd Electronic device case
JP2002091676A (ja) * 2000-09-13 2002-03-29 Sanyo Electric Co Ltd Input device
JP2002108543A (ja) * 2000-09-21 2002-04-12 Nokia Mobile Phones Ltd Kana character input method
CN1308803C (zh) * 2002-05-21 2007-04-04 Koninklijke Philips Electronics N.V. Electronic apparatus for object entry
JP3797977B2 (ja) * 2003-03-17 2006-07-19 Creo Co., Ltd. Character input device, character input method, and character input program
JP2004355336A (ja) 2003-05-29 2004-12-16 Misawa Homes Co Ltd Key input device
JP2005032189A (ja) * 2003-07-11 2005-02-03 Sony Corp Character input control method, character input program, and character input device
JP2005128802A (ja) * 2003-10-23 2005-05-19 Sony Ericsson Mobilecommunications Japan Inc Portable electronic device
JP2005182487A (ja) * 2003-12-19 2005-07-07 Nec Software Chubu Ltd Character input device, method, and program
JP2007128802A (ja) 2005-11-07 2007-05-24 Toyota Motor Corp Fuel cell system
US10521022B2 (en) * 2006-03-17 2019-12-31 Conversant Wireless Licensing S.a.r.l. Mobile communication terminal and method therefor

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110029901A1 (en) * 2009-07-31 2011-02-03 Brother Kogyo Kabushiki Kaisha Printing apparatus, composite image data generating apparatus, and composite image data generating program
US8837023B2 (en) * 2009-07-31 2014-09-16 Brother Kogyo Kabushiki Kaisha Printing apparatus, composite image data generating apparatus, and composite image data generating program
US8316319B1 (en) * 2011-05-16 2012-11-20 Google Inc. Efficient selection of characters and commands based on movement-inputs at a user-interface
EP3001285A4 (en) * 2013-05-22 2016-11-02 Xiaomi Inc Input method and system
US9703479B2 (en) 2013-05-22 2017-07-11 Xiaomi Inc. Input method and device using same

Also Published As

Publication number Publication date
US20140368439A1 (en) 2014-12-18
KR101186784B1 (ko) 2012-09-27
JP5207699B2 (ja) 2013-06-12
KR20100053699A (ko) 2010-05-20
US9442655B2 (en) 2016-09-13
US20170031458A1 (en) 2017-02-02
KR20110115179A (ko) 2011-10-20
WO2009041420A1 (ja) 2009-04-02
KR101148688B1 (ko) 2012-05-25
JP2009086981A (ja) 2009-04-23

Similar Documents

Publication Publication Date Title
US9442655B2 (en) Character input device, and method and program for inputting character
TWI396127B (zh) Electronic device and method for simplified character input using an on-screen keyboard
US8745518B2 (en) Touch screen input recognition and character selection
TWI420889B (zh) Electronic device and method for symbol input
US8525779B2 (en) Character input device
EP2073508B1 (en) A portable electronic apparatus, and a method of controlling a user interface thereof
US20120256858A1 (en) Character input device, character-input control method, and storage medium storing character input program
KR20080113913A (ko) Method and apparatus for input in a terminal having a touch screen
US9539505B2 (en) Game device and computer-readable storage medium
KR20100062899A (ko) Method and apparatus for input using a touch pattern
US20110205177A1 (en) Portable device, method of detecting operation, and computer-readable storage medium storing program for detecting operation
US6720951B2 (en) Key customizing method and portable terminal device
KR100742730B1 (ko) Menu execution method for a mobile communication terminal and the mobile communication terminal
KR20130042675A (ko) Apparatus and method for braille input on a portable terminal
JP5104659B2 (ja) Input device, portable terminal device, and input method for the input device
JP2014167712A (ja) Information processing device, information processing method, and program
JP6408665B2 (ja) Character input device, character input method, and character input program
JP6208808B2 (ja) Character input device, character input method, and character input program
JP5542906B2 (ja) Character input device, character input method, and character input program
JP5934280B2 (ja) Character input device, character input method, and character input program
JP2011239463A (ja) Character input device, character input method, and character input program
JP2002251246A (ja) Character input device for an information processing terminal
KR100644045B1 (ko) Method and apparatus for processing information displayed on a screen
JP5395599B2 (ja) Information processing device, input device, and input method for the information processing device
KR20140020570A (ko) Method of operating a personal portable terminal having a touch panel

Legal Events

Date Code Title Description
AS Assignment

Owner name: KYOCERA CORPORATION, JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:NISHIMURA, OSAMU;REEL/FRAME:024146/0908

Effective date: 20100323

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION