JP6135947B2 - Character input system - Google Patents

Character input system

Info

Publication number: JP6135947B2 (application JP2015086112A)
Authority: JP (Japan)
Prior art keywords: character, coordinate, pressing, press, input
Legal status: Active (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Other languages: Japanese (ja)
Other versions: JP2015133155A (en)
Inventor: 小川 耕太
Original assignee: マイクロソフト テクノロジー ライセンシング,エルエルシー (Microsoft Technology Licensing, LLC)
Priority: JP2007103979 (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Publication of application: JP2015133155A; application granted; publication of grant: JP6135947B2


Description

  The present invention relates to a character input system and to a mobile terminal device, particularly a mobile phone, provided with the system.

  In recent years, electronic devices have become smaller and more sophisticated. Mobile phones in particular are now widely used not only for voice calls but also for mail and Internet communication, with the numeric keys originally provided for dialing also serving for character input.

  Since there are only ten numeric keys (0 to 9) for dialing, kana characters, alphabetic characters, and symbols are usually assigned to these keys, and the user switches among number, kana, and alphabet input modes. In addition, symbols are assigned to the "#" key, and a lowercase (small-character) switching function is assigned to the "*" key. When a character is input, its attribute information is specified by which numeric key is pressed, and its detailed information by the number of presses. A specific example is described below.

  FIG. 13 is a plan view showing an example of a conventional cellular phone. The mobile phone 50 includes a housing 1, a numeric keypad 2, a four-way key 3, a control key 4, a display unit 5, a speaker 11, and a microphone 12.

  The numeric keypad 2 consists of ten numeric keys plus the "*" and "#" keys. A plurality of kana characters and alphabetic characters are assigned to each key, and the assigned characters are printed on it. A single key must therefore serve to input several different characters, and the particular character assigned to a key is specified by the number of consecutive presses of that key.

  For example, to input a kana string such as "tokkyo", the user first presses the [4] key, to which the "ta" row of the kana syllabary is assigned, five times in succession to input "to". Since the next character, the small "tsu", belongs to the same "ta" row as "to", the [→] key is pressed to indicate that the consecutive presses of the same key have ended; the [4] key is then pressed three times to input "tsu", and the "*" key is pressed to convert it to lower case. Next, the [2] key is pressed twice to input "ki", the [8] key is pressed three times to input "yo", and the "*" key is pressed to convert it to lower case. The above keystrokes complete the character string "tokkyo", after which kanji conversion is performed using the four-way key.

  For the English alphabet, the 26 letters are divided into groups of three or four, and one group is assigned to each key of the numeric keypad; for example, the group "ABC" is assigned to the [2] key. Each letter within a group is distinguished by the number of consecutive presses of the same key. Switching between hiragana and the alphabet is performed with a control key, and various symbols are input according to the number of consecutive presses of the symbol key [#] (see Patent Document 1 and Patent Document 2).
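The multi-tap scheme described above can be sketched in code. This is an illustrative reconstruction, not code from the patent; the key-to-group table covers only a few of the alphabet keys, following the "ABC"-on-[2] example.

```python
# Illustrative sketch of conventional multi-tap input: each numeric key
# carries a group of characters, and the number of consecutive presses
# of the same key selects one character from that group.
KEY_GROUPS = {
    "2": "ABC",  # the group "ABC" is assigned to the [2] key
    "3": "DEF",
    "4": "GHI",
}

def multi_tap(key: str, presses: int) -> str:
    """Return the character selected by pressing `key` `presses` times."""
    group = KEY_GROUPS[key]
    # Extra presses wrap around: a fourth press of a three-letter key
    # returns to the first letter.
    return group[(presses - 1) % len(group)]
```

For instance, `multi_tap("4", 2)` selects "H", illustrating the multiple-press cost per character that the invention aims to remove.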

JP-A-6-152711; JP-A-8-79360

  As described above, the conventional method of reusing the dialing keypad of a mobile terminal device for character input generally assigns a plurality of characters to each numeric key and distinguishes them by the number of presses. Consequently, selecting a character in the fifth position of a row, such as "to", requires five pressing operations, and the user must keep count of the presses to identify the character. Furthermore, when characters belonging to the same row of the kana syllabary are input consecutively, the [→] key must be pressed each time. As a result, inputting the four characters of "tokkyo", for example, requires 16 keystrokes, and since keystrokes are usually made with the thumb alone, character input takes a long time.

  The present invention has been made in view of the above problems, and an object thereof is to provide a character input system that allows characters to be input in a short time with fewer operations, even with a single finger, and a mobile terminal device, particularly a mobile phone, provided with the system.

  The character input system of the present invention comprises a touch panel that detects a pressed position and transmits it as position information, processing means that receives the position information, identifies one character, and transmits it as character information, and a display unit that receives the character information and displays the identified character, wherein the processing means identifies one character based on the position information obtained from the start to the end of one pressing operation.

  Here, the display unit is a display panel such as a liquid crystal screen and has the function of displaying information such as the characters output from the processing means. For a display that does not emit light itself, a light-emitting unit such as a backlight is also included.

  The touch panel is, for example, a pressure-detection panel that detects a pressing operation; a capacitive or resistive-film type can be used. "Pressing" means touching the touch panel and is also referred to as tapping. When the touch panel is pressed with a finger or the like, the touched location is converted into X-axis and Y-axis coordinate information and transmitted to the processing means as position information. As with conventional mobile phones, it is desirable from the viewpoint of operability that numbers, a telephone mark, and the like be printed on the surface at locations corresponding to keys such as the numeric keys. To further improve tactile operability, each key and the center key may be given raised or recessed surfaces.

  The touch panel may also serve as a display and show numbers or other characters as the situation requires. In this case, the display unit and the touch panel may be arranged so as to overlap entirely or partially; that is, the display unit and the touch panel appear to be one and the same. With such an arrangement, the mobile device can be designed more compactly and both the screen and the keys can be made larger. Furthermore, the arrangement of the keys can be customized to the user's preference.

  "Position information obtained from the start to the end of one pressing operation" includes the position information of the point where the pressing operation starts, that of the point where it ends, and, when the pressed position changes, the position information along its trajectory. Using the timing of the press as an additional element for character identification, besides the position information, is not excluded.

  It is desirable that the processing means be able to identify one character solely from the position information of the point where one pressing operation starts and that of the point where it ends.

  More specifically, the attribute information of one character can be identified from the position information of the point where a pressing operation starts, and the detailed information of the character from the difference between the position information of that starting point and that of the ending point.

  Here, attribute information is information that classifies characters into groups: for kana characters, the consonant, i.e. the row in the kana syllabary; for the alphabet, the group into which the 26 letters are divided. Detailed information is information that distinguishes characters within the same group: for kana characters, the vowel, i.e. the column in the kana syllabary; for the alphabet, one particular letter within a group.

  The character input system can be provided in a mobile terminal device.

  Here, a mobile terminal device means an electronic device that has a display screen, is subject to strong demand for miniaturization, and therefore cannot easily provide a large number of character input keys, such as a mobile phone, a PHS, or a PDA.

  Compared with a conventional keystroke, a touch panel yields far more information from the start of a press to its end. According to the present invention, a character is input via the touch panel, and the processing means identifies a single character from the position information obtained from the start to the end of one pressing operation and displays it on the display unit. One character can thus be specified by a single operation, and the character is determined at the moment the finger is released. The user therefore does not need to count keystrokes. Even when characters belonging to the same row of the kana syllabary are input consecutively, there is no need to press the [→] key; one character is input per operation, enabling high-speed character input.

  Furthermore, if one character is identified solely from the position information of the points at which a pressing operation starts and ends, the processing is simple, so the program and the device can be simplified. Since the user need only be aware of the start and end positions of the press, input is simple and easy to learn.

  Furthermore, by having the processing means identify the attribute information of a character from the position information of the point where a pressing operation starts, and its detailed information from the difference between the position information of that starting point and that of the ending point, the user can migrate easily from the conventional input method. This is because the conventional method also specifies a character's attribute information from the pressed position, so attribute information similar to the conventional assignment can be placed at the positions corresponding to the conventional numeric keys.

  A mobile terminal device provided with this character input system can input characters in a short time with fewer operations, even with a single finger, while satisfying the demand for miniaturization.

FIG. 1 is a front view showing an embodiment of a mobile phone provided with the character input system of the present invention.
FIG. 2 is a block diagram illustrating the mobile phone in FIG. 1.
FIG. 3 is a flowchart showing the flow of identifying the row of a character in the kana syllabary.
FIG. 4 is a diagram showing the criteria for identifying the row of a character in the kana syllabary.
FIG. 5 is a diagram showing the criteria for identifying the column of a character in the kana syllabary.
FIG. 6 is a diagram showing a guide display.
FIG. 7 is a diagram showing an example of the relationship between a guide display and a criteria table.
FIG. 8 is a diagram showing another guide display.
FIG. 9 is a diagram showing a fan-shaped criteria table.
FIG. 10 is a flowchart showing the flow of identifying the column of a character in the kana syllabary.
FIG. 11 is a diagram showing another guide display.
FIG. 12 is a diagram showing another guide display.
FIG. 13 is a front view showing a conventional mobile phone.

  Next, embodiments of the present invention will be described with reference to the drawings. Parts similar to those of a conventional mobile terminal device are given the same reference numerals, and their description is omitted where appropriate.

  FIGS. 1A, 1B, and 1C show the configurations of mobile phones 10, 20, and 30 according to embodiments of the present invention. Unlike conventional mobile phones, the mobile phones 10, 20, and 30 include touch panels 6, 7, and 8, respectively, as input means.

  In the mobile phone 10, only the conventional numeric keypad is replaced by the touch panel 6; the characters printed on a conventional numeric keypad are printed on the touch panel 6 at the positions corresponding to the conventional numeric keys. The four-way key 3, the control key 4, and the display unit 5 are the same as those of the conventional mobile phone 50.

  In the mobile phone 20, the entire input unit is the touch panel 7. The characters printed on the corresponding conventional keys are printed on the touch panel 7 at the positions corresponding to the numeric keys, four-way key, and control keys of a conventional keypad. The display unit 5 is the same as that of the conventional mobile phone 50.

  The mobile phone 30 includes a touch panel 8 that also serves as the display unit over its entire surface. Characters and the like are displayed as appropriate at the positions corresponding to the conventional numeric keypad, four-way key, control keys, and display unit.

  FIG. 2 is a block diagram showing the functional configuration of the mobile phone 10. The mobile phone 10 includes at least the touch panel 6, a CPU 9 as processing means, and the display unit 5. Since the mobile phone 20 has the same configuration, its illustration is omitted. In the mobile phone 30 the display unit also serves as the touch panel, but the functional configuration is the same as that of the mobile phone 10, so its illustration is likewise omitted.

  An input operation for entering one character with the mobile phone 10 will now be described with reference to the drawings, taking the hiragana character "tsu" as an example. The input method is the same for the mobile phones 20 and 30, so their description is omitted.

  On the display unit of the mobile phone 10, the CPU 9 displays a plurality of items of attribute information, in this case a plurality of characters indicating the hiragana rows: specifically "A", "KA", "SA", "TA", "NA", "HA", "MA", "YA", "RA", "WA", and a character indicating symbols. First, the user designates the row in the kana syllabary that is the attribute information of the character "tsu" to be input; specifically, the user presses "TA" on the touch panel 6 with a finger or the like. The position information of the point where the pressing operation starts is then transmitted from the touch panel 6 to the CPU 9 as coordinate information (x1, y1), and the CPU 9 receives it (step S1 in FIG. 3).

  The CPU 9 identifies the row of the character in the kana syllabary from the coordinate information (x1, y1) based on the criteria table 21 shown in FIG. 4 (steps S4 to S14). The criteria table 21 corresponds to the coordinates of the numeric keypad drawn on the touch panel 6.

  More specifically, when the CPU 9 receives the coordinate information (x1, y1) (step S1), it first determines that the input is not character input if (x1, y1) lies outside a certain range, i.e. outside ((xa < x1 < xd) and (ya < y1 < ye)) (steps S2 to S3).

  If (x1 < xb) (step S4), the row is identified as the "a" row if (y1 < yb), the "ta" row if (y1 < yc), and the "ma" row if (y1 < yd); otherwise the input is not determined to be character input (steps S5 to S7).

  Otherwise, if (x1 < xc) (step S8), the row is identified as the "ka" row if (y1 < yb), the "na" row if (y1 < yc), the "ya" row if (y1 < yd), and the "wa" row otherwise (steps S9 to S11).

  Otherwise, the row is identified as the "sa" row if (y1 < yb), the "ha" row if (y1 < yc), and the "ra" row if (y1 < yd); in other cases the input is not determined to be character input (steps S12 to S14).

  In this example, "TA" is pressed on the touch panel 6, and the coordinates of the pressed position satisfy (xa < x1 < xb) and (yb < y1 < yc) in light of the criteria table 21, so by steps S1 to S6 the character is identified as belonging to the "ta" row.
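Steps S1 to S14 amount to a two-level threshold lookup on the press-start coordinates. The sketch below is a reconstruction under assumed threshold values (the patent leaves xa..xd and ya..ye device-specific); the row layout follows the keypad arrangement described above.

```python
# Illustrative thresholds for the criteria table 21 (assumed values; the
# patent only names them xa..xd and ya..ye).
XA, XB, XC, XD = 0, 100, 200, 300
YA, YB, YC, YD, YE = 0, 100, 200, 300, 400

# Kana rows laid out as on the numeric keypad drawn on the touch panel;
# None marks positions ("*", "#") that are not kana rows.
ROW_TABLE = [
    ["a", "ka", "sa"],   # ya <= y < yb
    ["ta", "na", "ha"],  # yb <= y < yc
    ["ma", "ya", "ra"],  # yc <= y < yd
    [None, "wa", None],  # yd <= y < ye
]

def identify_row(x1: float, y1: float):
    """Return the kana row for a press start point, or None (steps S2-S14)."""
    if not (XA < x1 < XD and YA < y1 < YE):
        return None                      # outside the keypad: not character input
    col = 0 if x1 < XB else 1 if x1 < XC else 2
    row = 0 if y1 < YB else 1 if y1 < YC else 2 if y1 < YD else 3
    return ROW_TABLE[row][col]
```

With these assumed thresholds, a press at (50, 150) falls in the first column, second band, and is identified as the "ta" row, matching the example in the text.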

  Next, to determine the column (vowel) of the character in the kana syllabary, the user moves the finger while keeping it pressed, in accordance with the criteria table 22 shown in FIG. 5. That is, the finger is moved up for the "i" column, right for the "u" column, down for the "e" column, and left for the "o" column, and is not moved for the "a" column. In addition, a lowercase character such as the small "tsu" can be specified by shifting the finger to the right by more than a certain distance (xr). In this example the vowel is "u", so the finger is shifted to the right by a certain distance (more than xq and less than xr). At this time, from the viewpoint of operability, a guide display 23 as shown in FIG. 6 is shown at an appropriate place on the display unit 5, for example the area where the input character is to be displayed, or, on the touch panel 8 that also serves as the display unit, at the pressed location. In particular, if the guide display appears at the pressed location or where the detailed information is shown, the subsequent operation can be performed by moving the finger as if tracing it. The guide display represents the detailed information belonging to the identified attribute information and shows which character will be specified if the operation follows the guide; in other words, the criteria table 22 is rendered on the touch panel 8. Naturally, part of the attribute information may be hidden under the overlapping guide display 23.

  Although the guide display 23 as a whole is cross-shaped, with the detailed information based on the identified attribute information shown above, below, left, and right of the display indicating that attribute information, its ranges indicating the detailed information are narrower than those of the actual criteria table 22. A guide display range narrower than the criteria table is no obstacle to use: for example, as shown in FIG. 7, the "i" region of the guide display 23 is narrower than the "i" range 25i of the criteria table 25, which is not actually shown on the screen. Conversely, the "a" row need only be determined when the finger has hardly moved, yet the guide display must be large enough to be easily seen; therefore, as shown in FIG. 7, the "a" region of the guide display is wider than the "a" range 25a of the criteria table 25, which again is not shown on the screen. In this case the "a" range of the criteria table 25 is a circle, so that the critical distance of the "a" range from the pressing start point is constant. In this way, the criteria table and the guide display need not coincide. Furthermore, as in the guide display 26 shown in FIG. 8, the currently selected detailed information 26i may be highlighted by changing its color or enlarging it. Detailed information nearer the finger may also be enlarged, or some items of detailed information may be allowed to overlap. Moreover, if the guide display moves opposite to the finger when the pressed position shifts, the relative movement on the screen is large even when the finger moves only slightly.

  When the user releases the press on the touch panel 6 at an arbitrary point (lifts the finger or the like), the position information of the point where the pressing operation ended is transmitted from the touch panel 6 to the CPU 9 as coordinate information (x2, y2), which the CPU 9 receives. From the difference (x3 = x2 − x1, y3 = y2 − y1) between the coordinate information (x1, y1) of the press start point and the coordinate information (x2, y2) of the press release point, the CPU 9 identifies the column of the character in the kana syllabary based on the criteria table 22 shown in FIG. 5.

  Referring to FIG. 10 in detail, when the CPU 9 receives the coordinate information (x2, y2) (step S20), it first calculates the difference coordinates (x3, y3) (step S21).

  If the amount of movement in the vertical direction is larger than in the horizontal direction (|x3| < |y3|) (step S22), the "e" column is identified if the vertical difference coordinate exceeds a certain value (y3 > yq), and the "i" column if it is below a certain value (y3 < −yq) (steps S23 and S24); otherwise the "a" column is identified (step S25).

  Otherwise, the "u" column is identified if the horizontal difference coordinate exceeds a certain value (x3 > xq), and the "o" column if it is below a certain value (x3 < −xq) (steps S26, S27). In the case of the "ta" row, if the finger has moved further to the right by more than a certain amount (x3 > xr), a lowercase character such as the small "tsu" is identified (step S28).

If the finger is released without substantially moving from the point where the press started ((|x3| < xp) and (|y3| < yp)), none of the above applies, so the "a" column is identified (steps S25, S29).
When one character has been identified by the above routine, the CPU 9 transmits it to the display unit 5 as character information, and the display unit 5 receives the character information and displays the identified character.
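Steps S20 to S29 can likewise be sketched as a direction test on the difference coordinates. The threshold values, the sign convention (screen y increasing downward, so an upward flick gives a negative y3), and the `"small-tsu"` marker for the lowercase case are all assumptions made for illustration.

```python
# Illustrative sketch of column (vowel) identification from the difference
# (x3, y3) between the press-release and press-start coordinates.
XQ = YQ = 30    # minimum movement to count as a deliberate shift (assumed)
XR = 120        # farther right: lowercase, e.g. small "tsu" on the ta row

def identify_column(row: str, x3: float, y3: float) -> str:
    if abs(x3) < abs(y3):                    # mostly vertical movement (S22)
        if y3 < -YQ:
            return "i"                       # upward shift
        if y3 > YQ:
            return "e"                       # downward shift
    else:                                    # mostly horizontal movement
        if x3 > XR and row == "ta":
            return "small-tsu"               # far right on the ta row (S28)
        if x3 > XQ:
            return "u"                       # rightward shift
        if x3 < -XQ:
            return "o"                       # leftward shift
    return "a"                               # barely moved (S25, S29)
```

For the "tsu" example in the text, a press on "TA" followed by a rightward shift of more than xq but less than xr gives the "u" column, i.e. `identify_column("ta", 60, 10)` returns "u".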

  After a kana character is input, the system enters a standby state for the next character. The user can also enter the kanji conversion routine by pressing the "↓" key, or switch to katakana or alphabet input mode with the "change character type" key, but these operations are the same as on a conventional mobile phone.

  In another embodiment, as shown in FIG. 9, the detailed information based on the identified attribute information may be developed as a fan-shaped image around the display 26a indicating that attribute information ("a" in the example in the figure), and the detailed information may be determined by hit testing against the criteria table. In this case, if a fan-shaped guide display 26b substantially matching the criteria table is shown, the detailed information can be specified by direction alone, as with the cross shape, and when operating with a finger the guide display is not hidden by the finger, so it can be used comfortably.

  As described above, in the embodiment of the present invention, a single hiragana character is identified from the coordinates where the press started and the coordinates where it was released, so one character can be specified by a single sequence of pressing the touch panel with a finger, shifting it, and releasing it. Moreover, by using not only the direction of movement but also its distance as a criterion, as in the example of the small "tsu" described above, many characters can be input with fewer operations.

  Although one embodiment of the present invention has been described above, appropriate values should be chosen for thresholds such as xq and yq and for the criteria. The information used for the criteria can likewise be chosen appropriately: besides the pressed position and the direction of movement, the shape of the trajectory drawn by the pressed position as the finger moves, the distance moved, and the duration of the press can be used.

  For example, as with the criterion 24 shown in FIG. 12, a character can be specified by the distance moved regardless of the direction in which the finger moves. Even with such a criterion, it is desirable to show a guide display corresponding to the criteria table. A criterion based on a linear criteria table, in which the detailed information based on the designated attribute information is arranged in a straight line, is also possible; in this case the finger slides in substantially the same direction whichever detailed information is selected.

  Furthermore, by changing the criteria appropriately, characters other than hiragana, such as katakana, alphabetic characters, and numbers, can be specified by a single sequence of operations without pressing the "change character type" key. For example, the character type, such as hiragana, katakana, or alphabet, can be selected by the distance between the coordinates where the press started and those where it was released; a press held for a certain time can be judged to be a lowercase character; or a press held at one point for a certain time can change the character type. By incorporating, in addition to the position information from the start of the press until its release, the press duration and the time spent at particular coordinates, more information can be packed into the single step from the start of a press to its release.
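As one illustration of the variations described above (not taken from the patent), total travel distance could select the character type while a long stationary press selects lower case; every threshold and type assignment here is an assumption.

```python
# Hypothetical sketch: selecting the character type by travel distance,
# with a long stationary press signalling lower case. Thresholds and the
# hiragana/katakana/alphabet bands are illustrative assumptions only.
import math

def classify(x3: float, y3: float, press_seconds: float) -> str:
    dist = math.hypot(x3, y3)        # total distance from press start to release
    if press_seconds > 1.0 and dist < 10:
        return "lowercase"           # long press at essentially one point
    if dist < 80:
        return "hiragana"
    if dist < 160:
        return "katakana"
    return "alphabet"
```

The point of the sketch is that the pair (distance, duration) already carries enough information to fold the "change character type" key into the single press-to-release step.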

  Alternatively, the character type may be changed when the direction of movement turns 90 degrees midway while the finger moves under pressure, or the characters assigned to a numeric key (for example "Ta, Chi, Tsu, and G, H, I, 4") may be displayed in order when a circular trajectory is drawn while pressing. With these methods, various characters can be input in a few steps and the input speed can be increased. Many other variations are possible, such as pressing two places simultaneously with two fingers and using one as attribute information and the other as detailed information, or acquiring the attribute information with a first press, showing the guide display, and determining the detailed information with a second press.

  Further, when this is applied to kanji conversion, particularly predictive conversion, as shown in FIG. 11, a guide display 23b of words or character strings that are prediction candidates is shown below the detailed information selected in the guide display 23; by sliding downward while pressing, as indicated by the arrow A1, the character string to be input can be determined within the operation from a single press to its release. At this time, for a candidate character in a column, the group of predicted character strings may be displayed, for example, when a certain time has elapsed after that column is pressed. Furthermore, if a character string ("when" in the figure) is slid horizontally while still pressed, without releasing the finger, as indicated by the arrow A2, a group 23c of words or character strings predicted to follow that string is displayed, and by sliding downward while pressing, as indicated by the arrow A3, the character string to be input can again be determined within the operation from a single press to its release. In this way, while the guide display of the character-string candidate group is pressed, sliding along the direction in which the candidates are arranged, that is, vertically in the example of FIG. 11, and then changing direction causes the processing means to confirm the character string and to show the guide display of the next candidate group beside the pressed position; by repeating this operation, sentences can be assembled without lifting the finger from the screen. The character-string candidates may be displayed when a certain string is pressed without changing the slide direction, or may be displayed when the finger is released. Finally, the finger is lifted from the screen to confirm the sentence. While the finger remains pressed, the input can be cancelled by sliding back or by sliding to a place outside the guide display.

  This predictive conversion can also be applied to a linear criterion. The detailed information is shown as a guide when the attribute information is pressed; while the guide display is pressed, sliding the finger up and down leaves the detailed information or character string unconfirmed, and releasing the finger confirms it. When the finger is moved sideways while pressed, however, the detailed information or character string is confirmed and the next group of candidate character strings predicted from it is displayed; by repeating this, sentences can be assembled. When the guide display cannot fit at the edge of the screen, it may be flipped left-right or upside down so as to extend toward the center of the screen.

  The mobile terminal device using the character input system described above is aimed mainly at mobile phones, for which miniaturization is strongly demanded. However, the present invention is not limited to this and can be widely applied to electronic devices, such as PHS terminals and PDAs, that have a display screen and cannot easily provide a large number of character input keys. The directions such as up, down, left, and right are not limited to the above examples.

DESCRIPTION OF SYMBOLS: 1 Housing; 2 Numeric keypad; 3 Four-way key; 4 Control key; 5 Display unit; 6, 7, 8 Touch panel; 9 CPU; 10, 20, 30, 50 Mobile phone; 11 Speaker; 12 Microphone; 21, 22, 24 Criteria table; 23 Guide display

Claims (1)

  1. A character input system comprising a touch panel that also serves as a display unit, and processing means, wherein:
    the processing means displays a plurality of characters representing attribute information on the touch panel;
    when one of the areas associated with the attribute information is pressed, the processing means displays a plurality of characters representing detailed information based on the attribute information associated with the press;
    when the press is released, the processing means calculates difference coordinates between the x and y coordinates of the press release position and the x and y coordinates of the press start position, and, when the difference coordinates of both the x coordinate and the y coordinate are within a predetermined range, the character representing the attribute information associated with the press can be selected;
    the movement direction of the pressing operation is determined from a comparison between the absolute value of the x-coordinate difference and the absolute value of the y-coordinate difference, and, when the difference coordinate of the x coordinate or the y coordinate between the press release position and the press start position is not within the predetermined range, the character representing the detailed information can be selected based on the movement direction of the pressing operation; and
    the determination of whether each of the difference coordinates between the press release position and the press start position is within the predetermined range, and the selection of the character representing the detailed information based on the movement direction, are based on criteria tables that differ at least in part for each area associated with the displayed detailed information.
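The selection logic recited in the claim can be sketched as follows (a minimal illustration; the function name, parameter names, and the fixed `threshold` value are assumptions for this sketch — the claim allows the predetermined range and direction criteria to come from per-area criteria tables):

```python
def select_character(start, release, attribute_char, detail_chars, threshold=10):
    """Sketch of the claimed selection logic.

    start, release : (x, y) coordinates of press start and press release.
    attribute_char : the character representing the attribute information.
    detail_chars   : mapping from direction to the character representing
                     detailed information, e.g. {"left": ..., "up": ...}.
    threshold      : the 'predetermined range' for the difference coordinates.
    """
    # Difference coordinates between release position and start position.
    dx = release[0] - start[0]
    dy = release[1] - start[1]

    # Both difference coordinates within the predetermined range:
    # select the character representing the attribute information.
    if abs(dx) <= threshold and abs(dy) <= threshold:
        return attribute_char

    # Otherwise determine the movement direction by comparing the absolute
    # values of the x and y difference coordinates, and select the character
    # representing the detailed information for that direction.
    if abs(dx) > abs(dy):
        direction = "right" if dx > 0 else "left"
    else:
        direction = "down" if dy > 0 else "up"  # screen y grows downward
    return detail_chars[direction]
```

For instance, with the attribute character "か" and detail characters {"left": "き", "up": "く", "right": "け", "down": "こ"}, a short tap yields "か" while a leftward flick yields "き", matching the tap-versus-flick distinction the claim draws.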
JP2015086112A 2007-04-11 2015-04-20 Character input system Active JP6135947B2 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
JP2007103979 2007-04-11
JP2015086112A JP6135947B2 (en) 2007-04-11 2015-04-20 Character input system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2015086112A JP6135947B2 (en) 2007-04-11 2015-04-20 Character input system

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
JP2013000618 Division 2008-01-18

Publications (2)

Publication Number Publication Date
JP2015133155A JP2015133155A (en) 2015-07-23
JP6135947B2 true JP6135947B2 (en) 2017-05-31

Family

ID=40143112

Family Applications (8)

Application Number Title Priority Date Filing Date
JP2008008697A Active JP4694579B2 (en) 2007-04-11 2008-01-18 Character input system
JP2008200418A Active JP4907612B2 (en) 2007-04-11 2008-08-04 Character input system
JP2011254480A Active JP5210471B2 (en) 2007-04-11 2011-11-21 Character input system
JP2011254481A Active JP4979100B2 (en) 2007-04-11 2011-11-21 Character input system
JP2012042755A Active JP4969710B2 (en) 2007-04-11 2012-02-29 Character input system
JP2013000618A Granted JP2013061991A (en) 2007-04-11 2013-01-07 Character input system
JP2014081218A Active JP6038834B2 (en) 2007-04-11 2014-04-10 Character input system
JP2015086112A Active JP6135947B2 (en) 2007-04-11 2015-04-20 Character input system

Family Applications Before (7)

Application Number Title Priority Date Filing Date
JP2008008697A Active JP4694579B2 (en) 2007-04-11 2008-01-18 Character input system
JP2008200418A Active JP4907612B2 (en) 2007-04-11 2008-08-04 Character input system
JP2011254480A Active JP5210471B2 (en) 2007-04-11 2011-11-21 Character input system
JP2011254481A Active JP4979100B2 (en) 2007-04-11 2011-11-21 Character input system
JP2012042755A Active JP4969710B2 (en) 2007-04-11 2012-02-29 Character input system
JP2013000618A Granted JP2013061991A (en) 2007-04-11 2013-01-07 Character input system
JP2014081218A Active JP6038834B2 (en) 2007-04-11 2014-04-10 Character input system

Country Status (1)

Country Link
JP (8) JP4694579B2 (en)

Families Citing this family (29)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5476790B2 (en) * 2009-05-13 2014-04-23 富士通株式会社 Electronic device, display method, and display program
JP2011013990A (en) * 2009-07-03 2011-01-20 Pioneer Electronic Corp Content reproduction apparatus
JP2011028498A (en) * 2009-07-24 2011-02-10 Fujitsu Ltd Program, apparatus and method for processing information
KR101650339B1 (en) * 2010-03-12 2016-09-05 삼성전자 주식회사 Text Input Method And Portable Device supporting the same
JP5437899B2 (en) * 2010-04-28 2014-03-12 株式会社ジャストシステム Input device, input method, and input program
CN102884501A (en) * 2010-05-31 2013-01-16 日本电气株式会社 Electronic device using touch panel input and method for receiving operation thereby
JP5418508B2 (en) 2011-01-13 2014-02-19 カシオ計算機株式会社 Electronic device, display control method and program
JP2013025579A (en) * 2011-07-21 2013-02-04 Panasonic Corp Character input device and character input program
KR101616651B1 (en) * 2011-09-15 2016-05-02 미쓰비시덴키 가부시키가이샤 Ladder program creation device
JP5834302B2 (en) * 2011-09-29 2015-12-16 株式会社ユピテル Character input system and program
JP5510473B2 (en) * 2012-02-02 2014-06-04 富士通株式会社 Character input device, character input method, and character input program
JP5994374B2 (en) * 2012-05-09 2016-09-21 富士ゼロックス株式会社 Character processing apparatus and program
JP2014089503A (en) * 2012-10-29 2014-05-15 Kyocera Corp Electronic apparatus and control method for electronic apparatus
JP5542906B2 (en) * 2012-12-25 2014-07-09 京セラ株式会社 Character input device, character input method, and character input program
EP2946272A4 (en) 2013-01-21 2016-11-02 Keypoint Technologies India Pvt Ltd Text input system and method
IN2013CH00469A (en) * 2013-01-21 2015-07-31 Keypoint Technologies India Pvt Ltd
CN104007832B (en) * 2013-02-25 2017-09-01 上海触乐信息科技有限公司 Continuous method, system and the equipment for sliding input text
KR101334342B1 (en) 2013-05-16 2013-11-29 주식회사 네오패드 Apparatus and method for inputting character
JP6085529B2 (en) * 2013-06-18 2017-02-22 シャープ株式会社 Character input device
CN104662495B (en) 2013-09-26 2017-06-23 富士通株式会社 Drive dynamic control device, electronic equipment and drive control method
CN103677641B (en) * 2013-12-16 2017-06-27 联想(北京)有限公司 Information processing method and device
JP5971817B2 (en) 2014-06-20 2016-08-17 インターナショナル・ビジネス・マシーンズ・コーポレーションInternational Business Machines Corporation Information processing apparatus, program, and method
JP2016066356A (en) * 2014-09-18 2016-04-28 高 元祐 Method for inputting information on two steps in release after connection movement with latent key
JP2016081277A (en) * 2014-10-16 2016-05-16 三菱電機エンジニアリング株式会社 Information device
JP6485077B2 (en) * 2015-02-02 2019-03-20 富士通コネクテッドテクノロジーズ株式会社 Information processing apparatus, character input method, and character input program
WO2016178273A1 (en) * 2015-05-01 2016-11-10 富士通株式会社 Drive control device, electronic apparatus, drive control program, and drive control method
JP6208808B2 (en) * 2016-05-06 2017-10-04 京セラ株式会社 Character input device, character input method, and character input program
JP6393435B1 (en) * 2018-02-19 2018-09-19 株式会社WindyLab Character string input system and method
JP6422614B1 (en) * 2018-08-09 2018-11-14 株式会社三菱電機ビジネスシステム Text input support system, text input support method, and text input support program

Family Cites Families (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3546337B2 (en) * 1993-12-21 2004-07-28 ゼロックス コーポレイションXerox Corporation User interface device for computing system and method of using graphic keyboard
JPH0981320A (en) * 1995-09-20 1997-03-28 Matsushita Electric Ind Co Ltd Pen input type selection input device and method therefor
JPH11224161A (en) * 1998-02-04 1999-08-17 Pfu Ltd Character input device and recording medium
KR100327209B1 (en) * 1998-05-12 2002-02-22 윤종용 Software keyboard system using the drawing of stylus and method for recognizing keycode therefor
JP2000181608A (en) * 1998-12-17 2000-06-30 Sharp Corp Virtual input device
JP2001265496A (en) * 2000-03-21 2001-09-28 Seiko Instruments Inc Information processor, method for processing information and computer readable recording medium with program for allowing computer to execute the method recorded thereon
JP2002091676A (en) * 2000-09-13 2002-03-29 Sanyo Electric Co Ltd Input device
JP2002108543A (en) * 2000-09-21 2002-04-12 Nokia Mobile Phones Ltd Method for inputting kana character
JP2002169651A (en) * 2000-11-29 2002-06-14 Takahito Imagawa Take-off input
JP2003076478A (en) * 2001-08-31 2003-03-14 Hitachi Ltd Character input means
JP2003157144A (en) * 2001-11-20 2003-05-30 Sony Corp Character input device, character input method, character input program storage medium and character input program
JP4027671B2 (en) * 2001-12-20 2007-12-26 ミサワホーム株式会社 Keyboard sheet
AU2003252548A1 (en) * 2002-08-16 2004-03-03 Hoon-Kee Kang Method of inputting a character using a software keyboard
JP3797977B2 (en) * 2003-03-17 2006-07-19 株式会社クレオ Character input device, character input method, and character input program
JP2004288010A (en) * 2003-03-24 2004-10-14 Fenghua Yin Symbol input device and symbol input method
JP2005032189A (en) * 2003-07-11 2005-02-03 Sony Corp Character input control method, character input program, and character input device
US20050052431A1 (en) * 2003-09-05 2005-03-10 Samsung Electronics Co., Ltd. Apparatus and method for character recognition
JP2005092441A (en) * 2003-09-16 2005-04-07 Aizu:Kk Character input method
JP2005182487A (en) * 2003-12-19 2005-07-07 Nec Software Chubu Ltd Character input apparatus, method and program
JP2005275635A (en) * 2004-03-24 2005-10-06 Fuji Photo Film Co Ltd Method and program for japanese kana character input
JP2007087200A (en) * 2005-09-22 2007-04-05 Mitsubishi Electric Corp Character input apparatus

Also Published As

Publication number Publication date
JP2009003950A (en) 2009-01-08
JP2013061991A (en) 2013-04-04
JP4694579B2 (en) 2011-06-08
JP2014150569A (en) 2014-08-21
JP2008282380A (en) 2008-11-20
JP4907612B2 (en) 2012-04-04
JP4969710B2 (en) 2012-07-04
JP6038834B2 (en) 2016-12-07
JP4979100B2 (en) 2012-07-18
JP5210471B2 (en) 2013-06-12
JP2012074065A (en) 2012-04-12
JP2015133155A (en) 2015-07-23
JP2012099118A (en) 2012-05-24
JP2012113741A (en) 2012-06-14

Similar Documents

Publication Publication Date Title
US10552037B2 (en) Software keyboard input method for realizing composite key on electronic device screen with precise and ambiguous input
US8599139B2 (en) Electronic device system utilizing a character input method
US8812972B2 (en) Dynamic generation of soft keyboards for mobile devices
JP5572059B2 (en) Display device
US6356258B1 (en) Keypad
US6597345B2 (en) Multifunctional keypad on touch screen
KR100720335B1 (en) Apparatus for inputting a text corresponding to relative coordinates values generated by movement of a touch position and method thereof
US9274685B2 (en) Systems and methods for predictive text entry for small-screen devices with touch-based two-stage text input
US7532198B2 (en) Handheld electronic device with roller ball input
US9329753B2 (en) Handheld electronic device having selectable language indicator and menus for language selection and method therefor
US8521927B2 (en) System and method for text entry
KR100859217B1 (en) Touch-type key input apparatus
JP5278259B2 (en) Input device, input method, and program
US8374846B2 (en) Text input device and method
JP4863211B2 (en) Character data input device
US7561902B2 (en) Apparatus and method for inputting character and numerals to display of a mobile communication terminal
WO2011158641A1 (en) Information processing terminal and method for controlling operation thereof
US8548793B2 (en) Handheld electronic device having selectable language indicator for language selection and method therefor
KR20040002875A (en) Hand-held device that supports fast text typing
US7002553B2 (en) Active keyboard system for handheld electronic devices
US6847706B2 (en) Method and apparatus for alphanumeric data entry using a keypad
JP4797104B2 (en) Electronic device and method for symbol input
KR20040097232A (en) Method and apparatus for character entry in a wireless communication device
JP5822662B2 (en) Portable electronic device, control method and program for portable electronic device
US20020060699A1 (en) Character input device based on a two-dimensional movememt sensor

Legal Events

Date Code Title Description
A621 Written request for application examination

Free format text: JAPANESE INTERMEDIATE CODE: A621

Effective date: 20150420

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20160216

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20160511

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20160816

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20161114

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20161118

TRDD Decision of grant or rejection written
A01 Written decision to grant a patent or to grant a registration (utility model)

Free format text: JAPANESE INTERMEDIATE CODE: A01

Effective date: 20170314

A711 Notification of change in applicant

Free format text: JAPANESE INTERMEDIATE CODE: A711

Effective date: 20170407

A61 First payment of annual fees (during grant procedure)

Free format text: JAPANESE INTERMEDIATE CODE: A61

Effective date: 20170412

R150 Certificate of patent or registration of utility model

Ref document number: 6135947

Country of ref document: JP

Free format text: JAPANESE INTERMEDIATE CODE: R150