KR20090022111A - Method for inputting a character, and apparatus for implementing the same - Google Patents
- Publication number
- KR20090022111A
- Authority
- KR
- South Korea
- Prior art keywords
- keypad screen
- displayed
- contact
- access point
- detected
- Prior art date
- 2007-08-29
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/02—Input arrangements using manually operated switches, e.g. using keyboards or dials
- G06F3/023—Arrangements for converting discrete items of information into a coded form, e.g. arrangements for interpreting keyboard generated codes as alphanumeric codes, operand codes or instruction codes
- G06F3/0233—Character input methods
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04886—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M1/00—Substation equipment, e.g. for use by subscribers
- H04M1/02—Constructional features of telephone sets
- H04M1/23—Construction or mounting of dials or of equivalent devices; Means for facilitating the use thereof
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Signal Processing (AREA)
- User Interface Of Digital Computer (AREA)
- Position Input By Displaying (AREA)
Abstract
The present invention relates to a character input method and a character input apparatus capable of inputting characters such as numbers, alphabet letters, and Korean characters using a touch pad. The method comprises: displaying, when an approach is detected, a keypad screen corresponding to the approach point; and, when a contact is detected, inputting the key corresponding to the current contact point. According to the present invention, characters can be input quickly and accurately based on a proximity-sensing (non-contact) touch pad.
Description
The present invention relates to a character input method and apparatus, and more particularly, to a character input method and a character input apparatus capable of inputting characters such as numbers, alphabet letters, and Korean characters using a touch pad.
Recently, instead of providing physical key buttons as the input device, mobile terminals such as mobile phones, PDAs, PMPs, and UMPCs that display a virtual keypad, or keypad screen, on a touch screen have gained popularity. When the virtual keypad or keypad screen is displayed, the key at a given point can be input by touching that point on the display with a finger or a stylus pen.
In this virtual keypad method, applied to the 12 to 15 numeric keys provided on a mobile phone, the keys enter the digits 0 to 9 in numeric mode; in Hangul mode, one or two consonants or vowels are assigned to each key; and in English mode, three or four alphabet letters are assigned to each key, so that the character entered changes according to the number of key presses. For example, if "A, B, C" is assigned to one key, pressing that key once, twice, or three times enters "A", "B", or "C", respectively.
However, when a virtual keypad of 12 to 15 keys is implemented by assigning three or four characters to one key as described above, a single character may require several touches (one to three or more), so the input speed is very low. In addition, when several dozen keys (e.g., a QWERTY keyboard) are implemented as a virtual keypad, each key occupies only a very small area of the mobile terminal's narrow display, so the user often touches a key other than the intended one and typos occur frequently.
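For illustration only, here is a minimal sketch of the multi-tap scheme just described; the key-to-letter assignment follows the conventional 12-key phone layout and is an assumption, not taken from this document.

```python
# Illustrative sketch of the conventional multi-tap scheme described above.
# The 12-key assignment follows the common phone layout (an assumption).
MULTI_TAP = {"2": "ABC", "3": "DEF", "4": "GHI", "5": "JKL",
             "6": "MNO", "7": "PQRS", "8": "TUV", "9": "WXYZ"}

def multi_tap_letter(key: str, presses: int) -> str:
    """Return the letter produced by pressing `key` `presses` times."""
    letters = MULTI_TAP[key]
    return letters[(presses - 1) % len(letters)]

print(multi_tap_letter("3", 2))  # pressing "3" twice yields "E"
```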
SUMMARY OF THE INVENTION The present invention has been made to solve the above problems, and an object thereof is to provide a character input method and apparatus capable of inputting characters quickly and accurately based on a proximity-sensing (non-contact) touch pad.
In order to achieve the above object, a character input method according to the present invention includes: displaying, when an approach is detected, a keypad screen corresponding to the approach point; and, when a contact is detected, inputting the key corresponding to the current contact point.
According to the present invention, the method may further include displaying a basic keypad screen before the displaying step.
According to the present invention, the keypad screen corresponding to the current approach point may be an image including an enlarged image of the characters corresponding to the current approach point, and may be displayed over the basic keypad screen.
According to the present invention, the magnification ratio of the enlargement may be proportional to the duration of the approach.
According to the present invention, the characters corresponding to the current approach point may include two or more characters, one of which is displayed at its original position while the remaining characters are displayed around the original position.
According to the present invention, each of the remaining characters may be displayed to the left of, to the right of, above, or below the original position.
According to the present invention, the key corresponding to the current contact point may be determined based on the keypad screen corresponding to the approach point.
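To make the layout and key determination described above concrete, here is a hypothetical sketch; the offsets, ordering, coordinate conventions, and the names `enlarged_layout` and `key_at_contact` are all assumptions. The first character of the approached key stays at its original position, the others are placed around it, and a contact point is resolved to the nearest displayed character.

```python
# Illustrative sketch: lay out the characters of an approached key in an
# enlarged window (first character at the original position, the rest around
# it), then resolve a contact point to the nearest displayed character.
# Offsets, ordering, and coordinates are assumptions for illustration only.
import math

def enlarged_layout(chars, origin, offset=60):
    x, y = origin
    # left, above, right, below the original position (screen y grows downward)
    around = [(x - offset, y), (x, y - offset), (x + offset, y), (x, y + offset)]
    positions = {chars[0]: origin}
    for ch, pos in zip(chars[1:], around):
        positions[ch] = pos
    return positions

def key_at_contact(contact_point, layout):
    """Pick the character whose displayed position is closest to the contact."""
    return min(layout, key=lambda ch: math.dist(contact_point, layout[ch]))

layout = enlarged_layout("DEF", (120, 200))
print(layout)                              # D at (120, 200), E to the left, F above
print(key_at_contact((70, 205), layout))   # close to E's position -> "E"
```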
According to the present invention, when the end of the approach is detected, the method may further include stopping the display of the keypad screen corresponding to the approach point and displaying only the basic keypad screen.
According to the present invention, two or more keys may be displayed as one button image on the basic keypad screen.
According to the present invention, the method may further include generating a fine vibration after the inputting step.
A character input apparatus according to another aspect of the invention includes: an input unit for sensing the approach and contact of an external object or body; a controller for displaying a keypad screen corresponding to the approach point when an approach is detected, and for inputting the key corresponding to the current contact point when a contact is detected; and an output unit for displaying the keypad screen.
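The apparatus could be pictured roughly as below; the class names, the callback style, and the print-based output unit are assumptions made only to keep the sketch self-contained.

```python
# Rough sketch of the apparatus above: an input unit reporting approach and
# contact events, a controller deciding what to display and what to input,
# and an output unit displaying the keypad screen. All names are assumed.
class OutputUnit:
    def show(self, description):
        print("display:", description)

class Controller:
    def __init__(self, output_unit):
        self.output = output_unit
        self.text = ""                    # characters input so far

    def on_approach(self, point):
        # Display a keypad screen corresponding to the approach point.
        self.output.show(f"basic keypad + enlarged keys near {point}")

    def on_contact(self, key):
        # `key` stands in for the key resolved from the current contact point.
        self.text += key
        self.output.show(f"input window: {self.text}")

class InputUnit:
    """Senses approach and contact of an external object or body and
    forwards them to the controller (sensing hardware is out of scope)."""
    def __init__(self, controller):
        self.controller = controller

    def report_approach(self, point):
        self.controller.on_approach(point)

    def report_contact(self, key):
        self.controller.on_contact(key)

# Usage: approach near a key, then touch the letter "E".
unit = InputUnit(Controller(OutputUnit()))
unit.report_approach((120, 200))
unit.report_contact("E")
```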
According to an aspect of the present invention, even when the keypad screen is displayed on a narrow display, a desired letter can be entered accurately, reducing the error rate.
According to another aspect of the present invention, a desired letter can be input quickly with a single touch rather than several touches.
According to another aspect of the present invention, a fine vibration generated when an input is completed gives the user the tactile feel of pressing a key, so that letters can be entered more easily and conveniently.
Hereinafter, exemplary embodiments of the present invention will be described in detail with reference to the accompanying drawings. Terms and words used in the specification and claims should not be construed as limited to their conventional or dictionary meanings; on the principle that an inventor may properly define terms to best explain his or her own invention, they should be interpreted as having meanings and concepts consistent with the technical idea of the present invention. Therefore, the embodiments described in this specification and the configurations shown in the drawings are merely the most preferred embodiments and do not represent the entire technical idea of the present invention, and it should be understood that various equivalents and modifications that could replace them may exist at the time of filing.
FIG. 1 is a block diagram of a character input apparatus according to an embodiment of the present invention. Referring to the drawing, the character input apparatus includes an input unit, a controller, and an output unit.
The input unit senses the approach and contact of an external object (e.g., a stylus pen) or a body (e.g., a finger). FIGS. 2 to 5 illustrate the distance D between the input unit and a finger, and the signals sensed before a finger approaches, when it approaches within a predetermined distance, and when it touches.
Referring back to FIG. 1, the controller processes character input based on the approach and contact sensed by the input unit, and the output unit displays the keypad screen. If an approach is detected, the controller displays a keypad screen corresponding to the approach point; if a contact is detected, it inputs the key corresponding to the current contact point.
FIG. 6 is a flowchart of a character input method according to an embodiment of the present invention. Referring to FIG. 6, first, a basic keypad screen is output before a finger or a stylus pen approaches (step S110). FIG. 7 is an example of the basic keypad screen displayed before a finger approaches. Referring to FIG. 7, when the distance D between the input unit and the finger is greater than the predetermined distance, only the basic keypad screen is displayed.
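One simple way to picture how the input unit's readings relate to these steps, assuming a sensor that reports the distance D to the finger (the threshold values are illustrative assumptions):

```python
# Illustrative classification of the sensed distance D between the input unit
# and the finger (cf. FIGS. 2-5): beyond a predetermined distance nothing is
# detected, within it an approach is detected, and at (near) zero a contact.
# The threshold values are assumptions.
APPROACH_DISTANCE_MM = 10.0   # "predetermined distance" for an approach
CONTACT_DISTANCE_MM = 0.5     # effectively touching the surface

def classify(distance_mm):
    if distance_mm <= CONTACT_DISTANCE_MM:
        return "contact"
    if distance_mm <= APPROACH_DISTANCE_MM:
        return "approach"
    return "none"             # only the basic keypad screen is shown

for d in (25.0, 6.0, 0.0):
    print(d, classify(d))     # none, approach, contact
```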
If an approach of an external object (e.g., a stylus pen) or a body (e.g., a finger) is detected ("YES" in step S120), a keypad screen corresponding to the approach point is displayed (step S130). For example, an image in which the characters corresponding to the approach point are enlarged may be displayed over the basic keypad screen. FIG. 8 is an example of the screen displayed when a finger approaches within the predetermined distance. Referring to FIG. 8, since the key corresponding to the approach point is "D/E/F", the characters included in the key are separated, enlarged, and displayed in the enlarged window W. Specifically, in the enlarged window W, the first letter D is displayed at its original position, the second letter E to the left of the original position, and the third letter F above the original position. However, the letters may instead be displayed to the right of or below the original position, or diagonally from it; the present invention is not limited thereto. Meanwhile, the magnification ratio of the enlarged window W may be proportional to the approach duration.
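A minimal sketch of a magnification ratio that grows with the approach duration, as suggested above; the base scale, growth rate, and cap are assumed values.

```python
# Sketch of a magnification ratio that grows in proportion to how long the
# approach has lasted. Base scale, growth rate, and the upper cap are assumptions.
def magnification(approach_duration_s, base=1.5, rate=0.5, max_scale=3.0):
    """Scale factor applied to the enlarged window W."""
    return min(base + rate * approach_duration_s, max_scale)

for t in (0.0, 1.0, 2.0, 4.0):
    print(t, magnification(t))   # 1.5, 2.0, 2.5, 3.0 (capped)
```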
Then, while the screen of step S130 is displayed, if a contact of an external object or body occurs ("YES" in step S140), the key corresponding to the current contact point is input (step S150). Here, the key corresponding to the current contact point is determined based on the keypad screen displayed at the moment of contact, and that screen includes the enlarged window displayed in step S130. FIGS. 9 and 10 are examples of screens displayed when a finger touches. Referring first to FIG. 9, when the finger touches the position of the letter D in the enlarged window W while the approach is maintained, the letter D is input and displayed in the input window ch. If, as shown in FIG. 10, the finger touches the position of the letter E in the enlarged window W while the approach is maintained, the letter E is input and displayed in the input window ch. After the input of a key is completed in this way, a fine vibration may be generated in response to the completion of the input.
If no contact of an external object or body is detected in step S140 ("NO" in step S140), or after a contact has been made (after step S150), and the end of contact and approach is then detected ("YES" in step S160), the display of the screen shown in step S130 is stopped and the display returns to the basic keypad screen (step S170). FIG. 11 is an example of the screen displayed when the finger is moved away by more than the predetermined distance after contact. Referring to FIG. 11, after the input of the letter E is completed, the finger moves beyond the predetermined distance and the end of the approach is detected, so the display of the enlarged window W is stopped and the display returns to the basic keypad screen.
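Putting the steps of FIG. 6 together, a minimal state machine might look like the sketch below; the event names and the vibration callback are assumptions made for illustration.

```python
# Minimal state machine mirroring the flow of FIG. 6 (steps S110-S170), with
# assumed event names; the fine vibration after an input is a simple callback.
from enum import Enum, auto

class State(Enum):
    BASIC = auto()        # S110 / S170: only the basic keypad screen is shown
    APPROACHING = auto()  # S130: keypad screen for the approach point is shown

def run(events, vibrate=lambda: print("fine vibration")):
    state, typed = State.BASIC, []
    for kind, value in events:
        if kind == "approach":                           # S120 -> S130
            state = State.APPROACHING
        elif kind == "contact" and state is State.APPROACHING:
            typed.append(value)                          # S140 -> S150
            vibrate()                                    # feedback after the input
        elif kind == "approach_end":                     # S160 -> S170
            state = State.BASIC
    return "".join(typed)

print(run([("approach", None), ("contact", "E"), ("approach_end", None)]))  # -> "E"
```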
Although the present invention has been described above with reference to limited embodiments and drawings, the present invention is not limited thereto, and those skilled in the art to which the present invention pertains will appreciate that various modifications and variations are possible within the scope of equivalents of the claims described below.
The present invention can be applied to a computer and a mobile terminal.
FIG. 1 is a block diagram of a character input apparatus according to an embodiment of the present invention.
FIG. 2 is a diagram illustrating the distance between an input unit and a finger.
FIG. 3 is a diagram illustrating a signal before a finger approaches.
FIG. 4 is a diagram illustrating a signal when a finger approaches within a predetermined distance.
FIG. 5 is a diagram illustrating a signal when a finger touches.
FIG. 6 is a flowchart of a character input method according to an embodiment of the present invention.
FIG. 7 is an example of a basic keypad screen displayed before a finger approaches.
FIG. 8 is an example of a screen displayed when a finger approaches within a predetermined distance.
FIGS. 9 and 10 are examples of screens displayed when a finger touches.
FIG. 11 is an example of a screen displayed when a finger is moved away by more than a predetermined distance after contact.
Claims (11)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020070087181A KR20090022111A (en) | 2007-08-29 | 2007-08-29 | Method for inputting a character, and apparatus for implementing the same |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020070087181A KR20090022111A (en) | 2007-08-29 | 2007-08-29 | Method for inputting a character, and apparatus for implementing the same |
Publications (1)
Publication Number | Publication Date |
---|---|
KR20090022111A (en) | 2009-03-04 |
Family
ID=40692041
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
KR1020070087181A | Method for inputting a character, and apparatus for implementing the same | 2007-08-29 | 2007-08-29 |
Country Status (1)
Country | Link |
---|---|
KR (1) | KR20090022111A (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR101029816B1 (en) * | 2009-01-08 | 2011-04-20 | LG Electronics Inc. | Controlling method of electronic device |
- 2007-08-29: KR application KR1020070087181A filed (published as KR20090022111A); status: not active, application discontinuation
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| WITN | Withdrawal due to no request for examination | |