KR100886251B1 - Character input device using touch sensor - Google Patents

Character input device using touch sensor

Info

Publication number
KR100886251B1
KR100886251B1 (application KR1020070114567A)
Authority
KR
South Korea
Prior art keywords
sensor
character
input
central
area
Prior art date
Application number
KR1020070114567A
Other languages
Korean (ko)
Inventor
곽희수
Original Assignee
곽희수
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 곽희수 filed Critical 곽희수
Priority to KR1020070114567A priority Critical patent/KR100886251B1/en
Application granted granted Critical
Publication of KR100886251B1 publication Critical patent/KR100886251B1/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/02Input arrangements using manually operated switches, e.g. using keyboards or dials
    • G06F3/023Arrangements for converting discrete items of information into a coded form, e.g. arrangements for interpreting keyboard generated codes as alphanumeric codes, operand codes or instruction codes
    • G06F3/0233Character input methods
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0412Digitisers structurally integrated in a display
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Input From Keyboards Or The Like (AREA)

Abstract

A device for inputting characters with a touch sensor is provided that enables a user to easily input alphabetic characters, numbers, special characters, and function keys on a small touch pad, and improves character input speed by offering simple patterns. A sensor input unit (10) receives a continuous, character-shaped touch trajectory from central and peripheral sensor sections. A sensor input position processor (20) recognizes the trajectory from the sensor input unit as a continuous path pattern, and treats a touch as contact with the peripheral sensor section when the central and a peripheral sensor section are touched at the same time. An input mode determiner (30) determines the input mode for entering uppercase letters, lowercase letters, special characters, or numbers. A character pattern analyzer (40) searches for the character by comparing the path pattern recognized by the sensor input position processor with stored path pattern data. An output unit (60) outputs the retrieved character.

Description

Character input device using touch sensor {CHARACTER INPUT DEVICE USING TOUCH SENSOR}

The present invention relates to a character input device using a touch sensor that allows a user to easily and quickly input text using a touch sensor in a personal portable information terminal such as a mobile phone, a PMP, a UMPC, a notebook computer, a navigation device, and the like.

Small handheld personal information terminals such as mobile phones, PMPs, and UMPCs are widely used.

Input devices currently used in such personal digital assistants include keyboards, touch screens, 12-key keypads, and the like.

A keyboard must provide function keys in addition to the 26 letters of the alphabet, so its use in small personal digital assistants is limited.

In general, a mobile phone includes 12 number keys plus a switch for selecting a mode. Such a 12-key method requires many key presses to input a single character and is complicated to use.

When characters are recognized through a touch screen, a large number of input pattern values must be analyzed in software on a high-performance CPU to distinguish the characters. Such recognition can therefore be implemented only on a PDA or PC equipped with a high-performance CPU.

Accordingly, an object of the present invention is to provide a character input device using a touch sensor that can input characters with only a low-cost processor, by having the touch sensor recognize only relatively simple path patterns.

The present invention provides a character input device using a touch sensor that recognizes a trajectory touched using a touch sensor as a character or a function key.

When characters are input by drawing them rather than by pressing buttons, the large number of patterns that may arise while drawing a character must be recognized and processed. The present invention provides a character input device using a touch sensor that can simplify and process the patterns that appear while a character is drawn.

According to an aspect of the present invention, there is provided a sensor input unit including a central sensor area and a peripheral sensor area radially disposed at a periphery of the central sensor area; A sensor input position processor configured to recognize the touched trajectory of the sensor input unit as a continuous path pattern; A character pattern analyzer for searching for a corresponding character by comparing the path pattern recognized by the sensor input position processor with stored path pattern data; And an output unit for outputting a character searched by the character pattern analysis unit.

The character input apparatus using the touch pad according to the present invention enables input of English letters, numbers, special characters, function keys, etc. using a small size touch pad.

In addition, because the input patterns are diversified and derived from the original forms of the characters, the user can learn them easily; simplified patterns are also provided to improve character input speed.

In addition, English input is faster than the multi-tap method using the 12 keys of a conventional mobile phone. After a beginner repeats the character input method of the present invention about 10 times over 30 minutes, it is learned easily enough to input more than 50 characters per minute, and the input speed improves further with practice.

In addition, because characters are input using simple patterns, the device can be implemented with a low-speed processor, which also reduces cost.

Hereinafter, an embodiment of a character input device using a touch pad according to the present invention will be described with reference to the accompanying drawings.

In this process, the thickness of the lines or the size of the components shown in the drawings may be exaggerated for clarity and convenience of description.

In addition, terms to be described later are terms defined in consideration of functions in the present invention, which may vary according to a user's or operator's intention or custom.

Therefore, definitions of these terms should be based on the contents throughout the specification.

1 is a block diagram showing the structure of a character input apparatus using a touch pad according to a first embodiment of the present invention.

As shown, the character input apparatus using a touch pad according to this embodiment includes: a sensor input unit 10 comprising a central sensor region and peripheral sensor regions radially disposed by azimuth around it; a sensor input position processing unit 20 that recognizes the touched trajectory of the sensor input unit 10 as a continuous path pattern; an input mode determination unit 30 that determines an input mode such as English uppercase letters, lowercase letters, special characters, or numbers; a character pattern analyzer 40 that searches for a corresponding character by comparing the path pattern recognized by the sensor input position processor 20 with stored path pattern data; and an output unit 60 that outputs the character retrieved by the character pattern analyzer.

The sensor input unit 10 uses a sensor capable of recognizing the touch of a finger or stylus pen, such as a capacitive touch sensor or a touch pad coated with a resistive film, and receives a signal generated by the trace of finger movement.

For example, when a plurality of sensors are used as the sensor input unit 10, the central sensor area and the peripheral sensor area are configured as individual capacitive sensors to generate a predetermined signal according to the touched area.

In addition, when the sensor input unit 10 includes a single sensor that receives X and Y coordinate values, such as the touch pad of a notebook computer, one sensor with a predetermined area is used, and the touched area is divided into the central sensor region and a plurality of peripheral sensor regions so that a signal corresponding to the touched region is generated.

As described above, the sensor input unit 10 may be physically composed of a single sensor or a plurality of sensors, and various types of touch sensors, such as resistive-film and capacitive types, may be applied.

In addition, the sensor input unit 10 may be configured by a touch screen method attached to the surface of the display device. In this case, a resistive film type, a capacitive type, an infrared type, or an ultrasonic type may be used.

The sensor input position processor 20 recognizes the touched trajectory of the sensor input unit 10 as a continuous path pattern. For example, if the value C is assigned to the central sensor area and the values N, NE, E, SE, S, SW, W, and NW are assigned to the peripheral sensor areas divided into eight directions around it, then when the finger moves from N through C to S, the three sensors touched in succession are recognized as the path pattern N-C-S.
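The path-pattern recognition just described can be sketched as follows. This is a hypothetical illustration, not the patented implementation: the function name, the sampled-trajectory input, and the dash-joined output form are assumptions.

```python
# Hypothetical sketch: consecutive readings of the same sensor region are
# collapsed so that an N -> C -> S trajectory becomes the pattern "N-C-S".

def to_path_pattern(touched_regions):
    """Collapse a sampled trajectory of region labels into a path pattern."""
    path = []
    for region in touched_regions:
        if not path or path[-1] != region:
            path.append(region)  # record only transitions between regions
    return "-".join(path)

# a finger sliding from N through C to S, sampled several times per region
print(to_path_pattern(["N", "N", "C", "C", "C", "S"]))  # N-C-S
```

Because only region transitions are kept, the pattern stays short regardless of the touch sampling rate, which is what makes a low-cost processor sufficient.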

The character pattern analyzer 40 stores path data assigned to each character, and compares the path pattern recognized by the sensor input position processor 20 with the stored path patterns to find the character corresponding to the recognized pattern. At this time, the character is searched among the data corresponding to the mode selected by the input mode determination unit 30.
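A minimal sketch of the analyzer's stored data might be a table of path patterns keyed by input mode, searched with an exact match. The table layout and names are assumptions; the two sample entries are taken from the embodiments described later (FIGS. 9 and 11).

```python
# Hypothetical per-mode pattern table for the character pattern analyzer.
PATTERN_TABLE = {
    "upper": {
        "N-C-S-C-N-C-S-C-N": "W",  # two vertical round trips (FIG. 11)
    },
    "lower": {
        "NW-W-SW-C-SW": "b",       # fully abbreviated variant of b (FIG. 9)
    },
}

def lookup_character(path_pattern, input_mode):
    """Return the character for the pattern in the selected mode, or None."""
    return PATTERN_TABLE.get(input_mode, {}).get(path_pattern)

print(lookup_character("N-C-S-C-N-C-S-C-N", "upper"))  # W
print(lookup_character("N-C-S-C-N-C-S-C-N", "lower"))  # None: wrong mode
```

Keying the table by mode is what lets the same drawn trajectory mean different things in the uppercase, lowercase, number, and special-character modes.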

The output unit 60 serves to transmit a code corresponding to the character retrieved by the character pattern analyzer 40.

2 is a block diagram illustrating a structure of a character input apparatus using a touch pad according to a second embodiment of the present invention.

As shown, an input mode selection switch 70 connected to the input mode determination unit 30 may be further provided.

When inputting characters, conversion between English uppercase and lowercase letters, numbers, and special characters is often necessary. For this case, an input mode selection switch 70 is further provided so that the mode of the input character can be changed more quickly.

In the first embodiment, input mode conversion is assigned to a predetermined path pattern entered through the sensor input unit 10, and the device operates by changing the input mode when the entered path pattern corresponds to mode conversion.

For example, as shown in FIG. 5, the pattern path corresponding to each mode is set, and when the pattern path corresponding to the mode change is input, the input mode is changed. The pattern path of this mode change is exemplary and is not necessarily limited to this pattern path.

3 illustrates a first embodiment of a sensor input unit of a text input device using a touch pad according to the present invention, and FIG. 4 illustrates a second embodiment of the sensor input unit.

As shown, the sensor input includes a central sensor region C disposed at the center, and a peripheral sensor region radially disposed with the azimuth angle around the center. In FIG. 3, eight peripheral sensor regions are arranged, and in FIG. 4, twelve peripheral sensor regions are arranged.

When a capacitive or resistive touch sensor is used as the sensor input unit, the input coordinate values are simplified by specifying boundary values that divide the area into the central sensor region and a plurality of radially arranged peripheral sensor regions, giving 9 blocks (1 central + 8 peripheral) or 13 blocks (1 central + 12 peripheral).

The number of peripheral sensor regions is preferably a multiple of four, but may be an odd number.

In addition, each of the nine or thirteen blocks may itself be composed of a plurality of sensor units. To draw horizontal and vertical strokes, the peripheral sensor divisions must include regions distributed at 90-degree intervals crossing each other, such as N/S and W/E; the remaining spaces (NW, NE, SW, SE) can be filled in multiples of four, so that 4 (N, S, W, E) + 4n peripheral sensor blocks can be generated.

The peripheral sensor areas each have an assigned address value. In the eight-region case of FIG. 3, the values N, NE, E, SE, S, SW, W, and NW are assigned according to compass bearing; in the case of FIG. 4, the values 1 through 12 are assigned. Hereinafter, path patterns are described using the compass-direction labels for eight peripheral sensor areas and the clock-face positions for twelve.

In addition, the peripheral sensor area preferably includes identification points indicating the starting points of input. In the first embodiment of FIG. 3, identification points are preferably marked, by flat or embossed printing, at the centers of the NW, N, NE, and S sensors for convenience of character input. In the second embodiment of FIG. 4, identification points are preferably marked at the centers of sensors 12 and 6, at the boundary between sensors 1 and 2, and at the boundary between sensors 10 and 11.

6 is a diagram illustrating various shapes of a sensor input unit.

As shown, the sensor input unit may have various shapes such as a circle, an ellipse, or a polygon. In every case, the arrangement is the same: the central sensor area is disposed at the center, and the peripheral sensor areas are disposed around it.

7 is a view for explaining the output method of the sensor input unit.

As shown in the drawing, when area A is touched with a finger, only the peripheral sensor area "W" is touched, so the sensor output value is "W". When area B is touched, both the peripheral sensor area "W" and the central sensor area "C" are in contact; in this case the touch is treated as contact with the peripheral sensor area, and the sensor output value is again "W". When area C is touched, only the central sensor area "C" is in contact, so the sensor output value is "C".
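The peripheral-wins resolution rule above can be sketched as a small function. This is an illustrative assumption about how the rule might be coded; the function and argument names are not from the patent.

```python
# Hypothetical sketch: when a peripheral region and the central region "C"
# are touched simultaneously, the peripheral region takes priority.

def sensor_output(touched):
    """Resolve a set of simultaneously touched regions to one output value."""
    peripheral = sorted(r for r in touched if r != "C")
    if peripheral:
        return peripheral[0]  # peripheral contact outranks the center "C"
    return "C" if "C" in touched else None

print(sensor_output({"W", "C"}))  # W: area B in FIG. 7
print(sensor_output({"C"}))       # C: area C in FIG. 7
```

Resolving ambiguous touches in favor of the peripheral region is what keeps a finger that straddles the boundary from being misread as a central-area touch.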

The output value is processed this way because, when the various inputs described below are made in the relatively narrow sensor input unit, the peripheral sensor area and the central sensor area are often touched together, and treating such touches as central-area contact would cause errors. In addition, the central sensor region is preferably designed with a diameter about 1.5 times the maximum contact diameter of the input tool (finger, pen, etc.) on the surface. Of course, the diameter of the central sensor area can be adjusted according to the input tool: for fingertip input, a diameter of 12 mm to 13 mm is recommended, and 15 mm or more when using the thumb.

In addition, when the central sensor unit and the peripheral sensor units are configured as independent sensor units, the maximum separation distance between them is preferably designed to be less than 1/2 of the minimum contact diameter of the input means (finger, pen, etc.). This is so that, when a stroke passes from a peripheral sensor unit to the central sensor unit or vice versa, the input is sensed seamlessly and recognized as a single stroke.

In addition, the central sensor area C may be configured as a push switch to ensure input certainty and user convenience, and it may be formed to protrude or be recessed relative to the peripheral sensor areas. When it is formed this way, the central sensor area C and the peripheral sensor areas should meet with a smooth inclination at the contact portion, so that the step does not catch the finger during character input and cause an input error.

Hereinafter, the operation of the character input device according to the present invention will be described in detail.

8 is a diagram illustrating a case where an A letter is input to a sensor input unit according to a first exemplary embodiment of the present invention.

As shown, one method of inputting the letter A is to write a capital letter A divided into three strokes according to the writing order.

First, the first stroke is drawn down and to the left from the center point of N to SW; the second stroke is drawn down and to the right from the center point of N to SE; and the third stroke is drawn horizontally from W to E. This is the same shape as the capital letter A in stroke order.

In this case, the extracted path pattern is N-NW-W-SW for the first stroke, N-NE-E-SE for the second stroke, and W-C-E for the third stroke.

Thus, when one character is input with a plurality of strokes, the time interval between the end of one stroke and the start of the next is preferably within about 800 ms (700 ms to 900 ms).

In other words, if a character cannot be distinguished from the first pattern and the next stroke is input within the predetermined time interval, that stroke is processed together with the earlier one for character discrimination. If the interval elapses first, the accumulated strokes are ignored and discarded, and the newly entered pattern is then evaluated on its own as a character pattern.

Since the time interval may vary depending on the user, it is preferable to be adjustable.

In more detail, when a character is completed in one pattern (stroke), the character corresponding to the input path pattern is output immediately, and the device waits for the first input of the next character without imposing a delay. A waiting time could be applied to every input, but doing so would significantly reduce the character input speed.

For example, if no character corresponding to the entered path pattern is found (because the input was incorrect or the character consists of multiple strokes), the device waits for the next stroke; if it arrives within the waiting time (800 ms), the device searches for a character pattern associated with the combined strokes and outputs the character corresponding to the multi-stroke path.

However, if a path pattern is input after the waiting time (800 ms) has elapsed without a character being output (because no character corresponding to the input path was found), the previously entered path pattern data is deleted (ignored), and the device waits for character input from the beginning.

However, some characters are completed over several patterns, such as the uppercase letters "G" and "Q": the first stroke alone matches "C" or "O" and is output, and when a subsequent pattern completes "G" or "Q", the previously output "C" or "O" is deleted and "G" or "Q" is printed in its place.
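The stroke-timing rule described above can be sketched as a small state machine: strokes arriving within the roughly 800 ms window are combined into one pattern, and after the window lapses the buffered strokes are discarded. The class name, table layout, and sample entry are assumptions for illustration; the sketch also omits the "C"-then-"G" re-output refinement.

```python
STROKE_TIMEOUT = 0.8  # seconds; the text suggests 700-900 ms, adjustable per user

MULTI_STROKE_PATTERNS = {
    # the three strokes of capital A from FIG. 8
    ("N-NW-W-SW", "N-NE-E-SE", "W-C-E"): "A",
}

class StrokeMatcher:
    def __init__(self, timeout=STROKE_TIMEOUT):
        self.timeout = timeout
        self.buffer = []      # strokes of the character in progress
        self.last_time = None

    def feed(self, stroke, now):
        """Add one stroke pattern; return the matched character or None."""
        if self.last_time is not None and now - self.last_time > self.timeout:
            self.buffer = []  # waiting time elapsed: discard unfinished strokes
        self.buffer.append(stroke)
        self.last_time = now
        char = MULTI_STROKE_PATTERNS.get(tuple(self.buffer))
        if char is not None:
            self.buffer = []  # character complete: start fresh
        return char

m = StrokeMatcher()
print(m.feed("N-NW-W-SW", 0.0))  # None: still waiting for more strokes
print(m.feed("N-NE-E-SE", 0.5))  # None
print(m.feed("W-C-E", 1.0))      # A
```

Passing the timestamp in explicitly keeps the matcher testable and makes the per-user timeout adjustment mentioned in the text a single parameter.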

In addition, the form of the lowercase letter a may also be stored as an input pattern for A. The above example is the path pattern for eight peripheral sensor regions. For 12 peripheral sensor regions, as shown in FIG. 6, the strokes are: first stroke 12-11-10-9-8-(7); second stroke 12-1-2-3-4-(5); third stroke 9-C-3. Here, ( ) denotes a path that may be omitted: depending on how far down the first or second stroke is drawn, the touch may reach sensor 7 or stop at sensor 8, or reach sensor 5 or stop at sensor 4.

9 is a diagram illustrating a case where b characters are input to a sensor input unit according to a first exemplary embodiment of the present invention.

As shown, in the case of writing a lowercase letter b, the path is input in one stroke, and the pattern path is NW-W-SW- (W) -C- (S) -SW.

Considering all possible cases of the omittable paths: if only W is omitted, the path is NW-W-SW-C-S-SW; if only S is omitted, NW-W-SW-W-C-SW; and if both are omitted, NW-W-SW-C-SW.

 As described above, the path that can be omitted means a sensor that may or may not pass through a finger when drawing a predetermined pattern shape.

FIG. 10 is a diagram illustrating a case where h is input to a sensor input unit according to a first exemplary embodiment of the present invention.

As shown, in the case of writing a lowercase letter h, the path is input in one stroke, and the pattern path is NW-W-SW- (W) -C- [S, SE].

Here, [ ] means that either one of the two elements may be omitted, and that when both are present they appear in the listed order.

That is, it may be NW-W-SW- (W) -C-S or NW-W-SW- (W) -C-SE or NW-W-SW- (W) -C-S-SE.

This is because when you draw the last stroke of h vertically, you can go to S, to SE, or through S to SE.

FIG. 11 is a diagram illustrating a case where a W character is input to a sensor input unit according to a first exemplary embodiment of the present invention.

The letter W represents a pattern path that reciprocates twice vertically, among other possible cases.

That is, as shown by the arrow on the left side of the figure, the pattern path becomes N-C-S-C-N-C-S-C-N by reciprocating twice starting from N.

12A, 12B, and 12C show examples of patterns recognized by respective alphabet letters.

As shown, the writing form of each letter, including its possible abbreviations, is predefined as that letter's path; the corresponding path pattern is extracted and stored in the character pattern analyzer, and the alphabetic character corresponding to an input path pattern is then output. The user must learn these patterns before using them, but as shown, each pattern is easy to learn because it follows the shape of the English letter or omits only part of it.

Each of these patterns corresponds to a path pattern corresponding to the number and arrangement of peripheral sensor regions.

13A, 13B, and 13C correspond to the patterns of FIGS. 12A, 12B, and 12C when the central sensor area C and eight peripheral sensor areas N, NE, E, SE, S, SW, W, and NW are disposed. 14A, 14B, and 14C illustrate a central sensor area C and 12 peripheral sensor areas 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, and 12 ) Is a path pattern corresponding to the pattern of FIGS. 12A, 12B and 12C.

As mentioned above, (a) means that a may be omitted; [a, b] means that at least one of the two is included and that, when both are included, they appear in order from left to right — that is, a, b, or a-b; and {a, b} means that at least one of the two is included and the order is irrelevant when both are — that is, a, b, a-b, or b-a.
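The notation defined above can be expanded mechanically into every concrete path it denotes. In this hypothetical sketch, ("opt", a) stands for (a), ("ord", a, b) for [a, b], and ("any", a, b) for {a, b}; the token encoding is an assumption made for illustration, since the patent only defines the printed notation.

```python
def expand(tokens):
    """Return the set of all concrete dash-joined paths for a pattern."""
    paths = [[]]
    for tok in tokens:
        if isinstance(tok, str):
            choices = [[tok]]                      # mandatory region
        elif tok[0] == "opt":                      # (a): a may be skipped
            choices = [[], [tok[1]]]
        elif tok[0] == "ord":                      # [a, b]: a, b, or a-b
            choices = [[tok[1]], [tok[2]], [tok[1], tok[2]]]
        else:                                      # {a, b}: a, b, a-b, or b-a
            choices = [[tok[1]], [tok[2]],
                       [tok[1], tok[2]], [tok[2], tok[1]]]
        paths = [p + c for p in paths for c in choices]
    return {"-".join(p) for p in paths}

# Lowercase b: NW-W-SW-(W)-C-(S)-SW gives the four variants listed earlier
b_paths = expand(["NW", "W", "SW", ("opt", "W"), "C", ("opt", "S"), "SW"])
print(len(b_paths))  # 4

# Lowercase h: NW-W-SW-(W)-C-[S, SE] gives six variants
h_paths = expand(["NW", "W", "SW", ("opt", "W"), "C", ("ord", "S", "SE")])
print(len(h_paths))  # 6
```

Expanding every stored pattern into its concrete variants ahead of time lets the analyzer match an input path with a plain dictionary lookup rather than wildcard matching at input time.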

15 illustrates the pattern recognized for each number; FIG. 16 illustrates the path pattern corresponding to the pattern of FIG. 15 when the central sensor area C and eight peripheral sensor areas N, NE, E, SE, S, SW, W, and NW are disposed; and FIG. 17 illustrates the path pattern corresponding to the pattern of FIG. 15 when the central sensor area C and 12 peripheral sensor areas 1 through 12 are arranged.

18A and 18B show the patterns recognized as special characters; FIGS. 19A and 19B show the path patterns corresponding to the patterns of FIGS. 18A and 18B when the central sensor area C and eight peripheral sensor areas N, NE, E, SE, S, SW, W, and NW are disposed; and FIGS. 20A and 20B show the path patterns corresponding to the patterns of FIGS. 18A and 18B when the central sensor area C and 12 peripheral sensor areas 1 through 12 are arranged.

In the case of special character patterns, some special characters ( . , : ; ! & ' " { } ~ ` ∴ ∵ ♡ ) whose patterns do not overlap with the alphabet patterns can be input in the alphabet input mode.

Because of this, such special characters can be drawn during alphabetic input without switching to a separate special-character input mode, which speeds up character input.

FIG. 21 illustrates the pattern recognized for each function key; FIG. 22 illustrates the path pattern corresponding to the pattern of FIG. 21 when the central sensor area C and eight peripheral sensor areas N, NE, E, SE, S, SW, W, and NW are disposed; and FIG. 23 illustrates the path pattern corresponding to the pattern of FIG. 21 when the central sensor area C and 12 peripheral sensor areas 1 through 12 are arranged.

When a pattern as shown in FIG. 21 is input, the corresponding function key operates as if pressed. These function keys are those generally used in computers or personal portable devices, briefly described below.

"UP", "DOWN", "LEFT" "RIGHT" means the arrow keys.

"VOL_UP", "VOL_DN" means volume control key,

"ENT" means Enter, "SB" means Spacebar, "BS" means Backspace and "ESC" means ESCAPE.

Such function keys preferably operate in all modes, since they are commonly used regardless of the input mode. In addition, when input mode conversion is performed through the sensor input unit, the mode conversion pattern described with reference to FIG. 5 is also a kind of function key and operates in all modes.

All of these function keys can be input through the input sensor unit, thereby improving user convenience and reducing costs by unifying input means.

Although the present invention has been described with reference to the embodiments shown in the drawings, these are merely exemplary, and those skilled in the art will understand that various modifications and other equivalent embodiments can be made therefrom.

Therefore, the true technical protection scope of the present invention will be defined by the claims below.

1 is a block diagram showing the structure of a character input apparatus using a touch pad according to a first embodiment of the present invention.

2 is a block diagram illustrating a structure of a character input apparatus using a touch pad according to a second embodiment of the present invention.

3 is a view showing a first embodiment of a sensor input unit of a text input device using a touch pad according to the present invention;

4 is a view showing a second embodiment of a sensor input unit of a text input device using a touch pad according to the present invention;

5 is a view showing an input mode conversion pattern path according to the present invention;

6 is a view illustrating various shapes of a sensor input unit;

7 is a view for explaining the output method of the sensor input unit;

8 is a diagram illustrating a case where an A letter is input to a sensor input unit according to a first embodiment of the present invention;

9 is a diagram illustrating a case where b characters are input to a sensor input unit according to a first embodiment of the present invention;

FIG. 10 is a diagram illustrating a case where h is input to a sensor input unit according to a first exemplary embodiment of the present invention; FIG.

11 is a view illustrating a case where W is input to a sensor input unit according to a first embodiment of the present invention;

12A, 12B and 12C are diagrams showing examples of patterns recognized in respective English letters;

13A, 13B, and 13C are diagrams showing the path patterns corresponding to the patterns of FIGS. 12A, 12B, and 12C when the central sensor area C and eight peripheral sensor areas N, NE, E, SE, S, SW, W, and NW are disposed;

14A, 14B, and 14C are diagrams showing the path patterns corresponding to the patterns of FIGS. 12A, 12B, and 12C when the central sensor area C and 12 peripheral sensor areas 1 through 12 are arranged;

15 is a view showing a pattern recognized by each number,

FIG. 16 illustrates a path pattern corresponding to the pattern of FIG. 16 when the central sensor area C and eight peripheral sensor areas N, NE, E, SE, S, SW, W, and NW are disposed;

FIG. 17 corresponds to the pattern of FIG. 15 when the central sensor area C and 12 peripheral sensor areas 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11 and 12 are arranged. Drawing showing the path pattern to

18A and 18B show patterns recognized as respective special characters;

19A and 19B illustrate path patterns corresponding to the patterns of FIGS. 18A and 18B when the central sensor area C and eight peripheral sensor areas N, NE, E, SE, S, SW, W, and NW are disposed. Shown,

20A and 20B illustrate a case in which a central sensor region C and 12 peripheral sensor regions 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, and 12 are arranged. A diagram showing a path pattern corresponding to a pattern of

21 is a view showing a pattern recognized by each function key;

FIG. 22 illustrates a path pattern corresponding to the pattern of FIG. 21 when the central sensor area C and eight peripheral sensor areas N, NE, E, SE, S, SW, W, and NW are disposed;

FIG. 23 corresponds to the pattern of FIG. 21 when the central sensor area C and the twelve peripheral sensor areas 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, and 12 are arranged. It is a drawing showing the path pattern.

Claims (18)

1. A character input device using a touch sensor, comprising: a sensor input unit in which a region capable of receiving a character-shaped trajectory is divided into a central sensor area at the center and peripheral sensor areas arranged radially at azimuth angles around the central sensor area, the areas being formed so that a touch can move in a continuous path from the central sensor area to any peripheral sensor area and from a peripheral sensor area to an adjacent peripheral sensor area or back to the central sensor area, the sensor input unit receiving a continuous character-shaped touch trajectory across the central sensor area and the peripheral sensor areas; a sensor input position processor configured to recognize the touched trajectory of the sensor input unit as a continuous path pattern and, when the central sensor area and a peripheral sensor area are touched simultaneously, to treat only the peripheral sensor area as touched; a character pattern analyzer configured to search for the corresponding character by comparing the path pattern recognized by the sensor input position processor with stored path pattern data; and an output unit configured to output the character retrieved by the character pattern analyzer.

2. The device of claim 1, further comprising an input mode determination unit for determining an input mode such as English uppercase letters, English lowercase letters, special characters, or numbers, wherein the character pattern analyzer searches for the character corresponding to the path pattern within the selected input mode.

3. The device of claim 2, wherein the input mode of the input mode determination unit is changed by operation of an input mode selection switch.
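The recognition flow of claim 1 (touch trajectory → region path → stored-pattern lookup, with the rule that a simultaneous central-plus-peripheral touch counts as the peripheral area) can be sketched as follows. This is an illustrative Python sketch only: the function names and the two pattern entries are assumptions for demonstration, not taken from the patent's actual path-pattern tables.

```python
# Stored path-pattern data: region sequence -> character. The two entries
# below are ASSUMED for illustration; the patent's real tables are in its figures.
PATTERNS = {
    ("SW", "N", "SE"): "A",
    ("N", "S", "SE", "E", "S"): "b",
}

def resolve_region(touched):
    """Claim 1 rule: if the central area C and a peripheral area are touched
    simultaneously, treat only the peripheral area as touched."""
    if "C" in touched and len(touched) > 1:
        touched = [r for r in touched if r != "C"]
    return touched[0]

def recognize(samples):
    """samples: a list of sets of simultaneously touched region names, in
    sampling order. Collapses repeats into a path pattern and looks it up."""
    path = []
    for touched in samples:
        region = resolve_region(list(touched))
        if not path or path[-1] != region:
            path.append(region)  # record only region-to-region transitions
    return PATTERNS.get(tuple(path))  # None if no stored pattern matches

# Example: a stroke passing SW -> N -> SE, with C touched alongside N.
print(recognize([{"SW"}, {"C", "N"}, {"N"}, {"SE"}]))  # prints A
```

The dictionary lookup reflects the claim's "comparing the recognized path pattern with stored path pattern data"; a real device would hold one such table per input mode (claim 2).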
4. The device of claim 2, wherein the input mode determination unit changes the input mode in response to operation of the sensor input unit.

5. The device of claim 1, wherein the central sensor area and the plurality of peripheral sensor areas of the sensor input unit are composed of physically separate sensors.

6. The device of claim 1, wherein the sensor input unit comprises a single sensor that recognizes a coordinate value corresponding to a touched position, the touched region being divided into the central sensor area and the plurality of peripheral sensor areas so as to generate a signal corresponding to the touched area.

7. The device of claim 1, wherein the sensor input unit is a touch screen attached to the surface of a display device.

8. The device of claim 1, wherein the sensor input unit comprises a central sensor area C disposed at the center and eight peripheral sensor areas that receive the values N, NE, E, SE, S, SW, W, and NW clockwise around the central sensor area C, and the path patterns recognized by the sensor input position processor are recognized as the corresponding alphabet letters as shown in the following table.
Figure 112007080703111-pat00001
9. The device of claim 1, wherein the sensor input unit comprises a central sensor area C disposed at the center and eight peripheral sensor areas that receive the values N, NE, E, SE, S, SW, W, and NW clockwise around the central sensor area C, and the path patterns recognized by the sensor input position processor are recognized as the corresponding numbers as shown in the following table.
Figure 112007080703111-pat00002
10. The device of claim 1, wherein the sensor input unit comprises a central sensor area C disposed at the center and eight peripheral sensor areas that receive the values N, NE, E, SE, S, SW, W, and NW clockwise around the central sensor area C, and the path patterns recognized by the sensor input position processor are recognized as the corresponding special characters as shown in the following table.
Figure 112007080703111-pat00003
11. The device of claim 1, wherein the sensor input unit comprises a central sensor area C disposed at the center and eight peripheral sensor areas that receive the values N, NE, E, SE, S, SW, W, and NW clockwise around the central sensor area C, and the path patterns recognized by the sensor input position processor are recognized as the corresponding function keys as shown in the following table.
Figure 112007080703111-pat00004
12. The device of claim 1, wherein the sensor input unit comprises a central sensor area C disposed at the center and twelve peripheral sensor areas that receive the values 1 through 12 clockwise around the central sensor area C, and the path patterns recognized by the sensor input position processor are recognized as the corresponding alphabet letters as shown in the following table.
Figure 112007080703111-pat00005
13. The device of claim 1, wherein the sensor input unit comprises a central sensor area C disposed at the center and twelve peripheral sensor areas that receive the values 1 through 12 clockwise around the central sensor area C, and the path patterns recognized by the sensor input position processor are recognized as the corresponding numbers as shown in the following table.
Figure 112007080703111-pat00006
14. The device of claim 1, wherein the sensor input unit comprises a central sensor area C disposed at the center and twelve peripheral sensor areas that receive the values 1 through 12 clockwise around the central sensor area C, and the path patterns recognized by the sensor input position processor are recognized as the corresponding special characters as shown in the following table.
Figure 112007080703111-pat00007
15. The device of claim 1, wherein the sensor input unit comprises a central sensor area C disposed at the center and twelve peripheral sensor areas that receive the values 1 through 12 clockwise around the central sensor area C, and the path patterns recognized by the sensor input position processor are recognized as the corresponding function keys as shown in the following table.
Figure 112007080703111-pat00008
16. The device of claim 1, wherein the sensor input unit comprises a central sensor area C disposed at the center and eight peripheral sensor areas that receive the values N, NE, E, SE, S, SW, W, and NW clockwise around the central sensor area C, and identification marks for convenient character input are displayed at the centers of the NW, N, NE, and S sensors by flat printing or embossed printing.

17. The device of claim 1, wherein the sensor input unit comprises a central sensor area C disposed at the center and twelve peripheral sensor areas that receive the values 1 through 12 clockwise around the central sensor area C, and identification marks for convenient character input are displayed at the centers of the 12 and 6 sensors and at the boundaries between the 1 and 2 sensors and between the 10 and 11 sensors by flat printing or embossed printing.

18. The device of claim 1, wherein the overall shape of the sensor input unit is circular, elliptical, or polygonal.
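The single-sensor variant of claim 6 — one coordinate-reporting sensor partitioned in software into the central area and, here, the eight compass areas of claim 8 — might be implemented along these lines. The geometry (a unit inner radius and 45° sectors centered on the compass points) is an assumption chosen for illustration; the patent does not specify these values.

```python
import math

# Peripheral area names listed clockwise starting from north, as in claim 8.
REGIONS = ["N", "NE", "E", "SE", "S", "SW", "W", "NW"]

def classify(x, y, center=(0.0, 0.0), inner_radius=1.0):
    """Map a touch coordinate to the name of a sensor area (assumed geometry)."""
    dx, dy = x - center[0], y - center[1]
    if math.hypot(dx, dy) <= inner_radius:
        return "C"  # inside the central sensor area
    # Azimuth measured clockwise from north (positive y points up).
    azimuth = math.degrees(math.atan2(dx, dy)) % 360.0
    # Each peripheral sector spans 45 degrees, centered on its compass direction.
    return REGIONS[int(((azimuth + 22.5) % 360.0) // 45.0)]

print(classify(0.2, 0.3))   # near the center -> C
print(classify(0.0, 5.0))   # due north -> N
print(classify(5.0, -5.0))  # down and to the right -> SE
```

Feeding each sampled coordinate through such a classifier yields exactly the region sequence that the sensor input position processor of claim 1 consumes; the twelve-area layout of claim 12 would differ only in the sector count (30° sectors) and names.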
KR1020070114567A 2007-11-09 2007-11-09 Character input device using touch sensor KR100886251B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020070114567A KR100886251B1 (en) 2007-11-09 2007-11-09 Character input device using touch sensor

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020070114567A KR100886251B1 (en) 2007-11-09 2007-11-09 Character input device using touch sensor

Publications (1)

Publication Number Publication Date
KR100886251B1 true KR100886251B1 (en) 2009-02-27

Family

ID=40682246

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020070114567A KR100886251B1 (en) 2007-11-09 2007-11-09 Character input device using touch sensor

Country Status (1)

Country Link
KR (1) KR100886251B1 (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101106628B1 (en) * 2011-08-31 2012-01-20 김태진 Trajectory searching method using patterns
CN105824476A (en) * 2016-04-28 2016-08-03 京东方科技集团股份有限公司 Touch structure and driving method thereof, touch screen and touch display device
KR101904899B1 (en) 2015-04-24 2018-10-08 게저 발린트 Method and data input device for data entry in electrical form

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20050122662A (en) * 2004-06-25 2005-12-29 주식회사 팬택 Wireless communication terminal and method with function of typing characters based double sensor

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101106628B1 (en) * 2011-08-31 2012-01-20 김태진 Trajectory searching method using patterns
KR101904899B1 (en) 2015-04-24 2018-10-08 게저 발린트 Method and data input device for data entry in electrical form
CN105824476A (en) * 2016-04-28 2016-08-03 京东方科技集团股份有限公司 Touch structure and driving method thereof, touch screen and touch display device
CN105824476B (en) * 2016-04-28 2018-08-07 京东方科技集团股份有限公司 A kind of touch-control structure and its driving method, touch screen, touch control display apparatus

Similar Documents

Publication Publication Date Title
US7170496B2 (en) Zero-front-footprint compact input system
Nesbat A system for fast, full-text entry for small electronic devices
US9030416B2 (en) Data entry system and method of entering data
US9557916B2 (en) Keyboard system with automatic correction
JP3597060B2 (en) Japanese character input device for mobile terminal and character input method
US20130227460A1 (en) Data entry system controllers for receiving user input line traces relative to user interfaces to determine ordered actions, and related systems and methods
US10133479B2 (en) System and method for text entry
US20030006956A1 (en) Data entry device recording input in two dimensions
CN101427202B (en) Method and device for improving inputting speed of characters
KR20050119112A (en) Unambiguous text input method for touch screens and reduced keyboard systems
CN101424977A (en) Input method for inputting content by keyboard and terminal equipment
CN1524212A (en) Text entry method and device therefor
US9529448B2 (en) Data entry systems and methods
SG177239A1 (en) Data entry system
EP1513053A2 (en) Apparatus and method for character recognition
KR100414143B1 (en) Mobile terminal using touch pad
KR100886251B1 (en) Character input device using touch sensor
US20050174334A1 (en) User interface
WO2014045414A1 (en) Character input device, character input method, and character input control program
CN102841752A (en) Character input method and device of man-machine interaction device
US11244138B2 (en) Hologram-based character recognition method and apparatus
CN115917469A (en) Apparatus and method for inputting logograms into electronic device
KR100935338B1 (en) Hangul character input device using touch sensor
WO2001045034A1 (en) Ideographic character input using legitimate characters as components
CN101551701A (en) Multidimensional control method and device, optimal or relatively favorable display input method and device

Legal Events

Date Code Title Description
A201 Request for examination
A302 Request for accelerated examination
AMND Amendment
E902 Notification of reason for refusal
AMND Amendment
E601 Decision to refuse application
J201 Request for trial against refusal decision
N231 Notification of change of applicant
AMND Amendment
B701 Decision to grant
GRNT Written decision to grant
FPAY Annual fee payment

Payment date: 20120221

Year of fee payment: 4

FPAY Annual fee payment

Payment date: 20130223

Year of fee payment: 5

LAPS Lapse due to unpaid annual fee