WO2010125744A1 - Input device - Google Patents
Input device
- Publication number
- WO2010125744A1 (PCT/JP2010/002409)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- input
- pattern
- area
- touch
- partial
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/018—Input/output arrangements for oriental characters
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V30/00—Character recognition; Recognising digital ink; Document-oriented image-based pattern recognition
- G06V30/10—Character recognition
- G06V30/32—Digital ink
Definitions
- the present invention relates to an input device for inputting information by a touch operation.
- as character input methods using a touch panel on a small screen, a button input method that assigns a plurality of characters to a small number of buttons, a handwriting recognition method that recognizes characters handwritten with a pen or a finger, and the like are employed.
- Patent Document 1 discloses a conventional input device using an input method for recognizing handwritten characters.
- the input device disclosed in Patent Document 1 uses a virtual frame that is automatically updated based on the inclusion relation between the circumscribed rectangle of a character stroke and the rectangle of the virtual frame, so that the plurality of strokes generated continuously as characters are written is sorted into character units. As a result, the device can recognize and input a plurality of characters written at a free character size and a free position.
- Patent Document 1 proposes a stroke separation method in order to increase the recognition rate of character input composed of a plurality of strokes such as Japanese.
- the handwriting input device disclosed in Patent Document 2 includes a handwriting input board and an AIUEO (vowel) keyboard: the consonant letter of a romanized kana is input by handwriting on the input board, and its vowel is input with a button on the keyboard.
- in this way, Patent Document 2 proposes a method in which the target of handwriting recognition is limited to the consonant letters of romanized kana, while the vowel is selected with a button on the keyboard.
- Patent Document 3 discloses a touch input device having a group of input keys (buttons) arranged in a matrix.
- in this device, key sequences over the input keys arranged in a matrix are stored in a data table as registered key patterns corresponding to each character, and which character has been handwritten is determined from the matching result between the handwritten input pattern traced over the key group and the registered key patterns.
- however, the button input method that assigns a plurality of characters to a small number of buttons requires an operation for selecting among the characters assigned to a button. For example, when a button is pressed, a list of the characters assigned to it is displayed, and a further press selects a character from the list. Since a desired character is thus input by displaying the list of assigned characters and then selecting from it, the cumbersome operation of pressing the same button multiple times is required.
- in Patent Document 2, although the handwriting recognition target is narrowed to roman letters, handwritten character input and button input must be used together, and the cumbersome operation of alternating between two different input methods is required.
- the present invention has been made to solve the above-described problems, and its object is to provide an input device that, while using touch operations for character input, improves the recognition rate and recognition speed of handwritten character recognition.
- an input device according to the present invention includes: a touch-type input unit that inputs a trajectory obtained by touching a touch input area; a display unit that displays an input screen corresponding to the touch input area of the touch-type input unit; a first storage unit that stores partial area definition data in which the partial areas of the touch input area corresponding to the input buttons displayed on the input screen of the display unit are defined by their positions on the touch input area; a second storage unit that stores correspondence data in which the pattern candidates for pattern recognition, selected according to the display content of each input button, are registered in association with the partial area corresponding to that input button; and a recognition processing unit that refers to the partial area definition data of the first storage unit to identify the partial area containing the input start position of a trajectory input to the touch input area, refers to the correspondence data of the second storage unit to acquire the pattern candidates associated with the identified partial area, and recognizes the pattern corresponding to the trajectory using the acquired pattern candidates.
- according to the present invention, the partial area containing the input start position of a trajectory touch-input to the touch input area of the touch-type input unit is identified with reference to the partial area definition data, the pattern candidates selected according to the display content of the input button and associated with the identified partial area are acquired, and the pattern corresponding to the trajectory is recognized using the acquired candidates; since the recognition targets are narrowed down in this way, the recognition rate and recognition speed of handwritten character recognition can be improved.
- FIG. 1 is a block diagram showing the configuration of an input device according to Embodiment 1 of the present invention. FIG. 2 is a diagram showing an example of partial touch area / input feature pattern correspondence data. FIG. 3 is a diagram showing a typical application example of the input device according to Embodiment 1. FIG. 4 is a flowchart showing the flow of operations by the pattern recognition processing unit in FIG. 1.
- FIG. 5 is a diagram showing another application example of the input device according to Embodiment 1. FIG. 6 is a diagram showing a further application example of the input device according to Embodiment 1. FIG. 7 is a diagram showing an example of the registration process of patterns used for character recognition. FIG. 8 is a diagram showing the normalization process of a trajectory input by handwriting.
- FIG. 9 is a flowchart showing the flow of operations by the pattern recognition processing unit according to Embodiment 2 of the present invention. FIG. 10 is a diagram for explaining the weighting by distance to the partial touch areas. FIG. 11 is a block diagram showing the configuration of an input device according to Embodiment 3 of the present invention. FIG. 12 is a diagram showing an application example of the input device according to Embodiment 3.
- FIG. 13 is a block diagram showing the configuration of an input device according to Embodiment 4 of the present invention. FIG. 14 is a diagram for explaining the process of enlarging and displaying the partial touch areas near the area that an object approaches. FIG. 15 is a diagram showing another application example of the input device according to Embodiment 3.
- FIG. 1 is a block diagram showing a configuration of an input device according to Embodiment 1 of the present invention.
- referring to FIG. 1, the input device 1 according to Embodiment 1 includes a touch input device (touch-type input unit) 2, a display device (display unit) 3, a pattern recognition processing unit (recognition processing unit) 4, a storage unit (second storage unit) 5 for partial touch area / input feature pattern correspondence data (correspondence data), and a storage unit (first storage unit) 6 for partial touch area definition data (partial area definition data).
- the touch input device 2 has a function of acquiring a locus by a user's manual input or pen input to the touch input area 2a.
- Examples of the touch input device 2 include a touch pad used in a personal computer (PC).
- a touch panel integrated with the display device 3 may be used.
- the display device 3 is a component that displays input feedback from the touch input device 2 (for example, a locus display) and user input predicted by the pattern recognition processing unit 4.
- the pattern recognition processing unit 4 uses the partial touch area definition data to detect, from the trajectory input obtained by the touch input device 2, the partial touch area of the touch input area 2a in which the input started, acquires the input feature patterns associated with that partial touch area, and predicts the input content intended by the user from the trajectory input.
- the storage unit 5 is a storage unit that stores partial touch area / input feature pattern correspondence data.
- the partial touch area / input feature pattern correspondence data is data configured by registering feature patterns that are candidates for handwriting input for each partial touch area defined by the partial touch area definition data.
- the feature pattern is a feature amount for a character candidate.
- the storage unit 6 is a storage unit that stores partial touch area definition data.
- the partial touch area definition data is data configured by registering data defining each of a plurality of partial touch areas obtained by dividing the touch input area 2 a of the touch input device 2.
- for example, a partial touch area can be defined as a partial area A, a rectangle spanned by the points (x1, y1) and (x2, y2) on the touch input area 2a, as in expression (1): <rectangle(x1, y1, x2, y2): partial area A> ... (1)
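- as a concrete illustration (not part of the patent; the class and field names below are hypothetical), the rectangle of expression (1) could be held in a structure such as the following Python sketch:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class PartialArea:
    """One entry of the partial touch area definition data: a rectangle
    per expression (1), spanned by the points (x1, y1) and (x2, y2)."""
    name: str   # label of the corresponding input button, e.g. "JKL"
    x1: float
    y1: float
    x2: float
    y2: float

    def contains(self, x: float, y: float) -> bool:
        # Test whether a trajectory's input start point falls inside
        # this partial touch area (used in steps ST2/ST3 of FIG. 4).
        return self.x1 <= x <= self.x2 and self.y1 <= y <= self.y2
```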
- FIG. 2 is a diagram showing an example of partial touch area / input feature pattern correspondence data.
- the partial touch area / input feature pattern correspondence data includes an entry for each of the n partial touch areas.
- for example, partial touch area 1 is associated with m patterns, from pattern 1 to pattern m; partial touch area 2 is associated with x patterns, from pattern 1 to pattern x; and so on, up to partial touch area n, which is associated with z patterns, from pattern 1 to pattern z.
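- sketched as a plain mapping (hypothetical names; the patent specifies only the data, not a representation), the correspondence data might look like this, anticipating the keypad example of FIG. 3 where the number of candidates differs per area:

```python
# Partial touch area / input feature pattern correspondence data (FIG. 2):
# each partial touch area maps to its own list of pattern candidates, and
# the lists may have different lengths (m, x, ..., z in the text).
correspondence_data: dict[str, list[str]] = {
    "JKL": ["J", "K", "L"],         # 3 candidates
    "PQRS": ["P", "Q", "R", "S"],   # 4 candidates
    "#": ["#"],                     # 1 candidate
}
```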
- FIG. 3 is a diagram showing a typical application example of the input device according to Embodiment 1, in which the present invention is applied to a touch panel on which nine buttons, from the “ABC” button to the “#” button, are arranged.
- the area of each button is a partial touch area, and A to Z and # are registered as patterns.
- for example, for the “JKL” button, pattern 1 is J, pattern 2 is K, and pattern 3 is L, so three handwritten input character candidates are defined; for the “PQRS” button, pattern 1 is P, pattern 2 is Q, pattern 3 is R, and pattern 4 is S, so four handwritten input character candidates are defined.
- in this way, the character candidates that become patterns are associated with each button, i.e., each partial touch area, and registered as the partial touch area / input feature pattern correspondence data; when handwriting input starts on a button, only the pattern candidates corresponding to that button are extracted, and the character intended by the user is recognized from among those candidates based on the subsequent input trajectory.
- by narrowing down the pattern candidates in this way, the recognition speed can be improved, and since the most probable candidate among the narrowed-down patterns is recognized, erroneous recognition can also be reduced.
- FIG. 4 is a flowchart showing a flow of operation by the pattern recognition processing unit 4 in FIG.
- the user performs handwriting input by touch operation on the touch input area 2 a of the touch input device 2.
- the trajectory data obtained by handwriting input is acquired by the touch input device 2 and transmitted to the pattern recognition processing unit 4 as trajectory input.
- when the pattern recognition processing unit 4 acquires a trajectory input from the touch input device 2 (step ST1), it refers to the partial touch area definition data in the storage unit 6 based on the position coordinates of the trajectory's input start point (step ST2) and determines whether there is a partial touch area corresponding to the trajectory (step ST3). When there is no corresponding partial touch area (step ST3; NO), the pattern recognition processing unit 4 returns to step ST1 to prompt re-input or to acquire the trajectory of the next character of the character string being input.
- when there is a corresponding partial touch area (step ST3; YES), the pattern recognition processing unit 4 searches the storage unit 5 for the partial touch area / input feature pattern correspondence data of that area, executes pattern matching between the patterns registered in that data and the trajectory input acquired in step ST1 (step ST4), and determines whether there is a matching pattern (step ST5).
- when there is no matching pattern (step ST5; NO), the pattern recognition processing unit 4 returns to step ST1.
- if there is a matching pattern in the partial touch area / input feature pattern correspondence data (step ST5; YES), the pattern recognition processing unit 4 outputs that pattern to the display device 3 as the recognition result, and the recognized pattern is displayed on the display screen of the display device 3 (step ST6).
- next, the pattern recognition processing unit 4 determines whether the input of the current handwritten character string has been completed, i.e., whether data designating the end of input has been acquired from the touch input device 2 (step ST7). If the character string input is not complete (step ST7; NO), the process returns to step ST1 and the above processing is repeated for the next input character; if it is complete (step ST7; YES), the process ends.
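- a minimal sketch of steps ST2 to ST5 for a single trajectory, reusing the PartialArea class above and leaving the actual matcher as a caller-supplied function (all names here are assumptions, not the patent's):

```python
from collections.abc import Callable, Iterable

Point = tuple[float, float]

def recognize_stroke(
    trajectory: list[Point],
    areas: Iterable[PartialArea],          # first storage unit (definition data)
    correspondence: dict[str, list[str]],  # second storage unit (FIG. 2 data)
    match: Callable[[list[Point], list[str]], str | None],
) -> str | None:
    """One pass of steps ST2-ST5 of FIG. 4; None maps to the NO branches."""
    x0, y0 = trajectory[0]                                       # input start point
    area = next((a for a in areas if a.contains(x0, y0)), None)  # ST2
    if area is None:                                             # ST3; NO
        return None                                              # caller returns to ST1
    candidates = correspondence.get(area.name, [])
    return match(trajectory, candidates)                         # ST4; None = ST5; NO
```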
- the pattern recognition processing unit 4 acquires trajectory data from the input start point on the “JKL” button to the input end point as a trajectory input from the touch input device 2.
- the pattern recognition processing unit 4 refers to the partial touch area definition data in the storage unit 6 and specifies partial touch area definition data indicating the “JKL” button based on the position coordinates of the input start point in the trajectory data.
- the pattern recognition processing unit 4 then searches the storage unit 5 with the data identifying this partial touch area (call it “area J”) and, from the partial touch area / input feature pattern correspondence data for area J, extracts the three characters “J”, “K”, and “L” associated with area J as the character recognition target patterns.
- the pattern recognition processing unit 4 performs pattern matching between the pattern of the trajectory acquired from the touch input device 2 and the pattern of the three characters to be recognized.
- suppose the trajectory best matches “L”: the pattern recognition processing unit 4 selects “L” as the best-matching of these three patterns and determines that it is the character intended by the user. As shown in FIG. 3, “L” is then displayed in the recognized-character field on the display screen of the display device 3.
- FIG. 5 is a diagram showing another application example of the input device according to the first embodiment.
- here the present invention is applied to a touch panel on which buttons labeled with the first kana of each Japanese syllabary row (“a”, “ka”, “sa”, “ta”, “na”, “ha”, “ma”, “ya”, “ra”, and “wa”) are arranged.
- each of the nine button areas is a partial touch area, and kana characters input by handwriting are recognized.
- when the input character is composed of a plurality of strokes, as in Japanese, matching may be performed each time one stroke is input, and the input character may be determined before all the strokes have been entered. In this case, if the difference between the matching scores of the leading pattern candidates at a given stroke does not exceed a predetermined threshold, the determination may be deferred to the next stroke.
- Pattern determination for a character composed of a plurality of strokes is performed according to the following processing flow.
- first, as initialization, the pattern recognition processing unit 4 initializes the score holding array score(p)(s) (for p recognition targets and a maximum of s strokes) to 0 (p = 0, s = 0). Then, as the score calculation process, it calculates the score score(p)(s) of the s-th stroke of each recognition pattern p (0 ≤ p < X; p an integer).
- next, as the score summation process, the pattern recognition processing unit 4 calculates the sum total(p, s) of the stroke scores up to the s-th stroke for each recognition target p. It then compares the highest total(p, s) with the second highest; if the difference exceeds the threshold d, the pattern with the highest score is selected and the process ends. If the difference is at most the threshold d, 1 is added to s, the process returns to the score calculation step, and the above processing is repeated.
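- the per-stroke accumulation and early decision could be sketched as follows (a hypothetical reading of the score(p)(s)/total(p, s) procedure, with higher scores meaning better matches):

```python
def decide_early(stroke_scores: list[list[float]], d: float) -> tuple[int, int]:
    """stroke_scores[p][s] is the matching score of recognition pattern p on
    stroke s, i.e. the array score(p)(s). After each stroke the cumulative
    sums total(p, s) are compared; if the leader beats the runner-up by more
    than the threshold d, that pattern is selected without waiting for the
    remaining strokes. Returns (pattern index, stroke index of the decision)."""
    totals = [0.0] * len(stroke_scores)
    num_strokes = len(stroke_scores[0])
    best = 0
    for s in range(num_strokes):
        for p, per_stroke in enumerate(stroke_scores):
            totals[p] += per_stroke[s]                  # total(p, s)
        order = sorted(range(len(totals)), key=totals.__getitem__, reverse=True)
        best = order[0]
        if len(order) == 1 or totals[order[0]] - totals[order[1]] > d:
            return best, s                              # decided early
    return best, num_strokes - 1                        # all strokes consumed
```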
- FIG. 6 is a diagram showing another application example of the input device according to the first embodiment.
- here the present invention is applied to a touch panel on which 12 buttons are arranged: the row-head kana “a”, “ka”, “sa”, “ta”, “na”, “ha”, “ma”, “ya”, “ra”, and “wa”, together with buttons such as “small” (for small kana) and “n”.
- each of the 12 button areas is a partial touch area, and kana characters inputted by handwriting are recognized.
- FIG. 6B shows the partial touch area / input feature pattern correspondence data for the partial touch area of the “ta” button.
- for example, let d1 be the side length of the square circumscribing the trajectory of the previously input character, and d2 the side length of the square circumscribing the trajectory now recognized as the character “tsu”. The pattern recognition processing unit 4 compares d1 and d2; if d1 > d2 and the difference exceeds a predetermined threshold, it finally recognizes the character as the small “tsu”. That is, the pattern candidate “tsu” to which the flag “small”, indicating a small kana, has been assigned becomes the recognition result.
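- the size comparison reduces to a one-line test; the following sketch assumes the “small” flag is encoded as a string prefix, which is purely illustrative:

```python
def finalize_kana(candidate: str, prev_side: float, cur_side: float,
                  threshold: float) -> str:
    """d1 (prev_side) is the side of the square circumscribing the previous
    character's trajectory, d2 (cur_side) that of the character just
    recognized; if d1 exceeds d2 by more than the threshold, the candidate
    flagged as a small kana is returned instead."""
    if prev_side - cur_side > threshold:
        return "small:" + candidate   # e.g. small "tsu" instead of "tsu"
    return candidate
```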
- FIG. 7 is a diagram showing an example of a pattern registration process used for character recognition, and shows a case where the numbers 1, 2, and 3 are recognized.
- in the example shown in FIG. 7, each pattern is registered as an ordered sequence of points over an N×N (here 5×5) grid of regions. The recognition patterns are registered in a recognition library, not shown in FIG. 1, which is stored in a memory that the pattern recognition processing unit 4 can read as needed.
- for example, the recognition pattern for the numeral “1” is registered as the point sequence <3,1: 3,2: 3,3: 3,4: 3,5>.
- the recognition pattern for the numeral “2” is registered as <2,2: 2,1: 3,1: 4,1: 4,2: 4,3: 3,3: 3,4: 2,4: 1,5: 2,5: 3,5: 4,5: 5,5>.
- the recognition pattern for the numeral “3” is registered as <2,1: 3,1: 4,1: 4,2: 3,2: 3,3: 2,3: 3,3: 3,4: 4,4: 4,5: 3,5: 2,5>.
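- as a sketch, the recognition library of FIG. 7 is just a mapping from each character to its ordered (column, row) cells on the 5×5 grid; the sequences below follow the listings above and are indicative only:

```python
# Recognition library: ordered point sequences on the 5x5 grid (FIG. 7).
RECOGNITION_LIBRARY: dict[str, list[tuple[int, int]]] = {
    "1": [(3, 1), (3, 2), (3, 3), (3, 4), (3, 5)],
    "2": [(2, 2), (2, 1), (3, 1), (4, 1), (4, 2), (4, 3), (3, 3), (3, 4),
          (2, 4), (1, 5), (2, 5), (3, 5), (4, 5), (5, 5)],
    "3": [(2, 1), (3, 1), (4, 1), (4, 2), (3, 2), (3, 3), (2, 3), (3, 3),
          (3, 4), (4, 4), (4, 5), (3, 5), (2, 5)],
}
```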
- FIG. 8 is a diagram showing a normalization process for a locus input by handwriting.
- when the pattern recognition processing unit 4 acquires a trajectory input from the touch input area 2a, it detects the position coordinates of the four corners of the rectangle circumscribing the input trajectory and converts (normalizes) this rectangle to the (5×5) grid of regions used by the recognition patterns.
- for example, the numeral “2” input by handwriting is normalized to the sequence <1,1: 2,1: 3,1: 4,2: 4,3: 3,3: 2,4: 1,5: 2,5: 3,5: 4,5>.
- the pattern recognition processing unit 4 then calculates the distance between each recognition pattern (5×5) read from the recognition library and the handwritten trajectory normalized to (5×5). The distance between point sequences of different lengths may be calculated, for example, by stretching the shorter sequence and summing the distances at each point. The pattern recognition processing unit 4 performs this distance calculation against all recognition patterns registered in the library and determines the pattern with the smallest distance as the recognition result.
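- a hedged sketch of the normalization and the suggested stretch-and-compare distance (the exact cell mapping and the Manhattan metric are assumptions; the patent only requires some distance between point sequences):

```python
def normalize(points: list[tuple[float, float]], n: int = 5) -> list[tuple[int, int]]:
    """FIG. 8: take the rectangle circumscribing the trajectory and map each
    point into the n x n grid used by the recognition patterns."""
    x_min, x_max = min(p[0] for p in points), max(p[0] for p in points)
    y_min, y_max = min(p[1] for p in points), max(p[1] for p in points)
    w = (x_max - x_min) or 1.0   # avoid division by zero for vertical strokes
    h = (y_max - y_min) or 1.0
    cells: list[tuple[int, int]] = []
    for x, y in points:
        cx = min(n, int((x - x_min) / w * n) + 1)   # grid columns 1..n
        cy = min(n, int((y - y_min) / h * n) + 1)   # grid rows 1..n
        if not cells or cells[-1] != (cx, cy):      # drop consecutive repeats
            cells.append((cx, cy))
    return cells

def pattern_distance(a: list[tuple[int, int]], b: list[tuple[int, int]]) -> float:
    """Stretch the shorter sequence so both have equal length, then sum the
    per-point distances; the registered pattern with the smallest total wins."""
    if len(a) < len(b):
        a, b = b, a
    stretched = [b[i * len(b) // len(a)] for i in range(len(a))]
    return sum(abs(p[0] - q[0]) + abs(p[1] - q[1]) for p, q in zip(a, stretched))
```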
- as described above, according to Embodiment 1, the partial touch area of the touch input area 2a of the touch input device 2 corresponding to an input button displayed on the input screen of the display device 3 is defined, as partial touch area definition data, by its position on the touch input area 2a; the partial touch area containing the input start position of a trajectory input to the touch input area 2a is identified by referring to this data; the pattern candidates associated with the identified partial area are acquired by referring to the correspondence data in which the recognition targets selected according to the display content of each input button are registered in association with its partial area; and the pattern corresponding to the trajectory is recognized using the acquired pattern candidates.
- that is, the pattern candidates are those set for the partial touch area in which the input of the first stroke started, and the pattern candidate that contains the first stroke and matches best is determined as the recognition result. By narrowing down the recognition targets according to the position where input starts in this way, the intended input can be determined before all the strokes of a multi-stroke character have been entered.
- furthermore, the size of the character currently being processed is compared with the size of the character input before it; if the current character is smaller than the previous one and the difference exceeds a predetermined threshold, the character currently being processed is judged to be a small (lowercase) kana. In this way, small kana can be entered naturally, without using a key or input mode dedicated to them.
- in the above description, the touch input device 2 is a device provided separately from the display device 3, but the touch input device 2 may also be integrated with the display device 3, as in a touch panel.
- examples of a touch input device 2 configured separately from the display device 3 include pointing devices for the display device 3 such as an input pad mounted on a PC or a remote controller.
- in Embodiment 1 above, the pattern recognition processing unit 4 refers to the partial touch area definition data and detects the corresponding partial touch area; in Embodiment 2, by contrast, the partial touch area itself is not detected. Instead, pattern recognition is performed by calculating the distance to each partial touch area. With this processing, the input character can be detected even when the handwriting input start point does not fall strictly within a partial touch area, and recognition accuracy is improved over the conventional approach.
- the input device according to Embodiment 2 is basically the same as the configuration described with reference to FIG. 1 in Embodiment 1, differing only in that the pattern recognition processing unit does not detect the partial touch area itself but performs pattern recognition from the distances to the partial touch areas. Accordingly, FIG. 1 is also referred to below for the configuration of the input device according to Embodiment 2.
- FIG. 9 is a flowchart showing a flow of operations performed by the pattern recognition processing unit according to the second embodiment of the present invention.
- the user performs handwriting input by touch operation on the touch input area 2 a of the touch input device 2.
- the trajectory data obtained by handwriting input is acquired by the touch input device 2 and transmitted to the pattern recognition processing unit 4 as trajectory input.
- when the pattern recognition processing unit 4 acquires a trajectory input from the touch input device 2 (step ST1a), it refers to the partial touch area definition data in the storage unit 6 and calculates the distance from the position coordinates of the trajectory's input start point to each of the partial touch areas defined by all the partial touch area definition data stored in the storage unit 6 (step ST2a).
- as the distance to a partial touch area, for example, the shortest distance from the position coordinates of the input start point to the rectangle defining the partial touch area by expression (1) above, or the distance to the center coordinates of that rectangle, is used.
- let the number of partial touch areas be N, and let the sequence of distances to partial touch areas 1 to N be <r_1, r_2, ..., r_N>.
- next, the pattern recognition processing unit 4 compares the distances r_1 to r_N with a predetermined threshold and determines whether there is a partial touch area whose distance is at most the threshold (step ST3a). If no distance is at or below the threshold (all exceed it) (step ST3a; NO), the pattern recognition processing unit 4 returns to step ST1a to acquire a trajectory, and repeats steps ST1a to ST3a until a partial touch area whose distance from the input start point is at most the threshold appears.
- if there are partial touch areas whose distance is at most the threshold (step ST3a; YES), the pattern recognition processing unit 4 selects partial touch areas in order of the weighting value determined by their distance from the input trajectory, refers in the storage unit 5 to the partial touch area / input feature pattern correspondence data of each selected area, and executes pattern matching between the patterns registered in that data and the trajectory input acquired in step ST1a (step ST4a).
- the pattern candidate obtained as the recognition result of this pattern matching is output by the pattern recognition processing unit 4 to the display device 3, and the recognized pattern is displayed on the display screen of the display device 3 (step ST5a).
- the weighting is defined as follows. As shown in FIG. 10, when there are four partial touch areas 1 to 4 and the start point is P, let d_1, d_2, d_3, d_4 be the distances from the point P to the centers of areas 1 to 4. The weight of area i is then defined as 1 - d_i/D, where D = d_1 + d_2 + d_3 + d_4, so that the shorter the distance, the larger the weighting value. The evaluation value of each area is its distance-independent matching score multiplied by this weight.
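- the weighting itself is a few lines; the sketch below uses center distances as in FIG. 10 (the function name and the Euclidean metric are assumptions):

```python
import math

def area_weights(start: tuple[float, float],
                 centers: list[tuple[float, float]]) -> list[float]:
    """Weights of Embodiment 2: with d_i the distance from the start point P
    to the center of area i and D = d_1 + ... + d_N, the weight of area i is
    1 - d_i / D, so nearer areas get larger weights; the evaluation value is
    the distance-independent matching score multiplied by this weight."""
    d = [math.hypot(start[0] - cx, start[1] - cy) for cx, cy in centers]
    D = sum(d) or 1.0   # guard against the degenerate all-zero case
    return [1.0 - di / D for di in d]
```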
- as described above, according to Embodiment 2, the distance from the handwritten input trajectory to each partial touch area is calculated, and the partial touch areas close to the trajectory are selected according to this distance for the pattern recognition processing. In this way, even when the handwriting input start point does not fall strictly within a partial touch area, the partial touch area is identified from the approximate position of the input start point and character recognition can be performed. Moreover, by selecting partial touch areas with distance-based weighting, the character recognition targets are narrowed down and the recognition speed can be improved.
- FIG. 11 is a block diagram showing the configuration of an input device according to Embodiment 3 of the present invention.
- the input device 1A according to Embodiment 3 adds a storage unit 7 for pattern / display correspondence data to the configuration described with reference to FIG. 1 in Embodiment 1.
- based on the detected partial touch area (for example, the “na” button in FIG. 12, described later) and the input feature pattern (for example, the pattern “e” in FIG. 12), the pattern recognition processing unit 4 refers to the pattern / display correspondence data read from the storage unit 7 and can thereby display on the display device 3 the character “ne”, the display character corresponding to that partial touch area and pattern.
- the pattern / display correspondence data is, for example, the following: <a; a, i, u, e, o> <ka; ka, ki, ku, ke, ko> <sa; sa, shi, su, se, so> <ta; ta, chi, tsu, te, to> <na; na, ni, nu, ne, no> ... <wa; wa, null, null, null, wo>
- here, the character written before the “;” in each < > entry (the first sound of each consonant row of the Japanese syllabary: “a”, “ka”, “sa”, ..., “wa”) is the character displayed on the button, and the characters listed after the “;” are those obtained by combining the button's character with each of the pattern candidates “a”, “i”, “u”, “e”, “o” corresponding to the vowel phoneme symbols; null means there is no corresponding character.
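- the pattern / display correspondence data amounts to a two-level lookup; a sketch with a few rows filled in (the dict layout is an assumption, null rendered as None):

```python
# Pattern / display correspondence data of Embodiment 3: row-head character
# on the button -> vowel pattern candidate -> displayed kana (None = null).
PATTERN_DISPLAY: dict[str, dict[str, str | None]] = {
    "あ": {"a": "あ", "i": "い", "u": "う", "e": "え", "o": "お"},
    "か": {"a": "か", "i": "き", "u": "く", "e": "け", "o": "こ"},
    "な": {"a": "な", "i": "に", "u": "ぬ", "e": "ね", "o": "の"},
    "わ": {"a": "わ", "i": None, "u": None, "e": None, "o": "を"},
}

def compose(button: str, vowel: str) -> str | None:
    """Starting input on the "na" button and handwriting "e" yields "ne"
    ("ね"), as in the FIG. 12 example."""
    return PATTERN_DISPLAY[button][vowel]
```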
- FIG. 12 is a diagram showing an application example of the input device according to the third embodiment.
- here the present invention is applied to a touch panel on which ten buttons displaying the first sound of each consonant row, “a”, “ka”, “sa”, “ta”, “na”, “ha”, “ma”, “ya”, “ra”, and “wa”, are arranged.
- in this case, partial touch areas distinguishing the ten buttons from “a” to “wa” are defined by the partial touch area definition data.
- in the partial touch area / input feature pattern correspondence data, the five patterns “a”, “i”, “u”, “e”, and “o”, corresponding to the phoneme symbols of the Japanese vowels, are registered as common pattern candidates for every partial touch area.
- when handwriting input starts, the pattern recognition processing unit 4 refers to the partial touch area / input feature pattern correspondence data corresponding to the partial touch area where the input started, and performs pattern matching between the pattern candidates “a”, “i”, “u”, “e”, “o” and the handwritten trajectory.
- next, the pattern recognition processing unit 4 refers to the pattern / display correspondence data in the storage unit 7, identifies the character that combines the consonant displayed on the button corresponding to the partial touch area where input started with the matched phoneme-symbol pattern candidate, and outputs the identified character to the display device 3 as the recognition result.
- in the example of FIG. 12, input is started on the button displaying the consonant “na” and the vowel phoneme symbol “e” is handwritten, so the pattern candidate “e” is recognized, and the character “ne”, which combines the consonant row “na” with the vowel “e”, is displayed as the recognition result.
- as described above, according to Embodiment 3, the character recognition targets of each partial touch area are limited to the five characters representing the vowel phoneme symbols “a”, “i”, “u”, “e”, and “o”, and the desired character is input as the combination of the consonant determined by the start position of the handwriting input and the vowel phoneme symbol recognized from the handwriting. Narrowing down the recognition targets in this way improves the recognition speed.
- FIG. 13 is a block diagram showing a configuration of an input device according to Embodiment 4 of the present invention.
- the input device 1B according to Embodiment 4 includes a proximity detection system (proximity detection unit) 8 in addition to the configuration described with reference to FIG. 1 in Embodiment 1.
- the proximity detection system 8 is a system that measures the distance between an object such as a hand or a pen that performs an input operation on the touch input device 2 and the touch input area of the touch input device 2.
- for example, the touch input device 2 is configured as a capacitive touch panel that detects the approach of an object from changes in capacitance, and the distance between the object and the touch input area is measured based on the proximity information of the object detected by the capacitive touch panel.
- the proximity detection system 8 measures the distance between an object such as a hand or a pen and the touch input area from the proximity information acquired by the touch input device 2 as described above; when this distance falls below a predetermined threshold, it changes the display data of the touch input area so that one or more partial touch areas near the area the object is approaching are enlarged, and has the display device 3 display them. At this time, the proximity detection system 8 stores the relationship between the relative display positions before and after enlargement in the display data of the touch input area. For example, in the case of FIG. 14 described later, the change is stored in the proximity detection system 8 so that the number of displayed partial touch areas goes from the initial 10 to 4 after enlargement, and the four partial touch areas near the proximity point A are enlarged and displayed.
- the proximity detection system 8 sequentially receives the proximity information of the object from the touch input device 2, measures the distance between the object and the touch input area, and compares the distance with the threshold value.
- when the object moves away to a distance exceeding the threshold, the proximity detection system 8 clears the stored relative display positions and transitions to a state of waiting for new object proximity information from the touch input device 2.
- the proximity detection system 8 outputs the relationship between the relative display positions before and after the enlargement to the pattern recognition processing unit 4.
- the pattern recognition processing unit 4 stores the relationship between the relative display positions before and after the enlargement input from the proximity detection system 8, and uses this to start the pattern recognition processing of the trajectory input by the object.
- if the distance between the object and the touch input area exceeds the threshold again before the pattern recognition processing unit 4 reports the recognition of the trajectory (that is, before pattern recognition is complete), the proximity detection system 8 notifies the pattern recognition processing unit 4 of this.
- when notified that the object has moved away to a distance exceeding the threshold before the pattern recognition of the trajectory is complete, the pattern recognition processing unit 4 clears the relative position information received from the proximity detection system 8 and then transitions to a touch-input waiting state.
- otherwise, the pattern recognition processing unit 4 uses the relative position information input from the proximity detection system 8 together with the position information defining the object's trajectory from the touch input device 2 to search for the partial touch area where input started, and recognizes the input character in the same manner as in Embodiment 1.
- FIG. 14 is a diagram for explaining the process of enlarging and displaying the partial touch areas near the area that an object approaches; FIG. 14(a) shows the touch input area before enlargement, and FIG. 14(b) shows the touch input area after enlargement.
- suppose an object approaches the proximity point A in FIG. 14(a). Let d1 and d2 be the vertical and horizontal dimensions of the rectangle circumscribed about the partial touch areas of the “a”, “ka”, “ta”, and “na” buttons near the proximity point A, and let D1 and D2 be the vertical and horizontal dimensions of that rectangle when enlarged; the corresponding position before enlargement can then be calculated from a position (a, b) in the enlarged rectangle using expression (2), a rescaling by the size ratios d1/D1 and d2/D2.
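- since expression (2) itself is not reproduced in the text, the sketch below assumes it is the plain linear rescaling implied by the size ratios, with (a, b) measured from the origin of the enlarged rectangle:

```python
def to_original_position(a: float, b: float,
                         d1: float, d2: float,
                         D1: float, D2: float) -> tuple[float, float]:
    """Back-map a position (a, b) inside the enlarged rectangle (size D1 x D2)
    to the corresponding position in the pre-enlargement rectangle (d1 x d2)."""
    return (a * d1 / D1, b * d2 / D2)
```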
- as described above, according to Embodiment 4, the approach of an object such as an input hand or pen to the touch input area is detected, and the partial display areas near the detected proximity point are enlarged and displayed; handwritten characters and gestures are then recognized using the pattern candidates and input patterns set for these partial display areas. In this way, on a device with a limited input area and display area, the influence of hand shake and the like can be reduced, enabling reliable and fast recognition.
- since the input device according to the present invention can improve the recognition rate and recognition speed of handwritten character recognition, it is suitable for use in interfaces that use touch operations for character input.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Multimedia (AREA)
- Character Discrimination (AREA)
- User Interface Of Digital Computer (AREA)
Description
Embodiment 1.
FIG. 1 is a block diagram showing the configuration of an input device according to Embodiment 1 of the present invention. In FIG. 1, the input device 1 according to Embodiment 1 includes a touch input device (touch-type input unit) 2, a display device (display unit) 3, a pattern recognition processing unit (recognition processing unit) 4, a storage unit (second storage unit) 5 for partial touch area / input feature pattern correspondence data (correspondence data), and a storage unit (first storage unit) 6 for partial touch area definition data (partial area definition data).
<rectangle(x1, y1, x2, y2): partial area A>   (1)
By narrowing down the pattern candidates in this way, the recognition speed can be improved, and since the most probable candidate among the narrowed-down patterns is recognized, erroneous recognition can also be reduced.
Here, the detailed operation of the pattern recognition processing unit 4 that performs the recognition processing described above will be explained.
FIG. 4 is a flowchart showing the flow of operations by the pattern recognition processing unit 4 in FIG. 1.
First, the user performs handwriting input by touch operation on the touch input area 2a of the touch input device 2. The trajectory data of this handwriting input is acquired by the touch input device 2 and passed to the pattern recognition processing unit 4 as a trajectory input.
The pattern recognition processing unit 4 acquires, as the trajectory input from the touch input device 2, the trajectory data from the input start point on the “JKL” button to the input end point. Next, the pattern recognition processing unit 4 refers to the partial touch area definition data in the storage unit 6 and identifies the partial touch area definition data indicating the “JKL” button based on the position coordinates of the input start point in the trajectory data. After that, the pattern recognition processing unit 4 searches the storage unit 5 with the data identifying this partial touch area (call it “area J”) and extracts, from the partial touch area / input feature pattern correspondence data for area J, the three characters “J”, “K”, and “L” associated with area J as the character recognition target patterns.
First, as initialization, the pattern recognition processing unit 4 initializes the score holding array score(p)(s) (for p recognition targets and a maximum of s strokes) to 0 (p = 0, s = 0). Next, as the score calculation process, the pattern recognition processing unit 4 calculates the score holding array score(p)(s) for the s-th stroke of each recognition pattern p (0 ≤ p < X; p an integer).
FIG. 7 is a diagram showing an example of the registration process of patterns used for character recognition, for the case where the numerals 1, 2, and 3 are recognized. The example in FIG. 7 shows patterns corresponding to recognition over N×N (here 5×5) regions registered as ordered sequences of points. The recognition patterns are registered in a recognition library, not shown in FIG. 1, which is stored in a memory that the pattern recognition processing unit 4 can read as needed.
For example, consider a key button labeled “ABC” whose recognition pattern candidates for character recognition are “A”, “B”, and “C”. When manual input is started on this button, recognition of the handwritten character is limited to only the three characters “A”, “B”, and “C” set for that key button.
In Embodiment 1 above, the case was shown in which the pattern recognition processing unit 4 refers to the partial touch area definition data and detects the corresponding partial touch area. In Embodiment 2, the partial touch area itself is not detected; instead, the distance to each partial touch area is calculated and pattern recognition is performed. With this processing, the input character can be detected even when the handwriting input start point does not fall strictly within a partial touch area, and recognition accuracy can be improved over the conventional approach.
FIG. 9 is a flowchart showing the flow of operations by the pattern recognition processing unit according to Embodiment 2 of the present invention.
First, the user performs handwriting input by touch operation on the touch input area 2a of the touch input device 2. The trajectory data of this handwriting input is acquired by the touch input device 2 and passed to the pattern recognition processing unit 4 as a trajectory input.
As shown in FIG. 10, when there are four partial touch areas 1 to 4 and the start point is P, let d_1, d_2, d_3, d_4 be the distances from the point P to the centers of areas 1 to 4. The weights of areas 1 to 4 are then defined as follows, so that the shorter the distance, the larger the weighting value:
Weight of area 1: 1 - d_1/D
Weight of area 2: 1 - d_2/D
Weight of area 3: 1 - d_3/D
Weight of area 4: 1 - d_4/D
where D = d_1 + d_2 + d_3 + d_4.
The evaluation value is obtained by multiplying each score, computed without considering distance, by this weight.
FIG. 11 is a block diagram showing the configuration of an input device according to Embodiment 3 of the present invention. The input device 1A according to Embodiment 3 adds a storage unit 7 for pattern / display correspondence data to the configuration described with reference to FIG. 1 in Embodiment 1. Based on the detected partial touch area (for example, the “na” button in FIG. 12, described later) and the input feature pattern (for example, the pattern “e” in FIG. 12), the pattern recognition processing unit 4 refers to the pattern / display correspondence data read from the storage unit 7 and can thereby display on the display device 3 the character “ne”, the display character corresponding to that partial touch area.
<a; a, i, u, e, o>
<ka; ka, ki, ku, ke, ko>
<sa; sa, shi, su, se, so>
<ta; ta, chi, tsu, te, to>
<na; na, ni, nu, ne, no>
...
<wa; wa, null, null, null, wo>
Here, the character written before the “;” in each < > entry (the first sound of each consonant row of the Japanese syllabary: “a”, “ka”, “sa”, ..., “wa”) is the character displayed on the button, and the characters listed after the “;” are the characters obtained by combining the button's character with each of the pattern candidates “a”, “i”, “u”, “e”, “o” corresponding to the vowel phoneme symbols. Note that null means there is no corresponding character.
In addition, in the partial touch area / input feature pattern correspondence data, the five patterns “a”, “i”, “u”, “e”, “o”, corresponding to the phoneme symbols of the Japanese vowels, are registered as common pattern candidates for every partial touch area.
By limiting the character recognition targets of each partial touch area to “a”, “i”, “u”, “e”, and “o” in this way, the recognition targets are narrowed down and the recognition speed can be improved.
Moreover, unlike conventional mobile phones, there is no need for the cumbersome operation of pressing the same button multiple times and searching a character candidate list in order to enter Japanese. Furthermore, since only a single phoneme letter needs to be handwritten for each kana, Japanese can be entered with fewer strokes than when handwriting ordinary hiragana.
Also, instead of “a”, “ka”, ..., “wa” in FIG. 12, the buttons may be configured to display the alphabet letters corresponding to the consonants, such as “A”, “K”, “S”, ..., “W”, as in FIG. 15.
FIG. 13 is a block diagram showing the configuration of an input device according to Embodiment 4 of the present invention. In FIG. 13, the input device 1B according to Embodiment 4 includes a proximity detection system (proximity detection unit) 8 in addition to the configuration described with reference to FIG. 1 in Embodiment 1. The proximity detection system 8 is a system that measures the distance between an object performing input operations on the touch input device 2, such as a hand or a pen, and the touch input area of the touch input device 2. For example, the touch input device 2 is configured as a capacitive touch panel that detects the approach of an object from changes in capacitance, and the distance between the object and the touch input area is measured based on the proximity information of the object detected by the capacitive touch panel.
The proximity detection system 8 measures the distance between an object such as a hand or a pen and the touch input area from the proximity information acquired by the touch input device 2 as described above; when this distance falls below a predetermined threshold, it changes the display data of the touch input area so that one or more partial touch areas near the area the object is approaching are enlarged, and has the display device 3 display them. At this time, the proximity detection system 8 stores the relationship between the relative display positions before and after enlargement in the display data of the touch input area.
For example, in the case of FIG. 14 described later, the change is stored in the proximity detection system 8 so that the number of partial touch areas goes from the initial 10 to 4 after enlargement, and the four partial touch areas near the proximity point A are enlarged and displayed.
FIG. 14 is a diagram for explaining the process of enlarging and displaying the partial touch areas near the area that an object approaches; FIG. 14(a) shows the touch input area before enlargement and FIG. 14(b) shows the touch input area after enlargement. Assume that an object approaches the proximity point A in FIG. 14(a). In this case, let d1 and d2 be the vertical and horizontal dimensions of the rectangle circumscribed about the partial touch areas of the “a”, “ka”, “ta”, and “na” buttons near the proximity point A, and let D1 and D2 be the vertical and horizontal dimensions of that rectangle when enlarged; the corresponding position before enlargement can then be calculated from the position (a, b) in the enlarged rectangle using expression (2) below.
Claims (6)
- 1. An input device comprising: a touch-type input unit that inputs a trajectory obtained by touching a touch input area; a display unit that displays an input screen corresponding to the touch input area of the touch-type input unit; a first storage unit that stores partial area definition data in which partial areas of the touch input area of the touch-type input unit corresponding to the input buttons displayed on the input screen of the display unit are defined by their positions on the touch input area; a second storage unit that stores correspondence data in which pattern candidates for pattern recognition, selected according to the display content of each input button, are registered in association with the partial area corresponding to that input button; and a recognition processing unit that refers to the partial area definition data of the first storage unit to identify the partial area containing the input start position of a trajectory input to the touch input area of the touch-type input unit, refers to the correspondence data of the second storage unit to acquire the pattern candidates associated with the identified partial area, and recognizes the pattern corresponding to the trajectory using the acquired pattern candidates.
- 2. The input device according to claim 1, wherein, when there is no partial area containing the input start position of a trajectory obtained by touching the touch input area, the recognition processing unit acquires the pattern candidates associated with a partial area whose distance from the input start position is at most a predetermined threshold, and recognizes the pattern corresponding to the trajectory using the acquired pattern candidates.
- 3. The input device according to claim 1, wherein the second storage unit stores, as the correspondence data, pattern candidates for the characters displayed on the input buttons and characters related to them, and the recognition processing unit, each time a stroke constituting a character is input by touching the touch input area, refers to the correspondence data of the second storage unit to acquire the pattern candidates corresponding to the trajectory of that stroke, and recognizes the pattern corresponding to the trajectory using the acquired pattern candidates.
- 4. The input device according to claim 1, wherein the second storage unit stores, as the correspondence data, pattern candidates for the hiragana and katakana characters displayed on the input buttons, and the recognition processing unit compares the sizes, on the touch input area, of the trajectory whose pattern was recognized last time and the trajectory input this time; when the trajectory input this time is smaller, it refers to the correspondence data of the second storage unit, acquires the pattern candidate corresponding to the current trajectory from among the small-form hiragana or katakana pattern candidates, and recognizes the pattern corresponding to the trajectory using the acquired pattern candidate.
- 5. The input device according to claim 1, wherein the input buttons each display the first character of a consonant row of the Japanese syllabary; the second storage unit stores, as the correspondence data, only the pattern candidates of the phoneme symbols “a”, “i”, “u”, “e”, “o” representing the Japanese vowels, in association with the partial area corresponding to each input button; and the recognition processing unit refers to the partial area definition data of the first storage unit to identify the partial area containing the input start position of a trajectory input to the touch input area of the touch-type input unit, refers to the correspondence data of the second storage unit to acquire the pattern candidates associated with the identified partial area, and, upon identifying the pattern candidate corresponding to the trajectory obtained by touching the touch input area using the acquired pattern candidates, takes as the recognition result the character formed by combining the first character of the consonant row displayed on the input button with the phoneme symbol of the Japanese vowel of the identified pattern candidate.
- 6. The input device according to claim 1, further comprising a proximity detection unit that detects an object approaching the touch input area, wherein the display unit enlarges and displays the input buttons corresponding to the partial areas around the position on the touch input area approached by the object detected by the proximity detection unit.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN2010800193076A CN102414648A (zh) | 2009-04-28 | 2010-04-01 | Input device |
US13/148,761 US20120069027A1 (en) | 2009-04-28 | 2010-04-01 | Input device |
JP2011511276A JP5208267B2 (ja) | 2009-04-28 | 2010-04-01 | Input device |
DE112010001796T DE112010001796T5 (de) | 2009-04-28 | 2010-04-01 | Input device |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2009-109258 | 2009-04-28 | ||
JP2009109258 | 2009-04-28 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2010125744A1 true WO2010125744A1 (ja) | 2010-11-04 |
Family
ID=43031904
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2010/002409 WO2010125744A1 (ja) | 2009-04-28 | 2010-04-01 | 入力装置 |
Country Status (5)
Country | Link |
---|---|
US (1) | US20120069027A1 (ja) |
JP (1) | JP5208267B2 (ja) |
CN (1) | CN102414648A (ja) |
DE (1) | DE112010001796T5 (ja) |
WO (1) | WO2010125744A1 (ja) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102662487A (zh) * | 2012-03-31 | 2012-09-12 | 刘炳林 | Display keyboard, input processing method, and device |
Families Citing this family (23)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPWO2010095255A1 (ja) * | 2009-02-23 | 2012-08-16 | 富士通株式会社 | Information processing device, display control method, and display control program |
KR101695818B1 (ko) * | 2010-07-28 | 2017-01-13 | 엘지전자 주식회사 | Mobile terminal and method of controlling a virtual keypad thereof |
KR101978687B1 (ko) * | 2011-11-15 | 2019-05-16 | 삼성전자주식회사 | Method and apparatus for character input in a touch-screen terminal |
US9323726B1 (en) * | 2012-06-27 | 2016-04-26 | Amazon Technologies, Inc. | Optimizing a glyph-based file |
CN102841682B (zh) * | 2012-07-12 | 2016-03-09 | 宇龙计算机通信科技(深圳)有限公司 | Terminal and gesture control method |
DE102012015255A1 (de) * | 2012-08-01 | 2014-02-06 | Volkswagen Aktiengesellschaft | Display and operating device and method for controlling a display and operating device |
KR102091710B1 (ko) * | 2012-08-28 | 2020-04-14 | 삼성전자주식회사 | Coordinate measuring apparatus and method of controlling the same |
US9645729B2 (en) * | 2012-10-18 | 2017-05-09 | Texas Instruments Incorporated | Precise object selection in touch sensing systems |
US10185416B2 (en) | 2012-11-20 | 2019-01-22 | Samsung Electronics Co., Ltd. | User gesture input to wearable electronic device involving movement of device |
US8994827B2 (en) | 2012-11-20 | 2015-03-31 | Samsung Electronics Co., Ltd | Wearable electronic device |
US10551928B2 (en) | 2012-11-20 | 2020-02-04 | Samsung Electronics Company, Ltd. | GUI transitions on wearable electronic device |
US11237719B2 (en) | 2012-11-20 | 2022-02-01 | Samsung Electronics Company, Ltd. | Controlling remote electronic device with wearable electronic device |
US11372536B2 (en) | 2012-11-20 | 2022-06-28 | Samsung Electronics Company, Ltd. | Transition and interaction model for wearable electronic device |
US10423214B2 (en) | 2012-11-20 | 2019-09-24 | Samsung Electronics Company, Ltd | Delegating processing from wearable electronic device |
US11157436B2 (en) | 2012-11-20 | 2021-10-26 | Samsung Electronics Company, Ltd. | Services associated with wearable electronic device |
CN103902090A (zh) * | 2012-12-29 | 2014-07-02 | 深圳雷柏科技股份有限公司 | Method and system for implementing borderless touch technology |
US9317125B2 (en) | 2013-04-24 | 2016-04-19 | Microsoft Technology Licensing, Llc | Searching of line pattern representations using gestures |
US9275480B2 (en) | 2013-04-24 | 2016-03-01 | Microsoft Technology Licensing, Llc | Encoding of line pattern representation |
US9721362B2 (en) * | 2013-04-24 | 2017-08-01 | Microsoft Technology Licensing, Llc | Auto-completion of partial line pattern |
JP6125333B2 (ja) * | 2013-05-31 | 2017-05-10 | 株式会社東芝 | Search device, method, and program |
US10691332B2 (en) | 2014-02-28 | 2020-06-23 | Samsung Electronics Company, Ltd. | Text input on an interactive display |
US9430702B2 (en) * | 2014-07-10 | 2016-08-30 | Korea Electronics Technology Institute | Character input apparatus and method based on handwriting |
KR102717063B1 (ko) * | 2018-12-24 | 2024-10-15 | 삼성전자주식회사 | Electronic device and control method of the electronic device |
Family Cites Families (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPS60136868A (ja) | 1983-12-26 | 1985-07-20 | Sharp Corp | 日本語入力装置 |
JPH09161011A (ja) | 1995-12-13 | 1997-06-20 | Matsushita Electric Ind Co Ltd | 手書き文字入力装置 |
US6271835B1 (en) * | 1998-09-03 | 2001-08-07 | Nortel Networks Limited | Touch-screen input device |
FI20012209A (fi) * | 2001-11-14 | 2003-06-24 | Nokia Corp | Menetelmä informaation esittämisen ohjaamiseksi elektroniikkalaitteessa ja elektroniikkalaite |
US20040212601A1 (en) * | 2003-04-24 | 2004-10-28 | Anthony Cake | Method and apparatus for improving accuracy of touch screen input devices |
US7979795B2 (en) * | 2004-08-02 | 2011-07-12 | Hewlett-Packard Development Company, L.P. | System and method for inputting syllables of a phonetic script into a computer |
US7561737B2 (en) * | 2004-09-22 | 2009-07-14 | Microsoft Corporation | Mathematical expression recognition |
CN100353301C (zh) * | 2006-04-19 | 2007-12-05 | 劳英杰 | Japanese character input method |
KR100949581B1 (ko) * | 2007-10-08 | 2010-03-25 | 주식회사 자코드 | Character/numeral input device and input method for a communication terminal |
US20090251422A1 (en) * | 2008-04-08 | 2009-10-08 | Honeywell International Inc. | Method and system for enhancing interaction of a virtual keyboard provided through a small touch screen |
CN101261564A (zh) * | 2008-04-14 | 2008-09-10 | 昆明理工大学 | Virtual keyboard for inputting Chinese characters and operating method thereof |
CN101286097A (zh) * | 2008-06-02 | 2008-10-15 | 昆明理工大学 | Chinese character input method |
CN100593151C (zh) * | 2008-07-04 | 2010-03-03 | 金雪松 | Japanese input method and terminal |
-
2010
- 2010-04-01 JP JP2011511276A patent/JP5208267B2/ja not_active Expired - Fee Related
- 2010-04-01 CN CN2010800193076A patent/CN102414648A/zh active Pending
- 2010-04-01 WO PCT/JP2010/002409 patent/WO2010125744A1/ja active Application Filing
- 2010-04-01 US US13/148,761 patent/US20120069027A1/en not_active Abandoned
- 2010-04-01 DE DE112010001796T patent/DE112010001796T5/de not_active Ceased
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH0887380A (ja) * | 1994-09-19 | 1996-04-02 | Tabai Espec Corp | Operation panel device adapted to the operating object |
JP2000035857A (ja) * | 1996-02-19 | 2000-02-02 | Misawa Homes Co Ltd | Screen-display type key input device |
JP2001325064A (ja) * | 2000-03-10 | 2001-11-22 | Misawa Homes Co Ltd | Screen-display type key input device |
JP2002133369A (ja) * | 2000-10-30 | 2002-05-10 | Sony Corp | Handwritten character input method and device, and program storage medium |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102662487A (zh) * | 2012-03-31 | 2012-09-12 | 刘炳林 | Display keyboard, input processing method, and device |
CN102662487B (zh) * | 2012-03-31 | 2017-04-05 | 刘炳林 | Display keyboard, input processing method, and device |
Also Published As
Publication number | Publication date |
---|---|
CN102414648A (zh) | 2012-04-11 |
JP5208267B2 (ja) | 2013-06-12 |
JPWO2010125744A1 (ja) | 2012-10-25 |
DE112010001796T5 (de) | 2012-08-09 |
US20120069027A1 (en) | 2012-03-22 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5208267B2 (ja) | Input device | |
KR101061317B1 (ko) | Alphabet text input method and device | |
JP6419162B2 (ja) | Character input device and character input method | |
US9021380B2 (en) | Incremental multi-touch gesture recognition | |
CN108700996B (zh) | 用于多输入管理的系统和方法 | |
US10133479B2 (en) | System and method for text entry | |
JP5897725B2 (ja) | User interface device, user interface method, program, and computer-readable information storage medium | |
JP2006524955A (ja) | Unambiguous text input method for touch screens and reduced keyboards | |
US9529448B2 (en) | Data entry systems and methods | |
JP5075997B2 (ja) | Electronic device, program, and character string recognition method | |
KR20180119647A (ko) | Method for inserting characters into a character string and corresponding digital device | |
US8976134B2 (en) | Character input device and character input method | |
CN108369637B (zh) | 用于美化数字墨水的系统和方法 | |
JPWO2013171919A1 (ja) | Display control device, control program, and control method of display device | |
US7979795B2 (en) | System and method for inputting syllables of a phonetic script into a computer | |
CN102109951A (zh) | Method for inputting Chinese characters by combining input characters and hidden background characters | |
JP5897726B2 (ja) | User interface device, user interface method, program, and computer-readable information storage medium | |
US20150089432A1 (en) | Quick data entry systems and methods | |
JP2011237876A (ja) | Character input device, character input method, and character input program | |
JP6409165B2 (ja) | Electronic device and handwritten character input program | |
JP2018018366A (ja) | Information processing device, character input program, and character input method | |
CN108733227B (zh) | Input device and input method thereof | |
WO2024110354A1 (en) | Setting font size in an unconstrained canvas | |
JP6226472B2 (ja) | Input support device, input support system, and program | |
KR20190006470A (ko) | Multi-touch recognition method for touch buttons on a touch screen, character input method, and object transformation method | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 201080019307.6 Country of ref document: CN |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 10769446 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2011511276 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 13148761 Country of ref document: US |
|
WWE | Wipo information: entry into national phase |
Ref document number: 112010001796 Country of ref document: DE Ref document number: 1120100017964 Country of ref document: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 10769446 Country of ref document: EP Kind code of ref document: A1 |