WO2008105574A1 - Easy handwriting user interface system and method for pen-input display - Google Patents
- Publication number: WO2008105574A1 (application PCT/KR2007/002058)
- Authority: WIPO (PCT)
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R11/00—Arrangements for holding or mounting articles, not otherwise provided for
- B60R11/02—Arrangements for holding or mounting articles, not otherwise provided for for radio sets, television sets, telephones, or the like; Arrangement of controls thereof
- B60R11/0229—Arrangements for holding or mounting articles, not otherwise provided for for radio sets, television sets, telephones, or the like; Arrangement of controls thereof for displays, e.g. cathodic tubes
- B60R11/0235—Arrangements for holding or mounting articles, not otherwise provided for for radio sets, television sets, telephones, or the like; Arrangement of controls thereof for displays, e.g. cathodic tubes of flat type, e.g. LCD
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R21/00—Arrangements or fittings on vehicles for protecting or preventing injuries to occupants or pedestrians in case of accidents or other traffic risks
- B60R21/02—Occupant safety arrangements or fittings, e.g. crash pads
- B60R21/055—Padded or energy-absorbing fittings, e.g. seat belt anchors
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V30/00—Character recognition; Recognising digital ink; Document-oriented image-based pattern recognition
- G06V30/10—Character recognition
- G06V30/32—Digital ink
- G06V30/36—Matching; Classification
- G06V30/387—Matching; Classification using human interaction, e.g. selection of the best displayed recognition candidate
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09F—DISPLAYING; ADVERTISING; SIGNS; LABELS OR NAME-PLATES; SEALS
- G09F27/00—Combined visual and audible advertising or displaying, e.g. for public address
- G09F27/008—Sun shades, shades, hoods or louvres on electronic displays to minimise the effect of direct sun light on the display
Abstract
Provided are a handwriting user interface system and method for a pen-input display. When the handwriting input function is activated by the user, a transparent handwriting input window is displayed overlapping the current working window. The user's pen input is then received through a touch screen, and the pen input value is recognized in response to the activation of the pen input recognition function. The recognition result is displayed in the transparent handwriting input window. In response to the activation of the recognition result input function, the recognition result is transferred to the working area of the working window. Consequently, a working window in which the user's pen input value has been entered in the working area is obtained. When there are two or more pen input values and they are recognized in sequence, each of the recognition results is preferably treated as an independent entity. The handwriting user interface system and method for a pen-input display enable easy handwriting input and allow various additional applications.
Description
EASY HANDWRITING USER INTERFACE SYSTEM AND METHOD FOR PEN-INPUT DISPLAY
Technical Field
[1] The present invention relates to a handwriting user interface system and method for a pen-input display.
Background Art
[2] Various mobile electronic terminals such as Sony's UX and Samsung's Q1 are on the market. Such mobile electronic terminals have no character input device such as a keyboard, so character input is typically performed through touch-screen pen input. Fig. 1 illustrates a typical handwriting user interface for a conventional pen-input display. Fig. 1(a) shows the user interface by which the user browses a particular working window (10). While the working window (10) is activated, an input of a particular character value may be requested through pen input at a working area (11). When the handwriting input function is activated by the user, a handwriting input window (20) is created and displayed. As illustrated in Fig. 1(b), the handwriting input window (20) is displayed so as to partially overlap the lower part of the working window (10). The handwriting input window (20) is an opaque window and covers part of the working window (10). When a particular character is input in the handwriting input window (20) through touch-screen pen input, the pen input value (21) is displayed in the handwriting input window (20). If no further pen input occurs for a predetermined period of time, or the "handwriting recognition" menu (23) is selected, the pen input recognition function is activated by a handwriting management application equipped inside the mobile terminal. Then, as illustrated in Fig. 1(c), the recognition result (22) is displayed in the handwriting input window (20), replacing the pen input value (21). When the user selects the "recognition result input" menu (24), the recognition result (22) is entered in the working area (11) of the working window (10).
[3] The problem of conventional handwriting input is that it distracts the user's attention. Specifically, a "jump of focus of attention" between the working window (10) and the handwriting input window (20) is required of the user. Although handwriting recognition is steadily improving, a recognition failure rate of about 10% is unavoidable. In the conventional technique, the character value entered in the handwriting input window (20) is recognized as a single operation value. Thus, errors are not easy to correct, and a delete key has to be used for correction. Further, at times, the handwriting input window (20) may be activated so as to cover the working area (11) of the working window (10). In this situation, the working window (10) has to be rearranged.
Disclosure of Invention
Technical Problem
[4] An object of the present invention is to provide a handwriting user interface system and method for a pen-input display that enable simple and efficient pen input without interrupting the user's browsing of the working window.
[5] Another object of the present invention is to provide a handwriting user interface system and method for a pen-input display that do not require the user to jump focus of attention between the current working window and the handwriting input window during character input.
[6] Still another object of the present invention is to provide a handwriting user interface system and method for a pen-input display that enable convenient input of the handwriting recognition result in the working area of the working window.
[7] Yet another object of the present invention is to provide a handwriting user interface system and method for a pen-input display that enable selective input of the wanted character value in the working area when there is more than one recognition result.
Technical Solution
[8] In accordance with a preferred embodiment of the present invention, there is provided a handwriting character input method for a pen-input display which comprises the steps of: a) in response to the user's activation of a handwriting input function, displaying a transparent handwriting input window overlapping a current working window; b) receiving pen input of the user through a touch screen and, in response to activation of a pen input recognition function, recognizing the pen input; c) displaying the recognition result in the handwriting input window; and d) in response to activation of a recognition result input function, transferring the recognition result to a working area of the working window.
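The four claimed steps a)–d) can be sketched as a minimal state model. All class, method, and variable names below are illustrative assumptions for the sake of the sketch; they are not taken from the patent, and the toy recognizer stands in for an actual handwriting engine.

```python
# Hypothetical sketch of the claimed four-step input method (steps a-d).
# Names and structures are illustrative assumptions, not the patent's code.

class HandwritingSession:
    """Models one pass through the claimed method."""

    def __init__(self, working_area):
        self.working_area = working_area      # text field in the working window
        self.window_visible = False           # transparent overlay state
        self.pen_strokes = []
        self.recognition_results = []

    def activate_input(self):                 # step a) show the transparent overlay
        self.window_visible = True

    def receive_pen_input(self, strokes):     # step b) collect strokes via touch screen
        self.pen_strokes.append(strokes)

    def recognize(self, recognizer):          # steps b)/c) recognize and display results
        self.recognition_results = [recognizer(s) for s in self.pen_strokes]
        return self.recognition_results

    def transfer(self, index):                # step d) move the chosen result over
        self.working_area.append(self.recognition_results[index])
        self.window_visible = False           # overlay closes once the transfer is done


# Usage: a toy "recognizer" that just normalizes letter case.
area = []
session = HandwritingSession(area)
session.activate_input()
session.receive_pen_input("gArDeN")
session.receive_pen_input("hOuSe")
session.recognize(str.capitalize)
session.transfer(1)
print(area)   # -> ['House']
```

The point of the sketch is the ordering constraint the claim imposes: recognition results only exist after step b), and the overlay state is tied to the transfer in step d).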
Advantageous Effects
[9] The handwriting user interface system and method for pen-input display according to the present invention provide the following advantages.
[10] First, the handwriting recognition area and the working area are overlapped without interrupting the user's visual recognition of the working area. Accordingly, the user's attention can remain focused. The conventional technique requires the user to jump focus of attention between the working area and the handwriting recognition area; in the present invention this is unnecessary.
[11] Second, each pen input value and its recognition result are recognized as independent entities, so the user can control them more conveniently. A recognition error can be corrected simply by re-input, and two or more recognition results can be selectively controlled.
[12] Third, the recognition result can be controlled more conveniently by transferring the recognition result to the working window through drag and drop. This further simplifies the selective input from two or more recognition results.
[13] Fourth, the transparent handwriting input window remains activated while the activation button is pressed and becomes inactivated when the button is released. This enables more efficient handwriting input.
Brief Description of the Drawings
[14] Fig. 1 illustrates a typical example of the conventional handwriting user interface for pen-input display.
[15] Fig. 2 is a flowchart illustrating a preferred embodiment of the handwriting user interface for pen-input display in accordance with the present invention.
[16] Figs. 3 to 5 illustrate a preferred embodiment of the handwriting user interface for pen-input display in accordance with the present invention.
[17] Fig. 6 is a perspective view illustrating a preferred embodiment of a mobile electronic terminal equipped with the handwriting user interface for pen-input display in accordance with the present invention.
Mode for the Invention
[18] Fig. 2 is a flowchart illustrating a preferred embodiment of the handwriting user interface for pen-input display in accordance with the present invention and Figs. 3 to 5 illustrate preferred embodiments of the handwriting user interface for pen-input display in accordance with the present invention.
[19] Hereinafter, the present invention will be more fully illustrated with reference to Figs. 2 to 5. The user may select a handwriting input function for character input (e.g., a keyword) in the working area (101) while the current working window (100) is displayed on a mobile electronic terminal (S100). In response to the activation of the handwriting input function, a transparent handwriting input window (200) is displayed over the working window (100) in an overlapped manner, as illustrated in Fig. 3 (S200). The handwriting input function may be assigned to a particular button (500 in Fig. 6) or a menu of the mobile electronic terminal (1). Preferably, the handwriting input function is assigned to a particular button (500 in Fig. 6). In this case, the handwriting input function is activated while the button (500 in Fig. 6) is pressed and is inactivated when the button is released.
[20] The handwriting input window (200) is a transparent intermediate window. As used herein, "transparent" means that the text and/or image in the working window (100) can be visually recognized even when it is overlapped by the activated handwriting input window (200). Typically, the transparency can be adjusted within the range of 50 to 100%. In this range, the content of the working window (100) remains sufficiently legible. Consequently, the content of the working window (100) can be visually recognized even when the transparent handwriting input window (200) is arranged over the working window (100). And by releasing the handwriting input function button (500 in Fig. 6), the user can resume browsing the working window (100) at any time. Preferably, the transparent handwriting input window (200) has a size identical to that of the working window (100), so that it covers the working window (100) entirely. The reference numerals 205 and 206 in Fig. 3 denote the close menu and the size control menu of the transparent handwriting input window (200), respectively. The menus 205 and 206 are optional.
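The adjustable transparency can be modeled as a per-pixel blend of the overlay over the working window. The sketch below clamps the transparency to the 50–100% range described above and applies a standard linear "over" blend; the blend formula is common graphics practice, not something the patent specifies.

```python
# Illustrative sketch: clamp the overlay transparency to the described 50-100%
# range and composite one overlay pixel over a working-window pixel. The linear
# "over" blend below is a standard formula, assumed here for illustration.

def clamp_transparency(pct):
    """Keep transparency within 50-100% so the working window stays legible."""
    return max(50, min(100, pct))

def composite(overlay_rgb, background_rgb, transparency_pct):
    """Blend overlay over background; 100% transparency shows only the background."""
    alpha = 1.0 - clamp_transparency(transparency_pct) / 100.0   # overlay opacity
    return tuple(
        round(alpha * o + (1 - alpha) * b)
        for o, b in zip(overlay_rgb, background_rgb)
    )

# At 100% transparency the working-window pixel is unchanged.
print(composite((0, 0, 0), (200, 120, 40), 100))   # -> (200, 120, 40)
# At 50% transparency the overlay and working-window pixels are mixed evenly.
print(composite((0, 0, 0), (200, 120, 40), 50))    # -> (100, 60, 20)
```

The 50% lower bound is the design point: below it, overlay ink would start to obscure the working window's content.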
[21] As illustrated in Fig. 4, the user can input a particular character through the touch screen (400 in Fig. 6) while the transparent handwriting input window (200) is activated (S300). The pen input value (201a, 201b and 201c, collectively "201") is displayed in the transparent handwriting input window (200) (S400). The user may input more than one pen input value in sequence. The pen input values "Garden" (201a) and "House" (201b and 201c) are examples. If there is no further pen input for a predetermined period of time (typically, 1 to 3 seconds) after the pen input of "Garden" (201a), the pen input recognition function is activated by the handwriting management application equipped inside the mobile electronic terminal (S500). Alternatively, the pen input recognition function may be activated by selecting the "handwriting recognition" menu (203) provided at the bottom of the transparent handwriting input window (200). The recognition result (202a) for the pen input value "Garden" (201a) is displayed near the pen input value (S600). The user may decide that "House" is a better keyword than "Garden" (201a). In this case, the user inputs "House" with the pen. After the predetermined time, the pen input value "House" (201b) is recognized and the recognition result (202b) is displayed beside the pen input value (201b). Of course, a recognition error may occur. The recognition result (202b) shows that the pen input value (201b) has been recognized as "Hauze." In the conventional technique, such a recognition error would be corrected by editing the pen input value (201b). In accordance with the present invention, however, it can be corrected simply by inputting "House" (201c) again more clearly. The correct recognition result (202c) is displayed near the pen input value "House" (201c). If there is more than one recognition result (S700), each recognition result is displayed in the handwriting input window (200) as an independent entity (S800). Specifically, the recognition results "Garden" (202a), "Hauze" (202b) and "House" (202c) are displayed in the handwriting input window (200) as independent entities.
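The "independent entity" idea above can be sketched as a simple data structure: each pen input and its recognition result form one record, so a misread entry ("Hauze") is left alone and corrected by writing the word again. All names here are illustrative assumptions, and strings stand in for actual stroke data.

```python
# Sketch of keeping each pen input and its recognition result as an independent
# entity, so a misrecognized entry ("Hauze") is corrected simply by re-input
# rather than by editing the bad entry. Names are illustrative assumptions.

class RecognizedEntry:
    def __init__(self, pen_input, result):
        self.pen_input = pen_input    # stroke data; a string here for brevity
        self.result = result          # recognizer output shown beside the strokes

entries = []

def on_recognized(pen_input, result):
    """Each recognition becomes its own selectable entity in the input window."""
    entries.append(RecognizedEntry(pen_input, result))

# The walk-through from the text: "Garden", a misread "House", then a re-input.
on_recognized("strokes-1", "Garden")
on_recognized("strokes-2", "Hauze")    # recognition error
on_recognized("strokes-3", "House")    # correction by re-input; old entry untouched

print([e.result for e in entries])     # -> ['Garden', 'Hauze', 'House']
```

Because the bad entry is never mutated, no delete key or stroke editing is needed; the user simply picks the correct entity at transfer time.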
[22] By activating the recognition result input function, the user can select the wanted recognition result from the recognition results (202a, 202b and 202c, collectively "202") and enter it in the working area (101) of the working window (100) (S900). For example, the user can select the recognition result "House" (202c) using a mouse or a pen and transfer it to the working area (101) by selecting the "recognition result input" menu (204) positioned at the bottom of the transparent handwriting input window (200). More preferably, the recognition result input function is activated by dragging and dropping the recognition result, displayed as an independent entity, onto the working area of the working window using a mouse. That is, the recognition result "House" (202c) is transferred to the working area (101) by a simple drag-and-drop action of the mouse. When the recognition result (202) is dragged and dropped, the handwriting management application can automatically perform the transfer of the corresponding recognition result to the working window (100). If necessary, the handwriting input window (200) may be closed automatically, under the control of the handwriting management application, when the drag and drop is completed. As an alternative, the handwriting input window (200) may be closed when the handwriting input button (500 in Fig. 6) is released. After the handwriting input window (200) has been closed by the drag and drop or by releasing the handwriting input button (500 in Fig. 6) (S1000), a working window (100) in which the recognition result "House" (202c) has been entered in the working area (101) is activated, as illustrated in Fig. 5.
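The S900/S1000 sequence can be sketched as a drop handler: dropping a result entity onto the working area transfers its text and then auto-closes the transparent window. The event model and names below are assumptions for illustration, not the patent's implementation.

```python
# Sketch of steps S900/S1000: dragging a recognition result into the working
# area transfers its text, then the transparent window closes automatically.
# The event model and class names here are assumptions for illustration.

class TransparentInputWindow:
    def __init__(self):
        self.open = True
        self.results = ["Garden", "Hauze", "House"]   # independent entities

    def close(self):
        self.open = False

def on_drop(window, result_index, working_area):
    """Drag-and-drop handler: transfer the dropped result, then close (S1000)."""
    working_area.append(window.results[result_index])
    window.close()                      # auto-close once the transfer completes

working_area = []
window = TransparentInputWindow()
on_drop(window, 2, working_area)        # user drags "House" onto the working area
print(working_area, window.open)        # -> ['House'] False
```

Coupling the close to the drop means the user returns to the working window in a single gesture, which is the efficiency claim made above.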
[23] Fig. 6 is a perspective view illustrating a preferred embodiment of a mobile electronic terminal equipped with the handwriting user interface for pen-input display in accordance with the present invention. As illustrated in Fig. 6, the mobile electronic terminal (1) is equipped with a handwriting input application (300), a touch screen (400), and a button (500) to which the handwriting input function is assigned. In addition, middleware (e.g., a Windows OS) is installed on the mobile electronic terminal (1). When the user presses the handwriting input button (500), the handwriting input application (300) is executed and the transparent handwriting input window (200) is displayed overlapping the current working window (100). In accordance with a preferred embodiment of the present invention, the activated status is maintained while the button (500) is pressed, and the handwriting input function is inactivated when the button is released. This ensures more efficient handwriting input.
[24] In Fig. 6, the transparent handwriting input window (200) is illustrated as partially overlapping the current working window (100). However, this is for ease of understanding, and it may overlap the whole of the current working window (100). As described earlier, the handwriting input window (200) is a transparent intermediate window and does not disturb the user's view of the working window (100) at all. While the handwriting input window (200) is activated, the pen input value (201) entered through the touch screen (400) and its recognition result (202) are displayed on the transparent handwriting input window (200). The recognition result (202) is transferred to the working area (101) of the working window (100), for example, by drag and drop. Then, the transparent handwriting input window (200) is inactivated and the wanted keyword is inputted in the working window (100). This process is carried out by the handwriting management application (300). For this purpose, the handwriting management application (300) performs the following functions:
[25] i) in response to the activation of the handwriting input function by the user, displaying the transparent handwriting input window in an overlapped manner onto the current working window;
[26] ii) displaying the user's touch-screen pen input value in the handwriting input window;
[27] iii) performing the recognition of the pen input value if there is no more pen input for a predetermined period of time and displaying the recognition result in the handwriting input window; and
[28] iv) in response to the activation of the recognition result input function, transferring the recognition result to the working area of the working window.
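Functions i) through iv) above can be combined into one sketch of the handwriting management application. Everything here is illustrative: the recognizer is injected as a plain callable, the 1-second idle timeout merely stands in for the "predetermined period of time", and all names are assumptions, not the patent's code.

```python
import time


class HandwritingApp:
    """Illustrative sketch of functions i)-iv) of the handwriting management application."""
    IDLE_TIMEOUT = 1.0  # stand-in for the "predetermined period of time" (assumed)

    def __init__(self, recognizer, working_area):
        self.recognizer = recognizer      # callable: list of strokes -> candidate list
        self.working_area = working_area  # working area (101) of the working window
        self.strokes = []
        self.candidates = []
        self.window_visible = False
        self._last_input = None

    def activate(self):
        # i) display the transparent input window over the current working window
        self.window_visible = True

    def on_pen_input(self, stroke):
        # ii) echo the user's pen input value in the handwriting input window
        self.strokes.append(stroke)
        self._last_input = time.monotonic()

    def tick(self):
        # iii) recognize once no pen input has arrived for IDLE_TIMEOUT seconds
        if self.strokes and self._last_input is not None and \
                time.monotonic() - self._last_input >= self.IDLE_TIMEOUT:
            self.candidates = self.recognizer(self.strokes)

    def input_result(self, index):
        # iv) transfer the chosen candidate to the working area and close the window
        self.working_area.append(self.candidates[index])
        self.window_visible = False


# A fake recognizer makes the flow visible end to end.
app = HandwritingApp(lambda strokes: ["Horse", "Houze", "House"], [])
app.activate()
app.on_pen_input("stroke-1")
app._last_input -= 2.0   # simulate the idle period elapsing
app.tick()
app.input_result(2)      # the user picks the third candidate, "House"
print(app.working_area)  # ['House']
```

In a real terminal, `tick()` would be driven by a timer in the middleware's event loop, and `input_result()` by the drag-and-drop handler described in paragraph [22].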
[29] In addition to these functions, the handwriting management application (300) preferably performs the function of, when there is more than one recognition result, recognizing each of the recognition results as an independent entity. More preferably, it further performs the function of automatically transferring the recognition result to the working window (100) through the drag-and-drop of a mouse.
Claims
[1] A handwriting character input method for pen-input display which comprises the steps of: a) in response to user's activation of a handwriting input function, displaying a transparent handwriting input window in an overlapped manner with a current working window; b) receiving pen input of the user through a touch screen and, in response to activation of a pen input recognition function, recognizing the pen input; c) displaying the recognition result in the handwriting input window; and d) in response to activation of a recognition result input function, transferring the recognition result to a working area of the working window.
[2] The method as set forth in claim 1, wherein the activation of the pen input recognition function in the step (b) is performed when there is no pen input for a predetermined period of time.
[3] The method as set forth in claim 1, wherein, if there are two or more recognition results in the step (c), each of the recognition results is displayed in the transparent handwriting input window as an independent entity.
[4] The method as set forth in claim 3, wherein the activation of the recognition result input function in the step (d) is performed by dragging and dropping the recognition result, which is displayed as an independent entity, to the working area of the working window.
[5] The method as set forth in claim 1, wherein the steps (a) to (d) are performed while a button for activating the handwriting input function is pressed.
[6] The method as set forth in claim 5, wherein the handwriting input window is closed by releasing the button for activating the handwriting input function.
[7] A handwriting user interface system for pen-input display, which comprises: a button for activating a handwriting input function; a touch screen for pen input; and a handwriting management application which performs the functions of, in response to activation of the handwriting input function, displaying a transparent handwriting input window overlapping a current working window, displaying a user's touch-screen pen input value in the handwriting input window, performing recognition of the pen input value if there is no more pen input for a predetermined period of time and displaying the recognition result in the handwriting input window, and, in response to activation of a recognition result input function, transferring the recognition result to a working area of the working window.
[8] The handwriting user interface system for pen-input display as set forth in claim 7, wherein the handwriting management application further performs the function of, when there is more than one recognition result, recognizing each of the recognition results as an independent entity.
[9] The handwriting user interface system for pen-input display as set forth in claim 7, wherein the handwriting management application further performs the function of automatically transferring the recognition result to the working area of the working window through drag and drop.
[10] The handwriting user interface system for pen-input display as set forth in claim 7, wherein the handwriting input window is activated while the button for activating the handwriting input function is pressed and is inactivated when the button is released.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR10-2007-0020333 | 2007-02-28 | ||
KR1020070020333A KR100874044B1 (en) | 2007-02-28 | 2007-02-28 | Easy handwriting user interface system and method for pen-input display |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2008105574A1 | 2008-09-04 |
Family
ID=39721391
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/KR2007/002058 (WO2008105574A1) | Easy handwriting user interface system and method for pen-input display | 2007-02-28 | 2007-04-26 |
Country Status (2)
Country | Link |
---|---|
KR (1) | KR100874044B1 (en) |
WO (1) | WO2008105574A1 (en) |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR102204784B1 (en) * | 2014-03-10 | 2021-01-19 | 엘지전자 주식회사 | Mobile terminal and method for controlling the same |
KR102576909B1 (en) | 2018-08-08 | 2023-09-11 | 삼성전자 주식회사 | electronic device and method for providing a drawing environment |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPS5878226A (en) * | 1981-11-04 | 1983-05-11 | Canon Inc | Input device |
JPS6252630A (en) * | 1985-08-30 | 1987-03-07 | Toshiba Corp | Hand written input display system |
JP2000123114A (en) * | 1998-10-15 | 2000-04-28 | Casio Comput Co Ltd | Handwritten character input device and storage medium |
JP2005258882A (en) * | 2004-03-12 | 2005-09-22 | Sanyo Electric Co Ltd | Character input support method and character input support program |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR100380600B1 (en) * | 2000-04-17 | 2003-04-21 | (주)네이스텍 | Method for inputing a character in Terminal having Touch Screen |
EP1639441A1 (en) | 2003-07-01 | 2006-03-29 | Nokia Corporation | Method and device for operating a user-input area on an electronic display device |
- 2007-02-28: KR application KR1020070020333A filed (patent KR100874044B1), not active (IP right cessation)
- 2007-04-26: WO application PCT/KR2007/002058 filed (WO2008105574A1), active (application filing)
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20120174008A1 (en) * | 2009-04-03 | 2012-07-05 | Sony Computer Entertainment Inc. | Information input device and information input method |
EP2418569A4 (en) * | 2009-04-03 | 2015-07-29 | Sony Computer Entertainment Inc | Information input device and information input method |
US9703392B2 (en) | 2009-04-03 | 2017-07-11 | Sony Corporation | Methods and apparatus for receiving, converting into text, and verifying user gesture input from an information input device |
US8917248B2 (en) | 2009-07-31 | 2014-12-23 | Samsung Electronics Co., Ltd. | Character recognition and character input apparatus using touch screen and method thereof |
US9811750B2 (en) | 2009-07-31 | 2017-11-07 | Samsung Electronics Co., Ltd | Character recognition and character input apparatus using touch screen and method thereof |
US10373009B2 (en) | 2009-07-31 | 2019-08-06 | Samsung Electronics Co., Ltd | Character recognition and character input apparatus using touch screen and method thereof |
US9489126B2 (en) | 2013-05-07 | 2016-11-08 | Samsung Electronics Co., Ltd. | Portable terminal device using touch pen and handwriting input method thereof |
US9875022B2 (en) | 2013-05-07 | 2018-01-23 | Samsung Electronics Co., Ltd. | Portable terminal device using touch pen and handwriting input method thereof |
Also Published As
Publication number | Publication date |
---|---|
KR20080079830A (en) | 2008-09-02 |
KR100874044B1 (en) | 2008-12-12 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 07746217; Country of ref document: EP; Kind code of ref document: A1 |
| NENP | Non-entry into the national phase | Ref country code: DE |
| 122 | Ep: pct application non-entry in european phase | Ref document number: 07746217; Country of ref document: EP; Kind code of ref document: A1 |