WO2011075891A1 - Method and apparatus for facilitating text editing and related computer program product and computer readable medium - Google Patents
- Publication number
- WO2011075891A1 (PCT/CN2009/075875)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- editing region
- editing
- region
- characters
- inputted
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F40/00—Handling natural language data
- G06F40/10—Text processing
- G06F40/166—Editing, e.g. inserting or deleting
Definitions
- the present invention generally relates to the field of text editing and, more particularly, to a method and apparatus for facilitating touch-based text editing and relevant computer program products and storage medium.
- a target word or character, for example a misrecognized or mis-inputted word or character, may be selected for correction
- users may need to input a new word or character to replace the selected one.
- the above-mentioned various input modalities can be used in this interactive correction procedure. Fusing these input modalities so that users can quickly edit text is essential to a smooth and enjoyable user experience, and is also a design challenge on the limited screen of a portable device.
- the present invention proposes a new interaction mechanism for facilitating text editing on a portable device with a size-limited touch screen, especially for recovering from speech recognition errors.
- a method for facilitating text editing comprises: providing a first editing region displaying a plurality of inputted characters; providing a second editing region in which a subset of the inputted characters is displayed in enlarged form for editing on the basis of a language unit; and performing, in response to receiving an editing input to the second editing region, a joint update to corresponding characters in the second editing region and the first editing region.
- an apparatus for facilitating text editing comprises: means for providing a first editing region displaying a plurality of inputted characters; means for providing a second editing region in which a subset of the inputted characters is displayed in enlarged form for editing on the basis of a language unit; and means for performing, in response to receiving an editing input to the second editing region, a joint update to corresponding characters in the second editing region and the first editing region.
- a computer program product comprises a computer readable storage structure embodying computer program code thereon for execution by a computer processor, wherein said computer program code is hosted by a device and comprises instructions for performing a method including: providing a first editing region displaying a plurality of inputted characters; providing a second editing region in which a subset of the inputted characters is displayed in enlarged form for editing on the basis of a language unit; and performing, in response to receiving an editing input to the second editing region, a joint update to corresponding characters in the second editing region and the first editing region.
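The claimed arrangement can be sketched as a single backing text with an enlarged window onto it: because both regions render the same underlying text, an edit made through the enlarged window necessarily updates both views at once (the "joint update"). The following Python sketch is illustrative only; the class and method names are assumptions, not part of the claims:

```python
class DualRegionEditor:
    """Minimal sketch of the claimed dual-region model (names are illustrative)."""

    def __init__(self, text, window=7):
        self.text = list(text)                 # full text, shown scaled-down in the first region
        self.window = window                   # number of language units enlarged at once
        self.start = max(0, len(self.text) - window)  # default hint-box location: end of text

    def second_region(self):
        # Subset of the inputted characters, displayed enlarged for unit-wise editing
        return self.text[self.start:self.start + self.window]

    def edit(self, offset, replacement):
        # An editing input to the second region changes the single backing text,
        # so the first (overview) and second (detail) regions update jointly.
        self.text[self.start + offset] = replacement
        return self.second_region()
```

Keeping one backing buffer, rather than two synchronized copies, is what makes the joint update trivial in this sketch.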
- Fig. 1 schematically shows a flow chart of a method for facilitating text editing according to one illustrative embodiment of the present invention
- Fig. 2 schematically shows the main view of a user interface according to one illustrative embodiment of the present invention
- Fig. 3 schematically shows a view of a user interface for moving a cursor, according to one illustrative embodiment of the present invention
- Fig. 4A schematically shows views of a user interface for browsing the plurality of inputted characters in the second editing region, according to one illustrative embodiment of the present invention
- Fig. 4B schematically shows views of a user interface for zooming in/zooming out the content in the second editing region, according to one illustrative embodiment of the present invention
- Fig. 4C schematically shows views of a user interface for zooming in/zooming out the content in the second editing region, according to one illustrative embodiment of the present invention
- Fig. 5A shows a view of a user interface for editing text by candidate list according to one illustrative embodiment of the present invention
- Fig. 5B shows another view of a user interface for editing text by candidate list according to one illustrative embodiment of the present invention
- Fig. 5C shows views of a user interface for editing text by candidate list according to one illustrative embodiment of the present invention
- Fig. 6A shows a view of a user interface for editing text by handwriting, according to one illustrative embodiment of the present invention
- Fig. 6B shows another view of a user interface for editing text by handwriting, according to one illustrative embodiment of the present invention
- Fig. 6C shows another view of a user interface for editing text by handwriting, according to one illustrative embodiment of the present invention
- Fig. 7A shows a view of a user interface for deleting text, according to one illustrative embodiment of the present invention
- Fig. 7B shows a view of a user interface for deleting text, according to one illustrative embodiment of the present invention
- Fig. 8 shows a view of a user interface for inputting symbols, according to one illustrative embodiment of the present invention
- Fig. 9 shows a portable device in which one illustrative embodiment of the present invention can be implemented
- Fig. 10 shows a configuration schematic of the portable device as shown in Fig. 9.
- Fig. 1 schematically shows a flow chart of a method for facilitating text editing according to one illustrative embodiment of the present invention.
- step S100 the flow of the method for facilitating text editing according to one illustrative embodiment of the present invention starts.
- a first editing region displaying a plurality of inputted characters is provided in a user interface.
- the plurality of inputted characters may result, for example, from speech-to-text recognition, handwriting recognition, optical character recognition (OCR) and/or captured keystrokes.
- OCR optical character recognition
- the first editing region functions as an overview and provides the user with a contextual view of the whole text including the plurality of inputted characters.
- the plurality of inputted characters displayed in the first editing region are preferably displayed at a scaled-down size.
- a second editing region is provided, in which a subset of the inputted characters is displayed in enlarged form for editing on the basis of a language unit.
- the subset of the inputted characters which needs to be further edited or corrected can be for example selected by the user from the first editing region via a selecting means and shown in the second editing region.
- the selected subset of the inputted characters shown in the second editing region can be edited on the basis of a minimal language unit, for example, a Chinese character in Chinese, or a word or even a single character of a word in English.
- the second editing region functions as a detail view of the selected characters and allows the user to view them in detail and interact with respective characters to make error corrections or further editing.
- the second editing region can be flipped and/or scanned to enable detailed navigation through the text shown in the first editing region.
- the first editing region and the second editing region are configured to be displayed simultaneously, so as to provide the user with both the contextual view and the enlarged detailed view of the text.
- Editing inputs include any type of input for text editing, for example, moving the cursor, deleting, selecting character(s), selecting an editing modality, adding a new character or symbol, and so on.
- the present invention can support any type of editing input by configuring corresponding processing for the supported input types. That is, the present invention is not restricted to any specific input type discussed as an example in the present disclosure, but is applicable to any new editing scenario which may require scenario-specific inputs of new types.
- a joint update to corresponding characters in the second editing region and the first editing region is performed.
- the first editing region and second editing region are associated with each other.
- the corresponding characters as shown in the first editing region will be updated jointly, to display in the first editing region the overview of the whole text containing the corresponding change.
- step S150 the flow of the method for facilitating text editing according to one illustrative embodiment of the present invention ends.
- FIG. 1 the method for facilitating text editing according to one illustrative embodiment of the present invention is described.
- Hardware, software and the combination of both, which can be configured to provide the above functionalities, are well known in the art and will not be set forth herein in detail, for the purpose of emphasizing the core concept of the present invention.
- Fig. 2 schematically shows the main view of a user interface according to one illustrative embodiment of the present invention.
- reference numeral 200 denotes a user interface according to one illustrative embodiment of the present invention for a message application
- reference numeral 210 denotes a first editing region of the user interface 200
- reference numeral 220 denotes a second editing region of the user interface 200.
- a plurality of inputted characters, which may result for example from speech-to-text recognition, handwriting recognition, optical character recognition (OCR) and/or captured keystrokes, are displayed in the first editing region 210 of the user interface 200.
- the first editing region 210 displays the whole text which has been inputted to the message application. Due to the limited screen size, individual characters displayed in the overview of the first editing region 210 are typically scaled down to a small size, which makes them substantially difficult to interact with individually using a fingertip.
- the second editing region 220 of the user interface 200 is provided horizontally under the first editing region 210.
- a different layout of the second editing region 220 relative to the first editing region 210 can also be adopted, without limiting the protection scope of the present invention.
- a subset of the inputted characters selected from the first editing region 210 via a selecting means 211 such as a hint box or a sliding line is displayed in an enlarged style.
- multiple characters (as an example, the 7 characters shown in Fig. 2) selected by the selecting means 211 in the first editing region 210 are displayed in enlarged form in the second editing region 220 as buttonized characters 221.
- Each button 221 represents one language unit, such as a single character or a word, which can be edited independently. In a preferred implementation, a minimal language unit can be enlarged in one button.
- the user may also configure the buttons to show his/her desired language units. Since each button represents a language unit, the user may perform text editing/error correction on the basis of the buttons 221 (i.e., the language units represented by the buttons) to update the inputted text.
- the first editing region 210 and the second editing region 220 are associated with each other. When the user performs text editing/error correction on the buttons 221, a joint update is performed with respect to characters shown in the corresponding buttons 221 of the second editing region 220 and corresponding characters shown in the first editing region 210.
- the user interface 200 may optionally contain several functional buttons 230 to enable corresponding functionalities for facilitating text editing.
- the functional buttons 230 include input mode buttons, such as a speech input button for activating a speech recognition mode, a handwriting input button for activating a handwriting mode, and a symbol input button for activating a mode for inputting symbols; and editing operation buttons, such as a deleting operation button for deleting characters or symbols selected in the text, and an inserting operation button for inserting characters or symbols at the selected position of the text; and the like.
- specific gestures that the user makes on the touch screen of the portable device can be designated to respective functionalities. When a specific gesture is detected, corresponding functionalities will be enabled.
- functional buttons and/or gestures designated to respective functionalities can be designed on the demand of applications and/or depending upon user preference.
- the user begins speech input by pressing the speech input button in the user interface.
- the result of speech recognition is shown in the first editing region 210, which usually contains a plurality of speech-inputted characters.
- the hint box 211 of a certain length appears at its default location (for example, the end) of the speech-inputted text displayed in the first editing region 210.
- the user may change the location of the hint box 211 by directly clicking the desired location in the first editing region 210 or dragging the hint box 211 to the desired location.
- the hint box 211 selects a subset of the inputted characters shown in the first editing region 210.
- An enlarged version of the characters in the hint box 211 is displayed in the second editing region 220 as buttonized characters.
- the hint box 211 gives the user a hint of which part of the inputted characters in the first editing region 210 is visible in the second editing region 220.
- both the hint box 211 of the first editing region 210 and the second editing region 220 can be activated or hidden in response to a specific indication from the user.
- Fig. 3 schematically shows a view of a user interface for moving a cursor, according to one illustrative embodiment of the present invention.
- the operation of moving the cursor can be performed both in the first and the second editing regions 210, 220.
- the user may click somewhere in the first editing region 210 to move the cursor in the overview of the inputted text; the user may also tap a space between two buttonized characters 221 in the second editing region 220. Regardless of which of the first and second editing regions the cursor movement occurs in, the location of the cursor in the other region will be updated accordingly.
- since the hint box 211 can be configured to move following the cursor, the relative location of the hint box 211 and the cursor should be considered in practice. In one implementation, it may be predefined that the center of the hint box 211 follows the user's fingertip by default and that the cursor also follows the user's fingertip. If the user clicks among the first/last characters within the length of the hint box 211, the hint box 211 will cover those first/last characters and the cursor will still follow the user's fingertip. If the inputted characters are fewer than the default length of the hint box 211, the length of the hint box 211 can be configured to change according to the text length.
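The placement rule just described — the hint box centered on the fingertip, clamped at the beginning and end of the text, and shrinking when the text is shorter than the default box length — can be sketched as follows. The function name and index convention are assumptions for illustration:

```python
def hint_box_span(tap_index, text_len, box_len=7):
    """Sketch of the described hint-box placement rule (assumed behavior):
    returns the half-open [start, end) span of characters the box covers."""
    box_len = min(box_len, text_len)                 # short text: box adapts to text length
    start = tap_index - box_len // 2                 # center follows the user's fingertip
    start = max(0, min(start, text_len - box_len))   # clamp at the text boundaries
    return start, start + box_len
```

Clamping rather than centering near the edges is what lets the cursor keep following the fingertip while the box covers the first or last characters.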
- the second editing region 220 per se can be provided with a mechanism for browsing the text.
- Fig. 4A schematically shows views of a user interface for browsing the plurality of inputted characters in the second editing region, according to one illustrative embodiment of the present invention.
- the user can, for example, flick the second editing region 220 to page the content shown in the second editing region 220 down or up, and/or flip the second editing region 220 to the left or right to view the previous or next set of characters, and/or scan the second editing region 220 to shift characters to the left or right one at a time (a slower and more controlled version of flipping).
- the mechanism for browsing allows the user to make detailed text navigation in the second editing region 220.
- the hint box 211 in the first editing region 210 is moved accordingly.
- Fig. 4B schematically shows views of a user interface for zooming in/zooming out the content in the second editing region, according to one illustrative embodiment of the present invention.
- the characters in the second editing region 220 are preferably configured to be zoomed in or out, so that the user can dynamically change the number of characters (as language units) shown in the second editing region 220, as shown in Fig. 4B, and/or the language unit itself on the basis of which the second editing region 220 shows the characters, as shown in Fig. 4C.
- the second editing region 220 is zoomed in or zoomed out to change the number of characters displayed in the second editing region 220.
- the language unit represented by each button 221 of the second editing region 220 can be changed from a single character to a word or from a word to a single character.
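Switching the language unit amounts to re-segmenting the same backing text into buttons of a different granularity. A sketch under the simplifying assumption of whitespace-delimited words (Chinese text would instead need a word segmenter); the function name is illustrative:

```python
def buttonize(text, unit="character"):
    """Split text into the language units shown as buttons (illustrative).
    Zooming the second editing region switches `unit` between "word"
    and "character"."""
    if unit == "word":
        return text.split()
    return [ch for ch in text if not ch.isspace()]
```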
- Figs. 5A-5C show views of a user interface for editing text by candidate list according to one illustrative embodiment of the present invention.
- the buttonized characters 221 in the second editing region 220 can be activated to reveal a candidate list 510 of statistically relevant characters.
- the user taps the buttonized character and the candidate list 510 then pops up; the list can be generated according to any algorithm known in the art for prompting candidates for an inputted character or word.
- the cursor may be hidden in the second editing region 220 and the corresponding character in the first editing region 210 is highlighted in the hint box 211. The user may tap the activated buttonized character again to deactivate the character and hide the candidate list 510.
- the cursor may then reappear at its original location in the second editing region 220 and the corresponding character in the first editing region 210 is de-highlighted in the hint box 211.
- the candidate list 510 for each of the buttonized characters 221 can be flipped upwards and downwards to reveal more candidate characters.
- the original activated buttonized character in the second editing region 220 will be replaced by the selected one.
- a joint update is also performed in the first editing region 210 accordingly.
- the candidate list 510 is configured to be flipped along a second direction while the second editing region 220 is configured to be flipped along a first direction, wherein the first and the second directions are substantially perpendicular to each other.
- the user may drag along the second editing region 220 to select multiple buttonized characters to be activated. After the current buttonized character is corrected/deselected, the next of the selected buttonized characters in the second editing region 220 will be activated to show its candidate list 510. If a buttonized character in the second editing region 220 is replaced with a selected candidate character, it is preferred that the candidate list 510 of the next buttonized character is dynamically changed according to the user's correction. Similarly, when multiple characters are selected in the second editing region 220, the corresponding characters in the first editing region 210 are highlighted in the hint box 211. The user may tap another enlarged character in the second editing region 220 outside the selection to deselect the multiple characters.
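The sequential correction flow — activate several buttons, correct one, then recompute the next button's candidates in light of that correction — can be sketched as below. `candidates_for` is a hypothetical callback standing in for whatever candidate-generation algorithm (e.g. a language model over the corrected context) is actually used:

```python
def correct_sequence(units, selections, candidates_for):
    """Apply candidate-list corrections left to right. Candidates for each
    unit are recomputed on the current (partially corrected) unit sequence,
    so they can reflect the user's earlier corrections."""
    units = list(units)
    for idx, choice in selections:           # (button index, chosen candidate rank)
        cands = candidates_for(units, idx)   # may depend on corrected context
        if 0 <= choice < len(cands):
            units[idx] = cands[choice]       # joint update of the backing text
    return units
```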
- a handwriting mode can be activated in the user interface 200.
- Figs. 6A-6C show views of a user interface for editing text by handwriting, according to one illustrative embodiment of the present invention.
- the user may, for example, click the handwriting input button in the user interface 200, and then the handwriting pane 600 pops up in the user interface 200, in which the first editing region 210 can be hidden or defocused while the second editing region 220 appears along with the handwriting pane 600, as shown in Fig. 6A.
- a handwriting candidate list 610 of the handwriting mode can be popped up to enable the user to search for the desired character.
- the handwriting candidate list 610 of the handwriting mode can be flipped upwards and downwards with the user's specific gestures.
- the functional buttons 630 can be provided to enable corresponding functionalities for facilitating handwriting input process.
- the functional buttons 630 include a confirmation button, an input language switching button, a symbol input button and a deleting button. For example, if the user clicks the deleting button on the handwriting pane 600, the candidate list 610 and the selected buttonized character in the second editing region 220 will be deleted. If there is no character selected, clicking the deleting button will delete the character immediately before the cursor.
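The deleting rule just described (remove the selected character if one is selected, otherwise the character immediately before the cursor) can be sketched as follows; the function name and index conventions are assumptions:

```python
def delete_char(text, cursor, selected=None):
    """Sketch of the delete-button rule (assumed indices: `cursor` sits
    between characters, `selected` is a character index or None)."""
    chars = list(text)
    if selected is not None:
        del chars[selected]       # a character is selected: delete it
    elif cursor > 0:
        del chars[cursor - 1]     # no selection: delete before the cursor
    return "".join(chars)
```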
- even when the first editing region 210 is invisible or defocused, the text contained in the first editing region is updated along with the second editing region 220.
- the first editing region 210 will display the updated text.
- handwriting recognition as an example of various input modalities is used to correct the errors in the inputted characters or further edit the inputted text.
- the user may activate a pane for speech recognition or for virtual keyboard input, to correct errors in the inputted characters or further edit the inputted text in conjunction with the second editing region 220.
- Figs. 7A-7B show views of a user interface for deleting text, according to one illustrative embodiment of the present invention.
- Figs. 7A and 7B illustrate two applicable examples.
- a joint update will be performed in both first editing region 210 and the second editing region 220.
- Fig. 8 shows a view of a user interface for inputting symbols, according to one illustrative embodiment of the present invention.
- a symbol pane 800 can be activated to facilitate symbol input, for example, by pressing a symbol input button in the user interface 200 or making some predefined gesture.
- the symbol pane 800 is displayed in conjunction with the second editing region 220.
- when the symbol pane 800 is activated, the first editing region 210 will become invisible or defocused.
- the user designates the location in the second editing region 220 where he or she would like to insert a symbol and then taps a desired symbol in the symbol pane 800.
- the symbol pane 800 may further include functional buttons 830 to support additional operations with respect to the symbol pane 800, for example, page down button, page up button, deleting button, confirming button and the like.
- Fig. 9 shows a portable device in which one illustrative embodiment of the present invention can be implemented.
- the mobile terminal 900 comprises a speaker or earphone 902, a microphone 906, a touch display 903 and a set of keys 904 which may include virtual keys 904a, soft keys 904b, 904c and a joystick 905 or other type of navigational input device.
- Fig. 10 shows a configuration schematic of the portable device as shown in Fig. 9.
- the mobile terminal has a controller 1000 which is responsible for the overall operation of the mobile terminal and may be implemented by any commercially available CPU ("Central Processing Unit"), DSP ("Digital Signal Processor") or any other electronic programmable logic device.
- the controller 1000 has associated electronic memory 1002 such as RAM memory, ROM memory, EEPROM memory, flash memory, or any combination thereof.
- the memory 1002 is used for various purposes by the controller 1000, one of them being for storing data used by and program instructions for various software in the mobile terminal.
- the software includes a real-time operating system 1020, drivers for a man-machine interface (MMI) 1034, an application handler 1032 as well as various applications.
- MMI man-machine interface
- the applications can include a message text editor 1050, a handwriting recognition (HWR) application 1060, as well as various other applications 1070, such as applications for voice calling, video calling, sending and receiving Short Message Service (SMS) messages, Multimedia Message Service (MMS) messages or email, web browsing, an instant messaging application, a phone book application, a calendar application, a control panel application, a camera application, one or more video games, a notepad application, etc. It should be noted that two or more of the applications listed above may be executed as the same application.
- SMS Short Message Service
- MMS Multimedia Message Service
- the MMI 1034 also includes one or more hardware controllers, which together with the MMI drivers cooperate with the first display 1036/903, and the keypad 1038/904 as well as various other I/O devices such as microphone, speaker, vibrator, ringtone generator, LED indicator, etc. As is commonly known, the user may operate the mobile terminal through the man-machine interface thus formed.
- the software can also include various modules, protocol stacks, drivers, etc., which are commonly designated as 1030 and which provide communication services (such as transport, network and connectivity) for an RF interface 1006, and optionally a Bluetooth interface 1008 and/or an IrDA interface 1010 for local connectivity.
- the RF interface 1006 comprises an internal or external antenna as well as appropriate radio circuitry for establishing and maintaining a wireless link to a base station.
- the radio circuitry comprises a series of analogue and digital electronic components, together forming a radio receiver and transmitter. These components include band-pass filters, amplifiers, mixers, local oscillators, low-pass filters, AD/DA converters, etc.
- the mobile terminal also has a SIM card 1004 and an associated reader.
- the SIM card 1004 comprises a processor as well as local work and data memory.
- the various aspects of what is described above can be used alone or in various combinations.
- the teaching of this application may be implemented by a combination of hardware and software, but can also be implemented in hardware or software.
- the teaching of this application can also be embodied as a computer program product on a computer readable medium, which can be any material medium, such as floppy disks, CD-ROMs, DVDs, hard drives, and even network media.
- the specification of the present invention has been presented for purposes of illustration and description, and is not intended to be exhaustive or limited to the invention in the form disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art. It is understood by those skilled in the art that the method and means in the embodiments of the present invention can be implemented in software, hardware, firmware or a combination thereof.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Health & Medical Sciences (AREA)
- Artificial Intelligence (AREA)
- Audiology, Speech & Language Pathology (AREA)
- Computational Linguistics (AREA)
- General Health & Medical Sciences (AREA)
- Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- User Interface Of Digital Computer (AREA)
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP09852441A EP2517123A1 (en) | 2009-12-23 | 2009-12-23 | Method and apparatus for facilitating text editing and related computer program product and computer readable medium |
CN200980163173.2A CN102667753B (en) | 2009-12-23 | 2009-12-23 | The method and apparatus being easy to text editing |
US13/518,319 US20120262488A1 (en) | 2009-12-23 | 2009-12-23 | Method and Apparatus for Facilitating Text Editing and Related Computer Program Product and Computer Readable Medium |
JP2012545046A JP5567685B2 (en) | 2009-12-23 | 2009-12-23 | Method and apparatus for facilitating text editing and associated computer program and computer-readable medium |
PCT/CN2009/075875 WO2011075891A1 (en) | 2009-12-23 | 2009-12-23 | Method and apparatus for facilitating text editing and related computer program product and computer readable medium |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/CN2009/075875 WO2011075891A1 (en) | 2009-12-23 | 2009-12-23 | Method and apparatus for facilitating text editing and related computer program product and computer readable medium |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2011075891A1 true WO2011075891A1 (en) | 2011-06-30 |
Family
ID=44194908
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2009/075875 WO2011075891A1 (en) | 2009-12-23 | 2009-12-23 | Method and apparatus for facilitating text editing and related computer program product and computer readable medium |
Country Status (5)
Country | Link |
---|---|
US (1) | US20120262488A1 (en) |
EP (1) | EP2517123A1 (en) |
JP (1) | JP5567685B2 (en) |
CN (1) | CN102667753B (en) |
WO (1) | WO2011075891A1 (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20130179778A1 (en) * | 2012-01-05 | 2013-07-11 | Samsung Electronics Co., Ltd. | Display apparatus and method of editing displayed letters in the display apparatus |
Families Citing this family (21)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8423351B2 (en) * | 2010-02-19 | 2013-04-16 | Google Inc. | Speech correction for typed input |
US8988366B2 (en) * | 2011-01-05 | 2015-03-24 | Autodesk, Inc | Multi-touch integrated desktop environment |
US9612743B2 (en) | 2011-01-05 | 2017-04-04 | Autodesk, Inc. | Multi-touch integrated desktop environment |
US9600090B2 (en) | 2011-01-05 | 2017-03-21 | Autodesk, Inc. | Multi-touch integrated desktop environment |
US8766937B2 (en) * | 2011-09-08 | 2014-07-01 | Blackberry Limited | Method of facilitating input at an electronic device |
CN103902198B (en) * | 2012-12-28 | 2017-06-27 | 联想(北京)有限公司 | Electronic equipment and the method for it |
CN104142911B (en) * | 2013-05-08 | 2017-11-03 | 腾讯科技(深圳)有限公司 | A kind of text information input method and device |
CN103685747B (en) * | 2013-12-06 | 2016-06-01 | 北京奇虎科技有限公司 | The modification method of input number and correction device |
CN103761216B (en) * | 2013-12-24 | 2018-01-16 | 上海斐讯数据通信技术有限公司 | Edit the method and mobile terminal of text |
WO2015100172A1 (en) * | 2013-12-27 | 2015-07-02 | Kopin Corporation | Text editing with gesture control and natural speech |
KR101822624B1 (en) * | 2016-06-21 | 2018-01-26 | 김영길 | Method for error correction and application stored in media for executing the same |
JP6925789B2 (en) * | 2016-06-29 | 2021-08-25 | 京セラ株式会社 | Electronics, control methods, and programs |
JP2018072568A (en) * | 2016-10-28 | 2018-05-10 | 株式会社リクルートライフスタイル | Voice input unit, voice input method and voice input program |
EP3690625B1 (en) * | 2017-11-20 | 2023-05-17 | Huawei Technologies Co., Ltd. | Method and device for dynamically displaying icon according to background image |
CN108062290B (en) * | 2017-12-14 | 2021-12-21 | 北京三快在线科技有限公司 | Message text processing method and device, electronic equipment and storage medium |
CN110275651B (en) * | 2018-03-16 | 2024-02-20 | 厦门歌乐电子企业有限公司 | Vehicle-mounted display equipment and text editing method |
CN109032380B (en) * | 2018-08-01 | 2021-04-23 | Vivo Mobile Communication Co., Ltd. | Character input method and terminal |
US11551480B2 (en) * | 2019-04-11 | 2023-01-10 | Ricoh Company, Ltd. | Handwriting input apparatus, handwriting input method, program, and input system |
JP7036862B2 (en) * | 2020-05-18 | 2022-03-15 | Kyocera Corporation | Electronic device, control method, and program |
CN112882408B (en) * | 2020-12-31 | 2022-10-18 | Shenzhen Leadshine Control Technology Co., Ltd. | Online editing method and device for ST text language |
CN113807058A (en) * | 2021-09-24 | 2021-12-17 | Vivo Mobile Communication Co., Ltd. | Text display method and text display device |
Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101068411A (en) * | 2006-05-03 | 2007-11-07 | LG Electronics Inc. | Method of displaying text using mobile terminal |
Family Cites Families (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH0798769A (en) * | 1993-06-18 | 1995-04-11 | Hitachi Ltd | Information processor and its screen editing method |
JPH10154144A (en) * | 1996-11-25 | 1998-06-09 | Sony Corp | Document inputting device and method therefor |
JP3361956B2 (en) * | 1997-04-18 | 2003-01-07 | シャープ株式会社 | Character recognition processor |
JPH10340075A (en) * | 1997-06-06 | 1998-12-22 | Matsushita Electric Ind Co Ltd | Image display method |
CA2330133C (en) * | 1998-04-24 | 2008-11-18 | Natural Input Solutions Inc. | Pen based edit correction interface method and apparatus |
US7403888B1 (en) * | 1999-11-05 | 2008-07-22 | Microsoft Corporation | Language input user interface |
GB2365676B (en) * | 2000-02-18 | 2004-06-23 | Sensei Ltd | Mobile telephone with improved man-machine interface |
KR100460105B1 (en) * | 2000-02-22 | 2004-12-03 | LG Electronics Inc. | Method for searching a menu in a mobile communication terminal |
JP2005055973A (en) * | 2003-08-06 | 2005-03-03 | Hitachi Ltd | Personal digital assistant |
US7443386B2 (en) * | 2004-11-01 | 2008-10-28 | Nokia Corporation | Mobile phone and method |
KR20080044677A (en) * | 2006-11-17 | 2008-05-21 | Samsung Electronics Co., Ltd. | Remote control apparatus using a soft keyboard, method for inputting characters by the remote control apparatus, and display apparatus using a soft keyboard |
US8028230B2 (en) * | 2007-02-12 | 2011-09-27 | Google Inc. | Contextual input method |
KR101391080B1 (en) * | 2007-04-30 | 2014-04-30 | Samsung Electronics Co., Ltd. | Apparatus and method for inputting characters |
US8726194B2 (en) * | 2007-07-27 | 2014-05-13 | Qualcomm Incorporated | Item selection using enhanced control |
EP2201443A4 (en) * | 2007-09-11 | 2013-05-01 | Smart Internet Technology Crc Pty Ltd | A system and method for manipulating digital images on a computer display |
US8225204B2 (en) * | 2008-03-27 | 2012-07-17 | Kai Kei Cheng | System and method of document reuse |
2009
- 2009-12-23 CN CN200980163173.2A patent/CN102667753B/en active Active
- 2009-12-23 EP EP09852441A patent/EP2517123A1/en not_active Withdrawn
- 2009-12-23 US US13/518,319 patent/US20120262488A1/en not_active Abandoned
- 2009-12-23 JP JP2012545046A patent/JP5567685B2/en active Active
- 2009-12-23 WO PCT/CN2009/075875 patent/WO2011075891A1/en active Application Filing
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20130179778A1 (en) * | 2012-01-05 | 2013-07-11 | Samsung Electronics Co., Ltd. | Display apparatus and method of editing displayed letters in the display apparatus |
Also Published As
Publication number | Publication date |
---|---|
JP2013515984A (en) | 2013-05-09 |
CN102667753A (en) | 2012-09-12 |
CN102667753B (en) | 2016-08-24 |
JP5567685B2 (en) | 2014-08-06 |
US20120262488A1 (en) | 2012-10-18 |
EP2517123A1 (en) | 2012-10-31 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20120262488A1 (en) | Method and Apparatus for Facilitating Text Editing and Related Computer Program Product and Computer Readable Medium | |
US11416141B2 (en) | Method, system, and graphical user interface for providing word recommendations | |
US7443316B2 (en) | Entering a character into an electronic device | |
US8605039B2 (en) | Text input | |
KR101557358B1 (en) | Method for inputting a string of charaters and apparatus thereof | |
US8412278B2 (en) | List search method and mobile terminal supporting the same | |
US9811750B2 (en) | Character recognition and character input apparatus using touch screen and method thereof | |
US9448715B2 (en) | Grouping of related graphical interface panels for interaction with a computing device | |
EP2529287B1 (en) | Method and device for facilitating text editing and related computer program product and computer readable medium | |
CN102362252A (en) | System and method for touch-based text entry | |
JP2010079441A (en) | Mobile terminal, software keyboard display method, and software keyboard display program | |
CN102279698A (en) | Virtual keyboard, input method and relevant storage medium | |
US8115743B2 (en) | Terminal with touch screen and method for inputting message therein | |
US20140331160A1 (en) | Apparatus and method for generating message in portable terminal | |
KR20080096732A (en) | Touch type information inputting terminal, and method thereof | |
KR20130008740A (en) | Mobile terminal and method for controlling thereof | |
US20110173573A1 (en) | Method for inputting a character in a portable terminal | |
US20090327966A1 (en) | Entering an object into a mobile terminal | |
WO2011075890A1 (en) | Method and apparatus for editing speech recognized text | |
KR20120024034A (en) | Mobile terminal capable of inputting alphabet |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| WWE | WIPO information: entry into national phase | Ref document number: 200980163173.2; Country of ref document: CN |
| 121 | EP: the EPO has been informed by WIPO that EP was designated in this application | Ref document number: 09852441; Country of ref document: EP; Kind code of ref document: A1 |
| REEP | Request for entry into the European phase | Ref document number: 2009852441; Country of ref document: EP |
| WWE | WIPO information: entry into national phase | Ref document number: 2009852441; Country of ref document: EP |
| WWE | WIPO information: entry into national phase | Ref document number: 2012545046; Country of ref document: JP |
| WWE | WIPO information: entry into national phase | Ref document number: 13518319; Country of ref document: US |
| NENP | Non-entry into the national phase | Ref country code: DE |
| WWE | WIPO information: entry into national phase | Ref document number: 6111/CHENP/2012; Country of ref document: IN |