CN107526449B - Character input method


Info

Publication number
CN107526449B
Authority
CN
China
Prior art keywords
gesture
phoneme
tap
touch
phonemes
Prior art date
Legal status
Active
Application number
CN201710465985.9A
Other languages
Chinese (zh)
Other versions
CN107526449A (en)
Inventor
吕奇璋
李智尧
Current Assignee
Ambit Microsystems Shanghai Ltd
Hon Hai Precision Industry Co Ltd
Original Assignee
Ambit Microsystems Shanghai Ltd
Hon Hai Precision Industry Co Ltd
Priority date
Filing date
Publication date
Priority claimed from US application No. 15/186,553 (published as US 2016/0299623 A1)
Application filed by Ambit Microsystems Shanghai Ltd and Hon Hai Precision Industry Co Ltd
Publication of CN107526449A
Application granted
Publication of CN107526449B

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/02Input arrangements using manually operated switches, e.g. using keyboards or dials
    • G06F3/023Arrangements for converting discrete items of information into a coded form, e.g. arrangements for interpreting keyboard generated codes as alphanumeric codes, operand codes or instruction codes
    • G06F3/0233Character input methods
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/02Input arrangements using manually operated switches, e.g. using keyboards or dials
    • G06F3/023Arrangements for converting discrete items of information into a coded form, e.g. arrangements for interpreting keyboard generated codes as alphanumeric codes, operand codes or instruction codes
    • G06F3/0233Character input methods
    • G06F3/0236Character input methods using selection techniques to select from displayed items
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04886Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A text input method executed in an electronic device includes inputting a plurality of phonemes and enabling each phoneme to be operated on by gestures. A candidate word menu is generated from the plurality of phonemes. The plurality of phonemes are modified in response to a gesture operation on at least one of them, and a modified candidate word menu is generated from the modified plurality of phonemes.

Description

Character input method
Technical Field
The invention relates to computer control technology, and in particular to a character input method and an electronic system using the same.
Background
Mobile devices such as smart phones and tablet computers are now widely used. Such mobile devices typically use touch devices rather than a mouse. Some mouse operations, such as icon selection and drag operations or selecting a passage of text, are not easily reproduced on a touch device. Because a sliding operation on a capacitive or infrared touch device is commonly used to scroll the screen or a menu, a short press on the touch device is often interpreted as the beginning of a sliding operation rather than as a selection. The selection operation is the first step of a drag operation. For example, when a drag operation is used to select text, a press operation first selects the position in front of the text or the first word, the press is held, and the selection completes when the press is released at the last word. Similarly, when an icon is moved by dragging, the icon is first selected by a press operation, the press is held while moving to the target position, and releasing it there completes the move.
A touch device therefore needs a duration threshold on the press operation to separate sliding from dragging. On an object, a press longer than the threshold is a long press and is interpreted as a selection operation that starts dragging the object; a press shorter than the threshold that ends with a release is a short press and is interpreted as selecting the function or application the object represents; and a press shorter than the threshold that leaves the object through a move operation is the beginning of a move and is interpreted as scrolling the screen.
In some applications, using a duration threshold to distinguish selection operations is cumbersome and hurts the smoothness of operation. For example, when selecting an object in a game, many opportunities may already have been missed while waiting for the duration threshold to be reached.
Existing keyboards input phonemes, such as Roman pinyin or Zhuyin symbols, through their keys. Because current smart devices provide word suggestions, a user usually inputs two or more phonemes and the device suggests words based on them. If, after two or more phonemes have been input, the phoneme of the first word turns out to be wrong and produces wrong suggestions, the user must delete phonemes one by one from the last phoneme back to the erroneous one, which is inconvenient.
In addition, the limited space available for a keypad makes text entry on mobile phones inconvenient. As more and more keyboards for different languages, symbols, emoticons (emojis), and input methods are installed on a phone, switching between them becomes troublesome and time-consuming.
Disclosure of Invention
In view of the above, it is desirable to provide a text input method that lets a user who has continuously input two or more phonemes correct an erroneous phoneme directly and conveniently, for example through gesture operations.
The embodiment of the invention provides a character input method which is executed in an electronic device and is characterized by comprising the following steps:
allowing input of one or more phonemes, wherein the one or more phonemes form a set of phonemes;
allowing each phoneme in the set of phonemes to be processed with a gesture operation;
generating a word choice list, wherein the word choice list comprises word choices derived from one or more phonemes in the set of phonemes;
altering the set of phonemes in response to an altering gesture operation for altering one or more phonemes in the set, to produce an altered set of phonemes;
generating an updated word choice list, wherein the updated word choice list includes word choices derived from one or more phonemes in the altered set of phonemes;
enabling an option in the updated list of text options for text entry.
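For illustration only, the claimed flow can be read as the following minimal Python sketch; the lookup_candidates function and its dictionary are hypothetical stand-ins for the device's candidate-word engine, not part of the patent.

```python
# Hypothetical sketch of the claimed flow; lookup_candidates() and
# its demo dictionary stand in for a real candidate-word engine.
from typing import Callable, List

def lookup_candidates(phonemes: List[str]) -> List[str]:
    demo = {("n", "i"): ["你", "泥"], ("w", "o"): ["我", "窝"]}
    return demo.get(tuple(phonemes), [])

class PhonemeInput:
    def __init__(self) -> None:
        self.phonemes: List[str] = []            # the set of phonemes

    def input_phoneme(self, p: str) -> List[str]:
        self.phonemes.append(p)                  # step: input a phoneme
        return lookup_candidates(self.phonemes)  # the word choice list

    def alter(self, edit: Callable[[List[str]], List[str]]) -> List[str]:
        # An altering gesture (delete, copy, move, replace) edits the
        # set of phonemes; the word choice list is then regenerated.
        self.phonemes = edit(self.phonemes)
        return lookup_candidates(self.phonemes)  # updated word choice list
```

Deleting an erroneous first phoneme, for example, would be one such edit, after which the updated word choice list replaces the wrong suggestions.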
Preferably, the gesture operation for altering one or more phonemes in the set of phonemes comprises a tap and gesture movement operation.
Preferably, the text input method further includes:
determining whether a first portion of the tap and gesture movement operation conforms to a first input mode or a second input mode;
processing a remaining portion of the tap and gesture movement operation with a first heuristic algorithm in the event that the first portion conforms to the first input mode; and
processing the remaining portion of the tap and gesture movement operation with a second heuristic algorithm in the event that the first portion conforms to the second input mode.
Preferably, the text input method further includes:
determining, with the first heuristic algorithm, whether the remaining portion of the tap and gesture movement operation conforms to a delete gesture;
in the event that the remaining portion of the tap and gesture movement operation conforms to the delete gesture, deleting the phoneme selected by the altering gesture operation from the set of phonemes to produce the altered set of phonemes.
Preferably, the text input method further includes:
and judging that the rest part of the touch and gesture movement operation conforms to the deletion gesture under the condition that the rest part of the touch and gesture movement operation drags the selected phoneme from the phoneme region where the phoneme set is located to the outside of the phoneme region.
Preferably, the text input method further includes:
determining, with the first heuristic algorithm, whether the remaining portion of the tap and gesture movement operation conforms to a copy gesture;
in the event that the remaining portion of the tap and gesture movement operation conforms to the copy gesture, copying the phoneme selected by the altering gesture operation in the set of phonemes to produce a copy of the selected phoneme, and adding the copy to the set of phonemes to produce the altered set of phonemes.
Preferably, the text input method further includes:
and under the condition that the rest part of the touch and gesture movement operation drags the selected phoneme from the phoneme region where the phoneme set is located to the outside of the phoneme region and then drags the selected phoneme to another copy destination position in the phoneme region, judging that the rest part of the touch and gesture movement operation conforms to the copy gesture.
Preferably, the text input method further includes:
determining, with the first heuristic algorithm, whether the remaining portion of the tap and gesture movement operation conforms to a move gesture;
in the event that the remaining portion of the tap and gesture movement operation conforms to the move gesture, moving the phoneme selected by the altering gesture operation in the set of phonemes to a move destination position to produce the altered set of phonemes.
Preferably, the text input method further includes:
and in the case that the rest part of the tap and gesture movement operation drags the selected phoneme from the phoneme region where the phoneme set is located to another movement destination position in the phoneme region along a path in the phoneme region, judging that the rest part of the tap and gesture movement operation is in accordance with the movement gesture.
Preferably, the text input method further includes:
determining, with the second heuristic algorithm, whether the remaining portion of the tap and gesture movement operation conforms to a replace gesture;
in the event that the remaining portion of the tap and gesture movement operation conforms to the replace gesture, replacing the phoneme selected by the altering gesture operation in the set of phonemes with a replacement symbol to produce the altered set of phonemes.
Preferably, the text input method further includes:
in the case that the first portion of the tap and gesture movement operation conforms to the second input mode, determining that the remaining portion conforms to the replace gesture, and selecting one of a plurality of symbols as the replacement symbol according to the action path of the remaining portion of the tap and gesture movement operation.
Preferably, the text input method further includes:
determining that the first portion of the tap and gesture movement operation conforms to the first input mode in the case that its operation period is shorter than a time threshold;
determining that the first portion of the tap and gesture movement operation conforms to the second input mode in the case that its operation period is longer than the time threshold.
Preferably, the text input method further includes:
determining that the first portion of the tap and gesture movement operation conforms to the first input mode in the case that its total force data does not exceed a total force threshold; and
determining that the first portion of the tap and gesture movement operation conforms to the second input mode in the case that its total force data exceeds the total force threshold.
Preferably, the first heuristic algorithm includes a switching condition for switching to the second heuristic algorithm, and, when the switching condition is met, hands the judgment of the remaining portion of the tap and gesture movement operation over to the second heuristic algorithm.
Preferably, the second heuristic algorithm includes a return condition for switching back to the first heuristic algorithm, and, when the return condition is met, hands the judgment of the remaining portion of the tap and gesture movement operation over to the first heuristic algorithm.
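A hedged sketch of the two heuristic algorithms and the mode dispatch described above follows; the region geometry, thresholds, and helper names are illustrative assumptions rather than values taken from the patent.

```python
# Illustrative sketch; thresholds and geometry are assumptions.
from dataclasses import dataclass
from typing import List, Optional, Tuple

Point = Tuple[float, float]
Rect = Tuple[float, float, float, float]   # x0, y0, x1, y1

@dataclass
class TapAndMove:
    first_duration: float   # seconds, first portion of the operation
    first_force: float      # total force of the first portion
    path: List[Point]       # remaining portion: the drag path (non-empty)

TIME_THRESHOLD = 0.5        # assumed
FORCE_THRESHOLD = 100.0     # assumed

def inside(r: Rect, p: Point) -> bool:
    return r[0] <= p[0] <= r[2] and r[1] <= p[1] <= r[3]

def first_heuristic(path: List[Point], phoneme_region: Rect) -> str:
    """First input mode: discriminate delete, copy, and move."""
    left = any(not inside(phoneme_region, p) for p in path)
    ends_inside = inside(phoneme_region, path[-1])
    if left and not ends_inside:
        return "delete"     # dragged out of the phoneme region
    if left and ends_inside:
        return "copy"       # dragged out, then back to a destination
    return "move"           # path stayed inside the phoneme region

def second_heuristic(path: List[Point],
                     symbols: List[Tuple[Rect, str]]) -> Optional[str]:
    """Second input mode: the action path selects a replacement symbol."""
    for region, symbol in symbols:
        if inside(region, path[-1]):
            return symbol   # replace the selected phoneme with this
    return None

def classify(op: TapAndMove, phoneme_region: Rect,
             symbols: List[Tuple[Rect, str]]):
    # Mode discrimination by operation period (a total force
    # threshold may be used instead, as the claims state).
    if op.first_duration < TIME_THRESHOLD:
        return first_heuristic(op.path, phoneme_region)
    return second_heuristic(op.path, symbols)
```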
An embodiment of the present invention provides a method for inputting characters, executed in an electronic device, including:
detecting a gesture operation acting on a graphical user interface related to a text input function;
determining, with a heuristic algorithm for discriminating the input mode, whether a first portion of the gesture operation conforms to a first input mode or a second input mode;
in the case that the first portion of the gesture operation conforms to the first input mode, processing a second portion of the gesture operation with a first heuristic algorithm for the first input mode, wherein the first heuristic algorithm is used to determine whether the second portion of the gesture operation enables one of a first set of options associated with the graphical user interface; and
in the case that the first portion of the gesture operation conforms to the second input mode, processing the remaining portion of the gesture operation with a second heuristic algorithm for the second input mode, wherein the second heuristic algorithm is used to determine whether the remaining portion of the gesture operation enables one of a second set of options associated with the graphical user interface.
Preferably, the heuristic algorithm for discriminating the input mode determines whether the first portion of the gesture operation conforms to the first input mode or the second input mode according to a time threshold.
Preferably, the heuristic algorithm for discriminating the input mode determines whether the first portion of the gesture operation conforms to the first input mode or the second input mode according to a total force threshold, against which the total force data of the gesture operation is compared.
Compared with the prior art, the character input method provided by the invention allows phonemes to be operated on with gestures, thereby changing the candidate word list. In addition, the method can choose between the first and second heuristic algorithms for interpreting a gesture operation according to a total force threshold or a time threshold, and each heuristic algorithm further includes a switching or return condition for handing over to the other.
Drawings
FIG. 1A shows a block diagram of an electronic system embodiment of the present invention;
FIG. 1B shows a schematic diagram of the composition of an embodiment of a remote control application;
FIGS. 2A-2G show a pressure curve, a force-receiving area curve, and a total force curve of a touch operation signal;
FIG. 3 shows hardware and software layers of a mobile device and a media playback device;
FIG. 4 is a flowchart of processing the touch operation signal and determining whether selection and dragging are initiated by it;
FIG. 5A shows a block diagram of an electronic system embodiment of the present invention;
FIG. 5B is a schematic diagram of an embodiment of a keyboard;
FIG. 6A shows an auxiliary frame indicating that a heavy pressing operation has been effectively actuated.
FIG. 6B shows a key operation signal diagram with reference to a time axis.
FIG. 7 is a flow chart of an embodiment of character input for displaying candidate characters using a GUI menu for character display.
FIG. 8A shows a character-display graphical interface menu in which a key activates the candidate word sequence "wxyz" in the predetermined order.
FIG. 8B shows a schematic diagram of the presentation of the next word "x" of the sequence "wxyz" in the text entry area.
FIG. 8C is a schematic diagram of inputting a candidate word "y" into the text input area.
FIG. 8D illustrates another embodiment of a graphical interface for character display in which candidate words are represented by other auxiliary keys.
FIG. 9 is a diagram illustrating an embodiment of a graphical interface of a first input mode menu, in which a plurality of input method options are associated with a plurality of keyboards and represented by auxiliary keys.
FIG. 10 is a diagram illustrating an exemplary graphical interface of a second input mode menu, in which a plurality of other types of input method options are associated with a plurality of keypads and represented by auxiliary keys.
FIG. 11 is a schematic diagram of another keyboard embodiment.
FIG. 12A shows a template for a single key associated with multiple key options arranged in a predetermined sequence.
FIG. 12B shows a template for a single key associated with multiple key options arranged in a changed sequence.
FIG. 13 shows a flow diagram of an embodiment of a text input method for processing phonemes.
FIG. 14 shows a schematic diagram of a phoneme deletion gesture.
FIG. 15 is a diagram illustrating deletion of a phoneme by a delete gesture in the phoneme region.
FIG. 16 is a flowchart of an embodiment of a heuristic algorithm for determining the delete, copy, move, and replace gestures.
FIG. 17 shows a schematic diagram of a phoneme copy gesture.
FIG. 18 shows a schematic diagram of a phoneme movement gesture.
FIG. 19 shows a schematic diagram of a phoneme substitution gesture.
FIG. 20 is a diagram illustrating the replacement of one phoneme with another in response to a phoneme replacement gesture.
FIG. 21 shows a schematic diagram associated with a graphical user interface finite state machine (finite state machine).
Description of the main elements
The following detailed description will further illustrate the invention in conjunction with the accompanying drawings.
Paths P1, P2
Time period T0-T3
Processor 10
Electronic system 10a
Electronic device 100
Main memory 20
Pressure curve 21
Force area curve 22
Total force curves 23,24,25,26,27,28
Buttons 201-221
Key positions 218a-221a
Display 30
Operating units 31,32,33,34
Mobile device 40
Processor 41
Memory 42
Display 43
Quartz oscillator 44
Controller 45
Hardware layers 400, 500
Touch device 401
Wireless communication module 402, 502
Input unit 403
Operating system cores 410, 510
System libraries 420, 520
Cursor libraries 421, 521
System frame 430, 530
Remote control application 440
Target application 450
Counter 441
Detector 442
Selecting action judging module 443
Instruction generator 444
Signal generating module 445
Conversion module 446
Drag action determination module 448
Media player 50
Memory 52
Display 53
Quartz oscillator 54
Timers 55,56
Text entry area 500
Text 501
Phonemic symbols 503
Text 504
Phonemic symbols 505
Text 506
Phonemic notation 507
Word 510
Text 510a
Input control library 511
Character 513
Menu 522
Keyboard region 523
Text options area 524
Keys 525,526,527,531,532,533,534,535,536
Phonemes 531a-536a,535b,536b
Input operations service 540
Target application 550
Operation region 541-548
Phonemes 541a-548a
Region 560,561,562
Wireless network 60
Wireless communication channel 61
Articles 71,72,73
Outer frame 74
Input method options 81-84
Input method options 81a-84a
Keyboards 81b-84b,81c-84c
Graphic interface 800 for character display
Cursor 801
Menu 803
Gesture 811-814
Path 814a
Symbols 820-824
Operating regions 820a-824a,830a-834a
Options 820b-824b,830b-834b
Touch operation signal 90
User's hand 92
State 920,921,922,923,924,925
State machine 930
Connecting line 931,932,933,934,935,936,937,938
Detailed Description
In order to make the features and characteristics of the present invention comprehensible, preferred embodiments accompanied with FIGS. 1 to 21 are described in detail below. The present description provides various examples to illustrate the technical features of various embodiments of the invention. The arrangement of components in the embodiments is illustrative, not restrictive, and the repetition of certain reference numbers in the following examples is for simplicity and clarity and does not in itself dictate a relationship between the various examples.
The present invention relates to a touch operation method and an electronic system using the same, which enable a user to operate an electronic system, such as a smart phone, a tablet personal computer, a set-top box or a smart TV, in an easier and more intuitive manner. The implementation is characterized in that a short-press operation is used to simulate a long-press operation.
As shown in FIG. 1A, the electronic system 10a includes a mobile device 40 and a media playing device 50. Each unit and module in the electronic system 10a may be implemented by a computer program or a circuit. The processor 41 of the mobile device 40 is communicatively connected to the memory 42, the display 43, the touch device 401 and the wireless communication module 402. Embodiments of the mobile device 40 may include a personal digital assistant (PDA), a notebook computer, a smart phone, or a tablet computer. The memory 42 of the mobile device 40 may include an operating system and applications, such as an Android™ operating system, the remote control application 440, and the target application 450.
FIG. 1B shows a schematic diagram of the remote control application 440. The detector 442 is used to detect touch operations on the touch device 401. A touch operation is a user operation acting on a touch device, such as 401, which detects the user operation as an event. The touch device distinguishes the various gestures it detects as different touch operations, such as press, release, short press, long press, light press, heavy press, drag, move, slide, and other operations/events. If the total force of a short press acting on the touch device 401 is greater than the total force threshold, the short press is defined as a heavy press. The instruction generator 444 is configured to generate a long-press signal representing a long-press operation, or a press-down signal representing a press-down operation, when receiving a short press whose total force on the touch device 401 is greater than the total force threshold (i.e., a heavy-press operation). The signal encapsulation module 445 is used to encapsulate the signal generated by the instruction generator 444 into a unit of data transmission, such as a frame or a packet. The instruction generator 444 uses the signal encapsulation module 445 and the wireless communication module 402 to generate and transmit a wireless communication signal representing the total force data of the touch operation signal 90 to the media playing device 50, so as to control the media playing device 50. The other modules and units in the remote control application 440 are described later.
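As a minimal sketch of the instruction generator's rule above (the function name, field names, and threshold value are assumptions, not the module's actual interface):

```python
# Assumed names; sketches the instruction generator's rule only.
TOTAL_FORCE_THRESHOLD = 100.0    # assumed calibration value

def on_short_press(total_force: float, send) -> None:
    if total_force > TOTAL_FORCE_THRESHOLD:
        send({"type": "long_press"})   # heavy press simulates a long press
    else:
        send({"type": "press"})        # ordinary short press
```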
The processor 51 in the media playing device 50 is communicatively connected to the memory 52, the display 53, the input device 501 and the wireless communication module 502. Examples of the media playing device 50 include a smart TV or a set-top box. FIG. 1A is merely an example; the display 53 may be omitted in embodiments where the media playing device 50 is a set-top box. Examples of the mobile device 40 may also include a media playing device, such as a smart phone.
The memory 52 of the media playing device 50 may include an operating system and application programs, such as an Android™ operating system, the input operation service 540, and the target application 550.
The processors 41 and 51 are the central processing units (CPUs) of the mobile device 40 and the media playing device 50, and may be integrated circuits (ICs) for processing data and executing computer programs.
The wireless communication modules 402 and 502 establish a wireless communication channel 61, so that the mobile device 40 and the media playing device 50 can communicate via the wireless communication channel or connect to a network application store, and download applications, such as the remote control application 440 and the input operation service 540, from the application store.
The wireless communication modules 402 and 502 may each include an antenna, a baseband, and a radio frequency (RF) chipset for wireless local area network communication and/or cellular communication, such as Wideband Code Division Multiple Access (W-CDMA) and High Speed Downlink Packet Access (HSDPA).
Embodiments of the touch device may include capacitive, resistive, or infrared touch devices. The touch device detects a touch operation and generates a touch electronic signal. The controller 45 of the touch device 401 generates a touch data signal representing the touch electronic signal received from the touch device. The touch data signal comprises a touch packet sequence made up of a plurality of touch packets, each of which includes a pressure field, an area field, and a coordinate field storing the pressure value, force-receiving area, and coordinates associated with that packet, representing the pressure, contact area, and position of the touch operation.
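A sketch of one touch packet as just described (the field names are assumptions for illustration):

```python
# One touch packet; field names are assumed for illustration.
from dataclasses import dataclass
from typing import List

@dataclass
class TouchPacket:
    pressure: float   # pressure field
    area: float       # area field: force-receiving area
    x: float          # coordinate fields
    y: float

TouchPacketSequence = List[TouchPacket]   # the touch data signal
```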
The touch device 401 may include a touch pad on a display, or may be combined with the display 43 to form a touch screen. The input device 501 may include control buttons, an alphanumeric keyboard, a touch panel, and a touch screen.
In the remote control application 440, the detector 442 is used to detect the operation state of the touch device 401. The counter 441 is used to count and notify the processor 41 of the start time, end time and duration of an operation state of the touch device 401. The selection action determination module 443 is used to determine whether a press operation on the touch device 401 is a heavy press representing a long press. A long press is a press on the touch device 401 lasting longer than a time threshold, and a short press is one lasting less than the time threshold. A heavy press is a press whose total force on the touch device 401 is greater than the total force threshold, where the total force at a given moment is the product of the pressure value and the force-receiving area that the touch operation applies to the touch device 401. Because a heavy press is determined by the total force threshold rather than the time threshold, a heavy press may also be a short press.
A quartz oscillator 44 provides a frequency signal to the processor 41 and other components in the mobile device 40, and the quartz oscillator 54 provides a frequency signal to the processor 51 and other components in the media playing device 50. The controller 45 or the driver of the touch device 401 may generate touch packets over time according to the time information provided by the quartz oscillator 44 or the counter 441; each touch packet includes the pressure value, force-receiving area, and coordinates of the touch operation applied to the touch device 401, stored respectively in its pressure field, area field, and coordinate field.
The signal encapsulation module 445 feeds the touch packets within a specific time interval of the touch packet sequence of the touch operation signal 90 to the conversion module 446. The conversion module 446 generates the total force associated with an input touch packet by multiplying the packet's pressure value by its force-receiving area. It thereby generates a total force for each touch packet in the sequence, and these total forces constitute the total force data of the touch operation, which can be represented by a total force curve.
In various embodiments, the conversion module 446 first generates a product by multiplying the pressure value and force-receiving area of each input touch packet, and then generates the total force associated with an input packet by averaging the products associated with the packets within a specific time segment.
The signal encapsulation module 445 or the conversion module 446 stores the total force associated with an input touch packet in the pressure field of that packet, replacing its pressure value. Examples of the specific time segment are the time segment T1 in FIG. 2G or a smaller segment, such as a sub-segment obtained by splitting time segment T1.
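Continuing the TouchPacket sketch above, the conversion module's multiply-then-average rule might look like the following; the window size is an assumption:

```python
# Sketch of the conversion module: per-packet total force is
# pressure x area, smoothed by averaging over a short window and
# written back into the pressure field (window size assumed).
def to_total_force(packets: TouchPacketSequence, window: int = 4):
    products = [p.pressure * p.area for p in packets]
    for i, pkt in enumerate(packets):
        lo = max(0, i - window + 1)
        pkt.pressure = sum(products[lo:i + 1]) / (i + 1 - lo)
    return [p.pressure for p in packets]   # the total force data
```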
The processor 41 displays an object 71 on the display 43. The target program of the mobile device 40 needs to receive a pressing operation or a long pressing operation to select the object 71, and if a release signal representing a release operation is received, the selection operation is ended. The target program of the mobile device 40 continuously receives the coordinates of the touch operation represented by the touch operation signal 90, and can perform the dragging action of the object 71 according to the received coordinates. Examples of the target program of the mobile device 40 may include a target application 450 or an operating system of the mobile device 40. For example, the target application 450 of the mobile device 40 needs to receive a long press operation to select the object 71. The long press operation is an operation in which a period from when the mobile device 40 receives the press operation to when the mobile device receives the release operation is longer than a time threshold.
The processor 51 displays an object 72 on the display 53. The target program of the media playing device 50 needs to receive a pressing operation or a long pressing operation to select the object 72, and if a release signal representing a release operation is received, the selecting operation is ended. The target program of the media playing device 50 continuously receives the coordinates of the touch operation represented by the touch operation signal 90, and can perform the dragging action of the object 72 according to the received coordinates. Examples of the target program of the media playback device 50 may include the target application 550 or the operating system of the media playback device 50. For example, the target application 550 of the media playing device 50 needs to receive a long-press operation to select the object. The long-press operation is an operation in which the period from when the media playback device 50 receives the press operation to when the media playback device receives the release operation is longer than the time threshold.
Fig. 2A shows a pressure curve 21 and a force-receiving area curve 22 of the touch operation signal 90 received by the processor 41 from the touch device 401. The touch operation signal 90 includes a touch packet sequence. The touch packet sequence comprises a plurality of touch packets. The horizontal axis in fig. 2A to 2G represents the sequence number of the touch packets received by the processor 41 in time sequence, and the vertical axis represents the unit of the values in the pressure field and the area field of the touch packets. The pressure curve 21 is generated according to pressure values in pressure fields of a plurality of touch packets in the touch packet sequence. The force-receiving area curve 22 is generated according to force-receiving area values in area fields of a plurality of touch packets in the touch packet sequence.
Fig. 2B shows the total force curves 23 and 24 of the touch operation signal 90 received by the processor 41 from the touch device 401. The total force curves 23 and 24 are generated according to the value of the total force associated with each of the plurality of touch packets in the sequence of touch packets. The total force of the total force curve 23 is generated by the multiplication. The total force of the total force curve 24 results from the multiplication and averaging operations.
FIGS. 2C, 2D, 2E and 2F show the total force curves 25, 26, 27 and 28, respectively, of the touch operation signal 90 received by the processor 41 from the touch device 401. The total force curves 25, 26, 27 and 28 represent different touch operations on the touch device 401: curve 25 represents a pressing operation, curve 26 a moving operation, curve 27 a press-then-move operation, i.e., a drag operation, and curve 28 a light press operation. A light press is a press whose total force value is smaller than the total force threshold; a heavy press is a press whose total force value is greater than the total force threshold. FIG. 2G shows the total force curves 25, 26, 27 and 28 superimposed on one another. In time segment T1, the curves 25 and 27, representing the pressing and dragging operations, are distinguished from the curves 26 and 28, representing the moving and light press operations, by the magnitude of the total force value. The selection action determination module 443 can therefore determine, from the total force threshold, that curves 25 and 27 include a heavy press while curves 26 and 28 do not, and can use the portions of curves 25 and 27 in time segment T1 as the heavy-press signal that initiates the selection action on the object 71 or 72.
As shown in FIG. 6A, when a heavy pressing operation is applied to an object 73, an outer frame 74 is displayed surrounding the object during the period in which the first selection operation on the object 73 is initiated. The electronic system may also indicate the heavy press applied to the object 73 with other visual effects. Instances of the object 73 may include the object 71 or 72.
The lower-left starting point of each curve, near the origin, is the start of the touch operation that the curve represents. It should be appreciated that the time from the lower-left start of the total force curves 25, 26, 27 and 28 to the right boundary of time segment T1 is less than the time threshold. The left boundary of time segment T1 is generally about 0.1 seconds from the origin of FIG. 2G, and the right boundary about 0.5 seconds from the origin.
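In sketch form, the selection-action decision over time segment T1 might be (the slice indices and threshold are assumptions):

```python
# Sketch: does the total force curve indicate a heavy press inside
# time segment T1? Curves 25 and 27 would pass; 26 and 28 would not.
def is_heavy_press(total_force_data, t1: slice,
                   threshold: float = 100.0) -> bool:
    window = total_force_data[t1]        # packets falling inside T1
    return max(window, default=0.0) > threshold

# e.g. with packets at 10 ms intervals, T1 ~ 0.1 s to 0.5 s:
# is_heavy_press(forces, slice(10, 50))
```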
Referring to FIG. 3, the mobile device 40 receives the touch operation signal 90 from the touch device 401 in the hardware layer 400. The processor 41 of the mobile device 40 then transmits and converts the touch operation signal 90 between the software and hardware units in the figure along the sequence indicated by path P1. The mobile device 40 further uses the wireless communication module 402 in the hardware layer 400 to transmit the touch operation signal 90 to the media playing device 50 through the wireless network 60.
The media playing device 50 receives the touch operation signal 90 through the wireless communication module 502 in the hardware layer 500. The processor 51 of the media playing device 50 then transmits and converts the touch operation signal 90 between the software and hardware units in the figure along the sequence indicated by path P2, and passes the touch operation signal 90 to the target application 550 by using the cursor function 521 in the system library 520. The target application 550 of the media playing device 50 uses the touch operation signal 90 as a cursor control signal or a control signal for the object 72 to execute a corresponding function.
The software units of the mobile device 40 include the operating system kernel 410, the system library 420, the virtual system framework 430, and the remote control application 440. The software elements in the system library 420 include the cursor function 421. The hardware units of the mobile device 40 belong to the hardware layer 400, which includes the touch device 401, the wireless communication module 402, and other hardware of the mobile device 40.
An example of the operating system kernel 410 may be Linux or another operating system kernel; other operating systems may include Windows™, Mac OS™, or iOS™. An instance of the virtual system framework 430 may include the Android™ operating system or another virtual machine. The wireless communication module 402 may include a wireless network device conforming to the 802.11 family of standards established by the Institute of Electrical and Electronics Engineers (IEEE) or to other wireless communication standards, such as Bluetooth™ or ZigBee™.
The transmission and conversion of the touch operation signal 90 by the processor 41 along path P1 includes signal processing in each unit and the transmission and conversion between units listed in Table 1 below:
TABLE 1
Sequence  Transmitting unit                  Receiving unit
1         Touch device 401                   Operating system kernel 410
2         Operating system kernel 410        Cursor function 421
3         Cursor function 421                Virtual system framework 430
4         Virtual system framework 430       Remote control application 440
5         Remote control application 440     Virtual system framework 430
6         Virtual system framework 430       System library 420
7         Operating system kernel 410        Wireless communication module 402
8         Wireless communication module 402  Wireless network 60
The software units of the media playing device 50 include the operating system kernel 510, the system library 520, the virtual system framework 530, the input operation service 540, and the target application 550. The input operation service 540 is an application program. The software elements in the system library 520 include the cursor function 521, and the software elements in the operating system kernel 510 include the input control function 511. The hardware unit of the media playing device 50 is the hardware layer 500, which includes the wireless communication module 502 and other hardware of the media playing device 50.
An example of the operating system kernel 510 may be Linux or another operating system kernel; other operating systems may include Windows™, Mac OS™, or iOS™. An instance of the virtual system framework 530 may include the Android™ operating system or another virtual machine. An example of the input control function 511 is the uinput facility in Linux. The wireless communication module 502 and the wireless network 60 may include wireless network devices and wireless networks conforming to the 802.11 family of standards established by the Institute of Electrical and Electronics Engineers (IEEE) or to other wireless communication standards, such as Bluetooth™ or ZigBee™. The wireless network 60 may comprise a wireless communication channel between the mobile device 40 and the media playing device 50, a wireless network, or other network devices. In other embodiments, the network 60 may include a wide area network (WAN), such as one or more public land mobile networks (PLMNs) and the Internet. The wireless communication modules 402 and 502 can establish a low-latency wireless channel to transmit the touch operation signal 90. One practical example of the low-latency wireless channel is a radio channel using a shortened transmission time interval (sTTI) in the Long Term Evolution (LTE) standard protocol.
The wireless communication module 502 receives the touch operation signal 90 from the wireless network 60. The transmission and conversion of the touch operation signal 90 by the processor 51 along the path P2 includes signal processing in each cell and transmission and conversion between cells in table 2 below:
TABLE 2
[Table 2, which lists the corresponding signal processing and the transmission and conversion between units along path P2, is published as images in the original document.]
In this way, the touch operation signal received by the cursor function 421 can be transmitted and interpreted as a touch signal received by the cursor function 521, and the touch operation signal 90 is delivered to the target application 550 according to the predetermined call and control relationship between the cursor function 521 and the target application 550. The target application 550 uses the touch operation signal 90 as a user operation signal, such as a cursor control signal, to execute a corresponding function.
FIG. 4 shows the processing and determination of the touch operation signal 90 by the mobile device 40 or the media playing device 50. The processor 41 or the processor 51 may execute the flow of FIG. 4. The input operation service 540 may execute the method of FIG. 4 upon receiving the touch operation signal 90; alternatively, the remote control application 440 may do so upon receiving the touch operation signal 90.
It is determined whether the touch operation represented by the touch operation signal 90 has ended (step S2). If so, the flow of FIG. 4 ends. If not, it is determined whether the touch operation has lasted more than 0.1 second (step S4). If not, step S2 is repeated. If the touch operation exceeds 0.1 second, it is determined whether it has lasted more than 0.5 second (step S8). If not, a touch packet of the touch operation, including its current coordinates, is transmitted (step S6). If the touch operation exceeds 0.5 second, it is determined whether its movement exceeds 15 pixels (step S10). If the movement does not exceed 15 pixels, a touch packet including the current coordinates is transmitted (step S22) and it is determined whether the touch operation has ended (step S24). If the movement exceeds 15 pixels, it is determined whether the total force of the touch operation is greater than the total force threshold (step S12). If not, step S22 is repeated. If the total force is greater than the total force threshold, a press-down signal representing a press-down operation or a long-press signal representing a long-press operation is generated and transmitted (step S14), a touch packet including the current coordinates is transmitted (step S16), and it is determined whether the touch operation has ended (step S18). If not, step S16 is repeated. If the touch operation has ended, a release signal representing a release operation is generated and transmitted (step S20).
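Rendered as straight-line pseudocode (the op helper methods are assumptions standing in for the live touch state), the FIG. 4 flow is roughly:

```python
# Sketch of the FIG. 4 flow; op.* helpers are assumed to reflect
# the live touch state, which advances while the loop polls it.
def process_touch(op, send_packet, send_signal, force_threshold):
    while not op.ended():                              # step S2
        if op.duration() <= 0.1:                       # step S4
            continue
        if op.duration() <= 0.5:                       # step S8
            send_packet(op.current_coords())           # step S6
            continue
        if op.movement_px() <= 15:                     # step S10
            send_packet(op.current_coords())           # steps S22/S24
            continue
        if op.total_force() <= force_threshold:        # step S12
            send_packet(op.current_coords())           # step S22
            continue
        send_signal("long_press")                      # step S14
        while not op.ended():                          # step S18
            send_packet(op.current_coords())           # step S16
        send_signal("release")                         # step S20
        return
```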
The processor 41 or 51 uses the first instance of the press-down signal or long-press signal generated from the total force data to initiate a first selection action on the object 71 or 72.
The processor 41 or 51 performs the following drag action determination. The drag action determination module 448 is used to determine whether the total force data of the touch operation signal 90 initiates a first drag action on the object 71 or 72: the processor 41 or 51 uses the module 448 to determine whether the displacement of the touch operation represented by the total force data exceeds n pixels, where n is a positive integer. If the displacement exceeds n pixels, the first drag action on the object 71 or 72 is initiated; when the first selection action ends, the first drag action ends.
In other embodiments, the processor 41 displays a first operation component of a graphical user interface to receive a pressing operation on the touch device 401, and generates the total force threshold according to that pressing operation.
The relationship and generation sequence of the heavy-press signal, the long-press signal and the press-down signal can be sequential, parallel or selective. In the sequential relationship, for example, the electronic system 10a generates a long-press signal according to the heavy-press signal, and then generates a press-down signal according to the long-press signal. In the parallel relationship, for example, the electronic system 10a generates both a long-press signal and a press-down signal according to the heavy-press signal. In the selective relationship, for example, the electronic system 10a generates either a long-press signal or a press-down signal according to the heavy-press signal.
The remote control application 440 can generate and transmit a long-press signal or a press-down signal to the target application 550 according to the touch operation signal 90. Alternatively, the remote control application 440 may transmit the touch operation signal 90 to the target application 550, and the target application 550 generates the long-press signal or press-down signal from it.
The touch operation method can coexist with the original long-press selection operation on an object, giving the user another choice and increasing the diversity of object operations. The method generates a long-press signal from the heavy-press signal so as to simulate a long-press operation with a heavy-press operation; the press-down signal and the selection action can then be generated from the long-press signal. The touch operation method can thus accelerate the selection of objects.
U.S. patent application No. 12/432,734, entitled "ELECTRONIC DEVICE SYSTEM UTILIZING A CHARACTER INPUT METHOD", filed on April 29, 2009, published as US 2009/0273566 A1 and issued as US 8,300,016, claims priority from Taiwan application No. 097116277 filed on May 2, 2008, and discloses a text input method. Issued patent US 8,300,016 is incorporated herein by reference. Using the touch operation method described here, that text input method can distinguish different input modes on the same GUI component according to the pressure or total force value.
5.1 Implementation of the text input method
Alternatively, the processor 10 may display a menu with characters on the display 30, the menu having options such as symbols, phonemes, candidate words, or input method options, and a graphical interface for character display presents the candidate words of each button to assist the input of character elements. The keys of the input unit 403 are divided into input method switching keys, character keys, and auxiliary keys in accordance with the graphical interface for character display. For example, the keys 201-212 in FIG. 5B are character keys, and the keys 213-217 are auxiliary keys. Button 217 is an arrow key; pressing at positions 218a, 219a, 220a and 221a moves the cursor up, right, down and left, respectively. The key 217 may also accept a straight-down press as an operation in a fifth direction, and may be replaced with a five-way key in different embodiments. The keyboards of FIGS. 5B, 11, and 14 are illustrated for convenience of description.
Referring to FIG. 7, the processor 10 first starts the character input method (step S7700) and determines whether a character key (hereinafter referred to as key i) is actuated by a gesture operation received at the input unit 403 (step S7701). If so, the processor 10 starts the timer 55 to begin timing the operation of key i (step S7702) and, according to whether the operation conforms to a first operation mode or a second operation mode, enables either the predetermined option sequence of key i or its changed-order option sequence as the currently presented option sequence (step S7705). For example, in the case that the gesture operation conforms to the first operation mode, the predetermined option sequence (default sequence) is enabled as the currently presented option sequence; in the case that the gesture operation conforms to the second operation mode, the changed-order option sequence (alternative sequence) is enabled as the currently presented option sequence. The changed-order option sequence may be a reverse-order option sequence or an expanded option set with more options, such as more candidate words and auto-completed words. FIG. 8D shows an example of the expanded option set. FIGS. 9 and 10 respectively show a predetermined option sequence and a changed-order option sequence of the input method switching key. FIG. 12A shows the predetermined option sequence of button 570, including symbols 820, 821, 822, 823 and 824. The lines in FIG. 12A represent associations between the entities they connect. In the predetermined option sequence, the symbol 820 is associated with an operation area 820a, wherein the operation area 820a activates a key option 820b as the current option upon receiving an operation. The symbol 821 is associated with an operation area 821a, wherein the operation area 821a activates a key option 821b as the current option upon receiving an operation. The symbol 822 is associated with an operation area 822a, wherein the operation area 822a activates a key option 822b as the current option upon receiving an operation. The symbol 823 is associated with an operation area 823a, wherein the operation area 823a activates a key option 823b as the current option upon receiving an operation. The symbol 824 is associated with an operation area 824a, wherein the operation area 824a activates a key option 824b as the current option upon receiving an operation.
FIG. 12B shows a changed-order option sequence for button 570, including options 830b, 831b, 832b, 833b, and 834b in menu 805; the lines in FIG. 12B represent associations between the entities they connect. In the changed-order option sequence, each of the operation areas 830a-834a, upon receiving an operation, activates the corresponding key option 830b-834b as the current option. The options in FIGS. 12A and 12B may include symbols, phonemes, characters, input methods, static or dynamic electronic images, or device-executable functions.
After starting one of the predetermined and changed-order option sequences, the processor 10 displays a menu on the display 30 to present the activated option sequence, in particular its first option (step S7706), and starts the timer 56 to time the operation period of key i (step S7709). For example, in step S7706 the processor 10 displays a menu on the display 30 in which the cursor or the focus of the graphical user interface indicates the first key option of the currently presented option sequence. The key actuated in step S7701 may be an input method switching key, such as key 212 in FIGS. 5B and 11, or key 527 in FIG. 14. If the actuated key is an input method switching key, the processor 10 may display menu 803 of FIG. 9 or menu 804 of FIG. 10 in step S7706. The predetermined option sequence of input method options for the actuated key may include input method options 81, 82, 83, and 84, associated with keyboards 81c, 82c, 83c, and 84c, respectively. The changed-order option sequence may include input method options 81a, 82a, 83a, and 84a, associated with keyboards 81b, 82b, 83b, and 84b, respectively. Each of the options 81-84 and 81a-84a may be selected and actuated to activate the keyboard associated with it. The association of an input method option with a keyboard is shown in dashed lines in FIGS. 9 and 10. The keyboards 81b-84b and 81c-84c may include keyboards of different layouts, languages, and input methods; for example, some of the keyboards in FIGS. 5B, 11, and 14 may be implementations of them.
In one example, assuming key i is key 209, FIG. 8A shows the graphical interface 800 for character display after the key has activated the predetermined option sequence. In the graphical interface 800, candidate words are arranged in a clockwise direction; FIG. 8A is not intended to limit the invention, however, and the candidate words may be arranged counterclockwise or in any other order. When the first candidate word "w" of key 209 is displayed in the character input area 500, cursor 801 indicates "w" as the currently displayed candidate word in the graphical interface 800. The auxiliary keys 218, 219, 220, and 221 represent the candidate words "w", "x", "y", and "z", respectively. Referring to FIG. 9, if the key in step S7701 is an input method switching key actuated by a gesture operation conforming to the first operation mode, the auxiliary keys 218, 219, 220, and 221 are associated with input method options 81, 82, 83, and 84, respectively. Referring to FIG. 10, if the key is an input method switching key actuated by a gesture operation conforming to the second operation mode, the auxiliary keys 218, 219, 220, and 221 are associated with input method options 81a, 82a, 83a, and 84a, respectively.
The processor 10 then continuously detects whether a subsequent option selection operation is received: a short press, move gesture, or slide gesture on the same key i (event A); the end of the operation period of key i, indicated by expiration of timer 56 (event B); an operation on another text key j (event C); a long press on the same key i (event D); or an operation on an operation area or auxiliary key k (event G), where k is a positive integer and, in the example of FIG. 11, 213 ≤ k ≤ 221.
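The event taxonomy of step S7710 can be sketched as a small classifier. The following Python fragment is a hypothetical illustration; the operation record format and the long-press threshold are assumptions.

```python
# Hypothetical classification of a raw operation into the events A-G of step S7710.
from enum import Enum, auto

class Event(Enum):
    A = auto()  # short press / move / slide on the same key i (select next option)
    B = auto()  # timer 56 expired (operation period of key i ended)
    C = auto()  # operation on another text key j
    D = auto()  # long press on the same key i
    G = auto()  # operation on an auxiliary key k (213 <= k <= 221 in FIG. 11)

def classify(op, key_i, timer56_expired, long_press_threshold=0.8):
    """op is a dict like {'key': 209, 'duration': 0.2}; the threshold is assumed."""
    if timer56_expired:
        return Event.B
    if 213 <= op["key"] <= 221:          # auxiliary key range from FIG. 11
        return Event.G
    if op["key"] != key_i:
        return Event.C
    return Event.D if op["duration"] >= long_press_threshold else Event.A

print(classify({"key": 220, "duration": 0.1}, key_i=209, timer56_expired=False))  # Event.G
```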
In step S7710, if an option selection operation on key i is received (event A), the processor 10 resets timer 56 (step S7712) and selects one option of the currently presented option sequence as the selected option (step S7714). For example, where key i is key 209 in the example of FIG. 8A, the processor 10 presents the next word "x" of the forward-order candidate sequence "wxyz", as shown in FIG. 8B. Cursor 801 in the graphical interface 800 for character display also moves clockwise to "x" to indicate the currently displayed candidate word. Step S7710 is then repeated. Similarly, if another option selection operation on key 209 is received, for example another short press, the processor 10 resets timer 56 and displays the next candidate word "y" of the currently presented option sequence "wxyz", and cursor 801 likewise moves clockwise to "y".
Cursor 801 indicates the selected option. The option selection operation may include operations that move cursor 801, such as a tap, press, swipe gesture, move gesture, or slide. Moving a swipe gesture clockwise across keys 218, 219, 220, and 221 actuates cursor 801 to move clockwise to w, x, y, and z; moving a swipe gesture counterclockwise across keys 221, 220, 219, and 218 actuates cursor 801 to move counterclockwise to z, y, x, and w. In the example of FIG. 8D, moving a swipe gesture from key 218 across keys 219, 220, 221, 213, 214, 216, and 215 in clockwise order actuates cursor 801 to move clockwise to a, 2, C, B, A, "tea", C, and B.
Referring to FIG. 9, moving a swipe gesture clockwise across keys 218, 219, 220, and 221 actuates cursor 801 to move clockwise across input method options 81, 82, 83, and 84; a counterclockwise swipe across keys 221, 220, 219, and 218 actuates cursor 801 to move counterclockwise across input method options 84, 83, 82, and 81. Referring to FIG. 10, a clockwise swipe across keys 218, 219, 220, and 221 actuates cursor 801 to move clockwise across input method options 81a, 82a, 83a, and 84a; a counterclockwise swipe across keys 221, 220, 219, and 218 actuates cursor 801 to move counterclockwise across input method options 84a, 83a, 82a, and 81a.
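As an illustration of how a swipe across the auxiliary-key ring drives cursor 801, the sketch below maps the keys reached by a swipe to the options of FIG. 8A; the ring layout and data shapes are assumptions.

```python
# Sketch: mapping a swipe across auxiliary keys 218-221 to cursor positions.
# AUX_RING and OPTIONS are assumed orderings drawn from the FIG. 8A example.
AUX_RING = [218, 219, 220, 221]          # clockwise key order
OPTIONS = ["w", "x", "y", "z"]           # clockwise option order for key 209

def cursor_positions(swipe_keys):
    """Each auxiliary key reached by the swipe moves cursor 801 to its option."""
    return [OPTIONS[AUX_RING.index(k)] for k in swipe_keys]

print(cursor_positions([218, 219, 220, 221]))  # clockwise -> ['w','x','y','z']
print(cursor_positions([221, 220, 219, 218]))  # counterclockwise -> ['z','y','x','w']
```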
If timer 56 expires (event B) in step S7710, the processor 10 activates the currently selected option of the pressed key i and updates the user interface on the display 30 (step S7716). For example, in step S7716 the processor 10 inputs the candidate word currently displayed for key i into the text input area and moves the cursor to the next position in the text input area; step S7701 is then repeated. For instance, if the currently displayed candidate word is "y" when timer 56 expires, as shown in FIG. 8C, the processor 10 inputs "y" into the text input area 500, moves cursor 550a right to the next position of the text input area 500, and stops displaying the graphical interface 800 for character display.
If an operation on a different key j is received (event C) in step S7710, the processor 10 activates the currently selected option of key i, updates the user interface on the display 30 (step S7718), and restarts timer 55 to time the operation of key j (step S7702). For example, upon receiving an operation on another character key j (event C), the processor 10 inputs the candidate character currently displayed for key i into the character input area and moves the cursor to the next position (step S7718). Timer 55 is restarted for character key j (step S7702), and the steps from step S7702 onward are repeated, including steps S7705, S7706, S7709, S7710, S7712, S7714, S7716, S7718, S7720, and S7722.
If a long press on the same key i is received (event D) in step S7710, the processor 10 may activate, before step S7720, a changed-order option sequence relative to the currently presented option sequence. For example, the processor 10 may activate a candidate word sequence in the reverse order of the current candidate word sequence. That is, if the selected order was forward in step S7710, the processor 10 switches it to reverse when performing step S7720; conversely, if it was reverse, the processor switches it to forward. Next, in step S7714, the processor 10 presents the next candidate word of the candidate sequence in the selected order. In the example of FIG. 8A, when the user has been selecting characters in forward order and a long press on the same key 209 is received (event D), the processor 10 presents the character "z" preceding "w" in the forward-order candidate sequence "wxyz" (i.e., the next candidate in the reverse-order sequence), and cursor 801 in the graphical interface 800 moves counterclockwise to "z". Step S7710 is then repeated. Similarly, when another short press on key 209 is received, the processor 10 resets timer 56 and displays the next candidate word "y" of the reverse-order sequence, and cursor 801 moves counterclockwise to "y". Here the candidate sequence is changed by a long press, but it may also be changed by other input devices; for example, a rotary button (rotatable button) or a clockwise or counterclockwise trajectory on a touch panel may realize the clockwise or counterclockwise movement of cursor 801. The display 30 may be a touch screen having the touch panel, and the keyboard in FIG. 11 may be a virtual keyboard displayed on the display 30.
In step S7710, upon receiving an operation on auxiliary key k (event G), the processor 10 activates the option represented by auxiliary key k and updates the graphical user interface (step S7722). For example, the processor 10 inputs the candidate word represented by auxiliary key k into the text input area and moves the cursor to the next position (step S7722), then repeats step S7700 and the subsequent steps, including S7701, S7702, S7705, S7706, S7709, S7710, S7712, S7714, S7716, S7718, S7720, and S7722. In the case of FIG. 8A, if key 220 is actuated, the processor 10 enters the candidate word "y" directly into text entry area 500 regardless of which candidate word is currently displayed, producing the result shown in FIG. 8C. Whether the candidate sequence starts in forward or reverse order, displaying "y" would otherwise require two press operations plus waiting for timer 56 to expire, whereas the auxiliary key inputs the candidate word with a single press. Similarly, in the case of FIG. 8A, operating auxiliary key 218, 219, or 221 causes the processor 10 to input the candidate word "w", "x", or "z", respectively, into text input area 500. Although the candidate words of key 209 thus have the five input modes of events A, B, C, D, and G, these modes coexist without conflict.
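The handling of events A, B, C, D, and G described above can be condensed into one dispatch routine. The sketch below is a simplified model under assumed state fields and event names; it is not the patented implementation.

```python
# A condensed sketch of the step S7710 dispatch for events A, B, C, D, and G.
# KeyInputSession and its fields are illustrative assumptions.
class KeyInputSession:
    def __init__(self, sequence):
        self.sequence = sequence     # currently presented option sequence
        self.index = 0               # position of cursor 801
        self.reverse = False         # flipped by a long press (event D)
        self.text = []               # stands in for text input area 500

    def step(self):
        delta = -1 if self.reverse else 1
        self.index = (self.index + delta) % len(self.sequence)

    def on_event(self, event, aux_index=None):
        if event == "A":             # S7712/S7714: select the next option
            self.step()
        elif event == "D":           # before S7720: flip the selection order
            self.reverse = not self.reverse
            self.step()
        elif event in ("B", "C"):    # S7716/S7718: commit the current option
            self.text.append(self.sequence[self.index])
        elif event == "G":           # S7722: auxiliary key inputs directly
            self.text.append(self.sequence[aux_index])

session = KeyInputSession(list("wxyz"))
session.on_event("G", aux_index=2)   # auxiliary key 220 enters "y" in one press
print("".join(session.text))         # -> y
```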
In the case where the key actuated in step S7701 is an input method switching key, after the gesture operation actuating auxiliary key k ends (event G), the processor 10 actuates the input method option and the keyboard associated with auxiliary key k in step S7722. For example, referring to FIG. 9, the processor 10 actuates the input method option 83 and keyboard 83c associated with auxiliary key 220 in response to operation of auxiliary key 220.
The graphical interface 800 for character display may present more options, such as candidate words including upper- and lower-case letters for each character and auto-complete words. Although only arrow key 217 is used as an auxiliary key in the description, voice commands or other auxiliary keys may also be used to select candidate words in the graphical interface 800 for character display.
5.2. Other embodiments of the text input method:
Referring to FIG. 13, the apparatus 100 may perform a method of gesture operations on phonemes and characters. Phonemes are the constituent elements of characters; for example, phonemes may be letters in English, phonetic symbols or Roman pinyin in Chinese, or hiragana or katakana in Japanese. The method 900 may be performed by a processor, such as processors 10, 41, and 51. The processor receives an input operation from an input device (step S901), such as input device 401, 403, or 501, and generates one or more phonemes in response (step S902). The processor displays the one or more phonemes as gesture-operable objects (step S903). A gesture-operable object can be defined as a class in an object-oriented programming language, where the gesture-operation properties and functions of the class are inherited by the objects that contain the input phonemes. The processor may allow drag-and-drop operations and total-force-related operations on the gesture-operable objects; total-force-related operations are disclosed in U.S. patent publication No. US20160070400. For example, referring to FIG. 14, the processor displays phoneme 531a as a gesture-operable object in the phoneme region 561 in response to an operation of key 531. A key in the mth row and nth column of the text key array in region 562 may be labeled key(m, n); key 531, in row 1, column 2, is thus key(1, 2). Similarly, the processor responds to operations of keys 532, 533, 534, 535, and 536 in region 562 of keyboard region 523 by displaying phonemes 532a, 533a, 534a, 535a, and 536a in the phoneme region 561 as gesture-operable objects. Key 527 may be an input method switching key, key 526 a key for entering a blank, and key 525 an input key.
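The object-oriented modeling described above, a class whose gesture-operation members are inherited by objects containing phonemes, might be sketched as follows; the class and method names are invented for illustration.

```python
# Sketch of the object-oriented design described above: gesture behavior
# lives in a base class and is inherited by the object holding a phoneme.
class GestureOperable:
    """Base class bundling drag-and-drop and total-force-related operations."""
    def on_drag(self, start, end):
        print(f"dragged from {start} to {end}")

    def on_force(self, total_force):
        print(f"total force value: {total_force}")

class PhonemeObject(GestureOperable):   # inherits the gesture capabilities
    def __init__(self, phoneme):
        self.phoneme = phoneme          # the input phoneme the object contains

p = PhonemeObject("ㄅ")                 # e.g. a Chinese phonetic symbol
p.on_drag((10, 20), (300, 20))          # inherited, no redefinition needed
```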
The processor may display text in the text options area 524 based on the one or more phonemes (step S904). The text options area 524 contains one or more texts derived from the phonemes in the phoneme region 561. For example, the processor displays text 501 derived from phonemes 531a, 532a, 533a, and 534a, and text 504 derived from phonemes 535a and 536a. The processor also displays in area 560 the phonetic symbol 503 associated with text 501 and the phonetic symbol 505 associated with text 504; alternatively, the processor may omit phonetic symbols 503 and 505.
The processor detects a gesture operation associated with a phoneme in the phoneme region 561 (step S905). The gesture operation may be applied to a single selected phoneme or to multiple selected phonemes, selected by a selection operation. The gesture operations include delete (event C1), copy (event C2), move (event C3), and replace (event C4) gestures. The processor alters the one or more phonemes in response to a delete gesture (step S906), a copy gesture (step S907), a move gesture (step S908), or a replace gesture (step S909). The processor then parses the one or more phonemes altered by the gesture operation (step S910) and generates an updated text list in area 524, including one or more texts derived from the altered phonemes (step S911).
Referring to FIG. 16, an example of steps S905-S912 is described in detail as follows. Each phoneme-related gesture operation, such as delete, copy, move, or replace, begins by selecting a set of one or more phonemes. The selection operation is a selection gesture forming the first part of the phoneme-related gesture operation; it may be a press or a tap. The remainder of the gesture operation may include a swipe gesture (slide) or a move gesture (move). The processor identifies the first part and determines which input mode the selection gesture conforms to: the delete, copy, and move gestures include a selection gesture conforming to a first input mode, while the replace gesture includes a selection gesture conforming to a second input mode. The processor may distinguish the remaining part of the phoneme-related gesture operation from its first part.
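Since claims 10 and 11 allow the first part of the gesture to be classified either by its duration or by its total force value, a combined discriminator might look like the following sketch; the threshold values are assumptions.

```python
# Hedged sketch of discriminating the first part (press or tap) of a gesture:
# claims 10 and 11 allow either a time threshold or a total-force threshold.
TIME_THRESHOLD = 0.5     # seconds (assumed value)
FORCE_THRESHOLD = 2.0    # arbitrary force units (assumed value)

def input_mode(duration=None, total_force=None):
    """Return 1 or 2 for the first or second input mode."""
    if total_force is not None:                      # claim 11: total force data
        return 1 if total_force <= FORCE_THRESHOLD else 2
    return 1 if duration < TIME_THRESHOLD else 2     # claim 10: operation period

print(input_mode(duration=0.2))        # 1 -> delete/copy/move family
print(input_mode(total_force=3.5))     # 2 -> replacement gesture
```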
If the first part of a phoneme-related delete gesture operation is received in step S905 (event C1), the processor deletes the phoneme associated with the delete gesture. Referring to FIGS. 14 and 16, for example, delete gesture 810 may comprise a selection gesture for selecting phoneme 535a; the selection gesture may be a press or tap on phoneme 535a, or a gesture surrounding it. Upon receiving a gesture operation on a phoneme (step S9051), the processor determines whether the selection gesture constituting the first part of the operation conforms to the first or the second input mode (step S9052). If it conforms to the first input mode, the processor further determines whether the gesture operation moves out of the phoneme region (step S9053). If it does, the processor determines whether the gesture returns to the phoneme region and ends within it (step S9054). If the endpoint is not within the phoneme region, the processor determines that the gesture is a delete gesture and deletes the selected phoneme (step S9055). If the endpoint is still within the phoneme region, the processor determines that the gesture is a copy gesture, copies the selected phoneme to generate a copy, and places the copy at the endpoint (step S9056).
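Steps S9053-S9058 classify a first-input-mode gesture purely from its trajectory relative to the phoneme region. A geometric sketch, under an assumed rectangular region and point format:

```python
# Sketch of steps S9053-S9058: classifying a first-input-mode gesture by
# whether its path leaves the phoneme region and where its endpoint lies.
# Region geometry and point format are assumptions for illustration.
def in_region(point, region):
    (x, y), (x0, y0, x1, y1) = point, region
    return x0 <= x <= x1 and y0 <= y <= y1

def classify_phoneme_gesture(path, region):
    left_region = any(not in_region(p, region) for p in path)   # S9053
    ends_inside = in_region(path[-1], region)                   # S9054/S9057
    if left_region and not ends_inside:
        return "delete"   # S9055: endpoint outside the phoneme region
    if left_region and ends_inside:
        return "copy"     # S9056: left the region but returned to it
    return "move"         # S9058: stayed inside, ends at a new position

REGION = (0, 0, 400, 100)
print(classify_phoneme_gesture([(50, 50), (50, 150)], REGION))            # delete
print(classify_phoneme_gesture([(50, 50), (50, 150), (10, 50)], REGION))  # copy
print(classify_phoneme_gesture([(50, 50), (200, 50)], REGION))            # move
```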
For example, where a drag operation 810 moves phoneme 535a from its original location in region 561 to an endpoint outside region 561, the processor determines that drag operation 810 is a delete operation on phoneme 535a. Referring to FIG. 15, the processor deletes phoneme 535a in response to the delete gesture (step S906). If the first part of a phoneme-related copy gesture operation is received in step S905 (event C2), the processor copies the phoneme selected by the copy gesture to generate a copy and places it at the endpoint of the gesture (step S907). Referring to FIG. 17, for example, the copy gesture may comprise a selection gesture for selecting phonemes 535a and 536a, such as a press or tap on phonemes 535a and 536a or a gesture surrounding them. The copy gesture includes a drag operation shown as sections 811 and 812: section 811 moves phonemes 535a and 536a from region 561 to a temporary position outside it, and section 812 moves them from the temporary position to an endpoint within region 561, to the left of phoneme 531a.
Upon detecting the drag gesture sections 811 and 812, the processor determines that the gesture is a copy gesture acting on phonemes 535a and 536a, and copies them to generate copies, i.e., phonemes 535b and 536b, in response to the copy gesture (step S907). Text 506 is a text option derived from phonemes 535b and 536b, and phonetic symbol 507 is associated with text 506.
In step S9053 of FIG. 16, if the phoneme-related gesture operation remains within region 561 and moves the selected phoneme to an endpoint (step S9057), the processor determines that the operation is a move gesture for moving the selected phoneme to that endpoint (step S9058).
If a phoneme-related move gesture operation is received in step S905 (event C3), the processor moves the phoneme associated with the move gesture and places it at the gesture's endpoint (step S908). Referring to FIG. 18, for example, move gesture 813 may comprise a selection gesture for selecting phoneme 535a, such as a press or tap on phoneme 535a or a gesture surrounding it. The move gesture 813 includes a drag operation that moves phoneme 535a along its path to an end position within region 561, where the drag-and-drop operation ends.
Upon detecting move gesture 813, the processor determines that it is a move gesture applied to phoneme 535a and, in response, moves the phoneme selected by the gesture to the endpoint within region 561 (step S908). Moving phoneme 535a to its new position causes text 504 to disappear. Text 508 is a text option derived from phoneme 535a, and phonetic symbol 509 is associated with it. Text 501a is a text option derived from phonemes 531a, 532a, 533a, and 534a, and phonetic symbol 503 is associated with it. Texts 508 and 501a form a phrase.
In step S9052 of FIG. 16, when the first part of the phoneme-related gesture operation conforms to the second input mode, the processor determines that the operation is a replacement gesture and displays a menu 522 presenting alternative options for the phoneme selected by the gesture (step S9059). The processor selects one of these options according to the remaining part of the replacement gesture (step S9060) and replaces the phoneme selected in step S9051 with it (step S9061). The options may include phonemes, symbols, emoticons (emojis), and other GUI components.
If a replacement gesture operation on an input phoneme is received in step S905 (event C4), the processor responds by selecting another phoneme among the options to replace it (step S909). Referring to FIG. 19, for example, replacement gesture 814 may comprise a selection gesture for selecting phoneme 535a, such as a press or tap on phoneme 535a or a gesture surrounding it. The processor determines that this selection gesture belongs to a replacement gesture rather than a delete, copy, or move gesture, and resolves the movement of the replacement gesture into a selection instruction for choosing another option (e.g., another phoneme).
Upon detecting replacement gesture 814 associated with phoneme 535a, the processor defines operation regions 541, 542, 543, 544, 545, 546, 547, and 548 associated with phoneme 535a; these are respectively associated with the replacement phonemes 541a-548a in region 522. When replacement gesture 814 reaches one of the operation regions, a focus moves to the replacement phoneme associated with that region, and the path 814a of the focus movement is synchronized with gesture 814. For example, when gesture 814 reaches operation region 541, replacement phoneme 541a is synchronously selected and highlighted by the focus; likewise for region 542 and phoneme 542a, and for regions 543-548 and the corresponding phonemes 543a-548a. When replacement gesture 814 completes with a replacement phoneme selected, the processor replaces phoneme 535a with the selected phoneme. Another phoneme in region 561 may be replaced in the same way.
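The focus tracking of the replacement gesture, where each operation region reached highlights its candidate and the last candidate reached is selected, might be sketched as follows; the region layout is invented for illustration.

```python
# Sketch of focus tracking during a replacement gesture: the candidate
# associated with the operation region currently under the gesture is
# highlighted, and the last one reached is selected on completion.
REGIONS = {                      # operation region -> replacement phoneme label
    (0, 0, 50, 50): "541a", (50, 0, 100, 50): "542a",
    (100, 0, 150, 50): "543a", (150, 0, 200, 50): "544a",
}

def track_replacement(path):
    focus = None
    for x, y in path:            # the focus moves in sync with gesture 814
        for (x0, y0, x1, y1), phoneme in REGIONS.items():
            if x0 <= x < x1 and y0 <= y < y1:
                focus = phoneme  # highlight this replacement candidate
    return focus                 # selected candidate replaces phoneme 535a

print(track_replacement([(20, 20), (70, 20), (170, 30)]))  # -> 544a
```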
Referring to FIG. 20, the processor parses the one or more phonemes altered by the replacement gesture (step S910) and generates one or more texts from them (step S911). Text 510 is a text option derived from phonemes 531a, 532a, 533a, and 534a, and phonetic symbol 503 is associated with it. Text 513 is a text option derived from phonemes 544a and 536a. Texts 510 and 513 form a phrase.
The processor determines whether a phoneme in the phoneme region 561 has received a gesture operation (step S912); if so, it processes the gesture operation according to steps S905-S911. If a selection operation on a text option is received, the processor inputs the selected text option into the text area 560 (step S913).
Referring to FIG. 21, the processor may process a gesture applied to an object, such as a GUI component, according to a state machine 930. When a gesture acting on an object (such as a key, an input method switching key, or a phoneme) is received in state 920, the processor determines whether the first portion of the gesture conforms to a first input mode; if it does, the processor transitions the object to state 921 along connection line 931. In state 921, the processor determines whether the second portion of the gesture conforms to a second input mode or activates a first generalized algorithm (heuristic) for determining gesture movement. If the second portion conforms to the second input mode, the processor transitions the object to state 922 along connection line 932. In state 922, the processor determines whether the third portion of the gesture activates a second generalized algorithm for determining gesture movement; if it does, the processor transitions the object along connection line 934 to state 924. In state 924, the processor uses the second generalized algorithm to determine whether the object is selected when the gesture completes; if it is, the processor transitions the object along connection line 936 to state 925 and enacts the selection.
In state 921, if the second portion of the gesture activates the first generalized algorithm for determining gesture movement, the processor transitions the object along connection line 933 to state 923. In state 923, the processor uses the first generalized algorithm to determine whether the object is selected by the gesture; if it is selected when the gesture completes, the processor transitions the object along connection line 935 to state 925 and enacts the selection. The state machine 930 also includes a connection line 937 allowing the object to transition from state 923 to state 922, and a connection line 938 allowing it to transition from state 924 to state 921. In state 923, for example, the processor switches the object to state 922 along connection line 937 upon receiving a portion of the gesture conforming to the second input mode; in state 924, it switches the object to state 921 along connection line 938 upon receiving a portion conforming to the first input mode. Connection line 937 may embody a switching condition: the first generalized algorithm includes a switching condition for switching to the second generalized algorithm, and hands over the remaining work of determining the gesture (e.g., determining movement or tapping) to the second generalized algorithm when the condition is met. Connection line 938 may embody a return condition: the second generalized algorithm includes a return condition for switching back to the first generalized algorithm, and hands over the remaining determination work accordingly. For example, the objects in FIG. 21 may be phonemes, in which case the first generalized algorithm may include steps S906, S907, and S908 associated with the GUI components in FIGS. 14, 15, 17, and 18, and the second generalized algorithm may include step S909 associated with the GUI components in FIGS. 19 and 20. Alternatively, the objects in FIG. 21 may be keys, in which case the first generalized algorithm may include the steps of S7706-S7722 associated with the predetermined option sequence (default sequence) and its GUI components, and the second generalized algorithm may include the steps of S7706-S7722 associated with the changed-order sequence (alternative sequence) and its GUI components.
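A compact model of state machine 930 under the transitions just described follows; the guard fields (mode, heuristic, selected) are assumed encodings of the gesture portions, not part of the disclosure.

```python
# Sketch of state machine 930. States and transition lines follow the
# description above; the guard predicates are placeholder assumptions.
class GestureStateMachine:
    def __init__(self):
        self.state = 920

    def feed(self, part):
        """part is a dict describing one portion of the gesture (assumed shape)."""
        if self.state == 920 and part.get("mode") == 1:
            self.state = 921                  # line 931: first input mode
        elif self.state == 921:
            if part.get("mode") == 2:
                self.state = 922              # line 932: second input mode
            elif part.get("heuristic") == 1:
                self.state = 923              # line 933: first heuristic active
        elif self.state == 922 and part.get("heuristic") == 2:
            self.state = 924                  # line 934: second heuristic active
        elif self.state == 923:
            if part.get("mode") == 2:
                self.state = 922              # line 937: switching condition
            elif part.get("selected"):
                self.state = 925              # line 935: enact selection
        elif self.state == 924:
            if part.get("mode") == 1:
                self.state = 921              # line 938: return condition
            elif part.get("selected"):
                self.state = 925              # line 936: enact selection

sm = GestureStateMachine()
for portion in ({"mode": 1}, {"heuristic": 1}, {"selected": True}):
    sm.feed(portion)
print(sm.state)  # -> 925
```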
6. Conclusion:
The text input method can be used to input characters, numbers, or symbols of various languages, such as Japanese hiragana or katakana, or Chinese phonetic symbols. It can be applied to keyboards of different key layouts. Although the figures indicate the current candidate word with a cursor, it may instead be indicated by a different color, font size, or other means.
The touch operation method can coexist with the original long-press selection operation on an object, giving the user another way to control the object and increasing the diversity of object operations. The touch operation method may generate a long-press signal or a selection signal from a heavy-press signal, so that a heavy-press operation simulates a long-press operation; conversely, a heavy-press signal and a selection action may be generated from a long-press signal. Alternatively, the touch operation method may generate a selection signal directly from the heavy-press signal, so that the heavy-press operation performs the selection, accelerating the selection of an object. In addition, the heavy-press operation may be a gesture operation that satisfies the switching condition or the return condition.
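The signal mapping just described, where a heavy press can stand in for a long press when producing a selection signal, might be sketched as follows; the threshold values are assumptions.

```python
# Sketch: mapping duration and force of a press to a long-press or selection
# signal, so a heavy press simulates a long press. Thresholds are assumed.
def to_selection_signal(duration, force, long_press_s=0.8, heavy_force=2.0):
    if force >= heavy_force:
        return "long-press"   # heavy press simulates a long press
    if duration >= long_press_s:
        return "long-press"
    return "select"

print(to_selection_signal(duration=0.1, force=3.0))  # -> long-press
print(to_selection_signal(duration=0.1, force=0.5))  # -> select
```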
In summary, the text input method uses different operations on the same key to actuate option sequences of different orders, assisted by a graphical interface that displays the key options, so that text can be input with fewer presses; this saves input time and, because fewer operations are required, reduces user errors. The key options may include characters, phonemes, and input methods. The text input method can use the touch method to distinguish different input modes on the same key. Accordingly, the present invention meets the requirements for a patentable invention, and the following claims are hereby made.
Although the present invention has been described in detail with reference to the preferred embodiments, it will be understood by those skilled in the art that various changes may be made and equivalents may be substituted for elements thereof without departing from the spirit and scope of the present invention.

Claims (11)

1. A character input method executed in an electronic device, characterized by comprising:
allowing input of one or more phonemes, wherein the one or more phonemes form a set of phonemes;
allowing each phoneme in the set of phonemes to be processed with a gesture operation;
generating a word choice list, wherein the word choice list comprises word choices derived from one or more phonemes in the set of phonemes;
altering the phoneme set in response to an altering gesture operation for altering one or more phonemes in the phoneme set, to produce an altered phoneme set;
generating an updated word choice list, wherein the updated word choice list includes word choices derived from one or more phonemes in the altered set of phonemes;
enabling an option in the updated list of text options for text entry;
wherein the altering gesture operation for altering one or more phonemes in the set of phonemes comprises tap and gesture movement operations, the text input method further comprising:
determining whether a first portion of the tap and gesture movement operations conforms to a first input mode or a second input mode;
in the event that the first portion of the tap and gesture movement operations conforms to the first input mode, processing a remaining portion of the tap and gesture movement operations with a first generalized algorithm, determining with the first generalized algorithm whether the remaining portion of the tap and gesture movement operations conforms to a first class of phoneme change gestures for a selected phoneme in the set of phonemes, wherein the first class of phoneme change gestures includes a delete gesture, a copy gesture, or a move gesture; and
in the event that the first portion of the tap and gesture movement operations conforms to the second input mode, processing the remaining portion of the tap and gesture movement operations with a second generalized algorithm to display a menu, and determining with the second generalized algorithm whether the remaining portion conforms to a second class of phoneme change gestures applied to a selected phoneme in the set of phonemes, wherein the second class of phoneme change gestures comprises a replacement gesture that selects a candidate phoneme in the menu to replace the selected phoneme;
wherein the first generalized algorithm includes a switching condition for switching to the second generalized algorithm, and, in the case that a portion of the tap and gesture movement operations meets the switching condition, the first generalized algorithm hands over the task of determining the remaining portion of the tap and gesture movement operations to the second generalized algorithm according to the switching condition; and
wherein the second generalized algorithm includes a return condition for switching to the first generalized algorithm, and, in the case that a portion of the tap and gesture movement operations meets the return condition, the second generalized algorithm hands over the task of determining the remaining portion of the tap and gesture movement operations to the first generalized algorithm according to the return condition.
2. The character input method of claim 1, further comprising:
determining, with the first generalized algorithm, whether the remaining portion of the tap and gesture movement operations conforms to a delete gesture;
in the case that the remaining portion of the tap and gesture movement operations conforms to the delete gesture, deleting the phoneme selected by the altering gesture operation from the phoneme set to produce the altered phoneme set.
3. The character input method of claim 2, further comprising:
determining that the remaining portion of the tap and gesture movement operations conforms to the delete gesture in the case that the remaining portion drags the selected phoneme from the phoneme region where the phoneme set is located to outside the phoneme region.
4. The character input method of claim 1, further comprising:
determining, with the first generalized algorithm, whether the remaining portion of the tap and gesture movement operations conforms to a copy gesture;
in the case that the remaining portion of the tap and gesture movement operations conforms to the copy gesture, copying the phoneme selected by the altering gesture operation in the phoneme set to produce a copy of the selected phoneme, and adding the copy to the phoneme set to produce the altered phoneme set.
5. The character input method of claim 4, further comprising:
determining that the remaining portion of the tap and gesture movement operations conforms to the copy gesture in the case that the remaining portion drags the selected phoneme from the phoneme region where the phoneme set is located to outside the phoneme region and then drags it to a copy destination position within the phoneme region.
6. The character input method of claim 1, further comprising:
determining, with the first generalized algorithm, whether the remaining portion of the tap and gesture movement operations conforms to a move gesture;
in the case that the remaining portion of the tap and gesture movement operations conforms to the move gesture, moving the phoneme selected by the altering gesture operation in the phoneme set to a movement destination position to produce the altered phoneme set.
7. The character input method of claim 6, further comprising:
determining that the remaining portion of the tap and gesture movement operations conforms to the move gesture in the case that the remaining portion drags the selected phoneme along a path within the phoneme region where the phoneme set is located to a movement destination position within the phoneme region.
8. The character input method of claim 1, further comprising:
determining, with the second generalized algorithm, whether the remaining portion of the tap and gesture movement operations conforms to a replacement gesture;
in the case that the remaining portion of the tap and gesture movement operations conforms to the replacement gesture, replacing the phoneme selected by the altering gesture operation in the phoneme set with a replacement symbol to produce the altered phoneme set.
9. The character input method of claim 8, further comprising:
in the case that the first portion of the tap and gesture movement operations conforms to the second input mode, determining that the remaining portion of the tap and gesture movement operations conforms to the replacement gesture, and selecting one of a plurality of symbols as the replacement symbol according to the action path of the remaining portion of the tap and gesture movement operations.
10. The character input method of claim 1, further comprising:
determining that the first portion of the tap and gesture movement operations conforms to the first input mode in the case that the operation period of the first portion is shorter than a time threshold; and
determining that the first portion of the tap and gesture movement operations conforms to the second input mode in the case that the operation period of the first portion is longer than the time threshold.
11. The character input method of claim 1, further comprising:
determining that the first portion of the tap and gesture movement operations conforms to the first input mode in the case that the total force data of the first portion does not exceed a total force threshold; and
determining that the first portion of the tap and gesture movement operations conforms to the second input mode in the case that the total force data of the first portion exceeds the total force threshold.
CN201710465985.9A 2016-06-20 2017-06-19 Character input method Active CN107526449B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US15/186553 2016-06-20
US15/186,553 US20160299623A1 (en) 2012-04-20 2016-06-20 Text input method

Publications (2)

Publication Number Publication Date
CN107526449A CN107526449A (en) 2017-12-29
CN107526449B true CN107526449B (en) 2020-11-10

Family

ID=60748713

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710465985.9A Active CN107526449B (en) 2016-06-20 2017-06-19 Character input method

Country Status (2)

Country Link
CN (1) CN107526449B (en)
TW (1) TWI633463B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI779310B (en) * 2019-09-26 2022-10-01 華碩電腦股份有限公司 Control method of electronic device

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102243561A (en) * 2010-05-10 2011-11-16 腾讯科技(深圳)有限公司 Method and device for modifying input information
CN103885696A (en) * 2014-03-17 2014-06-25 联想(北京)有限公司 Information processing method and electronic device
CN103927116A (en) * 2014-03-18 2014-07-16 兴唐通信科技有限公司 Chinese character gesture input keyboard and method based on touch screen equipment
CN104090669A (en) * 2014-07-16 2014-10-08 三星电子(中国)研发中心 Input method editing method and device
CN105117159A (en) * 2015-08-27 2015-12-02 广东欧珀移动通信有限公司 Character processing method and terminal
CN105247540A (en) * 2013-06-09 2016-01-13 苹果公司 Managing real-time handwriting recognition

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI236628B (en) * 2004-05-06 2005-07-21 Sentelic Corp Touch-type character input method and control module thereof
TWI313430B (en) * 2005-09-16 2009-08-11 Input method for touch screen


Also Published As

Publication number Publication date
TW201800906A (en) 2018-01-01
TWI633463B (en) 2018-08-21
CN107526449A (en) 2017-12-29


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant