CN105718072A - Character output method and mobile terminal - Google Patents

Character output method and mobile terminal

Info

Publication number
CN105718072A
Authority
CN
China
Prior art keywords
touch
candidate
determining
user
candidate items
Prior art date
Legal status
Granted
Application number
CN201610039029.XA
Other languages
Chinese (zh)
Other versions
CN105718072B (en)
Inventor
周辉
曾元清
Current Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN201610039029.XA
Publication of CN105718072A
Application granted
Publication of CN105718072B
Expired - Fee Related
Anticipated expiration


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/02Input arrangements using manually operated switches, e.g. using keyboards or dials
    • G06F3/023Arrangements for converting discrete items of information into a coded form, e.g. arrangements for interpreting keyboard generated codes as alphanumeric codes, operand codes or instruction codes
    • G06F3/0233Character input methods
    • G06F3/0236Character input methods using selection techniques to select from displayed items
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The embodiment of the invention discloses a character output method and a mobile terminal. The character output method comprises the following steps: detecting a touch parameter generated when a user performs a touch operation at a cursor position displayed on a touch display screen; determining corresponding candidate items according to a preset correspondence and the touch parameter, wherein each corresponding candidate item comprises at least one character; and determining a target candidate item from the corresponding candidate items and outputting the target candidate item at the cursor position. With the embodiment of the invention, a candidate item containing characters can be output without the user entering complicated pinyin or strokes, so that the amount of user operation is reduced and character output efficiency is improved.

Description

Character output method and mobile terminal
Technical Field
The invention relates to the technical field of intelligent terminals, in particular to a character output method and a mobile terminal.
Background
At present, mobile terminals such as mobile phones and tablet computers have become indispensable communication tools in daily life, and through a mobile terminal a user can communicate with contacts by telephone, text message, e-mail, and other means.
During communication, users usually need to edit various text messages. In a conventional character output method, to output text the user needs to perform multiple input operations on an input interface, for example, entering several strokes or several pinyin letters of a character and then selecting the target character from a pop-up list of options. To output a symbol, the user has to tap specific options on the input interface, select a symbol type from a pop-up list of types, such as Chinese symbols, English symbols, or emoticons, and then pick the target symbol from a large number of symbols of that type. The traditional character output method therefore requires many user operations, and the process is cumbersome.
Disclosure of Invention
The embodiment of the invention provides a character output method and a mobile terminal, which can reduce the operation amount of a user and improve the efficiency of character output.
A first aspect of an embodiment of the present invention provides a character output method, which may include:
detecting a touch parameter generated by touch operation of a user at a cursor position displayed on a touch display screen;
determining a corresponding candidate item according to a preset corresponding relation and the touch parameter, wherein the corresponding candidate item comprises at least one character;
and determining a target candidate item from the corresponding candidate items and outputting the target candidate item at the cursor position.
In some of the possible embodiments of the present invention,
before detecting a touch parameter generated by a user performing a touch operation at a cursor position displayed on the touch display screen, the method further includes:
setting a first corresponding relation between the touch pressure level and the candidate items;
the determining the corresponding candidate item according to the preset corresponding relationship and the touch parameter includes:
and determining corresponding candidate items according to the first corresponding relation and the touch pressure level in the touch parameters.
In some of the possible embodiments of the present invention,
before detecting a touch parameter generated by a user performing a touch operation at a cursor position displayed on the touch display screen, the method further includes:
setting a second corresponding relation among the touch pressure level, the number of touches and the candidate items;
the determining the corresponding candidate item according to the preset corresponding relationship and the touch parameter includes:
and determining corresponding candidate items according to the second corresponding relation and the touch pressure level and the number of touches in the touch parameters.
In some of the possible embodiments of the present invention,
before detecting a touch parameter generated by a user performing a touch operation at a cursor position displayed on the touch display screen, the method further includes:
setting a third corresponding relation among the touch pressure level, the touch duration and the candidate items;
the determining the corresponding candidate item according to the preset corresponding relationship and the touch parameter includes:
and determining corresponding candidate items according to the third corresponding relation and the touch pressure level and the touch duration in the touch parameters.
In some of the possible embodiments of the present invention,
before detecting a touch parameter generated by a user performing a touch operation at a cursor position displayed on the touch display screen, the method further includes:
setting a fourth corresponding relation among the number of touches, the touch pressure level, the touch duration and the candidate items;
the determining the corresponding candidate item according to the preset corresponding relationship and the touch parameter includes:
and determining corresponding candidate items according to the fourth corresponding relation, the number of touches in the touch parameters, the pressure level of each touch and the duration of each touch.
In some possible embodiments, the determining a target candidate from the corresponding candidates and outputting the target candidate at the cursor position includes:
when the number of the corresponding candidate items is larger than 1, receiving touch operation aiming at the target candidate item, and outputting the target candidate item at the cursor position;
and when the number of the corresponding candidate items is equal to 1, determining the corresponding candidate items as target candidate items, and outputting the target candidate items at the cursor position.
In some possible embodiments, the determining a target candidate from the corresponding candidates and outputting the target candidate at the cursor position includes:
when the number of the corresponding candidate items is larger than 1, receiving voice information input by a user;
determining a target candidate matched with the voice information from the corresponding candidate;
and outputting the target candidate item at the cursor position.
A second aspect of an embodiment of the present invention provides a mobile terminal, which may include:
the detection module is used for detecting touch parameters generated by touch operation of a user at a cursor position displayed on the touch display screen;
the determining module is used for determining corresponding candidate items according to a preset corresponding relation and the touch parameters, wherein the corresponding candidate items comprise at least one character;
and the output module is used for determining a target candidate item from the corresponding candidate items and outputting the target candidate item at the cursor position.
In some possible embodiments, the mobile terminal further includes:
the first setting module is used for setting a first corresponding relation between the touch pressure level and the candidate items;
the determining module is specifically configured to determine a corresponding candidate item according to the first corresponding relationship and the touch pressure level in the touch parameter.
In some possible embodiments, the mobile terminal further includes:
the second setting module is used for setting a second corresponding relation among the touch pressure level, the number of touches and the candidate items;
the determining module is specifically configured to determine a corresponding candidate item according to the second correspondence and the touch pressure level and the number of touches in the touch parameter.
In some possible embodiments, the mobile terminal further includes:
the third setting module is used for setting a third corresponding relation among the touch pressure level, the touch duration and the candidate items;
the determining module is specifically configured to determine a corresponding candidate item according to the third correspondence and the touch pressure level and the touch duration in the touch parameter.
In some possible embodiments, the mobile terminal further includes:
the fourth setting module is used for setting a fourth corresponding relation among the number of touches, the touch pressure level, the touch duration and the candidate items;
the determining module is specifically configured to determine corresponding candidate items according to the fourth corresponding relationship, and the number of touches, the pressure level of each touch, and the duration of each touch in the touch parameter.
In some possible embodiments, the output module is specifically configured to:
when the number of the corresponding candidate items is larger than 1, receiving touch operation aiming at the target candidate item, and outputting the target candidate item at the cursor position; or,
and when the number of the corresponding candidate items is equal to 1, determining the corresponding candidate items as target candidate items, and outputting the target candidate items at the cursor position.
In some possible embodiments, the output module includes:
the voice receiving unit is used for receiving voice information input by a user when the number of the corresponding candidate items is greater than 1;
a determining unit, configured to determine a target candidate matched with the speech information from the corresponding candidate;
and the output unit is used for outputting the target candidate item at the cursor position.
A third aspect of the embodiments of the present invention provides a mobile terminal, which may include an input device, an output device, a processor, and a memory, where the input device, the output device, the processor, and the memory are connected through a bus, the memory is configured to store a set of program codes, and the input device, the output device, and the processor are configured to call the program codes to execute the character output method as described in the first aspect or any possible implementation manner of the first aspect.
In the embodiment of the invention, a touch parameter generated by touch operation of a user at a cursor position displayed on a touch display screen is detected; determining a corresponding candidate item according to a preset corresponding relation and the touch parameter, wherein the corresponding candidate item comprises at least one character; and determining a target candidate item from the corresponding candidate items and outputting the target candidate item at the cursor position. By adopting the embodiment of the invention, the candidate item containing the character can be output without inputting complicated pinyin or strokes by the user, so that the operation amount of the user can be reduced, and the character output efficiency is improved.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings needed in the description of the embodiments are briefly introduced below. It is apparent that the drawings in the following description show only some embodiments of the present invention, and that other drawings can be obtained by those skilled in the art from these drawings without inventive effort.
FIG. 1 is a flow chart of a method for outputting characters according to an embodiment of the present invention;
FIG. 2 is a flow chart of a character output method according to another embodiment of the present invention;
FIG. 3 is a flow chart of a character output method according to another embodiment of the present invention;
FIG. 4 is a flow chart of a character output method according to another embodiment of the present invention;
FIG. 5 is a schematic structural diagram of a mobile terminal according to an embodiment of the present invention;
FIG. 6 is a schematic structural diagram of a mobile terminal according to another embodiment of the present invention;
FIG. 7 is a schematic structural diagram of a mobile terminal according to another embodiment of the present invention;
FIG. 8 is a schematic structural diagram of a mobile terminal according to yet another embodiment of the present invention;
FIG. 9 is a schematic structural diagram of a mobile terminal according to still another embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The embodiment of the invention discloses a character output method and a mobile terminal, which can reduce the operation amount of a user and improve the efficiency of character output. The following detailed description will be made in conjunction with the accompanying drawings. The mobile terminal according to the embodiment of the present invention may include, but is not limited to, a mobile phone, a notebook computer, a tablet computer, and other mobile terminals.
Referring to fig. 1, fig. 1 is a flow chart illustrating a character output method according to an embodiment of the invention. The method of the embodiment of the invention can be realized by the mobile terminal. As shown in fig. 1, the method may include the steps of:
S101, detecting a touch parameter generated by touch operation of a user at a cursor position displayed on a touch display screen.
In a specific implementation, the touch display screen can be used to sense a touch operation of a user and to output content such as text, pictures, or video.
Optionally, the touch operation may include, but is not limited to, a click operation, a press operation, a slide operation, or the like. The above touch operation may be a long touch operation, such as continuous pressing or sliding, or a short touch operation, such as a tap.
Optionally, the touch parameter may include at least one of a touch pressure value, a number of touches, and a touch duration. A pressure sensor can be arranged below the screen, and the touch pressure value is detected through the pressure sensor. The number of touches and the touch duration can be detected by a counter, a timer, and the like.
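Purely as an illustration of how the touch parameters described above might be represented and how a raw pressure value might be quantized into pressure levels, the following minimal sketch is offered; the class name, field names, and level thresholds are assumptions for this sketch and are not taken from the patent.

```kotlin
// Illustrative sketch only; names and thresholds are assumptions, not the patent's definitions.
data class TouchParameter(
    val pressure: Float,   // raw touch pressure value reported by a pressure sensor below the screen
    val touchCount: Int,   // number of touches counted at the cursor position
    val durationMs: Long   // touch duration measured by a timer
)

// Quantize a raw pressure value into one of several preset touch pressure levels.
// The level boundaries would be calibrated, e.g. by tests before the device leaves the factory.
fun pressureLevel(pressure: Float, thresholds: List<Float> = listOf(1.0f, 2.0f, 3.0f)): Int =
    thresholds.count { pressure >= it }   // returns a level in 0..thresholds.size
```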
Specifically, the cursor position is a position where a character needs to be inserted, where the character may include letters, numbers, Chinese characters, symbols, and the like.
S102, determining corresponding candidate items according to a preset corresponding relation and the touch parameter, wherein each corresponding candidate item comprises at least one character.
In some possible embodiments, the corresponding relationship between the touch parameter and the candidate item may be preset by the mobile terminal, and the corresponding relationship may be set before the mobile terminal leaves a factory, or may be set by the user. Optionally, each candidate item may include one or more characters, for example, one candidate item may be a user-used sentence, and the sentence may include a plurality of words and/or symbols; as another example, a candidate may be a symbol.
Optionally, according to the input history of the user, sentences that the user uses frequently and that require many pinyin letters or strokes to input may be selected, and the corresponding relation between these sentences and touch parameters may be set. And/or, symbols or emoticons frequently used by the user may be selected according to the input history of the user, and the corresponding relation between these symbols or emoticons and touch parameters may be set. And/or, contact names that the user contacts frequently may be selected from the user's contacts, and the corresponding relation between these contact names and touch parameters may be set. And/or, some popular network phrases, idioms, posters, and the like may be selected, and the corresponding relation between them and touch parameters may be set.
Optionally, the preset correspondence may include, but is not limited to: the corresponding relation between the touch pressure level and the candidate items; the corresponding relation between the touch pressure level, the number of touches, and the candidate items; the corresponding relation between the touch pressure level, the touch duration, and the candidate items; and the corresponding relation among the number of touches, the pressure level of each touch, the duration of each touch, and the candidate items.
Preferably, when only a small number of distinguishable touch parameter values are available for setting, the candidate items for which a corresponding relation is to be set may be divided into a plurality of categories, and the corresponding relation between touch parameters and candidate item categories may be set, so that the candidate items corresponding to a touch parameter are all the candidate items in the corresponding category. For example, under a first pressure value, a first number of touches, and a first touch duration, the corresponding candidate items may be all emoticons in the emoticon class.
According to the preset corresponding relation and the detected touch parameters, candidate items corresponding to the detected touch parameters can be determined.
S103, determining a target candidate from the corresponding candidates and outputting the target candidate at the cursor position.
Optionally, if the number of the corresponding candidate items is equal to 1, the corresponding candidate item may be determined to be the target candidate item, directly inserted at the cursor position, and displayed on the touch display screen. If the number of the corresponding candidate items is larger than 1, the candidate items can be displayed on the touch display screen so that the user can select a target candidate item from them, which is then inserted at the cursor position. Optionally, after the plurality of corresponding candidate items are displayed on the touch display screen, a touch operation of the user can be detected, and the touched candidate item is inserted at the cursor position as the target candidate item; or, a voice selection instruction of the user may be received and recognized, and the selected candidate item is inserted at the cursor position as the target candidate item, for example, the candidate item whose pronunciation is the same as the voice selection instruction is taken as the selected target candidate item.
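As a rough illustration of the overall flow of steps S101 to S103 (detect, look up, output), the following sketch assumes a generic lookup table keyed by pressure level and two simple callbacks for selection and insertion; all names are assumptions made for this sketch, not the patent's wording.

```kotlin
// Hedged sketch of the S101-S103 flow; the table and callbacks are assumptions.
class CharacterOutput(
    private val correspondence: Map<Int, List<String>>,      // preset relation: pressure level -> candidates
    private val showForSelection: (List<String>) -> String,  // lets the user pick the target candidate
    private val insertAtCursor: (String) -> Unit             // outputs the target at the cursor position
) {
    // S101 is assumed to have already quantized the raw pressure into a level (see the sketch above).
    fun onTouch(pressureLevel: Int) {
        val candidates = correspondence[pressureLevel] ?: return   // S102: determine corresponding candidates
        val target = if (candidates.size == 1) candidates.first()  // S103: a single candidate is the target
                     else showForSelection(candidates)             // otherwise the user selects the target
        insertAtCursor(target)
    }
}
```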
In the embodiment of the invention, a touch parameter generated by touch operation of a user at a cursor position displayed on a touch display screen is detected; determining a corresponding candidate item according to a preset corresponding relation and the touch parameter, wherein the corresponding candidate item comprises at least one character; and determining a target candidate item from the corresponding candidate items and outputting the target candidate item at the cursor position. By adopting the embodiment of the invention, the candidate item containing the character can be output without inputting complicated pinyin or strokes by the user, so that the operation amount of the user can be reduced, and the character output efficiency is improved.
Referring to fig. 2, fig. 2 is a flow chart illustrating a character output method according to another embodiment of the invention. The method of the embodiment of the invention can be realized by the mobile terminal. As shown in fig. 2, the method may include the steps of:
S201, setting a first corresponding relation between the touch pressure level and the candidate items.
In specific implementation, the range of the pressure value generated by the touch operation of the user on the touch display screen can be determined through a test experiment before delivery. The range may be divided into a plurality of touch pressure levels, and at least one corresponding candidate item may be set for each touch pressure level, forming a first correspondence between the touch pressure level and the candidate item. Optionally, the first corresponding relationship may be set before the mobile terminal leaves a factory, or may be set by a user. Optionally, each candidate item may include one or more characters, for example, one candidate item may be a user-used sentence, and the sentence may include a plurality of words and/or symbols; as another example, a candidate may be a symbol.
Alternatively, a sentence with a higher user frequency and a larger number of pinyin/strokes may be selected to set the first corresponding relationship between the sentence and the touch pressure level according to the input history of the user. And/or selecting a symbol or an emoticon with higher use frequency of the user according to the input history of the user to set a first corresponding relation between the symbol or the emoticon and the touch pressure level. And/or selecting a contact name which is contacted with the user more frequently from the contacts of the user and setting a first corresponding relation between the contact name and the touch pressure level. And/or, some popular network phrases, idioms, posters and the like can be selected to set the first corresponding relation between the popular network phrases, idioms, posters and the like and the touch pressure level.
In some possible embodiments, the force a user can apply in a touch operation is limited, so only a few touch pressure levels can be divided. In this case, the candidate items for which the corresponding relation is to be set may be divided into a plurality of categories, and a first corresponding relation between different touch pressure levels and different candidate item categories may be set, so that the candidate items corresponding to one touch pressure level are all the candidate items in the corresponding category. For example, it may be set that the candidate items corresponding to a first pressure level are all the emoticons in the emoticon class.
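Purely as an illustration of what such a first corresponding relation might look like, the sketch below maps pressure levels either to explicit candidate items or to a whole candidate category (for example, all emoticons); the concrete levels, phrases, and category contents are assumed examples, not values given in the patent.

```kotlin
// Illustrative first correspondence; the levels, phrases, and categories are assumed examples.
sealed interface CandidateSet
data class Items(val candidates: List<String>) : CandidateSet                     // explicit candidate items
data class Category(val name: String, val members: List<String>) : CandidateSet   // a whole candidate class

val firstCorrespondence: Map<Int, CandidateSet> = mapOf(
    1 to Items(listOf("OK")),                                  // a single candidate: can be output directly
    2 to Items(listOf("On my way.", "Call you later.")),       // frequently used sentences
    3 to Category("emoticons", listOf(":-)", ":-(", ";-)"))    // level 3 selects the whole emoticon class
)

// Resolve a touch pressure level to the list of candidates the user will choose from.
fun candidatesForLevel(level: Int): List<String> =
    when (val set = firstCorrespondence[level]) {
        is Items -> set.candidates
        is Category -> set.members
        null -> emptyList()
    }
```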
S202, detecting a touch parameter generated by touch operation of a user at a cursor position displayed on the touch display screen.
In a specific implementation, the implementation manner of step S202 may refer to the related description of step S101 in the embodiment shown in fig. 1, which is not described herein again.
S203, determining corresponding candidate items according to the first corresponding relation and the touch pressure level in the touch parameters.
In a specific implementation, candidate items corresponding to the touch pressure levels in the touch parameters can be determined by querying a preset first corresponding relation. Optionally, the corresponding candidate item may include one or more candidate items, and each candidate item may include one or more characters.
S204, judging whether the number of the corresponding candidate items is more than 1; if yes, go to step S205, otherwise go to step S207.
In a specific implementation, after determining the corresponding candidate items, the number of the corresponding candidate items can be known, and further, whether the number of the corresponding candidate items is greater than 1 can be judged.
As a possible implementation manner, if the number of the corresponding candidate items is greater than 1, the corresponding candidate items may be displayed on the touch display screen at a place other than the cursor position, for example, the corresponding candidate items may be displayed at any blank position in the touch display screen, so that the user may select the target candidate item therefrom.
S205, receiving a touch operation for one of the corresponding candidate items, or receiving speech information input by a user and determining a target candidate item matched with the speech information from the corresponding candidate item.
In some possible embodiments, if the number of the corresponding candidates is greater than 1, a user is required to select to determine the target candidate. Optionally, after the plurality of corresponding candidate items are displayed on the touch display screen, the touch operation of the user can be detected, and the touched candidate item is taken as a target candidate item; or, the voice information input by the user can be received, and the target candidate matched with the voice information input by the user is determined from the corresponding candidate. Alternatively, the matching may be that the voice message input by the user is the same as the pronunciation of the target candidate, or that the voice message input by the user is the initial pronunciation of the target candidate, and so on.
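The following sketch illustrates one way the voice-based selection in S205 could match recognized speech against the displayed candidates. The helper `pronunciationOf` is a hypothetical stand-in for a real pronunciation lookup (for example, pinyin), and the whole-pronunciation-versus-initial matching rule is likewise an assumption made for this sketch.

```kotlin
// Hypothetical sketch of matching recognized speech to a candidate; pronunciationOf is an assumed helper.
fun pronunciationOf(text: String): String = text.lowercase()   // placeholder for a pinyin/pronunciation lookup

fun matchByVoice(candidates: List<String>, recognizedSpeech: String): String? {
    val spoken = pronunciationOf(recognizedSpeech)
    return candidates.firstOrNull { pronunciationOf(it) == spoken }          // same pronunciation as the candidate
        ?: candidates.firstOrNull { pronunciationOf(it).startsWith(spoken) } // or matches the candidate's initial part
}
```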
S206, outputting the candidate item or the target candidate item at the cursor position.
After the target candidate item is determined, the target candidate item can be directly inserted into the cursor position and displayed on the touch display screen.
S207, determining the corresponding candidate as a target candidate, and outputting the target candidate at the cursor position.
In some possible embodiments, if the number of the corresponding candidate items is equal to 1, the corresponding candidate item may be determined to be the target candidate item, and the corresponding candidate item is directly inserted into the cursor position and then displayed on the touch display screen.
In the embodiment of the invention, a first corresponding relation between the touch pressure level and the candidate items is set; detecting a touch parameter generated by touch operation of a user at a cursor position displayed on a touch display screen; determining corresponding candidate items according to the first corresponding relation and the touch pressure level in the touch parameters; and if the number of the corresponding candidate items is larger than 1, determining a target candidate item from the corresponding candidate items and outputting the target candidate item at the cursor position. By adopting the embodiment of the invention, the candidate item containing at least one character can be output without inputting complicated pinyin or strokes by a user, so that the operation amount of the user can be reduced, and the character output efficiency is improved.
Referring to fig. 3, fig. 3 is a flow chart illustrating a character output method according to another embodiment of the invention. The method of the embodiment of the invention can be realized by the mobile terminal. As shown in fig. 3, the method may include the steps of:
S301, setting a second corresponding relation among the touch pressure level, the number of touches and the candidate items.
In specific implementation, the range of the pressure value generated by the touch operation of the user on the touch display screen can be determined through a test experiment before delivery. The range may be divided into a plurality of touch pressure levels, and a second correspondence between the touch pressure level, the number of touches, and the candidate items may be set. Optionally, the second corresponding relationship may be set before the mobile terminal leaves a factory, or may be set by a user. Optionally, each candidate item may include one or more characters, for example, one candidate item may be a user-used sentence, and the sentence may include a plurality of words and/or symbols; as another example, a candidate may be a symbol.
Alternatively, a sentence with a higher user use frequency and a larger number of pinyin/strokes may be selected to set the second correspondence relationship between the sentence and the touch pressure level and the number of touches according to the input history of the user. And/or selecting a symbol or an emoticon with higher use frequency of the user according to the input history of the user to set a second corresponding relation between the symbol or the emoticon and the touch pressure level and the touch frequency. And/or selecting a contact name with a larger contact frequency with the user from the contacts of the user and setting a second corresponding relation between the contact name and the touch pressure level and the touch frequency. And/or, some network popular phrases, idioms, posters and the like can be selected to set the second corresponding relation between the network popular phrases, idioms, posters and the like and the touch pressure level and the touch times.
In some possible embodiments, the user has limited force for performing the touch operation, the divided touch pressure levels are less, and the number of touches is limited, at this time, the candidate items to be set with the correspondence relationship may be divided into a plurality of categories, and a second correspondence relationship between different touch pressure levels, the number of touches, and different candidate item categories is set, so that the candidate items corresponding to a group of touch parameters composed of the touch pressure levels and the number of touches are all the candidate items in the corresponding candidate item categories. For example, the candidate items may be all emoticons in the emoticon class when the candidate items are set to the first pressure level and the first touch frequency.
S302, detecting a touch parameter generated by a user performing touch operation at a cursor position displayed on a touch display screen.
In a specific implementation, the implementation manner of step S302 may refer to the related description of step S101 in the embodiment shown in fig. 1, which is not described herein again.
S303, determining corresponding candidate items according to the second corresponding relation, the touch pressure level and the number of touches in the touch parameters.
In a specific implementation, candidate items corresponding to the touch pressure level and the number of touches in the touch parameter can be determined by querying the preset second corresponding relation. Optionally, the corresponding candidate item may include one or more candidate items, and each candidate item may include one or more characters.
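A minimal sketch of such a second corresponding relation, keyed jointly by touch pressure level and number of touches as in S301 to S303, is given below; the keys and entries are illustrative assumptions rather than values prescribed by the patent.

```kotlin
// Illustrative second correspondence: (pressure level, number of touches) -> candidate items.
val secondCorrespondence: Map<Pair<Int, Int>, List<String>> = mapOf(
    (1 to 1) to listOf("OK"),
    (1 to 2) to listOf("Thanks!", "Thank you very much."),
    (2 to 1) to listOf(":-)", ":-D")          // e.g. a small emoticon set
)

fun candidatesFor(level: Int, touchCount: Int): List<String> =
    secondCorrespondence[level to touchCount] ?: emptyList()
```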
S304, judging whether the number of the corresponding candidate items is more than 1; if yes, go to step S305, otherwise go to step S307.
S305, receiving a touch operation for one candidate of the corresponding candidates, or receiving speech information input by a user and determining a target candidate matching the speech information from the corresponding candidate.
S306, outputting the candidate item or the target candidate item at the cursor position.
S307, determining the corresponding candidate item as a target candidate item, and outputting the target candidate item at the cursor position.
In a specific implementation, the implementation manner of steps S304-S307 may refer to the related description of steps S204-S207 in the embodiment shown in fig. 2, which is not repeated herein.
In the embodiment of the invention, a second corresponding relation among the touch pressure level, the number of touches and the candidate items is set; detecting a touch parameter generated by touch operation of a user at a cursor position displayed on a touch display screen; determining corresponding candidate items according to the second corresponding relation and the touch pressure level and the number of touches in the touch parameters; and if the number of the corresponding candidate items is larger than 1, determining a target candidate item from the corresponding candidate items and outputting the target candidate item at the cursor position. By adopting the embodiment of the invention, the candidate item containing at least one character can be output without inputting complicated pinyin or strokes by a user, so that the operation amount of the user can be reduced, and the character output efficiency is improved.
Referring to fig. 4, fig. 4 is a flowchart illustrating a character output method according to another embodiment of the invention. The method of the embodiment of the invention can be realized by the mobile terminal. As shown in fig. 4, the method may include the steps of:
S401, setting a third corresponding relation among the touch pressure level, the touch duration and the candidate items.
In specific implementation, the range of the pressure value generated by the touch operation of the user on the touch display screen can be determined through a test experiment before delivery. The range may be divided into a plurality of touch pressure levels, and a third correspondence between the touch pressure level, the touch duration, and the candidate items may be set. Optionally, the third corresponding relationship may be set before the mobile terminal leaves a factory, or may be set by a user. Optionally, each candidate item may include one or more characters, for example, one candidate item may be a user-used sentence, and the sentence may include a plurality of words and/or symbols; as another example, a candidate may be a symbol.
Alternatively, a sentence with a higher user use frequency and a larger input pinyin/stroke may be selected to set the third correspondence relationship between the sentence and the touch pressure level and the touch duration according to the input history of the user. And/or selecting a symbol or an emoticon with higher use frequency of the user according to the input history of the user to set a third corresponding relation between the symbol or the emoticon and the touch pressure level and the touch duration. And/or selecting a contact name with a higher contact frequency with the user from the contacts of the user and setting a third corresponding relation between the contact name and the touch pressure level and the touch duration. And/or, some popular network phrases, idioms, posters and the like can be selected to set a third corresponding relation between the popular network phrases, idioms, posters and the like and the touch pressure level and the touch duration.
In some possible embodiments, the user has limited force for performing the touch operation, the divided touch pressure levels are less, and the touch duration is also limited, at this time, the candidate items to be set with the correspondence relationship may be divided into a plurality of categories, and a third correspondence relationship between different touch pressure levels, touch durations, and different candidate item categories is set, so that the candidate items corresponding to a set of touch parameters composed of the touch pressure levels and the touch durations are all the candidate items in the corresponding candidate item categories. For example, the corresponding candidates may be all emoticons in the emoticon class set at the first pressure level and the first touch duration.
S402, detecting a touch parameter generated by touch operation of a user at a cursor position displayed on the touch display screen.
In a specific implementation, the implementation manner of step S402 may refer to the related description of step S101 in the embodiment shown in fig. 1, which is not described herein again.
S403, determining corresponding candidate items according to the third corresponding relation and the touch pressure level and the touch duration in the touch parameters.
In specific implementation, candidate items corresponding to the touch pressure level and the touch duration in the touch parameter can be determined by querying a preset third corresponding relation. Optionally, the corresponding candidate item may include one or more candidate items, and each candidate item may include one or more characters.
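Similarly, the third corresponding relation of S401 to S403 could key candidates on the pressure level together with a bucketed touch duration, as in the sketch below; the bucket boundaries and entries are assumptions made only for illustration.

```kotlin
// Illustrative third correspondence: (pressure level, duration bucket) -> candidate items.
fun durationBucket(durationMs: Long): Int = when {
    durationMs < 500 -> 0      // short press
    durationMs < 1500 -> 1     // medium press
    else -> 2                  // long press
}

val thirdCorrespondence: Map<Pair<Int, Int>, List<String>> = mapOf(
    (1 to 0) to listOf("OK"),
    (1 to 2) to listOf("I'm in a meeting, I'll get back to you."),
    (2 to 1) to listOf(":-)", ";-)")
)

fun candidatesFor(level: Int, durationMs: Long): List<String> =
    thirdCorrespondence[level to durationBucket(durationMs)] ?: emptyList()
```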
S404, judging whether the number of the corresponding candidate items is more than 1; if yes, go to step S405, otherwise go to step S407.
S405, receiving a touch operation for one candidate item of the corresponding candidate items, or receiving speech information input by a user and determining a target candidate item matched with the speech information from the corresponding candidate item.
S406, outputting the candidate item or the target candidate item at the cursor position.
S407, determining the corresponding candidate as a target candidate, and outputting the target candidate at the cursor position.
In a specific implementation, the implementation manner of steps S404 to S407 can refer to the related description of steps S204 to S207 in the embodiment shown in fig. 2, which is not repeated herein.
In the embodiment of the invention, a third corresponding relation among the touch pressure level, the touch duration and the candidate items is set; detecting a touch parameter generated by touch operation of a user at a cursor position displayed on a touch display screen; determining corresponding candidate items according to the third corresponding relation and the touch pressure level and the touch duration in the touch parameters; and if the number of the corresponding candidate items is larger than 1, determining a target candidate item from the corresponding candidate items and outputting the target candidate item at the cursor position. By adopting the embodiment of the invention, the candidate item containing at least one character can be output without inputting complicated pinyin or strokes by a user, so that the operation amount of the user can be reduced, and the character output efficiency is improved.
Referring to fig. 5, fig. 5 is a schematic structural diagram of a mobile terminal according to an embodiment of the present invention. As shown in fig. 5, the mobile terminal may include a detection module 501, a determination module 502, and an output module 503, wherein:
the detecting module 501 is configured to detect a touch parameter generated by a user performing a touch operation at a cursor position displayed on the touch display screen.
In a specific implementation, the touch display screen can be used for sensing a touch operation of a user and outputting signals such as characters, pictures or videos.
Optionally, the touch operation may include, but is not limited to, a click operation, a press operation, a slide operation, or the like. The above touch operation may be a long touch operation, such as continuous pressing or sliding, or a short touch operation, such as a tap.
Optionally, the touch parameter may include at least one of a touch pressure value, a number of touches, and a touch duration. A pressure sensor can be arranged below the screen, and the touch pressure value is detected through the pressure sensor. The number of touches and the touch duration can be detected by a counter, a timer, and the like.
Specifically, the cursor position is a position where a character needs to be inserted, where the character may include letters, numbers, Chinese characters, symbols, and the like.
A determining module 502, configured to determine a corresponding candidate item according to a preset correspondence and the touch parameter, where the corresponding candidate item includes at least one character.
In some possible embodiments, the corresponding relationship between the touch parameter and the candidate item may be preset by the mobile terminal, and the corresponding relationship may be set before the mobile terminal leaves a factory, or may be set by the user. Optionally, each candidate item may include one or more characters, for example, one candidate item may be a user-used sentence, and the sentence may include a plurality of words and/or symbols; as another example, a candidate may be a symbol.
Optionally, the sentence with higher user frequency and more input pinyin/strokes may be selected according to the input history of the user to set the corresponding relationship between the sentence and the touch parameter. And/or selecting a symbol or an emoticon with high use frequency of the user to set the corresponding relation between the symbol or the emoticon and the touch parameter according to the input history of the user. And/or selecting a contact name with more contact times with the user from the contacts of the user and setting the corresponding relation between the contact name and the touch parameter. And/or selecting some network popular phrases, idioms, posters and the like to set the corresponding relation between the network popular phrases, idioms, posters and the like and the touch parameters.
Optionally, the preset correspondence may include, but is not limited to: the corresponding relation between the touch pressure level and the candidate items; the corresponding relation between the touch pressure level, the number of touches, and the candidate items; the corresponding relation between the touch pressure level, the touch duration, and the candidate items; and the corresponding relation among the number of touches, the pressure level of each touch, the duration of each touch, and the candidate items.
Preferably, when only a small number of distinguishable touch parameter values are available for setting, the candidate items for which a corresponding relation is to be set may be divided into a plurality of categories, and the corresponding relation between touch parameters and candidate item categories may be set, so that the candidate items corresponding to a touch parameter are all the candidate items in the corresponding category. For example, under a first pressure value, a first number of touches, and a first touch duration, the corresponding candidate items may be all emoticons in the emoticon class.
According to the preset corresponding relation and the detected touch parameters, candidate items corresponding to the detected touch parameters can be determined.
An output module 503, configured to determine a target candidate from the corresponding candidates and output the target candidate at the cursor position.
Optionally, if the number of the corresponding candidate items is equal to 1, it may be determined that the corresponding candidate item is the target candidate item, and the corresponding candidate item is directly inserted into the cursor position and then displayed on the touch display screen. If the number of the corresponding candidate items is larger than 1, the candidate items can be displayed on a touch display screen so that a user can select a target candidate item from the candidate items and insert the target candidate item into the cursor position. Optionally, after the plurality of corresponding candidate items are displayed on the touch display screen, the touch operation of the user can be detected, and the touched candidate item is used as a target candidate item to be inserted into the cursor position; or, a voice selection instruction of the user may be received, and the selected candidate is inserted into the cursor position as a target candidate, for example, a voice selection instruction of the user is received and recognized, where a candidate with the same pronunciation as the voice selection instruction is the selected target candidate.
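To make the module division of fig. 5 concrete, here is a hedged sketch in which a detection module, a determining module, and an output module are composed into a mobile-terminal pipeline; the interfaces, names, and wiring are assumptions made for this sketch rather than the patent's own definitions.

```kotlin
// Hedged sketch of the detection / determining / output module structure; all names are assumed.
data class Touch(val pressureLevel: Int, val touchCount: Int, val durationMs: Long)

fun interface DetectionModule { fun detect(): Touch }                       // detects the touch parameter
fun interface DeterminingModule { fun candidates(t: Touch): List<String> }  // applies the preset correspondence
fun interface OutputModule { fun output(candidates: List<String>) }         // determines and outputs the target

class MobileTerminal(
    private val detection: DetectionModule,
    private val determining: DeterminingModule,
    private val outputModule: OutputModule
) {
    fun onTouchAtCursor() {
        val touch = detection.detect()
        val candidates = determining.candidates(touch)
        if (candidates.isNotEmpty()) outputModule.output(candidates)
    }
}
```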
In the embodiment of the invention, the mobile terminal can detect the touch parameter generated by the touch operation of the user at the cursor position displayed on the touch display screen; determining a corresponding candidate item according to a preset corresponding relation and the touch parameter, wherein the corresponding candidate item comprises at least one character; and determining a target candidate item from the corresponding candidate items and outputting the target candidate item at the cursor position. By adopting the embodiment of the invention, the candidate item containing the character can be output without inputting complicated pinyin or strokes by the user, so that the operation amount of the user can be reduced, and the character output efficiency is improved.
Referring to fig. 6, fig. 6 is a schematic structural diagram of a mobile terminal according to another embodiment of the present invention. As shown in fig. 6, the mobile terminal may include a first setting module 601, a detection module 602, a determination module 603, and an output module 604, wherein:
the first setting module 601 is configured to set a first corresponding relationship between the touch pressure level and the candidate items.
In specific implementation, the range of the pressure value generated by the touch operation of the user on the touch display screen can be determined through a test experiment before delivery. The range may be divided into a plurality of touch pressure levels, and at least one corresponding candidate item may be set for each touch pressure level, forming a first correspondence between the touch pressure level and the candidate item. Optionally, the first corresponding relationship may be set before the mobile terminal leaves a factory, or may be set by a user. Optionally, each candidate item may include one or more characters, for example, one candidate item may be a user-used sentence, and the sentence may include a plurality of words and/or symbols; as another example, a candidate may be a symbol.
Alternatively, a sentence with a higher user frequency and a larger number of pinyin/strokes may be selected to set the first corresponding relationship between the sentence and the touch pressure level according to the input history of the user. And/or selecting a symbol or an emoticon with higher use frequency of the user according to the input history of the user to set a first corresponding relation between the symbol or the emoticon and the touch pressure level. And/or selecting a contact name which is contacted with the user more frequently from the contacts of the user and setting a first corresponding relation between the contact name and the touch pressure level. And/or, some popular network phrases, idioms, posters and the like can be selected to set the first corresponding relation between the popular network phrases, idioms, posters and the like and the touch pressure level.
In some possible embodiments, the user has limited strength in performing the touch operation, and the divided touch pressure levels are less, at this time, the candidate items to be set with the correspondence may be divided into a plurality of categories, and the first correspondence between different touch pressure levels and different candidate item categories is set, so that the candidate item corresponding to one touch pressure level is all the candidate items in the corresponding candidate item category. For example, the first pressure level may be set, and the corresponding candidates are all the emoticons in the emoticon class.
The detecting module 602 is configured to detect a touch parameter generated by a user performing a touch operation at a cursor position displayed on the touch display screen.
In a specific implementation, the implementation of the detection module 602 may refer to the implementation of the detection module 501 in the embodiment shown in fig. 5, which is not described herein again.
The determining module 603 is configured to determine a corresponding candidate item according to a preset corresponding relationship and the touch parameter, where the corresponding candidate item includes at least one character.
In some possible implementations, the implementation of the determining module 603 may refer to the implementation of the determining module 502 in the embodiment shown in fig. 5.
In this embodiment, the determining module 603 may be specifically configured to determine a corresponding candidate item according to the first corresponding relationship and the touch pressure level in the touch parameter.
In a specific implementation, candidate items corresponding to the touch pressure levels in the touch parameters can be determined by querying a preset first corresponding relation. Optionally, the corresponding candidate item may include one or more candidate items, and each candidate item may include one or more characters.
An output module 604, configured to determine a target candidate from the corresponding candidates and output the target candidate at the cursor position.
In a specific implementation, after determining the corresponding candidate items, the number of the corresponding candidate items can be known, and further, whether the number of the corresponding candidate items is greater than 1 can be judged.
In some possible embodiments, if the number of the corresponding candidate items is equal to 1, the corresponding candidate item may be determined to be the target candidate item, and the corresponding candidate item is directly inserted into the cursor position and then displayed on the touch display screen.
In some possible embodiments, if the number of the corresponding candidate items is greater than 1, the corresponding candidate items may be displayed on the touch display screen at a place other than the cursor position, for example, the corresponding candidate items may be displayed at any blank position in the touch display screen, so that the user may select the target candidate item therefrom.
In some possible embodiments, when the number of the corresponding candidate items is greater than 1, the output module 604 may be specifically configured to: and receiving touch operation aiming at the target candidate item, and outputting the target candidate item at the cursor position.
In other possible embodiments, the output module 604 may include a voice receiving unit 6041, a determining unit 6042, and an output unit 6043, where:
a speech receiving unit 6041 configured to receive speech information input by the user when the number of the corresponding candidates is greater than 1.
A determining unit 6042 configured to determine a target candidate matching the speech information from the corresponding candidates.
Alternatively, the matching may be that the voice message input by the user is the same as the pronunciation of the target candidate, or that the voice message input by the user is the initial pronunciation of the target candidate, and so on.
An output unit 6043 for outputting the target candidate at the cursor position.
After the target candidate item is determined, the target candidate item can be directly inserted into the cursor position and displayed on the touch display screen.
In the embodiment of the invention, the mobile terminal can set a first corresponding relation between the touch pressure level and the candidate items; detecting a touch parameter generated by touch operation of a user at a cursor position displayed on a touch display screen; determining corresponding candidate items according to the first corresponding relation and the touch pressure level in the touch parameters; and if the number of the corresponding candidate items is larger than 1, determining a target candidate item from the corresponding candidate items and outputting the target candidate item at the cursor position. By adopting the embodiment of the invention, the candidate item containing at least one character can be output without inputting complicated pinyin or strokes by a user, so that the operation amount of the user can be reduced, and the character output efficiency is improved.
Referring to fig. 7, fig. 7 is a schematic structural diagram of a mobile terminal according to another embodiment of the present invention. As shown in fig. 7, the mobile terminal may include a second setting module 701, a detection module 702, a determination module 703, and an output module 704, wherein:
the second setting module 701 is configured to set a second corresponding relation among the touch pressure level, the number of touches, and the candidate items.
In specific implementation, the range of the pressure value generated by the touch operation of the user on the touch display screen can be determined through a test experiment before delivery. The range may be divided into a plurality of touch pressure levels, and a second correspondence between the touch pressure level, the number of touches, and the candidate items may be set. Optionally, the second corresponding relationship may be set before the mobile terminal leaves a factory, or may be set by a user. Optionally, each candidate item may include one or more characters, for example, one candidate item may be a user-used sentence, and the sentence may include a plurality of words and/or symbols; as another example, a candidate may be a symbol.
Optionally, according to the user's input history, sentences that the user uses frequently and that require many pinyin characters or strokes to input may be selected, and a second corresponding relation may be set between those sentences and combinations of touch pressure level and number of touches. And/or, symbols or emoticons that the user uses frequently may be selected from the input history and given a second corresponding relation with touch pressure levels and numbers of touches. And/or, the names of contacts the user communicates with most often may be selected from the user's contacts and given a second corresponding relation with touch pressure levels and numbers of touches. And/or, popular Internet phrases, idioms, posts, and the like may be selected and given a second corresponding relation with touch pressure levels and numbers of touches.
In some possible embodiments, the force a user can apply in a touch operation is limited, so only a few touch pressure levels can be divided, and the number of touches is also limited. In this case, the candidate items for which a correspondence is to be set may be divided into a plurality of categories, and a second corresponding relation may be set between different combinations of touch pressure level and number of touches and different candidate item categories, so that the candidate items corresponding to a group of touch parameters composed of a touch pressure level and a number of touches are all the candidate items in the corresponding category. For example, the candidate items corresponding to the first pressure level and the first number of touches may be all the emoticons in the emoticon category.
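A minimal sketch of how such a second corresponding relation, keyed by touch pressure level and number of touches and optionally pointing at a whole candidate category, might be stored and queried; the table contents, category names, and phrases below are illustrative placeholders only.

# Illustrative second correspondence: (pressure level, number of touches) -> candidates.
# An entry may map either to an explicit list of candidate items or to a whole
# candidate category, in which case every item of that category is a candidate.

CATEGORIES = {
    "emoticons": [":-)", ":-(", "^_^"],
    "greetings": ["Good morning!", "See you soon."],   # placeholder frequent sentences
}

SECOND_CORRESPONDENCE = {
    (1, 1): ("category", "emoticons"),       # first pressure level, one touch -> whole category
    (1, 2): ("items", ["OK", "Thanks!"]),    # explicit candidate items
    (2, 1): ("category", "greetings"),
}

def candidates_for(pressure_level, touch_count):
    """Return the candidate items for a group of touch parameters, or [] if none is set."""
    entry = SECOND_CORRESPONDENCE.get((pressure_level, touch_count))
    if entry is None:
        return []
    kind, value = entry
    return list(CATEGORIES[value]) if kind == "category" else list(value)

print(candidates_for(1, 1))   # -> all emoticons in the emoticon category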
The detecting module 702 is configured to detect a touch parameter generated by a user performing a touch operation at a cursor position displayed on the touch display screen.
In a specific implementation, the implementation of the detection module 702 may refer to the implementation of the detection module 501 in the embodiment shown in fig. 5, which is not described herein again.
The determining module 703 is configured to determine a corresponding candidate item according to a preset correspondence and the touch parameter, where the corresponding candidate item includes at least one character.
In some possible implementations, the implementation manner of the determining module 703 may refer to the implementation manner of the determining module 502 in the embodiment shown in fig. 5.
In this embodiment, the determining module 703 may be specifically configured to determine the corresponding candidate items according to the second corresponding relation and the touch pressure level and number of touches in the touch parameter.
In a specific implementation, the candidate items corresponding to the touch pressure level and number of touches in the touch parameter can be determined by querying the preset second corresponding relation. Optionally, there may be one or more corresponding candidate items, and each candidate item may include one or more characters.
An output module 704, configured to determine a target candidate item from the corresponding candidate items and output the target candidate item at the cursor position.
In a specific implementation, the implementation manner of the output module 704 may refer to the related description of the output module 604 in the embodiment shown in fig. 6, which is not described herein again.
In the embodiment of the invention, the mobile terminal can set a second corresponding relation among the touch pressure level, the number of touches, and the candidate items; detect a touch parameter generated by a touch operation performed by the user at the cursor position displayed on the touch display screen; determine corresponding candidate items according to the second corresponding relation and the touch pressure level and number of touches in the touch parameter; and, if the number of corresponding candidate items is greater than 1, determine a target candidate item from the corresponding candidate items and output it at the cursor position. By adopting the embodiment of the invention, a candidate item containing at least one character can be output without the user entering complicated pinyin or strokes, so the amount of operation required of the user is reduced and the efficiency of character output is improved.
Referring to fig. 8, fig. 8 is a schematic structural diagram of a mobile terminal according to another embodiment of the present invention. As shown in fig. 8, the mobile terminal may include a third setting module 801, a detection module 802, a determination module 803, and an output module 804, wherein:
a third setting module 801, configured to set a third corresponding relation between the touch pressure level, the touch duration, and the candidate items.
In a specific implementation, the range of pressure values generated by a user's touch operation on the touch display screen can be determined through testing before the mobile terminal leaves the factory. The range may be divided into a plurality of touch pressure levels, and a third correspondence between the touch pressure level, the touch duration, and the candidate items may be set. Optionally, the third correspondence may be set before the mobile terminal leaves the factory, or may be set by the user. Optionally, each candidate item may include one or more characters; for example, a candidate item may be a sentence the user uses often, and the sentence may include a plurality of words and/or symbols; as another example, a candidate item may be a symbol.
Optionally, according to the user's input history, sentences that the user uses frequently and that require many pinyin characters or strokes to input may be selected, and a third corresponding relation may be set between those sentences and combinations of touch pressure level and touch duration. And/or, symbols or emoticons that the user uses frequently may be selected from the input history and given a third corresponding relation with touch pressure levels and touch durations. And/or, the names of contacts the user communicates with most often may be selected from the user's contacts and given a third corresponding relation with touch pressure levels and touch durations. And/or, popular Internet phrases, idioms, posts, and the like may be selected and given a third corresponding relation with touch pressure levels and touch durations.
In some possible embodiments, the force a user can apply in a touch operation is limited, so only a few touch pressure levels can be divided, and the touch duration is also limited. In this case, the candidate items for which a correspondence is to be set may be divided into a plurality of categories, and a third corresponding relation may be set between different combinations of touch pressure level and touch duration and different candidate item categories, so that the candidate items corresponding to a group of touch parameters composed of a touch pressure level and a touch duration are all the candidate items in the corresponding category. For example, the candidate items corresponding to the first pressure level and the first touch duration may be all the emoticons in the emoticon category.
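In the same spirit, a sketch of a third corresponding relation keyed by touch pressure level and a bucketed touch duration; the duration thresholds and table entries below are assumed for illustration and are not taken from the patent.

# Illustrative only: (pressure level, duration bucket) -> candidate items.
# The raw touch duration in seconds is first mapped to a coarse bucket.

def duration_bucket(seconds):
    if seconds < 0.5:
        return "short"
    if seconds < 1.5:
        return "medium"
    return "long"

THIRD_CORRESPONDENCE = {
    (1, "short"): [":-)", "^_^"],                 # e.g. emoticon category at level 1
    (1, "long"): ["Sorry, I'm in a meeting."],    # frequently used sentence
    (2, "medium"): ["Alice", "Bob"],              # frequently contacted names
}

def candidates_for(pressure_level, duration_seconds):
    return THIRD_CORRESPONDENCE.get((pressure_level, duration_bucket(duration_seconds)), [])

print(candidates_for(1, 0.3))   # -> [':-)', '^_^']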
The detecting module 802 is configured to detect a touch parameter generated by a user performing a touch operation at a cursor position displayed on the touch display screen.
In a specific implementation, the implementation of the detection module 802 may refer to the implementation of the detection module 501 in the embodiment shown in fig. 5, which is not described herein again.
The determining module 803 is configured to determine a corresponding candidate item according to a preset correspondence and the touch parameter, where the corresponding candidate item includes at least one character.
In some possible implementations, the implementation manner of the determining module 803 may refer to the implementation manner of the determining module 502 in the embodiment shown in fig. 5.
In this embodiment, the determining module 803 may be specifically configured to determine the corresponding candidate item according to the third corresponding relationship and the touch pressure level and the touch duration in the touch parameter.
In specific implementation, candidate items corresponding to the touch pressure level and the touch duration in the touch parameter can be determined by querying a preset third corresponding relation. Optionally, the corresponding candidate item may include one or more candidate items, and each candidate item may include one or more characters.
An output module 804, configured to determine a target candidate item from the corresponding candidate items and output the target candidate item at the cursor position.
In a specific implementation, the implementation manner of the output module 804 may refer to the related description of the output module 604 in the embodiment shown in fig. 6, which is not described herein again.
In the embodiment of the invention, the mobile terminal can set a third corresponding relation among the touch pressure level, the touch duration, and the candidate items; detect a touch parameter generated by a touch operation performed by the user at the cursor position displayed on the touch display screen; determine corresponding candidate items according to the third corresponding relation and the touch pressure level and touch duration in the touch parameter; and, if the number of corresponding candidate items is greater than 1, determine a target candidate item from the corresponding candidate items and output it at the cursor position. By adopting the embodiment of the invention, a candidate item containing at least one character can be output without the user entering complicated pinyin or strokes, so the amount of operation required of the user is reduced and the efficiency of character output is improved.
Referring to fig. 9, fig. 9 is a schematic structural diagram of a terminal according to another embodiment of the present invention. As shown in fig. 9, the mobile terminal may include: at least one input device 1000; at least one output device 2000; at least one processor 3000, e.g., a CPU; and a memory 4000, the input device 1000, the output device 2000, the processor 3000, and the memory 4000 being connected by a bus 5000.
The input device 1000 may be a touch control screen, a key, or a voice recognition module of the terminal, and the input device 1000 may be configured to detect a touch operation of a user or receive voice information input by the user.
The output device 2000 may specifically be a display screen or a voice playing module of a terminal, and the output device 2000 may be configured to output information such as text, images, and voice.
The memory 4000 may be a high-speed RAM memory or a non-volatile memory (e.g., a magnetic disk memory). The memory 4000 is used for storing a set of program codes and also for storing channel identifications.
The input device 1000, the output device 2000 and the processor 3000 are configured to call the program code stored in the memory 4000, and perform the following operations:
the processor 3000 may be configured to:
detecting a touch parameter generated by touch operation of a user at a cursor position displayed on a touch display screen;
determining a corresponding candidate item according to a preset corresponding relation and the touch parameter, wherein the corresponding candidate item comprises at least one character;
the output device 2000 described above may be used to:
determining a target candidate item from the corresponding candidate items and outputting the target candidate item at the cursor position.
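Tying the three operations together, the sketch below runs a detected touch parameter through a correspondence lookup, picks a target candidate item (directly when there is only one, otherwise via a selection callback standing in for the user's touch or voice choice), and inserts it at a cursor position in a plain text buffer; the correspondence contents and the buffer model are illustrative assumptions rather than the terminal's actual implementation.

# Illustrative end-to-end flow: touch parameter -> candidates -> target -> output at cursor.

CORRESPONDENCE = {1: ["^_^"], 2: ["Good morning!", "Good night!"]}   # pressure level -> candidates

def determine_candidates(pressure_level):
    return CORRESPONDENCE.get(pressure_level, [])

def determine_target(candidates, choose=lambda items: items[0]):
    """With exactly one candidate it is the target; otherwise ask for a choice."""
    if not candidates:
        return None
    if len(candidates) == 1:
        return candidates[0]
    return choose(candidates)

def output_at_cursor(text, cursor, target):
    """Insert the target candidate item at the cursor position of a text buffer."""
    return text[:cursor] + target + text[cursor:]

buffer, cursor = "Hello ", 6
target = determine_target(determine_candidates(2), choose=lambda items: items[1])
print(output_at_cursor(buffer, cursor, target))   # -> Hello Good night!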
In some possible embodiments, before detecting the touch parameter generated by the user performing a touch operation at the cursor position displayed on the touch display screen, the processor 3000 may further be configured to:
setting a first corresponding relation between the touch pressure level and the candidate items;
the determining, by the processor 3000, of corresponding candidate items according to the preset corresponding relationship and the touch parameter may include:
determining corresponding candidate items according to the first corresponding relation and the touch pressure level in the touch parameters.
In some possible embodiments, before detecting the touch parameter generated by the user performing a touch operation at the cursor position displayed on the touch display screen, the processor 3000 may further be configured to:
setting a second corresponding relation among the touch pressure level, the number of touches and the candidate items;
the determining, by the processor 3000, of corresponding candidate items according to the preset corresponding relationship and the touch parameter may include:
determining corresponding candidate items according to the second corresponding relation and the touch pressure level and the number of touches in the touch parameters.
In some possible embodiments, before detecting the touch parameter generated by the user performing a touch operation at the cursor position displayed on the touch display screen, the processor 3000 may further be configured to:
setting a third corresponding relation among the touch pressure level, the touch duration and the candidate items;
the determining, by the processor 3000, of corresponding candidate items according to the preset corresponding relationship and the touch parameter may include:
determining corresponding candidate items according to the third corresponding relation and the touch pressure level and the touch duration in the touch parameters.
In some possible embodiments, before detecting the touch parameter generated by the user performing a touch operation at the cursor position displayed on the touch display screen, the processor 3000 may further be configured to:
setting a fourth corresponding relation among the number of touches, the touch pressure level, the touch duration and the candidate items;
the determining, by the processor 3000, of corresponding candidate items according to the preset corresponding relationship and the touch parameter may include:
determining corresponding candidate items according to the fourth corresponding relation, the number of touches in the touch parameter, and the pressure level and duration of each touch.
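Because the fourth corresponding relation keys on the number of touches together with the pressure level and duration of each individual touch, its lookup key is naturally an ordered sequence of per-touch parameters; the sketch below uses assumed duration bucketing and placeholder table contents.

# Illustrative fourth correspondence: the key is the ordered sequence of
# (pressure level, duration bucket) pairs, one pair per touch, so the number
# of touches is implicit in the key length.

FOURTH_CORRESPONDENCE = {
    ((1, "short"),): ["^_^"],                                  # one light, short touch
    ((2, "short"), (2, "short")): ["On my way!"],              # two firm, short touches
    ((1, "long"), (2, "short")): ["Frequently contacted: Alice"],
}

def touch_key(touches):
    """touches: sequence of (pressure_level, duration_seconds) tuples, one per touch."""
    bucket = lambda s: "short" if s < 0.5 else "long"
    return tuple((level, bucket(duration)) for level, duration in touches)

def candidates_for(touches):
    return FOURTH_CORRESPONDENCE.get(touch_key(touches), [])

print(candidates_for([(2, 0.2), (2, 0.3)]))   # -> ['On my way!']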
In some possible embodiments, the above processor 3000 determining a target candidate from the corresponding candidates and outputting the target candidate at the cursor position may include:
when the number of the corresponding candidate items is greater than 1, receiving a touch operation directed at the target candidate item, and outputting the target candidate item at the cursor position;
and when the number of the corresponding candidate items is equal to 1, determining the corresponding candidate items as target candidate items, and outputting the target candidate items at the cursor position.
In some possible embodiments, the above processor 3000 determining a target candidate from the corresponding candidates and outputting the target candidate at the cursor position may include:
when the number of the corresponding candidate items is larger than 1, receiving voice information input by a user;
determining a target candidate matched with the voice information from the corresponding candidate;
and outputting the target candidate item at the cursor position.
In the embodiment of the invention, a touch parameter generated by a touch operation performed by the user at the cursor position displayed on the touch display screen is detected; corresponding candidate items are determined according to a preset corresponding relation and the touch parameter, the corresponding candidate items including at least one character; and a target candidate item is determined from the corresponding candidate items and output at the cursor position. By adopting the embodiment of the invention, a candidate item containing characters can be output without the user entering complicated pinyin or strokes, so the amount of operation required of the user is reduced and the efficiency of character output is improved.
The above-described embodiments of the apparatus are merely illustrative, and the units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of the present embodiment. One of ordinary skill in the art can understand and implement it without inventive effort.
The steps in the method of the embodiment of the invention can be sequentially adjusted, combined and deleted according to actual needs.
The modules or units in the device of the embodiment of the invention can be combined, divided and deleted according to actual needs.
The modules or units of the embodiments of the present invention may be implemented in a general-purpose integrated circuit (e.g., a central processing unit, CPU) or an application-specific integrated circuit (ASIC).
Through the above description of the embodiments, those skilled in the art will clearly understand that each embodiment can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware. With this understanding in mind, the above-described technical solutions may be embodied in the form of a software product, which can be stored in a computer-readable storage medium such as ROM/RAM, magnetic disk, optical disk, etc., and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) to execute the methods described in the embodiments or some parts of the embodiments.
The above-described embodiments do not limit the scope of the present invention. Any modification, equivalent replacement, and improvement made within the spirit and principle of the above-described embodiments should be included in the protection scope of the technical solution.

Claims (13)

1. A method for outputting characters, the method comprising:
detecting a touch parameter generated by touch operation of a user at a cursor position displayed on a touch display screen;
determining a corresponding candidate item according to a preset corresponding relation and the touch parameter, wherein the corresponding candidate item comprises at least one character;
and determining a target candidate item from the corresponding candidate items and outputting the target candidate item at the cursor position.
2. The character output method according to claim 1,
before detecting a touch parameter generated by a user performing a touch operation at a cursor position displayed on the touch display screen, the method further includes:
setting a first corresponding relation between the touch pressure level and the candidate items;
the determining the corresponding candidate item according to the preset corresponding relationship and the touch parameter includes:
determining corresponding candidate items according to the first corresponding relation and the touch pressure level in the touch parameters.
3. The character output method according to claim 1,
before detecting a touch parameter generated by a user performing a touch operation at a cursor position displayed on the touch display screen, the method further includes:
setting a second corresponding relation among the touch pressure level, the number of touches and the candidate items;
the determining the corresponding candidate item according to the preset corresponding relationship and the touch parameter includes:
determining corresponding candidate items according to the second corresponding relation and the touch pressure level and the number of touches in the touch parameters.
4. The character output method according to claim 1,
before detecting a touch parameter generated by a user performing a touch operation at a cursor position displayed on the touch display screen, the method further includes:
setting a third corresponding relation among the touch pressure level, the touch duration and the candidate items;
the determining the corresponding candidate item according to the preset corresponding relationship and the touch parameter includes:
determining corresponding candidate items according to the third corresponding relation and the touch pressure level and the touch duration in the touch parameters.
5. The character output method according to any one of claims 1 to 4, wherein the determining a target candidate from among the corresponding candidates and outputting the target candidate at the cursor position includes:
when the number of the corresponding candidate items is greater than 1, receiving a touch operation directed at the target candidate item, and outputting the target candidate item at the cursor position;
and when the number of the corresponding candidate items is equal to 1, determining the corresponding candidate items as target candidate items, and outputting the target candidate items at the cursor position.
6. The character output method according to any one of claims 1 to 4, wherein the determining a target candidate from among the corresponding candidates and outputting the target candidate at the cursor position includes:
when the number of the corresponding candidate items is larger than 1, receiving voice information input by a user;
determining a target candidate matched with the voice information from the corresponding candidate;
and outputting the target candidate item at the cursor position.
7. A mobile terminal, characterized in that the mobile terminal comprises:
the detection module is used for detecting touch parameters generated by touch operation of a user at a cursor position displayed on the touch display screen;
the determining module is used for determining corresponding candidate items according to a preset corresponding relation and the touch parameters, wherein the corresponding candidate items comprise at least one character;
and the output module is used for determining a target candidate item from the corresponding candidate items and outputting the target candidate item at the cursor position.
8. The mobile terminal of claim 7, wherein the mobile terminal further comprises:
the first setting module is used for setting a first corresponding relation between the touch pressure level and the candidate items;
the determining module is specifically configured to determine a corresponding candidate item according to the first corresponding relationship and the touch pressure level in the touch parameter.
9. The mobile terminal of claim 7, wherein the mobile terminal further comprises:
the second setting module is used for setting a second corresponding relation among the touch pressure level, the number of touches and the candidate items;
the determining module is specifically configured to determine a corresponding candidate item according to the second corresponding relation and the touch pressure level and the number of touches in the touch parameter.
10. The mobile terminal of claim 7, wherein the mobile terminal further comprises:
the third setting module is used for setting a third corresponding relation among the touch pressure level, the touch duration and the candidate items;
the determining module is specifically configured to determine a corresponding candidate item according to the third correspondence and the touch pressure level and the touch duration in the touch parameter.
11. The mobile terminal according to any one of claims 7 to 10, wherein the output module is specifically configured to:
when the number of the corresponding candidate items is greater than 1, receiving a touch operation directed at the target candidate item, and outputting the target candidate item at the cursor position; or,
and when the number of the corresponding candidate items is equal to 1, determining the corresponding candidate items as target candidate items, and outputting the target candidate items at the cursor position.
12. The mobile terminal according to any of claims 7 to 10, wherein the output module comprises:
the voice receiving unit is used for receiving voice information input by a user when the number of the corresponding candidate items is greater than 1;
a determining unit, configured to determine a target candidate matched with the speech information from the corresponding candidate;
and the output unit is used for outputting the target candidate item at the cursor position.
13. A mobile terminal, characterized in that the mobile terminal comprises an input device, an output device, a processor and a memory, wherein the input device, the output device, the processor and the memory are connected by a bus, the memory is used for storing a set of program codes, and the input device, the output device and the processor are used for calling the program codes to execute the character output method according to any one of claims 1 to 6.
CN201610039029.XA 2016-01-20 2016-01-20 A kind of character input method and mobile terminal Expired - Fee Related CN105718072B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610039029.XA CN105718072B (en) 2016-01-20 2016-01-20 A kind of character input method and mobile terminal

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201610039029.XA CN105718072B (en) 2016-01-20 2016-01-20 A kind of character input method and mobile terminal

Publications (2)

Publication Number Publication Date
CN105718072A true CN105718072A (en) 2016-06-29
CN105718072B CN105718072B (en) 2018-03-02

Family

ID=56147449

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610039029.XA Expired - Fee Related CN105718072B (en) 2016-01-20 2016-01-20 A kind of character input method and mobile terminal

Country Status (1)

Country Link
CN (1) CN105718072B (en)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105100492A (en) * 2015-08-11 2015-11-25 努比亚技术有限公司 Pressure feedback system and pressure feedback method
CN105045411A (en) * 2015-08-27 2015-11-11 广东欧珀移动通信有限公司 Object control method and terminal
CN105159556A (en) * 2015-08-27 2015-12-16 广东欧珀移动通信有限公司 Interface operation method and electronic terminal
CN105183356A (en) * 2015-09-09 2015-12-23 魅族科技(中国)有限公司 Character output method, input device and electronic device

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2018053695A1 (en) * 2016-09-20 2018-03-29 谷歌公司 Pressure-based selection of additional characters
CN106775378A (en) * 2016-11-25 2017-05-31 维沃移动通信有限公司 A kind of method and mobile terminal for determining input method candidate word
CN109388249A (en) * 2017-08-02 2019-02-26 北京搜狗科技发展有限公司 Input processing method, device, terminal and the readable storage medium storing program for executing of information
CN107797749A (en) * 2017-10-24 2018-03-13 网易(杭州)网络有限公司 Message method and device, storage medium, processor, terminal
CN110096163A (en) * 2018-01-29 2019-08-06 北京搜狗科技发展有限公司 A kind of expression input method and device
CN108415657A (en) * 2018-03-12 2018-08-17 网易(杭州)网络有限公司 Message method, device, medium and electronic equipment
CN108415657B (en) * 2018-03-12 2023-03-24 网易(杭州)网络有限公司 Message sending method, device, medium and electronic equipment
CN115291791A (en) * 2022-08-17 2022-11-04 维沃移动通信有限公司 Text recognition method and device, electronic equipment and storage medium

Also Published As

Publication number Publication date
CN105718072B (en) 2018-03-02

Similar Documents

Publication Publication Date Title
CN105718072B (en) A kind of character input method and mobile terminal
US9508028B2 (en) Converting text strings into number strings, such as via a touchscreen input
USRE46139E1 (en) Language input interface on a device
US8605039B2 (en) Text input
US8364134B2 (en) Automatic language selection for text input in messaging context
US9128610B2 (en) Virtual predictive keypad
US8831209B2 (en) Conference call dialing
US9836448B2 (en) Text editing
CN102375656B (en) Full spelling single character sliding input method based on touch screen, device and touch screen terminal
US20140063067A1 (en) Method to select word by swiping capacitive keyboard
EP2282252A1 (en) Method of and apparatus for converting a character sequence input
WO2014159473A2 (en) Automatic supplementation of word correction dictionaries
CN101178633A (en) Method, system and device for correcting hand-written screen error
EP2703957A1 (en) Method to select word by swiping capacitive keyboard
EP3037948A1 (en) Portable electronic device and method of controlling display of selectable elements
US20130097548A1 (en) Virtual Keyboard, Input Method, and Associated Storage Medium
US20110296347A1 (en) Text entry techniques
US20130076641A1 (en) Method and Keyboard for Inputting Chinese Characters and Electronic Apparatus Containing the Keyboard
US20030036411A1 (en) Method of entering characters into a text string and a text-editing terminal using the method
CN107132927B (en) Input character recognition method and device for recognizing input characters
US20140298177A1 (en) Methods, devices and systems for interacting with a computing device
CN102902751A (en) Webpage input method and device in mobile terminal and mobile terminal
CN105739894B (en) A kind of input method and terminal
EP2624529A1 (en) Page flip and operation method thereof for reading mms on mobile phones
WO2006125660A2 (en) Automatic language selection for text input in messaging context

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant
CP01 Change in the name or title of a patent holder
CP01 Change in the name or title of a patent holder

Address after: No. 18 Wusha Beach Road, Chang'an Town, Dongguan, Guangdong Province 523860

Patentee after: GUANGDONG OPPO MOBILE TELECOMMUNICATIONS Corp.,Ltd.

Address before: No. 18 Wusha Beach Road, Chang'an Town, Dongguan, Guangdong Province 523860

Patentee before: GUANGDONG OPPO MOBILE TELECOMMUNICATIONS Corp.,Ltd.

CF01 Termination of patent right due to non-payment of annual fee
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20180302