WO2021082694A1 - Information processing method and apparatus, electronic device and medium - Google Patents

Information processing method and apparatus, electronic device and medium

Info

Publication number
WO2021082694A1
Authority
WO
WIPO (PCT)
Prior art keywords
information
area
stored
drag
processed
Prior art date
Application number
PCT/CN2020/111785
Other languages
English (en)
Chinese (zh)
Inventor
邢增兴
Original Assignee
北京字节跳动网络技术有限公司
Priority date
Filing date
Publication date
Priority claimed from CN201911046206.7A external-priority patent/CN110806834A/zh
Priority claimed from CN201911046169.XA external-priority patent/CN110806827A/zh
Application filed by 北京字节跳动网络技术有限公司 filed Critical 北京字节跳动网络技术有限公司
Publication of WO2021082694A1 publication Critical patent/WO2021082694A1/fr

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures

Definitions

  • the present disclosure relates to computer technology. Specifically, the present disclosure relates to an information processing method, device, electronic device, and medium.
  • the present disclosure provides an information processing method, including: determining information to be processed; and performing a corresponding operation on the information to be processed based on a drag operation for the information to be processed.
  • the present disclosure provides an information processing method, including: determining information to be processed in an input area; determining a processing mode of the information to be processed based on a drag operation for the information to be processed; and performing a corresponding operation on the information to be processed based on the determined processing mode.
  • the present disclosure provides an information processing method, including: determining information to be stored based on a user's trigger operation; and storing the information to be stored when it is detected that the user drags the information to be stored to a preset storage area.
  • an information processing device including:
  • the determination module is used to determine the information to be processed
  • the processing module is configured to perform corresponding operations on the to-be-processed information based on the drag operation for the to-be-processed information.
  • an information processing device including:
  • the first determining module is used to determine the information to be processed in the input area
  • the second determining module is configured to determine the processing mode of the information to be processed based on the drag operation for the information to be processed;
  • the execution module is used to execute corresponding operations on the information to be processed based on the determined processing mode.
  • an information processing device including:
  • the third determining module is used to determine the information to be stored based on the user's trigger operation
  • the storage module is configured to store the information to be stored when it is detected that the user drags the information to be stored to the preset storage area.
  • an electronic device including:
  • one or more processors;
  • one or more application programs, where the one or more application programs are stored in a memory and configured to be executed by the one or more processors, and the one or more programs are configured to perform the operations of the method shown in the first aspect.
  • the present disclosure provides a computer-readable storage medium on which a computer program is stored, and when the program is executed by a processor, the method shown in the first aspect is implemented.
  • the present disclosure provides an information processing method, device, electronic equipment, and medium.
  • the present disclosure determines information to be processed, and performs operations on the information to be processed based on a drag operation for the information to be processed, thereby improving user experience.
  • the present disclosure can perform certain operations on the input information in the input area according to the user's drag operation, and is not limited to only inputting information through the input method keyboard in the input area, thereby meeting the diversified requirements of information interaction and improving user experience.
  • In the present disclosure, the determined information to be stored is stored by directly dragging it to a preset storage area, which requires neither manual re-entry of the information to be stored nor copy or paste operations with switching to a corresponding operation interface, thereby reducing the complexity of information storage, saving time for information storage, and improving user experience.
  • FIG. 1 is a schematic flowchart of an information processing method based on an input method provided by an embodiment of the disclosure
  • FIG. 2 is a schematic structural diagram of an information processing device based on an input method provided by an embodiment of the disclosure
  • FIG. 3 is a schematic structural diagram of an electronic device provided by an embodiment of the disclosure.
  • FIG. 4 is a schematic diagram of a display interface of an input method provided by an embodiment of the present disclosure.
  • FIG. 5 is a schematic diagram of a display interface of another input method provided by an embodiment of the present disclosure.
  • FIG. 6 is a schematic diagram of a display interface of another input method provided by an embodiment of the present disclosure.
  • FIG. 7 is a schematic diagram of a display interface of another input method provided by an embodiment of the present disclosure.
  • FIG. 8 is a schematic diagram of a display interface of a shortcut phrase panel provided by an embodiment of the present disclosure.
  • FIG. 9 is a schematic diagram of a display interface of another input method provided by an embodiment of the present disclosure.
  • FIG. 10 is a schematic diagram of a display interface of another input method provided by an embodiment of the present disclosure.
  • FIG. 11 is a schematic diagram of a display interface of another input method provided by an embodiment of the present disclosure.
  • FIG. 12 is a schematic diagram of a display interface of another input method provided by an embodiment of the disclosure.
  • FIG. 13 is a schematic flowchart of an information processing method provided by an embodiment of the disclosure.
  • FIG. 14 is a schematic structural diagram of an information processing device provided by an embodiment of the disclosure.
  • FIG. 15 is a schematic diagram of displaying a preset storage area provided by an embodiment of the present disclosure.
  • FIG. 16 is a schematic diagram showing another preset storage area provided by an embodiment of the present disclosure.
  • FIG. 17 is a schematic diagram of a sliding operation provided by an embodiment of the disclosure.
  • FIG. 18 is a schematic diagram of a selection operation provided by an embodiment of the disclosure.
  • the embodiments of the present disclosure provide an information processing method, including:
  • Step S1: Determine the information to be processed.
  • determining the information to be processed may include determining the information to be processed in the input area.
  • determining the information to be processed may include determining the information to be stored based on a user's trigger operation.
  • Step S2: Perform an operation on the information to be processed based on the drag operation for the information to be processed.
  • performing the operation on the information to be processed may include: determining the processing method of the information to be processed based on the drag operation for the information to be processed; and performing the corresponding operation on the information to be processed based on the determined processing method.
  • performing an operation on the information to be processed may include: storing the information to be stored when it is detected that the user drags the information to be stored to the preset storage area.
  • the embodiment of the present disclosure provides an information processing device including: a determination module and a processing module.
  • the determining module is used to determine the information to be processed.
  • the processing module is used to perform corresponding operations on the information to be processed based on the drag operation for the information to be processed.
  • the determination module may include a first determination module 201
  • the processing module may include a second determination module 202 and an execution module 203.
  • the first determining module 201 is configured to determine the information to be processed in the input area.
  • the second determining module 202 is configured to determine the processing mode of the information to be processed based on the drag operation for the information to be processed.
  • the determination module may include a third determination module 1401, and the processing module may include a storage module 1402.
  • the third determining module 1401 is configured to determine the information to be stored based on a user's trigger operation when the information to be processed is information to be stored.
  • the storage module 1402 is configured to store the information to be stored when it is detected that the user drags the information to be stored to the preset storage area.
  • the embodiments of the present disclosure provide an information processing method based on an input method, which can be executed by a terminal device. As shown in FIG. 1, the method includes:
  • Step S101: Determine the information to be processed in the input area.
  • the information to be processed may be text information, picture information, or video information.
  • the method for determining the information to be processed in the input area includes: for the preset information in the input area, selecting the information in the preset information through a selection operation, and determining the selected information as the information to be processed.
  • the area pointed to by the label A in Figure 4 represents the input area
  • the area pointed to by the label B represents the keyboard area of the input method.
  • For example, for the preset information "Watermelon in summer" in Figure 4, the information is selected from the preset information through a selection operation, as shown in Figure 5, and the selected "Watermelon in summer" is determined as the information to be processed, that is, the "Watermelon in summer" in Figure 6.
  • Step S102: Determine a processing method of the information to be processed based on the drag operation for the information to be processed.
  • In the embodiments of the present disclosure, the drag operation on the information to be processed may be implemented with a mouse, or may be implemented by detecting the touch movement of the user's finger on the display screen, which is not limited in the embodiments of the present disclosure.
  • the drag operation for the information to be processed may specifically include dragging the information to be processed in the input area to the input area, or dragging the information to be processed in the input area to the input method keyboard area. As shown in Figures 6 and 7, the "Watermelon in summer" in the input area in Figure 6 is dragged to the input method keyboard area, as shown in Figure 7.
  • Step S103: Perform a corresponding operation on the information to be processed based on the determined processing mode.
  • a corresponding operation is performed on the to-be-processed information determined in step S101 based on the processing mode determined in step S102.
  • the embodiments of the present disclosure can perform certain operations on the input information in the input area according to the user's drag operation, and are not limited to only inputting information through the input method keyboard in the input area, thereby meeting the diversified requirements of information interaction and further enhancing the user experience.
  • step S102 may include: determining the drag termination area of the drag operation, and determining the processing mode of the information to be processed based on the drag termination area; or, determining the drag trajectory information of the drag operation, and determining the processing mode of the information to be processed based on the drag trajectory information.
  • the processing mode of the information to be processed can be determined based on the drag termination area or the drag trajectory information.
  • In the embodiments of the present disclosure, the solution in the prior art or the solution shown in step S101 can be used to determine the information to be processed; after the processing method of the information to be processed is determined, the solution in the prior art or the solution shown in step S103 can be used to perform the corresponding operation on the information to be processed, which is not limited in the embodiment of the present disclosure.
  • determining the processing mode of the information to be processed based on the drag trajectory information may specifically include: determining the processing mode of the information to be processed based on the drag trajectory information and the pre-configured correspondence between drag trajectory information and processing modes.
  • a variety of drag trajectory information can be pre-configured, and the processing method corresponding to any one of the pre-configured drag trajectory information can be a text-to-expression processing method, or a processing method in which information is added to an expression panel.
  • Specifically, the drag trajectory information of the drag operation for the information to be processed can be determined; if the drag trajectory information is consistent with a piece of pre-configured drag trajectory information, the processing method of the information to be processed is determined based on the processing method corresponding to that pre-configured drag trajectory information.
  • For example, if the drag trajectory information is a circle and is consistent with the pre-configured circular drag trajectory information, and the processing method corresponding to the pre-configured circular drag trajectory information is "the processing method of adding information to the shortcut phrase panel", then the processing method of the information to be processed is determined to be "the processing method of adding information to the shortcut phrase panel"; similarly, if the drag trajectory is U-shaped and the processing method corresponding to the pre-configured U-shaped drag trajectory information is "the processing method of adding information to the expression panel", the processing method of the information to be processed determined based on the drag trajectory information is "the processing method of adding information to the expression panel".
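The following minimal sketch illustrates how such a pre-configured correspondence between drag trajectories and processing methods could be looked up; the shape labels, mode names, and the assumption that gesture recognition has already reduced the trajectory to a shape label are illustrative and not part of the disclosure.

```python
# Minimal sketch (illustrative, not the disclosure's implementation): the drag
# trajectory is assumed to have been classified upstream into a shape label,
# which is then matched against the pre-configured trajectory-to-mode table.

TRAJECTORY_TO_MODE = {
    "circle": "add_to_shortcut_phrase_panel",   # circular trajectory example above
    "u_shape": "add_to_expression_panel",       # U-shaped trajectory example above
}

def processing_mode_for_trajectory(shape_label):
    """Return the pre-configured processing method for a recognized trajectory,
    or None if the trajectory matches no pre-configured shape."""
    return TRAJECTORY_TO_MODE.get(shape_label)

# A circular drag over the selected text selects "add to shortcut phrase panel".
assert processing_mode_for_trajectory("circle") == "add_to_shortcut_phrase_panel"
```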
  • the above embodiment introduces the processing method of determining the information to be processed based on the drag trajectory.
  • it can also be determined based on the drag termination area corresponding to the drag operation. See the following examples:
  • determining the drag termination area of the drag operation, and determining the processing manner of the information to be processed based on the drag termination area, may include: determining that the drag termination area of the drag operation is the first preset area; displaying at least one processing method corresponding to the first preset area; and, when a trigger operation for any processing method is detected, determining that the processing method of the information to be processed is the processing method corresponding to the trigger operation.
  • the first preset area and its corresponding at least one processing method are pre-configured.
  • When the drag termination area is the first preset area, the at least one processing method corresponding to the first preset area is displayed in a predetermined display area, and when a trigger operation of the user clicking any processing method is detected, it is determined that the processing method of the information to be processed is the processing method corresponding to the trigger operation.
  • the first preset area may be located in the input area or the input method keyboard area, and may also be partly located in the input area and partly located in the input method keyboard area; the predetermined display area may likewise be located in the input area or the input method keyboard area, or partly in each, which is not limited in the embodiment of the present disclosure.
  • For example, when the first preset area is the input method keyboard area and the pre-configured input method keyboard area corresponds to three processing methods, namely "translation", "editing", and "voice conversion", the three processing methods are displayed in a predetermined display area located in the input area; when the trigger operation of the user clicking "translation" is detected, the processing method of the information to be processed "Watermelon in summer" is determined to be language conversion processing.
  • At least one first preset area may be set on the display screen, and the number of processing methods corresponding to each first preset area may be the same or different.
  • For example, two first preset areas can be preset: when the first preset area is the input area, it corresponds to two processing methods; when the first preset area is the input method keyboard area, it corresponds to three processing methods. As shown in FIG. 9, when the first preset area is the input method keyboard area, the three processing methods corresponding to it are "translation", "editing", and "voice conversion".
  • the processing method corresponding to the first preset area may include at least one of: the processing method of converting text to an expression, the processing method of adding information to the expression panel, the processing method of adding information to the shortcut phrase panel, word segmentation, text format adjustment, text-to-speech processing, and language conversion processing; the processing method is not limited to the items listed above.
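As a rough illustration of the first-preset-area flow described above (drop, display the area's configured processing methods, then apply the one the user taps), the sketch below uses assumed area names and method labels; it is not taken from the disclosure.

```python
# Minimal sketch (assumed structure): when a drag terminates in a pre-configured
# "first preset area", the processing methods configured for that area are shown,
# and the one the user then taps is applied to the information being processed.

FIRST_PRESET_AREAS = {
    "keyboard_area": ["translation", "editing", "voice_conversion"],
    "input_area": ["translation", "editing"],
}

def options_for_drop(area_name):
    """Return the processing methods to display for the area where the drag ended,
    or an empty list if the drop area is not a first preset area."""
    return FIRST_PRESET_AREAS.get(area_name, [])

def choose_processing_method(area_name, tapped_option):
    """Validate that the user's tap corresponds to one of the displayed methods."""
    return tapped_option if tapped_option in options_for_drop(area_name) else None

# Example: dropping "Watermelon in summer" on the keyboard area and tapping
# "translation" selects language conversion processing.
assert choose_processing_method("keyboard_area", "translation") == "translation"
```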
  • In another possible implementation of the embodiment of the present disclosure, determining the processing method of the information to be processed based on the drag termination area includes: determining the processing method of the information to be processed based on the drag termination area and the correspondence between areas and processing methods.
  • Specifically, at least one area is pre-configured, and the processing method corresponding to each area is pre-configured; any such area can be an expression area, a phrase area, an editing area, a text-to-speech area, or a translation area, and can be located in the input area, in the input method keyboard area, or partly in the input area and partly in the input method keyboard area.
  • the relevant descriptions of the expression area, phrase area, editing area, text-to-speech area, and translation area are detailed in the following embodiments, and will not be repeated here.
  • When the drag termination area is determined, the area to which it belongs can be identified, and based on the pre-configured processing mode corresponding to that area, the processing mode of the information to be processed is determined to be the processing mode corresponding to that area.
  • For example, area C in the pre-configured input method keyboard area is such an area, and the processing mode corresponding to the pre-configured area C is "text-to-speech processing"; when the drag termination area is area C, the processing method of the information to be processed is determined to be "text-to-speech processing".
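A hedged sketch of this area-based variant follows: the drag end point is hit-tested against pre-configured areas, and the matching area's configured processing mode is selected. The rectangle coordinates, area names, and mode labels are made-up assumptions.

```python
# Minimal sketch: determine which pre-configured area contains the drag end
# point, then look up the processing mode configured for that area,
# e.g. area C -> text-to-speech. Coordinates and labels are illustrative.

from dataclasses import dataclass

@dataclass
class Rect:
    left: float
    top: float
    right: float
    bottom: float

    def contains(self, x, y):
        return self.left <= x <= self.right and self.top <= y <= self.bottom

AREA_RECTS = {"area_C": Rect(0, 600, 360, 700)}       # pre-configured areas
AREA_TO_MODE = {"area_C": "text_to_speech"}           # their processing modes

def mode_for_drag_end(x, y):
    """Return the processing mode of the area containing the drag end point, if any."""
    for name, rect in AREA_RECTS.items():
        if rect.contains(x, y):
            return AREA_TO_MODE.get(name)
    return None

# Example: a drag ending inside area C selects text-to-speech processing.
assert mode_for_drag_end(100, 650) == "text_to_speech"
```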
  • In another possible implementation, determining the drag termination area of the drag operation and determining the processing method of the information to be processed based on the drag termination area may include: when a drag suspension operation for the information to be processed is detected, determining the drag suspension area; when the drag suspension area is the second preset area, determining and displaying the first trigger area of each processing method corresponding to the second preset area; and, when it is detected that the information to be processed continues to be dragged to the first trigger area of any processing method and the drag operation is terminated, determining that the first trigger area is the drag termination area and that the processing method of the information to be processed is the processing method corresponding to the first trigger area.
  • the second preset area and at least one processing method corresponding to the second preset area are preset, and the first trigger area corresponding to each processing method is preset.
  • Specifically, when a drag suspension operation for the information to be processed is detected, the drag suspension area is determined. If the drag suspension area is the second preset area, the first trigger area of each processing method corresponding to the second preset area is determined and displayed, based on the preset second preset area, the at least one processing method corresponding to it, and the first trigger area preset for each processing method. When it is detected that the information to be processed continues to be dragged to the first trigger area of any processing method and the drag operation is terminated, that first trigger area is determined to be the drag termination area, and the processing method of the information to be processed is determined to be the processing method corresponding to that first trigger area.
  • the preset area D corresponds to the two processing methods "translation” and "edit”, and the first trigger area corresponding to each processing method is preset.
  • the second preset area can be located in the input area or the input method keyboard area, and can also be partially located in the input area and partially located in the input method keyboard area; any first trigger area can likewise be located in the input area or the input method keyboard area, or partly in each, which is not limited in the embodiments of the present disclosure.
  • the second preset area is area D, and area D is located in the input method keyboard area.
  • the first trigger area corresponding to "translation" and the first trigger area corresponding to "edit” are both located in the input area.
  • At least one second preset area may be preset, and the number of first trigger areas corresponding to each second preset area may be the same or different.
  • For example, the second preset area preset in the input area corresponds to three first trigger areas, while the second preset area preset in the input method keyboard area corresponds to two first trigger areas; as shown in FIG. 11, the second preset area located in the input method keyboard area corresponds to two first trigger areas, namely the first trigger area corresponding to "translation" and the first trigger area corresponding to "edit".
  • any first trigger area may be an expression area, or a phrase area, or an editing area, or a text-to-speech area, or a translation area, and the first trigger area is not limited to the items listed above. Among them, the relevant descriptions of the expression area, phrase area, editing area, text-to-speech area, and translation area are detailed in the following embodiments, and will not be repeated here.
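The pause-then-reveal flow described above (the drag suspends over a second preset area, the first trigger areas of its processing methods are shown, and the drag then terminates on one of them) could be organized roughly as in the sketch below; the area names, method names, and mode labels are assumptions, not the disclosure's API.

```python
# Minimal sketch of the "suspend, reveal trigger areas, continue dragging" flow.
# Area names, events, and mode labels are illustrative assumptions.

SECOND_PRESET_AREAS = {"area_D": ["translation", "edit"]}

class DragSession:
    def __init__(self):
        self.visible_triggers = []   # first trigger areas currently displayed

    def on_drag_suspended(self, area_name):
        """Drag paused: if over a second preset area, reveal its trigger areas."""
        self.visible_triggers = SECOND_PRESET_AREAS.get(area_name, [])
        return self.visible_triggers

    def on_drag_terminated(self, trigger_name):
        """Drag ended on a displayed first trigger area: that is the processing mode."""
        return trigger_name if trigger_name in self.visible_triggers else None

session = DragSession()
session.on_drag_suspended("area_D")                    # shows "translation" and "edit"
assert session.on_drag_terminated("translation") == "translation"
```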
  • In another possible implementation of the embodiment of the present disclosure, before determining the drag termination area of the drag operation, the method may further include: recognizing the information to be processed and obtaining a recognition result of the information to be processed; and, based on the recognition result, displaying the second trigger area of each of at least one processing method.
  • determining the drag termination area of the drag operation may include: when it is detected that the information to be processed is dragged to the second trigger area of any processing method and the drag operation is terminated, determining the second trigger area as the drag termination area.
  • Specifically, the information to be processed can be recognized and its recognition result obtained; based on the recognition result, the second trigger area of each of the at least one processing method is displayed in a preset third preset area. The third preset area may be located in the input area, in the input method keyboard area, or partly in each, which is not limited in the embodiments of the present disclosure.
  • For example, when the information to be processed is "laugh", "laugh" is recognized and its recognition result is obtained; based on the recognition result, the second trigger area corresponding to "Translation", the second trigger area corresponding to "Expression", and the second trigger area corresponding to "Edit" are displayed in the input area. When it is detected that dragging "laugh" to the second trigger area corresponding to "Expression" is terminated, that second trigger area is determined as the drag termination area, so that the text "laugh" can be converted into a "laugh" expression.
  • any second trigger area may be any one of an expression area, a phrase area, an editing area, a text-to-speech area, and a translation area, and the second trigger area is not limited to the items listed above.
  • the relevant descriptions of the expression area, phrase area, editing area, text-to-speech area, and translation area are detailed in the following embodiments, and will not be repeated here.
  • determining the processing manner of the information to be processed based on the drag termination area may include:
  • if the drag termination area is an expression area, the processing method of the information to be processed includes at least one of the processing method of converting text to an expression and the processing method of adding information to the expression panel; or, if the drag termination area is a phrase area, the processing method of the information to be processed includes: the processing method of adding the information to the shortcut phrase panel; or, if the drag termination area is an editing area, the processing method of the information to be processed includes at least one of word segmentation and text format adjustment; or, if the drag termination area is a text-to-speech area, the processing method of the information to be processed includes: text-to-speech processing; or, if the drag termination area is a translation area, the processing method of the information to be processed includes: language conversion processing.
  • the processing mode corresponding to the expression area may be pre-configured to include at least one of the processing mode of converting text to expression and the processing mode of adding information to the expression panel.
  • the processing method of converting text to an expression converts the information to be processed in text form into a corresponding expression, such as converting the text "laugh" into a "laugh" emoticon; the processing method of adding information to the expression panel adds the expression information to the expression panel, such as adding the "laugh" emoticon to the expression panel.
  • the processing method corresponding to the phrase area may be pre-configured, including the processing method of adding information to the shortcut phrase panel.
  • the processing method of adding information to the shortcut phrase panel is to add the to-be-processed information in text form to the shortcut phrase panel.
  • For example, the information to be processed "Wangjing North Road, Chaoyang District, Beijing", "Watermelon in summer", and "80% of the work schedule to be completed by Friday" is added to the shortcut phrase panel.
  • the processing mode corresponding to the editing area may be pre-configured to include at least one of word segmentation and text format adjustment.
  • word segmentation is to perform word segmentation operations on the information to be processed to obtain at least two text messages. For example, perform word segmentation operations on the information to be processed "Chaoyang North Road Supermarket” to obtain three text information “Chaoyang", “North Road” and “Supermarket”.
  • Text format adjustment is to adjust the font size, font, and tilt of the information to be processed.
  • Further, after the word segmentation operation is performed on the information to be processed to obtain at least two pieces of text information, when it is detected that the user clicks at least two of these pieces of text information, the click sequence corresponding to each clicked piece is determined, and target information is generated based on the clicked pieces of text information and their click sequence. For example, after word segmentation of the information to be processed "Chaoyang North Road Supermarket", three pieces of text information "Chaoyang", "North Road", and "Supermarket" are obtained; when it is detected that the user clicks "Supermarket" and then "North Road" in turn, the target information "Supermarket North Road" is generated.
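A minimal sketch of this editing-area behaviour is shown below: the text is segmented, and target information is rebuilt from the pieces the user clicks, in click order. The `segment` helper simply returns the example's pre-split pieces and stands in for a real word segmenter.

```python
# Minimal sketch: segment the text, then rebuild target information from the
# pieces the user taps, in tap order. The segmenter is faked for the example.

def segment(text):
    """Stand-in for a real word segmenter; returns pre-split pieces for the example."""
    examples = {"Chaoyang North Road Supermarket": ["Chaoyang", "North Road", "Supermarket"]}
    return examples.get(text, text.split())

def build_target(pieces, clicked_indices):
    """Concatenate the clicked pieces in the order they were clicked."""
    return " ".join(pieces[i] for i in clicked_indices)

pieces = segment("Chaoyang North Road Supermarket")
# The user clicks "Supermarket" first, then "North Road".
assert build_target(pieces, [2, 1]) == "Supermarket North Road"
```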
  • the processing mode corresponding to the text-to-speech area may be pre-configured to include text-to-speech processing.
  • text-to-speech processing refers to converting information to be processed in text form into information in speech form, such as converting the text "Chaoyang North Road Supermarket" into machine-voice speech information of "Chaoyang North Road Supermarket".
  • the target voice type can be determined, and the information to be processed can be converted into voice information corresponding to the target voice type.
  • the method for determining the target voice type includes the target voice type preset by the user; or, the target voice type is determined according to the frequency of use of each voice type; or, the target voice type is determined according to the user's selection operation of the voice type. For example, for the to-be-processed information "Chaoyang North Road Supermarket", the determined target voice type is machine sound, then the “Chaoyang North Road Supermarket” is converted into the machine sound voice information of the "Chaoyang North Road Supermarket”.
  • the processing mode corresponding to the translation area can be pre-configured to include language conversion processing, where language conversion processing converts information to be processed in one language into information in another language, such as converting the Chinese form of "summer" into the English form "summer".
  • the target language type can be determined, and the information to be processed can be converted into information corresponding to the target language type.
  • the method of determining the target language type includes: the target language type is preset by the user; or, the target language type is determined according to the frequency of use of each language type; or, the target language type is determined according to the user's selection operation for the language type. For example, for the information to be processed "summer" in Chinese, if the determined target language type is English, the Chinese "summer" is converted into the English "summer".
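The choice of a target type (voice type or language type) described above could be sketched as below; the priority order among the three strategies (explicit selection, then user preset, then use frequency) is an assumption, as are the names.

```python
# Minimal sketch: pick a target voice/language type. The priority order among
# the three strategies is an illustrative assumption.

def determine_target_type(user_selection=None, user_preset=None, usage_counts=None):
    """Return the target type from an explicit selection, a preset, or usage frequency."""
    if user_selection:
        return user_selection
    if user_preset:
        return user_preset
    if usage_counts:
        return max(usage_counts, key=usage_counts.get)   # most frequently used type
    return None

# Example: no selection or preset, so the most used language type is chosen.
assert determine_target_type(usage_counts={"English": 12, "French": 3}) == "English"
```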
  • the method may further include: displaying processing information corresponding to the determined processing manner.
  • step S103 may include: when a confirmation operation for the processing information is detected, performing a corresponding operation on the to-be-processed information based on the determined processing mode.
  • the processing method of the information to be processed can be determined based on the user's drag operation on the information to be processed, and the processing information corresponding to the determined processing method can be displayed, so that the user understands the drag operation and its corresponding processing method, which can effectively prevent misoperation by the user.
  • Moreover, the corresponding operation is performed on the information to be processed based on the determined processing method only when a confirmation operation for the displayed processing information is detected; because the user is required to further confirm the displayed processing information, misoperation can be further prevented.
  • the input method-based information processing device 20 may include: a first determination module 201, a second determination module 202, and an execution module 203, wherein:
  • the first determining module 201 is configured to determine the information to be processed in the input area.
  • the second determining module 202 is configured to determine the processing mode of the information to be processed based on the drag operation for the information to be processed.
  • the execution module 203 is configured to execute corresponding operations on the information to be processed based on the determined processing mode.
  • the second determining module 202 may include any one of the first determining unit and the second determining unit, where:
  • the first determining unit is used to determine the drag termination area of the drag operation, and determine the processing mode of the information to be processed based on the drag termination area.
  • the second determining unit is used to determine the drag trajectory information of the drag operation, and determine the processing mode of the information to be processed based on the drag trajectory information.
  • the first determining unit includes a first determining subunit, a display subunit, and a second determining subunit, where:
  • the first determining subunit is used to determine that the drag termination area of the drag operation is the first preset area.
  • the display subunit is configured to display at least one processing method corresponding to the first preset area.
  • the second determining subunit is configured to determine that the processing method of the information to be processed is the processing method corresponding to the trigger operation when a trigger operation for any processing method is detected.
  • the first determining unit includes a third determining subunit, where:
  • the third determining subunit is used to determine the processing method of the information to be processed based on the correspondence between the area and the processing method and the drag termination area.
  • the first determining unit includes a fourth determining subunit, a fifth determining subunit, a sixth determining subunit, and a seventh determining subunit, where:
  • the fourth determining subunit is used to determine the drag suspension area when the drag suspension operation for the to-be-processed information is detected.
  • the fifth determining subunit is configured to determine and display the first trigger area of each processing method corresponding to the second preset area when the dragging suspension area is the second preset area.
  • the sixth determining subunit is used to determine that the first trigger area is the drag termination area when it is detected that the information to be processed is continued to be dragged to the first trigger area of any processing method and the drag operation is terminated.
  • the seventh determining subunit is used to determine that the processing mode of the information to be processed is the processing mode corresponding to the first trigger area.
  • the input method-based information processing device 20 further includes an identification module and a first display module, where:
  • the identification module is used to identify the information to be processed and obtain the identification result of the information to be processed.
  • the first display module is configured to display the second trigger area of each of the at least one processing method based on the recognition result.
  • the first determining unit includes an eighth determining subunit, wherein:
  • the eighth determining subunit is configured to determine that the second trigger area is the drag termination area when it is detected that the information to be processed is dragged to the second trigger area of any processing method and the drag operation is terminated.
  • the first determining unit is specifically configured to:
  • if the drag termination area is an expression area, determine that the processing method of the information to be processed includes at least one of: the processing method of converting text to an expression and the processing method of adding information to the expression panel.
  • if the drag termination area is a phrase area, determine that the processing method of the information to be processed includes: the processing method of adding the information to the shortcut phrase panel.
  • if the drag termination area is an editing area, determine that the processing method of the information to be processed includes at least one of word segmentation and text format adjustment.
  • if the drag termination area is a text-to-speech area, determine that the processing method of the information to be processed includes: text-to-speech processing.
  • if the drag termination area is a translation area, determine that the processing method of the information to be processed includes: language conversion processing.
  • the input method-based information processing device 20 further includes a second display module, where:
  • the second display module is used to display processing information corresponding to the determined processing mode.
  • the execution module 203 is specifically configured to perform a corresponding operation on the information to be processed based on the determined processing mode when a confirmation operation for the processing information is detected.
  • the first determining module 201 and the second determining module 202 may be the same type of determining module or two different determining modules; the first display module and the second display module may be the same type of display module or two different display modules.
  • the first determining unit and the second determining unit may be the same type of determining unit or two different determining units; the first determining subunit, the second determining subunit, the third determining subunit, the fourth determining subunit, the fifth determining subunit, the sixth determining subunit, the seventh determining subunit, and the eighth determining subunit may be the same type of determining subunit, different determining subunits, or arbitrarily combined into the same subunit, which is not limited in the embodiment of the present disclosure.
  • the input method-based information processing device 20 of the embodiment of the present disclosure can execute the input method-based information processing method shown in the foregoing method embodiment of the present disclosure, and the implementation principles are similar, and will not be repeated here.
  • the embodiments of the present disclosure provide an information processing method, which can be executed by a terminal device. As shown in FIG. 13, the method includes:
  • Step S1301: Determine the information to be stored based on the trigger operation of the user.
  • The user's trigger operation is the user's selection operation for the information to be stored, which may include at least one of a click operation on the information to be stored and a long-press operation on the information to be stored. For example, when it is detected that the user long-presses the text information "Watermelon in summer", "Watermelon in summer" is determined as the information to be stored.
  • Step S1302: When it is detected that the user drags the information to be stored to the preset storage area, store the information to be stored.
  • Specifically, the information to be stored is stored in the storage location corresponding to the preset storage area. The preset storage area may be an area for storing at least one of text information, expression information, link information, audio information, and video information, and may include at least one of a clipboard area, an expression area, a shortcut phrase area, a link area, an audio area, a video area, and an encryption area; the storage location corresponding to the preset storage area may be located in the memory.
  • the expression area is an area for storing expression information
  • the shortcut phrase area is an area for storing text information that is not private to the user
  • the link area is an area for storing link information
  • the video area is an area for storing video information
  • the encryption area is an area for storing the user's private information.
  • For example, when it is detected that the user drags the information to be stored "Watermelon in summer" to the shortcut phrase area, "Watermelon in summer" is stored in the storage location corresponding to the shortcut phrase area.
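As a rough sketch of this storage step, each preset storage area is given its own storage location, and dropping the information on an area appends it there; the in-memory dictionaries are stand-ins for real storage locations, not the disclosure's implementation.

```python
# Minimal sketch (assumed layout): each preset storage area maps to its own
# storage location; dropping the information on an area stores it there.

STORES = {
    "shortcut_phrase_area": [],
    "expression_area": [],
    "encryption_area": [],
}

def store_on_drop(info, drop_area):
    """Store the dragged information in the location for the area it was dropped on."""
    store = STORES.get(drop_area)
    if store is None:
        return False          # not a preset storage area; nothing is stored
    store.append(info)
    return True

# Example: "Watermelon in summer" dropped on the shortcut phrase area.
assert store_on_drop("Watermelon in summer", "shortcut_phrase_area")
assert "Watermelon in summer" in STORES["shortcut_phrase_area"]
```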
  • In a possible implementation, the data corresponding to the information to be stored is first dumped to a preset system memory; when it is detected that the user has dragged the information to be stored to the preset storage area, the data corresponding to the information to be stored that has been dumped to the preset system memory is transferred to the storage location corresponding to the preset storage area.
  • In the embodiments of the present disclosure, the determined information to be stored is stored by directly dragging it to the preset storage area, which requires neither manual re-entry of the information to be stored nor copy or paste operations with switching to a corresponding operation interface, thereby reducing the complexity of information storage, saving information storage time, and improving user experience.
  • In a possible implementation, step S1301 includes at least one of: determining the information to be stored based on the user's trigger operation in an editable area; and determining the information to be stored based on the user's trigger operation in a non-editable area.
  • Specifically, when a user's trigger operation is detected in an editable area, the information to be stored is determined; and/or, when a user's trigger operation is detected in a non-editable area, such as a chat box area, the information to be stored is determined.
  • For example, the area in which messages have already been sent between user A and user B is the chat box area.
  • Step S1302 can then be further performed, that is, when it is detected that the user drags the information to be stored to the preset storage area, the information to be stored is stored. Further, the stored information can be displayed in the preset storage area to make it easier for the user to find, as follows:
  • In a possible implementation, the method may further include: determining the display position of the stored information in the preset storage area, where the stored information is the information to be stored after it has been stored; and displaying the stored information at the determined display position in the preset storage area.
  • determining the display position of the stored information in the preset storage area may include at least one of the following steps A1, A2, and A3.
  • step A1 may include: when it is detected that the user drags the information to be stored to the preset storage area, determining the drag end position of the information to be stored, and determining the display position of the stored information in the preset storage area based on the drag end position of the information to be stored.
  • For step A1, the embodiment of the present disclosure provides an example. When the drag end position in the preset storage area is between the display position of "Wangjing North Road, Chaoyang District, Beijing" and the display position of "80% of the work schedule to be completed by Friday", it is determined, based on the drag end position, that the display position of the stored information "Watermelon in summer" in the preset storage area is before the display position of "80% of the work schedule to be completed by Friday", and the stored information "Watermelon in summer" is displayed at the determined display position. As shown in Figure 8, the display position of "Watermelon in summer" is between the display position of "Wangjing North Road, Chaoyang District, Beijing" and the display position of "80% of the work schedule to be completed by Friday".
  • step A2 may include: determining the display position of the stored information in the preset storage area based on the storage time of the stored information.
  • For step A2, the embodiment of the present disclosure provides an example. Specifically, the storage time of "Wangjing North Road, Chaoyang District, Beijing" is 12:00 on October 11, 2019, the storage time of "Watermelon in summer" is 12:00 on October 12, 2019, and the storage time of "80% of the work schedule to be completed by Friday" is 13:11 on October 12, 2019. Based on the storage time of each piece of stored information, its display position in the preset storage area is determined; as shown in Figure 8, the order of the display positions in the preset storage area is "Wangjing North Road, Chaoyang District, Beijing", "Watermelon in summer", and "80% of the work schedule to be completed by Friday".
  • step A3 may include: based on the information type of the stored information, determining the display area corresponding to the information type in the preset storage area, and determining the display position of the stored information in the determined display area.
  • In this implementation, the preset storage area includes at least one of a clipboard area, an expression area, a shortcut phrase area, a link area, an audio area, a video area, and an encryption area; that is, the display area may be any one of the clipboard area, expression area, shortcut phrase area, link area, audio area, video area, and encryption area.
  • the embodiments of the present disclosure provide the following examples.
  • When the stored information is the ordinary text information "Watermelon in summer", based on its information type, the display area corresponding to that information type in the preset storage area is determined to be the shortcut phrase area; the display position of the stored information "Watermelon in summer" in the shortcut phrase area is determined, and the stored information "Watermelon in summer" is displayed at that display position in the shortcut phrase area.
  • When the stored information is the special text information "login password XXXXX", the display area corresponding to its information type in the preset storage area is determined to be the encryption area; the display position of "login password XXXXX" in the encryption area is determined, and the stored information "login password XXXXX" is displayed at that display position in the encryption area.
  • the above details the method of displaying the stored information in the preset storage area.
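A minimal sketch of steps A1 and A3 follows: A1 inserts the newly stored item at the slot implied by the drag end position, and A3 picks a display area from the item's information type. The slot index and the type-to-area table are assumptions for illustration only.

```python
# Minimal sketch of steps A1 and A3. Slot computation and the type-to-area
# table are illustrative assumptions.

AREA_FOR_TYPE = {"text": "shortcut_phrase_area", "password": "encryption_area"}

def insert_at_drop(displayed, new_item, drop_index):
    """A1: place the stored item at the position where the drag ended."""
    displayed = list(displayed)
    displayed.insert(drop_index, new_item)
    return displayed

def display_area_for(info_type):
    """A3: choose the display area matching the information type."""
    return AREA_FOR_TYPE.get(info_type, "clipboard_area")

before = ["Wangjing North Road, Chaoyang District, Beijing",
          "80% of the work schedule to be completed by Friday"]
after = insert_at_drop(before, "Watermelon in summer", 1)
assert after[1] == "Watermelon in summer"
assert display_area_for("password") == "encryption_area"
```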
  • the following describes the method of adjusting the display position of the information in the preset storage area, so that the user can find the information in the preset storage area more conveniently and improve the user experience.
  • the information processing method may further include: when a preset condition is met, adjusting the display positions of at least two pieces of information displayed in the preset storage area.
  • satisfying the preset condition includes at least one of: detecting that a preset time is reached; detecting a preset operation instruction triggered by the user; and detecting the user's drag operation for any information displayed in the preset storage area.
  • In the embodiments of the present disclosure, the display positions of at least two pieces of information displayed in the preset storage area can be adjusted, and the adjustment is not limited to being performed before step S1301, nor is it limited to being performed after step S1302.
  • Specifically, when it is detected that the preset time is reached, the display positions of at least two pieces of information displayed in the preset storage area are adjusted; or, when a preset operation instruction triggered by the user is detected, such as a preset operation instruction triggered by the user through an adjustment control in the preset storage area, the display positions of at least two pieces of information displayed in the preset storage area are adjusted, where the preset operation instruction is used to control the adjustment of the display positions; or, when the user's drag operation for any information displayed in the preset storage area is detected, such as a drag operation for the "Watermelon in summer" displayed in the preset storage area, the display positions of at least two pieces of information displayed in the preset storage area are adjusted.
  • adjusting the display positions of the at least two pieces of information displayed in the preset storage area may specifically include at least one of the following steps B1 and B2.
  • step B1 may include: when it is detected that the preset time is reached or when the preset operation instruction triggered by the user is detected, adjusting the display positions of at least two pieces of information displayed in the preset storage area according to at least one of the information use frequency and the information storage time.
  • Specifically, the information displayed in the preset storage area can be sorted in order of information use frequency or in order of information storage time, and the display positions of at least two pieces of information displayed in the preset storage area can be adjusted according to the sorted result.
  • For example, for the information displayed in the preset storage area, the order of the original display positions is "Wangjing North Road, Chaoyang District, Beijing", "Watermelon in summer", and "80% of the work schedule to be completed by Friday". The storage time of "Wangjing North Road, Chaoyang District, Beijing" is 12:00 on October 10, 2019, the storage time of "Watermelon in summer" is 17:35 on October 12, 2019, and the storage time of "80% of the work schedule to be completed by Friday" is 13:00 on October 11, 2019. Sorting the information in order of storage time gives "Wangjing North Road, Chaoyang District, Beijing", "80% of the work schedule to be completed by Friday", and "Watermelon in summer", so the display position of "Watermelon in summer" and the display position of "80% of the work schedule to be completed by Friday" are exchanged; that is, as shown in Figure 16, the order of the adjusted display positions is "Wangjing North Road, Chaoyang District, Beijing", "80% of the work schedule to be completed by Friday", and "Watermelon in summer".
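Step B1's re-sorting by storage time could look roughly like the sketch below, which reproduces the adjusted order of the example above; the (text, storage time) tuple layout is an assumption.

```python
# Minimal sketch of step B1: re-sort the displayed items by storage time.
# The (text, storage_time) tuple layout is an illustrative assumption.

from datetime import datetime

items = [
    ("Wangjing North Road, Chaoyang District, Beijing", datetime(2019, 10, 10, 12, 0)),
    ("Watermelon in summer", datetime(2019, 10, 12, 17, 35)),
    ("80% of the work schedule to be completed by Friday", datetime(2019, 10, 11, 13, 0)),
]

def adjust_by_storage_time(displayed):
    """Return the display order sorted by storage time, earliest first."""
    return [text for text, stored_at in sorted(displayed, key=lambda it: it[1])]

# Matches the adjusted order described above (see Figure 16).
assert adjust_by_storage_time(items) == [
    "Wangjing North Road, Chaoyang District, Beijing",
    "80% of the work schedule to be completed by Friday",
    "Watermelon in summer",
]
```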
  • step B2 may include: when a user's drag operation for any information displayed in the preset storage area is detected, adjusting the display positions of at least two pieces of information displayed in the preset storage area based on the drag termination position.
  • In one manner, when a user's drag operation for any information displayed in the preset storage area is detected, the drag end position can be determined, the insertion position of that information in the preset storage area can be determined based on the drag end position, that information can be inserted at the insertion position, and the display position of at least one other piece of information in the preset storage area can be adjusted accordingly. For example, for the information displayed in the preset storage area, the order of the original display positions is "Wangjing North Road, Chaoyang District, Beijing", "Watermelon in summer", and "80% of the work schedule to be completed by Friday".
  • In another manner, the drag end position can be determined, the information whose display positions are to be adjusted in the preset storage area can be determined based on the drag end position, and the display positions of the two determined pieces of information in the preset storage area can be adjusted. For example, for the information displayed in the preset storage area, the order of the original display positions is "Wangjing North Road, Chaoyang District, Beijing", "Watermelon in summer", and "80% of the work schedule to be completed by Friday".
  • In a possible implementation, the information processing method may further include: when it is detected that the user drags the information displayed in the preset storage area to the editing area, inserting the dragged information into the corresponding position in the editing area.
  • Inserting the dragged information into the corresponding position of the editing area can include at least one of: determining the drag end position of the dragged information in the editing area and inserting the dragged information into the editing area according to the drag end position; and inserting the dragged information at the cursor position in the editing area.
  • In the embodiments of the present disclosure, the dragged information can be inserted into the corresponding position of the editing area, and this is not limited to being performed before step S1301, nor is it limited to being performed after step S1302.
  • In one manner, when it is detected that the user drags the information displayed in the preset storage area to the editing area, the drag end position of the dragged information in the editing area is determined, the insertion position in the editing area is determined according to the drag end position, and the dragged information is inserted at the insertion position of the editing area.
  • In another manner, the cursor position of the cursor in the editing area is determined, and the dragged information is inserted at the cursor position of the editing area. For example, when it is detected that the user drags the information "watermelon in summer" displayed in the preset storage area to the editing area, and the cursor position in the editing area is determined to be after "Little Bear loves to eat", "watermelon in summer" is inserted there, that is, the information displayed in the editing area is "Little Bear loves to eat watermelon in summer".
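A minimal sketch of inserting the dragged information at the cursor position of the editing area, matching the example above; the plain string offset used for the cursor position is an assumption.

```python
# Minimal sketch: insert dragged text at the current cursor offset of the
# editing area's text. The string-offset cursor model is an assumption.

def insert_at_cursor(edit_text, cursor_pos, dragged):
    """Insert the dragged text at the cursor offset and return the new content."""
    return edit_text[:cursor_pos] + dragged + edit_text[cursor_pos:]

edit_text = "Little Bear loves to eat "
result = insert_at_cursor(edit_text, len(edit_text), "watermelon in summer")
assert result == "Little Bear loves to eat watermelon in summer"
```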
  • In a possible implementation, the information processing method may further include: when it is detected that the user triggers at least one of the following operation mode C1, operation mode C2, operation mode C3, and operation mode C4, loading and displaying the preset storage area.
  • the step of loading and displaying the preset storage area may be performed before step S1302, or before the step of inserting the dragged information at the corresponding position in the editing area when it is detected that the user drags information displayed in the preset storage area to the editing area; this is not limited in the embodiment of the present disclosure.
  • the operation mode C1 specifically includes: a drag operation for the information to be stored, and the drag end position belongs to the first preset area.
  • Specifically, when it is detected that the user's trigger on the information to be stored lasts longer than a preset time threshold, the information to be stored is floated so that the user can select it and perform a move operation; when it is detected that the user selects and moves the floated information to be stored, it is determined that the user's operation on the information to be stored is a drag operation on the information to be stored.
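A rough Kotlin sketch of this long-press-then-move recognition follows; the threshold value and the state names are assumptions made for illustration rather than values taken from the disclosure.

```kotlin
// Minimal sketch of recognizing "long press floats the item, then a move
// starts the drag"; the threshold and state names are illustrative.
const val LONG_PRESS_THRESHOLD_MS = 500L

sealed interface TouchState
object Idle : TouchState
data class Floating(val itemId: String) : TouchState   // item lifted after a long press
data class Dragging(val itemId: String) : TouchState   // user moved the floated item

fun onPressHeld(itemId: String, heldMs: Long, state: TouchState): TouchState =
    // A press held longer than the threshold floats the information to be stored.
    if (state is Idle && heldMs >= LONG_PRESS_THRESHOLD_MS) Floating(itemId) else state

fun onMove(state: TouchState): TouchState =
    // Moving a floated item is interpreted as the start of a drag operation.
    if (state is Floating) Dragging(state.itemId) else state
```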
  • the operation mode C2 specifically includes: a click operation on the second preset area.
  • For example, the second preset area is the identification area corresponding to the preset storage area; when a click operation on this identification area is detected, the preset storage area is loaded and displayed.
  • the operation mode C3 specifically includes: a sliding operation, and the sliding trajectory information of the sliding operation matches the preset trajectory information, or the starting position of the sliding operation belongs to the third preset area, and the sliding direction is the preset sliding direction.
  • The embodiment of the present disclosure provides an example. Specifically, when it is detected that a user triggers a sliding operation, the sliding trajectory information of the sliding operation is circular trajectory information, and the circular trajectory information matches the preset circular trajectory information, the preset storage area is loaded and displayed; or, when it is detected that the user triggers a sliding operation, the starting position of the sliding operation belongs to the third preset area, and the sliding direction is the preset sliding direction, the preset storage area is loaded and displayed. For example, as shown in FIG. 17, when it is detected that a user triggers a sliding operation, the starting position of the sliding operation is in the third preset area, and the sliding direction is the preset sliding direction indicated by the arrow, the preset storage area is loaded and displayed.
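The edge-swipe variant of operation mode C3 can be sketched as a simple geometric check, assuming the third preset area is a rectangular region and the preset sliding direction is one of four axis-aligned directions; both assumptions are illustrative.

```kotlin
// Minimal sketch: decide whether a swipe should reveal the preset storage area.
// Both the edge region and the required direction are assumed configuration values.
data class Point(val x: Float, val y: Float)

data class EdgeRegion(val left: Float, val top: Float, val right: Float, val bottom: Float) {
    fun contains(p: Point) = p.x in left..right && p.y in top..bottom
}

enum class Direction { LEFT, RIGHT, UP, DOWN }

fun swipeDirection(start: Point, end: Point): Direction {
    val dx = end.x - start.x
    val dy = end.y - start.y
    return if (kotlin.math.abs(dx) >= kotlin.math.abs(dy)) {
        if (dx >= 0) Direction.RIGHT else Direction.LEFT
    } else {
        if (dy >= 0) Direction.DOWN else Direction.UP
    }
}

fun shouldShowStorageArea(
    start: Point, end: Point,
    thirdPresetArea: EdgeRegion, presetDirection: Direction
): Boolean =
    // Starting inside the third preset area and sliding in the preset direction
    // triggers loading and displaying the preset storage area.
    thirdPresetArea.contains(start) && swipeDirection(start, end) == presetDirection
```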
  • the operation mode C4 specifically includes: a selection operation for the information to be stored.
  • As shown in Figure 18, when it is detected that the user triggers a selection operation for the information to be stored "Watermelon in summer", the preset storage area is loaded and displayed, which makes it convenient for the user to drag the information to be stored to the preset storage area and speeds up the storage of information.
  • step S1302 may specifically include any one of the following steps D1 and D2, which are introduced in detail below.
  • Step D1 may include: when it is detected that the user drags the information to be stored from one application to a preset storage area in another application, storing the information to be stored.
  • Specifically, the data corresponding to the information to be stored is transferred from the storage location corresponding to one application to the preset system memory; when it is detected that the user drags the information to be stored from that application to the preset storage area in another application, the data corresponding to the information to be stored is transferred from the preset system memory to the storage location corresponding to the other application.
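One way to picture the role of the preset system memory is as a shared staging buffer between the drag source and the drop target. The Kotlin sketch below is only an in-process illustration; a real cross-application transfer would rely on the platform's drag-and-drop or clipboard facilities, and the object and method names are assumptions.

```kotlin
// Minimal sketch of staging dragged data in a "preset system memory" so that
// a drop target in another application (or another functional area) can
// retrieve it. The shared-buffer object is an illustrative stand-in only.
object PresetSystemMemory {
    private var staged: String? = null
    fun stage(data: String) { staged = data }            // called when the drag starts
    fun take(): String? = staged.also { staged = null }  // called when the drop is accepted
}

class StorageArea(private val items: MutableList<String> = mutableListOf()) {
    fun acceptDrop() {
        // Move the staged data into this area's own storage location.
        PresetSystemMemory.take()?.let { items.add(it) }
    }
    fun contents(): List<String> = items
}
```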
  • Step D2 may include: when it is detected that the user drags the information to be stored from the first functional area to the preset storage area located in the second functional area, storing the information to be stored.
  • The first functional area and the second functional area correspond to the same application.
  • Specifically, the data corresponding to the information to be stored is transferred from the storage location corresponding to the first functional area to the preset system memory; when it is detected that the user drags the information to be stored from the first functional area to the preset storage area in the second functional area, the data corresponding to the information to be stored is transferred from the preset system memory to the storage location corresponding to the second functional area.
  • the storage location corresponding to the first functional area may be the same as or different from the storage location corresponding to the second functional area, which is not limited in the embodiment of the present disclosure.
  • step S1302 may specifically include: when it is detected that the user drags the information to be stored from the dialog box area of the instant messaging to the preset storage area, storing the information to be stored.
  • the dialog box area and the preset storage area of the instant messaging are located in different applications, or located in different functional areas of the same application.
  • the preset storage area includes a shortcut phrase area
  • the shortcut phrase area and the instant messaging dialog box area can be located in different applications.
  • the instant messaging dialog box area refers to the area where the user sends and receives Internet messages in real time, and the shortcut phrase area is located in a system application that can be called at any time; for example, the shortcut phrase area can be located in the input method application that comes with the system.
  • When it is detected that the user triggers at least one of operation mode C1, operation mode C2, operation mode C3, and operation mode C4, the shortcut phrase area is loaded and displayed so that the user can drag the information to be stored into the shortcut phrase area.
  • the data corresponding to "Watermelon in summer" is stored in the storage location corresponding to the shortcut phrase area.
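Persisting a phrase dropped into the shortcut phrase area could look roughly like the following sketch, where a plain text file stands in for the storage location corresponding to the shortcut phrase area; the file-based store is an assumption for illustration, not the disclosed storage mechanism.

```kotlin
import java.io.File

// Minimal sketch: persist a phrase dropped into the shortcut phrase area so the
// input method can offer it later. A text file stands in for the area's storage.
class ShortcutPhraseStore(private val file: File) {
    fun add(phrase: String) {
        file.appendText(phrase + "\n")   // storage location corresponding to the shortcut phrase area
    }
    fun all(): List<String> =
        if (file.exists()) file.readLines().filter { it.isNotBlank() } else emptyList()
}

fun main() {
    val store = ShortcutPhraseStore(File("shortcut_phrases.txt"))
    store.add("Watermelon in summer")   // dropped from the instant messaging dialog box area
    println(store.all())
}
```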
  • the dialog box area and the preset storage area of instant messaging can be located in different functional areas of the same application.
  • the preset storage area can be located in the favorite area of instant messaging.
  • the method may further include: when it is detected that the user drags information in the instant messaging dialog box area to the instant messaging information input area, and when the user's sending operation for the information in the information input area is detected, sending the information in the information input area.
  • Specifically, the dragged information is displayed at the corresponding display position in the information input area to realize rapid input of information. Further, when the user's sending operation for the information in the information input area is detected, the information in the information input area is sent.
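The drop-then-send behaviour of the information input area can be sketched as below, with the `send` callback standing in for the messaging client's own send path; the class and callback are illustrative assumptions.

```kotlin
// Minimal sketch: a dropped message is shown in the information input area and
// sent only when the user triggers the send operation.
class MessageInputArea(private val send: (String) -> Unit) {
    var draft: String = ""
        private set

    fun onMessageDropped(text: String) {
        draft = text            // display the dragged information for quick input
    }

    fun onSendPressed() {
        if (draft.isNotEmpty()) {
            send(draft)         // forward the drafted text to the conversation
            draft = ""
        }
    }
}
```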
  • the information processing method is described in detail from the perspective of method steps above, and the information processing device is introduced from the perspective of a virtual module or a virtual unit as follows.
  • the information processing device 140 may include: a third determining module 1401 and a storage module 1402.
  • the third determining module 1401 is configured to determine the information to be stored based on the trigger operation of the user.
  • the storage module 1402 is configured to store the information to be stored when it is detected that the user drags the information to be stored to the preset storage area.
  • the information processing device 140 further includes a fourth determining module and a display module.
  • the fourth determining module is used to determine the display position of the stored information in the preset storage area.
  • The stored information is the information obtained after the information to be stored has been stored.
  • the display module is used to display the stored information at the determined display position.
  • the fourth determining module includes at least one of a third determining unit, a fourth determining unit, and a fifth determining unit.
  • the third determining unit is used to determine the drag end position of the information to be stored when it is detected that the user drags the information to be stored to the preset storage area, and to determine, based on the drag end position of the information to be stored, the display position of the stored information in the preset storage area.
  • the fourth determining unit is configured to determine the display position of the stored information in the preset storage area based on the storage time of the stored information.
  • the fifth determining unit is configured to determine the display area corresponding to the information type in the preset storage area based on the information type of the stored information, and determine the display position of the stored information in the determined display area.
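The three placement strategies attributed to the third, fourth, and fifth determining units can be summarized in one dispatch function, sketched below; the strategy enum, the grouping rule, and the time ordering are assumptions chosen for illustration.

```kotlin
// Minimal sketch of three placement strategies for newly stored information:
// by drag end position, by storage time, or grouped by information type.
data class Stored(val text: String, val type: String, val storedAtMillis: Long)

enum class PlacementStrategy { BY_DRAG_END_POSITION, BY_STORAGE_TIME, BY_INFO_TYPE }

fun placeItem(
    existing: MutableList<Stored>, item: Stored,
    strategy: PlacementStrategy, dropIndex: Int? = null
) {
    when (strategy) {
        PlacementStrategy.BY_DRAG_END_POSITION ->
            existing.add((dropIndex ?: existing.size).coerceIn(0, existing.size), item)
        PlacementStrategy.BY_STORAGE_TIME -> {
            existing.add(item)
            existing.sortBy { it.storedAtMillis }   // oldest first, for example
        }
        PlacementStrategy.BY_INFO_TYPE -> {
            // Insert after the last item of the same type so items of a type stay grouped.
            val last = existing.indexOfLast { it.type == item.type }
            existing.add(if (last >= 0) last + 1 else existing.size, item)
        }
    }
}
```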
  • the information processing device 140 further includes an adjustment module.
  • the adjustment module is used to adjust the display positions of at least two pieces of information displayed in the preset storage area when the preset conditions are met.
  • satisfying the preset condition includes at least one of the following: detecting that the preset time is reached, detecting a preset operation instruction triggered by the user, and detecting the user's drag operation for any information displayed in the preset storage area.
  • the adjustment module includes at least one of a first adjustment unit and a second adjustment unit.
  • the first adjustment unit is configured to adjust the display positions of at least two pieces of information displayed in the preset storage area according to at least one of the information use frequency and the information storage time when it is detected that the preset time is reached or a preset operation instruction triggered by the user is detected.
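A possible reordering rule for the first adjustment unit is sketched below, sorting first by information use frequency and then by storage time; both fields and the chosen ordering are illustrative assumptions.

```kotlin
// Minimal sketch of periodically reordering the preset storage area by usage
// frequency and then by storage time.
data class AreaEntry(val text: String, val useCount: Int, val storedAtMillis: Long)

fun reorderByUsage(entries: List<AreaEntry>): List<AreaEntry> =
    entries.sortedWith(
        compareByDescending<AreaEntry> { it.useCount }   // more frequently used first
            .thenByDescending { it.storedAtMillis }      // then the most recently stored
    )
```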
  • the second adjustment unit is configured to adjust the display positions of at least two pieces of information displayed in the preset storage area based on the drag termination position when a drag operation of the user for any information displayed in the preset storage area is detected.
  • the information processing device 140 further includes an insertion module.
  • the inserting module is used to insert the dragged information into the corresponding position of the editing area when it is detected that the user drags the information displayed in the preset storage area to the editing area.
  • the insertion module includes at least one of a first insertion unit and a second insertion unit.
  • the first insertion unit is used to determine the drag end position of the dragged information in the editing area, and to insert the dragged information in the editing area according to the drag end position.
  • the second insertion unit is used to insert the dragged information at the cursor position in the editing area.
  • the information processing device 140 further includes a loading display module.
  • the loading display module is used to load and display the preset storage area when it is detected that the user triggers at least one of the following operation modes 1-4.
  • Operation mode 1 may include: a drag operation for the information to be stored, and the drag end position belongs to the first preset area;
  • operation mode 2 may include: a click operation for the second preset area;
  • operation mode 3 may include: a sliding operation, where the sliding trajectory information of the sliding operation matches the preset trajectory information, or the starting position of the sliding operation belongs to the third preset area and the sliding direction is the preset sliding direction;
  • operation mode 4 may include: a selection operation for the information to be stored.
  • the third determining module 1401 is specifically configured to perform at least one of: determining the information to be stored based on the user's trigger operation in an editable area, and determining the information to be stored based on the user's trigger operation in a non-editable area.
  • the storage module 1402 may specifically include any one of a first storage unit and a second storage unit.
  • the first storage unit is configured to store the information to be stored when it is detected that the user drags the information to be stored from one application to a preset storage area in another application;
  • the second storage unit is used to store the information to be stored when it is detected that the user drags the information to be stored from the first functional area to the preset storage area located in the second functional area, where the first functional area and the second functional area correspond to the same application.
  • the storage module 1402 is specifically used to store the information to be stored when it is detected that the user drags the information to be stored from the dialog box area of the instant messaging to the preset storage area;
  • where the instant messaging dialog box area and the preset storage area are located in different applications, or in different functional areas of the same application.
  • When the storage module 1402 stores the information to be stored upon detecting that the user drags the information to be stored from the instant messaging dialog box area to the preset storage area, it is specifically used to store the information to be stored when it is detected that the user drags the information to be stored from the instant messaging dialog box area to the shortcut phrase area.
  • the device 140 may further include a sending module.
  • the sending module is used to send the information in the information input area when it is detected that the user drags information in the instant messaging dialog box area to the instant messaging information input area and the user's sending operation for the information in the information input area is detected.
  • It should be noted that the third determining module 1401 and the fourth determining module may be the same determining module or two different determining modules; the third determining unit, the fourth determining unit, and the fifth determining unit may be the same determining unit, may be three different determining units, or may be arbitrarily combined into the same determining unit; the first adjustment unit and the second adjustment unit may be the same adjustment unit or two different adjustment units; the first insertion unit and the second insertion unit may be the same insertion unit or two different insertion units; and the first storage unit and the second storage unit may be the same storage unit or two different storage units. This is not limited in the embodiment of the present disclosure.
  • the information processing device 140 of the embodiment of the present disclosure can execute an information processing method provided by the method embodiment of the present disclosure, and its implementation principles are similar, and will not be repeated here.
  • the foregoing describes the input method-based information processing device of the present disclosure from the perspective of a virtual module or a virtual unit, and the following describes the electronic device of the present disclosure from the perspective of a physical device.
  • the electronic device 300 may include: one or more processors; a memory; and one or more application programs, wherein the one or more application programs are stored in the memory and configured to be executed by the one or more processors, and the one or more programs are configured to perform operations corresponding to the methods shown in the above-mentioned embodiments.
  • Terminal devices in the embodiments of the present disclosure may include, but are not limited to, mobile phones, notebook computers, digital broadcast receivers, PDAs (personal digital assistants), PADs (tablets), PMPs (portable multimedia players), vehicle-mounted terminals (e.g. Mobile terminals such as car navigation terminals) and fixed terminals such as digital TVs, desktop computers, etc.
  • the electronic device shown in FIG. 3 is only an example, and should not bring any limitation to the function and scope of use of the embodiments of the present disclosure.
  • the electronic device includes a memory and a processor, where the processor may be referred to as the processing device 301 described below, and the memory may include at least one of the read-only memory (ROM) 302, the random access memory (RAM) 303, and the storage device 308 described below, as follows:
  • the electronic device 300 may include a processing device (such as a central processing unit, a graphics processor, etc.) 301, which may execute various appropriate actions and processing according to a program stored in a read-only memory (ROM) 302 or a program loaded from a storage device 308 into a random access memory (RAM) 303.
  • In the RAM 303, various programs and data required for the operation of the electronic device 300 are also stored.
  • the processing device 301, the ROM 302, and the RAM 303 are connected to each other through a bus 304.
  • An input/output (I/O) interface 305 is also connected to the bus 304.
  • the following devices can be connected to the I/O interface 305: input devices 306 including, for example, touch screens, touch pads, keyboards, mice, cameras, microphones, accelerometers, gyroscopes, etc.; output devices 307 including, for example, liquid crystal displays (LCD), speakers, vibrators, etc.; and storage devices 308 including, for example, magnetic tapes, hard disks, etc.
  • the communication device 309 may allow the electronic device 300 to perform wireless or wired communication with other devices to exchange data.
  • Although FIG. 3 shows an electronic device 300 having various devices, it should be understood that it is not required to implement or have all of the illustrated devices; more or fewer devices may alternatively be implemented or provided.
  • the process described above with reference to the flowchart can be implemented as a computer software program.
  • the embodiments of the present disclosure include a computer program product, which includes a computer program carried on a non-transitory computer readable medium, and the computer program contains program code for executing the method shown in the flowchart.
  • the computer program may be downloaded and installed from the network through the communication device 309, or installed from the storage device 308, or installed from the ROM 302.
  • When the computer program is executed by the processing device 301, the above-mentioned functions defined in the method of the embodiment of the present disclosure are executed.
  • the aforementioned computer-readable medium in the present disclosure may be a computer-readable signal medium or a computer-readable storage medium, or any combination of the two.
  • the computer-readable storage medium may be, for example, but not limited to, an electrical, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the above. More specific examples of computer-readable storage media may include, but are not limited to: electrical connections with one or more wires, portable computer disks, hard disks, random access memory (RAM), read-only memory (ROM), erasable Programmable read only memory (EPROM or flash memory), optical fiber, portable compact disk read only memory (CD-ROM), optical storage device, magnetic storage device, or any suitable combination of the above.
  • a computer-readable storage medium may be any tangible medium that contains or stores a program, and the program may be used by or in combination with an instruction execution system, apparatus, or device.
  • a computer-readable signal medium may include a data signal propagated in a baseband or as a part of a carrier wave, and a computer-readable program code is carried therein. This propagated data signal can take many forms, including but not limited to electromagnetic signals, optical signals, or any suitable combination of the foregoing.
  • the computer-readable signal medium may also be any computer-readable medium other than the computer-readable storage medium.
  • the computer-readable signal medium may send, propagate or transmit the program for use by or in combination with the instruction execution system, apparatus, or device .
  • the program code contained on the computer-readable medium can be transmitted by any suitable medium, including but not limited to: wire, optical cable, RF (Radio Frequency), etc., or any suitable combination of the above.
  • the client and the server can communicate using any currently known or future developed network protocol such as HTTP (HyperText Transfer Protocol), and can be interconnected with digital data communication in any form or medium (for example, a communication network). Examples of communication networks include a local area network ("LAN"), a wide area network ("WAN"), an internetwork (for example, the Internet), and a peer-to-peer network (for example, an ad hoc peer-to-peer network), as well as any currently known or future developed network.
  • the above-mentioned computer-readable medium may be included in the above-mentioned electronic device; or it may exist alone without being assembled into the electronic device.
  • the aforementioned computer-readable medium carries one or more programs, and when the aforementioned one or more programs are executed by the electronic device, the electronic device is caused to perform corresponding operations of the method according to the embodiment of the present disclosure.
  • the computer program code used to perform the operations of the present disclosure can be written in one or more programming languages or a combination thereof.
  • the above-mentioned programming languages include, but are not limited to, object-oriented programming languages such as Java, Smalltalk, and C++, as well as conventional procedural programming languages such as the "C" language or similar programming languages.
  • the program code can be executed entirely on the user's computer, partly on the user's computer, executed as an independent software package, partly on the user's computer and partly executed on a remote computer, or entirely executed on the remote computer or server.
  • the remote computer can be connected to the user's computer through any kind of network (including a local area network (LAN) or a wide area network (WAN)), or it can be connected to an external computer (for example, through the Internet using an Internet service provider).
  • each block in the flowchart or block diagram may represent a module, program segment, or part of code, and the module, program segment, or part of code contains one or more executable instructions for realizing the specified logical function.
  • the functions marked in the block may also occur in a different order from the order marked in the drawings. For example, two blocks shown in succession can actually be executed substantially in parallel, and they can sometimes be executed in the reverse order, depending on the functions involved.
  • each block in the block diagram and/or flowchart, and the combination of blocks in the block diagram and/or flowchart, can be implemented by a dedicated hardware-based system that performs the specified functions or operations, or by a combination of dedicated hardware and computer instructions.
  • the modules or units involved in the embodiments described in the present disclosure can be implemented in software or hardware. Wherein, the name of the module or unit does not constitute a limitation on the unit itself under certain circumstances.
  • the first determining module can also be described as "a module that determines the information to be processed in the input area".
  • exemplary types of hardware logic components include: Field Programmable Gate Array (FPGA), Application Specific Integrated Circuit (ASIC), Application Specific Standard Product (ASSP), System on Chip (SOC), Complex Programmable Logical device (CPLD) and so on.
  • a machine-readable medium may be a tangible medium, which may contain or store a program for use by the instruction execution system, apparatus, or device or in combination with the instruction execution system, apparatus, or device.
  • the machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium.
  • the machine-readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, device, or device, or any suitable combination of the foregoing.
  • machine-readable storage media would include electrical connections based on one or more wires, portable computer disks, hard disks, random access memory (RAM), read-only memory (ROM), erasable programmable read-only memory (EPROM or flash memory), optical fiber, portable compact disk read only memory (CD-ROM), optical storage device, magnetic storage device, or any suitable combination of the foregoing.
  • the embodiments of the present disclosure provide a computer-readable medium with a computer program stored on the computer-readable medium, and when it runs on a computer, the computer can execute the corresponding content in the foregoing method embodiment.
  • an information processing method including:
  • an information processing device including:
  • the determination module is used to determine the information to be processed
  • the processing module is configured to perform corresponding operations on the to-be-processed information based on the drag operation for the to-be-processed information.
  • an information processing method based on an input method including:
  • determining the processing mode of the to-be-processed information includes any one of the following:
  • determining the drag termination area of a drag operation, and determining the processing method of the information to be processed based on the drag termination area includes:
  • determining the processing method of the information to be processed based on the drag termination area includes:
  • the processing method of the information to be processed is determined.
  • determining the drag termination area of a drag operation, and determining the processing method of the information to be processed based on the drag termination area includes:
  • the drag suspension area is determined
  • the first trigger area is the drag termination area
  • the processing method of the information to be processed is the processing method corresponding to the first trigger area.
  • determining the drag termination area of the drag operation previously further includes:
  • determine the drag termination area of the drag operation including:
  • the second trigger area is the drag termination area.
  • determining the processing method of the information to be processed based on the drag termination area includes any one of:
  • determining the processing method of the information to be processed includes at least one of: a processing method of converting text to an expression and a processing method of adding information to the expression panel;
  • the processing method for determining the information to be processed includes: adding information to the processing method of the shortcut phrase panel;
  • the processing method for determining the information to be processed includes at least one of word segmentation and text format adjustment;
  • the processing method for determining the information to be processed includes: text-to-speech processing;
  • the processing method for determining the information to be processed includes: language conversion processing.
  • performing corresponding operations on the information to be processed includes:
  • an information processing device based on an input method, including: a first determining module, configured to determine information to be processed in an input area;
  • the second determining module is configured to determine the processing mode of the information to be processed based on the drag operation for the information to be processed;
  • the execution module is used to execute corresponding operations on the information to be processed based on the determined processing mode.
  • the second determining module includes any one of a first determining unit and a second determining unit, wherein,
  • the first determining unit is configured to determine the drag termination area of the drag operation, and determine the processing mode of the information to be processed based on the drag termination area;
  • the second determining unit is used to determine the drag trajectory information of the drag operation, and determine the processing mode of the information to be processed based on the drag trajectory information.
  • the first determination unit includes a first determination subunit, a display subunit, and a second determination subunit, wherein:
  • the first determining subunit is used to determine that the drag termination area of the drag operation is the first preset area
  • a display subunit for displaying at least one processing method corresponding to the first preset area
  • the second determining subunit is configured to determine that the processing method of the information to be processed is the processing method corresponding to the trigger operation when a trigger operation for any processing method is detected.
  • the first determining unit includes a third determining subunit, wherein
  • the third determining subunit is used to determine the processing method of the information to be processed based on the correspondence between the area and the processing method and the drag termination area.
  • the first determining unit includes a fourth determining subunit, a fifth determining subunit, a sixth determining subunit, and a seventh determining subunit, wherein,
  • the fourth determining subunit is used to determine the drag suspension area when the drag suspension operation for the to-be-processed information is detected;
  • the fifth determining subunit is used to determine and display the first trigger area of each processing method corresponding to the second preset area when the drag suspension area is the second preset area;
  • the sixth determining subunit is used to determine that the first trigger area is the drag termination area when it is detected that the information to be processed is continued to be dragged to the first trigger area of any processing method and the drag operation is terminated;
  • the seventh determining subunit is used to determine that the processing mode of the information to be processed is the processing mode corresponding to the first trigger area.
  • the information processing device based on the input method further includes an identification module and a first display module, wherein,
  • the identification module is used to identify the information to be processed and obtain the identification result of the information to be processed;
  • the first display module is configured to display the second trigger area of each of the at least one processing method based on the recognition result
  • the first determining unit includes an eighth determining subunit, wherein:
  • the eighth determining subunit is configured to determine that the second trigger area is the drag termination area when it is detected that the information to be processed is dragged to the second trigger area of any processing method and the drag operation is terminated.
  • the first determining unit is specifically configured to:
  • the processing method for determining the information to be processed includes at least one of: a processing method of converting text to an expression and a processing method of adding information to the expression panel;
  • determining the processing method of the information to be processed includes: adding information to the processing method of the shortcut phrase panel;
  • the processing method for determining the information to be processed includes at least one of word segmentation and text format adjustment;
  • the processing method of the information to be processed includes: text-to-speech processing;
  • the processing method for determining the information to be processed includes: language conversion processing.
  • the information processing device based on the input method further includes a second display module, wherein:
  • the second display module is used to display processing information corresponding to the determined processing method
  • the execution module is specifically configured to perform a corresponding operation on the information to be processed based on the determined processing mode when the confirmation operation for the processed information is detected.
  • an information processing method including:
  • the information to be stored is stored.
  • storing the to-be-stored information further includes:
  • the storage information is displayed at the determined display position.
  • determining the display position of the stored information in the preset storage area includes at least one of the following:
  • the display area corresponding to the information type is determined in the preset storage area, and the display position of the stored information in the determined display area is determined.
  • the information processing method further includes:
  • Satisfying the preset conditions includes at least one of the following:
  • the preset operation instruction triggered by the user is detected
  • a user's drag operation on any information displayed in the preset storage area is detected.
  • adjusting the display positions of at least two pieces of information displayed in the preset storage area includes at least one of the following:
  • the display positions of at least two pieces of information displayed in the preset storage area are adjusted based on the drag termination position.
  • the information processing method further includes:
  • the dragged information is inserted into the corresponding position of the editing area.
  • inserting the dragged information into the corresponding position of the editing area includes at least one of the following:
  • the information processing method further includes:
  • a selection operation for the information to be stored.
  • determining the information to be stored based on a user's trigger operation includes at least one of the following:
  • the information to be stored is determined.
  • storing the information to be stored includes any one of the following:
  • the information to be stored is stored.
  • the first functional area and the second functional area correspond to the same application.
  • storing the information to be stored includes:
  • the dialog box area and the preset storage area of the instant messaging are located in different applications, or located in different functional areas of the same application.
  • the preset storage area includes a shortcut phrase area
  • the dialog box area of the instant messaging and the preset storage area are located in different applications
  • storing the information to be stored includes:
  • the information to be stored is stored.
  • the method further includes:
  • an information processing device including a third determining module, configured to determine information to be stored based on a user's trigger operation;
  • the storage module is used to store the information to be stored when it is detected that the user drags the information to be stored to the preset storage area.
  • the information processing device further includes a fourth determining module and a display module, wherein:
  • the fourth determining module is used to determine the display position of the stored information in the preset storage area, and the stored information is the information after storage of the to-be-stored information;
  • the display module is used to display the storage information at the determined display position.
  • the fourth determining module includes at least one of a third determining unit, a fourth determining unit, and a fifth determining unit.
  • the third determining unit is used to determine the drag end position of the information to be stored when it is detected that the user drags the information to be stored to the preset storage area, and to determine, based on the drag end position of the information to be stored, the display position of the stored information in the preset storage area;
  • the fourth determining unit is configured to determine the display position of the stored information in the preset storage area based on the storage time of the stored information
  • the fifth determining unit is configured to determine the display area corresponding to the information type in the preset storage area based on the information type of the stored information, and determine the display position of the stored information in the determined display area.
  • the information processing device further includes an adjustment module.
  • the adjustment module is used to adjust the display positions of at least two pieces of information displayed in the preset storage area when the preset conditions are met;
  • Satisfying the preset conditions includes at least one of the following:
  • the preset operation instruction triggered by the user is detected
  • a user's drag operation on any information displayed in the preset storage area is detected.
  • the adjustment module includes at least one of a first adjustment unit and a second adjustment unit.
  • the first adjustment unit is configured to adjust the display positions of at least two pieces of information displayed in the preset storage area according to at least one of the information use frequency and the information storage time when it is detected that the preset time is reached or a preset operation instruction triggered by the user is detected.
  • the second adjustment unit is configured to adjust the display positions of at least two pieces of information displayed in the preset storage area based on the drag termination position when a drag operation of the user for any information displayed in the preset storage area is detected.
  • the information processing apparatus further includes an insertion module.
  • the inserting module is used to insert the dragged information into the corresponding position of the editing area when it is detected that the user drags the information displayed in the preset storage area to the editing area.
  • the insertion module includes at least one of a first insertion unit and a second insertion unit.
  • the first insertion unit is used to determine the drag end position of the dragged information in the editing area, and to insert the dragged information in the editing area according to the drag end position;
  • the second insertion unit is used to insert the dragged information at the cursor position in the editing area.
  • the information processing device further includes a loading display module.
  • the loading display module is used to load and display the preset storage area when it is detected that the user triggers at least one of the following operations:
  • a selection operation for the information to be stored.
  • the third determining module is specifically configured to perform at least one of: determining the information to be stored based on the user's trigger operation in an editable area, and determining the information to be stored based on the user's trigger operation in a non-editable area.
  • the storage module may specifically include any one of a first storage unit and a second storage unit, wherein:
  • the first storage unit is configured to store the information to be stored when it is detected that the user drags the information to be stored from one application to a preset storage area in another application;
  • the second storage unit is used to store the information to be stored when it is detected that the user drags the information to be stored from the first functional area to the preset storage area located in the second functional area, where the first functional area and the second functional area correspond to the same application.
  • the storage module is specifically configured to store the information to be stored when it is detected that the user drags the information to be stored from the dialog box area of the instant messaging to the preset storage area;
  • the dialog box area and the preset storage area of the instant messaging are located in different applications, or located in different functional areas of the same application.
  • when the storage module stores the information to be stored upon detecting that the user drags the information to be stored from the instant messaging dialog box area to the preset storage area, it is specifically used to store the information to be stored when it is detected that the user drags the information to be stored from the instant messaging dialog box area to the shortcut phrase area.
  • the device further includes a sending module.
  • the sending module is used to send the information in the information input area when it is detected that the user drags information in the instant messaging dialog box area to the instant messaging information input area and the user's sending operation for the information in the information input area is detected.
  • an electronic device including: one or more processors; a memory; and one or more application programs, wherein the one or more application programs are stored in the memory and configured to be executed by the one or more processors, and the one or more programs are configured to perform operations corresponding to the method according to the embodiments of the present disclosure.
  • a computer-readable medium having a computer program stored thereon, and when the program is executed by a processor, the method according to the embodiment of the present disclosure is implemented.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present invention relates to an information processing method and apparatus, an electronic device, and a medium. The method comprises: determining information to be processed; and performing a corresponding operation on said information based on a drag operation for said information.
PCT/CN2020/111785 2019-10-30 2020-08-27 Procédé et appareil de traitement d'informations, dispositif électronique et support WO2021082694A1 (fr)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
CN201911046169.X 2019-10-30
CN201911046206.7A CN110806834A (zh) 2019-10-30 2019-10-30 基于输入法的信息处理方法、装置、电子设备及介质
CN201911046169.XA CN110806827A (zh) 2019-10-30 2019-10-30 信息处理方法、装置、电子设备及介质
CN201911046206.7 2019-10-30

Publications (1)

Publication Number Publication Date
WO2021082694A1 true WO2021082694A1 (fr) 2021-05-06

Family

ID=75715749

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/111785 WO2021082694A1 (fr) 2019-10-30 2020-08-27 Procédé et appareil de traitement d'informations, dispositif électronique et support

Country Status (1)

Country Link
WO (1) WO2021082694A1 (fr)

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102203711A (zh) * 2008-11-13 2011-09-28 高通股份有限公司 依赖于上下文的弹出式菜单的方法和系统
CN103941999A (zh) * 2014-04-14 2014-07-23 联想(北京)有限公司 一种信息处理方法及电子设备
CN104112019A (zh) * 2014-07-23 2014-10-22 广州三星通信技术研究有限公司 发送信息的方法、装置和终端
US20140351725A1 (en) * 2013-05-27 2014-11-27 Samsung Electronics Co., Ltd Method and electronic device for operating object
CN106020622A (zh) * 2016-07-12 2016-10-12 百度在线网络技术(北京)有限公司 用于收藏表情符号的方法和装置
CN106126089A (zh) * 2016-06-17 2016-11-16 腾讯科技(深圳)有限公司 一种在移动终端中实现复制和粘贴的方法及移动终端
CN107885443A (zh) * 2017-09-22 2018-04-06 阿里巴巴集团控股有限公司 一种信息处理的方法及装置
CN109462692A (zh) * 2018-10-29 2019-03-12 努比亚技术有限公司 分屏显示操作方法、移动终端及计算机可读存储介质
CN110147197A (zh) * 2019-04-08 2019-08-20 努比亚技术有限公司 一种操作识别方法、装置及计算机可读存储介质
CN110806827A (zh) * 2019-10-30 2020-02-18 北京字节跳动网络技术有限公司 信息处理方法、装置、电子设备及介质
CN110806834A (zh) * 2019-10-30 2020-02-18 北京字节跳动网络技术有限公司 基于输入法的信息处理方法、装置、电子设备及介质

Patent Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102203711A (zh) * 2008-11-13 2011-09-28 高通股份有限公司 依赖于上下文的弹出式菜单的方法和系统
US20140351725A1 (en) * 2013-05-27 2014-11-27 Samsung Electronics Co., Ltd Method and electronic device for operating object
CN103941999A (zh) * 2014-04-14 2014-07-23 联想(北京)有限公司 一种信息处理方法及电子设备
CN104112019A (zh) * 2014-07-23 2014-10-22 广州三星通信技术研究有限公司 发送信息的方法、装置和终端
CN106126089A (zh) * 2016-06-17 2016-11-16 腾讯科技(深圳)有限公司 一种在移动终端中实现复制和粘贴的方法及移动终端
CN106020622A (zh) * 2016-07-12 2016-10-12 百度在线网络技术(北京)有限公司 用于收藏表情符号的方法和装置
CN107885443A (zh) * 2017-09-22 2018-04-06 阿里巴巴集团控股有限公司 一种信息处理的方法及装置
CN109462692A (zh) * 2018-10-29 2019-03-12 努比亚技术有限公司 分屏显示操作方法、移动终端及计算机可读存储介质
CN110147197A (zh) * 2019-04-08 2019-08-20 努比亚技术有限公司 一种操作识别方法、装置及计算机可读存储介质
CN110806827A (zh) * 2019-10-30 2020-02-18 北京字节跳动网络技术有限公司 信息处理方法、装置、电子设备及介质
CN110806834A (zh) * 2019-10-30 2020-02-18 北京字节跳动网络技术有限公司 基于输入法的信息处理方法、装置、电子设备及介质

Similar Documents

Publication Publication Date Title
WO2020238815A1 (fr) Procédé et dispositif de commande d'affichage, dispositif électronique et support de stockage
WO2021077883A1 (fr) Procédé et dispositif pour afficher un document en ligne, dispositif électronique et support de stockage
US10863338B2 (en) Copy and paste between devices
WO2022002066A1 (fr) Procédé et appareil de navigation dans une table dans un document, ainsi que dispositif électronique et support de stockage
WO2020200173A1 (fr) Procédé et appareil de traitement de contenus d'entrée de documents, dispositif électronique et support d'informations
JP2006338667A (ja) ユーザ−マシン間通信方法、装置、インターフェイス・プロセッサ、及びプログラム
JP6434640B2 (ja) メッセージ表示方法、メッセージ表示装置、およびメッセージ表示デバイス
US9270738B2 (en) Processor sharing between in-range devices
TW201621706A (zh) 使用近場通訊來以權限控制進行共享內容
WO2015176352A1 (fr) Procédé et dispositif basés sur le système android pour l'échange d'informations entre applications
US20160070455A1 (en) Toggle graphic object
WO2022218251A1 (fr) Procédé et appareil de traitement de document électronique, terminal et support de stockage
JP2020161135A (ja) チャットスレッドを表示するための方法およびシステム
US11809690B2 (en) Human-computer interaction method and apparatus, and electronic device
WO2020119409A1 (fr) Procédé et appareil de positionnement de liste, et dispositif et support d'informations
WO2023185817A1 (fr) Procédé et appareil de coopération multi-dispositif, et dispositif électronique et support
WO2019072213A1 (fr) Procédé de traitement d'informations de point d'accès sans fil, dispositif et support de stockage lisible par ordinateur
CN110806834A (zh) 基于输入法的信息处理方法、装置、电子设备及介质
JP2024508319A (ja) ドキュメント更新方法、装置、機器及び媒体
US10942622B2 (en) Splitting and merging files via a motion input on a graphical user interface
US20190369827A1 (en) Remote data input framework
WO2023160578A1 (fr) Procédé et appareil de traitement d'informations, ainsi que terminal et support de stockage
WO2021082694A1 (fr) Procédé et appareil de traitement d'informations, dispositif électronique et support
WO2022184012A1 (fr) Procédé et appareil de création de document, et dispositif et support de stockage
US20150347008A1 (en) Method for controlling virtual keyboard and electronic device implementing the same

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20882674

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20882674

Country of ref document: EP

Kind code of ref document: A1

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 01.09.2022)

122 Ep: pct application non-entry in european phase

Ref document number: 20882674

Country of ref document: EP

Kind code of ref document: A1