CN110825303A - Interaction method, device, terminal and computer readable medium - Google Patents

Interaction method, device, terminal and computer readable medium

Info

Publication number
CN110825303A
CN110825303A (application CN201910935845.2A)
Authority
CN
China
Prior art keywords
option, coordinate, touch position, touch, interactive
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201910935845.2A
Other languages
Chinese (zh)
Inventor
姚儒升
陈志伟
曹聪灵
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Futuo Network Technology (shenzhen) Co Ltd
Original Assignee
Futuo Network Technology (shenzhen) Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Futuo Network Technology (shenzhen) Co Ltd filed Critical Futuo Network Technology (shenzhen) Co Ltd
Priority to CN201910935845.2A priority Critical patent/CN110825303A/en
Publication of CN110825303A publication Critical patent/CN110825303A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00Systems or methods specially adapted for specific business sectors, e.g. utilities or tourism
    • G06Q50/01Social networking

Abstract

The invention relates to an interaction method, an interaction device, a terminal and a computer-readable medium. The method comprises the following steps: displaying an interaction module in response to a first touch instruction within a designated area of an interactive control; acquiring coordinate information of each option in the interaction module; and, in response to a second touch instruction directed at an option, explicitly displaying that option. The invention enables a like (praise) to be given with any of various emotions, improving interaction quality.

Description

Interaction method, device, terminal and computer readable medium
Technical Field
The present invention relates to the field of information processing, and in particular, to an interaction method, an interaction device, a terminal, and a computer-readable medium.
Background
At present, in online communities, users can interact around published content, and the like (praise) function is used especially often. Specifically: a user clicks a like button below a piece of published content to like it, and the user who published the content receives a notification.
However, when the published content expresses sadness, anger or similar emotions, a like is not appropriate, because a like conveys approval, appreciation and similar positive sentiments, which do not fit sad content. The existing like mechanism can therefore express only a limited range of emotions and applies to too few scenarios.
Disclosure of Invention
To solve the technical problem or at least partially solve the technical problem, the invention provides an interaction method, an interaction device, a terminal and a computer-readable medium.
In a first aspect, the present invention provides an interaction method, including:
responding to a first touch instruction in a designated area in the interactive control, and displaying an interactive module;
acquiring coordinate information of each option in the interactive module;
and in response to a second touch instruction aiming at the option, explicitly displaying the option.
Optionally, the explicitly displaying the option in response to the second touch instruction for the option includes:
analyzing a touch position coordinate from the second touch instruction;
comparing the touch position coordinates with coordinates of each option respectively;
and determining the option of which the distance from the coordinate to the touch position coordinate falls within a set range as a display option.
Optionally, after explicitly displaying the option, the method further includes:
and displaying all or part of the display options in an enlarged mode.
Optionally, the displaying the part of the display options in an enlarged manner includes:
and amplifying and displaying the option with the smallest distance between the coordinate of the display option and the coordinate of the touch position.
Optionally, the explicitly displaying the option in response to the second touch instruction for the option includes:
analyzing a touch position coordinate from the second touch instruction;
comparing the touch position coordinates with coordinates of each option respectively;
and determining the option with the smallest distance between the coordinate and the touch position coordinate as a display option.
In a second aspect, the present invention provides an interaction apparatus, comprising:
the display module is used for responding to a first touch instruction in a specified area in the interactive control and displaying the interactive module;
the acquisition module is used for acquiring the coordinate information of each option in the interaction module;
and the execution module is used for responding to the second touch instruction aiming at the option and explicitly displaying the option.
Optionally, the execution module includes:
the first analysis coordinate unit is used for analyzing the touch position coordinate from the second touch instruction;
the first judgment unit is used for comparing the touch position coordinates with the coordinates of each option respectively;
and the first option determining unit is used for determining the options of which the distances from the coordinates to the touch position coordinates fall within a set range as display options.
Optionally, the execution module includes:
the second analysis coordinate unit is used for analyzing the touch position coordinate from the second touch instruction;
the second judging unit is used for comparing the touch position coordinates with the coordinates of each option respectively;
and a second option determining unit, configured to determine the option whose coordinate is at the smallest distance from the touch position coordinate as the display option.
In a third aspect, the present invention provides a terminal, including a processor, a communication interface, a memory and a communication bus, wherein the processor, the communication interface and the memory complete mutual communication through the communication bus; the memory has stored therein a computer program operable on the processor, which when executed by the processor performs the steps of the method of the first aspect.
In a fourth aspect, the invention provides a computer readable medium having non-volatile program code executable by a processor, the program code causing the processor to perform the method of the first aspect.
Compared with the prior art, the technical scheme provided by the embodiment of the invention has the following advantages:
the method provided by the embodiments of the invention displays an interaction module in response to a first touch instruction within a designated area of an interactive control; then obtains coordinate information of each option in the interaction module; and finally, in response to a second touch instruction directed at an option, explicitly displays that option. The user can thus pick, from the several emotion types offered in the interaction module, the emotion corresponding to the current scene, so that a like can carry any of various emotions and interaction quality is improved.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the invention and together with the description, serve to explain the principles of the invention.
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings needed in the description of the embodiments or the prior art will be briefly introduced below; it will be apparent to those skilled in the art that other drawings can be derived from these drawings without inventive effort.
Fig. 1 is a schematic flowchart of an interaction method according to an embodiment of the present invention;
fig. 2 is a flowchart illustrating an interaction method according to another embodiment of the present invention;
fig. 3 is a flowchart illustrating an interaction method according to another embodiment of the present invention;
fig. 4 is a flowchart illustrating an interaction method according to another embodiment of the present invention;
FIG. 5 is a block diagram of an interactive apparatus according to another embodiment of the present invention;
FIG. 6 is a block diagram of an interactive apparatus according to another embodiment of the present invention;
FIG. 7 is a block diagram of an interactive apparatus according to another embodiment of the present invention;
fig. 8 is a block diagram of a terminal according to another embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer and more complete, the technical solutions in the embodiments of the present invention will be described below with reference to the drawings in the embodiments. The described embodiments are clearly only some, not all, of the embodiments of the present invention; all other embodiments obtained by a person of ordinary skill in the art from these embodiments without creative effort fall within the protection scope of the present invention.
At present, in online communities, users can interact around published content, and the like (praise) function is used very frequently. However, when the published content expresses sadness, anger or similar emotions, a like is not appropriate, because a like conveys approval, appreciation and similar positive sentiments that do not fit such content; the existing like mechanism can therefore express only a limited range of emotions and applies to too few scenarios. Accordingly, embodiments of the present invention provide an interaction method, an interaction apparatus, a terminal, and a computer-readable medium. The interaction method may be applied in a terminal and, as shown in fig. 1, may include the following steps:
step S101, responding to a first touch instruction in a designated area in an interactive control, and displaying an interactive module;
in the embodiment of the present invention, the interactive control may refer to a row of buttons with interaction functions; illustratively, the interactive control includes like, comment, forward, and the like. The designated area refers to the effective area of a button, i.e., the maximum range within which the button responds to a touch.
The first touch instruction may be a long-press instruction: after the long-press instruction is received, the interaction module is displayed and the following steps are executed. Alternatively, the first touch instruction may be a click instruction: after the click is received, the like operation is executed directly, and a long-press instruction may still be received on top of the like.
The interaction module may be an emotion module; illustratively, the emotion module includes emotions such as like, love, laugh, surprise, cry and anger. The emotion types may be preset by the terminal or set by the user, or user feedback may be collected by the terminal and used to refine the set. Alternatively, the interaction module may be a comment module. These are only examples; in practical applications other modes may be selected according to actual needs, and the present invention is not limited thereto.
In this step, when a long-press instruction is received within the effective area of the thumbs up button, a plurality of emotions within the interactive module are displayed in response to the long-press instruction.
Step S102, obtaining coordinate information of each option in the interaction module;
in this step, coordinate information of various emotions within the emotion module is acquired.
And step S103, responding to a second touch instruction aiming at the option, and explicitly displaying the option.
The second touch instruction may be a single-click instruction.
In this step, when a click instruction is received within the effective area of the emotion module, a selected emotion is determined in response to the click instruction.
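As a rough sketch only (the patent prescribes no API; every class, method and coordinate below is hypothetical), steps S101 to S103 could be modeled as a long press revealing the emotion options and a subsequent tap selecting one:

```python
# Hypothetical sketch of steps S101-S103: a long press on the like
# button reveals the emotion options; a later tap selects one.

class InteractionModule:
    def __init__(self):
        self.options = {}          # option name -> (x, y) coordinate
        self.visible = False

    def show(self):
        """Step S101: display the module in response to a long press."""
        self.visible = True
        # Illustrative layout: six emotions in a row, 60 px apart.
        names = ["like", "love", "laugh", "surprise", "cry", "anger"]
        self.options = {name: (60 * i, 0) for i, name in enumerate(names)}

    def option_coordinates(self):
        """Step S102: acquire the coordinate of each option."""
        return dict(self.options)

    def select(self, touch_x, touch_y):
        """Step S103: resolve a tap to the nearest option for display."""
        if not self.visible:
            return None
        return min(self.options,
                   key=lambda n: (self.options[n][0] - touch_x) ** 2
                                 + (self.options[n][1] - touch_y) ** 2)

module = InteractionModule()
module.show()                    # first touch instruction (long press)
chosen = module.select(130, 5)   # second touch instruction (tap)
```

Here the tap at (130, 5) resolves to the option nearest that point; the later embodiments refine exactly how this matching is done.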
The embodiment of the invention displays an interaction module in response to a first touch instruction within a designated area of an interactive control; then obtains coordinate information of each option in the interaction module; and finally, in response to a second touch instruction directed at an option, explicitly displays that option. The user thus selects, from the several emotion types in the interaction module, the emotion appropriate to the current scene, so that a like can carry any of various emotions and interaction quality is improved.
In a further embodiment of the present invention, there is also provided an interaction method, as shown in fig. 2, the method may include the steps of:
step S201, responding to a first touch instruction in a specified area in an interactive control, and displaying an interactive module;
step S202, coordinate information of each option in the interaction module is obtained;
step S203, resolving a touch position coordinate from the second touch instruction;
step S204, comparing the touch position coordinates with the coordinates of each option respectively;
in the embodiment of the present invention, the touch position coordinates may refer to coordinates in an interactive control area, and may specifically refer to coordinates in an effective area of the emotion module.
In step S205, an option whose distance from the coordinate to the touch position coordinate falls within a set range is determined as a display option.
In the embodiment of the present invention, an option whose distance from the touch position coordinate falls within a set range is a display option, and each option may have its own set range. The computation is illustrated here for a single option, named option 1. When comparing the touch position coordinate with the set range of option 1, the touch position coordinate (x, y) may be compared with the minimum coordinate (x1, y1) and the maximum coordinate (x2, y2) of the set range of option 1: if x1 < x < x2 and y1 < y < y2, the touch position lies within the set range of option 1. Since the coordinate of option 1 lies within its own set range, the distance between the coordinate of option 1 and the touch position then falls within the set range of option 1. That is, when the distance between an option's coordinate and the touch position coordinate falls within that option's set range, the option is determined to be a display option.
Alternatively, the set range may be a range covering all options, in which case every option is determined to be a display option; or the set range may cover only some options, in which case only those options are determined to be display options.
The above determination methods are merely examples, and other methods may be used in practical applications according to practical situations, and the present invention is not limited thereto.
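The bounding-box comparison described above can be sketched as follows. This is a minimal illustration: the rectangle values are assumptions, and the patent does not fix a concrete data layout.

```python
# Sketch of steps S203-S205: an option becomes a display option when
# the touch position falls inside that option's set range (a rectangle
# from its minimum to its maximum coordinate). Values are illustrative.

def in_set_range(touch, rect):
    """Return True if touch (x, y) lies strictly inside rect
    ((x1, y1), (x2, y2)), mirroring x1 < x < x2 and y1 < y < y2."""
    (x, y), ((x1, y1), (x2, y2)) = touch, rect
    return x1 < x < x2 and y1 < y < y2

def display_options(touch, set_ranges):
    """Compare the touch position with each option's set range and
    keep the options whose range contains it."""
    return [name for name, rect in set_ranges.items()
            if in_set_range(touch, rect)]

# Each option's set range as (min corner, max corner).
ranges = {
    "option 1": ((0, 0), (50, 40)),
    "option 2": ((50, 0), (100, 40)),
}
hits = display_options((30, 20), ranges)  # touch inside option 1's range
```

Because the comparison uses strict inequalities, a touch exactly on a shared edge (here x = 50) matches neither rectangle; a real implementation would have to decide how to treat boundaries.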
The embodiment of the invention displays an interaction module in response to a first touch instruction within a designated area of an interactive control; then obtains coordinate information of each option in the interaction module; resolves a touch position coordinate from the second touch instruction and compares it with the coordinate of each option; and finally determines the options whose distance from the touch position coordinate falls within a set range as display options. Having acquired the coordinate information of the touch position and of each option, the method compares them so as to explicitly display the matching options; a like can thus carry any of various emotions, and interaction quality is improved.
In a further embodiment of the present invention, there is also provided an interaction method, as shown in fig. 3, the method may include the steps of:
step S301, responding to a first touch instruction in a designated area in an interactive control, and displaying an interactive module;
step S302, coordinate information of each option in the interaction module is obtained;
step S303, resolving a touch position coordinate from the second touch instruction;
step S304, comparing the touch position coordinates with the coordinates of each option respectively;
step S305, determining the options of which the distance from the coordinates to the touch position coordinates falls into a set range as display options;
and S306, amplifying and displaying all or part of the display options.
In this step, the enlargement display of all or part of the display options is divided into two cases, namely, the enlargement of all the display options and the enlargement of part of the display options.
Illustratively, in the first case, the interaction module is displayed as a square area and its options are arranged closely in several rows; for example, four options are arranged closely in two rows within the square area. Because the square area is small, or the spacing between options is small, when the finger lands the whole square area is enlarged, i.e., all options are displayed enlarged; the next step then continues, such as receiving a further touch instruction or making a further determination, which is not limited here.
In the second case, the interaction module is displayed as a rectangular area and its options are arranged in sequence at equal intervals. Because the rectangular area is small, or the spacing between options is small, if the finger does not land accurately, the two nearest options may be enlarged simultaneously; when the finger lands accurately, a single option is enlarged.
The present invention is only illustrated by way of example, and other display modes can be selected according to actual needs in practical applications, and the present invention is not limited.
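The two enlargement cases above can be sketched as a single scaling helper. The 1.5x scale factor and the option sizes are assumptions for illustration; the patent fixes neither.

```python
# Sketch of step S306: enlarge all display options, or only some.
# The scale factor (1.5x) and widths below are illustrative only.

def enlarge(display_options, sizes, scale=1.5):
    """Return new sizes with every display option scaled up and all
    other options left at their original size."""
    return {name: (w * scale if name in display_options else w)
            for name, w in sizes.items()}

sizes = {"like": 40, "love": 40, "laugh": 40, "surprise": 40}

# Case 1: the whole (small) area is enlarged -> every option scales.
all_enlarged = enlarge(list(sizes), sizes)

# Case 2: the finger lands between two options -> only those two scale.
two_enlarged = enlarge(["like", "love"], sizes)
```

Which options appear in `display_options` is exactly what the set-range or smallest-distance determinations of the other embodiments decide.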
The embodiment of the invention displays an interaction module in response to a first touch instruction within a designated area of an interactive control; then obtains coordinate information of each option in the interaction module; resolves a touch position coordinate from the second touch instruction and compares it with the coordinate of each option; and finally determines the options whose distance from the touch position coordinate falls within a set range as display options and displays them enlarged. Having acquired the coordinate information of the touch position and of each option, the method compares them, explicitly displays the matching options and enlarges them; a like can thus carry any of various emotions, and interaction quality is improved.
In still another embodiment of the present invention, there is also provided an interaction method, as shown in fig. 4, the method may include the steps of:
step S401, responding to a first touch instruction in a designated area in an interactive control, and displaying an interactive module;
step S402, obtaining coordinate information of each option in the interaction module;
step S403, resolving a touch position coordinate from the second touch instruction;
step S404, comparing the touch position coordinates with the coordinates of each option respectively;
step S405, determining an option having the smallest distance between the coordinates thereof and the touch position coordinates as a display option.
For example, when comparing the touch position coordinate with the coordinate of each option, the touch position coordinate (X, Y) may be compared with the coordinates of the two adjacent options: let the coordinate of the first option be (X1, Y1) and the coordinate of the second option be (X2, Y2), let L1 be the distance between (X, Y) and (X1, Y1), and let L2 be the distance between (X, Y) and (X2, Y2). If L1 < L2, the distance between the touch position and the first option is the smallest, and the first option is determined to be the display option.
The above determination methods are merely examples, and other methods may be used in practical applications according to practical situations, and the present invention is not limited thereto.
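The distance comparison above (L1 versus L2) can be sketched directly; the coordinates here are examples, not values from the patent.

```python
import math

# Sketch of step S405: the option whose coordinate is closest to the
# touch position becomes the display option.

def nearest_option(touch, option_coords):
    """Compare the touch position with each option's coordinate and
    return the option at the smallest Euclidean distance."""
    return min(option_coords,
               key=lambda name: math.dist(touch, option_coords[name]))

coords = {"first option": (10, 0), "second option": (80, 0)}
# L1 = dist((30,0),(10,0)) = 20; L2 = dist((30,0),(80,0)) = 50, so L1 < L2.
selected = nearest_option((30, 0), coords)
```

Unlike the set-range variant, this determination always yields exactly one display option, even when the touch is far from every option.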
The embodiment of the invention displays an interaction module in response to a first touch instruction within a designated area of an interactive control; then obtains coordinate information of each option in the interaction module; resolves a touch position coordinate from the second touch instruction and compares it with the coordinate of each option; and finally determines the option whose coordinate is at the smallest distance from the touch position coordinate as the display option. Having acquired the coordinate information of the touch position and of each option, the method compares them so as to explicitly display the option; a like can thus carry any of various emotions, and interaction quality is improved.
On the basis of the foregoing embodiment, in another embodiment of the present invention, an interaction method is further provided, where the displaying a part of the display options in an enlarged manner includes:
and amplifying and displaying the option with the smallest distance between the coordinate of the display option and the coordinate of the touch position.
In this step, the process of determining the option whose coordinate is at the smallest distance from the touch position coordinate has been described above and is not repeated here; after the display option is determined, it is displayed enlarged.
In still another embodiment of the present invention, there is also provided an interactive apparatus, as shown in fig. 5; the apparatus may include:
the display module 51 is used for responding to a first touch instruction in a specified area in the interactive control and displaying the interactive module;
an obtaining module 52, configured to obtain coordinate information of each option in the interaction module;
and the execution module 53 is used for responding to the second touch instruction aiming at the option and explicitly displaying the option.
The apparatus of this embodiment displays an interaction module in response to a first touch instruction within a designated area of an interactive control; then obtains coordinate information of each option in the interaction module; and finally, in response to a second touch instruction directed at an option, explicitly displays that option. The user thus selects, from the several emotion types in the interaction module, the emotion appropriate to the current scene, so that a like can carry any of various emotions and interaction quality is improved.
In still another embodiment of the present invention, there is also provided an interactive apparatus, as shown in fig. 6; the apparatus may include:
the display module 51 is used for responding to a first touch instruction in a specified area in the interactive control and displaying the interactive module;
an obtaining module 52, configured to obtain coordinate information of each option in the interaction module;
a first coordinate analyzing unit 61, configured to analyze the touch position coordinate from the second touch instruction;
a first judging unit 62, configured to compare the touch position coordinates with coordinates of each option respectively;
and a first determination option unit 63 for determining an option, of which a distance from the coordinate to the touch position coordinate falls within a set range, as a display option.
The apparatus of this embodiment displays an interaction module in response to a first touch instruction within a designated area of an interactive control; obtains coordinate information of each option in the interaction module; resolves a touch position coordinate from the second touch instruction and compares it with the coordinate of each option; and finally determines the options whose distance from the touch position coordinate falls within a set range as display options. Having acquired the coordinate information of the touch position and of each option, the apparatus compares them so as to explicitly display the matching options; a like can thus carry any of various emotions, and interaction quality is improved.
On the basis of the above embodiment, in another embodiment of the present invention, there is further provided an interactive apparatus, after explicitly displaying the options, further including:
and displaying all or part of the display options in an enlarged mode.
On the basis of the foregoing embodiment, in another embodiment of the present invention, there is further provided an interactive device, where the displaying the part of the display options in an enlarged manner includes:
and amplifying and displaying the option with the smallest distance between the coordinate of the display option and the coordinate of the touch position.
In still another embodiment of the present invention, there is also provided an interactive apparatus, as shown in fig. 7; the apparatus may include:
the display module 51 is used for responding to a first touch instruction in a specified area in the interactive control and displaying the interactive module;
an obtaining module 52, configured to obtain coordinate information of each option in the interaction module;
a second coordinate resolving unit 71, configured to resolve the touch position coordinate from the second touch instruction;
a second judging unit 72 for comparing the touch position coordinates with coordinates of each option, respectively;
and a second determination option unit 73 for determining an option having a smallest distance from the coordinate to the touch position coordinate as a display option.
The apparatus of this embodiment displays an interaction module in response to a first touch instruction within a designated area of an interactive control; obtains coordinate information of each option in the interaction module; resolves a touch position coordinate from the second touch instruction and compares it with the coordinate of each option; and finally determines the option whose coordinate is at the smallest distance from the touch position coordinate as the display option. Having acquired the coordinate information of the touch position and of each option, the apparatus compares them so as to explicitly display the option; a like can thus carry any of various emotions, and interaction quality is improved.
In another embodiment of the present invention, there is also provided a terminal including: the system comprises a processor, a communication interface, a memory and a communication bus, wherein the processor, the communication interface and the memory are communicated with each other through the communication bus; the memory stores a computer program that can be run on the processor, and the processor executes the computer program to implement the steps of the method of the above-mentioned method embodiments.
According to the terminal provided by the embodiment of the invention, the processor responds to the first touch instruction in the designated area in the interactive control by executing the program stored in the memory, and displays the interactive module; acquiring coordinate information of each option in the interactive module; and responding to a second touch instruction aiming at the option, explicitly displaying the option, realizing praise with various emotions and improving the interaction quality.
The communication bus 1140 mentioned in the above terminal may be a Peripheral Component Interconnect (PCI) bus, an Extended Industry Standard Architecture (EISA) bus, or the like. The communication bus 1140 may be divided into an address bus, a data bus, a control bus, and the like. For ease of illustration, only one thick line is shown in FIG. 8, but this is not intended to represent only one bus or type of bus.
The communication interface 1120 is used for communication between the terminal and other devices.
The memory 1130 may include a Random Access Memory (RAM), and may also include a non-volatile memory, such as at least one disk storage device. Optionally, the memory may also be at least one storage device located remotely from the aforementioned processor.
The processor 1110 may be a general-purpose processor, including a Central Processing Unit (CPU), a Network Processor (NP), and the like; it may also be a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or discrete hardware components.
In yet another embodiment of the present invention, there is also provided a computer readable medium having non-volatile program code executable by a processor, the program code causing the processor to perform the method of the method embodiment.
In the above embodiments, the implementation may be realized wholly or partially by software, hardware, firmware, or any combination thereof. When implemented in software, the embodiments may be realized wholly or partially in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer program instructions are loaded and executed on a computer, the procedures or functions according to the embodiments of the invention are produced in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable device. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another; for example, the computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center by wire (e.g., coaxial cable, optical fiber, Digital Subscriber Line (DSL)) or wirelessly (e.g., infrared, radio, microwave). The computer-readable storage medium can be any available medium that a computer can access, or a data storage device such as a server or data center that integrates one or more available media. The available medium may be a magnetic medium (e.g., floppy disk, hard disk, magnetic tape), an optical medium (e.g., DVD), or a semiconductor medium (e.g., Solid State Disk (SSD)), among others.
Finally, it should be noted that: the above examples are only intended to illustrate the technical solution of the present invention, but not to limit it; although the present invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; and such modifications or substitutions do not depart from the spirit and scope of the corresponding technical solutions of the embodiments of the present invention.
It is noted that, in this document, relational terms such as "first" and "second" may be used solely to distinguish one entity or action from another, without necessarily requiring or implying any actual such relationship or order between those entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
The foregoing are merely exemplary embodiments of the present invention, which enable those skilled in the art to understand or practice the present invention. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the invention. Thus, the present invention is not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.

Claims (10)

1. An interaction method, comprising:
responding to a first touch instruction in a designated area in the interactive control, and displaying an interactive module;
acquiring coordinate information of each option in the interactive module;
and in response to a second touch instruction aiming at the option, explicitly displaying the option.
2. The interactive method of claim 1, wherein the explicitly displaying the option in response to the second touch instruction for the option comprises:
analyzing a touch position coordinate from the second touch instruction;
comparing the touch position coordinates with coordinates of each option respectively;
and determining an option whose coordinate is at a distance from the touch position coordinate falling within a set range as a display option.
3. The interactive method of claim 2, wherein after explicitly displaying the option, the method further comprises:
and displaying all or part of the display options in an enlarged mode.
4. The interactive method of claim 3, wherein the displaying the portion of the display options in an enlarged manner comprises:
and displaying, in an enlarged manner, the display option whose coordinate is at the smallest distance from the touch position coordinate.
5. The interactive method of claim 1, wherein the explicitly displaying the option in response to the second touch instruction for the option comprises:
analyzing a touch position coordinate from the second touch instruction;
comparing the touch position coordinates with coordinates of each option respectively;
and determining the option whose coordinate is at the smallest distance from the touch position coordinate as a display option.
6. An interactive apparatus, comprising:
the display module is used for responding to a first touch instruction in a specified area in the interactive control and displaying the interactive module;
the acquisition module is used for acquiring the coordinate information of each option in the interaction module;
and the execution module is used for responding to the second touch instruction aiming at the option and explicitly displaying the option.
7. The interaction apparatus of claim 6, wherein the execution module comprises:
the first analysis coordinate unit is used for analyzing the touch position coordinate from the second touch instruction;
the first judgment unit is used for comparing the touch position coordinates with the coordinates of each option respectively;
and the first option determining unit is used for determining options whose coordinates are at distances from the touch position coordinate falling within a set range as display options.
8. The interaction apparatus of claim 6, wherein the execution module comprises:
the second analysis coordinate unit is used for analyzing the touch position coordinate from the second touch instruction;
the second judging unit is used for comparing the touch position coordinates with the coordinates of each option respectively;
and the second option determining unit is used for determining the option whose coordinate is at the smallest distance from the touch position coordinate as a display option.
9. A terminal, comprising: the system comprises a processor, a communication interface, a memory and a communication bus, wherein the processor, the communication interface and the memory are communicated with each other through the communication bus; stored in the memory is a computer program that can be run on the processor, characterized in that the steps of the method according to any of the preceding claims 1 to 5 are implemented when the computer program is executed by the processor.
10. A computer-readable medium having non-volatile program code executable by a processor, wherein the program code causes the processor to perform the method of any of claims 1 to 5.
CN201910935845.2A 2019-09-29 2019-09-29 Interaction method, device, terminal and computer readable medium Pending CN110825303A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910935845.2A CN110825303A (en) 2019-09-29 2019-09-29 Interaction method, device, terminal and computer readable medium


Publications (1)

Publication Number Publication Date
CN110825303A true CN110825303A (en) 2020-02-21

Family

ID=69548491

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910935845.2A Pending CN110825303A (en) 2019-09-29 2019-09-29 Interaction method, device, terminal and computer readable medium

Country Status (1)

Country Link
CN (1) CN110825303A (en)


Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101840299A (en) * 2010-03-18 2010-09-22 华为终端有限公司 Touch operation method, device and mobile terminal
CN105138222A (en) * 2015-08-26 2015-12-09 美国掌赢信息科技有限公司 Method for selecting expression icon and electronic equipment
CN106250046A (en) * 2016-08-10 2016-12-21 北京金山安全软件有限公司 Praise processing method and device and terminal equipment
CN106873860A (en) * 2017-03-16 2017-06-20 北京搜狐新媒体信息技术有限公司 The network information comments on method and device
US20180348996A1 (en) * 2014-09-30 2018-12-06 Cienet Technologies (Beijing) Co., Ltd. Instant messaging method, client, and system based on graph grid
CN109697100A (en) * 2018-12-29 2019-04-30 天津字节跳动科技有限公司 Conversation message display processing method and device


Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022001542A1 (en) * 2020-07-03 2022-01-06 Oppo广东移动通信有限公司 Information processing method and apparatus, and storage medium, and electronic device
US11941245B2 (en) 2020-07-03 2024-03-26 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Information processing method, storage medium, and electronic device
CN112764612A (en) * 2021-01-21 2021-05-07 北京字跳网络技术有限公司 Interaction method, interaction device, electronic equipment and storage medium
CN112667120A (en) * 2021-01-22 2021-04-16 百果园技术(新加坡)有限公司 Display method and device of interactive icon and electronic equipment


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20200221