CN113342218A - Interaction method and terminal equipment - Google Patents


Info

Publication number
CN113342218A
Authority
CN
China
Prior art keywords
interaction
interface
user
preset
operation instruction
Prior art date
Legal status
Pending
Application number
CN202010099815.5A
Other languages
Chinese (zh)
Inventor
胡湘宁
Current Assignee
Alibaba Group Holding Ltd
Original Assignee
Alibaba Group Holding Ltd
Priority date
Filing date
Publication date
Application filed by Alibaba Group Holding Ltd
Priority to CN202010099815.5A
Publication of CN113342218A
Legal status: Pending

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03: Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/041: Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F 3/0414: Digitisers using force sensing means to determine a position
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481: Interaction techniques based on GUIs based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/0487: Interaction techniques based on GUIs using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488: Interaction techniques based on GUIs using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04883: Interaction techniques based on GUIs using a touch-screen or digitiser for inputting data by handwriting, e.g. gesture or text
    • G06F 3/04886: Interaction techniques based on GUIs using a touch-screen or digitiser, by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
    • G06F 9/00: Arrangements for program control, e.g. control units
    • G06F 9/06: Arrangements for program control using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/44: Arrangements for executing specific programs
    • G06F 9/451: Execution arrangements for user interfaces

Abstract

The embodiments of this specification disclose an interaction method, a method for obtaining commodity comments, a method for quitting video playing, a terminal device, and a computer-readable storage medium. The method comprises the following steps: when it is detected that a first operation instruction of a user meets a first preset operation instruction, providing an interaction control group in at least part of the interface, the interaction control group comprising at least one interaction control; detecting the movement track of the first operation instruction, determining the selected interaction control in the interaction control group according to the offset direction of the movement track, and highlighting it; and, when the end of the first operation instruction is detected, determining the highlighted interaction control as the interaction control selected by the user and executing the interaction action corresponding to that control.

Description

Interaction method and terminal equipment
Technical Field
The present disclosure relates to the field of human-computer interaction technologies, and in particular, to an interaction method, a touch screen-based interaction method, a method for obtaining a commodity comment, a method for quitting video playing, a terminal device, and a computer-readable storage medium.
Background
A smart terminal is a terminal device that runs an intelligent operating system, can access the internet, and can run a variety of APPs (applications).
At present, to realize human-computer interaction, a user inputs an interaction instruction through a physical key, a virtual button, or a similar means provided by the smart terminal. With such approaches, however, one operation of the user can execute only one basic interaction instruction, which is inconvenient for the user.
Disclosure of Invention
The interaction method provided by the embodiment of the specification enables a user to input a plurality of specific interaction instructions through one continuous operation.
According to a first aspect of the present disclosure, there is provided an interaction method, including:
when it is detected that a first operation instruction of a user meets a first preset operation instruction, providing an interaction control group in at least a partial area of the interface, wherein the interaction control group comprises at least one interaction control;
detecting a moving track of the first operation instruction, determining a selected interactive control in the interactive control group according to the offset direction of the moving track, and highlighting the interactive control;
and under the condition that the end of the first operation instruction is detected, determining the highlighted interaction control as the interaction control selected by the user, and executing the interaction action corresponding to the interaction control selected by the user.
According to a second aspect of the present disclosure, there is provided a touch screen-based interaction method implemented by a device having a touch screen, including the following steps:
when it is detected that a first operation instruction of a user meets a first preset operation instruction, providing an interaction control group in at least a partial area of an interface of the touch screen, wherein the interaction control group comprises at least one interaction control;
detecting a moving track of the first operation instruction, determining a selected interactive control in the interactive control group according to the offset direction of the moving track, and highlighting the interactive control;
when the end of the first operation instruction is detected, determining the highlighted interaction control as the interaction control selected by the user and executing the interaction action corresponding to the interaction control selected by the user; and returning to the previous page.
According to a third aspect of the present disclosure, there is provided a method for obtaining a product review, which is implemented by a device having a touch screen, including the following steps:
when it is detected that a first operation instruction of a user meets a first preset operation instruction, providing an interaction control group in at least a partial area of a commodity display interface provided by the touch screen, wherein the interaction control group comprises at least one interaction control for the user to express an evaluation of the commodity;
detecting a moving track of the first operation instruction, determining a selected interactive control in the interactive control group according to the offset direction of the moving track, and highlighting the interactive control;
when the end of the first operation instruction is detected, determining the highlighted interaction control as the interaction control selected by the user and executing the interaction action corresponding to the interaction control selected by the user; and returning to the previous page.
According to a fourth aspect of the present disclosure, there is provided a method for exiting video playing, which is implemented by a device having a touch screen, and includes the following steps:
when it is detected that a first operation instruction of a user meets a first preset operation instruction, providing an interaction control group in at least a partial area of a video content playing interface provided by the touch screen, wherein the interaction control group comprises at least one interaction control for the user to express a preference regarding the video content;
detecting a moving track of the first operation instruction, determining a selected interactive control in the interactive control group according to the offset direction of the moving track, and highlighting the interactive control;
when the end of the first operation instruction is detected, determining the highlighted interaction control as the interaction control selected by the user and executing the interaction action corresponding to the interaction control selected by the user; and exiting the playing of the video content.
According to a fifth aspect of the present disclosure, there is provided a terminal device comprising an input means, a processor and a memory;
the input device is used for inputting an operation instruction by a user;
the memory stores computer instructions, and the computer instructions, when executed by the processor, implement the interaction method provided by the first aspect of the present disclosure.
According to a sixth aspect of the present disclosure, there is provided a terminal device comprising a touch screen, a processor and a memory; the memory stores computer instructions, and the computer instructions, when executed by the processor, implement the interaction method provided by the second aspect of the present disclosure.
According to a seventh aspect of the present disclosure, there is provided a terminal device comprising a touch screen, a processor and a memory; the memory stores computer instructions, and the computer instructions, when executed by the processor, implement the method for obtaining a product review provided by the third aspect of the present disclosure.
According to an eighth aspect of the present disclosure, there is provided a terminal device comprising a touch screen, a processor and a memory; the memory stores computer instructions, and the computer instructions, when executed by the processor, implement the method for exiting video playing provided by the fourth aspect of the present disclosure.
According to a ninth aspect of the present disclosure, there is provided a computer readable storage medium having stored thereon computer instructions, which when executed by a processor, implement the interaction method provided by the first aspect of the present disclosure.
According to a tenth aspect of the present disclosure, there is provided a computer-readable storage medium having stored thereon computer instructions, which when executed by a processor, implement the interaction method provided by the second aspect of the present disclosure.
According to an eleventh aspect of the present disclosure, there is provided a computer readable storage medium having stored thereon computer instructions, which when executed by a processor, implement the method for obtaining reviews of goods provided by the third aspect of the present disclosure.
According to a twelfth aspect of the present disclosure, there is provided a computer readable storage medium having stored thereon computer instructions, which when executed by a processor, implement the method for exiting video playback provided by the fourth aspect of the present disclosure.
In the interaction method provided in the embodiments of the present specification, when it is detected that a first operation instruction of a user meets a first preset operation instruction, an interaction control group is provided; an interaction control is then selected from the interaction control group according to the offset direction of the movement track of the first operation instruction; and when the first operation instruction ends, the interaction action corresponding to the selected interaction control is executed. In other words, the interaction method enables a user to input a plurality of specific interaction instructions through one continuous operation, thereby providing an interaction control group, selecting an interaction control, and executing the corresponding interaction action. This greatly facilitates the user and provides a better use experience.
Features of embodiments of the present specification and advantages thereof will become apparent from the following detailed description of exemplary embodiments thereof, which proceeds with reference to the accompanying drawings.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments of the specification and together with the description, serve to explain the principles of the embodiments of the specification.
Fig. 1 is a block diagram of a terminal device provided in an embodiment of the present specification.
FIG. 2 is a flow chart of an interaction method provided by a first embodiment of the present specification;
FIG. 3 is a flow chart of an interaction method provided by a second embodiment of the present specification;
FIG. 4 is a diagram illustrating an interface change process of an interaction method provided in a second embodiment of the present specification;
FIG. 5 is a schematic diagram of an interface change process of a method for obtaining reviews of goods according to a third embodiment of the present specification;
Figs. 6(a)-6(c) are schematic diagrams of the interface change process of the interaction method provided by the fourth embodiment of the present specification;
fig. 7 is a schematic diagram of an interface change process of a method for exiting video playing provided in a fifth embodiment of the present specification;
fig. 8 is a schematic diagram of a terminal device provided in an embodiment of the present specification;
fig. 9 is a schematic diagram of a terminal device provided in an embodiment of the present specification.
Detailed Description
Various exemplary embodiments of the present specification will now be described in detail with reference to the accompanying drawings.
The following description of at least one exemplary embodiment is merely illustrative in nature and is in no way intended to limit the embodiments, their application, or uses.
It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, further discussion thereof is not required in subsequent figures.
Fig. 1 is a block diagram of a hardware configuration of a terminal device provided in an embodiment of the present specification.
As shown in fig. 1, the terminal device 300 is an electronic device installed with an intelligent operating system (e.g., Android, iOS, Windows, Linux, etc.), including but not limited to a laptop, a desktop computer, a mobile phone, a tablet computer, and the like. The terminal device 300 has the capability of accessing the internet.
The configuration of the terminal apparatus 300 includes, but is not limited to, the processor 3010, the memory 3020, the interface device 3030, the communication device 3040, the display device 3050, the input device 3060, the speaker 3070, and the camera 3080.
The processor 3010 includes but is not limited to a central processing unit (CPU), a microcontroller unit (MCU), and the like. The processor 3010 may also include a graphics processing unit (GPU). The memory 3020 includes, but is not limited to, ROM (read-only memory), RAM (random access memory), and nonvolatile memory such as a hard disk. The interface device 3030 includes, but is not limited to, a USB interface, a serial interface, a parallel interface, a headphone jack, and the like. The communication device 3040 is capable of wired or wireless communication, and may specifically include WiFi communication, Bluetooth communication, 2G/3G/4G/5G communication, and the like. The display device 3050 includes a touch screen. The input device 3060 may include, but is not limited to, a keyboard, a mouse, and the like. The configuration of the terminal device 300 may include only some of the above devices.
In an embodiment applied to this specification, a user may access and operate on a corresponding page through various APPs (applications) loaded by the terminal device 300 to implement functions required by the user, for example, listening to music through a music APP, shopping through a shopping APP, making a hotel reservation through a hotel reservation APP, and the like.
The applications related to the illustrative embodiments include, but are not limited to, Android applications, iOS applications, applets, Web applications, and the like.
The terminal device shown in fig. 1 is merely illustrative and in no way implies any limitation of the embodiments of the present description, their application, or uses. It should be understood by those skilled in the art that although a plurality of devices of the terminal equipment are described in the foregoing, the embodiments of the present specification may refer to only some of the devices. Those skilled in the art can design instructions based on the disclosed embodiments of the present specification. How the instructions control the operation of the processor is well known in the art and will not be described in detail herein.
Referring to fig. 2, an embodiment of the present invention provides an interaction method, implemented by a terminal device, including the following steps:
s102, providing an interaction control group in at least part of interfaces under the condition that a first operation instruction of a user is detected to meet a first preset operation instruction, wherein the interaction control group comprises at least one interaction control.
The user may input the first operation instruction through an input device, for example by manipulating a mouse, or by touching a touch screen with a finger or a stylus. The terminal device detects the first operation instruction of the user, and if the first operation instruction meets the first preset operation instruction, the terminal device presents an interaction control group containing one or more interaction controls on the interface.
In the case that the user inputs the operation instruction by manipulating a mouse, the first preset operation instruction may take any of the following forms (an illustrative check for one of these forms is sketched after the list):
(1) Moving the mouse pointer to the position of a preset virtual button on the interface. In a specific example, the preset virtual button may be a back button. In another specific example, the interface is playing video content and the preset virtual button may be a button for exiting video playing.
(2) Moving the mouse pointer to the position of the preset virtual button on the interface and keeping the pointer there for a first preset time. The examples of the preset virtual button are the same as in form (1). The first preset time is, for example, 1 second.
(3) Pressing a mouse button to click the preset virtual button on the interface. The examples of the preset virtual button are the same as in form (1).
(4) Moving the mouse pointer to the position of a content item on the interface. A plurality of content items may be presented on the interface, and the mouse pointer is moved to the position of one of them.
(5) Moving the mouse pointer to the position of a content item on the interface and keeping the pointer there for a first preset time, for example 1 second.
(6) Pressing a mouse button to click a content item on the interface. A plurality of content items may be presented on the interface, and one of them is clicked.
(7) Moving the mouse pointer to a first preset area of the interface. The first preset area is set in advance; for example, the left, upper, or lower area of the interface may be preset as the first preset area.
(8) Moving the mouse pointer to a first preset area of the interface and keeping the pointer there for a first preset time, for example 1 second. The examples of the first preset area are the same as in form (7).
(9) Pressing a mouse button to click a first preset area of the interface. The examples of the first preset area are the same as in form (7).
(10) Pressing a mouse button to click a first preset area of the interface and then moving in a first preset direction. For example, the first preset area is the left area of the interface and the first preset direction is from the left area toward the right area; or the first preset area is the upper area and the first preset direction is from the upper area toward the lower area; or the first preset area is the lower area and the first preset direction is from the lower area toward the upper area.
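For illustration only, the following Kotlin sketch shows one way form (8) above could be checked: the pointer must end up inside a preset area and must have stayed there for at least the first preset time. The types PointerSample and PresetArea and the 1-second threshold are assumptions made for the sketch, not details taken from the disclosed embodiments.

```kotlin
// Illustrative sketch, not the patented implementation.
data class PointerSample(val x: Float, val y: Float, val timeMillis: Long)

data class PresetArea(val left: Float, val top: Float, val right: Float, val bottom: Float) {
    fun contains(x: Float, y: Float) = x in left..right && y in top..bottom
}

const val FIRST_PRESET_TIME_MS = 1_000L  // "first preset time", e.g. 1 second

/**
 * Returns true when the pointer trace ends inside [area] and has stayed
 * inside it continuously for at least [FIRST_PRESET_TIME_MS].
 */
fun satisfiesDwellInstruction(trace: List<PointerSample>, area: PresetArea): Boolean {
    val last = trace.lastOrNull() ?: return false
    if (!area.contains(last.x, last.y)) return false
    // Walk backwards to find when the pointer last entered the area.
    var enteredAt = last.timeMillis
    for (sample in trace.asReversed()) {
        if (!area.contains(sample.x, sample.y)) break
        enteredAt = sample.timeMillis
    }
    return last.timeMillis - enteredAt >= FIRST_PRESET_TIME_MS
}
```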
When the user inputs the operation instruction through the touch screen, the first preset operation instruction may take any of the following forms (an illustrative press-and-slide check is sketched after the list):
(1) Moving a finger or the tip of a stylus to a preset virtual button on the interface. In a specific example, the preset virtual button may be a back button. In another specific example, the interface is playing video content and the preset virtual button may be a button for exiting video playing.
(2) Moving a finger or the tip of a stylus to the preset virtual button on the interface and keeping it there for a first preset time, for example 1 second. The examples of the preset virtual button are the same as in form (1).
(3) Pressing the preset virtual button on the interface with a finger or a stylus at a force meeting a first preset force requirement and holding it for at least a first preset time, for example 1 second. The examples of the preset virtual button are the same as in form (1). Some terminal devices have pressure-sensitive touch screens (for example, mobile phones with the 3D Touch function of Apple Inc.) that can sense and distinguish how hard the user presses the screen, so presses of different force can trigger different control instructions. In one embodiment of the invention, the first preset force requirement corresponds to a relatively large pressing force; when the user presses the preset virtual button with such a force and holds it for at least the first preset time, the first preset operation instruction is satisfied.
(4) Moving a finger or the tip of a stylus to a content item on the interface. A plurality of content items may be presented on the interface, and the finger or stylus tip is moved to one of them.
(5) Moving a finger or the tip of a stylus to a content item on the interface and keeping it there for a first preset time, for example 1 second.
(6) Pressing a content item on the interface with a finger or a stylus at a force meeting the first preset force requirement and holding it for at least a first preset time, for example 1 second. As in form (3), when the user presses one of the content items with a relatively large force and holds it for at least the first preset time, the first preset operation instruction is satisfied.
(7) Moving a finger or the tip of a stylus to a first preset area of the interface. The first preset area is set in advance; for example, the left, upper, or lower area of the interface may be preset as the first preset area.
(8) Moving a finger or the tip of a stylus to a first preset area of the interface and keeping it there for a first preset time, for example 1 second. The examples of the first preset area are the same as in form (7).
(9) Pressing a first preset area of the interface with a finger or a stylus at a force meeting the first preset force requirement and then moving in a first preset direction. For example, the first preset area is the left area of the interface and the first preset direction is from the left area toward the right area; or the first preset area is the upper area and the first preset direction is from the upper area toward the lower area; or the first preset area is the lower area and the first preset direction is from the lower area toward the upper area. As in form (3), when the user presses with a relatively large force and then moves in the first preset direction, the first preset operation instruction is satisfied.
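As a companion sketch for form (9) above, the following Kotlin snippet tests whether a touch sequence starts with a firm press in a left-edge "first preset area" and then moves toward the right side of the interface. The pressure threshold, the width of the preset area, and the minimum slide distance are all assumed values, and a pressure-sensitive screen is assumed.

```kotlin
// Illustrative sketch; the thresholds below are assumptions, not values from the specification.
data class TouchSample(val x: Float, val y: Float, val pressure: Float)

const val FIRST_PRESET_FORCE = 0.8f        // hypothetical normalized pressure threshold
const val MIN_SLIDE_DISTANCE_PX = 48f      // hypothetical travel in the first preset direction

/**
 * Form (9) with a left-edge preset area: a firm press in the left 15% of the
 * interface followed by movement toward the right side (the first preset direction).
 */
fun satisfiesPressAndSlide(samples: List<TouchSample>, interfaceWidth: Float): Boolean {
    if (samples.size < 2) return false
    val down = samples.first()
    val pressedInPresetArea =
        down.x <= interfaceWidth * 0.15f && down.pressure >= FIRST_PRESET_FORCE
    val movedInPresetDirection = samples.last().x - down.x >= MIN_SLIDE_DISTANCE_PX
    return pressedInPresetArea && movedInPresetDirection
}
```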
In one embodiment of the invention, the interaction control group comprises at least one interaction control through which the user can express a preference. The interaction control group can include an interaction control for expressing a positive rating and an interaction control for expressing a negative rating, and may also include an interaction control for declining to rate. The object of the evaluation can be the content, goods, or services presented on the current page, and the evaluation reflects the user's opinion of or preference for that object. For example, positive-rating interaction controls may be interaction controls that implement functions such as "score", "like", "favorite", "add to wish list", or "add to shopping cart". Negative-rating interaction controls may be interaction controls that implement functions such as "downvote" or "dislike". The interaction control for declining to rate suits the case in which the user does not want to give a rating; for example, it may be marked with a "cancel" or "decline to rate" symbol or with the text "cancel" or "decline to rate".
Embodiments of the present specification are not limited to the aforementioned interaction controls for expressing user preferences; other types of interaction controls may also be included in the interaction control group, such as "feedback", "complaint", and "consultation" controls.
The interaction control in the embodiment of the present invention may also be used to further obtain some information or implement a specific function, for example, the interaction control may be used to "view a map", "view a rating", "navigate", "search", and the like.
In an embodiment of the invention, a semi-transparent cover layer is loaded on the current page, and an interactive control group is loaded on the semi-transparent cover layer to highlight the interactive control group. Or, the brightness of the current page is reduced, and the interaction control group is loaded on the current page, so that the effect of highlighting the interaction control group can be realized.
And S104, detecting the moving track of the first operation instruction, determining the selected interactive control in the interactive control group according to the offset direction of the moving track, and displaying the interactive control in a highlighted mode.
The method comprises the steps that a user can input a first operation instruction through an input device, the terminal equipment detects the first operation instruction of the user, and if the first operation instruction meets a first preset operation instruction, the terminal equipment presents an interaction control group containing one or more interaction controls on an interface; the terminal equipment continuously detects the moving track of the first operation instruction of the user, one interactive control is selected from the interactive control group according to the moving track of the first operation instruction, and the selected interactive control is highlighted so that the selected interactive control can be distinguished from other interactive controls more obviously to prompt the user that the interactive control is selected.
That is to say, in the embodiment of the present invention, the first operation instruction includes two consecutive stages, the operation in the first stage needs to conform to the first preset operation instruction to present the interaction control group, and the movement in the second stage is used to select the interaction control.
In step S102, if the first preset operation instruction includes moving in the first preset direction, the plurality of interaction controls of the interaction control group may be arranged in a direction perpendicular to the first preset direction when the group is presented. In this case, in step S104, an interaction control can be selected from the interaction control group simply by turning the first operation instruction so that it moves in a direction perpendicular to the previous movement direction.
The selected interaction control may be highlighted in any of the following ways:
(1) Changing the color of the selected interaction control.
(2) Enlarging the selected interaction control.
(3) Stretching the selected interaction control.
(4) Expanding the selected interaction control. When the first operation instruction moves such that a certain interaction control becomes selected, that control can be fully expanded to display more information about its function, so that the user understands the specific meaning of the interaction control.
The above highlighting may also be used in combination, for example, to change the color of the selected interaction control and to zoom in on the selected interaction control as a whole.
Determining the selected interaction control in the interaction control group according to the offset direction of the movement track of the first operation instruction may include steps S1042 and S1044:
and S1042, moving the interactive control group along the offset direction of the moving track of the first operation instruction, so that the interactive control group moves in the range of the interface.
At the second stage, the interactive control group can move integrally within the range of the interface along with the offset direction of the movement operation of the user, and certain damping is preset in the Application (APP) to ensure that the interactive control group cannot be dragged out of the interface.
In this way, the interactive control group can move along with the movement operation of the user, and better use experience can be given to the user. Meanwhile, even if the moving distance of the first operation instruction is larger, the situation that the interaction control group is dragged out of the interface can not occur, and the interaction control group is always presented to the user.
And S1044, determining the selected interactive control in the interactive control group along the offset direction of the movement track of the first operation instruction under the condition that the interactive control is located in a second preset area of the interface.
That is, when the interactive control group is located in the second preset area, the interactive control is selected according to the offset direction of the movement trajectory of the first operation instruction, for example, when the interactive control group is located in the second preset area, the offset direction of the movement trajectory of the first operation instruction points to the first interactive control of the interactive control group, and then the first interactive control is selected. In a specific example, the second predetermined area is, for example, an edge position of the interface.
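A minimal sketch of steps S1042 and S1044 follows, assuming a vertically arranged control group, a simple linear damping factor, and a "second preset area" at the top edge of the interface; all constants and names are hypothetical.

```kotlin
// Illustrative sketch of S1042 (damped, clamped movement of the whole group)
// and S1044 (selecting a control only once the group is in the second preset area).
const val DAMPING = 0.4f                // hypothetical damping factor
const val EDGE_AREA_HEIGHT_PX = 120f    // hypothetical "second preset area" at the top edge

/** Damped vertical offset of the whole group, clamped so it stays inside the interface. */
fun dampedGroupOffset(dragOffsetY: Float, maxTravelPx: Float): Float =
    (dragOffsetY * DAMPING).coerceIn(-maxTravelPx, maxTravelPx)

/** True once the top of the group has been dragged into the second preset area. */
fun groupInSecondPresetArea(groupTopY: Float): Boolean = groupTopY <= EDGE_AREA_HEIGHT_PX

/**
 * Index of the control the pointer currently points at, or null while the group
 * has not yet reached the second preset area (no control is selected then).
 */
fun selectedControlIndex(
    groupTopY: Float,
    pointerY: Float,
    controlHeightPx: Float,
    controlCount: Int
): Int? {
    if (!groupInSecondPresetArea(groupTopY)) return null
    val index = ((pointerY - groupTopY) / controlHeightPx).toInt()
    return index.coerceIn(0, controlCount - 1)
}
```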
And S106, under the condition that the end of the first operation instruction is detected, determining the highlighted interactive control as the interactive control selected by the user, and executing the interactive action corresponding to the interactive control selected by the user.
To recap the full flow: the user inputs a first operation instruction through an input device; the terminal device detects the first operation instruction, and if it meets the first preset operation instruction, the terminal device presents an interaction control group containing one or more interaction controls on the interface; the terminal device keeps detecting the movement track of the first operation instruction, selects an interaction control from the interaction control group according to the movement track, and highlights the selected interaction control; and when the first operation instruction ends, the interaction action corresponding to the highlighted interaction control is executed.
In the embodiment of the present invention, the first operation instruction includes two consecutive stages: the operation in the first stage must match the first preset operation instruction so that the interaction control group is presented, the movement in the second stage selects an interaction control, and the end of the first operation instruction finally triggers execution of the interaction action corresponding to the selected interaction control. For example, the first operation instruction comprises pressing a mouse button to click (first stage), moving the mouse pointer while the button is held down to select the interaction control (second stage), and releasing the button to end the instruction. For example, the first operation instruction comprises moving the mouse pointer in a specific way (first stage), continuing to move the pointer to select the interaction control (second stage), and stopping the movement to end the instruction. For example, the first operation instruction comprises moving the mouse pointer in a specific way while a mouse button is held down (first stage), continuing to move the pointer to select the interaction control (second stage), and releasing the button to end the instruction. For example, the first operation instruction comprises touching the touch screen to click (first stage), sliding on the touch screen while maintaining contact to select the interaction control (second stage), and lifting off the touch screen to end the instruction. For example, the first operation instruction comprises touching the touch screen and performing a specific slide (first stage), continuing to slide on the touch screen to select the interaction control (second stage), and lifting off the touch screen to end the instruction.
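The two consecutive stages and the end of the instruction can be modelled as a small state machine. The sketch below uses generic Down/Move/Up events and callback parameters; the event names and callbacks are placeholders introduced for illustration and do not correspond to any particular platform API.

```kotlin
// Illustrative sketch of the two-stage first operation instruction.
sealed interface GestureEvent {
    data class Down(val x: Float, val y: Float) : GestureEvent   // start of the first stage
    data class Move(val x: Float, val y: Float) : GestureEvent   // second stage: selects a control
    object Up : GestureEvent                                     // end of the first operation instruction
}

class InteractionGesture(
    private val matchesFirstPresetInstruction: (x: Float, y: Float) -> Boolean,
    private val showControlGroup: () -> Unit,
    private val highlightControlAt: (x: Float, y: Float) -> Unit,
    private val executeHighlightedControl: () -> Unit
) {
    private var stageOneMatched = false

    fun onEvent(event: GestureEvent) {
        when (event) {
            is GestureEvent.Down -> {
                // Stage one: the operation must match the first preset operation instruction.
                stageOneMatched = matchesFirstPresetInstruction(event.x, event.y)
                if (stageOneMatched) showControlGroup()
            }
            is GestureEvent.Move -> {
                // Stage two: continued movement selects (highlights) an interaction control.
                if (stageOneMatched) highlightControlAt(event.x, event.y)
            }
            GestureEvent.Up -> {
                // End of the instruction: the highlighted control's action is executed.
                if (stageOneMatched) executeHighlightedControl()
                stageOneMatched = false
            }
        }
    }
}
```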
In one embodiment of the invention, the performing of the interactive action corresponding to the interactive control selected by the user comprises: and jumping to a page corresponding to the interactive control selected by the user. For example, the interaction control functions as "view map", and the interactive action corresponding to the interactive control selected by the user is to jump to a map page. For example, the interaction control functions as "view rating", and the interactive action corresponding to the interactive control selected by the user is to jump to a rating page.
In an embodiment of the present invention, if the selected interactive control is a control that needs to input further detailed information, such as "feedback", "complaint", "consultation", and the like, the performing of the interactive action corresponding to the interactive control selected by the user may include: and popping up an information input window for a user to input information.
In a specific example, the embodiment of the present invention is used to implement evaluation on a current page in a process of returning to a previous page. This is explained in detail below:
in this case, the preset virtual button referred to in step S102 is a return button, and in step S106, after the interactive action corresponding to the interactive control selected by the user is performed, the previous page is returned. Or, the first preset area in step S102 is a function area of the previous page, and in step S106, after the interactive action corresponding to the interactive control selected by the user is executed, the previous page is returned.
In a specific example, the step S102 of providing the interactive control group in the interface may include steps S1022 to S1024:
s1022, loading the previous page under the current page, and moving the current page to present the first portion of the previous page to the user.
And S1024, loading an interaction control group on the first part of the previous page.
That is, in this particular example, the previous page has already been loaded when the interaction control group is provided in the interface. Correspondingly, in step S106, after the interaction action corresponding to the interaction control selected by the user is executed, the loaded previous page can be fully displayed simply by closing the current page and removing the interaction control group, which achieves a quick return to the previous page and improves the response speed of the previous page.
In step S102, when providing the interaction control group, a semi-transparent cover layer may be loaded between the previous page and the interaction control group. For example, a semi-transparent cover layer is loaded under the current page, the previous page is loaded under the semi-transparent cover layer, and the interaction control group is located over the first portion of the previous page and presented above the semi-transparent cover layer. The semi-transparent cover layer makes the current page and the previous page easier to tell apart and makes the interaction controls more conspicuous, which improves the user experience and is convenient for the user. Correspondingly, in step S106, the semi-transparent cover layer needs to be removed to fully expose the previous page.
When the interaction control group is provided, the brightness of the previous page may also be reduced. Reducing the brightness of the previous page likewise makes the current page and the previous page easier to tell apart and makes the interaction controls more conspicuous, which improves the user experience and is convenient for the user. Correspondingly, in step S106, the brightness of the previous page needs to be increased so that the previous page returns to its normal state.
The removal of the user-selected interactive control may be delayed for a second preset time when the group of interactive controls is removed. The second preset time may be, for example, 1 second. That is, when the interaction control group is removed, the interaction control that is not selected is immediately removed, and the selected interaction control stays on the previous page for a short time and is removed later, so as to indicate to the user that the interaction control is selected by the user.
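Assuming a 1-second second preset time, the delayed removal could look like the following sketch; the Control type and the remove callback are hypothetical placeholders.

```kotlin
// Illustrative sketch of removing the group while letting the selected control linger.
import java.util.Timer
import java.util.TimerTask

const val SECOND_PRESET_TIME_MS = 1_000L   // "second preset time", e.g. 1 second

data class Control(val id: String, val selected: Boolean)

fun removeControlGroup(controls: List<Control>, remove: (Control) -> Unit) {
    // Unselected controls disappear immediately.
    controls.filterNot { it.selected }.forEach(remove)
    // The selected control stays briefly to confirm the user's choice, then is removed.
    val selected = controls.firstOrNull { it.selected } ?: return
    Timer().schedule(object : TimerTask() {
        override fun run() = remove(selected)
    }, SECOND_PRESET_TIME_MS)
}
```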
The interaction method provided by the embodiments of this specification enables a user to input a plurality of specific interaction instructions through one continuous operation, thereby realizing the multiple interaction actions of providing an interaction control group, selecting an interaction control, and executing the action corresponding to it. This greatly facilitates the user and provides a better use experience.
The specific schemes of the second to fifth embodiments can be further implemented by using the interaction method provided by the first embodiment.
Referring to fig. 3, an embodiment of the present invention provides a touch screen-based interaction method, which is implemented by a terminal device having a touch screen. Fig. 4 shows a change process of an interface of the terminal device under the control of the interaction method, and the method of the embodiment is used for evaluating the content presented on the current page in the process of returning to the previous page.
The interaction method based on the touch screen provided by the embodiment comprises the following steps:
s202, providing an interactive control group in at least part of interfaces of the touch screen under the condition that the first operation instruction of the user is detected to meet a first preset operation instruction, wherein the interactive control group comprises at least one interactive control.
Referring to the 200-1 interface of fig. 4, a virtual "back" button is provided on the current page, the "back" button is located on the left side of the interface, the first preset direction is a direction from the left side of the interface to the right side of the interface, and the first preset operation instruction includes clicking the "back" button and then moving towards the first preset direction, i.e. sliding transversely towards the right side of the interface.
Referring to the hand pattern and the directional arrow in the 200-1 interface of fig. 4, when the user touches and clicks the "back" button and then slides laterally toward the right side of the interface, the terminal device detects that the first operation instruction of the user satisfies the first preset operation instruction, loads the previous page below the current page, and moves the current page toward the first preset direction to present the first part of the previous page to the user, and loads the interactive control group above the first part of the previous page, so that the interface 200-1 is transformed into the interface 200-2. Referring to interface 200-2, the left side of the interface shows the left half of the previous page and the right side of the interface shows the left half of the current page; an interactive control group is provided on top of the first portion of the previous page, i.e., on the left side of the interface.
Referring to the interface 200-2, the interaction control group includes three interaction controls arranged vertically on the interface (the arrangement direction of the interaction controls is perpendicular to the first preset direction): from top to bottom, a first interaction control for expressing "like", a second interaction control for expressing "dislike", and a third interaction control for expressing "do not want to comment". In the interface 200-2, the interaction controls display only their corresponding symbols: the first interaction control expresses "like" with a complete heart-shaped pattern, the second interaction control expresses "dislike" with a broken heart-shaped pattern, and the third interaction control expresses "do not want to comment" with a cross-shaped pattern.
In other embodiments, the first preset operation instruction may be clicking a preset area of the interface and then moving in the first preset direction. Alternatively, the first preset operation instruction may be clicking a preset area of the interface at a force meeting a first preset force requirement and then moving in the first preset direction. For details, reference may be made to the description of the first preset operation instruction in the first embodiment, which is not repeated here.
S204, detecting a moving track of the first operation instruction, determining the selected interactive control in the interactive control group according to the offset direction of the moving track, and displaying the interactive control in a highlighted mode.
The method comprises the steps that a user can input a first operation instruction through a touch screen, the terminal equipment detects the first operation instruction of the user, and if the first operation instruction meets a first preset operation instruction, the terminal equipment presents an interaction control group containing one or more interaction controls on an interface; the terminal equipment continuously detects the moving track of the first operation instruction of the user, one interactive control is selected from the interactive control group according to the moving track of the first operation instruction, and the selected interactive control is highlighted so that the selected interactive control can be distinguished from other interactive controls more obviously to prompt the user that the interactive control is selected.
That is to say, in the embodiment of the present invention, the first operation instruction includes two consecutive stages, the operation in the first stage needs to conform to the first preset operation instruction to present the interaction control group, and the movement in the second stage is used to select the interaction control.
The three interaction controls in the interaction control group are arranged vertically. As shown by the hand-shaped pattern and the directional arrows in the interface 200-2, the user changes from sliding the finger horizontally in the first stage to sliding it vertically in the second stage in order to select an interaction control. Referring to the interfaces 200-2 and 200-3, the terminal device moves the interaction control group within the range of the interface along the offset direction of the movement track of the first operation instruction; the user slides the finger toward the upper side of the interface, so the interaction control group moves toward the upper side of the interface as a whole. Referring to the interface 200-3, the interaction control group has moved to an edge position of the interface, the user's finger finally stays at a position facing the first interaction control, and the terminal device determines from the offset direction of the movement track of the first operation instruction that the first interaction control is selected. Referring to the interface 200-4, after determining that the first interaction control is selected, the terminal device expands the first interaction control to prompt the user that it is selected. The expanded first interaction control additionally shows the text "like", so that the user understands that it is used for expressing "like" and does not select it by mistake.
In other embodiments, the manner of highlighting the selected interactive control may be any of the following:
(1) Changing the color of the selected interaction control.
(2) Enlarging the selected interaction control.
(3) Stretching the selected interaction control.
The above highlighting may also be used in combination, for example, to change the color of the selected interaction control and to zoom in on the selected interaction control as a whole.
S206, under the condition that the end of the first operation instruction is detected, determining the highlighted interactive control as the interactive control selected by the user, and executing the interactive action corresponding to the interactive control selected by the user; and returning to the previous page.
To recap the full flow: the user inputs a first operation instruction through an input device; the terminal device detects the first operation instruction, and if it meets the first preset operation instruction, the terminal device presents an interaction control group containing one or more interaction controls on the interface; the terminal device keeps detecting the movement track of the first operation instruction, selects an interaction control from the interaction control group according to the movement track, and highlights the selected interaction control; and when the first operation instruction ends, the interaction action corresponding to the highlighted interaction control is executed and the previous page is returned to.
In the embodiment of the present invention, the first operation instruction includes two consecutive stages, the operation in the first stage needs to conform to a first preset operation instruction to present an interaction control group, the movement in the second stage is used to select an interaction control, and finally the first operation instruction is ended to execute an interaction action corresponding to the selected interaction control and return to a previous page.
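The two-stage structure can be pictured as a small state machine. The Kotlin sketch below is illustrative only: the class and callback names, the horizontal threshold, and the way the control index is looked up are all assumptions rather than anything prescribed by this embodiment.

```kotlin
// Illustrative sketch only: a minimal state machine for the two consecutive stages of
// the first operation instruction and its end. All names and thresholds are assumptions.
sealed class GestureState {
    object Idle : GestureState()                            // stage one not yet satisfied
    object GroupPresented : GestureState()                  // control group is shown
    data class Selecting(val index: Int) : GestureState()   // stage two: a control is highlighted
}

class BackGesture(
    private val horizontalThreshold: Float,
    private val onShowGroup: () -> Unit,
    private val onHighlight: (Int) -> Unit,
    private val onCommit: (Int?) -> Unit   // execute the selected action and return to the previous page
) {
    private var state: GestureState = GestureState.Idle
    private var startX = 0f

    fun onDown(x: Float) { startX = x; state = GestureState.Idle }

    fun onMove(x: Float, y: Float, controlIndexAt: (Float) -> Int?) {
        when (state) {
            is GestureState.Idle ->
                if (x - startX >= horizontalThreshold) {     // stage one: lateral slide detected
                    state = GestureState.GroupPresented
                    onShowGroup()
                }
            else -> controlIndexAt(y)?.let { idx ->          // stage two: longitudinal offset selects
                state = GestureState.Selecting(idx)
                onHighlight(idx)
            }
        }
    }

    fun onUp() {                                             // end of the first operation instruction
        onCommit((state as? GestureState.Selecting)?.index)
        state = GestureState.Idle
    }
}
```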
Referring to the interface 200-5 and the interface 200-6, when the user operates a finger to leave the screen, the terminal device detects that the first operation instruction is finished, determines the highlighted first interaction control as the interaction control selected by the user, executes an interaction action corresponding to the first interaction control, that is, records the content presented on the current page as the content preferred by the user, closes the current page and removes the interaction control group, and completely displays the loaded previous page.
As shown with reference to interfaces 200-5 and 200-6, the removal of the first interactive control may be delayed for a second preset time when the set of interactive controls is removed. The second preset time may be, for example, 1 second. That is to say, when the interaction control group is removed, the second interaction control and the third interaction control which are not selected are immediately removed, and the first interaction control which is selected is removed after delaying for 1 second, so as to indicate to the user that the first interaction control is selected by the user.
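A minimal sketch of the delayed removal, assuming a 1-second second preset time and a generic removeControl callback (both assumptions), might look as follows in Kotlin using coroutines:

```kotlin
// Illustrative sketch only: unselected controls disappear at once; the selected one
// lingers for the second preset time before removal.
import kotlinx.coroutines.delay
import kotlinx.coroutines.runBlocking

suspend fun removeControlGroup(
    controls: List<String>,
    selected: String?,
    secondPresetTimeMs: Long = 1_000,
    removeControl: (String) -> Unit
) {
    controls.filter { it != selected }.forEach(removeControl)   // remove unselected controls immediately
    if (selected != null) {
        delay(secondPresetTimeMs)                               // keep the selected control visible briefly
        removeControl(selected)
    }
}

fun main() = runBlocking {
    removeControlGroup(listOf("like", "dislike", "cancel"), selected = "like") { println("removed: $it") }
}
```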
In the embodiment shown in FIG. 4, the interaction controls in the interaction control group may be used for the user to express preferences or ratings for the content of the page, which may be, for example, audio-visual content. In other embodiments, the current page is used to present a good, and the interaction controls in the set of interaction controls may be used for the user to express a preference or rating for the good.
The interaction method provided by the embodiment of the specification enables a user to input a plurality of specific interaction instructions through one continuous operation, achieves a plurality of interaction actions of providing an interaction control group, selecting an interaction control, executing the interaction control and returning to a previous page, greatly facilitates the user, and provides better use experience for the user.
The interaction method provided by the embodiment of the specification can provide interaction opportunities for the user in the process that the user exits from the current page and returns to the previous page, and the return behavior of the user can generate value. Moreover, the interactive control group only appears temporarily under the condition that the user wants to return to the previous page, but is not always arranged on the current page, so that the space of the current page is not occupied, and the page arrangement is facilitated.
The interaction method of the embodiment of the specification can naturally introduce the evaluation function in the process that the user exits from the current page and returns to the previous page, and is favorable for improving the evaluation rate of the user.
According to the interaction method in the embodiment of the specification, the user can realize the functions of evaluating and returning to the previous page by using single-finger continuous sliding operation, and the interaction method is very friendly and natural.
According to the interaction method provided by the embodiment of the specification, the user can give evaluation or feedback after finishing reading the current page, and if the user returns without finishing reading the current page, the user also has an opportunity to perform evaluation or feedback, so that the evaluation rate of the user is favorably improved.
Referring to fig. 5, an embodiment of the present invention provides a method for obtaining a product review, which is implemented by a device having a touch screen. Fig. 5 shows a change process of an interface of the terminal device under control of the method for obtaining commodity comments, and the method of the embodiment is used for evaluating commodities presented on a current page in the process of returning to a previous page.
A user inputs 'Y brand Y1 model mobile phone' on a commodity search page provided by a shopping application installed on the terminal device and enters the commodity sales page of the 'Y brand Y1 model mobile phone'; the commodity search page is the previous page, and the commodity sales page of the 'Y brand Y1 model mobile phone' is the current page. Referring to the interface 100-1 of fig. 5, the current page displayed on the touch screen is the commodity sales page of the 'Y brand Y1 model mobile phone'; the top half of the current page displays pictures and prices of the Y brand Y1 model mobile phone and provides an "order" interaction control, and if the user clicks the "order" interaction control, the interface shifts to an order flow page.
The method for obtaining the commodity comment provided by the embodiment comprises the following steps:
s302, under the condition that it is detected that a first operation instruction of a user meets a first preset operation instruction, providing an interaction control group in at least part of interfaces of a commodity display interface provided by a touch screen, wherein the interaction control group comprises at least one interaction control for the user to express the evaluation of the commodity.
Referring to the 100-1 interface of fig. 5, a first preset zone Z1 is provided on the left side of the middle portion of the interface, the first preset direction is a direction from the left side of the interface to the right side of the interface, and the first preset operation instruction includes clicking the first preset zone Z1 and then moving towards the first preset direction, i.e. sliding transversely towards the right side of the interface.
Referring to the hand pattern and the directional arrow in the 100-1 interface of fig. 5, the user touches the first preset area Z1 and then slides laterally toward the right side of the interface, the terminal device detects that the first operation instruction of the user satisfies the first preset operation instruction, loads the previous page below the current page, and moves the current page toward the first preset direction to present the first portion of the previous page to the user, and loads the interactive control group above the first portion of the previous page, so that the interface 100-1 is transformed into the interface 100-2. Referring to interface 100-2, the left side of the interface shows the left half of the previous page and the right side of the interface shows the left half of the current page; an interactive control group is provided on top of the first portion of the previous page, i.e., on the left side of the interface.
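For illustration only, the following Kotlin sketch tests whether a touch sequence of this kind satisfies the first preset operation instruction: the touch must start inside zone Z1 and then move rightward by some minimum distance. The zone bounds and the 48-pixel threshold are assumptions, not values taken from this embodiment.

```kotlin
// Illustrative sketch only: a touch that starts inside the first preset zone Z1 and then
// slides toward the right side of the interface satisfies the first preset instruction.
data class Zone(val left: Float, val top: Float, val right: Float, val bottom: Float) {
    fun contains(x: Float, y: Float) = x in left..right && y in top..bottom
}

fun satisfiesFirstPresetInstruction(
    zoneZ1: Zone,
    downX: Float, downY: Float,
    currentX: Float,
    minRightwardMove: Float = 48f
): Boolean = zoneZ1.contains(downX, downY) && (currentX - downX) >= minRightwardMove

fun main() {
    val z1 = Zone(left = 0f, top = 400f, right = 40f, bottom = 700f)  // left side of the middle portion
    println(satisfiesFirstPresetInstruction(z1, downX = 10f, downY = 500f, currentX = 90f))  // true
}
```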
Referring to the interface 100-2, the interaction control group includes three interaction controls, the three interaction controls are arranged longitudinally on the interface (the arrangement direction of the interaction controls is perpendicular to the first preset direction), and sequentially from top to bottom, a first interaction control for expressing "like", a second interaction control for expressing "dislike", and a third interaction control which is annotated with "cancel" text and used for expressing "do not want to comment".
In other embodiments, the first preset operation instruction may be: clicking a preset area of the interface according to a first preset force requirement and then moving towards a first preset direction. The first preset operation instruction may also be clicking a virtual "back" button on the interface and then moving toward a first preset direction. For further options, reference may be made to the description of the first preset operation instruction in the first embodiment, which is not repeated here.
S304, detecting a moving track of the first operation instruction, determining the selected interactive control in the interactive control group according to the offset direction of the moving track, and displaying the interactive control in a highlighted mode.
The method comprises the steps that a user can input a first operation instruction through a touch screen, the terminal equipment detects the first operation instruction of the user, and if the first operation instruction meets a first preset operation instruction, the terminal equipment presents an interaction control group containing one or more interaction controls on an interface; the terminal equipment continuously detects the moving track of the first operation instruction of the user, one interactive control is selected from the interactive control group according to the moving track of the first operation instruction, and the selected interactive control is highlighted so that the selected interactive control can be distinguished from other interactive controls more obviously to prompt the user that the interactive control is selected.
That is to say, in the embodiment of the present invention, the first operation instruction includes two consecutive stages, the operation in the first stage needs to conform to the first preset operation instruction to present the interaction control group, and the movement in the second stage is used to select the interaction control.
The three interactive controls in the interactive control group are arranged along the longitudinal direction. As shown by the hand-shaped pattern and the directional arrow of the interface 100-2, the user changes from sliding the finger transversely in the first stage to sliding it longitudinally in the second stage in order to select an interactive control. Referring to the interface 100-2 through the interface 100-3, the terminal device moves the interactive control group within the range of the interface along the offset direction of the movement track of the first operation instruction; the user slides a finger toward the upper side of the interface, so the interactive control group moves toward the upper side of the interface as a whole. Referring to the interface 100-3, the interaction control group has been moved to an edge position of the interface, the finger of the user finally stays at a position facing the second interaction control, and the terminal device determines that the second interaction control is selected according to the offset direction of the movement track of the first operation instruction. Referring to the interface 100-4, after determining that the second interaction control is selected, the terminal device stretches the second interaction control to prompt the user that it is selected.
In other embodiments, the manner of highlighting the selected interactive control may be any of the following:
(1) changing the color of the selected interactive control;
(2) enlarging the selected interactive control;
(3) expanding the selected interactive control.
The above highlighting may also be used in combination, for example, to change the color of the selected interaction control and to zoom in on the selected interaction control as a whole.
S306, under the condition that the first operation instruction is detected to be finished, determining the highlighted interactive control as the interactive control selected by the user, and executing the interactive action corresponding to the interactive control selected by the user; and returning to the previous page.
The method comprises the steps that a user can input a first operation instruction through an input device, the terminal equipment detects the first operation instruction of the user, and if the first operation instruction meets a first preset operation instruction, the terminal equipment presents an interaction control group containing one or more interaction controls on an interface; the terminal equipment continuously detects the moving track of a first operation instruction of a user, and selects an interactive control from the interactive control group according to the moving track of the first operation instruction, and the selected interactive control is highlighted; when the first operation instruction is finished, the interaction action corresponding to the highlighted interaction control is executed, and the previous page is returned.
In the embodiment of the present invention, the first operation instruction includes two consecutive stages, the operation in the first stage needs to conform to a first preset operation instruction to present an interaction control group, the movement in the second stage is used to select an interaction control, and finally the first operation instruction is ended to execute an interaction action corresponding to the selected interaction control and return to a previous page.
Referring to the interface 100-5, when the user's finger leaves the screen, the terminal device detects that the first operation instruction is finished, determines the highlighted second interactive control as the interactive control selected by the user, and executes the interactive action corresponding to the second interactive control, that is, records the "Y brand Y1 model mobile phone" as a product that the user does not like; the terminal device then closes the current page, removes the interactive control group, and completely displays the loaded previous page, thereby returning to the previous page (the commodity search page).
Referring to the interface 100-2, in step S302, when the interaction control group is provided, a semi-transparent overlay may be loaded between the previous page and the interaction control group. For example, the semi-transparent overlay is loaded under the current page, the previous page is loaded under the semi-transparent overlay, and the interaction control group is located on the first portion of the previous page and presented above the semi-transparent overlay. The semi-transparent overlay makes the current page and the previous page easier to distinguish and makes the interaction controls more conspicuous, which improves the user experience and makes the controls easier to use. Correspondingly, as shown in the interface 100-5, in step S306 the semi-transparent overlay is removed to fully expose the previous page.
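The layering described above can be sketched as a simple bottom-to-top stack. The Kotlin fragment below is illustrative only; the Layer type, the 0.5 alpha value, and the exact ordering of the control group relative to the partially slid current page are assumptions.

```kotlin
// Illustrative sketch only: the layer order while the control group is shown, bottom to top,
// and the layers remaining after the return to the previous page.
data class Layer(val name: String, val alpha: Float = 1f)

val layersWhileSelecting = listOf(
    Layer("previous page"),                               // fully opaque, at the bottom
    Layer("semi-transparent overlay", alpha = 0.5f),
    Layer("interaction control group"),                   // over the first portion of the previous page
    Layer("current page, partially slid aside")           // covers only the remaining portion
)

// In step S306 the current page, the control group and the overlay are removed,
// fully exposing the previous page.
fun layersAfterReturn(stack: List<Layer>): List<Layer> =
    stack.filter { it.name == "previous page" }
```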
The method for obtaining commodity comments provided by the embodiments of this specification enables a user to input a plurality of specific interactive instructions through one continuous operation, thereby accomplishing a plurality of interactive actions, namely providing an interactive control group, selecting an interactive control, executing it, and returning to the previous page. This greatly facilitates the user and provides a better use experience.
The method for obtaining the commodity comment provided by the embodiment of the specification can provide the opportunity for evaluating the commodity for the user in the process that the user exits from the current page and returns to the previous page, and can enable the return behavior of the user to generate value. Moreover, the interactive control group only appears temporarily under the condition that the user wants to return to the previous page, but is not always arranged on the current page, so that the space of the current page is not occupied, and the page arrangement is facilitated.
The method for obtaining the commodity comment provided by the embodiment of the specification can naturally introduce the commodity evaluation function in the process that the user exits from the current page and returns to the previous page, and is favorable for improving the evaluation rate of the user.
According to the method for obtaining the commodity comment, provided by the embodiment of the specification, the user can realize the functions of commodity evaluation and return to the previous page through single-finger continuous sliding operation, and the method is very friendly and natural.
According to the method for obtaining the commodity comment, the user can give evaluation or feedback after the user finishes watching the commodity content, and if the user returns without finishing watching the commodity content, the user also has an opportunity to evaluate or feedback, so that the evaluation rate of the user is improved.
Referring to fig. 6(a) -6(c), an embodiment of the present invention provides an interaction method implemented by a device having a touch screen. Fig. 6(a) -6(c) show the changing process of the interface of the terminal device under the control of the interactive method.
The user views the hotel related information on the hotel reservation application installed on the terminal device, as shown in an interface 300-1, 3 content items are shown on the interface of the terminal device, namely a "hotel H1" content item, a "hotel H2" content item and a "hotel H3" content item. Each content item includes an introduction to the hotel and notes the price of the hotel room.
The first preset operation instruction is clicking a content item on the interface and keeping the touch for a first preset time. Referring to the hand-shaped pattern in the interface 300-1 and the interface 300-2, when the user touches the content item "hotel H1" and keeps the touch for the first preset time, the terminal device detects that the first operation instruction of the user meets the first preset operation instruction, arranges a semi-transparent overlay on the content item "hotel H1", and arranges an interaction control group on the semi-transparent overlay. As shown in interface 300-2, the interaction control group includes a first interaction control for "view map", a third interaction control for "view rating", and a second interaction control for canceling the interaction control group. The first, second, and third interaction controls are transversely arranged.
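As an illustrative sketch (not part of the original disclosure), the following Kotlin class shows one way such a long press could be detected before the overlay and the control group are shown; the 500 ms value and the callback name are assumptions.

```kotlin
// Illustrative sketch only: the control group appears once the finger has stayed on a
// content item for at least the first preset time.
class LongPressDetector(
    private val firstPresetTimeMs: Long = 500,
    private val onShowControlGroup: (contentItem: String) -> Unit
) {
    private var downTimeMs = 0L
    private var pressedItem: String? = null
    private var groupShown = false

    fun onDown(item: String, nowMs: Long) {
        downTimeMs = nowMs
        pressedItem = item
        groupShown = false
    }

    // Called while the finger keeps touching (for example on every move or tick event).
    fun onStillPressed(nowMs: Long) {
        if (!groupShown && nowMs - downTimeMs >= firstPresetTimeMs) {
            groupShown = true
            pressedItem?.let(onShowControlGroup)
        }
    }
}
```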
Referring to the hand-shaped patterns of the interface 300-2 and the interface 300-3, after the user touches the content item "hotel H1" to call out the interaction control group, the finger keeps the touch state and moves to select the first interaction control; the terminal device determines that the first interaction control is selected according to the offset direction of the movement track of the user's first operation instruction, and enlarges the first interaction control and changes its color to prompt the user that it is selected. Then the user lifts the finger from the interface; as shown in interface 300-4, the terminal device detects that the first operation instruction is finished and performs the interaction corresponding to the first interaction control, that is, jumps to the map page corresponding to hotel H1. The map page indicates the geographical location of hotel H1 and also provides a "return" button, a "go to hotel" button, and a "view rating" button. If the user clicks the "return" button, the terminal device returns to interface 300-2. If the user clicks the "go to hotel" button, the terminal device starts the navigation function and plans a route to hotel H1 on the map page. If the user clicks the "view rating" button, the terminal device jumps to the rating page corresponding to hotel H1, that is, opens interface 300-6.
Referring to the hand-shaped patterns of the interface 300-2 and the interface 300-5, after the user touches the content item "hotel H1" to call out the interaction control group, the finger keeps the touch state and moves to select the third interaction control; the terminal device determines that the third interaction control is selected according to the offset direction of the movement track of the user's first operation instruction, and enlarges the third interaction control and changes its color to prompt the user that it is selected. Then the user lifts the finger from the interface; as shown in interface 300-5, the terminal device detects that the first operation instruction is finished and performs the interaction corresponding to the third interaction control, that is, jumps to the evaluation page corresponding to hotel H1. The evaluation page presents the scores of hotel H1 on each index, together with evaluations of hotel H1 by multiple guests. A "return" button and a "view map" button are also arranged on the evaluation page. If the user clicks the "return" button, the terminal device returns to interface 300-2. If the user clicks the "view map" button, the terminal device jumps to the map page corresponding to hotel H1, that is, opens interface 300-4.
Referring to the hand-shaped patterns of the interface 300-2 and the interface 300-7, after the user touches and clicks the content item "hotel H1" to call out the interaction control group, when the finger keeps a touch state and moves the finger to select the second interaction control, the terminal device determines that the second interaction control is selected according to the offset direction of the movement track of the first operation instruction of the user, enlarges the second interaction control and changes the color to prompt the user that the second interaction control is selected. Then, the user operates the finger to leave the interface, as shown in the interface 300-7, the terminal device detects that the first operation instruction is finished, and executes the interactive action corresponding to the second interactive control, that is, cancels the interactive control group and the semi-transparent cover layer, and returns to the state of the interface 300-1.
The interaction method provided by the embodiments of this specification enables a user to input a plurality of specific interaction instructions through one continuous operation, thereby accomplishing a plurality of interaction actions, namely providing an interaction control group, selecting an interaction control, executing it, or canceling the interaction control group. This greatly facilitates the user and provides a better use experience.
The interaction method provided by the embodiment of the present specification can provide the user with the interaction control corresponding to the content item by long-pressing the content item, thereby providing the user with the service related to the content item. Moreover, the interactive control group only appears temporarily when the user needs the interactive control group, but is not always arranged on the current page, so that the space of the current page cannot be occupied, and the page arrangement is facilitated.
Referring to fig. 7, an embodiment of the present invention provides a method for exiting video playback, which is implemented by a device having a touch screen. Fig. 7 shows the change process of the interface of the terminal device under the control of the method for exiting video playback.
The user watches videos on a video playing application carried by the terminal device, the interface 500-1 is a video playing page, a virtual exit button is arranged at the lower left corner of the video playing page, a first preset direction is a direction from the left side of the interface to the right side of the interface, and the first preset operation instruction comprises clicking the exit button and then moving towards the first preset direction, namely sliding towards the right side of the interface transversely.
Referring to the hand-shaped pattern and the direction arrow of the interface 500-1, the user touches and clicks the "exit" button and then slides transversely toward the right side of the interface; the terminal device detects that the first operation instruction of the user meets the first preset operation instruction, loads the interaction control group on the current page, and the interface 500-1 becomes the interface 500-2.
Referring to the interface 500-2, the interaction control group includes three interaction controls arranged longitudinally on the interface (the arrangement direction of the interaction controls is perpendicular to the first preset direction); from top to bottom they are a first interaction control for expressing "like", a second interaction control for expressing "dislike", and a third interaction control annotated with "cancel" text and used for expressing "do not want to comment".
Referring to the hand-shaped pattern and the direction arrow of the interface 500-2 and the hand-shaped pattern of the interface 500-3, the finger of the user slides longitudinally (i.e., slides up and down) and finally stays at a position facing the first interactive control, and the terminal device determines that the first interactive control is selected according to the offset direction of the movement track of the first operation instruction. Referring to interface 500-4, after determining that the first interaction control is selected, the terminal device stretches the first interaction control to prompt the user that the first interaction control is selected.
Then the user lifts the finger from the interface; as shown in interface 500-5, the terminal device detects that the first operation instruction is finished and executes the interactive action corresponding to the first interactive control, that is, submits the user's "like" for the video so as to raise the video's score, exits video playback, and returns to the selection page.
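A rough Kotlin sketch of this end-of-gesture handling is given below; the control names, the callbacks, and the behavior for the "cancel" control are assumptions rather than limitations of this embodiment.

```kotlin
// Illustrative sketch only: when the finger leaves the screen, the action of the
// highlighted control is carried out and playback exits to the selection page.
enum class VideoControl { LIKE, DISLIKE, CANCEL }

fun onFirstInstructionEnded(
    highlighted: VideoControl?,
    submitLike: () -> Unit,
    submitDislike: () -> Unit,
    exitPlayback: () -> Unit
) {
    when (highlighted) {
        VideoControl.LIKE -> submitLike()         // raises the video's score, as described above
        VideoControl.DISLIKE -> submitDislike()
        VideoControl.CANCEL, null -> Unit         // no rating is submitted
    }
    exitPlayback()                                // quit video playback and return to the selection page
}
```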
The method for quitting video playing provided by the embodiment of the specification enables a user to input a plurality of specific interactive instructions through one continuous operation, realizes a plurality of interactive actions of providing an interactive control group, selecting an interactive control, executing the interactive control and quitting video playing, greatly facilitates the user, and provides better use experience for the user.
The method for quitting video playing provided by the embodiment of the specification can provide the opportunity for evaluating the video content for the user when the user quits the video playing. Moreover, the interactive control group only appears temporarily under the condition that the user wants to quit the video playing, but is not always arranged on the video playing page, the space of the video playing page cannot be occupied, and the page arrangement is facilitated.
The method for quitting video playing in the embodiment of the specification can naturally introduce the evaluation function when the user quits video playing, and is favorable for improving the evaluation rate of the user.
According to the method for quitting video playing in the embodiment of the specification, a user can evaluate the video and quit video playing by using single-finger continuous sliding operation, and the method is very friendly and natural.
According to the method for quitting video playing in the embodiment of the specification, the user can give the evaluation after finishing watching the video content, and if the user returns without finishing watching the video content, the user also has an opportunity to evaluate, so that the evaluation rate of the user is favorably improved.
Fig. 8 is a schematic diagram of a terminal device 10 provided in an embodiment of the present specification, where the terminal device 10 includes an input device 11, a processor 12, and a memory 13. The input device 11 is used for a user to input an operation instruction. The input device may be a mouse, a touch screen, or the like. The memory 13 stores computer instructions, which when executed by the processor 12, implement the interaction method disclosed in any of the foregoing embodiments.
Fig. 9 is a schematic diagram of a terminal device 20 provided in an embodiment of the present specification, where the terminal device 20 includes a touch screen 21, a processor 22, and a memory 23. The memory 23 stores computer instructions which, when executed by the processor 22, implement the interaction method disclosed in any of the foregoing embodiments.
The embodiment of the specification provides terminal equipment, which comprises a touch screen, a processor and a memory; the memory stores computer instructions, and the computer instructions, when executed by the processor, implement the method for obtaining reviews of goods disclosed in any of the foregoing embodiments.
The embodiment of the specification provides terminal equipment, which comprises a touch screen, a processor and a memory; the memory stores computer instructions, and the computer instructions, when executed by the processor, implement the method for exiting video playback disclosed in any of the foregoing embodiments.
The embodiment of the present specification further provides a computer readable storage medium, on which computer instructions are stored, and the computer instructions, when executed by a processor, implement the interaction method disclosed in any one of the foregoing embodiments.
The embodiment of the present specification further provides a computer readable storage medium, on which computer instructions are stored, and the computer instructions, when executed by a processor, implement the method for obtaining a comment of a commodity disclosed in any one of the foregoing embodiments.
The embodiment of the present specification further provides a computer-readable storage medium, on which computer instructions are stored, and the computer instructions, when executed by a processor, implement the method for exiting video playing disclosed in any of the foregoing embodiments.
The embodiments in the present specification are described in a progressive manner, and the same and similar parts among the embodiments are referred to each other, and each embodiment focuses on the differences from the other embodiments. In particular, as for the device and apparatus embodiments, since they are substantially similar to the method embodiments, the description is relatively simple, and reference may be made to some descriptions of the method embodiments for relevant points.
The foregoing description has been directed to specific embodiments of this disclosure. Other embodiments are within the scope of the following claims. In some cases, the actions or steps recited in the claims may be performed in a different order than in the embodiments and still achieve desirable results. In addition, the processes depicted in the accompanying figures do not necessarily require the particular order shown, or sequential order, to achieve desirable results. In some embodiments, multitasking and parallel processing may also be possible or may be advantageous.
Embodiments of the present description may be systems, methods, and/or computer program products. The computer program product may include a computer-readable storage medium having computer-readable program instructions embodied thereon for causing a processor to implement aspects of embodiments of the specification.
The computer readable storage medium may be a tangible device that can hold and store the instructions for use by the instruction execution device. The computer readable storage medium may be, for example, but not limited to, an electronic memory device, a magnetic memory device, an optical memory device, an electromagnetic memory device, a semiconductor memory device, or any suitable combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), a Static Random Access Memory (SRAM), a portable compact disc read-only memory (CD-ROM), a Digital Versatile Disc (DVD), a memory stick, a floppy disk, a mechanical coding device, such as punch cards or in-groove projection structures having instructions stored thereon, and any suitable combination of the foregoing. Computer-readable storage media as used herein is not to be construed as transitory signals per se, such as radio waves or other freely propagating electromagnetic waves, electromagnetic waves propagating through a waveguide or other transmission medium (e.g., optical pulses through a fiber optic cable), or electrical signals transmitted through electrical wires.
The computer-readable program instructions described herein may be downloaded from a computer-readable storage medium to a respective computing/processing device, or to an external computer or external storage device via a network, such as the internet, a local area network, a wide area network, and/or a wireless network. The network may include copper transmission cables, fiber optic transmission, wireless transmission, routers, firewalls, switches, gateway computers and/or edge servers. The network adapter card or network interface in each computing/processing device receives computer-readable program instructions from the network and forwards the computer-readable program instructions for storage in a computer-readable storage medium in the respective computing/processing device.
Computer program instructions for carrying out operations for embodiments of the present description may be assembly instructions, Instruction Set Architecture (ISA) instructions, machine-related instructions, microcode, firmware instructions, state setting data, or source code or object code written in any combination of one or more programming languages, including an object oriented programming language such as Smalltalk, C++ or the like and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The computer-readable program instructions may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider). In some embodiments, an electronic circuit, such as a programmable logic circuit, a Field Programmable Gate Array (FPGA), or a Programmable Logic Array (PLA), can execute computer-readable program instructions to implement various aspects of embodiments of the present specification by utilizing state information of the computer-readable program instructions to personalize the electronic circuit.
Aspects of embodiments of the present specification are described herein with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the specification. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer-readable program instructions.
These computer-readable program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks. These computer-readable program instructions may also be stored in a computer-readable storage medium that can direct a computer, programmable data processing apparatus, and/or other devices to function in a particular manner, such that the computer-readable medium storing the instructions comprises an article of manufacture including instructions which implement the function/act specified in the flowchart and/or block diagram block or blocks.
The computer readable program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatus or other devices to produce a computer implemented process such that the instructions which execute on the computer, other programmable apparatus or other devices implement the functions/acts specified in the flowchart and/or block diagram block or blocks.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present description. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of instructions, which comprises one or more executable instructions for implementing the specified logical function(s). In some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions. It is well known to those skilled in the art that implementation by hardware, by software, and by a combination of software and hardware are equivalent.
The foregoing description of the embodiments of the present specification has been presented for purposes of illustration and description, but is not intended to be exhaustive or limited to the embodiments disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope of the described embodiments. The terms used herein were chosen in order to best explain the principles of the embodiments, the practical application, or technical improvements to the techniques in the marketplace, or to enable others of ordinary skill in the art to understand the embodiments disclosed herein.

Claims (40)

1. An interaction method, comprising:
providing an interaction control group in at least part of interfaces under the condition that a first operation instruction of a user is detected to meet a first preset operation instruction, wherein the interaction control group comprises at least one interaction control;
detecting a moving track of the first operation instruction, and determining a selected interactive control in the interactive control group according to the offset direction of the moving track;
and under the condition that the end of the first operation instruction is detected, determining the highlighted interaction control as the interaction control selected by the user, and executing the interaction action corresponding to the interaction control selected by the user.
2. The method of claim 1, wherein the set of interaction controls includes at least one interaction control for a user to express user preferences.
3. The method of claim 1, the performing an interactive action corresponding to a user-selected interactive control, comprising:
and jumping to a page corresponding to the interactive control selected by the user.
4. The method of claim 1, said providing an interactive control set in at least a portion of the interface, comprising:
loading a semi-transparent cover layer on a current page, and loading the interaction control group on the semi-transparent cover layer; or,
and reducing the brightness of the current page, and loading the interaction control group on the current page.
5. The method of claim 1, wherein the selected interactive control is highlighted by any of:
changing the color of the selected interactive control;
enlarging the selected interactive control;
stretching the selected interactive control;
and expanding the selected interactive control.
6. The method of claim 1, wherein determining the selected interactive control in the group of interactive controls according to the offset direction of the movement trajectory comprises:
moving the interactive control group along the offset direction of the moving track, so that the interactive control group moves in the range of the interface;
and under the condition that the interactive control is located in a second preset area of the interface, determining the selected interactive control in the interactive control group along the offset direction of the moving track.
7. The method of claim 1, wherein the first predetermined operation instruction is any one of:
moving to the position of a preset virtual button on the interface;
move to a location of a content item on the interface;
moving to a first preset area on the interface;
clicking a preset virtual button on an interface;
clicking on a content item on the interface;
clicking a first preset area of the interface and then moving towards a first preset direction.
8. The method of claim 1, wherein the first predetermined operation instruction is any one of:
moving to the position of a preset virtual button on an interface and staying at the position of the preset virtual button for a first preset time;
moving to a position of a content item on an interface and staying at the position of the content item for a first preset time;
and moving to a first preset area on the interface and staying in the first preset area for a first preset time.
9. The method of claim 1, the method being implemented by a device having a touch screen, the interface being an interface presented by the touch screen; the first preset operation instruction is any one of the following instructions:
clicking a preset virtual button on an interface according to a first preset force requirement, wherein the duration time reaches a first preset time;
clicking a content item on an interface according to a first preset force requirement, wherein the duration time reaches a first preset time;
and clicking a first preset area of the interface according to a first preset force requirement and then moving towards a first preset direction.
10. The method of any of claims 7-9, the preset virtual button being a back button; or, the first preset area is a functional area returning to the previous page;
and returning to the previous page after executing the interactive action corresponding to the interactive control selected by the user.
11. The method of claim 10, said providing an interactive control set in at least a portion of the interface, comprising:
loading a previous page beneath the current page and moving the current page to present a first portion of the previous page to the user;
loading the set of interaction controls on top of a first portion of a previous page.
12. The method of claim 11, said providing an interactive control set in at least a portion of the interface, further comprising:
loading a semi-transparent covering layer between the previous page and the interaction control group; or,
reducing the brightness of the previous page.
13. The method of claim 11, the returning to a previous page, comprising:
closing the current page and removing the interaction control group;
and when the interaction control group is removed, delaying a second preset time to remove the interaction control selected by the user.
14. The method of claim 11, wherein the set of interaction controls includes at least one interaction control for a user to express a user preference.
15. An interaction method based on a touch screen is implemented by equipment with the touch screen and comprises the following steps:
providing an interaction control group in at least part of interfaces of a touch screen under the condition that a first operation instruction of a user is detected to meet a first preset operation instruction, wherein the interaction control group comprises at least one interaction control;
detecting a moving track of the first operation instruction, determining a selected interactive control in the interactive control group according to the offset direction of the moving track, and highlighting the interactive control;
determining the highlighted interaction control as the interaction control selected by the user and executing the interaction action corresponding to the interaction control selected by the user under the condition that the first operation instruction is detected to be finished; and returning to the previous page.
16. The method of claim 15, wherein the first preset operation instruction is:
moving to a position of a content item on an interface and staying at the position of the content item for a first preset time;
clicking a preset area of an interface and then moving towards a first preset direction; or,
clicking a preset virtual button of the interface and then moving towards a first preset direction.
17. The method of claim 15, wherein the first preset operation instruction is:
and clicking a preset area of the interface according to a first preset force requirement and then moving towards a first preset direction.
18. The method of claim 16 or 17, wherein the interaction control group includes a plurality of interaction controls, and an arrangement direction of the plurality of interaction controls is perpendicular to the first preset direction.
19. The method of claim 16 or 17, wherein providing an interactive control set in at least part of the interface comprises:
loading a previous page below the current page and moving the current page towards the first preset direction to present a first part of the previous page to a user;
loading the set of interaction controls on top of a first portion of a previous page.
20. The method of claim 19, said providing an interactive control set in at least a portion of the interface, further comprising:
loading a semi-transparent covering layer between the previous page and the interaction control group; or,
reducing the brightness of the previous page.
21. The method of claim 19, the returning to a previous page, comprising:
closing the current page and removing the interaction control group;
and when the interaction control group is removed, delaying preset time to remove the interaction control selected by the user.
22. The method of claim 15, wherein the selected interactive control is highlighted by any of:
changing the color of the selected interactive control;
enlarging the selected interactive control;
stretching the selected interactive control;
and expanding the selected interactive control.
23. The method of claim 15, wherein determining the selected interactive control in the group of interactive controls according to the offset direction of the movement trajectory comprises:
moving the interactive control group along the offset direction of the moving track, so that the interactive control group moves in the range of the interface;
and under the condition that the interactive control is located in a second preset area of the interface, determining the selected interactive control in the interactive control group along the offset direction of the moving track.
24. The method of any of claims 15-23, wherein the set of interaction controls includes at least one interaction control for a user to express a user preference.
25. The method of any of claims 15-23, wherein the interface is configured to present a commodity, and the interactive control group includes at least one interactive control for a user to express a rating for the commodity; or, the interface is used for presenting audio and video content, and the interaction control group comprises at least one interaction control for expressing the preference of the user on the audio and video content.
26. A method for obtaining commodity comments is implemented by equipment with a touch screen and comprises the following steps:
under the condition that a first operation instruction of a user is detected to meet a first preset operation instruction, providing an interaction control group in at least part of interfaces of a commodity display interface provided by a touch screen, wherein the interaction control group comprises at least one interaction control for the user to express the evaluation of commodities;
detecting a moving track of the first operation instruction, determining a selected interactive control in the interactive control group according to the offset direction of the moving track, and highlighting the interactive control;
determining the highlighted interaction control as the interaction control selected by the user and executing the interaction action corresponding to the interaction control selected by the user under the condition that the first operation instruction is detected to be finished; and returning to the previous page.
27. The method of claim 26, wherein the first preset operation instruction is:
clicking a preset area of an interface and then moving towards a first preset direction; or,
clicking a preset virtual button of the interface and then moving towards a first preset direction.
28. The method of claim 27, wherein the interaction control group includes a plurality of interaction controls, and an arrangement direction of the plurality of interaction controls is perpendicular to the first preset direction.
29. The method of claim 27, said providing an interactive control set in at least a portion of the interface, comprising:
loading a previous page below the current page and moving the current page towards the first preset direction to present a first part of the previous page to a user;
loading the set of interaction controls on top of a first portion of a previous page.
30. The method of claim 29, the returning to a previous page, comprising:
closing the current page and removing the interaction control group;
and when the interaction control group is removed, delaying preset time to remove the interaction control selected by the user.
31. A method for quitting video playing is implemented by equipment with a touch screen and comprises the following steps:
under the condition that a first operation instruction of a user is detected to meet a first preset operation instruction, providing an interaction control group in at least part of interfaces of a video content playing interface provided by a touch screen, wherein the interaction control group comprises at least one interaction control for the user to express the preference of the user on video content;
detecting a moving track of the first operation instruction, determining a selected interactive control in the interactive control group according to the offset direction of the moving track, and highlighting the interactive control;
determining the highlighted interaction control as the interaction control selected by the user and executing the interaction action corresponding to the interaction control selected by the user under the condition that the first operation instruction is detected to be finished; and exiting the video content play.
32. The method of claim 31, wherein the first preset operation instruction is:
clicking a preset area of an interface and then moving towards a first preset direction; or,
clicking a preset virtual button of the interface and then moving towards a first preset direction.
The interaction control group comprises a plurality of interaction controls, and the arrangement direction of the interaction controls is perpendicular to the first preset direction.
33. A terminal device comprising an input means, a processor and a memory;
the input device is used for inputting an operation instruction by a user;
the memory has stored therein computer instructions which, when executed by the processor, implement the interaction method of any one of claims 1-14.
34. A terminal device comprises a touch screen, a processor and a memory; the memory has stored therein computer instructions which, when executed by the processor, implement the interaction method of any one of claims 15-25.
35. A terminal device comprises a touch screen, a processor and a memory; the memory has stored therein computer instructions which, when executed by the processor, implement the method of obtaining reviews for items of merchandise of any of claims 26-30.
36. A terminal device comprises a touch screen, a processor and a memory; the memory has stored therein computer instructions which, when executed by the processor, implement the method of exiting video playback as claimed in any of claims 31-32.
37. A computer readable storage medium having stored thereon computer instructions which, when executed by a processor, implement the interaction method of any one of claims 1-14.
38. A computer readable storage medium having stored thereon computer instructions which, when executed by a processor, implement the interaction method of any one of claims 15-25.
39. A computer readable storage medium having stored thereon computer instructions which, when executed by a processor, implement the method of obtaining reviews for items of merchandise of any of claims 26-30.
40. A computer readable storage medium having stored thereon computer instructions which, when executed by a processor, implement the method of exiting video playback of any of claims 31-32.
CN202010099815.5A 2020-02-18 2020-02-18 Interaction method and terminal equipment Pending CN113342218A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010099815.5A CN113342218A (en) 2020-02-18 2020-02-18 Interaction method and terminal equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010099815.5A CN113342218A (en) 2020-02-18 2020-02-18 Interaction method and terminal equipment

Publications (1)

Publication Number Publication Date
CN113342218A true CN113342218A (en) 2021-09-03

Family

ID=77467035

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010099815.5A Pending CN113342218A (en) 2020-02-18 2020-02-18 Interaction method and terminal equipment

Country Status (1)

Country Link
CN (1) CN113342218A (en)

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5802516A (en) * 1993-11-03 1998-09-01 Apple Computer, Inc. Method of controlling an electronic book for a computer system
US20120192078A1 (en) * 2011-01-26 2012-07-26 International Business Machines Method and system of mobile virtual desktop and virtual trackball therefor
WO2018006280A1 (en) * 2016-07-05 2018-01-11 深圳动三帝虚拟现实互动科技有限公司 Page switching method and device, terminal, and storage medium
CN109219795A (en) * 2016-07-05 2019-01-15 深圳脑穿越科技有限公司 page switching method, device, terminal and storage medium
CN106293473A (en) * 2016-08-15 2017-01-04 珠海市魅族科技有限公司 Page display method and device
CN107219988A (en) * 2017-05-26 2017-09-29 维沃移动通信有限公司 A kind of interface operation bootstrap technique and mobile terminal
CN107617213A (en) * 2017-07-27 2018-01-23 网易(杭州)网络有限公司 Information processing method and device, storage medium, electronic equipment
CN107621914A (en) * 2017-08-02 2018-01-23 努比亚技术有限公司 Display methods, terminal and the computer-readable recording medium of termination function control key
CN108837506A (en) * 2018-05-25 2018-11-20 网易(杭州)网络有限公司 Control method, device and the storage medium of virtual item in a kind of race games

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114979747A (en) * 2022-05-19 2022-08-30 北京字跳网络技术有限公司 Live broadcast method, device, equipment, medium and program product
CN114979747B (en) * 2022-05-19 2024-03-12 北京字跳网络技术有限公司 Live broadcast method, device, equipment, medium and program product

Similar Documents

Publication Publication Date Title
US11720221B2 (en) Systems and methods for enhancing user interaction with displayed information
US8413075B2 (en) Gesture movies
US20110145745A1 (en) Method for providing gui and multimedia device using the same
US20140279025A1 (en) Methods and apparatus for display of mobile advertising content
CN105307000A (en) Display apparatus and method thereof
US20130127921A1 (en) Electronic Device, Method of Displaying Display Item, and Search Processing Method
JP2022548285A (en) Hotspot recommendation pop-up window control method, control device, storage medium and electronic device
CN104331246A (en) Device and method for split screen display in terminal
WO2018112928A1 (en) Method for displaying information, apparatus and terminal device
KR100910759B1 (en) Method for providing user interface in electric device and the device thereof
CN108363534B (en) Global preview method and device of editable object and electronic equipment
KR20160023412A (en) Method for display screen in electronic device and the device thereof
KR20150066129A (en) Display appratus and the method thereof
CN114564604B (en) Media collection generation method and device, electronic equipment and storage medium
WO2018113065A1 (en) Information display method, device and terminal device
US20120131460A1 (en) Playlist Creation
US11029801B2 (en) Methods, systems, and media for presenting messages
CN110753251A (en) Video switching method and device and electronic equipment
KR20230173209A (en) Methods and devices for interacting with applications and electronic devices
JP2014106625A (en) Portable terminal, control method of portable terminal, program and recording medium
US20160170580A1 (en) Improved method for pre-listening to voice contents
CN110909274B (en) Page browsing method and device and electronic equipment
US10976913B2 (en) Enabling undo on scrubber/seekbar UI widgets
CN105843594A (en) Method and device for displaying application program page of mobile terminal
CN113342218A (en) Interaction method and terminal equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
REG Reference to a national code: Ref country code: HK; Ref legal event code: DE; Ref document number: 40059843; Country of ref document: HK