CN114168020A - Interaction method, interaction device, terminal equipment and readable storage medium - Google Patents

Interaction method, interaction device, terminal equipment and readable storage medium Download PDF

Info

Publication number
CN114168020A
CN114168020A (application number CN202111508399.0A)
Authority
CN
China
Prior art keywords
interface
interaction
control
floating
area
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111508399.0A
Other languages
Chinese (zh)
Inventor
Zhang Han (张涵)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN202111508399.0A
Publication of CN114168020A
Legal status: Pending

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481: based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/0484: for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/0485: Scrolling or panning
    • G06F 3/0487: using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488: using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04883: for inputting data by handwriting, e.g. gesture or text

Abstract

The embodiments of the present application disclose an interaction method, an interaction apparatus, a terminal device, and a readable storage medium. The method includes: in response to a first movement operation on a floating interaction control in a first interface, controlling the floating interaction control to move; and if the floating interaction control is detected to stay in a target interaction area in the first interface, generating and displaying a second interface according to the interface content contained in the target interaction area and the interaction function of the floating interaction control in the first interface. Implementing this method improves interaction efficiency.

Description

Interaction method, interaction device, terminal equipment and readable storage medium
Technical Field
The present application relates to the field of human-computer interaction technologies, and in particular, to an interaction method, an interaction device, a terminal device, and a readable storage medium.
Background
The floating interaction control is a floating operation control on the Android platform, typically used as an entry point for shortcut operations (such as creating a schedule, a mail, a message, or a note). In practice, however, the existing floating interaction control mostly just triggers a default interface corresponding to its function, so the degree of personalization is low and interaction efficiency suffers.
Disclosure of Invention
The embodiment of the application provides an interaction method, an interaction device, terminal equipment and a readable storage medium, and interaction efficiency can be improved.
A first aspect of an embodiment of the present application provides an interaction method, where the method includes:
responding to a first movement operation of a floating interaction control in a first interface to control the floating interaction control to move;
and if the floating interaction control is detected to stay in the target interaction area of the first interface, generating and displaying a second interface according to the interface content contained in the target interaction area and the interaction function of the floating interaction control in the first interface.
A second aspect of the embodiments of the present application provides an interaction apparatus, including:
the first processing unit is used for responding to a first movement operation of a floating interaction control in a first interface so as to control the floating interaction control to move;
and the second processing unit is used for generating and displaying a second interface according to the interface content contained in the target interaction area and the interaction function of the floating interaction control in the first interface if the floating interaction control is detected to stay in the target interaction area in the first interface.
A third aspect of the embodiments of the present application provides a terminal device, which may include:
a memory storing executable program code;
and a processor coupled to the memory;
the processor calls the executable program code stored in the memory, which when executed by the processor causes the processor to implement the method as described in the first aspect of the embodiments of the present application.
A fourth aspect of embodiments of the present application provides a computer-readable storage medium, on which executable program code is stored, and when the executable program code is executed by a processor, the method according to the first aspect of embodiments of the present application is implemented.
A fifth aspect of embodiments of the present application discloses a computer program product, which, when run on a computer, causes the computer to perform any one of the methods disclosed in the first aspect of embodiments of the present application.
A sixth aspect of the embodiments of the present application discloses an application publishing platform configured to publish a computer program product, where the computer program product, when run on a computer, causes the computer to execute any one of the methods disclosed in the first aspect of the embodiments of the present application.
According to the technical scheme, the embodiment of the application has the following advantages:
in the embodiments of the present application, in response to a first movement operation on a floating interaction control in a first interface, the floating interaction control is controlled to move; and if the floating interaction control is detected to stay in a target interaction area in the first interface, a second interface is generated and displayed according to the interface content contained in the target interaction area and the interaction function of the floating interaction control in the first interface. By implementing this method, the floating interaction control can interact, while moving, with the interface content of interaction areas in the first interface, so that the generated second interface automatically associates the interface content contained in the target interaction area, which greatly improves interaction efficiency.
Drawings
To illustrate the technical solutions in the embodiments of the present application more clearly, the drawings required for describing the embodiments are briefly introduced below. Obviously, the drawings in the following description show only some embodiments of the present application, and other drawings can be obtained from these drawings by a person of ordinary skill in the art without creative effort.
Fig. 1 is a schematic flow chart of an interaction method disclosed in an embodiment of the present application;
FIG. 2A is a schematic flow chart diagram of an interaction method disclosed in the embodiments of the present application;
FIG. 2B is a schematic view of a first interface disclosed in an embodiment of the present application;
FIG. 2C is a schematic view of a second interface disclosed in embodiments of the present application;
FIG. 2D is another schematic illustration of a first interface disclosed in an embodiment of the present application;
FIG. 2E is a schematic illustration of a second interface disclosed in an embodiment of the present application;
FIG. 2F is a schematic flow diagram of a second interface generation method;
FIG. 2G is yet another schematic illustration of a first interface disclosed in an embodiment of the present application;
FIG. 2H is yet another schematic illustration of a first interface disclosed in an embodiment of the present application;
FIG. 2I is a further schematic illustration of the first interface disclosed in an embodiment of the present application;
FIG. 2J is another schematic illustration of a second interface disclosed in an embodiment of the present application;
FIG. 2K is yet another schematic illustration of a first interface disclosed in an embodiment of the present application;
FIG. 2L is yet another schematic illustration of a first interface disclosed in an embodiment of the present application;
FIG. 2M is a further schematic illustration of a first interface disclosed in an embodiment of the present application;
FIG. 2N is yet another schematic illustration of a second interface disclosed in embodiments herein;
FIG. 2O is a schematic flow chart of another second interface generation method;
FIG. 2P is yet another schematic illustration of a first interface disclosed in an embodiment of the present application;
FIG. 2Q is yet another schematic illustration of a second interface disclosed in an embodiment of the present application;
FIG. 3A is a schematic flow chart diagram of an interaction method disclosed in the embodiments of the present application;
FIG. 3B is a schematic diagram of one arrangement of function icons;
FIG. 3C is a schematic diagram of another arrangement of function icons;
FIG. 3D is yet another schematic illustration of a first interface disclosed in an embodiment of the present application;
FIG. 3E is yet another schematic illustration of a second interface disclosed in an embodiment of the present application;
FIG. 4 is a block diagram of an interactive apparatus according to an embodiment of the present disclosure;
fig. 5 is a block diagram of a terminal device according to an embodiment of the present disclosure.
Detailed Description
The embodiment of the application provides an interaction method, an interaction device, terminal equipment and a readable storage medium, and interaction efficiency can be greatly improved.
To enable a person skilled in the art to better understand the present application, the technical solutions in the embodiments of the present application are described below with reference to the drawings in the embodiments of the present application. Obviously, the described embodiments are only some, rather than all, of the embodiments of the present application. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present application shall fall within the protection scope of the present application.
In the prior art, the floating interaction control is widely used as a floating operation control on the Android platform; for example, tapping it can jump to an interaction interface corresponding to its function. In practice, however, that interaction interface is usually a default interface designed in advance, and the user typically has to fill in a large amount of information manually, which hurts interaction efficiency. In the present application, the floating interaction control can interact, while moving, with the interface content of an interaction area in the first interface, and a second interface associated with the interface content contained in the target interaction area is generated. Therefore, in the embodiments of the present application, the second interface is generated without the user manually adding interface content, which greatly improves interaction efficiency.
It can be understood that the terminal device in the embodiments of the present application may include a general handheld electronic device with a screen, such as a mobile phone, a smartphone, a portable terminal, a personal digital assistant (PDA), a portable multimedia player (PMP), a notebook computer, a note pad, a wireless broadband (WiBro) terminal, a tablet computer (tablet PC), a smart PC, a point-of-sale (POS) terminal, an in-vehicle computer, and the like.
The terminal device may also include a wearable device. A wearable device may be worn directly on the user, or may be a portable electronic device integrated into the user's clothing or accessories. A wearable device is not only a hardware device; supported by software, data interaction, and interaction with cloud servers, it can also provide powerful intelligent functions, such as computation, positioning, and alarms, and can connect to a mobile phone and various other terminals. Wearable devices may include, but are not limited to, wrist-worn types (e.g., watches and wristbands), foot-worn types (e.g., shoes, socks, or other leg-worn products), head-worn types (e.g., glasses, helmets, and headbands), and other form factors such as smart clothing, bags, crutches, and accessories.
The technical solution of the present application is further described below by way of examples.
Referring to fig. 1, fig. 1 is a schematic flowchart illustrating an interaction method according to an embodiment of the present disclosure. The interaction method as shown in fig. 1 may include the steps of:
101. and responding to the first movement operation of the floating interaction control in the first interface so as to control the floating interaction control to move.
The first interface may include an interface of one or more application programs currently displayed on a display device of the terminal device, which is not limited in this embodiment of the application.
The floating interaction control may include a floating operation button of any shape; for example, the floating operation button may be a small circle floating in the first interface.
In some embodiments, responding to the first movement operation of the hover interaction control in the first interface may include: responding to the first movement operation of the floating interactive control in the default position in the first interface. Wherein the default position is a position designed in the first interface for displaying the floating interaction control.
In some embodiments, where the first interface includes multiple interfaces, the multiple interfaces may belong to the same application or different applications. The default position of the floating interaction control is located in any interface of the plurality of interfaces, and the floating interaction control can be an advanced interaction control of the interface.
In some embodiments, the first movement operation may be triggered by a user through a gesture, which may include a contactless type or a contact type, and the embodiments of the present application are not limited thereto.
For example, in the case that the gesture is a contact-type gesture, the first movement operation may be a dragging operation on the floating interaction control; the display position of the floating interaction control in the first interface may track the user's drag position, and the floating interaction control is in a pressed state during the movement.
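The drag behavior just described, in which the control's display position tracks the drag point while pressed, can be sketched as follows. This is a minimal, hypothetical model for illustration only, not the patent's implementation; all names are assumptions.

```java
// Hypothetical sketch (not the patent's implementation): a floating control
// whose display position tracks the user's drag point while it is pressed.
class FloatingControl {
    float x, y;          // current display position in the first interface
    boolean pressed;     // true while the user holds the control

    FloatingControl(float x, float y) { this.x = x; this.y = y; }

    void onDragStart(float px, float py) {  // first touch on the control
        pressed = true;
        x = px; y = py;
    }

    void onDragMove(float px, float py) {   // control follows the drag point
        if (pressed) { x = px; y = py; }
    }

    void onDragEnd() {                      // release: leave the pressed state
        pressed = false;
    }
}
```

A release without a preceding press leaves the position unchanged, matching the rule that the control only moves while in the pressed state.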
102. And if the floating interaction control is detected to stay in the target interaction area in the first interface, generating and displaying a second interface according to the interface content contained in the target interaction area and the interaction function of the floating interaction control in the first interface.
The first interface may include at least one interactive area, each of which may include corresponding interface content, and the interactive area may refer to an area in which a user may trigger new interactive content (e.g., jump to a new interface, change interface content in the area, etc.) through an interactive operation. The target interaction area is any one of the at least one interaction area, for example, the target interaction area may be an area where a user wants to perform human-computer interaction. The interactive region may include a list, a card, and the like.
In some embodiments, detecting that the floating interaction control stays in the target interaction area in the first interface may include, but is not limited to, the following manners:
Manner 1: when it is detected that the floating interaction control has stayed in any interaction area in the first interface for longer than a time threshold, that interaction area is taken as the target interaction area;
Manner 2: when it is detected that the floating interaction control switches from the pressed state to the released state in any interaction area in the first interface, that interaction area is taken as the target interaction area. The released state indicates that the floating interaction control is not pressed.
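The two stay-detection manners above can be sketched as two small predicates. This is a hypothetical Java illustration; the threshold value and all names are assumptions, not taken from the patent.

```java
// Hypothetical sketch of the two stay-detection manners described above.
class StayDetection {
    static final long DWELL_THRESHOLD_MS = 500;  // illustrative time threshold

    // Manner 1: the control has rested in one interaction area longer
    // than the time threshold.
    static boolean stayedByDwell(long enteredAtMs, long nowMs) {
        return nowMs - enteredAtMs > DWELL_THRESHOLD_MS;
    }

    // Manner 2: the control switched from the pressed state to the
    // released state while inside the interaction area.
    static boolean stayedByRelease(boolean wasPressed, boolean isPressed) {
        return wasPressed && !isPressed;
    }
}
```

Either predicate returning true for an interaction area makes that area the target interaction area.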
The interactive function of the floating interactive control in the first interface can comprise at least one of the following: and adding, editing, sharing, navigating, creating and the like, wherein the interaction functions corresponding to the first interfaces of different application programs can be different.
In some embodiments, displaying the second interface includes: jumping from the first interface to the second interface; or displaying the second interface in a popup window or in split-screen mode. If the second interface is displayed by jumping, the first interface is no longer displayed while the second interface is displayed; if the second interface is displayed in a popup window or in split-screen mode, the first interface continues to be displayed.
By implementing the method, the floating interaction control can interact with the interface content in the interaction area in the first interface in the moving process, so that the generated second interface can automatically associate the interface content contained in the target interaction area, and the interaction efficiency is greatly improved.
Referring to fig. 2A, fig. 2A is a schematic flowchart of an interaction method disclosed in an embodiment of the present application. The interaction method shown in fig. 2A may include the following steps:
201. and responding to the first operation, and controlling the state of the floating interaction control in the first interface to be a movable state.
Alternatively, the first operation may include, but is not limited to, speech or a gesture. In the case where the first operation is a gesture, the first operation may include, but is not limited to, a pressing operation or a sliding operation.
In some embodiments, when the state of the hovering interaction control is the movable state, a prompt may be further performed in a second specified manner to prompt that the hovering interaction control is in the movable state.
In some embodiments, the second designation includes at least one of: displaying a prompt animation on the suspension interaction control; controlling the vibration component to vibrate at a second vibration frequency; and controlling the audio playing device to play the second designated audio.
Optionally, the prompt animation may be a swelling animation of the floating interaction control. By implementing the method, the user can know the state of the floating interaction control conveniently.
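Step 201, switching the control into a movable state and issuing the cues of the second specified manner, can be modeled roughly as follows. This is a hypothetical sketch; the prompt labels merely stand in for the animation, vibration, and audio cues named above and are not part of the patent.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch of step 201: the first operation switches the control
// into a movable state, and the "second specified manner" cues are issued.
class HoverControl {
    enum State { IDLE, MOVABLE }

    State state = State.IDLE;
    List<String> prompts = new ArrayList<>();  // records which cues were issued

    void onFirstOperation() {                  // e.g. a press or slide gesture
        state = State.MOVABLE;
        prompts.add("swell-animation");        // prompt animation on the control
        prompts.add("vibration@f2");           // second vibration frequency
        prompts.add("audio#2");                // second designated audio
    }
}
```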
202. And responding to the first movement operation of the floating interaction control in the first interface so as to control the floating interaction control to move.
In the case that the first interface includes one application interface, the first movement operation may be used to control the floating interaction control to move within that application interface. In the case that the first interface includes a plurality of application interfaces, the first movement operation may be used to control the floating interaction control to move across the plurality of application interfaces.
In some embodiments, the first interface includes a first application interface and a second application interface, the default position of the hover interaction control is at the first application interface, at which time the first move operation may be used to control movement of the hover interaction control from the first application interface to the second application interface.
203. And if the floating interaction control is detected to stay in the target interaction area in the first interface, generating and displaying a second interface according to the interface content contained in the target interaction area and the interaction function of the floating interaction control in the first interface.
In some embodiments, the floating interaction control may be in an immovable state when it stays in the first interface. Based on this, detecting that the floating interaction control stays in the target interaction area in the first interface may include: when it is detected that the floating interaction control is in the immovable state in any interaction area of the first interface, taking that interaction area as the target interaction area.
In some embodiments, if it is detected that the floating interaction control stays in the target interaction area in the first interface, a prompt may be further performed in a first specified manner to prompt that the current interaction content is the interface content included in the target interaction area.
For example:
example 1, if the first interface is a mail list, the interaction function of the floating interaction control in the first interface is a mail reply, the interface content included in the target interaction area is any mail, and prompting that the current interaction content is the interface content included in the target interaction area includes: and prompting to reply to the mail.
Example 2, if the first interface is a calendar, the interaction function of the floating interaction control in the first interface is created for a schedule, the interface content included in the target interaction area is any date, and prompting that the current interaction content is the interface content included in the target interaction area includes: prompting for schedule creation for the date.
Optionally, the first specifying manner includes at least one of: displaying a prompt animation for the interface content in the target interaction area; controlling the vibration component to vibrate at a first vibration frequency; and playing the first designated audio through the audio playing device.
In some embodiments, when the floating interaction control stays in the target interaction area in the first interface, the contact area between the floating interaction control and the target interaction area is greater than or equal to a percentage threshold of the total area of the target interaction area. The percentage threshold may be, for example, 50%, 55%, 60%, or 65%.
In some embodiments, detecting that the floating interaction control stays in the target interaction area in the first interface may include: when it is detected that the floating interaction control is in the immovable state in any interaction area of the first interface, and the contact area between the floating interaction control and that interaction area is greater than or equal to a percentage threshold of the total area of that interaction area, taking that interaction area as the target interaction area.
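The contact-area condition above can be sketched with axis-aligned rectangles. This is a hypothetical Java illustration assuming rectangular bounds for both the control and the interaction area; the 0.5 threshold used in the test mirrors the 50% example value mentioned above.

```java
// Hypothetical sketch of the contact-area condition, assuming axis-aligned
// rectangular bounds for the control and the interaction area.
class Rect {
    final float left, top, right, bottom;

    Rect(float l, float t, float r, float b) {
        left = l; top = t; right = r; bottom = b;
    }

    float area() { return (right - left) * (bottom - top); }

    // Area of the intersection with another rectangle, 0 if disjoint.
    float overlap(Rect o) {
        float w = Math.min(right, o.right) - Math.max(left, o.left);
        float h = Math.min(bottom, o.bottom) - Math.max(top, o.top);
        return (w > 0 && h > 0) ? w * h : 0;
    }

    // True when the control covers at least `threshold` (e.g. 0.5 for 50%)
    // of this interaction area's total area.
    boolean staysOver(Rect control, float threshold) {
        return control.overlap(this) >= threshold * area();
    }
}
```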
In some embodiments, the prompt animation displayed on the interface content may be a zoom animation of the interface content.
In some embodiments, if it is detected that the floating interaction control stays in the target interaction area in the first interface, first prompt information may be further displayed at a specified position of the target interaction area to prompt an interaction function of the floating interaction control in the first interface.
Based on the above description, the floating interaction control staying in the target interaction area of the first interface may include: in the case that the first interface includes one application interface, the floating interaction control staying in a target interaction area of that application interface; and in the case that the first interface includes a first application interface and a second application interface, the floating interaction control staying in a target interaction area of the second application interface.
204. And if the floating interaction control is detected to stay in the non-interaction area or the default position of the first interface, displaying a third interface according to the interaction function of the floating interaction control in the first interface.
In a case that the first interface includes a first application interface and a second application interface, the non-interactive area may be in the first application interface or the second application interface. The third interface is a default interactive interface that is not associated with interface content.
The following describes, by way of example, a case where the first interface includes one application interface:
example 1, the first interface is a calendar interface, the interaction function of the floating interaction control is schedule creation, the target interaction area is a location of any specific date, the user can move the floating interaction control from a default location to the location of the specific date through a first moving operation, and the second interface is a schedule creation interface for the specific date.
Referring to fig. 2B, fig. 2B is a schematic diagram of a first interface disclosed in an embodiment of the present application. The first interface shown in fig. 2B is a calendar interface, which includes a floating interaction control 10, a default position 20, and a target interaction area 30. The floating interaction control 10 is located at the default position 20; when the user moves the floating interaction control 10 from the default position 20 to the target interaction area 30, a corresponding schedule interface can be generated according to the date corresponding to the target interaction area 30. For the schedule interface, refer to fig. 2C, which is a schematic diagram of a second interface disclosed in an embodiment of the present application. The schedule interface shown in fig. 2C includes a date display area 10, and the date shown in the date display area 10 is the date corresponding to the target interaction area 30 in fig. 2B.
Example 2, the first interface is a mail list interface, the interaction function of the floating interaction control is mail reply, the target interaction area is the position of any mail, the user can move the floating interaction control from the default position to the position of the mail through the first moving operation, and the second interface is a reply interface for the mail.
Referring to fig. 2D, fig. 2D is another schematic diagram of a first interface disclosed in an embodiment of the present application. The first interface shown in fig. 2D is a mail list interface, which includes a floating interaction control 10, a default position 20, and a target interaction area 30. The floating interaction control 10 is located at the default position 20; when the user moves the floating interaction control 10 from the default position 20 to the target interaction area 30, a corresponding reply mail can be generated according to the mail information corresponding to the target interaction area 30. For the reply mail, refer to fig. 2E, which is a schematic diagram of a second interface disclosed in an embodiment of the present application. The second interface shown in fig. 2E is a mail reply interface, which includes a recipient display area 10 and a mail subject display area 20; the recipient information displayed in the recipient display area 10 and the subject information displayed in the subject display area 20 both correspond to the mail information of the target interaction area 30.
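The pattern shared by the two examples above, deriving the second interface from the first interface's interaction function plus the content of the target interaction area, can be sketched as a simple mapping. This is a hypothetical illustration; the function names and result strings are placeholders, not part of the patent.

```java
// Hypothetical sketch: the second interface is derived from the first
// interface's interaction function plus the target interaction area's
// content. All strings are illustrative placeholders.
class SecondInterfaceFactory {
    static String build(String interactionFunction, String regionContent) {
        switch (interactionFunction) {
            case "mail-reply":       // example 2: mail list interface
                return "reply editor pre-addressed to: " + regionContent;
            case "schedule-create":  // example 1: calendar interface
                return "schedule editor pre-filled with date: " + regionContent;
            default:                 // cf. step 204: default interface
                return "default interface for " + interactionFunction;
        }
    }
}
```

Dropping the control on a non-interaction area would fall through to the default branch, matching step 204's default interactive interface that is not associated with interface content.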
The following describes, by way of example, a case where the first interface includes a first application interface and a second application interface.
Referring to fig. 2F, fig. 2F is a schematic flowchart of a second interface generation method. The method shown in fig. 2F may include the following steps:
211. Control the floating interaction control to move in response to a first movement operation on the floating interaction control in the first application interface.
212. If the floating interaction control is detected to stay in a target interaction area of the second application interface, generate the second interface according to the interface content of the target interaction area and the interaction function of the floating interaction control in the first application interface.
In some embodiments, the second application interface may be controlled to switch from the static mode to the interaction mode when the floating interaction control moves to a second interaction area in the second application interface. The second interaction area includes the target interaction area; the second application interface in the static mode has no interactive areas, while the second application interface in the interaction mode does.
Further, the target interaction area of the second application interface may include the target interaction area of the second application interface in the interaction mode.
In some embodiments, after the second interface is generated according to the interface content of the target interaction area and the interaction function of the floating interaction control in the first application interface, the second application interface can further be controlled to switch from the interaction mode back to the static mode. In this way, once the second interface is generated, the user is prevented from mistakenly touching the second application interface.
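The mode switching just described can be sketched as a small state machine. The class and method names are illustrative assumptions only, not any actual implementation.

```python
# Illustrative sketch (hypothetical names): the second application interface
# switches static -> interaction when the floating control enters its second
# interaction area, and back to static once the second interface is generated.

class SecondAppInterface:
    def __init__(self, second_area):
        self.mode = "static"          # no interactive areas in this mode
        self.second_area = second_area  # set of positions forming the second interaction area

    def on_control_enters(self, position):
        # Entering the second interaction area switches static -> interaction.
        if position in self.second_area and self.mode == "static":
            self.mode = "interaction"

    def on_second_interface_generated(self):
        # Switching back to static prevents accidental touches afterwards.
        self.mode = "static"

ui = SecondAppInterface(second_area={(1, 1), (1, 2)})
ui.on_control_enters((1, 1))
assert ui.mode == "interaction"
ui.on_second_interface_generated()
assert ui.mode == "static"
```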
The method shown in fig. 2F is explained below with specific examples:
example 1, a first application interface is a mail list interface, a second application interface is a picture interface, an interaction function of a floating interaction control in the mail list interface is to write a mail, and a user can move the floating interaction control from the mail list interface to the picture interface through a first moving operation, and add the picture to the mail writing interface as a mail attachment when the floating interaction control is left in any picture in the picture interface.
Referring to fig. 2G, fig. 2G is another schematic diagram of the first interface disclosed in an embodiment of the present application. The first interface shown in fig. 2G includes a mail list interface 10 and a picture interface 20, where the mail list interface 10 includes a floating interaction control 110 and a default position 120, and the picture interface 20 includes a second interaction area 210. When the floating interaction control is moved from the default position 120 to the second interaction area 210, the picture interface 20 switches from the static mode to the interaction mode. For the picture interface 20 in the interaction mode, refer to fig. 2H, which is another schematic diagram of the first interface disclosed in an embodiment of the present application; it includes the mail list interface 10 and the picture interface 20 in the interaction mode.
Referring to fig. 2I, fig. 2I is another schematic diagram of the first interface. The schematic shown in fig. 2I includes a mail list interface 10 and a picture interface 20; the picture interface 20 includes a second interaction area 210 and a target interaction area 220. When the floating interaction control 110 stays in the target interaction area 220, a mail writing interface containing the picture of the target interaction area 220 is generated. For the mail writing interface, refer to fig. 2J, which is another schematic diagram of the second interface disclosed in an embodiment of the present application; fig. 2J includes a mail body display area 10, and the mail body display area 10 displays the picture of the target interaction area 220.
Example 2: the first application interface is a mail list interface, the second application interface is a schedule card interface, and the interaction function of the floating interaction control in the mail list interface is mail writing. The user can move the floating interaction control from the mail list interface to the schedule card interface through the first movement operation; when the floating interaction control stays on any schedule card in the schedule card interface, that schedule card is added to the mail writing interface as a mail attachment.
Referring to fig. 2K, fig. 2K is a schematic diagram of a first interface disclosed in an embodiment of the present application. The first interface shown in fig. 2K includes a mail list interface 10 and a schedule card interface 20, where the mail list interface 10 includes a floating interaction control 110 and a default position 120, and the schedule card interface 20 includes a second interaction area 210. When the floating interaction control moves from the default position 120 to the second interaction area 210, the schedule card interface 20 switches from the static mode to the interaction mode. For the schedule card interface 20 in the interaction mode, refer to fig. 2L, which is another schematic diagram of the first interface; it includes the mail list interface 10 and the schedule card interface 20 in the interaction mode.
Referring to fig. 2M, fig. 2M is another schematic diagram of the first interface. The schematic shown in fig. 2M includes a mail list interface 10 and a schedule card interface 20; the schedule card interface 20 includes a second interaction area 210 and a target interaction area 220. When the floating interaction control 110 stays in the target interaction area 220, a mail writing interface containing the schedule card of the target interaction area 220 is generated. For the mail writing interface, refer to fig. 2O, which is another schematic diagram of the second interface disclosed in an embodiment of the present application; fig. 2O includes a mail body display area 10, and the mail body display area 10 displays the schedule card of the target interaction area 220.
Referring to fig. 2P, fig. 2P is a schematic flowchart of another second interface generation method. The method shown in fig. 2P may include the following steps:
221. Control the floating interaction control to move in response to a first movement operation on the floating interaction control in the first application interface.
222. If the floating interaction control is detected to stay in a target interaction area of the second application interface, generate the second interface according to the interface content of the target interaction area and the interaction function of the floating interaction control in the first application interface.
For descriptions of steps 221 and 222, refer to the descriptions of steps 211 and 212 above; details are not repeated here.
223. Control the floating interaction control to continue moving in response to a second movement operation on the floating interaction control.
For the description of the second movement operation, refer to the description of the first movement operation above; details are not repeated here.
224. If the floating interaction control is detected to stay in a first interaction area of the second application interface again, add the interface content of the first interaction area to the second interface.
The first interaction area can be any interaction area in the second application interface other than the target interaction area. It can be understood that whenever the floating interaction control stays in any interaction area of the second application interface again, the interface content corresponding to that interaction area may be added to the second interface.
It should be noted that, in the embodiments of the present application, the number of times the floating interaction control stays in interaction areas of the second application interface is not limited.
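Steps 221 to 224 can be sketched as follows: the first stay generates the second interface, and each later stay appends more content to it. All names below are illustrative assumptions only.

```python
# Illustrative sketch (hypothetical names): repeated stays of the floating
# control accumulate interface content into the already-generated second
# interface; the number of stays is not limited.

class MailWritingInterface:
    def __init__(self, interaction_function):
        self.interaction_function = interaction_function  # e.g. mail writing
        self.body_contents = []

    def add(self, content):
        self.body_contents.append(content)

def handle_stays(interaction_function, stayed_contents):
    # The first stay generates the second interface; every subsequent stay
    # adds the corresponding interaction area's content to it.
    second_interface = MailWritingInterface(interaction_function)
    for content in stayed_contents:
        second_interface.add(content)
    return second_interface

# As in fig. 2Q/2R: first stay in the target interaction area, second stay
# in the first interaction area, both schedule cards end up in the mail body.
ui = handle_stays("write_mail", ["schedule card 220", "schedule card 230"])
```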
Referring to fig. 2Q, fig. 2Q is another schematic diagram of a first interface disclosed in an embodiment of the present application. The first interface shown in fig. 2Q includes a mail list interface 10 and a schedule card interface 20 in the interaction mode, where the mail list interface 10 includes a floating interaction control 110 and a default position 120, and the schedule card interface 20 includes a second interaction area 210, a target interaction area 220, and a first interaction area 230. As shown in fig. 2Q, the floating interaction control 110 stays in the target interaction area 220 the first time and in the first interaction area 230 the second time; the generated mail writing interface then includes both the schedule card of the target interaction area 220 and the schedule card of the first interaction area 230. For the mail writing interface, refer to fig. 2R, which is another schematic diagram of a second interface disclosed in an embodiment of the present application; fig. 2R includes a mail body display area 10, and the mail body display area 10 displays the schedule card of the target interaction area 220 and the schedule card of the first interaction area 230.
Referring to fig. 3A, fig. 3A is a schematic flowchart illustrating an interaction method according to an embodiment of the present disclosure. The interaction method as shown in fig. 3A may include the steps of:
301. Control the floating interaction control to move in response to a first movement operation on the floating interaction control in a first interface, where the first interface comprises an application interface.
302. If the floating interaction control is detected to stay in a target interaction area in the first interface, display the function icons corresponding to the sub-functions of the interaction function of the floating interaction control in the first interface.
In some embodiments, if the first interface is a mail list interface and the interaction function of the floating interaction control in the mail list interface is mail reply, the sub-functions may include, but are not limited to, copy, forward, and the like. If the first interface is a calendar interface and the interaction function of the floating interaction control in the calendar interface is schedule creation, the sub-functions may include, but are not limited to, copy, schedule reminder, invitee addition, attachment addition, and the like.
In some embodiments, the floating interaction control can change to a light-colored, translucent state while the function icons corresponding to the respective sub-functions are displayed.
303. Control the floating interaction control to continue moving in response to a third movement operation on the floating interaction control.
For the description of the third movement operation, refer to the description of the first movement operation above; details are not repeated here.
304. If the floating interaction control is detected to stay on a target function icon, generate and display a second interface according to the interface content of the target interaction area, the interaction function of the floating interaction control in the first interface, and the sub-function corresponding to the target function icon; the target function icon is the function icon of any one of the at least one sub-function.
In some embodiments, the function icons corresponding to the sub-functions are displayed in a straight-line arrangement on a first side of the floating interaction control; or the function icons corresponding to the sub-functions are displayed in an arc-shaped arrangement on a second side of the floating interaction control.
In some embodiments, the arrangement of the function icons corresponding to the sub-functions is determined according to the layout style of the target interaction area. When the layout style is a list style, the function icons are arranged in a straight line. Referring to fig. 3B, fig. 3B is a schematic diagram of an arrangement of function icons; it may include a plurality of list contents 10, a floating interaction control 20, and five function icons arranged in a straight line. When the layout style is a card style, the function icons are arranged in an arc. Referring to fig. 3C, fig. 3C is another schematic diagram of an arrangement of function icons; it may include a plurality of card contents 10, a floating interaction control 20, and five function icons arranged in an arc.
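The choice of arrangement by layout style can be sketched as follows. The coordinates are illustrative offsets relative to the floating interaction control, and the function name and parameters are assumptions for the sketch only.

```python
import math

# Illustrative sketch (hypothetical names): pick the icon arrangement from
# the layout style of the target interaction area, as described above.

def icon_positions(layout_style, n_icons, spacing=1.0, radius=2.0):
    if layout_style == "list":
        # List-style layout: icons in a straight line on one side of the control.
        return [(spacing * (i + 1), 0.0) for i in range(n_icons)]
    elif layout_style == "card":
        # Card-style layout: icons spread over a quarter arc around the control.
        angles = [math.pi / 2 * i / max(n_icons - 1, 1) for i in range(n_icons)]
        return [(radius * math.cos(a), radius * math.sin(a)) for a in angles]
    raise ValueError(f"unknown layout style: {layout_style}")

line = icon_positions("list", 5)   # five icons in a line, as in fig. 3B
arc = icon_positions("card", 5)    # five icons on an arc, as in fig. 3C
```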
For example, referring to fig. 3D, fig. 3D is another schematic diagram of the first interface disclosed in an embodiment of the present application. The first interface shown in fig. 3D is a mail list interface that includes a floating interaction control 10, a default position 20, a target interaction area 30, a sub-function icon 40, and a sub-function icon 50. The function corresponding to the sub-function icon 40 is copy, and the function corresponding to the sub-function icon 50 is forward. As shown in fig. 3D, the floating interaction control 10 stays in the target interaction area 30 based on the first movement operation, which triggers display of the sub-function icons 40 and 50; the floating interaction control 10 then stays on the sub-function icon 40 based on the third movement operation, at which point a reply mail is generated according to the mail information corresponding to the target interaction area 30 and the function corresponding to the sub-function icon 40. For the reply mail, refer to fig. 3E, which is another schematic diagram of the second interface disclosed in an embodiment of the present application. The mail reply interface shown in fig. 3E includes a recipient display area 10, a mail subject display area 20, and a copy display area 30; both the recipient information displayed in the recipient display area 10 and the subject information displayed in the subject display area 20 correspond to the mail information of the target interaction area 30.
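Step 304 above combines three inputs to produce the second interface: the target area's content, the control's interaction function, and the chosen sub-function. A minimal sketch, with all names and the handling of each sub-function being illustrative assumptions:

```python
# Illustrative sketch (hypothetical names): generate the second interface
# from the area content, the control's interaction function, and the
# sub-function whose icon the floating control stays on.

def generate_with_subfunction(area_content, interaction_function, sub_function):
    second_interface = {"function": interaction_function, "content": area_content}
    if sub_function == "copy":
        second_interface["copy_area"] = []   # e.g. the copy display area of fig. 3E
    elif sub_function == "forward":
        second_interface["forward"] = True
    return second_interface

# As in fig. 3D/3E: reply to a mail with the "copy" sub-function selected.
reply = generate_with_subfunction({"mail": "#3"}, "mail_reply", "copy")
```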
By implementing the above method, the floating interaction control can interact with the interface content of an interaction area in the first interface during its movement, so that the generated second interface automatically associates the interface content contained in the target interaction area, which greatly improves interaction efficiency. In addition, building on this association between the floating interaction control and the interface content, more sub-functions of the floating interaction control are exposed, which enriches the types of available operations and further improves interaction efficiency.
Referring to fig. 4, fig. 4 is a block diagram of an interaction apparatus disclosed in an embodiment of the present application. The interaction apparatus shown in fig. 4 may include a first processing unit 401 and a second processing unit 402, wherein:
the first processing unit 401 is configured to respond to a first movement operation of a floating interaction control in a first interface to control movement of the floating interaction control;
the second processing unit 402 is configured to, if it is detected that the floating interaction control stays in the target interaction area in the first interface, generate and display a second interface according to interface content included in the target interaction area and an interaction function of the floating interaction control in the first interface.
In some embodiments, the second processing unit 402 is further configured to prompt in a first specified manner if it is detected that the floating interaction control stays in the target interaction area in the first interface, so as to prompt that the current interaction content is the interface content included in the target interaction area.
In some embodiments, the first specified manner includes at least one of the following:
displaying a prompt animation for the interface content in the target interaction area;
controlling the vibration component to vibrate at a first vibration frequency;
playing the first specified audio through the audio playing device.
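The "first specified manner" above may combine any of the three prompt channels. A minimal sketch, with all function and device names being assumptions that stand in for the actual animation, vibration, and audio components:

```python
# Illustrative sketch (hypothetical names): the prompt combines at least
# one of an animation, vibration at a first frequency, and a first
# specified audio; the device calls are stubbed as strings.

def prompt_first_manner(use_animation=True, use_vibration=False, use_audio=False,
                        first_vibration_frequency=50):
    actions = []
    if use_animation:
        actions.append("animate_target_area_content")
    if use_vibration:
        actions.append(f"vibrate@{first_vibration_frequency}Hz")
    if use_audio:
        actions.append("play_first_specified_audio")
    return actions  # at least one of the three manners is selected
```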
In some embodiments, the first interface includes a first application interface and a second application interface, and the default position of the floating interaction control is in the first application interface;
the manner, by which the second processing unit 402 is configured to generate the second interface according to the interface content included in the target interaction region and the interaction function of the floating interaction control in the first interface if it is detected that the floating interaction control stays in the target interaction region in the first interface, may specifically include:
the second processing unit 402 is configured to, if it is detected that the floating interaction control stays in the target interaction area of the second application interface, generate the second interface according to the interface content of the target interaction area and the interaction function of the floating interaction control in the first application interface.
In some embodiments, the manner in which the second processing unit 402 is configured to generate the second interface according to the interface content of the target interaction area and the interaction function of the floating interaction control in the first application interface may specifically include: the second processing unit 402 is configured to generate a second interface according to the interaction function of the floating interaction control in the first application interface, and insert the interface content of the target interaction area into the second interface.
In some embodiments, the second processing unit 402 is further configured to respond to a second moving operation of the floating interaction control after generating the second interface according to the interface content of the target interaction area and the interaction function of the floating interaction control in the first application interface, so as to control the floating interaction control to continue to move; and if the floating interaction control is detected to stay in the first interaction area of the second application interface again, adding the interface content of the first interaction area to the second interface, wherein the first interaction area is an interaction area except the target interaction area in the second application interface.
In some embodiments, the first processing unit 401 is further configured to control the second application interface to switch from the static mode to the interactive mode when the floating interaction control moves to the second interaction region in the second application interface; the second application interface in the static mode has no interactive area, and the second application interface in the interactive mode has an interactive area;
the target interaction area in the first interface comprises: a target interaction area in the second application interface in the interaction mode; the second interaction area comprises a target interaction area;
the second processing unit 402 is further configured to control the second application interface to be switched from the interaction mode to the static mode after the second interface is generated according to the interface content of the target interaction area and the interaction function of the floating interaction control in the first application interface.
In some embodiments, the first processing unit 401 is further configured to, before responding to the first movement operation on the floating interaction control in the first interface to control the floating interaction control to move, control, in response to a first operation, the state of the floating interaction control in the first interface to be a movable state;
further, the state of the floating interaction control after staying in the target interaction area is an immovable state.
In some embodiments, the first processing unit 401 is further configured to prompt in a second specified manner when the state of the floating interaction control is the movable state, so as to prompt that the floating interaction control is in the movable state.
In some embodiments, the second specified manner includes at least one of the following:
displaying a prompt animation for the suspension interaction control;
controlling the vibration component to vibrate at a second vibration frequency;
controlling the audio playing device to play the second specified audio.
In some embodiments, the interaction function of the floating interaction control in the first interface corresponds to at least one sub-function.
The manner in which the second processing unit 402 is configured to, if it is detected that the floating interaction control stays in the target interaction area in the first interface, generate the second interface according to the interface content included in the target interaction area and the interaction function of the floating interaction control in the first interface may specifically include:
the second processing unit 402, if detecting that the floating interaction control stays in the target interaction area in the first interface, displaying a function icon corresponding to each sub-function; responding to the third movement operation of the floating interaction control to control the floating interaction control to continue to move; if the suspension interaction control is detected to stay on the target function icon, generating a second interface according to the interface content of the target interaction area, the interaction function of the suspension interaction control in the first interface and the sub-function corresponding to the target function icon; the target function icon is a function icon of any one of the at least one sub-function.
In some embodiments, the function icons corresponding to the sub-functions are displayed in a straight-line arrangement on a first side of the floating interaction control; or the function icons corresponding to the sub-functions are displayed in an arc-shaped arrangement on a second side of the floating interaction control.
In some embodiments, the arrangement mode of the function icons corresponding to each sub-function is determined according to the layout style corresponding to the target interaction area; wherein, under the condition that the layout style is the list style, the arrangement mode of the function icons corresponding to each sub-function is a linear arrangement mode; and under the condition that the layout style is a card style, the arrangement mode of the function icons corresponding to the sub-functions is an arc-shaped arrangement mode.
In some embodiments, the second processing unit 402 is further configured to, if it is detected that the floating interaction control stays in the non-interaction area or the default position of the first interface, display a third interface according to the interaction function of the floating interaction control in the first interface, where the third interface is a default interaction interface.
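The fallback behavior just described (the control staying in a non-interaction area or at its default position yields the default "third interface") can be sketched as follows; the function and field names are illustrative assumptions only.

```python
# Illustrative sketch (hypothetical names): a stay in an interaction area
# yields a second interface with the area's content; a stay in a
# non-interaction area or at the default position yields the default
# interaction interface (the "third interface").

def interface_for_stay(area_content, interaction_function):
    if area_content is None:
        # Non-interaction area or default position: default interaction interface.
        return {"function": interaction_function, "content": None, "default": True}
    return {"function": interaction_function, "content": area_content, "default": False}
```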
Referring to fig. 5, fig. 5 is a block diagram of a terminal device according to an embodiment of the present disclosure. As shown in fig. 5, the terminal device may comprise a processor 501, a memory 502 coupled to the processor 501, wherein the memory 502 may store one or more computer programs.
Processor 501 may include one or more processing cores. The processor 501 connects various parts of the terminal device using various interfaces and lines, and performs the functions of the terminal device and processes data by running or executing instructions, programs, code sets, or instruction sets stored in the memory 502 and calling data stored in the memory 502. Optionally, the processor 501 may be implemented in hardware using at least one of Digital Signal Processing (DSP), Field-Programmable Gate Array (FPGA), and Programmable Logic Array (PLA). The processor 501 may integrate one or more of a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), a modem, and the like. The CPU mainly handles the operating system, the user interface, application programs, and the like; the GPU is responsible for rendering and drawing display content; and the modem handles wireless communication. It is understood that the modem may also not be integrated into the processor 501 but instead be implemented by a separate communication chip.
The Memory 502 may include a Random Access Memory (RAM) or a Read-Only Memory (ROM). The memory 502 may be used to store instructions, programs, code, sets of codes, or sets of instructions. The memory 502 may include a stored program area and a stored data area, wherein the stored program area may store instructions for implementing an operating system, instructions for implementing at least one function (such as a touch function, a sound playing function, an image playing function, etc.), instructions for implementing the various method embodiments described above, and the like. The storage data area may also store data created by the terminal device in use, and the like.
In the embodiment of the present application, the processor 501 further has the following functions:
responding to a first movement operation on the floating interaction control in the first interface to control the floating interaction control to move; and
if the floating interaction control is detected to stay in the target interaction area in the first interface, generating and displaying a second interface according to the interface content contained in the target interaction area and the interaction function of the floating interaction control in the first interface.
In the embodiment of the present application, the processor 501 further has the following functions:
if the floating interaction control is detected to stay in the target interaction area in the first interface, prompting in a first specified manner to indicate that the current interaction content is the interface content contained in the target interaction area.
In the embodiments of the present application, the first specified manner includes at least one of the following:
displaying a prompt animation for the interface content in the target interaction area;
controlling the vibration component to vibrate at a first vibration frequency;
playing the first specified audio through the audio playing device.
In the embodiments of the present application, the first interface includes a first application interface and a second application interface, and the default position of the floating interaction control is in the first application interface. The processor 501 also has the following functions:
if the floating interaction control is detected to stay in the target interaction area of the second application interface, generating the second interface according to the interface content of the target interaction area and the interaction function of the floating interaction control in the first application interface.
In the embodiment of the present application, the processor 501 further has the following functions:
generating a second interface according to the interaction function of the floating interaction control in the first application interface, and inserting the interface content of the target interaction area into the second interface.
In the embodiment of the present application, the processor 501 further has the following functions:
responding to the second movement operation on the floating interaction control to control the floating interaction control to continue moving; and if the floating interaction control is detected to stay in the first interaction area of the second application interface again, adding the interface content of the first interaction area to the second interface, where the first interaction area is an interaction area in the second application interface other than the target interaction area.
In the embodiment of the present application, the processor 501 further has the following functions:
when the floating interaction control moves to a second interaction area in the second application interface, controlling the second application interface to be switched from the static mode to the interaction mode; the second application interface in the static mode does not have an interactive area, and the second application interface in the interactive mode has an interactive area;
the target interaction area in the first interface comprises: a target interaction area in the second application interface in the interaction mode; the second interaction area comprises a target interaction area;
after generating the second interface according to the interface content of the target interaction area and the interaction function of the floating interaction control in the first application interface, controlling the second application interface to switch from the interaction mode to the static mode.
In the embodiment of the present application, the processor 501 further has the following functions:
in response to the first operation, controlling the state of the floating interaction control in the first interface to be a movable state;
further, the state of the floating interaction control after staying in the target interaction area is an immovable state.
In the embodiment of the present application, the processor 501 further has the following functions:
when the state of the floating interaction control is the movable state, prompting in a second specified manner to indicate that the floating interaction control is in the movable state.
In the embodiments of the present application, the second specified manner includes at least one of the following:
displaying a prompt animation for the suspension interaction control;
controlling the vibration component to vibrate at a second vibration frequency;
controlling the audio playing device to play the second specified audio.
In some embodiments, the interaction function of the floating interaction control in the first interface corresponds to at least one sub-function.
In the embodiment of the present application, the processor 501 further has the following functions:
if it is detected that the floating interaction control stays in the target interaction area in the first interface, displaying the function icon corresponding to each sub-function; responding to the third movement operation on the floating interaction control to control the floating interaction control to continue moving; and if the floating interaction control is detected to stay on the target function icon, generating a second interface according to the interface content of the target interaction area, the interaction function of the floating interaction control in the first interface, and the sub-function corresponding to the target function icon, where the target function icon is the function icon of any one of the at least one sub-function.
In the embodiments of the present application, the function icons corresponding to the sub-functions are displayed in a straight-line arrangement on the first side of the floating interaction control; or the function icons corresponding to the sub-functions are displayed in an arc-shaped arrangement on the second side of the floating interaction control.
In the embodiment of the application, the arrangement mode of the function icons corresponding to each sub-function is determined according to the layout style corresponding to the target interaction area; wherein, under the condition that the layout style is the list style, the arrangement mode of the function icons corresponding to each sub-function is a linear arrangement mode; and under the condition that the layout style is a card style, the arrangement mode of the function icons corresponding to the sub-functions is an arc-shaped arrangement mode.
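The style-to-arrangement rule above reduces to a simple mapping. This is an illustrative sketch; the style names and the fallback for unrecognized styles are assumptions, since the embodiment only specifies the two cases list → linear and card → arc.

```python
def icon_arrangement(layout_style):
    """Map the target interaction area's layout style to the icon
    arrangement mode: list-style areas get a straight line of icons,
    card-style areas get an arc."""
    mapping = {"list": "linear", "card": "arc"}
    # Fallback to linear for unspecified styles (an assumption; the
    # embodiment does not define behaviour outside list/card).
    return mapping.get(layout_style, "linear")
```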
In the embodiment of the present application, the processor 501 further has the following functions:
and if the floating interaction control is detected to stay in the non-interaction area or the default position of the first interface, displaying a third interface according to the interaction function of the floating interaction control in the first interface, wherein the third interface is a default interaction interface.
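Putting the two branches together, where the floating control comes to rest determines which interface is shown: the target interaction area yields the content-specific second interface, while a non-interaction area or the control's default position yields the default third interface. The sketch below is illustrative only; the region labels and dict fields are assumed names.

```python
def interface_for_drop(region, interaction_function, area_content=None):
    """Dispatch on where the floating interaction control stays.

    region: "target"             -> second interface (area content + function)
            "none" or "default"  -> third, default interface (function only)
    """
    if region == "target":
        return {"kind": "second", "content": area_content,
                "function": interaction_function}
    # Non-interaction area, or the control returned to its default position.
    return {"kind": "third_default", "function": interaction_function}
```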
The embodiment of the application discloses a computer readable storage medium, which stores a computer program, wherein the computer program realizes the method described in the above embodiments when being executed by a processor.
Embodiments of the present application disclose a computer program product comprising a non-transitory computer-readable storage medium storing a computer program, the computer program being executable by a processor to implement the methods described in the embodiments above.
It will be understood by those skilled in the art that all or part of the processes in the methods of the embodiments described above can be implemented by a computer program; the program can be stored in a non-volatile computer-readable storage medium and, when executed, carries out the processes of the method embodiments described above. The storage medium may be a magnetic disk, an optical disc, a ROM, or the like.
Any reference to memory, storage, database, or other medium as used herein may include non-volatile and/or volatile memory. Suitable non-volatile memory can include ROM, Programmable ROM (PROM), Erasable PROM (EPROM), Electrically Erasable PROM (EEPROM), or flash memory. Volatile memory can include Random Access Memory (RAM), which acts as external cache memory. By way of illustration and not limitation, RAM is available in many forms, such as Static RAM (SRAM), Dynamic RAM (DRAM), Synchronous DRAM (SDRAM), Double Data Rate SDRAM (DDR SDRAM), Enhanced SDRAM (ESDRAM), Synchlink DRAM (SLDRAM), Rambus Direct RAM (RDRAM), and Direct Rambus DRAM (DRDRAM).
It should be appreciated that reference throughout this specification to "one embodiment" or "an embodiment" means that a particular feature, structure or characteristic described in connection with the embodiment is included in at least one embodiment of the present application. Thus, the appearances of the phrases "in one embodiment" or "in an embodiment" in various places throughout this specification are not necessarily all referring to the same embodiment. Furthermore, the particular features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. Those skilled in the art should also appreciate that the embodiments described in this specification are all alternative embodiments and that the acts and modules involved are not necessarily required for this application.
In the various embodiments of the present application, it should be understood that the serial numbers of the above-mentioned processes do not imply an inevitable order of execution, and the order of execution of the processes should be determined by their functions and inherent logic, and should not limit the implementation processes of the embodiments of the present application.
The functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated units, if implemented as software functional units and sold or used as stand-alone products, may be stored in a computer-accessible memory. Based on such understanding, the technical solution of the present application, in essence or in the part contributing to the prior art, or in whole or in part, can be embodied in the form of a software product stored in a memory, the software product including several instructions for causing a computer device (which may be a personal computer, a server, or a network device, and may specifically be a processor in the computer device) to execute some or all of the steps of the methods of the embodiments of the present application.
The interaction method, interaction apparatus, terminal device, and readable storage medium disclosed in the embodiments of the present application are described in detail above. The principles and implementations of the present application are explained herein using specific examples, and the description of the embodiments is intended only to help in understanding the method and core idea of the present application. Meanwhile, a person skilled in the art may, following the idea of the present application, make changes to the specific implementation and scope of application. In summary, the content of this specification should not be construed as limiting the present application.

Claims (17)

1. An interactive method, characterized in that the method comprises:
responding to a first movement operation of a floating interaction control in a first interface to control the floating interaction control to move;
and if the floating interaction control is detected to stay in the target interaction area of the first interface, generating and displaying a second interface according to the interface content contained in the target interaction area and the interaction function of the floating interaction control in the first interface.
2. The method of claim 1, further comprising:
and if the floating interaction control is detected to stay in the target interaction area in the first interface, prompting in a first specified mode to prompt that the current interaction content is the interface content contained in the target interaction area.
3. The method of claim 2, wherein the first specifying means comprises at least one of:
displaying a prompt animation for the interface content in the target interaction area;
controlling the vibration component to vibrate at a first vibration frequency;
and playing the first designated audio through the audio playing device.
4. The method of any of claims 1-3, wherein the first interface comprises a first application interface and a second application interface, and wherein a default position of the hover interaction control is at the first application interface;
if it is detected that the floating interaction control stays in the target interaction area of the first interface, generating a second interface according to the interface content of the target interaction area and the interaction function of the floating interaction control in the first interface, including:
and if the floating interaction control is detected to stay in the target interaction area of the second application interface, generating a second interface according to the interface content of the target interaction area and the interaction function of the floating interaction control in the first application interface.
5. The method of claim 4, wherein generating a second interface according to the interface content of the target interaction area and the interaction function of the floating interaction control in the first application interface comprises:
and generating a second interface according to the interactive function of the floating interactive control in the first application interface, and inserting the interface content of the target interactive area into the second interface.
6. The method according to claim 4 or 5, wherein after generating a second interface according to the interface content of the target interaction area and the interaction function of the floating interaction control in the first application interface, the method further comprises:
responding to a second movement operation of the floating interaction control to control the floating interaction control to continue to move;
and if the situation that the floating interaction control stays in the first interaction area of the second application interface again is detected, adding the interface content of the first interaction area to the second interface, wherein the first interaction area is an interaction area, except for the target interaction area, in the second application interface.
7. The method of claim 4, further comprising:
when the floating interaction control moves to a second interaction area in the second application interface, controlling the second application interface to be switched from a static mode to an interaction mode; wherein, the second application interface in the static mode has no interactive area, and the second application interface in the interactive mode has an interactive area;
the target interaction area in the first interface comprises: a target interaction area in the second application interface in the interaction mode; wherein the second interaction region comprises the target interaction region;
after a second interface is generated according to the interface content of the target interaction area and the interaction function of the floating interaction control in the first application interface, the method further includes:
and controlling the second application interface to be switched from the interactive mode to the static mode.
8. The method of any of claims 1-3, wherein prior to the operation of responding to the first movement operation of the floating interaction control in the first interface to control the floating interaction control to move, the method further comprises:
responding to the first operation, and controlling the state of the floating interaction control in the first interface to be a movable state;
and the state of the floating interaction control staying in the target interaction area is an immovable state.
9. The method of claim 8, further comprising:
and when the state of the floating interaction control is the movable state, prompting in a second specified mode to prompt that the floating interaction control is in the movable state.
10. The method of claim 9, wherein the second specifying means comprises at least one of:
displaying a prompt animation for the floating interaction control;
controlling the vibration component to vibrate at a second vibration frequency;
and controlling the audio playing device to play the second designated audio.
11. The method of any one of claims 1-3, wherein the interactive function of the floating interaction control in the first interface corresponds to at least one sub-function; if it is detected that the floating interaction control stays in the target interaction area of the first interface, generating a second interface according to interface content contained in the target interaction area and the interaction function of the floating interaction control in the first interface, including:
if it is detected that the floating interaction control stays in the target interaction area in the first interface, displaying a function icon corresponding to each sub-function;
responding to a third movement operation of the floating interaction control to control the floating interaction control to continue to move;
if it is detected that the floating interaction control stays on a target function icon, generating a second interface according to the interface content of the target interaction area, the interaction function of the floating interaction control in the first interface, and the sub-function corresponding to the target function icon; wherein the target function icon is a function icon of any one of the at least one sub-function.
12. The method according to claim 11, wherein the function icons corresponding to the sub-functions are arranged in a straight line and displayed on the first side of the floating interaction control; or,
the function icons corresponding to the sub-functions are arranged in an arc and displayed on the second side of the floating interaction control.
13. The method according to claim 12, wherein the arrangement of the function icons corresponding to the sub-functions is determined according to the layout style corresponding to the target interaction region;
wherein, when the layout style is a list style, the arrangement mode of the function icons corresponding to each sub-function is the straight-line arrangement mode;
and under the condition that the layout style is a card style, the arrangement mode of the function icons corresponding to the sub-functions is the arc arrangement mode.
14. The method according to any one of claims 1-3, further comprising:
and if it is detected that the floating interaction control stays in a non-interaction area or the default position of the first interface, displaying a third interface according to the interaction function of the floating interaction control in the first interface, wherein the third interface is a default interaction interface.
15. An interactive apparatus, comprising:
the first processing unit is used for responding to a first movement operation of a floating interaction control in a first interface so as to control the floating interaction control to move;
and the second processing unit is used for generating and displaying a second interface according to the interface content contained in the target interaction area and the interaction function of the floating interaction control in the first interface if the floating interaction control is detected to stay in the target interaction area in the first interface.
16. A terminal device, comprising:
a memory storing executable program code;
and a processor coupled to the memory;
the processor calls the executable program code stored in the memory, which when executed by the processor causes the processor to implement the method of any of claims 1-14.
17. A computer-readable storage medium having executable program code stored thereon, wherein the executable program code, when executed by a processor, implements the method of any of claims 1-14.
CN202111508399.0A 2021-12-10 2021-12-10 Interaction method, interaction device, terminal equipment and readable storage medium Pending CN114168020A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111508399.0A CN114168020A (en) 2021-12-10 2021-12-10 Interaction method, interaction device, terminal equipment and readable storage medium


Publications (1)

Publication Number Publication Date
CN114168020A 2022-03-11

Family

ID=80485462

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111508399.0A Pending CN114168020A (en) 2021-12-10 2021-12-10 Interaction method, interaction device, terminal equipment and readable storage medium

Country Status (1)

Country Link
CN (1) CN114168020A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115202548A (en) * 2022-06-30 2022-10-18 大众问问(北京)信息科技有限公司 Voice operation guiding method and device for application function, computer equipment and medium

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108717345A (en) * 2018-06-08 2018-10-30 Oppo广东移动通信有限公司 Methods of exhibiting, device, terminal and the storage medium of functionality controls
CN108920238A (en) * 2018-06-29 2018-11-30 上海连尚网络科技有限公司 Operate method, electronic equipment and the computer-readable medium of application
CN109710132A (en) * 2018-12-28 2019-05-03 维沃移动通信有限公司 Method of controlling operation thereof and terminal
CN110471591A (en) * 2019-08-08 2019-11-19 深圳传音控股股份有限公司 A kind of exchange method, device and computer storage medium
CN110879685A (en) * 2019-11-05 2020-03-13 维沃移动通信有限公司 Interaction method of application program interface and electronic equipment
CN111309418A (en) * 2020-01-21 2020-06-19 华为技术有限公司 Control display method and electronic equipment
CN112631492A (en) * 2020-12-30 2021-04-09 北京达佳互联信息技术有限公司 Task creation method and device
CN113419800A (en) * 2021-06-11 2021-09-21 北京字跳网络技术有限公司 Interaction method, device, medium and electronic equipment
CN113721819A (en) * 2021-09-02 2021-11-30 网易(杭州)网络有限公司 Man-machine interaction method and device and electronic equipment


Similar Documents

Publication Publication Date Title
JP7377319B2 (en) Systems, devices, and methods for dynamically providing user interface controls on a touch-sensitive secondary display
US11809700B2 (en) Device, method, and graphical user interface for managing folders with multiple pages
US11210458B2 (en) Device, method, and graphical user interface for editing screenshot images
US20220291793A1 (en) User interface for receiving user input
KR102222143B1 (en) Handwriting keyboard for screens
CN107422934B (en) Icon setting method and electronic equipment
CN109690481B (en) Method and apparatus for dynamic function row customization
US10613745B2 (en) User interface for receiving user input
KR102476243B1 (en) Touch input cursor manipulation
CN111324266B (en) Device, method and graphical user interface for sharing content objects in a document
KR102107491B1 (en) List scroll bar control method and mobile apparatus
CN109643207A (en) Desktop starter
US10860788B2 (en) Device, method, and graphical user interface for annotating text
KR20210134849A (en) Sharing user-configurable graphical constructs
CN113824998A (en) Music user interface
EP4002107A1 (en) Data binding method, apparatus, and device of mini program, and storage medium
CN107924256B (en) Emoticons and preset replies
US11893212B2 (en) User interfaces for managing application widgets
JP2022520094A (en) Interface display method and its devices, terminals and computer programs
US20220365831A1 (en) Devices, Methods, and Graphical User Interfaces for Automatically Providing Shared Content to Applications
CN113728301A (en) Device, method and graphical user interface for manipulating 3D objects on a 2D screen
KR20240005099A (en) Devices, methods, and graphical user interfaces for automatically providing shared content to applications
CN114168020A (en) Interaction method, interaction device, terminal equipment and readable storage medium
CN115268890A (en) Information processing method and device and electronic equipment
US20130169669A1 (en) Methods And Apparatus For Presenting A Position Indication For A Selected Item In A List

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination