CN114661219A - Device control method, device, storage medium and electronic device


Info

Publication number: CN114661219A
Application number: CN202210316421.XA
Authority: CN (China)
Prior art keywords: gesture, window, icon, window mode
Legal status: Pending
Other languages: Chinese (zh)
Inventor: 莫博宇
Current Assignee: Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee: Guangdong Oppo Mobile Telecommunications Corp Ltd
Application filed by: Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority application: CN202210316421.XA

Classifications

    • G06F3/04883: Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, e.g. input of commands through traced gestures, for inputting data by handwriting, e.g. gesture or text
    • G06F3/04817: Interaction techniques based on graphical user interfaces [GUI] using icons
    • G06F3/0484: Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range

Abstract

The application discloses a device control method and apparatus, a storage medium, and an electronic device. The device control method includes: receiving information of a first gesture while information of an application is displayed in a first window mode; if the first gesture includes a preset trigger gesture, displaying at least one icon, where each icon represents a processing operation on the first window mode; acquiring a gesture end position of the first gesture; if the gesture end position of the first gesture is detected to be located at the display position of an icon, determining the processing operation represented by that icon as a target processing operation; and executing the target processing operation. The method can improve the operability of the electronic device.

Description

Device control method, device, storage medium and electronic device
The present application is a divisional application of Chinese patent application No. 202010659201.8, filed with the China Patent Office on July 9, 2020 and entitled "Method for controlling device, apparatus, storage medium, and electronic device", the entire contents of which are incorporated herein by reference.
Technical Field
The present application relates to the field of interactive technologies, and in particular, to a device control method and apparatus, a storage medium, and an electronic device.
Background
With the development of technology, modes of human-computer interaction have become increasingly diverse. For example, a user may interact with an electronic device through touch operations on the display screen, through voice control, and so on. However, in the related art, the operability of electronic devices during such interaction remains limited.
Disclosure of Invention
Embodiments of the present application provide a device control method and apparatus, a storage medium, and an electronic device, which can improve the operability of the electronic device.
In a first aspect, an embodiment of the present application provides an apparatus control method, where the method includes:
receiving information of a first gesture when information of an application is displayed in a first window mode;
if the first gesture comprises a preset trigger gesture, displaying at least one icon, wherein each icon is used for representing a processing operation on the first window mode;
acquiring a gesture ending position of the first gesture;
if the gesture ending position of the first gesture is located at the display position of the icon, determining the processing operation represented by the corresponding icon as a target processing operation;
and executing the target processing operation.
In a second aspect, an embodiment of the present application provides an apparatus for controlling a device, where the apparatus includes:
the first receiving module is used for receiving information of a first gesture when the information of the application is displayed in a first window mode;
the display module is used for displaying at least one icon if the first gesture comprises a preset trigger gesture, and each icon is used for representing a processing operation on the first window mode;
the second receiving module is used for acquiring a gesture ending position of the first gesture;
the determining module is used for determining the processing operation represented by the corresponding icon as the target processing operation if the gesture ending position of the first gesture is located at the display position of the icon;
and the execution module is used for executing the target processing operation.
In a third aspect, an embodiment of the present application provides a storage medium on which a computer program is stored; when the computer program is executed on a computer, the computer is caused to perform the flow of the device control method provided in the embodiments of the present application.
In a fourth aspect, an embodiment of the present application further provides an electronic device, which includes a memory and a processor, where the processor is configured to execute the flow in the device control method provided in the embodiment of the present application by calling a computer program stored in the memory.
In the embodiments of the present application, when displaying information of an application in the first window mode, if the electronic device receives a first gesture including a preset trigger gesture, the electronic device may display at least one icon, where each icon represents a processing operation on the first window mode. The electronic device may then acquire the gesture end position of the first gesture and, when detecting that this end position is located at the display position of an icon, determine the processing operation represented by that icon as the target processing operation and execute it. Because the electronic device can perform the corresponding processing operation on the first window mode as soon as it receives a gesture including the trigger gesture, the first window mode can be handled quickly, which improves the operability of the electronic device.
Drawings
The technical solutions and advantages of the present application will become apparent from the following detailed description of specific embodiments of the present application when taken in conjunction with the accompanying drawings.
Fig. 1 is a schematic flowchart of a device control method provided in an embodiment of the present application.
Fig. 2 is another schematic flowchart of the device control method according to an embodiment of the present application.
Fig. 3 to fig. 12 are schematic diagrams of various scenarios of the device control method provided in an embodiment of the present application.
Fig. 13 is another operation diagram of the first window mode according to an embodiment of the present application.
Fig. 14 is a schematic structural diagram of a device control apparatus according to an embodiment of the present application.
Fig. 15 is a schematic structural diagram of an electronic device provided in an embodiment of the present application.
Fig. 16 is another schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
Referring to the drawings, wherein like reference numbers refer to like elements, the principles of the present application are illustrated as being implemented in a suitable computing environment. The following description is based on illustrated embodiments of the application and should not be taken as limiting the application with respect to other embodiments that are not detailed herein.
It is understood that the execution subject of the embodiment of the present application may be an electronic device such as a smart phone or a tablet computer.
Referring to fig. 1, fig. 1 is a schematic flowchart of a device control method according to an embodiment of the present application. The flow may include:
101. Information of a first gesture is received while information of an application is displayed in a first window mode.
In this embodiment, when the electronic device displays information of an application in the first window mode, the electronic device may receive a gesture from the user; this gesture is referred to as the first gesture. That is, the electronic device may receive information of the first gesture.
It should be noted that a window mode may mean that the electronic device creates a window on the display screen and displays in it either information the user wants to display (e.g., the running interface of an application specified by the user) or information of a currently running application (e.g., the foreground application).
In the process of receiving the information of the first gesture, the electronic device may detect whether the first gesture includes a preset trigger gesture.
It should be noted that, in this embodiment, the first gesture is a complete, continuous gesture. Taking the first gesture as a touch operation on the touch display screen as an example, the first gesture being complete and continuous means that the user's finger remains in contact with the touch display screen for the entire time the first gesture is being made, without leaving the screen.
The first gesture including the preset trigger gesture may then be understood as follows: if the first gesture is viewed as being decomposed into multiple segments, one of those segments matches the preset trigger gesture. For example, suppose the first gesture is decomposed into two segments, a former segment and a latter segment; if the former segment matches the preset trigger gesture, the first gesture may be considered to include the preset trigger gesture, and so on.
If the first gesture does not include a preset trigger gesture, the electronic device may perform other operations.
If the first gesture comprises a preset trigger gesture, then the process flow of 102 may be entered.
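To make this segment matching concrete, the following Kotlin sketch shows one way a continuous touch gesture could be checked for a long-press trigger segment. It is a minimal illustration assuming an Android-style MotionEvent stream; the class name TriggerDetector and both thresholds are illustrative assumptions, not part of the patent.

```kotlin
import android.view.MotionEvent
import kotlin.math.hypot

// Hypothetical detector: decides whether the first segment of a continuous
// gesture matches a "long press" trigger gesture. Once the trigger has
// matched, later movement (the second segment) does not un-match it.
class TriggerDetector(
    private val longPressMs: Long = 500L,  // example duration threshold (0.5 s)
    private val touchSlopPx: Float = 24f   // max drift still counted as pressing
) {
    private var downTime = 0L
    private var downX = 0f
    private var downY = 0f
    private var triggered = false

    /** Feed every touch event; returns true once the trigger segment has matched. */
    fun onTouchEvent(event: MotionEvent): Boolean {
        when (event.actionMasked) {
            MotionEvent.ACTION_DOWN -> {
                downTime = event.eventTime
                downX = event.x
                downY = event.y
                triggered = false
            }
            MotionEvent.ACTION_MOVE, MotionEvent.ACTION_UP -> {
                val stillPressing =
                    hypot(event.x - downX, event.y - downY) <= touchSlopPx
                if (!triggered && stillPressing &&
                    event.eventTime - downTime >= longPressMs
                ) {
                    triggered = true // first segment matched the preset trigger
                }
            }
        }
        return triggered
    }
}
```

On this reading, a gesture that moves away before the duration threshold elapses never matches, while a long press followed by a slide keeps its match for the rest of the gesture, consistent with the decomposition described above.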
102. If the first gesture comprises a preset trigger gesture, displaying at least one icon, wherein each icon is used for representing a processing operation on the first window mode.
For example, if the electronic device detects that the first gesture includes a preset trigger gesture, the electronic device may be triggered to display at least one icon on the touch display screen, where each icon is used to represent a processing operation performed on the first window mode.
103. And acquiring a gesture ending position of the first gesture.
For example, after displaying the at least one icon, the electronic device may acquire the gesture end position of the first gesture once it detects that the first gesture has ended.
After the gesture end position of the first gesture is acquired, the electronic device may detect whether the gesture end position of the first gesture is located at a display position of one icon.
The gesture end position of the first gesture being located at the display position of an icon may be understood as follows: if the display position of an icon is position A and the gesture end position of the first gesture is also position A, then the gesture end position of the first gesture is located at the display position of that icon. For example, when the first gesture is a touch operation on the touch display screen, the touch position at which the operation ends is the gesture end position of the first gesture: if the user's finger leaves the touch display screen after sliding to position A, the end position is position A, and if an icon is also displayed at position A, the gesture end position of the first gesture is located at the display position of that icon.
If the gesture end position of the first gesture is detected not to be at the display position of any one of the icons, the electronic device may perform other operations.
If the gesture ending position of the first gesture is detected to be located at the display position of one icon, the process flow of 104 is entered.
104. And if the gesture ending position of the first gesture is located at the display position of the icon, determining the processing operation represented by the corresponding icon as the target processing operation.
105. A target processing operation is performed.
For example, when the electronic device detects that the gesture end position of the first gesture is located at the display position of one icon, the electronic device may determine the processing operation represented by the corresponding icon as a target processing operation, and execute the target processing operation.
For example, when it is detected that the first gesture includes a preset trigger gesture, the electronic device displays three icons, namely a second icon, a third icon and a fourth icon, on the touch display screen, wherein the second icon is used for representing a second processing operation performed on the first window mode, the third icon is used for representing a third processing operation performed on the first window mode, and the fourth icon is used for representing a fourth processing operation performed on the first window mode. Then, the electronic device acquires the gesture end position of the first gesture, and detects that the gesture end position of the first gesture is located at the display position of the second icon, so that the electronic device may determine the second processing operation represented by the second icon as the target processing operation, and execute the second processing operation.
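As a sketch of steps 103 to 105, the fragment below hit-tests the gesture end position against the bounds of the displayed icons and picks the represented operation. The type names (WindowOp, IconSlot) and the executeWindowOp call mentioned in the comment are assumptions for illustration, not API from the patent.

```kotlin
import android.graphics.PointF
import android.graphics.RectF

// Hypothetical operations represented by the icons (second/third/fourth icon).
enum class WindowOp { ENTER_LARGER_WINDOW, ENTER_SMALLER_WINDOW, CLOSE_WINDOW }

// Each displayed icon pairs its on-screen bounds with the operation it represents.
data class IconSlot(val bounds: RectF, val op: WindowOp)

// Hit-test the gesture end position against the displayed icons; returns the
// target processing operation, or null if the gesture ended elsewhere.
fun resolveTargetOp(endPos: PointF, icons: List<IconSlot>): WindowOp? =
    icons.firstOrNull { it.bounds.contains(endPos.x, endPos.y) }?.op

// Usage sketch (on ACTION_UP):
//   resolveTargetOp(PointF(event.x, event.y), displayedIcons)
//       ?.let { executeWindowOp(it) }  // executeWindowOp: hypothetical dispatcher
```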
It can be understood that, in the embodiments of the present application, when information of an application is displayed in the first window mode and the electronic device receives a first gesture including the preset trigger gesture, the electronic device may display at least one icon, each representing a processing operation on the first window mode. The electronic device may then acquire the gesture end position of the first gesture and, when detecting that this end position is located at the display position of an icon, determine the processing operation represented by that icon as the target processing operation and execute it. Because a single gesture containing the trigger gesture is enough to apply the corresponding processing operation to the first window mode, the first window mode can be handled quickly, which improves the operability of the electronic device.
Referring to fig. 2, fig. 2 is another schematic flowchart of a device control method according to an embodiment of the present application. The flow may include:
201. Information of a first gesture is received while information of an application is displayed in a first window mode.
For example, when the electronic device displays information of an application in the first window mode, the electronic device may receive a gesture from the user; this gesture is referred to as the first gesture. That is, the electronic device may receive information of the first gesture.
It should be noted that a window mode may mean that the electronic device creates a window on the display screen and displays in it either information the user wants to display (e.g., the running interface of an application specified by the user) or information of a currently running application (e.g., the foreground application).
In the process of receiving the information of the first gesture, the electronic device may detect whether the first gesture includes a preset trigger gesture.
In an embodiment, the preset trigger gesture may be a pressing operation performed on the window corresponding to the first window mode, where the pressing duration of the pressing operation is greater than or equal to a preset duration threshold; that is, the preset trigger gesture may be a long press performed by the user on the window corresponding to the first window mode. For example, a press may be considered a long press when its duration is greater than or equal to 0.5 seconds or 1 second. Alternatively, the pressing pressure value of the pressing operation may be greater than or equal to a preset pressure threshold; that is, the preset trigger gesture may be a hard press performed by the user on the window corresponding to the first window mode. For example, a press may be considered a hard press when the pressure applied to the window is greater than or equal to 5 N or 4 N, and so on.
In one embodiment, the window corresponding to the first window mode may include a preset control, and the preset trigger gesture includes a pressing operation performed at the position of the preset control, for example a long press or a hard press at that position. As shown in fig. 3, the window in the figure is the window corresponding to the first window mode, and a preset control is disposed at the middle of the window's upper edge. The preset trigger gesture may be a long press on the preset control: when the user presses the preset control and the pressing duration is greater than or equal to the preset duration threshold, the electronic device may determine that the preset trigger gesture has been received. Alternatively, the preset trigger gesture may be a hard press on the preset control: when the user presses the preset control and the pressure applied to the touch display screen is greater than or equal to the preset pressure threshold, the electronic device may determine that the preset trigger gesture has been received, and so on.
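The two trigger variants (duration-based and pressure-based) can be summarized in a small classifier. The sketch below is an assumption-laden illustration: Android's MotionEvent reports a normalized pressure value rather than Newtons, so a normalized threshold stands in for the 4 to 5 N example in the text, and the function and threshold names are hypothetical.

```kotlin
import android.view.MotionEvent

enum class PressKind { NONE, LONG_PRESS, HARD_PRESS }

// Hypothetical classifier for a press on the preset control. downTime is the
// timestamp of the initial ACTION_DOWN; thresholds are illustrative.
fun classifyPress(
    event: MotionEvent,
    downTime: Long,
    longPressMs: Long = 500L,        // e.g. 0.5 s, per the example above
    pressureThreshold: Float = 0.8f  // normalized stand-in for the 4-5 N example
): PressKind = when {
    event.pressure >= pressureThreshold -> PressKind.HARD_PRESS
    event.eventTime - downTime >= longPressMs -> PressKind.LONG_PRESS
    else -> PressKind.NONE
}
```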
It should be noted that, in this embodiment, the first gesture is a complete, continuous gesture. Taking the first gesture as a touch operation on the touch display screen as an example, the first gesture being complete and continuous means that the user's finger remains in contact with the touch display screen for the entire time the first gesture is being made, without leaving the screen.
The first gesture including the preset trigger gesture may then be understood as follows: if the first gesture is viewed as being decomposed into multiple segments, one of those segments matches the preset trigger gesture.
For example, in one embodiment, the first gesture is decomposed into a first segment and a second segment, where the first segment occurs before the second segment and the two segments have a continuous gesture trajectory.
In that case, the first gesture including the preset trigger gesture may mean that the first segment matches the preset trigger gesture.
If the first gesture does not include a preset trigger gesture, the electronic device may perform other operations.
If the first gesture includes the preset trigger gesture, the process flow of 202 may be entered.
202. If the first gesture comprises a preset trigger gesture, displaying a second icon, a third icon and a fourth icon, wherein the processing operation represented by the second icon comprises switching to a second window mode, and the window area of the second window mode is larger than that of the first window mode; the processing operation represented by the third icon includes switching to a third window mode, the window area of the third window mode being smaller than the window area of the first window mode; the processing operation represented by the fourth icon includes closing the first window mode.
For example, if the electronic device detects that the first gesture includes a preset trigger gesture, the electronic device may be triggered to display at least one icon on the touch display screen, where each icon is used to represent a processing operation performed on the first window mode.
For example, in this embodiment, the electronic device may display the second icon, the third icon, and the fourth icon. The processing operation represented by the second icon may be switching to a second window mode, where the window area of the second window mode is larger than the window area of the first window mode. That is, switching from the first window mode to the second window mode means switching from the current window to a larger window. The processing operation represented by the third icon may be switching to a third window mode having a window area smaller than that of the first window mode. That is, switching from the first window mode to the third window mode means switching from the current window to a smaller window. The processing operation represented by the fourth icon may be to close the first window mode.
For example, as shown in fig. 4, when information of an application is displayed in the first window mode, the first gesture received by the electronic device includes a preset trigger gesture, and at this time, the electronic device may display a second icon R, a third icon S, and a fourth icon T on the display screen. The processing operation represented by the second icon R may be switching to a second window mode, where the window area of the second window mode is larger than the window area of the first window mode. The processing operation represented by the third icon S may be switching to a third window mode having a window area smaller than that of the first window mode. The processing operation represented by the fourth icon T may be to close the first window mode.
In this embodiment, the first window mode may include displaying a running interface of an application in a window, the second window mode may include displaying the running interface of the application full screen, and the third window mode may include displaying customized information of the application in the window.
In one embodiment, the customized information of the application displayed in the third window mode may be the latest notification information of the application or other information. For example, if the application is an instant messaging application, the latest notification information of the instant messaging application may be displayed in the third window mode. Alternatively, if the application is a map navigation application, the user's current position information may be displayed in the third window mode. That is, which specific information of the application is displayed as the customized information may be determined according to the type of the application or the needs of the user, and is not specifically limited in the embodiments of the present application. A minimal sketch of such a mapping is shown below.
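The following Kotlin fragment illustrates the idea under stated assumptions: the app categories and the stub functions standing in for real data sources are hypothetical.

```kotlin
enum class AppCategory { INSTANT_MESSAGING, MAP_NAVIGATION, OTHER }

// Stubs standing in for real data sources (hypothetical helpers).
fun latestNotificationText(): String = "latest message"
fun currentLocationText(): String = "current location"
fun defaultSummaryText(): String = "app summary"

// Which customized information the third window mode shows per app type.
fun customizedInfo(category: AppCategory): String = when (category) {
    AppCategory.INSTANT_MESSAGING -> latestNotificationText()
    AppCategory.MAP_NAVIGATION -> currentLocationText()
    AppCategory.OTHER -> defaultSummaryText()
}
```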
203. Before the first gesture is finished, the window in the window mode moves along the track of the first gesture.
For example, in this embodiment, before the first gesture ends, the window in the window mode may move along the gesture track of the first gesture.
For example, as shown in fig. 5, the electronic device displays information of the ride-hailing application Y in the first window mode. The window corresponding to the first window mode includes a preset control located at position B. When the user performs a long-press operation on the preset control, the electronic device determines that the preset trigger gesture has been received and displays the second icon R, the third icon S, and the fourth icon T on the display screen. The user's finger then maintains contact with the touch display screen and slides from position B to position C, along a trajectory such as the curve between B and C in fig. 5. While the finger slides from B to C, the electronic device may control the window to move from position B to position C synchronously, following the gesture trajectory. The finger then remains on the touch display screen and slides from position C to position D, along a trajectory such as the curve between C and D in fig. 5. While the finger slides from C to D, the electronic device may control the window to move from position C to position D synchronously, following the gesture trajectory. A sketch of this window-following behavior is given below.
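The fragment below translates the window's root view by the finger's movement delta on each move event. It assumes the window is backed by an Android View; WindowFollower is an illustrative name, not part of the patent.

```kotlin
import android.view.MotionEvent
import android.view.View

// Hypothetical helper: keeps the small window under the finger while the
// first gesture is still in progress.
class WindowFollower(private val windowView: View) {
    private var lastX = 0f
    private var lastY = 0f

    fun onTouchEvent(event: MotionEvent) {
        when (event.actionMasked) {
            MotionEvent.ACTION_DOWN -> {
                lastX = event.rawX
                lastY = event.rawY
            }
            MotionEvent.ACTION_MOVE -> {
                // Move the window by the finger's delta so it tracks the
                // gesture trajectory (e.g. B -> C -> D in fig. 5) synchronously.
                windowView.translationX += event.rawX - lastX
                windowView.translationY += event.rawY - lastY
                lastX = event.rawX
                lastY = event.rawY
            }
        }
    }
}
```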
204. And acquiring a gesture ending position of the first gesture.
205. And if the gesture ending position of the first gesture is detected to be located at the display position of the icon, determining the processing operation represented by the corresponding icon as the target processing operation.
206. A target processing operation is performed.
Steps 204, 205, and 206 may include, for example, the following.
for example, after the second icon, the third icon, and the fourth icon are displayed, in a case that the finger of the user does not leave the touch display screen, the user may continue to perform the touch gesture operation on the touch display screen, that is, the electronic device may continue to receive the information of the first gesture from the user.
While continuing to receive information of the first gesture, the electronic device may detect whether the first gesture has ended.
In addition, in the case that the first gesture is a touch operation performed on the touch display screen, the end of the first gesture may refer to the event in which the electronic device detects that the user's finger has left the touch display screen. For example, as shown in fig. 5, the user's finger slides on the touch display screen from position B, passes through position C, and leaves the screen upon sliding to position D; the electronic device then detects that the first gesture has ended when it detects the finger leaving the screen at position D.
If the first gesture is detected not to be finished, the electronic device can control the window in the window mode to synchronously move along with the continuous gesture track.
If the first gesture is detected to be ended, the electronic device may acquire a gesture ending position of the first gesture, and detect whether the gesture ending position of the first gesture is located at a display position of an icon.
The gesture end position of the first gesture being located at the display position of an icon may be understood as follows: if the display position of an icon is position A and the gesture end position of the first gesture is also position A, then the gesture end position of the first gesture is located at the display position of that icon. For example, when the first gesture is a touch operation on the touch display screen, the touch position at which the operation ends is the gesture end position of the first gesture: if the user's finger leaves the touch display screen after sliding to position A, the end position is position A, and if an icon is also displayed at position A, the gesture end position of the first gesture is located at the display position of that icon.
When the first gesture is decomposed into the first segment of the gesture and the second segment of the gesture, the gesture end position of the first gesture is also the gesture end position of the second segment of the gesture.
If the gesture end position of the first gesture is detected not to be at the display position of any one of the icons, the electronic device may perform other operations.
If the gesture end position of the first gesture is detected to be located at the display position of one icon, the electronic device can determine the processing operation represented by the corresponding icon as the target processing operation and execute the target processing operation.
For example, as shown in fig. 5, after the user's finger slides to position D and leaves the touch display screen, the electronic device may detect that the first gesture has ended, with gesture end position D. Since position D is located at the display position of the third icon S, the electronic device may determine the processing operation represented by the third icon S as the target processing operation; that is, the electronic device may determine switching to the third window mode as the target processing operation and perform the switch. Because the window area of the third window mode is smaller than that of the first window mode, the electronic device then displays the information of the ride-hailing application Y in a smaller window. For example, as shown in fig. 6, switching to the third window mode causes the electronic device to display the latest notification information of the ride-hailing application Y in a smaller window at the upper right corner of the display screen.
In an implementation manner, the embodiment of the present application may further include the following process:
and before the first gesture is finished, if the gesture pause position is positioned at the display position of the icon, displaying a preview effect of the processing operation represented by the corresponding icon.
For example, as shown in fig. 5, the user's finger slides from position B to position C and then to position D. Suppose the finger reaches position D at time t1 but does not immediately leave the touch display screen. The finger is then paused at position D, so the gesture pause position of the first gesture is D. Since position D is located at the display position of the third icon S, the electronic device may display a preview effect of the window after switching to the third window mode; for example, it may preview the window's area after the switch, as shown in fig. 7. One way to detect such a pause is sketched below.
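The sketch below, under the same assumptions as the earlier fragments (reusing the hypothetical IconSlot and resolveTargetOp from the hit-test sketch), posts a delayed callback that fires only if no further move event resets it before the dwell time elapses. In practice, small finger jitter would be filtered with a touch-slop radius before restarting the timer.

```kotlin
import android.graphics.PointF
import android.os.Handler
import android.os.Looper

// Hypothetical dwell detector: if the gesture pauses over an icon for
// pauseMs before ending, invoke onPreview with that icon's operation.
class PausePreview(
    private val pauseMs: Long = 300L,          // illustrative dwell threshold
    private val onPreview: (WindowOp) -> Unit  // e.g. show the resized-window preview
) {
    private val handler = Handler(Looper.getMainLooper())
    private var pending: Runnable? = null

    fun onMoveOver(pos: PointF, icons: List<IconSlot>) {
        pending?.let(handler::removeCallbacks)          // finger moved: restart timer
        val op = resolveTargetOp(pos, icons) ?: return  // not over any icon
        pending = Runnable { onPreview(op) }.also {
            handler.postDelayed(it, pauseMs)            // fires only if the pause lasts
        }
    }

    fun cancel() {
        pending?.let(handler::removeCallbacks)          // gesture ended or left the icon
    }
}
```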
Referring to fig. 8 to fig. 13, these figures are schematic diagrams of scenarios of the device control method according to an embodiment of the present application.
For example, the electronic device currently displays the running interface of the ride-hailing application Y in the first window mode, which in this embodiment may be called the small-window mode. That is, as shown in fig. 3, the electronic device displays the running interface of the ride-hailing application Y in the small-window mode, with a preset control disposed at the middle of the small window's upper edge. When the user long-presses the preset control with a finger, the electronic device may determine that the preset trigger gesture has been received.
After receiving the preset trigger gesture, the electronic device may display three icons above the small window; for example, as shown in fig. 4, the three displayed icons are icon R, icon S, and icon T. The processing operation represented by icon R may be switching to the running interface of the application displayed in full screen (full-screen mode may be regarded as a special window mode). The processing operation represented by icon S may be switching to the flash window mode, whose window area is smaller than that of the small-window mode. The processing operation represented by icon T may be closing the small-window mode.
After displaying icons R, S, and T, the electronic device may continue to receive the gesture from the user. For example, after long-pressing position B where the preset control is located, the user's finger does not leave the touch display screen but continues to slide to position C, as shown in fig. 8. After sliding to position C, the finger leaves the touch display screen. The electronic device detects that the sliding trajectory from position B to position C is continuous and that position C is located at the display position of icon R. In this case, the electronic device may switch to displaying the running interface of the ride-hailing application Y in full screen, as shown in fig. 9.
As shown in fig. 8, the first gesture is a continuous gesture whose sliding trajectory starts at position B and ends at position C. In this embodiment, the first gesture can be decomposed into a first segment and a second segment: the first segment may be the long-press operation performed at position B, and the second segment may be the slide from position B to position C. As shown in fig. 8, the two segments have a continuous sliding trajectory; that is, the user's finger never leaves the touch display screen while making the first and second segments.
As another example, after displaying icons R, S, and T, the electronic device may continue to receive the gesture from the user. For example, after long-pressing position B where the preset control is located, the user's finger does not leave the touch display screen but continues to slide to position E, along a trajectory such as that shown in fig. 10. After sliding to position E, the finger leaves the touch display screen. The electronic device detects that the sliding trajectory from position B to position E is continuous and that position E is located at the display position of icon S. In this case, the electronic device may switch to displaying the latest notification information of the ride-hailing application Y in the flash window mode, as shown in fig. 6.
As shown in fig. 10, the first gesture is a continuous gesture whose sliding trajectory starts at position B and ends at position E. In this embodiment, the first gesture can be decomposed into a first segment and a second segment: the first segment may be the long-press operation performed at position B, and the second segment may be the slide from position B to position E. As shown in fig. 10, the two segments have a continuous sliding trajectory; that is, the user's finger never leaves the touch display screen while making the first and second segments.
As another example, after displaying icons R, S, and T, the electronic device may continue to receive the gesture from the user. For example, after long-pressing position B where the preset control is located, the user's finger does not leave the touch display screen but continues to slide to position F, along the sliding trajectory shown in fig. 11. After sliding to position F, the finger leaves the touch display screen. The electronic device detects that the sliding trajectory from position B to position F is continuous and that position F is located at the display position of icon T. In this case, the electronic device may close the small-window mode and return to the desktop, as shown in fig. 12.
As shown in fig. 11, the first gesture is a continuous gesture whose sliding trajectory starts at position B and ends at position F. In this embodiment, the first gesture can be decomposed into a first segment and a second segment: the first segment may be the long-press operation performed at position B, and the second segment may be the slide from position B to position F. As shown in fig. 11, the two segments have a continuous sliding trajectory; that is, the user's finger never leaves the touch display screen while making the first and second segments.
It can be understood that, in the embodiments of the present application, as shown in fig. 8, fig. 10, and fig. 11, the whole process of the electronic device entering the full-screen mode from the small-window mode (the gesture starting at position B and ending at position C), entering the flash window mode from the small-window mode (the gesture starting at position B and ending at position E), or closing the small-window mode (the gesture starting at position B and ending at position F) is completed in one step. That is, only one continuous operation is required to switch to another window mode or to close the window, so the operation efficiency is high and the user experience is good.
In another embodiment, the position of each icon displayed on the display screen may be adjusted by the user. For example, by default the icons are displayed in the order R, S, T from left to right, and after adjustment they may be displayed as R, T, S from left to right, and so on. The electronic device may also learn the user's habits with a machine learning method and adjust the display positions of the icons accordingly.
In another implementation, the embodiment of the present application may further include the following process:
when the information of the application is displayed in the first window mode, receiving information of a second gesture, wherein the second gesture comprises moving a selected window to the edge of the display screen after the window corresponding to the first window mode is selected;
and hiding the window corresponding to the first window mode at the edge of the display screen according to the information of the second gesture, and representing the position of the hidden window in an icon mode.
For example, when information of the application is displayed in the first window mode, the user may drag the selected window (e.g., by pressing and holding the edge of the small window) to the edge of the screen to collapse the window there, with the position of the collapsed (i.e., hidden) window indicated by an icon (e.g., a circular icon); a schematic diagram of the entire process is shown in fig. 13. When the user taps the circular icon, the information of the application is restored to display in the first window mode.
Alternatively, when information of the application is displayed in the first window mode, the user may drag the selected window (e.g., by holding the edge of the small window) to the edge of the screen and keep it there for a certain period of time (e.g., 0.5 second or 1 second) to collapse the window at the screen edge, with the position of the collapsed window indicated by an icon (e.g., a circular icon). When the user taps the circular icon, the information of the application is restored to display in the first window mode. One possible edge check is sketched below.
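A minimal check for the drag-to-edge collapse described above, assuming pixel coordinates with the origin at the top-left; the function name and margin value are illustrative assumptions.

```kotlin
import android.graphics.PointF

// Hypothetical edge test: when a dragged window is released at (or dwells
// near) a screen edge, collapse it and leave a circular icon marking where
// the hidden window can be restored.
fun shouldCollapseAtEdge(
    dropPos: PointF,
    screenWidth: Int,
    screenHeight: Int,
    edgeMarginPx: Float = 32f  // how close to an edge counts as "at the edge"
): Boolean =
    dropPos.x <= edgeMarginPx || dropPos.x >= screenWidth - edgeMarginPx ||
    dropPos.y <= edgeMarginPx || dropPos.y >= screenHeight - edgeMarginPx
```

For the dwell variant, this test would be combined with a delayed callback along the lines of the PausePreview sketch above.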
Referring to fig. 14, fig. 14 is a schematic structural diagram of a device control apparatus according to an embodiment of the present application. The device control apparatus 300 may include: a first receiving module 301, a display module 302, a second receiving module 303, a determining module 304, and an execution module 305.
The first receiving module 301 is configured to receive information of a first gesture when information of an application is displayed in a first window mode.
A display module 302, configured to display at least one icon if the first gesture includes a preset trigger gesture, where each icon is used to represent a processing operation performed on the first window mode.
The second receiving module 303 is configured to obtain a gesture ending position of the first gesture.
A determining module 304, configured to determine, if the gesture ending position of the first gesture is located at the display position of the icon, the processing operation represented by the corresponding icon as the target processing operation.
An execution module 305, configured to execute the target processing operation.
In one embodiment, the first gesture includes a first segment gesture and a second segment gesture, the first segment gesture occurs before the second segment gesture, and the first segment gesture and the second segment gesture have a continuous gesture trajectory.
The first gesture includes a preset trigger gesture, including: the first segment of gesture is matched with the preset triggering gesture.
In an embodiment, the preset trigger gesture includes a pressing operation performed on a window corresponding to the first window mode, where a pressing duration of the pressing operation is greater than or equal to a preset duration threshold, or a pressing pressure value of the pressing operation is greater than or equal to a preset pressure threshold.
In an embodiment, the window corresponding to the first window mode includes a preset control. The preset triggering gesture comprises pressing operation at the position where the preset control is located.
In one embodiment, the displayed icons include a second icon representing a processing operation including switching to a second window mode having a window area greater than the window area of the first window mode.
In one embodiment, the displayed icons further include a third icon representing a processing operation including switching to a third window mode having a window area smaller than the window area of the first window mode.
In one embodiment, the displayed icons further include a fourth icon representing a processing operation including closing the first window mode.
In one embodiment, the first window mode includes displaying a running interface of the application in a window, the second window mode includes displaying the running interface of the application full screen, and the third window mode includes displaying customized information of the application in a window.
In one embodiment, the execution module 305 may be further configured to: and before the first gesture is finished, if the gesture pause position is positioned at the display position of the icon, displaying a preview effect of the processing operation represented by the corresponding icon.
In one embodiment, the execution module 305 may be further configured to: and returning to the desktop after closing the first window mode.
In one embodiment, the execution module 305 may be further configured to: and before the first gesture is finished, the window in the window mode moves along the track of the first gesture.
In one embodiment, the execution module 305 may be further configured to: when the information of the application is displayed in the first window mode, receiving information of a second gesture, wherein the second gesture comprises selecting a window corresponding to the first window mode and moving the selected window to the edge of the display screen; and hiding the window corresponding to the first window mode at the edge of the display screen according to the information of the second gesture, and representing the position of the hidden window in an icon mode.
An embodiment of the present application provides a computer-readable storage medium on which a computer program is stored; when the computer program is executed on a computer, the computer is caused to perform the flow of the device control method provided in this embodiment.
The embodiment of the present application further provides an electronic device, which includes a memory and a processor, where the processor is configured to execute the procedure in the device control method provided in this embodiment by calling a computer program stored in the memory.
For example, the electronic device may be a mobile terminal such as a tablet computer or a smart phone. Referring to fig. 15, fig. 15 is a schematic structural diagram of an electronic device according to an embodiment of the disclosure.
The electronic device 400 may include a touch display 401, memory 402, a processor 403, and the like. Those skilled in the art will appreciate that the electronic device configuration shown in fig. 15 does not constitute a limitation of the electronic device and may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components.
The touch display screen 401 may be used to display information such as characters, images, and the like, and may also be used to receive a touch operation by a user, and the like.
The memory 402 may be used to store applications and data. The memory 402 stores applications containing executable code. The application programs may constitute various functional modules. The processor 403 executes various functional applications and data processing by running an application program stored in the memory 402.
The processor 403 is a control center of the electronic device, connects various parts of the entire electronic device using various interfaces and lines, performs various functions of the electronic device and processes data by running or executing an application program stored in the memory 402 and calling data stored in the memory 402, thereby integrally monitoring the electronic device.
In this embodiment, the processor 403 in the electronic device loads the executable code corresponding to the processes of one or more application programs into the memory 402 according to the following instructions, and the processor 403 runs the application programs stored in the memory 402, so as to execute:
receiving information of a first gesture when information of an application is displayed in a first window mode;
if the first gesture comprises a preset trigger gesture, displaying at least one icon, wherein each icon is used for representing a processing operation on the first window mode;
acquiring a gesture ending position of the first gesture;
if the gesture ending position of the first gesture is located at the display position of the icon, determining the processing operation represented by the corresponding icon as a target processing operation;
and executing the target processing operation.
Referring to fig. 16, the electronic device 400 may include a touch display 401, a memory 402, a processor 403, a battery 404, a microphone 405, a speaker 406, and other components.
The touch display screen 401 may be used to display information such as characters, images, and the like, and may also be used to receive a touch operation by a user, and the like.
The memory 402 may be used to store applications and data. The memory 402 stores applications containing executable code. The application programs may constitute various functional modules. The processor 403 executes various functional applications and data processing by running an application program stored in the memory 402.
The processor 403 is a control center of the electronic device, connects various parts of the whole electronic device by using various interfaces and lines, and performs various functions of the electronic device and processes data by running or executing an application program stored in the memory 402 and calling data stored in the memory 402, thereby performing overall monitoring of the electronic device.
The battery 404 may be used to provide power support for the various components and modules of the electronic device, thereby ensuring proper operation of the various components and modules.
Microphone 405 may be used to capture acoustic signals in the surrounding environment, such as capturing a user's voice.
The speaker 406 may be used to play sound signals.
In this embodiment, the processor 403 in the electronic device loads the executable code corresponding to the processes of one or more application programs into the memory 402 according to the following instructions, and the processor 403 runs the application programs stored in the memory 402, so as to execute:
receiving information of a first gesture when information of an application is displayed in a first window mode;
if the first gesture comprises a preset trigger gesture, displaying at least one icon, wherein each icon is used for representing a processing operation on the first window mode;
acquiring a gesture ending position of the first gesture;
if the gesture ending position of the first gesture is located at the display position of the icon, determining the processing operation represented by the corresponding icon as a target processing operation;
and executing the target processing operation.
In one embodiment, the first gesture includes a first segment gesture and a second segment gesture, the first segment gesture occurs before the second segment gesture, and the first segment gesture and the second segment gesture have a continuous gesture trajectory.
Then, the first gesture includes a preset trigger gesture, including: the first segment of gesture is matched with the preset triggering gesture.
In an embodiment, the preset trigger gesture includes a pressing operation performed on a window corresponding to the first window mode, where a pressing duration of the pressing operation is greater than or equal to a preset duration threshold, or a pressing pressure value of the pressing operation is greater than or equal to a preset pressure threshold.
In one embodiment, the window corresponding to the first window mode includes a preset control; the preset triggering gesture comprises pressing operation at the position where the preset control is located.
In one embodiment, the displayed icons include a second icon representing a processing operation including switching to a second window mode having a window area greater than the window area of the first window mode.
In one embodiment, the displayed icons further include a third icon representing a processing operation including switching to a third window mode having a window area smaller than the window area of the first window mode.
In one embodiment, the displayed icons further include a fourth icon representing a processing operation including closing the first window mode.
In one embodiment, the first window mode includes displaying a running interface of the application in a window, the second window mode includes displaying the running interface of the application full screen, and the third window mode includes displaying customized information of the application in a window.
In one embodiment, the processor 403 may further perform: and before the first gesture is finished, if the gesture pause position is positioned at the display position of the icon, displaying a preview effect of the processing operation represented by the corresponding icon.
In one embodiment, the processor 403 may further perform: and returning to the desktop after closing the first window mode.
In one embodiment, the processor 403 may further perform: and before the first gesture is finished, the window in the window mode moves along the track of the first gesture.
In one embodiment, the processor 403 may further perform: when the information of the application is displayed in the first window mode, receiving information of a second gesture, where the second gesture includes selecting the window corresponding to the first window mode and moving the selected window to an edge of the display screen; and, according to the information of the second gesture, hiding the window corresponding to the first window mode at the edge of the display screen and indicating the position of the hidden window with an icon.
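As a non-limiting illustration, the edge-hiding behavior of the second gesture can be sketched as follows; the types and the snap-distance rule are assumptions, since the embodiment does not fix how "moved to the edge" is detected.

    // Hypothetical sketch: if the second gesture releases the window near the
    // left or right screen edge, hide it there and mark its position with an icon.
    data class DraggedWindow(val centerX: Float, val centerY: Float, val width: Float)

    data class HiddenWindowIcon(val edgeX: Float, val edgeY: Float)

    fun hideAtEdgeIfDropped(
        window: DraggedWindow,
        screenWidth: Float,
        snapDistance: Float = 48f  // assumed snap threshold in pixels
    ): HiddenWindowIcon? {
        val nearLeft = window.centerX - window.width / 2 <= snapDistance
        val nearRight = window.centerX + window.width / 2 >= screenWidth - snapDistance
        return when {
            nearLeft -> HiddenWindowIcon(edgeX = 0f, edgeY = window.centerY)
            nearRight -> HiddenWindowIcon(edgeX = screenWidth, edgeY = window.centerY)
            else -> null  // not close enough to an edge; the window stays visible
        }
    }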
The foregoing embodiments each emphasize different aspects; for details not described in a particular embodiment, refer to the detailed description of the device control method above, which is not repeated here.
The device control apparatus provided in the embodiments of the present application and the device control method in the above embodiments belong to the same concept. Any method provided in the device control method embodiments may run on the device control apparatus; its specific implementation is described in detail in those embodiments and is not repeated here.
It should be noted that, as those skilled in the art will understand, all or part of the processes of the device control method described in the embodiments of the present application may be implemented by a computer program controlling the relevant hardware. The computer program may be stored in a computer-readable storage medium, such as a memory, and executed by at least one processor; its execution may include the processes of the device control method embodiments. The storage medium may be a magnetic disk, an optical disc, a read-only memory (ROM), a random access memory (RAM), or the like.
In the device control apparatus according to the embodiments of the present application, the functional modules may be integrated into one processing chip, each module may exist alone physically, or two or more modules may be integrated into one module. The integrated module may be implemented in hardware or as a software functional module. If implemented as a software functional module and sold or used as a stand-alone product, the integrated module may also be stored in a computer-readable storage medium, such as a read-only memory, a magnetic disk, or an optical disc.
The device control method, apparatus, storage medium, and electronic device provided in the embodiments of the present application have been described in detail above. Specific examples are used herein to explain the principles and implementations of the present application, and the descriptions of the above embodiments are intended only to help understand the method and its core ideas. Meanwhile, those skilled in the art may, following the ideas of the present application, vary the specific embodiments and the scope of application. In summary, the content of this specification should not be construed as limiting the present application.

Claims (12)

1. A device control method, characterized in that the method comprises:
receiving information of a first gesture when information of an application is displayed in a first window mode;
if the first gesture comprises a preset trigger gesture, displaying at least one icon, wherein each icon is used for representing a processing operation on the first window mode, the at least one icon comprises a second icon and a third icon, the processing operation represented by the second icon comprises switching to a second window mode, the window area of the second window mode is larger than that of the first window mode, the processing operation represented by the third icon comprises switching to a third window mode, and the window area of the third window mode is smaller than that of the first window mode;
before the first gesture ends, moving the window corresponding to the first window mode along the trajectory of the first gesture;
and if the gesture ending position of the first gesture is located at the display position of any icon, executing the processing operation represented by the icon.
2. The device control method according to claim 1, wherein the first gesture comprises a first segment gesture and a second segment gesture, the first segment gesture occurs before the second segment gesture, and the first segment gesture and the second segment gesture have a continuous gesture trajectory;
and the first gesture is determined to comprise the preset trigger gesture when the first segment gesture matches the preset trigger gesture.
3. The device control method according to claim 1, wherein the preset trigger gesture includes a pressing operation performed on a window corresponding to the first window mode, and a pressing duration of the pressing operation is greater than or equal to a preset duration threshold, or a pressing pressure value of the pressing operation is greater than or equal to a preset pressure threshold.
4. The device control method according to claim 3, wherein the window corresponding to the first window mode includes a preset control;
the preset triggering gesture comprises pressing operation at the position where the preset control is located.
5. The device control method according to claim 1, wherein the displayed icons further include a fourth icon representing a processing operation including closing the first window mode.
6. The device control method according to claim 1, wherein the first window mode includes displaying a running interface of the application in a window, the second window mode includes displaying the running interface of the application full screen, and the third window mode includes displaying customized information of the application in a window.
7. The device control method according to claim 1, characterized in that the method further comprises:
before the first gesture ends, if the gesture pause position is located at the display position of an icon, displaying a preview effect of the processing operation represented by that icon.
8. The device control method according to claim 5, characterized in that the method further comprises:
and returning to the desktop after closing the first window mode.
9. The device control method according to claim 1, characterized in that the method further comprises:
when information of an application is displayed in a first window mode, receiving information of a second gesture, wherein the second gesture comprises selecting a window corresponding to the first window mode and moving the selected window to the edge of a display screen;
and hiding the window corresponding to the first window mode at the edge of the display screen and indicating the position of the hidden window with an icon.
10. A device control apparatus, characterized in that the apparatus comprises:
a first receiving module, configured to receive information of a first gesture when information of an application is displayed in a first window mode;
a display module, configured to display at least one icon if the first gesture includes a preset trigger gesture, where each icon is used to represent a processing operation performed on the first window mode, the at least one icon includes a second icon and a third icon, the processing operation represented by the second icon includes switching to a second window mode, a window area of the second window mode is larger than a window area of the first window mode, the processing operation represented by the third icon includes switching to a third window mode, and the window area of the third window mode is smaller than the window area of the first window mode;
and an execution module, configured to move the window corresponding to the first window mode along the trajectory of the first gesture before the first gesture ends, and to execute the processing operation represented by an icon if the gesture ending position of the first gesture is located at the display position of that icon.
11. A computer-readable storage medium, on which a computer program is stored, which, when executed on a computer, causes the computer to carry out the method according to any one of claims 1 to 9.
12. An electronic device comprising a memory and a processor, wherein the processor executes the method of any one of claims 1 to 9 by invoking a computer program stored in the memory.
CN202210316421.XA 2020-07-09 2020-07-09 Device control method, device, storage medium and electronic device Pending CN114661219A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210316421.XA CN114661219A (en) 2020-07-09 2020-07-09 Device control method, device, storage medium and electronic device

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202010659201.8A CN111831205B (en) 2020-07-09 2020-07-09 Device control method, device, storage medium and electronic device
CN202210316421.XA CN114661219A (en) 2020-07-09 2020-07-09 Device control method, device, storage medium and electronic device

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
CN202010659201.8A Division CN111831205B (en) 2020-07-09 2020-07-09 Device control method, device, storage medium and electronic device

Publications (1)

Publication Number Publication Date
CN114661219A (en) 2022-06-24

Family

ID=72900494

Family Applications (2)

Application Number Title Priority Date Filing Date
CN202010659201.8A Active CN111831205B (en) 2020-07-09 2020-07-09 Device control method, device, storage medium and electronic device
CN202210316421.XA Pending CN114661219A (en) 2020-07-09 2020-07-09 Device control method, device, storage medium and electronic device

Family Applications Before (1)

Application Number Title Priority Date Filing Date
CN202010659201.8A Active CN111831205B (en) 2020-07-09 2020-07-09 Device control method, device, storage medium and electronic device

Country Status (2)

Country Link
CN (2) CN111831205B (en)
WO (1) WO2022007541A1 (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115357177A (en) * 2020-07-09 2022-11-18 Oppo广东移动通信有限公司 Device control method, device, storage medium and electronic device
CN111831205B (en) * 2020-07-09 2022-04-19 Oppo广东移动通信有限公司 Device control method, device, storage medium and electronic device
CN112578958B (en) * 2020-12-16 2022-05-17 珠海格力电器股份有限公司 Control method, control device, terminal equipment and storage medium
CN114764300B (en) * 2020-12-30 2024-05-03 华为技术有限公司 Window page interaction method and device, electronic equipment and readable storage medium

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103106011A (en) * 2011-11-09 2013-05-15 三星电子株式会社 Visual presentation method and apparatus for application in mobile terminal
CN104503689A (en) * 2014-11-21 2015-04-08 小米科技有限责任公司 Method and device for displaying application interface
EP3048519A1 (en) * 2015-01-26 2016-07-27 Samsung Electronics Co., Ltd. Electronic device and method for displaying object in electronic device
WO2018082269A1 (en) * 2016-11-04 2018-05-11 华为技术有限公司 Menu display method and terminal
CN108845854A (en) * 2018-06-08 2018-11-20 Oppo广东移动通信有限公司 Method for displaying user interface, device, terminal and storage medium
CN110489043A (en) * 2019-07-31 2019-11-22 华为技术有限公司 A kind of management method and relevant apparatus of suspension windows
US20190369842A1 (en) * 2014-09-16 2019-12-05 Amazon Technologies, Inc. Contextual launch interfaces

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103559033A (en) * 2013-10-30 2014-02-05 上海天奕达电子科技有限公司 Method and device for realizing multi-window display of intelligent terminal
CN103793176B (en) * 2014-02-27 2018-03-06 朱印 A kind of method and device being switched fast between application program
US11567626B2 (en) * 2014-12-17 2023-01-31 Datalogic Usa, Inc. Gesture configurable floating soft trigger for touch displays on data-capture electronic devices
CN105988668A (en) * 2015-02-27 2016-10-05 阿里巴巴集团控股有限公司 Menu selection method and apparatus
US10579213B2 (en) * 2015-07-20 2020-03-03 Facebook, Inc. Gravity composer
CN112596664A (en) * 2016-08-19 2021-04-02 创新先进技术有限公司 Data communication processing method and device, electronic equipment and touch display equipment
CN106648324B (en) * 2016-12-28 2020-11-10 合肥恒研智能科技有限公司 Hidden icon control method and device and terminal
CN107894910A (en) * 2017-10-31 2018-04-10 维沃移动通信有限公司 The operation method and device of multiple applications
CN111831205B (en) * 2020-07-09 2022-04-19 Oppo广东移动通信有限公司 Device control method, device, storage medium and electronic device
CN112181582A (en) * 2020-11-02 2021-01-05 百度时代网络技术(北京)有限公司 Method, apparatus, device and storage medium for device control

Also Published As

Publication number Publication date
WO2022007541A1 (en) 2022-01-13
CN111831205B (en) 2022-04-19
CN111831205A (en) 2020-10-27

Similar Documents

Publication Publication Date Title
CN111831205B (en) Device control method, device, storage medium and electronic device
CN109062479B (en) Split screen application switching method and device, storage medium and electronic equipment
US10990278B2 (en) Method and device for controlling information flow display panel, terminal apparatus, and storage medium
WO2017118329A1 (en) Method and apparatus for controlling tab bar
US20190302984A1 (en) Method and device for controlling a flexible display device
EP2825950B1 (en) Touch screen hover input handling
US8581864B2 (en) Information processing device, operation input method and operation input program
CN103530047B (en) Touch screen equipment event triggering method and device
CN109491562B (en) Interface display method of voice assistant application program and terminal equipment
CN109976655B (en) Long screen capturing method, device, terminal and storage medium
CN111782332A (en) Application interface switching method and device, terminal and storage medium
US11740754B2 (en) Method for interface operation and terminal, storage medium thereof
US9851802B2 (en) Method and apparatus for controlling content playback
KR20130097331A (en) Apparatus and method for selecting object in device with touch screen
US20230139376A1 (en) Method for device control, electronic device, and storage medium
US20230091771A1 (en) Device Control Method, Storage Medium, and Non-Transitory Computer-Readable Electronic Device
CN108984089B (en) Touch operation method and device, storage medium and electronic equipment
CN109766054A (en) A kind of touch-screen equipment and its control method, medium
WO2016173307A1 (en) Message copying method and device, and smart terminal
CN106843559B (en) User instruction identification and processing method and device
US20170168686A1 (en) Method and electronic device for processing list item operation
CN114415886A (en) Application icon management method and electronic equipment
CN107544740B (en) Application processing method and device, storage medium and electronic equipment
CN106020694B (en) Electronic equipment, and method and device for dynamically adjusting selected area
CN113849082B (en) Touch processing method and device, storage medium and mobile terminal

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination