WO2022007541A1 - 设备控制方法、装置、存储介质及电子设备 - Google Patents

设备控制方法、装置、存储介质及电子设备 Download PDF

Info

Publication number
WO2022007541A1
WO2022007541A1 · PCT/CN2021/097402 · WO 2022/007541 A1
Authority
WO
WIPO (PCT)
Prior art keywords
gesture
window
icon
window mode
processing operation
Prior art date
Application number
PCT/CN2021/097402
Other languages
English (en)
French (fr)
Inventor
莫博宇
Original Assignee
Oppo广东移动通信有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Oppo广东移动通信有限公司 filed Critical Oppo广东移动通信有限公司
Publication of WO2022007541A1 publication Critical patent/WO2022007541A1/zh

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04817Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance using icons
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range

Definitions

  • the present application relates to the field of interaction technologies, and in particular, to a device control method, device, storage medium, and electronic device.
  • the user may interact with the electronic device by performing a touch operation on the display screen, or the user may also interact with the electronic device by means of voice control, and so on.
  • Embodiments of the present application provide a device control method, device, storage medium, and electronic device, which can improve the operability of the electronic device.
  • an embodiment of the present application provides a device control method, wherein the device control method includes:
  • receiving information of a first gesture when information of an application is displayed in a first window mode;
  • if the first gesture includes a preset trigger gesture, displaying at least one icon, each of the icons being used to represent a processing operation performed on the first window mode;
  • acquiring a gesture end position of the first gesture;
  • if the gesture end position of the first gesture is located at a display position of an icon, determining the processing operation represented by the corresponding icon as a target processing operation; and
  • executing the target processing operation.
  • an embodiment of the present application provides a device control device, wherein the device control device includes:
  • a first receiving module configured to receive the information of the first gesture when the information of the application is displayed in the first window mode
  • a display module configured to display at least one icon if the first gesture includes a preset trigger gesture, and each of the icons is used to represent a processing operation performed on the first window mode;
  • a second receiving module configured to obtain the gesture end position of the first gesture
  • a determining module configured to determine the processing operation represented by the corresponding icon as the target processing operation if the gesture end position of the first gesture is located at the display position of the icon;
  • An execution module configured to execute the target processing operation.
  • an embodiment of the present application provides a storage medium in which a computer program is stored, and when the computer program is run on a computer, the computer is caused to perform:
  • receiving information of a first gesture when information of an application is displayed in a first window mode;
  • if the first gesture includes a preset trigger gesture, displaying at least one icon, each of the icons being used to represent a processing operation performed on the first window mode;
  • acquiring a gesture end position of the first gesture;
  • if the gesture end position of the first gesture is located at a display position of an icon, determining the processing operation represented by the corresponding icon as a target processing operation; and
  • executing the target processing operation.
  • an embodiment of the present application provides an electronic device, wherein the electronic device includes a memory and a processor, a computer program is stored in the memory, and the processor is configured, by invoking the computer program stored in the memory, to perform:
  • receiving information of a first gesture when information of an application is displayed in a first window mode;
  • if the first gesture includes a preset trigger gesture, displaying at least one icon, each of the icons being used to represent a processing operation performed on the first window mode;
  • acquiring a gesture end position of the first gesture;
  • if the gesture end position of the first gesture is located at a display position of an icon, determining the processing operation represented by the corresponding icon as a target processing operation; and
  • executing the target processing operation.
  • FIG. 1 is a schematic flowchart of a device control method provided by an embodiment of the present application.
  • FIG. 2 is another schematic flowchart of a device control method provided by an embodiment of the present application.
  • 3 to 12 are schematic diagrams of various scenarios of the device control method provided by the embodiments of the present application.
  • FIG. 13 is a schematic diagram of another operation of the first window mode provided by an embodiment of the present application.
  • FIG. 14 is a schematic structural diagram of a device control apparatus provided by an embodiment of the present application.
  • FIG. 15 is a schematic structural diagram of an electronic device provided by an embodiment of the present application.
  • FIG. 16 is another schematic structural diagram of an electronic device provided by an embodiment of the present application.
  • the executive body of the embodiment of the present application may be an electronic device such as a smart phone or a tablet computer.
  • the embodiment of the present application provides a device control method, including:
  • receiving information of a first gesture when information of an application is displayed in a first window mode;
  • if the first gesture includes a preset trigger gesture, displaying at least one icon, each of the icons being used to represent a processing operation performed on the first window mode;
  • acquiring a gesture end position of the first gesture;
  • if the gesture end position of the first gesture is located at a display position of an icon, determining the processing operation represented by the corresponding icon as a target processing operation; and
  • executing the target processing operation.
  • the first gesture includes a first segment gesture and a second segment gesture, the first segment gesture occurs before the second segment gesture, and the first segment gesture and the second segment gesture have a continuous gesture trajectory; the first gesture including a preset trigger gesture includes:
  • the first segment gesture matching the preset trigger gesture.
  • the preset trigger gesture includes a pressing operation performed on the window corresponding to the first window mode, and the pressing duration of the pressing operation is greater than or equal to a preset duration threshold, or the pressing pressure value of the pressing operation is greater than or equal to a preset pressure threshold.
  • the window corresponding to the first window mode includes a preset control
  • the preset trigger gesture includes a pressing operation performed at a position where the preset control is located.
  • the displayed icon includes a second icon
  • the processing operation represented by the second icon includes switching to a second window mode
  • the window area of the second window mode is larger than the window area of the first window mode.
  • the displayed icon further includes a third icon
  • the processing operation represented by the third icon includes switching to a third window mode
  • the window area of the third window mode is smaller than the window area of the first window mode.
  • the displayed icons further include a fourth icon, and the processing operation represented by the fourth icon includes closing the first window mode.
  • the first window mode includes displaying the running interface of the application in the window
  • the second window mode includes displaying the running interface of the application in full screen
  • the third window mode includes displaying customized information of the application in the window.
  • the device control method further includes:
  • before the first gesture ends, if the gesture pause position is at the display position of an icon, a preview effect of the processing operation represented by the corresponding icon is displayed.
  • the device control method further includes: returning to the desktop after closing the first window mode.
  • the device control method further includes: before the first gesture ends, the window in the window mode moves along the trajectory of the first gesture.
  • the device control method further includes:
  • the second gesture includes selecting a window corresponding to the first window mode and moving the selected window to the edge of the display screen;
  • the window corresponding to the first window mode is hidden at the edge of the display screen, and the position of the hidden window is represented in the form of an icon.
  • FIG. 1 is a schematic flowchart of a device control method provided by an embodiment of the present application. The process may include:
  • For example, the user may interact with the electronic device by performing a touch operation on the display screen, or the user may also interact with the electronic device by means of voice control, and so on.
  • However, in the related art, the operability of the electronic device is still relatively low during the interaction process.
  • the electronic device may receive a gesture from the user, for example, the gesture is recorded as the first gesture. That is, the electronic device can receive the information of the first gesture.
  • the window mode may mean that the electronic device can create a window on the display screen, and display in the window the information that the user wants to display (such as the running interface of an application specified by the user), or display in the window the information of the currently running application (such as a foreground application).
  • the electronic device may detect whether the first gesture includes a preset trigger gesture.
  • the first gesture is a complete and coherent gesture.
  • the first gesture is a complete and coherent gesture, which may refer to: during the process of making the first gesture, the user's finger always keeps in contact with the touch display screen without leaving it.
  • the fact that the first gesture includes a preset trigger gesture may mean: for example, when the first gesture is decomposed into multiple gestures, one of the gestures of the first gesture matches the preset trigger gesture.
  • For example, if the first gesture is decomposed into two gestures, namely a former gesture and a latter gesture, then if the former gesture matches the preset trigger gesture, the first gesture can be considered to contain the preset trigger gesture, and so on.
  • If the first gesture does not include the preset trigger gesture, the electronic device may perform other operations.
  • If the first gesture includes the preset trigger gesture, the flow proceeds to 102.
  • If the first gesture includes a preset trigger gesture, display at least one icon, where each icon is used to represent a processing operation performed on the first window mode.
  • For example, when the electronic device detects that the first gesture includes the preset trigger gesture, the electronic device can be triggered to display at least one icon on the touch screen, where each icon is used to represent a processing operation performed on the first window mode.
  • the electronic device may also acquire the gesture end position of the first gesture after detecting that the first gesture ends.
  • After acquiring the gesture end position of the first gesture, the electronic device can detect whether the gesture end position of the first gesture is located at the display position of a certain icon.
  • The gesture end position of the first gesture being located at the display position of a certain icon may mean the following: for example, if the display position of a certain icon is position A, and the gesture end position of the first gesture is also position A, then the gesture end position of the first gesture is located at the display position of that icon.
  • For example, if the first gesture is a touch operation on the touch screen, the touch position of the last touch operation is the gesture end position of the first gesture. For example, after the user's finger slides to position A of the touch screen and leaves the touch screen, the touch position of the last touch operation is position A. If the display position of a certain icon is also position A, the gesture end position of the first gesture is located at the display position of that icon.
  • If it is detected that the gesture end position of the first gesture is not located at the display position of any icon, the electronic device may perform other operations.
  • If it is detected that the gesture end position of the first gesture is located at the display position of a certain icon, the flow proceeds to 104.
  • If the gesture end position of the first gesture is located at the display position of an icon, the processing operation represented by the corresponding icon is determined as the target processing operation.
  • the electronic device may determine the processing operation represented by the corresponding icon as the target processing operation, and execute the target processing operation.
  • For example, when detecting that the first gesture includes the preset trigger gesture, the electronic device displays three icons on the touch display screen, namely a second icon, a third icon and a fourth icon, where the second icon is used to represent a second processing operation performed on the first window mode, the third icon is used to represent a third processing operation performed on the first window mode, and the fourth icon is used to represent a fourth processing operation performed on the first window mode.
  • Then the electronic device obtains the gesture end position of the first gesture, and detects that the gesture end position of the first gesture is at the display position of the second icon; in this case, the electronic device can determine the second processing operation represented by the second icon as the target processing operation, and perform the second processing operation.
  • It can be understood that, in the embodiment of the present application, when the information of the application is displayed in the first window mode, if the electronic device receives the first gesture including the preset trigger gesture, the electronic device can display at least one icon, each icon being used to represent a processing operation performed on the first window mode. After that, the electronic device can acquire the gesture end position of the first gesture, and when detecting that the gesture end position of the first gesture is at the display position of a certain icon, determine the processing operation represented by the corresponding icon as the target processing operation and execute the target processing operation.
  • Since the embodiment of the present application can perform a corresponding processing operation on the first window mode when the electronic device receives a gesture including the trigger gesture, the electronic device can quickly perform the corresponding processing operation on the first window mode; that is, the embodiments of the present application can improve the operability of the electronic device.
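  • As an illustration only, the following plain-Kotlin sketch mirrors the flow of 101 to 105 described above. The names (Point, Rect, IconSlot, ProcessingOperation, WindowModeController) and the callback style are assumptions made for this example and are not part of the patent; the sketch simply shows how a gesture end position can be mapped to the processing operation represented by an icon and then executed.

```kotlin
// Illustrative sketch only: names and types are assumptions, not the patent's API.
data class Point(val x: Float, val y: Float)
data class Rect(val left: Float, val top: Float, val right: Float, val bottom: Float) {
    fun contains(p: Point) = p.x in left..right && p.y in top..bottom
}

enum class ProcessingOperation { SWITCH_TO_LARGER_WINDOW, SWITCH_TO_SMALLER_WINDOW, CLOSE_WINDOW_MODE }

// Each displayed icon occupies a display position (bounds) and represents one processing operation.
data class IconSlot(val bounds: Rect, val operation: ProcessingOperation)

class WindowModeController(private val iconSlots: List<IconSlot>) {

    // 102: called once the first gesture is found to contain the preset trigger gesture.
    fun onTriggerGestureDetected(showIcons: (List<IconSlot>) -> Unit) = showIcons(iconSlots)

    // 103-105: called when the first gesture ends; the gesture end position selects the target operation.
    fun onGestureEnd(endPosition: Point, execute: (ProcessingOperation) -> Unit) {
        val target = iconSlots.firstOrNull { it.bounds.contains(endPosition) }?.operation
        if (target != null) execute(target)   // 104-105: determine and execute the target processing operation
        // else: the end position is not on any icon, so no target operation is executed
    }
}
```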
  • FIG. 2 is another schematic flowchart of a device control method provided by an embodiment of the present application.
  • the process may include:
  • the electronic device may receive a gesture from the user, for example, the gesture is recorded as the first gesture. That is, the electronic device can receive the information of the first gesture.
  • the window mode may mean that the electronic device can create a window on the display screen, and display in the window the information that the user wants to display (such as the running interface of an application specified by the user), or display in the window the information of the currently running application (such as a foreground application).
  • the electronic device may detect whether the first gesture includes a preset trigger gesture.
  • the preset trigger gesture may be a pressing operation performed on a window corresponding to the first window mode, and the pressing duration of the pressing operation may be greater than or equal to a preset duration threshold, that is, the preset trigger gesture may be a user A long-press operation on the window corresponding to the first window mode.
  • the pressing duration of the pressing operation performed by the user on the window is greater than or equal to 0.5 seconds or 1 second, it may be regarded as a long pressing operation.
  • Alternatively, the pressing pressure value of the pressing operation may be greater than or equal to a preset pressure threshold; that is, the preset trigger gesture may be a force-press operation performed by the user on the window corresponding to the first window mode.
  • For example, when the pressing pressure value of the pressing operation performed by the user on the window is greater than or equal to 5 N or 4 N, it may be regarded as a force-press operation, and so on.
  • the window corresponding to the first window mode may include a preset control.
  • the preset trigger gesture includes a pressing operation performed at the position where the preset control is located.
  • the preset trigger gesture may be a long-press operation or a force-press operation performed at the position where the preset control is located.
  • the window shown in the figure is the window corresponding to the first window mode
  • a preset control is set at the middle position of the upper edge of the window
  • the preset trigger gesture can be a long press on the preset control: for example, when the user presses the preset control and the pressing duration is greater than or equal to the preset duration threshold, the electronic device may determine that the preset trigger gesture is received.
  • Alternatively, the preset trigger gesture may be a force press on the preset control: for example, when the user presses the preset control and the pressing pressure value applied to the touch display screen is greater than or equal to the preset pressure threshold, the electronic device may determine that the preset trigger gesture is received, and so on.
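  • As an illustration, the check described above (a long press or a force press on the preset control) could be sketched as follows, reusing the Point and Rect helpers from the earlier sketch. The PressEvent type and the 0.5 s / 4 N threshold constants are assumptions taken from the example values mentioned above, not a prescribed implementation.

```kotlin
// Illustrative sketch: PressEvent and the threshold values are assumptions for this example.
data class PressEvent(
    val position: Point,          // where the finger presses
    val durationMs: Long,         // how long the finger has been held down
    val pressureNewtons: Float    // pressure reported by the touch screen, if available
)

const val LONG_PRESS_MS = 500L        // e.g. 0.5 seconds
const val FORCE_PRESS_NEWTONS = 4f    // e.g. 4 N

// The preset control occupies a small region at the middle of the window's upper edge (see FIG. 3).
fun isPresetTriggerGesture(event: PressEvent, presetControlBounds: Rect): Boolean {
    val onPresetControl = presetControlBounds.contains(event.position)
    val longPress = event.durationMs >= LONG_PRESS_MS
    val forcePress = event.pressureNewtons >= FORCE_PRESS_NEWTONS
    return onPresetControl && (longPress || forcePress)
}
```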
  • the first gesture is a complete and coherent gesture.
  • the first gesture is a complete and coherent gesture, which may refer to: during the process of making the first gesture, the user's finger always keeps in contact with the touch display screen without leaving it.
  • the fact that the first gesture includes a preset trigger gesture may mean: for example, when the first gesture is decomposed into multiple gestures, one of the gestures of the first gesture matches the preset trigger gesture.
  • the first gesture is decomposed into a first segment gesture and a second segment gesture, the first segment gesture occurs before the second segment gesture, and the first segment gesture and the second segment gesture Gestures have a continuous gesture trajectory.
  • the first gesture includes a preset trigger gesture, which may include: the first segment of the gesture matches the preset trigger gesture.
  • If the first gesture does not include the preset trigger gesture, the electronic device may perform other operations.
  • If the first gesture includes the preset trigger gesture, the flow proceeds to 202.
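  • A rough sketch of this decomposition is given below, again reusing the helpers from the earlier sketches (Point, Rect, PressEvent, isPresetTriggerGesture). The TouchSample type, the jitter radius, and the rule used to cut the gesture into a first segment (the initial hold) and a second segment (the following movement) are assumptions for illustration; only the long-press variant of the trigger gesture is modeled here.

```kotlin
// Illustrative sketch: TouchSample and the segmentation rule are assumptions for this example.
data class TouchSample(val position: Point, val timestampMs: Long)

// A complete, continuous first gesture: the finger never leaves the screen between samples.
data class ContinuousGesture(val samples: List<TouchSample>)

fun distance(a: Point, b: Point): Float {
    val dx = a.x - b.x; val dy = a.y - b.y
    return kotlin.math.sqrt(dx * dx + dy * dy)
}

// Split the gesture into a first segment (samples that stay near the press-down point)
// and an implicit second segment (the subsequent movement), then match the first
// segment against the preset trigger gesture.
fun containsPresetTriggerGesture(
    gesture: ContinuousGesture,
    presetControlBounds: Rect,
    holdRadius: Float = 20f            // finger jitter tolerated while "holding"
): Boolean {
    val down = gesture.samples.firstOrNull() ?: return false
    val firstSegment = gesture.samples.takeWhile { distance(it.position, down.position) <= holdRadius }
    val holdDurationMs = (firstSegment.lastOrNull()?.timestampMs ?: down.timestampMs) - down.timestampMs
    val pressEvent = PressEvent(down.position, holdDurationMs, pressureNewtons = 0f)
    return isPresetTriggerGesture(pressEvent, presetControlBounds)
}
```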
  • If the first gesture includes the preset trigger gesture, a second icon, a third icon and a fourth icon are displayed, where the processing operation represented by the second icon includes switching to a second window mode, and the window area of the second window mode is larger than the window area of the first window mode;
  • the processing operation represented by the third icon includes switching to a third window mode, and the window area of the third window mode is smaller than the window area of the first window mode;
  • the processing operation represented by the fourth icon includes closing the first window mode.
  • For example, when the electronic device detects that the first gesture includes the preset trigger gesture, the electronic device can be triggered to display at least one icon on the touch screen, where each icon is used to represent a processing operation performed on the first window mode.
  • the electronic device may display the second icon, the third icon and the fourth icon.
  • the processing operation represented by the second icon may be switching to a second window mode, and the window area of the second window mode is larger than the window area of the first window mode. That is, switching from the first window mode to the second window mode means switching from the current window to a larger window.
  • the processing operation represented by the third icon may be switching to a third window mode, where the window area of the third window mode is smaller than that of the first window mode. That is, switching from the first window mode to the third window mode means switching from the current window to a smaller window.
  • the processing operation represented by the fourth icon may be closing the first window mode.
  • For example, as shown in FIG. 4, when the information of the application is displayed in the first window mode and the first gesture received by the electronic device includes the preset trigger gesture, the electronic device can display the second icon R, the third icon S and the fourth icon T on the display screen.
  • the processing operation represented by the second icon R may be switching to a second window mode, and the window area of the second window mode is larger than that of the first window mode.
  • the processing operation represented by the third icon S may be switching to a third window mode, where the window area of the third window mode is smaller than that of the first window mode.
  • the processing operation represented by the fourth icon T may be closing the first window mode.
  • the first window mode may include displaying the running interface of the application in the window
  • the second window mode may include displaying the running interface of the application in full screen
  • the third window mode may include displaying customized information of the application in the window.
  • the customized information of the application displayed in the third window mode may be the latest notification information of the application or other information.
  • the target application is an instant messaging application
  • the latest notification information of the instant messaging application may be displayed in the third window mode.
  • the application is a map navigation application
  • the information displayed in the third window mode may be the user's current location information, and the like. That is, which information of the application is displayed as the customized information may be determined according to the type of the application or the user's requirements, which is not specifically limited in this embodiment of the present application.
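  • The following sketch illustrates one possible way to model the three window modes and what switching between them could mean for the window bounds and the displayed content; the mode names, the layout fractions, and the placement of the flash window in the upper-right corner are assumptions made for this example (loosely following FIG. 6), not values taken from the patent. It reuses the Rect type from the earlier sketch.

```kotlin
// Illustrative sketch: the mode names and layout rules are assumptions for this example.
enum class WindowMode { SMALL_WINDOW, FULL_SCREEN, FLASH_WINDOW }

data class Screen(val width: Float, val height: Float)

// What switching to each mode could mean in terms of window bounds: full screen covers
// the display, the flash window is a small area near the upper-right corner, and the
// small window is somewhere in between.
fun windowBoundsFor(mode: WindowMode, screen: Screen): Rect = when (mode) {
    WindowMode.FULL_SCREEN  -> Rect(0f, 0f, screen.width, screen.height)
    WindowMode.SMALL_WINDOW -> Rect(screen.width * 0.1f, screen.height * 0.2f,
                                    screen.width * 0.9f, screen.height * 0.6f)
    WindowMode.FLASH_WINDOW -> Rect(screen.width * 0.6f, screen.height * 0.05f,
                                    screen.width * 0.95f, screen.height * 0.2f)
}

// In the flash window, only customized information (for example, the latest notification
// of the application) is rendered instead of the full running interface.
fun contentFor(mode: WindowMode, latestNotification: String, runningInterface: String): String =
    if (mode == WindowMode.FLASH_WINDOW) latestNotification else runningInterface
```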
  • Before the first gesture ends, the window in the window mode moves along the trajectory of the first gesture.
  • the window in the window mode may move following the gesture track of the first gesture.
  • the electronic device displays the information of the car-hailing application Y in the first window mode.
  • the window corresponding to the first window mode includes a preset control, and the position of the preset control is B.
  • the user performs a long-press operation on the preset control, the electronic device determines that the preset trigger gesture is received, and displays the second icon R, the third icon S and the fourth icon T on the display screen.
  • the user's finger maintains contact with the touch screen and slides from position B to position C, and the sliding trajectory is a curve between position B and position C in FIG. 5 .
  • the electronic device can control the window to move from the position B to the position C in synchronization with the gesture track.
  • the user's finger still keeps in contact with the touch screen without leaving the touch screen, and slides from position C to position D, and the sliding trajectory is a curve between positions C and D in FIG. 5 .
  • the electronic device can control the window to move from position C to position D in synchronization with the gesture track.
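  • A minimal sketch of this follow-the-finger behaviour is shown below, reusing the Point and Rect helpers from the earlier sketches; the event-callback names are assumptions for illustration.

```kotlin
// Illustrative sketch: before the first gesture ends, the window tracks the finger.
class FollowingWindow(var bounds: Rect) {
    private var lastFinger: Point? = null

    fun onFingerDown(p: Point) { lastFinger = p }

    // Called for every move sample while the finger stays on the screen:
    // the window is offset by the same delta as the finger (B -> C -> D in FIG. 5).
    fun onFingerMove(p: Point) {
        val prev = lastFinger ?: return
        val dx = p.x - prev.x
        val dy = p.y - prev.y
        bounds = Rect(bounds.left + dx, bounds.top + dy, bounds.right + dx, bounds.bottom + dy)
        lastFinger = p
    }

    fun onFingerUp() { lastFinger = null }
}
```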
  • For example, 204, 205, and 206 may include the following.
  • For example, after the second icon, the third icon and the fourth icon are displayed, as long as the user's finger has not left the touch screen, the user can continue to perform the touch gesture on the touch screen; that is, the electronic device can continue to receive the information of the first gesture from the user.
  • the electronic device may detect whether the first gesture ends.
  • the end of the first gesture may refer to an event in which the electronic device detects that the user's finger leaves the touch display screen.
  • For example, the user's finger starts to slide from position B on the touch screen, passes through position C, and leaves the touch screen when sliding to position D; then, when it is detected that the user's finger leaves the touch screen from position D, the electronic device detects that the first gesture ends.
  • the electronic device may control the window in the window mode to follow the continuous gesture track to move synchronously.
  • the electronic device may acquire the gesture end position of the first gesture, and detect whether the gesture end position of the first gesture is located at the display position of an icon.
  • The gesture end position of the first gesture being located at the display position of a certain icon may mean the following: for example, if the display position of a certain icon is position A, and the gesture end position of the first gesture is also position A, then the gesture end position of the first gesture is located at the display position of that icon.
  • For example, if the first gesture is a touch operation on the touch screen, the touch position of the last touch operation is the gesture end position of the first gesture. For example, after the user's finger slides to position A of the touch screen and leaves the touch screen, the touch position of the last touch operation is position A. If the display position of a certain icon is also position A, the gesture end position of the first gesture is located at the display position of that icon.
  • the gesture end position of the first gesture is also the gesture end position of the second segment gesture.
  • If the gesture end position of the first gesture is not located at the display position of any icon, the electronic device may perform other operations.
  • If the gesture end position of the first gesture is located at the display position of a certain icon, the electronic device may determine the processing operation represented by the corresponding icon as the target processing operation, and execute the target processing operation.
  • For example, the electronic device can detect the end of the first gesture, and the gesture end position of the first gesture is position D. Since position D is located at the display position of the third icon S, the electronic device can determine the processing operation represented by the third icon S as the target processing operation; that is, the electronic device can determine switching to the third window mode as the target processing operation, and perform the operation of switching to the third window mode. For example, since the window area of the third window mode is smaller than that of the first window mode, the electronic device can use a smaller window to display the information of the car-hailing application Y. For example, as shown in FIG. 6, switching to the third window mode means that the electronic device displays the latest notification information of the car-hailing application Y in a small window in the upper right corner of the display screen.
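  • As a usage example, the earlier sketches can be wired together to reproduce this scenario: the gesture ends at an assumed position D that lies inside the (assumed) bounds of icon S, so the target processing operation becomes switching to the smaller flash window. All coordinates below are invented for illustration and reuse the types defined in the previous sketches.

```kotlin
// Illustrative wiring of the earlier sketches for the FIG. 5 / FIG. 6 example.
fun main() {
    val screen = Screen(1080f, 2400f)
    val iconS = IconSlot(bounds = Rect(700f, 300f, 900f, 500f),      // assumed display position of icon S
                         operation = ProcessingOperation.SWITCH_TO_SMALLER_WINDOW)
    val controller = WindowModeController(listOf(iconS))

    val positionD = Point(800f, 400f)                                // assumed gesture end position D
    controller.onGestureEnd(positionD) { target ->
        if (target == ProcessingOperation.SWITCH_TO_SMALLER_WINDOW) {
            val newBounds = windowBoundsFor(WindowMode.FLASH_WINDOW, screen)
            println("Switching to flash window at $newBounds, showing the latest notification")
        }
    }
}
```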
  • the embodiments of the present application may further include the following processes:
  • Before the first gesture ends, if the gesture pause position is at the display position of an icon, the preview effect of the processing operation represented by the corresponding icon is displayed.
  • the user's finger slides from position B to position C and then to position D.
  • For example, suppose the moment when the user's finger slides to position D is t1, and the user's finger does not leave the touch screen immediately at that moment; that is, the user's finger stays at position D, so the gesture pause position of the first gesture is position D.
  • In this case, the electronic device can display a preview effect of the window after switching to the window mode represented by the third icon S. For example, the electronic device may display a preview effect of the window area after switching to the third window mode, as shown in FIG. 7.
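  • One possible way to detect such a pause and trigger the preview is sketched below; the dwell threshold, the jitter radius, and the callback shape are assumptions for this example, and the distance, Point, Rect and IconSlot helpers come from the earlier sketches.

```kotlin
// Illustrative sketch: detecting a pause over an icon before the gesture ends
// and showing a preview of the operation that icon represents.
class PausePreviewDetector(
    private val iconSlots: List<IconSlot>,
    private val pauseThresholdMs: Long = 300L          // assumed dwell time that counts as a pause
) {
    private var stillSinceMs: Long? = null
    private var lastPosition: Point? = null

    fun onFingerMove(p: Point, nowMs: Long, showPreview: (ProcessingOperation) -> Unit) {
        val moved = lastPosition?.let { distance(it, p) > 10f } ?: true
        if (moved) { stillSinceMs = nowMs; lastPosition = p; return }
        val pausedFor = nowMs - (stillSinceMs ?: nowMs)
        if (pausedFor >= pauseThresholdMs) {
            iconSlots.firstOrNull { it.bounds.contains(p) }
                ?.let { showPreview(it.operation) }     // e.g. preview the flash-window size (FIG. 7)
        }
    }
}
```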
  • FIG. 8 to FIG. 14 are schematic diagrams of scenarios of a device control method provided by an embodiment of the present application.
  • the electronic device currently displays the running interface of the car-hailing application Y in the first window mode.
  • the first window mode may be named as a small window mode. That is, as shown in FIG. 3 , the electronic device displays the running interface of the car-hailing application Y in a small window mode.
  • the electronic device may display three icons above the small window.
  • the displayed three icons are icon R, icon S and icon T respectively.
  • the processing operation represented by the icon R may be switching to displaying the running interface of the application in full screen (the full-screen mode may be regarded as a special window mode).
  • the processing operation represented by the icon S may be switching to a flash window mode, and the window area in the flash window mode is smaller than the window area in the small window mode.
  • the processing operation represented by the icon T may be closing the small window mode.
  • the electronic device may continue to receive gestures from the user. For example, the user's finger does not leave the touch screen after long pressing the position B where the preset control is located, but continues to slide to the position C, as shown in FIG. 8 . After swiping to position C, the user's finger leaves the touch screen. At this time, the electronic device detects that the sliding trajectory from the position B to the position C is continuous, and the position C is at the display position of the icon R. In this case, the electronic device can switch to display the running interface of the car-hailing application Y in full screen, as shown in FIG. 9 .
  • the first gesture is a gesture with continuous sliding tracks starting from position B and ending at position C.
  • the first gesture can be decomposed into a first segment of gestures and a second segment of gestures.
  • the first segment of the gesture may be a long-press operation at position B
  • the second segment of the gesture may be a gesture from position B to position C.
  • the first segment gesture and the second segment gesture have a continuous sliding track; that is, the user's finger never leaves the touch screen while the user makes the first segment gesture and the second segment gesture.
  • the electronic device may continue to receive gestures from the user. For example, the user's finger does not leave the touch screen after long pressing the position B where the preset control is located, but continues to slide to the position E, and the trajectory can be as shown in FIG. 10 . After sliding to position E, the user's finger leaves the touch screen. At this time, the electronic device detects that the sliding trajectory from the position B to the position E is continuous, and the position E is located at the display position of the icon S. In this case, the electronic device can switch to display the latest notification information of the car-hailing application Y in a flashing window mode, as shown in FIG. 6 .
  • the first gesture is a gesture that starts from position B and ends at position E with a continuous sliding trajectory.
  • the first gesture can be decomposed into a first segment of gestures and a second segment of gestures.
  • the first segment of the gesture may be a long-press operation performed at position B
  • the second segment of the gesture may be a gesture from position B to position E.
  • the first segment gesture and the second segment gesture have a continuous sliding track; that is, the user's finger never leaves the touch screen while the user makes the first segment gesture and the second segment gesture.
  • the electronic device may continue to receive gestures from the user. For example, the user's finger does not leave the touch screen after long pressing the position B where the preset control is located, but continues to slide to the position F, and the sliding track is shown in Figure 11. After swiping to position F, the user's finger leaves the touch display. At this time, the electronic device detects that the sliding trajectory from the position B to the position F is continuous, and the position F is located at the display position of the icon T. In this case, the electronic device can close the small window mode and return to the desktop, as shown in FIG. 12 .
  • the first gesture is a gesture with continuous sliding tracks starting from position B and ending at position F.
  • the first gesture can be decomposed into a first segment of gestures and a second segment of gestures.
  • the first segment of the gesture may be a long-press operation performed at position B
  • the second segment of the gesture may be a segment of the gesture from position B to position F.
  • the first segment gesture and the second segment gesture have a continuous sliding track; that is, the user's finger never leaves the touch screen while the user makes the first segment gesture and the second segment gesture.
  • It can be seen that the whole process of the electronic device entering the full-screen mode from the small window mode (corresponding to the gesture operation starting from position B and ending at position C), entering the flash window mode from the small window mode (corresponding to the gesture operation starting from position B and ending at position E), or closing the small window mode (corresponding to the gesture operation starting from position B and ending at position F) is completed in one continuous motion; that is, it only takes one step to switch to another window mode or to close the window mode, so the operation efficiency is high and the user experience is good.
  • the position of each icon displayed on the display screen can be adjusted by the user. For example, by default, icons R, S, T are displayed in order from left to right.
  • the user can also adjust the position of the icon, for example, the adjusted icons displayed in sequence from left to right are R, T, S, and so on.
  • the electronic device can also learn the user's usage habits by means of machine learning, and adjust the display positions of the icons according to those habits, and so on.
  • this embodiment of the present application may also include the following processes:
  • the second gesture includes moving the selected window to the edge of the display screen after selecting the window corresponding to the first window mode;
  • the window corresponding to the first window mode is hidden at the edge of the display screen, and the position of the hidden window is represented in the form of an icon.
  • For example, the user can also select the window (for example, press and hold the edge of the small window) and drag it to the edge of the screen to collapse the window at the edge of the screen, and an icon (such as a round icon) is used to indicate the location of the collapsed window (that is, the hidden window); a schematic diagram of the whole process can be as shown in FIG. 13.
  • In this way, the information of the application can be displayed in the first window mode again when needed.
  • Or, when displaying the information of the application in the first window mode, the user can also select the window (for example, press and hold the edge of the small window), drag it to the edge of the screen, and keep it there for a certain period of time (for example, 0.5 seconds or 1 second, etc.);
  • the window is then collapsed at the edge of the screen, and an icon (such as a circular icon) is used to indicate the position of the collapsed window.
  • In this way, the information of the application can be displayed in the first window mode again when needed.
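  • A rough sketch of this collapse-to-edge behaviour is shown below; the EdgeIcon type, the edge margin, and the 0.5 s dwell threshold are assumptions for illustration (the dwell value follows the example durations mentioned above), and the Point, Rect and Screen types come from the earlier sketches.

```kotlin
// Illustrative sketch: the second gesture drags the selected window to the screen edge;
// after it stays there briefly, the window is hidden and replaced by an edge icon.
data class EdgeIcon(val position: Point)               // round icon marking the hidden window

fun collapseIfDroppedAtEdge(
    windowBounds: Rect,
    screen: Screen,
    dwellMs: Long,
    edgeMarginPx: Float = 24f,                         // assumed distance that counts as "at the edge"
    dwellThresholdMs: Long = 500L                      // e.g. 0.5 seconds
): EdgeIcon? {
    val atLeftEdge = windowBounds.left <= edgeMarginPx
    val atRightEdge = windowBounds.right >= screen.width - edgeMarginPx
    if ((atLeftEdge || atRightEdge) && dwellMs >= dwellThresholdMs) {
        val x = if (atLeftEdge) 0f else screen.width
        val y = (windowBounds.top + windowBounds.bottom) / 2f
        return EdgeIcon(Point(x, y))                   // window is hidden; the icon marks where it is
    }
    return null                                        // keep showing the window in the first window mode
}
```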
  • the device control apparatus 300 may include: a first receiving module 301 , a display module 302 , a second receiving module 303 , a determining module 304 , and an executing module 305 .
  • the first receiving module 301 is configured to receive the information of the first gesture when the information of the application is displayed in the first window mode.
  • the display module 302 is configured to display at least one icon if the first gesture includes a preset trigger gesture, and each of the icons is used to represent a processing operation performed on the first window mode.
  • the second receiving module 303 is configured to acquire the gesture end position of the first gesture.
  • the determining module 304 is configured to determine the processing operation represented by the corresponding icon as the target processing operation if the gesture end position of the first gesture is located at the display position of the icon.
  • the execution module 305 is configured to execute the target processing operation.
  • the first gesture includes a first segment gesture and a second segment gesture, the first segment gesture occurs before the second segment gesture, and the first segment gesture and the second segment gesture have a continuous gesture trajectory.
  • the first gesture including a preset trigger gesture includes: the first segment gesture matching the preset trigger gesture.
  • the preset trigger gesture includes a pressing operation performed on a window corresponding to the first window mode, and the pressing duration of the pressing operation is greater than or equal to a preset duration threshold, or the pressing pressure value of the pressing operation is greater than or equal to a preset pressure threshold.
  • the window corresponding to the first window mode includes preset controls.
  • the preset trigger gesture includes a pressing operation performed at the position where the preset control is located.
  • the displayed icon includes a second icon
  • the processing operation represented by the second icon includes switching to a second window mode
  • the window area of the second window mode is larger than the window area of the first window mode.
  • the displayed icons further include a third icon
  • the processing operation represented by the third icon includes switching to a third window mode
  • the window area of the third window mode is smaller than the window area of the first window mode.
  • the displayed icons further include a fourth icon, and the processing operation represented by the fourth icon includes closing the first window mode.
  • the first window mode includes displaying the running interface of the application in a window
  • the second window mode includes displaying the running interface of the application in full screen
  • the third window mode includes displaying customized information of the application in the window.
  • the execution module 305 may also be configured to: before the first gesture ends, if the gesture pause position is at the display position of the icon, display a preview effect of the processing operation represented by the corresponding icon.
  • the executing module 305 may also be configured to: return to the desktop after closing the first window mode.
  • the executing module 305 may be further configured to: before the first gesture ends, control the window in the window mode to move along the trajectory of the first gesture.
  • the execution module 305 may be further configured to: when displaying the information of the application in the first window mode, receive the information of the second gesture, where the second gesture includes selecting the information corresponding to the first window mode window and move the selected window to the edge of the display screen; according to the information of the second gesture, hide the window corresponding to the first window mode on the edge of the display screen, and represent the position of the hidden window in the form of an icon.
  • An embodiment of the present application provides a computer-readable storage medium, on which a computer program is stored, and when the computer program is executed on a computer, the computer is made to execute the process in the method provided by this embodiment.
  • An embodiment of the present application further provides an electronic device, including a memory and a processor, where the processor is configured to execute the process in the device control method provided by the present embodiment by invoking a computer program stored in the memory.
  • the above-mentioned electronic device may be a mobile terminal such as a tablet computer or a smart phone.
  • FIG. 15 is a schematic structural diagram of an electronic device provided by an embodiment of the present application.
  • the electronic device 400 may include a touch display screen 401, a memory 402, a processor 403 and other components.
  • The structure shown in FIG. 15 does not constitute a limitation on the electronic device, which may include more or fewer components than shown, combine some components, or have a different arrangement of components.
  • the touch display screen 401 can be used to display information such as text and images, and can also be used to receive user's touch operations.
  • Memory 402 may be used to store applications and data.
  • the application program stored in the memory 402 contains executable code.
  • Applications can be composed of various functional modules.
  • the processor 403 executes various functional applications and data processing by executing the application programs stored in the memory 402 .
  • the processor 403 is the control center of the electronic device; it connects various parts of the entire electronic device through various interfaces and lines, and performs the various functions of the electronic device and processes data by running or executing the application programs stored in the memory 402 and invoking the data stored in the memory 402, thereby monitoring the electronic device as a whole.
  • In this embodiment, the processor 403 in the electronic device loads the executable code corresponding to the processes of one or more application programs into the memory 402 according to the following instructions, and the processor 403 runs the application programs stored in the memory 402, thereby executing:
  • receiving information of a first gesture when information of an application is displayed in a first window mode;
  • if the first gesture includes a preset trigger gesture, displaying at least one icon, each of the icons being used to represent a processing operation performed on the first window mode;
  • acquiring a gesture end position of the first gesture;
  • if the gesture end position of the first gesture is located at a display position of an icon, determining the processing operation represented by the corresponding icon as a target processing operation; and
  • executing the target processing operation.
  • the electronic device 400 may include a touch display screen 401 , a memory 402 , a processor 403 , a battery 404 , a microphone 405 , a speaker 406 and other components.
  • the touch display screen 401 can be used to display information such as text and images, and can also be used to receive user's touch operations.
  • Memory 402 may be used to store applications and data.
  • the application program stored in the memory 402 contains executable code.
  • Applications can be composed of various functional modules.
  • the processor 403 executes various functional applications and data processing by executing the application programs stored in the memory 402 .
  • the processor 403 is the control center of the electronic device; it connects various parts of the entire electronic device through various interfaces and lines, and performs the various functions of the electronic device and processes data by running or executing the application programs stored in the memory 402 and invoking the data stored in the memory 402, thereby monitoring the electronic device as a whole.
  • the battery 404 can be used to provide power support for the various components and modules of the electronic device, thereby ensuring the normal operation of the various components and modules.
  • the microphone 405 can be used to collect sound signals in the surrounding environment, such as the user's voice.
  • Speaker 406 may be used to play sound signals.
  • In this embodiment, the processor 403 in the electronic device loads the executable code corresponding to the processes of one or more application programs into the memory 402 according to the following instructions, and the processor 403 runs the application programs stored in the memory 402, thereby executing:
  • receiving information of a first gesture when information of an application is displayed in a first window mode;
  • if the first gesture includes a preset trigger gesture, displaying at least one icon, each of the icons being used to represent a processing operation performed on the first window mode;
  • acquiring a gesture end position of the first gesture;
  • if the gesture end position of the first gesture is located at a display position of an icon, determining the processing operation represented by the corresponding icon as a target processing operation; and
  • executing the target processing operation.
  • the first gesture includes a first segment gesture and a second segment gesture, the first segment gesture occurs before the second segment gesture, and the first segment gesture and the second segment gesture have a continuous gesture trajectory.
  • the first gesture including a preset trigger gesture includes: the first segment gesture matching the preset trigger gesture.
  • the preset trigger gesture includes a pressing operation performed on a window corresponding to the first window mode, and the pressing duration of the pressing operation is greater than or equal to a preset duration threshold, or the pressing pressure value of the pressing operation is greater than or equal to a preset pressure threshold.
  • the window corresponding to the first window mode includes a preset control; the preset trigger gesture includes a pressing operation performed at a position where the preset control is located.
  • the displayed icon includes a second icon
  • the processing operation represented by the second icon includes switching to a second window mode
  • the window area of the second window mode is larger than the window area of the first window mode.
  • the displayed icons further include a third icon
  • the processing operation represented by the third icon includes switching to a third window mode
  • the window area of the third window mode is smaller than the window area of the first window mode.
  • the displayed icons further include a fourth icon, and the processing operation represented by the fourth icon includes closing the first window mode.
  • the first window mode includes displaying the running interface of the application in a window
  • the second window mode includes displaying the running interface of the application in full screen
  • the third window mode includes displaying customized information of the application in the window.
  • the processor 403 may further execute: before the first gesture ends, if the gesture pause position is at the display position of the icon, display a preview effect of the processing operation represented by the corresponding icon.
  • the processor 403 may further execute: returning to the desktop after closing the first window mode.
  • the processor 403 may further execute: before the first gesture ends, controlling the window in the window mode to move along the trajectory of the first gesture.
  • the processor 403 may further perform: when the information of the application is displayed in the first window mode, receive information of a second gesture, where the second gesture includes selecting a window corresponding to the first window mode and move the selected window to the edge of the display screen; according to the information of the second gesture, hide the window corresponding to the first window mode on the edge of the display screen, and represent the position of the hidden window in the form of an icon.
  • The device control apparatus provided in the embodiments of the present application and the device control method in the above embodiments belong to the same concept, and any method provided in the device control method embodiments can be executed on the device control apparatus.
  • the computer program can be stored in a computer-readable storage medium, such as a memory, and executed by at least one processor, and the execution process may include processes such as those in the embodiments of the device control method.
  • the storage medium may be a magnetic disk, an optical disk, a read-only memory (ROM, Read Only Memory), a random access memory (RAM, Random Access Memory), and the like.
  • each functional module may be integrated in one processing chip, or each module may exist physically alone, or two or more modules may be integrated into one module.
  • the above-mentioned integrated modules can be implemented in the form of hardware, or in the form of software functional modules. If the integrated module is implemented in the form of a software functional module and sold or used as an independent product, it can also be stored in a computer-readable storage medium, such as a read-only memory, a magnetic disk or an optical disk.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present application discloses a device control method and apparatus, a storage medium, and an electronic device. In the device control method, when information of an application is displayed in a first window mode, if a first gesture containing a preset trigger gesture is received, at least one icon is displayed, each icon being used to represent a processing operation performed on the first window mode; a target processing operation is determined according to the icon corresponding to the gesture end position of the first gesture, and the target processing operation is executed.

Description

Device control method and apparatus, storage medium, and electronic device
This application claims priority to the Chinese patent application No. 202010659201.8, filed with the Chinese Patent Office on July 9, 2020 and entitled "Device control method and apparatus, storage medium, and electronic device", the entire contents of which are incorporated herein by reference.
Technical Field
The present application relates to the field of interaction technologies, and in particular to a device control method and apparatus, a storage medium, and an electronic device.
Background
With the development of technology, the ways of human-computer interaction have become more and more diverse. For example, a user may interact with an electronic device by performing touch operations on the display screen, or the user may interact with the electronic device by means of voice control, and so on.
Summary
Embodiments of the present application provide a device control method and apparatus, a storage medium, and an electronic device, which can improve the operability of the electronic device.
In a first aspect, an embodiment of the present application provides a device control method, wherein the device control method includes:
receiving information of a first gesture when information of an application is displayed in a first window mode;
if the first gesture contains a preset trigger gesture, displaying at least one icon, each of the icons being used to represent a processing operation performed on the first window mode;
acquiring a gesture end position of the first gesture;
if the gesture end position of the first gesture is located at a display position of an icon, determining the processing operation represented by the corresponding icon as a target processing operation; and
executing the target processing operation.
In a second aspect, an embodiment of the present application provides a device control apparatus, wherein the device control apparatus includes:
a first receiving module, configured to receive information of a first gesture when information of an application is displayed in a first window mode;
a display module, configured to display at least one icon if the first gesture contains a preset trigger gesture, each of the icons being used to represent a processing operation performed on the first window mode;
a second receiving module, configured to acquire a gesture end position of the first gesture;
a determining module, configured to determine, if the gesture end position of the first gesture is located at a display position of an icon, the processing operation represented by the corresponding icon as a target processing operation; and
an execution module, configured to execute the target processing operation.
In a third aspect, an embodiment of the present application provides a storage medium in which a computer program is stored; when the computer program is run on a computer, the computer is caused to perform:
receiving information of a first gesture when information of an application is displayed in a first window mode;
if the first gesture contains a preset trigger gesture, displaying at least one icon, each of the icons being used to represent a processing operation performed on the first window mode;
acquiring a gesture end position of the first gesture;
if the gesture end position of the first gesture is located at a display position of an icon, determining the processing operation represented by the corresponding icon as a target processing operation; and
executing the target processing operation.
In a fourth aspect, an embodiment of the present application provides an electronic device, wherein the electronic device includes a memory and a processor, a computer program is stored in the memory, and the processor is configured, by invoking the computer program stored in the memory, to perform:
receiving information of a first gesture when information of an application is displayed in a first window mode;
if the first gesture contains a preset trigger gesture, displaying at least one icon, each of the icons being used to represent a processing operation performed on the first window mode;
acquiring a gesture end position of the first gesture;
if the gesture end position of the first gesture is located at a display position of an icon, determining the processing operation represented by the corresponding icon as a target processing operation; and
executing the target processing operation.
Brief Description of the Drawings
The technical solutions of the present application and the beneficial effects thereof will become apparent from the following detailed description of specific embodiments of the present application with reference to the accompanying drawings.
FIG. 1 is a schematic flowchart of a device control method provided by an embodiment of the present application.
FIG. 2 is another schematic flowchart of the device control method provided by an embodiment of the present application.
FIG. 3 to FIG. 12 are schematic diagrams of various scenarios of the device control method provided by embodiments of the present application.
FIG. 13 is a schematic diagram of another operation on the first window mode provided by an embodiment of the present application.
FIG. 14 is a schematic structural diagram of a device control apparatus provided by an embodiment of the present application.
FIG. 15 is a schematic structural diagram of an electronic device provided by an embodiment of the present application.
FIG. 16 is another schematic structural diagram of the electronic device provided by an embodiment of the present application.
Detailed Description of the Embodiments
Reference is made to the accompanying drawings, in which identical reference numerals represent identical components. The principles of the present application are illustrated as being implemented in a suitable computing environment. The following description is based on the illustrated specific embodiments of the present application, which should not be regarded as limiting other specific embodiments of the present application not detailed herein.
It can be understood that the executive body of the embodiments of the present application may be an electronic device such as a smartphone or a tablet computer.
An embodiment of the present application provides a device control method, including:
receiving information of a first gesture when information of an application is displayed in a first window mode;
if the first gesture contains a preset trigger gesture, displaying at least one icon, each of the icons being used to represent a processing operation performed on the first window mode;
acquiring a gesture end position of the first gesture;
if the gesture end position of the first gesture is located at a display position of an icon, determining the processing operation represented by the corresponding icon as a target processing operation; and
executing the target processing operation.
In an embodiment, the first gesture includes a first segment gesture and a second segment gesture, the first segment gesture occurs before the second segment gesture, and the first segment gesture and the second segment gesture have a continuous gesture trajectory; the first gesture containing a preset trigger gesture includes:
the first segment gesture matching the preset trigger gesture.
In an embodiment, the preset trigger gesture includes a pressing operation performed on the window corresponding to the first window mode, and the pressing duration of the pressing operation is greater than or equal to a preset duration threshold, or the pressing pressure value of the pressing operation is greater than or equal to a preset pressure threshold.
In an embodiment, the window corresponding to the first window mode includes a preset control, and the preset trigger gesture includes a pressing operation performed at the position where the preset control is located.
In an embodiment, the displayed icons include a second icon, the processing operation represented by the second icon includes switching to a second window mode, and the window area of the second window mode is larger than the window area of the first window mode.
In an embodiment, the displayed icons further include a third icon, the processing operation represented by the third icon includes switching to a third window mode, and the window area of the third window mode is smaller than the window area of the first window mode.
In an embodiment, the displayed icons further include a fourth icon, and the processing operation represented by the fourth icon includes closing the first window mode.
In an embodiment, the first window mode includes displaying a running interface of the application in a window, the second window mode includes displaying the running interface of the application in full screen, and the third window mode includes displaying customized information of the application in a window.
In an embodiment, the device control method further includes:
before the first gesture ends, if a gesture pause position is located at a display position of an icon, displaying a preview effect of the processing operation represented by the corresponding icon.
In an embodiment, the device control method further includes:
returning to the desktop after the first window mode is closed.
In an embodiment, the device control method further includes:
before the first gesture ends, moving the window in the window mode along the trajectory of the first gesture.
In an embodiment, the device control method further includes:
receiving information of a second gesture when the information of the application is displayed in the first window mode, the second gesture including selecting the window corresponding to the first window mode and moving the selected window to an edge of the display screen; and
hiding, according to the information of the second gesture, the window corresponding to the first window mode at the edge of the display screen, and indicating the position of the hidden window in the form of an icon.
请参阅图1,图1是本申请实施例提供的设备控制方法的流程示意图,流程可以包括:
101、当以第一窗口模式显示应用的信息时,接收第一手势的信息。
随着技术的发展,人机交互的方式也越来越多样。比如,用户可以通过对显示屏进行触摸操作的方式和电子设备进行交互,或者用户也可以通过语音控制的方式和电子设备进行交互,等等。然而,相关技术中,在交互过程中,电子设备的可操作性仍然较低。
在本申请实施例中,比如,当电子设备以第一窗口模式显示某一应用的信息时,该电子设备可以从用户处接收一个手势,例如该手势记为第一手势。即,该电子设备可以接收到第一手势的信息。
需要说明的是,窗口模式可以是指电子设备可以在显示屏上创建一个窗口,并在该窗口显示用户想要显示的信息(如用户指定的某个应用的运行界面)或者将当前正在运行的应用(如前台应用)的信息显示在该窗口中。
在接收第一手势的信息的过程中,电子设备可以检测该第一手势是否包含预设触发手势。
需要说明的是,本实施例中,第一手势是一个完整的、连贯的手势。以第一手势为对触摸显示屏的触摸操作为例,第一手势是一个完整的、连贯的手势可以是指:在做出第一手势的过程中用户的手指始终保持对触摸显示屏的接触而没有离开过触摸显示屏。
那么,第一手势包含预设触发手势可以是指:比如将第一手势分解为多段手势来看,那么第一手势的其中一段手势与预设触发手势匹配。例如,将第一手势分解为两段手势,即前一段手势和后一段手势,那么如果该前一段手势和预设触发手势匹配的话,可以认为第一手势是包含预设触发手势的,等等。
如果该第一手势不包含预设触发手势,那么电子设备可以执行其它操作。
如果该第一手势包含预设触发手势,那么可以进入102的流程中。
102、若第一手势包含预设触发手势,则显示至少一个图标,每一图标用于表示一种对第一窗口模式进行的处理操作。
比如,电子设备检测到第一手势包含预设触发手势,那么可以触发该电子设备在触摸显示屏上显示至少一个图标,其中每一个图标用于表示一种对第一窗口模式进行的处理操作。
103、获取第一手势的手势结束位置。
比如,在显示至少一个图标后,电子设备还可以在检测到第一手势结束后,获取该第一手势的手势结束位置。。
在获取到第一手势的手势结束位置后,电子设备可以检测该第一手势的手势结束位置是否位于某一个图标的显示位置。
第一手势的手势结束位置位于某一个图标的显示位置可以是指:比如,某一个图标的显示位置为A位置,那么若第一手势的手势结束位置也为A位置,那么该第一手势的手势结束位置位于该图标的显示位置。例如,第一手势为对触摸显示屏的触摸操作,那么最后一次触摸操作的触摸位置为该第一手势的手势结束位置。例如,用户的手指在滑动到触摸显示屏的A位置后离开了该触摸显示屏,那么最后一次触摸操作的触摸位置为A位置。如果某一个图标的显示位置也是该A位置,那么第一手势的手势结束位置位于该图标的显示位置。
如果检测到第一手势的手势结束位置不位于任何一个图标的显示位置,那么电子设备可以执行其它操作。
如果检测到第一手势的手势结束位置位于某一个图标的显示位置,那么进入104的流程中。
104、若第一手势的手势结束位置位于图标的显示位置,则将对应的图标所表示的处理操作确定为目标处理操作。
105、执行目标处理操作。
比如,电子设备检测到第一手势的手势结束位置位于某一个图标的显示位置,那么电子设备可以将该对应的图标所表示的处理操作确定为目标处理操作,并执行该目标处理操作。
例如,当检测到第一手势包含预设触发手势时,电子设备在触摸显示屏上显示了三个图标,分别为第二图标、第三图标和第四图标,其中第二图标用于表示对第一窗口模式进行的第二处理操作,第三图标用于表示对第一窗口模式进行的第三处理操作,第四图标用于表示对第一窗口模式进行的第四处理操作。之后,电子设备获取到第一手势的手势结束位置,并检测到该第一手势的手势结束位置位于第二图标的显示位置,那么电子设备可以将该第二图标所表示的第二处理操作确定为目标处理操作,并执行该第二处理操作。
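对于上述由图标确定目标处理操作并执行的过程，下面给出一个示意性的Kotlin代码草图：用枚举把各图标与其所表示的处理操作对应起来，并在确定目标处理操作后执行。其中 WindowAction、Icon、executeTarget 等名称均为说明用的假设，仅表示一种可能的实现思路。

```kotlin
// 示意性草图：图标与处理操作的映射及目标处理操作的执行。

enum class WindowAction {
    SWITCH_TO_SECOND_MODE,   // 第二图标：切换至窗口面积更大的第二窗口模式
    SWITCH_TO_THIRD_MODE,    // 第三图标：切换至窗口面积更小的第三窗口模式
    CLOSE_FIRST_MODE         // 第四图标：关闭第一窗口模式
}

data class Icon(val id: String, val action: WindowAction)

// 将手势结束位置命中的图标所表示的处理操作确定为目标处理操作并执行
fun executeTarget(hitIcon: Icon?) {
    when (hitIcon?.action) {
        WindowAction.SWITCH_TO_SECOND_MODE -> println("切换至第二窗口模式（更大窗口）")
        WindowAction.SWITCH_TO_THIRD_MODE  -> println("切换至第三窗口模式（更小窗口）")
        WindowAction.CLOSE_FIRST_MODE      -> println("关闭第一窗口模式")
        null -> println("手势结束位置未命中任何图标，执行其它操作")
    }
}

fun main() {
    val second = Icon("R", WindowAction.SWITCH_TO_SECOND_MODE)
    executeTarget(second)   // 输出：切换至第二窗口模式（更大窗口）
    executeTarget(null)     // 输出：手势结束位置未命中任何图标，执行其它操作
}
```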
可以理解的是，在本申请实施例中，当以第一窗口模式显示应用的信息时，若电子设备接收到包含预设触发手势的第一手势，那么该电子设备可以显示至少一个图标，每一图标用于表示一种对第一窗口模式进行的处理操作。之后，电子设备可以获取第一手势的手势结束位置，并在检测到第一手势的手势结束位置位于某个图标的显示位置时，将该对应的图标所表示的处理操作确定为目标处理操作并执行该目标处理操作。由于本申请实施例可以在电子设备接收到包含触发手势在内的手势时，对第一窗口模式进行对应的处理操作，因此电子设备可以快速地对第一窗口模式进行相应的处理操作，即本申请实施例可以提高电子设备的可操作性。
请参阅图2,图2为本申请实施例提供的设备控制方法的另一流程示意图,流程可以包括:
201、当以第一窗口模式显示应用的信息时,接收第一手势的信息。
比如,当电子设备以第一窗口模式显示某一应用的信息时,该电子设备可以从用户处接收一个手势,例如该手势记为第一手势。即,该电子设备可以接收到第一手势的信息。
需要说明的是，窗口模式可以是指：电子设备在显示屏上创建一个窗口，并在该窗口中显示用户想要显示的信息（如用户指定的某个应用的运行界面），或者将当前正在运行的应用（如前台应用）的信息显示在该窗口中。
在接收第一手势的信息的过程中,电子设备可以检测该第一手势是否包含预设触发手势。
在一种实施方式中,预设触发手势可以是对第一窗口模式所对应的窗口进行的按压操作,该按压操作的按压时长可以大于或等于预设时长阈值,即预设触发手势可以是用户对第一窗口模式所对应的窗口进行的长按操作。例如,当用户对窗口进行的按压操作的按压时长大于或等于0.5秒或1秒时可以认为是长按操作。或者,该按压操作的按压压力值可以大于或等于预设压力阈值,即预设触发手势可以是用户对第一窗口模式所对应的窗口进行的重按操作。例如,当用户对窗口进行的按压操作的按压压力值大于或等于5N或4N时可以认为是重按操作,等等。
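下面给出一个示意性的Kotlin代码草图，按照按压时长阈值或按压压力阈值来判断是否接收到预设触发手势。其中 PressEvent、isTriggerGesture 以及具体的阈值数值均为说明用的假设，并非本申请实施例限定的实现。

```kotlin
// 示意性草图：根据按压时长（长按）或按压压力值（重按）判断预设触发手势。

data class PressEvent(
    val durationMillis: Long,   // 按压时长（毫秒）
    val pressure: Float         // 按压压力值（示例单位：N）
)

const val PRESS_DURATION_THRESHOLD_MS = 500L  // 预设时长阈值，例如0.5秒
const val PRESS_FORCE_THRESHOLD = 4.0f        // 预设压力阈值，例如4N

// 按压时长达到阈值（长按）或按压压力达到阈值（重按）即视为预设触发手势
fun isTriggerGesture(press: PressEvent): Boolean =
    press.durationMillis >= PRESS_DURATION_THRESHOLD_MS ||
    press.pressure >= PRESS_FORCE_THRESHOLD

fun main() {
    println(isTriggerGesture(PressEvent(durationMillis = 600, pressure = 1.0f))) // true：长按
    println(isTriggerGesture(PressEvent(durationMillis = 120, pressure = 5.0f))) // true：重按
    println(isTriggerGesture(PressEvent(durationMillis = 120, pressure = 1.0f))) // false
}
```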
在一种实施方式中,第一窗口模式所对应的窗口可以包括一预设控件。那么,预设触发手势包括在该预设控件所在的位置处进行的按压操作。比如,预设触发手势可以是在预设控件所在的位置处进行的长按操作或者重按操作。例如,如图3所示,图中所示的窗口为第一窗口模式所对应的窗口,在该窗口的上边缘的中间位置处设置有一预设控件,预设触发手势可以是长按该预设控件,比如当用户对该预设控件进行按压且按压时长大于或等于预设时长阈值时,电子设备可以确定出接收到预设触发手势。或者,预设触发手势也可以是重按该预设控件,比如当用户对该预设控件进行按压且对触摸显示屏施加的按压压力值大于或等于预设压力阈值时,电子设备可以确定出接收到预设触发手势,等等。
需要说明的是,本实施例中,第一手势是一个完整的、连贯的手势。以第一手势为对触摸显示屏的触摸操作为例,第一手势是一个完整的、连贯的手势可以是指:在做出第一手势的过程中用户的手指始终保持对触摸显示屏的接触而没有离开过触摸显示屏。
那么,第一手势包含预设触发手势可以是指:比如将第一手势分解为多段手势来看,那么第一手势的其中一段手势与预设触发手势匹配。
例如,在一种实施方式中,将第一手势分解为第一段手势和第二段手势,该第一段手势发生于该第二段手势之前,且该第一段手势和该第二段手势具有连续的手势轨迹。
那么,第一手势包含预设触发手势,可以包括:第一段手势与预设触发手势相匹配。
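下面给出一个示意性的Kotlin代码草图，以首次明显位移为界，把连续的触摸采样分解为第一段手势（按压阶段）与第二段手势（滑动阶段），并据此判断第一手势是否包含预设触发手势。其中 TouchSample、splitGesture、containsTriggerGesture 以及各阈值均为说明用的假设。

```kotlin
// 示意性草图：把完整、连贯的第一手势分解为第一段手势与第二段手势，
// 并以"第一段手势与预设触发手势相匹配"来判断是否包含预设触发手势。

import kotlin.math.hypot

data class TouchSample(val x: Float, val y: Float, val timeMillis: Long)

const val MOVE_SLOP = 10f          // 判定开始滑动的位移阈值（示例值）
const val LONG_PRESS_MS = 500L     // 预设时长阈值（示例值）

// 以首次明显位移为界，将连续的触摸采样分成第一段手势和第二段手势
fun splitGesture(samples: List<TouchSample>): Pair<List<TouchSample>, List<TouchSample>> {
    val start = samples.first()
    val splitIndex = samples.indexOfFirst {
        hypot((it.x - start.x).toDouble(), (it.y - start.y).toDouble()) > MOVE_SLOP
    }
    return if (splitIndex < 0) samples to emptyList()
    else samples.subList(0, splitIndex) to samples.subList(splitIndex, samples.size)
}

// 第一手势包含预设触发手势：其第一段手势（按压阶段）的持续时长达到长按阈值
fun containsTriggerGesture(samples: List<TouchSample>): Boolean {
    if (samples.isEmpty()) return false
    val (firstSegment, _) = splitGesture(samples)
    if (firstSegment.isEmpty()) return false
    return firstSegment.last().timeMillis - firstSegment.first().timeMillis >= LONG_PRESS_MS
}

fun main() {
    val samples = listOf(
        TouchSample(100f, 200f, 0),
        TouchSample(101f, 201f, 600),   // 按压阶段：位置几乎不变，持续600ms
        TouchSample(150f, 150f, 700),   // 滑动阶段开始
        TouchSample(250f, 80f, 900)
    )
    println(containsTriggerGesture(samples)) // true
}
```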
如果第一手势不包含预设触发手势,那么电子设备可以执行其它操作。
如果第一手势包含预设触发手势，那么可以进入202的流程中。
202、若第一手势包含预设触发手势，则显示第二图标、第三图标、第四图标，其中该第二图标表示的处理操作包括切换至第二窗口模式，该第二窗口模式的窗口面积大于第一窗口模式的窗口面积；该第三图标表示的处理操作包括切换至第三窗口模式，该第三窗口模式的窗口面积小于第一窗口模式的窗口面积；该第四图标表示的处理操作包括关闭第一窗口模式。
比如,电子设备检测到第一手势包含预设触发手势,那么可以触发该电子设备在触摸显示屏上显示至少一个图标,其中每一个图标用于表示对第一窗口模式进行的一种处理操作。
比如,本实施例中,电子设备可以显示第二图标、第三图标和第四图标。其中,第二图标表示的处理操作可以为切换至第二窗口模式,该第二窗口模式的窗口面积大于第一窗口模式的窗口面积。也即,从第一窗口模式切换至第二窗口模式表示由当前窗口切换至更大的窗口。第三图标表示的处理操作可以为切换至第三窗口模式,该第三窗口模式的窗口面积小于第一窗口模式的窗口面积。也即,从第一窗口模式切换至第三窗口模式表示由当前窗口切换至更小的窗口。该第四图标表示的处理操作可以为关闭第一窗口模式。
例如,如图4所示,当以第一窗口模式显示应用的信息时,电子设备接收到的第一手势包含预设触发手势,此时电子设备可以在显示屏上显示第二图标R、第三图标S、第四图标T。其中,第二图标R表示的处理操作可以为切换至第二窗口模式,该第二窗口模式的窗口面积大于第一窗口模式的窗口面积。第三图标S表示的处理操作可以为切换至第三窗口模式,该第三窗口模式的窗口面积小于第一窗口模式的窗口面积。该第四图标T表示的处理操作可以为关闭第一窗口模式。
在本实施例中,第一窗口模式可以包括在窗口中显示应用的运行界面,第二窗口模式可以包括全屏显示该应用的运行界面,第三窗口模式可以包括在窗口中显示该应用的定制化信息。
在一种实施方式中，第三窗口模式下显示的应用的定制化信息可以是应用的最新通知信息或者其它信息。例如，如果应用是即时通信应用，那么第三窗口模式下显示的可以是该即时通信应用的最新通知信息。或者，如果应用是地图导航类应用，那么第三窗口模式下显示的可以是用户当前所在的位置信息等。也即，定制化信息可以根据应用的类型或者用户的需求来决定具体显示应用的何种信息，本申请实施例对此不做具体限定。
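下面给出一个示意性的Kotlin代码草图，按应用类型决定第三窗口模式下显示的定制化信息。其中 AppCategory、customizedInfo 等名称与分类方式均为说明用的假设，实际可按应用类型或用户需求扩展。

```kotlin
// 示意性草图：第三窗口模式下按应用类型决定显示的定制化信息。

enum class AppCategory { INSTANT_MESSAGING, MAP_NAVIGATION, OTHER }

// 按应用类型返回第三窗口模式下要显示的定制化信息
fun customizedInfo(category: AppCategory): String = when (category) {
    AppCategory.INSTANT_MESSAGING -> "该即时通信应用的最新通知信息"
    AppCategory.MAP_NAVIGATION    -> "用户当前所在的位置信息"
    AppCategory.OTHER             -> "该应用预先配置的其它信息"
}

fun main() {
    println(customizedInfo(AppCategory.INSTANT_MESSAGING)) // 输出：该即时通信应用的最新通知信息
    println(customizedInfo(AppCategory.MAP_NAVIGATION))    // 输出：用户当前所在的位置信息
}
```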
203、在第一手势结束前,窗口模式下的窗口跟随该第一手势的轨迹进行移动。
比如,本实施例中,在第一手势结束前,窗口模式下的窗口可以跟随该第一手势的手势轨迹进行移动。
例如，如图5所示，电子设备以第一窗口模式显示网约车应用Y的信息。第一窗口模式所对应的窗口包括一预设控件，该预设控件的位置为B。用户对预设控件进行了长按操作，电子设备确定出接收到预设触发手势，并在显示屏上显示了第二图标R、第三图标S和第四图标T。之后，用户的手指保持对触摸显示屏的接触，并从位置B滑动到位置C，滑动轨迹如图5中的B位置和C位置之间的曲线。在用户手指从位置B滑动到位置C的过程中，电子设备可以控制窗口跟随手势轨迹同步从B位置移动到C位置。之后，例如用户的手指仍然保持对触摸显示屏的接触而没有离开该触摸显示屏，并从位置C滑动到位置D，滑动轨迹如图5中的C位置和D位置之间的曲线。在用户手指从位置C滑动到位置D的过程中，电子设备可以控制窗口跟随手势轨迹同步从C位置移动到D位置。
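下面给出一个示意性的Kotlin代码草图，表示在第一手势结束前窗口跟随手势轨迹同步移动的处理思路。其中 FloatingWindow、onGestureMove 等名称均为说明用的假设，实际实现中通常还需结合具体的窗口系统接口。

```kotlin
// 示意性草图：第一手势结束前，窗口跟随手势轨迹同步移动。

data class Position(val x: Float, val y: Float)

class FloatingWindow(var center: Position) {
    // 把窗口中心移动到新的位置（实际实现中通常还会做边界裁剪等处理）
    fun moveTo(newCenter: Position) {
        center = newCenter
        println("窗口移动到 (${newCenter.x}, ${newCenter.y})")
    }
}

// 第一手势尚未结束时，每收到一个新的手势采样点就让窗口跟随移动
fun onGestureMove(window: FloatingWindow, trackPoints: List<Position>) {
    trackPoints.forEach { window.moveTo(it) }
}

fun main() {
    val window = FloatingWindow(Position(120f, 300f))  // 位置B附近
    val track = listOf(Position(160f, 240f), Position(220f, 160f), Position(250f, 80f)) // B→C→D
    onGestureMove(window, track)
}
```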
204、获取第一手势的手势结束位置。
205、若检测到第一手势的手势结束位置位于图标的显示位置,则将对应的图标所表示的处理操作确定为目标处理操作。
206、执行目标处理操作。
其中，204、205、206可以包括如下内容：
比如,在显示第二图标、第三图标、第四图标后,在用户的手指未离开触摸显示屏的情况下,用户可以继续对该触摸显示屏进行触摸手势的操作,即电子设备可以继续从用户处接收第一手势的信息。
在继续接收第一手势的信息的过程中,电子设备可以检测该第一手势是否结束。
需要说明的是,在第一手势为对触摸显示屏进行的触摸操作的情况下,第一手势结束可以是指电子设备检测到用户手指离开触摸显示屏的事件。比如,如图5所示,用户手指在触摸显示屏上从位置B开始滑动,经过位置C,在滑动到位置D时用户手指离开触摸显示屏,那么在检测到用户手指从位置D离开触摸显示屏时,电子设备检测到第一手势结束。
若检测到第一手势尚未结束,那么电子设备可以控制窗口模式下的窗口跟随连续的手势轨迹同步移动。
若检测到第一手势结束,则电子设备可以获取第一手势的手势结束位置,并检测该第一手势的手势结束位置是否位于某一图标的显示位置。
第一手势的手势结束位置位于某一个图标的显示位置可以是指:比如,某一个图标的显示位置为A位置,那么若第一手势的手势结束位置也为A位置,那么该第一手势的手势结束位置位于该图标的显示位置。例如,第一手势为对触摸显示屏的触摸操作,那么最后一次触摸操作的触摸位置为该第一手势的手势结束位置。例如,用户的手指在滑动到触摸显示屏的A位置后离开了该触摸显示屏,那么最后一次触摸操作的触摸位置为A位置。如果某一个图标的显示位置也是该A位置,那么第一手势的手势结束位置位于该图标的显示位置。
需要说明的是,在将第一手势分解为第一段手势和第二段手势来看的情况下,第一手势的手势结束位置也即第二段手势的手势结束位置。
如果检测到第一手势的手势结束位置不位于任何一个图标的显示位置,那么电子设备可以执行其它操作。
如果检测到第一手势的手势结束位置位于某一个图标的显示位置,那么电子设备可以将该对应的图标所表示的处理操作确定为目标处理操作,并执行该目标处理操作。
例如,如图5所示,用户的手指在滑动到位置D后离开了触摸显示屏,那么电子设备可以检测到第一手势结束,该第一手势的手势结束位置为D。由于位置D位于第三图标S的显示位置,因此电子设备可以将第三图标S所表示的处理操作确定为目标处理操作,即电子设备可以将切换至第三窗口模式确定为目标处理操作,并执行该切换至第三窗口模式的操作。例如,由于第三窗口模式的窗口面积小于第一窗口模式的窗口面积,因此电子设备可以用一个更小的窗口来显示网约车应用Y的信息。例如,如图6所示,切换至第三窗口模式为电子设备在显示屏的右上角以一个较小的窗口来显示网约车应用Y的最新通知信息。
在一种实施方式中,本申请实施例还可以包括如下流程:
在第一手势结束前，若手势暂停位置位于图标的显示位置，则显示对应的图标所表示的处理操作的预览效果。
比如,如图5所示,用户的手指从位置B滑动到位置C再滑动到位置D。例如,用户手指滑动到位置D的时刻为t1,此时用户的手指并没有马上离开触摸显示屏。即,此时用户的手指停留在位置D,那么第一手势的手势暂停位置为位置D,由于位置D位于第三图标S的显示位置,在这种情况下,电子设备可以显示切换至第三窗口模式后的窗口的预览效果。例如,电子设备可以显示切换至第三窗口模式后的窗口的面积大小的预览效果,可以如图7所示。
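下面给出一个示意性的Kotlin代码草图，通过最近一段采样点的停留时长与位移来判断手势是否暂停，并在暂停位置命中图标时显示对应处理操作的预览效果。其中 Sample、isPaused、maybeShowPreview 以及各阈值均为说明用的假设。

```kotlin
// 示意性草图：手势在图标显示位置上暂停时，显示对应处理操作的预览效果。

import kotlin.math.hypot

data class Sample(val x: Float, val y: Float, val timeMillis: Long)

const val PAUSE_MS = 300L        // 判定手势暂停的停留时长阈值（示例值）
const val PAUSE_SLOP = 8f        // 判定手势暂停的位移阈值（示例值）

// 末尾连续的、与最后位置足够接近的采样持续时间达到阈值，则认为手势处于暂停状态
fun isPaused(recent: List<Sample>): Boolean {
    if (recent.isEmpty()) return false
    val last = recent.last()
    val stationary = recent.takeLastWhile {
        hypot((it.x - last.x).toDouble(), (it.y - last.y).toDouble()) < PAUSE_SLOP
    }
    return last.timeMillis - stationary.first().timeMillis >= PAUSE_MS
}

// 手势暂停且暂停位置命中图标时，显示该图标所表示操作的预览效果
fun maybeShowPreview(recent: List<Sample>, hitIconOperation: String?) {
    if (isPaused(recent) && hitIconOperation != null) {
        println("显示预览效果：$hitIconOperation")
    }
}

fun main() {
    val recent = listOf(
        Sample(250f, 80f, 1000),
        Sample(251f, 81f, 1200),
        Sample(250f, 80f, 1400)   // 在位置D附近停留约400ms
    )
    maybeShowPreview(recent, "切换至第三窗口模式") // 输出：显示预览效果：切换至第三窗口模式
}
```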
请参阅图8至图13，图8至图13为本申请实施例提供的设备控制方法的场景示意图。
比如,电子设备当前以第一窗口模式显示网约车应用Y的运行界面。本实施例中可以将第一窗口模式命名为小窗模式。即,如图3所示,电子设备以小窗模式显示网约车应用Y的运行界面。在小窗的上边缘中间位置处设有一预设控件。用户手指长按该预设控件。此时,电子设备可以确定出接收到预设触发手势。
在接收到预设触发手势后,电子设备可以在小窗的上方显示三个图标,例如,如图4所示,显示的3个图标分别为图标R、图标S和图标T。其中,图标R表示的处理操作可以为切换至全屏显示应用的运行界面(全屏模式可以认为是一种特殊的窗口模式)。图标S表示的处理操作可以为切换至闪窗模式,该闪窗模式下的窗口面积小于小窗模式下的窗口面积。图标T表示的处理操作可以为关闭小窗模式。
在显示图标R、S、T后，电子设备可以继续从用户处接收手势。例如，用户手指在长按预设控件所在的位置B后并未离开触摸显示屏而是继续滑动到位置C，如图8所示。在滑动到位置C后，用户的手指离开触摸显示屏。此时，电子设备检测到从位置B到位置C的滑动轨迹是连续的，并且位置C位于图标R的显示位置。在这种情况下，电子设备可以切换至以全屏显示网约车应用Y的运行界面，如图9所示。
需要说明的是,如图8所示,第一手势为从位置B开始最后结束于位置C的滑动轨迹连续的手势。本实施例中第一手势可以分解为第一段手势和第二段手势来看。其中,第一段手势可以是在位置B处进行的长按操作,第二段手势可以是从位置B开始至位置C的这段手势,如图8所示,第一段手势和第二段手势具有连续的滑动轨迹,即用户在做出第一段手势和第二段手势的过程中,用户手指始终未离开过触摸显示屏。
又如,在显示图标R、S、T后,电子设备可以继续从用户处接收手势。例如,用户手指在长按预设控件所在的位置B后并未离开触摸显示屏而是继续滑动到位置E,轨迹可以如图10所示。在滑动到位置E后,用户的手指离开触摸显示屏。此时,电子设备检测到从位置B到位置E的滑动轨迹是连续的,并且位置E位于图标S的显示位置。在这种情况下,电子设备可以切换至以闪窗模式显示网约车应用Y的最新通知信息,如图6所示。
需要说明的是,如图10所示,第一手势为从位置B开始最后结束于位置E的滑动轨迹连续的手势。本实施例中第一手势可以分解为第一段手势和第二段手势来看。其中,第一段手势可以是在位置B处进行的长按操作,第二段手势可以是从位置B开始至位置E的这段手势,如图10所示,第一段手势和第二段手势具有连续的滑动轨迹,即用户在做出第一段手势和第二段手势的过程中,用户手指始终未离开过触摸显示屏。
再如，在显示图标R、S、T后，电子设备可以继续从用户处接收手势。例如，用户手指在长按预设控件所在的位置B后并未离开触摸显示屏而是继续滑动到位置F，滑动轨迹如图11所示。在滑动到位置F后，用户的手指离开触摸显示屏。此时，电子设备检测到从位置B到位置F的滑动轨迹是连续的，并且位置F位于图标T的显示位置。在这种情况下，电子设备可以关闭小窗模式，并返回桌面，如图12所示。
需要说明的是，如图11所示，第一手势为从位置B开始最后结束于位置F的滑动轨迹连续的手势。本实施例中第一手势可以分解为第一段手势和第二段手势来看。其中，第一段手势可以是在位置B处进行的长按操作，第二段手势可以是从位置B开始至位置F的这段手势，如图11所示，第一段手势和第二段手势具有连续的滑动轨迹，即用户在做出第一段手势和第二段手势的过程中，用户手指始终未离开过触摸显示屏。
可以理解的是，本申请实施例中，如图8、图10和图11所示，电子设备从小窗模式进入全屏模式（对应于从位置B开始最后结束于位置C的手势操作）或从小窗模式进入闪窗模式（对应于从位置B开始最后结束于位置E的手势操作）或者关闭小窗模式的整个过程是一气呵成的，即只需一步操作即可切换至其它窗口模式或关闭窗口模式，操作效率高，用户体验好。
在另一种实施方式中,显示屏上显示的各个图标的位置可以由用户进行调整。比如,默认情况下,从左至右依次显示图标R、S、T。用户也可以对图标的位置进行调整,例如调整后的从左至右依次显示的图标为R、T、S,等等。当然,电子设备也可以利用机器学习的方式学习得到用户的用户习惯,并根据用户的习惯对图标的显示位置进行调整,等等。
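下面给出一个示意性的Kotlin代码草图，按照统计得到的各处理操作使用频次对图标的显示顺序进行调整。其中 DisplayIcon、reorderIcons 以及使用频次的统计方式均为说明用的假设，仅示意根据用户习惯调整图标位置的一种思路。

```kotlin
// 示意性草图：根据用户对各处理操作的使用频次调整图标的显示顺序。

data class DisplayIcon(val id: String, val operation: String)

// 使用次数多的操作对应的图标排在前面；次数相同则保持默认顺序
fun reorderIcons(defaultOrder: List<DisplayIcon>, usageCount: Map<String, Int>): List<DisplayIcon> =
    defaultOrder.sortedByDescending { usageCount[it.id] ?: 0 }

fun main() {
    val defaultOrder = listOf(
        DisplayIcon("R", "切换至第二窗口模式"),
        DisplayIcon("S", "切换至第三窗口模式"),
        DisplayIcon("T", "关闭第一窗口模式")
    )
    // 假设统计得到用户更常使用关闭操作
    val usage = mapOf("R" to 3, "S" to 1, "T" to 9)
    println(reorderIcons(defaultOrder, usage).map { it.id })  // 输出：[T, R, S]
}
```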
在另一种实施方式中,本申请实施例还可以包括如下流程:
当以第一窗口模式显示应用的信息时,接收第二手势的信息,该第二手势包括在选中第一窗口模式对应的窗口后将被选中的窗口移动到显示屏边缘;
根据该第二手势的信息,在显示屏的边缘隐藏第一窗口模式对应的窗口,并以图标的方式表示隐藏后的窗口所在的位置。
比如,当以第一窗口模式显示应用的信息时,用户也可以在选中窗口(例如按住小窗边缘)后将其拖到屏幕边缘让该窗口在屏幕边缘收起,并以一个图标(如圆形图标)来显示收起后的窗口(即隐藏后的窗口)所在的位置,整个过程的示意图可以如图13所示。当用户点击圆形图标时,可以恢复到以第一窗口模式显示应用的信息。
或者,当以第一窗口模式显示应用的信息时,用户也可以在选中窗口(例如按住小窗边缘)后将其拖到屏幕边缘后停留一定的时长(例如0.5秒或1秒等)让该窗口在屏幕边缘收起,并以一个图标(如圆形图标)来显示收起后的窗口所在的位置。当用户点击圆形图标时,可以恢复到以第一窗口模式显示应用的信息。
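下面给出一个示意性的Kotlin代码草图，表示第二手势把窗口拖到显示屏边缘后收起为图标、点击图标后恢复窗口的处理思路。其中 Screen、WindowState、onWindowDragEnd、onIconTapped 以及边缘距离阈值均为说明用的假设。

```kotlin
// 示意性草图：窗口被拖到显示屏边缘后在边缘收起为图标，点击图标恢复窗口。

data class Screen(val width: Float, val height: Float)

const val EDGE_MARGIN = 24f   // 判定到达屏幕边缘的距离阈值（示例值）

sealed class WindowState {
    data class Shown(val centerX: Float, val centerY: Float) : WindowState()    // 正常显示
    data class HiddenAsIcon(val iconX: Float, val iconY: Float) : WindowState() // 在边缘收起为图标
}

// 第二手势结束时：若被选中的窗口已被移动到显示屏边缘，则在边缘收起并以图标表示其位置
fun onWindowDragEnd(screen: Screen, dropX: Float, dropY: Float, current: WindowState): WindowState {
    val nearEdge = dropX <= EDGE_MARGIN || dropX >= screen.width - EDGE_MARGIN ||
                   dropY <= EDGE_MARGIN || dropY >= screen.height - EDGE_MARGIN
    return if (nearEdge) WindowState.HiddenAsIcon(dropX, dropY) else current
}

// 用户点击图标时恢复以第一窗口模式显示应用的信息
fun onIconTapped(state: WindowState): WindowState =
    if (state is WindowState.HiddenAsIcon) WindowState.Shown(state.iconX, state.iconY) else state

fun main() {
    val screen = Screen(1080f, 2340f)
    var state: WindowState = WindowState.Shown(540f, 1170f)
    state = onWindowDragEnd(screen, dropX = 1075f, dropY = 800f, current = state)
    println(state)               // HiddenAsIcon(...)：窗口在右边缘收起为图标
    println(onIconTapped(state)) // Shown(...)：点击图标后恢复窗口
}
```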
请参阅图14,图14为本申请实施例提供的设备控制装置的结构示意图。设备控制装置300可以包括:第一接收模块301,显示模块302,第二接收模块303,确定模块304,执行模块305。
第一接收模块301,用于当以第一窗口模式显示应用的信息时,接收第一手势的信息。
显示模块302,用于若所述第一手势包含预设触发手势,则显示至少一个图标,每一所述图标用于表示一种对所述第一窗口模式进行的处理操作。
第二接收模块303,用于获取所述第一手势的手势结束位置。
确定模块304,用于若所述第一手势的手势结束位置位于图标的显示位置,则将对应的图标所表示的处理操作确定为目标处理操作。
执行模块305,用于执行所述目标处理操作。
在一种实施方式中,所述第一手势包括第一段手势和第二段手势,所述第一段手势发生于所述第二段手势之前,且所述第一段手势和所述第二段手势具有连续的手势轨迹。
所述第一手势包含预设触发手势,包括:所述第一段手势与所述预设触发手势相匹配。
在一种实施方式中,所述预设触发手势包括对所述第一窗口模式所对应的窗口进行的按压操作,所述按压操作的按压时长大于或等于预设时长阈值,或者所述按压操作的按压压力值大于或等于预设压力阈值。
在一种实施方式中,所述第一窗口模式所对应的窗口包括预设控件。所述预设触发手势包括在所述预设控件所在的位置处进行的按压操作。
在一种实施方式中,显示的图标包括第二图标,所述第二图标表示的处理操作包括切换至第二窗口模式,所述第二窗口模式的窗口面积大于所述第一窗口模式的窗口面积。
在一种实施方式中,显示的图标还包括第三图标,所述第三图标表示的处理操作包括切换至第三窗口模式,所述第三窗口模式的窗口面积小于所述第一窗口模式的窗口面积。
在一种实施方式中,显示的图标还包括第四图标,所述第四图标表示的处理操作包括关闭所述第一窗口模式。
在一种实施方式中,所述第一窗口模式包括在窗口中显示所述应用的运行界面,所述第二窗口模式包括全屏显示所述应用的运行界面,所述第三窗口模式包括在窗口中显示所述应用的定制化信息。
在一种实施方式中,所述执行模块305还可以用于:在所述第一手势结束前,若手势暂停位置位于图标的显示位置,则显示对应的图标所表示的处理操作的预览效果。
在一种实施方式中,所述执行模块305还可以用于:在关闭所述第一窗口模式后返回桌面。
在一种实施方式中,所述执行模块305还可以用于:在所述第一手势结束前,窗口模式下的窗口跟随所述第一手势的轨迹进行移动。
在一种实施方式中,所述执行模块305还可以用于:当以第一窗口模式显示应用的信息时,接收第二手势的信息,该第二手势包括选中第一窗口模式对应的窗口并将被选中的窗口移动到显示屏边缘;根据该第二手势的信息,在显示屏的边缘隐藏第一窗口模式对应的窗口,并以图标的方式表示隐藏后的窗口所在的位置。
本申请实施例提供一种计算机可读的存储介质,其上存储有计算机程序,当所述计算机程序在计算机上执行时,使得所述计算机执行如本实施例提供的方法中的流程。
本申请实施例还提供一种电子设备,包括存储器,处理器,所述处理器通过调用所述存储器中存储的计算机程序,用于执行本实施例提供的设备控制方法中的流程。
例如，上述电子设备可以是诸如平板电脑或者智能手机等移动终端。请参阅图15，图15为本申请实施例提供的电子设备的结构示意图。
该电子设备400可以包括触摸显示屏401、存储器402、处理器403等部件。本领域技术人员可以理解,图15中示出的电子设备结构并不构成对电子设备的限定,可以包括比图示更多或更少的部件,或者组合某些部件,或者不同的部件布置。
触摸显示屏401可以用于显示诸如文字、图像等信息,还可以用于接收用户的触摸操作等。
存储器402可用于存储应用程序和数据。存储器402存储的应用程序中包含有可执行代码。应用程序可以组成各种功能模块。处理器403通过运行存储在存储器402的应用程序,从而执行各种功能应用以及数据处理。
处理器403是电子设备的控制中心,利用各种接口和线路连接整个电子设备的各个部分,通过运行或执行存储在存储器402内的应用程序,以及调用存储在存储器402内的数据,执行电子设备的各种功能和处理数据,从而对电子设备进行整体监控。
在本实施例中,电子设备中的处理器403会按照如下的指令,将一个或一个以上的应用程序的进程对应的可执行代码加载到存储器402中,并由处理器403来运行存储在存储器402中的应用程序,从而执行:
当以第一窗口模式显示应用的信息时,接收第一手势的信息;
若所述第一手势包含预设触发手势,则显示至少一个图标,每一所述图标用于表示一种对所述第一窗口模式进行的处理操作;
获取所述第一手势的手势结束位置;
若所述第一手势的手势结束位置位于图标的显示位置,则将对应的图标所表示的处理操作确定为目标处理操作;
执行所述目标处理操作。
请参阅图16,电子设备400可以包括触摸显示屏401、存储器402、处理器403、电池404、麦克风405、扬声器406等部件。
触摸显示屏401可以用于显示诸如文字、图像等信息,还可以用于接收用户的触摸操作等。
存储器402可用于存储应用程序和数据。存储器402存储的应用程序中包含有可执行代码。应用程序可以组成各种功能模块。处理器403通过运行存储在存储器402的应用程序,从而执行各种功能应用以及数据处理。
处理器403是电子设备的控制中心,利用各种接口和线路连接整个电子设备的各个部分,通过运行或执行存储在存储器402内的应用程序,以及调用存储在存储器402内的数据,执行电子设备的各种功能和处理数据,从而对电子设备进行整体监控。
电池404可用于为电子设备的各个部件和模块提供电力支持,从而保证各个部件和模块的正常运行。
麦克风405可用于采集周围环境中的声音信号,例如采集用户的语音。
扬声器406可以用于播放声音信号。
在本实施例中,电子设备中的处理器403会按照如下的指令,将一个或一个以上的应用程序的进程对应的可执行代码加载到存储器402中,并由处理器403来运行存储在存储器402中的应用程序,从而执行:
当以第一窗口模式显示应用的信息时,接收第一手势的信息;
若所述第一手势包含预设触发手势，则显示至少一个图标，每一所述图标用于表示一种对所述第一窗口模式进行的处理操作；
获取所述第一手势的手势结束位置;
若所述第一手势的手势结束位置位于图标的显示位置,则将对应的图标所表示的处理操作确定为目标处理操作;
执行所述目标处理操作。
在一种实施方式中,所述第一手势包括第一段手势和第二段手势,所述第一段手势发生于所述第二段手势之前,且所述第一段手势和所述第二段手势具有连续的手势轨迹。
那么,所述第一手势包含预设触发手势,包括:所述第一段手势与所述预设触发手势相匹配。
在一种实施方式中,所述预设触发手势包括对所述第一窗口模式所对应的窗口进行的按压操作,所述按压操作的按压时长大于或等于预设时长阈值,或者所述按压操作的按压压力值大于或等于预设压力阈值。
在一种实施方式中,所述第一窗口模式所对应的窗口包括预设控件;所述预设触发手势包括在所述预设控件所在的位置处进行的按压操作。
在一种实施方式中,显示的图标包括第二图标,所述第二图标表示的处理操作包括切换至第二窗口模式,所述第二窗口模式的窗口面积大于所述第一窗口模式的窗口面积。
在一种实施方式中,显示的图标还包括第三图标,所述第三图标表示的处理操作包括切换至第三窗口模式,所述第三窗口模式的窗口面积小于所述第一窗口模式的窗口面积。
在一种实施方式中,显示的图标还包括第四图标,所述第四图标表示的处理操作包括关闭所述第一窗口模式。
在一种实施方式中,所述第一窗口模式包括在窗口中显示所述应用的运行界面,所述第二窗口模式包括全屏显示所述应用的运行界面,所述第三窗口模式包括在窗口中显示所述应用的定制化信息。
在一种实施方式中,所述处理器403还可以执行:在所述第一手势结束前,若手势暂停位置位于图标的显示位置,则显示对应的图标所表示的处理操作的预览效果。
在一种实施方式中,所述处理器403还可以执行:在关闭所述第一窗口模式后返回桌面。
在一种实施方式中,所述处理器403还可以执行:在所述第一手势结束前,窗口模式下的窗口跟随所述第一手势的轨迹进行移动。
在一种实施方式中,所述处理器403还可以执行:当以第一窗口模式显示应用的信息时,接收第二手势的信息,该第二手势包括选中第一窗口模式对应的窗口并将被选中的窗口移动到显示屏边缘;根据该第二手势的信息,在显示屏的边缘隐藏第一窗口模式对应的窗口,并以图标的方式表示隐藏后的窗口所在的位置。
在上述实施例中,对各个实施例的描述都各有侧重,某个实施例中没有详述的部分,可以参见上文针对设备控制方法的详细描述,此处不再赘述。
本申请实施例提供的所述设备控制装置与上文实施例中的设备控制方法属于同一构思,在所述设备控制装置上可以运行所述设备控制方法实施例中提供的任一方法,其具体实现过程详见所述设备控制方法实施例,此处不再赘述。
需要说明的是，对本申请实施例所述设备控制方法而言，本领域普通技术人员可以理解实现本申请实施例所述设备控制方法的全部或部分流程，是可以通过计算机程序来控制相关的硬件来完成，所述计算机程序可存储于一计算机可读取存储介质中，如存储在存储器中，并被至少一个处理器执行，在执行过程中可包括如所述设备控制方法的实施例的流程。其中，所述的存储介质可为磁碟、光盘、只读存储器(ROM,Read Only Memory)、随机存取记忆体(RAM,Random Access Memory)等。
对本申请实施例的所述设备控制装置而言,其各功能模块可以集成在一个处理芯片中,也可以是各个模块单独物理存在,也可以两个或两个以上模块集成在一个模块中。上述集成的模块既可以采用硬件的形式实现,也可以采用软件功能模块的形式实现。所述集成的模块如果以软件功能模块的形式实现并作为独立的产品销售或使用时,也可以存储在一个计算机可读取存储介质中,所述存储介质譬如为只读存储器,磁盘或光盘等。
以上对本申请实施例所提供的一种设备控制方法、装置、存储介质以及电子设备进行了详细介绍,本文中应用了具体个例对本申请的原理及实施方式进行了阐述,以上实施例的说明只是用于帮助理解本申请的方法及其核心思想;同时,对于本领域的技术人员,依据本申请的思想,在具体实施方式及应用范围上均会有改变之处,综上所述,本说明书内容不应理解为对本申请的限制。

Claims (20)

  1. 一种设备控制方法,其中,所述方法包括:
    当以第一窗口模式显示应用的信息时,接收第一手势的信息;
    若所述第一手势包含预设触发手势,则显示至少一个图标,每一所述图标用于表示一种对所述第一窗口模式进行的处理操作;
    获取所述第一手势的手势结束位置;
    若所述第一手势的手势结束位置位于图标的显示位置,则将对应的图标所表示的处理操作确定为目标处理操作;
    执行所述目标处理操作。
  2. 根据权利要求1所述的设备控制方法,其中,所述第一手势包括第一段手势和第二段手势,所述第一段手势发生于所述第二段手势之前,且所述第一段手势和所述第二段手势具有连续的手势轨迹,所述第一手势包含预设触发手势,包括:
    所述第一段手势与所述预设触发手势相匹配。
  3. 根据权利要求1所述的设备控制方法,其中,所述预设触发手势包括对所述第一窗口模式所对应的窗口进行的按压操作,所述按压操作的按压时长大于或等于预设时长阈值,或者所述按压操作的按压压力值大于或等于预设压力阈值。
  4. 根据权利要求3所述的设备控制方法,其中,所述第一窗口模式所对应的窗口包括预设控件,所述预设触发手势包括在所述预设控件所在的位置处进行的按压操作。
  5. 根据权利要求1所述的设备控制方法,其中,显示的图标包括第二图标,所述第二图标表示的处理操作包括切换至第二窗口模式,所述第二窗口模式的窗口面积大于所述第一窗口模式的窗口面积。
  6. 根据权利要求5所述的设备控制方法,其中,显示的图标还包括第三图标,所述第三图标表示的处理操作包括切换至第三窗口模式,所述第三窗口模式的窗口面积小于所述第一窗口模式的窗口面积。
  7. 根据权利要求6所述的设备控制方法,其中,显示的图标还包括第四图标,所述第四图标表示的处理操作包括关闭所述第一窗口模式。
  8. 根据权利要求6所述的设备控制方法,其中,所述第一窗口模式包括在窗口中显示所述应用的运行界面,所述第二窗口模式包括全屏显示所述应用的运行界面,所述第三窗口模式包括在窗口中显示所述应用的定制化信息。
  9. 根据权利要求1所述的设备控制方法,其中,所述方法还包括:
    在所述第一手势结束前,若手势暂停位置位于图标的显示位置,则显示对应的图标所表示的处理操作的预览效果。
  10. 根据权利要求7所述的设备控制方法,其中,所述方法还包括:
    在关闭所述第一窗口模式后返回桌面。
  11. 根据权利要求1所述的设备控制方法,其中,所述方法还包括:
    在所述第一手势结束前,窗口模式下的窗口跟随所述第一手势的轨迹进行移动。
  12. 根据权利要求1所述的设备控制方法,其中,所述方法还包括:
    当以第一窗口模式显示应用的信息时,接收第二手势的信息,所述第二手势包括选中所述第一窗口模式对应的窗口并将被选中的窗口移动到显示屏边缘;
    根据所述第二手势的信息，在所述显示屏的边缘隐藏所述第一窗口模式对应的窗口，并以图标的方式表示隐藏后的窗口所在的位置。
  13. 一种设备控制装置,其中,所述装置包括:
    第一接收模块,用于当以第一窗口模式显示应用的信息时,接收第一手势的信息;
    显示模块,用于若所述第一手势包含预设触发手势,则显示至少一个图标,每一所述图标用于表示一种对所述第一窗口模式进行的处理操作;
    第二接收模块,用于获取所述第一手势的手势结束位置;
    确定模块,用于若所述第一手势的手势结束位置位于图标的显示位置,则将对应的图标所表示的处理操作确定为目标处理操作;
    执行模块,用于执行所述目标处理操作。
  14. 一种计算机可读的存储介质,其上存储有计算机程序,其中,当所述计算机程序在计算机上运行时,使得所述计算机执行:
    当以第一窗口模式显示应用的信息时,接收第一手势的信息;
    若所述第一手势包含预设触发手势,则显示至少一个图标,每一所述图标用于表示一种对所述第一窗口模式进行的处理操作;
    获取所述第一手势的手势结束位置;
    若所述第一手势的手势结束位置位于图标的显示位置,则将对应的图标所表示的处理操作确定为目标处理操作;
    执行所述目标处理操作。
  15. 一种电子设备,包括存储器和处理器,所述存储器存储有计算机程序,其中,所述处理器通过调用所述计算机程序,用于执行:
    当以第一窗口模式显示应用的信息时,接收第一手势的信息;
    若所述第一手势包含预设触发手势,则显示至少一个图标,每一所述图标用于表示一种对所述第一窗口模式进行的处理操作;
    获取所述第一手势的手势结束位置;
    若所述第一手势的手势结束位置位于图标的显示位置,则将对应的图标所表示的处理操作确定为目标处理操作;
    执行所述目标处理操作。
  16. 根据权利要求15所述的电子设备,其中,所述第一手势包括第一段手势和第二段手势,所述第一段手势发生于所述第二段手势之前,且所述第一段手势和所述第二段手势具有连续的手势轨迹,所述第一手势包含预设触发手势,包括:
    所述第一段手势与所述预设触发手势相匹配。
  17. 根据权利要求15所述的电子设备,其中,所述预设触发手势包括对所述第一窗口模式所对应的窗口进行的按压操作,所述按压操作的按压时长大于或等于预设时长阈值,或者所述按压操作的按压压力值大于或等于预设压力阈值。
  18. 根据权利要求15所述的电子设备,其中,显示的图标包括第二图标,所述第二图标表示的处理操作包括切换至第二窗口模式,所述第二窗口模式的窗口面积大于所述第一窗口模式的窗口面积。
  19. 根据权利要求18所述的电子设备,其中,显示的图标还包括第四图标,所述第四图标表示的处理操作包括关闭所述第一窗口模式。
  20. 根据权利要求18所述的电子设备,其中,所述第一窗口模式包括在窗口中显示所述应用的运行界面,所述第二窗口模式包括全屏显示所述应用的运行界面,所述第三窗口模式包括在窗口中显示所述应用的定制化信息。