CN115390740A - Device control method, device, storage medium and electronic device


Info

Publication number
CN115390740A
CN115390740A
Authority
CN
China
Prior art keywords
gesture
icon
electronic device
display screen
window mode
Prior art date
Legal status
Pending
Application number
CN202211052153.1A
Other languages
Chinese (zh)
Inventor
莫博宇
Current Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN202211052153.1A
Publication of CN115390740A

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/04817 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance using icons
    • G06F 3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/04845 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04883 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application discloses a device control method and apparatus, a storage medium, and an electronic device. The device control method may be applied to an electronic device that includes a touch display screen, and includes: receiving information of a first gesture; if the first gesture includes a preset trigger gesture, displaying at least one icon, where each icon represents a window mode; acquiring the gesture end position of the first gesture; if the gesture end position of the first gesture is detected at the display position of an icon, determining the window mode represented by that icon as the target window mode; and acquiring information of a target application and displaying the information of the target application in the target window mode, where the target application is the application corresponding to the information displayed on the touch display screen when the first gesture is received. The method can improve the operability of the electronic device.

Description

Device control method, device, storage medium and electronic device
The present application is a divisional application of the Chinese patent application entitled "Device control method and apparatus, storage medium, and electronic device", filed on July 9, 2020 with application number 202010658123.X, the entire contents of which are incorporated herein by reference.
Technical Field
The present application relates to the technical field of electronic devices, and in particular to a device control method and apparatus, a storage medium, and an electronic device.
Background
As technology develops, modes of human-computer interaction have become increasingly diverse. For example, a user may interact with an electronic device through touch operations on its display screen, or through voice control. However, in the related art, the operability of electronic devices during such interaction remains limited.
Disclosure of Invention
Embodiments of the present application provide a device control method and apparatus, a storage medium, and an electronic device, which can improve the operability of the electronic device.
In a first aspect, an embodiment of the present application provides a device control method applied to an electronic device that includes a touch display screen, the method including:
receiving information of a first gesture;
if the first gesture includes a preset trigger gesture, displaying at least one icon, wherein each icon is used for representing a window mode;
acquiring a gesture ending position of the first gesture;
if the gesture ending position of the first gesture is detected to be located at the display position of the icon, determining the window mode represented by the corresponding icon as a target window mode;
and acquiring information of a target application, and displaying the information of the target application in the target window mode, wherein the target application is an application corresponding to the information displayed by the touch display screen when the first gesture is received.
In a second aspect, an embodiment of the present application provides a device control apparatus applied to an electronic device that includes a touch display screen, the apparatus including:
the first receiving module is used for receiving information of the first gesture;
the first display module is used for displaying at least one icon if the first gesture comprises a preset trigger gesture, and each icon is used for representing a window mode;
the second receiving module is used for acquiring a gesture ending position of the first gesture;
the determining module is used for determining the window mode represented by the corresponding icon as a target window mode if the gesture ending position of the first gesture is detected to be located at the display position of the icon;
and the second display module is used for acquiring information of a target application and displaying the information of the target application in the target window mode, wherein the target application is an application corresponding to the information displayed by the touch display screen when the first gesture is received.
In a third aspect, an embodiment of the present application provides a storage medium on which a computer program is stored, where the computer program, when executed on a computer, causes the computer to perform the flow of the device control method provided in the embodiments of the present application.
In a fourth aspect, an embodiment of the present application further provides an electronic device including a memory, a processor, and a touch display screen, where the processor is configured to perform the flow of the device control method provided in the embodiments of the present application by calling a computer program stored in the memory.
In the embodiments of the present application, the electronic device may receive information of a first gesture and, if it detects that the first gesture includes a preset trigger gesture, display at least one icon, each icon representing a window mode. The electronic device may then acquire the gesture end position of the first gesture and, when that position is located at the display position of an icon, determine the window mode represented by that icon as the target window mode. The electronic device may then display information of the target application in the target window mode, where the target application is the application corresponding to the information displayed on the touch display screen when the first gesture is received. In this way, on receiving a first gesture that includes the preset trigger gesture, the electronic device displays the information of the target application in the corresponding window mode, so it can enter a given window mode quickly; that is, the operability of the electronic device is improved.
Drawings
The technical solutions and advantages of the present application will become apparent from the following detailed description of specific embodiments of the present application when taken in conjunction with the accompanying drawings.
Fig. 1 is a schematic flowchart of a device control method provided in an embodiment of the present application.
Fig. 2 is another schematic flow chart of the device control method according to the embodiment of the present application.
Fig. 3 to fig. 12 are schematic diagrams of various scenarios of a device control method provided in an embodiment of the present application.
Fig. 13 is a schematic structural diagram of an apparatus control device according to an embodiment of the present application.
Fig. 14 is a schematic structural diagram of an electronic device provided in an embodiment of the present application.
Fig. 15 is another schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
Referring to the drawings, wherein like reference numbers refer to like elements, the principles of the present application are illustrated as being implemented in a suitable computing environment. The following description is based on illustrated embodiments of the application and should not be taken as limiting the application with respect to other embodiments that are not detailed herein.
It is understood that the execution subject of the embodiment of the present application may be an electronic device such as a smart phone or a tablet computer.
Referring to Fig. 1, Fig. 1 is a schematic flowchart of a device control method according to an embodiment of the present application. The device control method may be applied to an electronic device, which may include a touch display screen. The flow may include:
101. Information of a first gesture is received.
In this embodiment, the electronic device may receive a gesture from the user; this gesture is recorded as a first gesture. That is, the electronic device receives information of the first gesture.
102. If the first gesture includes a preset trigger gesture, at least one icon is displayed, where each icon represents a window mode.
For example, in the process of receiving the information of the first gesture, the electronic device may detect whether the first gesture includes a preset trigger gesture.
It should be noted that, in this embodiment, the first gesture is one complete, continuous gesture. Taking a touch operation on the touch display screen as an example, this means that the user's finger remains in contact with the touch display screen throughout the first gesture, without leaving it.
The first gesture "including" the preset trigger gesture means that, viewing the first gesture as decomposed into multiple segments, one of those segments matches the preset trigger gesture. For example, if the first gesture is decomposed into a former segment and a latter segment, and the former segment matches the preset trigger gesture, the first gesture can be considered to include the preset trigger gesture.
If the first gesture does not include a preset trigger gesture, the electronic device may perform other operations.
If the first gesture includes the preset trigger gesture, the electronic device can trigger its touch display screen to display at least one icon, where each icon represents a window mode.
It should be noted that a window mode means that the electronic device creates a window on the display screen and displays information in it, for example the running interface of an application specified by the user, or information of a currently running application (such as a foreground application).
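To make the notion concrete, the following sketch models window modes and the target areas they occupy on screen. It is only an illustration: the Kotlin names, the two mode names (borrowed from the detailed embodiment later in this description), and the region ratios are assumptions, not anything prescribed by the patent.

```kotlin
// Minimal sketch: each icon represents a window mode, and each mode maps to
// a target area on the screen. Names and ratios are illustrative assumptions.
data class Area(val left: Int, val top: Int, val width: Int, val height: Int)

enum class WindowMode { SMALL_WINDOW, FLASH_WINDOW }

// Compute the target area for a mode on a screen of the given pixel size.
fun targetArea(mode: WindowMode, screenW: Int, screenH: Int): Area = when (mode) {
    // A larger window centered in the lower half of the screen.
    WindowMode.SMALL_WINDOW -> Area(screenW / 6, screenH / 2, screenW * 2 / 3, screenH * 2 / 5)
    // A smaller "flash" window tucked into the upper-right corner.
    WindowMode.FLASH_WINDOW -> Area(screenW * 3 / 5, screenH / 20, screenW * 7 / 20, screenH / 8)
}

fun main() {
    println(targetArea(WindowMode.SMALL_WINDOW, 1080, 2400))
    println(targetArea(WindowMode.FLASH_WINDOW, 1080, 2400))
}
```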
103. The gesture end position of the first gesture is acquired.
For example, after displaying at least one icon and detecting that the first gesture has ended, the electronic device may acquire the gesture end position of the first gesture.
After the gesture end position of the first gesture is acquired, the electronic device may detect whether the gesture end position of the first gesture is located at a display position of one icon.
It should be noted that the gesture end position being located at the display position of an icon means the two positions coincide. For example, if an icon is displayed at position A and the gesture end position of the first gesture is also position A, the gesture end position is located at that icon's display position. When the first gesture is a touch operation on the touch display screen, the touch position of the last touch operation is the gesture end position: if the user's finger leaves the screen after sliding to position A, the gesture ends at position A, and if an icon is displayed there, the gesture ends at that icon's display position.
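As a rough illustration of this hit test, assuming icons are tracked with rectangular display bounds (all types and coordinates below are invented for the example):

```kotlin
// Hit-test sketch: does the gesture end position fall inside an icon's
// display bounds? Types, names, and coordinates are illustrative assumptions.
data class Point(val x: Float, val y: Float)

data class IconBounds(val label: String, val left: Float, val top: Float,
                      val right: Float, val bottom: Float) {
    fun contains(p: Point) = p.x in left..right && p.y in top..bottom
}

// Returns the icon whose display position contains the gesture end position,
// or null if the gesture ended elsewhere.
fun iconAt(end: Point, icons: List<IconBounds>): IconBounds? =
    icons.firstOrNull { it.contains(end) }

fun main() {
    val icons = listOf(
        IconBounds("first icon", 300f, 900f, 460f, 1060f),
        IconBounds("second icon", 620f, 900f, 780f, 1060f),
    )
    println(iconAt(Point(350f, 950f), icons)?.label)  // first icon
    println(iconAt(Point(100f, 100f), icons)?.label)  // null
}
```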
If the gesture end position of the first gesture is detected not to be at the display position of any one of the icons, the electronic device may perform other operations.
If the gesture end position of the first gesture is detected at the display position of an icon, the flow proceeds to 104.
104. If the gesture end position of the first gesture is detected at the display position of an icon, the window mode represented by that icon is determined as the target window mode.
For example, if the electronic device detects that the gesture end position of the first gesture is located at the display position of one icon, the electronic device may determine the window mode represented by the corresponding icon as the target window mode.
For example, the electronic device displays two icons, a first icon and a second icon, on the touch display screen. The electronic device detects that the gesture end position of the first gesture is located at the display position of the first icon, and then the electronic device may determine the window mode represented by the first icon as the target window mode.
105. Information of a target application is acquired and displayed in the target window mode, where the target application is the application corresponding to the information displayed on the touch display screen when the first gesture is received.
For example, after the target window mode is determined, the electronic device may obtain information of a target application, and display the information of the target application in the target window mode, where the target application is an application corresponding to the information displayed on the touch display screen when the first gesture is received.
For example, if the touch display screen is displaying the running interface of a ride-hailing application Y when the first gesture is received, then after the target window mode is determined the electronic device may display the information of application Y in the target window mode.
In this embodiment, different window modes may differ in window size, position, and so on.
It can be understood that, in the embodiments of the present application, the electronic device may receive information of a first gesture and, if it detects that the first gesture includes a preset trigger gesture, display at least one icon, each icon representing a window mode. The electronic device may then acquire the gesture end position of the first gesture and, when that position is located at the display position of an icon, determine the window mode represented by that icon as the target window mode. It may then display the information of the target application in the target window mode, where the target application is the application corresponding to the information displayed on the touch display screen when the first gesture is received. On receiving a first gesture that includes the preset trigger gesture, the electronic device thus displays the target application's information in the corresponding window mode and enters that window mode quickly; that is, the operability of the electronic device is improved.
Referring to Fig. 2, Fig. 2 is another schematic flowchart of a device control method according to an embodiment of the present application. The device control method may be applied to an electronic device, which may include a touch display screen. The flow of the device control method may include:
201. Information of a first gesture is received.
For example, the gesture in this embodiment includes a touch operation performed by the user on the touch display screen.
For example, the touch display screen of the electronic device currently displays the running interface of a ride-hailing application Y, as shown in Fig. 3. The user then performs a touch gesture on the touch display screen; that is, the electronic device receives information of a touch gesture from the user, recorded as the first gesture.
In the process of receiving the first gesture, the electronic device may detect whether the first gesture includes a preset trigger gesture.
As noted above, the first gesture is one complete, continuous gesture, and it includes the preset trigger gesture when, viewed as a sequence of segments, one of its segments matches the preset trigger gesture.
For example, in one embodiment the first gesture is decomposed into a first segment and a second segment, where the first segment occurs before the second segment and the two segments have a continuous gesture trajectory.
In that case, the first gesture including the preset trigger gesture may mean that the first segment matches the preset trigger gesture.
If the first gesture does not include a preset trigger gesture, the electronic device may perform other operations.
If the first gesture includes the preset trigger gesture, the flow proceeds to 202.
202. If the first gesture includes the preset trigger gesture, a first icon and a second icon are displayed. The first window mode, represented by the first icon, includes determining a first target area on the touch display screen and displaying the reduced running interface of the target application in the first target area. The second window mode, represented by the second icon, includes determining a second target area on the touch display screen and displaying customized information of the target application in the second target area, where the area of the second target area is smaller than that of the first target area. The target application is the application corresponding to the information displayed on the touch display screen when the first gesture is received.
For example, suppose the preset trigger gesture is an upward slide starting from the bottom of the touch display screen whose sliding track reaches a preset first length. If the first segment of the first gesture received from the user is such an upward slide and its sliding track reaches the preset first length, the electronic device determines that the received first gesture includes the preset trigger gesture and is triggered to display the first icon and the second icon on its display screen, the two window modes being as described in step 202.
In one embodiment, the customized information of the target application displayed in the second window mode may be the application's latest notification information or other information. For example, if the target application is an instant messaging application, its latest notification information may be displayed in the second window mode; if the target application is a map navigation application, the user's current location may be displayed. That is, which information of the application the customized information shows can be determined by the application type or the user's needs, and this is not specifically limited in the embodiments of the present application.
For example, as shown in Fig. 4, the running interface of ride-hailing application Y is displayed on the current display screen, so application Y is the target application. On this interface, the user's finger slides on the touch display screen from position B at the bottom of the screen (the black dot in the figure) to position C (also a black dot), and the distance d1 between the two positions equals the preset first length. The electronic device then determines that the received first gesture includes the preset trigger gesture and is triggered to display the first icon and the second icon on the display screen. The first window mode represented by the first icon may be called the small window mode, and the second window mode represented by the second icon may be called the flash window mode. In the small window mode, the reduced running interface of the target application is displayed in the first target area on the display screen; in the flash window mode, customized information of the target application is displayed in the second target area. The window area in the flash window mode is smaller than the window area in the small window mode.
Of course, in other embodiments the preset trigger gesture may be a different gesture: for example, a downward slide starting from the top of the touch display screen whose sliding track reaches the preset first length, a rightward slide starting from the left edge whose track reaches the preset first length, or a leftward slide starting from the right edge whose track reaches the preset first length, and so on.
That is, in the embodiments of the present application, when the first gesture is decomposed into a first segment and a second segment, the first segment matching the preset trigger gesture may mean that the two have the same gesture action and that the track length of the first segment is greater than or equal to the preset first length. Having the same gesture action means, for example, that the first segment is an upward slide and the preset trigger gesture is also an upward slide, or that both are downward slides, and so on.
In one embodiment, the starting point of the preset trigger gesture may be located at an edge of the touch display screen: the bottom edge, top edge, left edge, or right edge. For example, if the starting point is at the bottom edge, the preset trigger gesture is an upward slide from the bottom edge with a sliding distance greater than or equal to the preset first length; if the starting point is at the top edge, it is a downward slide from the top edge with a sliding distance greater than or equal to the preset first length, and so on.
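A minimal sketch of this matching rule for the bottom-edge variant follows; the pixel thresholds standing in for the preset first length and the edge tolerance are invented for the example.

```kotlin
import kotlin.math.hypot

// Sketch of matching the first gesture segment against the preset trigger
// gesture: an upward slide starting at the bottom edge whose track reaches
// a preset first length. Thresholds are illustrative assumptions.
data class TouchPoint(val x: Float, val y: Float)

const val EDGE_BAND_PX = 32f        // how close to the bottom edge a start must be
const val FIRST_LENGTH_PX = 300f    // stands in for the "preset first length"

fun trackLength(points: List<TouchPoint>): Float =
    points.zipWithNext().map { (a, b) -> hypot(b.x - a.x, b.y - a.y) }.sum()

// True when the segment starts at the bottom edge, moves upward, and its
// track length reaches the preset first length.
fun matchesTrigger(segment: List<TouchPoint>, screenHeight: Float): Boolean {
    if (segment.size < 2) return false
    val start = segment.first()
    val end = segment.last()
    val startsAtBottom = start.y >= screenHeight - EDGE_BAND_PX
    val movesUp = end.y < start.y
    return startsAtBottom && movesUp && trackLength(segment) >= FIRST_LENGTH_PX
}

fun main() {
    val seg = listOf(TouchPoint(540f, 2390f), TouchPoint(540f, 2200f), TouchPoint(540f, 2000f))
    println(matchesTrigger(seg, screenHeight = 2400f))  // true: bottom start, upward, length >= 300
}
```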
203. The gesture end position of the first gesture is acquired.
204. If the gesture end position of the first gesture is detected at the display position of an icon, the window mode represented by that icon is determined as the target window mode.
205. Information of the target application is acquired and displayed in the target window mode.
For example, steps 203, 204, and 205 may proceed as follows:
after the first icon and the second icon are displayed, the user can continue to perform the touch gesture operation on the touch display screen under the condition that the finger of the user does not leave the touch display screen.
After detecting that the first gesture has ended, the electronic device may acquire the gesture end position of the first gesture and detect whether it is located at the display position of an icon.
As described above, when the first gesture is a touch operation, the touch position of the last touch operation is the gesture end position, and that position is located at an icon's display position when the two positions coincide.
When the first gesture is decomposed into the first segment of the gesture and the second segment of the gesture, the gesture end position of the first gesture is also the gesture end position of the second segment of the gesture.
If the gesture end location of the first gesture is detected to be not located at the display location of any of the icons, the electronic device may perform other operations.
If the gesture end position of the first gesture is detected at the display position of an icon, the electronic device can determine the window mode represented by that icon as the target window mode. Thereafter, the electronic device may obtain information of the target application and display it in the target window mode.
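Taken together, steps 201 through 205 behave like a small state machine driven by one continuous gesture. The sketch below is an assumption-laden illustration: the callbacks, the stand-in trigger test, and the mode labels are invented, not taken from the patent.

```kotlin
enum class Phase { WAITING, ICONS_SHOWN }

// Drives the single continuous gesture: once the first segment matches the
// trigger, the icons are shown; where the finger lifts picks the window mode.
class GestureFlow(
    private val matchesTrigger: (List<Pair<Float, Float>>) -> Boolean,
    private val modeOfIconAt: (Float, Float) -> String?,  // hit-test against icon bounds
) {
    private val track = mutableListOf<Pair<Float, Float>>()
    private var phase = Phase.WAITING

    fun onMove(x: Float, y: Float) {
        track += x to y
        if (phase == Phase.WAITING && matchesTrigger(track)) {
            phase = Phase.ICONS_SHOWN
            println("display first and second icons")   // step 202
        }
    }

    // Finger lifted: the last touch position is the gesture end position (steps 203-205).
    fun onUp(x: Float, y: Float): String? =
        if (phase == Phase.ICONS_SHOWN) modeOfIconAt(x, y) else null
}

fun main() {
    val flow = GestureFlow(
        matchesTrigger = { it.size > 5 },  // stand-in for the real trigger test
        modeOfIconAt = { x, _ -> if (x < 500f) "small window mode" else "flash window mode" },
    )
    (0..8).forEach { flow.onMove(540f, 2300f - it * 100f) }
    println(flow.onUp(300f, 1500f))  // small window mode
}
```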
For example, as shown in Fig. 5, the user's finger does not leave the touch screen after sliding to position C but continues to slide to position D (the black dot in the figure), where the finger then leaves the touch screen. That is, the gesture end position of the first gesture is position D. Since the sliding track starting from position B, passing through position C, and ending at position D is continuous, and position D is located at the display position of the first icon, the electronic device may determine the small window mode represented by the first icon as the target window mode. The electronic device may then acquire information of the current target application, ride-hailing application Y, and display that information in the small window mode. For example, as shown in Fig. 6, the first target area determined on the touch display screen is an area in the middle of the lower half of the screen, and the electronic device displays the reduced running interface of application Y there.
Of course, in other embodiments the first target area may be another area of the display screen, for example the middle of the upper half of the screen, or the middle of the left or right two-thirds of the screen, and so on; this is not specifically limited in the embodiments of the present application.
As shown in Fig. 5, the first gesture has a continuous sliding track starting from position B, passing through position C, and ending at position D. In this embodiment, the first gesture can be decomposed into a first segment, from position B to position C, and a second segment, from position C to position D. As Fig. 5 shows, the two segments have a continuous sliding track; that is, the user's finger never leaves the touch display screen while making them.
As another example, as shown in Fig. 7, the user's finger does not leave the touch screen after sliding to position C but continues to slide to position E (the black dot in the figure), where the finger then leaves the touch screen. That is, the gesture end position of the first gesture is position E. Since the sliding track from position B through position C to position E is continuous, and position E is located at the display position of the second icon, the electronic device may determine the flash window mode represented by the second icon as the target window mode. The electronic device may then acquire information of the current target application, ride-hailing application Y, and display application Y's latest notification information in the flash window mode. For example, as shown in Fig. 8, the second target area determined on the touch display screen is an area in the upper-right corner of the screen, and the electronic device displays the latest notification information of application Y there.
Of course, in other embodiments the second target area may be another area of the display screen, for example the upper-left corner or the lower-right corner, and so on; this is not specifically limited in the embodiments of the present application.
As shown in Fig. 7, the first gesture has a continuous sliding track starting from position B, passing through position C, and ending at position E. It can likewise be decomposed into a first segment, from position B to position C, and a second segment, from position C to position E; the two segments have a continuous sliding track, the user's finger never leaving the touch display screen.
In addition, in the embodiments of the present application, the positions of the windows in the small window mode and the flash window mode can be readjusted by the user; for example, after selecting a window, the user may drag it to another position.
In another embodiment, the first gesture may be an air gesture rather than a touch operation on the touch display screen. For example, when the first gesture is decomposed into a first segment and a second segment, both segments may be air gestures. In that case, the two segments having a continuous gesture trajectory means they have a continuous air-gesture trajectory; that is, the air gesture made by the user is likewise one continuous gesture.
Embodiments of the present application may further include the following process:
from when the first gesture is received until the first gesture ends, gradually shrinking the displayed running interface of the target application.
For example, suppose the preset trigger gesture is an upward slide from the bottom of the touch display screen with a track length greater than or equal to the preset first length. Then, as shown in Fig. 5, while the user's finger slides up from position B to position C and then leftward to position D, the running interface of ride-hailing application Y displayed by the electronic device may gradually shrink. A possible shrinking process is shown in Fig. 9: as the user's finger slides up from position B at the bottom of the screen, through position F (the black dot in the figure) and position C, and then leftward to position D, the displayed running interface of application Y gradually shrinks.
In one embodiment, the electronic device gradually shrinks the displayed running interface of the target application down to a preset size; that is, once the running interface reaches the preset size, the electronic device shrinks it no further.
In one embodiment, as shown in Fig. 9, while the electronic device shrinks the running interface of the target application, the gradually shrinking interface may also move in a preset direction. For example, when the preset trigger gesture is an upward slide from the bottom of the touch display screen with a track length greater than or equal to the preset first length, the shrinking interface may move toward the upper half of the screen; when the preset trigger gesture is a downward slide from the top of the screen with a track length greater than or equal to the preset first length, the shrinking interface may move toward the lower half of the screen, and so on.
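One plausible way to realize this effect ties the interface's scale and position to the gesture's progress toward the preset first length; the ratios and names below are illustrative assumptions, not taken from the patent.

```kotlin
// Sketch of the gradual shrink: scale follows the gesture's progress, clamped
// so the interface never shrinks below a preset size, while the shrinking
// interface drifts toward the upper half of the screen. Ratios are assumed.
const val MIN_SCALE = 0.4f   // the "preset size" as a fraction of full screen

data class InterfaceTransform(val scale: Float, val centerYFraction: Float)

// progress: 0.0 when the gesture starts, 1.0 at the preset first length.
fun transformFor(progress: Float): InterfaceTransform {
    val p = progress.coerceIn(0f, 1f)
    val scale = 1f - (1f - MIN_SCALE) * p   // 1.0 down to MIN_SCALE, then held
    val centerY = 0.5f - 0.25f * p          // drifts toward the upper half
    return InterfaceTransform(scale, centerY)
}

fun main() {
    // Past 1.0 the transform is clamped: the interface shrinks no further.
    listOf(0f, 0.5f, 1f, 1.4f).forEach { println("$it -> ${transformFor(it)}") }
}
```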
In one implementation, embodiments of the present application may further include the following process:
before the first gesture ends, if the gesture pauses at the display position of an icon, the electronic device displays a preview of the window mode represented by that icon at a preset position on the touch display screen.
For example, if the user's finger stops after sliding to position D but does not leave the touch display screen, the electronic device may display a preview of the first window mode represented by the first icon at a preset position on the screen. As shown in Fig. 10, the electronic device may display a preview of ride-hailing application Y's running interface in the small window mode in the middle of the upper half of the screen (the preset position).
As another example, if the user's finger slides from position D to position E and stops, still without leaving the touch display screen, the electronic device may display a preview of the second window mode represented by the second icon at the preset position. As shown in Fig. 11, the electronic device may display a preview of application Y's latest notification information in the flash window mode in the middle of the upper half of the screen (the preset position).
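Detecting such a pause amounts to noticing that the finger has dwelled near one point for some time without lifting. A minimal sketch, with invented thresholds:

```kotlin
// Dwell detection sketch for the pause-preview behaviour; the radius and
// duration thresholds are illustrative assumptions.
const val PAUSE_RADIUS_PX = 12f
const val PAUSE_MILLIS = 250L

class PauseDetector {
    private var anchorX = 0f
    private var anchorY = 0f
    private var anchorTime = 0L

    /** Returns true when the gesture has dwelled near one position long enough. */
    fun onMove(x: Float, y: Float, timeMillis: Long): Boolean {
        val moved = kotlin.math.hypot(x - anchorX, y - anchorY) > PAUSE_RADIUS_PX
        if (moved) {
            anchorX = x; anchorY = y; anchorTime = timeMillis
            return false
        }
        return timeMillis - anchorTime >= PAUSE_MILLIS
    }
}

fun main() {
    val d = PauseDetector()
    println(d.onMove(100f, 200f, 0L))     // false: finger just moved to a new anchor
    println(d.onMove(102f, 201f, 300L))   // true: dwelled ~300 ms within the radius
}
```

When the detector fires while the dwell point lies within an icon's display bounds, the preview of that icon's window mode would be shown at the preset position.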
In addition, when the first gesture is a touch operation on the touch display screen, the end of the first gesture may refer to the electronic device detecting that the user's finger has left the screen. For example, as shown in Fig. 5, the user's finger slides on the touch display screen from position B, passes through position C, and leaves the screen at position D; the electronic device detects that the first gesture has ended when it detects the finger leaving the screen at position D.
When the first gesture is an air gesture, consider as an example the technique of recognizing air gestures by detecting the user's palm motion with a camera. Camera-based air-gesture detection generally comprises four parts: gesture image capture, gesture segmentation, gesture feature extraction, and gesture detection. The end of the first gesture may then refer to the electronic device capturing, through the camera, the motion of the user's palm being lowered. For example, to make an air gesture the user raises the palm and performs the gesture; when the palm-lowering motion is captured, the air gesture can be considered ended.
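The four parts named above might be organized as the following pipeline; every type and interface here is an illustrative assumption, and no real vision API is implied.

```kotlin
// Illustrative stages of camera-based air-gesture detection; all names are
// assumptions sketched from the four parts described above.
class Frame                 // one captured camera image
class HandRegion            // the hand area found by gesture segmentation
class GestureFeatures       // features extracted from the hand region

enum class AirGestureEvent { PALM_RAISED, MOVING, PALM_DOWN, NONE }

interface AirGesturePipeline {
    fun capture(): Frame                                    // gesture image capture
    fun segment(frame: Frame): HandRegion?                  // gesture segmentation
    fun extract(region: HandRegion): GestureFeatures        // gesture feature extraction
    fun detect(features: GestureFeatures): AirGestureEvent  // gesture detection
}

// Per the description above, the air gesture ends when the palm-down motion is captured.
fun isGestureEnd(event: AirGestureEvent) = event == AirGestureEvent.PALM_DOWN
```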
It can be understood that, as shown in Fig. 5 and Fig. 7, the whole process of entering the small window mode (the gesture starting from position B, passing through position C, and ending at position D) or the flash window mode (the gesture starting from position B, passing through position C, and ending at position E) is completed in one step; that is, only one continuous operation is needed to switch to the corresponding window mode, so operation is efficient and the user experience is good.
In one implementation, embodiments of the present application may further include the following process:
receiving information of a second gesture;
and if the second gesture has the same gesture action as the preset trigger gesture and its track length is smaller than the preset first length but larger than a preset second length, entering the multitask management interface.
For example, suppose the preset trigger gesture is an upward slide from the bottom of the touch display screen with a track length greater than or equal to the preset first length. If the second gesture received by the electronic device is also an upward slide (that is, it has the same gesture action as the preset trigger gesture), but the user's finger leaves the touch display screen with a track length between the preset second length and the preset first length, the electronic device may then enter the multitask management interface.
For example, as shown in Fig. 12, the user's finger slides up from position B at the bottom of the touch display screen to position G, and the track length between the two positions lies between the preset second length and the preset first length; when the finger leaves the touch display screen at position G, the multitask management interface is entered.
It can be appreciated that this approach enables multiple responses to the same gesture action, depending on the length of the gesture track. For the same upward slide: when the track length is greater than or equal to the preset first length, the electronic device displays the first icon and the second icon; when it is smaller than the preset first length but greater than or equal to the preset second length, the electronic device enters the multitask management interface.
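This multi-threshold dispatch reduces to a comparison on the track length once the gesture action has matched; the pixel values below are invented for the example.

```kotlin
// Same upward slide, different responses by track length (thresholds assumed).
const val PRESET_FIRST_LENGTH = 300f    // at or above: show the window-mode icons
const val PRESET_SECOND_LENGTH = 120f   // between the two: enter multitask management

enum class SwipeResponse { SHOW_WINDOW_MODE_ICONS, ENTER_MULTITASK_MANAGER, NONE }

fun respondToUpSwipe(trackLength: Float): SwipeResponse = when {
    trackLength >= PRESET_FIRST_LENGTH  -> SwipeResponse.SHOW_WINDOW_MODE_ICONS
    trackLength >= PRESET_SECOND_LENGTH -> SwipeResponse.ENTER_MULTITASK_MANAGER
    else                                -> SwipeResponse.NONE
}

fun main() {
    println(respondToUpSwipe(350f))  // SHOW_WINDOW_MODE_ICONS
    println(respondToUpSwipe(200f))  // ENTER_MULTITASK_MANAGER
    println(respondToUpSwipe(50f))   // NONE
}
```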
In one embodiment, after displaying the information of the target application in the target window mode, the electronic device may return to the desktop. For example, in Fig. 6, after displaying the running interface of ride-hailing application Y in the small window mode, the electronic device displays the desktop interface in the area of the touch display screen outside the small window. Likewise, in Fig. 8, after displaying application Y's latest notification information in the flash window mode, the electronic device displays the desktop interface in the area outside the flash window.
Referring to Fig. 13, Fig. 13 is a schematic structural diagram of a device control apparatus according to an embodiment of the present application. The device control apparatus may be applied to an electronic device, which may include a touch display screen. The device control apparatus 300 may include a first receiving module 301, a first display module 302, a second receiving module 303, a determining module 304, and a second display module 305.
The first receiving module 301 is configured to receive information of a first gesture.
A first display module 302, configured to display at least one icon if the first gesture includes a preset trigger gesture, where each icon is used to represent a window mode.
The second receiving module 303 is configured to obtain a gesture ending position of the first gesture.
The determining module 304 is configured to determine, if it is detected that the gesture end position of the first gesture is located at the display position of the icon, the window mode represented by the corresponding icon as the target window mode.
The second display module 305 is configured to obtain information of a target application, and display the information of the target application in the target window mode, where the target application is an application corresponding to the information displayed on the touch display screen when the first gesture is received.
In one embodiment, the first gesture includes a first segment and a second segment, where the first segment occurs before the second segment and the two segments have a continuous gesture trajectory.
Then, the first gesture includes a preset trigger gesture, which may include: the first segment of gesture is matched with the preset triggering gesture.
In one embodiment, the icons displayed by the electronic equipment at least comprise a first icon, and the first window mode represented by the first icon comprises the steps of determining a first target area on the touch display screen and displaying the reduced running interface of the target application in the first target area.
In one embodiment, the icons displayed by the electronic device further include at least a second icon, and the second window mode represented by the second icon includes determining a second target area on the touch display screen and displaying the information of the target application in the second target area, and the area of the second target area is smaller than that of the first target area.
In one embodiment, the second window mode represented by the second icon includes determining a second target area on the touch display screen and displaying the customized information of the target application in the second target area.
In one embodiment, the first segment of the gesture and the second segment of the gesture include touch operations performed on the touch display screen;
then, the second segment of gesture and the first segment of gesture have a continuous gesture trajectory, which may include: the second segment of the gesture and the first segment of the gesture have a continuous touch trajectory on the touch display screen.
In one embodiment, the first segment of the gesture and the second segment of the gesture include air gestures; in that case, the two segments having a continuous gesture trajectory may mean that they have a continuous air-gesture trajectory.
In one embodiment, the first display module 302 may be configured to: if the first segment of the gesture has the same gesture action as a preset trigger gesture and the track length of the first segment is greater than or equal to a preset first length, determine that the first segment matches the preset trigger gesture, and display at least one icon.
In one embodiment, the starting point of the preset trigger gesture is located at an edge position of the touch display screen.
In one embodiment, the second display module 305 is further configured to: gradually reducing the displayed running interface of the target application to a preset size from the moment the first gesture is received to the moment the first gesture is finished.
In one embodiment, the second display module 305 may further be configured to: and moving the gradually reduced running interface of the target application to a preset direction.
In one embodiment, the second display module 305 may further be configured to: and before the first gesture is finished, if the gesture pause position is located at the display position of the icon, displaying the preview effect of the window mode represented by the corresponding icon at the preset position of the touch display screen.
In one embodiment, the second display module 305 may further be configured to: receive information of a second gesture; and, if the second gesture has the same gesture action as the preset trigger gesture and its track length is smaller than the preset first length and larger than the preset second length, enter the multitask management interface.
An embodiment of the present application provides a computer-readable storage medium on which a computer program is stored; when the computer program is executed on a computer, the computer is caused to execute the procedure of the device control method provided in the embodiments of the present application.
An embodiment of the present application further provides an electronic device, which includes a memory and a processor, where the processor executes the procedure of the device control method provided in the embodiments of the present application by calling the computer program stored in the memory.
For example, the electronic device may be a mobile terminal such as a tablet computer or a smartphone. Referring to fig. 14, fig. 14 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
The electronic device 400 may include a touch display screen 401, a memory 402, a processor 403, and the like. Those skilled in the art will appreciate that the structure shown in fig. 14 does not constitute a limitation on the electronic device, which may include more or fewer components than shown, combine certain components, or adopt a different arrangement of components.
The touch display screen 401 may be used to display information such as text and images, and may also be used to receive touch operations performed by a user.
The memory 402 may be used to store application programs and data. The application programs stored in the memory 402 contain executable code and may constitute various functional modules. The processor 403 implements various functional applications and performs data processing by running the application programs stored in the memory 402.
The processor 403 is the control center of the electronic device. It connects the various parts of the entire electronic device through various interfaces and lines, and performs the various functions of the electronic device and processes data by running or executing the application programs stored in the memory 402 and calling the data stored in the memory 402, thereby monitoring the electronic device as a whole.
In this embodiment, the processor 403 in the electronic device loads executable code corresponding to the processes of one or more application programs into the memory 402 according to the following instructions, and runs the application programs stored in the memory 402, thereby executing:
receiving information of a first gesture;
if the first gesture comprises a preset trigger gesture, displaying at least one icon, wherein each icon is used for representing a window mode;
acquiring a gesture ending position of the first gesture;
if it is detected that the gesture ending position of the first gesture is located at the display position of an icon, determining the window mode represented by that icon as a target window mode;
acquiring information of a target application, and displaying the information of the target application in the target window mode, wherein the target application is an application corresponding to the information displayed on the touch display screen when the first gesture is received.
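Putting the five steps together, a hedged end-to-end sketch of the flow the processor executes; every type and parameter here is an illustrative stand-in, not an API from the embodiments.

```kotlin
// Illustrative stand-in types for the five-step flow above.
data class Gesture(val endX: Float, val endY: Float, val matchesTrigger: Boolean)

data class ModeIcon(
    val left: Float, val top: Float, val right: Float, val bottom: Float,
    val modeName: String,
) {
    fun contains(x: Float, y: Float) = x in left..right && y in top..bottom
}

fun handleFirstGesture(
    gesture: Gesture,                                   // step 1: information of the first gesture
    showIcons: () -> List<ModeIcon>,                    // step 2: display at least one icon
    foregroundApp: () -> String,                        // app shown when the gesture arrived
    displayInMode: (app: String, mode: String) -> Unit,
) {
    if (!gesture.matchesTrigger) return
    val icons = showIcons()
    // Steps 3-4: the gesture end position selects the icon under the finger,
    // and that icon's window mode becomes the target window mode.
    val target = icons.firstOrNull { it.contains(gesture.endX, gesture.endY) } ?: return
    // Step 5: display the target application's information in the target window mode.
    displayInMode(foregroundApp(), target.modeName)
}
```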
Referring to fig. 15, the electronic device 400 may include a touch display screen 401, a memory 402, a processor 403, a battery 404, a speaker 405, a microphone 406, and the like.
The touch display screen 401 may be used to display information such as text and images, and may also be used to receive touch operations performed by a user.
The memory 402 may be used to store application programs and data. The application programs stored in the memory 402 contain executable code and may constitute various functional modules. The processor 403 implements various functional applications and performs data processing by running the application programs stored in the memory 402.
The processor 403 is the control center of the electronic device. It connects the various parts of the entire electronic device through various interfaces and lines, and performs the various functions of the electronic device and processes data by running or executing the application programs stored in the memory 402 and calling the data stored in the memory 402, thereby monitoring the electronic device as a whole.
The battery 404 may supply power to the various components and modules of the electronic device to ensure their normal operation.
The speaker 405 may be used to play sound signals; for example, the speaker 405 may play the audio of multimedia content.
The microphone 406 may be used to collect sound signals from the surrounding environment; for example, the microphone 406 may collect speech uttered by a user and convert the collected speech into corresponding speech signals.
In this embodiment, the processor 403 in the electronic device loads executable code corresponding to the processes of one or more application programs into the memory 402 according to the following instructions, and runs the application programs stored in the memory 402, thereby executing:
receiving information of a first gesture;
if the first gesture comprises a preset trigger gesture, displaying at least one icon, wherein each icon is used for representing a window mode;
acquiring a gesture ending position of the first gesture;
if it is detected that the gesture ending position of the first gesture is located at the display position of an icon, determining the window mode represented by that icon as a target window mode;
acquiring information of a target application, and displaying the information of the target application in the target window mode, wherein the target application is an application corresponding to the information displayed on the touch display screen when the first gesture is received.
In one embodiment, the first gesture includes a first segment gesture and a second segment gesture, the first segment gesture occurs before the second segment gesture, and the first segment gesture and the second segment gesture have a continuous gesture trajectory;
then, that the first gesture includes the preset trigger gesture may include: the first segment of the gesture matches the preset trigger gesture.
In one embodiment, the icons displayed by the electronic device include at least a first icon, and the first window mode represented by the first icon includes determining a first target area on the touch display screen and displaying the reduced running interface of the target application in the first target area.
In one embodiment, the icons displayed by the electronic device further include at least a second icon, and the second window mode represented by the second icon includes determining a second target area on the touch display screen and displaying the information of the target application in the second target area, where the area of the second target area is smaller than the area of the first target area.
In one embodiment, the second window mode represented by the second icon includes determining a second target area on the touch display screen and displaying the customized information of the target application in the second target area.
In one embodiment, the first segment of the gesture and the second segment of the gesture comprise touch operations on the touch display screen;
then, that the second segment of the gesture and the first segment of the gesture have a continuous gesture trajectory may include: the second segment of the gesture and the first segment of the gesture have a continuous touch trajectory on the touch display screen.
In one embodiment, the first segment of the gesture and the second segment of the gesture comprise air gestures, that is, gestures performed at a distance from the touch display screen without contact;
then, that the second segment of the gesture and the first segment of the gesture have a continuous gesture trajectory may include: the second segment of the gesture and the first segment of the gesture have a continuous in-air gesture trajectory.
In one embodiment, when performing the step of displaying at least one icon if the first segment of the gesture matches the preset trigger gesture, the processor 403 may execute: if the gesture action of the first segment of the gesture is the same as that of the preset trigger gesture and the trajectory length of the first segment of the gesture is greater than or equal to a preset first length, determining that the first segment of the gesture matches the preset trigger gesture, and displaying the at least one icon.
In one embodiment, the starting point of the preset trigger gesture is located at an edge position of the touch display screen.
In one embodiment, the processor 403 may further perform: gradually reducing the displayed running interface of the target application to a preset size during the period from when the first gesture is received to when the first gesture ends.
In one embodiment, the processor 403 may further perform: moving the gradually reduced running interface of the target application in a preset direction.
In one embodiment, the processor 403 may further perform: before the first gesture ends, if the gesture pause position is located at the display position of an icon, displaying a preview effect of the window mode represented by that icon at a preset position of the touch display screen.
In one embodiment, the processor 403 may further perform: receiving information of a second gesture; and if the gesture action of the second gesture is the same as that of the preset trigger gesture, and the trajectory length of the second gesture is smaller than the preset first length and larger than a preset second length, entering a multitask management interface.
In the above embodiments, each embodiment is described with its own emphasis; for parts not described in detail in one embodiment, reference may be made to the detailed description of the device control method above, which is not repeated here.
The device control apparatus provided in the embodiments of the present application and the device control method in the above embodiments belong to the same concept. Any method provided in the device control method embodiments may be run on the device control apparatus; its specific implementation process is described in detail in the device control method embodiments and is not repeated here.
It should be noted that, for the device control method described in the embodiments of the present application, those skilled in the art will understand that all or part of the process of implementing the method can be completed by controlling the relevant hardware through a computer program. The computer program can be stored in a computer-readable storage medium, such as a memory, and executed by at least one processor; during execution, the process may include the flow of the embodiments of the device control method. The storage medium may be a magnetic disk, an optical disk, a read-only memory (ROM), a random access memory (RAM), or the like.
In the device control apparatus of the embodiments of the present application, the functional modules may be integrated into one processing chip, or each module may exist alone physically, or two or more modules may be integrated into one module. The integrated module may be implemented in the form of hardware or in the form of a software functional module. If the integrated module is implemented in the form of a software functional module and sold or used as a stand-alone product, it may also be stored in a computer-readable storage medium, such as a read-only memory, a magnetic disk, or an optical disk.
The device control method, apparatus, storage medium, and electronic device provided in the embodiments of the present application have been described in detail above. Specific examples are used herein to explain the principles and implementations of the present application, and the descriptions of the above embodiments are only intended to help understand the method and its core ideas. Meanwhile, those skilled in the art may make changes to the specific embodiments and the application scope according to the ideas of the present application. In summary, the content of this specification should not be construed as limiting the present application.

Claims (10)

1. A device control method applied to an electronic device, characterized in that the electronic device comprises a touch display screen, and the method comprises the following steps:
receiving a first gesture;
if the first gesture comprises a preset trigger gesture, displaying at least one icon, wherein each icon is used for representing a window mode;
if the gesture ending position of the first gesture is located at the display position of the icon, displaying information of a target application in a target window mode, wherein the target application is an application corresponding to the information displayed on the touch display screen when the first gesture is received, and the target window mode is a window mode represented by the icon displayed at the gesture ending position of the first gesture.
2. The device control method according to claim 1, wherein the first gesture comprises a first segment gesture and a second segment gesture, the first segment gesture occurs before the second segment gesture, and the first segment gesture and the second segment gesture have a continuous gesture trajectory;
the first gesture includes a preset trigger gesture, including: the first segment of gesture is matched with the preset triggering gesture.
3. The device control method according to claim 1, wherein the icons displayed by the electronic device include at least a first icon, and the first window mode represented by the first icon includes displaying information of the target application in a first target area on the touch display screen.
4. The device control method according to claim 3, wherein the icons displayed by the electronic device further include at least a second icon, the second window mode represented by the second icon including displaying information of the target application in a second target area on the touch display screen, the second target area having an area smaller than that of the first target area.
5. The device control method according to claim 1, characterized in that the method further comprises:
gradually reducing the displayed running interface of the target application to a preset size from the moment the first gesture is received to the moment the first gesture is finished.
6. The device control method according to claim 5, characterized in that the method further comprises:
and moving the gradually reduced running interface of the target application to a preset direction.
7. The device control method according to claim 1, characterized in that the method further comprises:
before the first gesture is finished, if the gesture pause position is located at the icon display position, displaying a preview effect of a window mode represented by the corresponding icon at a preset position of the touch display screen.
8. A device control apparatus applied to an electronic device, characterized in that the electronic device comprises a touch display screen, and the apparatus comprises:
a first receiving module, configured to receive a first gesture;
a first display module, configured to display at least one icon if the first gesture comprises a preset trigger gesture, wherein each icon is used for representing a window mode;
and a determining module, configured to display information of a target application in a target window mode if the gesture ending position of the first gesture is located at the display position of the icon, wherein the target application is an application corresponding to the information displayed on the touch display screen when the first gesture is received, and the target window mode is the window mode represented by the icon displayed at the gesture ending position of the first gesture.
9. A computer-readable storage medium, on which a computer program is stored, which, when executed on a computer, causes the computer to carry out the method according to any one of claims 1 to 7.
10. An electronic device comprising a memory, a processor and a touch display screen, wherein the processor executes the method of any one of claims 1 to 7 by invoking a computer program stored in the memory.
CN202211052153.1A 2020-07-09 2020-07-09 Device control method, device, storage medium and electronic device Pending CN115390740A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211052153.1A CN115390740A (en) 2020-07-09 2020-07-09 Device control method, device, storage medium and electronic device

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202211052153.1A CN115390740A (en) 2020-07-09 2020-07-09 Device control method, device, storage medium and electronic device
CN202010658123.XA CN111831204B (en) 2020-07-09 2020-07-09 Device control method, device, storage medium and electronic device

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
CN202010658123.XA Division CN111831204B (en) 2020-07-09 2020-07-09 Device control method, device, storage medium and electronic device

Publications (1)

Publication Number Publication Date
CN115390740A true CN115390740A (en) 2022-11-25

Family

ID=72900750

Family Applications (4)

Application Number Title Priority Date Filing Date
CN202211052153.1A Pending CN115390740A (en) 2020-07-09 2020-07-09 Device control method, device, storage medium and electronic device
CN202211043518.4A Pending CN115357177A (en) 2020-07-09 2020-07-09 Device control method, device, storage medium and electronic device
CN202010658123.XA Active CN111831204B (en) 2020-07-09 2020-07-09 Device control method, device, storage medium and electronic device
CN202180038046.0A Pending CN115698930A (en) 2020-07-09 2021-06-01 Device control method, device, storage medium and electronic device

Family Applications After (3)

Application Number Title Priority Date Filing Date
CN202211043518.4A Pending CN115357177A (en) 2020-07-09 2020-07-09 Device control method, device, storage medium and electronic device
CN202010658123.XA Active CN111831204B (en) 2020-07-09 2020-07-09 Device control method, device, storage medium and electronic device
CN202180038046.0A Pending CN115698930A (en) 2020-07-09 2021-06-01 Device control method, device, storage medium and electronic device

Country Status (5)

Country Link
US (1) US20230139376A1 (en)
EP (1) EP4180933A4 (en)
JP (1) JP7498352B2 (en)
CN (4) CN115390740A (en)
WO (1) WO2022007544A1 (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115390740A (en) * 2020-07-09 2022-11-25 Oppo广东移动通信有限公司 Device control method, device, storage medium and electronic device
CN114625288A (en) * 2020-12-11 2022-06-14 Oppo广东移动通信有限公司 Interface processing method and device, electronic equipment and computer readable storage medium
CN116700914B (en) * 2022-11-22 2024-05-10 荣耀终端有限公司 Task circulation method and electronic equipment

Family Cites Families (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010039991A (en) 2008-08-08 2010-02-18 Kawai Musical Instr Mfg Co Ltd Image display method, computer program for image display, and recording medium
US9030487B2 (en) * 2011-08-01 2015-05-12 Lg Electronics Inc. Electronic device for displaying three-dimensional image and method of using the same
KR20130051234A (en) 2011-11-09 2013-05-20 삼성전자주식회사 Visual presentation method for application in portable and apparatus thereof
EP2891950B1 (en) 2014-01-07 2018-08-15 Sony Depthsensing Solutions Human-to-computer natural three-dimensional hand gesture based navigation method
CN103870116B (en) * 2014-02-18 2018-07-06 联想(北京)有限公司 A kind of information processing method and electronic equipment
US10261672B1 (en) 2014-09-16 2019-04-16 Amazon Technologies, Inc. Contextual launch interfaces
CN104503689B (en) * 2014-11-21 2018-07-31 小米科技有限责任公司 Application interface display methods and device
KR20170093358A (en) * 2016-02-05 2017-08-16 최정은 This is a organization of app-program that can be memorized foreign words efficiently and systematically on playing the game
CN107302724A (en) * 2016-04-14 2017-10-27 北京搜狗科技发展有限公司 A kind of video playing control method, device and electronic equipment
CN108885531A (en) 2016-10-14 2018-11-23 华为技术有限公司 A kind of method and terminal of interface display
JP6553719B2 (en) 2016-10-31 2019-07-31 ベイジン シャオミ モバイル ソフトウェア カンパニーリミテッド Screen split display method and apparatus
WO2018082269A1 (en) 2016-11-04 2018-05-11 华为技术有限公司 Menu display method and terminal
CN107368511B (en) * 2017-04-20 2020-12-01 口碑控股有限公司 Information display method and device
CN110347317B (en) * 2019-06-11 2022-09-27 广州视源电子科技股份有限公司 Window switching method and device, storage medium and interactive intelligent panel
CN115390740A (en) * 2020-07-09 2022-11-25 Oppo广东移动通信有限公司 Device control method, device, storage medium and electronic device
CN114661219A (en) * 2020-07-09 2022-06-24 Oppo广东移动通信有限公司 Device control method, device, storage medium and electronic device

Also Published As

Publication number Publication date
WO2022007544A1 (en) 2022-01-13
CN115698930A (en) 2023-02-03
EP4180933A1 (en) 2023-05-17
JP2023533281A (en) 2023-08-02
CN115357177A (en) 2022-11-18
CN111831204B (en) 2022-09-16
EP4180933A4 (en) 2023-12-27
CN111831204A (en) 2020-10-27
US20230139376A1 (en) 2023-05-04
JP7498352B2 (en) 2024-06-11

Similar Documents

Publication Publication Date Title
JP7361156B2 (en) Managing real-time handwriting recognition
CN109062479B (en) Split screen application switching method and device, storage medium and electronic equipment
CN111831204B (en) Device control method, device, storage medium and electronic device
CN106415472B (en) Gesture control method and device, terminal equipment and storage medium
CN111831205B (en) Device control method, device, storage medium and electronic device
JP2021089761A (en) Method and apparatus for controlling electronic device based on gesture
CN109976655B (en) Long screen capturing method, device, terminal and storage medium
CN108984093B (en) Touch operation method and device, storage medium and electronic equipment
KR102373021B1 (en) Global special effect conversion method, conversion device, terminal equipment and storage medium
CN112015270A (en) Terminal control method, terminal and computer storage medium
WO2021232956A1 (en) Device control method and apparatus, and storage medium and electronic device
CN114415886A (en) Application icon management method and electronic equipment
CN113485590A (en) Touch operation method and device
CN113923295B (en) Voice control method, device, electronic equipment and storage medium
WO2022252872A1 (en) Device control method and apparatus, electronic device, and storage medium
CN117813636A (en) Text conversion method and device, storage medium and interaction equipment
CN114049638A (en) Image processing method, image processing device, electronic equipment and storage medium
CN111949141A (en) Handwritten character input method and device, electronic equipment and storage medium
CN114296599A (en) Interface interaction method, device, terminal and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination