CN116610248A - Gesture control method and electronic device - Google Patents


Info

Publication number
CN116610248A
Authority
CN
China
Prior art keywords
gesture
interface
segment
message notification
application
Prior art date
Legal status
Pending
Application number
CN202210122978.XA
Other languages
Chinese (zh)
Inventor
陈宁 (Chen Ning)
Current Assignee
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd
Priority to CN202210122978.XA
Publication of CN116610248A


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

This application provides a gesture control method and an electronic device, and relates to the field of terminal technologies. The method enables a user to operate the electronic device with one hand through preset gestures, improving the user experience. The method includes: while the electronic device displays a first interface, detecting that the user starts inputting a first gesture at a first position of the first interface and finishes inputting it at a second position, and displaying a second interface corresponding to the first gesture. The first gesture includes a first gesture segment and a second gesture segment; the first segment includes the first position, the second segment includes the second position, and the two segments are not on the same straight line. Alternatively, the dwell time of the first gesture at the second position exceeds a preset time.

Description

Gesture control method and electronic device
Technical Field
Embodiments of this application relate to the field of terminal technologies, and in particular to a gesture control method and an electronic device.
Background
With the development of multimedia technology, the display content of electronic devices such as mobile phones has become increasingly rich, and user demand for large-screen devices keeps growing. To meet this demand, the display screens of mobile phones are now generally larger than 6 inches, which makes one-handed operation difficult.
To address this, mobile phones provide a one-handed mode. In one-handed mode, the phone proportionally scales the usable display area into the lower-left or lower-right corner of the screen to make one-handed operation easier. However, even in one-handed mode the usable display area remains relatively large (about 5 inches), so some functions are still hard to reach with one hand, which degrades the user experience.
Disclosure of Invention
To solve this technical problem, embodiments of this application provide a gesture control method and an electronic device. With the technical solutions provided herein, preset gestures let a user conveniently operate the electronic device with one hand, improving the user experience.
To this end, embodiments of this application provide the following technical solutions:
In a first aspect, a gesture control method is provided and applied to an electronic device. The method includes: displaying a first interface; detecting that a user starts inputting a first gesture at a first position of the first interface and finishes inputting it at a second position, where the first gesture includes a first gesture segment and a second gesture segment, the first segment includes the first position, the second segment includes the second position, and the two segments are not on the same straight line, or the dwell time of the first gesture at the second position exceeds a preset time; and displaying a second interface corresponding to the first gesture.
In some embodiments, the first interface may be the desktop or an interface of a first application on the electronic device. While displaying the first interface, the electronic device can perform corresponding operations according to the user's gestures.
In some embodiments, the first position is any position on the first interface, and the first gesture is a two-segment gesture, such as a slide-down-then-right gesture, a slide-down-then-left gesture, or a gesture that slides down and then dwells for a preset time.
For example, if the first gesture is a slide-down-then-right gesture, the first gesture segment is a downward slide that includes the start position (i.e., the first position) of the first gesture, and the second gesture segment is a rightward slide that includes the end position (i.e., the second position). Alternatively, the first gesture consists of a single trace (the trace including the first position and the second position), and its dwell time at the end position (the second position) exceeds a preset time.
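As a rough illustration of the dwell-time variant, the following Kotlin sketch checks whether the finger stayed near the end position of the trace for longer than a preset time. The sample type, the 500 ms preset, and the 20-pixel jitter radius are illustrative assumptions, not values specified in this application.

```kotlin
import kotlin.math.hypot

// One touch sample of the gesture trace: position plus timestamp.
data class Sample(val x: Float, val y: Float, val timeMs: Long)

// Returns true if the finger dwelled at the trace's end position for longer
// than presetMs. The dwell is measured from the last sample that lies outside
// a small radius around the end point.
fun dwellExceedsPreset(
    trace: List<Sample>,
    presetMs: Long = 500,   // assumed preset time
    radiusPx: Float = 20f   // assumed tolerance for finger jitter while dwelling
): Boolean {
    if (trace.size < 2) return false
    val end = trace.last()
    val lastMoving = trace.lastOrNull { hypot(it.x - end.x, it.y - end.y) > radiusPx }
        ?: trace.first() // finger never left the end region: dwell spans the whole trace
    return end.timeMs - lastMoving.timeMs >= presetMs
}
```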
Thus, by presetting a first gesture that requires little operating space, the user can instruct the electronic device to display the corresponding interface with a simple one-handed gesture operation. This meets the one-handed operation needs of users of large-screen electronic devices and improves the user experience.
According to the first aspect, the first gesture segment corresponds to a first direction and the second gesture segment corresponds to a second direction, the first direction being perpendicular to the second direction.
For example, the first gesture segment is a downward slide that includes the start position (the first position) of the first gesture, the second gesture segment is a rightward slide that includes the end position (the second position), the first direction is downward, the second direction is rightward, and downward is perpendicular to rightward.
Thus, the user can instruct the electronic device to switch the displayed interface with a simple two-segment gesture, which makes one-handed operation convenient and reduces operation difficulty.
According to the first aspect or any implementation of the first aspect, before displaying the second interface corresponding to the first gesture, the method further includes: determining whether the first interface is a desktop or a first application interface.
In some embodiments, to make gestures easier to remember, the number of gestures the user needs to memorize is kept small. After detecting a touch operation, the electronic device may first determine whether the current scene is a preset scene. Specifically, after detecting the first gesture, the electronic device determines whether the currently displayed interface is the desktop or a first application interface, and then executes the corresponding response according to the detected gesture.
Thus, by configuring different responses for the same gesture on different interfaces (desktop or application interfaces), the user only needs to memorize a limited number of gestures to trigger the appropriate response on each interface, which reduces operation difficulty.
According to the first aspect or any implementation of the first aspect, the first interface is a desktop, the first gesture includes a first gesture segment and a second gesture segment, the first segment includes the first position, the second segment includes the second position, and the two segments are not on the same straight line. Displaying the second interface corresponding to the first gesture includes: determining the second direction corresponding to the second gesture segment from the coordinate trace of the first gesture, and displaying a control center interface or a message notification interface according to the second direction.
In some embodiments, if the first gesture is a slide-down-then-right gesture, the electronic device may obtain a coordinate array corresponding to the gesture's coordinate trace. The first gesture includes a start point (the first position), an inflection point where the trace turns rightward, and an end point (the second position), and the electronic device may obtain n coordinate values for the first gesture, each coordinate value being [xi, yi] (i = 1, 2, ..., n). During gesture recognition, the electronic device can compute the difference between each coordinate value and the one before it, obtaining a difference array of n-1 terms. The first gesture segment (the segment between the start point and the inflection point) is a downward slide, so in the first k difference terms abs(x difference) <= threshold 1 and the y difference is always positive; based on this feature, the electronic device can determine that the first segment is a downward slide. The second gesture segment (the segment between the inflection point and the end point) is a rightward slide, so from the (k+1)-th to the (n-1)-th difference terms abs(y difference) <= threshold 2 and the x difference is always positive; based on this feature, the electronic device can determine that the second segment is a rightward slide. Accordingly, the electronic device can determine whether the currently detected gesture is the preset first gesture.
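Under the assumptions that screen coordinates grow rightward in x and downward in y, and that the inflection index k has been located separately, the difference-array check described above can be sketched in Kotlin as follows; the threshold values are illustrative, not taken from this application.

```kotlin
import kotlin.math.abs

data class Point(val x: Float, val y: Float)

// Checks the "slide down, then slide right" pattern using the difference array
// of consecutive trace coordinates (screen coordinates: y grows downward).
fun isSlideDownThenRight(
    trace: List<Point>,
    k: Int,                   // index of the inflection point in the difference array
    threshold1: Float = 12f,  // assumed lateral tolerance for the downward segment
    threshold2: Float = 12f   // assumed vertical tolerance for the rightward segment
): Boolean {
    if (trace.size < 3 || k !in 1 until trace.size - 1) return false
    // diffs[i] = trace[i + 1] - trace[i]; n coordinates yield n - 1 differences.
    val diffs = trace.zipWithNext { a, b -> Point(b.x - a.x, b.y - a.y) }
    // First k terms: |x difference| <= threshold 1 and the y difference positive.
    val downSegment = diffs.take(k).all { abs(it.x) <= threshold1 && it.y > 0 }
    // Remaining terms: |y difference| <= threshold 2 and the x difference positive.
    val rightSegment = diffs.drop(k).all { abs(it.y) <= threshold2 && it.x > 0 }
    return downSegment && rightSegment
}
```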
Illustratively, if the second direction is rightward, the second interface is the control center interface; if the second direction is leftward, the second interface is the message notification interface.
In this way, while the desktop is displayed, the user can instruct the electronic device to display the control center interface or the message notification interface with a simple two-segment gesture. This solves the problem that an oversized display screen makes it hard to open the control center or message notification interface with one hand, improving the user experience.
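A minimal sketch of this desktop dispatch, assuming illustrative function names (the application does not name such functions):

```kotlin
enum class Direction { UP, DOWN, LEFT, RIGHT }

// Once the two-segment gesture is recognized on the desktop, the second
// segment's direction selects which interface to show.
fun dispatchDesktopGesture(secondDirection: Direction) {
    when (secondDirection) {
        Direction.RIGHT -> showControlCenter()        // slide down, then right
        Direction.LEFT -> showMessageNotifications()  // slide down, then left
        else -> Unit // no preset response for other directions in this sketch
    }
}

// Hypothetical display hooks; placeholders only.
fun showControlCenter() { /* display the control center interface */ }
fun showMessageNotifications() { /* display the message notification interface */ }
```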
According to the first aspect or any implementation of the first aspect, the first interface is a desktop, the dwell time of the first gesture at the second position exceeds a preset time, and displaying the second interface corresponding to the first gesture includes: determining a target message notification to be displayed according to preset conditions; and, if the target message notification does not support quick reply, displaying a second interface of the application corresponding to the target message notification, the second interface being associated with the target message notification; or, if the target message notification supports quick reply, displaying a quick reply interface corresponding to the target message notification.
In some embodiments, some of the message notifications shown in the message notification interface support quick reply; for those, the electronic device can reply to the corresponding message directly from the message notification interface without opening the corresponding application.
In this way, the electronic device can display the message notification interface in response to the first gesture entered while the desktop is displayed. If the notification to be displayed supports quick reply, the quick reply function can be started directly from the message notification interface; the corresponding application does not need to be opened, which reduces application-launch power consumption and improves efficiency.
According to the first aspect or any implementation of the first aspect, the preset conditions include one or more of: a recently received message notification, a high-priority message notification, a message notification from an application the user uses frequently, a message notification from an application designated by the user, and a message notification that supports quick reply.
Thus, through the preset conditions, the electronic device can select the message notifications that the user cares about or needs to answer, so that the displayed notifications match the user's needs and the user experience improves.
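To make the selection-and-dispatch flow concrete, here is a hedged Kotlin sketch. The ordering of the preset conditions, the data fields, and the display hooks are all assumptions for illustration; the application leaves the exact combination of conditions open.

```kotlin
data class Notice(
    val appId: String,
    val receivedAt: Long,
    val highPriority: Boolean,
    val supportsQuickReply: Boolean
)

// Picks the target message notification by the preset conditions, then shows
// either its quick-reply interface or the owning application's interface.
fun onDwellGestureOnDesktop(pending: List<Notice>, frequentApps: Set<String>) {
    val target = pending
        .sortedWith(
            compareByDescending<Notice> { it.highPriority }    // priority first
                .thenByDescending { it.appId in frequentApps } // then frequent apps
                .thenByDescending { it.receivedAt }            // then most recent
        )
        .firstOrNull() ?: return
    if (target.supportsQuickReply) {
        showQuickReply(target)          // reply without launching the application
    } else {
        openAssociatedInterface(target) // open the app interface tied to the message
    }
}

// Hypothetical hooks; not named by the source.
fun showQuickReply(n: Notice) { /* ... */ }
fun openAssociatedInterface(n: Notice) { /* ... */ }
```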
According to the first aspect or any implementation of the first aspect, the first interface is a first application interface, the first gesture includes a first gesture segment and a second gesture segment, the first segment includes the first position, the second segment includes the second position, and the two segments are not on the same straight line. Displaying the second interface corresponding to the first gesture includes: determining the second direction corresponding to the second gesture segment from the coordinate trace of the first gesture, and, according to the second direction, displaying an interface of a first function of the first application corresponding to the first gesture, or displaying an interface of a second function of the first application corresponding to the first gesture, where the first function and the second function may be the same function or different functions.
In some embodiments, some applications display function controls at the edges of the display screen or at positions that are hard for a finger to reach during one-handed operation. By presetting a simpler first gesture that a user can easily perform with one hand, the user can readily instruct the electronic device to execute the corresponding function, which reduces operation difficulty and improves the user experience.
According to the first aspect or any implementation of the first aspect, the first interface is a first application interface, the dwell time of the first gesture at the second position exceeds a preset time, and displaying the second interface corresponding to the first gesture includes: displaying an interface of a third function of the first application corresponding to the first gesture.
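The in-application case follows the same dispatch pattern. The sketch below only shows the shape of the mapping; which concrete functions correspond to the first, second, and third functions is not specified by the source.

```kotlin
enum class GestureKind { DOWN_THEN_RIGHT, DOWN_THEN_LEFT, DOWN_AND_DWELL }

// Inside a first application (e.g., the camera), the recognized gesture kind
// selects which function interface to display.
fun dispatchInAppGesture(kind: GestureKind) {
    when (kind) {
        GestureKind.DOWN_THEN_RIGHT -> showFunctionInterface("first function")
        GestureKind.DOWN_THEN_LEFT -> showFunctionInterface("second function")
        GestureKind.DOWN_AND_DWELL -> showFunctionInterface("third function")
    }
}

fun showFunctionInterface(name: String) { /* display the named function's interface */ }
```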
According to the first aspect or any implementation of the first aspect, the first application is a camera application.
According to the first aspect or any implementation of the first aspect, the first position is any position on the first interface.
In a second aspect, an electronic device is provided. The electronic device includes a display screen, a processor, and a memory. The display screen and the memory are coupled to the processor; the memory stores computer program code comprising computer instructions that, when read from the memory by the processor, cause the electronic device to perform the following: displaying a first interface; detecting that a user starts inputting a first gesture at a first position of the first interface and finishes inputting it at a second position, where the first gesture includes a first gesture segment and a second gesture segment, the first segment includes the first position, the second segment includes the second position, and the two segments are not on the same straight line, or the dwell time of the first gesture at the second position exceeds a preset time; and displaying a second interface corresponding to the first gesture.
According to the second aspect, the first gesture segment corresponds to a first direction and the second gesture segment corresponds to a second direction, the first direction being perpendicular to the second direction.
According to the second aspect or any implementation of the second aspect, the computer instructions, when read from the memory by the processor, further cause the electronic device to perform the following: determining whether the first interface is a desktop or a first application interface.
According to the second aspect or any implementation of the second aspect, the first interface is a desktop, the first gesture includes a first gesture segment and a second gesture segment, the first segment includes the first position, the second segment includes the second position, and the two segments are not on the same straight line. Displaying the second interface corresponding to the first gesture includes: determining the second direction corresponding to the second gesture segment from the coordinate trace of the first gesture, and displaying a control center interface or a message notification interface according to the second direction.
According to the second aspect or any implementation of the second aspect, the first interface is a desktop, the dwell time of the first gesture at the second position exceeds a preset time, and displaying the second interface corresponding to the first gesture includes: determining a target message notification to be displayed according to preset conditions; and, if the target message notification does not support quick reply, displaying a second interface of the application corresponding to the target message notification, the second interface being associated with the target message notification; or, if the target message notification supports quick reply, displaying a quick reply interface corresponding to the target message notification.
According to the second aspect or any implementation of the second aspect, the preset conditions include one or more of: a recently received message notification, a high-priority message notification, a message notification from an application the user uses frequently, a message notification from an application designated by the user, and a message notification that supports quick reply.
According to the second aspect or any implementation of the second aspect, the first interface is a first application interface, the first gesture includes a first gesture segment and a second gesture segment, the first segment includes the first position, the second segment includes the second position, and the two segments are not on the same straight line. Displaying the second interface corresponding to the first gesture includes: determining the second direction corresponding to the second gesture segment from the coordinate trace of the first gesture, and, according to the second direction, displaying an interface of a first function of the first application corresponding to the first gesture, or displaying an interface of a second function of the first application corresponding to the first gesture, where the first function and the second function may be the same function or different functions.
According to the second aspect or any implementation of the second aspect, the first interface is a first application interface, the dwell time of the first gesture at the second position exceeds a preset time, and displaying the second interface corresponding to the first gesture includes: displaying an interface of a third function of the first application corresponding to the first gesture.
According to the second aspect or any implementation of the second aspect, the first application is a camera application.
According to the second aspect or any implementation of the second aspect, the first position is any position on the first interface.
For the technical effects of the second aspect and any of its implementations, refer to the technical effects of the first aspect and any of its implementations; details are not repeated here.
In a third aspect, an embodiment of this application provides an electronic device having the functionality to implement the gesture control method described in the first aspect and any of its possible implementations. The functionality may be implemented by hardware, or by hardware executing corresponding software. The hardware or software includes one or more modules corresponding to the functionality described above.
For the technical effects of the third aspect and any of its implementations, refer to the technical effects of the first aspect and any of its implementations; details are not repeated here.
In a fourth aspect, a computer-readable storage medium is provided. The computer-readable storage medium stores a computer program (also referred to as instructions or code) that, when executed by an electronic device, causes the electronic device to perform the method of the first aspect or any implementation of the first aspect.
For the technical effects of the fourth aspect and any of its implementations, refer to the technical effects of the first aspect and any of its implementations; details are not repeated here.
In a fifth aspect, embodiments of this application provide a computer program product that, when run on an electronic device, causes the electronic device to perform the method of the first aspect or any implementation of the first aspect.
For the technical effects of the fifth aspect and any of its implementations, refer to the technical effects of the first aspect and any of its implementations; details are not repeated here.
In a sixth aspect, an embodiment of this application provides a circuit system comprising processing circuitry configured to perform the method of the first aspect or any implementation of the first aspect.
For the technical effects of the sixth aspect and any of its implementations, refer to the technical effects of the first aspect and any of its implementations; details are not repeated here.
In a seventh aspect, an embodiment of this application provides a chip system including at least one processor and at least one interface circuit, where the at least one interface circuit is configured to perform transceiver functions and send instructions to the at least one processor; when the at least one processor executes the instructions, it performs the method of the first aspect or any implementation of the first aspect.
For the technical effects of the seventh aspect and any of its implementations, refer to the technical effects of the first aspect and any of its implementations; details are not repeated here.
Drawings
FIG. 1 is the first schematic diagram of an interface according to an embodiment of this application;
FIG. 2 is the second schematic diagram of an interface according to an embodiment of this application;
FIG. 3 is the third schematic diagram of an interface according to an embodiment of this application;
FIG. 4 is a schematic diagram of an electronic device according to an embodiment of this application;
FIG. 5 is a schematic diagram of the hardware structure of an electronic device according to an embodiment of this application;
FIG. 6 is a schematic diagram of the software structure of an electronic device according to an embodiment of this application;
FIG. 7 is the fourth schematic diagram of an interface according to an embodiment of this application;
FIG. 8 is the fifth schematic diagram of an interface according to an embodiment of this application;
FIG. 9 is the sixth schematic diagram of an interface according to an embodiment of this application;
FIG. 10 is the seventh schematic diagram of an interface according to an embodiment of this application;
FIG. 11 is the eighth schematic diagram of an interface according to an embodiment of this application;
FIG. 12 is the ninth schematic diagram of an interface according to an embodiment of this application;
FIG. 13 is the tenth schematic diagram of an interface according to an embodiment of this application;
FIG. 14 is the eleventh schematic diagram of an interface according to an embodiment of this application;
FIG. 15 is the twelfth schematic diagram of an interface according to an embodiment of this application;
FIG. 16 is a flowchart of a gesture control method according to an embodiment of this application;
FIG. 17 is the thirteenth schematic diagram of an interface according to an embodiment of this application;
FIG. 18 is a schematic diagram of a gesture control method according to an embodiment of this application;
FIG. 19 is a schematic structural diagram of an electronic device according to an embodiment of this application.
Detailed Description
The technical solutions in the embodiments of this application are described below with reference to the accompanying drawings. The terminology used in the following embodiments is for describing particular embodiments only and is not intended to limit the application. As used in this specification and the appended claims, the singular forms "a," "an," and "the" are intended to include expressions such as "one or more," unless the context clearly indicates otherwise. It should also be understood that in the following embodiments of this application, "at least one" and "one or more" mean one, two, or more than two.
Reference in this specification to "one embodiment," "some embodiments," and the like means that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the application. Thus, the phrases "in one embodiment," "in some embodiments," "in other embodiments," and the like appearing in this specification do not necessarily all refer to the same embodiment; rather, they mean "one or more but not all embodiments," unless otherwise specifically emphasized. The terms "comprising," "including," "having," and their variants mean "including but not limited to," unless otherwise specifically emphasized. The term "coupled" includes both direct and indirect connections, unless otherwise stated. The terms "first," "second," and the like are used for descriptive purposes only and should not be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated.
In the embodiments of this application, words such as "exemplary" or "for example" are used to indicate an example, instance, or illustration. Any embodiment or design described as "exemplary" or "for example" should not be construed as preferred or advantageous over other embodiments or designs; rather, such words are intended to present related concepts in a concrete manner.
With the development of network and multimedia technologies, electronic devices have been upgraded from small displays to large ones, providing users a better visual experience. However, the larger display also makes the device harder to operate.
Take a mobile phone as an example. When one hand is occupied, for instance holding a handrail while commuting, carrying a bag, or walking a dog, the user can operate the phone with only one hand. Yet phone displays are generally larger than 6 inches, which makes one-handed operation difficult.
For example, the control center function and the message notification function are aggregated in the phone's pull-down notification bar. To open the control center or the message notifications, the user slides down from the right or left side of the top of the phone (usually the side where the receiver and camera are located), pulling down the corresponding bar. For example, as shown in interface 101 in (a) of FIG. 1, when the phone detects the user sliding down in the direction of arrow 12 from any position in the top-right area indicated by reference numeral 11, it can display the control center interface 102 shown in (b) of FIG. 1. Similarly, as shown in interface 201 in (a) of FIG. 2, when the phone detects the user sliding down in the direction of arrow 22 from any position in the top-left area indicated by reference numeral 21, it can display the message notification interface 202 shown in (b) of FIG. 2.
It can be seen that whether the user wants to open the control center or the message notifications, the slide gesture must start from the top area of the display screen. Under one-handed operation on a large display, however, the reach of the user's finger is limited, and such gestures are hard to complete with one hand, which hinders use.
To help with this, the phone can reduce the usable display area after one-handed mode is enabled, making one-handed operation easier. For example, the phone may enter one-handed mode after detecting the user sliding along the bottom of the display screen and pausing. As shown in interface 301 in (a) of FIG. 3, the phone detects the user sliding leftward along the bottom of the screen in the direction of arrow 32, starting from the position indicated by reference numeral 31, pausing at the position indicated by reference numeral 33, and then lifting the finger; the phone then determines to enter one-handed mode and displays the interface 302 shown in (b) of FIG. 3. Interface 302 is the phone's display interface in one-handed mode, in which the small screen 34 is the usable display area; shrinking the usable display area in this way facilitates one-handed operation.
It can be seen that, to operate the phone with one hand, the user must first enter one-handed mode through the rather awkward slide-and-pause gesture along the bottom of the screen.
Moreover, take a common large-screen phone of about 6 inches as an example. Even in one-handed mode, the usable display area is still about 5 inches, and some functions remain difficult to operate. As shown in interface 302 in (b) of FIG. 3, to open the control center or the message notifications with one hand, the user still has to slide from a position within the top area of the small screen 34, shown in the dashed box indicated by reference numeral 35. The still-sizable area of the small screen 34 thus continues to hamper operation.
Therefore, embodiments of this application provide a gesture control method that presets gesture operations which can be performed at any position on the display screen of the electronic device, meeting the user's need to operate the phone with one hand and improving the user experience.
It should be noted that in the following embodiments, "up," "down," "left," and "right" refer to the orientation shown in (a) of FIG. 1 and are not described again.
The gesture control method provided by the embodiments of this application can be applied to the electronic device 100. As shown in FIG. 4, the electronic device 100 may be a terminal device with a display function, such as a mobile phone 41, a tablet computer 42, a notebook computer 43, an ultra-mobile personal computer (UMPC), a netbook, a personal digital assistant (PDA), or an artificial intelligence (AI) device. The application does not limit the specific type of the electronic device or the operating system installed on it.
By way of example, fig. 5 shows a schematic structural diagram of the electronic device 100.
The electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (universal serial bus, USB) interface 130, a charge management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a sensor module 180, keys 190, a motor 191, an indicator 192, a camera 193, a display 194, and a subscriber identity module (subscriber identification module, SIM) card interface 195, etc.
It should be understood that the illustrated structure of the embodiment of the present application does not constitute a specific limitation on the electronic device 100. In other embodiments of the application, electronic device 100 may include more or fewer components than shown, or certain components may be combined, or certain components may be split, or different arrangements of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 110 may include one or more processing units, such as: the processor 110 may include an application processor (application processor, AP), a modem processor, a graphics processor (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), a controller, a video codec, a digital signal processor (digital signal processor, DSP), a baseband processor, and/or a neural network processor (neural-network processing unit, NPU), etc. Wherein the different processing units may be separate devices or may be integrated in one or more processors.
The controller can generate operation control signals according to instruction operation codes and timing signals, completing the control of instruction fetching and instruction execution.
A memory may also be provided in the processor 110 to store instructions and data. In some embodiments, the memory in the processor 110 is a cache. This memory may hold instructions or data that the processor 110 has just used or uses cyclically. If the processor 110 needs the instructions or data again, it can call them directly from this memory, which avoids repeated accesses, reduces the processor's waiting time, and thus improves system efficiency.
In some embodiments, the processor 110 may include one or more interfaces. The interfaces may include an integrated circuit (inter-integrated circuit, I2C) interface, an integrated circuit built-in audio (inter-integrated circuit sound, I2S) interface, a pulse code modulation (pulse code modulation, PCM) interface, a universal asynchronous receiver transmitter (universal asynchronous receiver/transmitter, UART) interface, a mobile industry processor interface (mobile industry processor interface, MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (subscriber identity module, SIM) interface, and/or a universal serial bus (universal serial bus, USB) interface, among others.
The I2C interface is a bidirectional synchronous serial bus comprising a serial data line (SDA) and a serial clock line (SCL). In some embodiments, the processor 110 may contain multiple sets of I2C buses. The processor 110 may be coupled to the touch sensor, a charger, a flash, the camera 193, and the like through different I2C bus interfaces. For example, the processor 110 may be coupled to the touch sensor through an I2C interface, so that the processor 110 communicates with the touch sensor through the I2C bus interface to implement the touch function of the electronic device 100.
The MIPI interface may be used to connect the processor 110 to peripheral devices such as a display 194, a camera 193, and the like. The MIPI interfaces include camera serial interfaces (camera serial interface, CSI), display serial interfaces (display serial interface, DSI), and the like. In some embodiments, processor 110 and camera 193 communicate through a CSI interface to implement the photographing functions of electronic device 100. The processor 110 and the display 194 communicate via a DSI interface to implement the display functionality of the electronic device 100.
The USB interface 130 is an interface conforming to USB standard specifications, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type-C interface, or the like. The USB interface 130 may be used to connect a charger to charge the electronic device 100, to transfer data between the electronic device 100 and peripheral devices, or to connect a headset and play audio through it. The interface may also be used to connect other electronic devices, such as AR devices.
It should be understood that the interfacing relationship between the modules illustrated in the embodiments of the present application is only illustrative, and is not meant to limit the structure of the electronic device 100. In other embodiments of the present application, the electronic device 100 may also employ different interfacing manners in the above embodiments, or a combination of multiple interfacing manners.
The charge management module 140 is configured to receive a charge input from a charger. The charger can be a wireless charger or a wired charger. In some wired charging embodiments, the charge management module 140 may receive a charging input of a wired charger through the USB interface 130. In some wireless charging embodiments, the charge management module 140 may receive wireless charging input through a wireless charging coil of the electronic device 100. The charging management module 140 may also supply power to the electronic device through the power management module 141 while charging the battery 142.
The power management module 141 is used for connecting the battery 142, and the charge management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140 to power the processor 110, the internal memory 121, the display 194, the camera 193, the wireless communication module 160, and the like. The power management module 141 may also be configured to monitor battery capacity, battery cycle number, battery health (leakage, impedance) and other parameters. In other embodiments, the power management module 141 may also be provided in the processor 110. In other embodiments, the power management module 141 and the charge management module 140 may be disposed in the same device.
The wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the electronic device 100 may be used to cover a single or multiple communication bands. Different antennas may also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed into a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution for wireless communication including 2G/3G/4G/5G, etc., applied to the electronic device 100. The mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (low noise amplifier, LNA), etc. The mobile communication module 150 may receive electromagnetic waves from the antenna 1, perform processes such as filtering, amplifying, and the like on the received electromagnetic waves, and transmit the processed electromagnetic waves to the modem processor for demodulation. The mobile communication module 150 can amplify the signal modulated by the modem processor, and convert the signal into electromagnetic waves through the antenna 1 to radiate. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be provided in the same device as at least some of the modules of the processor 110.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating the low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then transmits the demodulated low frequency baseband signal to the baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs sound signals through an audio device or displays images or video through a display screen 194. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be provided in the same device as the mobile communication module 150 or other functional module, independent of the processor 110.
The wireless communication module 160 may provide solutions for wireless communication including wireless local area network (wireless local area networks, WLAN) (e.g., wireless fidelity (wireless fidelity, wi-Fi) network), bluetooth (BT), global navigation satellite system (global navigation satellite system, GNSS), frequency modulation (frequency modulation, FM), near field wireless communication technology (near field communication, NFC), infrared technology (IR), etc., as applied to the electronic device 100. The wireless communication module 160 may be one or more devices that integrate at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, modulates the electromagnetic wave signals, filters the electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, frequency modulate it, amplify it, and convert it to electromagnetic waves for radiation via the antenna 2.
In some embodiments, antenna 1 and mobile communication module 150 of electronic device 100 are coupled, and antenna 2 and wireless communication module 160 are coupled, such that electronic device 100 may communicate with a network and other devices through wireless communication techniques. The wireless communication techniques may include the Global System for Mobile communications (global system for mobile communications, GSM), general packet radio service (general packet radio service, GPRS), code division multiple access (code division multiple access, CDMA), wideband code division multiple access (wideband code division multiple access, WCDMA), time division code division multiple access (time-division code division multiple access, TD-SCDMA), long term evolution (long term evolution, LTE), BT, GNSS, WLAN, NFC, FM, and/or IR techniques, among others. The GNSS may include a global satellite positioning system (global positioning system, GPS), a global navigation satellite system (global navigation satellite system, GLONASS), a beidou satellite navigation system (beidou navigation satellite system, BDS), a quasi zenith satellite system (quasi-zenith satellite system, QZSS) and/or a satellite based augmentation system (satellite based augmentation systems, SBAS).
The electronic device 100 implements display functions through a GPU, a display screen 194, an application processor, and the like. The GPU is a microprocessor for image processing, and is connected to the display 194 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
The display screen 194 is used to display images, videos, and the like. The display screen 194 includes a display panel. The display panel may use a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or the like. In some embodiments, the electronic device 100 may include 1 or N display screens 194, where N is a positive integer greater than 1.
The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image onto the photosensitive element. The photosensitive element may be a charge coupled device (charge coupled device, CCD) or a Complementary Metal Oxide Semiconductor (CMOS) phototransistor. The photosensitive element converts the optical signal into an electrical signal, which is then transferred to the ISP to be converted into a digital image signal. The ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into an image signal in a standard RGB, YUV, or the like format. In some embodiments, electronic device 100 may include 1 or N cameras 193, N being a positive integer greater than 1.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to enable expansion of the memory capabilities of the electronic device 100. The external memory card communicates with the processor 110 through an external memory interface 120 to implement data storage functions. For example, files such as music, video, etc. are stored in an external memory card.
The internal memory 121 may be used to store computer executable program code including instructions. The internal memory 121 may include a storage program area and a storage data area. The storage program area may store an application program (such as a sound playing function, an image playing function, etc.) required for at least one function of the operating system, etc. The storage data area may store data created during use of the electronic device 100 (e.g., audio data, phonebook, etc.), and so on. In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (universal flash storage, UFS), and the like. The processor 110 performs various functional applications of the electronic device 100 and data processing by executing instructions stored in the internal memory 121 and/or instructions stored in a memory provided in the processor.
The audio module 170 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or a portion of the functional modules of the audio module 170 may be disposed in the processor 110. The electronic device 100 may play, record, etc. music through the audio module 170. The audio module 170 may include a speaker, a receiver, a microphone, a headphone interface, an application processor, etc. to implement audio functions.
The sensor module 180 may include a pressure sensor, a gyroscope sensor, a barometric sensor, a magnetic sensor, an acceleration sensor, a distance sensor, a proximity sensor, a fingerprint sensor, a temperature sensor, a touch sensor, an ambient light sensor, a bone conduction sensor, and the like.
The pressure sensor is used to sense pressure signals and can convert them into electrical signals. In some embodiments, the pressure sensor may be disposed on the display screen 194. There are many kinds of pressure sensors, such as resistive, inductive, and capacitive pressure sensors. A capacitive pressure sensor may comprise at least two parallel plates of conductive material; when force is applied to the sensor, the capacitance between the electrodes changes, and the electronic device 100 determines the strength of the pressure from the change in capacitance. When a touch operation acts on the display screen, the electronic device 100 detects the intensity of the touch operation through the pressure sensor, and may also calculate the touch position from the sensor's detection signal. In some embodiments, touch operations that act on the same touch position but with different intensities may correspond to different operation instructions. For example, when a touch operation whose intensity is below a first pressure threshold acts on the Messages application icon, an instruction to view messages is executed; when a touch operation whose intensity is greater than or equal to the first pressure threshold acts on the Messages application icon, an instruction to create a new message is executed.
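An illustrative sketch of the pressure-threshold behavior just described; the threshold value and function names are assumptions:

```kotlin
// Assumed first pressure threshold on a normalized 0..1 pressure scale.
const val FIRST_PRESSURE_THRESHOLD = 0.5f

fun onMessagesIconTouch(pressure: Float) {
    if (pressure < FIRST_PRESSURE_THRESHOLD) {
        viewMessages()       // lighter press: view short messages
    } else {
        composeNewMessage()  // firmer press: create a new short message
    }
}

// Hypothetical hooks; placeholders only.
fun viewMessages() { /* ... */ }
fun composeNewMessage() { /* ... */ }
```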
The touch sensor is also called a "touch device." The touch sensor may be disposed on the display screen 194; together, the touch sensor and the display screen 194 form a touchscreen, also called a "touch screen." The touch sensor is used to detect touch operations on or near it and may pass the detected touch operation to the application processor to determine the touch event type. Visual output related to the touch operation may be provided through the display screen 194. In other embodiments, the touch sensor may be disposed on a surface of the electronic device 100 at a position different from that of the display screen 194.
In some embodiments, while displaying content through the display screen 194, the electronic device 100 detects the user's touch operations on the screen through the touch sensor. The electronic device 100 can determine the touch trajectory from the touch positions and thereby determine whether the current touch operation is a touch operation of a preset gesture. If it is, the electronic device 100 may execute the corresponding function and display the corresponding interface on the display screen 194; for example, in response to the preset gesture operation, the phone displays the message notification interface on the display screen 194.
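One plausible way to collect the touch trajectory on Android is via MotionEvent, as sketched below; treating MotionEvent as the integration point is an assumption, since the source does not specify an API.

```kotlin
import android.view.MotionEvent

// Collects the touch trajectory from Android MotionEvents and hands the
// completed trace to a matcher (e.g., the segment check sketched earlier).
class GestureTraceCollector(private val onTrace: (List<Pair<Float, Float>>) -> Unit) {
    private val trace = mutableListOf<Pair<Float, Float>>()

    fun onTouchEvent(event: MotionEvent): Boolean {
        when (event.actionMasked) {
            MotionEvent.ACTION_DOWN -> { trace.clear(); trace += event.x to event.y }
            MotionEvent.ACTION_MOVE -> trace += event.x to event.y
            MotionEvent.ACTION_UP -> onTrace(trace.toList()) // match against preset gestures
        }
        return true
    }
}
```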
The keys 190 include a power-on key, a volume key, etc. The keys 190 may be mechanical keys. Or may be a touch key. The electronic device 100 may receive key inputs, generating key signal inputs related to user settings and function controls of the electronic device 100.
The motor 191 may generate a vibration cue. The motor 191 may be used for incoming call vibration alerting as well as for touch vibration feedback. For example, touch operations acting on different applications (e.g., photographing, audio playing, etc.) may correspond to different vibration feedback effects. The motor 191 may also correspond to different vibration feedback effects by touching different areas of the display screen 194. Different application scenarios (such as time reminding, receiving information, alarm clock, game, etc.) can also correspond to different vibration feedback effects. The touch vibration feedback effect may also support customization.
The indicator 192 may be an indicator light and may be used to indicate charging status, battery level changes, messages, missed calls, notifications, and the like.
The SIM card interface 195 is used to connect a SIM card. The SIM card may be inserted into the SIM card interface 195, or removed from the SIM card interface 195 to enable contact and separation with the electronic device 100. The electronic device 100 may support 1 or N SIM card interfaces, N being a positive integer greater than 1.
The software system of the electronic device 100 may employ a layered architecture, an event driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture. In the embodiment of the application, taking an Android system with a layered architecture as an example, a software structure of the electronic device 100 is illustrated.
Fig. 6 is a software configuration block diagram of the electronic device 100 according to the embodiment of the present application.
The layered architecture divides the software into several layers, each with a distinct role and division of labor. The layers communicate with each other through software interfaces. In some embodiments, the Android system is divided into four layers, from top to bottom: an application layer, an application framework layer, Android runtime (Android runtime) and system libraries, and a kernel layer.
The application layer may include a series of application packages.
As shown in fig. 6, the application package may include applications for cameras, navigation, short messages, calendars, music, gallery, maps, calls, videos, etc.
The application framework layer provides an application programming interface (application programming interface, API) and programming framework for application programs of the application layer. The application framework layer includes a number of predefined functions.
As shown in FIG. 6, the application framework layer may include a window manager, a content provider, a view system, a resource manager, a notification manager, a gesture module, a scene determination module, and the like.
The window manager is used for managing window programs. The window manager can acquire the size of the display screen, judge whether a status bar exists, lock the screen, intercept the screen and the like.
The content provider is used to store and retrieve data and make such data accessible to applications. The data may include video, images, audio, calls made and received, browsing history and bookmarks, phonebooks, etc.
The view system includes visual controls, such as controls to display text, controls to display pictures, and the like. The view system may be used to build applications. The display interface may be composed of one or more views. For example, a display interface including a text message notification icon may include a view displaying text and a view displaying a picture.
The resource manager provides various resources for the application program, such as localization strings, icons, pictures, layout files, video files, and the like.
The notification manager allows an application to display notification information in the status bar; it can be used to convey notification-type messages that automatically disappear after a short dwell without requiring user interaction. For example, the notification manager is used to notify that a download is complete, to provide message alerts, etc. The notification manager may also present notifications in the form of charts or scroll-bar text in the system top status bar, such as notifications of applications running in the background, or notifications in the form of a dialog window on the screen. For example, text information is prompted in the status bar, a prompt tone is emitted, the electronic device vibrates, an indicator light blinks, etc.
The gesture module is used for identifying gestures and determining whether the gesture corresponding to the touch operation is a preset gesture. In some embodiments, when a touch operation is received by a touch sensor, a corresponding hardware interrupt is issued to the kernel layer. The kernel layer processes the touch operation into the original input event (including information such as touch coordinates, time stamp of touch operation, etc.). The original input event is stored at the kernel layer. The gesture module in the application framework layer can acquire an original input event from the kernel layer, and identify a gesture corresponding to the input event.
The scene determination module is configured to determine whether the electronic device 100 is in a preset scene. The preset scenes include, for example, a desktop display scene, a target application display scene and the like. Preset gestures corresponding to different scenes can be preset in the electronic device 100, and the same preset gestures can be used for indicating the electronic device 100 to execute different functions in different scenes. Thereby avoiding the need of memorizing a large number of gesture operations and corresponding functions for the user and reducing the operation difficulty of the user.
The Android runtime includes a core library and virtual machines. The Android runtime is responsible for scheduling and management of the Android system.
The core library consists of two parts: one part is the functions that the java language needs to call, and the other part is the core library of Android.
The application layer and the application framework layer run in a virtual machine. The virtual machine executes java files of the application program layer and the application program framework layer as binary files. The virtual machine is used for executing the functions of object life cycle management, stack management, thread management, security and exception management, garbage collection and the like.
The system library may include a plurality of functional modules. For example: surface manager (surface manager), media Libraries (Media Libraries), three-dimensional graphics processing Libraries (e.g., openGL ES), two-dimensional graphics engines (e.g., SGL), etc.
The surface manager is used to manage the display subsystem and provides a fusion of 2D and 3D layers for multiple applications.
Media libraries support a variety of commonly used audio, video format playback and recording, still image files, and the like. The media library may support a variety of audio and video encoding formats, such as MPEG4, h.264, MP3, AAC, AMR, JPG, PNG, etc.
The three-dimensional graphic processing library is used for realizing three-dimensional graphic drawing, image rendering, synthesis, layer processing and the like.
The two-dimensional graphics engine is a drawing engine for 2D drawing.
The kernel layer is a layer between hardware and software. The inner core layer at least comprises a display driver, a camera driver, an audio driver and a sensor driver.
In some scenarios, displaying the functions in the drop-down notification bar as shown in fig. 1 or fig. 2 is a scenario that users apply frequently, and it is also a scenario greatly affected by the display screen size. Therefore, taking the electronic device 100 as a mobile phone and the scenario in which the user needs to open the control center function or the message notification function in the notification bar as an example, the gesture control method provided by the embodiment of the present application is described below.
In some embodiments, the control center function and the message notification function are aggregated in the mobile phone notification bar. As shown in fig. 1, a slide-down gesture starting at the top right of the mobile phone may trigger the display of the control center function interface; as shown in fig. 2, a slide-down gesture starting at the top left of the mobile phone may trigger the display of the message notification function interface. Limited by the gesture start position and the display screen size, it is difficult for a user to complete these operations with one hand. Therefore, a preset gesture starting from any position on the display screen can be preset to trigger the control center function or the message notification function.
In addition, in the actual application process, in order to avoid conflicts with gestures that trigger other functions, the preset gesture for triggering the control center function or the message notification function may be different from the gestures for triggering other functions. For example, the preset gesture is set to be a two-segment gesture different from the existing single-segment gestures (such as a down-slide gesture, an up-slide gesture, a left-slide gesture, a right-slide gesture, etc.), for example a down-slide-then-right-slide gesture, a down-slide-then-left-slide gesture, or a down-slide gesture followed by a pause for a preset time. The down-slide-then-right-slide gesture comprises a first segment of sliding down and a second segment of sliding right.
It will be appreciated that the mobile phone determines that a gesture begins when it detects that the user's finger touches the display screen, and determines that the gesture ends when it detects that the finger leaves the display screen. That is, the finger touching the display screen at the beginning of the first segment marks the start of the current gesture, and the finger leaving the display screen at the end of the second segment marks the end of the current gesture.
Further, based on the positions at which these functions are triggered in the existing notification bar (for example, the right side of the top of the mobile phone corresponds to the control center function, and the left side of the top corresponds to the message notification function), the down-slide-then-right-slide gesture 1 at any position of the display screen can be set to indicate display of the control center function interface, and the down-slide-then-left-slide gesture 2 at any position of the display screen can be set to indicate display of the message notification function interface.
It should be noted that the mapping may also be reversed: the down-slide-then-right-slide gesture 1 may be used to instruct the mobile phone to display the message notification function interface, and the down-slide-then-left-slide gesture 2 may be used to instruct the mobile phone to display the control center function interface. The gesture control method provided by the embodiment of the present application is described below by taking gesture 1 (down-slide then right-slide) as indicating display of the control center function interface and gesture 2 (down-slide then left-slide) as indicating display of the message notification function interface. In addition, the description of the preset gestures in the embodiment of the present application does not limit the embodiment of the present application; the preset gesture for triggering a corresponding function may be any gesture.
For example, as shown in interface 701 in fig. 7 (a), when the mobile phone detects that the user slides down and right in the direction indicated by arrow 72 at the position indicated by reference numeral 71 (the position is any position on the display screen), leaves the display screen at the position indicated by reference numeral 73, and ends the operation of the current gesture, it may be determined that the gesture corresponding to the user operation is gesture 1. Then the handset may activate the control center function in the notification bar and display the control center function interface 702 as shown in fig. 7 (b).
As another example, as shown in the interface 801 in fig. 8 (a), when the mobile phone detects that the user slides down and slides left in the direction indicated by the arrow 82 at the position indicated by the reference numeral 81 (the position is any position on the display screen), leaves the display screen at the position indicated by the reference numeral 83, and ends the operation of the current gesture, it may be determined that the gesture corresponding to the user operation is gesture 2. Then the handset may initiate a message notification function in the notification bar and display a message notification function interface 802 as shown in fig. 8 (b).
Therefore, the user can display the control center function interface or the message notification function interface of the mobile phone by one hand through simple gesture operation at any position of the display screen. The requirement of single-hand operation of the large-screen mobile phone user is met, and the use experience of the user is improved.
In some embodiments, when the user indicates that the message notification function interface (i.e., the message notification bar) should be displayed, the user typically needs to view or reply to a message notification of interest. However, the message notifications displayed in the message notification function interface include all message notifications that the user has not viewed or cleared before this indication operation, and their number may be large; the user then needs to look for the message notification of interest or requiring a reply among multiple message notifications (e.g., by sliding through the message notification function interface). This searching process is cumbersome and time-consuming, which affects the use experience of the user. In addition, when many message notifications are displayed in the message notification function interface, some message notifications only display part of their content, and if the user needs to view the complete content, the user needs to click on the corresponding message notification again, which increases the operation difficulty of the user.
Illustratively, as in the scenario shown in fig. 2, the user instructs the mobile phone to display the message notification function interface 202 shown in fig. 2 (b) via a top slide-down gesture. A plurality of message notifications are displayed on the message notification function interface 202, and the user needs to find the desired message among them. Moreover, as shown in the message notification function interface 202, after receiving news message notifications from a plurality of news applications, the mobile phone folds some of them. After the mobile phone detects the operation of the user clicking the control 23, it determines that the user needs to view all news information and expands the folded news information for convenient viewing. Thereafter, if the mobile phone detects that the user clicks one of the news information notifications, the corresponding news application can be opened to display the news information the user is interested in.
In contrast, in the gesture control method provided by the embodiment of the present application, a preset gesture is set, and after the mobile phone detects the preset gesture, the corresponding message notification can be opened according to a preset condition. The preset gesture is gesture 3, which slides down at any position of the display screen and then pauses for a preset time. The preset time may be determined by a developer according to experimental data, empirical values, and the like, for example 1 second (i.e., if the dwell time after sliding down exceeds 1 second before the finger leaves the display screen, the gesture may be determined as gesture 3); this is not particularly limited in the embodiment of the present application. The preset condition includes, for example, one or more of the most recently received message notification, a message notification with a high priority, a message notification corresponding to an application the user commonly uses, a message notification corresponding to an application set by the user, a message notification with a quick reply function, and the like.
Taking the most recently received message notification as the preset condition, the message notification function interface 901 shown in fig. 9 (a) is used as an example. The message notifications received by the message notification function include a resident message as shown at reference numeral 91 and message notifications as shown at reference numeral 92. The resident message is always displayed in the message notification function interface, while the other message notifications are displayed in the message notification function interface after they are received. After receiving a message notification other than the resident message, the mobile phone can determine the timestamp corresponding to the message notification and display the corresponding time information on the message notification function interface based on the timestamp. As shown in the message notification function interface 901, the mobile phone has just received a message corresponding to the fluent application, the short message application received a message 5 minutes earlier, and the news application received a message notification 10 minutes earlier.
Based on the scenario shown in fig. 9 (a), as shown in the interface 902 in fig. 9 (b), in the process of displaying the desktop, the mobile phone detects that the user slides down along the direction shown by arrow 94 at the position shown by reference numeral 93 (the position is any position on the display screen), stops sliding at the position shown by reference numeral 95, and leaves the display screen after the dwell exceeds the preset time, ending the current gesture; the mobile phone can then determine that the gesture corresponding to the user operation is gesture 3. The mobile phone may detect the timestamp at which the user began to input gesture 3 (i.e., the timestamp when the position indicated by reference numeral 93 was touched), and then determine, based on that timestamp, the most recently received message in the message notification function. As shown in the scenario of fig. 9 (a), the message that the mobile phone most recently received is the message corresponding to the fluent application. Then, after detecting gesture 3 shown in fig. 9 (b), the mobile phone may open the fluent application and display the interface 903 shown in fig. 9 (c), where the interface 903 is the fluent application interface.
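As an illustrative sketch (not limiting the embodiment of the present application), the selection under the "most recently received" preset condition can be expressed as follows; the MessageNotification class and its fields are hypothetical stand-ins for the notification records described above:

```java
// Illustrative sketch: pick the target notification under the
// "most recently received" preset condition described above.
import java.util.Comparator;
import java.util.List;
import java.util.Optional;

public class NotificationSelector {
    public static class MessageNotification {
        public final String appName;
        public final long receivedTimestampMs;
        public final boolean resident; // resident messages are not candidates

        public MessageNotification(String appName, long receivedTimestampMs, boolean resident) {
            this.appName = appName;
            this.receivedTimestampMs = receivedTimestampMs;
            this.resident = resident;
        }
    }

    // Returns the non-resident notification received most recently before the
    // timestamp at which the user began to input gesture 3.
    public static Optional<MessageNotification> mostRecent(
            List<MessageNotification> pending, long gestureStartTimestampMs) {
        return pending.stream()
                .filter(n -> !n.resident && n.receivedTimestampMs <= gestureStartTimestampMs)
                .max(Comparator.comparingLong(n -> n.receivedTimestampMs));
    }
}
```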
Also by way of example, take a message notification with a high priority as the preset condition. As shown in the interface 1001 in fig. 10 (a), in the process of displaying the desktop, the mobile phone detects that the user slides down along the direction shown by arrow 102 at the position shown by reference numeral 101 (the position is any position on the display screen), stops sliding at the position shown by reference numeral 103, and leaves the display screen after the dwell exceeds the preset time, ending the current gesture; the mobile phone can then determine that the gesture corresponding to the user operation is gesture 3. Based on the scenario shown in fig. 9 (a), the mobile phone determines that the message notifications currently received by the message notification function include a fluent application message, a short message application message, and a news application message. Optionally, as shown in table 1 below, the priorities corresponding to the message notifications of the applications may be preset in the mobile phone, including the 5 priorities shown in table 1; assume that the priority of the message notification of the short message application is MAX, which is higher than the priorities of the fluent application message and the news application message. Then, after determining the priorities, the mobile phone may open the short message application and display the short message application interface 1002 shown in fig. 10 (b).
TABLE 1

Priority level | Meaning
MAX | Important and urgent message notification, time-critical or requiring immediate handling by the user
HIGH | High-priority message notification, for important communication content such as short messages
DEFAULT | Message notification of no special priority
LOW | Low-priority message notification without urgency
MIN | Background message notification
It should be noted that, the priority of each message notification may be determined according to the developer setting or the user setting, and after detecting the gesture 3 in the preset gestures, the mobile phone may determine the priority order of each message notification first, and then determine the application corresponding to the message notification that needs to be started.
Also by way of example, take the message notification corresponding to an application commonly used by the user as the preset condition. Corresponding to the scenario shown in fig. 9 (a), the mobile phone detects gesture 3 on the interface 1001 shown in fig. 10 (a). Then, according to the user's usage habits, the mobile phone determines that, among the applications corresponding to the currently received message notifications, the application most commonly used by the user is the short message application. The mobile phone may then open the short message application and display the interface 1002 shown in fig. 10 (b).
Further, take the combination of the most recently received message notification and the message notification with a high priority as the preset condition. The mobile phone may then combine the time information and the priority of each message notification to determine the message notification that the user indicates to view. For example, when gesture 3 is detected, the mobile phone determines that there are multiple message notifications in the message notification function. The priorities of the message notifications are sorted, and it is determined that there are multiple message notifications at the highest priority (e.g., the message notifications corresponding to application 1, application 2, and application 3 all have priority MAX). The timestamps of these highest-priority message notifications may then be sorted again to determine the message notification whose timestamp is closest to the timestamp at which gesture 3 was detected. If the mobile phone determines that the most recently received message notification is the one corresponding to application 1, the mobile phone may open application 1 and display the message notification received by application 1.
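A hedged sketch of this combined judgment, assuming a hypothetical MessageNotification record and an enum mirroring the priority levels of Table 1:

```java
// Illustrative sketch: first keep only the notifications at the highest
// present priority, then pick the most recently received one among them.
import java.util.Comparator;
import java.util.List;
import java.util.Optional;

public class PriorityRecencySelector {
    // Mirrors the five priority levels of Table 1, highest first.
    public enum Priority { MAX, HIGH, DEFAULT, LOW, MIN }

    public static class MessageNotification {
        public final String appName;
        public final Priority priority;
        public final long receivedTimestampMs;

        public MessageNotification(String appName, Priority priority, long receivedTimestampMs) {
            this.appName = appName;
            this.priority = priority;
            this.receivedTimestampMs = receivedTimestampMs;
        }
    }

    public static Optional<MessageNotification> select(List<MessageNotification> pending) {
        // The highest priority actually present (MAX has the smallest ordinal).
        Optional<Priority> top = pending.stream()
                .map(n -> n.priority)
                .min(Comparator.comparingInt(p -> p.ordinal()));
        // Among the notifications at that priority, take the most recent one.
        return top.flatMap(p -> pending.stream()
                .filter(n -> n.priority == p)
                .max(Comparator.comparingLong(n -> n.receivedTimestampMs)));
    }
}
```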
It should be noted that the mobile phone may also determine the message notification to be opened in response to gesture 3 according to other preset conditions; for details, refer to the related content above on determining the message notification to be opened according to the preset conditions, which is not illustrated one by one in the embodiment of the present application.
In some embodiments, some of the message notifications displayed in the message notification function interface have a quick reply function, and the mobile phone can implement a quick reply to the corresponding message notification directly in the message notification function interface without opening the corresponding application. The preset condition may or may not include the message notification having a quick reply function. For example, the preset condition includes the message notification with a quick reply function: after detecting gesture 3, the mobile phone can determine, among the received message notifications, the message notification with the quick reply function, and directly start its quick reply function on the message notification function interface, so that the user can reply to the message notification directly there. As another example, the preset condition does not include the message notification with a quick reply function: after the mobile phone detects gesture 3 and determines the message notification to be opened, if that message notification has the quick reply function, the quick reply function can be started directly, without starting and displaying the corresponding application interface. As yet another example, in the case where the preset condition includes the message notification with a quick reply function, after detecting gesture 3, the mobile phone determines according to the preset condition that the message notification to be opened has the quick reply function, and starts the quick reply function of that message notification on the message notification function interface. In this way, the power consumption of starting applications is reduced and efficiency is improved.
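The branching between quick reply and application launch can be sketched as follows. On Android, a notification's quick reply capability is commonly carried by a Notification.Action with a RemoteInput; the dispatcher class and its UI hooks here are hypothetical:

```java
// Hedged sketch of the branching described above; not the actual
// implementation of the embodiment.
import android.app.Notification;
import android.app.RemoteInput;

public class QuickReplyDispatcher {
    // A notification supports quick reply if any of its actions carries RemoteInput.
    static boolean hasQuickReply(Notification notification) {
        if (notification.actions == null) return false;
        for (Notification.Action action : notification.actions) {
            RemoteInput[] inputs = action.getRemoteInputs();
            if (inputs != null && inputs.length > 0) return true;
        }
        return false;
    }

    void handleGesture3Target(Notification target) {
        if (hasQuickReply(target)) {
            // Stay on the message notification function interface.
            showInlineReplyOnNotificationPanel(target);
        } else {
            // Open the application associated with the notification.
            launchCorrespondingApplication(target);
        }
    }

    // Hypothetical UI hooks.
    void showInlineReplyOnNotificationPanel(Notification n) { /* ... */ }
    void launchCorrespondingApplication(Notification n) { /* ... */ }
}
```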
Taking the most recently received message notification as the preset condition, the message notification function interface 1101 shown in fig. 11 (a) is used as an example. As indicated by reference numeral 111, the mobile phone can determine from the timestamp that the short message application has just received a message.
Based on the scenario shown in fig. 11 (a), as shown in the interface 1102 in fig. 11 (b), in the process of displaying the desktop, the mobile phone detects that the user slides down along the direction shown by arrow 113 at the position shown by reference numeral 112 (the position is any position on the display screen), stops sliding at the position shown by reference numeral 114, and leaves the display screen after the dwell exceeds the preset time, ending the current gesture; the mobile phone can then determine that the gesture corresponding to the user operation is gesture 3. The mobile phone may detect the timestamp at which the user began to input gesture 3 (i.e., the timestamp when the position indicated by reference numeral 112 was touched), and then determine the message notification received by the message notification function whose timestamp is closest to that timestamp. As shown in fig. 11 (a), the message most recently received by the mobile phone is the message corresponding to the short message application, and it is determined that the message notification corresponding to the short message application has the quick reply function. Then, after detecting gesture 3 as shown in fig. 11 (b), the mobile phone may display the interface 1103 shown in fig. 11 (c). The interface 1103 is the message notification function interface, on which the quick reply function of the message notification corresponding to the short message application has been started. The user may then reply to the short message directly through the virtual keyboard 115 displayed on the interface 1103.
In this way, through a simple gesture operation at any position of the display screen, the user can instruct the mobile phone with one hand to open a message notification. This simplifies user operations, meets the need for one-handed operation by users of large-screen mobile phones, and improves the use experience of the user.
In the above embodiments, the implementation of opening a message notification is described for the case where the preset gesture is detected while the mobile phone displays the desktop. In addition, the mobile phone can also directly open a message notification according to the preset gesture while displaying an application interface. In some embodiments, the desktop is a launcher application, i.e., the mobile phone displaying the desktop may also be understood as the mobile phone displaying the launcher application interface.
For example, as shown in the interface 1201 in fig. 12 (a), during the process of displaying the camera application, the mobile phone detects that the user slides down and slides right in the direction indicated by the arrow 122 at the position indicated by the reference numeral 121 (the position is any position on the display screen), leaves the display screen at the position indicated by the reference numeral 123, and ends the operation of the current gesture, and then the mobile phone may determine that the gesture corresponding to the user operation is gesture 1 (i.e. a slide-down and slide-right gesture). Then the handset may activate the control center function in the notification bar and display a control center function interface 1202 as shown in fig. 12 (b).
It should be noted that, the related implementation manners of the gesture 2 and the gesture 3 in fig. 8-11 may also be implemented in the process of displaying an application interface (such as an interface of an application such as a camera application), and the related content may refer to the description of the gesture 2 and the gesture 3, which are not described herein.
In some scenarios, in addition to the difficulty of starting the notification bar with one hand during one-handed operation by the user, the difficulty of starting or using certain functions with one hand is also greater in some applications. Thus, preset gestures in the application may also be set to assist the user in launching or using certain functions of the application. Therefore, the difficulty of single-hand operation of the user is reduced, and the use experience of the user is improved.
Exemplarily, a camera application interface 1301 is shown in fig. 13 (a). As indicated by the dashed box at reference numeral 131, the camera application typically displays some function controls, such as the filter control 132, at the top position of the camera application interface 1301. Then, if the user needs to switch the camera filter, the user needs to click the filter control 132 displayed at the top of the camera application interface 1301, and the difficulty of one-handed operation is high.
In addition, in the process of selecting a camera filter, the difficulty of one-handed operation is also high. As shown in fig. 13 (a), when the mobile phone detects the operation of the user clicking the filter control 132, the interface 1302 shown in fig. 13 (b) is displayed. As indicated by reference numeral 133, a plurality of switchably selectable filters are displayed on the interface 1302. Thereafter, as shown in the interface 1303 in fig. 13 (c), the mobile phone detects a sliding operation of the user within the dashed box indicated by reference numeral 133 and slidably displays the plurality of switchable filters. After the mobile phone detects the operation of the user clicking the control 134, it determines that the user has selected the filter corresponding to the control 134, and may display the interface 1304 shown in fig. 13 (d); as shown by reference numeral 135, the mobile phone completes the switching of the camera application filter.
It can be seen that the filter selection process is complex. If the display screen of the mobile phone is large, its left-right width is large, and selecting a filter with one hand is difficult (for example, when the user holds the mobile phone with the right hand and selects a filter near the left side of the display screen with the right thumb, the selection is limited by the touchable range of the thumb and is difficult to complete with one hand), which affects the user's one-handed operation experience.
Based on this, taking the camera application as an example, the preset gesture is set as a two-segment gesture different from the existing single-segment gestures (for example, the preset gesture includes a down-slide-then-right-slide gesture, a down-slide-then-left-slide gesture, or a down-slide gesture followed by a pause for a preset time), and is used for starting or using a function corresponding to the camera application.
For example, as shown in the interface 1401 in fig. 14 (a), in the process of displaying the camera application, the mobile phone detects that the user slides down and then slides left in the direction shown by arrow 142 at the position shown by reference numeral 141 (the position is any position on the display screen), and leaves the display screen at the position shown by reference numeral 143, ending the current gesture; the mobile phone may then determine that the gesture corresponding to the user operation is gesture 2. The mobile phone can then start the filter function in the camera application and display the interface 1402 shown in fig. 14 (b). Alternatively, when detecting the down-slide-then-right-slide gesture (i.e., gesture 1) at any position on the display screen in the process of displaying the interface 1401, the mobile phone may also start the filter function and display the interface 1402.
Therefore, the user can start the function which is difficult to start by single-hand operation of the large-screen mobile phone through simple gestures, and the operation difficulty of the user is reduced.
Then, on the interface 1402 shown in fig. 14 (b), the user can select a desired filter through gesture 1 (down-slide then right-slide) or gesture 2 (down-slide then left-slide). As shown in interface 1402, assume that the standard filter currently selected by the selection box 144 is not the filter the user desires. The mobile phone detects that the user slides down in the direction indicated by arrow 146 at the position indicated by reference numeral 145 (the position is any position on the display screen) and stops sliding at the position indicated by reference numeral 147; in response, the mobile phone moves the selection box 144 rightward and determines the selected filter according to the display position of the selection box 144. As shown in the interface 1403 in fig. 14 (c), the mobile phone detects that the user's finger stops sliding at the position indicated by reference numeral 147 on interface 1402 and leaves the display screen, and determines that the filter currently selected by the selection box 144 is the soft filter. The mobile phone may then display the interface 1404 shown in fig. 14 (d); as shown by reference numeral 149, the mobile phone activates the soft filter.
Therefore, the user can complete the switching of the camera filter through simple gestures, the requirement of single-hand operation of the user is met, the operation difficulty of the user is reduced, and the use experience of the user is improved.
In some embodiments, a smart vision function is configured in the mobile phone camera application for implementing functions such as code scanning, translation, and text recognition with the mobile phone camera. Currently, if the user needs to trigger the camera smart vision function, the user needs to click the corresponding control in the camera application. As shown in the interface 1501 in fig. 15 (a), the user clicks the control 151 located in the upper left corner of the display screen to start the smart vision function. If the mobile phone display screen is large, it is difficult for the user to start the smart vision function with one hand, which affects the use experience of the user.
Based on this, a gesture for starting the smart vision function in the camera application may also be set, such as gesture 3, which slides down and pauses for a preset time.
For example, as shown in the interface 1501 in fig. 15 (a), the mobile phone detects that the user slides down along the direction indicated by arrow 153 at the position indicated by reference numeral 152 (the position is any position on the display screen), stops sliding at the position indicated by reference numeral 154, and leaves the display screen after the dwell exceeds the preset time, ending the current gesture; the mobile phone may then determine that the gesture corresponding to the user operation is gesture 3. The mobile phone determines that gesture 3 in the camera application is used to indicate that the smart vision function should be started, and may then display the smart vision function interface 1502 shown in fig. 15 (b).
Therefore, by setting simple gestures, functions in the camera application can be started quickly, the requirement of single-hand operation of a user is met, the operation difficulty of the user is reduced, and the use experience of the user is improved.
It should be noted that, in the embodiment of the present application, the description of the preset gesture does not limit the embodiment of the present application, and in the embodiment of the present application, the preset gesture for triggering the corresponding function may be any gesture.
In addition, other preset gestures may be preset in the camera application for enabling other camera functions. In addition, the preset gesture for starting the corresponding function in other applications may refer to the related content of the preset gesture corresponding to the camera application, which is not described in detail in the embodiment of the present application.
The above embodiments take gesture 1, gesture 2, and gesture 3 as examples to describe one-hand gesture operations; it can be understood that other gestures can also meet the needs of one-handed operation. Moreover, in different scenarios, the same gesture may implement the same or different functions, and different gestures may implement the same or different functions.
For example, in the process of displaying the desktop, the mobile phone detects gesture 1 and can display the control center function interface. In the process of displaying a camera application interface, the mobile phone detects gesture 1 and may display the camera filter selection interface, or may display the control center function interface.
In such a case, a gesture conflict arises, and the mobile phone cannot unambiguously execute the corresponding function. To avoid gesture conflicts, different gestures can be preset in the mobile phone for the user to start different functions.
Alternatively, in some embodiments, to reduce the difficulty of memorizing gestures and the number of gestures the user needs to remember, after the mobile phone detects a touch operation, it may first judge whether the current scene is a preset scene. If it is determined to be a preset scene, the mobile phone determines the corresponding response to execute according to the acquired gesture. In this way, by configuring different responses for the same gesture in different preset scenes, the user can instruct the electronic device to execute the corresponding response in different scenes while memorizing only a limited number of gestures. The preset scenes include, for example, a desktop display scene, a preset application display scene, and the like, and different preset applications may correspond to different preset scenes.
For example, as shown in fig. 16, after the mobile phone detects the operation of the user touching the display screen, the gesture detection process is triggered. Then, in step S1601, the mobile phone may judge whether the current display interface is a preset scene through the scene determination module in the application framework layer shown in fig. 6. If yes (i.e., a preset scene), different responses can be executed according to the acquired gesture; if not (i.e., not a preset scene), it may be determined to end the detection of the current one-hand gesture and to judge whether the touch operation is a gesture indicating another operation (e.g., a tap on an application function control).
For example, as shown in the scene of fig. 8, the mobile phone determines that the current display interface is a desktop, and belongs to a preset scene. Then, according to the collected gesture 2 (i.e. the sliding gesture of sliding down and sliding left), it is determined that the response corresponding to the gesture 2 in the current preset scene is to display the message notification function interface 802 as shown in fig. 8 (b).
For another example, as shown in the scenario of fig. 14, the mobile phone determines that the current display interface is a camera application interface, which belongs to a preset scenario. Then, according to the collected gesture 2 (i.e. the sliding gesture of sliding down and sliding left), determining that the response corresponding to the gesture 2 in the current preset scene turns on the filter selection function, and displaying an interface 1402 as shown in fig. 14 (b).
Thus, in different applications, the user can instruct the mobile phone to execute different functions through the same gestures, so that gesture multiplexing is realized, and the operation difficulty of the user is reduced.
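A minimal sketch of this scene-dependent dispatch (step S1601 followed by gesture-specific responses); the scene and gesture enums and the returned descriptions simply restate the examples above and are illustrative:

```java
// Illustrative sketch: the same gesture maps to different responses
// depending on the preset scene, per the examples above.
public class SceneGestureDispatcher {
    enum Scene { DESKTOP, CAMERA_APP, OTHER }
    enum Gesture { GESTURE_1, GESTURE_2, GESTURE_3, UNKNOWN }

    // Returns a textual description of the response.
    String dispatch(Scene scene, Gesture gesture) {
        if (scene == Scene.OTHER) {
            // Not a preset scene: end one-hand gesture detection (step S1601).
            return "end one-hand gesture detection; check other operations";
        }
        switch (gesture) {
            case GESTURE_1:
                return "display control center function interface";
            case GESTURE_2:
                return scene == Scene.DESKTOP
                        ? "display message notification function interface"
                        : "open camera filter selection function";
            case GESTURE_3:
                return scene == Scene.DESKTOP
                        ? "open target message notification"
                        : "start smart vision function";
            default:
                return "not a preset gesture";
        }
    }
}
```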
In some embodiments, the mobile phone may set the correspondence between the gesture and the response according to the selection of the user. Or the mobile phone can collect gestures input by a user, set gesture templates and determine corresponding responses of the newly added gesture templates according to user operation.
For example, as shown in fig. 16, assume that in the desktop scenario, as described above, response 1 corresponding to gesture 1 is to display the control center function interface, and response 2 corresponding to gesture 2 is to display the message notification function interface.
Optionally, the user may set a correspondence between the gesture and the response, for example, the user sets gesture 1 to indicate that the message notification function interface is displayed (i.e. indicates that the correspondence between gesture 1 and response 2 is established), and sets gesture 2 to indicate that the control center function interface is displayed (i.e. indicates that the correspondence between gesture 2 and response 1 is established).
A gesture library (GestureLibrary) may be set in the mobile phone for storing a plurality of preset gestures. Moreover, the plurality of gestures in the gesture library can be displayed for the user to select, so as to set up the correspondence between gestures and responses. For example, in the setting function, the user selects gesture 1 as the gesture for indicating display of the message notification function interface.
Also for example, the mobile phone may further add a gesture template input by the user to the gesture library according to the user operation. For example, through the static GestureLibraries.fromFile(path) method, a gesture library can be loaded from a path file. After the GestureLibrary object is acquired, a gesture named entryName (i.e., the gesture newly added by the user) is added through the addGesture method provided by the GestureLibrary object (e.g., addGesture(String entryName, Gesture gesture)). Then, gesture editing is performed through the gesture overlay view (GestureOverlayView), and the detected gesture input by the user is saved into the gesture library. Thereafter, in the process of setting gestures for functions in an application, the gesture newly added by the user in the gesture library can be selected by the user; if the newly added gesture is selected, the correspondence between the newly added gesture and the corresponding function response can be established. Furthermore, the mobile phone sets up listening for the newly added gesture through the addOnGesturePerformedListener method. Then, in the gesture monitoring process, the mobile phone can match the gesture input by the user according to the recognize method provided by GestureLibrary; if the matching degree is greater than a preset threshold (for example, a prediction score greater than 1), the matching is determined to be successful. The mobile phone may then execute the corresponding response.
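For reference, a condensed sketch of the android.gesture calls named above; the file path, entry name, and threshold value are examples only:

```java
// Sketch of loading, extending, and matching a gesture library with the
// android.gesture APIs described above.
import android.app.Activity;
import android.gesture.Gesture;
import android.gesture.GestureLibraries;
import android.gesture.GestureLibrary;
import android.gesture.GestureOverlayView;
import android.gesture.Prediction;

import java.io.File;
import java.util.ArrayList;

public class CustomGestureActivity extends Activity
        implements GestureOverlayView.OnGesturePerformedListener {

    private GestureLibrary gestureLibrary;
    private static final double MATCH_THRESHOLD = 1.0; // matching degree greater than 1

    void setUpGestureLibrary(GestureOverlayView overlay, File path) {
        // Load the gesture library from a path file via the static fromFile method.
        gestureLibrary = GestureLibraries.fromFile(path);
        gestureLibrary.load();
        // Listen for newly performed gestures on the overlay view.
        overlay.addOnGesturePerformedListener(this);
    }

    // Save a user-added gesture template named entryName into the library.
    void addUserGesture(String entryName, Gesture gesture) {
        gestureLibrary.addGesture(entryName, gesture);
        gestureLibrary.save();
    }

    @Override
    public void onGesturePerformed(GestureOverlayView overlay, Gesture gesture) {
        // Match the input gesture against the library with the recognize method;
        // predictions are returned sorted by score, best first.
        ArrayList<Prediction> predictions = gestureLibrary.recognize(gesture);
        if (!predictions.isEmpty() && predictions.get(0).score > MATCH_THRESHOLD) {
            String matched = predictions.get(0).name;
            // Execute the response bound to the matched gesture template.
        }
    }
}
```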
The different gestures and their corresponding responses are described above; the gesture recognition process is described below.
In some embodiments, the mobile phone may perform gesture recognition through the Android gesture (Gesture) mechanism. For example, after detecting an operation of the user touching the display screen, the mobile phone may generate a touch screen event (MotionEvent). The touch listener (OnTouchListener) interface is used for listening to MotionEvents, and the MotionEvent object can be obtained in the onTouch() method of the listener interface. The MotionEvent object is forwarded to the gesture listener (OnGestureListener) interface through a gesture detector (GestureDetector), and the OnGestureListener interface acquires the MotionEvent object so as to obtain relevant information of the gesture for processing.
In the gesture detection process, starting from ACTION_DOWN in the MotionEvent, the mobile phone records the coordinates when the gesture starts. When finger movement is detected, the ACTION_MOVE event of the MotionEvent is triggered, and the mobile phone records the coordinate trajectory during the finger movement. Finally, when the mobile phone detects that the finger is lifted, it determines that the gesture has ended, the ACTION_UP event of the MotionEvent is triggered, and the coordinates at that moment are recorded. Thus, the recording of the whole gesture coordinate trajectory is completed, and gesture recognition begins.
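A sketch of this recording flow using the MotionEvent constants above; the custom View is illustrative:

```java
// Sketch of recording a gesture coordinate trajectory from MotionEvent,
// following the ACTION_DOWN / ACTION_MOVE / ACTION_UP flow described above.
import android.content.Context;
import android.view.MotionEvent;
import android.view.View;

import java.util.ArrayList;
import java.util.List;

public class TrajectoryRecordingView extends View {
    private final List<float[]> trajectory = new ArrayList<>(); // {x, y} samples

    public TrajectoryRecordingView(Context context) {
        super(context);
    }

    @Override
    public boolean onTouchEvent(MotionEvent event) {
        switch (event.getActionMasked()) {
            case MotionEvent.ACTION_DOWN: // gesture starts: record the first coordinate
                trajectory.clear();
                trajectory.add(new float[]{event.getX(), event.getY()});
                return true;
            case MotionEvent.ACTION_MOVE: // finger moving: record the coordinate trajectory
                trajectory.add(new float[]{event.getX(), event.getY()});
                return true;
            case MotionEvent.ACTION_UP:   // finger lifted: gesture ends
                trajectory.add(new float[]{event.getX(), event.getY()});
                // Hand the completed trajectory to gesture recognition here.
                return true;
        }
        return super.onTouchEvent(event);
    }
}
```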
Taking gesture 1 (i.e., the down-slide-then-right-slide gesture) as an example, a method for determining a gesture according to the gesture coordinate trajectory is described below.
For example, the mobile phone can obtain a coordinate trajectory array (array) corresponding to the gesture coordinate trajectory through the above method. In the interface 1701 shown in fig. 17, a coordinate system is set up with the upper-left corner of the mobile phone display screen as the origin, where the direction parallel to the upper edge of the display screen and pointing right is the positive x-axis direction, and the direction parallel to the left edge of the display screen and pointing down is the positive y-axis direction.
Then, as shown in the interface 1701, the mobile phone can obtain n coordinate values (xi, yi) (i = 1, 2, …, n) corresponding to gesture 1. In the gesture recognition process, the mobile phone can calculate the difference between each coordinate value and the preceding one, obtaining a difference coordinate array of n-1 terms. As shown in the interface 1701, the first segment of gesture 1 (i.e., the segment between the starting point 171 and the inflection point 172) is a down-slide, so in the first k difference terms, abs(x-axis difference) <= threshold 1, and the y-axis difference is always positive. Based on this feature, the mobile phone may determine that the first segment gesture is a down-slide gesture. Thereafter, the second segment of gesture 1 (i.e., the segment between the inflection point 172 and the end point 173) is a right-slide, so from the (k+1)-th to the (n-1)-th difference terms, abs(y-axis difference) <= threshold 2, and the x-axis difference is always positive. Based on this feature, the mobile phone may determine that the second segment gesture is a right-slide gesture. Thus, the mobile phone may determine whether the currently detected gesture is gesture 1 according to the above method. If it is determined to be gesture 1, the control center function interface may be displayed.
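A sketch of this difference-based judgment for gesture 1; the threshold values and the inflection-point search are illustrative assumptions:

```java
// Illustrative sketch: classify a trajectory as gesture 1 by checking that
// the first k differences slide down (|dx| <= threshold 1, dy > 0) and the
// remaining differences slide right (|dy| <= threshold 2, dx > 0).
public class Gesture1Detector {
    static final float THRESHOLD_1 = 30f; // tolerated x drift while sliding down
    static final float THRESHOLD_2 = 30f; // tolerated y drift while sliding right

    // points[i] = {x_i, y_i}, in the screen coordinate system with origin at
    // the top-left corner, x to the right and y downward.
    static boolean isGesture1(float[][] points) {
        int n = points.length;
        if (n < 3) return false;
        // Find the inflection point: the last index where motion is still downward.
        int k = 1;
        while (k < n - 1 && points[k + 1][1] > points[k][1]) k++;
        // First segment: sliding down.
        for (int i = 1; i <= k; i++) {
            float dx = points[i][0] - points[i - 1][0];
            float dy = points[i][1] - points[i - 1][1];
            if (Math.abs(dx) > THRESHOLD_1 || dy <= 0) return false;
        }
        // Second segment: sliding right.
        for (int i = k + 1; i < n; i++) {
            float dx = points[i][0] - points[i - 1][0];
            float dy = points[i][1] - points[i - 1][1];
            if (Math.abs(dy) > THRESHOLD_2 || dx <= 0) return false;
        }
        return true;
    }
}
```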
It should be noted that, in the gesture trajectory determination process, each threshold (such as threshold 1 and threshold 2) may be determined according to various factors such as mobile phone performance, experimental values, and empirical values; for the specific determination method, reference may be made to the prior art, and the embodiment of the present application is not particularly limited. For other gesture detection and trajectory determination methods, reference may be made to the above detection and trajectory determination process of gesture 1, which is not described in detail in the embodiment of the present application.
In this way, the mobile phone can determine whether the gesture input by the user is a preset gesture through the gesture detection method. The requirement of the user for operating the mobile phone function by one hand is met, and the user experience is improved.
Fig. 18 is a schematic flow chart of a gesture control method according to an embodiment of the present application. As shown in fig. 18, the flow includes the following steps.
S1801, the electronic device displays a first interface.
In some embodiments, the first interface may be a desktop or an interface in a first application in the electronic device. In the process of displaying the first interface, the electronic equipment can execute corresponding operations according to the gestures of the user.
S1802, the electronic equipment detects that a user starts to input a first gesture at a first position of a first interface and ends to input the first gesture at a second position; the first gesture comprises a first segment gesture and a second segment gesture, the first segment gesture comprises a first position, the second segment gesture comprises a second position, and the first segment gesture and the second segment gesture are not on the same straight line; or the residence time of the first gesture at the second position exceeds a preset time.
In some embodiments, the first location is any location on the first interface.
In some embodiments, the first gesture is a two-segment gesture, such as a down-slide-then-right-slide gesture, a down-slide-then-left-slide gesture, or a down-slide gesture with a pause for a preset time.
Wherein, if divided by trajectory, the first gesture includes a first segment gesture and a second segment gesture; the first segment gesture corresponds to a first direction, the second segment gesture corresponds to a second direction, and the first direction is perpendicular to the second direction. For example, the first segment gesture is a down-slide segment and includes the starting position (i.e., the first position) of the first gesture; the second segment gesture is a right-slide segment and includes the ending position (i.e., the second position) of the first gesture; the first direction is downward, the second direction is rightward, and downward is perpendicular to rightward. As another example, the first gesture includes only one trajectory (the trajectory includes the first position and the second position), and the dwell time at the ending position (i.e., the second position) of the first gesture exceeds the preset time. Further, the first position and the second position are on a straight line oriented perpendicular to an edge of the electronic device, such as a downward first gesture.
Illustratively, as in the interface 701 in fig. 7 (a), the electronic device detects that the user begins to input a first gesture at a first position on the desktop (e.g., the position indicated by reference numeral 71) and ends inputting the first gesture at a second position (e.g., the position indicated by reference numeral 73). The first position and the second position include an inflection point between them, as indicated by reference numeral 74; the first gesture includes a first segment gesture between the first position and the inflection point (including the first position) and a second segment gesture between the inflection point and the second position (including the second position). The first direction corresponding to the first segment gesture is downward, the second direction corresponding to the second segment gesture is rightward, and the first direction is perpendicular to the second direction. The electronic device may determine that the first gesture is a down-slide-then-right-slide gesture.
Also exemplary, as in interface 801 of FIG. 8 (a), the electronic device detects that the user has begun to input a first gesture at a first location on the desktop (the location indicated by reference numeral 81) and has finished inputting the first gesture at a second location (the location indicated by reference numeral 83). Wherein the first location and the second location include an inflection point therebetween as indicated by reference numeral 84, the first gesture includes a first segment gesture (including the first location) between the first location and the inflection point and a second segment gesture (including the second location) between the inflection point and the second location. And the first direction corresponding to the first segment of gesture is downward, the second direction corresponding to the second segment of gesture is leftward, and the first direction is perpendicular to the second direction. The electronic device may determine that the first gesture is a swipe down left gesture.
Still further exemplary, as in the interface 902 in fig. 9 (b), the electronic device detects that the user begins to input the first gesture at a first position on the desktop (the position shown at reference numeral 93) and ends inputting the first gesture at a second position (the position shown at reference numeral 95). Upon determining that the dwell time of the first gesture at the second position exceeds the preset time, the first gesture can be determined to be a down-slide gesture with a pause for the preset time.
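The dwell-time judgment for this variant of the first gesture can be sketched as follows; the 1-second preset time follows the earlier example, and the movement tolerance is an assumption:

```java
// Sketch of detecting a down-slide-with-pause gesture (gesture 3) by timing
// the dwell between the finger's last movement and its lifting.
import android.view.MotionEvent;

public class Gesture3DwellDetector {
    static final long PRESET_DWELL_MS = 1000;   // preset time, e.g. 1 second
    static final float MOVE_TOLERANCE_PX = 10f; // jitter below this is not movement

    private long lastMoveTimestampMs;
    private float lastX, lastY;

    // Feed every MotionEvent of the gesture; returns true on ACTION_UP if the
    // finger paused at the end position beyond the preset time.
    public boolean onTouchEvent(MotionEvent event) {
        switch (event.getActionMasked()) {
            case MotionEvent.ACTION_DOWN:
                lastMoveTimestampMs = event.getEventTime();
                lastX = event.getX();
                lastY = event.getY();
                return false;
            case MotionEvent.ACTION_MOVE: {
                // Track the moment the finger last actually moved.
                float dx = event.getX() - lastX;
                float dy = event.getY() - lastY;
                if (Math.sqrt(dx * dx + dy * dy) > MOVE_TOLERANCE_PX) {
                    lastMoveTimestampMs = event.getEventTime();
                    lastX = event.getX();
                    lastY = event.getY();
                }
                return false;
            }
            case MotionEvent.ACTION_UP:
                // Dwell = time between the last movement and the finger lifting.
                return event.getEventTime() - lastMoveTimestampMs >= PRESET_DWELL_MS;
        }
        return false;
    }
}
```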
S1803, the electronic device displays a second interface corresponding to the first gesture.
In some embodiments, after detecting the first gesture, the electronic device may display a second interface corresponding to the first gesture according to a preset rule, where the second interface is a message notification function interface or a control center function interface.
For example, as shown in the above-mentioned scenario in fig. 7 or fig. 12, in the process of displaying the first interface (the first interface is a desktop or a camera application interface), the electronic device detects a first gesture of sliding down and sliding right, and displays the control center interface.
In other embodiments, before displaying the second interface corresponding to the first gesture, the electronic device further determines what the currently displayed first interface is, for example, whether the first interface is the desktop or a first application interface. The electronic device then determines, according to the display of the first interface, the second interface that the first gesture indicates to display. In a possible implementation, the first interface is the desktop, the first gesture includes a first segment gesture and a second segment gesture, the first segment gesture includes the first position, the second segment gesture includes the second position, and the first segment gesture and the second segment gesture are not on the same straight line. Then, the electronic device displaying the second interface corresponding to the first gesture includes: the electronic device determines, according to the coordinate trajectory corresponding to the first gesture, the second direction corresponding to the second segment gesture, and displays the control center function interface or the message notification function interface according to the second direction.
Illustratively, as shown in interface 701 in fig. 7 (a), during the process of displaying the desktop, the electronic device detects that the user slides down in the direction indicated by arrow 72 at the first position indicated by reference numeral 71 (the first position is any position on the display screen) and slides right at the inflection point position indicated by reference numeral 74, leaves the display screen at the second position indicated by reference numeral 73, and ends the first gesture. Then the electronic device can activate the control center function in the notification bar and display the control center function interface 702 as shown in fig. 7 (b).
Also exemplary, as shown in interface 801 (a) of fig. 8, the electronic device detects that the user slides down in the direction of arrow 82 at a first location indicated by reference numeral 81 (the first location is any location on the display screen) and slides left at an inflection point location indicated by reference numeral 84, leaves the display screen at a location indicated by reference numeral 83, and ends the first gesture. Then the electronic device can initiate a message notification function in the notification bar and display a message notification function interface 802 as shown in fig. 8 (b).
The electronic device may determine the inflection point of the first gesture according to the coordinate trajectory corresponding to the first gesture, and determine the first segment gesture from the first position to the inflection point and the second segment gesture from the inflection point to the second position. The second direction of the second segment gesture can then be determined, and the second interface that the first gesture indicates to display is further determined according to the second direction: if the second direction is rightward, the corresponding second interface is the control center function interface; if the second direction is leftward, the corresponding second interface is the message notification function interface.
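A minimal sketch of this direction-based routing, assuming the inflection point and end position have already been extracted from the coordinate trajectory:

```java
// Illustrative sketch: derive the second direction from the inflection point
// and the end position, then choose the interface per the mapping above.
public class SecondSegmentRouter {
    enum Response { CONTROL_CENTER_INTERFACE, MESSAGE_NOTIFICATION_INTERFACE, NONE }

    // inflection and end are {x, y} in the screen coordinate system
    // (origin top-left, x to the right, y downward).
    static Response route(float[] inflection, float[] end) {
        float dx = end[0] - inflection[0];
        if (dx > 0) return Response.CONTROL_CENTER_INTERFACE;       // second direction: right
        if (dx < 0) return Response.MESSAGE_NOTIFICATION_INTERFACE; // second direction: left
        return Response.NONE;
    }
}
```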
Thus, by presetting a first gesture that requires only a small operation area, the user can call up the control center function interface or the message notification function interface from any position on the display screen with a simple gesture operation. This meets the need for one-handed operation on large-screen electronic devices and improves the user experience.
In another possible implementation, the first interface is a desktop, and the dwell time of the first gesture at the second position exceeds a preset time. In this case, the electronic device displays the second interface corresponding to the first gesture as follows: the electronic device determines a target message notification to be displayed according to a preset condition. If the target message notification does not have the quick reply function, the electronic device displays a second interface of the application corresponding to the target message notification, where that interface is associated with the target message notification. Alternatively, if the target message notification has the quick reply function, the electronic device displays a quick reply interface corresponding to the target message notification.
The preset condition includes one or more of the following: the most recently received message notification, a message notification with a high priority, a message notification corresponding to an application frequently used by the user, a message notification corresponding to an application set by the user, and a message notification with the quick reply function.
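As a minimal sketch of how the target message notification might be selected under the preset condition "most recently received message notification", consider the following Kotlin fragment. The Notification model and its fields are hypothetical and do not correspond to any real framework API.

```kotlin
// Hypothetical notification model; field names are assumptions.
data class Notification(
    val appId: String,
    val timestampMillis: Long,
    val supportsQuickReply: Boolean
)

sealed class TargetAction {
    data class OpenApp(val appId: String) : TargetAction()
    data class QuickReply(val appId: String) : TargetAction()
}

/**
 * Determines the target message notification per the preset condition
 * "most recently received", then chooses which interface to display.
 */
fun resolveTarget(pending: List<Notification>): TargetAction? {
    val target = pending.maxByOrNull { it.timestampMillis } ?: return null
    return if (target.supportsQuickReply) {
        TargetAction.QuickReply(target.appId) // e.g. the SMS scenario of fig. 11
    } else {
        TargetAction.OpenApp(target.appId)    // e.g. the scenario of fig. 9
    }
}
```

Other preset conditions (priority, frequently used applications, user-set applications) could be expressed as alternative comparators or filters over the same list.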
For example, when the preset condition includes the most recently received message notification, as shown in interface 902 in fig. 9 (b), the electronic device detects that the user slides down in the direction indicated by arrow 94 from the first position indicated by reference numeral 93 (the first position may be any position on the display screen), stops sliding at the second position indicated by reference numeral 95, leaves the display screen after staying there longer than the preset time, and thereby ends the first gesture. The electronic device can then determine the most recently received message notification in the message notification function, for example according to the timestamp corresponding to the first position of the first gesture. In the scenario shown in fig. 9 (a), the message most recently received by the electronic device corresponds to the MeeTime application. The electronic device determines that this message notification has no quick reply function, so it may open the MeeTime application and display interface 903 shown in fig. 9 (c), where interface 903 is a MeeTime application interface (i.e., a second interface). This interface is associated with the recently received MeeTime message, for example, it is an interface displaying that message.
Also by way of example, when the preset condition includes the most recently received message notification, as shown in interface 1102 in fig. 11 (b), the electronic device detects that the user slides down in the direction indicated by arrow 113 from the first position indicated by reference numeral 112 (the first position may be any position on the display screen), stops sliding at the second position indicated by reference numeral 114, leaves the display screen after staying there longer than the preset time, and thereby ends the first gesture. The electronic device can then determine the most recently received message notification in the message notification function according to the timestamp corresponding to the first position of the first gesture. As shown in fig. 11 (a), the message most recently received by the electronic device corresponds to the SMS application. The electronic device determines that this message notification has the quick reply function and may display interface 1103 shown in fig. 11 (c). Interface 1103 is a message notification function interface on which the quick reply function of the SMS message notification is started. The user can then reply to the message directly through the virtual keyboard 115 displayed on interface 1103.
In this way, from any position on the display screen, the user can instruct the electronic device with a simple one-handed gesture operation to open a message notification. This simplifies user operation, meets the need for one-handed operation on large-screen electronic devices, and improves the user experience.
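The "slide and hold" variant differs from the two-segment variant only in how the first gesture ends. A minimal Kotlin sketch of the dwell check, assuming timestamped touch samples, is given below; the TimedPoint type, the 500 ms preset time, and the 10 px movement tolerance are all assumptions for the sketch.

```kotlin
import kotlin.math.hypot

// Hypothetical timestamped touch sample; not part of the original disclosure.
data class TimedPoint(val x: Float, val y: Float, val tMillis: Long)

/**
 * Returns true if the gesture ended with a dwell: the finger stayed
 * within [moveTolerancePx] of the final position for at least
 * [presetTimeMillis] before lifting off.
 */
fun endedWithDwell(
    track: List<TimedPoint>,
    presetTimeMillis: Long = 500L,
    moveTolerancePx: Float = 10f
): Boolean {
    if (track.isEmpty()) return false
    val end = track.last()
    // Walk backwards to find where the stationary tail of the track begins.
    var dwellStart = end
    for (p in track.asReversed()) {
        if (hypot(p.x - end.x, p.y - end.y) > moveTolerancePx) break
        dwellStart = p
    }
    return end.tMillis - dwellStart.tMillis >= presetTimeMillis
}
```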
In some embodiments, the first interface is a first application interface, the first gesture includes a first segment gesture and a second segment gesture, the first segment gesture includes the first position, the second segment gesture includes the second position, and the first segment gesture and the second segment gesture are not on the same straight line. In this case, the electronic device displays the second interface corresponding to the first gesture as follows: the electronic device determines, according to the coordinate track corresponding to the first gesture, the second direction corresponding to the second segment gesture. Then, according to the second direction, it displays an interface of a first function in the first application corresponding to the first gesture, or an interface of a second function in the first application corresponding to the first gesture. The first function and the second function may be the same function or different functions.
The first application is, for example, a camera application. As shown in interface 1201 in fig. 12 (a), while displaying the camera application (i.e., the first application), the electronic device detects that the user slides down and then right in the direction indicated by arrow 122 from the first position indicated by reference numeral 121 (the first position may be any position on the display screen), leaves the display screen at the second position indicated by reference numeral 123, and thereby ends the first gesture. The electronic device then activates the control center function in the notification bar and displays the control center function interface 1202 shown in fig. 12 (b) (i.e., an interface of the first function).
Also illustratively, as shown in interface 1401 in fig. 14 (a), while displaying the camera application, the electronic device detects that the user slides down and then left in the direction indicated by arrow 142 from the first position indicated by reference numeral 141 (the first position may be any position on the display screen), leaves the display screen at the second position indicated by reference numeral 143, and thereby ends the first gesture. The electronic device may then activate the filter function in the camera application and display the interface 1402 shown in fig. 14 (b) (i.e., an interface of the second function).
As on the desktop, the electronic device may determine the inflection point of the first gesture according to the coordinate track corresponding to the first gesture, take the segment from the first position to the inflection point as the first segment gesture, and take the segment from the inflection point to the second position as the second segment gesture. It may then determine the second direction of the second segment gesture and, according to that direction, determine the second interface indicated by the first gesture. In the camera application example, if the second direction is rightward, the corresponding second interface is the control center function interface (the first function); if the second direction is leftward, the corresponding second interface is the filter function interface of the camera application (the second function).
In other embodiments, the first interface is a first application interface, and the dwell time of the first gesture at the second position exceeds a preset time. In this case, the electronic device displays the second interface corresponding to the first gesture as follows: the electronic device displays an interface of a third function of the first application corresponding to the first gesture.
The first application is, for example, a camera application. As shown in interface 1501 in fig. 15 (a), while displaying the camera application, the electronic device detects that the user slides down in the direction indicated by arrow 153 from the first position indicated by reference numeral 152 (the first position may be any position on the display screen), stops sliding at the second position indicated by reference numeral 154, leaves the display screen after staying there longer than the preset time, and thereby ends the first gesture. The electronic device can then display the intelligent vision function interface 1502 shown in fig. 15 (b) (i.e., an interface of the third function).
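Taken together, the embodiments above amount to a small dispatch from the displayed interface and the recognized gesture variant to the second interface. The following Kotlin sketch summarizes that mapping for the desktop and the camera application examples; the enum values and labels are illustrative assumptions rather than the patent's terminology, and the desktop "slide and hold" case is simplified to a single target.

```kotlin
// Illustrative gesture classification; names are assumptions.
enum class GestureVariant { TURN_RIGHT, TURN_LEFT, SLIDE_AND_HOLD }

sealed class Target {
    object ControlCenter : Target()
    object MessageNotification : Target()
    data class AppFunction(val name: String) : Target()
}

/** Maps the first gesture to the second interface, by interface context. */
fun dispatch(onDesktop: Boolean, variant: GestureVariant): Target = when {
    onDesktop && variant == GestureVariant.TURN_RIGHT -> Target.ControlCenter
    onDesktop && variant == GestureVariant.TURN_LEFT -> Target.MessageNotification
    // SLIDE_AND_HOLD on the desktop: open the target message notification.
    onDesktop -> Target.MessageNotification
    variant == GestureVariant.TURN_RIGHT -> Target.ControlCenter          // first function
    variant == GestureVariant.TURN_LEFT -> Target.AppFunction("filter")   // second function
    else -> Target.AppFunction("intelligent vision")                      // third function
}
```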
In this way, while an application is displayed on the electronic device, the user can start, through a simple gesture, functions that would otherwise be difficult to reach with one hand on a large-screen electronic device, which reduces the difficulty of operation. The gesture control method provided by the embodiments of the application has been described in detail above with reference to figs. 7 to 18. The gesture control apparatus provided by the embodiments of the application is described in detail below with reference to fig. 19.
In one possible design, fig. 19 is a schematic structural diagram of an electronic device according to an embodiment of the application. As shown in fig. 19, the electronic device 1900, acting as a gesture control apparatus, may include a display unit 1901, a transceiver unit 1902, and a processing unit 1903. The electronic device 1900 may be used to implement the functions of the electronic device involved in the method embodiments described above.
Optionally, the display unit 1901 is configured to support the electronic device 1900 in displaying interface content and/or in performing S1801 and S1803 in fig. 18.
Optionally, the transceiver unit 1902 is configured to support the electronic device 1900 in performing S1802 in fig. 18.
Optionally, the processing unit 1903 is configured to support the electronic device 1900 in performing S1802 in fig. 18.
The transceiver unit may include a receiving unit and a transmitting unit, may be implemented by a transceiver or transceiver-related circuit components, and may be a transceiver or a transceiver module. The operations and/or functions of the units in the electronic device 1900 follow the functional descriptions of the corresponding functional units, so that the corresponding flows of the gesture control method described in the above method embodiments are implemented; for brevity, the details of the steps involved in the above method embodiments are not repeated here.
Optionally, the electronic device 1900 shown in fig. 19 may further include a storage unit (not shown in fig. 19) in which programs or instructions are stored. When the display unit 1901, the transceiver unit 1902, and the processing unit 1903 execute these programs or instructions, the electronic device 1900 shown in fig. 19 can perform the gesture control method described in the above method embodiments.
For the technical effects of the electronic device 1900 shown in fig. 19, refer to the technical effects of the gesture control method described in the above method embodiments; details are not repeated here.
Besides taking the form of the electronic device 1900, the technical solution provided by the application may be a functional unit or a chip in the electronic device, or an apparatus used in cooperation with the electronic device.
An embodiment of the application further provides a chip system, including a processor coupled to a memory. The memory is configured to store programs or instructions that, when executed by the processor, cause the chip system to implement the method in any of the above method embodiments.
Optionally, there may be one or more processors in the chip system. The processor may be implemented in hardware or in software. When implemented in hardware, the processor may be a logic circuit, an integrated circuit, or the like. When implemented in software, the processor may be a general-purpose processor that operates by reading software code stored in a memory.
Optionally, there may be one or more memories in the chip system. The memory may be integrated with the processor or separate from the processor; the embodiments of the application are not limited in this respect. For example, the memory may be a non-transitory memory, such as a ROM, which may be integrated on the same chip as the processor or provided on a different chip; the type of the memory and the manner in which the memory and the processor are arranged are not particularly limited in the embodiments of the application.
Illustratively, the chip system may be a field programmable gate array (FPGA), an application-specific integrated circuit (ASIC), a system on chip (SoC), a central processing unit (CPU), a network processor (NP), a digital signal processor (DSP), a microcontroller unit (MCU), a programmable logic device (PLD), or another integrated chip.
It should be understood that the steps in the above method embodiments may be performed by integrated logic circuits in hardware in the processor or by instructions in the form of software. The steps of the method disclosed in connection with the embodiments of the application may be performed directly by a hardware processor, or by a combination of hardware and software modules in the processor.
An embodiment of the application further provides a computer-readable storage medium in which a computer program is stored; when the computer program runs on a computer, the computer is caused to execute the above related steps to implement the gesture control method in the above embodiments.
An embodiment of the application further provides a computer program product; when the computer program product runs on a computer, the computer is caused to execute the above related steps to implement the gesture control method in the above embodiments.
In addition, an embodiment of the application further provides an apparatus. The apparatus may specifically be a component or a module, and may include one or more processors and a memory coupled to them. The memory is configured to store a computer program which, when executed by the one or more processors, causes the apparatus to perform the gesture control method in the above method embodiments.
The apparatus, the computer-readable storage medium, the computer program product, and the chip provided by the embodiments of the application are all configured to perform the corresponding methods provided above. Therefore, for the beneficial effects they can achieve, refer to the beneficial effects of the corresponding methods provided above; details are not repeated here.
The steps of a method or algorithm described in connection with this disclosure may be implemented in hardware, or in software instructions executed by a processor. The software instructions may consist of corresponding software modules, which may be stored in random access memory (RAM), flash memory, read-only memory (ROM), erasable programmable read-only memory (EPROM), electrically erasable programmable read-only memory (EEPROM), registers, a hard disk, a removable disk, a compact disc read-only memory (CD-ROM), or any other form of storage medium known in the art. An exemplary storage medium is coupled to the processor such that the processor can read information from, and write information to, the storage medium. Alternatively, the storage medium may be integral to the processor. The processor and the storage medium may reside in an application-specific integrated circuit (ASIC).
From the foregoing description of the embodiments, those skilled in the art will clearly understand that the division into the functional modules above is merely illustrative, made for convenience and brevity of description. In practical applications, the above functions may be allocated to different functional modules as required; that is, the internal structure of the device may be divided into different functional modules to perform all or part of the functions described above. For the specific working processes of the systems, devices, and units described above, refer to the corresponding processes in the foregoing method embodiments; details are not repeated here.
In the several embodiments provided in this application, it should be understood that the disclosed method may be implemented in other manners. The device embodiments described above are merely illustrative. For example, the division into modules or units is only one kind of logical function division, and other divisions are possible in actual implementation; for example, multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. In addition, the mutual couplings, direct couplings, or communication connections shown or discussed may be implemented through some interfaces, and the indirect couplings or communication connections between modules or units may be electrical, mechanical, or in other forms.
In addition, the functional units in the embodiments of the application may be integrated into one processing unit, or each unit may exist physically separately, or two or more units may be integrated into one unit. The integrated unit may be implemented in the form of hardware or in the form of a software functional unit.
Computer-readable storage media include, but are not limited to, any of the following: a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disc, or other media capable of storing program code.
The foregoing describes merely specific embodiments of the application, and the protection scope of the application is not limited thereto; any change or substitution within the technical scope disclosed by the application shall be covered by the protection scope of the application. Therefore, the protection scope of the application shall be subject to the protection scope of the claims.

Claims (22)

1. A gesture control method, applied to an electronic device, the method comprising:
displaying a first interface;
detecting that a user starts to input a first gesture at a first position of the first interface and ends inputting the first gesture at a second position; wherein the first gesture comprises a first segment gesture and a second segment gesture, the first segment gesture comprises the first position, the second segment gesture comprises the second position, and the first segment gesture and the second segment gesture are not on the same straight line; or the dwell time of the first gesture at the second position exceeds a preset time;
and displaying a second interface corresponding to the first gesture.
2. The method of claim 1, wherein the first segment gesture corresponds to a first direction and the second segment gesture corresponds to a second direction, the first direction being perpendicular to the second direction.
3. The method of claim 1 or 2, wherein prior to displaying the second interface corresponding to the first gesture, the method further comprises:
determining that the first interface is a desktop or that the first interface is a first application interface.
4. The method of claim 3, wherein the first interface is a desktop, the first gesture comprises a first segment gesture and a second segment gesture, the first segment gesture comprises the first location, the second segment gesture comprises the second location, the first segment gesture and the second segment gesture are not collinear, and the displaying a second interface corresponding to the first gesture comprises:
determining a second direction corresponding to the second segment gesture according to the coordinate track corresponding to the first gesture;
and displaying a control center function interface or a message notification function interface according to the second direction.
5. The method of claim 3, wherein the first interface is a desktop, the dwell time of the first gesture at the second position exceeds the preset time, and the displaying a second interface corresponding to the first gesture comprises:
determining a target message notification to be displayed according to a preset condition;
if the target message notification to be displayed does not have a quick reply function, displaying the second interface of an application corresponding to the target message notification to be displayed, wherein the second interface is associated with the target message notification to be displayed;
or,
if the target message notification to be displayed has the quick reply function, displaying a quick reply interface corresponding to the target message notification to be displayed.
6. The method of claim 5, wherein the preset condition comprises one or more of: a most recently received message notification, a message notification with a high priority, a message notification corresponding to an application frequently used by the user, a message notification corresponding to an application set by the user, and a message notification with a quick reply function.
7. The method of claim 3, wherein the first interface is a first application interface, the first gesture comprises a first segment gesture and a second segment gesture, the first segment gesture comprises the first location, the second segment gesture comprises the second location, the first segment gesture and the second segment gesture are not collinear, and the displaying the second interface corresponding to the first gesture comprises:
determining a second direction corresponding to the second segment gesture according to the coordinate track corresponding to the first gesture;
displaying an interface of a first function in the first application corresponding to the first gesture or displaying an interface of a second function in the first application corresponding to the first gesture according to the second direction; wherein the first function and the second function are the same function or different functions.
8. The method according to claim 3, wherein the first interface is a first application interface, the dwell time of the first gesture at the second position exceeds the preset time, and the displaying the second interface corresponding to the first gesture comprises:
displaying an interface of a third function of the first application corresponding to the first gesture.
9. The method according to claim 7 or 8, wherein the first application is a camera application.
10. The method of any one of claims 1-9, wherein the first location is any location on the first interface.
11. An electronic device, comprising: a display screen, a processor, and a memory, the display screen and the memory being coupled to the processor, the memory being configured to store computer program code, the computer program code comprising computer instructions that, when read from the memory by the processor, cause the electronic device to perform:
displaying a first interface;
detecting that a user starts to input a first gesture at a first position of the first interface and ends inputting the first gesture at a second position; wherein the first gesture comprises a first segment gesture and a second segment gesture, the first segment gesture comprises the first position, the second segment gesture comprises the second position, and the first segment gesture and the second segment gesture are not on the same straight line; or the dwell time of the first gesture at the second position exceeds a preset time;
and displaying a second interface corresponding to the first gesture.
12. The electronic device of claim 11, wherein the first segment gesture corresponds to a first direction and the second segment gesture corresponds to a second direction, the first direction being perpendicular to the second direction.
13. The electronic device of claim 11 or 12, wherein the computer instructions, when read from the memory by the processor, further cause the electronic device to:
determine that the first interface is a desktop or that the first interface is a first application interface.
14. The electronic device of claim 13, wherein the first interface is a desktop, the first gesture comprises a first segment gesture and a second segment gesture, the first segment gesture comprises the first location, the second segment gesture comprises the second location, the first segment gesture and the second segment gesture are not collinear, the displaying a second interface corresponding to the first gesture comprises:
determining a second direction corresponding to the second segment gesture according to the coordinate track corresponding to the first gesture;
and displaying a control center function interface or a message notification function interface according to the second direction.
15. The electronic device of claim 13, wherein the first interface is a desktop, the dwell time of the first gesture at the second position exceeds the preset time, and the displaying the second interface corresponding to the first gesture comprises:
determining a target message notification to be displayed according to a preset condition;
if the target message notification to be displayed does not have a quick reply function, displaying the second interface of an application corresponding to the target message notification to be displayed, wherein the second interface is associated with the target message notification to be displayed;
or,
if the target message notification to be displayed has the quick reply function, displaying a quick reply interface corresponding to the target message notification to be displayed.
16. The electronic device of claim 15, wherein the preset condition comprises one or more of: a most recently received message notification, a message notification with a high priority, a message notification corresponding to an application frequently used by the user, a message notification corresponding to an application set by the user, and a message notification with a quick reply function.
17. The electronic device of claim 13, wherein the first interface is a first application interface, the first gesture comprises a first segment gesture and a second segment gesture, the first segment gesture comprises the first location, the second segment gesture comprises the second location, the first segment gesture and the second segment gesture are not collinear, the displaying a second interface corresponding to the first gesture comprises:
determining a second direction corresponding to the second segment gesture according to the coordinate track corresponding to the first gesture;
displaying an interface of a first function in the first application corresponding to the first gesture or displaying an interface of a second function in the first application corresponding to the first gesture according to the second direction; wherein the first function and the second function are the same function or different functions.
18. The electronic device of claim 13, wherein the first interface is a first application interface, the dwell time of the first gesture at the second position exceeds the preset time, and the displaying a second interface corresponding to the first gesture comprises:
displaying an interface of a third function of the first application corresponding to the first gesture.
19. The electronic device of claim 17 or 18, wherein the first application is a camera application.
20. The electronic device of any of claims 11-19, wherein the first location is any location on the first interface.
21. A computer readable storage medium, characterized in that the computer readable storage medium comprises a computer program which, when run on an electronic device, causes the electronic device to perform the method according to any one of claims 1-10.
22. A computer program product, characterized in that the computer program product, when run on a computer, causes the computer to perform the method according to any of claims 1-10.