CN112783408A - Gesture navigation method and device of electronic equipment, equipment and readable storage medium - Google Patents

Gesture navigation method and device of electronic equipment, equipment and readable storage medium Download PDF

Info

Publication number
CN112783408A
CN112783408A (application CN202110123320.6A)
Authority
CN
China
Prior art keywords
input
track
electronic device
target area
application
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110123320.6A
Other languages
Chinese (zh)
Inventor
李凯
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd filed Critical Vivo Mobile Communication Co Ltd
Priority to CN202110123320.6A priority Critical patent/CN112783408A/en
Publication of CN112783408A publication Critical patent/CN112783408A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04847Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04817Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance using icons
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures

Abstract

The application discloses a gesture navigation method, apparatus, and device for an electronic device, and a readable storage medium, belonging to the technical field of human-computer interaction. The method includes: receiving a first input of a user on a screen of the electronic device; in response to the first input, displaying area information and determining a sliding track of the first input, where the area information indicates a target area, the sliding track includes at least a track starting point and a track end point, and the track starting point is located at the edge of the screen of the electronic device; executing a first operation when the track end point is located within the target area; otherwise, executing a second operation. The method reduces the risk of the electronic device being dropped and damaged while the user performs a navigation gesture, and avoids mutual interference between navigation gestures.

Description

Gesture navigation method and device of electronic equipment, equipment and readable storage medium
Technical Field
The application belongs to the technical field of human-computer interaction, and particularly relates to a gesture navigation method of electronic equipment, a gesture navigation device of the electronic equipment, the electronic equipment and a readable storage medium.
Background
With the development of technology, more and more electronic devices adopt a full-screen configuration.
Currently, a user performs navigation gestures through a navigation key at the bottom of the full screen of an electronic device, or through three navigation gesture areas divided on the full screen, to operate the electronic device.
However, full screens are becoming increasingly large. When a user performs a navigation gesture using the bottom navigation key or a navigation gesture area, it can be difficult to reach the key or area while holding the device, which may cause the electronic device to fall and be damaged. In addition, when the user operates the electronic device using the navigation gesture areas, the navigation gestures may interfere with one another.
Disclosure of Invention
An object of the embodiments of the present application is to provide a gesture navigation method, apparatus, device and readable storage medium for an electronic device, which can reduce the risk of damage to the electronic device due to a user performing a navigation gesture and avoid the problem of mutual interference of the navigation gestures.
In order to solve the technical problem, the present application is implemented as follows:
in a first aspect, an embodiment of the present application provides a gesture navigation method for an electronic device, including:
receiving a first input of a user on a screen of an electronic device;
displaying area information in response to the first input, and determining a sliding track of the first input, wherein the area information is used for indicating a target area, the sliding track at least comprises a track starting point and a track ending point, and the track starting point is located at the edge of a screen of the electronic equipment;
executing a first operation under the condition that the track end point is located in the target area; otherwise, the second operation is executed.
In a second aspect, an embodiment of the present application provides a gesture navigation apparatus for an electronic device, including:
the first receiving module is used for receiving a first input of a user on a screen of the electronic equipment;
a first response module, configured to display area information in response to the first input, and determine a sliding track of the first input, where the area information is used to indicate a target area, the sliding track includes at least a track starting point and a track ending point, and the track starting point is located at an edge of a screen of an electronic device;
the execution module is used for executing a first operation under the condition that the track end point is located in the target area; otherwise, the second operation is executed.
In a third aspect, an embodiment of the present application provides an electronic device, including a processor, a memory, and a program or instructions stored on the memory and executable on the processor, where the program or instructions, when executed by the processor, implement the steps of the gesture navigation method of the electronic device according to the first aspect.
In a fourth aspect, embodiments of the present application provide a readable storage medium, on which a program or instructions are stored, which when executed by a processor implement the steps of the method according to the first aspect.
In a fifth aspect, an embodiment of the present application provides a chip, where the chip includes a processor and a communication interface, where the communication interface is coupled to the processor, and the processor is configured to execute a program or instructions to implement the method according to the first aspect.
In an embodiment of the application, a gesture navigation method of an electronic device is provided, which includes: receiving a first input of a user on a screen of the electronic device; in response to the first input, displaying area information and determining a sliding track of the first input, where the area information indicates a target area, the sliding track includes at least a track starting point and a track end point, and the track starting point is located at the edge of the screen of the electronic device; executing a first operation when the track end point is located within the target area; otherwise, executing a second operation. On this basis, the first input can be performed starting from the edge of the screen to control the electronic device to execute different navigation operations. In this way, the user can easily perform the first input with the fingers of the hand gripping the electronic device, which reduces the risk of the device being dropped and damaged while performing a navigation gesture. In addition, because the target area is displayed, the user can accurately perform the first input, avoiding mutual interference between navigation gestures.
Drawings
Fig. 1 is a schematic flowchart of a gesture navigation method of an electronic device according to an embodiment of the present disclosure;
FIG. 2a is a first schematic diagram of a first input and a target area provided by an embodiment of the present application;
FIG. 2b is a second schematic diagram of a first input and target area provided by an embodiment of the present application;
FIG. 3 is a third schematic diagram of a first input and target area provided by an embodiment of the present application;
FIG. 4 is a diagram illustrating an example of a screen display interface and a second input;
FIG. 5 is a schematic diagram of a first input provided by an embodiment of the present application;
FIG. 6a is a diagram illustrating a first screen display and a third input according to an embodiment of the present disclosure;
FIG. 6b is a diagram of a second screen display and a third input provided by the embodiment of the present application;
fig. 7 is a schematic structural diagram of a gesture navigation apparatus of an electronic device according to an embodiment of the present disclosure;
fig. 8 is a schematic diagram of a hardware structure of an electronic device implementing an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some, but not all, embodiments of the present application. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The terms "first", "second", and the like in the description and claims of the present application are used to distinguish between similar elements and do not necessarily describe a particular sequential or chronological order. It is to be understood that terms so used are interchangeable under appropriate circumstances, so that the embodiments of the application can operate in sequences other than those illustrated or described herein. In addition, "and/or" in the specification and claims denotes at least one of the connected objects, and the character "/" generally indicates an "or" relationship between the objects before and after it.
The method, the apparatus, the device and the readable storage medium for gesture navigation of the electronic device provided by the embodiments of the present application are described in detail below with reference to the accompanying drawings through specific embodiments and application scenarios thereof.
The embodiment of the application provides a gesture navigation method of an electronic device, as shown in fig. 1, the method includes the following steps S1100-S1300:
S1100, receiving a first input of a user on a screen of the electronic device.
S1200, responding to the first input, displaying the area information, and determining the sliding track of the first input.
The area information is used for indicating a target area, the sliding track comprises a track starting point and a track ending point, and the track starting point is located at the edge of a screen of the electronic equipment.
In an embodiment of the application, the first input performed by the user is a navigation gesture with a track starting from an edge of a screen of the electronic device. The sliding track corresponding to the first input at least comprises a track starting point and a track ending point.
In the embodiment of the present application, a trajectory starting point of the sliding trajectory corresponding to the first input may be located at a left edge or a right edge of a screen of the electronic device, which may be determined according to a habit of a user using the electronic device. For example, in a case where the user is accustomed to holding the electronic device with the left hand, the user may set, in the setting function of the electronic device, that the trajectory starting point corresponding to the first input corresponding to when the electronic device is controlled to perform the first operation or the second operation is located at the left edge of the screen of the electronic device.
It can be understood that the starting point of the sliding track of the first input is located at the edge of the screen of the electronic device, so that the user can conveniently execute the first input, the operation difficulty is reduced, and the risk of damage caused by falling of the electronic device due to execution of the navigation gesture is reduced.
In one example of the present application, the first input and the target area may be exemplarily as shown in fig. 2a and 2 b.
The target area may be a circular area, a rectangular area, or the like, and the present application is not limited thereto.
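Whether the track end point falls within the target area is a simple hit test. A minimal sketch in Python (the shapes and coordinate conventions are illustrative assumptions, not part of the patent):

```python
import math

def in_circular_area(point, center, radius):
    """True if point falls inside a circular target area."""
    dx, dy = point[0] - center[0], point[1] - center[1]
    return math.hypot(dx, dy) <= radius

def in_rectangular_area(point, top_left, bottom_right):
    """True if point falls inside an axis-aligned rectangular target area."""
    x, y = point
    return (top_left[0] <= x <= bottom_right[0]
            and top_left[1] <= y <= bottom_right[1])
```

Either shape can back the target area; only the hit-test predicate changes.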
In an embodiment of the application, the electronic device first displays the area information in response to the first input. The displayed area information prompts the user with the specific location of the target area, so that the user can accurately perform the first input. Then, upon completion of the first input, the electronic device determines the sliding track of the first input.
S1300, executing a first operation under the condition that the track end point is located in the target area; otherwise, executing the second operation.
In one example, the first operation is an operation of returning to the desktop, and the second operation is an operation of returning to the upper-level identification page. Alternatively, the first operation is an operation of returning to the upper-level identification page, and the second operation is an operation of returning to the desktop. Alternatively, the first operation is an operation of returning to the desktop, and the second operation is displaying the identification pages of applications started in the background. It should be noted that the first operation and the second operation may be set according to actual requirements.
In an embodiment of the present application, as shown in fig. 2a, when the sliding track corresponding to the first input is a straight track starting from the edge of the screen, and the track ending point is located outside the target area (i.e. not within the target area, such as located between the target area and the track starting point, or located beyond the target area), the second operation is performed.
In an embodiment of the present application, as shown in fig. 2b, in a case that the trajectory corresponding to the first input is a straight trajectory starting from the edge of the screen, and the trajectory end point is located within the target area, the first operation is performed.
It should be noted that fig. 2a and 2b illustrate the track starting point located at the left edge of the screen, and fig. 2a illustrates the track ending point located between the target area and the track starting point.
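The branch in step S1300 can be sketched as follows. The edge tolerance, the circular target area, and the sampled-point representation of the track are illustrative assumptions, not values taken from the patent:

```python
import math

EDGE_TOLERANCE = 20  # px; how close to a screen edge counts as "edge" (assumed)

def starts_at_edge(start, screen_width):
    """True if the track starting point lies on the left or right screen edge."""
    return start[0] <= EDGE_TOLERANCE or start[0] >= screen_width - EDGE_TOLERANCE

def dispatch(track, target_center, target_radius, screen_width):
    """Select an operation from a sampled sliding track.

    Returns 'first' if the track end point lies within the (assumed circular)
    target area, 'second' otherwise, and None when the track does not start
    at a screen edge and is therefore not treated as a navigation gesture.
    """
    start, end = track[0], track[-1]
    if not starts_at_edge(start, screen_width):
        return None
    dist = math.hypot(end[0] - target_center[0], end[1] - target_center[1])
    return "first" if dist <= target_radius else "second"
```

A track ending short of the target area, as in fig. 2a, selects the second operation; one ending inside it, as in fig. 2b, selects the first.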
In the embodiment of the application, different first inputs share the same track starting point and are distinguished by the track end points of their sliding tracks. On this basis, the sliding tracks corresponding to different first inputs have overlapping portions; that is, different first inputs are consistent with one another. In this way, the user does not experience a sense of discontinuity when performing the first input, which improves the user experience.
Further, as shown in fig. 2a and 2b, it is obvious that the navigation gestures corresponding to different first inputs have consistency.
In an embodiment of the application, a gesture navigation method of an electronic device is provided, which includes receiving a first input of a user on a screen of the electronic device; in response to a first input, displaying area information, and determining a sliding track of the first input, wherein the area information is used for indicating a target area, the sliding track at least comprises a track starting point and a track ending point, and the track starting point is located at the edge of a screen of the electronic equipment; executing a first operation under the condition that the track end point is located in the target area; otherwise, the second operation is executed. On the basis, the first input can be executed based on the edge of the screen so as to control the electronic equipment to execute different navigation operations. In this way, the user can easily perform the first input with fingers while gripping the electronic device, and thus, this reduces the risk of damage to the electronic device due to dropping of the electronic device caused by performing the navigation gesture. In addition, the target area is displayed, so that the user can accurately execute the first input, and the problem of mutual interference of navigation gestures can be avoided.
Further, because the first inputs are consistent with one another, the user does not experience a sense of discontinuity when performing them, which improves the user experience.
In an embodiment of the present application, the above S1300 may also be implemented by the following S1310 and S1320:
S1310, executing a first operation when the first input includes only a first preset input and the track end point of the sliding track of the first input is located within the target area.
S1320, executing a second operation when the first input includes a first preset input and a second preset input and the track end point of the sliding track of the first input is located outside the target area.
In the embodiment of the present application, "the first input includes only the first preset input" means a continuous input by the user, starting from an edge of the screen of the electronic device, that contains no inflection point.
Correspondingly, "the first input includes a first preset input and a second preset input" means a continuous input by the user, starting from an edge of the screen of the electronic device, that contains an inflection point; that is, the first preset input and the second preset input are two consecutive inputs.
In one example, as shown in FIG. 3, the first predetermined input may be a straight input and the second predetermined input may be an arc input.
It should be noted that the second preset input may be an arc curving upward or an arc curving downward, which is not limited in this application. Fig. 3 shows an example in which the second preset input is a single arc curving downward.
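Distinguishing a first-preset-only input (no inflection point) from one that also contains a second preset input (an inflection into an arc) can be sketched by watching for a heading change along the sampled trajectory. The 30-degree threshold is an assumed value, not one specified by the patent:

```python
import math

def has_inflection(points, threshold_deg=30.0):
    """True if the heading between consecutive segments of the sampled
    trajectory ever changes by more than threshold_deg degrees."""
    headings = []
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        headings.append(math.degrees(math.atan2(y1 - y0, x1 - x0)))
    for h0, h1 in zip(headings, headings[1:]):
        delta = abs(h1 - h0)
        delta = min(delta, 360 - delta)  # handle wrap-around at +/-180 degrees
        if delta > threshold_deg:
            return True
    return False
```

A straight swipe yields no inflection (step S1310's case); a swipe that bends into an arc does (step S1320's case).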
In an embodiment of the present application, the performing of the second operation in S1320 may be specifically implemented by the following S1321:
S1321, displaying the identification pages of applications started in the background when the track end point is located outside the target area.
In the embodiment of the application, when the first input includes a first preset input and a second preset input and the track end point of the sliding track is located outside the target area, the identification pages of applications started in the background are displayed. Displaying these identification pages abruptly changes the user's visual experience; therefore, triggering this display only through the more complex first input reduces accidental triggering and the resulting abrupt visual change, improving the user experience.
In an embodiment of the application, when the identification pages of applications started in the background are displayed in S1321, they may be displayed in a conventional manner, for example, arranged vertically adjacent to one another.
In an embodiment of the application, in the case that there are a plurality of identification pages of the background opened application, the displaying of the identification page of the background opened application in S1321 may be implemented by the following S1321-1:
S1321-1, displaying the identification pages of the plurality of background-started applications in a fan-shaped overlapping arrangement.
For example, the display of the identification pages of multiple background opened applications in a fan-shaped overlapping arrangement may be as shown in fig. 4.
In the embodiment of the application, the plurality of identification pages with the opened background applications are displayed in a fan-shaped overlapping arrangement mode, so that more identification pages with the opened background applications can be displayed in a screen of the electronic equipment, and meanwhile, part of information of the identification pages can be shielded to protect the privacy of a user, so that the use experience of the user is further improved.
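The fan-shaped overlapping arrangement can be sketched by assigning each identification page a rotation angle around a common pivot; the total spread angle is an assumed parameter, not a value from the patent:

```python
def fan_layout(num_pages, spread_deg=60.0):
    """Return one rotation angle (degrees) per identification page so the
    pages overlap in a fan centred on 0 degrees; neighbouring pages are
    evenly spaced across spread_deg."""
    if num_pages == 1:
        return [0.0]
    step = spread_deg / (num_pages - 1)
    return [-spread_deg / 2 + i * step for i in range(num_pages)]
```

Each page would then be drawn rotated by its angle about a pivot below the screen, so later pages partially cover earlier ones, which is also what hides part of each page's content.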
In an embodiment of the present application, after the step S1321-1, the gesture navigation method of the electronic device according to the embodiment of the present application further includes the following steps S1410 and S1411:
S1410, receiving a second input of the user on the screen of the electronic device.
S1411, in response to the second input, adjusting the identification page of the started application selected by the second input to the first layer of the fan.
In this embodiment of the application, the second input is an operation controlling the identification pages of the multiple background-started applications displayed in the fan-shaped overlapping arrangement. The navigation gesture corresponding to the second input may be an input along the edge of the fan. After the second input is received, the selected identification page of a background-started application is displayed on the first layer of the fan in response to it. Alternatively, in response to the second input, the fan is rotated as a whole so that the identification pages of the background-started applications move together.
In one example, the second input described above may be as shown in FIG. 4.
In an embodiment of the present application, the performing of the second operation in S1320 may be specifically implemented by the following S1322:
and S1322, displaying the icon of the opened application and the icon of the set application in the background in a dial mode when the track end point is positioned outside the target area.
In the embodiment of the application, when the first input comprises a first preset input and a second preset input and the first sliding track point is located outside the target area, the icon of the opened application and the icon of the set application in the background are displayed in a dial mode.
A set application may be a shortcut application customized by the user.
In one example, the first input may be as shown in fig. 5, and the display of icons of background-started applications and set applications in the form of a dial may be as shown in figs. 6a and 6b. Fig. 6b additionally shows a direction icon displayed at the center of the dial.
Fig. 6a and 6b show examples in which the applications that have been started in the background are application 1 ', application 2', application 3 ', and application 4', and the applications are set to application 1, application 2, and application 3.
In the embodiment of the present application, the user can also operate the dial. The user can perform an up-slide or down-slide operation on the edge of the dial to rotate it. Further, the amount of rotation may be controlled according to the sliding distance or the number of slides of the user's up-slide or down-slide operations on the edge of the dial.
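Mapping the user's sliding distance on the dial edge to a rotation, and reading off the icon the direction indicator points at, can be sketched as follows; the pixels-per-step calibration is an assumed value, not one from the patent:

```python
def rotate_dial(icons, slide_distance, px_per_step=80):
    """Rotate a dial of application icons by a number of steps proportional
    to the slide distance (px_per_step calibrates distance to steps), and
    return the icon the fixed direction indicator points at afterwards."""
    steps = round(slide_distance / px_per_step)
    return icons[steps % len(icons)]
```

A longer slide advances the dial through more icons, wrapping around once every icon has passed the indicator.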
As shown in fig. 6a, when the user operates the dial and the icon of a desired application becomes visible, the user may click that icon to control the screen of the electronic device to display the identification page of the corresponding application.
As shown in fig. 6b, upon operation of the dial by the user, the screen of the electronic device is controlled to display an identification page corresponding to the application icon pointed to by the direction icon when the dial stops rotating. On this basis, after the above S1322, the gesture navigation method of the electronic device according to the embodiment of the present application further includes the following S1420 and S1421:
and S1420, receiving a third input of the user on the screen of the electronic equipment.
And S1421, responding to a third input, rotating the dial according to the third input, and displaying an identification page corresponding to the application icon pointed by the direction icon when the dial stops rotating.
In the embodiment of the application, the identification page corresponding to the application icon pointed by the direction icon is displayed when the dial stops rotating, so that the user does not need to perform the operation of clicking the icon of the application displayed in the dial form on the basis of the third input, and the operation times of the user are further reduced.
In the embodiment of the application, displaying the icons of background-started applications and set applications in the form of a dial lets the user quickly open the identification pages of recently started applications and of the set applications, further improving the user experience. In addition, to reach the identification page of a set application, the user no longer needs to return to the desktop and then open the application, which reduces the number of operations.
In addition, in one embodiment of the present application, different first and second preset inputs may be defined, thereby combining the above S1321 and S1322. For example, when the first input is as shown in fig. 3, S1321 is executed, and when the first input is as shown in fig. 5, S1322 is executed.
An embodiment of the present application provides a gesture navigation apparatus 700 of an electronic device, as shown in fig. 7, including: a first receiving module 701, a first responding module 702, and an executing module 703. Wherein:
a first receiving module 701, configured to receive a first input of a user on a screen of an electronic device;
a first response module 702, configured to display area information in response to the first input, and determine a sliding track of the first input, where the area information is used to indicate a target area, the sliding track includes at least a track starting point and a track ending point, and the track starting point is located at an edge of a screen of the electronic device;
an executing module 703, configured to execute a first operation when the trajectory end point is located in the target area; otherwise, the second operation is executed.
In one embodiment, the executing module 703 includes:
the first execution unit is used for executing a first operation under the condition that the first input only comprises a first preset input and the track end point of the sliding track of the first input is positioned in a target area;
and the second execution unit is used for executing a second operation under the condition that the first input comprises the first preset input and the second preset input and the track end point of the sliding track of the first input is positioned outside the target area.
In an embodiment, the second execution unit is specifically configured to:
and displaying an identification page of the started application in the background under the condition that the track end point is positioned outside the target area.
In an embodiment, when there are a plurality of identification pages of applications started in the background, the second execution unit is specifically configured to:
display the identification pages of the plurality of applications started in the background in a fan-shaped overlapping arrangement.
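The fan-shaped overlapping arrangement can be illustrated by spreading the rotation angles of the background-application pages evenly across a fixed sweep. The sweep angle, method names, and centering convention below are assumptions for illustration, not details taken from the patent:

```java
/**
 * Illustrative sketch: distribute N identification pages across a fan.
 * Each page i gets a rotation angle; angles are centered on 0 degrees so the
 * stack fans out symmetrically, and pages overlap because they share a pivot.
 */
class FanLayout {
    static double[] cardAngles(int count, double totalSweepDegrees) {
        double[] angles = new double[count];
        if (count <= 1) return angles; // a single page needs no rotation
        double step = totalSweepDegrees / (count - 1);
        for (int i = 0; i < count; i++) {
            angles[i] = -totalSweepDegrees / 2.0 + i * step;
        }
        return angles;
    }
}
```

On Android, for example, each computed angle could be applied as a view rotation around a shared pivot point near the bottom of the screen.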
In one embodiment, the apparatus further comprises a second receiving module and a second responding module, wherein:
the second receiving module is used for receiving a second input of the user on the screen of the electronic equipment;
and the second response module is used for responding to the second input, and adjusting the page of the opened application selected by the second input to the first layer of the fan.
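Adjusting the selected page to the first layer of the fan amounts to reordering the stack of background pages. The sketch below is an assumed minimal model; the class name and page identifiers are hypothetical:

```java
import java.util.ArrayList;
import java.util.List;

/**
 * Illustrative sketch: bring the page selected by the second input to the
 * first layer (index 0) of the fan, leaving the relative order of the
 * remaining pages unchanged.
 */
class FanStack {
    static List<String> bringToFront(List<String> pages, String selected) {
        List<String> reordered = new ArrayList<>(pages);
        if (reordered.remove(selected)) {
            reordered.add(0, selected); // index 0 = topmost layer of the fan
        }
        return reordered;
    }
}
```

If the selected page is not in the stack, the list is returned unchanged, which is a reasonable fallback when the second input misses all pages.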
The gesture navigation apparatus of the electronic device in the embodiment of the present application may be a stand-alone device, or may be a component, an integrated circuit, or a chip in a terminal. The apparatus may be a mobile electronic device, for example, a mobile phone, a tablet computer, a palmtop computer, a wearable device, a netbook, or a Personal Digital Assistant (PDA); the embodiments of the present application are not specifically limited in this respect.
The gesture navigation apparatus of the electronic device in the embodiment of the application may be an apparatus with an operating system. The operating system may be an Android operating system, an iOS operating system, or another possible operating system; the embodiments of the present application are not specifically limited in this respect.
The gesture navigation apparatus of the electronic device provided in the embodiment of the present application can implement each process of the method embodiment of fig. 1; to avoid repetition, details are not repeated here.
In an embodiment of the application, a gesture navigation apparatus of an electronic device is provided. The first receiving module receives a first input of a user on a screen of the electronic device; the first response module, in response to the first input, displays area information and determines a sliding track of the first input, where the area information indicates a target area, the sliding track includes at least a track starting point and a track ending point, and the track starting point is located at the edge of the screen of the electronic device; the execution module executes a first operation when the track end point is located in the target area, and executes a second operation otherwise. On this basis, the first input can be performed from the edge of the screen to control the electronic device to perform different navigation operations. In this way, the user can easily perform the first input with a finger while gripping the electronic device, which reduces the risk of dropping and damaging the electronic device while performing a navigation gesture. In addition, because the target area is displayed, the user can perform the first input accurately, and mutual interference between navigation gestures can be avoided.
Optionally, an electronic device is further provided in this embodiment of the present application, including a processor 810, a memory 809, and a program or instructions stored in the memory 809 and executable on the processor 810. When executed by the processor 810, the program or instructions implement each process of the gesture navigation method embodiment of the electronic device and can achieve the same technical effect; to avoid repetition, details are not repeated here.
It should be noted that the electronic device in the embodiment of the present application includes the mobile electronic device described above.
Fig. 8 is a schematic diagram of a hardware structure of an electronic device implementing an embodiment of the present application.
The electronic device 800 includes, but is not limited to: a radio frequency unit 801, a network module 802, an audio output unit 803, an input unit 804, a sensor 805, a display unit 806, a user input unit 807, an interface unit 808, a memory 809, and a processor 810.
Those skilled in the art will appreciate that the electronic device 800 may further include a power source (e.g., a battery) for supplying power to the various components; the power source may be logically connected to the processor 810 via a power management system, so that charging, discharging, and power consumption management are handled by the power management system. The electronic device structure shown in fig. 8 does not constitute a limitation of the electronic device; the electronic device may include more or fewer components than those shown, combine some components, or have a different arrangement of components, and details are omitted here.
The user input unit 807 is used for the user to perform the first input. Optionally, the user input unit 807 is also used for the user to perform a second input.
a processor 810 for receiving a first input of a user on a screen of an electronic device;
displaying area information in response to the first input, and determining a sliding track of the first input, wherein the area information is used for indicating a target area, the sliding track at least comprises a track starting point and a track ending point, and the track starting point is located at the edge of a screen of the electronic equipment;
executing a first operation under the condition that the track end point is located in the target area; otherwise, executing a second operation.
In the embodiment of the application, a gesture navigation method of an electronic device is provided. The processor receives a first input of a user on a screen of the electronic device; in response to the first input, displays area information and determines a sliding track of the first input, where the area information indicates a target area, the sliding track includes at least a track starting point and a track ending point, and the track starting point is located at the edge of the screen of the electronic device; and executes a first operation when the track end point is located in the target area, or a second operation otherwise. On this basis, the first input can be performed from the edge of the screen to control the electronic device to perform different navigation operations. In this way, the user can easily perform the first input with a finger while gripping the electronic device, which reduces the risk of dropping and damaging the electronic device while performing a navigation gesture. In addition, because the target area is displayed, the user can perform the first input accurately, and mutual interference between navigation gestures can be avoided.
Optionally, the processor 810 is specifically configured to:
display an identification page of an application started in the background under the condition that the track end point is located outside the target area.
Optionally, when there are multiple identification pages of applications started in the background, the processor 810 is specifically configured to:
display the identification pages of the plurality of applications started in the background in a fan-shaped overlapping arrangement.
Optionally, the processor 810 is further configured to:
receiving a second input of the user on the screen of the electronic equipment;
responding to the second input, and adjusting the page of the opened application selected by the second input to the first layer of the fan.
It should be understood that, in the embodiment of the present application, the input unit 804 may include a Graphics Processing Unit (GPU) 8041 and a microphone 8042; the graphics processing unit 8041 processes image data of a still picture or a video obtained by an image capturing device (such as a camera) in a video capturing mode or an image capturing mode. The display unit 806 may include a display panel 8061, which may be configured in the form of a liquid crystal display, an organic light-emitting diode, or the like. The user input unit 807 includes a touch panel 8071, also referred to as a touch screen, and other input devices 8072. The touch panel 8071 may include two parts: a touch detection device and a touch controller. The other input devices 8072 may include, but are not limited to, a physical keyboard, function keys (such as volume control keys and switch keys), a trackball, a mouse, and a joystick, which are not described in detail here. The memory 809 may be used to store software programs as well as various data, including but not limited to application programs and an operating system. The processor 810 may integrate an application processor, which mainly handles the operating system, user interfaces, and application programs, and a modem processor, which mainly handles wireless communication. It can be appreciated that the modem processor may alternatively not be integrated into the processor 810.
The embodiment of the present application further provides a readable storage medium on which a program or instructions are stored. When executed by a processor, the program or instructions implement each process of the embodiment of the gesture navigation method of the electronic device and can achieve the same technical effect; to avoid repetition, details are not repeated here.
The processor is the processor in the electronic device described in the above embodiment. The readable storage medium includes a computer readable storage medium, such as a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and so on.
The embodiment of the present application further provides a chip, which includes a processor and a communication interface coupled to the processor. The processor is configured to run a program or instructions to implement each process of the embodiment of the gesture navigation method of the electronic device and can achieve the same technical effect; to avoid repetition, details are not repeated here.
It should be understood that the chip mentioned in the embodiments of the present application may also be referred to as a system-level chip, a system chip, a chip system, or a system-on-chip.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element. Further, it should be noted that the scope of the methods and apparatus of the embodiments of the present application is not limited to performing the functions in the order illustrated or discussed, but may include performing the functions in a substantially simultaneous manner or in a reverse order based on the functions involved; for example, the methods described may be performed in an order different from that described, and various steps may be added, omitted, or combined. In addition, features described with reference to certain examples may be combined in other examples.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present application may be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the method according to the embodiments of the present application.
While the present embodiments have been described with reference to the accompanying drawings, it is to be understood that the invention is not limited to the precise embodiments described above, which are meant to be illustrative and not restrictive, and that various changes may be made therein by those skilled in the art without departing from the spirit and scope of the invention as defined by the appended claims.

Claims (12)

1. A gesture navigation method of an electronic device is characterized by comprising the following steps:
receiving a first input of a user on a screen of an electronic device;
displaying area information in response to the first input, and determining a sliding track of the first input, wherein the area information is used for indicating a target area, the sliding track at least comprises a track starting point and a track ending point, and the track starting point is located at the edge of a screen of the electronic equipment;
executing a first operation under the condition that the track end point is located in the target area; otherwise, executing a second operation.
2. The method of claim 1, wherein the executing a first operation under the condition that the track end point is located in the target area, and otherwise executing a second operation, comprises:
executing a first operation when the first input only comprises a first preset input and a track end point of a sliding track of the first input is located in the target area;
and executing a second operation when the first input comprises the first preset input and a second preset input and the track end point of the sliding track of the first input is positioned outside the target area.
3. The method of claim 2, wherein the performing the second operation comprises:
and displaying an identification page of the application started in the background under the condition that the track end point is positioned outside the target area.
4. The method according to claim 3, wherein in the case that there are a plurality of identification pages of the background opened application, the operation of displaying the identification pages of the background opened application comprises:
and displaying a plurality of identification pages of the application started in the background in a fan-shaped overlapping arrangement mode.
5. The method of claim 4, further comprising:
receiving a second input of a user on a screen of the electronic equipment;
responding to the second input, and adjusting the page of the opened application selected by the second input to the first layer of the fan.
6. A gesture navigation apparatus of an electronic device, comprising:
the first receiving module is used for receiving a first input of a user on a screen of the electronic equipment;
a first response module, configured to display area information in response to the first input, and determine a sliding track of the first input, where the area information is used to indicate a target area, the sliding track includes at least a track starting point and a track ending point, and the track starting point is located at an edge of a screen of the electronic device;
the execution module is used for executing a first operation under the condition that the track end point is located in the target area, and for executing a second operation otherwise.
7. The apparatus of claim 6, wherein the execution module comprises:
the first execution unit is used for executing a first operation under the condition that the first input only comprises a first preset input and the track end point of the sliding track of the first input is positioned in a target area;
and the second execution unit is used for executing a second operation under the condition that the first input comprises the first preset input and the second preset input and the track end point of the sliding track of the first input is positioned outside the target area.
8. The apparatus of claim 7, wherein the second execution unit is specifically configured to:
and displaying an identification page of the started application in the background under the condition that the track end point is positioned outside the target area.
9. The apparatus according to claim 8, wherein, in a case that there are a plurality of identification pages of the application that has been started in the background, the second execution unit is specifically configured to:
and displaying a plurality of identification pages of the application started in the background in a fan-shaped overlapping arrangement mode.
10. The apparatus of claim 9, further comprising:
the second receiving module is used for receiving a second input of the user on the screen of the electronic equipment;
and the second response module is used for responding to the second input, and adjusting the page of the opened application selected by the second input to the first layer of the fan.
11. An electronic device comprising a processor, a memory, and a program or instructions stored on the memory and executable on the processor, the program or instructions when executed by the processor implementing the steps of the gesture navigation method of the electronic device according to any one of claims 1-5.
12. A readable storage medium, on which a program or instructions are stored, which, when executed by a processor, carry out the steps of the gesture navigation method of an electronic device according to any one of claims 1-5.
CN202110123320.6A 2021-01-28 2021-01-28 Gesture navigation method and device of electronic equipment, equipment and readable storage medium Pending CN112783408A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110123320.6A CN112783408A (en) 2021-01-28 2021-01-28 Gesture navigation method and device of electronic equipment, equipment and readable storage medium

Publications (1)

Publication Number Publication Date
CN112783408A true CN112783408A (en) 2021-05-11

Family

ID=75759675

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110123320.6A Pending CN112783408A (en) 2021-01-28 2021-01-28 Gesture navigation method and device of electronic equipment, equipment and readable storage medium

Country Status (1)

Country Link
CN (1) CN112783408A (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113608610A (en) * 2021-07-14 2021-11-05 荣耀终端有限公司 Interaction control method, electronic equipment and system
CN113835609A (en) * 2021-09-27 2021-12-24 联想(北京)有限公司 Display method, system, electronic device and storage medium
WO2023030117A1 (en) * 2021-08-31 2023-03-09 维沃移动通信有限公司 Writing control method and apparatus, and electronic device and storage medium

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2014100948A1 (en) * 2012-12-24 2014-07-03 Nokia Corporation An apparatus and associated methods
CN104024985A (en) * 2011-12-28 2014-09-03 三星电子株式会社 Multitasking method and apparatus of user device
CN106095218A (en) * 2016-05-30 2016-11-09 努比亚技术有限公司 Information processing method and electronic equipment
CN108632451A (en) * 2018-03-23 2018-10-09 珠海市魅族科技有限公司 A kind of terminal control method and device, terminal, storage medium
CN111176508A (en) * 2019-12-27 2020-05-19 深圳集智数字科技有限公司 Task interface processing method and related device
CN111857513A (en) * 2020-07-17 2020-10-30 维沃移动通信有限公司 Background program control method and device and electronic equipment



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20210511