CN113504871A - Electronic equipment control method and device, chip and electronic equipment - Google Patents

Electronic equipment control method and device, chip and electronic equipment

Info

Publication number
CN113504871A
CN113504871A (application CN202111058458.9A)
Authority
CN
China
Prior art keywords
input event
screen
sliding
desktop
electronic equipment
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111058458.9A
Other languages
Chinese (zh)
Inventor
张宁宁 (Zhang Ningning)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Spreadtrum Communications Tianjin Co Ltd
Original Assignee
Spreadtrum Communications Tianjin Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Spreadtrum Communications Tianjin Co Ltd
Priority to CN202111058458.9A
Publication of CN113504871A


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/0485 Scrolling or panning
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00 Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/048 Indexing scheme relating to G06F3/048
    • G06F2203/04808 Several contacts: gestures triggering a specific function, e.g. scrolling, zooming, right-click, when the user establishes several contacts with the surface simultaneously; e.g. using several fingers or a combination of fingers and pen

Abstract

The present application provides an electronic device control method and apparatus, a chip, and an electronic device. The method is applied to an electronic device and includes: acquiring an input event; determining whether the input event is a return-to-desktop input event, where the return-to-desktop input event includes a first sliding input event and a second sliding input event performed synchronously, both of which are sliding input events whose starting points lie at the edge of the screen of the electronic device and whose sliding directions point to the center of the screen; and when the input event is a return-to-desktop input event, returning directly from the currently running interface of the electronic device to the system desktop. With this method, the electronic device can return from an application interface to the system desktop simply and quickly, which greatly simplifies operation of the electronic device and improves the user experience.

Description

Electronic equipment control method and device, chip and electronic equipment
Technical Field
The present application relates to the field of communications technologies, and in particular, to an electronic device control method, an electronic device control apparatus, a chip, and an electronic device.
Background
Smart electronic devices are increasingly widespread. For a smart device, a common control scheme is touch operation on the device's touch screen. However, as electronic devices become ever more integrated and miniaturized, smart electronic devices keep shrinking. Because device size is limited, the touch screens of some smart electronic devices are relatively small, which greatly restricts the space available for touch operation and makes touch control difficult for the user. For example, on a smart watch with a touch screen, performing touch operations on the watch-face screen is quite difficult.
Disclosure of Invention
To address the prior-art problem that a touch screen with too small an area makes touch operation difficult for the user, the present application provides an electronic device control method and apparatus, a chip, and an electronic device, and further provides a computer-readable storage medium.
The embodiment of the application adopts the following technical scheme:
In a first aspect, the present application provides an electronic device control method, where the method is applied to an electronic device, and the method includes:
acquiring an input event;
determining whether the input event is a return-to-desktop input event, where the return-to-desktop input event includes a first sliding input event and a second sliding input event performed synchronously, the first sliding input event and the second sliding input event being sliding input events whose starting points are located at the edge of the screen of the electronic device and whose sliding directions point to the center of the screen of the electronic device;
when the input event is a return-to-desktop input event, returning directly from the currently running interface of the electronic device to the system desktop; or, when the input event is a return-to-desktop input event, determining whether to trigger a direct return from the currently running interface of the electronic device to the system desktop, and when it is determined that such a return is triggered, returning directly from the currently running interface to the system desktop.
In one implementation manner of the first aspect, starting points of the first sliding input event and the second sliding input event are respectively located at edges of opposite sides of a screen of the electronic device.
In an implementation manner of the first aspect, an included angle formed by the starting point of the first sliding input event, the screen center of the electronic device, and the starting point of the second sliding input event is greater than a first included angle threshold.
In an implementation manner of the first aspect, an included angle between a direction pointing from a starting point of the first sliding input event to a center of a screen of the electronic device and a sliding direction of the first sliding input event is smaller than a second included angle threshold;
and/or,
an included angle between a direction pointing from the starting point of the second sliding input event to the center of the screen of the electronic device and the sliding direction of the second sliding input event is smaller than the second included angle threshold.
In one implementation of the first aspect, a distance between a starting point of the first sliding input event and an edge of a screen of the electronic device is less than a first distance threshold;
and/or,
the distance between the starting point of the second sliding input event and the edge of the screen of the electronic device is smaller than the first distance threshold.
In one implementation of the first aspect, a distance between a termination point of the first sliding input event and a screen center point of the electronic device is less than a second distance threshold;
and/or,
the distance between the termination point of the second sliding input event and the screen center point of the electronic device is smaller than the second distance threshold.
In one implementation manner of the first aspect, a starting point of the first sliding input event is located on a screen frame of the electronic device;
and/or,
the starting point of the second sliding input event is located on a screen frame of the electronic equipment.
In an implementation of the first aspect, the determining, when the input event is a return-to-desktop input event, whether to trigger a direct return from the currently running interface of the electronic device to the system desktop includes:
confirming whether the application currently running on the electronic device has configured a response to the return-to-desktop input event;
when the application currently running on the electronic device has not configured a response to the return-to-desktop input event, confirming that a direct return from the currently running interface of the electronic device to the system desktop is triggered.
In a second aspect, the present application further provides an electronic device control apparatus, including:
an input event acquisition module for acquiring an input event;
an input event identification module, configured to determine whether the input event is a return-to-desktop input event, where the return-to-desktop input event includes a first sliding input event and a second sliding input event performed synchronously, both being sliding input events whose starting points are located at the edge of the screen of the electronic device and whose sliding directions point to the center of the screen of the electronic device; and
an execution module, configured to: when the input event is a return-to-desktop input event, return directly from the currently running interface of the electronic device to the system desktop; or, when the input event is a return-to-desktop input event, determine whether to trigger a direct return from the currently running interface of the electronic device to the system desktop, and when it is determined that such a return is triggered, return directly from the currently running interface to the system desktop.
In a third aspect, the present application further provides an electronic chip, including:
a processor for executing computer program instructions stored on a memory, wherein the computer program instructions, when executed by the processor, trigger the electronic chip to perform the method steps of the first aspect.
In a fourth aspect, the present application also provides an electronic device comprising a memory for storing computer program instructions and a processor for executing the program instructions, wherein the computer program instructions, when executed by the processor, trigger the electronic device to perform the method steps according to the first aspect.
In a fifth aspect, the present application also provides a computer readable storage medium having stored thereon a computer program which, when run on a computer, causes the computer to perform the method according to the first aspect.
According to the technical scheme provided by the embodiment of the application, at least the following technical effects can be realized:
according to the method, the electronic equipment can simply and quickly return to the system desktop from the application interface, so that the complexity of operation of the electronic equipment is greatly simplified, and the user experience of the electronic equipment is improved.
Drawings
FIG. 1 is a schematic diagram of an application scenario according to an embodiment of the present application;
FIG. 2a is a schematic diagram of a hardware structure of a smart watch according to an embodiment of the present application;
FIG. 2b is a flow chart of a method according to an embodiment of the present application;
FIG. 3a is a schematic view of a first embodiment of a circular screen according to the present application;
FIG. 3b is a schematic diagram of a first embodiment of a square screen according to the present application;
FIG. 4a is a schematic view of a second embodiment of a circular screen according to the present application;
FIG. 4b is a schematic diagram of a second embodiment of a square screen according to the present application;
FIG. 4c is a schematic view of a third embodiment of a circular screen according to the present application;
FIG. 4d is a schematic view of a third embodiment of a square screen according to the present application;
FIG. 5a is a schematic view of a fourth embodiment of a circular screen according to the present application;
FIG. 5b is a schematic view of a fourth embodiment of a square screen according to the present application;
FIG. 5c is a schematic view of a fifth embodiment of a circular screen according to the present application;
FIG. 5d is a schematic view of a fifth embodiment of a square screen according to the present application;
FIG. 6a is a schematic view of a sixth embodiment of a circular screen according to the present application;
FIG. 6b is a schematic view of a sixth embodiment of a square screen according to the present application;
FIG. 6c is a schematic view of a seventh embodiment of a circular screen according to the present application;
FIG. 6d is a schematic view of a seventh embodiment of a square screen according to the present application;
FIG. 7a is a schematic view of an eighth embodiment of a circular screen according to the present application;
FIG. 7b is a schematic view of an eighth embodiment of a square screen according to the present application;
FIG. 8a is a schematic view of a ninth embodiment of a circular screen according to the present application;
FIG. 8b is a schematic view of a ninth embodiment of a square screen according to the present application;
FIG. 9 is a flow chart of a method according to another embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the technical solutions of the present application will be described in detail and completely with reference to the following specific embodiments of the present application and the accompanying drawings. It should be apparent that the described embodiments are only some of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The terminology used in the description of the embodiments section of the present application is for the purpose of describing particular embodiments of the present application only and is not intended to be limiting of the present application.
In practical control of smart devices, the return operation is very common and very important. Typically, a return operation is triggered by pressing a physical return key or by touching a return control on the interface. As the functions of applications (APPs) on smart devices become ever richer, their logical structures also grow more complex: an APP's operation interface usually has a multi-layer structure, and after launching an APP from the system interface, the user enters each layer of the APP's interfaces according to its control logic.
After finishing an operation at some layer of an APP, a user who wants to quit the APP and return to the system desktop can trigger the return operation repeatedly, backing out layer by layer. Another possible solution is to call up the operating system's application control interface and either close the current APP directly or switch directly from the application control interface to the system desktop.
However, calling up the operating system's application control interface requires a certain amount of interface space (for example, corresponding control buttons must be reserved on the touch-screen display interface). On an electronic device with a small touch screen it is therefore generally impossible to call up the application control interface, and the user can only repeat tedious return operations to get back to the system desktop step by step. For example, in a smart-watch product, the full-screen dial interface serves as the system desktop: it is the top-level interface, and each level of an application's interfaces can be entered from it. When it is necessary to return to the dial interface from a deep application interface, however, the user can only go back level by level.
To address the overly tedious process of returning from an application interface to the system desktop, one feasible solution is to trigger the return through a hardware key of the electronic device.
Specifically, an existing hardware key of the electronic device can be given an additional operation mode that triggers the return from the application interface to the system desktop; for example, pressing the power key twice in succession could trigger the return. However, overloading an existing hardware key in this way conflicts with the key's original function and easily causes false triggering. For example, suppose pressing the power key twice is set to trigger the return: if the user presses too slowly, or the device misreads the input and does not recognize two presses, the device may determine that the power key was pressed once and simply power off.
Alternatively, a new hardware key dedicated to triggering the return from the application interface to the system desktop could be added to the electronic device. However, adding a key inevitably increases the cost of the device, and the new key also squeezes the structural space of the existing keys, further reducing the controllable space of the device.
To solve the above problems, an embodiment of the present application provides an electronic device control method: without adding any hardware key, a specific touch operation on the electronic device triggers it to return from the application interface to the system desktop.
Specifically, in an embodiment of the present application, for an electronic device whose screen edge length is within comfortable reach of two fingers (for example, a smart watch screen, whose diameter or side length is usually between 25 and 42 mm), when the user's two fingers touch edge positions of the screen and slide simultaneously from the screen edge toward the screen center, the electronic device is triggered to return directly from the current display interface to the system desktop.
Fig. 1 is a schematic view of an application scenario according to an embodiment of the present application. As shown in the image in the dashed box 101 in fig. 1, two fingers of the user touch the edge of the screen of the smart watch, and slide from the edge of the screen to the center of the screen at the same time. Then, as shown by the image in the dashed box 102 in fig. 1, when the two fingers of the user slide to the center of the screen, the smart watch is triggered to return to the dial interface of the smart watch from the current application interface.
Specifically, fig. 2a is a schematic diagram illustrating a hardware structure of a smart watch according to an embodiment of the present application. As shown in fig. 2a, the smart watch 110 includes at least an input event acquisition module 112, an input event recognition module 113, and an execution module 114.
FIG. 2b is a flow chart of a method according to an embodiment of the present application. The smart watch 110 shown in fig. 2a executes the flow shown in fig. 2b to implement the operation of returning to the dial face interface (system desktop) from the current application interface.
S220: acquire an input event. The input event acquisition module 112 acquires the current input event of the smart watch 110.
Specifically, the smart watch 110 further includes a touch acquisition module and an input event generation module, and the touch acquisition module and the input event generation module may be integrated in a touch screen of the smart watch 110.
The touch acquisition module acquires an input signal (for example, a touch point trigger signal of a capacitive touch screen) generated by a user touch operation; the input event generating module generates an input event (for example, the input event includes touch point coordinates of the touch screen and trigger time) according to the input signal acquired by the touch acquisition module.
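As an illustrative aside (not part of the patent text), a minimal Kotlin sketch of what such an input event and the per-finger swipe tracks derived from it could look like; all names and fields here are assumptions for illustration, not the patent's data format:

```kotlin
import kotlin.math.hypot

// One touch sample: which finger (pointer), where, and when.
// Field names are illustrative assumptions.
data class TouchSample(val pointerId: Int, val x: Float, val y: Float, val timeMs: Long)

// A swipe track is the time-ordered list of samples produced by one finger.
data class SwipeTrack(val samples: List<TouchSample>) {
    val start: TouchSample get() = samples.first()
    val end: TouchSample get() = samples.last()
    val length: Float get() = hypot(end.x - start.x, end.y - start.y)
}

// Group the raw samples of one gesture into per-finger tracks.
fun toTracks(samples: List<TouchSample>): List<SwipeTrack> =
    samples.groupBy { it.pointerId }
        .values
        .map { SwipeTrack(it.sortedBy { s -> s.timeMs }) }
```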
S230: recognize the input event. The input event recognition module 113 recognizes the input event obtained by the input event acquisition module 112 and confirms its specific content (the user operation to which the input event corresponds), for example whether the current input event is a return-to-desktop input event (which corresponds to the user's two fingers touching the edge of the smart watch screen and sliding simultaneously from the screen edge toward the screen center).
The execution module 114 is configured to respond to the input event recognized by the input event recognition module 113.
S240: judge whether the input event is a return-to-desktop input event. The execution module 114 determines, according to the recognition result of S230, whether the current input event is a return-to-desktop input event.
if so, S250, returning to the system desktop, and the execution module 114 makes the display interface of the smart watch 110 return to the dial interface.
If not, S260: execute the event response operation. The execution module 114 confirms the response operation for the input event according to the recognition result of S230 and executes it.
With this method, the electronic device can return from an application interface to the system desktop simply and quickly, which greatly simplifies operation of the electronic device and improves the user experience.
Further, the embodiment of the present application does not restrict the specific implementation logic of S230; those skilled in the art can set the recognition logic of S230 according to the actual situation. Specifically, S230 identifies whether the input event includes a first sliding input event and a second sliding input event, where the two sliding input events are performed synchronously, their starting points are located at the edge of the screen of the electronic device, and their sliding directions point to the center of the screen of the electronic device. When the input event includes such a first sliding input event and second sliding input event, the input event can be recognized as a return-to-desktop input event.
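Purely as a hedged sketch of one way this recognition could be implemented for a circular screen (the tolerances are the example values used later in this description; nothing here is mandated by the claims, and the reading of "performed synchronously" as overlapping in time is an assumption):

```kotlin
import kotlin.math.hypot

data class Point(val x: Float, val y: Float)
// A swipe reduced to its start point, end point, and time span.
data class Swipe(val start: Point, val end: Point, val t0: Long, val t1: Long)

fun dist(p: Point, q: Point): Float = hypot(p.x - q.x, p.y - q.y)

// Sketch: recognize a return-to-desktop input event on a circular screen
// with center c and radius r. Example tolerance: r / 3 for both the
// start-at-edge and the end-near-center checks.
fun isReturnToDesktopEvent(a: Swipe, b: Swipe, c: Point, r: Float): Boolean {
    // "Performed synchronously" read here as overlapping in time (assumption).
    val synchronous = a.t0 < b.t1 && b.t0 < a.t1
    fun startsAtEdge(s: Swipe) = r - dist(s.start, c) < r / 3f
    fun endsNearCenter(s: Swipe) = dist(s.end, c) < r / 3f
    return synchronous &&
        startsAtEdge(a) && startsAtEdge(b) &&
        endsNearCenter(a) && endsNearCenter(b)
}
```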
For example, fig. 3a is a schematic view of a screen of an electronic device according to a first embodiment of the circular screen of the present application. As shown in fig. 3a, the electronic device screen 300a is circular, and the point O3a is the center point of the screen 300a. In S220, two input events, a first sliding input event and a second sliding input event, are generated synchronously.
The first sliding input event starts at point 301, with its sliding direction pointing to O3a; the second sliding input event starts at point 302, with its sliding direction pointing to O3a. Points 301 and 302 are points on the edge of the screen 300a; therefore, the current input events (the synchronously performed first and second sliding input events) can be recognized as a return-to-desktop input event (corresponding to the user touching edge positions of the smart watch screen with two fingers and sliding simultaneously from the screen edge toward the screen center).
Further, the screen of the electronic device may have shapes other than circular. For example, fig. 3b is a schematic view of a screen of an electronic device according to a first embodiment of the square screen of the present application. As shown in fig. 3b, the electronic device screen 300b is square, and the point O3b is the center point of the screen 300b. In S220, two input events, a first sliding input event and a second sliding input event, are generated synchronously.
The first sliding input event starts at point 301b, with its sliding direction pointing to O3b; the second sliding input event starts at point 302b, with its sliding direction pointing to O3b. Points 301b and 302b are midpoints of two sides of the screen 300b; therefore, the current input events (the synchronously performed first and second sliding input events) can be recognized as a return-to-desktop input event (corresponding to the user touching edge positions of the smart watch screen with two fingers and sliding simultaneously from the screen edge toward the screen center).
Further, the user's sliding operation may start from any point on the screen edge. For example, 301a and 302a are arbitrary points on the edge of the screen 300b: the first sliding input event starts at 301a with its sliding direction pointing to O3b, and the second sliding input event starts at 302a with its sliding direction pointing to O3b. As another example, 301c and 302c are vertices of the square screen 300b: the first sliding input event starts at 301c with its sliding direction pointing to O3b, and the second sliding input event starts at 302c with its sliding direction pointing to O3b.
Furthermore, the screen of the electronic device may also have other shapes, such as a rectangle, a diamond, an ellipse, or a triangle.
Further, in a practical application scenario, when the sliding start points of the first sliding input event and the second sliding input event are close together, the two events are easily recognized as two sliding events sliding in parallel. Therefore, to avoid recognition errors, in the embodiment of the present application the sliding start points of the first and second sliding input events must be points on opposite sides of the screen edge about the screen center point; only then are the first and second sliding input events recognized as triggering the return-to-desktop operation of returning to the system desktop. That is, the user's two fingers touch opposite edges of the smart watch screen and slide simultaneously from the screen edge toward the screen center, which triggers the smart watch to return to the system desktop. For example, as shown in fig. 3a, 301 and 302 are located on two opposite edges of the screen 300a about O3a (the line connecting 301 and 302 passes through O3a); as shown in fig. 3b, 301a and 302a are located on two opposite edges of the screen 300b about O3b (the line connecting 301a and 302a passes through O3b); 301b and 302b are located on two opposite edges of the screen 300b about O3b (the line connecting 301b and 302b passes through O3b); and 301c and 302c are located on two opposite edges of the screen 300b about O3b (the line connecting 301c and 302c passes through O3b).
Further, in an actual application scenario, the user cannot position touch operations perfectly. For example, when a user tries to touch two points on opposite sides of the smart watch screen with two fingers, the two points may not be exactly opposite each other about the screen center.
For example, fig. 4a is a schematic view of a second embodiment of a circular screen according to the present application. As shown in fig. 4a, the screen of the electronic device is circular, and the point O4a is the center point of the screen. In S220, two input events, a first slide input event and a second slide input event, are generated in synchronization.
The first sliding input event starts at 401b, with its sliding direction pointing to O4a; the second sliding input event starts at 402, with its sliding direction pointing to O4a. Both 401b and 402 are points on the screen edge, but, with O4a as the center, the point opposite 402 is 401a, and 401b is not at the position of 401a. The first and second sliding input events therefore do not slide from two exactly opposite points of the screen.
For another example, fig. 4b is a schematic diagram of a second embodiment of a square screen according to the present application, and as shown in fig. 4b, the electronic device screen is square, and the point O4b is a center point of the screen. In S220, two input events, a first slide input event and a second slide input event, are generated in synchronization.
The first sliding input event starts at 403b, with its sliding direction pointing to O4b; the second sliding input event starts at 404, with its sliding direction pointing to O4b. Both 403b and 404 are points on the screen edge, but, with O4b as the center, the point opposite 404 is 403a, and 403b is not at the position of 403a. The first and second sliding input events therefore do not slide from two exactly opposite points of the screen.
To address this, in an embodiment of the present application, when the included angle formed by the starting point of the first sliding input event, the screen center point, and the starting point of the second sliding input event is greater than a preset included angle threshold (the first included angle threshold, for example 120 degrees), the sliding start points of the two events are considered to be points on opposite sides of the screen edge about the screen center point.
For example, fig. 4c is a schematic view of a third embodiment of a circular screen according to the present application. As shown in fig. 4c, the screen of the electronic device is circular, and the point O4c is the center point of the screen. In S220, two input events, a first slide input event and a second slide input event, are generated in synchronization.
The first sliding input event starts at 405b, with its sliding direction pointing to O4c; the second sliding input event starts at 406, with its sliding direction pointing to O4c. When the included angle (A) formed by 405b, O4c, and 406 is greater than the preset first included angle threshold, for example 120 degrees, the current input events (the synchronously performed first and second sliding input events) can be recognized as a return-to-desktop input event (corresponding to the user touching edge positions of the smart watch screen with two fingers and sliding simultaneously from opposite screen edges toward the screen center).
For another example, fig. 4d is a schematic diagram of a third embodiment of a square screen according to the present application. As shown in fig. 4d, the electronic device screen is square, and the point O4d is the center point of the screen. In S220, two input events, a first slide input event and a second slide input event, are generated in synchronization.
The first sliding input event starts at 407b, with its sliding direction pointing to O4d; the second sliding input event starts at 408, with its sliding direction pointing to O4d. When the included angle (B) formed by 407b, O4d, and 408 is greater than the preset first included angle threshold, for example 120 degrees, the current input events (the synchronously performed first and second sliding input events) can be recognized as a return-to-desktop input event (corresponding to the user touching edge positions of the smart watch screen with two fingers and sliding simultaneously from opposite screen edges toward the screen center).
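As an illustrative aside, a small Kotlin sketch of how this first included-angle criterion could be computed; the 120-degree figure is the example threshold from the text, and the function names are assumptions:

```kotlin
import kotlin.math.acos
import kotlin.math.hypot

data class Point(val x: Float, val y: Float)

// Angle, in degrees, formed at the screen center o by the two start points p and q,
// computed from the dot product of the vectors o->p and o->q.
fun angleAtCenter(p: Point, q: Point, o: Point): Double {
    val ux = p.x - o.x; val uy = p.y - o.y
    val vx = q.x - o.x; val vy = q.y - o.y
    val cos = (ux * vx + uy * vy) / (hypot(ux, uy) * hypot(vx, vy))
    return Math.toDegrees(acos(cos.coerceIn(-1f, 1f)).toDouble())
}

// Example first included-angle threshold from the text: 120 degrees.
fun onOppositeEdges(p: Point, q: Point, o: Point): Boolean =
    angleAtCenter(p, q, o) > 120.0
```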
Further, in an actual application scenario, because the user cannot position touch operations perfectly, the sliding direction of a finger does not exactly match the intended direction. For example, when the user tries to slide two fingers simultaneously from two different points toward the screen center of a smart watch, the sliding direction of one or both fingers may deviate from the screen center.
For example, fig. 5a is a schematic view of a fourth embodiment of a circular screen according to the present application. As shown in fig. 5a, the screen of the electronic device is circular, and the point O5a is the center point of the screen. In S220, two input events, a first slide input event and a second slide input event, are generated in synchronization.
The first swipe input event starting point is 501, and the swipe direction is shown as a solid arrow, which does not point to point O5a (the direction to O5a is shown as a dashed arrow); the second swipe input event starting point is 502, and the swipe direction is shown as a solid arrow, which does not point to point O5a (the direction to O5a is shown as a dashed arrow).
For another example, fig. 5b is a schematic diagram of a fourth embodiment of a square screen according to the present application. As shown in fig. 5b, the screen of the electronic device is square, and the point O5b is the center point of the screen. In S220, two input events, a first slide input event and a second slide input event, are generated in synchronization.
The first swipe input event starting point is 503, and the swipe direction is shown as a solid arrow, which does not point to point O5b (the direction to O5b is shown as a dashed arrow); the second swipe input event starting point is 504, and the swipe direction is shown as a solid arrow, which does not point to point O5b (the direction to O5b is shown as a dashed arrow).
To address this, in an embodiment of the present application, when the angle between the sliding direction of the first or second sliding input event and the direction pointing from its start point to the screen center point is smaller than a preset angle threshold (the second included angle threshold, for example 30 degrees), the sliding direction of that event can be considered to point to the screen center point.
For example, fig. 5c is a schematic view of a fifth embodiment of a circular screen according to the present application. As shown in fig. 5c, the screen of the electronic device is circular, and the point O5c is the center point of the screen. In S220, two input events, a first slide input event and a second slide input event, are generated in synchronization.
The first swipe input event starts at 505, and the swipe direction is shown by the solid arrow, which is at an angle A5a with respect to the direction pointing to point O5c (dashed arrow); the second swipe input event starting point is 506, and the swipe direction is shown as a solid arrow, which is at an angle A5b from the direction pointing to point O5c (dashed arrow). When the angles A5a and A5b are both smaller than a preset second angle threshold, for example, 30 degrees, the current input events (the first sliding input event and the second sliding input event that are performed synchronously) may be recognized as a return-to-desktop input event (the user touches two fingers at an edge position of the screen of the smart watch, and slides from the opposite edge of the screen to the center of the screen at the same time).
For another example, fig. 5d is a schematic diagram of a fifth embodiment of a square screen according to the present application. As shown in fig. 5d, the electronic device screen is square, and the point O5d is the center point of the screen. In S220, two input events, a first slide input event and a second slide input event, are generated in synchronization.
The first swipe input event starting point is 507, and the swipe direction is as shown by the solid arrow, which is at an angle B5a with respect to the direction pointing to point O5d (dashed arrow); the second swipe input event starting point is 508, and the swipe direction is shown as a solid arrow at an angle B5B to the direction pointing to point O5d (dashed arrow). When the angles B5a and B5B are both smaller than the preset second angle threshold, for example, 30 degrees, the current input event (the first sliding input event and the second sliding input event that are performed synchronously) may be recognized as a return to desktop input event (the user touches two fingers at an edge position of the screen of the smart watch, and slides from the opposite edge of the screen to the center of the screen at the same time).
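The second included-angle criterion is the same vector-angle computation applied to different vectors: the actual swipe direction (start to end) versus the ideal direction (start to screen center). A hedged sketch, where the 30-degree figure is the example threshold from the text:

```kotlin
import kotlin.math.acos
import kotlin.math.hypot

data class Point(val x: Float, val y: Float)

// Angle, in degrees, between the swipe direction (start -> end) and the
// ideal direction (start -> screen center).
fun driftFromCenterLine(start: Point, end: Point, center: Point): Double {
    val ax = end.x - start.x; val ay = end.y - start.y
    val bx = center.x - start.x; val by = center.y - start.y
    val cos = (ax * bx + ay * by) / (hypot(ax, ay) * hypot(bx, by))
    return Math.toDegrees(acos(cos.coerceIn(-1f, 1f)).toDouble())
}

// Example second included-angle threshold from the text: 30 degrees.
fun slidesTowardCenter(start: Point, end: Point, center: Point): Boolean =
    driftFromCenterLine(start, end, center) < 30.0
```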
Further, in an actual application scenario, when the user's two fingers slide from two points on the screen edge toward the screen center, screen space is limited: the two fingers may collide near the center, or a finger may leave the screen too early, so the sliding of one or both fingers may stop before reaching the screen center. In some application scenarios, a sliding trajectory that is too short may even be recognized as a tap.
For example, fig. 6a is a schematic view of a sixth embodiment of a circular screen according to the present application. As shown in fig. 6a, the screen of the electronic device is circular, and the point O6a is the center point of the screen. In S220, two input events, a first slide input event and a second slide input event, are generated in synchronization.
The first sliding input event starts at 601a, with its sliding direction pointing to O6a; its sliding end point is 601b, which does not reach O6a. The second sliding input event starts at 602a, with its sliding direction pointing to O6a; its sliding end point is 602b, which does not reach O6a.
For another example, fig. 6b is a schematic diagram of a sixth embodiment of a square screen according to the present application. As shown in fig. 6b, the electronic device screen is square, and the point O6b is the center point of the screen. In S220, two input events, a first slide input event and a second slide input event, are generated in synchronization.
The first sliding input event starts at 603a, with its sliding direction pointing to O6b; its sliding end point is 603b, which does not reach O6b. The second sliding input event starts at 604a, with its sliding direction pointing to O6b; its sliding end point is 604b, which does not reach O6b.
To address this, in an embodiment of the present application, when the distance between the sliding end point of the first or second sliding input event and the screen center point is smaller than a preset distance threshold (the second distance threshold, for example 1/3 of the screen radius), the sliding end point of that event is treated as reaching the screen center point, and the event can be recognized as sliding to the screen center.
For example, fig. 6c is a schematic view of a seventh embodiment of a circular screen according to the present application. As shown in fig. 6c, the screen of the electronic device is circular, and the point O6c is the center point of the screen. In S220, two input events, a first slide input event and a second slide input event, are generated in synchronization.
The first sliding input event starts at 605a, with its sliding direction pointing to O6c; its sliding end point is 605b, and the distance between 605b and O6c is A6a. The second sliding input event starts at 606a, with its sliding direction pointing to O6c; its sliding end point is 606b, and the distance between 606b and O6c is A6b. When the distances A6a and A6b are both smaller than the preset second distance threshold, for example 1/3 of the screen radius, the current input events (the synchronously performed first and second sliding input events) can be recognized as a return-to-desktop input event (corresponding to the user touching edge positions of the smart watch screen with two fingers and sliding simultaneously from opposite screen edges toward the screen center).
For another example, fig. 6d is a schematic diagram of a seventh embodiment of a square screen according to the present application. As shown in fig. 6d, the electronic device screen is square, and the point O6d is the center point of the screen. In S220, two input events, a first slide input event and a second slide input event, are generated in synchronization.
The first sliding input event starts at 607a, with its sliding direction pointing to O6d; its sliding end point is 607b, and the distance between 607b and O6d is B6a. The second sliding input event starts at 608a, with its sliding direction pointing to O6d; its sliding end point is 608b, and the distance between 608b and O6d is B6b. When the distances B6a and B6b are both smaller than the preset second distance threshold, for example 1/3 of the side length of the screen, the current input events (the synchronously performed first and second sliding input events) can be recognized as a return-to-desktop input event (corresponding to the user touching edge positions of the smart watch screen with two fingers and sliding simultaneously from opposite screen edges toward the screen center).
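In code, this end-point tolerance is a single distance comparison. A sketch using the example thresholds from the text (screen radius / 3 for a circular screen, side length / 3 for a square one); all names are illustrative:

```kotlin
import kotlin.math.hypot

// Circular screen: the swipe is treated as reaching the center if its
// termination point lies within radius / 3 of the center (example value).
fun reachesCenterCircle(endX: Float, endY: Float,
                        cx: Float, cy: Float, radius: Float): Boolean =
    hypot(endX - cx, endY - cy) < radius / 3f

// Square screen of side s: same idea with s / 3 as the example tolerance.
fun reachesCenterSquare(endX: Float, endY: Float,
                        cx: Float, cy: Float, s: Float): Boolean =
    hypot(endX - cx, endY - cy) < s / 3f
```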
Further, in an actual application scenario, because the user cannot position touch operations perfectly, when the user intends to slide from a point on the screen edge toward the screen center, the position the finger first touches may not be exactly on the screen edge.
To address this, in an embodiment of the present application, when the distance between the sliding start point of the first or second sliding input event and the screen edge is smaller than a preset distance threshold (the first distance threshold, for example 1/3 of the screen radius), the sliding start point of that event is treated as being on the screen edge, and the event can be recognized as starting to slide from the screen edge.
For example, fig. 7a is a schematic view of an eighth embodiment of a circular screen according to the present application. As shown in fig. 7a, the screen of the electronic device is circular, and the point O8a is the center point of the screen. In S220, two input events, a first slide input event and a second slide input event, are generated in synchronization.
The first sliding input event starts at 801a, with its sliding direction pointing to O8a; the distance between 801a and the screen edge is A8a. The second sliding input event starts at 802a, with its sliding direction pointing to O8a; the distance between 802a and the screen edge is B8a. When the distances A8a and B8a are both smaller than the preset first distance threshold, for example 1/3 of the screen radius, the current input events (the synchronously performed first and second sliding input events) can be recognized as a return-to-desktop input event (corresponding to the user touching edge positions of the smart watch screen with two fingers and sliding simultaneously from opposite screen edges toward the screen center).
For another example, fig. 7b is a schematic diagram of an eighth embodiment of a square screen according to the present application. As shown in fig. 7b, the electronic device screen is square, and the point O8b is the center point of the screen. In S220, two input events, a first slide input event and a second slide input event, are generated in synchronization.
The first sliding input event starts at 801b, with its sliding direction pointing to O8b; the distance between 801b and the screen edge is A8b. The second sliding input event starts at 802b, with its sliding direction pointing to O8b; the distance between 802b and the screen edge is B8b. When the distances A8b and B8b are both smaller than the preset first distance threshold, for example 1/3 of the side length of the screen, the current input events (the synchronously performed first and second sliding input events) can be recognized as a return-to-desktop input event (corresponding to the user touching edge positions of the smart watch screen with two fingers and sliding simultaneously from opposite screen edges toward the screen center).
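The start-point tolerance mirrors the end-point check, but measures the distance to the screen edge instead of to the center. A hedged sketch under the same example thresholds:

```kotlin
import kotlin.math.hypot
import kotlin.math.min

// Circular screen: distance from the start point to the edge is the radius
// minus the distance to the center.
fun startsAtEdgeCircle(x: Float, y: Float,
                       cx: Float, cy: Float, r: Float): Boolean =
    r - hypot(x - cx, y - cy) < r / 3f

// Square screen of side s with the origin at the top-left corner:
// distance to the nearest of the four sides.
fun startsAtEdgeSquare(x: Float, y: Float, s: Float): Boolean =
    min(min(x, s - x), min(y, s - y)) < s / 3f
```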
Further, to better ensure that the sliding start point of the user's fingers is on the screen edge when performing the return-to-desktop operation (the user touches the edge of the smart watch screen with two fingers and slides simultaneously from opposite screen edges toward the screen center), the user may begin the touch on a portion of the electronic device outside the screen and slide toward the screen center, so that the first part of the screen the finger touches is the screen edge.
Further, to avoid recognition errors, in an embodiment of the present application a touch sensor is also arranged on the frame (bezel) portion surrounding the screen of the electronic device; the user must touch the frame of the smart watch screen with two fingers and then slide simultaneously from opposite sides of the frame toward the screen center to trigger the return to the system desktop.
For example, fig. 8a is a schematic view of a ninth embodiment of a circular screen according to the present application. As shown in fig. 8a, the screen 700a of the electronic device is circular, the point O7a is the center point of the screen 700a, and 710a is the frame of the screen 700a. In S220, two input events, a first sliding input event and a second sliding input event, are generated synchronously.
Points 701a and 704a are points on the frame 710a, and points 702a and 703a are points on the edge of the screen 700a. When the first sliding input event starts at 701a on the frame, its track passes through 702a on the edge of the screen 700a, and its direction points to O7a, and the second sliding input event starts at 704a on the frame, its track passes through 703a on the edge of the screen 700a, and its direction points to O7a, the current input events (the synchronously performed first and second sliding input events) can be recognized as a return-to-desktop input event. By contrast, when the first sliding input event starts at 702a on the edge of the screen 700a, or the second sliding input event starts at 703a on the edge of the screen 700a, the current input events are not recognized as a return-to-desktop input event.
As another example, fig. 8b is a schematic view of a ninth embodiment of a square screen according to the present application. As shown in fig. 8b, the screen 700b of the electronic device is square, the point O7b is the center point of the screen 700b, and 710b is the frame of the screen 700b. In S220, two input events, a first sliding input event and a second sliding input event, are generated synchronously.
Points 701b and 704b are points on the frame 710b, and points 702b and 703b are points on the edge of the screen 700b. When the first sliding input event starts at 701b on the frame, its track passes through 702b on the edge of the screen 700b, and its direction points to O7b, and the second sliding input event starts at 704b on the frame, its track passes through 703b on the edge of the screen 700b, and its direction points to O7b, the current input events (the synchronously performed first and second sliding input events) can be recognized as a return-to-desktop input event. By contrast, when the first sliding input event starts at 702b on the edge of the screen 700b, or the second sliding input event starts at 703b on the edge of the screen 700b, the current input events are not recognized as a return-to-desktop input event.
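For this frame variant, one hedged way to model the stricter condition in code: the first touch must land on the touch-sensitive frame, modeled here as an annulus around a circular screen (the geometry and names are assumptions for illustration):

```kotlin
import kotlin.math.hypot

// Assumed geometry: a circular screen of radius rScreen inside a
// touch-sensitive frame extending out to rOuter, both centered at (cx, cy).
// The gesture qualifies only if the finger first lands on the frame.
fun startsOnFrame(firstX: Float, firstY: Float,
                  cx: Float, cy: Float,
                  rScreen: Float, rOuter: Float): Boolean {
    val d = hypot(firstX - cx, firstY - cy)
    return d >= rScreen && d <= rOuter
}
```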
Further, some applications have already configured a response to the two-finger edge-to-center gesture (the user touching the edge of the smart watch screen with two fingers and sliding simultaneously from opposite screen edges toward the screen center). For example, in some electronic map applications, this gesture triggers zooming out the map view. In such a case, if the same gesture were also set to trigger the return to the system desktop, application running errors would result.
To address this, in an embodiment of the present application, after an input event is recognized as a return-to-desktop input event, it is first determined whether the current application running scene has configured a response to that event. If it has, the application's own response (for example, zooming out the map view) is executed preferentially; if not, the return to the system desktop is triggered.
Specifically, fig. 9 is a flow chart of a method according to another embodiment of the present application. The smart watch 110 shown in fig. 2a executes the flow shown in fig. 9 to implement the operation of returning to the dial face interface (system desktop) from the current application interface.
S920 to S930, refer to S220 to S230.
S940: judge whether the input event is a return-to-desktop input event. The execution module 114 determines whether the current input event is a return-to-desktop input event.
If not, S980: execute the event response operation. The execution module 114 executes the response operation corresponding to the input event recognized by the input event recognition module 113.
If so, S950: judge whether the current running scene has a response configured for the return-to-desktop input event. The execution module 114 determines whether the application currently running on the smart watch 110 has configured a response to the return-to-desktop input event.
If the currently running application has configured a response to the return-to-desktop input event, S960: execute that response in the current running scene. The execution module 114 executes the response that the currently running application has configured for the return-to-desktop input event.
If the currently running application has not configured a response to the return-to-desktop input event, S970: return to the system desktop. The execution module 114 returns the display interface of the smart watch 110 to the dial interface.
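A compact sketch of this S940 to S980 dispatch; the handler interface and names are assumptions, since the description only requires consulting the currently running application first:

```kotlin
// Optional per-application handler for the return-to-desktop gesture.
fun interface ReturnToDesktopHandler {
    fun onReturnToDesktopGesture()
}

// Dispatch logic of FIG. 9: S940 decides, then S950 consults the app,
// executing S960, S970, or S980 accordingly.
fun dispatch(isReturnGesture: Boolean,
             appHandler: ReturnToDesktopHandler?,
             returnToWatchFace: () -> Unit,
             defaultResponse: () -> Unit) {
    when {
        !isReturnGesture -> defaultResponse()                        // S980
        appHandler != null -> appHandler.onReturnToDesktopGesture()  // S960
        else -> returnToWatchFace()                                  // S970
    }
}
```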
In the description of the embodiments of the present application, for convenience of description, the device is described as being divided into various modules by functions, the division of each module is only a division of logic functions, and the functions of each module may be implemented in one or more pieces of software and/or hardware when the embodiments of the present application are implemented.
Specifically, the apparatuses proposed in the embodiments of the present application may, in actual implementation, be wholly or partially integrated into one physical entity or be physically separate. These modules may all be implemented in the form of software invoked by a processing element, or all in the form of hardware, or some in the form of software invoked by a processing element and the rest in the form of hardware. For example, the detection module may be a separately established processing element, or may be integrated into a chip of the electronic device; the other modules are implemented similarly. In addition, all or some of the modules may be integrated together or implemented independently. In implementation, each step of the above method, or each of the above modules, may be completed by an integrated logic circuit of hardware in a processor element or by instructions in the form of software.
For example, the above modules may be one or more integrated circuits configured to implement the above methods, such as: one or more Application Specific Integrated Circuits (ASICs), or one or more Digital Signal Processors (DSPs), or one or more Field Programmable Gate Arrays (FPGAs), etc. For another example, these modules may be integrated together and implemented in the form of a System-On-a-Chip (SOC).
In a practical application scenario, the method flows of the embodiments shown in this specification may be implemented by an electronic chip mounted in an electronic device. Accordingly, an embodiment of the present application provides an electronic chip, mounted in an electronic device, the electronic chip comprising:

a processor for executing computer program instructions stored in a memory, wherein the computer program instructions, when executed by the processor, trigger the electronic chip to perform the method steps of the embodiments of the present application.
An embodiment of the present application also proposes an electronic device (e.g. a smart watch) comprising a memory for storing computer program instructions and a processor for executing the program instructions, wherein the computer program instructions, when executed by the processor, trigger the electronic device to perform the method steps as described in the embodiments of the present application.
Specifically, in an embodiment of the present application, the one or more computer programs are stored in the memory, and the one or more computer programs comprise instructions that, when executed by the device, cause the device to perform the method steps described in the embodiments of the present application.
Specifically, in an embodiment of the present application, the processor of the electronic device may be a system-on-chip (SoC); the processor may include a central processing unit (CPU) and may further include other types of processors. Specifically, in an embodiment of the present application, the processor of the electronic device may also be a PWM control chip.

Specifically, in an embodiment of the present application, the processor may include, for example, a CPU, a DSP, or a microcontroller, and may further include a GPU, an embedded neural-network processing unit (NPU), and an image signal processor (ISP); the processor may further include a necessary hardware accelerator or logic processing hardware circuit, such as an ASIC, or one or more integrated circuits for controlling the execution of the programs of the present application. Further, the processor may have the function of operating one or more software programs, which may be stored in a storage medium.

Specifically, in an embodiment of the present application, the memory of the electronic device may be a read-only memory (ROM) or another type of static storage device capable of storing static information and instructions, a random-access memory (RAM) or another type of dynamic storage device capable of storing information and instructions, an electrically erasable programmable read-only memory (EEPROM), a compact disc read-only memory (CD-ROM) or other optical disc storage (including a compact disc, a laser disc, an optical disc, a digital versatile disc, a Blu-ray disc, and the like), a magnetic disk storage medium or another magnetic storage device, or any other computer-readable medium that can be used to carry or store desired program code in the form of instructions or data structures and can be accessed by a computer.

Specifically, in an embodiment of the present application, the processor and the memory may be combined into one processing device, though they are more commonly components independent of each other; the processor executes the program code stored in the memory to implement the methods described in the embodiments of the present application. In specific implementation, the memory may be integrated within the processor or may be separate from the processor.
Further, the apparatuses, devices, and modules described in the embodiments of the present application may be implemented by a computer chip or an entity, or by a product with certain functions.
As will be appreciated by one skilled in the art, the embodiments of the present application may be provided as a method, an apparatus, or a computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media having computer-usable program code embodied therein.
In the several embodiments provided in the present application, any function, if implemented in the form of a software functional unit and sold or used as an independent product, may be stored in a computer-readable storage medium. Based on such an understanding, the part of the technical solution of the present application that substantially contributes to the prior art may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the methods according to the embodiments of the present application.
Specifically, an embodiment of the present application further provides a computer-readable storage medium, in which a computer program is stored, and when the computer program runs on a computer, the computer is caused to execute the method provided by the embodiment of the present application.
An embodiment of the present application further provides a computer program product comprising a computer program that, when run on a computer, causes the computer to execute the method provided by the embodiments of the present application.
The embodiments herein are described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (devices), and computer program products according to embodiments herein. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
In the embodiments of the present application, "at least one" means one or more, "and" a plurality "means two or more. "and/or" describes the association relationship of the associated objects, and means that there may be three relationships, for example, a and/or B, and may mean that a exists alone, a and B exist simultaneously, and B exists alone. Wherein A and B can be singular or plural. The character "/" generally indicates that the former and latter associated objects are in an "or" relationship. "at least one of the following" and similar expressions refer to any combination of these items, including any combination of singular or plural items. For example, at least one of a, b, and c may represent: a, b, c, a and b, a and c, b and c or a and b and c, wherein a, b and c can be single or multiple.
In the embodiments of the present application, the terms "comprise", "include", or any other variation thereof are intended to cover a non-exclusive inclusion, so that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such a process, method, article, or apparatus. Without further limitation, an element preceded by the phrase "comprising a ..." does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
The application may be described in the general context of computer-executable instructions, such as program modules, being executed by a computer. Generally, program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types. The application may also be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a communications network. In a distributed computing environment, program modules may be located in both local and remote computer storage media including memory storage devices.
The embodiments in the present application are described in a progressive manner, and the same and similar parts among the embodiments can be referred to each other, and each embodiment focuses on the differences from the other embodiments. In particular, as for the apparatus embodiment, since it is substantially similar to the method embodiment, the description is relatively simple, and for the relevant points, reference may be made to the partial description of the method embodiment.
Those of ordinary skill in the art will appreciate that the various elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, computer software, or combinations of electronic hardware and computer software. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the systems, apparatuses, and units described above may refer to the corresponding processes in the foregoing method embodiments, and are not described here again.
The above description covers only specific embodiments of the present application; any changes or substitutions that a person skilled in the art could readily conceive of within the technical scope disclosed in the present application shall be covered by the protection scope of the present application. The protection scope of the present application shall be subject to the protection scope of the claims.

Claims (12)

1. An electronic device control method, applied to an electronic device, the method comprising:
acquiring an input event;
confirming whether the input event is a return-to-desktop input event, wherein the return-to-desktop input event comprises a first sliding input event and a second sliding input event performed synchronously, the first sliding input event and the second sliding input event each being a sliding input event whose starting point is located at the edge of the screen of the electronic device and whose sliding direction points to the center of the screen of the electronic device;
when the input event is a return-to-desktop input event, returning directly from the current running interface of the electronic device to a system desktop; or, when the input event is a return-to-desktop input event, confirming whether to trigger a direct return from the current running interface of the electronic device to the system desktop, and, upon confirming that the direct return is to be triggered, returning directly from the current running interface of the electronic device to the system desktop.
2. The method of claim 1, wherein the starting points of the first sliding input event and the second sliding input event are located at edges of opposite sides of the screen of the electronic device, respectively.

3. The method of claim 1, wherein the angle formed by the starting point of the first sliding input event, the center of the screen of the electronic device, and the starting point of the second sliding input event is greater than a first angle threshold.
4. The method of claim 1, wherein an included angle between the direction pointing from the starting point of the first sliding input event to the center of the screen of the electronic device and the sliding direction of the first sliding input event is smaller than a second angle threshold;

and/or,

an included angle between the direction pointing from the starting point of the second sliding input event to the center of the screen of the electronic device and the sliding direction of the second sliding input event is smaller than the second angle threshold.
5. The method of claim 1, wherein the distance between the starting point of the first sliding input event and the edge of the screen of the electronic device is smaller than a first distance threshold;

and/or,

the distance between the starting point of the second sliding input event and the edge of the screen of the electronic device is smaller than the first distance threshold.
6. The method of claim 1, wherein the distance between the termination point of the first sliding input event and the center point of the screen of the electronic device is smaller than a second distance threshold;

and/or,

the distance between the termination point of the second sliding input event and the center point of the screen of the electronic device is smaller than the second distance threshold.
7. The method of claim 1, wherein the starting point of the first sliding input event is located on the screen border of the electronic device;

and/or,

the starting point of the second sliding input event is located on the screen border of the electronic device.
8. The method according to any one of claims 1-7, wherein, when the input event is a return-to-desktop input event, confirming whether to trigger a direct return from the current running interface of the electronic device to the system desktop comprises:

confirming whether the application currently running on the electronic device is configured with a response to the return-to-desktop input event; and

when the application currently running on the electronic device is not configured with a response to the return-to-desktop input event, confirming that the direct return from the current running interface of the electronic device to the system desktop is to be triggered.
9. An electronic device control apparatus, comprising:
an input event acquisition module for acquiring an input event;
an input event identification module, configured to confirm whether the input event is a return-to-desktop input event, wherein the return-to-desktop input event comprises a first sliding input event and a second sliding input event performed synchronously, the first sliding input event and the second sliding input event each being a sliding input event whose starting point is located at the edge of the screen of the electronic device and whose sliding direction points to the center of the screen of the electronic device; and

an execution module, configured to: when the input event is a return-to-desktop input event, return directly from the current running interface of the electronic device to a system desktop; or, when the input event is a return-to-desktop input event, confirm whether to trigger a direct return from the current running interface of the electronic device to the system desktop and, upon confirming that the direct return is to be triggered, return directly from the current running interface of the electronic device to the system desktop.
10. An electronic chip, comprising:
a processor for executing computer program instructions stored on a memory, wherein the computer program instructions, when executed by the processor, trigger the electronic chip to perform the method steps of any of claims 1-8.
11. An electronic device, characterized in that the electronic device comprises a memory for storing computer program instructions and a processor for executing program instructions, wherein the computer program instructions, when executed by the processor, trigger the electronic device to perform the method steps of any of claims 1-8.
12. A computer-readable storage medium, in which a computer program is stored which, when run on a computer, causes the computer to carry out the method according to any one of claims 1-8.
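By way of illustration only (the following sketch is not part of the claims), the geometric conditions of claims 3-6 can be consolidated into a single check; all threshold values below are hypothetical, and the screen is assumed circular with a given center and radius.

    import math

    FIRST_ANGLE_THRESHOLD = 120.0     # claim 3 (degrees); hypothetical value
    SECOND_ANGLE_THRESHOLD = 30.0     # claim 4 (degrees); hypothetical value
    FIRST_DISTANCE_THRESHOLD = 10.0   # claim 5 (pixels from the screen edge)
    SECOND_DISTANCE_THRESHOLD = 40.0  # claim 6 (pixels from the screen center)

    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])

    def angle_at(vertex, p, q):
        # Angle p-vertex-q in degrees.
        v1 = (p[0] - vertex[0], p[1] - vertex[1])
        v2 = (q[0] - vertex[0], q[1] - vertex[1])
        norm = math.hypot(*v1) * math.hypot(*v2)
        if norm == 0:
            return 0.0
        cos_a = (v1[0] * v2[0] + v1[1] * v2[1]) / norm
        return math.degrees(math.acos(max(-1.0, min(1.0, cos_a))))

    def satisfies_claims_3_to_6(start1, end1, start2, end2, center, radius):
        return (
            # Claim 3: the two starting points subtend a wide angle at the center.
            angle_at(center, start1, start2) > FIRST_ANGLE_THRESHOLD
            # Claim 4: each sliding direction stays close to the start->center direction.
            and angle_at(start1, end1, center) < SECOND_ANGLE_THRESHOLD
            and angle_at(start2, end2, center) < SECOND_ANGLE_THRESHOLD
            # Claim 5: each starting point lies near the screen edge.
            and radius - dist(start1, center) < FIRST_DISTANCE_THRESHOLD
            and radius - dist(start2, center) < FIRST_DISTANCE_THRESHOLD
            # Claim 6: each termination point lies near the screen center.
            and dist(end1, center) < SECOND_DISTANCE_THRESHOLD
            and dist(end2, center) < SECOND_DISTANCE_THRESHOLD
        )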
CN202111058458.9A 2021-09-10 2021-09-10 Electronic equipment control method and device, chip and electronic equipment Pending CN113504871A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111058458.9A CN113504871A (en) 2021-09-10 2021-09-10 Electronic equipment control method and device, chip and electronic equipment


Publications (1)

Publication Number Publication Date
CN113504871A true CN113504871A (en) 2021-10-15

Family

ID=78016978

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111058458.9A Pending CN113504871A (en) 2021-09-10 2021-09-10 Electronic equipment control method and device, chip and electronic equipment

Country Status (1)

Country Link
CN (1) CN113504871A (en)


Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150363082A1 (en) * 2014-06-17 2015-12-17 Vmware, Inc. User interface control based on pinch gestures
CN105589698A (en) * 2014-10-20 2016-05-18 阿里巴巴集团控股有限公司 Method and system for rapidly starting system functions
CN104915142A (en) * 2015-05-29 2015-09-16 歌尔声学股份有限公司 Method for realizing touch screen dialing keyboard and intelligent watch
CN110109603A (en) * 2019-04-29 2019-08-09 努比亚技术有限公司 A kind of page operation method, wearable device and computer readable storage medium
CN111010512A (en) * 2019-12-13 2020-04-14 维沃移动通信有限公司 Display control method and electronic equipment

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
Shen Renyuan: "Practical Smartphone Tutorial (2nd Edition)", 31 August 2018 *
Zhai Changlin et al.: "Practical Computer Operation Guide", 30 April 2001 *
Gu Tao et al.: "3DS MAX & VRay Architectural Representation Techniques", 30 September 2009 *

Similar Documents

Publication Publication Date Title
US9069386B2 (en) Gesture recognition device, method, program, and computer-readable medium upon which program is stored
JP6036807B2 (en) Information processing apparatus, information processing method, and program
KR102135160B1 (en) Information processing apparatus, information processing method and storage medium
US20120044183A1 (en) Multimodal aggregating unit
US20140267084A1 (en) Enhancing touch inputs with gestures
JP2016520946A (en) Human versus computer natural 3D hand gesture based navigation method
US20110199387A1 (en) Activating Features on an Imaging Device Based on Manipulations
JP2003131785A (en) Interface device, operation control method and program product
KR20180102211A (en) Target disambiguation and correction
US20140218315A1 (en) Gesture input distinguishing method and apparatus in touch input device
TWI714513B (en) Book display program product and book display device
US9519355B2 (en) Mobile device event control with digital images
JP2013533541A (en) Select character
CN112445341B (en) Keyboard perspective method and device of virtual reality equipment and virtual reality equipment
CN111831204B (en) Device control method, device, storage medium and electronic device
WO2022267760A1 (en) Key function execution method, apparatus and device, and storage medium
CN105739817B (en) A kind of method, device and mobile terminal of icon hiding
CN107450717B (en) Information processing method and wearable device
KR20180123574A (en) Method, apparatus and storage medium for control command identification
CN113504871A (en) Electronic equipment control method and device, chip and electronic equipment
JP6910376B2 (en) Application program data processing method and device
KR102294717B1 (en) A system and method for providing an augmented reality image for providing a transformed object
KR101348763B1 (en) Apparatus and method for controlling interface using hand gesture and computer-readable recording medium with program therefor
KR20110049162A (en) Apparatus and method for virtual input/output in portable image processing device
KR101096572B1 (en) Method and device for inputting on touch screen, and portable device comprising the same

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (application publication date: 20211015)