CN107479700B - Black screen gesture control method and device, storage medium and mobile terminal - Google Patents


Info

Publication number
CN107479700B
Authority
CN
China
Prior art keywords
gesture
black screen
data
application layer
screen gesture
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CN201710632956.7A
Other languages
Chinese (zh)
Other versions
CN107479700A (en)
Inventor
韩通
郭明强
石仁栋
张强
汪昊
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd filed Critical Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN201710632956.7A
Publication of CN107479700A
Application granted
Publication of CN107479700B

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00 Arrangements for program control, e.g. control units
    • G06F9/06 Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44 Arrangements for executing specific programs
    • G06F9/445 Program loading or initiating
    • G06F9/44505 Configuring for program initiating, e.g. using registry, configuration files

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The embodiment of the invention discloses a black screen gesture control method and device, a storage medium and a mobile terminal. After the system is awakened based on a black screen gesture, when the driving layer reports a black screen gesture event, it executes in parallel an operation of reading the gesture data of the black screen gesture and stores the gesture data in a preset node of the driving layer; when the application layer receives the black screen gesture event, it detects whether the gesture data in the preset node is ready; when the data is ready, the application layer reads the gesture data from the preset node; and when the gesture data is successfully read and valid, the application layer executes the black screen gesture function corresponding to the gesture data. By changing the control logic of the black screen gesture function, the technical scheme can effectively improve the response speed of the black screen gesture and shorten the time from detecting the black screen gesture to opening the application program corresponding to the black screen gesture.

Description

Black screen gesture control method and device, storage medium and mobile terminal
Technical Field
The embodiment of the invention relates to a mobile terminal technology, in particular to a black screen gesture control method, a device, a storage medium and a mobile terminal.
Background
At present, mobile terminals, such as smart phones, handheld computers, tablet computers, handheld game machines, etc., are generally designed to have a structure with a touch display screen to provide a touch input mode, so that the operation of a user is more convenient.
The black screen gesture is widely applied in existing smart phones. When the black screen gesture function is enabled, a gesture operation acting on the touch display screen can be detected while the smart phone is in the standby, screen-off state, so that a corresponding function or piece of software in the smart phone is triggered. The black screen gesture is widely used because it reduces the steps a user needs to perform to open an internal function or application and offers an eye-catching use effect. However, the current black screen gesture processing flow has defects that make the response of the mobile terminal to the black screen gesture slow, so that the functional response of the black screen gesture is not sensitive enough.
Disclosure of Invention
The embodiment of the invention provides a method and a device for controlling a black screen gesture, a storage medium and a mobile terminal, which can improve the response speed of the black screen gesture.
In a first aspect, an embodiment of the present invention provides a method for controlling a black screen gesture, including:
after a system is awakened based on a black screen gesture, when a driving layer reports a black screen gesture event, an operation of reading gesture data of the black screen gesture is executed in parallel, and the gesture data is stored in a preset node of the driving layer;
when the application layer receives the black screen gesture event, detecting whether the gesture data in the preset node is ready;
when the data is ready, reading, by the application layer, the gesture data from the preset node, wherein the gesture data comprises a gesture type;
and when the gesture data is successfully read and valid, executing, by the application layer, a black screen gesture function corresponding to the gesture data.
In a second aspect, an embodiment of the present invention further provides a device for controlling a black screen gesture, where the device includes:
the first data reading module is used for parallelly executing the operation of reading the gesture data of the black screen gesture when the driving layer reports the black screen gesture event after the system is awakened based on the black screen gesture, and storing the gesture data in a preset node of the driving layer;
the data detection module is used for detecting whether the gesture data in the preset node is ready when the application layer receives the black screen gesture event;
the second data reading module is used for reading, by the application layer, the gesture data from the preset node when the data is ready, wherein the gesture data comprises a gesture type;
and the function execution module is used for executing, by the application layer, the black screen gesture function corresponding to the gesture data when the gesture data is successfully read and valid.
In a third aspect, an embodiment of the present invention further provides a computer-readable storage medium, on which a computer program is stored, where the computer program, when executed by a processor, implements the black screen gesture control method according to the embodiment of the present invention.
In a fourth aspect, an embodiment of the present invention further provides a mobile terminal, including a memory, a processor, and a computer program stored on the memory and executable on the processor, where the processor executes the computer program to implement the black screen gesture control method according to the embodiment of the present invention.
According to the black screen gesture control scheme of the mobile terminal, the operations of reporting the black screen gesture event and reading the gesture data are executed in parallel; if the application layer receives the black screen gesture event while the driving layer is still reading the gesture data, the application layer detects, according to a set period, whether the gesture data is ready; when the data is ready, the application layer reads the gesture data from the preset node, wherein the gesture data comprises a gesture type; and when the reading succeeds, the application layer judges whether the gesture data is valid and, if so, executes the black screen gesture function corresponding to the gesture data. By changing the control logic of the black screen gesture function, the technical scheme can effectively improve the response speed of the black screen gesture and shorten the time from detecting the black screen gesture to opening the application program corresponding to the black screen gesture.
Drawings
FIG. 1 is a flowchart of a method for controlling a black screen gesture according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of an android system framework provided by an embodiment of the present invention;
FIG. 3a is a flowchart of another method for controlling a black screen gesture according to an embodiment of the present invention;
FIG. 3b is a schematic diagram illustrating a black screen gesture track according to an embodiment of the present invention;
FIG. 4a is a flowchart of another method for controlling a black screen gesture according to an embodiment of the present invention;
FIG. 4b is a schematic diagram illustrating another example of a black screen gesture track according to an embodiment of the present invention;
FIG. 5 is a schematic structural diagram of a black screen gesture control apparatus according to an embodiment of the present invention;
FIG. 6 is a schematic structural diagram of a mobile terminal according to an embodiment of the present invention.
Detailed Description
The present invention will be described in further detail with reference to the accompanying drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the invention and are not limiting of the invention. It should be further noted that, for the convenience of description, only some of the structures related to the present invention are shown in the drawings, not all of the structures.
Before discussing exemplary embodiments in more detail, it should be noted that some exemplary embodiments are described as processes or methods depicted as flowcharts. Although a flowchart may describe the steps as a sequential process, many of the steps can be performed in parallel, concurrently or simultaneously. In addition, the order of the steps may be rearranged. The process may be terminated when its operations are completed, but may have additional steps not included in the figure. The processes may correspond to methods, functions, procedures, subroutines, and the like.
The black screen gesture function means that when a mobile terminal (such as a smart phone) is in the screen-off dormant state, the touch display screen operates in a low-power-consumption state to detect a black screen gesture acting on it, and, according to the black screen gesture, a certain function of the smart phone is awakened or a preset function of the application program corresponding to the type of the black screen gesture is started. To facilitate understanding of the black screen gesture function, the process from detecting the black screen gesture in the screen-off state to the application layer opening the application program corresponding to the black screen gesture is described below. The process includes: storing the gesture data corresponding to the black screen gesture into a preset node of the driving layer, wherein the gesture data comprises gesture coordinates and a gesture type; executing, by the driving layer, a validity judgment on the black screen gesture data; if the data is valid, dispatching, by the framework layer, the black screen gesture event; after the application layer receives the black screen gesture event, reading, by the application layer, the gesture data from the preset node in the driving layer, calculating an animation track of the black screen gesture according to the gesture coordinates and the gesture type, and sending the animation track data to a frame buffer (FrameBuffer) so that the animation track is refreshed to the touch display screen at a set screen refresh rate and displayed; and then executing, by the application layer, the operation of opening the application program corresponding to the black screen gesture.
As can be seen from the above black screen gesture execution process, before the touch display screen displays the black screen gesture trajectory and the application is opened, the following steps are also executed: the driving layer reads the gesture coordinates from the touch chip through a set reading function and stores them in a preset node of the driving layer; the driving layer determines, according to the gesture coordinates, the gesture type of the black screen gesture input by the user, and the gesture type is also stored in the preset node as gesture data; the driving layer reports the black screen gesture event; and after the application layer receives the black screen gesture event, it reads the gesture data from the preset node and determines the gesture track according to the gesture data. Since these steps are performed while the touch display screen is off, the mobile terminal appears unresponsive from the user's perspective; that is, the delay from inputting the black screen gesture to opening the application program corresponding to the black screen gesture is long, so the user intuitively feels that the black screen gesture function is not responsive enough. The black screen gesture control scheme provided by the embodiment of the invention can well solve the problem of the long start-up delay of the application program corresponding to the black screen gesture.
Fig. 1 is a flowchart of a black screen gesture control method according to an embodiment of the present invention, where the method is performed by a black screen gesture control apparatus, where the apparatus may be implemented by software and/or hardware, and may be generally integrated in a mobile terminal. As shown in fig. 1, the method includes:
step 110, after waking up the system based on the black screen gesture, when the driving layer reports the black screen gesture event, performing an operation of reading gesture data of the black screen gesture in parallel, and storing the gesture data in a preset node of the driving layer.
The black screen gesture event can be an event which is negotiated in advance by the driving layer and the application layer and is used for representing that the black screen gesture input exists.
The black screen gesture may be a touch gesture input by a user on the touch display screen of the mobile terminal in the screen-off state after the black screen gesture function has been turned on. It is to be understood that the black screen gesture is not limited to a touch gesture input on the touch display screen, and may also be an operation detected by a sensor of the mobile terminal, and the like, for example, a gesture of shaking the smartphone from side to side, a gesture of swiping across the touch display screen of the smartphone, or a gesture of pressing the bezel of the smartphone.
The gesture data comprises the gesture coordinates corresponding to the black screen gesture, a gesture type, a preset end bit, and the like.
The preset node may be a file node, for example a virtual file node under the proc directory.
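The following minimal sketch illustrates how an application-layer component might read such a virtual file node. The node path and the plain-text data format are illustrative assumptions for this sketch, not values specified by this embodiment.

```java
import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;

/** Minimal sketch of an application-layer helper that reads the raw gesture
 *  data written by the driving layer into a virtual file node. The node path
 *  and the text format (one "x,y" coordinate per line, terminated by "#")
 *  are illustrative assumptions, not the format used by this embodiment. */
public final class GestureNodeReader {
    // Hypothetical node path; the real node name is device specific.
    private static final String NODE_PATH = "/proc/touchpanel/blackscreen_gesture";

    public static String readRawGestureData() throws IOException {
        StringBuilder data = new StringBuilder();
        try (BufferedReader reader = new BufferedReader(new FileReader(NODE_PATH))) {
            String line;
            while ((line = reader.readLine()) != null) {
                data.append(line).append('\n');
            }
        }
        return data.toString();
    }
}
```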
Fig. 2 is a schematic diagram of an android system framework provided in an embodiment of the present invention. Taking the mobile terminal with the operating system shown in fig. 2 as an Android system as an example, an execution flow of the black screen gesture function provided by the embodiment of the present invention is described. As shown in fig. 2, the android system framework includes, from bottom to top, a kernel layer 210, a core class library layer 220, a framework layer 230, and an application layer 240. The kernel layer 210 provides core system services, including security, memory management, process management, network protocol stack, and hardware drivers. The hardware driver in the kernel layer 210 is referred to as a driver layer 211, and the driver layer 211 includes a touch display screen driver, a camera driver, and the like. The core class library layer 220 includes an Android Runtime environment (Android Runtime) and class Libraries (Libraries). Among them, Android Runtime provides most of the functions available in Java programming language Core class Libraries, including Core Libraries (Core Libraries) and Dalvik virtual machines (Dalvik VM). Each android application is an instance in a Dalvik virtual machine, running in their own process. The class library is used by each component of the android system, and comprises the following functions: media library (Media Framework), interface Manager (Surface Manager), SQLite (relational database engine), FreeType (bitmap and vector font rendering), etc., each of which is exposed to the developer for use by the Framework layer 230 of the android system. The framework layer 230 provides a series of class libraries required for developing android applications, so that developers can develop the applications quickly, reuse components conveniently, and also realize personalized extension through inheritance, and the provided services include component management services, window management services, system data source components, space frameworks, resource management services, installation package management services and the like. The application layer 240 includes various applications that interact directly with the user, or service programs written in Java language and running in the background, including desktop applications, contact applications, call applications, camera applications, picture browsers, games, maps, web browsers, and other applications developed by developers.
Illustratively, after the black screen gesture function is started, when the touch chip detects a black screen gesture it generates a wake-up signal and sends the wake-up signal to the kernel layer. The wake-up signal triggers the kernel layer to execute a system wake-up operation. After the system is awakened, the kernel layer calls the interrupt processing function of the driving layer, the driving layer reads the gesture data in the touch chip through the interrupt processing function, and the read gesture data is stored in a preset node of the driving layer. The touch chip outputs a touch sensing control signal to the touch display screen so as to detect touch operations, identifies the gesture coordinates of the black screen gesture acting on the touch display screen, and stores the gesture coordinates as gesture data in a register of the touch chip. The preset node may be a file node, for example a virtual file node under the proc directory. After the data reading is completed, the driving layer judges the validity of the gesture data; there are many ways to judge validity, which this embodiment does not limit. For example, the driving layer determines a gesture type according to the gesture coordinates included in the gesture data, and stores the determined gesture type in the preset node as gesture data. If the gesture type is not one of the preset black screen gestures, the gesture data is judged to be invalid; otherwise, the gesture data is judged to be valid. When the data is valid, the driving layer reports a black screen gesture event. The black screen gesture event is transmitted to the framework layer through the kernel layer and the core class library layer, and is distributed through the framework layer to the application layer. When the application layer receives the black screen gesture event, it detects whether the gesture data in the preset node of the driving layer is ready. If so, it reads the gesture data from the preset node, calculates the black screen gesture track according to the gesture coordinates contained in the gesture data, and draws the black screen gesture track on the touch display screen for display. Then, based on the gesture type in the read gesture data, the application layer opens the application program corresponding to that gesture type. The gesture type may be a gesture preset in the mobile terminal for implementing a certain function, or may be a user-defined gesture. For example, the gesture type may be O, representing turning on the camera; as another example, the gesture type may be V, representing turning on a flashlight, and so on.
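As an illustration of the last step, the sketch below maps a recognized gesture type to an action, using the O and V examples mentioned above. The hard-coded switch and the camera intent are assumptions made only for illustration; a real device would typically consult the user's configurable black screen gesture settings instead.

```java
import android.content.Context;
import android.content.Intent;
import android.provider.MediaStore;

/** Sketch of dispatching a recognized gesture type to a black screen gesture
 *  function ("O" opens the camera, "V" toggles the flashlight), assuming the
 *  example mapping given in the text rather than a device's actual settings. */
public final class GestureDispatcher {
    public static void execute(Context context, String gestureType) {
        switch (gestureType) {
            case "O":
                // Open the default still-image camera application.
                Intent camera = new Intent(MediaStore.INTENT_ACTION_STILL_IMAGE_CAMERA);
                camera.addFlags(Intent.FLAG_ACTIVITY_NEW_TASK);
                context.startActivity(camera);
                break;
            case "V":
                // Turning on the flashlight would go through CameraManager.setTorchMode().
                break;
            default:
                // Unknown gesture types are ignored; such gesture data is treated as invalid.
                break;
        }
    }
}
```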
It is to be understood that the execution flow of the black screen gesture function is not limited to the manner illustrated in the present embodiment. For example, a black screen gesture event can be reported as soon as the system is awakened; the kernel layer calls the interrupt processing function of the driving layer, and the driving layer reads the gesture data in the touch chip through the interrupt processing function and stores the gesture data in a preset node of the driving layer, so that while the black screen gesture event is being reported, the driving layer executes in parallel the operations of reading the gesture data from the preset node and determining the gesture type according to the gesture data. Optionally, the driving layer reports a black screen gesture event, and the black screen gesture event is distributed to the application layer through the framework layer; while the black screen gesture event is being reported, the kernel layer calls the interrupt processing function of the driving layer, the driving layer in parallel reads the gesture data corresponding to the black screen gesture from the touch chip through the interrupt processing function, and the gesture data is stored in a preset node of the driving layer. The driving layer then detects whether the gesture data in the preset node includes a preset end bit, and if so, determines the gesture type corresponding to the black screen gesture according to the gesture coordinates.
Step 120, when the application layer receives the black screen gesture event, it detects whether the gesture data in the preset node is ready.
The state of the gesture data in the preset node is either ready or being prepared. Whether the data in the preset node is ready or still being prepared can be determined according to whether the gesture data in the preset node includes a preset end bit. For example, the preset end bit corresponds to the character "#". When a user inputs a black screen gesture, the touch chip stores the gesture data corresponding to the detected black screen gesture into a preset register. After detecting that the input of the black screen gesture is complete (e.g., no further input of the black screen gesture by the user is detected within a preset time period), the touch chip appends a "#" to the end of the gesture data stored in the register. The driving layer reads the gesture data in the preset node according to a set period, and if the character corresponding to the preset end bit, namely "#", is detected, it judges that the data state of the preset node is ready. At this time, the value of a set identification bit in the driving layer is changed to the value representing that the gesture data is ready. The set identification bit is used in the driving layer to identify the state of the gesture data in the preset node. The value of the set identification bit is determined by whether the driving layer has read the preset end bit: if the driving layer has read the preset end bit, the value of the set identification bit is updated to the value corresponding to the ready state; if the driving layer has not read the preset end bit, the value of the set identification bit is kept at the value corresponding to the being-prepared state.
It is understood that the end bit can be of many kinds, and is not limited to the "#" listed in this embodiment.
After receiving the black screen gesture event, the application layer periodically reads the value of the set identification bit configured in the driving layer. Through the value of the set identification bit, the application layer can know whether the gesture data in the preset node is ready or still being prepared. When the gesture data in the preset node is not ready, a timer with a set time length is started; after waiting for the set time length, the application layer reads the value of the identification bit in the driving layer again, so as to judge from that value whether the gesture data in the preset node is ready.
It can be understood that the application layer periodically reads the set identification bit while the kernel layer calls the interrupt processing function of the driving layer and the driving layer reads the gesture data corresponding to the black screen gesture from the touch chip through the interrupt processing function and stores it into the preset node of the driving layer.
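The readiness check described above can be pictured with the following sketch, which treats the gesture data as ready only once the preset end bit (here the character "#") appears in the node content. Checking the end character directly is an assumption for this sketch; a device may instead expose the identification bit through a separate interface.

```java
/** Sketch of the application-layer readiness check: the gesture data in the
 *  preset node counts as ready only when the preset end bit (assumed here to
 *  be the character '#') has been written by the driving layer. */
public final class GestureDataState {
    private static final char END_BIT = '#';

    public static boolean isReady(String rawNodeContent) {
        if (rawNodeContent == null || rawNodeContent.isEmpty()) {
            return false;                                   // still being prepared
        }
        return rawNodeContent.trim().indexOf(END_BIT) >= 0; // ready
    }
}
```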
Step 130, when the data is ready, the application layer reads the gesture data from the preset node.
After the gesture data in the preset node is ready, the application layer extracts the gesture data from the preset node. For example, the application layer calls a set function to read the gesture coordinates and the gesture type from the virtual file node under the proc directory.
Step 140, when the gesture data is successfully read and valid, the application layer executes the black screen gesture function corresponding to the gesture data.
If the application layer reads the preset end bit of the gesture data, the gesture data is considered to be successfully read. If the number of gesture coordinates read by the application layer exceeds a set number threshold and the black screen gesture corresponding to the gesture type is an enabled black screen gesture, the black screen gesture data is determined to be valid. Illustratively, after reading the gesture data from the preset node, the application layer compares the gesture data with the preset end bit; if the gesture data includes the preset end bit, it is determined that the gesture data has been successfully read, and reading data from the preset node is stopped. The application layer then checks, according to the gesture type included in the gesture data, whether the black screen gesture corresponding to that gesture type is enabled; if so, it counts the gesture coordinates in the read gesture data and judges whether their number exceeds the set number threshold. If so, the gesture data is judged to be valid. The minimum number of gesture coordinates required to draw the black screen gesture trajectory may be set in advance for each black screen gesture.
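A minimal sketch of this validity judgment is given below. The coordinate threshold and the enabled-gesture set are illustrative assumptions; in practice they would come from the black screen gesture settings.

```java
import java.util.List;
import java.util.Set;

/** Sketch of the validity judgment: the gesture type must correspond to an
 *  enabled black screen gesture and enough coordinates must have been read
 *  to draw the gesture track. Threshold value is an assumed example. */
public final class GestureValidator {
    private static final int MIN_COORDINATE_COUNT = 20;   // assumed threshold

    public static boolean isValid(List<int[]> coordinates,
                                  String gestureType,
                                  Set<String> enabledGestures) {
        // The black screen gesture corresponding to the type must be enabled.
        if (!enabledGestures.contains(gestureType)) {
            return false;
        }
        // Enough coordinates must have been read to draw the gesture track.
        return coordinates.size() > MIN_COORDINATE_COUNT;
    }
}
```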
When the gesture data is successfully read and valid, the application layer executes the black screen gesture function corresponding to the gesture data. The black screen gesture function includes opening a certain application program, waking up the mobile terminal, switching songs, and the like. Illustratively, when the gesture data is successfully read and valid, the gesture track corresponding to the black screen gesture is determined, and the gesture track is displayed on the touch display screen. The operation to be executed is determined according to the correspondence between the gesture type and the black screen gesture function. For example, if it is determined according to the gesture type that the operation to be performed is to turn on the camera, then after the gesture trajectory has been displayed on the touch display screen for a set time, the camera is turned on and the display interface is switched to the camera interface.
According to the technical scheme of this embodiment, the operations of reporting the black screen gesture event and reading the gesture data are executed in parallel; if the application layer receives the black screen gesture event while the driving layer is still reading the gesture data, the application layer detects, according to a set period, whether the gesture data is ready; when the data is ready, the application layer reads the gesture data from the preset node, judges whether the gesture data is valid once the reading succeeds, and, if valid, executes the black screen gesture function corresponding to the gesture data. By changing the control logic of the black screen gesture function, the technical scheme can effectively improve the response speed of the black screen gesture and shorten the time from detecting the black screen gesture to opening the application program corresponding to the black screen gesture.
Fig. 3a is a flowchart of another method for controlling a black screen gesture according to an embodiment of the present invention. As shown in fig. 3a, the method comprises:
step 301, after waking up the system based on the black screen gesture, when the driving layer reports the black screen gesture event, performing an operation of reading gesture data of the black screen gesture in parallel, and storing the gesture data in a preset node of the driving layer.
Step 302, when receiving the black screen gesture event, the application layer queries an identification bit in the driver layer, which identifies the data state of the preset node.
If the application layer receives the black screen gesture event while the driving layer is reading the gesture data of the black screen gesture and storing it in the preset node of the driving layer, it executes in parallel the operation of querying whether the gesture data in the preset node includes the preset end bit. In other words, while the gesture data of the black screen gesture is being written into the preset node, the application layer detects in real time whether the gesture data in the preset node includes the preset end bit, which effectively avoids the situation in which the preset end bit has to be searched for in a large amount of data, prolonging the response time of the black screen gesture.
Step 303, judging, according to the value of the identification bit, whether the gesture data is ready; if so, executing step 306; otherwise, executing step 304.
And the application layer reads the value of the identification bit in the driving layer according to a set period and matches the value with a numerical value representing the completion of gesture data preparation in the preset node. If the value is equal to the value representing the completion of the preparation of the gesture data, determining that the preparation of the gesture data in the preset node is completed, and executing step 306; otherwise, it is determined that the gesture data in the preset node is not ready to be completed, and step 304 is executed.
Step 304, timing is performed through a timer.
When the gesture data in the preset node is not ready, a timer with a set time length is started. The set time length is equal to the first period at which the application layer reads the identification bit in the driving layer.
Step 305, determining whether the value of the timer reaches a set time length, if so, executing step 302, otherwise, executing step 304.
Reading the timer according to a second period, comparing the reading with a set time length, and if the reading is greater than or equal to the set time length, executing step 302; otherwise, returning to the step 304; wherein the second period is less than the first period.
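The two-period polling of steps 302 to 305 can be sketched as follows on the application layer. The period values and the use of an Android Handler are illustrative assumptions rather than the implementation required by this embodiment.

```java
import android.os.Handler;
import android.os.Looper;
import android.os.SystemClock;

/** Sketch of the two-period polling in steps 302-305: the first period is how
 *  often the identification bit is re-queried; the second, shorter period is
 *  how often the timer value is checked against the set time length.
 *  Period values are illustrative assumptions. */
public final class ReadinessPoller {
    private static final long FIRST_PERIOD_MS = 16;   // set time length before re-querying the flag
    private static final long SECOND_PERIOD_MS = 4;   // timer sampling period, shorter than the first

    private final Handler handler = new Handler(Looper.getMainLooper());
    private long timerStart;

    public void start(final Runnable onReady) {
        if (queryIdentificationBit()) {               // steps 302/303: flag says the data is ready
            onReady.run();                            // step 306: read the gesture data
            return;
        }
        timerStart = SystemClock.uptimeMillis();      // step 304: start the timer
        handler.postDelayed(new Runnable() {
            @Override public void run() {
                long elapsed = SystemClock.uptimeMillis() - timerStart;
                if (elapsed >= FIRST_PERIOD_MS) {     // step 305: timer reached the set time length
                    start(onReady);                   // return to step 302
                } else {
                    handler.postDelayed(this, SECOND_PERIOD_MS);
                }
            }
        }, SECOND_PERIOD_MS);
    }

    private boolean queryIdentificationBit() {
        // Placeholder: read the driving-layer identification bit, e.g. via the preset node.
        return false;
    }
}
```

Keeping the second period shorter than the first lets the application layer notice promptly when the waiting interval has elapsed, without querying the driving layer more often than the set time length allows.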
And step 306, the application layer reads the gesture data from the preset node.
Step 307, judging whether the gesture data has been successfully read and is valid; if so, executing step 308; otherwise, executing step 306.
And step 308, the application layer starts the application program corresponding to the gesture type in the background through at least one starting thread.
The starting thread is used for executing the operation of opening the application program corresponding to the gesture type in the background. In the android system, the application is composed of Activity, and therefore, the starting process of the application is actually the starting process of the default Activity in the application, including invocation of an Activity class, instantiation of an object and the like. And after the application program corresponding to the gesture type is started in the background, caching the picture frame corresponding to the application program interface, and temporarily not drawing the application program interface to the touch display screen.
The association between the gesture type and an application program or mobile phone function is pre-established; the association between the gesture type and the application program (such as a process number or installation package name), or between the gesture type and the mobile phone function (such as waking up the mobile phone, or switching between a conference mode and a standard working mode), can be stored in the form of a white list. It can be understood that there are many ways to establish the association between a gesture type and an application program, and the embodiment of the present invention is not limited in this respect. For example, a quick start function may be configured for a given function or application of the mobile terminal before it leaves the factory, so that the function can be executed directly, or the application opened, in the screen-off state by inputting the corresponding gesture. Taking a flashlight as an example, the flashlight is preset with the quick start function before the mobile terminal leaves the factory, and the flashlight can be turned on in the screen-off state by inputting the black screen gesture O. As another example, the mobile terminal provides a black screen gesture configuration function: when the user enables the black screen gesture function, the user is prompted to select an application program for which the quick start function is needed and to input or select the black screen gesture corresponding to that application program, thereby establishing the association between the gesture type and the application program.
When the application layer receives the black screen gesture event, the preset white list is queried through the starting thread, the application program corresponding to the gesture type is determined, and the application program is started in the background.
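A minimal sketch of such a white list and of starting the matched application on a dedicated starting thread is shown below. The gesture letters, package names and the AppLauncher interface are hypothetical placeholders used only for illustration.

```java
import java.util.HashMap;
import java.util.Map;

/** Sketch of a white list mapping gesture types to the application to start
 *  in the background. Mappings are illustrative examples, not values defined
 *  by this embodiment. */
public final class GestureWhiteList {
    private static final Map<String, String> WHITE_LIST = new HashMap<>();
    static {
        WHITE_LIST.put("O", "com.example.camera");     // assumed mapping
        WHITE_LIST.put("V", "com.example.flashlight"); // assumed mapping
    }

    /** Looks up the application for the gesture type and starts it on a
     *  dedicated starting thread so track drawing can proceed in parallel. */
    public static void startInBackground(final String gestureType, final AppLauncher launcher) {
        final String packageName = WHITE_LIST.get(gestureType);
        if (packageName == null) {
            return; // no quick-start application configured for this gesture
        }
        new Thread(new Runnable() {
            @Override public void run() {
                launcher.launchWithoutDrawing(packageName); // cache frames, do not draw the interface yet
            }
        }, "gesture-start-thread").start();
    }

    /** Hypothetical launcher abstraction; the real start path goes through Activity startup. */
    public interface AppLauncher {
        void launchWithoutDrawing(String packageName);
    }
}
```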
Step 309, the application layer, through at least one drawing thread running in parallel with the starting thread, determines the gesture track corresponding to the gesture type and draws the gesture track to the touch display screen.
The drawing thread is used to determine, in parallel with the starting thread, the gesture track corresponding to the gesture type and to display the gesture track on the touch display screen. The drawing of the gesture track can be completed by one drawing thread, or gesture track segments can be drawn by two or more drawing threads respectively. After each drawing thread finishes the gesture track segment it is responsible for, the drawn segments are spliced into a complete gesture track, which can effectively improve the drawing efficiency of the gesture track.
For example, while the application layer opens, through the starting thread, the application program corresponding to the gesture type in the background, it concurrently executes, through at least one other drawing thread, the operation of determining the gesture trajectory corresponding to the gesture type. For instance, a drawing thread extracts, from the coordinate information included in the gesture data, sampling points meeting the preset sampling rule for that gesture type, draws the gesture track corresponding to the gesture type according to the sampling points, sends the picture frames containing the sampling points and the connecting lines between them to the frame buffer (FrameBuffer), and displays the gesture track on the touch display screen in an animated manner to simulate the drawing process of the black screen gesture.
Optionally, the preset sampling rule may be that one gesture coordinate is collected every set number of gesture coordinates to serve as a sampling point. And performing curve fitting on the sampling point to obtain a gesture track corresponding to the gesture type of the black screen gesture input by the user. The set number of pixel points can be drawn at set time intervals starting from the first sampling point of the gesture track, and therefore the drawing process of the gesture track is shown in an animation mode. In order to ensure a high black screen gesture response speed, the set time interval is a minimum time interval at which human eyes can distinguish image changes, and of course, the time interval can be set as required. Curve fitting is a data processing approach that uses a continuous curve to approximately delineate or mimic the functional relationship between the coordinates represented by discrete points on a plane.
Fig. 3b is a schematic display diagram of a black screen gesture track according to an embodiment of the present invention. As shown in fig. 3b, the gesture trajectory is drawn three sampling points at a time, at set time intervals, starting from the first sampling point 301 of the gesture type "W". In the first drawing pass, the gesture track between 3 sampling points is drawn starting from the first sampling point 301; after the set time interval, the second pass draws the track between the next 3 sampling points starting from the fourth sampling point 302; after another set time interval, the third pass draws the track between 3 sampling points starting from the seventh sampling point 303. Drawing continues according to this rule until the last sampling point 304 is reached, so that the gesture track is displayed on the touch display screen as an animation, which alleviates the monotony of a statically displayed gesture track and adds interest.
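The sampling rule and the segment-by-segment animation can be sketched as follows. The sampling step of five coordinates and the batch of three points per animation step (matching fig. 3b) are illustrative assumptions.

```java
import android.graphics.Path;
import java.util.ArrayList;
import java.util.List;

/** Sketch of the sampling rule and track construction: every N-th gesture
 *  coordinate is kept as a sampling point, and the track is extended a few
 *  points at a time so it can be shown as an animation. The sampling step
 *  and batch size are assumed example values. */
public final class GestureTrackBuilder {
    private static final int SAMPLING_STEP = 5;    // keep one coordinate out of every five
    private static final int POINTS_PER_FRAME = 3; // points appended per animation step (as in fig. 3b)

    public static List<float[]> samplePoints(List<float[]> coordinates) {
        List<float[]> samples = new ArrayList<>();
        for (int i = 0; i < coordinates.size(); i += SAMPLING_STEP) {
            samples.add(coordinates.get(i));
        }
        return samples;
    }

    /** Appends the next batch of sampling points to the path; called once per
     *  set time interval until the last sampling point has been drawn. */
    public static int appendNextSegment(Path path, List<float[]> samples, int drawnSoFar) {
        int end = Math.min(drawnSoFar + POINTS_PER_FRAME, samples.size());
        for (int i = drawnSoFar; i < end; i++) {
            float[] p = samples.get(i);
            if (i == 0) {
                path.moveTo(p[0], p[1]);
            } else {
                path.lineTo(p[0], p[1]);
            }
        }
        return end; // number of sampling points drawn so far
    }
}
```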
Step 310, determining whether a preset display condition of the application program interface is met, if yes, executing step 311, otherwise, executing step 312.
A display condition for switching from the gesture track display interface to the display interface of the application program is preset. The display condition can be set according to actual needs, for example: the application program corresponding to the gesture type has been successfully opened in the background, and the gesture track displayed on the touch display screen has been drawn to the last sampling point.
When detecting that the application program corresponding to the black screen gesture input by the user is started in the background, the application layer judges whether a gesture track on the touch display screen is drawn to the last sampling point, if so, step 311 is executed, otherwise, step 312 is executed. Optionally, if the gesture track on the touch display screen is drawn to the last sampling point, but the application program corresponding to the black screen gesture input by the user is not started, the completed gesture track is displayed on the touch display screen until it is detected that the application program is started in the background, and step 311 is executed.
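The switching decision of steps 310 to 312 reduces to the following check, shown only to make the two branches explicit; it is a sketch, not the implementation of this embodiment.

```java
/** Sketch of the display-condition check in steps 310-312: the application
 *  interface is shown only after the background start has finished and the
 *  gesture track has been drawn to its last sampling point; otherwise the
 *  gesture track remains on screen. */
public final class InterfaceSwitchPolicy {
    public enum Action { SHOW_APP_INTERFACE, KEEP_SHOWING_TRACK }

    public static Action decide(boolean appStartedInBackground, boolean trackDrawnToLastPoint) {
        return (appStartedInBackground && trackDrawnToLastPoint)
                ? Action.SHOW_APP_INTERFACE   // step 311
                : Action.KEEP_SHOWING_TRACK;  // step 312
    }
}
```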
And 311, drawing an interface corresponding to the application program to the touch display screen by the application layer so as to display the interface of the application program on the touch display screen.
And reading data from a preset storage area in which a picture frame corresponding to the application program interface is stored, and refreshing the application program interface to the touch display screen at a set screen refresh rate to realize direct switching from the gesture track interface to the application program interface.
And step 312, continuing to display the gesture track on the touch display screen.
According to the technical scheme of this embodiment, while the application program corresponding to the gesture type is being opened through the starting thread, a plurality of sampling points meeting the preset sampling rule are extracted from the coordinate information of the gesture data through the drawing thread, the gesture track is drawn according to the sampling points, and the gesture track is displayed on the touch display screen as an animation; when the preset display condition for the application program interface is met, the gesture track display interface is switched to the application program interface. With this technical scheme, the gesture track of the black screen gesture input by the user can be determined quickly while the application program is being opened in the background, and the gesture track is drawn vividly on the touch display screen, so the user does not perceive the black screen gesture function as unresponsive, and the response speed of the black screen gesture is further improved.
Fig. 4a is a flowchart of another black screen gesture control method according to an embodiment of the present invention. As shown in fig. 4a, the method comprises:
step 401, when detecting a black screen gesture, the touch chip triggers the kernel layer wake-up system.
And step 402, reporting a black screen gesture event corresponding to the black screen gesture to an application layer by the driving layer through a frame layer.
And step 403, when the driving layer reports a black screen gesture event, executing an operation of reading gesture data of the black screen gesture in parallel, and storing the gesture data in a preset node of the driving layer.
Step 404, when the application layer receives the black screen gesture event, it queries the identification bit in the driving layer that identifies the data state of the preset node.
Step 405, judging, according to the value of the identification bit, whether the gesture data is ready; if so, executing step 408; otherwise, executing step 406.
Step 406, time counting is performed through a timer.
Step 407, determine whether the value of the timer reaches the set time length.
And step 408, the application layer reads the gesture data from the preset node.
Step 409, judging whether the gesture data is successfully read and is valid, if so, executing step 410, otherwise, executing step 408.
And step 410, the application layer determines a gesture track corresponding to the gesture type and draws the gesture track to the touch display screen.
A pre-configured standard graphic library is queried to determine the standard graphic matching the gesture type. The standard graphic library can be stored in the mobile terminal to facilitate queries from the application layer, which gives a higher query speed because the library does not depend on the Internet. The standard graphic library can also be updated, after the mobile terminal is networked, based on an update message pushed by a remote server. Optionally, the standard graphic library may instead be stored on a remote server to avoid occupying the storage space of the mobile terminal.
Illustratively, the gesture type of the user input black screen gesture is 'W', the standard graph library is queried according to the gesture type, and the standard graph of 'W' for setting the display effect is determined. The set display effect may be a default display effect of the system or a display effect preset by the user, including font color, font style, font size, and the like.
The drawing thread can query the standard graph library according to the gesture type to obtain the standard graph corresponding to the black screen gesture, gesture coordinates do not need to be obtained, gesture tracks are drawn, and data acquisition amount is greatly reduced.
The application layer searches a standard graph matched with the gesture type in a pre-configured standard graph library, draws an image of the standard graph, stores image data in a frame cache of the touch display screen, and refreshes the image of the standard graph to the touch display screen according to a set screen refresh rate so as to display the image of the standard graph on the touch display screen.
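A minimal sketch of the pre-configured standard graphic library is given below: each gesture type maps to a pre-rendered standard graphic (for example a drawable resource name), so the track can be shown without reading gesture coordinates. The resource names are illustrative assumptions, not names defined by this embodiment.

```java
import java.util.HashMap;
import java.util.Map;

/** Sketch of the standard graphic library: each gesture type maps to the
 *  name of a pre-rendered standard graphic with the set display effect.
 *  Resource names are assumed placeholders. */
public final class StandardGraphicLibrary {
    private final Map<String, String> library = new HashMap<>();

    public StandardGraphicLibrary() {
        library.put("W", "gesture_w_standard"); // assumed resource name
        library.put("O", "gesture_o_standard"); // assumed resource name
        library.put("V", "gesture_v_standard"); // assumed resource name
    }

    /** Returns the standard graphic name matching the gesture type,
     *  or null when no standard graphic is configured for it. */
    public String lookup(String gestureType) {
        return library.get(gestureType);
    }
}
```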
Fig. 4b is a schematic display diagram of another black screen gesture track according to an embodiment of the present invention. As shown in fig. 4b, when the user inputs the black screen gesture "W" in the black screen state, the standard graph corresponding to the gesture type may be displayed on the touch display screen.
Step 411, the application layer starts an application program corresponding to the gesture type, and displays an interface of the application program on the touch display screen.
Reading data from a preset storage area in which a picture frame corresponding to the application program interface is stored, transmitting the data to a frame cache, refreshing the application program interface to a touch display screen according to a set screen refresh rate, and directly switching a display picture to the application program interface from a gesture track.
According to the technical scheme of this embodiment, the standard graphic corresponding to the gesture type is determined by querying the preset standard graphic library, and an image of the standard graphic is drawn to the touch display screen; then the application program corresponding to the gesture type is opened, and the interface of the application program is displayed on the touch display screen. With this technical scheme, the standard graphic corresponding to the track of the black screen gesture input by the user is determined quickly by looking it up in the preset standard graphic library and is drawn on the touch display screen, so the user does not perceive the black screen gesture function as unresponsive, and the response speed of the black screen gesture is further improved.
Fig. 5 is a schematic structural diagram of a black screen gesture control apparatus according to an embodiment of the present invention. As shown in fig. 5, the apparatus includes:
the first data reading module 510 is configured to, after waking up the system based on the black screen gesture, execute an operation of reading gesture data of the black screen gesture in parallel when the driving layer reports a black screen gesture event, and store the gesture data in a preset node of the driving layer;
the data detection module 520 is configured to detect whether the gesture data in the preset node is ready when the application layer receives the black screen gesture event;
the second data reading module 530 is configured to read, by the application layer, the gesture data from the preset node when the data is ready;
and the function executing module 540 is configured to execute, by the application layer, the black screen gesture function corresponding to the gesture data when the gesture data is successfully read and valid.
The technical scheme of this embodiment provides a black screen gesture control device. By changing the control logic of the black screen gesture function, it can effectively improve the response speed of the black screen gesture and shorten the time from detecting the black screen gesture to opening the application program corresponding to the black screen gesture.
Optionally, before the driving layer wakes up the system based on the black screen gesture, the method further includes:
when the touch chip detects a black screen gesture, triggering the kernel layer to wake up the system;
the reporting of the black screen gesture event comprises the following steps: and the driving layer reports the black screen gesture event corresponding to the black screen gesture to the application layer through the frame layer.
Optionally, the parallel execution of the operation of reading the gesture data corresponding to the black screen gesture includes:
and the driving layer executes in parallel, through an interrupt processing function, the operation of reading the gesture data corresponding to the black screen gesture from the touch chip.
Optionally, detecting, by the application layer triggered by the black screen gesture event, whether the gesture data in the preset node is ready includes:
when the application layer receives the black screen gesture event, querying the identification bit in the driving layer that identifies the data state of the preset node;
judging, by the application layer according to the value of the identification bit, whether the gesture data is ready;
if so, executing, by the application layer, the operation of extracting the gesture data from the preset node;
otherwise, after waiting for a set time length, returning, by the application layer, to the operation of querying the identification bit.
Optionally, after the application layer reads the gesture data from the preset node, the method further includes:
and the application layer judges whether the gesture data read from the preset node comprises a preset ending bit of the gesture data, and if so, the gesture data is determined to be successfully read.
Optionally, the executing the black screen gesture function corresponding to the gesture data includes:
the application layer determines a gesture track corresponding to the gesture type and draws the gesture track to the touch display screen;
and the application layer starts an application program corresponding to the gesture type and displays an interface of the application program on the touch display screen.
Optionally, the executing the black screen gesture function corresponding to the gesture data includes:
the application layer starts an application program corresponding to the gesture type in the background through at least one starting thread;
the application layer executes and determines a gesture track corresponding to the gesture type in parallel with the starting thread through at least one drawing thread, and draws the gesture track to a touch display screen;
and when the preset display condition is met, the application layer draws an interface corresponding to the application program to the touch display screen so as to display the interface of the application program on the touch display screen.
Embodiments of the present invention also provide a storage medium containing computer-executable instructions, which when executed by a computer processor, are configured to perform a method for black screen gesture control, the method including:
after a system is awakened based on a black screen gesture, when a driving layer reports a black screen gesture event, an operation of reading gesture data of the black screen gesture is executed in parallel, and the gesture data is stored in a preset node of the driving layer;
when the application layer receives the black screen gesture event, detecting whether the gesture data in the preset node is ready;
when the data is ready, reading, by the application layer, the gesture data from the preset node;
and when the gesture data is successfully read and valid, executing, by the application layer, a black screen gesture function corresponding to the gesture data.
Storage medium - any of various types of memory devices or storage devices. The term "storage medium" is intended to include: installation media such as CD-ROM, floppy disk, or tape devices; computer system memory or random access memory such as DRAM, DDR RAM, SRAM, EDO RAM, Rambus RAM, etc.; non-volatile memory such as flash memory or magnetic media (e.g., a hard disk or optical storage); registers or other similar types of memory elements, etc. The storage medium may also include other types of memory or combinations thereof. In addition, the storage medium may be located in the first computer system in which the program is executed, or in a different, second computer system connected to the first computer system through a network (such as the Internet). The second computer system may provide program instructions to the first computer for execution. The term "storage medium" may include two or more storage media residing in different locations, such as in different computer systems connected by a network. The storage medium may store program instructions (e.g., embodied as a computer program) executable by one or more processors.
Of course, the storage medium containing the computer-executable instructions provided in the embodiments of the present invention is not limited to the operations of the black screen gesture control method described above, and may also perform related operations in the black screen gesture control method provided in any embodiments of the present invention.
The embodiment of the invention provides a mobile terminal in which the black screen gesture control device provided by the embodiment of the invention can be integrated. Fig. 6 is a schematic structural diagram of a mobile terminal according to an embodiment of the present invention. As shown in fig. 6, the mobile terminal may include: a housing (not shown), a memory 601, a Central Processing Unit (CPU) 602 (also called a processor, hereinafter referred to as CPU), a circuit board (not shown), a touch display screen 612, and a power supply circuit (not shown). The touch display screen 612 is configured to convert a user operation into an electrical signal, input the electrical signal to the processor, and display visual output signals; the circuit board is arranged in the space enclosed by the touch display screen 612 and the housing; the CPU 602 and the memory 601 are disposed on the circuit board; the power supply circuit supplies power to each circuit or device of the mobile terminal; the memory 601 stores a computer program; and the CPU 602 reads and executes the computer program stored in the memory 601. When executing the computer program, the CPU 602 implements the following steps: after the system is awakened based on a black screen gesture, when the driving layer reports a black screen gesture event, executing in parallel an operation of reading the gesture data of the black screen gesture and storing the gesture data in a preset node of the driving layer; when the application layer receives the black screen gesture event, detecting whether the gesture data in the preset node is ready; when the data is ready, reading, by the application layer, the gesture data from the preset node; and when the gesture data is successfully read and valid, executing, by the application layer, the black screen gesture function corresponding to the gesture data.
The mobile terminal further includes: peripheral interface 603, RF (Radio Frequency) circuitry 605, audio circuitry 606, speakers 611, power management chip 608, input/output (I/O) subsystem 609, other input/control devices 610, and external port 604, which communicate via one or more communication buses or signal lines 607.
It should be understood that the illustrated mobile terminal 600 is merely one example of a mobile terminal and that the mobile terminal 600 may have more or fewer components than shown, may combine two or more components, or may have a different configuration of components. The various components shown in the figures may be implemented in hardware, software, or a combination of hardware and software, including one or more signal processing and/or application specific integrated circuits.
The mobile terminal integrated with the black screen gesture control device provided in this embodiment is described in detail below, taking a mobile phone as an example.
A memory 601, which may be accessed by the CPU 602, the peripheral interface 603, and the like. The memory 601 may include high-speed random access memory, and may also include non-volatile memory, such as one or more magnetic disk storage devices, flash memory devices, or other non-volatile solid-state storage devices.
A peripheral interface 603, which may connect the input and output peripherals of the device to the CPU 602 and the memory 601.
An I/O subsystem 609, which may connect input and output peripherals on the device, such as the touch display screen 612 and other input/control devices 610, to the peripheral interface 603. The I/O subsystem 609 may include a display controller 6091 and one or more input controllers 6092 for controlling the other input/control devices 610. The one or more input controllers 6092 receive electrical signals from, or send electrical signals to, the other input/control devices 610, which may include physical buttons (push buttons, rocker buttons, etc.), dials, slide switches, joysticks, and click wheels. It is worth noting that an input controller 6092 may be connected to any of the following: a keyboard, an infrared port, a USB interface, or a pointing device such as a mouse.
A touch display screen 612, which is an input interface and an output interface between the user terminal and the user, and displays visual output to the user; the visual output may include graphics, text, icons, video, and the like.
The display controller 6091 in the I/O subsystem 609 receives electrical signals from the touch display screen 612 or sends electrical signals to the touch display screen 612. The touch display screen 612 detects a contact on the touch display screen, and the display controller 6091 converts the detected contact into an interaction with a user interface object displayed on the touch display screen 612, thereby implementing human-computer interaction; the user interface object displayed on the touch display screen 612 may be an icon for running a game, an icon for connecting to a corresponding network, or the like. It is worth noting that the device may also include a light mouse, which is a touch-sensitive surface that does not display visual output, or an extension of the touch-sensitive surface formed by the touch display screen 612.
The RF circuit 605 is mainly used to establish communication between the mobile phone and the wireless network (i.e., the network side), and to implement data reception and transmission between the mobile phone and the wireless network, for example sending and receiving short messages, e-mails, and the like. Specifically, the RF circuit 605 receives and transmits RF signals, which are also referred to as electromagnetic signals; the RF circuit 605 converts electrical signals into electromagnetic signals or converts electromagnetic signals into electrical signals, and communicates with communication networks and other devices through the electromagnetic signals. The RF circuit 605 may include known circuitry for performing these functions, including but not limited to an antenna system, an RF transceiver, one or more amplifiers, a tuner, one or more oscillators, a digital signal processor, a codec (CODEC) chipset, a Subscriber Identity Module (SIM), and so forth.
The audio circuit 606 is mainly used to receive audio data from the peripheral interface 603, convert the audio data into an electrical signal, and transmit the electrical signal to the speaker 611.
The speaker 611 is used to convert the voice signal received by the mobile phone from the wireless network through the RF circuit 605 into sound and play the sound to the user.
A power management chip 608, used to supply power to and manage the power of the hardware connected to the CPU 602, the I/O subsystem 609, and the peripheral interface 603.
The mobile terminal provided by the embodiment of the invention can effectively improve the response speed of the black screen gesture and shorten the time from the detection of the black screen gesture to the opening of the application program corresponding to the black screen gesture.
The black screen gesture control device, the storage medium and the mobile terminal provided in the above embodiments can execute the black screen gesture control method provided in any embodiment of the present invention, and have the corresponding functional modules and beneficial effects for executing the method. For technical details not described in detail in the above embodiments, reference may be made to the black screen gesture control method provided in any embodiment of the present invention.
It is to be noted that the foregoing is only illustrative of the preferred embodiments of the present invention and the technical principles employed. It will be understood by those skilled in the art that the present invention is not limited to the particular embodiments described herein, but is capable of various obvious changes, rearrangements and substitutions as will now become apparent to those skilled in the art without departing from the scope of the invention. Therefore, although the present invention has been described in greater detail by the above embodiments, the present invention is not limited to the above embodiments, and may include other equivalent embodiments without departing from the spirit of the present invention, and the scope of the present invention is determined by the scope of the appended claims.

Claims (10)

1. A black screen gesture control method is characterized by comprising the following steps:
after the system is woken up based on a black screen gesture, when a driving layer reports a black screen gesture event, executing in parallel an operation of reading gesture data of the black screen gesture, and storing the gesture data in a preset node of the driving layer, wherein the black screen gesture comprises a touch gesture input on a touch display screen of a mobile terminal in a screen-off state or an operation detected by a sensor, the black screen gesture event is an event negotiated in advance by the driving layer and an application layer for representing that the black screen gesture has been input, the gesture data comprises a gesture coordinate, a gesture type and a preset end bit corresponding to the black screen gesture, the preset node is a file node, and waking up the system based on the black screen gesture comprises: generating, by a touch chip, a wake-up signal when the black screen gesture is detected, and triggering, through the wake-up signal, a kernel layer to wake up the system;
when the application layer receives the black screen gesture event, detecting whether the gesture data in the preset node is ready;
when the gesture data is ready, reading, by the application layer, the gesture data from the preset node, wherein the gesture data comprises the gesture type;
and when the gesture data is read successfully and is valid, executing, by the application layer, a black screen gesture function corresponding to the gesture data.
2. The method of claim 1, wherein the reporting of the black screen gesture event comprises:
the driving layer reports the black screen gesture event corresponding to the black screen gesture to the application layer through a framework layer.
3. The method according to claim 2, wherein executing in parallel the operation of reading the gesture data corresponding to the black screen gesture comprises:
the driving layer executes, in parallel through an interrupt processing function, the operation of reading the gesture data corresponding to the black screen gesture from the touch chip.
4. The method of claim 1, wherein the application layer is triggered by the black screen gesture event to detect whether the gesture data in the preset node is ready, and the detection comprises:
when the application layer receives the black screen gesture event, querying, in the driving layer, an identification bit used for identifying the data state of the preset node;
the application layer judges whether the gesture data is ready according to the value of the identification bit;
if so, the application layer executes the operation of extracting the gesture data from the preset node;
otherwise, after waiting for a set duration, the application layer returns to the operation of querying the identification bit.
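A minimal sketch of this polling behavior is given below, assuming the identification bit is exposed as a readable file node; the node path, flag value, wait interval, and retry bound are illustrative assumptions rather than values taken from the claims.

// Hypothetical sketch of the polling in claim 4: query a flag bit exposed by
// the driver, wait a set duration if the data is not ready, then query again.
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Paths;

public class GestureDataPoller {

    private static final String FLAG_NODE = "/proc/touchpanel/gesture_ready"; // assumed path
    private static final long WAIT_MS = 2;      // assumed set duration between queries
    private static final int MAX_RETRIES = 500; // assumed upper bound to avoid blocking forever

    // Returns true once the driver marks the gesture data as ready.
    public boolean waitUntilReady() throws IOException, InterruptedException {
        for (int i = 0; i < MAX_RETRIES; i++) {
            String flag = new String(Files.readAllBytes(Paths.get(FLAG_NODE))).trim();
            if ("1".equals(flag)) {  // identification bit indicates the data is ready
                return true;
            }
            Thread.sleep(WAIT_MS);   // wait for the set duration, then query again
        }
        return false;
    }
}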
5. The method according to claim 1, further comprising, after the application layer reads the gesture data from the preset node:
the application layer judges whether the gesture data read from the preset node comprises the preset end bit of the gesture data, and if so, determines that the gesture data is read successfully.
6. The method according to any one of claims 1 to 5, wherein performing a black screen gesture function corresponding to the gesture data comprises:
the application layer determines a gesture track corresponding to the gesture type and draws the gesture track on the touch display screen;
and the application layer starts an application program corresponding to the gesture type and displays an interface of the application program on the touch display screen.
7. The method according to any one of claims 1 to 5, wherein performing a black screen gesture function corresponding to the gesture data comprises:
the application layer starts, in the background through at least one starting thread, an application program corresponding to the gesture type;
the application layer determines, through at least one drawing thread executed in parallel with the starting thread, a gesture track corresponding to the gesture type, and draws the gesture track on the touch display screen;
and when a preset display condition is met, the application layer draws an interface corresponding to the application program on the touch display screen, so as to display the interface of the application program on the touch display screen.
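As an illustration of the parallel flow in claim 7, the sketch below launches the application in a background starting thread while a separate drawing thread renders the gesture track; the thread structure, class and method names, and the choice of "application finished launching" as the display condition are assumptions for illustration, not the claimed implementation.

// Hypothetical sketch: launch the application and draw the gesture track in parallel,
// then draw the application interface once the assumed display condition is met.
import java.util.concurrent.CountDownLatch;

public class ParallelGestureLauncher {

    public void onGestureRecognized(int gestureType) throws InterruptedException {
        CountDownLatch appReady = new CountDownLatch(1);

        // Starting thread: launch the application mapped to the gesture type in the background.
        Thread startingThread = new Thread(() -> {
            launchApplication(gestureType);
            appReady.countDown(); // signals the assumed "preset display condition"
        });

        // Drawing thread: determine the gesture track and draw it on the touch display screen.
        Thread drawingThread = new Thread(() -> drawGestureTrack(gestureType));

        startingThread.start();
        drawingThread.start();

        // When the display condition is met (here: the app has finished launching),
        // draw the application's interface on the touch display screen.
        appReady.await();
        drawApplicationInterface(gestureType);
    }

    private void launchApplication(int gestureType) { /* placeholder */ }
    private void drawGestureTrack(int gestureType) { /* placeholder */ }
    private void drawApplicationInterface(int gestureType) { /* placeholder */ }
}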
8. A black screen gesture control apparatus, comprising:
a first data reading module, configured to: after the system is woken up based on a black screen gesture, when a driving layer reports a black screen gesture event, execute in parallel an operation of reading gesture data of the black screen gesture, and store the gesture data in a preset node of the driving layer, wherein the black screen gesture comprises a touch gesture input on a touch display screen of the mobile terminal in a screen-off state or an operation detected by a sensor, the black screen gesture event is an event negotiated in advance by the driving layer and the application layer for representing that the black screen gesture has been input, the gesture data comprises a gesture coordinate, a gesture type and a preset end bit corresponding to the black screen gesture, the preset node is a file node, and waking up the system based on the black screen gesture comprises: generating, by a touch chip, a wake-up signal when the black screen gesture is detected, and triggering, through the wake-up signal, a kernel layer to wake up the system;
a data detection module, configured to detect whether the gesture data in the preset node is ready when the application layer receives the black screen gesture event;
a second data reading module, configured to read, by the application layer, the gesture data from the preset node when the gesture data is ready, wherein the gesture data comprises the gesture type;
and a function execution module, configured to execute, by the application layer, the black screen gesture function corresponding to the gesture data when the gesture data is read successfully and is valid.
9. A computer-readable storage medium, on which a computer program is stored, wherein the program, when executed by a processor, implements the black screen gesture control method according to any one of claims 1 to 7.
10. A mobile terminal comprising a touch display screen, a memory, a processor and a computer program stored on the memory and executable on the processor, wherein the processor implements the black screen gesture control method according to any one of claims 1 to 7 when executing the computer program.
CN201710632956.7A 2017-07-28 2017-07-28 Black screen gesture control method and device, storage medium and mobile terminal Expired - Fee Related CN107479700B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710632956.7A CN107479700B (en) 2017-07-28 2017-07-28 Black screen gesture control method and device, storage medium and mobile terminal

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710632956.7A CN107479700B (en) 2017-07-28 2017-07-28 Black screen gesture control method and device, storage medium and mobile terminal

Publications (2)

Publication Number Publication Date
CN107479700A CN107479700A (en) 2017-12-15
CN107479700B true CN107479700B (en) 2020-05-12

Family

ID=60598279

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710632956.7A Expired - Fee Related CN107479700B (en) 2017-07-28 2017-07-28 Black screen gesture control method and device, storage medium and mobile terminal

Country Status (1)

Country Link
CN (1) CN107479700B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110032397B (en) * 2018-01-10 2023-01-31 Oppo广东移动通信有限公司 Application processing method and device, electronic equipment and computer readable storage medium
CN109407842A (en) * 2018-10-22 2019-03-01 Oppo广东移动通信有限公司 Interface operation method, device, electronic equipment and computer readable storage medium
CN109933253A (en) * 2019-01-23 2019-06-25 努比亚技术有限公司 Method for controlling application starting, terminal and computer readable storage medium
CN112462963A (en) * 2019-09-09 2021-03-09 北京小米移动软件有限公司 Non-contact gesture control method and device and storage medium
CN112527093A (en) * 2019-09-18 2021-03-19 华为技术有限公司 Gesture input method and electronic equipment

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104137034A (en) * 2011-11-30 2014-11-05 惠普发展公司,有限责任合伙企业 Input mode based on location of hand gesture
CN104238916A (en) * 2014-09-16 2014-12-24 广东欧珀移动通信有限公司 Application or application function starting method of mobile terminal, and mobile terminal
CN106576123A (en) * 2015-09-02 2017-04-19 华为技术有限公司 A method, an apparatus and an electronic device for controlling an electronic device
CN106843728A (en) * 2017-01-16 2017-06-13 珠海市魅族科技有限公司 A kind of operation trace processing method and system

Also Published As

Publication number Publication date
CN107479700A (en) 2017-12-15

Similar Documents

Publication Publication Date Title
CN107479700B (en) Black screen gesture control method and device, storage medium and mobile terminal
CN107748686B (en) Application program starting optimization method and device, storage medium and intelligent terminal
CN107395889B (en) Method and device for reducing power consumption of mobile terminal, storage medium and mobile terminal
US20190370095A1 (en) Method and device for preloading application, storage medium and intelligent terminal
US10901608B2 (en) Method for recognizing a screen-off gesture, and storage medium and terminal thereof
US11086510B2 (en) Split screen control method based on screen-off gestures, and storage medium and mobile terminal thereof
CN107450838B (en) Response method and device of black screen gesture, storage medium and mobile terminal
WO2019206213A1 (en) Application program pre-loading method and apparatus, and storage medium and terminal
US11604660B2 (en) Method for launching application, storage medium, and terminal
WO2019223578A1 (en) Application program preloading method and apparatus, and storage medium and terminal
CN107450837B (en) Respond method, apparatus, storage medium and the mobile terminal of blank screen gesture
US20190370657A1 (en) Method and apparatus for updating application prediction model, storage medium, and terminal
WO2019233241A1 (en) Method and apparatus for starting application program, and storage medium and terminal
US20200201536A1 (en) Black screen gesture detection method and device, storage medium, and mobile terminal
WO2019223511A1 (en) Application program preloading method and apparatus, storage medium, and terminal
WO2019218886A1 (en) Application pre-loading management method, device, storage medium and smart terminal
EP3435215B1 (en) Method, device, storage medium and mobile terminal for recognizing an off-screen gesture
CN107402713B (en) Accelerate method, apparatus, storage medium and the mobile terminal of the processing of blank screen gesture
WO2019214477A1 (en) Application program pre-loading method and device, storage medium and terminal
WO2019047231A1 (en) Touch operation response method and device
WO2019019899A1 (en) Method and device for improving response to black screen gesture, storage medium, and mobile terminal
CN108664285A (en) Application program preloads method, apparatus, storage medium and mobile terminal
WO2019047226A1 (en) Touch operation response method and device
CN110795172B (en) Foreground process control method and device, electronic equipment and storage medium
WO2018010438A1 (en) Responding method and apparatus for terminal

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: Changan town in Guangdong province Dongguan 523860 usha Beach Road No. 18

Applicant after: GUANGDONG OPPO MOBILE TELECOMMUNICATIONS Corp.,Ltd.

Address before: Changan town in Guangdong province Dongguan 523860 usha Beach Road No. 18

Applicant before: GUANGDONG OPPO MOBILE TELECOMMUNICATIONS Corp.,Ltd.

GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20200512
