WO2019019899A1 - Method, apparatus, storage medium and mobile terminal for improving black screen gesture response - Google Patents

Method, apparatus, storage medium and mobile terminal for improving black screen gesture response

Info

Publication number
WO2019019899A1
Authority
WO
WIPO (PCT)
Prior art keywords
gesture
application
display
data
display screen
Prior art date
Application number
PCT/CN2018/094914
Other languages
English (en)
French (fr)
Inventor
韩通
郭明强
石仁栋
张强
汪昊
Original Assignee
Oppo广东移动通信有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Oppo广东移动通信有限公司
Publication of WO2019019899A1 publication Critical patent/WO2019019899A1/zh

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00Arrangements for program control, e.g. control units
    • G06F9/06Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44Arrangements for executing specific programs
    • G06F9/445Program loading or initiating
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers
    • H04M1/72Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/725Cordless telephones

Definitions

  • the embodiments of the present application relate to mobile terminal technologies, for example, to a method, an apparatus, a storage medium, and a mobile terminal for improving a black screen gesture response.
  • Mobile terminals such as smart phones, PDAs, tablets, or handheld game consoles, are typically designed with touch detection to provide a touch input method that makes the user's operation easier.
  • The black screen gesture is a distinctive smartphone feature with a futuristic, high-tech feel.
  • When the black screen gesture function is turned on, gesture operations acting on the display screen can be detected even while the smartphone is in a standby, black-screen state, thereby triggering a corresponding function or piece of software inside the phone.
  • However, flaws in the black screen gesture processing flow cause the mobile terminal to respond slowly to black screen gestures, so the black screen gesture function does not feel sensitive enough.
  • the embodiment of the present application provides a method, an apparatus, a storage medium, and a mobile terminal for improving a black screen gesture response, which can improve the response speed of a black screen gesture.
  • an embodiment of the present application provides a method for improving a black screen gesture response, including:
  • when a black screen gesture event is detected, acquiring gesture data corresponding to the black screen gesture event, where the gesture data includes a gesture type;
  • opening, by at least one open thread, an application corresponding to the gesture type in the background;
  • determining, by at least one drawing thread executed in parallel with the open thread, a gesture track corresponding to the gesture type according to the gesture data, and drawing the gesture track to a display screen; and
  • when a preset display condition is met, drawing an interface corresponding to the application to the display screen to display the interface of the application on the display screen.
  • the embodiment of the present application further provides an apparatus for improving a black screen gesture response, including:
  • a gesture data acquisition module configured to acquire gesture data corresponding to the black screen gesture event when the black screen gesture event is detected, where the gesture data includes a gesture type;
  • the application opening module is configured to open an application corresponding to the gesture type in the background by using at least one open thread;
  • a gesture track drawing module, configured to determine, by at least one drawing thread executed in parallel with the open thread, a gesture track corresponding to the gesture type, and draw the gesture track to the display screen;
  • the application display module is configured to draw an interface corresponding to the application to the display screen to display an interface of the application on the display screen when a preset display condition is met.
  • The embodiment of the present application further provides a computer-readable storage medium storing a computer program which, when executed by a processor, implements the method for improving a black screen gesture response described in the embodiments of the present application.
  • The embodiment of the present application further provides a mobile terminal, including a display screen, a memory, a processor, and a computer program stored on the memory and executable on the processor, where the processor, when executing the computer program, implements the method for improving a black screen gesture response described in the embodiments of the present application.
  • In the solution for improving black screen gesture response provided by the embodiments of the present application, when a black screen gesture event is detected, gesture data corresponding to the black screen gesture event is acquired, where the gesture data includes a gesture type; an application corresponding to the gesture type is opened in the background by at least one open thread; a gesture track corresponding to the gesture type is determined according to the gesture data by at least one drawing thread executed in parallel with the open thread, and the gesture track is drawn to a display screen; and when a preset display condition is met, an interface corresponding to the application is drawn to the display screen to display the interface of the application on the display screen. By adopting the above technical solution, the response speed of black screen gestures can be effectively improved, and the time required from detecting a black screen gesture to opening the application corresponding to the black screen gesture is shortened.
  • FIG. 1 is a flowchart of a method for improving a black screen gesture response according to an embodiment.
  • FIG. 2 is a schematic diagram of an Android system framework provided by an embodiment.
  • FIG. 3 is a flowchart of a method for acquiring gesture data corresponding to the black screen gesture event according to an embodiment.
  • FIG. 4a is a flowchart of another method for improving a black screen gesture response according to an embodiment.
  • FIG. 4b is a schematic diagram showing the display of a black screen gesture track provided by an embodiment.
  • FIG. 5a is a flowchart of still another method for improving a black screen gesture response according to an embodiment.
  • FIG. 5b is a schematic diagram showing another black screen gesture track provided by an embodiment.
  • FIG. 6a is a structural block diagram of an apparatus for improving a black screen gesture response according to an embodiment.
  • FIG. 6b is a structural block diagram of the gesture data acquiring module 610 of FIG. 6a.
  • FIG. 6c is a structural block diagram of still another apparatus for improving a black screen gesture response according to an embodiment.
  • FIG. 6d is a structural block diagram of still another apparatus for improving a black screen gesture response according to an embodiment.
  • FIG. 6e is a structural block diagram of still another apparatus for improving a black screen gesture response according to an embodiment.
  • FIG. 7 is a schematic structural diagram of a mobile terminal according to an embodiment.
  • The black screen gesture function may mean that, when the mobile terminal (for example, a smartphone) is in a screen-off sleep state, the touch display screen operates in a low power consumption state to detect black screen gestures acting on the touch display screen while the screen is off, and, according to the black screen gesture, wakes up a certain function of the smartphone or opens a preset application corresponding to the black screen gesture type.
  • To facilitate understanding of the black screen gesture function, the flow from detecting a black screen gesture in the screen-off state to the application layer opening the application corresponding to that gesture is described below. The flow includes: storing the gesture data corresponding to the black screen gesture into a preset node of the driver layer, where the gesture data includes gesture coordinates and a gesture type; the driver layer determining the validity of the black screen gesture data; and, if the data is valid, the framework layer dispatching a black screen gesture event.
  • After the application layer receives the black screen gesture event, the application layer reads the gesture coordinates from the preset node in the driver layer, calculates the animation track of the black screen gesture according to the gesture coordinates and the gesture type, and sends the animation track data to the frame buffer (FrameBuffer) so that the animation track is refreshed to the touch display screen for display at the set screen refresh rate; subsequently, the application layer performs the operation of opening the application corresponding to the black screen gesture.
  • Because the above black screen gesture execution flow detects the black screen gesture while the display screen is off, during the period from the detected black screen gesture triggering the reporting of a black screen gesture event, through displaying the black screen gesture track on the display screen, to opening the application, the mobile terminal appears unresponsive from the user's point of view. For example, when the user inputs a black screen gesture to open the corresponding application, the operation of opening the application takes a relatively long time, so after the black screen gesture is input the system is delayed for a period of time before the interface of the application is displayed on the display screen. As a result, the user intuitively feels that the black screen gesture function is not sensitive enough.
  • the solution for improving the response of the black screen gesture provided by the embodiment of the present application can well solve the problem that the opening delay of the application corresponding to the black screen gesture is long.
  • FIG. 1 is a flowchart of a method for improving a black screen gesture response according to an embodiment of the present disclosure.
  • The method may be performed by a device for improving black screen gesture response, where the device may be implemented in software and/or hardware and is generally integrated in a mobile terminal.
  • the method includes:
  • Step 110 Acquire gesture data corresponding to the black screen gesture event when a black screen gesture event is detected.
  • the black screen gesture event may be an event that is pre-negotiated by the driver layer and the application layer for representing a black screen gesture input.
  • The black screen gesture may be a touch gesture input by the user on the display screen of the mobile terminal in the screen-off state after the black screen gesture function is turned on. It can be understood that the black screen gesture is not limited to touch gestures input on the display screen; it may also be an operation detected by a sensor of the mobile terminal or the like, for example, a gesture of shaking the smartphone from side to side, a gesture swept over the display screen of the smartphone, a gesture of pressing the border of the smartphone, and the like.
  • the display screen is a display screen with touch detection function.
  • the touch electrode can be fabricated on a glass substrate of the display panel to obtain a display screen with a touch detection function. It can be understood that, for a display screen that does not have a touch detection function, the function of the display screen in the embodiment of the present application needs to be implemented in conjunction with the touch screen.
  • the gesture data may include a gesture type, a gesture coordinate, a setting end bit, and the like.
  • the Android system framework includes a kernel layer 210, a core class library layer 220, a framework layer 230, and an application layer 240 from bottom to top.
  • the kernel layer 210 provides core system services, including security, memory management, process management, network protocol stack, and hardware drivers.
  • the hardware driver in the kernel layer 210 is referred to as a driving layer 211, and the driving layer 211 includes a touch screen display driver, a camera driver, and the like.
  • the core class library layer 220 includes an Android runtime environment (Android Runtime) and a library (Libraries). Among them, Android Runtime provides most of the functions available in the Java programming language core class library, including the core library (Core Libraries) and the Dalvik virtual machine (Dalvik VM). Each Android application is an instance of the Dalvik virtual machine running in their own process.
  • the class library is used by multiple components of the Android system, including the following functions: Media Framework, Surface Manager, Relational Database Engine (SQLite), Bitmap and Vector Font Rendering (FreeType), etc. The functionality is exposed to the developer through the framework layer 230 of the Android system.
  • the framework layer 230 provides a series of class libraries needed to develop Android applications, enabling developers to develop applications quickly, to reuse components, and to extend personalization through inheritance.
  • The services provided include a component management service, a window management service, system data source components, a control framework, a resource management service, an installation package management service, and the like.
  • the application layer 240 includes a plurality of applications that directly interact with the user, or a service program written in the Java language and running in the background, including a desktop application, a contact application, a call application, a camera application, a picture browser, a game, a map, Programs such as web browsers, and other applications developed by developers.
  • After the black screen gesture function is turned on, the touch chip generates a wake-up signal when a black screen gesture is detected and sends the wake-up signal to the kernel layer 210.
  • the wake-up signal triggers the kernel layer 210 to perform a system wake-up operation.
  • the kernel layer 210 invokes the driver layer 211 interrupt function to perform an operation of reading the gesture data in the touch chip, and stores the read gesture data in the preset node of the driver layer 211.
  • The touch chip is configured to output a touch sensing control signal to the touch display screen to detect touch operations, to identify the gesture coordinates of a black screen gesture acting on the touch display screen, and to store the gesture coordinates as gesture data in its own register.
  • The preset node can be a file node, for example, a virtual file node under the proc directory.
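  • As an illustrative sketch only (the node path /proc/touchpanel/gesture_data and the line format below are assumptions, not the driver interface defined in this application), reading such a virtual file node from the application layer might look as follows:

```java
import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;
import java.util.ArrayList;
import java.util.List;

final class GestureNodeReader {
    // Hypothetical virtual file node exposed by the touch driver.
    private static final String NODE_PATH = "/proc/touchpanel/gesture_data";

    static final class GestureData {
        String gestureType;                                   // e.g. "W", "V", "0"
        final List<float[]> coordinates = new ArrayList<>();  // {x, y} pairs
        boolean complete;                                      // true once the "#" end bit is seen
    }

    /** Reads and parses the preset node; returns null if the node cannot be read. */
    static GestureData read() {
        GestureData data = new GestureData();
        try (BufferedReader reader = new BufferedReader(new FileReader(NODE_PATH))) {
            String line;
            while ((line = reader.readLine()) != null) {
                if (line.startsWith("type=")) {
                    data.gestureType = line.substring(5).trim();   // gesture type written by the driver
                } else if (line.startsWith("#")) {
                    data.complete = true;                          // set end bit written by the driver
                } else {
                    String[] xy = line.split(",");
                    if (xy.length == 2) {                          // one "x,y" coordinate per line
                        data.coordinates.add(new float[] {
                                Float.parseFloat(xy[0].trim()),
                                Float.parseFloat(xy[1].trim()) });
                    }
                }
            }
        } catch (IOException | NumberFormatException e) {
            return null;
        }
        return data;
    }
}
```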
  • the driver layer determines the validity of the gesture data. There are many ways to determine the validity, which is not specifically limited in this embodiment. For example, the driving layer 211 determines the gesture type according to the gesture coordinates included in the gesture data, and stores the determined gesture type as gesture data in the preset node. If the gesture type is not a preset black screen gesture, it is determined that the gesture data is invalid.
  • Alternatively, the driver layer counts the number of gesture data points and determines whether that number is sufficient for drawing a preset black screen gesture; if the requirement for drawing the preset black screen gesture is not satisfied, the gesture data is determined to be invalid.
  • the driver layer 211 reports a black screen gesture event.
  • the black screen gesture event is transmitted to the framework layer 230 through the core class library layer 220 and distributed through the framework layer 230 to the application layer 240.
  • When the application layer 240 receives a black screen gesture event, the gesture data is read from the preset node of the driver layer 211.
  • the black screen gesture track is calculated according to the gesture coordinates included in the gesture data, and the black screen gesture track is drawn on the touch display screen for display.
  • the application layer 240 then opens an application corresponding to the gesture type based on the type of gesture in the read gesture data.
  • the gesture type may be a gesture for implementing a certain function preset in the mobile terminal, or may be a user-defined gesture.
  • the gesture type can be 0, which means that the camera is turned on.
  • the gesture type can be V, which means turning on the flashlight and the like.
  • the black screen gesture event may be reported when the system wakes up, and the kernel layer 210 calls the driver layer 211 interrupt function to perform the operation of reading the gesture data in the touch chip, and stores the gesture data in the preset node of the driver layer 211.
  • In parallel, the driver layer 211 reads the gesture data and determines the gesture type according to the gesture data.
  • the driving layer 211 acquires gesture data in the preset node, and performs curve fitting on the gesture data to obtain a gesture type that is closest to the black screen gesture, and stores the gesture type as gesture data in the preset node.
  • the application layer 240 When the application layer 240 receives the black screen gesture event, it is detected whether the gesture data in the preset node is ready to be completed according to the set period. When the preparation is completed, the application layer 240 reads the gesture data from the preset node. When the gesture data is successfully read and valid, the black screen gesture track is calculated according to the gesture coordinates included in the gesture data, and the black screen gesture track is drawn on the touch display screen for display. The application layer 240 then opens an application corresponding to the gesture type based on the type of gesture in the read gesture data.
  • Step 120 Open, by using at least one open thread, an application corresponding to the gesture type in the background.
  • the open thread can be used to perform an operation of opening an application corresponding to the gesture type in the background.
  • An Android application is composed of Activities; therefore, the startup process of the application is actually the startup process of the application's default Activity, including loading the Activity class and instantiating its object.
  • While the application is opened in the background, the picture frame corresponding to the application interface can be cached, and the application interface is not yet drawn to the display screen.
  • The relationship between the gesture type and an application or a phone function can be established in advance, for example by storing, in a whitelist, the correspondence between the gesture type and the application (identified by, for example, a process number or an installation package name) or the phone function (such as waking up the phone, or switching among working modes such as a conference mode or a standard mode). It can be understood that there are many ways to establish the relationship between a gesture type and an application, which is not limited in the embodiments of the present application.
  • For example, a shortcut function can be set for a given function or a given application of the mobile terminal before the mobile terminal leaves the factory, so that the function can be executed directly, or the application can be opened, in the screen-off state by inputting the corresponding gesture.
  • For example, a quick start function is configured for the flashlight before the mobile terminal leaves the factory, so that the flashlight can be turned on in the screen-off state by inputting the black screen gesture "0".
  • the mobile terminal provides a black screen gesture configuration function, and when the user turns on the black screen gesture function, prompts the user to select an application that needs to set a quick start function, and inputs or selects a black screen gesture corresponding to the application, thereby establishing a gesture type and an application. Relationship.
  • the application layer may open a thread to query a preset white list, determine an application corresponding to the gesture type, and open the application in the background.
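  • A minimal sketch of such a whitelist lookup is shown below; the package and activity names, and the use of a single-thread executor as the open thread, are illustrative assumptions (truly opening the Activity without drawing its interface would additionally require system-level support, which is outside the scope of this sketch):

```java
import android.content.ComponentName;
import android.content.Context;
import android.content.Intent;
import java.util.HashMap;
import java.util.Map;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

final class GestureWhitelist {
    private final Map<String, ComponentName> whitelist = new HashMap<>();
    private final ExecutorService openThread = Executors.newSingleThreadExecutor();

    GestureWhitelist() {
        // Hypothetical mappings: gesture type -> application component.
        whitelist.put("0", new ComponentName("com.example.camera", "com.example.camera.MainActivity"));
        whitelist.put("W", new ComponentName("com.example.browser", "com.example.browser.MainActivity"));
    }

    /** Looks up the gesture type and launches the mapped application on the open thread. */
    void openInBackground(Context context, String gestureType) {
        ComponentName target = whitelist.get(gestureType);
        if (target == null) {
            return;  // gesture type not registered for quick start
        }
        openThread.execute(() -> {
            Intent intent = new Intent(Intent.ACTION_MAIN)
                    .setComponent(target)
                    .addFlags(Intent.FLAG_ACTIVITY_NEW_TASK);
            context.startActivity(intent);
        });
    }
}
```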
  • Step 130: Determine, by at least one drawing thread executed in parallel with the open thread, a gesture track corresponding to the gesture type, and draw the gesture track to the display screen.
  • The drawing thread can be used to execute, in parallel with the open thread, the operation of determining the gesture track corresponding to the gesture type and drawing the gesture track to the display screen.
  • the application layer performs an operation of determining a gesture track corresponding to the gesture type in parallel by another at least one drawing thread while the application corresponding to the gesture type is opened in the background by opening the thread.
  • There are many ways to determine the gesture track corresponding to the gesture type. For example, gesture pictures with different display effects, each corresponding to a gesture type of a black screen gesture, may be pre-stored in the mobile terminal. After a black screen gesture is input, the drawing thread only needs to acquire the gesture type of the black screen gesture and does not need to retrieve the gesture coordinates, which greatly reduces the amount of data to be acquired.
  • The gesture picture is determined according to the gesture type, the gesture picture is drawn, and the drawn gesture picture is sent to the frame buffer (FrameBuffer) so as to refresh the gesture picture to the display screen at the set refresh rate.
  • A gesture picture with the system default display effect can be selected according to the gesture type, or a gesture picture matching a display effect selected in advance by the user can be selected according to the gesture type. As another example, in order to reproduce the black screen gesture input by the user more realistically, after the black screen gesture is input, the drawing thread can acquire both the gesture type and the gesture coordinates, extract multiple sampling points from the gesture coordinates according to a drawing rule set for the gesture type, connect the sampling points in sequence to obtain a gesture track corresponding to the gesture type, send the picture frames containing the connections between sampling points to the frame buffer (FrameBuffer), and display the gesture track in an animated manner.
  • The foregoing drawing process may be performed by a single drawing thread, or different segments may be drawn separately by at least two drawing threads; after each drawing thread completes the gesture track segment it is responsible for, the completed segments are stitched into a complete gesture track.
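  • The parallel structure described above can be sketched as follows; AppLauncher and TrackRenderer are hypothetical placeholders for the opening and drawing logic rather than components defined in this application:

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

final class BlackScreenGestureDispatcher {
    private final ExecutorService openThread = Executors.newSingleThreadExecutor();
    private final ExecutorService drawingThread = Executors.newSingleThreadExecutor();

    interface AppLauncher { void openInBackground(String gestureType); }
    interface TrackRenderer { void drawTrack(String gestureType, float[][] coordinates); }

    /** Starts both operations at once instead of drawing the track first and opening the app afterwards. */
    void onGestureEvent(String gestureType, float[][] coordinates,
                        AppLauncher launcher, TrackRenderer renderer) {
        openThread.execute(() -> launcher.openInBackground(gestureType));           // open app in background
        drawingThread.execute(() -> renderer.drawTrack(gestureType, coordinates));  // draw track in parallel
    }
}
```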
  • Step 140 When the preset display condition is met, draw an interface corresponding to the application to the display screen to display an interface of the application on the display screen.
  • the display condition may be set according to actual needs, and is not specifically limited in this embodiment.
  • the display time threshold of the gesture track on the display screen may be specified as the display condition of the application interface, that is, when the gesture track display time reaches the set display time threshold, the display screen is switched to the application interface.
  • Data is read from the preset storage area holding the picture frame corresponding to the application interface and transmitted to the frame buffer (FrameBuffer), so that the application interface is refreshed to the display screen at the set refresh rate and the display is switched directly from the gesture track to the application interface.
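  • A minimal sketch of such a time-threshold display condition is given below, assuming a hypothetical 500 ms threshold and a caller-supplied runnable that draws the cached application interface:

```java
import android.os.Handler;
import android.os.Looper;

final class DisplaySwitcher {
    private static final long TRACK_DISPLAY_TIME_MS = 500;  // assumed display time threshold
    private final Handler mainHandler = new Handler(Looper.getMainLooper());

    /** Once the gesture track is shown, switch to the cached application interface after the threshold. */
    void scheduleSwitch(Runnable drawApplicationInterface) {
        mainHandler.postDelayed(drawApplicationInterface, TRACK_DISPLAY_TIME_MS);
    }
}
```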
  • In the technical solution of this embodiment, when a black screen gesture event is detected, gesture data corresponding to the black screen gesture event is acquired, where the gesture data includes a gesture type; an application corresponding to the gesture type is opened in the background by at least one open thread; a gesture track corresponding to the gesture type is determined according to the gesture data by at least one drawing thread executed in parallel with the open thread, and the gesture track is drawn to the display screen; and when the preset display condition is met, the interface corresponding to the application is drawn to the display screen to display the interface of the application on the display screen.
  • The user may input an incorrect black screen gesture and notice the error in time when the gesture track is displayed. In that case, the user usually does not want the application corresponding to the incorrect black screen gesture to be displayed on the display screen. Therefore, the ability to close the application in the background may be added before the interface corresponding to the application is drawn to the display screen. For example, before the interface corresponding to the application is drawn to the display screen, it is determined whether an abandonment-open instruction input by the user for the application is detected; when such an instruction is detected, the application is closed in the background,
  • and the mobile terminal is controlled to resume the black screen gesture mode. In this way, the application corresponding to the incorrect black screen gesture can be intercepted in time before its interface is displayed, which avoids wasting the processing resources of the graphics processor.
  • Compared with closing the application after it has been displayed, this can also effectively shorten the time needed to re-enter the black screen gesture mode.
  • FIG. 3 is a flowchart of a method for acquiring gesture data corresponding to the black screen gesture event according to an embodiment of the present application. As shown in FIG. 3, the method includes:
  • Step 310 When detecting a black screen gesture event, read an identifier bit in the driver layer of the operating system that identifies the data state of the preset node.
  • The data state of the preset node includes "preparation complete" and "in preparation", and whether the data in the preset node is in the ready state or the preparing state may be determined by detecting whether the gesture data in the preset node includes a set end bit.
  • the character corresponding to the preset end bit is "#”.
  • the touch chip stores the gesture data corresponding to the detected black screen gesture into a preset register.
  • the touch chip adds "#" to the end of the gesture data stored in the register after detecting that the black screen gesture input is completed.
  • the driver layer reads the gesture data in the preset node according to the set period. If the character corresponding to the preset end bit is detected, that is, “#”, it is determined that the preset node data state is ready for completion.
  • The set end bit can take many forms and is not limited to the "#" listed in this embodiment.
  • the flag is used to identify the state of the gesture data in the preset node in the driver layer.
  • the application layer can learn whether the state of the gesture data in the preset node is ready or completed by querying the identifier.
  • the value of the flag is determined by whether the driver layer reads the set end bit. For example, if the driver layer reads the set end bit, the value of the update flag is updated to a value corresponding to the preset preparation completion state. Thereafter, the application layer can determine that the data in the preset node is in a ready state according to the updated identifier bit. If the driver layer does not read the set end bit, the value of the flag is kept as the value corresponding to the state in preparation.
  • Step 320 Determine, according to the identifier, whether the gesture data in the preset node is ready to be completed. If the preparation is complete, go to step 330. If no preparation is completed, go to step 340.
  • The application layer reads the value of the identifier bit in the driver layer at the set period and compares it with the value representing that the gesture data in the preset node is ready. If the read value equals the value representing that the gesture data preparation is complete, it is determined that the gesture data in the preset node is ready, and step 330 is performed; if the read value does not equal that value, it is determined that the gesture data in the preset node is not yet ready, and step 340 is performed.
  • Step 330 Extract the gesture data from the preset node.
  • the gesture data preparation in the preset node is completed, the gesture data is extracted from the preset node.
  • For example, the application layer calls a set function to read the gesture data from the virtual file node under the proc directory.
  • Step 340: Timing is performed by a timer.
  • The timer is started to time a set time length.
  • For example, the set time length is equal to the first period at which the application layer reads the identifier bit in the driver layer.
  • Step 350: Determine whether the value of the timer reaches the set time length; if the set time length is reached, perform step 310, and if the set time length is not reached, perform step 340.
  • The timer reading is read at a second period and compared with the set time length; if the reading is greater than or equal to the set time length, step 310 is performed, and if it is less than the set time length, the process returns to step 340.
  • the second period is smaller than the first period. For example, when the application layer determines that the gesture data in the preset node is not ready, the application layer waits for the set length of time, and then reads the value of the identifier bit in the driver layer to determine the preset node according to the value thereof. Whether the gesture data inside is ready to be completed.
  • the technical solution of the embodiment determines whether the gesture data in the preset node of the driving layer is ready to be completed by reading the preset identifier bit in the driving layer; if the preparation is completed, reading the gesture data in the preset node If it is not ready to be completed, after waiting for the set length of time, the setting flag is re-read to perform data state determination in the preset node.
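  • The polling of steps 310 to 350 can be sketched as follows; the flag value and the polling period are illustrative assumptions rather than the driver's real interface:

```java
final class GestureDataPoller {
    private static final int FLAG_READY = 1;          // assumed "preparation complete" value
    private static final long FIRST_PERIOD_MS = 20;   // assumed wait time before re-reading the flag

    interface DriverFlagReader { int readFlag(); }       // reads the identifier bit in the driver layer
    interface GestureDataReader { Object readGestureData(); }

    /** Blocks until the preset node reports "ready", then extracts the gesture data. */
    Object waitAndRead(DriverFlagReader flagReader, GestureDataReader dataReader)
            throws InterruptedException {
        while (true) {
            if (flagReader.readFlag() == FLAG_READY) {   // steps 310/320: flag says the data is ready
                return dataReader.readGestureData();      // step 330: extract the gesture data
            }
            Thread.sleep(FIRST_PERIOD_MS);                // steps 340/350: timer before retrying
        }
    }
}
```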
  • FIG. 4a is a flowchart of another method for improving a black screen gesture response provided by an embodiment of the present application. As shown in Figure 4a, the method includes:
  • Step 410 Acquire gesture data corresponding to the black screen gesture event when a black screen gesture event is detected.
  • the gesture data includes a gesture type.
  • the gesture type is determined by the driver layer according to the gesture coordinates corresponding to the black screen gesture, and is formed by curve fitting.
  • Curve fitting is a method of data processing that approximates the relationship between the coordinates represented by discrete points on a plane by a continuous curve.
  • the driver layer performs a curve fit based on the gesture coordinates in the read gesture data to calculate the type of gesture that is closest to the input black screen gesture.
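  • A simple nearest-template matcher is one way such curve fitting could be realized; the normalization scheme and the assumption that templates are stored at the same length as the input are illustrative choices, not the method defined in this application:

```java
import java.util.HashMap;
import java.util.Map;

final class GestureTypeClassifier {
    private final Map<String, float[][]> templates = new HashMap<>();  // gesture type -> normalized points

    void registerTemplate(String type, float[][] normalizedPoints) {
        templates.put(type, normalizedPoints);
    }

    /** Returns the gesture type whose template is closest to the input coordinates. */
    String classify(float[][] rawPoints) {
        float[][] candidate = normalize(rawPoints);
        String best = null;
        double bestScore = Double.MAX_VALUE;
        for (Map.Entry<String, float[][]> e : templates.entrySet()) {
            double score = distance(candidate, e.getValue());
            if (score < bestScore) {
                bestScore = score;
                best = e.getKey();
            }
        }
        return best;
    }

    // Scale the stroke into a unit box so that position and size do not affect matching.
    private static float[][] normalize(float[][] pts) {
        float minX = Float.MAX_VALUE, minY = Float.MAX_VALUE;
        float maxX = -Float.MAX_VALUE, maxY = -Float.MAX_VALUE;
        for (float[] p : pts) {
            minX = Math.min(minX, p[0]); maxX = Math.max(maxX, p[0]);
            minY = Math.min(minY, p[1]); maxY = Math.max(maxY, p[1]);
        }
        float w = Math.max(maxX - minX, 1e-6f), h = Math.max(maxY - minY, 1e-6f);
        float[][] out = new float[pts.length][2];
        for (int i = 0; i < pts.length; i++) {
            out[i][0] = (pts[i][0] - minX) / w;
            out[i][1] = (pts[i][1] - minY) / h;
        }
        return out;
    }

    // Mean squared distance between corresponding points; a production matcher
    // would resample both strokes to the same length before comparing.
    private static double distance(float[][] a, float[][] b) {
        int n = Math.min(a.length, b.length);
        double sum = 0;
        for (int i = 0; i < n; i++) {
            double dx = a[i][0] - b[i][0], dy = a[i][1] - b[i][1];
            sum += dx * dx + dy * dy;
        }
        return n == 0 ? Double.MAX_VALUE : sum / n;
    }
}
```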
  • Step 420 Open, by using at least one open thread, an application corresponding to the gesture type in the background.
  • Step 430 Match, by the at least one drawing thread, the gesture type to the preset standard graphic in parallel with the opening thread.
  • The application layer determines and opens the application corresponding to the gesture type by the open thread while, by the drawing thread, querying a pre-configured standard graphics library to determine a standard graphic that matches the gesture type.
  • the standard graphics library can be set in the mobile terminal to facilitate application layer query, because it has a faster query speed without relying on the Internet.
  • the standard graphics library can be updated based on the update message pushed by the remote server after the mobile terminal is networked.
  • the standard graphics library can also be stored in a remote server to avoid occupying the storage space of the mobile terminal.
  • the gesture type of the user inputting the black screen gesture is “W”, and the standard graphics library is queried according to the gesture type, and the standard graphic of “W” for setting the display effect is determined.
  • the setting display effect may be a default display effect of the system or a display effect preset by the user, including font color, font shape, font size, and the like.
  • Step 440 Display a standard graphic that successfully matches the gesture type on the display screen.
  • the application layer finds a standard graphic matching the gesture type in a pre-configured standard graphic library, draws an image of the standard graphic, and stores the image data in a frame buffer of the display screen, and displays the standard graphic according to the set refresh rate. The image is refreshed to the display to display an image of the standard graphic on the display.
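  • An illustrative sketch of looking up and drawing a pre-stored standard graphic is given below; the mapping from gesture types to drawable resources (for example, a hypothetical R.drawable identifier) is an assumption for illustration:

```java
import android.content.Context;
import android.graphics.Bitmap;
import android.graphics.BitmapFactory;
import android.graphics.Canvas;
import java.util.HashMap;
import java.util.Map;

final class StandardGraphicRenderer {
    private final Map<String, Integer> graphics = new HashMap<>();  // gesture type -> drawable resource id

    StandardGraphicRenderer(Map<String, Integer> preconfigured) {
        graphics.putAll(preconfigured);   // e.g. "W" -> R.drawable.gesture_w (hypothetical resource)
    }

    /** Draws the standard graphic for the gesture type onto the supplied canvas (e.g. a SurfaceView's). */
    boolean draw(Context context, Canvas canvas, String gestureType) {
        Integer resId = graphics.get(gestureType);
        if (resId == null) {
            return false;   // no standard graphic registered for this gesture type
        }
        Bitmap bitmap = BitmapFactory.decodeResource(context.getResources(), resId);
        float left = (canvas.getWidth() - bitmap.getWidth()) / 2f;   // center the graphic
        float top = (canvas.getHeight() - bitmap.getHeight()) / 2f;
        canvas.drawBitmap(bitmap, left, top, null);
        return true;
    }
}
```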
  • FIG. 4b is a schematic diagram showing a black screen gesture track provided by an embodiment of the present application. As shown in FIG. 4b, when the user inputs a black screen gesture “W” in a black screen state, the standard graphic corresponding to the gesture type can be displayed on the display screen.
  • In step 450, it is determined whether the display condition preset for the application is satisfied; if the display condition is satisfied, step 460 is performed, and if the display condition is not satisfied, step 440 is performed.
  • Alternatively, when the display condition preset for the application is satisfied, step 470 may be executed directly, and when the display condition is not satisfied, the execution returns to step 450.
  • Step 460: Determine whether an abandonment-open command input by the user for the application is detected; if the abandonment-open command is not detected, step 470 is performed, and if the abandonment-open command is detected, step 480 is performed.
  • the abandonment open command is a pre-configured command triggered by a user's setting operation.
  • There are many kinds of setting operations that can trigger the abandonment-open command, which are not specifically limited herein.
  • The setting operation may be a system default gesture, such as shaking the mobile terminal or sweeping a hand over the display screen of the mobile terminal, or it may be a user-defined gesture, and the like.
  • Step 470 Draw an interface corresponding to the application to the display screen to display an interface of the application on the display screen.
  • Step 480 Close the application in the background, and control the mobile terminal to resume the black screen gesture mode.
  • the operation of drawing the interface corresponding to the application to the display screen is not performed, the application is closed in the background, and the display screen is turned off, and the mobile terminal is controlled to re-enter the black screen gesture mode.
  • the mobile terminal In the black screen gesture mode, the mobile terminal has lower power consumption and is capable of detecting a black screen gesture acting on the display screen.
  • In the technical solution of this embodiment, when the gesture type included in the gesture data is obtained, the open thread is used to open the application corresponding to the gesture type; the drawing thread queries the preset standard graphics library to determine the standard graphic corresponding to the gesture type, draws the image of the standard graphic, and displays it on the display screen; and when the display condition preset for the application is satisfied, the display screen is switched directly from the gesture track to the application interface.
  • FIG. 5a is a flowchart of still another method for improving a black screen gesture response provided by an embodiment of the present application. As shown in Figure 5a, the method includes:
  • Step 510 Acquire gesture data corresponding to the black screen gesture event when a black screen gesture event is detected.
  • Step 520 Open, by using at least one open thread, an application corresponding to the gesture type in the background.
  • Step 530: Extract, by at least one drawing thread executed in parallel with the open thread, sampling points that satisfy a sampling rule preset for the gesture type from the gesture coordinates included in the gesture data.
  • the sampling rule is set in advance for the gesture type.
  • a coordinate point may be set as a sampling point every set number of coordinate points. That is, by drawing the thread according to the preset sampling rule, a plurality of sampling points satisfying the preset sampling rule are extracted from the gesture data read by the application layer.
  • Step 540 Draw a gesture track corresponding to the gesture type according to the sampling point, and display the gesture track in an animated form on the display screen.
  • the sample point can be curve-fitted to obtain a gesture track corresponding to the gesture type of the black screen gesture input by the user.
  • the set number of pixel points may be drawn every set time interval starting from the first sampling point of the gesture track, thereby displaying the drawing process of the gesture track in an animated form.
  • the set time interval is a minimum time interval at which the human eye can distinguish the image change.
  • the time interval can be set as needed.
  • FIG. 5b is a schematic diagram showing another black screen gesture track provided by an embodiment of the present application.
  • As shown in FIG. 5b, the first sampling point 501 of the gesture type "W" is used as a starting point, and a gesture track segment spanning three sampling points is drawn every set time interval: the first drawing starts from the first sampling point 501 and covers the gesture track between three sampling points; after the set time length, the second drawing starts from the fourth sampling point 502; after another set time length, the third drawing starts from the seventh sampling point 503; and the drawing continues according to this rule until the last sampling point 504 is drawn, at which point the drawing ends.
  • The gesture track is thereby displayed on the display screen in an animated manner, which avoids the monotony of a statically displayed gesture track and makes the display more engaging.
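  • The sampling-and-animation scheme described above can be sketched as follows; the sampling stride, the three-points-per-frame segment size and the frame interval are illustrative assumptions:

```java
import android.os.Handler;
import android.os.Looper;
import java.util.ArrayList;
import java.util.List;

final class AnimatedTrackDrawer {
    interface SegmentRenderer { void drawUpTo(List<float[]> visiblePoints); }  // redraws the partial track

    private static final int SAMPLE_STRIDE = 4;      // keep one coordinate out of every four (assumed rule)
    private static final int POINTS_PER_FRAME = 3;   // each segment spans three sampling points
    private static final long FRAME_INTERVAL_MS = 40;

    private final Handler handler = new Handler(Looper.getMainLooper());

    /** Extracts sampling points from the raw coordinates, then reveals them one segment at a time. */
    void animate(List<float[]> rawCoordinates, SegmentRenderer renderer) {
        final List<float[]> samples = new ArrayList<>();
        for (int i = 0; i < rawCoordinates.size(); i += SAMPLE_STRIDE) {
            samples.add(rawCoordinates.get(i));       // sampling rule: every Nth coordinate point
        }
        drawNext(samples, POINTS_PER_FRAME, renderer);
    }

    private void drawNext(List<float[]> samples, int visibleCount, SegmentRenderer renderer) {
        int count = Math.min(visibleCount, samples.size());
        renderer.drawUpTo(samples.subList(0, count));  // draw the track up to the current sampling point
        if (count < samples.size()) {
            handler.postDelayed(
                    () -> drawNext(samples, visibleCount + POINTS_PER_FRAME, renderer),
                    FRAME_INTERVAL_MS);
        }
    }
}
```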
  • step 550 it is determined whether the display condition preset for the application is satisfied, and when the display condition preset for the application is satisfied, step 560 is executed, and when the display condition preset for the application is not satisfied, step 540 is performed.
  • The display condition for switching from the gesture track display interface to the display interface of the application is set in advance and can be set according to actual needs, for example: the application corresponding to the gesture type has been successfully opened in the background and the gesture track displayed on the display screen has been drawn to the last sampling point.
  • In this embodiment, the application layer determines, while the application corresponding to the black screen gesture input by the user is being opened in the background, whether the gesture track on the display screen has been drawn to the last sampling point. When the gesture track has been drawn to the last sampling point, step 560 is performed; when it has not been drawn to the last sampling point, the process proceeds to step 540. Optionally, if the gesture track on the display screen has been drawn to the last sampling point but the application corresponding to the black screen gesture input by the user has not yet been opened, the completed gesture track remains displayed on the display screen until it is detected that the application has been opened in the background, and then step 560 is performed.
  • Step 560: Determine whether an abandonment-open command input by the user for the application is detected; if the abandonment-open command is not detected, step 570 is performed, and if the abandonment-open command is detected, step 580 is performed.
  • Step 570 Draw an interface corresponding to the application to the display screen to display an interface of the application on the display screen.
  • the data is read from a preset storage area of the picture frame corresponding to the application interface, and the application interface is refreshed to the display screen at a set refresh rate, thereby directly switching from the gesture track interface to the application interface.
  • Step 580 Close the application in the background, and control the mobile terminal to resume the black screen gesture mode.
  • In the technical solution of this embodiment, the gesture type is obtained by curve fitting of the read gesture data, and the open thread is used to open the application corresponding to the gesture type; the drawing thread extracts, from the gesture coordinates in the gesture data, a number of sampling points that satisfy the preset sampling rule, draws the gesture track according to the sampling points, and displays the gesture track on the display screen in an animated form; and when the display condition preset for the application interface is satisfied, the display is switched from the gesture track display interface to the application interface.
  • FIG. 6a is a structural block diagram of an apparatus for improving a black screen gesture response according to an embodiment of the present application.
  • the device can be implemented in software and/or hardware and is typically integrated in a mobile terminal.
  • the apparatus can include:
  • the gesture data acquisition module 610 is configured to acquire gesture data corresponding to the black screen gesture event when the black screen gesture event is detected, where the gesture data includes a gesture type;
  • the application opening module 620 is configured to open an application corresponding to the gesture type in the background by using at least one open thread;
  • the gesture track drawing module 630 is configured to perform a gesture track corresponding to determining the gesture type in parallel with the open thread by using at least one drawing thread, and draw the gesture track to the display screen;
  • the application display module 640 is configured to draw an interface corresponding to the application to the display screen to display an interface of the application on the display screen when a preset display condition is satisfied.
  • the technical solution of the embodiment provides a device for improving the response of the black screen gesture, which can effectively improve the response speed of the black screen gesture and shorten the time required for detecting the black screen gesture to open the application corresponding to the black screen gesture.
  • the gesture data acquiring module 610 includes:
  • the data status determining sub-module 611 is configured to determine, when the black screen gesture event is detected, whether the gesture data in the preset node of the driver layer of the operating system is ready to be completed;
  • the data extraction sub-module 612 is configured to extract the gesture data from the preset node if the gesture data in the preset node is ready to be completed;
  • the data status determination sub-module 611 is configured to:
  • the identifier of the data state of the preset node in the driver layer is read, and whether the gesture data is ready to be completed is determined according to the value of the identifier bit.
  • the application open module 620 is configured to:
  • the gesture type of the black screen gesture is determined by curve fitting according to the gesture coordinates included in the gesture data.
  • the gesture trajectory rendering module 630 is configured to:
  • the at least one drawing thread is executed in parallel with the open thread to match the gesture type with a preset standard graphic, and draw a standard graphic that successfully matches the gesture type to the display screen.
  • Optionally, the gesture track drawing module 630 is configured to: extract, from the gesture coordinates included in the gesture data, sampling points that satisfy a sampling rule preset for the gesture type, draw a gesture track corresponding to the gesture type according to the sampling points, and display the gesture track in an animated form on the display screen.
  • the method further includes:
  • the first display condition determining module 650 is configured to successfully open the application in the background before the interface corresponding to the application is drawn to the display screen, and the gesture track displayed by the animation on the display screen is reached. At the last sampling point, it is determined that the preset display condition is satisfied.
  • a second display condition determining module 660 is further disposed, before the interface corresponding to the application is drawn to the display screen.
  • An abandonment instruction module 670 is further included, which is configured to determine, before the interface corresponding to the application is drawn to the display screen, whether an abandonment-open command input by the user for the application is detected, and to close the application in the background when the abandonment-open command is detected.
  • the embodiment of the present application further provides a storage medium including computer executable instructions for performing a method for improving a blackout gesture response when executed by a computer processor, the method comprising:
  • when a black screen gesture event is detected, acquiring gesture data corresponding to the black screen gesture event, where the gesture data includes a gesture type;
  • opening, by at least one open thread, an application corresponding to the gesture type in the background;
  • determining, by at least one drawing thread executed in parallel with the open thread, a gesture track corresponding to the gesture type, and drawing the gesture track to a display screen; and
  • when a preset display condition is met, drawing an interface corresponding to the application to the display screen to display the interface of the application on the display screen.
  • The storage medium may be any of various types of memory devices or storage devices.
  • the term "storage medium” is intended to include: a mounting medium such as a CD-ROM, a floppy disk or a tape device; a computer system memory or a random access memory (RAM) such as a dynamic random access memory (dynamic random access memory). DRAM), display data random access memory (DDR RAM), static random access memory (SRAM), extended data output random access memory (EDO RAM) , Rambus RAM, etc.; non-volatile memory, such as flash memory, magnetic media (such as hard disk or optical storage); registers or other similar types of memory components, and the like.
  • the storage medium may also include other types of memory or a combination thereof.
  • the storage medium may be located in a first computer system in which the program is executed, or may be located in a different second computer system, the second computer system being coupled to the first computer system via a network, such as the Internet.
  • the second computer system can provide program instructions to the first computer for execution.
  • the term "storage medium" can include at least two storage mediums that can reside in different locations (eg, in different computer systems connected through a network).
  • the storage medium may store program instructions (eg, implemented as a computer program) executable by the at least one processor.
  • In the storage medium containing computer-executable instructions provided by the embodiments of the present application, the computer-executable instructions are not limited to the operations for improving black screen gesture response described above, and may also perform related operations in the method for improving black screen gesture response provided by any embodiment of the present application.
  • FIG. 7 is a schematic structural diagram of a mobile terminal according to an embodiment of the present disclosure.
  • the mobile terminal may include: a casing (not shown), a memory 701, a central processing unit (CPU) 702 (also referred to as a processor, hereinafter referred to as a CPU), and a circuit board ( Not shown in the drawing), display screen 712 and power supply circuit (not shown).
  • the display screen 712 is configured to convert a user operation into an electrical signal input to the processor and display a visual output signal;
  • the circuit board is disposed inside the space surrounded by the display screen 712 and the housing
  • the CPU 702 and the memory 701 are disposed on the circuit board;
  • the power supply circuit is configured to supply power to each circuit or device of the mobile terminal;
  • the memory 701 is configured to store a computer program;
  • The CPU 702 reads and executes the computer program stored in the memory 701.
  • the CPU 702 when executing the computer program, implements the following steps: when detecting a black screen gesture event, acquiring gesture data corresponding to the black screen gesture event, the gesture data includes a gesture type; and opening in the background by at least one opening thread An application corresponding to the gesture type; performing, by the at least one drawing thread, executing a gesture trajectory corresponding to the gesture type in parallel with the open thread, and drawing the gesture trajectory to a display screen; when the preset display condition is met And mapping an interface corresponding to the application to the display screen to display an interface of the application on the display screen.
  • The mobile terminal further includes: a peripheral interface 703, a radio frequency (RF) circuit 705, an audio circuit 706, a speaker 711, a power management chip 708, an input/output (I/O) subsystem 709, other input/control devices 710, and an external port 704; these components communicate via one or more communication buses or signal lines 707.
  • It should be understood that the illustrated mobile terminal 700 is merely one example of a mobile terminal, and that the mobile terminal 700 may have more or fewer components than those shown in the figure, may combine two or more components, or may have a different configuration of components.
  • the various components shown in the figures can be implemented in hardware, software, or a combination of hardware and software, including one or more signal processing and/or application specific integrated circuits.
  • the mobile terminal integrated with the device for improving the black screen gesture response provided by the embodiment is described in detail below.
  • the mobile terminal takes a mobile phone as an example.
  • The memory 701 can be accessed by the CPU 702, the peripheral interface 703, and the like. The memory 701 can include a high-speed random access memory, and can also include a non-volatile memory, such as one or more magnetic disk storage devices, flash memory devices, or other non-volatile solid-state storage devices.
  • Peripheral interface 703, which can connect the input and output peripherals of the device to CPU 702 and memory 701.
  • the I/O subsystem 709 can connect input and output peripherals on the device, such as display screen 712 and other input/control devices 710, to peripheral interface 703.
  • the I/O subsystem 709 can include a display controller 7091 and at least one input controller 7092 that is configured to control other input/control devices 710.
  • at least one input controller 7092 receives electrical signals from other input/control devices 710 or transmits electrical signals to other input/control devices 710, and other input/control devices 710 may include physical buttons (press buttons, rocker buttons, etc.), Dial, slide switch, joystick, click wheel.
  • the input controller 7092 can be connected to any of the following: a keyboard, an infrared port, a USB interface, and a pointing device such as a mouse.
  • The display screen 712 is an input interface and an output interface between the mobile terminal and the user, and displays visual output to the user.
  • the visual output may include graphics, text, icons, video, and the like.
  • The display controller 7091 in the I/O subsystem 709 receives electrical signals from the display screen 712 or sends electrical signals to the display screen 712.
  • The display screen 712 detects contact on the screen, and the display controller 7091 converts the detected contact into interaction with a user interface object displayed on the display screen 712, thereby enabling human-computer interaction; the user interface object displayed on the display screen 712 can be an icon for running a game, an icon for connecting to a corresponding network, and the like.
  • the device may also include a light mouse, which is a touch sensitive surface that does not display a visual output, or an extension of a touch sensitive surface formed by the display screen.
  • The RF circuit 705 is configured to establish communication between the mobile phone and the wireless network (that is, the network side) to implement data reception and transmission between the mobile phone and the wireless network, for example, sending and receiving short messages, e-mails, and the like. Specifically, the RF circuit 705 receives and transmits RF signals, which are also referred to as electromagnetic signals; the RF circuit 705 converts an electrical signal into an electromagnetic signal or converts an electromagnetic signal into an electrical signal, and communicates with the communication network and other devices through the electromagnetic signal.
  • RF circuitry 705 may include known circuitry configured to perform these functions including, but not limited to, an antenna system, an RF transceiver, at least one amplifier, a tuner, one or more oscillators, a digital signal processor, a codec ( COder-DECoder, CODEC) Chipset, Subscriber Identity Module (SIM), etc.
  • the audio circuit 706 is arranged to receive audio data from the peripheral interface 703, convert the audio data into an electrical signal, and transmit the electrical signal to the speaker 711.
  • the speaker 711 is arranged to restore the voice signal received by the mobile phone from the wireless network through the RF circuit 705 to sound and play the sound to the user.
  • the power management chip 708 is configured to provide power and power management for the hardware connected to the CPU 702, the I/O subsystem, and the peripheral interface.
  • The mobile terminal provided by the embodiment of the present application can effectively improve the response speed of black screen gestures and shorten the time required from detecting a black screen gesture to opening the application corresponding to the black screen gesture.
  • The apparatus for improving black screen gesture response, the storage medium, and the mobile terminal provided by the foregoing embodiments can perform the method for improving black screen gesture response provided by any embodiment of the present application, and have the corresponding functional modules and beneficial effects for performing the method.
  • For technical details not described exhaustively in the foregoing embodiments, reference may be made to the method for improving black screen gesture response provided by any embodiment of the present application.


Abstract

A method and apparatus for improving black screen gesture response, a storage medium, and a mobile terminal. The method includes: when a black screen gesture event is detected, acquiring gesture data corresponding to the black screen gesture event, where the gesture data includes a gesture type; opening, through at least one opening thread, an application corresponding to the gesture type in the background; determining, through at least one drawing thread executed in parallel with the opening thread, a gesture trajectory corresponding to the gesture type, and drawing the gesture trajectory to a display screen; and when a preset display condition is met, drawing an interface corresponding to the application to the display screen, so as to display the interface of the application on the display screen.

Description

Method and Apparatus for Improving Black Screen Gesture Response, Storage Medium, and Mobile Terminal

Technical Field
The embodiments of the present application relate to mobile terminal technologies, and for example to a method, an apparatus, a storage medium, and a mobile terminal for improving black screen gesture response.
Background
The display screens of mobile terminals such as smart phones, personal digital assistants, tablet computers, or handheld game consoles are typically designed with a touch detection function to provide a touch input method that makes user operation more convenient.
The black screen gesture is a distinctive and futuristic feature of smart phones: when the black screen gesture function is enabled, gesture operations performed on the display screen can still be detected while the smart phone is in a standby, screen-off state, thereby triggering a corresponding function or piece of software inside the phone. However, defects in the black screen gesture processing flow make the mobile terminal respond slowly to black screen gestures, so the black screen gesture function does not feel sufficiently responsive.
Summary
The embodiments of the present application provide a method, an apparatus, a storage medium, and a mobile terminal for improving black screen gesture response, which can increase the response speed of black screen gestures.
In a first aspect, an embodiment of the present application provides a method for improving black screen gesture response, including:
when a black screen gesture event is detected, acquiring gesture data corresponding to the black screen gesture event, where the gesture data includes a gesture type;
opening, through at least one opening thread, an application corresponding to the gesture type in the background;
determining, through at least one drawing thread executed in parallel with the opening thread, a gesture trajectory corresponding to the gesture type according to the gesture data, and drawing the gesture trajectory to a display screen; and
when a preset display condition is met, drawing an interface corresponding to the application to the display screen, so as to display the interface of the application on the display screen.
In a second aspect, an embodiment of the present application further provides an apparatus for improving black screen gesture response, including:
a gesture data acquisition module, configured to acquire, when a black screen gesture event is detected, gesture data corresponding to the black screen gesture event, where the gesture data includes a gesture type;
an application opening module, configured to open, through at least one opening thread, an application corresponding to the gesture type in the background;
a gesture trajectory drawing module, configured to determine, through at least one drawing thread executed in parallel with the opening thread, a gesture trajectory corresponding to the gesture type, and draw the gesture trajectory to a display screen; and
an application display module, configured to draw, when a preset display condition is met, an interface corresponding to the application to the display screen, so as to display the interface of the application on the display screen.
In a third aspect, an embodiment of the present application further provides a computer-readable storage medium storing a computer program which, when executed by a processor, implements the method for improving black screen gesture response described in the embodiments of the present application.
In a fourth aspect, an embodiment of the present application further provides a mobile terminal, including a display screen, a memory, a processor, and a computer program stored on the memory and executable on the processor, where the processor, when executing the computer program, implements the method for improving black screen gesture response described in the embodiments of the present application.
According to the solution for improving black screen gesture response provided by the embodiments of the present application, when a black screen gesture event is detected, gesture data corresponding to the black screen gesture event is acquired, where the gesture data includes a gesture type; an application corresponding to the gesture type is opened in the background through at least one opening thread; a gesture trajectory corresponding to the gesture type is determined according to the gesture data through at least one drawing thread executed in parallel with the opening thread, and the gesture trajectory is drawn to the display screen; and when a preset display condition is met, an interface corresponding to the application is drawn to the display screen, so as to display the interface of the application on the display screen. By adopting the above technical solution, the response speed of black screen gestures can be effectively increased, and the time from detecting a black screen gesture to opening the application corresponding to the black screen gesture can be shortened.
Brief Description of the Drawings
FIG. 1 is a flowchart of a method for improving black screen gesture response according to an embodiment.
FIG. 2 is a schematic diagram of an Android system framework according to an embodiment.
FIG. 3 is a flowchart of a method for acquiring gesture data corresponding to a black screen gesture event according to an embodiment.
FIG. 4a is a flowchart of another method for improving black screen gesture response according to an embodiment.
FIG. 4b is a schematic diagram showing the display of a black screen gesture trajectory according to an embodiment.
FIG. 5a is a flowchart of yet another method for improving black screen gesture response according to an embodiment.
FIG. 5b is a schematic diagram showing the display of another black screen gesture trajectory according to an embodiment.
FIG. 6a is a structural block diagram of an apparatus for improving black screen gesture response according to an embodiment.
FIG. 6b is a structural block diagram of the gesture data acquisition module 610 in FIG. 6a.
FIG. 6c is a structural block diagram of yet another apparatus for improving black screen gesture response according to an embodiment.
FIG. 6d is a structural block diagram of yet another apparatus for improving black screen gesture response according to an embodiment.
FIG. 6e is a structural block diagram of yet another apparatus for improving black screen gesture response according to an embodiment.
FIG. 7 is a schematic structural diagram of a mobile terminal according to an embodiment.
Detailed Description
The present application is further described in detail below with reference to the drawings and embodiments. It can be understood that the specific embodiments described herein are intended only to explain the present application rather than to limit it. It should also be noted that, for ease of description, the drawings show only the parts related to the present application rather than the entire structure.
Before the exemplary embodiments are discussed in more detail, it should be mentioned that some of them are described as processes or methods depicted in flowcharts. Although a flowchart describes multiple steps as sequential processing, many of the steps may be performed in parallel, concurrently, or simultaneously. In addition, the order of the steps may be rearranged. The processing may be terminated when its operations are completed, but it may also have additional steps not included in the drawings. The processing may correspond to a method, a function, a procedure, a subroutine, a subprogram, and the like.
The black screen gesture function may work as follows: while a mobile terminal (for example a smart phone) is asleep with its screen off, the touch display screen runs in a low-power state to detect black screen gestures acting on the touch display screen, and a certain function of the smart phone is woken up, or an application preset to correspond to the black screen gesture type is opened, according to the detected black screen gesture. To facilitate understanding of the black screen gesture function, the flow from detecting a black screen gesture in the screen-off state to the application layer opening the application corresponding to the black screen gesture is described below. The flow includes: storing gesture data corresponding to the black screen gesture in a preset node of the driver layer, where the gesture data includes gesture coordinates and a gesture type; performing, by the driver layer, a validity judgment on the black screen gesture data; if the data is valid, dispatching, by the framework layer, a black screen gesture event; after the application layer receives the black screen gesture event, reading, by the application layer, the gesture coordinates from the preset node in the driver layer, calculating an animation trajectory of the black screen gesture according to the gesture coordinates and the gesture type, and sending the animation trajectory data to a frame buffer (FrameBuffer), so that the animation trajectory is refreshed to the touch display screen for display at a set screen refresh rate; and subsequently, performing, by the application layer, the operation of opening the application corresponding to the black screen gesture.
Because the above black screen gesture execution flow detects the black screen gesture while the display screen is off, the mobile terminal appears unresponsive to the user during the process from the detected black screen gesture triggering the reporting of a black screen gesture event, through displaying the black screen gesture trajectory on the display screen, to opening the application. For example, when the user inputs a black screen gesture to open the corresponding application, the operation of opening the application takes a relatively long time, so the system displays the application's interface on the display screen only after a delay following the gesture input. As a result, the user intuitively feels that the black screen gesture function is not responsive enough. The solution for improving black screen gesture response provided by the embodiments of the present application can well solve the above problem of the long opening delay of the application corresponding to a black screen gesture.
FIG. 1 is a flowchart of a method for improving black screen gesture response according to an embodiment of the present application. The method may be performed by an apparatus for improving black screen gesture response, where the apparatus may be implemented by software and/or hardware and may generally be integrated in a mobile terminal. As shown in FIG. 1, the method includes the following steps.
Step 110: when a black screen gesture event is detected, acquire gesture data corresponding to the black screen gesture event.
The black screen gesture event may be an event agreed upon in advance between the driver layer and the application layer to indicate that a black screen gesture has been input.
A black screen gesture may be a touch gesture input by the user on the display screen of a mobile terminal in the screen-off state after the black screen gesture function has been enabled. It can be understood that a black screen gesture is not limited to a touch gesture input on the display screen; it may also be an operation detected by a sensor of the mobile terminal, for example a gesture of shaking the smart phone left and right, a gesture of sweeping over the display screen of the smart phone without touching it, or a gesture of pressing the frame of the smart phone. The display screen here is a display screen with a touch detection function; for example, touch electrodes may be fabricated on the glass substrate of the display panel to obtain a display screen with a touch detection function. It can be understood that a display screen without a touch detection function needs to cooperate with a touch screen to implement the functions of the display screen in the embodiments of the present application.
The gesture data may include a gesture type, gesture coordinates, a set end flag, and the like.
FIG. 2 is a schematic diagram of an Android system framework according to an embodiment of the present application. Taking a mobile terminal whose operating system is the Android system shown in FIG. 2 as an example, the execution flow of the black screen gesture function provided by the embodiments of the present application is introduced below. As shown in FIG. 2, the Android system framework includes, from bottom to top, a kernel layer 210, a core library layer 220, a framework layer 230, and an application layer 240. The kernel layer 210 provides core system services, including security, memory management, process management, the network protocol stack, hardware drivers, and the like. The hardware drivers in the kernel layer 210 are denoted as the driver layer 211, which includes a touch display screen driver, a camera driver, and the like. The core library layer 220 includes the Android Runtime and libraries. The Android Runtime provides most of the functions available in the core libraries of the Java programming language, including the Core Libraries and the Dalvik virtual machine (Dalvik VM); each Android application is an instance in the Dalvik virtual machine and runs in its own process. The libraries are used by multiple components of the Android system and include functions such as the Media Framework, the Surface Manager, the relational database engine SQLite, and bitmap and vector font rendering (FreeType), each of which is exposed to developers through the framework layer 230 of the Android system. The framework layer 230 provides a series of libraries required for developing Android applications, so that developers can develop applications quickly, conveniently reuse components, and implement personalized extensions through inheritance; the services it provides include component management services, window management services, system data source components, the widget framework, resource management services, installation package management services, and the like. The application layer 240 includes various applications that interact directly with the user, or service programs written in the Java language and running in the background, including desktop applications, contact applications, call applications, camera applications, picture browsers, games, maps, web browsers, and other applications developed by developers.
Exemplarily, after the black screen gesture function is enabled, the touch chip generates a wake-up signal when it detects a black screen gesture and sends the wake-up signal to the kernel layer 210. The wake-up signal triggers the kernel layer 210 to perform a system wake-up operation. After the system wakes up, the kernel layer 210 calls an interrupt function of the driver layer 211 to read the gesture data in the touch chip and stores the read gesture data in a preset node of the driver layer 211. The touch chip is configured to output touch sensing control signals to the touch display screen to detect touch operations, identify the gesture coordinates of the black screen gesture acting on the touch display screen, and store the gesture coordinates in its own register as gesture data. The preset node may be a file node, for example a virtual file node under the proc-D directory. After the data has been read, the driver layer judges the validity of the gesture data; there are many ways to judge validity, which are not specifically limited in this embodiment. For example, the driver layer 211 determines the gesture type according to the gesture coordinates contained in the gesture data and stores the determined gesture type in the preset node as gesture data; if the gesture type is not a preset black screen gesture, the gesture data is judged to be invalid. As another example, the driver layer counts the number of items of gesture data and judges whether the number meets the requirement for drawing the preset black screen gesture; if not, the gesture data is judged to be invalid. When the data is valid, the driver layer 211 reports a black screen gesture event. The black screen gesture event is transmitted through the core library layer 220 to the framework layer 230 and dispatched by the framework layer 230 to the application layer 240. When the application layer 240 obtains the black screen gesture event, it reads the gesture data from the preset node of the driver layer 211. After the gesture data is ready, the black screen gesture trajectory is calculated according to the gesture coordinates contained in the gesture data, and the black screen gesture trajectory is drawn on the touch display screen for display. Then, based on the gesture type in the read gesture data, the application layer 240 opens the application corresponding to that gesture type. The gesture type may be a gesture preset in the mobile terminal for implementing a certain function, or a user-defined gesture. For example, the gesture type may be "0", representing opening the camera; as another example, the gesture type may be "V", representing turning on the flashlight, and so on.
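As a rough illustration of the application-layer read described above, the sketch below parses gesture data out of a driver-exposed virtual file node. The node path `/proc/gesture_data` and the line format (one `x,y` pair per line plus a `type=` line) are hypothetical stand-ins for whatever interface the actual driver layer defines.

```java
import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;
import java.util.ArrayList;
import java.util.List;

/** Minimal sketch: read gesture coordinates and gesture type from a driver node. */
public class GestureNodeReader {

    /** Hypothetical node path; the real driver layer defines its own proc node. */
    private static final String NODE_PATH = "/proc/gesture_data";

    public static final class GestureData {
        public final List<float[]> points = new ArrayList<>(); // each entry is {x, y}
        public String gestureType = "";                        // e.g. "0", "V", "W"
    }

    public GestureData read() throws IOException {
        GestureData data = new GestureData();
        try (BufferedReader reader = new BufferedReader(new FileReader(NODE_PATH))) {
            String line;
            while ((line = reader.readLine()) != null) {
                line = line.trim();
                if (line.startsWith("type=")) {          // assumed marker for the gesture type
                    data.gestureType = line.substring(5);
                } else if (!line.isEmpty()) {
                    String[] xy = line.split(",");
                    data.points.add(new float[] {
                            Float.parseFloat(xy[0]), Float.parseFloat(xy[1]) });
                }
            }
        }
        return data;
    }
}
```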
The execution flow of the black screen gesture function is not limited to the manner enumerated in this embodiment. For example, the black screen gesture event may instead be reported as soon as the system wakes up, with the kernel layer 210 calling the interrupt function of the driver layer 211 to read the gesture data in the touch chip and storing the gesture data in the preset node of the driver layer 211; while the black screen gesture event is being reported, the driver layer 211 reads the gesture data and determines the gesture type according to the gesture data in parallel. For example, the driver layer 211 obtains the gesture data in the preset node, performs curve fitting on the gesture data to obtain the gesture type closest to the black screen gesture, and also stores that gesture type in the preset node as gesture data. When the application layer 240 receives the black screen gesture event, it checks at a set period whether the gesture data in the preset node is ready. When the data is ready, the application layer 240 reads the gesture data from the preset node. When the gesture data is read successfully and is valid, the black screen gesture trajectory is calculated according to the gesture coordinates contained in the gesture data and drawn on the touch display screen for display. Then, based on the gesture type in the read gesture data, the application layer 240 opens the application corresponding to that gesture type.
Step 120: open, through at least one opening thread, the application corresponding to the gesture type in the background.
The opening thread may be used to perform the operation of opening, in the background, the application corresponding to the gesture type. In the Android system, an application is composed of Activities; therefore, the launch process of an application is actually the launch process of the application's default Activity, including the invocation of the Activity class, the instantiation of objects, and so on. After the application corresponding to the gesture type is opened in the background, the picture frames corresponding to the application interface may be cached, and the application interface is temporarily not drawn to the display screen.
An association between gesture types and applications or phone functions may be established in advance, and the association between gesture types and applications (such as process numbers or installation package names) or phone functions (such as switching between working modes like phone wake-up, conference mode, or standard mode) may be stored in the form of a whitelist. It can be understood that there are many ways to establish the association between gesture types and applications, which are not limited in the embodiments of the present application. For example, before the mobile terminal leaves the factory, a quick-launch function may be configured for a set function or a set application of the mobile terminal, so that inputting a set gesture directly performs the set function or opens the set application in the screen-off state. Taking the flashlight as an example, the flashlight is preset with a quick-launch function before the mobile terminal leaves the factory, and inputting the black screen gesture "0" turns on the flashlight in the screen-off state. As another example, the mobile terminal provides a black screen gesture configuration function: when the user enables the black screen gesture function, the user is prompted to select the application for which a quick-launch function is to be set and to input or select the black screen gesture corresponding to that application, thereby establishing the association between the gesture type and the application.
When the application layer receives the black screen gesture event, it may query the preset whitelist through the opening thread, determine the application corresponding to the gesture type, and open that application in the background.
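A minimal sketch of the whitelist lookup and the background launch trigger, assuming Android's standard `Intent` and `PackageManager` APIs; the package names are invented examples, and holding the launched interface off screen (caching its frames instead of showing them) is left to the system-level mechanism the application describes, since the public SDK has no direct call for that.

```java
import android.content.Context;
import android.content.Intent;
import java.util.HashMap;
import java.util.Map;

/** Sketch of a gesture-type whitelist and a launch trigger run on the "opening" thread. */
public class GestureAppLauncher {

    // Example mappings only; a real terminal would persist a user-configurable whitelist.
    private static final Map<String, String> WHITELIST = new HashMap<>();
    static {
        WHITELIST.put("0", "com.example.camera");     // hypothetical package names
        WHITELIST.put("V", "com.example.flashlight");
    }

    private final Context context;

    public GestureAppLauncher(Context context) {
        this.context = context.getApplicationContext();
    }

    /** Returns true if the gesture type maps to an installed app and the launch was triggered. */
    public boolean launchForGesture(String gestureType) {
        String packageName = WHITELIST.get(gestureType);
        if (packageName == null) {
            return false; // gesture type not in the whitelist
        }
        Intent intent = context.getPackageManager().getLaunchIntentForPackage(packageName);
        if (intent == null) {
            return false; // app not installed
        }
        // NEW_TASK is required when starting an activity from a non-activity context.
        intent.addFlags(Intent.FLAG_ACTIVITY_NEW_TASK);
        context.startActivity(intent);
        return true;
    }
}
```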
Step 130: determine, through at least one drawing thread executed in parallel with the opening thread, the gesture trajectory corresponding to the gesture type, and draw the gesture trajectory to the display screen.
The drawing thread may be used to execute, in parallel with the opening thread, the operations of determining the gesture trajectory corresponding to the gesture type and drawing the gesture trajectory to the display screen.
Exemplarily, while the application layer is opening the application corresponding to the gesture type in the background through the opening thread, it determines the gesture trajectory corresponding to the gesture type in parallel through at least one other drawing thread. There are many ways to determine the gesture trajectory corresponding to a gesture type. For example, gesture pictures with different display effects corresponding to the gesture types of black screen gestures may be stored in the mobile terminal in advance. After a black screen gesture is input, the drawing thread can obtain the gesture type of the black screen gesture without having to obtain the gesture coordinates, which greatly reduces the amount of data to be obtained. A gesture picture is then determined according to the gesture type and drawn, and the drawn gesture picture is sent to the frame buffer (FrameBuffer) so that it is refreshed to the display screen at the set refresh rate. A gesture picture with the default display effect may be selected according to the gesture type, or a gesture picture matching a display effect previously chosen by the user may be selected. As another example, in order to reproduce the black screen gesture input by the user more realistically, after a black screen gesture is input the drawing thread may obtain the gesture type and the gesture coordinates, extract multiple sample points from the gesture coordinates according to a set drawing rule corresponding to the gesture type, connect the sample points in sequence to obtain the gesture trajectory corresponding to the gesture type, and send the picture frames containing the sample points and the lines between them to the frame buffer (FrameBuffer), so that the gesture trajectory is shown on the display screen in the form of an animation that simulates the drawing process of the black screen gesture. It should be noted that the above process may be completed by one drawing thread alone, or gesture trajectory segments may be drawn separately by at least two drawing threads; after each drawing thread finishes the gesture trajectory segment it is responsible for, the finished segments are stitched together into the complete gesture trajectory.
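The parallel structure itself can be expressed with a plain thread pool, as in the hedged sketch below: one task stands in for the opening thread, one for the drawing thread, and a third waits for both before the interface switch. The `Runnable` parameters are placeholders for the real work described in steps 120 to 140.

```java
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

/** Sketch: run the app-opening work and the trajectory drawing in parallel. */
public class ParallelGestureHandler {

    // Three workers: the opening thread, the drawing thread, and a waiter for the switch.
    private final ExecutorService pool = Executors.newFixedThreadPool(3);

    public void onGestureEvent(Runnable openAppInBackground,
                               Runnable drawTrajectory,
                               Runnable showAppInterface) {
        CountDownLatch bothDone = new CountDownLatch(2);

        pool.execute(() -> {              // "opening" thread
            openAppInBackground.run();    // launch the target app without showing it yet
            bothDone.countDown();
        });
        pool.execute(() -> {              // "drawing" thread
            drawTrajectory.run();         // determine and render the gesture trajectory
            bothDone.countDown();
        });
        pool.execute(() -> {              // waits for a simple display condition
            try {
                bothDone.await();         // both parallel tasks have finished
                showAppInterface.run();   // switch the screen to the app interface
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });
    }
}
```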
Step 140: when the preset display condition is met, draw the interface corresponding to the application to the display screen, so as to display the interface of the application on the display screen.
The display condition may be set according to actual needs and is not specifically limited in the embodiments of the present application. For example, a display time threshold for the gesture trajectory on the display screen may be specified as the display condition for the application interface, that is, when the display time of the gesture trajectory reaches the set display time threshold, the display screen switches to the application interface. As another example, the display condition may be that the end point of the gesture trajectory drawn to the display screen has been displayed, that is, when the end point of the gesture trajectory is drawn, the display screen switches to the application interface.
Exemplarily, when the display time of the gesture trajectory reaches the set display time threshold, data is read from the preset storage area in which the picture frames corresponding to the application interface are cached and transmitted to the frame buffer (FrameBuffer), so that the application interface is refreshed to the display screen at the set refresh rate and the displayed picture is switched directly from the gesture trajectory to the application interface.
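For the time-threshold variant, one plausible expression on Android is a delayed post to the main looper, as in this sketch; the 350 ms threshold is an arbitrary placeholder, not a value taken from the application.

```java
import android.os.Handler;
import android.os.Looper;

/** Sketch: switch from the trajectory to the cached app interface after a fixed delay. */
public class DelayedInterfaceSwitch {

    private static final long TRAJECTORY_DISPLAY_MS = 350; // placeholder threshold

    private final Handler mainHandler = new Handler(Looper.getMainLooper());

    /** Call once the trajectory has been pushed to the display. */
    public void scheduleSwitch(Runnable showCachedAppInterface) {
        mainHandler.postDelayed(showCachedAppInterface, TRAJECTORY_DISPLAY_MS);
    }
}
```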
According to the technical solution of this embodiment, when a black screen gesture event is detected, gesture data corresponding to the black screen gesture event is acquired, where the gesture data includes a gesture type; an application corresponding to the gesture type is opened in the background through at least one opening thread; a gesture trajectory corresponding to the gesture type is determined according to the gesture data through at least one drawing thread executed in parallel with the opening thread, and the gesture trajectory is drawn to the display screen; and when a preset display condition is met, the interface corresponding to the application is drawn to the display screen, so as to display the interface of the application on the display screen. By adopting the above technical solution, the response speed of black screen gestures can be effectively increased, and the time from detecting a black screen gesture to opening the application corresponding to the black screen gesture can be shortened.
In an embodiment, in some cases the user may have input a wrong black screen gesture and notice the mistake while the gesture trajectory is being displayed; in this case, the user usually does not want the application corresponding to the wrong black screen gesture to be opened and displayed on the display screen. A function of closing the application in the background may be added before the interface corresponding to the application is drawn to the display screen. For example, before drawing the interface corresponding to the application to the display screen, it is judged whether an abandon-opening instruction input by the user for the application has been detected; when such an instruction is detected, the application is closed in the background and the mobile terminal is controlled to return to the black screen gesture mode. In this way, the application corresponding to the wrong black screen gesture can be intercepted in time before it is displayed, avoiding wasting the processing resources of the graphics processor. At the same time, the user is spared from having to close the application after it has been displayed by mistake, which effectively shortens the time needed to re-enter the black screen gesture mode.
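A small sketch of how such an abandon-opening check might gate the final switch; the cancel-gesture detection and the actual close/restore routines are assumed to exist elsewhere and are passed in as placeholders.

```java
import java.util.concurrent.atomic.AtomicBoolean;

/** Sketch: let an "abandon" gesture cancel the pending app display. */
public class AbandonOpenGuard {

    private final AtomicBoolean abandoned = new AtomicBoolean(false);

    /** Called when the user's cancel gesture (e.g. a shake) is detected. */
    public void onAbandonInstruction() {
        abandoned.set(true);
    }

    /** Called when the display condition is met, just before drawing the app interface. */
    public void onDisplayConditionMet(Runnable showAppInterface,
                                      Runnable closeAppAndRestoreGestureMode) {
        if (abandoned.get()) {
            closeAppAndRestoreGestureMode.run(); // kill the background app, screen stays off
        } else {
            showAppInterface.run();              // normal path: show the app interface
        }
    }
}
```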
FIG. 3 is a flowchart of a method for acquiring the gesture data corresponding to the black screen gesture event according to an embodiment of the present application. As shown in FIG. 3, the method includes the following steps.
Step 310: when a black screen gesture event is detected, read the flag bit in the driver layer of the operating system that identifies the data state of the preset node.
The data state of the preset node includes "ready" and "preparing". Whether the data of the preset node is in the ready state or the preparing state may be determined by checking whether the gesture data in the preset node includes a set end flag. For example, the character corresponding to the preset end flag is "#". When the user inputs a black screen gesture, the touch chip stores the gesture data corresponding to the detected black screen gesture in its own preset register; after detecting that the black screen gesture input is complete, the touch chip appends "#" to the end of the gesture data stored in the register. The driver layer reads the gesture data in the preset node at a set period; if it detects the character corresponding to the preset end flag, that is, "#", it determines that the data state of the preset node is "ready".
It can be understood that there can be many kinds of end flags, which are not limited to the "#" enumerated in this embodiment.
The flag bit is used to identify the state of the gesture data in the preset node of the driver layer. The application layer can learn whether the gesture data in the preset node is in the ready state or the preparing state by querying the flag bit. The value of the flag bit is determined by whether the driver layer has read the set end flag. For example, if the driver layer has read the set end flag, the value of the flag bit is updated to the preset value corresponding to the ready state; thereafter, the application layer can determine from the updated flag bit that the data in the preset node is ready. If the driver layer has not read the set end flag, the value of the flag bit remains at the value corresponding to the preparing state.
Step 320: judge, according to the flag bit, whether the gesture data in the preset node is ready; if it is ready, perform step 330; if it is not ready, perform step 340.
The application layer reads the value of the flag bit in the driver layer at a set period and matches the value against the value that represents that the gesture data in the preset node is ready. If the value equals the value representing that the gesture data is ready, it is determined that the gesture data in the preset node is ready and step 330 is performed; if the value does not equal that value, it is determined that the gesture data in the preset node is not ready and step 340 is performed.
Step 330: extract the gesture data from the preset node.
After the gesture data in the preset node is ready, the gesture data is extracted from the preset node. For example, the application layer calls a set function to read the gesture data from the virtual file node under the proc-D directory.
Step 340: start timing with a timer.
When the gesture data in the preset node is not ready, a timer is started to time a set time length, where the set time length equals the first period at which the application layer reads the flag bit in the driver layer.
Step 350: judge whether the value of the timer has reached the set time length; if it has, perform step 310; if it has not, perform step 340.
The reading of the timer is taken at a second period and compared with the set time length; if it is greater than or equal to the set time length, step 310 is performed; if it is less than the set time length, the flow returns to step 340, where the second period is shorter than the first period. Exemplarily, when the application layer determines that the gesture data in the preset node is not ready, it waits for the set time length and then reads the value of the flag bit in the driver layer again, so as to judge from that value whether the gesture data in the preset node is ready.
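The readiness check in steps 310 to 350 can be sketched as a poll on a driver-exposed flag, as below; the flag node path, the "ready" value, and the two periods are invented placeholders rather than the driver's real interface.

```java
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Paths;

/** Sketch: poll a driver-side status flag until the gesture data is marked ready. */
public class GestureReadyPoller {

    private static final String FLAG_NODE = "/proc/gesture_ready"; // hypothetical flag node
    private static final String READY_VALUE = "1";                 // assumed "ready" value
    private static final long WAIT_MS = 20;                        // set time length (1st period)
    private static final long CHECK_MS = 2;                        // timer check step (2nd period)

    /** Blocks until the flag reports "ready" or the overall timeout expires. */
    public boolean waitUntilReady(long timeoutMs) throws IOException, InterruptedException {
        long deadline = System.currentTimeMillis() + timeoutMs;
        while (System.currentTimeMillis() < deadline) {
            String flag = new String(Files.readAllBytes(Paths.get(FLAG_NODE)),
                    StandardCharsets.UTF_8).trim();
            if (READY_VALUE.equals(flag)) {
                return true;                    // data in the preset node is ready to read
            }
            long waitedUntil = System.currentTimeMillis() + WAIT_MS;
            while (System.currentTimeMillis() < waitedUntil) {
                Thread.sleep(CHECK_MS);         // fine-grained timer check, CHECK_MS < WAIT_MS
            }
        }
        return false;
    }
}
```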
According to the technical solution of this embodiment, whether the gesture data in the preset node of the driver layer is ready is determined by reading a flag bit preset in the driver layer; if it is ready, the gesture data in the preset node is read; if it is not ready, the set flag bit is read again after waiting for the set time length, so as to judge the data state of the preset node again. By adopting the above technical solution, the time for the application layer to determine that the gesture data is ready and to read the gesture data can be effectively shortened, thereby further increasing the response speed of black screen gestures.
FIG. 4a is a flowchart of another method for improving black screen gesture response according to an embodiment of the present application. As shown in FIG. 4a, the method includes the following steps.
Step 410: when a black screen gesture event is detected, acquire the gesture data corresponding to the black screen gesture event.
The gesture data includes a gesture type. The gesture type is determined by the driver layer by means of curve fitting according to the gesture coordinates corresponding to the black screen gesture. Curve fitting is a data processing method that uses a continuous curve to approximately characterize the functional relationship between the coordinates represented by discrete points on a plane. Exemplarily, the driver layer performs curve fitting on the gesture coordinates in the read gesture data to calculate the gesture type closest to the input black screen gesture.
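The application does not spell out its curve-fitting procedure, so the sketch below substitutes a simple nearest-template comparison over evenly resampled points purely to illustrate how raw coordinates could be mapped to a gesture type; it is not the fitting method the driver layer actually uses. The stroke and the templates are assumed to be in the same (for example, normalized) coordinate space.

```java
import java.util.List;
import java.util.Map;

/** Illustrative stand-in: classify a gesture by nearest template after resampling. */
public class GestureClassifier {

    private static final int SAMPLES = 32;

    /** templates: gesture type -> SAMPLES reference points in the same coordinate space. */
    public String classify(List<float[]> stroke, Map<String, float[][]> templates) {
        float[][] candidate = resample(stroke, SAMPLES);
        String best = null;
        double bestScore = Double.MAX_VALUE;
        for (Map.Entry<String, float[][]> e : templates.entrySet()) {
            double score = 0;
            for (int i = 0; i < SAMPLES; i++) {
                double dx = candidate[i][0] - e.getValue()[i][0];
                double dy = candidate[i][1] - e.getValue()[i][1];
                score += Math.sqrt(dx * dx + dy * dy);   // accumulated point-to-point distance
            }
            if (score < bestScore) {
                bestScore = score;
                best = e.getKey();
            }
        }
        return best;
    }

    /** Picks n points spread evenly along the recorded stroke (by index, for brevity).
     *  Assumes the stroke contains at least one point. */
    private float[][] resample(List<float[]> stroke, int n) {
        float[][] out = new float[n][2];
        for (int i = 0; i < n; i++) {
            float[] p = stroke.get(i * (stroke.size() - 1) / (n - 1));
            out[i] = new float[] { p[0], p[1] };
        }
        return out;
    }
}
```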
Step 420: open, through at least one opening thread, the application corresponding to the gesture type in the background.
Step 430: match, through at least one drawing thread executed in parallel with the opening thread, the gesture type against preset standard graphics.
While launching the application corresponding to the gesture type in the background through the opening thread, the application layer queries a preconfigured standard graphics library through the drawing thread to determine the standard graphic that matches the gesture type. The standard graphics library may be provided in the mobile terminal so that the application layer can query it; since it does not depend on the Internet, it can be queried quickly. Moreover, the standard graphics library may be updated, after the mobile terminal goes online, based on update messages pushed by a remote server. Optionally, the standard graphics library may also be stored on a remote server to avoid occupying the storage space of the mobile terminal.
Exemplarily, the gesture type of the black screen gesture input by the user is "W"; the standard graphics library is queried according to this gesture type to determine the standard graphic of "W" with the set display effect, where the set display effect may be the system default display effect or a display effect preset by the user, including font color, glyph, font size, and so on.
Step 440: display, on the display screen, the standard graphic successfully matched to the gesture type.
The application layer finds the standard graphic matching the gesture type in the preconfigured standard graphics library, draws the image of the standard graphic, stores the image data in the frame buffer of the display screen, and refreshes the image of the standard graphic to the display screen at the set refresh rate, so as to display the image of the standard graphic on the display screen.
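A minimal sketch of what a local standard-graphics library keyed by gesture type and display effect could look like; the asset paths, styles, and `StandardGraphic` descriptor are all invented for illustration.

```java
import java.util.HashMap;
import java.util.Map;

/** Sketch of a local standard-graphic library keyed by gesture type and display style. */
public class StandardGraphicLibrary {

    /** Hypothetical descriptor for a pre-rendered gesture graphic. */
    public static final class StandardGraphic {
        public final String assetPath;   // e.g. a pre-drawn "W" image shipped with the terminal
        public final int colorArgb;
        public StandardGraphic(String assetPath, int colorArgb) {
            this.assetPath = assetPath;
            this.colorArgb = colorArgb;
        }
    }

    private final Map<String, StandardGraphic> graphics = new HashMap<>();

    public StandardGraphicLibrary() {
        // Example entries only; a real library would be shipped with, or pushed to, the device.
        graphics.put("W|default", new StandardGraphic("gesture/w_default.png", 0xFFFFFFFF));
        graphics.put("0|default", new StandardGraphic("gesture/o_default.png", 0xFFFFFFFF));
    }

    /** Falls back to the default display effect when no user-chosen style is stored. */
    public StandardGraphic match(String gestureType, String style) {
        StandardGraphic g = graphics.get(gestureType + "|" + style);
        return (g != null) ? g : graphics.get(gestureType + "|default");
    }
}
```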
FIG. 4b is a schematic diagram showing the display of a black screen gesture trajectory according to an embodiment of the present application. As shown in FIG. 4b, when the user inputs the black screen gesture "W" in the black screen state, the standard graphic corresponding to this gesture type is displayed on the display screen.
Step 450: judge whether the display condition preset for the application is met; if it is met, perform step 460; if it is not met, perform step 440.
A display condition for switching from the image display interface of the standard graphic to the display interface of the application is set in advance; according to actual needs, the display condition may be set such that, for example, the display time of the image of the current standard graphic does not exceed a set time threshold. If the display condition preset for the application is met, step 470 is performed; if it is not met, the flow returns to step 450.
Step 460: judge whether an abandon-opening instruction input by the user for the application has been detected; if no such instruction has been detected, perform step 470; when such an instruction is detected, perform step 480.
The abandon-opening instruction is a preconfigured instruction triggered by a set operation of the user. There are many set operations that can trigger the abandon-opening instruction, which are not specifically limited here. Exemplarily, it may be a system default gesture, such as shaking the mobile terminal or sweeping across the upper part of the display screen of the mobile terminal, or it may be a user-defined gesture, and so on.
Step 470: draw the interface corresponding to the application to the display screen, so as to display the interface of the application on the display screen.
Data is read from the preset storage area in which the picture frames corresponding to the application interface are cached and transmitted to the frame buffer, so that the application interface is refreshed to the display screen at the set refresh rate and the displayed picture is switched directly from the gesture trajectory to the application interface.
Step 480: close the application in the background, and control the mobile terminal to return to the black screen gesture mode.
If it is detected that the user has input an abandon-display instruction, the operation of drawing the interface corresponding to the application to the display screen is not performed; the application is closed in the background, the display screen is turned off, and the mobile terminal is controlled to re-enter the black screen gesture mode. In the black screen gesture mode, the mobile terminal has relatively low power consumption and can detect black screen gestures acting on the display screen.
According to the technical solution of this embodiment, based on the gesture type included in the acquired gesture data, while the opening thread opens the application corresponding to the gesture type, the drawing thread queries the preset standard graphics library to determine the standard graphic corresponding to the gesture type and draws the image of the standard graphic to the display screen; when the display condition preset for the application is met, the displayed picture is switched directly from the gesture trajectory to the application interface. By adopting the above technical solution, while the application is being opened in the background, the gesture trajectory of the black screen gesture input by the user can be determined quickly by querying the standard graphics library with the gesture type and drawn on the display screen, preventing the user from intuitively feeling that the black screen gesture function is unresponsive and thereby further increasing the response speed of black screen gestures.
FIG. 5a is a flowchart of yet another method for improving black screen gesture response according to an embodiment of the present application. As shown in FIG. 5a, the method includes the following steps.
Step 510: when a black screen gesture event is detected, acquire the gesture data corresponding to the black screen gesture event.
Step 520: open, through at least one opening thread, the application corresponding to the gesture type in the background.
Step 530: extract, through at least one drawing thread executed in parallel with the opening thread, sample points that satisfy the preset sampling rule for the gesture type from the gesture coordinates included in the gesture data.
A sampling rule is set for the gesture type in advance; for example, it may be set that one coordinate point is collected as a sample point every set number of coordinate points. That is, the drawing thread extracts, according to the preset sampling rule, a number of sample points satisfying that rule from the gesture data read by the application layer.
Step 540: draw the gesture trajectory corresponding to the gesture type according to the sample points, and display the gesture trajectory on the display screen in the form of an animation.
Curve fitting may be performed on the sample points to obtain the gesture trajectory corresponding to the gesture type of the black screen gesture input by the user. Starting from the first sample point of the gesture trajectory, a set number of pixel points may be drawn at each set time interval, so that the drawing process of the gesture trajectory is shown in the form of an animation. In order to guarantee a high black screen gesture response speed, the set time interval may be the minimum time interval at which the human eye can distinguish image changes; of course, the time interval may be set as required.
FIG. 5b is a schematic diagram showing the display of yet another black screen gesture trajectory according to an embodiment of the present application. As shown in FIG. 5b, starting from the first sample point 501 of gesture type "W", the gesture trajectory spanning three sample points is drawn at each set time interval. In the first drawing pass, the gesture trajectory spanning three sample points is drawn starting from the first sample point 501, and the second drawing pass is performed after the set time length; at that point, the gesture trajectory spanning three sample points is drawn starting from the fourth sample point 502, and the third drawing pass is performed after the set time length, drawing the gesture trajectory spanning three sample points starting from the seventh sample point 503. The gesture trajectory is drawn according to this rule until the last sample point 504 is drawn, at which point drawing ends. In this way, the gesture trajectory is displayed on the display screen in the form of an animation, which relieves the monotony of a statically displayed gesture trajectory and adds interest.
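The sampling rule and the stepwise animation of FIG. 5b can be sketched as follows; the sampling stride, the three-segments-per-step grouping, and the 16 ms step are placeholders standing in for the "set number" and "set time interval" the embodiment leaves configurable.

```java
import java.util.ArrayList;
import java.util.List;

/** Sketch: keep every Nth coordinate as a sample point and reveal the trajectory in steps. */
public class TrajectoryAnimator {

    private static final int SAMPLE_EVERY = 5;     // preset sampling rule (placeholder)
    private static final int POINTS_PER_STEP = 3;  // segments revealed per animation step
    private static final long STEP_MS = 16;        // roughly one display frame (placeholder)

    public interface SegmentRenderer {
        /** Draws the trajectory between two consecutive sample points. */
        void drawSegment(float[] from, float[] to);
    }

    public List<float[]> downsample(List<float[]> coordinates) {
        List<float[]> samples = new ArrayList<>();
        for (int i = 0; i < coordinates.size(); i += SAMPLE_EVERY) {
            samples.add(coordinates.get(i));
        }
        return samples;
    }

    /** Blocks while animating; in practice this would run on the drawing thread. */
    public void animate(List<float[]> samples, SegmentRenderer renderer)
            throws InterruptedException {
        for (int start = 0; start + 1 < samples.size(); start += POINTS_PER_STEP) {
            int end = Math.min(start + POINTS_PER_STEP, samples.size() - 1);
            for (int i = start; i < end; i++) {
                renderer.drawSegment(samples.get(i), samples.get(i + 1));
            }
            Thread.sleep(STEP_MS);                 // pause between animation steps
        }
    }
}
```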
Step 550: judge whether the display condition preset for the application is met; when it is met, perform step 560; when it is not met, perform step 540.
A display condition for switching from the gesture trajectory display interface to the display interface of the application is set in advance; according to actual needs, the display condition may be set such that the application corresponding to the gesture type has been opened successfully in the background and the gesture trajectory displayed on the display screen has been drawn to the last sample point.
When the application layer detects that the application corresponding to the black screen gesture input by the user has finished opening in the background, it judges whether the gesture trajectory on the display screen has been drawn to the last sample point; when it has, step 560 is performed; when it has not, step 540 continues to be performed. Optionally, if the gesture trajectory on the display screen has been drawn to the last sample point but the application corresponding to the black screen gesture input by the user has not finished opening, the completed gesture trajectory is kept displayed on the display screen until it is detected that the application has finished opening in the background, and then step 560 is performed.
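Step 550's combined condition (background launch finished and the last sample point drawn) can be expressed with two flags and a gate, as in this sketch; how the two completion callbacks get wired up is assumed.

```java
import java.util.concurrent.atomic.AtomicBoolean;

/** Sketch: show the app interface only when the background launch AND the animation are done. */
public class DisplayConditionGate {

    private final AtomicBoolean appOpened = new AtomicBoolean(false);
    private final AtomicBoolean trajectoryFinished = new AtomicBoolean(false);
    private final Runnable showAppInterface;

    public DisplayConditionGate(Runnable showAppInterface) {
        this.showAppInterface = showAppInterface;
    }

    public void onAppOpenedInBackground() {
        appOpened.set(true);
        maybeShow();
    }

    public void onLastSamplePointDrawn() {
        trajectoryFinished.set(true);
        maybeShow();
    }

    private synchronized void maybeShow() {
        // The two events may arrive from different threads; only the later one proceeds.
        if (appOpened.get() && trajectoryFinished.get()) {
            showAppInterface.run();
        }
    }
}
```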
Step 560: judge whether an abandon-opening instruction input by the user for the application has been detected; when no such instruction has been detected, perform step 570; when such an instruction is detected, perform step 580.
Step 570: draw the interface corresponding to the application to the display screen, so as to display the interface of the application on the display screen.
Data is read from the preset storage area in which the picture frames corresponding to the application interface are cached, and the application interface is refreshed to the display screen at the set refresh rate, so that the display switches directly from the gesture trajectory interface to the application interface.
Step 580: close the application in the background, and control the mobile terminal to return to the black screen gesture mode.
According to the technical solution of this embodiment, the gesture type is obtained by performing curve fitting on the read gesture data; while the opening thread opens the application corresponding to the gesture type, the drawing thread extracts from the gesture coordinates of the gesture data a number of sample points satisfying the preset sampling rule, draws the gesture trajectory according to the sample points, and displays the gesture trajectory on the display screen in the form of an animation; when the preset display condition for the application interface is met, the display switches from the gesture trajectory interface to the application interface. By adopting the above technical solution, while the application is being opened in the background, the gesture trajectory of the black screen gesture input by the user can be determined quickly and drawn vividly on the display screen, preventing the user from intuitively feeling that the black screen gesture function is unresponsive and thereby further increasing the response speed of black screen gestures.
FIG. 6a is a structural block diagram of an apparatus for improving black screen gesture response according to an embodiment of the present application. The apparatus may be implemented by software and/or hardware and is generally integrated in a mobile terminal. As shown in FIG. 6a, the apparatus may include:
a gesture data acquisition module 610, configured to acquire, when a black screen gesture event is detected, gesture data corresponding to the black screen gesture event, where the gesture data includes a gesture type;
an application opening module 620, configured to open, through at least one opening thread, the application corresponding to the gesture type in the background;
a gesture trajectory drawing module 630, configured to determine, through at least one drawing thread executed in parallel with the opening thread, the gesture trajectory corresponding to the gesture type, and draw the gesture trajectory to the display screen; and
an application display module 640, configured to draw, when a preset display condition is met, the interface corresponding to the application to the display screen, so as to display the interface of the application on the display screen.
The technical solution of this embodiment provides an apparatus for improving black screen gesture response, which can effectively increase the response speed of black screen gestures and shorten the time from detecting a black screen gesture to opening the application corresponding to the black screen gesture.
In an embodiment, referring to FIG. 6b, the gesture data acquisition module 610 includes:
a data state judgment submodule 611, configured to judge, when a black screen gesture event is detected, whether the gesture data in the preset node of the driver layer of the operating system is ready; and
a data extraction submodule 612, configured to extract the gesture data from the preset node if the gesture data in the preset node is ready;
when the gesture data in the preset node is not ready, the operation of judging whether the gesture data in the preset node of the driver layer of the operating system is ready is performed again after waiting for the set time length.
In an embodiment, the data state judgment submodule 611 is configured to:
when a black screen gesture event is detected, read the flag bit in the driver layer that identifies the data state of the preset node, and judge, according to the value of the flag bit, whether the gesture data is ready.
In an embodiment, the application opening module 620 is configured to:
determine the gesture type of the black screen gesture by means of curve fitting according to the gesture coordinates included in the gesture data.
In an embodiment, the gesture trajectory drawing module 630 is configured to:
match, through at least one drawing thread executed in parallel with the opening thread, the gesture type against preset standard graphics, and draw the standard graphic successfully matched to the gesture type to the display screen.
In an embodiment, the gesture trajectory drawing module 630 is configured to:
extract, through at least one drawing thread executed in parallel with the opening thread, sample points satisfying the preset sampling rule for the gesture type from the gesture coordinates included in the gesture data; and
draw the gesture trajectory corresponding to the gesture type according to the sample points, and display the gesture trajectory on the display screen in the form of an animation.
In an embodiment, referring to FIG. 6c, the apparatus further includes:
a first display condition determination module 650, configured to determine, before the interface corresponding to the application is drawn to the display screen, that the preset display condition is met when the application has been opened successfully in the background and the gesture trajectory displayed on the display screen in the form of an animation has reached the last sample point.
In an embodiment, referring to FIG. 6d, the apparatus further includes a second display condition determination module 660, configured to, before the interface corresponding to the application is drawn to the display screen,
determine that the preset display condition is met when the display time of the gesture trajectory reaches the set display time.
In an embodiment, referring to FIG. 6e, the apparatus further includes an abandon instruction module 670, configured to, before the interface corresponding to the application is drawn to the display screen,
judge whether an abandon-opening instruction input by the user for the application has been detected; and
close the application in the background when an abandon-opening instruction input by the user for the application is detected.
An embodiment of the present application further provides a storage medium containing computer-executable instructions which, when executed by a computer processor, are used to perform a method for improving black screen gesture response, the method including:
when a black screen gesture event is detected, acquiring gesture data corresponding to the black screen gesture event, where the gesture data includes a gesture type;
opening, through at least one opening thread, an application corresponding to the gesture type in the background;
determining, through at least one drawing thread executed in parallel with the opening thread, a gesture trajectory corresponding to the gesture type, and drawing the gesture trajectory to a display screen; and
when a preset display condition is met, drawing the interface corresponding to the application to the display screen, so as to display the interface of the application on the display screen.
Storage medium — any of various types of memory devices or storage devices. The term "storage medium" is intended to include: installation media, such as a CD-ROM, a floppy disk, or a tape device; computer system memory or random access memory (RAM), such as dynamic random access memory (DRAM), double data rate random access memory (DDR RAM), static random access memory (SRAM), extended data output random access memory (EDO RAM), Rambus RAM, and the like; non-volatile memory, such as flash memory and magnetic media (for example a hard disk or optical storage); registers or other similar types of memory elements; and so on. The storage medium may further include other types of memory or combinations thereof. In addition, the storage medium may be located in the first computer system in which the program is executed, or in a different, second computer system connected to the first computer system over a network (such as the Internet); the second computer system may provide the program instructions to the first computer for execution. The term "storage medium" may include at least two storage media that may reside in different locations (for example in different computer systems connected over a network). The storage medium may store program instructions (for example implemented as a computer program) executable by at least one processor.
Certainly, in the storage medium containing computer-executable instructions provided by the embodiments of the present application, the computer-executable instructions are not limited to the operations for improving black screen gesture handling described above, and may also perform related operations in the method for improving black screen gesture handling provided by any embodiment of the present application.
This embodiment provides a mobile terminal in which the apparatus for improving black screen gesture response provided by this embodiment may be integrated. FIG. 7 is a schematic structural diagram of a mobile terminal according to this embodiment. As shown in FIG. 7, the mobile terminal may include: a housing (not shown), a memory 701, a central processing unit (CPU) 702 (also called a processor, hereinafter referred to as the CPU), a circuit board (not shown), a display screen 712, and a power supply circuit (not shown). The display screen 712 is configured to convert user operations into electrical signals input to the processor and to display visual output signals; the circuit board is disposed inside the space enclosed by the display screen 712 and the housing; the CPU 702 and the memory 701 are provided on the circuit board; the power supply circuit is configured to supply power to the circuits or devices of the mobile terminal; the memory 701 is configured to store a computer program; and the CPU 702 reads and executes the computer program stored in the memory 701. When executing the computer program, the CPU 702 implements the following steps: when a black screen gesture event is detected, acquiring gesture data corresponding to the black screen gesture event, where the gesture data includes a gesture type; opening, through at least one opening thread, the application corresponding to the gesture type in the background; determining, through at least one drawing thread executed in parallel with the opening thread, the gesture trajectory corresponding to the gesture type, and drawing the gesture trajectory to the display screen; and when a preset display condition is met, drawing the interface corresponding to the application to the display screen, so as to display the interface of the application on the display screen.
The mobile terminal further includes: a peripheral interface 703, a radio frequency (RF) circuit 705, an audio circuit 706, a speaker 711, a power management chip 708, an input/output (I/O) subsystem 709, other input/control devices 710, and an external port 704, and these components communicate through one or more communication buses or signal lines 707.
It should be understood that the illustrated mobile terminal 700 is only one example of a mobile terminal, and the mobile terminal 700 may have more or fewer components than shown in the figure, may combine two or more components, or may have a different component configuration. The various components shown in the figure may be implemented in hardware, software, or a combination of hardware and software, including one or more signal processing and/or application-specific integrated circuits.
The mobile terminal integrated with the apparatus for improving black screen gesture response provided by this embodiment is described in detail below, taking a mobile phone as an example.
The memory 701 may be accessed by the CPU 702, the peripheral interface 703, and the like, and may include a high-speed random access memory; it may also include a non-volatile memory, such as one or more magnetic disk storage devices, flash memory devices, or other volatile solid-state storage devices.
The peripheral interface 703 may connect the input and output peripherals of the device to the CPU 702 and the memory 701.
The I/O subsystem 709 may connect the input and output peripherals on the device, such as the display screen 712 and the other input/control devices 710, to the peripheral interface 703. The I/O subsystem 709 may include a display controller 7091 and at least one input controller 7092 configured to control the other input/control devices 710. The at least one input controller 7092 receives electrical signals from, or sends electrical signals to, the other input/control devices 710, which may include physical buttons (press buttons, rocker buttons, and the like), dial pads, slide switches, joysticks, and click wheels. It is worth noting that the input controller 7092 may be connected to any of the following: a keyboard, an infrared port, a USB interface, or a pointing device such as a mouse.
The display screen 712 is the input interface and the output interface between the user terminal and the user and displays visual output to the user; the visual output may include graphics, text, icons, video, and the like.
The display controller 7091 in the I/O subsystem 709 receives electrical signals from, or sends electrical signals to, the display screen 712. The display screen 712 detects contact on its surface, and the display controller 7091 converts the detected contact into interaction with the user interface objects displayed on the display screen 712, thereby realizing human-computer interaction; a user interface object displayed on the display screen 712 may be an icon for running a game, an icon for connecting to a corresponding network, and the like. It is worth noting that the device may also include a light mouse, which is a touch-sensitive surface that does not display visual output, or an extension of the touch-sensitive surface formed by the display screen.
The RF circuit 705 is configured to establish communication between the mobile phone and the wireless network (that is, the network side) and to implement data reception and transmission between the mobile phone and the wireless network, for example sending and receiving short messages, e-mails, and the like. Specifically, the RF circuit 705 receives and sends RF signals, which are also called electromagnetic signals; the RF circuit 705 converts electrical signals into electromagnetic signals or converts electromagnetic signals into electrical signals, and communicates with communication networks and other devices through the electromagnetic signals. The RF circuit 705 may include known circuits configured to perform these functions, including but not limited to an antenna system, an RF transceiver, at least one amplifier, a tuner, one or more oscillators, a digital signal processor, a codec (COder-DECoder, CODEC) chipset, a Subscriber Identity Module (SIM), and so on.
The audio circuit 706 is configured to receive audio data from the peripheral interface 703, convert the audio data into an electrical signal, and send the electrical signal to the speaker 711.
The speaker 711 is configured to restore the voice signal, received by the mobile phone from the wireless network through the RF circuit 705, to sound and play the sound to the user.
The power management chip 708 is configured to supply power to, and perform power management for, the hardware connected to the CPU 702, the I/O subsystem, and the peripheral interface.
The mobile terminal provided by the embodiments of the present application can effectively increase the response speed of black screen gestures and shorten the time from detecting a black screen gesture to opening the application corresponding to the black screen gesture.
The apparatus for improving black screen gesture response, the storage medium, and the mobile terminal provided in the above embodiments can perform the method for improving black screen gesture response provided by any embodiment of the present application, and have the corresponding functional modules and beneficial effects for performing the method. For technical details not described exhaustively in the above embodiments, reference may be made to the method for improving black screen gesture response provided by any embodiment of the present application.

Claims (20)

  1. A method for improving black screen gesture response, comprising:
    when a black screen gesture event is detected, acquiring gesture data corresponding to the black screen gesture event, wherein the gesture data comprises a gesture type;
    opening, through at least one opening thread, an application corresponding to the gesture type in the background;
    determining, through at least one drawing thread executed in parallel with the opening thread, a gesture trajectory corresponding to the gesture type, and drawing the gesture trajectory to a display screen; and
    when a preset display condition is met, drawing an interface corresponding to the application to the display screen, so as to display the interface of the application on the display screen.
  2. The method according to claim 1, wherein acquiring the gesture data corresponding to the black screen gesture event comprises:
    judging whether the gesture data in a preset node of a driver layer of an operating system is ready;
    when the gesture data in the preset node is ready, extracting the gesture data from the preset node; and
    when the gesture data in the preset node is not ready, returning, after waiting for a set time, to the operation of judging whether the gesture data in the preset node of the driver layer of the operating system is ready.
  3. The method according to claim 2, wherein judging whether the gesture data in the preset node of the driver layer of the operating system is ready comprises:
    reading a flag bit in the driver layer that identifies the data state of the preset node, and judging, according to the value of the flag bit, whether the gesture data is ready.
  4. The method according to any one of claims 1 to 3, wherein determining the gesture trajectory corresponding to the gesture type and drawing the gesture trajectory to the display screen comprises:
    matching the gesture type against preset standard graphics, and drawing the standard graphic successfully matched to the gesture type to the display screen.
  5. The method according to any one of claims 1 to 3, wherein determining the gesture trajectory corresponding to the gesture type and drawing the gesture trajectory to the display screen comprises:
    extracting, from gesture coordinates comprised in the gesture data, sample points satisfying a preset sampling rule for the gesture type; and
    drawing the gesture trajectory corresponding to the gesture type according to the sample points, and displaying the gesture trajectory on the display screen in the form of an animation.
  6. The method according to claim 5, wherein the preset sampling rule comprises: collecting one coordinate point as a sample point every set number of coordinate points.
  7. The method according to claim 5, before drawing the interface corresponding to the application to the display screen, further comprising:
    determining that the preset display condition is met when the application has been opened successfully in the background and the gesture trajectory displayed on the display screen in the form of an animation has reached the last one of the sample points.
  8. The method according to claim 5, before drawing the interface corresponding to the application to the display screen, further comprising:
    determining that the preset display condition is met when the display time of the gesture trajectory reaches a set display time.
  9. The method according to claim 1, before drawing the interface corresponding to the application to the display screen, further comprising:
    judging whether an abandon-opening instruction input by a user for the application has been detected; and
    closing the application in the background when the abandon-opening instruction input by the user for the application is detected.
  10. An apparatus for improving black screen gesture response, comprising:
    a gesture data acquisition module, configured to acquire, when a black screen gesture event is detected, gesture data corresponding to the black screen gesture event, wherein the gesture data comprises a gesture type;
    an application opening module, configured to open, through at least one opening thread, an application corresponding to the gesture type in the background;
    a gesture trajectory drawing module, configured to determine, through at least one drawing thread executed in parallel with the opening thread, a gesture trajectory corresponding to the gesture type, and draw the gesture trajectory to a display screen; and
    an application display module, configured to draw, when a preset display condition is met, an interface corresponding to the application to the display screen, so as to display the interface of the application on the display screen.
  11. The apparatus according to claim 10, wherein the gesture data acquisition module comprises:
    a data state judgment submodule, configured to judge, when a black screen gesture event is detected, whether the gesture data in a preset node of a driver layer of an operating system is ready; and
    a data extraction submodule, configured to extract the gesture data from the preset node when the gesture data in the preset node is ready, and, when the gesture data in the preset node is not ready, return, after waiting for a set time, to the operation of judging whether the gesture data in the preset node of the driver layer of the operating system is ready.
  12. The apparatus according to claim 11, wherein the data state judgment submodule is configured to:
    when a black screen gesture event is detected, read a flag bit in the driver layer that identifies the data state of the preset node, and judge, according to the value of the flag bit, whether the gesture data is ready.
  13. The apparatus according to any one of claims 10 to 12, wherein the gesture trajectory drawing module is configured to:
    match, through at least one drawing thread executed in parallel with the opening thread, the gesture type against preset standard graphics, and draw the standard graphic successfully matched to the gesture type to the display screen.
  14. The apparatus according to any one of claims 10 to 12, wherein the gesture trajectory drawing module is configured to:
    extract, through at least one drawing thread executed in parallel with the opening thread, sample points satisfying a preset sampling rule for the gesture type from gesture coordinates comprised in the gesture data; and
    draw the gesture trajectory corresponding to the gesture type according to the sample points, and display the gesture trajectory on the display screen in the form of an animation.
  15. The apparatus according to claim 14, wherein the preset sampling rule comprises: collecting one coordinate point as a sample point every set number of coordinate points.
  16. The apparatus according to claim 14, further comprising a first display condition determination module configured to, before the interface corresponding to the application is drawn to the display screen,
    determine that the preset display condition is met when the application has been opened successfully in the background and the gesture trajectory displayed on the display screen in the form of an animation has reached the last one of the sample points.
  17. The apparatus according to claim 14, further comprising a second display condition determination module configured to, before the interface corresponding to the application is drawn to the display screen,
    determine that the preset display condition is met when the display time of the gesture trajectory reaches a set display time.
  18. The apparatus according to claim 10, further comprising an abandon instruction module configured to, before the interface corresponding to the application is drawn to the display screen,
    judge whether an abandon-opening instruction input by a user for the application has been detected, and
    close the application in the background when the abandon-opening instruction input by the user for the application is detected.
  19. A computer-readable storage medium storing a computer program which, when executed by a processor, implements the method for improving black screen gesture response according to any one of claims 1 to 9.
  20. A mobile terminal, comprising a display screen, a memory, a processor, and a computer program stored on the memory and executable on the processor, wherein the processor, when executing the computer program, implements the method for improving black screen gesture response according to any one of claims 1 to 9.
PCT/CN2018/094914 2017-07-28 2018-07-06 改善黑屏手势响应的方法、装置、存储介质及移动终端 WO2019019899A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201710633685.7A CN107529636A (zh) 2017-07-28 2017-07-28 改善黑屏手势响应的方法、装置、存储介质及移动终端
CN201710633685.7 2017-07-28

Publications (1)

Publication Number Publication Date
WO2019019899A1 (zh)

Family

ID=60766298

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2018/094914 WO2019019899A1 (zh) 2017-07-28 2018-07-06 改善黑屏手势响应的方法、装置、存储介质及移动终端

Country Status (2)

Country Link
CN (1) CN107529636A (zh)
WO (1) WO2019019899A1 (zh)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111212411A (zh) * 2019-12-31 2020-05-29 宇龙计算机通信科技(深圳)有限公司 文件传输方法、装置、存储介质以及终端

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107529636A (zh) * 2017-07-28 2018-01-02 广东欧珀移动通信有限公司 改善黑屏手势响应的方法、装置、存储介质及移动终端
CN108959857A (zh) * 2018-06-12 2018-12-07 Oppo广东移动通信有限公司 应用程序启动控制方法、装置、电子设备及存储介质
CN110865767A (zh) * 2019-11-20 2020-03-06 深圳传音控股股份有限公司 应用程序的运行方法、装置、设备及存储介质

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103823554A (zh) * 2014-01-12 2014-05-28 青岛科技大学 一种数字化虚实交互系统及方法
CN105183352A (zh) * 2015-09-01 2015-12-23 广东欧珀移动通信有限公司 一种在终端黑屏状态下实现更多手势识别的方法及装置
CN105511794A (zh) * 2015-12-14 2016-04-20 中国电子科技集团公司第十五研究所 一种支持多点触控手势操作的标绘系统及其方法
US20160164986A1 (en) * 2014-12-08 2016-06-09 Google Inc. Multi-purpose application launching interface
CN106168881A (zh) * 2016-07-12 2016-11-30 硕诺科技(深圳)有限公司 提升黑屏手势响应速度的方法
CN106569717A (zh) * 2016-11-03 2017-04-19 努比亚技术有限公司 移动终端及应用启动方法
CN106657610A (zh) * 2016-11-17 2017-05-10 宇龙计算机通信科技(深圳)有限公司 终端应用启动控制方法及装置
CN107529636A (zh) * 2017-07-28 2018-01-02 广东欧珀移动通信有限公司 改善黑屏手势响应的方法、装置、存储介质及移动终端

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104424410B (zh) * 2013-09-05 2018-10-19 深圳市艾酷通信软件有限公司 移动智能终端分安全等级快捷启动应用的方法及其系统

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103823554A (zh) * 2014-01-12 2014-05-28 青岛科技大学 一种数字化虚实交互系统及方法
US20160164986A1 (en) * 2014-12-08 2016-06-09 Google Inc. Multi-purpose application launching interface
CN105183352A (zh) * 2015-09-01 2015-12-23 广东欧珀移动通信有限公司 一种在终端黑屏状态下实现更多手势识别的方法及装置
CN105511794A (zh) * 2015-12-14 2016-04-20 中国电子科技集团公司第十五研究所 一种支持多点触控手势操作的标绘系统及其方法
CN106168881A (zh) * 2016-07-12 2016-11-30 硕诺科技(深圳)有限公司 提升黑屏手势响应速度的方法
CN106569717A (zh) * 2016-11-03 2017-04-19 努比亚技术有限公司 移动终端及应用启动方法
CN106657610A (zh) * 2016-11-17 2017-05-10 宇龙计算机通信科技(深圳)有限公司 终端应用启动控制方法及装置
CN107529636A (zh) * 2017-07-28 2018-01-02 广东欧珀移动通信有限公司 改善黑屏手势响应的方法、装置、存储介质及移动终端

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111212411A (zh) * 2019-12-31 2020-05-29 宇龙计算机通信科技(深圳)有限公司 文件传输方法、装置、存储介质以及终端
CN111212411B (zh) * 2019-12-31 2023-11-14 宇龙计算机通信科技(深圳)有限公司 文件传输方法、装置、存储介质以及终端

Also Published As

Publication number Publication date
CN107529636A (zh) 2018-01-02

Similar Documents

Publication Publication Date Title
US11086663B2 (en) Preloading application using active window stack
CN107748686B (zh) 应用程序的启动优化方法、装置、存储介质及智能终端
US11086510B2 (en) Split screen control method based on screen-off gestures, and storage medium and mobile terminal thereof
WO2019228156A1 (en) Method and device for preloading application, storage medium and intelligent terminal
CN107395889B (zh) 降低移动终端功耗的方法、装置、存储介质及移动终端
US10761722B2 (en) Black screen gesture detection method and device, storage medium, and mobile terminal
US11604660B2 (en) Method for launching application, storage medium, and terminal
US10901608B2 (en) Method for recognizing a screen-off gesture, and storage medium and terminal thereof
CN107450838B (zh) 黑屏手势的响应方法、装置、存储介质及移动终端
WO2019233241A1 (zh) 应用程序启动方法、装置、存储介质及终端
WO2019019899A1 (zh) 改善黑屏手势响应的方法、装置、存储介质及移动终端
CN107479700B (zh) 黑屏手势控制方法、装置、存储介质及移动终端
WO2019019835A1 (zh) 响应黑屏手势的方法、装置、存储介质及移动终端
WO2019218886A1 (zh) 应用预加载管理方法、装置、存储介质及智能终端
WO2019223511A1 (zh) 应用程序的预加载方法、装置、存储介质及终端
WO2019019818A1 (zh) 加快黑屏手势处理的方法、装置、存储介质及移动终端
TWI646835B (zh) 一種電視節目播放方法以及相關的終端設備
US11086442B2 (en) Method for responding to touch operation, mobile terminal, and storage medium
EP3435215B1 (en) Method, device, storage medium and mobile terminal for recognizing an off-screen gesture
WO2019214476A1 (zh) 屏幕方向设置方法以及装置、存储介质及终端
US20200210047A1 (en) Method for Responding to Touch Operation, Mobile Terminal, and Storage Medium
WO2019019817A1 (zh) 基于黑屏手势的控制方法、装置、存储介质及移动终端
WO2019047183A1 (zh) 按键显示方法、装置及终端
WO2019047187A1 (zh) 导航栏控制方法及装置
US20230229462A1 (en) Terminal Device, Gesture Operation Method Thereof, and Medium

Legal Events

Date Code Title Description
121 EP: the EPO has been informed by WIPO that EP was designated in this application (Ref document number: 18837947; Country of ref document: EP; Kind code of ref document: A1)
NENP: Non-entry into the national phase (Ref country code: DE)
122 EP: PCT application non-entry in European phase (Ref document number: 18837947; Country of ref document: EP; Kind code of ref document: A1)