CN111258455B - Event stream processing method, event stream processing device and mobile terminal - Google Patents

Event stream processing method, event stream processing device and mobile terminal

Info

Publication number
CN111258455B
Authority
CN
China
Prior art keywords
event stream
event
processed
screen
touch point
Prior art date
Legal status
Active
Application number
CN202010054558.3A
Other languages
Chinese (zh)
Other versions
CN111258455A (en)
Inventor
吴恒刚
Current Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN202010054558.3A
Publication of CN111258455A
Application granted
Publication of CN111258455B

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F 3/0412 Digitisers structurally integrated in a display
    • G06F 3/0416 Control or interface arrangements specially adapted for digitisers
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04883 Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser for inputting data by handwriting, e.g. gesture or text

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application provides an event stream processing method, an event stream processing device, a mobile terminal and a computer readable storage medium. The event stream processing method comprises the following steps: when the driving layer acquires an event stream to be processed, acquiring one or more touch points from the event stream to be processed, wherein the event stream to be processed is generated based on a touch operation of a user on a screen of the mobile terminal; matching the one or more touch points with preset gesture navigation conditions; assigning a preset event flag bit according to the matching result; determining, by the framework layer, the event indicated by the event stream to be processed based on the assigned event flag bit, and determining an associated object of the event stream to be processed according to that event; and triggering the associated object to execute the event indicated by the event stream to be processed. In this scheme, the evaluation of the screen's event stream is completed at the driving layer, and the framework layer is triggered to distribute the event stream directly to the corresponding associated object for processing based on the evaluation result, so the stability of the mobile terminal can be ensured.

Description

Event stream processing method, event stream processing device and mobile terminal
Technical Field
The present application relates to the field of data processing technologies, and in particular, to an event stream processing method, an event stream processing device, a mobile terminal, and a computer readable storage medium.
Background
A user's touch on the screen of a mobile terminal generates an event stream. Generally, a mobile terminal running the Android system distributes an event stream to the framework, which passes it down, level by level, to the applications that need to process it; along the way, a gesture navigation application intercepts the event stream and evaluates it against certain rules to determine where the stream should be distributed. In the Android system, however, event streams are unidirectional and sequential, so an application intercepting and redistributing them may break their timing and cause compatibility problems.
Disclosure of Invention
The embodiment of the application provides an event stream processing method, an event stream processing device, a mobile terminal and a computer readable storage medium, which can ensure the compatibility of the mobile terminal.
In a first aspect, an embodiment of the present application provides a method for processing an event stream, including:
when a driving layer of a mobile terminal acquires an event stream to be processed, acquiring more than one touch point from the event stream to be processed, wherein the event stream to be processed is generated based on touch operation of a user on a screen of the mobile terminal;
Matching the more than one touch points with preset gesture navigation conditions;
assigning a preset event flag bit according to the matching result;
the frame layer of the mobile terminal determines an event indicated by the event stream to be processed based on the assigned event flag bit, and determines an associated object of the event stream to be processed according to the event indicated by the event stream to be processed;
triggering the associated object to execute the event indicated by the pending event stream.
In a second aspect, an embodiment of the present application provides an event stream processing apparatus, including:
an acquisition unit, configured to acquire, when a driving layer of a mobile terminal acquires an event stream to be processed, more than one touch point from the event stream to be processed, where the event stream to be processed is generated based on a touch operation of a user on a screen of the mobile terminal;
the matching unit is used for matching the more than one touch points with preset gesture navigation conditions;
the assignment unit is used for assigning a preset event flag bit according to the matching result;
the determining unit is used for determining an event indicated by the event stream to be processed based on the event flag bit after assignment through the framework layer of the mobile terminal, and determining an associated object of the event stream to be processed according to the event indicated by the event stream to be processed;
And the triggering unit is used for triggering the associated object to execute the event indicated by the event stream to be processed.
A third aspect of the present application provides a mobile terminal comprising a memory, a processor and a computer program stored in the memory and executable on the processor, the processor implementing the steps of the method of the first aspect when executing the computer program.
A fourth aspect of the application provides a computer readable storage medium storing a computer program which, when executed by a processor, performs the steps of the method of the first aspect.
A fifth aspect of the application provides a computer program product comprising a computer program which, when executed by one or more processors, implements the steps of the method of the first aspect described above.
From the above, in the present application, when the driving layer of a mobile terminal obtains an event stream to be processed, one or more touch points are obtained from the event stream, where the event stream is generated based on a touch operation of a user on the screen of the mobile terminal; the one or more touch points are then matched with preset gesture navigation conditions, and a preset event flag bit is assigned according to the matching result; the framework layer of the mobile terminal then determines the event indicated by the event stream to be processed based on the assigned event flag bit and determines the associated object of the event stream according to that event; finally, the associated object is triggered to execute the indicated event. In this scheme, the evaluation of the screen's event stream is completed at the driving layer, and the framework layer is triggered to distribute the event stream directly to the corresponding associated object for processing based on the evaluation result. In other words, the flow direction of the event stream is decided at its source, so the Android system's own event stream processing does not need to be modified, compatibility problems are avoided, and stability is greatly enhanced. It will be appreciated that the advantages of the second to fifth aspects can be found in the description of the first aspect and are not repeated here.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are needed in the embodiments or the description of the prior art will be briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present application, and that other drawings can be obtained according to these drawings without inventive effort for a person skilled in the art.
Fig. 1 is a schematic flow chart of an implementation of an event stream processing method according to an embodiment of the present application;
fig. 2 is a schematic diagram of four groups of touch points in different areas in the event stream processing method provided by the embodiment of the application;
fig. 3 is a block diagram of an event stream processing device according to an embodiment of the present application;
Fig. 4 is a schematic structural diagram of a mobile terminal according to an embodiment of the present application.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth such as the particular system architecture, techniques, etc., in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
In order to illustrate the above technical solution of the present application, the following description will be made by specific examples.
Example 1
The following describes an event stream processing method provided by the embodiment of the application. Referring to fig. 1, the event stream processing method in the embodiment of the application includes:
Step 101, when a driving layer of a mobile terminal acquires an event stream to be processed, acquiring more than one touch point from the event stream to be processed;
in the embodiment of the present application, the main focus is the distribution of the event stream generated when the user touches the screen of the mobile terminal; thus, the event stream to be processed refers to an event stream generated based on a touch operation of the user on the screen. Specifically, from the moment the user touches the screen until the finger leaves it, the screen is sampled at a fixed sampling frequency, yielding a plurality of touch points associated with the touch operation, all of which are contained in the event stream to be processed. Each touch point carries its attributes, namely the coordinates of the touch point on the screen and its acquisition time (i.e., the time at which the screen sampling captured the touch point). For a mobile terminal running the Android system, the driving layer refers to its hardware abstraction layer (Hardware Abstraction Layer, HAL). Since the sampling frequency of the screen is generally high, the longer the user's touch operation on the screen lasts, the more touch points the event stream to be processed contains. To improve processing efficiency, therefore, only a preset number of touch points is extracted from the event stream to be processed, where the preset number is an empirical value set according to the performance of the mobile terminal, for example 3; that is, three touch points are acquired from the event stream to be processed.
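For illustration only, the data carried by a single touch point can be sketched as the following minimal Java class; the class and field names are assumptions made for this sketch, since the application does not prescribe a concrete representation:

    // Illustrative sketch: one sampled touch point as described above.
    // Field names are hypothetical and not part of this application.
    public final class TouchPoint {
        public final float x;              // coordinate on the screen (horizontal)
        public final float y;              // coordinate on the screen (vertical)
        public final long acquisitionTime; // time at which the screen sampling captured this point

        public TouchPoint(float x, float y, long acquisitionTime) {
            this.x = x;
            this.y = y;
            this.acquisitionTime = acquisitionTime;
        }
    }
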
Optionally, in order to store more than one touch point acquired from the event stream to be processed, the mobile terminal may create and initialize a cache array at a driving layer when the mobile terminal is powered on, where the number of elements of the cache array is set based on the preset number; the step 101 may be embodied as: when the driving layer acquires the event stream to be processed, more than one touch point is acquired from the event stream to be processed and stored in the cache array.
Optionally, in order to improve the processing efficiency of the to-be-processed event stream, when the driving layer acquires the to-be-processed event stream, the step of acquiring one or more touch points from the to-be-processed event stream and storing the touch points in the cache array may include:
a1, reading the acquisition time of each touch point contained in the event stream to be processed;
As described above, each touch point carries its attributes, namely its coordinates on the screen and its acquisition time. The acquisition time of each touch point contained in the event stream to be processed can therefore be read by parsing the attributes of each touch point. Specifically, the event stream to be processed is created as soon as the user touches the screen, at which point the driving layer can begin to acquire it, and the number of touch points in the stream then grows continuously as the user's touch operation on the screen continues. The touch points read by the driving layer therefore also grow from few to many; that is, the touch point that the driving layer reads first is typically the point the user touched first.
A2, determining more than one touch point in the event stream to be processed based on the sequence from early to late of the acquisition time, and storing the touch points in the cache array.
Through the acquisition times, the driving layer can quickly determine which touch points in the currently read event stream to be processed were received earliest, and can thus determine one or more touch points in the event stream in order of acquisition time, from earliest to latest, and store them in the cache array; that is, the earliest one or more (for example, three) touch points are acquired from the event stream to be processed and stored in the cache array.
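As a rough sketch of steps A1 and A2, assuming the TouchPoint class above and a preset number of 3 (the class name, the sorting-based selection and the concrete preset number are all assumptions of this sketch):

    import java.util.ArrayList;
    import java.util.Comparator;
    import java.util.List;

    public final class TouchPointCache {
        private static final int PRESET_NUMBER = 3;  // empirical value, e.g. 3 (see above)
        private final TouchPoint[] cacheArray = new TouchPoint[PRESET_NUMBER];

        // Steps A1 + A2: read each point's acquisition time, then store the
        // earliest PRESET_NUMBER touch points in the cache array.
        public void fillFrom(List<TouchPoint> pendingEventStream) {
            List<TouchPoint> sorted = new ArrayList<>(pendingEventStream);
            sorted.sort(Comparator.comparingLong((TouchPoint p) -> p.acquisitionTime));
            for (int i = 0; i < PRESET_NUMBER && i < sorted.size(); i++) {
                cacheArray[i] = sorted.get(i);
            }
        }

        // Emptied when the touch operation disappears (see the end of this embodiment).
        public void clear() {
            java.util.Arrays.fill(cacheArray, null);
        }

        public TouchPoint[] points() {
            return cacheArray;
        }
    }
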
Step 102, matching the above-mentioned one or more touch points with preset gesture navigation conditions;
in the embodiment of the present application, after one or more touch points in the event stream to be processed have been acquired, the one or more touch points can be matched with a preset gesture navigation condition. Specifically, the mobile terminal responds differently according to where the user touches the screen: when the user touches a middle position of the screen, the operation corresponding to the touch (such as a click, long press or slide, typically an object selection or page movement operation) is performed normally; when the user touches an edge position of the screen, the mobile terminal performs either a gesture navigation operation or the operation corresponding to the touch (again, typically an object selection or page movement operation), depending on the result of matching the touch operation against the preset gesture navigation condition. Based on this, the step 102 may be specifically expressed as:
B1, acquiring the position of a starting touch point on a screen;
when the above-mentioned more than one touch points are matched with the preset gesture navigation condition, firstly, the position corresponding to the touch operation of the user on the screen should be determined. Here, in order to reduce the amount of calculation, the determination of the touch position may be performed only by the start touch point, that is, the touch point having the earliest acquisition time among the above-described one or more touch points. Thus, the position of the start touch point on the screen, that is, the coordinates of the start touch point on the screen, may be acquired first.
B2, if the position of the initial touch point on the screen belongs to a preset screen bottom area, matching the more than one touch point with a preset first gesture navigation condition;
the mobile terminal can preset a corresponding screen area aiming at the edge position of the screen, and judge the corresponding position of the touch operation of the user on the screen by judging whether the position of the initial touch point on the screen falls into the preset screen area or not. Specifically, considering that the bottom of the edge is a region that is commonly used by users to trigger gesture navigation, different gesture navigation conditions are set for the bottom region of the screen and other edge regions of the screen (such as the top region or the side region of the screen) respectively. When the position of the initial touch point on the screen belongs to a preset screen bottom area, matching the more than one touch points with preset first gesture navigation conditions, wherein the first gesture navigation conditions are as follows: the average displacement of the at least one touch point is greater than a preset displacement threshold, and the displacement threshold is a preset empirical value, which is not limited herein. Specifically, the average displacement of the above-mentioned more than one touch points refers to obtaining an average value of displacements of two adjacent touch points in time, the average displacement can be used for representing a sliding speed when a user touches a screen, and the larger the average displacement is, the more the user performs a touch sliding operation, and the faster the sliding speed is; that is, as long as the user performs a faster sliding touch operation in the bottom area of the screen, the user is considered to want to trigger the mobile terminal to perform gesture navigation at this time, and at this time, the above-mentioned one or more touch points are successfully matched with the preset first gesture navigation condition.
And B3, if the position of the initial touch point on the screen belongs to a preset screen top area or screen side area, determining the screen edge associated with the more than one touch point according to the position of the initial touch point on the screen, and matching the more than one touch point with a preset second gesture navigation condition.
When the position of the initial touch point on the screen belongs to a preset screen top area or screen side area, the screen edge associated with the one or more touch points is determined from that position: if the initial touch point lies in the preset screen top area, the associated screen edge is the top edge of the screen. The screen side areas comprise a left side area and a right side area: if the initial touch point lies in the preset left side area, the associated screen edge is the left edge of the screen; if it lies in the preset right side area, the associated screen edge is the right edge of the screen. After the associated screen edge has been determined, the one or more touch points can be matched with a preset second gesture navigation condition, namely: the average displacement of the one or more touch points is greater than the displacement threshold, and the included angle formed by the one or more touch points with the associated screen edge is greater than a preset angle threshold. That is, the second gesture navigation condition requires not only that the user's sliding touch operation on the screen be fast, but also that its sliding direction be as close to perpendicular to the associated screen edge as possible. Specifically, the included angle formed by the one or more touch points with the associated screen edge is the angle between that edge and the line connecting the initial touch point to the end touch point, where the end touch point is the touch point with the latest acquisition time among the one or more touch points.
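The two conditions can be sketched as follows, assuming the TouchPoint class above; the threshold values, the method names and the encoding of the associated edge as horizontal or vertical are illustrative assumptions, since the application does not fix concrete values:

    public final class GestureNavigationMatcher {
        // Hypothetical empirical thresholds; the application leaves the values open.
        private static final float DISPLACEMENT_THRESHOLD = 12.0f; // per-sample displacement
        private static final double ANGLE_THRESHOLD_DEG = 60.0;    // angle with the edge

        // Average displacement of temporally adjacent touch points:
        // a proxy for the sliding speed of the touch operation.
        static float averageDisplacement(TouchPoint[] pts) {
            if (pts.length < 2) {
                return 0f;
            }
            float total = 0f;
            for (int i = 1; i < pts.length; i++) {
                float dx = pts[i].x - pts[i - 1].x;
                float dy = pts[i].y - pts[i - 1].y;
                total += (float) Math.hypot(dx, dy);
            }
            return total / (pts.length - 1);
        }

        // B2: start point in the bottom area, so only the speed is checked.
        static boolean matchesFirstCondition(TouchPoint[] pts) {
            return averageDisplacement(pts) > DISPLACEMENT_THRESHOLD;
        }

        // B3: start point in the top or side area, so the speed is checked
        // together with the angle between the start-to-end line and the
        // associated edge (measured against the x axis for the horizontal
        // top edge, against the y axis for a vertical side edge).
        static boolean matchesSecondCondition(TouchPoint[] pts, boolean edgeIsHorizontal) {
            TouchPoint start = pts[0];
            TouchPoint end = pts[pts.length - 1];
            double dx = Math.abs(end.x - start.x);
            double dy = Math.abs(end.y - start.y);
            double angleDeg = edgeIsHorizontal
                    ? Math.toDegrees(Math.atan2(dy, dx))
                    : Math.toDegrees(Math.atan2(dx, dy));
            return averageDisplacement(pts) > DISPLACEMENT_THRESHOLD
                    && angleDeg > ANGLE_THRESHOLD_DEG;
        }
    }

Under this reading, group B in fig. 2 below fails because its angle with the top edge is small despite a large average displacement, and group C fails because its average displacement is small despite a large angle.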
For a better explanation of the gesture navigation conditions, please refer to fig. 2, which shows four groups of touch points:
the initial touch point A1 of the first group of touch points A1, A2 and A3 lies in the bottom area of the screen, and the average displacement of A1, A2 and A3 is large, so the first group of touch points matches the first gesture navigation condition successfully;
the initial touch point B1 of the second group of touch points B1, B2 and B3 lies in the top area of the screen, and the average displacement of B1, B2 and B3 is large, but the included angle ∠1 between the top edge of the screen and the line connecting B1 with the group's end touch point B3 is small, so the second group of touch points fails to match the second gesture navigation condition;
the initial touch point C1 of the third group of touch points C1, C2 and C3 lies in the left edge area of the screen, and the included angle ∠2 between the left edge of the screen and the line connecting C1 with the group's end touch point C3 is large, but the average displacement of C1, C2 and C3 is small, so the third group of touch points fails to match the second gesture navigation condition;
the initial touch point D1 of the fourth group of touch points D1, D2 and D3 lies in the right edge area of the screen, the included angle ∠3 between the right edge of the screen and the line connecting D1 with the group's end touch point D3 is large, and the average displacement of D1, D2 and D3 is also large, so the fourth group of touch points matches the second gesture navigation condition successfully.
Step 103, assigning a preset event flag bit according to the matching result;
in the embodiment of the application, when the mobile terminal leaves the factory, a flag bit is selected from idle flag bits of a screen of the mobile terminal as the event flag bit. After the step 102, the driving layer may assign a value to the preset event flag bit according to the matching result. Optionally, the step 103 specifically includes:
if the matching result is that the matching is successful, the event zone bit is assigned to be a first value, and the first value is used for marking that the event stream to be processed indicates a gesture navigation event;
and C2, if the matching result is that the matching fails, assigning the event flag bit to a second value, wherein the second value is used for marking that the event stream to be processed indicates a selection/movement event.
Wherein the first value may be set to "1" and the second value to "0". When the above touch points are matched successfully with the preset gesture navigation condition, the event stream to be processed is considered to indicate a gesture navigation event, and the event flag bit can be assigned the first value; when the matching fails, the event stream to be processed is considered to indicate a selection/movement event, and the event flag bit can be assigned the second value. Here, a selection/movement event refers specifically to a click selection event when the user clicks or long-presses, or a page movement event when the user performs a sliding touch.
Step 104, determining, by the framework layer of the mobile terminal, an event indicated by the event stream to be processed based on the assigned event flag bit, and determining an associated object of the event stream to be processed according to the event indicated by the event stream to be processed;
in the embodiment of the present application, since the value on the event flag bit may be used to mark the event indicated by the to-be-processed event stream, the frame layer of the mobile terminal may directly determine the event indicated by the to-be-processed event stream by reading the event flag bit, and determine the associated object of the to-be-processed event stream according to the event indicated by the to-be-processed event stream. Specifically, when the event indicated by the event stream to be processed is a gesture navigation event, a System interface (System UI) application of the mobile terminal may be determined as an associated object of the event stream to be processed, and when the event indicated by the event stream to be processed is a selection/movement event, a View (View) component of the mobile terminal may be determined as an associated object of the event stream to be processed.
Step 105, triggering the associated object to execute the event indicated by the pending event stream.
In the embodiment of the application, the event stream to be processed is distributed to the corresponding associated object for processing. Specifically, when the event stream to be processed indicates a gesture navigation event, the event stream to be processed is distributed to a system interface application, and the corresponding gesture navigation event is executed; when the event stream to be processed indicates a selection/movement event, the event stream to be processed is distributed to the view component, and the corresponding selection/movement event is executed according to each touch point carried in the event stream to be processed.
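Steps 103 to 105 then reduce to a sketch like the following; the two flag values and the two dispatch targets (the System UI application and the View component) come from the text above, while the interface and method names are assumptions of this sketch:

    public final class EventStreamDispatcher {
        // Step 103 (C1/C2): the two values assigned to the event flag bit.
        static final int FLAG_GESTURE_NAVIGATION = 1; // first value
        static final int FLAG_SELECT_OR_MOVE = 0;     // second value

        // Driving layer: assign the flag bit from the matching result.
        static int assignFlag(boolean matched) {
            return matched ? FLAG_GESTURE_NAVIGATION : FLAG_SELECT_OR_MOVE;
        }

        // Hypothetical stand-ins for the System UI application and the
        // View component named in the text.
        interface SystemUiApp { void performGestureNavigation(TouchPoint[] stream); }
        interface ViewComponent { void performSelectOrMove(TouchPoint[] stream); }

        // Steps 104 + 105 (framework layer): read the flag bit and route
        // the event stream to its associated object.
        static void dispatch(int flag, TouchPoint[] stream,
                             SystemUiApp systemUi, ViewComponent view) {
            if (flag == FLAG_GESTURE_NAVIGATION) {
                systemUi.performGestureNavigation(stream); // gesture navigation event
            } else {
                view.performSelectOrMove(stream);          // selection/movement event
            }
        }
    }
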
Optionally, when the disappearance of the touch operation is detected, that is, when the user lifts the finger off the screen, the cache array is emptied and the event flag bit is reset; this indicates that the evaluation of the user's touch operation has ended. A minimal sketch of this cleanup follows.
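As a sketch, assuming the TouchPointCache class above (the session wrapper and its names are illustrative assumptions):

    public final class TouchSession {
        private final TouchPointCache cache = new TouchPointCache();
        private int eventFlagBit; // the preset event flag bit

        // When the touch operation disappears (the user lifts the finger
        // off the screen): empty the cache array and reset the flag bit.
        public void onTouchOperationEnded() {
            cache.clear();
            eventFlagBit = 0;
        }
    }
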
Alternatively, some full-screen mobile terminals that support interface rotation have no fixed top and bottom. In that case, the current posture of the mobile terminal can be detected by an accelerometer and a gyroscope, and the screen top area (and top edge) and the screen bottom area (and bottom edge) in the current posture determined accordingly. For an ordinary mobile terminal, of course, the top and bottom areas of the screen can be preset directly according to the position of the camera: the top area of the screen is on the same side as the camera, and the bottom area is on the opposite side.
From the above, in the embodiment of the present application, the evaluation of the screen's event stream is completed at the driving layer, and the framework layer is triggered to distribute the event stream directly to the corresponding associated object for processing based on the evaluation result; in other words, the flow direction of the event stream is decided at its source, so the Android system's own event stream processing does not need to be modified, compatibility problems are avoided, and stability is greatly enhanced. Further, since the edge false-touch logic is also evaluated at the driving layer, the embodiment of the present application raises the decision of whether an event stream is gesture navigation to the same level as the edge false-touch logic, which can improve the response speed of the gesture navigation function.
Example 2
The second embodiment of the present application provides an event stream processing device, where the event stream processing device may be integrated in a mobile terminal. As shown in fig. 3, an event stream processing apparatus 300 in an embodiment of the present application includes:
an obtaining unit 301, configured to obtain, when a driving layer of a mobile terminal obtains a to-be-processed event stream, more than one touch point from the to-be-processed event stream, where the to-be-processed event stream is generated based on a touch operation of a user on a screen of the mobile terminal;
A matching unit 302, configured to match the above-mentioned one or more touch points with a preset gesture navigation condition;
the assigning unit 303 is configured to assign a preset event flag bit according to a matching result;
a determining unit 304, configured to determine, by the frame layer of the mobile terminal, an event indicated by the to-be-processed event stream based on the assigned event flag bit, and determine an associated object of the to-be-processed event stream according to the event indicated by the to-be-processed event stream;
a triggering unit 305, configured to trigger the associated object to execute the event indicated by the pending event stream.
Optionally, the event stream processing apparatus 300 further includes:
the creation unit is used for creating and initializing a cache array;
accordingly, the acquiring unit 301 is specifically configured to acquire more than one touch point from the to-be-processed event stream and store the touch point in the cache array when the driving layer acquires the to-be-processed event stream.
Optionally, the event stream processing apparatus 300 further includes:
the clearing unit is used for clearing the cache array when the touch operation is monitored to disappear;
and the reset unit is used for resetting the event zone bit when the touch operation is detected to disappear.
Optionally, the acquiring unit 301 includes:
an acquisition time reading subunit, configured to read an acquisition time of each touch point included in the event stream to be processed;
and the touch point storage subunit is used for determining more than one touch point in the event stream to be processed based on the order of the acquisition time from early to late and storing the touch points in the cache array.
Optionally, the matching unit 302 includes:
a position obtaining subunit, configured to obtain a position of a start touch point on a screen, where the start touch point is a touch point with earliest obtaining time in the above one or more touch points;
the first matching subunit is configured to match the one or more touch points with a preset first gesture navigation condition if the position of the initial touch point on the screen belongs to a preset screen bottom area, where the first gesture navigation condition is: the average displacement of the more than one touch points is larger than a preset displacement threshold value;
a second matching subunit, configured to determine, according to the position of the start touch point on the screen, a screen edge associated with the one or more touch points if the position of the start touch point on the screen belongs to a preset screen top area or a preset screen side area, and match the one or more touch points with a preset second gesture navigation condition, where the second gesture navigation condition is: the average displacement of the more than one touch points is larger than the displacement threshold, and an included angle formed by the more than one touch points and the associated screen edge is larger than a preset angle threshold.
Optionally, the assigning unit 303 is specifically configured to assign the event flag bit to a first value if the matching result is successful, where the first value is used to mark that the to-be-processed event stream indicates a gesture navigation event, and assign the event flag bit to a second value if the matching result is unsuccessful, where the second value is used to mark that the to-be-processed event stream indicates a selection/movement event.
Optionally, the determining unit 304 is specifically configured to determine the system interface application of the mobile terminal as the associated object of the to-be-processed event stream if the event indicated by the to-be-processed event stream is a gesture navigation event, and determine the view component of the mobile terminal as the associated object of the to-be-processed event stream if the event indicated by the to-be-processed event stream is a selection/movement event.
From the above, in the embodiment of the present application, the event stream processing device completes the evaluation of the screen's event stream at the driving layer, and triggers the framework layer to distribute the event stream directly to the corresponding associated object for processing based on the evaluation result; in other words, the flow direction of the event stream is decided at its source, so the Android system's own event stream processing does not need to be modified, compatibility problems are avoided, and stability is greatly enhanced. Further, since the edge false-touch logic is also evaluated at the driving layer, the embodiment of the present application raises the decision of whether an event stream is gesture navigation to the same level as the edge false-touch logic, which can improve the response speed of the gesture navigation function.
Example 3
Referring to fig. 4, a mobile terminal 4 in an embodiment of the present application includes: a memory 401, one or more processors 402 (only one shown in fig. 4) and a computer program stored on the memory 401 and executable on the processors. Wherein: the memory 401 is used for storing software programs and units, and the processor 402 executes various functional applications and data processing by running the software programs and units stored in the memory 401 to obtain resources corresponding to the preset events. Specifically, the processor 402 realizes the following steps by running the above-described computer program stored in the memory 401:
when a driving layer of a mobile terminal acquires an event stream to be processed, acquiring more than one touch point from the event stream to be processed, wherein the event stream to be processed is generated based on touch operation of a user on a screen of the mobile terminal;
matching the more than one touch points with preset gesture navigation conditions;
assigning a preset event flag bit according to the matching result;
the frame layer of the mobile terminal determines an event indicated by the event stream to be processed based on the assigned event flag bit, and determines an associated object of the event stream to be processed according to the event indicated by the event stream to be processed;
Triggering the associated object to execute the event indicated by the pending event stream.
Assuming that the above is a first possible implementation, in a second possible implementation provided on the basis of the first possible implementation, the processor 402 implements the following further steps by running the above-mentioned computer program stored in the memory 401:
creating and initializing a cache array;
correspondingly, when the driving layer of the mobile terminal acquires the event stream to be processed, acquiring more than one touch point from the event stream to be processed comprises the following steps:
when the driving layer acquires the event stream to be processed, more than one touch point is acquired from the event stream to be processed and stored in the cache array.
In a third possible implementation provided on the basis of the second possible implementation, the processor 402 implements the further steps by running the above-mentioned computer program stored in the memory 401:
when the touch operation is detected to disappear, the cache array is emptied, and the event flag bit is reset.
In a fourth possible implementation provided on the basis of the second possible implementation, the acquiring one or more touch points from the event stream to be processed and storing them in the cache array includes:
Reading the acquisition time of each touch point contained in the event stream to be processed;
and determining more than one touch point in the event stream to be processed based on the sequence from the early to the late of the acquisition time, and storing the touch points in the cache array.
In a fifth possible implementation provided on the basis of the first possible implementation, or on the basis of the second possible implementation, or on the basis of the third possible implementation, or on the basis of the fourth possible implementation, the matching the one or more touch points with a preset gesture navigation condition includes:
acquiring the position of a starting touch point on a screen, wherein the starting touch point is the touch point with the earliest acquisition time in the more than one touch points;
if the position of the initial touch point on the screen belongs to a preset screen bottom area, matching the more than one touch points with preset first gesture navigation conditions, wherein the first gesture navigation conditions are as follows: the average displacement of the more than one touch points is larger than a preset displacement threshold value;
if the position of the initial touch point on the screen belongs to a preset screen top area or screen side area, determining a screen edge associated with the more than one touch point according to the position of the initial touch point on the screen, and matching the more than one touch point with a preset second gesture navigation condition, wherein the second gesture navigation condition is as follows: the average displacement of the more than one touch points is larger than the displacement threshold, and an included angle formed by the more than one touch points and the associated screen edge is larger than a preset angle threshold.
In a sixth possible implementation provided on the basis of the first, second, third, or fourth possible implementation, the assigning the preset event flag bit according to the matching result includes:
if the matching result is that the matching is successful, the event flag bit is assigned to be a first value, and the first value is used for marking that the event stream to be processed indicates a gesture navigation event;
if the matching result is that the matching fails, the event flag bit is assigned to a second value, and the second value is used for marking that the event stream to be processed indicates a selection/movement event.
In a seventh possible implementation provided on the basis of the sixth possible implementation, the determining an associated object of the event stream to be processed according to the event indicated by the event stream to be processed includes:
if the event indicated by the event stream to be processed is a gesture navigation event, determining a system interface application of the mobile terminal as an associated object of the event stream to be processed;
And if the event indicated by the event stream to be processed is a selection/movement event, determining the view component of the mobile terminal as an associated object of the event stream to be processed.
It should be appreciated that in embodiments of the present application, the processor 402 may be a central processing unit (Central Processing Unit, CPU), which may also be other general purpose processors, digital signal processors (Digital Signal Processor, DSPs), application specific integrated circuits (Application Specific Integrated Circuit, ASICs), off-the-shelf programmable gate arrays (Field-Programmable Gate Array, FPGAs) or other programmable logic devices, discrete gate or transistor logic devices, discrete hardware components, or the like. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
Memory 401 may include read-only memory and random access memory, and provides instructions and data to processor 402. Some or all of memory 401 may also include non-volatile random access memory. For example, the memory 401 may also store information of a device class.
From the above, in the embodiment of the present application, the mobile terminal completes the evaluation of the screen's event stream at the driving layer, and triggers the framework layer to distribute the event stream directly to the corresponding associated object for processing based on the evaluation result; in other words, the flow direction of the event stream is decided at its source, so the Android system's own event stream processing does not need to be modified, compatibility problems are avoided, and stability is greatly enhanced. Further, since the edge false-touch logic is also evaluated at the driving layer, the embodiment of the present application raises the decision of whether an event stream is gesture navigation to the same level as the edge false-touch logic, which can improve the response speed of the gesture navigation function.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-described division of the functional units and modules is illustrated, and in practical application, the above-described functional distribution may be performed by different functional units and modules according to needs, i.e. the internal structure of the apparatus is divided into different functional units or modules to perform all or part of the above-described functions. The functional units and modules in the embodiment may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit, where the integrated units may be implemented in a form of hardware or a form of a software functional unit. In addition, the specific names of the functional units and modules are only for distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working process of the units and modules in the above system may refer to the corresponding process in the foregoing method embodiment, which is not described herein again.
In the foregoing embodiments, the descriptions of the embodiments are emphasized, and in part, not described or illustrated in any particular embodiment, reference is made to the related descriptions of other embodiments.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or as combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and the design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other manners. For example, the system embodiments described above are merely illustrative, e.g., the division of modules or units described above is merely a logical functional division, and there may be additional divisions when actually implemented, e.g., multiple units or components may be combined or integrated into another system, or some features may be omitted, or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed may be an indirect coupling or communication connection via interfaces, devices or units, which may be in electrical, mechanical or other forms.
The units described above as separate components may or may not be physically separate, and components shown as units may or may not be physical units, may be located in one place, or may be distributed over a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
The integrated units described above, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a computer readable storage medium. Based on such understanding, all or part of the flow of the methods of the above embodiments may also be completed by instructing the relevant hardware through a computer program, which may be stored in a computer readable storage medium and which, when executed by a processor, implements the steps of each of the method embodiments described above. The computer program comprises computer program code, which may be in source code form, object code form, an executable file, some intermediate form, or the like. The computer readable storage medium may include: any entity or device capable of carrying the computer program code, a recording medium, a USB flash drive, a removable hard disk, a magnetic disk, an optical disk, a computer readable memory, a read-only memory (ROM), a random access memory (RAM), an electrical carrier signal, a telecommunications signal, a software distribution medium, and so on. It should be noted that the content of the computer readable storage medium may be appropriately increased or decreased according to the requirements of legislation and patent practice in the relevant jurisdiction; for example, in some jurisdictions, legislation and patent practice provide that a computer readable storage medium does not include electrical carrier signals and telecommunication signals.
The above embodiments are only for illustrating the technical solution of the present application, and are not limiting; although the application has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit and scope of the technical solutions of the embodiments of the present application, and are intended to be included in the scope of the present application.

Claims (10)

1. A method of event stream processing, comprising:
when a driving layer of a mobile terminal acquires an event stream to be processed, acquiring more than one touch point from the event stream to be processed, wherein the event stream to be processed is generated based on touch operation of a user on a screen of the mobile terminal;
matching the more than one touch points with preset gesture navigation conditions;
assigning a preset event flag bit according to the matching result;
the frame layer of the mobile terminal determines an event indicated by the event stream to be processed based on the assigned event flag bit, and determines an associated object of the event stream to be processed according to the event indicated by the event stream to be processed;
Triggering the associated object to execute the event indicated by the event stream to be processed;
the matching the more than one touch point with a preset gesture navigation condition includes:
acquiring the position of a starting touch point on a screen, wherein the starting touch point is the touch point with the earliest acquisition time in the more than one touch points;
if the position of the initial touch point on the screen belongs to a preset screen bottom area, matching the more than one touch points with preset first gesture navigation conditions, wherein the first gesture navigation conditions are as follows: the average displacement of the more than one touch points is greater than a preset displacement threshold.
2. The event stream processing method as claimed in claim 1, wherein the event stream processing method further comprises:
creating and initializing a cache array;
correspondingly, when the driving layer of the mobile terminal acquires the event stream to be processed, acquiring more than one touch point from the event stream to be processed comprises the following steps:
when the driving layer acquires the event stream to be processed, more than one touch point is acquired from the event stream to be processed and stored in the cache array.
3. The event stream processing method according to claim 2, wherein the event stream processing method further comprises:
and when the touch operation is monitored to disappear, the cache array is emptied, and the event flag bit is reset.
4. The event stream processing method as set forth in claim 2, wherein said acquiring more than one touch point from the event stream to be processed and storing them in the cache array comprises:
reading the acquisition time of each touch point contained in the event stream to be processed;
and determining more than one touch point in the event stream to be processed based on the order of the acquisition time from early to late, and storing the touch points in the cache array.
5. The event stream processing method according to any one of claims 1 to 4, wherein said matching said one or more touch points with a preset gesture navigation condition further comprises:
if the position of the initial touch point on the screen belongs to a preset screen top area or screen side area, determining the screen edge associated with more than one touch point according to the position of the initial touch point on the screen, and matching the more than one touch point with a preset second gesture navigation condition, wherein the second gesture navigation condition is as follows: the average displacement of the more than one touch points is larger than the displacement threshold, and an included angle formed by the more than one touch points and the associated screen edge is larger than a preset angle threshold.
6. The method for processing an event stream according to any one of claims 1 to 4, wherein assigning a preset event flag according to the matching result comprises:
if the matching result is that the matching is successful, the event flag bit is assigned to be a first value, and the first value is used for marking that the event stream to be processed indicates a gesture navigation event;
if the matching result is that the matching fails, the event flag bit is assigned to a second value, and the second value is used for marking that the event stream to be processed indicates a selection/movement event.
7. The event stream processing method as set forth in claim 6, wherein said determining an associated object of the event stream to be processed from the event indicated by the event stream to be processed comprises:
if the event indicated by the event stream to be processed is a gesture navigation event, determining a system interface application of the mobile terminal as an associated object of the event stream to be processed;
and if the event indicated by the event stream to be processed is a selection/movement event, determining a view component of the mobile terminal as an associated object of the event stream to be processed.
8. An event stream processing apparatus, comprising:
an acquisition unit, configured to acquire one or more touch points from an event stream to be processed when a driving layer of the mobile terminal acquires the event stream to be processed, wherein the event stream to be processed is generated based on a touch operation performed by a user on a screen of the mobile terminal;
a matching unit, configured to match the one or more touch points with a preset gesture navigation condition;
an assignment unit, configured to assign a value to a preset event flag bit according to the matching result;
a determining unit, configured to determine, at a framework layer of the mobile terminal, the event indicated by the event stream to be processed based on the assigned event flag bit, and to determine an associated object of the event stream to be processed according to the event indicated by the event stream to be processed;
a triggering unit, configured to trigger the associated object to execute the event indicated by the event stream to be processed;
wherein the matching unit comprises:
a position obtaining subunit, configured to obtain the position of an initial touch point on the screen, wherein the initial touch point is the touch point with the earliest acquisition time among the one or more touch points;
and a first matching subunit, configured to match the one or more touch points with a preset first gesture navigation condition if the position of the initial touch point on the screen belongs to a preset screen bottom area, wherein the first gesture navigation condition is: the average displacement of the one or more touch points is greater than a preset displacement threshold.
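For the first matching subunit, a minimal sketch of the first gesture navigation condition, again reusing the hypothetical TouchPoint type; the bottom-area bound, the coordinate convention (y grows downward), and the thresholds are all assumptions:

    import kotlin.math.hypot

    // First condition: the initial touch point lies in the preset screen
    // bottom area and the average displacement of the remaining points,
    // measured from the initial point, exceeds the displacement threshold.
    fun matchesFirstCondition(
        points: List<TouchPoint>,
        screenHeightPx: Float,
        bottomAreaHeightPx: Float,
        displacementThreshold: Double
    ): Boolean {
        if (points.size < 2) return false
        val start = points.minByOrNull { it.acquisitionTimeMs }!!
        // Position check: the earliest touch point must fall in the bottom area.
        if (start.y < screenHeightPx - bottomAreaHeightPx) return false
        val avgDisplacement = points.filter { it !== start }
            .map { hypot((it.x - start.x).toDouble(), (it.y - start.y).toDouble()) }
            .average()
        return avgDisplacement > displacementThreshold
    }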
9. A mobile terminal comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, characterized in that the processor implements the method according to any one of claims 1 to 7 when executing the computer program.
10. A computer-readable storage medium storing a computer program, characterized in that the computer program, when executed by a processor, implements the method according to any one of claims 1 to 7.
CN202010054558.3A 2020-01-17 2020-01-17 Event stream processing method, event stream processing device and mobile terminal Active CN111258455B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010054558.3A CN111258455B (en) 2020-01-17 2020-01-17 Event stream processing method, event stream processing device and mobile terminal

Publications (2)

Publication Number Publication Date
CN111258455A (en) 2020-06-09
CN111258455B (en) 2023-08-18

Family

ID=70947121

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010054558.3A Active CN111258455B (en) 2020-01-17 2020-01-17 Event stream processing method, event stream processing device and mobile terminal

Country Status (1)

Country Link
CN (1) CN111258455B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114995738B (en) * 2022-05-31 2023-06-16 重庆长安汽车股份有限公司 Transformation method, transformation device, electronic equipment, storage medium and program product
CN115168354B (en) * 2022-07-11 2023-06-30 广州市玄武无线科技股份有限公司 Integrated processing method and device for event stream of mobile terminal

Patent Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105511675A (en) * 2015-11-20 2016-04-20 努比亚技术有限公司 Touch control method, user equipment, input processing method, mobile terminal and intelligent terminal
CN105511794A (en) * 2015-12-14 2016-04-20 中国电子科技集团公司第十五研究所 Plotting system supporting multi-point touch gesture operation and method of system
WO2018045598A1 (en) * 2016-09-12 2018-03-15 深圳前海达闼云端智能科技有限公司 Electronic device
CN107450837A (en) * 2017-07-28 2017-12-08 广东欧珀移动通信有限公司 Respond method, apparatus, storage medium and the mobile terminal of blank screen gesture
CN107463329A (en) * 2017-07-28 2017-12-12 广东欧珀移动通信有限公司 Detection method, device, storage medium and the mobile terminal of blank screen gesture
CN107479816A (en) * 2017-07-28 2017-12-15 广东欧珀移动通信有限公司 Recognition methods, device, storage medium and the mobile terminal of blank screen gesture
CN107390996A (en) * 2017-08-02 2017-11-24 维沃移动通信有限公司 A kind of processing method and mobile terminal of power key false touch
CN109828807A (en) * 2018-12-24 2019-05-31 天津字节跳动科技有限公司 Method, apparatus, electronic equipment and the storage medium of the small routine gesture switching page
CN109766043A (en) * 2018-12-29 2019-05-17 华为技术有限公司 The operating method and electronic equipment of electronic equipment
CN109842727A (en) * 2019-01-31 2019-06-04 Oppo广东移动通信有限公司 Report method, device, terminal and the storage medium of touching signals
CN110531864A (en) * 2019-09-18 2019-12-03 华为技术有限公司 A kind of gesture interaction method, device and terminal device

Similar Documents

Publication Title
CN107329750B (en) Identification method and skip method of advertisement page in application program and mobile terminal
CN107678534B (en) Method for processing event signals and event-based sensor for carrying out said method
JP6214547B2 (en) Measuring the rendering time of a web page
CN108874289B (en) Application history record viewing method and device and electronic equipment
CN111258455B (en) Event stream processing method, event stream processing device and mobile terminal
RU2556079C2 (en) User data input
CN105320417B (en) Page switching method and client
JP6951408B2 (en) Wake-up method and device for voice recognition function in mobile terminals
JP6141262B2 (en) Removal and correction of target ambiguity
CN110764906B (en) Memory recovery processing method and device, electronic equipment and storage medium
WO2020215959A1 (en) Game object control method and apparatus
CN107577415B (en) Touch operation response method and device
US20150084877A1 (en) Touch device and method for dynamically setting touch inactive area, and non-transitory recording medium
CN108427737B (en) Data cleaning method, equipment and computer readable medium
WO2019179028A1 (en) Electronic device, user authentication method based on dynamic pictures, and storage medium
CN110442267A (en) Touch operation response method, device, mobile terminal and storage medium
CN108984339B (en) Data recovery method and related product
CN105224216A (en) A kind of user terminal control method and user terminal
CN110888628B (en) Method, apparatus, device and storage medium for generating control tool
CN109842727B (en) Touch signal reporting method and device, terminal and storage medium
CN111597009B (en) Application program display method and device and terminal equipment
CN107562346A (en) Terminal control method, device, terminal and computer-readable recording medium
CN108710477B (en) Display method, mobile terminal and storage medium
CN111061429B (en) Data access method, device, equipment and medium
JP7413513B2 (en) Human-computer interaction methods, devices, and systems

Legal Events

Code Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant