CN113031812A - Touch event reporting method and device, terminal and storage medium - Google Patents

Touch event reporting method and device, terminal and storage medium

Info

Publication number
CN113031812A
Authority
CN
China
Prior art keywords
touch
terminal
touch screen
processing module
upper layer
Prior art date
Legal status
Withdrawn
Application number
CN202110291923.7A
Other languages
Chinese (zh)
Inventor
古启才
Current Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN202110291923.7A
Publication of CN113031812A
Priority to PCT/CN2022/079675 (published as WO2022193988A1)
Current legal status: Withdrawn

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0416Control or interface arrangements specially adapted for digitisers

Abstract

The embodiments of the present application provide a touch event reporting method and device, a terminal, and a storage medium, relating to the field of terminal technology. The method is applied to a terminal that includes an upper layer processing module and a touch screen module, and includes the following steps: the upper layer processing module determines state information of the terminal; in response to the terminal being in a landscape state, the upper layer processing module sends a first instruction to the touch screen module; the touch screen module sets its touch precision to a first touch precision based on the first instruction and reports a first touch event to the upper layer processing module according to the first touch precision. By determining whether the terminal is in the landscape state and, when it is, setting the touch precision to a high value, the method reports touch events at high precision.

Description

Touch event reporting method and device, terminal and storage medium
Technical Field
The embodiments of the present application relate to the field of terminal technology, and in particular to a touch event reporting method and device, a terminal, and a storage medium.
Background
With the development of terminal technology, mobile terminals are generally equipped with a touch screen that can sense the user's touch operations on the screen.
In the related art, a touch screen typically works by capacitive sensing: when a user's finger or another object contacts the sensing material on the touch screen surface, the capacitance changes, and the touch screen uses an algorithm to compute the touch points of the fingers or objects pressed on its surface.
Disclosure of Invention
The embodiment of the application provides a touch event reporting method, a touch event reporting device, a terminal and a storage medium. The technical scheme is as follows:
In one aspect, an embodiment of the present application provides a touch event reporting method, applied to a terminal that includes an upper layer processing module and a touch screen module, the method including:
the upper layer processing module determines state information of the terminal, where the state information indicates whether the terminal is in a landscape state or a portrait state;
in response to the terminal being in the landscape state, the upper layer processing module sends a first instruction to the touch screen module;
the touch screen module sets its touch precision to a first touch precision based on the first instruction, and reports a first touch event to the upper layer processing module according to the first touch precision.
In another aspect, an embodiment of the present application provides a touch event reporting device, the device including:
an upper layer processing module, configured to determine state information of the terminal, where the state information indicates whether the terminal is in a landscape state or a portrait state, and to send, in response to the terminal being in the landscape state, a first instruction to the touch screen module;
a touch screen module, configured to set its touch precision to a first touch precision based on the first instruction, and to report a first touch event to the upper layer processing module according to the first touch precision.
In another aspect, an embodiment of the present application provides a terminal including a processor and a memory, where the memory stores a computer program that is loaded and executed by the processor to implement the touch event reporting method of the above aspect.
In another aspect, an embodiment of the present application provides a computer-readable storage medium storing a computer program that is loaded and executed by a processor to implement the touch event reporting method of the above aspect.
In yet another aspect, embodiments of the present application provide a computer program product including computer instructions stored in a computer-readable storage medium. A processor of the terminal reads the computer instructions from the storage medium and executes them, causing the terminal to perform the touch event reporting method.
The technical scheme provided by the embodiment of the application can bring the following beneficial effects:
By determining whether the terminal is in the landscape state and, when it is, setting the touch precision to a high value, touch events are reported at high precision.
Drawings
Fig. 1 is a flowchart of a touch event reporting method according to an embodiment of the present application;
fig. 2 is a flowchart of a touch event reporting method according to another embodiment of the present application;
fig. 3 is a schematic diagram illustrating a portrait screen state and a landscape screen state of a terminal according to an embodiment of the present application;
fig. 4 is a block diagram of a touch event reporting apparatus according to an embodiment of the present application;
fig. 5 is a block diagram of a terminal according to an embodiment of the present application.
Detailed Description
To make the objects, technical solutions and advantages of the present application more clear, embodiments of the present application will be described in further detail below with reference to the accompanying drawings.
Referring to fig. 1, a flowchart of a touch event reporting method according to an embodiment of the present application is shown. The method may be applied to a terminal that includes an upper layer processing module and a touch screen module, and may include the following steps.
Step 101, the upper layer processing module determines the state information of the terminal.
The upper layer processing module is a module in the terminal that carries upper layer service functions. In the embodiments of the present application, it refers to the module that determines the state information of the terminal; it may also be called an upper layer system service, a terminal system service, and the like, which is not limited here. The terminal refers to an electronic device with a touch screen, for example a mobile phone, a tablet computer, a PC (Personal Computer), or a smart wearable device, which is not limited in the embodiments of the present application. Illustratively, the upper layer processing module is a software module, mainly responsible for handling the operating system, the user interface, applications, and the like.
In the embodiments of the present application, the state information of the terminal indicates whether the terminal is in a landscape state or a portrait state. Visually, the terminal generally has a long edge and a short edge: in the landscape state the long edges are at the top and bottom and the short edges at the left and right; in the portrait state the short edges are at the top and bottom and the long edges at the left and right. For example, when interface content is displayed on the touch screen of the terminal, the state information may also be called display state information, which indicates whether the terminal is in a landscape display state (displaying an application's user interface in landscape) or a portrait display state (displaying it in portrait).
In a possible implementation, the upper layer processing module may acquire acceleration data through an acceleration sensor and determine the state information of the terminal based on that data.
Step 102, in response to the terminal being in the landscape state, the upper layer processing module sends a first instruction to the touch screen module.
The touch screen module refers to the touch-screen-related module in the terminal. The upper layer processing module and the touch screen module can communicate with each other, illustratively through an I2C (Inter-Integrated Circuit) interface or an SPI (Serial Peripheral Interface) interface.
The first instruction is an instruction that sets the touch precision of the touch screen module to the first touch precision. Touch precision refers to the smallest change in the user's touch that the touch screen can sense: the smaller the touch precision value, the finer the change the touch screen can sense. Touch precision may also be called touch resolution, touch coordinate resolution, touch reporting range, and so on, which is not limited in the embodiments of the present application.
In a possible implementation, before sending the first instruction, the upper layer processing module may first determine whether an application is running in the foreground, and send the first instruction to the touch screen module only when a foreground application exists and the terminal is in the landscape state.
Step 103, the touch screen module sets its touch precision to the first touch precision based on the first instruction.
Illustratively, the first touch precision is higher than the terminal's current original touch precision; the higher the touch precision, the finer the recognition capability and the better the touch experience. Illustratively, the current original touch precision matches the display resolution: assuming a display resolution of 1080 × 2400, the original coordinate range of the touch screen may be 1080 × 2400, that is, the original touch precision is 1.
Illustratively, the touch screen module sets its touch precision from the original touch precision to the first touch precision based on the first instruction.
Illustratively, the first touch precision may be 0.25, 0.5, 0.2, and so on. When the first touch precision is 0.25, each coordinate scale in the original coordinate system of the touch screen is evenly split into 4 scales; when it is 0.5, each scale is split into 2; when it is 0.2, each scale is split into 5.
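The splitting described above amounts to a simple calculation. The sketch below is illustrative only (the function name `scaled_range` and its signature are assumptions, not part of the patent):

```python
def scaled_range(display_extent: int, precision: float) -> int:
    """Number of reportable coordinate steps along one axis.

    At the original touch precision of 1, the coordinate range equals
    the display resolution; a precision of 0.25 splits each original
    coordinate scale into 4 finer scales, and so on.
    """
    return round(display_extent / precision)

# A 1080 x 2400 display at the original precision of 1 keeps its range:
print(scaled_range(1080, 1.0))   # -> 1080
# At a first touch precision of 0.25 each scale splits into 4:
print(scaled_range(1080, 0.25))  # -> 4320
```

With a precision of 0.5 the 2400-pixel axis would report 4800 steps, and with 0.2 the 1080-pixel axis would report 5400.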
Step 104, the touch screen module reports the first touch event to the upper layer processing module according to the first touch precision.
The first touch event may be any touch event; it indicates a touch behavior, collected by the touch screen module, that the user triggers on the user interface of an application in the running state. For example, the first touch event may carry the touch point at which the user touches the touch screen, that is, the coordinate range of the touch.
In the embodiments of the present application, a touch precision matched to the terminal's state information is determined based on that state information, which can improve the touch experience in scenarios that require high precision.
To sum up, in the technical solution provided by this application, the terminal determines whether it is in the landscape state and, when it is, sets the touch precision to a high value and reports touch events at that precision. The touch precision is thus adjusted according to the terminal's state information, improving the diversity of touch recognition.
Referring to fig. 2, a flowchart of a touch event reporting method according to another embodiment of the present application is shown. The method may be applied to a terminal that includes an upper layer processing module and a touch screen module, and may include the following steps.
In step 201, the upper layer processing module obtains target acceleration data.
In this embodiment, the target acceleration data includes acceleration data in a first direction, a second direction, and a third direction, where the three directions are mutually perpendicular in pairs. The plane formed by the first and second directions is parallel to the touch screen in the touch screen module, and the third direction is perpendicular to the touch screen; the first direction runs along the short edge of the touch screen, and the second direction along its long edge.
Illustratively, an acceleration sensor collects the target acceleration data and sends it to the upper layer processing module. For example, the sensor may be a three-axis acceleration sensor that acquires acceleration data along the X, Y, and Z axes, where the X axis represents the first direction, the Y axis the second direction, and the Z axis the third direction.
Step 202, the upper layer processing module determines the state information of the terminal based on the target acceleration data.
In a possible implementation, step 202 may include several sub-steps as follows:
Step 202a, in response to the absolute value of the acceleration data in the first direction belonging to the target value range, the upper layer processing module determines that the terminal is in the landscape state.
Step 202b, in response to the absolute value of the acceleration data in the second direction belonging to the target value range, the upper layer processing module determines that the terminal is in the portrait state.
Referring to fig. 3, a schematic diagram of the portrait and landscape states of a terminal is shown. The top-left vertex A of the terminal's front panel serves as the coordinate origin: the direction along the top bezel of the front panel (the short edge of the touch screen) is the X axis, the direction along the left bezel (the long edge of the touch screen) is the Y axis, and the Z axis is perpendicular to the plane of the front panel (the touch screen), with the upward direction from the origin as positive and the downward direction as negative. Part (a) of fig. 3 shows the terminal in the portrait state, and part (b) shows it in the landscape state.
Illustratively, the target value range may be 8 to 9.8 (on the order of gravitational acceleration): when the acceleration data in the first direction (X direction) falls within 8 to 9.8 or −9.8 to −8, the terminal is in the landscape state; when the acceleration data in the second direction (Y direction) falls within those intervals, the terminal is in the portrait state. Of course, in other possible implementations the target value range may be different, which is not limited in the embodiments of the present application.
In a possible implementation manner, when the absolute value of the acceleration data in the first direction is greater than the absolute value of the acceleration data in the second direction, the upper layer processing module determines that the terminal is in the landscape state.
In a possible implementation manner, when the absolute value of the acceleration data in the second direction is greater than the absolute value of the acceleration data in the first direction, the upper layer processing module determines that the terminal is in the vertical screen state.
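The determinations in steps 202a and 202b can be combined into one illustrative helper. This is a sketch under the 8 to 9.8 target range given above; the function name and the "unknown" fallback are assumptions, not from the patent:

```python
def orientation(ax: float, ay: float, lo: float = 8.0, hi: float = 9.8) -> str:
    """Classify orientation from accelerometer readings (m/s^2 assumed).

    ax runs along the short edge of the screen (first direction),
    ay along the long edge (second direction).
    """
    if lo <= abs(ax) <= hi:
        return "landscape"   # gravity pulls along the short edge
    if lo <= abs(ay) <= hi:
        return "portrait"    # gravity pulls along the long edge
    # Neither axis dominates (e.g. the terminal lies flat on a desk):
    # fall back to other logic such as the foreground application.
    return "unknown"

print(orientation(9.5, 0.3))   # -> landscape
print(orientation(0.2, 9.6))   # -> portrait
print(orientation(0.1, 0.2))   # -> unknown
```

The "unknown" branch corresponds to the flat-on-a-desk case discussed below, where other decision logic is needed.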
Of course, in other possible implementation manners, the upper layer processing module may also determine the state information of the terminal through other determination logics based on the target acceleration data, which is not limited in this embodiment of the application.
In one possible situation the terminal is placed on a desktop or another object. In that case it cannot simply be determined from the acceleration data in the first and second directions whether the terminal is in the landscape or portrait state; the terminal may need to detect whether an instruction related to landscape display has been received, or decide based on the application in the foreground running state, because some applications only support landscape display.
It should be noted that applications requiring high touch precision are generally operated with the terminal held in both hands, so in most cases the landscape state can be judged from the acceleration data in the first direction alone. This does not rule out the case where the terminal is in the landscape state while lying flat; the upper layer processing module may then determine the display state through other decision logic and send the first instruction to the touch screen module once it determines that the terminal is in the landscape state.
For example, the upper layer processing module may determine the state information of the terminal based on the target acceleration data and its duration, where the duration refers to the time during which the value of the target acceleration data stays within a certain range.
In a possible implementation, when the absolute value of the acceleration data in the first direction belongs to the target value range and has remained there for longer than a preset duration, the upper layer processing module determines that the terminal is in the landscape state.
Likewise, when the absolute value of the acceleration data in the second direction belongs to the target value range and has remained there for longer than the preset duration, the upper layer processing module determines that the terminal is in the portrait state.
In this approach, the state information of the terminal is determined not only by whether the absolute value of the acceleration data belongs to the target value range, but also by how long it has stayed within that range. This avoids false triggers in which the upper layer processing module sends the first instruction to the touch screen module merely because the user quickly rotated the terminal, which effectively reduces the power consumption of the terminal and makes the instruction delivery more accurate.
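A minimal sketch of the duration check (the class name, the `update` API, and the 1-second threshold are hypothetical; the patent does not fix a concrete preset duration):

```python
class OrientationDebouncer:
    """Report the landscape state only after the first-direction reading
    has stayed in the target value range longer than a preset duration."""

    def __init__(self, hold_seconds: float = 1.0,
                 lo: float = 8.0, hi: float = 9.8):
        self.hold = hold_seconds
        self.lo, self.hi = lo, hi
        self.since = None  # time when the reading entered the range

    def update(self, ax: float, t: float) -> bool:
        """Feed one reading taken at time t; True means 'landscape'."""
        if self.lo <= abs(ax) <= self.hi:
            if self.since is None:
                self.since = t
            return (t - self.since) > self.hold
        self.since = None  # left the range: reset the timer
        return False
```

A quick rotation produces readings that enter and leave the range within the hold time, so `update` never returns True and no first instruction is sent.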
Step 203, in response to the terminal being in the landscape state, the upper layer processing module sends the first instruction to the touch screen module.
In a possible implementation manner, the upper layer processing module may send the first instruction to the touch screen module by:
in step 203a, the upper layer processing module determines an instruction corresponding to the application type of the application program based on the application type of the application program in the foreground running state.
An application in the foreground running state is an application whose user interface is currently displayed on the touch screen of the terminal. The application may be of any kind, such as a social, game, video, or music application.
Illustratively, the instruction may differ between application types. The upper layer processing module may determine the instruction corresponding to the type of the foreground application from a stored correspondence. For example, the touch precision corresponding to the following application types may decrease in order: game applications, shooting applications, social applications, life applications, and so on (this is merely an example and may be set according to the actual situation, which is not limited in the embodiments of the present application).
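The example ordering could be modelled as a small lookup table (the concrete precision values and the fallback are illustrative assumptions, not specified in the patent):

```python
# Hypothetical mapping; a smaller value means a finer touch precision,
# so game applications get the finest grid in this example ordering.
PRECISION_BY_APP_TYPE = {
    "game": 0.2,
    "shooting": 0.25,
    "social": 0.5,
    "life": 1.0,  # original precision
}

def instruction_for(app_type: str) -> float:
    """Pick the precision carried by the instruction for an app type,
    falling back to the original precision for unknown types."""
    return PRECISION_BY_APP_TYPE.get(app_type, 1.0)
```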
Because the instruction corresponds to the touch precision, the touch precision corresponding to different instructions is different, and the instruction is determined based on the application type, so that the touch precision is more diversified, and the touch identification requirement of the application program can be better met.
In step 203b, the upper layer processing module determines an instruction corresponding to the application type of the application program as a first instruction.
In possible implementations, the instructions corresponding to the same application type may also differ. The instruction corresponding to an application may be set by the user: for example, the user may set the instruction for each application, or for each application type, in a settings interface.
Step 203c, the upper layer processing module sends the first instruction to the touch screen module.
In a possible implementation, when the terminal is in the landscape state and the upper layer processing module determines that the application in the foreground running state is in an application white list, the upper layer processing module sends the first instruction to the touch screen module. The application white list indicates the set of applications that need a higher touch precision and contains at least one application. The white list may be set by the user: for example, the user selects, in a settings interface, an application expected to use higher touch precision, and the terminal receives the add instruction and adds that application to the white list.
The application white list may be stored in a memory, which the upper layer processing module accesses to obtain the list. After obtaining it, the upper layer processing module may determine whether an application is in the white list as follows: the module obtains the application's identification information and checks whether the white list includes it; if the white list includes the identification information, the application is in the white list, and otherwise it is not.
The application white list may include identification information of at least one application. The identification information uniquely identifies an application and may be the application's name or package name (the name of its installation package). In a possible implementation, a technician assigns each application a distinct serial number as its identification information; in that case the white list may include at least one serial number.
Taking the application name as the identification information: the upper layer processing module obtains the name of the application in the foreground running state and compares it with the identifiers of the applications included in the white list to determine whether the list contains that name. If the white list contains the name, the upper layer processing module determines that the application is in the white list; otherwise, it determines that the application is not.
When the identification information of the application is in the white list and the terminal is in the landscape state, the application needs a higher touch precision, and the upper layer processing module sends the first instruction to the touch screen module. Conditioning the precision adjustment on these two simple checks keeps the operation straightforward.
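The white-list check combined with the landscape condition reduces to a short predicate (the function and package names below are hypothetical):

```python
def should_send_first_instruction(app_id: str,
                                  whitelist: set,
                                  landscape: bool) -> bool:
    """Send the first instruction only when the foreground application
    is white-listed AND the terminal is in the landscape state."""
    return landscape and app_id in whitelist

# Hypothetical identification information (package names) in the white list:
WHITELIST = {"com.example.game", "com.example.camera"}
print(should_send_first_instruction("com.example.game", WHITELIST, True))   # -> True
print(should_send_first_instruction("com.example.game", WHITELIST, False))  # -> False
```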
In another possible implementation, when the application in the foreground running state is of a preset type and the terminal is in the landscape state, the upper layer processing module sends the first instruction to the touch screen module; for example, when the foreground application is a game application and the terminal is in the landscape state.
Step 204, the touch screen module sets its touch precision to the first touch precision based on the first instruction.
In a possible implementation, the touch screen module includes a touch screen driving unit and a touch screen assembly unit. Illustratively, the driving unit is a software unit and the assembly unit is a hardware unit. The touch screen assembly unit may have its own CPU (Central Processing Unit) and memory, and may perform data calculation (including but not limited to converting touch information into touch coordinates). The driving unit receives instructions from the upper layer processing module and communicates with the assembly unit.
Illustratively, step 204 may include several substeps as follows:
in step 204a, the touch screen driving unit receives a first instruction from the upper layer processing module.
And step 204b, the touch screen driving unit sends the first instruction to the touch screen assembly unit.
The upper layer processing module sends the first instruction to the touch screen driving unit; after receiving it, the touch screen driving unit forwards the first instruction to the touch screen assembly unit.
In step 204c, the touch screen assembly unit sets the touch accuracy to the first touch accuracy.
Illustratively, the touch screen assembly unit may set the first touch accuracy by:
1. The touch screen assembly unit accesses the register, and determines the first touch precision corresponding to the first instruction from the correspondence, stored in the register, between at least one instruction and the touch precision.
The touch screen module includes a register, which serves as a memory of the touch screen assembly unit. The register stores the correspondence between at least one instruction and the touch precision, and the touch screen assembly unit can determine, according to the correspondence, the touch precision corresponding to the instruction from the upper layer processing module.
2. The touch screen component unit sets the touch precision to a first touch precision.
The touch screen component unit sets the touch precision from the current original touch precision of the terminal to a first touch precision.
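A hedged sketch of sub-steps 204a to 204c follows; the correspondence table values and class names are assumptions, since the patent does not specify the concrete precision values stored in the register:

```python
# Hypothetical sketch: the touch screen assembly unit looks up, in its
# register, the touch precision corresponding to a received instruction,
# then applies it (sub-steps 204a-204c). Mapping values are assumed.
INSTRUCTION_PRECISION_TABLE = {   # correspondence stored in the register
    "FIRST_INSTRUCTION": 0.25,    # first (higher) touch precision
    "SECOND_INSTRUCTION": 1.0,    # second (original) touch precision
}

class TouchScreenAssemblyUnit:
    def __init__(self):
        self.touch_precision = 1.0  # terminal's current original precision

    def handle_instruction(self, instruction: str) -> None:
        # 1. Access the register and resolve the precision for the instruction.
        precision = INSTRUCTION_PRECISION_TABLE[instruction]
        # 2. Set the touch precision to the resolved value.
        self.touch_precision = precision

unit = TouchScreenAssemblyUnit()
unit.handle_instruction("FIRST_INSTRUCTION")
print(unit.touch_precision)  # 0.25
```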
In step 205, the touch screen module reports the first touch event to the upper processing module according to the first touch precision.
In a possible implementation, the first touch event includes: the touch coordinates at which the user touches the touch screen in the touch screen module. For example, taking a first touch precision of 0.25 as an example, the touch screen assembly unit may report that the touch coordinate at which the user touches the touch screen is 4. It should be noted that the touch coordinates reported by the touch screen assembly unit are integers.
In a possible implementation manner, the upper layer processing module receives the touch coordinates of the touch screen from the touch screen assembly unit and needs to convert them into actual processing data, which it obtains from the touch coordinates and the touch precision. For example, if the touch coordinate reported by the touch screen assembly unit is 1 and the touch precision is 0.25, the upper layer processing module determines that 1 × 0.25 = 0.25 is the actual processing data; if the reported touch coordinate is 2 and the touch precision is 0.25, the upper layer processing module determines that 2 × 0.25 = 0.5 is the actual processing data. After the upper layer processing module obtains the actual processing data, it responds to the touch operation of the user based on the actual processing data.
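The conversion described in the preceding paragraph is a single multiplication; a minimal sketch, using the example values from the text:

```python
def to_actual_data(reported_coordinate: int, touch_precision: float) -> float:
    """Upper layer module converts a reported integer coordinate into
    actual processing data: coordinate x precision."""
    return reported_coordinate * touch_precision

assert to_actual_data(1, 0.25) == 0.25   # example from the description
assert to_actual_data(2, 0.25) == 0.5    # example from the description
assert to_actual_data(4, 1.0) == 4.0     # step 208 example: precision 1
```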
For example, the touch coordinate system and the display resolution have a proportional mapping relationship: if the display resolution is 1080 × 2400, then in the touch coordinate system X ranges from 0 to 1080 × (1/touch precision) and Y ranges from 0 to 2400 × (1/touch precision).
Illustratively, the touch coordinate system and the actual physical size of the touch screen have a proportional relationship. For example, if the actual length of the touch screen is 12 cm and the corresponding X range is 0 to 1080 × 5, the minimum distance resolvable by one coordinate unit of the identified operation may be 12 cm / 5400 ≈ 0.0222 mm.
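The two proportional mappings above can be sketched as follows; this is an illustrative reading of the examples (a precision of 0.2 is assumed to produce the 1080 × 5 range), not the patent's code:

```python
# Hypothetical sketch of the proportional mappings: with display
# resolution W x H and touch precision p, coordinate ranges scale by 1/p;
# with a physical screen length L, the minimum resolvable step is
# L divided by the coordinate range.
def coordinate_ranges(width_px: int, height_px: int, precision: float):
    scale = 1.0 / precision
    return int(width_px * scale), int(height_px * scale)

x_max, y_max = coordinate_ranges(1080, 2400, 0.2)  # assumed precision 0.2 -> x5
print(x_max)          # 5400 (= 1080 * 5)
print(120.0 / x_max)  # ~0.0222 mm per coordinate unit on a 12 cm screen
```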
And step 206, responding to the terminal in the vertical screen state, and sending a second instruction to the touch screen module by the upper layer processing module.
In this embodiment of the application, the second instruction is an instruction for setting the touch precision of the touch screen module to the second touch precision.
When the terminal is in the vertical screen state, the application program in the foreground running state may not need high touch precision, so the second touch precision is lower than the first touch precision. The second touch precision may be the same as the current original touch precision of the terminal, or may be higher than the original touch precision but lower than the first touch precision. Assume that the current original touch precision of the terminal matches the display resolution; for example, if the display resolution is 1080 × 2400, the original coordinate range of the touch screen may also be 1080 × 2400, that is, the original touch precision is 1. Illustratively, the second touch precision is the original touch precision; in response to the terminal being in the vertical screen state, the upper layer processing module sends a second instruction to the touch screen module, where the second instruction is an instruction for setting the touch precision of the touch screen module to the original touch precision.
It should be noted that, at any given time, the terminal executes only one of step 203 and step 206; that is, when the terminal executes steps 203 to 205 it does not execute steps 206 to 208, and when the terminal executes steps 206 to 208 it does not execute steps 203 to 205.
And step 207, the touch screen module sets the touch precision of the touch screen module to be a second touch precision based on the second instruction.
And step 208, reporting the second touch event to the upper layer processing module by the touch screen module according to the second touch precision.
The second touch event may be any one of the touch events, and the second touch event may be a touch coordinate where the user touches the touch screen in the touch screen module.
For example, the second touch precision is 1, and the touch screen module reports that the touch coordinate of the touch screen touched by the user is 4, the upper processing module determines that 4 is actual processing data, and responds to the touch operation of the user based on the actual processing data.
When the terminal is in the vertical screen state, a lower touch precision is set, which prevents jitter side effects caused by excessively high precision and noise, thereby further improving the user experience.
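The two-branch flow of steps 203 to 208 can be sketched end to end as follows; the precision values are assumptions for illustration, and the class is a hypothetical stand-in for the cooperating modules:

```python
# Hypothetical end-to-end sketch of steps 203-208: the precision is
# switched whenever the orientation changes, and reported coordinates
# are interpreted with whichever precision is currently active.
FIRST_PRECISION = 0.25   # assumed first (higher) touch precision
SECOND_PRECISION = 1.0   # assumed second (original) touch precision

class Terminal:
    def __init__(self):
        self.precision = SECOND_PRECISION

    def on_orientation_change(self, is_landscape: bool) -> None:
        # Steps 203/206: only one branch executes at a time.
        self.precision = FIRST_PRECISION if is_landscape else SECOND_PRECISION

    def on_touch(self, reported_coordinate: int) -> float:
        # Steps 205/208: convert the integer coordinate to actual processing data.
        return reported_coordinate * self.precision

t = Terminal()
t.on_orientation_change(True)
print(t.on_touch(4))   # 1.0 in the horizontal screen state (high precision)
t.on_orientation_change(False)
print(t.on_touch(4))   # 4.0 in the vertical screen state (original precision)
```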
The following are embodiments of the apparatus of the present application that may be used to perform embodiments of the method of the present application. For details which are not disclosed in the embodiments of the apparatus of the present application, reference is made to the embodiments of the method of the present application.
Please refer to fig. 4, which is a block diagram illustrating a touch event reporting apparatus according to an embodiment of the present application, where the apparatus has a function of implementing the above method example, and the function may be implemented by hardware or by hardware executing corresponding software. The apparatus 400 may include:
an upper layer processing module 410, configured to determine state information of a terminal, where the state information of the terminal is used to indicate whether the terminal is in a landscape screen state or a portrait screen state;
in response to the terminal being in the landscape screen state, the upper layer processing module 410 is configured to send a first instruction to the touch screen module;
the touch screen module 420 is configured to set the touch accuracy of the touch screen module to a first touch accuracy based on the first instruction; and reporting a first touch event to the upper layer processing module according to the first touch precision.
To sum up, in the technical solution provided in this application, whether the terminal is in the landscape screen state is determined; when the terminal is in the landscape screen state, the touch precision is set to the high touch precision and the touch event is reported according to the high touch precision. The touch precision is thus adjusted according to the state information of the terminal, which improves the diversity of touch recognition.
In an exemplary embodiment, the upper layer processing module 410 is configured to:
acquiring target acceleration data, wherein the target acceleration data comprises acceleration data in a first direction, acceleration data in a second direction and acceleration data in a third direction, the first direction, the second direction and the third direction are mutually perpendicular in pairs, a plane formed by the first direction and the second direction is parallel to a touch screen in a touch screen module, the third direction is perpendicular to the touch screen, the first direction is along the short side direction of the touch screen, and the second direction is along the long side direction of the touch screen;
and determining the state information of the terminal based on the target acceleration data.
In an exemplary embodiment, the upper layer processing module 410 is configured to:
determining that the terminal is in a vertical screen state in response to the fact that the absolute value of the acceleration data in the second direction belongs to a target value range;
and determining that the terminal is in a horizontal screen state in response to the fact that the absolute value of the acceleration data in the first direction belongs to a target value range.
In an exemplary embodiment, the upper layer processing module 410 is configured to:
and determining that the terminal is in a horizontal screen state in response to that the absolute value of the acceleration data in the first direction belongs to the target value range and the time length of the absolute value of the acceleration data in the first direction belonging to the target value range is longer than a preset time length.
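The acceleration-based orientation detection described above, including the preset-duration condition for the landscape judgment, can be sketched as follows; the gravity threshold range and duration are assumptions, since the patent leaves the target value range unspecified:

```python
# Hypothetical sketch of the orientation detection: the terminal is
# judged to be in the horizontal screen state when |a_first| (short-side
# direction) stays in the target range longer than a preset duration,
# and in the vertical screen state when |a_second| (long-side direction)
# is in the target range. All thresholds below are assumed.
G = 9.8
TARGET_RANGE = (0.8 * G, 1.2 * G)   # assumed target value range
PRESET_DURATION = 0.5               # seconds, assumed preset time length

def orientation(samples):
    """samples: list of (timestamp_s, a_first, a_second).
    Returns 'landscape', 'portrait', or None if undetermined."""
    landscape_since = None
    state = None
    for t, a_first, a_second in samples:
        if TARGET_RANGE[0] <= abs(a_second) <= TARGET_RANGE[1]:
            state, landscape_since = "portrait", None
        elif TARGET_RANGE[0] <= abs(a_first) <= TARGET_RANGE[1]:
            if landscape_since is None:
                landscape_since = t
            if t - landscape_since > PRESET_DURATION:
                state = "landscape"
        else:
            landscape_since = None
    return state

samples = [(0.0, 9.8, 0.1), (0.3, 9.8, 0.1), (0.6, 9.8, 0.1)]
print(orientation(samples))  # landscape (|a_first| in range for > 0.5 s)
```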
In an exemplary embodiment, in response to the terminal being in a vertical screen state, the upper layer processing module 410 is further configured to send a second instruction to the touch screen module;
the touch screen module 420 is further configured to set the touch accuracy of the touch screen module to a second touch accuracy based on the second instruction; and reporting a second touch event to the upper layer processing module according to the second touch precision.
In an exemplary embodiment, the upper layer processing module 410 is further configured to:
determining an instruction corresponding to the application type of the application program based on the application type of the application program in the foreground running state;
determining an instruction corresponding to an application type of the application program as the first instruction;
and sending the first instruction to the touch screen module.
In an exemplary embodiment, the touch screen module includes a touch screen driving unit, a touch screen assembly unit (not shown in the drawings);
the touch screen driving unit is used for receiving a first instruction from the upper layer processing module; sending the first instruction to the touch screen assembly unit;
the touch screen component unit is used for setting the touch precision to the first touch precision.
In an exemplary embodiment, the touch screen assembly unit is configured to:
accessing a register, and determining a first touch precision corresponding to the first instruction from a corresponding relation between at least one instruction stored in the register and the touch precision;
and setting the touch precision as the first touch precision.
In an exemplary embodiment, the first touch event includes: and the user touches the touch coordinate of the touch screen in the touch screen module.
It should be noted that, when the apparatus provided in the foregoing embodiment implements the functions thereof, only the division of the functional modules is illustrated, and in practical applications, the functions may be distributed by different functional modules according to needs, that is, the internal structure of the apparatus may be divided into different functional modules to implement all or part of the functions described above. In addition, the apparatus and method embodiments provided by the above embodiments belong to the same concept, and specific implementation processes thereof are described in the method embodiments for details, which are not described herein again.
Referring to fig. 5, a block diagram of a terminal according to an embodiment of the present application is shown.
The terminal in the embodiment of the present application may include one or more of the following components: a processor 510 and a memory 520.
Processor 510 may include one or more processing cores. The processor 510 connects various parts within the overall terminal using various interfaces and lines, and performs various functions of the terminal and processes data by running or executing instructions, programs, code sets, or instruction sets stored in the memory 520 and by invoking data stored in the memory 520. Optionally, the processor 510 may be implemented in hardware using at least one of Digital Signal Processing (DSP), Field-Programmable Gate Array (FPGA), and Programmable Logic Array (PLA). Processor 510 may integrate one or a combination of a Central Processing Unit (CPU) and a modem. The CPU mainly handles the operating system, application programs, and the like; the modem is used to handle wireless communications. It is understood that the modem may not be integrated into the processor 510, but may be implemented by a separate chip.
Optionally, the processor 510, when executing the program instructions in the memory 520, implements the methods provided by the various method embodiments described above.
The Memory 520 may include a Random Access Memory (RAM) or a Read-Only Memory (ROM). Optionally, the memory 520 includes a non-transitory computer-readable medium. The memory 520 may be used to store instructions, programs, code sets, or instruction sets. The memory 520 may include a program storage area and a data storage area, wherein the program storage area may store instructions for implementing an operating system, instructions for at least one function, instructions for implementing the various method embodiments described above, and the like; the storage data area may store data created according to the use of the terminal, and the like.
The structure of the terminal described above is only illustrative, and in actual implementation, the terminal may include more or less components, such as: a display screen (touch screen), an acceleration sensor, a gravity sensor, and the like, which are not limited in this embodiment.
Those skilled in the art will appreciate that the configuration shown in fig. 5 is not intended to be limiting and may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components may be used.
In an exemplary embodiment, a computer-readable storage medium is further provided, where a computer program is stored in the computer-readable storage medium, and the computer program is loaded and executed by a processor of a computer device to implement each step in the above-mentioned touch event reporting method embodiment.
In an exemplary embodiment, a computer program product is provided that includes computer instructions stored in a computer readable storage medium. And the processor of the terminal reads the computer instruction from the computer readable storage medium, and executes the computer instruction, so that the terminal executes the touch event reporting method.
The above description is only exemplary of the present application and should not be taken as limiting the present application, and any modifications, equivalents, improvements and the like that are made within the spirit and principle of the present application should be included in the protection scope of the present application.

Claims (12)

1. A touch event reporting method is applied to a terminal, wherein the terminal comprises an upper layer processing module and a touch screen module, and the method comprises the following steps:
the upper layer processing module determines state information of the terminal, wherein the state information of the terminal is used for indicating whether the terminal is in a horizontal screen state or a vertical screen state;
responding to the situation that the terminal is in a horizontal screen state, and sending a first instruction to the touch screen module by the upper layer processing module;
the touch screen module sets the touch precision of the touch screen module to be first touch precision based on the first instruction; and reporting a first touch event to the upper layer processing module according to the first touch precision.
2. The method of claim 1, wherein the upper layer processing module determines the state information of the terminal, and comprises:
the upper layer processing module acquires target acceleration data, wherein the target acceleration data comprises acceleration data in a first direction, acceleration data in a second direction and acceleration data in a third direction, the first direction, the second direction and the third direction are mutually perpendicular in pairs, a plane formed by the first direction and the second direction is parallel to a touch screen in the touch screen module, the third direction is perpendicular to the touch screen, the first direction is along the short side direction of the touch screen, and the second direction is along the long side direction of the touch screen;
and the upper layer processing module determines the state information of the terminal based on the target acceleration data.
3. The method of claim 2, wherein the upper layer processing module determines the state information of the terminal based on the target acceleration data, comprising:
responding to the fact that the absolute value of the acceleration data in the second direction belongs to a target value range, and determining that the terminal is in a vertical screen state by the upper layer processing module;
and in response to that the absolute value of the acceleration data in the first direction belongs to a target value range, the upper layer processing module determines that the terminal is in a horizontal screen state.
4. The method according to claim 3, wherein the determining, by the upper processing module, that the terminal is in the landscape state in response to the absolute value of the acceleration data in the first direction belonging to a target value range includes:
and in response to that the absolute value of the acceleration data in the first direction belongs to the target value range, and the time length of the absolute value of the acceleration data in the first direction belonging to the target value range is longer than a preset time length, the upper layer processing module determines that the terminal is in a horizontal screen state.
5. The method of claim 1, further comprising:
responding to the terminal in a vertical screen state, and sending a second instruction to the touch screen module by the upper layer processing module;
the touch screen module sets the touch precision of the touch screen module to be a second touch precision based on the second instruction; and reporting a second touch event to the upper layer processing module according to the second touch precision.
6. The method of any of claims 1 to 5, wherein the upper layer processing module sends a first instruction to the touch screen module, comprising:
the upper layer processing module determines an instruction corresponding to the application type of the application program based on the application type of the application program in a foreground running state;
the upper layer processing module determines an instruction corresponding to the application type of the application program as the first instruction;
and the upper layer processing module sends the first instruction to the touch screen module.
7. The method of any one of claims 1 to 5, wherein the touch screen module comprises a touch screen drive unit, a touch screen assembly unit;
the touch screen module sets the touch precision of the touch screen module to the first touch precision based on the first instruction, including:
the touch screen driving unit receives a first instruction from the upper layer processing module; sending the first instruction to the touch screen assembly unit;
the touch screen component unit sets the touch precision to the first touch precision.
8. The method of claim 7, wherein the touch screen assembly unit setting the touch accuracy to the first touch accuracy comprises:
the touch screen assembly unit accesses a register and determines first touch precision corresponding to the first instruction from the corresponding relation between at least one instruction stored in the register and the touch precision;
the touch screen component unit sets the touch precision to the first touch precision.
9. The method of any of claims 1 to 5, wherein the first touch event comprises: and the user touches the touch coordinate of the touch screen in the touch screen module.
10. A touch event reporting device, comprising:
the upper layer processing module is used for determining the state information of the terminal, and the state information of the terminal is used for indicating whether the terminal is in a horizontal screen state or a vertical screen state;
responding to the terminal in the transverse screen state, wherein the upper layer processing module is used for sending a first instruction to the touch screen module;
the touch screen module is used for setting the touch precision of the touch screen module to be first touch precision based on the first instruction; and reporting a first touch event to the upper layer processing module according to the first touch precision.
11. A terminal, comprising a processor and a memory, wherein the memory stores a computer program, and the computer program is loaded by the processor and executed to implement the touch event reporting method according to any one of claims 1 to 9.
12. A computer-readable storage medium, wherein a computer program is stored in the computer-readable storage medium, and the computer program is loaded and executed by a processor to implement the touch event reporting method according to any one of claims 1 to 9.
CN202110291923.7A 2021-03-18 2021-03-18 Touch event reporting method and device, terminal and storage medium Withdrawn CN113031812A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202110291923.7A CN113031812A (en) 2021-03-18 2021-03-18 Touch event reporting method and device, terminal and storage medium
PCT/CN2022/079675 WO2022193988A1 (en) 2021-03-18 2022-03-08 Touch event reporting method and apparatus, terminal, and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110291923.7A CN113031812A (en) 2021-03-18 2021-03-18 Touch event reporting method and device, terminal and storage medium

Publications (1)

Publication Number Publication Date
CN113031812A true CN113031812A (en) 2021-06-25

Family

ID=76471459

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110291923.7A Withdrawn CN113031812A (en) 2021-03-18 2021-03-18 Touch event reporting method and device, terminal and storage medium

Country Status (2)

Country Link
CN (1) CN113031812A (en)
WO (1) WO2022193988A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022193988A1 (en) * 2021-03-18 2022-09-22 Oppo广东移动通信有限公司 Touch event reporting method and apparatus, terminal, and storage medium

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN204463063U (en) * 2014-12-30 2015-07-08 深圳欧菲光科技股份有限公司 Touch screen and adopt the electronic installation of this touch screen
CN105022546A (en) * 2009-06-11 2015-11-04 株式会社村田制作所 Touch panel and touch type input device
CN107861655A (en) * 2017-11-01 2018-03-30 平安科技(深圳)有限公司 Control matching process, device, computer equipment and storage medium
CN108345438A (en) * 2018-02-06 2018-07-31 深圳市恒晨电器有限公司 Cell phone platform is applied to the physics transverse screen display method of vehicular platform
CN108700977A (en) * 2017-04-20 2018-10-23 华为技术有限公司 A kind of signal reporting method and device
CN112445358A (en) * 2019-08-29 2021-03-05 Oppo(重庆)智能科技有限公司 Adjusting method, terminal and computer storage medium

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9304575B2 (en) * 2013-11-26 2016-04-05 Apple Inc. Reducing touch sensor panel power consumption
CN108111689A (en) * 2017-12-26 2018-06-01 深圳市万普拉斯科技有限公司 Dynamic regulating method, device and the mobile terminal of pressure touch
CN108762554B (en) * 2018-05-18 2021-09-17 北京硬壳科技有限公司 Touch event response method and device
CN110069165A (en) * 2019-04-29 2019-07-30 广州视源电子科技股份有限公司 Processing method, device and the equipment and storage medium of touch data
CN110442263A (en) * 2019-07-23 2019-11-12 深圳市锐尔觅移动通信有限公司 Touching display screen processing method, device, storage medium and electronic equipment
CN111124173B (en) * 2019-11-22 2023-05-16 Oppo(重庆)智能科技有限公司 Working state switching method and device of touch screen, mobile terminal and storage medium
CN113031812A (en) * 2021-03-18 2021-06-25 Oppo广东移动通信有限公司 Touch event reporting method and device, terminal and storage medium
CN113031814A (en) * 2021-03-18 2021-06-25 Oppo广东移动通信有限公司 Touch event reporting method and device, terminal and storage medium



Also Published As

Publication number Publication date
WO2022193988A1 (en) 2022-09-22

Similar Documents

Publication Publication Date Title
US10796133B2 (en) Image processing method and apparatus, and electronic device
CN102736854B (en) Communication terminal and the screen adjustment method based on this communication terminal
EP3082025A1 (en) Touch input processing method and electronic device for supporting the same
CN105867751B (en) Operation information processing method and device
WO2017107086A1 (en) Touch gesture detection assessment
CN107450841B (en) Interactive object control method and device
CN106453832B (en) A kind of report method falling data, device and mobile terminal
US11487377B2 (en) Electronic device acquiring user input when in submerged state by using pressure sensor, and method for controlling electronic device
WO2019100407A1 (en) Positioning of terminal screen based on transformation relation of coordinates of marking graphic points in pattern
CN110784672B (en) Video data transmission method, device, equipment and storage medium
CN113031812A (en) Touch event reporting method and device, terminal and storage medium
CN106445698B (en) Method and device for acquiring step counting data
CN111064842A (en) Method, terminal and storage medium for recognizing special-shaped touch
CN113031814A (en) Touch event reporting method and device, terminal and storage medium
CN103543933A (en) Method for selecting files and touch terminal
CN110275639B (en) Touch data processing method and device, terminal and storage medium
KR20180088859A (en) A method for changing graphics processing resolution according to a scenario,
CN106651968B (en) Game abnormity detection method and device
CN109240531A (en) Sampling compensation method, device, mobile terminal and the storage medium of touch data
CN113126868B (en) Unread message identifier clearing method and device and electronic equipment
US9274703B2 (en) Method for inputting instruction and portable electronic device and computer readable recording medium
CN113780291A (en) Image processing method and device, electronic equipment and storage medium
CN112857613A (en) Method and device for determining shell temperature, storage medium and electronic equipment
CN106095163B (en) Driving device, the display device of touch screen
US20180113530A1 (en) Capacitive sensing device and detection method for an irregular conductive matter in a touch event

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WW01 Invention patent application withdrawn after publication (application publication date: 20210625)