CN113031814A - Touch event reporting method and device, terminal and storage medium - Google Patents


Info

Publication number
CN113031814A
Authority
CN
China
Prior art keywords
touch
touch screen
instruction
precision
application
Prior art date
Legal status
Pending
Application number
CN202110292806.2A
Other languages
Chinese (zh)
Inventor
古启才
Current Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN202110292806.2A
Publication of CN113031814A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0416 Control or interface arrangements specially adapted for digitisers

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The embodiment of the application provides a touch event reporting method, a touch event reporting device, a terminal and a storage medium, relating to the technical field of terminals. The method comprises the following steps: in response to the application program in the foreground running state being in a target scene, the upper layer processing module sends a first instruction to the touch screen module; the touch screen module sets its touch precision to a first touch precision based on the first instruction, and reports a first touch event to the upper layer processing module according to the first touch precision. By determining whether the application program is in the target scene, setting the touch precision to a high touch precision when it is, and reporting touch events according to that high touch precision, the method adjusts the touch precision per application program and improves the diversity of touch recognition.

Description

Touch event reporting method and device, terminal and storage medium
Technical Field
The embodiment of the application relates to the technical field of terminals, in particular to a touch event reporting method, a touch event reporting device, a terminal and a storage medium.
Background
With the development of terminal technology, mobile terminals are generally equipped with a touch screen, which can sense the user's touch operations on it.
In the related art, a touch screen typically operates on the principle of capacitance sensing: when a user's finger or another object contacts the sensing material on the surface of the touch screen, the capacitance changes, and the touch screen calculates through an algorithm how many fingers or objects are pressed on its surface.
Disclosure of Invention
The embodiment of the application provides a touch event reporting method, a touch event reporting device, a terminal and a storage medium. The technical scheme is as follows:
in one aspect, an embodiment of the present application provides a method for reporting a touch event, where the method includes:
responding to the situation that the application program in the foreground running state is in a target scene, and sending a first instruction to the touch screen module by the upper layer processing module;
the touch screen module sets the touch precision of the touch screen module to be first touch precision based on the first instruction; and reporting a first touch event to the upper layer processing module according to the first touch precision.
On the other hand, an embodiment of the present application provides a touch event reporting device, where the device includes:
the upper layer processing module is used for sending a first instruction to the touch screen module in response to the situation that the application program in the foreground running state is in the target scene;
the touch screen module is used for setting the touch precision of the touch screen module to be first touch precision based on the first instruction; and reporting a first touch event to the upper layer processing module according to the first touch precision.
On the other hand, an embodiment of the present application provides a terminal, where the terminal includes a processor and a memory, where the memory stores a computer program, and the computer program is loaded and executed by the processor to implement the touch event reporting method according to the above aspect.
In another aspect, an embodiment of the present application provides a computer-readable storage medium, where a computer program is stored in the computer-readable storage medium, and the computer program is loaded and executed by a processor to implement the touch event reporting method according to the above aspect.
In yet another aspect, embodiments of the present application provide a computer program product comprising computer instructions stored in a computer-readable storage medium. A processor of the terminal reads the computer instructions from the computer-readable storage medium and executes them, causing the terminal to perform the touch event reporting method.
The technical scheme provided by the embodiment of the application can bring the following beneficial effects:
by determining whether the application program is in a target scene, setting the touch precision to a high touch precision when it is, and reporting touch events according to that high touch precision, the application adjusts the touch precision per application program, improving the diversity of touch recognition.
Drawings
Fig. 1 is a flowchart of a touch event reporting method according to an embodiment of the present application;
fig. 2 is a flowchart of a touch event reporting method according to another embodiment of the present application;
fig. 3 is a flowchart of a touch event reporting method according to another embodiment of the present application;
fig. 4 is a block diagram of a touch event reporting apparatus according to an embodiment of the present application;
fig. 5 is a block diagram of a terminal according to an embodiment of the present application.
Detailed Description
To make the objects, technical solutions and advantages of the present application more clear, embodiments of the present application will be described in further detail below with reference to the accompanying drawings.
Referring to fig. 1, a flowchart of a touch event reporting method according to an embodiment of the present application is shown, where the method may be applied to a terminal, and the method may include the following steps.
Step 101, in response to that the application program in the foreground running state is in a target scene, the upper layer processing module sends a first instruction to the touch screen module.
An application program in the foreground running state is one whose user interface is displayed on the touch screen of the terminal. The application may be any application, such as a social application, a game application, a video application, a music application, or the like.
The target scene refers to a scene in which the application needs to use high touch precision. Illustratively, an application in the foreground running state is in the target scene when the application is in the application white list, and the application white list is used for indicating at least one application in the target scene. For example, the target scene may also be referred to as a target state (a state in which the application needs to use high touch precision) or other terms, which is not limited in this embodiment of the present application. Illustratively, when the application in the foreground running state is a preset type of application, the application is in a target scene, for example, when the application in the foreground running state is a game-type application, the application is in the target scene.
The upper layer processing module is a module having an upper layer service function in the terminal. In the embodiment of the present application, the upper layer processing module refers to a module for determining whether an application is in a target scene. For example, the upper layer processing module may also be referred to as an upper layer system service, a terminal system service, and the like, which is not limited in this embodiment. In the embodiment of the present application, the terminal refers to an electronic device with a touch screen, and for example, the terminal may include an electronic device such as a mobile phone, a tablet Computer, a PC (Personal Computer), and a smart wearable device, which is not limited in the embodiment of the present application. Illustratively, the upper layer processing module is a software module. The upper layer processing module is mainly used for processing an operating system, a user interface, an application program and the like.
The touch screen module refers to a module related to a touch screen in the terminal. The upper layer processing module and the touch screen module can be communicated with each other. Illustratively, the upper layer processing module and the touch screen module may communicate with each other through an I2C (Inter-Integrated Circuit) Interface or an SPI (Serial Peripheral Interface) Interface.
The first instruction is an instruction for setting the touch precision of the touch screen module to the first touch precision. Touch precision refers to the ability of the touch screen to sense the smallest change in the user's touch on it; the smaller the touch precision value, the finer the change the touch screen can sense. For example, the touch precision may also be referred to as a touch resolution, a touch coordinate resolution, a touch reporting range, and the like, which is not limited in this embodiment of the application.
And 102, the touch screen module sets the touch precision of the touch screen module as a first touch precision based on the first instruction.
Exemplarily, the first touch precision is higher than the current original touch precision of the terminal, and the higher the touch precision is, the higher the precision recognition capability is, the better the touch experience is.
The touch screen module sets the touch precision of the touch screen module from the original touch precision to a first touch precision based on the first instruction.
Exemplarily, the first touch precision may be 0.25, 0.5, 0.2, and the like. A first touch precision of 0.25 means that one coordinate scale in the original coordinate system of the touch screen is split into 4 coordinate scales; a precision of 0.5 means one coordinate scale is split into 2 coordinate scales; a precision of 0.2 means one coordinate scale is split into 5 coordinate scales.
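The relationship between a precision value and the number of sub-scales can be sketched as follows (a minimal illustration; the function name is not from the patent):

```python
def subdivisions(precision: float) -> int:
    """Number of sub-scales one original coordinate scale is split into.

    A precision of 0.25 splits one scale into 4, 0.5 into 2, 0.2 into 5.
    """
    count = round(1 / precision)
    # The precision value should evenly subdivide one scale.
    assert abs(count * precision - 1.0) < 1e-9
    return count
```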
And 103, reporting the first touch event to an upper layer processing module by the touch screen module according to the first touch precision.
The first touch event may be any one touch event, and the first touch event is used to indicate a touch behavior, which is acquired by the touch screen module and is triggered by a user for a user interface of the application program. Illustratively, the first touch event refers to a touch point when the user touches the touch screen, that is, a coordinate range when the user touches the touch screen.
According to the embodiment of the application, the touch precision matched with the application is determined based on the application, and the touch experience can be improved for some scenes needing high-precision touch.
In summary, in the technical scheme provided in the embodiment of the present application, by determining whether the application program is in the target scene, setting the touch precision to a high touch precision when it is, and reporting the touch event according to that high touch precision, the application adjusts the touch precision for the application program, improving the diversity of touch recognition.
Please refer to fig. 2, which shows a flowchart of a touch event reporting method according to another embodiment of the present application, where the method can be applied to a terminal, and the method can include the following steps.
In step 201, an upper layer processing module obtains an application white list.
The application white list indicates a set of applications in the target scene, the set comprising at least one application. The application white list may be set by the user: for example, the user selects in a settings interface an application program that should be in the target scene, and the terminal receives an add instruction for that application program and adds it to the application white list.
The application white list can be stored in the memory, and the upper layer processing module accesses the memory to acquire the application white list.
Step 202, in response to the application program being in the application white list, the upper layer processing module sends a first instruction to the touch screen driving unit.
When the application program is in the application white list, the application program is in a target scene, and the upper layer processing module sends a first instruction to the touch screen module.
In a possible implementation manner, after the upper layer processing module obtains the application white list, the following steps need to be further performed:
the first and upper processing modules obtain the identification information of the application program.
And secondly, determining whether the application white list comprises the identification information of the application program or not by the upper layer processing module.
Third, in response to the application white list including identification information for the application, the upper processing module determines that the application is in the application white list.
Fourth, in response to the identification information of the application program not being included in the application white list, the upper processing module determines that the application program is not in the application white list.
The application white list may include identification information of at least one application program. The identification information uniquely identifies an application program and may be its name or its package name (the name of its installation package). In a possible implementation, a technician may assign each application program a distinct serial number as its identification information; in that case, the application white list may include at least one serial number.
Taking the name of the application program as its identification information for illustration: the upper layer processing module obtains the name of the application program in the foreground running state and compares it with the identifications of the application programs included in the application white list, to determine whether the list includes that name. In response to the application white list including the name, the upper layer processing module determines that the application program is in the application white list and therefore in the target scene; in response to the application white list not including the name, it determines that the application program is neither in the white list nor in the target scene.
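The whitelist check described above reduces to a membership test; the identifiers and package names below are hypothetical, used only for illustration:

```python
def is_in_whitelist(app_id: str, whitelist: set) -> bool:
    """Return True when the foreground application's identification
    information appears in the application white list."""
    return app_id in whitelist

# Hypothetical package names, not from the patent.
APP_WHITELIST = {"com.example.game", "com.example.shooter"}
```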
In an exemplary embodiment, the touch screen module includes a touch screen driving unit and a touch screen assembly unit. Illustratively, the touch screen driving unit is a software unit and the touch screen assembly unit is a hardware unit. The touch screen assembly unit may have its own CPU (Central Processing Unit) and memory, and may perform data calculation (including but not limited to converting touch information into touch coordinates based on the touch screen). The touch screen driving unit receives instructions from the upper layer processing module and communicates with the touch screen assembly unit.
Correspondingly, the touch screen driving unit receives a first instruction from the upper layer processing module.
And step 203, the touch screen driving unit sends the first instruction to the touch screen assembly unit.
The touch screen driving unit forwards the first instruction from the upper layer processing module to the touch screen component unit.
In step 204, the touch screen assembly unit sets the touch precision to a first touch precision.
In a possible implementation, this step comprises several sub-steps as follows:
in step 204a, the touch screen module unit accesses the register, and determines a first touch precision corresponding to the first instruction from the corresponding relationship between the at least one instruction stored in the register and the touch precision.
The touch screen module comprises a register which is a memory of the touch screen component, the register stores the corresponding relation between at least one instruction and the touch precision, and the touch screen can determine the touch precision corresponding to the instruction from the upper processing module according to the corresponding relation.
In step 204b, the touch screen assembly unit sets the touch precision to the first touch precision.
The touch screen component unit sets the touch precision from the current original touch precision of the terminal to a first touch precision.
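Steps 204a and 204b amount to a table lookup followed by an assignment; the instruction codes and precision values below are assumptions for illustration, not values from the patent:

```python
# Hypothetical register contents: instruction code -> touch precision.
REGISTER_TABLE = {
    0x01: 0.25,  # first instruction: first (high) touch precision
    0x02: 1.0,   # second instruction: original touch precision
}

def set_precision(instruction: int, current: float) -> float:
    """Return the precision the assembly unit switches to for an
    instruction, keeping the current precision for unknown codes."""
    return REGISTER_TABLE.get(instruction, current)
```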
In step 205, the touch screen module unit reports the first touch event to the upper layer processing module according to the first touch precision.
In an exemplary embodiment, the first touch event includes: the user touches the touch coordinates of the touch screen in the touch screen module. For example, taking the first touch precision of 0.25 as an example for description, the touch screen component unit may report that the touch coordinate of the touch screen in the touch screen module touched by the user is 4. It should be noted that the touch coordinates reported by the touch screen component unit are integers.
In a possible implementation, the upper layer processing module receives the touch coordinates of the touch screen from the touch screen assembly unit and must convert them into actual processing data, using both the touch coordinates and the touch precision. For example, if the reported touch coordinate is 1 and the touch precision is 0.25, the upper layer processing module determines 1 × 0.25 = 0.25 as the actual processing data; if the reported touch coordinate is 2 and the touch precision is 0.25, it determines 2 × 0.25 = 0.5 as the actual processing data. After obtaining the actual processing data, the upper layer processing module responds to the user's touch operation based on it.
For example, the touch coordinate system and the display resolution have a proportional mapping relationship: with a display resolution of 1080 × 2400, in the touch coordinate system X ranges from 0 to 1080 × (1/touch precision) and Y ranges from 0 to 2400 × (1/touch precision).
Illustratively, the touch coordinate system also has a proportional mapping to the actual physical size of the touch screen. For example, if the actual length of the touch screen is 12 cm and the corresponding X ranges from 0 to 1080 × 5 (a touch precision of 0.2), the minimum recognizable step of an operation is about 0.0222 mm per coordinate unit.
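The conversions above — reported integer coordinate × precision = actual processing data, and display extent × (1/precision) = coordinate-axis range — can be sketched as follows (function names are illustrative):

```python
def to_actual(reported: int, precision: float) -> float:
    """Convert an integer coordinate reported by the touch screen
    assembly unit into the upper layer module's actual processing data."""
    return reported * precision

def axis_range(display_extent: int, precision: float) -> int:
    """Upper bound of one axis of the touch coordinate system, given the
    display extent on that axis and the touch precision."""
    return round(display_extent * (1 / precision))
```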
In summary, in the technical scheme provided in the embodiment of the present application, by obtaining the identification information of the application program, when the identification information of the application program is in the application white list, the touch precision is adjusted, and the operation is simple.
Please refer to fig. 3, which shows a flowchart of a touch event reporting method according to another embodiment of the present application, where the method can be applied to a terminal, and the method can include the following steps.
Step 301, in response to that the application program in the foreground running state is in the target scene, the upper layer processing module sends a first instruction to the touch screen module.
In a possible implementation, step 301 comprises the following sub-steps:
in step 301a, the upper layer processing module determines an instruction corresponding to the application type of the application program based on the application type of the application program.
Illustratively, the instructions may differ for different application types, and the upper layer processing module may determine the instruction corresponding to the application type of the foreground application based on a stored correspondence. For example, the touch precision corresponding to the following application types may decrease in order: game applications, shooting applications, social applications, life applications, and so on (this is only an example and may be set according to actual needs; it is not limited in the embodiments of the present application).
Because the instruction corresponds to the touch precision, the touch precision corresponding to different instructions is different, and the instruction is determined based on the application type, so that the touch precision is more diversified, and the touch identification requirement of the application program can be better met.
In step 301b, the upper layer processing module determines an instruction corresponding to the application type of the application program as a first instruction.
And step 301c, the upper layer processing module sends a first instruction to the touch screen module.
In possible implementations, the instructions corresponding to the same application type may also be different. The instruction corresponding to the application program may be set by a user, for example, the user may set the instruction corresponding to each application program in a setting interface, or the user may set the instruction corresponding to each application type in the setting interface.
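Steps 301a to 301c amount to choosing an instruction by application type; the mapping below is an assumption for illustration only, following the example ordering above:

```python
# Hypothetical mapping from application type to instruction code, ordered
# so that game applications receive the finest touch precision.
TYPE_TO_INSTRUCTION = {
    "game": 0x01,
    "shooting": 0x02,
    "social": 0x03,
    "life": 0x04,
}

def first_instruction(app_type: str, default: int = 0x00) -> int:
    """Determine the first instruction for the foreground app's type,
    falling back to a default code for unlisted types."""
    return TYPE_TO_INSTRUCTION.get(app_type, default)
```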
Step 302, the touch screen module sets the touch precision of the touch screen module to a first touch precision based on the first instruction.
And step 303, reporting the first touch event to an upper layer processing module by the touch screen module according to the first touch precision.
For the description of step 302 and step 303, reference may be made to the above embodiments, which are not described herein again.
And step 304, in response to that the application program is not in the target scene, the upper layer processing module sends a second instruction to the touch screen module.
In this embodiment of the application, the second instruction is an instruction for setting the touch precision of the touch screen module to the second touch precision.
When the application program is not in the target scene, it may not need high touch precision. The second touch precision is lower than the first touch precision; it may equal the terminal's current original touch precision, or be higher than the original touch precision but lower than the first. The original touch precision may be the same as the display resolution: for example, with a display resolution of 1080 × 2400, the original coordinate range of the touch screen may also be 1080 × 2400, i.e., the original touch precision is 1. Illustratively, with the second touch precision being the original touch precision, in response to the application program not being in the target scene, the upper layer processing module sends the touch screen module a second instruction, which sets the touch precision of the touch screen module to the original touch precision.
It should be noted that the terminal only performs one of step 301 and step 304 at the same time, that is, the terminal does not perform step 304 to step 306 when performing step 301, and the terminal does not perform step 301 to step 303 when performing step 304.
In step 305, the touch screen module sets the touch precision of the touch screen module to a second touch precision based on the second instruction.
And step 306, reporting the second touch event to the upper layer processing module by the touch screen module according to the second touch precision.
The second touch event may be any one of the touch events, and the second touch event may be a touch coordinate where the user touches the touch screen in the touch screen module.
For example, the second touch precision is 1, and the touch screen module reports that the touch coordinate of the touch screen touched by the user is 4, the upper processing module determines that 4 is actual processing data, and responds to the touch operation of the user based on the actual processing data.
When the application program is not in the target scene, setting a lower touch precision prevents the side effect of jitter caused by excessive precision and noise, further improving the user experience.
Step 307, in response to the application program being switched from the foreground operating state to the background operating state or the application program being switched from the foreground operating state to the closed state, the upper layer processing module sends a third instruction to the touch screen module.
In this embodiment of the application, the third instruction is an instruction for setting the touch precision of the touch screen module to the third touch precision.
Illustratively, the accuracy of the third touch accuracy is lower than the accuracy of the first touch accuracy. The third touch accuracy may be the same as the second touch accuracy. The third touch precision may be the same as the original touch precision, or may be higher than the original touch precision but lower than the first touch precision.
Exemplarily, assuming that the third touch precision is the original touch precision, in response to the application being switched from the foreground operating state to the background operating state or the application being switched from the foreground operating state to the off state, the upper layer processing module sends a third instruction to the touch screen module, where the third instruction is an instruction for setting the touch precision of the touch screen module to the original touch precision.
When the application program is switched from the foreground running state to the background running state or the application program is switched from the foreground running state to the closed state, the terminal may display user interfaces of other application programs and may also display a system desktop, which is not limited in this embodiment of the present application.
It should be noted that step 307 should be executed after step 303.
And 308, the touch screen module sets the touch precision of the touch screen module to be a third touch precision based on the third instruction.
The touch screen module switches the touch accuracy of the touch screen module from the first touch accuracy to a third touch accuracy based on the third instruction.
And 309, reporting a third touch event to the upper-layer processing module by the touch screen module according to the third touch precision.
Taking the third touch precision as 1 as an example for description, if the touch screen module reports that the touch coordinate of the touch screen touched by the user is 4, the upper processing module determines that 4 is actual processing data, and responds to the touch operation of the user based on the actual processing data.
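The overall control flow of steps 301 to 309 can be modeled as a small state machine; the precision values and method names here are illustrative assumptions, not from the patent:

```python
class TouchPrecisionController:
    """Toy model of the flow in Fig. 3: first (high) precision while a
    target-scene app runs in the foreground, original precision otherwise,
    including when the app goes to the background or is closed."""
    ORIGINAL = 1.0   # assumed original touch precision
    FIRST = 0.25     # assumed first (high) touch precision

    def __init__(self):
        self.precision = self.ORIGINAL

    def on_state(self, foreground: bool, in_target_scene: bool) -> None:
        # Steps 301-303: target scene in foreground -> first precision.
        # Steps 304-306 / 307-309: otherwise -> original precision.
        self.precision = (
            self.FIRST if foreground and in_target_scene else self.ORIGINAL
        )
```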
The following are embodiments of the apparatus of the present application that may be used to perform embodiments of the method of the present application. For details which are not disclosed in the embodiments of the apparatus of the present application, reference is made to the embodiments of the method of the present application.
Please refer to fig. 4, which is a block diagram illustrating a touch event reporting apparatus according to an embodiment of the present application, where the apparatus has a function of implementing the above method example, and the function may be implemented by hardware or by hardware executing corresponding software. The apparatus 400 may include:
an upper layer processing module 410, configured to send a first instruction to the touch screen module in response to the application program in the foreground running state being in a target scene;
a touch screen module 420, configured to set the touch precision of the touch screen module to a first touch precision based on the first instruction, and to report a first touch event to the upper layer processing module according to the first touch precision.
In summary, in the technical solution provided by the embodiments of the present application, it is determined whether the application program is in a target scene; when it is, the touch precision is set to a high touch precision and touch events are reported according to that high touch precision. The touch precision is thereby adjusted per application, which improves the diversity of touch recognition.
In an exemplary embodiment, the upper layer processing module 410 is configured to:
acquiring an application white list;
and in response to the application program being within the application white list, sending the first instruction to the touch screen module.
In an exemplary embodiment, the upper layer processing module 410 is further configured to:
acquiring identification information of the application program;
determining whether the application white list includes the identification information of the application program;
and in response to the application white list including the identification information of the application program, determining that the application program is within the application white list.
In an exemplary embodiment, the upper layer processing module 410 is further configured to:
in response to the application white list not including the identification information of the application program, determine that the application program is not within the application white list.
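The white-list decision described above can be sketched as a simple membership test. This is an illustrative sketch: the patent does not specify what the identification information looks like, so the package-name strings and the names `APP_WHITE_LIST`, `is_in_white_list`, and `decide_instruction` are assumptions.

```python
# Hypothetical sketch of the white-list check: an application whose
# identification information is in the white list is treated as being in the
# target scene, so the first instruction (high touch precision) is sent;
# otherwise the second instruction is sent. All names are illustrative.

APP_WHITE_LIST = {"com.example.game", "com.example.draw"}  # assumed identifiers

def is_in_white_list(identification: str, white_list: set) -> bool:
    """Return True if the application's identification info is in the white list."""
    return identification in white_list

def decide_instruction(identification: str) -> str:
    if is_in_white_list(identification, APP_WHITE_LIST):
        return "FIRST_INSTRUCTION"
    return "SECOND_INSTRUCTION"
```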
In an exemplary embodiment, in response to the application program not being in the target scene, the upper layer processing module 410 is configured to send a second instruction to the touch screen module;
the touch screen module 420 is configured to set the touch precision of the touch screen module to a second touch precision based on the second instruction, and to report a second touch event to the upper layer processing module according to the second touch precision.
In an exemplary embodiment, in response to the application program being switched from the foreground running state to the background running state, or from the foreground running state to the closed state, the upper layer processing module 410 is further configured to send a third instruction to the touch screen module;
the touch screen module 420 is further configured to set the touch precision of the touch screen module to a third touch precision based on the third instruction, and to report a third touch event to the upper layer processing module according to the third touch precision.
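The three cases above can be tied together in one decision function. This is a hypothetical sketch; the state labels and instruction names are illustrative assumptions, not taken from the patent text.

```python
# Hypothetical sketch: which instruction the upper layer processing module
# sends to the touch screen module as the application's state changes.

def select_instruction(run_state: str, in_target_scene: bool) -> str:
    """Pick the instruction for the touch screen module.

    run_state: "foreground", "background", or "closed" (assumed labels).
    """
    if run_state == "foreground":
        # First instruction -> first (high) touch precision in target scenes;
        # second instruction -> second touch precision otherwise.
        return "FIRST_INSTRUCTION" if in_target_scene else "SECOND_INSTRUCTION"
    # Leaving the foreground (switched to background or closed) triggers the
    # third instruction, which sets the third touch precision.
    return "THIRD_INSTRUCTION"
```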
In an exemplary embodiment, the upper layer processing module 410 is configured to:
determining an instruction corresponding to the application type of the application program based on the application type of the application program;
determining an instruction corresponding to an application type of the application program as the first instruction;
and sending the first instruction to the touch screen module.
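The steps above, choosing the first instruction according to the application type, can be sketched as a lookup table. The type names and instruction codes below are illustrative assumptions; the patent only states that each application type corresponds to an instruction.

```python
# Hypothetical sketch: the upper layer processing module determines the
# instruction corresponding to the application's type, then sends it as the
# first instruction. Table contents are assumed for illustration.

INSTRUCTION_BY_APP_TYPE = {
    "game": 0x01,
    "handwriting": 0x02,
    "default": 0x00,
}

def first_instruction_for(app_type: str) -> int:
    """Return the instruction corresponding to the application type."""
    return INSTRUCTION_BY_APP_TYPE.get(app_type, INSTRUCTION_BY_APP_TYPE["default"])
```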
In an exemplary embodiment, the touch screen module includes a touch screen driving unit and a touch screen assembly unit (not shown in the figures);
the touch screen driving unit is configured to receive the first instruction from the upper layer processing module and send the first instruction to the touch screen assembly unit;
the touch screen assembly unit is configured to set the touch precision to the first touch precision.
In an exemplary embodiment, the touch screen assembly unit is configured to:
accessing a register, and determining the first touch precision corresponding to the first instruction from a correspondence, stored in the register, between at least one instruction and touch precision;
and setting the touch precision as the first touch precision.
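The register lookup above can be sketched as follows. This is a minimal sketch under stated assumptions: the register contents (which instruction maps to which precision) and the class name are illustrative, not from the patent.

```python
# Hypothetical sketch of the touch screen assembly unit's register lookup:
# the unit maps a received instruction to a touch precision via a stored
# correspondence, then applies that precision. Values are assumed.

class TouchScreenAssemblyUnit:
    def __init__(self):
        # Correspondence between at least one instruction and a touch
        # precision, as if read from a hardware register.
        self._register = {0x01: 10, 0x02: 5, 0x00: 1}
        self.touch_precision = 1  # assumed default precision

    def apply_instruction(self, instruction: int) -> int:
        """Look up the precision for the instruction and set it."""
        self.touch_precision = self._register[instruction]
        return self.touch_precision
```

Used this way, sending the assumed first instruction `0x01` would switch the unit from the default precision to the high precision 10.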
In an exemplary embodiment, the first touch event includes: a touch coordinate at which the user touches the touch screen in the touch screen module.
It should be noted that when the apparatus provided in the foregoing embodiments implements its functions, the division into the above functional modules is merely illustrative; in practical applications, the functions may be assigned to different functional modules as needed, that is, the internal structure of the apparatus may be divided into different functional modules to implement all or part of the functions described above. In addition, the apparatus embodiments and the method embodiments provided above belong to the same concept; for their specific implementation processes, refer to the method embodiments, which are not repeated here.
Referring to fig. 5, a block diagram of a terminal according to an embodiment of the present application is shown.
The terminal in the embodiment of the present application may include one or more of the following components: a processor 510 and a memory 520.
Processor 510 may include one or more processing cores. The processor 510 connects various parts of the terminal using various interfaces and lines, and performs the various functions of the terminal and processes data by running or executing the instructions, programs, code sets, or instruction sets stored in the memory 520 and by calling the data stored in the memory 520. Optionally, the processor 510 may be implemented in hardware using at least one of Digital Signal Processing (DSP), Field-Programmable Gate Array (FPGA), and Programmable Logic Array (PLA). The processor 510 may integrate one or a combination of a Central Processing Unit (CPU) and a modem, where the CPU mainly handles the operating system, application programs, and the like, and the modem handles wireless communication. It is understood that the modem may alternatively not be integrated into the processor 510 and may instead be implemented by a separate chip.
Optionally, the processor 510, when executing the program instructions in the memory 520, implements the methods provided by the various method embodiments described above.
The Memory 520 may include a Random Access Memory (RAM) or a Read-Only Memory (ROM). Optionally, the memory 520 includes a non-transitory computer-readable medium. The memory 520 may be used to store instructions, programs, code sets, or instruction sets. The memory 520 may include a program storage area and a data storage area, wherein the program storage area may store instructions for implementing an operating system, instructions for at least one function, instructions for implementing the various method embodiments described above, and the like; the storage data area may store data created according to the use of the terminal, and the like.
The structure of the terminal described above is only illustrative; in actual implementation, the terminal may include more or fewer components, such as a display screen (touch panel) and the like, which is not limited in this embodiment.
Those skilled in the art will appreciate that the configuration shown in fig. 5 is not intended to be limiting and may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components may be used.
In an exemplary embodiment, a computer-readable storage medium is further provided, where a computer program is stored in the computer-readable storage medium, and the computer program is loaded and executed by a processor of a computer device to implement each step in the above-mentioned touch event reporting method embodiment.
In an exemplary embodiment, a computer program product is provided that includes computer instructions stored in a computer-readable storage medium. A processor of the terminal reads the computer instructions from the computer-readable storage medium and executes them, so that the terminal performs the touch event reporting method.
The above description is only exemplary of the present application and should not be taken as limiting the present application, and any modifications, equivalents, improvements and the like that are made within the spirit and principle of the present application should be included in the protection scope of the present application.

Claims (13)

1. A touch event reporting method is characterized by comprising the following steps:
in response to an application program in a foreground running state being in a target scene, sending, by an upper layer processing module, a first instruction to a touch screen module;
the touch screen module sets the touch precision of the touch screen module to be first touch precision based on the first instruction; and reporting a first touch event to the upper layer processing module according to the first touch precision.
2. The method of claim 1, wherein the sending, by the upper layer processing module, a first instruction to the touch screen module in response to the application program in the foreground running state being in the target scene comprises:
an upper layer processing module acquires an application white list;
and in response to the application program being within the application white list, sending, by the upper layer processing module, the first instruction to the touch screen module.
3. The method of claim 2, wherein after the upper layer processing module obtains the application white list, the method further comprises:
the upper layer processing module acquires identification information of the application program;
the upper layer processing module determines whether the application white list comprises identification information of the application program;
in response to the application white list including identification information for the application, the upper layer processing module determines that the application is within the application white list.
4. The method of claim 3, wherein after the upper layer processing module determines whether the identification information of the application program is included in the application white list, the method further comprises:
in response to the application white list not including the identification information of the application program, determining, by the upper layer processing module, that the application program is not within the application white list.
5. The method of claim 1, further comprising:
in response to the application program not being in the target scene, the upper layer processing module sends a second instruction to the touch screen module;
the touch screen module sets the touch precision of the touch screen module to be a second touch precision based on the second instruction; and reporting a second touch event to the upper layer processing module according to the second touch precision.
6. The method of claim 1, further comprising:
in response to the application program being switched from the foreground running state to the background running state, or the application program being switched from the foreground running state to the closed state, the upper layer processing module sends a third instruction to the touch screen module;
the touch screen module sets the touch precision of the touch screen module to be third touch precision based on the third instruction; and reporting a third touch event to the upper layer processing module according to the third touch precision.
7. The method according to any one of claims 1 to 6, wherein the sending, by the upper layer processing module, a first instruction to the touch screen module comprises:
the upper layer processing module determines an instruction corresponding to the application type of the application program based on the application type of the application program;
the upper layer processing module determines an instruction corresponding to the application type of the application program as the first instruction;
and the upper layer processing module sends the first instruction to the touch screen module.
8. The method of any one of claims 1 to 6, wherein the touch screen module comprises a touch screen driving unit and a touch screen assembly unit;
the touch screen module sets the touch precision of the touch screen module to the first touch precision based on the first instruction, including:
the touch screen driving unit receives a first instruction from the upper layer processing module; sending the first instruction to the touch screen assembly unit;
the touch screen assembly unit sets the touch precision to the first touch precision.
9. The method of claim 8, wherein the setting, by the touch screen assembly unit, the touch precision to the first touch precision comprises:
the touch screen assembly unit accesses a register and determines the first touch precision corresponding to the first instruction from a correspondence, stored in the register, between at least one instruction and touch precision;
the touch screen assembly unit sets the touch precision to the first touch precision.
10. The method of any of claims 1 to 6, wherein the first touch event comprises: a touch coordinate at which the user touches the touch screen in the touch screen module.
11. A touch event reporting device, comprising:
an upper layer processing module, configured to send a first instruction to a touch screen module in response to an application program in a foreground running state being in a target scene;
the touch screen module is used for setting the touch precision of the touch screen module to be first touch precision based on the first instruction; and reporting a first touch event to the upper layer processing module according to the first touch precision.
12. A terminal, comprising a processor and a memory, wherein the memory stores a computer program, and the computer program is loaded by the processor and executed to implement the touch event reporting method according to any one of claims 1 to 10.
13. A computer-readable storage medium, wherein a computer program is stored in the computer-readable storage medium, and the computer program is loaded and executed by a processor to implement the touch event reporting method according to any one of claims 1 to 10.
CN202110292806.2A 2021-03-18 2021-03-18 Touch event reporting method and device, terminal and storage medium Pending CN113031814A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110292806.2A CN113031814A (en) 2021-03-18 2021-03-18 Touch event reporting method and device, terminal and storage medium

Publications (1)

Publication Number Publication Date
CN113031814A true CN113031814A (en) 2021-06-25

Family

ID=76471485

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110292806.2A Pending CN113031814A (en) 2021-03-18 2021-03-18 Touch event reporting method and device, terminal and storage medium

Country Status (1)

Country Link
CN (1) CN113031814A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022193988A1 (en) * 2021-03-18 2022-09-22 Oppo广东移动通信有限公司 Touch event reporting method and apparatus, terminal, and storage medium

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107508994A (en) * 2017-09-21 2017-12-22 努比亚技术有限公司 Touch-screen report point rate processing method, terminal and computer-readable recording medium
CN108646938A (en) * 2018-03-13 2018-10-12 广东欧珀移动通信有限公司 Configuration method, device, terminal and the storage medium of touch screen
WO2020000406A1 (en) * 2018-06-29 2020-01-02 深圳市汇顶科技股份有限公司 Touch screen adjustment method, touch chip and electronic terminal
CN111124173A (en) * 2019-11-22 2020-05-08 Oppo(重庆)智能科技有限公司 Working state switching method and device of touch screen, mobile terminal and storage medium
CN112445358A (en) * 2019-08-29 2021-03-05 Oppo(重庆)智能科技有限公司 Adjusting method, terminal and computer storage medium

Similar Documents

Publication Publication Date Title
EP3811191B1 (en) Electronic device for displaying list of executable applications on split screen and operating method thereof
US10761642B2 (en) Method, mobile terminal and non-transitory computer-readable storage medium for adjusting scanning frequency of touch screen
CN108984064B (en) Split screen display method and device, storage medium and electronic equipment
CN105183284A (en) Short message viewing method and user terminal
AU2018273505B2 (en) Method for capturing fingerprint and associated products
CN109062464B (en) Touch operation method and device, storage medium and electronic equipment
CN107357458B (en) Touch key response method and device, storage medium and mobile terminal
EP3627314A2 (en) Method for game running and related products
CN107390923B (en) Screen false touch prevention method and device, storage medium and terminal
CN108512997B (en) Display method, display device, mobile terminal and storage medium
US11487377B2 (en) Electronic device acquiring user input when in submerged state by using pressure sensor, and method for controlling electronic device
CN105224216A (en) A kind of user terminal control method and user terminal
US11169697B2 (en) Electronic device and method for displaying contextual information of application
CN113031814A (en) Touch event reporting method and device, terminal and storage medium
CN113778255B (en) Touch recognition method and device
CN111343321B (en) Backlight brightness adjusting method and related product
CN113031812A (en) Touch event reporting method and device, terminal and storage medium
CN110275639B (en) Touch data processing method and device, terminal and storage medium
EP2639686A2 (en) Input control device, input control program, and input control method
CN113312122A (en) Virtual keyboard calling method and device, computer storage medium and electronic equipment
KR20180088859A (en) A method for changing graphics processing resolution according to a scenario,
US11874986B2 (en) Electronic device including touch circuit and operating method therefor
US9274703B2 (en) Method for inputting instruction and portable electronic device and computer readable recording medium
WO2021102677A1 (en) Touch response method, touch screen system, terminal, storage medium, and chip
CN114270297A (en) Touch screen point reporting method and device, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination