CN112445410B - Touch event identification method and device and computer readable storage medium


Info

Publication number
CN112445410B
CN112445410B (application CN202011439375.XA)
Authority
CN
China
Prior art keywords: touch, event, touch event, determining, area
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202011439375.XA
Other languages
Chinese (zh)
Other versions
CN112445410A (en)
Inventor
常群
张勇
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Xiaomi Mobile Software Co Ltd
Original Assignee
Beijing Xiaomi Mobile Software Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Xiaomi Mobile Software Co Ltd
Priority to CN202011439375.XA
Publication of CN112445410A
Application granted
Publication of CN112445410B
Legal status: Active
Anticipated expiration

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. tap gestures based on pressure sensed by a digitiser, using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04886 Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus

Abstract

The disclosure relates to a touch event identification method, a touch event identification device, and a computer-readable storage medium. The method includes the following steps: in response to a touch event on a screen, determining that the touch event is a tapping event; determining a target touch area where the touch position of the touch event is located, and acquiring a current acceleration value detected in the target touch area for the touch event; and in response to the confidence corresponding to the current acceleration value being greater than the preset confidence threshold corresponding to the target touch area, determining that the touch event is a finger joint tapping event, thereby improving the accuracy of finger joint tapping event identification.

Description

Touch event identification method and device and computer readable storage medium
Technical Field
The present disclosure relates to the field of electronic information technologies, and in particular, to a method and an apparatus for identifying a touch event, and a computer-readable storage medium.
Background
In the related art, touch events are classified into various types, for example, a touch event in which the normal finger pad acts on the screen and a touch event in which a finger joint taps the screen. Within the mobile terminal, different response operations are set for the different types of touch events on the screen.
Currently, when a finger lightly touches the screen, especially when the thumb lightly touches the touch screen, the touch may be recognized by the terminal as a finger joint tapping event; if the type of the touch event is recognized incorrectly, the mobile terminal may respond with an event inconsistent with the user's intention.
Therefore, the related art suffers from low accuracy in identifying finger joint screen-tapping events.
Disclosure of Invention
To overcome the problems in the related art, the present disclosure provides a touch event recognition method, apparatus, and computer-readable storage medium.
According to a first aspect of the embodiments of the present disclosure, a method for identifying a touch event is provided, the method including:
in response to a touch event on a screen, determining that the touch event is a tapping event;
determining a target touch area where the touch position of the touch event is located, and,
acquiring a current acceleration value detected in the target touch area for the touch event;
and in response to the confidence corresponding to the current acceleration value being greater than a preset confidence threshold corresponding to the target touch area, determining that the touch event is a finger joint tapping event.
Optionally, the determining, in response to a touch event to the screen, that the touch event is a tap event includes:
in response to a touch event on a screen, acquiring detection data of the touch event;
inputting the detection data into a first model to obtain a classification probability;
if the classification probability is within a preset interval, inputting the detection data into a second model to obtain classification information of the touch event output by the second model;
and if the classification information is preset classification information, determining that the touch event is a tapping event.
Optionally, the method further comprises:
if the classification probability is smaller than the lower probability limit of the preset interval, determining that the touch event is a tapping event;
and/or,
and if the classification probability is greater than the probability upper limit of the preset interval, determining that the touch event is a non-tapping event.
Optionally, the second model is trained using a cross-entropy cost function as the loss function, and the first model is trained using a modified cross-entropy cost function, the Focal Loss function, as the loss function.
Optionally, the determining a target touch area where the touch position of the touch event is located includes:
determining coordinates of a touch position of the touch event;
and searching a touch area matched with the coordinates in a plurality of preset touch areas according to the coordinates, and taking the touch area as a target touch area where the touch position of the touch event is located.
Optionally, the method further comprises:
and in response to the confidence corresponding to the current acceleration value being less than or equal to a preset confidence threshold corresponding to the target touch area, determining that the touch event is a non-finger-joint tapping event.
According to a second aspect of the embodiments of the present disclosure, there is provided a touch event recognition apparatus, the apparatus including:
a first determination module configured to determine, in response to a touch event to a screen, that the touch event is a tap event;
a second determination module configured to determine a target touch area at which a touch position of the touch event is located, and,
acquiring a current acceleration value detected in the target touch area for the touch event;
a third determining module configured to determine that the touch event is a finger joint tapping event in response to the confidence degree corresponding to the current acceleration value being greater than a preset confidence degree threshold corresponding to the target touch area.
Optionally, the first determining module includes:
the detection data acquisition sub-module is configured to respond to a touch event of a screen and acquire detection data of the touch event;
a first classification submodule configured to input the detection data into a first model, resulting in a classification probability;
the second classification submodule is configured to input the detection data into a second model to obtain classification information of the touch event output by the second model if the classification probability is within a preset interval;
and the first event determining submodule is configured to determine that the touch event is a tapping event if the classification information is preset classification information.
According to a third aspect of the embodiments of the present disclosure, there is provided a touch event recognition apparatus, the apparatus including:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to:
in response to a touch event on a screen, determining that the touch event is a tapping event;
determining a target touch area where the touch position of the touch event is located, and,
acquiring a current acceleration value detected in the target touch area for the touch event;
and in response to the confidence corresponding to the current acceleration value being greater than a preset confidence threshold corresponding to the target touch area, determining that the touch event is a finger joint tapping event.
According to a fourth aspect of the embodiments of the present disclosure, there is provided a computer-readable storage medium on which computer program instructions are stored, the program instructions, when executed by a processor, implementing the steps of the touch event identification method provided by the first aspect of the present disclosure.
The technical scheme provided by the embodiment of the disclosure can have the following beneficial effects:
in response to a touch event on a screen, it is determined that the touch event is a tapping event; a target touch area where the touch position of the touch event is located is determined, and a current acceleration value detected for the touch event in the target touch area is acquired; and in response to the confidence corresponding to the current acceleration value being greater than the preset confidence threshold corresponding to the target touch area, the touch event is determined to be a finger joint tapping event. On the basis of determining that the touch event is a tapping event, the confidence corresponding to the current acceleration value detected for the touch event is further taken into account to judge again whether the touch event is a finger joint tapping event, which improves the accuracy of touch event identification.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and together with the description, serve to explain the principles of the disclosure.
Fig. 1 is a flowchart illustrating a touch event recognition method according to an exemplary embodiment.
Fig. 2 is a schematic diagram illustrating a touch area division according to an exemplary embodiment.
Fig. 3 is another schematic diagram illustrating a touch area division according to an exemplary embodiment.
Fig. 4 is a flowchart illustrating S101 shown in fig. 1 according to an exemplary embodiment.
Fig. 5 is a block diagram illustrating a touch event recognition apparatus according to an exemplary embodiment.
Fig. 6 is a block diagram illustrating a touch event recognition apparatus according to an exemplary embodiment.
Detailed Description
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The implementations described in the exemplary embodiments below are not intended to represent all implementations consistent with the present disclosure. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the disclosure, as detailed in the appended claims.
Due to the diversification of terminal functions, a terminal can respond to specific touch operations, providing more function choices for the user, enhancing interactivity, and making the terminal convenient to use. For example, if the response the terminal sets for a finger joint tapping operation is to capture the current interface, then when the user taps the screen with a finger joint, the terminal responds to the tapping operation by capturing the current interface of the terminal. In actual use, however, an event in which the user lightly touches the terminal screen may be recognized by the terminal as a finger joint tapping event, so that a touch event applied to the screen by the normal finger pad is misjudged by the terminal as a finger joint tapping event, causing the terminal to respond with an event inconsistent with the user's intention. The related art therefore suffers from low accuracy in identifying finger joint touch events. In view of this, the present disclosure provides a touch event identification method and apparatus, and a computer-readable storage medium, to solve the problem of low accuracy in identifying finger joint touch events.
First, it should be noted that the touch event identification method of the present disclosure may be applied to an electronic device provided with an acceleration sensor and a capacitive touch screen (the screen in the present disclosure refers to this capacitive touch screen); the electronic device may be, for example, a mobile phone or a tablet computer. The embodiments of the present disclosure do not limit this.
The present disclosure is further described below with reference to the accompanying drawings.
Fig. 1 is a flowchart illustrating a touch event recognition method according to an exemplary embodiment. The method is applied to a terminal and, as shown in fig. 1, includes the following steps:
in step S101, in response to a touch event to the screen, it is determined that the touch event is a tap event.
In step S102, a target touch area where the touch position of the touch event is located is determined, and a current acceleration value detected in the target touch area for the touch event is obtained.
In step S103, in response to the confidence corresponding to the current acceleration value being greater than the preset confidence threshold corresponding to the target touch area, it is determined that the touch event is a finger joint tapping event.
It should be noted that the determination in step S101 that the touch event is a tapping event is made based on a model, and step S102 is executed only when the model determines that the touch event is a tapping event, so as to save terminal resources.
By adopting this technical scheme, on the basis of determining, based on the model, that the touch event is a tapping event, the current acceleration value detected for the touch event is further taken into account to judge again whether the touch event is a finger joint tapping event, thereby improving the accuracy of touch event identification.
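For illustration only, the following Python sketch ties the three steps together; the event structure, the helper callables, and the idea of injecting them are assumptions made for this example (the disclosure does not prescribe an implementation), and the helpers are fleshed out in the sketches further below.

```python
def identify_touch_event(event, is_tap_event, find_target_area,
                         to_confidence, preset_thresholds):
    """Hedged sketch of steps S101-S103. `event` is assumed to carry the raw
    detection data, the touch coordinates, and the detected acceleration."""
    if not is_tap_event(event["data"]):           # S101: model-based tap check
        return "non-tap event"
    area = find_target_area(*event["xy"])         # S102: locate target touch area
    confidence = to_confidence(event["accel"])    # S102: acceleration -> confidence
    if confidence > preset_thresholds[area]:      # S103: per-area threshold test
        return "finger joint tapping event"
    return "non-finger-joint tapping event"
```

Each dependency is passed in explicitly so the sketch stays agnostic about the model internals and the area layout, which the following paragraphs describe in turn.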
In order to make those skilled in the art understand the touch event identification method in the embodiments of the present disclosure, the following describes the above steps in detail.
For example, an acceleration sensor built into the terminal may be located opposite the underside of the terminal screen, to detect the force a user applies to the screen. It will be appreciated that the acceleration sensor reading varies significantly when the screen is tapped.
It should be noted that, when different positions on the screen are tapped with the same magnitude of force, the acceleration values detected by the acceleration sensor are not uniform. Therefore, different thresholds (i.e., confidence thresholds) need to be set for identifying touch events whose touch positions fall in different touch areas. For example, referring to fig. 2, assuming that the acceleration sensor is disposed near the top of the terminal, the display screen may be divided along the longitudinal direction of the terminal into three touch areas (area A1, area A2, and area A3 in fig. 2), and the actual acceleration values detected by the acceleration sensor for forces of the same magnitude applied in area A1, area A2, and area A3 differ.
In one embodiment, the preset confidence thresholds for the areas A1, A2 and A3 are 70, 65 and 63, respectively. For the area A1, the confidence corresponding to the acceleration value detected in the area needs to be greater than 70, so that it can be determined that the touch event in the area is a finger joint tapping event. For the area A2, the confidence corresponding to the detected acceleration value in the area needs to be greater than 65, so that it can be determined that the touch event in the area is a finger joint tapping event. For the area A3, the confidence corresponding to the acceleration value detected in the area needs to be greater than 63, so that the touch event in the area can be determined to be a finger joint tapping event.
It should be noted that the preset confidence threshold set for each touch area may be set according to an actual test situation, which is not limited in this embodiment.
It should be noted that the method for converting an acceleration value into a corresponding confidence may follow implementations in the related art, and is not described in detail in this embodiment.
In this embodiment, the touch areas of the terminal may also be divided according to the actual position of the acceleration sensor. When the acceleration sensor is actually located near the upper left of the terminal screen, the screen may be divided into four touch areas by taking the center of the terminal screen as the central point and extending in four directions, such as area B1, area B2, area B3, and area B4 in fig. 3. This embodiment may also divide the touch areas of the screen in other ways according to the actual placement of the acceleration sensor, which is not limited in this embodiment.
In this embodiment, the target touch area is the touch area to which the touch position of the touch event belongs. Continuing the above example, where the touch areas divided by the terminal include area A1 (the upper portion of the terminal display screen), area A2 (the middle portion), and area A3 (the lower portion), when the touch position of the touch event is located in the upper portion of the terminal screen, the target touch area is correspondingly area A1.
In the present disclosure, a memory of the terminal may store a mapping table from each touch area to its preset confidence threshold. After determining the target touch area where the touch position of the touch event is located, the terminal may look up the preset confidence threshold corresponding to that touch area in the mapping table and then compare the confidence corresponding to the current acceleration value against it, so as to determine whether the touch event corresponding to the current acceleration value is a finger joint tapping event.
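As a hedged sketch of that lookup (threshold values taken from the Fig. 2 example above; converting an acceleration value into a confidence score is deferred to the related art, so a confidence value is taken as input here):

```python
# Per-area preset confidence thresholds, using the Fig. 2 example values.
PRESET_CONFIDENCE_THRESHOLDS = {"A1": 70, "A2": 65, "A3": 63}

def is_finger_joint_tap(target_area: str, confidence: float) -> bool:
    """Look up the preset threshold for the target touch area and compare it
    against the confidence corresponding to the current acceleration value."""
    return confidence > PRESET_CONFIDENCE_THRESHOLDS[target_area]

# A confidence of 68 counts as a finger joint tap in area A2 (68 > 65),
# but not in area A1 (68 <= 70).
assert is_finger_joint_tap("A2", 68.0)
assert not is_finger_joint_tap("A1", 68.0)
```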
Fig. 4 is a flowchart illustrating S101 shown in fig. 1 according to an exemplary embodiment, as shown in fig. 4, including the steps of:
in step S401, in response to a touch event to the screen, detection data of the touch event is acquired.
In step S402, the detection data is input into the first model, and the classification probability is obtained.
In step S403, if the classification probability is within the preset interval, the detection data is input to the second model to obtain the classification information of the touch event output by the second model.
In step S404, if the classification information is the preset classification information, it is determined that the touch event is a tapping event.
In some embodiments, the terminal reports coordinates of the touch position of the touch event and/or capacitance values corresponding to the coordinates.
In an alternative embodiment, the detection data may include the capacitance values reported for the touch event.
It is to be understood that the detection data input into the first model and the second model may take the form of a data matrix with a preset number of rows and a predetermined number of columns, where one data matrix may correspond to one touch event.
In the embodiment of the present disclosure, the detection data is input into the first model, and the first model performs operations on the detection data to obtain the classification probability. After the classification probability is obtained, it is further determined whether the classification probability of the detection data of the touch event is within a preset interval; the preset interval may be a continuously distributed probability space. The preset interval may include an upper probability limit and a lower probability limit, the lower probability limit being less than the upper probability limit.
The value of the classification probability is between 0 and 1.
The preset interval may be a probability interval containing 0.5. For example, the preset interval may be 0.5 to 0.7, or 0.45 to 0.75.
It can be understood that the preset interval divides the classification probabilities between 0 and 1 into three sets: the probabilities contained in the preset interval form one set; the classification probabilities smaller than the lower probability limit of the preset interval form another set; and the classification probabilities greater than the upper probability limit of the preset interval form a third set.
Taking the preset interval to be 0.5 to 0.7, the classification probabilities corresponding to the other two sets are [0, 0.5) and (0.7, 1].
If the classification probability is within the preset interval, the second model is used to further classify the touch event. In S403, the detection data of a touch event whose classification probability is within the preset interval is input into the second model, and the second model completes classification of the touch event through operations such as convolution, obtaining the classification information.
The classification information here may include judgment information indicating the category to which the touch event belongs. It is understood that, in the present embodiment, the classification information at least distinguishes the finger joint tap category from other categories; the other categories include, but are not limited to, touch operations applied to the screen by the finger pad, and/or by a stylus, and/or by the side of a finger.
Since the embodiment of the present disclosure classifies touch events into two categories, the classification information output in S403 may be "0" or "1", where "0" indicates one category of touch events and "1" indicates the other. It is further understood that either "0" or "1" may represent the preset classification information, which is not limited in this embodiment.
Wherein the network structures of the first model and the second model may be the same or different.
By adopting this technical scheme, classification of touch events is realized simply and conveniently through two deep-learning models. It can be understood that touch events whose classification probabilities fall within the preset interval differ little in probability, that is, they are difficult for the first classification model to distinguish; such touch events are therefore classified again by the second classification model, improving classification accuracy.
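A minimal sketch of the cascade, assuming the interval bounds from the 0.5-to-0.7 example and treating both models as callables over the capacitance data matrix (which label, "0" or "1", denotes the preset tapping category is also an assumption):

```python
import numpy as np

LOWER, UPPER = 0.5, 0.7  # example preset interval bounds from the text

def is_tapping_event(detection_matrix, first_model, second_model) -> bool:
    p = first_model(detection_matrix)   # S402: classification probability
    if p < LOWER:                       # below the interval: confidently a tap
        return True
    if p > UPPER:                       # above the interval: confidently not a tap
        return False
    # S403/S404: ambiguous probability, defer to the second model's label.
    return second_model(detection_matrix) == "1"

# Toy usage with stand-in models and a preset-rows x preset-columns matrix:
data = np.zeros((8, 6))
print(is_tapping_event(data, lambda m: 0.3, lambda m: "0"))  # True: p < lower limit
print(is_tapping_event(data, lambda m: 0.6, lambda m: "1"))  # True: second model decides
```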
In one embodiment, the method further comprises:
if the classification probability is smaller than the lower probability limit of the preset interval, determining that the touch event is a tapping event;
and/or,
and if the classification probability is greater than the upper probability limit of the preset interval, determining that the touch event is a non-tapping event.
It can be understood that if the classification probability is outside the preset interval, the first classification model can already distinguish the type of the touch event well; this technical scheme therefore has the characteristic of high classification speed.
The non-tapping event includes, but is not limited to, a touch operation applied to the screen by the finger pad, and/or by a stylus, and/or by the side of a finger.
In one embodiment, the second model is trained using a cross-entropy cost function as the loss function, and the first model is trained using a modified cross-entropy cost function, the Focal Loss function, as the loss function.
In the embodiment of the present disclosure, the first model and the second model are related: their loss functions, the basic (common) cross-entropy cost function and the Focal Loss function, are themselves related, the Focal Loss function being obtained by modifying the basic cross-entropy cost function.
The cross-entropy cost function employed by the second model may be as follows:

L = -[y log(y') + (1 - y) log(1 - y')]

wherein L is the loss value calculated with the cross-entropy cost function, y is the actual label of the sample, and y' is the label predicted by the second model for the sample.

The Focal Loss function may be as follows:

L1 = -[α y (1 - y')^γ log(y') + (1 - α)(1 - y)(y')^γ log(1 - y')]

wherein L1 is the loss value calculated based on the Focal Loss function; y' is the classification information output by the first model; y is the label of the sample; γ is the standard Focal Loss focusing exponent; and α is a balance factor used to balance the numbers of samples corresponding to the two categories of touch events, that is, α is determined according to the difference between the numbers of samples corresponding to the two categories of touch events used in training the first model.
Therefore, the first model is trained with the Focal Loss function so that detection data the trained first model finds difficult to classify yields classification probabilities near 0.5; that is, detection data the first model classifies with low accuracy falls within the preset interval as much as possible, triggering the second model cascaded with the first model to classify that detection data, thereby achieving accurate classification of different types of touch events.
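A hedged numpy sketch of the two loss functions above for binary labels; the α and γ values shown are common illustrative defaults, not values fixed by the disclosure:

```python
import numpy as np

def cross_entropy(y, y_pred, eps=1e-7):
    """Basic cross-entropy cost, as used to train the second model."""
    y_pred = np.clip(y_pred, eps, 1 - eps)
    return -(y * np.log(y_pred) + (1 - y) * np.log(1 - y_pred))

def focal_loss(y, y_pred, alpha=0.25, gamma=2.0, eps=1e-7):
    """Focal Loss, as used to train the first model: the (1 - y')^gamma factor
    down-weights easy samples so hard, ambiguous samples dominate training."""
    y_pred = np.clip(y_pred, eps, 1 - eps)
    return -(alpha * y * (1 - y_pred) ** gamma * np.log(y_pred)
             + (1 - alpha) * (1 - y) * y_pred ** gamma * np.log(1 - y_pred))

# An easy, confidently correct sample contributes far less under Focal Loss:
print(cross_entropy(1.0, 0.95))  # ~0.0513
print(focal_loss(1.0, 0.95))     # ~0.000032
```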
In some embodiments, the training samples of the second model comprise samples whose classification probabilities, as output by the first model after the samples are input into it, are within the preset interval.
In some embodiments, the determining the target touch area where the touch position of the touch event is located may include: determining coordinates of a touch position of the touch event; and searching a touch area matched with the coordinates in a plurality of preset touch areas according to the coordinates, and taking the touch area as a target touch area where the touch position of the touch event is located.
For example, the capacitive touch screen of a terminal is a simple and practical input device whose working principle is to locate the spatial position coordinates of a touch point by detecting the electrical signal at the touch point on a conductive film having an electric-field distribution. After the coordinates of the touch point are determined by the capacitive touch screen, the terminal matches the touch area where the coordinates are located and determines that touch area as the target touch area.
In other possible embodiments, the determining the target touch area where the touch position of the touch event is located may further include: firstly, a terminal determines pixel points of a touch position of a touch event; then, the terminal determines the coordinates of the touch position of the touch event according to the pixel point; and then, the terminal matches the touch area where the coordinate is located according to the determined coordinate, and the located touch area is determined as a target touch area.
It should be noted that the preset multiple touch areas may refer to the areas shown in fig. 2 and fig. 3, and multiple touch areas in other arrangement modes may also be set according to the position of the acceleration sensor.
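A hedged sketch of the coordinate-to-area matching, assuming each preset touch area is stored as a (left, top, right, bottom) rectangle; the three bands mirror the Fig. 2 layout on an assumed 1080 x 2340 screen:

```python
PRESET_AREAS = {
    "A1": (0, 0, 1080, 780),      # upper band, nearest the acceleration sensor
    "A2": (0, 780, 1080, 1560),   # middle band
    "A3": (0, 1560, 1080, 2340),  # lower band
}

def find_target_area(x: int, y: int):
    """Return the preset touch area whose rectangle contains (x, y)."""
    for name, (left, top, right, bottom) in PRESET_AREAS.items():
        if left <= x < right and top <= y < bottom:
            return name
    return None  # coordinates fall outside all preset areas

print(find_target_area(540, 2000))  # -> "A3" (lower part of the screen)
```

The same lookup applies unchanged to the four-area B1-B4 layout of fig. 3; only the rectangle table differs.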
In some embodiments, in response to the confidence level corresponding to the current acceleration value being less than or equal to the preset confidence level threshold corresponding to the target touch area, determining that the touch event is a non-finger joint tap event.
Based on the same inventive concept, the present disclosure also provides a touch event recognition apparatus, and fig. 5 is a block diagram illustrating a touch event recognition apparatus according to an exemplary embodiment. As shown in fig. 5, the touch event recognition apparatus includes a first determination module 501, a second determination module 502, and a third determination module 503.
A first determining module 501 configured to determine, in response to a touch event to a screen, that the touch event is a tap event.
A second determining module 502 configured to determine a target touch area where the touch position of the touch event is located, and,
and acquire a current acceleration value detected in the target touch area for the touch event.
A third determining module 503, configured to determine that the touch event is a finger joint tapping event in response to the confidence corresponding to the current acceleration value being greater than a preset confidence threshold corresponding to the target touch area.
Optionally, the first determining module 501 further includes:
the detection data acquisition sub-module is configured to respond to a touch event of a screen and acquire detection data of the touch event.
And the first classification submodule is configured to input the detection data into a first model to obtain classification probability.
And the second classification submodule is configured to input the detection data into a second model to obtain the classification information of the touch event output by the second model if the classification probability is within a preset interval.
And the first event determining submodule is configured to determine that the touch event is a tapping event if the classification information is preset classification information.
Optionally, the first determining module 501 further includes:
and the second event determining submodule is configured to determine that the touch event is a tapping event if the classification probability is smaller than the lower probability limit of the preset interval.
and/or,
and the third event determining submodule is configured to determine that the touch event is a non-tapping event if the classification probability is greater than the upper probability limit of the preset interval.
Optionally, the second model is trained using a cross-entropy cost function as the loss function, and the first model is trained using a modified cross-entropy cost function, the Focal Loss function, as the loss function.
Optionally, the second determining module 502 includes:
a coordinate determination submodule configured to determine coordinates of a touch location of the touch event.
And the target touch area determining submodule is configured to search a touch area matched with the coordinates in a plurality of preset touch areas according to the coordinates, and use the touch area as a target touch area where the touch position of the touch event is located.
The apparatus 500 further comprises:
a fourth determining module configured to determine that the touch event is a non-finger joint tapping event in response to that the confidence corresponding to the current acceleration value is less than or equal to a preset confidence threshold corresponding to the target touch area.
With regard to the apparatus in the above-described embodiment, the specific manner in which each module performs the operation has been described in detail in the embodiment related to the method, and will not be elaborated here.
Based on the same inventive concept, the present disclosure also provides a computer-readable storage medium on which computer program instructions are stored, which when executed by a processor implement the steps of the touch event recognition method provided by the present disclosure.
Based on the same inventive concept, the present disclosure also provides a touch event recognition device, the device comprising:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to:
in response to a touch event to a screen, determining that the touch event is a tap event;
determining a target touch area where the touch position of the touch event is located, and,
acquiring a current acceleration value detected in the target touch area for the touch event;
and in response to the confidence corresponding to the current acceleration value being greater than a preset confidence threshold corresponding to the target touch area, determining that the touch event is a finger joint tapping event.
Fig. 6 is a block diagram illustrating a touch event recognition apparatus according to an exemplary embodiment. For example, the apparatus 600 may be a mobile device such as a mobile phone, tablet device, and the like.
Referring to fig. 6, the apparatus 600 may include one or more of the following components: a processing component 602, a memory 604, a power component 606, a multimedia component 608, an audio component 610, an input/output (I/O) interface 612, and a communication component 614, an acceleration sensor 616.
The processing component 602 generally controls overall operation of the device 600, such as operations associated with display, data communication, and recording operations. The processing component 602 may include one or more processors 620 to execute instructions to perform all or part of the steps of touch event recognition described above. Further, the processing component 602 can include one or more modules that facilitate interaction between the processing component 602 and other components. For example, the processing component 602 can include a multimedia module to facilitate interaction between the multimedia component 608 and the processing component 602.
The memory 604 is configured to store various types of data to support operations at the apparatus 600. Examples of such data include instructions for any application or method operating on the apparatus 600, and so forth. The memory 604 may be implemented by any type or combination of volatile or non-volatile memory devices such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disks.
Power component 606 provides power to the various components of device 600. Power components 606 may include a power management system, one or more power sources, and other components associated with generating, managing, and distributing power for device 600.
The multimedia component 608 includes a display screen that provides an output interface between the device 600 and the user. In some embodiments, the display screen may include a Liquid Crystal Display (LCD) and a Touch Panel (TP). If the display screen includes a touch panel, it may be implemented as a touch screen, here a capacitive touch screen, to receive input signals from the user. The touch panel includes one or more touch sensors to sense touches, slides, and gestures on the touch panel. The touch sensors may not only sense the boundary of a touch or slide action, but also detect the duration and pressure associated with the touch or slide operation.
The audio component 610 is configured to output and/or input audio signals. For example, audio component 610 includes a Microphone (MIC) configured to receive external audio signals when apparatus 600 is in an operational mode, such as a speech recognition mode. The received audio signals may further be stored in the memory 604 or transmitted via the communication component 614. In some embodiments, audio component 610 further includes a speaker for outputting audio signals.
An input/output (I/O) interface 612 provides an interface between the processing component 602 and peripheral interface modules, which may be keyboards, click wheels, buttons, etc. These buttons may include, but are not limited to: a home button, a volume button, a start button, and a lock button.
The communication component 614 is configured to facilitate wired or wireless communication between the apparatus 600 and other devices. The apparatus 600 may access a wireless network based on a communication standard, such as WiFi, 2G, or 3G, or a combination thereof. In an exemplary embodiment, the communication component 614 receives a broadcast signal or broadcast-related information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the communication component 614 further includes a Near Field Communication (NFC) module to facilitate short-range communications. For example, the NFC module may be implemented based on Radio Frequency Identification (RFID) technology, Infrared Data Association (IrDA) technology, Ultra Wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
The acceleration sensor 616 is configured to detect an acceleration value corresponding to a touch operation.
In an exemplary embodiment, the apparatus 600 may be implemented by one or more Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field Programmable Gate Arrays (FPGAs), controllers, micro-controllers, microprocessors, or other electronic components, for performing the above-described touch event recognition method.
In an exemplary embodiment, a non-transitory computer-readable storage medium comprising instructions, such as the memory 604 comprising instructions, executable by the processor 620 of the device 600 to perform the touch event identification method described above is also provided. For example, the non-transitory computer readable storage medium may be a ROM, a Random Access Memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, and the like.
In another exemplary embodiment, a computer program product is also provided, which comprises a computer program executable by a programmable apparatus, the computer program having code portions for performing the above-mentioned touch event identification method when executed by the programmable apparatus.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure. This application is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It will be understood that the present disclosure is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.

Claims (7)

1. A touch event identification method is characterized by comprising the following steps:
in response to a touch event on a screen, determining that the touch event is a tapping event;
determining a target touch area where the touch position of the touch event is located, and,
acquiring a current acceleration value detected in the target touch area for the touch event;
determining that the touch event is a finger joint tapping event in response to the confidence corresponding to the current acceleration value being greater than a preset confidence threshold corresponding to the target touch area;
the determining, in response to the touch event to the screen, that the touch event is a tap event includes:
in response to a touch event on a screen, acquiring detection data of the touch event; inputting the detection data into a first model to obtain a classification probability; if the classification probability is within a preset interval, inputting the detection data of the touch event whose classification probability is within the preset interval into a second model to obtain the classification information of the touch event output by the second model; and if the classification information is preset classification information, determining that the touch event is a tapping event;
the determining the target touch area where the touch position of the touch event is located includes: determining coordinates of a touch position of the touch event; and searching a touch area matched with the coordinates in a plurality of preset touch areas according to the coordinates, and taking the touch area as a target touch area where the touch position of the touch event is located.
2. The method of claim 1, further comprising:
if the classification probability is smaller than the lower probability limit of the preset interval, determining that the touch event is a tapping event;
and/or,
and if the classification probability is greater than the probability upper limit of the preset interval, determining that the touch event is a non-tapping event.
3. The method of claim 1, wherein the second model is trained using a cross-entropy cost function as the loss function, and the first model is trained using a modified cross-entropy cost function, the Focal Loss function, as the loss function.
4. The method according to any one of claims 1-3, further comprising:
and determining that the touch event is a non-finger-joint tapping event in response to the confidence corresponding to the current acceleration value being less than or equal to a preset confidence threshold corresponding to the target touch area.
5. A touch event recognition apparatus, the apparatus comprising:
a first determination module configured to determine, in response to a touch event to a screen, that the touch event is a tap event;
a second determination module configured to determine a target touch area at which a touch position of the touch event is located, and,
acquiring a current acceleration value detected in the target touch area for the touch event;
a third determining module configured to determine that the touch event is a finger joint tap event in response to the confidence corresponding to the current acceleration value being greater than a preset confidence threshold corresponding to the target touch area;
the first determining module includes:
the detection data acquisition sub-module is configured to respond to a touch event of a screen and acquire detection data of the touch event;
a first classification submodule configured to input the detection data into a first model, resulting in a classification probability;
the second classification submodule is configured to input detection data of the touch event of which the classification probability is within a preset interval into a second model to obtain classification information of the touch event output by the second model if the classification probability is within the preset interval;
a first event determining sub-module configured to determine that the touch event is a tapping event if the classification information is preset classification information;
the second determining module includes:
a coordinate determination submodule configured to determine coordinates of a touch position of the touch event;
and the target touch area determining submodule is configured to search a touch area matched with the coordinates in a plurality of preset touch areas according to the coordinates, and use the touch area as a target touch area where the touch position of the touch event is located.
6. An electronic device, comprising:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to:
in response to a touch event to a screen, determining that the touch event is a tap event;
determining a target touch area where the touch position of the touch event is located, and,
acquiring a current acceleration value detected in the target touch area for the touch event;
determining that the touch event is a finger joint tapping event in response to the confidence corresponding to the current acceleration value being greater than a preset confidence threshold corresponding to the target touch area;
the determining, in response to the touch event to the screen, that the touch event is a tap event includes:
in response to a touch event on a screen, acquiring detection data of the touch event; inputting the detection data into a first model to obtain a classification probability; if the classification probability is within a preset interval, inputting the detection data of the touch event whose classification probability is within the preset interval into a second model to obtain the classification information of the touch event output by the second model; and if the classification information is preset classification information, determining that the touch event is a tapping event;
the determining the target touch area where the touch position of the touch event is located includes: determining coordinates of a touch position of the touch event; and searching a touch area matched with the coordinates in a plurality of preset touch areas according to the coordinates, and taking the touch area as a target touch area where the touch position of the touch event is located.
7. A computer-readable storage medium, on which computer program instructions are stored, which program instructions, when executed by a processor, carry out the steps of the method according to any one of claims 1 to 4.
CN202011439375.XA 2020-12-07 2020-12-07 Touch event identification method and device and computer readable storage medium Active CN112445410B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011439375.XA CN112445410B (en) 2020-12-07 2020-12-07 Touch event identification method and device and computer readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011439375.XA CN112445410B (en) 2020-12-07 2020-12-07 Touch event identification method and device and computer readable storage medium

Publications (2)

Publication Number Publication Date
CN112445410A CN112445410A (en) 2021-03-05
CN112445410B (en) 2023-04-18

Family

ID=74740209

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011439375.XA Active CN112445410B (en) 2020-12-07 2020-12-07 Touch event identification method and device and computer readable storage medium

Country Status (1)

Country Link
CN (1) CN112445410B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114567696A (en) * 2022-01-25 2022-05-31 北京小米移动软件有限公司 Application control method, application control device and storage medium
CN117389454A (en) * 2022-07-01 2024-01-12 荣耀终端有限公司 Finger joint operation identification method and electronic equipment
CN116027953A (en) * 2022-08-15 2023-04-28 荣耀终端有限公司 Finger joint touch operation identification method, electronic equipment and readable storage medium

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4241329A (en) * 1978-04-27 1980-12-23 Dialog Systems, Inc. Continuous speech recognition method for improving false alarm rates
WO2017133615A1 (en) * 2016-02-03 2017-08-10 腾讯科技(深圳)有限公司 Service parameter acquisition method and apparatus
CN107729924A (en) * 2017-09-25 2018-02-23 平安科技(深圳)有限公司 Picture review probability interval generation method and picture review decision method
CN109240585A (en) * 2018-08-08 2019-01-18 瑞声科技(新加坡)有限公司 A kind of method, apparatus of human-computer interaction, terminal and computer readable storage medium
CN110209638A (en) * 2019-05-22 2019-09-06 努比亚技术有限公司 Application function identification method, device, terminal and storage medium

Family Cites Families (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103136247B (en) * 2011-11-29 2015-12-02 阿里巴巴集团控股有限公司 Attribute data interval division method and device
US20130176270A1 (en) * 2012-01-09 2013-07-11 Broadcom Corporation Object classification for touch panels
US20150242009A1 (en) * 2014-02-26 2015-08-27 Qeexo, Co. Using Capacitive Images for Touch Type Classification
WO2015188011A1 (en) * 2014-06-04 2015-12-10 Quantum Interface, Llc. Dynamic environment for object and attribute display and interaction
US10606417B2 (en) * 2014-09-24 2020-03-31 Qeexo, Co. Method for improving accuracy of touch screen event analysis by use of spatiotemporal touch patterns
CN106415472B (en) * 2015-04-14 2020-10-09 华为技术有限公司 Gesture control method and device, terminal equipment and storage medium
CN106339137A (en) * 2015-07-16 2017-01-18 小米科技有限责任公司 Terminal touch recognition method and device
CN106406587A (en) * 2015-07-16 2017-02-15 小米科技有限责任公司 Terminal touch control identification method and device
CN105630239B (en) * 2015-12-24 2019-03-22 小米科技有限责任公司 Operate detection method and device
CN106445120A (en) * 2016-09-05 2017-02-22 华为技术有限公司 Touch operation identification method and apparatus
CN106445231A (en) * 2016-09-18 2017-02-22 青岛海信移动通信技术股份有限公司 Identification method and device of touches
CN106569710A (en) * 2016-10-31 2017-04-19 宇龙计算机通信科技(深圳)有限公司 Task starting method and device, and a terminal
CN107402677B (en) * 2017-07-31 2020-10-09 北京小米移动软件有限公司 Method and device for recognizing finger lifting in touch operation and terminal
CN109753172A (en) * 2017-11-03 2019-05-14 矽统科技股份有限公司 The classification method and system and touch panel product of touch panel percussion event
TWI654541B (en) * 2018-04-13 2019-03-21 矽統科技股份有限公司 Method and system for identifying tapping events on a touch panel, and terminal touch products
CN109583501B (en) * 2018-11-30 2021-05-07 广州市百果园信息技术有限公司 Method, device, equipment and medium for generating image classification and classification recognition model
CN110880117A (en) * 2019-10-31 2020-03-13 北京三快在线科技有限公司 False service identification method, device, equipment and storage medium

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4241329A (en) * 1978-04-27 1980-12-23 Dialog Systems, Inc. Continuous speech recognition method for improving false alarm rates
WO2017133615A1 (en) * 2016-02-03 2017-08-10 腾讯科技(深圳)有限公司 Service parameter acquisition method and apparatus
CN107729924A (en) * 2017-09-25 2018-02-23 平安科技(深圳)有限公司 Picture review probability interval generation method and picture review decision method
CN109240585A (en) * 2018-08-08 2019-01-18 瑞声科技(新加坡)有限公司 A kind of method, apparatus of human-computer interaction, terminal and computer readable storage medium
CN110209638A (en) * 2019-05-22 2019-09-06 努比亚技术有限公司 Application function identification method, device, terminal and storage medium

Also Published As

Publication number Publication date
CN112445410A (en) 2021-03-05

Similar Documents

Publication Publication Date Title
CN112445410B (en) Touch event identification method and device and computer readable storage medium
CN102789332B (en) Method for identifying palm area on touch panel and updating method thereof
US8432368B2 (en) User interface methods and systems for providing force-sensitive input
CN102119376B (en) Multidimensional navigation for touch-sensitive display
CN106227520B (en) Application interface switching method and device
CN107436691B (en) Method, client, server and device for correcting errors of input method
CN106775087A (en) A kind of touch-screen control method of mobile terminal, device and mobile terminal
US20160299590A1 (en) Customization method, response method and mobile terminal for user-defined touch
CN103914196B (en) Electronic equipment and the method for determining the validity that the touch key-press of electronic equipment inputs
CN106873834B (en) Method and device for identifying triggering of key and mobile terminal
CN104007924A (en) Method and apparatus for operating object in user device
CN107357458B (en) Touch key response method and device, storage medium and mobile terminal
CN105739868B (en) A kind of method and device that identification terminal is accidentally touched
WO2017161637A1 (en) Touch control method, touch control device, and terminal
CN104598076B (en) Touch information screen method and device
CN105893955A (en) Fingerprint recognition device and method
CN105867822B (en) Information processing method and electronic equipment
CN107463290A (en) Response control mehtod, device, storage medium and the mobile terminal of touch operation
CN107037965A (en) A kind of information displaying method based on input, device and mobile terminal
KR20150027885A (en) Operating Method for Electronic Handwriting and Electronic Device supporting the same
CN107958273B (en) Volume adjusting method and device and storage medium
CN112286440A (en) Touch operation classification method and device, model training method and device, terminal and storage medium
US20180364907A1 (en) A system and method for detecting keystrokes in a passive keyboard in mobile devices
CN108153478A (en) The processing method of touch and terminal of terminal
CN116204073A (en) Touch control method, touch control device, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant