CN114911401A - Electronic device, method for analyzing touch operation of electronic device, and readable medium - Google Patents

Electronic device, method for analyzing touch operation of electronic device, and readable medium

Info

Publication number: CN114911401A
Application number: CN202110184394.0A
Authority: CN (China)
Other languages: Chinese (zh)
Prior art keywords: touch, finger, data, touch operation, category
Inventors: 李海, 胡凯
Current Assignee: Huawei Technologies Co Ltd (the listed assignees may be inaccurate; Google has not performed a legal analysis)
Original Assignee: Huawei Technologies Co Ltd
Application filed by Huawei Technologies Co Ltd; priority to CN202110184394.0A
Publication of CN114911401A
Legal status: Pending (the legal status is an assumption and is not a legal conclusion)

Classifications

    All classifications fall under G (Physics) > G06 (Computing; calculating or counting) > G06F (Electric digital data processing) > G06F3/00 (input arrangements for transferring data into a form the computer can handle) > G06F3/01 (interaction between user and computer):
    • G06F3/0486: Drag-and-drop (GUI interaction techniques, G06F3/048 > G06F3/0484)
    • G06F3/0416: Control or interface arrangements specially adapted for digitisers (G06F3/03 > G06F3/041)
    • G06F3/044: Digitisers characterised by capacitive transducing means (G06F3/03 > G06F3/041)
    • G06F3/0485: Scrolling or panning (GUI interaction techniques, G06F3/048 > G06F3/0484)
    • G06F3/04883: Inputting data by handwriting, e.g. gesture or text, using a touch-screen or digitiser (G06F3/048 > G06F3/0487 > G06F3/0488)

Abstract

The application relates to the field of communications and discloses an electronic device, a method for analyzing touch operations of the electronic device, and a readable medium. The analysis method comprises the following steps: when the touch gesture type meets a preset analysis condition, acquiring first capacitance data, analyzing the touch operation represented by the first capacitance data according to a preset finger form category, and in parallel computing the actual finger form category based on the first capacitance data; when the computed actual finger form category is the same as the preset finger form category, continuing to analyze subsequent capacitance data based on the preset finger form category; and when the computed actual finger form category differs from the preset finger form category, analyzing the acquired capacitance data using the actual finger form category. Because the analysis does not wait for the finger form computation to finish, the user's touch operation type can be analyzed in advance according to the preset finger form category, which improves analysis efficiency and optimizes the user experience.

Description

Electronic device, method for analyzing touch operation of electronic device, and readable medium
Technical Field
The present invention relates to the field of communications, and in particular, to an electronic device, a method for analyzing a touch operation of the electronic device, and a readable medium.
Background
With the development of science and technology, more and more electronic devices adopt touch screens, so that users can control them directly through touch operations. The finger form categories with which a user touches the screen include the finger and the knuckle (finger joint): a finger touch operates the application program whose icon is displayed at the touch point, while a knuckle touch captures a screenshot of the interface currently displayed on the touch screen. At present, the finger is the most common finger form category.
During the touch screen's response to a user operation, when the user touches the screen surface a coupling capacitance forms between the finger and the screen, so that currents emitted by electrodes on the four edges of the screen flow toward the contact point and produce capacitance data. The capacitance data is processed into touch data, and the touch data is then fused with acceleration data by an algorithm to determine the finger form category of the touch operation.
At present, from the user's touch to the touch screen's response, the finger form category can only be obtained through data acquisition and complex algorithmic computation, and the touch screen is driven to respond only after both the touch data and the finger form category are available. The whole process is time-consuming, the response of the touch screen is slow, and the user experience suffers. For example, with a typical touch screen response time of 20 ms, the finger form analysis alone takes 4-5 ms, i.e., 20%-25% of the total.
Disclosure of Invention
The embodiments of the present application provide an electronic device, a method for analyzing touch operations of the electronic device, and a readable medium.
A first aspect of the present application provides a touch operation analysis method for a touch screen, applied to an electronic device. The method includes: when it is detected that the touch gesture type of a user operation on the electronic device's touch screen meets a preset analysis condition, acquiring first capacitance data generated by the user's touch operation on the touch screen, and computing the actual finger form category of the touch operation based on the first capacitance data; while the actual finger form category has not yet been computed, analyzing the user's touch operation type on capacitance data taken from the capacitance data queue based on a preset finger form category; when the actual finger form category has been computed and is the same as the preset finger form category, continuing to use the preset finger form category to analyze the touch operation type of capacitance data taken from the queue; and when the actual finger form category has been computed and differs from the preset finger form category, using the actual finger form category to analyze the touch operation type of capacitance data taken from the queue.
In other words, in the embodiments of the present application, after the first capacitance data is obtained, its touch operation type is analyzed according to the preset finger form category while the actual finger form category of the touch operation is computed from it in parallel. While the actual finger form category is not yet available, the touch operation type of the capacitance data following the first capacitance data continues to be analyzed according to the preset finger form category; when the computed actual finger form category matches the preset one, analysis of the subsequent capacitance data likewise continues according to the preset finger form category. The first capacitance data may be the first frame of capacitance data; to avoid an inaccurate first frame compromising the accuracy of the actual finger form category, it may instead be the initial several frames of capacitance data.
For example, suppose the electronic device is a mobile phone and the application is an instant messaging chat interface. When the user's finger performs a touch operation on the phone, the phone collects multiple frames of capacitance data through the touch screen and first analyzes, based on the preset finger form category, the type of touch operation the finger is performing, such as sliding up and down. Meanwhile, touch data is computed from the capacitance data, and the actual finger form category of the touch operation is then computed from the touch data. When the actual finger form category is the finger, touch operation analysis of the subsequent capacitance data, such as the up-and-down slide, continues according to the finger category.
This method can analyze the user's touch operation type in advance according to the preset finger form category, without waiting for the result of the finger form computation. It therefore improves the analysis efficiency of the touch operation type, speeds up the analysis of and response to the user's touch operation, and optimizes the user experience.
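The parallel flow lends itself to a compact illustration. The following Python sketch is only an illustration of the idea under stated assumptions, not the patent's implementation; all names (`analyze_operation`, `compute_actual_category`, `handle_touch`) are hypothetical, and the two stubbed functions stand in for the application's analysis and the category-fusion algorithm:

```python
import threading
from enum import Enum

class FingerForm(Enum):
    FINGER = "finger"
    KNUCKLE = "knuckle"

PRESET_FORM = FingerForm.FINGER  # preset category: the finger is the likeliest form

def analyze_operation(frame, form):
    """Stub: map one frame of capacitance data to a touch operation type
    (click, slide, circle, ...) under the given finger form category."""

def compute_actual_category(first_frame, accel_data):
    """Stub: fuse capacitance and acceleration data into the actual category."""

def handle_touch(frames, accel_data):
    actual = None  # filled in by the slower category computation

    def classify():
        nonlocal actual
        actual = compute_actual_category(frames[0], accel_data)

    threading.Thread(target=classify).start()  # runs in parallel

    for frame in frames:
        if actual is None or actual == PRESET_FORM:
            # Result not ready yet, or it confirmed the guess:
            # keep analyzing under the preset category.
            analyze_operation(frame, PRESET_FORM)
        else:
            # The guess was wrong: switch to the actual category
            # (one variant first re-analyzes the frames already handled).
            analyze_operation(frame, actual)
```

In the implementation described next, the frames already analyzed under the preset category would additionally be re-analyzed once the actual category arrives.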
In a possible implementation of the first aspect, the method further includes: completing the analysis of the touch operation type based on the preset finger form category; computing an actual finger form category that differs from the preset finger form category; and performing a second touch operation type analysis on the acquired capacitance data using the actual finger form category.
That is, in the embodiments of the present application, when the computed actual finger form category differs from the preset finger form category and the touch operation analysis of the first capacitance data has already been completed according to the preset category, a second touch operation analysis of the first capacitance data is performed according to the actual category, and the capacitance data following the first capacitance data is analyzed according to the actual category as well.
For example, when the actual finger form category is the knuckle and the touch operation analysis of the first capacitance data has been completed according to the finger category, the touch operation analysis is performed again on the first capacitance data according to the knuckle category, after which the subsequent capacitance data is also analyzed according to the knuckle category. In other words, after the instant messaging program has performed the up-and-down slide according to the finger category, it re-interprets the first capacitance data and the subsequent capacitance data according to the knuckle category and completes the circling action.
This approach requires a correction when analyzing the subsequent capacitance data, but the touch screen report rate is above 120 Hz, i.e., the reporting interval between two adjacent frames of capacitance data is roughly 8 ms or less. Even if an initial misjudgment occurs, the analysis method can therefore correct it quickly, which preserves the accuracy of the touch operation analysis.
In a possible implementation of the first aspect, the method further includes: completing the analysis of the touch operation type based on the preset finger form category; computing an actual finger form category that differs from the preset finger form category; and analyzing the touch operation type of the capacitance data acquired after the first capacitance data using the actual finger form category.
That is, in the embodiments of the present application, when the computed actual finger form category differs from the preset finger form category and the touch operation analysis of the first capacitance data has already been completed according to the preset category, the capacitance data following the first capacitance data is analyzed directly according to the actual category.
For example, the up-and-down slide that the instant messaging program performed based on the first capacitance data, interpreted as a finger touch, is discarded, and the circling action is completed directly from the subsequent capacitance data according to the knuckle category.
In a possible implementation of the first aspect, the finger form categories include the finger and the knuckle. For example, a finger touch operates the application program whose icon is displayed at the touch point, while a knuckle touch captures a screenshot of the interface currently displayed on the touch screen.
In a possible implementation of the first aspect, the preset finger form category is the finger.
In other words, in the embodiments of the present application, when the user touches the touch screen by hand, the probability that the touch comes from a finger is relatively high. Setting the preset finger form category to the finger therefore effectively raises the probability that the preset category matches the actual category, improving the accuracy of analysis and response.
In a possible implementation of the first aspect, the capacitance data includes a coordinate feature and a capacitance feature, where the coordinate feature represents the position of a touch point and the capacitance feature represents the capacitance magnitude at that touch point.
In a possible implementation of the first aspect, the actual finger form category is computed by: acquiring acceleration data detected by an acceleration sensor of the electronic device; and computing the actual finger form category based on the acceleration data and the first capacitance data.
In a possible implementation of the first aspect, the preset analysis condition is that the touch gesture type is a press-down event.
That is, in the embodiments of the present application, when the touch gesture type is a press-down event, the first capacitance data may be the first frame, or the first several frames, of capacitance data, so the actual finger form category must be determined from it. When the touch gesture type is not a press-down event, i.e., the first capacitance data is a subsequent frame, the actual finger form category has already been obtained, and touch operation analysis can proceed without recomputing it. This optimizes the touch operation analysis of capacitance data and speeds up the analysis of and response to the touch operation type.
In a possible implementation of the first aspect, the user's touch operation types include clicking, long pressing, dragging, and sliding when the finger form category is the finger, and circling and double-clicking when the finger form category is the knuckle.
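Restating this mapping in code form (a trivial sketch reusing the hypothetical `FingerForm` enum from the earlier sketch; the operation sets are exactly those listed above):

```python
RECOGNIZED_OPERATIONS = {
    FingerForm.FINGER:  {"click", "long_press", "drag", "slide"},
    FingerForm.KNUCKLE: {"circle", "double_click"},
}
```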
A second aspect of the present application provides an electronic device comprising:
a touch screen, configured to generate capacitance data in response to a user's touch operation on the touch screen;
a memory storing instructions;
and a processor coupled to the memory, where, when the program instructions stored in the memory are executed by the processor, the processor controls the touch screen of the electronic device to perform any of the touch operation analysis methods above.
A third aspect of the present application provides a computer-readable medium having instructions stored thereon that, when executed, cause an electronic device to perform any of the touch operation analysis methods above.
Drawings
Fig. 1 provides a schematic view of a scene when a finger form category is a finger according to an embodiment of the present application.
Fig. 2 provides a scene schematic diagram when the finger form category is a finger joint according to an embodiment of the present application.
Fig. 3 shows a schematic structural diagram of a terminal according to an embodiment of the present application.
FIG. 4 is a schematic diagram of a touch operation analysis method of a touch screen according to some embodiments of the present application.
Fig. 5 is an architecture diagram of an electronic device for implementing a touch operation analysis method of a touch screen according to some embodiments of the present application.
Fig. 6 provides an interaction diagram of a touch operation analysis method of a touch screen according to some embodiments of the present application.
Fig. 7 is a timing diagram of a touch operation analysis method of a touch screen according to some embodiments of the present disclosure.
Fig. 8 is a flowchart of a method for analyzing a touch operation of a touch screen according to some embodiments of the present disclosure.
FIG. 9 provides a flow chart of type analysis of touch operations under different touch gesture categories according to some embodiments of the present application.
FIG. 10 provides a flowchart of an acquisition of actual finger morphology categories according to some embodiments of the present application.
FIG. 11 is a flowchart illustrating a method for correcting a predetermined finger shape category according to a comparison of an actual finger shape category and a predetermined finger shape category according to some embodiments of the present disclosure.
FIG. 12 is a flow chart illustrating a modification of a predetermined finger shape category according to a comparison of an actual finger shape category and a predetermined finger shape category according to some embodiments of the present disclosure.
FIG. 13 is a schematic diagram of another method for analyzing touch operations of a touch screen according to some embodiments of the present application.
FIG. 14 is a flow chart of another method for analyzing touch operations of a touch screen according to some embodiments of the present application.
FIG. 15 illustrates a block diagram of a software architecture of an electronic device, according to some embodiments of the present application.
Reference numerals: 100: mobile phone; 2: finger; 3: knuckle; 4: screenshot area.
Detailed Description
To make the objects, technical solutions, and advantages of the present application clearer, embodiments of the present application are described in further detail below with reference to the accompanying drawings.
Fig. 1(a) and 1(b) show an example of a scenario in which a user touches the touch screen of a mobile phone 100 according to an embodiment of the present application. Specifically, fig. 1(a) shows the touch screen of the mobile phone 100 receiving a touch from a finger 2, and fig. 1(b) shows the touch screen responding to that touch. As shown in fig. 1(a) and 1(b), when the touch screen displays the instant messaging chat interface of "chat group 1", the user touches the screen with a finger 2 and slides up and down to scroll the chat records, and chat records from different time periods are then displayed on the interface.
Fig. 2(a) and 2(b) show another example of a scenario in which a user touches the touch screen of the mobile phone 100 according to an embodiment of the present application. Specifically, fig. 2(a) shows the touch screen of the mobile phone 100 receiving a touch from a knuckle 3, and fig. 2(b) shows the touch screen responding to that touch. As shown in fig. 2(a) and 2(b), when the touch screen displays the instant messaging chat interface of "chat group 1", the user's knuckle 3 touches the screen to capture the current chat interface: the knuckle 3 traces out the screenshot area 4 on the touch screen, and the portion of the current interface inside the screenshot area 4 is captured.
In the scenarios shown in fig. 1(a), 1(b), 2(a), and 2(b), when the user's finger performs a touch operation on the mobile phone 100, the phone can collect multiple frames of capacitance data through the touch screen. As described above, in the prior art the user's finger form category must first be determined from the first frame (or first several frames) of capacitance data collected by the touch screen together with the acceleration data collected by the acceleration sensor 180E; that is, only after determining whether the user used a finger or a knuckle is the user's touch operation analyzed in combination with the capacitance data and a touch response produced. Analyzing the finger form category therefore takes considerable time.
In some embodiments of the present application, to solve the above problem in such scenarios, after the mobile phone 100 collects the first frame of capacitance data through the touch screen, it first performs touch operation analysis on the collected capacitance data according to a default preset finger form category (which may be the finger or the knuckle), and simultaneously computes the actual finger form category by combining the first frame of capacitance data with data collected by other sensors (such as the acceleration sensor 180E).
If the computed actual finger form category is the same as the preset one, the mobile phone 100 keeps analyzing the capacitance data subsequently collected by its touch screen according to the preset category, obtains the user's touch operation type, and issues the corresponding touch response. If the actual category differs from the preset one, then once the actual category is detected the mobile phone 100 analyzes the collected capacitance data using the actual category, obtains the user's touch operation type, and issues the corresponding touch response. The touch operation type may be any of clicking, long pressing, dragging, or sliding when the finger form category is the finger, or double-clicking or circling when the finger form category is the knuckle.
It can be understood that when the preset finger form category is the same as the actual one, the touch detection module can identify the user's touch operation type in advance according to the preset category instead of waiting for the finger form computation to finish. This improves the analysis efficiency of the touch operation type, speeds up the analysis of and response to the user's touch operation, and optimizes the user experience.
It is understood that, besides the mobile phone 100 in the above scenarios, the electronic device applying the claimed technical solution for touch detection may be any electronic device with a touch screen, for example a tablet computer, a laptop computer, a wearable device, a head-mounted display, a mobile email device, a portable game console, a portable music player, a reader device, a smart TV with a touch screen, a smart speaker, or another smart home device. For convenience of explanation, the touch analysis scheme of the present application is described in detail below taking the mobile phone 100 as an example.
In addition, the technical solution of the present application applies to any application program in the electronic device, including both system applications, such as the window management application (which provides, e.g., the knuckle screenshot function) and the gallery application, and third-party applications such as instant messaging, mailbox, and reading applications.
Fig. 3 shows a schematic structural diagram of the mobile phone 100 according to an embodiment of the present application. The mobile phone 100 may include a processor 110, an external memory interface 120, an internal memory 121, a Universal Serial Bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, a button 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a Subscriber Identity Module (SIM) card interface 195, and the like. The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
It is to be understood that the structure illustrated in this embodiment does not constitute a specific limitation on the mobile phone 100. In other embodiments of the present application, the mobile phone 100 may include more or fewer components than shown, or combine some components, split some components, or arrange the components differently. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 110 may include one or more processing units, such as: the processor 110 may include an Application Processor (AP), a modem processor, a Graphics Processing Unit (GPU), an Image Signal Processor (ISP), a controller, a video codec, a Digital Signal Processor (DSP), a baseband processor, and/or a neural-Network Processing Unit (NPU), etc. The different processing units may be separate devices or may be integrated into one or more processors. It is understood that, in the embodiment of the present application, the processor 110 may be configured to execute the touch operation analysis method of the present application.
The controller can generate an operation control signal according to the instruction operation code and the timing signal to complete the control of instruction fetching and instruction execution.
A memory may also be provided in processor 110 for storing instructions and data. In some embodiments, processor 110 may include one or more interfaces. The interface may include an integrated circuit (I2C) interface, an integrated circuit built-in audio (I2S) interface, a Pulse Code Modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a Mobile Industry Processor Interface (MIPI), a general-purpose input/output (GPIO) interface, a Subscriber Identity Module (SIM) interface, and/or a Universal Serial Bus (USB) interface, etc.
The I2C interface is a bidirectional synchronous serial bus, and includes a serial data line (SDA) and a Serial Clock Line (SCL). The I2S interface may be used for audio communication. The PCM interface may also be used for audio communication, sampling, quantizing and encoding analog signals. The UART interface is a universal serial data bus used for asynchronous communications. The MIPI interface may be used to connect the processor 110 with peripheral devices such as the display screen 194, the camera 193, and the like. The GPIO interface may be configured by software. The USB interface 130 is an interface conforming to the USB standard specification, and may be a Mini USB interface, a Micro USB interface, a USB Type C interface, or the like.
It should be understood that the connection relationship between the modules shown in the embodiment of the present invention is only illustrative, and does not limit the structure of the mobile phone 100. In other embodiments of the present application, the mobile phone 100 may also adopt different interface connection manners or a combination of multiple interface connection manners in the above embodiments.
The charging management module 140 is configured to receive a charging input from a charger. The charger may be a wireless charger or a wired charger. In some wired charging embodiments, the charging management module 140 may receive charging input from a wired charger via the USB interface 130. In some wireless charging embodiments, the charging management module 140 may receive a wireless charging input through a wireless charging coil of the cell phone 100. The charging management module 140 can also supply power to the mobile phone through the power management module 141 while charging the battery 142.
The power management module 141 is used to connect the battery 142, the charging management module 140 and the processor 110.
The wireless communication function of the mobile phone 100 can be realized by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modem processor, the baseband processor, and the like. The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals.
The mobile communication module 150 may provide a solution including wireless communication of 2G/3G/4G/5G, etc. applied to the handset 100.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating a low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal.
The wireless communication module 160 may provide solutions for wireless communication applied to the mobile phone 100, including Wireless Local Area Networks (WLANs) (e.g., wireless fidelity (Wi-Fi) networks), Bluetooth (BT), Global Navigation Satellite System (GNSS), Frequency Modulation (FM), Near Field Communication (NFC), Infrared (IR), and the like. The wireless communication module 160 may be one or more devices integrating at least one communication processing module.
In some embodiments, the antenna 1 of the handset 100 is coupled to the mobile communication module 150 and the antenna 2 is coupled to the wireless communication module 160 so that the handset 100 can communicate with networks and other devices through wireless communication techniques.
The mobile phone 100 implements the display function through the GPU, the display screen 194, and the application processor. The GPU is a microprocessor for image processing, connected to the display screen 194 and the application processor.
The display screen 194 is used to display images, video, and the like, and includes a display panel. The display panel may be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or the like. In some embodiments, the mobile phone 100 may include 1 or N display screens 194, where N is a positive integer greater than 1.
The mobile phone 100 may implement a shooting function through the ISP, the camera 193, the video codec, the GPU, the display 194, the application processor, and the like. The ISP is used to process the data fed back by the camera 193. The camera 193 is used to capture still images or video. The digital signal processor is used for processing digital signals, and can process digital image signals and other digital signals. Video codecs are used to compress or decompress digital video.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to extend the storage capability of the mobile phone 100. The internal memory 121 may be used to store computer-executable program code, which includes instructions. The internal memory 121 may include a program storage area and a data storage area. The storage program area may store an operating system, an application program (such as a sound playing function, an image playing function, etc.) required by at least one function, and the like. The data storage area may store data (e.g., audio data, a phonebook, etc.) created during use of the handset 100, and the like. In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory, such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (UFS), and the like. The processor 110 executes various functional applications and data processing of the mobile phone 100 by executing instructions stored in the internal memory 121 and/or instructions stored in a memory provided in the processor.
The mobile phone 100 may implement audio functions through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the earphone interface 170D, and the application processor. The audio module 170 is used to convert digital audio information into analog audio signals for output, and also used to convert analog audio inputs into digital audio signals. The audio module 170 may also be used to encode and decode audio signals. The speaker 170A, also called a "horn", is used to convert the audio electrical signal into an acoustic signal. The receiver 170B, also called "earpiece", is used to convert the electrical audio signal into an acoustic signal. The microphone 170C, also referred to as a "microphone," is used to convert sound signals into electrical signals. The headphone interface 170D is used to connect a wired headphone.
The acceleration sensor 180E can detect the magnitude of the acceleration of the mobile phone 100 in various directions (typically three axes), and can detect the magnitude and direction of gravity when the phone is stationary. It can also be used to recognize the phone's attitude and is applied in landscape/portrait switching, pedometers, and similar applications.
The touch sensor 180K is also called a "touch device". The touch sensor 180K may be disposed in the display screen 194, and a touch screen, also referred to as a "touch screen", is formed by the touch sensor 180K and the display screen 194. The touch sensor 180K is used to detect a touch operation applied thereto or nearby. The touch sensor 180K may pass the detected touch operation to the application processor to determine the touch event type. Visual output associated with the touch operation may be provided through the display screen 194. In other embodiments, the touch sensor 180K may be disposed on the surface of the mobile phone 100, different from the position of the display 194. It can be understood that, in some embodiments of the present application, when a user performs a touch operation on a touch screen, the technical solution of the present application may be adopted to analyze the touch operation of the user and provide a corresponding operation response.
The touch screen can be any one of an infrared touch screen, a resistive touch screen, a surface acoustic wave touch screen and a capacitive touch screen.
The keys 190 include a power-on key, a volume key, and the like. The motor 191 may generate a vibration cue. Indicator 192 may be an indicator light that may be used to indicate a state of charge, a change in charge, or a message, missed call, notification, etc.
The touch operation analysis scheme of the present application is described in detail below with reference to specific embodiments.
Fig. 4 is a schematic diagram of a touch operation analysis method according to the present application. As shown in fig. 4(a), if the actual finger form category is the same as the preset one, the application 199 in the mobile phone 100 does not receive an actual finger form category; it always analyzes the capacitance data subsequently collected by the touch screen according to the preset category, obtains the user's touch operation type, and issues the corresponding touch response. As shown in fig. 4(b), if the actual category differs from the preset one, the application 199 receives the actual finger form category: before the actual category is available it performs touch operation analysis according to the preset category, after the actual category is available it performs touch operation analysis according to the actual category, and it corrects, according to the actual category, the capacitance data that has already been responded to under the preset category. It is understood that the time at which the mobile phone 100 determines the actual finger form category is not limited.
Fig. 5 illustrates a system architecture of the mobile phone 100 for implementing the touch operation analysis method of the present application in some embodiments, where the processor 110 of the mobile phone 100 runs a touch driver 196, an anti-false touch model 197, a gesture type determination module 198A, and a category determination model 198B. As shown in fig. 5, the mobile phone 100 includes an application layer 550, a framework layer 540, a hardware abstraction layer 530, a driver layer 520, and a device layer 510. The device layer 510 contains physical devices such as the touch sensor 180K, the acceleration sensor 180E, and the touch chip 180M; the driver layer 520 contains the touch driver 196, which receives and forwards touch data; the hardware abstraction layer 530 contains the anti-false touch model 197, which screens touch data; the framework layer 540 contains the category determination model 198B as well as the gesture type determination module 198A, which determines the touch gesture type from the serial number in the touch data; and the application 199 in the application layer 550, for example a window management application, receives the position feature, pressure, area, and finger form category and performs touch operation analysis on the position feature, pressure, and area according to the finger form category.
Fig. 6 is an interaction diagram of an analysis method of a touch operation according to the present application. The following describes a touch operation analysis scheme according to some embodiments of the present application with reference to fig. 5 and 6, which specifically includes the following steps:
step 602: the touch sensor 180K in the touch screen of the mobile phone 100 scans the surface of the touch screen at regular time according to the sampling frequency, and obtains capacitance data generated by the operation touch of the user.
Specifically, in some embodiments, the touch sensor 180K derives the capacitance data from initial data. The touch sensor 180K acquires the screening logic, the classification logic, and the initial data generated by the touch operation, and processes the initial data with the screening logic and the classification logic to obtain the capacitance data. The initial data is the capacitance information of all touch points on the touch screen; each item of capacitance information comprises a coordinate feature representing the touch point's position and a capacitance feature representing its capacitance magnitude. The screening logic is a data processing rule that discards capacitance information below the report-point threshold (which may be a capacitance threshold) and keeps the capacitance information that exceeds it; the classification logic is a data processing rule that groups the screened capacitance information by position feature. The touch sensor 180K first uses the screening logic to keep the capacitance information whose capacitance feature is at or above the capacitance threshold, then uses the classification logic to group the screened capacitance information into capacitance data, as sketched below. It is understood that one item of initial data may yield one or more items of capacitance data: one item when the user touches the screen with a single finger or knuckle, and several when the user touches it with multiple fingers or knuckles.
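As one way to picture the screening and classification logic, here is a minimal Python sketch; the tuple representation, the threshold, and the distance-based grouping rule are all assumptions, since the patent only specifies "discard below the report-point threshold, then group by position":

```python
def screen_and_classify(initial_data, cap_threshold, group_radius=2.0):
    """Screening logic: keep only the capacitance information whose
    capacitance feature reaches the report-point threshold. Classification
    logic: group the surviving points by position, one group per contact.

    `initial_data` is a list of (x, y, capacitance) tuples; grouping by
    distance to a group's first point is a simplification.
    """
    survivors = [p for p in initial_data if p[2] >= cap_threshold]

    groups = []
    for x, y, c in survivors:
        for group in groups:
            gx, gy, _ = group[0]
            if (x - gx) ** 2 + (y - gy) ** 2 <= group_radius ** 2:
                group.append((x, y, c))
                break
        else:
            groups.append([(x, y, c)])
    return groups  # one item of capacitance data per finger/knuckle contact
```

For a single-finger press this would return one group; a two-finger touch would return two.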
Step 603: the touch sensor 180K transmits the capacitance data to the touch chip 180M.
Step 604: after receiving the capacitance data, the touch chip 180M determines whether an interrupt message caused by a touch operation exists according to the capacitance data, and generates touch data according to the capacitance data. The interrupt message caused by the touch operation refers to a trigger signal generated when the touch screen receives the touch operation of a user. If the interrupt message exists, it indicates that the user starts the touch operation on the touch screen, and further triggers the touch screen to generate the capacitance value data, and then step 605 is entered, if the interrupt message does not exist, it indicates that the user does not perform the touch operation on the touch screen, and does not trigger the touch screen to generate the capacitance value data, and step 602 is returned.
Specifically, the touch data includes a position feature, a pressure, an area, a capacitance matrix, and a trimming value, and further includes a serial number indicating which frame of capacitance data it is.
In some embodiments, after receiving the capacitance data, the touch chip 180M extracts the coordinate features and capacitance features of all the capacitance information in each item of capacitance data, derives the center coordinates of the capacitance data from them, and uses the center coordinates as the position feature; the center coordinates are the coordinate feature that best represents the position of the user's touch operation. The touch chip 180M then computes the distance between the position feature and each coordinate feature, selects the coordinate features whose distance is below a distance threshold, fetches the capacitance features corresponding to the selected coordinate features, and stores them indexed by coordinate feature, forming a matrix of capacitance information around the touch point position, i.e., the capacitance matrix. In addition, the touch chip 180M computes the pressure and the area from the coordinate features and capacitance features of the capacitance data.
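A sketch of this step might look as follows; the capacitance-weighted centroid and the particular pressure and area formulas are assumptions (the patent says only that pressure and area are computed from the coordinate and capacitance features):

```python
def to_touch_data(group, radius=1.5):
    """Turn one item of capacitance data (a list of (x, y, capacitance)
    points) into touch data: a capacitance-weighted centroid as the position
    feature, the nearby points as the capacitance matrix, and pressure and
    area derived from them."""
    total_c = sum(c for _, _, c in group)
    cx = sum(x * c for x, _, c in group) / total_c
    cy = sum(y * c for _, y, c in group) / total_c

    # Capacitance matrix: capacitance information within `radius` of the centroid.
    matrix = [(x, y, c) for x, y, c in group
              if (x - cx) ** 2 + (y - cy) ** 2 <= radius ** 2]

    area = len(matrix)                          # assumed: count of active cells
    pressure = total_c / area if area else 0.0  # assumed: mean capacitance
    return {"position": (cx, cy), "matrix": matrix,
            "pressure": pressure, "area": area}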
Step 605: the touch chip 180M reports the interrupt message to the touch driver 196 in the driver layer 520.
Step 606: after receiving the interrupt message, the touch driver 196 reads touch data from the touch chip 180M according to the interrupt message.
Specifically, after receiving the interrupt message, the touch driver 196 reads touch data from the touch chip 180M through an SPI (Serial Peripheral Interface) channel.
Step 607: the touch driver 196 sends the read touch data to the anti-false touch model 197 in the hardware abstraction layer 530.
Step 608: the false touch prevention model 197 determines whether the touch data is false touch data, and screens out touch data that is not false touch to obtain the screened touch data.
Specifically, after receiving the touch data, the anti-false touch model 197 determines whether it is false-touch data, rejects it if so, and keeps it if not, yielding the screened touch data. It is understood that when there are several items of touch data, the flow ends if all of them are false touches; if some of them are false touches, those are removed and the remainder form the screened touch data; and if none of them are false touches, all of the touch data is the screened touch data.
It is to be understood that the anti-false touch model 197 may be a model trained on sample touch data and sample determination results. It may be a contact-surface contour detection module: when the contour of the contact surface between the user and the touch screen is circular, nearly circular, or another regular shape, the false-touch determination is negative, i.e., the touch data is not false-touch data; when the contour is irregular or changing, the determination is positive, i.e., the touch data is false-touch data. The anti-false touch model 197 may also be a trimming value determination module, where the trimming value is the length of the clipped edge when the touch shape of a touch point is cut off by the boundary of the touch screen; one touch shape may have one or more clipped edges, and the module derives the false-touch determination from the relation between the trimming value and a trimming value threshold. The model may also be an area determination module, which derives the false-touch determination from whether the position feature falls in a false-touch region.
In some embodiments, the screened touch data is obtained using the anti-false touch model 197 alone. Specifically, after the anti-false touch model 197 in the hardware abstraction layer 530 receives the touch data, it derives a false-touch determination from the position feature and trimming value in the touch data and screens the touch data accordingly: if the determination is positive, the touch data is false-touch data and is removed; if negative, the touch data is kept.
In other embodiments, the orientation of the mobile phone 100 strongly affects the judgment of the anti-false touch model 197: when the phone is held horizontally its left and right edges are prone to accidental touches, and when held vertically its top and bottom are. To improve the accuracy of the false-touch determination, the screened touch data is obtained using the anti-false touch model 197 together with motion data. Specifically, the anti-false touch model 197 may additionally obtain the motion data measured by the acceleration sensor 180E, determine the phone's orientation from it, and derive the false-touch determination from both the touch data and the motion data before screening. For example, if the motion data shows the phone tilted forward by 45 degrees, the adjustment coefficient for a 45-degree forward tilt is looked up in a coefficient library, the anti-false touch model 197 is tuned by that coefficient, and the determination for the touch data is then produced by the tuned model.
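Sketching the orientation adjustment (the coefficient values, the orientation labels, and the helper `classify_orientation` are all invented for illustration; the patent only describes looking up an adjustment coefficient by orientation and tuning the model with it):

```python
ADJUSTMENT_COEFFICIENTS = {  # hypothetical coefficient library keyed by orientation
    "flat": 1.00,
    "portrait": 1.15,    # top and bottom edges more suspect
    "landscape": 1.15,   # left and right edges more suspect
}

def classify_orientation(accel):
    """Stub: map (ax, ay, az) gravity components to a coarse orientation."""
    ax, ay, az = accel
    if abs(az) > max(abs(ax), abs(ay)):
        return "flat"
    return "portrait" if abs(ay) >= abs(ax) else "landscape"

def adjusted_false_touch_score(base_score, accel):
    """Scale the model's false-touch score by the orientation coefficient."""
    return base_score * ADJUSTMENT_COEFFICIENTS.get(classify_orientation(accel), 1.0)
```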
In other embodiments, the screened touch data is obtained using the common false-touch data in an anti-false-touch database. Specifically, after receiving the touch data, the hardware abstraction layer 530 may further fetch its in-layer anti-false-touch database, which stores common false-touch data, each item containing at least a position feature and a trimming value. The hardware abstraction layer 530 matches the position feature and trimming value of the touch data against every item of common false-touch data in the database one by one. If the matching degree between the touch data and at least one database item exceeds the false-touch matching threshold, the touch data is false-touch data; if the matching degree with every item stays below the threshold, it is not.
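The database-matching variant can be sketched as a nearest-template test; the similarity measure and the threshold are assumptions, since the patent only requires a matching degree above a false-touch matching threshold:

```python
def is_false_touch(touch, database, match_threshold=0.9):
    """Compare one item of touch data against every common false-touch entry
    by position feature and trimming value; a single sufficiently close
    entry marks it as a false touch."""
    px, py = touch["position"]
    for entry in database:  # each entry: {"position": (x, y), "trim": t}
        ex, ey = entry["position"]
        distance = ((px - ex) ** 2 + (py - ey) ** 2) ** 0.5
        trim_gap = abs(touch["trim"] - entry["trim"])
        match_degree = 1.0 / (1.0 + distance + trim_gap)  # assumed similarity
        if match_degree >= match_threshold:
            return True
    return False
```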
Step 609: the anti-false touch model 197 returns the filtered touch data to the touch driver 196.
In other embodiments, the hardware abstraction layer 530 may also return the filtered touch data directly to the touch driver 196.
Step 610: after receiving the filtered touch data, the touch driver 196 sends the filtered touch data to the gesture type determining module 198A of the frame layer 540.
Step 611: the gesture type determination module 198A of the frame layer 540 determines the touch gesture type according to the serial number in the filtered touch data. If the touch gesture type determination module 198A determines that the touch gesture type is a pressing event, which indicates that the first volume value data is the first frame of volume value data, or the first several frames of volume value data, it is necessary to determine the actual finger shape type according to the first volume value data, and execute step 612.a1 and step 612.b1 at the same time.
Specifically, the gesture type determination module 198A in the framework layer 540 receives the screened touch data and parses the serial number in it, i.e., identifies which frame the capacitance data is, thereby obtaining the touch gesture type of the capacitance data. For example, if the serial number ranges from 1 to n: when the serial number is 1, the capacitance data is the first frame and the touch gesture type is a press-down event; when the serial number is 2, 3, 4, ..., n-1, the capacitance data is an intermediate frame and the type is a move event; and when the serial number is n, the capacitance data is the last frame and the type is a lift-up event.
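In code, the mapping from serial number to touch gesture type is direct (a sketch; in practice the last frame is known only when the lift-off is reported, so the `last_serial` parameter is an assumption used to mirror the description above):

```python
def gesture_type(serial, last_serial):
    """Serial number 1 is the press-down event, the last frame is the
    lift-up event, and every frame in between is a move event."""
    if serial == 1:
        return "down"   # also triggers the parallel finger form computation
    if serial == last_serial:
        return "up"
    return "move"
```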
Step 612.a 1: the gesture type determination module 198A in the framework layer 540 sends the position characteristics, pressure, area, and preset finger form categories in the filtered touch data to the application 199 in the application layer 550. In some embodiments, since the probability that a finger touches the touch screen is high when the hand of the user touches the touch screen, the preset finger form category is preferably a finger, but in other embodiments, the preset finger form category is not limited to a finger, and may also refer to a joint.
Step 612.a 2: the application 199 performs touch operation analysis on the position feature, pressure and area according to the preset finger form category to give a corresponding operation response.
For example, when the application 199 is an instant messaging program and the preset finger form category is the finger, the application 199 analyzes the position feature, pressure, and area according to the finger category to obtain the position, force, and extent of the finger touch on the program's current interface, and from these determines the action type of the touch to issue the corresponding operation response. Specifically, for a tap, the response is to open a picture in the message record or play a voice message; for a long press at the same position, the response is to offer options such as forwarding, favoriting, editing, deleting, multi-selecting, or quoting the message; and for a light slide across positions, the response is to scroll the message record so as to display messages from different time periods.
For another example, when the application 199 is a gallery program and the preset finger form category is finger, the application 199 analyzes the position features, pressure, and area for a finger to obtain the position, force, and extent of the user's finger touch on the current interface of the gallery program, then determines the action type of the touch on the touch screen from these and gives a corresponding operation response. Specifically, when the action type is two or more fingers sliding toward each other, the operation response is to shrink the picture in the gallery program; when the action type is multiple fingers sliding away from each other, the operation response is to enlarge the picture in the gallery program; when the action type is a double tap, the operation response is to enlarge the picture centered on the picture's center.
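A toy lookup for the (finger form category, action type) pairs in the two examples above might look like the following sketch; the key and value strings paraphrase the text and are not API names from the patent:

```python
# Illustrative mapping from (finger form category, action type) to a
# plain-text description of the operation response.
RESPONSES = {
    ("finger", "tap"):        "open picture / play voice message",
    ("finger", "long_press"): "forward, favorite, edit, delete, or quote message",
    ("finger", "slide"):      "scroll the message record",
    ("finger", "pinch_in"):   "shrink the picture",
    ("finger", "pinch_out"):  "enlarge the picture",
    ("finger", "double_tap"): "enlarge the picture about its center",
}

def respond_to(category: str, action: str) -> str:
    """Look up the operation response; unknown pairs are a no-op."""
    return RESPONSES.get((category, action), "no-op")
```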
Step 612.b1: the gesture type determination module 198A in the framework layer 540 sends the filtered touch data to the category determination model 198B in the framework layer 540, triggering the category determination model 198B.
Step 612.b2: the category determination model 198B receives the acceleration data sent by the acceleration sensor 180E in the device layer 510.
Step 612.b3: the category determination model 198B obtains the actual finger form category from the filtered touch data and the acceleration data.
In some embodiments, a finger type determination module (not shown) in the framework layer 540 may use a finger joint determination condition to determine whether the user's actual finger form category is finger joint. For example, the finger type determination module of the framework layer 540 acquires the finger joint determination condition in the framework layer 540 and determines whether the filtered touch data and the acceleration data satisfy it. The finger joint determination condition includes a capacitance value range, a pressure range, and an acceleration range corresponding to the finger joint category. When the capacitance features in the capacitance matrix fall in the capacitance value range, the pressure falls in the pressure range, and the acceleration data falls in the acceleration range, the actual finger form category is finger joint; otherwise, the actual finger form category is not finger joint, that is, it is finger.
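A hedged sketch of this range-based condition check follows; the dataclass fields and the idea of testing three closed ranges come from the paragraph above, while the names themselves are illustrative assumptions (the patent gives no concrete values):

```python
from dataclasses import dataclass

@dataclass
class KnuckleCondition:
    cap_range: tuple       # (low, high) capacitance value range for a knuckle
    pressure_range: tuple  # (low, high) pressure range
    accel_range: tuple     # (low, high) acceleration range (knuckle taps are sharp)

def is_knuckle(cap: float, pressure: float, accel: float,
               cond: KnuckleCondition) -> bool:
    """True only when all three measurements fall inside the knuckle ranges."""
    in_range = lambda v, r: r[0] <= v <= r[1]
    return (in_range(cap, cond.cap_range)
            and in_range(pressure, cond.pressure_range)
            and in_range(accel, cond.accel_range))
```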
In other embodiments, a finger type determination module (not shown) of the framework layer 540 may determine whether the user's actual finger form category is finger joint by using the comparison data in a finger joint database. Specifically, the finger type determination module obtains the finger joint database in the framework layer 540; the database stores comparison data whose finger form category is finger joint, and each piece of comparison data includes position features, pressure, area, and a capacitance value matrix. The finger type determination module in the framework layer 540 matches the filtered touch data against all the comparison data in the finger joint database one by one. When the matching degree between the filtered touch data and at least one piece of comparison data exceeds a finger joint matching threshold, the actual finger form category is finger joint; when the matching degree with every piece of comparison data does not exceed the threshold, the actual finger form category is not finger joint, that is, it is finger.
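The database variant can be sketched as a template check against a matching threshold. The similarity measure below is a stand-in of our own, since the patent does not specify one, and a real model would also compare the capacitance value matrices:

```python
def match_degree(sample: dict, template: dict) -> float:
    """Toy similarity over position, pressure, and area (higher is closer)."""
    keys = ("x", "y", "pressure", "area")
    diffs = [abs(sample[k] - template[k]) / (abs(template[k]) + 1e-6)
             for k in keys]
    return max(0.0, 1.0 - sum(diffs) / len(keys))

def is_knuckle_by_template(sample: dict, templates: list,
                           threshold: float = 0.9) -> bool:
    """True when at least one stored knuckle template matches above threshold.
    The threshold value is an illustrative assumption."""
    return any(match_degree(sample, t) > threshold for t in templates)
```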
Step 612.b4: the category determination model 198B determines whether to correct the preset finger form category based on the result of comparing the actual finger form category with the preset finger form category.
If the actual finger form category is the same as the preset finger form category, the category determination model 198B of the framework layer 540 ends the flow, so that the application 199 continues to perform touch operation analysis according to the preset finger form category; if the actual finger form category is different from the preset finger form category, the category determination model 198B of the framework layer 540 converts the preset finger form category into the actual finger form category, and sends the actual finger form category to the application 199, so that the application 199 performs touch operation analysis according to the actual finger form category.
In some embodiments, if the application layer 550 has already responded to the touch operation according to the preset finger form category, the framework layer 540 sends the position features, pressure, area, and actual finger form category to the application 199 in the application layer 550. In some other embodiments, if the application 199 has not yet responded to the touch operation according to the preset finger form category, the framework layer 540 changes the not-yet-responded preset finger form category to the actual finger form category; for example, the preset finger form category in the application 199 message queue is modified to, or replaced with, the actual finger form category.
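A compact sketch of this correction step (612.b4) follows. The TouchEvent fields and the resend callback are illustrative assumptions about one possible realization, not the patent's interfaces:

```python
from dataclasses import dataclass

@dataclass
class TouchEvent:
    position: tuple
    pressure: float
    area: float
    category: str   # "finger" or "finger_joint"

def correct_category(event: TouchEvent, actual: str, responded: bool,
                     resend) -> None:
    """If the preset category was wrong: re-send when the app has already
    responded; otherwise rewrite the event still waiting in the queue."""
    if event.category == actual:
        return                       # preset category was correct; flow ends
    if responded:
        resend(event.position, event.pressure, event.area, actual)
    else:
        event.category = actual      # modify the not-yet-responded queue entry
```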
Step 612.b5: the application 199 performs touch operation analysis on the position features, pressure, and area according to the actual finger form category.
After receiving the actual finger form category, the application 199 in the mobile phone 100 performs touch operation analysis on the received position characteristics, pressure and area according to the actual finger form category.
In other embodiments, if the touch gesture type is not a press-down event, that is, it is a move event or a lift-up event, the capacitance value data is a middle or last frame, and the actual finger form category has already been calculated from an earlier frame, so it does not need to be recalculated. In this case, the gesture type determination module 198A in the framework layer 540 directly sends the position features, pressure, and area in the filtered touch data to the application 199 in the application layer 550, so that the application 199 performs touch operation analysis on them according to the finger form category already established for the touch operation. This optimizes the touch operation analysis of the capacitance value data and improves the speed of analyzing and responding to the touch operation type.
In other embodiments, the processor 110 of the mobile phone 100 processes the capacitance value data to obtain the touch data: after receiving the capacitance value data, the touch chip 180M reports it directly to the processor 110 without processing it, and the processor 110 processes the capacitance value data to obtain the touch data.
For example, when the display interface of the touch screen of the mobile phone 100 is the chat interface of "chat group 1" and the user touches the chat interface with a hand, the touch sensor 180K in the device layer 510 periodically scans the surface of the touch screen at the sampling frequency to collect the capacitance information at all touch points on the touch screen. The touch sensor 180K then screens out the capacitance information exceeding a reporting threshold to obtain capacitance value data and sends it to the touch chip 180M in the device layer 510 of the mobile phone 100. After recognizing the interrupt message in the capacitance value data, the touch chip 180M reports the interrupt message to the touch driver 196 in the driver layer 520. The touch driver 196 receives the interrupt message, reads the touch data generated by the touch chip 180M from the capacitance value data, removes the false touch data via the false touch prevention model 197 in the hardware abstraction layer 530 to obtain the filtered touch data, and sends the filtered touch data to the gesture type determination module 198A in the framework layer 540.
The gesture type determination module 198A in the framework layer 540 sends the position features, pressure, area, and the preset finger form category (finger) in the filtered touch data to the instant messaging program in the application layer 550, so that the instant messaging program performs touch operation analysis on the position features, pressure, and area according to the finger category. Meanwhile, the finger form category algorithm or category determination model 198B in the framework layer 540 obtains the actual finger form category of the touch operation from the filtered touch data and the acceleration data acquired by the acceleration sensor 180E.
The finger form category algorithm or category determination model 198B in the framework layer 540 compares the actual finger form category with the finger category reported to the instant messaging program. If they are the same, touch operation analysis of the subsequent capacitance value data continues according to the finger category, and the flow ends; if they differ, that is, the actual finger form category is finger joint, the response state of the instant messaging program to the touch operation is queried, and the preset finger form category is corrected according to the response state and the actual finger form category.
Specifically, if the instant messaging program has responded to the touch operation as a finger, that is, the chat interface has scrolled the chat record with the finger's movement, the framework layer 540 sends the finger joint category to the instant messaging program, so that the instant messaging program re-analyzes and re-responds to the touch operation as a finger joint, that is, captures a screenshot of the chat interface within the area circled by the finger joint. If the instant messaging program has not yet responded to the touch operation as a finger, the framework layer 540 changes the finger category in the message queue of the instant messaging program to finger joint, either by replacing it or by modifying it in place, so that the instant messaging program analyzes and responds to the touch operation as a finger joint, that is, captures a screenshot of the chat interface within the circled area.
It can be understood that, if the instant messaging program has responded to the touch operation as a finger, that is, the chat interface has scrolled the chat record with the finger's movement, the framework layer 540 sends the finger joint category to the instant messaging program, so that the instant messaging program continues to analyze and respond to the touch operation as a finger joint, that is, captures a screenshot of the chat interface within the subsequently circled finger joint area.
For another example, when the display interface of the touch screen of the mobile phone 100 is the picture display interface of a gallery program and the user touches a picture with a hand, the touch sensor 180K in the device layer 510 periodically scans the surface of the touch screen at the sampling frequency to collect the capacitance information at all touch points on the touch screen. The touch sensor 180K then screens out the capacitance information exceeding a reporting threshold to obtain capacitance value data and sends it to the touch chip 180M in the device layer 510 of the mobile phone 100. After recognizing the interrupt message in the capacitance value data, the touch chip 180M reports the interrupt message to the touch driver 196 in the driver layer 520. The touch driver 196 receives the interrupt message, reads the touch data generated by the touch chip 180M from the capacitance value data, removes the false touch data via the false touch prevention model 197 in the hardware abstraction layer 530 to obtain the filtered touch data, and sends the filtered touch data to the gesture type determination module 198A in the framework layer 540.
The gesture type determination module 198A in the framework layer 540 sends the position features, pressure, area, and the preset finger form category (finger) in the filtered touch data to the gallery program in the application layer 550, so that the gallery program performs touch operation analysis on the position features, pressure, and area according to the finger category. Meanwhile, the finger form category algorithm or category determination model 198B in the framework layer 540 obtains the actual finger form category of the touch operation from the filtered touch data and the acceleration data acquired by the acceleration sensor 180E.
The finger form category algorithm or category determination model 198B in the framework layer 540 compares the actual finger form category with the finger category reported to the gallery program. If they are the same, touch operation analysis of the subsequent capacitance value data continues according to the finger category, and the flow ends; if they differ, that is, the actual finger form category is finger joint, the response state of the gallery program to the touch operation is queried, and the preset finger form category is corrected according to the response state and the actual finger form category.
Specifically, if the gallery program has responded to the touch operation as a finger, that is, the picture display interface has moved or the displayed picture area has zoomed with the finger's movement, the framework layer 540 sends the finger joint category to the gallery program, so that the gallery program re-analyzes and re-responds to the touch operation as a finger joint, that is, captures a screenshot of the picture display interface within the area circled by the finger joint. If the gallery program has not yet responded to the touch operation as a finger, the framework layer 540 changes the finger category in the gallery program's message queue to finger joint, either by replacing it or by modifying it in place, so that the gallery program analyzes and responds to the touch operation as a finger joint, that is, captures a screenshot of the picture interface within the circled area.
It can be understood that, if the gallery program has responded to the touch operation as a finger, that is, the picture interface has displayed part of the picture area with the finger's movement, the framework layer 540 sends the finger joint category to the gallery program, so that the gallery program continues to analyze and respond to the touch operation as a finger joint, that is, captures a screenshot of the picture interface within the subsequently circled finger joint area.
The application further discloses a touch operation analysis method. Fig. 7 is a timing diagram of a touch operation analysis method in the present application. The scheme of fig. 7 is the same as that of fig. 6 but is described in terms of timing. That is, while the user performs the touch operation, the touch sensor 180K sequentially collects multiple frames of capacitance value data; as described above, when the touch sensor 180K collects the first frame of capacitance value data, the category determination model 198B calculates the actual finger form category from it. When the touch sensor 180K detects the nth frame of capacitance value data, if the category determination model 198B has determined the actual finger form category, the application 199 performs the touch operation analysis according to the actual finger form category.
For example, steps 701 to 704b represent the processing performed on the first frame of capacitance value data, and steps 705a to 707 represent the related processing performed on the nth frame of capacitance value data. Specifically, the method comprises the following steps:
step 701: when a user touches the touch screen with a hand, the touch screen in the mobile phone 100 obtains the first frame of capacitance value data generated by the touch operation.
Step 702: the touch chip 180M in the mobile phone 100 calculates the first frame of touch data from the first frame of capacitance value data.
Step 703: the gesture type determination module 198A in the mobile phone 100 obtains, from the first frame of touch data, the corresponding touch gesture type. When the touch gesture type corresponding to the first frame of touch data is a press-down event, steps 704a and 704b are executed simultaneously.
Wherein, step 704a: the application 199 in the mobile phone 100 performs touch operation analysis on the position features, pressure, and area in the first frame of touch data according to the preset finger form category, and then step 705a is entered.
Step 704b: the category determination model 198B in the mobile phone 100 calculates the actual finger form category of the touch operation from the first frame of touch data and the other sensor data acquired by the other sensors. Because the calculation of the finger form category is complex, step 704a completes before step 704b does, and step 705b is entered.
Step 705a: the category determination model 198B in the mobile phone 100 continues to calculate the actual finger form category from the other sensor data and the first frame of touch data.
Step 705b: the touch screen in the mobile phone 100 acquires the nth frame of capacitance value data generated by the touch operation.
Step 706: the touch chip 180M in the mobile phone 100 calculates the second frame of touch data from the nth frame of capacitance value data. Further, assume that the mobile phone 100 obtains the second touch data and the actual finger form category at the same time.
Step 707: if the actual finger form category differs from the preset finger form category, the application 199 in the mobile phone 100 performs touch operation analysis on the position features, pressure, and area in the first touch data and the nth frame of touch data according to the actual finger form category; if they are the same, touch operation analysis of the position features, pressure, and area in the second touch data continues according to the preset finger form category.
It is understood that the actual finger form category may be reported together with the position features, pressure, and area of the latest frame of touch data, or may be reported separately.
This analysis method effectively increases the reporting speed of the first frame of capacitance value data and improves how closely the response tracks the user's touch. However, because the first frame of data does not pass through all the algorithms, misreporting is possible, and any misreport must be corrected when the subsequent capacitance value data is analyzed. Since the touch screen report rate is above 120 Hz, that is, the reporting interval between two adjacent frames of capacitance value data is roughly 8 ms or less, the method can correct a misreport quickly even when one occurs, ensuring the accuracy of the response to the touch operation.
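The fast-report-then-correct timing of fig. 7 can be sketched with a background thread; the thread-based structure and all names below are our assumptions about one possible realization, not the patent's implementation:

```python
import threading

def handle_press(first_frame, report, classify_slow, preset="finger"):
    """Report the first frame immediately with the preset category and start
    the slower finger-form classifier in the background."""
    report(first_frame, preset)                   # fast path: no waiting
    result = {}
    def worker():
        result["actual"] = classify_slow(first_frame)
    t = threading.Thread(target=worker, daemon=True)
    t.start()
    return t, result

def handle_frame(frame, report, t, result, preset="finger"):
    """For later frames, switch to the actual category once it is known."""
    actual = result.get("actual", preset)
    if t.is_alive() or actual == preset:
        report(frame, preset)                     # undecided, or preset confirmed
    else:
        report(frame, actual)                     # corrected from this frame on
```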
FIG. 8 shows a flowchart of a touch operation analysis method of the present application. The functions of the touch driver 196, the false touch prevention model 197, the gesture type determination module 198A, and the category determination model 198B in fig. 6 may be implemented by the processor 110 of the mobile phone 100 calling related programs. Specifically, as shown in fig. 8, the method includes the following steps:
step 801: the touch sensor 180K of the touch screen in the mobile phone 100 periodically collects capacitance value data at the sampling frequency; for the specific sampling manner, refer to step 602 above.
Step 802: the touch chip 180M in the mobile phone 100 determines whether there is an interrupt message according to the capacitance value data and obtains touch data from the capacitance value data; for the specific determination manner, refer to steps 603 and 604 above.
Step 803: the processor 110 in the mobile phone 100 determines whether the touch gesture type is a press-down event. If not, the flow proceeds to step 804; if so, steps 805 and 806 are performed at the same time; refer to steps 611 and 612 above.
Step 804: the processor 110 in the mobile phone 100, through the application 199, performs touch operation analysis on the position features, pressure, and area according to the original finger form category.
Step 805: the processor 110 in the mobile phone 100 performs touch operation analysis on the position characteristics, pressure and area according to the preset finger form category, and ends the process.
Step 806: the processor 110 in the mobile phone 100 obtains the acceleration data through the acceleration sensor 180E; refer to step 612.b2 above.
Step 807: the processor 110 in the mobile phone 100 obtains the actual finger form category from the touch data and the acceleration data and determines whether it is the same as the preset finger form category. If they are the same, the flow proceeds to step 805; if they differ, the flow proceeds to step 808. For the specific determination manner, refer to step 612.b4 above.
Step 808: the processor 110 in the mobile phone 100 converts the preset finger form category into the actual finger form category.
Step 809: the processor 110 in the mobile phone 100, through the application 199, performs touch operation analysis on the position features, pressure, and area according to the actual finger form category; refer to step 612.b5 above.
Fig. 9 details, on the basis of fig. 8, the touch operation analysis under different touch gesture types. Specifically, as shown in fig. 9, the method includes the following steps:
step 901 is the same as step 801.
Step 902: the touch chip 180M in the mobile phone 100 determines whether there is an interrupt message according to the capacitance value data and obtains touch data from it. If there is an interrupt message, the user has touched the touch screen and triggered it to generate capacitance value data, and the flow proceeds to step 903; if there is no interrupt message, the user has not touched the touch screen, no capacitance value data is generated, and the flow returns to step 901. For the specific determination and generation manners, refer to steps 603 and 604 above.
Step 903: the processor 110 in the mobile phone 100 acquires the motion data through the acceleration sensor 180E, and determines the placement direction of the mobile phone 100 according to the motion data.
Step 904: the processor 110 in the mobile phone 100 obtains the false touch prevention database.
Step 905: the processor 110 in the mobile phone 100 determines whether the touch data is false touch data according to the common false touch data in the false touch prevention database and the placement direction; for the specific determination method, refer to steps 607 and 608 above. If all the touch data is false touch data, the flow ends; if part of the touch data is false touch data, the false touch data is removed and the flow proceeds to step 906; if the touch data contains no false touch data, the flow proceeds directly to step 906.
Step 906: the processor 110 in the mobile phone 100 parses the serial number in the filtered touch data to obtain the touch gesture type of the capacitance value data; refer to step 611 above.
Step 907: the processor 110 in the mobile phone 100 determines whether the touch gesture type is a lift-up event. If so, step 908 is executed; if not, step 909 is executed; refer to step 612 above.
Step 908: the processor 110 in the mobile phone 100 performs touch operation analysis on the position characteristics, pressure and area according to the original finger form category through the application 199, and ends the process.
Step 909: the processor 110 in the mobile phone 100 determines whether the touch gesture type is a press-down event. If so, step 910 is executed; if not, that is, the touch gesture type is a move event, step 911 is executed.
Step 910: the processor 110 in the mobile phone 100 performs touch operation analysis on the position characteristics, pressure and area according to the preset finger form category through the application 199, and ends the process.
Step 911: the processor 110 in the mobile phone 100 performs touch operation analysis on the position characteristics, pressure, and area according to the original finger form category through the application 199, and ends the process.
Fig. 10, on the basis of fig. 8, fully describes the manner of acquiring the actual finger form category, specifically including the following steps. It can be understood that, because the finger form category of a touch operation is determined only when the touch gesture type is a press-down event, the following steps are performed when the gesture type is a press-down event:
steps 1001 to 1004 are the same as steps 901 to 904 described above.
Step 1005: the processor 110 in the mobile phone 100 determines whether the touch data is false touch data according to the false touch prevention data and the placement direction. If all the touch data is false touch data, the flow ends; if part of it is false touch data, the false touch data is removed and the flow proceeds to step 1006; if none of it is false touch data, the flow proceeds directly to step 1006. For the specific determination manner, refer to steps 607 and 608 above.
Step 1006: the processor 110 in the mobile phone 100 obtains the acceleration data through the acceleration sensor 180E; refer to step 612.b2 above.
Step 1007: the processor 110 in the mobile phone 100 obtains the comparison data corresponding to the preset finger form category; refer to steps 607 and 608 above.
Step 1008: the processor 110 in the mobile phone 100 determines, from the touch data, the acceleration data, and the comparison data, whether the finger form is of the preset finger form category. If not, the flow proceeds to step 1009; if so, the flow proceeds to step 1010; refer to step 612.b3 above.
Step 1009: the processor 110 in the mobile phone 100, through the application 199, performs touch operation analysis according to the actual finger form category and ends the flow; for the specific analysis manner, refer to step 612.b4 above.
Step 1010: the processor 110 in the mobile phone 100, through the application 199, performs touch operation analysis according to the preset finger form category and ends the flow; for the specific analysis manner, refer to step 612.b4 above.
In some embodiments, fig. 11 shows a method by which, when the application 199 has not yet responded to the touch operation, the preset finger form category is corrected according to the comparison between the actual finger form category and the preset finger form category. The method specifically includes the following steps:
step 1101: the touch chip 180M in the mobile phone 100 obtains an interrupt message reported according to the capacitance value data and reads touch data according to the interrupt message. Specifically, if there is an interrupt message, the touch screen is triggered to generate capacitance value data, from which the touch data is then generated; if there is no interrupt message, the flow ends.
Step 1102: the processor 110 in the mobile phone 100 sends the position features, pressure, area, and preset finger form category in the touch data to the message queue.
Step 1103: the processor 110 in the mobile phone 100 obtains the finger form category to be responded to in the message queue. The finger form category to be responded to is the most recent finger form category; it may be the preset finger form category or an actual finger form category reported later.
Step 1104: the processor 110 in the mobile phone 100 determines whether the finger form category to be responded to in the message queue has been converted into the actual finger form category. If so, the flow proceeds to step 1106; if not, the flow proceeds to step 1105.
Step 1105: the processor 110 in the mobile phone 100 discards the finger form category to be responded to, generates a new finger form category to be responded to from the actual finger form category at the same position in the message queue, and then the flow proceeds to step 1103.
Step 1106: the processor 110 in the mobile phone 100 waits for the distribution of the position features, pressure, area, and to-be-responded finger form category to the input event receiver.
Step 1107: the processor 110 in the mobile phone 100, through the application 199, performs touch operation analysis on the position features, pressure, and area according to the finger form category to be responded to, and ends the flow.
In some embodiments, fig. 12 shows another method for modifying the preset finger form category according to the comparison between the actual finger form category and the preset finger form category when the application 199 has not yet responded to the touch operation; it includes the following steps:
steps 1201 to 1203 are the same as steps 1101 to 1103.
Step 1204: the processor 110 in the mobile phone 100 determines whether the finger form category to be responded to in the message queue has been converted into the actual finger form category. If so, the flow proceeds to step 1205; if not, the flow proceeds to step 1207.
Step 1205: the processor 110 in the mobile phone 100 waits for the distribution of the position features, pressure, area, and to-be-responded finger form category to the input event receiver.
Step 1206: the processor 110 in the mobile phone 100, through the application 199, performs touch operation analysis on the position features, pressure, and area according to the actual finger form category, and ends the flow.
Step 1207: the processor 110 in the mobile phone 100 modifies the finger form category to be responded to according to the actual finger form category to obtain a new to-be-responded finger form category, and then the flow proceeds to step 1205.
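The message-queue handling of figs. 11 and 12 amounts to checking, just before distribution, whether a pending event's category still needs conversion. A minimal sketch follows; the queue layout and all names are our assumptions:

```python
from collections import deque

def distribute(queue: deque, actual_category, deliver) -> None:
    """Drain the message queue, converting any pending finger form category
    to the actual one before handing each event to the input event receiver."""
    while queue:
        event = queue[0]
        if actual_category is not None and event["category"] != actual_category:
            event["category"] = actual_category   # fig. 12 style: modify in place
        deliver(queue.popleft())
```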
In some embodiments, the touch operation analysis method for the touch screen is further abstracted into a response model comprising a first, faster algorithm and a second, slower algorithm. The response model can also be applied to processing scenarios in which hardware or a sensor uploads raw data that must pass through two algorithms to identify different message types; for example, the categories of data reported by a gyroscope include rotation angle and acceleration.
When the same group of hardware or sensors uploads data, the response model first copies part of the raw data, processes it with the first algorithm to obtain a processing result, and sends the processing result together with a preset message to the message queue for distribution. Meanwhile, the retained raw data is processed by the second algorithm in a separate process to obtain the actual message. If the actual message calculated by the second algorithm differs from the preset message, the preset message calculated and reported according to the first algorithm is corrected. This analysis method increases the reporting speed of the processing result, particularly when the actual message is the same as the preset message, and improves the performance of the hardware or sensor.
For example, in a sensor data reporting scenario, the sensor continuously uploads coordinate features (x, y, z) and a timestamp t; the first algorithm obtains a first rotation angle, or a rough rotation direction, from the coordinate features (x, y, z) within 1 s, and the second algorithm obtains a second rotation angle, or a precise rotation direction, from the coordinate features (x, y, z) within 2 s. While reporting data, the sensor calculates and outputs the first rotation angle or rough rotation direction via the first algorithm, calculates the second rotation angle or precise rotation direction via the second algorithm, compares the two angles (or the rough and precise directions), and corrects the output first rotation angle or rough rotation direction according to the comparison result.
For another example, in a sensor data reporting scenario, the first algorithm acquires the coordinate features (x, y, z) within 2 s and calculates the first rotation angle from them directly, while the second algorithm acquires the same coordinate features, cleans them according to cleaning logic, and then calculates the second rotation angle from the cleaned coordinate features. While reporting data, the sensor calculates and outputs the first rotation angle via the first algorithm, calculates the second rotation angle via the second algorithm, compares the two, and corrects the output first rotation angle according to the comparison result.
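The abstract response model can be sketched as follows: a fast first algorithm publishes a provisional message at once, a slow second algorithm runs on a retained copy of the raw data, and a correction is issued only if the two results differ. The thread-based structure and all names are illustrative assumptions:

```python
import threading

def respond(raw, fast_algo, slow_algo, publish, correct):
    """Publish the fast provisional result immediately; correct it later
    only if the slower, more accurate algorithm disagrees."""
    provisional = fast_algo(raw)
    publish(provisional)                 # preset message, distributed at once
    snapshot = list(raw)                 # retained copy for the slow path
    def slow_path():
        actual = slow_algo(snapshot)
        if actual != provisional:
            correct(actual)              # overwrite the provisional message
    threading.Thread(target=slow_path, daemon=True).start()
```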
In addition, FIG. 13 provides a schematic diagram of another touch operation analysis method. Unlike the method shown in fig. 5, the method of fig. 13 first obtains the capacitance value data, calculates the actual finger form category from that data and the acceleration data obtained by the acceleration sensor, and only then performs touch analysis of the touch operation according to the actual finger form category. As shown in fig. 13, the method specifically includes the following steps:
step 1310: the mobile phone 100 receives, through the touch chip 180M, the capacitance value data generated by the user's touch operation and monitors the touch gesture type of the touch operation through the capacitance value data.
Step 1320: when the mobile phone 100 detects that the touch gesture type is a press-down event, that is, the received capacitance value data is the first frame of capacitance value data generated by the touch operation, it obtains other sensor data through other sensors so as to derive the finger form category of the touch operation from the capacitance value data and the other sensor data.
For example, the other sensing data includes acceleration data acquired by the acceleration sensor 180E, a motion attitude acquired by the gyro sensor, a temperature acquired by the temperature sensor, and the like.
Step 1330: the mobile phone 100 calculates the position features, pressure, area, and finger form category from the capacitance value data and the other sensor data.
Step 1340: the mobile phone 100 performs touch operation analysis on the position characteristics, pressure and area according to the finger form category.
Fig. 14 is a flowchart of an analysis method of a touch operation based on the principle in fig. 13, which specifically includes the following steps:
step 1401: the mobile phone 100 periodically collects capacitance value data at a preset sampling frequency.
Step 1402: the mobile phone 100 determines whether there is an interrupt message according to the capacitance value data and obtains touch data from it. If the touch chip 180M determines from the capacitance value data that there is an interrupt message, it reports the interrupt message and step 1404 is performed; if not, step 1401 is repeated.
Step 1404: the mobile phone 100 determines, according to the touch data, whether the touch gesture type is a press-down event. If so, the flow proceeds to step 1405; if not, the flow ends.
Step 1405: the mobile phone 100 acquires acceleration data by the acceleration sensor 180E.
Step 1406: the mobile phone 100 obtains the finger form category of the touch operation from the touch data and the acceleration data and determines whether it is finger joint. If not, step 1407 is performed; if so, step 1408 is performed.
Step 1407: the mobile phone 100 performs touch operation analysis on the position characteristics, pressure and area according to the finger, and ends the process.
Step 1408: the mobile phone 100 converts the finger category into the finger joint category.
Step 1409: the mobile phone 100 performs touch operation analysis on the position characteristics, pressure and area according to the finger joints, and ends the process.
However, in this analysis method, when the touch gesture type is a press-down event, the position features, pressure, area, and finger form category can be sent to the application 199 only after the finger form category has been determined. Because the calculation of the finger form category is complex, sending these values to the application 199 takes a long time, which lengthens the response time of the application 199 to the touch operation, degrades the touch-tracking performance, and impairs the user's touch experience.
Fig. 15 is a block diagram of a software configuration of the mobile phone 100 according to the embodiment of the present application. The layered architecture divides the software into several layers, each layer having a clear role and division of labor. The layers communicate with each other through a software interface. In some embodiments, the Android system is divided into four layers, an application layer, an application framework layer, an Android runtime (Android runtime) and system library, and a kernel layer from top to bottom.
The application layer may include a series of application packages.
As shown in fig. 15, the application packages may include camera, gallery, calendar, phone call, map, navigation, WLAN, Bluetooth, music, video, short message, and WeChat applications.
The application framework layer provides an Application Programming Interface (API) and a programming framework for applications 199 at the application layer. The application framework layer includes a number of predefined functions.
As shown in FIG. 15, the application framework layers may include a window manager, content provider, view system, phone manager, resource manager, notification manager, and the like.
The window manager is used for managing window programs. It can obtain the size of the display screen, determine whether there is a status bar, lock the screen, capture the screen, and the like.
The content provider is used to store and retrieve data and make such data accessible to the application 199. The data may include video, images, audio, calls made and answered, browsing history and bookmarks, phone books, etc.
The view system includes visual controls such as controls to display text, controls to display pictures, and the like. The view system may be used to build application 199. The display interface may be composed of one or more views. For example, a view showing text and a view showing pictures are included.
The phone manager is used to provide the communication functions of the handset 100.
The resource manager provides various resources, such as localized strings, icons, pictures, layout files, video files, and the like, to the application 199.
The notification manager allows the application 199 to display notification information in a status bar, can be used to convey notification-type messages, can disappear automatically after a short dwell, and does not require user interaction. Such as a notification manager used to notify download completion, message alerts, etc. The notification manager may also be a notification that appears in the form of a chart or scroll bar text at the top status bar of the system, such as a notification of a background running application 199, or a notification that appears in the form of a dialog window on a touch screen. For example, prompting text information in the status bar, sounding a prompt tone, vibrating the electronic device, flashing an indicator light, etc.
The Android Runtime comprises a core library and a virtual machine. The Android runtime is responsible for scheduling and managing an Android system.
The core library comprises two parts: one part contains the functions that the Java language needs to call, and the other part is the core library of Android.
The application layer and the application framework layer run in the virtual machine, which executes their Java files as binary files. The virtual machine performs object life cycle management, stack management, thread management, security and exception management, garbage collection, and the like.
The system library may include a plurality of functional modules. For example: surface Managers (SM), Media Libraries (ML), three-dimensional graphics processing Libraries (e.g., OpenGL ES), 2D graphics engines (e.g., SGL), and the like.
The surface manager is used to manage the display subsystem and provides a fusion of 2D and 3D layers for multiple applications 199.
The media library supports playback and recording of a variety of commonly used audio and video formats, as well as still image files. It can support multiple audio and video encoding formats, such as MPEG4, H.264, MP3, AAC, AMR, JPG, and PNG.
The three-dimensional graphic processing library is used for realizing three-dimensional graphic drawing, image rendering, synthesis, layer processing and the like.
The 2D graphics engine is a drawing engine for 2D drawing.
The kernel layer is a layer between hardware and software. The inner core layer at least comprises a display driver, a camera driver, an audio driver and a sensor driver.
The following describes an exemplary workflow of the software and hardware of the mobile phone 100 in a chat scenario of instant messaging. When the touch sensor 180K receives a touch operation, a corresponding hardware interrupt is issued to the kernel layer, which processes the touch operation into a raw input event (including the touch coordinates, the time stamp of the touch operation, and other information) and stores it in the kernel layer. The application framework layer acquires the raw input event from the kernel layer and identifies the control corresponding to the event. Taking a continuous touch operation whose corresponding control belongs to an application as an example, that application calls the interface of the application framework layer, starts, and then, by calling the display driver, displays the interface that responds to the continuous touch operation.
Reference in the specification to "some embodiments" or "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one example embodiment or technology in accordance with the present disclosure. The appearances of the phrase "in one embodiment" in various places in the specification are not necessarily all referring to the same embodiment.
The disclosure also relates to an apparatus for performing the operations herein. This apparatus may be specially constructed for the required purposes, or it may comprise a general-purpose computer selectively activated or reconfigured by a computer program stored in the computer. Such a computer program may be stored in a computer-readable medium, such as, but not limited to, any type of disk including floppy disks, optical disks, and CD-ROMs, magneto-optical disks, read-only memories (ROMs), random access memories (RAMs), EPROMs, EEPROMs, magnetic or optical cards, application-specific integrated circuits (ASICs), or any type of media suitable for storing electronic instructions, each of which may be coupled to a computer system bus. Further, the computers referred to in the specification may include a single processor or may employ multiple-processor designs for increased computing capability.
The processes and displays presented herein are not inherently related to any particular computer or other apparatus. Various general-purpose systems may also be used with programs in accordance with the teachings herein, or it may prove convenient to construct more specialized apparatus to perform one or more method steps. The structure for a variety of these systems is discussed in the description that follows. In addition, any particular programming language sufficient to implement the techniques and embodiments of the present disclosure may be used. Various programming languages may be used to implement the present disclosure as discussed herein.
Moreover, the language used in the specification has been principally selected for readability and instructional purposes, and may not have been selected to delineate or circumscribe the disclosed subject matter. Accordingly, the present disclosure is intended to be illustrative, but not limiting, of the scope of the concepts discussed herein.

Claims (10)

1. A touch operation analysis method of a touch screen on electronic equipment is characterized by comprising the following steps:
under the condition that it is detected that the touch gesture type of a user on a touch screen on the electronic equipment meets a preset analysis condition, acquiring first capacitance value data generated by the user's touch operation on the touch screen, and starting to calculate the actual finger form category of the user's touch operation based on the first capacitance value data;
under the condition that the actual finger form category has not yet been calculated, analyzing the user touch operation type of the capacitance value data acquired from the capacitance value data queue based on a preset finger form category;
under the condition that the actual finger form category has been calculated and is the same as the preset finger form category, continuing to use the preset finger form category to analyze the user touch operation type of the capacitance value data acquired from the capacitance value data queue;
and under the condition that the actual finger form category has been calculated and differs from the preset finger form category, analyzing the user touch operation type of the capacitance value data acquired from the capacitance value data queue by using the actual finger form category.
2. The method of claim 1, further comprising:
completing the analysis of the touch operation type based on the preset finger form category;
calculating that the actual finger form category differs from the preset finger form category;
and performing a second touch operation type analysis on the acquired capacitance value data by using the actual finger form category.
3. The method of claim 1, wherein the finger morphology categories include fingers and knuckles.
4. The method of claim 3, wherein the predetermined finger shape category is the finger.
5. The method of claim 1, wherein the capacitance value data comprises coordinate features and capacitance features, the coordinate features being used to characterize the position of a touch point, and the capacitance features being used to characterize the capacitance magnitude of the touch point.
6. The method of claim 1, wherein the actual finger morphology category is computed by:
acquiring acceleration data detected by an acceleration sensor of the electronic equipment;
and calculating the actual finger form category based on the acceleration data and the first capacitance value data.
7. The method according to claim 1, wherein the preset analysis condition is that the touch gesture type is a press-down event.
8. The method according to claim 1, wherein, when the finger form category is finger, the user touch operation types include clicking, long-pressing, dragging, and sliding, and, when the finger form category is finger joint, the user touch operation types include circling and double-knocking.
9. An electronic device, comprising:
the touch screen is used for generating capacitance value data in response to a user's touch operation on the touch screen;
a memory storing instructions;
a processor coupled to the memory, wherein the instructions stored in the memory, when executed by the processor, cause the processor of the electronic device to control the touch screen so as to perform the touch operation analysis method of any one of claims 1-8.
10. A readable medium having instructions stored therein which, when executed, cause the touch operation analysis method of any one of claims 1 to 8 to be performed.
CN202110184394.0A 2021-02-08 2021-02-08 Electronic device, method for analyzing touch operation of electronic device, and readable medium Pending CN114911401A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110184394.0A CN114911401A (en) 2021-02-08 2021-02-08 Electronic device, method for analyzing touch operation of electronic device, and readable medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110184394.0A CN114911401A (en) 2021-02-08 2021-02-08 Electronic device, method for analyzing touch operation of electronic device, and readable medium

Publications (1)

Publication Number Publication Date
CN114911401A true CN114911401A (en) 2022-08-16

Family

ID=82760947

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110184394.0A Pending CN114911401A (en) 2021-02-08 2021-02-08 Electronic device, method for analyzing touch operation of electronic device, and readable medium

Country Status (1)

Country Link
CN (1) CN114911401A (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2015041332A1 (en) * 2013-09-20 2015-03-26 株式会社デンソーウェーブ Robot maneuvering device, robot system, and robot maneuvering program
CN106415472A (en) * 2015-04-14 2017-02-15 华为技术有限公司 Gesture control method, device, terminal apparatus and storage medium
CN106445120A (en) * 2016-09-05 2017-02-22 华为技术有限公司 Touch operation identification method and apparatus
US20170168586A1 (en) * 2015-12-15 2017-06-15 Purdue Research Foundation Method and System for Hand Pose Detection
CN111356968A (en) * 2017-09-29 2020-06-30 索尼互动娱乐股份有限公司 Rendering virtual hand gestures based on detected hand input

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2015041332A1 (en) * 2013-09-20 2015-03-26 株式会社デンソーウェーブ Robot maneuvering device, robot system, and robot maneuvering program
CN106415472A (en) * 2015-04-14 2017-02-15 华为技术有限公司 Gesture control method, device, terminal apparatus and storage medium
US20170168586A1 (en) * 2015-12-15 2017-06-15 Purdue Research Foundation Method and System for Hand Pose Detection
CN106445120A (en) * 2016-09-05 2017-02-22 华为技术有限公司 Touch operation identification method and apparatus
CN111356968A (en) * 2017-09-29 2020-06-30 索尼互动娱乐股份有限公司 Rendering virtual hand gestures based on detected hand input

Similar Documents

Publication Publication Date Title
JP7391102B2 (en) Gesture processing methods and devices
WO2021063343A1 (en) Voice interaction method and device
EP3979061A1 (en) Quick application starting method and related device
CN113645351B (en) Application interface interaction method, electronic device and computer-readable storage medium
CN111061912A (en) Method for processing video file and electronic equipment
CN110618933B (en) Performance analysis method and system, electronic device and storage medium
US20220150403A1 (en) Input Method and Electronic Device
CN111147660B (en) Control operation method and electronic equipment
CN113132526B (en) Page drawing method and related device
CN112130714B (en) Keyword search method capable of learning and electronic equipment
WO2022100221A1 (en) Retrieval processing method and apparatus, and storage medium
CN116069212B (en) Quick looking-up method for application card, electronic equipment and storage medium
CN114816610B (en) Page classification method, page classification device and terminal equipment
US20230168784A1 (en) Interaction method for electronic device and electronic device
WO2022194190A1 (en) Method and apparatus for adjusting numerical range of recognition parameter of touch gesture
CN109062648B (en) Information processing method and device, mobile terminal and storage medium
CN114371985A (en) Automated testing method, electronic device, and storage medium
CN114564101A (en) Three-dimensional interface control method and terminal
CN113168257A (en) Method for locking touch operation and electronic equipment
CN114911401A (en) Electronic device, method for analyzing touch operation of electronic device, and readable medium
CN116028148A (en) Interface processing method and device and electronic equipment
CN115016712A (en) Method and device for exiting two-dimensional code
CN114911400A (en) Method for sharing pictures and electronic equipment
CN115131789A (en) Character recognition method, character recognition equipment and storage medium
CN113448658A (en) Screen capture processing method, graphical user interface and terminal

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination