CN113204293A - Touch sensing processing method, touch sensing processing device, medium and electronic device - Google Patents

Touch sensing processing method, touch sensing processing device, medium and electronic device

Info

Publication number
CN113204293A
Authority
CN
China
Prior art keywords
touch
event
points
distance threshold
point
Prior art date
Legal status
Granted
Application number
CN202110438102.1A
Other languages
Chinese (zh)
Other versions
CN113204293B (en)
Inventor
古启才
Current Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd filed Critical Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN202110438102.1A priority Critical patent/CN113204293B/en
Publication of CN113204293A publication Critical patent/CN113204293A/en
Application granted granted Critical
Publication of CN113204293B publication Critical patent/CN113204293B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Images

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F 3/044 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means, by capacitive means
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F 3/0416 Control or interface arrangements specially adapted for digitisers
    • G06F 3/0418 Control or interface arrangements specially adapted for digitisers for error correction or compensation, e.g. based on parallax, calibration or alignment
    • G06F 3/04186 Touch location disambiguation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 2203/00 Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F 2203/041 Indexing scheme relating to G06F3/041 - G06F3/045
    • G06F 2203/04104 Multi-touch detection in digitiser, i.e. details about the simultaneous detection of a plurality of touching locations, e.g. multiple fingers or pen and finger

Abstract

The disclosure provides a touch sensing processing method, a touch sensing processing device, a computer-readable storage medium and an electronic device, and relates to the field of computer technology. The touch sensing processing method comprises the following steps: determining a distance threshold according to the number of touch points in a historical touch event; acquiring the position of each touch point in the current touch event; and when the distance between two touch points in the current touch event is smaller than the distance threshold, merging the two touch points into one touch point. The method and the device enable effective and accurate merging of touch points.

Description

Touch sensing processing method, touch sensing processing device, medium and electronic device
Technical Field
The present disclosure relates to the field of computer technologies, and in particular, to a touch sensing processing method, a touch sensing processing apparatus, a computer-readable storage medium, and an electronic device.
Background
With the rapid development and widespread use of terminal devices, touch screens have received much attention as an important medium through which users interact with those devices. A capacitive touch screen relies on capacitive sensing: when a user touches the screen surface with a finger or another object, the change in sensing data allows the screen to identify information such as the number and positions of the fingers or touch objects currently performing the operation. As user expectations of terminal devices grow, touch screens are expected to identify multiple touch points simultaneously so as to support multi-finger touch operation. In this case, however, the touch screen may misrecognize the number of touch points; for example, an operation performed with a single finger may be reported as two touch points. The touch points therefore need to be appropriately merged to improve the accuracy of touch operations and the user experience.
Disclosure of Invention
The present disclosure provides a touch sensing processing method, a touch sensing processing apparatus, a computer-readable storage medium, and an electronic device, so as to improve, at least to a certain extent, the situation in the prior art in which touch points cannot be effectively merged.
Additional features and advantages of the disclosure will be set forth in the detailed description which follows, or in part will be obvious from the description, or may be learned by practice of the disclosure.
According to a first aspect of the present disclosure, there is provided a touch sensing processing method, including: determining a distance threshold according to the number of touch points in the historical touch event; acquiring the position of each touch point in the current touch event; and when the distance between two touch points in the current touch event is smaller than the distance threshold value, combining the two touch points into one touch point.
According to a second aspect of the present disclosure, there is provided a touch-sensitive processing apparatus comprising: the distance threshold value determining module is used for determining a distance threshold value according to the number of touch points in the historical touch event; the touch point position acquisition module is used for acquiring the position of each touch point in the current touch event; and the touch point merging module is used for merging the two touch points into one touch point when the distance between the two touch points in the current touch event is smaller than the distance threshold.
According to a third aspect of the present disclosure, a computer-readable storage medium is provided, on which a computer program is stored, which, when being executed by a processor, implements the touch sensing processing method of the first aspect and possible implementations thereof.
According to a fourth aspect of the present disclosure, there is provided an electronic device comprising: a processor; a memory for storing executable instructions of the processor. Wherein the processor is configured to execute the executable instructions to perform the touch sensing processing method of the first aspect and possible implementations thereof.
The technical scheme of the disclosure has the following beneficial effects:
a distance threshold is determined according to the number of touch points in a historical touch event; the position of each touch point in the current touch event is acquired; and when the distance between two touch points in the current touch event is smaller than the distance threshold, the two touch points are merged into one touch point. On one hand, the exemplary embodiment decides whether two touch points need to be merged by comparing the distance between touch points in the current touch event against the distance threshold, which avoids, in a reasonable and effective manner, abnormal touch operation caused by the touch screen misrecognizing the number of touch points, and improves the user experience. On the other hand, the distance threshold is determined according to the number of touch points in the historical touch event and is therefore variable: when the number of touch points in the historical touch event changes, a different distance threshold can be determined. In other words, the criterion for whether the distance between two touch points in the current touch event meets the merging requirement is not fixed; the distance threshold can be adjusted adaptively according to the touch conditions encountered in practice, giving higher flexibility and making the merging of touch points more reliable and accurate.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and together with the description, serve to explain the principles of the disclosure. It is to be understood that the drawings in the following description are merely exemplary of the disclosure, and that other drawings may be derived from those drawings by one of ordinary skill in the art without the exercise of inventive faculty.
Fig. 1 is a block diagram showing an electronic apparatus in the present exemplary embodiment;
FIG. 2 shows a flow diagram of a touch sensing processing method in the present exemplary embodiment;
fig. 3 shows a system structure for sending an interrupt signal in the touch sensing processing method according to the present exemplary embodiment;
FIG. 4 illustrates a sub-flow diagram of a touch sensing processing method in the present exemplary embodiment;
FIG. 5 is a diagram illustrating sensing data of a touch area in the exemplary embodiment;
FIG. 6 illustrates a sub-flow diagram of another touch sensing processing method in the exemplary embodiment;
FIG. 7 is a flow chart illustrating another touch sensing processing method in the exemplary embodiment;
fig. 8 is a block diagram showing a touch-sensitive processing device in the present exemplary embodiment.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. Example embodiments may, however, be embodied in many different forms and should not be construed as limited to the examples set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of example embodiments to those skilled in the art. The described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. In the following description, numerous specific details are provided to give a thorough understanding of embodiments of the disclosure. One skilled in the relevant art will recognize, however, that the subject matter of the present disclosure can be practiced without one or more of the specific details, or with other methods, components, devices, steps, and the like. In other instances, well-known technical solutions have not been shown or described in detail to avoid obscuring aspects of the present disclosure.
Furthermore, the drawings are merely schematic illustrations of the present disclosure and are not necessarily drawn to scale. The same reference numerals in the drawings denote the same or similar parts, and thus their repetitive description will be omitted. Some of the block diagrams shown in the figures are functional entities and do not necessarily correspond to physically or logically separate entities. These functional entities may be implemented in the form of software, or in one or more hardware modules or integrated circuits, or in different networks and/or processor devices and/or microcontroller devices.
An exemplary embodiment of the present disclosure provides an electronic device for implementing a touch sensing processing method, the electronic device including at least a processor and a memory for storing executable instructions of the processor, the processor being configured to execute the touch sensing processing method via execution of the executable instructions.
The structure of the electronic device is exemplarily described below, taking the mobile terminal 100 in fig. 1 as an example. It will be appreciated by those skilled in the art that, apart from components specifically intended for mobile use, the configuration of fig. 1 can also be applied to fixed devices.
As shown in fig. 1, the mobile terminal 100 may specifically include: a processor 110, an internal memory 121, an external memory interface 122, a USB (Universal Serial Bus) interface 130, a charging management Module 140, a power management Module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication Module 150, a wireless communication Module 160, an audio Module 170, a speaker 171, a receiver 172, a microphone 173, an earphone interface 174, a sensor Module 180, a display screen 190, a camera Module 191, an indicator 192, a motor 193, a key 194, and a SIM (Subscriber identity Module) card interface 195.
Processor 110 may include one or more processing units, such as: the Processor 110 may include an AP (Application Processor), a modem Processor, a GPU (Graphics Processing Unit), an ISP (Image Signal Processor), a controller, an encoder, a decoder, a DSP (Digital Signal Processor), a baseband Processor, and/or an NPU (Neural-Network Processing Unit), etc. The encoder may encode (i.e., compress) image or video data; the decoder may decode (i.e., decompress) the codestream data of the image or video to restore the image or video data.
In some embodiments, processor 110 may include one or more interfaces through which connections are made to other components of mobile terminal 100.
The internal memory 121 may be used to store computer-executable program code, which includes instructions. The internal memory 121 may include a volatile memory, a non-volatile memory, and the like. The processor 110 performs the various functional applications and data processing of the mobile terminal 100 by executing instructions stored in the internal memory 121 and/or instructions stored in a memory provided in the processor.
The external memory interface 122 may be used to connect an external memory, such as a Micro SD card, for expanding the storage capability of the mobile terminal 100. The external memory communicates with the processor 110 through the external memory interface 122 to implement data storage functions, such as storing files of music, video, etc.
The USB interface 130 is an interface conforming to the USB standard specification, and may be used to connect a charger to charge the mobile terminal 100, or connect an earphone or other electronic devices.
The charging management module 140 is configured to receive charging input from a charger. While the charging management module 140 charges the battery 142, the power management module 141 may also supply power to the device; the power management module 141 may also monitor the status of the battery.
The wireless communication function of the mobile terminal 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like. The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. The mobile communication module 150 may provide solutions for 2G/3G/4G/5G wireless communication applied on the mobile terminal 100. The wireless communication module 160 may provide wireless communication solutions applied to the mobile terminal 100, including WLAN (Wireless Local Area Network, e.g. Wi-Fi (Wireless Fidelity)) networks, BT (Bluetooth), GNSS (Global Navigation Satellite System), FM (Frequency Modulation), NFC (Near Field Communication), IR (Infrared technology), and the like. The mobile terminal 100 may implement a display function through the GPU, the display screen 190, the AP, and the like, and display a user interface.
The mobile terminal 100 may implement a photographing function through the ISP, the camera module 191, the encoder, the decoder, the GPU, the display screen 190, the AP, and the like, and may also implement an audio function through the audio module 170, the speaker 171, the receiver 172, the microphone 173, the earphone interface 174, the AP, and the like.
The sensor module 180 may include a depth sensor 1801, a pressure sensor 1802, a gyroscope sensor 1803, an air pressure sensor 1804, etc., to implement different sensing functions.
Indicator 192 may be an indicator light that may be used to indicate a state of charge, a change in charge, or a message, missed call, notification, etc. The motor 193 may generate a vibration cue, may also be used for touch vibration feedback, and the like. The keys 194 include a power-on key, a volume key, and the like.
The mobile terminal 100 may support one or more SIM card interfaces 195 for connecting SIM cards to enable telephony and data communications, among other functions.
Exemplary embodiments of the present disclosure provide a touch sensing processing method, which may be applied to a touch terminal that includes a touch device such as a touch screen or a touch pad. The touch terminal may include, but is not limited to, a smart phone, a tablet computer, a notebook computer, a palm computer, a Personal Digital Assistant (PDA), a Portable Media Player (PMP), an electronic reader, a vehicle-mounted device, a navigation device, a wearable device, and other devices including a touch device. The touch device is arranged in the terminal device and can be used to realize interactive operation between the user and the terminal device. The present exemplary embodiment is explained below by taking a touch screen as an example.
A touch screen, also called a "touch control screen" or "touch panel", is an induction-type liquid crystal display device that can be used to receive user input signals. According to the working principle and the transmission medium, touch screens can be divided into several types, including resistive touch screens, capacitive touch screens, infrared touch screens, surface acoustic wave touch screens, and the like; among them, the capacitive touch screen is one of the most widely used types in terminal devices. In the present exemplary embodiment, the touch screen may be a capacitive touch screen, which operates on the principle of capacitive sensing.
Fig. 2 shows an exemplary flow of a touch sensing processing method, including the following steps S210 to S230:
step S210, determining a distance threshold according to the number of touch points in the historical touch event.
The touch event refers to an operation in which the user makes contact with the touch screen, for example a click, press, move or long press on the touch screen. The user may trigger a touch event on the touch screen through a specific object, such as a finger, a stylus or another touch object, to implement a corresponding function. A touch event may generally carry various information, such as the pressing force with which the user presses the touch screen, the number of fingers used for pressing, the pressing area of each finger, the sensing data generated by the pressing area, the pressing time, and the like.
When a user performs a touch operation on the touch screen, the touch screen may continuously acquire data about the operation. A historical touch event refers to a touch event that occurred before the current touch event; specifically, it may be the touch event at any time before the touch event corresponding to the current moment. For example, if the user presses the touch screen for 2 seconds and the current touch event is the pressing operation at the 2nd second, then the pressing operation at the 1st second may be a historical touch event, and so may the pressing operation at the 1.5th second; the time granularity may be divided more finely, which the present disclosure does not specifically limit. Besides determining historical touch events along the time dimension, they may also be determined according to the frequency at which the touch screen acquires touch operations: for example, if touch events are acquired periodically, the current period corresponds to the current touch event, the previous period corresponds to a historical touch event, and so on.
The touch point is a point reflecting which position or area of the touch screen the user operates; in practice, a touch point may be regarded as a sensing area in which the system receives the touch operation. For example, when the user operates with two fingers, the touch points may be the contact points between the two fingers and the touch screen, and the number of touch points is usually 2. In practical applications, because parameters such as the pressing manner, force and angle differ between operations, one finger may also produce two sensing areas. For example, when the user presses the pad of a thumb flat against the touch screen, the pad covers a large area and may be recognized as two or more touch points; the actual number may be determined from the sensing data of the sensing areas actually received by the touch screen.
The distance threshold is the criterion for determining whether different touch points need to be merged. In the present exemplary embodiment, the distance threshold may be determined from historical touch events, so that in different application scenarios the distance threshold is adjusted adaptively, which guarantees a more effective point-merging strategy.
In the present exemplary embodiment, the touch screen may have its own independent memory and processor. When the touch screen receives a touch event input by the user, as shown in fig. 3, the touch screen 310 usually triggers an interrupt signal periodically through a specific pin, such as an INT pin, or over I2C (Inter-Integrated Circuit)/SPI (Serial Peripheral Interface), so that the processor 320 of the touch terminal reads the touch event. In other words, in the present exemplary embodiment the touch screen system may receive the touch event and report it to the touch terminal for processing.
Step S220, obtaining the position of each touch point in the current touch event.
Further, the exemplary embodiment may acquire the position of each touch point in the current touch event. Since a touch point is in effect a point that represents the sensing area in which the current touch event occurs, the touch point may first be determined within the sensing area and its position then obtained. The position of a touch point may be its coordinate on the touch screen, or a coordinate on the sensing interface, such as a coordinate in the grid formed by the receiving channels and the transmitting channels. In this exemplary embodiment, the touch terminal may read the position of the touch point directly from the touch screen system, or the touch screen system may determine the position of the touch point and return it to the touch terminal.
In an exemplary embodiment, as shown in fig. 4, the touch sensing processing method may further include the following steps:
step S410, acquiring a plurality of touch areas in the current touch event;
in step S420, a touch point is determined in each touch area, and the position of the touch point is obtained.
Generally, when a user operates the touch screen, the touch screen generates corresponding sensing areas according to the touch object; for example, when the user operates with two fingers, two or more corresponding sensing areas may be generated. A sensing area contains sensing data, for example the sensing capacitance values produced by the corresponding pressing area, and the sensing capacitance values differ from position to position; these sensing areas are the touch areas. In the present exemplary embodiment, one touch point may be determined in each touch area to characterize that touch area. Specifically, the touch point of a touch area may be determined in multiple ways: it may be any point in the sensing area, a point calculated from the sensing data of the points in the sensing area, the point at which the sensing data of the area peaks, and so on. The way the touch point is determined may be the same or different for different touch areas, which the present disclosure does not specifically limit.
Specifically, in an exemplary embodiment, the determining a touch point in each touch area in step S420 may include:
and determining the point with the highest sensing signal value in each touch area as the touch point of the touch area.
The sensing signal value refers to the sensing data generated after the touch screen receives a touch event input by the user, such as a sensing capacitance value; the point with the highest sensing signal value is then the point with the highest sensing capacitance value. For example, fig. 5 shows a schematic diagram of the sensing capacitance values acquired at different sensing points, arranged in a grid formed by the transmitting channels and the receiving channels, after the touch screen receives a touch event. As shown in fig. 5, the sensing points contained in touch area S and touch area P have different sensing capacitance values, while the sensing capacitance values of the other areas are substantially 0, so it can be considered that no touch operation was performed in those other areas. The present exemplary embodiment may determine the point with the highest sensing capacitance value, i.e. the point with a sensing capacitance value of 50 in touch area S, as the touch point of touch area S, and likewise the point with a sensing capacitance value of 50 in touch area P as the touch point of touch area P. It should be noted that in practical applications the peak sensing capacitance values of different touch areas may differ and are determined according to the actual situation; fig. 5 is only a schematic illustration.
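To make the peak-based selection concrete, the following minimal Python sketch picks the point with the highest sensing signal value in a touch area. The data layout (a touch area as a list of (row, col, sensing_value) samples), the function name and the numbers are assumptions for illustration only and are not taken from the disclosure.

```python
from typing import List, Tuple

# A touch area is assumed here to be a list of (row, col, sensing_value)
# samples taken from the grid formed by the transmitting and receiving channels.
TouchArea = List[Tuple[int, int, float]]

def touch_point_of_area(area: TouchArea) -> Tuple[int, int]:
    """Return the (row, col) of the sample with the highest sensing signal value."""
    row, col, _ = max(area, key=lambda sample: sample[2])
    return row, col

# Example: a touch area whose peak sensing capacitance value is 50 at cell (3, 4).
area_s = [(3, 3, 20.0), (3, 4, 50.0), (4, 4, 35.0)]
print(touch_point_of_area(area_s))  # (3, 4)
```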
In step S230, when it is detected that the distance between two touch points in the current touch event is smaller than the distance threshold, the two touch points are merged into one touch point.
The exemplary embodiment may determine whether to merge two touch points by comparing the distance between them with the distance threshold. In other words, the distance between the two touch points determines whether the touch areas corresponding to them are merged, so that the touch terminal treats them as a single touch area. For example, when the user presses with a thumb, two touch points may be detected even though the user only intends a single-finger touch operation; in that case, based on the above merging judgment, if the merging requirement is met the two touch points can be merged into one touch point, ensuring that the touch terminal correctly recognizes the user's single-finger operation.
Specifically, after two touch areas are merged into one touch area, the touch point corresponding to the merged area may be determined in multiple ways; for example, one of the touch points before merging may be used as the merged touch point, or the merged touch point may be calculated from the two touch points before merging, and so on, which the present disclosure does not limit.
In addition, in the present exemplary embodiment, the distance between two touch points in the current touch event may be determined by first obtaining a first distance between the two touch points along a first axis and a second distance along a second axis, where the first axis is perpendicular to the second axis; for example, the first axis may be horizontal and the second axis vertical. The straight-line distance between the two touch points is then calculated from the first distance and the second distance based on the Pythagorean theorem. Other ways of calculating the distance between two touch points also fall within the scope of the present disclosure.
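A minimal sketch of this distance calculation, assuming touch point positions are given as (x, y) coordinates; the function name is hypothetical and the example values are arbitrary.

```python
import math

def touch_point_distance(p1, p2):
    """Straight-line distance between two touch points given as (x, y) coordinates.

    The first-axis (e.g. horizontal) distance and the second-axis (e.g. vertical)
    distance are combined with the Pythagorean theorem, as described above.
    """
    first_axis = abs(p1[0] - p2[0])
    second_axis = abs(p1[1] - p2[1])
    return math.hypot(first_axis, second_axis)

print(touch_point_distance((0, 0), (3, 4)))  # 5.0
```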
In an exemplary embodiment, the step S230 may include the following steps:
when the distance between any two touch points in the multiple touch points in the current touch event is smaller than the distance threshold, combining the multiple touch points into one touch point.
In practical applications, more than two touch points may be detected; in that case it is determined whether the multiple touch points should be merged so as to obtain a single touch point.
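The following sketch shows one possible way to collapse multiple close touch points. The greedy strategy and the choice of keeping the first point of each group as the merged touch point are assumptions for illustration, not requirements of the disclosure.

```python
import math

def merge_close_points(points, distance_threshold):
    """Greedily merge touch points that lie closer together than distance_threshold.

    Each retained point represents all later points that fall within the threshold
    of it, i.e. one of the pre-merge touch points is kept as the merged touch point.
    """
    merged = []
    for point in points:
        if all(math.dist(point, kept) >= distance_threshold for kept in merged):
            merged.append(point)
    return merged

# Two of the three points below are closer than 10 units, so they collapse into one.
print(merge_close_points([(10, 10), (14, 13), (60, 60)], 10.0))  # [(10, 10), (60, 60)]
```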
In summary, in the exemplary embodiment, a distance threshold is determined according to the number of touch points in a historical touch event; the position of each touch point in the current touch event is acquired; and when the distance between two touch points in the current touch event is smaller than the distance threshold, the two touch points are merged into one touch point. On one hand, the exemplary embodiment decides whether two touch points need to be merged by comparing the distance between touch points in the current touch event against the distance threshold, which avoids, in a reasonable and effective manner, abnormal touch operation caused by the touch screen misrecognizing the number of touch points, and improves the user experience. On the other hand, the distance threshold is determined according to the number of touch points in the historical touch event and is therefore variable: when the number of touch points in the historical touch event changes, a different distance threshold can be determined. In other words, the criterion for whether the distance between two touch points in the current touch event meets the merging requirement is not fixed; the distance threshold can be adjusted adaptively according to the touch conditions encountered in practice, giving higher flexibility and making the merging of touch points more reliable and accurate.
In an exemplary embodiment, the step S210 may include:
when the number of touch points in the historical touch event is smaller than a preset number threshold, determining the distance threshold as a first distance threshold;
when the number of touch points in the historical touch event is greater than or equal to a preset number threshold, determining the distance threshold as a second distance threshold;
wherein the first distance threshold is less than the second distance threshold.
The present exemplary embodiment may preset a number threshold for judging whether the number of touch points in the historical touch event is large, so as to decide which distance threshold is used when judging whether the current touch points should be merged. The preset number threshold may be set by the user according to actual needs or the needs of the application scenario. For example, in some social applications scenarios involving touch operations with three or more fingers are rare, so the preset number threshold may be set to 3; in a game application the user may well perform multi-finger touch operations, so the preset number threshold may be set to 4; and so on, which the present disclosure does not specifically limit.
When the number of touch points in the historical touch event is greater than or equal to the preset number threshold, the number of touch points may have been recognized incorrectly. For example, with a preset number threshold of 4, a determination of 4 touch points may mean that the user actually operated with four fingers, or that the user operated with three fingers and one of them was mistakenly recognized as two touch points; in this case the scheme leans toward merging touch points, so a larger second distance threshold can be determined. When the number of touch points in the historical touch event is smaller than the preset number threshold, the recognition of the current number of touch points is considered more reliable, so a smaller first distance threshold can be set, i.e. touch points are merged only when the user's fingers are very close together. The first distance threshold and the second distance threshold may be customized according to actual needs; for example, the first distance threshold may be 5 mm (millimeters) or 10 pixel units and the second distance threshold may be 10 mm or 20 pixel units, which the present disclosure does not specifically limit. On this basis, the present exemplary embodiment can, for different application scenarios, purposefully and flexibly set different distance thresholds for judging whether to merge touch points.
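As an illustration of this threshold selection, a small sketch with hypothetical constants; the concrete values (a preset number threshold of 4, a first distance threshold of 5 mm, a second of 10 mm) follow the examples above and are not fixed by the disclosure.

```python
# Hypothetical values: the disclosure leaves the concrete numbers to the implementer.
PRESET_NUMBER_THRESHOLD = 4        # preset number threshold
FIRST_DISTANCE_THRESHOLD = 5.0     # e.g. 5 mm, used when few touch points were seen
SECOND_DISTANCE_THRESHOLD = 10.0   # e.g. 10 mm, used when many touch points were seen

def distance_threshold_for(historical_touch_point_count: int) -> float:
    """Choose the merge distance threshold from the number of touch points
    in the historical touch event, as described above."""
    if historical_touch_point_count < PRESET_NUMBER_THRESHOLD:
        return FIRST_DISTANCE_THRESHOLD
    return SECOND_DISTANCE_THRESHOLD

print(distance_threshold_for(2))  # 5.0
print(distance_threshold_for(4))  # 10.0
```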
In an exemplary embodiment, as shown in fig. 6, the historical touch event includes a last touch event, and the touch sensing processing method may include the following steps:
step S610, recording the number of touch points of the last touch event through a preset variable;
in step S620, when the current touch event is ended, updating a preset variable according to the number of touch points in the current touch event.
The present exemplary embodiment may set a preset variable, for example Mlast, for recording the number of touch points of the last touch event. The preset variable is updated as the touch events progress in time. For example, suppose the user performs a pressing operation lasting 2 seconds or more, the touch event at the 1st second has 3 touch points, and the touch event at the 2nd second has 2 touch points. While the touch event at the 2nd second is the current touch event, the preset variable Mlast is 3; when the touch event at the 2nd second ends and the next current touch event begins, the touch event at the 2nd second becomes a historical touch event and the preset variable Mlast is updated to 2.
In particular, in an exemplary embodiment, the touch sensing processing method may further include the following steps:
and when the current touch event is a lift-off event, setting the preset variable to 0.
If the current touch event is a lift-off event, for example the user lifts the finger after a pressing operation so that the current touch event corresponds to the lifting action, there is no need to judge whether the touch points of this event should be merged; the preset variable can therefore be set to 0, so that it can be updated for the next touch event.
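A minimal sketch of this bookkeeping for the preset variable; the class and method names are hypothetical.

```python
class TouchPointCounter:
    """Bookkeeping for the preset variable Mlast described above."""

    def __init__(self):
        self.m_last = 0  # preset variable: touch point count of the last touch event

    def on_touch_event_end(self, touch_point_count: int, is_lift_event: bool) -> None:
        # A lift-off event resets the preset variable to 0; otherwise it is
        # updated with the touch point count of the event that just ended.
        self.m_last = 0 if is_lift_event else touch_point_count

counter = TouchPointCounter()
counter.on_touch_event_end(touch_point_count=3, is_lift_event=False)
print(counter.m_last)  # 3
counter.on_touch_event_end(touch_point_count=0, is_lift_event=True)
print(counter.m_last)  # 0
```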
Fig. 7 is a flowchart illustrating another touch sensing processing method in the present exemplary embodiment, which may specifically include the following steps (a combined code sketch is given after the step list):
step S702, acquiring a current touch event;
step S704, judging whether the current touch event is a lift-off event;
step S706, if the current touch event is a lift-off event, setting a preset variable to 0;
step S708, if the current touch event is not a lift-off event, recording the number of touch points of the last touch event through a preset variable;
step S710, judging whether the number of the touch points in the last touch event is less than a preset number threshold;
step S712, if the number of touch points in the last touch event is less than the preset number threshold, determining the distance threshold as a first distance threshold;
step S714, if the number of touch points in the last touch event is greater than or equal to the preset number threshold, determining the distance threshold as a second distance threshold;
step S716, determining a plurality of touch areas in the current touch event, determining a point with the highest sensing signal value in each touch area as a touch point of the touch area, and acquiring the position of the touch point;
step S718, detecting whether the distance between two touch points in the current touch event is less than a distance threshold;
step S720, if the distance between two touch points in the current touch event is smaller than the distance threshold, combining the two touch points into one touch point;
in step S722, if the distance between two touch points in the current touch event is not smaller than the distance threshold, point merging is not performed.
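Putting the steps above together, a combined sketch of the flow of fig. 7 follows. It is illustrative only; the parameter names, default values and the greedy merge strategy are assumptions rather than details taken from the disclosure.

```python
import math

def process_touch_event(touch_areas, m_last, is_lift_event,
                        preset_number_threshold=4,
                        first_distance_threshold=5.0,
                        second_distance_threshold=10.0):
    """One pass over a touch event, loosely mirroring steps S702 to S722.

    touch_areas: list of touch areas, each a list of (x, y, sensing_value) samples.
    m_last: preset variable holding the touch point count of the last touch event.
    Returns (merged touch points, updated preset variable).
    """
    if is_lift_event:                                   # steps S704/S706
        return [], 0

    # Steps S710-S714: pick the distance threshold from the last event's count.
    if m_last < preset_number_threshold:
        threshold = first_distance_threshold
    else:
        threshold = second_distance_threshold

    # Step S716: one touch point per touch area, taken at the peak sensing value.
    points = [max(area, key=lambda s: s[2])[:2] for area in touch_areas]

    # Steps S718-S722: merge touch points closer together than the threshold.
    merged = []
    for point in points:
        if all(math.dist(point, kept) >= threshold for kept in merged):
            merged.append(point)

    return merged, len(merged)                          # variable for the next event

# Example: two touch areas whose peaks are about 4 units apart collapse into one
# point when the previous event reported many (>= 4) touch points.
areas = [[(10, 10, 50.0)], [(13, 13, 45.0)]]
print(process_touch_event(areas, m_last=4, is_lift_event=False))  # ([(10, 10)], 1)
```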
Exemplary embodiments of the present disclosure also provide a touch sensing processing apparatus. As shown in fig. 8, the touch-sensitive processing apparatus 800 may include: a distance threshold determination module 810, configured to determine a distance threshold according to the number of touch points in the historical touch event; a touch point position obtaining module 820, configured to obtain positions of touch points in a current touch event; the touch point merging module 830 is configured to merge two touch points into one touch point when it is detected that a distance between the two touch points in the current touch event is smaller than a distance threshold.
In an exemplary embodiment, the distance threshold determination module includes: the first distance determining unit is used for determining the distance threshold as a first distance threshold when the number of touch points in the historical touch event is smaller than a preset number threshold; when the number of touch points in the historical touch event is greater than or equal to a preset number threshold, determining the distance threshold as a second distance threshold; wherein the first distance threshold is less than the second distance threshold.
In an exemplary embodiment, the historical touch events include a last touch event; the touch-sensitive processing apparatus further includes: the touch point number recording module is used for recording the number of touch points of the last touch event through a preset variable; and the preset variable updating module is used for updating the preset variable according to the number of the touch points in the current touch event when the current touch event is finished.
In an exemplary embodiment, the touch-sensitive processing device further includes: and the zero setting module is used for setting the preset variable to be 0 when the current touch event is a lifting event.
In an exemplary embodiment, the touch point position obtaining module includes: the touch area acquisition unit is used for acquiring a plurality of touch areas in the current touch event; and the touch point position acquisition unit is used for determining a touch point in each touch area and acquiring the position of the touch point.
In an exemplary embodiment, the touch point position obtaining unit is configured to determine a point in each touch area where the sensing signal value is the highest as a touch point of the touch area, and obtain a position of the touch point.
In an exemplary embodiment, the touch point merging module is configured to, when it is detected that the distance between any two of multiple touch points in the current touch event is smaller than the distance threshold, merge the multiple touch points into one touch point.
The specific details of each part in the above device have been described in detail in the method part embodiments, and thus are not described again.
Exemplary embodiments of the present disclosure also provide a computer-readable storage medium, which may be implemented in the form of a program product, including program code, for causing a terminal device to perform the steps according to various exemplary embodiments of the present disclosure described in the above-mentioned "exemplary method" section of this specification, when the program product is run on the terminal device, for example, any one or more of the steps in fig. 2, fig. 4, fig. 6 or fig. 7 may be performed. The program product may employ a portable compact disc read only memory (CD-ROM) and include program code, and may be run on a terminal device, such as a personal computer. However, the program product of the present disclosure is not limited thereto, and in this document, a readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
The program product may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. A readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium include: an electrical connection having one or more wires, a portable disk, a hard disk, a random access memory, a Read Only Memory (ROM), an erasable programmable read only memory (EPROM or flash memory), an optical fiber, a portable compact disc read only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
A computer readable signal medium may include a propagated data signal with readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A readable signal medium may also be any readable medium that is not a readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Program code for carrying out operations of the present disclosure may be written in any combination of one or more programming languages, including object-oriented programming languages such as Java or C++, as well as conventional procedural programming languages such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computing device, partly on the user's device, as a stand-alone software package, partly on the user's computing device and partly on a remote computing device, or entirely on the remote computing device or server. In the case of a remote computing device, the remote computing device may be connected to the user's computing device through any kind of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or may be connected to an external computing device (e.g., through the Internet using an Internet service provider).
As will be appreciated by one skilled in the art, aspects of the present disclosure may be embodied as a system, method or program product. Accordingly, various aspects of the present disclosure may be embodied in the form of: an entirely hardware embodiment, an entirely software embodiment (including firmware, microcode, etc.), or an embodiment combining hardware and software aspects, which may all generally be referred to herein as a "circuit," "module," or "system." Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This disclosure is intended to cover any variations, uses, or adaptations of the disclosure that follow, in general, the principles of the disclosure and include such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It will be understood that the present disclosure is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the present disclosure is to be limited only by the following claims.

Claims (10)

1. A touch sensing processing method, comprising:
determining a distance threshold according to the number of touch points in the historical touch event;
acquiring the position of each touch point in the current touch event;
and when the distance between two touch points in the current touch event is smaller than the distance threshold value, combining the two touch points into one touch point.
2. The method of claim 1, wherein determining a distance threshold based on a number of touch points in the historical touch events comprises:
when the number of touch points in the historical touch event is smaller than a preset number threshold, determining the distance threshold as a first distance threshold;
when the number of touch points in the historical touch event is greater than or equal to the preset number threshold, determining the distance threshold as a second distance threshold;
wherein the first distance threshold is less than the second distance threshold.
3. The method of claim 2, wherein the historical touch events include a last touch event; the method further comprises the following steps:
recording the number of touch points of the last touch event through a preset variable;
and when the current touch event is finished, updating the preset variable according to the number of touch points in the current touch event.
4. The method of claim 3, further comprising:
and when the current touch event is a lift-off event, setting the preset variable to 0.
5. The method of claim 1, wherein obtaining the position of each touch point in the current touch event comprises:
acquiring a plurality of touch areas in a current touch event;
and determining a touch point in each touch area, and acquiring the position of the touch point.
6. The method of claim 5, wherein determining a touch point in each touch area comprises:
and determining the point with the highest sensing signal value in each touch area as the touch point of the touch area.
7. The method of claim 1, wherein when detecting that the distance between two touch points in the current touch event is smaller than the distance threshold, merging the two touch points into one touch point comprises:
and when the distance between any two touch points in the multiple touch points in the current touch event is detected to be smaller than the distance threshold, combining the multiple touch points into one touch point.
8. A touch-sensitive processing apparatus, comprising:
the distance threshold value determining module is used for determining a distance threshold value according to the number of touch points in the historical touch event;
the touch point position acquisition module is used for acquiring the position of each touch point in the current touch event;
and the touch point merging module is used for merging the two touch points into one touch point when the distance between the two touch points in the current touch event is smaller than the distance threshold.
9. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the method of any one of claims 1 to 7.
10. An electronic device, comprising:
a processor;
a memory for storing executable instructions of the processor;
wherein the processor is configured to perform the method of any of claims 1 to 7 via execution of the executable instructions.
CN202110438102.1A 2021-04-22 2021-04-22 Touch sensing processing method, touch sensing processing device, medium and electronic equipment Active CN113204293B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110438102.1A CN113204293B (en) 2021-04-22 2021-04-22 Touch sensing processing method, touch sensing processing device, medium and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110438102.1A CN113204293B (en) 2021-04-22 2021-04-22 Touch sensing processing method, touch sensing processing device, medium and electronic equipment

Publications (2)

Publication Number Publication Date
CN113204293A (en) 2021-08-03
CN113204293B (en) 2023-09-08

Family

ID=77027941

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110438102.1A Active CN113204293B (en) 2021-04-22 2021-04-22 Touch sensing processing method, touch sensing processing device, medium and electronic equipment

Country Status (1)

Country Link
CN (1) CN113204293B (en)


Patent Citations (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100141607A1 (en) * 2008-12-04 2010-06-10 Electronics And Telecommunications Research Institute Apparatus and method for recognizing multi touch point
US20110012864A1 (en) * 2009-07-20 2011-01-20 Asustek Computer Inc. Resistive touch panel and method for detecting touch point type
CN101604220A (en) * 2009-07-20 2009-12-16 华硕电脑股份有限公司 The detection method of electric resistance touch-control panel and contact point form thereof
US20130154983A1 (en) * 2010-07-01 2013-06-20 Flatfrog Laboratories Ab Data processing in relation to a multi-touch sensing apparatus
US20140292701A1 (en) * 2011-10-11 2014-10-02 Flatfrog Laboratories Ab Multi-touch detection in a touch system
US20130234960A1 (en) * 2012-03-07 2013-09-12 Canon Kabushiki Kaisha Information processing apparatus, control method thereof, and storage medium
CN102681738A (en) * 2012-05-04 2012-09-19 敦泰科技有限公司 Multi-point touch identifying method and touch device for mutual capacitance type touch equipment
US20150227258A1 (en) * 2014-02-07 2015-08-13 Industrial Technology Research Institute Touch device, processor and touch signal accessing method thereof
CN103955334A (en) * 2014-04-02 2014-07-30 联想(北京)有限公司 Information processing method and electronic equipment
US20150355772A1 (en) * 2014-06-04 2015-12-10 International Business Machines Corporation Touch prediction for visual displays
CN106484194A (en) * 2015-08-26 2017-03-08 上海精研电子科技有限公司 A kind of infrared touch panel multipoint positioning method
US20170115796A1 (en) * 2015-10-23 2017-04-27 Leauto Intelligent Technology (Beijing) Co. Ltd. Method, device and terminal apparatus for recognizing a multi-finger pinching-in or pinching-out gesture
CN108196708A (en) * 2017-12-19 2018-06-22 迈巨(深圳)科技有限公司 A kind of touch point anti-loss method, electronic equipment and storage medium
US20200233515A1 (en) * 2019-01-23 2020-07-23 Novatek Microelectronics Corp. Touch display device

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115033165A (en) * 2022-06-29 2022-09-09 Oppo广东移动通信有限公司 Touch event processing method and device, storage medium and electronic equipment

Also Published As

Publication number Publication date
CN113204293B (en) 2023-09-08

Similar Documents

Publication Publication Date Title
US9967100B2 (en) Method of controlling power supply for fingerprint sensor, fingerprint processing device, and electronic device performing the same
CN106095284B (en) Electronic device
CN106060378B (en) Apparatus and method for setting photographing module
US20170199712A1 (en) Electronic device and method of recognizing wearing form thereof
KR102180677B1 (en) Method for managing battery of electronic device, and electronic device performing thereof
US20150128068A1 (en) Method for operating message application and electronic device implementing the same
KR102126568B1 (en) Method for processing data and an electronic device thereof
KR20150051663A (en) Method for executing function in response to touch input and electronic device implementing the same
CN110837343B (en) Snapshot processing method and device and terminal
US10432926B2 (en) Method for transmitting contents and electronic device thereof
US20160156214A1 (en) Method for charging control and an electronic device thereof
CN111784335A (en) Analog card management method, analog card management device, storage medium, and electronic apparatus
KR20150051278A (en) Object moving method and electronic device implementing the same
KR20150019813A (en) Method for controlling an content integrity and an electronic device
CN113204293B (en) Touch sensing processing method, touch sensing processing device, medium and electronic equipment
WO2019228370A1 (en) Data processing method and device, mobile terminal and storage medium
CN108932486B (en) Fingerprint matching method and device and electronic device
WO2021254200A1 (en) Page thrashing protection method and apparatus for memory reclaim of operating system
KR20190063177A (en) Electronic device for measuring hydrature using a display and method for the same
CN111556248A (en) Shooting method, shooting device, storage medium and mobile terminal
CN111722903A (en) Data processing method and device, electronic equipment and storage medium
CN111626035A (en) Layout analysis method and electronic equipment
CN111382694A (en) Face recognition method and device and electronic equipment
CN111770484B (en) Analog card switching method and device, computer readable medium and mobile terminal
WO2019228106A1 (en) Press area optimization method and device, mobile terminal and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant