CN112987958B - Touch signal processing method and device - Google Patents

Touch signal processing method and device

Info

Publication number
CN112987958B
Authority
CN
China
Prior art keywords
touch area
finger
touch
frame
target
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201911300025.2A
Other languages
Chinese (zh)
Other versions
CN112987958A (en)
Inventor
唐矩
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Xiaomi Mobile Software Co Ltd
Original Assignee
Beijing Xiaomi Mobile Software Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Xiaomi Mobile Software Co Ltd
Priority to CN201911300025.2A
Publication of CN112987958A
Application granted
Publication of CN112987958B
Legal status: Active


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0416 Control or interface arrangements specially adapted for digitisers
    • G06F3/044 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means, by capacitive means
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, using a touch-screen or digitiser, e.g. input of commands through traced gestures

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The disclosure relates to a touch signal processing method, comprising: determining the shape of at least one touch area generating a touch signal; determining, according to the shape, a target touch area acted on by a finger among the at least one touch area; determining, according to the relationship between the shape and a preset structure in the terminal, whether the finger acting on the target touch area is a left-hand finger or a right-hand finger; and, in the case that the finger acting on the target touch area is a left-hand finger, if the distance between the target touch area and a first frame of the terminal is greater than a first preset distance, ignoring touch signals generated by other touch areas whose distance to the first frame is smaller than a second preset distance. According to the embodiments of the disclosure, on the basis of avoiding operations triggered by false touches, screen operations near the frame can still be responded to in time when the user has not falsely touched the screen near the frame, ensuring a good user experience.

Description

Touch signal processing method and device
Technical Field
The disclosure relates to the field of touch technology, and in particular, to a touch signal processing method, a touch signal processing device, an electronic device, and a computer readable storage medium.
Background
At present, electronic devices with a touch function generally sense touch signals through a capacitive touch panel. When a user's hand touches or approaches the touch panel, the hand acts as a conductor that affects the electric field distribution in the panel, changing the capacitance of the capacitive sensors inside; a touch signal is then generated from the change in capacitance.
Under the trend toward full screens, the frames of electronic device screens are becoming narrower and narrower. When a user performs a touch operation on the screen, besides the finger contacting the screen, the palm may approach or even contact the screen near the frame, causing the sensors in the touch panel to generate touch signals there.
However, palm contact near the frame is generally a false touch. In the related art, to avoid operations triggered by false touches, touch signals generated by sensors near the frame are simply ignored. Yet a user may also intentionally operate the screen near the frame; if the touch signals generated by the sensors there are ignored, the user perceives the screen as unresponsive to touch operations.
Disclosure of Invention
The present disclosure provides a touch signal processing method, a touch signal processing apparatus, an electronic device, and a computer-readable storage medium to solve the deficiencies in the related art.
According to a first aspect of the embodiments of the present disclosure, a touch signal processing method is provided, applicable to a terminal, the method comprising:
determining the shape of at least one touch area generating a touch signal;
determining, according to the shape, a target touch area acted on by a finger among the at least one touch area;
determining, according to the relationship between the shape and a preset structure in the terminal, whether the finger acting on the target touch area is a left-hand finger or a right-hand finger;
in the case that the finger acting on the target touch area is a left-hand finger, if the distance between the target touch area and a first frame of the terminal is greater than a first preset distance, ignoring touch signals generated by other touch areas whose distance to the first frame is smaller than a second preset distance;
and in the case that the finger acting on the target touch area is a right-hand finger, if the distance between the target touch area and a second frame of the terminal is greater than the first preset distance, ignoring touch signals generated by other touch areas whose distance to the second frame is smaller than the second preset distance.
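The two branches above are symmetric once all distances are measured against the frame relevant to the detected hand (the first frame for a left-hand finger, the second frame for a right-hand finger). A minimal Python sketch of this decision rule; the function name and data layout are illustrative assumptions, not anything specified by the patent:

```python
def areas_to_ignore(target_dist, other_dists, d1, d2):
    """Decide which other touch areas' signals to ignore.

    target_dist: distance from the finger's (target) touch area to the
        relevant frame -- the first frame for a left-hand finger, the
        second frame for a right-hand finger.
    other_dists: distances from the remaining touch areas to that frame.
    d1, d2:      the first and second preset distances.
    Returns the indices of the other touch areas to ignore.
    """
    if target_dist > d1:
        # The finger is far from the frame, so near-frame areas are
        # taken to be palm contact and their signals are ignored.
        return [i for i, d in enumerate(other_dists) if d < d2]
    # The finger is itself near the frame: respond to all areas.
    return []
```

The caller simply passes distances to the first frame when a left-hand finger was detected, and to the second frame for a right-hand finger.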
Optionally, the determining the shape of the at least one touch area generating the touch signal includes:
determining positions of a plurality of sensors generating touch signals;
determining a touch area formed by the connected sensors according to the positions;
and performing curve fitting on the positions of the sensors positioned at the edge of the touch area to determine the shape of the touch area.
Optionally, the determining, according to the shape, the target touch area acted on by the finger among the at least one touch area includes:
determining a touch area whose shape is an ellipse as the target touch area.
Optionally, the determining, according to the relationship between the shape and the preset structure in the terminal, whether the finger acting on the target touch area is a left-hand finger or a right-hand finger includes:
calculating an included angle between a major axis or a minor axis of the ellipse and at least one frame of the terminal;
and determining, according to the included angle, whether the finger acting on the target touch area is a left-hand finger or a right-hand finger.
Optionally, the method further comprises:
in the case that the finger acting on the target touch area is a left-hand finger, if the distance between the target touch area and the first frame of the terminal is smaller than the second preset distance, responding to the touch signal generated by the target touch area;
and in the case that the finger acting on the target touch area is a right-hand finger, if the distance between the target touch area and the second frame of the terminal is smaller than the second preset distance, responding to the touch signal generated by the target touch area.
According to a second aspect of an embodiment of the present disclosure, a touch signal processing apparatus is provided, which is applicable to a terminal, and the apparatus includes:
a shape determining module configured to determine a shape of at least one touch area generating a touch signal;
an area determining module configured to determine, according to the shape, a target touch area acted on by a finger among the at least one touch area;
a finger determining module configured to determine, according to the relationship between the shape and a preset structure in the terminal, whether the finger acting on the target touch area is a left-hand finger or a right-hand finger;
a signal processing module configured to, in the case that the finger acting on the target touch area is a left-hand finger, if the distance between the target touch area and the first frame of the terminal is greater than the first preset distance, ignore touch signals generated by other touch areas whose distance to the first frame is smaller than the second preset distance; and, in the case that the finger acting on the target touch area is a right-hand finger, if the distance between the target touch area and the second frame of the terminal is greater than the first preset distance, ignore touch signals generated by other touch areas whose distance to the second frame is smaller than the second preset distance.
Optionally, the shape determining module includes:
a position determination sub-module configured to determine positions of a plurality of sensors generating touch signals;
an area determination sub-module configured to determine, according to the positions, a touch area formed by connected sensors;
a shape determination sub-module configured to perform curve fitting on the positions of the sensors located at the edge of the touch area to determine the shape of the touch area.
Optionally, the area determining module is configured to determine a touch area whose shape is an ellipse as the target touch area.
Optionally, the finger determination module includes:
an included angle calculating sub-module configured to calculate an included angle between a major axis or a minor axis of the ellipse and at least one frame of the terminal;
and a finger determination sub-module configured to determine, according to the included angle, whether the finger acting on the target touch area is a left-hand finger or a right-hand finger.
Optionally, the signal processing module is further configured to, in the case that the finger acting on the target touch area is a left-hand finger, respond to the touch signal generated by the target touch area if the distance between the target touch area and the first frame of the terminal is smaller than the second preset distance; and, in the case that the finger acting on the target touch area is a right-hand finger, respond to the touch signal generated by the target touch area if the distance between the target touch area and the second frame of the terminal is smaller than the second preset distance.
According to a third aspect of embodiments of the present disclosure, there is provided an electronic device, including:
A processor;
A memory for storing processor-executable instructions;
wherein the processor is configured to implement the method of any of the embodiments described above.
According to a fourth aspect of the embodiments of the present disclosure, there is provided a computer-readable storage medium on which a computer program is stored, the program, when executed by a processor, implementing the steps of the method of any of the embodiments described above.
The technical scheme provided by the embodiment of the disclosure can comprise the following beneficial effects:
In the embodiments of the present disclosure, instead of ignoring the touch signals generated by all touch areas close to the frame, a target touch area acted on by a finger may be determined among the touch areas generating touch signals, and it may then be determined whether the finger acting on the target touch area is a left-hand finger or a right-hand finger.
When the finger acting on the target touch area is a left-hand finger and the distance between the target touch area and the first frame of the terminal is greater than the first preset distance, it may be determined that other touch areas whose distance to the first frame is smaller than the second preset distance are areas falsely touched by the palm of the user's left hand, so the touch signals generated by those areas are ignored.
Similarly, when the finger acting on the target touch area is a right-hand finger and the distance between the target touch area and the second frame of the terminal is greater than the first preset distance, it may be determined that other touch areas whose distance to the second frame is smaller than the second preset distance are areas falsely touched by the palm of the user's right hand, so the touch signals generated by those areas are ignored.
That is, according to the embodiments of the present disclosure, a touch signal generated by a touch area close to the frame is ignored only when that area is determined to be touched by the palm, and is responded to normally when it is not. Therefore, on the basis of avoiding operations triggered by false touches, screen operations near the frame can still be responded to in time when the user has not falsely touched the screen, ensuring a good user experience.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the disclosure and together with the description, serve to explain the principles of the disclosure.
Fig. 1 is a schematic flowchart of a touch signal processing method according to an embodiment of the disclosure.
Fig. 2 is a schematic diagram of a touch area shown according to an embodiment of the present disclosure.
Fig. 3 is a schematic diagram of a touch signal according to an embodiment of the disclosure.
Fig. 4 is a schematic diagram illustrating a relationship between a shape of a target touch area and a preset structure in a terminal according to an embodiment of the disclosure.
Fig. 5 is a schematic diagram illustrating a relationship between a shape of another target touch area and a preset structure in a terminal according to an embodiment of the disclosure.
Fig. 6 is a schematic flow chart diagram illustrating another touch signal processing method according to an embodiment of the disclosure.
Fig. 7 is a schematic flow chart diagram illustrating yet another touch signal processing method according to an embodiment of the disclosure.
Fig. 8 is a schematic flow chart diagram illustrating yet another touch signal processing method according to an embodiment of the disclosure.
Fig. 9 is a schematic flow chart diagram illustrating yet another touch signal processing method according to an embodiment of the disclosure.
Fig. 10 is a schematic block diagram of a touch signal processing apparatus according to an embodiment of the disclosure.
Fig. 11 is a schematic block diagram of a shape determination module shown in accordance with an embodiment of the present disclosure.
Fig. 12 is a schematic block diagram of a finger determination module shown in accordance with an embodiment of the present disclosure.
Fig. 13 is a schematic block diagram of an apparatus for touch signal processing, shown in accordance with an embodiment of the present disclosure.
Detailed Description
Reference will now be made in detail to exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, the same numbers in different drawings refer to the same or similar elements, unless otherwise indicated. The implementations described in the following exemplary embodiments do not represent all implementations consistent with the present disclosure. Rather, they are merely examples of apparatus and methods consistent with some aspects of the present disclosure as detailed in the appended claims.
Fig. 1 is a schematic flowchart of a touch signal processing method according to an embodiment of the disclosure. The method shown in the embodiment may be applied to a terminal, where the terminal includes, but is not limited to, an electronic device such as a mobile phone, a tablet computer, a wearable device, a personal computer, and the like.
A touch panel may be disposed in the terminal, and the touch panel may include a plurality of sensors for sensing a touch operation to generate a touch signal, and the sensors may be capacitive sensors, for example, self-capacitive sensors or mutual-capacitive sensors.
The terminal may further be provided with a display panel; the relationship between the display panel and the touch panel can be set as required. For example, the touch panel may be disposed on the display panel to form an OGS (One Glass Solution) structure, or disposed inside the display panel to form an in-cell structure.
As shown in fig. 1, the touch signal processing method may include the following steps:
In step S101, determining the shape of at least one touch area generating a touch signal;
In step S102, determining, according to the shape, a target touch area acted on by a finger among the at least one touch area;
In step S103, determining, according to the relationship between the shape and a preset structure in the terminal, whether the finger acting on the target touch area is a left-hand finger or a right-hand finger;
In step S104, in the case that the finger acting on the target touch area is a left-hand finger, if the distance between the target touch area and a first frame of the terminal is greater than a first preset distance, ignoring touch signals generated by other touch areas whose distance to the first frame is smaller than a second preset distance;
In step S105, in the case that the finger acting on the target touch area is a right-hand finger, if the distance between the target touch area and a second frame of the terminal is greater than the first preset distance, ignoring touch signals generated by other touch areas whose distance to the second frame is smaller than the second preset distance.
In one embodiment, when a user acts on the touch panel, the sensors in the touch panel generate touch signals. From the positions of the sensors generating touch signals, the touch area acted on by the user can be determined, and from the touch area, its shape. The user may act on one or more areas at the same time, so one or more touch areas may generate touch signals.
A user may act on the touch panel (e.g., contact it or come close to it) with a finger, the palm, the face, and so on, and these different body parts have different shapes.
Taking contact with the touch panel as an example, the area where the palm or face contacts the panel, i.e., the touch area, is generally irregular in shape, while the area where a finger contacts the panel is generally regular, such as circular or elliptical.
Therefore, according to the shape, this embodiment can determine, among the at least one touch area, the target touch area acted on by a finger, that is, which of the touch areas are areas where the user generated touch signals with a finger.
Fig. 2 is a schematic diagram of a touch area shown according to an embodiment of the present disclosure.
As shown in fig. 2, for example, a user holds the terminal (e.g., a mobile phone) with the right hand and taps the screen with a right-hand finger (e.g., the thumb). In this case, the part of the thumb connected to the palm approaches or even contacts the screen and thus acts on the touch panel in the screen, so the touch panel generates touch signals both in touch area A contacted by the finger and in touch area B contacted by the palm.
Fig. 3 is a schematic diagram of a touch signal according to an embodiment of the disclosure.
Even when the user does not act on the touch panel, the sensors in the touch panel still generate signals, mainly due to noise; these signals are small, generally with an absolute value within 20 (the unit can be set as required).
When the user acts on the touch panel, for example by operating the screen in the manner shown in fig. 2, the touch panel in the screen may generate the signals shown in fig. 3, where each rectangle corresponds to a sensor and the number in the rectangle is the touch signal value generated by that sensor.
In one embodiment, the shape of a touch area may be determined by curve fitting the positions of the sensors that generate touch signals (specifically, sensors whose generated signal value is greater than a preset value, for example greater than 200 in fig. 3).
The fitted shape of touch area A in fig. 2 is the ellipse shown by a dotted line in fig. 3, and the fitted shape of touch area B in fig. 2 is the triangle shown by a dotted line in fig. 3. Since the touch area formed when a body part acts on the touch panel in a non-false-touch operation is generally circular or elliptical, the elliptical touch area can be determined to be the target touch area acted on by the finger.
After the target touch area is determined, it is known only that its touch signal was generated by a finger of the user, so it still needs to be determined whether the finger acting on the target touch area is specifically a left-hand finger or a right-hand finger.
Fig. 4 is a schematic diagram illustrating a relationship between a shape of a target touch area and a preset structure in a terminal according to an embodiment of the disclosure. Fig. 5 is a schematic diagram illustrating a relationship between a shape of another target touch area and a preset structure in a terminal according to an embodiment of the disclosure.
In one embodiment, whether the finger acting on the target touch area is a left-hand finger or a right-hand finger may be determined according to the relationship between the shape and a preset structure in the terminal. For example, the included angle between the major axis or the minor axis of the ellipse and a frame of the terminal may be calculated (the included angle refers to the angle measured on the screen side, without extending the major or minor axis out of the screen), and the finger acting on the target touch area is then determined to be a left-hand or right-hand finger according to the included angle.
Taking a target touch area whose shape is an ellipse as an example, the major axis of the ellipse may be determined first, and the included angle α between the major axis and the long frame on the right side of the terminal may be calculated. Following the embodiment shown in fig. 2, the user holds the terminal with the right hand and operates with a right-hand finger, so the angle α between the major axis of the ellipse and the right frame of the terminal is acute, as shown in fig. 4. Therefore, when the angle is acute, it can be determined that the finger acting on the target touch area is a right-hand finger.
When the user holds the terminal with the left hand and operates with a left-hand finger, the angle α between the major axis of the ellipse and the right frame of the terminal is obtuse, as shown in fig. 5. Therefore, when the angle is obtuse, it can be determined that the finger acting on the target touch area is a left-hand finger.
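The acute/obtuse test of figs. 4 and 5 can be sketched under one possible coordinate convention, which is an assumption for illustration: x grows rightward, y grows downward, the right long frame runs along the downward direction (0, 1), and the major axis is oriented toward the right frame so the included angle falls in [0, 180] degrees.

```python
import math

def hand_from_major_axis(dx, dy):
    """Classify the acting hand from the included angle between the
    fitted ellipse's major axis (dx, dy) and the right long frame.

    Assumed convention: screen x grows rightward, y grows downward;
    the right frame direction is (0, 1). A vertical axis (dx == 0) is
    ambiguous and falls through arbitrarily.
    """
    if dx < 0 or (dx == 0 and dy < 0):
        dx, dy = -dx, -dy  # point the axis toward the right frame
    alpha = math.degrees(math.acos(dy / math.hypot(dx, dy)))
    # Fig. 4: acute angle -> right-hand finger; fig. 5: obtuse -> left.
    return "right" if alpha < 90 else "left"
```

A right thumb reaching up and to the left produces an axis tilting down-right toward the frame (acute angle); a left thumb produces the mirror tilt (obtuse angle).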
In the case that the finger acting on the target touch area is a right-hand finger, if the distance between the target touch area and the second frame (e.g., the right long frame) of the terminal is large (e.g., greater than the first preset distance), then in order to reach an area far from the second frame with the finger, the palm of the right hand may contact the screen as shown in fig. 2. As a result, besides the elliptical target touch area (contacted by the finger), touch signals are also generated in other touch areas (contacted by the palm) close to the second frame (e.g., at a distance smaller than the second preset distance). In this case, the touch signals generated by the other touch areas can be ignored, avoiding operations triggered by the false touch.
Correspondingly, in the case that the finger acting on the target touch area is a left-hand finger, if the distance between the target touch area and the first frame (e.g., the left long frame) of the terminal is large (e.g., greater than the first preset distance), then in order to reach an area far from the first frame with the finger, the user may contact the screen with the left palm. Besides the elliptical target touch area (contacted by the finger), touch signals are also generated in other touch areas (contacted by the palm) close to the first frame (e.g., at a distance smaller than the second preset distance). In this case, the touch signals generated by the other touch areas can likewise be ignored.
In the embodiments of the present disclosure, instead of ignoring the touch signals generated by all touch areas close to the frame, a target touch area acted on by a finger may be determined among the touch areas generating touch signals, and it may then be determined whether the finger acting on the target touch area is a left-hand finger or a right-hand finger.
When the finger acting on the target touch area is a left-hand finger and the distance between the target touch area and the first frame of the terminal is greater than the first preset distance, it may be determined that other touch areas whose distance to the first frame is smaller than the second preset distance are areas falsely touched by the palm of the user's left hand, so the touch signals generated by those areas are ignored.
Similarly, when the finger acting on the target touch area is a right-hand finger and the distance between the target touch area and the second frame of the terminal is greater than the first preset distance, it may be determined that other touch areas whose distance to the second frame is smaller than the second preset distance are areas falsely touched by the palm of the user's right hand, so the touch signals generated by those areas are ignored.
That is, according to the embodiments of the present disclosure, a touch signal generated by a touch area close to the frame is ignored only when that area is determined to be touched by the palm, and is responded to normally when it is not. Therefore, on the basis of avoiding operations triggered by false touches, screen operations near the frame can still be responded to in time when the user has not falsely touched the screen, ensuring a good user experience.
Fig. 6 is a schematic flow chart diagram illustrating another touch signal processing method according to an embodiment of the disclosure. As shown in fig. 6, the determining the shape of the at least one touch area generating the touch signal includes:
in step S1011, positions of a plurality of sensors that generate touch signals are determined;
in step S1012, determining, according to the positions, a touch area formed by connected sensors;
In step S1013, curve fitting is performed on the positions of the sensors located at the edges of the touch area to determine the shape of the touch area.
In one embodiment, to determine the shape of the touch area, the touch area needs to be determined first.
First, the sensors generating touch signals may be determined, i.e., the sensors whose generated signal value is greater than a preset signal value. For example, in the embodiment shown in fig. 3, a sensor whose signal value is greater than 200 is determined to be a sensor generating a touch signal.
The positions of the sensors generating touch signals may then be determined, for example in terms of the row and column each sensor occupies in the touch panel.
Further, a touch area formed by connected sensors can be determined according to the positions, where connected sensors are sensors in adjacent positions. For example, for three sensors of which the first is in the third row and fourth column, the second is in the third row and fifth column, and the third is in the fourth row and fourth column, it can be determined that the three sensors are connected.
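The row/column adjacency described above amounts to a 4-connected flood fill over the sensor grid. A sketch; the threshold of 200 follows the fig. 3 discussion, and the grid layout and function name are illustrative assumptions:

```python
from collections import deque

def touch_regions(grid, threshold=200):
    """Group sensors whose signal exceeds `threshold` into connected
    touch areas using 4-connectivity (same-row/column neighbours, as
    in the adjacency example above). Returns a list of regions, each
    a list of (row, col) sensor positions."""
    rows, cols = len(grid), len(grid[0])
    seen, regions = set(), []
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] > threshold and (r, c) not in seen:
                queue, region = deque([(r, c)]), []
                seen.add((r, c))
                while queue:  # breadth-first traversal of the region
                    cr, cc = queue.popleft()
                    region.append((cr, cc))
                    for nr, nc in ((cr - 1, cc), (cr + 1, cc),
                                   (cr, cc - 1), (cr, cc + 1)):
                        if (0 <= nr < rows and 0 <= nc < cols
                                and grid[nr][nc] > threshold
                                and (nr, nc) not in seen):
                            seen.add((nr, nc))
                            queue.append((nr, nc))
                regions.append(region)
    return regions
```

Applied to a grid like fig. 3, this separates the finger's region from the palm's region in one pass.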
Finally, curve fitting may be performed on the positions of the sensors located at the edges of the touch area to determine the shape of the touch area, for example, in the embodiment shown in fig. 3, there are two touch areas.
For the central touch area, the signal values of the sensors located at the edge of the touch area are 307, 981, 268, 306, 1119, 475, 333, 231, 555, 430, 681, 552, 200, 334 and 1076; curve fitting the positions of these sensors yields an approximately elliptical shape.
For the lower-right touch area, the signal values of the sensors located at the edge of the touch area are 231, 225, 543, 486, 653, 322, 258, 236, 221, 334, 580, 951, 732 and 555; curve fitting the positions of these sensors yields an approximately triangular shape.
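The curve fitting operates on the sensors at the edge of an area, so those edge sensors can be picked out first. A minimal sketch, again assuming 4-neighbour adjacency (the function name is illustrative):

```python
def edge_sensors(area):
    """Return the sensors on the boundary of a touch area.

    A sensor is an edge sensor when at least one of its 4-neighbours
    lies outside the area; the positions of these sensors are what the
    curve fitting step operates on.
    """
    edge = set()
    for r, c in area:
        for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
            if (nr, nc) not in area:
                edge.add((r, c))
                break
    return edge
```

For a solid 3x3 block of sensors, only the centre sensor is excluded, leaving the eight boundary sensors.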
Fig. 7 is a schematic flow chart diagram illustrating yet another touch signal processing method according to an embodiment of the disclosure. As shown in fig. 7, determining, in the at least one touch area, a target touch area acted on by a finger according to the shape includes:
In step S1021, the touch area whose shape is an ellipse is determined to be the target touch area.
In one embodiment, when no false touch occurs the user typically touches the screen with a finger, and the area a finger forms on the screen is typically circular or elliptical. When the shape is elliptical, the user is typically holding the terminal with one hand and tapping the screen with the thumb of that hand; in this case there is a relatively high probability that the palm also touches the screen by mistake, so it is necessary to determine which touch areas are target touch areas, so that touch signals generated by other touch areas can subsequently be ignored.
When the shape is circular, the user is typically holding the terminal with one hand and tapping the screen with a finger of the other hand; in this case the palm generally does not touch the screen by mistake, so there is no need to determine target touch areas for the purpose of ignoring touch signals generated by other touch areas.
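The circle-versus-ellipse decision can be illustrated by comparing the spread of the fitted outline's points along its two principal axes; a near-circular outline has roughly equal spread in both. The 1.3 axis-ratio threshold and the function name below are assumptions for illustration, not values from the disclosure:

```python
import math

def classify_shape(points, ratio_threshold=1.3):
    """Crude circle-vs-ellipse test over a touch outline's edge points.

    Computes the 2x2 covariance matrix of the points; the square roots
    of its eigenvalue ratio approximate the major/minor axis ratio of
    the fitted ellipse.
    """
    n = len(points)
    mr = sum(p[0] for p in points) / n
    mc = sum(p[1] for p in points) / n
    srr = sum((p[0] - mr) ** 2 for p in points) / n
    scc = sum((p[1] - mc) ** 2 for p in points) / n
    src = sum((p[0] - mr) * (p[1] - mc) for p in points) / n
    # eigenvalues of [[srr, src], [src, scc]]: variance along each axis
    t, d = srr + scc, srr * scc - src * src
    root = math.sqrt(max(t * t / 4.0 - d, 0.0))
    lam_max, lam_min = t / 2.0 + root, t / 2.0 - root
    if lam_min <= 1e-12:
        return "degenerate"
    ratio = math.sqrt(lam_max / lam_min)  # major/minor axis length ratio
    return "ellipse" if ratio >= ratio_threshold else "circle"
```

Points sampled evenly on a circle classify as "circle"; stretching them along one axis flips the result to "ellipse".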
Fig. 8 is a schematic flow chart diagram illustrating yet another touch signal processing method according to an embodiment of the disclosure. As shown in fig. 8, the determining, according to the relationship between the shape and the preset structure in the terminal, that the finger acting on the target touch area is a left finger or a right finger includes:
in step S1031, an included angle between the major axis or the minor axis of the ellipse and at least one frame of the terminal is calculated;
in step S1032, the finger acting on the target touch area is determined to be a left-hand finger or a right-hand finger according to the included angle.
In one embodiment, whether the finger acting on the target touch area is a left-hand or right-hand finger may be determined from the relationship between the shape and a preset structure in the terminal. For example, the included angle between the major axis or the minor axis of the ellipse and a frame of the terminal may be calculated (the included angle being measured on the screen side, without extending the axis beyond the frame), and the finger acting on the target touch area is then determined to be a left-hand or right-hand finger according to the included angle.
For example, when the shape of the target touch area is an ellipse, the major axis of the ellipse can be determined first, and the included angle α between the major axis and the long frame on the right side of the terminal is then calculated. In the embodiment shown in fig. 2, the user holds the terminal in the right hand and operates it with the right thumb, so the included angle α between the major axis of the ellipse and the right side frame of the terminal is acute, as shown in fig. 4; therefore, when the included angle is acute, it can be determined that the finger acting on the target touch area is a right-hand finger.
Conversely, when the user holds the terminal in the left hand and operates it with a left-hand finger, the included angle α between the major axis of the ellipse and the right side frame of the terminal is obtuse, as shown in fig. 5; therefore, when the included angle is obtuse, it can be determined that the finger acting on the target touch area is a left-hand finger.
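This acute/obtuse test can be sketched as follows. The coordinate convention is an illustrative assumption: x grows to the right, y grows upward, and the right side frame runs along the +y direction; the major axis is treated as an undirected line, so its angle, measured from the frame into the screen, is folded into [0, 180):

```python
import math

def hand_from_major_axis(dx, dy):
    """Classify the acting hand from the fitted ellipse's major axis.

    (dx, dy) is any direction vector along the major axis.  The angle
    is measured counterclockwise from the right frame's +y direction
    into the screen; acute -> right hand (fig. 4), obtuse -> left hand
    (fig. 5).
    """
    angle = math.degrees(math.atan2(-dx, dy)) % 180.0
    return "right" if angle < 90.0 else "left"
```

An axis tilting up-and-left (typical of a right thumb reaching in from the right frame) gives an acute angle; an axis tilting up-and-right gives an obtuse one.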
Fig. 9 is a schematic flow chart diagram illustrating yet another touch signal processing method according to an embodiment of the disclosure. As shown in fig. 9, the method further includes:
in step S106, when the finger acting on the target touch area is a left-hand finger, if the distance between the target touch area and the first frame of the terminal is smaller than a second preset distance, the touch signal generated by the target touch area is responded to;
in step S107, when the finger acting on the target touch area is a right-hand finger, if the distance between the target touch area and the second frame of the terminal is smaller than the second preset distance, the touch signal generated by the target touch area is responded to.
In one embodiment, when the finger acting on the target touch area is a left-hand finger and the distance between the target touch area and the first frame of the terminal is smaller than the second preset distance, the target touch area is an area acted on by a finger rather than an area touched by the palm in error, so the touch signal generated by the target touch area can be responded to.
Accordingly, when the finger acting on the target touch area is a right-hand finger and the distance between the target touch area and the second frame of the terminal is smaller than the second preset distance, the target touch area is likewise an area acted on by a finger rather than an area touched by the palm in error, so the touch signal generated by the target touch area can be responded to.
Thus, even when the target touch area is close to the frame, the touch signal it generates can be responded to, ensuring that touch operations which are not false touches receive a response.
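Combining the ignoring behaviour described earlier with the responding behaviour of steps S106/S107 gives a decision rule per grip-side frame, which can be sketched as below. All names are illustrative, and the choice to respond to everything when the target's distance falls between the two preset distances is a simplifying assumption (the disclosure does not specify that band):

```python
def filter_signals(target_dist, other_dists, first_preset, second_preset):
    """Decide which touch signals are responded to.

    `target_dist` is the distance from the elliptical target touch area
    to the frame on the holding-hand side (the first frame for a
    left-hand finger, the second frame for a right-hand one);
    `other_dists` maps the ids of the other touch areas to their
    distances from that same frame.  Returns the ids responded to.
    """
    responded = ["target"]  # the finger's own area is always responded to
    if target_dist > first_preset:
        # the finger presses well inside the screen: areas hugging the
        # grip-side frame are taken to be the palm and are ignored
        responded += [a for a, d in other_dists.items() if d >= second_preset]
    else:
        # the target itself sits near the frame: treat it as a deliberate
        # edge press and keep every area's signal
        responded += list(other_dists.keys())
    return responded
```

For example, a thumb press far from the grip-side frame causes a frame-hugging area to be dropped as the palm, while a deliberate edge press keeps everything.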
Corresponding to the embodiment of the touch signal processing method, the disclosure further provides an embodiment of the touch signal processing device.
Fig. 10 is a schematic block diagram of a touch signal processing apparatus according to an embodiment of the disclosure. The apparatus shown in this embodiment may be suitable for a terminal, where the terminal includes, but is not limited to, an electronic device such as a mobile phone, a tablet computer, a wearable device, a personal computer, and the like.
As shown in fig. 10, the touch signal processing apparatus may include:
A shape determining module 101 configured to determine a shape of at least one touch area generating a touch signal;
a region determining module 102 configured to determine a target touch region acted on by a finger in the at least one touch region according to the shape;
A finger determining module 103 configured to determine, according to a relationship between the shape and a preset structure in the terminal, whether a finger acting on the target touch area is a left hand finger or a right hand finger;
The signal processing module 104 is configured to: when the finger acting on the target touch area is a left-hand finger, if the distance between the target touch area and the first frame of the terminal is larger than a first preset distance, ignore touch signals generated by other touch areas whose distance from the first frame is smaller than a second preset distance; and when the finger acting on the target touch area is a right-hand finger, if the distance between the target touch area and the second frame of the terminal is larger than the first preset distance, ignore touch signals generated by other touch areas whose distance from the second frame is smaller than the second preset distance.
Fig. 11 is a schematic block diagram of a shape determination module shown in accordance with an embodiment of the present disclosure. As shown in fig. 11, the shape determining module 101 includes:
A position determination sub-module 1011 configured to determine positions of a plurality of sensors generating the touch signal;
a region determining sub-module 1012 configured to determine a touch area formed by connected sensors according to the positions;
A shape determination submodule 1013 configured to curve-fit the positions of the sensors located at the edges of the touch region to determine the shape of the touch region.
Optionally, the area determining module is configured to determine a touch area with an elliptical shape as the target touch area.
Fig. 12 is a schematic block diagram of a finger determination module shown in accordance with an embodiment of the present disclosure. As shown in fig. 12, the finger determination module 103 includes:
an included angle calculating submodule 1031 configured to calculate an included angle between the major axis or the minor axis of the ellipse and at least one frame of the terminal;
and the finger determination submodule 1032 is configured to determine whether the finger acting on the target touch area is a left-hand finger or a right-hand finger according to the included angle.
Optionally, the signal processing module is further configured to: when the finger acting on the target touch area is a left-hand finger, respond to the touch signal generated by the target touch area if the distance between the target touch area and the first frame of the terminal is smaller than the second preset distance; and when the finger acting on the target touch area is a right-hand finger, respond to the touch signal generated by the target touch area if the distance between the target touch area and the second frame of the terminal is smaller than the second preset distance.
The specific manner in which the individual modules perform their operations in the apparatus of the above embodiments has been described in detail in the embodiments of the related methods and will not be elaborated here.
For the apparatus embodiments, since they essentially correspond to the method embodiments, reference may be made to the description of the method embodiments for the relevant points. The apparatus embodiments described above are merely illustrative: the modules described as separate components may or may not be physically separate, and the components shown as modules may or may not be physical modules, i.e., they may be located in one place or distributed over a plurality of network modules. Some or all of the modules may be selected according to actual needs to achieve the objectives of the disclosed solution. Those of ordinary skill in the art can understand and implement the solution without creative effort.
The embodiment of the disclosure also proposes an electronic device, including:
A processor;
A memory for storing processor-executable instructions;
wherein the processor is configured to implement the method of any of the embodiments described above.
Embodiments of the present disclosure also provide a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the steps of the method of any of the embodiments described above.
Fig. 13 is a schematic block diagram illustrating an apparatus 1300 for touch signal processing according to an embodiment of the disclosure. For example, apparatus 1300 may be a mobile phone, computer, digital broadcast terminal, messaging device, game console, tablet device, medical device, exercise device, personal digital assistant, or the like.
Referring to fig. 13, apparatus 1300 may include one or more of the following components: a processing component 1302, a memory 1304, a power component 1306, a multimedia component 1308, an audio component 1310, an input/output (I/O) interface 1312, a sensor component 1314, and a communication component 1316.
The processing component 1302 generally controls overall operation of the apparatus 1300, such as operations associated with display, telephone calls, data communications, camera operations, and recording operations. The processing component 1302 may include one or more processors 1320 to execute instructions to perform all or part of the steps of the methods described above. Further, the processing component 1302 can include one or more modules that facilitate interactions between the processing component 1302 and other components. For example, the processing component 1302 may include a multimedia module to facilitate interaction between the multimedia component 1308 and the processing component 1302.
The memory 1304 is configured to store various types of data to support operations at the apparatus 1300. Examples of such data include instructions for any application or method operating on the apparatus 1300, contact data, phonebook data, messages, pictures, videos, and the like. The memory 1304 may be implemented by any type or combination of volatile or nonvolatile memory devices such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disk.
The power supply assembly 1306 provides power to the various components of the device 1300. The power supply components 1306 may include a power management system, one or more power supplies, and other components associated with generating, managing, and distributing power for the device 1300.
The multimedia component 1308 includes a screen providing an output interface between the device 1300 and the user. In some embodiments, the screen may include a Liquid Crystal Display (LCD) and a Touch Panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive input signals from the user. The touch panel includes one or more touch sensors to sense touches, swipes, and gestures on the touch panel. The touch sensors may sense not only the boundary of a touch or swipe action, but also the duration and pressure associated with the touch or swipe operation. In some embodiments, the multimedia component 1308 includes a front-facing camera and/or a rear-facing camera. When the apparatus 1300 is in an operation mode, such as a photographing mode or a video mode, the front camera and/or the rear camera may receive external multimedia data. Each front camera and rear camera may be a fixed optical lens system or have focal length and optical zoom capabilities.
The audio component 1310 is configured to output and/or input audio signals. For example, the audio component 1310 includes a Microphone (MIC) configured to receive external audio signals when the apparatus 1300 is in an operational mode, such as a call mode, a recording mode, and a voice recognition mode. The received audio signals may be further stored in the memory 1304 or transmitted via the communication component 1316. In some embodiments, the audio component 1310 also includes a speaker for outputting audio signals.
The I/O interface 1312 provides an interface between the processing component 1302 and peripheral interface modules, which may be a keyboard, click wheel, buttons, etc. These buttons may include, but are not limited to: homepage button, volume button, start button, and lock button.
The sensor assembly 1314 includes one or more sensors for providing status assessment of various aspects of the apparatus 1300. For example, the sensor assembly 1314 may detect the open/closed state of the device 1300, the relative positioning of the components, such as the display and keypad of the device 1300, the sensor assembly 1314 may also detect a change in position of the device 1300 or a component of the device 1300, the presence or absence of user contact with the device 1300, the orientation or acceleration/deceleration of the device 1300, and a change in temperature of the device 1300. The sensor assembly 1314 may include a proximity sensor configured to detect the presence of nearby objects in the absence of any physical contact. The sensor assembly 1314 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor assembly 1314 may also include an acceleration sensor, a gyroscopic sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
The communication component 1316 is configured to facilitate communication, wired or wireless, between the apparatus 1300 and other devices. The apparatus 1300 may access a wireless network based on a communication standard, such as WiFi, 2G or 3G, 4G LTE, 5G NR, or a combination thereof. In one exemplary embodiment, the communication component 1316 receives broadcast signals or broadcast related information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the communication component 1316 further includes a Near Field Communication (NFC) module to facilitate short range communications. For example, the NFC module may be implemented based on Radio Frequency Identification (RFID) technology, Infrared Data Association (IrDA) technology, Ultra Wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
In an exemplary embodiment, the apparatus 1300 may be implemented by one or more Application Specific Integrated Circuits (ASICs), digital Signal Processors (DSPs), digital Signal Processing Devices (DSPDs), programmable Logic Devices (PLDs), field Programmable Gate Arrays (FPGAs), controllers, microcontrollers, microprocessors, or other electronic elements for performing the touch signal processing method described in any of the above embodiments.
In an exemplary embodiment, a non-transitory computer-readable storage medium is also provided, such as memory 1304, including instructions executable by processor 1320 of apparatus 1300 to perform the above-described method. For example, the non-transitory computer readable storage medium may be ROM, random Access Memory (RAM), CD-ROM, magnetic tape, floppy disk, optical data storage device, etc.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any adaptations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It is to be understood that the present disclosure is not limited to the precise arrangements and instrumentalities shown in the drawings, and that various modifications and changes may be effected without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.

Claims (12)

1. A touch signal processing method, which is suitable for a terminal, the method comprising:
Determining the shape of at least one touch area generating a touch signal;
Determining a target touch area acted by a finger in the at least one touch area according to the shape;
Determining whether the finger acting on the target touch area is a left hand finger or a right hand finger according to the relation between the shape and a preset structure in the terminal;
when the finger acting on the target touch area is a left-hand finger, if the distance between the target touch area and a first frame of the terminal is larger than a first preset distance, ignoring touch signals generated by other touch areas whose distance from the first frame is smaller than a second preset distance; wherein the first frame is a left side frame;
when the finger acting on the target touch area is a right-hand finger, if the distance between the target touch area and a second frame of the terminal is larger than the first preset distance, ignoring touch signals generated by other touch areas whose distance from the second frame is smaller than the second preset distance; wherein the second frame is a right side frame.
2. The method of claim 1, wherein generating the shape of the at least one touch area of the touch signal comprises:
determining positions of a plurality of sensors generating touch signals;
determining a touch area formed by the connected sensors according to the positions;
and performing curve fitting on the positions of the sensors positioned at the edge of the touch area to determine the shape of the touch area.
3. The method of claim 1, wherein determining a target touch area acted upon by a finger in the at least one touch area according to the shape comprises:
and determining the touch area with the oval shape as the target touch area.
4. The method according to claim 3, wherein determining that the finger acting on the target touch area is a left finger or a right finger according to the relationship between the shape and the preset structure in the terminal comprises:
calculating an included angle between a major axis or a minor axis of the ellipse and at least one frame of the terminal;
and determining that the finger acting on the target touch area is a left hand finger or a right hand finger according to the included angle.
5. The method according to any one of claims 1 to 4, further comprising:
when the finger acting on the target touch area is a left-hand finger, if the distance between the target touch area and the first frame of the terminal is smaller than a second preset distance, responding to a touch signal generated by the target touch area;
and when the finger acting on the target touch area is a right-hand finger, if the distance between the target touch area and the second frame of the terminal is smaller than the second preset distance, responding to a touch signal generated by the target touch area.
6. A touch signal processing apparatus, adapted for a terminal, the apparatus comprising:
a shape determining module configured to determine a shape of at least one touch area generating a touch signal;
the area determining module is configured to determine a target touch area acted by a finger in the at least one touch area according to the shape;
the finger determining module is configured to determine whether the finger acting on the target touch area is a left hand finger or a right hand finger according to the relation between the shape and a preset structure in the terminal;
The signal processing module is configured to: when the finger acting on the target touch area is a left-hand finger, if the distance between the target touch area and the first frame of the terminal is larger than a first preset distance, ignore touch signals generated by other touch areas whose distance from the first frame is smaller than a second preset distance; and when the finger acting on the target touch area is a right-hand finger, if the distance between the target touch area and the second frame of the terminal is larger than the first preset distance, ignore touch signals generated by other touch areas whose distance from the second frame is smaller than the second preset distance; wherein the first frame is a left side frame and the second frame is a right side frame.
7. The apparatus of claim 6, wherein the shape determination module comprises:
a position determination sub-module configured to determine positions of a plurality of sensors generating touch signals;
The area determination submodule is configured to determine a touch area formed by the connected sensors according to the positions;
A shape determination sub-module configured to curve fit the positions of the sensors located at the edges of the touch area to determine the shape of the touch area.
8. The apparatus of claim 6, wherein the region determination module is configured to determine a touch region having an elliptical shape as the target touch region.
9. The apparatus of claim 8, wherein the finger determination module comprises:
an included angle calculating sub-module configured to calculate an included angle of a major axis or a minor axis of the ellipse with at least one frame of the terminal;
And the finger determination submodule is configured to determine whether the finger acting on the target touch area is a left hand finger or a right hand finger according to the included angle.
10. The apparatus according to any one of claims 6 to 9, wherein the signal processing module is further configured to: when the finger acting on the target touch area is a left-hand finger, respond to the touch signal generated by the target touch area if the distance between the target touch area and the first frame of the terminal is smaller than a second preset distance; and when the finger acting on the target touch area is a right-hand finger, respond to the touch signal generated by the target touch area if the distance between the target touch area and the second frame of the terminal is smaller than the second preset distance.
11. An electronic device, comprising:
A processor;
A memory for storing processor-executable instructions;
Wherein the processor is configured to implement the method of any one of claims 1 to 5.
12. A computer readable storage medium, on which a computer program is stored, characterized in that the program, when being executed by a processor, implements the steps of the method of any of claims 1 to 5.
CN201911300025.2A 2019-12-16 2019-12-16 Touch signal processing method and device Active CN112987958B (en)

Publications (2)

Publication Number Publication Date
CN112987958A CN112987958A (en) 2021-06-18
CN112987958B true CN112987958B (en) 2024-04-19


Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2016130963A (en) * 2015-01-14 2016-07-21 シャープ株式会社 Electronic device with touch panel and program for controlling the electronic device
CN105867654A (en) * 2016-03-25 2016-08-17 乐视控股(北京)有限公司 Method for adjusting focal length of camera and terminal
CN106293444A (en) * 2015-06-25 2017-01-04 小米科技有限责任公司 Mobile terminal, display control method and device
CN106406621A (en) * 2016-10-10 2017-02-15 努比亚技术有限公司 Mobile terminal and method of processing touch control operation for mobile terminal
CN106445238A (en) * 2016-10-17 2017-02-22 北京小米移动软件有限公司 Edge touch inhibiting method and device
CN106775302A (en) * 2016-11-29 2017-05-31 努比亚技术有限公司 A kind of terminal screen anti-error-touch device and method
CN106855783A (en) * 2016-12-16 2017-06-16 广东欧珀移动通信有限公司 A kind of method of false-touch prevention, device and mobile terminal
CN107257949A (en) * 2015-02-26 2017-10-17 三星电子株式会社 Processing method of touch and the electronic equipment for supporting this method
CN107577372A (en) * 2017-09-06 2018-01-12 广东欧珀移动通信有限公司 Edge touch control method, device and mobile terminal
CN108108683A (en) * 2017-12-14 2018-06-01 北京小米移动软件有限公司 Touch-control response method, mobile terminal and storage medium
CN108920081A (en) * 2018-06-28 2018-11-30 努比亚技术有限公司 A kind of interaction regulation method, equipment and computer readable storage medium
CN109002223A (en) * 2018-08-29 2018-12-14 维沃移动通信有限公司 A kind of touch interface display methods and mobile terminal
CN109213349A (en) * 2017-06-30 2019-01-15 北京小米移动软件有限公司 Exchange method and device, computer readable storage medium based on touch screen

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5810923B2 (en) * 2012-01-06 2015-11-11 富士通株式会社 Input device and touch position calculation method
US10209816B2 (en) * 2013-07-04 2019-02-19 Samsung Electronics Co., Ltd Coordinate measuring apparatus for measuring input position of a touch and a coordinate indicating apparatus and driving method thereof
CN105487805B (en) * 2015-12-01 2020-06-02 小米科技有限责任公司 Object operation method and device




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant