CN112987958A - Touch signal processing method and device

Touch signal processing method and device

Info

Publication number
CN112987958A
Authority
CN
China
Prior art keywords
touch area
touch
finger
target
terminal
Prior art date
Legal status
Granted
Application number
CN201911300025.2A
Other languages
Chinese (zh)
Other versions
CN112987958B (en)
Inventor
唐矩
Current Assignee
Beijing Xiaomi Mobile Software Co Ltd
Original Assignee
Beijing Xiaomi Mobile Software Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Xiaomi Mobile Software Co Ltd
Priority to CN201911300025.2A
Publication of CN112987958A
Application granted
Publication of CN112987958B
Legal status: Active
Anticipated expiration

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 - Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041 - Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0416 - Control or interface arrangements specially adapted for digitisers
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 - Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041 - Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/044 - Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by capacitive means
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The disclosure relates to a touch signal processing method that includes: determining the shape of at least one touch area in which a touch signal is generated; determining, according to the shape, a target touch area acted on by a finger among the at least one touch area; determining, according to the relationship between the shape and a preset structure in the terminal, whether the finger acting on the target touch area is a left-hand finger or a right-hand finger; and, when the finger acting on the target touch area is a left-hand finger, if the distance between the target touch area and a first frame of the terminal is greater than a first preset distance, ignoring the touch signals generated by other touch areas whose distance from the first frame of the terminal is less than a second preset distance. According to the embodiments of the disclosure, false-touch operations are prevented from being triggered while screen operations near the frame still receive a timely response when the user has not touched the screen by mistake, which helps ensure a good user experience.

Description

Touch signal processing method and device
Technical Field
The present disclosure relates to the field of touch technologies, and in particular, to a touch signal processing method, a touch signal processing apparatus, an electronic device, and a computer-readable storage medium.
Background
Electronic devices with a touch function currently use a capacitive touch panel as the structure for sensing touch signals. When a user's hand touches or approaches the touch panel, the hand acts as a conductor and disturbs the electric field distribution in the panel, which changes the capacitance of the capacitive sensors in the panel; a touch signal is then generated from the change in capacitance.
With the trend toward full screens, the frame of the screen of an electronic device is becoming narrower and narrower. When a user performs a touch operation on the screen, not only do the fingers contact the screen, but the palm may also approach or even contact the screen near the frame, causing sensors in the touch panel in the screen to generate touch signals.
A palm touching the screen near the frame is generally a false touch. In the related art, to avoid triggering operations by such false touches, touch signals generated by sensors near the frame are simply ignored. However, this also means that when the user operates the screen near the frame without any false touch, the touch signals generated by the sensors near the frame are discarded as well, so the user perceives the screen as insensitive to touch operations.
Disclosure of Invention
The present disclosure provides a touch signal processing method, a touch signal processing apparatus, an electronic device, and a computer-readable storage medium to address the above shortcomings of the related art.
According to a first aspect of the embodiments of the present disclosure, a touch signal processing method is provided, which is applied to a terminal, and the method includes:
determining a shape of at least one touch area generating a touch signal;
determining a target touch area acted on by a finger in the at least one touch area according to the shape;
determining, according to the relationship between the shape and a preset structure in the terminal, whether the finger acting on the target touch area is a left-hand finger or a right-hand finger;
when the finger acting on the target touch area is a left-hand finger, if the distance between the target touch area and a first frame of the terminal is greater than a first preset distance, ignoring touch signals generated by other touch areas whose distance from the first frame of the terminal is less than a second preset distance;
and when the finger acting on the target touch area is a right-hand finger, if the distance between the target touch area and a second frame of the terminal is greater than the first preset distance, ignoring touch signals generated by other touch areas whose distance from the second frame is less than the second preset distance.
Optionally, determining the shape of the at least one touch area generating the touch signal includes:
determining positions of a plurality of sensors generating touch signals;
determining a touch area formed by the connected sensors according to the position;
and performing curve fitting on the positions of the sensors at the edge of the touch area to determine the shape of the touch area.
Optionally, the determining, according to the shape, a target touch area acted on by a finger in the at least one touch area includes:
and determining a touch area with an elliptical shape as the target touch area.
Optionally, the determining, according to the relationship between the shape and the preset structure in the terminal, that the finger acting on the target touch area is a left-hand finger or a right-hand finger includes:
calculating an included angle between the major axis or the minor axis of the ellipse and at least one frame of the terminal;
and determining that the finger acting on the target touch area is a left-hand finger or a right-hand finger according to the included angle.
Optionally, the method further comprises:
under the condition that the finger acting on the target touch area is a left-hand finger, responding to a touch signal generated by the target touch area if the distance between the target touch area and a first frame of the terminal is less than a second preset distance;
and under the condition that the finger acting on the target touch area is a right-hand finger, responding to a touch signal generated by the target touch area if the distance between the target touch area and a second frame of the terminal is less than a second preset distance.
According to a second aspect of the embodiments of the present disclosure, a touch signal processing apparatus is provided, which is suitable for a terminal, and the apparatus includes:
a shape determination module configured to determine a shape of at least one touch area generating a touch signal;
a region determination module configured to determine a target touch region on which a finger acts among the at least one touch region according to the shape;
the finger determining module is configured to determine that a finger acting on the target touch area is a left-hand finger or a right-hand finger according to the relation between the shape and a preset structure in the terminal;
the signal processing module is configured to, when the finger acting on the target touch area is a left-hand finger, ignore touch signals generated by other touch areas whose distance from the first frame is less than a second preset distance if the distance between the target touch area and the first frame of the terminal is greater than a first preset distance; and, when the finger acting on the target touch area is a right-hand finger, ignore touch signals generated by other touch areas whose distance from the second frame is less than the second preset distance if the distance between the target touch area and the second frame of the terminal is greater than the first preset distance.
Optionally, the shape determination module comprises:
a position determination submodule configured to determine positions of a plurality of sensors generating touch signals;
the area determination submodule is configured to determine a touch area formed by the connected sensors according to the position;
a shape determination sub-module configured to curve fit positions of sensors located at edges of the touch area to determine a shape of the touch area.
Optionally, the area determination module is configured to determine a touch area with an elliptical shape as the target touch area.
Optionally, the finger determination module comprises:
an included angle calculation submodule configured to calculate an included angle between a major axis or a minor axis of the ellipse and at least one frame of the terminal;
and the finger determining submodule is configured to determine that the finger acting on the target touch area is a left-hand finger or a right-hand finger according to the included angle.
Optionally, the signal processing module is further configured to respond to the touch signal generated by the target touch area if a distance between the target touch area and the first frame of the terminal is less than a second preset distance, when the finger acting on the target touch area is a left-hand finger; and under the condition that the finger acting on the target touch area is a right-hand finger, responding to a touch signal generated by the target touch area if the distance between the target touch area and a second frame of the terminal is less than a second preset distance.
According to a third aspect of the embodiments of the present disclosure, an electronic device is provided, including:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to implement the method of any of the above embodiments.
According to a fourth aspect of the embodiments of the present disclosure, a computer-readable storage medium is proposed, on which a computer program is stored, which when executed by a processor implements the steps in the method according to any of the embodiments described above.
The technical scheme provided by the embodiment of the disclosure can have the following beneficial effects:
in the embodiments of the disclosure, the touch signals generated by touch areas close to the frame are not all ignored. Instead, a target touch area acted on by a finger is determined among the touch areas generating touch signals, and it is then determined whether the finger acting on the target touch area is a left-hand finger or a right-hand finger.
When the finger acting on the target touch area is a left-hand finger and the distance between the target touch area and the first frame of the terminal is greater than the first preset distance, the other touch areas whose distance from the first frame is less than the second preset distance can be determined to be areas falsely touched by the palm of the user's left hand, so the touch signals generated by those touch areas are ignored.
Similarly, when the finger acting on the target touch area is a right-hand finger and the distance between the target touch area and the second frame of the terminal is greater than the first preset distance, the other touch areas whose distance from the second frame is less than the second preset distance can be determined to be areas falsely touched by the palm of the user's right hand, so the touch signals generated by those touch areas are ignored.
That is, according to the embodiments of the present disclosure, the touch signal generated in a touch area close to the bezel can be ignored when that area is determined to be touched by the palm, and responded to normally when that area is determined not to be touched by the palm. Therefore, while false-touch triggering is avoided, screen operations near the frame can still be responded to in a timely manner when the user has not touched the screen by mistake, which helps ensure a good user experience.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and together with the description, serve to explain the principles of the disclosure.
Fig. 1 is a schematic flow chart diagram illustrating a touch signal processing method according to an embodiment of the present disclosure.
Fig. 2 is a schematic diagram illustrating a touch area according to an embodiment of the disclosure.
Fig. 3 is a schematic diagram illustrating a touch signal according to an embodiment of the disclosure.
Fig. 4 is a schematic diagram illustrating a relationship between a shape of a target touch area and a preset structure in a terminal according to an embodiment of the disclosure.
Fig. 5 is a schematic diagram illustrating another relationship between the shape of the target touch area and a preset structure in the terminal according to an embodiment of the disclosure.
Fig. 6 is a schematic flow chart diagram illustrating another touch signal processing method according to an embodiment of the present disclosure.
Fig. 7 is a schematic flow chart diagram illustrating yet another touch signal processing method according to an embodiment of the present disclosure.
Fig. 8 is a schematic flow chart diagram illustrating yet another touch signal processing method according to an embodiment of the present disclosure.
Fig. 9 is a schematic flow chart diagram illustrating yet another touch signal processing method according to an embodiment of the present disclosure.
Fig. 10 is a schematic block diagram illustrating a touch signal processing apparatus according to an embodiment of the present disclosure.
FIG. 11 is a schematic block diagram illustrating a shape determination module in accordance with an embodiment of the present disclosure.
FIG. 12 is a schematic block diagram illustrating a finger determination module in accordance with an embodiment of the present disclosure.
Fig. 13 is a schematic block diagram illustrating an apparatus for touch signal processing according to an embodiment of the present disclosure.
Detailed Description
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The implementations described in the exemplary embodiments below are not intended to represent all implementations consistent with the present disclosure. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the present disclosure, as detailed in the appended claims.
Fig. 1 is a schematic flow chart diagram illustrating a touch signal processing method according to an embodiment of the present disclosure. The method shown in the embodiment can be applied to a terminal, which includes but is not limited to a mobile phone, a tablet computer, a wearable device, a personal computer, and other electronic devices.
The terminal may be provided with a touch panel, the touch panel may include a plurality of sensors for sensing touch operation to generate touch signals, and the sensors may be capacitive sensors, such as self-capacitive sensors, or mutual capacitive sensors.
The terminal may further include a display panel, and the relationship between the display panel and the touch panel may be set as needed; for example, the touch panel may be disposed on the display panel to form an OGS (One Glass Solution) structure, or the touch panel may be disposed in the display panel to form an In-Cell structure.
As shown in fig. 1, the touch signal processing method may include the following steps:
in step S101, determining a shape of at least one touch area where a touch signal is generated;
in step S102, determining a target touch area acted on by a finger in the at least one touch area according to the shape;
in step S103, determining that the finger acting on the target touch area is a left-hand finger or a right-hand finger according to a relationship between the shape and a preset structure in the terminal;
in step S104, in a case that the finger acting on the target touch area is a left-hand finger, if a distance between the target touch area and a first frame of the terminal is greater than a first preset distance, ignoring touch signals generated by other touch areas whose distance from the first frame is less than a second preset distance;
in step S105, in a case that the finger acting on the target touch area is a right-hand finger, if the distance between the target touch area and a second frame of the terminal is greater than the first preset distance, ignoring touch signals generated by other touch areas whose distance from the second frame is less than the second preset distance.
In one embodiment, when a user acts on the touch panel, sensors in the touch panel generate touch signals. From the positions of the sensors generating touch signals, the touch area where the user acts on the touch panel can be determined, and the shape of that touch area can then be determined. The user may act on one or more places at the same time, so there may be one or more touch areas in which touch signals are generated.
Since the user may generate touch signals by applying a finger, a palm, the face, etc. to the touch panel (e.g., by touching the panel or coming very close to it), the touch areas produced by different body parts may have different shapes.
Taking contact with the touch panel as an example, the touch area where the palm or the face contacts the panel is generally irregular in shape, whereas the area where a finger contacts the panel is generally regular in shape, e.g., circular or elliptical.
Therefore, in this embodiment the target touch area acted on by a finger can be determined among the at least one touch area according to the shape of the touch area, that is, it can be determined which of the at least one touch area are target touch areas in which the touch signal was generated by the action of a finger.
Fig. 2 is a schematic diagram illustrating a touch area according to an embodiment of the disclosure.
As shown in Fig. 2, for example, a user holds the terminal (e.g., a mobile phone) in the right hand and taps the screen with a finger of that hand (e.g., the thumb). In this case, the part of the hand where the thumb joins the palm is close to or even contacts the screen and therefore also acts on the touch panel in the screen, so the touch panel generates touch signals both in touch area A contacted by the finger and in touch area B contacted by the palm.
Fig. 3 is a schematic diagram illustrating a touch signal according to an embodiment of the disclosure.
Even when the user is not acting on the touch panel, the touch sensors in the panel still generate signals, mainly due to noise. Such signals are small, with an absolute value generally within 20 (the unit can be set as required).
When a user acts on the touch panel, for example, by operating the screen as shown in fig. 2, the touch panel in the screen may generate signals as shown in fig. 3, where each rectangle corresponds to a sensor, and the number in the rectangle is the touch signal generated by the sensor.
In one embodiment, the shape of a touch area may be determined by curve fitting the positions of the sensors generating touch signals (specifically, the sensors whose signal value is greater than a preset value, for example greater than 200 in Fig. 3).
The fitted shape of touch area A in Fig. 2 is the ellipse shown by a dotted line in Fig. 3, and the fitted shape of touch area B in Fig. 2 is the triangle shown by a dotted line in Fig. 3. When a body part acts on the touch panel without any false touch, the shape of the touch area is generally circular or elliptical, so the elliptical touch area can be determined as the target touch area acted on by a finger.
After the target touch area is determined, since the target touch area generates the touch signal due to the finger action of the user, it can be further determined whether the finger acting on the target area is a left-hand finger or a right-hand finger.
Fig. 4 is a schematic diagram illustrating a relationship between a shape of a target touch area and a preset structure in a terminal according to an embodiment of the disclosure. Fig. 5 is a schematic diagram illustrating another relationship between the shape of the target touch area and a preset structure in the terminal according to an embodiment of the disclosure.
In an embodiment, whether the finger acting on the target touch area is a left-hand finger or a right-hand finger may be determined according to the relationship between the shape and a preset structure in the terminal. For example, the included angle between the major axis or the minor axis of the ellipse and a frame of the terminal may be calculated (the angle being measured between the frame and the portion of the axis that remains inside the screen), and the handedness of the finger acting on the target touch area is then determined from that angle.
Taking a target touch area whose shape is an ellipse as an example, the major axis of the ellipse may be determined first, and the included angle α between the major axis and the long frame on the right side of the terminal may then be calculated. In the scenario shown in Fig. 2, the user holds the terminal with the right hand and operates it with a right-hand finger, so the included angle α between the major axis of the ellipse and the right frame of the terminal is acute, as shown in Fig. 4; therefore, when the included angle is acute, the finger acting on the target touch area can be determined to be a right-hand finger.
When the user holds the terminal with the left hand and operates it with a left-hand finger, the included angle α between the major axis of the ellipse and the right frame of the terminal is obtuse, as shown in Fig. 5; therefore, when the included angle is obtuse, the finger acting on the target touch area can be determined to be a left-hand finger.
If the finger acting on the target touch area is a right-hand finger and its distance to a second frame of the terminal (e.g., the right long frame) is large (e.g., greater than the first preset distance), the palm of the right hand may contact the screen, as shown in Fig. 2, because the finger is reaching for an area far from that frame. A touch signal is then also generated in a touch area close to the second frame (e.g., at a distance less than the second preset distance) other than the elliptical target touch area, for example the area contacted by the palm. In this case, the touch signals generated by such other touch areas can be ignored, so that no operation is triggered by the false touch.
Correspondingly, if the finger acting on the target touch area is a left-hand finger and its distance to a first frame of the terminal (e.g., the left long frame) is large (e.g., greater than the first preset distance), the palm of the left hand may contact the screen because the finger is reaching for an area far from that frame. A touch signal is then also generated in a touch area close to the first frame (e.g., at a distance less than the second preset distance) other than the elliptical target touch area, for example the area contacted by the palm. In this case, the touch signals generated by such other touch areas can be ignored, so that no operation is triggered by the false touch.
In the embodiments of the disclosure, the touch signals generated by touch areas close to the frame are not all ignored. Instead, a target touch area acted on by a finger is determined among the touch areas generating touch signals, and it is then determined whether the finger acting on the target touch area is a left-hand finger or a right-hand finger.
When the finger acting on the target touch area is a left-hand finger and the distance between the target touch area and the first frame of the terminal is greater than the first preset distance, the other touch areas whose distance from the first frame is less than the second preset distance can be determined to be areas falsely touched by the palm of the user's left hand, so the touch signals generated by those touch areas are ignored.
Similarly, when the finger acting on the target touch area is a right-hand finger and the distance between the target touch area and the second frame of the terminal is greater than the first preset distance, the other touch areas whose distance from the second frame is less than the second preset distance can be determined to be areas falsely touched by the palm of the user's right hand, so the touch signals generated by those touch areas are ignored.
That is, according to the embodiments of the present disclosure, the touch signal generated in a touch area close to the bezel can be ignored when that area is determined to be touched by the palm, and responded to normally when that area is determined not to be touched by the palm. Therefore, while false-touch triggering is avoided, screen operations near the frame can still be responded to in a timely manner when the user has not touched the screen by mistake, which helps ensure a good user experience.
Fig. 6 is a schematic flow chart diagram illustrating another touch signal processing method according to an embodiment of the present disclosure. As shown in Fig. 6, determining the shape of the at least one touch area generating the touch signal includes:
in step S1011, positions of a plurality of sensors that generate touch signals are determined;
in step S1012, determining a touch area formed by the connected sensors according to the position;
in step S1013, a curve fitting is performed on the positions of the sensors at the edge of the touch area to determine the shape of the touch area.
In one embodiment, in order to determine the shape of the touch area, the touch area needs to be determined first.
First, the sensors generating touch signals may be identified: a sensor is considered to be generating a touch signal when its signal value is greater than a preset value. For example, in the embodiment shown in Fig. 3, a sensor is considered to be generating a touch signal if its signal value is greater than 200.
The positions of the sensors generating touch signals may then be determined, for example by the row and column to which each sensor belongs in the touch panel.
Further, the touch areas formed by connected sensors can be determined from these positions, where connected sensors are sensors at adjacent positions. For example, if a first sensor is at row 3, column 4, a second at row 3, column 5, and a third at row 4, column 4, the three sensors are determined to be connected.
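As an illustration of this grouping step, the sketch below thresholds the per-sensor signal values and collects adjacent active sensors into touch areas with a breadth-first flood fill. It is a minimal sketch, not the patent's implementation: the function name, the plain list-of-lists signal grid, and the threshold of 200 (taken from the Fig. 3 example) are all assumptions.

```python
from collections import deque
from typing import List, Tuple

def find_touch_areas(signal: List[List[int]], threshold: int = 200) -> List[List[Tuple[int, int]]]:
    """Group sensors whose signal exceeds `threshold` into connected touch areas.

    Two sensors are 'connected' when their (row, column) positions are adjacent,
    e.g. (3, 4), (3, 5) and (4, 4) all belong to the same area.
    """
    rows, cols = len(signal), len(signal[0])
    visited = [[False] * cols for _ in range(rows)]
    areas = []
    for r in range(rows):
        for c in range(cols):
            if visited[r][c] or signal[r][c] <= threshold:
                continue
            # Breadth-first flood fill over adjacent active sensors.
            area, queue = [], deque([(r, c)])
            visited[r][c] = True
            while queue:
                cr, cc = queue.popleft()
                area.append((cr, cc))
                for nr, nc in ((cr - 1, cc), (cr + 1, cc), (cr, cc - 1), (cr, cc + 1)):
                    if 0 <= nr < rows and 0 <= nc < cols \
                            and not visited[nr][nc] and signal[nr][nc] > threshold:
                        visited[nr][nc] = True
                        queue.append((nr, nc))
            areas.append(area)
    return areas
```

Applied to the signal grid of Fig. 3, such a routine would return two areas, one for the finger contact and one for the palm contact.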
Finally, the positions of the sensors at the edge of each touch area can be curve-fitted to determine the shape of the touch area. For example, two touch areas are present in the embodiment shown in Fig. 3.
In the central touch area, the signal values of the sensors at the edge of the area are 307, 981, 268, 306, 1119, 475, 333, 231, 555, 430, 681, 552, 200, 334, and 1076; curve fitting the positions of these sensors yields an approximately elliptical shape.
In the lower-right touch area, the signal values of the sensors at the edge of the area are 231, 225, 543, 486, 653, 322, 258, 236, 221, 334, 580, 951, 732, and 555; curve fitting the positions of these sensors yields an approximately triangular shape.
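The curve fitting itself can be realized in many ways. One simple possibility, sketched below, is to take the boundary sensors of an area and compute the principal axes of their positions, which yields an approximating ellipse (center, axis lengths, orientation). This is an assumed realization for illustration only, not a specific fitting procedure prescribed by the patent.

```python
import numpy as np
from typing import List, Tuple

def fit_area_axes(area: List[Tuple[int, int]]) -> Tuple[np.ndarray, float, float, float]:
    """Approximate a touch area by an ellipse via the principal axes of its
    boundary-sensor positions. Returns (center, major_len, minor_len, angle),
    where `angle` is the major-axis direction in radians measured from the
    column (x) axis, with rows (y) increasing downwards."""
    cells = set(area)
    # Boundary sensors: fewer than four 4-neighbours inside the area.
    edge = np.array([p for p in area
                     if sum((p[0] + dr, p[1] + dc) in cells
                            for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1))) < 4],
                    dtype=float)                       # rows of (row, col)
    center = edge.mean(axis=0)
    if len(edge) < 3:
        # Too few boundary points to fit a meaningful ellipse.
        return center, 0.0, 0.0, 0.0
    cov = np.cov((edge - center).T)                    # 2x2 covariance of boundary points
    eigvals, eigvecs = np.linalg.eigh(cov)             # ascending eigenvalues
    eigvals = np.clip(eigvals, 0.0, None)
    minor_len, major_len = 2.0 * np.sqrt(eigvals)      # rough axis scale
    major_dir = eigvecs[:, 1]                          # (row, col) direction of major axis
    angle = float(np.arctan2(major_dir[0], major_dir[1]))
    return center, float(major_len), float(minor_len), angle
```

The axis ratio and the major-axis angle produced here can feed the shape classification and the left/right-hand decision sketched later in this description.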
Fig. 7 is a schematic flow chart diagram illustrating yet another touch signal processing method according to an embodiment of the present disclosure. As shown in fig. 7, determining a target touch area acted on by a finger in the at least one touch area according to the shape includes:
in step S1021, a touch area with an elliptical shape is determined as the target touch area.
In one embodiment, when there is no false touch, the user generally touches the screen with a finger to perform a touch operation, and the shape formed by a finger touching the screen is generally a circle or an ellipse. When the shape is an ellipse, the user is generally holding the terminal with one hand and tapping the screen with the thumb of that hand; in this case the palm may falsely touch the screen, so it is necessary to determine which touch areas are target touch areas in order to subsequently ignore the touch signals generated by the other touch areas.
When the shape is a circle, the user is generally holding the terminal with one hand and tapping the screen with a finger of the other hand; in this case the palm does not falsely touch the screen, so it is not necessary to determine which touch areas are target touch areas for the purpose of subsequently ignoring touch signals generated by other touch areas.
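Following the two cases above, a simple way to pick out the elliptical (thumb-generated) areas from the fitted axes is an axis-ratio test, as sketched below. The ratio thresholds are assumptions chosen for illustration; a fuller implementation would also check how well the boundary actually matches the fitted ellipse so that irregular palm or face areas are rejected.

```python
def is_target_touch_area(major_len: float, minor_len: float,
                         min_ratio: float = 1.3, max_ratio: float = 3.0) -> bool:
    """Treat a fitted touch area as a finger-generated target touch area when it is
    clearly elliptical rather than circular (step S1021, illustrative thresholds)."""
    if minor_len <= 0.0:
        return False
    ratio = major_len / minor_len
    # Ratio close to 1 -> roughly circular: finger of the free hand, no target needed.
    # Moderately elongated -> elliptical: thumb of the holding hand, treat as target.
    return min_ratio <= ratio <= max_ratio
```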
Fig. 8 is a schematic flow chart diagram illustrating yet another touch signal processing method according to an embodiment of the present disclosure. As shown in fig. 8, the determining that the finger acting on the target touch area is a left-hand finger or a right-hand finger according to the relationship between the shape and the preset structure in the terminal includes:
in step S1031, calculating an included angle between the major axis or the minor axis of the ellipse and at least one frame of the terminal;
in step S1032, it is determined that the finger acting on the target touch area is a left-hand finger or a right-hand finger according to the included angle.
In an embodiment, whether the finger acting on the target touch area is a left-hand finger or a right-hand finger may be determined according to the relationship between the shape and a preset structure in the terminal. For example, the included angle between the major axis or the minor axis of the ellipse and a frame of the terminal may be calculated (the angle being measured between the frame and the portion of the axis that remains inside the screen), and the handedness of the finger acting on the target touch area is then determined from that angle.
Taking a target touch area whose shape is an ellipse as an example, the major axis of the ellipse may be determined first, and the included angle α between the major axis and the long frame on the right side of the terminal may then be calculated. In the scenario shown in Fig. 2, the user holds the terminal with the right hand and operates it with a right-hand finger, so the included angle α between the major axis of the ellipse and the right frame of the terminal is acute, as shown in Fig. 4; therefore, when the included angle is acute, the finger acting on the target touch area can be determined to be a right-hand finger.
When the user holds the terminal with the left hand and operates it with a left-hand finger, the included angle α between the major axis of the ellipse and the right frame of the terminal is obtuse, as shown in Fig. 5; therefore, when the included angle is obtuse, the finger acting on the target touch area can be determined to be a left-hand finger.
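A sketch of this angle test follows. It assumes screen coordinates with x (columns) increasing to the right and y (rows) increasing downwards, takes the major-axis angle from the ellipse-fit sketch earlier, and measures the included angle against the right long frame; an acute angle is read as a right-hand thumb and an obtuse angle as a left-hand thumb, as in Figs. 4 and 5. The orientation convention and function name are assumptions.

```python
import math

def classify_hand(major_axis_angle_rad: float) -> str:
    """Steps S1031/S1032 (illustrative): decide left or right hand from the included
    angle between the ellipse's major axis and the right long frame of the terminal."""
    # Unit vector along the major axis (angle measured from the x / column axis).
    dx, dy = math.cos(major_axis_angle_rad), math.sin(major_axis_angle_rad)
    # Keep the half of the axis that points away from the right frame, i.e. the part
    # that stays inside the screen (negative x direction).
    if dx > 0:
        dx, dy = -dx, -dy
    # Included angle between that direction and the upward direction of the right frame.
    cos_included = -dy                      # dot product with the "up" vector (0, -1)
    included_deg = math.degrees(math.acos(max(-1.0, min(1.0, cos_included))))
    # Acute angle (Fig. 4) -> right-hand thumb; obtuse angle (Fig. 5) -> left-hand thumb.
    return "right" if included_deg < 90.0 else "left"
```

With a right-hand thumb reaching up and to the left as in Fig. 4 the included angle comes out near 45 degrees and the function returns "right"; with a left-hand thumb as in Fig. 5 it comes out near 135 degrees and the function returns "left".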
Fig. 9 is a schematic flow chart diagram illustrating yet another touch signal processing method according to an embodiment of the present disclosure. As shown in fig. 9, the method further includes:
in step S106, in a case that the finger acting on the target touch area is a left-hand finger, if a distance between the target touch area and the first frame of the terminal is less than a second preset distance, responding to a touch signal generated by the target touch area;
in step S107, in a case that the finger acting on the target touch area is a right-hand finger, if a distance between the target touch area and a second frame of the terminal is less than a second preset distance, responding to a touch signal generated by the target touch area.
In one embodiment, when the finger acting on the target touch area is a left-hand finger and the distance between the target touch area and the first frame of the terminal is less than the second preset distance, the target touch area is, by its shape, an area acted on by a finger rather than an area falsely touched by the palm, so the touch signal generated by the target touch area can be responded to.
Correspondingly, when the finger acting on the target touch area is a right-hand finger and the distance between the target touch area and the second frame of the terminal is less than the second preset distance, the target touch area is likewise an area acted on by a finger rather than an area falsely touched by the palm, so the touch signal generated by the target touch area can be responded to.
Therefore, even when the target touch area is close to the frame, the touch signal it generates can be responded to, ensuring that touch operations that are not false touches still receive a response.
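Combining the ignore rule of steps S104/S105 with the respond rule of steps S106/S107 gives a per-area decision such as the sketch below. The dictionary keys, the millimetre unit, and the default threshold values are assumptions for illustration; the target area and the handedness would come from the shape-fitting and angle sketches above.

```python
from typing import Dict, List

def filter_touch_areas(areas: List[Dict], target: Dict, hand: str,
                       first_preset_mm: float = 30.0,
                       second_preset_mm: float = 10.0) -> List[Dict]:
    """Return the touch areas whose signals should be responded to.

    Each area is a dict with 'dist_left_mm' and 'dist_right_mm' giving its distance
    to the first (left) and second (right) frame of the terminal (assumed keys/units);
    `target` must be one of the elements of `areas`."""
    key = "dist_left_mm" if hand == "left" else "dist_right_mm"
    kept = []
    for area in areas:
        if area is target:
            # S106/S107: the finger-generated target area is responded to even when
            # it lies close to the frame on the holding-hand side.
            kept.append(area)
        elif target[key] > first_preset_mm and area[key] < second_preset_mm:
            # S104/S105: the finger is far from that frame while this area hugs it,
            # so it is taken as a palm false touch and its signal is ignored.
            continue
        else:
            kept.append(area)
    return kept
```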
Corresponding to the foregoing embodiments of the touch signal processing method, the present disclosure further provides embodiments of a touch signal processing apparatus.
Fig. 10 is a schematic block diagram illustrating a touch signal processing apparatus according to an embodiment of the present disclosure. The apparatus shown in this embodiment may be applied to a terminal, which includes but is not limited to a mobile phone, a tablet computer, a wearable device, a personal computer, and other electronic devices.
As shown in fig. 10, the touch signal processing apparatus may include:
a shape determination module 101 configured to determine a shape of at least one touch area generating a touch signal;
a region determining module 102 configured to determine a target touch region to which a finger acts among the at least one touch region according to the shape;
the finger determining module 103 is configured to determine, according to a relationship between the shape and a preset structure in the terminal, that a finger acting on the target touch area is a left-hand finger or a right-hand finger;
the signal processing module 104 is configured to, when the finger acting on the target touch area is a left-hand finger, ignore touch signals generated by other touch areas whose distance from the first frame is less than a second preset distance if the distance between the target touch area and the first frame of the terminal is greater than a first preset distance; and, when the finger acting on the target touch area is a right-hand finger, ignore touch signals generated by other touch areas whose distance from the second frame is less than the second preset distance if the distance between the target touch area and the second frame of the terminal is greater than the first preset distance.
FIG. 11 is a schematic block diagram illustrating a shape determination module in accordance with an embodiment of the present disclosure. As shown in fig. 11, the shape determination module 101 includes:
a position determination submodule 1011 configured to determine positions of a plurality of sensors that generate touch signals;
a region determination sub-module 1012 configured to determine a touch region formed by the connected sensors according to the position;
a shape determination sub-module 1013 configured to perform curve fitting on the positions of the sensors located at the edges of the touch area to determine the shape of the touch area.
Optionally, the area determination module is configured to determine a touch area with an elliptical shape as the target touch area.
FIG. 12 is a schematic block diagram illustrating a finger determination module in accordance with an embodiment of the present disclosure. As shown in fig. 12, the finger determination module 103 includes:
an included angle calculation submodule 1031 configured to calculate an included angle between a major axis or a minor axis of the ellipse and at least one frame of the terminal;
the finger determination submodule 1032 is configured to determine that the finger acting on the target touch area is a left-hand finger or a right-hand finger according to the included angle.
Optionally, the signal processing module is further configured to respond to the touch signal generated by the target touch area if a distance between the target touch area and the first frame of the terminal is less than a second preset distance, when the finger acting on the target touch area is a left-hand finger; and under the condition that the finger acting on the target touch area is a right-hand finger, responding to a touch signal generated by the target touch area if the distance between the target touch area and a second frame of the terminal is less than a second preset distance.
With regard to the apparatus in the above embodiments, the specific manner in which each module performs operations has been described in detail in the embodiments of the related method, and will not be described in detail here.
For the device embodiments, since they substantially correspond to the method embodiments, reference may be made to the partial description of the method embodiments for relevant points. The above-described embodiments of the apparatus are merely illustrative, wherein the modules described as separate parts may or may not be physically separate, and the parts displayed as modules may or may not be physical modules, may be located in one place, or may be distributed on a plurality of network modules. Some or all of the modules can be selected according to actual needs to achieve the purpose of the disclosed solution. One of ordinary skill in the art can understand and implement it without inventive effort.
An embodiment of the present disclosure also provides an electronic device, including:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to implement the method of any of the above embodiments.
Embodiments of the present disclosure also provide a computer-readable storage medium, on which a computer program is stored, which when executed by a processor implements the steps in the method according to any of the above embodiments.
Fig. 13 is a schematic block diagram illustrating an apparatus 1300 for touch signal processing according to an embodiment of the present disclosure. For example, apparatus 1300 may be a mobile phone, a computer, a digital broadcast terminal, a messaging device, a game console, a tablet device, a medical device, an exercise device, a personal digital assistant, and so forth.
Referring to fig. 13, the apparatus 1300 may include one or more of the following components: a processing component 1302, a memory 1304, a power component 1306, a multimedia component 1308, an audio component 1310, an input/output (I/O) interface 1312, a sensor component 1314, and a communication component 1316.
The processing component 1302 generally controls overall operation of the device 1300, such as operations associated with display, telephone calls, data communications, camera operations, and recording operations. The processing component 1302 may include one or more processors 1320 to execute instructions to perform all or part of the steps of the methods described above. Further, the processing component 1302 can include one or more modules that facilitate interaction between the processing component 1302 and other components. For example, the processing component 1302 may include a multimedia module to facilitate interaction between the multimedia component 1308 and the processing component 1302.
The memory 1304 is configured to store various types of data to support operations at the apparatus 1300. Examples of such data include instructions for any application or method operating on device 1300, contact data, phonebook data, messages, pictures, videos, and so forth. The memory 1304 may be implemented by any type or combination of volatile or non-volatile memory devices such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disks.
Power supply component 1306 provides power to the various components of device 1300. Power components 1306 may include a power management system, one or more power supplies, and other components associated with generating, managing, and distributing power for device 1300.
The multimedia component 1308 includes a screen that provides an output interface between the device 1300 and the user. In some embodiments, the screen may include a Liquid Crystal Display (LCD) and a Touch Panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive an input signal from a user. The touch panel includes one or more touch sensors to sense touches, slides, and gestures on the touch panel. The touch sensors may not only sense the boundary of a touch or slide action, but also detect the duration and pressure associated with the touch or slide operation. In some embodiments, the multimedia component 1308 includes a front-facing camera and/or a rear-facing camera. The front camera and/or the rear camera may receive external multimedia data when the apparatus 1300 is in an operation mode, such as a photographing mode or a video mode. Each front camera and rear camera may be a fixed optical lens system or have focal length and optical zoom capability.
The audio component 1310 is configured to output and/or input audio signals. For example, the audio component 1310 includes a Microphone (MIC) configured to receive external audio signals when the apparatus 1300 is in an operating mode, such as a call mode, a recording mode, and a voice recognition mode. The received audio signals may further be stored in the memory 1304 or transmitted via the communication component 1316. In some embodiments, the audio component 1310 also includes a speaker for outputting audio signals.
The I/O interface 1312 provides an interface between the processing component 1302 and peripheral interface modules, which may be keyboards, click wheels, buttons, etc. These buttons may include, but are not limited to: a home button, a volume button, a start button, and a lock button.
The sensor assembly 1314 includes one or more sensors for providing various aspects of state assessment for the device 1300. For example, the sensor assembly 1314 may detect the open/closed state of the device 1300 and the relative positioning of components, such as the display and keypad of the device 1300. The sensor assembly 1314 may also detect a change in the position of the device 1300 or of a component of the device 1300, the presence or absence of user contact with the device 1300, the orientation or acceleration/deceleration of the device 1300, and a change in the temperature of the device 1300. The sensor assembly 1314 may include a proximity sensor configured to detect the presence of a nearby object without any physical contact. The sensor assembly 1314 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor assembly 1314 may also include an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
The communication component 1316 is configured to facilitate communications between the apparatus 1300 and other devices in a wired or wireless manner. The apparatus 1300 may access a wireless network based on a communication standard, such as WiFi, 2G or 3G, 4G LTE, 5G NR, or a combination thereof. In an exemplary embodiment, the communication component 1316 receives broadcast signals or broadcast related information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the communications component 1316 also includes a Near Field Communications (NFC) module to facilitate short-range communications. For example, the NFC module may be implemented based on Radio Frequency Identification (RFID) technology, infrared data association (IrDA) technology, Ultra Wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
In an exemplary embodiment, the apparatus 1300 may be implemented by one or more Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field Programmable Gate Arrays (FPGAs), controllers, micro-controllers, microprocessors or other electronic components for performing the touch signal processing method according to any of the above embodiments.
In an exemplary embodiment, a non-transitory computer readable storage medium comprising instructions, such as the memory 1304 comprising instructions, executable by the processor 1320 of the apparatus 1300 to perform the method described above is also provided. For example, the non-transitory computer readable storage medium may be a ROM, a Random Access Memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, and the like.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It will be understood that the present disclosure is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.

Claims (12)

1. A touch signal processing method is suitable for a terminal, and comprises the following steps:
determining a shape of at least one touch area generating a touch signal;
determining a target touch area acted on by a finger in the at least one touch area according to the shape;
determining, according to a relationship between the shape and a preset structure in the terminal, whether a finger acting on the target touch area is a left-hand finger or a right-hand finger;
when the finger acting on the target touch area is a left-hand finger, if a distance between the target touch area and a first frame of the terminal is greater than a first preset distance, ignoring touch signals generated by other touch areas whose distance from the first frame of the terminal is less than a second preset distance;
and when the finger acting on the target touch area is a right-hand finger, if a distance between the target touch area and a second frame of the terminal is greater than the first preset distance, ignoring touch signals generated by other touch areas whose distance from the second frame is less than the second preset distance.
2. The method of claim 1, wherein the determining the shape of the at least one touch area generating the touch signal comprises:
determining positions of a plurality of sensors generating touch signals;
determining a touch area formed by the connected sensors according to the position;
and performing curve fitting on the positions of the sensors at the edge of the touch area to determine the shape of the touch area.
3. The method of claim 1, wherein determining a target touch area to which a finger acts in the at least one touch area according to the shape comprises:
and determining a touch area with an elliptical shape as the target touch area.
4. The method according to claim 3, wherein the determining that the finger acting on the target touch area is a left-hand finger or a right-hand finger according to the relationship between the shape and a preset structure in the terminal comprises:
calculating an included angle between the major axis or the minor axis of the ellipse and at least one frame of the terminal;
and determining that the finger acting on the target touch area is a left-hand finger or a right-hand finger according to the included angle.
5. The method according to any one of claims 1 to 4, further comprising:
under the condition that the finger acting on the target touch area is a left-hand finger, responding to a touch signal generated by the target touch area if the distance between the target touch area and a first frame of the terminal is less than a second preset distance;
and under the condition that the finger acting on the target touch area is a right-hand finger, responding to a touch signal generated by the target touch area if the distance between the target touch area and a second frame of the terminal is less than a second preset distance.
6. A touch signal processing device is suitable for a terminal, and comprises:
a shape determination module configured to determine a shape of at least one touch area generating a touch signal;
a region determination module configured to determine a target touch region on which a finger acts among the at least one touch region according to the shape;
the finger determining module is configured to determine that a finger acting on the target touch area is a left-hand finger or a right-hand finger according to the relation between the shape and a preset structure in the terminal;
the signal processing module is configured to, when the finger acting on the target touch area is a left-hand finger, ignore touch signals generated by other touch areas whose distance from the first frame is less than a second preset distance if the distance between the target touch area and the first frame of the terminal is greater than a first preset distance; and, when the finger acting on the target touch area is a right-hand finger, ignore touch signals generated by other touch areas whose distance from the second frame is less than the second preset distance if the distance between the target touch area and the second frame of the terminal is greater than the first preset distance.
7. The apparatus of claim 6, wherein the shape determination module comprises:
a position determination submodule configured to determine positions of a plurality of sensors generating touch signals;
an area determination submodule configured to determine, according to the positions, a touch area formed by the connected sensors;
a shape determination sub-module configured to curve fit positions of sensors located at edges of the touch area to determine a shape of the touch area.
8. The apparatus of claim 6, wherein the region determination module is configured to determine a touch area that is elliptical in shape as the target touch area.
9. The apparatus of claim 7, wherein the finger determination module comprises:
an included angle calculation submodule configured to calculate an included angle between a major axis or a minor axis of the ellipse and at least one frame of the terminal;
and a finger determination submodule configured to determine, according to the included angle, whether the finger acting on the target touch area is a left-hand finger or a right-hand finger.
10. The apparatus according to any one of claims 6 to 9, wherein the signal processing module is further configured to: under the condition that the finger acting on the target touch area is a left-hand finger, respond to a touch signal generated by the target touch area if the distance between the target touch area and the first frame of the terminal is less than the second preset distance; and under the condition that the finger acting on the target touch area is a right-hand finger, respond to a touch signal generated by the target touch area if the distance between the target touch area and the second frame of the terminal is less than the second preset distance.
11. An electronic device, comprising:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to implement the method of any one of claims 1 to 5.
12. A computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the steps of the method of any one of claims 1 to 5.
CN201911300025.2A 2019-12-16 2019-12-16 Touch signal processing method and device Active CN112987958B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911300025.2A CN112987958B (en) 2019-12-16 2019-12-16 Touch signal processing method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911300025.2A CN112987958B (en) 2019-12-16 2019-12-16 Touch signal processing method and device

Publications (2)

Publication Number Publication Date
CN112987958A (en) 2021-06-18
CN112987958B (en) 2024-04-19

Family

ID=76342021

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911300025.2A Active CN112987958B (en) 2019-12-16 2019-12-16 Touch signal processing method and device

Country Status (1)

Country Link
CN (1) CN112987958B (en)

Citations (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130176247A1 (en) * 2012-01-06 2013-07-11 Fujitsu Limited Input device and method for touch position calculation
JP2016130963A (en) * 2015-01-14 2016-07-21 シャープ株式会社 Electronic device with touch panel and program for controlling the electronic device
CN105867654A (en) * 2016-03-25 2016-08-17 乐视控股(北京)有限公司 Method for adjusting focal length of camera and terminal
CN106293444A (en) * 2015-06-25 2017-01-04 小米科技有限责任公司 Mobile terminal, display control method and device
US20170003814A1 (en) * 2013-07-04 2017-01-05 Samsung Electronics Co., Ltd. Coordinate measuring apparatus and method of controlling the same
CN106406621A (en) * 2016-10-10 2017-02-15 努比亚技术有限公司 Mobile terminal and method of processing touch control operation for mobile terminal
CN106445238A (en) * 2016-10-17 2017-02-22 北京小米移动软件有限公司 Edge touch inhibiting method and device
CN106775302A (en) * 2016-11-29 2017-05-31 努比亚技术有限公司 A kind of terminal screen anti-error-touch device and method
US20170153754A1 (en) * 2015-12-01 2017-06-01 Xiaomi Inc. Method and device for operating object
CN106855783A (en) * 2016-12-16 2017-06-16 广东欧珀移动通信有限公司 A kind of method of false-touch prevention, device and mobile terminal
CN107257949A (en) * 2015-02-26 2017-10-17 三星电子株式会社 Processing method of touch and the electronic equipment for supporting this method
CN107577372A (en) * 2017-09-06 2018-01-12 广东欧珀移动通信有限公司 Edge touch control method, device and mobile terminal
CN108108683A (en) * 2017-12-14 2018-06-01 北京小米移动软件有限公司 Touch-control response method, mobile terminal and storage medium
CN108920081A (en) * 2018-06-28 2018-11-30 努比亚技术有限公司 A kind of interaction regulation method, equipment and computer readable storage medium
CN109002223A (en) * 2018-08-29 2018-12-14 维沃移动通信有限公司 A kind of touch interface display methods and mobile terminal
CN109213349A (en) * 2017-06-30 2019-01-15 北京小米移动软件有限公司 Exchange method and device, computer readable storage medium based on touch screen

Also Published As

Publication number Publication date
CN112987958B (en) 2024-04-19

Similar Documents

Publication Publication Date Title
US20170031557A1 (en) Method and apparatus for adjusting shooting function
US20170060320A1 (en) Method for controlling a mobile terminal using a side touch panel
CN106484284B (en) Method and device for switching single-hand mode
CN106873834B (en) Method and device for identifying triggering of key and mobile terminal
CN107390932B (en) Edge false touch prevention method and device and computer readable storage medium
EP3232301B1 (en) Mobile terminal and virtual key processing method
CN112905103B (en) False touch processing method and device and storage medium
CN105094609A (en) Method and device for achieving key operation in single-handed mode
CN107402711B (en) Volume adjusting method and device and computer readable storage medium
CN105260115A (en) Method and device for realizing single-hand mode, and intelligent terminal
CN107798309B (en) Fingerprint input method and device and computer readable storage medium
EP3182256B1 (en) Touch control button, touch control panel and touch control terminal
CN107168566B (en) Operation mode control method and device and terminal electronic equipment
JP2017525076A (en) Character identification method, apparatus, program, and recording medium
CN111522498A (en) Touch response method and device and storage medium
CN111104001B (en) Method and device for preventing screen from being touched mistakenly, mobile terminal and storage medium
CN106293184B (en) Terminal capable of lateral touch control and touch control method thereof
CN107203315B (en) Click event processing method and device and terminal
CN111506246A (en) Fingerprint identification determination method and fingerprint identification determination device
CN107168631B (en) Application program closing method and device and terminal electronic equipment
CN106990893B (en) Touch screen operation processing method and device
CN112987958B (en) Touch signal processing method and device
CN116204073A (en) Touch control method, touch control device, electronic equipment and storage medium
CN111314552B (en) User interface control method and device and storage medium
US20160195992A1 (en) Mobile terminal and method for processing signals generated from touching virtual keys

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant