CN110392161B - Proximity identification method and device and terminal equipment - Google Patents


Info

Publication number
CN110392161B
CN110392161B
Authority
CN
China
Prior art keywords
signal
user
screen
graph
convex polygon
Prior art date
Legal status
Active
Application number
CN201810370345.4A
Other languages
Chinese (zh)
Other versions
CN110392161A (en)
Inventor
龚尤岗
骆志强
肖钡
Current Assignee
FocalTech Systems Ltd
Original Assignee
FocalTech Systems Ltd
Priority date
Filing date
Publication date
Application filed by FocalTech Systems Ltd filed Critical FocalTech Systems Ltd
Priority to CN201810370345.4A (CN110392161B)
Priority to TW107138462A (TWI693532B)
Publication of CN110392161A
Application granted
Publication of CN110392161B


Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03: Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/041: Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F 3/0416: Control or interface arrangements specially adapted for digitisers
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04M: TELEPHONIC COMMUNICATION
    • H04M 1/00: Substation equipment, e.g. for use by subscribers
    • H04M 1/72: Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M 1/724: User interfaces specially adapted for cordless or mobile telephones
    • H04M 1/72448: User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions
    • H04M 1/72454: User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions according to context-related or environment-related conditions
    • H04M 2250/00: Details of telephonic subscriber devices
    • H04M 2250/22: Details of telephonic subscriber devices including a touch pad, a touch sensor or a touch detector

Landscapes

  • Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Environmental & Geological Engineering (AREA)
  • User Interface Of Digital Computer (AREA)
  • Telephone Function (AREA)

Abstract

The present application provides a proximity identification method, a proximity identification apparatus, and a terminal device, which determine the body part with which a user performs an operation so that screen on/off actions can be executed accurately. The proximity identification method provided herein comprises the following steps: converting a first signal into a second signal according to a signal threshold, where the first signal is an initial signal obtained from a user operation on the touch screen (TP); connecting adjacent valid signals in the second signal to obtain a connected domain, where the valid signals correspond one-to-one to the sensing channels of the TP; and determining, from the shape of the connected domain, the part with which the user performs the operation.

Description

Proximity identification method and device and terminal equipment
Technical Field
The present application relates to the field of touch screen sensing, and in particular to a proximity identification method, a proximity identification apparatus, and a terminal device.
Background
Turning the screen off when the phone is held against the face has become an indispensable function of mobile phone terminals: during a call, the terminal automatically turns off the screen, which both prevents accidental touches from affecting call quality and saves screen power.
Today, a mobile phone terminal generally detects user operations through capacitance changes on the touch screen (TP): both the user's proximity behavior and touch behavior change the capacitance values on the TP, and the terminal decides whether to perform a screen on/off action by detecting these capacitance changes.
The terminal sets different capacitance-change thresholds to distinguish the user's proximity behavior from touch behavior. In practice, however, the capacitance change caused by a user operation often falls in the fuzzy zone between the two thresholds, or proximity and touch occur at the same time. The terminal then easily misinterprets a touch as proximity and turns the screen off, forcing the user to wake the TP before operating again, which directly degrades the user experience.
Disclosure of Invention
The present application provides a proximity identification method, a proximity identification apparatus, and a terminal device, which determine the body part with which a user performs an operation so that screen on/off actions can be executed accurately.
In a first aspect, the present application provides a proximity identification method, including:
converting a first signal into a second signal according to a signal threshold, where the first signal is an initial signal obtained from a user operation on the touch screen (TP);
connecting adjacent valid signals in the second signal to obtain a connected domain, where the valid signals correspond one-to-one to the sensing channels of the TP; and
determining, from the shape of the connected domain, the part with which the user performs the operation.
With reference to the first aspect of the present application, in a first possible implementation manner of the first aspect, the part includes the user's face, the user's pinna, or the user's finger.
With reference to the first aspect of the present application, in a second possible implementation manner of the first aspect, determining, from the shape of the connected domain, the part with which the user performs the operation includes:
drawing a convex polygon on the basis of the connected domain, where the vertices of the convex polygon lie on the edge of the shape of the connected domain;
drawing a target rectangle on the basis of the convex polygon, where the target rectangle is the minimum-area rectangle containing the convex polygon and one side of the convex polygon lies on one side of the target rectangle; and
when the aspect ratio of the target rectangle is greater than an aspect-ratio threshold, determining that the part is the user's pinna.
With reference to the first aspect of the present application, in a third possible implementation manner of the first aspect, determining, from the shape of the connected domain, the part with which the user performs the operation includes:
drawing a convex polygon on the basis of the connected domain, where the vertices of the convex polygon lie on the edge of the shape of the connected domain; and
when the convex polygon contains an invalid signal, determining that the part is the user's pinna, where the invalid signals and the valid signals together constitute the second signal.
With reference to the third possible implementation manner of the first aspect of the present application, in a fourth possible implementation manner of the first aspect of the present application, the method further includes:
calculating, along the direction perpendicular to a first side of the convex polygon, the distance from any point on the straight line on which the first side lies to the convex polygon; and
when the distance from a first point on that straight line to the convex polygon is greater than one matrix-element (cell) spacing, determining that the convex polygon contains an invalid signal.
With reference to the first aspect of the present application, in a fifth possible implementation manner of the first aspect, determining, from the shape of the connected domain, the part with which the user performs the operation includes:
calculating the area of the shape; and
when the area is greater than an area threshold, determining that the part is the user's face.
With reference to the first aspect of the present application, in a sixth possible implementation manner of the first aspect, determining, from the shape of the connected domain, the part with which the user performs the operation includes:
judging whether the connected domain includes a first shape and a second shape, where the first shape corresponds to a first valid signal, the second shape corresponds to a second valid signal, and the connected domain includes the first valid signal and the second valid signal; and
if so, determining that the part is the user's face.
With reference to the sixth possible implementation manner of the first aspect, in a seventh possible implementation manner of the first aspect, the signal peak corresponding to the first valid signal is greater than 1/2 of the signal peak of the second valid signal.
With reference to the first aspect of the present application, in an eighth possible implementation manner of the first aspect, the type of the TP includes a capacitive screen, a resistive screen, an infrared screen, or an acoustic wave screen; and
the first signal includes a capacitance change corresponding to the capacitive screen, a pressure change corresponding to the resistive screen, an infrared light change corresponding to the infrared screen, or an acoustic-wave energy change corresponding to the acoustic wave screen.
In a second aspect, the present application provides a proximity identification apparatus, comprising:
a conversion unit, configured to convert a first signal into a second signal according to a signal threshold, where the first signal is an initial signal obtained from a user operation on the touch screen (TP);
a connecting unit, configured to connect adjacent valid signals in the second signal to obtain a connected domain, where the valid signals correspond one-to-one to the sensing channels of the TP; and
a first determining unit, configured to determine, from the shape of the connected domain, the part with which the user performs the operation.
With reference to the second aspect of the present application, in a first possible implementation manner of the second aspect, the part includes the user's face, the user's pinna, or the user's finger.
With reference to the second aspect of the present application, in a second possible implementation manner of the second aspect, the first determining unit is specifically configured to:
draw a convex polygon on the basis of the connected domain, where the vertices of the convex polygon lie on the edge of the shape of the connected domain;
draw a target rectangle on the basis of the convex polygon, where the target rectangle is the minimum-area rectangle containing the convex polygon and one side of the convex polygon lies on one side of the target rectangle; and
when the aspect ratio of the target rectangle is greater than an aspect-ratio threshold, determine that the part is the user's pinna.
With reference to the second aspect of the present application, in a third possible implementation manner of the second aspect, the first determining unit is specifically configured to:
draw a convex polygon on the basis of the connected domain, where the vertices of the convex polygon lie on the edge of the shape of the connected domain; and
when the convex polygon contains an invalid signal, determine that the part is the user's pinna, where the invalid signals and the valid signals together constitute the second signal.
With reference to the third possible implementation manner of the second aspect, in a fourth possible implementation manner of the second aspect, the apparatus further includes:
a calculating unit, configured to calculate, along the direction perpendicular to a first side of the convex polygon, the distance from any point on the straight line on which the first side lies to the convex polygon; and
a second determining unit, configured to determine that the convex polygon contains an invalid signal when the distance from a first point to the convex polygon is greater than one matrix-element (cell) spacing.
With reference to the second aspect of the present application, in a fifth possible implementation manner of the second aspect, the first determining unit is specifically configured to:
calculate the area of the shape; and
when the area is greater than an area threshold, determine that the part is the user's face.
With reference to the second aspect of the present application, in a sixth possible implementation manner of the second aspect, the first determining unit is specifically configured to:
judge whether the connected domain includes a first shape and a second shape, where the first shape corresponds to a first valid signal, the second shape corresponds to a second valid signal, and the connected domain includes the first valid signal and the second valid signal; and
if so, determine that the part is the user's face.
With reference to the sixth possible implementation manner of the second aspect, in a seventh possible implementation manner of the second aspect, the signal peak corresponding to the first valid signal is greater than 1/2 of the signal peak of the second valid signal.
With reference to the second aspect of the present application, in an eighth possible implementation manner of the second aspect, the type of the TP includes a capacitive screen, a resistive screen, an infrared screen, or an acoustic wave screen; and
the first signal includes a capacitance change corresponding to the capacitive screen, a pressure change corresponding to the resistive screen, an infrared light change corresponding to the infrared screen, or an acoustic-wave energy change corresponding to the acoustic wave screen.
In a third aspect, the present application provides a terminal device, where the terminal device includes the proximity identification apparatus in the second aspect or any implementation manner of the second aspect.
In a fourth aspect, the present application provides a computer-readable storage medium comprising instructions that, when executed on a terminal device, cause the terminal device to perform the proximity identification method of the first aspect or any implementation manner of the first aspect.
In a fifth aspect, the present application provides a computer program product comprising computer software instructions that, when run on a terminal device, cause the terminal device to perform the proximity identification method of the first aspect or any implementation manner of the first aspect.
According to the technical solution above, the present application has the following advantages:
a first signal is converted into a second signal according to a signal threshold, the first signal being an initial signal obtained from a user operation on the touch screen (TP); adjacent valid signals in the second signal are connected to obtain a connected domain, the valid signals corresponding one-to-one to the sensing channels of the TP; and the part with which the user performs the operation is determined from the shape of the connected domain. The screen on/off action can therefore be executed accurately, misjudgment is avoided, and the user experience is improved.
Drawings
FIG. 1 is a schematic diagram of one embodiment of a proximity identification method provided herein;
FIG. 2 is an expanded view of a first signal provided herein;
FIG. 3 is an expanded view of a second signal provided herein;
FIG. 4 is a schematic diagram of one embodiment of a connected domain provided herein;
FIG. 5 is a schematic diagram of yet another embodiment of a proximity identification method provided herein;
FIG. 6 is a schematic view of one embodiment of a convex polygon provided herein;
FIG. 7 is a schematic view of one embodiment of a rectangle provided herein;
FIG. 8 is a schematic view of yet another embodiment of a rectangle provided herein;
FIG. 9 is a schematic diagram of yet another embodiment of a proximity identification method provided herein;
FIG. 10 is a schematic diagram of one embodiment of identifying a concave structure provided herein;
FIG. 11 is a schematic diagram of yet another embodiment of a proximity identification method provided herein;
FIG. 12 is a schematic diagram of yet another embodiment of a proximity identification method provided herein;
FIG. 13 is a diagram illustrating an embodiment of connected domains corresponding to a user's face, provided herein;
FIG. 14 is a schematic view of one embodiment of a proximity identification apparatus provided herein;
fig. 15 is a schematic view of still another embodiment of the proximity identification apparatus provided in the present application.
Detailed Description
The present application provides a proximity identification method, a proximity identification apparatus, and a terminal device, which determine the body part with which a user performs an operation so that screen on/off actions can be executed accurately.
Before the present application is introduced, a terminal device to which the present application relates will be introduced.
In the present application, the terminal device may be the mobile phone terminal mentioned above (including Apple iOS phones and Android phones), or another device with a touch screen that can make calls, such as a tablet computer, a smart band, or a smart watch.
Furthermore, since the terminal device provided herein can accurately determine the part with which the user performs an operation, it can also react differently to different parts of the user, meeting diverse user needs. The terminal device is therefore not limited to devices with a call function, and may also be, for example, a tablet (without a call function), a smart watch (without a call function), a smart band (without a call function), a camera, a personal digital assistant (PDA), or a touch-screen display.
The following description is provided to explain the details of the present application.
First, referring to fig. 1, which is a schematic diagram of an embodiment of the proximity identification method provided in the present application, the method can be summarized as follows:
step 101, converting a first signal into a second signal according to a signal threshold;
the first signal is an initial signal acquired according to an operation of a user on the touch screen TP.
It can be understood that when a user performs an operation on the TP, such as approaching, touching, clicking, or sliding, the TP responds by acquiring an analog signal corresponding to the operation and converting it into a digital signal, i.e., the initial signal, through an analog-to-digital converter (ADC). In the prior art, the mobile phone terminal decides whether to perform the screen-off operation by comparing this initial signal (the capacitance change on the TP) with a change threshold.
In the present application, after the initial signal is obtained, a further conversion is performed: the initial signal is compared with a signal threshold and converted into sub-signals according to the comparison result, and the sub-signals, divided into invalid signals and valid signals, form the second signal. For example, the conversion rule may be set as follows: if the initial signal is smaller than the signal threshold, it is converted into a null value, represented as an invalid signal; if the initial signal is not smaller than the signal threshold, it is converted into a true value, represented as a valid signal.
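The conversion rule above can be sketched as a simple thresholding pass over the grid of channel readings. This is a minimal Python sketch; the 3x3 grid and the threshold value of 30 are illustrative assumptions, not values from the application:

```python
def to_second_signal(first_signal, signal_threshold):
    """Threshold the initial per-channel readings (first signal) into a grid
    of valid (1) / invalid (0) sub-signals forming the second signal."""
    return [[1 if value >= signal_threshold else 0 for value in row]
            for row in first_signal]

# Hypothetical 3x3 initial signal, one value per sensing channel.
raw = [[5, 12, 40],
       [8, 55, 60],
       [3,  9, 50]]
second = to_second_signal(raw, signal_threshold=30)
print(second)  # [[0, 0, 1], [0, 1, 1], [0, 0, 1]]
```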
As shown in the expanded schematic diagram of the first signal in fig. 2, the first signal corresponds to the touch surface of the TP; when the TP is rectangular, the expanded first signal also forms a corresponding rectangular map. The expanded first signal comprises a plurality of cells, and the cells correspond one-to-one to the sensing channels of the TP.
As shown in the expanded diagram of the second signal in fig. 3, the signal value of each cell in the first signal is examined, and the corresponding sub-signal is obtained by conversion to form the second signal. The sub-signals in the second signal can be distinguished by numerical values such as "0", "1", and "2".
It should be noted that in the present application the TP may be not only a capacitive screen but also a resistive screen, an infrared screen, an acoustic wave screen, or another type of touch screen, which is not limited herein.
Correspondingly, the first signal may be a capacitance change corresponding to the capacitive screen, a pressure change corresponding to the resistive screen, an infrared light change corresponding to the infrared screen, or an acoustic-wave energy change corresponding to the acoustic wave screen.
After converting the first signal into the second signal, the following step 102 may be performed.
step 102, connecting adjacent valid signals in the second signal to obtain a connected domain;
where the valid signals correspond one-to-one to the sensing channels of the TP.
As shown in fig. 3, adjacent valid signals are connected to form one or more connected domains.
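Connecting adjacent valid signals can be sketched as a standard 4-neighbour connected-component search over the second-signal grid. This is a minimal Python sketch; the grid-of-0s-and-1s representation is an assumption for illustration:

```python
from collections import deque

def connected_domains(grid):
    """Group 4-adjacent valid cells (value 1) into connected domains;
    returns a list of cell-coordinate sets, one set per domain."""
    rows, cols = len(grid), len(grid[0])
    seen, domains = set(), []
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] == 1 and (r, c) not in seen:
                queue, domain = deque([(r, c)]), set()
                seen.add((r, c))
                while queue:  # breadth-first flood fill
                    cr, cc = queue.popleft()
                    domain.add((cr, cc))
                    for nr, nc in ((cr+1, cc), (cr-1, cc), (cr, cc+1), (cr, cc-1)):
                        if 0 <= nr < rows and 0 <= nc < cols \
                           and grid[nr][nc] == 1 and (nr, nc) not in seen:
                            seen.add((nr, nc))
                            queue.append((nr, nc))
                domains.append(domain)
    return domains

grid = [[1, 1, 0],
        [0, 0, 0],
        [0, 1, 1]]
print(len(connected_domains(grid)))  # 2 connected domains
```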
step 103, determining, from the shape of the connected domain, the part with which the user performs the operation.
It can be understood that when a user operates the TP with different body parts, the first signal obtained by the TP exhibits different shape features; thus, after the connected domain is obtained, the part with which the user performs the operation can be determined from the shape of the connected domain.
In practical applications, a user typically operates the TP with a finger, the pinna, or the face.
For ease of understanding, a set of shape recognition rules is used as an example:
Fig. 4 shows a schematic diagram of an embodiment of connected-domain shapes, where the corresponding shape recognition rules are:
Event A: the shape tends to be elliptical or circular and is concentrated, so it can be determined to be the user's finger;
Event B: the shape is elongated or has a concave structure, so it can be determined to be the user's pinna;
Event C: the shape is more complex, so it can be identified as the user's face.
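The event rules can be sketched as a rule-based dispatch over simple shape features. In this minimal Python sketch the feature names and threshold values are hypothetical placeholders, since the application does not fix numeric values here:

```python
def classify_part(aspect_ratio, has_concavity, area,
                  aspect_threshold=2.0, area_threshold=40.0):
    """Dispatch over the three events: large/complex shapes -> face (event C),
    elongated or concave shapes -> pinna (event B), otherwise finger (event A).
    Thresholds are illustrative assumptions, not values from the application."""
    if area > area_threshold:
        return "face"
    if aspect_ratio > aspect_threshold or has_concavity:
        return "pinna"
    return "finger"

print(classify_part(1.2, False, 9.0))    # finger
print(classify_part(3.5, False, 9.0))    # pinna
print(classify_part(1.0, False, 100.0))  # face
```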
By recognizing the shape of the connected domain, the proximity identification method provided herein can identify the part with which the user operates the TP, so that the terminal device's screen on/off action can be executed accurately, misjudgment is avoided, and the user experience is improved.
For example, after determining that the shape of the connected domain is or includes the user's pinna or face, the terminal device can conclude that the current scene is a call, and turn off the screen; this both prevents accidental touches from affecting call quality and saves screen power. After determining that the shape of the connected domain is the user's finger, the terminal device can conclude that the current scene is normal use and react accordingly to approaching, touching, clicking, sliding, and other finger operations.
Of course, as mentioned above, "the terminal device may also react differently to different parts of the user to meet diverse user needs." Correspondingly, in addition to the shape recognition rules for the user's finger, pinna, and face, the terminal device may be preset with other shape recognition rules to identify other parts with which a user performs an operation.
For example, if the shape of the connected domain is a diamond spanning the expanded second-signal map, it can be identified as the user's arm; in this scene the terminal device can ignore the operation, preventing misjudgment caused by accidental touch. Likewise, if the shape of the connected domain is a partial or complete palm shape, it can be identified as the user's palm; in this scene the terminal device can also ignore the operation, preventing misjudgment caused by accidental touch, especially by a child.
To further illustrate how the present application is applied in practice, the shape recognition rules for the user's pinna and the user's face mentioned above are described in more detail below.
First, shape recognition rules for the user's pinna
In the present application, shape recognition of the user's pinna can be realized by calculating an aspect ratio or by recognizing a concave structure, as follows:
1. Calculation of the aspect ratio
As shown in fig. 5, which is a schematic view of another embodiment of the proximity identification method provided by the present application, step 203 may specifically include:
step 501, drawing a convex polygon on the basis of the connected domain;
where the vertices of the convex polygon lie on the edge of the shape of the connected domain.
Fig. 6 shows an embodiment of the convex polygon: the convex polygon is drawn on the basis of the connected domain with its vertices located at the edge of the shape of the connected domain, so that the convex polygon contains the connected domain.
It should be understood that "edge" here can be understood as the edge cells of the connected domain; for ease of calculation, the vertices of the convex polygon may be placed at the centers of the edge cells.
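Drawing the convex polygon from the centers of the edge cells can be sketched with a standard convex-hull algorithm such as Andrew's monotone chain; the application does not prescribe a particular algorithm, so this is one common choice:

```python
def convex_hull(points):
    """Andrew's monotone-chain convex hull over (x, y) cell centers.
    Returns the hull vertices in counter-clockwise order."""
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts

    def cross(o, a, b):
        # z-component of (a - o) x (b - o); > 0 means a left turn
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

    lower, upper = [], []
    for p in pts:
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    return lower[:-1] + upper[:-1]

cells = [(0, 0), (2, 0), (2, 2), (0, 2), (1, 1)]  # interior point is dropped
print(convex_hull(cells))  # [(0, 0), (2, 0), (2, 2), (0, 2)]
```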
step 502, drawing a target rectangle on the basis of the convex polygon;
where the target rectangle is the minimum-area rectangle containing the convex polygon, and one side of the convex polygon lies on (coincides or partially coincides with) one side of the target rectangle.
After the convex polygon is drawn on the basis of the connected domain, a rectangle is drawn on the basis of the convex polygon.
Fig. 7 shows an embodiment of the rectangle: for any side M of the convex polygon, construct the minimum-area rectangle that contains the convex polygon and has one side coinciding or partially coinciding with M. A convex polygon with N sides thus yields N rectangles, and the rectangle with the smallest area among the N rectangles is determined to be the target rectangle.
Specifically, as shown in fig. 8, take the straight line through side M of the convex polygon as line L1, on which one side of the rectangle lies; draw line L2 parallel to L1 through the hull vertex farthest from L1; construct line L3 perpendicular to L1 outside the convex polygon, and translate it to the nearest and farthest vertices along L1 to obtain lines L4 and L5. The figure enclosed by L1, L2, L4, and L5 is then the minimum-area rectangle that contains the convex polygon and has one side collinear with side M. The lengths of the projections of the convex polygon onto L1 and L3 are the length and width of the rectangle.
Assume in fig. 8 that the coordinates of P1 are (x1, y1) and the coordinates of P8 are (x8, y8). The equation of the straight line P1P8 is:
(x1-x8)(y-y1)-(y1-y8)(x-x1)=0;
the equation of the perpendicular to the line P1P8 is:
(x1-x8)(x-x1)+(y1-y8)(y-y1)=0;
the distance from any point (x0, y0) to the line P1P8 is:
d1 = |(x1-x8)(y0-y1)-(y1-y8)(x0-x1)| / sqrt((x1-x8)^2+(y1-y8)^2);
and the distance from any point (x0, y0) to the perpendicular of the line P1P8 is:
d2 = |(x1-x8)(x0-x1)+(y1-y8)(y0-y1)| / sqrt((x1-x8)^2+(y1-y8)^2).
From these formulas, the minimum-area rectangle that encloses the convex polygon and corresponds to any given side can be calculated.
And step 503, when the aspect ratio of the target rectangle is larger than the aspect ratio threshold, determining that the part is the auricle of the user.
After the target rectangle is determined, its aspect ratio can be calculated. If the aspect ratio is greater than the aspect ratio threshold, for example 2 or 3, the figure matches the elongated shape characteristic of the user's pinna, and the part with which the user performs the operation can be determined to be the user's pinna.
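Steps 501 to 503 can be sketched as follows. This is a non-authoritative Python illustration: the convex hull is built with Andrew's monotone chain, each hull edge plays the role of side M, and the projections onto the edge direction and its normal correspond to the projections onto L1 and L3 in the text; the ratio threshold of 2.0 is an assumed value, not one fixed by the application.

```python
import math

def convex_hull(points):
    """Andrew's monotone chain; returns hull vertices in CCW order."""
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts

    def cross(o, a, b):
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

    lower, upper = [], []
    for p in pts:
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    return lower[:-1] + upper[:-1]

def min_area_rect(hull):
    """For each hull edge (side M), project every vertex onto the edge
    direction (line L1) and its normal (line L3); the projection extents
    are the candidate rectangle's length and width.  Returns (area,
    long-side / short-side ratio) of the smallest candidate."""
    if len(hull) < 2:
        return (0.0, 1.0)
    best = None
    n = len(hull)
    for i in range(n):
        (x1, y1), (x2, y2) = hull[i], hull[(i + 1) % n]
        norm = math.hypot(x2 - x1, y2 - y1)
        ex, ey = (x2 - x1) / norm, (y2 - y1) / norm   # unit edge direction
        nx, ny = -ey, ex                              # unit normal
        us = [(px - x1) * ex + (py - y1) * ey for px, py in hull]
        vs = [(px - x1) * nx + (py - y1) * ny for px, py in hull]
        w, h = max(us) - min(us), max(vs) - min(vs)
        if best is None or w * h < best[0]:
            best = (w * h, max(w, h) / max(min(w, h), 1e-9))
    return best

def looks_like_pinna(domain_cells, ratio_threshold=2.0):
    """Steps 501-503: hull, target rectangle, aspect ratio test."""
    _, ratio = min_area_rect(convex_hull(domain_cells))
    return ratio > ratio_threshold
```

For example, a 2×5 strip of cells yields a target rectangle of 1×4 (aspect ratio 4) and is classified as a pinna, while a 3×3 block gives ratio 1 and is not.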
2. Identification of dimples
As shown in fig. 9, which is a schematic view of another embodiment of the proximity identification method provided by the present application, step 203 may specifically include:
step 901, drawing a convex polygon on the basis of a connected domain;
wherein the vertices of the convex polygons are located at the edges of the graph of connected domains.
It is understood that step 901 may refer to the description of step 501, and details are not described herein.
Step 902, when the convex polygon contains an invalid signal, determining that the part is the user's pinna.
Wherein the invalid signal and the valid signal constitute a second signal.
It can be understood that if the convex polygon contains an invalid signal lying outside the connected domain, such as the invalid signal "0" shown in fig. 8, the graph of the connected domain has a concave structure. This matches the concave structure of the user's pinna, so the part with which the user performs the operation can be determined to be the user's pinna.
Specifically, as shown in fig. 10, one way to identify the concave structure is to calculate the distance y from any point on the straight line L1, on which a first side of the convex polygon lies, to the convex polygon along the direction perpendicular to L1. It will be appreciated that only the segment of L1 covered by the projection of the convex polygon is involved in this calculation.
It can be understood that if the graph of the connected domain has no concave structure, every calculated distance y should be 0; if the graph has a concave structure, at least one distance y is non-zero. Allowing for error in practice, the rule can be set as follows: when the distance y from some first point on the straight line L1 to the convex polygon is greater than 1 matrix element distance, the convex polygon is determined to contain an invalid signal.
The matrix element distance can be defined as the side length of one 1×1 cell in the figure above.
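A compact way to realize step 902 is to test the invalid cells directly against the convex polygon: if any "0" cell of the second signal falls inside the hull of the connected domain, the domain's outline is concave. The sketch below assumes (row, col) cell coordinates and a counter-clockwise hull; it is an illustration, not the application's mandated implementation.

```python
def cross(o, a, b):
    """2-D cross product of vectors o->a and o->b."""
    return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

def convex_hull(points):
    """Andrew's monotone chain; returns hull vertices in CCW order."""
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts
    lower, upper = [], []
    for p in pts:
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    return lower[:-1] + upper[:-1]

def inside_hull(hull, p):
    """True if p lies inside or on the CCW hull boundary."""
    n = len(hull)
    return all(cross(hull[i], hull[(i + 1) % n], p) >= 0 for i in range(n))

def has_dimple(second_signal, domain_cells):
    """Step 902: the convex polygon 'contains an invalid signal' iff some
    invalid ('0') cell of the second signal lies inside the domain's hull."""
    hull = convex_hull(domain_cells)
    if len(hull) < 3:
        return False
    rows, cols = len(second_signal), len(second_signal[0])
    return any(not second_signal[r][c] and inside_hull(hull, (r, c))
               for r in range(rows) for c in range(cols))
```

A C-shaped domain (a pinna-like hollow) reports a dimple, while a solid block does not.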
Second, pattern recognition rule of user's face
In the present application, the graphic recognition of the user's face may be realized specifically by calculating the area or by recognizing a raised area, as follows:
1. Calculation of area
As shown in fig. 11, which is a schematic view of another embodiment of the proximity identification method provided by the present application, step 203 may specifically include:
step 1101, calculating the area of the graph;
It can be understood that if the user operates the TP with the face, a large connected domain is easily formed. Therefore, whether the operation is performed with the user's face can be judged by calculating the graphic area of the connected domain, distinguishing it from the smaller connected domains corresponding to the user's finger or pinna.
Step 1102, when the area is larger than the area threshold, determining the part as the face of the user.
When the calculated area is greater than an area threshold, for example 25 matrix elements, the part is determined to be the user's face.
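Steps 1101 and 1102 reduce to counting matrix elements. A minimal sketch, with 25 as the assumed area threshold:

```python
def classify_by_area(domain_cells, area_threshold=25):
    """The graph's area is the number of matrix elements the connected
    domain covers; a large domain is taken as the user's face, while a
    small one is left to the finger/pinna rules."""
    return "face" if len(domain_cells) > area_threshold else "finger-or-pinna"
```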
2. Identification of raised areas
As shown in fig. 12, which is a schematic view of another embodiment of the proximity identification method provided by the present application, step 203 may specifically include:
step 1201, judging whether the connected domain comprises a first graph and a second graph;
if so, step 1202 is triggered.
The first graph corresponds to the first effective signal, the second graph corresponds to the second effective signal, and the connected domain comprises the first effective signal and the second effective signal.
Compared with the connected domain corresponding to a user's finger, which has a small area and concentrated analog signal values and graph, the connected domain corresponding to the user's face has a large area and dispersed analog signal values and graph.
Fig. 13 shows an embodiment of the connected domain corresponding to the user's face. Besides having a larger area, this connected domain is divided into two effective signals: in fig. 13 the first effective signal is denoted by "1" and the second effective signal by "2", the value corresponding to the first effective signal is larger than that corresponding to the second, and the first graph corresponding to the first effective signal lies inside the second graph corresponding to the second effective signal, matching the characteristics of the user's face operating on the TP. The second graph corresponding to the second effective signal is called the raised area.
When the connected domain is determined to include the raised area, namely the first graph and the second graph, step 1202 is triggered.
In step 1202, the part is determined to be the face of the user.
When the connected domain includes the raised area, namely the first graph and the second graph, the part is determined to be the user's face.
In practical applications, the signal peak corresponding to the first effective signal may be set to be greater than 1/2 of the signal peak of the second effective signal, so as to facilitate identification of the raised area.
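Steps 1201 and 1202 can be sketched as below. The half-peak split between the first and second effective signals follows the peak relation just mentioned, and "the first graph lies inside the second graph" is approximated by requiring every strong cell to be 4-surrounded by cells of the same domain; both simplifications are our assumptions for illustration.

```python
def is_face_by_raised_area(first_signal, domain_cells):
    """Split one connected domain's cells into a first effective signal
    (value above half the domain peak) and a weaker second effective
    signal.  The part is a face when both exist and the first graph is
    enclosed by the second graph (the 'raised area')."""
    values = {cell: first_signal[cell[0]][cell[1]] for cell in domain_cells}
    peak = max(values.values())
    strong = {cell for cell, v in values.items() if v > peak / 2}
    weak = set(domain_cells) - strong
    if not strong or not weak:
        return False
    cells = set(domain_cells)
    # every cell of the first graph must be surrounded by the domain,
    # i.e. the strong region never touches the domain's outer boundary
    for r, c in strong:
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            if (r + dr, c + dc) not in cells:
                return False
    return True
```

A domain whose strong values sit in the middle of a weaker ring is classified as a face; a strong cell on the domain's edge is not.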
It should be understood that in the present application some or all of the steps of the proximity identification method may be performed by a processor of the TP, specifically by a Micro Controller Unit (MCU) in the TP driver Integrated Circuit (IC); or by a processor of the terminal device; or by the TP and the processor of the terminal device in combination, which is not limited herein.
Meanwhile, it should be understood that the user's pinna, face and fingers described above are only examples of the present application. In practical applications, the part to be recognized may also be a user's joint, specifically a hand joint, or the user's palm or another part, which is not limited herein; such parts are recognized through the connected domain graph, specifically by adjusting the actual parameters used in the aspect ratio calculation, dimple recognition, graph area calculation and raised area recognition described above.
The above introduces the proximity identification method provided by the present application; the proximity identification apparatus provided by the present application is described below from the perspective of functional modules.
As shown in fig. 14, which is a schematic view of an embodiment of a proximity recognition apparatus, the virtual apparatus provided in the present application may include:
a conversion unit 1401 for converting the first signal into a second signal according to a signal threshold;
the first signal is an initial signal acquired according to an operation of a user on the touch screen TP.
A communicating unit 1402, configured to communicate adjacent effective signals in the second signal to obtain a connected domain;
the effective signals correspond to the induction channels of the TP one to one.
A first determining unit 1403, configured to determine the part with which the user performs the operation, according to the graph of the connected domain.
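The work of the conversion unit 1401 and the communicating unit 1402 can be sketched in Python. This is a minimal illustration, assuming a matrix of analog readings, a single fixed signal threshold, and 4-neighbour adjacency; the application itself does not fix these details:

```python
from collections import deque

def to_second_signal(first_signal, signal_threshold):
    """Conversion: readings at or above the threshold become valid
    signals (1), the rest invalid signals (0)."""
    return [[1 if v >= signal_threshold else 0 for v in row]
            for row in first_signal]

def connected_domains(second_signal):
    """Communication: group 4-adjacent valid signals into connected
    domains; each domain is a list of (row, col) cells, one cell per
    sensing channel."""
    rows, cols = len(second_signal), len(second_signal[0])
    seen = [[False] * cols for _ in range(rows)]
    domains = []
    for r in range(rows):
        for c in range(cols):
            if second_signal[r][c] and not seen[r][c]:
                domain, queue = [], deque([(r, c)])
                seen[r][c] = True
                while queue:
                    y, x = queue.popleft()
                    domain.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and second_signal[ny][nx]
                                and not seen[ny][nx]):
                            seen[ny][nx] = True
                            queue.append((ny, nx))
                domains.append(domain)
    return domains
```

Each resulting domain can then be handed to the graph recognition rules (aspect ratio, dimple, area, raised area) to decide which part of the user is near the screen.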
The proximity identification apparatus provided by the present application can determine the part with which the user performs an operation on the TP by recognizing the graph of the connected domain, so that the action of turning the screen of the terminal device on or off can be executed accurately, misjudgment is avoided, and the user experience is improved.
For example, after determining that the graph of the connected domain is or includes the user's pinna or face, the terminal device may determine that the current scene is a call scene and turn off the screen, which both prevents mistaken touches from affecting call quality and saves screen power. After determining that the graph of the connected domain corresponds to the user's finger, the terminal device may determine that the current scene is a normal use scene and respond accordingly to operations such as the finger approaching, touching, clicking and sliding.
Of course, as mentioned in the introduction of the present application, "the terminal device may also react differently to different parts of the user to meet the user's diverse requirements." Correspondingly, in addition to the pattern recognition rules for the user's finger, pinna and face, the terminal device may be preset with other pattern recognition rules to identify other parts with which the user performs an operation.
For example, if the graph of the connected domain is a diamond shape spanning the map of the second signal, it can be identified as the user's arm; in this scene the terminal device can ignore the operation, preventing misjudgment caused by an accidental touch. Similarly, if the graph of the connected domain is a partial or complete palm figure, it can be identified as the user's palm; in this scene the terminal device can also ignore the operation, preventing misjudgment caused by accidental touches, especially by children.
To aid understanding of the present application in practice, the pattern recognition rules for the user's pinna and the user's face described above are further described below.
First, pattern recognition rule of user auricle
In the present application, the graphic recognition of the user's pinna can be realized specifically by calculating the aspect ratio or by recognizing a dimple, as follows:
1. Calculation of aspect ratio
The first determining unit 1403 is specifically configured to:
drawing a convex polygon on the basis of the connected domain;
wherein the vertices of the convex polygons are located at the edges of the graph of connected domains.
Drawing a target rectangle on the basis of the convex polygon;
the target rectangle is the rectangle of smallest area that contains the convex polygon, with one side of the convex polygon lying (fully or partially coincident) on one side of the target rectangle.
And when the aspect ratio of the target rectangle is larger than the aspect ratio threshold value, determining the part as the auricle of the user.
After the target rectangle is determined, its aspect ratio can be calculated. If the aspect ratio is greater than the aspect ratio threshold, for example 2 or 3, the figure matches the elongated shape characteristic of the user's pinna, and the part with which the user performs the operation can be determined to be the user's pinna.
2. Identification of dimples
The first determining unit 1403 is specifically configured to:
drawing a convex polygon on the basis of the connected domain;
wherein the vertices of the convex polygons are located at the edges of the graph of connected domains.
When the convex polygon contains the invalid signal, the part is determined as the auricle of the user.
Wherein the invalid signal and the valid signal constitute a second signal.
It can be understood that if the convex polygon contains an invalid signal lying outside the connected domain, such as the invalid signal "0" shown in fig. 9, the graph of the connected domain has a concave structure. This matches the concave structure of the user's pinna, so the part with which the user performs the operation can be determined to be the user's pinna.
Specifically, as shown in fig. 15, the proximity recognition apparatus further includes:
the calculating unit 1501 is configured to calculate the distance from any point on the straight line on which a first side of the convex polygon lies to the convex polygon, along the direction perpendicular to that straight line. It will be appreciated that only the segment of the straight line covered by the projection of the convex polygon is involved in this calculation.
It can be understood that if the graph of the connected domain has no concave structure, every calculated distance should be 0; if the graph has a concave structure, at least one distance is non-zero. Allowing for error in practice, the second determining unit 1502 may be configured to determine that the convex polygon contains an invalid signal when the distance from some first point on the straight line to the convex polygon is greater than 1 matrix element distance.
The matrix element distance can be defined as the side length of one 1×1 cell in the figure above.
Second, pattern recognition rule of user's face
In the present application, the graphic recognition of the user's face may be realized specifically by calculating the area or by recognizing a raised area, as follows:
1. Calculation of area
The first determining unit 1403 is specifically configured to:
calculating the area of the graph;
and when the area is larger than the area threshold value, determining the part as the face of the user.
2. Identification of raised areas
The first determining unit 1403 is specifically configured to:
judging whether the connected domain comprises a first graph and a second graph;
the first graph corresponds to the first effective signal, the second graph corresponds to the second effective signal, and the connected domain comprises the first effective signal and the second effective signal.
Compared with the connected domain corresponding to a user's finger, which has a small area and concentrated analog signal values and graph, the connected domain corresponding to the user's face has a large area and dispersed analog signal values and graph.
When the connected domain is judged to include the raised area, namely the first graph and the second graph, the part can be determined to be the user's face.
It should be understood that in the present application some or all of the units in the proximity identification apparatus may be integrated into the processor of the TP, specifically into the MCU in the TP driver IC; or integrated into the processor of the terminal device; or distributed between the TP and the processor of the terminal device, which is not limited herein.
Meanwhile, it should be understood that the user's pinna, face and fingers described above are only examples of the present application. In practical applications, the part to be recognized may also be a user's joint, specifically a hand joint, or the user's palm or another part, which is not limited herein; such parts are recognized through the connected domain graph, specifically by adjusting the actual parameters used in the aspect ratio calculation, dimple recognition, graph area calculation and raised area recognition described above.
The present application further provides a terminal device, where the terminal device includes the proximity identification apparatus, so that the terminal device executes a flow in the proximity identification method in any embodiment corresponding to fig. 1, fig. 5, fig. 9, fig. 11, and fig. 12.
It can be clearly understood by those skilled in the art that, for convenience and brevity of description, the specific working process of the proximity identification apparatus, unit or terminal device mentioned above may refer to the corresponding process in the method embodiments corresponding to fig. 1, fig. 5, fig. 9, fig. 11 and fig. 12, and is not described herein again in detail.
The present application further provides a computer-readable storage medium, which includes instructions that, when executed on a terminal device, cause the terminal device to perform a flow in a proximity identification method in any embodiment corresponding to fig. 1, fig. 5, fig. 9, fig. 11, and fig. 12.
The present application further provides a computer program product comprising computer software instructions that, when run on a terminal device, cause the terminal device to perform a procedure in a proximity recognition method as in any of the embodiments corresponding to fig. 1, 5, 9, 11 and 12.
The steps in the method of the embodiments of the present application may be sequentially adjusted, combined, and deleted according to actual needs.
In the embodiments of the present application, the modules or units in the proximity identification apparatus and the terminal device may be combined, divided, and deleted according to actual needs.
In the several embodiments provided in the present application, it should be understood that the disclosed system, apparatus and method may be implemented in other manners. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the units is only one logical division, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a standalone product, may be stored in a computer-readable storage medium. Based on such understanding, the technical solution of the present application, in essence or in the part contributing to the prior art, or in whole or in part, may be embodied in the form of a software product, which is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the methods described in the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash disk, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
The above embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; and such modifications or substitutions do not depart from the spirit and scope of the corresponding technical solutions in the embodiments of the present application.

Claims (9)

1. A proximity recognition method, comprising:
converting a first signal into a second signal according to a signal threshold, wherein the first signal is an initial signal obtained according to the operation of a user on a touch screen TP;
communicating adjacent effective signals in the second signals to obtain a communication domain, wherein the effective signals correspond to the induction channels of the TP one to one;
drawing a convex polygon on the basis of the connected domain, wherein the vertex of the convex polygon is positioned at the edge of the graph of the connected domain;
when the convex polygon contains an invalid signal, determining that the part of the user for implementing the operation is the auricle of the user, wherein the invalid signal and the valid signal form the second signal;
or, alternatively,
judging whether the connected domain comprises a first graph and a second graph, wherein the first graph corresponds to a first effective signal, the second graph corresponds to a second effective signal, and the connected domain comprises the first effective signal and the second effective signal;
and if so, determining that the part of the user for implementing the operation is the face of the user.
2. The proximity recognition method of claim 1, further comprising:
calculating the distance from any point on a straight line where the first edge is located to the convex polygon along the vertical direction of the first edge of the convex polygon;
and when the distance from the first point of the straight line to the convex polygon is greater than 1 matrix element distance, determining that the invalid signal is contained in the convex polygon.
3. The proximity identification method of claim 1, wherein the first valid signal corresponds to a signal peak value greater than 1/2 of a signal peak value of the second valid signal.
4. The proximity recognition method according to any one of claims 1 to 3, wherein the type of the TP includes a capacitive screen, a resistive screen, an infrared screen, or an acoustic screen;
the first signal includes a capacitance variation corresponding to the capacitive screen, a pressure variation corresponding to the resistive screen, an infrared light variation corresponding to the infrared screen, or an acoustic wave energy variation corresponding to the acoustic wave screen.
5. An approach identification device, comprising:
the conversion unit is used for converting a first signal into a second signal, wherein the first signal is an initial signal obtained according to the operation of a user on the touch screen TP;
the communication unit is used for communicating adjacent effective signals in the second signals to obtain a communication domain, and the effective signals correspond to the induction channels of the TP one to one;
a determining unit, configured to draw a convex polygon on the basis of the connected component, where vertices of the convex polygon are located at edges of a graph of the connected component;
when the convex polygon contains an invalid signal, determining that the part of the user for implementing the operation is the auricle of the user, wherein the invalid signal and the valid signal form the second signal;
the determining unit is further configured to determine whether the connected domain includes a first graph and a second graph, where the first graph corresponds to a first valid signal, the second graph corresponds to a second valid signal, and the connected domain includes the first valid signal and the second valid signal;
and if so, determining that the part of the user for implementing the operation is the face of the user.
6. The proximity recognition apparatus according to claim 5, further comprising:
the calculating unit is used for calculating the distance from any point on a straight line where the first edge is located to the convex polygon along the vertical direction of the first edge of the convex polygon;
and the second determining unit is used for determining that invalid signals are contained in the convex polygon when the distance from the first point of the straight line to the convex polygon is greater than 1 matrix element distance.
7. The proximity identification device of claim 5, wherein the first valid signal corresponds to a signal peak value greater than 1/2 of a signal peak value of the second valid signal.
8. The proximity recognition apparatus according to any one of claims 5 to 7, wherein the type of the TP includes a capacitive screen, a resistive screen, an infrared screen, or an acoustic screen;
the first signal includes a capacitance variation corresponding to the capacitive screen, a pressure variation corresponding to the resistive screen, an infrared light variation corresponding to the infrared screen, or an acoustic wave energy variation corresponding to the acoustic wave screen.
9. A terminal device, characterized in that it comprises a proximity recognition arrangement according to any one of claims 5 to 8.
CN201810370345.4A 2018-04-23 2018-04-23 Proximity identification method and device and terminal equipment Active CN110392161B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201810370345.4A CN110392161B (en) 2018-04-23 2018-04-23 Proximity identification method and device and terminal equipment
TW107138462A TWI693532B (en) 2018-04-23 2018-10-30 Proximity identification method, apparatus and a terminal device


Publications (2)

Publication Number Publication Date
CN110392161A CN110392161A (en) 2019-10-29
CN110392161B true CN110392161B (en) 2021-09-03

Family

ID=68284688

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810370345.4A Active CN110392161B (en) 2018-04-23 2018-04-23 Proximity identification method and device and terminal equipment

Country Status (2)

Country Link
CN (1) CN110392161B (en)
TW (1) TWI693532B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113296636B (en) * 2021-06-09 2022-11-25 维沃移动通信有限公司 False touch prevention method and device and electronic equipment

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102387252A (en) * 2011-10-11 2012-03-21 惠州Tcl移动通信有限公司 Handheld equipment and method for preventing touch screen from spurious triggering in call process of handheld equipment
CN103995627B (en) * 2013-02-19 2017-08-22 比亚迪股份有限公司 A kind of detection method and device of capacitance touch screen
TWI576741B (en) * 2013-11-08 2017-04-01 禾瑞亞科技股份有限公司 Digital demodulator and digital demodulating method
TWI552045B (en) * 2013-12-05 2016-10-01 禾瑞亞科技股份有限公司 Method and apparatus for determining mistaken approaching or touching event
US10262078B2 (en) * 2014-02-10 2019-04-16 Apple Inc. Systems and methods for optimizing performance of graph operations
CN105117132B (en) * 2015-08-31 2018-08-24 广州视源电子科技股份有限公司 Touch control method and device

Also Published As

Publication number Publication date
TW201944207A (en) 2019-11-16
CN110392161A (en) 2019-10-29
TWI693532B (en) 2020-05-11


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant