CN110502160B - Touch point classification method and device, touch screen and display - Google Patents

Info

Publication number
CN110502160B
CN110502160B (application CN201910765702.1A)
Authority
CN
China
Prior art keywords
touch
points
area
touch points
shape
Prior art date
Legal status
Active
Application number
CN201910765702.1A
Other languages
Chinese (zh)
Other versions
CN110502160A (en)
Inventor
王武军
张晓娜
吴明强
张连峰
Current Assignee
Qingdao Hisense Commercial Display Co Ltd
Original Assignee
Qingdao Hisense Commercial Display Co Ltd
Application filed by Qingdao Hisense Commercial Display Co Ltd filed Critical Qingdao Hisense Commercial Display Co Ltd
Priority to CN201910765702.1A
Publication of CN110502160A
Application granted
Publication of CN110502160B

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03: Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/041: Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F 3/042: Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
    • G06F 3/0421: Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means by interrupting or reflecting a light beam, e.g. optical touch-screen

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Position Input By Displaying (AREA)

Abstract

The embodiment of the invention provides a touch point classification method and device, a touch screen and a display, relates to the field of touch control, and can extend the range of applications of the touch screen. The method comprises the following steps: acquiring the touch areas in each scanning direction on a touch screen; determining touch points according to the touch areas; acquiring characteristic parameters of the touch points; and determining the type of each touch point according to its characteristic parameters. The invention is applied to touch screens.

Description

Touch point classification method and device, touch screen and display
Technical Field
The invention relates to the field of touch control, in particular to a method and a device for classifying touch points, a touch screen and a display.
Background
A touch screen is an electronic system capable of detecting the position of a touch point within a display area, and it greatly simplifies human-computer interaction. Current touch screens mainly use infrared touch technology to determine the positions of touch points, because infrared touch technology offers strong environmental adaptability, a long service life, a large number of recognizable touch points, and other advantages.
Conventional infrared touch algorithms upload only the position information of a touch point, although a touch point has further characteristics besides its position, such as size and area. Existing touch algorithms therefore do not make good use of the characteristics of touch points to extend the applications of the touch screen. In order to open up more and richer applications for the touch screen, a method for distinguishing the types of touch points is needed.
Disclosure of Invention
The embodiment of the invention provides a method and a device for classifying touch points, a touch screen and a display, which are used for determining the types of the touch points.
In order to achieve the above purpose, the embodiment of the invention adopts the following technical scheme:
in a first aspect, a method for classifying touch points is provided, including:
acquiring the touch areas in each scanning direction on a touch screen; wherein a scanning direction is the direction of a group of scanning light paths with the same slope on the touch screen, and within each touch area the scanning light paths in the scanning direction corresponding to that touch area are blocked;
determining a touch point according to the touch area;
acquiring characteristic parameters of the touch points; the characteristic parameters of a touch point include: the number of layers of the touch point, the shading number of the touch point, the area of the touch point and the shape of the touch point; the number of layers of a touch point is the number of touch areas to which the touch point belongs, and the shading number of a touch point is the number of scanning light paths contained in the touch area that, among the touch areas to which the touch point belongs, contains the most scanning light paths in its scanning direction;
and determining the type of the touch point according to the characteristic parameters of the touch point.
Whereas existing touch screen technology uploads only the position information of touch points, the method and device provided here add feature calculation and type determination for the touch points, which opens up the possibility of more applications of the touch screen.
In a second aspect, an apparatus for classifying a touch point is provided, including: an acquisition unit and a determination unit;
the acquisition unit is used for acquiring the touch areas in each scanning direction on the touch screen; wherein a scanning direction is the direction of a group of scanning light paths with the same slope on the touch screen, and within each touch area the scanning light paths in the scanning direction corresponding to that touch area are blocked;
the determining unit is used for determining a touch point according to the touch area acquired by the acquiring unit;
the acquisition unit is also used for acquiring the characteristic parameters of the touch points; the characteristic parameters of a touch point include: the number of layers of the touch point, the shading number of the touch point, the area of the touch point and the shape of the touch point; the number of layers of a touch point is the number of touch areas to which the touch point belongs, and the shading number of a touch point is the number of scanning light paths contained in the touch area that, among the touch areas to which the touch point belongs, contains the most scanning light paths in its scanning direction;
and the determining unit is also used for determining the type of the touch point according to the characteristic parameters of the touch point acquired by the acquiring unit.
In a third aspect, a touch screen is provided, which comprises the touch point classification device of the second aspect.
In a fourth aspect, a display is provided comprising a touch screen as in the third aspect.
In a fifth aspect, an apparatus for classifying touch points is provided, including: a processor, a memory, and a communication interface; the communication interface is used for the touch point classification apparatus to communicate with other equipment or a network; the memory is used for storing one or more programs, the one or more programs including computer-executable instructions; when the touch point classification apparatus runs, the processor executes the computer-executable instructions stored in the memory so that the apparatus performs the method for classifying touch points according to the first aspect.
In a sixth aspect, there is provided a computer readable storage medium storing one or more programs, the one or more programs comprising instructions, which when executed by a computer, cause the computer to perform the method of classifying touch points as in the first aspect.
In a seventh aspect, a computer program product is provided comprising instructions which, when run on a computer, cause the computer to perform the method of classifying touch points according to the first aspect.
The embodiment of the invention provides a method and a device for classifying touch points, a touch screen and a display, wherein the method comprises the following steps: acquiring the touch areas in each scanning direction on a touch screen; determining touch points according to the touch areas; acquiring characteristic parameters of the touch points; and determining the type of each touch point according to its characteristic parameters. The characteristic parameters of a touch point comprise the number of layers, the shading number, the area and the shape of the touch point. According to the technical scheme provided by the embodiment of the invention, the characteristic parameters of each touch point are obtained after the touch point is determined, so that the type of the touch point can be determined according to those characteristic parameters; furthermore, different operations can be matched to different types of touch points, which opens up more and richer applications of the touch screen.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, it is obvious that the drawings in the following description are only some embodiments of the present invention, and for those skilled in the art, other drawings can be obtained according to the drawings without creative efforts.
FIG. 1 is a schematic diagram of an infrared touch screen;
FIG. 2a is a schematic diagram of a touch area provided by an embodiment of the present invention;
FIG. 2b is a schematic diagram of another touch area provided by an embodiment of the present invention;
FIG. 3 is a schematic diagram illustrating a method for classifying touch points according to an embodiment of the present invention;
FIG. 4 is a schematic diagram illustrating another method for classifying touch points according to an embodiment of the present invention;
FIG. 5 is an application scenario of a touch screen according to an embodiment of the present invention;
FIG. 6 is another application scenario of the touch screen provided in the embodiment of the present invention;
FIG. 7 is a further application scenario of the touch screen according to the embodiment of the present invention;
FIG. 8 is a flowchart illustrating touch point type determination according to an embodiment of the present invention;
FIG. 9 is a schematic diagram of a touch point classification device according to an embodiment of the present invention;
FIG. 10 is a schematic diagram of another touch point classification device according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be obtained by a person skilled in the art without making any creative effort based on the embodiments in the present invention, belong to the protection scope of the present invention.
It should be noted that, in the embodiments of the present invention, words such as "exemplary" or "for example" are used to indicate examples, illustrations or explanations. Any embodiment or design described as "exemplary" or "for example" in an embodiment of the present invention should not be construed as preferred or advantageous over other embodiments or designs. Rather, the words "exemplary" or "for example" are intended to present related concepts in a concrete fashion.
It should be noted that, in the embodiments of the present invention, the terms "of", "relevant" and "corresponding" may sometimes be used interchangeably; when the distinction is not emphasized, their intended meanings are consistent.
For the convenience of clearly describing the technical solutions of the embodiments of the present invention, in the embodiments of the present invention, the words "first", "second", and the like are used for distinguishing the same items or similar items with basically the same functions and actions, and those skilled in the art can understand that the words "first", "second", and the like are not limited in number or execution order.
The principle of the infrared touch screen is shown in fig. 1. The infrared touch screen comprises two infrared emitting edges and two infrared receiving edges, where each infrared emitting edge faces an infrared receiving edge. Infrared emitters are arranged on the infrared emitting edges, and infrared receivers are arranged on the infrared receiving edges. As shown in fig. 1, one infrared emitter may emit infrared rays in a plurality of directions.
In the prior art, only the position information of a touch point is determined after the touch point is detected, which limits the possible applications of the touch screen. A touch point, however, carries more than position information: its size and shape can also convey information. For example, when a stylus is used on the touch screen, a small touch point with a certain shape is obtained; when a finger is used, a touch point that is larger than that of a stylus and has a certain shape is obtained; and when a false touch occurs, such as a sleeve or arm pressing against the touch screen, an even larger touch point is obtained. In all of these touch scenarios, the type of the touch point can be determined from information such as its size and shape.
According to the technical scheme provided by the invention, when a touch point is detected, its type can be judged according to its characteristic parameters. The type information of the touch point is uploaded to a processing system along with the position information, where the processing system may be an application program that calls the touch screen, or a unit responsible for information processing such as the main control system of the touch screen. In the prior art, when a user wants to draw a stylus line, the user generally has to select the stylus mode first and only then start drawing with the stylus. With the present invention, the type of a touch point is obtained at the same time as its position, so when the user wants to draw in the stylus style, the user simply draws on the touch screen with the stylus; while the line is being drawn, the touch screen determines from the type of each touch point that the current drawing mode is the stylus mode, and the user no longer needs to select the stylus mode actively.
The following explains the terms of the present invention:
scanning light path: infrared rays emitted from the infrared emitter on the infrared emission side.
Scanning direction: the direction of a group of scanning light paths with the same slope on the touch screen. An infrared emitter may emit infrared rays at different angles, so a particular infrared emitter corresponds to scanning light paths at several different angles, and each of these angles is referred to as a scanning direction.
Touch area: a specific scanning direction corresponds to a group of parallel scanning light paths, when a touch occurs, a touch point can shield a plurality of continuous parallel scanning light paths, and in the same scanning direction, the area between the continuous shielded scanning light paths is a touch area. And the scanning light path of the scanning direction of the corresponding touch area in each touch area is shielded. As shown in fig. 2a, b, c, and d are 4 scanning optical paths in the same scanning direction that are blocked by a touch point, and a shadow area between the a scanning optical path and the d scanning optical path is a touch area. For convenience of description, when a touch area is shown in the subsequent figures, only the scanning light path at the outermost side of the touch area is drawn, that is, only the a scanning light path and the d scanning light path are drawn;
In another case, when the touch point is small and blocks only one scanning light path, the area between the two scanning light paths adjacent to the blocked one in the same scanning direction is taken as the touch area. As shown in fig. 2b, when the touch point is small and blocks only scanning light path b, the area between scanning light paths a and c, that is, the shaded area in fig. 2b, is determined as the touch area.
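For illustration only, the following Python sketch shows one way the touch areas of a single scanning direction could be represented and extracted from the pattern of blocked scanning light paths; it is a simplified reading of the above description, not the patent's actual implementation, and the TouchArea fields and path indexing are assumptions.

from dataclasses import dataclass
from typing import List

@dataclass(frozen=True)
class TouchArea:
    direction: int      # index of the scanning direction
    first_path: int     # index of the first blocked scanning light path
    last_path: int      # index of the last blocked scanning light path

    @property
    def num_paths(self) -> int:
        # number of scanning light paths contained in this touch area
        return self.last_path - self.first_path + 1

def touch_areas_in_direction(direction: int, blocked: List[bool]) -> List[TouchArea]:
    """Group consecutive blocked scanning light paths of one scanning direction
    into touch areas. A single blocked path is widened to the area between its
    two neighbouring paths, as described for fig. 2b."""
    areas, start = [], None
    for i, is_blocked in enumerate(blocked):
        if is_blocked and start is None:
            start = i
        elif not is_blocked and start is not None:
            if i - start == 1:  # only one path blocked: use the neighbouring paths
                areas.append(TouchArea(direction, max(start - 1, 0), i))
            else:
                areas.append(TouchArea(direction, start, i - 1))
            start = None
    if start is not None:
        areas.append(TouchArea(direction, start, len(blocked) - 1))
    return areas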
To solve the problem that the prior art cannot determine the type of a touch point, as shown in fig. 3, an embodiment of the present invention provides a method for classifying touch points, which is executed by a touch screen or a corresponding control unit (e.g., a CPU or an MCU). In one implementation, the method provided by the embodiment of the present invention includes steps S101-S106.
S101: acquiring the touch areas in each scanning direction on the touch screen.
S102: determining touch points according to the touch areas.
For example, in an embodiment of the present invention, as shown in fig. 4, S102 may specifically include:
and S1021, determining the scanning direction in which the touch areas other than the invalid touch area are determined to be the most as the target direction.
It should be noted that, when several scanning directions contain the same number of touch areas, any one of them may be selected as the target direction. As shown in fig. 5, touch areas are detected in three scanning directions A, B and C in total: two touch areas a1 and a2 in scanning direction A, two touch areas b1 and b2 in scanning direction B, and two touch areas c1 and c2 in scanning direction C. Each scanning direction contains two touch areas, so any one of the scanning directions may be determined as the target direction.
If a touch area determined in the target direction has no intersection area with the touch areas in other scanning directions, that touch area is determined to be an invalid touch area; the target direction is then re-determined, i.e., the scanning direction that contains the most touch areas, excluding invalid touch areas, is determined as the target direction.
S1022: acquiring the intersection areas between the touch areas in the target direction and the touch areas in other scanning directions.
Since the scanning directions differ, touch areas may intersect. When two touch areas intersect, the intersection area may be a touch point.
S1023: determining as a touch point, within each touch area in the target direction, the intersection area that belongs to the most touch areas.
Touch areas in the same scanning direction cannot intersect, so only touch areas in different scanning directions intersect. Consequently, the more touch areas an intersection area belongs to, the more scanning directions in which it has been detected, and the more likely it is to be a real touch point. The technical solution provided by the embodiment of the present invention therefore determines, as the touch point, the intersection area that belongs to the largest number of touch areas among the intersection areas within each touch area in the target direction.
For example, as shown in fig. 6, scanning direction D is determined to be the target direction. In the d1 touch area, intersection area No. 2 belongs to 4 touch areas, and no other intersection area in the d1 touch area belongs to more than 4 touch areas, so intersection area No. 2 is determined as a touch point. In the d2 touch area, intersection area No. 3 belongs to 3 touch areas, and no other intersection area in the d2 touch area belongs to more than 3 touch areas, so intersection area No. 3 is determined as a touch point.
S1024: determining all touch areas to which the touch points belong as effective touch areas.
For example, as shown in fig. 6, after intersection areas No. 2 and No. 3 are determined to be touch points, the touch areas to which intersection area No. 2 belongs and the touch areas to which intersection area No. 3 belongs are determined as effective touch areas. After step S1024, the remaining touch areas are as shown in fig. 7.
S1025: determining as the target direction the scanning direction that contains the most touch areas, excluding the effective touch areas.
After execution of S1024, the remaining touch areas are determined. The target direction is determined in the same manner as in S1021.
As shown in fig. 7, touch areas are detected in two scanning directions, A and C; each scanning direction contains one touch area, so either scanning direction may be selected as the target direction.
Scanning direction A is determined to be the target direction. The two touch areas have only one intersection area, namely intersection area No. 5 in the figure, so there is no need to compare the numbers of touch areas to which intersection areas belong, and this intersection area is directly determined as a touch point.
Steps S1022 to S1025 are repeated until no intersection area remains, at which point all touch points have been determined.
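A minimal Python sketch of the loop S1021-S1025 is given below, assuming two geometric helpers that are not spelled out here: intersection(a, b), which returns the intersection region of two touch areas from different scanning directions (or None), and contains(area, region), which tests whether a touch area contains an intersection region. Each touch area is assumed to carry a .direction attribute as in the earlier sketch; this is an illustrative reading of the steps above, not a definitive implementation.

def determine_touch_points(areas, intersection, contains):
    """Sketch of S1021-S1025: repeatedly pick the scanning direction with the
    most remaining touch areas, turn the best intersection area inside each of
    its touch areas into a touch point, and exclude invalid/effective areas."""
    excluded = set()                      # indices of invalid or effective touch areas
    touch_points, membership = [], []     # membership[k]: touch areas point k belongs to
    while True:
        remaining = [i for i in range(len(areas)) if i not in excluded]
        if not remaining:
            break
        by_dir = {}
        for i in remaining:
            by_dir.setdefault(areas[i].direction, []).append(i)
        target_dir = max(by_dir, key=lambda d: len(by_dir[d]))      # S1021 / S1025
        for i in by_dir[target_dir]:
            if i in excluded:
                continue
            # S1022: intersections of this touch area with areas of other directions
            candidates = [intersection(areas[i], areas[j])
                          for j in remaining
                          if j not in excluded and areas[j].direction != target_dir]
            candidates = [r for r in candidates if r is not None]
            if not candidates:
                excluded.add(i)           # no intersection at all: invalid touch area
                continue
            # S1023: the intersection area that belongs to the most touch areas
            owners = [{j for j in remaining if contains(areas[j], r)} for r in candidates]
            best = max(range(len(candidates)), key=lambda k: len(owners[k]))
            touch_points.append(candidates[best])
            membership.append(owners[best])
            excluded.update(owners[best])  # S1024: those areas become effective
    return touch_points, membership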
S103: acquiring the characteristic parameters of the touch points.
The characteristic parameters of a touch point comprise: the number of layers of the touch point, the shading number of the touch point, the area of the touch point and the shape of the touch point. The number of layers of a touch point is the number of touch areas to which the touch point belongs, and the shading number of a touch point is the number of scanning light paths contained in the touch area that, among the touch areas to which the touch point belongs, contains the most scanning light paths in its scanning direction.
as shown in fig. 5, the touch area with a1, b1 and c1 in different scanning directions is blocked at the touch point 1, so the number of layers of the touch point 1 is 3. As can be seen from fig. 5, the b1 touch area includes 4 scanning optical paths, the a1 touch area includes 3 scanning optical paths, and the c1 touch area includes 4 scanning optical paths. Therefore, if b1 and c1 are the largest touch areas in the touch areas to which the touch point 1 belongs at the same time, the number of scanning optical paths in b1 or c1 is the number of light-shielded touch point 1, that is, the number of light-shielded touch point 1 is 4.
Since a touch point is an intersection region of touch areas, it can only be a convex polygon, and therefore the area and shape of the touch point can be calculated mathematically from the coordinates of its vertices. The coordinate axes may be taken along any two adjacent edges of the touch screen.
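As an illustration, the characteristic parameters of one touch point could be computed as in the sketch below, reusing the TouchArea objects from the earlier sketch: the number of layers is the count of touch areas the point belongs to, the shading number is the largest path count among them, and the area follows from the shoelace formula applied to the ordered vertices of the convex intersection polygon. The argument names are assumptions.

def touch_point_features(point_areas, vertices):
    """point_areas: the touch areas the touch point belongs to (with .num_paths);
    vertices: the (x, y) vertices of the convex intersection polygon, in order."""
    layers = len(point_areas)                        # number of layers
    shading = max(a.num_paths for a in point_areas)  # shading number
    # shoelace formula for the area of a polygon given its ordered vertices
    area = 0.0
    n = len(vertices)
    for k in range(n):
        x1, y1 = vertices[k]
        x2, y2 = vertices[(k + 1) % n]
        area += x1 * y2 - x2 * y1
    area = abs(area) / 2.0
    return layers, shading, area, list(vertices)     # the vertex list is the shape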
S104: determining the type of the touch point according to the characteristic parameters of the touch point.
For example, the determination method provided by the embodiment of the present invention is shown in fig. 8:
judging whether the characteristic parameters of the touch points meet a first preset condition or not; the first preset condition is as follows: the number of layers of the touch points is smaller than a first threshold value, the shading number of the touch points is smaller than a second threshold value, and the area of the touch points is smaller than a third threshold value;
if the characteristic parameters of the touch points meet a first preset condition, judging whether the shapes of the touch points are similar to a first preset shape;
If the characteristic parameters of the touch point meet the first preset condition and the shape of the touch point is similar to the first preset shape, the touch point is determined to be a first type touch point. Alternatively, it may first be judged whether the shape of the touch point is similar to the first preset shape, and then whether the characteristic parameters of the touch point meet the first preset condition.
Judging whether the characteristic parameters of the touch points meet a second preset condition or not; the second preset condition is as follows: the number of layers of the touch points is greater than or equal to a first threshold value and smaller than a fourth threshold value, the shading number of the touch points is greater than or equal to a second threshold value and smaller than a fifth threshold value, and the area of the touch points is greater than or equal to a third threshold value and smaller than a sixth threshold value;
if the characteristic parameters of the touch points meet a second preset condition, judging whether the shapes of the touch points are similar to the second preset shape;
If the characteristic parameters of the touch point meet the second preset condition and the shape of the touch point is similar to the second preset shape, the touch point is determined to be a second type touch point. Alternatively, it may first be judged whether the shape of the touch point is similar to the second preset shape, and then whether the characteristic parameters of the touch point meet the second preset condition.
Judging whether the characteristic parameters of the touch points meet a third preset condition or not; the third preset condition is as follows: the number of layers of the touch points is greater than or equal to a fourth threshold value, the shading number of the touch points is greater than or equal to a fifth threshold value, and the area of the touch points is greater than or equal to a sixth threshold value;
and if the characteristic parameters of the touch points meet a third preset condition, determining that the type of the touch points is a third type of touch points.
It should be noted that, the sequence of the determination process of the three types of touch points may be performed simultaneously or may be ordered at will, and no specific limitation is made herein.
If the characteristic parameters of a touch point meet none of the judgment conditions of the three touch point types, the type of the touch point is not determined.
Illustratively, the first threshold may generally take one half of the number of scanning directions, the second threshold may take 2, and the third threshold may take 9 mm²; the fourth threshold may generally take the number of scanning directions, the fifth threshold may take 4, and the sixth threshold may take 100 mm². It should be noted that the specific numerical values of the first threshold, the second threshold, the third threshold, the fourth threshold, the fifth threshold, and the sixth threshold are only examples; other numerical values may also be used in practice, and no specific limitation is made here.
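Using these example thresholds, the three-way decision of fig. 8 could be sketched as follows; shape_similar() stands in for the shape comparison described below, and the threshold values, function names and return labels are illustrative assumptions rather than the patent's actual protocol.

def classify_touch_point(layers, shading, area_mm2, shape,
                         num_directions, shape_similar,
                         first_shape="triangle", second_shape="circle"):
    t1, t4 = num_directions / 2.0, num_directions   # layer thresholds (1st / 4th)
    t2, t5 = 2, 4                                    # shading thresholds (2nd / 5th)
    t3, t6 = 9.0, 100.0                              # area thresholds in mm² (3rd / 6th)
    if layers < t1 and shading < t2 and area_mm2 < t3:
        if shape_similar(shape, first_shape):        # first preset condition + shape
            return "type 1 (stylus)"
    elif t1 <= layers < t4 and t2 <= shading < t5 and t3 <= area_mm2 < t6:
        if shape_similar(shape, second_shape):       # second preset condition + shape
            return "type 2 (finger / board eraser)"
    elif layers >= t4 and shading >= t5 and area_mm2 >= t6:
        return "type 3 (accidental touch)"           # third preset condition, shape ignored
    return None   # type not determined: only the coordinates are uploaded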
In the specific scenario defined by the above thresholds, the number of layers, the shading number and the area of a first type touch point are the smallest, so a first type touch point is most likely a fine-pen (stylus) type touch point; whether it really is one is then determined according to its shape. It should be noted that the touch point generated by the tip of a stylus may have other shapes; the first preset shape is taken to be a triangle here only for convenience of describing the technical solution of the embodiment of the present invention.
The number of layers, the shading number and the area of a second type touch point match the conditions of a finger touch, so a second type touch point is most likely a finger/board-eraser type touch point; whether it really is one is then determined according to its shape. It should be noted that the touch point generated by a fingertip may also have other shapes, such as an ellipse; the second preset shape is taken to be a circle here only for convenience of describing the technical solution of the embodiment of the present invention.
The number of layers, the shading number and the area of a third type touch point are all large, so a third type touch point is most likely generated by a sleeve, a palm, an elbow or another object; such a large touch point is usually the result of an accidental touch. Since the touch is accidental, the shape of the touch point is not fixed, and the shape parameter therefore need not be considered when determining whether a touch point belongs to the third type.
For example, determining whether the shape of the touch point is similar to the first preset shape may be performed by:
calculating the similarity between the shape of the touch point and a first preset shape, and if the similarity is greater than a preset threshold value, judging that the shape of the touch point is similar to the first preset shape;
Determining whether the shape of the touch point is similar to the second preset shape may be accomplished by:
and calculating the similarity between the shape of the touch point and a second preset shape, and if the similarity is greater than a preset threshold value, judging that the shape of the touch point is similar to the second preset shape.
Illustratively, a mean hash algorithm may be used to calculate the similarity between two shapes. The technical scheme provided by the embodiment of the invention does not limit the method for calculating the similarity between the two shapes, as long as the similarity between the shapes can be calculated.
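One possible realisation of this shape comparison is sketched below: both the touch point polygon and the preset shape are rasterised onto a small grid, a mean-hash bit string is derived from each grid by thresholding at its mean, and the similarity is the fraction of matching bits. The rasterise() helper and the 8x8 grid size are assumptions; the patent does not fix these details.

def mean_hash(grid):
    """grid: a size x size list of lists of numbers (a rasterised shape)."""
    flat = [v for row in grid for v in row]
    mean = sum(flat) / len(flat)
    return [1 if v > mean else 0 for v in flat]   # one bit per cell

def shape_similarity(shape_a, shape_b, rasterise, size=8):
    """Similarity in [0, 1]: the fraction of hash bits on which the shapes agree."""
    ha = mean_hash(rasterise(shape_a, size))
    hb = mean_hash(rasterise(shape_b, size))
    matches = sum(1 for x, y in zip(ha, hb) if x == y)
    return matches / len(ha)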
S105: performing a ghost point removal operation, and tracking the trajectories of the touch points that remain after the ghost points are removed.
Specifically, in the case of multi-point touch, for example two-point touch, the two touch points on the touch screen produce two abscissa values and two ordinate values, and combining them yields position information for four candidate touch points, only two of which are real. Therefore, after the touch points are determined, the ghost points among them are removed. For example, the number of scanning light paths that pass through only a given candidate touch point and through no other candidate may be counted; for each candidate, if this number is greater than a preset threshold, the candidate is determined to be a real touch point, otherwise it is determined to be a ghost point. Deleting the ghost points avoids misjudging touch points and improves the accuracy of touch point determination.
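A sketch of this ghost point test, assuming a geometric predicate passes_through(path, point) and an example threshold of two exclusive paths (both assumptions), might look like this:

def remove_ghost_points(candidates, blocked_paths, passes_through, min_exclusive=2):
    """Keep a candidate touch point only if more than min_exclusive blocked
    scanning light paths pass through it and through no other candidate."""
    real_points = []
    for p in candidates:
        exclusive = sum(
            1 for path in blocked_paths
            if passes_through(path, p)
            and not any(passes_through(path, q) for q in candidates if q is not p))
        if exclusive > min_exclusive:
            real_points.append(p)    # enough exclusive paths: a real touch point
        # otherwise the candidate is treated as a ghost point and discarded
    return real_points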
The trajectory tracking of the touch points may use any method in the prior art. For example, after the touch points of one scanning period are acquired, the touch points of the next scanning period are acquired, and the distance of every touch point pair across the two adjacent scanning periods is calculated, where a touch point pair contains one touch point from each of the two periods. All touch point pairs are then fully arranged to obtain at least two touch point pair sequences, where the number of pairs in a sequence equals the number of touch points in the first scanning period, the first scanning period being the one of the two adjacent periods with fewer touch points. The sum of distances of the pairs in each sequence is calculated, and the sequence with the minimum sum is taken as the matching pair sequence; the line connecting the two touch points of any pair in the matching sequence is a touch trajectory. The touch trajectories reflect the movement of the touch points.
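The pairing over two adjacent scanning periods described above can be sketched by enumerating the full arrangements of touch point pairs and keeping the one with the smallest distance sum; touch points are assumed to be (x, y) tuples, and this brute-force enumeration is only illustrative for the small point counts of a touch screen.

from itertools import permutations
from math import hypot

def match_touch_points(prev_points, curr_points):
    """Pair the touch points of two adjacent scanning periods so that the sum
    of pairwise distances is minimal; each pair is one segment of a trajectory."""
    # the period with fewer touch points fixes the number of pairs
    if len(prev_points) > len(curr_points):
        prev_points, curr_points = curr_points, prev_points
    best_pairs, best_cost = None, float("inf")
    for perm in permutations(curr_points, len(prev_points)):
        pairs = list(zip(prev_points, perm))
        cost = sum(hypot(a[0] - b[0], a[1] - b[1]) for a, b in pairs)
        if cost < best_cost:
            best_pairs, best_cost = pairs, cost
    return best_pairs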
S106: determining the logical coordinates of the touch points, and uploading the logical coordinates and types of the touch points to the processing system.
The device coordinates of the centroid of the touch point's outline are determined from the outline, where device coordinates are expressed as horizontal and vertical distances from the upper-left corner of the screen. After the device coordinates of the touch point are obtained, they are converted into logical coordinates and output.
Touch point type information can be distinguished by adding an identifier in the protocol header. For touch points that do not meet any of the type judgment conditions, the touch screen uploads only the coordinate information to the upper-layer processing system, and the touch point type field is left empty. As described above, the processing system may be an application program that calls the touch screen, or a unit responsible for information processing such as the main control system of the touch screen.
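As an illustration of S106, the sketch below computes the centroid of the touch point outline, scales the device coordinates (distances from the upper-left corner) into logical coordinates, and packs a small report whose header carries the type identifier. The logical coordinate range and the report layout are assumptions; the patent does not specify the protocol format.

def polygon_centroid(vertices):
    """Centroid (centre of gravity) of a convex polygon from its ordered vertices."""
    a = cx = cy = 0.0
    n = len(vertices)
    for k in range(n):
        x1, y1 = vertices[k]
        x2, y2 = vertices[(k + 1) % n]
        cross = x1 * y2 - x2 * y1
        a += cross
        cx += (x1 + x2) * cross
        cy += (y1 + y2) * cross
    a *= 0.5
    return cx / (6.0 * a), cy / (6.0 * a)

def make_report(vertices, point_type, screen_w_mm, screen_h_mm, logical_max=32767):
    """Convert the centroid's device coordinates (mm from the upper-left corner)
    into logical coordinates and attach the type identifier (empty if undetermined)."""
    x_mm, y_mm = polygon_centroid(vertices)
    lx = round(x_mm / screen_w_mm * logical_max)
    ly = round(y_mm / screen_h_mm * logical_max)
    return {"type": point_type or "", "x": lx, "y": ly}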
The embodiment of the invention provides a method for classifying touch points, which comprises the following steps: acquiring the touch areas in each scanning direction on a touch screen; determining touch points according to the touch areas; acquiring characteristic parameters of the touch points; and determining the type of each touch point according to its characteristic parameters. The characteristic parameters of a touch point comprise the number of layers, the shading number, the area and the shape of the touch point. According to the technical scheme provided by the embodiment of the invention, the characteristic parameters of each touch point are obtained after the touch point is determined, so that the type of the touch point can be determined according to those characteristic parameters; furthermore, different operations can be matched to different types of touch points, which broadens the range of possible applications of the touch screen.
In order to better implement the method for classifying touch points provided in the foregoing embodiments, an embodiment of the present invention further provides a display, which includes a touch screen, where the touch screen includes a touch point classification device, and the touch point classification device is capable of implementing the method for classifying touch points.
Illustratively, referring to fig. 9, the touch point classification apparatus 90 according to an embodiment of the present invention includes: an acquisition unit 91 and a determination unit 92;
an acquiring unit 81 configured to acquire a touch area in each scanning direction on the touch screen; the scanning direction is the direction of a group of scanning light paths with the same slope on the touch screen, and the scanning light paths in the scanning direction of the corresponding touch areas in each touch area are blocked;
a determining unit 92, configured to determine a touch point according to the touch area acquired by the acquiring unit 91;
the acquiring unit 91 is further configured to acquire the characteristic parameters of the touch points; the characteristic parameters of a touch point include: the number of layers of the touch point, the shading number of the touch point, the area of the touch point and the shape of the touch point; the number of layers of a touch point is the number of touch areas to which the touch point belongs, and the shading number of a touch point is the number of scanning light paths contained in the touch area that, among the touch areas to which the touch point belongs, contains the most scanning light paths in its scanning direction;
the determining unit 92 is further configured to determine the type of the touch point according to the characteristic parameters of the touch point acquired by the acquiring unit 91.
Optionally, the determining unit 92 is specifically configured to:
determining, as the target direction, the scanning direction that contains the most touch areas acquired by the acquiring unit 91, excluding touch areas determined to be invalid touch areas;
determining as a touch point, within each touch area in the target direction, the intersection area that belongs to the most touch areas, according to the intersection areas between the touch areas in the target direction acquired by the acquiring unit 91 and the touch areas in other scanning directions;
determining all touch areas to which the touch points belong as effective touch areas;
determining, as the target direction, the scanning direction that contains the most touch areas acquired by the acquiring unit 91, excluding touch areas determined to be effective touch areas.
Optionally, the determining unit 92 is specifically configured to:
judging whether the characteristic parameters of the touch points acquired by the acquisition unit 91 meet a first preset condition; wherein the first preset condition is as follows: the number of layers of the touch points is smaller than a first threshold value, the shading number of the touch points is smaller than a second threshold value, and the area of the touch points is smaller than a third threshold value;
if the characteristic parameter of the touch point acquired by the acquiring unit 91 meets the first preset condition, determining whether the shape of the touch point is similar to a first preset shape;
if the characteristic parameter of the touch point acquired by the acquiring unit 91 meets the first preset condition and the shape of the touch point is similar to the first preset shape, determining that the type of the touch point is a first type touch point; or it may be determined whether the shape of the touch point is similar to the first preset shape, and then it is determined whether the characteristic parameter of the touch point acquired by the acquiring unit 91 satisfies the first preset condition.
Judging whether the characteristic parameters of the touch points acquired by the acquisition unit 91 meet a second preset condition; wherein the second preset condition is as follows: the number of layers of the touch points is greater than or equal to the first threshold and smaller than a fourth threshold, the shading number of the touch points is greater than or equal to the second threshold and smaller than a fifth threshold, and the area of the touch points is greater than or equal to the third threshold and smaller than a sixth threshold;
if the characteristic parameter of the touch point acquired by the acquiring unit 91 meets the second preset condition, judging whether the shape of the touch point is similar to a second preset shape;
if the characteristic parameter of the touch point acquired by the acquiring unit 91 meets a second preset condition and the shape of the touch point is similar to a second preset shape, determining that the type of the touch point is a second type of touch point; or it may be determined whether the shape of the touch point is similar to a second preset shape, and then it is determined whether the characteristic parameter of the touch point acquired by the acquiring unit 91 satisfies a second preset condition.
Judging whether the characteristic parameters of the touch points acquired by the acquisition unit 91 meet a third preset condition; wherein the third preset condition is as follows: the number of layers of the touch points is greater than or equal to the fourth threshold, the shading number of the touch points is greater than or equal to the fifth threshold, and the area of the touch points is greater than or equal to the sixth threshold;
if the characteristic parameter of the touch point acquired by the acquiring unit 91 meets a third preset condition, it is determined that the type of the touch point is a third type of touch point. If the characteristic parameters of the touch points acquired by the acquiring unit 91 do not meet the judgment conditions of the three types of touch point types, the type of the touch point is not judged.
Optionally, the determining unit 92 determines whether the shape of the touch point acquired by the acquiring unit 91 is similar to a first preset shape, and specifically includes:
calculating the similarity between the shape of the touch point acquired by the acquiring unit 91 and a first preset shape, and if the similarity is greater than a preset threshold, judging that the shape of the touch point is similar to the first preset shape;
the determining unit 92 determines whether the shape of the touch point acquired by the acquiring unit 91 is similar to the second shape, and specifically includes:
the similarity between the shape of the touch point acquired by the acquiring unit 91 and the second preset shape is calculated, and if the similarity is greater than a preset threshold, it is determined that the shape of the touch point is similar to the second preset shape.
Optionally, the touch point classifying device 90 further includes a processing unit 93, where the processing unit 93 is configured to perform a ghost point removing operation, and perform trajectory tracking on the touch point from which the ghost point is removed.
Optionally, the determining unit 92 is further configured to determine the logical coordinates of the touch point.
Optionally, the touch point classifying device 90 further includes a sending unit 94, and the sending unit 94 is configured to upload the logical coordinates of the touch point and the type of the touch point determined by the determining unit 92 to the processing system. The processing system may be an application program calling the touch screen or a unit such as a main control system of the touch screen, which is responsible for information processing.
The touch point classification device provided by the embodiment of the invention comprises an acquisition unit and a determination unit. The acquisition unit is used for acquiring the touch areas in each scanning direction on the touch screen; the determining unit is used for determining touch points according to the touch areas acquired by the acquisition unit; the acquisition unit is also used for acquiring the characteristic parameters of the touch points; and the determining unit is also used for determining the type of each touch point according to the characteristic parameters acquired by the acquisition unit. According to the technical scheme provided by the embodiment of the invention, the characteristic parameters of each touch point are obtained after the determining unit determines the touch point, so that the determining unit can determine the type of the touch point according to those characteristic parameters; furthermore, different operations can be matched to different types of touch points, which broadens the range of possible applications of the touch screen.
Referring to fig. 10, another touch point classifying device according to an embodiment of the present invention includes a memory 101, a processor 102, a bus 103, and a communication interface 104; the memory 101 is used for storing computer execution instructions, and the processor 102 is connected with the memory 101 through a bus 103; when the touch point classification device is operated, the processor 102 executes computer-executable instructions stored in the memory 101 to cause the touch point classification device to perform the touch point classification method provided in the above-described embodiments.
In particular implementations, processor 102 (102-1 and 102-2) may include one or more CPUs, such as CPU0 and CPU1 shown in FIG. 10, as one embodiment. And as an example, the means for classifying touch points may include a plurality of processors 102, such as processor 102-1 and processor 102-2 shown in FIG. 10. Each of the processors 102 may be a single-core processor (single-CPU) or a multi-core processor (multi-CPU). Processor 102 may refer herein to one or more devices, circuits, and/or processing cores that process data (e.g., computer program instructions).
The memory 101 may be a read-only memory (ROM) or another type of static storage device that can store static information and instructions, a random access memory (RAM) or another type of dynamic storage device that can store information and instructions, an electrically erasable programmable read-only memory (EEPROM), a compact disc read-only memory (CD-ROM) or other optical disc storage (including compact disc, laser disc, optical disc, digital versatile disc, Blu-ray disc, etc.), magnetic disk storage media or other magnetic storage devices, or any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer, but is not limited to these. The memory 101 may be self-contained and coupled to the processor 102 via the bus 103. The memory 101 may also be integrated with the processor 102.
In a specific implementation, the memory 101 is used for storing data in the present application and computer-executable instructions corresponding to software programs for executing the present application. The processor 102 may operate or execute software programs stored in the memory 101 and invoke various functions of the sorting means for the touch points, data stored in the memory 101.
The communication interface 104 may be any device, such as a transceiver, for communicating with other devices or communication networks, such as a control system, a Radio Access Network (RAN), a Wireless Local Area Network (WLAN), and the like. The communication interface 104 may include a receiving unit implementing a receiving function and a transmitting unit implementing a transmitting function.
The bus 103 may be an industry standard architecture (ISA) bus, a peripheral component interconnect (PCI) bus, an extended industry standard architecture (EISA) bus, or the like. The bus 103 may be divided into an address bus, a data bus, a control bus, etc. For ease of illustration, only one thick line is shown in FIG. 10, but this does not mean that there is only one bus or one type of bus.
The embodiment of the present invention further provides a computer storage medium, where the computer storage medium includes computer execution instructions, and when the computer execution instructions are run on a computer, the computer is enabled to execute the method for classifying touch points provided in the above embodiment.
The embodiment of the present invention further provides a computer program, which can be directly loaded into the memory and contains software codes, and after the computer program is loaded and executed by the computer, the method for classifying touch points provided by the above embodiment can be implemented.
Those skilled in the art will recognize that, in one or more of the examples described above, the functions described in this invention may be implemented in hardware, software, firmware, or any combination thereof. When implemented in software, the functions may be stored on or transmitted over as one or more instructions or code on a computer-readable medium. Computer-readable media includes both computer storage media and communication media including any medium that facilitates transfer of a computer program from one place to another. A storage media may be any available media that can be accessed by a general purpose or special purpose computer.
Through the above description of the embodiments, it is clear to those skilled in the art that, for convenience and simplicity of description, the foregoing division of the functional modules is merely used as an example, and in practical applications, the above function distribution may be completed by different functional modules according to needs, that is, the internal structure of the device may be divided into different functional modules to complete all or part of the above described functions.
In the several embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the modules or units is only one logical function division, and there may be other division ways in actual implementation. For example, various elements or components may be combined or may be integrated in another device, or some features may be omitted, or not implemented. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form. Units described as separate parts may or may not be physically separate, and parts displayed as units may be one physical unit or a plurality of physical units, may be located in one place, or may be distributed to a plurality of different places. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit may be implemented in the form of hardware, or may also be implemented in the form of a software functional unit. The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a readable storage medium. Based on such understanding, the technical solutions of the embodiments of the present application, or portions thereof, which substantially contribute to the prior art, or all or portions thereof, may be embodied in the form of a software product, where the software product is stored in a storage medium and includes several instructions for enabling a device (which may be a single chip, a chip, or the like) or a processor (processor) to execute all or part of the steps of the method according to the embodiments of the present invention. And the aforementioned storage medium includes: various media capable of storing program codes, such as a U disk, a removable hard disk, a ROM, a RAM, a magnetic disk, or an optical disk.
The above description is only for the specific embodiment of the present invention, but the scope of the present invention is not limited thereto, and any changes or substitutions that can be easily conceived by those skilled in the art within the technical scope of the present invention are included in the scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.

Claims (8)

1. A method for classifying touch points, comprising:
acquiring a touch area in each scanning direction on a touch screen; wherein the scanning direction is the direction of a group of scanning light paths with the same slope on the touch screen, and within each touch area the scanning light paths in the scanning direction corresponding to that touch area are blocked;
determining a touch point according to the touch area;
acquiring characteristic parameters of touch points; the characteristic parameters of the touch point comprise: the number of layers of the touch point, the shading number of the touch point, the area of the touch point and the shape of the touch point; wherein the number of layers of a touch point is the number of touch areas to which the touch point belongs, and the shading number of a touch point is the number of scanning light paths contained in the touch area that, among the touch areas to which the touch point belongs, contains the most scanning light paths in its scanning direction;
determining the type of the touch point according to the characteristic parameters of the touch point;
the determining a touch point according to the touch area includes:
determining as a target direction the scanning direction that contains the most touch areas, excluding touch areas determined to be invalid touch areas;
acquiring intersection areas of the touch areas in the target direction and the touch areas in other scanning directions;
determining as a touch point, within each touch area in the target direction, the intersection area that belongs to the most touch areas;
determining all the touch areas to which the touch points belong as effective touch areas;
determining as a target direction the scanning direction that contains the most touch areas, excluding touch areas determined to be effective touch areas;
the determining the type of the touch point according to the characteristic parameters of the touch point comprises:
judging whether the characteristic parameters of the touch points meet a first preset condition or not; the first preset condition is as follows: the number of layers of the touch points is smaller than a first threshold, the shading number of the touch points is smaller than a second threshold, and the area of the touch points is smaller than a third threshold;
if the characteristic parameters of the touch point meet the first preset condition, judging whether the shape of the touch point is similar to a first preset shape;
if the characteristic parameters of the touch points meet the first preset condition and the shapes of the touch points are similar to a first preset shape, determining that the types of the touch points are first type touch points;
judging whether the characteristic parameters of the touch points meet a second preset condition or not; the second preset condition is as follows: the number of layers of the touch points is greater than or equal to the first threshold and smaller than a fourth threshold, the shading number of the touch points is greater than or equal to the second threshold and smaller than a fifth threshold, and the area of the touch points is greater than or equal to the third threshold and smaller than a sixth threshold;
if the characteristic parameters of the touch point meet the second preset condition, judging whether the shape of the touch point is similar to a second preset shape;
if the characteristic parameters of the touch points meet the second preset condition and the shapes of the touch points are similar to a second preset shape, determining that the types of the touch points are second type touch points;
judging whether the characteristic parameters of the touch points meet a third preset condition or not; the third preset condition is as follows: the number of layers of the touch points is greater than or equal to the fourth threshold, the shading number of the touch points is greater than or equal to the fifth threshold, and the area of the touch points is greater than or equal to the sixth threshold;
and if the characteristic parameters of the touch points meet a third preset condition, determining that the type of the touch points is a third type of touch points.
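For readability only, the three preset conditions in claim 1 can be read as a cascade of threshold tests followed by a shape check. The Python sketch below illustrates that reading; the TouchPoint fields, the numeric thresholds T1 to T6, the preset-shape identifiers and the shape_similar helper are hypothetical placeholders, since the claim leaves these values and representations open.

    from dataclasses import dataclass

    @dataclass
    class TouchPoint:
        layers: int     # number of touch areas the point belongs to
        shading: int    # most scanning light paths in any of those areas
        area: float     # area of the touch point
        shape: object   # geometric description of the touch point

    # Hypothetical first..sixth thresholds; the claim does not fix any values.
    T1, T2, T3 = 3, 4, 25.0
    T4, T5, T6 = 6, 9, 400.0

    def classify(p: TouchPoint, shape_similar) -> str:
        # First preset condition: every feature below the small thresholds.
        if p.layers < T1 and p.shading < T2 and p.area < T3:
            if shape_similar(p.shape, "first preset shape"):
                return "first type"
        # Second preset condition: every feature in the middle band.
        if T1 <= p.layers < T4 and T2 <= p.shading < T5 and T3 <= p.area < T6:
            if shape_similar(p.shape, "second preset shape"):
                return "second type"
        # Third preset condition: every feature at or above the large thresholds.
        if p.layers >= T4 and p.shading >= T5 and p.area >= T6:
            return "third type"
        return "unclassified"

A controller would call classify once per determined touch point, after the acquisition step has filled in the feature parameters; which physical objects the three types correspond to is not stated in the claim.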
2. The method of classifying a touch point according to claim 1,
the determining whether the shape of the touch point is similar to a first preset shape specifically includes:
calculating the similarity between the shape of the touch point and a first preset shape, and if the similarity is greater than a preset threshold value, determining that the shape of the touch point is similar to the first preset shape;
the determining whether the shape is similar to a second preset shape specifically includes:
and calculating the similarity between the shape of the touch point and a second preset shape, and if the similarity is greater than a preset threshold value, determining that the shape of the touch point is similar to the second preset shape.
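Claim 2 leaves the similarity measure itself unspecified. Purely as one illustration, the sketch below represents both the touch-point shape and the preset shape as equal-sized boolean masks and uses intersection-over-union as the similarity; the mask representation, the IoU metric and the 0.8 threshold are assumptions rather than anything stated in the claim.

    import numpy as np

    def shape_similarity(mask: np.ndarray, template: np.ndarray) -> float:
        # Intersection-over-union of two equal-sized boolean masks.
        inter = np.logical_and(mask, template).sum()
        union = np.logical_or(mask, template).sum()
        return float(inter) / float(union) if union else 0.0

    def is_similar(mask: np.ndarray, template: np.ndarray,
                   preset_threshold: float = 0.8) -> bool:
        # Per claim 2: similar when the similarity exceeds the preset threshold.
        return shape_similarity(mask, template) > preset_threshold

Any other measure that yields a score comparable against a preset threshold (for example a contour- or moment-based descriptor distance) would fit the claim equally well.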
3. A classification apparatus of a touch point, applied to the classification method of a touch point according to claim 1 or 2, the apparatus including an acquisition unit and a determination unit;
the acquisition unit is used for acquiring a touch area under each scanning direction on the touch screen; the scanning direction is the direction of a group of scanning light paths with the same slope on the touch screen, and in each touch area the scanning light paths in the scanning direction corresponding to that touch area are blocked;
the determining unit is used for determining a touch point according to the touch area acquired by the acquiring unit;
the acquisition unit is further used for acquiring characteristic parameters of the touch points; the characteristic parameters of a touch point comprise: the number of layers of the touch point, the shading number of the touch point, the area of the touch point and the shape of the touch point; the number of layers of a touch point is the number of touch areas to which the touch point belongs, and the shading number of a touch point is the number of scanning light paths contained in whichever of the touch areas to which the touch point belongs contains the most scanning light paths;
the determining unit is further configured to determine the type of the touch point according to the characteristic parameter of the touch point acquired by the acquiring unit.
4. The apparatus according to claim 3, wherein the determining unit is specifically configured to:
determining, as a target direction, the scanning direction having the most touch areas, among the touch areas acquired by the acquisition unit, other than those determined to be invalid;
according to the intersection areas, acquired by the acquisition unit, of each touch area in the target direction with the touch areas in the other scanning directions, determining as a touch point the intersection area that belongs to the most touch areas;
determining all the touch areas to which the touch points belong as effective touch areas;
and determining, as the next target direction, the scanning direction having the most touch areas other than those determined to be effective touch areas.
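Claims 1 and 4 describe the same iterative determination: take the scanning direction with the most touch areas not yet assigned, determine, for each of its touch areas, the intersection region belonging to the most touch areas as a touch point, mark those areas effective, and repeat with the next target direction. The Python sketch below follows that loop under stated assumptions: touch areas are hashable objects, invalid areas have already been filtered out, and intersections is a hypothetical helper that returns candidate regions, each carrying a member_areas set of the touch areas it belongs to.

    def determine_touch_points(areas_by_direction, intersections):
        # areas_by_direction: dict mapping a scanning direction to its touch areas.
        effective = set()   # touch areas already assigned to a touch point
        points = []
        while True:
            # Target direction: the direction with the most unassigned areas.
            remaining = {d: [a for a in areas if a not in effective]
                         for d, areas in areas_by_direction.items()}
            direction = max(remaining, key=lambda d: len(remaining[d]), default=None)
            if direction is None or not remaining[direction]:
                break
            found = False
            for area in remaining[direction]:
                others = [a for d, areas in areas_by_direction.items()
                          if d != direction for a in areas]
                regions = intersections(area, others)
                if not regions:
                    continue
                # The intersection region belonging to the most touch areas
                # becomes a touch point; all of its areas become effective.
                point = max(regions, key=lambda r: len(r.member_areas))
                points.append(point)
                effective.update(point.member_areas)
                effective.add(area)     # the target-direction area itself
                found = True
            if not found:
                break
        return points

The sketch terminates because every pass that yields a touch point marks at least one previously unassigned area as effective.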
5. The apparatus according to claim 3, wherein the determining unit is specifically configured to:
judging whether the characteristic parameters of the touch points acquired by the acquisition unit meet a first preset condition or not; the first preset condition is as follows: the number of layers of the touch points is smaller than a first threshold value, the shading number of the touch points is smaller than a second threshold value, and the area of the touch points is smaller than a third threshold value;
if the characteristic parameters of the touch points acquired by the acquisition unit meet the first preset condition, judging whether the shape of the touch points acquired by the acquisition unit is similar to a first preset shape;
if the characteristic parameters of the touch point acquired by the acquisition unit meet the first preset condition and the shape of the touch point acquired by the acquisition unit is similar to a first preset shape, determining that the type of the touch point is a first type touch point;
judging whether the characteristic parameters of the touch points acquired by the acquisition unit meet a second preset condition or not; the second preset condition is as follows: the number of layers of the touch points is greater than or equal to the first threshold and smaller than a fourth threshold, the shading number of the touch points is greater than or equal to the second threshold and smaller than a fifth threshold, and the area of the touch points is greater than or equal to the third threshold and smaller than a sixth threshold;
if the characteristic parameters of the touch point acquired by the acquisition unit meet the second preset condition, judging whether the shape of the touch point acquired by the acquisition unit is similar to a second preset shape;
if the characteristic parameters of the touch point acquired by the acquisition unit meet the second preset condition and the shape of the touch point acquired by the acquisition unit is similar to a second preset shape, determining that the type of the touch point is a second type of touch point;
judging whether the characteristic parameters of the touch points acquired by the acquisition unit meet a third preset condition or not; the third preset condition is as follows: the number of layers of the touch points is greater than or equal to the fourth threshold, the shading number of the touch points is greater than or equal to the fifth threshold, and the area of the touch points is greater than or equal to the sixth threshold;
and if the characteristic parameters of the touch points acquired by the acquisition unit meet the third preset condition, determining that the type of the touch points is a third type of touch points.
6. The apparatus according to claim 5, wherein the determining unit is specifically configured to:
calculating the similarity between the shape of the touch point acquired by the acquisition unit and a first preset shape, and if the similarity is greater than a preset threshold, determining that the shape of the touch point is similar to the first preset shape;
and calculating the similarity between the shape of the touch point acquired by the acquisition unit and a second preset shape, and if the similarity is greater than a preset threshold, determining that the shape of the touch point is similar to the second preset shape.
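Claims 3 to 6 split the method between an acquisition unit and a determination unit. Purely as a structural illustration, the sketch below models that split as two small Python classes wired together through injected callables; every name and signature here is an assumption, since the claims do not prescribe any concrete interface.

    from typing import Any, Callable, Dict, List

    class AcquisitionUnit:
        # Collects touch areas per scanning direction and the characteristic
        # parameters of each touch point (layers, shading number, area, shape).
        def __init__(self,
                     read_areas: Callable[[], Dict[str, List[Any]]],
                     read_features: Callable[[Any], dict]):
            self._read_areas = read_areas          # hypothetical driver callback
            self._read_features = read_features    # hypothetical driver callback

        def touch_areas(self) -> Dict[str, List[Any]]:
            return self._read_areas()

        def features(self, point: Any) -> dict:
            return self._read_features(point)

    class DeterminationUnit:
        # Derives touch points from the acquired areas and assigns each a type.
        def __init__(self,
                     determine: Callable[[Dict[str, List[Any]]], list],
                     classify: Callable[[dict], str]):
            self._determine = determine    # e.g. the iterative sketch above
            self._classify = classify      # e.g. the threshold sketch above

        def points(self, areas_by_direction: Dict[str, List[Any]]) -> list:
            return self._determine(areas_by_direction)

        def point_type(self, features: dict) -> str:
            return self._classify(features)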
7. A touch screen, characterized in that it comprises the touch point classification apparatus according to any one of claims 3 to 6.
8. A display comprising the touch screen of claim 7.
CN201910765702.1A 2019-08-19 2019-08-19 Touch point classification method and device, touch screen and display Active CN110502160B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910765702.1A CN110502160B (en) 2019-08-19 2019-08-19 Touch point classification method and device, touch screen and display

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910765702.1A CN110502160B (en) 2019-08-19 2019-08-19 Touch point classification method and device, touch screen and display

Publications (2)

Publication Number Publication Date
CN110502160A CN110502160A (en) 2019-11-26
CN110502160B true CN110502160B (en) 2023-03-28

Family

ID=68588798

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910765702.1A Active CN110502160B (en) 2019-08-19 2019-08-19 Touch point classification method and device, touch screen and display

Country Status (1)

Country Link
CN (1) CN110502160B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111190506A (en) * 2020-03-20 2020-05-22 廖武佐 Touch screen false touch prevention method, touch screen, computing device and medium
CN113342216B (en) * 2021-06-29 2024-03-12 昆山龙腾光电股份有限公司 Touch screen and touch screen touch method
CN117032500A (en) * 2023-10-08 2023-11-10 广州众远智慧科技有限公司 Touch recognition method and device for infrared touch equipment, storage medium and equipment

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TW201001258A (en) * 2008-06-23 2010-01-01 Flatfrog Lab Ab Determining the location of one or more objects on a touch surface
KR101470903B1 (en) * 2012-10-18 2014-12-09 주식회사 하이딥 Touch screen controller and method for controlling thereof
CN104216571A (en) * 2013-05-31 2014-12-17 上海精研电子科技有限公司 Touch screen and touch recognition method and device
CN105320360A (en) * 2014-07-29 2016-02-10 中强光电股份有限公司 Touch device and touch sensing method thereof

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105094454A (en) * 2014-04-17 2015-11-25 青岛海信电器股份有限公司 Method and device for multi-point positioning of touch screen, and touch screen device
CN105260064A (en) * 2015-10-15 2016-01-20 青岛海信电器股份有限公司 Touch point identification method and apparatus, and display device
WO2017129007A1 (en) * 2016-01-28 2017-08-03 华为技术有限公司 Touch point positioning method and apparatus, and terminal device
CN107728860A (en) * 2017-10-19 2018-02-23 青岛海信电器股份有限公司 A kind of touch points of infrared touch screen recognition methods, device and touch-screen equipment

Also Published As

Publication number Publication date
CN110502160A (en) 2019-11-26

Similar Documents

Publication Publication Date Title
CN110502160B (en) Touch point classification method and device, touch screen and display
CN110489015B (en) Touch point determining method and device, touch screen and display
JP2017529582A (en) Touch classification
US9104910B2 (en) Device and method for determining gesture and operation method of gesture determining device
CN107728860B (en) Touch point identification method and device of infrared touch screen and touch screen equipment
CN107967083B (en) Touch point determination method and device
JP2003330603A (en) Coordinate detecting device and method, coordinate detecting program for making computer execute the same method and recording medium with its program recorded
CN106557209B (en) Processing method, device and the terminal device of infrared touch panel touching signals
CN110383336A (en) A kind of rigid body configuration method, device, terminal device and computer storage medium
WO2021092771A1 (en) Target detection method and apparatus, and device and storage medium
CN109375833B (en) Touch instruction generation method and device
CN104615311B (en) A kind of touch-screen localization method, device and touch-screen equipment
EP3906525A1 (en) Error reduction of depth maps
CN109661671B (en) Improvement of image classification using boundary bitmaps
CN106569643B (en) Method and device for positioning touch point of infrared touch screen
US10437351B2 (en) Method for detecting input device and detection device
US20160313817A1 (en) Mouse pad with touch detection function
CN111857354A (en) Unlocking method and device, electronic equipment and storage medium
CN105094453B (en) A kind of touch screen multipoint positioning method, device and touch-screen equipment
CN111694468B (en) Method and device for determining type of target object
CN112799533B (en) Touch point determination method and touch equipment
CN108921129A (en) Image processing method, system, medium and electronic equipment
WO2021007733A1 (en) Method for recognizing gesture for operating terminal device, and terminal device
CN105204693B (en) Touch point identification method and device and touch screen equipment
CN108133206B (en) Static gesture recognition method and device and readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant