CN114077359A - Touch recognition method and display device - Google Patents

Touch recognition method and display device

Info

Publication number
CN114077359A
Authority
CN
China
Prior art keywords
touch
scanning direction
scanning
directions
determining
Prior art date
Legal status
Pending
Application number
CN202010799281.7A
Other languages
Chinese (zh)
Inventor
王武军
张晓娜
乔增富
刘杰
Current Assignee
Qingdao Hisense Commercial Display Co Ltd
Original Assignee
Qingdao Hisense Commercial Display Co Ltd
Priority date
Filing date
Publication date
Application filed by Qingdao Hisense Commercial Display Co Ltd filed Critical Qingdao Hisense Commercial Display Co Ltd
Priority to CN202010799281.7A
Publication of CN114077359A
Legal status: Pending

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/042Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
    • G06F3/0421Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means by interrupting or reflecting a light beam, e.g. optical touch-screen
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/041Indexing scheme relating to G06F3/041 - G06F3/045
    • G06F2203/04104Multi-touch detection in digitiser, i.e. details about the simultaneous detection of a plurality of touching locations, e.g. multiple fingers or pen and finger

Abstract

An embodiment of the application provides a touch recognition method and a display device. The method is applied to a touch screen comprising an emitting edge and a receiving edge, where the emitting edge emits light along a plurality of scanning directions and the receiving edge receives the emitted light in those scanning directions. The method comprises: acquiring the number of touch areas detected in each of the plurality of scanning directions; determining at least one target scanning direction according to the number of touch areas corresponding to each scanning direction; and determining position information of a touch point on the touch screen according to the at least one target scanning direction. Because multi-point or few-point touch can be distinguished from the number of touch areas detected in each scanning direction, touch points can be recognized appropriately for different touch-area counts, and recognition efficiency is high.

Description

Touch recognition method and display device
Technical Field
The embodiments of the application relate to the field of touch technology, and in particular to a touch recognition method and a display device.
Background
Touch screens are electronic systems that can detect the presence and location of a touch within a display area and have wide application in a variety of display devices.
A touch screen identifies touch areas by scanning light paths in multiple scanning directions across the screen. As display devices diversify, so do the kinds of touch a user performs, which may include, for example, multi-point touch and few-point touch. Different touch types place different demands on touch-area recognition: multi-point touch requires a fast response, while few-point touch requires high accuracy. Existing touch screens, however, apply the same recognition scheme to every touch type; because recognition cannot adapt to the actual touch type, recognition efficiency is low.
Disclosure of Invention
The embodiments of the application provide a touch recognition method and a display device, aiming to solve the problem that a touch area currently cannot be recognized according to the actual touch type.
In a first aspect, an embodiment of the present application provides a touch recognition method applied to a touch screen, where the touch screen includes a transmitting side and a receiving side, the transmitting side is configured to transmit light along multiple scanning directions, and the receiving side is configured to receive the light transmitted in the multiple scanning directions, and the method includes:
acquiring the number of touch areas detected in each scanning direction in a plurality of scanning directions;
determining at least one target scanning direction according to the number of the touch areas corresponding to each scanning direction;
and determining position information of a touch point in the touch screen according to the at least one target scanning direction.
In a possible implementation manner, determining at least one target scanning direction according to the number of touch areas corresponding to each scanning direction includes:
determining a reference direction among the plurality of scanning directions, wherein the number of touch areas detected in the reference direction is the largest among the plurality of scanning directions;
determining a first number according to the number of touch areas detected in the reference direction;
a first number of target scanning directions is determined among the plurality of scanning directions.
In a possible implementation, determining the first number according to the number of touch areas detected in the reference direction includes:
judging whether the number of the touch areas detected in the reference direction is larger than a first threshold value;
if yes, determining that the first number is M, wherein M is an integer larger than 1;
if not, determining that the first number is N, wherein N is an integer greater than 1, and M is smaller than N.
In one possible embodiment, determining a first number of target scanning directions among the plurality of scanning directions comprises:
grouping the plurality of scanning directions to obtain a first number of scanning direction groups;
and respectively determining a target scanning direction in each scanning direction group to obtain the first number of target scanning directions.
In one possible implementation, in one scanning direction group, the corresponding target scanning direction is the scanning direction in which the number of detected touch areas is the largest.
In one possible implementation, grouping the plurality of scanning directions to obtain a first number of scanning direction groups includes:
acquiring an initial identifier of each scanning direction, wherein the initial identifiers of the scanning directions are the same;
and grouping the plurality of scanning directions according to the initial identification of each scanning direction and the first number to obtain a first number of scanning direction groups.
In one possible embodiment, the method further comprises:
and updating the mark of each target scanning direction, wherein the updated mark is different from the initial mark.
In one possible embodiment, the method further comprises:
and deleting the touch areas corresponding to the scanning directions whose mark is still the initial mark.
In a possible embodiment, determining the position information of the touch point in the touch screen according to the at least one target scanning direction includes:
determining a position and an area of a touch area detected in the at least one target scanning direction;
and determining position information of a touch point in the touch screen according to the position and the area of the touch area detected in the at least one target scanning direction.
In a second aspect, an embodiment of the present application provides a touch recognition apparatus, including an emitting edge and a receiving edge, where the emitting edge is configured to emit light along multiple scanning directions, and the receiving edge is configured to receive the emitted light in the multiple scanning directions, the apparatus includes:
the device comprises an acquisition module, a processing module and a display module, wherein the acquisition module is used for acquiring the number of touch areas detected in each scanning direction in a plurality of scanning directions;
the first processing module is used for determining at least one target scanning direction according to the number of the touch areas corresponding to each scanning direction;
and the second processing module is used for determining the position information of the touch point in the touch screen according to the at least one target scanning direction.
In a possible implementation manner, the first processing module is specifically configured to:
determining a reference direction among the plurality of scanning directions, wherein the number of touch areas detected in the reference direction is the largest among the plurality of scanning directions;
determining a first number according to the number of touch areas detected in the reference direction;
a first number of target scanning directions is determined among the plurality of scanning directions.
In a possible implementation manner, the first processing module is specifically configured to:
judging whether the number of the touch areas detected in the reference direction is larger than a first threshold value;
if yes, determining that the first number is M, wherein M is an integer larger than 1;
if not, determining that the first number is N, wherein N is an integer greater than 1, and M is smaller than N.
In a possible implementation manner, the first processing module is specifically configured to:
grouping the plurality of scanning directions to obtain a first number of scanning direction groups;
and respectively determining a target scanning direction in each scanning direction group to obtain the first number of target scanning directions.
In one possible implementation, in one scanning direction group, the corresponding target scanning direction is the scanning direction in which the number of detected touch areas is the largest.
In a possible implementation manner, the first processing module is specifically configured to:
acquiring an initial identifier of each scanning direction;
and grouping the plurality of scanning directions according to the initial identification of each scanning direction and the first number to obtain a first number of scanning direction groups.
In one possible implementation, the first processing module is further configured to:
and updating the mark of each target scanning direction, wherein the updated mark is different from the initial mark.
In one possible implementation, the first processing module is further configured to:
and deleting the touch areas corresponding to the scanning directions whose mark is still the initial mark.
In a possible implementation manner, the second processing module is specifically configured to:
determining a position and an area of a touch area detected in the at least one target scanning direction;
and determining position information of a touch point in the touch screen according to the position and the area of the touch area detected in the at least one target scanning direction.
In a third aspect, an embodiment of the present application provides a display device, including:
a memory for storing a program;
a processor for executing the program stored in the memory, the processor being configured to perform the touch recognition method according to any one of the first aspect when the program is executed.
In a fourth aspect, an embodiment of the present application provides a computer-readable storage medium, which includes instructions that, when executed on a computer, cause the computer to perform the touch recognition method according to any one of the first aspect.
The touch recognition method and the display device provided by the embodiments of the application are applied to a touch screen comprising an emitting edge and a receiving edge, where the emitting edge emits light along a plurality of scanning directions and the receiving edge receives the emitted light in those directions. The method first acquires the number of touch areas detected in each of the plurality of scanning directions, determines at least one target scanning direction according to the number of touch areas corresponding to each scanning direction, and then determines position information of a touch point on the touch screen according to the at least one target scanning direction. Through this scheme, whether the touch is multi-point or few-point is determined from the number of touch areas detected in each scanning direction, the target scanning directions are chosen accordingly, and the touch-point positions are then determined from those directions, so touch points can be recognized appropriately for different touch-area counts and recognition efficiency is high.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, and it is obvious that the drawings in the following description are some embodiments of the present invention, and those skilled in the art can also obtain other drawings according to the drawings without creative efforts.
Fig. 1 is a schematic structural diagram of an infrared touch screen according to an embodiment of the present disclosure;
fig. 2 is a schematic view of a long side scanning direction according to an embodiment of the present disclosure;
FIG. 3 is a schematic diagram of a short side scanning direction according to an embodiment of the present application;
FIG. 4 is a schematic diagram of light path occlusion provided in the embodiment of the present application;
fig. 5 is a schematic flowchart of a touch recognition method according to an embodiment of the present application;
fig. 6 is a schematic flowchart of a touch recognition method according to an embodiment of the present application;
FIG. 7 is a schematic diagram illustrating a touch area blocking light according to an embodiment of the present disclosure;
FIG. 8 is a schematic diagram of short-side light blocking provided in the present application;
FIG. 9 is a schematic diagram of long-side light blocking according to an embodiment of the present disclosure;
fig. 10 is a schematic structural diagram of a touch recognition device according to an embodiment of the present application;
fig. 11 is a schematic hardware structure diagram of a display device according to an embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present application clearer, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some embodiments of the present application, but not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
For ease of understanding, first, the concepts related to the present application will be explained.
A touch screen: touch screens are electronic systems that can detect the presence and location of a touch within a display area, and the presence of a touch screen simplifies the method of human-computer interaction. Among current touch technologies, the infrared touch technology has the advantages of strong environmental adaptability, longer service life, more recognizable touch points and the like, and is widely applied.
Fig. 1 is a schematic structural diagram of an infrared touch screen according to an embodiment of the present disclosure, and as shown in fig. 1, the infrared touch screen includes a long-side emitting edge 12, a long-side receiving edge 11, a short-side emitting edge 14, and a short-side receiving edge 13. The transmitting side is provided with a plurality of transmitting lamps, and the corresponding receiving side is correspondingly provided with a plurality of receiving lamps.
As shown in fig. 1, scanning is usually performed in a 1-to-many manner: one emitting lamp emits light, and several receiving lamps facing it receive that light, forming an optical network. Touch behavior is determined from the different forms the optical network takes with and without a touch.
Scanning direction: in a 1-to-n (n ≥ 1) scanning mode, the n light paths leaving a particular emission lamp each have a different angle, and each such angle is called a scanning direction. A 1-to-n scanning scheme therefore has n scanning directions, each consisting of a set of parallel light paths with the same slope.
Fig. 2 is a schematic view of a long side scanning direction provided in the embodiment of the present application, and fig. 3 is a schematic view of a short side scanning direction provided in the embodiment of the present application, where as shown in fig. 2 and fig. 3, the long side includes a long transmitting side and a long receiving side, and the short side also includes a short transmitting side and a short receiving side. The long emitting edge emits light, and the long receiving edge receives the light emitted by the long emitting edge; the short emitting edge emits light, and the short receiving edge receives the light emitted by the short emitting edge.
Touch area: one particular scan direction corresponds to a set of parallel scan paths. When a touch action occurs, each touch point can block a plurality of continuous light paths in the parallel light paths, and the continuous blocked light paths are a touch area under the scanning direction.
Fig. 4 is a schematic view of light path blocking provided in the embodiment of the present application; the dotted lines in fig. 4 indicate blocked light paths. Among a run of consecutively blocked paths, the first blocked path is called the starting boundary of the touch area and the last one the ending boundary. These names refer only to the boundaries of the blocked paths and imply no temporal order; the ordering of the starting and ending boundaries can be defined as desired.
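To make the definition concrete, the sketch below (illustrative only; the function name and the boolean-list representation of blocked paths are assumptions, not from the patent) recovers the touch areas in one scanning direction as maximal runs of blocked parallel light paths, returning each area's starting and ending boundary indices:

```python
def find_touch_areas(blocked):
    """Return (start, end) index pairs for each maximal run of
    consecutive blocked light paths in one scanning direction."""
    areas = []
    start = None
    for i, is_blocked in enumerate(blocked):
        if is_blocked and start is None:
            start = i                      # starting boundary of a touch area
        elif not is_blocked and start is not None:
            areas.append((start, i - 1))   # ending boundary reached
            start = None
    if start is not None:                  # run extends to the last path
        areas.append((start, len(blocked) - 1))
    return areas
```

For instance, `find_touch_areas([False, True, True, False, True])` yields `[(1, 2), (4, 4)]`: two touch areas, the first spanning paths 1-2 and the second covering the single path 4.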
Existing infrared multi-point recognition schemes determine the positions of touch points using the same scanning-direction light paths and the same logic in every scenario, and therefore cannot be optimized both for high precision under few-point touch and for high speed under multi-point touch. Based on this, the embodiments of the present application provide a control scheme for a touch screen that solves the above problems.
Fig. 5 is a schematic flowchart of a touch recognition method provided in an embodiment of the present application, where the method is applied to a touch screen, where the touch screen includes a transmitting side and a receiving side, the transmitting side is configured to transmit light along multiple scanning directions, and the receiving side is configured to receive the light transmitted in the multiple scanning directions, as shown in fig. 5, the method may include:
s51, the number of touch areas detected in each of the plurality of scanning directions is acquired.
In this application, the touch screen has a plurality of scanning directions; a scanning direction is a direction in which the emitting edge emits light toward the receiving edge. One emitting edge may have multiple scanning directions, and any emission lamp on that edge emits light in each of them. For any one scanning direction, every emission lamp on the corresponding emitting edge can emit a ray in that direction, and the rays emitted by the several lamps in the same scanning direction are parallel to one another. The rays in all scanning directions together form an optical network.
When a user performs a touch operation on the touch screen, light on the screen is blocked. For each scanning direction, the touch operation may block one or more rays in that direction; the number of touch areas detected in each scanning direction can therefore be obtained from the blocked light in that direction.
And S52, determining at least one target scanning direction according to the number of the touch areas corresponding to each scanning direction.
When a user performs a touch operation on the touch screen, blocked light may go undetected in some scanning directions, that is, some touch areas cannot be detected in those directions; the number of touch areas corresponding to each scanning direction may therefore differ. In this embodiment of the application, at least one target scanning direction is determined according to the number of touch areas corresponding to each scanning direction.
S53, according to the at least one target scanning direction, determining position information of a touch point in the touch screen.
After the at least one target scanning direction is determined, position information of a touch point on the touch screen can be determined based on it. When the number of touch areas is large, the touch is multi-point and the screen's response speed matters most; fewer target scanning directions are therefore determined, and the touch-point positions are computed from those fewer directions. When the number of touch areas is small, the touch is few-point and the accuracy of the response matters most; more target scanning directions are therefore determined, and the touch-point positions are computed from those additional directions.
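Step S51 can be sketched as a simple run count per scanning direction. All names below are hypothetical, and the boolean-list input (blocked or not, per light path) is an assumed representation rather than the patent's own data format:

```python
def count_touch_areas(blocked_by_direction):
    """S51 sketch: count maximal runs of consecutively blocked light
    paths in each scanning direction.  Input maps a direction id to
    a list of booleans, one per light path in that direction."""
    counts = {}
    for direction, blocked in blocked_by_direction.items():
        runs = 0
        prev = False
        for b in blocked:
            if b and not prev:   # a new run of blocked paths begins
                runs += 1
            prev = b
        counts[direction] = runs
    return counts
```

Calling `count_touch_areas({0: [True, False, True], 1: [False, True, True]})` returns `{0: 2, 1: 1}`: direction 0 sees two touch areas, direction 1 only one.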
The touch recognition method provided by the embodiments of the application is applied to a touch screen comprising an emitting edge and a receiving edge, where the emitting edge emits light along a plurality of scanning directions and the receiving edge receives the emitted light in those directions. The method first acquires the number of touch areas detected in each of the plurality of scanning directions, determines at least one target scanning direction according to the number of touch areas corresponding to each scanning direction, and then determines position information of a touch point on the touch screen according to the at least one target scanning direction. Through this scheme, whether the touch is multi-point or few-point is determined from the number of touch areas detected in each scanning direction, the target scanning directions are chosen accordingly, and the touch-point positions are then determined from those directions, so touch points can be recognized appropriately for different touch-area counts and recognition efficiency is high.
The following describes the embodiments of the present application in detail with reference to the accompanying drawings.
After a user performs a touch operation on the touch screen, the number of touch areas detected in each scanning direction in the plurality of scanning directions may be acquired, so that a target scanning direction is determined according to the number of touch areas corresponding to each scanning direction.
Specifically, the reference direction may be determined among a plurality of scanning directions, the reference direction being a direction in which the number of touch regions detected among the plurality of scanning directions is the largest. If there are a plurality of directions in which the number of detected touch regions is the largest, one of the directions may be randomly selected as a reference direction.
After the reference direction is determined, the first number may be determined according to the number of touch regions detected by the reference direction.
Ideally, when the user leaves i touch points on the touch screen, every scanning direction sees i touch areas. In practice, however, for example when the touch points are very small, some touch areas may be undetectable in some scanning directions. The embodiments of the present application therefore take the scanning direction with the largest number of detected touch areas as the reference direction, and treat the touch areas detected in the reference direction as the touch points to be recognized.
According to the number of the touch areas detected in the reference direction, whether the touch operation is multi-point touch or few-point touch can be judged, and the first number is determined according to the result of the multi-point touch or the few-point touch.
Specifically, it may be determined whether the number of touch areas detected in the reference direction is greater than a first threshold. If so, the current touch operation is determined to be multi-point touch and the first number is M, where M is an integer greater than 1; if not, it is determined to be few-point touch and the first number is N, where N is an integer greater than 1 and M is smaller than N.
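A minimal sketch of this decision follows. The function name is hypothetical; the parameters correspond to the quantities the patent calls M, N, and the touch-area count threshold:

```python
def first_number(reference_count, threshold, m, n):
    """Choose how many target scanning directions to use.

    A reference-direction count above the threshold means multi-point
    touch, so the smaller count m is used for fast response; otherwise
    few-point touch is assumed and the larger count n is used for high
    precision.  The method requires 1 < m < n.
    """
    assert 1 < m < n, "the patent requires M and N to be integers > 1 with M < N"
    return m if reference_count > threshold else n
```

For example, with a threshold of 5, M = 4, and N = 8, a reference direction detecting 10 touch areas selects 4 target directions (fast multi-point mode), while one detecting 3 selects 8 (precise few-point mode).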
After the first number is determined, a first number of target scan directions may be determined among the plurality of scan directions.
Specifically, a plurality of scanning directions may be grouped to obtain a first number of scanning direction groups.
For example, an initial identifier of each scanning direction may be obtained, and then the plurality of scanning directions are grouped according to the initial identifier of each scanning direction and the first number, so as to obtain a first number of scanning direction groups. Wherein the initial identification of each scanning direction is the same. For example, the initial flag for each scan direction may be set to 0. In subsequent flows, the identity may be updated for the partial scan direction.
After obtaining the first number of scanning direction groups, a target scanning direction is determined in each scanning direction group, so as to obtain the first number of target scanning directions.
Optionally, after each target scanning direction is determined, the identifier of the target scanning direction may be updated, and the updated identifier is different from the initial identifier. For example, the updated flag for the target scan direction may be set to 1. In this case, the target scanning direction and the non-target scanning direction have different labels, so that the target scanning direction and the non-target scanning direction can be effectively distinguished.
After the identifiers are updated, the touch areas corresponding to the scanning directions whose identifier is still the initial identifier may be deleted. Since the identifiers of the target scanning directions have been updated while those of the non-target scanning directions remain the initial identifier, deleting the touch areas of directions still carrying the initial identifier removes exactly the touch areas of the non-target scanning directions, so touch-point recognition proceeds only on the touch areas of the target scanning directions.
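The identifier update and the deletion step can be sketched together. The function name and the dict-based bookkeeping are assumptions for illustration; only the initial-identifier/updated-identifier logic comes from the patent:

```python
def update_flags_and_prune(flags, areas_by_direction, targets,
                           initial=0, updated=1):
    """Mark each target scanning direction with the updated identifier,
    then keep only the touch areas of directions whose identifier is no
    longer the initial one (i.e. drop non-target directions' areas)."""
    for d in targets:
        flags[d] = updated
    return {d: areas for d, areas in areas_by_direction.items()
            if flags.get(d, initial) != initial}
```

With three directions all initialized to 0 and direction 1 chosen as a target, only direction 1's touch areas survive the pruning, and its flag becomes 1 while the others stay 0.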
When the multi-point touch is determined, dividing a plurality of scanning directions into M groups, determining a target scanning direction in each group, and obtaining M target scanning directions; when few-point touch is determined, the scanning directions are divided into N groups, one target scanning direction is determined in each group, and N target scanning directions are obtained.
For each scanning direction group, the scanning direction in which the number of detected touch regions is the largest may be determined as the target scanning direction in the scanning direction group.
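The grouping and per-group selection can be sketched as follows. The round-robin grouping by direction index is an assumption; the patent only requires that the scanning directions be split into a first number of groups, with the direction detecting the most touch areas chosen in each group:

```python
def pick_target_directions(area_counts, first_num):
    """Split the scanning directions into first_num groups (round-robin
    by sorted index, an assumed grouping) and pick, in each group, the
    direction with the largest number of detected touch areas."""
    directions = sorted(area_counts)
    groups = [directions[i::first_num] for i in range(first_num)]
    return [max(g, key=lambda d: area_counts[d]) for g in groups if g]
```

With counts `{0: 2, 1: 5, 2: 3, 3: 4}` and a first number of 2, the groups are `[0, 2]` and `[1, 3]`, and the chosen target directions are 2 and 1, the per-group maxima.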
After the first number of target scanning directions are determined, the position and area of the touch region detected in the target scanning directions may be determined, and position information of a touch point in the touch screen may be determined according to the position and area of the touch region detected in the target scanning directions, and the position and area size of the touch point may be output.
When the multi-point touch is determined, the number of the target scanning directions is M, and when the few-point touch is determined, the number of the target scanning directions is N. Because M is smaller than N, when the touch screen is touched at multiple points, the position information of the touch point in the touch screen is determined through fewer target scanning directions, the position information of the touch point can be acquired quickly, and the response speed is high. When few-point touch is performed, the position information of the touch point in the touch screen is determined through more target scanning directions, and the determined position information of the touch point is higher in precision and more accurate.
Fig. 6 is a schematic flowchart of a touch recognition method provided in an embodiment of the present application, and as shown in fig. 6, the method includes:
s601, setting various parameters.
The parameters include the total number n of scanning directions, the angle of each lamp in each scanning direction, the touch-area count threshold t, the number N of scanning directions for the few-point high-precision mode, and the number M of scanning directions for the multi-point fast-response mode. An identifier flg is set for each scanning direction and initialized to 0; this is the initial identifier of each scanning direction.
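The parameter setup of S601 might look like the following sketch; the class and field names are illustrative assumptions, while the quantities themselves (n, t, N, M, and the per-direction flag flg initialized to 0) come from the description above:

```python
from dataclasses import dataclass

@dataclass
class ScanConfig:
    n_directions: int    # total number of scanning directions (n)
    area_threshold: int  # touch-area count threshold (t)
    n_precise: int       # directions for few-point high-precision mode (N)
    m_fast: int          # directions for multi-point fast-response mode (M)

# Example values are illustrative; the patent fixes none of them.
cfg = ScanConfig(n_directions=12, area_threshold=5, n_precise=8, m_fast=4)

# One identifier flg per scanning direction, initialized to 0.
flags = {d: 0 for d in range(cfg.n_directions)}
```

Per the method, M must be smaller than N, so a configuration is only valid when `cfg.m_fast < cfg.n_precise`.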
And S602, executing scanning in one period, and counting the number of touch areas in each scanning direction.
Scanning in the n scanning directions is performed within one period, and the touch areas in each scanning direction, together with their number, are counted and recorded. Fig. 7 is a schematic diagram of touch areas blocking light provided by an embodiment of the present application; as shown in fig. 7, there are two touch points, i.e., the black circular areas in fig. 7.
The touch screen in fig. 7 includes 4 scanning directions, i.e., two scanning directions along the long side and two along the short side. Fig. 7 also shows the blocked light rays, indicated by the dashed lines. The blocked rays pass through the areas where the two touch points are located; that is, light is blocked when a user performs a touch operation on the touch screen, so the touch points can be determined from the blocked rays.
Fig. 8 is a schematic diagram of short-side light shielding according to an embodiment of the present application; as shown in fig. 8, the short side includes two scanning directions, one vertical and one oblique. Fig. 9 is a schematic diagram of long-side light shielding according to an embodiment of the present application; as shown in fig. 9, the long side includes two scanning directions, one horizontal and one oblique.
S603, select the reference direction and record the number of touch areas detected in the reference direction.
Each scanning direction is traversed, and the scanning direction having the largest number of touch areas is selected as the reference direction Dir.
Figs. 7-9 illustrate the case in which all touch areas are detected; in practice, some touch areas cannot be detected in certain scanning directions.
For example, suppose the touch screen includes 12 scanning directions in total, 6 of which are long-side scanning directions and the other 6 short-side scanning directions. The numbers of touch areas detected in the 12 scanning directions are as follows:
dir1 (scan direction 1): 2 touch areas; dir2 (scan direction 2): 1 touch area;
dir3 (scan direction 3): 1 touch area; dir4 (scan direction 4): 2 touch areas;
dir5 (scan direction 5): 3 touch areas; dir6 (scan direction 6): 2 touch areas;
dir7 (scan direction 7): 1 touch area; dir8 (scan direction 8): 2 touch areas;
dir9 (scan direction 9): 2 touch areas; dir10 (scan direction 10): 1 touch area;
dir11 (scan direction 11): 2 touch areas; dir12 (scan direction 12): 1 touch area.
Since the scanning direction in which the number of detected touch regions is the largest is Dir5, Dir5 is determined as the reference direction.
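Expressed as a minimal sketch (the direction names and counts are copied from the example above), selecting the reference direction is a single maximum search:

```python
# Touch-area counts per scanning direction, from the example above.
counts = {"Dir1": 2, "Dir2": 1, "Dir3": 1, "Dir4": 2,
          "Dir5": 3, "Dir6": 2, "Dir7": 1, "Dir8": 2,
          "Dir9": 2, "Dir10": 1, "Dir11": 2, "Dir12": 1}

# The reference direction is the one with the most detected touch areas.
reference = max(counts, key=counts.get)  # "Dir5"
```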
S604, judging whether the number k of touch areas in the reference direction Dir is larger than t; if so, executing S605, otherwise executing S606.
S605, the multi-point high-speed response mode is entered.
Multi-point high-speed response mode: the n scanning directions are divided evenly into M groups, with any scanning directions that cannot be divided evenly placed in the last group. In each group, the direction in which the most touch areas are detected is selected as the target scanning direction of that group, and its identifier flg is set to 1, i.e., the identifier of the target scanning direction is updated. After all groups have been processed, S607 is executed.
Assuming the touch-area threshold t is 2, the number M of target scanning directions is 3, so every 4 scanning directions form a group, and in each group the scanning direction in which the most touch areas are detected is selected as the target scanning direction.
The first group, Dir1-Dir4, selects Dir1;
the second group, Dir5-Dir8, selects Dir5;
the third group, Dir9-Dir12, selects Dir9.
And S606, entering a few-point high-precision mode.
Few-point high-precision mode: the n scanning directions are divided evenly into N groups, with any scanning directions that cannot be divided evenly placed in the last group. In each group, the direction in which the most touch areas are detected is selected as the target scanning direction of that group, and its identifier flg is set to 1, i.e., the identifier of the target scanning direction is updated. After all groups have been processed, S607 is executed.
Assuming the touch-area threshold t is 3, the number N of target scanning directions is 5, so every 2 scanning directions form a group, and in each group the scanning direction in which the most touch areas are detected is selected as the target scanning direction.
The first group, Dir1-Dir2, selects Dir1;
the second group, Dir3-Dir4, selects Dir4;
the third group, Dir5-Dir6, selects Dir5;
the fourth group, Dir7-Dir8, selects Dir8;
the fifth group, Dir9-Dir12, selects Dir9.
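The grouping and per-group selection used by both modes can be sketched with one hypothetical helper (counts taken from the worked example above; when two directions in a group detect the same number of touch areas, the first one in scanning order is taken, which reproduces the selections listed above):

```python
# Touch-area counts per scanning direction, from the worked example above.
counts = {"Dir1": 2, "Dir2": 1, "Dir3": 1, "Dir4": 2,
          "Dir5": 3, "Dir6": 2, "Dir7": 1, "Dir8": 2,
          "Dir9": 2, "Dir10": 1, "Dir11": 2, "Dir12": 1}
dirs = list(counts)  # Dir1..Dir12 in scanning order

def pick_targets(counts, ordered_dirs, groups):
    """Divide the directions evenly into `groups` groups, with any
    remainder joining the last group, and pick the direction with the
    most detected touch areas in each group (ties: first wins)."""
    size = len(ordered_dirs) // groups
    targets = []
    for g in range(groups):
        start = g * size
        end = start + size if g < groups - 1 else len(ordered_dirs)
        targets.append(max(ordered_dirs[start:end], key=counts.get))
    return targets
```

Here `pick_targets(counts, dirs, 3)` yields the multi-point selections Dir1, Dir5, Dir9, and `pick_targets(counts, dirs, 5)` the few-point selections Dir1, Dir4, Dir5, Dir8, Dir9.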
And S607, traversing the scanning directions and deleting the touch areas in the unselected scanning directions.
Each scanning direction is traversed, and all touch areas of the directions whose identifier flg is 0 are deleted; that is, the touch areas corresponding to the scanning directions still carrying the initial identifier are deleted, so the number of touch areas in those directions becomes 0.
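A sketch of S607, assuming touch areas are stored per direction and flags follow the flg convention above (the data shapes and names are hypothetical, for illustration only):

```python
def prune_non_target(touch_areas, flg):
    """Delete all touch areas of directions whose identifier flg is
    still 0 (the initial identifier); target directions (flg == 1)
    keep their touch areas."""
    return {d: areas if flg.get(d) == 1 else []
            for d, areas in touch_areas.items()}
```

For example, with `flg = {"Dir1": 1, "Dir2": 0}`, the touch areas of Dir2 are emptied while those of Dir1 are kept.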
And S608, calculating the point positions and point areas by using the remaining touch areas.
And S609, performing the ghost-point removal operation and applying trajectory tracking and smoothing processing.
And S610, converting and outputting coordinates of all touch points.
After the touch areas in the target scanning directions are determined, the position information of the touch points is determined according to those touch areas, and the coordinates of the touch points are output.
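The description does not fix a particular coordinate computation; for an infrared grid, one common approach is to intersect the blocked-beam intervals of two perpendicular target directions. A sketch under that assumption (the interval format, units, and function name are hypothetical):

```python
def point_from_intervals(h_interval, v_interval):
    """Estimate one touch point from a horizontally measured blocked
    interval (giving x) and a vertically measured one (giving y),
    each given as (start, end) in screen units; the touch area is
    approximated by the product of the two interval widths."""
    x = (h_interval[0] + h_interval[1]) / 2.0
    y = (v_interval[0] + v_interval[1]) / 2.0
    area = (h_interval[1] - h_interval[0]) * (v_interval[1] - v_interval[0])
    return (x, y), area
```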
The touch recognition method provided by the embodiments of the present application is applied to a touch screen that includes an emitting edge and a receiving edge, where the emitting edge emits light along a plurality of scanning directions and the receiving edge receives the emitted light in the plurality of scanning directions. The number of touch areas detected in each of the plurality of scanning directions is first obtained, at least one target scanning direction is determined according to the number of touch areas corresponding to each scanning direction, and the position information of the touch points in the touch screen is then determined according to the at least one target scanning direction. With this scheme, whether the touch is multi-point or few-point is determined from the number of touch areas detected in each scanning direction, the target scanning directions are determined accordingly, and the position information of the touch points is then determined based on the target scanning directions, so that touch points can be recognized under different numbers of touch areas with high recognition efficiency.
Fig. 10 is a schematic structural diagram of a touch recognition device provided in an embodiment of the present application. The touch recognition device includes an emitting edge and a receiving edge; the emitting edge is configured to emit light rays in multiple scanning directions, and the receiving edge is configured to receive the emitted light rays in the multiple scanning directions. As shown in fig. 10, the device includes:
an obtaining module 101, configured to obtain the number of touch areas detected in each of a plurality of scanning directions;
the first processing module 102 is configured to determine at least one target scanning direction according to the number of touch areas corresponding to each scanning direction;
and the second processing module 103 is configured to determine position information of a touch point in the touch screen according to the at least one target scanning direction.
In a possible implementation manner, the first processing module 102 is specifically configured to:
determining a reference direction among the plurality of scanning directions; wherein, among the plurality of scanning directions, the number of touch areas detected in the reference direction is the largest;
determining a first number according to the number of touch areas detected in the reference direction;
a first number of target scanning directions is determined among the plurality of scanning directions.
In a possible implementation manner, the first processing module 102 is specifically configured to:
judging whether the number of the touch areas detected in the reference direction is larger than a first threshold value;
if yes, determining that the first number is M, wherein M is an integer larger than 1;
if not, determining that the first number is N, wherein N is an integer greater than 1, and M is smaller than N.
In a possible implementation manner, the first processing module 102 is specifically configured to:
grouping the plurality of scanning directions to obtain a first number of scanning direction groups;
and respectively determining a target scanning direction in each scanning direction group to obtain the first number of target scanning directions.
In one possible implementation, in one scanning direction group, the corresponding target scanning direction is the scanning direction in which the number of detected touch areas is the largest.
In a possible implementation manner, the first processing module 102 is specifically configured to:
acquiring an initial identifier of each scanning direction;
and grouping the plurality of scanning directions according to the initial identification of each scanning direction and the first number to obtain a first number of scanning direction groups.
In a possible implementation, the first processing module 102 is further configured to:
and updating the mark of each target scanning direction, wherein the updated mark is different from the initial mark.
In a possible implementation, the first processing module 102 is further configured to:
and deleting the touch areas corresponding to the scanning directions whose mark is the initial mark.
In a possible implementation manner, the second processing module 103 is specifically configured to:
determining a position and an area of a touch area detected in the at least one target scanning direction;
and determining position information of a touch point in the touch screen according to the position and the area of the touch area detected in the at least one target scanning direction.
The apparatus provided in the embodiment of the present application may be configured to implement the technical solution of the method embodiment, and the implementation principle and the technical effect are similar, which are not described herein again.
Fig. 11 is a schematic diagram of a hardware structure of a display device according to an embodiment of the present application, and as shown in fig. 11, the display device includes: at least one processor 111 and a memory 112. Wherein the processor 111 and the memory 112 are connected by a bus 113.
Optionally, the display device further comprises a communication component. For example, the communication component may include a receiver and/or a transmitter.
In a specific implementation, the at least one processor 111 executes computer-executable instructions stored by the memory 112 to cause the at least one processor 111 to perform the touch recognition method as described above.
For a specific implementation process of the processor 111, reference may be made to the above method embodiments, which implement principles and technical effects similar to each other, and details of this embodiment are not described herein again.
In the embodiment shown in fig. 11, it should be understood that the Processor may be a Central Processing Unit (CPU), another general-purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), etc. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor. The steps of the methods disclosed in the embodiments of the present application may be directly implemented by a hardware processor, or implemented by a combination of hardware and software modules in the processor.
The memory may comprise high speed RAM memory and may also include non-volatile storage NVM, such as at least one disk memory.
The bus may be an Industry Standard Architecture (ISA) bus, a Peripheral Component Interconnect (PCI) bus, an Extended ISA (EISA) bus, or the like. The bus may be divided into an address bus, a data bus, a control bus, etc. For ease of illustration, the buses in the figures of the present application are not limited to only one bus or one type of bus.
The present application also provides a computer-readable storage medium, in which computer-executable instructions are stored, and when a processor executes the computer-executable instructions, the touch recognition method as described above is implemented.
The computer-readable storage medium may be implemented by any type of volatile or non-volatile memory device or combination thereof, such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disk. Readable storage media can be any available media that can be accessed by a general purpose or special purpose computer.
An exemplary readable storage medium is coupled to the processor such that the processor can read information from, and write information to, the readable storage medium. Of course, the readable storage medium may also be an integral part of the processor. The processor and the readable storage medium may reside in an Application Specific Integrated Circuit (ASIC). Of course, the processor and the readable storage medium may also reside as discrete components in the apparatus.
The division of the units is only a logical division, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present application or portions thereof that substantially contribute to the prior art may be embodied in the form of a software product stored in a storage medium and including instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present application. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes.
Those of ordinary skill in the art will understand that: all or a portion of the steps of implementing the above-described method embodiments may be performed by hardware associated with program instructions. The program may be stored in a computer-readable storage medium. When executed, the program performs steps comprising the method embodiments described above; and the aforementioned storage medium includes: various media that can store program codes, such as ROM, RAM, magnetic or optical disks.
Finally, it should be noted that: the above embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some or all of the technical features may be equivalently replaced; and the modifications or the substitutions do not make the essence of the corresponding technical solutions depart from the scope of the technical solutions of the embodiments of the present application.

Claims (10)

1. A touch recognition method applied to a touch screen, wherein the touch screen comprises an emitting side and a receiving side, the emitting side is used for emitting light along a plurality of scanning directions, and the receiving side is used for receiving the emitted light along the plurality of scanning directions, the method comprising:
acquiring the number of touch areas detected in each scanning direction in a plurality of scanning directions;
determining at least one target scanning direction according to the number of the touch areas corresponding to each scanning direction;
and determining position information of a touch point in the touch screen according to the at least one target scanning direction.
2. The method of claim 1, wherein determining at least one target scanning direction according to the number of touch areas corresponding to each scanning direction comprises:
determining a reference direction among the plurality of scanning directions; wherein, among the plurality of scanning directions, the number of touch areas detected in the reference direction is the largest;
determining a first number according to the number of touch areas detected in the reference direction;
a first number of target scanning directions is determined among the plurality of scanning directions.
3. The method of claim 2, wherein determining the first number based on the number of touch regions detected in the reference direction comprises:
judging whether the number of the touch areas detected in the reference direction is larger than a first threshold value;
if yes, determining that the first number is M, wherein M is an integer larger than 1;
if not, determining that the first number is N, wherein N is an integer greater than 1, and M is smaller than N.
4. The method of claim 2, wherein determining a first number of target scan directions among the plurality of scan directions comprises:
grouping the plurality of scanning directions to obtain a first number of scanning direction groups;
and respectively determining a target scanning direction in each scanning direction group to obtain the first number of target scanning directions.
5. The method of claim 4, wherein the corresponding target scanning direction is the scanning direction in which the number of detected touch areas is the largest in one scanning direction group.
6. The method of claim 4 or 5, wherein grouping the plurality of scan directions to obtain a first number of scan direction groups comprises:
acquiring an initial identifier of each scanning direction, wherein the initial identifiers of the scanning directions are the same;
and grouping the plurality of scanning directions according to the initial identification of each scanning direction and the first number to obtain a first number of scanning direction groups.
7. The method of claim 6, further comprising:
and updating the mark of each target scanning direction, wherein the updated mark is different from the initial mark.
8. The method of claim 7, further comprising:
and deleting the touch areas corresponding to the scanning directions whose mark is the initial mark.
9. The method according to any of claims 1-5, wherein determining position information of a touch point in the touch screen based on the at least one target scan direction comprises:
determining a position and an area of a touch area detected in the at least one target scanning direction;
and determining position information of a touch point in the touch screen according to the position and the area of the touch area detected in the at least one target scanning direction.
10. A display device, comprising:
a memory for storing a program;
a processor for executing the program stored in the memory, the processor being configured to perform the touch recognition method of any one of claims 1 to 9 when the program is executed.
CN202010799281.7A 2020-08-11 2020-08-11 Touch recognition method and display device Pending CN114077359A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010799281.7A CN114077359A (en) 2020-08-11 2020-08-11 Touch recognition method and display device


Publications (1)

Publication Number Publication Date
CN114077359A true CN114077359A (en) 2022-02-22

Family

ID=80280029

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010799281.7A Pending CN114077359A (en) 2020-08-11 2020-08-11 Touch recognition method and display device

Country Status (1)

Country Link
CN (1) CN114077359A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114327155A (en) * 2022-03-14 2022-04-12 上海海栎创科技股份有限公司 Multi-contact identification method and device, electronic equipment and readable storage medium

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5793361A (en) * 1994-06-09 1998-08-11 Corporation For National Research Initiatives Unconstrained pointing interface for natural human interaction with a display-based computer system
CN101742036A (en) * 2008-11-17 2010-06-16 夏普株式会社 Image processing apparatus, image forming apparatus and image processing method
US20130234993A1 (en) * 2010-11-26 2013-09-12 Haibing Zhang Infrared touch screen multi-point recognizing method and infrared touch screen
CN107728860A (en) * 2017-10-19 2018-02-23 青岛海信电器股份有限公司 A kind of touch points of infrared touch screen recognition methods, device and touch-screen equipment
US20180356935A1 (en) * 2016-01-28 2018-12-13 Huawei Technologies Co., Ltd. Touch point positioning method and apparatus, and terminal device
CN110262685A (en) * 2019-05-28 2019-09-20 青岛海信商用显示股份有限公司 Touch detecting method, touch detecting apparatus, storage medium and touch screen
CN110275645A (en) * 2019-06-12 2019-09-24 青岛海信商用显示股份有限公司 Touch point recognition methods, touch point identification device, storage medium and touch device
CN110489015A (en) * 2019-08-19 2019-11-22 青岛海信商用显示股份有限公司 Touch point determines method and device, touch screen and display



Similar Documents

Publication Publication Date Title
CN110489015B (en) Touch point determining method and device, touch screen and display
CN109215037B (en) Target image segmentation method and device and terminal equipment
US8405625B2 (en) Method for detecting tracks of touch inputs on touch-sensitive panel and related computer program product and electronic apparatus using the same
CN107728860B (en) Touch point identification method and device of infrared touch screen and touch screen equipment
CN109542276B (en) Touch point identification method and device and display equipment
CN105787133A (en) Method and device for filtering advertisement information
CN116416190A (en) Flaw detection method and device, electronic equipment and storage medium
CN114077359A (en) Touch recognition method and display device
CN104572917A (en) Method and device for locking data and distributed storage system
CN103399674A (en) Method and device of multi-point touch detection
CN113934312B (en) Touch object identification method based on infrared touch screen and terminal equipment
CN111460072B (en) Lane line detection method, device, equipment and storage medium
CN107450840B (en) Method and device for determining finger touch connected domain and electronic equipment
CN113760123A (en) Screen touch optimization method and device, terminal device and storage medium
CN111258455B (en) Event stream processing method, event stream processing device and mobile terminal
CN106528575A (en) Data connection method and device
US11397493B2 (en) Method for touch sensing enhancement implemented in single chip, single chip capable of achieving touch sensing enhancement, and computing apparatus
CN113126795B (en) Touch identification method of touch display device and related equipment
KR102029860B1 (en) Method for tracking multi objects by real time and apparatus for executing the method
US20120182228A1 (en) Method and apparatus for detecting touch point movement
CN110794994A (en) Method and device for determining real contact
CN111475060B (en) Touch identification method and device
CN112882594A (en) Touch device, positioning method, equipment and medium
CN107943340B (en) Touch screen scanning detection method and device
CN112581526A (en) Evaluation method, device, equipment and storage medium for obstacle detection

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination