CN109542276B - Touch point identification method and device and display equipment - Google Patents

Touch point identification method and device and display equipment

Info

Publication number
CN109542276B
Authority
CN
China
Prior art keywords
touch
point
scanning
intersection
touch point
Prior art date
Legal status
Active
Application number
CN201811399723.8A
Other languages
Chinese (zh)
Other versions
CN109542276A (en)
Inventor
王武军
薛晓光
赵书凯
张晓娜
Current Assignee
Qingdao Hisense Commercial Display Co Ltd
Original Assignee
Qingdao Hisense Commercial Display Co Ltd
Priority date
Filing date
Publication date
Application filed by Qingdao Hisense Commercial Display Co Ltd filed Critical Qingdao Hisense Commercial Display Co Ltd
Priority to CN201811399723.8A
Publication of CN109542276A
Application granted
Publication of CN109542276B
Legal status: Active

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/042 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
    • G06F3/0421 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means by interrupting or reflecting a light beam, e.g. optical touch-screen
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0412 Digitisers structurally integrated in a display
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0416 Control or interface arrangements specially adapted for digitisers

Abstract

The embodiments of the invention provide a touch point identification method, a touch point identification device, and a display device, relate to the technical field of electronics, and can use the shielded scanning light paths and the unshielded scanning light paths together to identify touch points, improving the accuracy and discrimination of touch point identification. The method comprises the following steps: driving the touch screen to complete one scanning period, wherein the vertical edge direction includes n scanning angles, the horizontal edge direction includes m scanning angles, n and m are integers, n is larger than or equal to 1, and m is larger than or equal to 1; acquiring intersection points of the shielded light paths to form an intersection point set; taking any intersection point in the intersection point set as a reference point, determining the first unshielded light path on each side of the reference point at each scanning angle and acquiring n + m groups of touch boundary areas, wherein at any scanning angle the touch boundary area is the area between the first unshielded scanning light paths on the two sides of the reference point; taking the intersection of the n + m groups of touch boundary areas as a quasi touch point; and determining a touch point according to the quasi touch point.

Description

Touch point identification method and device and display equipment
Technical Field
Embodiments of the invention relate to the field of electronic technology, and in particular to a touch point identification method, a touch point identification device, and a display device.
Background
Touch screens are electronic systems that can detect the presence and location of a touch within a display area, simplifying human-computer interaction. Among current touch technologies, infrared touch technology offers strong environmental adaptability, a long service life, a large number of recognizable touch points, and other advantages. The infrared touch screen has a rectangular structure and comprises a long emitting edge with a corresponding long receiving edge, and a short emitting edge with a corresponding short receiving edge. A plurality of infrared emitting devices are arranged on each emitting edge and a plurality of infrared receiving devices are correspondingly arranged on the corresponding receiving edge. A one-to-many scanning mode is generally adopted: one emitting device emits light and a plurality of receiving devices on the opposite edge receive it simultaneously, so that an optical network is formed, and touch behavior is judged from the different occlusion patterns of the optical network with and without a touch.
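The occlusion judgment implied by this optical network can be illustrated with a short sketch. This is a minimal, illustrative example rather than the patent's implementation: it assumes each scanning light path has a receiver reading and a no-touch baseline reading, and flags a path as shielded when the reading drops well below the baseline; the 0.5 ratio and the list-based representation are assumptions.

```python
def shielded_mask(readings, baselines, ratio=0.5):
    """True where a touch attenuates the received infrared light well below the baseline."""
    return [r < ratio * b for r, b in zip(readings, baselines)]

# Example: the third path is strongly attenuated and therefore treated as shielded.
print(shielded_mask([100, 98, 12, 101], [100, 100, 100, 100]))
```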
Existing infrared touch screens are rectangular, and the traditional infrared point-positioning algorithm obtains touch points from the orthogonal crossings of the shielded light paths in the normal scanning directions of the long and short edges. Representative multi-point touch solutions in the industry locate points mainly by considering which scanning light paths are shielded at each touch position, and suffer from problems such as several points being identified as one point, true points being discarded, false points being judged as true, a large amount of computation, slow response, and high cost, so the accuracy and discrimination of touch point identification are low.
Disclosure of Invention
Embodiments of the invention provide a touch point identification method, a touch point identification device, and a display device.
In a first aspect, a touch point identification method is provided for touch point identification of a touch screen, where two horizontal side directions opposite to the touch screen are respectively provided with a light path transmitting structure and a light path receiving structure, and two vertical side directions opposite to the touch screen are respectively provided with a light path transmitting structure and a light path receiving structure. The method comprises: driving the light path emitting structures in the horizontal side direction and the vertical side direction of the touch screen to complete one scanning period, wherein the vertical side direction includes n scanning angles, the horizontal side direction includes m scanning angles, n and m are integers, n is more than or equal to 1, and m is more than or equal to 1; acquiring intersection points of the shielded scanning light paths to form an intersection point set; taking any intersection point in the intersection point set as a reference point, determining a first unshielded scanning light path on each side of the reference point at each scanning angle and acquiring n + m groups of touch boundary regions, wherein at any scanning angle the touch boundary region is the region between the first unshielded scanning light paths on the two sides of the reference point; taking the intersection of the n + m groups of touch boundary regions as a quasi touch point; and determining the touch point according to the quasi touch point. In the above scheme, within one scanning period, the touch point identification device can take any intersection point formed by the shielded scanning light paths, use the first unshielded scanning light path on each side of the intersection point at every scanning angle to acquire n + m groups of touch boundary regions, take the intersection of the n + m groups of touch boundary regions as a quasi touch point, and finally determine the touch point according to the quasi touch point. Compared with the prior art, the embodiments of the invention use the shielded scanning light paths and the unshielded scanning light paths together for touch point identification, which can improve the accuracy and discrimination of touch point identification.
In a second aspect, a touch point identification device is provided, which is used for identifying touch points of a touch screen, wherein two horizontal side directions opposite to the touch screen are respectively provided with a light path transmitting structure and a light path receiving structure, and two vertical side directions opposite to the touch screen are respectively provided with a light path transmitting structure and a light path receiving structure; the device includes:
the driving module is used for driving the light path emission structure in the horizontal side direction and the vertical side direction of the touch screen to complete a scanning period, wherein the vertical side direction comprises n scanning angles, the horizontal side direction comprises m scanning angles, n and m are integers, n is more than or equal to 1, and m is more than or equal to 1;
the acquisition module is used for acquiring the intersection point of the shielded scanning light path to form an intersection point set;
a processing module, configured to determine, for any intersection in the intersection set acquired by the acquisition module as a reference point, a first unshielded scanning optical path on two sides of the reference point at each scanning angle, and acquire n + m sets of touch boundary regions, where, at any scanning angle, the touch boundary region is a region between the first unshielded scanning optical paths on two sides of the reference point;
the processing module is further configured to take an intersection of the n + m sets of touch boundary areas as a quasi-touch point; and determining the touch point according to the quasi touch point.
In a third aspect, a touch point identification device is provided, which includes a communication interface, a processor, a memory, and a bus; the memory is used for storing computer-executable instructions, the processor is connected with the memory through the bus, and when the touch point identification device runs, the processor executes the computer-executable instructions stored in the memory, so that the touch point identification device performs the above touch point identification method.
In a fourth aspect, a computer storage medium is provided, which includes instructions that, when executed on a computer, cause the computer to execute the above-mentioned touch point identification method.
In a fifth aspect, there is provided a display device comprising: a touch screen and any one of the touch point identification devices.
It can be understood that any of the touch point identification device, the computer storage medium, or the display device provided above is used to execute the method of the first aspect provided above; for the beneficial effects they can achieve, reference may be made to the beneficial effects of the method of the first aspect and of the corresponding solutions in the following detailed description, which are not repeated here.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description show only some embodiments of the present invention, and other drawings can be obtained from them by those skilled in the art without creative effort.
Fig. 1 is a schematic flowchart of a touch point identification method according to an embodiment of the present invention;
fig. 2 is a schematic structural diagram of a touch screen according to an embodiment of the present invention;
FIG. 3 is a schematic view of the scanning direction in the horizontal direction according to the embodiment of the present invention;
FIG. 4 is a schematic diagram of a scan direction in a vertical edge direction according to an embodiment of the present invention;
FIG. 5 is a schematic view of the scanning direction and angle in the vertical direction according to the embodiment of the present invention;
FIG. 6 is a schematic diagram of scanning directions and angles in a horizontal direction according to an embodiment of the present invention;
FIG. 7 is a schematic diagram of a touch boundary area according to an embodiment of the present invention;
FIG. 8 is a schematic diagram of an occluded scanning beam path according to an embodiment of the present invention;
FIG. 9 is a schematic diagram of a touch boundary area according to another embodiment of the present invention;
FIG. 10 is a diagram illustrating a quasi-touch point according to an embodiment of the present invention;
FIG. 11 is a schematic diagram of a set of intersections provided by an embodiment of the present invention;
FIG. 12 is a schematic diagram of determining a quasi-touch point X according to an embodiment of the present invention;
FIG. 13 is a diagram illustrating a method for determining a quasi-touch point Y according to an embodiment of the present invention;
FIG. 14 is a diagram illustrating a method for determining a quasi-touch point Z according to an embodiment of the invention;
FIG. 15 is a schematic diagram of determining a quasi-touch point U according to an embodiment of the present invention;
fig. 16 is a schematic structural diagram of a touch point identification device according to an embodiment of the present invention;
fig. 17 is a schematic structural diagram of a touch point identification device according to another embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be described clearly and completely below with reference to the drawings in the embodiments of the present invention. Obviously, the described embodiments are only a part of the embodiments of the present invention, not all of them. All other embodiments obtained by a person skilled in the art from the embodiments given herein without creative effort shall fall within the protection scope of the present invention.
Referring to fig. 1, an embodiment of the present invention provides a touch point identification method for identifying touch points of a touch screen, and fig. 2 shows a structure of the touch screen, where two horizontal edge directions opposite to the touch screen are respectively provided with an optical path transmitting structure and an optical path receiving structure, and two vertical edge directions opposite to the touch screen are respectively provided with an optical path transmitting structure and an optical path receiving structure.
Specifically, the touch point identification method includes the following steps:
101. Drive the light path emitting structures in the horizontal side direction and the vertical side direction of the touch screen to complete one scanning period.
The vertical side direction includes n scanning angles and the horizontal side direction includes m scanning angles, where n and m are integers, n is more than or equal to 1, and m is more than or equal to 1. In step 101, in a 1-to-n (n ≥ 1) scanning mode, the n scanning light paths of a specific light path emitting structure (for example, an infrared emitting lamp) each have a different angle; for that emitting structure, each of these n angles is called a scanning direction. Thus a 1-to-n scanning mode has n scanning directions, and each scanning direction consists of a group of parallel light paths with the same slope. Specifically, a scanning angle in the vertical side direction is the included angle between a light path emitted by a light path emitting structure in the vertical side direction and the vertical side; the light paths emitted by the light path emitting structures in the vertical side direction at the same scanning angle form a group of parallel scanning light paths. As shown in fig. 3, a 1-to-2 scanning mode in the vertical side direction has two scanning directions corresponding to two scanning angles. A scanning angle in the horizontal side direction is the included angle between a light path emitted by a light path emitting structure in the horizontal side direction and the horizontal side; the light paths emitted by the light path emitting structures in the horizontal side direction at the same scanning angle form a group of parallel scanning light paths. As shown in fig. 4, a 1-to-2 scanning mode in the horizontal side direction has two scanning directions corresponding to two scanning angles. The light path emitting structures in the horizontal side direction and in the vertical side direction each emit their light paths once in every scanning direction in a certain order, which is defined as one scanning period; the specific scanning order is not limited here. For example, light path emission may be completed for each scanning direction from left to right in the horizontal side direction and then for each scanning direction from top to bottom in the vertical side direction; the vertical side direction may of course also be scanned first and the horizontal side direction afterwards. In step 101, the number of scanning directions in the horizontal side direction and the angle of each of those scanning directions may be configured, and the number of scanning directions in the vertical side direction and the angle of each of those scanning directions may likewise be configured. Illustratively, referring to fig. 5, the X axis of the horizontal side direction includes 3 scanning directions: the angle of the first scanning direction is θ, the angle of the second scanning direction is π/2, and the angle of the third scanning direction is π - θ; the angle of each scanning direction in the horizontal side direction is expressed as the included angle with the X-axis direction. Referring to fig. 6, the Y axis of the vertical side direction includes 3 scanning directions: the angle of the first scanning direction is ε, the angle of the second scanning direction is π/2, and the angle of the third scanning direction is π - ε; the angles of the scanning directions in the vertical side direction are all expressed as included angles with the Y-axis direction.
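As a concrete illustration of the scanning-direction configuration described above, the following Python sketch builds one group of parallel scan paths per scanning angle for both edges. It is a simplified model, not the patent's code: emitters are assumed to sit at a fixed pitch along the edges, every path is stored as an (x0, y0, angle) triple, and all angles are expressed against the X axis (so the vertical-side angle ε from the text, measured against the Y axis, appears here as π/2 - ε); the screen size, pitch, and angle values are illustrative.

```python
import math

def emitter_positions(edge_length, pitch):
    """Illustrative emitter coordinates spaced at a fixed pitch along one edge."""
    return [i * pitch for i in range(int(edge_length / pitch) + 1)]

def parallel_family(positions, angle, on_horizontal_edge=True):
    """One scanning direction: the group of parallel scan paths emitted at the
    same scanning angle, one (x0, y0, angle) triple per emitter."""
    if on_horizontal_edge:
        return [(x, 0.0, angle) for x in positions]      # emitters on the bottom edge
    return [(0.0, y, angle) for y in positions]          # emitters on the left edge

# Example: 1-to-3 scanning modes on both edges (m = 3 and n = 3 scanning angles).
theta, eps = math.radians(60), math.radians(50)
h_angles = (theta, math.pi / 2, math.pi - theta)         # measured against the X axis
v_angles = (math.pi / 2 - eps, 0.0, math.pi / 2 + eps)   # (eps, pi/2, pi - eps) against the Y axis, re-expressed against the X axis
h_families = [parallel_family(emitter_positions(300.0, 5.0), a) for a in h_angles]
v_families = [parallel_family(emitter_positions(170.0, 5.0), a, False) for a in v_angles]
print(len(h_families) + len(v_families), "scanning directions in one scanning period")
```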
102. Acquire the intersection points of the shielded scanning light paths to form an intersection point set.
In step 101, an intersection point set C, initialized to be empty, may also be set. After the scanning directions corresponding to the n + m scanning angles have been scanned within one scanning period, the shielded scanning light paths in each scanning direction are counted, and the intersection points of the shielded scanning light paths (represented, for example, by the position coordinates (x, y) of the intersection points) are calculated and stored in the intersection point set C in sequence.
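A minimal sketch of this step, continuing the illustrative (x0, y0, angle) path representation from the previous sketch rather than the patent's own data structures: the shielded paths collected over the scanning period are intersected pairwise, parallel paths are skipped, and the rounded coordinates are stored in the intersection point set C.

```python
import itertools
import math

def intersect(p, q, eps=1e-9):
    """Intersection of two scan paths given as (x0, y0, angle) triples,
    or None if the paths are parallel; angles are measured against the X axis."""
    (px, py, pa), (qx, qy, qa) = p, q
    dx1, dy1 = math.cos(pa), math.sin(pa)
    dx2, dy2 = math.cos(qa), math.sin(qa)
    det = dx1 * dy2 - dy1 * dx2
    if abs(det) < eps:                         # parallel: no crossing point
        return None
    t = ((qx - px) * dy2 - (qy - py) * dx2) / det
    return px + t * dx1, py + t * dy1

def intersection_set(shielded_paths):
    """Intersection point set C of all pairwise crossings of shielded scan paths."""
    C = set()
    for p, q in itertools.combinations(shielded_paths, 2):
        pt = intersect(p, q)
        if pt is not None:
            C.add((round(pt[0], 3), round(pt[1], 3)))
    return C
```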
103. Taking any intersection point in the intersection point set as a reference point, determine the first unshielded scanning light path on each side of the reference point at each scanning angle, and acquire n + m groups of touch boundary regions.
At any scanning angle, the touch boundary region is the region between the first unshielded scanning light paths on the two sides of the reference point. For example, a specific scanning direction corresponds, as described above, to a group of parallel scanning light paths; when a touch occurs, each contact point shields several consecutive parallel scanning light paths, and the region between the unshielded scanning light path immediately before and the unshielded scanning light path immediately after this run of consecutive shielded light paths is the touch boundary region in that scanning direction, as shown in fig. 7, where L1 denotes a shielded scanning light path: the unshielded scanning light path L3 immediately before the first shielded scanning light path L1 is the starting boundary of the touch boundary region, and the unshielded scanning light path L4 immediately after the last shielded scanning light path L2 is the ending boundary (the shielded scanning light paths are ordered from left to right in this example; they may of course also be ordered from right to left).
In step 103, when the intersection point set C is not empty, an intersection point is arbitrarily selected from C and used as a reference point. The scanning directions corresponding to the n + m scanning angles are traversed one by one; in each scanning direction, the scanning light path passing through the reference point is calculated and, taking that scanning light path as the reference and the slope of the current scanning direction as the reference slope, the nearest unshielded scanning light path is searched for forward and backward respectively. These nearest unshielded scanning light paths are taken as the starting and ending boundaries of the touch boundary region of the reference point in the current scanning direction. This continues until the scanning directions corresponding to all n + m scanning angles have been processed, giving the n + m groups of touch boundary regions associated with the reference point. Specifically, taking a 1-to-2 scanning mode in both the horizontal side direction and the vertical side direction as an example, as shown in fig. 8, the reference point shields the scanning light paths La and Lb; then, as shown in fig. 9, a group of touch boundary regions is determined in the first scanning direction (the region between La1 and La2 in fig. 9), a group of touch boundary regions is determined in the second scanning direction (the region between Lb1 and Lb2 in fig. 9), and a group of touch boundary regions is determined in the third scanning direction (the region between Lc1 and Lc2 in fig. 9).
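The boundary-region search in this step can be sketched as below, again under the illustrative representation of the previous sketches and not the patent's implementation: each parallel family is reduced to signed perpendicular offsets, the reference point is projected onto the same offset axis, and the nearest unshielded offsets below and above it bound the touch boundary region for that scanning direction. The families are assumed to pair each (x0, y0, angle) path with a shielded flag, and helper names such as all_boundary_regions are assumptions introduced here.

```python
import math

def signed_offset(x, y, angle):
    """Signed perpendicular offset of (x, y) within the parallel family at `angle`."""
    return -x * math.sin(angle) + y * math.cos(angle)

def boundary_region(family, angle, ref_point):
    """(angle, lo, hi): offsets of the first unshielded paths on both sides of the
    reference point; `family` pairs each (x0, y0, angle) path of one scanning
    direction with a shielded flag."""
    ref = signed_offset(*ref_point, angle)
    offs = [(signed_offset(x, y, angle), shielded) for (x, y, _), shielded in family]
    lo = max((o for o, s in offs if not s and o < ref), default=-math.inf)
    hi = min((o for o, s in offs if not s and o > ref), default=math.inf)
    return angle, lo, hi

def all_boundary_regions(families, angles, ref_point):
    """The n + m touch boundary regions around one reference point."""
    return [boundary_region(f, a, ref_point) for f, a in zip(families, angles)]
```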
104. Take the intersection of the n + m groups of touch boundary regions as a quasi touch point.
The intersection of the n + m groups of touch boundary regions is taken as a quasi touch point (the quasi touch point has a certain area, for example quasi-touch point F in fig. 10) and stored into a touch point set D, where the touch point set D, initialized to be empty, can be set in step 101. At the same time, it is calculated which intersection points in the intersection point set C are covered by the quasi touch point, and the covered intersection points are removed from C. All intersection points in C are processed cyclically in this way, that is, another reference point is selected from C and another quasi touch point is determined from that reference point, until C is empty, so that all quasi touch points, namely the touch point set D, are obtained. As shown in fig. 11, when there are 4 intersection points a, b, c, and d in C, the quasi-touch point X is first determined from intersection point a in the manner described above and intersection point a is then deleted (of course, if the quasi-touch point X covers other intersection points, they are deleted as well), as shown in fig. 12; the quasi-touch point Y is determined from b and intersection point b is deleted, as shown in fig. 13; the quasi-touch point Z is determined from c and intersection point c is deleted, as shown in fig. 14; finally, the quasi-touch point U is determined from d and intersection point d is deleted, as shown in fig. 15.
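The loop over the intersection point set C can be sketched as follows, keeping each quasi touch point implicitly as its n + m (angle, lo, hi) strips from the previous sketch: an intersection point counts as covered when it falls inside every strip, and covered points are removed from C so the loop ends once every intersection has been assigned to a quasi touch point. This is an assumed data layout for illustration, not the patent's implementation; region_fn stands for a helper such as the all_boundary_regions sketch above.

```python
import math

def signed_offset(x, y, angle):
    """Signed perpendicular offset of (x, y) within the parallel family at `angle`."""
    return -x * math.sin(angle) + y * math.cos(angle)

def covers(strips, point):
    """True if `point` lies inside every (angle, lo, hi) touch boundary strip."""
    x, y = point
    return all(lo <= signed_offset(x, y, angle) <= hi for angle, lo, hi in strips)

def quasi_touch_points(C, region_fn):
    """region_fn(ref) returns the n + m strips around a reference point."""
    C, D = set(C), []                      # D is the touch point set
    while C:
        ref = next(iter(C))                # any intersection point as reference point
        strips = region_fn(ref)
        D.append(strips)                   # one quasi touch point, kept as its strips
        C = {p for p in C if not covers(strips, p)}   # drop covered intersections
    return D
```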
105. Determine the touch point according to the quasi touch point.
Specifically, to improve the accuracy of the quasi touch points in the touch point set D, a ghost-point removal operation can also be performed on the quasi touch points in D to reject false touch points from the touch point set. In addition, in step 101 an area threshold t for the quasi touch points may be set, and quasi touch points in D whose area is smaller than the area threshold t may then be rejected. For example, if the touch screen supports touch with a stylus whose minimum diameter is 1 mm, t may be set to 1 mm². Finally, trajectory tracking and smoothing algorithms are applied to the quasi touch points in the touch point set D, and the quasi touch points are coordinate-converted so that their position coordinates are output, thereby determining the touch points.
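The area check mentioned above can be sketched by turning the quasi touch region (the intersection of the n + m strips) into a convex polygon and measuring it: a screen-sized rectangle is clipped against the two half-planes of every (angle, lo, hi) strip with a standard Sutherland-Hodgman step, the area comes from the shoelace formula, and quasi touch points smaller than the threshold t are rejected. The screen size and t = 1 mm² follow the example in the text; the routine itself is an illustrative sketch, not the patent's own code, and it does not cover the trajectory tracking, smoothing, or coordinate conversion that follow.

```python
import math

def clip(poly, a, b, c):
    """Keep the part of convex polygon `poly` (list of (x, y)) where a*x + b*y <= c."""
    out = []
    for i, (x1, y1) in enumerate(poly):
        x2, y2 = poly[(i + 1) % len(poly)]
        in1, in2 = a * x1 + b * y1 <= c, a * x2 + b * y2 <= c
        if in1:
            out.append((x1, y1))
        if in1 != in2:                              # this edge crosses the clip line
            t = (c - a * x1 - b * y1) / (a * (x2 - x1) + b * (y2 - y1))
            out.append((x1 + t * (x2 - x1), y1 + t * (y2 - y1)))
    return out

def quasi_point_polygon(strips, width, height):
    """Clip the screen rectangle against each (angle, lo, hi) strip."""
    poly = [(0.0, 0.0), (width, 0.0), (width, height), (0.0, height)]
    for angle, lo, hi in strips:
        nx, ny = -math.sin(angle), math.cos(angle)  # normal of the parallel family
        if poly and hi != math.inf:
            poly = clip(poly, nx, ny, hi)           # keep offset <= hi
        if poly and lo != -math.inf:
            poly = clip(poly, -nx, -ny, -lo)        # keep offset >= lo
    return poly

def polygon_area(poly):
    """Shoelace formula for a convex polygon given as an ordered vertex list."""
    return abs(sum(x1 * y2 - x2 * y1
                   for (x1, y1), (x2, y2) in zip(poly, poly[1:] + poly[:1]))) / 2.0

def filter_quasi_points(D, width=300.0, height=170.0, t=1.0):
    """Reject quasi touch points whose area is below the threshold t (in mm^2)."""
    return [s for s in D if polygon_area(quasi_point_polygon(s, width, height)) >= t]
```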
In the above scheme, within one scanning period, the touch point identification device can take any intersection point formed by the shielded scanning light paths, use the first unshielded scanning light path on each side of the intersection point at every scanning angle to acquire n + m groups of touch boundary regions, take the intersection of the n + m groups of touch boundary regions as a quasi touch point, and finally determine the touch point according to the quasi touch point. Compared with the prior art, the embodiments of the invention use the shielded scanning light paths and the unshielded scanning light paths together for touch point identification, which can improve the accuracy and discrimination of touch point identification.
Referring to fig. 16, a touch point identification device is provided, which is used for identifying touch points of a touch screen, where two horizontal side directions opposite to the touch screen are respectively provided with a light path emitting structure and a light path receiving structure, and two vertical side directions opposite to the touch screen are respectively provided with a light path emitting structure and a light path receiving structure; the device includes:
the driving module 21 is configured to drive the light path emission structure in the horizontal side direction and the vertical side direction of the touch screen to complete a scanning cycle, where the vertical side direction includes n scanning angles, the horizontal side direction includes m scanning angles, n and m are integers, n is greater than or equal to 1, and m is greater than or equal to 1;
an obtaining module 22, configured to obtain an intersection point of the shielded scanning light path, and form an intersection point set;
a processing module 23, configured to determine, for any intersection in the intersection set acquired by the acquisition module 22 as a reference point, a first unshielded scanning optical path on two sides of the reference point at each scanning angle, and acquire n + m groups of touch boundary regions, where, at any scanning angle, the touch boundary region is a region between the first unshielded scanning optical paths on two sides of the reference point;
the processing module 23 is further configured to take an intersection of the n + m sets of touch boundary areas as a quasi-touch point; and determining the touch point according to the quasi touch point.
In an exemplary scheme, the processing module 23 is further configured to reject, from the intersection point set, the intersection points covered by the quasi touch point; and, when the intersection point set is not empty, to determine another reference point in the intersection point set and determine another quasi touch point according to the other reference point.
In an exemplary scheme, the processing module 23 is specifically configured to perform a ghost-point removal operation on the quasi touch points in the touch point set, rejecting false touch points in the touch point set and/or rejecting quasi touch points whose area is smaller than an area threshold in the touch point set.
In the case of integrated modules, the touch point identification device includes a storage unit, a processing unit, and an interface unit. The processing unit is configured to control and manage the actions of the touch point identification device; for example, the processing unit is configured to support the touch point identification device in executing the processes 101-104 in fig. 1. The interface unit is used to support information interaction between the touch point identification device and other equipment. The storage unit is used to store the program code and data of the touch point identification device.
For example, the processing unit is a processor, the storage unit is a memory, and the interface unit is a communication interface. The touch point identification device shown in fig. 17 includes a communication interface 301, a processor 302, a memory 303, and a bus 304, where the communication interface 301 and the processor 302 are connected to the memory 303 through the bus 304.
The processor 302 may be a general-purpose central processing unit (CPU), a microprocessor, an application-specific integrated circuit (ASIC), or one or more integrated circuits for controlling the execution of the programs of the solution of the present application.
The memory 303 may be a read-only memory (ROM) or another type of static storage device capable of storing static information and instructions, a random access memory (RAM) or another type of dynamic storage device capable of storing information and instructions, an electrically erasable programmable read-only memory (EEPROM), a compact disc read-only memory (CD-ROM) or other optical disc storage (including compact discs, laser discs, digital versatile discs, Blu-ray discs, and the like), magnetic disk storage media or other magnetic storage devices, or any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer, without being limited to these. The memory may be self-contained and connected to the processor through the bus, or may be integrated with the processor.
The memory 303 is used for storing application program codes for executing the scheme of the application, and the processor 302 is used for controlling the execution. The communication interface 301 is used for information interaction with other devices. The processor 302 is configured to execute application program code stored in the memory 303 to implement the methods described in the embodiments of the present application.
In addition, a computer storage medium (or media) is also provided, comprising instructions that, when executed, perform the method operations performed by the touch point identification device in the above embodiments. Additionally, a computer program product is also provided, comprising the above-described computer storage medium (or media).
An embodiment of the present invention provides a display device including: a touch screen and a touch point identification device provided by the above embodiments. The display device may be: any product or component with a display function, such as electronic paper, a mobile phone, a tablet computer, a television, a display, a notebook computer, a digital photo frame, a navigator and the like.
For all relevant details of the steps in the above method embodiment, reference may be made to the functional description of the corresponding functional modules, which is not repeated here.
It should be understood that, in various embodiments of the present invention, the sequence numbers of the above-mentioned processes do not mean the execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation on the implementation process of the embodiments of the present invention.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described systems, apparatuses and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the several embodiments provided in the present application, it should be understood that the disclosed system, apparatus, and method may be implemented in other ways. For example, the above-described device embodiments are merely illustrative, and for example, the division of the units is only one logical functional division, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on this understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the methods of the embodiments of the present invention. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
The above description is only for the specific embodiments of the present invention, but the scope of the present invention is not limited thereto, and any person skilled in the art can easily conceive of the changes or substitutions within the technical scope of the present invention, and all the changes or substitutions should be covered within the scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the appended claims.

Claims (7)

1. A touch point identification method, characterized by being used for touch point identification of a touch screen, wherein a light path transmitting structure and a light path receiving structure are respectively arranged in two horizontal edge directions opposite to the touch screen, and a light path transmitting structure and a light path receiving structure are respectively arranged in two vertical edge directions opposite to the touch screen, the method comprising:
driving light path emitting structures in the horizontal side direction and the vertical side direction of the touch screen to finish a scanning period, wherein the vertical side direction comprises n scanning angles, the horizontal side direction comprises m scanning angles, n and m are integers, n is more than or equal to 1, and m is more than or equal to 1;
acquiring intersection points of the shielded scanning light paths to form an intersection point set;
determining a first unshielded scanning light path on two sides of the reference point and acquiring n + m groups of touch boundary areas under each scanning angle by taking any intersection point in the intersection point set as the reference point, wherein the touch boundary areas are areas between the first unshielded scanning light paths on two sides of the reference point under any scanning angle;
taking the intersection of the n + m groups of touch boundary areas as a quasi touch point;
eliminating the intersection points covered by the quasi-touch points in the intersection point set;
when the intersection point set is not empty, another reference point is determined in the intersection point set again, and another quasi touch point is determined according to the another reference point;
and determining the touch point according to the quasi touch point.
2. The touch point identification method according to claim 1, wherein the determining the touch point according to the quasi-touch point comprises:
performing a ghost-point removal operation on the quasi touch points in the touch point set, and rejecting false touch points in the touch point set and/or rejecting quasi touch points with an area smaller than an area threshold in the touch point set.
3. A touch point identification device is characterized by being used for touch point identification of a touch screen, wherein a light path transmitting structure and a light path receiving structure are respectively arranged in two horizontal edge directions opposite to the touch screen, and a light path transmitting structure and a light path receiving structure are respectively arranged in two vertical edge directions opposite to the touch screen; the device includes:
the driving module is used for driving the light path emission structure in the horizontal side direction and the vertical side direction of the touch screen to complete a scanning period, wherein the vertical side direction comprises n scanning angles, the horizontal side direction comprises m scanning angles, n and m are integers, n is more than or equal to 1, and m is more than or equal to 1;
the acquisition module is used for acquiring the intersection point of the shielded scanning light path to form an intersection point set;
a processing module, configured to determine, for any intersection in the intersection set acquired by the acquisition module as a reference point, a first unshielded scanning optical path on two sides of the reference point at each scanning angle, and acquire n + m sets of touch boundary regions, where, at any scanning angle, the touch boundary region is a region between the first unshielded scanning optical paths on two sides of the reference point;
the processing module is further configured to take an intersection of the n + m sets of touch boundary areas as a quasi-touch point;
the processing module is further configured to eliminate an intersection covered by the quasi-touch point from the intersection set;
the processing module is further configured to determine another reference point in the intersection set again when the intersection set is not empty, and determine another quasi-touch point according to the another reference point;
the processing module is further configured to determine the touch point according to the quasi touch point.
4. The touch point identification device according to claim 3, wherein the processing module is specifically configured to perform a ghost-point removal operation on the quasi touch points in the touch point set, and to reject false touch points in the touch point set and/or reject quasi touch points whose area in the touch point set is smaller than an area threshold.
5. A touch point identification device, characterized by comprising a communication interface, a processor, a memory, and a bus; the memory is used for storing computer-executable instructions, the processor is connected with the memory through the bus, and when the touch point identification device runs, the processor executes the computer-executable instructions stored in the memory, so that the touch point identification device executes the touch point identification method according to any one of claims 1-2.
6. A computer storage medium comprising instructions that, when executed on a computer, cause the computer to perform the touch point identification method of any of claims 1-2.
7. A display device, comprising: a touch screen, and the touch point identification device according to any one of claims 3-5.
CN201811399723.8A 2018-11-22 2018-11-22 Touch point identification method and device and display equipment Active CN109542276B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811399723.8A CN109542276B (en) 2018-11-22 2018-11-22 Touch point identification method and device and display equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811399723.8A CN109542276B (en) 2018-11-22 2018-11-22 Touch point identification method and device and display equipment

Publications (2)

Publication Number Publication Date
CN109542276A CN109542276A (en) 2019-03-29
CN109542276B 2022-02-08

Family

ID=65849442

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811399723.8A Active CN109542276B (en) 2018-11-22 2018-11-22 Touch point identification method and device and display equipment

Country Status (1)

Country Link
CN (1) CN109542276B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110262685B (en) * 2019-05-28 2022-07-26 青岛海信商用显示股份有限公司 Touch detection method, touch detection device, storage medium and touch screen
CN113126795B (en) * 2019-12-31 2022-08-19 青岛海信商用显示股份有限公司 Touch identification method of touch display device and related equipment
CN111708456A (en) * 2020-06-17 2020-09-25 青岛海信商用显示股份有限公司 Display device and touch area determination method
CN114415864B (en) * 2020-10-13 2024-02-02 青岛海信商用显示股份有限公司 Touch area determining method and display device

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104731418A (en) * 2015-03-27 2015-06-24 青岛海信电器股份有限公司 Touch screen positioning method and device and touch screen device
CN105260064A (en) * 2015-10-15 2016-01-20 青岛海信电器股份有限公司 Touch point identification method and apparatus, and display device
CN106201064A (en) * 2016-06-24 2016-12-07 青岛海信电器股份有限公司 Touch localization method, device and touch screen
CN106802744A (en) * 2017-01-03 2017-06-06 青岛海信电器股份有限公司 There is the method and device of positioning touch area during touch control operation in infrared touch panel
CN107728860A (en) * 2017-10-19 2018-02-23 青岛海信电器股份有限公司 A kind of touch points of infrared touch screen recognition methods, device and touch-screen equipment

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101930322B (en) * 2010-03-26 2012-05-23 深圳市天时通科技有限公司 Identification method capable of simultaneously identifying a plurality of contacts of touch screen
CN102479000A (en) * 2010-11-26 2012-05-30 北京汇冠新技术股份有限公司 Infrared touch screen and multi-point identification method thereof
CN103677441B (en) * 2012-09-18 2017-02-08 北京汇冠新技术股份有限公司 Infrared multipoint identification method, infrared multipoint identification device and infrared touch screen
CN106020566A (en) * 2016-05-05 2016-10-12 广州华欣电子科技有限公司 Scanning method and device for infrared touch screen
CN106502475B (en) * 2016-10-26 2019-07-16 青岛海信电器股份有限公司 A kind of infrared touch point identification method, infrared touch device and display device
CN107957826B (en) * 2018-01-04 2020-10-30 河北华发教育科技股份有限公司 Method and system for identifying multi-touch-point area of infrared touch screen

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104731418A (en) * 2015-03-27 2015-06-24 青岛海信电器股份有限公司 Touch screen positioning method and device and touch screen device
CN105260064A (en) * 2015-10-15 2016-01-20 青岛海信电器股份有限公司 Touch point identification method and apparatus, and display device
CN106201064A (en) * 2016-06-24 2016-12-07 青岛海信电器股份有限公司 Touch localization method, device and touch screen
CN106802744A (en) * 2017-01-03 2017-06-06 青岛海信电器股份有限公司 There is the method and device of positioning touch area during touch control operation in infrared touch panel
CN107728860A (en) * 2017-10-19 2018-02-23 青岛海信电器股份有限公司 A kind of touch points of infrared touch screen recognition methods, device and touch-screen equipment

Also Published As

Publication number Publication date
CN109542276A (en) 2019-03-29

Similar Documents

Publication Publication Date Title
CN109542276B (en) Touch point identification method and device and display equipment
CN106201064B (en) Touch localization method, device and touch screen
US20110267264A1 (en) Display system with multiple optical sensors
CN111292531B (en) Tracking method, device and equipment of traffic signal lamp and storage medium
JPS6170407A (en) Instrument for measuring distance
CN111310840B (en) Data fusion processing method, device, equipment and storage medium
KR20110016994A (en) Camera gestures for user interface control
KR20110051164A (en) Image processing device, image processing method, image input device, and image input/output device
CN111402160B (en) Point cloud data denoising method, device, equipment and storage medium
CN104777984A (en) Touch trajectory tracking method and device and touch screen device
US20190325593A1 (en) Image processing apparatus, system, method of manufacturing article, image processing method, and non-transitory computer-readable storage medium
CN106598351B (en) Touch point processing method and device
CN102419664B (en) Multi-point identification method and system of infrared touch screen
CN112233139A (en) System and method for detecting motion during 3D data reconstruction
CN110262685B (en) Touch detection method, touch detection device, storage medium and touch screen
CN111524165A (en) Target tracking method and device
US20220392107A1 (en) Image processing apparatus, image processing method, image capturing apparatus, and non-transitory computer-readable storage medium
CN103093475A (en) Image processing method and electronic device
US20140035879A1 (en) Optical touch system and method
CN113126795B (en) Touch identification method of touch display device and related equipment
CN114964204A (en) Map construction method, map using method, map constructing device, map using equipment and storage medium
CN114077359A (en) Touch recognition method and display device
CN110291557B (en) Monitoring device
US10713808B2 (en) Stereo matching method and system using rectangular window
RU2658092C2 (en) Method and navigation system of the mobile object using three-dimensional sensors

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant