CN112905064B - Photoelectric touch positioning method, touch screen and application thereof - Google Patents


Info

Publication number
CN112905064B
CN112905064B (Application CN202110295833.5A)
Authority
CN
China
Prior art keywords
point
indicator
image sensor
touch
optical
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110295833.5A
Other languages
Chinese (zh)
Other versions
CN112905064A (en)
Inventor
王贵有
王锦洲
Current Assignee
SICHUAN IDAO TECHNOLOGY CO LTD
Original Assignee
SICHUAN IDAO TECHNOLOGY CO LTD
Priority date
Filing date
Publication date
Application filed by SICHUAN IDAO TECHNOLOGY CO LTD
Priority to CN202110295833.5A
Publication of CN112905064A
Application granted
Publication of CN112905064B

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/042Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention discloses a photoelectric touch positioning method, a touch screen, and applications thereof. The touch screen comprises an imaging surface and a touch positioning detection device for that surface. The detection device can either be built into a display screen as an integrated unit or be mounted externally on a display, yielding a compact, portable, plug-and-play device. The detection device locates touches by the photoelectric touch positioning method described below. The detection device and the imaging surface lie in the same vertical plane, and the detection device comprises a first optical imaging system and a second optical imaging system. The method first determines the coordinates of the indicator directly; it then compares the current picture with the previous picture to detect the change in the indicator's image-plane coordinates, and finally obtains the indicator's object-plane coordinates by triangulation.

Description

Photoelectric touch positioning method, touch screen and application thereof
Technical Field
The invention relates to the field of electronics, in particular to a photoelectric touch positioning method, a touch screen and application thereof.
Background
Touch screens, as the most accessible human-computer interaction technology, have spread into numerous fields. Among touch screen schemes based on photoelectric technology, the most mature arranges infrared transmitting tubes and infrared receiving tubes along the four sides of a screen in one-to-one correspondence, forming a grid of horizontally and vertically crossed infrared beams; the position of an indicator can be detected from which beams it blocks, and the approach has a wide range of applications. However, the resolution is limited by the number of infrared tube pairs and is therefore difficult to raise, and the touch devices disclosed so far must be manufactured to match the size of the display screen: as the display area grows, so does the cost, and because its size is fixed, a touch device built for one screen size cannot be reused on another.
Against this background, prior art CN201616089U discloses a touch screen and touch system comprising a lens, an image sensor, a touch surface, and a processing unit. The lens collects touch information from the touch surface and transmits it through an optical fiber to the image sensor, where it forms an image and generates image information; the image sensor passes the image information to the processing unit, which applies triangulation to the image data to obtain the position of the indicator. This prior art, however, suffers from several defects: low coordinate precision, poor adjustability, poor matching, inconvenient universality, and blind spots in scanning. Specifically:
1. Low coordinate precision: any structural product carries machining and assembly errors, so its actual dimensions deviate from its designed dimensions. Applying the designed dimensional information directly to the touch screen as actually manufactured means the coordinate reference can never fully match the real screen, leaving a coordinate error. Specifically, the prior-art touch system takes the objective principal point of the optical lens as its coordinate reference; this reference is independent of the touch screen and lacks any correlation with it, so the determined coordinates are limited by the lens's installation position, manufacturing error, and assembly error. Whether the error lies in the lens itself or between the actual and designed installation positions, the touch position read by the system is directly in error;
2. Poor adjustability and fault tolerance: the prior art cannot adjust the position of the lens; the lens can only be installed at a preset position, and once its position shifts, the error between the image sensor's coordinate parameters and the corresponding touch screen grows;
3. Inconvenient universality: when applied to touch screens of different models, the corresponding parameter information must be reset each time, making the system troublesome to use;
4. Error-prone recognition and large data volume: because the technique analyses whole images of the touch surface, it not only easily misidentifies touch-like images, but must also search for the correct touch position within images that contain no obvious marker, making the data processing large and complicated;
5. Blind spots in scanning: even with a light source, blind spots can occur whenever the reflection angle at the touch point does not fall within the reception angle.
Disclosure of Invention
The invention aims to provide a photoelectric touch positioning method, a touch screen, and applications thereof that solve the above technical problems of the prior art.
The technical scheme adopted by the invention is as follows:
the photoelectric touch positioning method comprises the following steps:
s1, respectively installing a first optical imaging system and a second optical imaging system at two ends of one edge of an imaging surface, wherein end points of the two ends of the edge are P 1 Point sum P 2 Point;
the first optical imaging system comprises a first information processing module with a first storage module, a first optical lens and a first image sensor, wherein the first information processing module is arranged at an imaging surface frame, the first optical lens is used for collecting image information of a touch surface of an imaging surface, and the first image sensor is used for receiving the collected information from the first optical lens and sending the information to the first information processing module;
the second optical imaging system comprises a second information processing module which is arranged at the position of an imaging surface frame and is provided with a second storage module, a second optical lens for collecting the image information of the touch surface of the imaging surface, and a second image sensor for receiving the collected information from the second optical lens and sending the information to the second information processing module;
the first information processing module and the second information processing module communicate with each other; the objective principal points of the first optical lens and the second optical lens are points H and H1 respectively; the point at which the objective principal point H maps onto the image plane in the dot-matrix diagram of the first image sensor coincides with the center point H′ of that diagram (H coincides with H′), and the point at which the objective principal point H1 maps onto the image plane in the dot-matrix diagram of the second image sensor coincides with the center point H1″ of that diagram (H1 coincides with H1″);
s2, initializing the first optical imaging system and the second optical system, and acquiring P by the first information processing module and the second information processing module 1 P 2 And storing it in a memory module;
s3, exposure initialization:
starting a first optical imaging system, exposing static information of a touch surface by a first image sensor, and storing an exposed static picture in a first storage module;
starting a second optical imaging system, exposing static information of one touch surface by a second image sensor, and storing the exposed static image into a second storage module;
s4, initializing a reference point P 1 Angle (c): at P 1 Setting a trigger point which can be collected by an image sensor;
starting the first optical imaging system, the first image sensor exposes the P1-point dynamic information of the touch surface and stores the exposed dynamic picture in the first storage module;
starting the second optical imaging system, the second image sensor exposes the P1-point dynamic information of the touch surface and stores the exposed dynamic picture in the second storage module;
s5, in the first optical imaging system, the first information processing module acquires a trigger point P 1 P corresponding to the light spot at the phase plane of the first image sensor 1 ' coordinates in a dot matrix of the first image sensor where the first information processing module is located, and the first information processing module calculates & lt P according to a trigonometric function 1 ′H′H 1 "and storing it in a first storage module;
s6, in the second optical imaging system, the second information processing module acquires P of a phase plane light spot corresponding to the trigger point on the second image sensor 1 The second information processing module calculates the & lt P & gt according to a trigonometric function 1 "H 1 The radian absolute value of 'H' is stored in a second storage module;
s7, determining P according to a triangulation method based on the angle values in S5 and S6 1 Coordinates of the point relative to the objective principal point H of the first optical lens;
s8, initializing a reference point P 2 Angle (c): according to the above-mentioned steps S4 to S7,at P 2 A trigger point which can be collected by the first image sensor and the second image sensor is arranged at the position, and the shooting exposure is carried out, and the angle P is calculated by the first optical imaging system 2 ′H′H 1 "the radian absolute value is stored and the second optical imaging system calculates ≈ P 2 "H 1 The value of "H" is stored and based on these two angle values, P is determined according to triangulation 2 Point relative to P 1 Coordinates of the points;
s9, the optical imaging system and the optical imaging system start to patrol the moving coordinates of the indicator on the touch surface:
(1) 1 st patrol inspection indicator:
the first image sensor exposes a picture of the indicator, acquires the coordinate value of the indicator in the dot matrix diagram and caches the coordinate value in the first storage module;
meanwhile, the second image sensor exposes a picture of the indicator, acquires the indicator's coordinate value in the dot-matrix diagram, and caches it in the second storage module;
determining the coordinates of the indicator relative to point P1 by triangulation;
(2) the 2 nd patrol indicator:
exposing a picture of a current indicator by a first image sensor, and caching the picture into a first storage module;
meanwhile, the second image sensor exposes the picture of the current indicator and caches the picture in the second storage module;
(3) the first information processing module processes the pictures acquired in steps (1) and (2) in the first storage module; each picture retains only the pixel points of the indicator within a range of ±α about the lens's circumferential track;
(4) the first information processing module compares the current step's picture against the previous step's picture as reference, obtaining the coordinate difference, in the dot-matrix diagram, of the indicator in the current picture relative to the indicator in the previous step; from the previous step's coordinate value and this difference it obtains the current indicator's coordinate value in the dot-matrix diagram; based on this coordinate value, it obtains the radian absolute value of the current point relative to the optical-axis center of the first optical lens, namely ∠P3′H′H1″, and stores it in the first storage module;
(5) the second information processing module processes the pictures acquired in steps (1) and (2) in the second storage module following steps (3) and (4) of S9; it obtains the current indicator's coordinate value in the dot-matrix diagram and, based on it, the radian absolute value of the current point relative to the optical-axis center of the second optical lens, namely ∠P3″H1″H′, and stores it in the second storage module;
(6) based on the angle values obtained, determining the coordinate value of the current point P3 relative to point P1 by triangulation;
(7) judging the pixel coordinate value of the indicator: if it falls within the measurement range of the image pixel area of the imaging surface it is valid and is stored in the storage module; otherwise the data is discarded;
(8) the information processing module sends the valid pixel coordinate value of the indicator to the host;
(9) repeating steps (2) to (8) to track the moving track of the indicator.
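The geometric core of steps S5 to S7 — converting a light-spot position in each sensor's dot-matrix diagram into an angle, then intersecting the two lines of sight — can be sketched as follows. This is a minimal illustration assuming an ideal pinhole lens model, with P1 as origin and P1→P2 as the x axis; the function names, the pixel-pitch and focal-length parameters, and the angle convention are assumptions, since the patent only says the angles are calculated "according to a trigonometric function".

```python
import math

def pixel_to_angle(u, u_center, pixel_pitch, focal_length):
    """Angle (radians) of a light spot relative to the optical axis,
    from its column u in the sensor's dot-matrix diagram.
    Assumes an ideal pinhole model (hypothetical parameters)."""
    return math.atan((u - u_center) * pixel_pitch / focal_length)

def triangulate(alpha, beta, baseline):
    """Intersect the two lines of sight.
    alpha: angle at P1 between baseline P1P2 and the indicator;
    beta:  angle at P2 between baseline P2P1 and the indicator;
    baseline: length of P1P2 (e.g. 1920 pixel equivalents).
    Returns (x, y) with P1 as origin and P1->P2 as the x axis:
    y = x*tan(alpha) = (baseline - x)*tan(beta)."""
    ta, tb = math.tan(alpha), math.tan(beta)
    x = baseline * tb / (ta + tb)
    y = x * ta
    return x, y
```

For example, if both systems report 45° angles over a baseline of 2 span units, the intersection lands at the midpoint, one unit above the baseline.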
Further, the touch screen comprises an optical scanning mechanism capable of emitting light to scan the touch surface, and the trigger point is a component capable of reflecting the light emitted by the optical scanning mechanism.
Further, the trigger point is an LED light emitting source.
Further, a first optical filter is arranged on an optical path between the first optical lens and the first image sensor, and a second optical filter is arranged on an optical path between the second optical lens and the second image sensor.
Further, the objective lenses of the first optical lens and the second optical lens are both fish-eye lenses.
A touch screen comprises an imaging surface and a touch positioning detection device for the imaging surface, wherein the touch positioning detection device performs positioning detection according to the photoelectric touch positioning method.
The invention discloses application of a touch screen in a touch feedback system.
Due to the adoption of the technical scheme, the invention has the beneficial effects that:
1. With the photoelectric touch positioning method, touch screen, and applications of the invention, coordinates can be re-established from the trigger points regardless of the actual size of the manufactured touch screen or of the optical imaging system. The coordinates are not limited by the screen size and are unaffected by the assembly and machining precision of other related parts; in actual use they are constrained only by the actual positions of the trigger points. This eliminates the adverse influence of the machining and assembly precision of the touch screen and optical lenses on the coordinate precision of the touch surface, effectively ties the actual coordinate precision of the touch surface to the trigger-point positions alone, removes the influence of unnecessary dimensional precision, and thus yields a touch positioning detection mode with higher positioning precision;
2. In actual use, the optical lenses can be installed adaptively according to the actual conditions of the touch surface; only the range-positioning unit needs to be installed accurately and with high positional precision. This improves the applicability and fault tolerance of the method and the convenience of actual use;
3. The scheme directly determines the coordinates of the indicator the first time, and thereafter compares the current picture with the previous picture to detect the change in the indicator's image-plane coordinates, then obtains the indicator's object-plane coordinates by triangulation;
4. The optical scanning mechanism improves the recognition accuracy of the indicator, so that touch positioning is unaffected by ambient light and the difficulty of program calculation and judgment is reduced.
Drawings
To illustrate the technical solution of the embodiments of the present invention more clearly, the drawings needed for the embodiments are briefly described below. It should be understood that the proportions of components in the drawings do not represent the proportions of an actual design; the drawings are only schematic diagrams of structure or position.
FIG. 1 is a schematic structural view of the present invention;
FIG. 2 is a geometric diagram of the object plane corresponding to the phase plane;
FIG. 3 is a schematic diagram of a first optical imaging system;
FIG. 4 is a schematic diagram of a second optical imaging system;
FIG. 5 is a schematic diagram of a first information processing module;
FIG. 6 is a schematic diagram of a second information processing module;
FIG. 7 is a diagram of the position relationship between the optical lens and the display;
FIG. 8 is picture a;
FIG. 9 is picture b.
Reference numerals in the drawings indicate:
1-first optical imaging system, 11-first optical lens, 12-first optical filter, 13-first image sensor, 14-first information processing module (containing the first storage module);
2-second optical imaging system, 21-second optical lens, 22-second optical filter, 23-second image sensor, 24-second information processing module (containing the second storage module);
3-an indicator;
4-image plane, 41-image pixel area.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the detailed description and specific examples, while indicating the preferred embodiment of the invention, are intended for purposes of illustration only and are not intended to limit the scope of the invention.
The present invention will be described in detail with reference to fig. 1 to 7.
Example 1
As shown in FIG. 1, a touch screen includes an imaging surface 4 and a touch positioning detection device for the imaging surface 4. The detection device can either be built into a display screen as an integrated unit or be mounted externally on a display, realizing a compact plug-and-play device that is convenient to carry. The detection device positions touches by the photoelectric touch positioning method below. The detection device and the imaging surface 4 are arranged in the same vertical surface (or plane). The detection device includes a first optical imaging system 1 and a second optical imaging system 2.
The photoelectric touch positioning method comprises the following steps:
s1, respectively installing a first optical imaging system 1 and a second optical imaging system 2 at two ends of one edge of an imaging surface, wherein end points of the two ends of the edge are a point P1 and a point P2;
the first optical imaging system 1 comprises a first information processing module 14 with a first storage module, a first optical lens 11 and a first image sensor 13, wherein the first information processing module 14 is installed at the frame of the imaging surface 4, the first optical lens 11 collects image information of a touch surface of the imaging surface 4, and the first image sensor 13 receives the collected information from the first optical lens 11 and sends the information to the first information processing module 14;
the second optical imaging system 2 comprises a second information processing module 24 with a second storage module, which is arranged at the frame of the imaging surface 4, a second optical lens 21 for collecting the image information of the touch surface of the imaging surface 4, and a second image sensor 23 for receiving the collected information from the second optical lens 21 and sending the information to the second information processing module 24;
the first information processing module 14 and the second information processing module 24 communicate with each other; the objective principal points of the first optical lens 11 and the second optical lens 21 are points H and H1 respectively; the point at which the objective principal point H maps onto the image plane in the dot-matrix diagram of the first image sensor 13 coincides with the center point H′ of that diagram (H coincides with H′), and the point at which the objective principal point H1 maps onto the image plane in the dot-matrix diagram of the second image sensor 23 coincides with the center point H1″ of that diagram (H1 coincides with H1″);
s2, initializing the first optical imaging system 1 and the second optical system 2:
(1) initializing a first storage module and a second storage module;
(2) initializing communication ports of a first information processing module and a second information processing module;
③P 1 P 2 initializing the range equivalent;
each information processing module is communicated with a host through a peripheral interface, and automatically reads the current display driving pixel resolution value of the host and caches the value in a storage module; such as: resolution is 1920 (length) × 1080 (width) pixels, so that the length and width of image pixel region 41 is 1920 pixels long, 1080 pixels wide, and P 1 P 2 =1920 pixels; p 1 P 2 The span equivalent can also be manually entered.
(4) Individual image sensor initialization
(5) Other System initialization
After the system is initialized, each optical imaging system starts to work.
(6) Initializing synchronous exposure timers of all information processing modules;
s3, exposure initialization:
starting the first optical imaging system 1, exposing static information of one touch surface by the first image sensor 13, and storing the exposed static image into a first storage module;
starting the second optical imaging system 2, exposing static information of one touch surface by the second image sensor 23, and storing the exposed static image into the second storage module;
s4, initializing a reference point P 1 Angle (c): at P 1 Setting a trigger point which can be collected by an image sensor;
the first optical imaging system 1 is started, and the first image sensor 13 exposes the P1-point dynamic information of the touch surface and stores the exposed dynamic picture in the first storage module;
the second optical imaging system 2 is started, and the second image sensor 23 exposes the P1-point dynamic information of the touch surface and stores the exposed dynamic picture in the second storage module;
s5, in the first optical imaging system 1, the first information processing module 14 acquires the trigger point P 1 P corresponding to the light spot at the phase plane of the first image sensor 13 1 ' coordinates in the bitmap of the first image sensor 13 where it is located, the first information processing module 14 calculates ^ P according to the trigonometric function 1 Radian absolute value of 'H1' and storing it in the first storage module;
s6, in the second optical imaging system 2, the second information processing module 24 acquires P of the trigger point corresponding to the phase plane light spot of the second image sensor 23 1 "coordinates in the bitmap of the second image sensor 23 where it is located, and the second information processing module 24 calculates ≦ P according to a trigonometric function 1 "H 1 The radian absolute value of 'H' is stored in a second storage module;
s7, determining the coordinate of the point P1 relative to the objective principal point H of the first optical lens 11 according to a triangulation method based on the angle values in the S5 and the S6;
s8, initializing a reference point P 2 Angle (c): according to the above steps S4 to S7, in P 2 A trigger point which can be collected by the first image sensor 13 and the second image sensor 23 is arranged at the position, and the photographing exposure is carried out, and the angle P is calculated by the first optical imaging system 1 2 ′H′H 1 "the radian absolute value of the radian is stored, the second optical imaging system calculates and stores the value of ≈ P2" H1"H', and based on the two angle values, P is determined according to a triangulation method 2 Point relative to P 1 Coordinates of the points;
s9, the optical imaging system 1 and the optical imaging system 2 start to scan the movement coordinates of the pointer 3 on the touch surface (the pointer is within a plane range defined as a touch object, for example, a display screen of a computer or an electronic whiteboard, and performs a touch to input information):
(1) 1 st patrol indicator 3:
the first image sensor 13 exposes a picture of the indicator 3, obtains the coordinate value of the indicator 3 in the dot matrix diagram, and caches the coordinate value in the first storage module;
meanwhile, the second image sensor 23 exposes a picture of the indicator 3, acquires the coordinate value of the indicator 3 from the dot matrix diagram, and caches the coordinate value in the second storage module;
determining the coordinates of the indicator 3 relative to point P1 by triangulation;
the first exposed picture is not compared against a reference picture; it is assumed to be a static picture;
(2) the 2 nd patrol indicator 3:
the first image sensor 13 exposes the picture of the current indicator 3 and caches the picture in the first storage module;
meanwhile, the second image sensor 23 exposes the picture of the current indicator 3, and caches the picture in the second storage module;
(3) the first information processing module 14 processes the pictures acquired in steps (1) and (2) in the first storage module; each picture retains only the pixel points of the indicator 3 within a range of ±α about the lens's circumferential track; α is preferably 10°;
(4) the first information processing module 14 compares the current step's picture against the previous step's picture as reference, obtaining the coordinate difference, in the dot-matrix diagram, of the indicator 3 in the current picture relative to the indicator 3 in the previous step; from the previous step's coordinate value and this difference it obtains the current coordinate value of the indicator 3 in the dot-matrix diagram; it then calculates the radian absolute value of the current point relative to the optical-axis center of the first optical lens, namely ∠P3′H′H1″, and stores it in the first storage module;
(5) the second information processing module 24 processes the pictures collected in steps (1) and (2) in the second storage module following steps (3) and (4) of S9; it obtains the current coordinate value of the indicator 3 in the dot-matrix diagram and, based on it, the radian absolute value of the current point relative to the optical-axis center of the second optical lens, namely ∠P3″H1″H′, and stores it in the second storage module;
(6) based on the angle values obtained, determining the coordinate value of the current point P3 relative to point P1 by triangulation;
(7) judging the pixel coordinate value of the indicator 3: if it falls within the measurement range of the image pixel area 41 of the imaging surface 4 it is valid and is stored in the storage module; otherwise the data is discarded;
(8) the information processing module sends the valid pixel coordinate value of the indicator 3 to the host;
(9) repeating steps (2) to (8) to track the moving track of the indicator.
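The differencing logic of patrol steps (2) to (4) — locate the indicator's light spot in the current and previous exposures, take the coordinate difference, and add it to the last known dot-matrix coordinate — might be sketched as below. This is an illustrative reduction to a 1-D scan line; `spot_centroid`, the brightness threshold, and the list-of-pixels frame representation are all hypothetical, since the patent does not specify how the spot is extracted from the exposed picture.

```python
def spot_centroid(frame, threshold):
    """Centroid column of pixels brighter than threshold in a 1-D
    scan line; a stand-in for locating the indicator's light spot.
    Returns None when no pixel exceeds the threshold."""
    cols = [i for i, v in enumerate(frame) if v > threshold]
    if not cols:
        return None
    return sum(cols) / len(cols)

def track_step(prev_frame, cur_frame, prev_coord, threshold=128):
    """One patrol step: find the spot in both frames, take the
    coordinate difference, and add it to the previously known
    dot-matrix coordinate (step S9(4))."""
    prev_spot = spot_centroid(prev_frame, threshold)
    cur_spot = spot_centroid(cur_frame, threshold)
    if prev_spot is None or cur_spot is None:
        return prev_coord  # no indicator visible; keep the last coordinate
    return prev_coord + (cur_spot - prev_spot)
```

Running each sensor's frames through `track_step` on every patrol yields the per-sensor coordinate stream that feeds the triangulation of step (6).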
In the invention, regardless of the actual size of the manufactured touch screen or of the optical imaging system, the coordinates can be re-established from the trigger points. They are not limited by the size of the touch screen and are unaffected by the assembly and machining precision of other related components; in actual use they are constrained only by the actual positions of the trigger points. This eliminates the adverse influence of the machining and assembly precision of the touch screen and optical lenses on the coordinate precision of the touch surface, effectively ties the actual coordinate precision of the touch surface to the trigger-point positions, removes the influence of unnecessary dimensional precision, and yields a touch positioning detection mode with higher positioning precision.
Meanwhile, in actual use the optical lenses can be mounted adaptively according to the actual conditions of the touch surface and the like; it is only necessary to ensure that the range positioning unit is installed accurately and with high positional precision. This improves the applicability and fault tolerance of the optical lenses and the convenience of actual use.
This scheme first determines the coordinate of the indicator 3 directly, then compares the current picture with the previous picture to detect the change of the phase-plane coordinate of the indicator 3 in the current picture, and finally obtains the object-plane coordinate of the indicator 3 according to the triangulation positioning method.
Example 2
This embodiment further describes the invention as an optimization on the basis of embodiment 1.
A baffle is arranged at the boundary of the touch surface captured by the optical imaging system. To prevent the image sensor from collecting useless image data and increasing the data processing load, the baffle limits the image information that the image sensor can collect on the imaging surface 4, thereby eliminating useless image data, reducing the data processing load, and improving positioning efficiency.
Example 3
The trigger point can be implemented in the following ways:
1. the touch screen comprises an optical scanning mechanism capable of emitting light that scans the touch surface, and the trigger point is a component capable of reflecting the light emitted by the optical scanning mechanism. The optical scanning mechanism improves the identification accuracy of the indicator, so that touch positioning is not affected by ambient light, and reduces the difficulty of program calculation and judgment.
2. The trigger point is an LED light source. In this case, after the angle initialization of reference point P1 is completed, the LED light source at P1 is turned off through the first or second information processing module; correspondingly, after the angle initialization of reference point P2 is completed, the LED light source at P2 is turned off through the first or second information processing module.
3. The trigger point can also be a finger directly. During touch positioning: S4, initializing the angle of reference point P1: move the finger to P1; the first optical imaging system 1 is started, and the first image sensor 13 exposes the P1-point dynamic information of the touch surface and stores the exposed dynamic picture in the first storage module; the second optical imaging system 2 is started, and the second image sensor 23 exposes the P1-point dynamic information of the touch surface and stores the exposed dynamic picture in the second storage module.
Example 4
This embodiment further describes the invention as an optimization on the basis of embodiment 1.
As shown in fig. 3, a first filter 12 is disposed on an optical path between the first optical lens 11 and the first image sensor 13, and as shown in fig. 4, a second filter 22 is disposed on an optical path between the second optical lens 21 and the second image sensor 23.
The optical filters are arranged at the bottoms of the optical lenses and filter out light waves other than the scanning beam before the light enters the image sensors for photoelectric conversion.
The image sensor is arranged on the phase plane of the optical lens, and the principal plane of the optical lens, the optical filter, and the image sensor plane are all perpendicular to the optical axis. In this example, the image sensor is a CMOS image sensor.
Further, the objective lenses of the first optical lens 11 and the second optical lens 21 are both fish-eye lenses.
The optical lens is a circular fisheye lens formed by combining a plurality of lens elements; its viewing angle is larger than 160°, and the optical imaging system images objects within this viewing-angle range onto the phase plane of the circular fisheye lens. The light pattern on the imaging surface is a circular (or elliptical) image. The optical lens of this example employs a fisheye lens with a viewing angle of 200°.
The invention utilizes the characteristic that a fisheye lens can see objects in its own plane, allowing the lens and the display surface of the display to be installed integrally on one surface. This enlarges the touch positioning surface and makes positioning detection with an optical imaging system feasible for a touch screen, in particular a large touch screen.
Example 5
As shown in fig. 5, the first information processing module includes a logic control and data processing unit, a first rf transceiver unit, a first storage module, a peripheral interface, a synchronous exposure timer, and a clock module, which are all in signal connection with the logic control and data processing unit, the clock module is in signal connection with the synchronous exposure timer, and the first image sensor is in signal connection with the logic control and data processing unit.
The logic control and data processing unit is the control center of the first information processing module 14. The clock module provides a time sequence clock and is connected with the logic control and data processing unit.
The first image sensor 13 is connected to the logic control and data processing unit, which is connected to the synchronous exposure timer.
The clock module provides the timing pulses for the synchronous exposure timer, and the synchronous exposure timer periodically sends out an exposure pulse. Each exposure pulse triggers a synchronous exposure.
And synchronously processing under the synchronization of exposure pulses:
(1) the logic control and data processing unit of the first information processing module controls the first image sensor 13 to convert the light pattern of the phase plane into an electronic dot matrix pattern.
(2) The logic control and data processing unit of the second information processing module controls the second image sensor 23 to convert the light pattern of the phase plane into an electronic dot matrix pattern.
The logic control and data processing unit is connected with the first radio frequency transceiver unit, and the first radio frequency transceiver unit communicates with the second radio frequency transceiver unit of the second information processing module to exchange data.
The logic control and data processing unit is connected with the RAM unit serving as the first storage module; the RAM unit buffers the process data.
The logic control and data processing unit is connected with the peripheral interface, and the peripheral interface is communicated with the host.
As shown in fig. 6, the second information processing module includes a logic control and data processing unit, a second rf transceiver unit, a second storage module, a peripheral interface, a synchronous exposure timer, and a clock module, which are all in signal connection with the logic control and data processing unit, the clock module is in signal connection with the synchronous exposure timer, and the second image sensor is in signal connection with the logic control and data processing unit.
The logic control and data processing unit is the control center of the second information processing module 24. The clock module provides a time sequence clock and is connected with the logic control and data processing unit.
The second image sensor 23 is connected to the logic control and data processing unit, which is connected to the synchronous exposure timer.
The clock module provides the timing pulses for the synchronous exposure timer, and the synchronous exposure timer periodically sends out an exposure pulse. Each exposure pulse triggers a synchronous exposure.
And synchronously processing under the synchronization of exposure pulses:
(1) the logic control and data processing unit of the first information processing module controls the first image sensor 13 to convert the light pattern of the phase plane into an electronic dot matrix pattern.
(2) The logic control and data processing unit of the second information processing module controls the second image sensor 23 to convert the light pattern of the phase plane into an electronic dot matrix pattern.
The logic control and data processing unit is connected with the second radio frequency transceiver unit, and the second radio frequency transceiver unit communicates with the first radio frequency transceiver unit of the first information processing module to exchange data.
The logic control and data processing unit is connected with the RAM unit serving as the second storage module; the RAM unit buffers the process data.
The logic control and data processing unit is connected with the peripheral interface, and the peripheral interface is communicated with the host.
Example 6
This embodiment describes an implementation of the triangulation method of the present invention, i.e. an application of the touch screen in a touch feedback system, specifically as follows:
the principal plane of the objective lens of the first optical lens 11 (principal point H), the principal plane of the objective lens of the second optical lens 21 (principal point H1), and the indicator 3 all lie on the same detection surface, and the detection surface approximates a geometric plane. Therefore, what the first optical imaging system 1 and the second optical imaging system 2 measure for the respective reference points (P1, P2, P3) on this geometric plane are angles at the end points H (H1), as shown in fig. 2. The line segments HP1, HP2, HP3, H1P1, H1P2, H1P3, together with the line segments HH1 and P1P2, form the associated triangles.
The pixel resolution attribute of the image pixel area 41 is known. For convenient calculation, P1 is located at the first pixel of row 1 of the image pixel area 41 and P2 at the last pixel of row 1 of the image pixel area 41, so that in the first optical imaging system 1 and the second optical imaging system 2, HH1 is parallel to P1P2.
In an optical imaging system, the light rays refracted by an object through the objective lens have a point-to-point conjugate relationship with the image on the phase plane; the image on the phase plane is a scaled two-dimensional map of the object positions within the field of view of the objective lens, and the corresponding object-side and image-side angles are equal (similar angles).
Therefore, the image-angle/object-angle correspondence between the first image sensor 13 (shown in fig. 3) mounted on the phase plane of the first optical lens 11 and the second image sensor 23 (shown in fig. 4) mounted on the phase plane of the second optical lens 21 is:
(π − ∠P1′H′H1″) = ∠P1HH1
(π − ∠P3′H′H1″) = ∠P3HH1
......
in the first optical imaging system 1: the first image sensor 13 converts the light image into an electronic dot matrix picture and temporarily stores it in the first storage module, which is the RAM unit shown in fig. 5. The dot matrix diagram is a unit circle with H′ as origin (circle center), and the row/column dot matrix of the unit photosensitive elements of the first image sensor provides the horizontal (x) and vertical (y) coordinate scales of the unit circle. The first information processing module 14 (as shown in fig. 5) compares the current electronic dot matrix picture in the RAM unit with the previous one to find the changed point location, i.e. the coordinate value x′y′ of P1′ (or P2′, or P3′), and calculates the radians of ∠P1′H′H1″ and ∠P3′H′H1″ according to the trigonometric function theorem.
Likewise, in the second optical imaging system 2: the light image on the second image sensor 23 is converted into an electronic dot matrix picture and temporarily stored in the second storage module, which is the RAM unit in fig. 6. The dot matrix diagram is a unit circle with H1″ as origin (circle center), and the row/column dot matrix of the unit photosensitive elements of the image sensor provides the horizontal (x) and vertical (y) coordinate scales of the unit circle. The second information processing module 24 (in fig. 6) compares the current electronic dot matrix picture in the RAM unit with the previous one to find the changed point location, i.e. the coordinate value x″y″ of P1″ (or P2″, or P3″), and calculates the radians of ∠P1″H1″H and ∠P3″H1″H according to the trigonometric function theorem.
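The radian extraction described for both imaging systems can be sketched as follows. This is an illustrative Python sketch, assuming the changed dot-matrix point is given in unit-circle coordinates (x, y) about the centre H′ (or H1″) and that the baseline HH1 direction lies along the +x axis of the matrix — an orientation the patent does not fix:

```python
import math

def image_angle(x, y):
    """Absolute radian of an image point about the dot-matrix centre,
    measured from the +x (baseline) direction of the unit circle."""
    return abs(math.atan2(y, x))

def object_angle(x, y):
    """Object-side angle obtained via the conjugate relation
    (pi - image angle) = object angle, e.g. (pi - angle P1'H'H1'') = angle P1HH1."""
    return math.pi - image_angle(x, y)
```

The `object_angle` helper applies the conjugate relation (π − ∠P1′H′H1″) = ∠P1HH1 stated earlier in this example.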
Initializing the angles of reference points P1 and P2 (the trigger point is illustrated with an indicator):
(1) moving the indicator 3 to P1 and continuously clicking (generating dynamic features) at that fixed position on the display surface of the image pixel area 41, while the first optical imaging system 1 and the second optical imaging system 2 continuously expose pictures; the first information processing circuit 14 and the second information processing circuit 24 compare the pictures, calculate the angle values of the moving indicator 3 at the P1 point position relative to the first optical imaging system 1 and the second optical imaging system 2, and store them in the RAM units;
(2) moving the indicator 3 to P2 and continuously clicking (generating dynamic features) at that fixed position on the display surface of the image pixel area 41, while the first optical imaging system 1 and the second optical imaging system 2 continuously expose pictures; the first information processing circuit 14 and the second information processing circuit 24 compare the pictures, calculate the angle values of the moving indicator 3 at the P2 point position relative to the first optical imaging system 1 and the second optical imaging system 2, and store them in the RAM units.
After initialization is completed, the first optical imaging system 1 and the second optical imaging system 2 continuously expose and patrol the moving indicator 3. The first information processing circuit 14 and the second information processing circuit 24 compare the pictures, calculate the angle values of the moving indicator 3 at the P3 point position relative to the first optical imaging system 1 and the second optical imaging system 2, and store them in the RAM units.
From the detected and calculated corresponding angles ∠P1HH1, ∠P3HH1, ∠P2H1H, ∠P3H1H and the known P1P2 range, the coordinate value of the P3 point position of the indicator 3 relative to the P1 point position is calculated based on triangulation, i.e. the trigonometric function theorem.
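The triangulation step itself — two angles measured at the end points of the known baseline HH1 (parallel to P1P2) locating the touch point — can be sketched as below. This Python sketch assumes a frame with H at the origin and H1 on the +x axis; the patent leaves the frame unspecified, so in practice the result would still be translated to be relative to the P1 point:

```python
import math

def triangulate(angle_at_h, angle_at_h1, baseline):
    """Locate P3 from angle P3HH1 (at the first lens), angle P3H1H
    (at the second lens), and the baseline length |HH1|.

    The foot of P3 splits the baseline so that
    y = x * tan(angle_at_h) = (baseline - x) * tan(angle_at_h1);
    solving for x gives the intersection of the two sight lines.
    """
    ta, tb = math.tan(angle_at_h), math.tan(angle_at_h1)
    x = baseline * tb / (ta + tb)   # distance along the baseline from H
    y = x * ta                      # perpendicular distance from the baseline
    return x, y
```

For instance, equal 45° angles at both ends of a baseline of length 2 place the touch point at (1, 1), midway along and one unit above the baseline.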
The first optical imaging system 1 and the second optical imaging system 2 continuously expose the moving indicator 3. The first information processing circuit 14 and the second information processing circuit 24 compare the current picture with the previous picture, obtain the coordinate difference of the current indicator 3 relative to the previous indicator 3, obtain the actual position of the current indicator 3 from this difference, and thus obtain the coordinate value of the current indicator relative to the P1 point position.
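The picture-to-picture comparison just described can be sketched as a set difference over lit pixels. A hedged Python sketch — the patent compares full dot-matrix bitmaps, while here each picture is reduced to a set of lit coordinates purely to keep the example short:

```python
def moved_point(picture_a, picture_b):
    """Return the new coordinate of the indicator. Among the pixels whose
    value changed between the two pictures, the one lit in picture b is
    taken as the current position (the pixel lit only in picture a is
    where the indicator left)."""
    changed = picture_a ^ picture_b   # symmetric difference: changed pixels
    current = changed & picture_b     # changed AND lit now = new position
    return next(iter(current), None)  # None when nothing moved
```

For example, with the previous picture lit at (7, 5) and the current one at (10, 6), both points register as changed and (10, 6), the one lit in the current picture, is returned as the indicator's new position.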
Example 7
As shown in fig. 7, in the optical imaging system of the present invention the optical lens is a fisheye lens with a viewing angle of 200°; the optical lens and the plane of the imaging surface 4 lie on the same detection plane, and the light projected by the indicator 3 moving in the plane of the imaging surface 4 forms imaging points that trace the moving direction on the circumference of each optical lens. The position of the moving indicator 3 is therefore a point of a line segment on the unit circle.
The fisheye lens images a circular picture on the phase plane. When the logic control and data processing unit reads the dot matrix image, the pixels imaged by objects in front of the lens are filtered out, and only the image points within ±10° of the lens's circumferential track are retained; the coordinates of the image point of the moving indicator 3 can then be patrolled in the picture. Filtering out the pixels imaged by objects in front of the lens and retaining only the image points within ±10° of the circumferential track effectively reduces the data volume and improves analysis efficiency.
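The ±10° circumferential-track filtering described above can be sketched as an angular mask on the polar radius of each dot-matrix point. This is a hedged sketch assuming an equidistant fisheye mapping (radius proportional to field angle), which the patent does not specify; it keeps only points whose field angle falls within 90° ± 10°, i.e. near the circle on which the in-plane touch surface images:

```python
import math

def keep_circumferential(points, r_max, half_angle_deg=100.0, band_deg=10.0):
    """Filter dot-matrix points, keeping only those whose field angle lies
    within +/- band_deg of 90 deg (the detection plane), under an assumed
    equidistant fisheye mapping r = r_max * theta / half_angle_deg for a
    lens with a 200 deg viewing angle (half angle 100 deg)."""
    kept = []
    for x, y in points:
        r = math.hypot(x, y)                 # radius about the image centre
        theta = half_angle_deg * r / r_max   # field angle in degrees
        if abs(theta - 90.0) <= band_deg:    # within 90 deg +/- 10 deg
            kept.append((x, y))
    return kept
```

With `r_max = 100`, a point at radius 90 (field angle 90°) is kept, while the image centre and any point closer than radius 80 are dropped as "front of the lens" pixels.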
Example 8
In the present invention, preferably, the storage module that stores the pictures of the patrolled indicator 3 in each optical imaging system is independent of the other storage modules and can store only two pictures. Step S9 is then specifically as follows:
S9, the optical imaging system 1 and the optical imaging system 2 start to patrol the moving coordinates of the indicator 3 on the touch surface:
(1) the 1st patrol of the indicator 3:
the first image sensor 13 exposes a picture of the indicator 3, obtains the coordinate value of the indicator 3 in the dot matrix diagram, and caches it in the storage module of the indicator 3; the picture is marked as picture a;
meanwhile, the second image sensor 23 exposes a picture of the indicator 3, obtains the coordinate value of the indicator 3 in the dot matrix diagram, and caches it in the storage module of the indicator 3; the picture is marked as picture a;
determining the coordinates of the indicator 3 relative to the P1 point according to triangulation;
(2) the 2nd patrol of the indicator 3:
the first image sensor 13 exposes a picture of the current indicator 3 and caches it in the storage module of the indicator 3; the picture is marked as picture b;
meanwhile, the second image sensor 23 exposes a picture of the current indicator 3 and caches it in the storage module of the indicator 3; the picture is marked as picture b;
(3) the first information processing module 14 processes pictures a and b in the first storage module; in each picture, only the pixel points containing the coordinate of the indicator 3 within the ±α angle range around the objective principal point H are retained; α is preferably 10°;
(4) the first information processing module 14 compares picture b against picture a as reference, obtains the coordinate difference in the dot matrix diagram of the indicator 3 in picture b relative to the indicator 3 in picture a, and obtains the coordinate value of the indicator 3 in the dot matrix diagram of picture b from the coordinate value of the indicator 3 in picture a and the difference; based on this coordinate value it calculates the absolute radian value of the current indicator-3 point position relative to the optical axis center of the first optical lens, namely ∠P3′H′H1″, and stores it in the first storage module;
specifically, as shown in fig. 8 and 9: in picture a, the coordinates of the indicator 3 are x=07, y=05; in picture b, the coordinate position of the indicator 3 is x=10, y=06. When the patrol detects pixel-value changes between picture b and picture a at the coordinate points x=07, y=05 and x=10, y=06, it is taken by default that the indicator 3 has moved from coordinate point x=07, y=05 to coordinate point x=10, y=06 of the display screen, and the coordinate point x=10, y=06 is taken as the 1st coordinate point of the movement of the indicator 3.
The logic control and data processing unit calculates the absolute radian value of the point relative to the optical axis center according to the trigonometric function; if the angle is ∠P3′H′H1″, then ∠P3′H′H1″ is stored in the storage module for subsequent calculations.
(5) The second information processing module 24 processes the pictures acquired in steps (1) and (2) in the second storage module according to steps (3) and (4) of step S9. Specifically, the logic control and data processing unit takes picture a as the reference picture by default, compares pictures a and b, and patrols the coordinate points of picture b whose pixels have changed relative to picture a. When pixel-value changes are found at the coordinate points x=07, y=05 and x=10, y=06, it is taken by default that the indicator 3 has moved from coordinate point x=07, y=05 to coordinate point x=10, y=06 of the display screen, and the coordinate point x=10, y=06 is taken as the 1st coordinate point of the movement of the indicator 3;
the second information processing module 24 obtains the coordinate value of the current indicator 3 in the dot matrix diagram, obtains from this coordinate value the absolute radian value of the current point position relative to the optical axis center of the second optical lens, namely ∠P3″H1″H, and stores it in the second storage module;
(6) obtaining the object-plane angles from the obtained angle values: the logic control and data processing unit of the first information processing module reads the coordinate values of P1, P2, P3 in the dot matrix picture, converts them into the radians of ∠P1HH1, ∠P2HH1, ∠P3HH1, and writes them to the RAM unit;
the second logic control and data processing unit reads the coordinate values of P1, P2, P3 in the dot matrix picture, converts them into the radians of ∠P1H1H, ∠P2H1H, ∠P3H1H, and writes them to the RAM unit; the logic control and data processing unit of the second information processing module transmits the values of ∠P1H1H, ∠P2H1H, ∠P3H1H to the logic control and data processing unit of the first information processing module. The logic control and data processing unit of the first information processing module then determines the coordinate value of the current P3 point relative to the P1 point based on triangulation;
When subsequent image information of the indicator 3 is shot, the current picture b is stored as picture a in the storage space that originally held picture a, overwriting the original picture a; the new picture is then stored as picture b in the storage space of picture b, and the coordinate calculation of the new indicator-3 position proceeds by comparison as before.
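The two-picture storage scheme above behaves like a two-slot buffer: each new exposure shifts picture b into picture a's slot and takes the b slot itself, so comparisons are always a-versus-b. A minimal Python sketch (class and method names are illustrative, not from the patent):

```python
class TwoPictureBuffer:
    """Storage module that holds only two pictures: 'a' (the reference)
    and 'b' (the current exposure)."""

    def __init__(self, first, second):
        self.a, self.b = first, second

    def push(self, new_picture):
        self.a = self.b          # current picture b overwrites the old a
        self.b = new_picture     # the fresh exposure becomes picture b

    def pair(self):
        return self.a, self.b    # reference picture and current picture
```

Each `push` discards the oldest picture, so the module never stores more than the two pictures the patent allows.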
The above description is intended to be illustrative of the preferred embodiment of the present invention and should not be taken as limiting the invention, but rather, the intention is to cover all modifications, equivalents, and alternatives falling within the spirit and scope of the invention.

Claims (7)

1. A photoelectric touch positioning method, characterized by comprising the following steps:
S1, respectively installing a first optical imaging system (1) and a second optical imaging system (2) at the two ends of one edge of an imaging surface, the end points of the two ends of the edge being respectively the P1 point and the P2 point;
the first optical imaging system (1) comprises a first information processing module (14) which is arranged at the frame of the imaging surface (4) and is provided with a first storage module, a first optical lens (11) for collecting image information of a touch surface of the imaging surface (4), and a first image sensor (13) for receiving the collected information from the first optical lens (11) and sending the information to the first information processing module (14);
the second optical imaging system (2) comprises a second information processing module (24) with a second storage module, a second optical lens (21) for collecting image information of a touch surface of the imaging surface (4) and a second image sensor (23) for receiving the collected information from the second optical lens (21) and sending the information to the second information processing module (24), wherein the second information processing module (24) is arranged at the frame of the imaging surface (4);
the first information processing module (14) and the second information processing module (24) communicate with each other; the principal points of the objective lenses of the first optical lens (11) and the second optical lens (21) are respectively the H point and the H1 point; the point at which the objective principal point H corresponds to the phase plane in the dot matrix of the first image sensor (13) coincides with the central point H′ of the dot matrix, and the point at which the objective principal point H1 corresponds to the phase plane in the dot matrix of the second image sensor (23) coincides with the central point H1″ of the dot matrix; H coincides with H′, and the H1 point coincides with H1″;
S2, initializing the first optical imaging system (1) and the second optical imaging system (2); the first information processing module (14) and the second information processing module (24) acquire P1P2 and store it in the storage modules;
s3, exposure initialization:
starting a first optical imaging system (1), exposing static information of a touch surface by a first image sensor (13), and storing an exposed static picture in a first storage module;
starting a second optical imaging system (2), exposing static information of one touch surface by a second image sensor (23), and storing the exposed static pictures into a second storage module;
S4, initializing the angle of reference point P1: setting at P1 a trigger point that can be collected by the image sensors;
starting the first optical imaging system (1); the first image sensor (13) exposes the P1-point dynamic information of a touch surface and stores the exposed dynamic picture in the first storage module;
starting the second optical imaging system (2); the second image sensor (23) exposes the P1-point dynamic information of a touch surface and stores the exposed dynamic picture in the second storage module;
S5, in the first optical imaging system (1), the first information processing module (14) acquires the coordinates, in the dot matrix of the first image sensor (13), of the point P1′ at which the trigger point P1 corresponds to the light spot on the phase plane of the first image sensor (13); the first information processing module (14) calculates the absolute radian value of ∠P1′H′H1″ according to the trigonometric function and stores it in the first storage module;
S6, in the second optical imaging system (2), the second information processing module (24) acquires the coordinates, in the dot matrix of the second image sensor (23), of the point P1″ at which the trigger point corresponds to the light spot on the phase plane of the second image sensor (23); the second information processing module (24) calculates the absolute radian value of ∠P1″H1″H according to the trigonometric function and stores it in the second storage module;
S7, determining the coordinates of the P1 point relative to the objective principal point H of the first optical lens (11) according to triangulation, based on the angle values in S5 and S6;
S8, initializing the angle of reference point P2: according to steps S4 to S7, setting at P2 a trigger point that can be collected by the first image sensor (13) and the second image sensor (23), photographing and exposing; the first optical imaging system (1) calculates and stores the absolute radian value of ∠P2′H′H1″, and the second optical imaging system (2) calculates and stores the absolute radian value of ∠P2″H1″H; based on these two angle values, the coordinates of the P2 point relative to the P1 point are determined according to triangulation;
S9, the optical imaging system (1) and the optical imaging system (2) start to patrol the moving coordinates of the indicator (3) on the touch surface:
(1) the 1st patrol of the indicator (3):
the first image sensor (13) exposes a picture of the indicator (3), obtains the coordinate value of the indicator (3) in the dot matrix diagram, and caches it in the first storage module;
meanwhile, the second image sensor (23) exposes a picture of the indicator (3), obtains the coordinate value of the indicator (3) in the dot matrix diagram, and caches it in the second storage module;
determining the coordinates of the indicator (3) relative to the P1 point according to triangulation;
(2) the 2nd patrol of the indicator (3):
the first image sensor (13) exposes a picture of the current indicator (3) and caches it in the first storage module;
meanwhile, the second image sensor (23) exposes a picture of the current indicator (3) and caches it in the second storage module;
(3) the first information processing module (14) processes the pictures acquired in steps (1) and (2) in the first storage module; in each picture, only the pixel points of the indicator (3) within the ±α angle range of the lens circumferential track are retained;
(4) the first information processing module (14) compares the current picture against the previous picture as reference, obtains the coordinate difference of the indicator (3) in the current picture relative to the indicator (3) of the previous patrol, and obtains the coordinate value of the current indicator (3) in the dot matrix from the previous coordinate value and the difference; it then calculates the absolute radian value of the current point position relative to the optical axis center of the first optical lens, namely ∠P3′H′H1″, and stores it in the first storage module;
(5) the second information processing module (24) processes the pictures collected in steps (1) and (2) in the second storage module according to steps (3) and (4) of step S9, obtains through the second information processing module (24) the coordinates of the current indicator (3) in the dot matrix diagram, obtains from these coordinates the absolute radian value of the current point position relative to the optical axis center of the second optical lens, namely ∠P3″H1″H, and stores it in the second storage module;
(6) determining, based on the obtained angle values and triangulation, the coordinate value of the current P3 point relative to the P1 point;
(7) judging the pixel coordinate value of the indicator (3): if it lies within the numerical range of the image pixel area (41) of the imaging surface (4) it is valid and is stored in the storage module; otherwise the data is discarded;
(8) the information processing module sends the valid pixel coordinate value of the indicator (3) to the host;
(9) repeating steps (2) to (8) to track the moving track of the indicator (3).
2. The photoelectric touch positioning method according to claim 1, characterized in that: the touch screen further comprises an optical scanning mechanism capable of emitting light that scans the touch surface, and the trigger point is a component capable of reflecting the light emitted by the optical scanning mechanism.
3. The photoelectric touch positioning method according to claim 1, characterized in that: the trigger point is an LED light source.
4. The photoelectric touch positioning method according to claim 1, characterized in that: a first filter (12) is disposed on the optical path between the first optical lens (11) and the first image sensor (13), and a second filter (22) is disposed on the optical path between the second optical lens (21) and the second image sensor (23).
5. The photoelectric touch positioning method according to any one of claims 1 to 4, characterized in that: the objective lenses of the first optical lens (11) and the second optical lens (21) are both fisheye lenses.
6. A touch screen comprising an imaging surface (4) and a touch positioning detection device for the imaging surface (4), characterized in that: the touch positioning detection device performs positioning detection according to the photoelectric touch positioning method of claim 1.
7. Use of a touch screen according to claim 6 in a touch feedback system.
CN202110295833.5A 2021-03-19 2021-03-19 Photoelectric touch positioning method, touch screen and application thereof Active CN112905064B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110295833.5A CN112905064B (en) 2021-03-19 2021-03-19 Photoelectric touch positioning method, touch screen and application thereof

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110295833.5A CN112905064B (en) 2021-03-19 2021-03-19 Photoelectric touch positioning method, touch screen and application thereof

Publications (2)

Publication Number Publication Date
CN112905064A CN112905064A (en) 2021-06-04
CN112905064B true CN112905064B (en) 2022-10-11

Family

ID=76106691

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110295833.5A Active CN112905064B (en) 2021-03-19 2021-03-19 Photoelectric touch positioning method, touch screen and application thereof

Country Status (1)

Country Link
CN (1) CN112905064B (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1635541A (en) * 2003-12-26 2005-07-06 北京汇冠新技术有限公司 Photoelectric detection positioning system and method for computer touch screen
CN101794184A (en) * 2010-04-07 2010-08-04 广东威创视讯科技股份有限公司 Coordinate detection device and locating method thereof
CN101847060A (en) * 2009-03-27 2010-09-29 联想(北京)有限公司 Optical touch system and optical touch positioning method
CN101937292A (en) * 2009-06-30 2011-01-05 原相科技股份有限公司 Object detection calibration system of an optical touch screen and method thereof
CN103186295A (en) * 2013-04-01 2013-07-03 广东威创视讯科技股份有限公司 Touch screen positioning device and touch point calculating method
CN109976590A (en) * 2017-12-27 2019-07-05 上海品奇数码科技有限公司 A kind of touch control detecting method based on camera

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101697133B1 (en) * 2009-09-30 2017-02-01 베이징 아이어터치 시스템 코퍼레이션 리미티드 Touch screen, touch system and method for positioning a touch object in a touch system
TWI496059B (en) * 2013-11-27 2015-08-11 Wistron Corp Touch locating method and optical touch system
TWI612445B (en) * 2015-09-21 2018-01-21 緯創資通股份有限公司 Optical touch apparatus and a method for determining a touch position

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Research on Touch Technology Based on Binocular Vision; Zhu Lijin et al.; Microcomputer & Its Applications; 2011-01-10 (No. 01); pp. 26-31 *
Research on a Positioning Method for Computer Virtual Super-Large Touch Screens; Zhou Jiren et al.; Computer Knowledge and Technology; 2011-07-31; pp. 6937-6940 *

Similar Documents

Publication Publication Date Title
US10996050B2 (en) Apparatus and method for measuring a three dimensional shape
CN106643699B (en) Space positioning device and positioning method in virtual reality system
CN101825431B (en) Reference image techniques for three-dimensional sensing
US7417717B2 (en) System and method for improving lidar data fidelity using pixel-aligned lidar/electro-optic data
Prasad et al. First steps in enhancing 3D vision technique using 2D/3D sensors
CN101672620B (en) Electronic device and method for measuring size of object
EP3709270A1 (en) Registration of individual 3d frames
CN110612428B (en) Three-dimensional measurement method using characteristic quantity and apparatus therefor
WO2014106303A1 (en) Panoramic lens calibration for panoramic image and/or video capture apparatus
US12026916B2 (en) Method and apparatus for in-field stereo calibration
CN103206926A (en) Panorama three-dimensional laser scanner
US20240087167A1 (en) Compensation of three-dimensional measuring instrument having an autofocus camera
JP2000065532A (en) Multi-eye image sensor
EP4386677A2 (en) Compensation of three-dimensional measuring instrument having an autofocus camera
CN112905064B (en) Photoelectric touch positioning method, touch screen and application thereof
CN115830131A (en) Method, device and equipment for determining fixed phase deviation
JP5445064B2 (en) Image processing apparatus and image processing program
CN112882613A (en) Positioning photoelectric detector, detection system, touch screen, detection method and application
CN115552585B (en) System, method, and object for wafer alignment
CN214704601U (en) Touch positioning photoelectric detection system and touch screen thereof
CN214704602U (en) Positioning photoelectric detector, detection system and touch screen thereof
CN104567812A (en) Method and device for measuring spatial position
CN107888805A (en) A kind of mobile phone camera is taken pictures tracks of device and method
CN106254741B (en) A kind of more hundred million pixel remote sensing cameras of large visual field high resolution
CN112882614A (en) Touch positioning photoelectric detection system, touch screen and detection method thereof

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant