CN112947801A - Large-screen touch identification method, system and terminal based on radar and android system

Large-screen touch identification method, system and terminal based on radar and android system

Info

Publication number: CN112947801A
Application number: CN202110377876.8A
Authority: CN (China)
Other languages: Chinese (zh)
Inventor: 李子超 (Li Zichao)
Current Assignee: Hummer Intelligent Technology Tianjin Co ltd
Original Assignee: Hummer Intelligent Technology Tianjin Co ltd
Priority and filing date: 2021-04-08
Publication date: 2021-06-11
Prior art keywords: radar, touch identification, screen touch, android system, android
Legal status: Pending (the legal status is an assumption and is not a legal conclusion)

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03: Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/041: Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F 3/042: Digitisers characterised by the transducing means by opto-electronic means
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 7/00: Details of systems according to groups G01S 13/00, G01S 15/00, G01S 17/00
    • G01S 7/48: Details of systems according to group G01S 17/00
    • G01S 7/481: Constructional features, e.g. arrangements of optical elements
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 7/00: Details of systems according to groups G01S 13/00, G01S 15/00, G01S 17/00
    • G01S 7/48: Details of systems according to group G01S 17/00
    • G01S 7/497: Means for monitoring or calibrating
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03: Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/041: Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F 3/0416: Control or interface arrangements specially adapted for digitisers
    • G06F 3/0418: Control or interface arrangements specially adapted for digitisers for error correction or compensation, e.g. based on parallax, calibration or alignment

Abstract

The invention belongs to the technical field of touch identification and discloses a large-screen touch identification method, system and terminal based on radar and the android system. The method comprises: collecting point cloud data of an occluding object with a radar sensor and performing point cloud integration on the occluding-object information; converting the integrated position information into position coordinates on the display device; and performing touch identification through android-layer event injection. The invention provides an intelligent interactive terminal based on the android architecture that is simple to install, adapts to screens of different sizes, and is easy to calibrate. It avoids occlusion, is usable on oversized screens, is inexpensive, and is miniaturized. The method simplifies operation, requires no manual measurement, can simulate all operations of the android system, and improves measurement accuracy through the mean/median algorithm and the calibration step.

Description

Large-screen touch identification method, system and terminal based on radar and android system
Technical Field
The invention belongs to the technical field of touch identification, and particularly relates to a large-screen touch identification method, system and terminal based on radar and the android system.
Background
At present, interaction with a large screen mainly adopts the following technical schemes:
(1) Touch screens (resistive/capacitive): a touch screen enables accurate touch control, but an oversized screen must be custom-made, and the manufacturing cost is very high.
(2) Grating (infrared light-curtain) technology: grating is limited by the power of the laser transmitter; on an oversized screen the power loss and the recognition loss are severe, and because the power is limited while the infrared emitting area is large, the acquired signal strength is insufficient, interference arises easily, and recognition ends up inaccurate.
(3) Infrared-camera motion capture: this is cumbersome to install or requires a specific projection device; moreover, the user may occlude the camera during use, which makes recognition inaccurate.
(4) Laser radar (lidar) scanning: this technology is currently built mainly on the Windows technical architecture, so the products cannot be miniaturized and are expensive. It requires the radar scanning plane to be parallel to the wall surface, which is very troublesome to adjust, and calibration requires entering various measured values, which is inconvenient and introduces large accuracy deviations.
Through the above analysis, the problems and defects of the prior art are as follows: existing touch identification methods require customized touch screens at high manufacturing cost, suffer from insufficient signal strength, are prone to interference and inaccurate recognition, and are difficult to adjust.
The difficulty in solving the above problems and defects is: the collected point cloud data must pass through a filtering algorithm and be repeatedly verified against actual measurements, and the calibration algorithm must solve a trigonometric equation system in five unknowns, which requires a special transformation to simplify.
The significance of solving the above problems and defects is: measuring the screen length and width during calibration is avoided, and the filtering algorithm avoids systematic interference from the environment.
Disclosure of Invention
Aiming at the problems in the prior art, the invention provides a large-screen touch identification method, a large-screen touch identification system and a large-screen touch identification terminal based on a radar and android system.
The invention is realized in this way: a large-screen touch identification method based on radar and the android system comprises the following steps:
step one, acquiring point cloud data of an occluding object with a radar sensor, and performing point cloud integration on the occluding-object information;
step two, converting the integrated position information into position coordinates on the display device, and performing touch identification through android-layer event injection.
Further, the large-screen touch identification method based on radar and the android system further comprises: performing radar calibration.
Further, the radar calibration includes:
(1) acquiring background points containing angle and distance data, and discarding collected points whose distance exceeds the background distance minus 30 mm;
(2) filtering the collected data: removing points that are angularly discontinuous with their neighbours, then sorting by angle; grouping points whose angles are consecutive and whose distance differences are below a threshold; for each group, taking the mean of the angles and the median of the distances;
(3) touching calibration positions 1, 2, 3 and 4 in sequence to obtain their point data; calculating the radar position offset, the angle offset, the x-direction conversion coefficient and the y-direction conversion coefficient respectively, and calibrating accordingly.
Further, the radar position offset, the angle offset, the x-direction conversion coefficient and the y-direction conversion coefficient are calculated according to the following formulas.
The following 8 equations are solved simultaneously. For each calibration position i (i = 1, 2, 3, 4), with known screen coordinates (xi, yi) and measured distance di and angle θi:

xi = x0 - di*sin(θi + θ0)*p

yi = y0 + di*cos(θi + θ0)*t

[equation images: closed-form expressions for the paired solutions θ0A/θ0B, x0A/x0B, y0A/y0B, pA/pB and tA/tB]

θ0 = (θ0A + θ0B)/2

x0 = (x0A + x0B)/2

y0 = (y0A + y0B)/2

p = (pA + pB)/2

t = (tA + tB)/2
After solving, the parameters are used for actual touch mapping:

xs = x0 - ds*sin(θs + θ0)*p

ys = y0 + ds*cos(θs + θ0)*t

wherein:

x0: sensor offset in the x direction, in pixels;
y0: sensor offset in the y direction, in pixels;
θ0: sensor angle offset;
p: distance-to-pixel mapping coefficient in the x direction;
t: distance-to-pixel mapping coefficient in the y direction;
d1~d4: measured distances of the four calibration positions;
θ1~θ4: measured angles of the four calibration positions;
xs: actual wall-surface x coordinate at the touch, in pixels;
ys: actual wall-surface y coordinate at the touch, in pixels;
ds: distance actually measured during the touch;
θs: angle actually measured during the touch.
Further, acquiring the background points containing angle and distance data comprises:
during the first 10 seconds, while no object touches the wall, taking for each angle the minimum of the collected distances as the background point.
Further, in step two, performing touch identification through android-layer event injection comprises:
simulating the acquired points as touch points with the android event injection method, thereby simulating and identifying large-screen touches.
Another object of the present invention is to provide a large-screen touch identification system based on radar and the android system that implements the above method, the system comprising:
the point cloud data acquisition module, used for acquiring point cloud information of the occluding object with the calibrated laser radar sensor;
the analysis and mapping module, used for analysing the data collected by the radar sensor and mapping them to screen coordinates;
and the identification module, used for completing touch identification by simulating large-screen touches through android-layer event injection, as sketched below.
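For illustration only, a minimal Kotlin skeleton of these three modules might look as follows; all type and function names are hypothetical (the patent publishes no code), and the radar-facing calls are left abstract:

```kotlin
import kotlin.math.cos
import kotlin.math.sin

// One radar return: angle (radians) and distance (millimetres).
data class RadarPoint(val angle: Double, val distance: Double)

// Calibration parameters: position offsets x0/y0 (pixels), angle offset
// theta0, and the x/y distance-to-pixel conversion coefficients p and t.
data class Calibration(
    val x0: Double, val y0: Double, val theta0: Double,
    val p: Double, val t: Double
)

interface PointCloudAcquisitionModule {
    fun acquire(): List<RadarPoint>                          // raw returns from the lidar
}

interface AnalysisMappingModule {
    fun integrate(raw: List<RadarPoint>): List<RadarPoint>   // point cloud integration
    // Mapping from a radar point to screen pixels, per the formulas above.
    fun toScreen(pt: RadarPoint, cal: Calibration): Pair<Float, Float> =
        Pair((cal.x0 - pt.distance * sin(pt.angle + cal.theta0) * cal.p).toFloat(),
             (cal.y0 + pt.distance * cos(pt.angle + cal.theta0) * cal.t).toFloat())
}

interface RecognitionModule {
    fun injectTouch(x: Float, y: Float)                      // android-layer event injection
}
```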
Another object of the present invention is to provide an information data processing terminal, which includes a memory and a processor, wherein the memory stores a computer program, and when the computer program is executed by the processor, the processor executes the radar and android system-based large-screen touch identification method.
Another object of the present invention is to provide a computer device, which includes a memory and a processor, wherein the memory stores a computer program, and the computer program, when executed by the processor, causes the processor to execute the radar and android system-based large-screen touch recognition method.
The invention also aims to provide a human-computer interaction wall curtain, which executes the large-screen touch identification method based on the radar and android system.
Combining all the above technical schemes, the advantages and positive effects of the invention are: the invention provides an intelligent interactive terminal based on the android architecture that is simple to install, adapts to screens of different sizes, and is easy to calibrate; it avoids occlusion, is usable on oversized screens, is inexpensive, and is miniaturized.
The invention discloses a method for solving the mapping relation between raw radar data and screen coordinates. It can be operated simply, without any measurement. Through the event injection method, a large screen can simulate all operations of the android system (click, slide, long press, drag). Multiple points can be simulated, and operations can be performed on a large screen (10 m by 5 m).
The invention uses a radar sensor to collect point cloud data of an occluding object (a hand or a stylus) and integrates the occluding-object information into a point cloud algorithmically. The integrated position information is then converted into position coordinates on the display device, and touch events are replaced through android-layer event injection, finally achieving interaction between the person and the wall surface. The whole device is simple in structure, low in cost and high in recognition precision.
The method simplifies operation, requires no measurement, can simulate all operations of the android system, and improves measurement accuracy through the mean/median algorithm and the calibration step.
Comparative technical and experimental effects are as follows:
in testing, touch operation was completed on a 3.5 m by 6 m wall surface with a precision of 5 cm and a latency below 100 ms.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings needed in the embodiments are briefly described below. Obviously, the drawings described below are only some embodiments of the present application, and those skilled in the art can derive other drawings from them without creative effort.
Fig. 1 is a flowchart of a large-screen touch identification method based on a radar and android system according to an embodiment of the present invention.
Fig. 2 is a schematic diagram of a radar calibration method according to an embodiment of the present invention.
Fig. 3 is a schematic diagram of radar calibration according to an embodiment of the present invention.
Fig. 4 is a schematic structural diagram of a large-screen touch recognition system based on a radar and an android system provided in an embodiment of the present invention;
in the figure: 1. a point cloud data acquisition module; 2. analyzing the mapping module; 3. and identifying the module.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is further described in detail with reference to the following embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
Aiming at the problems in the prior art, the invention provides a large-screen touch identification method, a large-screen touch identification system and a large-screen touch identification terminal based on a radar and android system, and the invention is described in detail with reference to the attached drawings.
As shown in fig. 1, the idea of the large-screen touch identification method based on radar and the android system provided by the embodiment of the invention is to reduce manual measurement and number entry, avoid calibration errors introduced by manual measurement error, reduce training cost, and provide a simple and rapid calibration and usage scheme. The method comprises the following steps:
s101, collecting point cloud data of a shelter by using a radar sensor, and performing point cloud integration on information of the shelter;
s102, converting the integrated position information into position coordinates in the display equipment; and performing touch identification in an android layer event injection mode.
The large-screen touch identification method based on radar and the android system provided by the embodiment of the invention further comprises: performing radar calibration.
As shown in fig. 2 to 3, the radar calibration provided by the embodiment of the present invention includes:
(1) acquiring background points containing angle and distance data, and discarding collected points whose distance exceeds the background distance minus 30 mm;
(2) filtering the collected data: removing points that are angularly discontinuous with their neighbours, then sorting by angle; grouping points whose angles are consecutive and whose distance differences are below a threshold; for each group, taking the mean of the angles and the median of the distances;
(3) touching calibration positions 1, 2, 3 and 4 in sequence to obtain their point data; calculating the radar position offset, the angle offset, the x-direction conversion coefficient and the y-direction conversion coefficient respectively, and calibrating accordingly.
Acquiring the background points containing angle and distance data comprises:
during the first 10 seconds, while no object touches the wall, taking for each angle the minimum of the collected distances as the background point.
Performing touch identification through android-layer event injection, as provided by the embodiment of the invention, comprises:
simulating the acquired points as touch points with the android event injection method, thereby simulating and identifying large-screen touches.
As shown in fig. 4, the large-screen touch recognition system based on the radar and android system includes:
the point cloud data acquisition module 1, used for acquiring point cloud information of the occluding object with the calibrated laser radar sensor;
the analysis and mapping module 2, used for analysing the data collected by the radar sensor and mapping them to screen coordinates;
and the identification module 3, used for completing touch identification by simulating large-screen touches through android-layer event injection.
The technical effects of the present invention will be further described with reference to specific embodiments.
Example:
the invention comprises the following basic components:
1. a laser radar sensor, used for acquiring point cloud information of the occluding object;
2. an upper computer, used for analysing the data collected by the radar sensor and mapping them to screen coordinates.
The implementation method comprises the following steps:
1. Radar calibration. Because the mapping between the collected raw data and the final screen coordinates differs with screen size, radar calibration must be performed first.
1.1 Radar calibration schematic
1.2 Radar calibration procedure
1.3 calibration method
a. Background points are acquired; each collected point has the form of an angle and a distance. During the first 10 seconds, while no object touches the wall, the minimum distance collected for each angle is taken as the background point. Afterwards, any collected point whose distance exceeds (background-point value - 30 mm) is regarded as an invalid point and discarded.
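A minimal Kotlin sketch of this step (names are hypothetical; the 10-second learning window and the 30 mm margin come from the text, while the 0.5-degree angle bin is an assumption):

```kotlin
import kotlin.math.roundToInt

// Background model: for each quantised angle, the minimum distance seen during
// the first 10 seconds while nothing touches the wall.
class BackgroundModel(private val angleBinDeg: Double = 0.5) {  // assumed bin width
    private val minDistanceMm = HashMap<Int, Double>()

    private fun bin(angleDeg: Double) = (angleDeg / angleBinDeg).roundToInt()

    // Call for every point collected during the 10-second learning phase.
    fun learn(angleDeg: Double, distanceMm: Double) {
        val k = bin(angleDeg)
        val old = minDistanceMm[k]
        if (old == null || distanceMm < old) minDistanceMm[k] = distanceMm
    }

    // Valid points must be at least 30 mm closer than the background; anything
    // at (background - 30 mm) or farther is the wall itself and is discarded.
    fun isValid(angleDeg: Double, distanceMm: Double): Boolean {
        val bg = minDistanceMm[bin(angleDeg)] ?: return false
        return distanceMm <= bg - 30.0
    }
}
```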
b. Point cloud integration. The collected data are filtered: points that are angularly discontinuous with their neighbours are removed, the remainder are sorted by angle, points whose angles are consecutive and whose distance differences are below a threshold are grouped, and each group is reduced to the mean of its angles and the median of its distances.
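In code, one plausible reading of this integration step (the continuity thresholds are assumptions; RadarPoint is re-declared here so the snippet stands alone):

```kotlin
import kotlin.math.abs

data class RadarPoint(val angle: Double, val distance: Double)

// Collapse angle-sorted, background-filtered points into one point per object:
// group runs whose neighbouring angles are consecutive and whose distance gap
// is below a threshold, then take the mean angle and median distance per group.
fun integrate(
    points: List<RadarPoint>,
    maxAngleGapDeg: Double = 1.0,    // assumed angular continuity limit
    maxDistGapMm: Double = 50.0      // assumed distance continuity limit
): List<RadarPoint> {
    val groups = mutableListOf<MutableList<RadarPoint>>()
    for (pt in points.sortedBy { it.angle }) {
        val last = groups.lastOrNull()?.last()
        if (last != null &&
            pt.angle - last.angle <= maxAngleGapDeg &&
            abs(pt.distance - last.distance) <= maxDistGapMm
        ) {
            groups.last().add(pt)
        } else {
            groups.add(mutableListOf(pt))
        }
    }
    return groups.map { g ->
        val meanAngle = g.sumOf { it.angle } / g.size
        val medianDist = g.map { it.distance }.sorted()[g.size / 2]
        RadarPoint(meanAngle, medianDist)
    }
}
```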
c. The user touches calibration positions 1, 2, 3 and 4 in sequence, and the point data of the four calibration positions are acquired.
d. Five unknowns are solved: the radar position offset (x0, y0), the angle offset θ0, the x-direction conversion coefficient p and the y-direction conversion coefficient t.
e. Set forth the system of equations. For each calibration position i (i = 1, 2, 3, 4), with known screen coordinates (xi, yi) and measured distance di and angle θi:

xi = x0 - di*sin(θi + θ0)*p

yi = y0 + di*cos(θi + θ0)*t
f. The system contains 8 equations in 5 unknowns, so it is redundant; accuracy is improved by averaging the values obtained from the different derivations.
g. To simplify the solution, the relations x1 = x3, x2 = x4, y1 = y2 and y3 = y4 are used (the calibration positions form the corners of an axis-aligned rectangle on the screen).
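As an illustration of how the simplification collapses the system (a reconstruction sketched from the mapping equations above, not the document's own closed form): substituting xi = x0 - di*sin(θi + θ0)*p into x1 = x3 cancels x0 and p, leaving

d1*sin(θ1 + θ0) = d3*sin(θ3 + θ0),

and expanding the sines gives

tan(θ0) = (d1*sin(θ1) - d3*sin(θ3)) / (d3*cos(θ3) - d1*cos(θ1)),

whose solution is one candidate θ0A; the analogous constraint x2 = x4 yields θ0B, and the two are averaged. With θ0 fixed, p, t, x0 and y0 follow linearly from the remaining equations.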
Obtaining by solution:

[equation images: closed-form expressions for the paired solutions θ0A/θ0B, x0A/x0B, y0A/y0B, pA/pB and tA/tB]

θ0 = (θ0A + θ0B)/2

x0 = (x0A + x0B)/2

y0 = (y0A + y0B)/2

p = (pA + pB)/2

t = (tA + tB)/2
After solving, the parameters are used for actual touch mapping:

xs = x0 - ds*sin(θs + θ0)*p

ys = y0 + ds*cos(θs + θ0)*t

wherein:

x0: sensor offset in the x direction, in pixels;
y0: sensor offset in the y direction, in pixels;
θ0: sensor angle offset;
p: distance-to-pixel mapping coefficient in the x direction;
t: distance-to-pixel mapping coefficient in the y direction;
d1~d4: measured distances of the four calibration positions;
θ1~θ4: measured angles of the four calibration positions;
xs: actual wall-surface x coordinate at the touch, in pixels;
ys: actual wall-surface y coordinate at the touch, in pixels;
ds: distance actually measured during the touch;
θs: angle actually measured during the touch.
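Putting the calibration together in Kotlin, under the rectangle assumption of step g, one plausible reconstruction of the solve is shown below; this is a sketch, not the patent's published closed form, and all names are illustrative (Calibration is repeated so the snippet stands alone):

```kotlin
import kotlin.math.atan2
import kotlin.math.cos
import kotlin.math.sin

data class Calibration(val x0: Double, val y0: Double, val theta0: Double,
                       val p: Double, val t: Double)

// d[i], th[i]: measured distance and angle of calibration position i+1;
// sx[i], sy[i]: known screen coordinates (pixels) of that position.
// Assumes sx[0] == sx[2], sx[1] == sx[3], sy[0] == sy[1], sy[2] == sy[3].
fun solveCalibration(d: DoubleArray, th: DoubleArray,
                     sx: DoubleArray, sy: DoubleArray): Calibration {
    // x1 = x3 and x2 = x4 each pin down a candidate angle offset; average them.
    val theta0A = atan2(d[0] * sin(th[0]) - d[2] * sin(th[2]),
                        d[2] * cos(th[2]) - d[0] * cos(th[0]))
    val theta0B = atan2(d[1] * sin(th[1]) - d[3] * sin(th[3]),
                        d[3] * cos(th[3]) - d[1] * cos(th[1]))
    val theta0 = (theta0A + theta0B) / 2

    val u = DoubleArray(4) { d[it] * sin(th[it] + theta0) }   // x-direction terms
    val v = DoubleArray(4) { d[it] * cos(th[it] + theta0) }   // y-direction terms

    // xi = x0 - ui*p: p from the two horizontal pairs, x0 by back-substitution.
    val pA = (sx[0] - sx[1]) / (u[1] - u[0])
    val pB = (sx[2] - sx[3]) / (u[3] - u[2])
    val p = (pA + pB) / 2
    val x0 = ((sx[0] + u[0] * p) + (sx[1] + u[1] * p)) / 2

    // yi = y0 + vi*t: t from the two vertical pairs, y0 by back-substitution.
    val tA = (sy[0] - sy[2]) / (v[0] - v[2])
    val tB = (sy[1] - sy[3]) / (v[1] - v[3])
    val t = (tA + tB) / 2
    val y0 = ((sy[0] - v[0] * t) + (sy[2] - v[2] * t)) / 2

    return Calibration(x0, y0, theta0, p, t)
}

// Touch mapping: xs = x0 - ds*sin(θs + θ0)*p, ys = y0 + ds*cos(θs + θ0)*t.
fun mapTouch(cal: Calibration, ds: Double, thetaS: Double): Pair<Double, Double> =
    Pair(cal.x0 - ds * sin(thetaS + cal.theta0) * cal.p,
         cal.y0 + ds * cos(thetaS + cal.theta0) * cal.t)
```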
1.4 methods of use:
After the equations are solved, whenever a person touches the screen during later use, the touched point is converted into screen coordinates.
The acquired points are then simulated as touch points using the event injection method carried by the android system, completing the simulation of large-screen touch.
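One concrete injection path on android is Instrumentation.sendPointerSync; this is a sketch, not necessarily the exact mechanism the patent uses. Note that injecting into other applications' windows requires the system-level android.permission.INJECT_EVENTS permission, and the call must not be made on the main thread:

```kotlin
import android.app.Instrumentation
import android.os.SystemClock
import android.view.MotionEvent

// Simulates a tap at screen pixel (x, y) by injecting a DOWN/UP pair.
// Slides, long presses and drags follow the same pattern with interleaved
// ACTION_MOVE events and adjusted timing.
fun injectTap(x: Float, y: Float) {
    val instrumentation = Instrumentation()
    val downTime = SystemClock.uptimeMillis()
    val down = MotionEvent.obtain(
        downTime, downTime, MotionEvent.ACTION_DOWN, x, y, 0
    )
    val up = MotionEvent.obtain(
        downTime, SystemClock.uptimeMillis(), MotionEvent.ACTION_UP, x, y, 0
    )
    instrumentation.sendPointerSync(down)   // blocks until the event is consumed
    instrumentation.sendPointerSync(up)
    down.recycle()
    up.recycle()
}
```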
The method has been used in the field of radar touch interactive projection, and is currently used in the interactive products of Hummer Intelligent Technology (Tianjin) Co., Ltd.
The above description is only a specific embodiment of the present invention and does not limit the scope of protection; any modification, equivalent replacement or improvement made within the spirit and principle of the present invention shall fall within the protection scope of the appended claims.

Claims (10)

1. A large-screen touch identification method based on radar and the android system, characterized by comprising:
collecting point cloud data of an occluding object with a radar sensor, and performing point cloud integration on the occluding-object information;
converting the integrated position information into position coordinates on the display device; and performing touch identification through android-layer event injection.
2. The radar and android system-based large-screen touch identification method as claimed in claim 1, characterized in that the method further comprises: performing radar calibration.
3. The radar and android system-based large-screen touch identification method of claim 2, wherein the radar calibration comprises:
(1) acquiring background points containing angle and distance data, and discarding collected points whose distance exceeds the background distance minus 30 mm;
(2) filtering the collected data: removing points that are angularly discontinuous with their neighbours, then sorting by angle; grouping points whose angles are consecutive and whose distance differences are below a threshold; for each group, taking the mean of the angles and the median of the distances;
(3) touching calibration positions 1, 2, 3 and 4 in sequence to obtain their point data; calculating the radar position offset, the angle offset, the x-direction conversion coefficient and the y-direction conversion coefficient respectively, and calibrating accordingly.
4. The radar and android system-based large-screen touch identification method as claimed in claim 3, wherein the radar position offset, the angle offset, the x-direction conversion coefficient and the y-direction conversion coefficient are calculated according to the following formulas:
the following 8 equations are simultaneously solved:
For each calibration position i (i = 1, 2, 3, 4), with known screen coordinates (xi, yi) and measured distance di and angle θi:

xi = x0 - di*sin(θi + θ0)*p

yi = y0 + di*cos(θi + θ0)*t

obtaining by solution:

[equation images: closed-form expressions for the paired solutions θ0A/θ0B, x0A/x0B, y0A/y0B, pA/pB and tA/tB]

θ0 = (θ0A + θ0B)/2

x0 = (x0A + x0B)/2

y0 = (y0A + y0B)/2

p = (pA + pB)/2

t = (tA + tB)/2
After solving, the parameters are used for actual touch mapping:

xs = x0 - ds*sin(θs + θ0)*p

ys = y0 + ds*cos(θs + θ0)*t

wherein:

x0: sensor offset in the x direction, in pixels;
y0: sensor offset in the y direction, in pixels;
θ0: sensor angle offset;
p: distance-to-pixel mapping coefficient in the x direction;
t: distance-to-pixel mapping coefficient in the y direction;
d1~d4: measured distances of the four calibration positions;
θ1~θ4: measured angles of the four calibration positions;
xs: actual wall-surface x coordinate at the touch, in pixels;
ys: actual wall-surface y coordinate at the touch, in pixels;
ds: distance actually measured during the touch;
θs: angle actually measured during the touch.
5. The radar and android system-based large-screen touch identification method of claim 3, wherein acquiring the background points containing angle and distance data comprises:
during the first 10 seconds, while no object touches the wall, taking for each angle the minimum of the collected distances as the background point.
6. The radar and android system-based large-screen touch identification method of claim 1, wherein performing touch identification through android-layer event injection comprises:
simulating the acquired points as touch points with the android event injection method, thereby simulating and identifying large-screen touches.
7. A large-screen touch identification system based on radar and the android system, characterized by comprising:
the point cloud data acquisition module, used for acquiring point cloud information of the occluding object with the calibrated laser radar sensor;
the analysis and mapping module, used for analysing the data collected by the radar sensor and mapping them to screen coordinates;
and the identification module, used for completing touch identification by simulating large-screen touches through android-layer event injection.
8. An information data processing terminal, characterized in that the information data processing terminal comprises a memory and a processor, the memory stores a computer program, and the computer program is executed by the processor, so that the processor executes the radar and android system-based large-screen touch identification method according to any one of claims 1 to 6.
9. A computer device, characterized in that the computer device comprises a memory and a processor, the memory stores a computer program, and the computer program is executed by the processor, so that the processor executes the radar and android system-based large-screen touch identification method according to any one of claims 1 to 6.
10. A human-computer interaction wall curtain is characterized in that the human-computer interaction wall curtain executes the large-screen touch identification method based on the radar and android system according to any one of claims 1-6.
CN202110377876.8A, filed 2021-04-08: Large-screen touch identification method, system and terminal based on radar and android system. Legal status: Pending.

Priority Applications (1)

CN202110377876.8A, priority and filing date 2021-04-08: Large-screen touch identification method, system and terminal based on radar and android system

Publications (1)

CN112947801A, published 2021-06-11

Family ID: 76231067

Country Status (1)

CN: CN112947801A (en)


Legal Events

PB01: Publication
SE01: Entry into force of request for substantive examination