CN102508574A - Projection-screen-based multi-touch detection method and multi-touch system - Google Patents


Info

Publication number
CN102508574A
CN102508574A
Authority
CN
China
Prior art keywords
pixel
image
projection
finger tip
projection plane
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN2011103535122A
Other languages
Chinese (zh)
Other versions
CN102508574B (en)
Inventor
胡军
李国林
谢翔
刘金
王志华
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tsinghua University
Original Assignee
Tsinghua University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tsinghua University filed Critical Tsinghua University
Priority to CN201110353512.2A priority Critical patent/CN102508574B/en
Publication of CN102508574A publication Critical patent/CN102508574A/en
Application granted granted Critical
Publication of CN102508574B publication Critical patent/CN102508574B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Landscapes

  • Position Input By Displaying (AREA)

Abstract

The invention discloses a projection-screen-based multi-touch detection method in the technical field of human-computer interaction. The method comprises the following steps: measuring a projection plane, calculating the parameters of the projection plane, and training a background model of the projection plane to form a background image; projecting a graphical interface onto the projection plane and judging whether to start monitoring touch operations; acquiring an image s of a touch operation; performing background segmentation on the acquired image s according to the background image and extracting a foreground image; extracting fingertips from the foreground image while filtering out the palm, the arm, and the corresponding shadows; judging contact points according to the three-dimensional spatial information of the extracted fingertips; and, if a fingertip contacts the projection plane, judging the touch operation to be effective and returning the position coordinates of the contact point. The invention also discloses a projection-screen-based multi-touch system. A convenient, comfortable, accurate, real-time, and stable mode of human-computer interaction is thereby realized.

Description

Multi-point touch detection method and multi-point touch system based on projection screen
Technical field
The present invention relates to the field of human-computer interaction technology, and in particular to a multi-point touch detection method and multi-point touch system based on a projection screen.
Background technology
People wish to communicate with the digital world anytime and anywhere, and, following the trend toward miniaturization of electronic products, mobile portable digital products such as smartphones and tablet computers have won consumers' favor. Mobile phones and portable computers keep shrinking in volume while their performance and feature sets keep growing. A large number of sensors, for example accelerometers, gyroscopes, and image sensors, are now widely used in smartphones and tablet computers. With the rapid development of microelectronics, it can be expected that ever more sensors will be embedded in portable miniature devices such as smartphones and tablet computers.
Today's electronic products mostly adopt LED displays, CRT monitors, liquid crystal displays (LCD), and the like as display devices, and physical keyboards, touch panels, and the like as input devices, thereby realizing human-computer interaction. The miniaturization of these products does not, in essence, stem from a desire to miniaturize the human-computer interface. In many application environments, shrinking the interface only hurts the convenience and comfort of interaction: people would rather type on a physical keyboard and browse information on a large screen. Convenient and comfortable interaction therefore requires an input device suited to manual operation and a display interface suited to information browsing, which stands in contradiction to the miniaturization of electronic products. With existing modes of interaction, the miniaturization of the product inevitably miniaturizes the interactive interface. For example, present mobile phones are so small that the display screen is inconvenient to read and the keyboard is inconvenient to operate.
To overcome the defects of traditional input and output devices, several new human-computer interaction systems have appeared in recent years. Patents CN 101881921A, CN 102063618A, and US 2011/0197147A1 adopt a projection device to display an image and then realize interaction through gestures on the projection plane: a set of gestures is defined in the system, each gesture corresponds to an instruction, and the correspondences are stored in an instruction database. Gestures can thus realize operations such as turning pages, zooming in, and zooming out. The greatest defect of such systems is that the number of instructions gestures can encode is limited. On the one hand, increasing the instruction count increases the user's learning burden, making the system hard for ordinary users to accept; on the other hand, even a large instruction set can hardly realize character and digit input. Recently, a company has released a laser-projected virtual keyboard (VKB): a laser projects a keyboard onto the table, and an infrared sensor judges whether a finger strikes a key. However, the projected keyboard image is monochrome, the product can only accomplish character and digit input, and it cannot project or operate a rich-color graphical interface.
Summary of the invention
(1) Technical problem to be solved
The technical problem to be solved by the present invention is how to realize convenient, comfortable, fast, accurate, and stable human-computer interaction for miniature mobile portable devices.
(2) Technical solution
To solve the above technical problem, the invention provides a multi-point touch detection method based on a projection screen, comprising the following steps:
S1: measuring the projection plane, calculating parameters of said projection plane, and training a background model of said projection plane to form a background image;
S2: projecting a graphical interface onto the projection plane and judging whether to start monitoring touch operations; if there is no touch operation, continuing to monitor;
S3: acquiring an image s of a touch operation;
S4: performing background segmentation on the image s acquired in step S3 according to said background image and extracting a foreground image, the foreground image containing at least fingertips, a palm, an arm, and the shadow each of them forms;
S5: extracting the fingertips from the foreground image while filtering out the palm, the arm, and the corresponding shadows;
S6: judging contact points according to the three-dimensional information of the extracted fingertips;
S7: if a fingertip contacts the projection plane, judging the event to be one effective touch operation and returning the position coordinates of the contact point.
Wherein, the plane-parameter calculation in said step S1 comprises the following steps:
Detecting, by a structured-light method, N points on the projection plane (N ≥ 3, the N points not collinear) with coordinates (x_i, y_i, z_i)^T, i = 1, 2, …, N, in the image-sensor coordinate system. When N = 3, the plane parameters are obtained by solving the linear system

    | x_1 y_1 z_1 |   | α |   | 1 |
    | x_2 y_2 z_2 | · | β | = | 1 |
    | x_3 y_3 z_3 |   | γ |   | 1 |

Otherwise the projection-plane parameters are estimated by the least-squares method, where α, β, γ are the coefficients of the equation of said projection plane, satisfying the plane equation αx + βy + γz = 1.
Wherein, the background-segmentation mode in said step S4 is: traversing every pixel of image s; a pixel whose deviation from the statistical mean of the background image is greater than a first threshold is judged to be foreground; a pixel whose deviation from the statistical mean is less than a second threshold is judged to be background, and the background image is updated according to that pixel value; a pixel whose deviation lies between the first and the second threshold is judged to be background, but the background image is not updated; said first threshold is greater than said second threshold.
Wherein, in said step S4, before the traversal, the coordinates of the four corners of the projected image are used to calculate the four corresponding points on the acquired-image plane so as to delimit the region of the projected image within image s; during the traversal, only the projected-image region of image s is traversed.
Wherein, said step S5 specifically comprises:
(1) judging whether a pixel belongs to the foreground image; if not, ending the processing of this pixel;
(2) if it belongs to the foreground image, judging whether it is a shadow; if the pixel is a shadow, ending the processing of this pixel, otherwise concluding that said pixel is a fingertip;
(3) traversing every pixel of image s according to (1)-(2).
Wherein, said step S5 specifically comprises:
(1) judging whether a pixel belongs to the foreground image; if not, ending the processing of this pixel;
(2) if it belongs to the foreground image, judging whether it is a shadow; if said pixel is a shadow, ending the processing of said pixel;
(3) calculating the pixel slope of said pixel;
(4) judging whether there is a shadow point on the same slope line; if there is a shadow point on the same slope line as said pixel, concluding that the region of said pixel does not contact the projection screen and ending the processing of this pixel, otherwise concluding that said pixel is a fingertip;
(5) traversing every pixel of image s according to (1)-(4).
Wherein, said step S6 specifically comprises:
projecting structured light onto the fingertip region and capturing the deformed structured-light pattern;
finding the pixel mapping relation between the projected structured-light image and the deformed structured-light pattern in the fingertip region;
calculating the distance from the fingertip to the projection plane: the structured-light method yields the fingertip coordinate (x_i, y_i, z_i)^T in the image-sensor coordinate system, so the distance between the fingertip and the projection plane is

    d_i = |α x_i + β y_i + γ z_i − 1| / sqrt(α² + β² + γ²)

judging the contact point: if d_i > threshold_1, the fingertip does not contact the projection plane; otherwise the fingertip contacts the projection plane, wherein threshold_1 is a predetermined threshold.
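The distance test above can be sketched as follows; this is a minimal illustration, and the function name, the units, and the sample value of threshold_1 are assumptions rather than values from the patent.

```python
import numpy as np

def fingertip_plane_distance(p, plane, threshold_1=5.0):
    """Distance from a fingertip point p = (x, y, z), recovered by structured
    light in image-sensor coordinates, to the plane alpha*x + beta*y + gamma*z = 1,
    plus the contact decision d <= threshold_1."""
    alpha, beta, gamma = plane
    x, y, z = p
    d = abs(alpha * x + beta * y + gamma * z - 1) / np.sqrt(alpha**2 + beta**2 + gamma**2)
    return d, d <= threshold_1
```

The threshold absorbs measurement noise: a fingertip hovering just above the plane still counts as a contact if its distance falls below threshold_1.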
Wherein, said step S6 specifically comprises:
projecting a structured-light stripe onto the fingertip region and capturing the structured-light image;
detecting, in the captured structured-light image, the abscissa x_1 of the stripe's upper-edge pixel and the abscissa x_2 of its lower-edge pixel;
calculating the stripe centre (x_1 + x_2)/2;
judging whether the fingertip contacts the projection plane: if

    |x_c − (x_1 + x_2)/2| < threshold_2,

then the fingertip is concluded to contact the projection plane, wherein x_c is the abscissa of the fingertip centre and threshold_2 is a predetermined threshold.
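The fast test above needs only the stripe edges and the fingertip centre; a minimal sketch follows, in which the function name and the sample value of threshold_2 (in pixels) are assumptions, not from the patent.

```python
def fringe_contact(x1, x2, x_c, threshold_2=2.0):
    """Fast contact test of the patent's second S6 variant: if the fingertip
    lies on the projection plane, the projected stripe is not displaced by
    depth, so the observed stripe centre (x1 + x2)/2 stays aligned with the
    fingertip centre x_c."""
    return abs(x_c - (x1 + x2) / 2.0) < threshold_2
```

Because no 3D coordinate is reconstructed, this variant avoids the full structured-light depth computation, matching the "fast algorithm" claimed in the beneficial effects.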
The present invention also provides a multi-point touch system based on a projection screen, comprising: a projector, an image sensor, an I/O port, an input device, a storage device, and a central controller; said projector, image sensor, I/O port, input device, and storage device are all connected with said central controller; said I/O port is used to connect a mobile terminal; said projector is used to project the image and operation interface displayed by the mobile terminal onto the projection plane; said image sensor is used to capture gestures on the projection screen; said central controller is used to parse the touch operation from said gestures and feed the parsed touch information back to the mobile terminal; and said input device is used to input configuration information and instructions.
Wherein, said input device comprises a photosensitive sensor for measuring the brightness of ambient light, and said central controller regulates the brightness of the projector according to the output value of the photosensitive sensor.
(3) Beneficial effects
In the projection-screen-based multi-point touch system and method of the present invention, the background-segmentation step adopts a Gaussian model for background modeling, updates the background model in real time, and extracts foreground information by a difference method. The system can therefore extract the foreground image accurately and rapidly and adapt to changes in illumination.
Detect step at finger tip and adopt shadow Detection and pixel slope method, the fingertip area that filtering does not contact with projection plane.Optimize the surveyed area in the follow-up depth detection step, reduced depth detection step consumed time, accelerated the speed of system's touch control detection.
In addition, the fingertip depth-detection step projects structured light onto the fingertip region and judges from the feedback whether the fingertip contacts the projection plane. The present invention can adopt a fast algorithm that need not calculate the exact spatial coordinates of the fingertip, thereby reducing the computation of the depth-detection step.
Description of drawings
Fig. 1a is the architecture of a projection-screen-based multi-point touch system according to an embodiment of the invention;
Fig. 1b is an application example of the system in Fig. 1a;
Fig. 2 is the flow chart of a projection-screen-based multi-point touch detection method according to an embodiment of the invention;
Fig. 3 is the system-initialization flow of the multi-point touch detection method;
Fig. 4 is a simplified fingertip-detection flow of the multi-point touch detection method;
Fig. 5 is a high-accuracy fingertip-detection flow of the multi-point touch detection method;
Fig. 6 is a schematic diagram of the pixel slope;
Fig. 7 is a fingertip depth-detection flow of the multi-point touch detection method;
Fig. 8 is a fast fingertip depth-detection flow of the multi-point touch detection method.
Embodiment
Specific embodiments of the invention are described in further detail below in conjunction with the accompanying drawings. The following embodiments are used to explain the present invention, but not to limit its scope.
The embodiment of the invention discloses a projection-screen-based multi-point touch system architecture 101, shown in Fig. 1a. The architecture comprises:
Projector 104, the output display unit of the system. It projects the image onto any plane that is easy for the user to operate on and observe, such as a wall or a desktop; this projection plane is called the projection screen. Projector 104 can adopt technologies such as DLP, LCD, LCOS, and laser projection.
Image sensor 106, used for image acquisition: it captures touch operations such as clicks and drags on the projection screen. Image sensors are already widely applied in mobile portable equipment; essentially all mobile phones and notebook computers embed a camera. Mainstream image sensors divide into two technologies, CCD and CMOS. At equal resolution CMOS is cheaper than CCD, although the image quality a CMOS device produces is lower. One advantage of the CMOS image sensor is power consumption lower than CCD; another is high integrability with peripheral circuits, so that the ADC and signal processor can be combined and the volume markedly reduced. Preferably, image sensor 106 adopts a CMOS camera.
I/O port 113, connecting external mobile portable equipment. I/O port 113 can be a common interface such as USB, a serial port, or VGA, or a custom interface. The mobile portable equipment can be a smartphone, tablet computer, notebook computer, etc. This port realizes data and instruction communication between the mobile portable equipment and central controller 120; through it the projection-screen multi-touch system can very conveniently be embedded in other electronic equipment.
Input device 115, realizing the user's configuration of the system (inputting related instructions, configuring the working mode, etc.); it is an auxiliary input means.
Preferably, the input device can be a custom keyboard, a mouse, or a touch panel, realizing the user's configuration of the system: inputting related instructions, configuring the working mode, and so on.
Preferably, the input device can also be a photosensitive sensor: the photosensitive sensor measures the brightness of ambient light, and the system regulates the brightness of the projector according to the sensor's output value.
Storage device 111, used to store system parameters and cache images. While central controller 120 processes image data, the image data and system parameters need to be stored. The storage device can be a general dynamic or static memory device.
Central controller 120, connected respectively with projector 104, image sensor 106, I/O port 113, input device 115, and storage device 111, is the core module of the whole touch system. The module mainly realizes the following functions:
1) receiving the display image of the mobile portable equipment from I/O port 113 and sending it to projector 104;
2) receiving images from image sensor 106 (which contain the gesture information of the bare hand on the operation interface), parsing the images, and extracting the touch information;
3) returning the analysis result to the mobile portable equipment through I/O port 113, whereupon the mobile portable equipment carries out the corresponding operation.
Central controller 120 is the core of the whole system, responsible for its data processing and control. To satisfy the performance indices of the system, central controller 120 should guarantee real-time data processing.
Preferably, central controller 120 can be designed as an ASIC chip to improve the response speed of the whole system. A high-performance microprocessor can of course also be adopted.
The projection-screen multi-point touch system can be applied in mobile portable electronic products such as smartphones, tablet computers, notebook computers, and digital cameras. It can also replace the traditional LCD display, mouse, and keyboard in general electronic equipment. It can even be applied in harsh environments, such as equipment at sea that must be waterproof, salt-fog-proof, and damp-proof: without a traditional control panel and display panel, the system's protective characteristics are better, and the equipment design becomes simpler and more unified.
Preferably, the present embodiment embeds the touch system in a smartphone 102 for illustration, as shown in Fig. 1b.
Projector 104 is positioned directly above image sensor 106. This mounting mainly makes it easy for the image sensor to detect the fingertip and the shadow produced by projector 104. The installation positions of projector 104 and image sensor 106 are of course not limited to this example.
Projector 104 can project the image or operation interface onto an arbitrary plane. The hand 112 can carry out corresponding touch operations on projected image 110, such as clicking a button or dragging a slider; from this angle the system essentially replaces the mouse. In addition, a virtual keyboard can be projected, and its size can be changed according to the characteristics of different users so that it suits the size of the user's hand and the user's habits.
In example 100 the image or interface is projected onto desktop 108. The greatest benefit of this projection mode is convenience of operation and observation: mobile phone 102 stands in front of the user, the projected image is presented between the phone and the user, and the size and brightness of the projected image are adjusted to a comfortable degree. The user can sit leisurely at the table, browsing the web or working.
The projection plane can also be a wall. The greatest characteristic of this mode is convenience of information sharing. When the user gives a business report or discusses a problem with colleagues, the image can be projected onto the wall, which is far more convenient for exchange than several people crowding around one notebook computer. At rest, several people can likewise gather to watch a film together.
The embodiment of the invention also discloses a projection-screen-based multi-point touch detection method 200; in the present embodiment the method is realized with the system described above, as shown in Fig. 2, comprising:
Step 202, measuring the projection plane, calculating said projection-plane parameters, and training a background model of said projection plane to form a background image.
Step 204, projecting the display image or operation interface onto the screen and judging whether to begin touch detection; if detection has not begun, continuing with step 204.
Step 206, acquiring an image with the image sensor; this image contains touch information such as clicks and drags of the bare hand.
Step 208, background segmentation: performing background segmentation on the image acquired in step 206 and extracting the foreground image, which may contain fingertips, a palm, an arm, shadows, and so on.
Step 210, fingertip detection: extracting the fingertips from the foreground image (in general the fingertip is the active element of a touch operation) and filtering out useless information such as the palm, the arm, and shadows.
Step 212, judging contact points according to the three-dimensional information of the fingertips, which is obtained by the structured-light method: a structured-light stripe is projected onto the fingertip region, image sensor 106 captures the deformed stripe, and the three-dimensional information of the fingertip is calculated; combining the fingertip's three-dimensional information with the equation of the projection plane, whether the fingertip contacts the projection plane is judged.
Step 214, returning the touch information: if a fingertip contacts the projection plane, the event is judged to be one effective touch operation. The system supports several fingertips clicking the projection screen simultaneously, and all the corresponding touch points are effective. One round of touch detection being complete, the method returns to step 204.
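The per-frame portion of method 200 can be summarized as a loop over steps 206-214. The sketch below is only a structural skeleton under stated assumptions: the four callables stand in for the patent's processing steps, and their names are illustrative, not from the patent.

```python
def touch_detection_loop(acquire, segment, detect_fingertips, fingertip_contacts,
                         max_rounds=1):
    """Skeleton of steps 206-214: acquire an image, segment the foreground,
    detect fingertips, and report every fingertip that contacts the plane.
    All simultaneous contact points are returned (multi-touch)."""
    touches = []
    for _ in range(max_rounds):
        s = acquire()                    # step 206: capture an image
        front = segment(s)               # step 208: background segmentation
        tips = detect_fingertips(front)  # step 210: fingertip detection
        # steps 212-214: depth test per fingertip; every touching tip counts
        touches.extend(t for t in tips if fingertip_contacts(t))
    return touches
```

In the real system the loop would run continuously, returning to step 204 after each round, as the flow of Fig. 2 describes.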
Step 202, shown in Fig. 3, amounts in the present embodiment to initializing the system; its main task is detecting the relevant parameters of the application environment, specifically comprising:
Step 302, measuring the projection plane and calculating the plane parameters;
Step 304, adjusting the projection brightness;
Step 306, training the background model.
The principle of the projection-plane measurement is as follows. The imaging formula of image sensor 106 and projector 104 can be described as:

    (x, y, 1)^T = e · M_int · M_ext · (x_w, y_w, z_w, 1)^T    (1)

where the left side is the coordinate in the image captured by image sensor 106 or in the image of projector 104, in pixels; M_int is the intrinsic-parameter matrix, M_ext is the extrinsic-parameter matrix, e is a normalization variable, and (x_w, y_w, z_w, 1)^T are the homogeneous coordinates of the world coordinate system. The measurement of the intrinsic-parameter matrix is comparatively mature; preferably, the present embodiment measures it with Zhang Zhengyou's calibration method. The intrinsic parameters of image sensor 106 and projector 104 are fixed at the factory and are used as known quantities in the embodiment; likewise, the relative position and attitude between image sensor 106 and projector 104 are used as known quantities.
It can be seen that the image-sensor coordinate is a linear mapping of the real-world coordinate, but the mapping is not one-to-one: several real-world coordinates may correspond to the same image-sensor coordinate. If, however, the real-world coordinate is known to satisfy a certain condition, namely to lie on a certain plane, then besides equation (1) the plane equation of the world coordinate system also holds:

    1 = (α, β, γ) · (x_w, y_w, z_w)^T    (2)
The above plane parameters (α, β, γ) can be measured by the structured-light technique, a widely used method in the field of 3D image reconstruction: coded structured light is projected onto the object surface, and because the surface depth varies, the structured light deforms; this deformed structured light is captured, and the depth of the object surface is calculated from the deformation data. In the present invention, the structured-light technique is adopted to calculate the world coordinates of points on the projection plane and thereby the parameters (α, β, γ).
From equations (1) and (2) the one-to-one mapping between world coordinates and image-sensor coordinates is obtained; at the same time, because the projection system is similar to the image-sensor system, the one-to-one mapping between world coordinates and projector coordinates is also obtained. Hence, on the projection plane, image-sensor (camera) coordinates and projector coordinates have a one-to-one mapping, which can be expressed linearly as:

    (x_c, y_c, 1)^T = H · (x_p, y_p, 1)^T    (3)

where the two sides of the equation are respectively the pixel coordinates of the image-sensor image and of the projector image, and H is a 3 × 3 transformation matrix called the homography matrix. The formula holds only for points on a certain plane; that is, for points not on the projection plane, the formula certainly does not hold. Through this property, whether a spatial point lies in the plane can be judged.
The measurement method of said projection plane is comparatively mature; preferably, the present embodiment adopts the following steps:
(1) projector 104 projects a black-and-white chessboard pattern (the checkerboard pattern most commonly used in image-sensor calibration) onto the projection plane;
(2) image sensor 106 acquires the chessboard pattern on the projection plane;
(3) the chessboard corners are detected with a corner-detection technique; corner detection is comparatively mature, and the corners can, for example, be detected directly with a function of OpenCV;
(4) using the corner correspondence between the image-sensor image and the projected image, the homography matrix H is calculated; this step can also be accomplished with a function of OpenCV.
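The patent points to OpenCV for both corner detection and the computation of H. As a self-contained illustration of what that last step computes, here is a direct-linear-transform (DLT) estimate of the homography of equation (3) from point correspondences; this numpy-only sketch is an assumption about the underlying computation, not the patent's implementation.

```python
import numpy as np

def estimate_homography(src, dst):
    """DLT estimate of the 3x3 homography H with dst ~ H @ src (homogeneous),
    from N >= 4 point correspondences. Each correspondence contributes two
    rows of the constraint matrix A with A @ h = 0."""
    A = []
    for (xp, yp), (xc, yc) in zip(src, dst):
        A.append([xp, yp, 1, 0, 0, 0, -xc * xp, -xc * yp, -xc])
        A.append([0, 0, 0, xp, yp, 1, -yc * xp, -yc * yp, -yc])
    # h is the right singular vector of A for the smallest singular value
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]  # fix the scale ambiguity
```

With exact, noise-free correspondences (as with detected chessboard corners on an ideal plane), four non-degenerate points already determine H; more points make the SVD solution a least-squares fit.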
The calculation procedure for the plane parameters in above step 302 is as follows:
(1) the structured-light technique obtains the coordinates (x_i, y_i, z_i)^T, i = 1, 2, …, N, of N points on the projection plane (N ≥ 3, the points not collinear) in the image-sensor coordinate system;
(2) the projection-plane equation parameters are calculated from the coordinate values of the above N points. When N = 3,

    | x_1 y_1 z_1 |   | α |   | 1 |
    | x_2 y_2 z_2 | · | β | = | 1 |    (4)
    | x_3 y_3 z_3 |   | γ |   | 1 |

If N > 3, the equation parameters are solved by the least-squares method, where α, β, γ are the coefficients of the equation of said projection plane, satisfying the plane equation αx + βy + γz = 1.
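Both cases, the exact N = 3 solve of equation (4) and the N > 3 least-squares estimate, can be sketched in a few lines; the function name is illustrative and the sketch assumes the points are given in camera coordinates as in the text.

```python
import numpy as np

def fit_plane(points):
    """Solve [x_i y_i z_i] @ (alpha, beta, gamma)^T = 1 for N >= 3
    non-collinear points on the projection plane. N == 3 is the exact 3x3
    solve; N > 3 is the least-squares estimate described in the text."""
    P = np.asarray(points, dtype=float)   # N x 3 matrix of point coordinates
    ones = np.ones(len(P))
    if len(P) == 3:
        return np.linalg.solve(P, ones)
    coeffs, *_ = np.linalg.lstsq(P, ones, rcond=None)
    return coeffs
```

Every returned triple (α, β, γ) satisfies (or best approximates) αx + βy + γz = 1 for the input points, which is exactly the plane-equation normalization the patent uses.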
The above projection-brightness adjustment step 304 is as follows:
The multi-point touch system can be applied in different environments. High-brightness ambient light makes the projected image or operation interface unclear, so the brightness of projector 104 must be turned up, at which point the projection module is in a high-power mode; under low-brightness ambient light, projector 104 need not project an especially bright image, so its brightness can be turned down and the projection module placed in a low-power mode. The brightness of the projector can be adjusted manually to a suitable level. Preferably, a luminance sensor can be added to the system to measure the ambient light intensity, and the system then regulates the brightness of projector 104 automatically according to the sensor's return value.
The above background-model training step 306 is as follows:
Background modeling is a comparatively mature technique; the present invention preferably adopts the Gaussian background model, which assumes that every pixel of the background obeys a Gaussian distribution. In the background-model training stage, the system captures several images of the projection screen and computes the mean μ_ij and standard deviation σ of each pixel over these images; the image composed of the means μ_ij is called the background image BackGround (BackGround is an image matrix whose elements are image pixel values).
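The training stage reduces to a per-pixel mean and standard deviation over the captured frames; a minimal numpy sketch follows, with the function name assumed for illustration.

```python
import numpy as np

def train_background(frames):
    """Per-pixel Gaussian background model: stack K grayscale frames captured
    while the screen is idle, then compute the per-pixel mean (the BackGround
    image of the text) and the per-pixel standard deviation sigma."""
    stack = np.asarray(frames, dtype=float)   # shape K x H x W
    return stack.mean(axis=0), stack.std(axis=0)
```

The resulting mean image and sigma map are exactly the quantities the segmentation step compares each new frame against.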
After the background training is complete, foreground extraction and background segmentation can be carried out with the extracted background-pixel means. Two points deserve attention here: the background parameters are adjusted in real time to follow faint variations of the ambient illumination, so that the algorithm is no longer sensitive to slow illumination changes; at the same time, the algorithm gives a good foreground separation for the fast motion of the hand. Background-segmentation step 208 is as follows:
The set FrontArea represents the foreground region. FrontArea is a binary matrix of size width × height, where width is the width of the image and height its height. If pixel (i, j) is foreground, then FrontArea(i, j) = 1; otherwise FrontArea(i, j) = 0, where (i, j) is the pixel coordinate in the image-sensor image plane.
Preferred adopt following prospect to judge that rule carries out the extraction of foreground image and the renewal of background model, (i, j) (establishing imageing sensor 106, to catch image be s, during initialization, makes to travel through pixel successively FrontArea ( i , j ) = 0 , &ForAll; i , j ) :
If the deviation from the background statistical mean |s_ij − μ_ij| is greater than threshold Ω1, the pixel is judged to be foreground: set FrontArea(i, j) = 1, and do not update the background image from it. Here s_ij is the value of image s at (i, j); preferably, the captured image s is a gray-level image and s_ij is its gray value at (i, j).
If the deviation |s_ij − μ_ij| is within threshold Ω2 (Ω1 > Ω2), the pixel is judged to be background, and the background model is updated from it: BackGround(i, j) = s_ij.
A pixel whose deviation lies between Ω2 and Ω1 is also treated as background, but the model is not updated from it.
Preferably, threshold Ω1 is taken as 5σ and threshold Ω2 as 3σ.
Each pixel is judged and updated in turn according to the three rules above; the traversal order is shown in a table (figure omitted).
The result is the foreground image FrontArea and the updated background model image BackGround.
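The three decision rules above can be sketched as follows (stdlib Python; images are lists of rows, names are illustrative, and the per-pixel thresholds use the preferred values Ω1 = 5σ and Ω2 = 3σ):

```python
def segment_foreground(s, background, sigma):
    """Classify each pixel of image s against the Gaussian background:
    deviation > omega1 -> foreground; deviation < omega2 -> background,
    with the background model updated in place; in between -> background
    without a model update."""
    h, w = len(s), len(s[0])
    front = [[0] * w for _ in range(h)]
    for i in range(h):
        for j in range(w):
            dev = abs(s[i][j] - background[i][j])
            omega1, omega2 = 5 * sigma[i][j], 3 * sigma[i][j]
            if dev > omega1:
                front[i][j] = 1             # foreground, model untouched
            elif dev < omega2:
                background[i][j] = s[i][j]  # background, model updated
            # omega2 <= dev <= omega1: background, but no update
    return front

# One-row example against a trained background
background = [[100.0, 100.0]]
sigma = [[2.0, 2.0]]
s = [[120, 101]]
front = segment_foreground(s, background, sigma)
```

The pixel that deviates by 20 gray levels (more than 5σ = 10) is marked foreground; the pixel that deviates by 1 (less than 3σ = 6) refreshes the model.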
In particular, the field of view of the image sensor is generally larger than that of the projector, so the image captured by the image sensor contains, besides the projected image, some regions with no projection. Since only the image near the projection screen is of interest, the previously obtained projection-screen parameters can be used to extract and process only the pixels near the screen, which speeds up processing and improves stability. This idea proceeds as follows:
Take the 4 corners of the projector image:

a1 = (1, 1, 1)^T,  a2 = (1, width, 1)^T,  a3 = (height, 1, 1)^T,  a4 = (height, width, 1)^T
Substituting into equation (3) gives the 4 corresponding points on the image plane of the image sensor:

b1 = H a1,  b2 = H a2,  b3 = H a3,  b4 = H a4
Let region Area be the quadrilateral determined by the points b1, b2, b3, b4 on the image plane of the image sensor. Every pixel of the projector image, when mapped onto the image-sensor plane, lies within region Area.
Therefore, the background segmentation need not traverse the entire image region captured by the image sensor; it only needs to traverse the region Area, i.e. the projected-image region within the captured image.
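The corner mapping above can be sketched as follows (illustrative Python; a real homography H would come from the calibration behind equation (3), and an identity matrix stands in here):

```python
def map_point(H, p):
    """Apply a 3x3 homography H (list of rows) to the homogeneous point
    p = (u, v, 1) and dehomogenize to image-sensor coordinates."""
    x, y, w = (sum(H[r][c] * p[c] for c in range(3)) for r in range(3))
    return (x / w, y / w)

def projection_region(H, height, width):
    """Map the projector-image corners a1..a4 through H to obtain the
    quadrilateral region Area on the image-sensor plane."""
    corners = [(1, 1, 1), (1, width, 1), (height, 1, 1), (height, width, 1)]
    return [map_point(H, a) for a in corners]

# With an identity homography the region is just the corner list itself
H = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
area = projection_region(H, height=480, width=640)
```

Only pixels inside the quadrilateral spanned by these four mapped points need to be visited during segmentation.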
Fig. 4 and Fig. 5 show the flows of two versions of the fingertip-detection step 210.
Fig. 4 describes a simplified fingertip-detection flow 210a. Each pixel of the image s captured in step 206 is processed according to the flow of Fig. 4: if pixel (i, j) belongs to a fingertip region, it is added to the set FingerArea, which is initially the empty set. The flow is as follows:
Step 402: obtain the value s_ij of pixel (i, j).
Step 404: judge whether the pixel is foreground. If FrontArea(i, j) = 1, pixel (i, j) is foreground; otherwise it is not, and processing of this pixel ends.
Step 406: judge whether the pixel is shadow. Shadow detection is a known technique: shadows have characteristic properties in color spaces such as RGB and YUV, and these features can be used to decide whether a pixel is shadow. If the pixel is shadow, processing of this pixel ends.
Step 408: add pixel (i, j) to the set FingerArea.
The result of the simplified fingertip-detection method contains much redundant information: some fingertips cast a shadow on the projection screen, from which it can already be concluded that the fingertip is not in contact with the screen, so in theory no depth measurement is needed for such fingertips. This method, however, adds such fingertips to the set FingerArea without distinction, which increases the computational load of the subsequent depth measurement.
To address this defect of the simplified method, Fig. 5 describes a more accurate fingertip-detection method 210b. This method uses a pixel-slope detection to filter out fingertips that cast shadows, reducing the number of elements in FingerArea and hence the computation of the fingertip depth measurement. The pixel slope is defined as:
k = (x0 - xi) / (y0 - yi)    (5)
As shown in Fig. 6, (x0, y0) is the epipole coordinate and (xi, yi) is the pixel coordinate, with the upper-left corner of the image as the origin. The epipole coordinate can be computed directly from the extrinsic parameters. From epipolar geometry it is known that if a fingertip (with pixel slope k) casts a shadow, the shadow point must lie on the ray corresponding to the same pixel slope k. Therefore, to judge whether a fingertip has a shadow, the basic idea is: compute the pixel slope of the fingertip, then search along that slope for a shadow point. If a shadow point is found, it can be concluded that the fingertip does not touch the projection screen. If no shadow point is found, it cannot be directly concluded that the fingertip touches the screen: in practice, because the shadow is too bright, the finger is too close to the plane, and so on, part of the shadow often goes undetected. To fully confirm whether a shadowless fingertip touches the projection plane, the result of the subsequent fingertip depth detection is also needed. The flow of the more accurate fingertip-detection method 210b is as follows:
Step 502: obtain the value s_ij of pixel (i, j).
Step 504: using the foreground image extracted in step 208, judge whether the pixel is foreground. If FrontArea(i, j) = 1, pixel (i, j) is foreground; otherwise it is not, and processing of this pixel ends.
Step 506: using the shadow-detection result, judge whether the pixel is shadow. If it is, processing of this pixel ends.
Step 508: compute the pixel slope k of pixel (i, j).
Step 510: judge whether there is a shadow point on the same slope. If there is a shadow point on the same slope as the pixel, it can be concluded that the region of pixel (i, j) does not touch the projection screen, and processing of this pixel ends.
Step 512: add pixel (i, j) to the set FingerArea.
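The slope test of steps 508-510 can be sketched as follows (illustrative Python; the epipole and point values are invented, and a numeric tolerance stands in for searching actual shadow pixels along the ray):

```python
def pixel_slope(epipole, p):
    """Pixel slope k = (x0 - xi) / (y0 - yi) of equation (5); the image
    origin is the upper-left corner and (x0, y0) is the epipole."""
    (x0, y0), (xi, yi) = epipole, p
    return (x0 - xi) / (y0 - yi)

def has_shadow_on_ray(epipole, tip, shadow_points, tol=1e-6):
    """Steps 508-510: a fingertip's shadow must lie on the ray of the
    same pixel slope, so search the detected shadow points for one
    whose slope matches the fingertip's."""
    k = pixel_slope(epipole, tip)
    return any(abs(pixel_slope(epipole, s) - k) < tol for s in shadow_points)

epipole = (0.0, 0.0)
shadows = [(3.0, 6.0), (5.0, 1.0)]                         # detected shadow pixels
shadowed = has_shadow_on_ray(epipole, (2.0, 4.0), shadows)  # same ray as (3, 6)
clear = has_shadow_on_ray(epipole, (1.0, 5.0), shadows)     # no shadow on its ray
```

A fingertip with a shadow on its ray is dropped before depth measurement; a shadowless one goes on to step 512.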
Fig. 7 describes a fingertip depth-measurement method 212a. Measuring depth with structured light is a known technique; its basic steps are:
Step 702: project structured-light stripes onto the fingertip region.
Step 704: capture the deformed structured-light pattern with image sensor 106.
Step 706: find the pixel mapping between the projected structured-light image and the deformed structured-light pattern in the fingertip region. Preferably, Gray coding, color coding, or sinusoidal coding can be used to obtain the mapping between the projected image and the image-sensor image.
Step 708: compute the distance from the fingertip to the projection plane. The structured-light method yields the fingertip coordinate (x_i, y_i, z_i)^T in the image-sensor coordinate system, where z_i denotes the fingertip depth, i.e. the distance from the fingertip to image sensor 106. The distance between the fingertip and the projection plane is then:
d_i = (α x_i + β y_i + γ z_i - 1) / √(α² + β² + γ²)    (6)
Step 710: judge contact. If d_i > threshold_1, the fingertip does not touch the projection plane; otherwise it does. Here threshold_1 is a preset threshold; preferably, threshold_1 is 1 cm.
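Equation (6) together with the threshold test of step 710 can be sketched as follows (illustrative Python; an absolute value is added so fingertips on either side of the plane get a positive distance, which the patent's signed form leaves implicit):

```python
import math

def fingertip_plane_distance(p, alpha, beta, gamma):
    """Point-to-plane distance of equation (6) for the plane
    alpha*x + beta*y + gamma*z = 1, with p = (x, y, z) in the
    image-sensor coordinate system."""
    x, y, z = p
    return abs(alpha * x + beta * y + gamma * z - 1) / math.sqrt(
        alpha ** 2 + beta ** 2 + gamma ** 2)

def touches(p, plane, threshold_1=1.0):
    """Step 710: contact if the distance does not exceed threshold_1
    (preferably 1 cm, in the same units as the coordinates)."""
    return fingertip_plane_distance(p, *plane) <= threshold_1

# Example plane z = 50 in sensor units: alpha = beta = 0, gamma = 1/50
plane = (0.0, 0.0, 0.02)
```

A fingertip measured at depth 50 lies on the plane and is judged to touch; one at depth 120 is far away and is rejected.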
Fig. 8 describes another, faster fingertip depth-detection method 212b. If a fingertip touches the projection screen, the fingertip's coordinates in the projector coordinate system and in the image-sensor coordinate system must satisfy formula (3); method 212b uses exactly this property to judge whether the fingertip touches the screen. Let the center of the fingertip region in image s be (x_c, y_c)^T. Assuming the fingertip touches the screen, formula (3) gives the fingertip coordinate in the projector coordinate system: (x_p, y_p, 1)^T = H^(-1) (x_c, y_c, 1)^T. A structured-light stripe is then projected onto the fingertip region centered at (x_p, y_p). If the upper and lower edges of the stripe captured by image sensor 106 are at x_1 and x_2 respectively, the center of the captured structured light is (x_1 + x_2)/2. If this center equals x_c, it can be concluded that the fingertip touches the projection screen; otherwise it does not. The steps of this method are:
Step 802: project a structured-light stripe onto the fingertip region.
Step 804: capture the structured-light image.
Step 806: detect, in the captured structured-light image, the abscissa x_1 of the stripe's upper-edge pixels and the abscissa x_2 of its lower-edge pixels.
Step 808: compute the stripe center (x_1 + x_2)/2.
Step 810: judge whether the fingertip touches the projection plane. If

|x_c - (x_1 + x_2)/2| < threshold_2

the fingertip touches the projection plane; otherwise it does not. Here threshold_2 is another preset threshold; preferably, threshold_2 is 2.
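The center test of step 810 reduces to a one-line comparison (illustrative Python with invented pixel values):

```python
def stripe_touch_test(x_c, x1, x2, threshold_2=2):
    """Step 810: the fingertip touches the projection plane when the
    captured stripe center (x1 + x2) / 2 matches the predicted
    fingertip center x_c to within threshold_2 (preferably 2)."""
    return abs(x_c - (x1 + x2) / 2) < threshold_2

# Touching: the stripe lands where formula (3) predicted (center 100.5)
touching = stripe_touch_test(x_c=100, x1=98, x2=103)
# Hovering: the stripe is displaced by the fingertip's height (center 108.5)
hovering = stripe_touch_test(x_c=100, x1=106, x2=111)
```

The displacement of the stripe away from the predicted center is what encodes the fingertip's height above the plane.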
With the fast fingertip depth-detection method 212b, the algorithmic complexity of the contact-judgment step 212 is reduced and touch detection is accelerated. The low complexity makes the algorithm easy to integrate in hardware, and the high speed guarantees real-time touch detection for the whole system.
The above embodiments are intended only to illustrate the present invention, not to limit it. Those of ordinary skill in the relevant technical field can make various changes and modifications without departing from the spirit and scope of the present invention; therefore, all equivalent technical solutions also belong to the scope of the present invention, and the patent protection scope of the present invention shall be defined by the claims.

Claims (10)

1. A projection-screen-based multi-touch detection method, characterized by comprising the following steps:
S1: measuring the projection plane, calculating parameters of said projection plane, and performing background-model training on said projection plane to form a background image;
S2: projecting a graphical interface onto the projection plane and judging whether to begin monitoring touch operations; if there is no touch operation, continuing the monitoring;
S3: acquiring an image s of the touch operation;
S4: performing background segmentation on the image s acquired in step S3 according to said background image, and extracting a foreground image, the foreground image comprising at least fingertips, a palm, an arm, and the shadow each of them forms;
S5: extracting the fingertips from the foreground image, and filtering out the palm, the arm, and the corresponding shadows;
S6: judging contact points according to the spatial three-dimensional information of the extracted fingertips;
S7: if a fingertip contacts the projection plane, judging this to be one valid touch operation, and returning the position coordinates of the contact point.
2. The projection-screen-based multi-touch detection method of claim 1, characterized in that the plane-parameter calculation in said step S1 comprises the steps of:
detecting, by a structured-light method, N points on the projection plane, N ≥ 3, the N points not all collinear, with coordinates (x_i, y_i, z_i)^T, i = 1, 2, ..., N, in the image-sensor coordinate system; when N = 3, calculating the plane parameters from

(α, β, γ)^T = [x_1 y_1 z_1; x_2 y_2 z_2; x_3 y_3 z_3]^(-1) (1, 1, 1)^T

and otherwise estimating the projection-plane parameters by least squares, where α, β, γ are the equation coefficients of said projection plane, satisfying the plane equation αx + βy + γz = 1.
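For N = 3 the formula amounts to solving the 3×3 linear system [x_i y_i z_i](α, β, γ)^T = (1, 1, 1)^T. The sketch below (illustrative Python, solving by Cramer's rule) covers only this three-point case, not the least-squares branch:

```python
def det3(m):
    """Determinant of a 3x3 matrix given as a list of rows."""
    return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
            - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
            + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

def plane_from_three_points(pts):
    """Solve [xi yi zi](alpha, beta, gamma)^T = (1, 1, 1)^T by Cramer's
    rule for the plane alpha*x + beta*y + gamma*z = 1 through three
    non-collinear points."""
    m = [list(p) for p in pts]
    d = det3(m)
    coeffs = []
    for k in range(3):
        mk = [row[:] for row in m]     # replace column k with ones
        for r in range(3):
            mk[r][k] = 1.0
        coeffs.append(det3(mk) / d)
    return tuple(coeffs)

# Three points on the plane z = 50, i.e. 0*x + 0*y + (1/50)*z = 1
alpha, beta, gamma = plane_from_three_points([(0, 0, 50), (10, 0, 50), (0, 10, 50)])
```

With more than three points, the same system becomes overdetermined and the least-squares estimate mentioned in the claim applies.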
3. The projection-screen-based multi-touch detection method of claim 1, characterized in that the background segmentation in said step S4 is performed as follows: each pixel of image s is traversed; a pixel whose deviation from the background-image statistical mean is greater than a first threshold is judged to be a foreground pixel; a pixel whose deviation from the background-image statistical mean is less than a second threshold is judged to be a background pixel, and the background image is updated with this pixel value; a pixel whose deviation lies between the first and second thresholds is judged to be a background pixel, but the background image is not updated; said first threshold is greater than said second threshold.
4. The projection-screen-based multi-touch detection method of claim 3, characterized in that in said step S4, before the traversal, the four corresponding points in the acquired-image plane are calculated from the coordinates of the four corners of the projected image, so as to determine the region occupied by the projected image within image s; during the traversal, only that projected-image region of image s is traversed.
5. The projection-screen-based multi-touch detection method of claim 1, characterized in that said step S5 specifically comprises:
(1) judging whether a pixel is foreground; if it is not foreground, ending the processing of this pixel;
(2) if it is foreground, judging whether it is shadow; if this pixel is shadow, ending the processing of this pixel, and otherwise concluding that said pixel is a fingertip;
(3) traversing each pixel of image s according to (1)-(2) above.
6. The projection-screen-based multi-touch detection method of claim 1, characterized in that said step S5 specifically comprises:
(1) judging whether a pixel is foreground; if it is not foreground, ending the processing of this pixel;
(2) if it is foreground, judging whether it is shadow; if said pixel is shadow, ending the processing of said pixel;
(3) calculating the pixel slope of said pixel;
(4) judging whether there is a shadow point on the same slope; if there is a shadow point on the same slope as said pixel, concluding that the region of said pixel does not contact the projection screen and ending the processing of this pixel, and otherwise concluding that said pixel is a fingertip;
(5) traversing each pixel of image s according to (1)-(4) above.
7. The projection-screen-based multi-touch detection method of claim 1, characterized in that said step S6 specifically comprises:
projecting structured light onto the fingertip region, and capturing the deformed structured-light pattern;
finding the pixel mapping between the projected structured-light image and the deformed structured-light pattern in the fingertip region;
calculating the distance from the fingertip to the projection plane: the structured-light method yields the fingertip coordinate (x_i, y_i, z_i)^T in the image-sensor coordinate system, and the distance between the fingertip and the projection plane is then

d_i = (α x_i + β y_i + γ z_i - 1) / √(α² + β² + γ²)

judging the contact point: if d_i > threshold_1, the fingertip does not contact the projection plane; otherwise the fingertip contacts the projection plane, where threshold_1 is a predetermined threshold.
8. The projection-screen-based multi-touch detection method of claim 1, characterized in that said step S6 specifically comprises:
projecting a structured-light stripe onto the fingertip region, and capturing the structured-light image;
detecting, in the captured structured-light image, the abscissa x_1 of the stripe's upper-edge pixels and the abscissa x_2 of its lower-edge pixels;
calculating the stripe center (x_1 + x_2)/2;
judging whether the fingertip contacts the projection plane: if

|x_c - (x_1 + x_2)/2| < threshold_2,

concluding that the fingertip contacts the projection plane, where x_c is the abscissa of the fingertip center and threshold_2 is a predetermined threshold.
9. A projection-screen-based multi-touch system, comprising: a projector, an image sensor, an I/O port, an input device, a storage device, and a central controller, said projector, image sensor, I/O port, input device, and storage device all being connected to said central controller; characterized in that said I/O port is used to connect a mobile terminal; said projector is used to project the image displayed by the mobile terminal and the operation interface onto the projection plane; said image sensor is used to capture gestures on the projection screen; said central controller is used to parse the touch operation of said gestures and feed the parsed touch information back to the mobile terminal; and said input device is used to input configuration information and instructions.
10. The projection-screen-based multi-touch system of claim 9, characterized in that said input device comprises a photosensitive sensor for measuring the brightness of ambient light, and said central controller adjusts the brightness of the projector according to the output value of the photosensitive sensor.
CN201110353512.2A 2011-11-09 2011-11-09 Projection-screen-based multi-touch detection method and multi-touch system Active CN102508574B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201110353512.2A CN102508574B (en) 2011-11-09 2011-11-09 Projection-screen-based multi-touch detection method and multi-touch system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201110353512.2A CN102508574B (en) 2011-11-09 2011-11-09 Projection-screen-based multi-touch detection method and multi-touch system

Publications (2)

Publication Number Publication Date
CN102508574A true CN102508574A (en) 2012-06-20
CN102508574B CN102508574B (en) 2014-06-04

Family

ID=46220673

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201110353512.2A Active CN102508574B (en) 2011-11-09 2011-11-09 Projection-screen-based multi-touch detection method and multi-touch system

Country Status (1)

Country Link
CN (1) CN102508574B (en)

Cited By (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102968979A (en) * 2012-11-12 2013-03-13 广东欧珀移动通信有限公司 Screen brightness scheduling method based on curve fitting
CN103093475A (en) * 2013-01-28 2013-05-08 海信集团有限公司 Image processing method and electronic device
CN103279225A (en) * 2013-05-30 2013-09-04 清华大学 Projection type man-machine interactive system and touch control identification method
CN103336634A (en) * 2013-07-24 2013-10-02 清华大学 Adaptive hierarchical structure light-based touch detection system and method
CN103995592A (en) * 2014-05-21 2014-08-20 上海华勤通讯技术有限公司 Wearable equipment and terminal information interaction method and terminal
CN104090689A (en) * 2014-06-27 2014-10-08 深圳市中兴移动通信有限公司 Mobile terminal and interactive projection method and system thereof
CN104464412A (en) * 2014-12-23 2015-03-25 成都韬睿教育咨询有限公司 Remote education system and implementation method thereof
CN105657307A (en) * 2014-12-02 2016-06-08 霍尼韦尔国际公司 System and method of foreground extraction for digital cameras
CN103955316B (en) * 2014-04-28 2016-09-21 清华大学 A kind of finger tip touching detecting system and method
CN106774904A (en) * 2016-12-20 2017-05-31 哈尔滨拓博科技有限公司 A kind of method that finger movement signal is converted into PC control signal
CN107092350A (en) * 2017-03-22 2017-08-25 深圳大学 A kind of remote computer based system and method
CN107122083A (en) * 2017-04-25 2017-09-01 上海唱风信息科技有限公司 The touch control detecting method on perspective plane
CN107239178A (en) * 2016-03-28 2017-10-10 精工爱普生株式会社 Display system, information processor, projecting apparatus and information processing method
CN109144318A (en) * 2013-12-05 2019-01-04 禾瑞亚科技股份有限公司 Judge the method and apparatus whether palm line segment group needs to divide
CN109799928A (en) * 2017-11-16 2019-05-24 清华大学深圳研究生院 Project the acquisition methods and system of user's finger parameter in touch tablet
CN110471576A (en) * 2018-08-16 2019-11-19 中山叶浪智能科技有限责任公司 A kind of nearly screen touch method of single camera, system, platform and storage medium
CN114777684A (en) * 2017-10-06 2022-07-22 先进扫描仪公司 Generating one or more luminance edges to form a three-dimensional model of an object
CN115046733A (en) * 2022-05-09 2022-09-13 中国电器科学研究院股份有限公司 Touch screen performance detection method and device

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106095199A (en) * 2016-05-23 2016-11-09 广州华欣电子科技有限公司 A kind of touch-control localization method based on projection screen and system
TWI626423B (en) * 2016-09-12 2018-06-11 財團法人工業技術研究院 Tapping detecting device, tapping detecting method and smart projecting system using the same

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1655175A (en) * 2004-02-13 2005-08-17 株式会社日立制作所 Table type information terminal
CN101231450A (en) * 2008-02-25 2008-07-30 陈伟山 Multipoint and object touch panel arrangement as well as multipoint touch orientation method
CN101477427A (en) * 2008-12-17 2009-07-08 卫明 Contact or non-contact type infrared laser multi-point touch control apparatus


Cited By (33)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102968979A (en) * 2012-11-12 2013-03-13 广东欧珀移动通信有限公司 Screen brightness scheduling method based on curve fitting
CN103093475A (en) * 2013-01-28 2013-05-08 海信集团有限公司 Image processing method and electronic device
CN104766331B (en) * 2013-01-28 2017-10-13 海信集团有限公司 A kind of image processing method and electronic equipment
CN104766329B (en) * 2013-01-28 2018-04-27 海信集团有限公司 A kind of image processing method and electronic equipment
CN104766332B (en) * 2013-01-28 2017-10-13 海信集团有限公司 A kind of image processing method and electronic equipment
CN103093475B (en) * 2013-01-28 2015-05-13 海信集团有限公司 Image processing method and electronic device
CN104766329A (en) * 2013-01-28 2015-07-08 海信集团有限公司 Image processing method and electronic device
CN104766332A (en) * 2013-01-28 2015-07-08 海信集团有限公司 Image processing method and electronic device
CN104766331A (en) * 2013-01-28 2015-07-08 海信集团有限公司 Imaging processing method and electronic device
CN103279225B (en) * 2013-05-30 2016-02-24 清华大学 Projection type man-machine interactive system and touch control identification method
CN103279225A (en) * 2013-05-30 2013-09-04 清华大学 Projection type man-machine interactive system and touch control identification method
CN103336634B (en) * 2013-07-24 2016-04-20 清华大学 Based on touching detection system and the method for adaptive layered structured light
CN103336634A (en) * 2013-07-24 2013-10-02 清华大学 Adaptive hierarchical structure light-based touch detection system and method
CN109144318B (en) * 2013-12-05 2021-07-27 禾瑞亚科技股份有限公司 Method and device for judging whether palm line segment group needs to be divided
CN109144318A (en) * 2013-12-05 2019-01-04 禾瑞亚科技股份有限公司 Judge the method and apparatus whether palm line segment group needs to divide
CN103955316B (en) * 2014-04-28 2016-09-21 清华大学 A kind of finger tip touching detecting system and method
CN103995592A (en) * 2014-05-21 2014-08-20 上海华勤通讯技术有限公司 Wearable equipment and terminal information interaction method and terminal
CN104090689B (en) * 2014-06-27 2019-11-05 努比亚技术有限公司 A kind of method and system of mobile terminal and its interactive projection
CN104090689A (en) * 2014-06-27 2014-10-08 深圳市中兴移动通信有限公司 Mobile terminal and interactive projection method and system thereof
CN105657307A (en) * 2014-12-02 2016-06-08 霍尼韦尔国际公司 System and method of foreground extraction for digital cameras
CN104464412A (en) * 2014-12-23 2015-03-25 成都韬睿教育咨询有限公司 Remote education system and implementation method thereof
CN107239178A (en) * 2016-03-28 2017-10-10 精工爱普生株式会社 Display system, information processor, projecting apparatus and information processing method
CN106774904A (en) * 2016-12-20 2017-05-31 哈尔滨拓博科技有限公司 A kind of method that finger movement signal is converted into PC control signal
CN107092350A (en) * 2017-03-22 2017-08-25 深圳大学 A kind of remote computer based system and method
CN107122083B (en) * 2017-04-25 2020-02-18 上海唱风信息科技有限公司 Touch detection method of projection surface
CN107122083A (en) * 2017-04-25 2017-09-01 上海唱风信息科技有限公司 The touch control detecting method on perspective plane
CN114777684A (en) * 2017-10-06 2022-07-22 先进扫描仪公司 Generating one or more luminance edges to form a three-dimensional model of an object
CN109799928A (en) * 2017-11-16 2019-05-24 清华大学深圳研究生院 Project the acquisition methods and system of user's finger parameter in touch tablet
CN109799928B (en) * 2017-11-16 2022-06-17 清华大学深圳研究生院 Method and system for acquiring user finger parameters in projection touch panel
CN110471576A (en) * 2018-08-16 2019-11-19 中山叶浪智能科技有限责任公司 A kind of nearly screen touch method of single camera, system, platform and storage medium
CN110471576B (en) * 2018-08-16 2023-11-17 中山叶浪智能科技有限责任公司 Single-camera near-screen touch method, system, platform and storage medium
CN115046733A (en) * 2022-05-09 2022-09-13 中国电器科学研究院股份有限公司 Touch screen performance detection method and device
CN115046733B (en) * 2022-05-09 2024-05-10 中国电器科学研究院股份有限公司 Touch screen performance detection method and device

Also Published As

Publication number Publication date
CN102508574B (en) 2014-06-04

Similar Documents

Publication Publication Date Title
CN102508574B (en) Projection-screen-based multi-touch detection method and multi-touch system
CN110310288B (en) Method and system for object segmentation in a mixed reality environment
US9288373B2 (en) System and method for human computer interaction
US9710109B2 (en) Image processing device and image processing method
CN104199550B (en) Virtual keyboard operation device, system and method
CN103164022B (en) Many fingers touch method and device, portable terminal
US20140204120A1 (en) Image processing device and image processing method
Caputo et al. 3D hand gesture recognition based on sensor fusion of commodity hardware
CN103279225B (en) Projection type man-machine interactive system and touch control identification method
CN103955316B (en) A kind of finger tip touching detecting system and method
CN103383731A (en) Projection interactive method and system based on fingertip positioning and computing device
CN102591533A (en) Multipoint touch screen system realizing method and device based on computer vision technology
CN111739069B (en) Image registration method, device, electronic equipment and readable storage medium
CN104991684A (en) Touch control device and working method therefor
US10437342B2 (en) Calibration systems and methods for depth-based interfaces with disparate fields of view
US20150277570A1 (en) Providing Onscreen Visualizations of Gesture Movements
CN113011258A (en) Object monitoring and tracking method and device and electronic equipment
CN105653017A (en) Electronic device and gravity sensing correction method for electronic device
CN112657176A (en) Binocular projection man-machine interaction method combined with portrait behavior information
CN104199548A (en) Man-machine interactive type virtual touch device, system and method
KR101105872B1 (en) Method and apparatus for a hand recognition using an ir camera and monitor
WO2024055531A1 (en) Illuminometer value identification method, electronic device, and storage medium
CN104199549B (en) A kind of virtual mouse action device, system and method
US20220050528A1 (en) Electronic device for simulating a mouse
Zhang et al. Transforming a regular screen into a touch screen using a single webcam

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant