CN103383731B - Projection interaction method and system based on fingertip localization, and computing device - Google Patents

Projection interaction method and system based on fingertip localization, and computing device Download PDF

Info

Publication number
CN103383731B
CN103383731B · CN201310284483.8A · CN201310284483A
Authority
CN
China
Prior art keywords
human hand
fingertip
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201310284483.8A
Other languages
Chinese (zh)
Other versions
CN103383731A (en)
Inventor
程俊
王群
沈三明
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Institute of Advanced Technology of CAS
Original Assignee
Shenzhen Institute of Advanced Technology of CAS
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Institute of Advanced Technology of CAS filed Critical Shenzhen Institute of Advanced Technology of CAS
Priority to CN201310284483.8A priority Critical patent/CN103383731B/en
Publication of CN103383731A publication Critical patent/CN103383731A/en
Application granted granted Critical
Publication of CN103383731B publication Critical patent/CN103383731B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Landscapes

  • User Interface Of Digital Computer (AREA)
  • Position Input By Displaying (AREA)

Abstract

The present invention is applicable to the field of computer processing, and provides a projection interaction method and system based on fingertip localization, and a computing device. The method comprises the following steps: a computing device receives a picture of the projection screen captured by a camera; analyzing, by a computer vision method, whether a human hand is present in the image; if a hand has been detected, further calculating, by triangulation, the three-dimensional position of the fingertip of the hand on the image; calculating the distance from the fingertip to the projection screen according to the three-dimensional position of the fingertip; judging, according to the calculated distance, whether the finger touches the projection screen; and, if it is judged that the finger touches the projection screen, positioning the mouse cursor according to the calculated fingertip position and simulating touch-screen input. The present invention meets the real-time requirement of the interaction process.

Description

Projection interaction method and system based on fingertip localization, and computing device
Technical field
The invention belongs to the field of computer processing, and in particular relates to a projection interaction method and system based on fingertip localization, and a computing device.
Background technology
With the development of science and technology, cameras and projectors have gradually entered ordinary life, and projection is now applied in many settings, such as teaching and meetings of all kinds. Automatic gesture recognition using a projector and a camera has become a current research hotspot: through automatic recognition of gestures, better human-computer interaction is achieved and projection becomes more convenient to use. Although multimodal human-computer interaction technologies integrating vision, hearing, touch, smell and taste are increasingly applied, the hands, as an important model of action and perception in virtual reality systems, still play an irreplaceable role. At present, the touch screen, as the newest kind of computer input device, is the simplest and most convenient means of human-computer interaction; it gives multimedia a brand-new look and is an extremely attractive interactive multimedia device. With the progress of technology, projectors are used ever more widely, in training conferences, classroom teaching, cinemas and so on. A projector is easy to use and can turn any plane into a display screen. Retrieval on the patent search website of the State Intellectual Property Office of China shows that current vision-based projection interaction systems are essentially based on auxiliary-light localization.
Patent 200910190517.0 discloses a finger-based projection interaction method. That invention extracts the contour of the finger from a video file using information such as the color and shape of the hand, records the movement trajectory of the finger, and then compares the trajectory with the instructions in a predefined instruction library to judge which operation instruction the trajectory belongs to, thereby achieving the purpose of human-computer interaction.
Patent 200910197516.9 discloses a finger identification method in an interactive demonstration system, in which the operation behavior of the user is determined through finger recognition in a camera-projector demonstration system.
In order to recognize gestures in an image, the first task is target segmentation. Owing to the complexity of the background and the diversity of application environments, segmentation of the arm has always been a difficult point. In a projection interaction system, because of the illumination of the projector, the arm may take on different colors, and the projected picture itself may contain a hand; both make segmenting the arm difficult, and existing technical schemes make detection errors when these problems arise.
In addition, the prior art does not provide a judgment of whether the finger touches the screen. Interacting by means of the finger's movement trajectory inevitably introduces a certain time delay, which limits its application.
Summary of the invention
It is an object of the present invention to provide a projection interaction method and system based on fingertip localization, and a computing device, aiming to solve the problems in existing projection interaction systems that, owing to the illumination of the projector, the arm may take on different colors and the projected picture may itself contain a hand — both of which make segmenting the arm difficult and cause detection errors — and that the prior art provides no judgment of whether the finger touches the screen, so that interaction based on the finger's movement trajectory inevitably suffers a certain time delay, limiting its application.
The present invention is achieved as follows. A projection interaction method based on fingertip localization comprises the following steps:
a computing device receives a picture of the projection screen captured by a camera;
analyzing, by a computer vision method, whether a human hand is present in the image;
if a hand has been detected, further calculating, by triangulation, the three-dimensional position of the fingertip of the hand on the image;
calculating the distance from the fingertip to the projection screen according to the three-dimensional position of the fingertip;
judging, according to the calculated distance, whether the finger touches the projection screen;
if it is judged that the finger touches the projection screen, positioning the mouse cursor according to the calculated fingertip position and simulating touch-screen input.
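The steps above can be sketched as one processing loop per captured frame. This is an illustrative sketch only: the helper names `detect_hand`, `locate_fingertip_3d`, `distance_to_screen` and the 5 mm threshold are hypothetical placeholders, not names or values from the patent.

```python
def run_interaction_step(frame, detect_hand, locate_fingertip_3d,
                         distance_to_screen, click_threshold_mm=5.0):
    """Process one captured frame; return a simulated touch event or None."""
    hand_mask = detect_hand(frame)                   # is a hand present?
    if hand_mask is None:
        return None                                  # no hand -> no event
    tip_xyz = locate_fingertip_3d(frame, hand_mask)  # triangulated 3-D tip
    dist = distance_to_screen(tip_xyz)               # depth to screen plane
    if dist < click_threshold_mm:                    # contact test
        x, y, _ = tip_xyz
        return ("click", x, y)                       # simulate touch input
    return None
```

The placeholders would be bound to the hand-detection, triangulation and calibration routines described in the embodiments below.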
Another object of the present invention is to provide a projection interaction system based on fingertip localization, the system comprising:
a receiving module, for receiving a picture of the projection screen captured by a camera;
a hand analysis module, for analyzing, by a computer vision method, whether a human hand is present in the image;
a position calculation module, for further calculating, by triangulation, the three-dimensional position of the fingertip of the hand on the image if a hand has been detected;
a distance calculation module, for calculating the distance from the fingertip to the projection screen according to the three-dimensional position of the fingertip;
a judging module, for judging, according to the calculated distance, whether the finger touches the projection screen; and
a simulation module, for positioning the mouse cursor according to the calculated fingertip position and simulating touch-screen input if it is judged that the finger touches the projection screen.
Another object of the present invention is to provide a computing device that includes the above projection interaction system based on fingertip localization.
In the present invention, the image in the computer is projected onto the projection screen by the projector serving as output device; the picture on the projection screen is then captured by the camera; whether a hand is present in the image is analyzed by a computer vision method; the hand is then separated from the image and the fingertip position in the picture is found; the distance from the fingertip to the screen is then calculated to judge whether the finger touches the screen; finally the mouse cursor is positioned according to the calculated fingertip position and touch-screen input is simulated, achieving the purpose of human-computer interaction. This solves the problems in the prior art that, in a projection interaction system, the illumination of the projector may change the color of the arm and the projected picture may itself contain a hand — both of which make segmenting the arm difficult and cause detection errors — and that no judgment of screen contact is provided, so that trajectory-based interaction inevitably suffers a time delay that limits its application. The present invention meets the real-time requirement of the interaction process.
Accompanying drawing explanation
Fig. 1 is a schematic flowchart of the projection interaction method based on fingertip localization provided by an embodiment of the present invention.
Fig. 2 is a schematic diagram of the correspondence between the projector image and the camera image, provided by an embodiment of the present invention.
Fig. 3 is a schematic diagram of establishing the correspondence by finding chessboard corners, provided by an embodiment of the present invention.
Fig. 4 is a schematic diagram of triangulation, provided by an embodiment of the present invention.
Fig. 5 is a schematic structural diagram of the projection interaction system based on fingertip localization provided by an embodiment of the present invention.
Detailed description of the invention
In order to make the purpose, technical scheme and beneficial effects of the present invention clearer, the present invention is further described in detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described here serve only to explain the present invention and are not intended to limit it.
In the embodiment of the present invention, the method of "image estimation" is used to extract the arm; then the active vision system formed by the projector and the camera, together with the principle of triangulation, is used to calculate the distance from the finger to the screen, judge whether the finger touches the screen, and position the mouse cursor at the fingertip, realizing the function of a touch screen. The embodiment thereby overcomes the influence of projector illumination on target segmentation. The implementation mainly comprises a camera, a projector, a computing device and a projection screen (which may be a wall, a whiteboard or any other plane). The image in the computer is projected onto the projection screen by the projector serving as output device; the picture on the projection screen is then captured by the camera; whether a hand is present in the image is analyzed by a computer vision method; the hand is then separated from the image and the fingertip position in the picture is found; the distance from the fingertip to the screen is calculated to judge whether the finger touches the screen; finally the mouse cursor is positioned according to the calculated fingertip position and touch-screen input is simulated, achieving the purpose of human-computer interaction.
Referring to Fig. 1, the implementation flow of the projection interaction method based on fingertip localization provided by the embodiment of the present invention mainly comprises the following steps:
In step S101, the computing device receives a picture of the projection screen captured by the camera;
In step S102, whether a human hand is present in the image is analyzed by a computer vision method;
In the embodiment of the present invention, step S102 is specifically as follows:
In step S1021, the picture projected onto the projection screen, as output by the projector, is obtained;
In step S1022, the picture output by the projector is compared with the picture of the projection screen captured by the camera;
In step S1023, if the comparison shows a difference, hand feature information is combined to judge whether a hand is present in the picture of the projection screen captured by the camera.
In the embodiment of the present invention there are many computer vision methods for detecting a human hand; the most common is detection using the color and shape of the hand. However, skin-color detection has two drawbacks in a projection interaction system: first, the light emitted by the projector changes the color of the arm it falls on, which makes detection difficult; second, when the projected picture itself contains a hand, false detection results. In a system composed of a projector and a camera, one very important piece of information can be exploited: the computer knows the picture being projected, and can therefore estimate in advance the image the camera will read. If no arm occludes the projected picture, the image read by the camera should be very close to the image the computer estimates; if some part of the projected picture is occluded by an arm, the image actually read by the camera differs markedly from the estimated image in the occluded region. Using this information, the position of the arm can be found accurately, without interference from the projected picture. The key of the embodiment is to estimate accurately the image the camera reads, which requires the following two steps.
Calibration of the geometric position
In order to estimate the image read by the camera, a correspondence must be established between the projected picture and the camera. The mapping from one plane to another can be represented by a 3 × 3 matrix H. As shown in Fig. 2, the three points a, b, c of the projected picture plane correspond to the three points a', b', c' in the camera image plane. In order to calculate the transfer matrix H, corresponding points must be found on the projector image and the camera image. The embodiment of the present invention adopts the following steps:
1. Control the projector to project a chessboard image.
2. Capture the projected picture with the camera, and detect the corners of the projector-output image and of the camera-captured image respectively.
3. Calculate the transfer matrix H from the corresponding corners.
As shown in Fig. 3, the corners of the image output by the projector and of the image captured by the camera are detected respectively, and the matrix H is then calculated. Assuming P is any point on the image output by the projector and P' is the corresponding point on the camera image, then P' = H·P.
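The relation P' = H·P above can be solved from four or more corner correspondences. A minimal NumPy sketch of the standard direct linear transform (DLT) — the patent does not name a solution method, so DLT is an assumption here, though it is the usual way to estimate a 3 × 3 homography:

```python
import numpy as np

def estimate_homography(src_pts, dst_pts):
    """DLT estimate of the 3x3 matrix H with dst ~ H @ src (homogeneous).
    Needs at least 4 non-degenerate point correspondences."""
    A = []
    for (x, y), (u, v) in zip(src_pts, dst_pts):
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    # The homography is the null vector of A, i.e. the last row of V^T.
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]          # normalize so that H[2, 2] = 1

def apply_homography(H, pt):
    """Map one projector point into the camera image: P' = H * P."""
    p = H @ np.array([pt[0], pt[1], 1.0])
    return p[:2] / p[2]         # back from homogeneous coordinates
```

In practice the chessboard corners detected in step 2 would supply `src_pts` (projector image) and `dst_pts` (camera image).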
Calibration of color
For a point P on the image output by the projector, the transfer matrix computed above gives its corresponding position on the camera image. To estimate the image, it is also necessary to know the pixel value of this point on the camera image. Because the light projected by the projector is not uniform, even a pure-color image shows different pixel values after projector imaging; and even if the projector projected uniform light, the pixel values of the image read by the camera would still differ across positions because lens light-gathering is non-uniform. The calibration of color is therefore related not only to the pixel value but also to the position. The embodiment of the present invention performs color calibration with the following steps:
1. Quantize the 0–255 RGB pixel values of the image into a space of 10 × 10 × 10 = 1000 colors;
2. Project each quantized color onto the screen while simultaneously capturing with the camera;
3. Divide the projection screen into 64 × 48 = 3072 blocks, and calculate, for each block, the mean and the variance V of the pixel values of each RGB channel.
A table is thus established for the 1000 colors over the 3072 blocks, and the image can then be estimated by the following steps:
A. The computer obtains the current projector image and quantizes it into the 10 × 10 × 10 color space in the same way;
B. For each pixel, look up the corresponding block according to the pixel's position, and take the stored mean for the pixel's quantized color as its estimated pixel value.
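Steps A and B above amount to a per-pixel table lookup. A minimal sketch, in which the `block_of` function and the `mean_table` dictionary are hypothetical stand-ins for the 64 × 48-block calibration data described above:

```python
import numpy as np

Q = 10  # bins per channel -> 10 * 10 * 10 = 1000 quantized colors

def quantize(rgb):
    """Map an 8-bit RGB triple to its bin index in the 10x10x10 space."""
    r, g, b = (min(int(c) * Q // 256, Q - 1) for c in rgb)
    return (r * Q + g) * Q + b

def predict_camera_image(projector_img, block_of, mean_table):
    """Estimate the image the camera would read: for each pixel, quantize
    the projected color and look up the recorded per-block mean.
    `block_of(x, y)` returns the screen block index of a pixel and
    `mean_table[(block, color)]` its calibrated mean RGB value."""
    h, w, _ = projector_img.shape
    est = np.zeros((h, w, 3))
    for y in range(h):
        for x in range(w):
            c = quantize(projector_img[y, x])
            est[y, x] = mean_table[(block_of(x, y), c)]
    return est
```

A production version would precompute the lookup as an array rather than a Python dictionary, but the table structure is the same.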
Through the above steps, the image the camera will read can be pre-estimated from the image output by the projector, and the arm is then detected by analyzing the difference between the estimated image and the actually read image.
Arm extraction
Through the above steps, the estimated image and the actually read image are obtained; the next task is to analyze these two images to find the position of the arm in the image read by the camera. Assume a point (x, y) in the computer picture whose brightness becomes P after projector imaging; the color transfer relationship of the camera is denoted C, the reflectance of the screen is denoted A, and the pixel value on the image formed by the camera is denoted I. Then the following relationship holds:
I=C × A × P (1);
The reflectance of the hand is denoted A' and the pixel value actually read by the camera is denoted I', so that I' = C × A' × P. The change of reflectance is denoted a = A'/A. The change of reflectance of the three RGB channels of each pixel (x, y) at different positions is calculated by the following formula (2):
a[x, y, c] = A'/A = I'[x, y, c] / I[x, y, c]   (2)
Assuming the noise of the reflectance A obeys a Gaussian distribution with zero mean and variance V/I, the following decision rule is used to judge whether a pixel (x, y) lies in the arm region:
1 − (a[x, y, R] + a[x, y, G] + a[x, y, B]) / 3 > (V[x, y, R] + V[x, y, G] + V[x, y, B]) / (I[x, y, R] + I[x, y, G] + I[x, y, B])   (3)
With the above approach, even if the projected picture itself contains a hand, interference from that hand is avoided. The method can be applied not only to static backgrounds but also to dynamic backgrounds: arm detection proceeds normally even while the projector plays video, for example.
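The decision rule of formulas (2) and (3) can be applied per pixel over whole images at once. A vectorized NumPy sketch (the small `eps` guard against division by zero is an addition of this sketch, not part of the patent's formulas):

```python
import numpy as np

def arm_mask(I_est, I_read, V, eps=1e-6):
    """Per-pixel decision rule of Eqs. (2)-(3): a = I'/I is the reflectance
    change; a pixel is flagged as arm when 1 - mean_RGB(a) exceeds the
    noise threshold sum(V)/sum(I). All inputs are HxWx3 float arrays."""
    a = I_read / (I_est + eps)                        # Eq. (2), per channel
    lhs = 1.0 - a.mean(axis=2)                        # 1 - (aR+aG+aB)/3
    rhs = V.sum(axis=2) / (I_est.sum(axis=2) + eps)   # Eq. (3) threshold
    return lhs > rhs                                  # True where arm
```

Pixels where the camera reads roughly what was estimated fall below the threshold; occluded pixels, which read much darker, exceed it.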
In step S103, if a hand has been detected, the three-dimensional position of the fingertip of the hand on the image is further calculated by triangulation;
In the embodiment of the present invention, before the step of calculating the three-dimensional position of the fingertip of the hand on the image, the method also includes: locating the fingertip position.
In the embodiment of the present invention, after the arm is extracted, the next task is to find the fingertip position in the picture. There are many methods of finding the fingertip, for example computing the approximate K-curvature of the contour and taking the fingertip as the point where the K-curvature reaches an extremum. The embodiment of the present invention finds the fingertip by the following steps:
Step S1031: find the largest contour and fill it, obtaining a noise-free arm foreground image;
Step S1032: calculate the convex hull of the contour;
Step S1033: calculate the position of the arm's center of gravity, and find the several candidate points of maximum curvature on the convex hull;
Step S1034: take the candidate point farthest from the center of gravity as the fingertip.
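Steps S1031–S1034 reduce, in essence, to computing a convex hull and taking the hull vertex farthest from a reference point. The sketch below omits the curvature-based candidate filtering of step S1033 and uses the centroid of the contour points as a stand-in for the arm's center of gravity — both simplifications are assumptions of this sketch:

```python
def convex_hull(points):
    """Andrew's monotone-chain convex hull of 2-D points (x, y)."""
    def cross(o, a, b):
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts
    def half(seq):
        h = []
        for p in seq:
            while len(h) >= 2 and cross(h[-2], h[-1], p) <= 0:
                h.pop()            # drop points that make a right turn
            h.append(p)
        return h
    # lower hull + upper hull, each without its duplicated endpoint
    return half(pts)[:-1] + half(reversed(pts))[:-1]

def fingertip(contour_points):
    """Pick the hull vertex farthest from the contour centroid --
    the 'farthest candidate point' rule of step S1034."""
    cx = sum(p[0] for p in contour_points) / len(contour_points)
    cy = sum(p[1] for p in contour_points) / len(contour_points)
    hull = convex_hull(contour_points)
    return max(hull, key=lambda p: (p[0] - cx) ** 2 + (p[1] - cy) ** 2)
```

With an extended finger, the fingertip is the contour point that protrudes farthest from the mass of the arm, which is why the farthest-hull-vertex rule works.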
In step S104, the distance from the fingertip to the projection screen is calculated according to the three-dimensional position of the fingertip on the image;
In step S105, whether the finger touches the projection screen is judged according to the calculated distance;
In step S106, if it is judged that the finger touches the projection screen, the mouse cursor is positioned according to the calculated fingertip position and touch-screen input is simulated.
In the embodiment of the present invention, after the fingertip position is found, the next task is to calculate the three-dimensional space coordinates of the fingertip so as to decide whether a mouse click event has occurred. According to the principles of stereo geometry, the first work to be done is the calibration of the projector and the camera. An accurate and simple calibration process is the key to three-dimensional measurement with the active vision system formed by the projector and the camera. Camera calibration methods are already mature; the embodiment of the present invention adopts Zhang Zhengyou's calibration method, which requires only a planar chessboard. The remaining research emphasis is how to calibrate the projector. The imaging of a projector, like that of a camera, can be described with the pinhole model, so it can likewise be represented by intrinsic and extrinsic parameters. Regarding the projector imaging process as the inverse of camera imaging, the projector can be calibrated by the camera calibration method.
As long as the correspondence between three-dimensional coordinate points and the two-dimensional points of the projected picture is found, the intrinsic and extrinsic parameters of the projector can be solved.
The projector is calibrated with the following steps:
1) calibrate the camera;
2) prepare a whiteboard with a paper chessboard pasted on it;
3) control the projector to project a chessboard onto the whiteboard;
4) extract the corners of the two chessboards respectively;
5) calculate the plane of the whiteboard from the paper chessboard corners;
6) use the calibrated camera to calculate the three-dimensional coordinates of the projected chessboard corners;
7) calculate the intrinsic and extrinsic parameters of the projector by combining the three-dimensional corner coordinates with the projector's original projected image.
After the parameters of the projector and the camera have been solved, the three-dimensional position of the fingertip can be calculated by triangulation; Fig. 4 is a schematic diagram of triangulation. The Z value is easily derived from similar triangles; from the figure:
(T − (x_l − x_r)) / (Z − f) = T / Z  ⇒  Z = fT / (x_l − x_r)   (4)
Using the principle of triangulation, the distance from the camera to the projection screen can also easily be calculated, and from it the distance of the finger from the screen can be deduced. If the distance of the finger from the screen is below a certain threshold, a click event is considered to have occurred. Through the fingertip position on the screen and the earlier geometric calibration, the mouse cursor can be positioned at the fingertip and a mouse click event simulated, realizing human-computer interaction and thereby achieving the purpose of turning any projection plane into a touch screen.
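Formula (4) and the click test above can be sketched as follows. The threshold value and the sign convention (the fingertip lies between the camera and the screen plane, so its depth is smaller) are assumptions of this sketch, not values from the patent:

```python
def depth_from_disparity(f, T, xl, xr):
    """Eq. (4): Z = f*T / (xl - xr) for a rectified projector-camera
    pair with focal length f and baseline T; xl, xr are the matched
    image x-coordinates in the two views."""
    return f * T / (xl - xr)

def is_click(finger_z, screen_z, threshold=5.0):
    """Contact test: a click occurs when the fingertip's distance to
    the screen plane falls below the threshold (arbitrary units here)."""
    return (screen_z - finger_z) < threshold
```

Both depths would be obtained from formula (4); comparing them replaces any absolute depth measurement of the screen.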
Referring to Fig. 5, the structure of the projection interaction system based on fingertip localization provided by the embodiment of the present invention is shown; for convenience of explanation, only the parts relevant to the embodiment are illustrated. The system comprises: a receiving module 101, a hand analysis module 102, a position calculation module 103, a distance calculation module 104, a judging module 105 and a simulation module 106. The system may be a software unit built into the computing device, a hardware unit, or a unit combining software and hardware.
The receiving module 101 receives the picture of the projection screen captured by the camera;
the hand analysis module 102 analyzes, by a computer vision method, whether a human hand is present in the image;
the position calculation module 103 further calculates, by triangulation, the three-dimensional position of the fingertip of the hand on the image if a hand has been detected;
the distance calculation module 104 calculates the distance from the fingertip to the projection screen according to the three-dimensional position of the fingertip;
the judging module 105 judges, according to the calculated distance, whether the finger touches the projection screen;
the simulation module 106 positions the mouse cursor according to the calculated fingertip position and simulates touch-screen input if it is judged that the finger touches the projection screen.
As one embodiment of the present invention, the hand analysis module 102 specifically includes an acquisition module, a comparison module and a hand judging module.
The acquisition module obtains the picture projected onto the projection screen as output by the projector;
the comparison module compares the picture output by the projector with the picture of the projection screen captured by the camera;
the hand judging module, if the comparison shows a difference, combines hand feature information to judge whether a hand is present in the picture of the projection screen captured by the camera.
As another preferred embodiment of the present invention, the system also includes a locating module, used for locating the fingertip position.
As another preferred embodiment of the present invention, the system also includes a filling module, a convex hull calculation module, a candidate point determination module and a fingertip locating module.
The filling module finds the largest contour and fills it;
the convex hull calculation module calculates the convex hull of the contour;
the candidate point determination module calculates the position of the arm's center of gravity and finds the several candidate points of maximum curvature on the convex hull;
the fingertip locating module takes the candidate point farthest from the center of gravity as the fingertip.
In summary, the embodiment of the present invention first extracts the hand from a complex background. Extracting the hand from a complex background means extracting the corresponding hand portion from the whole image; it involves two problems, image segmentation and hand-region judgment. Image segmentation generally belongs to low-level feature extraction and mainly exploits the geometric information, color information and motion information of the hand; the geometric information includes the shape and contour of the hand, and the motion information refers to the movement trajectory of the hand. Accurate extraction of the hand region lays the foundation for accurately locating the fingertip, and can generally be realized by methods such as gray-level co-occurrence matrices, edge-detection operators and frame differencing. In the embodiment of the present invention, in order to remove the influence of projection illumination, the method of image prediction is used during arm extraction to separate foreground from background: the computer knows the content of the image the projector is projecting, and by establishing the correspondence of geometric position and of color space it can estimate the content of the picture the camera reads; by comparing the predicted image with the actually read image and analyzing the difference, and then using information such as the shape, contour and color of the hand, the position of the hand is found.
Afterwards, the fingertip position is accurately located in the obtained hand foreground image. There are many methods of finding the fingertip, all operating on the separated hand foreground image — for example edge analysis, the circular Hough transform and special marking methods; the embodiment of the present invention locates the fingertip by the curvature method.
Finally, the depth information of the finger is calculated to judge whether it contacts the projection screen and to generate a click event. In the embodiment of the present invention, the principle of active vision is used to calculate the depth of the finger and of the screen respectively; by comparing the two values, whether the finger touches the screen is determined, and hence whether a click event occurs.
The embodiment of the present invention solves the problems in existing projection interaction systems that, owing to the illumination of the projector, the arm may take on different colors and the projected picture may itself contain a hand — both of which make segmenting the arm difficult and cause detection errors — and that the prior art provides no judgment of whether the finger touches the screen, so that trajectory-based interaction inevitably suffers a time delay that limits its application. The present invention meets the real-time requirement of the interaction process.
One of ordinary skill in the art will appreciate that all or part of the steps of the methods in the above embodiments may be completed by a program instructing relevant hardware; the program may be stored in a computer-readable storage medium, such as ROM/RAM, a magnetic disk or an optical disc.
The foregoing is only a preferred embodiment of the present invention and is not intended to limit the present invention; any modification, equivalent substitution and improvement made within the spirit and principles of the present invention shall be included within the protection scope of the present invention.

Claims (7)

1. A projection interaction method based on fingertip localization, characterized in that the method comprises the following steps:
a computing device receives a picture of the projection screen captured by a camera;
analyzing, by a computer vision method, whether a human hand is present in the image;
if a hand has been detected, further calculating, by triangulation, the three-dimensional position of the fingertip of the hand on the image;
calculating the distance from the fingertip to the projection screen according to the three-dimensional position of the fingertip;
judging, according to the calculated distance, whether the finger touches the projection screen;
if it is judged that the finger touches the projection screen, positioning the mouse cursor according to the calculated fingertip position and simulating touch-screen input;
wherein the step of analyzing, by a computer vision method, whether a human hand is present in the image is specifically:
obtaining the picture projected onto the projection screen as output by the projector;
comparing the picture output by the projector with the picture of the projection screen captured by the camera;
if the comparison shows a difference, combining hand feature information to judge whether a hand is present in the picture of the projection screen captured by the camera;
Where it is assumed that have in computer picture a bit, (x, y), after projector imaging, brightness becomes P, and the color of video camera turns Shifting relation C represents, the reflectance A of screen represents, the pixel value I on the image of video camera imaging represents, then just like ShiShimonoseki is tied to form vertical:
I=C × A × P (1);
The reflectance A' of staff represents, the pixel value I' of the actual reading of video camera represents then there is I'=C × A' × P, uses a =A'/A represents the change of reflectance;
(x, RGB y) tri-Color Channels are at the reflectance of diverse location to calculate each pixel by following (2) formula Change;
a [ x , y , c ] = A ′ A = I ′ [ x , y , c ] I [ x , y , c ] - - - ( 2 ) ;
Assuming that the noise of reflectance A obeys an average is zero, and variance is the Gauss distribution of V/I, uses following decision rules (x, y) in arm regions to judge a pixel;
1 - a [ x , y , R ] + a [ x , y , G ] + a [ x , y , B ] 3 > V [ x , y , R ] + V [ x , y , G ] + V [ x , y , B ] I [ x , y , R ] + I [ x , y , G ] + I [ x , y , B ] ;
Wherein, the RGB channel pixel value of each image variance in each fritter is V.
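The reflectance-ratio test of equation (2) and the decision rule in claim 1 can be sketched in NumPy as follows. This is a minimal illustration, not the claimed implementation: the function name `hand_mask`, the `eps` division guard, and the array shapes are assumptions added for the example.

```python
import numpy as np

def hand_mask(projected, captured, variance, eps=1e-6):
    """Per-pixel reflectance-ratio test (a sketch of claim 1's rule).

    projected : H x W x 3 float array, expected camera image I of the
                undisturbed projection (I = C * A * P).
    captured  : H x W x 3 float array, pixel values I' actually read.
    variance  : H x W x 3 float array, per-channel noise variance V
                estimated over small image blocks.
    Returns a boolean mask, True where the reflectance change suggests
    the arm/hand rather than the bare screen.
    """
    # a[x, y, c] = I'[x, y, c] / I[x, y, c]   -- equation (2)
    a = captured / (projected + eps)
    # left-hand side: 1 minus the mean of the three channel ratios
    lhs = 1.0 - a.mean(axis=2)
    # right-hand side: summed channel variances over summed intensities
    rhs = variance.sum(axis=2) / (projected.sum(axis=2) + eps)
    return lhs > rhs
```

A pixel covered by a hand reflects the projector light differently from the screen, so its channel ratios drop below 1 and the left-hand side exceeds the noise-derived threshold.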
2. The method of claim 1, characterised in that, before the step of calculating the three-dimensional position of the fingertip of the human hand in the image, the method further comprises:
locating the fingertip position.
3. The method of claim 2, characterised in that the step of locating the fingertip position specifically comprises:
finding the largest contour and filling it;
computing the convex hull of the contour;
computing the centre of gravity of the arm and finding several maximum-curvature candidate points on the convex hull;
taking the candidate point farthest from the centre of gravity as the fingertip.
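The final selection step of claim 3 (farthest candidate from the centre of gravity wins) can be sketched as below. This is a simplification under stated assumptions: instead of the claimed contour-filling, convex-hull and curvature steps, every foreground pixel of a given segmentation mask is treated as a candidate; `locate_fingertip` and the mask input are illustrative, not part of the patent.

```python
import numpy as np

def locate_fingertip(mask):
    """Simplified sketch of claim 3's fingertip selection rule.

    mask : 2-D boolean array marking the segmented arm/hand region.
    The claimed method fills the largest contour, takes its convex hull
    and keeps high-curvature hull points as candidates; here every
    foreground pixel is a candidate, and the one farthest from the
    region's centre of gravity is returned as the fingertip (row, col).
    """
    ys, xs = np.nonzero(mask)
    if ys.size == 0:
        return None                         # no hand region found
    cy, cx = ys.mean(), xs.mean()           # arm centre of gravity
    d2 = (ys - cy) ** 2 + (xs - cx) ** 2    # squared distance to centroid
    i = int(np.argmax(d2))                  # farthest candidate wins
    return int(ys[i]), int(xs[i])
```

Because an extended finger protrudes from the palm, its tip is the region point farthest from the centre of gravity, which is why the distance criterion suffices once candidates are restricted to the hull.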
4. A projection interaction system based on fingertip localisation, characterised in that the system comprises:
a receiving module, configured to receive an image of the projection screen captured by a camera;
a hand analysis module, configured to analyse, by computer vision methods, whether a human hand is present in the image;
a position calculation module, configured to, if a human hand is detected, calculate the three-dimensional position of the fingertip of the hand in the image using the method of triangulation;
a distance calculation module, configured to calculate the distance from the fingertip to the projection screen according to the three-dimensional position of the fingertip in the image;
a judgment module, configured to judge whether the finger touches the projection screen according to the calculated distance;
a simulation module, configured to, if it is judged that the finger touches the projection screen, locate the mouse position according to the calculated fingertip position, thereby simulating touch-screen input;
wherein the hand analysis module specifically comprises an obtaining module, a comparison module and a hand judgment module;
the obtaining module is configured to obtain the image output by the projector and projected onto the projection screen;
the comparison module is configured to compare the image output by the projector with the image of the projection screen captured by the camera;
the hand judgment module is configured to, if the comparison shows a difference, judge, in combination with human-hand feature information, whether a human hand is present in the captured image of the projection screen;
wherein, assuming a point (x, y) in the computer image whose brightness becomes P after projector imaging, the colour transfer relation of the camera is denoted C, the reflectance of the screen is denoted A, and the pixel value in the image formed by the camera is denoted I, the following relation holds:
I = C × A × P (1);
denoting the reflectance of the human hand by A' and the pixel value actually read by the camera by I', then I' = C × A' × P, and a = A'/A represents the change in reflectance;
calculating, by the following equation (2), the change in reflectance at different positions for the three RGB colour channels of each pixel (x, y):
a[x, y, c] = A'/A = I'[x, y, c] / I[x, y, c] (2);
assuming that the noise of the reflectance A obeys a Gaussian distribution with zero mean and variance V/I, the following decision rule is used to judge whether a pixel (x, y) lies in the arm region:
1 − (a[x, y, R] + a[x, y, G] + a[x, y, B]) / 3 > (V[x, y, R] + V[x, y, G] + V[x, y, B]) / (I[x, y, R] + I[x, y, G] + I[x, y, B]);
wherein V is the variance of the RGB-channel pixel values of each image within each small block.
5. The system of claim 4, characterised in that the system further comprises:
a locating module, configured to locate the fingertip position.
6. The system of claim 5, characterised in that the system further comprises:
a filling module, configured to find the largest contour and fill it;
a convex hull calculation module, configured to compute the convex hull of the contour;
a candidate point determination module, configured to compute the centre of gravity of the arm and find several maximum-curvature candidate points on the convex hull;
a fingertip locating module, configured to take the candidate point farthest from the centre of gravity as the fingertip.
7. A computing device comprising the projection interaction system based on fingertip localisation according to any one of claims 4 to 6.
CN201310284483.8A 2013-07-08 2013-07-08 A kind of projection interactive method based on finger tip location, system and the equipment of calculating Active CN103383731B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201310284483.8A CN103383731B (en) 2013-07-08 2013-07-08 A kind of projection interactive method based on finger tip location, system and the equipment of calculating

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201310284483.8A CN103383731B (en) 2013-07-08 2013-07-08 A kind of projection interactive method based on finger tip location, system and the equipment of calculating

Publications (2)

Publication Number Publication Date
CN103383731A CN103383731A (en) 2013-11-06
CN103383731B true CN103383731B (en) 2016-12-28

Family

ID=49491517

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201310284483.8A Active CN103383731B (en) 2013-07-08 2013-07-08 A kind of projection interactive method based on finger tip location, system and the equipment of calculating

Country Status (1)

Country Link
CN (1) CN103383731B (en)

Families Citing this family (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103824282B (en) * 2013-12-11 2017-08-08 香港应用科技研究院有限公司 Touch and motion detection using surface mapping figure, shadow of object and camera
CN104750286B (en) * 2013-12-26 2017-12-29 联想(北京)有限公司 A kind of data capture method and electronic equipment
CN103793060B (en) * 2014-02-14 2017-07-28 杨智 A kind of user interactive system and method
CN104978012B (en) * 2014-04-03 2018-03-16 华为技术有限公司 One kind points to exchange method, apparatus and system
CN104778460B (en) * 2015-04-23 2018-05-04 福州大学 A kind of monocular gesture identification method under complex background and illumination
JP6607121B2 (en) * 2016-03-30 2019-11-20 セイコーエプソン株式会社 Image recognition apparatus, image recognition method, and image recognition unit
CN105854290A (en) * 2016-03-31 2016-08-17 湖南快玩网络科技有限公司 Software implementation method for three-dimensional go
KR20190075096A (en) * 2016-10-21 2019-06-28 트룸프 베르크초이그마쉬넨 게엠베하 + 코. 카게 Manufacturing control based on internal personal tracking in the metalworking industry
CN107515714B (en) * 2017-07-27 2020-08-28 歌尔股份有限公司 Finger touch identification method and device and touch projection equipment
CN109799928B (en) * 2017-11-16 2022-06-17 清华大学深圳研究生院 Method and system for acquiring user finger parameters in projection touch panel
CN108762660B (en) 2018-05-29 2021-03-23 京东方科技集团股份有限公司 Floating display device and method for indicating touch position of floating display device
CN108920088A (en) * 2018-07-18 2018-11-30 成都信息工程大学 A kind of desktop projection exchange method and system based on every empty touch operation
CN110471576B (en) * 2018-08-16 2023-11-17 中山叶浪智能科技有限责任公司 Single-camera near-screen touch method, system, platform and storage medium
CN110471575A (en) * 2018-08-17 2019-11-19 中山叶浪智能科技有限责任公司 A kind of touch control method based on dual camera, system, platform and storage medium
CN110471577B (en) * 2018-08-17 2023-08-22 中山叶浪智能科技有限责任公司 360-degree omnibearing virtual touch control method, system, platform and storage medium
CN112445326B (en) * 2019-09-03 2023-04-07 浙江舜宇智能光学技术有限公司 Projection interaction method based on TOF camera, system thereof and electronic equipment
CN111860142A (en) * 2020-06-10 2020-10-30 南京翱翔信息物理融合创新研究院有限公司 Projection enhancement oriented gesture interaction method based on machine vision
CN113095243B (en) * 2021-04-16 2022-02-15 推想医疗科技股份有限公司 Mouse control method and device, computer equipment and medium

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101038671A (en) * 2007-04-25 2007-09-19 上海大学 Tracking method of three-dimensional finger motion locus based on stereo vision
CN101694694A (en) * 2009-10-22 2010-04-14 上海交通大学 Finger identification method used in interactive demonstration system

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101038671A (en) * 2007-04-25 2007-09-19 上海大学 Tracking method of three-dimensional finger motion locus based on stereo vision
CN101694694A (en) * 2009-10-22 2010-04-14 上海交通大学 Finger identification method used in interactive demonstration system

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
A Novel Projector-Camera Interaction System with the Fingertip;王群等;《Journal of Image and Graphic》;20130630;第1卷(第2期);第80页Ⅰ节第2段,第81页Ⅱ节第1段,第82页Ⅲ节第1段、Ⅳ节第1-3段 *
Hand gesture based user interface for computer using a camera and projector;Syed Akhlaq Hussain Shah等;《2011 IEEE International Conference on Signal and Image Processing Applications (ICSIPA2011)》;20111118;第171页第2段 *

Also Published As

Publication number Publication date
CN103383731A (en) 2013-11-06

Similar Documents

Publication Publication Date Title
CN103383731B (en) A kind of projection interactive method based on finger tip location, system and the equipment of calculating
CN102508574B (en) Projection-screen-based multi-touch detection method and multi-touch system
CN100407798C (en) Three-dimensional geometric mode building system and method
CN103226387B (en) Video fingertip localization method based on Kinect
CN102402680B (en) Hand and indication point positioning method and gesture confirming method in man-machine interactive system
US9405182B2 (en) Image processing device and image processing method
CN102902355B (en) The space interaction method of mobile device
CN109816704A (en) The 3 D information obtaining method and device of object
CN103677274B (en) A kind of interaction method and system based on active vision
CN102591533B (en) Multipoint touch screen system realizing method and device based on computer vision technology
KR101719088B1 (en) Method for partitioning area, and inspection device
CN104838337A (en) Touchless input for a user interface
CN103577322B (en) A kind of hit testing method and apparatus
CN104317391A (en) Stereoscopic vision-based three-dimensional palm posture recognition interactive method and system
US8948493B2 (en) Method and electronic device for object recognition, and method for acquiring depth information of an object
CN110276293A (en) Method for detecting lane lines, device, electronic equipment and storage medium
US10165168B2 (en) Model-based classification of ambiguous depth image data
CN103598870A (en) Optometry method based on depth-image gesture recognition
CN110263713A (en) Method for detecting lane lines, device, electronic equipment and storage medium
CN103105924A (en) Man-machine interaction method and device
CN113516113A (en) Image content identification method, device, equipment and storage medium
CN111400423B (en) Smart city CIM three-dimensional vehicle pose modeling system based on multi-view geometry
CN107240104A (en) Point cloud data segmentation method and terminal
CN103761011B (en) A kind of method of virtual touch screen, system and the equipment of calculating
CN106611165B (en) A kind of automotive window detection method and device based on correlation filtering and color-match

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant