CN106774846B - Interactive projection method and device - Google Patents
- Publication number
- CN106774846B CN106774846B CN201611050637.7A CN201611050637A CN106774846B CN 106774846 B CN106774846 B CN 106774846B CN 201611050637 A CN201611050637 A CN 201611050637A CN 106774846 B CN106774846 B CN 106774846B
- Authority
- CN
- China
- Prior art keywords
- image
- projection
- video camera
- finger tip
- hand
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
Abstract
The present invention is applicable to the field of image processing and provides an interactive projection method and device. The method includes: obtaining a first projected image by prediction from the projection picture, and extracting the hand foreground and the hand shadow from the first projected image and a second projected image of the projection picture captured by a camera; obtaining the positions of the fingertip and the fingertip shadow from the hand foreground and the hand shadow; and performing touch determination according to the positions of the fingertip and the fingertip shadow. The method extracts interactive hand motion and performs touch determination with a single ordinary camera. The device is simple in structure, easy to operate, and requires no expensive depth camera or high-speed camera, so low-cost bare-hand human-computer interaction can be achieved.
Description
Technical field
The present invention belongs to the field of image processing, and more particularly relates to an interactive projection method and device.
Background art
A projection-camera touch system projects a computer's display picture onto an arbitrary plane and lets the user manipulate the computer by hand. It offers the user a vivid and natural way to interact with the computer using the fingers, such as clicking a virtual keyboard, dragging objects, opening files, or flipping through web pages.
Existing projection-camera touch systems come in many varieties, for example depth-camera projection interaction, structured-light projection interaction, and infrared projection interaction. Depth-camera projection interaction uses a depth camera to segment the hand region from the surrounding scene for gesture recognition. Structured-light projection interaction is realized by an adaptive coding method that projects patterns locally around the fingertip, and it requires a high-speed camera. The prior art therefore realizes projection-camera touch systems with complex equipment structures and at high implementation cost.
Summary of the invention
In view of this, embodiments of the present invention provide an interactive projection method and device, to solve the prior-art problems of complex equipment structure and high implementation cost.
A first aspect of the embodiments of the present invention provides an interactive projection method, comprising:
obtaining a first projected image by prediction from the projection picture, and extracting the hand foreground and the hand shadow from the first projected image and a second projected image of the projection picture captured by the camera;
obtaining the positions of the fingertip and the fingertip shadow from the hand foreground and the hand shadow;
performing touch determination according to the positions of the fingertip and the fingertip shadow.
A second aspect of the embodiments of the present invention provides an interactive projection device, comprising:
an extraction module, configured to obtain a first projected image by prediction from the projection picture, and to extract the hand foreground and the hand shadow from the first projected image and a second projected image of the projection picture captured by the camera;
a position acquisition module, configured to obtain the positions of the fingertip and the fingertip shadow from the hand foreground and the hand shadow;
a determination module, configured to perform touch determination according to the positions of the fingertip and the fingertip shadow.
Compared with the prior art, the embodiments of the present invention have the following beneficial effects: a first projected image is obtained by prediction from the projection picture; the hand foreground and the hand shadow are extracted from the first projected image and a second projected image of the projection picture captured by the camera; the positions of the fingertip and the fingertip shadow are obtained from the hand foreground and the hand shadow; and touch determination is performed according to those positions. The embodiments extract interactive hand motion and perform touch determination with a single ordinary camera. The equipment is simple in structure and easy to operate, and no expensive depth camera or high-speed camera is needed, so low-cost bare-hand human-computer interaction can be achieved.
Brief description of the drawings
To describe the technical solutions in the embodiments of the present invention more clearly, the accompanying drawings needed for the embodiments or the description of the prior art are briefly introduced below. Obviously, the drawings in the following description are only some embodiments of the present invention; those of ordinary skill in the art may derive other drawings from them without creative effort.
Fig. 1 is a flowchart of the interactive projection method provided by Embodiment 1 of the present invention;
Fig. 2 is a flowchart of obtaining the first projected image by prediction from the projection picture, provided by Embodiment 1 of the present invention;
Fig. 3 is a flowchart of performing geometric correction on the camera, provided by Embodiment 1 of the present invention;
Fig. 4 is a flowchart of obtaining the fingertip position, provided by Embodiment 1 of the present invention;
Fig. 5 is a structural block diagram of the interactive projection device provided by Embodiment 2 of the present invention;
Fig. 6 is a structural block diagram of the extraction module provided by Embodiment 2 of the present invention;
Fig. 7 is a structural block diagram of the determination module provided by Embodiment 2 of the present invention.
Specific embodiment
In the following description, specific details such as particular system structures and techniques are set forth for illustration rather than limitation, so that the embodiments of the present invention can be thoroughly understood. However, it will be clear to those skilled in the art that the present invention may also be implemented in other embodiments without these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so that unnecessary detail does not obscure the description of the invention.
To illustrate the technical solutions of the present invention, specific embodiments are described below.
Embodiment 1:
Fig. 1 shows the implementation flow of the interactive projection method provided by Embodiment 1 of the present invention, detailed as follows:
Step S101: obtain a first projected image by prediction from the projection picture, and extract the hand foreground and the hand shadow from the first projected image and a second projected image of the projection picture captured by the camera.
Here, the projection picture is the picture that is about to be projected onto the projection screen. The first projected image can be predicted from the projection picture; predicting it requires geometric correction and chromaticity correction of the camera. Referring to Fig. 2, in one embodiment, obtaining the first projected image by prediction from the projection picture may specifically include the following steps:
Step S201: perform geometric correction on the camera, to obtain the positional relationship between corresponding pixels of the camera picture and the projection picture.
Referring to Fig. 3, in one embodiment, the geometric correction of the camera described in step S201 may specifically be realized by the following process:
Step S301: load a checkerboard image and project the checkerboard image onto the projection screen.
For convenience of later description, the checkerboard corners on the input image projected onto the projection screen are here called the first corners. In this embodiment, the checkerboard image may be loaded into the projector and projected onto the projection screen by the projector, but the invention is not limited thereto.
Step S302: capture the projection screen image with the camera, and detect the second checkerboard corners on the projection screen image captured by the camera.
It should be understood that after the checkerboard image has been projected onto the projection screen in step S301, the projection screen image can be captured by the camera. After the projection screen image has been captured, the second checkerboard corners are detected on the captured image.
In this embodiment, the method for detecting the second checkerboard corners on the captured projection screen image may include:
Convert the RGB image into a grayscale image, and detect the edges of the grayscale image.
Detect the straight lines of the edge image by the Hough transform. The Hough transform is a parameter-estimation technique based on a voting principle. Its principle is to use the point-line duality between image space and Hough parameter space, converting a detection problem in image space into a problem in parameter space. Lines are detected by performing simple cumulative statistics in the parameter space and then searching for accumulator peaks in the Hough parameter space. The essence of the Hough transform is to cluster pixels in image space that share a certain relationship, finding the accumulation points in parameter space that connect these pixels through some analytic form.
Fit a quadrilateral from the detected lines; the corners of the fitted quadrilateral are the second corners. In this embodiment, an N-sided polygon may also be fitted from the projection screen image, where N is an integer greater than or equal to 3; that is, a triangle, quadrilateral, pentagon, and so on can be fitted according to the projection screen image and actual needs. Under normal conditions the projection screen image is a quadrilateral, so in this embodiment a quadrilateral is fitted from the detected lines, and the vertices of the fitted quadrilateral are the second corners described in step S302.
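The edge-to-line step above can be sketched with a minimal Hough accumulator in the (rho, theta) parameterization. This is an illustrative NumPy sketch, not the patent's implementation; the synthetic edge image below stands in for a real camera frame.

```python
import numpy as np

def hough_strongest_line(edge, n_theta=180):
    """Vote each edge pixel into a (rho, theta) accumulator and return the
    (rho, theta) of the accumulator peak, i.e. the strongest line."""
    h, w = edge.shape
    diag = int(np.ceil(np.hypot(h, w)))
    thetas = np.deg2rad(np.arange(n_theta))       # angles 0..179 degrees
    acc = np.zeros((2 * diag + 1, n_theta), dtype=int)
    ys, xs = np.nonzero(edge)
    for x, y in zip(xs, ys):
        # rho = x*cos(theta) + y*sin(theta), shifted so indices are >= 0
        rhos = np.round(x * np.cos(thetas) + y * np.sin(thetas)).astype(int) + diag
        acc[rhos, np.arange(n_theta)] += 1
    r_idx, t_idx = np.unravel_index(acc.argmax(), acc.shape)
    return r_idx - diag, thetas[t_idx]

# Synthetic edge image containing a single vertical line at x = 10.
edge = np.zeros((40, 40), dtype=np.uint8)
edge[:, 10] = 1
rho, theta = hough_strongest_line(edge)
```

For the vertical line the peak lands at theta = 0 and rho = 10; in the patent's pipeline the four strongest peaks would then be intersected to fit the quadrilateral.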
Step S303: obtain the homography matrix between the projection screen image and the projection input image according to the positional relationship between the first corners and the second corners.
In this embodiment, a homography matrix that reflects the positional relationship between the projection screen image and the projection input image, that is, between the projected picture and the image captured by the camera, is established from the positional relationship of the first corners and the second corners. Optionally, the homography matrix is a 3 x 3 homography matrix, but the invention is not limited thereto.
Step S304: correct the camera by the homography matrix.
In this step, geometric correction is applied to the image captured by the camera through the homography matrix calculated in step S303, which reflects the positional relationship between the projection screen image and the projection input image, so as to obtain the positional relationship between corresponding pixels of the camera picture and the input projection picture.
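Given the four first-corner/second-corner pairs, the 3 x 3 homography can be estimated by the standard direct linear transform (DLT). The sketch below uses made-up corner coordinates and is one common way to compute such a matrix; it is not taken verbatim from the patent.

```python
import numpy as np

def find_homography(src, dst):
    """Estimate the 3x3 homography H mapping src[i] -> dst[i] (>= 4 points)
    by the direct linear transform: stack two linear equations per
    correspondence and take the null space of the system via SVD."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    _, _, vt = np.linalg.svd(np.array(A, dtype=float))
    H = vt[-1].reshape(3, 3)
    return H / H[2, 2]                    # normalize so that H[2,2] = 1

def warp_point(H, x, y):
    """Apply H to a point in homogeneous coordinates."""
    p = H @ np.array([x, y, 1.0])
    return p[0] / p[2], p[1] / p[2]

# Hypothetical corners: projector input corners -> corners seen by the camera.
first_corners = [(0, 0), (640, 0), (640, 480), (0, 480)]
second_corners = [(32, 21), (610, 40), (590, 455), (15, 470)]
H = find_homography(first_corners, second_corners)
```

With exactly four correspondences the homography reproduces each corner pair exactly; warping every camera pixel through the inverse of H realizes the geometric correction of step S304.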
Step S202: capture the projection picture using the geometrically corrected camera.
In this step, after the projection picture has been projected onto the projection screen, it is captured using the camera geometrically corrected in step S201, yielding a projected image.
Step S203: perform chromaticity correction on the projection picture captured by the geometrically corrected camera, to obtain the first projected image.
In this step, chromaticity correction needs to be applied to the projection picture captured by the camera. Optionally, the specific process of performing chromaticity correction on the projection picture captured by the geometrically corrected camera may be:
Apply the chromaticity correction model
P = A(VI + F)
to the projection picture captured by the geometrically corrected camera, where:
matrix P is the pixels of the predicted projected image, i.e. the pixels of the first projected image; matrix I is the pixels of the projection input image, i.e. the pixels of the projection picture; matrix A is the reflectivity of the projection screen onto which the projection picture is projected; vector F is the contribution of ambient light; and matrix V is the color mixing matrix, characterizing the correlation between the color channels. In this embodiment, AF is calibrated by projecting a pure black picture, and AV is calibrated by projecting pure red, pure green, and pure blue pictures.
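The model P = A(VI + F) and its black/red/green/blue calibration can be sketched per pixel as follows. The numeric values of A, V, and F are invented for illustration; in practice only the calibrated products AF and AV are needed to predict a frame.

```python
import numpy as np

# Hypothetical per-pixel parameters (RGB vectors and a 3x3 mixing matrix).
A = 0.85                                   # screen reflectivity at this pixel
V = np.array([[1.00, 0.05, 0.02],          # color mixing: channel cross-talk
              [0.04, 1.00, 0.03],
              [0.02, 0.06, 1.00]])
F = np.array([0.10, 0.12, 0.08])           # ambient light contribution

def predict(I):
    """Chromaticity correction model of the embodiment: P = A(VI + F)."""
    return A * (V @ I + F)

# Calibration: a black frame yields AF; pure red/green/blue frames yield
# the columns of AV (after subtracting the black response).
AF = predict(np.zeros(3))
AV = np.column_stack([predict(np.eye(3)[:, k]) - AF for k in range(3)])

# Any input picture I can now be predicted from AV and AF alone.
I = np.array([0.3, 0.6, 0.1])
P_pred = AV @ I + AF
```

Because the model is affine in I, the prediction AV @ I + AF agrees exactly with A(VI + F), which is why the four calibration frames suffice.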
After the predicted image (i.e. the first projected image) has been obtained, the hand foreground and the hand shadow can be extracted from the difference between the predicted image and the image captured by the camera. Specifically, if the picture is projected directly onto the screen, the image I captured by the camera should be identical to the predicted image P; if a hand is in front of the projection screen, the reflectivity of the projected picture on the screen changes. The reflectivity change of pixel [x, y] can be measured by the reflectivity ratio a[x, y]:

a[x, y] = I_g[x, y] / P_g[x, y]

where I_g is the grayscale image of image I, I_g[x, y] is the gray value of pixel [x, y] in I_g, P_g is the grayscale image of image P, and P_g[x, y] is the gray value of pixel [x, y] in P_g.
Without occlusion, the reflectivity a[x, y] of the projection screen should be close to 1. If pixel [x, y] satisfies a[x, y] < 1 - s or a[x, y] > 1 + s, the pixel belongs to the hand foreground or the hand shadow, where s is the reflectivity tolerance, usually valued between 0.5 and 0.8. If pixel [x, y] belongs to neither the hand foreground nor the hand shadow, the gray level of pixel [x, y] is set to 255.
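The reflectivity test above reduces to a per-pixel ratio and threshold. The sketch below applies it to synthetic grayscale frames; the frame contents and the tolerance s = 0.5 are illustrative only.

```python
import numpy as np

def hand_mask(I_g, P_g, s=0.5):
    """Mark pixels whose reflectivity ratio a = I_g / P_g deviates from 1
    by more than the tolerance s (a < 1 - s or a > 1 + s) as belonging to
    the hand foreground or the hand shadow."""
    a = I_g.astype(float) / np.maximum(P_g.astype(float), 1.0)  # avoid /0
    return (a < 1.0 - s) | (a > 1.0 + s)

# Predicted frame: uniform gray 200. Captured frame: a dark occluded block.
P_g = np.full((8, 8), 200, dtype=np.uint8)
I_g = P_g.copy()
I_g[2:5, 2:5] = 40          # occluded pixels: a = 0.2 < 1 - s
mask = hand_mask(I_g, P_g)
```

The 3 x 3 darkened block is flagged while the unoccluded background, whose ratio stays at 1, is not.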
Step S102: obtain the positions of the fingertip and the fingertip shadow from the hand foreground and the hand shadow.
Here, the hand foreground and hand shadow images have been extracted from the camera-captured image in step S101; in this step, the positional relationship between the fingertip and the fingertip shadow is detected from the hand foreground and hand shadow images, so as to determine whether a touch occurs. In this step, the fingertip foreground and the fingertip shadow should first be separated. A separation threshold is found from the histogram of the extracted image: among the 255 gray levels, the separation threshold is the gray value with the smallest pixel count.
Referring to Fig. 4, in one embodiment, the process of obtaining the fingertip position includes:
Step S401: binarize the extracted hand foreground image, and extract the contour image of the binarized hand foreground image.
Step S402: construct a mask centered on each pixel of the contour image.
Step S403: traverse the contour points of the contour image, and count the number of hand-region pixels contained in each mask.
Step S404: take the contour point whose mask contains the fewest hand-region pixels as the fingertip. Specifically, whenever the number of hand pixels inside the current mask is smaller than the number inside the previous best mask, the current mask replaces the previous one as the preferred candidate; the contour points of the contour image are traversed, and the contour point of the best candidate found by the traversal is taken as the fingertip for touch determination.
This embodiment illustrates only one implementation for obtaining the fingertip position; in other embodiments it can be realized in other ways. For example, the curvature of each pixel of the image contour can be computed to identify the fingertip, taking the point of maximum curvature as the fingertip.
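Steps S401 to S404 amount to sliding a square mask along the contour and keeping the contour point whose neighborhood contains the fewest hand pixels. The sketch below uses a synthetic binary hand with one protruding finger, and a simple 4-neighbor test in place of a full contour extractor; the mask half-size is an assumed parameter.

```python
import numpy as np

def fingertip(hand, half=3):
    """Return the contour point whose (2*half+1)^2 mask contains the fewest
    hand pixels: at a protruding fingertip the hand occupies the smallest
    share of the neighborhood."""
    h, w = hand.shape
    pad = np.pad(hand, half)
    best, best_pt = None, None
    ys, xs = np.nonzero(hand)
    for y, x in zip(ys, xs):
        # Contour point: a hand pixel with at least one empty 4-neighbor.
        nb = [hand[y - 1, x] if y > 0 else 0, hand[y + 1, x] if y < h - 1 else 0,
              hand[y, x - 1] if x > 0 else 0, hand[y, x + 1] if x < w - 1 else 0]
        if min(nb) == 1:
            continue                     # interior pixel, skip
        count = pad[y:y + 2 * half + 1, x:x + 2 * half + 1].sum()
        if best is None or count < best:
            best, best_pt = count, (y, x)
    return best_pt

# Synthetic hand: a palm block with a thin finger sticking up to row 2.
hand = np.zeros((20, 20), dtype=np.uint8)
hand[12:18, 4:16] = 1          # palm
hand[2:12, 9:11] = 1           # finger, tip at row 2, columns 9-10
tip = fingertip(hand)
```

On this synthetic shape the minimum-count contour point is the top of the finger, matching the intuition behind step S404.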
In most cases, when the projection pattern is bright enough, the shadow of the hand can be extracted for interaction. However, when the brightness of the projection input image is insufficient, for example when a black image is projected onto the screen, the hand shadow cannot be observed on the screen. The light therefore needs to be compensated to enhance the hand shadow used for interaction. As one embodiment, compensating the light to enhance the shadow can be realized by the following process:
First, find the position of the fingertip in the image obtained by the camera.
Then, map the fingertip position in the camera-captured image directly into the projected image through the geometric correction relationship.
Finally, generate a square highlight region in the projected image to enhance the shadow of the hand. The center of the highlight region is the fingertip pixel position. For convenience of processing, the gray value of the red channel in the square region is set to the maximum value 255 for brightness enhancement, while the gray values of the green and blue channels in the square region remain unchanged.
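The shadow-enhancement step can be sketched as stamping a red square into the projected RGB frame around the mapped fingertip position; the square half-size is an assumed parameter.

```python
import numpy as np

def enhance_shadow(frame, tip_xy, half=10):
    """Set the red channel to 255 in a square centered on the fingertip
    position (already mapped into projector coordinates), leaving the
    green and blue channels unchanged."""
    x, y = tip_xy
    h, w, _ = frame.shape
    y0, y1 = max(0, y - half), min(h, y + half + 1)
    x0, x1 = max(0, x - half), min(w, x + half + 1)
    out = frame.copy()
    out[y0:y1, x0:x1, 0] = 255        # red channel to maximum brightness
    return out

# Uniform dark frame with a hypothetical fingertip at (x=50, y=40).
frame = np.full((100, 100, 3), 60, dtype=np.uint8)
lit = enhance_shadow(frame, (50, 40))
```

Only the red channel inside the square changes, so the extra light deepens the cast shadow without visibly altering the projected content's hue elsewhere.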
Step S103: perform touch determination according to the positions of the fingertip and the fingertip shadow.
In this embodiment, touch determination is performed by the formula

DT_touch(f) = 1, if (D_tip-shadow ≤ D_threshold and W_shadow ≤ α·W_finger) or (DT_touch(f-1) = 1 and W_shadow ≤ β·W_finger); DT_touch(f) = 0, otherwise

where DT_touch(f) is the touch decision factor of frame f of the video obtained by the camera; D_tip-shadow is the distance between the fingertip and its shadow; D_threshold is the threshold distance between the fingertip and its shadow, here set to D_threshold = W_finger; W_finger is the fingertip width (for convenience of computation, the fingertip width here refers to the width in the horizontal direction); and W_shadow is the horizontal width of the finger shadow. α may take any value between 1 and 1/10, preferably 1/3; β may take any value between 1 and 1/10, preferably 1/4. When DT_touch(f) = 1, the finger is touching the projection display screen; when DT_touch(f) = 0, the finger is not touching the projection screen.
For frame f, if the distance D_tip-shadow between the fingertip and its shadow is less than or equal to the threshold distance D_threshold, and one third of the horizontal finger width W_finger is greater than or equal to the horizontal shadow width W_shadow, a touch action occurs on the projection screen. When the finger is in full contact with the projection screen, for example when the finger slides on the screen, the fingertip merges completely with its shadow; in that case the fingertip should already have touched the screen in the previous frame, and the horizontal shadow width W_shadow is smaller than the horizontal finger width W_finger. Accordingly, W_shadow ≤ β·W_finger together with a touch in the previous frame is set as a second contact judgment condition. If the fingertip and its shadow satisfy neither of the two conditions, it is judged that no touch action occurs on the projection screen.
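The per-frame decision, including the second full-contact condition carried over from the previous frame, can be sketched as follows; α = 1/3 and β = 1/4 are the preferred values named above, and the frame sequence is a made-up tap gesture.

```python
def touch_decision(d_tip_shadow, w_finger, w_shadow, prev_touch,
                   alpha=1 / 3, beta=1 / 4):
    """Touch decision factor DT_touch(f) for one video frame.
    Condition 1: fingertip near its shadow and the shadow narrow enough.
    Condition 2: already touching in the previous frame and the shadow
    still merged with the finger."""
    d_threshold = w_finger                 # threshold set to fingertip width
    cond1 = d_tip_shadow <= d_threshold and w_shadow <= alpha * w_finger
    cond2 = prev_touch == 1 and w_shadow <= beta * w_finger
    return 1 if (cond1 or cond2) else 0

# Frame sequence of a tap: approach, contact, slide (shadow merged), lift.
states = [
    (30.0, 12.0, 10.0, 0),   # far away, wide shadow
    (6.0, 12.0, 3.0, 0),     # close, narrow shadow (condition 1)
    (0.0, 12.0, 2.0, 1),     # sliding, shadow merged (condition 2)
    (25.0, 12.0, 9.0, 1),    # lifted again
]
decisions = [touch_decision(*s) for s in states]
```

The sequence yields 0, 1, 1, 0: the second condition keeps the touch alive while the finger slides and its shadow is hidden behind it.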
The interactive projection method described above obtains a first projected image by prediction from the projection picture, extracts the hand foreground and the hand shadow from the first projected image and a second projected image of the projection picture captured by the camera, obtains the positions of the fingertip and the fingertip shadow from the hand foreground and the hand shadow, and performs touch determination according to those positions. Interactive hand motion extraction and touch determination can thus be realized with a single camera; moreover, thanks to the geometric correction and the chromaticity correction, the method adapts to complex lighting environments with strong robustness, needs no expensive depth camera or high-resolution CCD camera, and achieves low-cost bare-hand human-computer interaction.
It should be understood that the numbering of the steps in the above embodiment does not imply an execution order; the execution order of the processes should be determined by their functions and internal logic, and constitutes no limitation on the implementation of the embodiments of the present invention.
Embodiment 2:
Corresponding to the interactive projection method described in the foregoing embodiment, Fig. 5 shows a structural block diagram of the interactive projection device provided by an embodiment of the present invention. For ease of description, only the parts related to this embodiment are shown.
Referring to Fig. 5, the device includes an extraction module 501, a position acquisition module 502, and a determination module 503, wherein:
the extraction module 501 is configured to obtain a first projected image by prediction from the projection picture, and to extract the hand foreground and the hand shadow from the first projected image and a second projected image of the projection picture captured by the camera;
the position acquisition module 502 is configured to obtain the positions of the fingertip and the fingertip shadow from the hand foreground and the hand shadow;
the determination module 503 is configured to perform touch determination according to the positions of the fingertip and the fingertip shadow.
Referring to Fig. 6, preferably, the extraction module 501 may include a geometric correction unit 601, a chromaticity correction unit 602, and a prediction unit 603, wherein:
the geometric correction unit 601 is configured to perform geometric correction on the camera, so as to obtain the positional relationship between corresponding pixels of the camera picture and the projection picture;
the chromaticity correction unit 602 is configured to capture the projection picture using the geometrically corrected camera;
the prediction unit 603 is configured to perform chromaticity correction on the projection picture captured by the geometrically corrected camera, obtaining the first projected image.
Preferably, the geometric correction unit 601 is specifically configured to:
load a checkerboard image and project the checkerboard image onto the projection screen, where the checkerboard corners on the input image projected onto the projection screen are the first corners;
capture the projection screen image with the camera, and detect the second checkerboard corners on the projection screen image captured by the camera;
obtain the homography matrix between the projection screen image and the input image according to the positional relationship between the first corners and the second corners;
correct the camera by the homography matrix.
Further, the chromaticity correction unit 602 is specifically configured to:
apply the chromaticity correction model
P = A(VI + F)
to the projection picture captured by the geometrically corrected camera, where matrix P is the pixels of the first projected image, matrix I is the pixels of the projection input image, matrix A is the reflectivity of the projection screen, vector F is the contribution of ambient light, and matrix V is the color mixing matrix characterizing the correlation between the color channels.
Referring to Fig. 7, preferably, the determination module 503 may include a binarization unit 701, a construction unit 702, a computing unit 703, and a touch judging unit 704, wherein:
the binarization unit 701 is configured to binarize the extracted hand foreground image, and to extract the contour image of the binarized hand foreground image;
the construction unit 702 is configured to construct a mask centered on each pixel of the contour image;
the computing unit 703 is configured to traverse the contour points of the contour image, and to count the number of hand-region pixels contained in each mask;
the touch judging unit 704 is configured to take the contour point whose mask contains the fewest hand-region pixels as the fingertip. Specifically, whenever the number of hand pixels inside the current mask is smaller than the number inside the previous best mask, the touch judging unit 704 replaces the previous mask with the current mask as the preferred candidate, traverses the contour points of the contour image, and takes the contour point of the best candidate found by the traversal as the fingertip for touch determination.
The interactive projection device described above obtains a first projected image by prediction from the projection picture, extracts the hand foreground and the hand shadow from the first projected image and a second projected image of the projection picture captured by the camera, obtains the positions of the fingertip and the fingertip shadow from the hand foreground and the hand shadow, and performs touch determination according to those positions. Interactive hand motion extraction and touch determination can thus be realized with a single camera; moreover, thanks to the geometric correction and the chromaticity correction, the device adapts to complex lighting environments with strong robustness, needs no expensive depth camera or high-resolution CCD camera, and achieves low-cost bare-hand human-computer interaction.
It will be clearly understood by those skilled in the art that, for convenience and brevity of description, the division into the functional units and modules above is merely an example. In practical applications, the above functions may be assigned to different functional units and modules as needed; that is, the internal structure of the device may be divided into different functional units or modules to complete all or part of the functions described above. The functional units in the embodiments may be integrated into one processing unit, each unit may exist alone physically, or two or more units may be integrated into one unit; the integrated unit may be implemented in the form of hardware or in the form of a software functional unit. In addition, the specific names of the functional units and modules are only for convenience of distinguishing them from one another and are not intended to limit the scope of protection of this application. For the specific working processes of the units and modules in the above system, reference may be made to the corresponding processes in the foregoing method embodiments, which are not repeated here.
Those of ordinary skill in the art will appreciate that the units and algorithm steps of the examples described in connection with the embodiments disclosed herein can be implemented by electronic hardware, or by a combination of computer software and electronic hardware. Whether these functions are performed in hardware or in software depends on the specific application and the design constraints of the technical solution. Skilled artisans may use different methods to implement the described functions for each specific application, but such implementations should not be considered beyond the scope of the present invention.
In the embodiments provided by the present invention, it should be understood that the disclosed device and method may be implemented in other ways. For example, the system embodiment described above is merely illustrative: the division of the modules or units is only a division by logical function, and there may be other division manners in actual implementation; for example, multiple units or components may be combined or integrated into another system, or some features may be ignored or not executed. Furthermore, the mutual couplings or direct couplings or communication connections shown or discussed may be indirect couplings or communication connections through some interfaces, devices, or units, and may be electrical, mechanical, or in other forms.
The units described as separate components may or may not be physically separated, and the components shown as units may or may not be physical units; they may be located in one place or distributed over multiple network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, the functional units in the embodiments of the present invention may be integrated into one processing unit, each unit may exist alone physically, or two or more units may be integrated into one unit. The integrated unit may be implemented in the form of hardware or in the form of a software functional unit.
If the integrated unit is implemented in the form of a software functional unit and sold or used as an independent product, it may be stored in a computer-readable storage medium. Based on this understanding, the technical solution of the embodiments of the present invention, in essence, or the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product. The computer software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) or a processor to execute all or part of the steps of the methods of the embodiments of the present invention. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.
The above embodiments are only intended to illustrate the technical solutions of the present invention, not to limit them. Although the invention has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art should understand that the technical solutions recorded in the foregoing embodiments can still be modified, or some of their technical features can be equivalently replaced; such modifications or replacements do not depart the essence of the corresponding technical solutions from the spirit and scope of the technical solutions of the embodiments of the present invention, and should all be included within the scope of protection of the present invention.
Claims (6)
1. An alternative projection method, comprising:
obtaining a first projected image by prediction from the projection image, and extracting the hand foreground and the hand shadow according to the first projected image and a second projected image of the projection image captured by a camera;
obtaining the positions of the fingertip and the fingertip shadow from the hand foreground and the hand shadow; and
performing touch determination according to the positions of the fingertip and the fingertip shadow;
wherein obtaining the first projected image by prediction from the projection image comprises:
performing geometric correction on the camera to obtain the positional relationship between corresponding pixels in the camera view and the projection image;
capturing the projection image with the geometrically corrected camera; and
performing chromaticity correction on the projection image captured by the geometrically corrected camera to obtain the first projected image;
wherein performing geometric correction on the camera comprises:
loading a checkerboard image and projecting the checkerboard image onto the projection screen, wherein the checkerboard corners in the input image projected onto the projection screen are the first corner points;
capturing the projection-screen image with the camera, and detecting the second corner points of the checkerboard in the projection-screen image captured by the camera;
obtaining the homography matrix between the projection-screen image and the input image according to the positional relationship between the first corner points and the second corner points; and
correcting the camera with the homography matrix;
wherein capturing the projection-screen image with the camera and detecting the second corner points of the checkerboard in the projection-screen image captured by the camera comprises:
converting the RGB image into a grayscale image, and detecting the edges of the grayscale image;
detecting the straight lines of the edge image with a Hough transform; and
fitting a quadrilateral to the detected straight lines, and taking the corner points of the quadrilateral as the second corner points;
wherein performing touch determination according to the positions of the fingertip and the fingertip shadow comprises:
performing touch determination by a formula in which Ttouch(f) is the touch determination factor of the f-th frame of the video captured by the camera; Ttouch(f-1) is the touch determination factor of the (f-1)-th frame; Dtip-shadow is the distance between the fingertip and its shadow; Dthreshold is the threshold distance between the fingertip and its shadow, set as Dthreshold = Wfinger; Wfinger is the horizontal width of the fingertip; Wshadow is the horizontal width of the finger shadow; and α and β are preset parameters.
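The touch test above keys on the fingertip-to-shadow distance Dtip-shadow falling below Dthreshold = Wfinger, with the per-frame factors Ttouch(f) and Ttouch(f-1) coupling the decision across frames. The formula itself is not reproduced in this text, so the sketch below is only a plausible reading, not the patent's formula: a hypothetical exponential blend of a per-frame proximity test, with `alpha` and `beta` standing in for the preset parameters α and β.

```python
def touch_factor(d_tip_shadow, w_finger, t_prev, alpha=0.5, beta=0.5):
    """Hypothetical touch-determination factor for frame f.

    Assumes (the patent's formula being elided) that the factor is an
    exponential smoothing of a binary proximity test: the fingertip is
    'touching' when its distance to its shadow falls below the
    threshold D_threshold = W_finger.
    """
    d_threshold = w_finger          # threshold distance, per the claim
    hit = 1.0 if d_tip_shadow < d_threshold else 0.0
    # blend the current observation with the previous frame's factor
    return alpha * hit + beta * t_prev

def is_touch(factor, cutoff=0.5):
    """Declare a touch once the smoothed factor crosses a cutoff."""
    return factor >= cutoff
```

The smoothing step is what makes the decision stable: a single noisy frame where the shadow detection jitters cannot flip the touch state on its own.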
2. The alternative projection method according to claim 1, wherein performing chromaticity correction on the projection image captured by the geometrically corrected camera specifically comprises:
performing chromaticity correction on the projection image captured by the geometrically corrected camera by the chromaticity correction model
P = A(VI + F)
wherein matrix P is the pixels of the first projected image; matrix I is the pixels of the projection input image; matrix A is the reflectance of the projection screen; vector F is the contribution of ambient light; and matrix V is the color-mixing matrix, which characterizes the correlation between color channels.
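The model P = A(VI + F) predicts, per pixel, what the camera should see when input color I is projected: V mixes the color channels, F adds the ambient-light contribution, and A scales by screen reflectance. A minimal numeric sketch, with all matrix values illustrative (not from the patent) and A taken as a scalar for brevity:

```python
import numpy as np

# Hypothetical 3x3 color-mixing matrix V: mostly identity, with small
# cross-talk between the RGB channels. Values are illustrative only.
V = np.array([[0.90, 0.05, 0.05],
              [0.05, 0.90, 0.05],
              [0.05, 0.05, 0.90]])
F = np.array([0.02, 0.02, 0.02])   # ambient-light contribution
A = 0.8                            # screen reflectance (scalar here)

def predict_pixel(i_rgb):
    """Predict the camera-observed color P of one projected pixel I
    via the model P = A (V I + F)."""
    i_rgb = np.asarray(i_rgb, dtype=float)
    return A * (V @ i_rgb + F)
```

Subtracting this predicted image from the actually captured frame is what isolates the hand foreground and its shadow in the method of claim 1.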
3. The alternative projection method according to claim 1, wherein the process of obtaining the fingertip position comprises:
binarizing the extracted hand foreground image, and extracting the contour image of the binarized hand foreground image;
constructing a plurality of masks, each centered on a pixel of the contour image;
traversing the contour points of the contour image, and counting the number of hand-region pixels contained in each mask; and
taking the contour point corresponding to the mask containing the fewest hand-region pixels as the fingertip.
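The idea in claim 3 is that, among all contour points, the fingertip is the one whose surrounding mask covers the least hand area (the sharpest protrusion of the silhouette). A minimal sketch on a binary hand mask, assuming a square window as the mask (the claim does not fix the mask shape):

```python
import numpy as np

def fingertip(hand, radius=2):
    """Locate the fingertip on a binary hand mask.

    Centers a (2*radius+1)-square window on each contour pixel, counts
    hand pixels inside it, and returns the contour point with the
    fewest hand pixels.
    """
    h, w = hand.shape
    # contour = hand pixels with at least one non-hand 4-neighbour
    contour = []
    for y in range(h):
        for x in range(w):
            if hand[y, x]:
                nb = [hand[max(y - 1, 0), x], hand[min(y + 1, h - 1), x],
                      hand[y, max(x - 1, 0)], hand[y, min(x + 1, w - 1)]]
                if not all(nb):
                    contour.append((y, x))
    best, best_count = None, None
    for (y, x) in contour:
        win = hand[max(y - radius, 0):y + radius + 1,
                   max(x - radius, 0):x + radius + 1]
        c = int(win.sum())
        if best_count is None or c < best_count:
            best, best_count = (y, x), c
    return best
```

On a silhouette with one extended finger, the window around the fingertip contains only the thin finger strip, while windows on the palm edge are roughly half full, so the minimum-count criterion singles out the tip.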
4. An alternative projection device, comprising:
an extraction module, configured to obtain a first projected image by prediction from the projection image, and to extract the hand foreground and the hand shadow according to the first projected image and a second projected image of the projection image captured by a camera;
a position acquisition module, configured to obtain the positions of the fingertip and the fingertip shadow from the hand foreground and the hand shadow; and
a determination module, configured to perform touch determination according to the positions of the fingertip and the fingertip shadow;
wherein the extraction module comprises:
a geometric correction unit, configured to perform geometric correction on the camera to obtain the positional relationship between corresponding pixels in the camera view and the projection image;
a chromaticity correction unit, configured to capture the projection image with the geometrically corrected camera; and
a prediction unit, configured to perform chromaticity correction on the projection image captured by the geometrically corrected camera to obtain the first projected image;
wherein the geometric correction unit is specifically configured to:
load a checkerboard image and project the checkerboard image onto the projection screen, wherein the checkerboard corners in the input image projected onto the projection screen are the first corner points;
capture the projection-screen image with the camera, and detect the second corner points of the checkerboard in the projection-screen image captured by the camera;
obtain the homography matrix between the projection-screen image and the input image according to the positional relationship between the first corner points and the second corner points; and
correct the camera with the homography matrix;
wherein capturing the projection-screen image with the camera and detecting the second corner points of the checkerboard in the projection-screen image captured by the camera comprises:
converting the RGB image into a grayscale image, and detecting the edges of the grayscale image;
detecting the straight lines of the edge image with a Hough transform; and
fitting a quadrilateral to the detected straight lines, and taking the corner points of the quadrilateral as the second corner points;
wherein the determination module is specifically configured to perform touch determination by a formula in which Ttouch(f) is the touch determination factor of the f-th frame of the video captured by the camera; Ttouch(f-1) is the touch determination factor of the (f-1)-th frame; Dtip-shadow is the distance between the fingertip and its shadow; Dthreshold is the threshold distance between the fingertip and its shadow, set as Dthreshold = Wfinger; Wfinger is the horizontal width of the fingertip; Wshadow is the horizontal width of the finger shadow; and α and β are preset parameters.
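Claims 1 and 4 relate the first corner points (in the projector's input image) to the second corner points (in the camera image) through a homography matrix. The patent does not specify the estimation algorithm; a common choice, sketched below under that assumption, is the direct linear transform (DLT) over the four corner pairs:

```python
import numpy as np

def homography(src, dst):
    """Estimate the 3x3 homography H mapping src -> dst from >= 4
    point pairs via the direct linear transform (DLT): each pair
    contributes two rows to A, and h is the null vector of A."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    # null vector of A = last right-singular vector from the SVD
    _, _, vt = np.linalg.svd(np.asarray(A, dtype=float))
    H = vt[-1].reshape(3, 3)
    return H / H[2, 2]

def apply_h(H, pt):
    """Map a 2D point through H in homogeneous coordinates."""
    x, y, w = H @ np.array([pt[0], pt[1], 1.0])
    return (x / w, y / w)
```

With the four checkerboard corners detected in both images, this matrix is exactly what lets the device warp camera coordinates into projector coordinates, i.e. "correct the camera".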
5. The alternative projection device according to claim 4, wherein the chromaticity correction unit is specifically configured to:
perform chromaticity correction on the projection image captured by the geometrically corrected camera by the chromaticity correction model
P = A(VI + F)
wherein matrix P is the pixels of the first projected image; matrix I is the pixels of the projection input image; matrix A is the reflectance of the projection screen; vector F is the contribution of ambient light; and matrix V is the color-mixing matrix, which characterizes the correlation between color channels.
6. The alternative projection device according to claim 4, wherein the determination module comprises:
a binarization unit, configured to binarize the extracted hand foreground image and to extract the contour image of the binarized hand foreground image;
a construction unit, configured to construct a plurality of masks, each centered on a pixel of the contour image;
a computing unit, configured to traverse the contour points of the contour image and to count the number of hand-region pixels contained in each mask; and
a touch determination unit, configured to take the contour point corresponding to the mask containing the fewest hand-region pixels as the fingertip for touch determination.
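The corner-detection step recited in claims 1 and 4 runs grayscale conversion, edge detection, Hough line detection, and then a quadrilateral fit. A minimal sketch of the Hough voting stage in the standard (rho, theta) parameterization, assuming a binary edge image is already available (the grayscale and edge steps are omitted):

```python
import numpy as np

def hough_lines(edges, n_theta=180, top=2):
    """Minimal Hough transform: return the `top` strongest
    (rho, theta) line candidates from a binary edge image.
    A line is rho = x*cos(theta) + y*sin(theta); every edge pixel
    votes for all (rho, theta) bins it lies on."""
    h, w = edges.shape
    diag = int(np.ceil(np.hypot(h, w)))          # max |rho|
    thetas = np.linspace(0, np.pi, n_theta, endpoint=False)
    acc = np.zeros((2 * diag + 1, n_theta), dtype=int)
    ys, xs = np.nonzero(edges)
    for y, x in zip(ys, xs):
        rhos = np.round(x * np.cos(thetas) + y * np.sin(thetas)).astype(int)
        acc[rhos + diag, np.arange(n_theta)] += 1  # one vote per theta bin
    flat = np.argsort(acc.ravel())[::-1][:top]     # strongest bins first
    return [(int(i // n_theta) - diag, float(thetas[i % n_theta]))
            for i in flat]
```

Fitting the quadrilateral would then intersect the four strongest lines pairwise; those four intersections are the second corner points of the claims.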
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201611050637.7A CN106774846B (en) | 2016-11-24 | 2016-11-24 | Alternative projection method and device |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201611050637.7A CN106774846B (en) | 2016-11-24 | 2016-11-24 | Alternative projection method and device |
Publications (2)
Publication Number | Publication Date |
---|---|
CN106774846A CN106774846A (en) | 2017-05-31 |
CN106774846B true CN106774846B (en) | 2019-12-03 |
Family
ID=58912757
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201611050637.7A Active CN106774846B (en) | 2016-11-24 | 2016-11-24 | Alternative projection method and device |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN106774846B (en) |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107357422B (en) * | 2017-06-28 | 2023-04-25 | 深圳先进技术研究院 | Camera-projection interactive touch control method, device and computer readable storage medium |
CN109375833B (en) * | 2018-09-03 | 2022-03-04 | 深圳先进技术研究院 | Touch instruction generation method and device |
CN110888536B (en) * | 2019-12-12 | 2023-04-28 | 北方工业大学 | Finger interaction recognition system based on MEMS laser scanning |
CN111552367B (en) * | 2020-05-25 | 2023-09-26 | 广东小天才科技有限公司 | Click operation identification method, electronic equipment and storage medium |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101625723B (en) * | 2009-07-02 | 2012-03-28 | 浙江省电力公司 | Rapid image-recognizing method of power line profile |
CN103092437B (en) * | 2012-12-13 | 2016-07-13 | 同济大学 | A kind of Portable touch interactive system based on image processing techniques |
CN103824282B (en) * | 2013-12-11 | 2017-08-08 | 香港应用科技研究院有限公司 | Touch and motion detection using surface mapping figure, shadow of object and camera |
2016
- 2016-11-24 CN CN201611050637.7A patent/CN106774846B/en active Active
Non-Patent Citations (1)
Title |
---|
Fingertip-based interactive projector-camera system; Qun Wang et al.; 2013 IEEE International Conference on Information and Automation; 2014-01-27; pp. 140-144 *
Also Published As
Publication number | Publication date |
---|---|
CN106774846A (en) | 2017-05-31 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN106774846B (en) | Alternative projection method and device | |
CN102508574B (en) | Projection-screen-based multi-touch detection method and multi-touch system | |
TWI499966B (en) | Interactive operation method of electronic apparatus | |
US9317127B2 (en) | Method and apparatus for motion recognition | |
US10311295B2 (en) | Heuristic finger detection method based on depth image | |
CN103279225B (en) | Projection type man-machine interactive system and touch control identification method | |
JP5510907B2 (en) | Touch position input device and touch position input method | |
Sener et al. | Error-tolerant interactive image segmentation using dynamic and iterated graph-cuts | |
JP6559359B2 (en) | Gesture determination device, gesture operation device, and gesture determination method | |
US20120249468A1 (en) | Virtual Touchpad Using a Depth Camera | |
López-Rubio et al. | Local color transformation analysis for sudden illumination change detection | |
CN102043949B (en) | Method for searching region of interest (ROI) of moving foreground | |
WO2018082498A1 (en) | Mid-air finger pointing detection for device interaction | |
CN113052923A (en) | Tone mapping method, tone mapping apparatus, electronic device, and storage medium | |
Bulbul et al. | A color-based face tracking algorithm for enhancing interaction with mobile devices | |
Xu et al. | Bare hand gesture recognition with a single color camera | |
CN103389793B (en) | Man-machine interaction method and system | |
Yeh et al. | Vision-based virtual control mechanism via hand gesture recognition | |
Wu et al. | Partially occluded head posture estimation for 2D images using pyramid HoG features | |
CN108255298A (en) | Infrared gesture identification method and equipment in a kind of projection interactive system | |
CN114565777A (en) | Data processing method and device | |
CN106339089A (en) | Human-computer interaction action identification system and method | |
CN108268861B (en) | Human body prone position state identification method and device | |
CN112702586A (en) | Projector virtual touch tracking method, device and system based on visible light | |
Xie et al. | Hand posture recognition using kinect |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||