CN107357517A - An unlocking device based on gesture recognition - Google Patents

An unlocking device based on gesture recognition Download PDF

Info

Publication number
CN107357517A
CN107357517A CN201710582099.4A CN201710582099A CN107357517A
Authority
CN
China
Prior art keywords
pixel
gray
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201710582099.4A
Other languages
Chinese (zh)
Inventor
韦德远
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Wuzhou Well Trading Co Ltd
Original Assignee
Wuzhou Well Trading Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Wuzhou Well Trading Co Ltd filed Critical Wuzhou Well Trading Co Ltd
Priority to CN201710582099.4A priority Critical patent/CN107357517A/en
Publication of CN107357517A publication Critical patent/CN107357517A/en
Pending legal-status Critical Current

Links

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Image Processing (AREA)

Abstract

The invention provides an unlocking device based on gesture recognition, comprising a display module, an image acquisition module, an image processing module and an unlocking module. The display module displays a password unlock interface; the image acquisition module captures image information in the vertical direction corresponding to the password unlock interface; the image processing module obtains the hand region from the image information and recognises changes of gesture from the hand region, and if the changed gesture matches the preset start gesture of the password unlock interface, the interface starts. After the password unlock interface starts, the unlocking module obtains the fingertip position from the hand region, tracks the fingertip's movement track across the password unlock interface and compares that track with the pre-stored unlock track; if they match, the device changes from the locked state to the unlocked state. The device unlocks from gesture changes alone, is convenient to use, and offers high security.

Description

An unlocking device based on gesture recognition
Technical field
The present invention relates to the field of intelligent-recognition unlocking, and in particular to an unlocking device based on gesture recognition.
Background technology
Human action and gesture recognition based on machine vision is a key technology indispensable to a new generation of human-computer interaction systems. Gesture control can generally be divided into contact and non-contact gesture control. Contact gesture control includes the touch screens used on intelligent mobile terminals such as mobile phones, where the user carries out human-computer interaction on the touch-screen display interface; however, contact gesture control suffers from deficiencies such as large interference and the need to operate at close range. For some large-screen devices, such as smart televisions, contact gesture control degrades the user experience, and non-contact gesture control can make up for this shortcoming.
Summary of the invention
In view of the above problems, the present invention aims to provide an unlocking device based on gesture recognition.
The purpose of the present invention is achieved by the following technical scheme:
An unlocking device based on gesture recognition, characterised by comprising a display module, an image acquisition module, an image processing module and an unlocking module. The display module displays a password unlock interface; the image acquisition module captures image information in the vertical direction corresponding to the password unlock interface; the image processing module obtains the hand region from the image information and recognises changes of gesture from the hand region. If the changed gesture matches the preset start gesture of the password unlock interface, the interface starts; if not, it does not start. After the password unlock interface starts, the unlocking module obtains the fingertip position from the hand region, tracks the fingertip's movement track across the password unlock interface, and compares the track with the pre-stored unlock track; if they match, the device changes from the locked state to the unlocked state, and if not, its state remains unchanged.
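As an illustration of the final comparison step, the sketch below quantises a fingertip trajectory onto the unlock grid and compares the resulting cell sequence with a pre-stored track. The 4 × 4 grid follows the sixteen-square interface described later; the exact matching rule, the coordinate convention and all names here are assumptions, since the text only states that the track is compared with the pre-stored unlock track.

```python
# Hypothetical sketch of the unlock comparison: fingertip samples (x, y)
# are mapped onto a 4x4 grid and the de-duplicated cell sequence must
# equal the stored pattern exactly.

def to_cells(points, width, height, n=4):
    """Map (x, y) samples to a de-duplicated sequence of (row, col) grid cells."""
    cells = []
    for x, y in points:
        cell = (min(int(y * n / height), n - 1), min(int(x * n / width), n - 1))
        if not cells or cells[-1] != cell:  # drop consecutive repeats
            cells.append(cell)
    return cells

def try_unlock(points, stored, width=400, height=400):
    """Return True if the quantised trajectory matches the stored track."""
    return to_cells(points, width, height) == stored

stored = [(0, 0), (1, 1), (2, 2)]           # pre-stored diagonal track (assumed)
trace = [(50, 50), (150, 150), (250, 250)]  # fingertip samples in pixels
```

With a 400 × 400 interface each cell is 100 px wide, so the sample trace above lands exactly on the stored diagonal and unlocks.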
Beneficial effects of the invention: the unlocking device based on non-contact gesture recognition effectively avoids system misoperation caused by the user's unintended gesture operations, quickly and accurately locates the hand region in the acquired image, is convenient to use, requires little computation, and recognises quickly.
Brief description of the drawings
The accompanying drawings further illustrate the invention, but the embodiments in the drawings do not limit the invention in any way; for one of ordinary skill in the art, other drawings can be obtained from the following drawings without creative work.
Fig. 1 is the frame structure diagram of the present invention;
Fig. 2 is the frame structure diagram of the image processing module of the present invention.
Reference signs:
display module 1, image acquisition module 2, image processing module 3, unlocking module 4, filter unit 30, binarization unit 31, morphological operation unit 32, segmentation unit 33 and recognition unit 34.
Embodiment
The invention is further described with reference to the following application scenario.
Referring to Fig. 1, the unlocking device based on gesture recognition of this embodiment comprises a display module 1, an image acquisition module 2, an image processing module 3 and an unlocking module 4. The display module 1 displays the password unlock interface; the image acquisition module 2 captures image information in the vertical direction corresponding to the password unlock interface; the image processing module 3 obtains the hand region from the image information and recognises changes of gesture from the hand region. If the changed gesture matches the preset start gesture of the password unlock interface, the interface starts; if not, it does not start. After the password unlock interface starts, the unlocking module 4 obtains the fingertip position from the hand region, tracks the fingertip's movement track across the password unlock interface, and compares the track with the pre-stored unlock track; if they match, the device changes from the locked state to the unlocked state, and if not, its state remains unchanged.
Further, the gesture change is a change from five fingers closed to separated, or from five fingers separated to closed.
Further, the password unlock interface is a 4 × 4 (sixteen-square) grid pattern.
Preferably, referring to Fig. 2, the image processing module 3 comprises a filter unit 30, a binarization unit 31, a morphological operation unit 32, a segmentation unit 33 and a recognition unit 34. The filter unit 30 removes noise from the hand images captured by the image acquisition module; the binarization unit 31 binarizes the de-noised hand images; the morphological operation unit 32 applies dilation and erosion to the binarized image; the segmentation unit 33 extracts and segments the hand-region contour from the dilated and eroded image; the recognition unit 34 tracks the hand region and recognises the change of gesture.
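The unit chain just described (filter → binarize → morphology → segment → recognise) can be sketched as a simple pipeline. The stage bodies below are illustrative placeholders only, not the patent's actual per-unit algorithms, which are given in the preferred embodiments that follow.

```python
# Toy sketch of the image-processing module (3); each function mirrors one
# unit named above. Bodies are placeholders: denoise and open_close are
# pass-throughs, binarise uses a fixed threshold purely for illustration.

def denoise(img):
    return img  # filter unit 30 (placeholder: pass-through)

def binarise(img):
    return [[1 if v > 127 else 0 for v in row] for row in img]  # unit 31

def open_close(img):
    return img  # morphological operation unit 32 (placeholder)

def segment(img):
    # cutting unit 33: here simply the foreground pixel coordinates
    return [(i, j) for i, row in enumerate(img) for j, v in enumerate(row) if v]

def process(img):
    """Chain the units as the module does: filter -> binarize -> morphology -> segment."""
    return segment(open_close(binarise(denoise(img))))
```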
Preferably, the filter unit 30 removes noise from the image captured by the image acquisition module as follows:
Set region operators for the filter unit 30, where each region operator denotes a set of pixels in a different region centred on pixel (i, j), a = 1, 2, …, 6, and γ and τ denote respectively the horizontal and vertical distances from a pixel in the designated region centred on (i, j) to pixel (i, j);
Obtain the gray values of the hand image. For each pixel (i, j) in the image, compute the variance of the gray values within each region set, record the region with the minimum variance, and apply a custom noise-removal function to the hand image to output the pre-processed image; in the noise-removal function, f(i, j) denotes the gray value of pixel (i, j), and the minimum-variance region's gray-value set and its average gray value are used.
In this preferred embodiment, noise is removed from the hand image as described above: the most suitable region is automatically chosen from the standard region operators as the region operator for filtering, so that the textural features of the hand image are preserved to the greatest extent while the noise introduced during transmission is effectively removed, providing high-quality hand images for the subsequent gesture recognition.
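A minimal sketch of the minimum-variance region filter of this preferred embodiment. The six region operators themselves are not reproduced in the text (the defining formula is missing), so six small directional neighbourhoods are assumed here purely for illustration; likewise, replacing the pixel by the chosen region's mean stands in for the unspecified custom noise-removal function.

```python
# Sketch of the minimum-variance region filter. ASSUMPTIONS: the six region
# operators are hypothetical 3-pixel neighbourhoods, and the output pixel is
# the mean of the minimum-variance region.

def region_filter(img):
    """Replace each pixel by the mean of its minimum-variance neighbourhood."""
    h, w = len(img), len(img[0])
    # Hypothetical region operators: (row, col) offsets relative to (i, j).
    regions = [
        [(0, 0), (0, -1), (0, 1)],    # horizontal
        [(0, 0), (-1, 0), (1, 0)],    # vertical
        [(0, 0), (-1, -1), (1, 1)],   # main diagonal
        [(0, 0), (-1, 1), (1, -1)],   # anti-diagonal
        [(0, 0), (-1, 0), (0, -1)],   # upper-left corner
        [(0, 0), (1, 0), (0, 1)],     # lower-right corner
    ]
    out = [row[:] for row in img]
    for i in range(h):
        for j in range(w):
            best = None  # (variance, mean) of the best region so far
            for reg in regions:
                vals = [img[(i + di) % h][(j + dj) % w] for di, dj in reg]
                mean = sum(vals) / len(vals)
                var = sum((v - mean) ** 2 for v in vals) / len(vals)
                if best is None or var < best[0]:
                    best = (var, mean)
            out[i][j] = best[1]
    return out
```

On a uniform image every region has zero variance, so the image passes through unchanged; an isolated bright pixel is pulled toward its neighbours' gray level.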
Preferably, the binarization unit 31 binarizes the de-noised hand image as follows:
Set a global gray threshold G_q to eliminate gray outliers in the de-noised hand image; the global threshold G_q is set as:

G_q = \sup\left\{ z \,\middle|\, \sum_{I=0}^{z} \mathrm{hist}[I] \ge C \times 90\% \right\}

where G_q denotes the set global gray threshold, sup denotes the supremum of the set, hist[I] denotes the number of pixels with gray value I in the gray histogram of the de-noised hand image, the sum counts the pixels with gray value at most z, and C denotes the total number of pixels in the hand image;
According to the global gray threshold, first binarize the gray outliers in the hand image:
If p(i, j) > (1 + λ) × G_q, then b(i, j) = 1;
If p(i, j) < (1 − λ) × G_q, then b(i, j) = 0;
If (1 − λ) × G_q < p(i, j) ≤ (1 + λ) × G_q, then pixel (i, j) is labelled a general pixel;
where p(i, j) denotes the gray value of pixel (i, j), b(i, j) denotes the binarization result of pixel (i, j), G_q denotes the set global gray threshold, and λ ∈ [0.2, 0.4] denotes the set global gray factor;
Obtain a local gray threshold for each general pixel using the custom local-threshold function:

G_p(i,j) = \operatorname{mid}_{-u \le \zeta,\eta \le u}\big( H(i+\zeta, j+\eta) \big)

where

H(i,j) = \frac{h_a(i,j)^2 + h_b(i,j)^2}{h_a(i,j) + h_b(i,j)} - \frac{1}{2}\sqrt{h_a(i,j) + h_b(i,j)},
h_a(i,j) = \max_{-u \le \zeta,\eta \le u} p(i+\zeta, j+\eta),
h_b(i,j) = \min_{-u \le \zeta,\eta \le u} p(i+\zeta, j+\eta)

In the formulas, G_p(i, j) denotes the local gray threshold of general pixel (i, j); H(i + ζ, j + η) denotes the gray reference value of pixel (i + ζ, j + η); ζ and η denote respectively the horizontal and vertical distances from pixel (i + ζ, j + η) to the general pixel (i, j), with ζ, η ∈ [−u, u] and u the set range factor; the mid operator takes the median of the gray reference values of the pixels in the (2u + 1) × (2u + 1) window centred on (i, j); h_a(i, j) and h_b(i, j) denote the maximum and minimum gray values of the pixels in that window; and p(i + ζ, j + η) denotes the gray value of pixel (i + ζ, j + η);
According to the global gray threshold and the local gray thresholds, binarize the general pixels, specifically:

b(i,j) = \begin{cases} 1, & p(i,j) \ge \tfrac{1}{2}\big(G_q + G_p(i,j)\big) \\ 0, & p(i,j) < \tfrac{1}{2}\big(G_q + G_p(i,j)\big) \end{cases}

where b(i, j) denotes the binarization result of general pixel (i, j), p(i, j) denotes its gray value, G_q denotes the global gray threshold, and G_p(i, j) denotes the local gray threshold of general pixel (i, j);
Output the binary image according to the binarization results: pixels with binarization result 1 are set to gray value 255, and pixels with result 0 are set to gray value 0.
In this preferred embodiment, the de-noised hand image is binarized as described above. First judging the global gray outliers removes their adverse effects; then introducing the local gray threshold adapts to the local characteristics of each pixel, yielding an accurate binarization result per pixel while effectively avoiding the adverse effects of uneven illumination. This improves the quality of the binarized hand image and lays the foundation for the subsequent hand-contour extraction.
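A sketch of the global-plus-local thresholding scheme of this preferred embodiment, under stated simplifications: the global threshold is taken as the 90%-histogram gray level, and the local gray reference H is replaced by a plain neighbourhood median rather than the max/min construction above.

```python
# Sketch of the two-stage binarization. ASSUMPTIONS: G_q is read off as the
# gray level at the 90% point of the sorted pixel values, and the local
# threshold for a "general" pixel is the median of its (2u+1)x(2u+1)
# neighbourhood, a simplification of the H(i, j) reference value.

def global_threshold(img, ratio=0.9):
    """Gray level below which roughly `ratio` of the pixels fall."""
    flat = sorted(v for row in img for v in row)
    return flat[min(len(flat) - 1, int(len(flat) * ratio))]

def binarize(img, lam=0.3, u=1):
    h, w = len(img), len(img[0])
    gq = global_threshold(img)
    out = [[0] * w for _ in range(h)]
    for i in range(h):
        for j in range(w):
            p = img[i][j]
            if p > (1 + lam) * gq:        # bright outlier
                out[i][j] = 1
            elif p < (1 - lam) * gq:      # dark outlier
                out[i][j] = 0
            else:
                # "general" pixel: combine global and local thresholds
                win = [img[(i + di) % h][(j + dj) % w]
                       for di in range(-u, u + 1) for dj in range(-u, u + 1)]
                local = sorted(win)[len(win) // 2]
                out[i][j] = 1 if p >= 0.5 * (gq + local) else 0
    return out
```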
Preferably, the segmentation unit 33 extracts and segments the hand-region contour from the dilated and eroded hand image as follows:
Obtain the hand contour by an adaptive level-set evolution method with a variable weight coefficient: in the processed hand-image region Q, choose a definition region Q_0, where Q_0 ⊂ Q, and let ι denote the boundary curve of Q_0;
Initialize the level-set function:

D^0_{(i,j)} = \begin{cases} -p(i,j), & (i,j) \in Q_0 \\ 0, & (i,j) \in \iota \\ p(i,j), & (i,j) \in Q - Q_0 \end{cases}

where D^0_{(i,j)} denotes the initialized level-set function of pixel (i, j), (i, j) ∈ Q; p(i, j) denotes the Euclidean distance from pixel (i, j) to the boundary curve ι of the definition region Q_0; (i, j) ∈ Q_0 means pixel (i, j) lies inside Q_0, (i, j) ∈ ι means it lies on the boundary curve ι, and (i, j) ∈ Q − Q_0 means it lies outside Q_0;
Meanwhile set a variable gray weight coefficient z(i, j) to control whether the boundary curve ι moves inward or outward adaptively according to the image information during evolution; the variable gray weight coefficient is set as:

z(i, j) = s · sgn[f(i, j) − mean(f(i, j))]

where z(i, j) denotes the variable gray weight coefficient of pixel (i, j), f(i, j) denotes the gray value of pixel (i, j), mean(f(i, j)) denotes the average gray value of the hand region, and s denotes the set weight-coefficient factor, used to adapt to different edge demands;
Evolve the level-set function using the custom evolution function:

D^{k+1}_{(i,j)} = \left\{ \left( \frac{B\big(D^k_{(i,j)}\big)^2}{\mu_1 + \mu_2\, B\big(D^k_{(i,j)}\big)} \right) \rho(i,j) + z(i,j)\, B\big(D^k_{(i,j)}\big) \right\} \cdot t' + D^k_{(i,j)}

where D^{k+1}_{(i,j)} denotes the level-set function of pixel (i, j) after the evolution step, (i, j) ∈ Q; D^k_{(i,j)} denotes the current level-set function; μ_1 and μ_2 are the set internal and external energy factors; ρ(i, j) denotes the curvature of the level-set surface at pixel (i, j); B denotes the Dirac function and E the set narrow-band threshold; z(i, j) denotes the variable gray weight coefficient of pixel (i, j); and t′ denotes the evolution time step;
After each evolution step, check whether the evolution curve has reached a stable state; if not, repeat the evolution. If it has, stop evolving and take the resulting boundary curve ι′ as the contour curve of the hand;
Segment an accurate hand image according to the hand contour curve, for tracking and recognition.
In this preferred embodiment, hand-region contour extraction and segmentation are performed as described above: the adaptive level-set evolution method first chooses a small definition region and then evolves according to the shape features of the hand contour, obtaining the hand contour accurately and quickly, so that tracking and recognising the hand image from its contour is more accurate, guaranteeing the subsequent track recognition.
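A toy sketch of one evolution step of the level-set update above. The text does not give B explicitly, so the standard smoothed Dirac delta with width E is assumed, and the curvature term ρ is replaced by a one-dimensional second difference on a toy front; the weight z(i, j) = s · sgn(f − mean(f)) follows the formula given.

```python
import math

# One evolution step on a 1-D toy front. ASSUMPTIONS: B is the standard
# smoothed Dirac delta; the curvature rho is a central second difference.

def dirac(d, eps=1.5):
    """Smoothed Dirac delta with narrow-band width eps (assumed form of B)."""
    return eps / (math.pi * (eps ** 2 + d ** 2))

def gray_weight(f, s=1.0):
    """z(i) = s * sgn(f(i) - mean(f)): drives the front inward or outward."""
    m = sum(f) / len(f)
    return [s * (1 if v > m else -1 if v < m else 0) for v in f]

def evolve_step(D, f, mu1=1.0, mu2=1.0, dt=0.1):
    """Apply the update D_k+1 = {(B^2/(mu1+mu2*B))*rho + z*B}*dt + D_k."""
    z = gray_weight(f)
    out = []
    for i, d in enumerate(D):
        b = dirac(d)
        left, right = D[max(i - 1, 0)], D[min(i + 1, len(D) - 1)]
        rho = left - 2 * d + right  # 1-D stand-in for the curvature term
        out.append(((b * b / (mu1 + mu2 * b)) * rho + z[i] * b) * dt + d)
    return out
```

Where the gray value exceeds the mean, z is positive and the front is pushed outward; where it is below, the front is pulled inward, which is the adaptive in/out behaviour the text ascribes to z(i, j).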
It should be noted that the above embodiments merely illustrate the technical scheme of the invention and do not limit its scope of protection. Although the invention has been explained with reference to preferred embodiments, one of ordinary skill in the art should understand that the technical scheme may be modified or equivalently substituted without departing from the essence and scope of the technical scheme of the invention.

Claims (7)

1. An unlocking device based on gesture recognition, characterised by comprising a display module, an image acquisition module, an image processing module and an unlocking module, wherein the display module displays a password unlock interface; the image acquisition module captures image information in the vertical direction corresponding to the password unlock interface; the image processing module obtains the hand region from the image information and recognises changes of gesture from the hand region, and if the changed gesture matches the preset start gesture of the password unlock interface, the interface starts, otherwise it does not start; and the unlocking module, after the password unlock interface starts, obtains the fingertip position from the hand region, tracks the fingertip's movement track across the password unlock interface, and compares the track with the pre-stored unlock track, the device changing from the locked state to the unlocked state if they match and its state remaining unchanged otherwise.
2. The unlocking device based on gesture recognition according to claim 1, wherein the gesture change is a change from five fingers closed to separated, or from five fingers separated to closed.
3. The unlocking device based on gesture recognition according to claim 1, wherein the password unlock interface is a 4 × 4 (sixteen-square) grid pattern.
4. The unlocking device based on gesture recognition according to claim 1, wherein the image processing module comprises a filter unit, a binarization unit, a morphological operation unit, a segmentation unit and a recognition unit; the filter unit removes noise from the hand images captured by the image acquisition module; the binarization unit binarizes the de-noised hand images; the morphological operation unit applies dilation and erosion to the binarized image; the segmentation unit extracts and segments the hand-region contour from the dilated and eroded image; and the recognition unit tracks the hand region and recognises the change of gesture.
5. The unlocking device based on gesture recognition according to claim 4, wherein the filter unit removes noise from the image captured by the image acquisition module by:
setting region operators for the filter unit, where each region operator denotes a set of pixels in a different region centred on pixel (i, j), a = 1, 2, …, 6, and γ and τ denote respectively the horizontal and vertical distances from a pixel in the designated region centred on (i, j) to pixel (i, j); and
obtaining the gray values of the hand image, computing for each pixel (i, j) the variance of the gray values within each region set, recording the region with the minimum variance, and applying a custom noise-removal function to the hand image to output the pre-processed image, where f(i, j) denotes the gray value of pixel (i, j) and the minimum-variance region's gray-value set and its average gray value are used in the function.
6. The unlocking device based on gesture recognition according to claim 5, wherein the binarization unit binarizes the de-noised hand image by:
setting a global gray threshold G_q to eliminate gray outliers in the de-noised hand image, the global threshold being set as:
G_q = \sup\left\{ z \,\middle|\, \sum_{I=0}^{z} \mathrm{hist}[I] \ge C \times 90\% \right\}
where G_q denotes the set global gray threshold, sup denotes the supremum of the set, hist[I] denotes the number of pixels with gray value I in the gray histogram of the de-noised hand image, the sum counts the pixels with gray value at most z, and C denotes the total number of pixels in the hand image;
binarizing, according to the global gray threshold, first the gray outliers in the hand image:
if p(i, j) > (1 + λ) × G_q, then b(i, j) = 1;
if p(i, j) < (1 − λ) × G_q, then b(i, j) = 0;
if (1 − λ) × G_q < p(i, j) ≤ (1 + λ) × G_q, pixel (i, j) is labelled a general pixel;
where p(i, j) denotes the gray value of pixel (i, j), b(i, j) denotes the binarization result of pixel (i, j), G_q denotes the set global gray threshold, and λ ∈ [0.2, 0.4] denotes the set global gray factor;
obtaining a local gray threshold for each general pixel using the custom local-threshold function:
G_p(i,j) = \operatorname{mid}_{-u \le \zeta,\eta \le u}\big( H(i+\zeta, j+\eta) \big)
Wherein,
H(i,j) = \frac{h_a(i,j)^2 + h_b(i,j)^2}{h_a(i,j) + h_b(i,j)} - \frac{1}{2}\sqrt{h_a(i,j) + h_b(i,j)},
h_a(i,j) = \max_{-u \le \zeta,\eta \le u} p(i+\zeta, j+\eta)
h_b(i,j) = \min_{-u \le \zeta,\eta \le u} p(i+\zeta, j+\eta)
In the formulas, G_p(i, j) denotes the local gray threshold of general pixel (i, j); H(i + ζ, j + η) denotes the gray reference value of pixel (i + ζ, j + η); ζ and η denote respectively the horizontal and vertical distances from pixel (i + ζ, j + η) to the general pixel (i, j), with ζ, η ∈ [−u, u] and u the set range factor; the mid operator takes the median of the gray reference values of the pixels in the (2u + 1) × (2u + 1) window centred on (i, j); h_a(i, j) and h_b(i, j) denote the maximum and minimum gray values of the pixels in that window; and p(i + ζ, j + η) denotes the gray value of pixel (i + ζ, j + η);
binarizing the general pixels according to the global gray threshold and the local gray thresholds, specifically:
b(i,j) = \begin{cases} 1, & p(i,j) \ge \tfrac{1}{2}\big(G_q + G_p(i,j)\big) \\ 0, & p(i,j) < \tfrac{1}{2}\big(G_q + G_p(i,j)\big) \end{cases}
where b(i, j) denotes the binarization result of general pixel (i, j), p(i, j) denotes its gray value, G_q denotes the global gray threshold, and G_p(i, j) denotes the local gray threshold of general pixel (i, j); and
outputting the binary image according to the binarization results: pixels with binarization result 1 are set to gray value 255, and pixels with result 0 are set to gray value 0.
7. The unlocking device based on gesture recognition according to claim 6, wherein the segmentation unit extracts and segments the hand-region contour from the dilated and eroded hand image by:
obtaining the hand contour by an adaptive level-set evolution method with a variable weight coefficient: in the processed hand-image region Q, choosing a definition region Q_0, where Q_0 ⊂ Q, with ι denoting the boundary curve of Q_0;
then initializing the level-set function:
D^0_{(i,j)} = \begin{cases} -p(i,j), & (i,j) \in Q_0 \\ 0, & (i,j) \in \iota \\ p(i,j), & (i,j) \in Q - Q_0 \end{cases}
where D^0_{(i,j)} denotes the initialized level-set function of pixel (i, j), (i, j) ∈ Q; p(i, j) denotes the Euclidean distance from pixel (i, j) to the boundary curve ι of the definition region Q_0; (i, j) ∈ Q_0 means pixel (i, j) lies inside Q_0, (i, j) ∈ ι means it lies on the boundary curve ι, and (i, j) ∈ Q − Q_0 means it lies outside Q_0;
Variable gray scale weight coefficient z (i, j) is set, for controlling boundary curve ι can be according to image information certainly in evolutionary process Adaptively determine inwardly and outwardly to move, the variable gray scale weight coefficient set as:
z(i, j) = s · sgn[f(i, j) − mean(f(i, j))]
where z(i, j) denotes the variable gray-scale weight coefficient of pixel (i, j), f(i, j) denotes the gray value of pixel (i, j), mean(f(i, j)) denotes the average gray value of the hand region, and s denotes a preset weight factor used to accommodate different edge requirements;
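The weight coefficient takes only a couple of numpy lines. This is illustrative only; the default s = 1.0 is an assumption, and the mean is taken over the whole array passed in:

```python
import numpy as np

def gray_weight(f, s=1.0):
    """z(i, j) = s * sgn(f(i, j) - mean(f)): pixels brighter than the
    region's mean gray value get +s, darker pixels get -s, so the
    evolving curve can decide locally to move inward or outward."""
    f = np.asarray(f, dtype=float)
    return s * np.sign(f - f.mean())
```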
The level set function is evolved with a custom evolution function:
$$D_{(i,j)}^{k+1}=\left\{\frac{B\!\left(D_{(i,j)}^{k}\right)^{2}}{\mu_{1}+\mu_{2}\,B\!\left(D_{(i,j)}^{k}\right)}\,\rho(i,j)+z(i,j)\,B\!\left(D_{(i,j)}^{k}\right)\right\}\cdot t'+D_{(i,j)}^{k}$$
where $D_{(i,j)}^{k+1}$ denotes the level set function of pixel (i, j) after evolution, (i, j) ∈ Q; $D_{(i,j)}^{k}$ denotes the current level set function of pixel (i, j); μ1 and μ2 denote the preset internal and external energy factors, respectively; ρ(i, j) denotes the curvature of the level set surface at pixel (i, j); B(·) denotes the Dirac function, with E the preset narrow-band threshold; z(i, j) denotes the variable gray-scale weight coefficient of pixel (i, j); and t′ denotes the evolution time step;
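One update of the evolution function can be sketched as follows. This is a sketch under assumptions: the claim does not give the form of B, so a narrow-band indicator with threshold eps stands in for the Dirac function, and the curvature rho is taken as a precomputed array.

```python
import numpy as np

def evolve_step(d, z, rho, mu1, mu2, eps, dt):
    """One update of the claim's evolution function:
    D^{k+1} = { B(D)^2 / (mu1 + mu2 * B(D)) * rho + z * B(D) } * dt + D.
    B(D) is approximated as 1 inside the narrow band |D| <= eps, else 0."""
    b = (np.abs(d) <= eps).astype(float)
    return (b ** 2 / (mu1 + mu2 * b) * rho + z * b) * dt + d
```

Pixels outside the narrow band (b = 0) are left unchanged, so only the region near the zero level set is updated each step.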
After each evolution step, check whether the evolving curve has reached a steady state; if it has not, repeat the evolution; if it has, stop the evolution and take the evolved boundary curve ι′ as the contour curve of the hand;
The hand image is accurately segmented according to the hand contour curve, and then tracked and recognised.
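The "evolve until stable" check above amounts to fixed-point iteration on the level set. A generic sketch, where tol and max_iter are assumed stopping parameters not given in the patent:

```python
import numpy as np

def evolve_until_stable(d0, step, tol=1e-4, max_iter=500):
    """Apply an update function d -> d_next until the level set stops
    changing (steady state), then return it; give up after max_iter
    iterations if no steady state is reached."""
    d = d0
    for _ in range(max_iter):
        d_next = step(d)
        if np.max(np.abs(d_next - d)) < tol:
            return d_next
        d = d_next
    return d
```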
CN201710582099.4A 2017-07-17 2017-07-17 A kind of tripper based on gesture identification Pending CN107357517A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710582099.4A CN107357517A (en) 2017-07-17 2017-07-17 A kind of tripper based on gesture identification


Publications (1)

Publication Number Publication Date
CN107357517A true CN107357517A (en) 2017-11-17

Family

ID=60292204

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710582099.4A Pending CN107357517A (en) 2017-07-17 2017-07-17 A kind of tripper based on gesture identification

Country Status (1)

Country Link
CN (1) CN107357517A (en)


Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2708981A2 (en) * 2012-08-31 2014-03-19 Omron Corporation Gesture recognition apparatus, control method thereof, display instrument, and computer readable medium
CN104809387A (en) * 2015-03-12 2015-07-29 山东大学 Video image gesture recognition based non-contact unlocking method and device
CN106469268A (en) * 2016-09-05 2017-03-01 广西大学 A kind of contactless unlocking system and method based on gesture identification
CN106650391A (en) * 2016-09-30 2017-05-10 广西大学 Unlocking device and method based on gesture recognition


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
XIE Jianbin et al.: "Visual Perception and Intelligent Video Surveillance" (《视觉感知与智能视频监控》), 31 March 2012 *

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112034981A (en) * 2020-08-20 2020-12-04 深圳创维-Rgb电子有限公司 Display terminal control method, display terminal, and computer-readable storage medium
CN114885140A (en) * 2022-05-25 2022-08-09 华中科技大学 Multi-screen splicing immersion type projection picture processing method and system
CN114885140B (en) * 2022-05-25 2023-05-26 华中科技大学 Multi-screen spliced immersion type projection picture processing method and system

Similar Documents

Publication Publication Date Title
CN104809387B (en) Contactless unlocking method and device based on video image gesture identification
CN103020579A (en) Face recognition method and system, and removing method and device for glasses frame in face image
CN106548163B (en) Method based on TOF depth camera passenger flow counting
CN103679118A (en) Human face in-vivo detection method and system
CN110781896B (en) Track garbage identification method, cleaning method, system and resource allocation method
CN103605971B (en) Method and device for capturing face images
CN103870818B (en) Smog detection method and device
CN106709450A (en) Recognition method and system for fingerprint images
CN101604384B (en) Individualized fingerprint identification method
CN101201893A (en) Iris recognizing preprocessing method based on grey level information
CN107357517A (en) A kind of tripper based on gesture identification
CN108734235A (en) A kind of personal identification method and system for electronic prescription
CN107481374A (en) A kind of intelligent terminal unlocked by fingerprint door opener
CN101483763A (en) Digital video processing method oriented to social security monitoring and apparatus thereof
CN103854013A (en) ARM fingerprint identification method and device based on sparse matrix
CN105426890A (en) Method for identifying graphic verification code with twisty and adhesion characters
CN105788048A (en) Electronic lock system achieving recognition through fingerprints
CN106446921A (en) High-voltage power transmission line barrier identification method and apparatus
CN103123726A (en) Target tracking algorithm based on movement behavior analysis
CN105825206A (en) Household appliance automatic regulation and control device having identity identification function
CN106342325B (en) A kind of region segmentation method of fingerprint image
CN103914829A (en) Method for detecting edge of noisy image
CN109166220B (en) Intelligent access control system based on block chain
CN106407904A (en) Bang zone determining method and device
CN111950606B (en) Knife switch state identification method, device, equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20171117