CN106780809A - Punch card method based on water purifier - Google Patents
Punch card method based on water purifier
- Publication number
- CN106780809A (application CN201611112071.6A)
- Authority
- CN
- China
- Prior art keywords
- point
- user
- image
- video camera
- key point
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G07—CHECKING-DEVICES
- G07C—TIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
- G07C1/00—Registering, indicating or recording the time of events or elapsed time, e.g. time-recorders for work people
- G07C1/10—Registering, indicating or recording the time of events or elapsed time, e.g. time-recorders for work people together with the recording, indicating or registering of other data, e.g. of signs of identity
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/161—Detection; Localisation; Normalisation
- G06V40/166—Detection; Localisation; Normalisation using acquisition arrangements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/168—Feature extraction; Face representation
- G06V40/171—Local features and components; Facial parts ; Occluding parts, e.g. glasses; Geometrical relationships
Landscapes
- Engineering & Computer Science (AREA)
- Health & Medical Sciences (AREA)
- Oral & Maxillofacial Surgery (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- General Health & Medical Sciences (AREA)
- Human Computer Interaction (AREA)
- Multimedia (AREA)
- Theoretical Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Image Analysis (AREA)
- Image Processing (AREA)
Abstract
The invention discloses a punch card method based on a water purifier, comprising a controller, a memory, an infrared temperature sensor, a first video camera and a second video camera for infrared thermal imaging, all arranged on the water purifier, together with APP software installed on the office computer of each user; the controller is electrically connected with the infrared temperature sensor, the first video camera, the second video camera, the memory and a server respectively. When a user approaches the water purifier, the human body signal detected by the infrared temperature sensor is obtained; the first video camera and the second video camera collect images of the user. A database containing the feature point sets and key point sets of all users is stored in the memory; the controller identifies and matches key points and feature points, finally identifies the user and performs attendance processing. The invention has the characteristics of high recognition rate, strong applicability and low cost, and improves the convenience of management.
Description
Technical field
The present invention relates to the field of intelligent identification technology, and more particularly to a punch card method based on a water purifier which is difficult to cheat, has a high recognition rate and is low in cost.
Background technology
An intelligent punch card management system is a management system for records such as the clock-in and clock-out times of a company's employees. It is a product combining punch card software with punch card hardware, is generally used by HR departments, and serves to grasp and manage the attendance dynamics of an enterprise's employees.
Conventional intelligent punch card systems include fingerprint punch card systems and face recognition punch card systems, but they have the following disadvantages:
An intelligent face recognition punch card system cannot effectively distinguish two people with similar appearances. The success rate of face recognition is limited by many factors; for example, a change in body shape that alters the face shape can cause recognition errors, and changing a hairstyle or wearing a hat may also cause recognition failure. If user identification information changes, the stored data must be replaced manually, which increases labor cost.
Fingerprint recognition requires clean fingers; water stains or oil stains can make a fingerprint unrecognizable, and fingerprint recognition places high requirements on the integrity of the recorded fingerprint. Fingerprint recognition is also easy to substitute, and many fingerprint covers on the market can be used to check card on someone else's behalf. Iris recognition has a higher resolution, but existing iris recognition punch card systems are relatively costly and cannot be widely popularized.
The content of the invention
The object of the invention is to overcome the deficiencies of prior art punch card methods, which are easy to cheat and costly, and to provide a punch card method based on a water purifier which is difficult to cheat, has a high recognition rate and is low in cost.
To achieve this goal, the present invention uses the following technical scheme:
A punch card method based on a water purifier, comprising a controller, a memory, an infrared temperature sensor, a first video camera and a second video camera for infrared thermal imaging, all arranged on the water purifier, and APP software installed on the office computer of each user; the controller is electrically connected with the infrared temperature sensor, the first video camera, the second video camera, the memory and a server respectively. The method comprises the following steps:
(1-1) An on-duty time and an off-duty time are set in the memory. When a user arrives at the work position each day and turns on the computer, the APP software reminds the user to go to the water purifier to check card; when the user clicks the button to shut down the computer at the end of the day, the APP software again reminds the user to go to the water purifier to check card.
(1-2) Each time a user approaches the water purifier, the controller obtains the human body signal detected by the infrared temperature sensor; the controller then starts the first video camera and the second video camera, and the two cameras collect images of the user.
(1-3) A database containing the feature point sets and key point sets of all registered users is stored in the memory. The controller obtains each feature point of the user from the image captured by the first video camera, compares the user's feature points with the feature point sets of all users in the database, and selects the correctly matched feature points.
The controller obtains each key point of the user from the image captured by the second video camera, compares the user's key points with the key point sets of all users in the database, and selects the correctly matched key points.
(1-4) A recognition rate γ1 is calculated from n, N and K, where n is the total number of correctly matched feature points and key points, N is the total number of set feature points and key points, and K is the characteristic value of each feature point and key point.
When γ1 > γ, the controller judges that the match is successful, finds the user name corresponding to γ1 in the database, and passes the user name to the server; the server stores the current time, the recognition rate γ1 and the user name. γ is the standard recognition rate.
(1-5) For each user, the server takes the time of the first recognition in a day as the clock-in time and the time of the last recognition in a day as the clock-out time, compares the clock-in time and clock-out time with the on-duty time and off-duty time respectively, calculates whether the user is late, leaves early or works overtime each day, and stores the result in the server.
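The exact γ1 formula of step (1-4) appears only as an image in the original filing, so the sketch below is a rough illustration only: it assumes a plain matched/total ratio (ignoring K) and uses the γ = 90% threshold given in the embodiment. The function names and the simplified formula are assumptions, not the patent's definition.

```python
# Hypothetical sketch of steps (1-4): the patent's gamma_1 formula is not
# reproduced in the source text, so a plain matched/total ratio is assumed.

STANDARD_RATE = 0.90  # gamma, the "standard recognition rate" (90% in the embodiment)

def recognition_rate(n_matched: int, n_total: int) -> float:
    """Assumed stand-in for gamma_1: fraction of correctly matched
    feature/key points among all set points."""
    return n_matched / n_total if n_total else 0.0

def identify(n_matched: int, n_total: int) -> bool:
    """The match is declared successful only when gamma_1 exceeds gamma."""
    return recognition_rate(n_matched, n_total) > STANDARD_RATE

print(identify(29, 30))  # 0.966... > 0.9 -> True
print(identify(25, 30))  # 0.833... -> False
```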
The present invention realizes user identity recognition and the punch card function on an intelligent water purifier. When a user approaches the water purifier, the human body signal detected by the infrared temperature sensor is obtained; the first video camera and the second video camera collect images of the user. A database containing the feature point sets and key point sets of all registered users is stored in the memory; the controller identifies and matches key points and feature points, finally identifies the user and records attendance automatically.
The combination of the identification system with the water purifier is based on a month-long market survey. The survey found that, during the average daily working time of business people such as employees and white-collar workers, a stay of 2 minutes at the doorway occurs about 6 times, a stay of 2 minutes in front of the punch card machine occurs about 2 times, a stay of 16 minutes beside the water purifier occurs about 8 times, and a stay of 6 hours at the workstation occurs about 10 times. The survey data show that, apart from time spent at the workstation, both the number of stays and the dwell time in front of the water purifier account for a high proportion of daily off-station activity. Combining the intelligent identification system with the water purifier can therefore effectively increase the recognition opportunities and reduce manual maintenance cost.
The present invention improves the convenience of management work, reduces manual maintenance cost and improves the user experience. It has a high resolution and can reach a recognition accuracy of 99 percent within its range of application. The present invention requires no deliberate identification operation such as pressing a sensor; intelligent recognition is achieved simply when the user fetches water each morning, so recognition is more convenient. Compared with high-resolution iris recognition, the water purifier intelligent identification system has the advantages of low cost and a wide range of application, and is more convenient for business occasions such as companies.
Preferably, when γ1 ≤ γ, the controller judges that the user is a non-registered user.
The controller sends each feature point obtained from the image captured by the first video camera and each key point obtained from the image captured by the second video camera to the server; the server generates a number for the non-registered user and stores the number of the non-registered user in association with the current time, each feature point and each key point.
Preferably, the range covered by the key points is the user's face, bounded above by the hairline, below by the lowest point of the chin, and on the left and right by the ear edge points. The range comprises 7 regions: a forehead region, a left eye region, a right eye region, a nose region, a left cheek region, a right cheek region and a nose-chin region. The key points in the left eye region and the right eye region are chosen symmetrically, and the key points in the left cheek region and the right cheek region are chosen symmetrically.
Preferably, each feature point is located in the facial triangle region, and there are 30 feature points.
Preferably, the controller obtaining each feature point of the user from the image captured by the first video camera, comparing the user's feature points with the feature point sets of all users in the database, and selecting the correctly matched feature points comprises the following steps:
(5-1) For the image I(x, y) captured by the first video camera, the formulas
G(i) = |[f(i-1, j-1) + f(i-1, j) + f(i-1, j+1)] - [f(i+1, j-1) + f(i+1, j) + f(i+1, j+1)]| and
G(j) = |[f(i-1, j+1) + f(i, j+1) + f(i+1, j+1)] - [f(i-1, j-1) + f(i, j-1) + f(i+1, j-1)]|
are used to calculate the neighborhood convolutions G(i) and G(j) of each pixel (i, j) in the image I(x, y).
Setting P(i, j) = max[G(i), G(j)], P(i, j) is selected as an image edge point.
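Step (5-1) is essentially a Prewitt-style gradient test: each direction sums three neighbors on one side of the pixel and subtracts the three on the other side. A minimal sketch, assuming a plain 2-D grayscale array and ignoring image borders (the patent does not specify a threshold for declaring an edge point):

```python
import numpy as np

def edge_response(f: np.ndarray, i: int, j: int) -> float:
    """P(i, j) = max(G(i), G(j)) from step (5-1): absolute difference of the
    row sums above/below and of the column sums right/left of pixel (i, j)."""
    g_i = abs((f[i-1, j-1] + f[i-1, j] + f[i-1, j+1])
              - (f[i+1, j-1] + f[i+1, j] + f[i+1, j+1]))
    g_j = abs((f[i-1, j+1] + f[i, j+1] + f[i+1, j+1])
              - (f[i-1, j-1] + f[i, j-1] + f[i+1, j-1]))
    return max(g_i, g_j)

# A vertical step edge: left columns dark, right columns bright.
img = np.zeros((5, 5))
img[:, 3:] = 10.0
print(edge_response(img, 2, 2))  # window straddles the edge -> 30.0
print(edge_response(img, 2, 1))  # window in the flat region -> 0.0
```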
(5-2) For the image I(x, y) captured by the first video camera, the formula L(x, y, σ) = g(x, y, σ) × I(x, y) is used to build the scale space image L(x, y, σ), where g(x, y, σ) is the scale-variable Gaussian function g(x, y, σ) = (1/(2πσ²)) e^(-(x²+y²)/(2σ²)), (x, y) are the space coordinates, and σ is the image smoothness.
(5-3) The formula D(x, y, σ) = (g(x, y, kσ) - g(x, y, σ)) × I(x, y) = L(x, y, kσ) - L(x, y, σ) is used to calculate the Gaussian difference scale space D(x, y, σ); k is the constant multiple between adjacent scale spaces.
For the image I(x, y), s layers of sub-octave images whose length and width are successively halved are built, where the first-layer sub-octave image is the original image.
(5-4) The value D(x, y, σ) of each pixel is compared with the D(x, y, σ) values of its adjacent pixels; if the D(x, y, σ) of the pixel is the maximum or minimum value in its neighborhood in this layer and the layers above and below, the pixel is taken as a feature point.
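Steps (5-2) through (5-4) follow the difference-of-Gaussians scheme familiar from SIFT. A minimal sketch of the extremum test, assuming the blurred copies of the image are already available and checking a pixel against its 26 neighbors across three adjacent DoG layers (the toy values below are illustrative, not the patent's):

```python
import numpy as np

def dog_stack(blurred: list) -> np.ndarray:
    """Step (5-3): given copies of the image blurred with increasing sigma
    (step 5-2), form D = L(k*sigma) - L(sigma) for each adjacent pair."""
    return np.stack([blurred[i + 1] - blurred[i]
                     for i in range(len(blurred) - 1)])

def is_extremum(dog: np.ndarray, layer: int, i: int, j: int) -> bool:
    """Step (5-4): (i, j) is a feature point candidate when its D value is
    the maximum or minimum of the 3x3x3 cube spanning its own layer and
    the layers directly above and below."""
    cube = dog[layer - 1:layer + 2, i - 1:i + 2, j - 1:j + 2]
    center = dog[layer, i, j]
    return center == cube.max() or center == cube.min()

# Toy stack: three DoG layers with a single strong negative response.
dog = np.zeros((3, 5, 5))
dog[1, 2, 2] = -5.0
print(is_extremum(dog, 1, 2, 2))  # True: -5 is the cube minimum
dog[0, 2, 2] = -9.0
print(is_extremum(dog, 1, 2, 2))  # False: a lower value sits in the layer below
```

A full detector would also halve the image per octave (the sub-octave images of step (5-3)) and discard low-contrast and edge responses, as step (5-5) suggests.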
(5-5) A DoG image composed of the selected feature points is obtained, and low-pass filtering is applied to the DoG image; each point other than the edge points in the DoG image is removed, giving a two-dimensional point map.
(5-6) The formulas
m(x, y) = √((L(x+1, y) - L(x-1, y))² + (L(x, y+1) - L(x, y-1))²) and
θ(x, y) = arctan((L(x, y+1) - L(x, y-1)) / (L(x+1, y) - L(x-1, y)))
are used to calculate the modulus m(x, y) and angle θ(x, y) of each feature point, and the scale of each feature point is set to the layer number of the sub-octave image where it lies; L(x+1, y) is the value at (x+1, y) at the scale of the feature point. The modulus, angle and scale of each feature point are set as its feature 1, feature 2 and feature 3.
(5-7) The 3 features of each feature point A1 are compared with the 3 features of every feature point of all the feature point sets in the database, and the feature point B1 closest to A1 and the second-closest feature point C1 are found in the feature point sets.
The difference between feature 1 of feature points A1 and B1 is set as a11, and the difference between feature 1 of A1 and C1 as b11;
the difference between feature 2 of A1 and B1 is set as a12, and the difference between feature 2 of A1 and C1 as b12;
the difference between feature 3 of A1 and B1 is set as a13, and the difference between feature 3 of A1 and C1 as b13.
When a11/b11 < Ratio and a12/b12 < Ratio and a13/b13 < Ratio, where Ratio is the set ratio threshold, feature point B1 is selected as a correct match point.
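The comparisons of step (5-7) resemble Lowe's nearest/second-nearest ratio test applied per feature. A minimal sketch, assuming each point is a 3-vector (modulus, angle, scale), that the inequalities elided in the source read a/b < Ratio, and using the Ratio = 0.4 value given in the embodiment; all of these are assumptions layered on the patent's outline:

```python
import numpy as np

RATIO = 0.4  # the "Ratio" threshold from the embodiment

def select_match(a1: np.ndarray, candidates: np.ndarray):
    """Step (5-7), under the assumption that the elided conditions read
    a1k/b1k < Ratio per feature: find the closest (B1) and second-closest
    (C1) candidates, then accept B1 only when, feature by feature, its
    difference from A1 is well below C1's."""
    dists = np.linalg.norm(candidates - a1, axis=1)
    b_idx, c_idx = np.argsort(dists)[:2]
    a = np.abs(candidates[b_idx] - a1)   # a11, a12, a13
    b = np.abs(candidates[c_idx] - a1)   # b11, b12, b13
    if np.all(a < RATIO * b):
        return int(b_idx)
    return None  # ambiguous match: no point selected

feats = np.array([[10.0, 0.5, 2.0],   # clearly closest candidate
                  [14.0, 1.8, 3.0]])
print(select_match(np.array([10.1, 0.55, 2.05]), feats))  # 0
print(select_match(np.array([12.0, 1.1, 2.5]), feats))    # None (ambiguous)
```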
Preferably, the controller obtaining each key point of the user from the image captured by the second video camera, comparing the user's key points with the key point sets of all users in the database, and selecting the correctly matched key points comprises the following steps:
(6-1) Let f(i, j) be the gray value of point (i, j) in the image captured by the second video camera. A window of N′ × N′ centered on point (i, j) is taken in the image, and the set of pixels in the window is set as A′; the window is filtered with the formula g(i, j) = Med{f(i, j) : (i, j) ∈ A′} to obtain the denoised image g(i, j).
(6-2) The N′ × N′ window is slid over the image, the gray values of all pixels in the window are arranged in ascending order, and the gray value in the middle of the arrangement is taken as the gray value of the window's center pixel.
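Steps (6-1) and (6-2) describe a standard median filter. A minimal sketch with a 3 × 3 window (N′ = 3), using edge replication at the borders as one common convention the patent does not specify:

```python
import numpy as np

def median_filter(f: np.ndarray, n: int = 3) -> np.ndarray:
    """Steps (6-1)/(6-2): slide an n x n window over the image and replace
    each center pixel with the median (middle value of the sorted window)."""
    pad = n // 2
    padded = np.pad(f, pad, mode="edge")  # border handling is an assumption
    g = np.empty_like(f)
    for i in range(f.shape[0]):
        for j in range(f.shape[1]):
            g[i, j] = np.median(padded[i:i + n, j:j + n])
    return g

# A single salt-noise pixel in a flat image is removed entirely.
img = np.full((5, 5), 10.0)
img[2, 2] = 255.0
out = median_filter(img)
print(out[2, 2])  # 10.0
```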
(6-3) An edge detection formula is applied to the image f(x, y) to obtain the edge points h(x, y).
(6-4) For the image f(x, y) captured by the second video camera, the formula L′(x, y, σ) = g(x, y, σ) × f(x, y) is used to build the scale space image L′(x, y, σ), where g(x, y, σ) is the scale-variable Gaussian function g(x, y, σ) = (1/(2πσ²)) e^(-(x²+y²)/(2σ²)), (x, y) are the space coordinates, and σ is the image smoothness.
(6-5) The formula D′(x, y, σ) = (g(x, y, kσ) - g(x, y, σ)) × f(x, y) = L′(x, y, kσ) - L′(x, y, σ) is used to calculate the Gaussian difference scale space D′(x, y, σ).
For the image f(x, y), s layers of sub-octave images whose length and width are successively halved are built, where the first-layer sub-octave image is the original image.
(6-6) The value D′(x, y, σ) of each pixel is compared with the D′(x, y, σ) values of its adjacent pixels; if the D′(x, y, σ) of the pixel is the maximum or minimum value in its neighborhood in this layer and the layers above and below, the pixel is taken as a key point.
(6-7) A DoG image composed of the selected key points is obtained, and low-pass filtering is applied to the DoG image; each point other than the edge points in the DoG image is removed, giving a two-dimensional point map.
(6-8) The formulas
m(x, y) = √((L(x+1, y) - L(x-1, y))² + (L(x, y+1) - L(x, y-1))²) and
θ(x, y) = arctan((L(x, y+1) - L(x, y-1)) / (L(x+1, y) - L(x-1, y)))
are used to calculate the modulus m(x, y) and angle θ(x, y) of each key point; L(x+1, y) is the value at (x+1, y) at the scale of the key point. The modulus, angle and scale of each key point are set as its feature 1, feature 2 and feature 3.
(6-9) The 3 features of each key point A2 are compared with the 3 features of every key point of all the key point sets in the database, and the key point B2 closest to A2 and the second-closest key point C2 are found in the key point sets.
The difference between feature 1 of key points A2 and B2 is set as a21, and the difference between feature 1 of A2 and C2 as b21;
the difference between feature 2 of A2 and B2 is set as a22, and the difference between feature 2 of A2 and C2 as b22;
the difference between feature 3 of A2 and B2 is set as a23, and the difference between feature 3 of A2 and C2 as b23.
When a21/b21 < Ratio and a22/b22 < Ratio and a23/b23 < Ratio, where Ratio is the set ratio threshold, key point B2 is selected as a correct match point.
Preferably, before step (5-1), I(x, y) is processed as follows:
Using the formulas
R = 1.164(Y - 16) + 1.596(Cr - 128)
G = 1.164(Y - 16) - 0.813(Cr - 128) - 0.392(Cb - 128)
B = 1.164(Y - 16) + 2.017(Cb - 128)
the I(x, y) in YCrCb form is converted into an RGB color image;
the RGB color image is converted into a black-and-white image using the formula Gray = 0.299R + 0.587G + 0.114B, where R is the red component, G is the green component, and B is the blue component.
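The two conversions above are the standard ITU-R BT.601 "studio-range" YCbCr-to-RGB transform followed by the usual luma weighting. A minimal per-pixel sketch, assuming the grayscale weights are the standard 0.299/0.587/0.114 set:

```python
def ycbcr_to_gray(y: float, cb: float, cr: float) -> float:
    """BT.601 YCbCr (16..235 luma / 16..240 chroma range) -> RGB -> gray,
    following the patent's two conversion formulas."""
    r = 1.164 * (y - 16) + 1.596 * (cr - 128)
    g = 1.164 * (y - 16) - 0.813 * (cr - 128) - 0.392 * (cb - 128)
    b = 1.164 * (y - 16) + 2.017 * (cb - 128)
    # Standard luma weights (assumed): they sum to 1, so neutral gray is preserved.
    return 0.299 * r + 0.587 * g + 0.114 * b

# Neutral chroma (Cb = Cr = 128) leaves gray proportional to Y alone.
print(round(ycbcr_to_gray(126, 128, 128), 2))  # 1.164 * 110 = 128.04
```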
Therefore, the present invention has the following advantages: a high recognition rate, strong applicability, low cost, and improved convenience of management.
Brief description of the drawings
Fig. 1 is a flow chart of the invention;
Fig. 2 is a key point set diagram of the invention;
Fig. 3 is a feature point set diagram of the invention.
Specific embodiment
The present invention will be further described with reference to the accompanying drawings and detailed description.
The embodiment shown in Fig. 1 is a punch card method based on a water purifier, comprising a controller, a memory, an infrared temperature sensor, a first video camera and a second video camera for infrared thermal imaging, all arranged on the water purifier; the controller is electrically connected with the infrared temperature sensor, the first video camera, the second video camera, the memory and a server respectively. The method comprises the following steps:
Step 100, punch card reminder
An on-duty time and an off-duty time are set in the memory. When a user arrives at the work position each day and turns on the computer, the APP software reminds the user to go to the water purifier to check card; when the user clicks the button to shut down the computer at the end of the day, the APP software again reminds the user to go to the water purifier to check card.
Step 200, human detection and image acquisition
Each time a user approaches the water purifier, the controller obtains the human body signal detected by the infrared temperature sensor; the controller then starts the first video camera and the second video camera, and the two cameras collect images of the user.
Step 300, identification and matching of feature points and key points
A database containing the feature point sets and key point sets of all registered users is stored in the memory. The controller obtains each feature point of the user from the image captured by the first video camera, compares the user's feature points with the feature point sets of all users in the database, and selects the correctly matched feature points.
The specific steps are as follows:
Step 310, the controller obtains each feature point of the user from the image captured by the first video camera.
The image I(x, y) captured by the first video camera is processed as follows:
Using the formulas
R = 1.164(Y - 16) + 1.596(Cr - 128)
G = 1.164(Y - 16) - 0.813(Cr - 128) - 0.392(Cb - 128)
B = 1.164(Y - 16) + 2.017(Cb - 128)
the I(x, y) in YCrCb form is converted into an RGB color image;
the RGB color image is converted into a black-and-white image using the formula Gray = 0.299R + 0.587G + 0.114B, where R is the red component, G is the green component, and B is the blue component.
Step 311, for the image I(x, y) captured by the first video camera, the formulas
G(i) = |[f(i-1, j-1) + f(i-1, j) + f(i-1, j+1)] - [f(i+1, j-1) + f(i+1, j) + f(i+1, j+1)]| and
G(j) = |[f(i-1, j+1) + f(i, j+1) + f(i+1, j+1)] - [f(i-1, j-1) + f(i, j-1) + f(i+1, j-1)]|
are used to calculate the neighborhood convolutions G(i) and G(j) of each pixel (i, j) in the image I(x, y).
Setting P(i, j) = max[G(i), G(j)], P(i, j) is selected as an image edge point.
Step 312, for the image I(x, y) captured by the first video camera, the formula L(x, y, σ) = g(x, y, σ) × I(x, y) is used to build the scale space image L(x, y, σ), where g(x, y, σ) is the scale-variable Gaussian function g(x, y, σ) = (1/(2πσ²)) e^(-(x²+y²)/(2σ²)), (x, y) are the space coordinates, and σ is the image smoothness.
Step 313, the formula D(x, y, σ) = (g(x, y, kσ) - g(x, y, σ)) × I(x, y) = L(x, y, kσ) - L(x, y, σ) is used to calculate the Gaussian difference scale space D(x, y, σ); k is the constant multiple between adjacent scale spaces.
For the image I(x, y), s layers of sub-octave images whose length and width are successively halved are built, where the first-layer sub-octave image is the original image.
Step 314, the value D(x, y, σ) of each pixel is compared with the D(x, y, σ) values of its adjacent pixels; if the D(x, y, σ) of the pixel is the maximum or minimum value in its neighborhood in this layer and the layers above and below, the pixel is taken as a feature point.
Step 315, a DoG image composed of the selected feature points is obtained, and low-pass filtering is applied to the DoG image; each point other than the edge points in the DoG image is removed, giving a two-dimensional point map.
Step 316, the formulas
m(x, y) = √((L(x+1, y) - L(x-1, y))² + (L(x, y+1) - L(x, y-1))²) and
θ(x, y) = arctan((L(x, y+1) - L(x, y-1)) / (L(x+1, y) - L(x-1, y)))
are used to calculate the modulus m(x, y) and angle θ(x, y) of each feature point, and the scale of each feature point is set to the layer number of the sub-octave image where it lies; L(x+1, y) is the value at (x+1, y) at the scale of the feature point. The modulus, angle and scale of each feature point are set as its feature 1, feature 2 and feature 3.
Step 317, the 3 features of each feature point A1 are compared with the 3 features of every feature point of all the feature point sets in the database, and the feature point B1 closest to A1 and the second-closest feature point C1 are found in the feature point sets.
The difference between feature 1 of feature points A1 and B1 is set as a11, and the difference between feature 1 of A1 and C1 as b11;
the difference between feature 2 of A1 and B1 is set as a12, and the difference between feature 2 of A1 and C1 as b12;
the difference between feature 3 of A1 and B1 is set as a13, and the difference between feature 3 of A1 and C1 as b13.
When a11/b11 < Ratio and a12/b12 < Ratio and a13/b13 < Ratio, where Ratio is the set ratio threshold, feature point B1 is selected as a correct match point.
Step 320, key point identification and matching
The controller obtains each key point of the user from the image captured by the second video camera, compares the user's key points with the key point sets of all users in the database, and selects the correctly matched key points.
The specific steps are as follows:
Step 321, let f(i, j) be the gray value of point (i, j) in the image captured by the second video camera. A window of N′ × N′ centered on point (i, j) is taken in the image, and the set of pixels in the window is set as A′; the window is filtered with the formula g(i, j) = Med{f(i, j) : (i, j) ∈ A′} to obtain the denoised image g(i, j).
Step 322, the N′ × N′ window is slid over the image, the gray values of all pixels in the window are arranged in ascending order, and the gray value in the middle of the arrangement is taken as the gray value of the window's center pixel.
Step 323, an edge detection formula is applied to the image f(x, y) to obtain the edge points h(x, y).
Step 324, for the image f(x, y) captured by the second video camera, the formula L′(x, y, σ) = g(x, y, σ) × f(x, y) is used to build the scale space image L′(x, y, σ), where g(x, y, σ) is the scale-variable Gaussian function g(x, y, σ) = (1/(2πσ²)) e^(-(x²+y²)/(2σ²)), (x, y) are the space coordinates, and σ is the image smoothness.
Step 325, the formula D′(x, y, σ) = (g(x, y, kσ) - g(x, y, σ)) × f(x, y) = L′(x, y, kσ) - L′(x, y, σ) is used to calculate the Gaussian difference scale space D′(x, y, σ).
For the image f(x, y), s layers of sub-octave images whose length and width are successively halved are built, where the first-layer sub-octave image is the original image.
Step 326, the value D′(x, y, σ) of each pixel is compared with the D′(x, y, σ) values of its adjacent pixels; if the D′(x, y, σ) of the pixel is the maximum or minimum value in its neighborhood in this layer and the layers above and below, the pixel is taken as a key point.
Step 327, a DoG image composed of the selected key points is obtained, and low-pass filtering is applied to the DoG image; each point other than the edge points in the DoG image is removed, giving a two-dimensional point map.
Step 328, the formulas
m(x, y) = √((L(x+1, y) - L(x-1, y))² + (L(x, y+1) - L(x, y-1))²) and
θ(x, y) = arctan((L(x, y+1) - L(x, y-1)) / (L(x+1, y) - L(x-1, y)))
are used to calculate the modulus m(x, y) and angle θ(x, y) of each key point; L(x+1, y) is the value at (x+1, y) at the scale of the key point. The modulus, angle and scale of each key point are set as its feature 1, feature 2 and feature 3.
Step 329, the 3 features of each key point A2 are compared with the 3 features of every key point of all the key point sets in the database, and the key point B2 closest to A2 and the second-closest key point C2 are found in the key point sets.
The difference between feature 1 of key points A2 and B2 is set as a21, and the difference between feature 1 of A2 and C2 as b21;
the difference between feature 2 of A2 and B2 is set as a22, and the difference between feature 2 of A2 and C2 as b22;
the difference between feature 3 of A2 and B2 is set as a23, and the difference between feature 3 of A2 and C2 as b23.
When a21/b21 < Ratio and a22/b22 < Ratio and a23/b23 < Ratio, where Ratio is the set ratio threshold, key point B2 is selected as a correct match point.
Step 400, identifying the user
The recognition rate γ1 is calculated from n, N and K, where n is the total number of correctly matched feature points and key points, N is the total number of set feature points and key points, and K is the characteristic value of each feature point and key point; γ = 90%.
When γ1 > γ, the controller judges that the match is successful, finds the user name corresponding to γ1 in the database and passes the user name to the server; the server stores the current time, the recognition rate γ1 and the user name.
Step 500, attendance processing
For each user, the server takes the time of the first recognition in a day as the clock-in time and the time of the last recognition in a day as the clock-out time, compares the clock-in time and clock-out time with the on-duty time and off-duty time respectively, calculates whether the user is late, leaves early or works overtime each day, and stores the result in the server.
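Step 500 reduces to comparing the first and last recognition timestamps of a day against the configured on-duty and off-duty times. A minimal sketch; the field names, the example duty times, and the reading that overtime means leaving after the off-duty time are assumptions, since the patent does not spell them out:

```python
from datetime import time

ON_DUTY = time(9, 0)    # the on-duty time stored in the memory (example value)
OFF_DUTY = time(18, 0)  # the off-duty time stored in the memory (example value)

def attendance(recognitions: list) -> dict:
    """Step 500: first recognition of the day is the clock-in time, last is
    the clock-out time; compare both against the configured duty times."""
    clock_in, clock_out = min(recognitions), max(recognitions)
    return {
        "clock_in": clock_in,
        "clock_out": clock_out,
        "late": clock_in > ON_DUTY,
        "left_early": clock_out < OFF_DUTY,
        "overtime": clock_out > OFF_DUTY,  # assumed reading of "works overtime"
    }

day = [time(9, 12), time(12, 30), time(18, 41)]
print(attendance(day)["late"], attendance(day)["overtime"])  # True True
```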
As shown in Fig. 2, the range covered by the facial key points is the user's face, bounded above by the hairline, below by the lowest point of the chin, and on the left and right by the ear edge points; these four directions form the maximum facial frame. Data sampling is carried out within the maximum facial frame region, and 86 key points are taken outside the feature point region: the forehead part, from the hairline down to the feature region, takes 22 key points (5 points on the upper hairline, 5 points on the upper eyebrow edge, 5 points on each of the left and right hairlines, and 2 points in the middle of the forehead with reference to the cross); the left cheek part takes 26 key points (16 points on the ear boundary line, 8 points on the side face boundary, and 2 points in the middle of the cheek below the corner of the eye); the right cheek part takes 26 key points (16 points on the ear boundary line, 8 points on the side face boundary, and 2 points in the middle of the cheek below the corner of the eye); and the chin part takes 12 key points (6 points on the chin border, 2 points on the lip border, and the chengjiang point with 3 surrounding points).
As shown in Fig. 3, the facial feature points are located in the triangle region formed by the two mid-eyebrow points and the chengjiang point of the facial triangle region, and in the region from the two shoulder borders to the face border; they comprise 16 eye feature points, 4 mouth feature points, 4 nose feature points, 4 forehead feature points and the face-shoulder points, 30 points in total. Ratio is 0.4.
It should be understood that this embodiment is only illustrative of the invention and is not intended to limit the scope of the invention. In addition, it should be understood that, after reading the teaching of the present invention, those skilled in the art can make various changes or modifications to the invention, and these equivalent forms likewise fall within the scope defined by the claims appended to this application.
Claims (7)
1. a kind of punch card method based on water purifier, it is characterized in that, including it is controller on water purifier, memory, infrared
Temperature sensor, the first video camera and the second video camera for infrared thermal imaging;APP on the office computer of user is soft
Part, controller is electrically connected with infrared temperature sensor, the first video camera, the second video camera, memory and server respectively;Including
Following steps:
Working moment and the next moment is provided with (1-1) memory, when user comes work position daily, opens computer, APP
Software reminds user to go at water purifier to check card;When user comes off duty clicks on the button for closing computer, APP softwares remind user to remove
Checked card at hydrophone;
When (1-2) user is close to water purifier every time, controller obtains the human body signal of infrared temperature sensor detection;Controller control
Make the first video camera and the start-up operation of the second video camera, the first video camera and the second camera acquisition user images;
(1-3) A database containing the feature point sets and key point sets of all registered users is stored in the memory; the controller extracts the user's feature points from the image captured by the first camera and compares them with the feature point sets of all users in the database, selecting the correctly matched feature points; the controller extracts the user's key points from the image captured by the second camera and compares them with the key point sets of all users in the database, selecting the correctly matched key points;
(1-4) The discrimination γ1 is calculated according to the set formula, where n is the total number of correctly matched feature points and key points, N is the set total number of feature points and key points, and K is the characteristic value of each feature point and each key point;
when γ1 > γ, the controller judges that the match is successful, finds in the database the user name corresponding to γ1, and passes the user name to the server; the server stores the current time, the discrimination γ1, and the user name; γ is the standard discrimination;
(1-5) The server takes the time at which each user is first recognized in a day as that user's clock-in time and the time at which the user is last recognized in the day as the clock-out time, compares the clock-in and clock-out times with the work start time and the work end time respectively, calculates daily whether the user is late, leaves early, or works overtime, and stores the results in the server.
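A minimal Python sketch of the attendance logic in step (1-5). The function name and the 09:00/17:00 work times are illustrative assumptions, not part of the claim; times are expressed as minutes since midnight.

```python
# Sketch of step (1-5): derive clock-in/clock-out from the first and last
# recognition events of a day, then flag lateness, early leave and overtime.
WORK_START = 9 * 60   # assumed "work start time": 09:00
WORK_END = 17 * 60    # assumed "work end time": 17:00

def daily_attendance(recognition_times):
    """recognition_times: minutes-of-day at which the user was recognized."""
    clock_in = min(recognition_times)    # first recognition of the day
    clock_out = max(recognition_times)   # last recognition of the day
    return {
        "clock_in": clock_in,
        "clock_out": clock_out,
        "late": clock_in > WORK_START,
        "left_early": clock_out < WORK_END,
        "overtime": max(0, clock_out - WORK_END),  # minutes past work end
    }

# Recognized at 09:12, 12:00 and 18:30 -> late, not early, 90 min overtime.
record = daily_attendance([9 * 60 + 12, 12 * 60, 18 * 60 + 30])
```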
2. The punch card method based on a water purifier according to claim 1, characterized in that when γ1 ≤ γ, the controller judges that the user is an unregistered user;
the controller sends the feature points extracted from the image captured by the first camera and the key points extracted from the image captured by the second camera to the server; the server generates a number for the unregistered user and stores the number, the current time, the feature points, and the key points in association.
3. The punch card method based on a water purifier according to claim 1, characterized in that the key points lie within the region of the user's face bounded above by the hairline, below by the lowest point of the chin, and on the left and right by the ear edge points; this region comprises 7 sub-regions: the forehead region, the left-eye region, the right-eye region, the nose region, the left-cheek region, the right-cheek region, and the nose-chin region; the key points of the left-eye and right-eye regions are chosen symmetrically, as are the key points of the left-cheek and right-cheek regions.
4. The punch card method based on a water purifier according to claim 1, characterized in that the feature points are located in the facial triangle area and number 30 in total.
5. The punch card method based on a water purifier according to claim 1, characterized in that the controller's extraction of the user's feature points from the image captured by the first camera, comparison of the feature points with the feature point sets of all users in the database, and selection of the correctly matched feature points comprise the following steps:
(5-1) For the image I(x, y) captured by the first camera, the neighborhood convolutions G(i) and G(j) of each pixel (i, j) are calculated using the formulas
G(i) = |[f(i−1, j−1) + f(i−1, j) + f(i−1, j+1)] − [f(i+1, j−1) + f(i+1, j) + f(i+1, j+1)]| and
G(j) = |[f(i−1, j+1) + f(i, j+1) + f(i+1, j+1)] − [f(i−1, j−1) + f(i, j−1) + f(i+1, j−1)]|;
setting P(i, j) = max[G(i), G(j)], the points selected by P(i, j) are taken as image edge points;
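A small Python sketch of the row/column difference sums in step (5-1), a Prewitt-style operator. The threshold used to select edge points is an assumption, since the claim does not state one.

```python
# Sketch of step (5-1): G(i) and G(j) are absolute differences of 3-pixel row
# and column sums around (i, j); the edge response is P(i,j) = max(G(i), G(j)).
def edge_response(f, i, j):
    gi = abs((f[i - 1][j - 1] + f[i - 1][j] + f[i - 1][j + 1])
             - (f[i + 1][j - 1] + f[i + 1][j] + f[i + 1][j + 1]))
    gj = abs((f[i - 1][j + 1] + f[i][j + 1] + f[i + 1][j + 1])
             - (f[i - 1][j - 1] + f[i][j - 1] + f[i + 1][j - 1]))
    return max(gi, gj)

def edge_points(f, threshold):
    """Interior pixels whose edge response exceeds the (assumed) threshold."""
    h, w = len(f), len(f[0])
    return [(i, j) for i in range(1, h - 1) for j in range(1, w - 1)
            if edge_response(f, i, j) > threshold]

# A tiny image with a vertical brightness step between columns 1 and 2:
img = [[0, 0, 9, 9],
       [0, 0, 9, 9],
       [0, 0, 9, 9],
       [0, 0, 9, 9]]
edges = edge_points(img, threshold=10)
```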
(5-2) For the image I(x, y) captured by the first camera, the scale-space image L(x, y, σ) = g(x, y, σ) × I(x, y) is constructed, where g(x, y, σ) = (1/(2πσ²)) e^(−(x²+y²)/(2σ²)) is the scale-variable Gaussian function, (x, y) are the spatial coordinates, and σ is the image smoothness;
(5-3) The difference-of-Gaussian scale space D(x, y, σ) = (g(x, y, kσ) − g(x, y, σ)) × I(x, y) = L(x, y, kσ) − L(x, y, σ) is calculated, k being the constant multiple between adjacent scale spaces;
for each pixel in the image I(x, y), s layers of sub-octave images whose length and width are successively halved are built, the first sub-octave layer being the original image;
(5-4) The D(x, y, σ) of each pixel is compared with the D(x, y, σ) of its adjacent pixels; if a pixel's D(x, y, σ) is the maximum or minimum in its neighborhood in its own layer and in the layers above and below, the pixel is taken as a feature point;
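The extremum test of step (5-4) can be sketched as follows; this is a minimal illustration of the 26-neighbour comparison, assuming the DoG layers are already computed (the claim's pyramid construction is not repeated here).

```python
# Sketch of step (5-4): a pixel is a feature point if its D value is the
# maximum or minimum over its 3x3 neighbourhood in its own layer and in the
# layers directly above and below (26 neighbours in total).
def is_extremum(dog, s, i, j):
    """dog: list of 2-D DoG layers; (s, i, j) must be an interior point."""
    center = dog[s][i][j]
    neighbours = [dog[s + ds][i + di][j + dj]
                  for ds in (-1, 0, 1)
                  for di in (-1, 0, 1)
                  for dj in (-1, 0, 1)
                  if not (ds == 0 and di == 0 and dj == 0)]
    return center > max(neighbours) or center < min(neighbours)

def flat_layer(v):
    """3x3 layer that is zero except for value v at its center."""
    return [[0, 0, 0], [0, v, 0], [0, 0, 0]]

# Three DoG layers with a clear peak at the center of the middle layer:
dog = [flat_layer(1), flat_layer(5), flat_layer(2)]
found = is_extremum(dog, 1, 1, 1)
```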
(5-5) A DoG map composed of the selected feature points is obtained and low-pass filtered; the points in the DoG map other than the edge points are removed, yielding a two-dimensional point map;
(5-6) The modulus m(x, y) = √[(L(x+1, y) − L(x−1, y))² + (L(x, y+1) − L(x, y−1))²] and the angle θ(x, y) = arctan[(L(x, y+1) − L(x, y−1)) / (L(x+1, y) − L(x−1, y))] of each feature point are calculated, L(x+1, y) being the scale of the feature point at (x+1, y); the scale of each feature point is set to the number of the sub-octave layer in which it lies; the modulus, angle, and scale of each feature point are taken as the feature point's feature 1, feature 2, and feature 3;
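Step (5-6) amounts to central differences on the smoothed image, as in SIFT orientation assignment. A minimal sketch, using `atan2` instead of a bare arctangent to avoid division by zero (an implementation choice, not stated in the claim):

```python
# Sketch of step (5-6): gradient modulus m(x, y) and orientation theta(x, y)
# from central differences of the smoothed scale-space image L.
import math

def modulus_and_angle(L, x, y):
    dx = L[x + 1][y] - L[x - 1][y]   # difference along the first index
    dy = L[x][y + 1] - L[x][y - 1]   # difference along the second index
    m = math.sqrt(dx * dx + dy * dy)
    theta = math.atan2(dy, dx)       # robust form of arctan(dy / dx)
    return m, theta

# A 3x3 patch of L values around the point (1, 1):
L = [[0, 0, 0],
     [0, 1, 4],
     [0, 3, 0]]
m, theta = modulus_and_angle(L, 1, 1)   # dx = 3, dy = 4 -> m = 5
```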
(5-7) The 3 features of each feature point A1 are compared with the 3 features of each feature point of all feature point sets in the database, and the feature point B1 closest to A1 and the second-closest feature point C1 are found in the feature point sets;
let a11 be the difference in feature 1 between A1 and B1, and b11 the difference in feature 1 between A1 and C1;
let a12 be the difference in feature 2 between A1 and B1, and b12 the difference in feature 2 between A1 and C1;
let a13 be the difference in feature 3 between A1 and B1, and b13 the difference in feature 3 between A1 and C1;
when a11/b11 < Ratio and a12/b12 < Ratio and a13/b13 < Ratio, Ratio being the set ratio threshold,
the feature point B1 is selected as a correct match point.
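A sketch of the nearest/second-nearest ratio test of step (5-7). The 0.4 threshold comes from the description ("Ratio is 0.4"); ranking candidates by the sum of per-feature absolute differences is an assumption, since the claim only defines per-feature differences.

```python
# Sketch of step (5-7): find the closest (B1) and second-closest (C1)
# candidates to A1, then accept B1 only if every per-feature difference
# ratio a1k/b1k is below the threshold Ratio.
RATIO = 0.4  # from the description: "Ratio is 0.4"

def ratio_match(a, candidates):
    """a, candidates: 3-tuples (modulus, angle, scale). Returns the best
    candidate, or None when the ratio test rejects it."""
    ranked = sorted(candidates,
                    key=lambda c: sum(abs(x - y) for x, y in zip(a, c)))
    best, second = ranked[0], ranked[1]
    ok = all(abs(a[k] - best[k]) < RATIO * abs(a[k] - second[k])
             for k in range(3))
    return best if ok else None

a1 = (5.0, 0.9, 2.0)
db = [(5.1, 0.95, 2.0), (9.0, 2.0, 4.0), (7.0, 1.5, 3.0)]
match = ratio_match(a1, db)
```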
6. The punch card method based on a water purifier according to claim 1, characterized in that the controller's extraction of the user's key points from the image captured by the second camera, comparison of the key points with the key point sets of all users in the database, and selection of the correctly matched key points comprise the following steps:
(6-1) Let f(i, j) be the gray value of the point (i, j) in the image captured by the second camera; an N′ × N′ window centered on (i, j) is taken in the image, and the set of pixels in the window is denoted A′; the image is filtered over A′ according to the set formula, obtaining the denoised image g(i, j);
(6-2) The N′ × N′ window is slid over the image, the gray values of all pixels in the window are arranged in ascending order, and the middle value of the sequence is taken as the gray value of the window's center pixel;
(6-3) Edge detection is performed on the image f(x, y) according to the set formula, obtaining the edge points h(x, y);
(6-4) For the image f(x, y) captured by the second camera, the scale-space image L′(x, y, σ) = g(x, y, σ) × f(x, y) is constructed, where g(x, y, σ) = (1/(2πσ²)) e^(−(x²+y²)/(2σ²)) is the scale-variable Gaussian function, (x, y) are the spatial coordinates, and σ is the image smoothness;
(6-5) The difference-of-Gaussian scale space D′(x, y, σ) = (g(x, y, kσ) − g(x, y, σ)) × f(x, y) = L′(x, y, kσ) − L′(x, y, σ) is calculated;
for each pixel in the image f(x, y), s layers of sub-octave images whose length and width are successively halved are built, the first sub-octave layer being the original image;
(6-6) The D′(x, y, σ) of each pixel is compared with the D′(x, y, σ) of its adjacent pixels; if a pixel's D′(x, y, σ) is the maximum or minimum in its neighborhood in its own layer and in the layers above and below, the pixel is taken as a key point;
(6-7) A DoG map composed of the selected key points is obtained and low-pass filtered; the points in the DoG map other than the edge points are removed, yielding a two-dimensional point map;
(6-8) The modulus m(x, y) = √[(L(x+1, y) − L(x−1, y))² + (L(x, y+1) − L(x, y−1))²] and the angle θ(x, y) = arctan[(L(x, y+1) − L(x, y−1)) / (L(x+1, y) − L(x−1, y))] of each key point are calculated, L(x+1, y) being the scale of the key point at (x+1, y); the modulus, angle, and scale of each key point are taken as the key point's feature 1, feature 2, and feature 3;
(6-9) The 3 features of each key point A2 are compared with the 3 features of each key point of all key point sets in the database, and the key point B2 closest to A2 and the second-closest key point C2 are found in the key point sets;
let a21 be the difference in feature 1 between A2 and B2, and b21 the difference in feature 1 between A2 and C2;
let a22 be the difference in feature 2 between A2 and B2, and b22 the difference in feature 2 between A2 and C2;
let a23 be the difference in feature 3 between A2 and B2, and b23 the difference in feature 3 between A2 and C2;
when a21/b21 < Ratio and a22/b22 < Ratio and a23/b23 < Ratio, Ratio being the set ratio threshold,
the key point B2 is selected as a correct match point.
7. The punch card method based on a water purifier according to claim 5, characterized in that before step (5-1), I(x, y) is processed as follows:
using the formulas R = 1.164(Y − 16) + 1.596(Cr − 128),
G = 1.164(Y − 16) − 0.813(Cr − 128) − 0.392(Cb − 128), and
B = 1.164(Y − 16) + 2.017(Cb − 128),
the I(x, y) in YCrCb format is converted to an RGB color image;
the RGB color image is converted to a grayscale image using the formula Gray = 0.299R + 0.587G + 0.114B, where R is the red component, G is the green component, and B is the blue component.
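Claim 7's conversion can be sketched directly from its formulas. Note the published text reads "0.229R" and "0.11B" for the luma step; the standard ITU-R BT.601 weights 0.299/0.587/0.114 are used here on the assumption that those are typographical errors.

```python
# Sketch of claim 7: BT.601-style YCbCr-to-RGB conversion with the
# coefficients given in the claim, followed by the standard luma formula.
def ycbcr_to_rgb(y, cb, cr):
    r = 1.164 * (y - 16) + 1.596 * (cr - 128)
    g = 1.164 * (y - 16) - 0.813 * (cr - 128) - 0.392 * (cb - 128)
    b = 1.164 * (y - 16) + 2.017 * (cb - 128)
    return r, g, b

def to_gray(r, g, b):
    # Assumed correction of the claim's "0.229R + 0.587G + 0.11B".
    return 0.299 * r + 0.587 * g + 0.114 * b

# Neutral chroma (Cb = Cr = 128) must give R = G = B, i.e. a gray pixel.
r, g, b = ycbcr_to_rgb(128, 128, 128)
gray = to_gray(r, g, b)
```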
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201611112071.6A CN106780809B (en) | 2016-12-06 | 2016-12-06 | Punch card method based on water purifier |
Publications (2)
Publication Number | Publication Date |
---|---|
CN106780809A true CN106780809A (en) | 2017-05-31 |
CN106780809B CN106780809B (en) | 2019-02-15 |
Family
ID=58878462
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201611112071.6A Active CN106780809B (en) | 2016-12-06 | 2016-12-06 | Punch card method based on water purifier |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN106780809B (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110969713A (en) * | 2018-09-30 | 2020-04-07 | 上海小蚁科技有限公司 | Attendance statistics method, device and system and readable storage medium |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20130282609A1 (en) * | 2012-04-20 | 2013-10-24 | Honeywell International Inc. | Image recognition for personal protective equipment compliance enforcement in work areas |
CN103984941A (en) * | 2014-06-10 | 2014-08-13 | 深圳市赛为智能股份有限公司 | Face recognition checking-in method and device thereof |
CN104851140A (en) * | 2014-12-12 | 2015-08-19 | 重庆凯泽科技有限公司 | Face recognition-based attendance access control system |
WO2015124914A1 (en) * | 2014-02-18 | 2015-08-27 | ALINIA, Danielle | System and method for recordal of personnel attendance |
CN105718925A (en) * | 2016-04-14 | 2016-06-29 | 苏州优化智能科技有限公司 | Real person living body authentication terminal equipment based on near infrared and facial micro expression |
CN205384664U (en) * | 2016-03-08 | 2016-07-13 | 广东履安实业有限公司 | Developments face identification entrance guard attendance voice broadcast system |
Also Published As
Publication number | Publication date |
---|---|
CN106780809B (en) | 2019-02-15 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN105447466B (en) | A kind of identity integrated recognition method based on Kinect sensor | |
CN106056064B (en) | A kind of face identification method and face identification device | |
CN105139404B (en) | A kind of the license camera and shooting quality detection method of detectable shooting quality | |
EP3719694A1 (en) | Neural network model-based human face living body detection | |
CN109858439A (en) | A kind of biopsy method and device based on face | |
CN108549884A (en) | A kind of biopsy method and device | |
CN106529414A (en) | Method for realizing result authentication through image comparison | |
CN107545536A (en) | The image processing method and image processing system of a kind of intelligent terminal | |
JP6822482B2 (en) | Line-of-sight estimation device, line-of-sight estimation method, and program recording medium | |
US20220043895A1 (en) | Biometric authentication system, biometric authentication method, and program | |
CN111008971B (en) | Aesthetic quality evaluation method of group photo image and real-time shooting guidance system | |
JP6956986B1 (en) | Judgment method, judgment device, and judgment program | |
AU2019201152B2 (en) | Methods and systems for detecting user liveness | |
CN110036407A (en) | For the system and method based on mankind's sclera and pupil correcting digital image color | |
Wasnik et al. | Presentation attack detection for smartphone based fingerphoto recognition using second order local structures | |
CN106780810B (en) | Work attendance method based on water purifier | |
Long et al. | Near infrared face image quality assessment system of video sequences | |
CN106780809B (en) | Punch card method based on water purifier | |
Low et al. | Experimental study on multiple face detection with depth and skin color | |
CN106778578A (en) | Water purifier method for identifying ID | |
Hosseini et al. | Facial expression analysis for estimating patient's emotional states in RPMS | |
JP7457991B1 (en) | Impersonation detection system and impersonation detection program | |
JP7103443B2 (en) | Information processing equipment, information processing methods, and programs | |
Mollah et al. | Improvement of haar feature based face detection incorporating human skin color analysis | |
CN109145724A (en) | " four seasons type people " automatic distinguishing method based on character face's image characteristic analysis |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||