CN108491705A - A user online authentication device - Google Patents

A user online authentication device

Info

Publication number
CN108491705A
CN108491705A (application CN201810184043.8A)
Authority
CN
China
Prior art keywords
face
user
sample
facial image
unit
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
CN201810184043.8A
Other languages
Chinese (zh)
Inventor
韦玥 (Wei Yue)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Yixin Intelligent Technology Co Ltd
Original Assignee
Shenzhen Yixin Intelligent Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Yixin Intelligent Technology Co Ltd filed Critical Shenzhen Yixin Intelligent Technology Co Ltd
Priority to CN201810184043.8A priority Critical patent/CN108491705A/en
Publication of CN108491705A publication Critical patent/CN108491705A/en
Withdrawn legal-status Critical Current

Links

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00 Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/30 Authentication, i.e. establishing the identity or authorisation of security principals
    • G06F21/31 User authentication
    • G06F21/32 User authentication using biometric data, e.g. fingerprints, iris scans or voiceprints
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16 Human faces, e.g. facial parts, sketches or expressions
    • G06V40/161 Detection; Localisation; Normalisation
    • G06V40/164 Detection; Localisation; Normalisation using holistic features
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16 Human faces, e.g. facial parts, sketches or expressions
    • G06V40/168 Feature extraction; Face representation
    • G06V40/171 Local features and components; Facial parts; Occluding parts, e.g. glasses; Geometrical relationships
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/40 Spoof detection, e.g. liveness detection
    • G06V40/45 Detection of the body part being alive

Abstract

The present invention provides a user online authentication device, comprising: a user registration module, configured to let the user fill in information online, obtain the user's facial image, extract facial feature data, and build a user facial feature database; and a user authentication module, comprising an information acquisition unit for acquiring the user's facial image, a liveness detection unit for confirming whether the face in the acquired facial image is a live body, an image processing unit for preprocessing the acquired facial image, a feature extraction unit for processing the preprocessed facial image and extracting facial features, a comparison unit for comparing the extracted facial feature data with the facial feature data in the user facial feature database and outputting the matching result, and a result processing unit for taking corresponding action according to the comparison result. The device of the present invention effectively prevents authentication from being fraudulently obtained with a photograph containing a face and offers high security.

Description

A user online authentication device
Technical field
The present invention relates to the field of computer technology, and in particular to a user online authentication device.
Background technology
In today's information society, information technology based on computers and networks has penetrated almost every aspect of life. However, while enjoying the benefits of "information", we are also exposed to its security risks, so in many activities people frequently need to authenticate their personal identity to ensure the safety of information. Because of the naturalness, stability, and ease of acquisition of the human face, face recognition has been applied to identity authentication.
When a user performs online authentication, the user's facial image must be acquired, followed by face detection, image processing, and facial feature extraction; the extracted facial features are then compared with the corresponding features in a feature database to obtain the authentication result. However, authentication devices in the prior art have the following drawback: a traditional identity authentication device based on the user's face cannot determine, when capturing the user's image, whether the presented face is a photograph or a real person.
Summary of the invention
In view of the above problems, the present invention aims to provide a user online authentication device.
The object of the present invention is achieved by the following technical solution:
A user online authentication device, comprising:
a user registration module, configured to let the user fill in information online, obtain the user's facial image, extract facial feature data, and build a user facial feature database;
a user authentication module, comprising an information acquisition unit, a liveness detection unit, an image processing unit, a feature extraction unit, a comparison unit, and a result processing unit;
the information acquisition unit is configured to acquire the user's facial image;
the liveness detection unit is configured to confirm whether the face in the acquired facial image of the user is a live body;
the image processing unit is configured to preprocess the acquired facial image;
the feature extraction unit is configured to process the preprocessed facial image and extract facial features;
the comparison unit is configured to compare the extracted facial feature data with the facial feature data in the user facial feature database and output the matching result;
the result processing unit is configured to take corresponding action according to the comparison result.
In one embodiment, based on the ID card number submitted by the user, the user registration module retrieves the corresponding ID card photo through the public security intranet, extracts the facial feature data from the photo, and builds the user facial feature database.
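As a rough illustration of this registration flow, the sketch below stores one feature vector per user in an in-memory dictionary standing in for the user facial feature database; `fetch_id_photo` and `extract_face_features` are hypothetical placeholders, since the patent does not specify how the public security intranet is accessed or which feature extractor is used.

```python
from typing import Dict
import numpy as np

def fetch_id_photo(id_number: str) -> np.ndarray:
    """Hypothetical placeholder: retrieve the ID-card photo for this ID number
    (the patent obtains it through the public security intranet)."""
    raise NotImplementedError

def extract_face_features(image: np.ndarray) -> np.ndarray:
    """Hypothetical placeholder: extract facial feature data (e.g., an embedding vector)."""
    raise NotImplementedError

class UserRegistrationModule:
    """Builds the user facial feature database during online registration."""

    def __init__(self) -> None:
        # Stands in for the user facial feature database
        self.feature_db: Dict[str, np.ndarray] = {}

    def register(self, user_id: str, id_number: str) -> None:
        photo = fetch_id_photo(id_number)        # ID-card photo retrieved by ID card number
        features = extract_face_features(photo)  # facial feature data extracted from the photo
        self.feature_db[user_id] = features      # stored for comparison at authentication time
```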
In one embodiment, the liveness detection unit comprises a face detection subunit and a liveness judgment subunit:
the face detection subunit is configured to detect the face and locate facial landmarks;
the liveness judgment subunit is configured to detect whether the face is a live body.
In one embodiment, the image processing unit is configured to perform illumination compensation, grayscale transformation, histogram equalization, normalization, geometric correction, filtering, and correction processing on the acquired facial image of the user;
the feature extraction unit is configured to extract facial organ features from the processed facial image, the facial organ features including the bare face, eyebrow, eye, mouth, and nose features.
The beneficial effects of the present invention are as follows: the user online authentication device proposed by the present invention prevents authentication from being fraudulently obtained with a photograph containing a face, which improves the security of the device; at the same time, the device recognizes the user's facial image accurately and reliably and can meet the needs of user identity authentication in different settings.
Description of the drawings
The invention is further described below with reference to the accompanying drawings, but the embodiments in the drawings do not limit the present invention in any way; those of ordinary skill in the art can obtain other drawings from the following drawings without creative effort.
Fig. 1 is a block diagram of the present invention;
Fig. 2 is a block diagram of the liveness detection unit of the present invention.
Reference numerals:
user registration module 100, user authentication module 200, information acquisition unit 210, liveness detection unit 220, image processing unit 230, feature extraction unit 240, comparison unit 250, result processing unit 260, face detection subunit 221, liveness judgment subunit 222, and classifier building subunit 223
Detailed description of the embodiments
The invention is further described below in conjunction with the following application scenario.
Referring to Fig. 1, a user online authentication device is shown, comprising:
a user registration module 100, configured to let the user fill in information online, obtain the user's facial image, extract facial feature data, and build a user facial feature database;
a user authentication module 200, comprising an information acquisition unit 210, a liveness detection unit 220, an image processing unit 230, a feature extraction unit 240, a comparison unit 250, and a result processing unit 260;
the information acquisition unit 210, which includes a camera, is configured to acquire the user's facial image;
the liveness detection unit 220 is configured to confirm whether the face in the acquired facial image of the user is a live body;
the image processing unit 230 is configured to preprocess the acquired facial image;
the feature extraction unit 240 is configured to process the preprocessed facial image and extract facial features;
the comparison unit 250 is configured to compare the extracted facial feature data with the facial feature data in the user facial feature database and output the matching result;
the result processing unit 260 is configured to take corresponding action according to the comparison result.
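To make the module decomposition above concrete, the following is a minimal Python sketch of how the units could be wired together; the function names (`detect_liveness`, `preprocess`, `extract_features`), the cosine-similarity matching, and the threshold value are illustrative assumptions and are not prescribed by the patent.

```python
from typing import Callable, Dict
import numpy as np

def authenticate(frame: np.ndarray,
                 feature_db: Dict[str, np.ndarray],
                 detect_liveness: Callable[[np.ndarray], bool],
                 preprocess: Callable[[np.ndarray], np.ndarray],
                 extract_features: Callable[[np.ndarray], np.ndarray],
                 threshold: float = 0.6) -> str:
    """Acquisition -> liveness detection -> preprocessing -> feature extraction -> comparison -> result."""
    if not detect_liveness(frame):               # liveness detection unit 220
        return "rejected: face is not a live body"
    face = preprocess(frame)                     # image processing unit 230
    feats = extract_features(face)               # feature extraction unit 240
    best_user, best_score = None, -1.0
    for user_id, ref in feature_db.items():      # comparison unit 250
        score = float(np.dot(feats, ref) /
                      (np.linalg.norm(feats) * np.linalg.norm(ref) + 1e-12))
        if score > best_score:
            best_user, best_score = user_id, score
    # result processing unit 260: accept or reject based on the best match score
    return f"authenticated as {best_user}" if best_score >= threshold else "rejected: no match"
```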
In one embodiment, based on the ID card number submitted by the user, the user registration module 100 retrieves the corresponding ID card photo through the public security intranet, extracts the facial feature data from the photo, and builds the user facial feature database.
In one embodiment, the liveness detection unit 220 comprises a face detection subunit 221 and a liveness judgment subunit 222:
the face detection subunit 221 is configured to detect the face and locate facial landmarks;
the liveness judgment subunit 222 is configured to detect whether the face is a live body.
In one embodiment, the image processing unit 230 is configured to perform illumination compensation, grayscale transformation, histogram equalization, normalization, geometric correction, filtering, and correction processing on the acquired facial image of the user;
the feature extraction unit 240 is configured to extract facial organ features from the processed facial image, the facial organ features including the bare face, eyebrow, eye, mouth, and nose features.
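The preprocessing steps listed above can be approximated with standard OpenCV operations; the following sketch is illustrative only, and the filter window size, target resolution, and the omission of explicit illumination compensation and geometric correction are assumptions rather than parameters taken from the patent.

```python
import cv2
import numpy as np

def preprocess_face(image: np.ndarray, size=(112, 112)) -> np.ndarray:
    """Grayscale conversion, histogram equalization, smoothing, and size/intensity normalization."""
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)     # grayscale transformation
    equalized = cv2.equalizeHist(gray)                 # histogram equalization (coarse lighting correction)
    filtered = cv2.GaussianBlur(equalized, (3, 3), 0)  # filtering to suppress noise
    resized = cv2.resize(filtered, size)               # geometric normalization to a fixed size
    return resized.astype(np.float32) / 255.0          # intensity normalization to [0, 1]
```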
The above embodiment of the application prevents authentication from being fraudulently obtained with a photograph containing a face and improves the security of the device; at the same time, the device recognizes the user's facial image accurately and reliably and can meet the needs of user identity authentication in different settings.
In one embodiment, the face detection subunit 221 is further configured to: filter each frame of facial image collected by the information acquisition unit to obtain a filtered facial image; then quickly compute Haar-like feature values on the filtered facial image using an integral image, and feed them to a classifier trained offline to determine whether a face is present.
In one embodiment, the face detection subunit 221 is further configured to: before using the classifier to determine whether a face is present, build a skin color model through a color space transformation, compute skin color likelihood values, and perform a preliminary face detection based on skin color features to obtain face candidate regions in the filtered facial image; Haar-like feature values are then quickly computed on the face candidate regions using the integral image and fed to the classifier trained offline to determine whether a face is present.
The face candidate regions in the image are obtained as follows:
(1) Convert the filtered facial image into the $YC_bC_r$ color space, where $Y$ denotes luminance and $C_b$ and $C_r$ denote the blue and red chrominance components, respectively;
(2) Train on the collected skin color samples in advance. Since human skin tones cluster within a very small region of the $C_b$-$C_r$ plane, exhibit good clustering properties, and approximately follow a two-dimensional Gaussian distribution, the following Gaussian skin color model can be adopted, with fitted parameters

$$m = \left(\bar{C_b},\ \bar{C_r}\right)^T, \qquad C = E\left[(x-m)(x-m)^T\right]$$

where $m$ and $C$ denote the skin color mean vector and covariance matrix obtained by statistical analysis, $x$ denotes the color vector $(C_b, C_r)^T$ of each pixel, $\bar{C_b}$ and $\bar{C_r}$ denote the means of the blue and red components of the skin color samples in the $C_b$-$C_r$ space, and $E$ denotes the expectation, estimated as the sample average over the skin color samples;
(3) Compute the similarity between every pixel of the filtered facial image and the skin color, i.e. the skin color likelihood value, using the likelihood function

$$P(C_b, C_r) = \exp\left[-0.5\,(x-m)^T C^{-1} (x-m)\right]$$

where $P(C_b, C_r)$ denotes the skin color likelihood value;
(4) After the likelihood matrix is computed, normalize it by its maximum value; after binarization and morphological processing, separate the skin color regions from the background to obtain candidate regions that may contain a face.
In the above embodiment of the application, the acquired facial image is processed in the manner described above to detect face regions in the image. The face candidate regions can be obtained accurately from the skin color features of the face, with good adaptability and high accuracy. Face detection rejects the background regions of the image, which effectively reduces the complexity and data volume of the subsequent face recognition and feature extraction and improves the speed and efficiency of user authentication.
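Steps (1) to (4) can be sketched as follows: the model is fitted from an array of skin color samples and applied per pixel using the Gaussian likelihood given above. The binarization threshold, morphological kernel, and use of OpenCV are illustrative assumptions.

```python
import cv2
import numpy as np

def fit_skin_model(skin_pixels_cbcr: np.ndarray):
    """Fit the Gaussian skin model from an (N, 2) array of (Cb, Cr) skin color samples."""
    m = skin_pixels_cbcr.mean(axis=0)                        # mean vector m = (Cb_bar, Cr_bar)
    C = np.cov(skin_pixels_cbcr, rowvar=False)               # covariance matrix C
    return m, np.linalg.inv(C)

def skin_likelihood_mask(image_bgr: np.ndarray, m: np.ndarray, C_inv: np.ndarray, thresh: float = 0.4):
    """Compute P(Cb, Cr) per pixel, normalize by its maximum, binarize, and clean up morphologically."""
    ycrcb = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2YCrCb).astype(np.float32)
    # OpenCV orders the channels (Y, Cr, Cb); reorder to (Cb, Cr) to match the model
    x = ycrcb[..., [2, 1]] - m
    mahal = np.einsum('...i,ij,...j->...', x, C_inv, x)      # (x - m)^T C^{-1} (x - m)
    likelihood = np.exp(-0.5 * mahal)
    likelihood /= likelihood.max() + 1e-12                   # normalize by the maximum likelihood
    mask = (likelihood > thresh).astype(np.uint8) * 255      # binarization
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))
    return cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)    # morphological cleanup of the skin mask
```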
In one embodiment, the liveness detection unit 220 further comprises a classifier building subunit 223;
the classifier building subunit 223 is configured to train the classifier from the user face database, specifically as follows:
Obtain the facial images in the user face database as training samples $(x_1,y_1),(x_2,y_2),\dots,(x_i,y_i),\dots,(x_n,y_n)$, where $x_i$ denotes the $i$-th training sample, $y_i=1$ denotes a face sample, $y_i=0$ denotes a non-face sample, and $n$ denotes the total number of training samples.
The classifier model used is

$$H(x)=\begin{cases}1, & \sum_{t=1}^{T} a_t h_t(x) \ge \frac{1}{2}\sum_{t=1}^{T} a_t \\ 0, & \text{otherwise}\end{cases}$$

where $a_t$ denotes the weighting coefficient, $h_t(x)$ denotes a simple (weak) classifier, $T$ denotes the number of training iterations, and $H(x)$ denotes the strong classifier; $h_t(x)$ and $a_t$ are trained on the weighted sample data, and the sample weights are adjusted by raising the weights of misclassified samples and lowering the weights of correctly classified samples.
Initialization stage: the weights of the face samples and the non-face samples are initialized as $D_1(i)=\frac{1}{2l}$ and $D_1(i)=\frac{1}{2m}$ respectively, where $D_t(i)$ denotes the error weight of the $i$-th sample in the $t$-th training iteration, and $l$ and $m$ denote the numbers of face samples and non-face samples respectively;
Training stage: the weights of the training samples are normalized with

$$q_t(i)=\frac{D_t(i)}{\sum_{j=1}^{n} D_t(j)}$$

where $q_t(i)$ denotes the normalized error weight of the $i$-th sample;
For each training sample, the Haar-like rectangular features are computed, and for each Haar-like rectangular feature $j$ a simple classifier is generated:

$$h_j(x)=\begin{cases}1, & p_j f_j(x) < p_j \theta_j \\ 0, & \text{otherwise}\end{cases}$$

where $f_j(x)$ denotes the value of feature $j$, $\theta_j$ denotes the threshold, and $p_j=\pm 1$ is a bias coefficient controlling the direction of the inequality; the threshold $\theta_j$ and bias coefficient $p_j$ are set so that the weighted error rate $\varepsilon_j=\sum_i q_t(i)\,|h_j(x_i)-y_i|$ is minimal;
From the generated simple classifiers, the simple classifier $h_t$ with the minimal error rate $\varepsilon_t$ is selected;
The weights of all training samples are then updated as

$$D_{t+1}(i)=D_t(i)\,\beta_t^{\,1-e_t}, \qquad \beta_t=\frac{\varepsilon_t}{1-\varepsilon_t}$$

where $e_t$ denotes the classification result of the training sample in the $t$-th training round: $e_t=0$ if the sample is correctly classified, and $e_t=1$ otherwise;
The simple classifiers with minimal error rates obtained in the training stage are combined into the strong classifier

$$H(x)=\begin{cases}1, & \sum_{t=1}^{T} a_t h_t(x) \ge \frac{1}{2}\sum_{t=1}^{T} a_t \\ 0, & \text{otherwise}\end{cases}$$

where $H(x)$ denotes the strong classifier composed of simple classifiers and $a_t=\ln(1/\beta_t)$ denotes the weighting coefficient.
Cascade classifier construction stage: using a cascaded organization, the strong classifiers obtained in the above training process are connected in series into a cascade classifier, and the acquired facial image is judged in turn by each layer of strong classifier in the cascade. If a layer judges the image to be a face, it passes to the next layer for further judgment; if a layer judges it to be non-face, the acquired facial image is labeled as non-face; only if the acquired facial image passes the judgment of every layer of the cascade classifier is it labeled as a face. Starting from layer N-u, the cascade classifier function of level N-u is denoted $f_{N-u}(x)$, where $\gamma_{N-u-1}$ is a weighting coefficient indicating the magnitude of the structural change of this layer; the cascade classifier of this layer is then defined by $f_{N-u}(x)$ together with the classification threshold $\theta_{N-u}$ set for layer N-u, where

$$\theta_{N-u}=\min_{i=1,\dots,m} f_{N-u}(x_i)$$
Samples rejected by the N-u-th layer of the cascade classifier are judged a second time by a quadratic classifier $H'_{N-u}(x)$; if a sample passes the quadratic classifier, it enters the judgment of the next layer. Here $f'_{N-u}(x)$ denotes the secondary decision function, $\mu$ denotes a decision coefficient, and $\delta$ denotes the number of times the sample has been rejected by the cascade classifiers of all preceding layers;
wherein $N$ denotes the total number of layers of the cascade classifier, and $u=U, U-1, \dots, 0$, where $U$ denotes the layer of the cascade classifier from which secondary judgment begins.
In the above embodiment of the application, a face classifier is built from the facial images in the database in the manner described above. The classifier can be trained adaptively on the facial images in the database for different features and can obtain the most accurate classification thresholds, which improves the accuracy of face detection. Building the cascade classifier in a cascaded manner allows different images to be classified to different depths, improves the efficiency of the classifier, and reduces its complexity. In practice, as the number of cascade layers grows, the classifier becomes more complex and the conditions imposed by later layers become increasingly strict; once any single layer judges an input image to be non-face, there is normally no way to remedy the decision, so the misclassification rate increases. Applying a secondary classifier at the later layers of the cascade therefore targets exactly those harshly judged cases and allows them to be remedied, improving detection accuracy.
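The AdaBoost training loop described above can be sketched as follows, assuming the Haar-like feature values have already been computed into a matrix; the candidate thresholds (feature percentiles), the default number of rounds, and the omission of the cascade construction and the quadratic classifier are simplifications introduced for illustration.

```python
import numpy as np

def train_adaboost(F: np.ndarray, y: np.ndarray, T: int = 50):
    """F: (n_samples, n_features) Haar-like feature values; y: labels in {0, 1} (1 = face)."""
    l, m = int(y.sum()), int((1 - y).sum())
    D = np.where(y == 1, 1.0 / (2 * l), 1.0 / (2 * m))       # initial weights for face / non-face samples
    stumps, alphas = [], []
    for _ in range(T):
        q = D / D.sum()                                       # normalize the error weights
        best = None
        for j in range(F.shape[1]):                           # pick the stump with minimal weighted error
            for theta in np.percentile(F[:, j], [25, 50, 75]):
                for p in (+1, -1):
                    h = (p * F[:, j] < p * theta).astype(int)
                    err = np.sum(q * np.abs(h - y))
                    if best is None or err < best[0]:
                        best = (err, j, theta, p, h)
        err, j, theta, p, h = best
        beta = err / (1.0 - err + 1e-12)
        D = D * beta ** (1 - np.abs(h - y))                   # lower the weights of correctly classified samples
        stumps.append((j, theta, p))
        alphas.append(np.log(1.0 / (beta + 1e-12)))           # a_t = ln(1 / beta_t)
    return stumps, np.array(alphas)

def strong_classify(F_row: np.ndarray, stumps, alphas) -> int:
    """Strong classifier: face (1) if the weighted vote reaches half the total weight."""
    votes = np.array([(p * F_row[j] < p * theta) for j, theta, p in stumps], dtype=float)
    return int(np.dot(alphas, votes) >= 0.5 * alphas.sum())
```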
In one embodiment, the liveness judgment subunit 222 operates as follows:
(1) Obtain the face region context cues and compute the comparison score. For the acquired face region image sequence $S=\{I_i\}_{i=1,\dots,T}$ and the reference scene image $I_R$, a set of feature points is extracted, and the face region context cue comparison score $\Phi(S, I_R)$ is defined in terms of the following quantities: $n$ denotes the number of feature points per frame, and $D(M, N)$ denotes the distance between LBP histograms $M$ and $N$, with $M_b$ and $N_b$ denoting the $b$-th components of histograms $M$ and $N$ respectively. $H(I_i, x, y)$ and $H(I_R, x, y)$ denote the histograms of the LBP values of the pixels in the $n \times n$ neighborhood of feature point $(x, y)$ in image $I_i$ and image $I_R$ respectively, with components

$$H_b(x,y)=\sum_{(u,v)\in\Gamma}\delta\big(\mathrm{LBP}_{P,R}(u,v)=b\big)$$

where $H_b(x,y)$ denotes the $b$-th component of the histogram, $\Gamma$ denotes the $n \times n$ local region centered on pixel $(x, y)$, $P$ denotes the number of sample points on the circle of radius $R$ centered at $(x, y)$, $\delta(\cdot)$ is the discriminant function that equals 1 when its argument is true and 0 otherwise, $T$ denotes the duration (in frames) over which the face region image sequence is acquired, and $I_i$ denotes the face region image obtained in the $i$-th frame;
(2) Compare the context cue comparison score $\Phi(S, I_R)$ with a preset threshold $\eta$; if $\Phi(S, I_R)\le\eta$, the acquired facial image of the user is judged to be a live body.
In the above embodiment of the application, whether the acquired face region is a live body is judged in the manner described above. By comparing the LBP texture cues of the context, the device can effectively determine whether the face region in the acquired facial image belongs to a live body, and can make the judgment by capturing subtle variations in the user's face. The approach is adaptable and accurate, is suitable for camera-based user authentication in different situations, and improves the safety and reliability of the device.
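A rough sketch of the context comparison idea follows: LBP histograms are computed in a local window around each feature point in every captured frame and in the reference scene image, and a per-point histogram distance is accumulated into a score. Because the patent's exact score formula is not reproduced in this text, the chi-square distance, the uniform LBP parameters, and the averaging used here are assumptions.

```python
import numpy as np
from skimage.feature import local_binary_pattern

def lbp_histogram(gray: np.ndarray, x: int, y: int, n: int = 16, P: int = 8, R: int = 1) -> np.ndarray:
    """LBP histogram of the n x n neighborhood centered on feature point (x, y)."""
    half = n // 2
    patch = gray[max(0, y - half): y + half, max(0, x - half): x + half]
    codes = local_binary_pattern(patch, P, R, method="uniform")
    hist, _ = np.histogram(codes, bins=P + 2, range=(0, P + 2), density=True)
    return hist

def chi_square(M: np.ndarray, N: np.ndarray) -> float:
    """Histogram distance D(M, N); chi-square is used here as an illustrative choice."""
    return float(0.5 * np.sum((M - N) ** 2 / (M + N + 1e-12)))

def context_score(frames, reference, keypoints) -> float:
    """Average per-point, per-frame LBP histogram distance between the captured frames
    and the reference scene image around the same feature points."""
    total = 0.0
    for frame in frames:
        for (x, y) in keypoints:
            total += chi_square(lbp_histogram(frame, x, y), lbp_histogram(reference, x, y))
    return total / (len(frames) * len(keypoints))

# A small score means the face-region context matches the reference scene;
# following the patent, the capture is judged live when the score is below a threshold eta.
```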
Finally, it should be noted that the above embodiments are only intended to illustrate the technical solution of the present invention and do not limit its scope of protection. Although the present invention has been described in detail with reference to preferred embodiments, those skilled in the art should understand that the technical solution of the present invention can be modified or equivalently replaced without departing from the essence and scope of the technical solution of the present invention.

Claims (7)

1. A user online authentication device, characterized by comprising:
a user registration module, configured to let the user fill in information online, obtain the user's facial image, extract facial feature data, and build a user facial feature database;
a user authentication module, comprising an information acquisition unit, a liveness detection unit, an image processing unit, a feature extraction unit, a comparison unit, and a result processing unit;
the information acquisition unit is configured to acquire the user's facial image;
the liveness detection unit is configured to confirm whether the face in the acquired facial image of the user is a live body;
the image processing unit is configured to preprocess the acquired facial image;
the feature extraction unit is configured to process the preprocessed facial image and extract facial features;
the comparison unit is configured to compare the extracted facial feature data with the facial feature data in the user facial feature database and output the matching result;
the result processing unit is configured to take corresponding action according to the comparison result.
2. The user online authentication device according to claim 1, characterized in that, based on the ID card number submitted by the user, the user registration module retrieves the corresponding ID card photo through the public security intranet, extracts the facial feature data from the photo, and builds the user facial feature database.
3. The user online authentication device according to claim 1, characterized in that the liveness detection unit comprises a face detection subunit and a liveness judgment subunit:
the face detection subunit is configured to detect the face and locate facial landmarks;
the liveness judgment subunit is configured to detect whether the face is a live body.
4. The user online authentication device according to claim 3, characterized in that:
the image processing unit is configured to perform illumination compensation, grayscale transformation, histogram equalization, normalization, geometric correction, filtering, and correction processing on the acquired facial image of the user;
the feature extraction unit is configured to extract facial organ features from the processed facial image, the facial organ features including the bare face, eyebrow, eye, mouth, and nose features.
5. The user online authentication device according to claim 3, characterized in that the face detection subunit is further configured to: filter each frame of facial image collected by the information acquisition unit to obtain a filtered facial image; then quickly compute Haar-like feature values on the filtered facial image using an integral image, and feed them to a classifier trained offline to determine whether a face is present.
6. The user online authentication device according to claim 5, characterized in that the face detection subunit is further configured to: before using the classifier to determine whether a face is present, build a skin color model through a color space transformation, compute skin color likelihood values, and perform a preliminary face detection based on skin color features to obtain face candidate regions in the filtered facial image; Haar-like feature values are then quickly computed on the face candidate regions using the integral image and fed to the classifier trained offline to determine whether a face is present;
wherein the face candidate regions in the image are obtained as follows:
(1) Convert the filtered facial image into the $YC_bC_r$ color space, where $Y$ denotes luminance and $C_b$ and $C_r$ denote the blue and red chrominance components, respectively;
(2) Train on the collected skin color samples in advance and build a Gaussian skin color model, with fitted parameters

$$m=\frac{1}{N}\sum_{i=1}^{N}\begin{pmatrix}C_{bi}\\ C_{ri}\end{pmatrix}=\begin{pmatrix}\bar{C_b}\\ \bar{C_r}\end{pmatrix}, \qquad C=E\left[(x-m)(x-m)^T\right]$$

where $m$ and $C$ denote the skin color mean vector and covariance matrix obtained by statistical analysis, $x$ denotes the color vector $(C_b, C_r)^T$ of each pixel, $\bar{C_b}$ and $\bar{C_r}$ denote the means of the blue and red components of the skin color samples in the $C_b$-$C_r$ space, $E$ denotes the expectation estimated as the sample average, $C_{bi}$ and $C_{ri}$ denote the blue and red components of the $i$-th pixel in the image, and $N$ denotes the total number of pixels;
(3) Compute the similarity between every pixel of the filtered facial image and the skin color, i.e. the skin color likelihood value, using the likelihood function

$$P(C_b, C_r)=\exp\left[-0.5\,(x-m)^T C^{-1}(x-m)\right]$$

where $P(C_b, C_r)$ denotes the skin color likelihood value;
(4) After the likelihood matrix is computed, normalize it by its maximum value; after binarization and morphological processing, separate the skin color regions from the background to obtain candidate regions that may contain a face.
7. The user online authentication device according to claim 5, characterized in that the liveness detection unit further comprises a classifier building subunit;
the classifier building subunit is configured to train the classifier from the user face database, specifically as follows:
obtain the facial images in the user face database as training samples $(x_1,y_1),(x_2,y_2),\dots,(x_i,y_i),\dots,(x_n,y_n)$, where $x_i$ denotes the $i$-th training sample, $y_i=1$ denotes a face sample, $y_i=0$ denotes a non-face sample, and $n$ denotes the total number of training samples;
the classifier model used is

$$H(x)=\begin{cases}1, & \sum_{t=1}^{T} a_t h_t(x)\ge\frac{1}{2}\sum_{t=1}^{T} a_t \\ 0, & \text{otherwise}\end{cases}$$

where $a_t$ denotes the weighting coefficient, $h_t(x)$ denotes a simple classifier, $T$ denotes the number of training iterations, and $H(x)$ denotes the strong classifier; $h_t(x)$ and $a_t$ are trained on the weighted sample data, and the sample weights are adjusted by raising the weights of misclassified samples and lowering the weights of correctly classified samples;
initialization stage: the weights of the face samples and the non-face samples are initialized as $D_1(i)=\frac{1}{2l}$ and $D_1(i)=\frac{1}{2m}$ respectively, where $D_t(i)$ denotes the error weight of the $i$-th sample in the $t$-th training iteration, and $l$ and $m$ denote the numbers of face samples and non-face samples respectively;
training stage: the weights of the training samples are normalized with

$$q_t(i)=\frac{D_t(i)}{\sum_{j=1}^{n} D_t(j)}$$

where $q_t(i)$ denotes the normalized error weight of the $i$-th sample;
for each training sample, the Haar-like rectangular features are computed, and for each Haar-like rectangular feature $j$ a simple classifier is generated:

$$h_j(x)=\begin{cases}1, & p_j f_j(x) < p_j\theta_j \\ 0, & \text{otherwise}\end{cases}$$

where $f_j(x)$ denotes the value of feature $j$, $\theta_j$ denotes the threshold, and $p_j=\pm 1$ is a bias coefficient controlling the direction of the inequality; the threshold $\theta_j$ and bias coefficient $p_j$ are set so that the weighted error rate $\varepsilon_j=\sum_i q_t(i)\,|h_j(x_i)-y_i|$ is minimal;
from the generated simple classifiers, the simple classifier $h_t$ with the minimal error rate $\varepsilon_t$ is selected;
the weights of all training samples are then updated as

$$D_{t+1}(i)=D_t(i)\,\beta_t^{\,1-e_t},\qquad \beta_t=\frac{\varepsilon_t}{1-\varepsilon_t}$$

where $e_t$ denotes the classification result of the training sample in the $t$-th training round: $e_t=0$ if the sample is correctly classified, and $e_t=1$ otherwise;
the simple classifiers with minimal error rates obtained in the training stage are combined into the strong classifier

$$H(x)=\begin{cases}1, & \sum_{t=1}^{T} a_t h_t(x)\ge\frac{1}{2}\sum_{t=1}^{T} a_t \\ 0, & \text{otherwise}\end{cases}$$

where $H(x)$ denotes the strong classifier composed of simple classifiers and $a_t=\ln(1/\beta_t)$ denotes the weighting coefficient;
cascade classifier construction stage: using a cascaded organization, the strong classifiers obtained in the above training process are connected in series into a cascade classifier, and the acquired facial image is judged in turn by each layer of strong classifier in the cascade; if a layer judges the image to be a face, it passes to the next layer for further judgment; if a layer judges it to be non-face, the acquired facial image is labeled as non-face; only if the acquired facial image passes the judgment of every layer of the cascade classifier is it labeled as a face; starting from layer N-u, the cascade classifier function of level N-u is denoted $f_{N-u}(x)$, where $\gamma_{N-u-1}$ is a weighting coefficient indicating the magnitude of the structural change of this layer, and the cascade classifier of this layer is defined by $f_{N-u}(x)$ together with the classification threshold $\theta_{N-u}$ set for layer N-u, where

$$\theta_{N-u}=\min_{i=1,\dots,m} f_{N-u}(x_i);$$

samples rejected by the N-u-th layer of the cascade classifier are judged a second time by a quadratic classifier $H'_{N-u}(x)$; if a sample passes the quadratic classifier, it enters the judgment of the next layer, where $f'_{N-u}(x)$ denotes the secondary decision function, $\mu$ denotes a decision coefficient, and $\delta$ denotes the number of times the sample has been rejected by the cascade classifiers of all preceding layers;
wherein $N$ denotes the total number of layers of the cascade classifier, and $u=U, U-1,\dots,0$, where $U$ denotes the layer of the cascade classifier from which secondary judgment begins.
CN201810184043.8A 2018-03-07 2018-03-07 A user online authentication device Withdrawn CN108491705A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810184043.8A CN108491705A (en) 2018-03-07 2018-03-07 A user online authentication device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810184043.8A CN108491705A (en) 2018-03-07 2018-03-07 A user online authentication device

Publications (1)

Publication Number Publication Date
CN108491705A true CN108491705A (en) 2018-09-04

Family

ID=63341682

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810184043.8A Withdrawn CN108491705A (en) 2018-03-07 2018-03-07 A user online authentication device

Country Status (1)

Country Link
CN (1) CN108491705A (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110400035A (en) * 2019-04-09 2019-11-01 刘建 Network communicating system, method and storage medium based on parameter acquisition
CN110634094A (en) * 2019-04-03 2019-12-31 卜风雷 Big data analysis-based regulation and control mechanism
CN112069904A (en) * 2020-08-07 2020-12-11 武汉天喻聚联科技有限公司 System and method for determining online picture attribution
CN112507299A (en) * 2020-12-04 2021-03-16 重庆邮电大学 Self-adaptive keystroke behavior authentication method and device in continuous identity authentication system

Similar Documents

Publication Publication Date Title
CN108491705A (en) A user online authentication device
CN108509862B (en) Rapid face recognition method capable of resisting angle and shielding interference
US7596247B2 (en) Method and apparatus for object recognition using probability models
CN104036278B (en) The extracting method of face algorithm standard rules face image
CN108416940A (en) A kind of locker managing device
US10922399B2 (en) Authentication verification using soft biometric traits
CN110991389A (en) Matching method for judging appearance of target pedestrian in non-overlapping camera view angle
CN106845328A (en) A kind of Intelligent human-face recognition methods and system based on dual camera
US20230206700A1 (en) Biometric facial recognition and liveness detector using ai computer vision
Santos et al. Nudity detection based on image zoning
CN111832405A (en) Face recognition method based on HOG and depth residual error network
CN111259756A (en) Pedestrian re-identification method based on local high-frequency features and mixed metric learning
CN111767877A (en) Living body detection method based on infrared features
CN109614927B (en) Micro expression recognition based on difference of front and rear frames and characteristic dimension reduction
CN110598574A (en) Intelligent face monitoring and identifying method and system
Tiwari et al. A palmprint based recognition system for smartphone
Ahmed et al. Using fusion of iris code and periocular biometric for matching visible spectrum iris images captured by smart phone cameras
Derman et al. Integrating facial makeup detection into multimodal biometric user verification system
Ravinaik et al. Face Recognition using Modified Power Law Transform and Double Density Dual Tree DWT
Lu et al. Multimodal biometric identification approach based on face and palmprint
Nguyen et al. User re-identification using clothing information for smartphones
Naveena et al. Partial face recognition by template matching
Lee et al. Face detection using multi-modal features
Jalal et al. Facial Mole Detection Approach for Suspect Face Identification using ResNeXt-50
Sandahl et al. Face Recognition

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WW01 Invention patent application withdrawn after publication (application publication date: 20180904)