CN106599782B - Authentication method using iris characteristic point position information

Info

Publication number
CN106599782B
Authority
CN
China
Prior art keywords
point
information
characteristic
read
iris
Legal status
Active
Application number
CN201610980778.2A
Other languages
Chinese (zh)
Other versions
CN106599782A (en)
Inventor
金虎林
Current Assignee
Individual
Original Assignee
Individual
Application filed by Individual

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00: Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10: Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/18: Eye characteristics, e.g. of the iris
    • G06V40/197: Matching; Classification

Abstract

The invention discloses an authentication method using iris feature point position information. According to their distance from the pupil, the feature points on each fishbone line of the iris are classified into internal end points, external end points, and cross points lying between a connected internal end point and external end point; the iris feature points thus classified, with the cross points as distinguishing features, are authenticated through their position information. The authentication method using iris feature point position information can improve authentication efficiency and accuracy. The invention extracts feature points from the iris image and performs its calculations on position information and feature point classification information, so it is faster than methods based on an iris code or a feature vector, and user authentication can still be carried out when the iris image is partially occluded by the eyelid or noise is introduced by eyelashes, which greatly improves convenience for the user.

Description

Authentication method using iris characteristic point position information
Technical Field
The invention belongs to the technical field of electronic information, relates to an authentication method, and particularly relates to an authentication method using iris feature point position information.
Background
The conventional method of user authentication by iris generally converts the rectangular-coordinate iris image captured by a camera into polar coordinates centered on the pupil, and then either generates a 256-byte iris code from a 512-pixel iris image by Gabor filtering, or generates a feature vector by a wavelet transform, for storage. When a user is authenticated, an iris code is extracted from the newly acquired iris image, the Hamming distance to the stored iris code is calculated, and user authentication is performed according to the similarity. The similarity between feature vectors is likewise obtained by a Hamming distance calculation.
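For reference, the Hamming-distance comparison described above can be sketched as follows. This is only a minimal illustration of the prior-art matching step, not the patented method; the 2048-bit code length and the 0.32 decision threshold are assumed example values rather than figures taken from this document.

```python
import numpy as np

def hamming_distance(code_a: np.ndarray, code_b: np.ndarray) -> float:
    """Fraction of differing bits between two equal-length iris bit codes."""
    assert code_a.shape == code_b.shape
    return float(np.count_nonzero(code_a != code_b)) / code_a.size

# Two hypothetical 256-byte (2048-bit) iris codes.
rng = np.random.default_rng(0)
stored_code = rng.integers(0, 2, size=2048, dtype=np.uint8)
new_code = stored_code.copy()
new_code[:100] ^= 1                      # flip 100 bits to simulate acquisition noise

dist = hamming_distance(stored_code, new_code)
print(f"Hamming distance: {dist:.3f}")              # 100/2048, i.e. about 0.049
print("match" if dist < 0.32 else "no match")       # 0.32 is an assumed threshold
```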
The above conventional iris recognition methods have several disadvantages: the order of tracks and sectors becomes distorted when pupil-center detection is in error, and errors arise when the feature vector is computed under image rotation or when the iris field is extracted using the zero-crossing points of the first-order iris image, so authentication failure is easily caused. In addition, authentication performance drops significantly when the iris area is small or the image resolution is low, and the computation time grows markedly when tracks and sectors must be shifted for authentication. Moreover, the probability of authentication failure rises significantly when the iris is blocked by the eyelid or when eyelash pixels add noise to the iris image, which can force the user to open the eye uncomfortably wide at enrollment or authentication.
In view of the above, there is a need to design a new authentication method to overcome the above-mentioned defects of the existing authentication methods.
Disclosure of Invention
The technical problem to be solved by the invention is to provide an authentication method using iris feature point position information that can improve authentication efficiency and accuracy.
In order to solve the technical problems, the invention adopts the following technical scheme:
an authentication method using iris feature point position information, the authentication method comprising:
step S1, a feature point classification step;
classifying the feature points on the fishbone lines of the iris: for the feature points extracted from the same fishbone line of the iris, the distance from each feature point to the pupil is calculated; among the feature points on the same line, the feature point farthest from the pupil is set as the external end point, the feature point nearest the pupil is set as the internal end point, and the feature points between the external end point and the internal end point are set as cross points;
the internal end points are denoted S, the external end points E, and the cross points M; the sets of iris feature points are written respectively as the set of internal end points Si(Xi,Yi), the set of external end points Ej(Xj,Yj), and the set of cross points Mk(Nk,Xk,Yk); the related feature point information is represented by the set of internal end point, external end point, and cross point sequence numbers Ct(Si,Ei,Mk);
in Si(Xi,Yi) and Ej(Xj,Yj), i and j denote the sequence numbers of the end points, and X, Y denote the position coordinates of each end point;
in Mk(Nk,Xk,Yk), k denotes the sequence number of the cross point, N denotes the number of crossing fishbone lines, and X, Y denote the position coordinates;
in Ct(Si,Ei,Mk), t denotes the sequence number of the related feature point record, and Si, Ei, Mk denote the internal end point set number, the external end point set number, and the cross point number, respectively;
the feature point information saved for user authentication is Si(Xi,Yi), Ej(Xj,Yj), Mk(Nk,Xk,Yk), Ct(Si,Ei,Mk); if the feature information generated from a newly read iris image is Si'(Xi,Yi), Ej'(Xj,Yj), Mk'(Nk,Xk,Yk), Ct'(Si,Ei,Mk), the stored Si(Xi,Yi) is compared with the newly read Si'(Xi,Yi), Ej(Xj,Yj) with Ej'(Xj,Yj), Mk(Nk,Xk,Yk) with Mk'(Nk,Xk,Yk), and Ct(Si,Ei,Mk) with Ct'(Si,Ei,Mk), and the authentication result is obtained after these comparisons; the comparison of each pair of feature points obtains its result by calculating the distance and the angle between the two feature points;
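To make the record layout of step S1 concrete, the feature point sets Si(Xi,Yi), Ej(Xj,Yj), Mk(Nk,Xk,Yk) and Ct(Si,Ei,Mk) could be modeled as in the following sketch. This is an illustrative Python representation only: the class and field names are assumptions, and the distance and angle tolerances in the comparison helper are example values, not figures specified by the patent.

```python
from dataclasses import dataclass
from math import hypot, atan2, degrees
from typing import List

@dataclass
class EndPoint:                 # used for both internal (S) and external (E) end points
    idx: int                    # sequence number i or j
    x: float                    # position coordinates X, Y
    y: float

@dataclass
class CrossPoint:               # Mk(Nk, Xk, Yk)
    idx: int                    # sequence number k
    n_lines: int                # number of crossing fishbone lines N
    x: float
    y: float

@dataclass
class RelatedInfo:              # Ct(Si, Ei, Mk): sequence numbers only
    idx: int                    # sequence number t
    s_ids: List[int]            # internal end point set numbers
    e_ids: List[int]            # external end point set numbers
    m_ids: List[int]            # cross point numbers

def points_match(p, q, dist_tol=3.0, angle_tol=5.0) -> bool:
    """Compare two feature points by the distance between them and by their angle
    around the pupil center (taken as the origin); tolerances are assumed values."""
    dist = hypot(p.x - q.x, p.y - q.y)
    ang = abs(degrees(atan2(p.y, p.x)) - degrees(atan2(q.y, q.x)))
    return dist <= dist_tol and min(ang, 360.0 - ang) <= angle_tol
```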
step S2, the step of performing authentication using the feature point position information specifically includes:
step S21, comparing whether the saved external end point information is consistent with the read-in external end point information; if so, the matching succeeds and go to step S22, otherwise go to step S28;
step S22, comparing whether the saved internal end point information is consistent with the read-in internal end point information; if so, the matching succeeds and go to step S23, otherwise go to step S28;
step S23, comparing whether the saved cross point information is consistent with the read-in cross point information; if so, the matching succeeds and go to step S24, otherwise go to step S25;
step S24, comparing whether the saved related feature point information is consistent with the read-in related feature point information; if so, the matching succeeds and go to step S27, otherwise go to step S26;
step S25, when the comparison of the saved cross point information with the read-in cross point information has produced an error, applying a weighting value to the fishbone line count; if the information is consistent after the weighting is applied, the matching succeeds and go to step S24, otherwise go to step S28;
step S26, when the comparison of the saved related feature point information with the read-in related feature point information has produced an error on the cross point sequence number, applying a weighting value to the cross point sequence number; if the information is consistent after the weighting is applied, the matching succeeds and go to step S27, otherwise go to step S28;
step S27, judging that the user authentication succeeds;
step S28, judging that the user authentication fails;
in each comparison stage, user authentication failure is output directly as soon as any single stage fails, so the calculation speed is high and the method is well suited to efficiently searching among a large number of stored iris records;
in the comparison of the saved cross point information with the read-in cross point information, and in the comparison of the saved related feature point information with the read-in related feature point information, the number of fishbone lines may be inconsistent depending on the quality of the iris image;
when authentication fails at the stage of comparing the saved cross point information with the newly read cross point information, a weighting value is applied to the fishbone line count of the cross point, the comparison is recalculated, and the user authentication result is then obtained;
similarly, when authentication fails during the comparison of the saved related feature point information with the read-in related feature point information, a weighting value is applied to the cross point sequence number, the comparison is recalculated, and the user authentication result is then obtained.
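The staged flow of steps S21 through S28 could then be outlined as below. This is a structural sketch only: the exact-equality comparisons and the "weighted" fallbacks stand in for the patent's distance/angle calculation and weighting, and the layout of the stored and freshly read records (dicts of tuples) is an assumption.

```python
# Assumed record layout: a cross point is a tuple (N, X, Y); a related record is a
# tuple of end point numbers followed by the cross point number.
def match_exact(stored, fresh):
    return stored == fresh

def match_cross_weighted(stored, fresh):
    # Step S25: weighting applied to the fishbone line count N; as a stand-in,
    # ignore N and compare only the (X, Y) coordinates.
    return [m[1:] for m in stored] == [m[1:] for m in fresh]

def match_related_weighted(stored, fresh):
    # Step S26: weighting applied to the cross point sequence number; as a stand-in,
    # ignore that last field and compare the end point numbers only.
    return [c[:-1] for c in stored] == [c[:-1] for c in fresh]

def authenticate(stored, fresh):
    """Staged comparison per steps S21-S28; stored/fresh are dicts keyed by
    'external', 'internal', 'cross' and 'related'."""
    if not match_exact(stored["external"], fresh["external"]):               # S21
        return False                                                         # S28
    if not match_exact(stored["internal"], fresh["internal"]):               # S22
        return False                                                         # S28
    if not match_exact(stored["cross"], fresh["cross"]):                     # S23
        if not match_cross_weighted(stored["cross"], fresh["cross"]):        # S25
            return False                                                     # S28
    if not match_exact(stored["related"], fresh["related"]):                 # S24
        if not match_related_weighted(stored["related"], fresh["related"]):  # S26
            return False                                                     # S28
    return True                                                              # S27
```

As in the method described above, a failure at any stage short-circuits to the failure result, while the two weighted re-comparisons give the cross point information a second chance before authentication is rejected.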
An authentication method using iris feature point position information, the authentication method comprising:
step S1, a feature point classification step;
step S2, authentication is performed using the feature point position information.
As a preferred scheme of the present invention, in step S1 the feature points on the fishbone lines of the iris are classified: for the feature points extracted from the same fishbone line of the iris, the distance from each feature point to the pupil is calculated; among the feature points on the same line, the feature point farthest from the pupil is set as the external end point, the feature point nearest the pupil is set as the internal end point, and the feature points between the external end point and the internal end point are set as cross points;
the internal end points are denoted S, the external end points E, and the cross points M; the sets of iris feature points are written respectively as the set of internal end points Si(Xi,Yi), the set of external end points Ej(Xj,Yj), and the set of cross points Mk(Nk,Xk,Yk); the related feature point information is represented by the set of internal end point, external end point, and cross point sequence numbers Ct(Si,Ei,Mk);
in Si(Xi,Yi) and Ej(Xj,Yj), i and j denote the sequence numbers of the end points, and X, Y denote the position coordinates of each end point;
in Mk(Nk,Xk,Yk), k denotes the sequence number of the cross point, N denotes the number of crossing fishbone lines, and X, Y denote the position coordinates;
in Ct(Si,Ei,Mk), t denotes the sequence number of the related feature point record, and Si, Ei, Mk denote the internal end point set number, the external end point set number, and the cross point number, respectively;
the feature point information saved for user authentication is Si(Xi,Yi), Ej(Xj,Yj), Mk(Nk,Xk,Yk), Ct(Si,Ei,Mk); if the feature information generated from a newly read iris image is Si'(Xi,Yi), Ej'(Xj,Yj), Mk'(Nk,Xk,Yk), Ct'(Si,Ei,Mk), the stored Si(Xi,Yi) is compared with the newly read Si'(Xi,Yi), Ej(Xj,Yj) with Ej'(Xj,Yj), Mk(Nk,Xk,Yk) with Mk'(Nk,Xk,Yk), and Ct(Si,Ei,Mk) with Ct'(Si,Ei,Mk), and the authentication result is obtained after these comparisons; the comparison of each pair of feature points obtains its result by calculating the distance and the angle between the two feature points.
As a preferable embodiment of the present invention, step S2 specifically includes:
step S21, comparing whether the saved external end point information is consistent with the read-in external end point information; if so, the matching succeeds and go to step S22, otherwise go to step S28;
step S22, comparing whether the saved internal end point information is consistent with the read-in internal end point information; if so, the matching succeeds and go to step S23, otherwise go to step S28;
step S23, comparing whether the saved cross point information is consistent with the read-in cross point information; if so, the matching succeeds and go to step S24, otherwise go to step S25;
step S24, comparing whether the saved related feature point information is consistent with the read-in related feature point information; if so, the matching succeeds and go to step S27, otherwise go to step S26;
step S25, when the comparison of the saved cross point information with the read-in cross point information has produced an error, applying a weighting value to the fishbone line count; if the information is consistent after the weighting is applied, the matching succeeds and go to step S24, otherwise go to step S28;
step S26, when the comparison of the saved related feature point information with the read-in related feature point information has produced an error on the cross point sequence number, applying a weighting value to the cross point sequence number; if the information is consistent after the weighting is applied, the matching succeeds and go to step S27, otherwise go to step S28;
step S27, judging that the user authentication succeeds;
step S28, judging that the user authentication has failed.
As a preferred scheme of the invention, in each comparison stage user authentication failure is output directly as soon as any single stage fails, so the calculation speed is high and the method is well suited to efficiently searching among a large number of stored iris records.
As a preferred embodiment of the present invention, in the comparison of the saved cross point information with the read-in cross point information, and in the comparison of the saved related feature point information with the read-in related feature point information, the number of fishbone lines may be inconsistent depending on the quality of the iris image.
As a preferable aspect of the present invention, when authentication fails at the stage of comparing the saved cross point information with the newly read cross point information, a weighting value is applied to the fishbone line count of the cross point, the comparison is recalculated, and the user authentication result is then obtained;
similarly, when authentication fails during the comparison of the saved related feature point information with the read-in related feature point information, a weighting value is applied to the cross point sequence number, the comparison is recalculated, and the user authentication result is then obtained.
The invention has the following beneficial effects: the authentication method using iris feature point position information can improve authentication efficiency and accuracy. The invention extracts feature points from the iris image and performs its calculations on position information and feature point classification information, so it is faster than methods based on an iris code or a feature vector, and user authentication can still be carried out when the iris image is partially occluded by the eyelid or noise is introduced by eyelashes, which greatly improves convenience for the user.
Drawings
Fig. 1 is a schematic view of an iris image.
Fig. 2 is a schematic diagram of a partially cropped and thinned iris image.
Fig. 3 is a schematic diagram of the feature point marks on the thinned iris image.
Fig. 4 is a schematic diagram illustrating extraction of feature point information of an iris image.
FIG. 5 is a flowchart illustrating user authentication according to the present invention.
Detailed Description
Preferred embodiments of the present invention will be described in detail below with reference to the accompanying drawings.
Example one
The invention discloses an authentication method by using position information of iris feature points, which comprises the following steps:
step S1, a characteristic point classification step;
classifying the feature points on the fishbone lines of the iris: for the feature points extracted from the same fishbone line of the iris, the distance from each feature point to the pupil is calculated; among the feature points on the same line, the feature point farthest from the pupil is set as the external end point, the feature point nearest the pupil is set as the internal end point, and the feature points between the external end point and the internal end point are set as cross points;
the internal end points are denoted S, the external end points E, and the cross points M; the sets of iris feature points are written respectively as the set of internal end points Si(Xi,Yi), the set of external end points Ej(Xj,Yj), and the set of cross points Mk(Nk,Xk,Yk); the related feature point information is represented by the set of internal end point, external end point, and cross point sequence numbers Ct(Si,Ei,Mk);
in Si(Xi,Yi) and Ej(Xj,Yj), i and j denote the sequence numbers of the end points, and X, Y denote the position coordinates of each end point;
in Mk(Nk,Xk,Yk), k denotes the sequence number of the cross point, N denotes the number of crossing fishbone lines, and X, Y denote the position coordinates;
in Ct(Si,Ei,Mk), t denotes the sequence number of the related feature point record, and Si, Ei, Mk denote the internal end point set number, the external end point set number, and the cross point number, respectively;
the feature point information saved for user authentication is Si(Xi,Yi), Ej(Xj,Yj), Mk(Nk,Xk,Yk), Ct(Si,Ei,Mk); if the feature information generated from a newly read iris image is Si'(Xi,Yi), Ej'(Xj,Yj), Mk'(Nk,Xk,Yk), Ct'(Si,Ei,Mk), the stored Si(Xi,Yi) is compared with the newly read Si'(Xi,Yi), Ej(Xj,Yj) with Ej'(Xj,Yj), Mk(Nk,Xk,Yk) with Mk'(Nk,Xk,Yk), and Ct(Si,Ei,Mk) with Ct'(Si,Ei,Mk), and the authentication result is obtained after these comparisons; the comparison of each pair of feature points obtains its result by calculating the distance and the angle between the two feature points;
step S2, performing authentication using the feature point position information, specifically including:
step S500, comparing the saved external end point information with the read-in external end point information; if the matching succeeds, go to step S510, otherwise go to step S28;
step S510, comparing the saved internal end point information with the read-in internal end point information; if the matching succeeds, go to step S520, otherwise go to step S28;
step S520, comparing the saved cross point information with the read-in cross point information; if the matching succeeds, go to step S530, otherwise go to step S540;
step S530, comparing the saved related feature point information with the read-in related feature point information; if the matching succeeds, go to step S27, otherwise go to step S550;
step S540, when the comparison of the saved cross point information with the read-in cross point information has produced an error, applying a weighting value to the fishbone line count and comparing again; if the matching succeeds, go to step S530, otherwise go to step S28;
step S550, when the comparison of the saved related feature point information with the read-in related feature point information has produced an error on the cross point sequence number, applying a weighting value to the cross point sequence number and comparing again; if the matching succeeds, go to step S27, otherwise go to step S28;
step S27, judging that the user authentication succeeds;
step S28, judging that the user authentication fails;
in each comparison stage, user authentication failure is output directly as soon as any single stage fails, so the calculation speed is high and the method is well suited to efficiently searching among a large number of stored iris records;
in the comparison of the saved cross point information with the read-in cross point information, and in the comparison of the saved related feature point information with the read-in related feature point information, the number of fishbone lines may be inconsistent depending on the quality of the iris image;
when authentication fails at the stage of comparing the saved cross point information with the newly read cross point information, a weighting value is applied to the fishbone line count of the cross point, the comparison is recalculated, and the user authentication result is then obtained;
similarly, when authentication fails during the comparison of the saved related feature point information with the read-in related feature point information, a weighting value is applied to the cross point sequence number, the comparison is recalculated, and the user authentication result is then obtained.
Example two
The black area in the middle of the image in Fig. 1 is the pupil; the remaining part is the iris image, in which the fishbone lines are visible.
Fig. 2 shows an image in which a part of the iris image of Fig. 1 has been extracted and thinned. Feature points of the fishbone lines can be found in the thinned image.
Fig. 3 shows a portion of the fishbone lines of the thinned iris of Fig. 2, with the feature points of each fishbone line labeled. In the method of the invention for classifying iris feature points and authenticating a user through position information, the distance from the pupil is calculated for the feature points extracted from the same fishbone line of the iris, and the points are classified into far external end points (300-309), near internal end points (310-316), and cross points (320-322) between the external and internal end points, as sketched below.
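A sketch of this classification by distance from the pupil follows; it is an assumed implementation in which the feature points of one thinned fishbone line are sorted by their distance from the pupil center, and the helper and variable names are illustrative only.

```python
from math import hypot

def classify_line_points(points, pupil_center):
    """Split the feature points of one fishbone line into the internal end point
    (nearest the pupil), the external end point (farthest), and cross points."""
    cx, cy = pupil_center
    ordered = sorted(points, key=lambda p: hypot(p[0] - cx, p[1] - cy))
    internal, external = ordered[0], ordered[-1]
    crosses = ordered[1:-1]
    return internal, external, crosses

# Example: four feature points on one fishbone line, pupil centered at (0, 0).
internal, external, crosses = classify_line_points(
    [(30, 5), (12, 3), (21, 4), (25, 4)], pupil_center=(0, 0))
print(internal, external, crosses)   # (12, 3) (30, 5) [(21, 4), (25, 4)]
```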
If the internal end points are denoted S, the external end points E, and the cross points M, the sets of iris feature points are written respectively as the set of internal end points Si(Xi,Yi), the set of external end points Ej(Xj,Yj), and the set of cross points Mk(Nk,Xk,Yk). The related feature point information can be represented by the set of internal end point, external end point, and cross point sequence numbers, i.e. Ct(Si,Ei,Mk).
In Si(Xi,Yi) and Ej(Xj,Yj), i and j denote the sequence numbers of the end points, and X, Y denote the position coordinates of each end point.
In Mk(Nk,Xk,Yk), k denotes the sequence number of the cross point, N denotes the number of crossing fishbone lines, and X, Y denote the position coordinates.
In Ct(Si,Ei,Mk), t denotes the sequence number of the related feature point record, and Si, Ei, Mk denote the internal end point set number, the external end point set number, and the cross point number, respectively.
The feature point information saved for user authentication is Si(Xi,Yi), Ej(Xj,Yj), Mk(Nk,Xk,Yk), Ct(Si,Ei,Mk). If the feature information generated from a newly read iris image is Si'(Xi,Yi), Ej'(Xj,Yj), Mk'(Nk,Xk,Yk), Ct'(Si,Ei,Mk), the stored Si(Xi,Yi) is compared with the newly read Si'(Xi,Yi), Ej(Xj,Yj) with Ej'(Xj,Yj), Mk(Nk,Xk,Yk) with Mk'(Nk,Xk,Yk), and Ct(Si,Ei,Mk) with Ct'(Si,Ei,Mk), and the authentication result is obtained after these comparisons. The comparison of each pair of feature points obtains its result by calculating the distance and the angle between the two feature points.
Fig. 4 is a schematic diagram of a part of Fig. 3, cut out to further illustrate the extraction of iris image feature points. The feature points in the figure consist of external end points (404, 405), internal end points (411, 412), and a cross point (422). The first external end point (404) is E1(X1,Y1), the second external end point (405) is E2(X2,Y2), the first internal end point (411) is S1(X3,Y3), the second internal end point (412) is S2(X4,Y4), and the cross point (422) is M1(4,X5,Y5); the related feature point information is C1(S1,S2,E1,E2,M1). Analyzing M1 within C1 shows that the 4 fishbone lines connected at the cross point lie on the extensions of S1, S2, E1 and E2.
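Reusing the EndPoint, CrossPoint and RelatedInfo classes from the illustrative sketch in the Disclosure section, the worked example of Fig. 4 could be encoded as follows; the numeric coordinates are placeholders, since the figure labels the points only symbolically.

```python
# Placeholder coordinates; only the structure mirrors Fig. 4.
E1 = EndPoint(idx=1, x=52.0, y=18.0)                # external end point 404: E1(X1, Y1)
E2 = EndPoint(idx=2, x=55.0, y=30.0)                # external end point 405: E2(X2, Y2)
S1 = EndPoint(idx=1, x=20.0, y=22.0)                # internal end point 411: S1(X3, Y3)
S2 = EndPoint(idx=2, x=21.0, y=28.0)                # internal end point 412: S2(X4, Y4)
M1 = CrossPoint(idx=1, n_lines=4, x=37.0, y=25.0)   # cross point 422: M1(4, X5, Y5)

# C1(S1, S2, E1, E2, M1): the 4 fishbone lines meeting at M1 lie on the extensions
# toward S1, S2, E1 and E2.
C1 = RelatedInfo(idx=1, s_ids=[1, 2], e_ids=[1, 2], m_ids=[1])
```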
Fig. 5 is the user authentication flowchart of the invention. It comprises: a step (500) of comparing the saved external end point information with the read-in external end point information; a step (510) of comparing the saved internal end point information with the read-in internal end point information; a step (520) of comparing the saved cross point information with the read-in cross point information; a step (530) of comparing the saved related feature point information with the read-in related feature point information; a step (540) of applying a weighting value to the fishbone line count when an error occurs in the comparison of the saved and read-in cross point information; and a step (550) of applying a weighting value to the cross point sequence number when an error occurs on the cross point sequence number in the comparison of the saved and read-in related feature point information.
In the comparison (520) of the saved cross point information with the read-in cross point information, and in the comparison (530) of the saved related feature point information with the read-in related feature point information, the number of fishbone lines may be inconsistent depending on the quality of the iris image.
When authentication fails at the stage (520) of comparing the saved cross point information with the newly read cross point information, a weighting value is applied to the fishbone line count of the cross point and the comparison is recalculated (540), after which the user authentication result is obtained.
Similarly, when authentication fails during the comparison (530) of the saved related feature point information with the read-in related feature point information, a weighting value is applied to the cross point sequence number and the comparison is recalculated (550), after which the user authentication result is obtained.
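One way to read the weighted re-comparison of steps (540) and (550) is sketched below: instead of requiring the fishbone line count N of a cross point to agree exactly, its disagreement is multiplied by a weighting factor and checked against a threshold. The weighting formula, tolerance and threshold are assumptions for illustration; the patent does not specify them.

```python
from math import hypot

def weighted_cross_match(stored, fresh, weight=0.5, dist_tol=3.0, threshold=1.0):
    """Re-compare cross points (N, X, Y), giving the fishbone line count N a
    weighted penalty instead of requiring exact agreement."""
    if len(stored) != len(fresh):
        return False
    for (n1, x1, y1), (n2, x2, y2) in zip(stored, fresh):
        if hypot(x1 - x2, y1 - y2) > dist_tol:        # positions must still agree
            return False
        if weight * abs(n1 - n2) > threshold:         # weighted disagreement in N
            return False
    return True

# Example: the stored cross point has 4 crossing lines, the new image yields 3.
print(weighted_cross_match([(4, 37.0, 25.0)], [(3, 37.5, 25.2)]))   # True
```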
In summary, the authentication method using iris feature point position information provided by the invention can improve authentication efficiency and accuracy. The invention extracts feature points from the iris image and performs its calculations on position information and feature point classification information, so it is faster than methods based on an iris code or a feature vector, and user authentication can still be carried out when the iris image is partially occluded by the eyelid or noise is introduced by eyelashes, which greatly improves convenience for the user.
The description and applications of the invention herein are illustrative and are not intended to limit the scope of the invention to the embodiments described above. Variations and modifications of the embodiments disclosed herein are possible, and alternative and equivalent various components of the embodiments will be apparent to those skilled in the art. It will be clear to those skilled in the art that the present invention may be embodied in other forms, structures, arrangements, proportions, and with other components, materials, and parts, without departing from the spirit or essential characteristics thereof. Other variations and modifications of the embodiments disclosed herein may be made without departing from the scope and spirit of the invention.

Claims (1)

1. An authentication method using position information of an iris feature point, the authentication method comprising:
step S1, a feature point classification step;
classifying the feature points on the fishbone lines of the iris: for the feature points extracted from the same fishbone line of the iris, the distance from each feature point to the pupil is calculated; among the feature points on the same line, the feature point farthest from the pupil is set as the external end point, the feature point nearest the pupil is set as the internal end point, and the feature points between the external end point and the internal end point are set as cross points;
the internal end points are denoted S, the external end points E, and the cross points M; the sets of iris feature points are written respectively as the set of internal end points Si(Xi,Yi), the set of external end points Ej(Xj,Yj), and the set of cross points Mk(Nk,Xk,Yk); the related feature point information is represented by the set of internal end point, external end point, and cross point sequence numbers Ct(Si,Ei,Mk);
in Si(Xi,Yi) and Ej(Xj,Yj), i and j denote the sequence numbers of the end points, and X, Y denote the position coordinates of each end point;
in Mk(Nk,Xk,Yk), k denotes the sequence number of the cross point, N denotes the number of crossing fishbone lines, and X, Y denote the position coordinates;
in Ct(Si,Ei,Mk), t denotes the sequence number of the related feature point record, and Si, Ei, Mk denote the internal end point set number, the external end point set number, and the cross point number, respectively;
the feature point information saved for user authentication is Si(Xi,Yi), Ej(Xj,Yj), Mk(Nk,Xk,Yk), Ct(Si,Ei,Mk); if the feature information generated from a newly read iris image is Si'(Xi,Yi), Ej'(Xj,Yj), Mk'(Nk,Xk,Yk), Ct'(Si,Ei,Mk), the stored Si(Xi,Yi) is compared with the newly read Si'(Xi,Yi), Ej(Xj,Yj) with Ej'(Xj,Yj), Mk(Nk,Xk,Yk) with Mk'(Nk,Xk,Yk), and Ct(Si,Ei,Mk) with Ct'(Si,Ei,Mk), and the authentication result is obtained after these comparisons; the comparison of each pair of feature points obtains its result by calculating the distance and the angle between the two feature points;
step S2, the step of performing authentication using the feature point position information specifically includes:
step S21, comparing whether the saved external end point information is consistent with the read-in external end point information; if so, the matching succeeds and go to step S22, otherwise go to step S28;
step S22, comparing whether the saved internal end point information is consistent with the read-in internal end point information; if so, the matching succeeds and go to step S23, otherwise go to step S28;
step S23, comparing whether the saved cross point information is consistent with the read-in cross point information; if so, the matching succeeds and go to step S24, otherwise go to step S25;
step S24, comparing whether the saved related feature point information is consistent with the read-in related feature point information; if so, the matching succeeds and go to step S27, otherwise go to step S26;
step S25, when the comparison of the saved cross point information with the read-in cross point information has produced an error, applying a weighting value to the fishbone line count; if the information is consistent after the weighting is applied, the matching succeeds and go to step S24, otherwise go to step S28;
step S26, when the comparison of the saved related feature point information with the read-in related feature point information has produced an error on the cross point sequence number, applying a weighting value to the cross point sequence number; if the information is consistent after the weighting is applied, the matching succeeds and go to step S27, otherwise go to step S28;
step S27, judging that the user authentication is successful;
step S28, judging that the user authentication fails;
in each comparison stage, user authentication failure is output directly as soon as any single stage fails;
in the comparison of the saved cross point information with the read-in cross point information, and in the comparison of the saved related feature point information with the read-in related feature point information, the number of fishbone lines may be inconsistent depending on the quality of the iris image;
when authentication fails at the stage of comparing the saved cross point information with the newly read cross point information, a weighting value is applied to the fishbone line count of the cross point, the comparison is recalculated, and the user authentication result is then obtained;
similarly, when authentication fails during the comparison of the saved related feature point information with the read-in related feature point information, a weighting value is applied to the cross point sequence number, the comparison is recalculated, and the user authentication result is then obtained.
Application CN201610980778.2A, filed 2016-11-08 (priority date 2016-11-08): Authentication method using iris characteristic point position information; granted as CN106599782B (en), status Active.

Priority Applications (1)

Application number: CN201610980778.2A; priority date and filing date: 2016-11-08; title: Authentication method using iris characteristic point position information.

Publications (2)

CN106599782A (en), published 2017-04-26
CN106599782B (en), published 2020-06-09

Family

ID: 58590168 (CN)




Legal Events

PB01: Publication
SE01: Entry into force of request for substantive examination
GR01: Patent grant