WO2019103520A1 - Device and method for user authentication based on iris recognition - Google Patents

Device and method for user authentication based on iris recognition Download PDF

Info

Publication number
WO2019103520A1
Authority
WO
WIPO (PCT)
Prior art keywords
iris
eye
normalized
image
hamming distance
Prior art date
Application number
PCT/KR2018/014515
Other languages
English (en)
Korean (ko)
Inventor
안드레비치 오디노키코그래브
미크하일로비치 팔투코브알렉세이
설지비치 그나티유크비탈리
알렉스비치 에레미브블라디미르
유주완
이광현
이희준
신대규
Original Assignee
Samsung Electronics Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from RU2017141021A external-priority patent/RU2670798C9/ru
Application filed by Samsung Electronics Co., Ltd.
Priority to EP18880306.8A priority Critical patent/EP3680794A4/fr
Priority to CN201880073875.0A priority patent/CN111344703B/zh
Priority to US16/765,298 priority patent/US11449590B2/en
Publication of WO2019103520A1 publication Critical patent/WO2019103520A1/fr

Links

Images

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/30Authentication, i.e. establishing the identity or authorisation of security principals
    • G06F21/31User authentication
    • G06F21/32User authentication using biometric data, e.g. fingerprints, iris scans or voiceprints
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/30Authentication, i.e. establishing the identity or authorisation of security principals
    • G06F21/45Structures or tools for the administration of authentication
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis

Definitions

  • the present disclosure relates to an iris recognition-based user authentication apparatus and method, and more particularly, to an apparatus and method for performing iris recognition for users, including users whose irises are only partially visible.
  • the conventional iris recognition method requires registration for a sufficient area (for example, 40% or more) of the iris to obtain a reliable recognition result.
  • in practice, however, the iris region may be obscured not only by the user's eyelids but also by highlights caused by wearing glasses.
  • to always obtain a reliable authentication level in various external environments, the user must keep the eyes wide open during both the iris registration process and the recognition process.
  • such external environments include bright light reflected directly or indirectly from other objects, as well as snow, wind, dust, fog, or smoke; keeping the eyes open may also be difficult because of a condition such as ptosis.
  • FIG. 1 is a diagram illustrating an iris and a pupil region on an eye region image of a user according to some embodiments.
  • an eye region image 10 of a user may be roughly divided into an iris region 101 and a pupil region 103.
  • the conventional iris recognition technology extracts iris features, that is, the iris texture, through image analysis of the iris region 101, and compares the extracted iris features with previously registered iris data to check whether they match.
  • here, texture refers to image patches or patterns characterized by statistical indices describing the directional characteristics and spatial frequency of the image pattern.
  • each user can register and manage iris data in advance through image capturing and analysis of his / her iris and use the iris data as a user authentication means.
  • in one case, the pupil region 103 is not covered by the user's eyelids and, although the iris region 101 is partially obscured by the user's eyelids, a sufficient area necessary for recognition is exposed.
  • in another case, the pupil region 103 and the iris region 101 are both partially obscured by the user's eyelids; in particular, when the iris region 101 does not expose a sufficient area, iris recognition and user authentication may be difficult.
  • One conventional biometric system using iris recognition modifies the similarity score calculation by determining a trust score based on the local density of the similarity between the compared iris images.
  • Another conventional biometric system using iris recognition employs a recognition method that fuses a plurality of features of the user's eye image.
  • the plurality of features includes both information about parameters of the iris texture and additional information about the eye color and the area around the eyes.
  • this additional information can reduce the number of failed recognition attempts because it increases recognition accuracy, but it requires additional computational resources, and only information about one eye is used; therefore, it is not suitable as a solution to the problem of recognizing a partially obscured eye.
  • another conventional approach combines iris recognition with fingerprint recognition and face recognition to improve recognition accuracy.
  • in such face recognition, the left and right iris images can be rotated by an angle that removes the head tilt, using an imaginary line extending between the user's two eyes according to predefined features associated with the eyes.
  • the iris image generated according to rotation can be used for registration or authentication, and the head tilt angle can be determined using a predefined feature associated with one eye in the image.
  • a single integrated similarity score based on the iris similarity scores of the two eyes is generated and used for user identification, but no geometric parameter is used in generating the integrated similarity score.
  • geometric parameters such as the interpupillary distance (IPD) and the IPD-to-iris ratio can be used for selecting an appropriate template in a pre-stored database for fast searching, or for iris image quality estimation.
  • this method is also not suitable as a solution to the problem concerning the recognition of the partially obscured eye.
  • Some embodiments can provide an iris recognition based user authentication apparatus and method that can increase the accuracy of iris recognition through analysis of iris and pupil of the left and right eyes of a user.
  • in addition, some embodiments can provide an iris recognition-based user authentication apparatus and method capable of efficiently performing iris recognition using information on the geometric parameters of the iris acquired through analysis of the user's pupil image and iris image.
  • a first aspect of the present disclosure provides a method comprising: acquiring an image of a user's left and right eyes; extracting a pupil image and an iris image from the image; analyzing the iris image to obtain a first iris feature; analyzing the iris image and the pupil image to obtain a second iris feature; obtaining a similarity score based on the first iris feature, the second iris feature, and previously stored reference iris data; and determining whether to approve user authentication based on the similarity score.
  • a second aspect of the present disclosure provides an apparatus comprising: at least one processor configured to acquire an image of a user's left and right eyes, extract a pupil image and an iris image from the image, analyze the iris image to obtain a first iris feature, analyze the iris image and the pupil image to obtain a second iris feature, obtain a similarity score based on the first iris feature, the second iris feature, and previously stored reference iris data, and determine whether to approve user authentication based on the similarity score; and at least one memory storing the reference iris data.
  • the third aspect of the present disclosure can also provide a computer program product comprising a computer-readable recording medium storing a program for causing a computer to execute the method of the first aspect.
  • Some embodiments have the effect of increasing the accuracy of iris recognition through the analysis of the iris and pupil of the left eye and the right eye of the user.
  • some embodiments have the effect of efficiently performing iris recognition using information on the geometric parameters of the iris acquired through analysis of the pupil image and the iris image of the user.
  • FIG. 1 is a diagram illustrating an iris and a pupil region on an eye region image of a user according to some embodiments.
  • FIG. 2 is a diagram illustrating an example of an iris recognition-based user authentication apparatus according to some embodiments.
  • FIG. 3 is a flow diagram of an iris recognition based user authentication method in accordance with some embodiments of the present disclosure.
  • FIG. 4 is a diagram illustrating a first iris feature and a second iris feature acquisition method in accordance with some embodiments of the present disclosure.
  • FIG. 5 is a view showing positions of pupil and iris on a coordinate system according to some embodiments.
  • FIG. 6 is a flow diagram of a method for calculating a similarity score in accordance with some embodiments.
  • FIG. 7 is a flow diagram of a method for computing a first similarity parameter according to some embodiments.
  • FIG. 8 is a diagram illustrating a method of calculating a similarity score using a first similarity parameter and a second similarity parameter.
  • FIG. 9 is a flow diagram illustrating a method for determining whether a user is authorized by comparing a similarity score and a threshold score in accordance with some embodiments.
  • a method comprising: obtaining an image of a user's left and right eyes; Extracting a pupil image and an iris image from the image; Analyzing the iris image to obtain a first iris feature; Analyzing the iris image and the pupil image to obtain a second iris feature; Obtaining a similarity score based on the first iris feature, the second iris feature, and pre-stored reference iris data; And determining whether the user authentication is approved based on the similarity score.
  • according to some embodiments, an apparatus may be provided that includes: at least one processor configured to acquire an image of a user's left and right eyes, extract a pupil image and an iris image from the image, analyze the iris image to obtain a first iris feature, analyze the iris image and the pupil image to obtain a second iris feature, obtain a similarity score based on the first iris feature, the second iris feature, and previously stored reference iris data, and determine whether to approve user authentication based on the similarity score; and at least one memory storing the reference iris data.
  • a computer program product may be provided that includes a computer-readable recording medium storing a program for causing a computer to execute the method of the first aspect.
  • FIG. 2 is a diagram illustrating an example of an iris recognition-based user authentication apparatus according to some embodiments.
  • the iris recognition-based user authentication apparatus 20 can acquire an image 21 of the left and right eyes of the user 2 to perform user authentication.
  • the iris recognition based user authentication apparatus 20 may include a processor 201 and a memory 203.
  • the processor 201 may obtain the image 21 for the left and right eyes of the user.
  • the processor 201 may, for example, capture a user image directly through an image acquisition unit to obtain the image 21 of the user's left and right eyes, or may receive from another device the image 21 of the user's left and right eyes captured by that device.
  • the processor 201 may extract pupil images and iris images from the images 21 for the user's left and right eyes.
  • the processor 201 can, for example, separately obtain the pupil image and iris image of the left eye and the pupil image and iris image of the right eye from the image 21 of the user's left and right eyes.
  • the processor 201 may analyze the iris image to obtain a first iris feature.
  • the processor 201 may perform, for example, segmentation and normalization on the iris to obtain a first iris feature.
  • the first iris feature may include information about an iris texture feature.
  • the first iris feature may include information about the extracted iris texture, for example, by analyzing the iris image obtained from the image 21 for the left and right eyes of the user.
  • the first iris feature may be, for example, converted into an encoded and binarized iris code.
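As an illustration only, a binarized iris code of the kind mentioned above is conventionally produced by filtering the normalized iris texture and quantizing the filter phase into bits. The sketch below shows a minimal Daugman-style encoder; the filter choice, the parameter values, and the function name encode_iris_code are assumptions made for illustration and are not specified by this disclosure.

```python
import numpy as np

def encode_iris_code(normalized_iris, wavelength=16, sigma=8):
    # Minimal Daugman-style phase-quantization encoder (illustrative only).
    # normalized_iris: 2-D float array, rows = radial samples, cols = angular samples.
    # Assumes the angular resolution (cols) exceeds the filter length.
    # Returns a binary code of shape (rows, cols, 2): sign bits of the real and
    # imaginary parts of a 1-D complex Gabor response along the angular direction.
    rows, cols = normalized_iris.shape
    x = np.arange(-3 * sigma, 3 * sigma + 1)
    gabor = np.exp(-x**2 / (2.0 * sigma**2)) * np.exp(1j * 2.0 * np.pi * x / wavelength)

    code = np.zeros((rows, cols, 2), dtype=np.uint8)
    pad = len(x)
    for r in range(rows):
        row = normalized_iris[r]
        wrapped = np.concatenate([row[-pad:], row, row[:pad]])   # wrap around in angle
        resp = np.convolve(wrapped, gabor, mode="same")[pad:pad + cols]
        code[r, :, 0] = (resp.real > 0)
        code[r, :, 1] = (resp.imag > 0)
    return code
```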
  • the processor 201 may analyze the iris image and the pupil image to obtain the second iris feature.
  • the second iris feature may include information about the pupil position of the user and geometric parameters for the iris position.
  • the second iris feature may include, for example, information about the iris circle radius (R_I), the ratio between the pupil circle radius and the iris circle radius (PIR), the X-axis normalized distance (NDX), and the Y-axis normalized distance (NDY).
  • the iris circle means the circle having the smallest radius among the circles circumscribing the iris.
  • the pupil circle means the circle having the smallest radius among the circles circumscribing the pupil.
  • the processor 201 can obtain the second iris feature based on the iris circle center coordinates (X_I, Y_I) and the iris circle radius (R_I) determined by analyzing the iris image, and the pupil circle center coordinates (X_P, Y_P) and the pupil circle radius (R_P) determined by analyzing the pupil image.
  • the processor 201 may obtain a similarity score based on the first iris feature, the second iris feature, and the reference iris data stored in advance in the memory 203.
  • the processor 201 can generate the first similarity parameter by comparing the iris code corresponding to the first iris feature and the reference code included in the reference iris data stored in advance in the memory 203.
  • the processor 201 may generate the second similarity parameter by comparing the second iris feature and the reference feature included in the reference iris data stored in the memory 203 in advance.
  • the processor 201 may obtain a similarity score using the first similarity parameter and the second similarity parameter.
  • the processor 201 may apply the first similarity parameter and the second similarity parameter to at least one of a predetermined mathematical expression and a parameter mapping table to obtain a similarity score used to determine whether to approve user authentication.
  • the processor 201 may determine whether to approve user authentication based on the similarity score.
  • the processor 201 may compare the obtained similarity score with, for example, a preset threshold score to determine whether to approve the user authentication.
  • FIG. 3 is a flow diagram of an iris recognition based user authentication method in accordance with some embodiments of the present disclosure.
  • the processor 201 may acquire images of the left and right eyes of the user.
  • the processor 201 may, for example, capture a user image directly through an image acquisition unit to obtain images of the user's left and right eyes, or may acquire from another apparatus images of the user's left and right eyes captured by that apparatus.
  • the processor 201 may extract a pupil image and an iris image from the acquired image.
  • the processor 201 can, for example, separately obtain the pupil image and iris image of the left eye and the pupil image and iris image of the right eye from the images of the user's left and right eyes.
  • the processor 201 may analyze the iris image to obtain the first iris feature.
  • the processor 201 may perform, for example, segmentation and normalization on the iris to obtain a first iris feature.
  • the first iris feature may include information about an iris texture feature.
  • the first iris feature may be, for example, converted into an encoded and binarized iris code.
  • the processor 201 may acquire the second iris feature by analyzing the iris image and the pupil image.
  • the second iris feature may include information about the pupil position of the user and geometric parameters according to the iris position.
  • the second iris feature may include, for example, information about the iris circle radius (R_I), the ratio between the pupil circle radius and the iris circle radius (PIR), the X-axis normalized distance (NDX), and the Y-axis normalized distance (NDY).
  • the processor 201 can obtain the second iris feature based on the iris circle center coordinates (X_I, Y_I) and the iris circle radius (R_I) determined by analyzing the iris image, and the pupil circle center coordinates (X_P, Y_P) and the pupil circle radius (R_P) determined by analyzing the pupil image.
  • in step S305, the processor 201 may obtain a similarity score based on the first iris feature, the second iris feature, and the previously stored reference iris data.
  • the processor 201 can generate the first similarity parameter by comparing the iris code corresponding to the first iris feature and the reference code included in the reference iris data stored in advance in the memory 203.
  • the processor 201 may generate the first similarity parameter based on the left eye normalized hamming distance HD Left , the right eye normalized hamming distance HD Right , the left eye bit count BitCount Left and the right eye bit count BitCount Right .
  • the processor 201 may, for example, calculate the left-eye normalized Hamming distance (HD_Left) and the right-eye normalized Hamming distance (HD_Right) by comparing the iris code with the previously stored reference code, in order to generate the first similarity parameter.
  • the Hamming distance (HD) is a distance function indicating the number of positions at which the symbols differ in two strings of the same length; in particular, in the case of binary codes, the Hamming distance is the number of mismatching bits.
  • the processor 201 may, for example, also determine the number of left-eye bits (BitCount_Left) used to calculate the left-eye normalized Hamming distance (HD_Left) and the number of right-eye bits (BitCount_Right) used to calculate the right-eye normalized Hamming distance (HD_Right).
  • here, the number of bits corresponds to the number of bits belonging to the area that is not masked by the user's eyelid or the like during the iris recognition process, that is, the exposed area.
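To make HD and BitCount concrete, the sketch below computes a masked, normalized Hamming distance in the conventional way: only bits that are exposed in both codes are compared, and the count of those bits is returned as the BitCount value. The mask convention (1 = usable bit), the array shapes, and the function name are assumptions of this sketch, not definitions taken from the disclosure.

```python
import numpy as np

def normalized_hamming_distance(code_probe, code_enrolled, mask_probe, mask_enrolled):
    # Compare only bits that are exposed (not masked by eyelids, eyelashes or glare)
    # in BOTH the probe code and the enrolled code.
    usable = np.logical_and(mask_probe, mask_enrolled)
    bit_count = int(usable.sum())                      # e.g. BitCount_Left or BitCount_Right
    if bit_count == 0:
        return 1.0, 0                                  # nothing comparable: worst-case distance
    mismatches = np.logical_and(np.logical_xor(code_probe, code_enrolled), usable)
    hd = mismatches.sum() / bit_count                  # fraction of mismatching usable bits
    return float(hd), bit_count

# Per-eye usage feeding the first similarity parameter (variable names are illustrative):
# hd_left,  bits_left  = normalized_hamming_distance(code_l_probe, code_l_enr, mask_l_probe, mask_l_enr)
# hd_right, bits_right = normalized_hamming_distance(code_r_probe, code_r_enr, mask_r_probe, mask_r_enr)
```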
  • the first similarity parameter may include, for example, a Hamming distance difference parameter (ΔHDnorm), which is the normalized difference between the left-eye normalized Hamming distance (HD_Left) and the right-eye normalized Hamming distance (HD_Right).
  • the first similarity parameter may include, for example, a Hamming distance average parameter (HD_avg), which is the average of the left-eye normalized Hamming distance (HD_Left) and the right-eye normalized Hamming distance (HD_Right).
  • the first similarity parameter may include, for example, a minimum total bit count parameter (BitCount_min), which is the minimum total number of bits used in the calculation of the left-eye normalized Hamming distance (HD_Left) and the right-eye normalized Hamming distance (HD_Right).
  • the first similarity parameter may include, for example, a maximum total bit count parameter (BitCount_max), which is the maximum total number of bits used in the calculation of the left-eye normalized Hamming distance (HD_Left) and the right-eye normalized Hamming distance (HD_Right).
  • the processor 201 can generate the second similarity parameter by comparing the second iris feature and the reference feature included in the reference iris data stored in advance in the memory 203.
  • the processor 201 may generate the second similarity parameter based on the iris circle radius (R_I), the ratio between the pupil circle radius and the iris circle radius (PIR), the X-axis normalized distance (NDX), and the Y-axis normalized distance (NDY) for each of the left eye and the right eye.
  • the second similarity parameter may include, for example, a minimum center-distance difference parameter (ΔND_min), which is the minimum of the distances between the center of the pupil circle and the center of the iris circle.
  • the second similarity parameter may include, for example, a maximum center-distance difference parameter (ΔND_max), which is the maximum of the distances between the center of the pupil circle and the center of the iris circle.
  • the second similarity parameter may include, for example, a radius difference average parameter (ΔR_I,avg), which is the average of the normalized iris circle radius difference of the left eye and the normalized iris circle radius difference of the right eye.
  • the second similarity parameter may include, for example, a radius ratio difference average parameter (ΔPIR_avg), which is the average of the difference in the ratio of the pupil circle radius to the iris circle radius for the left eye and the corresponding ratio difference for the right eye.
  • the processor 201 may obtain a similarity score using the first similarity parameter and the second similarity parameter.
  • the processor 201 may obtain a similarity score using the first similarity parameter, which includes, for example, the Hamming distance difference parameter (ΔHDnorm), the Hamming distance average parameter (HD_avg), the minimum total bit count parameter (BitCount_min), and the maximum total bit count parameter (BitCount_max), and the second similarity parameter, which includes the minimum center-distance difference parameter (ΔND_min), the maximum center-distance difference parameter (ΔND_max), the radius difference average parameter (ΔR_I,avg), and the radius ratio difference average parameter (ΔPIR_avg).
  • the processor 201 can determine whether to approve user authentication based on the similarity score.
  • the processor 201 may compare the obtained similarity score with, for example, a preset threshold score to determine whether to approve the user authentication.
  • FIG. 4 is a diagram illustrating a first iris feature and a second iris feature acquisition method in accordance with some embodiments of the present disclosure.
  • in FIG. 4, the superscript "Enrolled" denotes a parameter obtained from previously stored reference iris data, and the superscript "Probe" denotes a parameter obtained from the image of the user's left and right eyes.
  • the subscript "Left" denotes a parameter obtained for the left eye, and the subscript "Right" denotes a parameter obtained for the right eye.
  • the processor 201 may receive a captured user image from an external device such as, for example, an infrared camera 401, and may acquire an image of the user's left and right eyes from the received user image (402).
  • after acquiring the image of the user's left and right eyes, the processor 201 separates the left eye region and the right eye region on the image and obtains the first iris feature and the second iris feature for each of the left eye and the right eye (41, 43).
  • the processor 201 may, for example, detect the left-eye pupil on the image and determine the radius and center coordinates of the detected left-eye pupil circle.
  • the processor 201 can acquire the left pupil image through the detection of the left pupil.
  • the processor 201 can perform left-eye iris segmentation.
  • the left eye iris segmentation may include, for example, a process of determining the boundary between the left pupil and the left eye iris.
  • the left eye iris segmentation can be performed using, for example, an image segmentation algorithm.
  • the processor 201 can obtain the left-eye iris image through the left-eye iris segmentation and determine the radius and the center coordinates of the left-eye iris circle.
  • the processor 201 can normalize the left eye iris image.
  • left-eye iris image normalization can be defined, for example, as converting the pixels of the left-eye iris image from polar coordinates to linear coordinates; through normalization, the pixels of the left-eye iris image are moved from circular locations to positions in a rectangular matrix.
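A minimal sketch of this polar-to-rectangular normalization (often called rubber-sheet unwrapping) is shown below. Nearest-neighbour sampling, linear blending between the pupil and iris boundaries, the grayscale input, and the chosen output resolution are simplifying assumptions; the disclosure does not fix these details.

```python
import numpy as np

def normalize_iris(eye_image, xp, yp, rp, xi, yi, ri, radial_res=64, angular_res=512):
    # Resample the annular iris region (between the pupil circle and the iris circle)
    # from polar coordinates into a radial_res x angular_res rectangular matrix.
    # eye_image is assumed to be a 2-D grayscale array.
    h, w = eye_image.shape
    out = np.zeros((radial_res, angular_res), dtype=eye_image.dtype)
    thetas = np.linspace(0.0, 2.0 * np.pi, angular_res, endpoint=False)
    for j, t in enumerate(thetas):
        # Inner (pupil) and outer (iris) boundary points for this angle.
        x_in, y_in = xp + rp * np.cos(t), yp + rp * np.sin(t)
        x_out, y_out = xi + ri * np.cos(t), yi + ri * np.sin(t)
        for i in range(radial_res):
            r = i / (radial_res - 1)            # 0 at the pupil boundary, 1 at the iris boundary
            x = int(round((1 - r) * x_in + r * x_out))
            y = int(round((1 - r) * y_in + r * y_out))
            if 0 <= x < w and 0 <= y < h:
                out[i, j] = eye_image[y, x]
    return out
```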
  • Processor 201 may extract the left eye first iris feature from the normalized left eye iris image (413).
  • the processor 201 can generate a binarized left-eye iris code through encoding of the extracted left-eye first iris feature.
  • the processor 201 may acquire (411) the left eye second iris feature based on the information on the left pupil image obtained in the left eye pupil detection process and the information on the left eye iris image obtained in the left eye iris segmentation process .
  • the processor 201 can obtain the left-eye second iris feature (411) based on the left-eye iris circle center coordinates (X_I,Left, Y_I,Left) and the left-eye iris circle radius (R_I,Left) determined by analyzing the left-eye iris image, and the left-eye pupil circle center coordinates (X_P,Left, Y_P,Left) and the left-eye pupil circle radius (R_P,Left) determined by analyzing the left-eye pupil image.
  • the left-eye second iris feature may include, for example, the left-eye iris circle radius (R_I,Left), the ratio between the left-eye pupil circle radius and the left-eye iris circle radius (PIR_Left), the left-eye X-axis normalized distance (NDX_Left), and the left-eye Y-axis normalized distance (NDY_Left).
  • Right eye image processing includes steps similar to left eye image processing.
  • in the process 43 in which the processor 201 processes the right eye region, the following steps may be performed.
  • the processor 201 may, for example, detect the right eye pupil on the image and determine the radius and center coordinates of the detected right eye pupil.
  • the processor 201 can acquire the right pupil image through the right eye pupil detection.
  • the processor 201 can perform right eye iris segmentation.
  • the right eye iris segmentation may include, for example, a process of determining the boundary between the right eye pupil and the right eye iris.
  • the right eye iris segmentation can be performed using, for example, an image segmentation algorithm.
  • the processor 201 can obtain the right-eye iris image through the right-eye iris segmentation and determine the radius and the center coordinates of the right-eye iris circle.
  • Right eye iris image normalization can be defined, for example, as converting pixels of a right eye iris image from polar coordinates to linear coordinates. The pixels of the right eye iris image can be moved from the circular position to the rectangular matrix position through normalization.
  • Processor 201 may extract the first right eye iris feature from the normalized right eye iris image (433).
  • the processor 201 can generate a binarized right-eye iris code through encoding of the extracted right-eye first iris feature.
  • the processor 201 may acquire (431) the right eye second iris feature based on the information on the right pupil image obtained in the right eye pupil detection process and the information on the right eye iris image obtained in the right eye iris segmentation process .
  • the processor 201 can obtain the right-eye second iris feature (431) based on the right-eye iris circle center coordinates (X_I,Right, Y_I,Right) and the right-eye iris circle radius (R_I,Right) determined by analyzing the right-eye iris image, and the right-eye pupil circle center coordinates (X_P,Right, Y_P,Right) and the right-eye pupil circle radius (R_P,Right) determined by analyzing the right-eye pupil image.
  • the right-eye second iris feature may include, for example, information about the position of the pupil and iris, such as the right-eye iris circle radius (R_I,Right), the ratio between the right-eye pupil circle radius and the right-eye iris circle radius (PIR_Right), the right-eye X-axis normalized distance (NDX_Right), and the right-eye Y-axis normalized distance (NDY_Right).
  • a method for acquiring information on the pupil and iris geometric parameters of the left eye and right eye will be described later with reference to FIG.
  • the processor 201 may compare the first iris feature and the second iris feature obtained for the left eye and the right eye with the reference iris data of the database 405 stored in the memory 203 to determine whether to approve the user authentication (403).
  • FIG. 5 is a view showing positions of pupil and iris on a coordinate system according to some embodiments.
  • the processor 201 can generate the parameter using the circumscribed circle, i.e., the iris circle 53 and the pupil circle 51, in which the iris and the pupil are respectively approximated.
  • the pupil circle 51 surrounding the pupil has a radius R P and a center coordinate (X P, Y P ), and an iris circle 53 surrounding the iris has a radius R I and a center coordinate (X I, Y I ) .
  • the parameters of the pupil circle 51 and the iris circle 53 are used to calculate the second iris feature, that is, the iris circle radius (R_I), the ratio between the pupil circle radius and the iris circle radius (PIR), the X-axis normalized distance (NDX), and the Y-axis normalized distance (NDY).
  • the second iris feature may be compared with the reference feature included in the previously stored reference iris data and used to generate the second similarity parameter.
  • the second iris feature is calculated based on information about the size and position of the pupil and iris that was already determined during acquisition of the first iris feature and the iris code, and since the calculation formulas are computationally simple, almost no additional computational resources are required.
  • FIG. 6 is a flow diagram of a method for calculating a similarity score in accordance with some embodiments.
  • in step S601, the processor 201 may generate the first similarity parameter by comparing the iris code corresponding to the first iris feature with the reference code included in the previously stored reference iris data.
  • the details of the first similarity parameter generation method of step S601 will be described later with reference to the embodiment of FIG. 7.
  • in step S602, the processor 201 may generate the second similarity parameter by comparing the second iris feature with the reference feature included in the previously stored reference iris data.
  • the processor 201 can obtain the second iris feature based on the iris circle center coordinates (X_I, Y_I) and the iris circle radius (R_I) determined by analyzing the iris image, and the pupil circle center coordinates (X_P, Y_P) and the pupil circle radius (R_P) determined by analyzing the pupil image.
  • the second iris feature obtained by the processor 201 by analyzing the left-eye iris image and the left-eye pupil image may include, for example, the X-axis normalized distance (NDX_Left), the Y-axis normalized distance (NDY_Left), the circle radius ratio (PIR_Left), and the iris circle radius (R_I,Left).
  • the second iris feature obtained by the processor 201 by analyzing the right-eye iris image and the right-eye pupil image may include, for example, the X-axis normalized distance (NDX_Right), the Y-axis normalized distance (NDY_Right), the circle radius ratio (PIR_Right), and the iris circle radius (R_I,Right).
  • the X-axis normalized distance (NDX) represents the normalized distance along the X axis and is calculated according to the following equation, where X_I is the X-axis component of the iris circle center coordinates, X_P is the X-axis component of the pupil circle center coordinates, and R_I is the iris circle radius.
  • the Y-axis normalized distance (NDY) represents the normalized distance along the Y axis and is calculated according to the following equation, where Y_I is the Y-axis component of the iris circle center coordinates, Y_P is the Y-axis component of the pupil circle center coordinates, and R_I is the iris circle radius.
  • the circle radius ratio (PIR) represents the ratio between the pupil circle radius (R_P) and the iris circle radius (R_I) for each of the left and right eyes, and is calculated according to the following equation.
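The equations for NDX, NDY, and PIR are not reproduced in this text. A plausible form, consistent with the symbol definitions above (the pupil-center offset normalized by the iris circle radius, and the pupil-to-iris radius ratio), would be:

$$NDX = \frac{\left|X_I - X_P\right|}{R_I},\qquad NDY = \frac{\left|Y_I - Y_P\right|}{R_I},\qquad PIR = \frac{R_P}{R_I}$$

Whether the offsets are taken as signed or absolute values is not stated here and is an assumption of this sketch.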
  • the processor 201 may obtain the second similarity parameter based on the second iris feature, that is, the X-axis normalized distance (NDX_Left), the Y-axis normalized distance (NDY_Left), the circle radius ratio (PIR_Left), and the iris circle radius (R_I,Left) for the left eye, and the X-axis normalized distance (NDX_Right), the Y-axis normalized distance (NDY_Right), the circle radius ratio (PIR_Right), and the iris circle radius (R_I,Right) for the right eye.
  • the second similarity parameter may include, for example, the minimum center-distance difference parameter (ΔND_min), which is the minimum of the distances between the center of the pupil circle and the center of the iris circle.
  • the processor 201 may, for example, determine the minimum of the distances between the center of the pupil circle and the center of the iris circle to obtain the minimum center-distance difference parameter (ΔND_min).
  • the second similarity parameter may include, for example, the maximum center-distance difference parameter (ΔND_max), which is the maximum of the distances between the center of the pupil circle and the center of the iris circle.
  • the processor 201 may, for example, determine the maximum of the distances between the center of the pupil circle and the center of the iris circle to obtain the maximum center-distance difference parameter (ΔND_max).
  • the minimum center-distance difference parameter (ΔND_min) and the maximum center-distance difference parameter (ΔND_max) are determined as the minimum and the maximum, respectively, of the center-distance difference parameter (ΔND) calculated according to the following equation.
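The equation for ΔND is likewise not reproduced. One plausible per-eye form, assuming ΔND measures how far the probe's normalized pupil-iris offset (NDX, NDY) lies from the enrolled one, is:

$$\Delta ND = \sqrt{\left(NDX^{Probe}-NDX^{Enrolled}\right)^{2}+\left(NDY^{Probe}-NDY^{Enrolled}\right)^{2}},\qquad \Delta ND_{min} = \min\left(\Delta ND_{Left},\,\Delta ND_{Right}\right),\qquad \Delta ND_{max} = \max\left(\Delta ND_{Left},\,\Delta ND_{Right}\right)$$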
  • the second similarity parameter may include, for example, a radius difference average parameter (ΔR_I,avg), which is the average of the normalized iris circle radius difference of the left eye and the normalized iris circle radius difference of the right eye.
  • the processor 201 may obtain the radius difference average parameter (ΔR_I,avg), for example, by averaging the normalized iris circle radius difference of the left eye and the normalized iris circle radius difference of the right eye.
  • the radius difference average parameter (ΔR_I,avg) can be calculated, for example, according to the following equation.
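The equation is not reproduced; a plausible form, assuming the per-eye radius difference is normalized by the enrolled iris circle radius, is:

$$\Delta R_{I,avg} = \frac{1}{2}\left(\frac{\left|R_{I,Left}^{Probe}-R_{I,Left}^{Enrolled}\right|}{R_{I,Left}^{Enrolled}}+\frac{\left|R_{I,Right}^{Probe}-R_{I,Right}^{Enrolled}\right|}{R_{I,Right}^{Enrolled}}\right)$$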
  • the second similarity parameter may include, for example, a radius ratio difference average parameter (ΔPIR_avg), which is the average of the difference in the ratio of the pupil circle radius to the iris circle radius for the left eye and the corresponding ratio difference for the right eye.
  • the processor 201 may obtain the radius ratio difference average parameter (ΔPIR_avg), for example, by averaging the ratio difference of the pupil circle radius to the iris circle radius for the left eye and the corresponding ratio difference for the right eye.
  • the radius ratio difference average parameter (ΔPIR_avg) can be calculated, for example, according to the following equation.
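The equation is not reproduced; a plausible form, consistent with the description as an average of per-eye ratio differences, is:

$$\Delta PIR_{avg} = \frac{1}{2}\left(\left|PIR_{Left}^{Probe}-PIR_{Left}^{Enrolled}\right|+\left|PIR_{Right}^{Probe}-PIR_{Right}^{Enrolled}\right|\right)$$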
  • the processor 201 may calculate a similarity score using the first similarity parameter and the second similarity parameter.
  • FIG. 7 is a flow diagram of a method for computing a first similarity parameter according to some embodiments.
  • in step S701, the processor 201 may compare the iris code with the reference code to calculate the left-eye normalized Hamming distance (HD_Left) and the right-eye normalized Hamming distance (HD_Right).
  • the processor 201 may, for example, compare the reference code (IrisCode^Enrolled) included in the previously stored reference iris data with the binary iris code (IrisCode^Probe) obtained from the image of the user's left and right eyes to calculate the left-eye normalized Hamming distance (HD_Left) and the right-eye normalized Hamming distance (HD_Right).
  • the processor 201 may also determine the number of left-eye bits (BitCount_Left) used to calculate the left-eye normalized Hamming distance (HD_Left) and the number of right-eye bits (BitCount_Right) used to calculate the right-eye normalized Hamming distance (HD_Right).
  • the bits used to compute each normalized Hamming distance may be, for example, the bits of the unmasked iris portion of the registered image, which can provide a greater amount of information than an image captured while the user squints during the authentication process.
  • in step S703, the processor 201 may generate the first similarity parameter based on the left-eye normalized Hamming distance (HD_Left), the right-eye normalized Hamming distance (HD_Right), the left-eye bit count (BitCount_Left), and the right-eye bit count (BitCount_Right).
  • the first similarity parameter may include, for example, the Hamming distance difference parameter (ΔHDnorm), which is the normalized difference between the left-eye normalized Hamming distance (HD_Left) and the right-eye normalized Hamming distance (HD_Right).
  • the processor 201 may obtain the Hamming distance difference parameter (ΔHDnorm), for example, by normalizing the difference between the left-eye normalized Hamming distance (HD_Left) and the right-eye normalized Hamming distance (HD_Right).
  • the Hamming distance difference parameter (ΔHDnorm) can be calculated, for example, according to the following equation.
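The equation is not reproduced in this text. One plausible normalization of the difference between the two per-eye distances is by their sum; this exact form is an assumption:

$$\Delta HDnorm = \frac{\left|HD_{Left}-HD_{Right}\right|}{HD_{Left}+HD_{Right}}$$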
  • the first similarity parameter may include, for example, the Hamming distance average parameter (HD_avg), which is the average of the left-eye normalized Hamming distance (HD_Left) and the right-eye normalized Hamming distance (HD_Right).
  • the processor 201 may obtain the Hamming distance average parameter (HD_avg), for example, by averaging the left-eye normalized Hamming distance (HD_Left) and the right-eye normalized Hamming distance (HD_Right).
  • the Hamming distance average parameter (HD_avg) can be calculated, for example, according to the following equation.
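The equation is not reproduced, but given the description as a simple average it is presumably:

$$HD_{avg} = \frac{HD_{Left}+HD_{Right}}{2}$$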
  • the first similarity parameter may also include the minimum total bit count parameter (BitCount_min), which is the minimum total number of bits used in the calculation of the left-eye normalized Hamming distance (HD_Left) and the right-eye normalized Hamming distance (HD_Right).
  • the processor 201 may obtain the minimum total bit count parameter (BitCount_min), for example, by determining the minimum total number of bits used in the calculation of the left-eye normalized Hamming distance (HD_Left) and the right-eye normalized Hamming distance (HD_Right).
  • the minimum total bit count parameter (BitCount_min) can be calculated, for example, according to the following equation.
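The equation is not reproduced; given the description, a plausible form (possibly normalized by MAX_BITS in the same way as BitCount_max below) is:

$$BitCount_{min} = \min\left(BitCount_{Left},\,BitCount_{Right}\right)$$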
  • the first similarity parameter may also include the maximum total bit count parameter (BitCount_max), which is the maximum total number of bits used in the calculation of the left-eye normalized Hamming distance (HD_Left) and the right-eye normalized Hamming distance (HD_Right).
  • the processor 201 may obtain the maximum total bit count parameter (BitCount_max), for example, by determining the maximum total number of bits used in the calculation of the left-eye normalized Hamming distance (HD_Left) and the right-eye normalized Hamming distance (HD_Right).
  • the maximum total bit count parameter (BitCount_max) can be calculated, for example, according to the following equation.
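The equation is not reproduced; a plausible form, assuming the normalization implied by the MAX_BITS definition that follows, is:

$$BitCount_{max} = \frac{\max\left(BitCount_{Left},\,BitCount_{Right}\right)}{MAX\_BITS}$$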
  • MAX_BITS is the maximum number of bits usable in the representation of the binary iris code and can be set in advance based on experimental data.
  • FIG. 8 is a diagram illustrating a method of calculating a similarity score using a first similarity parameter and a second similarity parameter.
  • the processor 201 can obtain the left-eye normalized Hamming distance (HD_Left) and the left-eye bit count (BitCount_Left) based on the left-eye reference code (IrisCode_Left^Enrolled) included in the previously stored reference iris data and the left-eye iris code (IrisCode_Left^Probe) obtained from the image of the user's left and right eyes (801).
  • similarly, the processor 201 can obtain the right-eye normalized Hamming distance (HD_Right) and the right-eye bit count (BitCount_Right) based on the right-eye reference code (IrisCode_Right^Enrolled) included in the previously stored reference iris data and the right-eye iris code (IrisCode_Right^Probe) obtained from the image of the user's left and right eyes (802).
  • the processor 201 generates the first similarity parameter based on the obtained left-eye normalized hamming distance HD Left , the number of left eye bits (BitCount Left ), the right eye normalized hamming distance (HD Right ), and the number of right eye bits (BitCount Right ) (821).
  • the first similarity parameter may include, for example, the Hamming distance difference parameter (ΔHDnorm), which is the normalized difference between the left-eye normalized Hamming distance (HD_Left) and the right-eye normalized Hamming distance (HD_Right); the Hamming distance average parameter (HD_avg), which is the average of HD_Left and HD_Right; the minimum total bit count parameter (BitCount_min), which is the minimum total number of bits used in the calculation of HD_Left and HD_Right; and the maximum total bit count parameter (BitCount_max), which is the maximum total number of bits used in the calculation of HD_Left and HD_Right.
  • the processor 201 may generate the second similarity parameter by comparing the reference feature included in the reference iris data stored in advance with the second iris feature obtained from the images of the left and right eyes of the user.
  • to generate the second similarity parameter, the processor 201 may use the left-eye reference feature, that is, the reference X-axis normalized distance (NDX_Left^Enrolled), the reference Y-axis normalized distance (NDY_Left^Enrolled), the reference circle radius ratio (PIR_Left^Enrolled), and the reference iris circle radius (R_I,Left^Enrolled), and the right-eye reference feature, that is, NDX_Right^Enrolled, NDY_Right^Enrolled, PIR_Right^Enrolled, and R_I,Right^Enrolled (822).
  • the processor 201 may also use the left-eye second iris feature obtained from the image of the user's left and right eyes, that is, the X-axis normalized distance (NDX_Left^Probe), the Y-axis normalized distance (NDY_Left^Probe), the circle radius ratio (PIR_Left^Probe), and the iris circle radius (R_I,Left^Probe), and the right-eye second iris feature, that is, NDX_Right^Probe, NDY_Right^Probe, PIR_Right^Probe, and R_I,Right^Probe (822).
  • the second similarity parameter may include, for example, the minimum center-distance difference parameter (ΔND_min), which is the minimum of the distances between the center of the pupil circle and the center of the iris circle; the maximum center-distance difference parameter (ΔND_max), which is the maximum of those distances; the radius difference average parameter (ΔR_I,avg), which is the average of the normalized iris circle radius difference of the left eye and that of the right eye; and the radius ratio difference average parameter (ΔPIR_avg), which is the average of the difference in the ratio of the pupil circle radius to the iris circle radius for the left eye and the corresponding ratio difference for the right eye.
  • the processor 201 may calculate a similarity score based on the generated first similarity parameter and the second similarity parameter.
  • the similarity score may be calculated, for example, according to the following formula using both the first similarity parameter and the second similarity parameter.
  • in Equation (11), S denotes the similarity score, ic denotes a bias coefficient, and w_i (w_1 to w_8) denote weighting coefficients.
  • the bias coefficient ic and the weighting coefficients w_i may be obtained empirically using a logistic regression model on a training data set comprising, for example, a plurality of iris images of users captured under different capture conditions, and can be predetermined based on such experimental data.
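Equation (11) itself is not reproduced here. Given the stated bias coefficient ic, the eight weighting coefficients w_1 to w_8, and the logistic-regression training, a plausible form is a weighted combination of the eight similarity parameters (ΔHDnorm, HD_avg, BitCount_min, BitCount_max, ΔND_min, ΔND_max, ΔR_I,avg, ΔPIR_avg), possibly passed through a logistic function; the exact form is an assumption:

$$S = ic + \sum_{i=1}^{8} w_i\,p_i$$

where $p_1,\dots,p_8$ denote the eight similarity parameters listed above.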
  • FIG. 9 is a flow diagram illustrating a method for determining whether a user is authorized by comparing a similarity score and a threshold score in accordance with some embodiments.
  • in step S901, the processor 201 may obtain the first similarity parameter and the second similarity parameter, and in step S902, the processor 201 may calculate a similarity score using the obtained first similarity parameter and second similarity parameter.
  • the processor 201 may compare the calculated similarity score with a predetermined threshold score.
  • the threshold score can be preset, for example, based on experimental data for a previously performed test.
  • if the similarity score satisfies the threshold score (for example, is equal to or greater than it), the processor 201 can approve the user authentication in step S904.
  • in this case, the processor 201 determines that the iris included in the previously stored reference iris data and the iris obtained from the image of the left eye and the right eye during the user authentication process belong to the same person, and approves the user authentication.
  • otherwise, the processor 201 may reject the user authentication in step S905.
  • in this case, the processor 201 determines that the iris included in the previously stored reference iris data and the iris obtained from the image of the left eye and the right eye during the user authentication process do not belong to the same person, and rejects the user authentication.
  • the user authentication approval condition of the authentication method using the consensus rule is satisfied when the left-eye normalized Hamming distance (HD_Left) is less than a preset threshold value and the right-eye normalized Hamming distance (HD_Right) is less than the preset threshold value.
  • the user authentication approval condition of the authentication method using the minimum HD value selection rule is satisfied when the minimum of the left-eye normalized Hamming distance (HD_Left) and the right-eye normalized Hamming distance (HD_Right), taken as the similarity score, is less than the preset threshold value.
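For clarity, the two baseline decision rules compared against in this disclosure can be written as the following sketch; the function names and the strictness of the threshold comparison are illustrative assumptions.

```python
def consensus_rule(hd_left, hd_right, threshold):
    # Baseline "consensus" rule: approve only if BOTH per-eye normalized
    # Hamming distances are below the preset threshold.
    return hd_left < threshold and hd_right < threshold

def min_hd_rule(hd_left, hd_right, threshold):
    # Baseline "minimum HD value selection" rule: take the smaller per-eye
    # distance as the similarity score and compare it with the threshold.
    return min(hd_left, hd_right) < threshold
```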
  • to compare accuracy, the percentage of authentication errors, that is, the False Non-Match Rate (FNMR), at which the registered user fails to be recognized as himself or herself, was calculated at a False Match Rate (FMR), at which another person is incorrectly recognized as the user, of no more than 10^-7 (FNMR at FMR ≤ 10^-7, %).
  • the percentage of authentication errors recorded when using the authentication method according to the consensus rule appears to be 3.59% higher compared to the results obtained when implementing the authentication method of this disclosure.
  • the percentage of recorded authentication errors when using the authentication method according to the minimum HD value selection rule is 3.13% higher than the result obtained when implementing the authentication method of this disclosure.
  • the authentication method of the present disclosure therefore provides a more accurate result than the authentication method using the consensus rule and the authentication method using the minimum HD value selection rule, without a significant increase in the required computational resources.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Security & Cryptography (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • Software Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Collating Specific Patterns (AREA)

Abstract

The invention relates to a device and method for user authentication based on iris recognition. A user authentication method based on iris recognition according to the present invention may comprise the steps of: acquiring images of a user's left eye and right eye; extracting a pupil image and an iris image from the images; analyzing the iris image to acquire a first iris feature; analyzing the iris image and the pupil image to acquire a second iris feature; acquiring a similarity score on the basis of the first iris feature, the second iris feature, and previously stored reference iris data; and determining whether to approve user authentication on the basis of the similarity score.
PCT/KR2018/014515 2017-11-24 2018-11-23 Device and method for user authentication based on iris recognition WO2019103520A1 (fr)

Priority Applications (3)

Application Number Priority Date Filing Date Title
EP18880306.8A EP3680794A4 (fr) 2017-11-24 2018-11-23 Device and method for user authentication based on iris recognition
CN201880073875.0A CN111344703B (zh) 2017-11-24 2018-11-23 基于虹膜识别的用户认证设备和方法
US16/765,298 US11449590B2 (en) 2017-11-24 2018-11-23 Device and method for user authentication on basis of iris recognition

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
RU2017141021 2017-11-24
RU2017141021A RU2670798C9 (ru) 2017-11-24 2017-11-24 Method for user authentication based on the iris of the eyes and corresponding device
KR10-2018-0138307 2018-11-12
KR1020180138307A KR102554391B1 (ko) 2017-11-24 2018-11-12 Iris recognition-based user authentication apparatus and method

Publications (1)

Publication Number Publication Date
WO2019103520A1 true WO2019103520A1 (fr) 2019-05-31

Family

ID=66631065

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2018/014515 WO2019103520A1 (fr) 2017-11-24 2018-11-23 Device and method for user authentication based on iris recognition

Country Status (1)

Country Link
WO (1) WO2019103520A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20080017763A (ko) * 2006-08-22 2008-02-27 Yonsei University Industry-Academic Cooperation Foundation Iris recognition method using score level fusion
KR20090074185A (ko) * 2006-09-29 2009-07-06 Oki Electric Industry Co., Ltd. Personal authentication system and personal authentication method
KR20130011913A (ko) * 2011-07-20 2013-01-30 Korea Basic Science Institute Identity recognition apparatus and method
US20160012275A1 (en) * 2012-12-10 2016-01-14 Sri International Iris biometric matching system
KR101763761B1 (ko) * 2010-06-04 2017-08-02 Electronics and Telecommunications Research Institute Iris shape recognition method and iris shape recognition apparatus

Similar Documents

Publication Publication Date Title
  • WO2021080103A1 - Method for learning and testing a user learning network used to recognize obfuscated data created by concealing original data to protect personal information, and learning device and testing device using the same
  • WO2016129917A1 - User terminal and providing method therefor
  • WO2021080102A1 - Method for training and testing an adaptation network corresponding to an obfuscation network capable of processing data to be concealed for privacy, and training device and testing device using the same
  • WO2015102361A1 - Apparatus and method for acquiring images for iris recognition using facial feature distance
  • WO2014017697A1 - Method and device for extracting finger vein patterns using a guided Gabor filter
  • WO2018164363A1 - Contactless multi-body-part recognition method and multi-body-part recognition device using multiple biometric data
  • WO2015160207A1 - System and method for detecting a region of interest
  • WO2013085193A1 - Apparatus and method for enhancing user recognition
  • WO2015115681A1 - Method and apparatus for expression recognition using an expression-gesture dictionary
  • KR20190060671A - Iris recognition-based user authentication apparatus and method
  • WO2017061758A1 - Segment-block-based handwritten signature authentication system and method
  • WO2018164364A1 - Contactless multi-body-part recognition method and multi-body-part recognition device using multiple biometric data
  • EP3714419A1 - Electronic device and authentication method thereof
  • WO2017039287A1 - Segment-based handwritten signature authentication system and method
  • WO2018008881A1 - Terminal device and service server, method and program for providing a diagnostic analysis service executed thereby, and computer-readable recording medium having the program recorded thereon
  • WO2017099427A1 - Converged biometric authentication method based on finger joint and finger vein, and apparatus therefor
  • WO2017179846A1 - Polyhedral three-dimensional imaging device for simultaneously authenticating fingerprint and finger veins
  • WO2018093105A1 - Input/output integration module for simultaneously linking biometric information algorithms
  • WO2017183830A1 - Method and apparatus for enhancing security of iris recognition through distributed registration and matching of iris templates
  • WO2022240030A1 - Pet life management system and method therefor
  • WO2013165048A1 - Image search system and image analysis server
  • WO2018070576A1 - User recognition method using hybrid biometric information and device therefor
  • WO2015137666A1 - Object recognition apparatus and control method therefor
  • WO2022240029A1 - Pet identification system and method therefor
  • EP3746923A1 - Electronic device for performing biometric authentication and method of operating the same

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18880306

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE