CN116152139A - Pupil detection and iris recognition method, device, readable storage medium and equipment - Google Patents

Pupil detection and iris recognition method, device, readable storage medium and equipment

Info

Publication number
CN116152139A
Authority
CN
China
Prior art keywords
pupil
image
pixels
eye
iris
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111372916.6A
Other languages
Chinese (zh)
Inventor
刘洋
周军
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Eyes Intelligent Technology Co ltd
Beijing Eyecool Technology Co Ltd
Original Assignee
Beijing Eyes Intelligent Technology Co ltd
Beijing Eyecool Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Eyes Intelligent Technology Co ltd, Beijing Eyecool Technology Co Ltd filed Critical Beijing Eyes Intelligent Technology Co ltd
Priority to CN202111372916.6A priority Critical patent/CN116152139A/en
Priority to PCT/CN2022/128304 priority patent/WO2023088071A1/en
Publication of CN116152139A publication Critical patent/CN116152139A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/0002Inspection of images, e.g. flaw detection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/18Eye characteristics, e.g. of the iris
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30004Biomedical image processing
    • G06T2207/30041Eye; Retina; Ophthalmic

Abstract

The invention discloses a method, a device, a readable storage medium and equipment for beauty pupil (cosmetic contact lens) detection and iris recognition, and belongs to the field of iris recognition. The invention exploits the characteristics that the beauty-pupil region has complex texture and therefore relatively large gradients while the outer circle of a normal iris is smooth, and that the pixel values of the beauty-pupil region are smaller than those of the iris region as a whole, to perform beauty pupil detection. In the method, an eye image is first acquired; gray stretching and gradient transformation are applied to the eye image separately, the results are added as matrices and binarized, and whether a beauty pupil is worn is judged from the proportion of 1-valued pixels in the part of the binarized image outside the pupil area. The invention can accurately identify dark and light beauty pupils of various patterns without relying on image samples, and has strong robustness.

Description

Pupil detection and iris recognition method, device, readable storage medium and equipment
Technical Field
The invention relates to the field of iris recognition, in particular to a method and a device for pupil detection and iris recognition, a readable storage medium and equipment.
Background
The iris is an annular structure between the pupil and the sclera. As shown in Fig. 1, the region between the outer iris boundary and the inner iris boundary is the iris, and part of the iris information is lost because of occlusion by the eyelids and eyelashes. The iris has a diameter of about 12 mm and a thickness of about 0.5 mm. Its fine, interlaced features, resembling filaments, fringes and the like, are unique to each person, and these texture features are what iris recognition typically uses.
Iris recognition mainly comprises iris image acquisition, iris image quality evaluation, iris image preprocessing, iris image normalization, iris feature extraction and iris feature comparison.
In essence, iris recognition recognizes the iris region of the eye, but in practical applications users (especially female users) often wear cosmetic contact lenses, referred to below as beauty pupils, which cause certain problems for iris recognition and reduce its accuracy. Beauty pupils generally fall into three types: transparent (similar to an ordinary contact lens, with essentially no effect on iris recognition), light-colored and dark-colored. As shown in Figs. 2 and 3, Fig. 2 shows a dark-colored beauty pupil and Fig. 3 a light-colored one.
Prior-art beauty pupil detection methods mainly fall into the following two categories:
1) Gray-level judgment based on the characteristic that the beauty-pupil area is darker: the gray values of pixels are judged directly and a threshold is set to delimit the beauty-pupil area.
This method has some effect on dark-colored beauty pupils, but essentially cannot identify light-colored ones.
2) Deep-learning-based methods: training sample data are collected and divided into beauty-pupil and non-beauty-pupil samples, a deep network model is selected and trained, and a two-class classification of wearing versus not wearing a beauty pupil is performed.
Deep learning depends heavily on training samples; in particular, the patterns of all beauty pupils cannot realistically be collected exhaustively, so the trained deep network model cannot classify accurately.
Disclosure of Invention
In order to solve the technical problems that existing beauty pupil detection methods cannot identify light-colored beauty pupils and rely on training samples, the invention provides a pupil detection and iris recognition method, device, readable storage medium and equipment, which can accurately identify dark and light beauty pupils without relying on image samples and have strong robustness.
The technical scheme provided by the invention is as follows:
in a first aspect, the present invention provides a pupil detection method, the method including:
acquiring an eye image;
calculating the average value of pixels of the eye image, and carrying out gray stretching on pixels smaller than the average value of the pixels in the eye image to [0,1], wherein the pixels larger than the average value of the pixels are set as 0, so as to obtain a first image matrix;
performing gradient transformation on the eye image, and normalizing the gray scale of each pixel on the image obtained by the gradient transformation to be [0,1] to obtain a second image matrix;
performing matrix addition operation on the first image matrix and the second image matrix, and performing binarization processing on each pixel on the image obtained by the matrix addition operation according to a set binarization threshold value to obtain a binarized image;
and counting the proportion of the pixels with the value of 1 in the part of the binarized image outside the pupil area, judging that a beauty pupil is not worn if the proportion is larger than a set beauty pupil threshold value, and otherwise judging that a beauty pupil is worn.
Further, the method further comprises: preprocessing the acquired eye image, wherein the preprocessing comprises the following steps:
performing pupil primary positioning on the eye image to obtain pupil positions, pupil radiuses, upper eyelid boundaries and lower eyelid boundaries;
respectively selecting the vertex of the upper eyelid boundary, the bottom point of the lower eyelid boundary and two intersection points of the upper eyelid boundary and the lower eyelid boundary as intercepting boundaries to intercept the eye image to obtain an eye region image;
and carrying out light spot detection on the eye area image, and if the light spot is detected, filling the detected light spot in the eye area image by using bicubic interpolation.
Further, counting the proportion of the pixels with the value of 1 in the part of the binarized image outside the pupil area includes:
selecting a middle region on the binarized image, and counting the total number of pixels N_T of the middle region;
counting the number N_1 of the pixels with the value of 1 in the part of the middle region outside the pupil area;
calculating the proportion by the formula N_1 / N_T.
In a second aspect, the present invention provides a pupil detection apparatus, the apparatus comprising:
the image acquisition module is used for acquiring an eye image;
the first image matrix calculation module is used for calculating the pixel average value of the eye image, and carrying out gray stretching on pixels smaller than the pixel average value in the eye image to [0,1], wherein the pixels larger than the pixel average value are set to be 0, so as to obtain a first image matrix;
the second image matrix calculation module is used for carrying out gradient transformation on the eye image, and normalizing the gray scale of each pixel on the image obtained by the gradient transformation to [0,1], so as to obtain a second image matrix;
the binarization image calculation module is used for carrying out matrix addition operation on the first image matrix and the second image matrix, and carrying out binarization processing on each pixel on the image obtained by matrix addition operation according to a set binarization threshold value to obtain a binarization image;
and the beauty pupil judgment module is used for counting the proportion of the pixels with the value of 1 in the part of the binarized image outside the pupil area, judging that a beauty pupil is not worn if the proportion is larger than a set beauty pupil threshold, and judging that a beauty pupil is worn if the proportion is not larger than the set beauty pupil threshold.
Further, the device further comprises a preprocessing module for preprocessing the acquired eye image, wherein the preprocessing module comprises:
the pupil initial positioning unit is used for performing pupil initial positioning on the eye image to obtain pupil positions, pupil radiuses, upper eyelid boundaries and lower eyelid boundaries;
the image intercepting unit is used for respectively selecting the vertex of the upper eyelid boundary, the bottom point of the lower eyelid boundary and two intersection points of the upper eyelid boundary and the lower eyelid boundary as intercepting boundaries to intercept the eye image to obtain an eye area image;
and the light spot filling unit is used for carrying out light spot detection on the eye area image, and if the light spot is detected, filling the light spot detected in the eye area image by using bicubic interpolation.
Further, in the beauty pupil judgment module, counting the proportion of the pixels with the value of 1 in the part of the binarized image outside the pupil area includes:
a middle region selecting unit, for selecting a middle region on the binarized image and counting the total number of pixels N_T of the middle region;
a pixel statistics unit, for counting the number N_1 of the pixels with the value of 1 in the part of the middle region outside the pupil area;
a proportion calculating unit, for calculating the proportion by the formula N_1 / N_T.
In a third aspect, the present invention provides a computer readable storage medium for pupil detection, comprising a memory for storing processor executable instructions which, when executed by the processor, implement steps comprising the pupil detection method of the first aspect.
In a fourth aspect, the present invention provides an apparatus for pupil detection, comprising at least one processor and a memory storing computer executable instructions, which when executed by the processor implement the steps of the pupil detection method of the first aspect.
In a fifth aspect, the present invention provides an iris recognition method, the method comprising:
performing pupil detection by the pupil detection method in the first aspect;
iris detection is carried out on the eye images, and an iris area is obtained;
and counting the ratio of the pixels with the value of 1 in the iris region on the binarized image, and if the ratio is smaller than a set recognition threshold, performing iris recognition by using the eye image.
Further, in iris recognition, the region where the pixel having a value of 1 on the binarized image is located is treated as a noise region.
In a sixth aspect, the present invention provides an iris recognition apparatus, the apparatus comprising:
the pupil detection module is used for performing pupil detection through the pupil detection device in the second aspect;
the iris detection module is used for carrying out iris detection on the eye images to obtain an iris region;
and the identification judging module is used for counting the ratio of the pixels with the value of 1 in the iris region on the binarized image, and if the ratio is smaller than a set identification threshold, iris identification is carried out by using the eye image.
Further, in iris recognition, the region where the pixel having a value of 1 on the binarized image is located is treated as a noise region.
In a seventh aspect, the present invention provides a computer readable storage medium for iris recognition, comprising a memory for storing processor executable instructions which when executed by the processor implement steps comprising the iris recognition method of the fifth aspect.
In an eighth aspect, the present invention provides an apparatus for iris recognition comprising at least one processor and a memory storing computer executable instructions which when executed by the processor implement the steps of the iris recognition method of the fifth aspect.
The invention has the following beneficial effects:
the invention utilizes the characteristics that the texture of the pupil beautifying area is complex, the relative gradient is large, and the excircle of the normal iris is smooth; and the characteristic that the pixel value of the pupil beautifying region is smaller than that of the whole iris region is utilized to perform pupil beautifying detection. Firstly, an eye image is acquired, gray stretching and gradient transformation are respectively carried out on the eye image, matrix addition operation and binarization are carried out, and whether the pupil is worn or not is judged according to the proportion of 1-valued pixels in the part, except the pupil area, of the obtained binarized image.
The invention can accurately identify dark and light pupils, does not need to depend on image samples, has stronger robustness, and can identify pupils of various types.
Drawings
FIG. 1 is a schematic illustration of various parts of an eye, including the iris;
FIG. 2 is a schematic view of an eye image with a dark-colored beauty pupil;
FIG. 3 is a schematic view of an eye image with a light-colored beauty pupil;
FIG. 4 is a flow chart of the beauty pupil detection method of the present invention;
FIG. 5 is a schematic diagram of the pupil initial positioning result;
FIG. 6 is a schematic illustration of an image taken of an eye region;
FIG. 7 is a schematic illustration of an eye image after spot filling;
FIG. 8 is a schematic diagram of a binarized image;
FIG. 9 is a schematic illustration of a binarized image selecting a middle region;
FIG. 10 is a schematic diagram of a pupil detection apparatus according to the present invention;
FIG. 11 is a flow chart of an iris recognition method of the present invention;
FIG. 12 is a diagram of the iris inner and outer boundary localization results on an eye image;
FIG. 13 is a schematic diagram of a noise template;
fig. 14 is a schematic view of an iris recognition device according to the present invention.
Detailed Description
In order to make the technical problems to be solved, the technical solutions and the advantages of the present invention clearer, the technical solutions of the present invention will be described clearly and completely below with reference to the accompanying drawings and specific embodiments. It is apparent that the described embodiments are only some, but not all, of the embodiments of the invention. The components of the embodiments of the present invention generally described and illustrated in the figures herein may be arranged and designed in a wide variety of different configurations. Thus, the following detailed description of the embodiments of the invention, as presented in the figures, is not intended to limit the scope of the invention as claimed, but is merely representative of selected embodiments of the invention. All other embodiments obtained by a person skilled in the art without inventive effort shall fall within the scope of the present invention.
Example 1:
the embodiment of the invention provides a pupil beautifying detection method, as shown in fig. 4, which comprises the following steps:
s100: an eye image is acquired.
After the eye image is acquired, the acquired eye image may be further preprocessed as needed, where in one example, the preprocessing method includes:
s101: and (3) performing pupil primary positioning on the eye image to obtain pupil positions, pupil radiuses, upper eyelid boundaries and lower eyelid boundaries.
In this step, pupil initial positioning is first performed using an edge detection operator and edge-gradient binarization, preliminarily determining the pupil position, the pupil radius, and the upper and lower eyelid boundaries; the upper and lower eyelid boundaries are modeled as parabolas, as shown in Fig. 5.
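As an illustrative, non-limiting sketch of this step (in Python with OpenCV and NumPy), the Hough circle transform, which is itself driven by edge gradients, is used below as a stand-in for the edge detection and edge-gradient binarization named above, and a parabola is fitted to eyelid edge points; all parameter values and helper names are assumptions rather than part of the invention.

```python
import cv2
import numpy as np

def locate_pupil(eye_gray):
    """Rough pupil localization: the strongest gradient-supported circle is
    taken as the pupil candidate (stand-in for edge-gradient binarization)."""
    blurred = cv2.GaussianBlur(eye_gray, (7, 7), 0)
    circles = cv2.HoughCircles(blurred, cv2.HOUGH_GRADIENT, dp=1,
                               minDist=eye_gray.shape[0],
                               param1=100, param2=20, minRadius=15, maxRadius=80)
    if circles is None:
        return None
    cx, cy, r = circles[0, 0]          # (x, y, radius) of the pupil candidate
    return int(cx), int(cy), int(r)

def fit_eyelid_parabola(edge_points):
    """Fit a parabola y = a*x**2 + b*x + c to eyelid edge points
    (an N x 2 array of (x, y) coordinates); returns (a, b, c)."""
    x, y = edge_points[:, 0], edge_points[:, 1]
    return np.polyfit(x, y, 2)
```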
S102: and respectively selecting the vertex of the upper eyelid boundary, the bottom point of the lower eyelid boundary and two intersection points of the upper eyelid boundary and the lower eyelid boundary as intercepting boundaries to intercept the eye image to obtain an eye region image.
The step is used for carrying out image interception according to the pupil initial positioning result of the step S101, reducing the size of the image and reducing noise.
The specific process can be as follows: the vertex of the upper-eyelid parabola, the bottom point of the lower-eyelid parabola, and the intersection points of the two parabolas on the left and right sides are taken, respectively, as the upper, lower, left and right boundaries of the cropped image; the cropped image is shown in Fig. 6.
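A minimal sketch of this cropping, assuming the two eyelid boundaries are given as parabola coefficients y = a*x**2 + b*x + c in image coordinates (y increasing downwards), could look as follows; it is illustrative only.

```python
import numpy as np

def crop_eye_region(img, upper_coef, lower_coef):
    """Crop the eye image to the region bounded by the upper-eyelid vertex (top),
    the lower-eyelid bottom point (bottom) and the two eyelid intersection
    points (left/right).  Coefficients are (a, b, c) of y = a*x**2 + b*x + c."""
    au, bu, cu = upper_coef
    al, bl, cl = lower_coef
    top = np.polyval(upper_coef, -bu / (2.0 * au))      # vertex of the upper eyelid
    bottom = np.polyval(lower_coef, -bl / (2.0 * al))   # bottom point of the lower eyelid
    xs = np.roots([au - al, bu - bl, cu - cl])          # abscissae of the two intersections
    left, right = sorted(np.real(xs))
    r0, r1 = int(max(top, 0)), int(min(bottom, img.shape[0]))
    c0, c1 = int(max(left, 0)), int(min(right, img.shape[1]))
    return img[r0:r1, c0:c1]
```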
S103: and carrying out light spot detection on the eye area image, and if the light spot is detected, filling the detected light spot in the eye area image by using bicubic interpolation.
When an eye image is acquired, light spots readily appear in the pupil because of factors such as supplementary (fill) lighting. The light spots are therefore located by light spot detection and filled using bicubic interpolation; the effect after spot filling is shown in Fig. 7.
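One possible, non-limiting sketch of this step is given below; treating near-saturated pixels as light spots and using SciPy's cubic griddata interpolation as the bicubic fill are assumptions, since only spot detection followed by bicubic filling is specified above.

```python
import numpy as np
from scipy.interpolate import griddata

def fill_light_spots(eye_gray, spot_thresh=250):
    """Detect near-saturated light spots and fill them by cubic interpolation
    from the surrounding non-spot pixels (applied to the cropped eye region)."""
    img = eye_gray.astype(np.float32)
    spot = img >= spot_thresh
    if not spot.any():
        return eye_gray
    ky, kx = np.nonzero(~spot)                 # known (non-spot) pixel coordinates
    sy, sx = np.nonzero(spot)                  # spot pixel coordinates to be filled
    values = griddata((ky, kx), img[~spot], (sy, sx), method='cubic')
    filled = img.copy()
    filled[spot] = np.nan_to_num(values, nan=float(img[~spot].mean()))
    return filled.astype(eye_gray.dtype)
```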
S200: and calculating the average value of pixels of the eye image, and carrying out gray stretching on pixels smaller than the average value of the pixels in the eye image to [0,1], wherein the pixels larger than the average value of the pixels are set as 0, so as to obtain a first image matrix.
One specific implementation of this step is as follows: all pixels in the eye image are summed and averaged to obtain the pixel average value; the pixels smaller than the average value are gray-stretched to [0,1], and the pixels larger than the average value are set to 0, so as to obtain the first image matrix Img1.
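A sketch of one way to form Img1 is given below. The direction of the stretch is not fixed above; mapping the sub-mean range [min, mean] linearly onto [0, 1] is an assumption of this sketch.

```python
import numpy as np

def first_image_matrix(eye_gray):
    """Img1: pixels at or below the image mean are linearly stretched onto [0, 1];
    pixels above the mean are set to 0."""
    img = eye_gray.astype(np.float32)
    mean = float(img.mean())
    below = img <= mean
    lo = float(img[below].min())
    img1 = np.zeros_like(img)
    img1[below] = (img[below] - lo) / max(mean - lo, 1e-6)
    return img1
```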
S300: and carrying out gradient transformation on the eye image, and normalizing the gray scale of each pixel on the image obtained by the gradient transformation to be [0,1] to obtain a second image matrix.
In this step, the gradient transformation of the eye image can be performed with the Canny edge detection operator, and the gray value of each pixel of the image obtained by the gradient transformation is then normalized to [0,1], so as to obtain the second image matrix Img2.
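A sketch of Img2 is given below. Because the output of the full Canny operator is already binary, the sketch uses the Sobel gradient magnitude on which Canny is based and normalizes it to [0, 1]; this substitution is an assumption.

```python
import cv2

def second_image_matrix(eye_gray):
    """Img2: gradient magnitude of the eye image normalized to [0, 1]."""
    gx = cv2.Sobel(eye_gray, cv2.CV_32F, 1, 0, ksize=3)
    gy = cv2.Sobel(eye_gray, cv2.CV_32F, 0, 1, ksize=3)
    mag = cv2.magnitude(gx, gy)
    return mag / max(float(mag.max()), 1e-6)
```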
S400: and performing matrix addition operation on the first image matrix and the second image matrix, and performing binarization processing on each pixel on the image obtained by matrix addition operation according to the set binarization threshold value to obtain a binarized image.
In this step, the first image matrix Img1 and the second image matrix Img2 are added to obtain the result matrix Image = Img1 + Img2. Each pixel of Image is then compared with a set binarization threshold T and binarized; for example, T may be set to 0.6, where the value 0.6 is merely illustrative and is not intended to limit the present invention. The specific binarization method is as follows:
Binary(i, j) = 1 if Image(i, j) ≥ T; Binary(i, j) = 0 otherwise.
An example of the resulting binarized image is shown in Fig. 8.
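The matrix addition and binarization can be sketched as follows; the 0.6 threshold is only the example value given above.

```python
import numpy as np

def binarize_combined(img1, img2, thresh=0.6):
    """Image = Img1 + Img2; pixels at or above the threshold become 1, the rest 0."""
    combined = img1 + img2
    return (combined >= thresh).astype(np.uint8)
```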
S500: counting the proportion of the pixels with the value of 1 in the part of the binarized image outside the pupil area, judging that a beauty pupil is not worn if the proportion is larger than a set beauty pupil threshold value, and otherwise judging that a beauty pupil is worn.
On a normal image without a beauty pupil, most of the pixels outside the pupil area have a value of 1, whereas on an image with a beauty pupil a large proportion of the pixels outside the pupil area have a value of 0. Whether a beauty pupil is worn can therefore be judged from the proportion of 1-valued pixels in the part of the binarized image outside the pupil area.
The foregoing S101 has already obtained the pupil position and radius, i.e. the size and coordinates of the pupil area are known, so the distribution of 1-valued pixels outside the pupil area in Fig. 8 can be counted, and the region formed by the 1-valued pixels outside the pupil area is taken as the beauty-pupil region. The proportion of this region relative to the binarized image is then calculated; it represents the proportion occupied by the beauty pupil outside the pupil area, and it is compared with the set beauty pupil threshold to judge whether a beauty pupil is worn.
Here the beauty pupil threshold is chosen as 0.8, taking some edge-noise effects into account; the value 0.8 is for illustration only and is not intended to limit the invention. The proportion of 1-valued pixels in the non-pupil part of the binarized image is counted: if it is greater than 0.8, the beauty pupil is judged not to be worn, and if it is less than or equal to 0.8, the beauty pupil is judged to be worn.
The invention exploits the characteristics that the beauty-pupil region has complex texture and therefore relatively large gradients while the outer circle of a normal iris is smooth, and that the pixel values of the beauty-pupil region are smaller than those of the iris region as a whole, to perform beauty pupil detection. An eye image is first acquired; gray stretching and gradient transformation are applied to the eye image separately, the results are added as matrices and binarized, and whether a beauty pupil is worn is judged from the proportion of 1-valued pixels in the part of the binarized image outside the pupil area.
The invention can accurately identify dark and light beauty pupils of various patterns without relying on image samples, and has strong robustness.
When counting the proportion of 1-valued pixels in the part of the binarized image outside the pupil area, the invention uses the following method to eliminate the influence of noise from the upper and lower eyelids and the eyelashes:
S501: selecting a middle region on the binarized image, and counting the total number of pixels N_T of the middle region.
Illustratively, the middle region is the image obtained by cutting the binarized image 1/3 inward from the upper edge, 1/3 inward from the lower edge, 1/3 inward from the left boundary and 1/3 inward from the right boundary, as in the rectangular portion of Fig. 9. The value 1/3 is exemplary only and is not intended to limit the present invention.
S502: counting the number N_1 of 1-valued pixels in the part of the middle region outside the pupil area.
The region composed of the 1-valued pixels outside the pupil area on the middle region represents the beauty-pupil region within the middle region.
S503: calculating the proportion by the formula N_1 / N_T.
In this step, the proportion of that region relative to the middle region is calculated and used as the proportion of 1-valued pixels in the part of the binarized image outside the pupil area, thereby eliminating the influence of the upper and lower eyelids, eyelash noise and the like.
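Steps S501 to S503, together with the decision rule of S500, can be sketched as follows; the rule is implemented exactly as stated above (middle region obtained by cutting one third from each side, ratio N_1/N_T compared with the example threshold 0.8), and the pupil mask is assumed to come from the initial positioning of S101.

```python
import numpy as np

def beauty_pupil_decision(binary_img, pupil_mask, ratio_thresh=0.8):
    """Count 1-valued pixels outside the pupil in the middle region (N_1),
    divide by the total pixel count of the middle region (N_T) and compare
    the ratio with the beauty pupil threshold."""
    h, w = binary_img.shape
    mid = binary_img[h // 3: h - h // 3, w // 3: w - w // 3]
    mid_pupil = pupil_mask[h // 3: h - h // 3, w // 3: w - w // 3]
    n_total = mid.size                                      # N_T
    n_one = int(np.count_nonzero(mid[~mid_pupil] == 1))     # N_1
    ratio = n_one / n_total
    return "not worn" if ratio > ratio_thresh else "worn"
```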
Example 2:
the embodiment of the invention provides a pupil detection device, as shown in fig. 10, which comprises:
an image acquisition module 100 for acquiring an eye image.
The first image matrix calculating module 200 is configured to calculate a pixel average value of the eye image, and stretch gray scale of pixels smaller than the pixel average value to [0,1] in the eye image, and set pixels larger than the pixel average value to 0, so as to obtain a first image matrix.
The second image matrix calculation module 300 is configured to perform gradient transformation on the eye image, and normalize the gray scale of each pixel on the image obtained by the gradient transformation to [0,1], so as to obtain a second image matrix.
The binarization image calculation module 400 is configured to perform matrix addition operation on the first image matrix and the second image matrix, and perform binarization processing on each pixel on the image obtained by the matrix addition operation according to the set binarization threshold value, so as to obtain a binarization image.
The pupil determining module 500 is configured to count a proportion of pixels with a value of 1 in a portion other than the pupil area on the binarized image, and determine that the pupil is not worn if the proportion is greater than a set pupil threshold, or determine that the pupil is worn if the proportion is not greater than the set pupil threshold.
The invention exploits the characteristics that the beauty-pupil region has complex texture and therefore relatively large gradients while the outer circle of a normal iris is smooth, and that the pixel values of the beauty-pupil region are smaller than those of the iris region as a whole, to perform beauty pupil detection. An eye image is first acquired; gray stretching and gradient transformation are applied to the eye image separately, the results are added as matrices and binarized, and whether a beauty pupil is worn is judged from the proportion of 1-valued pixels in the part of the binarized image outside the pupil area.
The invention can accurately identify dark and light beauty pupils of various patterns without relying on image samples, and has strong robustness.
The apparatus further comprises a preprocessing module for preprocessing the acquired eye image, the preprocessing module comprising:
and the pupil initial positioning unit is used for acquiring an eye image and performing pupil initial positioning to obtain a pupil position, a pupil radius, an upper eyelid boundary and a lower eyelid boundary.
And the image intercepting unit is used for respectively selecting the vertex of the upper eyelid boundary, the bottom point of the lower eyelid boundary and two intersection points of the upper eyelid boundary and the lower eyelid boundary as intercepting boundaries to intercept the eye image so as to obtain an eye area image.
And the light spot filling unit is used for carrying out light spot detection on the eye area image, and if the light spot is detected, filling the detected light spot in the eye area image by using bicubic interpolation.
In the beauty pupil judgment module, counting the proportion of the pixels with the value of 1 in the part of the binarized image outside the pupil area includes:
a middle region selecting unit, for selecting a middle region on the binarized image and counting the total number of pixels N_T of the middle region;
a pixel statistics unit, for counting the number N_1 of the pixels with the value of 1 in the part of the middle region outside the pupil area;
a proportion calculating unit, for calculating the proportion by the formula N_1 / N_T.
The device provided in the embodiment of the present invention has the same implementation principle and technical effects as those of the embodiment 1 of the method, and for brevity, reference may be made to the corresponding content of the embodiment 1 of the method for the part of the embodiment of the device that is not mentioned. It will be clear to those skilled in the art that, for convenience and brevity of description, the specific working procedures of the apparatus and unit described above may refer to the corresponding procedures in the method embodiment 1, and are not repeated here.
Example 3:
the method described in the above embodiment 1 provided in the present specification may implement service logic by a computer program and be recorded on a storage medium, which may be read and executed by a computer, to implement the effects of the scheme described in embodiment 1 of the present specification. Accordingly, the present invention also provides a computer readable storage medium for pupil detection, comprising a memory for storing processor executable instructions which, when executed by a processor, implement steps comprising the pupil detection method described in embodiment 1.
The invention can accurately identify dark and light beauty pupils of various patterns without relying on image samples, and has strong robustness.
The storage medium may include physical means for storing information, typically by digitizing the information before storing it in an electronic, magnetic, or optical medium. The storage medium may include: means for storing information using electrical energy such as various memories, e.g., RAM, ROM, etc.; devices for storing information using magnetic energy such as hard disk, floppy disk, magnetic tape, magnetic core memory, bubble memory, and USB flash disk; devices for optically storing information, such as CDs or DVDs. Of course, there are other ways of readable storage medium, such as quantum memory, graphene memory, etc.
The above-described device according to method embodiment 1 may also comprise other embodiments. Specific implementation may refer to description of related method embodiment 1, and will not be described in detail herein.
Example 4:
the invention also provides a device for pupil detection, which can be a single computer or can comprise an actual operating device or the like using one or more of the methods or one or more embodiment devices of the specification. The apparatus for pupil detection may include at least one processor and a memory storing computer-executable instructions that when executed by the processor implement the steps of the pupil detection method described in any one or more of embodiments 1 above.
The invention can accurately identify dark and light beauty pupils of various patterns without relying on image samples, and has strong robustness.
The above description of the apparatus according to the method or apparatus embodiment may further include other embodiments, and specific implementation may refer to the description of the related method embodiment 1, which is not described herein in detail.
Example 5:
the embodiment of the invention also provides an iris recognition method, as shown in fig. 11, which comprises the following steps:
s100': the mydriasis detection was performed by the mydriasis detection method described in example 1.
S200': and (3) iris detection is carried out on the eye images to obtain iris areas.
The iris detection may be performed simultaneously in S100 of the foregoing embodiment 1, or may be performed separately from S100 of the foregoing embodiment 1, which is not limited by the present invention.
The iris detection obtains an inner iris boundary (the inner iris boundary is the outer pupil circle) and an outer iris boundary, and further obtains an iris region, as shown in fig. 12.
S300': and counting the ratio of the pixels with the value of 1 in the iris region on the binarized image, and if the ratio is smaller than a set recognition threshold, performing iris recognition by using the eye image.
In this step, the 1-valued pixels in the iris region of the binarized image correspond to the beauty-pupil region within the iris region, and the ratio is the ratio of that beauty-pupil region to the whole iris region. When the ratio is large, a large part of the iris texture is blocked by the beauty pupil and the eye image cannot be used for iris recognition; otherwise, the eye image can be used for iris recognition.
The invention performs beauty pupil detection on the eye image and judges from the detection result whether iris recognition can be carried out. The main idea is to use the positioning results of the inner and outer iris boundaries to generate a noise template, determine the ratio of the whole iris region occupied by the beauty-pupil region, and judge from this ratio whether iris recognition can be performed. Fig. 12 shows the iris inner and outer boundary localization results mapped back onto the original eye image, and Fig. 13 shows the corresponding noise template.
Based on the inner and outer iris boundary positioning shown in Fig. 12, the ratio of the 1-valued pixels of the noise template shown in Fig. 13 to the iris region is computed. The recognition threshold is set, for example, to 50%, and whether the eye image can be used for iris recognition is judged by comparing the ratio with the recognition threshold; the value of 50% is for illustration only and is not intended to limit the invention.
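A sketch of this judgment is given below, assuming a boolean mask of the iris annulus has been built from the inner and outer boundary positioning; the 50% recognition threshold is the example value above, and the returned mask is the noise template used in the next paragraph.

```python
import numpy as np

def iris_recognition_gate(binary_img, iris_mask, recog_thresh=0.5):
    """Ratio of 1-valued pixels inside the iris annulus versus the annulus size;
    if the ratio is below the recognition threshold the eye image is accepted
    for iris recognition, and the 1-valued pixels form the noise template."""
    iris_pixels = binary_img[iris_mask]
    ratio = float(np.count_nonzero(iris_pixels == 1)) / max(iris_pixels.size, 1)
    noise_template = np.logical_and(binary_img == 1, iris_mask)
    return ratio < recog_thresh, noise_template
```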
Meanwhile, during iris recognition, in the iris feature extraction process the region where the 1-valued pixels of the binarized image are located is treated as a noise region, i.e. the beauty-pupil region within the iris region is treated as noise and does not take part in iris feature extraction or feature comparison.
By determining the ratio of the iris region occupied by the beauty-pupil region, judging from this ratio whether the eye image can be used for iris recognition, and treating the beauty-pupil region as noise during recognition, the invention improves the accuracy of iris recognition.
The iris recognition method provided in embodiment 5 of the present invention includes the beauty pupil detection method described in embodiment 1; for brevity, for the details of embodiment 5 not mentioned above, reference may be made to the corresponding content of embodiment 1, which is not repeated here.
Example 6:
the embodiment of the invention also provides an iris recognition device, as shown in fig. 14, which comprises:
the pupil detection module 100' is configured to perform pupil detection by using the pupil detection apparatus described in embodiment 2.
The iris detection module 200' is configured to perform iris detection on the eye image to obtain an iris region.
The recognition judging module 300' is configured to count a ratio of pixels with a value of 1 in an iris area on the binarized image, and if the ratio is smaller than a set recognition threshold, perform iris recognition by using the eye image.
In iris recognition, a region where a pixel having a value of 1 is located on a binarized image is treated as a noise region.
The implementation principle and the technical effects of the device provided by the embodiment of the present invention are the same as those of the foregoing method embodiment 5, and for brevity, the details of the device embodiment are not mentioned in the foregoing method embodiment 5, and the details thereof will not be described in detail.
Example 7:
the method described in the above embodiment 5 provided in the present specification may implement business logic by a computer program and be recorded on a storage medium, which may be read and executed by a computer, to implement the effects of the scheme described in embodiment 5 of the present specification. Accordingly, the present invention also provides a computer readable storage medium for iris recognition, comprising a memory for storing processor executable instructions which when executed by a processor implement steps comprising the iris recognition method described in embodiment 5.
The above-described device according to method example 5 may also comprise other embodiments. Specific implementation may refer to description of related method embodiment 1, and will not be described in detail herein.
Example 8:
the invention also provides a device for iris recognition, which can be a separate computer or can comprise actual operation devices using one or more of the methods or one or more of the embodiment devices of the present specification, etc. The apparatus for iris recognition may include at least one processor and a memory storing computer executable instructions that when executed by the processor perform the steps of the iris recognition method described in any one or more of embodiments 5 above.
The above description of the apparatus according to the method or apparatus embodiment may further include other embodiments, and specific implementation may refer to the description of the related method embodiment 1, which is not described herein in detail.
It should be noted that, the description of the apparatus or the system according to the embodiments of the related method in this specification may further include other embodiments, and specific implementation manner may refer to the description of the embodiments of the method, which is not described herein in detail. In this specification, each embodiment is described in a progressive manner, and identical and similar parts of each embodiment are all referred to each other, and each embodiment mainly describes differences from other embodiments. In particular, for the hardware + program class, the storage medium + program embodiment, since it is substantially similar to the method embodiment, the description is relatively simple, and reference is made to the partial description of the method embodiment for relevant points.
The foregoing describes specific embodiments of the present disclosure. Other embodiments are within the scope of the following claims. In some cases, the actions or steps recited in the claims can be performed in a different order than in the embodiments and still achieve desirable results. In addition, the processes depicted in the accompanying figures do not necessarily require the particular order shown, or sequential order, to achieve desirable results. In some embodiments, multitasking and parallel processing are also possible or may be advantageous.
The system, apparatus, module or unit set forth in the above embodiments may be implemented in particular by a computer chip or entity, or by a product having a certain function. One typical implementation is a computer. In particular, the computer may be, for example, a personal computer, a laptop computer, a car-mounted human-computer interaction device, a cellular telephone, a camera phone, a smart phone, a personal digital assistant, a media player, a navigation device, an email device, a game console, a tablet computer, a wearable device, or a combination of any of these devices.
For convenience of description, the above devices are described as being functionally divided into various modules, respectively. Of course, when one or more of the present description is implemented, the functions of each module may be implemented in the same piece or pieces of software and/or hardware, or a module that implements the same function may be implemented by a plurality of sub-modules or a combination of sub-units, or the like. The above-described apparatus embodiments are merely illustrative, for example, the division of the units is merely a logical function division, and there may be additional divisions when actually implemented, for example, multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be an indirect coupling or communication connection via some interfaces, devices or units, which may be in electrical, mechanical or other form.
Those skilled in the art will also appreciate that, in addition to implementing the controller in a pure computer readable program code, it is well possible to implement the same functionality by logically programming the method steps such that the controller is in the form of logic gates, switches, application specific integrated circuits, programmable logic controllers, embedded microcontrollers, etc. Such a controller can be regarded as a hardware component, and means for implementing various functions included therein can also be regarded as a structure within the hardware component. Or even means for achieving the various functions may be regarded as either software modules implementing the methods or structures within hardware components.
The present invention is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
In one typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
It should also be noted that the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising one … …" does not exclude the presence of other like elements in a process, method or apparatus comprising such elements.
One skilled in the relevant art will recognize that one or more embodiments of the present description may be provided as a method, system, or computer program product. Accordingly, one or more embodiments of the present description may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Moreover, one or more embodiments of the present description can take the form of a computer program product on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, etc.) having computer-usable program code embodied therein.
One or more embodiments of the present specification may be described in the general context of computer-executable instructions, such as program modules, being executed by a computer. Generally, program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types. One or more embodiments of the present specification may also be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a communications network. In a distributed computing environment, program modules may be located in both local and remote computer storage media including memory storage devices.
In this specification, each embodiment is described in a progressive manner, and identical and similar parts of each embodiment are all referred to each other, and each embodiment mainly describes differences from other embodiments. In particular, for system embodiments, since they are substantially similar to method embodiments, the description is relatively simple, as relevant to see a section of the description of method embodiments. In the description of the present specification, a description referring to terms "one embodiment," "some embodiments," "examples," "specific examples," or "some examples," etc., means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the present specification. In this specification, schematic representations of the above terms are not necessarily directed to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. Furthermore, the different embodiments or examples described in this specification and the features of the different embodiments or examples may be combined and combined by those skilled in the art without contradiction.
Finally, it should be noted that the above examples are only specific embodiments of the present invention, intended to illustrate rather than limit its technical solutions, and the protection scope of the present invention is not limited thereto. Although the present invention has been described in detail with reference to the foregoing embodiments, a person skilled in the art should understand that the technical solutions described in the foregoing embodiments may still be modified, or some of the technical features may be equivalently replaced; such modifications or replacements do not depart from the spirit and scope of the corresponding technical solutions and are intended to be encompassed within the scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.

Claims (10)

1. A pupil detection method, the method comprising:
acquiring an eye image;
calculating the average value of pixels of the eye image, and carrying out gray stretching on pixels smaller than the average value of the pixels in the eye image to [0,1], wherein the pixels larger than the average value of the pixels are set as 0, so as to obtain a first image matrix;
performing gradient transformation on the eye image, and normalizing the gray scale of each pixel on the image obtained by the gradient transformation to be [0,1] to obtain a second image matrix;
performing matrix addition operation on the first image matrix and the second image matrix, and performing binarization processing on each pixel on the image obtained by the matrix addition operation according to a set binarization threshold value to obtain a binarized image;
and counting the proportion of the pixels with the value of 1 in the part of the binarized image outside the pupil area, judging that a beauty pupil is not worn if the proportion is larger than a set beauty pupil threshold value, and otherwise judging that a beauty pupil is worn.
2. The pupil detection method of claim 1, further comprising: preprocessing the acquired eye image, wherein the preprocessing comprises the following steps:
performing pupil primary positioning on the eye image to obtain pupil positions, pupil radiuses, upper eyelid boundaries and lower eyelid boundaries;
respectively selecting the vertex of the upper eyelid boundary, the bottom point of the lower eyelid boundary and two intersection points of the upper eyelid boundary and the lower eyelid boundary as intercepting boundaries to intercept the eye image to obtain an eye region image;
and carrying out light spot detection on the eye area image, and if the light spot is detected, filling the detected light spot in the eye area image by using bicubic interpolation.
3. The method according to claim 1 or 2, wherein counting the proportion of the pixels with the value of 1 in the part of the binarized image outside the pupil area comprises:
selecting a middle region on the binarized image, and counting the total number of pixels N_T of the middle region;
counting the number N_1 of the pixels with the value of 1 in the part of the middle region outside the pupil area;
calculating the proportion by the formula N_1 / N_T.
4. A pupil detection apparatus, the apparatus comprising:
the image acquisition module is used for acquiring an eye image;
the first image matrix calculation module is used for calculating the pixel average value of the eye image, and carrying out gray stretching on pixels smaller than the pixel average value in the eye image to [0,1], wherein the pixels larger than the pixel average value are set to be 0, so as to obtain a first image matrix;
the second image matrix calculation module is used for carrying out gradient transformation on the eye image, and normalizing the gray scale of each pixel on the image obtained by the gradient transformation to [0,1], so as to obtain a second image matrix;
the binarization image calculation module is used for carrying out matrix addition operation on the first image matrix and the second image matrix, and carrying out binarization processing on each pixel on the image obtained by matrix addition operation according to a set binarization threshold value to obtain a binarization image;
and the beauty pupil judgment module is used for counting the proportion of the pixels with the value of 1 in the part of the binarized image outside the pupil area, judging that a beauty pupil is not worn if the proportion is larger than a set beauty pupil threshold, and judging that a beauty pupil is worn if the proportion is not larger than the set beauty pupil threshold.
5. The pupil detection apparatus of claim 4, further comprising a preprocessing module for preprocessing the acquired eye image, the preprocessing module comprising:
the pupil initial positioning unit is used for performing pupil initial positioning on the eye image to obtain pupil positions, pupil radiuses, upper eyelid boundaries and lower eyelid boundaries;
the image intercepting unit is used for respectively selecting the vertex of the upper eyelid boundary, the bottom point of the lower eyelid boundary and two intersection points of the upper eyelid boundary and the lower eyelid boundary as intercepting boundaries to intercept the eye image to obtain an eye area image;
and the light spot filling unit is used for carrying out light spot detection on the eye area image, and if the light spot is detected, filling the light spot detected in the eye area image by using bicubic interpolation.
6. The pupil detection apparatus according to claim 4 or 5, wherein, in the beauty pupil judgment module, counting the proportion of the pixels with the value of 1 in the part of the binarized image outside the pupil area comprises:
a middle region selecting unit, for selecting a middle region on the binarized image and counting the total number of pixels N_T of the middle region;
a pixel statistics unit, for counting the number N_1 of the pixels with the value of 1 in the part of the middle region outside the pupil area;
a proportion calculating unit, for calculating the proportion by the formula N_1 / N_T.
7. A computer readable storage medium for pupil detection, comprising a memory for storing processor executable instructions which, when executed by the processor, implement the steps comprising the pupil detection method of any of claims 1-3.
8. An apparatus for pupil detection, characterized in that it comprises at least one processor and a memory storing computer-executable instructions, which when executed by the processor implement the steps of the pupil detection method of any one of claims 1-3.
9. An iris recognition method, the method comprising:
performing beauty pupil detection by the pupil detection method of any one of claims 1 to 3;
iris detection is carried out on the eye images, and an iris area is obtained;
and counting the ratio of the pixels with the value of 1 in the iris region on the binarized image, and if the ratio is smaller than a set recognition threshold, performing iris recognition by using the eye image.
10. The iris identification method of claim 9 wherein the area where the pixel having a value of 1 is located on the binarized image is treated as a noise area at the time of iris identification.
CN202111372916.6A 2021-11-19 2021-11-19 Pupil detection and iris recognition method, device, readable storage medium and equipment Pending CN116152139A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202111372916.6A CN116152139A (en) 2021-11-19 2021-11-19 Pupil detection and iris recognition method, device, readable storage medium and equipment
PCT/CN2022/128304 WO2023088071A1 (en) 2021-11-19 2022-10-28 Cosmetic contact lens detection method and apparatus, iris recognition method and apparatus, and readable storage medium and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111372916.6A CN116152139A (en) 2021-11-19 2021-11-19 Pupil detection and iris recognition method, device, readable storage medium and equipment

Publications (1)

Publication Number Publication Date
CN116152139A true CN116152139A (en) 2023-05-23

Family

ID=86349335

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111372916.6A Pending CN116152139A (en) 2021-11-19 2021-11-19 Pupil detection and iris recognition method, device, readable storage medium and equipment

Country Status (2)

Country Link
CN (1) CN116152139A (en)
WO (1) WO2023088071A1 (en)

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20050025927A (en) * 2003-09-08 2005-03-14 유웅덕 The pupil detection method and shape descriptor extraction method for a iris recognition, iris feature extraction apparatus and method, and iris recognition system and method using its
US7970179B2 (en) * 2006-09-25 2011-06-28 Identix Incorporated Iris data extraction
CN109800618A (en) * 2017-11-16 2019-05-24 山西天地科技有限公司 A kind of device and method thereof for detecting whether to wear U.S. pupil
CN109934143A (en) * 2019-03-04 2019-06-25 深圳三人行在线科技有限公司 A kind of method and apparatus of the detection of iris image Sino-U.S. pupil
CN110909601B (en) * 2019-10-18 2022-12-09 武汉虹识技术有限公司 Beautiful pupil identification method and system based on deep learning
CN110516661B (en) * 2019-10-21 2020-05-05 武汉虹识技术有限公司 Beautiful pupil detection method and device applied to iris recognition

Also Published As

Publication number Publication date
WO2023088071A1 (en) 2023-05-25


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination