CN114944006A - Iris identification method, device, equipment and storage medium - Google Patents

Iris identification method, device, equipment and storage medium

Info

Publication number
CN114944006A
Authority
CN
China
Prior art keywords
iris
image
pixel gray
different directions
iris image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210729974.8A
Other languages
Chinese (zh)
Inventor
王明磊
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Ping An Puhui Enterprise Management Co Ltd
Original Assignee
Ping An Puhui Enterprise Management Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ping An Puhui Enterprise Management Co Ltd filed Critical Ping An Puhui Enterprise Management Co Ltd
Priority to CN202210729974.8A priority Critical patent/CN114944006A/en
Publication of CN114944006A publication Critical patent/CN114944006A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/18 Eye characteristics, e.g. of the iris
    • G06V40/193 Preprocessing; Feature extraction
    • G06V40/197 Matching; Classification

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Ophthalmology & Optometry (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Image Processing (AREA)

Abstract

The invention relates to the field of biometric identification and discloses an iris identification method comprising the following steps: performing a plurality of filtering operations on an iris image to be recognized to obtain a denoised iris image; performing a weighting operation on pixel gray values in different directions in the denoised iris image to obtain weighted average gray values in different directions; encoding the weighted average gray values in different directions together with the central pixel gray value to obtain an iris feature map; and performing a down-sampling operation on the iris feature map to obtain an iris feature vector, calculating the Hamming distance between the iris feature vector and a preset training iris feature vector, and comparing the Hamming distance with a preset training classification threshold to obtain a recognition result for the iris image to be recognized. The invention also relates to blockchain technology: the iris image can be stored in blockchain nodes. The invention further provides an iris recognition device, equipment and medium. The invention can improve the efficiency and accuracy of iris recognition.

Description

Iris identification method, device, equipment and storage medium
Technical Field
The present invention relates to the field of biometric identification, and in particular, to an iris identification method, apparatus, device, and storage medium.
Background
Iris recognition refers to determining a user's identity by comparing the similarity between features of iris images. The iris recognition process is divided into iris image acquisition, iris preprocessing, and iris feature extraction and recognition, of which the stable extraction of iris features and the recognition itself are key. Iris feature extraction refers to analyzing an iris image so as to extract the iris features it contains; traditional iris feature extraction methods include those based on phase analysis, zero-crossing detection, texture analysis, and the LBP operator.
However, these methods are sensitive to image noise and computationally expensive. They are also specialized to particular features — special features require special feature extraction and recognition algorithms — so both the flexibility and the efficiency of iris feature extraction are low. In addition, when describing the characteristic texture of the iris, these methods usually ignore the overall spatial information of the iris and do not consider its central pixel, so the accuracy of iris recognition is low.
Disclosure of Invention
The invention provides an iris identification method, device, equipment and storage medium, and mainly aims to improve the efficiency and accuracy of iris identification.
In order to achieve the above object, the present invention provides an iris identification method, including:
acquiring an iris image to be recognized, and performing various filtering operations on the iris image to obtain a de-noised iris image;
extracting pixel gray values of the de-noised iris image, and performing weighting operation on the pixel gray values in different directions in the de-noised iris image to obtain weighted average gray values in different directions;
extracting a central pixel gray value of the denoised iris image, and coding the weighted average gray value and the central pixel gray value in different directions to obtain an iris feature map;
and performing down-sampling operation on the iris feature map to obtain an iris feature vector, calculating a Hamming distance between the iris feature vector and a preset training iris feature vector, and comparing the Hamming distance with a preset training classification threshold to obtain a recognition result of the iris image to be recognized.
Optionally, the encoding the weighted average gray value and the central pixel gray value in different directions to obtain an iris feature map includes:
calculating the difference between the weighted average gray value and the central pixel gray value in different directions respectively to obtain a plurality of difference pixel gray values;
coding the plurality of difference pixel gray values by using a preset coding rule to obtain a plurality of pixel binary values;
and outputting a plurality of pixel binary values by using a preset activation function to obtain the iris characteristic diagram.
Optionally, the performing a weighting operation on the pixel gray-scale values in different directions in the denoised iris image to obtain weighted average gray-scale values in different directions includes:
selecting a plurality of adjacent pixel gray values in different directions in the denoised iris image, and calculating the adjacent pixel gray values by using a preset weighted gray value formula to obtain the weighted average gray values in different directions.
Wherein the preset weighted gray-scale value formula is as follows:
G_w = (2·P_w + P_{w+1} + P_{w-1}) / 4
wherein G_w represents the weighted average gray value; P_w represents the pixel gray value at the middle position among the plurality of adjacent pixel gray values; P_{w+1} represents the pixel gray value above or to the right of the middle position; P_{w-1} represents the pixel gray value below or to the left of the middle position; and w represents the index of the position of the pixel gray value.
Optionally, the downsampling the iris feature map to obtain an iris feature vector includes:
splitting the iris feature map into a plurality of sub-blocks, and reading the average grey value of each sub-block;
and converting the average grey value of the sub-blocks into sub-block binary values, and connecting the sub-block binary values to obtain the iris characteristic vector.
Optionally, the calculating a hamming distance between the iris feature vector and a preset training iris feature vector includes:
reading a training template of the training iris feature vector and an iris template of the iris feature vector;
and calculating the training template and the iris template to obtain the Hamming distance between the training iris characteristic vector and the iris characteristic vector.
Optionally, the performing a plurality of filtering operations on the iris image to obtain a denoised iris image includes:
carrying out Gaussian filtering operation on the iris image to obtain a Gaussian filtering image;
carrying out median filtering operation on the iris image to obtain a median filtering image;
carrying out bilateral filtering operation on the iris image to obtain a bilateral filtering image;
performing Gaussian Laplace filtering operation on the iris image to obtain a Laplace filtering image;
comparing the pixel gray values of the Gaussian filtered image, the median filtered image, the bilateral filtered image and the Laplace filtered image with the pixel gray value of the iris image to obtain a Gaussian difference image, a median difference image, a bilateral difference image and a Laplace difference image respectively;
and mapping the Gaussian difference image, the median difference image, the bilateral difference image and the Laplace difference image to an image with a preset dimensionality to obtain a de-noised iris image.
Optionally, after acquiring the iris image to be recognized, the method further includes:
iris positioning is carried out on the iris image to obtain an effective iris area in the iris image;
normalizing the effective area of the iris to obtain a normalized iris image;
and carrying out histogram equalization operation on the normalized iris image to obtain an iris image with enhanced iris texture.
In order to solve the above problems, the present invention also provides an iris recognition apparatus, including:
the filtering operation module is used for acquiring an iris image to be identified and carrying out various filtering operations on the iris image to obtain a de-noised iris image;
the pixel gray value weighting module is used for extracting the pixel gray value of the de-noised iris image and carrying out weighting operation on the pixel gray values in different directions in the de-noised iris image to obtain weighted average gray values in different directions;
the iris feature coding module is used for extracting a central pixel gray value of the de-noised iris image, and coding the weighted average gray value and the central pixel gray value in different directions to obtain an iris feature map;
and the iris identification module is used for carrying out down-sampling operation on the iris characteristic map to obtain an iris characteristic vector, calculating the Hamming distance between the iris characteristic vector and a preset training iris characteristic vector, and comparing the Hamming distance with a preset training classification threshold value to obtain an identification result of the iris image to be identified.
In order to solve the above problem, the present invention also provides an electronic device, including:
a memory storing at least one computer program; and
and a processor for executing the computer program stored in the memory to realize the iris identification method.
In order to solve the above problem, the present invention also provides a computer-readable storage medium having at least one computer program stored therein, the at least one computer program being executed by a processor in an electronic device to implement the iris recognition method as described above.
In the embodiment of the invention, the iris image is first subjected to a plurality of filtering operations to obtain a denoised iris image, so that noise can be removed from different directions of the iris image and the key texture features in the iris can be found to the greatest extent, which facilitates accurate subsequent iris feature extraction. Second, the pixel gray values in different directions in the denoised iris image are weighted to obtain weighted average gray values, and the weighted average gray values in different directions are encoded together with the central pixel gray value to obtain an iris feature map; this extracts the texture features of the iris in different directions and encodes both the directional iris features and the central pixel of the iris, thereby retaining the overall spatial information of the iris while also taking its central pixel into account, achieving accurate extraction of the iris features. Finally, the iris feature map is down-sampled, which reduces the amount of calculation and improves recognition efficiency while retaining the key feature information of the iris, and performing iris recognition via the Hamming distance improves its accuracy. Therefore, the iris identification method, device, equipment and storage medium provided by the embodiment of the invention can improve the efficiency and accuracy of iris recognition.
Drawings
Fig. 1 is a schematic flow chart of an iris identification method according to an embodiment of the present invention;
fig. 2 is a detailed flowchart illustrating a step in an iris identification method according to an embodiment of the present invention;
fig. 3 is a detailed flowchart illustrating a step in an iris identification method according to an embodiment of the present invention;
fig. 4 is a schematic block diagram of an iris recognition apparatus according to an embodiment of the present invention;
fig. 5 is a schematic internal structure diagram of an electronic device implementing an iris recognition method according to an embodiment of the present invention;
the implementation, functional features and advantages of the objects of the present invention will be further explained with reference to the accompanying drawings.
Detailed Description
It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
The embodiment of the invention provides an iris identification method. The execution subject of the iris identification method includes, but is not limited to, at least one of electronic devices such as a server and a terminal that can be configured to execute the method provided by the embodiments of the present application. In other words, the iris recognition method may be performed by software or hardware installed in the terminal device or the server device, and the software may be a blockchain platform. The server includes but is not limited to: a single server, a server cluster, a cloud server or a cloud server cluster, and the like.
Referring to fig. 1, a flowchart of an iris identification method according to an embodiment of the present invention is shown, where in the embodiment of the present invention, the iris identification method includes the following steps S1-S4:
s1, obtaining the iris image to be identified, and carrying out various filtering operations on the iris image to obtain the de-noised iris image.
In the embodiment of the invention, the iris image to be identified refers to a user iris image collected by a camera arranged in a target area; the de-noising iris image is an iris image which removes iris noise and reserves the characteristic points of the internal texture of the iris.
In an embodiment of the present invention, after acquiring the iris image to be recognized, the method further includes:
iris positioning is carried out on the iris image to obtain an effective iris area in the iris image; normalizing the effective iris area to obtain a normalized iris image; and carrying out histogram equalization operation on the normalized iris image to obtain an iris image with enhanced iris texture.
Before the iris image is localized, quality evaluation can be performed on the iris image to judge whether its quality permits subsequent iris recognition. Iris localization is then performed to locate the effective iris area in the iris image; next, the effective iris area is normalized using the rubber-sheet model method so that the iris image has translation-invariant and scale-invariant characteristics; finally, a histogram equalization operation is performed on the iris image to enhance the iris texture, so that more accurate iris features can be extracted subsequently.
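The histogram-equalization step above can be sketched as a minimal NumPy implementation; this assumes an 8-bit grayscale normalized iris image, since the patent names the operation but does not fix a particular algorithm:

```python
import numpy as np

def equalize_histogram(img):
    """Histogram equalization for an 8-bit grayscale iris image.

    Maps each gray level through the cumulative distribution so the output
    levels spread over [0, 255], enhancing the iris texture contrast.
    """
    hist = np.bincount(img.ravel(), minlength=256)
    cdf = hist.cumsum()
    cdf_min = cdf[cdf > 0][0]  # CDF value of the darkest occupied level
    # Classic equalization mapping, scaled back to the 8-bit range
    lut = np.clip(np.round((cdf - cdf_min) / (img.size - cdf_min) * 255), 0, 255)
    return lut.astype(np.uint8)[img]

# Toy example: a low-contrast 4x4 patch spreads to the full range
patch = np.array([[100, 101, 102, 103]] * 4, dtype=np.uint8)
out = equalize_histogram(patch)
```

After equalization the four crowded gray levels 100–103 map to 0, 85, 170 and 255, which is exactly the texture-enhancing stretch the preprocessing step relies on.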
The embodiment of the invention obtains the denoised iris image by performing a plurality of filtering operations on the iris image. Because a single filtering operation cannot accurately find the features in the iris texture, a combination of several filtering operations is used to remove noise from different directions of the iris, reducing unnecessary interference, highlighting the iris texture, retaining more effective iris feature information, and ensuring a full expression of the iris texture.
As an embodiment of the present invention, referring to fig. 2, the performing multiple filtering operations on the iris image to obtain a denoised iris image includes the following steps S11-S16:
s11, carrying out Gaussian filtering operation on the iris image to obtain a Gaussian filtered image;
s12, carrying out median filtering operation on the iris image to obtain a median filtering image;
s13, carrying out bilateral filtering operation on the iris image to obtain a bilateral filtering image;
s14, performing Gaussian Laplace filtering operation on the iris image to obtain a Laplace filtering image;
s15, comparing the pixel gray values of the Gaussian filter image, the median filter image, the bilateral filter image and the Laplace filter image with the pixel gray value of the iris image to obtain a Gaussian difference image, a median difference image, a bilateral difference image and a Laplace difference image respectively;
s16, mapping the Gaussian difference image, the median difference image, the bilateral difference image and the Laplace difference image to an image with a preset dimension to obtain a denoised iris image.
The Gaussian filtering operation scans each pixel of the iris image with a preset sliding convolution window and replaces the value of the window's central pixel with the weighted average gray value of the pixels in the neighborhood defined by the Gaussian template, thereby removing Gaussian noise. The median filtering operation scans the pixel gray values of the iris image with a filtering window at a given step size and replaces the central value with the median of the pixel gray values covered by the window, thereby eliminating isolated noise points. Compared with Gaussian and median filtering, the bilateral filtering operation takes the luminosity and color differences between pixels of the iris image into account, so it can effectively remove noise while preserving the edge information of the image. The Laplacian-of-Gaussian filtering operation first reduces the noise of the iris image with Gaussian filtering and then extracts the edge information of the iris image with the Laplacian operator, so that noise can be removed accurately during denoising of the iris image.
In an embodiment of the present invention, the pixel gray values of the gaussian filter image, the median filter image, the bilateral filter image, and the laplacian filter image are respectively compared with the pixel gray values of the iris image, that is, the image of the filtering operation is subtracted from the iris image, so as to respectively obtain the corresponding image from which the noise of the iris image is removed after each filtering operation.
Further, the Gaussian difference image, the median difference image, the bilateral difference image and the Laplacian difference image are mapped into an image with a preset dimension: the 4 groups of difference images are integrated into one group, the pixel gray values of non-feature points in the integrated difference image are set to 0, and all points whose pixel gray values are not 0 are taken as stable feature points of the iris image, yielding the denoised iris image.
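The difference-and-fusion step above can be sketched as follows. The fusion rule (per-pixel maximum over the absolute difference images) and the `thresh` knob are assumptions — the patent only states that the difference images are integrated and that non-feature points are set to 0:

```python
import numpy as np

def fuse_difference_images(img, filtered, thresh=0):
    """Fuse per-filter difference images into one denoised feature map.

    Each difference image |img - filtered_k| records what one filter removed.
    The fusion keeps, at every pixel, the strongest response across filters;
    pixels whose fused response stays at or below `thresh` are treated as
    non-feature points and set to 0, the rest keep their original gray value.
    """
    diffs = [np.abs(img.astype(np.int32) - f.astype(np.int32)) for f in filtered]
    fused = np.maximum.reduce(diffs)        # strongest per-pixel response
    return np.where(fused > thresh, img, 0).astype(img.dtype)

# One hypothetical smoothed image stands in for the four filter outputs
img = np.array([[10, 50], [10, 10]], dtype=np.uint8)
denoised = fuse_difference_images(img, [np.full((2, 2), 10, dtype=np.uint8)])
```

Only the pixel that differs from its smoothed version survives as a stable feature point; the flat background is zeroed out.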
S2, extracting the pixel gray value of the de-noised iris image, and carrying out weighting operation on the pixel gray values in different directions in the de-noised iris image to obtain weighted average gray values in different directions.
In the embodiment of the invention, the pixel gray value refers to a value of the brightness degree of the recorded image; the weighted average gray value is the average value of the gray values of several adjacent pixels selected in a certain range in different directions in the denoised iris image.
In the embodiment of the invention, because each pixel on the denoised iris image is not isolated and is closely connected with the surrounding pixels, the weighted average gray value in different directions is obtained by carrying out weighting operation on the gray values of the pixels in different directions in the denoised iris image, and the integral spatial characteristic information of the iris can be more completely retained.
As an embodiment of the present invention, the weighting the pixel gray-scale values in different directions in the denoised iris image to obtain weighted average gray-scale values in different directions includes:
selecting a plurality of adjacent pixel gray values in different directions in the denoised iris image, and calculating the adjacent pixel gray values by using a preset weighted gray value formula to obtain the weighted average gray values in different directions.
Wherein the preset weighted gray-scale value formula is as follows:
G_w = (2·P_w + P_{w+1} + P_{w-1}) / 4
wherein G_w represents the weighted average gray value; P_w represents the pixel gray value at the middle position among the plurality of adjacent pixel gray values; P_{w+1} represents the pixel gray value above or to the right of the middle position; P_{w-1} represents the pixel gray value below or to the left of the middle position; and w represents the index of the position of the pixel gray value.
For example, the gray values of 6 adjacent pixels in the π/2 direction are selected as P_2, P_3, P_4, Q_4, Q_5, Q_6, where P_w = P_3 + Q_5, P_{w+1} = P_2 + Q_4, and P_{w-1} = P_4 + Q_6. The weighted average gray value in the π/2 direction calculated with the weighted gray value formula is then G_3 = [2(P_3 + Q_5) + (P_2 + Q_4) + (P_4 + Q_6)] / 4. In addition, the invention can also select a plurality of adjacent pixel gray values in other directions, such as π, π/4 and 3π/4, for calculation.
In one embodiment of the invention, the different directions are directions selected by a user according to requirements, generally 8 different directions are selected from the de-noised iris image, and more complete iris characteristic information can be obtained; the number of the plurality of adjacent pixel gray values is determined according to the requirement of a user.
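A minimal sketch of the weighted-average computation, using the 2:1:1 weighting and /4 normalization reconstructed from the π/2 example above (the specific pixel values are illustrative):

```python
def weighted_average_gray(p_mid, p_plus, p_minus):
    """G_w = (2*P_w + P_{w+1} + P_{w-1}) / 4.

    The middle pixel gets twice the weight of its two neighbours along the
    chosen direction, giving a direction-specific smoothed gray value.
    """
    return (2 * p_mid + p_plus + p_minus) / 4

# The pi/2 example from the text: the six neighbours P2, P3, P4, Q4, Q5, Q6
# are paired across the centre before the 2:1:1 weighting is applied.
P2, P3, P4 = 100, 110, 120
Q4, Q5, Q6 = 90, 95, 105
g3 = weighted_average_gray(P3 + Q5, P2 + Q4, P4 + Q6)
```

Repeating this in each of the chosen directions (typically 8) yields the set of directional weighted average gray values used in the subsequent encoding step.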
S3, extracting the central pixel gray value of the de-noised iris image, and coding the weighted average gray value and the central pixel gray value in different directions to obtain an iris feature map.
In the embodiment of the invention, the central pixel gray value refers to the pixel gray value at the middle position in the de-noised iris image; the iris feature map is an image storing iris texture features.
According to the embodiment of the invention, the weighted average gray value and the central pixel gray value in different directions are coded to obtain the iris feature map, so that the texture features of the iris in different directions can be extracted, and the central pixel of the iris is considered, so that the accurate extraction of the iris features is realized.
As an embodiment of the present invention, referring to fig. 3, the encoding the weighted average gray-level values in different directions and the gray-level value of the central pixel to obtain an iris feature map includes the following steps S31-S33:
s31, calculating the difference between the weighted average gray value and the central pixel gray value in different directions respectively to obtain a plurality of difference pixel gray values;
s32, encoding the difference pixel gray values by using a preset encoding rule to obtain a plurality of pixel binary values;
and S33, outputting a plurality of pixel binary values by using a preset activation function to obtain the iris feature map.
Wherein the main function of calculating the difference between the weighted average gray value in different directions and the gray value of the central pixel is to further enhance the expression of the iris texture features.
In an embodiment of the present invention, the preset encoding rule may be the following encoding formula:
D_i = sgn(G_w − P_c)
sgn(x) = 1 if x ≥ T, and sgn(x) = 0 if x < T
wherein D_i represents the pixel binary value corresponding to the direction of the weighted average gray value; G_w − P_c represents the plurality of difference pixel gray values; T represents the encoding threshold; sgn(x) represents the result of comparing the difference pixel gray values with the encoding threshold; G_w represents the weighted average gray values in different directions; and P_c represents the central pixel gray value. The binary values of the 8 directions adjacent to the central pixel value can be determined by this encoding rule.
Further, a plurality of binary values of the pixels are output by using a preset activation function (such as a Mish function) to obtain the iris feature map.
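The sign-comparison encoding of the directional weighted averages against the centre pixel can be sketched as below. The zero default for the encoding threshold `thresh` and the bit-packing order are assumptions, since the patent leaves the threshold formula as an image:

```python
import numpy as np

def encode_directions(weighted_avgs, center, thresh=0.0):
    """Code each direction's weighted average against the centre pixel.

    Sign comparison D_i = sgn(G_w - P_c): a direction is coded 1 when its
    weighted average exceeds the centre gray value by at least `thresh`,
    else 0; the directional bits then pack into one LBP-style code value.
    """
    diffs = np.asarray(weighted_avgs, dtype=float) - center
    bits = (diffs >= thresh).astype(np.uint8)
    # Pack the bits into a single integer code, first direction as MSB
    code = int(bits.dot(1 << np.arange(len(bits) - 1, -1, -1)))
    return bits, code

# 8 directional weighted averages around a centre pixel of gray value 100
bits, code = encode_directions([120, 90, 130, 85, 110, 95, 140, 100], 100)
```

Applying this per pixel produces the iris feature map; the packed code is one compact way to store the 8 directional bits at each location.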
S4, carrying out down-sampling operation on the iris feature map to obtain an iris feature vector, calculating the Hamming distance between the iris feature vector and a preset training iris feature vector, and comparing the Hamming distance with a preset training classification threshold to obtain the recognition result of the iris image to be recognized.
In the embodiment of the invention, the iris characteristic vector is used for describing the iris characteristics, and the down-sampling operation is carried out on the iris characteristic map to obtain the iris characteristic vector, so that the dimensionality of the iris characteristic map can be reduced, the operation is reduced and the efficiency of subsequent iris identification is improved on the premise of ensuring that the key information of the iris texture is not lost.
In the embodiment of the invention, the Hamming distance refers to the number of different pixel gray values of the iris characteristic vector and the training iris characteristic vector in the same position; the training classification threshold is a category threshold to which iris images registered in a library belong; the identification result refers to the user identity corresponding to the identified iris image, for example, the iris in the iris image is identified as the third iris.
Furthermore, by calculating the Hamming distance between the feature code of the iris feature vector and the preset training feature code and comparing the Hamming distance with the preset classification threshold value, the recognition result of the iris image to be recognized is obtained, the calculation amount can be reduced while the key feature information of the iris is kept, the iris recognition efficiency is improved, and the iris recognition accuracy is improved by carrying out the iris recognition through the Hamming distance.
As an embodiment of the present invention, the down-sampling the iris feature map to obtain an iris feature vector includes:
splitting the iris feature map into a plurality of sub-blocks, and reading the average grey value of each sub-block; and converting the average grey value of the sub-blocks into sub-block binary values, and connecting the sub-block binary values to obtain the iris characteristic vector.
The iris feature map can be split into m × n sub-blocks. Splitting the feature map into non-overlapping sub-blocks preserves the overall structural information of the iris feature map. The smaller the sub-block, the higher the recognition rate, but the longer the computation time; when the sub-block is too small, the recognition rate drops again. Preferably, the feature map is split into 16 × 8 sub-blocks, and the average gray value of each sub-block α (1 ≤ α ≤ m × n) is computed.
In an embodiment of the present invention, the binary values of the sub-blocks may be connected by the following formula to obtain an iris feature vector:
sub_β = (sub_mean(1), sub_mean(2), …, sub_mean(m×n))
wherein sub_β represents the iris feature vector and sub_mean represents the sub-block binary values.
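The sub-block down-sampling can be sketched as follows. Binarising each block average against the whole map's mean is an assumption — the patent only says the averages are "converted into sub-block binary values" without giving the rule:

```python
import numpy as np

def downsample_to_vector(feature_map, m=16, n=8):
    """Down-sample an iris feature map into a binary feature vector.

    The map is split into m x n non-overlapping sub-blocks; each block's
    average gray value is binarised against the whole map's mean, and the
    bits are concatenated in row-major order.
    """
    h, w = feature_map.shape
    bh, bw = h // m, w // n
    # Reshape so each (bh, bw) sub-block can be averaged in one vectorised step
    blocks = feature_map[:m * bh, :n * bw].reshape(m, bh, n, bw)
    means = blocks.mean(axis=(1, 3))               # m x n sub-block averages
    return (means >= feature_map.mean()).astype(np.uint8).ravel()

# Hypothetical 128x64 feature map with a smooth gradient of gray values
vec = downsample_to_vector(np.arange(128 * 64, dtype=float).reshape(128, 64))
```

For the gradient map, the darker upper half of the blocks binarises to 0 and the brighter lower half to 1, giving the compact 128-bit vector used for matching.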
Further, the calculating the hamming distance between the iris feature vector and a preset training iris feature vector includes:
reading a training template of the training iris feature vector and an iris template of the iris feature vector; and calculating the training template and the iris template to obtain the Hamming distance between the training iris characteristic vector and the iris characteristic vector.
The training template refers to binary feature codes of training iris feature vectors, and the iris template refers to binary feature codes of the iris feature vectors.
In an embodiment of the present invention, the training template and the iris template may be calculated by the following formula:
H = (1/(m×n)) · Σ_{i=1}^{m×n} (A_i ⊕ B_i) ∧ A_mask ∧ B_mask
wherein H represents the Hamming distance; A_i represents the iris feature vector; B_i represents the training iris feature vector; A_mask represents the training template; B_mask represents the iris template; ⊕ represents the exclusive-or operation; and m × n represents the number of split sub-blocks.
In an embodiment of the invention, a smaller Hamming distance indicates that the iris template and the training template match more closely, and that the extracted features are more stable.
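The masked Hamming-distance comparison can be sketched as follows. The mask handling (counting only positions valid in both templates) and the 0.32 classification threshold are illustrative assumptions, not values taken from the patent:

```python
import numpy as np

def masked_hamming(a, b, a_mask, b_mask):
    """Normalized Hamming distance between two binary iris codes,
    counting only bit positions marked valid in both templates."""
    valid = a_mask & b_mask
    if valid.sum() == 0:
        return 1.0  # no comparable bits: treat as a complete mismatch
    return ((a ^ b) & valid).sum() / valid.sum()

rng = np.random.default_rng(1)
probe = rng.integers(0, 2, 128).astype(np.uint8)
enrolled = probe.copy()
enrolled[:8] ^= 1                     # flip 8 of 128 bits
mask = np.ones(128, dtype=np.uint8)   # all positions valid

h = masked_hamming(probe, enrolled, mask, mask)
threshold = 0.32                      # assumed training classification threshold
same_identity = h < threshold
print(h, same_identity)               # 0.0625 True
```

Flipping 8 of 128 bits gives a distance of 0.0625, well under the assumed threshold, so the probe and enrolled codes are judged to belong to the same category.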
Further, the Hamming distance is compared with a preset training classification threshold: when the Hamming distance is smaller than the training classification threshold, the recognition result is that the iris image to be recognized and the training iris belong to the same category; when the Hamming distance is greater than or equal to the training classification threshold, the recognition result is that the iris image to be recognized and the training iris do not belong to the same category.
In an optional embodiment of the invention, through comparison between the hamming distance and the training classification threshold, whether the identity of the user is registered can be determined, and in iris recognition applications, such as entrance guard recognition applications, entrance of strangers can be effectively prevented, and the application safety is improved.
In the embodiment of the invention, firstly, multiple filtering operations are performed on the iris image to obtain the de-noised iris image, so that noise can be removed from different directions of the iris image and the key texture features in the iris can be found to the greatest extent, which improves the accuracy of subsequent iris feature extraction; secondly, the pixel gray values in different directions in the de-noised iris image are weighted to obtain weighted average gray values, and the weighted average gray values in different directions and the central pixel gray value are encoded to obtain an iris feature map, so that the texture features of the iris in different directions can be extracted while the central pixel of the iris is also considered, which retains the overall spatial information of the iris and realizes accurate extraction of the iris features; finally, the iris feature map is down-sampled, which reduces the amount of calculation and improves the iris recognition efficiency while keeping the key feature information of the iris, and performing the recognition through the Hamming distance improves its accuracy. Therefore, the iris identification method provided by the embodiment of the invention can improve the efficiency and accuracy of iris identification.
The iris recognition apparatus 100 according to the present invention may be installed in an electronic device. According to the implemented functions, the iris recognition apparatus may include a filtering operation module 101, a pixel gray value weighting module 102, an iris feature coding module 103, and an iris recognition module 104. The modules, which may also be referred to as units, are series of computer program segments that can be executed by a processor of the electronic device to perform fixed functions, and are stored in a memory of the electronic device.
In the present embodiment, the functions regarding the respective modules/units are as follows:
the filtering operation module 101 is configured to obtain an iris image to be identified, and perform multiple filtering operations on the iris image to obtain a denoised iris image.
In the embodiment of the invention, the iris image to be identified refers to a user iris image collected by a camera arranged in a target area; the de-noising iris image is an iris image which removes iris noise and reserves the characteristic points of the internal texture of the iris.
The filtering operation module 101 may be further configured to:
after the iris image to be identified is obtained, carrying out iris positioning on the iris image to obtain an effective iris area in the iris image; normalizing the effective area of the iris to obtain a normalized iris image; and carrying out histogram equalization operation on the normalized iris image to obtain an iris image with enhanced iris texture.
Before iris positioning, quality evaluation can be performed on the iris image to judge whether its quality is sufficient for subsequent iris recognition; iris positioning is then performed on the iris image to locate the effective iris area in it; next, the effective iris area is normalized using the rubber sheet model method, so that the iris image has translation-invariant and scale-invariant characteristics; finally, a histogram equalization operation is performed on the iris image to enhance the iris texture, so that more accurate iris features can be extracted subsequently.
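The normalization step can be sketched in Python as a Daugman-style rubber sheet unwrap: the annular iris region between the pupil boundary and the iris boundary is resampled into a fixed-size rectangle. The circle parameters, output size, and nearest-neighbor sampling below are all illustrative assumptions, not details taken from the patent:

```python
import numpy as np

def rubber_sheet_normalize(eye, pupil, iris, out_h=64, out_w=512):
    """Unwrap the annular iris region between the pupil circle and the
    iris circle into an out_h x out_w rectangle, giving translation- and
    scale-invariant polar coordinates."""
    px, py, pr = pupil            # pupil circle (x, y, radius)
    ix, iy, ir = iris             # iris (limbus) circle
    theta = np.linspace(0, 2 * np.pi, out_w, endpoint=False)
    r = np.linspace(0, 1, out_h)
    # Boundary points on the pupil and iris circles for every angle.
    xp = px + pr * np.cos(theta); yp = py + pr * np.sin(theta)
    xi = ix + ir * np.cos(theta); yi = iy + ir * np.sin(theta)
    # Linear interpolation between the two boundaries along the radius.
    xs = (1 - r)[:, None] * xp[None, :] + r[:, None] * xi[None, :]
    ys = (1 - r)[:, None] * yp[None, :] + r[:, None] * yi[None, :]
    h, w = eye.shape
    xs = np.clip(np.rint(xs).astype(int), 0, w - 1)
    ys = np.clip(np.rint(ys).astype(int), 0, h - 1)
    return eye[ys, xs]            # nearest-neighbor sampling

eye = np.random.default_rng(2).integers(0, 256, (240, 320)).astype(np.uint8)
norm = rubber_sheet_normalize(eye, pupil=(160, 120, 30), iris=(160, 120, 80))
print(norm.shape)  # (64, 512)
```

Because every input eye image maps to the same rectangle, later stages (filtering, encoding, sub-block splitting) can assume a fixed resolution.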
The embodiment of the invention obtains the de-noised iris image by carrying out various filtering operations on the iris image, because the characteristics in the iris texture can not be accurately found by a single filtering operation, the noise in the iris image can be removed from different directions of the iris by adopting the combination of various filtering operations, the unnecessary interference is reduced, the iris texture is highlighted, more effective characteristic information of the iris is kept, and the full expression of the iris texture is ensured.
As an embodiment of the present invention, the filtering operation module 101 performs multiple filtering operations on the iris image by performing the following operations to obtain a denoised iris image, including:
carrying out Gaussian filtering operation on the iris image to obtain a Gaussian filtered image;
carrying out median filtering operation on the iris image to obtain a median filtering image;
carrying out bilateral filtering operation on the iris image to obtain a bilateral filtering image;
performing Gaussian Laplace filtering operation on the iris image to obtain a Laplace filtering image;
comparing the pixel gray values of the Gaussian filtered image, the median filtered image, the bilateral filtered image and the Laplace filtered image with the pixel gray value of the iris image to obtain a Gaussian difference image, a median difference image, a bilateral difference image and a Laplace difference image respectively;
and mapping the Gaussian difference image, the median difference image, the bilateral difference image and the Laplace difference image to an image with a preset dimensionality to obtain a de-noised iris image.
The Gaussian filtering operation scans each pixel in the iris image with a preset sliding convolution window and replaces the value of the central pixel with the weighted average gray value of the pixels in the neighborhood determined by the Gaussian template, thereby removing Gaussian noise points. The median filtering operation scans the pixel gray values in the iris image with a filtering window at a certain step length and replaces each point's value with the median of the values in its neighborhood, thereby eliminating isolated noise points. Compared with Gaussian filtering and median filtering, the bilateral filtering operation takes the luminosity and color differences between pixels of the iris image into account, so noise can be removed effectively while edge information in the image is preserved. The Laplacian-of-Gaussian filtering operation uses the Laplace operator to extract edge information from the iris image after Gaussian filtering has reduced the influence of noise, so that noise in the iris image can be removed accurately during denoising.
In an embodiment of the present invention, the pixel gray values of the gaussian filter image, the median filter image, the bilateral filter image, and the laplacian filter image are respectively compared with the pixel gray values of the iris image, that is, the image of the filtering operation is subtracted from the iris image, so as to respectively obtain the corresponding image from which the noise of the iris image is removed after each filtering operation.
Further, mapping the gaussian difference image, the median difference image, the bilateral difference image and the laplacian difference image into an image with a preset dimension, integrating 4 groups of difference images into one group, and setting the pixel gray value of a non-feature point in the integrated difference image to be 0, so that all points with pixel gray values of the integrated difference image not being 0 are used as stable feature points of the iris image to obtain the de-noised iris image.
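The multi-filter denoising described above can be sketched as follows. This is a reduced illustration, not the patent's exact procedure: it applies SciPy's Gaussian, median, and Laplacian-of-Gaussian filters, forms a difference image for each against the original, and keeps as stable feature points only the pixels where every difference image exceeds a small threshold, setting all other pixels to 0. The bilateral filtering step is omitted here because SciPy has no bilateral filter (OpenCV's cv2.bilateralFilter could supply it), and the threshold eps is an assumed parameter:

```python
import numpy as np
from scipy import ndimage

def multi_filter_denoise(img, eps=1.0):
    """Fuse several filtering operations: compute difference images
    between each filter's output and the original, then keep only pixels
    where all difference images agree a feature point is present."""
    img = img.astype(np.float64)
    gauss = ndimage.gaussian_filter(img, sigma=1.0)
    med = ndimage.median_filter(img, size=3)
    log = ndimage.gaussian_laplace(img, sigma=1.0)
    # Difference images: filtered output vs. original gray values.
    diffs = [np.abs(img - gauss), np.abs(img - med), np.abs(log)]
    stable = np.logical_and.reduce([d > eps for d in diffs])
    # Pixel gray values of non-feature points are set to 0.
    return np.where(stable, img, 0.0)

img = np.zeros((16, 16))
img[8, 8] = 255.0                      # a single strong feature point
out = multi_filter_denoise(img)
print(out.shape)
```

In this toy input, only pixels flagged by every difference image survive; everything else is zeroed, matching the idea of integrating the difference images into one group of stable feature points.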
The pixel gray value weighting module 102 is configured to extract a pixel gray value of the denoised iris image, and perform weighting operation on pixel gray values in different directions in the denoised iris image to obtain weighted average gray values in different directions.
In the embodiment of the invention, the pixel gray value refers to a value of the brightness degree of the recorded image; the weighted average gray value is the average value of the gray values of several adjacent pixels selected in a certain range in different directions in the denoised iris image.
In the embodiment of the invention, because each pixel on the denoised iris image is not isolated and is closely connected with the surrounding pixels, the weighted average gray value in different directions is obtained by carrying out weighting operation on the gray values of the pixels in different directions in the denoised iris image, and the integral spatial characteristic information of the iris can be more completely retained.
As an embodiment of the present invention, the pixel gray-level value weighting module 102 performs a weighting operation on pixel gray-level values in different directions in the denoised iris image by performing the following operations to obtain weighted average gray-level values in different directions, including:
selecting a plurality of adjacent pixel gray values in different directions in the denoised iris image, and calculating the adjacent pixel gray values by using a preset weighted gray value formula to obtain the weighted average gray values in different directions.
Wherein the preset weighted gray-scale value formula is as follows:
G_w = (2 P_w + P_{w+1} + P_{w-1}) / 8

wherein, the G_w represents the weighted average gray value; the P_w represents the pixel gray value at the middle position among the several adjacent pixel gray values; the P_{w+1} represents the pixel gray value above or to the right of the middle position; the P_{w-1} represents the pixel gray value below or to the left of the middle position; and the w represents the index of the position of the pixel gray value.

For example, the gray values of 6 adjacent pixels in the π/2 direction are selected as P_2, P_3, P_4, Q_4, Q_5, Q_6, where P_w = P_3 + Q_5, P_{w+1} = P_2 + Q_4, and P_{w-1} = P_4 + Q_6, and the weighted average gray value in the π/2 direction calculated with the weighted gray value formula is G_3 = [2(P_3 + Q_5) + (P_2 + Q_4) + (P_4 + Q_6)] / 8. In addition, several adjacent pixel gray values in other directions such as π, π/4 and 3π/4 can also be selected for calculation.
In an embodiment of the invention, the different directions are directions selected by a user according to requirements, generally 8 different directions are selected from a de-noised iris image, and more complete iris characteristic information can be obtained; the number of the plurality of adjacent pixel gray values is determined according to the requirement of a user.
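The directional weighting can be illustrated with the worked π/2 example above; the six pixel values here are arbitrary sample numbers used only to exercise the formula:

```python
import numpy as np

def weighted_direction_average(p_w, p_plus, p_minus):
    """Weighted average gray value for one direction:
    G_w = (2*P_w + P_{w+1} + P_{w-1}) / 8, where in the pi/2 example each
    P term is itself the sum of two neighboring pixel gray values."""
    return (2 * p_w + p_plus + p_minus) / 8

# The pi/2 example: six adjacent pixels P2, P3, P4, Q4, Q5, Q6.
P2, P3, P4, Q4, Q5, Q6 = 10, 20, 30, 40, 50, 60
g3 = weighted_direction_average(P3 + Q5, P2 + Q4, P4 + Q6)
print(g3)  # [2*(20+50) + (10+40) + (30+60)] / 8 = 35.0
```

Repeating this over the 8 chosen directions yields the set of weighted average gray values G_w that are later compared against the central pixel.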
The iris feature encoding module 103 is configured to extract a central pixel gray value of the denoised iris image, and encode the weighted average gray value and the central pixel gray value in different directions to obtain an iris feature map.
In the embodiment of the invention, the central pixel gray value refers to the pixel gray value at the middle position in the de-noised iris image; the iris feature map is an image storing iris texture features.
According to the embodiment of the invention, the weighted average gray value and the central pixel gray value in different directions are coded to obtain the iris feature map, so that the texture features of the iris in different directions can be extracted, and the central pixel of the iris is considered, so that the accurate extraction of the iris features is realized.
As an embodiment of the present invention, the iris feature encoding module 103 encodes the weighted average gray-level values and the central pixel gray-level values in different directions by performing the following operations to obtain an iris feature map, including:
calculating the difference between the weighted average gray value and the central pixel gray value in different directions respectively to obtain a plurality of difference pixel gray values;
coding the plurality of difference pixel gray values by using a preset coding rule to obtain a plurality of pixel binary values;
and outputting a plurality of pixel binary values by using a preset activation function to obtain the iris characteristic diagram.
Wherein the main function of calculating the difference between the weighted average gray value in different directions and the gray value of the central pixel is to further enhance the expression of the iris texture features.
In an embodiment of the present invention, the preset encoding rule may be the following encoding formula:
D_i = sgn(G_w − P_c)

sgn(x) = 1, if x ≥ T; sgn(x) = 0, if x < T

wherein, the D_i represents the pixel binary value corresponding to the direction of the weighted average gray value; the G_w − P_c represents the difference pixel gray values; the T represents the coding threshold; the sgn(x) represents the comparison result of the difference pixel gray values with the coding threshold; the G_w represents the weighted average gray values in different directions; and the P_c represents the central pixel gray value. The encoding rule can judge the binary values of the 8 directions adjacent to the central pixel value.
Further, a plurality of binary values of the pixels are output by using a preset activation function (such as a Mish function) to obtain the iris feature map.
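A minimal Python sketch of this encoding step, under stated assumptions: the coding threshold is taken as 0 and the binary values are passed element-wise through the Mish activation; the patent's exact threshold formula and the way it applies the activation are not reproduced in this text:

```python
import numpy as np

def mish(x):
    """Mish activation: x * tanh(softplus(x))."""
    return x * np.tanh(np.log1p(np.exp(x)))

def encode_directions(weighted, center, threshold=0.0):
    """Encode each direction's weighted average against the center pixel:
    take the difference G_w - P_c, binarize it sgn-style against an
    assumed zero threshold, then output through the activation."""
    diffs = weighted - center                       # G_w - P_c per direction
    bits = (diffs >= threshold).astype(np.float64)  # binary values D_i
    return mish(bits)                               # activation output

# Weighted averages for 8 directions and a central pixel gray value.
weighted = np.array([120., 80., 140., 90., 130., 70., 110., 95.])
code = encode_directions(weighted, center=100.0)
print((code > 0).astype(int))  # [1 0 1 0 1 0 1 0]
```

Each pixel of the feature map thus carries one bit per direction, which is why the later sub-block averaging and binarization lose little discriminative information.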
The iris recognition module 104 is configured to perform downsampling on the iris feature map to obtain an iris feature vector, calculate a hamming distance between the iris feature vector and a preset training iris feature vector, and compare the hamming distance with a preset training classification threshold to obtain a recognition result of the iris image to be recognized.
In the embodiment of the invention, the iris characteristic vector is used for describing the iris characteristics, and the down-sampling operation is carried out on the iris characteristic map to obtain the iris characteristic vector, so that the dimensionality of the iris characteristic map can be reduced, the operation is reduced and the efficiency of subsequent iris identification is improved on the premise of ensuring that the key information of the iris texture is not lost.
In the embodiment of the invention, the Hamming distance refers to the number of different pixel gray values of the iris characteristic vector and the training iris characteristic vector in the same position; the training classification threshold is a category threshold to which iris images registered in a library belong; the identification result refers to the user identity corresponding to the identified iris image, for example, the iris in the iris image is identified as the third iris.
Furthermore, by calculating the Hamming distance between the feature code of the iris feature vector and the preset training feature code and comparing the Hamming distance with the preset classification threshold value, the recognition result of the iris image to be recognized is obtained, the calculation amount can be reduced while the key feature information of the iris is kept, the iris recognition efficiency is improved, and the iris recognition accuracy is improved by carrying out the iris recognition through the Hamming distance.
As an embodiment of the present invention, the iris identification module 104 performs down-sampling on the iris feature map to obtain an iris feature vector by performing the following operations, including:
splitting the iris feature map into a plurality of sub-blocks, and reading the average gray value of each sub-block; and converting the average gray values of the sub-blocks into sub-block binary values, and connecting the sub-block binary values to obtain the iris feature vector.
The iris feature map can be divided into m × n sub-blocks. This non-overlapping division of the feature map preserves its overall structural information: the smaller the sub-block, the higher the recognition rate, but the longer the computation time, and when the sub-block size becomes too small the recognition rate drops again. Preferably, the sub-block size can be 16 × 8. Further, the average gray value of sub-block α (1 ≤ α ≤ m × n) can be denoted sub_α.
In an embodiment of the present invention, the binary values of the sub-blocks may be connected by the following formula to obtain an iris feature vector:
sub_β = [sub_mean(1), sub_mean(2), …, sub_mean(m × n)]

wherein, the sub_β represents the iris feature vector, and the sub_mean represents the sub-block binary values.
Further, the calculating the hamming distance between the iris feature vector and a preset training iris feature vector includes:
reading a training template of the training iris feature vector and an iris template of the iris feature vector; and calculating the training template and the iris template to obtain the Hamming distance between the training iris characteristic vector and the iris characteristic vector.
The training template refers to binary feature codes of training iris feature vectors, and the iris template refers to binary feature codes of the iris feature vectors.
In an embodiment of the present invention, the training template and the iris template may be calculated by the following formula:
H = (1 / (m × n)) Σ_{i=1}^{m×n} (A_i ⊕ B_i) ∩ A_mask ∩ B_mask

wherein, the H represents the Hamming distance; the A_i represents the iris feature vector; the B_i represents the training iris feature vector; the A_mask represents the training template; the B_mask represents the iris template; and the m × n represents the number of split sub-blocks.
In an embodiment of the invention, a smaller Hamming distance indicates that the iris template and the training template match more closely, and that the extracted features are more stable.
Further, the Hamming distance is compared with a preset training classification threshold: when the Hamming distance is smaller than the training classification threshold, the recognition result is that the iris image to be recognized and the training iris belong to the same category; when the Hamming distance is greater than or equal to the training classification threshold, the recognition result is that the iris image to be recognized and the training iris do not belong to the same category.
In an optional embodiment of the invention, through comparison between the hamming distance and the training classification threshold, whether the identity of the user is registered can be determined, and in iris recognition applications, such as entrance guard recognition applications, entrance of strangers can be effectively blocked, and the application safety is improved.
In the embodiment of the invention, firstly, multiple filtering operations are performed on the iris image to obtain the de-noised iris image, so that noise can be removed from different directions of the iris image and the key texture features in the iris can be found to the greatest extent, which improves the accuracy of subsequent iris feature extraction; secondly, the pixel gray values in different directions in the de-noised iris image are weighted to obtain weighted average gray values, and the weighted average gray values in different directions and the central pixel gray value are encoded to obtain an iris feature map, so that the texture features of the iris in different directions can be extracted while the central pixel of the iris is also considered, which retains the overall spatial information of the iris and realizes accurate extraction of the iris features; finally, the iris feature map is down-sampled, which reduces the amount of calculation and improves the iris recognition efficiency while keeping the key feature information of the iris, and performing the recognition through the Hamming distance improves its accuracy. Therefore, the iris recognition device provided by the embodiment of the invention can improve the efficiency and accuracy of iris recognition.
Fig. 5 is a schematic structural diagram of an electronic device for implementing the iris recognition method according to the present invention.
The electronic device may comprise a processor 10, a memory 11, a communication bus 12 and a communication interface 13, and may further comprise a computer program, such as an iris recognition program, stored in the memory 11 and operable on the processor 10.
The memory 11 includes at least one type of media, which includes flash memory, removable hard disk, multimedia card, card type memory (e.g., SD or DX memory, etc.), magnetic memory, local magnetic disk, optical disk, etc. The memory 11 may in some embodiments be an internal storage unit of the electronic device, for example a removable hard disk of the electronic device. The memory 11 may also be an external storage device of the electronic device in other embodiments, such as a plug-in mobile hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card), and the like, provided on the electronic device. Further, the memory 11 may also include both an internal storage unit and an external storage device of the electronic device. The memory 11 may be used not only to store application software installed in the electronic device and various types of data, such as codes of an iris recognition program, etc., but also to temporarily store data that has been output or will be output.
The processor 10 may in some embodiments be composed of an integrated circuit, for example a single packaged integrated circuit, or of a plurality of integrated circuits packaged with the same or different functions, including one or more Central Processing Units (CPUs), microprocessors, digital processing chips, graphics processors, and combinations of various control chips. The processor 10 is the control unit of the electronic device: it connects the various components of the whole electronic device through various interfaces and lines, and executes the various functions of the electronic device and processes data by running or executing the programs or modules stored in the memory 11 (e.g., the iris recognition program) and calling the data stored in the memory 11.
The communication bus 12 may be a Peripheral Component Interconnect (PCI) bus or an Extended Industry Standard Architecture (EISA) bus. The bus may be divided into an address bus, a data bus, a control bus, and so on. The communication bus 12 is arranged to enable connection communication between the memory 11 and the at least one processor 10, among others. For ease of illustration, only one thick line is shown, but this does not mean that there is only one bus or one type of bus.
Fig. 5 shows only an electronic device having components, and those skilled in the art will appreciate that the structure shown in fig. 5 does not constitute a limitation of the electronic device, and may include fewer or more components than those shown, or some components may be combined, or a different arrangement of components.
For example, although not shown, the electronic device may further include a power supply (such as a battery) for supplying power to each component, and preferably, the power supply may be logically connected to the at least one processor 10 through a power management device, so that functions of charge management, discharge management, power consumption management and the like are realized through the power management device. The power supply may also include any component of one or more dc or ac power sources, recharging devices, power failure detection circuitry, power converters or inverters, power status indicators, and the like. The electronic device may further include various sensors, a bluetooth module, a Wi-Fi module, and the like, which are not described herein again.
Optionally, the communication interface 13 may include a wired interface and/or a wireless interface (e.g., WI-FI interface, bluetooth interface, etc.), which is generally used to establish a communication connection between the electronic device and other electronic devices.
Optionally, the communication interface 13 may further include a user interface, which may be a Display (Display), an input unit (such as a Keyboard (Keyboard)), and optionally, a standard wired interface, or a wireless interface. Alternatively, in some embodiments, the display may be an LED display, a liquid crystal display, a touch-sensitive liquid crystal display, an OLED (Organic Light-Emitting Diode) touch device, or the like. The display, which may also be referred to as a display screen or display unit, is suitable, among other things, for displaying information processed in the electronic device and for displaying a visualized user interface.
It is to be understood that the described embodiments are for purposes of illustration only and that the scope of the appended claims is not limited to such structures.
The iris recognition program stored in the memory 11 of the electronic device is a combination of a plurality of computer programs, which when run in the processor 10, may implement:
acquiring an iris image to be recognized, and performing various filtering operations on the iris image to obtain a de-noised iris image;
extracting pixel gray values of the de-noised iris image, and performing weighting operation on the pixel gray values in different directions in the de-noised iris image to obtain weighted average gray values in different directions;
extracting a central pixel gray value of the denoised iris image, and coding the weighted average gray value and the central pixel gray value in different directions to obtain an iris feature map;
and performing down-sampling operation on the iris feature map to obtain an iris feature vector, calculating a Hamming distance between the iris feature vector and a preset training iris feature vector, and comparing the Hamming distance with a preset training classification threshold to obtain a recognition result of the iris image to be recognized.
Specifically, the processor 10 may refer to the description of the relevant steps in the embodiment corresponding to fig. 1 for a specific implementation method of the computer program, which is not described herein again.
Further, if the integrated module/unit of the electronic device is implemented in the form of a software functional unit and sold or used as an independent product, it may be stored in a computer-readable medium. The computer-readable medium may be non-volatile or volatile, and may include: any entity or device capable of carrying the computer program code, a recording medium, a USB flash drive, a removable hard disk, a magnetic disk, an optical disk, a computer memory, and a Read-Only Memory (ROM).
Embodiments of the present invention may also provide a computer-readable storage medium, where the computer-readable storage medium stores a computer program, and when the computer program is executed by a processor of an electronic device, the computer program may implement:
acquiring an iris image to be recognized, and performing various filtering operations on the iris image to obtain a de-noised iris image;
extracting pixel gray values of the de-noised iris image, and performing weighting operation on the pixel gray values in different directions in the de-noised iris image to obtain weighted average gray values in different directions;
extracting a central pixel gray value of the denoised iris image, and coding the weighted average gray value and the central pixel gray value in different directions to obtain an iris feature map;
and performing down-sampling operation on the iris feature map to obtain an iris feature vector, calculating a Hamming distance between the iris feature vector and a preset training iris feature vector, and comparing the Hamming distance with a preset training classification threshold to obtain a recognition result of the iris image to be recognized.
Further, the computer-readable storage medium may mainly include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required for at least one function, and the like; the storage data area may store data created according to the use of the blockchain node, and the like.
In the embodiments provided by the present invention, it should be understood that the disclosed media, devices, apparatuses and methods may be implemented in other manners. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the modules is only one logical functional division, and other divisions may be realized in practice.
The modules described as separate parts may or may not be physically separate, and parts displayed as modules may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of the present embodiment.
In addition, functional modules in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, or in a form of hardware plus a software functional module.
It will be evident to those skilled in the art that the invention is not limited to the details of the foregoing illustrative embodiments, and that the present invention may be embodied in other specific forms without departing from the spirit or essential attributes thereof.
The present embodiments are therefore to be considered in all respects as illustrative and not restrictive, the scope of the invention being indicated by the appended claims rather than by the foregoing description, and all changes which come within the meaning and range of equivalency of the claims are therefore intended to be embraced therein. Any reference signs in the claims shall not be construed as limiting the claim concerned.
The block chain is a novel application mode of computer technologies such as distributed data storage, point-to-point transmission, a consensus mechanism, an encryption algorithm and the like. A block chain (Blockchain), which is essentially a decentralized database, is a series of data blocks associated by using a cryptographic method, and each data block contains information of a batch of network transactions, so as to verify the validity (anti-counterfeiting) of the information and generate a next block. The blockchain may include a blockchain underlying platform, a platform product service layer, an application service layer, and the like.
Furthermore, it is obvious that the word "comprising" does not exclude other elements or steps, and the singular does not exclude the plural. A plurality of units or means recited in the system claims may also be implemented by one unit or means in software or hardware. Terms such as first and second are used to denote names rather than any particular order.
Finally, it should be noted that the above embodiments merely illustrate the technical solutions of the present invention and do not limit them; although the present invention has been described in detail with reference to the preferred embodiments, those skilled in the art should understand that modifications or equivalent substitutions may be made to these technical solutions without departing from their spirit and scope.

Claims (10)

1. An iris identification method, comprising:
acquiring an iris image to be recognized, and performing a plurality of filtering operations on the iris image to obtain a de-noised iris image;
extracting pixel gray values of the de-noised iris image, and performing a weighting operation on the pixel gray values in different directions in the de-noised iris image to obtain weighted average gray values in different directions;
extracting a central pixel gray value of the denoised iris image, and coding the weighted average gray value and the central pixel gray value in different directions to obtain an iris feature map;
and performing down-sampling operation on the iris feature map to obtain an iris feature vector, calculating a Hamming distance between the iris feature vector and a preset training iris feature vector, and comparing the Hamming distance with a preset training classification threshold to obtain a recognition result of the iris image to be recognized.
2. The iris identification method as claimed in claim 1, wherein said encoding the weighted average gray values in different directions and the central pixel gray value to obtain an iris feature map comprises:
calculating the difference between the weighted average gray value and the central pixel gray value in different directions respectively to obtain a plurality of difference pixel gray values;
coding the plurality of difference pixel gray values by using a preset coding rule to obtain a plurality of pixel binary values;
and outputting a plurality of pixel binary values by using a preset activation function to obtain the iris characteristic diagram.
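The encoding in claim 2 resembles a multi-directional local binary pattern: each directional weighted mean is differenced against the center pixel and thresholded. A minimal sketch, assuming the coding rule is a sign threshold and the activation is a unit step (both choices, and the function name `encode_directions`, are illustrative — the patent does not disclose the exact rule):

```python
import numpy as np

def encode_directions(weighted_means, center_gray):
    """Encode weighted average gray values against the center pixel.

    weighted_means: one weighted average gray value per direction.
    center_gray:    gray value of the center pixel.
    Returns one binary value per direction.
    """
    # Step 1: difference between each directional mean and the center pixel.
    diffs = np.asarray(weighted_means, dtype=float) - float(center_gray)
    # Steps 2-3: a step "activation" maps each difference to a bit
    # (1 where the directional mean is >= the center pixel, else 0).
    return (diffs >= 0).astype(np.uint8)
```

Applied at every pixel, the per-direction bits form the iris feature map; e.g. `encode_directions([120, 90, 130, 85], 100)` yields the code `[1, 0, 1, 0]`.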
3. The iris identification method as claimed in claim 1, wherein said performing a weighting operation on the pixel gray values in different directions in the de-noised iris image to obtain weighted average gray values in different directions comprises:
selecting a plurality of adjacent pixel gray values in different directions in the denoised iris image, and calculating the adjacent pixel gray values by using a preset weighted gray value formula to obtain the weighted average gray values in different directions.
Wherein the preset weighted gray-scale value formula is as follows:
Figure FDA0003712807130000011
wherein G_w represents the weighted average gray value; P_w represents the pixel gray value at the intermediate position among the plurality of adjacent pixel gray values; P_{w+1} represents the pixel gray value above or to the right of the intermediate position; P_{w-1} represents the pixel gray value below or to the left of the intermediate position; and w represents the index of the position at which the pixel gray value is located.
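Claim 3's weighted average over three adjacent pixels (P_{w-1}, P_w, P_{w+1}) can be sketched as below. The exact formula appears only as an image in the publication, so the center-heavy [1, 2, 1]/4 weights and the four sampled directions are assumptions:

```python
import numpy as np

def directional_weighted_means(img, r, c):
    """Weighted average gray values in four directions around pixel (r, c).

    For each direction the three adjacent pixels P_{w-1}, P_w, P_{w+1}
    are combined with assumed center-heavy weights [1, 2, 1] / 4.
    """
    img = np.asarray(img, dtype=float)
    w = np.array([1.0, 2.0, 1.0]) / 4.0  # illustrative weights
    directions = {
        "horizontal": img[r, c - 1:c + 2],
        "vertical":   img[r - 1:r + 2, c],
        "diag_main":  np.array([img[r - 1, c - 1], img[r, c], img[r + 1, c + 1]]),
        "diag_anti":  np.array([img[r - 1, c + 1], img[r, c], img[r + 1, c - 1]]),
    }
    return {name: float(w @ pixels) for name, pixels in directions.items()}
```

On the 3x3 patch `[[1,2,3],[4,5,6],[7,8,9]]` centered at (1, 1), every direction averages to 5.0, since each neighbor pair is symmetric about the center value.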
4. The iris identification method of claim 1, wherein the down-sampling operation of the iris feature map to obtain an iris feature vector comprises:
splitting the iris feature map into a plurality of sub-blocks, and reading the average gray value of each sub-block;
and converting the average gray values of the sub-blocks into sub-block binary values, and concatenating the sub-block binary values to obtain the iris feature vector.
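The down-sampling in claim 4 can be sketched as block-averaging followed by binarization; the sub-block size and the binarization threshold are illustrative choices, not values from the patent:

```python
import numpy as np

def feature_map_to_vector(feature_map, block=4, threshold=0.5):
    """Down-sample an iris feature map into a binary feature vector.

    Split the map into block x block sub-blocks, take each sub-block's
    average gray value, binarize it against `threshold`, and concatenate.
    """
    fm = np.asarray(feature_map, dtype=float)
    h, w = fm.shape
    bits = []
    for i in range(0, h - h % block, block):
        for j in range(0, w - w % block, block):
            bits.append(1 if fm[i:i + block, j:j + block].mean() >= threshold else 0)
    return np.array(bits, dtype=np.uint8)
```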
5. The iris identification method as claimed in claim 1, wherein said calculating a Hamming distance between the iris feature vector and a preset training iris feature vector comprises:
reading a training template of the training iris feature vector and an iris template of the iris feature vector;
and comparing the training template with the iris template to obtain the Hamming distance between the training iris feature vector and the iris feature vector.
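Claims 1 and 5 match templates by Hamming distance. A normalized per-bit form is shown here; the 0.32 classification threshold is purely illustrative, standing in for the patent's preset training classification threshold:

```python
import numpy as np

def hamming_distance(template_a, template_b):
    """Normalized Hamming distance between two binary iris templates."""
    a = np.asarray(template_a, dtype=np.uint8)
    b = np.asarray(template_b, dtype=np.uint8)
    return float(np.count_nonzero(a != b)) / a.size

def classify(distance, threshold=0.32):
    """Compare against a trained classification threshold (value assumed)."""
    return "match" if distance <= threshold else "no match"
```

Two templates differing in 2 of 4 bits give a distance of 0.5, which would be rejected under this example threshold.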
6. The iris identification method as claimed in claim 1, wherein said performing a plurality of filtering operations on said iris image to obtain a denoised iris image comprises:
performing a Gaussian filtering operation on the iris image to obtain a Gaussian filtered image;
performing a median filtering operation on the iris image to obtain a median filtered image;
performing a bilateral filtering operation on the iris image to obtain a bilateral filtered image;
performing a Laplacian-of-Gaussian filtering operation on the iris image to obtain a Laplace filtered image;
comparing the pixel gray values of the Gaussian filtered image, the median filtered image, the bilateral filtered image and the Laplace filtered image with the pixel gray value of the iris image to obtain a Gaussian difference image, a median difference image, a bilateral difference image and a Laplace difference image respectively;
and mapping the Gaussian difference image, the median difference image, the bilateral difference image and the Laplace difference image to an image with a preset dimensionality to obtain a de-noised iris image.
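A sketch of the multi-filter denoising in claim 6, using `scipy.ndimage` for the Gaussian, median, and Laplacian-of-Gaussian filters plus a small brute-force bilateral filter. The claim does not specify how the four difference images are mapped to an image of preset dimensionality, so stacking them channel-wise and fusing the smoothed outputs by averaging are assumptions:

```python
import numpy as np
from scipy.ndimage import gaussian_filter, median_filter, gaussian_laplace

def bilateral(img, sigma_s=1.0, sigma_r=25.0, radius=2):
    """Brute-force bilateral filter (parameter values illustrative)."""
    img = np.asarray(img, dtype=float)
    out = np.zeros_like(img)
    h, w = img.shape
    for i in range(h):
        for j in range(w):
            i0, i1 = max(i - radius, 0), min(i + radius + 1, h)
            j0, j1 = max(j - radius, 0), min(j + radius + 1, w)
            patch = img[i0:i1, j0:j1]
            yy, xx = np.mgrid[i0:i1, j0:j1]
            # spatial closeness times gray-level similarity
            wgt = (np.exp(-((yy - i) ** 2 + (xx - j) ** 2) / (2 * sigma_s ** 2))
                   * np.exp(-((patch - img[i, j]) ** 2) / (2 * sigma_r ** 2)))
            out[i, j] = (wgt * patch).sum() / wgt.sum()
    return out

def denoise_iris(img):
    """Four filtered images, their differences from the input, and a fused
    output (channel-wise stacking and mean fusion are assumptions)."""
    img = np.asarray(img, dtype=float)
    smoothed = [gaussian_filter(img, sigma=1.0),
                median_filter(img, size=3),
                bilateral(img)]
    log_img = gaussian_laplace(img, sigma=1.0)
    # One difference image per filter, stacked into a fixed-dimension array.
    diffs = np.stack([f - img for f in smoothed + [log_img]], axis=-1)
    denoised = np.mean(smoothed, axis=0)  # simple fusion of the smoothers
    return diffs, denoised
```

On a constant image all three smoothers return the input unchanged, so the fused output equals the input and the stacked difference array has one channel per filter.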
7. The iris identification method as claimed in claim 1, wherein after the acquiring of the iris image to be recognized, the method further comprises:
performing iris positioning on the iris image to obtain an effective iris region in the iris image;
normalizing the effective iris region to obtain a normalized iris image;
and carrying out histogram equalization operation on the normalized iris image to obtain an iris image with enhanced iris texture.
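The histogram equalization in claim 7 is a standard texture-enhancement operation; a compact version for an 8-bit normalized iris image:

```python
import numpy as np

def equalize_histogram(img):
    """Histogram equalization for an 8-bit iris image."""
    img = np.asarray(img, dtype=np.uint8)
    hist = np.bincount(img.ravel(), minlength=256)
    cdf = hist.cumsum()
    cdf_min = cdf[cdf > 0][0]  # cumulative count at first occupied level
    # Spread the cumulative distribution over the full 0..255 range.
    lut = np.clip(np.round((cdf - cdf_min) / max(img.size - cdf_min, 1) * 255),
                  0, 255).astype(np.uint8)
    return lut[img]
```

For example, the four-pixel image `[[0, 64], [128, 255]]` maps to `[[0, 85], [170, 255]]`, evenly spreading the occupied gray levels.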
8. An iris recognition apparatus, comprising:
the filtering operation module is used for acquiring an iris image to be identified and performing a plurality of filtering operations on the iris image to obtain a de-noised iris image;
the pixel gray value weighting module is used for extracting pixel gray values of the de-noised iris image and performing a weighting operation on the pixel gray values in different directions in the de-noised iris image to obtain weighted average gray values in different directions;
the iris feature coding module is used for extracting a central pixel gray value of the de-noised iris image, and coding the weighted average gray value and the central pixel gray value in different directions to obtain an iris feature map;
and the iris identification module is used for carrying out down-sampling operation on the iris characteristic map to obtain an iris characteristic vector, calculating the Hamming distance between the iris characteristic vector and a preset training iris characteristic vector, and comparing the Hamming distance with a preset training classification threshold value to obtain an identification result of the iris image to be identified.
9. An electronic device, characterized in that the electronic device comprises:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein
the memory stores a computer program executable by the at least one processor to enable the at least one processor to perform the iris recognition method as claimed in any one of claims 1 to 7.
10. A computer-readable storage medium, in which a computer program is stored which, when being executed by a processor, carries out an iris recognition method according to any one of claims 1 to 7.
CN202210729974.8A 2022-06-24 2022-06-24 Iris identification method, device, equipment and storage medium Pending CN114944006A (en)


Publications (1)

Publication Number Publication Date
CN114944006A true CN114944006A (en) 2022-08-26

Family

ID=82910665


Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108133187A (en) * 2017-12-22 2018-06-08 Jilin University One-to-one iris identification method based on scale-variation-invariant features and multi-algorithm voting


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
ZHU Xiaodong et al.: "Iris recognition based on multi-directional local binary patterns and stable features", Journal of Jilin University (Engineering and Technology Edition), vol. 51, no. 2, 31 March 2021 (2021-03-31), pages 650-658 *


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination