CN112949518B - Iris image processing method, device, equipment and storage medium - Google Patents

Iris image processing method, device, equipment and storage medium

Info

Publication number
CN112949518B
Authority
CN
China
Prior art keywords: target, iris, area, determining, pupil
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110257732.9A
Other languages
Chinese (zh)
Other versions
CN112949518A (en)
Inventor
王清涛
陈园园
李嘉扬
Current Assignee
Shanghai Irisian Optronics Technology Co ltd
Original Assignee
Shanghai Irisian Optronics Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Shanghai Irisian Optronics Technology Co ltd filed Critical Shanghai Irisian Optronics Technology Co ltd
Priority to CN202110257732.9A
Publication of CN112949518A
Application granted
Publication of CN112949518B
Legal status: Active

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 - Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 - Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/18 - Eye characteristics, e.g. of the iris
    • G06V40/19 - Sensors therefor
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 - Arrangements for image or video recognition or understanding
    • G06V10/40 - Extraction of image or video features
    • G06V10/44 - Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 - Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 - Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/18 - Eye characteristics, e.g. of the iris
    • G06V40/193 - Preprocessing; Feature extraction

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Ophthalmology & Optometry (AREA)
  • Human Computer Interaction (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Eye Examination Apparatus (AREA)
  • Image Analysis (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)

Abstract

The embodiment of the invention discloses an iris image processing method, device, equipment and storage medium, wherein the method comprises the following steps: acquiring a target image and determining a target to-be-processed area comprising a pupil area and an iris area in the target image; determining a target processing mode for processing the target to-be-processed area according to the area ratio of the pupil area to the iris area in the target to-be-processed area; processing each piece of position information in a target conversion area based on the target processing mode, and determining the target iris position information of each piece of position information in the target to-be-processed area; and determining a normalized iris texture image corresponding to the iris area according to the pixel information of the target iris position information corresponding to each piece of position information. The technical scheme of the embodiment of the invention improves the accuracy of nonlinear normalization of the iris region texture, thereby improving the accuracy of identity recognition.

Description

Iris image processing method, device, equipment and storage medium
Technical Field
The embodiment of the invention relates to a biological feature recognition technology, in particular to an iris image processing method, an iris image processing device, iris image processing equipment and a storage medium.
Background
Biometric identification uses the physiological and morphological characteristics inherent to the human body as an identification medium to uniquely identify a person and perform identity authentication. A commonly used biometric modality is iris recognition, which is characterized by lifelong invariance and inter-individual variability and therefore offers higher accuracy than other modalities such as fingerprint, face, palmprint, and voice recognition.
When an iris image is processed, it must be calibrated or normalized to facilitate subsequent feature extraction and feature matching, thereby enabling identity authentication. At present, iris image normalization mainly consists of linearly expanding the annular iris region into a rectangular region of fixed size to obtain an iris texture image. Identification based on this approach has the following problem: because the brightness of the external environment differs between captures, the pupil automatically dilates or contracts to adapt, so normalizing every iris image in the same way degrades the accuracy of biometric recognition.
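The conventional linear unwrapping described above can be sketched as a short routine. This is a minimal illustration, not the patent's implementation: it assumes concentric pupil and iris circles, nearest-neighbour sampling, and an image stored as a list of rows.

```python
import math

def normalize_linear(image, pupil_center, pupil_radius, iris_radius,
                     out_height=64, out_width=360):
    """Linearly unwrap the annular iris region into a fixed-size rectangle.

    Pure-Python sketch of linear rubber-sheet normalization, assuming
    concentric pupil/iris circles; `image` is indexed as image[y][x].
    """
    cx, cy = pupil_center
    height, width = len(image), len(image[0])
    out = [[0] * out_width for _ in range(out_height)]
    for col in range(out_width):
        theta = 2.0 * math.pi * col / out_width
        for row in range(out_height):
            # Radius varies linearly from the pupil edge to the iris edge.
            r = pupil_radius + (iris_radius - pupil_radius) * row / (out_height - 1)
            x = int(round(cx + r * math.cos(theta)))
            y = int(round(cy + r * math.sin(theta)))
            if 0 <= y < height and 0 <= x < width:
                out[row][col] = image[y][x]
    return out
```

This is exactly the "same way for every image" scheme the paragraph criticizes: the radial interpolation is linear regardless of pupil state.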
Disclosure of Invention
The embodiment of the invention provides an iris image processing method, device, equipment and storage medium, which perform normalization processing on the iris region corresponding to the current pupil state, so as to improve the accuracy of nonlinear normalization of the iris region and thereby improve the accuracy of identity recognition.
In a first aspect, an embodiment of the present invention provides an iris image processing method, including:
acquiring a target image and determining a target to-be-processed area comprising a pupil area and an iris area in the target image;
determining a target processing mode for processing the target to-be-processed region according to the region ratio of the pupil region to the iris region in the target to-be-processed region;
processing each position information in a target conversion area based on the target processing mode, and determining target iris position information of each position information in the target to-be-processed area;
and determining a normalized iris texture image corresponding to the iris region according to the pixel information of the target iris position information corresponding to each position information.
In a second aspect, an embodiment of the present invention further provides an iris image processing apparatus, including:
the target to-be-processed area determining module is used for acquiring a target image and determining a target to-be-processed area comprising a pupil area and an iris area in the target image;
the target processing mode determining module is used for determining a target processing mode for processing the target to-be-processed area according to the area ratio of the pupil area to the iris area in the target to-be-processed area;
The target iris position information determining module is used for processing each position information in the target conversion area based on the target processing mode and determining target iris position information of each position information in the target to-be-processed area;
and the iris texture image determining module is used for determining a normalized iris texture image corresponding to the iris region according to the pixel information of the target iris position information corresponding to each position information.
In a third aspect, an embodiment of the present invention further provides an image processing apparatus that performs the iris image processing method, the image processing apparatus including:
one or more processors;
storage means for storing one or more programs,
the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the iris image processing method according to any of the embodiments of the present invention.
In a fourth aspect, an embodiment of the present invention further provides a computer readable storage medium having stored thereon a computer program which, when executed by a processor, implements an iris image processing method according to any of the embodiments of the present invention.
According to the technical scheme of this embodiment, the target to-be-processed area in the target image is determined, and the target processing mode for processing it is determined according to the area ratio of the pupil area to the iris area, so that images captured under different pupil states are processed differently. Each piece of position information in the target conversion area is processed based on the target processing mode to determine its target iris position information in the target to-be-processed area, and the normalized iris texture image corresponding to the iris area is determined from the pixel information at those positions. This solves the problems in the prior art that, when a single processing mode is used under different pupil states, the converted iris texture image differs from the actually acquired iris texture, recognition accuracy is low, a user may need several acquisition attempts before recognition succeeds, and user experience is poor. By normalizing the iris region in a manner matched to the pupil state, the accuracy of nonlinear normalization of the iris region texture is improved, and the accuracy of identity recognition is improved accordingly.
Drawings
In order to more clearly illustrate the technical solutions of the exemplary embodiments of the present invention, the drawings required for describing the embodiments are briefly introduced below. Obviously, the drawings described below cover only some of the embodiments of the invention, and a person skilled in the art can derive other drawings from them without inventive effort.
Fig. 1 is a flowchart of an iris image processing method according to an embodiment of the invention;
FIG. 2 is a schematic diagram of a target conversion area according to an embodiment of the invention;
FIG. 3 is a schematic view of iris ring width according to one embodiment of the present invention;
FIG. 4 is a schematic diagram of a Cartesian coordinate system and a polar coordinate system according to an embodiment of the present invention;
fig. 5 is a flowchart of an iris image processing method according to a second embodiment of the present invention;
fig. 6 is a flowchart of an iris image processing method according to a third embodiment of the present invention;
fig. 7 is a schematic diagram of three iris widths to be determined when the pupil state is an enlarged state according to a third embodiment of the present invention;
Fig. 8 is a schematic diagram of three iris widths to be determined when the pupil state is in a contracted state according to a third embodiment of the present invention;
fig. 9 is a schematic diagram of a linear iris width corresponding to each iris region to be used when the pupil state is in an enlarged state according to the third embodiment of the present invention;
fig. 10 is a schematic diagram of a linear iris width corresponding to each iris region to be used when the pupil state is in a contracted state according to the third embodiment of the present invention;
fig. 11 is a schematic structural diagram of an iris image processing apparatus according to a fifth embodiment of the present invention;
fig. 12 is a schematic diagram of an image processing apparatus for performing an iris image processing method according to a sixth embodiment of the present invention.
Detailed Description
The invention is described in further detail below with reference to the drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the invention and are not limiting thereof. It should be further noted that, for convenience of description, only some, but not all of the structures related to the present invention are shown in the drawings.
Example 1
Fig. 1 is a flowchart of an iris image processing method according to an embodiment of the present invention. The method is applicable to processing iris images captured under different pupil states and may be performed by an iris image processing apparatus, which may be implemented in software and/or hardware; the hardware may be an electronic device, optionally a mobile terminal or the like.
The technical scheme of the embodiment of the invention can be applied to an identity recognition device based on iris recognition, for example: the method can be applied to an identity recognition device in an access control, an identity verification device for elevator floor recognition and the like.
As shown in fig. 1, the method of this embodiment specifically includes the following steps:
s110, acquiring a target image, and determining a target to-be-processed area comprising a pupil area and an iris area in the target image.
The target image may be an image including face information of a user, and the target area to be processed may be an area formed by a pupil and an iris.
Optionally, the target image including the face information of the user is acquired by an image acquisition device, and the target to-be-processed area comprising the pupil area and the iris area is extracted from the target image.
Specifically, when a biological information recognition device is used to recognize the identity information of a user, an image acquisition device, for example a camera, can be used to capture an image of the user's face so as to obtain the target image. Further, the target image can be processed with a human-eye positioning technique or an iris positioning technique to extract the target to-be-processed area comprising the pupil area and the iris area.
S120, determining a target processing mode for processing the target to-be-processed area according to the area ratio of the pupil area to the iris area in the target to-be-processed area.
The area ratio may be the ratio of the area of the pupil region to that of the iris region, or the ratio of their radii. The target processing mode is the processing mode corresponding to the determined area ratio.
In this embodiment, the target processing mode may be determined according to the area ratio of the pupil area to the iris area; that is, a corresponding target processing mode is determined based on the ratio of the radius of the pupil area to the radius of the target area to be processed.
Optionally, determining a pupil radius of the pupil region and a total radius of the target region to be processed; determining a pupil state of the pupil based on the pupil radius and the total radius; and determining a target processing mode for processing the target to-be-processed area according to the pupil state.
Wherein the pupil is approximately circular, so the pupil area is approximately a circular disc, and the pupil radius may be the radius of the pupil area; the total radius may be the radius of the target area to be processed. The pupil state may include at least three states, optionally a dilated state, a normal state, and a contracted state; in any actual scenario the pupil is in exactly one of these three states. The target processing modes corresponding to different pupil states differ; optionally, they comprise a first target processing mode for a pupil in the normal state, a second target processing mode for a pupil in the dilated state, and a third target processing mode for a pupil in the contracted state.
Specifically, the ratio of the total radius to the pupil radius can be determined according to the total radius of the target area to be processed and the pupil radius of the pupil area. And according to the corresponding relation between the ratio and the pupil state, determining a target processing mode corresponding to the target to-be-processed area so as to process the target to-be-processed area.
In this embodiment, the correspondence between the radius ratio and the pupil state may be: if the ratio between the total radius and the pupil radius is within a first preset range, the pupil state of the pupil information is a contracted state; if the ratio between the total radius and the pupil radius is within a second preset range, the pupil state of the pupil information is a normal state; if the ratio between the total radius and the pupil radius is within the third preset range, the pupil state of the pupil information is a dilated state.
In this embodiment, the advantage of determining the pupil state is that the iris region can be processed according to the pupil state by adopting a corresponding processing mode, so as to improve the accuracy of iris normalization processing, and further improve the technical effect of identity recognition accuracy.
It should be noted that pupil constriction is governed by contraction of the annular sphincter muscle surrounding the pupil, while pupil dilation is governed by contraction of the radial dilator muscle of the iris region. The pupil state, dilated or contracted, is therefore governed by two different sets of muscle fibers and also depends to some extent on the external environment. Because these muscle fibers are unevenly distributed in the iris area, the iris region is stressed differently when they contract, so the specific treatment mode of the iris area is determined in combination with the pupil state. Typically, the overall iris (including the pupil) is about 11-12 mm in diameter, and the pupil is about 2.5-4 mm in diameter.
On the basis of the above data, a preset range corresponding to each pupil state may be determined. For example: let the total radius be rI and the pupil radius be rP, and let the ratio between them be rIpr = rI / rP. The first preset range is rIpr > 4.8 (the pupil area is small, which can be understood as the pupil being in a contracted state), the second preset range is 2.75 ≤ rIpr ≤ 4.8 (the pupil area is normal, which can be understood as the pupil being in a normal state), and the third preset range is rIpr < 2.75 (the pupil area is large, which can be understood as the pupil being in a dilated state). It should be noted that the specific setting of each preset range may be chosen according to actual data and is not specifically limited in this embodiment.
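The classification by radius ratio can be sketched as follows. `pupil_state` is an illustrative name, and the default thresholds are the example values 2.75 and 4.8 given in the text (the patent leaves the concrete ranges open).

```python
def pupil_state(total_radius, pupil_radius, lower=2.75, upper=4.8):
    """Classify the pupil state from the radius ratio rIpr = rI / rP.

    Example thresholds from the text: ratio > 4.8 means a small pupil
    (contracted), ratio < 2.75 a large pupil (dilated), otherwise normal.
    """
    ratio = total_radius / pupil_radius
    if ratio > upper:
        return "contracted"
    if ratio < lower:
        return "dilated"
    return "normal"
```

Each returned state would then select the first, second, or third target processing mode.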
It should be noted that, when the iris region is processed in a corresponding manner, the processing may be performed in combination with a specific width of the iris region. In this embodiment, the iris ring width is determined as follows:
according to the length of the target conversion area, determining each point to be processed on the edge line of the pupil area, and determining the point to be used corresponding to each point to be processed on the edge line of the iris area; determining the iris width of the point corresponding to each point to be processed according to the position information of each point to be processed and the position information of the corresponding point to be used; and determining the iris ring width of the iris region according to the iris width of each point.
The target conversion area is a rectangular area corresponding to the normalized iris area; for example, it may be a preset rectangular area, as shown in the schematic diagram of fig. 2, to which the points of the iris area are mapped during normalization. In iris recognition algorithms, iris image normalization is based on the Rubber-Sheet model proposed by John Daugman, in which the iris region is normalized and unfolded into a rectangular region. The length refers to the length of this rectangular area. Each point on the pupil area edge line can be used as a point to be processed; alternatively, the pupil area edge line can be equally divided into a number of parts matching the length of the target conversion area, each division point being a point to be processed. The point to be used is the point corresponding to a point to be processed and lies on the iris edge line. The point iris width is determined from a point to be processed on the pupil area edge line and its corresponding point to be used on the iris area edge line. The position information may be coordinate information, for example Cartesian coordinate information or polar coordinate information. The point iris width may be the distance between a point to be processed and its corresponding point to be used. The iris ring width may be the collection of point iris widths around the ring, as shown in fig. 3. It is understood that the iris ring width is a set of point iris widths; the number of elements in the set equals the number of points to be processed, and each element represents the point iris width of the corresponding point to be processed.
For example, if the length of the target conversion area is 360pixel, the pupil area edge line may be equally divided into 360 parts, and the central angle corresponding to each part is 1 °, and accordingly, the position of the central angle number on the iris area edge point is determined according to the central angle number corresponding to each point to be processed, and the position is used as the point to be used corresponding to the point to be processed. And determining the iris widths of the points according to the points to be used and the points to be processed corresponding to the same central angle, and taking the set of the iris widths of each point as the iris ring width.
It should be noted that, since the central angle step is generally set within the range of 0.5° to 5°, the length of the rectangular region is generally between 72 pixels and 720 pixels.
For a clear description of the manner in which the point iris width of each point is determined, reference may be made to the following detailed description. Optionally, the position information of each point to be processed and each point to be used is determined by establishing a Cartesian coordinate system and polar coordinate systems, and the point iris width of each pair of corresponding points is then determined from this position information.
Step one, a target Cartesian coordinate system corresponding to a target image is established, and the pupil center point position of a pupil area and the iris center point position of an iris area are determined.
Specifically, a target cartesian coordinate system is established for the target image, and cartesian coordinates of the pupil center point position of the pupil area and cartesian coordinates of the iris center point position of the iris area are determined.
For example, referring to fig. 4, a Cartesian coordinate system is established with the upper left vertex of the target image as the origin of coordinates. If the resolution of the target image is X × Y, the upper left corner of the target image is the origin (0, 0) and the lower right corner is (X-1, Y-1). The pupil center point is P(x_op, y_op), and the iris center point is I(x_oi, y_oi).
It should be noted that the iris and the pupil are not strictly concentric circles, but their centers are in general very close. Therefore, there is a case where the pupil center point and the iris center point do not coincide.
And step two, obtaining a length transformation parameter according to the length and the central angle degree of the target transformation area, and determining each point to be processed on the edge of the pupil area according to the length transformation parameter.
Wherein the total number of central angle degrees corresponds to the length of the target conversion region, and the length transformation parameter may be the angle corresponding to one unit of the target conversion region's length. For example, if the total central angle is 360° and the length of the target conversion area is 180 pixels, the length transformation parameter is 2.
Specifically, the length transformation parameter can be determined by dividing the total central angle by the length of the target conversion region, so that 180 points to be processed are determined on the pupil region edge line according to the pupil center point of the pupil region and the length transformation parameter.
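As a sketch of this step, dividing the pupil edge according to the length transformation parameter might look like the following (`sample_angles` is a hypothetical helper, not a name from the patent):

```python
def sample_angles(conversion_length, full_angle=360.0):
    """Angles (in degrees) of the points to be processed on the pupil edge line.

    The length transformation parameter is the central angle covered by one
    unit of the target conversion region's length, e.g. 360 / 180 px = 2 degrees.
    """
    length_param = full_angle / conversion_length
    return [k * length_param for k in range(conversion_length)]
```

With a conversion length of 180 pixels this yields 180 evenly spaced angles, one point to be processed per output column.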
And thirdly, respectively establishing a pupil polar coordinate system by taking the position of the central point of the pupil as an origin according to the same rule, and establishing an iris polar coordinate system by taking the position of the central point of the iris in the iris area as the origin.
The same rule is mainly that the rule for establishing the pupil polar coordinate system and the iris polar coordinate system is the same, namely the polar axis direction is the same.
Specifically, the position of the pupil center point may be taken as the pole, i.e. the origin, of the pupil polar coordinate system. And emitting rays from the poles to a certain direction, and taking the rays as polar axes of a pupil polar coordinate system. A pupil polar coordinate system may be established based on the poles and polar axes. Further, the iris center point position of the iris region may be a pole of the iris polar coordinate system, and a ray may be emitted from the pole in a polar axis direction of the pupil polar coordinate system, and the ray may be used as a polar axis of the iris polar coordinate system. An iris polar coordinate system can be established based on the poles and polar axes.
It is understood that the polar axes of the iris polar coordinate system and the pupil polar coordinate system are parallel and coincident in direction.
In general, a ray in the horizontal direction may be used as the polar axis of a polar coordinate system, and the specific manner of determining the polar axis may be set according to actual requirements. Illustratively, a pupil polar coordinate system is established with the pupil center point P(x_op, y_op) as the polar origin and a polar axis parallel to the X axis of the Cartesian coordinate system; similarly, an iris polar coordinate system is established with the iris center point I(x_oi, y_oi) as the polar origin and a polar axis parallel to the X axis of the Cartesian coordinate system.
And fourthly, determining angle parameters of the points to be processed under the pupil polar coordinate system, and determining points to be used of the points to be processed on the iris edge line under the iris polar coordinate system according to the angle parameters.
Specifically, according to the established pupil polar coordinate system and each point to be processed, the angle parameter corresponding to each point to be processed can be determined. According to the angle parameter, a ray corresponding to that angle can be emitted in the iris polar coordinate system, and the intersection point of the ray and the iris edge line is taken as the point to be used corresponding to that angle parameter. That is, a point to be processed and a point to be used correspond to each other when they share the same angle parameter.
Exemplarily, the polar angle of the point to be processed P' in the pupil polar coordinate system is θ1 and its polar radius is rP. According to the angle θ1, the point on the iris edge line whose polar angle in the iris polar coordinate system is θ1 can be determined; this point is the point to be used I' corresponding to the point to be processed P'.
And fifthly, determining the polar coordinates of pupil points of the points to be processed and the polar coordinates of iris points of the corresponding points to be used.
Specifically, the angle parameter corresponding to each point to be processed is taken as a polar angle, the distance between the pupil center point and each point to be processed is taken as a polar diameter, and the pupil point polar coordinates of each point to be processed are determined. According to the same manner, the polar coordinates of the iris point of each point to be used are determined.
The polar coordinate is expressed in the form (r, θ), where r represents the polar radius and θ represents the polar angle.
Illustratively, after determining the point to be processed P' and the point to be used I' based on the above example, the pupil point polar coordinates of the point to be processed P' may be determined as (rP, θ1), and the iris point polar coordinates of the corresponding point to be used I' as (rI, θ1).
According to the methods described in the fourth and fifth steps, the polar coordinates of each point to be processed and the corresponding point to be used can be determined.
After determining the polar coordinates of each point to be processed and its corresponding point to be used, the iris width may be determined as follows: (1) first determine the point iris widths; (2) then determine the iris ring width from the point iris widths.
The specific modes corresponding to the above (1) and (2) can be seen in the following specific description:
step one, determining the current Cartesian coordinates to be processed of the current point to be processed according to the pupil point polar coordinates and pupil center point positions of the current point to be processed aiming at each point to be processed.
It should be noted that, the processing manner of converting the polar coordinates of each point to be processed into the corresponding cartesian coordinates is the same, and in this embodiment, one point to be processed is described as an example, and other points to be processed may repeatedly execute the step.
The pupil center point position may be coordinate position information of the pupil center point in the target cartesian coordinate system. The current cartesian coordinates to be processed may be coordinate position information of the current point to be processed in the target cartesian coordinate system.
Specifically, according to the polar coordinates of the pupil point and the position of the pupil center point of the current point to be processed, the coordinate information of the current point to be processed under the target Cartesian coordinate system can be determined through coordinate system conversion.
Illustratively, with continued reference to FIG. 4, the pupil center point is P(x_op, y_op), the current point to be processed is P', and the polar coordinates of P' are (rP, θ). According to the following coordinate system conversion formula, the abscissa x_p and the ordinate y_p of P' in the target Cartesian coordinate system can be obtained:

x_p = x_op + rP × cos θ

y_p = y_op + rP × sin θ
And step two, determining the current Cartesian coordinates of the current point to be used according to the polar coordinates of the iris point and the position of the iris center point of the current point to be used aiming at each point to be used.
The current cartesian coordinate to be used may be coordinate position information of the current point to be used in the target cartesian coordinate system.
Specifically, according to the polar coordinates of the iris point and the position of the center point of the iris of the point to be used currently, the coordinate information of the point to be used currently in the target Cartesian coordinate system can be determined in the same manner as in the first step.
Illustratively, with continued reference to FIG. 4, the iris center point is I(x_oi, y_oi), the current point to be used is I', and the polar coordinates of I' are (rI, θ). According to the following coordinate system conversion formula, the abscissa x_i and the ordinate y_i of I' in the target Cartesian coordinate system can be obtained:

x_i = x_oi + rI × cos θ

y_i = y_oi + rI × sin θ
And thirdly, determining the Cartesian coordinates to be processed and the Cartesian coordinates to be used corresponding to the same angle parameters in the polar coordinates, and determining the iris width of the point corresponding to each point to be processed according to the Cartesian coordinates to be processed and the Cartesian coordinates to be used.
Specifically, according to the angle parameters in the polar coordinates of the points to be processed in the pupil polar coordinate system, the points to be used with the same angle parameters in the iris polar coordinates are determined, and the points to be processed and the points to be used with the same angle parameters are used as corresponding points. According to the Cartesian coordinates to be processed and the corresponding Cartesian coordinates to be used, the distance between two points can be calculated through a linear distance formula between the two points, and the distance is the iris width of the point corresponding to the point to be processed.
Illustratively, based on the examples described above, the current Cartesian coordinates to be processed P'(x_p, y_p) and the current Cartesian coordinates to be used I'(x_i, y_i) may be determined. The point iris width IPWidth corresponding to the current point to be processed can then be calculated according to the following formula:

IPWidth = √((x_i − x_p)² + (y_i − y_p)²)
In this manner, the point iris width corresponding to each point to be processed can be determined.
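The steps above can be sketched as follows. This is a minimal illustration, assuming circular pupil and iris edges and the polar-to-Cartesian conversion described in steps one and two; the function name and arguments are hypothetical:

```python
import math

def point_iris_width(pupil_center, iris_center, rP, rI, theta):
    """Convert the pupil-polar point P' and the iris-polar point I' (same
    angle theta) to Cartesian coordinates, then take their straight-line
    distance as the point iris width IPWidth."""
    x_op, y_op = pupil_center
    x_oi, y_oi = iris_center
    # P' in the target Cartesian coordinate system (pupil polar -> Cartesian)
    x_p = x_op + rP * math.cos(theta)
    y_p = y_op + rP * math.sin(theta)
    # I' in the target Cartesian coordinate system (iris polar -> Cartesian)
    x_i = x_oi + rI * math.cos(theta)
    y_i = y_oi + rI * math.sin(theta)
    # distance between the two corresponding points
    return math.hypot(x_i - x_p, y_i - y_p)

# Example: pupil edge centered at (0, 0) with radius 1, iris edge centered
# at (2, 0) with radius 4, sampled at angle 0.
print(point_iris_width((0, 0), (2, 0), 1, 4, 0.0))  # 5.0
```

When the two circles are concentric, the point iris width reduces to rI − rP at every angle, as expected.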
Alternatively, the iris ring width can be determined according to the point iris width, and then the transformation coefficients required for normalizing the iris region to the target transformation region under different pupil states can be determined.
The width transformation coefficients corresponding to different pupil states are different, so that the transformation coefficients corresponding to the pupil normal state and the pupil abnormal state can be respectively determined. And when the pupil state is determined to be in a normal state, determining a normal transformation coefficient according to the width of the iris ring and the height of the target transformation area.
Specifically, the normal transformation coefficient when the pupil state is in the normal state is determined according to the ratio between the width of the iris ring and the height of the target transformation area.
Specifically, the iris width of each point to be processed can be determined according to the iris ring width, and the ratio of the iris width of each point to the height of the target conversion area can be calculated to obtain the normal transformation coefficient corresponding to each point to be processed.
Illustratively, the point iris width of the current point to be processed, determined from the iris ring width, is IPWidth, and the height of the target conversion area is Height. The normal transformation coefficient Factor corresponding to the current point to be processed can be determined according to the following formula:

Factor = IPWidth / Height
furthermore, normal transformation coefficients corresponding to each point to be processed can be determined for normalizing the iris region to the target transformation region.
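A minimal sketch of the normal-state coefficient, assuming the ratio formula described above (the 64-pixel target height and the sample widths are illustrative values only):

```python
def normal_factor(ip_width, height):
    """Normal-state transformation coefficient: ratio of the point iris width
    to the height of the target conversion area (Factor = IPWidth / Height)."""
    return ip_width / height

# One coefficient per point to be processed, e.g. for a 64-pixel-high
# target conversion area and three sample point iris widths:
factors = [normal_factor(w, 64) for w in (48.0, 52.0, 64.0)]
print(factors)  # [0.75, 0.8125, 1.0]
```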
If the pupil state is an abnormal state, in this embodiment, an abnormal transformation coefficient may be determined according to the iris standard annular width and the height of the target transformation area, where the abnormal state includes an enlarged state or a contracted state.
Specifically, determining the width of an iris standard ring; and determining an abnormal transformation coefficient according to the ratio of the width of the iris standard circular ring to the height of the target transformation area.
To determine the abnormal transformation coefficient, the iris standard annular width may first be determined. Optionally, the standard pupil radius is determined according to the ratio of the total radius of the target to-be-processed area to a preset standard radius ratio, and the iris standard annular width is determined according to the total radius and the standard pupil radius.
The preset standard radius ratio may be a radius ratio of a total radius of the target to-be-processed area to a pupil radius obtained according to big data statistics, and optionally, the preset standard radius ratio may be 3.
Specifically, the standard pupil radius can be determined according to the total radius of the target to-be-processed area and the preset standard radius ratio, and the iris standard annular width can be determined by taking the difference between the total radius and the standard pupil radius. The ratio of the iris standard annular width to the height of the target conversion area gives the abnormal transformation coefficient.
Illustratively, the standard pupil radius rPStd is determined from the total radius rI of the target to-be-processed area and the preset standard radius ratio rprstd, i.e., rPStd = rI / rprstd.
Further, subtracting the standard pupil radius rPStd from the total radius rI of the target to-be-processed area determines the iris standard annular width stdIPWidth, i.e., stdIPWidth = rI − rPStd. The height of the target conversion area is Height. The abnormal transformation coefficient stdFactor may be determined according to the following formula:

stdFactor = stdIPWidth / Height
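The abnormal-state coefficient chain above can be sketched in a few lines; the default standard radius ratio of 3 follows the optional value given in the description, while the sample radii are illustrative:

```python
def abnormal_factor(rI, height, std_ratio=3.0):
    """Abnormal-state transformation coefficient: derive the standard pupil
    radius from the preset standard radius ratio, take the iris standard
    annular width stdIPWidth = rI - rPStd, and divide by the height of the
    target conversion area."""
    rPStd = rI / std_ratio       # standard pupil radius
    stdIPWidth = rI - rPStd      # iris standard annular width
    return stdIPWidth / height   # stdFactor

print(abnormal_factor(96.0, 64.0))  # (96 - 32) / 64 = 1.0
```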
s130, processing the position information in the target conversion area based on the target processing mode, and determining target iris position information of the position information in the target to-be-processed area.
Specifically, each piece of position information in the target conversion area is processed based on the target processing mode: the current position information in the target conversion area is converted into polar coordinate information under the iris polar coordinates, the polar coordinate information is converted into a target Cartesian coordinate under the target Cartesian coordinate system, and the target Cartesian coordinate is taken as the target iris position information corresponding to the current position information.
And S140, determining a normalized iris texture image corresponding to the iris region according to the pixel information of the target iris position information corresponding to each position information.
The pixel information may be a pixel value, and the normalized iris texture image may be an image obtained by filling the pixel value in the target conversion region. The pixel values of the padding are determined from the pixel values of the individual points in the iris region.
According to the technical scheme of this embodiment, the target to-be-processed area in the target image is determined; the target processing mode for processing the target to-be-processed area is determined according to the area ratio of the pupil area to the iris area in the target to-be-processed area, so that images in different pupil states are processed separately; each piece of position information in the target conversion area is processed based on the target processing mode to determine the target iris position information of that position information in the target to-be-processed area; and the normalized iris texture image corresponding to the iris region is determined according to the pixel information of the target iris position information corresponding to each piece of position information. This solves the technical problems in the prior art that iris images in different pupil states are processed in the same processing manner, so that the converted iris texture image differs to some extent from the actually acquired iris image, the recognition accuracy is low, the user must be captured several times before recognition passes, and the user experience is poor. By performing the corresponding normalization processing on the iris region according to different pupil states, the accuracy of the nonlinear normalization of the iris region texture is improved, and therefore the accuracy of identity recognition is improved.
In this embodiment, a pixel value corresponding to each target iris position information is obtained, and the pixel value is filled into the corresponding position information in the target conversion area.
The pixel value may be understood as a gray value corresponding to the iris image after being converted into the gray map.
Specifically, according to the target iris position information corresponding to the current position information, the pixel value, that is, the gray value, corresponding to that target iris position information may be determined and taken as the pixel value of the current position information in the target conversion area. In this manner, after all pixel values in the target conversion area are filled, the resulting image of the target conversion area can be used as the normalized iris texture image.
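The filling loop of S140 can be sketched as follows. The mapping function `position_to_iris` is a stand-in for the target-processing-mode conversion of S130, and the nearest-pixel fill, the toy image, and all names are illustrative assumptions:

```python
def fill_normalized_texture(gray, width, height, position_to_iris):
    """Fill the target conversion area: for every position (m, n), look up
    the corresponding target iris position in the original gray image and
    copy its pixel (gray) value. `position_to_iris` must return integer
    (x, y) coordinates inside `gray`."""
    out = [[0] * width for _ in range(height)]
    for n in range(height):
        for m in range(width):
            x, y = position_to_iris(m, n)
            out[n][m] = gray[y][x]  # nearest-pixel fill
    return out

# Toy example: a 4x4 gray image and an identity mapping.
gray = [[10 * r + c for c in range(4)] for r in range(4)]
tex = fill_normalized_texture(gray, 3, 2, lambda m, n: (m, n))
print(tex)  # [[0, 1, 2], [10, 11, 12]]
```

In practice the mapped coordinates are fractional, so a real implementation would interpolate (e.g. bilinearly) rather than take the nearest pixel.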
Based on the technical scheme, the identity recognition can be performed based on the normalized iris texture image. Optionally, extracting features in the normalized iris texture image and coding to obtain feature codes; and matching the characteristic codes with pre-stored iris characteristics to perform identity recognition.
The feature code can be obtained by performing feature extraction and encoding on the normalized iris texture image with a preset feature extraction algorithm, which may be a texture feature analysis method such as a gray-level co-occurrence matrix method, a structural method, a Markov random field model method, or an autoregressive texture model method.
Specifically, a preset feature extraction algorithm is adopted to extract features required by iris recognition from the normalized iris texture image, and the features are encoded. And matching the feature codes obtained by the feature extraction with the pre-stored iris features one by one, and judging whether the iris features are the same iris feature codes or not, thereby achieving the purpose of identity recognition.
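The matching step can be illustrated with normalized Hamming distance over binary feature codes, a common way to compare iris codes. This is not the patent's prescribed method (the patent only requires matching feature codes against pre-stored iris features); the codes, threshold, and names below are illustrative:

```python
def hamming_distance(code_a, code_b):
    """Fraction of differing bits between two equal-length binary codes."""
    assert len(code_a) == len(code_b)
    return sum(a != b for a, b in zip(code_a, code_b)) / len(code_a)

def identify(probe, gallery, threshold=0.32):
    """Match the probe code one by one against pre-stored codes; return the
    best-matching identity if its distance is under the threshold."""
    best = min(gallery, key=lambda uid: hamming_distance(probe, gallery[uid]))
    return best if hamming_distance(probe, gallery[best]) <= threshold else None

gallery = {"alice": [1, 0, 1, 1, 0, 0, 1, 0],
           "bob":   [0, 1, 0, 0, 1, 1, 0, 1]}
print(identify([1, 0, 1, 1, 0, 1, 1, 0], gallery))  # alice (1/8 bits differ)
```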
Example two
Fig. 5 is a flow chart of an iris image processing method according to a second embodiment of the present invention. On the basis of the above embodiments, for a specific implementation of processing each piece of position information in the target conversion area based on the target processing mode when the pupil state is a normal state, reference may be made to the technical solution of this embodiment. Explanations of terms identical or corresponding to those in the above embodiments are not repeated herein.
As shown in fig. 5, the method specifically includes the following steps:
s210, acquiring a target image, and determining a target to-be-processed area comprising a pupil area and an iris area in the target image.
S220, if the ratio of the total radius to the pupil radius is within a second preset range, the pupil state of the pupil information is a normal state.
S230, if the pupil state is a normal state, determining that the target processing mode for processing the target to-be-processed area is a first target processing mode.
S240, determining, for each piece of position information, that the current position information in the target conversion area is converted into a first target iris polar coordinate under an iris polar coordinate system, converting the first target iris polar coordinate into a target Cartesian coordinate under a target Cartesian coordinate system, and taking the target Cartesian coordinate as target iris position information corresponding to the current position information.
The first target iris polar coordinate may be polar coordinate information under an iris polar coordinate system corresponding to the current position information when the pupil state is in a normal state.
Specifically, the specific steps of determining that the current position information in the target conversion area is converted to the first target iris polar coordinate under the iris polar coordinate system, and converting the first target iris polar coordinate to the target cartesian coordinate under the target cartesian coordinate system may be the following steps:
step one, determining a target polar coordinate angle of a first target iris polar coordinate according to a first abscissa parameter and a length transformation parameter of current position information, and determining a target polar coordinate radius of the first target iris polar coordinate according to a first ordinate and a normal transformation coefficient of the current position information.
The first abscissa parameter may be an abscissa parameter of the current position information in the target conversion area, and the first ordinate parameter may be an ordinate parameter of the current position information in the target conversion area.
Specifically, the target polar coordinate angle is determined according to the product of the first abscissa parameter of the current position information and the length transformation parameter. The length transformation parameters can be determined according to the length of the target transformation area and the number of circle center angles. And determining the radius of the polar coordinate of the target according to the product of the first ordinate of the current position information and the normal transformation coefficient.
Illustratively, the current position information is (m, n), the length of the target conversion area is Width, the height is Height, the number of central angles is 360 degrees, and the point iris width corresponding to the current position information is IPWidth. The normal transformation coefficient Factor corresponding to the current position information can be determined according to the following formula:

Factor = IPWidth / Height
Further, the target polar coordinate angle θ and the target polar coordinate radius r of the first target iris polar coordinate corresponding to the current position information may be determined according to the following formula:

θ = m × (360 / Width)

r = n × Factor
and step two, determining the target Cartesian coordinate of the first target iris polar coordinate under the target Cartesian coordinate system according to the target polar coordinate angle, the target polar coordinate radius and the iris ring width.
And determining the pupil Cartesian coordinates of the target pupil edge points on the pupil edge line corresponding to the current position information and the iris Cartesian coordinates of the target iris edge points on the iris edge line according to the target polar coordinate angle.
Specifically, according to the target polar coordinate angle, the polar coordinate information of the target pupil edge point can be determined on the pupil polar coordinate system established by taking the position of the pupil center point as the origin, and the pupil Cartesian coordinate of the target pupil edge point on the target Cartesian coordinate system can be determined through coordinate conversion. According to the target polar coordinate angle, the polar coordinate information of the target iris edge point on the iris edge line can be determined on an iris polar coordinate system established by taking the iris center point position as an origin, and the Cartesian iris coordinates of the target iris edge point on the Cartesian target coordinate system can be determined through coordinate conversion.
Illustratively, with continued reference to FIG. 4, the pupil center point is P(x_op, y_op), the target pupil edge point is P', and the polar coordinates of P' are (rP, θ); the iris center point is I(x_oi, y_oi), the target iris edge point is I', and the polar coordinates of I' are (rI, θ). According to the following coordinate system conversion formula, the pupil Cartesian abscissa x_p and the pupil Cartesian ordinate y_p of P' in the target Cartesian coordinate system can be obtained:

x_p = x_op + rP × cos θ

y_p = y_op + rP × sin θ
In the same way, the iris Cartesian abscissa x_i and the iris Cartesian ordinate y_i of I' in the target Cartesian coordinate system can be found:

x_i = x_oi + rI × cos θ

y_i = y_oi + rI × sin θ
Based on the above formula, the specific way of determining the target Cartesian coordinates is as follows: determining a difference value between the Cartesian abscissa of the iris and the Cartesian abscissa of the pupil, determining a product of the difference value and a radius of a target polar coordinate, determining a first numerical value based on a ratio between the product and the width of the iris ring, and determining a target Cartesian abscissa in the target Cartesian coordinate according to a sum between the first numerical value and the Cartesian abscissa of the pupil; and determining a difference value between the Cartesian ordinate of the iris and the Cartesian ordinate of the pupil, determining a product of the difference value and the radius of the target polar coordinate, determining a second numerical value based on a ratio between the product and the width of the iris ring, and determining the target Cartesian ordinate in the target Cartesian coordinate according to the sum of the second numerical value and the Cartesian ordinate of the pupil.
Illustratively, the current position information is (m, n), the target polar coordinate angle corresponding to the current position information is θ, the target polar coordinate radius is r, the pupil Cartesian abscissa is x_p, the pupil Cartesian ordinate is y_p, the iris Cartesian abscissa is x_i, the iris Cartesian ordinate is y_i, and the point iris width in the iris ring width corresponding to the current position information is IPWidth; the target Cartesian coordinates (x, y) can then be determined. First, the first and second values may be calculated by the following formula:

first value = (x_i − x_p) × r / IPWidth

second value = (y_i − y_p) × r / IPWidth
Further, the target Cartesian abscissa x in the target Cartesian coordinates can be determined from the first value and the pupil Cartesian abscissa x_p, and the target Cartesian ordinate y from the second value and the pupil Cartesian ordinate y_p, i.e.:

x = x_p + first value

y = y_p + second value
The target cartesian coordinates may be used as target iris position information corresponding to the current position information to determine a normalized iris texture image.
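The normal-state mapping of S240 can be sketched end to end. This sketch assumes circular pupil and iris edges (so the edge points at angle θ follow directly from the center points and radii) and uses radians for the 360 central angles; the function name and arguments are illustrative:

```python
import math

def normal_map(m, n, width, height, pupil_center, iris_center, rP, rI):
    """Map position (m, n) of a Width x Height target conversion area to
    target Cartesian coordinates under the normal pupil state: angle from the
    column index, radius from the row index and the normal coefficient, then
    interpolation between the pupil-edge and iris-edge points."""
    theta = m * (2 * math.pi / width)            # target polar coordinate angle
    x_op, y_op = pupil_center
    x_oi, y_oi = iris_center
    # pupil-edge point P' and iris-edge point I' at angle theta
    x_p = x_op + rP * math.cos(theta)
    y_p = y_op + rP * math.sin(theta)
    x_i = x_oi + rI * math.cos(theta)
    y_i = y_oi + rI * math.sin(theta)
    ip_width = math.hypot(x_i - x_p, y_i - y_p)  # point iris width
    r = n * (ip_width / height)                  # target polar coordinate radius
    # first / second values added to the pupil-edge coordinate
    x = x_p + (x_i - x_p) * r / ip_width
    y = y_p + (y_i - y_p) * r / ip_width
    return x, y

# The bottom row (n = Height) lands exactly on the iris edge point.
x, y = normal_map(0, 64, 360, 64, (100, 100), (100, 100), 30, 90)
print(round(x, 6), round(y, 6))  # 190.0 100.0
```

The top row (n = 0) correspondingly lands on the pupil edge point, so the conversion area spans the full iris ring at every angle.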
S250, determining a normalized iris texture image corresponding to the iris region according to the pixel information of the target iris position information corresponding to each position information.
According to the technical scheme of this embodiment, the target image is acquired and the target to-be-processed area in the target image is determined; the target processing mode for processing the target to-be-processed area is determined according to the area ratio of the pupil area to the iris area in the target to-be-processed area, and the pupil state is determined to be a normal state; each piece of position information in the target conversion area is then processed based on the first target processing mode: for each piece of position information, the current position information in the target conversion area is converted into a first target iris polar coordinate under the iris polar coordinate system, the first target iris polar coordinate is converted into a target Cartesian coordinate under the target Cartesian coordinate system, and the target Cartesian coordinate is taken as the target iris position information corresponding to the current position information; finally, the normalized iris texture image corresponding to the iris region is determined according to the pixel information of the target iris position information corresponding to each piece of position information. This solves the technical problems in the prior art that iris images in different pupil states are processed in the same processing manner, so that the converted iris texture image differs to some extent from the actually acquired iris image, the recognition accuracy is low, the user must be captured multiple times before recognition passes, and the user experience is poor. By performing the corresponding normalization processing on the iris region according to different pupil states, the accuracy of the nonlinear normalization of the iris region texture when the pupil state is normal is improved, and therefore the accuracy of identity recognition is improved.
Example III
Fig. 6 is a flow chart of an iris image processing method according to a third embodiment of the present invention. On the basis of the above embodiments, for a specific implementation of processing each piece of position information in the target conversion area based on the target processing mode when the pupil state is an abnormal state, reference may be made to the technical solution of this embodiment. Explanations of terms identical or corresponding to those in the above embodiments are not repeated herein.
As shown in fig. 6, the method specifically includes the following steps:
s310, acquiring a target image, and determining a target to-be-processed area comprising a pupil area and an iris area in the target image.
S320, if the ratio between the total radius and the pupil radius is within the first preset range or the third preset range, the pupil state of the pupil information is an abnormal state.
Wherein the abnormal state includes a dilated state or a contracted state.
Specifically, if the ratio between the total radius and the pupil radius is within the first preset range, the pupil state of the pupil information is a contracted state; if the ratio between the total radius and the pupil radius is within the third preset range, the pupil state of the pupil information is a dilated state.
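The classification by ratio ranges can be sketched as below. The patent only specifies first, second, and third preset ranges; the numeric boundaries used here are illustrative placeholders, not values from the description:

```python
def pupil_state(total_radius, pupil_radius):
    """Classify the pupil state from the ratio of the total radius of the
    target to-be-processed area to the pupil radius. Range boundaries are
    illustrative placeholders."""
    ratio = total_radius / pupil_radius
    if ratio > 3.5:    # first preset range: pupil relatively small
        return "contracted"
    if ratio >= 2.5:   # second preset range: normal state
        return "normal"
    return "dilated"   # third preset range: pupil relatively large
```

A larger ratio means a smaller pupil relative to the whole area, hence the contracted state; a smaller ratio means a dilated pupil.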
S330, if the pupil state is an abnormal state, determining that the target processing mode for processing the target to-be-processed area is the second target processing mode or the third target processing mode.
Specifically, if the pupil state is an enlarged state, determining that the mode of processing the target to-be-processed area is a second target processing mode, wherein the second target processing mode comprises an area enlargement dividing coefficient and an area enlargement conversion coefficient; if the pupil state is a contracted state, determining that the mode of processing the target to-be-processed area is a third target processing mode, wherein the third target processing mode comprises an area contracted dividing coefficient and an area contracted conversion coefficient.
The region enlargement and division coefficient may be a scaling coefficient required when dividing the iris region into a preset number of annular regions when the pupil state is an enlarged state, and the region enlargement and transformation coefficient may be a scaling coefficient required when performing width adjustment on the preset number of annular regions after the division according to the region enlargement and division coefficient is completed. The region-reduction dividing coefficient may be a scaling coefficient required when dividing the iris region into a preset number of annular regions when the pupil state is in a reduced state, and the region-reduction conversion coefficient may be a scaling coefficient required when width-adjusting the preset number of annular regions after division according to the region-reduction dividing coefficient is completed. Typically, the preset number is at least three.
For example, if the pupil state is the dilated state, the region division coefficient is the region dilation division coefficient and the region transform coefficient is the region dilation transform coefficient; if the pupil state is the contracted state, the region division coefficient is the region contraction division coefficient and the region transform coefficient is the region contraction transform coefficient. The coefficient values may be determined theoretically and/or empirically. Optionally, when the pupil state is the dilated state, the region dilation division coefficient is 3:3:4 and the region dilation transform coefficient is 0.8:1:1.2; when the pupil state is the contracted state, the region contraction division coefficient is 3:4:3 and the region contraction transform coefficient is 1.2:1:0.8.
S340, determining a target standard polar coordinate angle of the second target iris polar coordinate according to the first abscissa parameter and the length transformation parameter of the current position information, and determining a target standard polar coordinate radius of the second target iris polar coordinate according to the first ordinate parameter and the abnormal transformation coefficient of the current position information.
The first abscissa parameter may be an abscissa parameter of the current position information in the target conversion area, and the first ordinate parameter may be an ordinate parameter of the current position information in the target conversion area.
Specifically, the target standard polar coordinate angle is determined according to the product of the first abscissa parameter of the current position information and the length transformation parameter. And determining the standard polar coordinate radius of the target according to the product of the first ordinate of the current position information and the abnormal transformation coefficient.
Illustratively, the current position information is (m, n), the length of the target conversion area is Width, the height is Height, the number of central angles is 360 degrees, and the iris standard annular width is stdIPWidth. The abnormal transformation coefficient stdFactor corresponding to the current position information can be determined according to the following formula:

stdFactor = stdIPWidth / Height
Further, the target standard polar coordinate angle θ and the target standard polar coordinate radius r of the second target iris polar coordinate corresponding to the current position information may be determined according to the following formula:

θ = m × (360 / Width)

r = n × stdFactor
s350, determining standard target Cartesian coordinates of the second target iris polar coordinates under the target Cartesian coordinates according to the target standard polar coordinate angle, the target standard polar coordinate radius and the iris standard ring width, and taking the standard target Cartesian coordinates as target iris position information.
Specifically, the specific steps for determining the standard target cartesian coordinates of the polar coordinates of the second target iris under the target cartesian coordinates are as follows:
Step one, dividing the point iris width corresponding to each point to be processed according to the region division coefficient and the region transform coefficient in the target processing mode to obtain at least three iris annular regions to be used, and determining the nonlinear iris region width of each iris annular region to be used.
The region dividing coefficient may be a scaling coefficient required for dividing the iris region into a preset number of annular regions, and the region transformation coefficient may be a scaling coefficient required for performing width adjustment on the preset number of annular regions after the division is completed. The sum of the widths of the nonlinear iris areas is equal to the width of the standard iris ring. The iris annular region to be used may be an annular region obtained by dividing according to the region division coefficient and the region transformation coefficient.
Specifically, the nonlinear iris region widths of at least three iris annular regions to be used corresponding to each point to be processed can be determined according to the product of the iris widths of the points corresponding to each point to be processed, the region dividing coefficient and the region transformation coefficient, and the divided at least three iris annular regions to be used can be obtained.
Optionally, dividing the iris width of the point corresponding to each point to be processed into at least three sections of iris widths to be determined according to the region division coefficient and the iris ring width; processing at least three sections of iris widths to be determined corresponding to the same point to be processed according to the region transformation coefficient to obtain at least three sections of iris widths to be used; and obtaining at least three iris annular areas to be used according to at least three iris widths to be used corresponding to each point to be processed, and determining the width of the nonlinear iris area of the iris annular areas to be used.
The iris width to be determined may be a width determined according to the region division coefficient and the iris annular width. The iris width to be used may be a width determined according to the region transform coefficient and the iris width to be determined.
Specifically, the iris width to be determined required when dividing the iris width into at least three parts can be determined according to the product of the iris width of the iris ring corresponding to each point to be processed and the corresponding part in the region division coefficient. Furthermore, the iris width to be used required for dividing the iris width of the point into at least three parts can be determined according to the product of the iris width to be determined corresponding to the same point to be processed and the corresponding part in the region transformation coefficient. According to the mode, at least three sections of iris widths to be used corresponding to each point to be processed can be determined. Further, the iris region can be divided according to at least three sections of iris widths to be used, at least three iris annular regions to be used are obtained, and the nonlinear iris region width of each iris annular region to be used can be determined according to each iris width to be used corresponding to each iris annular region to be used.
For example, the point iris width corresponding to the current point to be processed is IPWidth, the region division coefficient is inR:midR:outR, and the region transform coefficient is inF:midF:outF, where inR, midR, outR, inF, midF and outF are all values greater than zero; inF > midF = 1 > outF when the pupil state is the dilated state, and inF < midF = 1 < outF when the pupil state is the contracted state. From this, it can be determined that the number of iris annular regions to be used corresponding to the current point to be processed is 3. The three iris widths to be determined inW, midW and outW corresponding to the current point to be processed can be determined according to the following formula:

inW = IPWidth × inR

midW = IPWidth × midR

outW = IPWidth × outR
Fig. 7 is a schematic diagram of three iris widths to be determined when the pupil state is an enlarged state, and fig. 8 is a schematic diagram of three iris widths to be determined when the pupil state is a contracted state.
Further, the three iris widths to be used, inStdW, midStdW and outStdW, corresponding to the current point to be processed can be determined according to the following formulas:

inStdW = inW × inF = IPWidth × inR × inF
midStdW = midW × midF = IPWidth × midR × midF
outStdW = outW × outF = IPWidth × outR × outF
According to this method, the three iris widths to be used corresponding to each point to be processed can be determined, and the iris region can be divided accordingly. The nonlinear iris region width of each iris annular region to be used is determined according to each iris width to be used corresponding to that region. In addition, the iris standard ring width stdIPWidth can be determined as
stdIPWidth=inStdW+midStdW+outStdW
I.e.
stdIPWidth=IPWidth×(inR×inF+midR×midF+outR×outF)
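As a minimal sketch of the width computation described above (Python is used for illustration only; the function name `segment_widths` and the tuple return shape are assumptions, not part of the embodiment):

```python
def segment_widths(ip_width, inR, midR, outR, inF, midF, outF):
    # iris widths to be determined: point iris width times division coefficient
    inW, midW, outW = ip_width * inR, ip_width * midR, ip_width * outR
    # iris widths to be used: the above times the region transformation coefficient
    inStdW, midStdW, outStdW = inW * inF, midW * midF, outW * outF
    # iris standard ring width: sum of the widths to be used
    stdIPWidth = inStdW + midStdW + outStdW
    return (inW, midW, outW), (inStdW, midStdW, outStdW), stdIPWidth
```

For instance, with IPWidth = 50, region division coefficient 0.3:0.4:0.3 and region transformation coefficient 1.2:1.0:0.8, the iris standard ring width evaluates to 50, since 0.3×1.2 + 0.4×1.0 + 0.3×0.8 = 1.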
Alternatively, if the sum of inR, midR and outR in the region division coefficient is not equal to 1, inR, midR and outR may be normalized, i.e. the region division coefficient is updated to inR/(inR+midR+outR) : midR/(inR+midR+outR) : outR/(inR+midR+outR). Similarly, the region transformation coefficient may be updated in the same manner.
And step two, determining the linear iris width corresponding to each iris area to be used according to the nonlinear iris width corresponding to each iris annular area to be used and the abnormal transformation coefficient.
The linear iris width may be a width obtained by processing according to the nonlinear iris width and the abnormal transformation coefficient, and is used for converting the iris region into the target conversion region later.
Specifically, division processing is performed on the nonlinear iris width corresponding to each iris annular region to be used and the abnormal transformation coefficient, and the obtained quotient can be used as the linear iris width corresponding to each iris region to be used.
For example, the nonlinear iris widths corresponding to the three iris annular regions to be used are inStdW, midStdW and outStdW, and the abnormal transformation coefficient is stdFactor. The linear iris widths inNormW, midNormW and outNormW corresponding to the iris annular regions to be used can be determined as:

inNormW = inStdW / stdFactor
midNormW = midStdW / stdFactor
outNormW = outStdW / stdFactor
fig. 9 is a schematic diagram of a linear iris width corresponding to each iris region to be used when the pupil state is in an enlarged state, and fig. 10 is a schematic diagram of a linear iris width corresponding to each iris region to be used when the pupil state is in a contracted state.
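Step two above (dividing each nonlinear width by the abnormal transformation coefficient) can be sketched as follows; the function name is illustrative, and stdFactor is taken as the ratio of the iris standard ring width to the height of the target conversion area, as the embodiment describes elsewhere:

```python
def linear_widths(std_widths, std_ip_width, height):
    # abnormal transformation coefficient: standard ring width over target height
    std_factor = std_ip_width / height
    # each linear width is the quotient of the nonlinear width and stdFactor
    return tuple(w / std_factor for w in std_widths), std_factor
```

Because stdFactor is the ratio of the iris standard ring width to the height of the target conversion area, the linear widths always sum to that height, so the sub-conversion regions exactly tile the target conversion area.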
And thirdly, determining the sub-conversion region width of each iris annular region to be used in the target conversion region according to the linear iris region width of each iris annular region to be used.
Wherein the sub-conversion region width may be a portion of the target conversion region width, and the sum of all the sub-conversion region widths is the target conversion region width.
Specifically, after determining the width of the linear iris region of each iris annular region to be used, the width of the linear iris region of each iris annular region to be used is taken as the width of the sub-conversion region of the target conversion region.
Determining a sub-conversion region to which the current position information belongs according to the current position information; determining a standard target Cartesian coordinate of the second target iris polar coordinate under the target Cartesian coordinate system according to the sub-conversion region width, the target standard polar coordinate angle, the current position information, the linear iris width corresponding to the sub-conversion region, the nonlinear iris width, the pupil radius and the pupil Cartesian coordinate; and taking the standard target Cartesian coordinate as the target iris position information of the current position information.
Specifically, according to the target standard polar coordinate angle, the polar coordinate information of the target pupil edge point can be determined on the pupil polar coordinate system established by taking the position of the pupil center point as the origin, and the pupil Cartesian coordinate of the target pupil edge point on the target Cartesian coordinate system can be determined through coordinate conversion. According to the target standard polar coordinate angle, the polar coordinate information of the target iris edge point on the iris edge line can be determined on the iris polar coordinate system established by taking the iris center point position as the origin, and the Iris Cartesian coordinate of the target iris edge point on the target Cartesian coordinate system can be determined through coordinate conversion.
By way of example only, and not by way of limitation, the current position information is (m, n), the target standard polar coordinate radius corresponding to the current position information is r, the point iris width is IPWidth, the pupil Cartesian coordinates are (x_p, y_p), the iris Cartesian coordinates are (x_i, y_i), and the iris standard ring width is stdIPWidth. The standard target Cartesian coordinates (x, y) can be determined according to the following formulas:

x = x_p + (x_i − x_p) × r / stdIPWidth
y = y_p + (y_i − y_p) × r / stdIPWidth
Since x_p = x_op + rP × cosθ and y_p = y_op + rP × sinθ, the above formulas can be converted into:

x = x_op + rP × cosθ + (x_i − x_p) × r / stdIPWidth
y = y_op + rP × sinθ + (y_i − y_p) × r / stdIPWidth
Since x_i − x_p = IPWidth × cosθ and y_i − y_p = IPWidth × sinθ, the above formulas can be converted into formula (1):

x = x_op + (rP + IPWidth × r / stdIPWidth) × cosθ
y = y_op + (rP + IPWidth × r / stdIPWidth) × sinθ (1)
it should be noted that the number of iris annular regions to be used may be at least three, and in this embodiment, description will be given by taking the example that the number of iris annular regions to be used is three, and if the number of iris annular regions to be used is greater than three, reference may be made to the implementation procedure of this embodiment.
The iris annular region to be used corresponding to the current position information can be determined according to the ordinate corresponding to the current position information (namely, which sub-conversion region in the target conversion region the current position information is located in is determined according to the ordinate), and the processing mode corresponding to the iris annular region can be further determined according to the iris annular region.
In the first case, if the current position information is (m, n) and n < inNormW (i.e. the current position information is located in the first iris annular region to be used adjacent to the pupil), formula (1) may be converted into:

x = x_op + (rP + n × stdFactor × inW / inStdW) × cosθ
y = y_op + (rP + n × stdFactor × inW / inStdW) × sinθ
Substituting inW = IPWidth × inR into the above formulas results in:

x = x_op + (rP + n × stdFactor × IPWidth × inR / inStdW) × cosθ
y = y_op + (rP + n × stdFactor × IPWidth × inR / inStdW) × sinθ
According to inStdW = IPWidth × inR × inF, the above formulas can be converted into:

x = x_op + (rP + n × stdFactor / inF) × cosθ
y = y_op + (rP + n × stdFactor / inF) × sinθ
in summary, it can be summarized as follows: if the sub-conversion area is a first iris annular area to be used of the neighboring pupil, determining a first intermediate value according to the ordinate of the current position information and the abnormal transformation coefficient, and determining a first coefficient value according to the first intermediate value and the area transformation coefficient corresponding to the sub-conversion area; determining a second intermediate value according to the first coefficient value and the pupil radius, and determining a second coefficient value according to the cosine value of the second intermediate value and the target standard polar coordinate angle; determining the abscissa of the target iris position information according to the second coefficient value and the abscissa of the pupil center point position; and determining a third coefficient value according to the second intermediate value and the sine value of the target standard polar coordinate angle, and determining the ordinate of the target iris position information according to the third coefficient value and the ordinate of the pupil center point position.
Specifically, if the sub-conversion region is the first iris annular region to be used adjacent to the pupil, i.e. the inW region in fig. 7 or fig. 8, the first intermediate value n × stdFactor is determined according to the ordinate n of the current position information and the abnormal transformation coefficient stdFactor, and the first coefficient value n × stdFactor / inF is determined according to the first intermediate value and the region transformation coefficient inF corresponding to the sub-conversion region. The second intermediate value rP + n × stdFactor / inF is determined according to the first coefficient value and the pupil radius rP, and the second coefficient value (rP + n × stdFactor / inF) × cosθ is determined according to the second intermediate value and the cosine of the target standard polar coordinate angle θ. The abscissa of the target iris position information is then determined as x_op + (rP + n × stdFactor / inF) × cosθ according to the second coefficient value and the abscissa x_op of the pupil center point position. Likewise, the third coefficient value (rP + n × stdFactor / inF) × sinθ is determined according to the second intermediate value and the sine of the target standard polar coordinate angle, and the ordinate of the target iris position information is determined as y_op + (rP + n × stdFactor / inF) × sinθ according to the third coefficient value and the ordinate y_op of the pupil center point position.
In the second case, if the current position information is (m, n) and inNormW ≤ n < inNormW + midNormW (i.e. the current position information is located in the second iris annular region to be used adjacent to the first iris annular region to be used), following the derivation of the first case, formula (1) may be converted into:

x = x_op + (rP + inW + (n − inNormW) × stdFactor / midF) × cosθ
y = y_op + (rP + inW + (n − inNormW) × stdFactor / midF) × sinθ
In summary: if the sub-conversion area is the second iris annular area to be used adjacent to the first iris annular area to be used, determining a third intermediate value according to the ordinate of the current position information and the linear iris width of the first iris annular area to be used, determining a fourth intermediate value according to the third intermediate value and the abnormal transformation coefficient, and determining a fourth coefficient value according to the fourth intermediate value and the region transformation coefficient corresponding to the sub-conversion area; determining a fifth intermediate value according to the fourth coefficient value, the pupil radius and the nonlinear iris width of the first iris annular region to be used; determining a sixth intermediate value according to the fifth intermediate value and the cosine value of the target standard polar coordinate angle, and determining a seventh intermediate value according to the fifth intermediate value and the sine value of the target polar coordinate angle; and determining the abscissa of the target iris position information according to the sixth intermediate value and the abscissa of the pupil center point, and determining the ordinate of the target iris position information according to the seventh intermediate value and the ordinate of the pupil center point.
Specifically, if the sub-conversion region is the second iris annular region to be used adjacent to the first iris annular region to be used, i.e. the midW region in fig. 7 or fig. 8, the third intermediate value n − inNormW is determined according to the ordinate n of the current position information and the linear iris width inNormW of the first iris annular region to be used; the fourth intermediate value (n − inNormW) × stdFactor is determined according to the third intermediate value and the abnormal transformation coefficient stdFactor; and the fourth coefficient value (n − inNormW) × stdFactor / midF is determined according to the fourth intermediate value and the region transformation coefficient midF corresponding to the sub-conversion region. The fifth intermediate value rP + inW + (n − inNormW) × stdFactor / midF is determined according to the fourth coefficient value, the pupil radius rP and the nonlinear iris width inW of the first iris annular region to be used. The sixth intermediate value (rP + inW + (n − inNormW) × stdFactor / midF) × cosθ is determined according to the fifth intermediate value and the cosine of the target standard polar coordinate angle θ, and the seventh intermediate value (rP + inW + (n − inNormW) × stdFactor / midF) × sinθ is determined according to the fifth intermediate value and the sine of the target polar coordinate angle. The abscissa of the target iris position information is determined as x_op + (rP + inW + (n − inNormW) × stdFactor / midF) × cosθ according to the sixth intermediate value and the abscissa x_op of the pupil center point, and the ordinate is determined as y_op + (rP + inW + (n − inNormW) × stdFactor / midF) × sinθ according to the seventh intermediate value and the ordinate y_op of the pupil center point.
In the third case, if the current position information is (m, n) and inNormW + midNormW ≤ n < inNormW + midNormW + outNormW (i.e. the current position information is located in the third iris annular region to be used adjacent to the second iris annular region to be used), following the derivation of the first case, formula (1) may be converted into:

x = x_op + (rP + inW + midW + (n − inNormW − midNormW) × stdFactor / outF) × cosθ
y = y_op + (rP + inW + midW + (n − inNormW − midNormW) × stdFactor / outF) × sinθ
In summary: if the sub-conversion area is the third iris annular area to be used adjacent to the second iris annular area to be used, determining an eighth intermediate value according to the ordinate of the current position information, the linear iris width of the first iris annular area to be used and the linear iris width of the second iris annular area to be used, determining a ninth intermediate value according to the eighth intermediate value and the abnormal transformation coefficient, and determining a fifth coefficient value according to the ninth intermediate value and the region transformation coefficient corresponding to the sub-conversion area; determining a sixth coefficient value according to the fifth coefficient value, the pupil radius, the nonlinear iris width of the first iris annular region to be used and the nonlinear iris width of the second iris annular region to be used; determining a seventh coefficient value according to the sixth coefficient value and the cosine value of the target standard polar coordinate angle, and determining an eighth coefficient value according to the sixth coefficient value and the sine value of the target polar coordinate angle; and determining the abscissa of the target iris position information according to the seventh coefficient value and the abscissa of the pupil center point, and determining the ordinate of the target iris position information according to the eighth coefficient value and the ordinate of the pupil center point.
Specifically, if the sub-conversion region is the third iris annular region to be used adjacent to the second iris annular region to be used, i.e. the outW region in fig. 7 or fig. 8, the eighth intermediate value n − inNormW − midNormW is determined according to the ordinate n of the current position information, the linear iris width inNormW of the first iris annular region to be used and the linear iris width midNormW of the second iris annular region to be used; the ninth intermediate value (n − inNormW − midNormW) × stdFactor is determined according to the eighth intermediate value and the abnormal transformation coefficient stdFactor; and the fifth coefficient value (n − inNormW − midNormW) × stdFactor / outF is determined according to the ninth intermediate value and the region transformation coefficient outF corresponding to the sub-conversion region. The sixth coefficient value rP + inW + midW + (n − inNormW − midNormW) × stdFactor / outF is determined according to the fifth coefficient value, the pupil radius rP, the nonlinear iris width inW of the first iris annular region to be used and the nonlinear iris width midW of the second iris annular region to be used. The seventh coefficient value (rP + inW + midW + (n − inNormW − midNormW) × stdFactor / outF) × cosθ is determined according to the sixth coefficient value and the cosine of the target standard polar coordinate angle θ, and the eighth coefficient value (rP + inW + midW + (n − inNormW − midNormW) × stdFactor / outF) × sinθ is determined according to the sixth coefficient value and the sine of the target polar coordinate angle. The abscissa of the target iris position information is determined as x_op + (rP + inW + midW + (n − inNormW − midNormW) × stdFactor / outF) × cosθ according to the seventh coefficient value and the abscissa x_op of the pupil center point, and the ordinate is determined as y_op + (rP + inW + midW + (n − inNormW − midNormW) × stdFactor / outF) × sinθ according to the eighth coefficient value and the ordinate y_op of the pupil center point.
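The three band cases described above can be combined into a single mapping from a point (m, n) of the target conversion area to target Cartesian coordinates. The following Python sketch illustrates the verbal case analysis above; the function name `map_point` and all parameter names are assumptions, not part of the claims:

```python
import math

def map_point(m, n, width, height, x_op, y_op, rP, ip_width,
              inR, midR, outR, inF, midF, outF):
    """Map (m, n) in the target conversion area to target Cartesian (x, y)."""
    theta = m * 2.0 * math.pi / width                  # target standard polar angle
    # segment widths and the iris standard ring width
    inW, midW, outW = ip_width * inR, ip_width * midR, ip_width * outR
    inStdW, midStdW, outStdW = inW * inF, midW * midF, outW * outF
    std_ip_width = inStdW + midStdW + outStdW
    std_factor = std_ip_width / height                 # abnormal transformation coefficient
    inNormW, midNormW = inStdW / std_factor, midStdW / std_factor
    if n < inNormW:                                    # first band, adjacent to the pupil
        radial = rP + n * std_factor / inF
    elif n < inNormW + midNormW:                       # second (transition) band
        radial = rP + inW + (n - inNormW) * std_factor / midF
    else:                                              # third (outer) band
        radial = rP + inW + midW + (n - inNormW - midNormW) * std_factor / outF
    return x_op + radial * math.cos(theta), y_op + radial * math.sin(theta)
```

Note that the radial distance is continuous at the band boundaries: at n = inNormW the first-band expression equals rP + inW, which is exactly where the second band starts, and at n = height it reaches rP + IPWidth, the iris outer edge.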
S360, determining a normalized iris texture image corresponding to the iris region according to the pixel information of the target iris position information corresponding to each position information.
According to the technical scheme of the embodiment of the invention, a target image is acquired and a target to-be-processed area in the target image is determined; a target processing mode for processing the target to-be-processed area is determined according to the area ratio of the pupil area to the iris area in the target to-be-processed area, and the pupil state is determined to be an abnormal state; each position information in the target conversion area is then processed based on the second target processing mode or the third target processing mode. For each position information, a target standard polar coordinate angle of the second target iris polar coordinate is determined according to the first abscissa parameter and the length transformation parameter of the current position information, and a target standard polar coordinate radius of the second target iris polar coordinate is determined according to the second ordinate of the current position information and the abnormal transformation coefficient. Standard target Cartesian coordinates of the second target iris polar coordinate under the target Cartesian coordinate system are determined according to the target standard polar coordinate angle, the target standard polar coordinate radius and the iris standard ring width, and the standard target Cartesian coordinates are taken as the target iris position information. Finally, the normalized iris texture image corresponding to the iris area is determined according to the pixel information of the target iris position information corresponding to each position information. This solves the technical problems in the prior art that iris images under different pupil states are processed in the same processing mode, so that the converted iris texture image differs from the actually acquired iris image, the recognition accuracy is low, the user needs to perform acquisition many times before passing recognition, and the user experience is poor. Corresponding normalization processing of the iris area according to different pupil states is realized, and when the pupil state is an abnormal state, the accuracy of nonlinear normalization of the iris region texture is improved, thereby achieving the technical effect of improving the accuracy of identity recognition.
Example IV
The fourth embodiment of the present invention is a preferred embodiment of each of the above embodiments, and the specific implementation manner is as follows:
let the resolution of the target conversion area after expanding the target to-be-processed area be 360×60 (width=360, height=60). The coordinates of any point in the target transformation area can be noted as (m, n), 0.ltoreq.m < 360,0.ltoreq.n < 60.
When a target image is obtained and iris recognition is performed, the following operations may be performed:
1. Preprocessing the target image, and detecting a target to-be-processed area comprising a pupil area and an iris area in the target image, i.e. detecting the inner circle and the outer circle of the iris texture. The iris region can be seen as consisting of a series of circles. The center of the pupil area (the center of the inner circle) can be marked as P(x_op, y_op) and its radius denoted rP. The center of the target to-be-processed area (the center of the outer circle) is marked as I(x_oi, y_oi) and its radius denoted rI.
2. Performing expansion normalization processing according to the information of the pupil area and the information of the target to-be-processed area. The pupil state, i.e. the dilated state, the normal state or the contracted state, is determined according to the radius of the target to-be-processed area and the radius of the pupil area. Further, a target processing mode for processing the target to-be-processed area is determined according to the pupil state.
If the pupil state is the normal state, the following formulas can be used:

θ = m × 2π / width
x = x_op + (rP + n × IPWidth / height) × cosθ
y = y_op + (rP + n × IPWidth / height) × sinθ

Substituting width = 360 and height = 60 gives θ = m × π / 180, x = x_op + (rP + n × IPWidth / 60) × cosθ and y = y_op + (rP + n × IPWidth / 60) × sinθ. The Cartesian coordinates (x, y) on the iris region corresponding to any point (m, n) in the normalized target conversion area can be derived from these formulas.
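For the normal state, the linear mapping from (m, n) to Cartesian coordinates can be sketched as follows (an illustrative Python sketch assuming the pupil center (x_op, y_op), pupil radius rP and point iris width IPWidth; the function name is an assumption). At n = 0 the mapped point lies on the pupil edge, and at n = height it lies on the iris outer edge:

```python
import math

def map_point_normal(m, n, width, height, x_op, y_op, rP, ip_width):
    # angle from the abscissa of the target conversion area
    theta = m * 2.0 * math.pi / width
    # radius from the ordinate scaled by the normal transformation coefficient IPWidth/height
    r = n * ip_width / height
    return x_op + (rP + r) * math.cos(theta), y_op + (rP + r) * math.sin(theta)
```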
If the pupil state is the dilated state, the iris region is divided into three regions (iris annular regions to be used), namely an inner circle region (the iris annular region to be used adjacent to the pupil), a transition region (the iris annular region to be used adjacent to the first iris annular region to be used), and an outer circle region (the iris annular region to be used adjacent to the second iris annular region to be used). If the ratios of the three regions are inR (0.3), midR (0.4) and outR (0.3), the three iris widths to be determined inW, midW and outW can be obtained. When the pupil dilates, the inner circle region is compressed noticeably and the proportion of its texture features increases, the proportion of the outer circle region features decreases correspondingly, and the proportion of the transition region can be regarded as approximately unchanged, so inF > midF = 1 > outF. Further, the iris region may be expanded and linearly normalized.
When 0 ≤ n < inNormW, the following formulas can be used:

x = x_op + (rP + n × stdFactor / inF) × cosθ
y = y_op + (rP + n × stdFactor / inF) × sinθ

where θ = m × 2π / width; substituting width = 360 and height = 60 gives θ = m × π / 180 and stdFactor = stdIPWidth / 60. Similarly, when inNormW ≤ n < inNormW + midNormW:

x = x_op + (rP + inW + (n − inNormW) × stdFactor / midF) × cosθ
y = y_op + (rP + inW + (n − inNormW) × stdFactor / midF) × sinθ

And when inNormW + midNormW ≤ n < inNormW + midNormW + outNormW:

x = x_op + (rP + inW + midW + (n − inNormW − midNormW) × stdFactor / outF) × cosθ
y = y_op + (rP + inW + midW + (n − inNormW − midNormW) × stdFactor / outF) × sinθ
According to the formulas, the values of Cartesian coordinates (x, y) of the pixel points on the iris region corresponding to any point (m, n) in the normalized target conversion region can be obtained.
If the pupil state is the contracted state, the iris region is divided into three regions, namely an inner circle region, a transition region and an outer circle region. If the ratios of the three regions are inR (0.4), midR (0.3) and outR (0.3), the three iris widths to be determined inW, midW and outW can be obtained. When the pupil contracts, the inner circle region is stretched noticeably and the proportion of its texture features decreases, the proportion of the outer circle region features increases correspondingly, and the proportion of the transition region can be regarded as approximately unchanged, so inF < midF = 1 < outF. Further, the iris region may be expanded and linearly normalized.
When 0 ≤ n < inNormW, the following formulas can be used:

x = x_op + (rP + n × stdFactor / inF) × cosθ
y = y_op + (rP + n × stdFactor / inF) × sinθ

where θ = m × 2π / width; substituting width = 360 and height = 60 gives θ = m × π / 180 and stdFactor = stdIPWidth / 60. Similarly, when inNormW ≤ n < inNormW + midNormW:

x = x_op + (rP + inW + (n − inNormW) × stdFactor / midF) × cosθ
y = y_op + (rP + inW + (n − inNormW) × stdFactor / midF) × sinθ

And when inNormW + midNormW ≤ n < inNormW + midNormW + outNormW:

x = x_op + (rP + inW + midW + (n − inNormW − midNormW) × stdFactor / outF) × cosθ
y = y_op + (rP + inW + midW + (n − inNormW − midNormW) × stdFactor / outF) × sinθ
According to the formulas, the values of Cartesian coordinates (x, y) of the pixel points on the iris region corresponding to any point (m, n) in the normalized target conversion region can be obtained.
3. Performing feature coding on the normalized target conversion area, and performing identity matching and recognition.
According to the technical scheme, the target image is preprocessed, and the target to-be-processed area comprising the pupil area and the iris area in the target image is detected; expansion normalization processing is performed according to the information of the pupil area and the information of the target to-be-processed area; the pupil state is determined according to the radius of the target to-be-processed area and the radius of the pupil area; a target processing mode for processing the target to-be-processed area is determined according to the pupil state, so that the normalized target conversion area is obtained; and feature coding and identity matching recognition are then performed on the normalized target conversion area. This solves the problem that nonlinear normalization of iris textures is inaccurate when images in different pupil states are processed by the same processing mode, improves the accuracy of nonlinear normalization of the iris region texture, and thereby achieves the technical effect of improving the accuracy of identity recognition.
Example five
Fig. 11 is a schematic structural diagram of an iris image processing apparatus according to a fifth embodiment of the present invention, where the apparatus includes: a target pending area determination module 510, a target processing mode determination module 520, a target iris position information determination module 530, and an iris texture image determination module 540.
The target to-be-processed area determining module 510 is configured to acquire a target image, and determine a target to-be-processed area including a pupil area and an iris area in the target image; the target processing mode determining module 520 is configured to determine a target processing mode for processing the target to-be-processed area according to an area ratio of the pupil area to the iris area in the target to-be-processed area; the target iris position information determining module 530 is configured to process each position information in the target conversion area based on the target processing manner, and determine target iris position information of each position information in the target to-be-processed area; the iris texture image determining module 540 is configured to determine a normalized iris texture image corresponding to the iris region according to pixel information of the target iris position information corresponding to each position information.
Optionally, the target to-be-processed area determining module 510 is further configured to acquire a target image including face information of the user based on the image acquisition device; and extracting a target to-be-processed area comprising a pupil area and an iris area from the target image.
Optionally, the target processing manner determining module 520 is further configured to determine a pupil radius of the pupil area and a total radius of the target area to be processed; determining a pupil state of the pupil based on the pupil radius and the total radius; the pupil state includes an dilated state, a normal state, or a contracted state; and determining a target processing mode for processing the target to-be-processed area according to the pupil state.
Optionally, the target processing manner determining module 520 is further configured to, if a ratio between the total radius and the pupil radius is within a first preset range, determine that a pupil state of the pupil information is a contracted state; if the ratio between the total radius and the pupil radius is within a second preset range, the pupil state of the pupil information is a normal state; and if the ratio between the total radius and the pupil radius is within a third preset range, the pupil state of the pupil information is a dilated state.
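A sketch of this three-range classification in Python; the concrete thresholds below are illustrative assumptions only, since the preset ranges are not specified in the text (a large total/pupil ratio means a small pupil, i.e. a contracted state; a small ratio means a large pupil, i.e. a dilated state):

```python
def pupil_state(total_radius, pupil_radius,
                contracted_min=3.0, dilated_max=1.8):
    # contracted_min and dilated_max stand in for the first and third
    # preset ranges; their values here are purely illustrative.
    ratio = total_radius / pupil_radius
    if ratio >= contracted_min:
        return "contracted"   # first preset range
    if ratio <= dilated_max:
        return "dilated"      # third preset range
    return "normal"           # second preset range
```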
Optionally, the target processing mode includes a first target processing mode for the pupil in a normal state; a second target treatment mode in which the pupil is in an enlarged state; and a third target treatment mode in which the pupil is in a contracted state.
Optionally, the apparatus further includes: the iris ring width determining module is used for determining each point to be processed on the pupil area edge line according to the length of the target conversion area and determining a point to be used corresponding to each point to be processed on the iris area edge line; determining the iris width of the point corresponding to each point to be processed according to the position information of each point to be processed and the position information of the corresponding point to be used; and determining the iris ring width of the iris region according to the iris width of each point.
Optionally, the apparatus further includes: the target Cartesian coordinate system establishing module is used for establishing a target Cartesian coordinate system corresponding to the target image and determining the pupil center point position of the pupil area and the iris center point position of the iris area; the corresponding target conversion area is rectangular, and the iris ring width determining module is further used for obtaining a length conversion parameter according to the length and the central angle number of the target conversion area and determining each point to be processed on the edge of the pupil area according to the length conversion parameter; respectively establishing a pupil polar coordinate system by taking the position of the pupil center point as an origin according to the same rule, and establishing an iris polar coordinate system by taking the position of the iris center point of the iris region as the origin; determining angle parameters of all points to be processed under the pupil polar coordinate system, and determining points to be used of all points to be processed on an iris edge line under the iris polar coordinate system according to the angle parameters; and determining the polar coordinates of pupil points of the points to be processed and the polar coordinates of iris points of the corresponding points to be used.
Optionally, the iris ring width determining module is further configured to determine, for each point to be processed, a current cartesian coordinate to be processed of the current point to be processed according to a pupil point polar coordinate of the current point to be processed and the pupil center point position; determining the current Cartesian coordinates of the current points to be used according to the polar coordinates of the iris points of the current points to be used and the positions of the iris center points; and determining the Cartesian coordinates to be processed and the Cartesian coordinates to be used corresponding to the same angle parameters in the polar coordinates, and determining the iris width of the point corresponding to each point to be processed according to the Cartesian coordinates to be processed and the Cartesian coordinates to be used.
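The point iris width of a point to be processed can be sketched as the distance between the pupil edge point and the iris edge point sharing the same angle parameter, as described above (illustrative Python; the function name is an assumption). For concentric circles it reduces to rI − rP:

```python
import math

def point_iris_width(theta, x_op, y_op, rP, x_oi, y_oi, rI):
    # Cartesian coordinates of the pupil edge point at angle theta
    xp = x_op + rP * math.cos(theta)
    yp = y_op + rP * math.sin(theta)
    # Cartesian coordinates of the iris edge point at the same angle
    xi = x_oi + rI * math.cos(theta)
    yi = y_oi + rI * math.sin(theta)
    # point iris width: distance between the two edge points
    return math.hypot(xi - xp, yi - yp)
```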
Optionally, the apparatus further includes: the transformation coefficient determining module is used for determining a normal transformation coefficient according to the width of the iris ring and the height of the target conversion area when the pupil state is determined to be in a normal state; when the pupil state is determined to be an abnormal state, determining an abnormal transformation coefficient according to the iris standard ring width and the height of the target transformation area; the abnormal state includes an enlarged state or a reduced state.
Optionally, the transformation coefficient determining module is further configured to determine a normal transformation coefficient when the pupil state is a normal state according to a ratio between the width of the iris ring and the height of the target transformation area.
Optionally, the target processing mode determining module 520 is further configured to determine that the target processing mode for processing the target to-be-processed area is a first target processing mode if the pupil state is a normal state; correspondingly, the target iris position information determining module 530 is further configured to determine, for each position information, a first target iris polar coordinate in which the current position information is converted into an iris polar coordinate system in the target conversion area, convert the first target iris polar coordinate into a target cartesian coordinate in the target cartesian coordinate system, and use the target cartesian coordinate as the target iris position information corresponding to the current position information.
Optionally, the target iris position information determining module 530 is further configured to determine a target polar coordinate angle of the first target iris polar coordinate according to a first abscissa parameter of the current position information and the length transformation parameter, and determine a target polar coordinate radius of the first target iris polar coordinate according to a first ordinate of the current position information and a normal transformation coefficient; and determining the target Cartesian coordinate of the first target iris polar coordinate under the target Cartesian coordinate system according to the target polar coordinate angle, the target polar coordinate radius and the iris ring width.
Optionally, the target iris position information determining module 530 is further configured to determine, according to the target polar coordinate angle, a pupil cartesian coordinate of a target pupil edge point on the pupil edge line and an iris cartesian coordinate of a target iris edge point on the iris edge line, where the current position information corresponds to the pupil edge point; determining a difference between an iris Cartesian abscissa and the pupil Cartesian abscissa, determining a product of the difference and the target polar radius, determining a first value based on a ratio between the product and the iris annular width, and determining a target Cartesian abscissa in the target Cartesian coordinates according to a sum between the first value and the pupil Cartesian abscissa; determining a difference value between an iris Cartesian ordinate and the pupil Cartesian ordinate, determining a product of the difference value and the target polar radius, determining a second numerical value based on a ratio between the product and the iris annular width, and determining a target Cartesian ordinate in the target Cartesian coordinate according to a sum between the second numerical value and the pupil Cartesian ordinate.
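The radial interpolation just described (target abscissa = pupil abscissa plus the difference between the iris and pupil abscissas scaled by the polar radius over the ring width, and likewise for the ordinate) can be sketched as follows; this is illustrative only, and the point representation is an assumption:

```python
def radial_interpolate(pupil_pt, iris_pt, r, ring_width):
    # Linear interpolation between the pupil-edge point and the iris-edge
    # point at the same polar angle, parameterized by r / ring_width.
    px, py = pupil_pt
    ix, iy = iris_pt
    tx = px + (ix - px) * r / ring_width
    ty = py + (iy - py) * r / ring_width
    return tx, ty
```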
Optionally, the transformation coefficient determining module is further configured to determine the iris standard ring width; and determining an abnormal transformation coefficient according to the ratio of the width of the iris standard circular ring to the height of the target transformation area.
Optionally, the transformation coefficient determining module is further configured to determine a standard pupil radius according to the ratio of the total radius to a preset standard radius; and determining the iris standard circular ring width according to the total radius and the standard pupil radius.
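A non-limiting sketch of the iris standard ring width computation, assuming "ratio of the total radius to a preset standard radius" means dividing the total radius by a preset ratio value (this reading, and the names, are assumptions):

```python
def iris_standard_ring_width(total_radius: float, standard_ratio: float) -> float:
    # Standard pupil radius from the total radius and the preset ratio,
    # then the standard ring width as the remaining annulus.
    standard_pupil_radius = total_radius / standard_ratio
    return total_radius - standard_pupil_radius
```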
Optionally, the target processing mode determining module 520 is further configured to determine that the target processing mode for processing the target to-be-processed area is a second target processing mode or a third target processing mode if the pupil state is an abnormal state; correspondingly, the target iris position information determining module 530 is further configured to determine, for each position information, a target standard polar coordinate angle of a second target iris polar coordinate according to a first abscissa parameter and a length transformation parameter of the current position information, and determine a target standard polar coordinate radius of the second target iris polar coordinate according to a second ordinate of the current position information and the abnormal transformation coefficient; determining a standard target Cartesian coordinate of the second target iris polar coordinate under the target Cartesian coordinate according to the target standard polar coordinate angle, the target standard polar coordinate radius and the iris standard ring width; and taking the standard target Cartesian coordinates as the target iris position information.
Optionally, the target processing mode determining module 520 is further configured to determine, if the pupil state is an enlarged state, that a mode of processing the target to-be-processed area is a second target processing mode, where the second target processing mode includes an area enlargement division coefficient and an area enlargement conversion coefficient; if the pupil state is a contracted state, determining that the mode for processing the target to-be-processed area is a third target processing mode, wherein the third target processing mode comprises an area contracted dividing coefficient and an area contracted conversion coefficient.
Optionally, the target iris position information determining module 530 is further configured to divide the iris widths of the points corresponding to each point to be processed according to the region division coefficient and the region transformation coefficient in the target processing manner, so as to obtain at least three iris annular regions to be used, and determine the non-linear iris region widths of the iris annular regions to be used; wherein the sum of the widths of the nonlinear iris areas is equal to the width of the iris standard circular ring; determining the linear iris width corresponding to each iris area to be used according to the nonlinear iris width corresponding to each iris annular area to be used and the abnormal transformation coefficient; determining the width of a sub-conversion area of each annular area to be used converted into the target conversion area according to the width of the linear iris area of each annular area to be used; determining a sub-conversion area to which the current position information belongs according to the current position information, determining a standard target Cartesian coordinate of the second target iris polar coordinate under the target Cartesian coordinate according to the width of the sub-conversion area, the target standard polar coordinate angle, the current position information, the linear iris area width corresponding to the sub-conversion area, the nonlinear iris width, the pupil radius and the pupil Cartesian coordinate, and taking the standard target Cartesian coordinate as target iris position information of the current position information.
Optionally, the target iris position information determining module 530 is further configured to divide, according to the region division coefficient and the iris ring width, a point iris width corresponding to each point to be processed into at least three sections of iris widths to be determined; processing at least three sections of iris widths to be determined corresponding to the same point to be processed according to the region transformation coefficient to obtain at least three sections of iris widths to be used; and obtaining at least three iris annular areas to be used according to at least three iris widths to be used corresponding to each point to be processed, and determining the width of the nonlinear iris area of the iris annular areas to be used.
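The band division described above can be sketched as follows. This is an illustrative reading, not the claimed implementation: the point iris width is split by the region division coefficients, each band is scaled by its region transformation coefficient, and the result is renormalized so the band widths sum to the iris standard ring width:

```python
def divide_ring(point_iris_width, division_coeffs, transform_coeffs, standard_width):
    # Split one point's iris width into (at least) three bands, rescale each
    # band, then renormalize so the widths sum to the standard ring width.
    raw = [point_iris_width * d for d in division_coeffs]
    scaled = [w * t for w, t in zip(raw, transform_coeffs)]
    total = sum(scaled)
    return [w * standard_width / total for w in scaled]
```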
Optionally, the target iris position information determining module 530 is further configured to determine, if the sub-conversion area is a first iris annular area to be used of a neighboring pupil, a first intermediate value according to an ordinate of the current position information and an abnormal transformation coefficient, and determine a first coefficient value according to the first intermediate value and an area transformation coefficient corresponding to the sub-conversion area; determining a second intermediate value according to the first coefficient value and the pupil radius, and determining a second coefficient value according to the cosine value of the second intermediate value and the target standard polar coordinate angle; determining the abscissa of the target iris position information according to the second coefficient value and the abscissa of the pupil center point position; and determining a third coefficient value according to the second intermediate value and the sine value of the target standard polar coordinate angle, and determining the ordinate of the target iris position information according to the third coefficient value and the ordinate of the pupil center point position.
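One possible reading of the intermediate values for the first band (nearest the pupil) is sketched below; the exact combinations ("according to") are not spelled out in the text, so every formula here is an assumption:

```python
import math

def first_band_point(y, abnormal_coeff, band_transform_coeff,
                     pupil_radius, theta, pupil_center):
    # Assumed reading: ordinate scaled into ring units, corrected by the
    # band's transformation coefficient, offset by the pupil radius, then
    # projected back to image coordinates from the pupil center.
    first_intermediate = y * abnormal_coeff
    first_coeff = first_intermediate / band_transform_coeff
    second_intermediate = pupil_radius + first_coeff
    cx, cy = pupil_center
    return (cx + second_intermediate * math.cos(theta),
            cy + second_intermediate * math.sin(theta))
```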
Optionally, the target iris position information determining module 530 is further configured to determine, if the sub-conversion region is a second iris annular region to be used adjacent to the first iris annular region to be used, a third intermediate value according to an ordinate of the current position information and the linear iris width of the first iris annular region to be used, determine a fourth intermediate value according to the third intermediate value and the abnormal transformation coefficient, and determine a fourth coefficient value according to the fourth intermediate value and the region transformation coefficient corresponding to the sub-conversion region; determine a fifth intermediate value according to the fourth coefficient value, the pupil radius and the nonlinear iris width of the first iris annular region to be used; determine a sixth intermediate value according to the fifth intermediate value and the cosine value of the target standard polar coordinate angle, and determine a seventh intermediate value according to the fifth intermediate value and the sine value of the target standard polar coordinate angle; and determine the abscissa of the target iris position information according to the sixth intermediate value and the abscissa of the pupil center point, and determine the ordinate of the target iris position information according to the seventh intermediate value and the ordinate of the pupil center point.
Optionally, the target iris position information determining module 530 is further configured to determine, if the sub-conversion area is a third iris annular area to be used adjacent to the second iris annular area to be used, an eighth intermediate value according to an ordinate of the current position information, the linear iris width of the first iris annular area to be used, and the linear iris width of the second iris annular area to be used, determine a ninth intermediate value according to the eighth intermediate value and the abnormal transformation coefficient, and determine a fifth coefficient value according to the ninth intermediate value and the area transformation coefficient corresponding to the sub-conversion area; determine a sixth coefficient value according to the fifth coefficient value, the pupil radius, the nonlinear iris width of the first iris annular area to be used and the nonlinear iris width of the second iris annular area to be used; determine a seventh coefficient value according to the sixth coefficient value and the cosine value of the target standard polar coordinate angle, and determine an eighth coefficient value according to the sixth coefficient value and the sine value of the target standard polar coordinate angle; and determine the abscissa of the target iris position information according to the seventh coefficient value and the abscissa of the pupil center point, and determine the ordinate of the target iris position information according to the eighth coefficient value and the ordinate of the pupil center point.
Optionally, the iris texture image determining module 540 is further configured to obtain the pixel value corresponding to each piece of target iris position information, and fill the pixel value into the corresponding position information in the target conversion area.
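The filling step above can be sketched as follows; this is illustrative only, and the mapping representation and nearest-neighbour sampling are assumptions (any interpolation scheme could be substituted):

```python
import numpy as np

def fill_texture(image, mapping, height, width):
    # mapping: (row, col) of the conversion area -> (x, y) source coordinate
    # in the target image; each output cell receives the sampled pixel value.
    out = np.zeros((height, width), dtype=image.dtype)
    for (row, col), (sx, sy) in mapping.items():
        out[row, col] = image[int(round(sy)), int(round(sx))]
    return out
```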
Optionally, the apparatus further includes: the identity recognition module is used for extracting the characteristics in the normalized iris texture image and coding the characteristics to obtain characteristic codes; and matching the characteristic codes with pre-stored iris characteristics to perform identity recognition.
According to the technical scheme of this embodiment, a target to-be-processed area in the target image is determined; a target processing mode for processing the target to-be-processed area is determined according to the area ratio of the pupil area to the iris area in the target to-be-processed area, so that images in different pupil states are processed separately; each piece of position information in the target conversion area is processed based on the target processing mode to determine the target iris position information of that position information in the target to-be-processed area; and the normalized iris texture image corresponding to the iris area is determined according to the pixel information of the target iris position information corresponding to each piece of position information. This solves the technical problems in the prior art that iris images in different pupil states are converted with the same processing mode, so that the converted iris texture image differs from the actually acquired iris texture, the recognition accuracy is low, a user must be captured several times before recognition passes, and the user experience is poor. By performing the corresponding normalization processing on the iris area according to the different pupil states, the accuracy of the nonlinear normalization of the iris area texture is improved, and therefore the accuracy of identity recognition is improved. The iris image processing apparatus provided by the embodiment of the present invention can execute the iris image processing method provided by any embodiment of the present invention, and has the corresponding functional modules and beneficial effects of the executed method.
It should be noted that the units and modules included in the iris image processing apparatus are divided only according to functional logic, and the division is not limited thereto so long as the corresponding functions can be implemented; in addition, the specific names of the functional units are only for distinguishing them from each other, and are not used to limit the protection scope of the embodiments of the present invention.
Example six
Fig. 12 is a schematic diagram of an image processing apparatus for performing an iris image processing method according to a sixth embodiment of the present invention. Fig. 12 shows a block diagram of an exemplary image processing device 60 suitable for use in implementing the embodiments of the present invention. The image processing apparatus 60 shown in fig. 12 is only an example, and should not impose any limitation on the functions and the scope of use of the embodiment of the present invention.
As shown in fig. 12, the image processing device 60 is in the form of a general-purpose computing device. The components of the image processing device 60 may include, but are not limited to: one or more processors or processing units 601, a system memory 602, and a bus 603 that connects the different system components (including the system memory 602 and the processing units 601).
Bus 603 represents one or more of several types of bus structures, including a memory bus or memory controller, a peripheral bus, an accelerated graphics port, a processor, or a local bus using any of a variety of bus architectures. By way of example, and not limitation, such architectures include the Industry Standard Architecture (ISA) bus, the Micro Channel Architecture (MCA) bus, the Enhanced ISA (EISA) bus, the Video Electronics Standards Association (VESA) local bus, and the Peripheral Component Interconnect (PCI) bus.
Image processing device 60 typically includes a variety of computer system readable media. Such media can be any available media that is accessible by image processing device 60 and includes both volatile and non-volatile media, removable and non-removable media.
The system memory 602 may include computer system readable media in the form of volatile memory such as Random Access Memory (RAM) 604 and/or cache memory 605. The image processing device 60 may further include other removable/non-removable, volatile/nonvolatile computer system storage media. By way of example only, storage system 606 may be used to read from and write to non-removable, nonvolatile magnetic media (not shown in FIG. 12, commonly referred to as a "hard disk drive"). Although not shown in fig. 12, a magnetic disk drive for reading from and writing to a removable non-volatile magnetic disk (e.g., a "floppy disk"), and an optical disk drive for reading from or writing to a removable non-volatile optical disk (e.g., a CD-ROM, DVD-ROM, or other optical media) may be provided. In such cases, each drive may be coupled to bus 603 through one or more data medium interfaces. Memory 602 may include at least one program product having a set (e.g., at least one) of program modules configured to carry out the functions of embodiments of the invention.
A program/utility 608 having a set (at least one) of program modules 607 may be stored in, for example, memory 602, such program modules 607 including, but not limited to, an operating system, one or more application programs, other program modules, and program data, each or some combination of which may include an implementation of a network environment. Program modules 607 generally perform the functions and/or methods of the described embodiments of the invention.
The image processing device 60 may also communicate with one or more external devices 609 (e.g., keyboard, pointing device, display 610, etc.), one or more devices that enable a user to interact with the image processing device 60, and/or any device (e.g., network card, modem, etc.) that enables the image processing device 60 to communicate with one or more other computing devices. Such communication may occur through an input/output (I/O) interface 611. Also, the image processing device 60 may communicate with one or more networks such as a Local Area Network (LAN), a Wide Area Network (WAN) and/or a public network, such as the Internet, through a network adapter 612. As shown, the network adapter 612 communicates with other modules of the image processing apparatus 60 over the bus 603. It should be appreciated that although not shown in fig. 12, other hardware and/or software modules may be used in connection with image processing device 60, including, but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, data backup storage systems, and the like.
The processing unit 601 executes various functional applications and data processing by running a program stored in the system memory 602, for example, implementing the iris image processing method provided by the embodiment of the present invention.
Example seven
A seventh embodiment of the present invention also provides a storage medium containing computer-executable instructions, which when executed by a computer processor, are for performing an iris image processing method, the method comprising:
acquiring a target image and determining a target to-be-processed area comprising a pupil area and an iris area in the target image;
determining a target processing mode for processing the target to-be-processed region according to the region ratio of the pupil region to the iris region in the target to-be-processed region;
processing each position information in a target conversion area based on the target processing mode, and determining target iris position information of each position information in the target to-be-processed area;
and determining a normalized iris texture image corresponding to the iris region according to the pixel information of the target iris position information corresponding to each position information.
The computer storage media of embodiments of the invention may take the form of any combination of one or more computer-readable media. The computer readable medium may be a computer readable signal medium or a computer readable storage medium. The computer readable storage medium can be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples (a non-exhaustive list) of the computer-readable storage medium would include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In this document, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
The computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, either in baseband or as part of a carrier wave. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination of the foregoing. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Computer program code for carrying out operations for embodiments of the present invention may be written in any combination of one or more programming languages, including object oriented programming languages such as Java, Smalltalk, and C++, and conventional procedural programming languages such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server. In the latter case, the remote computer may be connected to the user's computer through any kind of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or may be connected to an external computer (for example, through the Internet using an Internet service provider).
Note that the above is only a preferred embodiment of the present invention and the technical principle applied. It will be understood by those skilled in the art that the present invention is not limited to the particular embodiments described herein, but is capable of various obvious changes, rearrangements and substitutions as will now become apparent to those skilled in the art without departing from the scope of the invention. Therefore, while the invention has been described in connection with the above embodiments, the invention is not limited to the embodiments, but may be embodied in many other equivalent forms without departing from the spirit or scope of the invention, which is set forth in the following claims.

Claims (19)

1. An iris image processing method, comprising:
acquiring a target image and determining a target to-be-processed area comprising a pupil area and an iris area in the target image;
determining a target processing mode for processing the target to-be-processed region according to the region ratio of the pupil region to the iris region in the target to-be-processed region;
processing each position information in a target conversion area based on the target processing mode, and determining target iris position information of each position information in the target to-be-processed area;
Determining a normalized iris texture image corresponding to the iris region according to pixel information of target iris position information corresponding to each position information;
the determining a target processing mode for processing the target to-be-processed area according to the area ratio of the pupil area to the iris area in the target to-be-processed area comprises the following steps:
determining the pupil radius of the pupil area and the total radius of the target area to be processed;
determining a pupil state of the pupil based on the pupil radius and the total radius; the pupil state includes a dilated state, a normal state, or a contracted state;
determining a target processing mode for processing the target to-be-processed area according to the pupil state;
the target processing mode comprises a first target processing mode for the pupil in a normal state; a second target processing mode for the pupil in a dilated state; and a third target processing mode for the pupil in a contracted state;
before determining the target processing mode for processing the target to-be-processed area according to the pupil state, the method further comprises:
determining the width of an iris ring of the iris region;
The determining the iris annular width of the iris region includes:
according to the length of the target conversion area, determining each point to be processed on the pupil area edge line, and determining a point to be used corresponding to each point to be processed on the iris area edge line;
determining the iris width of the point corresponding to each point to be processed according to the position information of each point to be processed and the position information of the corresponding point to be used;
and determining the iris ring width of the iris region according to the iris width of each point.
2. The method of claim 1, wherein the acquiring the target image and determining a target area to be processed in the target image that includes a pupil area and an iris area comprises:
acquiring a target image comprising user face information based on an image acquisition device;
and extracting a target to-be-processed area comprising a pupil area and an iris area from the target image.
3. The method of claim 1, wherein the determining the pupil state of the pupil based on the pupil radius and the total radius comprises:
if the ratio between the total radius and the pupil radius is within a first preset range, the pupil state of the pupil is a contracted state;
if the ratio between the total radius and the pupil radius is within a second preset range, the pupil state of the pupil is a normal state;
if the ratio between the total radius and the pupil radius is within a third preset range, the pupil state of the pupil is a dilated state.
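An illustrative, non-limiting sketch of the classification in claim 3; the two threshold values delimiting the three preset ranges are assumptions, not values recited in the claims:

```python
def classify_pupil_state(total_radius, pupil_radius,
                         contracted_min=5.0, normal_min=3.0):
    # Assumed preset ranges over the ratio total_radius / pupil_radius:
    # [contracted_min, inf) -> contracted, [normal_min, contracted_min)
    # -> normal, below normal_min -> dilated.
    ratio = total_radius / pupil_radius
    if ratio >= contracted_min:
        return "contracted"
    if ratio >= normal_min:
        return "normal"
    return "dilated"
```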
4. A method according to claim 3, further comprising, prior to said determining the iris-circle width of the iris region:
establishing a target Cartesian coordinate system corresponding to the target image, and determining the pupil center point position of the pupil area and the iris center point position of the iris area;
correspondingly, the target conversion area is rectangular, and according to the length of the target conversion area, determining each point to be processed on the pupil area edge line, and determining a point to be used corresponding to each point to be processed on the iris area edge line, including:
obtaining length transformation parameters according to the length and central angle degrees of the target transformation area, and determining each point to be processed on the edge of the pupil area according to the length transformation parameters;
respectively establishing a pupil polar coordinate system by taking the position of the pupil center point as an origin according to the same rule, and establishing an iris polar coordinate system by taking the position of the iris center point of the iris region as the origin;
Determining angle parameters of all points to be processed under the pupil polar coordinate system, and determining points to be used of all points to be processed on an iris edge line under the iris polar coordinate system according to the angle parameters;
and determining the polar coordinates of pupil points of the points to be processed and the polar coordinates of iris points of the corresponding points to be used.
5. The method of claim 4, wherein determining the iris width of the point corresponding to each point to be processed according to the position information of each point to be processed and the position information of the corresponding point to be used comprises:
determining the current Cartesian coordinates to be processed of the current points to be processed according to the pupil point polar coordinates of the current points to be processed and the pupil center point positions;
determining the current Cartesian coordinates of the current points to be used according to the polar coordinates of the iris points of the current points to be used and the positions of the iris center points;
and determining the Cartesian coordinates to be processed and the Cartesian coordinates to be used corresponding to the same angle parameters in the polar coordinates, and determining the iris width of the point corresponding to each point to be processed according to the Cartesian coordinates to be processed and the Cartesian coordinates to be used.
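The point iris width of claim 5, taken between the Cartesian coordinates to be processed and the Cartesian coordinates to be used at the same angle parameter, reduces to a Euclidean distance; a minimal non-limiting sketch:

```python
import math

def point_iris_width(pupil_pt, iris_pt):
    # Distance between a pupil-edge point and its iris-edge counterpart
    # at the same polar angle.
    return math.dist(pupil_pt, iris_pt)
```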
6. The method of claim 4, further comprising, prior to said determining a target processing mode for processing said target to-be-processed area according to said pupil state:
when the pupil state is determined to be in a normal state, determining a normal transformation coefficient according to the width of the iris ring and the height of the target transformation area;
when the pupil state is determined to be an abnormal state, determining an abnormal transformation coefficient according to the iris standard ring width and the height of the target transformation area; the abnormal state includes an enlarged state or a reduced state.
7. The method of claim 6, wherein determining a normal transform coefficient based on the iris-circle width and the height of the target transition region when determining that the pupil state is a normal state comprises:
and determining a normal transformation coefficient when the pupil state is in a normal state according to the ratio between the width of the iris ring and the height of the target transformation area.
8. The method of claim 7, wherein determining a target processing mode for processing the target to-be-processed area according to the pupil state comprises:
If the pupil state is a normal state, determining that a target processing mode for processing the target to-be-processed area is a first target processing mode;
correspondingly, the processing each piece of position information in the target conversion area based on the target processing mode, and determining the target iris position information of each piece of position information in the target to-be-processed area, includes:
for each piece of position information, determining that the current position information in a target conversion area is converted into a first target iris polar coordinate in an iris polar coordinate system, converting the first target iris polar coordinate into a target Cartesian coordinate in the target Cartesian coordinate system, and taking the target Cartesian coordinate as target iris position information corresponding to the current position information.
9. The method of claim 6, wherein the determining an abnormal transformation coefficient based on an iris standard annular width and a height of the target transition region when determining that the pupil state is an abnormal state comprises:
determining the width of the iris standard ring;
and determining an abnormal transformation coefficient according to the ratio of the width of the iris standard circular ring to the height of the target transformation area.
10. The method of claim 9, wherein said determining the iris standard annular width comprises:
determining a standard pupil radius according to the ratio of the total radius to a preset standard radius;
and determining the iris standard circular ring width according to the total radius and the standard pupil radius.
11. The method of claim 10, wherein determining a target processing mode for processing the target to-be-processed area according to the pupil state comprises:
if the pupil state is an abnormal state, determining that the target processing mode for processing the target to-be-processed area is a second target processing mode or a third target processing mode;
correspondingly, the processing each piece of position information in the target conversion area based on the target processing mode, and determining the target iris position information of each piece of position information in the target to-be-processed area, includes:
for each piece of position information, determining a target standard polar coordinate angle of a second target iris polar coordinate according to a first abscissa parameter and a length transformation parameter of the current position information, and determining a target standard polar coordinate radius of the second target iris polar coordinate according to a second ordinate of the current position information and the abnormal transformation coefficient;
determining a standard target Cartesian coordinate of the second target iris polar coordinate in the target Cartesian coordinate system according to the target standard polar coordinate angle, the target standard polar coordinate radius and the iris standard ring width;
and taking the standard target Cartesian coordinates as the target iris position information.
12. The method of claim 11, wherein the determining that the target processing mode for processing the target to-be-processed area is a second target processing mode or a third target processing mode comprises:
if the pupil state is a dilated state, determining that the mode for processing the target to-be-processed area is the second target processing mode, wherein the second target processing mode comprises a region dilation division coefficient and a region dilation transformation coefficient;
if the pupil state is a contracted state, determining that the mode for processing the target to-be-processed area is the third target processing mode, wherein the third target processing mode comprises a region contraction division coefficient and a region contraction transformation coefficient.
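The branch in claim 12 can be sketched as a threshold test on the pupil-to-total radius ratio; the threshold values below are illustrative, since this claim does not fix the dilation and contraction criteria:

```python
def pupil_state(pupil_radius, total_radius, contract_thresh=0.2, dilate_thresh=0.6):
    """Classify the pupil state from the pupil-to-total radius ratio.

    The two thresholds are illustrative placeholders; the patent does
    not fix their values in this claim."""
    ratio = pupil_radius / total_radius
    if ratio < contract_thresh:
        return "contracted"   # -> third target processing mode
    if ratio > dilate_thresh:
        return "dilated"      # -> second target processing mode
    return "normal"           # -> first target processing mode
```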
13. The method of claim 12, wherein the determining a standard target Cartesian coordinate of the second target iris polar coordinate in the target Cartesian coordinate system according to the target standard polar coordinate angle, the target standard polar coordinate radius and the iris standard ring width comprises:
dividing the iris width of the point corresponding to each point to be processed according to the region division coefficient and the region transformation coefficient in the target processing mode to obtain at least three iris annular regions to be used, and determining the nonlinear iris region width of each iris annular region to be used, wherein the sum of the nonlinear iris region widths is equal to the iris standard ring width;
determining the linear iris region width corresponding to each iris annular region to be used according to its nonlinear iris region width and the abnormal transformation coefficient;
determining the width of the sub-conversion area into which each annular region to be used is converted in the target conversion area according to the linear iris region width of each annular region to be used;
determining the sub-conversion area to which the current position information belongs according to the current position information, determining the standard target Cartesian coordinate of the second target iris polar coordinate in the target Cartesian coordinate system according to the width of that sub-conversion area, the target standard polar coordinate angle, the current position information, the linear iris region width and nonlinear iris region width corresponding to the sub-conversion area, the pupil radius and the pupil Cartesian coordinate, and taking the standard target Cartesian coordinate as the target iris position information of the current position information.
14. The method of claim 13, wherein the dividing the iris width of the point corresponding to each point to be processed according to the region division coefficient and the region transformation coefficient in the target processing mode to obtain at least three iris annular regions to be used, and determining the nonlinear iris region width of each iris annular region to be used, comprises:
dividing the iris width of the point corresponding to each point to be processed into at least three segments of iris width to be determined according to the region division coefficient and the iris ring width;
processing the at least three segments of iris width to be determined corresponding to the same point to be processed according to the region transformation coefficient to obtain at least three segments of iris width to be used;
and obtaining the at least three iris annular regions to be used according to the at least three segments of iris width to be used corresponding to each point to be processed, and determining the nonlinear iris region width of each iris annular region to be used.
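The division step in claim 14 can be sketched as: split one radial iris width into at least three pending segments using division coefficients, then scale each segment by its transformation coefficient. Both coefficient lists are illustrative placeholders; in the patent they come from the chosen processing mode:

```python
def divide_iris_width(iris_width, division_coeffs, transform_coeffs):
    """Split one radial iris width into at least three pending segments
    (division coefficients summing to 1), then scale each segment by its
    transformation coefficient.

    Sketch of claim 14; both coefficient lists are illustrative
    placeholders, chosen in the patent by the processing mode."""
    assert len(division_coeffs) == len(transform_coeffs) >= 3
    assert abs(sum(division_coeffs) - 1.0) < 1e-9
    pending = [iris_width * d for d in division_coeffs]        # widths to be determined
    return [w * t for w, t in zip(pending, transform_coeffs)]  # widths to be used
```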
15. The method of claim 1, wherein determining a normalized iris texture image corresponding to the iris region based on pixel information of target iris position information corresponding to each position information, comprises:
acquiring the pixel value corresponding to each piece of target iris position information, and filling the pixel values into the corresponding positions in the target conversion area.
16. The method according to claim 1, wherein after the determining of the normalized iris texture image corresponding to the iris region based on the pixel information of the target iris position information corresponding to each position information, the method further comprises:
extracting features in the normalized iris texture image and coding to obtain feature codes;
and matching the characteristic codes with pre-stored iris characteristics to perform identity recognition.
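Claim 16 leaves the encoder and matcher open; a conventional choice in iris recognition is a binary feature code compared by normalized Hamming distance. A sketch under that assumption (the encoder, the stored-code format, and the threshold are all illustrative):

```python
def hamming_match(code, stored, threshold=0.32):
    """Match a binary feature code against pre-stored (identity, code)
    pairs by normalized Hamming distance.

    The patent does not fix the encoder or matcher; Hamming distance on
    binary codes is the conventional choice, and the threshold here is
    illustrative."""
    def dist(a, b):
        return sum(x != y for x, y in zip(a, b)) / len(a)
    identity, best_code = min(stored, key=lambda pair: dist(code, pair[1]))
    return identity if dist(code, best_code) <= threshold else None
```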
17. An iris image processing apparatus, comprising:
the target to-be-processed area determining module is used for acquiring a target image and determining a target to-be-processed area comprising a pupil area and an iris area in the target image;
the target processing mode determining module is used for determining a target processing mode for processing the target to-be-processed area according to the area ratio of the pupil area to the iris area in the target to-be-processed area;
the target iris position information determining module is used for processing each position information in the target conversion area based on the target processing mode and determining target iris position information of each position information in the target to-be-processed area;
the iris texture image determining module is used for determining a normalized iris texture image corresponding to the iris region according to pixel information of target iris position information corresponding to each position information;
the target processing mode determining module is further used for determining the pupil radius of the pupil area and the total radius of the target area to be processed; determining a pupil state of the pupil based on the pupil radius and the total radius, the pupil state including a dilated state, a normal state, or a contracted state; and determining a target processing mode for processing the target to-be-processed area according to the pupil state;
the target processing modes comprise a first target processing mode for a pupil in the normal state, a second target processing mode for a pupil in the dilated state, and a third target processing mode for a pupil in the contracted state;
the apparatus further comprises: the iris ring width determining module is used for determining each point to be processed on the pupil area edge line according to the length of the target conversion area and determining a point to be used corresponding to each point to be processed on the iris area edge line; determining the iris width of the point corresponding to each point to be processed according to the position information of each point to be processed and the position information of the corresponding point to be used; and determining the iris ring width of the iris region according to the iris width of each point.
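The ring-width module above pairs each pupil-edge point with a corresponding iris-edge point and takes the point-wise distance as the local iris width. Averaging over all points is one plausible reading of "determining the iris ring width according to the iris width of each point"; the averaging itself is our assumption:

```python
import math

def point_iris_width(pupil_point, iris_point):
    """Per-point iris width: distance between a pupil-edge point and its
    corresponding point on the iris-edge line."""
    return math.hypot(iris_point[0] - pupil_point[0],
                      iris_point[1] - pupil_point[1])

def iris_ring_width(pupil_points, iris_points):
    """One plausible reading of the ring-width determination: the mean of
    the per-point widths (the averaging is an assumption, not stated in
    the claim)."""
    widths = [point_iris_width(p, q) for p, q in zip(pupil_points, iris_points)]
    return sum(widths) / len(widths)
```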
18. An image processing apparatus that performs the iris image processing method, characterized in that the image processing apparatus comprises:
one or more processors;
a storage means for storing one or more programs;
the camera device is used for collecting facial images of the user;
the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the iris image processing method as claimed in any one of claims 1 to 16.
19. A storage medium containing computer executable instructions for performing the iris image processing method as claimed in any one of claims 1 to 16 when executed by a computer processor.
CN202110257732.9A 2021-03-09 2021-03-09 Iris image processing method, device, equipment and storage medium Active CN112949518B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110257732.9A CN112949518B (en) 2021-03-09 2021-03-09 Iris image processing method, device, equipment and storage medium


Publications (2)

Publication Number Publication Date
CN112949518A CN112949518A (en) 2021-06-11
CN112949518B true CN112949518B (en) 2024-04-05

Family

ID=76228604

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110257732.9A Active CN112949518B (en) 2021-03-09 2021-03-09 Iris image processing method, device, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN112949518B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114758407B (en) * 2022-06-17 2022-09-20 慧眼识真(北京)电子科技有限公司 Iris visual angle correction method based on affine transformation

Citations (10)

Publication number Priority date Publication date Assignee Title
CN101404059A (en) * 2008-09-24 2009-04-08 中国科学院自动化研究所 Iris image database synthesis method based on block texture sampling
CN103544420A (en) * 2013-08-15 2014-01-29 马建 Anti-fake iris identity authentication method used for intelligent glasses
CN104484649A (en) * 2014-11-27 2015-04-01 北京天诚盛业科技有限公司 Method and device for identifying irises
CN107506754A (en) * 2017-09-19 2017-12-22 厦门中控智慧信息技术有限公司 Iris identification method, device and terminal device
CN108288053A (en) * 2018-03-01 2018-07-17 武汉轻工大学 A kind of method, apparatus and computer readable storage medium of the processing of iris image
CN108288052A (en) * 2018-03-01 2018-07-17 武汉轻工大学 Iris image method for normalizing, device and computer readable storage medium
CN108629262A (en) * 2017-03-18 2018-10-09 上海荆虹电子科技有限公司 Iris identification method and related device
CN110309774A (en) * 2019-06-28 2019-10-08 京东数字科技控股有限公司 Iris segmentation method, apparatus, storage medium and electronic equipment
CN112001244A (en) * 2020-07-17 2020-11-27 公安部物证鉴定中心 Computer-aided iris comparison method and device
CN112287872A (en) * 2020-11-12 2021-01-29 北京建筑大学 Iris image segmentation, positioning and normalization method based on multitask neural network

Family Cites Families (3)

Publication number Priority date Publication date Assignee Title
US8639058B2 (en) * 2011-04-28 2014-01-28 Sri International Method of generating a normalized digital image of an iris of an eye
US8755607B2 (en) * 2011-04-28 2014-06-17 Sri International Method of normalizing a digital image of an iris of an eye
FR3037422B1 (en) * 2015-06-15 2017-06-23 Morpho METHOD FOR IDENTIFYING AND / OR AUTHENTICATING AN INDIVIDUAL BY RECOGNIZING IRIS


Non-Patent Citations (6)

Title
A biomechanical approach to iris normalization;Inmaculada Tomeo-Reyes et al.;《2015 International Conference on Biometrics (ICB)》;1-8 *
A Non-linear Normalization Model for Iris Recognition;Xiaoyan Yuan et al.;《IWBRS 2005: Advances in Biometric Person Authentication》;135-141 *
Research on iris-based identity recognition algorithms;Liu Ting;《China Masters' Theses Full-text Database, Information Science and Technology》;I138-1157 *
Research on iris encoding and matching algorithms;Gu Jianqing;《China Masters' Theses Full-text Database, Information Science and Technology》;I138-593 *
Research and optimization of iris recognition algorithms;Zhou Xiaoyu;《China Masters' Theses Full-text Database, Information Science and Technology》;I138-709 *
Research on iris recognition algorithms;Zhao Langyue;《China Masters' Theses Full-text Database, Information Science and Technology》;I138-2143 *


Similar Documents

Publication Publication Date Title
US10699103B2 (en) Living body detecting method and apparatus, device and storage medium
CN108509915B (en) Method and device for generating face recognition model
US10769423B2 (en) Method, system and terminal for identity authentication, and computer readable storage medium
US10740636B2 (en) Method, system and terminal for identity authentication, and computer readable storage medium
WO2019223102A1 (en) Method and apparatus for checking validity of identity, terminal device and medium
CN105917353B (en) Feature extraction and matching for biological identification and template renewal
CN110569756B (en) Face recognition model construction method, recognition method, device and storage medium
CN109858384B (en) Face image capturing method, computer readable storage medium and terminal device
CN110728234A (en) Driver face recognition method, system, device and medium
CN110852310B (en) Three-dimensional face recognition method and device, terminal equipment and computer readable medium
WO2017106996A1 (en) Human facial recognition method and human facial recognition device
CN111695462B (en) Face recognition method, device, storage medium and server
CN110084238B (en) Finger vein image segmentation method and device based on LadderNet network and storage medium
CN107408195B (en) Iris identification method and device
CN112528866A (en) Cross-modal face recognition method, device, equipment and storage medium
CN102254188A (en) Palmprint recognizing method and device
KR102329128B1 (en) An adaptive quantization method for iris image encoding
CN112507897A (en) Cross-modal face recognition method, device, equipment and storage medium
CN109325472B (en) Face living body detection method based on depth information
CN112949518B (en) Iris image processing method, device, equipment and storage medium
KR102558736B1 (en) Method and apparatus for recognizing finger print
Yang et al. $\alpha $-Trimmed Weber Representation and Cross Section Asymmetrical Coding for Human Identification Using Finger Images
US11036968B2 (en) Method and apparatus for pattern recognition
CN113228105A (en) Image processing method and device and electronic equipment
CN113569707A (en) Living body detection method, living body detection device, electronic apparatus, and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant