CN111428654B - Iris recognition method, iris recognition device and storage medium - Google Patents

Info

Publication number: CN111428654B (granted publication of application CN202010228395.6A; published earlier as CN111428654A)
Authority: CN (China)
Prior art keywords: world coordinate, infrared camera, camera, face image, target face
Legal status: Active (the legal status is an assumption and is not a legal conclusion)
Other languages: Chinese (zh)
Inventors: 张小亮, 王秀贞, 戚纪纲, 杨占金, and one inventor whose name is withheld at the inventor's request
Assignee (original and current): Beijing Superred Technology Co Ltd
Application filed by Beijing Superred Technology Co Ltd, with priority to CN202010228395.6A

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00: Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10: Human or animal bodies, e.g. vehicle occupants or pedestrians; body parts, e.g. hands
    • G06V40/18: Eye characteristics, e.g. of the iris
    • G06V40/19: Sensors therefor

Abstract

The present disclosure relates to an iris recognition method, apparatus and storage medium. The iris recognition method is applied to an electronic device that includes a rotatable image pickup device, and includes: acquiring world coordinate information of a target face image captured by the image pickup device; calculating a rotation angle of the image pickup device according to the world coordinate information, and controlling the image pickup device to move to a designated position according to the rotation angle, wherein the designated position is a position at which the target face image falls within the effective imaging range of the image pickup device; and acquiring iris information of the target face image and performing iris recognition based on the iris information. Through the face detection and camera positioning techniques of the present disclosure, iris information can be acquired at different distances while high-precision positioning and recognition are provided, and iris information can be acquired and recognized even at long distances or under interference from the external environment.

Description

Iris recognition method, iris recognition device and storage medium
Technical Field
The present disclosure relates to the field of camera positioning, target detection, and iris recognition, and in particular, to an iris recognition method, apparatus, and storage medium.
Background
With the development of society, some important and even confidential places impose high requirements on the accuracy, stability and applicability of identity recognition. Among the related arts, iris recognition technology has received a great deal of attention from the industry. Iris recognition is a biometric technology: the iris has rich detail and texture characteristics, is unique and stable over time, can be acquired without contact, and benefits from the eye's inherent physiological isolation and protection. It therefore offers high stability and security.
However, current iris recognition technology is based on iris texture characteristics, and existing iris recognition devices collect iris information at short range, which is a limitation compared with collecting iris information at long range. Moreover, when the iris image is acquired during recognition, the iris texture may be unclear and unrecognizable, and the acquisition process is easily disturbed by the external environment.
Disclosure of Invention
To overcome the problems in the related art, the present disclosure provides an iris recognition method, apparatus, and storage medium.
According to a first aspect of embodiments of the present disclosure, there is provided an iris recognition method applied to an electronic device including a rotatable image pickup apparatus, the method including:
acquiring world coordinate information of a target face image captured by the image pickup device;
calculating the rotation angle of the image pickup device according to the world coordinate information, and controlling the image pickup device to be positioned to a designated position according to the rotation angle, wherein the designated position is a position at which the target face image falls within the effective imaging range of the image pickup device;
and acquiring iris information of the target face image, and performing iris recognition based on the iris information.
In yet another embodiment, the camera device comprises a binocular wide angle camera;
the obtaining world coordinate information of the target face image shot by the camera device comprises the following steps:
acquiring parameters of a binocular wide-angle camera in the image pickup device;
and determining world coordinates of the target face image according to the parameters.
In yet another embodiment, determining world coordinates of the target face image according to the parameters further includes:
and carrying out error conversion on the world coordinates of the target face image.
In yet another embodiment, the imaging device includes a binocular wide angle camera and an infrared camera, the method further comprising:
acquiring world coordinate information of an infrared camera and world coordinate information of a binocular wide-angle camera in the camera device;
the calculating the rotation angle of the image capturing device according to the world coordinate information includes:
calculating a first moving direction and a first moving distance of the infrared camera according to the world coordinate information of the binocular wide-angle camera and the world coordinate information of the target face image to obtain a world coordinate first moving coordinate of the infrared camera moving from the world coordinate to a designated position coordinate;
and calculating the rotation angle of the infrared camera according to the world coordinate first moving coordinate of the infrared camera and the world coordinate of the target face image.
In yet another embodiment, the calculating the first moving direction and the first moving distance of the infrared camera according to the world coordinate information of the binocular wide angle camera and the world coordinate information of the target face image includes:
Forming a straight line according to the world coordinates of the binocular wide-angle camera and the world coordinates of the target face image;
selecting a plane parallel to the y axis of the world coordinate in the plane of the straight line;
the direction from the world coordinate of the infrared camera to the plane parallel to the world coordinate y-axis is the first moving direction of the world coordinate of the infrared camera, the distance from the world coordinate of the infrared camera to the plane parallel to the world coordinate y-axis is the first moving distance, and the intersection point from the world coordinate of the infrared camera to the plane parallel to the world coordinate y-axis is the first moving coordinate of the world coordinate of the infrared camera.
In yet another embodiment, the calculating the rotation angle of the infrared camera according to the world coordinate first moving coordinate of the infrared camera and the world coordinate of the target face image includes:
acquiring the depth of field distance of the infrared camera and the distance between the first world coordinate of the infrared camera and the world coordinate of the target face image;
if the distance between the first world coordinate of the infrared camera and the world coordinate of the target face image is smaller than or equal to the depth of field distance, the rotation angle of the camera device is directly calculated.
In yet another embodiment, the calculating the rotation angle of the infrared camera according to the world coordinate first moving coordinate of the infrared camera and the world coordinate of the target face image further includes:
if the distance between the first world coordinate of the infrared camera and the world coordinate of the target face image is larger than the depth of field distance, calculating a second moving direction and a second moving distance of the infrared camera according to the first moving coordinate of the world coordinate of the infrared camera and the world coordinate of the target face image;
and calculating the rotation angle of the image pickup device according to the second moving direction and the second moving distance of the infrared camera.
In yet another embodiment, the infrared camera includes an optical axis;
the initial state of the optical axis of the infrared camera is parallel to the z-axis direction of the world coordinates.
In yet another embodiment, the angle between the optical axis of the infrared camera and the boundary of the field of view of the infrared camera is a preset angle.
In yet another embodiment, after performing iris recognition based on the iris information, the method further includes:
and controlling the optical axis of the image pickup device to rotate to an initial state.
According to a second aspect of embodiments of the present disclosure, there is provided an iris recognition apparatus applied to an electronic device including a rotatable image pickup apparatus, the apparatus including:
The acquisition module is used for: world coordinate information of a target face image shot by the camera device is acquired;
and (3) a rotation module: the method comprises the steps of calculating the rotation angle of the image pickup device according to world coordinate information, and controlling the image pickup device to be positioned to a designated position according to the rotation angle, wherein the designated position is a position corresponding to an effective imaging range of the image pickup device of the target face image;
and an identification module: and the iris recognition module is used for acquiring iris information of the target face image and carrying out iris recognition based on the iris information.
In one embodiment, the imaging device comprises a binocular wide angle camera;
the acquisition module comprises: an acquisition unit and a determination unit;
the acquisition unit is used for acquiring parameters of the binocular wide-angle camera in the image pickup device;
and the determining unit is used for determining world coordinates of the target face image according to the parameters.
In one embodiment, the determining unit further comprises: a conversion unit;
and the conversion unit is used for carrying out error conversion on the world coordinates of the target face image.
In one embodiment, the camera device comprises a binocular wide angle camera and an infrared camera, the acquisition module is further configured to,
Acquiring world coordinate information of an infrared camera and world coordinate information of a binocular wide-angle camera in the camera device;
the rotation module includes: a calculation unit and a rotation unit;
the computing unit is used for computing a first moving direction and a first moving distance of the infrared camera according to the world coordinate information of the binocular wide-angle camera and the world coordinate information of the target face image, and obtaining a world coordinate first moving coordinate of the infrared camera moving from the world coordinate to a designated position coordinate;
the rotating unit is used for calculating the rotating angle of the infrared camera according to the world coordinate first moving coordinate of the infrared camera and the world coordinate of the target face image.
In one embodiment, the computing unit includes: forming a subunit and selecting the subunit;
the composing subunit is used for composing a straight line according to the world coordinates of the binocular wide-angle camera and the world coordinates of the target face image;
the selecting subunit is used for selecting a plane parallel to the y axis of the world coordinate in the plane where the straight line is located;
the direction from the world coordinate of the infrared camera to the plane parallel to the world coordinate y-axis is the first moving direction of the world coordinate of the infrared camera, the distance from the world coordinate of the infrared camera to the plane parallel to the world coordinate y-axis is the first moving distance, and the intersection point from the world coordinate of the infrared camera to the plane parallel to the world coordinate y-axis is the first moving coordinate of the world coordinate of the infrared camera.
In one embodiment, the rotation unit is further adapted to,
acquiring the depth of field distance of the infrared camera and the distance between the first world coordinate of the infrared camera and the world coordinate of the target face image;
if the distance between the first world coordinate of the infrared camera and the world coordinate of the target face image is smaller than or equal to the depth of field distance, the rotation angle of the camera device is directly calculated.
In one embodiment, the rotation unit is further configured to:
if the distance between the first world coordinate of the infrared camera and the world coordinate of the target face image is larger than the depth of field distance, calculating a second moving direction and a second moving distance of the infrared camera according to the first moving coordinate of the world coordinate of the infrared camera and the world coordinate of the target face image;
and calculating the rotation angle of the image pickup device according to the second moving direction and the second moving distance of the infrared camera.
In one embodiment, the apparatus further comprises an infrared camera, the infrared camera comprising an optical axis;
the initial state of the optical axis of the infrared camera is parallel to the z-axis direction of the world coordinates.
In one embodiment, the included angle between the optical axis of the infrared camera and the boundary of the field of view of the infrared camera is a preset included angle.
In one embodiment, the apparatus further comprises: an initial module;
the initial module is used for controlling the optical axis of the image pickup device to rotate to an initial state.
According to a third aspect of embodiments of the present disclosure, there is provided an iris recognition apparatus including:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to: the iris recognition method of the first aspect or any implementation manner of the first aspect is performed.
According to a fourth aspect of embodiments of the present disclosure, there is provided a non-transitory computer readable storage medium storing instructions which, when executed by a processor of an electronic device, cause the electronic device to perform the iris recognition method of the first aspect or any implementation of the first aspect.
The technical scheme provided by the embodiment of the disclosure can comprise the following beneficial effects: by the positioning technology of the face detection and camera device, iris information acquisition at different distances can be realized while high-precision positioning identification is provided, and even under the condition of long-distance even external environment interference, iris information can be identified.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the disclosure and together with the description, serve to explain the principles of the disclosure.
Fig. 1 is a simplified diagram illustrating a structure of an infrared camera in an iris recognition method according to an exemplary embodiment.
Fig. 2 is a flowchart illustrating an iris recognition method according to an exemplary embodiment.
Fig. 3 is a flowchart illustrating yet another iris recognition method according to an exemplary embodiment.
Fig. 4 is a simplified visual illustration of an iris recognition method according to an exemplary embodiment.
Fig. 5 is a flowchart illustrating yet another iris recognition method according to an exemplary embodiment.
Fig. 6 is a schematic view showing an infrared camera rotation angle in an iris recognition method according to an exemplary embodiment.
Fig. 7 is a flowchart illustrating yet another iris recognition method according to an exemplary embodiment.
Fig. 8 is a flowchart illustrating yet another iris recognition method according to an exemplary embodiment.
Fig. 9 is a flowchart illustrating yet another iris recognition method according to an exemplary embodiment.
Fig. 10 is a visual operation diagram when a target object is not within an effective imaging range of an infrared camera in the iris recognition method according to an exemplary embodiment.
Fig. 11 is a block diagram illustrating a structure of an iris recognition apparatus according to an exemplary embodiment.
Fig. 12 is a block diagram illustrating a structure of an iris recognition apparatus according to an exemplary embodiment.
Fig. 13 is a block diagram showing a structure of an iris recognition apparatus according to an exemplary embodiment.
Fig. 14 is a block diagram showing a structure of an iris recognition apparatus according to an exemplary embodiment.
Fig. 15 is a block diagram showing a structure of an iris recognition apparatus according to an exemplary embodiment.
Fig. 16 is a block diagram illustrating a structure of still another iris recognition apparatus according to an exemplary embodiment.
Fig. 17 is a block diagram illustrating an iris recognition method apparatus according to an exemplary embodiment.
Fig. 18 is a block diagram illustrating an iris recognition method apparatus according to an exemplary embodiment.
Detailed Description
Reference will now be made in detail to exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, the same numbers in different drawings refer to the same or similar elements, unless otherwise indicated. The implementations described in the following exemplary examples are not representative of all implementations consistent with the present disclosure. Rather, they are merely examples of apparatus and methods consistent with some aspects of the present disclosure as detailed in the accompanying claims.
The iris recognition method is applied to iris recognition devices in electronic equipment, such as mobile phones, computers, access control systems and the like.
The electronic equipment at least comprises an image pickup device, wherein the image pickup device comprises a binocular camera and an infrared camera. The binocular camera may be a binocular wide-angle camera. The image pickup device can rotate freely.
In the initial state, the optical axis of the infrared camera is parallel to the z-axis direction of the world coordinate system and is perpendicular to the base of the image pickup device. In the present disclosure, the base of the image pickup device always remains perpendicular to the z-axis of the world coordinate system, and the pointing direction of the base is kept unchanged during rotation. The preset included angle between the optical axis of the infrared camera and the field-of-view boundary of the infrared camera is denoted by the symbol θ, as shown in fig. 1.
Fig. 2 is a flowchart illustrating an iris recognition method according to an exemplary embodiment. As shown in fig. 2, the iris recognition method may include steps S21 to S23, applied to an electronic apparatus including a rotatable image pickup device:
step S21, acquiring world coordinate information of a target face image captured by the imaging device.
World coordinates are the absolute coordinate system of the camera system. In this disclosure, the world coordinate system is anchored at the binocular camera: in binocular vision, its origin is fixed at the left camera, the right camera, or the midpoint between them along the x-axis, and is represented as D(x_d, y_d, z_d).
In a specific embodiment, a binocular wide-angle camera in the electronic equipment is combined with OpenCV (an open-source, cross-platform computer vision library released under the BSD license) to detect the target face and obtain the detected target face image.
A central pixel point of the acquired target face image is selected, and the world coordinates P(x_p, y_p, z_p) of the target face are obtained by combining the calibration parameters of the binocular camera.
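As a minimal sketch of the step above, the center pixel of a detected face can be taken from its bounding box. The function name and the bounding-box representation below are illustrative assumptions, not part of the patent:

```python
# Hypothetical helper: given a face bounding box (x, y, w, h) from a
# detector (e.g. one of OpenCV's face detectors), pick its center pixel,
# which is the point whose world coordinates are then computed.

def face_center(x: int, y: int, w: int, h: int) -> tuple:
    """Return the center pixel (u', v') of a face bounding box."""
    return (x + w // 2, y + h // 2)

print(face_center(100, 60, 80, 80))  # center of an 80x80 box at (100, 60)
```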
Step S22, calculating the rotation angle of the image pickup device according to the world coordinate information, and controlling the image pickup device to be positioned to a specified position according to the rotation angle.
The designated position is a position at which the target face image falls within the effective imaging range of the image pickup device.
Specifically, the infrared camera in the image pickup device uses the calibrated world coordinates P(x_p, y_p, z_p) of the target face and its own world coordinate point I(x_i, y_i, z_i), and the infrared camera is moved and rotated so that it is aimed at the face.
Step S23, iris information of the target face image is obtained, and iris recognition is performed based on the iris information.
After the infrared camera is aligned to the face, if the target object is in the effective range of the infrared camera, iris information of the target is directly acquired.
If the target object is not within the effective range of the infrared camera, the moving track of the infrared camera is calculated according to the world coordinate point I(x_i, y_i, z_i) and the world coordinates P(x_p, y_p, z_p); the infrared camera moves along the calculated track until the target object is within its effective range, then rotates, and the iris information of the target object is acquired.
In the iris recognition method, the face detection and camera positioning techniques enable iris information to be acquired at different distances while high-precision positioning and recognition are provided, and iris information can be recognized even at long distances or under interference from the external environment.
Fig. 3 is a schematic diagram showing another iris recognition method step flow according to an exemplary embodiment.
In an exemplary embodiment of the present disclosure, the image pickup apparatus includes a binocular wide angle camera; as shown in fig. 3, step S21 may include: step S31-step S32;
Step S31: parameters of a binocular wide angle camera in an image pickup apparatus are acquired.
The intrinsic and extrinsic parameters of the binocular wide-angle camera in the image pickup device are acquired using Zhang's camera calibration method, and the detected face image of the target object is processed to obtain the central coordinate point (u', v') of the face image.
Step S32: and determining world coordinates of the target face image according to the parameters.
The specific implementation is as follows: the three-dimensional coordinates of the target face are obtained through data conversion, using the imaging model of the binocular wide-angle camera combined with the acquired intrinsic and extrinsic camera parameters. The data conversion equation, in the standard pinhole form, is:

Z_p · [u, v, 1]^T = K · [R | t] · [X_p, Y_p, Z_p, 1]^T,  with K = [[f, 0, u_0], [0, f, v_0], [0, 0, 1]]

where f is the focal length, R is the rotation matrix, and t is the translation vector, all obtainable by camera calibration; Z_p is the object depth value, obtainable from the geometric model of the binocular camera; (u, v) are the pixel coordinates of the face image; (u_0, v_0) are the coordinates of the intersection of the camera optical axis and the image plane; and (X_p, Y_p, Z_p) are the world coordinates of the target face, as shown in fig. 4.
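The conversion above can be sketched in code. For simplicity the sketch assumes the camera frame coincides with the world frame (R = I, t = 0), and uses Z = f·B/d (baseline B, disparity d) as one common form of the binocular depth model; both simplifications are assumptions of this sketch, not quoted from the patent:

```python
def depth_from_disparity(f: float, baseline: float, disparity: float) -> float:
    """Depth Z_p from a standard binocular geometric model: Z = f * B / d."""
    return f * baseline / disparity

def pixel_to_world(u, v, z_p, f, u0, v0):
    """Back-project pixel (u, v) with known depth Z_p, assuming R = I, t = 0."""
    x_p = (u - u0) * z_p / f
    y_p = (v - v0) * z_p / f
    return (x_p, y_p, z_p)

# A pixel at the principal point (u0, v0) maps onto the optical axis:
print(pixel_to_world(320, 240, 1000.0, 800.0, 320, 240))
```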
After step S21, it may further include: and carrying out error conversion on the world coordinates of the target face image.
The central coordinate point (u', v') of the target face image is distortion-converted according to errors in the manufacture and mounting of the image pickup device, so as to obtain the distortion-free central coordinates (u, v). In normalized image coordinates (x, y), with r² = x² + y², the distortion conversion is:

x_d = x(1 + k_1·r² + k_2·r⁴) + 2p_1·x·y + p_2·(r² + 2x²)
y_d = y(1 + k_1·r² + k_2·r⁴) + p_1·(r² + 2y²) + 2p_2·x·y

where k_1, k_2, p_1, p_2 are the distortion coefficients.
The distortion coefficient is obtained through calibration of the camera device.
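A sketch of the distortion model with coefficients k_1, k_2, p_1, p_2, in the standard radial-tangential form. Note the direction shown maps undistorted normalized coordinates to distorted ones; inverting it for correction is typically done iteratively:

```python
def distort(x: float, y: float, k1: float, k2: float, p1: float, p2: float):
    """Radial-tangential distortion of normalized image coordinates (x, y)."""
    r2 = x * x + y * y
    radial = 1.0 + k1 * r2 + k2 * r2 * r2     # radial term 1 + k1*r^2 + k2*r^4
    x_d = x * radial + 2.0 * p1 * x * y + p2 * (r2 + 2.0 * x * x)
    y_d = y * radial + p1 * (r2 + 2.0 * y * y) + 2.0 * p2 * x * y
    return (x_d, y_d)

# With all coefficients zero the mapping is the identity:
print(distort(0.1, 0.2, 0.0, 0.0, 0.0, 0.0))
```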
In the iris recognition method, the distortion conversion model is introduced in consideration of errors of camera design and installation, so that the accuracy of camera positioning is improved.
Fig. 5 is a schematic diagram showing another iris recognition method step flow according to an exemplary embodiment.
In an exemplary embodiment of the present disclosure, the image capturing apparatus includes a binocular wide angle camera and an infrared camera, the method further comprising: world coordinate information of an infrared camera and world coordinate information of a binocular wide-angle camera in the image pickup device are obtained.
The world coordinates of the infrared camera are acquired as follows: the binocular camera photographs the infrared camera, the coordinates of the center point of the infrared camera are obtained, and the world coordinates I(x_i, y_i, z_i) of the infrared camera are then obtained by the same method used to obtain the world coordinates of the target face.
As shown in fig. 5, step S22 may include steps S51 to S52.
Step S51: and calculating a first moving direction and a first moving distance of the infrared camera according to the world coordinate information of the binocular wide-angle camera and the world coordinate information of the target face image, and obtaining a world coordinate first moving coordinate of the infrared camera moving from the world coordinate to the appointed position coordinate.
As can be seen from the foregoing, the world coordinates D(x_d, y_d, z_d) of the binocular camera and the world coordinates P(x_p, y_p, z_p) of the target face have already been obtained. A straight line DP is formed from D and P, and a plane π containing DP and parallel to the y-axis of the world coordinate system is selected. The first moving direction is the direction in which the infrared camera moves toward the plane π; the first moving distance is the distance Δd from the world coordinates I(x_i, y_i, z_i) of the infrared camera to the plane π; and the intersection point on π is the first moved world coordinate I_1(x_i1, y_i, z_i1) of the infrared camera.
During the first movement of the infrared camera the y coordinate does not change and the camera does not rotate; I_1 is not necessarily on the straight line DP.
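The first movement can be sketched as plain 3D geometry: the plane π through D and P parallel to the y-axis has normal (z_p − z_d, 0, −(x_p − x_d)), Δd is the point-to-plane distance, and I_1 is the foot of the perpendicular with y unchanged. This is a sketch under those assumptions; the function and variable names are illustrative:

```python
import math

def first_move(D, P, I):
    """Return (delta_d, I1): distance from I to the plane through D and P
    parallel to the y-axis, and the foot of the perpendicular (y unchanged)."""
    nx = P[2] - D[2]            # plane normal in xz:  (z_p - z_d, ...
    nz = -(P[0] - D[0])         #  ...                -(x_p - x_d))
    L = math.hypot(nx, nz)
    s = (nx * (I[0] - D[0]) + nz * (I[2] - D[2])) / L   # signed distance
    I1 = (I[0] - s * nx / L, I[1], I[2] - s * nz / L)
    return abs(s), I1

# D and P both on the z-axis -> the plane is x = 0; I = (3, 1, 5) is 3 away.
print(first_move((0, 0, 0), (0, 0, 10), (3, 1, 5)))
```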
Step S52: and calculating the rotation angle of the infrared camera according to the world coordinate first moving coordinate of the infrared camera and the world coordinate of the target face image.
According to the first moved world coordinate I_1(x_i1, y_i, z_i1) of the infrared camera and the world coordinates P(x_p, y_p, z_p) of the target face image, the included angle φ between the initial direction of the optical axis of the infrared camera and the straight line I_1P is calculated. Since the initial optical axis is parallel to the z-axis:
cos φ = (z_p − z_i1) / |I_1P|,  where |I_1P| = √((x_p − x_i1)² + (y_p − y_i)² + (z_p − z_i1)²).
The rotation direction of the infrared camera and the rotation angle α are determined accordingly.
Since the y-axis coordinate is unchanged during the movement of the infrared camera, the coordinates can be simplified: the projections of I, I_1, P and D onto the plane xoz are I_0, I_10, P_0 and D_0 respectively, and the cosine δ of the included angle between the corresponding projected directions determines the moving direction.
The first moving distance Δd is the distance from I to the plane π:
Δd = |(z_p − z_d)(x_i − x_d) − (x_p − x_d)(z_i − z_d)| / L,  with L = √((z_p − z_d)² + (x_p − x_d)²).
With the signed distance s = ((z_p − z_d)(x_i − x_d) − (x_p − x_d)(z_i − z_d)) / L, the foot of the perpendicular is
I_1 = (x_i − s·(z_p − z_d)/L, y_i, z_i + s·(x_p − x_d)/L),
and I_10 = (x_i1, 0, z_i1) is its projection onto the plane xoz.
as shown in fig. 6, the infrared camera is moved to the first movement coordinate according to the rotation angle and the rotation direction calculated above and the first movement distance.
It is to be understood that a straight line is formed from the first moved world coordinate of the infrared camera and the world coordinates of the target face image, the preset maximum depth of field m of the infrared camera is obtained, and the moving track of the infrared camera is derived from m together with this straight line, so that the image pickup device rotates to the designated position.
In the iris recognition method, the world coordinates of the infrared camera are detected and located by means of target detection, and trigonometric functions are used to obtain the rotation angle of the infrared camera. The method uses a geometric model to solve a practical spatial problem and has good practical value.
Fig. 7 is a schematic diagram showing a flow of steps of another iris recognition method according to an exemplary embodiment.
In an exemplary embodiment of the present disclosure, as shown in fig. 7, step S51 may include steps S71 to S72. Step S71: forming a straight line according to the world coordinates of the binocular wide-angle camera and the world coordinates of the target face image.
Wherein the line formed by the world coordinates of the binocular wide-angle camera and the world coordinates of the target face image is DP.
Step S72: and selecting a plane parallel to the y axis of the world coordinate in the planes of the straight lines.
The direction from the world coordinate of the infrared camera to the plane parallel to the world coordinate y-axis is the first moving direction of the world coordinate of the infrared camera, the distance from the world coordinate of the infrared camera to the plane parallel to the world coordinate y-axis is the first moving distance, and the intersection point of the world coordinate of the infrared camera to the plane parallel to the world coordinate y-axis is the first moving coordinate of the world coordinate of the infrared camera.
The expression of the plane π which contains the line DP and is parallel to the y-axis is as follows:
(zp − zd)(x − xd) − (xp − xd)(z − zd) = 0
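A minimal sketch of this plane construction (the function names are assumptions for illustration):

```python
import math

def plane_through_line_parallel_y(D, P):
    """Coefficients (a, b, c, d) of the plane a*x + b*y + c*z + d = 0 that
    contains the line DP and is parallel to the world y-axis (so b = 0)."""
    xd, _, zd = D
    xp, _, zp = P
    a = zp - zd
    c = -(xp - xd)
    d = -(a * xd + c * zd)
    return a, 0.0, c, d

def point_plane_distance(plane, Q):
    """Distance from point Q to the plane; with Q the world coordinate of
    the infrared camera this is the first moving distance."""
    a, b, c, d = plane
    x, y, z = Q
    return abs(a * x + b * y + c * z + d) / math.sqrt(a * a + b * b + c * c)
```

Because the plane is parallel to the y-axis, the y coefficient is zero and the distance reduces to a point-to-line distance in the xoz plane, matching the simplification used above.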
fig. 8 is a schematic diagram showing a flow of steps of another iris recognition method according to an exemplary embodiment.
In an exemplary embodiment of the present disclosure, as shown in fig. 8, step S55 may include:
Step S81: and acquiring the depth of field distance of the infrared camera and the distance between the first world coordinate of the infrared camera and the world coordinate of the target face image.
Wherein the depth of field distance of the infrared camera is m, and the distance between the first world coordinate of the infrared camera and the world coordinate of the target face image is |I1P|.
Step S82: if the distance between the first world coordinate of the infrared camera and the world coordinate of the target face image is smaller than or equal to the depth of field distance, the rotation angle of the camera device is directly calculated.
The depth of field distance m of the infrared camera is compared with the distance |I1P| between the first world coordinate of the infrared camera and the world coordinate of the target face image.
If the distance |I1P| between the first world coordinate of the infrared camera and the world coordinate of the target face image is less than or equal to the depth of field distance m, the rotation angle of the image pickup device is calculated directly.
Fig. 9 is a schematic diagram showing a flow of steps of another iris recognition method according to an exemplary embodiment.
In an exemplary embodiment of the present disclosure, as shown in fig. 9, step S55 may further include:
step S91: if the distance between the first world coordinate of the infrared camera and the world coordinate of the target face image is greater than the depth of field distance, calculating a second moving direction and a second moving distance of the infrared camera according to the first moving coordinate of the world coordinate of the infrared camera and the world coordinate of the target face image.
Step S92: and calculating the rotation angle of the image pickup device according to the second moving direction and the second moving distance of the infrared camera.
The second moving direction and the second moving distance of the infrared camera are determined by the world coordinate P(xp, yp, zp) of the target face image and the first movement coordinate I1(xi1, yi, zi1) of the infrared camera. Since the straight line between two points is the shortest path, the second moving direction of the infrared camera is the direction from I1 toward P, and the second moving distance is |I1P| − m.
The formula of the second moving direction, expressed as a unit vector, is as follows:
(xp − xi1, yp − yi, zp − zi1) / |I1P|
as shown in fig. 10, the infrared camera always moves toward the target object, and the range of the rotation angle β of the infrared camera is set to [ - θ, θ ].
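This depth-of-field branch can be sketched as follows; the helper names are assumptions for illustration:

```python
import math

def second_move(I1, P, m):
    """If the distance |I1P| exceeds the depth-of-field maximum m, move the
    infrared camera toward P along the line I1P by |I1P| - m; otherwise the
    rotation angle can be computed directly with no second move."""
    v = tuple(p - i for p, i in zip(P, I1))
    dist = math.sqrt(sum(c * c for c in v))
    if dist <= m:
        return I1, 0.0                      # target within depth of field
    move = dist - m                         # second moving distance
    unit = tuple(c / dist for c in v)       # second moving direction
    I2 = tuple(i + move * u for i, u in zip(I1, unit))
    return I2, move

def clamp_rotation(beta, theta):
    """Restrict the rotation angle beta to the preset range [-theta, theta]."""
    return max(-theta, min(theta, beta))
```

For example, with I1 at the origin, P = (0, 0, 10) and m = 4, the camera advances 6 units along the optical axis so that the target falls just inside the depth of field.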
According to the iris recognition method, the iris information of the object can be acquired in a long distance while the recognition accuracy is ensured, and the practicability of the iris recognition system is effectively improved.
In an exemplary embodiment of the present disclosure, after the iris recognition method of the present disclosure is ended, the optical axis of the image pickup apparatus is restored to an initial state. It should be understood that the image capturing device is restored to the initial state without a change in position, and the optical axis direction of the image capturing device restored to the initial state may be parallel to the z-axis direction of the world coordinate.
Based on the same inventive concept, the present disclosure also provides an iris recognition device.
It can be appreciated that, in order to achieve the above-mentioned functions, the iris recognition device provided in the embodiments of the disclosure includes a hardware structure and/or a software module that perform respective functions. The disclosed embodiments may be implemented in hardware or a combination of hardware and computer software, in combination with the various example elements and algorithm steps disclosed in the embodiments of the disclosure. Whether a function is implemented as hardware or computer software driven hardware depends upon the particular application and design constraints imposed on the solution. Those skilled in the art may implement the described functionality using different approaches for each particular application, but such implementation is not to be considered as beyond the scope of the embodiments of the present disclosure.
Fig. 11 is a schematic diagram showing the structure of an iris recognition apparatus 1100 according to an exemplary embodiment. As shown in fig. 11, the apparatus is applied to an electronic device including a rotatable image pickup device and includes: an acquisition module 1101, a rotation module 1102, and an identification module 1103.
The obtaining module 1101 is configured to obtain world coordinate information of a target face image captured by the image capturing device.
The rotation module 1102 is configured to calculate a rotation angle of the image capturing device according to the world coordinate information, and control the image capturing device to be positioned to a specified position according to the rotation angle.
The appointed position is a position corresponding to the effective imaging range of the image pickup device of the target face image.
The recognition module 1103 is configured to obtain iris information of the target face image, and perform iris recognition based on the iris information.
Fig. 12 is a schematic diagram showing a structure of an iris recognition apparatus 1200 according to an exemplary embodiment.
In an exemplary embodiment of the present disclosure, the image pickup apparatus includes a binocular wide angle camera; the acquisition module 1101 includes: an acquisition unit 1201 and a determination unit 1202;
an acquisition unit 1201 configured to acquire parameters of a binocular wide angle camera in the image pickup apparatus;
a determining unit 1202, configured to determine world coordinates of the target face image according to the parameters.
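The patent does not spell out this conversion here; a standard rectified-stereo triangulation, given purely as an illustrative assumption for how world coordinates could be recovered from binocular camera parameters, looks like:

```python
def triangulate(focal_px, baseline, xl, xr, y, cx, cy):
    """Recover a 3D point from a rectified stereo pair: focal length in
    pixels, baseline in world units, matched columns xl/xr in the left and
    right images, shared row y, and principal point (cx, cy)."""
    disparity = xl - xr
    if disparity <= 0:
        raise ValueError("point at infinity or bad stereo match")
    Z = focal_px * baseline / disparity     # depth along the optical axis
    X = (xl - cx) * Z / focal_px
    Y = (y - cy) * Z / focal_px
    return X, Y, Z
```

Depth varies inversely with disparity, which is why the binocular wide-angle camera can localize the face before the infrared camera is steered toward it.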
In an exemplary embodiment of the present disclosure, the determining unit further includes: a conversion unit.
And the conversion unit is used for performing error conversion on the world coordinates of the target face image.
Fig. 13 is a schematic diagram showing a structure of an iris recognition apparatus 1300 according to an exemplary embodiment.
In an exemplary embodiment of the present disclosure, the camera device comprises a binocular wide angle camera and an infrared camera, the acquisition module 1101 is further configured to,
World coordinate information of an infrared camera and world coordinate information of a binocular wide-angle camera in the image pickup device are obtained.
As shown in fig. 13, the rotation module 1102 includes: a calculation unit 1301 and a rotation unit 1302;
the calculating unit 1301 is configured to calculate, according to world coordinate information of the binocular wide-angle camera and world coordinate information of the target face image, a first moving direction and a first moving distance of the infrared camera, and obtain a world coordinate first moving coordinate of the infrared camera from the world coordinate to a specified position coordinate;
and a rotation unit 1302 for calculating a rotation angle of the infrared camera according to the world coordinate first moving coordinate of the infrared camera and the world coordinate of the target face image.
Fig. 14 is a schematic diagram showing a structure of an iris recognition apparatus 1400 according to an exemplary embodiment.
In an exemplary embodiment of the present disclosure, as shown in fig. 14, a computing unit 1301 includes: a constituent subunit 1401 and a selection subunit 1402;
a composing sub-unit 1401 for composing a straight line according to the world coordinates of the binocular wide angle camera and the world coordinates of the target face image;
a selecting subunit 1402, configured to select a plane parallel to the y-axis of the world coordinate from planes in which the straight line is located;
The direction from the world coordinate of the infrared camera to the plane parallel to the world coordinate y-axis is the first moving direction of the world coordinate of the infrared camera, the distance from the world coordinate of the infrared camera to the plane parallel to the world coordinate y-axis is the first moving distance, and the intersection point of the world coordinate of the infrared camera to the plane parallel to the world coordinate y-axis is the first moving coordinate of the world coordinate of the infrared camera.
Fig. 15 is a schematic diagram showing a structure of an iris recognition apparatus 1500 according to an exemplary embodiment.
In an exemplary embodiment of the present disclosure, as shown in fig. 15, a rotating unit 1302 includes: an acquisition subunit 1501 and a mobile subunit 1502;
the obtaining subunit 1501 is configured to obtain a depth of field distance of the infrared camera and a distance between a first world coordinate of the infrared camera and a world coordinate of the target face image.
The moving subunit 1502 is configured to directly calculate the rotation angle of the image capturing device if the distance between the first world coordinate of the infrared camera and the world coordinate of the target face image is less than or equal to the depth-of-field distance.
Fig. 16 is a schematic diagram showing a structure of an iris recognition device 1600 according to an exemplary embodiment.
In an exemplary embodiment of the present disclosure, as shown in fig. 16, the rotating unit 1302 further includes: a computing subunit 1601 and a mobile subunit 1602;
And a calculating subunit 1601, configured to calculate the second moving direction and the second moving distance of the infrared camera according to the world coordinate first moving coordinate of the infrared camera and the world coordinate of the target face image if the distance between the first world coordinate of the infrared camera and the world coordinate of the target face image is greater than the depth of field distance.
A moving subunit 1602 for calculating a rotation angle of the imaging device according to the second moving direction and the second moving distance of the infrared camera.
In an exemplary embodiment of the present disclosure, the apparatus further comprises an infrared camera comprising an optical axis;
the initial state of the optical axis of the infrared camera is parallel to the z-axis direction of the world coordinates.
In an exemplary embodiment of the present disclosure, an included angle between an optical axis of the infrared camera and a boundary of a field of view of the infrared camera is a preset included angle.
In an exemplary embodiment of the present disclosure, the apparatus further comprises: an initial module;
and the initial module is used for controlling the optical axis of the image pickup device to rotate to an initial state.
The specific manner in which the various modules perform operations in the apparatus of the above embodiments has been described in detail in connection with the embodiments of the method, and will not be described in detail here.
Fig. 17 is a schematic block diagram of an apparatus 1700 that can implement any of the foregoing embodiments, according to an exemplary embodiment. For example, the apparatus 1700 may be a mobile phone, a computer, a digital broadcast terminal, a messaging device, a game console, a tablet device, a medical device, an exercise device, a personal digital assistant, or the like.
Referring to fig. 17, apparatus 1700 may comprise one or more of the following components: a processing component 1702, a memory 1704, a power source component 1706, a multimedia component 1708, an audio component 1710, an input/output (I/O) interface 1712, a sensor component 1714, and a communications component 1716.
The processing component 1702 generally controls overall operation of the device 1700, such as operations associated with display, telephone calls, data communications, camera operations, and recording operations. The processing component 1702 may include one or more processors 1720 to execute instructions to perform all or part of the steps of the methods described above. Further, the processing component 1702 can include one or more modules that facilitate interactions between the processing component 1702 and other components. For example, the processing component 1702 may include a multimedia module to facilitate interaction between the multimedia component 1708 and the processing component 1702.
The memory 1704 is configured to store various types of data to support operations at the apparatus 1700. Examples of such data include instructions for any application or method operating on device 1700, contact data, phonebook data, messages, pictures, video, and the like. The memory 1704 may be implemented by any type of volatile or non-volatile memory device or combination thereof, such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic disk, or optical disk.
The power supply component 1706 provides power to the various components of the device 1700. The power components 1706 may include a power management system, one or more power supplies, and other components associated with generating, managing, and distributing power for the device 1700.
The multimedia component 1708 includes a screen between the device 1700 and the user that provides an output interface. In some embodiments, the screen may include a Liquid Crystal Display (LCD) and a Touch Panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive input signals from a user. The touch panel includes one or more touch sensors to sense touches, swipes, and gestures on the touch panel. The touch sensor may sense not only the boundary of a touch or slide action, but also the duration and pressure associated with the touch or slide operation. In some embodiments, the multimedia component 1708 includes a front-facing camera and/or a rear-facing camera. The front camera and/or the rear camera may receive external multimedia data when the apparatus 1700 is in an operational mode, such as a shooting mode or a video mode. Each front camera and rear camera may be a fixed optical lens system or have focal length and optical zoom capabilities.
The audio component 1710 is configured to output and/or input audio signals. For example, the audio component 1710 includes a Microphone (MIC) configured to receive external audio signals when the apparatus 1700 is in an operational mode, such as a call mode, a recording mode, and a voice recognition mode. The received audio signals may be further stored in the memory 1704 or transmitted via the communication component 1716. In some embodiments, audio component 1710 also includes a speaker for outputting audio signals.
The I/O interface 1712 provides an interface between the processing component 1702 and peripheral interface modules, which may be keyboards, click wheels, buttons, etc. These buttons may include, but are not limited to: homepage button, volume button, start button, and lock button.
The sensor assembly 1714 includes one or more sensors for providing status assessment of various aspects of the apparatus 1700. For example, the sensor assembly 1714 may detect the on/off state of the device 1700, the relative positioning of the components, such as the display and keypad of the device 1700, the sensor assembly 1714 may also detect the change in position of the device 1700 or one of the components of the device 1700, the presence or absence of user contact with the device 1700, the orientation or acceleration/deceleration of the device 1700, and the change in temperature of the device 1700. The sensor assembly 1714 may include a proximity sensor configured to detect the presence of nearby objects without any physical contact. The sensor assembly 1714 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor assembly 1714 may also include an acceleration sensor, a gyroscopic sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
The communication component 1716 is configured to facilitate wired or wireless communication between the apparatus 1700 and other devices. The apparatus 1700 may access a wireless network based on a communication standard, such as WiFi, 2G or 3G, or a combination thereof. In one exemplary embodiment, the communication component 1716 receives broadcast signals or broadcast-related information from an external broadcast management system via a broadcast channel. In one exemplary embodiment, the communication component 1716 further includes a Near Field Communication (NFC) module to facilitate short-range communications. For example, the NFC module may be implemented based on Radio Frequency Identification (RFID) technology, Infrared Data Association (IrDA) technology, Ultra Wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
In an exemplary embodiment, the apparatus 1700 may be implemented by one or more Application Specific Integrated Circuits (ASICs), digital Signal Processors (DSPs), digital Signal Processing Devices (DSPDs), programmable Logic Devices (PLDs), field Programmable Gate Arrays (FPGAs), controllers, microcontrollers, microprocessors, or other electronic elements for executing the methods described above.
In an exemplary embodiment, a computer-readable storage medium is also provided, such as the memory 1704, including instructions executable by the processor 1720 of the apparatus 1700 to perform the above-described method. For example, the computer-readable storage medium may be ROM, Random Access Memory (RAM), CD-ROM, magnetic tape, floppy disk, optical data storage device, etc.
Fig. 18 is a block diagram illustrating an electronic device 1800, according to an example embodiment. For example, apparatus 1800 may be provided as a server. Referring to fig. 18, apparatus 1800 includes a processing component 1822 that further includes one or more processors and memory resources, represented by memory 1842, for storing instructions, such as application programs, executable by processing component 1822. The application program stored in memory 1842 may include one or more modules each corresponding to a set of instructions. Further, the processing component 1822 is configured to execute instructions to perform the methods described above.
The apparatus 1800 may also include a power component 1826 configured to perform power management of the apparatus 1800, a wired or wireless network interface 1850 configured to connect the apparatus 1800 to a network, and an input/output (I/O) interface 1858. The apparatus 1800 may operate based on an operating system stored in the memory 1842, such as Windows Server™, Mac OS X™, Unix™, Linux™, FreeBSD™, or the like.
It is further understood that the term "plurality" in this disclosure means two or more, and other quantifiers are similar thereto. "And/or" describes an association relationship of associated objects and indicates that three relationships may exist; for example, A and/or B may indicate: A exists alone, A and B exist together, or B exists alone. The character "/" generally indicates that the associated objects are in an "or" relationship. The singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
It is further understood that the terms "first," "second," and the like are used to describe various information, but such information should not be limited to these terms. These terms are only used to distinguish one type of information from another and do not denote a particular order or importance. Indeed, the expressions "first", "second", etc. may be used entirely interchangeably. For example, first information may also be referred to as second information, and similarly, second information may also be referred to as first information, without departing from the scope of the present disclosure.
It will be further understood that although operations are depicted in the drawings in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order, or that all illustrated operations be performed, to achieve desirable results. In certain circumstances, multitasking and parallel processing may be advantageous.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any adaptations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It is to be understood that the present disclosure is not limited to the precise arrangements and instrumentalities shown in the drawings, and that various modifications and changes may be effected without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.

Claims (18)

1. An iris recognition method, characterized by being applied to an electronic apparatus including a rotatable image pickup device including a binocular wide angle camera and an infrared camera, comprising:
acquiring world coordinate information of a target face image shot by the camera device, and acquiring world coordinate information of an infrared camera and world coordinate information of a binocular wide-angle camera in the camera device;
calculating a first moving direction and a first moving distance of the infrared camera according to the world coordinate information of the binocular wide-angle camera and the world coordinate information of the target face image to obtain a world coordinate first moving coordinate of the infrared camera moving from the world coordinate to a designated position coordinate;
calculating the rotation angle of the infrared camera according to the world coordinate first moving coordinate of the infrared camera and the world coordinate of the target face image;
Controlling the image pickup device to position a designated position according to the rotation angle, wherein the designated position is a position corresponding to the effective imaging range of the image pickup device of the target face image;
acquiring iris information of the target face image, and performing iris recognition based on the iris information;
the calculating the first moving direction and the first moving distance of the infrared camera according to the world coordinate information of the binocular wide-angle camera and the world coordinate information of the target face image comprises the following steps:
forming a straight line according to the world coordinates of the binocular wide-angle camera and the world coordinates of the target face image;
selecting a plane parallel to the y axis of the world coordinate in the plane of the straight line;
the direction from the world coordinate of the infrared camera to the plane parallel to the world coordinate y-axis is the first moving direction of the world coordinate of the infrared camera, the distance from the world coordinate of the infrared camera to the plane parallel to the world coordinate y-axis is the first moving distance, and the intersection point from the world coordinate of the infrared camera to the plane parallel to the world coordinate y-axis is the first moving coordinate of the world coordinate of the infrared camera.
2. The iris recognition method according to claim 1, wherein the image pickup device comprises a binocular wide angle camera;
the obtaining world coordinate information of the target face image shot by the camera device comprises the following steps:
acquiring parameters of a binocular wide-angle camera in the image pickup device;
and determining world coordinates of the target face image according to the parameters.
3. The iris recognition method according to claim 2, wherein after determining world coordinates of the target face image according to the parameters, further comprising:
and carrying out error conversion on the world coordinates of the target face image.
4. The iris recognition method of claim 1, wherein the calculating the rotation angle of the infrared camera according to the world coordinate first moving coordinates of the infrared camera and the world coordinates of the target face image comprises:
acquiring the depth of field distance of the infrared camera and the distance between the first world coordinate of the infrared camera and the world coordinate of the target face image;
if the distance between the first world coordinate of the infrared camera and the world coordinate of the target face image is smaller than or equal to the depth of field distance, the rotation angle of the camera device is directly calculated.
5. The iris recognition method of claim 1 or 4, wherein the calculating the rotation angle of the infrared camera according to the world coordinate first moving coordinates of the infrared camera and the world coordinates of the target face image further comprises:
if the distance between the first world coordinate of the infrared camera and the world coordinate of the target face image is greater than the depth of field distance, calculating a second moving direction and a second moving distance of the infrared camera according to the first moving coordinate of the world coordinate of the infrared camera and the world coordinate of the target face image;
and calculating the rotation angle of the image pickup device according to the second moving direction and the second moving distance of the infrared camera.
6. The iris recognition method of claim 1, wherein the infrared camera includes an optical axis;
the initial state of the optical axis of the infrared camera is parallel to the z-axis direction of the world coordinates.
7. The iris recognition method of claim 6, wherein an included angle between an optical axis of the infrared camera and a boundary of a field of view of the infrared camera is a preset included angle.
8. The iris recognition method of claim 6, wherein after performing iris recognition based on the iris information, the method further comprises:
And controlling the optical axis of the image pickup device to rotate to an initial state.
9. An iris recognition apparatus, characterized by being applied to an electronic device including a rotatable image pickup apparatus including a binocular wide angle camera and an infrared camera, the apparatus comprising:
the acquisition module is used for: the method comprises the steps of acquiring world coordinate information of a target face image shot by the camera device, and acquiring world coordinate information of an infrared camera and world coordinate information of a binocular wide-angle camera in the camera device;
and (3) a rotation module: the method comprises the steps of calculating the rotation angle of the image pickup device according to world coordinate information, and controlling the image pickup device to be positioned to a designated position according to the rotation angle, wherein the designated position is a position corresponding to an effective imaging range of the image pickup device of the target face image;
and an identification module: the iris recognition method comprises the steps of acquiring iris information of the target face image, and carrying out iris recognition based on the iris information;
wherein the acquisition module is further used for:
acquiring world coordinate information of an infrared camera and world coordinate information of a binocular wide-angle camera in the camera device;
the rotation module includes: a calculation unit and a rotation unit;
The computing unit is used for computing a first moving direction and a first moving distance of the infrared camera according to the world coordinate information of the binocular wide-angle camera and the world coordinate information of the target face image, and obtaining a world coordinate first moving coordinate of the infrared camera moving from the world coordinate to a designated position coordinate;
the rotating unit is used for calculating the rotating angle of the infrared camera according to the world coordinate first moving coordinate of the infrared camera and the world coordinate of the target face image;
wherein the computing unit includes: forming a subunit and selecting the subunit;
the composing subunit is used for composing a straight line according to the world coordinates of the binocular wide-angle camera and the world coordinates of the target face image;
the selecting subunit is used for selecting a plane parallel to the y axis of the world coordinate in the plane where the straight line is located;
the direction from the world coordinate of the infrared camera to the plane parallel to the world coordinate y-axis is the first moving direction of the world coordinate of the infrared camera, the distance from the world coordinate of the infrared camera to the plane parallel to the world coordinate y-axis is the first moving distance, and the intersection point from the world coordinate of the infrared camera to the plane parallel to the world coordinate y-axis is the first moving coordinate of the world coordinate of the infrared camera.
10. The apparatus of claim 9, wherein the imaging means comprises a binocular wide angle camera;
the acquisition module comprises: an acquisition unit and a determination unit;
the acquisition unit is used for acquiring parameters of the binocular wide-angle camera in the image pickup device;
and the determining unit is used for determining world coordinates of the target face image according to the parameters.
11. The apparatus of claim 10, wherein the determination unit further comprises a conversion unit;
and the conversion unit is configured to perform error conversion on the world coordinates of the target face image.
12. The apparatus of claim 9, wherein the rotating unit is further configured to:
acquire the depth-of-field distance of the infrared camera and the distance between the first moving coordinate of the infrared camera and the world coordinates of the target face image; and
if the distance between the first moving coordinate of the infrared camera and the world coordinates of the target face image is smaller than or equal to the depth-of-field distance, directly calculate the rotation angle of the image pickup device.
13. The iris recognition apparatus of claim 12, wherein the rotating unit is further configured to:
if the distance between the first moving coordinate of the infrared camera and the world coordinates of the target face image is larger than the depth-of-field distance, calculate a second moving direction and a second moving distance of the infrared camera according to the first moving coordinate of the infrared camera and the world coordinates of the target face image; and
calculate the rotation angle of the image pickup device according to the second moving direction and the second moving distance of the infrared camera.
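Claims 12 and 13 together describe a branch on the depth-of-field distance. The sketch below is a hedged illustration with hypothetical names; in particular, it assumes the second move simply closes the gap down to the depth-of-field boundary, a formula the claims themselves do not fix.

```python
import numpy as np

def plan_second_move(first_coord, face_wc, depth_of_field):
    """If the target face lies within the infrared camera's depth of field,
    rotate directly; otherwise compute a second moving direction and
    distance from the first moving coordinate toward the face."""
    offset = face_wc - first_coord
    dist = np.linalg.norm(offset)
    if dist <= depth_of_field:
        return None, 0.0                  # rotate directly, no second move
    direction = offset / dist             # second moving direction
    distance = dist - depth_of_field      # assumed: stop at the DoF boundary
    return direction, distance
```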
14. The iris recognition apparatus of claim 9, further comprising an infrared camera, the infrared camera having an optical axis;
in its initial state, the optical axis of the infrared camera is parallel to the z-axis of the world coordinate system.
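With the optical axis initially parallel to the world z-axis, the rotation angle computed by the rotating unit can be decomposed into a yaw and a pitch. The sketch below is a minimal illustration under that assumption; the function name and the yaw/pitch decomposition are illustrative, not taken from the patent.

```python
import numpy as np

def rotation_angles(camera_wc, face_wc):
    """Angles (in degrees) turning a z-aligned optical axis toward the face:
    yaw about the world y-axis, then pitch about the camera x-axis."""
    dx, dy, dz = face_wc - camera_wc
    yaw = np.degrees(np.arctan2(dx, dz))                  # horizontal rotation
    pitch = np.degrees(np.arctan2(dy, np.hypot(dx, dz)))  # vertical rotation
    return yaw, pitch
```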
15. The iris recognition apparatus of claim 14, wherein the angle between the optical axis of the infrared camera and the boundary of the field of view of the infrared camera is a preset angle.
16. The iris recognition apparatus of claim 9, wherein the apparatus further comprises an initial module;
the initial module is configured to control the optical axis of the image pickup device to rotate to the initial state.
17. An iris recognition apparatus, comprising:
a processor; and
a memory for storing instructions executable by the processor;
wherein the processor is configured to perform the iris recognition method of any one of claims 1 to 8.
18. A non-transitory computer-readable storage medium storing instructions which, when executed by a processor of an electronic device, cause the electronic device to perform the iris recognition method of any one of claims 1 to 8.
CN202010228395.6A 2020-03-27 2020-03-27 Iris recognition method, iris recognition device and storage medium Active CN111428654B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010228395.6A CN111428654B (en) 2020-03-27 2020-03-27 Iris recognition method, iris recognition device and storage medium


Publications (2)

Publication Number Publication Date
CN111428654A CN111428654A (en) 2020-07-17
CN111428654B true CN111428654B (en) 2023-11-28

Family

ID=71549021

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010228395.6A Active CN111428654B (en) 2020-03-27 2020-03-27 Iris recognition method, iris recognition device and storage medium

Country Status (1)

Country Link
CN (1) CN111428654B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114373218B (en) * 2022-03-21 2022-06-14 北京万里红科技有限公司 Method for generating convolution network for detecting living body object

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106250839A (en) * 2016-07-27 2016-12-21 徐鹤菲 Iris image perspective correction method, device and mobile terminal
CN107403148A (en) * 2017-07-14 2017-11-28 广东欧珀移动通信有限公司 Iris identification method and related product
CN109558764A (en) * 2017-09-25 2019-04-02 杭州海康威视数字技术股份有限公司 Face identification method and device, computer equipment
CN110210333A (en) * 2019-05-16 2019-09-06 佛山科学技术学院 Automatic-focusing iris image acquisition method and device


Also Published As

Publication number Publication date
CN111428654A (en) 2020-07-17

Similar Documents

Publication Publication Date Title
CN106778773B (en) Method and device for positioning target object in picture
CN109584362B (en) Three-dimensional model construction method and device, electronic equipment and storage medium
CN110647834A (en) Human face and human hand correlation detection method and device, electronic equipment and storage medium
EP3173970A1 (en) Image processing method and apparatus
JP6348611B2 (en) Automatic focusing method, apparatus, program and recording medium
US20210158560A1 (en) Method and device for obtaining localization information and storage medium
CN111323007B (en) Positioning method and device, electronic equipment and storage medium
CN110989901B (en) Interactive display method and device for image positioning, electronic equipment and storage medium
CN112013844B (en) Method and device for establishing indoor environment map
US11425305B2 (en) Control method and apparatus, electronic device, and storage medium
CN109544458B (en) Fisheye image correction method, device and storage medium thereof
CN110930351A (en) Light spot detection method and device and electronic equipment
CN111428654B (en) Iris recognition method, iris recognition device and storage medium
CN107239140A (en) Processing method, device and the terminal of VR scenes
CN113345000A (en) Depth detection method and device, electronic equipment and storage medium
CN112767541A (en) Three-dimensional reconstruction method and device, electronic equipment and storage medium
CN110213205B (en) Verification method, device and equipment
CN114296587A (en) Cursor control method and device, electronic equipment and storage medium
CN109813295B (en) Orientation determination method and device and electronic equipment
CN108595930B (en) Terminal device control method and device
CN112465901B (en) Information processing method and device
CN112860827B (en) Inter-device interaction control method, inter-device interaction control device and storage medium
CN117974772A (en) Visual repositioning method, device and storage medium
CN110060355B (en) Interface display method, device, equipment and storage medium
CN112148815B (en) Positioning method and device based on shared map, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: 100081 room 701, floor 7, Fuhai international port, Haidian District, Beijing

Applicant after: Beijing wanlihong Technology Co.,Ltd.

Address before: 100081 1504, floor 15, Fuhai international port, Daliushu Road, Haidian District, Beijing

Applicant before: BEIJING SUPERRED TECHNOLOGY Co.,Ltd.

GR01 Patent grant