CN114765661A - Iris identification method, device and equipment - Google Patents

Iris identification method, device and equipment

Info

Publication number
CN114765661A
Authority
CN
China
Prior art keywords
iris
distance
focusing
target object
focal length
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202011616344.7A
Other languages
Chinese (zh)
Other versions
CN114765661B (en)
Inventor
卢洁玲
任志浩
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hangzhou Hikvision Digital Technology Co Ltd
Original Assignee
Hangzhou Hikvision Digital Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hangzhou Hikvision Digital Technology Co Ltd filed Critical Hangzhou Hikvision Digital Technology Co Ltd
Priority to CN202011616344.7A priority Critical patent/CN114765661B/en
Priority to PCT/CN2021/142690 priority patent/WO2022143813A1/en
Publication of CN114765661A publication Critical patent/CN114765661A/en
Application granted granted Critical
Publication of CN114765661B publication Critical patent/CN114765661B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/67Focus control based on electronic image sensor signals
    • H04N23/671Focus control based on electronic image sensor signals in combination with active ranging signals, e.g. using light or sound signals emitted toward objects
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/18Eye characteristics, e.g. of the iris
    • G06V40/19Sensors therefor
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/61Control of cameras or camera modules based on recognised objects
    • H04N23/611Control of cameras or camera modules based on recognised objects where the recognised objects include parts of the human body

Abstract

The embodiments of this specification provide an iris identification method, device and equipment. The method comprises the following steps: correspondences between the focusing distance and the focal length, and between the focusing distance and the displacement, are established in advance; after the focusing distance of the target object is detected, the focal length corresponding to that focusing distance is first determined from the correspondence and the zoom assembly is driven to the position corresponding to that focal length; the displacement corresponding to the focusing distance is then determined from the correspondence and the focusing assembly is driven by that displacement. Periscopic optical zooming is thus completed by moving two or more lens groups within the lens, so that rapid imaging of the iris can be realized and the equipment can be made light and thin while the identification range is ensured.

Description

Iris identification method, device and equipment
Technical Field
The present disclosure relates to the field of computer technologies, and in particular, to a method, an apparatus, and a device for iris recognition.
Background
Iris identification is a technology for verifying a person's identity based on the iris of the eye, and is generally applied to security equipment (such as access control) and places with high confidentiality requirements.
In order to realize iris recognition over a long distance range, various iris equipment manufacturers have proposed imaging schemes based on zoom optical systems. However, the zoom lenses used by these manufacturers are all industrial camera lenses with a large total track length (TTL), so the resulting iris recognition products are excessively thick. Compared with face recognition products, which are about 20 mm thick, these iris recognition products are too thick and heavy.
Therefore, it is necessary to provide a fast and reliable iris recognition scheme that supports iris recognition over a long distance range while keeping the iris recognition equipment light, thin and small.
Disclosure of Invention
The embodiment of the specification provides an iris identification method, which is used for realizing lightness and thinness of equipment on the premise of ensuring an identification range.
An embodiment of the present specification further provides an iris identification method, which is applied to an iris identification system, and includes:
acquiring a focusing distance corresponding to a target object;
determining a first focal length corresponding to the focusing distance under a first constraint condition, and controlling a first piezoelectric motor to drive a zooming assembly of an iris imaging module to move to a position corresponding to the first focal length along an optical axis, wherein the first constraint condition is used for constraining the number of iris diameter pixels obtained by identifying the target object under the first focal length;
and determining a displacement corresponding to the focusing distance under a second constraint condition, and controlling a second piezoelectric motor to drive a focusing assembly of the iris imaging module to move the displacement so as to perform iris recognition on the target object, wherein the second constraint condition is used for constraining the moved focusing assembly to be located at an optimal focusing position.
An embodiment of the present specification further provides an iris recognition apparatus, including: a micro control unit, a periscopic optical zoom imaging module and piezoelectric motors, wherein the periscopic optical zoom imaging module includes a zoom assembly and a focusing assembly, and wherein:
the micro control unit is used for acquiring a focusing distance corresponding to a target object; determining a first focal length corresponding to the focusing distance under a first constraint condition, and sending a first driving instruction to a first piezoelectric motor corresponding to the zooming component to indicate the first piezoelectric motor to drive the zooming component to move to a position corresponding to the first focal length along an optical axis, wherein the first constraint condition is used for constraining the number of iris diameter pixels obtained by identifying the target object under the first focal length;
the micro control unit is further configured to determine a displacement corresponding to the focus distance under a second constraint condition, send a second driving instruction to a second piezoelectric motor corresponding to the focus assembly, instruct the second piezoelectric motor to drive the focus assembly to move the displacement, and perform iris recognition on the target object, where the second constraint condition is used to constrain the focus assembly after movement to be located at an optimal focus position.
An embodiment of the present specification further provides an electronic device, including:
a processor; and
a memory arranged to store computer executable instructions that, when executed, cause the processor to perform the steps of the method as described above.
Embodiments of the present specification also provide a computer readable storage medium storing one or more programs which, when executed by an electronic device comprising a plurality of application programs, perform the steps of the method as described above.
One embodiment of the present specification establishes, in advance, the correspondences between the focusing distance and the focal length, and between the focusing distance and the displacement; after the focusing distance of the target object is detected, the focal length corresponding to the focusing distance is first determined according to the correspondence and the zoom assembly is driven to the position corresponding to that focal length, and the displacement corresponding to the focusing distance is then determined according to the correspondence and the focusing assembly is driven by that displacement, so that periscopic optical zooming is completed, rapid imaging of the iris can be realized, and the equipment is made light and thin on the premise of ensuring the recognition range.
Drawings
The accompanying drawings, which are included to provide a further understanding of the specification and constitute a part of it, illustrate embodiments of the specification and, together with the description, serve to explain the specification without limiting it. In the drawings:
fig. 1 is a schematic flowchart of an iris identification method according to an embodiment of the present disclosure;
fig. 2 is a schematic structural diagram of an iris identification system according to an embodiment of the present disclosure;
fig. 3 is a schematic structural diagram of an iris imaging module provided in an embodiment of the present disclosure;
fig. 4 is a schematic diagram of an optical path structure of an iris imaging module provided in an embodiment of the present disclosure;
fig. 5 is a schematic flowchart of a method for adjusting a lens field angle of an iris imaging module according to an embodiment of the present disclosure;
fig. 6 is a schematic distribution diagram of an iris potential field and a human face potential field provided in an embodiment of the present specification;
fig. 7 is a schematic diagram of a front layout of an iris lens and a face lens according to an embodiment of the present disclosure;
fig. 8 is a schematic structural diagram of an iris imaging module according to another embodiment of the present disclosure;
FIG. 9 is a schematic diagram illustrating a rotation of a reflection assembly according to an embodiment of the present disclosure;
FIG. 10 is a top schematic view of an illumination layout provided by one embodiment of the present description;
fig. 11 is a schematic structural diagram of an iris identification apparatus according to an embodiment of the present disclosure;
fig. 12 is a schematic structural diagram of an electronic device according to an embodiment of the present disclosure.
Detailed Description
In order to make the objects, technical solutions and advantages of the present disclosure more clear, the technical solutions of the present disclosure will be clearly and completely described below with reference to the specific embodiments of the present disclosure and the accompanying drawings. It is to be understood that the embodiments described are only a few embodiments of the present disclosure, and not all embodiments. All other embodiments obtained by a person skilled in the art without making any inventive step based on the embodiments in this description belong to the protection scope of this document.
The technical solutions provided by the embodiments of the present description are described in detail below with reference to the accompanying drawings.
Fig. 1 is a schematic flowchart of an iris identification method provided in an embodiment of the present specification, which may be executed by an iris identification system, and referring to fig. 1, the method may specifically include the following steps:
102, acquiring a focusing distance corresponding to a target object;
the focus distance may refer to a distance between object images, and is a sum of a distance from the lens to the object (target object) and a distance from the lens to the photosensitive element.
Referring to the schematic structural diagram of the iris identification system provided in fig. 2, the implementation of step 102 is described in detail below:
the first implementation manner may be:
the focusing distance is obtained through detection of the distance measuring module. Specifically, the method comprises the following steps:
when the target object is in the identification area of the iris identification system, the distance measurement module is triggered to detect the focusing distance between the target object and the iris identification system and report the focusing distance to the processor.
The second implementation manner may be:
the focal distance is obtained by analyzing the interpupillary distance of the target object. Specifically, the method comprises the following steps:
when the target object is in the identification area of the iris identification system, the visible light imaging module collects the face image of the target object and reports the face image to the processor; and the processor performs face detection on the face image to obtain the interpupillary distance IPD of the target object, and determines the focusing distance based on the corresponding relation between the IPD marked in advance and the focusing distance.
The processor is preferably a micro control unit (MCU), so as to further miniaturize the product.
Based on this, on one hand, the present embodiment can accurately detect the focal distance of the target object through the ranging module; on the other hand, the focus distance of the target object can be detected through the existing visible light imaging module, so that additional hardware equipment is avoided, and the system cost and the complexity of the system structure can be effectively reduced. Moreover, the embodiment shows a specific implementation manner of the step 102. Of course, it should be understood that step 102 can be implemented in other ways, and the embodiment is not limited thereto.
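As an illustration only (not part of the claimed embodiments), the interpupillary-distance route can be sketched in Python as a pre-calibrated lookup with linear interpolation; the calibration points, function names and the interpolation choice below are hypothetical placeholders, not values from this specification.

# Minimal sketch: mapping a measured interpupillary distance (IPD, in image
# pixels) to a focusing distance via a pre-calibrated lookup table with
# linear interpolation. The calibration points are illustrative placeholders.
from bisect import bisect_left

# Hypothetical calibration pairs: (IPD in pixels, focusing distance in cm).
# A closer face yields a larger IPD in the image.
IPD_CALIBRATION = [(220, 30), (160, 55), (120, 80), (95, 105), (80, 130)]

def focusing_distance_from_ipd(ipd_px: float) -> float:
    """Estimate the focusing distance (cm) from the interpupillary distance in pixels."""
    pts = sorted(IPD_CALIBRATION)          # ascending by IPD
    ipds = [p[0] for p in pts]
    dists = [p[1] for p in pts]
    if ipd_px <= ipds[0]:
        return dists[0]
    if ipd_px >= ipds[-1]:
        return dists[-1]
    i = bisect_left(ipds, ipd_px)
    # Linear interpolation between the two surrounding calibration points.
    t = (ipd_px - ipds[i - 1]) / (ipds[i] - ipds[i - 1])
    return dists[i - 1] + t * (dists[i] - dists[i - 1])

print(focusing_distance_from_ipd(140))     # ~67.5 cm with these placeholder points

In practice the calibration pairs would be measured for the concrete visible light imaging module, and the processor would simply query them at run time.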
104, determining a first focal length corresponding to the focusing distance under a first constraint condition, and controlling a first piezoelectric motor to drive a zooming assembly of an iris imaging module to move to a position corresponding to the first focal length along an optical axis, wherein the first constraint condition is used for constraining the number of iris diameter pixels obtained by identifying the target object under the first focal length;
the first piezoelectric motor can be a motor which is arranged in the equipment and is specially used for driving the zooming component; the piezoelectric motor is a device for converting electric energy into mechanical energy by utilizing the piezoelectric inverse effect of a piezoelectric material, can shield electromagnetic field interference, has small noise influence, and is suitable for miniaturized equipment.
The structure of the iris imaging module is schematically described below with reference to fig. 3:
The iris imaging module includes a periscopic zoom lens and piezoelectric motors. The periscopic zoom lens includes a zoom assembly (Zoom), a focusing assembly (Focus), a reflection assembly and an image sensor, wherein:
the optical path structures corresponding to the zoom assembly, the focusing assembly, the reflecting assembly and the image sensor are shown in fig. 4, and the reflecting assembly may be referred to as a reflecting prism.
For convenience of description, this embodiment takes a 5MP sensor (pixel size 2 μm) with a 3× optical zoom system as an example; it should be understood that the embodiment is not limited to these system parameters, and optimal system parameters can be derived for the requirements of different application scenarios. In addition, considering that the iris diameter of an adult is about 11 mm and that iris identification algorithms have a definite requirement on the number of iris diameter pixels, 150-220 pixels is taken as the reference, and the focal length is varied from 12 mm to 36 mm by the 3× optical zoom so as to realize iris identification within the range of 30 cm to 130 cm.
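As a rough plausibility check only (not a computation taken from this specification), a thin-lens estimate in Python shows why the 12 mm-36 mm range pairs with roughly 150-220 iris diameter pixels over 30 cm-130 cm, assuming the object distance is approximated by the focusing distance and using the 2 μm pixel size and 11 mm iris diameter quoted above.

# Rough thin-lens estimate (a sketch): iris diameter in pixels is roughly
# f / (d - f) * iris_diameter / pixel_size, approximating the object distance
# by the focusing distance d.
def iris_diameter_pixels(focal_mm: float, distance_mm: float,
                         iris_mm: float = 11.0, pixel_um: float = 2.0) -> float:
    magnification = focal_mm / (distance_mm - focal_mm)
    return iris_mm * magnification * 1000.0 / pixel_um

print(round(iris_diameter_pixels(12, 300)))    # ~229 px at 30 cm with the 12 mm setting
print(round(iris_diameter_pixels(36, 1300)))   # ~157 px at 130 cm with the 36 mm setting
# Both land near the 150-220 pixel reference range mentioned above.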
Based on this, one implementation of step 104 may be:
querying a first comparison table (Table 1 below) to obtain the first focal length corresponding to the focusing distance; converting the first focal length into a voltage to drive the piezoelectric motor; and controlling, through the piezoelectric motor, the zoom assembly to move back and forth along the optical axis direction to realize optical zooming. The focal lengths corresponding to different focusing distances under the first constraint condition are stored in the first comparison table.
TABLE 1 (provided as an image in the original publication; it lists the focal lengths corresponding to different focusing distance ranges under the first constraint condition)
As shown in the table above, in the focusing distance range of [30, 40) cm a focal length of 12 mm can be selected, in the focusing distance range of [40, 55) cm a focal length of 16 mm can be selected, and so on; the remaining entries are not repeated here.
Further, the position corresponding to the focal length can also be marked by adopting a way of pre-constructing a comparison table, specifically:
inquiring a second comparison table to obtain a first position corresponding to the first focal length; and the second comparison table stores the positions of the zooming components under different focal lengths.
Therefore, after the first focal length is determined, the first position can be obtained through a table lookup, and the zoom assembly can be controlled by the piezoelectric motor to move to the first position along the optical axis direction.
Based on this, in this embodiment, correspondences among the focusing distance, the focal length and the position of the zoom assembly are established by pre-constructing the comparison tables, so that once the focusing distance is obtained, the corresponding focal length and its position are quickly determined, which effectively improves the efficiency of optical zooming. Moreover, the embodiment shows a specific implementation manner of step 104. Of course, it should be understood that step 104 can be implemented in other ways, and the embodiment is not limited thereto.
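For illustration only, the two table lookups and the subsequent motor command can be sketched in Python as follows; only the [30, 40) cm → 12 mm and [40, 55) cm → 16 mm entries come from the description of Table 1 above, while the remaining entries, the zoom positions and the motor interface are hypothetical placeholders.

# Sketch of the table-driven zoom step. Only the first two focal-length
# entries come from the text; everything else is an illustrative placeholder.

# First comparison table: focusing distance range (cm) -> first focal length (mm).
FOCAL_LENGTH_TABLE = [((30, 40), 12), ((40, 55), 16),
                      ((55, 75), 20), ((75, 100), 28), ((100, 130), 36)]

# Second comparison table: focal length (mm) -> zoom assembly position along
# the optical axis (mm from a reference; hypothetical values).
ZOOM_POSITION_TABLE = {12: 0.0, 16: 1.8, 20: 3.4, 28: 5.9, 36: 8.0}

def lookup_first_focal_length(distance_cm: float) -> int:
    for (lo, hi), focal in FOCAL_LENGTH_TABLE:
        if lo <= distance_cm < hi:
            return focal
    raise ValueError("focusing distance outside the supported 30-130 cm range")

def drive_zoom(distance_cm: float, move_zoom_to) -> None:
    focal = lookup_first_focal_length(distance_cm)   # query the first table
    position = ZOOM_POSITION_TABLE[focal]            # query the second table
    move_zoom_to(position)                           # piezo motor drive (abstracted)

# Example: drive_zoom(48, lambda pos: print(f"move zoom assembly to {pos:.1f} mm"))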
And 106, determining a displacement corresponding to the focusing distance under a second constraint condition, and controlling a second piezoelectric motor to drive a focusing assembly of the iris imaging module to move the displacement so as to perform iris recognition on the target object, wherein the second constraint condition is used for constraining the focusing assembly after moving to be located at an optimal focusing position.
The optimal focusing position may refer to the position at which the quality of the acquired image is highest; the displacement may be the displacement required for the focusing assembly to move from a reference position to the optimal focusing position, where the reference position is a position marked in advance as a positional reference. The iris is an oblate annular membrane in the middle layer of the eyeball wall, located between the cornea and the crystalline lens; viewed from the front, it is the ring-shaped region with distinctive texture between the pupil and the sclera, and the iris texture can be used for identity verification.
One implementation of step 106 may be:
querying a third comparison table to obtain the front depth of field and the rear depth of field corresponding to the focusing distance; and determining, based on the front depth of field and the rear depth of field, the displacement required for the focusing assembly to move to the optimal focusing position at that focusing distance. The front depths of field and the rear depths of field corresponding to different focusing distances under the second constraint condition are stored in the third comparison table.
It should be understood that after the focal length is determined, the optimal focusing position of the focusing assembly at different focal length distances can be calculated in advance, and the displacement required for the focusing assembly to move to the optimal focusing position can be calculated. The method for calculating the displacement required for the focusing assembly to move to the optimal focusing position may specifically be as follows:
example 1 mode based on common image sensor + ranging module
After the distance measurement module detects the focusing distance, images before and after the focusing distance are collected, the image contrast is calculated, and the optimal focusing position is determined, so that the displacement of the focusing assembly moving from the reference position to the optimal focusing position is calculated, and by analogy, the displacement of the focusing assembly moving from the reference position to the optimal focusing position under different focusing distances can be calculated.
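A Python sketch of such a contrast-based calibration is given below; the gradient-variance focus measure and the sweep interface are illustrative assumptions rather than the specific algorithm of this specification.

# Sketch of the contrast-based search for the optimal focusing position:
# sweep the focusing assembly around the reported focusing distance, score
# each captured frame by a simple contrast measure, and record the
# displacement giving the highest score.
import numpy as np

def contrast_score(image: np.ndarray) -> float:
    """Gradient-variance focus measure on a grayscale image."""
    gy, gx = np.gradient(image.astype(np.float64))
    return float(np.var(gx) + np.var(gy))

def calibrate_best_displacement(capture_at, displacements) -> float:
    """capture_at(d) returns the image captured with the focusing assembly moved by d."""
    scores = [(contrast_score(capture_at(d)), d) for d in displacements]
    return max(scores)[1]   # displacement with the highest contrast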
Example 2: based on a phase-detection (phase focusing) sensor
The phase change amount at the focusing distance is sensed by a phase focusing sensor (PDAF sensor), and the displacement corresponding to that phase change amount is determined.
The realization principle can be as follows: after the focal length is confirmed, the sensor automatically calculates the phase change amount and the corresponding displacement and reports them to the processor; the processor converts them into a voltage and drives the motor to finish focusing.
Taking a 12 mm focal length as an example, the third comparison table can be constructed as Table 2 below:
TABLE 2 (provided as an image in the original publication; an example third comparison table for the 12 mm focal length, listing the front and rear depths of field corresponding to different focusing distances)
based on this, on one hand, the present embodiment can calculate the optimal focusing positions at different focusing distances by the existing distance measuring module + common image sensor, and can effectively reduce the equipment cost; on the other hand, the phase focusing sensor can be used for sensing the displacement under different focusing distances, so that the calculation efficiency and accuracy of the displacement can be effectively improved. Moreover, the embodiment herein shows a specific implementation of step 106. Of course, it should be understood that step 106 can be implemented in other ways, and the embodiment is not limited thereto.
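For illustration only, the final focusing command common to both examples, converting the obtained displacement into a drive voltage for the second piezoelectric motor, might look as follows in Python; the linear voltage model, its constants and the motor interface are assumptions, not taken from this specification.

# Sketch of the final focusing step: convert the displacement obtained from
# the third comparison table or from the PDAF sensor into a drive voltage for
# the second piezoelectric motor. The linear model is an illustrative assumption.
def displacement_to_voltage(displacement_mm: float,
                            volts_per_mm: float = 5.0,
                            v_max: float = 10.0) -> float:
    """Hypothetical linear displacement-to-voltage model with saturation."""
    return min(volts_per_mm * abs(displacement_mm), v_max)

def drive_focus(displacement_mm: float, set_motor_voltage) -> None:
    direction = 1 if displacement_mm >= 0 else -1
    set_motor_voltage(direction * displacement_to_voltage(displacement_mm))

# Example: drive_focus(0.15, lambda v: print(f"focus motor voltage: {v:+.2f} V"))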
In summary, this embodiment establishes in advance the correspondences between the focusing distance and the focal length, and between the focusing distance and the displacement. After the focusing distance of the target object is detected, the focal length corresponding to the focusing distance is first determined according to the correspondence and the zoom assembly is driven to the position corresponding to that focal length; the displacement corresponding to the focusing distance is then determined according to the correspondence and the focusing assembly is driven by that displacement. Periscopic optical zooming is thus completed, rapid imaging of the iris can be realized, the equipment is made light and thin on the premise of ensuring the recognition range, and the thickness of the product can be controlled within 30 mm.
Further, the present specification considers that the angle of an iris recognition product in the longitudinal (vertical) field-of-view direction is too small, typically within 30°. Compared with the roughly 50° field angle of a common face recognition device, this seriously affects the experience of users of certain heights. As shown in fig. 6, faces A, B and C are all within the field of view of the face lens, but only face A is within the field of view of the iris lens; a user whose height places the face in area B needs to bend over, while a user in area C needs to stand on tiptoe. The face lens may refer to the visible light imaging module in fig. 2, the face detection field of view may refer to the field of view of the visible light imaging module, the iris lens may refer to the periscopic zoom lens, and the iris detection field of view may refer to the field of view of the periscopic zoom lens.
Based on this, another embodiment of the present specification provides a lens field angle adjusting method for an iris imaging module, which may be executed by an iris recognition system corresponding to fig. 2, and with reference to fig. 5, the method may specifically include the following steps:
step 502, determining the relative position of the face of the target object and the iris detection field when detecting that the face of the target object is in the face detection field and not in the iris detection field of the iris imaging module;
and step 504, controlling a third piezoelectric motor to drive a reflection assembly of the iris imaging module to rotate based on the relative position so as to enable the face of the target object to be positioned in the iris detection visual field.
Specifically: first, a face imaging module is arranged near the iris imaging module (above, below, to the left or to the right, according to the appearance requirements of the product). Fig. 7 shows an example in which, viewed from the front, the face lens is directly below the iris lens. Since the face field angle is larger than the iris field angle, the correspondence between the face position and the angle of the iris reflection mechanism is calibrated; when a face falls in the face-B or face-C area of fig. 6, the pitch angle of the reflection mechanism is adjusted up or down accordingly, so that the iris can be tracked.
For a pitching mechanism, the existing scheme generally directly rotates the whole imaging module, so that the mechanism is bulky. In this embodiment, the entire optical path does not need to be rotated, and only the reflection assembly needs to be rotated. The reflecting component can be a reflecting prism or a reflecting mirror in fig. 4, or can be a free-form surface mirror shown in fig. 8 below. The processor may control the rotation of the reflective assembly about the X-axis by the piezoelectric motor, as shown in fig. 9, wherein the amount of rotation of the reflective assembly corresponds to the relative position.
Based on this, the reflection assembly is rotated by the piezoelectric motor to expand the vertical field angle of the lens, so that the product suits users of different heights. Furthermore, the piezoelectric motor used to drive the reflection assembly is itself compact, so the iris recognition product can be kept small; on the basis of the embodiment corresponding to fig. 1, miniaturization and lightness and thinness of the product can therefore be realized at the same time.
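As an illustration of how the relative position can be mapped to a rotation command (this sketch is not the claimed implementation), assume the face image reports the vertical offset of the face from the centre of the iris field of view; since tilting a mirror by an angle θ deflects the reflected axis by 2θ, the required rotation is roughly half the angular offset. The field-of-view value, pixel geometry and interfaces below are hypothetical.

# Sketch of steering the reflection assembly so the face enters the iris
# detection field of view. Only the "rotate the mirror, not the whole module"
# idea and the relative-position input come from the text above.
def required_mirror_rotation_deg(face_y_px: float, image_height_px: int,
                                 vertical_fov_deg: float = 50.0) -> float:
    """Angle to rotate the reflection assembly (about the X axis) so that a face
    seen at pixel row face_y_px is brought to the centre of the iris field."""
    # Vertical angular offset of the face from the image centre.
    offset_deg = (face_y_px - image_height_px / 2) / image_height_px * vertical_fov_deg
    # A mirror rotated by theta deflects the optical axis by 2*theta.
    return offset_deg / 2.0

angle = required_mirror_rotation_deg(face_y_px=300, image_height_px=1200)
print(f"rotate reflection assembly by {angle:.1f} degrees")   # about -6.2 degrees here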
Further, on the basis of the two embodiments described above, the present specification also proposes another embodiment, which provides a lighting module. Referring to fig. 10, the lighting module of the iris recognition system includes multiple rows of illuminating lamps distributed around the iris imaging module, at least two rows of which have different lamp beam orientations; the method further comprises:
determining a target row of illuminating lamps whose lamp beam orientation matches the focusing distance;
and controlling the illuminating lamps of the target row to enter the powered-on state, while keeping the illuminating lamps of the other rows in the powered-off state.
Specifically, the method comprises the following steps: the lamp plate and the horizontal direction have certain angle. Iris recognition uses NIR band illumination, either 810nm or 850 nm. With LED or VCSEL device illumination, a lamp beam angle of less than 30 ° is required. Can arrange multiseriate LED on the lamp plate, set up the column number according to discernment distance. The light beams of the LED lamps in different columns are oriented differently. And the LEDs in different columns can be controlled independently, and the LEDs in the corresponding columns are lightened after the distance is determined, so that the power consumption is reduced.
Based on this, by reasonably configuring the layout of the illumination system, this embodiment can effectively reduce the resources required for illumination while the illumination conditions are satisfied.
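A minimal Python sketch of this selection logic follows; the distance ranges per LED column and the enable/disable interface are placeholders, and only the behaviour of lighting the matching column while keeping the others powered off comes from the description above.

# Sketch of selecting which LED column to power for a given focusing distance.
LED_COLUMNS = [   # (column id, focusing distance range in cm its beam covers; placeholders)
    ("near", (30, 60)),
    ("mid", (60, 95)),
    ("far", (95, 130)),
]

def select_led_column(distance_cm: float, enable, disable) -> None:
    for column_id, (lo, hi) in LED_COLUMNS:
        if lo <= distance_cm < hi:
            enable(column_id)    # power the matching column
        else:
            disable(column_id)   # keep the other columns off

# Example: select_led_column(70, enable=lambda c: print("on:", c),
#                            disable=lambda c: print("off:", c))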
Fig. 11 is a schematic structural diagram of an iris identification apparatus provided in an embodiment of the present specification. Referring to fig. 11, the apparatus may specifically include: a micro control unit (MCU), a periscopic optical zoom imaging module and piezoelectric motors, wherein the periscopic optical zoom imaging module includes a zoom assembly and a focusing assembly, and wherein:
the micro control unit is used for acquiring a focusing distance corresponding to a target object; determining a first focal length corresponding to the focusing distance under a first constraint condition, and sending a first driving instruction to a first piezoelectric motor corresponding to the zooming component to indicate the first piezoelectric motor to drive the zooming component to move to a position corresponding to the first focal length along an optical axis, wherein the first constraint condition is used for constraining the number of iris diameter pixels obtained by identifying the target object under the first focal length;
the micro control unit is further configured to determine a displacement amount corresponding to the focusing distance under a second constraint condition, send a second driving instruction to a second piezoelectric motor corresponding to the focusing assembly, instruct the second piezoelectric motor to drive the focusing assembly to move the displacement amount, and perform iris recognition on the target object, where the second constraint condition is used to constrain the focusing assembly after movement to be located at an optimal focusing position.
The MCU corresponds to the processor in fig. 2, and the periscopic optical zoom imaging module corresponds to the iris imaging module in fig. 2; their specific working principles have been detailed above and are not repeated here, and only the implementation principle is briefly stated. During zooming, after the MCU acquires the distance information, it converts the information into a voltage to drive the piezoelectric motor, and the zoom assembly is controlled by the piezoelectric motor to move back and forth along the optical axis direction, realizing optical zooming. During focusing, after the MCU acquires the distance information, it queries the displacement required to reach the optimal focusing position, converts it into a voltage to drive the piezoelectric motor, and the focusing assembly is controlled by the piezoelectric motor to move back and forth along the optical axis direction to complete focusing.
In conjunction with fig. 2, the apparatus may further include:
the distance measuring module is used for detecting the focusing distance corresponding to the target object and providing the focusing distance to the micro control unit; or,
and the visible light imaging module is used for acquiring a face image of a target object and providing the face image to the micro control unit, and the micro control unit identifies the face image to obtain the interpupillary distance of the target object and determines the focusing distance corresponding to the interpupillary distance.
The distance measuring module and the visible light imaging module can both be integrated in the equipment, or only one of the two can be integrated.
In conjunction with fig. 6, the field angle of the iris lens (i.e. the lens corresponding to the periscopic optical zoom imaging module) is small, which, compared with face recognition, is very unfriendly to users of certain heights: a user whose height places the face in area B needs to stoop, and a user in area C needs to stand on tiptoe. Therefore, a pitching mechanism is required to expand the field of view in the vertical direction.
For the pitching mechanism, the existing scheme generally rotates the whole imaging module directly, which makes the mechanism bulky and is not conducive to miniaturization of the device. This embodiment therefore provides a new pitching mechanism. With reference to fig. 3, the pitching mechanism comprises a reflection assembly and a piezoelectric motor corresponding to the reflection assembly, so that the pitch angle of the reflection assembly is driven up and down by the piezoelectric motor to track the iris. Moreover, this pitching mechanism does not need to rotate the whole optical path; only the reflection assembly is rotated, which effectively reduces the footprint of the mechanism and further miniaturizes the device.
For the specific structure of the reflection assembly of the periscopic optical zoom imaging module in this embodiment, reference may be made to the reflection prism or reflection mirror shown in fig. 4 and the free-form surface mirror shown in fig. 8. The iris lens can be arranged near the face imaging module (above, below, to the left or to the right, according to the appearance requirements of the product). Fig. 7 shows an example in which, viewed from the front, the face lens is directly below the iris lens.
Correspondingly, when the micro control unit detects that the face of the target object is in the face detection field and not in the iris detection field, the micro control unit determines the relative position of the face of the target object and the iris detection field; and sending a third driving instruction to a third piezoelectric motor corresponding to the reflection assembly based on the relative position, and instructing the third piezoelectric motor to drive the reflection assembly to rotate (see fig. 9) so as to enable the face of the target object to be located in the iris detection field of view. The principle of iris tracking has been described in detail in the embodiment corresponding to fig. 1, and therefore, the description thereof is omitted here.
In addition, with reference to fig. 2, the apparatus may further include: a lighting module;
the lighting module includes multiple rows of illuminating lamps distributed around the iris imaging module, wherein at least two rows of illuminating lamps have different lamp beam orientations;
the micro control unit is further used for determining a target row of illuminating lamps whose lamp beam orientation matches the focusing distance, controlling the illuminating lamps of the target row to enter the powered-on state, and keeping the illuminating lamps of the other rows in the powered-off state.
The lamp board is set at a certain angle to the horizontal direction. Iris recognition can use NIR-band illumination at either 810 nm or 850 nm, preferably with LED or VCSEL devices, and a lamp beam angle of less than 30° is required.
Therefore, by reasonably configuring the layout of the illumination system, the resources required for illumination can be effectively reduced while the illumination conditions are satisfied.
Based on this, this embodiment establishes in advance the correspondences between the focusing distance and the focal length, and between the focusing distance and the displacement. After the focusing distance of the target object is detected, the focal length corresponding to the focusing distance is first determined according to the correspondence and the zoom assembly is driven to the position corresponding to that focal length; the displacement corresponding to the focusing distance is then determined according to the correspondence and the focusing assembly is driven by that displacement. Periscopic optical zooming is thus completed, rapid imaging of the iris can be realized, the equipment is made light and thin on the premise of ensuring the recognition range, and the thickness of the product can be controlled within 30 mm.
In addition, as for the device embodiment, since it is basically similar to the method embodiment, the description is relatively simple, and for the relevant points, reference may be made to part of the description of the method embodiment. Further, it should be noted that, among the respective components of the apparatus of the present specification, the components thereof are logically divided according to the functions to be implemented, but the present specification is not limited thereto, and the respective components may be newly divided or combined as necessary.
Fig. 12 is a schematic structural diagram of an electronic device provided in an embodiment of the present disclosure, and referring to fig. 12, the electronic device includes a processor, an internal bus, a network interface, a memory, and a non-volatile memory, and may also include hardware required by other services. The processor reads the corresponding computer program from the nonvolatile memory into the memory and then runs the computer program to form the iris recognition device on the logic level. Of course, besides the software implementation, this specification does not exclude other implementations, such as logic devices or combination of software and hardware, and so on, that is, the execution subject of the following processing flow is not limited to each logic unit, and may be hardware or logic devices.
The network interface, the processor and the memory may be interconnected by a bus system. The bus may be an ISA (Industry Standard Architecture) bus, a PCI (Peripheral Component Interconnect) bus, an EISA (Extended Industry Standard Architecture) bus, or the like. The bus may be divided into an address bus, a data bus, a control bus, etc. For ease of illustration, only one double-headed arrow is shown in FIG. 12, but that does not indicate only one bus or one type of bus.
The memory is used for storing programs. In particular, the program may include program code comprising computer operating instructions. The memory may include both read-only memory and random access memory, and provides instructions and data to the processor. The Memory may include a Random-Access Memory (RAM), and may further include a non-volatile Memory (non-volatile Memory), such as at least 1 disk Memory.
The processor is used for executing the program stored in the memory and specifically executing:
acquiring a focusing distance corresponding to a target object;
determining a first focal length corresponding to the focusing distance under a first constraint condition, and controlling a first piezoelectric motor to drive a zooming assembly of an iris imaging module to move to a position corresponding to the first focal length along an optical axis, wherein the first constraint condition is used for constraining the number of iris diameter pixels obtained by identifying the target object under the first focal length;
and determining a displacement corresponding to the focusing distance under a second constraint condition, and controlling a second piezoelectric motor to drive a focusing assembly of the iris imaging module to move the displacement so as to perform iris recognition on the target object, wherein the second constraint condition is used for constraining the moved focusing assembly to be located at an optimal focusing position.
The method performed by the iris recognition apparatus or the manager (Master) node according to the embodiment shown in fig. 11 of the present specification may be applied to or implemented by a processor. The processor may be an integrated circuit chip having signal processing capabilities. In implementation, the steps of the above method may be performed by integrated logic circuits of hardware in the processor or by instructions in the form of software. The processor may be a general-purpose processor, including a Central Processing Unit (CPU), a Network Processor (NP), and the like; it may also be a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or another programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component. The various methods, steps and logic blocks disclosed in the embodiments of this specification may be implemented or performed. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor or the like. The steps of a method disclosed in connection with the embodiments of the present specification may be embodied directly in a hardware decoding processor, or performed by a combination of hardware and software modules in the decoding processor. The software module may be located in a storage medium well known in the art, such as RAM, flash memory, ROM, PROM or EPROM, or a register. The storage medium is located in the memory, and the processor reads the information in the memory and completes the steps of the above method in combination with its hardware.
The iris recognition apparatus may also perform the methods illustrated in fig. 1 and 5, and implement the method performed by the manager node.
Based on the same inventive creation, the present specification also provides a computer readable storage medium storing one or more programs, which when executed by an electronic device including a plurality of application programs, cause the electronic device to perform the iris recognition method provided by the embodiment corresponding to fig. 1 and 5.
The embodiments in the present specification are described in a progressive manner, and the same and similar parts among the embodiments are referred to each other, and each embodiment focuses on the differences from the other embodiments. In particular, as for the system embodiment, since it is substantially similar to the method embodiment, the description is relatively simple, and reference may be made to the partial description of the method embodiment for relevant points.
The foregoing description has been directed to specific embodiments of this disclosure. Other embodiments are within the scope of the following claims. In some cases, the actions or steps recited in the claims can be performed in a different order than in the embodiments and still achieve desirable results. In addition, the processes depicted in the accompanying figures do not necessarily require the particular order shown, or sequential order, to achieve desirable results. In some embodiments, multitasking and parallel processing may also be possible or may be advantageous.
As will be appreciated by one skilled in the art, embodiments of the present description may be provided as a method, system, or computer program product. Accordingly, the description may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the description may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The description has been presented with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the description. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
In a typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include forms of volatile memory in a computer readable medium, Random Access Memory (RAM) and/or non-volatile memory, such as Read Only Memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
Computer-readable media, including both permanent and non-permanent, removable and non-removable media, may implement the information storage by any method or technology. The information may be computer readable instructions, data structures, modules of a program, or other data. Examples of computer storage media include, but are not limited to, phase change memory (PRAM), Static Random Access Memory (SRAM), Dynamic Random Access Memory (DRAM), other types of Random Access Memory (RAM), Read Only Memory (ROM), Electrically Erasable Programmable Read Only Memory (EEPROM), flash memory or other memory technology, compact disc read only memory (CD-ROM), Digital Versatile Discs (DVD) or other optical storage, magnetic cassettes, magnetic tape magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information that can be accessed by a computing device. As defined herein, a computer readable medium does not include a transitory computer readable medium such as a modulated data signal and a carrier wave.
It should also be noted that the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
As will be appreciated by one skilled in the art, embodiments of the present description may be provided as a method, system, or computer program product. Accordingly, the description may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the description may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The above description is only an example of the present specification, and is not intended to limit the present specification. Various modifications and alterations to this description will become apparent to those skilled in the art. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present specification should be included in the scope of the claims of the present specification.

Claims (13)

1. An iris identification method is applied to an iris identification system and comprises the following steps:
acquiring a focusing distance corresponding to a target object;
determining a first focal length corresponding to the focusing distance under a first constraint condition, and controlling a first piezoelectric motor to drive a zooming assembly of an iris imaging module to move to a position corresponding to the first focal length along an optical axis, wherein the first constraint condition is used for constraining the number of iris diameter pixels obtained by identifying the target object under the first focal length;
and determining a displacement corresponding to the focusing distance under a second constraint condition, and controlling a second piezoelectric motor to drive a focusing assembly of the iris imaging module to move the displacement so as to perform iris recognition on the target object, wherein the second constraint condition is used for constraining the moved focusing assembly to be located at an optimal focusing position.
2. The method of claim 1,
the focusing distance is detected by a distance measuring module; or,
the focusing distance is obtained by analyzing the interpupillary distance of the target object.
3. The method of claim 1, wherein the determining the first focal length corresponding to the focal distance under the first constraint comprises:
inquiring a first comparison table to obtain a first focal length corresponding to the focusing distance;
and the focal lengths corresponding to different focal distances under the first constraint condition are stored in the first comparison table.
4. The method of claim 1, wherein before the controlling the first piezo motor to drive the zoom component of the iris imaging module to move along the optical axis to the position corresponding to the first focal length, the method further comprises:
inquiring a second comparison table to obtain a first position corresponding to the first focal length;
and the second comparison table stores the positions of the zooming components under different focal lengths.
5. The method according to claim 1, wherein the determining a displacement corresponding to the focus distance under the second constraint condition comprises:
inquiring a third comparison table to obtain the front depth of field and the rear depth of field corresponding to the focusing distance;
determining a displacement amount required by the focusing assembly to move to the optimal focusing position at the focusing distance based on the front depth of field and the rear depth of field;
and the third comparison table stores the foreground depth and the back depth corresponding to different focus distances under the second constraint condition.
6. The method according to claim 1, wherein the determining a displacement corresponding to the focus distance under the second constraint condition comprises:
and inducing the phase variation under the focusing distance through a phase focusing sensor, and determining the displacement corresponding to the phase variation.
7. The method of claim 1, further comprising:
when the face of the target object is detected to be in a face detection field of view and not in an iris detection field of view of the iris imaging module, determining the relative position of the face of the target object and the iris detection field of view;
and controlling a third piezoelectric motor to drive a reflection assembly of the iris imaging module to rotate based on the relative position so as to enable the face of the target object to be positioned in the iris detection field of view.
8. The method of claim 1, wherein the illumination module of the iris recognition system comprises multiple rows of illuminating lamps distributed around the iris imaging module, wherein at least two rows of illuminating lamps have different lamp beam orientations;
the method further comprises:
determining a target row of illuminating lamps whose lamp beam orientation matches the focusing distance;
and controlling the illuminating lamps of the target row to enter a power-on state, and maintaining the illuminating lamps of other rows to be in a power-off state.
9. An iris recognition apparatus, comprising: a micro control unit, a periscopic optical zoom imaging module and piezoelectric motors, wherein the periscopic optical zoom imaging module comprises a zoom assembly and a focusing assembly, and wherein:
the micro control unit is used for acquiring a focusing distance corresponding to a target object; determining a first focal length corresponding to the focusing distance under a first constraint condition, and sending a first driving instruction to a first piezoelectric motor corresponding to the zooming component to indicate the first piezoelectric motor to drive the zooming component to move to a position corresponding to the first focal length along an optical axis, wherein the first constraint condition is used for constraining the number of iris diameter pixels obtained by identifying the target object under the first focal length;
the micro control unit is further configured to determine a displacement corresponding to the focus distance under a second constraint condition, send a second driving instruction to a second piezoelectric motor corresponding to the focus assembly, instruct the second piezoelectric motor to drive the focus assembly to move the displacement, and perform iris recognition on the target object, where the second constraint condition is used to constrain the focus assembly after movement to be located at an optimal focus position.
10. The apparatus of claim 9, further comprising:
the distance measuring module is used for detecting the focusing distance corresponding to the target object and providing the focusing distance to the micro control unit; or,
the visible light imaging module is used for acquiring a face image of a target object and providing the face image to the micro control unit, and the micro control unit identifies the face image to obtain the interpupillary distance of the target object and determines the focusing distance corresponding to the interpupillary distance.
11. The apparatus of claim 9, wherein the periscopic optical zoom imaging module further comprises: a reflective component;
the micro control unit is further used for determining the relative position of the face of the target object and the iris detection visual field when detecting that the face of the target object is in the face detection visual field and not in the iris detection visual field; and sending a third driving instruction to a third piezoelectric motor corresponding to the reflection assembly based on the relative position, and instructing the third piezoelectric motor to drive the reflection assembly to rotate so as to enable the face of the target object to be located in the iris detection field of view.
12. The apparatus of claim 9, further comprising: a lighting module;
the lighting module includes multiple rows of illuminating lamps distributed around the iris imaging module, wherein at least two rows of illuminating lamps have different lamp beam orientations;
the micro control unit is further used for determining a target row of illuminating lamps whose lamp beam orientation matches the focusing distance; and controlling the illuminating lamps of the target row to enter a power-on state, and maintaining the illuminating lamps of other rows in a power-off state.
13. An electronic device, comprising:
a processor; and
a memory arranged to store computer executable instructions that, when executed, cause the processor to perform the steps of the method of any one of claims 1 to 8.
CN202011616344.7A 2020-12-30 2020-12-30 Iris identification method, device and equipment Active CN114765661B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202011616344.7A CN114765661B (en) 2020-12-30 2020-12-30 Iris identification method, device and equipment
PCT/CN2021/142690 WO2022143813A1 (en) 2020-12-30 2021-12-29 Iris recognition method and apparatus, and electronic device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011616344.7A CN114765661B (en) 2020-12-30 2020-12-30 Iris identification method, device and equipment

Publications (2)

Publication Number Publication Date
CN114765661A true CN114765661A (en) 2022-07-19
CN114765661B CN114765661B (en) 2022-12-27

Family

ID=82260265

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011616344.7A Active CN114765661B (en) 2020-12-30 2020-12-30 Iris identification method, device and equipment

Country Status (2)

Country Link
CN (1) CN114765661B (en)
WO (1) WO2022143813A1 (en)


Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8260008B2 (en) * 2005-11-11 2012-09-04 Eyelock, Inc. Methods for performing biometric recognition of a human eye and corroboration of same
CN104268517B (en) * 2014-09-19 2018-11-09 武汉虹识技术有限公司 A kind of Atomatic focusing method and system applied to iris authentication system
CN107341467A (en) * 2017-06-30 2017-11-10 广东欧珀移动通信有限公司 Method for collecting iris and equipment, electronic installation and computer-readable recording medium

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100202666A1 (en) * 2009-02-06 2010-08-12 Robert Bosch 30 02 20 Time-of-flight sensor-assisted iris capture system and method
CN101814129A (en) * 2009-02-19 2010-08-25 中国科学院自动化研究所 Automatically focused remote iris image acquisition device, method and recognition system
CN105874473A (en) * 2014-01-02 2016-08-17 虹膜技术公司 Apparatus and method for acquiring image for iris recognition using distance of facial feature
CN108369338A (en) * 2015-12-09 2018-08-03 快图有限公司 Image capturing system
CN108446648A (en) * 2018-03-26 2018-08-24 北京上古视觉科技有限公司 A kind of iris capturing system and iris authentication system

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
GAO YONGFENG: "Optical Design of an Iris Recognition Lens", Chinese Journal of Lasers *

Also Published As

Publication number Publication date
WO2022143813A1 (en) 2022-07-07
CN114765661B (en) 2022-12-27

Similar Documents

Publication Publication Date Title
US10802283B2 (en) Wearable device and method for outputting virtual image
CN1892676B (en) Apparatus and method for face/iris combination optical imagine
US9535537B2 (en) Hover detection in an interactive display device
US9052414B2 (en) Virtual image device
US10002293B2 (en) Image collection with increased accuracy
RU2608690C2 (en) Light projector and vision system for distance determination
TW201531730A (en) Information processing apparatus and information processing method
US10592739B2 (en) Gaze-tracking system and method of tracking user's gaze
US11860375B2 (en) Virtual reality display device and method for presenting picture
WO2024041312A1 (en) Eyeball tracking apparatus and method, display apparatus, device, and medium
CN114765661B (en) Iris identification method, device and equipment
CN1119809A (en) Viewpoint detecting device
US9262983B1 (en) Rear projection system with passive display screen
ES2900248T3 (en) Determination of the ocular surface contour using multifocal keratometry
US10268040B2 (en) Display box
JP4622541B2 (en) Iris photography device
CN103024259A (en) Imaging apparatus and control method of imaging apparatus
CN116569221A (en) Flexible illumination for imaging systems
CN116529787A (en) Multi-wavelength biological identification imaging system
CN116583885A (en) Gesture optimization in biometric authentication systems
EP3139586B1 (en) Image shooting processing method and device
EP4148465A1 (en) Radar system, photodetector, automobile, and photodetection method
JPH10179521A (en) Visual axis contact eye detecting method and device, and storage medium
TWI674000B (en) Visual tracking system having marks with different image distances and method of marking the images
CN116149051A (en) Lens assembly, infrared light supplementing method and eye movement tracking method and system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant