WO2022143813A1 - Iris recognition method and apparatus, and electronic device - Google Patents


Info

Publication number
WO2022143813A1
WO2022143813A1 · PCT/CN2021/142690 · CN2021142690W
Authority
WO
WIPO (PCT)
Prior art keywords
iris
target object
focusing
imaging module
distance
Prior art date
Application number
PCT/CN2021/142690
Other languages
English (en)
French (fr)
Inventor
卢洁玲
任志浩
Original Assignee
杭州海康威视数字技术股份有限公司
Priority date
Filing date
Publication date
Application filed by 杭州海康威视数字技术股份有限公司
Publication of WO2022143813A1

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/67Focus control based on electronic image sensor signals
    • H04N23/671Focus control based on electronic image sensor signals in combination with active ranging signals, e.g. using light or sound signals emitted toward objects
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/18Eye characteristics, e.g. of the iris
    • G06V40/19Sensors therefor
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/61Control of cameras or camera modules based on recognised objects
    • H04N23/611Control of cameras or camera modules based on recognised objects where the recognised objects include parts of the human body

Definitions

  • This document relates to the field of computer technology, in particular to an iris recognition method, device and electronic device.
  • Iris recognition technology refers to the identification technology based on the iris in the eyes, which is generally used in security equipment (such as access control equipment, etc.) and places with high security requirements.
  • However, existing iris recognition products are too thick, and there is a need to make iris recognition devices lighter, thinner, and smaller.
  • Embodiments of the present application provide an iris recognition method, device, and electronic device, which are used to realize the thinning of the device on the premise of ensuring the recognition range.
  • The details are as follows:
  • An embodiment of the present application provides an iris recognition method, applied to an iris recognition system, including:
  • obtaining a focusing distance corresponding to a target object; determining a first focal length corresponding to the focusing distance under a first constraint condition, and controlling a first piezoelectric motor to drive a zoom component of an iris imaging module to move along the optical axis to a position corresponding to the first focal length, where the first constraint condition constrains the number of iris-diameter pixels obtained when identifying the target object at the first focal length;
  • determining a displacement corresponding to the focusing distance under a second constraint condition, and controlling a second piezoelectric motor to drive a focusing component of the iris imaging module to move by the displacement, so as to perform iris recognition on the target object; the second constraint condition constrains the moved focusing component to be at the best focusing position.
  • the embodiment of the present application also provides an iris recognition device, including: a micro control unit, a periscope optical zoom imaging module and a piezoelectric motor, the periscope optical zoom imaging module includes: a zoom component and a focusing component, wherein:
  • the micro-control unit is configured to obtain the focusing distance corresponding to the target object; determine the first focal length corresponding to the focusing distance under the first constraint condition, and send a first driving instruction to the first piezoelectric motor corresponding to the zoom assembly, instructing the first piezoelectric motor to drive the zoom assembly to move along the optical axis to the position corresponding to the first focal length, where the first constraint condition constrains the number of iris-diameter pixels obtained when identifying the target object at the first focal length;
  • the micro-control unit is further configured to determine the displacement corresponding to the focusing distance under the second constraint condition, and send a second driving instruction to the second piezoelectric motor corresponding to the focusing assembly, instructing the second piezoelectric motor to drive the focusing assembly to move by the displacement so as to perform iris recognition on the target object, where the second constraint condition constrains the moved focusing assembly to be at the best focusing position.
  • An embodiment of the present application also provides an electronic device, including:
  • a processor; and
  • a memory arranged to store computer-executable instructions which, when executed, cause the processor to perform any of the iris recognition methods in this application.
  • Embodiments of the present application further provide a computer-readable storage medium storing one or more programs which, when executed by an electronic device including a plurality of application programs, cause the electronic device to perform any of the iris recognition methods in this application.
  • An embodiment of the present application pre-establishes the correspondence between focusing distance and focal length, and between focusing distance and displacement; after the focusing distance of the target object is detected, the focal length corresponding to the focusing distance is first determined from the correspondence and the zoom component is driven to the position corresponding to that focal length, then the displacement corresponding to the focusing distance is determined from the correspondence and the focusing component is driven to move by that displacement, completing the periscope optical zoom. This enables rapid iris imaging over a certain range and realizes the thinning of the device while ensuring the recognition range.
  • FIG. 1 is a schematic flowchart of an iris recognition method provided by an embodiment of the present application.
  • FIG. 2 is a schematic structural diagram of an iris recognition system provided by an embodiment of the present application.
  • FIG. 3 is a schematic structural diagram of an iris imaging module provided by an embodiment of the present application.
  • FIG. 4 is a schematic diagram of an optical path structure of an iris imaging module provided by an embodiment of the present application.
  • FIG. 5 is a schematic flowchart of a method for adjusting the lens field-of-view angle of an iris imaging module according to an embodiment of the present application.
  • FIG. 6 is a schematic diagram of the distribution of an iris field of view and a human-face field of view provided by an embodiment of the present application.
  • FIG. 7 is a schematic diagram of a frontal layout of an iris lens and a face lens provided by an embodiment of the present application.
  • FIG. 8 is a schematic structural diagram of an iris imaging module provided by another embodiment of the present application.
  • FIG. 9 is a schematic rotation diagram of a reflection assembly provided by an embodiment of the present application.
  • FIG. 10 is a top schematic diagram of a lighting layout provided by an embodiment of the application.
  • FIG. 11 is a schematic structural diagram of an iris recognition device provided by an embodiment of the application.
  • FIG. 12 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
  • FIG. 1 is a schematic flowchart of an iris recognition method provided by an embodiment of the application, which can be executed by an iris recognition system. Referring to FIG. 1 , the method may specifically include the following steps:
  • Step 102: Obtain the focusing distance corresponding to the target object.
  • The focusing distance may refer to the object-image distance, that is, the sum of the distance from the lens to the object (the target object) and the distance from the lens to the photosensitive element.
  • the target object is the object to be subjected to iris recognition.
  • Step 102 is described in detail below:
  • The first implementation can be: the focusing distance is detected by a ranging module. Specifically:
  • the ranging module is triggered to detect the focusing distance between the target object and the iris recognition system and report it to the processor.
  • The second implementation can be: the focusing distance is obtained by analyzing the interpupillary distance of the target object. Specifically:
  • the visible-light imaging module collects a face image of the target object and reports it to the processor; the processor performs face detection on the face image to obtain the human-eye IPD (interpupillary distance), and determines the focusing distance of the target object based on a pre-calibrated correspondence between IPD and focusing distance.
  • The processor is preferably a micro-control unit (MCU), which further facilitates miniaturization of the product.
  • In the first implementation, the focusing distance of the target object can be accurately detected by the ranging module.
  • In the second implementation, the focusing distance of the target object can be detected by the existing visible-light imaging module, avoiding additional hardware and effectively reducing system cost and the complexity of the system structure.
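  • The second implementation can be sketched as follows (Python). The calibration points below are hypothetical placeholders; a real system would calibrate measured IPD in pixels against known focusing distances for its own lens.

```python
# Hypothetical calibration: (ipd_pixels, focusing_distance_cm) pairs.
# A larger IPD in pixels means the face is closer to the camera.
CALIBRATION = [(60, 130), (80, 100), (110, 75), (150, 55), (200, 40), (260, 30)]

def focusing_distance_from_ipd(ipd_pixels: float) -> float:
    """Estimate the focusing distance (cm) by linear interpolation over the table."""
    pts = sorted(CALIBRATION)              # ascending by IPD
    if ipd_pixels <= pts[0][0]:
        return pts[0][1]                   # farther than the calibrated range
    if ipd_pixels >= pts[-1][0]:
        return pts[-1][1]                  # closer than the calibrated range
    for (x0, y0), (x1, y1) in zip(pts, pts[1:]):
        if x0 <= ipd_pixels <= x1:
            t = (ipd_pixels - x0) / (x1 - x0)
            return y0 + t * (y1 - y0)
```

The clamping at both ends mirrors the fact that recognition is only attempted inside the calibrated distance range.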
  • this embodiment shows a specific implementation manner of step 102 here. Of course, it should be understood that step 102 may also be implemented in other manners, which is not limited in this embodiment.
  • Step 104: Determine the first focal length corresponding to the focusing distance under the first constraint condition, and control the first piezoelectric motor to drive the zoom component of the iris imaging module to move along the optical axis to the position corresponding to the first focal length; the first constraint condition constrains the number of iris-diameter pixels obtained when identifying the target object at the first focal length.
  • The first piezoelectric motor may refer to a built-in motor dedicated to driving the zoom component. A piezoelectric motor is a device that converts electrical energy into mechanical energy via the inverse piezoelectric effect of piezoelectric materials; it is immune to electromagnetic-field interference and produces little noise, making it suitable for miniaturized equipment.
  • the structure of the iris imaging module will be schematically described below with reference to FIG. 3:
  • the iris imaging module includes a periscope zoom lens and a piezoelectric motor, and the periscope zoom lens includes a zoom component (Zoom), a focus component (Focus), a reflection component and an image sensor (Sensor), wherein:
  • The optical path corresponding to the zoom component, focusing component, reflection component, and image sensor is shown in Fig. 4; the reflection component may refer to the reflection prism in Fig. 4.
  • This embodiment takes a 5MP sensor (pixel size 2 μm) and a 3× optical zoom system as an example; it is not limited to these system parameters, and suitable parameters can be derived for the requirements of different application scenarios. Considering that the diameter of an adult iris is about 11 mm and that iris recognition algorithms require a definite number of iris-diameter pixels (for example, 150-220 pixels), a 3× optical zoom that changes the focal length from 12 mm to 36 mm achieves iris recognition over the 30 cm-130 cm range.
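  • Under a thin-lens approximation, the stated parameters can be checked numerically: with magnification m ≈ f/(d − f), an 11 mm iris imaged on 2 μm pixels yields roughly the required 150-220 iris-diameter pixels across the zoom range (the near end slightly exceeds it in this idealized model). This is an illustrative sketch, not a calculation taken from the application itself.

```python
def iris_pixels(focal_mm: float, distance_mm: float,
                iris_mm: float = 11.0, pixel_um: float = 2.0) -> float:
    """Approximate iris-diameter pixel count via thin-lens magnification m = f/(d - f)."""
    m = focal_mm / (distance_mm - focal_mm)      # image/object magnification
    return iris_mm * m * 1000.0 / pixel_um       # iris image size in um over um/pixel

# 30 cm at the 12 mm end, 130 cm at the 36 mm end of the 3x zoom
near = iris_pixels(12, 300)     # roughly 229 px
far = iris_pixels(36, 1300)     # roughly 157 px
```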
  • Specifically, step 104 may be:
  • querying the first comparison table (Table 1 below) to obtain the first focal length corresponding to the focusing distance; converting the first focal length into a voltage to drive the piezoelectric motor, and controlling the ZOOM assembly through the piezoelectric motor to move back and forth along the optical axis, achieving optical zoom.
  • the first comparison table stores focal lengths corresponding to different focusing distances under the first constraint condition.
  • For example, when the focusing distance is in the range [30, 40) cm, a focal length of 12 mm can be selected; when it is in the range [40, 55) cm, a focal length of 16 mm can be selected; and so on, which will not be repeated here.
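  • Table 1 can be sketched as a simple range lookup. Only the first two rows ([30, 40) cm → 12 mm, [40, 55) cm → 16 mm) come from the text; the remaining rows are hypothetical fillers up to the 36 mm / 130 cm end of the range.

```python
# (lower_cm, upper_cm, focal_mm); upper bound exclusive.
# Rows beyond the first two are hypothetical.
TABLE_1 = [(30, 40, 12), (40, 55, 16), (55, 75, 20), (75, 100, 28), (100, 130, 36)]

def first_focal_length(focus_distance_cm: float):
    """Return the focal length (mm) for a focusing distance, or None if out of range."""
    for lower, upper, focal in TABLE_1:
        if lower <= focus_distance_cm < upper:
            return focal
    return None  # outside the 30-130 cm recognition range
```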
  • The position corresponding to each focal length can also be recorded by means of a pre-built comparison table, specifically:
  • the second comparison table is queried to obtain the first position corresponding to the first focal length; wherein, the second comparison table stores the positions of the zoom components under different focal lengths.
  • the first position can be obtained by looking up the table, and then the ZOOM assembly can be controlled to move to the first position along the optical axis by the piezoelectric motor.
  • This embodiment establishes the correspondence among focusing distance, focal length, and zoom-component position via pre-built comparison tables, so that once the focusing distance is acquired, the corresponding focal length and its corresponding position can be determined quickly, effectively improving the efficiency of optical zoom.
  • this embodiment shows a specific implementation manner of step 104 here. Of course, it should be understood that step 104 may also be implemented in other manners, which are not limited in this embodiment.
  • Step 106: Determine the displacement corresponding to the focusing distance under the second constraint condition, and control the second piezoelectric motor to drive the focusing component of the iris imaging module to move by the displacement, so as to perform iris recognition on the target object; the second constraint condition constrains the moved focusing component to be at the best focusing position.
  • The best focusing position may refer to the position at which the captured image has the highest quality. It is understandable that the displacement adjustment of the piezoelectric motor has finite precision, so a certain error around this position is allowed: for example, if the calculated position with the highest image quality is 12.22 and the displacement precision of the second piezoelectric motor driving the focusing component is 0.1, the best focusing position can be taken as 12.2.
  • The displacement may refer to the displacement required for the focusing assembly to move from a reference position (a pre-calibrated position used as a positional reference) to the best focusing position; in scenarios where the iris imaging module performs recognition continuously, the displacement may instead be the displacement required for the focusing assembly to move from its current position to the best focusing position.
  • the second constraint condition indicates that after the focus assembly of the iris imaging module moves by the displacement amount, the focus assembly is at the optimal focus position.
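  • The precision argument above (a computed 12.22 commanded as 12.2 with a 0.1 motor resolution) can be sketched as follows; the function and its parameters are illustrative.

```python
def commanded_displacement(best_pos: float, current_pos: float,
                           resolution: float = 0.1) -> float:
    """Displacement to command, quantized to the motor's displacement resolution."""
    raw = best_pos - current_pos        # ideal travel to the best focusing position
    steps = round(raw / resolution)     # nearest whole number of motor steps
    return steps * resolution
```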
  • The iris is the flat annular membrane in the middle layer of the eyeball wall, located between the cornea and the lens; viewed from the front, it is the ring with a distinctive texture between the pupil and the white of the eye, and the iris texture can be used for authentication.
  • step 106 may be:
  • Front depth of field refers to the range of acceptable sharpness extending from the focus point toward the camera; rear depth of field refers to the range extending from the focus point away from the camera to the farthest sharp point.
  • the optimal focusing positions of the focusing components under different focal lengths can be pre-calculated, and then the displacements required for the focusing components to move to the optimal focusing positions can be calculated.
  • the method of calculating the displacement required for the focusing component to move to the optimal focusing position can be specifically exemplified as follows:
  • Example 1: the method based on an ordinary image sensor plus a ranging module:
  • After the ranging module detects the focusing distance, images are collected at positions before and after the corresponding focus position and their contrast is computed to determine the best focusing position, from which the displacement of the focusing assembly from the reference position to the best focusing position is calculated; by analogy, the displacements from the reference position to the best focusing position at different focusing distances can be calculated.
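  • Example 1 amounts to a contrast-maximization search. The sketch below scores sharpness with a Laplacian-style second difference; `capture` stands in for grabbing a frame with the focusing component at a given position and is an assumption of this sketch.

```python
def sharpness(image) -> float:
    """Contrast score: mean squared 4-neighbour Laplacian over interior pixels."""
    total, n = 0.0, 0
    for r in range(1, len(image) - 1):
        for c in range(1, len(image[r]) - 1):
            lap = (image[r - 1][c] + image[r + 1][c]
                   + image[r][c - 1] + image[r][c + 1] - 4 * image[r][c])
            total += lap * lap
            n += 1
    return total / max(n, 1)

def best_focus_position(positions, capture):
    """Capture a frame at each candidate position; keep the highest-contrast one."""
    return max(positions, key=lambda p: sharpness(capture(p)))
```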
  • Example 2: the method based on a phase-detection focus sensor:
  • the phase change at the focusing distance is sensed by a phase-detection autofocus sensor (PDAF sensor), and the displacement corresponding to the phase change is determined.
  • The realization principle can be as follows: after the focal length is confirmed, the sensor automatically calculates the phase change and its corresponding displacement and reports it to the processor; the processor converts it into a voltage and drives the motor to complete focusing.
  • the constructed third comparison table can be the following table 2:
  • In one example, an existing ranging module plus a common image sensor can be used to calculate the best focusing positions at different focusing distances, effectively reducing equipment cost; in another example, a phase-detection focus sensor senses the displacement at different focusing distances, effectively improving the efficiency and accuracy of the displacement calculation.
  • this embodiment shows a specific implementation manner of step 106 here. Of course, it should be understood that step 106 may also be implemented in other manners, which is not limited in this embodiment.
  • To sum up, the correspondence between focusing distance and focal length and between focusing distance and displacement is established in advance; after the focusing distance of the target object is detected, the focal length corresponding to the focusing distance is first determined from the correspondence and the zoom component is driven to the corresponding position, then the displacement corresponding to the focusing distance is determined and the focusing component is driven to move by that displacement, completing the periscope optical zoom. This enables rapid iris imaging and thins the device while ensuring the recognition range; the product thickness can be controlled within 30 mm.
  • The present application further considers that the longitudinal (vertical) field of view of iris recognition products is too small, usually within 30°; compared with the roughly 50° field of view of typical face recognition devices, a 30° vertical field of view seriously degrades the experience of users of certain heights.
  • As shown in FIG. 6, faces A, B, and C are all within the field of view of the face lens, but only face A is within the field of view of the iris lens; a user whose height falls in the face-B area needs to bend over, while a user in the face-C area needs to stand on tiptoe.
  • the face lens may refer to the visible light imaging module in FIG.
  • the face detection field of view may refer to the field of view of the visible light imaging module
  • the iris lens may refer to the above-mentioned periscope zoom lens
  • the iris detection field of view may refer to the field of view of the periscope zoom lens.
  • another embodiment of the present application provides a method for adjusting the angle of view of a lens of an iris imaging module.
  • the method can be performed by the iris recognition system corresponding to FIG. 2 .
  • the method may specifically include the following steps:
  • Step 502: When it is detected that the face of the target object is within the face detection field of view but not within the iris detection field of view of the iris imaging module, determine the relative position between the face of the target object and the iris detection field of view;
  • Step 504: Based on the relative position, control the third piezoelectric motor to drive the reflection component of the iris imaging module to rotate, so that the face of the target object is located within the iris detection field of view.
  • A face imaging module is arranged near the iris imaging module (above, below, to the left, or to the right of it; the layout can be chosen according to product-appearance requirements).
  • the face lens is directly below the iris lens.
  • the face field angle is greater than the iris field angle, and the position of the face and the angle of the iris reflection mechanism (equivalent to the reflection component) are calibrated.
  • Controlling the piezoelectric motor to drive the reflection component of the iris imaging module to rotate so that the face of the target object is located within the iris detection field of view includes: when the relative position indicates that the face is within the face detection field of view but offset from the iris detection field of view in the pitch direction, controlling the third piezoelectric motor to drive the reflection component of the iris imaging module to adjust its pitch angle, so that the face of the target object is located within the iris detection field of view. If the situation of face areas B and C in FIG. 6 occurs, the relative position indicates that the reflection component needs to be adjusted in the pitch direction.
  • the iris can be tracked by controlling the up and down pitch angle of the reflection mechanism.
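  • The pitch-tracking step can be sketched as mapping the face's vertical offset in the face image to a mirror pitch command. The half-angle relation (a mirror rotated by θ deflects the reflected beam by 2θ) is standard optics; the frame size, field-of-view value, and function itself are illustrative assumptions, not values from the application.

```python
def pitch_command_deg(face_cy: float, frame_h: float, vfov_deg: float) -> float:
    """Mirror rotation (deg) to steer the iris FOV onto a face at pixel row face_cy."""
    offset = (face_cy - frame_h / 2.0) / (frame_h / 2.0)   # -1 (top) .. +1 (bottom)
    beam_deflection = offset * (vfov_deg / 2.0)            # required beam deflection, deg
    return beam_deflection / 2.0                           # mirror rotates half of it
```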
  • the existing solutions generally use the entire iris imaging module as a pitch mechanism to directly rotate, resulting in a huge mechanism.
  • the entire iris imaging module does not need to be rotated, and only the reflective component can be rotated.
  • the reflective component can be a reflective prism or mirror as shown in Figure 4, or can be a free-form surface mirror as shown in Figure 8 below.
  • the processor can control the reflection component to rotate in the plane composed of the X axis and the Y axis through the piezoelectric motor, as shown in FIG. 9 , wherein the rotation amount of the reflection component corresponds to the relative position.
  • Using a piezoelectric motor to control the rotation of the reflection assembly expands the vertical field of view of the lens, so the product accommodates users of different heights; owing to the piezoelectric motor's compactness, the iris recognition product can be miniaturized, and thus, on the basis of the embodiment corresponding to FIG. 1, miniaturization and thinning of the product can be achieved at the same time.
  • the present application also proposes another embodiment.
  • This embodiment provides a lighting module.
  • The lighting module of the iris recognition system includes a plurality of columns of illumination lamps distributed around the iris imaging module, at least two columns of which have different beam orientations; the method further includes:
  • determining the target column of illumination lamps whose beam orientation matches the focusing distance; controlling the lamps of the target column to enter a power-on state while keeping the lamps of the other columns powered off.
  • Iris recognition uses the NIR band for illumination, either 810nm or 850nm.
  • the beam angle of the lamp is required to be less than 30°.
  • Multiple columns of LEDs can be arranged on the light board, and the number of columns can be set according to the recognition distance.
  • the LED light beams of different columns face different directions.
  • LEDs in different columns can be controlled individually, and the LEDs in the corresponding columns can be lit after the distance is determined, so as to reduce power consumption.
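  • Per-column LED control can be sketched as a distance-band lookup; the three bands and the column count below are hypothetical.

```python
# (lower_cm, upper_cm, column_index): hypothetical distance band per LED column.
LED_COLUMNS = [(30, 55, 0), (55, 90, 1), (90, 130, 2)]

def drive_columns(focus_distance_cm: float, n_columns: int = 3):
    """Return per-column power states; only the column matching the distance is lit."""
    target = None
    for lower, upper, col in LED_COLUMNS:
        if lower <= focus_distance_cm < upper:
            target = col
            break
    return [col == target for col in range(n_columns)]
```

Lighting only the matching column reduces power consumption, as the text notes.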
  • FIG. 11 is a schematic structural diagram of an iris recognition device provided by an embodiment of the application.
  • the device may specifically include: a micro-control unit MCU, a periscope optical zoom imaging module, and a piezoelectric motor.
  • the periscope optical zoom imaging module includes: a zoom component and a focusing component, wherein:
  • the micro-control unit is configured to obtain the focusing distance corresponding to the target object; determine the first focal length corresponding to the focusing distance under the first constraint condition, and send a first driving instruction to the first piezoelectric motor corresponding to the zoom assembly, instructing the first piezoelectric motor to drive the zoom assembly to move along the optical axis to the position corresponding to the first focal length, where the first constraint condition constrains the number of iris-diameter pixels obtained when identifying the target object at the first focal length;
  • the micro-control unit is further configured to determine the displacement corresponding to the focusing distance under the second constraint condition, and send a second driving instruction to the second piezoelectric motor corresponding to the focusing assembly, instructing the second piezoelectric motor to drive the focusing assembly to move by the displacement so as to perform iris recognition on the target object, where the second constraint condition constrains the moved focusing assembly to be at the best focusing position.
  • The MCU corresponds to the processor in Figure 2, and the periscope optical zoom imaging module corresponds to the iris imaging module in Figure 2; their specific working principles have been detailed above and are not repeated here.
  • During zooming, after the MCU obtains the focal length, it converts the focal length into a voltage to drive the piezoelectric motor and controls the ZOOM component through the piezoelectric motor to move back and forth along the optical axis, realizing optical zoom; during focusing, after the MCU obtains the distance information, it queries the displacement required to reach the best focusing position, converts it into a voltage to drive the piezoelectric motor, and controls the FOCUS component through the piezoelectric motor to move back and forth along the optical axis to focus.
  • the apparatus may also include:
  • a ranging module configured to detect the focusing distance corresponding to the target object and provide it to the micro-control unit;
  • a visible-light imaging module, configured to collect a face image of the target object and provide it to the micro-control unit; the micro-control unit recognizes the face image to obtain the interpupillary distance of the target object and determines the focusing distance corresponding to the interpupillary distance.
  • The ranging module and the visible-light imaging module may both be integrated on the device, or only one of them may be integrated.
  • the iris lens (ie the lens corresponding to the periscope optical zoom imaging module) has a smaller field of view, which is extremely unfriendly to some tall users compared to face recognition.
  • The pitching mechanism includes the reflection component and its corresponding piezoelectric motor; the piezoelectric motor drives the pitch angle of the reflection component up and down so as to track the iris.
  • The pitch structure does not need to rotate the entire optical path; only the reflection component is rotated, which effectively reduces the area occupied by the mechanism and further miniaturizes the equipment.
  • The iris lens can be arranged near the face imaging module (above, below, to the left, or to the right of it, per product-appearance requirements). As shown in Figure 6 below, viewed from the front, the face lens is directly below the iris lens.
  • When the micro-control unit detects that the face of the target object is within the face detection field of view but not within the iris detection field of view, it determines the relative position between the face of the target object and the iris detection field of view; based on the relative position, it sends a third driving instruction to the third piezoelectric motor corresponding to the reflection component, instructing the third piezoelectric motor to drive the reflection component to rotate (see FIG. 9), so that the face of the target object is located within the iris detection field of view.
  • the principle of iris tracking has been described in detail in the embodiment corresponding to FIG. 1 , so it is not described here again.
  • the apparatus may further include: a lighting module
  • the lighting module includes: multiple rows of lighting lamps, the multiple rows of lighting lamps are distributed around the iris imaging module, and at least two rows of lighting lamps have different light beam orientations;
  • the micro-control unit is further configured to determine the target column of lighting lamps whose beam orientation matches the focusing distance, control the lamps of the target column to enter a power-on state, and keep the lamps of the other columns in a non-powered state.
  • the iris recognition can be illuminated in the NIR band, either 810nm or 850nm.
  • LED or VCSEL devices are used for illumination, and the light beam angle is required to be less than 30°.
  • To sum up, the correspondence between focusing distance and focal length and between focusing distance and displacement is established in advance; after the focusing distance of the target object is detected, the focal length corresponding to the focusing distance is first determined from the correspondence and the zoom component is driven to the corresponding position, then the displacement corresponding to the focusing distance is determined and the focusing component is driven to move by that displacement, completing the periscope optical zoom. This enables rapid iris imaging and thins the device while ensuring the recognition range; the product thickness can be controlled within 30 mm.
  • FIG. 12 is a schematic structural diagram of an electronic device according to an embodiment of the application.
  • the electronic device includes a processor, an internal bus, a network interface, a memory, and a non-volatile memory, and of course may also include other business required hardware.
  • the processor reads the corresponding computer program from the non-volatile memory into the memory and runs it, forming an iris recognition device on a logical level.
  • this application does not exclude other implementations, such as logic devices or a combination of software and hardware; that is, the execution subject is not limited to logical units and may also be hardware or a logic device.
  • the bus can be an ISA (Industry Standard Architecture, industry standard architecture) bus, a PCI (Peripheral Component Interconnect, peripheral component interconnect standard) bus, or an EISA (Extended Industry Standard Architecture, extended industry standard structure) bus and the like.
  • the bus can be divided into an address bus, a data bus, a control bus, and the like. For ease of representation, only one bidirectional arrow is shown in FIG. 12, but it does not mean that there is only one bus or one type of bus.
  • Memory is used to store programs.
  • the program may include program code, and the program code includes computer operation instructions.
  • the memory which may include read-only memory and random access memory, provides instructions and data to the processor.
  • the memory may include high-speed random-access memory (Random-Access Memory, RAM), and may also include non-volatile memory (non-volatile memory), such as at least one disk memory.
  • a processor configured to execute the program stored in the memory, and specifically execute:
  • determine the first focal length corresponding to the focus distance under the first constraint condition, and control the first piezoelectric motor to drive the zoom component of the iris imaging module to move along the optical axis to the position corresponding to the first focal length, the first constraint condition being used to constrain the number of iris-diameter pixels obtained by recognizing the target object at the first focal length;
  • the second constraint condition is used to constrain the focusing component, after the movement, to be at the best focus position.
  • the above-mentioned method performed by the iris recognition apparatus or the manager (Master) node disclosed in the embodiment shown in FIG. 11 of the present application may be applied to a processor, or implemented by a processor.
  • a processor may be an integrated circuit chip with signal processing capabilities.
  • each step of the above-mentioned method can be completed by a hardware integrated logic circuit in a processor or an instruction in the form of software.
  • the above-mentioned processor can be a general-purpose processor, including a central processing unit (Central Processing Unit, CPU), a network processor (Network Processor, NP), etc.; it can also be a digital signal processor (Digital Signal Processor, DSP), an application-specific integrated circuit (Application Specific Integrated Circuit, ASIC), a field-programmable gate array (Field-Programmable Gate Array, FPGA) or other programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component.
  • a general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
  • the steps of the method disclosed in conjunction with the embodiments of the present application may be directly embodied as executed by a hardware decoding processor, or executed by a combination of hardware and software modules in the decoding processor.
  • the software modules may be located in random access memory, flash memory, read-only memory, programmable read-only memory or electrically erasable programmable memory, registers and other storage media mature in the art.
  • the storage medium is located in the memory, and the processor reads the information in the memory, and completes the steps of the above method in combination with its hardware.
  • the iris recognition device can also perform the methods shown in FIGS. 1 and 5 and implement the methods performed by the manager node.
  • the embodiments of the present application further provide a computer-readable storage medium storing one or more programs which, when executed by an electronic device that includes multiple application programs, cause the electronic device to execute the iris recognition method provided by the embodiments corresponding to FIGS. 1 and 5.
  • the embodiments of the present application may be provided as a method, a system, or a computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, etc.) having computer-usable program code embodied therein.
  • These computer program instructions may also be stored in a computer-readable memory capable of directing a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means that implement the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
  • a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
  • Memory may include non-persistent memory, random-access memory (RAM), and/or non-volatile memory in computer-readable media, such as read-only memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
  • Computer-readable media include persistent and non-persistent, removable and non-removable media, and information storage may be implemented by any method or technology.
  • Information may be computer readable instructions, data structures, modules of programs, or other data.
  • Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random-access memory (SRAM), dynamic random-access memory (DRAM), other types of random-access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, compact disc read-only memory (CD-ROM), digital versatile disc (DVD) or other optical storage, magnetic cassettes, magnetic tape or disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device.
  • computer-readable media does not include transitory computer-readable media, such as modulated data signals and carrier waves.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Ophthalmology & Optometry (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)
  • Studio Devices (AREA)

Abstract

An iris recognition method, apparatus, and device. The method comprises: establishing in advance correspondences between focus distance and focal length, and between focus distance and displacement; after the focus distance of a target object is detected, first determining, according to the correspondences, the focal length corresponding to the focus distance and driving a zoom assembly to the position corresponding to that focal length; then determining, according to the correspondences, the displacement corresponding to the focus distance and driving a focusing assembly to move by that displacement. Periscope-style optical zoom is thereby completed by moving two or more lens groups inside the lens, enabling rapid iris imaging and a thin, light device without sacrificing the recognition range.

Description

An iris recognition method, apparatus, and electronic device
This application claims priority to the Chinese patent application No. 202011616344.7, entitled "An iris recognition method, apparatus, and device", filed with the China National Intellectual Property Administration on December 30, 2020, the entire contents of which are incorporated herein by reference.
Technical Field
This document relates to the field of computer technology, and in particular to an iris recognition method, apparatus, and electronic device.
Background
Iris recognition refers to identity recognition based on the iris of the eye, and is generally applied in security devices (such as access-control devices) and in places with high confidentiality requirements.
To achieve iris recognition over a long distance range, iris device vendors have introduced imaging solutions based on zoom optical systems. However, the zoom lenses they use are all industrial camera lenses with a large total track length (TTL), which makes iris recognition products excessively thick. Compared with face recognition products, which are typically only about 20 mm thick, iris recognition products are too bulky.
A solution is therefore needed that provides fast and reliable iris recognition over a long distance range while making the iris recognition device thinner and more compact.
Summary
Embodiments of the present application provide an iris recognition method, apparatus, and electronic device for making the device thin and light while preserving the recognition range. They are described below:
An embodiment of the present application provides an iris recognition method, applied to an iris recognition system, comprising:
obtaining a focus distance corresponding to a target object;
determining a first focal length corresponding to the focus distance under a first constraint condition, and controlling a first piezoelectric motor to drive a zoom assembly of an iris imaging module to move along the optical axis to a position corresponding to the first focal length, the first constraint condition being used to constrain the number of iris-diameter pixels obtained by recognizing the target object at the first focal length;
determining a displacement corresponding to the focus distance under a second constraint condition, and controlling a second piezoelectric motor to drive a focusing assembly of the iris imaging module to move by the displacement so as to perform iris recognition on the target object, the second constraint condition being used to constrain the focusing assembly after the movement to be at the best focus position.
An embodiment of the present application further provides an iris recognition apparatus, comprising a micro-control unit, a periscope optical zoom imaging module, and piezoelectric motors, the periscope optical zoom imaging module comprising a zoom assembly and a focusing assembly, wherein:
the micro-control unit is configured to obtain a focus distance corresponding to a target object; determine a first focal length corresponding to the focus distance under a first constraint condition, and send a first driving instruction to a first piezoelectric motor corresponding to the zoom assembly, instructing the first piezoelectric motor to drive the zoom assembly to move along the optical axis to a position corresponding to the first focal length, the first constraint condition being used to constrain the number of iris-diameter pixels obtained by recognizing the target object at the first focal length;
the micro-control unit is further configured to determine a displacement corresponding to the focus distance under a second constraint condition, and send a second driving instruction to a second piezoelectric motor corresponding to the focusing assembly, instructing the second piezoelectric motor to drive the focusing assembly to move by the displacement so as to perform iris recognition on the target object, the second constraint condition being used to constrain the focusing assembly after the movement to be at the best focus position.
An embodiment of the present application further provides an electronic device, comprising:
a processor; and
a memory arranged to store computer-executable instructions which, when executed, cause the processor to perform any iris recognition method in this application.
An embodiment of the present application further provides a computer-readable storage medium storing one or more programs which, when executed by an electronic device comprising multiple application programs, cause the electronic device to perform any iris recognition method in this application.
One embodiment of the present application establishes in advance the correspondences between focus distance and focal length and between focus distance and displacement; after the focus distance of the target object is detected, the focal length corresponding to the focus distance is first determined from the correspondences and the zoom assembly is driven to the position corresponding to that focal length, then the displacement corresponding to the focus distance is determined from the correspondences and the focusing assembly is driven by that displacement, enabling iris recognition within a certain range and completing the periscope optical zoom, thereby achieving rapid iris imaging and a thin, light device without sacrificing the recognition range.
Brief Description of the Drawings
To describe the technical solutions of the embodiments of the present application and of the prior art more clearly, the drawings needed for the embodiments and the prior art are briefly introduced below. Obviously, the drawings below show only some embodiments of the present application; those of ordinary skill in the art may obtain other drawings from them without creative effort.
FIG. 1 is a schematic flowchart of an iris recognition method according to an embodiment of the present application;
FIG. 2 is a schematic structural diagram of an iris recognition system according to an embodiment of the present application;
FIG. 3 is a schematic structural diagram of an iris imaging module according to an embodiment of the present application;
FIG. 4 is a schematic diagram of the optical path of the iris imaging module according to an embodiment of the present application;
FIG. 5 is a schematic flowchart of a method for adjusting the lens field of view of the iris imaging module according to an embodiment of the present application;
FIG. 6 is a schematic diagram of the distribution of the iris field of view and the face field of view according to an embodiment of the present application;
FIG. 7 is a schematic diagram of the front layout of the iris lens and the face lens according to an embodiment of the present application;
FIG. 8 is a schematic structural diagram of an iris imaging module according to another embodiment of the present application;
FIG. 9 is a schematic diagram of the rotation of the reflection assembly according to an embodiment of the present application;
FIG. 10 is a schematic top view of the illumination layout according to an embodiment of the present application;
FIG. 11 is a schematic structural diagram of an iris recognition apparatus according to an embodiment of the present application;
FIG. 12 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
To make the objectives, technical solutions, and advantages of the present application clearer, the present application is described in further detail below with reference to the drawings and embodiments. Obviously, the described embodiments are only some, not all, of the embodiments of the present application. All other embodiments obtained by those of ordinary skill in the art based on the embodiments of the present application without creative effort fall within the protection scope of the present application.
The technical solutions provided by the embodiments of the present application are described in detail below with reference to the drawings.
FIG. 1 is a schematic flowchart of an iris recognition method according to an embodiment of the present application, which may be performed by an iris recognition system. Referring to FIG. 1, the method may include the following steps:
Step 102: obtain the focus distance corresponding to the target object.
The focus distance may refer to the object-to-image distance, i.e., the sum of the distance from the lens to the object (the target object) and the distance from the lens to the image sensor. The target object is the object on which iris recognition is to be performed.
The implementation of step 102 is described in detail below with reference to the schematic structural diagram of the iris recognition system in FIG. 2:
A first implementation may be:
the focus distance is detected by a ranging module. Specifically:
when the target object is in the recognition area of the iris recognition system, the ranging module is triggered to detect the focus distance between the target object and the iris recognition system and report it to the processor.
A second implementation may be:
the focus distance is obtained by analyzing the interpupillary distance of the target object. Specifically:
when the target object is in the recognition area of the iris recognition system, a visible-light imaging module captures a face image of the target object and reports it to the processor; the processor performs face detection on the face image to obtain the IPD (interpupillary distance) of the target object's eyes, and determines the focus distance of the target object based on a pre-calibrated correspondence between IPD and focus distance.
The processor is preferably a micro-control unit (MCU), to further miniaturize the product.
Accordingly, in one example of this embodiment the focus distance of the target object can be detected precisely by the ranging module. In another example, the focus distance of the target object can be detected through an existing visible-light imaging module, avoiding additional hardware and effectively reducing system cost and structural complexity. This embodiment thus shows specific implementations of step 102; it should be understood, of course, that step 102 may also be implemented in other ways, which this embodiment does not limit.
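As a rough illustration of the second implementation above, the pre-calibrated correspondence between pixel IPD and focus distance can be realized as an interpolation table. This is only a sketch: the function name `focus_distance_from_ipd` and all calibration values are illustrative assumptions, not values from this application.

```python
# Sketch of IPD-based focus-distance estimation. The calibration pairs
# below are illustrative assumptions; a real device would measure them.
from bisect import bisect_left

# Pre-calibrated (IPD in pixels, focus distance in cm) pairs: a closer
# face produces a larger pixel IPD, so distance decreases as IPD grows.
CALIBRATION = [(200, 30), (150, 55), (110, 80), (80, 105), (60, 130)]

def focus_distance_from_ipd(ipd_px: float) -> float:
    """Linearly interpolate the focus distance for a measured pixel IPD."""
    pts = sorted(CALIBRATION)          # ascending pixel IPD
    ipds = [p[0] for p in pts]
    if ipd_px <= ipds[0]:
        return pts[0][1]               # farther than the calibrated range
    if ipd_px >= ipds[-1]:
        return pts[-1][1]              # closer than the calibrated range
    i = bisect_left(ipds, ipd_px)
    (x0, y0), (x1, y1) = pts[i - 1], pts[i]
    return y0 + (y1 - y0) * (ipd_px - x0) / (x1 - x0)
```

With the assumed table, an IPD of 175 px interpolates halfway between the 150 px and 200 px calibration points.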
Step 104: determine the first focal length corresponding to the focus distance under the first constraint condition, and control the first piezoelectric motor to drive the zoom assembly of the iris imaging module to move along the optical axis to the position corresponding to the first focal length, the first constraint condition being used to constrain the number of iris-diameter pixels obtained by recognizing the target object at the first focal length.
The first piezoelectric motor may be a motor built into the device dedicated to driving the zoom assembly. A piezoelectric motor is a device that converts electrical energy into mechanical energy via the inverse piezoelectric effect; it shields electromagnetic interference, produces little noise, and is well suited to miniaturized devices.
The structure of the iris imaging module is first illustrated with reference to FIG. 3:
The iris imaging module includes a periscope zoom lens and piezoelectric motors. The periscope zoom lens includes a zoom assembly (Zoom), a focusing assembly (Focus), a reflection assembly, and an image sensor (Sensor), wherein:
the optical path formed by the zoom assembly, the focusing assembly, the reflection assembly (the reflective prism in FIG. 4), and the image sensor is shown in FIG. 4; the reflection assembly may be a reflective prism.
For ease of description, on the one hand, this embodiment takes a 5 MP sensor (pixel size 2 μm) with a 3x optical zoom system as an example; it is readily understood that this embodiment is not limited to these system parameters, and optimal system parameters can be obtained for different application scenarios. On the other hand, considering that an adult iris is about 11 mm in diameter and iris recognition algorithms impose explicit requirements on the number of iris-diameter pixels, this embodiment takes 150-220 pixels as the baseline and, through 3x optical zoom, varies the focal length from 12 mm to 36 mm to achieve iris recognition within the 30 cm-130 cm range.
Based on this, one implementation of step 104 may be:
query a first lookup table (Table 1 below) to obtain the first focal length corresponding to the focus distance; convert the first focal length into a voltage to drive the piezoelectric motor, and control the Zoom assembly through the piezoelectric motor to move back and forth along the optical axis to achieve optical zoom. The first lookup table stores the focal lengths corresponding to different focus distances under the first constraint condition.
Table 1
Figure PCTCN2021142690-appb-000001
As shown in the table above, when the focus distance is in the range [30, 40) cm a 12 mm focal length may be selected; when it is in the range [40, 55) cm a 16 mm focal length may be selected; the other cases follow by analogy and are not repeated here.
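The first lookup table described above can be sketched as an interval-to-focal-length map. Only the [30, 40) cm → 12 mm and [40, 55) cm → 16 mm rows appear in the text; the remaining rows and the function name `focal_length_for` are illustrative assumptions.

```python
# Zoom lookup sketch: focus distance (cm) -> focal length (mm), per the
# first lookup table. Only the first two rows come from the text; the
# rest are assumed for illustration, spanning the 30-130 cm range.
ZOOM_TABLE = [
    (30, 40, 12),    # [30, 40) cm -> 12 mm (from the text)
    (40, 55, 16),    # [40, 55) cm -> 16 mm (from the text)
    (55, 75, 20),    # assumed
    (75, 100, 28),   # assumed
    (100, 131, 36),  # assumed; 36 mm is the stated maximum focal length
]

def focal_length_for(distance_cm: float) -> int:
    """Return the focal length whose distance interval contains distance_cm."""
    for lo, hi, focal_mm in ZOOM_TABLE:
        if lo <= distance_cm < hi:
            return focal_mm
    raise ValueError("focus distance outside the 30-130 cm recognition range")
```

The MCU would then convert the returned focal length into a drive voltage for the first piezoelectric motor.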
Further, the position corresponding to each focal length may likewise be marked by a pre-built lookup table. Specifically:
query a second lookup table to obtain the first position corresponding to the first focal length; the second lookup table stores the positions of the zoom assembly at different focal lengths.
Thus, after the first focal length is determined, the first position can be obtained by table lookup, and the Zoom assembly can then be moved along the optical axis to that first position by the piezoelectric motor.
This embodiment therefore pre-builds lookup tables to establish the correspondences among focus distance, focal length, and zoom-assembly position, so that after the focus distance is obtained, the corresponding focal length and its position can be determined quickly, effectively improving the efficiency of optical zooming. This embodiment thus shows a specific implementation of step 104; it should be understood, of course, that step 104 may also be implemented in other ways, which this embodiment does not limit.
Step 106: determine the displacement corresponding to the focus distance under the second constraint condition, and control the second piezoelectric motor to drive the focusing assembly of the iris imaging module to move by the displacement so as to perform iris recognition on the target object, the second constraint condition being used to constrain the focusing assembly after the movement to be at the best focus position.
The best focus position may be the position at which the captured image quality is highest. It will be appreciated that the piezoelectric motor adjusts displacement with a certain precision, so a certain error is allowed in the position of highest image quality; for example, if the computed position of highest image quality, i.e., the best focus position, is 12.22, and the precision of the displacement with which the second piezoelectric motor drives the focusing assembly is 0.1, the best focus position may be taken as 12.2. The displacement may be the displacement required to move the focusing assembly from a reference position to the best focus position, the reference position being a pre-marked position serving as a positional datum. In some scenarios the iris imaging module may perform iris recognition continuously, i.e., the focusing assembly does not start from the reference position every time; in one example, the displacement may instead be the displacement required to move the focusing assembly from its current position to the best focus position. The second constraint condition means that after the focusing assembly of the iris imaging module has moved by the displacement, the focusing assembly is at the best focus position. The iris is the flat, ring-shaped membrane in the middle layer of the eyeball wall, located between the cornea and the lens; viewed from the front it is the specially textured ring between the pupil and the white of the eye, and the iris texture can be used for identity verification.
One implementation of step 106 may be:
query a third lookup table to obtain the front depth of field and rear depth of field corresponding to the focus distance; based on the front and rear depths of field, determine the displacement required to move the focusing assembly to the best focus position at that focus distance. The third lookup table stores the front and rear depths of field corresponding to different focus distances under the second constraint condition. The front depth of field is the range of sharp points from the focus point toward the camera; the rear depth of field is the range from the focus point to the farthest sharp point.
It will be appreciated that the best focus position of the focusing assembly at different focus distances can be computed in advance, and from it the displacement required to move the focusing assembly to the best focus position. Ways of computing this displacement may be exemplified as follows:
Example 1, based on an ordinary image sensor plus a ranging module:
after the ranging module detects the focus distance, images before and after that focus distance are captured and their image contrast is computed to determine the best focus position, from which the displacement of the focusing assembly from the reference position to the best focus position is computed; by analogy, the displacement from the reference position to the best focus position can be computed for different focus distances.
Example 2, based on a phase-detection autofocus sensor:
a phase-detection autofocus (PDAF) sensor senses the phase change at the focus distance, and the displacement corresponding to that phase change is determined.
The principle may be: after the focal length is confirmed, the sensor automatically computes the phase change and its corresponding displacement and reports them to the processor, which converts them into a voltage and drives the motor to complete focusing.
Taking a 12 mm focal length as an example, the third lookup table may be Table 2 below:
Table 2
Figure PCTCN2021142690-appb-000002
Figure PCTCN2021142690-appb-000003
Accordingly, in one example of this embodiment the best focus position at different focus distances can be computed using an existing ranging module plus an ordinary image sensor, effectively reducing device cost; in another example, a phase-detection autofocus sensor can sense the displacement at different focus distances, effectively improving the efficiency and accuracy of the displacement computation. This embodiment thus shows specific implementations of step 106; it should be understood, of course, that step 106 may also be implemented in other ways, which this embodiment does not limit.
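The quantization of the best focus position to the motor precision, described in step 106, can be sketched as follows. The 0.1 motor step and the 12.22 → 12.2 example come from the text; the function names and the idea of subtracting the current position are illustrative assumptions.

```python
# Sketch of step 106's displacement computation. Positions are in the
# same arbitrary units as the 12.22 -> 12.2 example in the text; the
# piezoelectric motor moves in steps of 0.1, so positions are rounded
# to one decimal place.
MOTOR_STEP_DECIMALS = 1  # 0.1 displacement precision, per the example

def best_focus_position(computed_pos: float) -> float:
    """Quantize the computed sharpest position to the motor precision."""
    return round(computed_pos, MOTOR_STEP_DECIMALS)

def required_displacement(best_pos: float, current_pos: float) -> float:
    """Displacement the focus group must travel from its current position
    (or the reference position) to reach the quantized best position."""
    return round(best_pos - current_pos, MOTOR_STEP_DECIMALS)
```

With these definitions, a computed optimum of 12.22 is driven to as 12.2, matching the precision example in the text.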
In summary, this embodiment establishes in advance the correspondences between focus distance and focal length and between focus distance and displacement; after the focus distance of the target object is detected, the focal length corresponding to the focus distance is first determined from the correspondences and the zoom assembly is driven to the position corresponding to that focal length, then the displacement corresponding to the focus distance is determined from the correspondences and the focusing assembly is driven by that displacement, completing the periscope optical zoom, achieving rapid iris imaging, and making the device thin and light without sacrificing the recognition range; the product thickness can be controlled to within 30 mm.
Further, the present application takes into account that the vertical field-of-view angle of iris recognition products is too small, usually within 30°. Compared with the 50° field of view of typical face recognition devices, a 30° vertical field of view severely degrades the experience of users of certain heights. As shown in FIG. 6, faces A, B, and C are all within the field of view of the face lens, but only face A is within the field of view of the iris lens; a user whose height places them in the face-B region must stoop, while one in the face-C region must stand on tiptoe. The face lens may refer to the visible-light imaging module in FIG. 2 and the face detection field of view may be that module's field of view; the iris lens may refer to the periscope zoom lens described above and the iris detection field of view may be the field of view of the periscope zoom lens.
On this basis, another embodiment of the present application provides a method for adjusting the lens field of view of the iris imaging module, which may be performed by the iris recognition system corresponding to FIG. 2. Referring to FIG. 5, the method may include the following steps:
Step 502: when it is detected that the face of the target object is within the face detection field of view but not within the iris detection field of view of the iris imaging module, determine the relative position between the face of the target object and the iris detection field of view.
Step 504: based on the relative position, control a third piezoelectric motor to drive the reflection assembly of the iris imaging module to rotate, so that the face of the target object lies within the iris detection field of view.
Specifically: first, a face imaging module is arranged near the iris imaging module (above, below, left, or right, laid out according to the product's appearance requirements). Taking FIG. 7 as an example, viewed from the front the face lens is directly below the iris lens. The face field of view is larger than the iris field of view, and the face position and the angle of the iris reflection mechanism (equivalent to the reflection assembly) are calibrated. In a possible implementation, controlling the third piezoelectric motor based on the relative position to drive the reflection assembly of the iris imaging module to rotate so that the face of the target object lies within the iris detection field of view includes: when the relative position indicates that the face is within the face detection field of view but not within the pitch angle of the iris detection field of view of the iris imaging module, controlling, based on the relative position, the third piezoelectric motor to drive the reflection assembly of the iris imaging module to adjust its pitch angle, so that the face of the target object lies within the iris detection field of view. If the situations of the face-B and face-C regions in FIG. 6 arise, the relative position indicates that the reflection assembly needs to be adjusted in the pitch direction,
and the iris can be tracked by controlling the up-down pitch angle of the reflection mechanism.
As for the pitch mechanism, existing solutions generally rotate the entire iris imaging module directly as the pitch mechanism, making the mechanism bulky. This embodiment does not need to rotate the entire iris imaging module; rotating only the reflection assembly suffices. The reflection assembly may be the reflective prism or mirror in FIG. 4, or the free-form mirror shown in FIG. 8. The processor can control the reflection assembly through the piezoelectric motor to rotate in the plane formed by the X and Y axes, as shown in FIG. 9, where the rotation of the reflection assembly corresponds to the relative position.
This embodiment thus controls the rotation of the reflection assembly by a piezoelectric motor to extend the vertical field of view of the lens, allowing the product to accommodate users of different heights; moreover, the piezoelectric motor that drives the reflection assembly is compact, so the iris recognition product can be miniaturized. On the basis of the embodiment corresponding to FIG. 1, the product can thus be made both compact and thin.
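The pitch-tracking decision of steps 502-504 can be sketched as a small control rule: rotate the mirror only when the detected face center falls outside the iris field of view. The 30° iris field of view matches the figure discussed above, but the proportional mapping, the function name `pitch_adjustment`, and the halving rule (a mirror tilt of t deflects the reflected optical axis by 2t) are illustrative assumptions about one plausible geometry, not the calibrated mapping of this application.

```python
# Sketch of the pitch-tracking rule. face_y_deg is the vertical angle of
# the detected face center relative to the iris optical axis (positive
# upward), as measured in the wider face-detection view.
def pitch_adjustment(face_y_deg: float, iris_fov_deg: float = 30.0) -> float:
    """Mirror rotation (degrees) that brings the face into the iris FOV.

    Assumes a flat mirror: tilting it by t deflects the reflected axis
    by 2*t, so half the angular offset is applied. Returns 0 when the
    face is already inside the iris detection field of view.
    """
    half_fov = iris_fov_deg / 2.0
    if -half_fov <= face_y_deg <= half_fov:
        return 0.0  # face A case: no rotation needed
    return face_y_deg / 2.0  # face B/C case: recenter via mirror tilt
```

In the FIG. 6 scenario, the face-A case returns no rotation, while the face-B and face-C cases return opposite-signed tilts.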
Still further, on the basis of the above two embodiments, the present application proposes yet another embodiment that provides an illumination module. Referring to FIG. 10, the illumination module of the iris recognition system includes multiple columns of lighting lamps distributed around the iris imaging module, with at least two columns having different beam orientations; the method then further includes:
determining the lighting-lamp column whose beam orientation matches the focus distance as the target column;
controlling the lamps of the target column to enter the powered-on state while keeping the lamps of the other columns powered off.
Specifically: the lamp board is set at an angle to the horizontal. Iris recognition uses NIR-band illumination, at either 810 nm or 850 nm. LED or VCSEL devices are used, with a required beam angle of less than 30°. Multiple columns of LEDs can be arranged on the lamp board, with the number of columns set according to the recognition distance. The beams of different LED columns have different orientations, and the columns can be controlled independently; once the distance is determined, the corresponding column is lit, reducing power consumption.
By configuring the layout of the illumination system sensibly, this embodiment thus effectively reduces the resources required for illumination while still meeting the illumination conditions.
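The per-column illumination control above can be sketched as a distance-to-column selector: only the column whose beam orientation matches the measured distance is powered. The column boundaries and function names below are illustrative assumptions; a real layout would derive them from the lamp-board angle and beam orientations.

```python
# Sketch of distance-matched LED column selection. One (lo, hi) range in
# cm per LED column; the boundaries are illustrative assumptions spanning
# the 30-130 cm recognition range.
COLUMN_RANGES = [(30, 70), (70, 100), (100, 131)]

def active_column(distance_cm: float) -> int:
    """Index of the LED column whose beam covers this focus distance."""
    for idx, (lo, hi) in enumerate(COLUMN_RANGES):
        if lo <= distance_cm < hi:
            return idx
    raise ValueError("distance outside the illuminated range")

def power_states(distance_cm: float) -> list:
    """Per-column power flags: only the matching column is switched on."""
    on = active_column(distance_cm)
    return [i == on for i in range(len(COLUMN_RANGES))]
```

Keeping the non-matching columns off is what yields the power saving described in the text.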
FIG. 11 is a schematic structural diagram of an iris recognition apparatus according to an embodiment of the present application. Referring to FIG. 11, the apparatus may include a micro-control unit (MCU), a periscope optical zoom imaging module, and piezoelectric motors, the periscope optical zoom imaging module including a zoom assembly and a focusing assembly, wherein:
the micro-control unit is configured to obtain the focus distance corresponding to a target object; determine the first focal length corresponding to the focus distance under the first constraint condition, and send a first driving instruction to the first piezoelectric motor corresponding to the zoom assembly, instructing the first piezoelectric motor to drive the zoom assembly to move along the optical axis to the position corresponding to the first focal length, the first constraint condition being used to constrain the number of iris-diameter pixels obtained by recognizing the target object at the first focal length;
the micro-control unit is further configured to determine the displacement corresponding to the focus distance under the second constraint condition, and send a second driving instruction to the second piezoelectric motor corresponding to the focusing assembly, instructing the second piezoelectric motor to drive the focusing assembly to move by the displacement so as to perform iris recognition on the target object, the second constraint condition being used to constrain the focusing assembly after the movement to be at the best focus position.
The MCU corresponds to the processor in FIG. 2, and the periscope optical zoom imaging module corresponds to the iris imaging module in FIG. 2; their specific working principles correspond in the same detail. The MCU and the periscope optical zoom imaging module are therefore not elaborated here, and only their operating principle is briefly stated: during zooming, after the MCU obtains the distance information it converts it into a voltage to drive the piezoelectric motor, which moves the Zoom assembly back and forth along the optical axis to achieve optical zoom; during focusing, after the MCU obtains the distance information it looks up the displacement required for the best focus position, converts it into a voltage to drive the piezoelectric motor, and moves the Focus assembly back and forth along the optical axis to complete focusing.
With reference to FIG. 2, the apparatus may further include:
a ranging module configured to detect the focus distance corresponding to the target object and provide it to the micro-control unit; or
a visible-light imaging module configured to capture a face image of the target object and provide it to the micro-control unit, the micro-control unit recognizing the face image to obtain the interpupillary distance of the target object and determining the focus distance corresponding to that interpupillary distance.
The ranging module and the visible-light module may both be integrated in the device, or only either one of them.
With reference to FIG. 6, the field of view of the iris lens (the lens corresponding to the periscope optical zoom imaging module) is small and, compared with face recognition, extremely unfriendly to users of certain heights: a user whose height places them in the face-B region must stoop, while one in the face-C region must stand on tiptoe. A pitch mechanism is therefore needed to extend the vertical field of view.
For the pitch mechanism, existing solutions generally rotate the entire imaging module directly, making the mechanism bulky and hindering miniaturization of the device. This embodiment therefore provides a new pitch mechanism; with reference to FIG. 3, the pitch mechanism includes the reflection assembly and its corresponding piezoelectric motor, the piezoelectric motor driving the up-down pitch angle of the reflection assembly so as to track the iris. Moreover, this pitch structure does not need to rotate the entire optical path; rotating only the reflection assembly suffices, which effectively reduces the area occupied by the mechanism and further miniaturizes the device.
For the specific structure of the reflection assembly of the periscope optical zoom imaging module in this embodiment, see the reflective prism or mirror shown in FIG. 4 and the free-form mirror shown in FIG. 8. The iris lens may be placed near the face imaging module (above, below, left, or right, laid out according to the product's appearance requirements). Taking FIG. 6 as an example, viewed from the front the face lens is directly below the iris lens.
Correspondingly, when the micro-control unit detects that the face of the target object is within the face detection field of view but not within the iris detection field of view, it determines the relative position between the face of the target object and the iris detection field of view; based on the relative position, it sends a third driving instruction to the third piezoelectric motor corresponding to the reflection assembly, instructing the third piezoelectric motor to drive the reflection assembly to rotate (see FIG. 9), so that the face of the target object lies within the iris detection field of view. The principle of iris tracking has been described in detail in the embodiment corresponding to FIG. 1 and is not repeated here.
In addition, with reference to FIG. 2, the apparatus may further include an illumination module;
the illumination module includes multiple columns of lighting lamps distributed around the iris imaging module, with at least two columns having different beam orientations;
the micro-control unit is further configured to determine the lighting-lamp column whose beam orientation matches the focus distance as the target column, control the lamps of the target column to enter the powered-on state, and keep the lamps of the other columns powered off.
The lamp board is set at an angle to the horizontal. Iris recognition may use NIR-band illumination, at either 810 nm or 850 nm; preferably, LED or VCSEL devices are used, with a required beam angle of less than 30°.
Thus, by configuring the layout of the illumination system sensibly, the resources required for illumination can be effectively reduced while still meeting the illumination conditions.
On this basis, this embodiment establishes in advance the correspondences between focus distance and focal length and between focus distance and displacement; after the focus distance of the target object is detected, the focal length corresponding to the focus distance is first determined from the correspondences and the zoom assembly is driven to the position corresponding to that focal length, then the displacement corresponding to the focus distance is determined from the correspondences and the focusing assembly is driven by that displacement, completing the periscope optical zoom, achieving rapid iris imaging, and making the device thin and light without sacrificing the recognition range; the product thickness can be controlled to within 30 mm.
As the above apparatus embodiment is basically similar to the method embodiments, it is described relatively simply; for relevant points, refer to the description of the method embodiments. It should also be noted that the components of the apparatus of the present application are divided logically according to the functions they implement, but the present application is not limited thereto, and the components may be re-divided or combined as needed.
FIG. 12 is a schematic structural diagram of an electronic device according to an embodiment of the present application. Referring to FIG. 12, the electronic device includes a processor, an internal bus, a network interface, memory, and non-volatile storage, and may of course also include hardware required by other services. The processor reads the corresponding computer program from the non-volatile storage into memory and runs it, forming the iris recognition apparatus at the logical level. Of course, besides a software implementation, the present application does not exclude other implementations, such as logic devices or a combination of software and hardware; that is, the execution subject of the following processing flow is not limited to logical units and may also be hardware or logic devices.
The network interface, processor, and memory may be interconnected by a bus system. The bus may be an ISA (Industry Standard Architecture) bus, a PCI (Peripheral Component Interconnect) bus, an EISA (Extended Industry Standard Architecture) bus, or the like. The bus may be divided into an address bus, a data bus, a control bus, and so on. For ease of representation, only one double-headed arrow is shown in FIG. 12, but this does not mean that there is only one bus or one type of bus.
The memory is used to store a program. Specifically, the program may include program code including computer operation instructions. The memory, which may include read-only memory and random-access memory, provides instructions and data to the processor. The memory may include high-speed random-access memory (RAM) and may also include non-volatile memory, such as at least one disk storage.
The processor is configured to execute the program stored in the memory, and specifically to:
obtain the focus distance corresponding to the target object;
determine the first focal length corresponding to the focus distance under the first constraint condition, and control the first piezoelectric motor to drive the zoom assembly of the iris imaging module to move along the optical axis to the position corresponding to the first focal length, the first constraint condition being used to constrain the number of iris-diameter pixels obtained by recognizing the target object at the first focal length;
determine the displacement corresponding to the focus distance under the second constraint condition, and control the second piezoelectric motor to drive the focusing assembly of the iris imaging module to move by the displacement so as to perform iris recognition on the target object, the second constraint condition being used to constrain the focusing assembly after the movement to be at the best focus position.
The method performed by the iris recognition apparatus or the manager (Master) node disclosed in the embodiment shown in FIG. 11 of the present application may be applied to, or implemented by, a processor. The processor may be an integrated circuit chip with signal processing capability. During implementation, each step of the above method may be completed by an integrated logic circuit of hardware in the processor or by instructions in the form of software. The processor may be a general-purpose processor, including a central processing unit (CPU), a network processor (NP), and the like; it may also be a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component. The methods, steps, and logical block diagrams disclosed in the embodiments of the present application can thereby be implemented or executed. The general-purpose processor may be a microprocessor, or any conventional processor, and so on. The steps of the methods disclosed in the embodiments of the present application may be directly embodied as being executed by a hardware decoding processor, or by a combination of hardware and software modules in a decoding processor. The software module may reside in storage media mature in the art, such as random-access memory, flash memory, read-only memory, programmable read-only memory, electrically erasable programmable memory, or registers. The storage medium is located in the memory; the processor reads the information in the memory and completes the steps of the above method in combination with its hardware.
The iris recognition apparatus can also perform the methods shown in FIGS. 1 and 5 and implement the method performed by the manager node.
Based on the same inventive concept, an embodiment of the present application further provides a computer-readable storage medium storing one or more programs which, when executed by an electronic device comprising multiple application programs, cause the electronic device to perform the iris recognition method provided by the embodiments corresponding to FIGS. 1 and 5.
The embodiments of this application are described progressively; for identical or similar parts between embodiments, reference may be made to one another, and each embodiment focuses on its differences from the others. In particular, the system embodiment is basically similar to the method embodiment and is therefore described relatively simply; for relevant points, refer to the description of the method embodiment.
Specific embodiments of the present application have been described above. Other embodiments fall within the scope of the appended claims. In some cases, the actions or steps recited in the claims may be performed in an order different from that in the embodiments and still achieve the desired results. In addition, the processes depicted in the drawings do not necessarily require the particular or sequential order shown to achieve the desired results. In certain implementations, multitasking and parallel processing are also possible or may be advantageous.
Those skilled in the art should understand that the embodiments of the present application may be provided as a method, a system, or a computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product implemented on one or more computer-usable storage media (including but not limited to disk storage, CD-ROM, optical storage, etc.) containing computer-usable program code.
The present application is described with reference to flowcharts and/or block diagrams of the method, device (system), and computer program product according to the embodiments of the present application. It should be understood that each flow and/or block in the flowcharts and/or block diagrams, and combinations of flows and/or blocks therein, can be implemented by computer program instructions. These computer program instructions may be provided to the processor of a general-purpose computer, special-purpose computer, embedded processor, or other programmable data processing device to produce a machine, such that the instructions executed by the processor of the computer or other programmable data processing device produce an apparatus for implementing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
These computer program instructions may also be stored in a computer-readable memory capable of directing a computer or other programmable data processing device to work in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means that implement the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
These computer program instructions may also be loaded onto a computer or other programmable data processing device, such that a series of operational steps are performed on the computer or other programmable device to produce computer-implemented processing, so that the instructions executed on the computer or other programmable device provide steps for implementing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
In a typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
Memory may include non-persistent memory, random-access memory (RAM), and/or non-volatile memory in computer-readable media, such as read-only memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
Computer-readable media include persistent and non-persistent, removable and non-removable media, and information storage may be implemented by any method or technology. The information may be computer-readable instructions, data structures, program modules, or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random-access memory (SRAM), dynamic random-access memory (DRAM), other types of random-access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, compact disc read-only memory (CD-ROM), digital versatile disc (DVD) or other optical storage, magnetic cassettes, magnetic tape or disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device. As defined herein, computer-readable media do not include transitory computer-readable media (transitory media), such as modulated data signals and carrier waves.
It should also be noted that the terms "comprise", "include", or any other variants thereof are intended to cover non-exclusive inclusion, so that a process, method, article, or device including a series of elements includes not only those elements but also other elements not expressly listed, or elements inherent to such a process, method, article, or device. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of additional identical elements in the process, method, article, or device that includes the element.
Those skilled in the art should understand that the embodiments of the present application may be provided as a method, a system, or a computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product implemented on one or more computer-usable storage media (including but not limited to disk storage, CD-ROM, optical storage, etc.) containing computer-usable program code.
The above are only embodiments of the present application and are not intended to limit the present application. Various modifications and variations of the present application are possible for those skilled in the art. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present application shall fall within the scope of the claims of the present application.

Claims (15)

  1. An iris recognition method, applied to an iris recognition system, comprising:
    obtaining a focus distance corresponding to a target object;
    determining a first focal length corresponding to the focus distance under a first constraint condition, and controlling a first piezoelectric motor to drive a zoom assembly of an iris imaging module to move along the optical axis to a position corresponding to the first focal length, the first constraint condition being used to constrain the number of iris-diameter pixels obtained by recognizing the target object at the first focal length;
    determining a displacement corresponding to the focus distance under a second constraint condition, and controlling a second piezoelectric motor to drive a focusing assembly of the iris imaging module to move by the displacement so as to perform iris recognition on the target object, the second constraint condition being used to constrain the focusing assembly after the movement to be at the best focus position.
  2. The method according to claim 1, wherein the focus distance is detected by a ranging module; or,
    the focus distance is obtained by analyzing the interpupillary distance of the target object.
  3. The method according to claim 1, wherein determining the first focal length corresponding to the focus distance under the first constraint condition comprises:
    querying a first lookup table to obtain the first focal length corresponding to the focus distance;
    wherein the first lookup table stores the focal lengths corresponding to different focus distances under the first constraint condition.
  4. The method according to claim 1, wherein before controlling the first piezoelectric motor to drive the zoom assembly of the iris imaging module to move along the optical axis to the position corresponding to the first focal length, the method further comprises:
    querying a second lookup table to obtain a first position corresponding to the first focal length;
    wherein the second lookup table stores the positions of the zoom assembly at different focal lengths.
  5. The method according to claim 1, wherein determining the displacement corresponding to the focus distance under the second constraint condition comprises:
    querying a third lookup table to obtain a front depth of field and a rear depth of field corresponding to the focus distance;
    determining, based on the front depth of field and the rear depth of field, the displacement required to move the focusing assembly to the best focus position at the focus distance;
    wherein the third lookup table stores the front and rear depths of field corresponding to different focus distances under the second constraint condition.
  6. The method according to claim 1, wherein determining the displacement corresponding to the focus distance under the second constraint condition comprises:
    sensing, by a phase-detection autofocus sensor, a phase change at the focus distance, and determining the displacement corresponding to the phase change.
  7. The method according to claim 1, further comprising:
    when it is detected that the face of the target object is within a face detection field of view but not within an iris detection field of view of the iris imaging module, determining the relative position between the face of the target object and the iris detection field of view;
    based on the relative position, controlling a third piezoelectric motor to drive a reflection assembly of the iris imaging module to rotate, so that the face of the target object lies within the iris detection field of view.
  8. The method according to claim 7, wherein controlling, based on the relative position, the third piezoelectric motor to drive the reflection assembly of the iris imaging module to rotate so that the face of the target object lies within the iris detection field of view comprises:
    when the relative position indicates that the face is within the face detection field of view but not within the pitch angle of the iris detection field of view of the iris imaging module, controlling, based on the relative position, the third piezoelectric motor to drive the reflection assembly of the iris imaging module to adjust its pitch angle, so that the face of the target object lies within the iris detection field of view.
  9. The method according to claim 1, wherein an illumination module of the iris recognition system comprises multiple columns of lighting lamps distributed around the iris imaging module, at least two columns of lighting lamps having different beam orientations;
    the method then further comprises:
    determining the lighting-lamp column whose beam orientation matches the focus distance as a target column;
    controlling the lamps of the target column to enter a powered-on state, and keeping the lamps of the other columns in a non-powered state.
  10. The method according to claim 1, wherein the iris imaging module comprises a periscope zoom lens, the periscope zoom lens comprising the zoom assembly and the focusing assembly.
  11. An iris recognition apparatus, comprising a micro-control unit, a periscope optical zoom imaging module, and piezoelectric motors, the periscope optical zoom imaging module comprising a zoom assembly and a focusing assembly, wherein:
    the micro-control unit is configured to obtain a focus distance corresponding to a target object; determine a first focal length corresponding to the focus distance under a first constraint condition, and send a first driving instruction to a first piezoelectric motor corresponding to the zoom assembly, instructing the first piezoelectric motor to drive the zoom assembly to move along the optical axis to a position corresponding to the first focal length, the first constraint condition being used to constrain the number of iris-diameter pixels obtained by recognizing the target object at the first focal length;
    the micro-control unit is further configured to determine a displacement corresponding to the focus distance under a second constraint condition, and send a second driving instruction to a second piezoelectric motor corresponding to the focusing assembly, instructing the second piezoelectric motor to drive the focusing assembly to move by the displacement so as to perform iris recognition on the target object, the second constraint condition being used to constrain the focusing assembly after the movement to be at the best focus position.
  12. The apparatus according to claim 11, further comprising:
    a ranging module configured to detect the focus distance corresponding to the target object and provide it to the micro-control unit; or,
    a visible-light imaging module configured to capture a face image of the target object and provide it to the micro-control unit, the micro-control unit recognizing the face image to obtain the interpupillary distance of the target object and determining the focus distance corresponding to the interpupillary distance.
  13. The apparatus according to claim 11, wherein the periscope optical zoom imaging module further comprises a reflection assembly;
    the micro-control unit is further configured to, when detecting that the face of the target object is within a face detection field of view but not within an iris detection field of view, determine the relative position between the face of the target object and the iris detection field of view; and, based on the relative position, send a third driving instruction to a third piezoelectric motor corresponding to the reflection assembly, instructing the third piezoelectric motor to drive the reflection assembly to rotate, so that the face of the target object lies within the iris detection field of view.
  14. The apparatus according to claim 11, further comprising an illumination module;
    the illumination module comprising multiple columns of lighting lamps distributed around the iris imaging module, at least two columns of lighting lamps having different beam orientations;
    the micro-control unit being further configured to determine the lighting-lamp column whose beam orientation matches the focus distance as a target column, control the lamps of the target column to enter a powered-on state, and keep the lamps of the other columns in a non-powered state.
  15. An electronic device, comprising:
    a processor; and
    a memory arranged to store computer-executable instructions which, when executed, cause the processor to perform the steps of the method according to any one of claims 1 to 10.
PCT/CN2021/142690 2020-12-30 2021-12-29 一种虹膜识别方法、装置及电子设备 WO2022143813A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202011616344.7A CN114765661B (zh) 2020-12-30 2020-12-30 一种虹膜识别方法、装置及设备
CN202011616344.7 2020-12-30

Publications (1)

Publication Number Publication Date
WO2022143813A1 true WO2022143813A1 (zh) 2022-07-07

Family

ID=82260265

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/142690 WO2022143813A1 (zh) 2020-12-30 2021-12-29 一种虹膜识别方法、装置及电子设备

Country Status (2)

Country Link
CN (1) CN114765661B (zh)
WO (1) WO2022143813A1 (zh)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104268517A (zh) * 2014-09-19 2015-01-07 武汉虹识技术有限公司 一种应用于虹膜识别系统的自动对焦方法及系统
CN107341467A (zh) * 2017-06-30 2017-11-10 广东欧珀移动通信有限公司 虹膜采集方法及设备、电子装置和计算机可读存储介质
US20180053052A1 (en) * 2005-11-11 2018-02-22 Eyelock Llc Methods for performing biometric recognition of a human eye and corroboration of same
CN108446648A (zh) * 2018-03-26 2018-08-24 北京上古视觉科技有限公司 一种虹膜采集系统及虹膜识别系统

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7912252B2 (en) * 2009-02-06 2011-03-22 Robert Bosch Gmbh Time-of-flight sensor-assisted iris capture system and method
CN101814129B (zh) * 2009-02-19 2013-05-08 中国科学院自动化研究所 自动对焦的远距离虹膜图像获取装置、方法和识别系统
KR101569268B1 (ko) * 2014-01-02 2015-11-13 아이리텍 잉크 얼굴 구성요소 거리를 이용한 홍채인식용 이미지 획득 장치 및 방법
CN108369338B (zh) * 2015-12-09 2021-01-12 快图有限公司 图像采集系统

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180053052A1 (en) * 2005-11-11 2018-02-22 Eyelock Llc Methods for performing biometric recognition of a human eye and corroboration of same
CN104268517A (zh) * 2014-09-19 2015-01-07 武汉虹识技术有限公司 一种应用于虹膜识别系统的自动对焦方法及系统
CN107341467A (zh) * 2017-06-30 2017-11-10 广东欧珀移动通信有限公司 虹膜采集方法及设备、电子装置和计算机可读存储介质
CN108446648A (zh) * 2018-03-26 2018-08-24 北京上古视觉科技有限公司 一种虹膜采集系统及虹膜识别系统

Also Published As

Publication number Publication date
CN114765661A (zh) 2022-07-19
CN114765661B (zh) 2022-12-27

Similar Documents

Publication Publication Date Title
US10802283B2 (en) Wearable device and method for outputting virtual image
KR102024954B1 (ko) 아티팩트 없는 이미지들을 캡처하기 위한 시스템들 및 방법들
US9354046B2 (en) Three dimensional shape measurement apparatus, control method therefor, and storage medium
WO2019105214A1 (zh) 图像虚化方法、装置、移动终端和存储介质
KR102138845B1 (ko) 오브젝트 거리 정보에 기초하여 자동-초점 기능을 제공하기 위한 디바이스, 시스템 및 방법
TW201531730A (zh) 資訊處理裝置及資訊處理方法
WO2021135867A1 (zh) 一种镜头模组的对焦方法、装置及设备
US20190163044A1 (en) Projection apparatus and auto-focusing method
JP6230911B2 (ja) 距離測定のためのライトプロジェクタ及びビジョンシステム
US20160110600A1 (en) Image collection and locating method, and image collection and locating device
TW202006585A (zh) 圖像採集方法、裝置和系統以及電子設備
WO2020124517A1 (zh) 拍摄设备的控制方法、拍摄设备的控制装置及拍摄设备
TW202004419A (zh) 眼球追蹤裝置及其光源控制方法
WO2024041312A1 (zh) 眼球跟踪装置和方法、显示装置、设备及介质
CN101813867A (zh) 半通过镜头无反光镜相位比较对焦数码相机系统
CN114363522A (zh) 拍照方法及相关装置
WO2022143813A1 (zh) 一种虹膜识别方法、装置及电子设备
US10880536B2 (en) Three-dimensional image capturing device and method
CN207148400U (zh) 摄像模组和成像设备
TWI661233B (zh) 點陣投影器結構,以及利用點陣投影器結構擷取圖像的方法
CN216310373U (zh) 一种可调焦距的镜头模块
ES2900248T3 (es) Determinación del contorno superficial ocular utilizando queratometría multifocal
WO2021148050A1 (zh) 一种三维空间相机及其拍照方法
CN103428414A (zh) 用于便携式终端的摄像装置
CN116583885A (zh) 生物识别认证系统中的姿态优化

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21914545

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21914545

Country of ref document: EP

Kind code of ref document: A1