CN105654040B - Distance measurement method based on iris and mobile terminal - Google Patents
- Publication number: CN105654040B (application CN201510988985.8A, China)
- Legal status: Active (an assumption by Google Patents, not a legal conclusion)
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/18—Eye characteristics, e.g. of the iris
- G06V40/19—Sensors therefor
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10004—Still image; Photographic image
Abstract
The invention discloses an iris-based distance measurement method and a mobile terminal. The method comprises the following steps: acquiring an iris image of an iris to be measured through a first lens; determining the radius or diameter of the iris in the iris image; and determining the distance between the first lens and the iris according to that radius or diameter. Because the iris diameter varies only slightly across different users, embodiments of the invention measure the distance between the lens of the mobile terminal and the user's eyes on the premise that all users' iris diameters are the same. The distance therefore need not be determined from the user's interpupillary distance, which improves the precision of measuring the distance between the lens of the mobile terminal and the user's eyes.
Description
Technical Field
The invention relates to the field of mobile terminals, in particular to a distance measuring method based on irises and a mobile terminal.
Background
Biometric technologies such as face recognition, fingerprint recognition and iris recognition are now widely applied to mobile terminals, and have drawn considerable attention to the security and practicality of authentication. Iris recognition identifies a person by the iris of the eye; the human eye consists of the sclera, iris, pupil, lens, retina and other structures. The iris is an annular region between the black pupil and the white sclera, and its diameter typically lies between 11 mm and 12 mm across different people. The iris contains many detailed features, such as interlaced spots, filaments, coronas, striations and crypts, which form during the fetal development stage and remain unchanged for life. These features make the iris unique, and therefore make iris-based identification unique as well, so the iris can serve as an identification target for each person. During iris recognition the user must keep within a certain distance of the mobile terminal; if the distance exceeds the recognizable range, recognition fails. It is therefore useful to prompt the user, during iris recognition, when the distance to the mobile terminal exceeds the recognizable range.
In the prior art, the distance between the lens of a mobile terminal and the user's eyes is mainly measured as follows: an image containing both of the user's eyes is captured through the lens on the mobile terminal, and an algorithm locates the two pupil center points in the image, thereby determining the user's interpupillary distance. Based on the principle that the interpupillary distance in the image and the actual distance are inversely proportional, the distance between the lens and the user's eyes is then obtained from the measured interpupillary distance and an inverse-proportionality coefficient prestored in the mobile terminal.
The above method rests on the premise that all users have the same interpupillary distance. In reality, however, the interpupillary distance of adult males typically lies between 60 mm and 73 mm and that of adult females between 53 mm and 68 mm; that is, interpupillary distance varies considerably across users, so the method measures the distance between the lens of the mobile terminal and the user's eyes with a large error.
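As a quick check of this argument, the relative spread of the two candidate baselines can be computed from the ranges quoted above (the function name and the values-in-millimetres convention are ours, not the patent's):

```python
# Quick sanity check of the relative spread of the two biometric baselines,
# using the ranges quoted above (values in millimetres).
def relative_spread(lo, hi):
    """Half-range variation relative to the midpoint, as a fraction."""
    mid = (lo + hi) / 2.0
    return (hi - lo) / 2.0 / mid

ipd = relative_spread(53.0, 73.0)    # interpupillary distance across adults
iris = relative_spread(11.0, 12.0)   # iris diameter across people

print(f"IPD spread:  +/-{ipd:.1%}")   # roughly +/-15.9%
print(f"iris spread: +/-{iris:.1%}")  # roughly +/-4.3%
```

Since the estimated distance scales inversely with the assumed baseline, a baseline that is wrong by about 16% yields a distance that is wrong by about 16%, versus roughly 4% for the iris diameter.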
Disclosure of Invention
The embodiment of the invention provides a distance measuring method based on an iris and a mobile terminal, which are used for improving the precision of measuring the distance between a lens of the mobile terminal and eyes of a user.
In a first aspect, an iris-based ranging method is provided, including: acquiring an iris image of an iris to be detected through a first lens; determining the radius or diameter of the iris in the iris image of the iris to be detected; determining a distance between the first lens and the iris according to the radius or diameter of the iris.
In some implementations, the distance between the first lens and the iris can be a distance between a lens in the first lens and the iris.
In some implementations, the radius or diameter of the iris may also be replaced by the circumference of the iris, i.e., the distance between the first lens and the iris may be determined based on the circumference of the iris.
On the one hand, because the iris diameter varies only slightly across different users, the distance between the lens of the mobile terminal and the user's eyes is measured on the premise that all users' iris diameters are the same. The distance therefore need not be determined from the user's interpupillary distance; this avoids the poor measurement precision caused by the large variation of interpupillary distance across users, and thereby improves the precision of measuring the distance between the lens of the mobile terminal and the user's eyes.
On the other hand, ranging can be performed from a partial image containing only one of the user's eyes. This avoids the problem of the prior-art interpupillary-distance-based ranging method, in which user movement reduces the rate of successfully capturing a usable image, and thereby improves the image capture rate.
With reference to the first aspect, in a first possible implementation manner of the first aspect, determining a distance between the first lens and the iris according to a radius or a diameter of the iris includes: the distance between the first lens and the iris is determined based on the radius or diameter of the iris, the field angle of the first lens, and the area of the optical sensor in the first lens.
With reference to the first possible implementation manner of the first aspect, in a second possible implementation manner of the first aspect, determining the distance between the first lens and the iris according to the radius or diameter of the iris, the field angle of the first lens, and the area of the optical sensor in the first lens includes: determining the distance between the first lens and the iris according to a formula (shown as an image in the original document) in which L denotes the distance between the first lens and the iris, D denotes the radius or diameter of the iris, S denotes the area of the optical sensor in the first lens, F denotes the field angle of the first lens, and k denotes a coefficient.
With reference to the second implementation manner of the first aspect, in a third possible implementation manner of the first aspect, the coefficient is obtained based on an iris image of an iris whose distance from a lens is known.
In some implementations, the coefficient may be determined from the principle of the iris-based ranging method, using a known iris diameter or radius and the corresponding known distance.
With reference to the first aspect or any one of the foregoing possible implementation manners of the first aspect, in a fourth possible implementation manner of the first aspect, determining a radius or a diameter of an iris in an iris image of the iris to be measured includes: determining a circular image area of the iris to be detected in the iris image; and determining the radius or the diameter of the iris according to the number of the pixels occupied by the circular image area in the iris image of the iris to be detected.
In some implementations, a circular image region of the to-be-detected iris in the iris image may be determined by edge detection.
In some implementations, the number of pixels may be replaced by any other pixel-based quantity that can represent the area occupied by the iris to be measured in the iris image.
With reference to the first aspect or any one of the foregoing implementation manners of the first aspect, in a fifth possible implementation manner of the first aspect, the radius or the diameter of the iris is represented by the number of pixels.
With reference to the first aspect or any one of the foregoing implementation manners of the first aspect, in a sixth possible implementation manner of the first aspect, the first lens may be a front infrared camera of the mobile terminal.
According to the method and the device, the iris image of the user can be acquired through the front infrared camera of the mobile terminal, so no additional transmitter or receiver hardware is required, which reduces the hardware overhead of the mobile terminal.
In a second aspect, there is provided an iris-based ranging mobile terminal, including: the acquisition module is used for acquiring an iris image of the iris to be detected through the first lens; the first determining module is used for determining the radius or the diameter of the iris in the iris image of the iris to be detected, which is acquired by the acquiring module; and the second determination module is used for determining the distance between the first lens and the iris according to the radius or the diameter of the iris determined by the first determination module.
With reference to the second aspect, in a first possible implementation manner of the second aspect, the second determining module is specifically configured to: the distance between the first lens and the iris is determined based on the radius or diameter of the iris, the field angle of the first lens, and the area of the optical sensor in the first lens.
With reference to the first possible implementation manner of the second aspect, in a second possible implementation manner of the second aspect, the second determining module is specifically further configured to: determine the distance between the first lens and the iris according to a formula (shown as an image in the original document) in which L denotes the distance between the first lens and the iris, D denotes the radius or diameter of the iris, S denotes the area of the optical sensor in the first lens, F denotes the field angle of the first lens, and k denotes a coefficient.
With reference to the second implementation manner of the second aspect, in a third possible implementation manner of the second aspect, the coefficient is obtained based on an iris image of an iris with a known lens distance.
With reference to the second aspect or any one of the foregoing possible implementation manners of the second aspect, in a fourth possible implementation manner of the second aspect, the first determining module is specifically configured to: determine a circular image area of the iris to be detected in the iris image; and determine the radius or diameter of the iris according to the number of pixels occupied by the circular image area in the iris image.
In a third aspect, a computer-readable storage medium for storing program code for an iris-based ranging method is provided, the program code comprising instructions for performing the method of the first aspect.
Because the iris diameter varies only slightly across different users, the embodiment of the invention measures the distance between the lens of the mobile terminal and the user's eyes on the premise that all users' iris diameters are the same. The distance therefore need not be determined from the user's interpupillary distance; this avoids the poor measurement precision caused by the large variation of interpupillary distance across users, and improves the precision of measuring the distance between the lens of the mobile terminal and the user's eyes.
Drawings
To illustrate the technical solutions of the embodiments of the present invention more clearly, the drawings used in the embodiments are briefly described below. The drawings described below show only some embodiments of the present invention; those of ordinary skill in the art can derive other drawings from them without creative effort.
Fig. 1 is a schematic view of the principle of an iris-based ranging method according to an embodiment of the present invention.
Fig. 2 is a schematic flow diagram of an iris-based ranging method according to an embodiment of the present invention.
Fig. 3 is a schematic block diagram of an iris-based ranging mobile terminal according to an embodiment of the present invention.
Fig. 4 is a schematic block diagram of an iris-based ranging mobile terminal according to an embodiment of the present invention.
Detailed Description
The technical solution in the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present invention.
Fig. 1 is a schematic view of the principle of an iris-based ranging method according to an embodiment of the present invention. In fig. 1, light rays 1 and 2 are parallel; after passing through the lens they converge at the focal point on the optical axis. Light rays 3 and 4 pass through the principal point of the lens, so their directions are unchanged. D denotes the diameter of the iris in the captured iris image, which can be expressed as a number of pixels; d denotes the actual diameter of the user's iris; L denotes the distance between the lens and the user; and l denotes the distance between the optical center of the lens and the image-plane iris diameter D. As the schematic in fig. 1 shows, the actual iris diameter d and the distance L between the mobile terminal and the user lie on a first triangle, while the image-plane iris diameter D and the distance l lie on a second triangle. The two triangles are similar, so the image-plane iris diameter D and the distance L are inversely proportional.
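The similar-triangle relation just described can be sketched as a small pinhole-model calculation; the helper name and all numeric values are illustrative assumptions, not parameters from the patent:

```python
import math

def distance_from_iris_image(d_mm, f_mm, D_px, pixel_pitch_mm):
    """Similar-triangle (pinhole) estimate: L / d = l / D,
    so L = d * l / D, with the image distance l taken as the
    focal length f and the image-plane diameter D converted
    from pixels to millimetres via the sensor's pixel pitch."""
    D_mm = D_px * pixel_pitch_mm          # iris diameter on the sensor
    return d_mm * f_mm / D_mm             # distance from lens to eye

# Illustrative values (not from the patent): an 11.5 mm iris,
# a 4 mm focal length, and an iris spanning 100 px at 0.0014 mm/px.
L = distance_from_iris_image(11.5, 4.0, 100, 0.0014)
print(f"estimated distance: {L:.0f} mm")   # about 329 mm
```

Doubling the pixel diameter D_px halves the estimate, reproducing the inverse proportionality the schematic establishes.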
Based on the above principle, the iris-based ranging method according to an embodiment of the present invention is described in detail below with reference to fig. 2.
Fig. 2 is a schematic flow diagram of an iris-based ranging method according to an embodiment of the present invention. The method shown in fig. 2 may be performed by a mobile terminal, the method comprising:
and 210, acquiring an iris image of the iris to be detected through the first lens.
220, determining the radius or diameter of the iris in the iris image of the iris to be measured.
And 230, determining the distance between the first lens and the iris according to the radius or the diameter of the iris.
On one hand, because the iris diameter varies only slightly across different users, the embodiment of the invention measures the distance between the lens of the mobile terminal and the user's eyes on the premise that all users' iris diameters are the same. The distance therefore need not be determined from the user's interpupillary distance; this avoids the poor measurement precision caused by the large variation of interpupillary distance across users, and improves the precision of measuring the distance between the lens of the mobile terminal and the user's eyes.
On the other hand, the embodiment of the invention can perform ranging from a partial image containing only one of the user's eyes. This avoids the problem of the prior-art interpupillary-distance-based ranging method, in which user movement reduces the rate of successfully capturing a usable image, and thereby improves the image capture rate.
Optionally, as an embodiment, 230 may include: determining a distance between the first lens and the iris based on a radius or diameter of the iris, a field angle of the first lens, and an area of an optical sensor in the first lens.
Specifically, the field angle of the first lens may refer to the angle, with the first lens as its vertex, subtended by the largest range over which the image of a target to be measured can pass through the first lens.
The area of the optical sensor in the first lens may refer to the area of a charge-coupled device (CCD) or complementary metal-oxide-semiconductor (CMOS) sensor. The larger the CCD/CMOS area, the more photons are captured, the better the light-sensing performance, and the higher the signal-to-noise ratio.
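The field angle and the sensor size together determine an effective focal length via the usual pinhole relation; the function and example numbers below are illustrative, not values from the patent:

```python
import math

def focal_length_mm(sensor_width_mm, fov_rad):
    """Pinhole relation between field of view and focal length:
    tan(F/2) = (w/2) / f  =>  f = (w/2) / tan(F/2)."""
    return (sensor_width_mm / 2.0) / math.tan(fov_rad / 2.0)

# Illustrative: a 4.8 mm-wide sensor with a 70-degree field angle.
f = focal_length_mm(4.8, math.radians(70))
print(f"effective focal length: {f:.2f} mm")   # about 3.43 mm
```

This is why the patent's distance formula can be written in terms of sensor area S and field angle F instead of focal length: the three quantities are interchangeable under the pinhole model.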
Optionally, as an embodiment, 230 may further include: determining the distance between the first lens and the iris according to a formula (shown as an image in the original document) in which L denotes the distance between the first lens and the iris, D denotes the radius or diameter of the iris, S denotes the area of the optical sensor in the first lens, F denotes the field angle of the first lens, and k denotes a coefficient.
It should be appreciated that the above formula may also be used to determine the lens parameters of a mobile terminal. For example, given the iris diameter D of a user and the corresponding distance L, the formula can be solved for the lens parameters, i.e., the area S of the optical sensor in the lens and the field angle F of the lens.
Alternatively, as one embodiment, the coefficient is derived based on an iris image of the iris at a known distance from the lens.
Specifically, in the above formula the coefficient k may be an empirical value obtained from actual ranging; once the lens parameters of the mobile terminal are fixed, k can generally be determined as well. For example, choosing the lens module of the mobile terminal fixes the relevant lens parameters, i.e., the sensor area S and the field angle F in the formula; substituting a known iris diameter D and its corresponding known distance L into the formula then yields the coefficient k.
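Because the formula itself appears only as an image in the original, the sketch below assumes one plausible form consistent with the stated inverse proportionality between the image-plane iris diameter D (in pixels) and the distance L; the sensor area, field angle, and calibration values are all illustrative assumptions, not values from the patent:

```python
import math

# Assumed form (not the patent's exact formula, which is not reproduced):
#     L = k * sqrt(S) / (D * tan(F / 2))
# where D is the iris diameter in pixels, S the sensor area, F the field
# angle, and k a coefficient calibrated once per lens module.

def estimate_distance(D_px, S_mm2, F_rad, k):
    return k * math.sqrt(S_mm2) / (D_px * math.tan(F_rad / 2))

def calibrate_k(L_known, D_px, S_mm2, F_rad):
    """Solve the same relation for k using one image taken at a known distance."""
    return L_known * D_px * math.tan(F_rad / 2) / math.sqrt(S_mm2)

S = 17.3                # sensor area in mm^2 (assumed)
F = math.radians(70)    # field angle (assumed)
# Calibration shot: iris spans 110 px at a known 300 mm distance.
k = calibrate_k(L_known=300.0, D_px=110, S_mm2=S, F_rad=F)

# With k fixed, a smaller image diameter implies a larger distance.
print(round(estimate_distance(55, S, F, k)))   # half the pixels -> 600 mm
```

Note that the calibration step makes the estimate independent of how S and F enter the assumed form, since any constant factors are absorbed into k.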
Optionally, as an embodiment, 220 may include: determining a circular image area of the iris to be detected in the iris image; and determining the radius or the diameter of the iris according to the number of the pixels occupied by the circular image area in the iris image of the iris to be detected.
Specifically, the iris image may first be smoothed by Gaussian filtering, edge detection may then be applied to the filtered image to find the circular boundary of the iris, and finally a Hough circle detection algorithm applied to that boundary determines the number of pixels spanned by the diameter or radius of the iris circle in the image.
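The final step, recovering the radius or diameter from the number of pixels the circular region occupies, can be sketched as follows; a rasterised synthetic disk stands in for a real segmented iris, since the Gaussian-filtering and Hough-detection stages are not reproduced here:

```python
import math

def diameter_from_pixel_count(mask):
    """Given a binary mask of the circular iris region (1 = iris pixel),
    recover the diameter in pixels from the occupied area: A = pi*(d/2)^2."""
    area = sum(sum(row) for row in mask)
    return 2.0 * math.sqrt(area / math.pi)

# Toy mask: rasterise a disk of radius 20 px on a 50x50 grid in place of
# a real segmented iris image (segmentation itself is not shown here).
R, N = 20, 50
mask = [[1 if (x - 25) ** 2 + (y - 25) ** 2 <= R * R else 0
         for x in range(N)] for y in range(N)]

d = diameter_from_pixel_count(mask)
print(f"recovered diameter: {d:.1f} px (true: {2 * R} px)")
```

The recovered diameter matches the true value to within rasterisation error, illustrating why a pixel count over the circular area suffices as the measurement.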
It should be understood that the circle detection algorithm may be a circle detection algorithm based on hough transform, or may be a circle detection algorithm based on least square method, which is not specifically limited in this embodiment of the present invention.
Optionally, as an embodiment, the radius or diameter of the iris is expressed in terms of the number of pixels.
Optionally, as an embodiment, the lens of the mobile terminal is a front infrared camera of the mobile terminal.
The iris-based ranging method according to an embodiment of the present invention is described in detail above with reference to fig. 2. An iris-based ranging mobile terminal according to an embodiment of the present invention will be described in detail with reference to fig. 3 and 4. It should be understood that the mobile terminal shown in fig. 3 is capable of implementing the steps of fig. 2, and will not be described in detail herein to avoid repetition.
Fig. 3 is a schematic block diagram of an iris-based ranging mobile terminal according to an embodiment of the present invention. The mobile terminal 300 shown in fig. 3 includes an acquisition module 310, a first determination module 320, and a second determination module 330.
The acquisition module 310 is used for acquiring an iris image of an iris to be detected through the first lens;
the first determining module 320 is configured to determine the radius or the diameter of the iris in the iris image of the iris to be detected acquired by the acquiring module;
a second determining module 330, configured to determine a distance between the first lens and the iris according to the radius or the diameter of the iris determined by the first determining module.
On one hand, because the iris diameter varies only slightly across different users, the embodiment of the invention measures the distance between the lens of the mobile terminal and the user's eyes on the premise that all users' iris diameters are the same. The distance therefore need not be determined from the user's interpupillary distance; this avoids the poor measurement precision caused by the large variation of interpupillary distance across users, and improves the precision of measuring the distance between the lens of the mobile terminal and the user's eyes.
On the other hand, the embodiment of the invention can perform ranging from a partial image containing only one of the user's eyes. This avoids the problem of the prior-art interpupillary-distance-based ranging method, in which user movement reduces the rate of successfully capturing a usable image, and thereby improves the image capture rate.
Optionally, as an embodiment, the second determining module 330 is specifically configured to determine the distance between the first lens and the iris according to the radius or diameter of the iris, the field angle of the first lens, and the area of the sensor in the first lens.
Optionally, as an embodiment, the second determining module 330 is specifically configured to determine the distance between the first lens and the iris according to a formula (shown as an image in the original document) in which L denotes the distance between the first lens and the iris, D denotes the radius or diameter of the iris, S denotes the area of the optical sensor in the first lens, F denotes the field angle of the first lens, and k denotes a coefficient.
Alternatively, as one embodiment, the coefficient is derived based on an iris image of the iris at a known distance from the lens.
Optionally, as an embodiment, the first determining module 320 is specifically configured to determine a circular image region of the iris to be detected in the iris image; and determining the radius or the diameter of the iris according to the number of the pixels occupied by the circular image area in the iris image of the iris to be detected.
Optionally, as an embodiment, the radius or diameter of the iris is expressed in terms of the number of pixels.
Optionally, as an embodiment, the lens of the mobile terminal 300 is a front lens of the mobile terminal 300.
Fig. 4 is a schematic block diagram of an iris-based ranging mobile terminal according to an embodiment of the present invention. The mobile terminal 400 shown in fig. 4 includes a lens 410, a processor 420, a memory 430, an input/output interface 440, a communication interface 450, and a bus system 460. The lens 410, the processor 420, the memory 430, the input/output interface 440 and the communication interface 450 are connected by the bus system 460. The memory 430 is used for storing instructions, and the processor 420 is used for executing the instructions stored in the memory 430, to control the input/output interface 440 to receive input data and information and to output data such as operation results, and to control the communication interface 450 to send signals.
A lens 410 for photographing an iris image of an iris to be measured;
a processor 420, configured to acquire an iris image of an iris to be measured from the lens 410, and determine a radius or a diameter of the iris in the iris image of the iris to be measured; and determining the distance between the first lens and the iris according to the radius or the diameter of the iris.
It should be understood that, in the embodiment of the present invention, the processor 420 may adopt a general-purpose Central Processing Unit (CPU), a microprocessor, an Application Specific Integrated Circuit (ASIC), or one or more integrated circuits, for executing related programs to implement the technical solutions provided by the embodiments of the present invention.
It should also be appreciated that the communication interface 450 enables communication between the mobile terminal 400 and other devices or communication networks using transceiver means, such as, but not limited to, transceivers.
The memory 430 may include both read-only memory and random access memory, and provides instructions and data to the processor 420. A portion of the memory 430 may also include non-volatile random access memory. For example, the memory 430 may also store information on the device type.
The bus system 460 may include a power bus, a control bus, a status signal bus, and the like, in addition to a data bus. For clarity of illustration, however, the various buses are labeled in the figures as the bus system 460.
In implementation, the steps of the above method may be completed by integrated logic circuits in hardware in the processor 420 or by instructions in the form of software. The steps of the ranging method disclosed in the embodiments of the present invention may be directly implemented by a hardware processor, or implemented by a combination of hardware and software modules in the processor. The software module may be located in a storage medium well known in the art, such as random access memory, flash memory, read-only memory, programmable read-only memory, electrically erasable programmable read-only memory, or a register. The storage medium is located in the memory 430; the processor 420 reads the information in the memory 430 and completes the steps of the above method in combination with its hardware. To avoid repetition, details are not repeated here.
On one hand, because the iris diameter varies only slightly across different users, the embodiment of the invention measures the distance between the lens of the mobile terminal and the user's eyes on the premise that all users' iris diameters are the same. The distance therefore need not be determined from the user's interpupillary distance; this avoids the poor measurement precision caused by the large variation of interpupillary distance across users, and improves the precision of measuring the distance between the lens of the mobile terminal and the user's eyes.
On the other hand, the embodiment of the invention can perform ranging from a partial image containing only one of the user's eyes. This avoids the problem of the prior-art interpupillary-distance-based ranging method, in which user movement reduces the rate of successfully capturing a usable image, and thereby improves the image capture rate.
It should be understood that in the present embodiment, "B corresponding to A" means that B is associated with A and that B can be determined from A. It should also be understood that determining B from A does not mean determining B from A alone; B may also be determined from A and/or other information.
It should be understood that the term "and/or" herein merely describes an association between related objects, indicating that three relationships may exist; for example, "A and/or B" may mean that A exists alone, that both A and B exist, or that B exists alone. In addition, the character "/" herein generally indicates that the related objects before and after it are in an "or" relationship.
It should be understood that, in various embodiments of the present invention, the sequence numbers of the above-mentioned processes do not mean the execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation on the implementation process of the embodiments of the present invention.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
It can be clearly understood by those skilled in the art that, for convenience and brevity of description, the specific working processes of the system, the mobile terminal and the unit described above may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the several embodiments provided in the present application, it should be understood that the disclosed system, mobile terminal and method may be implemented in other manners. For example, the above-described mobile terminal embodiments are merely illustrative, and for example, the division of the unit is only one logical division, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection of the mobile terminals or units through some interfaces, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit.
The functions described above, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present invention. The aforementioned storage medium includes: various media capable of storing program code, such as a USB flash drive, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
The above description is only an embodiment of the present invention, but the scope of the present invention is not limited thereto, and any person skilled in the art can easily conceive of changes or substitutions within the technical scope of the present invention, and all such changes or substitutions are included in the scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.
Claims (8)
1. An iris-based ranging method, comprising:
acquiring an iris image of an iris to be detected through a first lens;
determining the radius or the diameter of the iris in the iris image of the iris to be detected;
determining a distance between the first lens and the iris according to the following formula:
where L denotes a distance between the first lens and the iris, D denotes a radius or diameter of the iris, S denotes an area of an optical sensor in the first lens, F denotes an angle of view of the first lens, and k denotes a coefficient.
2. The method of claim 1, wherein the coefficient is derived based on an iris image of an iris at a known distance from the lens.
3. The method of claim 1 or 2, wherein said determining a radius or diameter of the iris in the iris image of the iris to be measured comprises:
determining a circular image area of the iris to be detected in the iris image;
and determining the radius or the diameter of the iris according to the number of pixels occupied by the circular image area in the iris image of the iris to be detected.
4. The method of claim 3, wherein the radius or diameter of the iris is expressed in terms of a number of pixels.
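The method claims above can be illustrated with a short sketch. Note that the exact formula of claim 1 (relating L, D, S, F, and k) is not reproduced in this text, so the sketch below falls back on the standard pinhole-camera relation that the abstract's reasoning implies (the iris diameter varies little between adults, so its apparent size in pixels indicates distance); the constant iris diameter, function names, and example values are all assumptions, not the patent's implementation.

```python
import math

# Adult iris diameters cluster tightly around 11-12 mm, which is the
# premise the abstract relies on; the exact value here is an assumption.
HUMAN_IRIS_DIAMETER_MM = 11.8

def estimate_distance_mm(iris_diameter_px: float,
                         image_width_px: int,
                         horizontal_fov_deg: float) -> float:
    """Estimate the lens-to-iris distance from the iris's pixel diameter.

    Pinhole model: the focal length in pixels follows from the horizontal
    field of view, and distance = f_px * real_size / apparent_size_px.
    """
    if iris_diameter_px <= 0:
        raise ValueError("iris diameter in pixels must be positive")
    # Focal length expressed in pixels, derived from the field of view.
    f_px = (image_width_px / 2) / math.tan(math.radians(horizontal_fov_deg) / 2)
    return f_px * HUMAN_IRIS_DIAMETER_MM / iris_diameter_px

# Example: a 1920-px-wide frame, 60-degree horizontal FOV, iris spanning
# 80 px gives roughly 245 mm under these assumed values.
distance = estimate_distance_mm(80, 1920, 60.0)
```

As in claims 3 and 4, the only per-frame measurement is the pixel count across the circular iris region; everything else (sensor geometry, field of view) is fixed per device, which is what the patent folds into its coefficient k.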
5. An iris-based ranging mobile terminal, comprising:
the acquisition module is used for acquiring an iris image of the iris to be detected through the first lens;
the first determining module is used for determining the radius or the diameter of the iris in the iris image of the iris to be detected, which is acquired by the acquiring module;
a second determining module, configured to determine a distance between the first lens and the iris according to the following formula:
where L denotes a distance between the first lens and the iris, D denotes a radius or diameter of the iris, S denotes an area of an optical sensor in the first lens, F denotes an angle of view of the first lens, and k denotes a coefficient.
6. The mobile terminal of claim 5, wherein the coefficient is derived based on an iris image of an iris at a known distance from a lens.
7. The mobile terminal of claim 5 or 6, wherein the first determining module is specifically configured to:
determining a circular image area of the iris to be detected in the iris image;
and determining the radius or the diameter of the iris according to the number of pixels occupied by the circular image area in the iris image of the iris to be detected.
8. The mobile terminal of claim 7, wherein the radius or diameter of the iris is expressed in terms of a number of pixels.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201510988985.8A CN105654040B (en) | 2015-12-24 | 2015-12-24 | Distance measurement method based on iris and mobile terminal |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201510988985.8A CN105654040B (en) | 2015-12-24 | 2015-12-24 | Distance measurement method based on iris and mobile terminal |
Publications (2)
Publication Number | Publication Date |
---|---|
CN105654040A CN105654040A (en) | 2016-06-08 |
CN105654040B true CN105654040B (en) | 2020-01-03 |
Family
ID=56476829
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201510988985.8A Active CN105654040B (en) | 2015-12-24 | 2015-12-24 | Distance measurement method based on iris and mobile terminal |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN105654040B (en) |
Families Citing this family (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106778713B (en) * | 2017-03-01 | 2023-09-22 | 武汉虹识技术有限公司 | Iris recognition device and method for dynamic human eye tracking |
CN107301403A (en) * | 2017-06-30 | 2017-10-27 | 广东欧珀移动通信有限公司 | Control method, electronic device and computer-readable storage medium |
CN107341469B (en) * | 2017-06-30 | 2022-06-14 | Oppo广东移动通信有限公司 | Control method, electronic device, and computer-readable storage medium |
CN107463885B (en) * | 2017-07-19 | 2021-04-16 | Oppo广东移动通信有限公司 | Biometric pattern control method and related product |
WO2021145913A1 (en) * | 2020-01-13 | 2021-07-22 | Google Llc | Estimating depth based on iris size |
CN113920195A (en) * | 2021-10-08 | 2022-01-11 | Oppo广东移动通信有限公司 | Distance detection method, control method, device, storage medium and electronic equipment |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6289113B1 (en) * | 1998-11-25 | 2001-09-11 | Iridian Technologies, Inc. | Handheld iris imaging apparatus and method |
CN103440483A (en) * | 2013-09-03 | 2013-12-11 | 吉林大学 | Active auto-focus type iris image capturing device |
CN104809424A (en) * | 2014-01-23 | 2015-07-29 | 北京七鑫易维信息技术有限公司 | Method for realizing sight tracking based on iris features |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101893858B (en) * | 2010-07-15 | 2012-01-25 | 华中科技大学 | Method for controlling distance between eyes of user and screen of electronic equipment |
CN103106401B (en) * | 2013-02-06 | 2017-02-22 | 北京中科虹霸科技有限公司 | Mobile terminal iris recognition device with human-computer interaction mechanism |
2015-12-24: application CN201510988985.8A filed in China; granted as CN105654040B (status: active)
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN105654040B (en) | Distance measurement method based on iris and mobile terminal | |
CN107563304B (en) | Terminal equipment unlocking method and device and terminal equipment | |
KR20180053108A (en) | Method and apparatus for extracting iris region | |
JP3598109B2 (en) | Iris code generation method, personal authentication method, iris code registration device, and personal authentication device | |
CN108629262B (en) | Iris identification method and corresponding device | |
KR20150022016A (en) | Systems and method for facial verification | |
CN104680128B (en) | Biological feature recognition method and system based on four-dimensional analysis | |
KR101632912B1 (en) | Method for User Authentication using Fingerprint Recognition | |
WO2016171624A1 (en) | Extended depth-of-field biometric system | |
CN107491675B (en) | Information security processing method and device and terminal | |
CN109948439B (en) | Living body detection method, living body detection system and terminal equipment | |
EP4095744A1 (en) | Automatic iris capturing method and apparatus, computer-readable storage medium, and computer device | |
EP3249607A1 (en) | Method for obtaining a position of a main lens optical center of a plenoptic camera | |
KR101919090B1 (en) | Apparatus and method of face recognition verifying liveness based on 3d depth information and ir information | |
KR20180134280A (en) | Apparatus and method of face recognition verifying liveness based on 3d depth information and ir information | |
US20140369553A1 (en) | Method for triggering signal and in-vehicle electronic apparatus | |
CN106991376B (en) | Depth information-combined side face verification method and device and electronic device | |
CN117496560B (en) | Fingerprint line identification method and device based on multidimensional vector | |
KR20150069799A (en) | Method for certifying face and apparatus thereof | |
KR20210136092A (en) | Image processing apparatus, image processing method and image processing program | |
CN109034004A (en) | A kind of identity verification method and device based on recognition of face | |
CN106485265B (en) | Image identification method and device | |
CN115423804B (en) | Image calibration method and device and image processing method | |
KR102011195B1 (en) | Image processing method and image processing device | |
CN112418189B (en) | Face recognition method, device and equipment for wearing mask and storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||