WO2020170894A1 - Image processing device, method, system, and computer-readable medium - Google Patents
Image processing device, method, system, and computer-readable medium
- Publication number
- WO2020170894A1 (PCT/JP2020/005220)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- subject
- period
- iris
- focus position
- blink
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/18—Eye characteristics, e.g. of the iris
- G06V40/197—Matching; Classification
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/18—Eye characteristics, e.g. of the iris
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/117—Identification of persons
- A61B5/1171—Identification of persons based on the shapes or appearances of their bodies or parts thereof
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F21/00—Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F21/30—Authentication, i.e. establishing the identity or authorisation of security principals
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
- G06T7/246—Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/10—Image acquisition
- G06V10/12—Details of acquisition arrangements; Constructional details thereof
- G06V10/14—Optical characteristics of the device performing the acquisition or on the illumination arrangements
- G06V10/141—Control of illumination
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/20—Image preprocessing
- G06V10/25—Determination of region of interest [ROI] or a volume of interest [VOI]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/52—Surveillance or monitoring of activities, e.g. for recognising suspicious objects
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/18—Eye characteristics, e.g. of the iris
- G06V40/19—Sensors therefor
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/20—Movements or behaviour, e.g. gesture recognition
- G06V40/23—Recognition of whole body movements, e.g. for sport training
- G06V40/25—Recognition of walking or running movements, e.g. gait recognition
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L63/00—Network architectures or network communication protocols for network security
- H04L63/08—Network architectures or network communication protocols for network security for authentication of entities
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L9/00—Cryptographic mechanisms or cryptographic arrangements for secret or secure communications; Network security protocols
- H04L9/32—Cryptographic mechanisms or cryptographic arrangements for secret or secure communications; Network security protocols including means for verifying the identity or authority of a user of the system or for message authentication, e.g. authorization, entity authentication, data integrity or data verification, non-repudiation, key authentication or verification of credentials
- H04L9/3226—Cryptographic mechanisms or cryptographic arrangements for secret or secure communications; Network security protocols including means for verifying the identity or authority of a user of the system or for message authentication, e.g. authorization, entity authentication, data integrity or data verification, non-repudiation, key authentication or verification of credentials using a predetermined code, e.g. password, passphrase or PIN
- H04L9/3231—Biological data, e.g. fingerprint, voice or retina
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W12/00—Security arrangements; Authentication; Protecting privacy or anonymity
- H04W12/06—Authentication
Definitions
- the present disclosure relates to an image processing device, a method, a system, and a computer-readable medium, and more particularly to an image processing device, a method, a system, and a computer-readable medium that can be used for an authentication application using an iris.
- Biometric authentication using the iris is known.
- an iris of a subject is imaged using an imaging device, and a feature amount is extracted from the imaged iris pattern.
- the extracted feature amount is collated with feature amounts registered in advance in a database, and pass or fail is determined based on the collation score.
- the extracted feature amount is added to the database.
- the iris, which is a donut-shaped tissue surrounding the pupil, has a very complicated pattern that is unique to each person. Further, when imaging the iris, the subject's eyes are irradiated with near-infrared light.
- an iris image is captured with a resolution capable of expressing the radius of the iris with 100 to 140 pixels. Further, the wavelength of near-infrared light with which the subject's eyes are irradiated is 700 nm to 900 nm.
- the diameter of the iris is about 1 cm; expressing the radius with 100 pixels therefore yields a granularity of 50 µm.
- because the iris pattern is this minute, it is difficult to image it with a quality that allows authentication or collation under conditions such as a long distance between the subject and the imaging means, a wide imaging field of view, and movement of the subject.
- it is an object of the present disclosure to provide an image processing device, method, system, and computer-readable medium capable of capturing an iris pattern with a quality that allows authentication or collation.
- the present disclosure provides an image processing system including: a plurality of iris image pickup means arranged at different positions within the same visual field range; a whole image pickup means that captures an image in a visual field range wider than that of the iris image pickup means;
- a guide means that presents at least one of a video and a sound for guiding a subject; an illumination means that irradiates the subject with light; and a control means that controls at least one of them. The control means predicts a period in which the subject's next blink occurs and a period in which the subject passes the in-focus position of the iris image pickup means, and, when the period in which the next blink occurs and the period of passing the in-focus position overlap, controls at least one of the guide means and the iris image pickup means so that the in-focus position and the position at which the subject next blinks differ.
- the present disclosure also provides an image processing apparatus including a control means that predicts a period in which a moving subject's next blink occurs and a period in which the subject passes the in-focus position of the iris image pickup means for imaging the subject's iris, and that controls a guide means for guiding the subject so that the subject does not blink at the in-focus position.
- the present disclosure further provides an image processing apparatus including a control means that predicts a period in which a moving subject's next blink occurs and a period in which the subject passes the in-focus position of the iris image pickup means for imaging the subject's iris, and that controls the iris image pickup means so that the in-focus position is set to a position other than the position of the subject during the period in which the next blink occurs.
- the present disclosure provides an image processing method that, using the image of a whole image pickup means that captures a visual field range wider than those of a plurality of iris image pickup means arranged at different positions within the same visual field range, performs at least one of: reading out an image of the iris image pickup means; presenting at least one of a video and a sound of a guide means for guiding a subject; and irradiating the subject with light of an illumination means.
- the present disclosure also provides an image processing method that predicts a period in which a moving subject's next blink occurs and a period in which the subject passes the in-focus position of the iris image pickup means for imaging the subject's iris, determines whether the period in which the next blink occurs and the period of passing the in-focus position overlap, and, when they are determined to overlap, performs at least one of guiding the subject and controlling the iris image pickup means so that the in-focus position and the position at which the subject next blinks differ.
- the present disclosure provides a non-transitory computer-readable medium storing a program that causes a computer to execute, using the image of a whole image pickup means that captures a visual field range wider than those of a plurality of iris image pickup means arranged at different positions within the same visual field range, at least one of reading out an image of the iris image pickup means, presenting at least one of a video and a sound of a guide means for guiding a subject, and irradiating the subject with light of an illumination means.
- the present disclosure also provides a non-transitory computer-readable medium storing a program that causes a computer to execute a process of predicting a period in which a moving subject's next blink occurs and a period in which the subject passes the in-focus position of the iris image pickup means for imaging the subject's iris, determining whether the two periods overlap, and, when they are determined to overlap, performing at least one of guiding the subject and controlling the iris image pickup means so that the in-focus position and the position at which the subject next blinks differ.
- the image processing device, method, system, and computer-readable medium according to the present disclosure can capture an iris pattern with a quality that allows authentication and matching.
- FIG. 1 is a block diagram showing an image processing system according to a first embodiment of the present disclosure.
- FIG. 2 is a diagram showing a state of iris image pickup control.
- FIG. 3 is a flowchart showing an operation procedure in the image processing system.
- FIG. 4 is a flowchart showing an operation procedure in the image processing system according to the second embodiment of the present disclosure.
- FIG. 5 is a block diagram showing a configuration example of a computer device.
- first, the problem is quantified under the following conditions.
- the distance between the subject and the imaging means (the distance between the subject and the gate) is assumed to be 2 m.
- the horizontal field of view, that is, the horizontal range that can cover both eyes of one subject, is 0.2 m.
- the vertical field of view, that is, the vertical range that can cover from the eyes of a tall subject, typically a male, to the eyes of a short subject, typically a female, is 0.4 m.
- the walking speed (moving speed) of the subject with respect to the gate is assumed to be the average slow walking speed of an adult, for example 1 m/s.
- under these conditions, the image pickup means requires both a high resolution of 32 M pixels and a high frame rate of 100 fps (frames per second), as will be described later.
- the depth of field that can be secured 2 m ahead is about 1 cm.
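The dimensioning above can be reproduced numerically. The sketch below is a back-of-the-envelope calculation using only the assumptions stated in the text (100 pixels per iris radius, a 0.2 m × 0.4 m field of view, a 1 cm depth of field, and a 1 m/s walking speed):

```python
# Back-of-the-envelope dimensioning from the stated conditions.
# All values are assumptions taken from the text, not measured data.
IRIS_RADIUS_PX = 100             # pixels required across the iris radius
IRIS_DIAMETER_M = 0.01           # the iris is about 1 cm across
H_FIELD_M, V_FIELD_M = 0.2, 0.4  # required field of view at the gate
DOF_M = 0.01                     # depth of field available 2 m ahead
WALK_SPEED_M_S = 1.0             # slow adult walking speed

# Object-side pixel pitch: a 0.005 m radius over 100 pixels -> 50 um/pixel.
pixel_pitch_m = (IRIS_DIAMETER_M / 2) / IRIS_RADIUS_PX
h_px = H_FIELD_M / pixel_pitch_m   # 4000 pixels
v_px = V_FIELD_M / pixel_pitch_m   # 8000 pixels
total_mpx = h_px * v_px / 1e6      # about 32 Mpixels

# The subject crosses the 1 cm in-focus range in 0.01 s, so at least one
# frame must be captured every 0.01 s -> about 100 fps.
min_fps = WALK_SPEED_M_S / DOF_M
```

This is where the 32 M pixel and 100 fps figures in the text come from.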
- FIG. 1 shows an image processing system according to the first embodiment of the present disclosure.
- the image processing system includes a whole imager 100, a guide device 200, an illuminator 300, iris imagers 401 to 404, and a controller 500.
- the number of iris imagers is four in FIG. 1, the number of iris imagers is not particularly limited. The number of iris imagers can be appropriately set according to the range of visual field to be covered, available resolution of the iris imager, and the like.
- the whole image pickup device (whole image pickup means) 100 picks up an image of a subject in a wide visual field range so as to cover the whole from a tall subject to a short subject.
- the whole imager 100 may have a resolution capable of authenticating a subject with a face.
- the controller (control means) 500 monitors the whole image supplied from the whole imager 100 and controls the guide device (guide means) 200, the illuminator (illumination means) 300, and the plurality of iris imagers (iris image pickup means) 401 to 404.
- the function of the controller 500 can be configured by hardware, but can also be realized by a computer program.
- the controller 500 determines the start of biometric authentication for the subject based on the whole image supplied from the whole imager 100 or an external input.
- the control executed by the controller 500 includes guidance control, lighting control, and iris image pickup control.
- the controller 500 supplies guidance control information for guiding the subject to the guide device 200.
- the guide device 200 guides the subject based on the guidance control information.
- the guide device 200 includes, for example, a display and a speaker.
- the guide device 200 presents, for example, a video or audio announcing the start of biometric authentication on the display or the speaker.
- the guide device 200 also presents, through the display and the speaker, a video, audio, and the like for directing the subject's gaze toward the iris imagers.
- the controller 500 supplies the lighting device 300 with lighting control information for irradiating the subject with the illumination light.
- the illuminator 300 irradiates the subject with light (for example, near-infrared light) based on the illumination control information.
- the illuminator 300 includes an LED (Light Emitting Diode) that is a light source and a synchronization signal generator.
- the amount of light emitted from the illuminator 300 to the subject is determined by the supply current value to the LED, the lighting time of the LED, and the lighting cycle.
- the lighting control information includes these numerical values.
- the LED lighting cycle is synchronized with the frame rates of the plurality of iris imagers 401 to 404.
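As a hedged sketch, the lighting control information described above might be packaged as follows; the field and function names here are illustrative assumptions, not the patent's own API:

```python
# Hedged sketch of lighting control: the amount of light is set by the LED
# current, the per-pulse lighting time, and the lighting cycle, with the
# cycle synchronized to the iris imagers' frame rate. Names are assumptions.
from dataclasses import dataclass

@dataclass
class LightingControl:
    led_current_ma: float  # supply current value to the LED
    on_time_s: float       # lighting time of the LED per pulse
    cycle_s: float         # lighting cycle, synchronized to the frame rate

def lighting_for_frame_rate(fps: float, on_time_s: float,
                            current_ma: float) -> LightingControl:
    """Emit one LED pulse per frame so the cycle tracks the imagers."""
    return LightingControl(led_current_ma=current_ma,
                           on_time_s=on_time_s,
                           cycle_s=1.0 / fps)

ctrl = lighting_for_frame_rate(fps=120.0, on_time_s=0.001, current_ma=500.0)
```

Synchronizing the cycle to the frame rate ensures every exposure sees the same illumination.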
- based on the whole image supplied from the whole imager 100, the controller 500 determines which of the plurality of iris imagers 401 to 404 can appropriately capture the area including the subject's eyes. Further, the controller 500 determines the vertical position of the gaze area to be read out at high speed in the determined iris imager.
- FIG. 2 is a diagram showing a state of iris image pickup control. Details of the iris image pickup control will be described with reference to FIG.
- the iris imagers 401 to 404 are, for example, general-purpose cameras with 12 M pixels (4000 horizontal × 3000 vertical) at 60 fps. Such cameras are becoming popular as industrial cameras.
- a plurality of iris imagers 401 to 404 are arranged so as to be vertically stacked on each other. At this time, each of the plurality of iris imagers 401 to 404 is arranged such that the image areas thereof partially overlap with the adjacent iris imagers.
- the iris imagers 401 to 404 are arranged, for example, so that the image areas of adjacent iris imagers overlap by 2.5 cm. In that case, at the focus position 2 m ahead, the four imagers can secure a field of view of 0.2 m horizontally and 0.45 m vertically ((0.15−0.025) + (0.15−0.025−0.025) + (0.15−0.025−0.025) + (0.15−0.025) m). That is, the required field of view of 0.2 m horizontally and 0.4 m vertically can be secured. As the drawings and the above description show, the iris imagers have the same visual field range and are arranged at different positions.
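The field-of-view arithmetic above can be checked directly; the sketch below reproduces the term-by-term sum in the text (four vertically stacked imagers, each covering 0.15 m, with 0.025 m trimmed at every border shared with a neighbor):

```python
# Reproduces the vertical field-of-view arithmetic from the text.
V_FIELD = 0.15   # vertical field of one imager at the 2 m focus position [m]
TRIM = 0.025     # amount trimmed at each border shared with a neighbor [m]

# End imagers share one border, the two middle imagers share two.
contributions = [V_FIELD - TRIM,
                 V_FIELD - 2 * TRIM,
                 V_FIELD - 2 * TRIM,
                 V_FIELD - TRIM]
total_vertical_m = sum(contributions)  # 0.45 m, above the required 0.4 m
```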
- since the frame rate of each iris imager is 60 fps, the required frame rate of 100 fps cannot be satisfied as it is.
- an industrial camera or the like has a gaze area mode. In the gaze area mode, only the partial area set as the gaze area is read out, not the entire screen. The frame rate can be increased by using such a gaze area mode.
- the controller 500 sets a gaze area in an arbitrary iris imager and reads an image of the gaze area from the iris imager.
- for example, a partial area of 4000 pixels horizontally and 1500 pixels vertically is set as the gaze area.
- the frame rate can then be increased to 120 fps, twice the 60 fps achieved when reading out the entire screen.
- the horizontal and vertical fields of view of this partial area are 0.2 m and 0.075 m, respectively. A person's two eyes are lined up horizontally, so in the gaze area mode it is preferable to reduce the number of pixels vertically rather than horizontally so that both eyes can be imaged.
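The gaze-area trade-off above can be sketched with the numbers from the text: halving the rows read out roughly doubles the frame rate, at the cost of vertical field of view.

```python
# Sketch of the gaze-area trade-off (values from the text).
FULL_W, FULL_H = 4000, 3000   # full resolution of one iris imager
FULL_FPS = 60.0
PIXEL_PITCH_M = 50e-6         # 50 um per pixel at the 2 m focus position
GAZE_H = 1500                 # rows read out in gaze-area mode

gaze_fps = FULL_FPS * FULL_H / GAZE_H      # 120 fps
gaze_field_h_m = FULL_W * PIXEL_PITCH_M    # 0.2 m, so both eyes fit
gaze_field_v_m = GAZE_H * PIXEL_PITCH_M    # 0.075 m
```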
- the iris imager that captures the eye region is only one of the four iris imagers 401 to 404.
- therefore, only the partial area in that iris imager needs to be read out at 120 fps.
- the controller 500 estimates an iris imager that can preferably image the eye region among the plurality of iris imagers 401 to 404, and estimates the vertical position of the gaze region that is read out at high speed in the iris imager.
- the above estimation can be realized by the following method.
- the whole imager 100 has a resolution capable of authenticating the subject with the face, and the controller 500 derives the eye position of the subject in the whole image captured by the whole imager 100.
- the controller 500 derives the iris imager corresponding to the eye position of the subject in the whole image, and the eye position within that imager, using the camera parameters and the layout relationship of the whole imager 100 and each iris imager.
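As a hedged sketch of this selection step, the code below maps an eye height found in the whole image to an iris imager index and the vertical offset inside it. The uniform, non-overlapping split is a simplifying assumption for illustration; as the text says, a real system would use calibrated camera parameters and the physical layout.

```python
# Hedged sketch: map an eye height in the combined field of view to an
# imager index and an in-imager offset (simplified, no-overlap model).
def select_imager(eye_y_m: float, total_field_m: float = 0.45,
                  n_imagers: int = 4) -> tuple[int, float]:
    """eye_y_m is measured downward from the top of the combined field."""
    stride = total_field_m / n_imagers
    idx = min(int(eye_y_m / stride), n_imagers - 1)
    return idx, eye_y_m - idx * stride

imager_idx, offset_m = select_imager(0.30)  # an eye 0.30 m down the field
```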
- a field of view wider than 0.2 m in the horizontal direction and 0.4 m in the vertical direction and a time resolution higher than 100 fps can be realized by using a general-purpose camera.
- the controller 500 supplies iris image pickup information to each of the iris imagers 401 to 404 based on the above-described iris image pickup control.
- the controller 500 supplies the iris imaging information including the vertical position of the gaze area to the iris imaging device that passes the eye area of the subject.
- the controller 500 may supply arbitrary iris imaging information for other iris imagers.
- the controller 500 may supply the iris image pickup information including the information about the supply stop of the iris image to other iris imagers in order to reduce the total data amount of the iris image, for example.
- Each of the iris imagers 401 to 404 supplies an iris image to the controller 500 based on the iris image pickup information supplied from the controller 500. At this time, the iris imagers 401 to 404 output to the controller 500 an image (iris image) of the gaze area set by the controller 500 using the iris image pickup information.
- the iris imagers 401 to 404 may apply lossy compression to the iris image in the gaze area and output the compressed iris image to the controller 500.
- for the compression, the iris imagers 401 to 404 may perform quantization (compression of each pixel), predictive coding plus quantization (compression of a set of pixels), or transform coding plus quantization (compression of a set of pixels).
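A minimal sketch of the simplest option above is per-pixel quantization of an 8-bit iris image down to fewer bits. This is lossy and pixel-independent; the predictive- and transform-coding variants mentioned in the text operate on sets of pixels instead.

```python
# Per-pixel quantization: keep only the top `bits` bits of each 8-bit value.
def quantize(pixels: list[int], bits: int) -> list[int]:
    """Lossy per-pixel compression by dropping low-order bits."""
    shift = 8 - bits
    return [(p >> shift) << shift for p in pixels]

compressed = quantize([0, 37, 128, 255], bits=4)  # -> [0, 32, 128, 240]
```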
- the controller 500 uses the iris images supplied from the iris imagers 401 to 404 to perform the authentication and registration described in the background art. The controller 500 returns to the next process when the next subject exists or when the authentication or registration fails.
- FIG. 3 shows an operation procedure in the image processing system.
- the controller 500 performs guidance control and guides the subject using the guide device 200 (step S1001).
- the controller 500 performs illumination control and irradiates the subject with infrared light using the illuminator 300 (step S1002).
- the controller 500 performs the above-described iris image pickup control, and acquires an image of an iris (iris image) picked up by the plurality of iris imagers 401 to 404 (step S1003).
- the iris image acquired in step S1003 is used for iris authentication and registration.
- the controller 500 does not need to obtain an iris image from all the iris imagers 401 to 404 for a certain subject, as described above.
- the controller 500 acquires an iris image from an iris imager that is imaging the subject's eye region.
- the controller 500 performs iris authentication using the iris image acquired in step S1003 or registers the iris image (step S1004).
- the controller 500 determines whether there is a next subject, or whether to perform re-authentication or re-registration (step S1005). If there is a next subject, or if it is determined to perform re-authentication or re-registration, the process returns to step S1001 and the process is performed from the guidance control.
- suppose that the whole imager 100 of the present embodiment has a resolution capable of face authentication, and that a feature amount for face recognition of the subject and a feature amount for iris authentication of the subject are held in a database.
- the apparatus according to the present disclosure can be used for the purpose of identifying a subject based on face recognition and registering the extracted feature amount of the iris in a database.
- the device according to the present disclosure can also be used to estimate the height of a subject, based on the eye position information obtained by the iris image pickup control or the eye position information obtained when an iris image from an iris imager is authenticated or registered, and to register the estimated height in the database.
- the apparatus can use the estimated height information to calibrate the choice of the iris imager that can quickly image the eye region among the plurality of iris imagers, and the vertical position of the gaze region that is read out at high speed in that imager.
- a high resolution corresponding to a required visual field of 0.2 m × 0.4 m and a high frame rate corresponding to a time resolution of 0.01 seconds can be achieved with a combination of general-purpose cameras.
- the configuration of the image processing system according to this embodiment may be the same as the configuration of the image processing system in the first embodiment shown in FIG.
- the controller 500 predicts the period in which the subject's next blink occurs and the period in which the subject passes the in-focus position (2 m ahead) of the iris imagers. When the periods overlap, the controller 500 controls the guide device 200 so that blinking does not occur at the in-focus position.
- the controller 500 also functions as an image processing device (control device) that performs an image processing method (control method) related to blinking. Other points may be similar to those of the first embodiment.
- since blinking is a physiological phenomenon, it is difficult for the subject to suppress it deliberately, and a blink may occur at the in-focus position of the iris imagers 401 to 404.
- a person blinks on average 20 times per minute (an interval of 3 seconds), with each blink lasting 0.2 seconds, so the fraction of time with eyes open is about 93.3%. A simple calculation therefore gives a roughly 6.7% probability of having to redo authentication or registration because of blinking; about 7 out of 100 people would have to authenticate or register again.
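The back-of-the-envelope blink statistics above follow directly from the stated interval and duration:

```python
# Reproduces the blink statistics in the text.
BLINK_INTERVAL_S = 3.0   # about 20 blinks per minute
BLINK_DURATION_S = 0.2   # duration of one blink

eyes_open_ratio = (BLINK_INTERVAL_S - BLINK_DURATION_S) / BLINK_INTERVAL_S
# Chance that a pass through the in-focus position coincides with a blink.
redo_probability = BLINK_DURATION_S / BLINK_INTERVAL_S
```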
- the controller 500 can acquire the information as to whether or not the subject has closed his/her eyes by analyzing the eye region of the whole image of the whole image pickup device 100. Further, the controller 500 can acquire the distance information to the subject by using the whole image pickup device 100 having a distance measuring function such as Time of Flight (ToF). The controller 500 can acquire not only the position information of the subject but also the walking speed information of the subject by analyzing the distance information in time series.
- each piece of information is defined as follows.
- let b(t) be the information on whether the subject's eyes are closed at time t.
- let the position information of the subject at time t, that is, the distance from the whole imager 100 to the subject, be d(t) [m]. Let the average walking speed of the subject be v [m/s].
- the blinking interval time is set to b_i_t[s].
- the duration of blinking is b_p_t[s].
- let the focus position information, that is, the distance from the iris imagers 401 to 404 to the in-focus position, be df [m].
- Information of the focusing range is set to dof[m]. This completes the definition of each piece of information.
- the controller 500 can predict the occurrence period of the subject's next blink, as follows (hereinafter also referred to as the next blink period prediction).
- the controller 500 predicts the time from the next blink start time tbnext to tbnext + b_p_t (tbnext ≤ t ≤ tbnext + b_p_t) as the next blink occurrence period.
- the controller 500 can predict the period during which the subject passes through the in-focus position (hereinafter, also referred to as in-focus position passage period prediction).
- the controller 500 calculates the time ta at which the subject, located at distance d(t) at time t, reaches the in-focus position: ta = t + (d(t) − df)/v.
- the controller 500 predicts the period from time ta until the subject passes through the focusing range (ta ≤ t ≤ ta + dof/v) as the period during which the subject passes the in-focus position.
- the controller 500 determines whether the subject's next blink occurrence period derived as described above and the period during which the subject passes the in-focus position overlap. When they overlap, the controller 500 guides the subject using the guide device 200 so that a blink does not occur at the in-focus position (hereinafter also called blink avoidance guidance control).
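The two predictions and the overlap test can be sketched with the symbols defined above (tbnext, b_p_t, d(t), v, df, dof). The straight-line kinematics below are an illustration of the idea under those definitions, not the patented method itself:

```python
# Hedged sketch of blink-period prediction, focus-passage prediction, and
# the overlap test. Constant walking speed is assumed for illustration.
def focus_passage_period(t: float, d_t: float, v: float,
                         df: float, dof: float) -> tuple[float, float]:
    """Window during which the subject is inside the focusing range."""
    ta = t + (d_t - df) / v        # arrival time at the in-focus position
    return ta, ta + dof / v

def blink_period(tbnext: float, b_p_t: float) -> tuple[float, float]:
    """Window of the next predicted blink."""
    return tbnext, tbnext + b_p_t

def periods_overlap(p: tuple[float, float], q: tuple[float, float]) -> bool:
    return p[0] < q[1] and q[0] < p[1]

# Subject 3 m away at t = 10 s, walking at 1 m/s toward a focus position
# 2 m from the imagers with a 0.01 m focusing range; next blink at 11.0 s.
focus = focus_passage_period(t=10.0, d_t=3.0, v=1.0, df=2.0, dof=0.01)
blink = blink_period(tbnext=11.0, b_p_t=0.2)
needs_guidance = periods_overlap(focus, blink)  # overlap -> guide subject
```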
- for example, the controller 500 controls the guide device 200 so as to delay the time at which the subject reaches the in-focus position.
- the controller 500 causes the guide device 200 to present, on the display or the speaker, a video, audio, or the like prompting the subject to "walk slowly" or "stop".
- if the subject slows down or stops temporarily according to the guidance, the arrival time at the in-focus position is delayed, and the chance of avoiding a blink at the in-focus position increases.
- conversely, the controller 500 may control the guide device 200 so that the subject arrives at the in-focus position earlier. In that case, the controller 500 may cause the guide device 200 to present a video, audio, or the like prompting the subject to "walk fast" using the display or the speaker.
- alternatively, the controller 500 may control the guide device 200 so as to make the subject blink. For example, when b(t) indicates that the subject has not blinked for a long time, the controller 500 may cause the display or the speaker to present a video or audio prompting the subject to "blink" before the time obtained by subtracting the blink duration from the time ta. Having the subject blink once before passing the in-focus position prevents a blink during the period in which the subject passes the in-focus position. Further, when the iris image pickup means has a function of moving the in-focus position, the controller 500 may move the in-focus position so that it is in focus outside the period in which the next blink occurs. The function of moving the focus position can be realized by driving a focus control means such as a liquid lens provided in the iris imager.
- FIG. 4 shows an operation procedure in the image processing system.
- the controller 500 performs the above-described next blink period prediction (step S2001).
- the controller 500 also performs the above-described focus position passage period prediction (step S2002).
- the controller 500 determines whether or not the next blinking period predicted in step S2001 and the in-focus position passing period predicted in step S2002 overlap (step S2003).
- when the periods overlap, the controller 500 performs the above-described blink avoidance guidance control and guides the subject using the guide device 200 (step S2004).
- the controller 500 then performs the guidance control described in the first embodiment and guides the subject using the guide device 200 (step S2005).
- the controller 500 performs illumination control and irradiates the subject with infrared light using the illuminator 300 (step S2006).
- the controller 500 performs iris image capturing control to acquire an image of an iris (iris image) captured by using the plurality of iris imagers 401 to 404 (step S2007).
- step S2007 the controller 500 does not need to obtain an iris image from all the iris imagers 401 to 404 for a subject, as described in the first embodiment.
- the controller 500 acquires an iris image from an iris imager that is imaging the subject's eye region.
- the controller 500 performs iris authentication using the iris image acquired in step S2007 or registers the iris image (step S2008).
- the controller 500 determines whether there is a next subject, or whether to perform re-authentication or re-registration (step S2009). When there is the next subject, or when it is determined that re-authentication or re-registration is performed, the process returns to step S2001, and the process is performed from the next blink period prediction.
- steps S2005 to S2009 may be the same as steps S1001 to S1005 of FIG. 3.
- as described above, in this embodiment the controller 500 predicts the subject's next blink period. Further, the controller 500 predicts the period in which the subject passes the in-focus position of the iris imagers. When the next blink period and the period of passing the in-focus position overlap, the controller 500 performs blink avoidance guidance control and guides the subject using the guide device 200. By doing so, a blink at the in-focus position can be avoided, and the rate of redoing authentication and registration due to blinking can be reduced.
- the controller 500 may derive the positions of the subject's right and left eyes from the whole image (overhead image) captured by the whole imager 100, and may set, in the iris imager, a gaze area corresponding to the position of the right eye and a gaze area corresponding to the position of the left eye. In that case, the iris imager supplies the iris image of the right eye and the iris image of the left eye to the controller 500.
- the shape of the gaze area may be rectangular or elliptical.
- the controller 500 may derive the positions of the subject's right and left eyes based on the iris image captured by the iris imager, instead of the overhead image. For example, the controller 500 may temporarily set the partial area shown in FIG. 2 as a gaze area and derive the positions of the right and left eyes from the image of that gaze area. In that case, based on the derived positions, the controller 500 may set in the iris imager a partial area corresponding to the position of the right eye and a partial area corresponding to the position of the left eye as gaze areas.
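As an illustration of setting per-eye gaze areas from derived eye positions, the sketch below clips a fixed-size rectangular region of interest around each eye center and clamps it to the sensor frame. The ROI size, frame dimensions, and function names are assumptions for illustration only; the publication allows rectangular or elliptical gaze areas.

```python
def gaze_area(eye_center, roi_size=(640, 480), frame_size=(4000, 3000)):
    """Rectangular gaze area (x, y, w, h) centered on an eye position,
    clamped so it stays inside the sensor frame. All sizes are illustrative."""
    (cx, cy), (w, h) = eye_center, roi_size
    fw, fh = frame_size
    x = min(max(cx - w // 2, 0), fw - w)
    y = min(max(cy - h // 2, 0), fh - h)
    return (x, y, w, h)

def set_gaze_areas(right_eye, left_eye):
    """Gaze areas for both eyes, as the controller might pass them to the
    selected iris imager (hypothetical interface)."""
    return {"right": gaze_area(right_eye), "left": gaze_area(left_eye)}
```

Reading out only these per-eye regions, rather than full frames, is what makes the high-resolution iris imagers practical to use while the subject is moving.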
- the controller 500 can be configured as a computer device.
- FIG. 5 shows a configuration example of an information processing device (computer device) that can be used as the controller 500.
- the information processing device 600 includes a control unit (CPU: Central Processing Unit) 610, a storage unit 620, a ROM (Read Only Memory) 630, a RAM (Random Access Memory) 640, a communication interface (IF) 650, and a user interface 660.
- the communication interface 650 is an interface for connecting the information processing device 600 and a communication network via a wired communication means or a wireless communication means.
- the user interface 660 includes a display unit such as a display.
- the user interface 660 also includes an input unit such as a keyboard, a mouse, and a touch panel.
- the storage unit 620 is an auxiliary storage device that can hold various data.
- the storage unit 620 does not necessarily have to be a part of the information processing device 600, and may be an external storage device or a cloud storage connected to the information processing device 600 via a network.
- the ROM 630 is a non-volatile storage device.
- for the ROM 630, a semiconductor memory device such as a flash memory with a relatively small capacity is used, for example.
- the program executed by the CPU 610 can be stored in the storage unit 620 or the ROM 630.
- Non-transitory computer readable media include various types of tangible storage media.
- Examples of non-transitory computer-readable media include magnetic recording media such as flexible disks, magnetic tapes, and hard disks; magneto-optical recording media such as magneto-optical disks; optical disk media such as CDs (compact discs) and DVDs (digital versatile discs); and semiconductor memories such as mask ROMs, PROMs (programmable ROMs), EPROMs (erasable PROMs), flash ROMs, and RAMs.
- the program may also be supplied to the computer using various types of transitory computer-readable media.
- Examples of transitory computer-readable media include electrical signals, optical signals, and electromagnetic waves.
- a transitory computer-readable medium can supply the program to the computer via a wired communication path, such as an electric wire or an optical fiber, or via a wireless communication path.
- the RAM 640 is a volatile storage device.
- various semiconductor memory devices such as DRAM (Dynamic Random Access Memory) or SRAM (Static Random Access Memory) are used.
- the RAM 640 can be used as an internal buffer that temporarily stores data and the like.
- the CPU 610 loads the program stored in the storage unit 620 or the ROM 630 into the RAM 640 and executes it, thereby carrying out various controls, including, for example, the guidance control, illumination control, and iris imaging control described above, as well as various processes, including the image processing method related to blinking.
- [Appendix 1] An image processing system comprising: a plurality of iris imaging means arranged at mutually different positions within the same field of view; a whole imaging means that captures images over a field of view wider than that of the iris imaging means; a guiding means for guiding a subject; an illumination means for illuminating the subject with light; and a control means that uses an image from the whole imaging means to control at least one of: reading out images from the plurality of iris imaging means, presenting at least one of video and audio via the guiding means, and irradiating light from the illumination means, wherein the control means further predicts a period in which the subject's next blink will occur and a period in which the subject will pass through the in-focus position of the iris imaging means, and, when the period in which the next blink occurs overlaps the period of passing through the in-focus position, controls at least one of the guiding means and the iris imaging means so that the in-focus position differs from the position at which the subject makes the next blink.
- [Appendix 2] The image processing system according to appendix 1, wherein the control means reads out images from the plurality of iris imaging means, and in reading out the images, identifies, based on the video captured by the whole imaging means, an iris imaging means capable of imaging the subject's eyes among the plurality of iris imaging means, sets in the identified iris imaging means a gaze area containing the positions of the subject's eyes, and acquires the video of the gaze area from the identified iris imaging means.
- [Appendix 3] The image processing system according to appendix 1 or 2, wherein, when the period in which the next blink occurs overlaps the period of passing through the in-focus position, the control means controls the guiding means so that the subject does not blink at the in-focus position.
- [Appendix 4] The image processing system according to appendix 1 or 2, wherein, when the period in which the next blink occurs overlaps the period of passing through the in-focus position, the control means controls the iris imaging means so that the in-focus position becomes a position other than the subject's position during the period in which the next blink occurs.
- [Appendix 5] The image processing system according to appendix 3 or 4, wherein the control means predicts the period in which the next blink will occur based on the subject's previous blink time, blink interval, and blink duration.
- [Appendix 6] The image processing system according to any one of appendices 3 to 5, wherein the control means predicts the period of passing through the in-focus position based on the subject's position and moving speed.
- [Appendix 7] An image processing apparatus comprising a control means that predicts a period in which a moving subject's next blink will occur and a period in which the subject will pass through the in-focus position of an iris imaging means for imaging the subject's iris, and, when the period in which the next blink occurs overlaps the period of passing through the in-focus position, controls a guiding means for guiding the subject so that the subject does not blink at the in-focus position.
- [Appendix 8] The image processing apparatus according to appendix 7, wherein the control means predicts the period in which the next blink will occur based on the subject's previous blink time, blink interval, and blink duration.
- [Appendix 9] The image processing apparatus according to appendix 7 or 8, wherein the control means predicts the period of passing through the in-focus position based on the subject's position and moving speed.
- [Appendix 10] The image processing apparatus according to any one of appendices 7 to 9, wherein, when the period in which the next blink occurs overlaps the period of passing through the in-focus position, the control means controls the guiding means so as to delay the time at which the subject reaches the in-focus position.
- [Appendix 11] The image processing apparatus according to any one of appendices 7 to 9, wherein, when the period in which the next blink occurs overlaps the period of passing through the in-focus position, the control means controls the guiding means so as to advance the time at which the subject reaches the in-focus position.
- [Appendix 12] The image processing apparatus according to any one of appendices 7 to 9, wherein, when the period in which the next blink occurs overlaps the period of passing through the in-focus position, the control means controls the guiding means so as to make the subject blink.
- [Appendix 13] An image processing apparatus comprising a control means that predicts a period in which a moving subject's next blink will occur and a period in which the subject will pass through the in-focus position of an iris imaging means for imaging the subject's iris, and, when the period in which the next blink occurs overlaps the period of passing through the in-focus position, controls the iris imaging means so that the in-focus position becomes a position other than the subject's position during the period in which the next blink occurs.
- [Appendix 14] The image processing apparatus according to appendix 13, wherein the control means predicts the period in which the next blink will occur based on the subject's previous blink time, blink interval, and blink duration.
- [Appendix 15] The image processing apparatus according to appendix 13 or 14, wherein the control means predicts the period of passing through the in-focus position based on the subject's position and moving speed.
- [Appendix 16] An image processing method that uses an image from a whole imaging means, which captures images over a field of view wider than that of a plurality of iris imaging means arranged at mutually different positions within the same field of view, to perform at least one of: reading out images from the plurality of iris imaging means, presenting at least one of video and audio via a guiding means for guiding a subject, and irradiating the subject with light from an illumination means.
- [Appendix 17] An image processing method comprising: predicting a period in which a moving subject's next blink will occur; predicting a period in which the subject will pass through the in-focus position of an iris imaging means for imaging the subject's iris; determining whether the period in which the next blink occurs overlaps the period of passing through the in-focus position; and, when they are determined to overlap, performing at least one of guiding the subject and controlling the iris imaging means so that the in-focus position differs from the position at which the subject makes the next blink.
- [Appendix 18] The image processing method according to appendix 17, wherein, when it is determined that the period in which the next blink occurs overlaps the period of passing through the in-focus position, the subject is guided so that the subject does not blink at the in-focus position.
- [Appendix 19] The image processing method according to appendix 17 or 18, wherein, when it is determined that the period in which the next blink occurs overlaps the period of passing through the in-focus position, the iris imaging means is controlled so that the in-focus position becomes a position other than the subject's position during the period in which the next blink occurs.
- [Appendix 20] A non-transitory computer-readable medium storing a program for causing a computer to execute a process of using an image from a whole imaging means, which captures images over a field of view wider than that of a plurality of iris imaging means arranged at mutually different positions within the same field of view, to perform at least one of: reading out images from the plurality of iris imaging means, presenting at least one of video and audio via a guiding means for guiding a subject, and irradiating the subject with light from an illumination means.
- [Appendix 21] A non-transitory computer-readable medium storing a program for causing a computer to execute a process comprising: predicting a period in which a moving subject's next blink will occur; predicting a period in which the subject will pass through the in-focus position of an iris imaging means for imaging the subject's iris; determining whether the period in which the next blink occurs overlaps the period of passing through the in-focus position; and, when they are determined to overlap, performing at least one of guiding the subject and controlling the iris imaging means so that the in-focus position differs from the position at which the subject makes the next blink.
- [Appendix 22] The non-transitory computer-readable medium according to appendix 21, wherein, when it is determined that the period in which the next blink occurs overlaps the period of passing through the in-focus position, the program causes the computer to execute a process of guiding the subject so that the subject does not blink at the in-focus position.
- [Appendix 23] The non-transitory computer-readable medium according to appendix 21 or 22, wherein, when it is determined that the period in which the next blink occurs overlaps the period of passing through the in-focus position, the program causes the computer to execute a process of controlling the iris imaging means so that the in-focus position becomes a position other than the subject's position during the period in which the next blink occurs.
Abstract
Description
A plurality of iris imaging means arranged at mutually different positions within the same field of view;
a whole imaging means that captures images over a field of view wider than that of the iris imaging means;
a guiding means for guiding a subject;
an illumination means for illuminating the subject with light; and
a control means that uses an image from the whole imaging means to control at least one of: reading out images from the plurality of iris imaging means, presenting at least one of video and audio via the guiding means, and irradiating light from the illumination means,
wherein the control means further predicts a period in which the subject's next blink will occur and a period in which the subject will pass through the in-focus position of the iris imaging means, and, when the period in which the next blink occurs overlaps the period of passing through the in-focus position, controls at least one of the guiding means and the iris imaging means so that the in-focus position differs from the position at which the subject makes the next blink: an image processing system.
The control means reads out images from the plurality of iris imaging means, and
in reading out the images from the plurality of iris imaging means, the control means identifies, based on the video captured by the whole imaging means, an iris imaging means capable of imaging the subject's eyes among the plurality of iris imaging means, sets in the identified iris imaging means a gaze area containing the positions of the subject's eyes, and acquires the video of the gaze area from the identified iris imaging means. The image processing system according to appendix 1.
When the period in which the next blink occurs overlaps the period of passing through the in-focus position, the control means controls the guiding means so that the subject does not blink at the in-focus position. The image processing system according to appendix 1 or 2.
When the period in which the next blink occurs overlaps the period of passing through the in-focus position, the control means controls the iris imaging means so that the in-focus position becomes a position other than the subject's position during the period in which the next blink occurs. The image processing system according to appendix 1 or 2.
The control means predicts the period in which the next blink will occur based on the subject's previous blink time, blink interval, and blink duration. The image processing system according to appendix 3 or 4.
The control means predicts the period of passing through the in-focus position based on the subject's position and moving speed. The image processing system according to any one of appendices 3 to 5.
An image processing apparatus comprising a control means that predicts a period in which a moving subject's next blink will occur and a period in which the subject will pass through the in-focus position of an iris imaging means for imaging the subject's iris, and, when the period in which the next blink occurs overlaps the period of passing through the in-focus position, controls a guiding means for guiding the subject so that the subject does not blink at the in-focus position.
The control means predicts the period in which the next blink will occur based on the subject's previous blink time, blink interval, and blink duration. The image processing apparatus according to appendix 7.
The control means predicts the period of passing through the in-focus position based on the subject's position and moving speed. The image processing apparatus according to appendix 7 or 8.
When the period in which the next blink occurs overlaps the period of passing through the in-focus position, the control means controls the guiding means so as to delay the time at which the subject reaches the in-focus position. The image processing apparatus according to any one of appendices 7 to 9.
When the period in which the next blink occurs overlaps the period of passing through the in-focus position, the control means controls the guiding means so as to advance the time at which the subject reaches the in-focus position. The image processing apparatus according to any one of appendices 7 to 9.
When the period in which the next blink occurs overlaps the period of passing through the in-focus position, the control means controls the guiding means so as to make the subject blink. The image processing apparatus according to any one of appendices 7 to 9.
An image processing apparatus comprising a control means that predicts a period in which a moving subject's next blink will occur and a period in which the subject will pass through the in-focus position of an iris imaging means for imaging the subject's iris, and, when the period in which the next blink occurs overlaps the period of passing through the in-focus position, controls the iris imaging means so that the in-focus position becomes a position other than the subject's position during the period in which the next blink occurs.
The control means predicts the period in which the next blink will occur based on the subject's previous blink time, blink interval, and blink duration. The image processing apparatus according to appendix 13.
The control means predicts the period of passing through the in-focus position based on the subject's position and moving speed. The image processing apparatus according to appendix 13 or 14.
An image processing method that uses an image from a whole imaging means, which captures images over a field of view wider than that of a plurality of iris imaging means arranged at mutually different positions within the same field of view, to perform at least one of: reading out images from the plurality of iris imaging means, presenting at least one of video and audio via a guiding means for guiding a subject, and irradiating the subject with light from an illumination means.
Predicting a period in which the subject will pass through the in-focus position of an iris imaging means for imaging the subject's iris;
determining whether the period in which the next blink occurs overlaps the period of passing through the in-focus position; and,
when they are determined to overlap, performing at least one of guiding the subject and controlling the iris imaging means so that the in-focus position differs from the position at which the subject makes the next blink: an image processing method.
When it is determined that the period in which the next blink occurs overlaps the period of passing through the in-focus position, the subject is guided so that the subject does not blink at the in-focus position. The image processing method according to appendix 17.
When it is determined that the period in which the next blink occurs overlaps the period of passing through the in-focus position, the iris imaging means is controlled so that the in-focus position becomes a position other than the subject's position during the period in which the next blink occurs. The image processing method according to appendix 17 or 18.
A non-transitory computer-readable medium storing a program for causing a computer to execute a process of using an image from a whole imaging means, which captures images over a field of view wider than that of a plurality of iris imaging means arranged at mutually different positions within the same field of view, to perform at least one of: reading out images from the plurality of iris imaging means, presenting at least one of video and audio via a guiding means for guiding a subject, and irradiating the subject with light from an illumination means.
Predicting a period in which a moving subject's next blink will occur;
predicting a period in which the subject will pass through the in-focus position of an iris imaging means for imaging the subject's iris;
determining whether the period in which the next blink occurs overlaps the period of passing through the in-focus position; and,
when they are determined to overlap, performing at least one of guiding the subject and controlling the iris imaging means so that the in-focus position differs from the position at which the subject makes the next blink: a non-transitory computer-readable medium storing a program for causing a computer to execute the above process.
When it is determined that the period in which the next blink occurs overlaps the period of passing through the in-focus position, the program causes the computer to execute a process of guiding the subject so that the subject does not blink at the in-focus position. The non-transitory computer-readable medium according to appendix 21.
When it is determined that the period in which the next blink occurs overlaps the period of passing through the in-focus position, the program causes the computer to execute a process of controlling the iris imaging means so that the in-focus position becomes a position other than the subject's position during the period in which the next blink occurs. The non-transitory computer-readable medium according to appendix 21 or 22.
200: guide
300: illuminator
401-404: iris imagers
500: controller
600: information processing device
Claims (23)
- An image processing system comprising: a plurality of iris imaging means arranged at mutually different positions within the same field of view; a whole imaging means that captures images over a field of view wider than that of the iris imaging means; a guiding means for guiding a subject; an illumination means for illuminating the subject with light; and a control means that uses an image from the whole imaging means to control at least one of: reading out images from the plurality of iris imaging means, presenting at least one of video and audio via the guiding means, and irradiating light from the illumination means, wherein the control means further predicts a period in which the subject's next blink will occur and a period in which the subject will pass through the in-focus position of the iris imaging means, and, when the period in which the next blink occurs overlaps the period of passing through the in-focus position, controls at least one of the guiding means and the iris imaging means so that the in-focus position differs from the position at which the subject makes the next blink.
- The image processing system according to claim 1, wherein the control means reads out images from the plurality of iris imaging means, and in reading out the images, identifies, based on the video captured by the whole imaging means, an iris imaging means capable of imaging the subject's eyes among the plurality of iris imaging means, sets in the identified iris imaging means a gaze area containing the positions of the subject's eyes, and acquires the video of the gaze area from the identified iris imaging means.
- The image processing system according to claim 1 or 2, wherein, when the period in which the next blink occurs overlaps the period of passing through the in-focus position, the control means controls the guiding means so that the subject does not blink at the in-focus position.
- The image processing system according to claim 1 or 2, wherein, when the period in which the next blink occurs overlaps the period of passing through the in-focus position, the control means controls the iris imaging means so that the in-focus position becomes a position other than the subject's position during the period in which the next blink occurs.
- The image processing system according to claim 3 or 4, wherein the control means predicts the period in which the next blink will occur based on the subject's previous blink time, blink interval, and blink duration.
- The image processing system according to any one of claims 3 to 5, wherein the control means predicts the period of passing through the in-focus position based on the subject's position and moving speed.
- An image processing apparatus comprising a control means that predicts a period in which a moving subject's next blink will occur and a period in which the subject will pass through the in-focus position of an iris imaging means for imaging the subject's iris, and, when the period in which the next blink occurs overlaps the period of passing through the in-focus position, controls a guiding means for guiding the subject so that the subject does not blink at the in-focus position.
- The image processing apparatus according to claim 7, wherein the control means predicts the period in which the next blink will occur based on the subject's previous blink time, blink interval, and blink duration.
- The image processing apparatus according to claim 7 or 8, wherein the control means predicts the period of passing through the in-focus position based on the subject's position and moving speed.
- The image processing apparatus according to any one of claims 7 to 9, wherein, when the period in which the next blink occurs overlaps the period of passing through the in-focus position, the control means controls the guiding means so as to delay the time at which the subject reaches the in-focus position.
- The image processing apparatus according to any one of claims 7 to 9, wherein, when the period in which the next blink occurs overlaps the period of passing through the in-focus position, the control means controls the guiding means so as to advance the time at which the subject reaches the in-focus position.
- The image processing apparatus according to any one of claims 7 to 9, wherein, when the period in which the next blink occurs overlaps the period of passing through the in-focus position, the control means controls the guiding means so as to make the subject blink.
- An image processing apparatus comprising a control means that predicts a period in which a moving subject's next blink will occur and a period in which the subject will pass through the in-focus position of an iris imaging means for imaging the subject's iris, and, when the period in which the next blink occurs overlaps the period of passing through the in-focus position, controls the iris imaging means so that the in-focus position becomes a position other than the subject's position during the period in which the next blink occurs.
- The image processing apparatus according to claim 13, wherein the control means predicts the period in which the next blink will occur based on the subject's previous blink time, blink interval, and blink duration.
- The image processing apparatus according to claim 13 or 14, wherein the control means predicts the period of passing through the in-focus position based on the subject's position and moving speed.
- An image processing method that uses an image from a whole imaging means, which captures images over a field of view wider than that of a plurality of iris imaging means arranged at mutually different positions within the same field of view, to perform at least one of: reading out images from the plurality of iris imaging means, presenting at least one of video and audio via a guiding means for guiding a subject, and irradiating the subject with light from an illumination means.
- An image processing method comprising: predicting a period in which a moving subject's next blink will occur; predicting a period in which the subject will pass through the in-focus position of an iris imaging means for imaging the subject's iris; determining whether the period in which the next blink occurs overlaps the period of passing through the in-focus position; and, when they are determined to overlap, performing at least one of guiding the subject and controlling the iris imaging means so that the in-focus position differs from the position at which the subject makes the next blink.
- The image processing method according to claim 17, wherein, when it is determined that the period in which the next blink occurs overlaps the period of passing through the in-focus position, the subject is guided so that the subject does not blink at the in-focus position.
- The image processing method according to claim 17 or 18, wherein, when it is determined that the period in which the next blink occurs overlaps the period of passing through the in-focus position, the iris imaging means is controlled so that the in-focus position becomes a position other than the subject's position during the period in which the next blink occurs.
- A non-transitory computer-readable medium storing a program for causing a computer to execute a process of using an image from a whole imaging means, which captures images over a field of view wider than that of a plurality of iris imaging means arranged at mutually different positions within the same field of view, to perform at least one of: reading out images from the plurality of iris imaging means, presenting at least one of video and audio via a guiding means for guiding a subject, and irradiating the subject with light from an illumination means.
- A non-transitory computer-readable medium storing a program for causing a computer to execute a process comprising: predicting a period in which a moving subject's next blink will occur; predicting a period in which the subject will pass through the in-focus position of an iris imaging means for imaging the subject's iris; determining whether the period in which the next blink occurs overlaps the period of passing through the in-focus position; and, when they are determined to overlap, performing at least one of guiding the subject and controlling the iris imaging means so that the in-focus position differs from the position at which the subject makes the next blink.
- The non-transitory computer-readable medium according to claim 21, wherein, when it is determined that the period in which the next blink occurs overlaps the period of passing through the in-focus position, the program causes the computer to execute a process of guiding the subject so that the subject does not blink at the in-focus position.
- The non-transitory computer-readable medium according to claim 21 or 22, wherein, when it is determined that the period in which the next blink occurs overlaps the period of passing through the in-focus position, the program causes the computer to execute a process of controlling the iris imaging means so that the in-focus position becomes a position other than the subject's position during the period in which the next blink occurs.
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US17/431,550 US11983961B2 (en) | 2019-02-18 | 2020-02-12 | Image processing apparatus, method, system, and computer readable medium |
JP2021501882A JP7276425B2 (ja) | 2019-02-18 | 2020-02-12 | Image processing device, method, system, and program |
EP20758798.1A EP3929859A4 (en) | 2019-02-18 | 2020-02-12 | IMAGE PROCESSING DEVICE, METHOD AND SYSTEM, AND COMPUTER READABLE MEDIA |
BR112021016144-9A BR112021016144A2 (pt) | 2019-02-18 | 2020-02-12 | Aparelho, método e sistema de processamento de imagem e meio legível por computador |
SG11202108914SA SG11202108914SA (en) | 2019-02-18 | 2020-02-12 | Image processing apparatus, method, system, and computer readable medium |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2019-026941 | 2019-02-18 | ||
JP2019026941 | 2019-02-18 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2020170894A1 true WO2020170894A1 (ja) | 2020-08-27 |
Family
ID=72144651
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2020/005220 WO2020170894A1 (ja) | 2020-02-12 | Image processing device, method, system, and computer-readable medium |
Country Status (6)
Country | Link |
---|---|
US (1) | US11983961B2 (ja) |
EP (1) | EP3929859A4 (ja) |
JP (2) | JP7276425B2 (ja) |
BR (1) | BR112021016144A2 (ja) |
SG (1) | SG11202108914SA (ja) |
WO (1) | WO2020170894A1 (ja) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2022080293A1 (ja) * | 2020-10-16 | 2022-04-21 | 日本電気株式会社 | Authentication system, authentication method, and program recording medium |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11935328B2 (en) * | 2019-02-18 | 2024-03-19 | Nec Corporation | Image pick-up apparatus, method, system, and computer readable medium |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2007159610A (ja) * | 2005-12-09 | 2007-06-28 | Matsushita Electric Ind Co Ltd | Registration device, authentication device, registration/authentication device, registration method, authentication method, registration program, and authentication program |
WO2009016846A1 (ja) * | 2007-08-02 | 2009-02-05 | Panasonic Corporation | Iris authentication device and iris authentication system |
JP2017517165A (ja) * | 2014-03-10 | 2017-06-22 | クゥアルコム・インコーポレイテッドQualcomm Incorporated | Avoiding blinks and averted gazes in photographic images |
JP2019026941A (ja) | 2015-09-28 | 2019-02-21 | 日本軽金属株式会社 | Conductive member and method for manufacturing same |
Family Cites Families (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2003050995A (ja) | 2001-08-03 | 2003-02-21 | Casio Comput Co Ltd | Iris image input device and iris image input processing program |
US8896725B2 (en) | 2007-06-21 | 2014-11-25 | Fotonation Limited | Image capture device with contemporaneous reference image capture mechanism |
JP2006212185A (ja) | 2005-02-03 | 2006-08-17 | Matsushita Electric Ind Co Ltd | Biometric discrimination device, authentication device, and biometric discrimination method |
EP1748378B1 (en) * | 2005-07-26 | 2009-09-16 | Canon Kabushiki Kaisha | Image capturing apparatus and image capturing method |
US8121356B2 (en) * | 2006-09-15 | 2012-02-21 | Identix Incorporated | Long distance multimodal biometric system and method |
US9036871B2 (en) * | 2007-09-01 | 2015-05-19 | Eyelock, Inc. | Mobility identity platform |
US9888839B2 (en) * | 2009-04-01 | 2018-02-13 | Tearscience, Inc. | Methods and apparatuses for determining contact lens intolerance in contact lens wearer patients based on dry eye tear film characteristic analysis and dry eye symptoms |
US20170329138A1 (en) * | 2016-05-16 | 2017-11-16 | Osterhout Group, Inc. | Eye imaging systems for head-worn computers |
ES2610196A1 (es) | 2016-12-20 | 2017-04-26 | Universitat D'alacant / Universidad De Alicante | Method and device for biometric authentication by means of blink recognition |
-
2020
- 2020-02-12 SG SG11202108914SA patent/SG11202108914SA/en unknown
- 2020-02-12 EP EP20758798.1A patent/EP3929859A4/en not_active Withdrawn
- 2020-02-12 WO PCT/JP2020/005220 patent/WO2020170894A1/ja unknown
- 2020-02-12 BR BR112021016144-9A patent/BR112021016144A2/pt not_active Application Discontinuation
- 2020-02-12 JP JP2021501882A patent/JP7276425B2/ja active Active
- 2020-02-12 US US17/431,550 patent/US11983961B2/en active Active
-
2023
- 2023-04-28 JP JP2023074482A patent/JP7468746B2/ja active Active
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2007159610A (ja) * | 2005-12-09 | 2007-06-28 | Matsushita Electric Ind Co Ltd | Registration device, authentication device, registration/authentication device, registration method, authentication method, registration program, and authentication program |
WO2009016846A1 (ja) * | 2007-08-02 | 2009-02-05 | Panasonic Corporation | Iris authentication device and iris authentication system |
JP2017517165A (ja) * | 2014-03-10 | 2017-06-22 | クゥアルコム・インコーポレイテッドQualcomm Incorporated | Avoiding blinks and averted gazes in photographic images |
JP2019026941A (ja) | 2015-09-28 | 2019-02-21 | 日本軽金属株式会社 | Conductive member and method for manufacturing same |
Non-Patent Citations (3)
Title |
---|
DAUGMAN, HOW IRIS RECOGNITION WORKS, Retrieved from the Internet <URL:https://www.cl.cam.ac.uk/~jgd1000/irisrecog.pdf> |
HOSOYA: "Identification System by Iris Recognition", vol. 44, 2006, JAPANESE SOCIETY FOR MEDICAL AND BIOLOGICAL ENGINEERING, pages: 33 - 39 |
See also references of EP3929859A4 |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2022080293A1 (ja) * | 2020-10-16 | 2022-04-21 | 日本電気株式会社 | Authentication system, authentication method, and program recording medium |
EP4231176A4 (en) * | 2020-10-16 | 2024-03-13 | NEC Corporation | AUTHENTICATION SYSTEM, AUTHENTICATION METHOD, PROGRAM STORAGE MEDIUM |
Also Published As
Publication number | Publication date |
---|---|
JPWO2020170894A1 (ja) | 2021-12-16 |
SG11202108914SA (en) | 2021-09-29 |
US11983961B2 (en) | 2024-05-14 |
US20220139114A1 (en) | 2022-05-05 |
EP3929859A4 (en) | 2022-04-13 |
JP7276425B2 (ja) | 2023-05-18 |
JP7468746B2 (ja) | 2024-04-16 |
EP3929859A1 (en) | 2021-12-29 |
BR112021016144A2 (pt) | 2021-10-13 |
JP2023106426A (ja) | 2023-08-01 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP7468746B2 (ja) | Image processing system | |
JP7276427B2 (ja) | Image processing device, method, system, and program | |
JP7276426B2 (ja) | Illumination control device, method, system, and program | |
JP7211483B2 (ja) | Image processing device, method, system, and program | |
US10497132B2 (en) | Irradiation system, irradiation method, and program storage medium | |
WO2020170892A1 (ja) | Imaging device, method, system, and computer-readable medium | |
US12033433B2 (en) | Iris recognition apparatus, iris recognition method, computer program and recording medium | |
WO2021199188A1 (ja) | Imaging system, imaging method, and non-transitory computer-readable medium storing an imaging program | |
JP2008052317A (ja) | Authentication device and authentication method | |
WO2021111548A1 (ja) | Information presentation system, information presentation method, computer program, and authentication system | |
WO2020261424A1 (ja) | Iris authentication device, iris authentication method, computer program, and recording medium | |
WO2023229498A1 (en) | Biometric identification system and method for biometric identification | |
JP7355213B2 (ja) | Image acquisition device, image acquisition method, and image processing device | |
WO2022201240A1 (ja) | Iris imaging system, iris imaging method, and computer program |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 20758798 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2021501882 Country of ref document: JP Kind code of ref document: A |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
REG | Reference to national code |
Ref country code: BR Ref legal event code: B01A Ref document number: 112021016144 Country of ref document: BR |
|
ENP | Entry into the national phase |
Ref document number: 2020758798 Country of ref document: EP Effective date: 20210920 |
|
ENP | Entry into the national phase |
Ref document number: 112021016144 Country of ref document: BR Kind code of ref document: A2 Effective date: 20210816 |