WO2020183732A1 - Information processing device, information processing system, information processing method, and recording medium - Google Patents
Information processing device, information processing system, information processing method, and recording medium
- Publication number
- WO2020183732A1 (PCT/JP2019/010697)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- line
- iris
- information processing
- target person
- image
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/18—Eye characteristics, e.g. of the iris
- G06V40/19—Sensors therefor
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/64—Computer-aided capture of images, e.g. transfer from script file into camera, check of taken image quality, advice or proposal for image composition or decision on when to take image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T1/00—General purpose image data processing
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
- G06T7/73—Determining position or orientation of objects or cameras using feature-based methods
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/172—Classification, e.g. identification
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/18—Eye characteristics, e.g. of the iris
- G06V40/197—Matching; Classification
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/61—Control of cameras or camera modules based on recognised objects
- H04N23/611—Control of cameras or camera modules based on recognised objects where the recognised objects include parts of the human body
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30196—Human being; Person
- G06T2207/30201—Face
Definitions
- the present invention relates to an information processing device, an information processing system, an information processing method and a recording medium.
- Patent Document 1 discloses a system in which a user's iris image is captured by a near-infrared camera and the user is authenticated based on the degree of similarity between an iris code generated from that iris image and the registered iris code of a registrant.
- The system illustrated in Patent Document 1 is premised on the user aligning his or her own eyes with the near-infrared camera and remaining stationary while the iris image is captured. There is therefore room for improvement from the viewpoint of user convenience.
- An object of the present invention is to provide an information processing device, an information processing system, an information processing method, and a recording medium that can improve user convenience in iris recognition.
- According to one aspect, there is provided an information processing device that controls an iris matching system, including: an acquisition unit that acquires line-of-sight information of a collation target person from an image captured by a first photographing device; and a control unit that controls the iris matching system so that the line-of-sight direction obtained from the line-of-sight information and the shooting direction of a second photographing device, which captures the iris image used for iris matching of the collation target person, face each other.
- According to another aspect, there is provided an information processing system including: a first photographing device that captures a first image including at least a part of the face of a collation target person; a second photographing device that captures a second image including the iris of the collation target person; and an information processing device, wherein the information processing device includes an acquisition unit that acquires line-of-sight information of the collation target person based on the first image, and a control unit that controls the information processing system so that the line-of-sight direction of the collation target person obtained from the line-of-sight information and the shooting direction of the second photographing device face each other.
- According to another aspect, there is provided an information processing method for controlling an iris matching system, including: a step of acquiring line-of-sight information of a collation target person from an image captured by a first photographing device; and a step of controlling the iris matching system so that the line-of-sight direction obtained from the line-of-sight information and the shooting direction of a second photographing device, which captures the iris image used for iris matching of the collation target person, face each other.
- According to another aspect, there is provided a recording medium on which is recorded a program causing a computer that controls an iris matching system to execute: a step of acquiring line-of-sight information of a collation target person from an image captured by a first photographing device; and a step of controlling the iris matching system so that the line-of-sight direction obtained from the line-of-sight information and the shooting direction of a second photographing device, which captures the iris image used for iris matching of the collation target person, face each other.
- FIG. 1 is a diagram showing an overall configuration example of the iris matching system 1 according to the present embodiment.
- the iris matching system 1 has a function of iris matching, which is a kind of biological matching.
- the iris collation system 1 performs iris collation by photographing the iris of the user who is the collation target person and collating it with the registered iris image.
- The pattern of the iris is unique to each individual and remains unchanged throughout life. Therefore, identity can be confirmed by collating the iris pattern acquired at the time of matching with an iris image registered in advance in a database.
- The iris verification system 1 in the present embodiment can be applied, for example, to identity verification for entry and departure at airports, seaports, and borders, identity verification at administrative agencies, identity verification for entering and leaving factories and business establishments, and identity verification at admission to event venues.
- The iris collation system 1 includes an information processing device 10, a line-of-sight detection camera 20, an iris photographing camera 30, a distance sensor 40, a notification device 50, a collation device 60, and an iris database 70.
- These devices are connected to one another via a network NW such as a LAN (Local Area Network) or the Internet.
- The information processing device 10 controls the iris photographing camera 30 to capture an iris image of the collation target person. The information processing device 10 then controls the collation device 60 to collate the captured iris image with the iris images of registrants stored in advance in the iris database 70. Conversely, when the angle formed by the line-of-sight direction and the shooting direction does not satisfy a predetermined determination criterion (allowable range), the information processing device 10 controls the notification device 50 to alert the collation target person so that his or her line-of-sight direction is directed toward the iris photographing camera 30. That is, the information processing device 10 of the present embodiment controls the notification device 50 so as to reduce the angle formed by the line-of-sight direction of the collation target person and the shooting direction of the photographing device.
- the line-of-sight detection camera 20 is a photographing device (first photographing device) capable of photographing the face, eyes, and the like of the collation target person with visible light and acquiring an image.
- the line-of-sight detection camera 20 captures a first image including at least a part of the face of the collation target person.
- As the line-of-sight detection camera 20, a digital camera using a CMOS (Complementary Metal Oxide Semiconductor) image sensor or a CCD (Charge Coupled Device) image sensor can be used.
- the line-of-sight detection camera 20 may include a light source that irradiates the collation target with illumination light.
- the iris photographing camera 30 is a photographing device (second photographing device) including an infrared irradiation device 30a and an infrared camera 30b, and photographs a second image including the iris of the collation target person.
- the infrared irradiation device 30a includes a light emitting element that emits infrared light such as an infrared LED.
- the shooting wavelength of the iris shooting camera 30 is different from the shooting wavelength of the line-of-sight detection camera 20.
- the wavelength of infrared rays emitted from the infrared irradiation device 30a can be, for example, a near infrared region of about 800 nm.
- As the infrared camera 30b, a digital camera using a CMOS image sensor, a CCD image sensor, or the like whose light receiving elements are configured to have sensitivity to infrared rays can be used.
- This allows the iris image used for iris matching to be captured.
- the resolution of the second image is higher than the resolution of the first image.
- The distance sensor 40 irradiates an object with light rays such as infrared rays, detects the distance based on the time the irradiated light takes to travel to the object and back, and outputs a signal indicating the detected distance to the information processing device 10.
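As an illustration (not part of the publication), the time-of-flight principle described above, in which distance is derived from the round-trip time of the light ray, can be sketched as follows. The nanosecond figure in the comment is purely an example:

```python
# Time-of-flight sketch: distance = (speed of light x round-trip time) / 2.
SPEED_OF_LIGHT_M_PER_S = 299_792_458.0

def distance_from_round_trip(round_trip_seconds: float) -> float:
    """Distance to the object given the round-trip time of the light ray."""
    return SPEED_OF_LIGHT_M_PER_S * round_trip_seconds / 2.0

# A round trip of roughly 6.67 ns corresponds to a distance of about 1 m.
d = distance_from_round_trip(6.67e-9)
```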
- the notification device 50 is a device that alerts the collation target person to direct the line-of-sight direction toward the iris photographing camera 30 side based on the notification control information from the information processing device 10.
- the notification device 50 includes at least one of a display 50a, an LED 50b, and a speaker 50c.
- the notification control information in the present embodiment includes information for guiding the line-of-sight direction of the collation target person toward the iris photographing camera 30 side.
- the display 50a, the LED 50b, and the speaker 50c give the following notifications based on the notification control information.
- The display 50a notifies the collation target person whether the angle formed by his or her line-of-sight direction and the shooting direction of the iris photographing camera 30 satisfies a predetermined determination criterion. For example, the display 50a can indicate OK with a green screen, caution with a yellow screen, and that correction is required with a red screen.
- The LED 50b notifies the collation target person whether the angle formed by his or her line-of-sight direction and the shooting direction of the iris photographing camera 30 satisfies the predetermined determination criterion, by switching between lit and unlit states and by changing the lighting color. For example, a green lighting color can indicate OK, yellow caution, and red that correction is required.
- The speaker 50c outputs alarm sounds and guidance voices to notify the collation target person whether the angle formed by his or her line-of-sight direction and the shooting direction of the iris photographing camera 30 satisfies the predetermined determination criterion. For example, guidance voices such as "Look at the camera with the lamp on" or "Please shift your line of sight slightly to the right" may be output.
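The color-coded notifications described for the display 50a and LED 50b can be condensed into a small sketch. The 5-degree and 15-degree thresholds below are illustrative assumptions, not values taken from the publication:

```python
def notification_color(angle_deg: float,
                       ok_max_deg: float = 5.0,
                       caution_max_deg: float = 15.0) -> str:
    """Map the detected gaze/camera angle to a notification color.

    The threshold values are assumed for illustration only.
    """
    if angle_deg <= ok_max_deg:
        return "green"   # OK: line of sight and shooting direction face each other
    if angle_deg <= caution_max_deg:
        return "yellow"  # caution: a small correction is desirable
    return "red"         # correction required: guide the line of sight
```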
- The collation device 60, based on control information from the information processing device 10, executes a collation process between the iris image (or feature amount) captured by the iris photographing camera 30 and the registered iris images (or feature amounts) in the iris database 70, and thereby authenticates the collation target person.
- The iris database 70 is a database that stores, in association with a registrant's ID, an image of the registrant's eye serving as the collation destination, the iris image detected from that eye image, the iris feature amount calculated from the iris image, and the like.
- the collation device 60 and the iris database 70 may be configured as an integrated device with the information processing device 10.
- FIG. 2 is a block diagram showing a hardware configuration example of the information processing device 10 and the collation device 60 according to the present embodiment.
- The information processing device 10 is a computer that performs arithmetic, control, and storage operations, and includes a CPU (Central Processing Unit) 151, a RAM (Random Access Memory) 152, a ROM (Read Only Memory) 153, an HDD (Hard Disk Drive) 154, a communication I/F (interface) 155, a display device 156, and an input device 157.
- the CPU 151, RAM 152, ROM 153, HDD 154, communication I / F 155, display device 156, and input device 157 are connected to each other via a bus 158.
- the display device 156 and the input device 157 may be connected to the bus 158 via a drive device (not shown) for driving these devices.
- the CPU 151 is a processor having a function of performing a predetermined operation according to a program stored in the ROM 153, the HDD 154, etc., and controlling each part of the information processing device 10.
- the RAM 152 is composed of a volatile storage medium and provides a temporary memory area necessary for the operation of the CPU 151.
- the ROM 153 is composed of a non-volatile storage medium and stores necessary information such as a program used for the operation of the information processing apparatus 10.
- the HDD 154 is a storage device composed of a non-volatile storage medium and storing data required for processing, an operation program of the information processing device 10, and the like.
- The communication I/F 155 is a communication interface based on standards such as Ethernet (registered trademark), Wi-Fi (registered trademark), and 4G, and is a module for communicating with other devices.
- the display device 156 is a liquid crystal display, an OLED (Organic Light Emitting Diode) display, or the like, and is used for displaying moving images, still images, characters, and the like.
- the input device 157 is a keyboard, a pointing device, or the like, and is used for the user to operate the information processing device 10. Examples of pointing devices include mice, trackballs, touch panels, pen tablets and the like.
- the display device 156 and the input device 157 may be integrally formed as a touch panel.
- the collation device 60 includes a CPU 651, a RAM 652, a ROM 653, an HDD 654, and a communication I / F 655 as a computer that performs calculation, control, and storage. Since these devices are the same as the CPU 151, RAM 152, ROM 153, HDD 154 and communication I / F 155 of the information processing device 10, detailed description thereof will be omitted.
- the CPU 651, RAM 652, ROM 653, HDD 654 and communication I / F 655 are connected to each other via bus 658.
- the hardware configuration shown in FIG. 2 is an example, and devices other than these may be added, and some devices may not be provided. Further, some devices may be replaced with another device having the same function. Further, some functions of the present embodiment may be provided by other devices via the network NW, or the functions of the present embodiment may be distributed and realized by a plurality of devices.
- the HDD 154 may be replaced with an SSD (Solid State Drive) using a semiconductor memory, or may be replaced with a cloud storage.
- FIG. 3 is a block diagram showing the functions of the information processing device 10 and the collation device 60 in the present embodiment.
- the information processing device 10 includes an image acquisition unit 11, a line-of-sight information acquisition unit 12, a distance information acquisition unit 13, a control unit 14, and a storage unit 15.
- the CPU 151 loads the program stored in the ROM 153, the HDD 154, etc. into the RAM 152 and executes the program.
- As a result, the CPU 151 realizes the functions of the image acquisition unit 11, the line-of-sight information acquisition unit 12, the distance information acquisition unit 13, and the control unit 14 (line-of-sight direction estimation unit 14a, angle detection unit 14b, determination unit 14c, notification control unit 14d, and drive control unit 14e).
- the CPU 151 realizes the function of the storage unit 15 by controlling the HDD 154.
- the storage unit 15 stores data such as an image acquired by the image acquisition unit 11, a determination criterion of the line-of-sight direction, and line-of-sight information acquired from a face image.
- The collation device 60 includes an image acquisition unit 61, an iris image extraction unit 62, a coordinate conversion unit 63, a block division unit 64, a feature amount calculation unit 65, a collation unit 66, and a storage unit 67.
- the CPU 651 loads the program stored in the ROM 653 or the like into the RAM 652 and executes the program. As a result, the CPU 651 realizes the functions of the image acquisition unit 61, the iris image extraction unit 62, the coordinate conversion unit 63, the block division unit 64, the feature amount calculation unit 65, and the collation unit 66. Details of the processing performed in each of these parts will be described later.
- the CPU 651 realizes the function of the storage unit 67 by controlling the HDD 654.
- the storage unit 67 stores data such as an eye image acquired by the image acquisition unit 61, an iris image extracted from the eye image, and a feature amount calculated from the iris image.
- FIG. 4 is a flowchart showing an example of the control process of the information processing apparatus 10 according to the present embodiment.
- the process of FIG. 4 is an example, and the order of the processes can be changed as appropriate.
- the image acquisition unit 11 receives the captured image captured by the line-of-sight detection camera 20 (step S101). Since the captured image is used for estimating the line-of-sight direction, it is assumed that the captured image includes at least a part of the face of the collation target person.
- the distance information acquisition unit 13 receives a signal indicating the distance between the collation target person and the iris photographing camera 30 from the distance sensor 40 as distance information (step S102).
- the line-of-sight information acquisition unit 12 analyzes the captured image received in step S101 and acquires the line-of-sight information of the collation target person (step S103).
- The line-of-sight information in the present embodiment includes the face direction of the collation target person and the position information, within the image, of the eye region, the outer corner of the eye, the pupil, and the iris.
- the control unit 14 selects a method for estimating the line-of-sight direction of the collation target person based on the distance between the collation target person and the iris photographing camera 30 (step S104).
- two types of methods can be selected as the method for estimating the line-of-sight direction.
- The control unit 14 (line-of-sight direction estimation unit 14a) estimates the line-of-sight direction based on the face direction of the collation target person included in the line-of-sight information when the distance is equal to or greater than a predetermined reference distance (first method).
- The control unit 14 (line-of-sight direction estimation unit 14a) estimates the line-of-sight direction based on the position information of the pupil or iris of the collation target person included in the line-of-sight information when the distance is less than the reference distance (second method).
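The distance-based choice between the two estimation methods can be sketched as follows; the 2 m reference distance is an assumed value for illustration only, not one stated in the publication:

```python
def select_estimation_method(distance_m: float,
                             reference_distance_m: float = 2.0) -> str:
    """Choose the gaze-estimation method from the sensed distance.

    The reference distance is an illustrative assumption.
    """
    # Far away: the eye region is too small to resolve, so fall back
    # to estimating from the overall face direction (first method).
    if distance_m >= reference_distance_m:
        return "first method (face direction)"
    # Close up: the pupil/iris position can be resolved, so use it
    # directly (second method).
    return "second method (pupil or iris position)"
```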
- The first method is, for example, as follows. First, the line-of-sight information acquisition unit 12 extracts the face direction from the face region extracted from the face image by an arbitrary method. The line-of-sight information acquisition unit 12 acquires the positions of the left and right eyes (pupils) and the position of the nose from the face region by template matching. Next, taking the line connecting the midpoint of the left and right eye positions with the position of the nose as the center line of the face, the line-of-sight information acquisition unit 12 obtains the ratio of the distance from the center line to the left end of the face region to the distance from the center line to the right end of the face region.
- the line-of-sight information acquisition unit 12 calculates the face direction in the left-right direction using the left-right ratio in the face region based on the table showing the relationship between the left-right ratio and the face direction recorded in advance.
- the table showing the relationship between the left-right ratio and the face direction is determined in advance by experiments and simulations and is stored in the storage area.
- the line-of-sight information acquisition unit 12 calculates the face direction in the vertical direction using the positions of the constituent parts of the face.
- Specifically, the line connecting the positions of the left and right eyes is taken as the horizontal line of the face, and the ratio of the distance from the horizontal line to the upper end of the face region to the distance from the horizontal line to the lower end of the face region is obtained.
- the line-of-sight direction estimation unit 14a calculates the face direction in the vertical direction using the vertical ratio in the face region based on the table showing the relationship between the vertical ratio and the face direction recorded in advance.
- The face direction is represented by three angles, for example a pan angle, a tilt angle, and a roll angle.
- the line-of-sight information acquisition unit 12 holds the detected face direction as line-of-sight information in the storage area. Then, when the distance is equal to or greater than a predetermined reference distance, the control unit 14 (line-of-sight direction estimation unit 14a) calculates (estimates) the line-of-sight direction based on the face direction of the collation target person included in the line-of-sight information.
- FIG. 5 shows a case where the line-of-sight direction is estimated based on the face direction of the collation target person.
- The line-of-sight detection camera 20 is present in the front direction (the X-axis direction in the drawing) of the collation target person, and the face of the collation target person is directed upward by an angle θ.
- How much the face is tilted is detected relative to an upright, straight-ahead posture of the collation target person, and the line-of-sight direction is estimated from the face direction.
- The face direction and the line-of-sight direction of the collation target person do not always match. The method of estimating the line-of-sight direction from the face direction is therefore preferably applied not in the iris imaging section but while the collation target person is in a section far from the camera (for example, near the entrance of the line-of-sight adjustment section), in order to prepare the collation target person for shooting.
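The left/right-ratio lookup described for the first method can be sketched as below. The `RATIO_TO_PAN_DEG` table is invented for illustration; as the text notes, a real table would be determined in advance by experiments and simulations:

```python
import bisect

# Illustrative (assumed) table relating the left/right width ratio around
# the face center line to a horizontal (pan) face angle in degrees.
RATIO_TO_PAN_DEG = [(0.5, -45.0), (0.8, -20.0), (1.0, 0.0),
                    (1.25, 20.0), (2.0, 45.0)]

def pan_angle_from_ratio(left_right_ratio: float) -> float:
    """Linearly interpolate the pan angle from the left/right ratio."""
    ratios = [r for r, _ in RATIO_TO_PAN_DEG]
    angles = [a for _, a in RATIO_TO_PAN_DEG]
    if left_right_ratio <= ratios[0]:
        return angles[0]
    if left_right_ratio >= ratios[-1]:
        return angles[-1]
    i = bisect.bisect_left(ratios, left_right_ratio)
    r0, r1 = ratios[i - 1], ratios[i]
    a0, a1 = angles[i - 1], angles[i]
    t = (left_right_ratio - r0) / (r1 - r0)
    return a0 + t * (a1 - a0)

# A centered face (ratio 1.0) is facing straight ahead (0 degrees).
```

The vertical (tilt) angle would be obtained the same way from the above/below ratio around the horizontal line of the face.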
- the second method is, for example, as follows.
- the line-of-sight information acquisition unit 12 acquires, for example, the positions of the pupils of the left and right eyes as line-of-sight information from the face region extracted from the face image by template matching.
- the control unit 14 calculates the line-of-sight direction for each of the left and right eyes based on the relative position of the pupil from a predetermined reference point on the human face.
- the predetermined reference point is, for example, the position of the Purkinje image, which is the reflection point of light on the cornea, the position of the inner corner of the eye, and the like.
- The control unit 14 (line-of-sight direction estimation unit 14a) calculates the line-of-sight direction using the relative position of the pupil from the predetermined reference point in the face region, based on a pre-recorded table showing the relationship between that relative position and the line-of-sight direction.
- the table showing the relationship between the relative position of the pupil and the line-of-sight direction is determined in advance by experiments and simulations and is stored in the storage area.
- the line-of-sight direction is represented by two angles, for example, an azimuth angle and an elevation angle, for each of the left and right eyes.
- the line-of-sight direction may be represented by the average value of the line-of-sight directions of the left and right eyes.
- the line-of-sight direction estimation unit 14a holds the calculated line-of-sight direction in the storage area.
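A hedged sketch of the second method: the pupil position relative to a reference point (the inner corner of the eye or the Purkinje image) is mapped to azimuth and elevation angles. The linear gain below stands in for the pre-recorded lookup table mentioned in the text and is purely an illustrative assumption:

```python
def gaze_direction_from_pupil(reference_xy, pupil_xy,
                              gain_deg_per_px: float = 0.5):
    """Estimate (azimuth, elevation) in degrees from the pupil position
    relative to a reference point in the eye image.

    The linear gain is an assumed stand-in for the calibrated table.
    """
    dx = pupil_xy[0] - reference_xy[0]   # + : pupil shifted rightward in image
    dy = pupil_xy[1] - reference_xy[1]   # + : pupil shifted downward in image
    azimuth = gain_deg_per_px * dx
    elevation = -gain_deg_per_px * dy    # image y grows downward
    return azimuth, elevation
```

Per the text, this would be evaluated for each of the left and right eyes, and the two results may be averaged.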
- FIGS. 6A and 6B show cases where the line-of-sight direction of the collation target person is estimated from the positional relationship between the inner corner C of the eye, taken as the reference point, and the iris I, taken as the moving point, in an eye image of the collation target person.
- In FIG. 6A, the iris I is separated from the inner corner C in the left eye of the collation target person. In this case, it is presumed that the collation target person is looking toward the outer corner of the eye.
- In FIG. 6B, the iris I is located on the inner corner C side of the left eye of the collation target person. In this case, it is presumed that the collation target person is looking toward the inner corner of the eye.
- FIGS. 7A and 7B show cases where the face of the collation target person is irradiated with illumination light from an infrared LED or the like, and the corneal reflection point R in the eye image is taken as the reference point while the pupil P is taken as the moving point.
- In FIG. 7A, the pupil P is located on the outer-corner side of the corneal reflection point R in the left eye of the collation target person. In this case, it is presumed that the collation target person is looking toward the outer corner of the eye.
- In FIG. 7B, the pupil P is located on the inner-corner side of the corneal reflection point R in the left eye of the collation target person. In this case, it is presumed that the collation target person is looking toward the inner corner of the eye.
- Although a plurality of methods for estimating the line-of-sight direction have been described with reference to FIGS. 5, 6A, 6B, 7A, and 7B, it is preferable to select among these methods in comprehensive consideration of the distance between the collation target person and the camera and of the introduction cost.
- the estimation method based on the face direction shown in FIG. 5 is effective when the distance to the collation target person is long and the inside of the eye cannot be photographed with high resolution.
- The estimation methods of FIGS. 6A, 6B, 7A, and 7B have the advantage of higher estimation accuracy than the method of FIG. 5, and they are particularly effective when the collation target person is at a short distance. Further, since the estimation methods shown in FIGS. 6A and 6B can be implemented with only the line-of-sight detection camera 20 (a visible-light camera), they have the advantage that the implementation cost can be kept low.
- The estimation methods of FIGS. 7A and 7B require a separate light source such as an infrared LED, but have the advantage of higher estimation accuracy than the methods shown in FIGS. 6A and 6B.
- control unit 14 (line-of-sight direction estimation unit 14a) estimates the line-of-sight direction of the collation target person from the line-of-sight information based on the selected estimation method (step S105).
- The control unit 14 (angle detection unit 14b) detects the angle (hereinafter referred to as the "detection angle") formed by the line-of-sight direction estimated in step S105 and the shooting direction of the iris photographing camera 30 (step S106).
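The detection angle of step S106 can be computed as the angle between two direction vectors. This sketch is not from the publication; it treats "facing each other" as the gaze being antiparallel to the camera's shooting direction, so 0 degrees means the two directions exactly face each other:

```python
import math

def angle_between_deg(gaze_dir, camera_dir) -> float:
    """Angle (degrees) between the gaze direction and the reversed
    camera shooting direction; 0 means the two face each other."""
    # Face-to-face means gaze_dir is antiparallel to camera_dir, so
    # compare the gaze with the negated shooting direction.
    ax, ay, az = gaze_dir
    bx, by, bz = (-camera_dir[0], -camera_dir[1], -camera_dir[2])
    dot = ax * bx + ay * by + az * bz
    na = math.sqrt(ax * ax + ay * ay + az * az)
    nb = math.sqrt(bx * bx + by * by + bz * bz)
    cosang = max(-1.0, min(1.0, dot / (na * nb)))  # clamp rounding error
    return math.degrees(math.acos(cosang))
```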
- In step S107, the control unit 14 (determination unit 14c) determines whether the detection angle satisfies a predetermined determination criterion.
- When the detection angle satisfies the determination criterion (step S107: YES), the process proceeds to step S108.
- When the detection angle does not satisfy the determination criterion (step S107: NO), the process proceeds to step S109.
- FIG. 8 and 9 are diagrams for explaining the angle formed by the line-of-sight direction of the collation target person and the shooting direction of the iris shooting camera 30 in the present embodiment.
- FIG. 8 shows that the collation target person T exists in the line-of-sight adjustment section S1 and that the detection angle is θ.
- the notification control unit 14d described above outputs control information to the notification device 50 and adjusts the line-of-sight direction to the collation target person T in the line-of-sight adjustment section S1.
- In FIG. 9, the collation target person T exists in the iris photographing section S2, which is set closer to the iris photographing camera 30 than the line-of-sight adjustment section S1. Further, since the line-of-sight direction of the collation target person and the shooting direction of the iris photographing camera 30 are aligned on substantially a straight line, the detection angle θ (not shown) is sufficiently small; that is, the line-of-sight direction is within a predetermined range. In such a case, the eye portion of the collation target person fits within the angle of view of the iris photographing camera 30.
- the drive control unit 14e described above receives the determination result of the determination unit 14c and outputs control information for causing the iris photographing camera 30 to capture an iris image.
- step S108 the control unit 14 (determination unit 14c) determines whether or not the collation target person exists in the iris imaging section.
- the control unit 14 determines that the collation target person exists in the iris photographing section (step S108: YES)
- the process proceeds to step S110.
- the control unit 14 determines that the collation target person does not exist in the iris imaging section (step S108: NO)
- the process returns to step S101.
- in step S109, the control unit 14 (notification control unit 14d) generates notification control information for notifying the collation target person of various information based on the detection angle, controls the notification device 50, and then the process returns to step S101.
- that is, the control unit 14 determines the notification method of the notification device 50 and, by a method such as screen display, sound, or light, prompts the collation target person to direct the line-of-sight direction toward the iris shooting camera 30.
- step S110 the control unit 14 (drive control unit 14e) outputs control information to the iris photographing camera 30. That is, the control unit 14 (drive control unit 14e) causes the iris photographing camera 30 to capture an iris image when the angle is equal to or less than a predetermined angle and the distance is equal to or less than a predetermined distance.
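The two capture conditions of step S110 (angle at or below a predetermined angle AND distance at or below a predetermined distance) can be sketched as a simple gate. The threshold values below are illustrative assumptions, not values from the disclosure.

```python
def should_capture(detection_angle_deg, distance_m,
                   max_angle_deg=10.0, max_distance_m=1.0):
    """Drive-control decision: trigger the iris camera only when the gaze
    is close enough to the shooting direction AND the subject has entered
    the iris photographing section (thresholds are illustrative)."""
    return detection_angle_deg <= max_angle_deg and distance_m <= max_distance_m

fire = should_capture(3.0, 0.8)    # aligned gaze, inside the section
hold = should_capture(25.0, 0.8)   # gaze still too far off axis
```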
- the control unit 14 (drive control unit 14e) outputs control information to the collation device 60 (step S111), causes the collation process between the iris image captured by the iris photographing camera 30 and the registered iris images stored in advance in the iris database 70 to be executed, and the process is completed.
- FIG. 10 is a flowchart showing an example of processing of the collating device 60 in the present embodiment. Further, FIG. 11 is a schematic diagram showing an outline of the iris matching process. The process of FIG. 10 is started when the collation device 60 receives control information from the information processing device 10.
- step S201 the image acquisition unit 61 acquires an image of the eyes of the collation target person. This process corresponds to FIG. 11 (a).
- the acquired image is stored in the storage unit 67. Typically, this image is captured with infrared light and is a grayscale image.
- step S202 the iris image extraction unit 62 determines the iris region from the image of the eye of the collation target person and extracts the iris image. This process corresponds to FIGS. 11 (b) and 11 (c).
- the iris image extraction unit 62 detects the pupil from the image of the eye and identifies the position thereof.
- the position of the identified pupil is stored in the storage unit 67.
- the shape of the pupil can be approximated as a circle. Therefore, the position of the pupil can be expressed by, for example, the center coordinates and the radius of the pupil.
- the pupil region can be detected, for example, by extracting pixels having a brightness lower than a predetermined value.
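The brightness-threshold approach to pupil detection can be sketched as follows. The threshold value, the centroid-based center estimate, and the area-based radius are illustrative assumptions, not details from the disclosure.

```python
import math

def detect_pupil(gray, threshold=40):
    """Rough pupil localization: collect pixels darker than `threshold`,
    take their centroid as the pupil center, and derive a radius from the
    pixel count assuming a circular pupil (area = pi * r^2)."""
    ys, xs = [], []
    for y, row in enumerate(gray):
        for x, v in enumerate(row):
            if v < threshold:
                ys.append(y)
                xs.append(x)
    if not xs:
        return None
    cx = sum(xs) / len(xs)
    cy = sum(ys) / len(ys)
    r = math.sqrt(len(xs) / math.pi)
    return cx, cy, r

# toy 7x7 "image": a dark 3x3 patch centered at (3, 3) on a bright field
img = [[200] * 7 for _ in range(7)]
for y in range(2, 5):
    for x in range(2, 5):
        img[y][x] = 10
cx, cy, r = detect_pupil(img)
```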
- the iris image extraction unit 62 detects the iris from the image of the eye and identifies the position of the iris.
- the identified iris position is stored in the storage unit 67.
- the shape of the iris can be approximated as an annular shape containing the pupil, so that the position of the iris can be represented, for example, by the center coordinates of the iris, the outer peripheral radius and the inner peripheral radius. Since the inner radius of the iris corresponds to the radius of the pupil, it may be omitted from the information indicating the position of the iris.
- the iris can be detected, for example, by extracting the change in brightness at the boundary between the outer circumference of the iris and the sclera (so-called white eye).
- the iris image extraction unit 62 extracts the iris image by cutting out the specified iris portion.
- the extracted iris image is stored in the storage unit 67.
- step S203 the coordinate conversion unit 63 transforms the iris image by converting the coordinates.
- This process corresponds to FIGS. 11 (d) and 11 (e).
- the coordinate conversion unit 63 converts the annular iris image into a rectangle. This process can be performed, for example, by converting the coordinate system of the iris image from the xy plane coordinate system to the rθ polar coordinate system. Since this coordinate transformation simplifies the shape of the iris image, the feature amount calculation process is simplified.
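The xy-to-rθ conversion can be sketched as a nearest-neighbour polar unwrap of the annulus. The sampling resolution and the function name are illustrative assumptions.

```python
import math

def unwrap_iris(gray, cx, cy, r_in, r_out, n_theta=128, n_r=16):
    """Map the annular iris region (inner radius r_in, outer radius r_out,
    center (cx, cy)) to an n_r x n_theta rectangle by sampling along
    (r, theta) polar coordinates with nearest-neighbour lookup."""
    h, w = len(gray), len(gray[0])
    rect = []
    for i in range(n_r):
        r = r_in + (r_out - r_in) * (i + 0.5) / n_r
        row = []
        for j in range(n_theta):
            t = 2 * math.pi * j / n_theta
            x = int(round(cx + r * math.cos(t)))
            y = int(round(cy + r * math.sin(t)))
            row.append(gray[y][x] if 0 <= x < w and 0 <= y < h else 0)
        rect.append(row)
    return rect

# uniform toy image: every sampled pixel has brightness 7
img = [[7] * 64 for _ in range(64)]
rect = unwrap_iris(img, 32, 32, 8, 20)
```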
- step S204 the block division unit 64 divides the iris image converted into a rectangle into a plurality of blocks.
- This process corresponds to FIG. 11 (f).
- the number of divisions may be, for example, 128 in the horizontal direction, 16 in the vertical direction (that is, 2048 in total), and the like.
- in FIG. 11 (f), the iris image is depicted as being cut and divided into a plurality of images for ease of understanding, but actually dividing the image into a plurality of images is not essential.
- the process of step S204 may be, for example, a process of acquiring the correspondence between the brightness of each block of the iris image and the coordinates of each block.
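The correspondence between block coordinates and block brightness described above can be sketched as follows, using the 128 x 16 division given as an example; averaging the pixels within each block is an assumption for illustration.

```python
def block_brightness(rect, n_cols=128, n_rows=16):
    """Associate each block coordinate (row, col) with its mean brightness.
    With a 16 x 128 rectangle and a 16 x 128 grid each block is a single
    pixel; the same code handles coarser grids by averaging."""
    h, w = len(rect), len(rect[0])
    bh, bw = h // n_rows, w // n_cols
    out = {}
    for by in range(n_rows):
        for bx in range(n_cols):
            vals = [rect[by * bh + dy][bx * bw + dx]
                    for dy in range(bh) for dx in range(bw)]
            out[(by, bx)] = sum(vals) / len(vals)
    return out

blocks = block_brightness([[5] * 128 for _ in range(16)])
```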
- step S205 the feature amount calculation unit 65 performs a process of calculating the feature amount for the iris image divided into a plurality of blocks. This process corresponds to FIG. 11 (g).
- referring to FIG. 11 (g), an example of a specific processing method for calculating the feature amount will be described.
- the feature amount calculation unit 65 acquires the brightness of the iris image in each block.
- the code of the feature amount of a certain block (hereinafter referred to as the "first block") is determined from the magnitude relationship of brightness between the first block and the block to its right (hereinafter referred to as the "second block").
- when the brightness of the first block or the second block cannot be obtained, the feature amount code may be set to "4".
- the feature amount code thus has four kinds of values. In the following description, it is assumed that the feature amount codes are the above-mentioned four types.
- FIG. 11 (g) shows a feature amount image in which the feature amount code is illustrated at the position of each block.
- the feature amount code values "1", “2", “3", and "4" are displayed in different patterns.
- this display may change the image pattern, such as color, brightness, or texture, according to the code value.
- the extracted feature amount or feature amount image is stored in the storage unit 67.
- the positional relationship in which the second block is to the right of the first block is illustrated here, but it may be to the left or the like; more generally, the second block may be in any positional relationship adjacent to the first block.
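A possible encoding consistent with the description (comparison with the right-hand neighbour; code "4" when a brightness value is unavailable) can be sketched as follows. The exact mapping of comparison outcomes to the code values "1", "2", and "3", the tolerance `eps`, and the use of `None` for missing brightness are assumptions for illustration, not part of the disclosure.

```python
def feature_codes(blocks, n_cols=128, n_rows=16, eps=2):
    """Per-block feature code from comparing a block's brightness with its
    right neighbour: "1" brighter, "3" darker, "2" roughly equal (within
    eps), and "4" when either brightness is unavailable (None)."""
    codes = {}
    for by in range(n_rows):
        for bx in range(n_cols):
            a = blocks.get((by, bx))
            b = blocks.get((by, (bx + 1) % n_cols))  # wrap at the seam
            if a is None or b is None:
                codes[(by, bx)] = "4"
            elif a > b + eps:
                codes[(by, bx)] = "1"
            elif a < b - eps:
                codes[(by, bx)] = "3"
            else:
                codes[(by, bx)] = "2"
    return codes

# a one-row toy grid; None stands for a block whose brightness is unknown
demo = {(0, 0): 100, (0, 1): 50, (0, 2): 100, (0, 3): None}
c = feature_codes(demo, n_cols=4, n_rows=1)
```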
- step S206 the collating unit 66 performs a process of collating the feature amount calculated in step S205 with the pre-registered feature amount.
- the feature amount calculated in step S205 is compared with the feature amount registered in advance, and matching regions, mismatching regions, and regions that cannot be compared are determined. For example, a region in which the codes to be compared are both "1" or both "3" can be determined to be a matching region. A region in which one of the codes to be compared is "1" and the other is "3" can be determined to be a mismatching region.
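The region classification above can be sketched as a count over per-block codes. Treating codes "2" and "4" as uncomparable is an assumption consistent with, but not stated by, the text.

```python
def collate(codes_a, codes_b):
    """Count matching / mismatching / uncomparable blocks between two code
    maps. Codes "1" and "3" compare directly; blocks where either code is
    "2" or "4" are treated as uncomparable (illustrative assumption)."""
    match = mismatch = skipped = 0
    for k in codes_a:
        a, b = codes_a[k], codes_b.get(k)
        if a in ("1", "3") and b in ("1", "3"):
            if a == b:
                match += 1
            else:
                mismatch += 1
        else:
            skipped += 1
    return match, mismatch, skipped

a = {0: "1", 1: "3", 2: "1", 3: "2"}
b = {0: "1", 1: "1", 2: "4", 3: "3"}
result = collate(a, b)
```

A similarity score could then be derived, for example, from the ratio of matching blocks to comparable blocks.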
- in step S207, the collating unit 66 outputs the collation result of step S206 to the information processing apparatus 10, and ends the process.
- when the line-of-sight direction of the collation target person deviates from the photographing direction of the iris photographing camera 30 by more than a predetermined threshold value, the iris collation system 1 in the present embodiment can call the attention of the collation target person by methods such as voice and screen display. As a result, even if the collation target person is not stationary, the iris image can be captured efficiently in a walk-through manner. Consequently, the time required for iris recognition can be shortened.
- the iris collation system 1 in the present embodiment can switch the method of estimating the line-of-sight direction depending on whether the collation target person exists at a position far from the iris photographing camera 30 or near the iris photographing camera 30.
- the iris matching system 1 can select an appropriate estimation method according to the distance.
- FIG. 12 is a diagram showing an overall configuration example of the iris collation system 2 according to the present embodiment.
- the iris collation system 2 in the present embodiment includes a plurality of (N ≥ 2) iris photographing cameras 30 instead of one.
- FIG. 13 is a diagram illustrating an angle formed by the line-of-sight direction of the collation target person and the shooting direction of the plurality of iris shooting cameras 30 in the present embodiment.
- the four iris photographing cameras 30 (30a to 30d) have different photographing directions.
- the angle formed by the photographing direction of the iris photographing camera 30c and the line-of-sight direction of the collation target person is the minimum value.
- based on the detection angle, the information processing device 10 (control unit 14) in the present embodiment selects a camera from among the plurality of iris photographing cameras 30 having different imaging directions.
- here, the control information is output to the iris photographing camera 30c so as to capture the iris image.
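The selection of the camera corresponding to the minimum detection angle can be sketched as follows. The vector representation and the sample directions are illustrative assumptions; index 2 plays the role of camera 30c in FIG. 13.

```python
import math

def pick_camera(gaze_dir, camera_dirs):
    """Return the index of the camera whose shooting direction is most
    nearly opposed to the gaze, i.e. the one with the smallest detection
    angle."""
    def angle(shoot):
        rx, ry = -shoot[0], -shoot[1]
        dot = gaze_dir[0] * rx + gaze_dir[1] * ry
        n = math.hypot(*gaze_dir) * math.hypot(rx, ry)
        return math.acos(max(-1.0, min(1.0, dot / n)))
    return min(range(len(camera_dirs)), key=lambda i: angle(camera_dirs[i]))

# four cameras (standing in for 30a-30d) facing slightly different
# directions; the gaze exactly opposes the third camera's direction
gaze = (-0.3, 1.0)
cams = [(0.5, -1.0), (0.2, -1.0), (0.3, -1.0), (-0.5, -1.0)]
best = pick_camera(gaze, cams)
```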
- FIG. 14 is a flowchart showing an example of the control process of the information processing device 10 in the present embodiment.
- the process of FIG. 14 is an example, and the order of the processes can be changed as appropriate.
- the image acquisition unit 11 receives the captured image captured by the line-of-sight detection camera 20 (step S301). Since the captured image is used for estimating the line-of-sight direction, it is assumed that the captured image includes at least a part of the face of the collation target person.
- the distance information acquisition unit 13 receives a signal indicating the distance between the collation target person and the iris photographing camera 30 from the distance sensor 40 as distance information (step S302).
- the line-of-sight information acquisition unit 12 analyzes the captured image received in step S301 and acquires the line-of-sight information of the collation target person (step S303).
- the line-of-sight information in the present embodiment includes the face direction of the collation target person, the position information of the eye region in the image, the position information of the outer corner of the eye, the pupil, and the iris.
- control unit 14 selects a method for estimating the line-of-sight direction of the collation target person based on the distance between the collation target person and the iris photographing camera 30 (step S304). Specifically, when the distance is equal to or greater than a predetermined reference distance, the control unit 14 (line-of-sight direction estimation unit 14a) estimates the line-of-sight direction based on the face direction of the collation target person included in the line-of-sight information.
- on the other hand, when the distance is less than the reference distance, the control unit 14 (line-of-sight direction estimation unit 14a) estimates the line-of-sight direction based on the position information of the pupil or iris of the collation target person included in the line-of-sight information.
- the control unit 14 (line-of-sight direction estimation unit 14a) estimates the line-of-sight direction of the collation target person from the line-of-sight information for each of the plurality of iris photographing cameras 30 based on the selected estimation method (step S305). That is, the control unit 14 (line-of-sight direction estimation unit 14a) estimates the line-of-sight directions in N ways when N iris photographing cameras 30 are installed.
- the control unit 14 detects the angles formed by the plurality of line-of-sight directions estimated in step S305 and the shooting directions of the respective iris shooting cameras 30 (step S306).
- control unit 14 determines the iris photographing camera 30 corresponding to the minimum detection angle from the plurality of iris photographing cameras 30 (step S307).
- control unit 14 determines whether or not the detection angle satisfies a predetermined determination criterion (step S308).
- when the detection angle satisfies the predetermined determination criterion (step S308: YES), the process proceeds to step S309.
- when the detection angle does not satisfy the predetermined determination criterion (step S308: NO), the process proceeds to step S310.
- step S309 the control unit 14 (determination unit 14c) determines whether or not the collation target person exists in the iris imaging section.
- the control unit 14 determines that the collation target person exists in the iris photographing section (step S309: YES)
- the process proceeds to step S311.
- the control unit 14 determines that the collation target person does not exist in the iris photographing section (step S309: NO)
- the process returns to step S301.
- in step S310, the control unit 14 (notification control unit 14d) generates notification control information for notifying the collation target person of information based on the angle, controls the notification device 50, and then the process returns to step S301. That is, the control unit 14 (notification control unit 14d) determines the notification method of the notification device 50 and, by a method such as screen display, sound, or light, prompts the collation target person to direct the line-of-sight direction toward the iris photographing camera 30 corresponding to the minimum detection angle. Examples of the notification in this embodiment include displaying a message such as "Look at the camera whose LED is lit" or "Look at camera No. 3" on the display 50a.
- step S311 the control unit 14 (drive control unit 14e) outputs control information to the iris photographing camera. That is, the control unit 14 (drive control unit 14e) causes the iris photographing camera 30 to capture an iris image when the angle is equal to or less than a predetermined angle and the distance is equal to or less than a predetermined distance.
- the control unit 14 (drive control unit 14e) outputs control information to the collation device 60 (step S312), causes the collation process between the iris image captured by the iris photographing camera and the registered iris images stored in advance in the iris database 70 to be executed, and the process is completed.
- the iris matching system 2 in the present embodiment selects the iris photographing camera 30 that actually photographs the iris image from among the plurality of iris photographing cameras 30 having different photographing directions, based on the detection angle. Since the iris photographing camera 30 whose photographing direction is close to the line-of-sight direction of the collation target person is selected, the collation target person can adjust the line-of-sight direction more easily than in the first embodiment. As a result, the convenience of the collation target person at the time of iris recognition can be further improved.
- the control unit 14 (determination unit 14c) in the present embodiment differs from the first and second embodiments in that the determination reference value used for the control that makes the line-of-sight direction of the collation target person and the shooting direction of the iris shooting camera 30 face each other (that reduces the detection angle) is changed based on the detected distance.
- FIG. 15 is a graph showing the relationship between the distance between the iris photographing camera 30 and the collation target person in the embodiment and the determination reference value of the detection angle.
- an example of the relationship between the distance and the determination reference value is shown with the distance between the iris photographing camera 30 and the collation target person as the horizontal axis and the determination reference value according to the distance as the vertical axis.
- the section between the distances D1 and D2 indicates the line-of-sight adjustment section S1. Further, the section between the distances D2 and D3 indicates the iris photographing section S2.
- the determination reference value decreases from C1 to C2 as the collation target person approaches the camera in the line-of-sight adjustment section S1. Then, in the iris photographing section S2, the determination reference value is constant at C2.
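The distance-dependent determination reference value of FIG. 15 can be sketched as a piecewise-linear function. The concrete distances and angle values used below are assumptions for illustration; the disclosure only specifies the shape (linear decrease from C1 to C2 in S1, constant C2 in S2).

```python
def criterion(d, d1, d2, d3, c1, c2):
    """Detection-angle reference value as a function of the distance d to
    the camera: linearly tightened from C1 down to C2 while the subject
    walks through the line-of-sight adjustment section (D1 -> D2), then
    held constant at C2 inside the iris photographing section (D2 -> D3).
    D1 > D2 > D3 along the direction of approach."""
    if d >= d1:
        return c1
    if d > d2:  # inside the line-of-sight adjustment section S1
        return c2 + (c1 - c2) * (d - d2) / (d1 - d2)
    return c2   # inside the iris photographing section S2 (and closer)

# e.g. D1=3.0 m, D2=1.0 m, D3=0.5 m, C1=20 deg, C2=5 deg (assumed values)
mid = criterion(2.0, 3.0, 1.0, 0.5, 20.0, 5.0)
```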
- FIG. 16 is a flowchart showing an example of the control process of the information processing device 10 in the present embodiment.
- the process of FIG. 16 is an example, and the order of the processes can be changed as appropriate.
- the image acquisition unit 11 receives the captured image captured by the line-of-sight detection camera 20 (step S401). Since the captured image is used for estimating the line-of-sight direction, it is assumed that the captured image includes at least a part of the face of the collation target person.
- the distance information acquisition unit 13 receives a signal indicating the distance between the collation target person and the iris photographing camera 30 from the distance sensor 40 as distance information (step S402).
- the line-of-sight information acquisition unit 12 analyzes the captured image received in step S401 and acquires the line-of-sight information of the collation target person (step S403).
- the line-of-sight information in the present embodiment includes the face direction of the collation target person, the position information of the eye region in the image, the position information of the outer corner of the eye, the pupil, and the iris.
- the control unit 14 estimates the line-of-sight direction based on the position information of the pupil or iris of the collation target person included in the line-of-sight information.
- control unit 14 (line-of-sight direction estimation unit 14a) estimates the line-of-sight direction of the collation target person from the line-of-sight information based on the selected estimation method (step S405).
- the control unit 14 (line-of-sight direction estimation unit 14a) estimates the line-of-sight directions in N ways when N iris photographing cameras 30 are installed.
- control unit 14 detects the angle formed by the line-of-sight direction estimated in step S405 and the shooting direction of the iris shooting camera 30 (step S406).
- control unit 14 determines whether or not the detection angle satisfies a predetermined determination criterion (step S408).
- when the detection angle satisfies the predetermined determination criterion (step S408: YES), the process proceeds to step S409.
- when the detection angle does not satisfy the predetermined determination criterion (step S408: NO), the process proceeds to step S410.
- step S409 the control unit 14 (determination unit 14c) determines whether or not the collation target person exists in the iris imaging section.
- the control unit 14 (determination unit 14c) determines that the collation target person exists in the iris photographing section (step S409: YES)
- the process proceeds to step S411.
- the control unit 14 (determination unit 14c) determines that the collation target person does not exist in the iris photographing section (step S409: NO)
- the process returns to step S401.
- in step S410, the control unit 14 (notification control unit 14d) generates notification control information for notifying the collation target person of information based on the angle, controls the notification device 50, and then the process returns to step S401. That is, the control unit 14 (notification control unit 14d) determines the notification method of the notification device 50 and, by a method such as screen display, sound, or light, prompts the collation target person to direct the line-of-sight direction toward the iris shooting camera 30.
- step S411 the control unit 14 (drive control unit 14e) outputs control information to the iris photographing camera. That is, the control unit 14 (drive control unit 14e) causes the iris photographing camera 30 to capture an iris image when the angle is equal to or less than a predetermined angle and the distance is equal to or less than a predetermined distance.
- according to the present embodiment, the determination reference value for determining whether the deviation between the line-of-sight direction of the collation target person and the imaging direction of the iris photographing camera 30 is acceptable can be changed according to the distance. By defining the determination reference value loosely at positions where accuracy is not required and strictly at positions where accuracy is required, the adjustment of the line-of-sight direction becomes more efficient and a highly accurate iris image can be captured.
- FIG. 17 is a block diagram showing the functions of the information processing apparatus 100 according to the present embodiment.
- the information processing device 100 is an information processing device that controls an iris matching system, and includes an acquisition unit 110 that acquires the line-of-sight information of a collation target person from an image captured by the first photographing device, and a control unit 120 that controls the iris matching system so that the line-of-sight direction obtained from the line-of-sight information and the shooting direction of the second photographing device that captures the iris image used for iris matching of the collation target person face each other.
- according to the information processing device 100 in the present embodiment, the convenience of the user in iris authentication can be improved.
- FIG. 18 is a block diagram showing the functions of the information processing system 500 according to the present embodiment.
- the information processing system 500 includes a first photographing device 510 that captures a first image including at least a part of the face of the collation target person, a second photographing device 520 that captures a second image including the iris of the collation target person, and an information processing device 530.
- the information processing device 530 includes an acquisition unit 531 that acquires the line-of-sight information of the collation target person based on the first image, and a control unit 532 that controls the information processing system 500 so that the line-of-sight direction of the collation target person obtained from the line-of-sight information and the shooting direction of the second photographing device face each other. According to the information processing system 500 in the present embodiment, the convenience of the user in iris authentication can be improved.
- the iris photographing camera 30 has been described on the premise that it is in a fixed state.
- the control unit 14 (drive control unit 14e) may drive the iris photographing camera 30 in a direction in which the angle is reduced based on the line-of-sight information.
- it is not necessary to call attention to the collation target person. Therefore, there is an advantage that the user of the facility (verification target person) can pass through without being aware of the authentication process.
- the control unit 14 may select an iris image based on the angle from a group of iris images captured by a plurality of iris photographing cameras 30 having different shooting directions, and cause the collating device 60 to execute the collation process based on the selected image.
- the control unit 14 may select an iris image captured by the iris imaging camera 30 corresponding to the minimum detection angle from a group of iris images captured by a plurality of imaging devices having different imaging directions. In this case, there is an advantage that the image can be easily selected.
- alternatively, an iris image taken by an iris photographing camera 30 whose angle is within an allowable range may be selected.
- in this case, even if the angle is not the minimum, there is an advantage that an image of high image quality, more suitable for iris matching, can be selected.
- the line-of-sight direction of the collation target person may be estimated based on the second image.
- the resolution of the second image captured by the iris photographing camera 30 is higher than the resolution of the first image captured by the line-of-sight detection camera 20. Therefore, there is an advantage that the accuracy of estimation of the line-of-sight direction can be improved.
- the distance to the collation target person may be acquired by the line-of-sight detection camera 20 (first photographing device). That is, the line-of-sight detection and the distance detection may be performed by the same camera. For example, when an object whose distance and size are known is included in the captured image taken by the line-of-sight detection camera 20, the distance can be estimated by comparing the size of the object with that of the collation target person in the image. In this case, since the distance sensor 40 can be omitted, there is an advantage that the hardware configuration can be simplified.
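The size-comparison idea can be sketched with the pinhole-camera proportionality: apparent size is inversely proportional to distance, so a reference object of known size and distance in the same frame calibrates the subject's distance. The calibration values below are illustrative assumptions.

```python
def estimate_distance(ref_height_px, ref_distance_m, subject_height_px):
    """Pinhole-camera proportionality: height_px * distance_m is constant
    for objects of the same physical size, so a reference measurement
    converts the subject's apparent height into a distance estimate."""
    k = ref_height_px * ref_distance_m  # pixels x metres, held constant
    return k / subject_height_px

# reference marker: 100 px tall at 2.0 m; the subject spans 50 px,
# so the subject is twice as far away
d = estimate_distance(100.0, 2.0, 50.0)
```

Note this assumes the reference object and the subject have the same physical size; in practice a known-size fiducial and a scale factor would be used.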
- the collation device 60 may execute two-factor authentication consisting of face authentication, which authenticates the collation target person based on the collation result between the first image and the registered face image of the registrant, and iris authentication, which authenticates the collation target person based on the collation result between the second image and the registered iris image of the registrant. By executing two-factor authentication, the authentication accuracy can be improved.
- the verification device 60 may execute two-step authentication in which the verification target person authenticated as the registrant in the face authentication is the authentication target in the iris authentication. Since the number of matching targets is narrowed down by face recognition, the speed of 1-to-N iris image recognition can be improved.
- the collation target person has an advantage that the adjustment of the line-of-sight direction is not required.
- the line-of-sight direction is represented by an angle, and a case where it is determined whether or not to take an iris image based on the angle formed by the line-of-sight direction and the shooting direction has been described.
- the line-of-sight direction may be represented by the difference in position between the center of the pupil and the reflected image reflected in the pupil. That is, when the center of the pupil and the reflected image reflected in the pupil match, it can be determined that the line-of-sight direction of the collation target person and the shooting direction of the iris shooting camera 30 are opposite to each other.
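The pupil-centre / corneal-reflection criterion above can be sketched as a simple distance test: the subject is looking straight at the camera when the reflected image of the camera-side illuminator coincides with the pupil centre. The coordinates and tolerance are illustrative assumptions.

```python
def looking_at_camera(pupil_center, glint_center, tol=2.0):
    """Alternative gaze test without computing an angle: return True when
    the corneal reflection (glint) lies within `tol` pixels of the pupil
    centre, i.e. the gaze and shooting directions face each other."""
    dx = pupil_center[0] - glint_center[0]
    dy = pupil_center[1] - glint_center[1]
    return (dx * dx + dy * dy) ** 0.5 <= tol

ok = looking_at_camera((120.0, 84.0), (121.0, 84.5))   # nearly coincident
off = looking_at_camera((120.0, 84.0), (140.0, 90.0))  # glint far off-centre
```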
- the information processing device 10 does not necessarily have to calculate the angle as the line-of-sight direction.
- as the recording medium, for example, a floppy (registered trademark) disk, a hard disk, an optical disk, a magneto-optical disk, a CD (Compact Disk)-ROM, a magnetic tape, a non-volatile memory card, or a ROM can be used.
- An information processing device that controls an iris matching system, comprising: an acquisition unit that acquires the line-of-sight information of the collation target person from an image captured by a first photographing device; and a control unit that controls the iris matching system so that the line-of-sight direction obtained from the line-of-sight information and the shooting direction of a second photographing device that captures the iris image used for the iris matching of the collation target person face each other.
- the control unit generates notification control information for notifying the collation target person of information based on the line-of-sight direction.
- the information processing device according to Appendix 1.
- the notification control information includes information for guiding the line-of-sight direction to the second photographing apparatus side.
- the information processing device according to Appendix 2.
- the control unit causes the second imaging device selected based on the line-of-sight direction to capture the iris image from the plurality of second imaging devices having different imaging directions.
- the information processing device according to any one of Appendix 1 to 3.
- the control unit causes the second imaging device, which minimizes the angle between the line-of-sight direction and the imaging direction, to capture the iris image from among the plurality of second imaging devices having different imaging directions.
- the information processing device according to Appendix 4.
- based on the line-of-sight information, the control unit drives the second photographing device in a direction in which the angle between the line-of-sight direction and the photographing direction is reduced.
- the information processing device according to any one of Appendix 1 to 3.
- the control unit selects the iris image having the smallest angle between the line-of-sight direction and the photographing direction from the iris image group photographed by the plurality of second photographing devices having different photographing directions.
- the information processing device according to Appendix 7.
- a distance information acquisition unit for acquiring the distance to the collation target person is further provided.
- the control unit changes a determination criterion used for controlling the line-of-sight direction and the photographing direction to face each other based on the distance.
- the information processing device according to any one of Appendix 1 to 8.
- a distance information acquisition unit for acquiring the distance to the collation target person is further provided.
- the control unit causes the second imaging device to capture the iris image when the line-of-sight direction is included in a predetermined range and the distance is equal to or less than a predetermined distance.
- the information processing device according to any one of Appendix 1 to 8.
- a distance information acquisition unit for acquiring the distance to the collation target person is further provided.
- when the distance is equal to or greater than a predetermined reference distance, the control unit estimates the line-of-sight direction based on the face direction of the collation target person included in the line-of-sight information.
- when the distance is less than the reference distance, the control unit estimates the line-of-sight direction based on the position information of the pupil or iris of the collation target person included in the line-of-sight information.
- the information processing device according to any one of Appendix 1 to 10.
- An information processing system comprising: a first photographing device that captures a first image including at least a part of the face of the collation target person; a second photographing device that captures a second image including the iris of the collation target person; and an information processing device, wherein the information processing device includes an acquisition unit that acquires line-of-sight information of the collation target person based on the first image, and a control unit that controls the information processing system so that the line-of-sight direction of the collation target person obtained from the line-of-sight information and the shooting direction of the second photographing device face each other.
- the photographing wavelength in the second photographing apparatus is different from the photographing wavelength in the first photographing apparatus.
- the control unit estimates the line-of-sight direction based on the first image when the distance to the collation target person is equal to or greater than a predetermined reference distance, and when the distance is less than the predetermined reference distance. Estimates the line-of-sight direction of the collation target person based on the second image.
- the information processing system according to any one of Appendix 13 to 15.
- Appendix 17 A distance sensor that detects the distance and outputs it to the information processing device is further provided.
- Appendix 18 The distance is acquired by the first photographing apparatus.
- the collation device sets the collation target person authenticated as the registrant in the face authentication as the authentication target in the iris authentication.
- the information processing system according to Appendix 19.
Description
FIG. 1 is a diagram showing an example of the overall configuration of the iris collation system 1 according to the present embodiment. The iris collation system 1 provides iris collation, a type of biometric matching: it captures an image of the iris of a user, the collation target person, and collates it against a registered iris image. Because the pattern of the iris differs from person to person and remains unchanged throughout life, identity can be verified by collating the iris pattern acquired at the time of matching against iris images registered in advance in a database.
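The collation step described above, comparing a captured iris pattern against pre-registered iris images, is not specified in detail in this publication. A common way to implement it is a Daugman-style comparison of binary iris codes by normalized Hamming distance; the sketch below is illustrative only, and the 2048-bit code length, the occlusion masks, and the 0.32 threshold are assumed values, not parameters from this patent.

```python
import numpy as np

def hamming_distance(code_a: np.ndarray, code_b: np.ndarray,
                     mask_a: np.ndarray, mask_b: np.ndarray) -> float:
    """Normalized Hamming distance between two binary iris codes.

    Bits hidden by eyelids or reflections are excluded via the masks.
    """
    valid = mask_a & mask_b                     # bits usable in both codes
    if valid.sum() == 0:
        return 1.0                              # no comparable bits at all
    disagreement = (code_a ^ code_b) & valid    # differing, usable bits
    return disagreement.sum() / valid.sum()

def matches(code_a, code_b, mask_a, mask_b, threshold: float = 0.32) -> bool:
    # Distances well below ~0.32 are typically treated as a same-iris match.
    return hamming_distance(code_a, code_b, mask_a, mask_b) < threshold

rng = np.random.default_rng(0)
enrolled = rng.integers(0, 2, 2048, dtype=np.uint8).astype(bool)
mask = np.ones(2048, dtype=bool)

# A probe differing in only ~5% of bits should match the enrolled code.
probe = enrolled.copy()
probe[:100] ^= True
print(matches(enrolled, probe, mask, mask))
```

In practice the code is derived from a normalized iris texture (e.g. Gabor phase quantization) and the threshold is tuned to the target false-match rate; this sketch only shows the scoring step.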
In step S207, the collation unit 66 outputs the collation result of step S206 to the information processing device 10 and ends the process.
The iris collation system 2 according to the second embodiment is described below. Reference numerals shared with the figures of the first embodiment denote the same elements; description of parts common to the first embodiment is omitted, and only the differences are described in detail.
The iris collation system according to the third embodiment is described below. Reference numerals shared with the figures of the embodiments described above denote the same elements; description of parts common to those embodiments is omitted, and only the differences are described in detail.
FIG. 17 is a block diagram showing the functions of the information processing device 100 according to the present embodiment. The information processing device 100 controls an iris collation system and includes an acquisition unit 110, which acquires line-of-sight information of the collation target person from an image captured by a first imaging device, and a control unit 120, which controls the iris collation system so that the line-of-sight direction obtained from the line-of-sight information and the shooting direction of a second imaging device, the device that captures the iris image used for iris collation of the collation target person, face each other. The information processing device 100 according to the present embodiment improves user convenience in iris authentication.
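The control described here, making the collation target person's line-of-sight direction and the iris camera's shooting direction face each other, can be sketched as choosing, from several candidate iris cameras with different shooting directions, the one whose optical axis most nearly opposes the estimated gaze vector (the fixed-camera-selection variant). The vector convention, camera layout, and names below are illustrative assumptions, not the patent's implementation:

```python
import math

def angle_between(gaze, shooting) -> float:
    """Angle in radians between the gaze vector and the *reversed*
    shooting direction; 0 means the two exactly face each other."""
    # "Facing each other" means gaze ~= -shooting, so compare with -shooting.
    ax, ay, az = gaze
    bx, by, bz = (-shooting[0], -shooting[1], -shooting[2])
    dot = ax * bx + ay * by + az * bz
    na = math.sqrt(ax * ax + ay * ay + az * az)
    nb = math.sqrt(bx * bx + by * by + bz * bz)
    return math.acos(max(-1.0, min(1.0, dot / (na * nb))))

def select_camera(gaze, cameras):
    """Return the camera whose shooting direction best opposes the gaze."""
    return min(cameras, key=lambda cam: angle_between(gaze, cam["direction"]))

cameras = [  # optical-axis vectors of three stacked iris cameras (assumed layout)
    {"id": "upper", "direction": (0.0, -0.5, -1.0)},
    {"id": "center", "direction": (0.0, 0.0, -1.0)},
    {"id": "lower", "direction": (0.0, 0.5, -1.0)},
]
gaze = (0.0, 0.1, 1.0)  # subject looking slightly upward, toward the cameras
print(select_camera(gaze, cameras)["id"])
```

The same `angle_between` score also covers the other variants in this publication: driving a movable camera in the direction that reduces the angle, or selecting, after the fact, the iris image whose capture angle was smallest.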
FIG. 18 is a block diagram showing the functions of the information processing system 500 according to the present embodiment. The information processing system 500 includes a first imaging device 510 that captures a first image including at least part of the face of the collation target person, a second imaging device 520 that captures a second image including the iris of the collation target person, and an information processing device 530. The information processing device 530 has an acquisition unit 531, which acquires line-of-sight information of the collation target person based on the first image, and a control unit 532, which controls the information processing system 500 so that the line-of-sight direction of the collation target person obtained from the line-of-sight information and the shooting direction of the second imaging device face each other. The information processing system 500 according to the present embodiment improves user convenience in iris authentication.
Although the present invention has been described above with reference to the embodiments, it is not limited to them. Various modifications that those skilled in the art can understand may be made to the configuration and details of the present invention without departing from its gist. For example, an embodiment in which part of the configuration of one embodiment is added to another embodiment, or replaces part of the configuration of another embodiment, is also an embodiment to which the present invention can be applied.
(Appendix 1)
An information processing device for controlling an iris collation system, comprising:
an acquisition unit that acquires line-of-sight information of a collation target person from an image captured by a first imaging device; and
a control unit that controls the iris collation system so that the line-of-sight direction obtained from the line-of-sight information and the shooting direction of a second imaging device, which captures an iris image used for iris collation of the collation target person, face each other.
(Appendix 2)
The information processing device according to Appendix 1, wherein the control unit generates notification control information for notifying the collation target person of information based on the line-of-sight direction.
(Appendix 3)
The information processing device according to Appendix 2, wherein the notification control information includes information for guiding the line-of-sight direction toward the second imaging device.
(Appendix 4)
The information processing device according to any one of Appendices 1 to 3, wherein the control unit causes the second imaging device selected based on the line-of-sight direction, from among a plurality of the second imaging devices whose shooting directions differ from one another, to capture the iris image.
(Appendix 5)
The information processing device according to Appendix 4, wherein the control unit causes, from among the plurality of the second imaging devices whose shooting directions differ from one another, the second imaging device for which the angle formed by the line-of-sight direction and the shooting direction is smallest to capture the iris image.
(Appendix 6)
The information processing device according to any one of Appendices 1 to 3, wherein the control unit drives the second imaging device, based on the line-of-sight information, in a direction that reduces the angle formed by the line-of-sight direction and the shooting direction.
(Appendix 7)
The information processing device according to any one of Appendices 1 to 3, wherein the control unit selects the iris image, based on the angle formed by the line-of-sight direction and the shooting direction, from among a group of iris images captured by a plurality of the second imaging devices whose shooting directions differ from one another.
(Appendix 8)
The information processing device according to Appendix 7, wherein the control unit selects, from among the group of iris images captured by the plurality of the second imaging devices whose shooting directions differ from one another, the iris image for which the angle formed by the line-of-sight direction and the shooting direction is smallest.
(Appendix 9)
The information processing device according to any one of Appendices 1 to 8, further comprising a distance information acquisition unit that acquires the distance to the collation target person, wherein the control unit changes, based on the distance, the criterion used in the control for making the line-of-sight direction and the shooting direction face each other.
(Appendix 10)
The information processing device according to any one of Appendices 1 to 8, further comprising a distance information acquisition unit that acquires the distance to the collation target person, wherein the control unit causes the second imaging device to capture the iris image when the line-of-sight direction falls within a predetermined range and the distance is equal to or less than a predetermined distance.
(Appendix 11)
The information processing device according to any one of Appendices 1 to 8, further comprising a distance information acquisition unit that acquires the distance to the collation target person, wherein the control unit estimates the line-of-sight direction based on the face direction of the collation target person included in the line-of-sight information when the distance is equal to or greater than a predetermined reference distance, and estimates the line-of-sight direction based on position information of the pupil or iris of the collation target person included in the line-of-sight information when the distance is less than the reference distance.
(Appendix 12)
The information processing device according to any one of Appendices 1 to 10, wherein the control unit estimates the line-of-sight direction based on position information of the pupil or iris of the collation target person included in the line-of-sight information.
(Appendix 13)
An information processing system comprising:
a first imaging device that captures a first image including at least part of the face of a collation target person;
a second imaging device that captures a second image including the iris of the collation target person; and
an information processing device,
wherein the information processing device has an acquisition unit that acquires line-of-sight information of the collation target person based on the first image, and a control unit that controls the information processing system so that the line-of-sight direction of the collation target person obtained from the line-of-sight information and the shooting direction of the second imaging device face each other.
(Appendix 14)
The information processing system according to Appendix 13, wherein the resolution of the second image is higher than the resolution of the first image.
(Appendix 15)
The information processing system according to Appendix 14, wherein the shooting wavelength of the second imaging device differs from the shooting wavelength of the first imaging device.
(Appendix 16)
The information processing system according to any one of Appendices 13 to 15, wherein the control unit estimates the line-of-sight direction based on the first image when the distance to the collation target person is equal to or greater than a predetermined reference distance, and estimates the line-of-sight direction of the collation target person based on the second image when the distance is less than the predetermined reference distance.
(Appendix 17)
The information processing system according to Appendix 16, further comprising a distance sensor that detects the distance and outputs it to the information processing device.
(Appendix 18)
The information processing system according to Appendix 17, wherein the distance is acquired by the first imaging device.
(Appendix 19)
The information processing system according to any one of Appendices 13 to 18, further comprising a collation device that performs face authentication, in which the collation target person is authenticated based on the result of collating the first image with a registered face image of a registrant, and iris authentication, in which the collation target person is authenticated based on the result of collating the second image with a registered iris image of the registrant.
(Appendix 20)
The information processing system according to Appendix 19, wherein the collation device sets the collation target person authenticated as the registrant in the face authentication as the authentication target in the iris authentication.
(Appendix 21)
An information processing method for controlling an iris collation system, comprising:
a step of acquiring line-of-sight information of a collation target person from an image captured by a first imaging device; and
a step of controlling the iris collation system so that the line-of-sight direction obtained from the line-of-sight information and the shooting direction of a second imaging device, which captures an iris image used for iris collation of the collation target person, face each other.
(Appendix 22)
A recording medium on which is recorded a program that causes a computer controlling an iris collation system to execute:
a step of acquiring line-of-sight information of a collation target person from an image captured by a first imaging device; and
a step of controlling the iris collation system so that the line-of-sight direction obtained from the line-of-sight information and the shooting direction of a second imaging device, which captures an iris image used for iris collation of the collation target person, face each other.
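The distance-dependent behavior of Appendices 10 and 11 (trigger the iris capture only when the gaze is in range and the subject is close enough; use the coarse face direction when far and the finer pupil/iris position when near) can be sketched as follows. The threshold values, field names, and yaw-only gaze model are illustrative assumptions, not values from this publication:

```python
from dataclasses import dataclass
from typing import Optional

REFERENCE_DISTANCE_M = 2.0   # switch-over distance (assumed value)
CAPTURE_DISTANCE_M = 1.0     # maximum distance for iris capture (assumed value)
GAZE_TOLERANCE_DEG = 10.0    # gaze must lie within this cone of the camera axis

@dataclass
class GazeInfo:
    face_direction_deg: float             # coarse yaw estimated from face pose
    pupil_direction_deg: Optional[float]  # fine yaw from pupil/iris position

def estimate_gaze_deg(info: GazeInfo, distance_m: float) -> float:
    """Far away, only the coarse face direction is usable; closer than the
    reference distance, the pupil/iris position gives a finer estimate
    (the Appendix 11 switch)."""
    if distance_m >= REFERENCE_DISTANCE_M or info.pupil_direction_deg is None:
        return info.face_direction_deg
    return info.pupil_direction_deg

def should_capture(info: GazeInfo, distance_m: float) -> bool:
    """Trigger the iris camera only when the gaze is inside the allowed
    range AND the subject is close enough (the Appendix 10 condition)."""
    gaze = estimate_gaze_deg(info, distance_m)
    return abs(gaze) <= GAZE_TOLERANCE_DEG and distance_m <= CAPTURE_DISTANCE_M

# A subject approaching the gate: far away, coarse estimate, no capture yet.
far = GazeInfo(face_direction_deg=4.0, pupil_direction_deg=None)
print(should_capture(far, distance_m=3.0))   # False: still too far away

# Near the gate and looking at the camera: fine estimate, capture fires.
near = GazeInfo(face_direction_deg=15.0, pupil_direction_deg=3.0)
print(should_capture(near, distance_m=0.8))  # True
```

Switching estimators by distance matches the hardware split in Appendix 16: the wide-view first camera resolves face pose at range, while pupil-level detail only becomes reliable once the subject is close.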
10: information processing device, 11: image acquisition unit
12: line-of-sight information acquisition unit, 13: distance information acquisition unit
14: control unit, 14a: line-of-sight direction estimation unit
14b: angle detection unit, 14c: determination unit
14d: notification control unit, 14e: drive control unit
15: storage unit
20: line-of-sight detection camera (first imaging device)
30: iris imaging camera (second imaging device)
40: distance sensor
50: notification device, 50a: display
50b: LED, 50c: speaker
60: collation device, 61: image acquisition unit
62: iris image extraction unit, 63: coordinate conversion unit
64: block division unit, 65: feature amount calculation unit
66: collation unit, 67: storage unit
70: iris database
100: information processing device, 110: acquisition unit
120: control unit
151, 651: CPU, 152, 652: RAM
153, 653: ROM, 154, 654: HDD
155, 655: communication I/F, 156: display device
157: input device, 158, 658: bus
500: information processing system, 510: first imaging device
520: second imaging device, 530: information processing device
531: acquisition unit, 532: control unit
Claims (22)
- An information processing device for controlling an iris collation system, comprising: an acquisition unit that acquires line-of-sight information of a collation target person from an image captured by a first imaging device; and a control unit that controls the iris collation system so that the line-of-sight direction obtained from the line-of-sight information and the shooting direction of a second imaging device, which captures an iris image used for iris collation of the collation target person, face each other.
- The information processing device according to claim 1, wherein the control unit generates notification control information for notifying the collation target person of information based on the line-of-sight direction.
- The information processing device according to claim 2, wherein the notification control information includes information for guiding the line-of-sight direction toward the second imaging device.
- The information processing device according to any one of claims 1 to 3, wherein the control unit causes the second imaging device selected based on the line-of-sight direction, from among a plurality of the second imaging devices whose shooting directions differ from one another, to capture the iris image.
- The information processing device according to claim 4, wherein the control unit causes, from among the plurality of the second imaging devices whose shooting directions differ from one another, the second imaging device for which the angle formed by the line-of-sight direction and the shooting direction is smallest to capture the iris image.
- The information processing device according to any one of claims 1 to 3, wherein the control unit drives the second imaging device, based on the line-of-sight information, in a direction that reduces the angle formed by the line-of-sight direction and the shooting direction.
- The information processing device according to any one of claims 1 to 3, wherein the control unit selects the iris image, based on the angle formed by the line-of-sight direction and the shooting direction, from among a group of iris images captured by a plurality of the second imaging devices whose shooting directions differ from one another.
- The information processing device according to claim 7, wherein the control unit selects, from among the group of iris images captured by the plurality of the second imaging devices whose shooting directions differ from one another, the iris image for which the angle formed by the line-of-sight direction and the shooting direction is smallest.
- The information processing device according to any one of claims 1 to 8, further comprising a distance information acquisition unit that acquires the distance to the collation target person, wherein the control unit changes, based on the distance, the criterion used in the control for making the line-of-sight direction and the shooting direction face each other.
- The information processing device according to any one of claims 1 to 8, further comprising a distance information acquisition unit that acquires the distance to the collation target person, wherein the control unit causes the second imaging device to capture the iris image when the line-of-sight direction falls within a predetermined range and the distance is equal to or less than a predetermined distance.
- The information processing device according to any one of claims 1 to 8, further comprising a distance information acquisition unit that acquires the distance to the collation target person, wherein the control unit estimates the line-of-sight direction based on the face direction of the collation target person included in the line-of-sight information when the distance is equal to or greater than a predetermined reference distance, and estimates the line-of-sight direction based on position information of the pupil or iris of the collation target person included in the line-of-sight information when the distance is less than the reference distance.
- The information processing device according to any one of claims 1 to 10, wherein the control unit estimates the line-of-sight direction based on position information of the pupil or iris of the collation target person included in the line-of-sight information.
- An information processing system comprising: a first imaging device that captures a first image including at least part of the face of a collation target person; a second imaging device that captures a second image including the iris of the collation target person; and an information processing device, wherein the information processing device has an acquisition unit that acquires line-of-sight information of the collation target person based on the first image, and a control unit that controls the information processing system so that the line-of-sight direction of the collation target person obtained from the line-of-sight information and the shooting direction of the second imaging device face each other.
- The information processing system according to claim 13, wherein the resolution of the second image is higher than the resolution of the first image.
- The information processing system according to claim 14, wherein the shooting wavelength of the second imaging device differs from the shooting wavelength of the first imaging device.
- The information processing system according to any one of claims 13 to 15, wherein the control unit estimates the line-of-sight direction based on the first image when the distance to the collation target person is equal to or greater than a predetermined reference distance, and estimates the line-of-sight direction of the collation target person based on the second image when the distance is less than the predetermined reference distance.
- The information processing system according to claim 16, further comprising a distance sensor that detects the distance and outputs it to the information processing device.
- The information processing system according to claim 17, wherein the distance is acquired by the first imaging device.
- The information processing system according to any one of claims 13 to 18, further comprising a collation device that performs face authentication, in which the collation target person is authenticated based on the result of collating the first image with a registered face image of a registrant, and iris authentication, in which the collation target person is authenticated based on the result of collating the second image with a registered iris image of the registrant.
- The information processing system according to claim 19, wherein the collation device sets the collation target person authenticated as the registrant in the face authentication as the authentication target in the iris authentication.
- An information processing method for controlling an iris collation system, comprising: a step of acquiring line-of-sight information of a collation target person from an image captured by a first imaging device; and a step of controlling the iris collation system so that the line-of-sight direction obtained from the line-of-sight information and the shooting direction of a second imaging device, which captures an iris image used for iris collation of the collation target person, face each other.
- A recording medium on which is recorded a program that causes a computer controlling an iris collation system to execute: a step of acquiring line-of-sight information of a collation target person from an image captured by a first imaging device; and a step of controlling the iris collation system so that the line-of-sight direction obtained from the line-of-sight information and the shooting direction of a second imaging device, which captures an iris image used for iris collation of the collation target person, face each other.
Priority Applications (8)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2021505476A JP7223303B2 (ja) | 2019-03-14 | 2019-03-14 | Information processing device, information processing system, information processing method, and program |
EP19919135.4A EP3940627A4 (en) | 2019-03-14 | 2019-03-14 | INFORMATION PROCESSING DEVICE, INFORMATION PROCESSING SYSTEM, INFORMATION PROCESSING METHOD AND RECORDING MEDIA |
US17/436,220 US20220130173A1 (en) | 2019-03-14 | 2019-03-14 | Information processing device, information processing system, information processing method, and storage medium |
BR112021016773A BR112021016773A2 (pt) | 2019-03-14 | 2019-03-14 | Dispositivo de processamento de informações, sistema de processamento de informações, método de proces-samento de informações e meio de armazenamento |
PCT/JP2019/010697 WO2020183732A1 (ja) | 2019-03-14 | 2019-03-14 | Information processing device, information processing system, information processing method, and recording medium |
CN201980093922.2A CN113557519A (zh) | 2019-03-14 | 2019-03-14 | 信息处理设备、信息处理系统、信息处理方法以及记录介质 |
TW109107229A TW202040426A (zh) | 2019-03-14 | 2020-03-05 | 資訊處理裝置、資訊處理系統、資訊處理方法以及非暫時性儲存媒體 |
ARP200100711A AR118356A1 (es) | 2019-03-14 | 2020-03-13 | Dispositivo de procesamiento de información, sistema de procesamiento de información, método de procesamiento de información y medio de almacenamiento |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2019/010697 WO2020183732A1 (ja) | 2019-03-14 | 2019-03-14 | Information processing device, information processing system, information processing method, and recording medium |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2020183732A1 true WO2020183732A1 (ja) | 2020-09-17 |
Family
ID=72426651
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2019/010697 WO2020183732A1 (ja) | 2019-03-14 | 2019-03-14 | Information processing device, information processing system, information processing method, and recording medium |
Country Status (8)
Country | Link |
---|---|
US (1) | US20220130173A1 (ja) |
EP (1) | EP3940627A4 (ja) |
JP (1) | JP7223303B2 (ja) |
CN (1) | CN113557519A (ja) |
AR (1) | AR118356A1 (ja) |
BR (1) | BR112021016773A2 (ja) |
TW (1) | TW202040426A (ja) |
WO (1) | WO2020183732A1 (ja) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2023234003A1 (ja) * | 2022-06-02 | 2023-12-07 | Sony Group Corporation | Information processing device, information processing method, and program |
US12072964B2 (en) | 2021-02-18 | 2024-08-27 | Nec Corporation | Biometric authentication system, biometric authentication method, and recording medium |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP7210872B2 (ja) * | 2017-07-19 | 2023-01-24 | FUJIFILM Business Innovation Corp. | Image processing device and image processing program |
FR3144376A1 (fr) * | 2022-12-23 | 2024-06-28 | Idemia Identity & Security France | Image acquisition system for biometric iris recognition of an individual |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH1040386A (ja) * | 1996-07-25 | 1998-02-13 | Oki Electric Ind Co Ltd | Iris recognition system |
JP2000011163A (ja) * | 1998-06-18 | 2000-01-14 | Matsushita Electric Ind Co Ltd | Iris imaging device and iris imaging method |
JP2007011667A (ja) * | 2005-06-30 | 2007-01-18 | Matsushita Electric Ind Co Ltd | Iris authentication device and iris authentication method |
JP2008197713A (ja) | 2007-02-08 | 2008-08-28 | Toyama Prefecture | Image identification method |
WO2009016846A1 (ja) * | 2007-08-02 | 2009-02-05 | Panasonic Corporation | Iris authentication device and iris authentication system |
JP2010267121A (ja) * | 2009-05-15 | 2010-11-25 | Oki Electric Ind Co Ltd | Iris imaging device |
JP2012216180A (ja) * | 2011-03-30 | 2012-11-08 | Advanced Telecommunication Research Institute International | Gaze direction estimation device, gaze direction estimation method, and program for causing a computer to execute the method |
JP2017208638A (ja) * | 2016-05-17 | 2017-11-24 | Lenovo (Singapore) Pte. Ltd. | Iris authentication device, iris authentication method, and program |
Family Cites Families (32)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7262919B1 (en) * | 1994-06-13 | 2007-08-28 | Canon Kabushiki Kaisha | Head-up display device with curved optical surface having total reflection |
US6700998B1 (en) * | 1999-04-23 | 2004-03-02 | Oki Electric Industry Co, Ltd. | Iris registration unit |
JP2003141516A (ja) * | 2001-10-31 | 2003-05-16 | Matsushita Electric Ind Co Ltd | 虹彩撮像装置及び虹彩認証装置 |
JP2005031711A (ja) * | 2003-05-12 | 2005-02-03 | Omron Corp | 車載端末装置、車載端末制御プログラム、車載端末制御プログラムを記録した記録媒体、車載端末装置の通信決済方法、決済管理システム、決済管理プログラム、決済管理プログラムを記録した記録媒体、決済管理方法、料金収受システム、料金収受プログラム、料金収受プログラムを記録した記録媒体、および料金収受方法 |
WO2005063114A1 (ja) * | 2003-12-25 | 2005-07-14 | National University Corporation Shizuoka University | 視線検出方法および装置ならびに三次元視点計測装置 |
JP4604190B2 (ja) * | 2004-02-17 | 2010-12-22 | 国立大学法人静岡大学 | 距離イメージセンサを用いた視線検出装置 |
JP2005334402A (ja) * | 2004-05-28 | 2005-12-08 | Sanyo Electric Co Ltd | 認証方法および認証装置 |
JP4622702B2 (ja) | 2005-05-27 | 2011-02-02 | 株式会社日立製作所 | 映像監視装置 |
US8159519B2 (en) * | 2007-05-31 | 2012-04-17 | Eastman Kodak Company | Personal controls for personal video communications |
EP2351668A4 (en) * | 2008-09-12 | 2013-03-13 | Toshiba Kk | IMAGE RADIATION SYSTEM AND IMAGE RADIATION PROCESS |
CN102149325B (zh) * | 2008-09-26 | 2013-01-02 | Panasonic Corporation | Gaze direction determination device and gaze direction determination method |
US8570423B2 (en) * | 2009-01-28 | 2013-10-29 | Hewlett-Packard Development Company, L.P. | Systems for performing visual collaboration between remotely situated participants |
JP5361618B2 (ja) * | 2009-09-04 | 2013-12-04 | Canon Inc. | Image processing device and control method therefor |
JP2011113196A (ja) | 2009-11-25 | 2011-06-09 | Olympus Corp | Face direction identification device and imaging device |
EP3002752A1 (en) * | 2010-01-15 | 2016-04-06 | LG Electronics, Inc. | Method and apparatus for processing an audio signal |
DE102010017837A1 (de) * | 2010-04-22 | 2011-10-27 | Carl Zeiss Meditec Ag | Arrangement for obtaining highly accurate measurement values at the eye |
JP2012038106A (ja) * | 2010-08-06 | 2012-02-23 | Canon Inc | Information processing device, information processing method, and program |
KR101544524B1 (ko) * | 2010-12-16 | 2015-08-17 | Electronics and Telecommunications Research Institute | Augmented reality display system and display method for vehicles |
EP2907453B1 (en) * | 2012-09-28 | 2018-11-21 | JVC Kenwood Corporation | Diagnosis assistance device and diagnosis assistance method |
CN104780834B (zh) * | 2012-11-12 | 2016-12-28 | Alps Electric Co., Ltd. | Biological information measurement device and input device using the same |
KR101382772B1 (ko) * | 2012-12-11 | 2014-04-08 | Hyundai Motor Company | Display system and method |
JP6217445B2 (ja) * | 2013-03-07 | 2017-10-25 | JVC Kenwood Corporation | Diagnosis support device and diagnosis support method |
JP6176070B2 (ja) * | 2013-11-13 | 2017-08-09 | Denso Corporation | Gaze direction detection device |
US9525817B2 (en) * | 2013-11-22 | 2016-12-20 | Samsung Electro-Mechanics Co., Ltd. | System and method of controlling imaging direction and angle of view of camera |
US20150346814A1 (en) * | 2014-05-30 | 2015-12-03 | Vaibhav Thukral | Gaze tracking for one or more users |
KR20160051411A (ko) * | 2014-11-03 | 2016-05-11 | Samsung Electronics Co., Ltd. | Electronic device for controlling an external object, and method therefor |
US10048749B2 (en) * | 2015-01-09 | 2018-08-14 | Microsoft Technology Licensing, Llc | Gaze detection offset for gaze tracking models |
JP6613740B2 (ja) * | 2015-09-09 | 2019-12-04 | Fujitsu Connected Technologies Ltd. | Display control device, display control method, and display control program |
CN108885117A (zh) * | 2016-03-29 | 2018-11-23 | Mitsubishi Electric Corporation | Voice guidance device and voice guidance method |
US10698481B1 (en) * | 2017-09-28 | 2020-06-30 | Apple Inc. | Glint-assisted gaze tracker |
EP3856008A1 (en) * | 2018-09-26 | 2021-08-04 | Essilor International | Method for determining at least one geometrico-morphological parameter of a subject |
US10825245B1 (en) * | 2019-06-03 | 2020-11-03 | Bank Of America Corporation | Three dimensional rendering for a mobile device |
2019
- 2019-03-14: WO PCT/JP2019/010697 patent/WO2020183732A1/ja, active, Application Filing
- 2019-03-14: US US17/436,220 patent/US20220130173A1/en, active, Pending
- 2019-03-14: EP EP19919135.4A patent/EP3940627A4/en, not active, Withdrawn
- 2019-03-14: BR BR112021016773A patent/BR112021016773A2/pt, unknown
- 2019-03-14: CN CN201980093922.2A patent/CN113557519A/zh, active, Pending
- 2019-03-14: JP JP2021505476A patent/JP7223303B2/ja, active, Active
2020
- 2020-03-05: TW TW109107229A patent/TW202040426A/zh, unknown
- 2020-03-13: AR ARP200100711A patent/AR118356A1/es, unknown
Non-Patent Citations (1)
Title |
---|
See also references of EP3940627A4 |
Also Published As
Publication number | Publication date |
---|---|
TW202040426A (zh) | 2020-11-01 |
EP3940627A4 (en) | 2022-03-23 |
JPWO2020183732A1 (ja) | 2020-09-17 |
EP3940627A1 (en) | 2022-01-19 |
CN113557519A (zh) | 2021-10-26 |
AR118356A1 (es) | 2021-09-29 |
BR112021016773A2 (pt) | 2021-11-16 |
JP7223303B2 (ja) | 2023-02-16 |
US20220130173A1 (en) | 2022-04-28 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2020183732A1 (ja) | Information processing device, information processing system, information processing method, and recording medium | |
US9798927B2 (en) | Mobile terminal iris recognition method and device having human-computer interaction mechanism | |
CN103383723B (zh) | 用于生物特征验证的电子欺骗检测的方法和系统 | |
US8831295B2 (en) | Electronic device configured to apply facial recognition based upon reflected infrared illumination and related methods | |
US7369759B2 (en) | Eye image pickup apparatus, iris authentication apparatus and portable terminal device having iris authentication function | |
US11227170B2 (en) | Collation device and collation method | |
CN111598065B (zh) | Depth image acquisition method, living-body recognition method, device, circuit, and medium | |
WO2019163066A1 (ja) | Spoofing detection device, spoofing detection method, and computer-readable recording medium | |
US20210256244A1 (en) | Method for authentication or identification of an individual | |
US11375133B2 (en) | Automatic exposure module for an image acquisition system | |
JP2015138449A (ja) | 個人認証装置、個人認証方法及びプログラム | |
JP5766096B2 (ja) | 顔画像認証装置 | |
US12118826B2 (en) | Information processing system, information processing method, and storage medium | |
JP2022033267A (ja) | Image processing system, image processing method, and storage medium | |
JP2009015518A (ja) | Eye image capturing device and authentication device | |
WO2021166289A1 (ja) | Data registration device, biometric authentication device, and recording medium | |
JP7482380B2 (ja) | Imaging device | |
JP7207506B2 (ja) | Spoofing detection device, spoofing detection method, and program | |
WO2023012900A1 (ja) | Information processing device, information processing method, and storage medium | |
WO2024214289A1 (ja) | Display device, authentication system, authentication method, and recording medium | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 19919135 Country of ref document: EP Kind code of ref document: A1 |
|
REG | Reference to national code |
Ref country code: BR Ref legal event code: B01A Ref document number: 112021016773 Country of ref document: BR |
|
ENP | Entry into the national phase |
Ref document number: 2021505476 Country of ref document: JP Kind code of ref document: A |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2019919135 Country of ref document: EP |
|
ENP | Entry into the national phase |
Ref document number: 2019919135 Country of ref document: EP Effective date: 20211014 |
|
ENP | Entry into the national phase |
Ref document number: 112021016773 Country of ref document: BR Kind code of ref document: A2 Effective date: 20210824 |