WO2015136938A1 - Information processing method and information processing system - Google Patents
Information processing method and information processing system
- Publication number
- WO2015136938A1 (PCT/JP2015/001359)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- authentication
- information
- unit
- image
- person
- Prior art date
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L63/00—Network architectures or network communication protocols for network security
- H04L63/08—Network architectures or network communication protocols for network security for authentication of entities
- H04L63/0861—Network architectures or network communication protocols for network security for authentication of entities using biometrical features, e.g. fingerprint, retina-scan
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F21/00—Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F21/30—Authentication, i.e. establishing the identity or authorisation of security principals
- G06F21/31—User authentication
- G06F21/32—User authentication using biometric data, e.g. fingerprints, iris scans or voiceprints
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/161—Detection; Localisation; Normalisation
-
- G—PHYSICS
- G07—CHECKING-DEVICES
- G07C—TIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
- G07C9/00—Individual registration on entry or exit
- G07C9/30—Individual registration on entry or exit not involving the use of a pass
- G07C9/32—Individual registration on entry or exit not involving the use of a pass in combination with an identity check
- G07C9/37—Individual registration on entry or exit not involving the use of a pass in combination with an identity check using biometric data, e.g. fingerprints, iris scans or voice recognition
-
- G—PHYSICS
- G07—CHECKING-DEVICES
- G07C—TIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
- G07C9/00—Individual registration on entry or exit
- G07C9/30—Individual registration on entry or exit not involving the use of a pass
- G07C9/38—Individual registration on entry or exit not involving the use of a pass with central registration
Definitions
- Embodiments described herein relate generally to an information processing method and an information processing system.
- In one known technique, biometric information is acquired at the entrance of an apparatus, such as an escalator, in which the movement of a passer-by is restricted, and the passer-by is authenticated before reaching the exit of the apparatus.
- There is also a technique for efficiently authenticating passers-by without omission by switching the authentication to simpler processing, and a technique for authenticating a passer-by multiple times to raise the authentication accuracy step by step.
- However, the technique of acquiring biometric information at the entrance of an apparatus such as an escalator, where the passer-by's movement is restricted, and performing authentication before the passer-by reaches the exit does not improve the efficiency of the biometric authentication itself; rather, it is effective only when there is a grace period in which to complete the authentication.
- The technique of switching the passer-by authentication to simpler processing may allow some passers-by to escape strict authentication, resulting in a decrease in authentication accuracy.
- The technique of authenticating the passer-by several times to raise the authentication accuracy step by step cannot be applied where there are no restrictions on the passer-by's route. Moreover, if the passer-by is substituted along the way, the substitution cannot be detected.
- The technique of transmitting a photographic image to a second position, such as an immigration checkpoint at an airplane's destination, makes the image available for a visual immigration check, but it does not achieve efficient authentication.
- There is also a technique that authenticates a person by comparing a face image captured while the person moves through a gate, waiting room, hallway, or the like with a face image read from a personal authentication medium possessed by the person. It is difficult for this technique to stabilize the authentication accuracy because of variations such as deterioration of the face image read from the medium.
- An embodiment of the present invention aims to provide an information processing method and an information processing system capable of detecting that a passer-by has been substituted in the course of the passer-by's passage.
- In the method, a first authentication process authenticates a passer-by passing through a first position, using first biometric information read from a medium possessed by the passer-by and second biometric information acquired from an image of the passer-by. When the authentication by the first authentication process succeeds, third biometric information based on at least one of the first biometric information and the second biometric information is stored in a storage unit and is used to authenticate the passer-by at a second position downstream from the first position in the direction in which the passer-by travels.
- A block diagram showing the functional configuration of the information processing system according to the first embodiment.
- A flowchart showing the flow of the passer-by authentication process in the first authentication device included in the information processing system according to the first embodiment.
- A flowchart showing the flow of the passer-by authentication process in the second authentication device included in the information processing system according to the first embodiment.
- A diagram showing a display example of the passer-by authentication result in the second authentication device included in the information processing system according to the first embodiment.
- A block diagram showing the functional configuration of the information processing system according to the second embodiment.
- A flowchart showing the flow of the passer-by authentication process in the second authentication device included in the information processing system according to the second embodiment.
- A diagram showing the configuration of an information processing system to which the entrance/exit management system according to the third embodiment is applied.
- A diagram showing a display unit serving as an administrator monitor on which the face image of the person photographed by the camera and the face image of the person stored in the passport possessed by the passer-by are compared and displayed.
- A flowchart showing the flow of the biometric information acquisition process by the boarding guidance device included in the information processing system according to the third embodiment.
- A flowchart showing the flow of the authentication process by the first authentication device included in the information processing system according to the third embodiment.
- A flowchart showing the flow of the authentication process by the second authentication device included in the information processing system according to the third embodiment.
- A flowchart showing the flow of the authentication process when the second authentication device included in the information processing system according to the third embodiment has a reading unit capable of reading identification information from a passport.
- FIG. 1 is a diagram illustrating a configuration of an information processing system to which an information processing method according to the first embodiment is applied.
- The information processing system 1 includes: a first authentication device 10 that executes a first authentication process, authenticating a passer-by passing through a first position P1 using biometric information (an example of first biometric information) read from a medium M possessed by the passer-by and biometric information (an example of second biometric information) acquired from a first image G1 obtained by imaging the passer-by with a first imaging unit 11; a server 30 having a feature information storage unit 31 that stores, when the authentication by the first authentication process succeeds, feature information for authentication (hereinafter, authentication feature information) generated based on at least one of the two pieces of biometric information used in the first authentication process; and a second authentication device 20 that executes a second authentication process, authenticating a passer-by passing through a second position P2, downstream from the first position P1 in the passer-by's direction of travel, using biometric information (an example of fourth biometric information) acquired from a second image G2 obtained by imaging the passer-by with a second imaging unit 21 and the authentication feature information stored in the feature information storage unit 31. The system further includes a second display unit 26 serving as an administrator monitor, which displays the results of the authentication processes of the first authentication device 10 and the second authentication device 20 to the administrator.
- Here, the downstream side of the first position P1 in the traveling direction of the passer-by is a position passed after the first position P1 on the passer-by's route.
- That is, after performing the first authentication process on the passer-by passing through the first position P1, the information processing system 1 performs, on the passer-by passing through the second position P2, the second authentication process using the biometric information acquired from the second image G2 obtained by imaging with the second imaging unit 21 and the authentication feature information stored in the feature information storage unit 31, thereby also performing an identification process that detects whether the passer-by passing through the second position P2 is the same person as the passer-by who passed through the first position P1.
- As a result, there is no need to search a database of a host device such as the server 30 for the biometric information used in the first and second authentication processes, nor to switch the passer-by authentication to simpler processing to shorten the time those processes require; passer-by authentication can therefore be performed efficiently while preventing a decrease in accuracy.
- In other words, the information processing system 1 efficiently authenticates a passer-by by combining the first authentication process (so-called 1:1 verification) executed by the first authentication device 10 with the second authentication process (so-called 1:N verification) executed by the second authentication device 20.
- Here, the 1:1 verification in the first authentication device 10 is a passer-by authentication process performed before the passer-by reaches the second position P2 (for example, the entrance of a building or of a security-managed area), using the biometric information read from the medium M possessed by the passer-by passing through the first position P1 and the biometric information acquired from the first image G1 obtained by imaging that passer-by with the first imaging unit 11.
- The 1:N verification in the second authentication device 20 is a passer-by authentication process using the authentication feature information stored in the feature information storage unit 31 of the server 30 and the biometric information acquired from the second image G2 obtained by imaging the passer-by passing through the second position P2 with the second imaging unit 21. In the 1:N verification, no biometric information is read from the medium M possessed by the passer-by; if the biometric information acquired from the second image G2 matches any of the authentication feature information stored in the feature information storage unit 31, it is determined that the passer-by has been successfully authenticated.
- The second authentication device 20 then executes processing to control the passer-by's passage according to the result of the 1:N verification. Specifically, when the 1:N verification succeeds, the second authentication device 20 permits passage, for example by opening the entrance gate provided at the second position P2 or unlocking the door key provided there. On the other hand, when the 1:N verification fails, the second authentication device 20 prohibits passage, for example by keeping the entrance gate at the second position P2 closed or leaving the door key there locked.
- When the 1:N verification fails, the second authentication device 20 also executes, as processing for prohibiting passage, displaying a message (alarm) notifying the failure on the second display unit 26 serving as the administrator monitor (see FIGS. 1 and 7), notifying an external terminal of the alarm, and storing an image of the passer-by who failed authentication (for example, the face image included in the second image G2).
- In this way, the passer-by passing through the second position P2 is authenticated without reading biometric information from the medium M possessed by that passer-by, and there is no need to switch the second authentication process to simpler processing in order to shorten the authentication time. Consequently, both congestion of passers-by and deterioration of authentication accuracy can be prevented even when many passers-by pass through the second position P2.
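- The two-stage flow described above can be sketched as follows. This is an illustrative Python sketch, not part of the patent: the function names, the 64-dimensional stand-in feature vectors, and the use of the normalized inner product as the similarity are all assumptions.

```python
import numpy as np

def similarity(a: np.ndarray, b: np.ndarray) -> float:
    # Inner product of length-normalized feature vectors (simple similarity).
    return float(np.dot(a / np.linalg.norm(a), b / np.linalg.norm(b)))

def first_authentication(medium_features, image_features, threshold, store):
    # 1:1 verification at position P1; authentication feature information is
    # stored only when authentication succeeds.
    if similarity(medium_features, image_features) > threshold:
        store.append(image_features)
        return True
    return False  # storage is prohibited on failure

def second_authentication(image_features, threshold, store):
    # 1:N verification at position P2: match against every stored feature.
    return any(similarity(f, image_features) > threshold for f in store)

rng = np.random.default_rng(0)
store = []
person = rng.standard_normal(64)               # stand-in feature vector
assert first_authentication(person, person + 0.01, 0.9, store)
assert second_authentication(person + 0.02, 0.9, store)
assert not second_authentication(rng.standard_normal(64), 0.9, store)
```

Because the 1:N stage matches only against features stored by the 1:1 stage, no external database search is needed, which is the efficiency argument made above.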
- The information processing system 1 can be applied to an entrance/exit management system, a video surveillance system, or the like installed in a facility through which many people pass, such as a public facility, an important facility, an office building, or a commercial facility.
- Although the present embodiment describes an example in which feature information of the passer-by's face image is used as the biometric information for the passer-by authentication processes, the biometric information is not limited to this; information on other parts of the passer-by's body, such as the iris, fingerprint, vein pattern, palm print, or ear shape, may also be used.
- FIG. 2 is a block diagram illustrating a functional configuration of the information processing system according to the first embodiment.
- The first authentication device 10 includes: a first imaging unit 11 provided so as to be able to image a passer-by passing through the first position P1; a first image acquisition unit 12 that acquires the first image G1 obtained by the imaging of the first imaging unit 11; a first face feature extraction unit 13 that acquires (extracts), from the first image G1 acquired by the first image acquisition unit 12, feature information of the passer-by's face image (an example of second biometric information); an identification information reading unit 14 (an example of a reading unit) provided so as to be able to read feature information from the medium M possessed by the passer-by passing through the first position P1; a first passer-by authentication unit 15 that executes the first authentication process, authenticating the passer-by using the feature information acquired by the first face feature extraction unit 13 and the feature information read by the identification information reading unit 14; and a first output unit 16 that outputs the passer-by authentication result of the first passer-by authentication unit 15.
- The server 30 has a feature information storage unit 31 (an example of a storage unit) that stores authentication feature information based on at least one of the two pieces of feature information used in the first authentication process by the first authentication device 10 (the first passer-by authentication unit 15), namely the feature information acquired from the first image G1 and the feature information read from the medium M (in the present embodiment, one or the other of these two).
- The second authentication device 20 includes: a second imaging unit 21 provided so as to be able to image a passer-by passing through the second position P2; a second image acquisition unit 22 that acquires the second image G2 obtained by the imaging of the second imaging unit 21; a second face feature extraction unit 23 that acquires (extracts), from the second image G2 acquired by the second image acquisition unit 22, feature information (an example of fourth biometric information) of the passer-by's face image; a second passer-by authentication unit 24 that executes the second authentication process, authenticating the passer-by using the feature information acquired by the second face feature extraction unit 23 and the authentication feature information stored in the feature information storage unit 31; and a second output unit 25 that outputs the passer-by authentication result of the second passer-by authentication unit 24.
- The first output unit 16 of the first authentication device 10 and the second output unit 25 of the second authentication device 20 are connected to the first display unit 17.
- FIG. 4 is a flowchart illustrating the flow of passer authentication processing in the first authentication device included in the information processing system according to the first embodiment.
- The identification information reading unit 14 is configured by a card reader or the like, and reads, from the medium M possessed by the passer-by passing through the first position P1 (for example, an ID card carrying an RFID (Radio Frequency Identification) chip, a key, an identification card, a passport, or another public identity-verification medium), one piece of the passer-by's feature information and identification information that can identify the passer-by (step S401).
- Here, the identification information includes information for identifying the passer-by, such as an identification number (ID number), name, gender, age, affiliation, career, height, and face image data of the passer-by.
- In the present embodiment, the identification information reading unit 14 reads feature information that has been stored on (or printed on) the medium M by an external device other than the first authentication device 10. However, as long as the feature information can be acquired from the medium M, the feature information may instead be acquired from the image printed on the medium M or from an image based on the image data stored in the medium M. In that case, the identification information reading unit 14 acquires the feature information from such an image in the same manner as the first face feature extraction unit 13 described later.
- Next, the first image acquisition unit 12 acquires the first image G1 obtained by the imaging of the first imaging unit 11 (in other words, an image capturing the passer-by carrying the medium M from which the identification information reading unit 14 read the feature information) (step S402).
- Here, the first imaging unit 11 is configured by, for example, an ITV (Industrial Television) camera or the like, and is arranged so as to be able to image the part of the body of a passer-by passing through the first position P1 that is necessary for acquiring the feature information (in this embodiment, the passer-by's face).
- The first imaging unit 11 generates image data by digitizing, with an A/D converter, the optical information obtained through its lens at a predetermined frame rate, and outputs the image data to the first image acquisition unit 12.
- the first face feature extraction unit 13 acquires the feature information of the passer's face image included in the first image G1 from the first image G1 acquired by the first image acquisition unit 12 (step S403).
- Specifically, the first face feature extraction unit 13 moves a face detection template, stored in advance in the first authentication device 10, across the first image G1 while computing the correlation value (correlation coefficient) between the first image G1 and the template, and detects the position with the highest correlation value as a face image in the first image G1. In the present embodiment, the first face feature extraction unit 13 detects the face image from the first image G1 using a pre-stored face detection template, but a face image may instead be detected from the first image G1 using a known eigenspace method or subspace method.
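- The template-scanning detection described above can be sketched as follows: slide the template over the image and keep the location with the highest correlation coefficient. This is a minimal, illustrative Python implementation on synthetic data; the image size and the planted template location are arbitrary assumptions.

```python
import numpy as np

def detect_face(image: np.ndarray, template: np.ndarray):
    # Scan every window position and score it by the correlation coefficient
    # between the (mean-subtracted) window and template.
    th, tw = template.shape
    t = template - template.mean()
    best_score, best_pos = -1.0, None
    for y in range(image.shape[0] - th + 1):
        for x in range(image.shape[1] - tw + 1):
            w = image[y:y+th, x:x+tw] - image[y:y+th, x:x+tw].mean()
            denom = np.linalg.norm(t) * np.linalg.norm(w)
            score = float((t * w).sum() / denom) if denom else 0.0
            if score > best_score:
                best_score, best_pos = score, (y, x)
    return best_pos, best_score

rng = np.random.default_rng(0)
image = rng.random((40, 40))
template = image[10:20, 15:25].copy()     # plant the "face" at (10, 15)
pos, score = detect_face(image, template)
assert pos == (10, 15) and score > 0.99
```

In practice such scanning is done with optimized routines (and at multiple scales), but the scoring idea is the same.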
- When a plurality of frame images (first images G1) based on the image data of the predetermined frame rate output from the first imaging unit 11 contain the same passer-by's face image in succession, the first face feature extraction unit 13 needs to perform tracking processing to detect those face images as the face image of the same passer-by. Specifically, using a method such as that described in Japanese Patent No. 5355446, the first face feature extraction unit 13 estimates at which position the face image detected in one frame image will appear in the next frame image, and thereby detects a face image appearing successively across a plurality of frame images as the face image of the same passer-by.
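- The tracking step can be sketched as follows. This is an illustrative stand-in, not the patented estimation method: it predicts each face's next position with a crude constant-position model and links the nearest new detection, and all names and coordinates are assumptions.

```python
import numpy as np

def track_faces(prev_faces, detections, max_dist=5.0):
    # For each face position from the previous frame, estimate where it
    # should appear next (here: the same position) and link the nearest
    # detection in the new frame to the same passer-by.
    links = {}
    for pid, pos in prev_faces.items():
        if not detections:
            continue
        nearest = min(detections,
                      key=lambda d: np.hypot(d[0] - pos[0], d[1] - pos[1]))
        if np.hypot(nearest[0] - pos[0], nearest[1] - pos[1]) <= max_dist:
            links[pid] = nearest
    return links

prev_faces = {"passerby-1": (100, 50)}
detections = [(102, 51), (300, 200)]   # slight movement, plus a new face
links = track_faces(prev_faces, detections)
assert links["passerby-1"] == (102, 51)
```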
- Next, the first face feature extraction unit 13 detects the positions of facial parts, such as the eyes and nose, from the detected face image using a method described in Japanese Patent No. 3279913 or the like. When a plurality of frame images contain the face image of the same passer-by, the first face feature extraction unit 13 detects the positions of the facial parts using the face image detected from any one of those frame images, or from all of them.
- When the positions of the facial parts have been detected, the first face feature extraction unit 13 acquires feature information of the passer-by's face image based on the detected positions, digitizes the acquired feature information, and outputs the resulting data to the first passer-by authentication unit 15.
- Specifically, the first face feature extraction unit 13 cuts out a face image of a predetermined size and shape from the first image G1 based on the detected facial parts, and acquires the shading information of the cut-out face image as feature information. For example, the first face feature extraction unit 13 acquires the m-pixel × n-pixel rectangular face image cut out from the first image G1, based on the detected positions of the facial parts, as an m × n-dimensional vector (an example of feature information). The first face feature extraction unit 13 may also acquire, as feature information, a subspace representing the features of the face image included in the first image G1, using the subspace method described in Japanese Patent No. 4087953.
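- A minimal sketch of the shading-vector extraction above: cut an m × n-pixel rectangle around the detected facial parts and flatten its grayscale values into an m × n-dimensional vector. The sizes, names, and stand-in image here are illustrative assumptions.

```python
import numpy as np

def extract_features(image: np.ndarray, top: int, left: int,
                     m: int = 4, n: int = 4) -> np.ndarray:
    # Crop the m-by-n face region and use its shading (grayscale) values
    # as an (m*n)-dimensional feature vector.
    face = image[top:top + m, left:left + n]
    return face.astype(float).flatten()

frame = np.arange(100, dtype=float).reshape(10, 10)   # stand-in grayscale image
v = extract_features(frame, 2, 3)
assert v.shape == (16,)           # m*n-dimensional vector for m = n = 4
assert v[0] == frame[2, 3]        # first component is the top-left pixel
```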
- Next, the first passer-by authentication unit 15 executes the first authentication process, authenticating the passer-by using the feature information acquired by the first face feature extraction unit 13 and the feature information read by the identification information reading unit 14 (step S404). In other words, the first passer-by authentication unit 15 determines whether the feature information acquired from the first image G1 and the feature information read from the medium M by the identification information reading unit 14 belong to the same person. As a result, there is no need to search a database of a host device such as the server 30 for the feature information used in the first authentication process, nor to switch the passer-by authentication to simpler processing to shorten the time the first authentication process requires, so the passer-by can be authenticated efficiently while preventing a decrease in authentication accuracy.
- Specifically, the first passer-by authentication unit 15 calculates a similarity index between the feature information acquired by the first face feature extraction unit 13 and the feature information read by the identification information reading unit 14. Here, the similarity index is, for example, the similarity between two feature vectors based on the simple similarity method, or the similarity between subspaces based on the subspace method, computed from feature information such as the feature vectors or subspaces extracted by the first face feature extraction unit 13 and read by the identification information reading unit 14.
- For example, the first passer-by authentication unit 15 normalizes the lengths of the feature vector extracted by the first face feature extraction unit 13 and the feature vector read by the identification information reading unit 14 to 1, and calculates their inner product as the similarity between the feature vectors.
- Alternatively, the first passer-by authentication unit 15 calculates, as the similarity, the angle formed between the subspace acquired by the first face feature extraction unit 13 and the subspace read by the identification information reading unit 14, using the subspace method or the compound similarity method described in Japanese Patent No. 4087953.
- As a similarity index other than the similarity between two feature vectors by the simple similarity method and the similarity between subspaces by the subspace method, the first passer-by authentication unit 15 can also use the similarity between two pieces of feature information based on a distance, such as the Euclidean distance or the Mahalanobis distance, in the feature space formed by the feature information acquired by the first face feature extraction unit 13 and the feature information read by the identification information reading unit 14. When the similarity of two pieces of feature information is determined based on such a distance, the larger the distance, the lower the similarity, and the smaller the distance, the higher the similarity.
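- The distance-based index above can be sketched as follows; the monotone mapping 1/(1 + d) from Euclidean distance to similarity is an illustrative choice, not specified in the text.

```python
import numpy as np

def distance_similarity(a: np.ndarray, b: np.ndarray) -> float:
    # Euclidean distance in feature space; larger distance -> lower similarity.
    d = float(np.linalg.norm(a - b))
    return 1.0 / (1.0 + d)

a = np.array([1.0, 0.0])
near = np.array([1.0, 0.1])
far = np.array([-1.0, 2.0])
assert distance_similarity(a, near) > distance_similarity(a, far)
assert distance_similarity(a, a) == 1.0
```

The Mahalanobis variant would replace the norm with a covariance-weighted distance; the decreasing relationship between distance and similarity is the same.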
- The first passer-by authentication unit 15 determines that the passer-by has been successfully authenticated when the calculated similarity exceeds a predetermined first threshold (step S405: Yes). On the other hand, the first passer-by authentication unit 15 determines that the passer-by authentication has failed when the calculated similarity is equal to or less than the first threshold (step S405: No).
- When the passer-by authentication succeeds (step S405: Yes), the first passer-by authentication unit 15 (an example of a storage control unit) stores (saves) authentication feature information based on at least one of the two pieces of feature information used in the first authentication process in the feature information storage unit 31 of the server 30 (step S406). On the other hand, when the passer-by authentication fails (step S405: No), the first passer-by authentication unit 15 prohibits storing the authentication feature information in the feature information storage unit 31. In the present embodiment, the first passer-by authentication unit 15 stores, as the authentication feature information, either the feature information acquired by the first face feature extraction unit 13 or the feature information read by the identification information reading unit 14.
- the first passer authentication unit 15 stores the feature information for authentication in the feature information storage unit 31 in association with the identification information read by the identification information reading unit 14.
- The first passer-by authentication unit 15 may store in the feature information storage unit 31, as the authentication feature information, the feature information acquired by the first face feature extraction unit 13 or the feature information read by the identification information reading unit 14 itself, or it may store a feature vector (an example of feature information) or a correlation matrix for calculating a subspace.
- The first passer-by authentication unit 15 may also store in the feature information storage unit 31, in association with the identification information read by the identification information reading unit 14, the authentication feature information together with image data of the passer-by's face image included in the first image G1, time information on when the first authentication process was executed, and device information that identifies the first authentication device 10 that executed the first authentication process.
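- One way to picture the stored record is the following sketch, with the authentication feature information keyed by the identification information and carrying the associated image, time, and device fields. The field and key names are assumptions, not taken from the patent.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Dict, List

@dataclass
class AuthRecord:
    features: List[float]          # authentication feature information
    face_image: bytes              # face image data from the first image G1
    authenticated_at: datetime     # time the first authentication process ran
    device_id: str                 # identifies the first authentication device

feature_store: Dict[str, AuthRecord] = {}   # plays the role of storage unit 31

feature_store["ID-0001"] = AuthRecord(
    features=[0.12, 0.55, 0.33],
    face_image=b"...",             # placeholder image bytes
    authenticated_at=datetime(2015, 3, 11, 9, 30),
    device_id="first-auth-device-10",
)
assert feature_store["ID-0001"].device_id == "first-auth-device-10"
```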
- When, of the two pieces of feature information used in the first authentication process, the feature information read by the identification information reading unit 14 is stored in the feature information storage unit 31 as the authentication feature information, the second authentication device 20 authenticates the passer-by using the same information that was used to authenticate the passer-by at the first position P1 (the feature information read by the identification information reading unit 14). This increases the likelihood of detecting that the passer-by passing through the second position P2 is a different person from the one who passed through the first position P1 when a substitution occurs between the two positions, and thus improves security against the substitution of passers-by.
- However, the feature information read from the medium M by the identification information reading unit 14 is generally feature information acquired from an image of the passer-by captured before the time at which the first authentication process is executed. Therefore, when the feature information obtained by imaging the passer-by has changed, for example because the passer-by has aged, its similarity with the feature information read from the medium M decreases. In other words, the feature information read from the medium M by the identification information reading unit 14 is easily affected by the passer-by's aging.
- the first passer authentication unit 15 causes the feature information storage unit 31 to store the feature information acquired by the first face feature extraction unit 13 as feature information for authentication.
- the passer-by authentication process in the second authentication device 20 is performed using the feature information acquired from the image (first image G1) obtained by capturing the passer-by when the first authentication process is executed.
- since this reduces the influence of the passer-by's aging and the like, the passer-by authentication accuracy can be improved.
- as shown in FIG. 3, aging data can be stored by associating newly captured face images with the face image stored in the passport.
- in the example of FIG. 3, the face image stored in the passport was photographed in January 1990, and a face-matching image is captured again in February 2000, ten years later.
- the passport image and the photographed images are combined and recorded as a history.
- the degree of similarity is calculated in chronological order.
- let the face image of the passport be x(0), the face images remaining in the history be x(1), ..., x(t), the similarity between face image a and face image b be S(a, b), and θ be the similarity threshold that determines whether two images show the same person. If S(x(0), x(1)) > θ, S(x(1), x(2)) > θ, ..., S(x(t-1), x(t)) > θ, it is determined that the images show the same person. Even when secular change makes S(x(0), x(t)) ≤ θ, the error of judging the person to be someone else can thus be reduced.
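The chronological check above can be sketched as follows. This is an illustrative sketch only: cosine similarity stands in for S(a, b), and the threshold value is arbitrary; the embodiment does not mandate a particular similarity measure.

```python
import math

def similarity(a, b):
    """S(a, b): cosine similarity between two feature vectors (an assumption)."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.hypot(*a) * math.hypot(*b))

def is_same_person(history, theta):
    """Accept when every consecutive pair x(i-1), x(i) in the history exceeds
    theta, even if S(x(0), x(t)) alone has fallen below theta due to aging."""
    return all(similarity(history[i - 1], history[i]) > theta
               for i in range(1, len(history)))
```

With gradually drifting feature vectors, each consecutive similarity stays above the threshold even though the direct similarity between the oldest and newest images does not.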
- when the issue date stored (or printed) on the medium M falls within a predetermined period preceding the date on which the first authentication process is executed, the first passer authentication unit 15 uses the feature information read from the medium M by the identification information reading unit 14 as the authentication feature information.
- otherwise, the first passer authentication unit 15 may use the feature information acquired by the first face feature extraction unit 13 as the authentication feature information.
- the first passer authentication unit 15 may also store, in the feature information storage unit 31, authentication feature information that includes both the feature information acquired by the first face feature extraction unit 13 and the feature information read by the identification information reading unit 14.
- when the feature information (an example of fourth biological information) acquired from the second image G2 obtained by the second imaging unit 21 imaging the passer-by at the second position P2 matches either of the two pieces of feature information included in the authentication feature information stored in the feature information storage unit 31, the second authentication device 20 determines that the passer-by has been successfully authenticated. This reduces the possibility that authentication of the passer-by will fail.
- the first passer authentication unit 15 may store, in the feature information storage unit 31 as the authentication information, information obtained by updating the feature information read from the medium M by the identification information reading unit 14 on the basis of the first image G1 acquired by the first image acquisition unit 12.
- for example, the first passer authentication unit 15 updates the feature information read from the medium M by incorporating into it the feature information acquired from the first image G1.
- when the feature information is a partial space (subspace), the first passer authentication unit 15 may update the partial space by adding the first image G1 to the set of images used to create it.
- the first passer authentication unit 15 may update the feature information read from the medium M by the identification information reading unit 14 with the feature information acquired from the first image G1.
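One simple way to realise such an update, assuming feature vectors compared by cosine similarity, is to blend the stored vector with the freshly acquired one and re-normalise. The blending weight `alpha` is an illustrative parameter, not something specified by the embodiment.

```python
import math

def update_template(stored, acquired, alpha=0.5):
    """Blend the feature read from the medium M with the feature freshly
    acquired from the first image G1, then re-normalise the result so
    cosine similarity remains meaningful."""
    blended = [(1.0 - alpha) * s + alpha * a for s, a in zip(stored, acquired)]
    norm = math.hypot(*blended)
    return [v / norm for v in blended]
```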
- the first passer authentication unit 15 performs processing that removes information unnecessary for discrimination among the plural pieces of authentication feature information stored in the feature information storage unit 31.
- for example, using the constrained mutual subspace method described in Japanese Patent No. 4087953 and the like, the first passer authentication unit 15 projects or converts the feature vectors stored as the authentication feature information into a constraint subspace, thereby increasing the discrimination accuracy between the pieces of authentication feature information stored in the feature information storage unit 31. This prevents the second authentication process from being executed using unnecessary information included in the authentication feature information, and thus improves the passer-by authentication accuracy of the second authentication process.
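As a rough sketch of the idea of removing information that does not help discrimination, the following projects a feature vector onto the subspace orthogonal to a single known nuisance direction (e.g. a direction dominated by illumination). The actual constrained mutual subspace method of Japanese Patent No. 4087953 projects onto a generalised difference subspace computed from training data; this single-direction version is only a simplified stand-in.

```python
import math

def remove_nuisance_component(feature, nuisance):
    """Project a feature vector onto the subspace orthogonal to a known
    nuisance direction, discarding the component along that direction."""
    n_norm = math.hypot(*nuisance)
    n = [v / n_norm for v in nuisance]          # unit nuisance direction
    coeff = sum(f * u for f, u in zip(feature, n))
    return [f - coeff * u for f, u in zip(feature, n)]
```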
- when the first passer authentication unit 15 succeeds in authenticating the passer-by (step S405: Yes), the first output unit 16 displays a message notifying that the passer-by has been successfully authenticated on the first display unit 17 (see FIGS. 2 and 5) provided in the first authentication device 10, so that the administrator can see it (step S407). On the other hand, if the first passer authentication unit 15 fails to authenticate the passer-by (step S405: No), the first output unit 16 displays a message notifying that the passer-by authentication has failed on the first display unit 17 (see FIGS. 2 and 5) provided in the first authentication device 10 (step S408).
- FIG. 5 is a diagram illustrating a display example of a passer-by authentication result in the first authentication device included in the information processing system according to the first embodiment.
- as shown in FIG. 5, the first output unit 16 displays, on the first display unit 17 of the first authentication device 10, a first screen D1 including a message 501 notifying the success of the passer-by authentication, an input image 502 that is a face image included in the first image G1, and a reference image 503 that is the face image from which the feature information read by the identification information reading unit 14 was acquired (in this embodiment, a face image based on the image data read as identification information from the medium M).
- when the passer-by authentication succeeds, the feature information acquired from the face image included in the first image G1 is similar to the feature information read by the identification information reading unit 14, so the input image 502 and the reference image 503 are similar images.
- when the passer-by authentication fails, the first output unit 16 displays, on the first display unit 17, a second screen D2 including a message 504 notifying the failure of the passer-by authentication, the input image 502, and the reference image 503.
- in this case, the feature information acquired from the face image included in the first image G1 is not similar to the feature information read by the identification information reading unit 14, so the input image 502 and the reference image 503 are not similar images.
- in the present embodiment, the first output unit 16 displays the first screen D1 or the second screen D2 on the first display unit 17 as an administrator's monitor to notify the administrator of the passer-by authentication result produced by the first passer authentication unit 15; however, the present invention is not limited to this.
- for example, the passer-by authentication result may be notified by emitting a sound from a speaker (not shown) provided in the first authentication device 10, or by transmitting the authentication result by wired or wireless communication to a higher-level device of the first authentication device 10 (e.g., a terminal operated by an administrator of the information processing system 1).
- FIG. 6 is a flowchart illustrating a flow of passer authentication processing in the second authentication device included in the information processing system according to the first embodiment.
- the second image acquisition unit 22 acquires the second image G2 obtained by imaging by the second imaging unit 21 (step S601).
- the second imaging unit 21 is configured by, for example, an ITV camera or the like, similarly to the first imaging unit 11, and is provided so as to be able to image the part of the body of a passer-by passing the second position P2 that is necessary for acquiring feature information (in this embodiment, the passer-by's face).
- the second imaging unit 21 generates image data by digitizing, with an A/D converter, the optical information obtained through the lens at a predetermined frame rate, and outputs the image data to the second image acquisition unit 22.
- the second face feature extraction unit 23 acquires the feature information of the passer's face image included in the second image G2 from the second image G2 acquired by the second image acquisition unit 22 (step S602).
- the second face feature extraction unit 23 acquires the feature information of the passer-by's face image included in the second image G2 in the same manner as the first face feature extraction unit 13 provided in the first authentication device 10.
- the second passer authentication unit 24 executes a second authentication process for authenticating the passer by using the feature information acquired by the second face feature extraction unit 23 and the feature information for authentication stored in the feature information storage unit 31. (Step S603).
- as a result, a passer-by passing the second position P2 can be authenticated without reading biological information from the medium M in his or her possession, and there is no need to switch the second authentication process to a simpler one in order to shorten the time required to authenticate passers-by at the second position P2. Therefore, even when many passers-by pass the second position P2, congestion and deterioration of the passer-by authentication accuracy can be prevented.
- the second passer authentication unit 24 determines that the passer-by has been successfully authenticated when the feature information acquired by the second face feature extraction unit 23 matches any of the authentication feature information stored in the feature information storage unit 31.
- here, the term “match” covers both the case where the feature information acquired by the second face feature extraction unit 23 and the authentication feature information stored in the feature information storage unit 31 completely match and the case where the two pieces of feature information are similar (in this embodiment, the case where the similarity between the two pieces of feature information exceeds a predetermined second threshold).
- specifically, the second passer authentication unit 24 calculates, in the same way as the first passer authentication unit 15, the similarity between the feature information acquired by the second face feature extraction unit 23 and each piece of authentication feature information stored in the feature information storage unit 31. The second passer authentication unit 24 then identifies, among the authentication feature information stored in the feature information storage unit 31, the piece with the highest similarity to the feature information acquired by the second face feature extraction unit 23.
- the second passer authentication unit 24 determines that the passer-by has been successfully authenticated when the similarity between the feature information acquired by the second face feature extraction unit 23 and the identified authentication feature information exceeds the predetermined second threshold. On the other hand, when that similarity is equal to or less than the second threshold, the second passer authentication unit 24 determines that authentication of the passer-by has failed.
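The matching procedure above (find the best-scoring stored feature, then apply the second threshold) can be sketched as follows. Cosine similarity and the dictionary-shaped gallery are illustrative assumptions, not the embodiment's actual data structures.

```python
import math

def similarity(a, b):
    """Cosine similarity used as the matching score (an assumption)."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.hypot(*a) * math.hypot(*b))

def authenticate_1_to_n(probe, gallery, second_threshold):
    """Identify the stored authentication feature with the highest
    similarity to the probe; succeed only if it exceeds the threshold."""
    if not gallery:
        return None
    best_id = max(gallery, key=lambda k: similarity(probe, gallery[k]))
    if similarity(probe, gallery[best_id]) > second_threshold:
        return best_id
    return None
```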
- the second passer authentication unit 24 performs passer authentication in the same manner as the first passer authentication unit 15 included in the first authentication device 10.
- the second threshold value may be the same value as the first threshold value used for passer-by authentication in the first authentication device 10, or may be a value different from the first threshold value.
- when the passer-by is successfully authenticated, the second passer authentication unit 24 deletes the authentication feature information that matched the feature information acquired by the second face feature extraction unit 23 from the feature information storage unit 31. At that time, the second passer authentication unit 24 also deletes from the feature information storage unit 31 the identification information, image data, time information, device information, and the like stored in association with that authentication feature information.
- this prevents the authentication feature information that matched the feature information acquired by the second face feature extraction unit 23 from being used in subsequent authentication of other passers-by, so the reliability of the authentication feature information stored in the feature information storage unit 31 can be maintained. Further, once a passer-by has been successfully authenticated, the matched authentication feature information is no longer compared against feature information acquired from second images G2 obtained by imaging other passers-by; unnecessary judgment processing is thus avoided, improving the passer-by authentication speed and saving resources.
- in addition, the second passer authentication unit 24 deletes from the feature information storage unit 31 any authentication feature information for which a predetermined time has elapsed since it was stored in the feature information storage unit 31.
- specifically, the second passer authentication unit 24 deletes from the feature information storage unit 31 any authentication feature information whose associated time information does not indicate a time between the current time and a time the predetermined time earlier.
- the second passer authentication unit 24 also deletes from the feature information storage unit 31 any authentication feature information that was stored by an authentication device other than the preset first authentication device 10.
- specifically, when the authentication device indicated by the device information stored in association with a piece of authentication feature information in the feature information storage unit 31 is not the preset first authentication device 10, the second passer authentication unit 24 deletes that authentication feature information from the feature information storage unit 31.
- this prevents the second authentication process from being executed using authentication feature information stored after successful authentication in an authentication device other than the preset first authentication device 10, so the reliability of the authentication feature information stored in the feature information storage unit 31 can be maintained.
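The two deletion rules above (expiry after a predetermined time, and rejection of records from devices other than the preset first authentication device) can be combined into a single pruning pass. The record fields below mirror the time and device information stored in association with each piece of authentication feature information; names and units are illustrative.

```python
from dataclasses import dataclass

@dataclass
class StoredFeature:
    feature_id: str
    stored_at: float   # epoch seconds when the first authentication succeeded
    device_id: str     # device information identifying the first authentication device

def prune(records, now, max_age, allowed_devices):
    """Keep only records that are recent enough and that were stored by
    one of the preset first authentication devices."""
    return [r for r in records
            if now - r.stored_at <= max_age and r.device_id in allowed_devices]
```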
- alternatively, for authentication feature information that has been stored in the feature information storage unit 31 for more than the predetermined time, the second passer authentication unit 24 may apply a higher second threshold to the similarity with the feature information acquired by the second face feature extraction unit 23 than it applies to authentication feature information stored for less than the predetermined time.
- when the second passer authentication unit 24 succeeds in authenticating the passer-by (step S604: Yes), the second output unit 25 displays a message notifying that the passer-by authentication has succeeded on the second display unit 26 (see FIGS. 1 and 7), so that the administrator can see it (step S605).
- when the second passer authentication unit 24 fails to authenticate the passer-by (step S604: No), the second output unit 25 displays a message notifying that the passer-by authentication has failed on the second display unit 26 (see FIGS. 1 and 7), so that the administrator can see it (step S606).
- FIG. 7 is a diagram illustrating a display example of a passer-by authentication result in the second authentication device included in the information processing system according to the first embodiment.
- as shown in FIG. 7, the second output unit 25 displays on the second display unit 26, as an administrator's monitor, a third screen D3 including a message 701 notifying the success of the passer-by authentication, an input image 702 that is a face image included in the second image G2, and reference images 703 to 705, which are the face images from which a predetermined number (three in the present embodiment) of pieces of authentication feature information, chosen in descending order of similarity to the feature information acquired by the second face feature extraction unit 23, were acquired (in this embodiment, face images based on image data included in the identification information stored in association with the authentication feature information).
- the second output unit 25 displays the reference image 703, from which the authentication feature information with the highest similarity to the feature information acquired by the second face feature extraction unit 23 was acquired, in a display mode different from that of the other reference images 704 and 705.
- in the present embodiment, the second output unit 25 displays the third screen D3 on the second display unit 26 to notify the administrator of the passer-by authentication result produced by the second passer authentication unit 24; however, the present invention is not limited to this.
- for example, the passer-by authentication result may be notified by emitting a sound from a speaker (not shown) provided in the second authentication device 20, or by transmitting the authentication result by wired or wireless communication to a higher-level device of the second authentication device 20 (e.g., a terminal operated by an administrator of the information processing system 1).
- further, when the second passer authentication unit 24 succeeds in authenticating the passer-by, the second output unit 25 may notify the passer-by authentication result by opening an entrance gate provided at the second position P2 or by unlocking a door key provided at the second position P2.
- when the passer-by authentication fails, the second output unit 25 may transmit the image data of the face image included in the second image G2 obtained by the imaging of the second imaging unit 21 to the host device as image data of an unauthorized passer-by's face image, to be stored in the host device.
- as described above, according to the first embodiment, when the passer-by is replaced after passing the first position P1 and before passing the second position P2, authentication of the passer-by in the second authentication process fails even if authentication by the first authentication process succeeded, so substitution of passers-by can be prevented.
- in addition, there is no need to search the database of a host device such as the server 30 for the feature information used in the first and second authentication processes, nor to switch the passer-by authentication to a simpler one in order to shorten the time required for the first and second authentication processes, so passer-by authentication can be performed efficiently while preventing a decrease in passer-by authentication accuracy.
- the second embodiment is an example in which, when biometric information can be read from a medium possessed by a passer-by passing the second position, a third authentication process that authenticates the passer-by using that biometric information can be executed instead of the second authentication process. In the following description, description of the parts that are the same as in the first embodiment is omitted.
- FIG. 8 is a block diagram showing a functional configuration of the information processing system according to the second embodiment.
- the second authentication device 70 includes, in addition to the second imaging unit 21, the second image acquisition unit 22, the second face feature extraction unit 23, and the second output unit 25, a second identification information reading unit 71 (an example of a second reading unit), configured by a card reader or the like and capable of reading feature information (an example of fifth biological information) from the medium M possessed by a passer-by passing the second position P2, and a second passer authentication unit 72 capable of executing, instead of the second authentication process, a third authentication process that uses the feature information acquired by the second face feature extraction unit 23 and the feature information read by the second identification information reading unit 71.
- FIG. 9 is a flowchart showing a flow of passer-by authentication processing in the second authentication apparatus included in the information processing system according to the second embodiment.
- the second identification information reading unit 71 determines whether or not reading of identification information is instructed by an operation unit (not illustrated) (for example, a numeric keypad or a touch panel) included in the second authentication device 70 (step S901). When the reading of the identification information is not instructed (step S901: No), the second authentication device 70 executes the same processing as steps S601 to S606 shown in FIG.
- when the reading of identification information is instructed (step S901: Yes), the second identification information reading unit 71 reads, in the same manner as the identification information reading unit 14 provided in the first authentication device 10, the feature information and the identification information enabling identification of the passer-by from the medium M possessed by the passer-by passing the second position P2 (step S902).
- next, the second image acquisition unit 22 acquires the second image G2 obtained by imaging by the second imaging unit 21, in the same manner as in step S601 shown in FIG. 6 (step S903). Further, the second face feature extraction unit 23 acquires the feature information of the passer-by's face image included in the second image G2 from the second image G2, in the same manner as in step S602 shown in FIG. 6 (step S904).
- then, instead of the second authentication process, the second passer authentication unit 72 executes a third authentication process that authenticates the passer-by using the feature information read by the second identification information reading unit 71 and the feature information acquired by the second face feature extraction unit 23 (step S905).
- specifically, the second passer authentication unit 72 calculates the similarity between the feature information acquired by the second face feature extraction unit 23 and the feature information read by the second identification information reading unit 71. The second passer authentication unit 72 then determines that the passer-by has been successfully authenticated when the calculated similarity exceeds a predetermined third threshold (step S604: Yes). On the other hand, when the calculated similarity is equal to or less than the third threshold, the second passer authentication unit 72 determines that the passer-by authentication has failed (step S604: No).
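The third authentication process reduces to a single 1:1 comparison against the third threshold. As before, cosine similarity is an illustrative stand-in for the similarity measure used by the embodiment.

```python
import math

def authenticate_1_to_1(captured, read_from_medium, third_threshold):
    """Third authentication process: succeed only when the similarity
    between the captured feature and the feature read from the medium M
    exceeds the (deliberately strict) third threshold."""
    dot = sum(x * y for x, y in zip(captured, read_from_medium))
    sim = dot / (math.hypot(*captured) * math.hypot(*read_from_medium))
    return sim > third_threshold
```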
- the third threshold is preferably higher than the first threshold used for authentication in the first authentication device 10, to increase the recognition accuracy of the passer-by.
- in this way, when the second identification information reading unit 71 is provided, the passer-by can be authenticated even if authentication feature information similar to the feature information acquired by the second face feature extraction unit 23 is not stored in the feature information storage unit 31. Thus, even when a special passer-by such as a celebrity is exempted from the first authentication process, passes the first position P1, and reaches the second position P2, authentication can be executed in the same manner as for ordinary passers-by. As in the first embodiment, when the passer-by is replaced after passing the first position P1 and before passing the second position P2, authentication of the passer-by in the third authentication process fails even if authentication by the first authentication process succeeded, so substitution of passers-by can be prevented.
- the program executed by the first authentication device 10 and the second authentication device 20 (70) of the present embodiment is provided by being incorporated in advance in a ROM (Read Only Memory) or the like.
- the program executed by the first authentication device 10 and the second authentication device 20 (70) of the present embodiment may instead be recorded, as a file in an installable or executable format, on a computer-readable recording medium such as a CD-ROM, flexible disk (FD), CD-R, or DVD (Digital Versatile Disk).
- the program executed by the first authentication device 10 and the second authentication device 20 (70) of the present embodiment may also be configured to be provided by being stored on a computer connected to a network such as the Internet and downloaded via the network.
- the program executed by the first authentication device 10 and the second authentication device 20 (70) of the present embodiment may be configured to be provided or distributed via a network such as the Internet.
- the program executed by the first authentication device 10 of the present embodiment has a module configuration including the above-described units (the first image acquisition unit 12, the first face feature extraction unit 13, the first passer authentication unit 15, and the first output unit 16); as actual hardware, a CPU (Central Processing Unit) reads the program from the ROM and executes it, whereby each of these units is loaded onto and generated in the main storage device.
- likewise, the program executed by the second authentication device 20 (70) of the present embodiment has a module configuration including the above-described units (the second image acquisition unit 22, the second face feature extraction unit 23, the second passer authentication unit 24 (72), and the second output unit 25); as actual hardware, the CPU reads the program from the ROM and executes it, whereby the second image acquisition unit 22, the second face feature extraction unit 23, the second passer authentication unit 24 (72), and the second output unit 25 are generated on the main storage device.
- the first authentication device 10 includes the first image acquisition unit 12, the first facial feature extraction unit 13, the first passer authentication unit 15, and the first output unit 16.
- the second authentication device 20 includes a second image acquisition unit 22, a second facial feature extraction unit 23, a second passer authentication unit 24 (72), and a second output unit 25.
- Any device in the information processing system 1 may have the above-described units.
- one device such as the server 30 may include all of the above-described units.
- each of the above-described units may also be divided and provided among any of three or more devices.
- FIG. 10 is a diagram showing a configuration of an immigration control system to which the person authentication system according to the third embodiment is applied.
- the immigration control system according to the present embodiment adjusts the imaging conditions for a person in the authentication process at the person's destination in accordance with the imaging conditions for that person in the authentication process at the time of departure, making it possible to improve the accuracy of the authentication process at the destination.
- in the present embodiment, an example is described in which the person authentication method and the person authentication system according to the present embodiment are applied to an immigration control system; however, they are equally applicable to systems that authenticate persons in public facilities, important facilities, office buildings, commercial facilities, and the like (for example, an entrance/exit management system or a video monitoring system).
- as shown in FIG. 10, the immigration control system according to the present embodiment is a system that efficiently performs person authentication processing in a facility such as an airport by combining 1:1 verification at the immigration counter DC or the immigration counter IC with 1:N verification at the boarding gate BG or the baggage receiving place BC.
- the 1:1 verification is an authentication process executed, before a person reaches the boarding gate BG or the baggage receiving place BC, using identification information that enables identification of the person read from a passport P (an example of a medium) possessed by the person passing the immigration counter DC or the immigration counter IC, and biometric information acquired from an image obtained by imaging that person.
- the 1:N verification is an authentication process executed using biometric information acquired from an image obtained by imaging a person passing the boarding gate BG or the baggage receiving place BC and a plurality of pieces of biometric information stored in advance.
- specifically, the immigration control system includes a boarding guide device 40, a first authentication device 41, a second authentication device 42, a third authentication device 43, a fourth authentication device 44, a first storage device 45, and a second storage device 46.
- the boarding guide device 40 is installed at a check-in counter serving as the first position P1; it reads destination information from an airline ticket T (an example of a medium) possessed by a person boarding an aircraft, reads identification information from a passport P possessed by that person, and acquires biometric information from an image obtained by imaging with a camera 101 provided so as to be able to image the person.
- the destination information is information indicating a destination of a person who passes the check-in counter as the first position P1.
- the boarding guide device 40 stores the destination information read from the airline ticket T, the identification information read from the passport P, and the biological information acquired from the image obtained by the imaging of the camera 101 in the first storage device 45 in association with one another.
- the first authentication device 41 is installed at the immigration examination counter DC; it reads identification information (an example of first identification information) from the passport P possessed by a person passing the immigration examination counter DC (an example of a first position), and acquires biological information (an example of first biological information) from an image (an example of a first image) obtained by imaging with a camera 111 (an example of a first imaging unit) provided so as to be able to image that person. Next, the first authentication device 41 executes a process of authenticating the person (hereinafter referred to as a first authentication process) using the identification information read from the passport P possessed by the person leaving the country and the biometric information acquired from the image obtained by the imaging of the camera 111.
- the first authentication device 41 associates the identification information and the biometric information used in the first authentication process with device identification information that enables the first authentication device 41 to be identified, and saves them in the first storage device 45. Further, when the first authentication process is successful, the first authentication device 41 saves the imaging conditions of the camera 111 in the second storage device 46 (an example of a storage device) in association with the identification information used for the first authentication process.
- as shown in FIG., a display unit for the manager displays, on the left side, the face image of the person photographed by the camera 111, the camera number indicating the place of photographing (which gate), and the photographing date and time, and displays, on the right side for comparison, the face image stored in the passport P possessed by the passing person as identification information, together with the person's name, gender, and age.
- as shown in FIGS. 12A to 12E, there are cases where the face image of a person stored in the passport P as identification information is inappropriate as a captured image:
- the face is hidden by sunglasses (FIG. 12A).
- the face is concealed by hair (FIG. 12B).
- the facial expression is changed by opening the mouth (FIG. 12C).
- the face is facing to the side instead of the front (FIG. 12D).
- the face is concealed by a hat (FIG. 12E). If any of the above is detected, the display unit 105, 116, 121 or the like instructs the passer-by to be photographed again, or a warning is issued on the display unit for the administrator.
- the illumination at the time of shooting may also be too bright or too dark, or compression noise may be included when the image is encoded. In these cases, a warning is shown on the display unit for the administrator, and the camera 101, 111, or 121 is adjusted so that the lighting becomes suitable for shooting.
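- the re-shooting and warning logic described above can be sketched as a simple quality gate. This is a minimal illustrative sketch: the threshold values and the way occlusion and head pose are quantified are assumptions, not values taken from this specification.

```python
def check_capture_quality(mean_brightness, face_visible_ratio, yaw_degrees):
    """Return a list of warnings for a captured face image.

    All thresholds are illustrative; the specification does not give values.
    - mean_brightness: average pixel intensity in [0, 255]
    - face_visible_ratio: fraction of facial landmarks detected
      (a proxy for occlusion by sunglasses, hair, or a hat)
    - yaw_degrees: estimated head rotation away from frontal
    """
    warnings = []
    if mean_brightness < 40:
        warnings.append("illumination too dark")
    elif mean_brightness > 215:
        warnings.append("illumination too bright")
    if face_visible_ratio < 0.8:
        warnings.append("face concealed (sunglasses, hair, or hat)")
    if abs(yaw_degrees) > 30:
        warnings.append("face not frontal")
    return warnings
```

A non-empty result would trigger the re-shoot instruction on the passer-by display or a warning on the administrator display.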
- the second authentication device 42 is installed in the boarding gate BG, and acquires biometric information from an image obtained by imaging of the camera 121 provided so that a person passing through the boarding gate BG can be imaged.
- the second authentication device 42 does not read identification information from the passport P possessed by a person passing through the boarding gate BG.
- the second authentication device 42 uses the acquired biological information and the biological information stored in the first storage device 45 to execute an authentication process for a person passing through the boarding gate BG.
- since the person can be authenticated without reading the identification information from the passport P, the authentication process can be executed without causing any stagnation even when a large number of persons pass through the boarding gate BG.
- when the biological information acquired from the image obtained by imaging a person passing through the boarding gate BG matches any of the biological information stored in the first storage device 45, the second authentication device 42 determines that the authentication process for the person has been successful, and permits passage through the boarding gate BG by opening the boarding gate BG or unlocking the door key installed on the boarding gate BG. On the other hand, when the acquired biological information does not match any of the biological information stored in the first storage device 45, the second authentication device 42 determines that the authentication of the person passing through the boarding gate BG has failed, and prohibits passage through the boarding gate BG by not opening the boarding gate BG or not unlocking the door key installed on the boarding gate BG.
- when no gate or door key for permitting or prohibiting passage is installed at the boarding gate BG, the second authentication device 42 permits or prohibits passage through the boarding gate BG by the following process.
- when the authentication process of a person passing through the boarding gate BG is successful, the second authentication device 42 stores information indicating that the boarding gate BG has been normally passed in association with the biological information, among the biological information stored in the first storage device 45, that matches the biological information acquired from the image obtained by imaging the person.
- when the second authentication device 42 fails in the authentication process of a person passing through the boarding gate BG, it displays warning information for notifying that the authentication has failed on the display unit included in the second authentication device 42, sends warning information for notifying an external terminal that the authentication has failed, or saves the image obtained by imaging the person who passed through the boarding gate BG.
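- the 1:N matching performed by the second authentication device 42 can be sketched as follows. Cosine similarity and the threshold value are illustrative assumptions; the specification does not fix a particular similarity measure for the gate.

```python
import numpy as np

def authenticate_at_gate(probe, stored_features, threshold=0.9):
    """1:N matching at the boarding gate BG: compare the probe feature
    vector against every feature vector saved by the first authentication
    device. Returns the index of the best match at or above the threshold,
    or None (authentication failed, gate stays closed)."""
    best_idx, best_sim = None, threshold
    for i, ref in enumerate(stored_features):
        # cosine similarity between probe and stored feature (illustrative)
        sim = float(np.dot(probe, ref) /
                    (np.linalg.norm(probe) * np.linalg.norm(ref)))
        if sim >= best_sim:
            best_idx, best_sim = i, sim
    return best_idx
```

A `None` result would correspond to the failure branch: warn on the display, notify an external terminal, or save the captured image.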
- the third authentication device 43 is installed at the immigration counter IC (an example of the second position), and reads the identification information (an example of the second identification information) from the passport P possessed by a person passing through the immigration counter IC. Next, the third authentication device 43 reads the imaging conditions stored in association with the identification information that matches the identification information read from the passport P from the second storage device 46. Next, the third authentication device 43 adjusts the imaging conditions of a camera 131 (an example of a second imaging unit) provided to be able to image a person passing through the immigration counter IC according to the read imaging conditions.
- the third authentication device 43 acquires biological information (an example of second biological information) from an image (an example of a second image) obtained by imaging with the camera 131 under the adjusted imaging conditions. The third authentication device 43 then uses the identification information read from the passport P and the biometric information acquired from the image obtained by the imaging of the camera 131 to execute an authentication process for the person (hereinafter referred to as the second authentication process). As a result, the second authentication process can be executed using biometric information obtained from an image captured by the camera 131 under imaging conditions similar to those of the camera 111 when the first authentication process at the immigration counter DC succeeded, so the accuracy of person authentication at the immigration counter IC can be improved. When the second authentication process is successful, the third authentication device 43 stores the identification information and the biological information used in the second authentication process in association with each other in the second storage device 46.
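- the hand-off of imaging conditions from departure to entry can be modeled with a small sketch. The dictionary model of the second storage device 46 and the condition keys (e.g. "exposure") are illustrative assumptions.

```python
# Second storage device 46 modeled as a dict keyed by identification info.
second_storage = {}

def save_imaging_conditions(identification_info, conditions):
    """Called by the first authentication device after a successful
    first authentication process: save camera 111's conditions keyed
    by the identification information used in that process."""
    second_storage[identification_info] = conditions

def adjust_camera_for(identification_info, camera_settings):
    """Called by the third authentication device: pull the conditions
    saved at departure and apply them to camera 131's settings in place.
    Unknown identification info leaves the settings unchanged."""
    stored = second_storage.get(identification_info)
    if stored is not None:
        camera_settings.update(stored)
    return camera_settings
```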
- the fourth authentication device 44 is installed in the baggage receiving place BC, and acquires biometric information from an image obtained by imaging with a camera 141 provided so as to be able to image a person passing through the baggage receiving place BC.
- the fourth authentication device 44 uses the acquired biometric information and the biometric information stored in the second storage device 46 to execute an authentication process for a person passing through the package receiving location BC.
- since the person can be authenticated without reading the identification information from the passport P in the final authentication at the time of entry, the authentication process can be executed without causing any delay even when a large number of persons pass through the package receiving place BC.
- when the biometric information acquired from the image obtained by imaging a person passing through the package receiving location BC matches any of the biometric information stored in the second storage device 46, the fourth authentication device 44 determines that the authentication process for the person has been successful and permits passage through the package receiving place BC. On the other hand, when the acquired biometric information does not match any of the biometric information stored in the second storage device 46, the fourth authentication device 44 determines that the authentication process for the person has failed and prohibits passage through the package receiving place BC.
- FIG. 13 is a diagram illustrating a functional configuration of the boarding guide device 40, the first authentication device 41, and the second authentication device 42 included in the immigration control system according to the present embodiment.
- the boarding guide device 40 includes an image acquisition unit 102 that acquires an image obtained by imaging with the camera 101, a facial feature extraction unit 103 that extracts feature information of a face image in the image acquired by the image acquisition unit 102, an identification information reading unit 104 that reads destination information from the airline ticket T possessed by a person passing through the check-in counter P1, and a display unit 105 capable of displaying the result of reading the destination information from the airline ticket T.
- the first authentication device 41 includes an image acquisition unit 112 that acquires an image obtained by imaging with the camera 111, a face feature extraction unit 113 that extracts (acquires) feature information (an example of biometric information) of a face image from the image acquired by the image acquisition unit 112, an identification information reading unit 114 that reads identification information from the passport P possessed by a person passing through the immigration counter DC, a person authentication unit 115 that executes the first authentication process for the person passing through the immigration examination counter DC using the feature information extracted by the face feature extraction unit 113 and the identification information read by the identification information reading unit 114 and, when the first authentication process is successful, stores the identification information and feature information used in the first authentication process in association with the device identification information in the first storage device 45, and a display unit 116 capable of displaying the results of processing by the person authentication unit 115.
- when the first authentication process for a person passing through the immigration counter DC is successful, the person authentication unit 115 stores, in association with the identification information used for the first authentication process, the imaging condition (an example of the first imaging condition) of the camera 111 that obtained the image used for the first authentication process in the second storage device 46 shown in FIG. 14, as the imaging condition (an example of the second imaging condition) of the camera 131 shown in FIG. 14.
- in the present embodiment, the person authentication unit 115 stores the imaging conditions of the camera 111 in the second storage device 46, that is, in a storage device accessible by the third authentication device 43 illustrated in FIG. 14. However, the present invention is not limited to this; the imaging conditions of the camera 111 may be stored in the first storage device 45.
- the second authentication device 42 includes an image acquisition unit 122 that acquires an image obtained by imaging with the camera 121, a face feature extraction unit 123 that extracts feature information of a face image in the image acquired by the image acquisition unit 122, a person search unit 124 that executes an authentication process for a person passing through the boarding gate BG using the feature information stored in the first storage device 45 and the feature information extracted by the face feature extraction unit 123, and a display unit 125 capable of displaying the person authentication result and the like by the person search unit 124.
- when the feature information extracted by the face feature extraction unit 123 matches any of the feature information stored in the first storage device 45, the person search unit 124 determines that the authentication process for the person passing through the boarding gate BG has been successful, and permits passage through the boarding gate BG. On the other hand, if the feature information extracted by the face feature extraction unit 123 does not match any of the feature information stored in the first storage device 45, the person search unit 124 determines that the authentication process for the person passing through the boarding gate BG has failed, and prohibits passage through the boarding gate BG.
- the second authentication device 42 may have a reading unit (not illustrated) that can read identification information from the passport P possessed by a person, so that even a person for whom the authentication process was not performed by the first authentication device 41 can pass through the boarding gate BG.
- in that case, the person search unit 124 of the second authentication device 42 executes the authentication process for the person passing through the boarding gate BG using the identification information read by the reading unit (not shown) included in the second authentication device 42 and the feature information extracted by the face feature extraction unit 123.
- FIG. 14 is a diagram illustrating a functional configuration of the third authentication device included in the immigration system according to the present embodiment.
- the third authentication device 43 includes an identification information reading unit 132 that reads identification information from the passport P possessed by a person passing through the immigration counter IC, an imaging condition adjusting unit 133 that reads out from the second storage device 46 the imaging conditions stored in association with identification information matching the identification information read by the identification information reading unit 132 and adjusts the imaging conditions of the camera 131 according to the read-out imaging conditions, an image acquisition unit 134 that acquires an image obtained by imaging with the camera 131 under the adjusted imaging conditions, a facial feature extraction unit 135 that extracts (acquires) feature information (an example of biological information) of a face image from the image acquired by the image acquisition unit 134, and a person authentication unit 136 that executes the second authentication process using the feature information extracted by the facial feature extraction unit 135 and the identification information read by the identification information reading unit 132 and, when the second authentication process is successful, stores the identification information and the feature information used in the second authentication process in association with each other in the second storage device 46.
- the imaging condition adjustment unit 133 adjusts the imaging conditions of the camera 131 so as to approach the imaging conditions read from the second storage device 46. As a result, fluctuations in the imaging conditions of the camera 131 relative to the imaging conditions of the camera 111 at the time the first authentication process at the immigration counter DC succeeded can be suppressed, and the robustness of the authentication accuracy can be improved.
- FIG. 15 is a diagram illustrating a functional configuration of the fourth authentication device included in the immigration system according to the present embodiment.
- the fourth authentication device 44 includes a first image acquisition unit 142 that acquires an image obtained by imaging with a camera 141 provided so as to be able to image a person passing through the package receiving location BC, a first face feature extraction unit 143 that extracts the feature information of a face image in the image acquired by the first image acquisition unit 142, a person search unit 144 that executes an authentication process for a person passing through the package receiving location BC using the feature information stored in the second storage device 46 and the feature information extracted by the first face feature extraction unit 143, and a first display unit 145 that can display the result of the authentication process by the person search unit 144.
- the fourth authentication device 44 further includes a second image acquisition unit 146 that acquires an image obtained by imaging with the camera 141 provided so as to be able to image a person receiving a package at the package receiving location BC, a second face feature extraction unit 147 that extracts the feature information of a face image in the image acquired by the second image acquisition unit 146, a tag information reading unit 148 that reads, from a baggage tag 400 possessed by a person passing through the baggage receiving place BC, tag information that can identify the owner of the tag, a person authentication unit 149 that authenticates the person who has received the package at the package receiving location BC using the feature information stored in the second storage device 46 in association with identification information matching the tag information read by the tag information reading unit 148 and the feature information extracted by the second face feature extraction unit 147, and a second display unit 150 that can display the results of the authentication by the person authentication unit 149.
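- the tag-based 1:1 check performed at the baggage receiving place BC can be sketched as follows. Cosine similarity and the threshold are illustrative assumptions, and the second storage device 46 is modeled as a dictionary from identification information to a feature vector.

```python
import numpy as np

def verify_baggage_owner(tag_info, probe_feature, second_storage,
                         threshold=0.9):
    """1:1 check at the baggage receiving place BC: the tag information
    identifies whose stored feature to compare against. Returns True if
    the person imaged at the claim matches the enrolled holder."""
    ref = second_storage.get(tag_info)
    if ref is None:
        return False  # no enrolment for this tag: authentication fails
    sim = float(np.dot(probe_feature, ref) /
                (np.linalg.norm(probe_feature) * np.linalg.norm(ref)))
    return sim >= threshold
```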
- FIG. 16 is a flowchart showing a flow of biometric information acquisition processing by the boarding guide apparatus 40 included in the immigration control system according to the present embodiment.
- the identification information reading unit 104 reads destination information indicating the destination of the person from the airline ticket T possessed by the person passing through the check-in counter P1 (step S1601). Further, the identification information reading unit 104 reads the feature information of the face image of the holder of the passport P as identification information from the IC chip embedded in the passport P possessed by the person passing through the check-in counter P1 (step S1602).
- the identification information reading unit 104 is configured by a card reader, for example, and can read destination information from the airline ticket T and read identification information from an IC chip embedded in the passport P.
- in the present embodiment, the identification information reading unit 104 reads the feature information of the face image of the holder of the passport P as the identification information, but the present invention is not limited to this. For example, an ID number for uniquely identifying the holder of the passport P may be read as identification information. Alternatively, biological information of the holder of the passport P (for example, a face image, fingerprint, or iris) or personal information of the holder of the passport P (for example, name, date of birth, sex, age, affiliation, or career) may be read as identification information.
- in the present embodiment, the identification information reading unit 104 reads various information such as destination information and identification information from the airline ticket T or the passport P, but the present invention is not limited to this. For example, the identification information reading unit 104 may be configured by an input unit capable of inputting various information such as destination information and identification information, such as a numeric keypad or a touch panel. In that case, a user of the boarding guide device 40 (for example, a person passing through the check-in counter P1) operates the identification information reading unit 104 functioning as an input unit to input the destination information and the identification information.
- in the present embodiment, the identification information reading unit 104 reads identification information (for example, identification information stored by an external device other than the immigration system) from the IC chip embedded in the passport P, but the present invention is not limited to this as long as the identification information can be read from the passport P. For example, the identification information reading unit 104 may read a face image printed on the passport P and use feature information extracted from the read face image as identification information. The method of extracting feature information from the face image is the same as the feature information extraction method by the face feature extraction unit 103 described later.
- the image acquisition unit 102 controls the camera 101 and images a person passing through the check-in counter P1. Then, the image acquisition unit 102 acquires an image obtained by imaging with the camera 101 (step S1603).
- the camera 101 is composed of, for example, a video camera or the like, and is provided so as to be able to image a person passing through the check-in counter P1. In the present embodiment, the camera 101 is provided so as to be able to capture the face of a person passing through the check-in counter P1.
- the camera 101 digitizes and outputs an image obtained by imaging a person passing through the check-in counter P1 by an A / D converter (not shown).
- the face feature extraction unit 103 detects a face image from the image acquired by the image acquisition unit 102, and extracts feature information of the detected face image (step S1604).
- specifically, the face feature extraction unit 103 first obtains a correlation value between the acquired image and a preset face detection template while moving the template within the image acquired by the image acquisition unit 102. Then, the face feature extraction unit 103 detects the region having the highest correlation value with the template as a face image in the acquired image.
- in the present embodiment, the face feature extraction unit 103 detects a face image from the image acquired by the image acquisition unit 102 using a preset face detection template, but the present invention is not limited to this.
- for example, a face image can be detected from the image acquired by the image acquisition unit 102 using a known eigenspace method, subspace method, or the like.
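- the template-based detection described above can be sketched with normalized cross-correlation. This is a minimal single-scale sketch; a practical detector would also use image pyramids and multiple templates.

```python
import numpy as np

def detect_face(image, template):
    """Slide a face-detection template over the image and return the
    top-left corner (y, x) of the window with the highest normalized
    correlation with the template, i.e. the detected face region."""
    th, tw = template.shape
    ih, iw = image.shape
    t = template - template.mean()  # zero-mean template
    best, best_pos = -np.inf, (0, 0)
    for y in range(ih - th + 1):
        for x in range(iw - tw + 1):
            w = image[y:y+th, x:x+tw]
            wc = w - w.mean()
            denom = np.linalg.norm(wc) * np.linalg.norm(t)
            corr = float((wc * t).sum() / denom) if denom > 0 else 0.0
            if corr > best:
                best, best_pos = corr, (y, x)
    return best_pos
```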
- the face feature extraction unit 103 detects, using the method described in Japanese Patent No. 5355446, for example, the face images of a plurality of persons passing through the check-in counter P1 from among the plurality of images acquired by the image acquisition unit 102.
- the face feature extraction unit 103 can also select a face image necessary for extracting feature information from the detected plurality of face images.
- the facial feature extraction unit 103 detects facial parts such as the eyes and nose from the detected face image using the method described in, for example, Japanese Patent No. 3279913. After that, the face feature extraction unit 103 digitizes and outputs feature information that can identify the person passing through the check-in counter P1 from the detected face parts. Specifically, the face feature extraction unit 103 cuts out a region having a predetermined size and a predetermined shape from the face image detected in the image acquired by the image acquisition unit 102, based on the positions of the detected face parts. Then, the face feature extraction unit 103 extracts the shading information of the clipped region as feature information. For example, using the subspace method described in Japanese Patent No. 4087953 or the like, the face feature extraction unit 103 extracts an m × n-dimensional feature vector from the m × n pixel region cut out from the face image as the feature information.
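- the cutting-out and vectorization step can be sketched as follows. The region size, anchoring at the detected part position, and the L2 normalization are illustrative choices.

```python
import numpy as np

def extract_feature(face_image, part_position, m=8, n=8):
    """Cut an m x n region out of the detected face image, anchored at a
    detected facial part position (y, x), and return its grey-level
    (shading) information as a normalized m*n-dimensional feature vector."""
    y, x = part_position
    region = face_image[y:y+m, x:x+n].astype(float)
    vec = region.flatten()          # m*n-dimensional shading vector
    norm = np.linalg.norm(vec)
    return vec / norm if norm > 0 else vec
```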
- the face feature extraction unit 103 stores the extracted feature information in the first storage device 45 in association with the destination information and the identification information read by the identification information reading unit 104 (step S1605).
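- step S1605 can be modeled with a simple record store. The list-of-dicts layout is an illustrative model of the first storage device 45, not a structure given by the specification.

```python
def save_enrollment(first_storage, destination_info, identification_info,
                    feature):
    """Store the extracted feature information in the first storage device
    in association with the destination information and identification
    information read at the check-in counter (step S1605)."""
    first_storage.append({
        "destination": destination_info,
        "identification": identification_info,
        "feature": feature,
    })
    return first_storage
```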
- FIG. 17 is a flowchart showing a flow of authentication processing by the first authentication device included in the immigration system according to the present embodiment.
- the identification information reading unit 114 reads the identification information of the holder of the passport P from the passport P possessed by the person passing through the immigration counter DC (step S1701).
- the method for reading the identification information from the passport P by the identification information reading unit 114 is the same as the method for reading the identification information from the passport P by the identification information reading unit 104 of the boarding guide device 40.
- the image acquisition unit 112 controls the camera 111 to image a person passing through the immigration examination counter DC. Then, the image acquisition unit 112 acquires an image obtained by imaging with the camera 111 (step S1702).
- the camera 111 is composed of, for example, a video camera or the like, and is provided so as to be able to image a person passing through the immigration examination counter DC. In the present embodiment, the camera 111 is provided so as to be able to image the face of a person who passes through the immigration counter DC.
- the camera 111 digitizes and outputs an image obtained by imaging a person passing through the immigration examination counter DC by an A / D converter (not shown).
- the face feature extraction unit 113 detects a face image from the image acquired by the image acquisition unit 112, and extracts feature information of the detected face image (step S1703).
- the method of extracting feature information from the image by the face feature extraction unit 113 is the same as the method of extracting feature information from the image by the face feature extraction unit 103 of the boarding guide device 40.
- the person authentication unit 115 executes the first authentication process (1:1 collation) using the identification information read by the identification information reading unit 114 and the feature information extracted by the face feature extraction unit 113 (step S1704).
- the person authentication unit 115 first calculates the similarity between the feature information read as the identification information by the identification information reading unit 114 and the feature information extracted by the face feature extraction unit 113.
- the person authentication unit 115 calculates a similarity index between the feature information read by the identification information reading unit 114 and the feature information extracted by the face feature extraction unit 113.
- the similarity index is the similarity between the partial space of feature information read by the identification information reading unit 114 and the partial space of feature information extracted by the face feature extraction unit 113.
- for example, the person authentication unit 115 calculates, by the subspace method or the composite similarity method described in Japanese Patent No. 4087953, the angle formed by the partial space of the feature information read by the identification information reading unit 114 and the partial space of the feature information extracted by the face feature extraction unit 113 as the similarity between the two partial spaces.
- alternatively, the person authentication unit 115 may obtain the similarity between the two pieces of feature information using the Euclidean distance or the Mahalanobis distance between the feature information read by the identification information reading unit 114 and the feature information extracted by the face feature extraction unit 113. In this case, the similarity decreases as the Euclidean distance or Mahalanobis distance between the two pieces of feature information increases.
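- both similarity measures can be sketched briefly: the angle between subspaces via singular values of the basis product, and a distance-derived similarity. The 1/(1+d) mapping for the distance case is an illustrative choice; the specification only requires that similarity decrease as distance grows.

```python
import numpy as np

def subspace_similarity(U1, U2):
    """Similarity between two subspaces given by orthonormal basis columns:
    the largest squared canonical correlation, i.e. the squared cosine of
    the smallest principal angle between the subspaces."""
    s = np.linalg.svd(U1.T @ U2, compute_uv=False)
    return float(s.max() ** 2)

def distance_similarity(f1, f2):
    """Similarity derived from Euclidean distance: decreases monotonically
    as the distance between the two feature vectors increases."""
    return 1.0 / (1.0 + float(np.linalg.norm(f1 - f2)))
```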
- the person authentication unit 115 determines that the first authentication process of the person passing through the departure examination counter DC has been successful when the calculated similarity is equal to or greater than a predetermined first threshold. In other words, if the calculated similarity is greater than or equal to the first threshold, the person authentication unit 115 determines that the person who passes the immigration examination counter DC is the person who owns the passport P (step S1705: Yes). ). When it is determined that the person who passes the immigration counter DC is the holder of the passport P, the person authentication unit 115 associates the identification information read by the identification information reading unit 114 with the imaging conditions of the camera 111. Are stored in the second storage device 46 as imaging conditions of the camera 131 (step S1706).
- the imaging conditions are the imaging conditions of the camera 111 when imaging a person who is determined to be the holder of the passport P.
- the imaging condition is information relating to an image obtained by imaging with the camera 111 (that is, an image obtained from the feature information used in the first authentication process).
- for example, the imaging conditions include at least one of: the face image included in the image obtained by imaging with the camera 111, the height of the person based on the image, the illumination condition based on the image (in other words, the illumination condition when the image was obtained), and the imaging range of the camera 111.
- when determining that the person passing through the immigration examination counter DC is the holder of the passport P, the person authentication unit 115 stores the identification information read by the identification information reading unit 114 and the feature information extracted by the face feature extraction unit 113 in the first storage device 45 in association with the device identification information of the first authentication device 41.
- this makes it possible to specify which first authentication device 41 used each piece of identification information and feature information stored in the first storage device 45 for its first authentication process. Therefore, when the second authentication device 42 executes an authentication process using the identification information and feature information stored in the first storage device 45, the authentication process can be executed using only the identification information and feature information stored by a predetermined first authentication device 41.
- the person authentication unit 115 stores the feature information extracted by the face feature extraction unit 113 in the first storage device 45 in association with the feature information read as identification information by the identification information reading unit 114.
- alternatively, the person authentication unit 115 may store the feature information extracted by the face feature extraction unit 113 in the first storage device 45 in association with information such as the time at which the first authentication process was executed, the face image stored in the passport P, and the name, date of birth, gender, age, and height of the holder.
- the person authentication unit 115 stores the feature vector, subspace, correlation matrix, and the like of the feature information extracted by the face feature extraction unit 113 in the first storage device 45 in association with the identification information read by the identification information reading unit 114. May be saved.
- the display unit 116 then displays information for notifying that the first authentication process by the person authentication unit 115 has been successful (step S1707).
- When the calculated similarity is less than the first threshold, the person authentication unit 115 determines that the first authentication process for the person passing through the departure examination counter DC has failed, that is, that the person is not the holder of the passport P (step S1705: No). In that case, the display unit 116 displays information notifying that the first authentication process by the person authentication unit 115 has failed (step S1708).
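The 1:1 verification against the first threshold in steps S1705 to S1708 can be sketched as follows; the cosine similarity measure and the threshold value 0.8 are assumptions, since the patent only requires some similarity calculation and "a first threshold".

```python
import math

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b)))

FIRST_THRESHOLD = 0.8  # assumed value for "the first threshold"

def first_authentication(passport_features, live_features):
    # Success: the person passing the counter is judged to be the passport
    # holder when the similarity reaches the first threshold (step S1705).
    return cosine_similarity(passport_features, live_features) >= FIRST_THRESHOLD
```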
- FIG. 18 is a flowchart showing a flow of authentication processing by the second authentication device included in the immigration system according to the present embodiment.
- the image acquisition unit 122 controls the camera 121 to image a person passing through the boarding gate BG. Then, the image acquisition unit 122 acquires an image obtained by imaging with the camera 121 (step S1801).
- the camera 121 is composed of, for example, a video camera or the like, and is provided so as to be able to image a person passing through the boarding gate BG. In the present embodiment, the camera 121 is provided so as to be able to capture the face of a person passing through the boarding gate BG.
- the camera 121 digitizes and outputs an image obtained by imaging a person passing through the boarding gate BG by an A / D converter (not shown).
- the face feature extraction unit 123 detects a face image from the image acquired by the image acquisition unit 122, and extracts feature information of the detected face image (step S1802).
- the feature information extraction method from the image by the face feature extraction unit 123 is the same as the feature information extraction method from the image by the face feature extraction unit 103 included in the boarding guide device 40.
- The person search unit 124 executes an authentication process (1:N verification) for a person passing through the boarding gate BG using the feature information extracted by the face feature extraction unit 123 and the feature information stored in the first storage device 45 (step S1803). At that time, the person search unit 124 is prohibited from executing the authentication process with feature information stored in association with device identification information other than the predetermined device identification information. Because the authentication process thus uses only feature information that was used in the first authentication process of a predetermined first authentication device 41, the reliability of the authentication process for persons passing through the boarding gate BG can be improved.
- the person search unit 124 calculates the similarity between each piece of feature information stored in the first storage device 45 and the feature information extracted by the face feature extraction unit 123.
- the similarity calculation method by the person search unit 124 is the same as the similarity calculation method by the person authentication unit 115 included in the first authentication device 41.
- the person search unit 124 selects feature information having the highest degree of similarity with the feature information extracted by the face feature extraction unit 123 from the feature information stored in the first storage device 45.
- If the selected similarity is equal to or greater than a predetermined threshold, the person search unit 124 determines that the authentication process for the person passing through the boarding gate BG has succeeded (in other words, that the feature information extracted by the face feature extraction unit 123 matches one of the pieces of feature information stored in the first storage device 45), and permits passage through the boarding gate BG (step S1804: Yes).
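The 1:N search at the boarding gate, restricted to records stored by the predetermined first authentication device, might be sketched as follows; the cosine similarity, the threshold, and the gallery layout are illustrative assumptions.

```python
import math

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b)))

def search_1_to_n(gallery, probe, allowed_device, threshold=0.8):
    """gallery maps identification info -> (feature vector, device id)."""
    best_id, best_sim = None, -1.0
    for ident, (feat, dev) in gallery.items():
        if dev != allowed_device:
            continue  # features stored by other devices must not be matched
        sim = cosine_similarity(feat, probe)
        if sim > best_sim:
            best_id, best_sim = ident, sim
    # Passage is permitted only when the best match reaches the threshold.
    return best_id if best_sim >= threshold else None
```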
- the display unit 125 displays information for notifying that the authentication process by the person search unit 124 is successful (step S1805).
- When the authentication process succeeds, the person search unit 124 deletes from the first storage device 45 the various pieces of information (for example, device identification information, feature information, and destination information) stored in association with the identification information of the person who was successfully authenticated.
- The person search unit 124 also deletes various information stored in association with identification information that has been held in the first storage device 45 for longer than a predetermined time.
- In this way, the reliability of the feature information stored in the first storage device 45 can be maintained.
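The deletion policy just described (remove records after successful authentication, and expire records held longer than a predetermined time) could be sketched as follows; the retention period and record layout are assumed values.

```python
RETENTION_SECONDS = 6 * 3600  # assumed "predetermined time"

def purge(records, now):
    # Delete records whose holder already passed the gate, and records that
    # have been stored longer than the predetermined time.
    expired = [k for k, v in records.items()
               if v.get("passed") or now - v["timestamp"] > RETENTION_SECONDS]
    for k in expired:
        del records[k]
    return len(expired)
```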
- If the selected similarity is less than the predetermined threshold, the person search unit 124 determines that the authentication process for the person passing through the boarding gate BG has failed and prohibits passage through the boarding gate BG (step S1804: No). In that case, the display unit 125 displays information notifying that the authentication process by the person search unit 124 has failed (step S1806).
- The person search unit 124 can also execute a process that removes, from the plurality of pieces of feature information stored in the first storage device 45, information that is unnecessary for discriminating between them.
- For example, by projecting or converting the feature vectors stored as feature information in the first storage device 45 into a subspace using the constrained mutual subspace method described in Japanese Patent No. 4087953, the person search unit 124 can improve the discrimination accuracy between the stored pieces of feature information.
- Alternatively, the person search unit 124 may execute the authentication process for a person passing through the boarding gate BG using the identification information stored in the first storage device 45 and the feature information extracted by the face feature extraction unit 123. In that case, as with the first authentication device 41, the authentication process is executed using the feature information read from the passport P as identification information, so its reliability can be secured. However, since the feature information read from the passport P as identification information is generally older than the point in time at which the authentication process is executed, the authentication process is easily affected by aging of the person being authenticated.
- By instead executing the authentication process, as described above, with the feature information stored in the first storage device 45 (the feature information extracted by the face feature extraction unit 113 of the first authentication device 41) and the feature information extracted by the face feature extraction unit 123, the person search unit 124 can reduce the influence of aging of the person being authenticated and improve the accuracy of person authentication.
- The person search unit 124 may also determine that the authentication process for a person passing through the boarding gate BG has succeeded when at least one of the identification information and the feature information stored in the first storage device 45 matches the feature information extracted by the face feature extraction unit 123. This reduces failures of the authentication process that uses the information stored in the first storage device 45.
- The person search unit 124 can also detect the congestion level of the boarding gate BG based on the image captured by the camera 121 and change the processing speed of the authentication process according to the detected congestion level (for example, increasing the processing speed when the detected congestion level is higher than a predetermined value), thereby controlling the number of persons passing through the boarding gate BG per unit time.
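A minimal sketch of this congestion-based control, with an assumed congestion metric and an assumed threshold:

```python
def processing_mode(congestion_level, limit=0.7):
    # When the detected congestion level exceeds the predetermined value,
    # switch to a faster (e.g. lighter-weight) authentication mode so that
    # more people can pass the boarding gate BG per unit time.
    return "fast" if congestion_level > limit else "normal"
```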
- When the feature information extracted by the face feature extraction unit 123 matches predetermined feature information (for example, the feature information of a first-class passenger), the person search unit 124 may cause the display unit 125 to display a notification that the person having the predetermined feature information boards preferentially.
- When the authentication process for a person passing through the boarding gate BG succeeds, the person search unit 124 also reads the destination information stored in the first storage device 45 in association with the identification information that matches the feature information extracted by the face feature extraction unit 123. If the destination indicated by the read destination information does not match the destination of the airplane boarded by the person who passed through the boarding gate BG, the person search unit 124 can cause the display unit 125 to display a notification that there is a passenger with a different destination.
- When the authentication process for a person passing through the boarding gate BG fails, the person search unit 124 may also cause the display unit 125 to display information instructing the person to board using the airline ticket T in his or her possession.
- FIG. 19 is a flowchart showing a flow of authentication processing when the second authentication device included in the immigration control system according to the present embodiment includes a reading unit that can read identification information from a passport.
- The image acquisition unit 122 determines whether an input unit (not shown) of the second authentication device 42 has been operated to input a reading instruction that instructs reading of identification information from the passport P possessed by a person passing through the boarding gate BG (step S1901). When no reading instruction has been input (step S1901: No), the second authentication device 42 executes the same processing as steps S1801 to S1806 shown in FIG. 18.
- When a reading instruction is input (step S1901: Yes), the reading unit (not shown) of the second authentication device 42 reads the identification information of the holder of the passport P from the passport P possessed by the person passing through the boarding gate BG (step S1902).
- a method for reading identification information from the passport P by a reading unit (not shown) included in the second authentication device 42 is the same as a method for reading identification information from the passport P by the identification information reading unit 104 included in the boarding guide device 40.
- The second authentication device 42 then executes the same processing as steps S1801 to S1802 shown in FIG. 18.
- The person search unit 124 executes the authentication process for the person passing through the boarding gate BG using the identification information read from the passport P by the reading unit (not shown) of the second authentication device 42 and the feature information extracted by the face feature extraction unit 123 (step S1903).
- the person search unit 124 executes an authentication process for a person passing through the boarding gate BG in the same manner as the authentication process performed by the person authentication unit 115 included in the first authentication device 41.
- Since the authentication process in this case is not performed via the first authentication device 41, the person search unit 124 may set the predetermined second threshold, which is compared with the similarity between the feature information read from the passport P and the feature information extracted by the face feature extraction unit 123, higher than the first threshold used in the authentication process of the first authentication device 41.
- the person search unit 124 may set the second threshold value lower than the first threshold value when priority is given to the efficiency of the authentication process of the person passing through the boarding gate BG.
- FIG. 20 is a flowchart showing a flow of authentication processing by the third authentication device 43 included in the immigration system according to the present embodiment.
- the identification information reading unit 132 reads the identification information of the holder of the passport P from the passport P possessed by a person passing through the immigration counter IC (step S2001).
- the method for reading the identification information from the passport P by the identification information reading unit 132 is the same as the method for reading the identification information from the passport P by the identification information reading unit 104 of the boarding guide device 40.
- the imaging condition adjustment unit 133 reads the imaging conditions stored in association with the identification information that matches the identification information read by the identification information reading unit 132 from the second storage device 46. Then, the imaging condition adjustment unit 133 adjusts the imaging conditions of the camera 131 according to the read imaging conditions (step S2002).
- As a result, the second authentication process can be executed using biometric information acquired from an image captured by the camera 131 under imaging conditions similar to those of the camera 111 at the time the first authentication process at the departure examination counter DC succeeded, so the accuracy of person authentication at the immigration counter IC can be improved.
- When the read imaging conditions include a height, the imaging condition adjustment unit 133 adjusts the camera 131 according to the height so that the face of the person passing through the immigration counter IC is imaged from the front. In the present embodiment, when the read imaging conditions include an illumination condition, the imaging condition adjustment unit 133 also adjusts the light source that illuminates the immigration counter IC according to that illumination condition.
- When the read imaging conditions include a face image, the imaging condition adjustment unit 133 displays on the display unit 137 a first instruction for instructing a facial expression close to that face image. When the read imaging conditions include a face image and the face image is a face image with glasses, the imaging condition adjustment unit 133 displays on the display unit 137 a second instruction for instructing the wearing of glasses. In addition, when the read imaging conditions include a face image, the imaging condition adjustment unit 133 can display on the display unit 137 a third instruction for instructing a hairstyle close to that face image.
- As a result, the second authentication process can be executed using biological information acquired from an image captured by the camera 131 under imaging conditions closer to those of the camera 111 at the time the first authentication process at the departure examination counter DC succeeded, so the accuracy of person authentication at the immigration counter IC can be further improved.
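The adjustments driven by the stored imaging conditions (height, illumination, and the three face-image instructions) might be sketched as follows; every field name and action string here is an assumption for illustration.

```python
def plan_adjustments(conditions):
    actions = []
    if "height" in conditions:
        # Aim the camera so the face is imaged from the front at this height.
        actions.append(("camera_height", conditions["height"]))
    if "illumination" in conditions:
        # Match the light source to the illumination at first authentication.
        actions.append(("light_source", conditions["illumination"]))
    face = conditions.get("face_image")
    if face is not None:
        actions.append(("display", "match facial expression"))  # first instruction
        if face.get("glasses"):
            actions.append(("display", "wear glasses"))         # second instruction
        actions.append(("display", "match hairstyle"))          # third instruction
    return actions
```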
- the image acquisition unit 134 controls the camera 131 to image a person passing through the immigration counter IC. Then, the image acquisition unit 134 acquires an image obtained by imaging with the camera 131 (step S2003).
- the camera 131 is composed of, for example, a video camera or the like, and is provided so as to be able to image a person passing through the immigration counter IC. In the present embodiment, the camera 131 is provided so as to be able to capture the face of a person passing through the immigration counter IC.
- the camera 131 digitizes and outputs an image obtained by imaging a person passing through the immigration counter IC by an A / D converter (not shown).
- the face feature extraction unit 135 detects a face image from the image acquired by the image acquisition unit 134, and extracts feature information of the detected face image (step S2004).
- the feature information extraction method from the image by the face feature extraction unit 135 is the same as the feature information extraction method from the image by the face feature extraction unit 103 included in the boarding guide device 40.
- The person authentication unit 136 executes a second authentication process (1:1 verification) for the person passing through the immigration counter IC using the identification information read by the identification information reading unit 132 and the feature information extracted by the face feature extraction unit 135 (step S2005).
- the person authentication method by the person authentication unit 136 is the same as the person authentication method by the person authentication unit 115 of the first authentication device 41.
- When the read imaging conditions include a height, the person authentication unit 136 may determine that the second authentication process has succeeded when the similarity between the feature information read as identification information by the identification information reading unit 132 and the feature information extracted by the face feature extraction unit 135 is equal to or greater than the first threshold, and the height included in the read imaging conditions matches the height of the person determined from the image acquired by the image acquisition unit 134. Because the second authentication process then takes the person's height into account in addition to the feature information, the accuracy of person authentication by the second authentication process can be improved.
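This combined decision (feature similarity against the first threshold plus a height check) could be sketched as follows; the height tolerance is an assumed parameter, since the text only says the heights must "match".

```python
HEIGHT_TOLERANCE_CM = 3  # assumed tolerance for "the height matches"

def second_authentication(similarity, first_threshold, stored_height_cm, observed_height_cm):
    # Succeed only when the feature similarity reaches the first threshold
    # AND the height recorded in the imaging conditions agrees with the
    # height estimated from the acquired image.
    return (similarity >= first_threshold and
            abs(stored_height_cm - observed_height_cm) <= HEIGHT_TOLERANCE_CM)
```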
- When the person authentication unit 136 determines that the second authentication process has succeeded, that is, that the person passing through the immigration counter IC is the holder of the passport P (step S2006: Yes), it stores the feature information extracted by the face feature extraction unit 135 in the second storage device 46 in association with the identification information read by the identification information reading unit 132 (step S2007). The display unit 137 then displays information notifying that the second authentication process by the person authentication unit 136 has succeeded (step S2008).
- When it is determined that the person passing through the immigration counter IC is not the holder of the passport P (step S2006: No), the display unit 137 displays information notifying that the second authentication process by the person authentication unit 136 has failed (step S2009).
- FIG. 21 is a flowchart showing a flow of authentication processing by the fourth authentication device 44 included in the immigration system according to the present embodiment.
- The first image acquisition unit 142 and the second image acquisition unit 146 determine whether an input unit (not shown) of the fourth authentication device 44 has been operated to input a reading instruction that instructs reading of tag information from the package tag 400 possessed by a person passing through the package receiving location BC (step S2101).
- the first image acquisition unit 142 controls the camera 141 to image a person passing through the package receiving location BC. Then, the first image acquisition unit 142 acquires an image obtained by imaging with the camera 141 (step S2102).
- the camera 141 is composed of, for example, a video camera or the like, and is provided so as to be able to image a person passing through the package receiving place BC. In the present embodiment, the camera 141 is provided so as to be able to image the face of a person passing through the package receiving place BC.
- the camera 141 digitizes and outputs an image obtained by imaging a person passing through the package receiving place BC by an A / D converter (not shown).
- the first face feature extraction unit 143 detects a face image from the image acquired by the first image acquisition unit 142, and extracts feature information of the detected face image (step S2103).
- the feature information extraction method from the image by the first face feature extraction unit 143 is the same as the feature information extraction method from the image by the face feature extraction unit 103 included in the boarding guide device 40.
- The person search unit 144 executes an authentication process (1:N verification) for the person passing through the package receiving location BC using the feature information extracted by the first face feature extraction unit 143 and the feature information stored in the second storage device 46 (step S2104).
- the authentication process by the person search unit 144 is the same as the authentication process by the person search unit 124 included in the second authentication device 42.
- When the feature information extracted by the first face feature extraction unit 143 matches any of the feature information stored in the second storage device 46, the person search unit 144 determines that the authentication process for the person passing through the package receiving location BC has succeeded and permits passage through the package receiving location BC (step S2105: Yes).
- the first display unit 145 displays information for notifying that the authentication process by the person search unit 144 is successful (step S2106).
- When the authentication process succeeds, the person search unit 144 deletes from the second storage device 46 the various pieces of information (for example, feature information) stored in association with the identification information of the person who was successfully authenticated. The person search unit 144 also deletes various information stored in association with identification information that has been held in the second storage device 46 for longer than a predetermined time.
- This reduces the number of pieces of feature information in the second storage device 46 against which the similarity to the feature information extracted in step S2103 must be calculated, so that needless calculation can be omitted, the processing speed of the authentication process can be improved, and resources can be saved. It also maintains the reliability of the feature information stored in the second storage device 46.
- If the authentication process fails, the person search unit 144 determines that the authentication process for the person passing through the package receiving location BC has failed and prohibits passage through the package receiving location BC (step S2105: No). In that case, the first display unit 145 displays information notifying that the authentication process by the person search unit 144 has failed (step S2107).
- the tag information reading unit 148 reads the tag information from the package tag 400 possessed by the person passing through the package receiving location BC (step S2108).
- The method for reading the tag information from the package tag 400 by the tag information reading unit 148 is the same as the method for reading the identification information from the passport P by the identification information reading unit 104 of the boarding guide device 40.
- The second image acquisition unit 146 controls the camera 141 to image a person passing through the package receiving location BC, and acquires the image obtained by imaging with the camera 141 (step S2109).
- The second face feature extraction unit 147 detects a face image from the image acquired by the second image acquisition unit 146, and extracts feature information of the detected face image (step S2110).
- the method of extracting feature information from the image by the second face feature extracting unit 147 is the same as the method of extracting feature information from the image by the face feature extracting unit 103 included in the boarding guide device 40.
- the person authentication unit 149 uses the tag information read by the tag information reading unit 148 and the feature information extracted by the second face feature extraction unit 147 to execute an authentication process for a person passing through the package receiving location BC (step S2111). ).
- the person authentication method by the person authentication unit 149 is the same as the person authentication method by the person authentication unit 115 of the first authentication device 41.
- When the authentication process succeeds, the person authentication unit 149 determines that the authentication process for the person passing through the package receiving location BC has succeeded (step S2105: Yes), and the person's package is shipped (step S2106). At that time, the person authentication unit 149 can also improve the efficiency of package collection by shipping the packages of persons passing through the package receiving location BC in the order in which their authentication processes succeeded. The person authentication unit 149 can further cause the second display unit 150 to display, based on the result of the authentication process, the location of the person's package and the waiting time until the package is shipped.
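Shipping packages in first-authenticated, first-served order is essentially a FIFO queue; a minimal sketch follows, with class and method names assumed for illustration.

```python
from collections import deque

class ShippingQueue:
    """Ship baggage in the order in which authentication succeeded."""

    def __init__(self):
        self._queue = deque()

    def on_authenticated(self, person_id):
        self._queue.append(person_id)

    def next_to_ship(self):
        # The earliest-authenticated person's package is shipped first.
        return self._queue.popleft() if self._queue else None
```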
- When the similarity between the tag information read by the tag information reading unit 148 and the feature information extracted by the second face feature extraction unit 147 is less than the first threshold, the person authentication unit 149 determines that the authentication process for the person passing through the package receiving location BC has failed and prohibits shipment of the package to that person. In that case, the second display unit 150 displays information notifying that the authentication process by the person authentication unit 149 has failed (step S2107).
- In the embodiments described above, the passer-by had to stop to be photographed by the camera. In the fourth embodiment, however, photographing and verification by the second imaging unit 21 installed in, for example, the second authentication device 20 can be performed even while the person is walking, as shown in FIG.
- The operation will be described along the flowchart shown in FIG. 23.
- First, the image recorded in the IC chip of the passport P is read (step S2301).
- The second imaging unit 21 captures the face of the person walking past it (step S2302).
- a face is detected and tracked from the obtained captured moving image (step S2303).
- When the verification of the tracked face against the image read from the passport P succeeds, the gate GT is opened and the pedestrian is allowed to pass (step S2305). When the verification fails, the gate GT is closed (step S2306) and a warning is issued to the pedestrian (step S2307).
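The gate decision in steps S2305 to S2307 reduces to a threshold branch; a sketch with an assumed similarity threshold:

```python
def control_gate(similarity, threshold=0.8):
    # Verification success: open the gate GT and let the pedestrian pass
    # (step S2305). Failure: keep the gate closed (step S2306) and warn
    # the pedestrian (step S2307).
    if similarity >= threshold:
        return ("open", None)
    return ("closed", "warning")
```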
- The programs executed by the boarding guide device 40, the first authentication device 41, the second authentication device 42, the third authentication device 43, and the fourth authentication device 44 of the present embodiment are provided embedded in advance in a ROM (Read Only Memory) or the like.
- The programs executed by the boarding guide device 40, the first authentication device 41, the second authentication device 42, the third authentication device 43, and the fourth authentication device 44 of the present embodiment may instead be provided recorded on a computer-readable recording medium such as a CD-ROM, flexible disk (FD), CD-R, or DVD (Digital Versatile Disk) as files in an installable or executable format.
- The programs executed by the boarding guide device 40, the first authentication device 41, the second authentication device 42, the third authentication device 43, and the fourth authentication device 44 of the present embodiment may also be stored on a computer connected to a network such as the Internet and provided by being downloaded via the network, or may be provided or distributed via a network such as the Internet.
- the program executed by the boarding guide device 40 has a module configuration including the above-described units (the image acquisition unit 102, the face feature extraction unit 103, and the identification information reading unit 104).
- As actual hardware, the CPU (Central Processing Unit) reads the program from the ROM and executes it, whereby the above-described units are loaded onto the main storage device, and the image acquisition unit 102, the face feature extraction unit 103, and the identification information reading unit 104 are generated on the main storage device.
- the program executed by the first authentication device 41 of the present embodiment has a module configuration including the above-described units (image acquisition unit 112, face feature extraction unit 113, identification information reading unit 114, person authentication unit 115).
- the CPU reads the program from the ROM and executes it, so that the above-described units are loaded on the main storage device, and the image acquisition unit 112, the face feature extraction unit 113, the identification information reading unit 114, the person authentication The unit 115 is generated on the main storage device.
- the program executed by the second authentication device 42 has a module configuration including the above-described units (the image acquisition unit 122, the face feature extraction unit 123, and the person search unit 124).
- When the CPU reads the program from the ROM and executes it, the above-described units are loaded onto the main storage device, and the image acquisition unit 122, the face feature extraction unit 123, and the person search unit 124 are generated on the main storage device.
- The program executed by the third authentication device 43 of the present embodiment has a module configuration including the above-described units (the imaging condition adjustment unit 133, the image acquisition unit 134, the face feature extraction unit 135, the identification information reading unit 132, and the person authentication unit 136).
- the CPU reads the program from the ROM and executes it, so that the above-described units are loaded on the main storage device, and the imaging condition adjustment unit 133, the image acquisition unit 134, the facial features An extraction unit 135, an identification information reading unit 132, and a person authentication unit 136 are generated on the main storage device.
- The program executed by the fourth authentication device 44 of the present embodiment has a module configuration including the above-described units (the first image acquisition unit 142, the first face feature extraction unit 143, the person search unit 144, the second image acquisition unit 146, the second face feature extraction unit 147, the tag information reading unit 148, and the person authentication unit 149).
- When the CPU reads the program from the ROM and executes it, the units are loaded onto the main storage device, and the first image acquisition unit 142, the first face feature extraction unit 143, the person search unit 144, the second image acquisition unit 146, the second face feature extraction unit 147, the tag information reading unit 148, and the person authentication unit 149 are generated on the main storage device.
- the first imaging condition of the first imaging unit that obtained the first image used for the first authentication process of a person passing through the first position is stored in association with the first identification information of the person passing through the first position.
- An information processing method comprising:
- the similarity between the face image feature information serving as the second identification information and the face image feature information included in the second image is greater than or equal to a predetermined threshold.
- the imaging range of the second imaging unit is adjusted so that the face of a person passing through the second position is imaged from the front according to the first height.
- when the first imaging condition includes a face image included in the first image and the face image is a face image with glasses, a second instruction instructing the person to wear glasses is displayed on the display unit (the information processing method of [3]).
- a first authentication unit that executes the first authentication process of the person, and a storage unit that stores the first imaging condition of the first imaging unit in association with the first identification information when the first authentication process succeeds;
- an adjustment unit that reads, from the storage unit, the first imaging condition stored in association with the first identification information that matches the second identification information read from a medium carried by a person passing through a second position different from the first position, and adjusts, according to the read first imaging condition, a second imaging condition of a second imaging unit capable of imaging the person passing through the second position; and
- a second authentication unit that executes a second authentication process for the person passing through the second position by using the second identification information and the biometric information acquired from the second image obtained by imaging with the second imaging unit.
- An information processing system comprising:
Landscapes
- Engineering & Computer Science (AREA)
- Computer Security & Cryptography (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Computer Hardware Design (AREA)
- General Engineering & Computer Science (AREA)
- Human Computer Interaction (AREA)
- Health & Medical Sciences (AREA)
- General Health & Medical Sciences (AREA)
- Software Systems (AREA)
- Biomedical Technology (AREA)
- Signal Processing (AREA)
- Computer Networks & Wireless Communication (AREA)
- Computing Systems (AREA)
- Oral & Maxillofacial Surgery (AREA)
- Multimedia (AREA)
- Collating Specific Patterns (AREA)
- Lock And Its Accessories (AREA)
- Management, Administration, Business Operations System, And Electronic Commerce (AREA)
Abstract
Description
FIG. 1 shows the configuration of an information processing system to which the information processing method according to the first embodiment is applied. As shown in FIG. 1, the information processing system 1 according to this embodiment includes: a first authentication device 10 that executes a first authentication process for authenticating a passerby by using biometric information (an example of first biometric information) read from a medium M carried by the passerby passing through a first position P1 and biometric information (an example of second biometric information) acquired from a first image G1 obtained by imaging the passerby passing through the first position P1 with a first imaging unit 11; a server 30 having a feature information storage unit 31 that, when the passerby is successfully authenticated by the first authentication process, stores third biometric information (hereinafter referred to as authentication feature information) generated based on at least one of the two pieces of biometric information used in the first authentication process; and a second authentication device 20 that executes a second authentication process for authenticating the passerby by using biometric information (an example of fourth biometric information) acquired from a second image G2 obtained by imaging, with a second imaging unit 21, the passerby passing through a second position P2 downstream of the first position P1 in the passerby's direction of travel, and the authentication feature information stored in the feature information storage unit 31. The system further includes a second display unit 26 serving as an administrator monitor that displays the results of the authentication processes of the first authentication device 10 and the second authentication device 20 to an administrator.
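The two-stage flow at P1 and P2 can be sketched as follows. This is only an illustration of the described behavior: the class and function names, the cosine-similarity matching, and the threshold value are assumptions for the sketch and are not part of the patent disclosure.

```python
import numpy as np


def similarity(a, b):
    # Cosine similarity between two face feature vectors.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


class FeatureStore:
    """Plays the role of the feature information storage unit 31 on the
    server 30: it holds authentication feature information only for
    passersby who passed the first authentication process."""

    def __init__(self):
        self._features = []

    def add(self, feature):
        self._features.append(feature)

    def match_any(self, feature, threshold):
        return any(similarity(f, feature) >= threshold for f in self._features)


def first_authentication(medium_feature, image_feature, store, threshold=0.8):
    """First position P1: compare the biometric feature read from the
    medium M with the feature extracted from the first image G1.  On
    success, register the image feature as authentication feature
    information; on failure, nothing is stored."""
    if similarity(medium_feature, image_feature) >= threshold:
        store.add(image_feature)
        return True
    return False


def second_authentication(image_feature, store, threshold=0.8):
    """Second position P2: the passerby is only imaged; the feature from
    the second image G2 is matched against the stored authentication
    feature information."""
    return store.match_any(image_feature, threshold)
```

In this sketch the medium is consulted only once, at P1; passage at P2 relies entirely on the features cached by the successful first authentication, which mirrors the division of work between the two authentication devices.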
S(x(0), x(1)) > θ, S(x(1), x(2)) > θ, S(x(2), x(3)) > θ, …, S(x(t-1), x(t)) > θ
When all of the above hold, the person is judged to be the same individual. This reduces the error of judging the person to be a different individual when aging causes S(x(0), x(t)) < θ.
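The chained judgment above can be sketched as follows. The feature representation (2-D unit vectors) and cosine similarity are illustrative assumptions; the point is that every consecutive pair must exceed θ, even when the endpoints x(0) and x(t) no longer do.

```python
import numpy as np


def cosine_similarity(a, b):
    # Similarity score S between two face feature vectors.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


def verify_chained(features, threshold):
    """Return True when S(x(i-1), x(i)) > θ for every consecutive pair
    of enrolled feature vectors x(0)..x(t).  Gradual drift caused by
    aging keeps each consecutive pair above θ, so the person is still
    accepted even if S(x(0), x(t)) has fallen below θ."""
    return all(
        cosine_similarity(features[i - 1], features[i]) > threshold
        for i in range(1, len(features))
    )
```

For example, feature vectors that rotate 20° per enrollment stay pairwise above θ = 0.9 (cos 20° ≈ 0.94) while the first and last vectors, 80° apart, fall well below it.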
This embodiment is an example in which, when biometric information is read from a medium carried by a passerby passing through the second position, a third authentication process can be executed instead of the second authentication process, authenticating the passerby by using the biometric information read from the medium carried by the passerby passing through the second position and the biometric information acquired from a second image obtained by imaging that passerby with the imaging unit. In the following description, portions that are the same as in the first embodiment are not described again.
A person authentication method and a person authentication system according to the third embodiment will be described below with reference to the accompanying drawings. Note that the person authentication method is included in the information processing method, and the person authentication system is included in the information processing system.
In the embodiments above, a passerby had to stand still while being photographed by the camera. In the fourth embodiment, however, as shown in FIG. 22, imaging and matching can be performed even while the passerby is walking, for example by the second imaging unit 21 installed in the second authentication device 20.
As described above, the immigration control system according to this embodiment can improve the accuracy of person authentication at the immigration inspection counter IC.
A step of reading, from a storage device that stores the first imaging condition of a first imaging unit that obtained a first image used for a first authentication process of a person passing through a first position in association with first identification information of the person who passed through the first position, the first imaging condition stored in association with the first identification information that matches second identification information read from a medium carried by a person passing through a second position different from the first position;
a step of adjusting, according to the read first imaging condition, a second imaging condition of a second imaging unit capable of imaging the person passing through the second position; and
a step of executing a second authentication process for the person passing through the second position by using the second identification information and biometric information acquired from a second image obtained by imaging with the second imaging unit;
An information processing method comprising the above steps.
The information processing method of [1], wherein the first imaging condition is information relating to the first image.
The information processing method of [2], wherein the first imaging condition includes at least one of a face image included in the first image, a first height of the person based on the first image, and a lighting condition based on the first image.
The information processing method of [3], wherein, when the first imaging condition includes the first height, the second authentication process is deemed successful when the similarity between face image feature information serving as the second identification information and face image feature information included in the second image is greater than or equal to a predetermined threshold and the first height matches a second height of the person based on the second image.
The information processing method of [3], wherein, when the first imaging condition includes the first height, the imaging range of the second imaging unit is adjusted according to the first height so that the face of the person passing through the second position is imaged from the front.
The information processing method of [3], wherein, when the first imaging condition includes the lighting condition, a light source capable of illuminating the imaging range of the second imaging unit is adjusted according to the lighting condition.
The information processing method of [3], wherein, when the first imaging condition includes a face image included in the first image, a first instruction prompting an expression close to that of the face image is displayed on a display unit.
The information processing method of [3], wherein, when the first imaging condition includes a face image included in the first image and the face image is a face image with glasses, a second instruction prompting the person to wear glasses is displayed on the display unit.
The information processing method of [3], wherein, when the first imaging condition includes a face image included in the first image, a third instruction prompting a hairstyle close to that of the face image is displayed on the display unit.
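As an illustration of the combined check in [4] above — face similarity at or above a threshold together with matching heights — a minimal sketch follows. The function name, the threshold value, and the height tolerance are assumptions introduced for the sketch (the patent states only that the heights must "match"):

```python
def second_auth_with_height(face_sim, first_height_cm, second_height_cm,
                            sim_threshold=0.8, height_tolerance_cm=3.0):
    """The second authentication process succeeds only when BOTH
    conditions hold: the face-feature similarity meets the threshold,
    AND the height estimated from the second image matches the first
    height stored as part of the first imaging condition (within a
    small tolerance, since height estimated from an image is inexact)."""
    similar_enough = face_sim >= sim_threshold
    heights_match = abs(first_height_cm - second_height_cm) <= height_tolerance_cm
    return similar_enough and heights_match
```

The height acts as a second factor: a photograph of an enrolled face held up by a different person could pass the similarity test alone, but would usually fail the height check.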
A first authentication unit that executes a first authentication process of a person passing through a first position by using first identification information read from a medium carried by the person and first biometric information acquired from a first image obtained by imaging the person with a first imaging unit;
a storage unit that stores the first imaging condition of the first imaging unit in association with the first identification information when the first authentication process succeeds;
an adjustment unit that reads, from the storage unit, the first imaging condition stored in association with the first identification information that matches second identification information read from a medium carried by a person passing through a second position different from the first position, and adjusts, according to the read first imaging condition, a second imaging condition of a second imaging unit capable of imaging the person passing through the second position; and
a second authentication unit that executes a second authentication process for the person passing through the second position by using the second identification information and biometric information acquired from a second image obtained by imaging with the second imaging unit;
An information processing system comprising the above.
10… First authentication device
11… First imaging unit
12… First image acquisition unit
13… First face feature extraction unit
14… Identification information reading unit
15… First passerby authentication unit
16… First output unit
17… First display unit
20… Second authentication device
21… Second imaging unit
22… Second image acquisition unit
23… Second face feature extraction unit
24… Second passerby authentication unit
25… Second output unit
26… Second display unit
30… Server
31… Feature information storage unit
Claims (16)
- A step of executing a first authentication process for authenticating a passerby by using first biometric information read from a medium carried by the passerby passing through a first position and second biometric information acquired from an image obtained by imaging the passerby passing through the first position;
a step of storing, in a storage unit, third biometric information based on at least one of the first biometric information and the second biometric information used in the first authentication process when the passerby is successfully authenticated by the first authentication process;
a step of executing a second authentication process for authenticating the passerby by using fourth biometric information acquired from an image obtained by imaging the passerby passing through a second position downstream of the first position in the passerby's direction of travel and the third biometric information stored in the storage unit; and
an authentication step of permitting passage through the second position when the passerby is authenticated by the second authentication process;
An information processing method comprising the above steps. - The information processing method according to claim 1, further comprising a step of displaying, on an administrator monitor, the result of executing the first authentication process or the result of executing the second authentication process.
- The information processing method according to claim 1, comprising a step of storing the first biometric information or the second biometric information in the storage unit as the third biometric information.
- The information processing method according to claim 1, wherein the second authentication process comprises a step of determining that the passerby has been successfully authenticated when the fourth biometric information matches any of the third biometric information stored in the storage unit.
- The information processing method according to claim 4, comprising a step of erasing, from the storage unit, the third biometric information that matched the fourth biometric information when the passerby is successfully authenticated by the second authentication process.
- The information processing method according to claim 1, comprising a step of executing, on the plurality of pieces of third biometric information stored in the storage unit, processing for removing information unnecessary for discrimination among the pieces of third biometric information.
- The information processing method according to claim 1, comprising a step of executing, when fifth biometric information is read from a medium carried by a passerby passing through the second position, a third authentication process for authenticating the passerby by using the fourth biometric information and the fifth biometric information instead of the second authentication process.
- The information processing method according to claim 1, comprising a step of imaging a walking passerby passing through the second position.
- A reading unit provided so as to be able to read first biometric information from a medium carried by a passerby passing through a first position;
a first imaging unit provided so as to be able to image the passerby passing through the first position;
a first authentication unit that executes a first authentication process for authenticating the passerby by using the first biometric information read by the reading unit and second biometric information acquired from a first image obtained by imaging with the first imaging unit when the first biometric information was read;
a storage control unit that, when the passerby is successfully authenticated by the first authentication process, stores in a storage unit third biometric information based on at least one of the first biometric information and the second biometric information used in the first authentication process, and that, when authentication of the passerby by the first authentication process fails, prohibits storage of the third biometric information in the storage unit;
a second imaging unit provided so as to be able to image a passerby passing through a second position downstream of the first position in the passerby's direction of travel; and
a second authentication unit that executes a second authentication process for authenticating the passerby by using fourth biometric information acquired from an image obtained by imaging with the second imaging unit and the third biometric information stored in the storage unit;
An information processing system comprising the above. - The information processing system according to claim 9, further comprising an administrator monitor that displays the result of executing the first authentication process or the result of executing the second authentication process.
- The information processing system according to claim 9, wherein the second imaging unit images a walking passerby passing through the second position.
- A step of executing a first authentication process of a person passing through a first position by using first identification information read from a medium carried by the person and first biometric information acquired from a first image obtained by imaging the person with a first imaging unit; and
a step of saving in a storage device, when the first authentication process succeeds, the first imaging condition of the first imaging unit in association with the first identification information, as a second imaging condition of a second imaging unit that obtains a second image for acquiring second biometric information used for a second authentication process of a person passing through a second position different from the first position;
An information processing method comprising the above steps. - The information processing method according to claim 12, wherein the first imaging condition is information relating to the first image.
- The information processing method according to claim 13, wherein the first imaging condition includes at least one of a face image included in the first image, a first height of the person based on the first image, and a lighting condition based on the first image.
- The information processing method according to claim 12, further comprising a step of displaying, on an administrator monitor, the result of executing the first authentication process or the result of executing the second authentication process.
- The information processing method according to claim 12, comprising a step in which the second imaging unit images a walking passerby passing through the second position.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2016507365A JPWO2015136938A1 (ja) | 2014-03-14 | 2015-03-12 | 情報処理方法および情報処理システム |
EP15760931.4A EP3118810A4 (en) | 2014-03-14 | 2015-03-12 | Information processing method and information processing system |
US15/263,984 US20170070501A1 (en) | 2014-03-14 | 2016-09-13 | Information processing method and information processing system |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2014051808 | 2014-03-14 | ||
JP2014-051808 | 2014-03-14 | ||
JP2014-183596 | 2014-09-09 | ||
JP2014183596 | 2014-09-09 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/263,984 Continuation US20170070501A1 (en) | 2014-03-14 | 2016-09-13 | Information processing method and information processing system |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2015136938A1 true WO2015136938A1 (ja) | 2015-09-17 |
Family
ID=54071394
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2015/001359 WO2015136938A1 (ja) | 2014-03-14 | 2015-03-12 | 情報処理方法および情報処理システム |
Country Status (4)
Country | Link |
---|---|
US (1) | US20170070501A1 (ja) |
EP (1) | EP3118810A4 (ja) |
JP (1) | JPWO2015136938A1 (ja) |
WO (1) | WO2015136938A1 (ja) |
Cited By (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2018185679A (ja) * | 2017-04-26 | 2018-11-22 | 株式会社テイパーズ | 顔認証システム |
JP2019101779A (ja) * | 2017-12-04 | 2019-06-24 | 京セラドキュメントソリューションズ株式会社 | 店舗用情報管理システムおよび店舗用情報管理プログラム |
JP2019139340A (ja) * | 2018-02-07 | 2019-08-22 | 株式会社ケアコム | 徘徊検出システム |
WO2020026368A1 (ja) | 2018-07-31 | 2020-02-06 | 日本電気株式会社 | 情報処理装置、情報処理方法及び記録媒体 |
JPWO2020188714A1 (ja) * | 2019-03-18 | 2020-09-24 | ||
WO2021029046A1 (ja) * | 2019-08-14 | 2021-02-18 | 日本電気株式会社 | 情報処理装置、情報処理方法及び記録媒体 |
JP2021106001A (ja) * | 2016-08-29 | 2021-07-26 | パナソニックIpマネジメント株式会社 | システム及び方法 |
JPWO2021186628A1 (ja) * | 2020-03-18 | 2021-09-23 | ||
WO2021205844A1 (ja) * | 2020-04-09 | 2021-10-14 | Necソリューションイノベータ株式会社 | 認証装置 |
WO2022014562A1 (ja) * | 2020-07-13 | 2022-01-20 | アクティア株式会社 | 情報処理システム、情報処理装置、情報処理方法、及びプログラム |
WO2022038709A1 (ja) * | 2020-08-19 | 2022-02-24 | 日本電気株式会社 | 情報処理装置、情報処理方法及び記録媒体 |
JP2022067133A (ja) * | 2020-06-30 | 2022-05-02 | 日本電気株式会社 | 情報処理装置、情報処理方法及び記録媒体 |
WO2022154093A1 (ja) * | 2021-01-14 | 2022-07-21 | 富士フイルム株式会社 | 真正性照合システム及び真正性照合方法 |
JP7243900B1 (ja) | 2022-06-17 | 2023-03-22 | 三菱電機株式会社 | 認証システム及び認証装置 |
WO2023162041A1 (ja) * | 2022-02-22 | 2023-08-31 | 日本電気株式会社 | サーバ装置、システム、サーバ装置の制御方法及び記憶媒体 |
US11914035B2 (en) | 2021-03-19 | 2024-02-27 | Nec Corporation | Inspection system for inspecting contents of a target person, and inspection method thereof |
JP7487827B2 (ja) | 2022-03-10 | 2024-05-21 | 日本電気株式会社 | 情報処理装置、情報処理方法及び記録媒体 |
Families Citing this family (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2018110334A1 (ja) * | 2016-12-16 | 2018-06-21 | パナソニックIpマネジメント株式会社 | ゲートシステム制御装置、及び、ゲートシステムの制御方法 |
US20190034934A1 (en) | 2017-07-28 | 2019-01-31 | Alclear, Llc | Biometric payment |
JP7220373B2 (ja) * | 2018-06-28 | 2023-02-10 | パナソニックIpマネジメント株式会社 | ゲート装置及びシステム |
JP7310112B2 (ja) * | 2018-10-02 | 2023-07-19 | 日本電気株式会社 | 情報処理装置、情報処理システム、情報処理方法及びプログラム |
JP7006804B2 (ja) * | 2018-10-12 | 2022-01-24 | 日本電気株式会社 | 情報処理装置、情報処理方法及びプログラム |
CN109934978B (zh) * | 2019-03-15 | 2020-12-11 | 上海华铭智能终端设备股份有限公司 | 闸机设备的控制方法、终端、闸机设备及系统 |
JP7276591B2 (ja) * | 2020-02-18 | 2023-05-18 | 日本電気株式会社 | ゲート装置 |
WO2021199338A1 (ja) * | 2020-03-31 | 2021-10-07 | 日本電気株式会社 | 情報処理装置、情報処理方法及び記録媒体 |
US20220327188A1 (en) * | 2020-07-30 | 2022-10-13 | Nec Corporation | Authentication system, authentication apparatus, authentication method and computer program |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2005129016A (ja) * | 2003-09-29 | 2005-05-19 | Fuji Photo Film Co Ltd | 認証システム、プログラム、及び建築物 |
WO2005055151A1 (ja) * | 2003-12-03 | 2005-06-16 | Hitachi, Ltd. | 搭乗セキュリティチェックシステムおよび方法ならびにコンピュータプログラム |
JP2010287124A (ja) * | 2009-06-12 | 2010-12-24 | Glory Ltd | 生体照合システムおよび生体照合方法 |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7641108B2 (en) * | 2004-04-02 | 2010-01-05 | K-Nfb Reading Technology, Inc. | Device and method to assist user in conducting a transaction with a machine |
TWI336054B (en) * | 2006-02-15 | 2011-01-11 | Toshiba Kk | Person identification device and person identification method |
US20100031626A1 (en) * | 2008-03-21 | 2010-02-11 | Robert Oehrlein | Carbon-Kevlar uni-body rocket engine and method of making same |
US7855529B2 (en) * | 2008-07-16 | 2010-12-21 | ConvenientPower HK Ltd. | Inductively powered sleeve for mobile electronic device |
-
2015
- 2015-03-12 WO PCT/JP2015/001359 patent/WO2015136938A1/ja active Application Filing
- 2015-03-12 JP JP2016507365A patent/JPWO2015136938A1/ja active Pending
- 2015-03-12 EP EP15760931.4A patent/EP3118810A4/en not_active Withdrawn
-
2016
- 2016-09-13 US US15/263,984 patent/US20170070501A1/en not_active Abandoned
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2005129016A (ja) * | 2003-09-29 | 2005-05-19 | Fuji Photo Film Co Ltd | 認証システム、プログラム、及び建築物 |
WO2005055151A1 (ja) * | 2003-12-03 | 2005-06-16 | Hitachi, Ltd. | 搭乗セキュリティチェックシステムおよび方法ならびにコンピュータプログラム |
JP2010287124A (ja) * | 2009-06-12 | 2010-12-24 | Glory Ltd | 生体照合システムおよび生体照合方法 |
Non-Patent Citations (1)
Title |
---|
See also references of EP3118810A4 * |
Cited By (32)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2021106001A (ja) * | 2016-08-29 | 2021-07-26 | パナソニックIpマネジメント株式会社 | システム及び方法 |
JP7065350B2 (ja) | 2016-08-29 | 2022-05-12 | パナソニックIpマネジメント株式会社 | システム及び方法 |
JP2018185679A (ja) * | 2017-04-26 | 2018-11-22 | 株式会社テイパーズ | 顔認証システム |
JP2019101779A (ja) * | 2017-12-04 | 2019-06-24 | 京セラドキュメントソリューションズ株式会社 | 店舗用情報管理システムおよび店舗用情報管理プログラム |
JP2019139340A (ja) * | 2018-02-07 | 2019-08-22 | 株式会社ケアコム | 徘徊検出システム |
US11610438B2 (en) | 2018-07-31 | 2023-03-21 | Nec Corporation | Information processing apparatus, information processing method, and storage medium |
US10963716B2 (en) | 2018-07-31 | 2021-03-30 | Nec Corporation | Information processing apparatus, information processing method, and storage medium |
WO2020026368A1 (ja) | 2018-07-31 | 2020-02-06 | 日本電気株式会社 | 情報処理装置、情報処理方法及び記録媒体 |
JPWO2020188714A1 (ja) * | 2019-03-18 | 2020-09-24 | ||
WO2020188714A1 (ja) * | 2019-03-18 | 2020-09-24 | 日本電気株式会社 | 情報処理装置、サーバ装置、情報処理方法及び記録媒体 |
JP7215566B2 (ja) | 2019-03-18 | 2023-01-31 | 日本電気株式会社 | 情報処理装置、サーバ装置、情報処理方法及びプログラム |
WO2021029046A1 (ja) * | 2019-08-14 | 2021-02-18 | 日本電気株式会社 | 情報処理装置、情報処理方法及び記録媒体 |
JPWO2021029046A1 (ja) * | 2019-08-14 | 2021-02-18 | ||
JP7235123B2 (ja) | 2019-08-14 | 2023-03-08 | 日本電気株式会社 | 情報処理装置、情報処理方法及びプログラム |
JPWO2021186628A1 (ja) * | 2020-03-18 | 2021-09-23 | ||
WO2021186628A1 (ja) * | 2020-03-18 | 2021-09-23 | 日本電気株式会社 | ゲート装置、認証システム、ゲート制御方法及び記憶媒体 |
JP7218837B2 (ja) | 2020-03-18 | 2023-02-07 | 日本電気株式会社 | ゲート装置、認証システム、ゲート装置の制御方法及びプログラム |
JP7414327B2 (ja) | 2020-04-09 | 2024-01-16 | Necソリューションイノベータ株式会社 | 認証装置 |
JPWO2021205844A1 (ja) * | 2020-04-09 | 2021-10-14 | ||
WO2021205844A1 (ja) * | 2020-04-09 | 2021-10-14 | Necソリューションイノベータ株式会社 | 認証装置 |
JP2022067133A (ja) * | 2020-06-30 | 2022-05-02 | 日本電気株式会社 | 情報処理装置、情報処理方法及び記録媒体 |
JP7300095B2 (ja) | 2020-06-30 | 2023-06-29 | 日本電気株式会社 | 情報処理装置、情報処理方法及び記録媒体 |
JPWO2022014562A1 (ja) * | 2020-07-13 | 2022-01-20 | ||
WO2022014562A1 (ja) * | 2020-07-13 | 2022-01-20 | アクティア株式会社 | 情報処理システム、情報処理装置、情報処理方法、及びプログラム |
JP7158692B2 (ja) | 2020-07-13 | 2022-10-24 | アクティア株式会社 | 情報処理システム、情報処理装置、情報処理方法、及びプログラム |
WO2022038709A1 (ja) * | 2020-08-19 | 2022-02-24 | 日本電気株式会社 | 情報処理装置、情報処理方法及び記録媒体 |
WO2022154093A1 (ja) * | 2021-01-14 | 2022-07-21 | 富士フイルム株式会社 | 真正性照合システム及び真正性照合方法 |
US11914035B2 (en) | 2021-03-19 | 2024-02-27 | Nec Corporation | Inspection system for inspecting contents of a target person, and inspection method thereof |
WO2023162041A1 (ja) * | 2022-02-22 | 2023-08-31 | 日本電気株式会社 | サーバ装置、システム、サーバ装置の制御方法及び記憶媒体 |
JP7487827B2 (ja) | 2022-03-10 | 2024-05-21 | 日本電気株式会社 | 情報処理装置、情報処理方法及び記録媒体 |
JP7243900B1 (ja) | 2022-06-17 | 2023-03-22 | 三菱電機株式会社 | 認証システム及び認証装置 |
JP2023184029A (ja) * | 2022-06-17 | 2023-12-28 | 三菱電機株式会社 | 認証システム及び認証装置 |
Also Published As
Publication number | Publication date |
---|---|
JPWO2015136938A1 (ja) | 2017-04-06 |
EP3118810A1 (en) | 2017-01-18 |
US20170070501A1 (en) | 2017-03-09 |
EP3118810A4 (en) | 2017-11-08 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2015136938A1 (ja) | 情報処理方法および情報処理システム | |
US10796514B2 (en) | System and method for optimizing a facial recognition-based system for controlling access to a building | |
CN107958258B (zh) | 用于追踪限定区域中的对象的方法和系统 | |
US9875392B2 (en) | System and method for face capture and matching | |
JP6151582B2 (ja) | 顔認証システム | |
JP4836633B2 (ja) | 顔認証装置、顔認証方法および入退場管理装置 | |
US8064651B2 (en) | Biometric determination of group membership of recognized individuals | |
KR100831122B1 (ko) | 얼굴 인증 장치, 얼굴 인증 방법, 및 출입 관리 장치 | |
JP2016170700A (ja) | 人物認証方法 | |
JP2008108243A (ja) | 人物認識装置および人物認識方法 | |
JP2006236260A (ja) | 顔認証装置、顔認証方法および入退場管理装置 | |
WO2020115890A1 (ja) | 情報処理システム、情報処理装置、情報処理方法、およびプログラム | |
JP5955056B2 (ja) | 顔画像認証装置 | |
JP4855180B2 (ja) | 画像処理システム | |
JP2004118359A (ja) | 人物認識装置、人物認識方法および通行制御装置 | |
US20220198861A1 (en) | Access control system screen capture facial detection and recognition | |
KR20140060081A (ko) | 원거리 얼굴인식 기반 스피드 게이트 통제 방법 및 장치 | |
JP7379213B2 (ja) | 画像照合装置 | |
US20230222193A1 (en) | Information processing device, permission determination method, and program | |
US20220028197A1 (en) | Access control solution for a passage device | |
KR20220087908A (ko) | 마스크 착용 얼굴 인식 기능을 갖는 출입관리방법 | |
JP2022138548A (ja) | 画像照合装置、画像照合方法、およびプログラム | |
Patil et al. | A Review: Office Monitoring and Surveillance System | |
KR20230031496A (ko) | 얼굴 인증을 통한 출입 관리 방법 및 시스템 | |
JP2010086350A (ja) | 認証装置 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 15760931 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2016507365 Country of ref document: JP Kind code of ref document: A |
|
REEP | Request for entry into the european phase |
Ref document number: 2015760931 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2015760931 Country of ref document: EP |
|
NENP | Non-entry into the national phase |
Ref country code: DE |