WO2018181968A1 - Face authentication system, apparatus, method, and program - Google Patents
Face authentication system, apparatus, method, and program
- Publication number
- WO2018181968A1 (PCT application PCT/JP2018/013797)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- face
- unit
- feature amount
- gate
- wireless tag
- Prior art date
Classifications
-
- G—PHYSICS
- G07—CHECKING-DEVICES
- G07C—TIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
- G07C9/00—Individual registration on entry or exit
- G07C9/20—Individual registration on entry or exit involving the use of a pass
- G07C9/22—Individual registration on entry or exit involving the use of a pass in combination with an identity check of the pass holder
- G07C9/25—Individual registration on entry or exit involving the use of a pass in combination with an identity check of the pass holder using biometric data, e.g. fingerprints, iris scans or voice recognition
- G07C9/257—Individual registration on entry or exit involving the use of a pass in combination with an identity check of the pass holder using biometric data, e.g. fingerprints, iris scans or voice recognition electronically
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/161—Detection; Localisation; Normalisation
- G06V40/166—Detection; Localisation; Normalisation using acquisition arrangements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/172—Classification, e.g. identification
-
- G—PHYSICS
- G07—CHECKING-DEVICES
- G07B—TICKET-ISSUING APPARATUS; FARE-REGISTERING APPARATUS; FRANKING APPARATUS
- G07B15/00—Arrangements or apparatus for collecting fares, tolls or entrance fees at one or more control points
- G07B15/02—Arrangements or apparatus for collecting fares, tolls or entrance fees at one or more control points taking into account a variable factor such as distance or time, e.g. for passenger transport, parking systems or car rental systems
- G07B15/04—Arrangements or apparatus for collecting fares, tolls or entrance fees at one or more control points taking into account a variable factor such as distance or time, e.g. for passenger transport, parking systems or car rental systems comprising devices to free a barrier, turnstile, or the like
-
- G—PHYSICS
- G07—CHECKING-DEVICES
- G07C—TIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
- G07C9/00—Individual registration on entry or exit
- G07C9/10—Movable barriers with registering means
-
- G—PHYSICS
- G07—CHECKING-DEVICES
- G07C—TIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
- G07C9/00—Individual registration on entry or exit
- G07C9/20—Individual registration on entry or exit involving the use of a pass
- G07C9/22—Individual registration on entry or exit involving the use of a pass in combination with an identity check of the pass holder
- G07C9/25—Individual registration on entry or exit involving the use of a pass in combination with an identity check of the pass holder using biometric data, e.g. fingerprints, iris scans or voice recognition
Definitions
- the present invention claims priority from Japanese Patent Application No. 2017-073042 (filed on March 31, 2017), the entire contents of which are incorporated herein by reference.
- the present invention relates to a face authentication system, apparatus, method, and program.
- biometric authentication that performs authentication using biometric information, which is information relating to human physical characteristics and behavioral characteristics, has come to be used in situations where identity verification is performed.
- Face authentication, one form of biometric authentication, has the following advantages:
- there is little psychological resistance from the person being authenticated, and authentication is possible even from a remote location;
- it provides a psychological deterrent against fraud. Face authentication technology is used for identity verification in various fields, and its application range is expanding.
- FIG. 1 is a diagram for explaining a typical example of an automatic door using face authentication technology.
- the face data (face feature amount) of the user 11 is registered in the database (DataBase: DB) 15 in advance.
- the user 11 stands in front of the camera 13, which is installed, for example, next to the door 12.
- the door 12 is normally locked.
- the face authentication device 14 extracts a facial feature amount from the image data (face image) of the user 11 captured by the camera 13, and authenticates the user by collating it with the face data (feature amounts) registered in the database 15.
- the opening / closing control device 16 controls the opening / closing of the door 12 based on the authentication result of the face authentication device 14.
- the opening / closing control device 16 outputs a signal for unlocking the door 12 only for a user 11 who has passed face authentication. For users who do not pass face authentication, the door 12 remains locked. This prevents suspicious persons and other intruders who are not registered users from entering.
- Patent Document 1 discloses the following configuration for this problem.
- a wireless terminal device (wireless tag: for example, a passive RFID (Radio Frequency Identifier) tag) transmits identification information,
- the image acquisition unit acquires the image to be authenticated, and
- the feature amount acquisition unit acquires the feature amount from that image.
- a storage unit stores feature amounts and feature amount determination information in association with the identification information of a plurality of objects. If feature amount determination information defining a determination method different from the standard determination method is stored in association with the received identification information, the authentication means uses that method: the feature amount stored in association with the identification information is collated with the acquired feature amount to authenticate the authentication target.
- Patent Document 2 discloses a biometric matching system that, as long as the user carries a portable terminal, does not require input of an ID (Identification) code or presentation of a magnetic card, improving convenience for the user.
- This biometric matching system has a plurality of portable terminals and a biometric matching device, and the biometric matching device has a storage unit, an acquiring unit, a receiving unit, an extracting unit, and a matching unit.
- the storage unit stores the registered biometric image of the user of each mobile terminal in association with the identification information of each mobile terminal.
- the acquisition unit acquires input biometric information of a specific user.
- the receiving unit receives a position signal or a radio signal from each portable terminal.
- the extraction unit detects two or more portable terminals within a predetermined range of the biometric matching device based on the received position signals or radio signals, and extracts, from the registered biometric images stored in the storage unit, the two or more registered biometric images respectively associated with the identification information of the detected portable terminals.
- the matching unit collates the input biometric information with each of the extracted registered biometric images.
- Patent Document 3 discloses a face image search system that captures a face image of a subject and calculates the similarity between the captured face image and a face image extracted from a database in which face images of subjects are accumulated.
- This face image retrieval system uses a card reader to read a card ID assigned to a storage medium possessed by the subject, such as a magnetic card, an IC (Integrated Circuit) card, or a wireless tag, searches the database using the read card ID as a key, and acquires the face image associated with that card ID.
- the outline of the related technology is summarized as follows.
- It is intended for opening and closing doors and the like.
- The target area is basically a closed area (small to medium-sized rooms, etc.).
- The number of registered face data entries for users to be authenticated is small to medium.
- The operation mode assumes that the user looks into a fixed frame (for example, camera 13 in FIG. 1).
- No particular authentication speed is required.
- There is one route for each entrance and exit.
- The door is normally closed and is opened when authentication succeeds.
- Authentication is basically performed in only one direction, such as when entering a room from outside.
- FIGS. 2A and 2B are diagrams for explaining a case (prototype) in which a walk-through gate (for example, an entrance gate or a ticket gate) is realized using a wireless tag (IC card) with a communication distance of about 3 m to 5 m.
- the wireless tags (IC cards) 20A and 20C may be, for example, passive RFID tags in the 13.56 MHz, 920 MHz (megahertz), or 2.4 GHz (gigahertz) band. The users who possess the wireless tags (IC cards) 20A and 20C (Mr. A and Mr. C) can pass through the gate of FIG. 2A.
- the wireless tags (IC cards) 20A and 20C transmit ID information and the like, an authentication device (not shown) on the gate side performs authentication, and the door in FIG. 2B is set to the open state.
- However, a user who does not possess a wireless tag (IC card) can also pass through the gate of FIG. 2A while it is open, so the problem remains.
- Suppose instead that a user is recognized by a camera, a facial feature amount is extracted, and a database storing facial feature amounts is searched. Without first narrowing down the facial feature amounts to be collated using an IC card or the like, the facial feature amount of the user's face captured by the camera would have to be collated against a database in which a large number (for example, 10,000 or more) of facial feature amounts are registered. This is almost impossible in practice because of problems such as collation accuracy and time constraints.
- It is also difficult to perform 1:N collation from a moving image or the like of a small, non-frontal face.
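The cost of unrestricted 1:N collation, and the benefit of narrowing the candidate set, can be sketched as follows. This is a minimal illustration, not the patented method itself: the feature vectors, the cosine-similarity metric, the threshold value, and the function names are all assumptions made for illustration.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two facial feature vectors (assumed representation)."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def match_1_to_n(probe, candidates, threshold=0.8):
    """Return the best-matching registered ID, or None if nothing clears the threshold.

    `candidates` maps registered IDs to feature vectors. Narrowing this
    dictionary down to the few tag holders near the gate (instead of the full
    database of 10,000+ entries) keeps matching fast and reduces false matches.
    """
    best_id, best_score = None, threshold
    for reg_id, feat in candidates.items():
        score = cosine_similarity(probe, feat)
        if score >= best_score:
            best_id, best_score = reg_id, score
    return best_id
```

The smaller the candidate dictionary, the fewer chances there are for an unrelated registrant to score above the threshold, which is exactly the narrowing effect the wireless tag provides.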
- The present invention was made in view of the above problems, and its purpose is to provide an apparatus, a face authentication system, a method, and a program that eliminate the need for ticketing and the like and can improve the throughput of passing through a gate.
- According to one aspect, there is provided a gate device including: a reading unit that receives one or a plurality of identifiers transmitted from one or a plurality of user wireless tags that enter a wireless area outside one longitudinal end of the gate; an acquisition unit that acquires each facial feature amount registered corresponding to each identifier received from each wireless tag; an imaging unit that captures the user; an extraction unit that extracts a facial feature amount from image data captured by the imaging unit; and a face matching unit that matches the extracted facial feature amount against the one or more facial feature amounts acquired by the acquisition unit.
- According to another aspect, there is provided a gate device including: a reading unit that receives one or a plurality of identifiers transmitted from one or a plurality of user wireless tags that have entered a wireless area outside one longitudinal end of the gate; an imaging unit that captures the user; and an opening / closing control unit that, when a facial feature amount is extracted from the image captured by the imaging unit, opens or closes the gate exit according to whether that facial feature amount matches a facial feature amount registered corresponding to a received identifier.
- According to another aspect, there is provided a face authentication system including: an acquisition unit that acquires the facial feature amounts registered corresponding to the received identifiers; an extraction unit that extracts a facial feature amount from an image in which the user is captured; and a face matching unit that matches the extracted facial feature amount against the one or more facial feature amounts acquired by the acquisition unit.
- According to another aspect, there is provided a face authentication method including: acquiring the facial feature amounts registered corresponding to the received identifiers; extracting a facial feature amount from an image in which the user is captured; and matching the extracted feature amount against the acquired one or more facial feature amounts.
- According to another aspect, there are provided a program for causing a computer to execute: a process of acquiring the facial feature amounts registered corresponding to the received identifiers; a process of extracting a facial feature amount from an image in which the user is captured; and a process of matching the extracted feature amount against the acquired one or more facial feature amounts; and a computer-readable recording medium storing the program.
- the program recording medium is, for example, a non-transitory computer-readable recording medium such as a semiconductor storage (a RAM (Random Access Memory), a ROM (Read Only Memory), or an EEPROM (Electrically Erasable and Programmable ROM)), an HDD (Hard Disk Drive), a CD (Compact Disc), or a DVD (Digital Versatile Disc).
- FIG. 2A is a diagram for explaining a gate using a wireless tag (remote IC card).
- FIG. 2B is a diagram for explaining a gate using a wireless tag (remote IC card).
- FIG. 3A is a diagram for explaining a first exemplary embodiment of the present invention.
- FIG. 3B is a diagram for explaining the first exemplary embodiment of the present invention. Further figures explain the first exemplary embodiment, its structure, and, in a flowchart, its operation.
- FIG. 7A is a diagram for explaining the first exemplary embodiment of the present invention.
- FIG. 7B is a diagram for explaining the first exemplary embodiment of the present invention. A further figure explains a modification of the first exemplary embodiment.
- FIG. 9A is a diagram for explaining a second embodiment of the present invention.
- FIG. 9B is a diagram related to FIG. 9A. Further figures explain the second embodiment, an example of its reader, its structure, and, in a flowchart, its operation.
- FIG. 16A is a diagram for explaining a first form of the wireless tag.
- FIG. 16B is a diagram for explaining the database of FIG. 16A. A further figure explains the corresponding operation.
- FIG. 18A is a diagram for explaining a second form of the wireless tag.
- FIG. 18B is a diagram for explaining the database of FIG. 18A. A further figure explains the corresponding operation.
- A reading unit disposed in the gate receives one or a plurality of identifiers transmitted from one or a plurality of user wireless tags that enter a wireless area outside one longitudinal end of the gate (for example, the region outside the gate entrance).
- the acquisition unit acquires each facial feature amount registered corresponding to each identifier received from each wireless tag.
- the imaging unit captures the user (for example, the user who has entered the gate),
- the extraction unit extracts a facial feature amount from the captured image data,
- and the face matching unit matches the extracted facial feature amount against the one or more facial feature amounts acquired by the acquisition unit.
- the acquisition unit acquires the facial feature amount corresponding to the identifier received by the reading unit from the wireless tag before the imaging unit captures the user who has entered the gate.
- An opening / closing control unit opens or closes the exit of the gate.
- The opening / closing control unit may open the gate exit when, as a result of matching by the face matching unit, the facial feature amount extracted by the extraction unit matches one of the facial feature amounts acquired by the acquisition unit, and may close the gate exit when there is no match.
- Because the facial feature amount is acquired in advance, before the user approaches the camera, from, for example, a data server that stores facial feature amounts keyed by the wireless tag's identifier,
- face recognition can be performed quickly. For this reason, a plurality of users can pass through the gate smoothly.
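The pre-fetch idea described above can be sketched roughly as follows. The class and method names are hypothetical, and the data server is modelled as a simple dictionary; in a real system the lookup would be a network query to the data server 40.

```python
class FeatureCache:
    """Sketch: pre-fetch facial feature amounts when the entrance reader sees a tag."""

    def __init__(self, data_server):
        self.data_server = data_server  # maps tag ID -> facial feature amount
        self.cache = {}                 # feature amounts already fetched

    def on_tag_read(self, tag_id):
        """Called when the entrance-side reader receives an ID from a wireless tag."""
        if tag_id not in self.cache:
            feat = self.data_server.get(tag_id)  # a network query in practice
            if feat is not None:
                self.cache[tag_id] = feat

    def candidates(self):
        """Feature amounts already on hand by the time the camera fires."""
        return dict(self.cache)
```

By the time the user reaches the camera, the matching candidates are already cached, so no database round-trip sits on the critical path of the walk-through.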
- FIGS. 3A and 3B are diagrams illustrating a first exemplary embodiment of the present invention.
- the user possesses, for example, a passive RFID (Radio Frequency Identifier) tag (in the case of an IC card, also referred to as a "remote IC card") in the UHF (Ultra High Frequency) band (915 MHz (megahertz) to 928 MHz; the 920 MHz band) or the 2.45 GHz (gigahertz) microwave band, with a communication distance of about 2 m to 5 m.
- UHF: Ultra High Frequency
- RFID: Radio Frequency Identifier
- RFID with a communication distance of about 2 m to 5 m is used for applications such as one-way entrance gates, exit gates, and two-way entrance / exit gates; depending on the gate configuration (width, length),
- a UHF band active RFID tag (battery built-in type) may also be used.
- the gate system 10 includes a gate device 100 having a three-lane configuration.
- a reader 101 (first reader) disposed on one side in the longitudinal direction of the gate device 100 (for example, the entrance side) is a reader / writer that reads data from, and writes data to, a wireless tag 20 that enters its wireless area.
- the term "reader" in this specification includes an RFID reader / writer.
- When the reader 101 receives IDs from the wireless tags of one or more users who have entered the wireless area (the wireless area outside the gate entrance), it passes each ID to an internally or externally provided face authentication apparatus (not shown),
- which searches a database (not shown) using each ID as a key and acquires from the database the facial feature amount registered in advance corresponding to each ID.
- the reader 101 may read the IDs of the wireless tags 20 sequentially (in time order), using, for example, an anti-collision function.
- the wireless tag 20 may transmit the ID after a predetermined back-off time has elapsed.
- the reader may communicate after a predetermined back-off time when the communication of another reader is detected.
- an image is captured by the camera 103, image data including the user's face is acquired, the face is detected by the face authentication device (not shown),
- and the facial feature amount is extracted.
- the camera 103 may be configured to take an image of the user who has entered the gate when the user enters the gate and the sensor 104 (for example, an infrared sensor) detects the progress of the user.
- the camera 103 may acquire images at, for example, 30 frames per second. Alternatively, the camera 103 may acquire a frame image (for example, one or a plurality of still images) based on a control signal from a device that controls the camera 103 (for example, the image data acquisition unit 116 in FIG. 5).
- the camera 103 may include a buffer (circular or ring buffer) (not shown); when the buffer of captured images (moving images or still images) becomes full, images may be overwritten starting from the oldest.
- a buffer for recording a moving image or a still image captured by the camera 103 may be provided on the apparatus side that controls the camera 103.
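A ring buffer of the kind described for the camera 103 can be sketched with Python's `collections.deque`, whose `maxlen` parameter gives exactly the overwrite-oldest-when-full behaviour. The class and method names here are invented for illustration.

```python
from collections import deque

class FrameBuffer:
    """Ring buffer for camera frames: the oldest frames are overwritten when
    the buffer is full, as described for the camera 103."""

    def __init__(self, capacity):
        self.frames = deque(maxlen=capacity)

    def push(self, frame):
        self.frames.append(frame)  # silently drops the oldest frame when full

    def latest(self, n=1):
        """The most recent n frames, e.g. those around a sensor-trigger timestamp."""
        return list(self.frames)[-n:]
```

When the sensor 104 fires, the controlling device can pull `latest()` frames from the buffer rather than waiting for a fresh capture.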
- When the wireless tag communicates with the reader 102 (second reader) disposed on the other side (exit side) in the gate longitudinal direction,
- the camera 103 may take an image.
- the face is detected from the image data captured by the camera 103, and the facial feature amount of the detected face is extracted.
- When the wireless tag 20 communicates with the reader 102 on the gate exit side and returns its ID, the collation target may be selected (narrowed down) from the previously acquired facial feature amounts of the one or more users within the 2 m to 5 m wireless communication range in the lane direction, for example according to a priority order based on the user's direction of travel, and the selected (narrowed-down) facial feature amount may then be collated with the facial feature amount extracted from the image data captured by the camera 103.
- For a user who does not possess a wireless tag (card) (Mr. B in FIG. 3A), no facial feature amount corresponding to a wireless tag ID has been acquired in advance via the reader 101, so nothing matches the facial feature amount of that user extracted from the image data captured by the camera 103. The door at the gate exit therefore remains closed, and a user without a wireless tag (card) is prevented from passing through the gate.
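The resulting gate-exit decision can be illustrated as follows. This is a simplified sketch: the similarity metric below is a placeholder (not the actual matching algorithm), and all names are hypothetical.

```python
def gate_decision(probe, prefetched, threshold=0.8):
    """Open the exit door only when the captured facial feature amount matches
    one pre-fetched via the entrance-side reader. A user without a wireless
    tag has nothing in `prefetched`, so the door stays closed for them."""
    def similarity(a, b):
        # Placeholder metric for illustration; real systems use a trained model.
        return 1.0 - sum(abs(x - y) for x, y in zip(a, b)) / len(a)

    for tag_id, feat in prefetched.items():
        if similarity(probe, feat) >= threshold:
            return ("OPEN", tag_id)
    return ("CLOSED", None)
```

An empty `prefetched` dictionary (the tagless Mr. B) can never produce `"OPEN"`, which mirrors the behaviour described above.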
- the communication range (coverage) of the readers 101 and 102 may be configured so as to avoid interference with the radio wave of the reader 101 in the adjacent lane.
- the antenna of the reader 101 may have a multi-antenna configuration, for example, and may perform beam forming in the lane direction.
- FIG. 4 is a diagram for explaining an exemplary first embodiment of the present invention.
- In FIG. 4, the sizes of the user, the wireless tag 20, the gate device 100, the camera 103, and the readers 101 and 102, and their relative proportions, are not drawn to scale.
- When the wireless tag 20A possessed by Mr. A enters the wireless area 171 of the reader 101 arranged on one side (the entrance side) in the longitudinal direction of the gate device 100, it receives the radio wave from the reader 101 and returns the ID of the wireless tag 20A.
- the wireless area 171 of the reader 101 includes a predetermined area (front side of the gate entrance) outside one end (inlet) in the longitudinal direction of the gate device 100.
- the wireless tag 20B of Mr. B entering the wireless area 171 also receives the radio wave from the reader 101 and returns the ID of the wireless tag 20B.
- face feature amounts corresponding to the IDs of the wireless tags 20A and 20B are acquired from a database (data server) (not shown). 17A and 17B schematically show the communication ranges of the wireless tags 20A and 20B.
- When the wireless tags 20A and 20B reply at the same time, their responses collide and the reader 101 cannot recognize the IDs (a reception error results). In this case, the reader 101 may instruct the wireless tags 20A and 20B to change their reply timing using, for example, an ALOHA-based anti-collision function, so that the IDs are returned in order at different timings.
- When the wireless tags 20A and 20B detect that another wireless tag is transmitting an ID, they may stop transmitting their own ID for a predetermined time.
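The anti-collision behaviour can be illustrated with a toy slotted-ALOHA simulation. This is an idealized sketch, not the actual RFID air protocol: the slot count, round limit, and function names are assumptions for illustration.

```python
import random

def slotted_aloha(tag_ids, num_slots=4, max_rounds=16, rng=None):
    """Toy slotted-ALOHA simulation: each unread tag replies in a random slot.
    Slots with exactly one reply are read by the reader; collided tags retry
    in the next round until all IDs have been received."""
    rng = rng or random.Random(0)
    pending, read_order = set(tag_ids), []
    for _ in range(max_rounds):
        if not pending:
            break
        slots = {}
        for tag in pending:
            slots.setdefault(rng.randrange(num_slots), []).append(tag)
        for slot in sorted(slots):
            if len(slots[slot]) == 1:  # no collision: the reader receives this ID
                tag = slots[slot][0]
                read_order.append(tag)
                pending.discard(tag)
    return read_order
```

Collided tags are simply deferred to a later round, which is the essence of "returning IDs in order at different timings".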
- When Mr. A's wireless tag 20A enters the wireless area of the reader 102 on the other side (the exit side) in the longitudinal direction of the gate device 100, it communicates with the reader 102 and returns the ID of the wireless tag 20A to the reader 102.
- The facial feature amounts of Mr. A and Mr. B acquired previously are then matched against the facial feature amount extracted from the image data captured by the camera 103 (for example, Mr. A's facial feature amount extracted from the image data).
- If the matching succeeds, the doors of the gate device 100 (for example, the doors in FIG. 2B) are kept open.
- the facial feature amount corresponding to an ID received by the reader 102 (that is, an ID received by the reader 101 and then received again by the reader 102) may be given priority when collating against the facial feature amount extracted from the image data captured by the camera 103.
- According to the first exemplary embodiment of the present invention, the number of face authentication subjects can be reduced using a wireless tag such as an RFID tag, and no card presentation or card touch operation is needed during personal authentication.
- The user can pass through the gate with natural movement. Furthermore, the accuracy of face authentication can be improved by narrowing down the number of persons to be authenticated.
- FIG. 5 is a diagram for explaining the configuration of the first exemplary embodiment of the present invention.
- a unidirectional gate device 100 such as an entrance gate includes: a reader (first reader) 101 and a reader (second reader) 102; a camera 103; a sensor 104 (an infrared sensor or the like) that senses the movement of the user in the lane; interfaces 105 and 106 that communicate with the readers 101 and 102; a sensor interface 107; an opening / closing control unit 109 that controls opening and closing of the gate; a gate (door) 108; a face authentication device 110; and a communication interface 122.
- the interfaces 105 and 106 and the sensor interface 107 may be RS232C, Ethernet (registered trademark), USB (Universal Serial Bus), or the like.
- the camera 103 may be arranged facing the entrance of the gate passage. Further, one camera 103 may be arranged in a plurality of lanes.
- a plurality of sensors (infrared sensors or the like) 104 may be arranged at a predetermined interval in the gate longitudinal direction and function as sensors for following the movement of the user in the gate.
- the face authentication device 110 includes: ID acquisition units 111 and 115 that acquire the IDs of wireless tags received by the reader 101 and the reader 102; a feature amount acquisition unit 112 that acquires from the data server 40 the facial feature amounts corresponding to the IDs acquired by the ID acquisition unit 111; a storage unit 113 that stores the facial feature amounts acquired from the data server 40; a selection control unit 114 that selects the facial feature amounts stored in the storage unit 113 in order or based on priority and transfers them to the face matching unit 119; an image data acquisition unit 116 that acquires image data from the camera 103; a face detection unit 117 that extracts the face portion from the image data; a face feature amount extraction unit 118 that extracts the feature amount of the extracted face; and a face matching unit 119.
- the feature amount acquisition unit 112 transmits a feature amount query including the ID information to the data server 40 via the communication interface 122 and the network 30.
- the data server 40 includes a communication interface 401, a control unit 402 that controls database access, and a storage device (database) 403 that stores IDs and facial feature quantities in association with each other.
- the control unit 402 reads the face feature amount corresponding to the ID using the ID included in the search request (query) transmitted from the face authentication device 110 as a search key.
- the storage device (database) 403 may store the face image information captured by the user at the time of registration in addition to the ID and the face feature amount.
- the search request may include identification information of the gate device 100 that issued the request, regional information, and the like.
- the data server 40 may encrypt the face feature amount corresponding to the ID and transmit it to the gate device 100, or may encrypt and store the face feature amount.
- the feature amount acquisition unit 112 receives the face feature amount searched by the data server 40 and stores it in the storage unit 113. At that time, the feature quantity acquisition unit 112 may store the face feature quantity in association with the ID. Further, the reception time of the ID of the wireless tag or the reception time of the face feature value from the data server 40 may be stored in the storage unit 113 in association with the ID and the face feature value. When the facial feature amount is encrypted and transmitted from the data server 40, the feature amount acquisition unit 112 decrypts the facial feature amount and stores it in the storage unit 113.
- the selection control unit 114 determines the order in which the facial feature amounts stored in the storage unit 113 are collated by the face collation unit 119,
- selects the facial feature amount to be collated from the storage unit 113 according to a predetermined priority,
- and supplies the selected facial feature amount to the face matching unit 119. For example, when the ID of an RFID tag received by the reader 101 is subsequently received by the reader 102, the corresponding facial feature amount may be collated by the face collation unit 119.
- the selection control unit 114 may deliver the feature amounts stored in the storage unit 113 to the face matching unit 119 in the order in which the wireless tags' IDs were received by the reader 101 (the facial feature amount of the earliest-received ID first). In this case, the ID acquisition unit 111 may record the time at which each ID was received from a wireless tag, and the storage unit 113 stores the reception time of the ID in association with the facial feature amount acquired corresponding to that ID.
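Storing each facial feature amount together with its ID reception time, and replaying them earliest-first, can be sketched as follows. The class and method names are hypothetical, and the storage unit 113 is only loosely modelled.

```python
import time

class FeatureStore:
    """Sketch of the storage unit 113: facial feature amounts keyed by tag ID,
    each kept together with the reception time used for collation ordering."""

    def __init__(self):
        self.entries = {}  # tag_id -> (feature, received_at)

    def put(self, tag_id, feature, received_at=None):
        ts = received_at if received_at is not None else time.time()
        self.entries[tag_id] = (feature, ts)

    def in_reception_order(self):
        """Tag IDs sorted earliest-received first, i.e. the collation order."""
        return [tid for tid, (_, ts) in
                sorted(self.entries.items(), key=lambda kv: kv[1][1])]
```

The face matching unit would then consume features in the order returned by `in_reception_order()`.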
- the selection control unit 114 may determine the order in which the facial feature amounts acquired by the feature amount acquisition unit 112 are supplied to the face matching unit 119 based on the order in which the feature amount acquisition unit 112 acquired them from the data server 40. For example, the selection control unit 114 may supply the facial feature amount acquired earliest from the data server 40 to the face matching unit 119 as the first matching target.
- the selection control unit 114 also includes the face feature amount corresponding to the ID received from the same wireless tag in both the reader 101 and the reader 102 for the face feature amount acquired from the data server 40 by the feature amount acquisition unit 112. May be supplied to the face collation unit 119 in preference to the face feature amount corresponding to the ID received from the wireless tag by either the reader 101 or the reader 102.
- the face feature amount corresponding to the ID received from Mr. A's wireless tag 20A in FIG. 4 may be collated in preference to the face feature amount corresponding to the ID received from Mr. B's wireless tag 20B. This is because the ID of Mr. A's wireless tag 20A has been received by both the reader 101 and the reader 102, whereas the ID of Mr. B's wireless tag 20B has been received only by the reader 101.
- the selection control unit 114 may determine the order in which the face feature amounts acquired by the feature amount acquisition unit 112 are supplied to the face matching unit 119 based on the user's direction of travel through the gate (second embodiment described later).
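The selection-order policies above (IDs seen by both readers first, then earliest-received tag ID first) could be sketched as follows. This is a minimal illustration; the record fields (`tag_id`, `received_at`, `seen_by_reader1`, `seen_by_reader2`, `features`) are hypothetical names, not part of the specification.

```python
# Sketch of the selection control unit 114's ordering policy (hypothetical
# field names): candidates whose tag ID was received by BOTH the reader 101
# and the reader 102 are collated first; ties are broken by the earliest
# ID-reception time recorded by the ID acquisition unit.

def collation_order(candidates):
    """candidates: list of dicts with keys 'tag_id', 'received_at' (seconds),
    'seen_by_reader1', 'seen_by_reader2' (bool), and 'features'.
    Returns the candidates sorted into collation order."""
    return sorted(
        candidates,
        key=lambda c: (
            # False sorts before True: both-readers-seen entries come first
            not (c["seen_by_reader1"] and c["seen_by_reader2"]),
            # then earliest ID reception first
            c["received_at"],
        ),
    )
```

In this sketch, Mr. A (seen by both readers) would be ordered ahead of Mr. B (seen only by the reader 101), matching the priority described for FIG. 4.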
- the image data acquisition unit 116 acquires the image data from the camera 103 when the sensor 104 detects the passage of the user.
- the image data acquisition unit 116 may instruct the camera 103 to take an image and acquire an image.
- the image data acquisition unit 116 acquires a moving image or a still image (one or a plurality of continuous images) captured by the camera 103 and stores it in a buffer (circular buffer or ring buffer) (not shown).
- when a detection signal is received from the sensor interface 107, an image (frame image) corresponding to that timing may be supplied to the face detection unit 117. Alternatively, reception of the wireless tag ID by the reader 102 may serve as the trigger for image acquisition by the camera 103 and the image data acquisition unit 116.
- the image data acquisition unit 116 may acquire the image data from the camera 103 when the reader 102 receives the ID from the wireless tag.
- in that case, the signal line from the sensor interface 107 to the image data acquisition unit 116 is removed, and the output of the ID acquisition unit 115 is input to the image data acquisition unit 116.
- the face detection unit 117 performs face detection for each of the image data sequentially acquired by the image data acquisition unit 116.
- the face detection unit 117 detects, for example, the face of the user who is traveling in the lane as a detected face image from the image data sequentially acquired by the image data acquisition unit 116.
- the face outline (edge) is extracted from the image using, for example, horizontal and vertical edge filters.
- the algorithm used by the face detection unit 117 for face detection is not particularly limited, and various algorithms can be used. Note that the face detection unit 117 may instruct the face feature amount extraction unit 118 with the priority in the order of face matching when detecting the faces of a plurality of persons in the image data.
- the face feature amount extraction unit 118 may be instructed with a priority so that feature amounts are extracted in order starting from the face image with the largest inter-eye distance, or in order of face size.
- the face feature amount extraction unit 118 extracts a face feature amount that is a feature amount of the face image for each of the face images detected by the face detection unit 117.
- the face feature amount may be configured as a vector including a combination of scalar amount components expressing the feature of the face image.
- the component of the feature amount is not particularly limited, and various kinds of components can be used.
- a positional relationship such as the distance and angle between feature points set at the centers or end points of facial organs such as the eyes, nose, and mouth, the curvature of the face contour, the color distribution of the face surface, a gray value, or the like can be used.
- the number of components of the feature amount is not particularly limited, and can be set as appropriate according to required collation accuracy, processing speed, and the like.
- for each detected face image, the face feature amount extraction unit 118 may store, together with the face image data and the face feature amount, a detection number that identifies the image data and the imaging time at which the detected face image was captured.
- the face collating unit 119 compares the registered face feature amounts with the feature amount extracted by the face feature amount extraction unit 118, for example in the order selected by the selection control unit 114, and determines that the person is the same as the registered image with the highest sum of similarities if that sum exceeds a threshold value. Of the face feature amounts stored in the storage unit 113, the face feature amount determined by the face matching unit 119 to match the extracted feature amount may be deleted via the selection control unit 114.
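As an illustration of the collation step above, the following sketch scores an extracted feature vector against the stored candidates, applies a threshold, and deletes a matched entry from storage. Cosine similarity and the threshold value are assumptions for illustration; the text does not fix the similarity metric.

```python
import math

def match_face(extracted, registered, threshold=0.8):
    """Sketch of the face collation unit 119 (metric and threshold assumed).
    extracted: feature vector (list of floats).
    registered: dict mapping tag ID -> feature vector (the storage unit 113).
    Returns the best-matching tag ID if its similarity clears the threshold
    (removing that entry from storage, as described above), else None."""
    def cosine(a, b):
        dot = sum(x * y for x, y in zip(a, b))
        na = math.sqrt(sum(x * x for x in a))
        nb = math.sqrt(sum(y * y for y in b))
        return dot / (na * nb) if na and nb else 0.0

    best_id, best_sim = None, -1.0
    for tag_id, feat in registered.items():
        sim = cosine(extracted, feat)
        if sim > best_sim:
            best_id, best_sim = tag_id, sim
    if best_sim >= threshold:
        del registered[best_id]  # matched feature amounts are deleted from storage
        return best_id
    return None
```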
- the storage unit 113 may be reset (deleted) via the selection control unit 114 or the like.
- FIG. 6 is a flowchart for explaining an example of the processing procedure of the first exemplary embodiment of the present invention. Although not particularly limited, FIG. 6 illustrates an example in which an IC card is used as a wireless tag.
- the wireless tag (IC card) that has received the radio wave from the reader 101 receives the power, reads the ID stored in the storage unit of the wireless tag (IC card), and wirelessly transmits the ID to the reader 101 (S101). ).
- the ID acquisition unit 111 acquires the ID of the wireless tag (IC card) received by the reader 101 (S102).
- the feature amount acquisition unit 112 acquires the face feature amount from the data server 40 using the ID as a key, and stores it in the storage unit 113 (S103).
- when the user enters the gate, the wireless tag (IC card) responds to the radio wave from the reader 102, and the ID is returned (S104).
- the selection control unit 114 determines the collation order of the face feature amounts stored in the storage unit 113 and selects the face feature amount to be collated from them (S105). That is, when the reader 102 receives the ID from the wireless tag of the user who has entered the gate, the selection control unit 114 may narrow down the matching targets, from among the face feature amounts acquired for the IDs of the one or more users within the wireless communication range before the gate, according to, for example, the user's traveling direction and the priority order of the face feature amounts, before face matching is performed. In this case, the face collation unit 119 may collate the face feature amounts narrowed down to a predetermined number of people against the face feature amount extracted from the image data captured by the camera 103.
- the imaging in step S106 occurs, for example, at a timing later than the acquisition by the feature amount acquisition unit 112 of the feature amount corresponding to the ID received from the user's wireless tag before the user enters the gate lane.
- the image data acquisition unit 116 acquires the image data from the camera 103, and the face detection unit 117 detects the face from the image data (S107).
- that is, after the feature amount acquisition unit 112 acquires the face feature amount corresponding to the ID, the camera 103 may image the user's face when the reader 102 receives the wireless tag ID.
- when the reader 102 receives the ID from the wireless tag after the reader 101, the camera 103 may image the user, and the image data acquisition unit 116 may acquire the image data from the camera 103.
- the face detection unit 117 acquires the face image of a person using, for example, the largest face or the inter-eye distance (the face having the maximum, and sufficiently large, inter-eye distance) or the like (S108). Subsequently, the face feature amount extraction unit 118 extracts the face feature amount of the face image detected by the face detection unit 117 (S109).
- the face collation unit 119 collates whether the face feature amount extracted by the face feature amount extraction unit 118 matches the face feature amount selected by the selection control unit 114. If they match, a match detection signal is output to the open / close control unit 109 (S112).
- the door 108 is kept open (S113). When the face feature amount extracted by the face feature amount extraction unit 118 matches the face feature amount selected by the selection control unit 114, the selection control unit 114 deletes that face feature amount from the storage unit 113.
- the opening / closing control unit 109 may momentarily open the door 108 when the user enters the gate.
- the door 108 may be set to an open state when it is closed and a coincidence detection signal is received.
- if they do not match, the selection control unit 114 selects the face feature amount to be collated next from the face feature amounts stored in the storage unit 113, and the face matching unit 119 collates it with the face feature amount extracted by the face feature amount extraction unit 118. If none of the stored face feature amounts matches, the face matching unit 119 outputs a mismatch signal to the open/close control unit 109 (S114).
- the opening / closing control unit 109 sets the door 108 to the closed state (S115).
- FIG. 7A is a diagram for explaining face detection in the face detection unit 117 of FIG.
- FIG. 7A illustrates an image, acquired by the image data acquisition unit 116, that includes three users.
- the face detection unit 117 selects the face image of the person with the largest inter-eye distance (Mr. A, with an inter-eye distance of 100; 18 in FIG. 7B), and the face feature quantity extraction unit 118 extracts its feature amount (19 in FIG. 7B).
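The inter-eye-distance selection of FIG. 7A could be sketched as below; the detection record fields (`left_eye`, `right_eye` pixel coordinates) are a hypothetical representation of the face detector's output.

```python
def select_primary_face(detections):
    """Sketch of the face detection unit 117's selection rule: pick the
    detected face with the largest inter-eye distance, i.e. the face
    presumed closest to the camera (Mr. A in FIG. 7A).
    detections: list of dicts with 'left_eye' and 'right_eye' (x, y) tuples."""
    def eye_distance(d):
        (x1, y1), (x2, y2) = d["left_eye"], d["right_eye"]
        return ((x1 - x2) ** 2 + (y1 - y2) ** 2) ** 0.5

    return max(detections, key=eye_distance)
```

A practical implementation would also require the maximum distance to exceed a minimum value ("sufficiently large"), so that distant bystanders are not selected when no one is near the gate.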
- FIG. 8 is a diagram illustrating an example of a modification of the first embodiment described with reference to FIG.
- the face authentication device 110 is disposed in the gate device 100, but in the modification of the first embodiment, the face authentication device 110 is disposed outside the gate device 100.
- the face authentication device 110 includes a communication interface 122B that communicates with the communication interface 122A of the gate device 100.
- the ID acquisition units 111 and 115, the image data acquisition unit 116, and the open / close control unit 109 are connected to the communication interface 122B of the face authentication apparatus 110 via a common communication interface 122A.
- the communication interface may be provided individually.
- the communication interface 122A of the gate device 100 multiplexes (multiplexes) the signals from the ID acquisition units 111 and 115 and the image data acquisition unit 116 and transmits the signals to the communication interface 122B, and demultiplexes (separates) the communication interface 122B. Then, it may be transferred to the feature amount acquisition unit 112, the face collation unit 119, and the face detection unit 117 of the transmission destination.
- the image data acquisition unit 116 may be configured to compress and encode the image data acquired by the camera 103 and transmit the compressed image data to the face detection unit 117 and to be decoded on the face detection unit 117 side.
- the communication interfaces 122A and B may be a wired LAN (Local Area Network) such as Ethernet (registered trademark), USB (Universal Serial Bus), RS232, RS485, GPIB (General Purpose Interface Bus), or the like.
- a wireless communication interface such as Bluetooth (registered trademark) may be used.
- the feature amount acquisition unit 112 of the face authentication apparatus 110 is connected to the network 30 via the communication interface 122C, and acquires a face feature amount corresponding to the ID acquired by the ID acquisition unit 111 or the like from the data server 40.
- the collation result in the face collation unit 119 is transmitted to the communication interface 122A of the gate device 100 via the communication interface 122B, transferred from the communication interface 122A to the opening / closing control unit 109, and according to the collation result, the door 108 Open / close control is performed.
- the image data acquisition unit 116 may be integrated with the camera 103.
- the image data acquisition unit 116 may be provided on the face authentication device 110 side.
- image data captured by the camera 103 is transmitted to the image data acquisition unit 116 of the face authentication device 110 via the communication interface 122A and the communication interface 122B.
- an output signal of the sensor interface 107 is transmitted to the image data acquisition unit 116 of the face authentication device 110 via the communication interface 122A and the communication interface 122B.
- the operations of the face authentication device 110, the gate device 100, and the data server 40 are the same as those in the first embodiment, and a description thereof is omitted to avoid duplication.
- FIG. 9A is a diagram for explaining a second exemplary embodiment of the present invention.
- the second embodiment provides a bidirectional lane, and the reader 101 and the reader 102 determine the direction of the user.
- the ID of Mr. A's wireless tag 20A is read in the order of the reader 101 and the reader 102, and the facial feature quantity registered corresponding to the ID is acquired from the data server 40.
- the direction of Mr. A is from the left to the right in the figure.
- the ID of the wireless tag 20B of Mr. B is read by the reader 101, and the facial feature amount registered corresponding to the ID is acquired from the data server (DB) 40.
- reference numerals 171A and 172B denote the wireless areas of the reader 101 and the reader 102 outside the one end and the other end of the gate, respectively.
- FIG. 9B is a table summarizing, based on the presence or absence of ID reception at the reader 101 and the reader 102 in FIG. 9A (○: received, ×: not received), the user's direction, whether the face feature amount is acquired from the data server, and the priority of face collation.
- the collation order may be determined based on how early the face feature amount corresponding to the ID received by the reader 101 was acquired.
- the priority of the face feature amount acquired from the database (data server) (not shown) is raised, and that face feature amount may be collated with the face feature amount extracted from the image data captured by the camera 103.
- FIG. 10 is a schematic plan view illustrating the antenna directivity of the readers 101 and 102 according to the second embodiment described with reference to FIG.
- the reader 101A has a wireless area 171A (corresponding to 171A in FIG. 9A) extending outside one end in the gate longitudinal direction (left side in the figure), and the reader 101B has a wireless area 171B extending inside that end (right side in the figure).
- the reader 102A has a wireless area 172A extending inside the other end in the gate longitudinal direction (left side in the figure), and the reader 102B has a wireless area 172B extending outside that end (right side in the figure).
- the readers 101A and 101B may be configured as the same unit (reader 101).
- the readers 102A and 102B may also be configured as the same unit (reader 102).
- FIG. 11 is a diagram illustrating a configuration example of the reader 101 (102) of the second embodiment.
- the reader 101 includes antennas 1011A and 1011B having directivity; RF (Radio Frequency) circuits 1012A and 1012B, each having a transmitter that converts the transmission signal to an RF frequency, amplifies its power, and transmits it from the antenna 1011A or 1011B, and a receiver that amplifies the RF signal received by the antenna 1011A or 1011B and converts it to an intermediate frequency; a control circuit 1013 that exchanges transmission and reception signals with the RF circuits 1012A and 1012B; and a communication interface 1014.
- the readers 101A and 101B include antennas 1011A and 1011B and RF circuits 1012A and 1012B.
- FIG. 12 is a diagram for explaining the configuration of the second embodiment.
- the reader 101 includes readers 101A and 101B, and the reader 102 includes readers 102A and 102B.
- the difference from the first embodiment described with reference to FIG. 5 is a direction control unit 120, which detects the user's direction based on whether the reader 101 or the reader 102 first received the ID of the same wireless tag, acquires the feature amount corresponding to the ID, and stores it in the storage unit.
- when the reader 101 first receives the wireless tag ID and the reader 102 then receives it, it can be estimated that the user who owns the wireless tag is proceeding from the gate-longitudinal side on which the reader 101 is disposed toward the other side. For this reason, control may be performed so that the user is imaged by the camera 103 that images users approaching the gate from the direction in which the reader 101 is disposed.
- the control circuit 1013 of the reader 101 in FIG. 11 attaches, to the signal from the RF circuit 1012A or 1012B, an identification code indicating which antenna received it, and notifies the communication interface 105.
- the direction control unit 120 can identify the ID signal received by any of the antennas 1011A and 1011B. Similarly, the direction control unit 120 can also identify which of the ID signals received by the readers 102A and 102B for the reader 102.
- the direction control unit 120 delivers ID1 to the feature amount acquisition unit 112.
- the feature amount acquisition unit 112 acquires the face feature amount corresponding to ID 1 from the data server 40 and stores it in the storage unit 113.
- the direction control unit 120 instructs the image data acquisition unit 116 to acquire image data from the camera 103A.
- the direction controller 120 notifies the selection controller 114 of the ID.
- the direction control unit 120 instructs the selection control unit 114 to narrow down the face feature amount to be collated from the face feature amounts acquired by the feature amount acquisition unit 112 and stored in the storage unit 113.
- the selection control unit 114 selects the face feature amount stored in the storage unit 113 in accordance with the priority of collation, and supplies it to the face collation unit 119.
- the direction control unit 120 delivers the ID2 to the feature amount acquisition unit 112.
- the feature amount acquisition unit 112 acquires a face feature amount corresponding to ID2 (ID received by the reader 102B) from the data server 40 and stores it in the storage unit 113.
- the direction control unit 120 instructs the image data acquisition unit 116 to acquire image data from the camera 103B facing away from the camera 103A.
- the direction control unit 120 instructs the selection control unit 114 to narrow down the face feature amounts to be collated from among the face feature amounts acquired by the feature amount acquisition unit 112.
- based on the order of reception of the IDs at the readers 101A and 102A and at the readers 102B and 101B (see FIG. 10), the selection control unit 114 may determine which to prioritize: users proceeding from one side in the gate longitudinal direction or users proceeding from the other side.
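The direction determination from reader reception order could be sketched as follows; the reader names `"101"` and `"102"` and the timestamped event representation are hypothetical, standing in for the identification codes the direction control unit 120 receives.

```python
def travel_direction(events):
    """Sketch of the direction control unit 120's inference (hypothetical
    representation): events is a list of (reader, timestamp) receptions of
    one tag ID, e.g. [("101", 0.0), ("102", 1.2)].
    Returns 'left_to_right' if the reader 101 saw the ID first,
    'right_to_left' if the reader 102 did, or None if only one reader
    (or neither) has seen the ID so far."""
    first = {}
    for reader, t in sorted(events, key=lambda e: e[1]):
        first.setdefault(reader, t)  # keep the earliest reception per reader
    if "101" in first and "102" in first:
        return "left_to_right" if first["101"] < first["102"] else "right_to_left"
    return None
```

The returned direction would then select which camera (103A or 103B) to image with and which stored feature amounts to prioritize, as described above.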
- FIG. 13 is a flowchart for explaining the processing procedure of the second exemplary embodiment of the present invention.
- the processing procedure of FIG. 13 differs from FIG. 6 in that it includes a step S103A of performing direction control based on the signals received by the readers, and a step S105A of determining which direction, left-to-right or right-to-left, is to be prioritized.
- the face authentication device 110 of FIG. 12 may be arranged outside the gate device 100 as in the modification of the first embodiment described with reference to FIG.
- FIG. 14 is a diagram for explaining an exemplary third embodiment of the present invention.
- the ID of the user's wireless tag is received by the reader 101 of lane 2, and the face feature amount corresponding to the ID is acquired from the data server 40 in lane 2; however, the lane that the user actually passes through is assumed to be lane 1 or lane 3.
- the wireless tag ID is then received by the reader 101 of lane 1 or lane 3, and the face feature amount registered corresponding to the ID is acquired from the data server 40.
- the storage unit 113 in each face authentication device 110 (gate device 100) in a plurality of lanes is shared.
- the face feature amount corresponding to the ID is acquired from the data server 40 in lane 2, but even if the lane that the user actually passes through changes to lane 1 or lane 3, the face feature amount need not be acquired again from the data server 40 in the changed-to lane.
- the face feature amount corresponding to the ID of the wireless tag is shared by a plurality of lanes, but the face feature amount collated in the predetermined lane is deleted from the storage unit 113.
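The lane-shared storage unit 113 described above might be sketched as a store that any lane can read and that deletes a feature amount once it has been collated in some lane; the class and method names are hypothetical.

```python
class SharedFeatureStore:
    """Sketch of a storage unit 113 shared by the face authentication
    devices of several lanes (hypothetical interface): a feature amount
    fetched once from the data server in any lane is visible to all lanes,
    and it is removed once collation has succeeded in one of them."""

    def __init__(self):
        self._features = {}  # tag ID -> face feature amount

    def put(self, tag_id, features):
        # called by the lane whose reader 101 first received the tag ID
        self._features[tag_id] = features

    def get(self, tag_id):
        # any lane can look up the shared feature amount
        return self._features.get(tag_id)

    def consume(self, tag_id):
        # called by the lane where collation succeeded: the matched
        # feature amount is deleted from the shared storage
        return self._features.pop(tag_id, None)
```

For example, a feature amount stored by lane 2's device can be read by lane 3's device without another request to the data server 40, and disappears for all lanes once lane 3 completes the collation.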
- the face authentication device 110 may be provided for each of the plurality of lanes, or one face authentication device 110 may be provided in common for the plurality of lanes.
- One face authentication device 110 may be provided outside.
- the lane that is the signal source may be identified in the face authentication device 110 by transmitting to the face authentication device 110 a signal to which a lane number or the like is added.
- FIG. 15 is a diagram for explaining the correspondence between the features of the embodiment and the effects (solved problems).
- by selecting the face to be authenticated using the largest face or the inter-eye distance in the image data acquired by the camera, the user need not stop in front of the camera and look into it; the user can pass the gate by face authentication while moving, that is, with a natural motion.
- first and second readers are provided apart from each other in the longitudinal direction of the lane, thereby enabling authentication in both directions.
- FIG. 16A is a diagram illustrating the wireless tag according to each of the embodiments described above.
- the wireless tag 20 (IC card) includes an RF circuit 203, a control circuit 204, an IC chip 202 including a memory 205 such as a ROM that holds an ID of about 64 to 128 bits, and an antenna 201.
- the wireless tag 20 is a passive type that receives power supply by radio waves from a reader, and does not include a battery.
- the communication distance is 2 to 5 m.
- the frequency may be 2.45 GHz (such as ISO/IEC 18000-4).
- the communication speed may be 20 to 40 kbps. Note that the wireless tag 20 is not limited to the card type shown in FIG. 16A; it may have a rod shape or the like.
- the wireless tag may be a fountain pen type, or a wearable type in which an RFID tag is incorporated in a wristwatch or glasses.
- a smartphone incorporating an RFID tag may also be used.
- FIG. 17 is a diagram for explaining the operation sequence of the first embodiment using the wireless tag of FIG. 16A. The operation sequence of the first embodiment will be described with reference to FIGS.
- the plurality of wireless tags 20 that have received radio waves from the reader 101 return IDs in order (S1).
- the face authentication device 110 transmits a search request including the ID of the wireless tag 20 to the data server 40 (S2).
- the data server 40 transmits the face feature amount corresponding to the received ID of the wireless tag to the face authentication device 110 (S3).
- FIG. 16B is a diagram illustrating information (ID and face feature amount) registered in the storage device (database) 403 of the data server 40.
- the face authentication device 110 stores the face feature amount received from the data server 40 in the storage unit 113.
- the camera 103 takes an image (S4) and transmits it to the face authentication device 110 (S5).
- the image data acquisition unit 116 may receive the image data acquired by the camera 103 based on the detection result of the sensor 104 (FIG. 5 and the like).
- the camera 103 may acquire a moving image and transmit an image at a timing when receiving an instruction from the image data acquisition unit 116 to the image data acquisition unit 116 as a still image.
- the face collation unit 119 of the face authentication device 110 collates the face feature amount extracted from the image data with the face feature amount received from the data server 40 (S6).
- the face collation unit 119 transmits the collation result to the opening / closing control unit 109 (S7).
- the opening / closing control unit 109 sets the gate (door 108) to an open state when they match, and closes when they do not match based on the result of collation (S8).
- FIG. 18A is a diagram illustrating another form of a wireless tag.
- in the wireless tag 20 of FIG. 18A, a part of the feature amount of the user's face is stored in the memory 205.
- the memory 205 includes a rewritable nonvolatile memory such as an EEPROM (Electrically Erasable Programmable Read-Only Memory).
- the data server 40 stores the remaining part of the face feature amount in association with the ID.
- the memory 205 of the wireless tag 20 may be configured to encrypt and store a part of the feature amount of the user's face.
- the data server 40 may encrypt a part of the face feature amount corresponding to the ID and transmit it to the face authentication device 110.
- FIG. 18B is a diagram illustrating information registered in the storage device (database) 403 of the data server 40.
- IDs and partial facial feature quantities are stored in association with each other.
- FIG. 19 is a diagram for explaining an operation sequence when the wireless tag of FIG. 18A is used.
- the plurality of wireless tags 20 that have received radio waves from the reader 101 transmit IDs and partial facial feature quantities (S11).
- the face authentication device 110 that has received the IDs and partial facial feature quantities from the plurality of wireless tags 20 transmits a search request including the ID of the wireless tag 20 to the data server 40 (S12).
- the data server 40 transmits a partial face feature amount corresponding to each ID to the face authentication device 110 as a response to the search request (S13).
- the feature amount combining unit 124 of the face authentication device 110 combines the partial face feature amount corresponding to the ID from the data server 40 and the partial face feature amount of the ID received from the wireless tag 20 (S14).
- the camera 103 captures an image (S15) and transmits it to the face authentication device 110 (S16).
- when the partial face feature amounts are encrypted, the face authentication apparatus 110 decrypts those received from the wireless tag 20 and the data server 40. The partial face feature amount transmitted from the wireless tag 20 is combined with the partial face feature amount from the data server 40, and the face feature amount extracted from the image data is collated with the combined feature amount (S17).
- the face verification unit 119 transmits a verification result (authentication result) to the opening / closing control unit 109 (S18).
- the opening / closing control unit 109 sets the gate (door 108) to an open state when they match, and closes when they do not match based on the result of collation (S19).
- FIG. 20 is a diagram illustrating the device configuration of the fourth embodiment, corresponding to the wireless tag of FIG. 18A. FIG. 20 differs from the first embodiment of FIG. 5 in that a partial feature quantity 1 receiving unit 121, a partial feature quantity 2 acquisition unit 123, and a feature quantity synthesis unit 124 are provided.
- the partial feature quantity 1 receiving unit 121 acquires the partial facial feature quantity transmitted together with the ID from the wireless tag.
- the partial feature quantity 2 acquisition unit 123 acquires a partial facial feature quantity corresponding to the ID transmitted from the data server 40.
- the feature amount combining unit 124 combines the partial face feature amounts 1 and 2 received by the partial feature amount 1 receiving unit 121 and the partial feature amount 2 acquiring unit 123.
- the face collating unit 119 collates the face feature amount obtained by combining the partial face feature amount transmitted from the wireless tag 20 with the partial face feature amount from the data server 40 against the face feature amount extracted from the image data.
- the face matching unit 119 transmits the matching result to the opening / closing control unit 109.
- the opening/closing control unit 109 keeps the door 108 open when the matching result indicates a match, and sets the door 108 to the closed state when it does not.
- the feature amount synthesis unit 124 synthesizes a facial feature amount based on the partial face feature amount received from the wireless tag and the partial feature amount received from the data server 40.
- if the face feature amount were assembled only after imaging, the collation could not be completed at the imaging timing, and the throughput would deteriorate. In this embodiment, the partial face feature amount corresponding to the ID acquired in advance from the wireless tag is obtained from the data server 40 before the camera 103 images the user, so such degradation is avoided.
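The feature amount synthesis unit 124 could be sketched as below. A simple concatenation split between the tag's part and the server's part is assumed for illustration; the text does not specify how the feature vector is actually partitioned, nor the encryption scheme.

```python
def combine_features(part_from_tag, part_from_server):
    """Sketch of the feature amount synthesis unit 124 (assumed split):
    the wireless tag 20 holds one part of the face feature vector and the
    data server 40 the remainder; the full vector is their concatenation.
    In practice each part may arrive encrypted and would be decrypted
    before combining."""
    return list(part_from_tag) + list(part_from_server)
```

One design consequence of this split is that neither the tag alone nor the server alone holds a complete face feature amount, so compromising one of the two does not expose the full biometric template.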
- the face authentication device 110 in FIG. 20 may be arranged outside the gate device 100 as in the modification of the first embodiment described with reference to FIG.
- FIG. 21 is a diagram illustrating another form of the wireless tag.
- the memory 205 is a rewritable nonvolatile memory such as an EEPROM.
- the memory 205 of the wireless tag 20 may be configured to encrypt and store the facial feature amount of the user.
- FIG. 22 is a diagram for explaining an operation sequence when the wireless tag of FIG. 21 is used.
- the plurality of wireless tags 20 that have received radio waves from the reader 101 transmit IDs and facial feature quantities (S21). Note that when the facial feature value from the wireless tag 20 is encrypted and stored, the face authentication apparatus 110 decrypts the facial feature value transmitted from the wireless tag 20.
- the image is taken by the camera 103 (S22) and transmitted to the face authentication device 110 (S23).
- the face collation unit 119 of the face authentication device 110 collates the face feature amount transmitted from the wireless tag 20 with the face feature amount extracted from the image data (S24).
- the face matching unit 119 transmits a matching result (authentication result) to the opening / closing control unit 109 (S25).
- the opening / closing control unit 109 sets the gate (door 108) to an open state when they match, and closes when they do not match based on the result of collation (S26).
- FIG. 23 is a diagram illustrating the device configuration of the fifth embodiment, corresponding to the wireless tag of FIG. 21. FIG. 23 differs from the embodiment of FIG. 5 in that a feature amount receiving unit 125 is provided and the feature amount acquisition unit 112 is omitted.
- the feature amount receiving unit 125 acquires the feature amount transmitted from the wireless tag and stores it in the storage unit 113.
- the face authentication apparatus 110 collates the face feature amount transmitted from the wireless tag 20 with the face feature amount extracted from the image data.
- the face authentication device 110 of FIG. 23 may be arranged outside the gate device 100 as in the modification of the first embodiment described with reference to FIG.
- FIG. 24 is a diagram illustrating a configuration in which the face authentication apparatus according to the above embodiment is mounted on a computer 300.
- the computer 300 includes a processor 301, a storage device 302, a display device 303, and an interface 304.
- the processor 301 can control the face authentication apparatus according to each of the above embodiments by executing a program stored in the storage device 302.
- the display device 303 may display a circle or cross mark and a direction indication to notify the opening or closing of the gate.
- the interface 304 may include interfaces to the readers 1 and 2, the camera, and the sensor, as well as a communication interface connected to the data server 40 via a network.
Abstract
Description
The present invention is based upon and claims the benefit of the priority of Japanese Patent Application No. 2017-073042 (filed on March 31, 2017), the entire disclosure of which is incorporated herein by reference.
The present invention relates to a face authentication system, apparatus, method, and program.
・Face authentication meets little psychological resistance from the person being authenticated,
・authentication is possible even from a distance, and
・it has a psychological deterrent effect against fraud. Owing to these advantages, face authentication technology is used for identity verification in various fields, and its range of application is expanding.
・The face of the subject is captured by a camera,
・facial feature amounts are extracted from the captured image by image recognition technology,
・the extracted features are collated with a database of pre-extracted facial feature amounts, and
・opening/closing control of the door, such as unlocking and locking, is performed depending on whether the collation results in a match. FIG. 1 is a diagram illustrating a typical automatic door using face authentication technology. The face data (facial feature amounts) of a user 11 are registered in advance in a database (DB) 15. The user 11 stands in front of a camera 13, for example beside a door 12. The door 12 is normally locked. A face authentication device 14 extracts facial feature amounts from the image data (face image) of the user 11 captured by the camera 13, and performs face authentication by collating them with the face data (feature amounts) registered in the database 15. An opening/closing control device 16 controls the opening and closing of the door 12 based on the authentication result of the face authentication device 14. The opening/closing control device 16 outputs an unlock signal for the door 12 only for a user 11 who has passed face authentication. For a user who does not pass face authentication, the door 12 remains locked. This prevents the entry of suspicious persons or of anyone other than registered users.
・The larger the number of persons registered in the feature database, the higher the probability that persons with similar feature amounts exist in the database,
・the more persons with similar feature amounts there are, the more false authentications occur, and
・the authentication accuracy decreases.
To address these problems, Patent Literature 1 discloses the following configuration. A wireless terminal device (a wireless tag, e.g. a passive RFID (Radio Frequency Identifier) tag) carried by the authentication target (person) in front of the door wirelessly transmits identification information, which is received by wireless communication means. Image acquisition means acquires an image of the authentication target, and feature acquisition means acquires feature amounts. Storage means stores feature amounts and feature determination information in association with the identification information of a plurality of targets. When feature determination information specifying a determination method different from the standard one is stored in association with the received identification information, the authentication means uses it to authenticate the target by collating the feature amounts stored in association with that identification information with the acquired feature amounts.
・The target is the opening and closing of doors and the like, and the target area is basically a closed area (such as a small to medium-sized room).
・The number of registered face data entries of users to be authenticated is also small to medium.
・Face authentication and door opening/closing are performed one person at a time.
・The authentication area is operated such that the user looks into, for example, a fixed frame (e.g. 13 in FIG. 1).
・No particularly high authentication speed is required.
・Furthermore, each entrance and exit has a single path.
・The door is normally closed and is opened upon successful authentication.
・Authentication is basically in one direction only, for example when entering a room from outside.
FIGS. 3A and 3B are diagrams illustrating an exemplary first embodiment of the present invention. In the first embodiment of the present invention, although not particularly limited thereto, the user carries a passive wireless tag (an RFID (Radio Frequency Identifier) tag; in the case of an IC card, also referred to as a "remote IC card") operating, for example, in the UHF (Ultra High Frequency) band (915 MHz to 928 MHz: 920 MHz) or the 2.45 GHz microwave band, with a communication range of 2 m to 5 m. When a user carrying a passive wireless tag approaches the gate and receives power from the reader 101, the wireless tag returns its tag identifier (ID).
・From the facial feature amounts of the one or more users acquired earlier (one or more persons within the 2 m to 5 m wireless communication range in the lane direction), the collation targets may be selected (narrowed down) according to, for example, the traveling direction of the user or the priority of the facial feature amounts, and
・the selected (narrowed-down) facial feature amounts may be collated with the facial feature amounts extracted from the image data captured by the camera 103.
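The narrowing step described above can be sketched as follows. This is a hypothetical illustration assuming one simple selection rule, namely trying candidates in the order their tags were read; the class name `CandidateQueue` and its methods are invented for the example.

```python
from collections import OrderedDict

# Hypothetical sketch of narrowing the collation candidates: features are
# kept in the order their tag IDs were first read, and only the first k
# candidates are handed to the collation step.
class CandidateQueue:
    def __init__(self):
        self._feats = OrderedDict()  # tag_id -> feature, insertion-ordered

    def on_tag_read(self, tag_id, feature):
        # a tag read twice keeps its original position and feature
        self._feats.setdefault(tag_id, feature)

    def select(self, k):
        """Return the first k (tag_id, feature) pairs in read order."""
        return list(self._feats.items())[:k]
```

Other priority rules (e.g. preferring tags seen by both readers) would replace the simple read-order rule shown here.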
FIG. 8 is a diagram illustrating an example of a modification of the first embodiment described with reference to FIG. 5.
In the example of FIG. 5, the face authentication device 110 is arranged inside the gate device 100; in this modification of the first embodiment, the face authentication device 110 is arranged outside the gate device 100. Referring to FIG. 8, the face authentication device 110 includes a communication interface 122B that communicates with the communication interface 122A of the gate device 100. In FIG. 8, for simplicity, the ID acquisition units 111 and 115, the image data acquisition unit 116, and the opening/closing control unit 109 are connected to the communication interface 122B of the face authentication device 110 via the common communication interface 122A, but each of them may instead be provided with its own communication interface.
FIG. 9A is a diagram illustrating an exemplary second embodiment of the present invention. The second embodiment provides a bidirectional lane, and the traveling direction of the user is determined using the readers 101 and 102. Referring to FIG. 9A, the ID of user A's wireless tag 20A is read by the reader 101 and then by the reader 102, in that order, and the facial feature amount registered for that ID is acquired from the data server 40. User A's direction is from left to right in the figure. The ID of user B's wireless tag 20B is read by the reader 101, and the facial feature amount registered for that ID is acquired from the data server (DB) 40. User C's direction is from right to left in the figure; the ID of the wireless tag 20C is read by the reader 102, and the facial feature amount registered for that ID is acquired from the data server (DB) 40. In FIG. 9A, 171A and 172B denote the wireless areas outside one end and the other end of the gate, covered by the readers 101 and 102, respectively.
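The direction decision in the bidirectional lane can be sketched as follows: the reader (101 or 102) that first reports a given tag ID fixes that user's direction of travel. The event format and the `"ltr"`/`"rtl"` labels are assumptions for illustration.

```python
# Sketch of the bidirectional-lane decision of the second embodiment:
# reader 101 first -> left-to-right; reader 102 first -> right-to-left.
def travel_directions(events):
    """events: iterable of (reader_id, tag_id) in order of reception.
    Returns a mapping tag_id -> 'ltr' or 'rtl'."""
    directions = {}
    for reader_id, tag_id in events:
        if tag_id not in directions:
            # only the first reception of each tag ID decides the direction
            directions[tag_id] = "ltr" if reader_id == 101 else "rtl"
    return directions
```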
FIG. 14 is a diagram illustrating an exemplary third embodiment of the present invention. As shown in FIG. 14 and FIG. 3B, there are a plurality of lanes. Suppose that a user's ID is received by the reader 101 of lane 2 and the facial feature amount corresponding to the wireless tag ID is acquired from the data server 40 in lane 2, but the lane the user actually passes through is lane 3 or lane 1. In this case, the reader 101 of lane 1 or lane 3 would receive the wireless tag ID and acquire the facial feature amount registered for that ID from the data server 40 again. In the third embodiment, the storage units 113 of the face authentication devices 110 (gate devices 100) of the plurality of lanes are shared. Therefore, even when the facial feature amount corresponding to the ID has been acquired from the data server 40 in lane 2 and the lane the user actually passes through changes to lane 3 or lane 1, there is no need to acquire the facial feature amount from the data server 40 again in the new lane 1 or 3. In the third embodiment, the facial feature amounts corresponding to wireless tag IDs are shared among the plurality of lanes, but a facial feature amount that has been collated in a given lane is deleted from the storage unit 113. In the third embodiment, a face authentication device 110 may be provided for each of the plurality of lanes, or a single face authentication device 110 may be provided in common for the plurality of lanes.
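The shared storage of the third embodiment can be sketched as follows: features fetched for any lane are visible to all lanes, and an entry is removed once it has matched in some lane. The locking scheme and all names are assumptions for the example, not the specification's design.

```python
import threading

# Sketch of a storage unit 113 shared across lanes (third embodiment).
class SharedFeatureStore:
    def __init__(self):
        self._lock = threading.Lock()
        self._feats = {}  # tag_id -> facial feature amount

    def put(self, tag_id, feature):
        # called by whichever lane's reader first received the tag ID
        with self._lock:
            self._feats[tag_id] = feature

    def get(self, tag_id):
        # any lane may look up the feature, regardless of which lane fetched it
        with self._lock:
            return self._feats.get(tag_id)

    def delete_on_match(self, tag_id):
        # called by the lane whose collation succeeded; removes for all lanes
        with self._lock:
            self._feats.pop(tag_id, None)
```

A lock is used here so that several lane processes can share one store; the specification leaves the synchronization mechanism open.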
Next, several forms of the wireless tag applied to the present invention will be described.
FIG. 18A is a diagram illustrating another form of the wireless tag. In the wireless tag 20, a part of the facial feature amount of the user is stored in the memory 205. The memory 205 is a rewritable nonvolatile memory such as an EEPROM (Electrically Erasable Programmable Read-Only Memory). The data server 40 stores a part of the facial feature amount in correspondence with the ID. The memory 205 of the wireless tag 20 may be configured to store the part of the user's facial feature amount in encrypted form. The data server 40 may encrypt the part of the facial feature amount corresponding to the ID and transmit it to the face authentication device 110. FIG. 18B is a diagram illustrating the information registered in the storage device (database) 403 of the data server 40. The storage device (database) 403 of the data server 40 stores IDs and partial facial feature amounts (the remainder of the facial feature amount, excluding the part held in the wireless tag 20) in correspondence with each other.
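The split-feature form of FIG. 18A/18B can be sketched as follows: the tag carries one part of the vector, the data server holds the remainder keyed by ID, and the two parts are joined before collation. The concatenation-based split, the `SERVER_DB` contents, and the function name are assumptions for illustration.

```python
# Sketch of combining a tag-held feature part with the server-held
# remainder (FIG. 18A/18B). The server database here is a stand-in dict.
SERVER_DB = {"tag-1": [0.3, 0.4]}  # ID -> remaining part (hypothetical data)


def combined_feature(tag_id, tag_part):
    """Join the part received from the wireless tag with the remainder
    registered for the same ID on the data server."""
    remainder = SERVER_DB.get(tag_id)
    if remainder is None:
        return None  # unknown ID: nothing to collate against
    return list(tag_part) + list(remainder)
```

Because neither side holds the full feature amount, a leaked tag or a leaked server record alone does not reveal the complete biometric template, which is the point of this form.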
FIG. 21 is a diagram illustrating another form of the wireless tag. In the wireless tag 20, the entire facial feature amount of the user is stored in the memory 205. The memory 205 is a rewritable nonvolatile memory such as an EEPROM. The memory 205 of the wireless tag 20 may be configured to store the user's facial feature amount in encrypted form.
FIG. 24 is a diagram illustrating a configuration in which the face authentication device of the above embodiments is implemented on a computer 300. The computer 300 includes a processor 301, a storage device 302, a display device 303, and an interface 304. The processor 301 can control the face authentication device of each of the above embodiments by executing a program stored in the storage device 302. The display device 303 may display a circle or cross mark and a direction indication to notify the opening or closing of the gate. The interface 304 may include interfaces to the readers 1 and 2, the camera, and the sensor, as well as a communication interface connected to the data server 40 via a network.
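One variant described in this specification has the data server flag fraud when search requests bearing the same tag ID arrive from different regions simultaneously or within a short time, which would indicate a cloned tag. A minimal sketch; the time window, the request format, and the function name are assumptions.

```python
# Sketch of a data-server fraud check: the same tag ID queried from two
# different regions within WINDOW seconds is flagged as suspicious.
WINDOW = 60.0  # seconds; illustrative value, not from the specification


def detect_fraud(requests):
    """requests: iterable of (timestamp, region, tag_id).
    Returns the set of tag IDs requested from more than one region
    within WINDOW seconds of each other."""
    history = {}  # tag_id -> list of (timestamp, region) already seen
    flagged = set()
    for ts, region, tag_id in sorted(requests):
        for prev_ts, prev_region in history.setdefault(tag_id, []):
            if prev_region != region and ts - prev_ts <= WINDOW:
                flagged.add(tag_id)
        history[tag_id].append((ts, region))
    return flagged
```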
A step of, upon receiving at a reading unit the identifiers transmitted from the respective wireless tags of one or more users before they enter a gate, acquiring the facial feature amounts registered in correspondence with the respective identifiers;
a step of extracting a facial feature amount from an image in which the user is captured; and
a step of collating whether the extracted feature amount matches the acquired one or more facial feature amounts;
a face authentication method characterized by comprising the above steps.
Before imaging the user who has entered the gate, a search request including the identifier of the wireless tag received by the reading unit is transmitted to a data server,
the data server stores in advance the facial feature amount of the user in correspondence with the identifier stored in the user's wireless tag, and
the facial feature amount corresponding to the identifier is acquired from the data server; the face authentication method according to supplementary note 1.
The face authentication method according to supplementary note 1 or 2, wherein an exit of the gate is set to an open state or a closed state according to a result of the collation.
The face authentication method according to supplementary note 3, wherein the exit of the gate is set to an open state when the extracted facial feature amount matches any of the acquired one or more facial feature amounts, and to a closed state when there is no match.
The reading unit includes: a first reader disposed on one side in a longitudinal direction of the gate, the first reader receiving one or more identifiers transmitted from the wireless tags of one or more users who have entered a wireless area outside one longitudinal end of the gate; and
a second reader disposed on the other side in the longitudinal direction of the gate,
wherein,
when an identifier from the wireless tag is received by the second reader following the first reader, the extracted facial feature amount is collated with the acquired one or more facial feature amounts; the face authentication method according to any one of supplementary notes 1 to 4.
The second reader is also capable of receiving identifiers from the wireless tags of one or more users who have entered a wireless area outside the other longitudinal end of the gate,
the first reader is also capable of receiving an identifier from the wireless tag of a user proceeding from the other longitudinal end of the gate toward the one longitudinal end, and
when the first reader and the second reader receive an identifier from the same wireless tag, it is determined, based on which of them received it first, whether the traveling direction of the user is from the one longitudinal end of the gate to the other end or vice versa; the face authentication method according to supplementary note 5.
The face authentication method according to supplementary note 5 or 6, wherein the facial feature amounts acquired by the acquisition unit in correspondence with the identifiers respectively received by a plurality of the first readers or a plurality of the second readers provided for the respective lanes of the gate are used in common for collation with the extracted facial feature amount in each of the plurality of lanes of the gate.
A face is detected from the captured image data, and
collation with the facial feature amounts acquired by the acquisition unit is performed starting from the user in the front row among the detected faces; the face authentication method according to any one of supplementary notes 1 to 7.
The face authentication method according to any one of supplementary notes 1 to 8, wherein the acquired facial feature amounts are selected and collated with the extracted facial feature amount based on the order in which the identifiers were received from the wireless tags or an order according to the traveling direction of the user.
The face authentication method according to any one of supplementary notes 1 to 8, wherein an order of collation with the extracted facial feature amount is determined based on the order in which the facial feature amounts were acquired.
The face authentication method according to any one of supplementary notes 1 to 8, wherein, among the acquired facial feature amounts, a facial feature amount corresponding to a wireless tag identifier received by both the first reader and the second reader is collated with the extracted facial feature amount in preference to a facial feature amount corresponding to a wireless tag identifier received by only one of the first reader and the second reader.
The face authentication method according to any one of supplementary notes 1 to 11, wherein an image is captured by the imaging unit when a sensor monitoring the progress of the user in a lane of the gate detects the user.
The face authentication method according to any one of supplementary notes 5 to 7, wherein an image is captured by the imaging unit when the identifier of the wireless tag is received by one of the first and second readers and then by the other of the first and second readers.
The face authentication method according to any one of supplementary notes 1 to 13, wherein a facial feature amount that matched as a result of the collation is deleted from the acquired facial feature amounts.
The wireless tag
stores and holds a part of the facial feature amount of the user together with the identifier,
the remaining part of the facial feature amount of the user is registered in the data server,
the identifier of the wireless tag and the part of the facial feature amount are received from the wireless tag,
the remaining part of the facial feature amount registered in correspondence with the identifier is acquired from the data server,
the part of the facial feature amount received from the wireless tag and the acquired remaining part of the facial feature amount are combined, and
the extracted facial feature amount is collated with the combined facial feature amount; the face authentication method according to supplementary note 2.
The wireless tag
stores and holds the facial feature amount of the user together with the identifier,
the identifier of the wireless tag and the facial feature amount are received from the wireless tag, and
the extracted facial feature amount is collated with the facial feature amounts received from the respective wireless tags; the face authentication method according to supplementary note 1.
The face authentication method according to supplementary note 2, wherein the data server detects fraud when a plurality of search requests bearing the same identifier received from the wireless tag are issued to the data server from a plurality of different regions simultaneously or within a certain time range.
A process of, upon receiving at a reading unit the identifiers transmitted from the respective wireless tags of one or more users before they enter a gate, acquiring the facial feature amounts registered in correspondence with the respective identifiers;
a process of extracting a facial feature amount from an image in which the user is captured; and
a process of collating whether the extracted feature amount matches the acquired one or more facial feature amounts; a program causing a computer to execute the above processes.
Before imaging the user who has entered the gate, a search request including the identifier of the wireless tag received by the reading unit is transmitted to a data server,
the data server stores in advance the facial feature amount of the user in correspondence with the identifier stored in the user's wireless tag, and
the program according to supplementary note 18, causing the computer to execute a process of acquiring the facial feature amount corresponding to the identifier from the data server.
The program according to supplementary note 18 or 19, causing the computer to execute a process of setting an exit of the gate to an open state or a closed state according to a result of the collation.
The program according to supplementary note 20, causing the computer to execute a process of setting the exit of the gate to an open state when the extracted facial feature amount matches any of the acquired one or more facial feature amounts, and to a closed state when there is no match.
The reading unit includes: a first reader disposed on one side in the longitudinal direction of the gate, the first reader receiving one or more identifiers transmitted from the wireless tags of one or more users who have entered a wireless area outside one longitudinal end of the gate; and
a second reader disposed on the other side in the longitudinal direction of the gate,
wherein
the program according to any one of supplementary notes 18 to 21, causing the computer to execute a process of, when an identifier from the wireless tag is received by the second reader following the first reader, collating the extracted facial feature amount with the acquired one or more facial feature amounts.
The second reader is also capable of receiving identifiers from the wireless tags of one or more users who have entered a wireless area outside the other longitudinal end of the gate,
the first reader is also capable of receiving an identifier from the wireless tag of a user proceeding from the other longitudinal end of the gate toward the one longitudinal end, and
the program according to supplementary note 22, causing the computer to execute a process of, when the first reader and the second reader receive an identifier from the same wireless tag, determining, based on which of them received it first, whether the traveling direction of the user is from the one longitudinal end of the gate to the other end or vice versa.
The program according to supplementary note 22 or 23, causing the computer to execute a process of using the facial feature amounts acquired by the acquisition unit in correspondence with the identifiers respectively received by a plurality of the first readers or a plurality of the second readers provided for the respective lanes of the gate in common for collation with the extracted facial feature amount in each of the plurality of lanes of the gate.
A face is detected from the captured image data, and
the program according to any one of supplementary notes 18 to 24, causing the computer to execute a process of performing collation with the facial feature amounts acquired by the acquisition unit starting from the user in the front row among the detected faces.
The program according to any one of supplementary notes 18 to 25, causing the computer to execute a process of selecting the acquired facial feature amounts and collating them with the extracted facial feature amount based on the order in which the identifiers were received from the wireless tags or an order according to the traveling direction of the user.
The program according to any one of supplementary notes 18 to 25, causing the computer to execute a process of determining an order of collation with the extracted facial feature amount based on the order in which the facial feature amounts were acquired.
The program according to any one of supplementary notes 18 to 25, causing the computer to execute a process of collating, among the acquired facial feature amounts, a facial feature amount corresponding to a wireless tag identifier received by both the first reader and the second reader with the extracted facial feature amount in preference to a facial feature amount corresponding to a wireless tag identifier received by only one of the first reader and the second reader.
The program according to any one of supplementary notes 18 to 28, causing the computer to execute a process of capturing an image with the imaging unit when a sensor monitoring the progress of the user in a lane of the gate detects the user.
The program according to any one of supplementary notes 22 to 24, causing the computer to execute a process of capturing an image with the imaging unit when the identifier of the wireless tag is received by one of the first and second readers and then by the other of the first and second readers.
The program according to any one of supplementary notes 18 to 30, causing the computer to execute a process of deleting, from the acquired facial feature amounts, a facial feature amount that matched as a result of the collation.
The wireless tag
stores and holds a part of the facial feature amount of the user together with the identifier,
the remaining part of the facial feature amount of the user is registered in the data server,
a process of receiving, from the wireless tag, the identifier of the wireless tag and the part of the facial feature amount;
a process of acquiring, from the data server, the remaining part of the facial feature amount registered in correspondence with the identifier;
a process of combining the part of the facial feature amount received from the wireless tag and the acquired remaining part of the facial feature amount; and
a process of collating the extracted facial feature amount with the combined facial feature amount;
the program according to supplementary note 19, causing the computer to execute the above processes.
The wireless tag
stores and holds the facial feature amount of the user together with the identifier,
a process of receiving, from the wireless tag, the identifier of the wireless tag and the facial feature amount; and
the program according to supplementary note 18, causing the computer to execute a process of collating the extracted facial feature amount with the facial feature amounts received from the respective wireless tags.
A process of, upon receiving at a reading unit the identifiers transmitted from the respective wireless tags of one or more users before they enter a gate, acquiring the facial feature amounts registered in correspondence with the respective identifiers;
a process of extracting a facial feature amount from an image in which the user is captured; and
a process of collating whether the extracted feature amount matches the acquired one or more facial feature amounts; a computer-readable non-transitory recording medium recording a program causing a computer to execute the above processes.
11 User
12 Door
13 Camera
14 Face authentication device
15 Database (DB)
16 Opening/closing control device
17, 17A, 17B, 17C Wireless area (communication range)
18, 18A, 18C Face image data
19 Facial feature amount
20, 20A, 20B, 20C Wireless tag (IC card)
30 Network
40 Data server
100 Gate device
101, 101A, 101B Reader (reader/writer)
102, 102A, 102B Reader (reader/writer)
103, 103A, 103B Camera
104 Sensor (infrared sensor)
105, 106, 122, 122A, 122B, 122C Communication interface
107 Sensor interface
108 Door
109 Opening/closing control unit
110 Face authentication device
111 ID acquisition unit
112 Feature amount acquisition unit
113 Storage unit
114 Selection control unit
115 ID acquisition unit
116 Image data acquisition unit
117 Face detection unit
118 Facial feature amount extraction unit
119 Face collation unit
120 Direction control unit
121 Partial feature amount 1 reception unit
123 Partial feature amount 2 acquisition unit
124 Feature amount synthesis unit
125 Feature amount reception unit
171, 171A, 171B, 172A, 172B Wireless area (communication range)
201 Antenna
202 IC chip
203 RF circuit
204 Control circuit
205 Memory
300 Computer
301 Processor
302 Storage device
303 Display device
304 Interface
401 Communication interface
402 Control unit
403 Database (storage device)
1011A, 1011B Antenna
1012A, 1012B RF circuit
1013 Control circuit
1014 Communication interface
Claims (33)
- a reading unit that receives one or more identifiers transmitted from the wireless tags of one or more users who have entered a wireless area outside one longitudinal end of a gate;
an acquisition unit that acquires the facial feature amounts registered in correspondence with the respective identifiers received from the respective wireless tags;
an imaging unit that images the user;
an extraction unit that extracts a facial feature amount from image data captured by the imaging unit; and
a face collation unit that receives the facial feature amount extracted by the extraction unit and collates whether that feature amount matches the one or more facial feature amounts acquired by the acquisition unit,
a gate apparatus characterized by comprising the above. - The gate apparatus according to claim 1, wherein the acquisition unit acquires the facial feature amount corresponding to the identifier that the reading unit received from the wireless tag before the imaging unit images the user who has entered the gate.
- The gate apparatus according to claim 1 or 2, comprising an opening/closing control unit that sets an exit of the gate to an open state or a closed state according to a result of the collation by the face collation unit.
- The gate apparatus according to claim 3, wherein the opening/closing control unit sets the exit of the gate to an open state when, as a result of the collation by the face collation unit, the facial feature amount extracted by the extraction unit matches any of the one or more facial feature amounts acquired by the acquisition unit, and sets the exit of the gate to a closed state when there is no match.
- the reading unit comprises: a first reader disposed on one side in a longitudinal direction of the gate, the first reader receiving one or more identifiers transmitted from the wireless tags of one or more users who have entered the wireless area; and
a second reader disposed on the other side in the longitudinal direction of the gate,
wherein,
when an identifier from the wireless tag is received by the second reader following the first reader,
the face collation unit collates the facial feature amount extracted by the extraction unit with the one or more facial feature amounts acquired by the acquisition unit; the gate apparatus according to any one of claims 1 to 4. - The second reader is also capable of receiving identifiers from the wireless tags of one or more users who have entered a wireless area outside the other longitudinal end of the gate,
the first reader is also capable of receiving an identifier from the wireless tag of a user proceeding from the other longitudinal end of the gate toward the one longitudinal end, and
the gate apparatus according to claim 5, comprising a direction control unit that, when the first reader and the second reader receive an identifier from the same wireless tag, determines, based on which of them received it first, whether the traveling direction of the user carrying the wireless tag is from the one longitudinal end of the gate to the other end or vice versa. - The gate apparatus according to claim 5 or 6, wherein the facial feature amounts acquired by the acquisition unit in correspondence with the identifiers received from the wireless tags by a plurality of the first readers or a plurality of the second readers provided for the respective lanes of the gate are used in common for the collation by the face collation unit in the plurality of lanes of the gate.
- comprising a face detection unit that detects faces from the image data captured by the imaging unit,
wherein the face collation unit performs collation with the facial feature amounts acquired by the acquisition unit starting from the user in the front row, identified based on face size and inter-eye distance, among the faces detected by the face detection unit; the gate apparatus according to any one of claims 1 to 7. - The gate apparatus according to any one of claims 1 to 8, comprising a selection control unit that selects the facial feature amounts acquired by the acquisition unit based on the order in which the identifiers were received from the wireless tags or an order according to the traveling direction of the user, and supplies them to the face collation unit.
- The gate apparatus according to any one of claims 1 to 8, comprising a selection control unit that determines, based on the order in which the acquisition unit acquired the facial feature amounts, the order in which the facial feature amounts acquired by the acquisition unit are supplied to the face collation unit.
- The gate apparatus according to any one of claims 5 to 7, comprising a selection control unit that, among the facial feature amounts acquired by the acquisition unit, supplies to the face collation unit a facial feature amount corresponding to an identifier received from the same wireless tag by both the first reader and the second reader in preference to a facial feature amount corresponding to an identifier received from the wireless tag by only one of the first reader and the second reader.
- The gate apparatus according to any one of claims 1 to 11, wherein the imaging unit images the user when a sensor monitoring the progress of the user in a lane of the gate detects the user.
- The gate apparatus according to any one of claims 5 to 7, wherein an image is captured by the imaging unit when the identifier of the wireless tag is received by one of the first and second readers and then by the other of the first and second readers.
- a reading unit that receives one or more identifiers transmitted from the wireless tags of one or more users who have entered a wireless area outside one longitudinal end of a gate;
an imaging unit that images the user; and
an opening/closing control unit that, when a facial feature amount is extracted from an image captured by the imaging unit, sets an exit of the gate to an open state or a closed state based on whether that facial feature amount matches the facial feature amount registered in correspondence with the identifier,
a gate apparatus characterized by comprising the above. - an acquisition unit that, upon receiving at a reading unit the identifiers transmitted from the respective wireless tags of one or more users before they enter a gate, acquires the facial feature amounts registered in correspondence with the respective identifiers;
an extraction unit that extracts a facial feature amount from an image in which the user is captured; and
a face collation unit that, when the extraction unit extracts the facial feature amount, collates whether that feature amount matches the one or more facial feature amounts acquired by the acquisition unit,
a face authentication system characterized by comprising the above. - comprising a data server that stores in advance the facial feature amount of the user in correspondence with the identifier stored in the user's wireless tag,
wherein, before the user who has entered the gate is imaged, the acquisition unit transmits to the data server a search request including the identifier of the wireless tag received by the reading unit, and acquires the facial feature amount corresponding to the identifier from the data server; the face authentication system according to claim 15. - The face authentication system according to claim 15 or 16, comprising an opening/closing control unit that sets the exit of the gate to an open state or a closed state according to a result of the collation by the face collation unit.
- The face authentication system according to claim 17, wherein the opening/closing control unit sets the exit of the gate to an open state when the facial feature amount extracted by the extraction unit matches any of the one or more facial feature amounts acquired by the acquisition unit, and sets the exit of the gate to a closed state when there is no match.
- the reading unit comprises: a first reader disposed on one side in a longitudinal direction of the gate, the first reader receiving one or more identifiers transmitted from the wireless tags of one or more users who have entered a wireless area outside one longitudinal end of the gate; and
a second reader disposed on the other side in the longitudinal direction of the gate,
wherein,
when an identifier from the wireless tag is received by the second reader following the first reader, the face collation unit collates the facial feature amount extracted by the extraction unit with the one or more facial feature amounts acquired by the acquisition unit; the face authentication system according to any one of claims 15 to 18. - The second reader is also capable of receiving identifiers from the wireless tags of one or more users who have entered a wireless area outside the other longitudinal end of the gate,
the first reader is also capable of receiving an identifier from the wireless tag of a user proceeding from the other longitudinal end of the gate toward the one longitudinal end, and
the face authentication system according to claim 19, comprising a direction control unit that, when the first reader and the second reader receive an identifier from the same wireless tag, determines, based on which of them received it first, whether the traveling direction of the user is from the one longitudinal end of the gate to the other end or vice versa. - The face authentication system according to claim 19 or 20, wherein the facial feature amounts acquired by the acquisition unit in correspondence with the identifiers respectively received by a plurality of the first readers or a plurality of the second readers provided for the respective lanes of the gate are used in common for the collation by the face collation unit in the plurality of lanes of the gate.
- comprising a face detection unit that detects faces from image data captured by an imaging unit,
wherein the face collation unit performs collation with the facial feature amounts acquired by the acquisition unit starting from the user in the front row among the faces detected by the face detection unit; the face authentication system according to any one of claims 15 to 21. - The face authentication system according to any one of claims 15 to 22, comprising a selection control unit that selects the facial feature amounts acquired by the acquisition unit based on the order in which the identifiers were received from the wireless tags or an order according to the traveling direction of the user, and supplies them to the face collation unit.
- The face authentication system according to any one of claims 15 to 22, comprising a selection control unit that determines, based on the order in which the acquisition unit acquired the facial feature amounts, the order in which the acquired facial feature amounts are supplied to the face collation unit.
- The face authentication system according to any one of claims 15 to 22, comprising a selection control unit that, among the facial feature amounts acquired by the acquisition unit, supplies to the face collation unit a facial feature amount corresponding to a wireless tag identifier received by both the first reader and the second reader in preference to a facial feature amount corresponding to a wireless tag identifier received by only one of the first reader and the second reader.
- The face authentication system according to any one of claims 15 to 25, wherein an image is captured by an imaging unit when a sensor monitoring the progress of the user in a lane of the gate detects the user.
- The face authentication system according to any one of claims 19 to 21, wherein an image is captured by an imaging unit when the identifier of the wireless tag is received by one of the first and second readers and then by the other of the first and second readers.
- The face authentication system according to any one of claims 15 to 27, wherein a storage unit that stores the facial feature amounts acquired by the acquisition unit deletes a facial feature amount for which the collation by the face collation unit resulted in a match.
- the wireless tag
stores and holds the identifier and a part of the facial feature amount of the user,
the remaining part of the facial feature amount of the user is registered in the data server,
the reading unit
receives, from the wireless tag, the identifier of the wireless tag and the part of the facial feature amount,
the acquisition unit
acquires, from the data server, the remaining part of the facial feature amount registered in correspondence with the identifier,
a synthesis unit is further provided that combines the part of the facial feature amount received from the wireless tag by the reading unit and the remaining part of the facial feature amount acquired by the acquisition unit, and
the face collation unit collates the facial feature amount extracted by the extraction unit with the facial feature amount combined by the synthesis unit; the face authentication system according to claim 16. - the wireless tag
stores and holds the identifier and the facial feature amount of the user,
the reading unit
receives, from the wireless tag, the identifier of the wireless tag and the facial feature amount, and
the face collation unit collates the facial feature amount extracted by the extraction unit with the facial feature amounts received from the respective wireless tags by the reading unit; the face authentication system according to claim 15. - The face authentication system according to claim 16, wherein the data server comprises a control unit that detects fraud when a plurality of search requests bearing the same identifier received from the wireless tag are issued to the data server from a plurality of different regions simultaneously or within a certain time range.
- a step of, upon receiving at a reading unit the identifiers transmitted from the respective wireless tags of one or more users before they enter a gate, acquiring the facial feature amounts registered in correspondence with the respective identifiers;
a step of extracting a facial feature amount from an image in which the user is captured; and
a step of collating whether the extracted feature amount matches the acquired one or more facial feature amounts,
a face authentication method characterized by comprising the above steps. - a process of, upon receiving at a reading unit the identifiers transmitted from the respective wireless tags of one or more users before they enter a gate, acquiring the facial feature amounts registered in correspondence with the respective identifiers;
a process of extracting a facial feature amount from an image in which the user is captured; and
a process of collating whether the extracted feature amount matches the acquired one or more facial feature amounts; a program causing a computer to execute the above processes.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US16/498,788 US11315375B2 (en) | 2017-03-31 | 2018-03-30 | Facial authentication system, apparatus, method and program |
JP2019509398A JP6816821B2 (ja) | 2017-03-31 | 2018-03-30 | 顔認証システム、装置、方法、プログラム |
EP18778014.3A EP3605473A4 (en) | 2017-03-31 | 2018-03-30 | FACIAL RECOGNITION SYSTEM, DEVICE, METHOD AND PROGRAM |
US17/708,345 US20220222993A1 (en) | 2017-03-31 | 2022-03-30 | Facial authentication system, apparatus, method and program field |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2017-073042 | 2017-03-31 | ||
JP2017073042 | 2017-03-31 |
Related Child Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/498,788 A-371-Of-International US11315375B2 (en) | 2017-03-31 | 2018-03-30 | Facial authentication system, apparatus, method and program |
US17/708,345 Continuation US20220222993A1 (en) | 2017-03-31 | 2022-03-30 | Facial authentication system, apparatus, method and program field |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2018181968A1 true WO2018181968A1 (ja) | 2018-10-04 |
Family
ID=63678241
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2018/013797 WO2018181968A1 (ja) | 2017-03-31 | 2018-03-30 | 顔認証システム、装置、方法、プログラム |
Country Status (4)
Country | Link |
---|---|
US (2) | US11315375B2 (ja) |
EP (1) | EP3605473A4 (ja) |
JP (2) | JP6816821B2 (ja) |
WO (1) | WO2018181968A1 (ja) |
Cited By (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110378696A (zh) * | 2019-06-26 | 2019-10-25 | 深圳市万通顺达科技股份有限公司 | 一种刷脸支付方法、装置、可读存储介质及终端设备 |
WO2020167354A1 (en) * | 2019-02-15 | 2020-08-20 | Nec Laboratories America, Inc. | Physical structure, state machine, and concepts of a rfid walk-through gate |
JP2020196991A (ja) * | 2019-05-30 | 2020-12-10 | パナソニックIpマネジメント株式会社 | 認証システム、管理システム、認証方法、プログラム |
JP2021005316A (ja) * | 2019-06-27 | 2021-01-14 | キヤノン株式会社 | システム、システムの制御方法、及びプログラム |
JPWO2021059537A1 (ja) * | 2019-09-27 | 2021-04-01 | ||
WO2021129256A1 (zh) * | 2019-12-25 | 2021-07-01 | 南京兰林智慧建筑科技有限公司 | 一种用于物业管理的小区门禁系统 |
WO2021186576A1 (ja) | 2020-03-17 | 2021-09-23 | 日本電気株式会社 | ゲートシステム、ゲート装置、その画像処理方法、およびプログラム、ならびに、ゲート装置の配置方法 |
JPWO2021186627A1 (ja) * | 2020-03-18 | 2021-09-23 | ||
EP3940653A4 (en) * | 2019-03-15 | 2023-04-05 | Shanghai Huaming Intelligent Terminal Equipment Co., Ltd. | DOOR DEVICE CONTROL METHOD, TERMINAL, DOOR DEVICE AND SYSTEM |
US11893844B2 (en) | 2019-03-04 | 2024-02-06 | Panasonic Intellectual Property Management Co., Ltd. | Face authentication machine and face authentication method |
WO2024038507A1 (ja) * | 2022-08-16 | 2024-02-22 | 日本電気株式会社 | 撮像装置、撮像方法及び記録媒体 |
US11995937B2 (en) | 2019-03-04 | 2024-05-28 | Panasonic Intellectual Property Management Co., Ltd. | Gate open/close control device and gate open/close control method |
US12002046B2 (en) | 2019-03-04 | 2024-06-04 | Panasonic Intellectual Property Management Co., Ltd. | Face authentication system and face authentication method |
Families Citing this family (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP6409929B1 (ja) | 2017-09-19 | 2018-10-24 | 日本電気株式会社 | 照合システム |
WO2020022014A1 (ja) * | 2018-07-25 | 2020-01-30 | 日本電気株式会社 | 情報処理装置、情報処理方法および情報処理プログラム |
US11328546B2 (en) * | 2018-08-29 | 2022-05-10 | Essex Electronics Incorporated | Method and apparatus for operating a RFID system |
EP4038584A4 (en) * | 2019-10-04 | 2022-11-23 | NEC Corporation | INFORMATION PROCESSING APPARATUS, INFORMATION PROCESSING METHOD AND RECORDING MEDIUM |
CN211506604U (zh) * | 2020-03-16 | 2020-09-15 | 支付宝(杭州)信息技术有限公司 | 电子设备及闸机组件 |
WO2022003786A1 (ja) * | 2020-06-29 | 2022-01-06 | 日本電気株式会社 | ゲート装置、制御方法、プログラム |
US11204281B1 (en) * | 2020-09-03 | 2021-12-21 | Sensormatic Electronics, LLC | Enhanced temperature measurement techniques |
US20220084343A1 (en) * | 2020-09-14 | 2022-03-17 | Sanjay Kumar Biswal | Multifunction smart door lock |
CN112309018B (zh) * | 2020-10-30 | 2023-01-31 | 北京市商汤科技开发有限公司 | 一种图像展示方法、装置、计算机设备和存储介质 |
CN112309017A (zh) * | 2020-11-27 | 2021-02-02 | 杭州海康威视数字技术股份有限公司 | 一种基于人脸认证的门禁控制方法及系统 |
JP7194901B2 (ja) * | 2021-05-25 | 2022-12-23 | パナソニックIpマネジメント株式会社 | 認証装置、認証方法、認証システムおよびプログラム |
CN113802491B (zh) * | 2021-08-30 | 2022-12-20 | 厦门熙佺文体科技有限公司 | 一种具有人脸快速检测的通道闸装置 |
CN113723380B (zh) * | 2021-11-03 | 2022-02-08 | 亿慧云智能科技(深圳)股份有限公司 | 基于雷达技术的人脸识别方法、装置、设备和存储介质 |
CN114495294A (zh) * | 2021-12-03 | 2022-05-13 | 华中科技大学鄂州工业技术研究院 | 一种地铁闸机无感支付方法、装置及存储介质 |
US20230334911A1 (en) * | 2022-04-13 | 2023-10-19 | Nec Corporation | Face liveness detection |
Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH0636096A (ja) * | 1992-07-20 | 1994-02-10 | Toshiba Corp | 自動改札装置 |
JPH11120304A (ja) * | 1997-10-13 | 1999-04-30 | Sony Corp | 非接触型icカード用受信装置及び非接触型icカード用受信方法 |
JP2008052549A (ja) | 2006-08-25 | 2008-03-06 | Hitachi Kokusai Electric Inc | 画像処理システム |
JP2010097272A (ja) * | 2008-10-14 | 2010-04-30 | Toshiba Corp | 自動改札機 |
JP2011018300A (ja) * | 2009-06-08 | 2011-01-27 | Jr East Mechatronics Co Ltd | ゲートシステム、サーバ及びゲートシステムにおける紐付け方法 |
JP2013061875A (ja) | 2011-09-14 | 2013-04-04 | Hitachi Information & Communication Engineering Ltd | 認証システム及び信頼度判定方法 |
US20140015978A1 (en) * | 2012-07-16 | 2014-01-16 | Cubic Corporation | Barrierless gate |
JP2016170517A (ja) * | 2015-03-11 | 2016-09-23 | オムロン株式会社 | 通行管理システム、携帯装置、ゲート装置、進入通知プログラム、通行管理プログラム、及び、通行管理方法 |
JP2017059060A (ja) | 2015-09-17 | 2017-03-23 | ソフトバンク株式会社 | 生体照合システム、生体照合方法、生体照合装置及び制御プログラム |
JP2017073042A (ja) | 2015-10-08 | 2017-04-13 | 富士通株式会社 | 画像生成システム、画像生成プログラム及び画像生成方法 |
Family Cites Families (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5414249A (en) * | 1992-07-20 | 1995-05-09 | Kabushiki Kaisha Toshiba | Automatic gate apparatus |
JP2000220333A (ja) * | 1999-01-29 | 2000-08-08 | Toshiba Corp | Person authentication apparatus and method |
JP2000222534A (ja) | 1999-01-29 | 2000-08-11 | Hitachi Ltd | Fraudulent ID detection support system |
JP2003331323A (ja) * | 2002-05-17 | 2003-11-21 | Nippon Signal Co Ltd:The | Automatic gate system |
US20040151347A1 (en) * | 2002-07-19 | 2004-08-05 | Helena Wisniewski | Face recognition system and method therefor |
US7817013B2 (en) * | 2003-09-05 | 2010-10-19 | Honeywell International Inc. | Distributed stand-off ID verification compatible with multiple face recognition systems (FRS) |
JP2006072862A (ja) | 2004-09-03 | 2006-03-16 | Sukurudo Enterprise Kk | Bidirectional automatic ticket gate and fare collection system |
JP4984728B2 (ja) | 2006-08-07 | 2012-07-25 | パナソニック株式会社 | Subject matching device and subject matching method |
JP2012010085A (ja) * | 2010-06-24 | 2012-01-12 | Sony Corp | Stereoscopic display device and control method of stereoscopic display device |
JP5613855B1 (ja) * | 2014-04-23 | 2014-10-29 | 株式会社 ディー・エヌ・エー | User authentication system |
US10489973B2 (en) * | 2015-08-17 | 2019-11-26 | Cubic Corporation | 3D face reconstruction from gate camera |
US10475272B2 (en) * | 2016-09-09 | 2019-11-12 | Tyco Integrated Security, LLC | Architecture for access management |
2018
- 2018-03-30 WO PCT/JP2018/013797 patent/WO2018181968A1/ja active Application Filing
- 2018-03-30 EP EP18778014.3A patent/EP3605473A4/en active Pending
- 2018-03-30 US US16/498,788 patent/US11315375B2/en active Active
- 2018-03-30 JP JP2019509398A patent/JP6816821B2/ja active Active

2020
- 2020-12-24 JP JP2020214895A patent/JP2021061030A/ja active Pending

2022
- 2022-03-30 US US17/708,345 patent/US20220222993A1/en not_active Abandoned
Non-Patent Citations (1)
Title |
---|
See also references of EP3605473A4 |
Cited By (19)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2020167354A1 (en) * | 2019-02-15 | 2020-08-20 | Nec Laboratories America, Inc. | Physical structure, state machine, and concepts of a rfid walk-through gate |
US12002046B2 (en) | 2019-03-04 | 2024-06-04 | Panasonic Intellectual Property Management Co., Ltd. | Face authentication system and face authentication method |
US11995937B2 (en) | 2019-03-04 | 2024-05-28 | Panasonic Intellectual Property Management Co., Ltd. | Gate open/close control device and gate open/close control method |
US11893844B2 (en) | 2019-03-04 | 2024-02-06 | Panasonic Intellectual Property Management Co., Ltd. | Face authentication machine and face authentication method |
EP3940653A4 (en) * | 2019-03-15 | 2023-04-05 | Shanghai Huaming Intelligent Terminal Equipment Co., Ltd. | DOOR DEVICE CONTROL METHOD, TERMINAL, DOOR DEVICE AND SYSTEM |
JP7194898B2 (ja) | 2019-05-30 | 2022-12-23 | パナソニックIpマネジメント株式会社 | Authentication system, management system, authentication method, program |
JP2020196991A (ja) * | 2019-05-30 | 2020-12-10 | パナソニックIpマネジメント株式会社 | Authentication system, management system, authentication method, program |
CN110378696A (zh) * | 2019-06-26 | 2019-10-25 | 深圳市万通顺达科技股份有限公司 | Face-scan payment method, apparatus, readable storage medium, and terminal device |
JP2021005316A (ja) * | 2019-06-27 | 2021-01-14 | キヤノン株式会社 | System, system control method, and program |
JPWO2021059537A1 (ja) * | 2019-09-27 | ||
WO2021059537A1 (ja) * | 2019-09-27 | 2021-04-01 | 日本電気株式会社 | Information processing device, terminal device, information processing system, information processing method, and recording medium |
WO2021129256A1 (zh) * | 2019-12-25 | 2021-07-01 | 南京兰林智慧建筑科技有限公司 | Residential community access control system for property management |
JPWO2021186576A1 (ja) * | 2020-03-17 | ||
JP7424469B2 (ja) | 2020-03-17 | 2024-01-30 | 日本電気株式会社 | Gate system, gate device, image processing method and program therefor, and gate device arrangement method |
WO2021186576A1 (ja) | 2020-03-17 | 2021-09-23 | 日本電気株式会社 | Gate system, gate device, image processing method and program therefor, and gate device arrangement method |
WO2021186627A1 (ja) * | 2020-03-18 | 2021-09-23 | 日本電気株式会社 | Gate device, authentication system, gate device control method, and storage medium |
JPWO2021186627A1 (ja) * | 2020-03-18 | 2021-09-23 | ||
JP7318801B2 (ja) | 2020-03-18 | 2023-08-01 | 日本電気株式会社 | Gate device, authentication system, gate device control method, and program |
WO2024038507A1 (ja) * | 2022-08-16 | 2024-02-22 | 日本電気株式会社 | Imaging device, imaging method, and recording medium |
Also Published As
Publication number | Publication date |
---|---|
US20210110625A1 (en) | 2021-04-15 |
JP6816821B2 (ja) | 2021-01-20 |
US20220222993A1 (en) | 2022-07-14 |
EP3605473A4 (en) | 2020-07-22 |
JPWO2018181968A1 (ja) | 2020-02-06 |
US11315375B2 (en) | 2022-04-26 |
JP2021061030A (ja) | 2021-04-15 |
EP3605473A1 (en) | 2020-02-05 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2018181968A1 (ja) | Face authentication system, apparatus, method, and program | |
EP3494553B1 (en) | Method and system for automated physical access control system using biometric recognition coupled with tag authentication | |
USH2120H1 (en) | Biometric personal identification credential system (PICS) | |
US20070252001A1 (en) | Access control system with RFID and biometric facial recognition | |
JP4412162B2 (ja) | User authentication device and entry/exit management device |
US20090096580A1 (en) | Secure authentication | |
JP6848301B2 (ja) | Authentication system, authentication data management device, gate management device, and authentication method |
WO2020006943A1 (zh) | Distance-sensor-based contactless RFID control method, apparatus, and system |
CN101140620A (zh) | Face recognition system |
CN1972186A (zh) | Mobile identity authentication system and authentication method thereof |
JP2009527804A (ja) | Distributed stand-off ID verification compatible with multiple face recognition systems (FRS) |
RU2711510C1 (ru) | Method for verifying access authorization by means of an access control system |
JP2003331323A (ja) | Automatic gate system |
CN212750034U (zh) | Anti-intrusion system based on RFID and ultra-wideband technology |
CN108765671A (zh) | Trigger-switch-based contactless RFID control method, apparatus, and system |
JP4855180B2 (ja) | Image processing system |
CN208314887U (zh) | Trigger-switch-based contactless RFID control apparatus and system |
JP2001167306A (ja) | IC card management system |
JP2006099687A (ja) | User authentication device |
CN208314886U (zh) | Inductive-switch-based contactless RFID control apparatus and system |
KR101672599B1 (ko) | Biometric authentication apparatus using BLE technology and method therefor |
JP6911999B2 (ja) | Entrance management system |
CN108648320 (zh) | Inductive-switch-based contactless RFID control method, apparatus, and system |
KR20070073168A (ko) | Apparatus and method for preventing loss of belongings |
US10565406B2 (en) | Item management system using tag information |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 18778014; Country of ref document: EP; Kind code of ref document: A1 |
| ENP | Entry into the national phase | Ref document number: 2019509398; Country of ref document: JP; Kind code of ref document: A |
| NENP | Non-entry into the national phase | Ref country code: DE |
| WWE | Wipo information: entry into national phase | Ref document number: 2018778014; Country of ref document: EP |
| ENP | Entry into the national phase | Ref document number: 2018778014; Country of ref document: EP; Effective date: 20191031 |