EP4109873A1 - Gate apparatus, gate apparatus control method, and storage medium

Gate apparatus, gate apparatus control method, and storage medium

Info

Publication number
EP4109873A1
EP4109873A1 (application EP20920411.4A)
Authority
EP
European Patent Office
Prior art keywords
light
user
luminance
gate apparatus
gate
Prior art date
Legal status
Pending
Application number
EP20920411.4A
Other languages
German (de)
English (en)
Other versions
EP4109873A4 (fr)
Inventor
Fumi IRIE
Yoshitaka Yoshimura
Current Assignee
NEC Corp
Original Assignee
NEC Corp
Priority date
Filing date
Publication date
Application filed by NEC Corp filed Critical NEC Corp
Publication of EP4109873A1
Publication of EP4109873A4

Classifications

    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/117Identification of persons
    • A61B5/1171Identification of persons based on the shapes or appearances of their bodies or parts thereof
    • A61B5/1176Recognition of faces
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/103Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/107Measuring physical dimensions, e.g. size of the entire body or parts thereof
    • A61B5/1072Measuring physical dimensions, e.g. size of the entire body or parts thereof measuring distances on the body, e.g. measuring length, height or thickness
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/103Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/107Measuring physical dimensions, e.g. size of the entire body or parts thereof
    • A61B5/1075Measuring physical dimensions, e.g. size of the entire body or parts thereof for measuring dimensions by non-invasive methods, e.g. for determining thickness of tissue layer
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/103Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/107Measuring physical dimensions, e.g. size of the entire body or parts thereof
    • A61B5/1079Measuring physical dimensions, e.g. size of the entire body or parts thereof using optical or photographic means
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/68Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
    • A61B5/6887Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient mounted on external non-worn devices, e.g. non-medical devices
    • A61B5/6891Furniture
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B15/00Special procedures for taking photographs; Apparatus therefor
    • G03B15/02Illuminating scene
    • G03B15/06Special arrangements of screening, diffusing, or reflecting devices, e.g. in studio
    • G03B15/07Arrangements of lamps in studios
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/70Determining position or orientation of objects or cameras
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/10Image acquisition
    • G06V10/12Details of acquisition arrangements; Constructional details thereof
    • G06V10/14Optical characteristics of the device performing the acquisition or on the illumination arrangements
    • G06V10/141Control of illumination
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/52Surveillance or monitoring of activities, e.g. for recognising suspicious objects
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/12Fingerprints or palmprints
    • G06V40/13Sensors therefor
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/12Fingerprints or palmprints
    • G06V40/1365Matching; Classification
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16Human faces, e.g. facial parts, sketches or expressions
    • G06V40/161Detection; Localisation; Normalisation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16Human faces, e.g. facial parts, sketches or expressions
    • G06V40/161Detection; Localisation; Normalisation
    • G06V40/166Detection; Localisation; Normalisation using acquisition arrangements
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16Human faces, e.g. facial parts, sketches or expressions
    • G06V40/172Classification, e.g. identification
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/70Multimodal biometrics, e.g. combining information from different biometric modalities
    • GPHYSICS
    • G07CHECKING-DEVICES
    • G07CTIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
    • G07C9/00Individual registration on entry or exit
    • G07C9/10Movable barriers with registering means
    • G07C9/15Movable barriers with registering means with arrangements to prevent the passage of more than one individual at a time
    • GPHYSICS
    • G07CHECKING-DEVICES
    • G07CTIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
    • G07C9/00Individual registration on entry or exit
    • G07C9/20Individual registration on entry or exit involving the use of a pass
    • G07C9/22Individual registration on entry or exit involving the use of a pass in combination with an identity check of the pass holder
    • G07C9/25Individual registration on entry or exit involving the use of a pass in combination with an identity check of the pass holder using biometric data, e.g. fingerprints, iris scans or voice recognition
    • G07C9/257Individual registration on entry or exit involving the use of a pass in combination with an identity check of the pass holder using biometric data, e.g. fingerprints, iris scans or voice recognition electronically
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/61Control of cameras or camera modules based on recognised objects
    • H04N23/611Control of cameras or camera modules based on recognised objects where the recognised objects include parts of the human body
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70Circuitry for compensating brightness variation in the scene
    • H04N23/74Circuitry for compensating brightness variation in the scene by influencing the scene brightness using illuminating means
    • HELECTRICITY
    • H05ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
    • H05BELECTRIC HEATING; ELECTRIC LIGHT SOURCES NOT OTHERWISE PROVIDED FOR; CIRCUIT ARRANGEMENTS FOR ELECTRIC LIGHT SOURCES, IN GENERAL
    • H05B47/00Circuit arrangements for operating light sources in general, i.e. where the type of light source is not relevant
    • H05B47/10Controlling the light source
    • H05B47/105Controlling the light source in response to determined parameters
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/0059Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence
    • A61B5/0077Devices for viewing the surface of the body, e.g. camera, magnifying lens
    • A61B5/0079Devices for viewing the surface of the body, e.g. camera, magnifying lens using mirrors, i.e. for self-examination
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/117Identification of persons
    • A61B5/1171Identification of persons based on the shapes or appearances of their bodies or parts thereof
    • A61B5/1172Identification of persons based on the shapes or appearances of their bodies or parts thereof using fingerprinting
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/72Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B5/7235Details of waveform analysis
    • A61B5/7264Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10141Special mode during image acquisition
    • G06T2207/10152Varying illumination
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30196Human being; Person
    • G06T2207/30201Face
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/56Cameras or camera modules comprising electronic image sensors; Control thereof provided with illuminating means

Definitions

  • the present invention relates to a gate apparatus, a gate apparatus control method, and a storage medium.
  • Emigration and immigration examination is performed at airports.
  • An officer in charge of the emigration and immigration examination compares a photograph of a face attached to a passport with the face of a person in front of the officer. If the face image in the passport does not match the face of the person in front of the officer, the emigration and immigration of the person is not permitted.
  • a gate apparatus disclosed in NPL 1 performs the emigration and immigration examination by comparing previously registered biological information with biological information acquired by the gate apparatus.
  • To perform face authentication, a terminal needs to acquire a face image of the user in front of the terminal.
  • To achieve accurate matching, the face authentication terminal needs to acquire a high-quality face image. That is, high-quality images need to be used both as the matching target image and as the registered image. For example, if the brightness of one portion of an image differs from the brightness of another portion, the image is not suitable for authentication.
  • According to one aspect of the present invention, there is provided a gate apparatus including: an acquisition unit that acquires biological information about a user; an upper light that emits light from above the user; a lower light that emits light from below the user; and a light control unit that changes the luminance of the light emitted from the upper light and the luminance of the light emitted from the lower light when the biological information about the user is acquired.
  • According to another aspect, there is provided a control method of a gate apparatus including an upper light that emits light from above a user and a lower light that emits light from below the user, the control method including: changing the luminance of the light emitted from the upper light and the luminance of the light emitted from the lower light; and acquiring biological information about the user.
  • According to yet another aspect, there is provided a computer-readable storage medium storing a program that causes a computer mounted on a gate apparatus including an upper light that emits light from above a user and a lower light that emits light from below the user to perform processing for: changing the luminance of the light emitted from the upper light and the luminance of the light emitted from the lower light; and acquiring biological information about the user.
  • the individual aspects of the present invention provide a gate apparatus, a gate apparatus control method, and a storage medium that contribute to acquiring biological information suitable for authentication.
  • the advantageous effect of the present invention is not limited to the above advantageous effect.
  • the present invention may provide other advantageous effects, instead of or in addition to the above advantageous effect.
  • an outline of an example embodiment will be described.
  • various components are denoted by reference characters for the sake of convenience. That is, the following reference characters are used as examples to facilitate the understanding of the present invention. Thus, the description of the outline is not intended to impose any limitations.
  • an individual block illustrated in the drawings represents a configuration of a functional unit, not a hardware unit.
  • An individual connection line between blocks in the drawings signifies both one-way and two-way directions.
  • An arrow schematically illustrates a principal signal (data) flow and does not exclude bidirectionality.
  • elements that can be described in a like way will be denoted by a like reference character, and redundant description thereof will be omitted as needed.
  • A gate apparatus 100 according to an example embodiment includes an upper light 101 that emits light from above a user, a lower light 102 that emits light from below the user, a light control unit 103, and an acquisition unit 104.
  • the acquisition unit 104 acquires biological information about the user.
  • the light control unit 103 changes the luminance of the light emitted from the upper light 101 and the luminance of the light emitted from the lower light 102 when the biological information about the user is acquired.
  • the gate apparatus 100 controls the luminance of the light emitted from the upper light 101 and the luminance of the light emitted from the lower light 102. For example, the gate apparatus 100 changes the luminance of the light emitted from the upper light 101 and the luminance of the light emitted from the lower light 102 such that the user is illuminated with light with uniform illuminance. Specifically, the gate apparatus 100 controls the two light sources such that, for example, the face of the user is illuminated with light with uniform brightness. As a result, biological information suitable for authentication is acquired.
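  • As a rough illustration of this outline, the following minimal Python sketch shows how a light control unit might adjust the two luminances before the acquisition unit captures biological information. All class, method, and parameter names are hypothetical and are not taken from the patent.
```python
# Minimal sketch of the outline of gate apparatus 100 (all names are hypothetical).

class LightControlUnit:
    """Corresponds to the light control unit 103: changes the luminance of both lights."""

    def __init__(self, upper_light, lower_light):
        self.upper_light = upper_light   # corresponds to upper light 101
        self.lower_light = lower_light   # corresponds to lower light 102

    def adjust(self, upper_luminance, lower_luminance):
        # Change both luminances so the user is illuminated with roughly uniform illuminance.
        self.upper_light.set_luminance(upper_luminance)
        self.lower_light.set_luminance(lower_luminance)


class GateApparatus:
    """Corresponds to gate apparatus 100: adjusts the lights, then acquires biological information."""

    def __init__(self, light_control, acquisition_unit):
        self.light_control = light_control
        self.acquisition_unit = acquisition_unit   # corresponds to acquisition unit 104

    def acquire_biological_information(self, upper_luminance, lower_luminance):
        self.light_control.adjust(upper_luminance, lower_luminance)
        return self.acquisition_unit.capture()
```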
  • Fig. 2 is a diagram illustrating an example of a schematic configuration of an emigration and immigration examination system according to the first example embodiment.
  • the emigration and immigration examination system includes a plurality of gate apparatuses 10-1 to 10-3 and a server apparatus 20.
  • any one of these gate apparatuses 10-1 to 10-3 will simply be referred to as a "gate apparatus 10".
  • the number of gate apparatuses 10 included in the system is not of course limited to any particular number.
  • the emigration and immigration examination system includes at least one gate apparatus 10.
  • the individual gate apparatus 10 and the server apparatus 20 can communicate with each other via wired or wireless communication means.
  • the server apparatus 20 may be installed in the same airport where the gate apparatuses 10 are installed. Alternatively, the server apparatus 20 may be installed on a network (cloud).
  • the individual gate apparatus 10 is an apparatus that automatically performs emigration and immigration examination procedures for users.
  • the gate apparatus 10 includes a gate that can be opened and closed. If the gate apparatus 10 determines that a person standing in front of the gate apparatus 10 has passed the emigration and immigration examination and that the person possesses a correct passport, the gate apparatus 10 opens the gate and allows the user to pass through the gate. Thus, the gate apparatus 10 controls the gate based on the result of the emigration and immigration examination of the user.
  • the server apparatus 20 is an apparatus that realizes the emigration and immigration examination with the above gate apparatus 10.
  • the server apparatus 20 stores information about users who can use the gate apparatus 10 (the information will hereinafter be referred to as gate user information).
  • a user visits a passport center and presents his or her passport to an officer in charge at the passport center.
  • the person in charge examines whether the user who has presented the passport is truly the owner of the passport.
  • the person in charge acquires a fingerprint image of the user.
  • the person in charge acquires a fingerprint image of the user by using a fingerprint scanner or the like. This fingerprint image is entered to the server apparatus 20.
  • the server apparatus 20 adds the acquired fingerprint image to a database (which will hereinafter be referred to as a registered user database).
  • the fingerprint image to be registered is a fingerprint image obtained from at least one finger.
  • Other information (for example, the name and passport number) and feature values (kinds and locations of feature points, for example) necessary for matching processing using the fingerprint image may be registered in the registered user database in association with the fingerprint image.
  • When a user stands in front of a gate apparatus 10, the user places his or her finger on a scanner in accordance with an instruction given by the gate apparatus 10.
  • the gate apparatus 10 acquires a fingerprint image of the user and transmits an examination request including the acquired fingerprint image to the server apparatus 20.
  • the server apparatus 20 performs matching processing (1-to-N matching processing; N will hereinafter denote a positive integer) by using the acquired fingerprint image and a plurality of fingerprint images registered in the registered user database. If a fingerprint image that substantially matches the acquired fingerprint image is registered in the registered user database, the server apparatus 20 sets the examination result to "emigration and immigration permitted".
  • If no such fingerprint image is registered, the server apparatus 20 sets the examination result to "emigration and immigration not permitted".
  • the server apparatus 20 transmits the examination result to the gate apparatus 10 that has transmitted the examination request.
  • the gate apparatus 10 determines whether the user in front of the gate apparatus 10 possesses a correct passport (his or her own passport). Specifically, the gate apparatus 10 instructs the user to open and place his or her passport on a scanner. The gate apparatus 10 reads out a face image or the like from an IC (Integrated Circuit) chip in the passport by using a card reader function of the scanner. The gate apparatus 10 acquires a face image of the user by using a camera device. The gate apparatus 10 generates feature values (hereinafter, face feature values) from each of the two face images and determines whether the two sets of feature values substantially match. That is, the gate apparatus 10 performs 1-to-1 matching by using the face feature values obtained by capturing an image of the user in front of the gate apparatus 10 and the face feature values obtained from the IC chip in the passport.
  • If the user has passed the emigration and immigration examination and the two sets of face feature values substantially match, the gate apparatus 10 opens the gate and allows the user to pass through the gate.
  • Fig. 3 is a diagram illustrating an example of a processing configuration (processing modules) of the server apparatus 20 according to the first example embodiment.
  • the server apparatus 20 includes a communication control unit 201, a fingerprint image registration unit 202, an examination unit 203, and a storage unit 204.
  • the communication control unit 201 is means for controlling communication with other apparatuses. Specifically, the communication control unit 201 receives data (packets) from a gate apparatus 10. In addition, the communication control unit 201 transmits data to a gate apparatus 10.
  • the fingerprint image registration unit 202 is means for registering acquired fingerprint images in the registered user database configured in the storage unit 204.
  • the method for acquiring fingerprint images is not limited to any particular method.
  • an officer in charge at a passport center may enter a fingerprint image in the server apparatus 20.
  • an officer in charge operates a fingerprint scanner to acquire a fingerprint image of a user.
  • the officer in charge operates a terminal (a computer installed at the passport center) to transmit the acquired fingerprint image to the server apparatus 20.
  • the above read data may be entered to the server apparatus 20 via an external storage device, such as a USB (Universal Serial Bus) memory.
  • the examination unit 203 is means for processing examination requests transmitted by the gate apparatuses 10. Specifically, the examination unit 203 sets a fingerprint image (biological information) included in an examination request as the matching target fingerprint image and performs matching processing between this fingerprint image and the fingerprint images registered in the registered user database.
  • the examination unit 203 sets a fingerprint image extracted from an examination request as the matching target fingerprint image and performs 1-to-N matching between this fingerprint image and the plurality of fingerprint images registered in the registered user database.
  • the examination unit 203 calculates a score (similarity) between the fingerprint image as the matching target fingerprint image and each of the plurality of fingerprint images registered.
  • the examination unit 203 extracts feature points (edge points, branch points) from each of the matching target fingerprint image and the registered fingerprint images.
  • the examination unit 203 calculates a score indicating a similarity between two fingerprint images, based on the extracted feature points, etc. Specifically, the examination unit 203 matches the core area of one fingerprint image (the center area of one fingerprint) with the core area of another fingerprint image (the center area of another fingerprint) and calculates the above score, for example, based on the locations of and the number of feature points seen from each core area and the number of core lines present between feature points. A higher score represents a higher similarity between two fingerprint images.
  • the examination unit 203 determines whether at least one of the plurality of fingerprint images registered in the registered user database indicates a score more than or equal to a predetermined value with respect to the matching target fingerprint image.
  • If such a fingerprint image is present, the examination unit 203 sets the examination result to "emigration and immigration permitted". However, if a fingerprint image indicating a score more than or equal to the predetermined value is not present in the registered user database, the examination unit 203 sets the examination result to "emigration and immigration not permitted". The examination unit 203 transmits the examination result to the gate apparatus 10 that has transmitted the examination request.
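  • As a minimal sketch of this decision, the following Python fragment applies the score threshold to every registered fingerprint image; the scoring function is passed in as a stand-in for the feature-point-based similarity described above, and the threshold value and names are illustrative only.
```python
# Sketch of the 1-to-N examination decision in the examination unit 203 (illustrative names).

SCORE_THRESHOLD = 0.8  # the "predetermined value"; an assumed example figure

def examine(target_fingerprint, registered_fingerprints, score):
    """Return the examination result for a matching target fingerprint image."""
    for registered in registered_fingerprints:
        # score() stands in for the feature-point-based similarity calculation.
        if score(target_fingerprint, registered) >= SCORE_THRESHOLD:
            return "emigration and immigration permitted"
    return "emigration and immigration not permitted"
```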
  • the storage unit 204 stores various kinds of information necessary for the operation of the server apparatus 20.
  • the registered user database is configured in the storage unit 204.
  • Fig. 4 is a diagram illustrating an example of a hardware configuration of the server apparatus 20 according to the first example embodiment.
  • the server apparatus 20 can be configured by an information processing apparatus (a so-called computer) and has a configuration illustrated as an example in Fig. 4 .
  • the server apparatus 20 includes a processor 211, a memory 212, an input-output interface 213, a communication interface 214, etc.
  • the components such as the processor 211 are connected to an internal bus or the like so that these components can communicate with each other.
  • the hardware configuration of the server apparatus 20 is not limited to the configuration illustrated in Fig. 4 .
  • the server apparatus 20 may include hardware not illustrated or may be configured without the input-output interface 213 if desired.
  • the number of components, such as the number of processors 211, included in the server apparatus 20 is not limited to the example illustrated in Fig. 4 .
  • a plurality of processors 211 may be included in the server apparatus 20.
  • the processor 211 is a programmable device such as a CPU (Central Processing Unit), an MPU (Micro Processing Unit), or a DSP (Digital Signal Processor).
  • the processor 211 may be a device such as an FPGA (Field Programmable Gate Array) or an ASIC (Application Specific Integrated Circuit).
  • the processor 211 executes various kinds of programs including an operating system (OS).
  • the memory 212 is a RAM (Random Access Memory), a ROM (Read-Only Memory), an HDD (Hard Disk Drive), an SSD (Solid State Drive), or the like.
  • the memory 212 stores an OS program, application programs, and various kinds of data.
  • the input-output interface 213 is an interface for a display apparatus and an input apparatus not illustrated.
  • the display apparatus is a liquid crystal display or the like.
  • the input apparatus is an apparatus that receives user operations, and examples of the input apparatus include a keyboard and a mouse.
  • the communication interface 214 is a circuit, a module, or the like for performing communication with other apparatuses.
  • the communication interface 214 includes a NIC (Network Interface Card) or the like.
  • the functions of the server apparatus 20 are realized by various kinds of processing modules.
  • the processing modules are realized, for example, by causing the processor 211 to execute a program stored in the memory 212.
  • this program can be recorded in a computer-readable storage medium.
  • the storage medium may be a non-transient (non-transitory) storage medium, such as a semiconductor memory, a hard disk, a magnetic recording medium, or an optical recording medium. That is, the present invention can be embodied as a computer program product.
  • the above program may be updated by downloading a program via a network or by using a storage medium in which a program is stored.
  • the above processing modules may be realized by semiconductor chips.
  • Fig. 5 is a diagram illustrating an example of the exterior of a gate apparatus 10 according to the first example embodiment.
  • the individual gate apparatus 10 is an apparatus that automatically performs the emigration and immigration examination for users.
  • When a gate apparatus 10 detects the presence of a user in front of the gate apparatus 10, the gate apparatus 10 displays, on a display 401, a procedure of an operation that the user needs to perform for the automatic emigration and immigration examination, for example.
  • the gate apparatus 10 instructs the user to place his or her finger (a predetermined finger; a finger corresponding to the fingerprint image registered in the registered user database) on a scanner 402.
  • the gate apparatus 10 captures an image of the placed finger by controlling the scanner 402.
  • the gate apparatus 10 transmits an examination request including the obtained fingerprint image to the server apparatus 20.
  • After acquiring the fingerprint image, the gate apparatus 10 notifies the user that an image of the face of the user needs to be captured and captures an image of the face of the user by controlling a camera device 403. At this point, the gate apparatus 10 controls the luminance of an upper light 404 attached to a location above the user and the luminance of a lower light 405 attached to a location below the user.
  • the upper light 404 is installed in a ceiling portion 407 attached to a supporting portion 406 extending vertically upward from the main body of the gate apparatus 10. More specifically, the upper light 404 is embedded into the ceiling portion 407. As illustrated in Fig. 5 , since the supporting portion 406 is structured to have a wide width, the display 401 or the camera device 403 can be attached to the supporting portion 406.
  • the lower light 405 is embedded into the main body of the gate apparatus 10.
  • The "main body" of the gate apparatus 10 is a structure that forms the core of the gate apparatus 10. The main body is in contact with the floor, and a gate 408 and the supporting portion 406 are attached to the main body.
  • Fig. 6 is a diagram schematically illustrating a cross section of the gate apparatus 10, taken along a line A-A in Fig. 5 .
  • A user (an examination target user of the gate apparatus 10) is illustrated in Fig. 6.
  • the upper light 404 is installed such that light is emitted to the user from above the user.
  • the lower light 405 is installed such that light is emitted to the user from below the user (more specifically, from below the face of the user).
  • the upper light 404 may be installed on the supporting portion 406.
  • the upper light 404 may be installed on the supporting portion 406 and attached to emit light toward the ceiling portion 407.
  • In this case, a reflective plate (for example, a mirror) may be used to direct the light toward the user.
  • the upper light 404 may be installed to emit light toward the ceiling portion 407, and the gate apparatus 10 may include a reflective plate that reflects the light emitted from the upper light 404.
  • the lower light 405 may be installed on a lower part of the supporting portion 406.
  • the lower light 405 may be installed to emit light to a mirror attached to the main body of the gate apparatus 10.
  • the light emitted from the lower light 405 is reflected by the mirror, and the user is illuminated with the reflected light from below the user.
  • the lower light 405 may be installed to emit light toward the main body of the gate apparatus 10, and the gate apparatus 10 may include a reflective plate that reflects the light emitted from the lower light 405.
  • the gate apparatus 10 controls the luminances of the above two lights such that a face image suitable for matching processing using face images to be described below can be acquired. Details of the light control performed by the gate apparatus 10 will be described below.
  • After capturing an image of the user, the gate apparatus 10 extracts a face area from the image and acquires a face image.
  • Next, the gate apparatus 10 instructs the user to open his or her passport to the page including the photograph of the face of the user and to place the open passport on the scanner 402.
  • the gate apparatus 10 reads out information (hereinafter, MRZ information) written in a Machine Readable Zone (MRZ) in the passport.
  • the gate apparatus 10 acquires a face image stored in an IC chip in the passport by using the MRZ information.
  • the gate apparatus 10 performs matching (1-to-1 matching) between the face image acquired from the camera device 403 and the face image acquired from the IC chip in the passport. If the matching succeeds (if the two face images substantially match), the gate apparatus 10 determines that the user possesses a correct passport. However, if the matching fails (if the two face images are different), the gate apparatus 10 determines that the user does not possess a correct passport.
  • If the above checks succeed, the gate apparatus 10 opens the gate 408 and allows the user (the examination target user) to pass through the gate 408.
  • Otherwise, the gate apparatus 10 keeps the gate 408 closed and displays a predetermined message or the like on the display 401. For example, the gate apparatus 10 displays a message requesting the user to go to a staffed examination booth or to operate the gate apparatus 10 for the automatic examination again.
  • Fig. 7 is an example of a front view schematically illustrating the baggage placement area 430 of the gate apparatus 10.
  • Fig. 7 is a diagram of the baggage placement area 430 seen from the direction in which the user walks toward the gate apparatus 10.
  • When operating the gate apparatus 10, the user places his or her baggage on a top board area 431 or a side area 432 of the baggage placement area 430.
  • the gate apparatus 10 detects whether there is an object on the top board area 431 or the side area 432.
  • the gate apparatus 10 detects an object placed on the top board area 431 by using means such as a weight sensor or a pressure sensor.
  • the gate apparatus 10 detects an object placed on the side area 432 by using a distance sensor using infrared light or by analyzing an image obtained from a camera.
  • the above object detection method of the gate apparatus 10 is only an example.
  • the gate apparatus 10 may use any method or means to detect an object placed in the baggage placement area 430.
  • The object (user's baggage) detection area of the gate apparatus 10 is not limited to the top board area 431 and the side area 432. Any area where users may place their baggage may be set as a detection area. That is, the gate apparatus 10 detects not only objects placed on the top board area 431 and the side area 432 but also objects placed in other areas where users may place their baggage.
  • If the gate apparatus 10 detects the presence of an object in the baggage placement area 430, the gate apparatus 10 sets a "baggage detection flag" to "1". If the gate apparatus 10 no longer detects the object in the baggage placement area 430, the gate apparatus 10 clears the "baggage detection flag" to "0".
  • Upon completion of the emigration and immigration examination on a user (when the gate apparatus 10 has received an examination result and completed the passport possession determination), if "1" is set as the baggage detection flag, the gate apparatus 10 notifies (alerts) the user about his or her left-behind baggage in the baggage placement area 430. For example, the gate apparatus 10 may display an alert message on the display 401 or may output a sound or the like as an alert.
  • one of the conditions for the gate apparatus 10 to open the gate 408 is that the baggage detection flag has been cleared to "0". Even when the user does not notice the display or sound alerting the user about his or her left-behind baggage, since the gate 408 remains closed, the user is prevented from proceeding to the next procedure without his or her baggage.
  • Fig. 8 is a diagram illustrating an example of a hardware configuration of the individual gate apparatus 10 according to the first example embodiment.
  • the gate apparatus 10 includes a processor 311, a memory 312, and a communication interface 313.
  • the gate apparatus 10 includes a display 401, a scanner 402, a camera device 403, an upper light 404, a lower light 405, a gate 408, and an object detector 409.
  • These components such as the processor 311 are connected to an internal bus or the like and configured to communicate with each other.
  • Fig. 8 illustrates only the components connected (electrically connected) to the processor 311.
  • the supporting portion 406, the ceiling portion 407, and the baggage placement area 430 are not illustrated.
  • the main body of the gate apparatus, the supporting portion 406, and the ceiling portion 407 are formed in the shape of the letter C. More specifically, the main body of the gate apparatus 10 (the housing that supports the supporting portion 406) faces the ceiling portion 407. The main body and the ceiling portion 407 are connected to each other by the supporting portion 406. Since the gate apparatus 10 is structured in this way, the main body, the supporting portion 406, and the ceiling portion 407 are formed in the shape of the letter C (specifically, a mirror symmetry of the letter "c").
  • Since the processor 311, the memory 312, and the communication interface 313 are equivalent to those of the server apparatus 20 described with reference to Fig. 4, detailed description thereof will be omitted.
  • the display 401 is a device (for example, a liquid crystal monitor or the like) for outputting information.
  • the scanner 402 is a device that reads out MRZ information from passports and acquires fingerprint images of users.
  • the scanner 402 also has a function of accessing IC chips in passports.
  • the scanner 402 may be installed at any location of the gate apparatus 10. However, it is preferable that the scanner 402 be installed at a location where users can easily place their passports or fingers.
  • the present application will be described assuming that the scanner 402 has a function as a card reader that accesses IC chips, a function as a passport scanner that reads out MRZ information from passports, and a function as a fingerprint scanner that acquires fingerprint images from fingers.
  • these functions may be separated from each other. That is, a card reader, a passport scanner, and a fingerprint scanner may be installed separately in the gate apparatus 10.
  • the camera device 403 is a digital camera installed to capture an image of a person in front of the gate apparatus 10.
  • the camera device 403 may be installed at any location.
  • For example, the camera device 403 may be installed on the main body of the gate apparatus 10 or may be installed away from the gate apparatus 10. As long as the camera device 403 can capture an image of a user (in particular, the face of a user) in front of the gate apparatus 10, the camera device 403 may be installed at any location.
  • the upper light 404 is a light source installed to emit light to the user from above the user.
  • the lower light 405 is a light source installed to emit light to the user from below the user.
  • the luminance of the upper light 404 and the luminance of the lower light 405 are variable. Since the luminance of the upper light 404 and the luminance of the lower light 405 are variable, the illuminance of the light emitted to the user varies. That is, since the luminance of the upper light 404 and the luminance of the lower light 405 are varied, the brightness of the user seen from the camera device 403 varies.
  • Any light source of which luminance is variable may be used as each of the upper light 404 and the lower light 405.
  • For example, an LED (Light Emitting Diode) may be used as each of the upper light 404 and the lower light 405.
  • the luminance can be changed by controlling the current flowing through the LED.
  • the gate 408 shifts from its standby closed state that blocks passage of the user to its open state that allows passage of the user.
  • the mechanism of the gate 408 is not limited to any particular mechanism.
  • For example, the gate 408 is a flap gate that opens and closes a flap installed on one side (or flaps installed on both sides) of the passage, or a turnstile gate that rotates three bars.
  • the object detector 409 is a device for detecting an object placed in the baggage placement area 430. As described above, a weight sensor, a distance sensor, or the like may be used as the object detector 409.
  • the functions of the gate apparatus 10 are realized by various kinds of processing modules.
  • the processing modules are realized by, for example, causing the processor 311 to execute a program stored in the memory 312.
  • Fig. 9 is a diagram illustrating an example of a processing configuration (processing modules) of the individual gate apparatus 10 according to the first example embodiment.
  • the gate apparatus 10 includes a communication control unit 301, a fingerprint image acquisition unit 302, an examination request unit 303, a face image acquisition unit 304, a passport possession determination unit 305, a baggage detection unit 306, a left-behind baggage alert unit 307, a gate control unit 308, and a storage unit 309.
  • the communication control unit 301 is means for controlling communication with other apparatuses. Specifically, the communication control unit 301 receives data (packets) from the server apparatus 20. In addition, the communication control unit 301 transmits data to the server apparatus 20.
  • the fingerprint image acquisition unit 302 is means for acquiring a fingerprint image of a user standing in front of the gate apparatus 10.
  • the fingerprint image acquisition unit 302 acquires a fingerprint image of a user by controlling the scanner 402.
  • the fingerprint image acquisition unit 302 gives the acquired fingerprint image to the examination request unit 303.
  • the examination request unit 303 is means for requesting the server apparatus 20 to perform the emigration and immigration examination on the examination target user (the user standing in front of the gate apparatus 10). Specifically, the examination request unit 303 generates an examination request including the acquired fingerprint image (biological information) and transmits the generated examination request to the server apparatus 20 via the communication control unit 301.
  • the examination request unit 303 generates an examination request including an identifier of this gate apparatus 10 (hereinafter referred to as a gate identifier), a fingerprint image, etc. (see Fig. 10 ).
  • a MAC (Media Access Control) address or an IP (Internet Protocol) address of the gate apparatus 10 may be used as the gate identifier.
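  • A possible shape of such an examination request is sketched below; the field names and the base64 encoding of the image are assumptions for illustration, not the format defined in Fig. 10.
```python
import base64

def build_examination_request(gate_identifier: str, fingerprint_image: bytes) -> dict:
    """Build an examination request payload sent from a gate apparatus 10 to the server apparatus 20."""
    return {
        "gate_identifier": gate_identifier,  # e.g. the MAC or IP address of the gate apparatus
        "fingerprint_image": base64.b64encode(fingerprint_image).decode("ascii"),
    }
```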
  • the examination request unit 303 receives a response to the examination request from the server apparatus 20 via the communication control unit 301.
  • the examination request unit 303 gives the response (an examination result; emigration and immigration permitted or emigration and immigration not permitted) from the server apparatus 20 to the left-behind baggage alert unit 307 and the gate control unit 308.
  • the face image acquisition unit 304 is means for acquiring a face image (biological information) of the user standing in front of the gate apparatus 10. For example, the face image acquisition unit 304 acquires a face image of the user by controlling the camera device 403. The face image acquisition unit 304 gives the acquired face image to the passport possession determination unit 305.
  • the gate apparatus 10 controls the luminance of the upper light 404 and the luminance of the lower light 405.
  • the face image acquisition unit 304 controls the luminance of the upper light 404 and the luminance of the lower light 405 such that the face of the user is illuminated with light with substantially uniform illuminance when an image of the face of the user is captured.
  • the face image acquisition unit 304 may control the luminance of only one of the upper light 404 and the lower light 405.
  • the face image acquisition unit 304 has a function as a light control unit that controls at least two lights and a function as an acquisition unit that acquires biological information about users.
  • the face image acquisition unit 304 determines the luminance of the upper light 404 and the luminance of the lower light 405 based on a physical feature of a user (a user to be photographed). For example, if the body height of the user is high (if the body height of the user is higher than a first threshold), the face image acquisition unit 304 sets the luminance of the lower light 405 to be higher than the luminance of the upper light 404. In contrast, if the body height of the user is low (if the body height of the user is lower than a second threshold), the face image acquisition unit 304 sets the luminance of the upper light 404 to be higher than the luminance of the lower light 405.
  • the face image acquisition unit 304 may set the luminance of the upper light and the luminance of the lower light to be the same.
  • the face image acquisition unit 304 controls the luminance of the upper light 404 and the luminance of the lower light 405 as described above such that the user is illuminated with light with uniform illuminance when an image of the user is captured.
  • the face image acquisition unit 304 measures the body height of the user.
  • the face image acquisition unit 304 refers to table information in which body heights and luminances of the two light sources are defined in advance.
  • the face image acquisition unit 304 controls the upper light 404 and the lower light 405 such that the necessary luminances can be obtained from the table information.
  • Fig. 11 is a diagram illustrating an example of table information in which body heights and luminances of the two light sources are defined.
  • the face image acquisition unit 304 acquires the luminance of the upper light 404 and the luminance of the lower light 405 based on the body height of the user.
  • the face image acquisition unit 304 controls a current flowing through the upper light 404 and a current flowing through the lower light 405 such that the upper light 404 and the lower light 405 emit light with their respective luminances acquired.
  • the face image acquisition unit 304 may use table information in which these relationships are defined in advance or may use a function (a function that outputs current values when receiving luminances).
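  • The following sketch illustrates this table-driven control; the height thresholds, luminance values, and the linear luminance-to-current mapping are illustrative assumptions, not the contents of Fig. 11.
```python
# Sketch of body-height-based luminance selection and luminance-to-current conversion.

HEIGHT_TO_LUMINANCE = [
    # (minimum body height in cm, upper light luminance, lower light luminance) - example values
    (180, 0.5, 1.0),  # tall user: lower light brighter than upper light
    (160, 0.8, 0.8),  # medium height: both lights at the same luminance
    (0,   1.0, 0.5),  # short user: upper light brighter than lower light
]

def luminances_for_height(height_cm: float) -> tuple[float, float]:
    """Look up the upper and lower light luminances for a measured body height."""
    for min_height, upper, lower in HEIGHT_TO_LUMINANCE:
        if height_cm >= min_height:
            return upper, lower
    return 0.8, 0.8  # fallback to default values

def luminance_to_current(luminance: float, max_current_ma: float = 20.0) -> float:
    """Hypothetical linear mapping from a relative luminance (0..1) to an LED current in mA."""
    return luminance * max_current_ma
```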
  • To change the current, the resistance value of a resistor connected between the light source and a power supply may be changed.
  • Alternatively, a voltage applied to the light source may be changed by using a PWM (Pulse Width Modulation) technique or the like.
  • The face image acquisition unit 304 may acquire the body height of the user by any method.
  • For example, a plurality of sensors (for example, infrared distance sensors) may be used to measure the body height of the user.
  • The plurality of sensors are disposed vertically at predetermined intervals.
  • The face image acquisition unit 304 monitors the output of each of the plurality of sensors and measures the body height of the user from the differences among the sensor outputs (the output voltage value of each sensor if infrared distance sensors are used). That is, if the body height of the user is low, fewer sensors react to the user; if the body height of the user is high, more sensors react to the user.
  • the face image acquisition unit 304 determines the body height of the user based on the outputs of the sensors that change depending on the body height of the user. That is, the face image acquisition unit 304 determines the body height of the user based on the locations of the sensors that react to the body height of the user.
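  • A minimal sketch of this sensor-based measurement is shown below; the mounting heights and the boolean "reacted" outputs are assumptions used only to illustrate the idea of taking the highest reacting sensor.
```python
# Sketch of body height estimation from a vertical array of distance sensors.

SENSOR_HEIGHTS_CM = [120, 135, 150, 165, 180, 195]  # assumed mounting height of each sensor

def estimate_height_from_sensors(sensor_reacted: list[bool]) -> float:
    """Return the mounting height of the highest sensor that reacted to the user (0.0 if none)."""
    reacting = [h for h, hit in zip(SENSOR_HEIGHTS_CM, sensor_reacted) if hit]
    return max(reacting) if reacting else 0.0
```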
  • the face image acquisition unit 304 may determine the body height of the user by capturing an image of the user and analyzing the obtained image.
  • When this image (hereinafter referred to as a body height determination image) is captured, the luminance of the upper light 404 and the luminance of the lower light 405 are set to initial values (default values).
  • When the face image acquisition unit 304 acquires a body height determination image, the face image acquisition unit 304 extracts a face area from the image. The face image acquisition unit 304 determines (estimates) the body height of the user based on the location of the extracted face image in the body height determination image. Specifically, if the face image is located at an upper portion in the original image, the face image acquisition unit 304 estimates that the body height of the user is high. In contrast, if the face image is located at a lower portion in the original image, the face image acquisition unit 304 estimates that the body height of the user is low.
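  • As a sketch of this image-based estimation, the fragment below detects a face with OpenCV's bundled Haar cascade (used here only as an example detector) and converts the vertical face position into a rough body height; the camera geometry constants are hypothetical.
```python
import cv2

FRAME_TOP_CM = 200.0  # assumed real-world height corresponding to the top edge of the image
CM_PER_PIXEL = 0.25   # assumed vertical scale of the image

def estimate_height_from_image(image) -> float | None:
    """Estimate body height from the location of the face in a body height determination image."""
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
    faces = cascade.detectMultiScale(gray)
    if len(faces) == 0:
        return None
    x, y, w, h = max(faces, key=lambda f: f[2] * f[3])  # largest detected face
    # A face near the top of the frame (small y) implies a taller user.
    return FRAME_TOP_CM - y * CM_PER_PIXEL
```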
  • a learning model obtained by machine learning may be used to determine the body height of the user.
  • a learning model is generated by preparing many teaching data including images of users and body heights of the users as labels.
  • the face image acquisition unit 304 may acquire a body height by entering a body height determination image to the generated learning model.
  • Any algorithm, such as support vector machine, boosting, or neural network, may be used to generate the learning model. Since known techniques can be used for the above algorithms such as support vector machine, description thereof will be omitted.
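  • A minimal sketch of such a learning-based estimator is shown below, using support vector regression as one example of the algorithms mentioned above; the naive downsampled-pixel features and all names are assumptions, not the patent's method.
```python
import numpy as np
from sklearn.svm import SVR

def to_features(gray_image: np.ndarray) -> np.ndarray:
    """Naive feature vector: downsample and flatten a grayscale image (illustrative only)."""
    return gray_image[::8, ::8].astype(np.float32).ravel()

def train_height_model(images: list[np.ndarray], heights_cm: list[float]) -> SVR:
    """Fit a regressor on teaching data: user images labelled with body heights."""
    X = np.stack([to_features(img) for img in images])  # assumes equally sized images
    model = SVR()
    model.fit(X, heights_cm)
    return model

def predict_height(model: SVR, gray_image: np.ndarray) -> float:
    return float(model.predict(to_features(gray_image)[None, :])[0])
```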
  • the gate apparatus 10 may control the luminance of the upper light 404 and the luminance of the lower light 405 based on the face detection by image processing or the output values of sensors.
  • the passport possession determination unit 305 is means for determining whether the user possesses a correct passport.
  • the passport possession determination unit 305 acquires MRZ information written in a Machine Readable Zone in a passport by controlling the scanner 402.
  • the passport possession determination unit 305 reads out information stored in an IC chip by using the acquired MRZ information.
  • the passport possession determination unit 305 decrypts information read out from an IC chip by using MRZ information acquired by the scanner 402 and acquires a face image stored in the IC chip.
  • the passport possession determination unit 305 extracts feature points from the face image acquired from the face image acquisition unit 304 and the face image acquired from the IC chip. Since an existing technique can be used to extract these feature points, detailed description of the extraction will be omitted. For example, the passport possession determination unit 305 extracts the eyes, nose, mouth, etc. as feature points from the individual face image. Next, the passport possession determination unit 305 calculates, as feature values, the location of the individual feature point and the distance between feature points and generates a feature vector formed by a plurality of feature values (vector information that features the face image).
  • the passport possession determination unit 305 calculates the similarity between two sets of feature values (feature vectors). For this similarity, the chi-squared distance, the Euclidean distance, or the like may be used. A longer distance represents a lower similarity, and a shorter distance represents a higher similarity.
  • If the calculated similarity is more than or equal to a predetermined value, the passport possession determination unit 305 determines that the matching has succeeded. That is, if the above calculated similarity is more than or equal to the predetermined value, the passport possession determination unit 305 determines that the two face images are images obtained by capturing the face of the same person.
  • If the matching has succeeded, the passport possession determination unit 305 determines that the user standing in front of the corresponding gate apparatus 10 possesses a correct passport.
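  • The decision can be summarised by the sketch below: the Euclidean distance between the two feature vectors is converted into a similarity (a longer distance means a lower similarity) and compared with a threshold; the conversion formula and threshold value are illustrative assumptions.
```python
import numpy as np

SIMILARITY_THRESHOLD = 0.7  # the "predetermined value"; an assumed example figure

def similarity(features_a: np.ndarray, features_b: np.ndarray) -> float:
    """Convert the Euclidean distance between two face feature vectors into a similarity."""
    distance = float(np.linalg.norm(features_a - features_b))
    return 1.0 / (1.0 + distance)  # longer distance -> lower similarity

def faces_match(features_a: np.ndarray, features_b: np.ndarray) -> bool:
    return similarity(features_a, features_b) >= SIMILARITY_THRESHOLD
```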
  • the passport possession determination unit 305 notifies the left-behind baggage alert unit 307 and the gate control unit 308 of the determination result (whether or not the user possesses a correct passport).
  • the baggage detection unit 306 is means for detecting whether there is an object in the baggage placement area 430.
  • the baggage detection unit 306 monitors the output of the object detector 409. If the detector (an object detection sensor) detects presence of an object, the baggage detection unit 306 sets the "baggage detection flag” to "1". In addition, if the baggage detection unit 306 no longer detects presence of the object from the output of the object detector 409, the baggage detection unit 306 clears the "baggage detection flag" to "0".
  • the left-behind baggage alert unit 307 is means for giving, if there is an object in the baggage placement area 430, an alert about the left-behind baggage. More specifically, if the left-behind baggage alert unit 307 determines that the user has forgotten to take his or her baggage away from the baggage placement area 430, the left-behind baggage alert unit 307 outputs an alert about the left-behind baggage. The left-behind baggage alert unit 307 checks the baggage detection flag, upon completion of the emigration and immigration examination by the corresponding gate apparatus 10.
  • For example, the left-behind baggage alert unit 307 regards the situation in which the corresponding gate apparatus 10 has received an examination result from the server apparatus 20 via the communication control unit 301 and has received a determination result from the passport possession determination unit 305 as "completion of the emigration and immigration examination".
  • the left-behind baggage alert unit 307 checks the baggage detection flag when the corresponding gate apparatus 10 has received the two results (the examination result and the determination result).
  • if the baggage detection flag is set to "1", the left-behind baggage alert unit 307 determines that the user has forgotten to take away his or her baggage and outputs an alert about the baggage. For example, the left-behind baggage alert unit 307 displays information as illustrated in Fig. 12 on the display 401. Alternatively, the left-behind baggage alert unit 307 may output, from a speaker, an audio message notifying that the user has left behind his or her baggage. Alternatively, the left-behind baggage alert unit 307 may notify a terminal held by the user that the user has left behind his or her baggage.
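The alert decision could be sketched as follows; the `notify` callback stands in for the display, speaker, or user-terminal output described above, and the message wording is an assumption.

```python
def check_left_behind_baggage(examination_result_received: bool,
                              passport_determination_received: bool,
                              baggage_detection_flag: int,
                              notify) -> bool:
    """Emit an alert only when the examination is complete and the baggage
    placement area still holds an object (flag == 1).

    `notify` is a placeholder for the concrete output channel (display 401,
    speaker, or user terminal) and is an assumption of this sketch.
    """
    examination_complete = (examination_result_received
                            and passport_determination_received)
    if examination_complete and baggage_detection_flag == 1:
        notify("Please take your baggage with you.")
        return True
    return False
```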
  • the gate control unit 308 is means for controlling the gate 408 of the gate apparatus 10. If there is no object at least in the baggage placement area 430, the gate control unit 308 controls the gate 408 such that the examination target user can pass through the gate 408. In other words, if there is an object (baggage) in the baggage placement area 430, the gate control unit 308 controls the gate 408 such that the examination target user cannot pass through the gate 408.
  • the gate control unit 308 opens the gate 408. More specifically, when the examination result from the server apparatus 20 indicates "emigration and immigration permitted", the passport possession determination result indicates "possession of passport", and the baggage detection flag indicates "0", the gate control unit 308 opens the gate 408. In principle, unless the above three conditions are met, the gate control unit 308 does not open the gate 408.
  • the gate control unit 308 closes the gate 408 after the user who is allowed to pass through the gate 408 (the user who has passed the emigration and immigration examination) passes through the gate 408.
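A minimal sketch of the three-condition gate decision described above; the literal result strings are assumptions standing in for whatever internal representations the gate apparatus uses.

```python
def should_open_gate(examination_result: str,
                     passport_possession_result: str,
                     baggage_detection_flag: int) -> bool:
    """Open the gate only when all three conditions above hold:
    the examination is permitted, the user possesses a correct passport,
    and no object remains in the baggage placement area (flag == 0)."""
    return (examination_result == "emigration and immigration permitted"
            and passport_possession_result == "possession of passport"
            and baggage_detection_flag == 0)
```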
  • the storage unit 309 is means for storing information necessary for the operation of the gate apparatus 10.
  • Fig. 13 is a sequence diagram illustrating an example of an operation of the emigration and immigration examination system according to the first example embodiment.
  • Fig. 13 is a sequence diagram illustrating an example of a system operation performed on the departure date of a user. The following example assumes that the user has registered his or her "gate user information (fingerprint image)" in the server apparatus 20 before the operation in Fig. 13.
  • the user who has performed the pre-registration for use of the system moves to a gate apparatus 10 and stands in front of the gate apparatus 10.
  • the gate apparatus 10 acquires a fingerprint image of the user (step S01).
  • the gate apparatus 10 transmits an examination request including the fingerprint image to the server apparatus 20 (step S02).
  • the server apparatus 20 performs 1-to-N matching by setting the acquired fingerprint image as the matching target fingerprint image (authentication target fingerprint image) and the fingerprint images stored in the registered user database as the registered fingerprint images (step S03).
  • the server apparatus 20 transmits an examination result (emigration and immigration permitted or emigration and immigration not permitted) obtained as a result of the 1-to-N matching to the gate apparatus 10 that has transmitted the examination request (step S04).
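The 1-to-N matching step on the server side might look like the following sketch, reusing the distance-based similarity convention from the face matching above; the feature-vector representation of fingerprints and the threshold value are assumptions.

```python
import numpy as np

def one_to_n_examination(target_features: np.ndarray,
                         registered_features: list[np.ndarray],
                         threshold: float = 0.8) -> str:
    """Compare the matching target fingerprint features against every entry
    in the registered user database and return the examination result."""
    for registered in registered_features:
        distance = float(np.linalg.norm(target_features - registered))
        similarity = 1.0 / (1.0 + distance)
        if similarity >= threshold:
            return "emigration and immigration permitted"
    return "emigration and immigration not permitted"
```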
  • the gate apparatus 10 acquires a fingerprint image of the examination target user. Next, by transmitting an examination request including the acquired fingerprint image to the server apparatus storing the fingerprint images of the users whose emigration and immigration is permitted, the gate apparatus 10 requests the server apparatus to perform the emigration and immigration examination on the examination target user.
  • the server apparatus 20 determines an examination result by performing matching (1-to-N matching) using the previously registered fingerprint images and transmits the examination result to the gate apparatus 10.
  • the gate apparatus 10 captures an image of the user to acquire a face image and reads out a face image from an IC chip in the passport (acquire face images; step S05).
  • the gate apparatus 10 estimates the body height of the user and controls the two light sources such that the luminance suitable for the body height of the user can be obtained.
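One possible realization of the body-height-based light control (see also supplementary notes 5 and 6 below) is sketched here; the height thresholds and the relative luminance values are assumptions.

```python
def select_light_luminance(body_height_cm: float,
                           first_threshold_cm: float = 180.0,
                           second_threshold_cm: float = 150.0) -> tuple[float, float]:
    """Return (upper_luminance, lower_luminance) as relative values in [0, 1].

    For a tall user the lower light is driven harder; for a short user the
    upper light is driven harder, so that the face is illuminated with
    roughly uniform illuminance. All numeric values are assumptions.
    """
    if body_height_cm > first_threshold_cm:
        return 0.5, 0.9   # lower light brighter than upper light
    if body_height_cm < second_threshold_cm:
        return 0.9, 0.5   # upper light brighter than lower light
    return 0.7, 0.7       # default: roughly equal luminances
```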
  • the gate apparatus 10 performs 1-to-1 matching using the two face images and determines whether the user possesses a correct passport (step S06).
  • the gate apparatus 10 acquires a first face image of the examination target user by controlling the camera device 403.
  • the gate apparatus 10 acquires a second face image stored in the passport possessed by the examination target user. By performing matching between the first and second face images, the gate apparatus 10 determines whether the examination target user possesses a correct passport.
  • the gate apparatus 10 detects whether baggage is placed in the baggage placement area 430. Specifically, the gate apparatus 10 monitors the output of the object detector 409, to detect the state of the baggage placement area 430. If there is baggage placed in the baggage placement area 430, the baggage detection flag is set to "1". If the baggage has been removed from the baggage placement area 430, the baggage detection flag is cleared to "0".
  • when the gate apparatus 10 completes the emigration and immigration examination (when the gate apparatus 10 receives an examination result from the server apparatus 20 and completes the passport possession determination), the gate apparatus 10 checks whether the baggage detection flag is set to "1" (step S07).
  • if the baggage detection flag is set to "1" (Yes in step S07), the gate apparatus 10 outputs an alert about the left-behind baggage (step S08).
  • when the gate apparatus 10 receives an examination result from the server apparatus 20 and completes the passport possession determination, if there is an object in the baggage placement area 430, the gate apparatus 10 outputs an alert about the left-behind baggage. If there is an object in the baggage placement area 430, the gate apparatus 10 sets the baggage detection flag. If there is no object in the baggage placement area 430, the gate apparatus 10 clears the baggage detection flag. The gate apparatus 10 determines whether there is left-behind baggage by performing the above control processing of the baggage detection flag and by checking the baggage detection flag when the gate apparatus 10 receives an examination result from the server apparatus 20 and completes the passport possession determination.
  • the gate apparatus 10 continuously monitors the baggage placement area 430 to detect whether the baggage has been taken away from the baggage placement area 430. That is, the gate apparatus 10 continuously monitors the baggage detection flag (step S09).
  • the gate apparatus 10 opens the gate 408 (step S10).
  • the gate apparatus 10 controls the gate 408 such that the examination target user can pass through the gate 408.
  • the gate apparatus 10 controls the gate 408 such that the examination target user cannot pass through the gate 408.
  • the gate apparatus 10 transmits a matching request (an examination request) using biological information (a fingerprint image) to the server apparatus 20 and receives a matching result (an examination result) from the server apparatus 20.
  • the gate apparatus 10 controls the two light sources such that the face of the user is illuminated with light with uniform illuminance and acquires a face image.
  • the gate apparatus 10 performs matching between this acquired face image and the face image read out from an IC chip in the passport, to determine whether the user possesses a correct passport. If the gate apparatus 10 determines that the examination target user has forgotten to take away his or her baggage from the baggage placement area 430, the gate apparatus 10 alerts the user about the left-behind baggage.
  • the gate apparatus 10 does not open the gate 408 even if the emigration and immigration examination on the user has been completed. As a result, the gate apparatus 10 can prevent the user from leaving the baggage placement area 430 without his or her baggage.
  • the gate apparatus 10 can obtain a face image suitable for face matching. That is, by optimally controlling the luminances of the different lights, the gate apparatus 10 can obtain a high quality image (a face image), regardless of the body height of the user. More specifically, the gate apparatus 10 changes the luminance of the light emitted from the upper light 404 and the luminance of the light emitted from the lower light 405 such that the user is illuminated with uniform illuminance. As a result, for example, the brightness of the light emitted to the face or the like does not vary depending on the location, and biological information suitable for authentication is acquired.
  • the upper light 404 is attached to the ceiling portion 407.
  • This light with a roof (the upper light 404 embedded into the ceiling portion 407) blocks the outside light, etc.
  • the gate apparatus 10 can create an environment suitable for acquiring a face image of a user, regardless of the time of day or the like. That is, since the gate apparatus 10 includes a light with a roof, deterioration of the face authentication accuracy due to disturbance can be prevented.
  • the light with a roof is hung by the supporting portion 406, and the supporting portion 406 is shaped like a thick flat plate. In this way, the display 401 can be installed on the supporting portion 406, and wiring can be routed inside it.
  • when the gate apparatus 10 acquires biological information (a face image), the gate apparatus 10 controls the luminance of the upper light 404 and the luminance of the lower light 405.
  • this control on the two light sources is also applicable to when an iris image is acquired for authentication or when a fingerprint image is acquired for authentication. That is, when an iris image or a fingerprint image is acquired, the two light sources may be controlled such that an eye area or a finger is illuminated with light with uniform luminance.
  • the function of the server apparatus 20 may entirely or partly be realized by the gate apparatus 10.
  • the registered user database may be configured in the gate apparatus 10, and the gate apparatus 10 may perform the emigration and immigration examination on the examination target user by using this database.
  • the registered user database of the server apparatus 20 may be configured in a different database server.
  • the fingerprint image may be used to determine whether the user is a criminal or the like.
  • the server apparatus 20 may perform matching between an acquired fingerprint image and the fingerprint images stored in a blacklist in which fingerprints of criminals are listed.
  • biological information other than fingerprint images may alternatively be stored in the server apparatus 20.
  • face images, voiceprint information, iris information, or the like or feature values thereof may be stored as biological information.
  • if biological information other than a face image can be acquired from the passport, that biological information may be used to determine whether the examination target user possesses a correct passport.
  • the data exchange mode between the individual gate apparatus 10 and the server apparatus 20 is not limited to a particular mode.
  • the data exchanged between these apparatuses may be encrypted.
  • the fingerprint images are personal information; to appropriately protect this personal information, it is desirable that encrypted data be exchanged.
  • the gate apparatus 10 may transmit an examination request having a digital signature to the server apparatus 20. If the server apparatus 20 succeeds in verifying the digital signature, the server apparatus 20 may process the acquired examination request.
  • the server apparatus 20 may be configured to verify the validity of the gate apparatus 10 as the examination request transmission source by verifying a digital signature.
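One possible way to realize the digital signature exchange, sketched with Ed25519 keys from the `cryptography` package; the patent does not prescribe a signature scheme, so the algorithm, key handling, and function names are assumptions.

```python
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric import ed25519

def sign_examination_request(private_key: ed25519.Ed25519PrivateKey,
                             request_payload: bytes) -> bytes:
    """Gate apparatus side: attach a digital signature to the examination request."""
    return private_key.sign(request_payload)

def verify_examination_request(public_key: ed25519.Ed25519PublicKey,
                               request_payload: bytes,
                               signature: bytes) -> bool:
    """Server side: process the examination request only if the signature verifies."""
    try:
        public_key.verify(signature, request_payload)
        return True
    except InvalidSignature:
        return False
```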
  • either the upper light 404 or the lower light 405 may be formed by a plurality of light sources.
  • the face image acquisition unit 304 may change the luminance of the upper light 404 or the lower light 405. That is, by performing digital control on the plurality of light sources, the gate apparatus 10 may control the luminance of the light emitted to the face of the user.
  • a middle light may also be attached to the gate apparatus 10 in addition to the above two light sources. That is, a middle light may be installed between the upper light 404 and the lower light 405, and the luminance of the middle light may be controlled.
  • the face image acquisition unit 304 may control the luminances of the three light sources based on the body height of the user. That is, the gate apparatus 10 controls the luminances of the plurality of light sources such that the user is illuminated with light with uniform illuminance.
  • the above luminances may be controlled based on other information.
  • the luminances of the light sources may be controlled based on the length of the user's hair and whether the user is wearing glasses.
  • the luminances of the light sources may be controlled by comprehensively taking a plurality of elements (the body height and the presence or absence of glasses) into account.
  • the luminances of the lights emitted from the two light sources may be different between a tall user with glasses and a short user with glasses. Whether the user is wearing glasses can be determined by performing image processing using a template or by using a learning model obtained by machine learning.
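A sketch of the template-based glasses check mentioned above, using OpenCV template matching; the template image and the score threshold are assumptions, and a learned model could be used instead.

```python
import cv2

def wears_glasses(face_image_gray, glasses_template_gray,
                  score_threshold: float = 0.6) -> bool:
    """Slide the glasses template over the grayscale face image and report a
    match when the normalized correlation peak exceeds the threshold.

    Both inputs are grayscale images (NumPy arrays); the template and the
    threshold value are assumptions of this sketch.
    """
    result = cv2.matchTemplate(face_image_gray, glasses_template_gray,
                               cv2.TM_CCOEFF_NORMED)
    _, max_score, _, _ = cv2.minMaxLoc(result)
    return max_score >= score_threshold
```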
  • the gate apparatus 10 may analyze a captured face image and control the luminances of the two light sources based on the analysis result of the face image. For example, the gate apparatus 10 (the face image acquisition unit 304) divides a face image into a plurality of small areas and calculates an average brightness value of the pixels constituting each small area. The gate apparatus 10 may control the two light sources such that the calculated average value indicates a predetermined value or more and the variation in brightness among the small areas (for example, the variance or the standard deviation) becomes smaller than a threshold. That is, the gate apparatus 10 may feed the analysis result of the captured face image back to the control of the two light sources, to acquire a face image having uniform brightness.
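The brightness-feedback control could be sketched as follows; the grid size, target mean, variation threshold, and adjustment step are assumptions.

```python
import numpy as np

def analyze_brightness(face_image_gray: np.ndarray, grid: int = 4) -> np.ndarray:
    """Split the face image into grid x grid small areas and return the mean
    brightness of each area (row-major, top of the face first)."""
    means = []
    for strip in np.array_split(face_image_gray, grid, axis=0):
        for cell in np.array_split(strip, grid, axis=1):
            means.append(float(cell.mean()))
    return np.array(means)

def adjust_lights(area_means: np.ndarray, upper: float, lower: float,
                  min_mean: float = 100.0, max_std: float = 20.0,
                  step: float = 0.05) -> tuple[float, float]:
    """One feedback step: raise both luminances while the image is too dark,
    otherwise shift luminance between the two lights while brightness varies
    too much across the small areas. All numeric values are assumptions."""
    if area_means.mean() < min_mean:
        upper, lower = upper + step, lower + step
    elif area_means.std() > max_std:
        half = area_means.size // 2
        top, bottom = area_means[:half], area_means[half:]
        if top.mean() > bottom.mean():        # upper part of the face too bright
            upper, lower = upper - step, lower + step
        else:                                 # lower part of the face too bright
            upper, lower = upper + step, lower - step
    return min(max(upper, 0.0), 1.0), min(max(lower, 0.0), 1.0)
```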
  • the gate apparatus 10 may determine the content of the alert about left-behind baggage by using information obtained from the passport.
  • the left-behind baggage alert unit 307 may change the content of the message or the display method based on information obtained from an MRZ or an IC chip in the passport.
  • the left-behind baggage alert unit 307 may generate an alert message by using the name of the user. For example, if the name of the user is "Taro", the left-behind baggage alert unit 307 may output an alert message "Mr. Taro, please take your baggage with you".
  • the left-behind baggage alert unit 307 may change the language of the alert message based on the nationality of the user. For example, if the nationality of the user is Japan, the left-behind baggage alert unit 307 may output an alert message in Japanese (display an alert message in Japanese on the display 401 or output an audio message in Japanese from a speaker). If the nationality of the user is China, the left-behind baggage alert unit 307 may output an alert message in Chinese.
  • the left-behind baggage alert unit 307 may generate a plurality of alert messages in a plurality of languages. For example, the left-behind baggage alert unit 307 may generate an alert message in "English" as its fixed first language and an alert message in a language corresponding to the nationality of the user as its second language.
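A minimal sketch of the personalized, multi-language alert generation; the nationality-to-language table and the message wording are assumptions.

```python
ALERT_TEXT = {
    "en": "Mr./Ms. {name}, please take your baggage with you.",
    "ja": "{name}様、お荷物をお忘れなくお持ちください。",
    "zh": "{name}，请带好您的行李。",
}

NATIONALITY_TO_LANGUAGE = {"Japan": "ja", "China": "zh"}  # assumed mapping

def build_alert_messages(name: str, nationality: str) -> list[str]:
    """First message is fixed to English; the second follows the user's
    nationality, as described above."""
    messages = [ALERT_TEXT["en"].format(name=name)]
    language = NATIONALITY_TO_LANGUAGE.get(nationality)
    if language:
        messages.append(ALERT_TEXT[language].format(name=name))
    return messages
```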
  • the left-behind baggage alert unit 307 may use a parametric speaker or the like having a strong directivity. By using a parametric speaker or the like, the left-behind baggage alert unit 307 can reliably send an alert message to the user.
  • each of the example embodiments may be used individually or a plurality of example embodiments may be used in combination.
  • part of a configuration according to one example embodiment may be replaced by a configuration according to another example embodiment.
  • a configuration according to one example embodiment may be added to a configuration according to another example embodiment.
  • addition, deletion, or replacement is possible between part of a configuration according to one example embodiment and another configuration.
  • the present invention is suitably applicable, for example, to emigration and immigration examination systems at airports.
  • a gate apparatus including:
  • the gate apparatus according to supplementary note 1, wherein the light control unit changes the luminance of the light emitted from the upper light and the luminance of the light emitted from the lower light such that the user is illuminated with light with uniform illuminance.
  • the gate apparatus according to supplementary note 1 or 2, wherein the light control unit changes the luminance of the light emitted from the upper light and the luminance of the light emitted from the lower light based on a physical feature of the user.
  • the gate apparatus according to supplementary note 3, wherein the light control unit changes the luminance of the light emitted from the upper light and the luminance of the light emitted from the lower light based on a body height of the user.
  • the gate apparatus wherein, when the body height of the user is higher than a first threshold, the light control unit sets the luminance of the lower light to be higher than the luminance of the upper light.
  • the gate apparatus wherein, when the body height of the user is lower than a second threshold, the light control unit sets the luminance of the upper light to be higher than the luminance of the lower light.
  • the gate apparatus according to any one of supplementary notes 4 to 6, wherein the light control unit extracts a face area from an image captured from the user and changes the luminance of the light emitted from the upper light and the luminance of the light emitted from the lower light based on a location of the extracted face area in the captured image.
  • the gate apparatus according to any one of supplementary notes 4 to 6, further including a plurality of sensors that are disposed vertically upward from a main body and that detect the body height of the user, wherein the light control unit changes the luminance of the light emitted from the upper light and the luminance of the light emitted from the lower light based on an output value obtained from each of the plurality of sensors.
  • the gate apparatus according to any one of supplementary notes 1 to 8, wherein the light control unit controls the luminance of the light emitted from the upper light and the luminance of the light emitted from the lower light, to acquire an image suitable for face authentication.
  • a control method of a gate apparatus including an upper light that emits light from above a user and a lower light that emits light from below the user, the control method comprising:
  • a computer-readable storage medium storing a program that causes a computer mounted on a gate apparatus including an upper light that emits light from above a user and a lower light that emits light from below the user to perform processing for:
  • supplementary notes 10 and 11 can be expanded in the same way as supplementary note 1 is expanded into supplementary notes 2 to 9.

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Physics & Mathematics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • General Health & Medical Sciences (AREA)
  • Human Computer Interaction (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Biophysics (AREA)
  • Molecular Biology (AREA)
  • Veterinary Medicine (AREA)
  • Pathology (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Public Health (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • Dentistry (AREA)
  • Signal Processing (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Collating Specific Patterns (AREA)
  • Devices For Checking Fares Or Tickets At Control Points (AREA)
EP20920411.4A 2020-02-18 2020-02-18 Dispositif de porte, procédé de commande de dispositif de porte et support de stockage Pending EP4109873A4 (fr)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2020/006194 WO2021166061A1 (fr) 2020-02-18 2020-02-18 Dispositif de porte, procédé de commande de dispositif de porte et support de stockage

Publications (2)

Publication Number Publication Date
EP4109873A1 true EP4109873A1 (fr) 2022-12-28
EP4109873A4 EP4109873A4 (fr) 2023-03-01

Family

ID=77390723

Family Applications (1)

Application Number Title Priority Date Filing Date
EP20920411.4A Pending EP4109873A4 (fr) 2020-02-18 2020-02-18 Dispositif de porte, procédé de commande de dispositif de porte et support de stockage

Country Status (4)

Country Link
US (1) US20230067694A1 (fr)
EP (1) EP4109873A4 (fr)
JP (1) JP7388530B2 (fr)
WO (1) WO2021166061A1 (fr)

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003015211A (ja) 2001-06-28 2003-01-15 Konica Corp 撮影装置、撮影方法及び撮影システム
JP2003308303A (ja) 2002-04-18 2003-10-31 Toshiba Corp 個人認証装置および通行制御装置
JP2004030156A (ja) 2002-06-25 2004-01-29 Toshiba Corp 顔認証装置および通行制御装置
JP2006019901A (ja) 2004-06-30 2006-01-19 Omron Corp 視覚センサ
JP2010009106A (ja) 2008-06-24 2010-01-14 Oki Electric Ind Co Ltd アイリス撮影装置
US9836647B2 (en) * 2013-10-08 2017-12-05 Princeton Identity, Inc. Iris biometric recognition module and access control assembly
WO2017070638A1 (fr) * 2015-10-23 2017-04-27 Xivix Holdings Llc Système et procédé d'authentification au moyen d'un dispositif mobile
JP7260730B2 (ja) 2017-10-20 2023-04-19 辰巳電子工業株式会社 撮影装置、撮影方法、及び撮影処理プログラム

Also Published As

Publication number Publication date
JP7388530B2 (ja) 2023-11-29
US20230067694A1 (en) 2023-03-02
JPWO2021166061A1 (fr) 2021-08-26
WO2021166061A1 (fr) 2021-08-26
EP4109873A4 (fr) 2023-03-01

Similar Documents

Publication Publication Date Title
EP3067829A1 (fr) Procédé d'authentification de personnes
US20240053658A1 (en) Gate apparatus
US20130088685A1 (en) Iris Cameras
CN102523381A (zh) 对无线设备的功能的受控访问
EP4108860A1 (fr) Dispositif de porte, procédé de commande de dispositif de porte, et support de stockage
US11756338B2 (en) Authentication device, authentication method, and recording medium
KR20150069799A (ko) 얼굴 인증 방법 및 그 장치
CN114360697A (zh) 远程防疫作业方法、系统、设备及存储介质
EP4160527A1 (fr) Dispositif de traitement d'informations, procédé de traitement d'informations et programme
EP4109873A1 (fr) Dispositif de porte, procédé de commande de dispositif de porte et support de stockage
JP2023138550A (ja) ゲート装置、出入国審査システム、ゲート装置の制御方法及びプログラム
EP4123110A1 (fr) Dispositif de porte, système d'authentification, procédé de commande de dispositif de porte et support de stockage
US20220358804A1 (en) Gate device, authentication system, gate control method, and storage medium
US20230065328A1 (en) Gate apparatus, control method of gate apparatus, and storage medium
EP4108858A1 (fr) Dispositif de porte, procédé de commande de dispositif de porte, et support d'enregistrement
KR102583982B1 (ko) 비대면 출입 통제 방법 및 이를 수행하는 출입 통제 시스템
KR102439216B1 (ko) 인공지능 딥러닝 모델을 이용한 마스크 착용 얼굴 인식 방법 및 서버
JP7424469B2 (ja) ゲートシステム、ゲート装置、その画像処理方法、およびプログラム、ならびに、ゲート装置の配置方法
JP7248348B2 (ja) 顔認証装置、顔認証方法、及びプログラム
US20240194013A1 (en) System, gate device, control method for gate device, and storage medium
US20230326254A1 (en) Authentication apparatus, control method, and computer-readable medium
KR20220105738A (ko) 얼굴 인식 기반의 검역 시스템
JP2023079045A (ja) 画像処理装置、画像処理方法、およびプログラム
Jacobs FRAnC: A System for Digital Facial Recognition

Legal Events

Date Code Title Description
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE

PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20220909

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

REG Reference to a national code

Ref country code: DE

Ref legal event code: R079

Free format text: PREVIOUS MAIN CLASS: H04N0005225000

Ipc: H05B0047105000

A4 Supplementary search report drawn up and despatched

Effective date: 20230127

RIC1 Information provided on ipc code assigned before grant

Ipc: A61B 5/107 20060101ALI20230123BHEP

Ipc: H04N 23/74 20230101ALI20230123BHEP

Ipc: H04N 23/611 20230101ALI20230123BHEP

Ipc: H04N 23/60 20230101ALI20230123BHEP

Ipc: A61B 5/00 20060101ALI20230123BHEP

Ipc: G06V 40/18 20220101ALI20230123BHEP

Ipc: G06V 40/16 20220101ALI20230123BHEP

Ipc: G06V 40/12 20220101ALI20230123BHEP

Ipc: G06V 40/10 20220101ALI20230123BHEP

Ipc: A61B 5/1171 20160101ALI20230123BHEP

Ipc: G06T 1/00 20060101ALI20230123BHEP

Ipc: H05B 47/105 20200101AFI20230123BHEP

DAV Request for validation of the european patent (deleted)
DAX Request for extension of the european patent (deleted)