WO2021260930A1 - Information processing system, information processing method, and program
- Publication number
- WO2021260930A1 (PCT/JP2020/025299)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- information
- output
- user
- authentication
- gate
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F21/00—Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F21/30—Authentication, i.e. establishing the identity or authorisation of security principals
- G06F21/31—User authentication
- G06F21/32—User authentication using biometric data, e.g. fingerprints, iris scans or voiceprints
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F21/00—Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F21/60—Protecting data
- G06F21/62—Protecting access to data via a platform, e.g. using keys or access control rules
- G06F21/6218—Protecting access to data via a platform, e.g. using keys or access control rules to a system of files or objects, e.g. local or distributed file system or database
- G06F21/6245—Protecting personal data, e.g. for financial or medical purposes
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F21/00—Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F21/70—Protecting specific internal or peripheral components, in which the protection of a component leads to protection of the entire computer
- G06F21/82—Protecting input, output or interconnection devices
- G06F21/84—Protecting input, output or interconnection devices output devices, e.g. displays or monitors
-
- G—PHYSICS
- G07—CHECKING-DEVICES
- G07C—TIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
- G07C9/00—Individual registration on entry or exit
- G07C9/10—Movable barriers with registering means
-
- G—PHYSICS
- G07—CHECKING-DEVICES
- G07C—TIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
- G07C9/00—Individual registration on entry or exit
- G07C9/20—Individual registration on entry or exit involving the use of a pass
- G07C9/22—Individual registration on entry or exit involving the use of a pass in combination with an identity check of the pass holder
- G07C9/25—Individual registration on entry or exit involving the use of a pass in combination with an identity check of the pass holder using biometric data, e.g. fingerprints, iris scans or voice recognition
-
- G—PHYSICS
- G07—CHECKING-DEVICES
- G07C—TIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
- G07C9/00—Individual registration on entry or exit
- G07C9/20—Individual registration on entry or exit involving the use of a pass
- G07C9/22—Individual registration on entry or exit involving the use of a pass in combination with an identity check of the pass holder
- G07C9/25—Individual registration on entry or exit involving the use of a pass in combination with an identity check of the pass holder using biometric data, e.g. fingerprints, iris scans or voice recognition
- G07C9/257—Individual registration on entry or exit involving the use of a pass in combination with an identity check of the pass holder using biometric data, e.g. fingerprints, iris scans or voice recognition electronically
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2221/00—Indexing scheme relating to security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F2221/03—Indexing scheme relating to G06F21/50, monitoring users, programs or devices to maintain the integrity of platforms
- G06F2221/032—Protect output to user by software means
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/14—Digital output to display device ; Cooperation and interconnection of the display device with other functional units
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/172—Classification, e.g. identification
-
- G—PHYSICS
- G07—CHECKING-DEVICES
- G07C—TIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
- G07C2209/00—Indexing scheme relating to groups G07C9/00 - G07C9/38
- G07C2209/12—Comprising means for protecting or securing the privacy of biometric data, e.g. cancellable biometrics
Description
- This disclosure relates to information processing systems, information processing methods, and programs.
- Patent Document 1 discloses a technique in which an authentication device that performs authentication based on an iris displays an avatar image different from the face of a person to be authenticated.
- Patent Document 2 discloses a technique for displaying an avatar in response to a user's login.
- The information processing system includes an acquisition means for acquiring biometric information of a target, a generation means for generating output information determined corresponding to identification information by using the identification information associated with the biometric information, and an output means for outputting the output information.
- The information processing method includes acquiring biometric information of a target, generating output information determined corresponding to the identification information associated with the biometric information by using that identification information, and outputting the output information.
- The program causes a computer of the information processing system to function as an acquisition means for acquiring biometric information of a target, a generation means for generating output information determined corresponding to identification information by using the identification information associated with the biometric information, and an output means for outputting the output information.
- The present disclosure aims to improve upon the technique disclosed in Patent Document 1. When information about an authenticated target is displayed together with the authentication result produced by an authentication device, there is a demand for a technique that outputs information from which it is more difficult for others to identify the authenticated target.
- FIG. 1 is a first diagram showing an outline of an information processing system according to the present embodiment.
- the information processing system 100 includes at least a gate device 10 and an information processing device 20.
- the gate device 10 and the information processing device 20 are connected to each other via a communication network.
- the gate device 10 is composed of a pair of gate bodies 1A and 1B.
- The gate bodies 1A and 1B are collectively referred to as the gate body 1.
- The pair of gate bodies 1A and 1B are installed in parallel, separated by the width W of a passing area through which an authentication target such as a moving body passes.
- a person or the like to be authenticated can pass through the passing area.
- the moving object may be an animal other than a person.
- the gate device 10 is installed at a railway ticket gate, an airport boarding gate, a company entrance / exit, etc. as an example.
- the gate device 10 includes at least a camera 11 (biological information reading device), a code reader 12, and a display 13.
- the gate device 10 may include a flapper 14 that prevents the person or the like from passing through the gate device 10 when the authentication result of the person or the like to be authenticated is unsuccessful.
- the camera 11 of the gate device 10 photographs a person passing through the gate device 10.
- the camera 11 transmits the captured image to the information processing device 20.
- the information processing device 20 authenticates using the feature information of the face of the person reflected in the captured image.
- The information processing apparatus 20 uses identification information of the person acquired based on the feature information to generate output information that includes information uniquely determined for that person, corresponding to the identification information, together with the authentication result.
- The feature information is not limited to the face; it may be an iris, a fingerprint, a vein, a voice, ear acoustics, an employee ID card, the user's mobile terminal, a password, or the like.
- The gate device 10 may have a function for reading each modality, such as an iris camera for reading an iris, a fingerprint sensor for reading a fingerprint, and a microphone for reading a voice. That is, the gate device 10 may have a multimodal specification (for example, face + iris) with an interface for reading a plurality of different types of biometric information.
- Ear acoustic authentication exploits the individuality of the spatial structure of the head, including the ear canal: an earphone-type authentication device emits a probe sound toward the ear canal and measures individual characteristics from the reflected sound. The gate device 10 may have an ear acoustic authentication function that performs authentication based on these ear acoustics.
- the information processing device 20 may generate output information including at least information that determines the person corresponding to the identification information of the target person.
- the information processing device 20 transmits the output information to the gate device 10.
- the gate device 10 displays the output information on the display 13.
- the information processing device 20 includes an acquisition means for acquiring the target biometric information. Further, the information processing apparatus 20 includes a generation means for generating output information determined corresponding to the identification information by using the identification information associated with the biometric information acquired by the acquisition means. That is, the generation means uses the identification information associated with the biometric information acquired by the acquisition means to generate output information uniquely determined corresponding to the identification information. The information processing apparatus 20 includes an output means for outputting the output information.
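As an illustrative sketch only, the three means described above can be pictured as a small pipeline: an acquisition step that receives biometric information, a generation step that derives output information from the identification information linked to that biometric information, and an output step that sends the result to a display or terminal. All class, method, and variable names below are hypothetical and do not come from the disclosure.

```python
# Minimal sketch of the acquisition / generation / output means described above.
# All names (InformationProcessingDevice, id_database, ...) are illustrative
# assumptions, not identifiers from the disclosure.

class InformationProcessingDevice:
    def __init__(self, id_database, generation_algorithm):
        self.id_database = id_database                    # biometric template -> identification info
        self.generation_algorithm = generation_algorithm  # identification info -> output info

    def acquire(self, biometric_information):
        """Acquisition means: receive the target's biometric information."""
        return biometric_information

    def generate(self, biometric_information):
        """Generation means: derive output information uniquely determined by the
        identification information associated with the biometric information."""
        identification = self.id_database[biometric_information]
        return self.generation_algorithm(identification)

    def output(self, output_information, sink):
        """Output means: send the output information to a display or terminal."""
        sink(output_information)


# Example usage with toy data.
device = InformationProcessingDevice(
    id_database={"face-template-001": "0123456789"},
    generation_algorithm=lambda user_id: f"shape-for-{user_id[-4:]}",
)
info = device.acquire("face-template-001")
device.output(device.generate(info), sink=print)  # prints: shape-for-6789
```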
- In the present embodiment, the output information indicating information uniquely determined for a person is shape information determined for that person.
- The output information may instead be color information determined for the person.
- The output information may be a combination of the shape information and the color information determined for the person.
- The output information may also be a combination of any of the shape information, the color information, and information related to the person.
- The information related to the person may be, for example, a birthday, a part of an employee code, a mark of the person's place of origin, or the like.
- The information related to the person may also be information derived from the target's name, such as the initials of the name (for example, when the name is Niommen Taro, it is displayed as NT).
- The output information that the gate device 10 acquires from the information processing device 20 includes information uniquely determined for a person (shape information, color information, related information that does not directly represent the person, or a combination of any of these). Therefore, when the output information is displayed on the display 13, the person passing through the gate device 10 can recognize from the output information that the displayed information is directed at himself or herself. Because the authentication result is included in the output information, that person can also recognize that the displayed authentication result is the result for himself or herself. On the other hand, even if someone other than the person passing through the gate device 10 sees the output information displayed on the display 13, it is difficult to link the displayed output information directly to that person. Consequently, even when output information including the authentication result is shown on a conspicuous display, it is difficult for others to associate the output information with the person being authenticated, and the privacy of the authentication target can therefore be protected.
- FIG. 2 is a second diagram showing an outline of the information processing system according to the present embodiment.
- the output destination of the above-mentioned output information may be the mobile terminal 4 carried by the person M passing through the gate device 10.
- the gate device 10 transmits the output information to the mobile terminal 4 carried by the person M.
- The information processing device 20 may also transmit the output information directly to the mobile terminal 4 carried by the person M. By transmitting the output information directly to the mobile terminal 4 carried by the person M, the information processing device 20 can ensure greater confidentiality than when the output information is output to the display 13 of the gate device 10.
- FIG. 3 is a diagram showing a hardware configuration of a gate control device provided in the gate device according to the present embodiment.
- The gate control device 15 includes a processor 51 (a CPU (Central Processing Unit), GPU (Graphics Processing Unit), FPGA (Field Programmable Gate Array), DSP (Digital Signal Processor), ASIC (Application Specific Integrated Circuit), or the like), a ROM (Read Only Memory) 52, a RAM (Random Access Memory) 53, an SSD (Solid State Drive) 54, a communication module 55, and other hardware.
- FIG. 4 is a diagram showing a hardware configuration of the information processing apparatus according to the present embodiment.
- The information processing apparatus 20 includes a processor 201 (a CPU (Central Processing Unit), GPU (Graphics Processing Unit), FPGA (Field Programmable Gate Array), DSP (Digital Signal Processor), ASIC (Application Specific Integrated Circuit), or the like), a ROM (Read Only Memory) 202, a RAM (Random Access Memory) 203, a database 204, a communication module 205, and other hardware.
- FIG. 5 is a diagram showing a functional configuration of the gate control device.
- The gate control device 15 provides the functions of a gate control unit 151, an output information control unit 152, and a communication control unit 153 by executing a gate control program on the processor 51.
- the gate control unit 151 controls the gate device 10.
- the output information control unit 152 controls the output of the information acquired from the information processing device 20 to an output device such as a display 13 or a mobile terminal 4.
- the communication control unit 153 controls communication with another device.
- FIG. 6 is a diagram showing a functional configuration of the information processing apparatus.
- The information processing apparatus 20 provides the functions of an acquisition unit 21, an authentication unit 22, a generation unit 23, and an output unit 24 by executing an information processing program on the processor 201.
- The acquisition unit 21 acquires biometric information of a target such as a person.
- The authentication unit 22 performs authentication using the biometric information of the target such as a person.
- The generation unit 23 uses the identification information associated with the biometric information of the target person or the like to generate output information determined corresponding to that identification information.
- the output unit 24 outputs output information.
- The generation unit 23 generates the output information determined corresponding to the identification information by using the identification information of the target person or the like and an output information generation algorithm.
- The output information generation algorithm is, as one example, an algorithm that generates output information by combining, for each character, a shape or color determined according to the character and its position in the character string.
- Generating output information by combining each character with a shape or color determined according to its position in the character string is one aspect of the processing of the generation unit 23.
- the generation unit 23 may acquire related information related to the target associated with the identification information of the target person or the like and generate output information including the related information.
- If no information that allows the person to confirm his or her own identity is displayed, the person has no way to notice even when he or she has been mistaken for someone else.
- If, on the other hand, a face image is displayed so that the person can be confirmed,
- the user may feel a psychological burden such as embarrassment.
- How to reduce this psychological burden on the user while still providing a function by which the person can confirm that he or she has been authenticated is therefore also an issue in the development of security gates that use biometric authentication such as face recognition.
- FIG. 7 is a first diagram showing a processing flow of the gate device according to the first embodiment.
- FIG. 8 is a first diagram showing a processing flow of the information processing apparatus according to the first embodiment.
- the user approaches the gate body 1.
- the gate control unit 151 of the gate control device 15 sequentially acquires image data including an image taken by the camera 11 of the gate body 1 (step S101).
- the gate control unit 151 starts extracting the facial features reflected in the image indicated by the image data. It is assumed that the acquired image data stores the identifier of the gate body 1 including the camera 11 that transmitted the image data.
- the gate control unit 151 determines whether or not the feature information of the human face can be extracted from the image included in the image data (step S102). When the face feature information can be extracted, the gate control unit 151 transmits an authentication request including the feature information to the information processing apparatus 20 (step S103).
- the authentication request may include an identifier indicating the gate body 1, a network address of the gate control device 15, and the like, in addition to the feature information of the user's face.
- the information processing device 20 stores the facial feature information, the passage permission information, and the like registered in advance by the user of the gate device 10 in the database 204 and the like.
- The passage permission information may be, for example, a set of identification information of the gate devices 10 through which the user is permitted to pass.
- the acquisition unit 21 of the information processing apparatus 20 acquires an authentication request (step S201).
- The authentication unit 22 calculates the similarity between the facial feature information included in the authentication request and the many items of facial feature information recorded in the database 204, and determines whether feature information with a similarity equal to or higher than a predetermined threshold is recorded in the database 204 (step S202).
- When facial feature information whose similarity to the feature information included in the authentication request is equal to or higher than the predetermined threshold is registered in the database 204, the authentication unit 22 determines that the registered feature information with the highest similarity belongs to the same person as the user indicated by the feature information in the authentication request (step S203).
- The authentication unit 22 acquires the passage permission information recorded in the database 204 in association with the feature information of the same person as the user.
- The authentication unit 22 determines whether the passage permission information indicates that passage is permitted (step S204). As an example, the authentication unit 22 determines whether the passage permission information associated with the feature information of the same person as the user is registered in the database 204 in association with the identifier of the gate device 10 included in the authentication request. When it is registered in that way, the authentication unit 22 determines that passage is permitted (step S205).
- When no feature information with a similarity equal to or higher than the predetermined threshold is recorded in the database 204, or when the passage permission information associated with the feature information of the same person as the user is not registered in the database 204 in association with the identifier of the gate device 10 included in the authentication request, the authentication unit 22 determines that passage is not permitted (step S206). The authentication unit 22 generates authentication result information indicating whether passage is permitted or not (step S207).
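The matching and permission check of steps S202 to S207 can be sketched as follows. The similarity function, the threshold value, and the record layout are assumptions made for illustration only and are not specified by the disclosure.

```python
# Hedged sketch of steps S202-S207: 1:N similarity search followed by a
# passage-permission lookup. The cosine similarity, the 0.8 threshold and the
# record structure are illustrative assumptions.
import math

THRESHOLD = 0.8  # assumed value; the disclosure only says "predetermined threshold"

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b)))

def authenticate(request_features, gate_id, database):
    """database: list of dicts with 'features', 'user_id', 'allowed_gates'."""
    best, best_score = None, -1.0
    for record in database:                          # S202: compare against all records
        score = cosine_similarity(request_features, record["features"])
        if score > best_score:
            best, best_score = record, score
    if best is None or best_score < THRESHOLD:       # no match above the threshold
        return {"passable": False, "user_id": None}  # S206 / S207
    # S203: the highest-similarity record is treated as the same person.
    # S204 / S205: check the passage permission for this gate.
    passable = gate_id in best["allowed_gates"]
    return {"passable": passable, "user_id": best["user_id"]}

db = [{"features": [0.1, 0.9, 0.3], "user_id": "0123456789", "allowed_gates": {"gate-A"}}]
print(authenticate([0.1, 0.88, 0.31], "gate-A", db))
# {'passable': True, 'user_id': '0123456789'}
```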
- the authentication unit 22 acquires the ID of the user registered in the database 204 in association with the characteristic information of the same person as the user.
- the authentication unit 22 outputs a process start instruction including the user ID to the generation unit 23.
- the user ID may be an employee number, a student ID number, a condominium room number, or the like.
- the generation unit 23 acquires the user's ID based on the processing start instruction. It is assumed that the user's ID is indicated by a string of 10-digit numbers. The generation unit 23 acquires a string of the last four digits of the ten-digit number based on the user's ID. The generation unit 23 uses the four-digit number and the output information generation algorithm to generate output information determined according to the user's ID (step S208). It is assumed that the output information is information indicating the shape.
- the output unit 24 acquires the authentication result information generated by the authentication unit 22 and the output information generated by the generation unit 23.
- the output unit 24 generates a gate control instruction including the authentication result information and the output information (step S209).
- the output unit 24 acquires the network address of the gate device 10 included in the authentication request.
- the output unit 24 transmits a gate control instruction to the acquired network address (step S210).
- the gate control device 15 of the gate device 10 receives the gate control instruction (step S104).
- the gate control unit 151 of the gate control device 15 acquires the authentication result information included in the gate control instruction.
- the output information control unit 152 of the gate control device 15 acquires the output information included in the gate control instruction.
- the gate control unit 151 determines whether the information included in the authentication result information indicates passability or impassability (step S105). When the authentication result information includes information indicating passability, the gate control unit 151 controls the pass permission (step S106). When the authentication result information includes information indicating impassability, the gate control unit 151 controls the disapproval of passage (step S107).
- When the gate control unit 151 determines that passage is to be permitted, it outputs information indicating the passage permission to the output information control unit 152. The gate control unit 151 also controls the flapper 14 of the gate device 10 to open (step S108). This opens the passage area of the gate device 10. Further, when the output information control unit 152 acquires the information indicating the passage permission, it outputs the output information and a mark indicating passage permission to the display 13 (step S109). As a result, the display 13 displays the output information determined according to the ID of the user passing through the gate device 10 and the mark indicating passage permission.
- The user recognizes from the mark indicating passage permission that passage is permitted, visually confirms the output information, recognizes that it is the output information determined from his or her own ID, and thereby confirms that he or she is the one permitted to pass. The user then passes through the gate device 10.
- When the gate control unit 151 determines that passage is to be denied, it outputs information indicating the passage denial to the output information control unit 152. The gate control unit 151 also controls the flapper 14 of the gate device 10 to close (step S110). As a result, the passage area of the gate device 10 is closed. Further, when the output information control unit 152 acquires the information indicating the passage denial, it outputs the output information and a mark indicating that passage is not permitted to the display 13 (step S111). As a result, the display 13 displays the output information determined according to the ID of the user attempting to pass through the gate device 10 and the mark indicating that passage is not permitted.
- The user recognizes from the mark indicating passage denial that passage is not permitted, visually confirms the output information, recognizes that it is the output information determined from his or her own ID, and thereby confirms that he or she is the one not permitted to pass. The user then makes an inquiry to a nearby manager without passing through the gate device 10.
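A minimal sketch of the gate-side handling in steps S104 to S111 follows. The function names and message fields are hypothetical; only the routing of the authentication result to the flapper and of the output information to the display reflects the description above.

```python
# Sketch of steps S105-S111: the gate control device routes the authentication
# result to the flapper and the output information to the display. All names
# and data shapes are illustrative assumptions.

def handle_gate_control_instruction(instruction, flapper, display):
    auth = instruction["authentication_result"]      # S105: permitted or not?
    output_info = instruction["output_information"]
    if auth["passable"]:
        flapper.open()                               # S108: open the passage area
        display.show(output_info, mark="PASS OK")    # S109: output info + permission mark
    else:
        flapper.close()                              # S110: close the passage area
        display.show(output_info, mark="NO ENTRY")   # S111: output info + denial mark

class Flapper:
    def open(self):  print("flapper: open")
    def close(self): print("flapper: closed")

class Display:
    def show(self, output_info, mark):
        print(f"display: {mark} / {output_info}")

handle_gate_control_instruction(
    {"authentication_result": {"passable": True}, "output_information": "shape E1"},
    Flapper(), Display(),
)
```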
- The authentication result information may include information indicating the reason why the authentication unit 22 determined that passage is not permitted (for example, the face image is blurred or the face is too far away), and the output information control unit 152 may output this information.
- The output information control unit 152 may output personal information, the user's face image, and the like in a concealed form.
- When the output information control unit 152 outputs the reason for the denial to the mobile terminal 4, confidentiality is better ensured, so the output information control unit 152 may be controlled to include personal information, the user's face image, and the like in that output.
- FIG. 9 is a diagram showing an example of generating output information.
- The generation unit 23 acquires the user's ID based on the processing start instruction. As described above, the user's ID is, as an example, a string of ten digits, and the generation unit 23 acquires the string of the last four digits of that ten-digit number based on the user's ID (S1).
- the generation unit 23 specifies the relationship (S2) between the left and right display positions of the two shapes as an example, based on the fourth digit number in the number string (S1).
- In the process of specifying the relationship (S2) between the left and right display positions of the two shapes based on the fourth digit, the generation unit 23 specifies either a first display position pattern P1 or a second display position pattern P2.
- the generation unit 23 specifies the relationship (S2) between the left and right display positions of the two shapes shown by the first display position pattern P1.
- the generation unit 23 specifies the relationship (S2) between the left and right display positions of the two shapes shown by the second display position pattern P2.
- This process is one aspect of the process in which the generation unit 23 specifies, based on the character string indicated by the user's ID, the position of the shape indicated by the output information according to each character and its position in the character string.
- Based on the third digit in the number string (S1), the generation unit 23 specifies the shape (S3) to be displayed at the left-side position in the left-right display position relationship specified in (S2).
- The generation unit 23 specifies one of the left first shape L1, the left second shape L2, the left third shape L3, the left fourth shape L4, and the left fifth shape L5.
- the generation unit 23 specifies the left first shape L1 when the third digit is 0 or 1.
- the generation unit 23 specifies the left second shape L2 when the third digit is 2 or 3.
- the generation unit 23 specifies the left third shape L3 when the third digit is 4 or 5.
- the generation unit 23 specifies the left fourth shape L4 when the third digit is 6 or 7.
- the generation unit 23 specifies the left fifth shape L5 when the third digit is 8 or 9.
- Based on the second digit in the number string (S1), the generation unit 23 specifies the shape (S4) to be displayed at the right-side position in the left-right display position relationship specified in (S2).
- The generation unit 23 specifies one of the right first shape R1, the right second shape R2, the right third shape R3, and the right fourth shape R4.
- the generation unit 23 specifies the right first shape R1 when the second digit is 0 or 1.
- the generation unit 23 specifies the right second shape R2 when the second digit is 2 or 3.
- The generation unit 23 specifies the right third shape R3 when the second digit is any of 4 to 6.
- the generation unit 23 specifies the right fourth shape R4 when the second digit is any of 7 to 9.
- The generation unit 23 specifies the color (S5) to be applied to the shape specified in (S3) based on the first digit in the number string (S1).
- In the process of specifying the color (S5) based on the first digit, the generation unit 23 specifies one of the first color information C1, the second color information C2, the third color information C3, and the fourth color information C4.
- the generation unit 23 specifies the first color information C1 when the first digit is 0 or 1.
- the generation unit 23 specifies the second color information C2 when the first digit is 2 or 3.
- the generation unit 23 specifies the third color information C3 when the first digit is any of 4 to 6.
- the generation unit 23 specifies the fourth color information C4 when the first digit is any of 7 to 9.
- The generation unit 23 arranges the shape (S3) to be displayed at the left-side position and the shape (S4) to be displayed at the right-side position according to the left-right display position relationship (S2) of the two shapes, and generates, as output information, shape information E1 in which the left-side shape specified in (S3) is colored with the color (S5).
- In this way, the information processing apparatus 20 can generate 160 types of shape information as output information (2 display position patterns × 5 left shapes × 4 right shapes × 4 colors = 160 combinations).
- the process of sequentially specifying the information shown in S1, S2, S3, S4, and S5 described above is a process performed by the generation unit 23 of the information processing apparatus 20 using the output information generation algorithm.
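The digit-to-shape mapping of S1 to S5 can be written down directly as a small function. The concrete shape and color labels are placeholders mirroring the reference signs, the digit positions are read left to right within the last four digits, and the rule mapping the fourth digit to pattern P1 or P2 is an assumption (even digit → P1, odd digit → P2), since the disclosure does not state it.

```python
# Sketch of the output information generation algorithm of FIG. 9 (S1-S5).
# Shape/color labels mirror the reference signs (L1-L5, R1-R4, C1-C4). The
# even/odd rule for pattern P1/P2 and the left-to-right digit order are
# assumptions made for illustration.

LEFT_SHAPES  = ["L1", "L2", "L3", "L4", "L5"]  # chosen by the 3rd digit (pairs 0-1 ... 8-9)
RIGHT_SHAPES = ["R1", "R2", "R3", "R4"]        # chosen by the 2nd digit
COLORS       = ["C1", "C2", "C3", "C4"]        # chosen by the 1st digit

def _bucket_of_four(d):
    """Map 0-1 -> 0, 2-3 -> 1, 4-6 -> 2, 7-9 -> 3 (the grouping used for R and C)."""
    return 0 if d <= 1 else 1 if d <= 3 else 2 if d <= 6 else 3

def generate_output_information(user_id):
    s1 = user_id[-4:]                            # S1: last four digits of the 10-digit ID
    d4, d3, d2, d1 = (int(c) for c in s1)
    pattern = "P1" if d4 % 2 == 0 else "P2"      # S2: display position pattern (assumed rule)
    left  = LEFT_SHAPES[d3 // 2]                 # S3: left shape from pairs 0-1, 2-3, ...
    right = RIGHT_SHAPES[_bucket_of_four(d2)]    # S4: right shape
    color = COLORS[_bucket_of_four(d1)]          # S5: color applied to the left shape
    # 2 patterns x 5 left shapes x 4 right shapes x 4 colors = 160 combinations.
    return {"pattern": pattern, "left": left, "right": right, "left_color": color}

print(generate_output_information("0123456789"))
# {'pattern': 'P1', 'left': 'L4', 'right': 'R4', 'left_color': 'C4'}
```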
- the generation unit 23 may specify the shape information as output information by using another output information generation algorithm.
- The output information generation algorithm may, for example, specify the first four digits instead of the last four digits when extracting the number string from the ID, and the generation unit 23 may specify the number string based on such an algorithm.
- The output information generation algorithm may also be an algorithm that specifies a character string including alphabetic characters, hiragana, katakana, and kanji instead of a number string, and based on such an algorithm the generation unit 23 may specify a mixed character string of numbers, alphabetic characters, hiragana, katakana, and kanji instead of a string of digits.
- The output information generation algorithm may specify a number string or character string of five or more characters based on the ID of the user of the gate device 10, and the generation unit 23 may specify such a character string or number string based on that algorithm.
- The output information generation algorithm may specify the display position of a single shape, or a display position relationship among three or more shapes, instead of specifying the relationship between the left and right display positions of two shapes (S2). Instead of specifying the shape (S3) to be displayed at the left-side position, the algorithm may select the shape to be displayed at a particular position in the display position relationship from a number of shapes other than five. Instead of specifying the shape (S4) to be displayed at the right-side position, the algorithm may select the shapes to be displayed at the remaining positions from a number of shapes other than four. The algorithm may also select the color of at least one specified shape from a number of colors other than four.
- FIG. 10 is a diagram showing an example of the shape indicated by the output information.
- By combining S2 to S4 as shown in FIG. 9, 40 types of shape arrangements can be generated, as shown in FIG. 10. Further, by specifying one of a plurality of colors for these shapes as in S5, display information combining even more shapes and colors can be generated.
- FIG. 11 is a diagram showing a display example of the gate device.
- As shown in FIG. 11, the gate device 10 displays, side by side on the display 13, the mark m indicating passage permission and the output information E indicating the shape determined according to the ID of the user of the gate device 10 who is being authenticated. As also shown in FIG. 11, the gate device 10 may add additional information A and output it together with the output information E.
- The gate device 10 may be provided with a sensor that detects the surface temperature of the user's face, and the gate control unit 151 of the gate control device 15 may output the surface temperature of the user's face acquired from the sensor as the user's body temperature, as part of the additional information A.
- the user can grasp his / her own body temperature when passing through the gate device 10.
- By confirming the additional information A displayed on the display 13 of the gate device 10, the user can determine whether or not it is acceptable to enter.
- The gate control unit 151 may control passage denial when the user's body temperature is equal to or higher than a predetermined temperature. That is, the gate control unit 151 checks whether the user's body temperature is equal to or higher than the predetermined temperature, and when it is, the gate control unit 151 denies passage regardless of whether the authentication result information included in the gate control instruction indicates that passage is permitted or not. The passage-denial control may be the same as the processing described above. The gate control unit 151 may also determine, based on the captured image, whether or not the user is wearing a mask, and if the user is not wearing a mask, may output information prompting the user to wear a mask or may control whether passage is permitted.
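The temperature and mask checks described above layer an additional condition on top of the authentication result. A sketch follows; the 37.5 °C threshold, the mask flag, and the "prompt_mask" outcome are assumptions for illustration, since the disclosure only speaks of a "predetermined temperature" and of prompting the user to wear a mask.

```python
# Sketch of the additional temperature/mask check: passage is denied when the
# measured body temperature reaches the threshold, regardless of the
# authentication result. The threshold value and the mask handling are assumed.

TEMP_THRESHOLD_C = 37.5  # assumed; the disclosure only says "predetermined temperature"

def decide_passage(auth_passable, body_temp_c, wearing_mask):
    if body_temp_c >= TEMP_THRESHOLD_C:
        return "deny"                       # overrides the authentication result
    if not wearing_mask:
        return "prompt_mask"                # ask the user to put on a mask (or deny)
    return "permit" if auth_passable else "deny"

print(decide_passage(True, 36.6, True))     # permit
print(decide_passage(True, 38.0, True))     # deny (temperature override)
print(decide_passage(True, 36.6, False))    # prompt_mask
```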
- the gate control unit 151 acquires information to be displayed as additional information A from the information processing device 20 in advance and temporarily stores the information.
- the output information control unit 152 displays the temporarily stored information in addition to the additional information A.
- The additional information A may be any information, such as the current status of the area whose entry is controlled by the gate device 10, precautions for the area, countermeasure information, and the like.
- the output information control unit 152 may display characters or sentences indicating that the area is "infectious disease countermeasures in progress".
- FIG. 12 is a diagram showing a processing flow of the gate device according to the second embodiment.
- the gate device 10 may perform the following processing instead of the processing described in the first embodiment.
- the gate control unit 151 of the gate device 10 receives a gate control instruction (step S301).
- the gate control unit 151 of the gate control device 15 acquires the authentication result information included in the gate control instruction.
- the output information control unit 152 of the gate control device 15 acquires the output information included in the gate control instruction.
- The gate control unit 151 determines whether the authentication result information indicates that passage is permitted or not (step S302).
- When the authentication result information indicates that passage is permitted, the gate control unit 151 performs passage-permission control (step S303).
- When the authentication result information indicates that passage is not permitted, the gate control unit 151 performs passage-denial control (step S304).
- When the gate control unit 151 determines that passage is to be permitted, it outputs information indicating the passage permission to the output information control unit 152 and controls the flapper 14 of the gate device 10 to open (step S305). This opens the passage area of the gate device 10. Further, when the output information control unit 152 acquires the information indicating the passage permission, it acquires the identification ID (output destination information) of the dedicated application recorded in the user's mobile terminal 4, which is the destination of an output request including the output information and a mark indicating passage permission (step S306).
- the identification ID of the dedicated application recorded in the user's mobile terminal 4 may be included in the gate control instruction received from the information processing device 20.
- That is, the information processing device 20 may read, from the database 204, the identification ID of the dedicated application recorded in the user's mobile terminal 4, which is registered in association with the ID of the user who is about to pass through the gate device 10, store that identification ID in the gate control instruction, and transmit it to the gate device 10.
- the output information control unit 152 can acquire the identification ID of the dedicated application recorded in the user's mobile terminal 4 from the gate control instruction.
- When the gate control unit 151 determines that passage is to be denied, it outputs information indicating the passage denial to the output information control unit 152 and controls the flapper 14 of the gate device 10 to close (step S307). As a result, the passage area of the gate device 10 is closed. Further, when the output information control unit 152 acquires the information indicating the passage denial, it acquires the identification ID of the dedicated application recorded in the user's mobile terminal 4, which is the destination of an output request including the output information and a mark indicating passage denial (step S306).
- The output information control unit 152 transmits an output request including the identification ID of the dedicated application of the user's mobile terminal 4, the output information, and a mark indicating passage permission or passage denial (step S308).
- It is assumed that, before passing through the gate device 10, the user has activated the dedicated application for passing through the gate device 10 that is recorded in the mobile terminal 4. As a result, the mobile terminal 4 waits for an output request to be received by the dedicated application. When the mobile terminal 4 approaches the gate device 10, it communicates with the gate device 10 and receives the output request from the gate device 10.
- When the mobile terminal 4 receives an output request from the gate device, it acquires the output information included in the output request and the mark indicating passage permission. The mobile terminal 4 then displays, on its own display, the output information determined corresponding to the ID of the user passing through the gate device 10 and the mark indicating passage permission.
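A sketch of the exchange in steps S306 to S308 and of the terminal-side handling follows. The message fields and the app-ID matching are illustrative assumptions; the disclosure only states that the output request is addressed using the dedicated application's identification ID.

```python
# Sketch of the mobile-terminal flow (steps S306-S308): the gate builds an
# output request addressed to the dedicated app's identification ID, and the
# terminal displays the output information with the pass/deny mark. Field
# names are hypothetical.

def build_output_request(app_id, output_information, passable):
    return {
        "app_id": app_id,                                  # output destination information (S306)
        "output_information": output_information,
        "mark": "PASS OK" if passable else "NO ENTRY",
    }

class DedicatedApp:
    """Stands in for the dedicated application running on the mobile terminal 4."""
    def __init__(self, app_id):
        self.app_id = app_id

    def on_output_request(self, request):
        if request["app_id"] != self.app_id:
            return                                         # not addressed to this terminal
        print(f"terminal: {request['mark']} / {request['output_information']}")

app = DedicatedApp("app-0001")
app.on_output_request(build_output_request("app-0001", "shape E1", passable=True))
```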
- the user who intends to pass through the gate device 10 can recognize that the passage is permitted by his / her mobile terminal 4 based on the mark indicating the passage permission. Further, the user visually recognizes the output information, recognizes that the passage permission is the output information determined by his / her own ID, and confirms that the passage permission is for himself / herself.
- the user passes through the gate device 10.
- The user may also cause the code reader 12 to read payment code information displayed on the dedicated application screen launched on the mobile terminal 4, and thereby make a payment when passing through the gate device 10.
- the gate control unit 151 of the gate control device 15 acquires payment code information from the code reader 12.
- The gate control unit 151 may perform payment processing for the passage by a known technique, using the payment information of the user who intends to pass through the gate device 10 that is included in the payment code information.
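The disclosure only says that payment is handled by a known technique. The following is a purely hypothetical sketch of what reading a payment code and recording a charge could look like; every field name and value is invented for illustration.

```python
# Purely hypothetical sketch of the payment step: the payment code read by the
# code reader 12 is assumed to carry an account token, and the fare for the
# passage is charged against it. None of these fields or amounts come from the
# disclosure.
import json

def charge_for_passage(payment_code_payload, fare, ledger):
    info = json.loads(payment_code_payload)      # assumed JSON-encoded code content
    account = info["account_token"]
    ledger.setdefault(account, 0)
    ledger[account] -= fare                      # record the charge
    return {"account": account, "charged": fare, "balance": ledger[account]}

ledger = {"tok-123": 1000}
code = json.dumps({"account_token": "tok-123"})
print(charge_for_passage(code, fare=200, ledger=ledger))
# {'account': 'tok-123', 'charged': 200, 'balance': 800}
```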
- Suppose instead that, when the mobile terminal 4 receives an output request from the gate device, it acquires the output information included in the output request and a mark indicating passage denial rather than a mark indicating passage permission. In this case, the mobile terminal 4 displays, on its own display, the output information determined corresponding to the ID of the user attempting to pass through the gate device 10 and the mark indicating passage denial.
- the display 13 displays the output information determined according to the ID of the user passing through the gate device 10 and the mark indicating that the passage is not permitted.
- A user who intends to pass through the gate device 10 can recognize, on his or her mobile terminal 4, that passage is not permitted, based on the mark indicating passage denial.
- The user visually confirms the output information, recognizes that the passage denial is the output information determined from his or her own ID, and confirms that he or she is the one not permitted to pass. The user then makes an inquiry to a nearby manager without passing through the gate device 10.
- the gate device 10 may output the output information to the mobile terminal 4 and may further perform the same processing as in the first embodiment of outputting the output information to the display 13 of the gate device 10.
- FIG. 13 is a diagram showing a processing flow of the gate device according to the third embodiment.
- In the third embodiment, the information processing system 100 may generate the output information as follows.
- the acquisition unit 21 of the information processing apparatus 20 acquires an authentication request (step S401).
- The authentication unit 22 calculates the similarity between the facial feature information included in the authentication request and the many items of facial feature information recorded in the database 204, and determines whether feature information with a similarity equal to or higher than a predetermined threshold is recorded in the database 204 (step S402).
- When facial feature information whose similarity to the feature information included in the authentication request is equal to or higher than the predetermined threshold is registered in the database 204, the authentication unit 22 determines that the registered feature information with the highest similarity belongs to the same person as the user indicated by the feature information in the authentication request (step S403).
- The authentication unit 22 determines whether the passage permission information associated with the feature information of the same person as the user indicates that passage is permitted (step S404). As an example, the authentication unit 22 determines whether the passage permission information associated with the feature information of the same person as the user is registered in the database 204 in association with the identifier of the gate device 10 included in the authentication request. When it is registered in that way, the authentication unit 22 determines that passage is permitted (step S405).
- When no feature information with a similarity equal to or higher than the predetermined threshold is recorded in the database 204, or when the passage permission information associated with the feature information of the same person as the user is not registered in the database 204 in association with the identifier of the gate device 10 included in the authentication request, the authentication unit 22 determines that passage is not permitted (step S406). The authentication unit 22 generates authentication result information indicating whether passage is permitted or not (step S407).
- the authentication unit 22 acquires the ID of the user registered in the database 204 in association with the characteristic information of the same person as the user.
- the authentication unit 22 outputs a process start instruction including the user ID to the generation unit 23.
- the generation unit 23 acquires the user's ID based on the processing start instruction. It is assumed that the user's ID is indicated by a string of 10-digit numbers. The generation unit 23 acquires a string of the last four digits of the ten-digit number based on the user's ID. The generation unit 23 uses the four-digit number and the output information generation algorithm to generate shape information determined according to the user's ID (step S408). The generation of the shape information is the same as the processing of the first embodiment described with reference to FIG.
- the generation unit 23 acquires the birthday of the user recorded in the database 204 in association with the user ID (step S409).
- the generation unit 23 generates related information indicating a four-digit number of the month and day of the birthday (step S410).
- the generation unit 23 generates output information including shape information and related information (step S411).
- the process of the generation unit 23 is one aspect of the process of acquiring the related information related to the target associated with the identification information of the target user and generating the output information including the related information.
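Steps S409 to S411 append related information (the month and day of the birthday) to the shape information. A sketch follows; the MMDD format and the record layout are assumptions for illustration.

```python
# Sketch of steps S409-S411: look up the user's birthday and combine its
# month/day (assumed MMDD format) with the shape information. The record
# layout is an illustrative assumption.
from datetime import date

users = {"0123456789": {"birthday": date(1990, 4, 7)}}

def generate_output_with_related_info(user_id, shape_information):
    birthday = users[user_id]["birthday"]                    # S409
    related = f"{birthday.month:02d}{birthday.day:02d}"      # S410: four-digit month + day
    return {"shape": shape_information, "related": related}  # S411

print(generate_output_with_related_info("0123456789", "E1"))
# {'shape': 'E1', 'related': '0407'}
```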
- the output unit 24 acquires the authentication result information generated by the authentication unit 22 and the shape information generated by the generation unit 23.
- the output unit 24 generates a gate control instruction including the authentication result information and the output information (step S412).
- the output unit 24 acquires the network address of the gate device 10 included in the authentication request.
- the output unit 24 transmits a gate control instruction to the acquired network address (step S413).
- the gate device 10 controls the gate device 10 based on the authentication result information and processes the output of the output information, as in the first embodiment and the second embodiment.
- the user of the gate device 10 can confirm the information that uniquely identifies himself / herself together with the authentication result based on the shape information and the birthday information. Therefore, the user can confirm that the authentication result is for himself / herself, while another person cannot immediately identify the relationship between the output information and the user even if he / she visually recognizes the output information.
- FIG. 14 is a diagram showing a processing flow of the gate device according to the fourth embodiment.
- In the fourth embodiment, the information processing system 100 may generate the output information as follows.
- the acquisition unit 21 of the information processing apparatus 20 acquires an authentication request (step S501).
- The authentication unit 22 calculates the similarity between the facial feature information included in the authentication request and the many items of facial feature information recorded in the database 204, and determines whether feature information with a similarity equal to or higher than a predetermined threshold is recorded in the database 204 (step S502).
- When facial feature information whose similarity to the feature information included in the authentication request is equal to or higher than the predetermined threshold is registered in the database 204, the authentication unit 22 determines that the registered feature information with the highest similarity belongs to the same person as the user indicated by the feature information in the authentication request (step S503).
- The authentication unit 22 determines whether the passage permission information associated with the feature information of the same person as the user indicates that passage is permitted (step S504). As an example, the authentication unit 22 determines whether the passage permission information associated with the feature information of the same person as the user is registered in the database 204 in association with the identifier of the gate device 10 included in the authentication request. When it is registered in that way, the authentication unit 22 determines that passage is permitted (step S505).
- When no feature information with a similarity equal to or higher than the predetermined threshold is recorded in the database 204, or when the passage permission information associated with the feature information of the same person as the user is not registered in the database 204 in association with the identifier of the gate device 10 included in the authentication request, the authentication unit 22 determines that passage is not permitted (step S506). The authentication unit 22 generates authentication result information indicating whether passage is permitted or not (step S507).
- the authentication unit 22 acquires the ID of the user registered in the database 204 in association with the characteristic information of the same person as the user.
- the authentication unit 22 outputs a process start instruction including the user ID to the generation unit 23.
- the generation unit 23 acquires the user's ID based on the processing start instruction. It is assumed that the user's ID is expressed as a ten-digit numeric string. The generation unit 23 takes the last four digits of that ten-digit number and uses the four-digit number together with the output information generation algorithm to generate shape information determined according to the user's ID (step S508). The generation of the shape information is the same as the processing of the first embodiment described with reference to FIG.
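- As a sketch of how a four-digit fragment of the ID might be mapped deterministically to shape information, the following illustration assumes a fixed catalogue of five shapes and four colors; the catalogue contents, the field names, and the mapping rule are assumptions made for illustration, not taken from the patent.

```python
SHAPES = ["circle", "triangle", "square", "star", "diamond"]   # assumed catalogue of 5 shapes
COLORS = ["red", "blue", "green", "yellow"]                    # assumed catalogue of 4 colors

def generate_shape_info(user_id: str) -> dict:
    """Deterministically derive shape information from the last four digits of a ten-digit ID."""
    last4 = user_id[-4:]                  # e.g. "1234567890" -> "7890"
    d = [int(c) for c in last4]
    return {
        "left_shape": SHAPES[d[0] % len(SHAPES)],    # position-dependent choice
        "right_shape": SHAPES[d[1] % len(SHAPES)],
        "color": COLORS[d[2] % len(COLORS)],
        "count": d[3] % 3 + 1,                       # small repetition count, purely illustrative
    }

print(generate_shape_info("1234567890"))
```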
- the generation unit 23 acquires the user's birthday recorded in the database 204 in association with the user ID (step S509). The generation unit 23 further acquires the initials of the user's name written in English, recorded in the database 204 in association with the user ID (step S510). The generation unit 23 generates related information including the four digits of the birthday date and the initials (step S511), and generates output information including the shape information and the related information (step S512).
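- A sketch of composing the output information from the shape information and the related information in this fourth embodiment could look as follows; the record fields and the formatting are assumptions used only to make the flow concrete.

```python
from dataclasses import dataclass

@dataclass
class UserRecord:           # stand-in for what database 204 holds per user ID
    birthday: str           # ISO date, e.g. "1990-04-17"
    name_en: str            # name written in English, e.g. "Taro Yamada"

def build_output_info(shape_info: dict, record: UserRecord) -> dict:
    """Steps S509-S512: combine shape information with birthday digits and initials."""
    month_day = record.birthday[5:7] + record.birthday[8:10]                 # "0417"
    initials = "".join(part[0].upper() for part in record.name_en.split())   # "TY"
    related = {"birthday_digits": month_day, "initials": initials}
    return {"shape": shape_info, "related": related}                         # output information

out = build_output_info({"left_shape": "circle", "color": "red"},
                        UserRecord(birthday="1990-04-17", name_en="Taro Yamada"))
print(out)
```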
- this processing by the generation unit 23 is one form of the process of acquiring related information about the target linked to the identification information of the target user and generating output information that includes the related information.
- the output unit 24 acquires the authentication result information generated by the authentication unit 22 and the output information generated by the generation unit 23.
- the output unit 24 generates a gate control instruction including the authentication result information and the output information (step S513).
- the output unit 24 acquires the network address of the gate device 10 included in the authentication request.
- the output unit 24 transmits a gate control instruction to the acquired network address (step S514).
- the gate device 10 performs gate control based on the authentication result information and outputs the output information, as in the first embodiment and the second embodiment.
- the user of the gate device 10 can confirm, together with the authentication result, information that uniquely identifies himself or herself based on the shape information, the birthday, and the initials. The user can therefore confirm that the authentication result is his or her own, while another person who views the output information cannot immediately identify the relationship between the output information and the user.
- FIG. 15 is a diagram showing a processing flow of the gate device according to the fifth embodiment.
- the gate device 10 may generate output information as follows.
- the acquisition unit 21 of the information processing apparatus 20 acquires an authentication request (step S601).
- the authentication unit 22 calculates the similarity between the facial feature information included in the authentication request and the many pieces of facial feature information recorded in the database 104, and determines whether feature information having a similarity equal to or higher than a predetermined threshold is recorded in the database 104 (step S602).
- when facial feature information whose similarity with the feature information included in the authentication request is equal to or higher than the predetermined threshold is registered in the database 204, the authentication unit 22 determines that the feature information with the highest similarity belongs to the same person as the user indicated by the feature information included in the authentication request (step S603).
- the authentication unit 22 determines whether the passage permission information associated with the feature information of the same person as the user indicates that passage is permitted (step S604). As an example, the authentication unit 22 determines whether the passage permission information associated with that feature information is registered in the database 204 in association with the identifier of the gate device 10 included in the authentication request. When it is so registered, the authentication unit 22 determines that passage is permitted (step S605).
- when feature information having a similarity equal to or higher than the predetermined threshold is not recorded in the database 204, or when the passage permission information associated with the feature information of the same person as the user is not registered in the database 204 in association with the identifier of the gate device 10 included in the authentication request, the authentication unit 22 determines that passage is not permitted (step S606). The authentication unit 22 generates authentication result information indicating whether passage is permitted or not (step S607).
- the authentication unit 22 acquires the ID of the user registered in the database 204 in association with the characteristic information of the same person as the user.
- the authentication unit 22 outputs a process start instruction including the user ID to the generation unit 23.
- the generation unit 23 acquires the user's ID based on the processing start instruction. It is assumed that the user's ID is expressed as a ten-digit numeric string. The generation unit 23 takes the last four digits of that ten-digit number and uses the four-digit number together with the output information generation algorithm to generate shape information determined according to the user's ID (step S608). The generation of the shape information is the same as the processing of the first embodiment described with reference to FIG.
- the generation unit 23 acquires the avatar image of the user recorded in the database 204 in association with the user ID (step S609).
- the avatar image is a character image of the user, and may be an image of an animal or an image imitating a human being.
- the avatar image is an aspect of related information related to the target associated with the identification information of the target user.
- the generation unit 23 generates output information including shape information and an avatar image (related information) (step S610).
- the output unit 24 acquires the authentication result information generated by the authentication unit 22 and the output information generated by the generation unit 23.
- the output unit 24 generates a gate control instruction including the authentication result information and the output information (step S611).
- the output unit 24 acquires the network address of the gate device 10 included in the authentication request.
- the output unit 24 transmits a gate control instruction to the acquired network address (step S612).
- the gate device 10 performs gate control based on the authentication result information and outputs the output information, as in the first embodiment and the second embodiment.
- the user of the gate device 10 can confirm, together with the authentication result, information that uniquely identifies himself or herself based on the shape information and the avatar image. The user can therefore confirm that the authentication result is his or her own, while another person who views the output information cannot immediately identify the relationship between the output information and the user.
- FIG. 16 is a diagram showing a processing flow of the gate device according to the sixth embodiment.
- the gate device 10 may generate output information as follows.
- the acquisition unit 21 of the information processing apparatus 20 acquires an authentication request (step S701).
- the authentication unit 22 calculates the similarity between the facial feature information included in the authentication request and the many pieces of facial feature information recorded in the database 104, and determines whether feature information having a similarity equal to or higher than a predetermined threshold is recorded in the database 104 (step S702).
- when facial feature information whose similarity with the feature information included in the authentication request is equal to or higher than the predetermined threshold is registered in the database 204, the authentication unit 22 determines that the feature information with the highest similarity belongs to the same person as the user indicated by the feature information included in the authentication request (step S703).
- the authentication unit 22 determines whether the passage permission information associated with the feature information of the same person as the user indicates that passage is permitted (step S704). As an example, the authentication unit 22 determines whether the passage permission information associated with that feature information is registered in the database 204 in association with the identifier of the gate device 10 included in the authentication request. When it is so registered, the authentication unit 22 determines that passage is permitted (step S705).
- when feature information having a similarity equal to or higher than the predetermined threshold is not recorded in the database 204, or when the passage permission information associated with the feature information of the same person as the user is not registered in the database 204 in association with the identifier of the gate device 10 included in the authentication request, the authentication unit 22 determines that passage is not permitted (step S706). The authentication unit 22 generates authentication result information indicating whether passage is permitted or not (step S707).
- the authentication unit 22 acquires the ID of the user registered in the database 204 in association with the characteristic information of the same person as the user.
- the authentication unit 22 outputs a process start instruction including the user ID to the generation unit 23.
- the generation unit 23 acquires the user's ID based on the processing start instruction. It is assumed that the user's ID is expressed as a ten-digit numeric string. The generation unit 23 takes the last four digits of that ten-digit number and uses the four-digit number together with the output information generation algorithm to generate shape information determined according to the user's ID (step S708). The generation of the shape information is the same as the processing of the first embodiment described with reference to FIG.
- the generation unit 23 acquires an image indicating the birthplace of the user recorded in the database 204 in association with the user ID (step S709).
- the image showing the place of origin is one aspect of the related information related to the target associated with the identification information of the target user.
- the generation unit 23 generates output information including shape information and an image (related information) indicating the place of origin (step S710).
- the output unit 24 acquires the authentication result information generated by the authentication unit 22 and the output information generated by the generation unit 23.
- the output unit 24 generates a gate control instruction including the authentication result information and the output information (step S711).
- the output unit 24 acquires the network address of the gate device 10 included in the authentication request.
- the output unit 24 transmits a gate control instruction to the acquired network address (step S712).
- the gate device 10 performs gate control based on the authentication result information and outputs the output information, as in the first embodiment and the second embodiment.
- the user of the gate device 10 can confirm, together with the authentication result, information that uniquely identifies himself or herself based on the shape information and the image showing his or her birthplace. The user can therefore confirm that the authentication result is his or her own, while another person who views the output information cannot immediately identify the relationship between the output information and the user.
- FIG. 17 is a diagram showing a processing flow of the gate device according to the seventh embodiment.
- the gate device 10 may generate output information as follows.
- the acquisition unit 21 of the information processing apparatus 20 acquires an authentication request (step S801).
- the authentication unit 22 calculates the similarity between the facial feature information included in the authentication request and the many pieces of facial feature information recorded in the database 104, and determines whether feature information having a similarity equal to or higher than a predetermined threshold is recorded in the database 104 (step S802).
- when facial feature information whose similarity with the feature information included in the authentication request is equal to or higher than the predetermined threshold is registered in the database 204, the authentication unit 22 determines that the feature information with the highest similarity belongs to the same person as the user indicated by the feature information included in the authentication request (step S803).
- the authentication unit 22 determines whether the passage permission information associated with the feature information of the same person as the user indicates that passage is permitted (step S804). As an example, the authentication unit 22 determines whether the passage permission information associated with that feature information is registered in the database 204 in association with the identifier of the gate device 10 included in the authentication request. When it is so registered, the authentication unit 22 determines that passage is permitted (step S805).
- when feature information having a similarity equal to or higher than the predetermined threshold is not recorded in the database 204, or when the passage permission information associated with the feature information of the same person as the user is not registered in the database 204 in association with the identifier of the gate device 10 included in the authentication request, the authentication unit 22 determines that passage is not permitted (step S806). The authentication unit 22 generates authentication result information indicating whether passage is permitted or not (step S807).
- the authentication unit 22 acquires the ID of the user registered in the database 204 in association with the characteristic information of the same person as the user.
- the authentication unit 22 outputs a process start instruction including the user ID to the generation unit 23.
- the generation unit 23 acquires the user's ID based on the processing start instruction. It is assumed that the user's ID is expressed as a ten-digit numeric string. The generation unit 23 takes the last four digits of that ten-digit number and uses the four-digit number together with the output information generation algorithm to generate shape information determined according to the user's ID (step S808). The generation of the shape information is the same as the processing of the first embodiment described with reference to FIG.
- the generation unit 23 acquires the user's characteristic information from the authentication unit 22.
- the generation unit 23 acquires the security management information and guest management information of the user from the database 204 based on the feature information (step S809).
- the security management information may be information indicating the degree of safety of the user identified by the facial feature information. For example, the degree of safety may be determined from a past criminal record or the like.
- the guest management information may be information indicating whether or not the user is a guest.
- the generation unit 23 generates output information including shape information and icon images (related information) corresponding to security management information and guest management information (step S810).
- for example, if the security management information expresses the degree of safety in two levels, safe and dangerous, the icon image corresponding to the security management information indicates either safe or dangerous. The icon image corresponding to the guest management information may be an icon image indicating whether or not the person is a guest.
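- A minimal sketch of mapping the security management information and the guest management information to icon images could look like this; the two-level safety scale and the icon file names are assumptions for illustration only.

```python
def select_icons(safety_level: str, is_guest: bool) -> list[str]:
    """Step S810 (in part): choose icon images from the management information.

    safety_level: "safe" or "dangerous" (assumed two-level scale).
    is_guest: True if the guest management information marks the user as a guest.
    """
    icons = []
    icons.append("icon_safe.png" if safety_level == "safe" else "icon_danger.png")
    icons.append("icon_guest.png" if is_guest else "icon_registered.png")
    return icons

# Example: a guest with no recorded risk
print(select_icons("safe", True))   # ['icon_safe.png', 'icon_guest.png']
```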
- the output unit 24 acquires the authentication result information generated by the authentication unit 22 and the output information generated by the generation unit 23.
- the output unit 24 generates a gate control instruction including the authentication result information and the output information (step S811).
- the output unit 24 acquires the network address of the gate device 10 included in the authentication request.
- the output unit 24 transmits a gate control instruction to the acquired network address (step S812).
- the gate device 10 performs gate control based on the authentication result information and outputs the output information in the same manner as in the first embodiment and the second embodiment.
- the gate device 10 may display the output information, including the authentication result information and the icon images corresponding to the security management information and the guest management information, on a monitor or the like that is connected by communication and located nearby.
- a guard or other staff member who checks the monitor can determine whether the user is a guest or a person with a criminal record.
- when the gate device 10 detects, based on the security management information, that a target with a past criminal record has been found, it may not only display this on the monitor but also generate alert information for the guards and output it to a predetermined output destination, or perform a process of closing the flapper 14.
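- As an illustration of that alert path, the following sketch assumes a simple callback-style notification channel and a flapper-control hook; the function names and interfaces are hypothetical, not taken from the patent.

```python
def handle_security_hit(user_id: str, safety_level: str,
                        notify_guards, close_flapper) -> None:
    """If the security management information flags the user, alert the guards and close the gate.

    notify_guards: callable taking an alert message (assumed output destination).
    close_flapper: callable that closes the flapper 14 of the gate device 10.
    """
    if safety_level == "dangerous":                 # assumed two-level scale
        notify_guards(f"Alert: flagged target detected (user {user_id})")
        close_flapper()

# Example wiring with stand-in callbacks
handle_security_hit("1234567890", "dangerous",
                    notify_guards=print,
                    close_flapper=lambda: print("flapper 14 closed"))
```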
- FIG. 18 is a diagram showing a processing flow of the managed mobile terminal according to the eighth embodiment.
- FIG. 19 is a diagram showing a processing flow of the information processing apparatus according to the eighth embodiment.
- FIG. 20 is a diagram showing a processing flow of the gate device according to the eighth embodiment.
- in the first to seventh embodiments described above, examples have been described in which the gate device 10 acquires the user's face image.
- however, a device other than the gate device 10 may acquire the face image of a user who is about to pass through the gate device 10.
- for example, the management mobile terminal 44 carried by a guard in the vicinity of the gate device may capture the user's face image with a camera provided in the terminal. An example of this case will be described below.
- the management mobile terminal 44 communicates with the information processing device 20.
- the management mobile terminal 44 may be connected to the gate device 10 by communication. This communication connection may be in a mode in which the management mobile terminal 44 and the gate device 10 are always connected while the dedicated application is running on the management mobile terminal 44.
- when the management mobile terminal 44 establishes communication with the gate device 10, it acquires the identification information of the gate device 10. It is also assumed that the management mobile terminal 44 is connected to the information processing device 20 via the communication network.
- using the management mobile terminal 44 that he or she carries, the guard photographs the face of a user who is approaching the gate device 10 in order to pass through it.
- the management mobile terminal 44 sequentially acquires image data including the images captured by its own camera (step S901).
- the management mobile terminal 44 starts extracting the facial feature amount appearing in the image indicated by the image data.
- the management mobile terminal 44 determines whether feature information of a human face can be extracted from the image included in the image data (step S902).
- when the facial feature information can be extracted, the management mobile terminal 44 transmits an authentication request including the feature information, the network address of the terminal itself, and the identification information of the gate device 10 to the information processing device 20 (step S903).
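- The terminal-side request in step S903 can be sketched as below. The endpoint URL, the JSON field names, and the synchronous reply are assumptions made for illustration; in the described flow the control instruction is later delivered to the terminal's network address rather than returned in-line.

```python
import json
import urllib.request

def send_authentication_request(feature_vector, terminal_address, gate_id,
                                endpoint="http://info-processing-device.example/auth"):
    """Step S903: send the feature information, the terminal's address, and the gate identifier."""
    payload = json.dumps({
        "feature": list(feature_vector),     # facial feature information
        "terminal_address": terminal_address,
        "gate_id": gate_id,
    }).encode("utf-8")
    req = urllib.request.Request(endpoint, data=payload,
                                 headers={"Content-Type": "application/json"})
    # For simplicity this sketch reads a reply synchronously.
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())
```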
- the information processing device 20 stores the facial feature information, the passage permission information, and the like registered in advance by the user of the gate device 10 in the database 204 and the like.
- the passage permission information may be an information group of identification information of the gate device 10 that can pass.
- the acquisition unit 21 of the information processing apparatus 20 acquires an authentication request (step S1001).
- the authentication unit 22 calculates the similarity between the facial feature information included in the authentication request and the many pieces of facial feature information recorded in the database 104, and determines whether feature information having a similarity equal to or higher than a predetermined threshold is recorded in the database 104 (step S1002).
- when facial feature information whose similarity with the feature information included in the authentication request is equal to or higher than the predetermined threshold is registered in the database 204, the authentication unit 22 determines that the feature information with the highest similarity belongs to the same person as the user indicated by the feature information included in the authentication request (step S1003).
- the authentication unit 22 determines whether the passage permission information associated with the feature information of the same person as the user indicates that passage is permitted (step S1004). As an example, the authentication unit 22 determines whether the passage permission information associated with that feature information is registered in the database 204 in association with the identifier of the gate device 10 included in the authentication request. When it is so registered, the authentication unit 22 determines that passage is permitted (step S1005).
- when feature information having a similarity equal to or higher than the predetermined threshold is not recorded in the database 204, or when the passage permission information associated with the feature information of the same person as the user is not registered in the database 204 in association with the identifier of the gate device 10 included in the authentication request, the authentication unit 22 determines that passage is not permitted (step S1006). The authentication unit 22 generates authentication result information indicating whether passage is permitted or not (step S1007).
- the authentication unit 22 acquires the ID of the user registered in the database 204 in association with the characteristic information of the same person as the user.
- the authentication unit 22 outputs a process start instruction including the user ID to the generation unit 23.
- the generation unit 23 acquires the user's ID based on the processing start instruction. It is assumed that the user's ID is expressed as a ten-digit numeric string. The generation unit 23 takes the last four digits of that ten-digit number and uses the four-digit number together with the output information generation algorithm to generate output information determined according to the user's ID (step S1008). It is assumed that the output information is information indicating a shape.
- the output unit 24 acquires the authentication result information generated by the authentication unit 22 and the output information generated by the generation unit 23.
- the output unit 24 generates a control instruction including the authentication result information and the output information (step S1009).
- the output unit 24 acquires the network address of the management mobile terminal 44 included in the authentication request (step S1010).
- the output unit 24 acquires the identification information of the gate device 10 included in the authentication request.
- the output unit 24 acquires the network address of the gate device 10 recorded in the database 204 in association with the identification information of the gate device 10 (step S1011).
- the output unit 24 transmits a control instruction to the acquired network address of the management mobile terminal 44 (step S1012). Further, the output unit 24 transmits a control instruction to the acquired network address of the gate device 10 (step S1013).
- the management mobile terminal 44 carried by the guard receives the control instruction (step S904).
- the management mobile terminal 44 acquires the authentication result from the control instruction.
- the management mobile terminal 44 determines whether the information included in the authentication result information indicates passability or impassability (step S905).
- the management mobile terminal 44 outputs the output information and the mark indicating pass permission to the display 13 (step S906).
- the guard can recognize that the authentication result of the user who is going to pass through the gate device 10 is a passage permit.
- the guard makes the user visually recognize the information displayed on the display 13.
- based on the mark indicating passage permission, the user recognizes that passage is permitted, views the output information, confirms that it is the output information determined by his or her own ID, and thus confirms that the permission applies to himself or herself. The user then passes through the gate device 10.
- the management mobile terminal 44 outputs the output information and the mark indicating non-permission to the display 13 (step S907).
- the guard can recognize that the authentication result of the user who is going to pass through the gate device 10 is not permitted.
- the guard makes the user visually recognize the information displayed on the display 13.
- based on the mark indicating passage prohibition, the user recognizes that passage is not permitted, views the output information, confirms that it is the output information determined by his or her own ID, and thus confirms that the refusal applies to himself or herself. Instead of passing through the gate device 10, the user may consult the guards about passing through it.
- the gate control device 15 of the gate device 10 receives the gate control instruction (step S1101).
- the gate control unit 151 of the gate control device 15 acquires the authentication result information included in the gate control instruction.
- the output information control unit 152 of the gate control device 15 acquires the output information included in the gate control instruction.
- the gate control unit 151 determines whether the information included in the authentication result information indicates passability or impassability (step S1102). When the authentication result information includes information indicating passability, the gate control unit 151 controls the pass permission (step S1103). When the authentication result information includes information indicating impassability, the gate control unit 151 controls the impassability of passage (step S1104).
- when the gate control unit 151 determines to perform passage-permission control, it outputs information indicating passage permission to the output information control unit 152 and controls the flapper 14 of the gate device 10 to open (step S1105). This opens the passage area of the gate device 10.
- when the gate control unit 151 determines to perform passage-prohibition control, it outputs information indicating passage prohibition to the output information control unit 152 and controls the flapper 14 of the gate device 10 to close (step S1106). As a result, the passage area of the gate device 10 is closed.
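- The branch in steps S1102 to S1106 amounts to a small dispatcher on the received control instruction. The sketch below assumes the instruction is a dictionary with a boolean passability flag and abstracts the flapper and the display as callables; these interfaces are illustrative only.

```python
def handle_gate_control_instruction(instruction: dict,
                                    open_flapper, close_flapper, show) -> None:
    """Steps S1101-S1106: act on the authentication result and display the output information."""
    passable = instruction["authentication_result"]["passable"]   # assumed field layout
    output_info = instruction["output_information"]
    if passable:
        show(output_info, mark="PASS")    # output information plus passage-permission mark
        open_flapper()                    # opens the passage area (flapper 14)
    else:
        show(output_info, mark="STOP")    # output information plus passage-prohibition mark
        close_flapper()                   # closes the passage area

handle_gate_control_instruction(
    {"authentication_result": {"passable": True}, "output_information": {"shape": "star"}},
    open_flapper=lambda: print("flapper open"),
    close_flapper=lambda: print("flapper closed"),
    show=lambda info, mark: print(mark, info),
)
```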
- in this way, the management mobile terminal 44 can display the user's authentication result and the shape information determined according to the user's identification information.
- in the above description, the information processing device 20 transmits control instructions separately to the management mobile terminal 44 and the gate device 10.
- alternatively, the information processing device 20 may transmit the control instruction to only one of the management mobile terminal 44 and the gate device 10, and the device that has received the control instruction may transfer it to the other.
- in the processing of the first to eighth embodiments described above, the shape information determined according to the identification information of the user to be authenticated and the information indicating the authentication result, both included in the output information, are displayed together. However, a captured image of the user to be authenticated may additionally be displayed, with the shape information and the authentication result superimposed on that captured image. A case in which the gate device 10 performs this processing will be described.
- the gate device 10 sequentially acquires captured images acquired from the camera 11 when displaying the output information and the authentication result (mark indicating passage permission or non-passage permission) on the display 13.
- the gate device 10 can output the acquired images as moving images by displaying them in order. It is assumed that the user is shown in the captured image.
- the gate device 10 superimposes and displays the output information and the authentication result on the captured image. As a result, the user can check his / her face image, output information, and authentication result at the same time.
- when a plurality of people appear in the captured image, the gate device 10 may identify the face from which the facial feature information of the gate device's user was generated and display only that face image. As a result, even when a plurality of people appear in the captured image, the output information and the authentication result can be superimposed on the face image of the authenticated user and displayed on the display 13.
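- The overlay step can be sketched with OpenCV-style drawing calls, assuming the bounding box of the authenticated face is already known from the feature-extraction stage; the drawing layout and library choice are assumptions for illustration.

```python
import cv2  # assumes opencv-python is available

def overlay_result(frame, face_box, output_text: str, passable: bool):
    """Superimpose the output information and the authentication result mark on the user's face.

    frame: BGR image from camera 11; face_box: (x, y, w, h) of the authenticated user's face.
    """
    x, y, w, h = face_box
    color = (0, 200, 0) if passable else (0, 0, 255)          # green for pass, red for stop
    cv2.rectangle(frame, (x, y), (x + w, y + h), color, 2)    # outline the authenticated face
    mark = "PASS" if passable else "STOP"
    cv2.putText(frame, f"{mark} {output_text}", (x, max(0, y - 10)),
                cv2.FONT_HERSHEY_SIMPLEX, 0.7, color, 2)
    return frame
```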
- in the above description, the output information, the authentication result, and the captured image are displayed on the display 13 provided in the gate device 10.
- however, such information may instead be displayed on a monitor that communicates with the gate device 10 and is provided in its vicinity.
- when a plurality of people appear in the image captured by the camera 11 of the gate device 10, the gate device 10 and the information processing device 20 may perform the same processing for each of those people.
- the gate control unit 151 of the gate device 10 generates feature information for each person appearing in the captured image, and generates an authentication request including each piece of feature information in the same manner as in the embodiments described above.
- the gate control unit 151 transmits the plurality of generated authentication requests to the information processing device 20. Based on each authentication request, the information processing device 20 performs processing in the same manner as in the embodiments described above, generates a gate control instruction, and transmits it to the gate device 10.
- the gate device 10 sequentially receives each gate control instruction corresponding to each authentication request.
- the gate control unit 151 identifies the correspondence between each gate control instruction and the feature information of each face in the captured image.
- for example, a processing ID is included in each authentication request, and the information processing device 20 stores the same processing ID in the gate control instruction corresponding to that authentication request.
- by temporarily storing the feature information in association with the processing ID, the gate control unit 151 can identify the feature information corresponding to a gate control instruction by using the processing ID included in that instruction.
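- A small correlation table is enough to implement this processing-ID round trip; the sketch below uses a dictionary keyed by a locally generated ID, which is an assumed implementation detail rather than the patent's specified method.

```python
import uuid

pending: dict[str, dict] = {}   # processing ID -> stored per-face context

def register_request(feature_vector, face_box) -> dict:
    """Before sending: remember which face a request belongs to and tag it with a processing ID."""
    processing_id = uuid.uuid4().hex
    pending[processing_id] = {"feature": feature_vector, "face_box": face_box}
    return {"processing_id": processing_id, "feature": feature_vector}  # authentication request body

def match_instruction(gate_control_instruction: dict):
    """On receipt: recover the face that the gate control instruction refers to."""
    context = pending.pop(gate_control_instruction["processing_id"], None)
    return context   # None if the instruction does not match any pending request
```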
- the output information control unit 152 displays the captured image captured by the camera 11 on a monitor or the like.
- the output information included in each gate control instruction and the mark indicating the authentication result are superimposed on the corresponding face and displayed.
- for example, the gate device 10 can output the output information of a target person who has a criminal record in association with that person's face image.
- the gate control unit 151 may also perform a process of associating the output information of a person who has a criminal record with that person's facial feature information and recording them in the information processing device 20.
- in this way, the output information and the authentication result mark can be displayed for each of the plurality of people.
- the administrator or the like can manage the user who is going to pass through the gate device 10 while checking the monitor.
- FIG. 21 is a diagram showing an outline of an information processing system according to the eleventh embodiment.
- FIG. 22 is a diagram showing a processing flow of the managed mobile terminal according to the eleventh embodiment.
- FIG. 23 is a diagram showing a processing flow of the information processing apparatus according to the eleventh embodiment.
- the gate control instruction may be output to the management mobile terminal 44 managed by a security guard or the like. An example in this case will be described below.
- the guard has started the dedicated application recorded in the management mobile terminal 44 in the vicinity of the gate device 10. It is assumed that the management mobile terminal 44 is in communication connection with the information processing device 20 via the communication network.
- using the management mobile terminal 44 that he or she carries, the guard photographs the face of a user who is approaching the gate device 10 in order to pass through it.
- the management mobile terminal 44 sequentially acquires image data including the images captured by its own camera (step S1201).
- the management mobile terminal 44 starts extracting the facial feature amount appearing in the image indicated by the image data.
- the management mobile terminal 44 determines whether feature information of a human face can be extracted from the image included in the image data (step S1202). When the facial feature information can be extracted, the management mobile terminal 44 transmits an authentication request including the feature information and the network address of the terminal itself to the information processing device 20 (step S1203).
- the acquisition unit 21 of the information processing apparatus 20 acquires an authentication request (step S1301).
- the authentication unit 22 calculates the similarity between the facial feature information included in the authentication request and the many pieces of facial feature information recorded in the database 104, and determines whether feature information having a similarity equal to or higher than a predetermined threshold is recorded in the database 104 (step S1302).
- when facial feature information whose similarity with the feature information included in the authentication request is equal to or higher than the predetermined threshold is registered in the database 204, the authentication unit 22 determines that the feature information with the highest similarity belongs to the same person as the user indicated by the feature information included in the authentication request (step S1303).
- the authentication unit 22 determines whether the passage permission information associated with the feature information of the same person as the user indicates that passage is permitted (step S1304). When the passage permission information associated with that feature information indicates passage permission, the authentication unit 22 determines that passage is permitted (step S1305).
- when feature information having a similarity equal to or higher than the predetermined threshold is not recorded in the database 204, or when the passage permission information associated with the feature information of the same person as the user indicates that passage is prohibited, the authentication unit 22 determines that passage is not permitted (step S1306). The authentication unit 22 generates authentication result information indicating whether passage is permitted or not (step S1307).
- the authentication unit 22 acquires the ID of the user registered in the database 204 in association with the characteristic information of the same person as the user.
- the authentication unit 22 outputs a process start instruction including the user ID to the generation unit 23.
- the generation unit 23 acquires the user's ID based on the processing start instruction. It is assumed that the user's ID is expressed as a ten-digit numeric string. The generation unit 23 takes the last four digits of that ten-digit number and uses the four-digit number together with the output information generation algorithm to generate output information determined according to the user's ID (step S1308). It is assumed that the output information is information indicating a shape.
- the output unit 24 acquires the authentication result information generated by the authentication unit 22 and the output information generated by the generation unit 23.
- the output unit 24 generates a control instruction including the authentication result information and the output information (step S1309).
- the output unit 24 acquires the network address of the management mobile terminal 44 included in the authentication request (step S1310).
- the output unit 24 transmits a control instruction to the acquired network address of the management mobile terminal 44 (step S1311).
- the management mobile terminal 44 carried by the guard receives the control instruction (step S1204).
- the management mobile terminal 44 acquires the authentication result from the control instruction.
- the management mobile terminal 44 determines whether the information included in the authentication result information indicates passability or impassability (step S1205).
- the management mobile terminal 44 outputs the output information and the mark indicating pass permission to the display 13 (step S1206).
- the guard can recognize on the management mobile terminal 44 that the authentication result of the user who is going to pass through the gate device 10 is a passage permit.
- the guard makes the user visually recognize the information displayed on the display 13.
- based on the mark indicating passage permission, the user recognizes that passage is permitted, views the output information, confirms that it is the output information determined by his or her own ID, and thus confirms that the permission applies to himself or herself. The user then passes through the gate device 10.
- since the guard can determine on the management mobile terminal 44 whether or not passage is permitted, the entrance of users can be managed with the management mobile terminal 44 even in places where the gate device 10 cannot be installed.
- the management mobile terminal 44 outputs the output information and the mark indicating non-permission to the display 13 (step S1207).
- the guard can recognize on the management mobile terminal 44 that the authentication result of the user who is going to pass through the gate device 10 is not permitted.
- the guard makes the user visually recognize the information displayed on the display 13.
- based on the mark indicating passage prohibition, the user recognizes that passage is not permitted, views the output information, confirms that it is the output information determined by his or her own ID, and thus confirms that the refusal applies to himself or herself. Instead of passing through the gate device 10, the user may consult the guards about passing through it.
- in this way, a person who manages the gate device 10, such as a security guard, photographs a user who is about to pass through the gate device 10 by using the management mobile terminal 44 that he or she carries.
- the management mobile terminal 44 can then display the user's authentication result and the shape information determined according to the user's identification information.
- the information processing system 100 in which the gate device 10 and the information processing device 20 are connected via a communication network has been described.
- the information processing system 100 may be a computer system in which a device other than the gate device 10 and the information processing device 20 are connected via a communication network.
- the device used in place of the gate device 10 may be a guidance device, a ticket-issuing machine, a vending machine, or another device.
- These devices will be referred to as display devices.
- These devices may be provided with at least a device for acquiring biometric information (a camera, a fingerprint scanner, a vein scanner, or the like) and a display.
- the information processing device 20 may not perform the authentication process.
- the display device transmits a processing request including the feature information of the user's face to the information processing device 20.
- the information processing device 20 similarly generates shape information based on facial feature information.
- the information processing device 20 transmits the shape information to the display device.
- the display device receives the shape information and displays it on the display. As a result, the user can confirm the shape information determined by his / her own identification information.
- in each of the above-described embodiments, processing is performed using facial feature information.
- however, other biometric information such as iris information may be used instead of facial feature information. That is, the gate control unit 151 acquires the iris information of the user's eyes from the captured image.
- the information processing apparatus performs the same processing by using the iris information instead of the facial feature information of each of the above-described embodiments.
- the biometric information may also be fingerprint information, palm vein information, or the like.
- FIG. 24 is a diagram showing a minimum configuration of an information processing system.
- FIG. 25 is a diagram showing a processing flow of an information processing system showing a minimum configuration.
- the information processing system 100 includes at least an acquisition means 241, a generation means 242, and an output means 243.
- the acquisition means 241 acquires biometric information of a target (step S2101).
- the generation means 242 uses identification information associated with the biometric information to generate output information determined in correspondence with the identification information (step S2102).
- the output means 243 outputs the output information (step S2103).
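- The minimum configuration is essentially a three-stage pipeline: acquire biometric information, derive output information from the associated identification information, and output it. The sketch below wires that shape with stand-in callables; the lookup from biometric information to an identification string is an assumed placeholder, not the patent's method.

```python
from typing import Callable, Optional

def minimum_configuration(acquire: Callable[[], bytes],
                          identify: Callable[[bytes], Optional[str]],
                          generate: Callable[[str], dict],
                          output: Callable[[dict], None]) -> None:
    """Acquisition means 241 -> generation means 242 -> output means 243 (steps S2101-S2103)."""
    biometric_info = acquire()            # step S2101: acquire the target's biometric information
    user_id = identify(biometric_info)    # identification information associated with it (assumed lookup)
    if user_id is None:
        return                            # no associated identification information
    output_info = generate(user_id)       # step S2102: output information determined by the ID
    output(output_info)                   # step S2103: output the output information

# Example wiring with trivial stand-ins
minimum_configuration(
    acquire=lambda: b"fake-biometric-sample",
    identify=lambda b: "1234567890",
    generate=lambda uid: {"shape_seed": uid[-4:]},
    output=print,
)
```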
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Physics & Mathematics (AREA)
- Computer Security & Cryptography (AREA)
- General Engineering & Computer Science (AREA)
- Computer Hardware Design (AREA)
- Software Systems (AREA)
- Human Computer Interaction (AREA)
- Health & Medical Sciences (AREA)
- General Health & Medical Sciences (AREA)
- Bioethics (AREA)
- Databases & Information Systems (AREA)
- Medical Informatics (AREA)
- Oral & Maxillofacial Surgery (AREA)
- Multimedia (AREA)
- Devices For Checking Fares Or Tickets At Control Points (AREA)
- Hardware Redundancy (AREA)
Abstract
Description
There is a demand for a technology that, when information about an authenticated target is displayed together with the result of authentication of that target by an authentication device, outputs information that makes it even more difficult to identify the authenticated target.
As shown in FIG. 1, the information processing system 100 includes at least a gate device 10 and an information processing device 20. The gate device 10 and the information processing device 20 are connected via a communication network.
The output destination of the above-described output information may be a mobile terminal 4 carried by the person M who passes through the gate device 10. In this case, the gate device 10 transmits the output information to the mobile terminal 4 carried by the person M. Alternatively, the information processing device 20 may transmit the output information directly to the mobile terminal 4 carried by the person M. When the information processing device 20 transmits the output information directly to the mobile terminal 4 carried by the person M, greater confidentiality can be ensured than when the output information is output to the display 13 of the gate device 10.
As shown in FIG. 3, the gate control device 15 is a computer provided with hardware such as a processor 51 (a CPU (Central Processing Unit), a GPU (Graphics Processing Unit), an FPGA (Field Programmable Gate Array), a DSP (Digital Signal Processor), an ASIC (Application Specific Integrated Circuit), or the like), a ROM (Read Only Memory) 52, a RAM (Random Access Memory) 53, an SSD (Solid State Drive) 54, and a communication module 55.
As shown in FIG. 4, the information processing device 20 is a computer provided with hardware such as a processor 201 (a CPU, a GPU, an FPGA, a DSP, an ASIC, or the like), a ROM (Read Only Memory) 202, a RAM (Random Access Memory) 203, a database 204, and a communication module 205.
The gate control device 15 provides the functions of the gate control unit 151, the output information control unit 152, and the communication control unit 153 by executing a gate control program on the CPU 51.
The information processing device 20 provides the functions of the acquisition unit 21, the authentication unit 22, the generation unit 23, and the output unit 24 by executing an information processing program on the CPU 21.
The authentication unit 22 performs authentication using biometric information of a target such as a person.
The generation unit 23 uses identification information associated with the biometric information of the target such as a person to generate output information determined in correspondence with that identification information.
The output unit 24 outputs the output information.
FIG. 7 is a first diagram showing a processing flow of the gate device according to the first embodiment.
FIG. 8 is a first diagram showing a processing flow of the information processing device according to the first embodiment.
Hereinafter, the processing according to the embodiments of the present invention will be described in order.
The user approaches the gate body 1. The gate control unit 151 of the gate control device 15 sequentially acquires image data including images captured by the camera 11 of the gate body 1 (step S101). Upon acquiring the image data, the gate control unit 151 starts extracting the feature amount of the face appearing in the image indicated by the image data. It is assumed that the acquired image data stores the identifier of the gate body 1 provided with the camera 11 that transmitted the image data.
Next, a first specific example of the process by which the generation unit 23 of the information processing device 20 generates output information determined in correspondence with the ID of the user who is the target of authentication will be described.
The generation unit 23 acquires the user's ID based on the processing start instruction. As described above, the user's ID is indicated, as an example, by a ten-digit numeric string, and the generation unit 23 acquires the last four digits (S1) of that ten-digit number based on the user's ID.
Instead of specifying the shape (S3) to be displayed at the left position, the output information generation algorithm may be an algorithm that specifies the shape to be displayed at a specific position in some display arrangement from among a number of shapes other than five.
Likewise, instead of specifying the shape (S4) to be displayed at the right position, the output information generation algorithm may be an algorithm that specifies the shapes to be displayed at the remaining display positions in some display arrangement from among a number of shapes other than four.
The output information generation algorithm may also be an algorithm that specifies, for at least one of the specified shapes, a color from among a number of colors other than four.
By specifying the shapes in S2 to S4 as shown in FIG. 9, 40 kinds of shapes can be generated as shown in FIG. 10. Furthermore, as shown in S4, by specifying one of a plurality of colors for these shapes, display information with an even larger number of combinations of shapes and colors can be generated.
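A sketch of an output information generation algorithm of this kind is given below. The specific shape catalogue, the position rule, and the color rule are assumptions chosen only to reproduce the flavour of the S1 to S4 steps (last four digits, shapes at fixed positions, then a color), not the algorithm actually used in the embodiments.

```python
LEFT_SHAPES  = ["circle", "triangle", "square", "star", "diamond"]   # 5 candidates (cf. S3)
RIGHT_SHAPES = ["circle", "triangle", "square", "star"]              # 4 candidates (cf. S4)
COLORS       = ["red", "blue", "green", "yellow"]                    # 4 candidates

def output_information(user_id: str) -> dict:
    """Map the last four digits of the ID (S1) to a left shape, a right shape, and a color."""
    d1, d2, d3, d4 = (int(c) for c in user_id[-4:])
    return {
        "left":  LEFT_SHAPES[(d1 + d2) % len(LEFT_SHAPES)],
        "right": RIGHT_SHAPES[d3 % len(RIGHT_SHAPES)],
        "color": COLORS[d4 % len(COLORS)],
    }

# 5 left shapes x 4 right shapes = 20 shape pairs; adding 4 colors gives 80 combinations in
# this sketch (the example of FIG. 10 in the description yields 40 shape variations).
print(output_information("1234567890"))
```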
The gate device 10 displays, side by side on the display 13 as shown in FIG. 11, the mark m indicating passage permission and the output information E indicating the shape determined according to the ID of the user of the gate device 10 who is the target of authentication. As shown in FIG. 10, the gate device 10 may add additional information A and output it together with the output information E.
FIG. 12 is a diagram showing a processing flow of the gate device according to the second embodiment.
When the gate device 10 receives a gate passage instruction from the information processing device 20 as in the first embodiment, it may perform the following processing instead of the processing described in the first embodiment. Specifically, the gate control unit 151 of the gate device 10 receives the gate control instruction (step S301). The gate control unit 151 of the gate control device 15 acquires the authentication result information included in the gate control instruction. The output information control unit 152 of the gate control device 15 acquires the output information included in the gate control instruction. The gate control unit 151 determines whether the information included in the authentication result information indicates that passage is permitted or not (step S302). When the authentication result information includes information indicating that passage is permitted, the gate control unit 151 performs passage-permission control (step S303). When the authentication result information includes information indicating that passage is not permitted, the gate control unit 151 performs passage-prohibition control (step S304).
FIG. 13 is a diagram showing a processing flow of the gate device according to the third embodiment.
The gate device 10 may generate output information as follows.
The acquisition unit 21 of the information processing device 20 acquires an authentication request (step S401). The authentication unit 22 calculates the similarity between the facial feature information included in the authentication request and the many pieces of facial feature information recorded in the database 104, and determines whether feature information having a similarity equal to or higher than a predetermined threshold is recorded in the database 104 (step S402).
FIG. 14 is a diagram showing a processing flow of the gate device according to the fourth embodiment.
The gate device 10 may generate output information as follows.
The acquisition unit 21 of the information processing device 20 acquires an authentication request (step S501). The authentication unit 22 calculates the similarity between the facial feature information included in the authentication request and the many pieces of facial feature information recorded in the database 104, and determines whether feature information having a similarity equal to or higher than a predetermined threshold is recorded in the database 104 (step S502).
FIG. 15 is a diagram showing a processing flow of the gate device according to the fifth embodiment.
The gate device 10 may generate output information as follows.
The acquisition unit 21 of the information processing device 20 acquires an authentication request (step S601). The authentication unit 22 calculates the similarity between the facial feature information included in the authentication request and the many pieces of facial feature information recorded in the database 104, and determines whether feature information having a similarity equal to or higher than a predetermined threshold is recorded in the database 104 (step S602).
FIG. 16 is a diagram showing a processing flow of the gate device according to the sixth embodiment.
The gate device 10 may generate output information as follows.
The acquisition unit 21 of the information processing device 20 acquires an authentication request (step S701). The authentication unit 22 calculates the similarity between the facial feature information included in the authentication request and the many pieces of facial feature information recorded in the database 104, and determines whether feature information having a similarity equal to or higher than a predetermined threshold is recorded in the database 104 (step S702).
FIG. 17 is a diagram showing a processing flow of the gate device according to the seventh embodiment.
The gate device 10 may generate output information as follows.
The acquisition unit 21 of the information processing device 20 acquires an authentication request (step S801). The authentication unit 22 calculates the similarity between the facial feature information included in the authentication request and the many pieces of facial feature information recorded in the database 104, and determines whether feature information having a similarity equal to or higher than a predetermined threshold is recorded in the database 104 (step S802).
(step S806). The authentication unit 22 generates authentication result information indicating whether passage is permitted or not (step S807).
FIG. 18 is a diagram showing a processing flow of the management mobile terminal according to the eighth embodiment.
FIG. 19 is a diagram showing a processing flow of the information processing device according to the eighth embodiment.
FIG. 20 is a diagram showing a processing flow of the gate device according to the eighth embodiment.
In the first to seventh embodiments described above, examples have been described in which the gate device 10 acquires the user's face image. However, a device other than the gate device 10 may acquire the face image of a user who is about to pass through the gate device 10. For example, the management mobile terminal 44 carried by a guard in the vicinity of the gate device may capture the user's face image with a camera provided in the terminal. An example of this case will be described below. The management mobile terminal 44 communicates with the information processing device 20.
In the processing of the first to eighth embodiments described above, it has been explained that the shape information determined according to the identification information of the user to be authenticated, which is included in the output information, and the information indicating the authentication result are displayed together. However, a captured image of the user to be authenticated may additionally be displayed, with the shape information and the authentication result displayed on that captured image. A case in which the gate device 10 performs this processing will be described.
When a plurality of people appear in the image captured by the camera 11 of the gate device 10, the gate device 10 and the information processing device 20 may perform the same processing for each of those people.
FIG. 21 is a diagram showing an outline of an information processing system according to the eleventh embodiment.
FIG. 22 is a diagram showing a processing flow of the management mobile terminal according to the eleventh embodiment.
FIG. 23 is a diagram showing a processing flow of the information processing device according to the eleventh embodiment.
In the eighth embodiment described above, the gate control instruction may be output to the management mobile terminal 44 managed by a security guard or the like. An example of this case will be described below.
In each of the above-described embodiments, the information processing system 100 in which the gate device 10 and the information processing device 20 are connected via a communication network has been described. However, the information processing system 100 may be a computer system in which a device other than the gate device 10 and the information processing device 20 are connected via a communication network.
FIG. 24 is a diagram showing a minimum configuration of the information processing system.
FIG. 25 is a diagram showing a processing flow of the information processing system with the minimum configuration.
The information processing system 100 includes at least an acquisition means 241, a generation means 242, and an output means 243.
The acquisition means 241 acquires biometric information of a target (step S2101).
The generation means 242 uses identification information associated with the biometric information to generate output information determined in correspondence with the identification information (step S2102).
The output means 243 outputs the output information (step S2103).
4・・・Mobile terminal
10・・・Gate device
11・・・Camera
13・・・Display
15・・・Gate control device
151・・・Gate control unit
152・・・Output information control unit
153・・・Communication control unit
20・・・Information processing device
21・・・Acquisition unit (acquisition means)
22・・・Authentication unit
23・・・Generation unit (generation means)
24・・・Output unit (output means)
100・・・Information processing system
Claims (9)
- 1. An information processing system comprising: an acquisition means for acquiring biometric information of a target; a generation means for generating, using identification information associated with the biometric information, output information determined in correspondence with the identification information; and an output means for outputting the output information.
- 2. The information processing system according to claim 1, wherein the generation means generates the output information determined in correspondence with the identification information by using the identification information and an output information generation algorithm.
- 3. The information processing system according to claim 2, wherein the identification information indicates a string of a plurality of characters, the output information generation algorithm is an algorithm that generates the output information by combining shapes or colors determined according to a character included in the character string and the position of that character in the character string, and the generation means generates the output information by combining the shapes or colors determined according to the character and the position of that character in the character string.
- 4. The information processing system according to claim 3, wherein the generation means specifies the position of the shape indicated by the output information according to the character and the position of that character in the character string.
- 5. The information processing system according to any one of claims 1 to 4, wherein the generation means acquires related information relating to the target linked to the identification information and generates the output information further including the related information.
- 6. The information processing system according to any one of claims 1 to 5, further comprising a gate device provided with an output device that is an output destination of the output information, wherein the acquisition means acquires the biometric information acquired by a biometric information reader provided in the gate device, and the output means outputs the output information to the output device.
- 7. The information processing system according to any one of claims 1 to 6, wherein the output means acquires output destination information indicating a terminal carried by the target associated with the biometric information, and outputs the output information to the terminal carried by the target based on the output destination information.
- 8. An information processing method comprising: acquiring biometric information of a target; generating, using identification information associated with the biometric information, output information determined in correspondence with the identification information; and outputting the output information.
- 9. A program for causing a computer of an information processing system to function as: an acquisition means for acquiring biometric information of a target; a generation means for generating, using identification information associated with the biometric information, output information determined in correspondence with the identification information; and an output means for outputting the output information.
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2022532215A JP7537497B2 (ja) | 2020-06-26 | 2020-06-26 | 情報処理システム、情報処理方法、プログラム |
US17/622,330 US12067096B2 (en) | 2020-06-26 | 2020-06-26 | Information processing system, information processing method, and non-transitory computer-readable recording medium |
EP20941892.0A EP4174690A4 (en) | 2020-06-26 | 2020-06-26 | INFORMATION PROCESSING SYSTEM, INFORMATION PROCESSING METHOD AND PROGRAM |
PCT/JP2020/025299 WO2021260930A1 (ja) | 2020-06-26 | 2020-06-26 | 情報処理システム、情報処理方法、プログラム |
AU2020454606A AU2020454606B2 (en) | 2020-06-26 | 2020-06-26 | Information processing system, information processing method, and program |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2020/025299 WO2021260930A1 (ja) | 2020-06-26 | 2020-06-26 | 情報処理システム、情報処理方法、プログラム |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2021260930A1 true WO2021260930A1 (ja) | 2021-12-30 |
Family
ID=79282151
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2020/025299 WO2021260930A1 (ja) | 2020-06-26 | 2020-06-26 | 情報処理システム、情報処理方法、プログラム |
Country Status (5)
Country | Link |
---|---|
US (1) | US12067096B2 (ja) |
EP (1) | EP4174690A4 (ja) |
JP (1) | JP7537497B2 (ja) |
AU (1) | AU2020454606B2 (ja) |
WO (1) | WO2021260930A1 (ja) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2024195573A1 (ja) * | 2023-03-17 | 2024-09-26 | 日本電気株式会社 | 情報処理システム、情報処理方法、及び記録媒体 |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2014222445A (ja) * | 2013-05-14 | 2014-11-27 | 株式会社デンソーウェーブ | 認証システム |
JP2017208638A (ja) | 2016-05-17 | 2017-11-24 | レノボ・シンガポール・プライベート・リミテッド | 虹彩認証装置、虹彩認証方法、及びプログラム |
JP2017224186A (ja) * | 2016-06-16 | 2017-12-21 | 株式会社 日立産業制御ソリューションズ | セキュリティシステム |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR100714192B1 (ko) | 2005-04-08 | 2007-05-02 | 엔에이치엔(주) | 노출 부위가 가변되는 아바타 제공 시스템 및 그 방법 |
US9313200B2 (en) * | 2013-05-13 | 2016-04-12 | Hoyos Labs Ip, Ltd. | System and method for determining liveness |
EP3392824A4 (en) * | 2015-12-18 | 2019-10-23 | Hitachi, Ltd. | DEVICE AND SYSTEM FOR BIOMETRIC AUTHENTICATION |
US10311220B2 (en) * | 2016-09-02 | 2019-06-04 | Qualcomm Incorporated | Accessing a user equipment using a biometric sensor concurrently with an authentication pattern |
KR102185854B1 (ko) * | 2017-09-09 | 2020-12-02 | 애플 인크. | 생체측정 인증의 구현 |
-
2020
- 2020-06-26 EP EP20941892.0A patent/EP4174690A4/en active Pending
- 2020-06-26 WO PCT/JP2020/025299 patent/WO2021260930A1/ja active Application Filing
- 2020-06-26 JP JP2022532215A patent/JP7537497B2/ja active Active
- 2020-06-26 AU AU2020454606A patent/AU2020454606B2/en active Active
- 2020-06-26 US US17/622,330 patent/US12067096B2/en active Active
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2014222445A (ja) * | 2013-05-14 | 2014-11-27 | 株式会社デンソーウェーブ | 認証システム |
JP2017208638A (ja) | 2016-05-17 | 2017-11-24 | レノボ・シンガポール・プライベート・リミテッド | 虹彩認証装置、虹彩認証方法、及びプログラム |
JP2017224186A (ja) * | 2016-06-16 | 2017-12-21 | 株式会社 日立産業制御ソリューションズ | セキュリティシステム |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2024195573A1 (ja) * | 2023-03-17 | 2024-09-26 | 日本電気株式会社 | 情報処理システム、情報処理方法、及び記録媒体 |
Also Published As
Publication number | Publication date |
---|---|
US20220415106A1 (en) | 2022-12-29 |
EP4174690A1 (en) | 2023-05-03 |
JPWO2021260930A1 (ja) | 2021-12-30 |
AU2020454606A1 (en) | 2023-02-02 |
EP4174690A4 (en) | 2023-07-26 |
AU2020454606B2 (en) | 2023-11-09 |
JP7537497B2 (ja) | 2024-08-21 |
US12067096B2 (en) | 2024-08-20 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5538701B2 (ja) | 本人認証のための方法、システム、判定装置、端末装置、サーバ装置、プログラムおよび記録媒体 | |
CN110543957A (zh) | 一种智能酒店入住方法以及相应的装置 | |
CN108734830A (zh) | 门禁控制方法及系统 | |
JP2007135149A (ja) | 移動携帯端末 | |
CN103810409A (zh) | 信息处理装置、信息处理方法和计算机程序 | |
Blas et al. | Biometrics and opacity: A conversation | |
JP5603766B2 (ja) | 情報処理システム、情報処理方法及びプログラム | |
JP2006235718A (ja) | 顔認証装置、その顔認証方法、その顔認証装置を組み込んだ電子機器およびその顔認証プログラムを記録した記録媒体 | |
JP2009237801A (ja) | 通信システム及び通信方法 | |
KR101515214B1 (ko) | 얼굴 인식을 통한 신원확인 방법과 얼굴인식을 이용한 출입관리 경보 시스템 및 출입관리 경보 제어방법 | |
WO2021260930A1 (ja) | 情報処理システム、情報処理方法、プログラム | |
JP2007193656A (ja) | 本人認証装置 | |
JP2007006393A (ja) | 情報提示システム | |
JP6664753B1 (ja) | 偽造判定システム、偽造判定方法及び偽造判定プログラム | |
Gururaj et al. | Threats, consequences and issues of various attacks on online social networks | |
JP7385396B2 (ja) | 利用者認証システム | |
JP7169182B2 (ja) | Aiロボットによる入退場管理システム | |
KR20020032048A (ko) | 얼굴 인식 보안방법 | |
JP2008009690A (ja) | 入室管理装置および入退室管理装置ならびに入室管理方法 | |
JP2013120454A (ja) | 情報処理システム、情報処理方法、情報処理装置、情報処理装置の制御方法または制御プログラム | |
CN112395444A (zh) | 一种将新成员注册到面部图像数据库的方法 | |
CN219017029U (zh) | 权限认证系统 | |
Pradeep et al. | Security Enhancement for ATM Machine Using Mobile Application and IoT Technology | |
KR20210030602A (ko) | 스마트 도어 및 이를 포함하는 스마트 홈 시스템 | |
JP7529963B2 (ja) | 携帯端末及び認証システム |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 20941892 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2022532215 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2020454606 Country of ref document: AU |
|
ENP | Entry into the national phase |
Ref document number: 2020941892 Country of ref document: EP Effective date: 20230126 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
ENP | Entry into the national phase |
Ref document number: 2020454606 Country of ref document: AU Date of ref document: 20200626 Kind code of ref document: A |