WO2023199455A1 - Identification system, entrance/exit management system, and POS system - Google Patents


Info

Publication number
WO2023199455A1
Authority
WO
WIPO (PCT)
Prior art keywords
person
server
smart device
user
image
Prior art date
Application number
PCT/JP2022/017759
Other languages
English (en)
Japanese (ja)
Inventor
規之 鈴木
Original Assignee
株式会社アスタリスク
Priority date
Filing date
Publication date
Application filed by 株式会社アスタリスク
Priority to PCT/JP2022/017759
Priority to JP2022522641A (JPWO2023199455A1)
Publication of WO2023199455A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 21/00 Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F 21/30 Authentication, i.e. establishing the identity or authorisation of security principals
    • G06F 21/31 User authentication
    • G06F 21/32 User authentication using biometric data, e.g. fingerprints, iris scans or voiceprints
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G PHYSICS
    • G07 CHECKING-DEVICES
    • G07C TIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
    • G07C 9/00 Individual registration on entry or exit
    • G07C 9/30 Individual registration on entry or exit not involving the use of a pass
    • G07C 9/32 Individual registration on entry or exit not involving the use of a pass in combination with an identity check
    • G07C 9/37 Individual registration on entry or exit not involving the use of a pass in combination with an identity check using biometric data, e.g. fingerprints, iris scans or voice recognition
    • G PHYSICS
    • G07 CHECKING-DEVICES
    • G07C TIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
    • G07C 9/00 Individual registration on entry or exit
    • G07C 9/30 Individual registration on entry or exit not involving the use of a pass
    • G07C 9/38 Individual registration on entry or exit not involving the use of a pass with central registration
    • G PHYSICS
    • G07 CHECKING-DEVICES
    • G07G REGISTERING THE RECEIPT OF CASH, VALUABLES, OR TOKENS
    • G07G 1/00 Cash registers
    • G07G 1/12 Cash registers electronically operated

Definitions

  • The present invention relates to an identification system, an entrance/exit management system, and a POS system.
  • In a known authentication system, a reading unit receives an identifier transmitted from a wireless tag (a contactless IC card) carried by a user, and an acquisition unit acquires the facial features registered in association with the identifier received from the wireless tag.
  • An imaging unit images the user, and an extraction unit extracts facial features from the image data captured by the imaging unit.
  • A face matching unit checks whether the facial features extracted by the extraction unit match the facial features acquired by the acquisition unit, and an opening/closing control unit opens or closes a gate depending on the matching result of the face matching unit.
  • Problems with the above-mentioned conventional technology can arise not only in gate opening/closing control but also in, for example, payment processing at a store.
  • An object of the present invention is to provide an identification system, an entrance/exit management system, and a POS system that can identify a person without having to carry an IC card or the like.
  • An identification system of the present invention comprises a smart device and a server capable of communicating with the smart device. The smart device includes an imaging unit that captures an image of a person and generates an image of that person, and a transmitting unit that transmits the image of the person to the server. The server includes a receiving unit that receives the image of the person from the smart device, an identification unit that identifies the person based on the received image, and a transmitting unit that transmits information on the identified person to another system.
  • the smart device is characterized in that it includes an input unit that receives a touch input from the person, and captures an image of the person when the touch input occurs.
  • the server is characterized by comprising an authentication unit that authenticates the person specified by the identification unit based on information regarding the person.
  • The smart device includes an input unit that receives touch input by the person and a transmission unit that transmits the input information entered by touch, together with an image of the person, to the server. The server includes a storage unit that stores comparison information in advance in association with the person, a reception unit that receives the input information from the smart device, an extraction unit that extracts the comparison information corresponding to the identified person from the storage unit, and an authentication unit that authenticates the identified person when the extracted comparison information and the received input information match.
  • The smart device includes an acquisition unit that acquires biometric information of the person and a transmission unit that transmits the acquired biometric information to the server together with an image of the person. The server includes a storage unit that stores the person's biometric information in association with the person, a receiving unit that receives the biometric information from the smart device, an extraction unit that extracts the biometric information corresponding to the identified person from the storage unit, and an authentication unit that compares the extracted biometric information with the received biometric information and authenticates the identified person if the degree of matching is high.
  • it is characterized in that it includes a detection unit that detects the presence of the person in a specific area, and the imaging unit captures an image of the person when the presence of the person is detected.
  • The present invention is also characterized by comprising a plurality of the servers and a distribution unit that, each time an image of the person is transmitted from the smart device, selects the server to which the image is distributed.
  • the server is characterized in that it has a storage unit that stores the identification history of the person.
  • the server is a cloud-type server provided on the Internet.
  • the present invention is also characterized in that it includes a computer that can communicate with the server, and the computer transmits information used to identify the person to the server.
  • the entrance/exit management system of the present invention is characterized by managing the entrance/exit of a person specified by the above-mentioned identification system.
  • the POS system of the present invention is characterized in that it performs electronic payment for a person specified by the above-mentioned identification system.
  • a person can be identified without having to carry an IC card or the like.
  • A: Configuration diagram of the identification system according to the first embodiment of the present invention
  • B: Configuration diagram of the smart device included in the identification system
  • C: Configuration diagram of the identification server included in the identification system
  • Diagram showing an overview of the tables provided in the identification server
  • Flow diagram of the identification system's processing
  • Configuration diagram of the identification system according to Modification 3 of the first embodiment
  • Diagram showing an overview of an entrance/exit management system according to the second embodiment of the present invention
  • Configuration diagram of the above entrance/exit management system and identification system
  • Diagram showing an overview of the tables provided in the entrance/exit management server of the above entrance/exit management system
  • the identification system according to the first embodiment of the present invention will be described based on the drawings.
  • An identification system 100 cooperates with a system (hereinafter referred to as the "user system 2") that provides services and conveniences to users (persons).
  • The user system 2 is provided with information on the user (person) identified based on a captured image.
  • The identification system 100 includes a smart device 10 and a server 20 (hereinafter referred to as the "identification server 20") that can communicate with the smart device 10 via a network N.
  • The smart device 10 is a thin plate-shaped information processing terminal typified by a known smartphone or tablet terminal. On its front side it has a touch panel display 13, which functions both as a display unit that displays information and as an input unit that accepts touch input from the user (person), and a camera 14 that functions as an imaging unit that captures an image of the user (person).
  • the touch panel display 13 is connected to the CPU 11 and displays images input from the CPU 11. Furthermore, when a touch input is made by a user (person), the touch panel display 13 inputs the touched position coordinates to the CPU 11.
  • the camera 14 is connected to the CPU 11 and inputs to the CPU 11 an image generated by capturing a user (person) (hereinafter referred to as a "user image").
  • The smart device 10 includes a memory 12 that functions as a storage unit and stores in advance a program to be executed by the CPU 11. By executing the program, the CPU 11 functions as an extraction unit that extracts an image of the user's face (hereinafter referred to as a "facial image") from the user image input from the camera 14. Furthermore, the smart device 10 includes a network module 15 for establishing communication with the identification server 20 via the network N, and the CPU 11 transmits the facial image to the identification server 20 via the network module 15; in this way, the CPU 11 and the network module 15 function as a transmitter. A device ID, a code for uniquely identifying each of multiple smart devices 10, is also defined in the program stored in the memory 12.
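The extraction-and-transmission step above can be sketched in Python as follows. This is an illustrative sketch only, not part of the disclosure; the JSON field names (`device_id`, `face_image`) and the use of base64-over-JSON as the wire format are assumptions.

```python
import base64
import json

def build_transmission_payload(face_image: bytes, device_id: str) -> str:
    """Pack the extracted facial image and the device ID into one JSON
    payload, as in the transmission process (s14). Field names are
    illustrative assumptions."""
    return json.dumps({
        "device_id": device_id,
        "face_image": base64.b64encode(face_image).decode("ascii"),
    })

def parse_transmission_payload(payload: str) -> tuple[str, bytes]:
    """Server-side counterpart, corresponding to the reception process (s21)."""
    data = json.loads(payload)
    return data["device_id"], base64.b64decode(data["face_image"])
```

Any serialization that keeps the facial image bytes and the device ID together would serve equally well; base64 is used here only because JSON cannot carry raw bytes.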
  • the specific server 20 includes a network module 23 to establish communication with the smart device 10 via the network N.
  • the network module 23 is connected to the CPU 21 of the specific server 20.
  • The identification server 20 also includes a memory 22 that functions as a storage unit, in which, as shown in FIG., a user table is provided.
  • The user table includes a user ID field, in which identification information uniquely assigned to each user (hereinafter referred to as a "user ID") is registered as user information, and a name field, in which the person's name is registered.
  • An identification program is stored in the memory 22 of the identification server 20.
  • The identification program defines the process for identifying a user (person). When the CPU 21 executes it, the identification server 20 functions as a receiving unit that receives a facial image from the smart device 10, an identifying unit that identifies the user (person) based on the received facial image, and a transmitting unit that transmits information on the identified user (person) to the user system 2 and the smart device 10.
  • Identification of the user based on the facial image can typically be performed by machine learning. Specifically, the identification server 20 receives, from a computer 3 (a smartphone, tablet terminal, or personal computer) connected to the network N, a user ID and a facial image of that user, performs machine learning in advance using these as training data, and generates a trained model. The identification server 20 then identifies the user ID by inputting the facial image received from the smart device 10 into the trained model.
  • The method of identifying a user based on a facial image is not limited to the above-mentioned machine learning; the user may instead be identified based on facial feature amounts. In that case, the facial image received from the computer 3 connected to the network N is analyzed in advance, feature amounts such as the relative positions and sizes of the eyes, nose, mouth, ears, and contours included in the facial image are calculated, and these are stored in association with the user's user ID. The identification server 20 then calculates the feature amounts from the facial image received from the smart device 10 and identifies the user ID whose stored feature amounts show a high degree of matching. It is not necessary to search all registered data; the search can be sped up by indexing feature points or by using an algorithm such as a nearest-neighbor search.
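A minimal Python sketch of the feature-amount matching just described. The registered vectors, the Euclidean distance metric, and the acceptance threshold are all illustrative assumptions; a linear scan stands in for the indexed or approximate nearest-neighbor search mentioned in the text.

```python
import math

# Hypothetical registered feature vectors (relative positions/sizes of
# eyes, nose, mouth, etc.), keyed by user ID. Values are made up.
REGISTERED = {
    "U001": [0.31, 0.48, 0.22],
    "U002": [0.64, 0.12, 0.85],
}

def identify(features, threshold=0.2):
    """Return the user ID whose stored feature vector lies closest to the
    query vector, or None when no registration is close enough (i.e. the
    degree of matching is low)."""
    best_id, best_dist = None, float("inf")
    for user_id, stored in REGISTERED.items():
        dist = math.dist(features, stored)  # Euclidean distance
        if dist < best_dist:
            best_id, best_dist = user_id, dist
    return best_id if best_dist <= threshold else None
```

In a real deployment the linear scan would be replaced by a spatial index (k-d tree, approximate nearest-neighbor library, etc.), which is exactly the speed-up the passage alludes to.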
  • the specific system 100 executes processing according to the flow shown in FIG. 3.
  • the CPU 11 of the smart device 10 executes display processing (s10).
  • the display process (s10) is a process of displaying an image that prompts the user to perform a touch operation on the touch panel display.
  • the CPU of the smart device 10 executes the confirmation process (s11) after executing the above display process (s10).
  • the confirmation process (s11) is a process of confirming the user's touch operation on the touch panel display. In the confirmation process, if the touch operation cannot be confirmed (No), the CPU 11 repeatedly executes the confirmation process (s11). On the other hand, if the CPU 11 confirms the touch operation (Yes), it executes the imaging process (s12).
  • the imaging process (s12) is a process of imaging the user to generate a user image, and the CPU 11 inputs an imaging start command to the camera 14.
  • the camera 14 images the user, generates a user image including the user's face, and inputs the user image to the CPU 11 .
  • a guide may be displayed on the touch panel display 13.
  • the guide is, for example, a countdown display of the imaging timing, a real-time image of the user, and an alignment image for prompting the user to align the user's face in the imaging area of the camera 14.
  • the CPU 11 executes an extraction process (s13).
  • The extraction process (s13) is a process of extracting the image of the user's face included in the user image. The CPU 11 detects the face part in the user image from, for example, facial features (eyes, ears, nose, mouth, and/or outline) and generates a facial image by extracting that part. Note that if the face part cannot be detected, it is preferable to execute the imaging process again.
  • Next, the CPU 11 executes a transmission process (s14) of transmitting the generated facial image together with the device ID to the identification server 20.
  • The CPU 21 of the identification server 20 receives the facial image and device ID from the smart device 10 (s21) and executes the identification process (s22).
  • the identification process (s22) is a process of identifying the user based on the received facial image. For example, the user is identified by inputting the received facial image into a trained model.
  • The identification server 20 then executes a transmission process (s23) to transmit the result of the identification process (s22) to the smart device 10 and the user system 2.
  • In the transmission process (s23), if a single user could not be identified in the identification process (s22) (that is, if multiple users were identified, or the identification accuracy was low), an error is sent to the smart device 10.
  • When the smart device 10 receives an error (s15: Yes), it executes the imaging process (s12) and the extraction process (s13) again.
  • If a single user was identified, the identification server 20 transmits that user's information (for example, user ID and name) to the smart device 10 and the user system 2 (s23).
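The error-versus-success decision in the transmission process (s23) can be sketched as a small Python function. The candidate tuple shape `(user_id, name, confidence)`, the response dictionary keys, and the confidence cutoff are assumptions for illustration.

```python
def build_identification_response(candidates, min_confidence=0.9):
    """Sketch of the transmission process (s23): when exactly one user is
    identified with sufficient confidence, return that user's info;
    otherwise return an error so the device repeats imaging/extraction."""
    confident = [c for c in candidates if c[2] >= min_confidence]
    if len(confident) != 1:
        # multiple users identified, or identification accuracy too low
        return {"status": "error"}
    user_id, name, _ = confident[0]
    return {"status": "ok", "user_id": user_id, "name": name}
```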
  • the user can be identified simply by performing a touch operation on the touch panel display 13 of the smart device 10.
  • the aspect of the specific system 100 is not limited to the first embodiment described above, and may be modified as described below.
  • the identification system 100 may perform authentication processing to improve the identification accuracy of the identified user.
  • the authentication process can be performed in the following manner.
  • This modification 1-A is an aspect in which the identification accuracy of the specified user is improved by taking the user's password into account.
  • the smart device 10 executes an input reception process before executing the imaging process (s12).
  • The input reception process is a process of accepting the input of a password (input information) by the user: a software keyboard is displayed on the touch panel display 13, and the character string entered on it is transmitted to the identification server 20 as the password, together with the facial image.
  • The identification server 20 includes an authentication table in its memory 22, as shown in FIG. 2(b).
  • The authentication table has a user field and a password field. Information for identifying a user, such as a user ID, is registered in the user field, and a password (comparison information) for authenticating the user is registered in the password field.
  • The user ID and password are registered in advance by the user or the like (the user, or the administrator or provider of the user system 2) using the computer 3 connected to the network N. The CPU 21 of the identification server 20 then executes the authentication process after executing the identification process (s22).
  • In the authentication process, a password (comparison information) is extracted from the authentication table using the user ID identified in the identification process (s22) as a search key. If two or more users (user IDs) were identified by the identification process (s22), the passwords (comparison information) corresponding to all the identified user IDs are extracted.
  • The CPU 21 of the identification server 20 then compares the extracted password(s) (comparison information) with the received password (input information) and identifies the user (user ID) whose registered password matches. After that, the above-mentioned transmission process (s23) is executed.
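The authentication process of Modification 1-A can be sketched as follows, assuming an in-memory stand-in for the authentication table; the table contents and function signature are illustrative. `hmac.compare_digest` is used so the string comparison is timing-safe, a detail the disclosure does not specify.

```python
import hmac

# Hypothetical authentication table: user ID -> registered password.
AUTH_TABLE = {"U001": "s3cret", "U002": "pa55word"}

def authenticate(candidate_ids, received_password):
    """Extract the stored password for each candidate user ID (all IDs
    identified in s22) and keep only the user whose registered password
    matches the received input information."""
    matched = [
        uid for uid in candidate_ids
        if uid in AUTH_TABLE
        and hmac.compare_digest(AUTH_TABLE[uid], received_password)
    ]
    # exactly one surviving candidate means successful authentication
    return matched[0] if len(matched) == 1 else None
```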
  • This modification 1-B is an aspect in which the identification accuracy of the identified user is improved by taking into account the user's biometric information.
  • the biometric information includes images of the user's fingerprints, palm prints, veins, irises, auricles, etc. (hereinafter referred to as "biometric images"), and voice print information of the user.
  • the smart device 10 executes the biological information acquisition process after executing the imaging process (s12) of the first embodiment (hereinafter referred to as "first imaging process (s12)").
  • the biological information acquisition process is a process of acquiring biological images and/or audio information.
  • Specifically, the smart device 10 can communicate with a scanner that acquires biometric images such as fingerprints, palm prints, and veins, and in the biometric information acquisition process the CPU 11 of the smart device 10 inputs an acquisition command to the scanner.
  • the scanner acquires a biometric image of the user, such as a fingerprint, a palm print, or a vein, when an acquisition command is input, and inputs the acquired biometric image to the smart device 10.
  • the smart device 10 can acquire biometric images such as fingerprints, palm prints, and veins.
  • the smart device 10 may be able to communicate with a camera that acquires biological images of the iris, the auricle, and the like.
  • the CPU 11 of the smart device 10 inputs an acquisition command to the camera.
  • the camera acquires a biological image of the user's iris, auricle, etc. when the acquisition command is input, and inputs the acquired biological image to the smart device 10.
  • the smart device 10 can acquire biological images of the iris, the auricle, and the like.
  • the smart device 10 may include a microphone module that acquires a voiceprint.
  • the microphone module includes a microphone, an amplifier circuit that amplifies the signal output from the microphone, and an analog-to-digital conversion circuit that converts the signal output from the amplifier circuit into a digital signal.
  • the generated signal is input to the CPU 11 of the smart device 10.
  • the CPU 11 of the smart device 10 generates audio information based on the signal input from the microphone module. Through this process, the smart device 10 can acquire audio information such as a voiceprint.
  • After executing the biometric information acquisition process, the CPU 11 of the smart device 10 transmits the biometric information (biometric image and/or voice information) together with the facial image to the identification server 20 in the transmission process (s14).
  • The identification server 20 acquires the biometric information of each new user in advance, performs machine learning using the biometric information as training data, and generates a trained model. The CPU 21 of the identification server 20 then executes the authentication process after executing the identification process (s22). In the authentication process, the biometric information received from the smart device 10 is input into the trained model, and the output user ID is compared with the user ID obtained in the identification process (s22). When the user IDs match, that user ID is output as the identification result.
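The claim-level wording authenticates when the degree of matching between the stored and received biometric information is high. A minimal threshold-check sketch, with a toy similarity measure and assumed feature-vector shapes (the actual matching in the modification uses a trained model):

```python
def matching_degree(stored, received):
    """Toy similarity between two biometric feature vectors: 1.0 for an
    exact match, decreasing with the mean absolute difference."""
    diffs = [abs(a - b) for a, b in zip(stored, received)]
    return max(0.0, 1.0 - sum(diffs) / len(diffs))

def authenticate_biometric(identified_id, stored_db, received, threshold=0.9):
    """Extract the biometric information registered for the identified
    user and authenticate when the matching degree is high."""
    stored = stored_db.get(identified_id)
    if stored is None:
        return False
    return matching_degree(stored, received) >= threshold
```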
  • This modification 1-C is an aspect in which the identification accuracy of the identified user is improved by adding a marker.
  • the smart device 10 images the user's face and the marker attached to the user in the first imaging process (s12).
  • the marker attached to the user is a mark, a character string, a combination of colors, or the shape of an article displayed on an article worn by the user.
  • The CPU of the smart device 10 generates a facial image in the extraction process (s13), also extracts the marker included in the user image to generate a marker image, and transmits the facial image and the marker image to the identification server 20.
  • The identification server 20 includes an authentication table in its memory 22. The authentication table has a user field and a marker field; the feature points of the marker are registered in the marker field.
  • The user ID and the feature points of the marker are registered in advance by the user or the like (the user, or the administrator or provider of the user system 2) using the computer 3 connected to the network N. The CPU of the identification server 20 then executes the authentication process after executing the identification process (s22).
  • In the authentication process, the feature points of the marker are extracted from the authentication table using the identified user ID as a search key. The CPU 21 of the identification server 20 then compares the extracted feature points with the feature points of the received marker image and, if they match, outputs the user ID as the identification result.
  • According to Modifications 1-A to 1-C, even in cases where it is difficult to identify the user from the facial image alone, for example with identical twins or when the face is injured, the result of the identification process (s22) can be confirmed by the authentication process, and highly accurate user information can be provided to the user system 2. Furthermore, even if multiple users are identified as a result of the identification process (s22), a single user can be determined through the authentication process.
  • The software keyboard may be displayed with a different key arrangement each time it is displayed. According to this aspect, the password cannot be learned by a third party who watches the user's input operation.
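The randomized key arrangement can be sketched in a few lines of Python; the digit key set is an assumption (the disclosure does not limit the keyboard to digits).

```python
import random

def randomized_key_layout(keys="0123456789"):
    """Return the keys in a fresh random order each time the software
    keyboard is displayed, so an onlooker cannot infer the password from
    the touch positions alone."""
    layout = list(keys)
    random.shuffle(layout)
    return layout
```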
  • In Modification 1-A, the authentication process is executed based on a password (input information) specified by the user, but the input information is not limited to a password; it may be any other secret character string, such as a passphrase. That is, the authentication process can be performed using a character string specified by the individual user.
  • In Modification 1-A, the authentication process is executed for the purpose of improving identification accuracy, but the same kind of process may be executed for the purpose of improving identification speed. That is, the identification server 20 may first narrow down candidate users from the user table based on the character string received from the smart device 10, and then identify the user by comparing the facial feature amounts of the selected users with the facial feature amounts obtained from the facial image received from the smart device 10.
  • In Modification 1-B, the smart device 10 may output a greeting voice such as "Good morning" and then acquire the user's voiceprint. According to this aspect, since the user speaks in response to the greeting, the user's voice can be acquired naturally.
  • The authentication processes according to Modifications 1-A to 1-C above may be performed based on a request from the user system 2. Specifically, when the cooperation between the identification system 100 of this embodiment and the user system 2 is initialized, the user system 2 sends an authentication request to the identification server 20, and the identification server 20, based on that request, requests a password, biometric information, or marker information from the smart device 10. On receiving the request, the smart device 10 acquires the password, biometric information, or marker in the manner of Modifications 1-A to 1-C above and transmits it to the identification server 20. The authentication process is then executed based on that information, and the information of the identified user is transmitted to the user system 2.
  • the authentication processes of Modifications 1-A to 1-C may be selected and executed for each user.
  • an authentication field is provided in the user table, and the type of authentication process (hereinafter referred to as "authentication type") is registered in the authentication field.
  • The CPU of the identification server 20 looks up, in the user table, the authentication type corresponding to the user output by the identification process, executes the authentication process according to that type, and finally identifies the user. According to this modification, an authentication process appropriate for each user can be performed.
  • users can be grouped by authentication type, and authentication processing suitable for the group can be executed.
  • the identification system 100 may record the user's identification history.
  • the specific server 20 includes a history table in its memory.
  • the history table includes a user field, a date/time field, and a device field.
  • In the user field, information related to the user, such as the user ID of the user identified by the identification process (s22), is registered.
  • In the date/time field, the date and time when the identification process (s22) was performed is registered.
  • In the device field, the device ID of the smart device 10 that was the source of the facial image used in the identification process (s22) is registered.
  • The identification server 20 may also record the facial image used in the identification process (s22). Specifically, the identification server 20 assigns an identification number (hereinafter referred to as a "facial image identification number") to the received facial image, stores the facial image in the memory 22, and provides a facial image field (FIG. 2(c)) in the above-mentioned history table in which the facial image identification number is registered.
  • Alternatively, facial images may be stored as a group: an identification number (hereinafter referred to as a "facial image group identification number") is assigned to the facial image group, the group is stored in the memory 22, and the facial image group identification number is registered in the facial image field of the history table. Note that the facial image or facial image group stored in the memory 22 may be compressed, and may be deleted or thinned out after a predetermined period has passed since being saved.
  • The identification server 20 may also store the received facial image in the memory 22 every time it executes the reception process (s21). According to this aspect, a history of use by persons other than registered users can be saved, which serves as a crime-prevention measure.
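The history table of Modification 2 can be sketched with Python's built-in `sqlite3`; the column names (`user_id`, `occurred_at`, `device_id`, `face_image_no`) mirror the user, date/time, device, and facial image fields but are naming assumptions.

```python
import sqlite3

def open_history_db(path=":memory:"):
    """Create (if needed) and open the history table."""
    conn = sqlite3.connect(path)
    conn.execute(
        "CREATE TABLE IF NOT EXISTS history ("
        " user_id TEXT, occurred_at TEXT, device_id TEXT, face_image_no TEXT)"
    )
    return conn

def record_identification(conn, user_id, device_id, face_image_no=None):
    """Insert one row of identification history: who was identified, when,
    from which smart device, and (optionally) which stored facial image."""
    conn.execute(
        "INSERT INTO history (user_id, occurred_at, device_id, face_image_no)"
        " VALUES (?, datetime('now'), ?, ?)",
        (user_id, device_id, face_image_no),
    )
    conn.commit()
```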
  • The identification system 100 may also include a plurality of identification servers 20a, 20b, 20c, a plurality of smart devices 10a, 10b, 10c, and a load balancer 30 that functions as a distribution unit, distributing each facial image (and any biometric information) transmitted from the smart devices 10a, 10b, 10c to whichever of the identification servers 20a, 20b, 20c is lightly loaded. According to this modification, when a plurality of smart devices 10a, 10b, 10c are in use, the processing load that would otherwise fall on a single identification server 20 can be distributed, preventing system failures.
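The distribution unit's "lightest load" selection reduces to one line of Python; representing load as a count of in-flight requests per server is an assumption (the disclosure does not define the load metric).

```python
def pick_server(server_loads):
    """Sketch of the load balancer 30: each time a facial image arrives,
    forward it to the identification server currently carrying the
    lightest load. `server_loads` maps server name to in-flight requests."""
    return min(server_loads, key=server_loads.get)
```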
  • the smart device 10 may be connectable to peripheral devices through wired or wireless communication.
  • the smart device 10 that is communicably connected to peripheral devices will be described below.
  • the smart device 10 may be connected to an infrared camera 4.
  • In this modification, the CPU 11 of the smart device 10 inputs an imaging command to both the camera 14 of the smart device 10 (hereinafter referred to as the "visible light camera 14") and the infrared camera 4, which is a peripheral device.
  • The CPU 11 of the smart device 10 then compares the user images input from the visible light camera 14 and the infrared camera 4, selects the user image that shows the user more clearly, and executes the extraction process (s13) on it.
  • a face image is generated based on a clearer user image, and identification processing is executed based on the clearer face image, so that identification accuracy can be improved.
  • the smart device 10 may be connected to the human sensor 5.
  • the human sensor 5 functions as a detection unit that detects the presence of a person in front of the smart device 10, and inputs a detection signal to the smart device 10 when a person is detected.
  • In this modification, the smart device 10 executes the imaging process when a detection signal is input from the human sensor 5, instead of performing the display process (s10) and confirmation process (s11) of the first embodiment. According to this modification, the imaging process (s12) is executed even in situations where the user's hands are occupied and touch operations on the smart device 10 are not possible, so the user can still be identified.
  • the smart device 10 may control a home appliance or an electric door lock via Bluetooth (registered trademark) communication.
  • the specific server 20 may be able to manage the smart device 10. Specifically, as shown in FIG. 2(D), the specific server 20 includes a device management table in which the status of the smart device 10 is registered.
  • the device management table has a device ID field in which the device ID is registered, an installation location field in which the installation location of the smart device 10 is registered, an address field in which the IP address of the smart device 10 is registered, and a GPS field in which the GPS value of the smart device 10 is registered.
  • the values of the device ID field, installation location field, and address field are registered during initial settings when the smart device 10 is installed.
  • the value of the GPS field is the GPS value that the smart device 10 transmits together with the facial image when it transmits the facial image to the specific server 20.
  • the specific server 20 may periodically request the smart device 10 to transmit the GPS value, and the smart device 10 may transmit the GPS value to the specific server 20 in response to the request.
  • the CPU 21 of the specific server 20 periodically checks the values in the device management table. If the GPS value of a smart device 10 has changed, the CPU 21 sends an alert to the administrator's computer 4, which can communicate via the network N, and also reports the updated GPS value to the administrator's computer 4.
  • the CPU 21 of the specific server 20 may disconnect the communication connection from the smart device 10 whose GPS value has changed.
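The device management check can be sketched as follows. The table contents, alert mechanism, and return values are illustrative assumptions; only the overall behavior (detect a changed GPS value, alert, optionally disconnect) comes from the description above.

```python
# Sketch of the specific server 20 checking the device management table:
# a changed GPS value triggers an administrator alert and, optionally,
# disconnection of the moved smart device.

device_table = {
    "d0001": {"location": "entrance", "address": "192.0.2.10",
              "gps": (34.70, 135.49)},
}

alerts = []

def check_device(device_id, reported_gps):
    entry = device_table[device_id]
    if reported_gps != entry["gps"]:
        alerts.append(f"alert: {device_id} moved to {reported_gps}")
        entry["gps"] = reported_gps   # update the registered GPS value
        return "disconnect"           # the server may drop the connection
    return "ok"

status1 = check_device("d0001", (34.70, 135.49))  # GPS value unchanged
status2 = check_device("d0001", (35.68, 139.77))  # device was moved
```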
  • the communication network N between the smart device 10 and the specific server 20 may be a local area network or an Internet communication line.
  • the smart device 10 and the specific server 20 communicate via the Internet communication line, the smart device 10 can use a network via a mobile phone network.
  • the specific server 20 may be a server provided as a cloud over an Internet communication line.
  • the smart device 10 and the specific server 20 may be able to communicate through a wired connection.
  • the extraction unit is not an essential component of the identification system of the first embodiment; the smart device 10 may send the whole user image to the specific server 20, and the specific server 20 may identify the user (person) from that image.
  • the identification system 100 of the first embodiment and the modified example described above can cooperate with various user systems 2.
  • Typical examples of the user system 2 include a building entry/exit system and a credit card payment system.
  • a mode of cooperation with an entrance/exit management system will be explained as a second embodiment.
  • a mode of cooperation with a POS system will be explained as a third embodiment.
  • the entrance/exit management system 200 includes an automatic door 210 installed at the entrance of a building, a manual door 220 installed at a tenant in the building, and an entrance/exit management server 230 capable of communicating with the automatic door 210 and the manual door 220.
  • the automatic door 210 includes a control device 212 that controls opening and closing of the pair of doors 211.
  • the control device 212 includes a network module for connecting to the network N, and can communicate with the entrance/exit management server 230 via the network module.
  • the manual door 220 includes a door 221 that is manually opened and closed, an electric lock that locks and unlocks the door 221, and a control device 222 that controls the electric lock.
  • the control device 222 includes a network module for connecting to the network N, and can communicate with the entrance/exit management server 230 via the network module.
  • the entrance/exit management server 230 is a server that manages entrance/exit of users, and as shown in FIG. 7, its memory is provided with a user management table and a door table.
  • the user management table is a table for managing users of the building, and has a user ID field in which the user ID is registered, a name field in which the user's name is registered, and a tenant field in which the tenant ID of the tenant to which the user belongs is registered.
  • the door management table is a table that manages the settings of the doors 211 and 221 in the building, and has a tenant field in which the tenant ID of the tenant where the door is installed is registered, an address field in which the IP address of the door's control device is registered, and a device field in which the device ID of the smart device 10 installed at the door is registered.
  • t0006 is installed. Note that smart devices 10 are similarly installed inside and outside the manual doors 220 of the other tenants in the building, and of the rooms on other floors.
  • the specific server 20 is capable of communicating with the first smart device 10a to the sixth smart device 10f via the network N. As in the first embodiment, the CPUs 11 of the smart devices 10a to 10f execute the display process (s10), confirmation process (s11), imaging process (s12), and extraction process (s13), and transmit the user's face image and device ID to the specific server 20. Also as in the first embodiment, the specific server 20 executes the identification process (s22) and transmission process (s23), and transmits the identified user's user ID and the device ID to the entrance/exit management server 230.
  • the CPU of the entrance/exit management server 230 extracts the tenant ID corresponding to the received user ID from the user management table. It also extracts the tenant ID corresponding to the received device ID from the door management table. The CPU of the entrance/exit management server 230 then compares the two extracted tenant IDs. If they match, the IP address corresponding to the tenant ID is extracted from the door management table, and an unlock command is sent to that IP address. Note that if the tenant ID extracted from the door management table indicates the automatic door 210 at the entrance, the IP address corresponding to the tenant ID is extracted without performing the above comparison, and an open command is sent to that IP address.
  • when the control device 212 of the automatic door 210 receives the open command, it opens the pair of doors 211. When the control device 222 of the manual door 220 receives an open command, it inputs an unlock signal to the electric lock, which unlocks the manual door 220.
  • if the tenant IDs do not match, an error is sent to the smart device 10.
  • when the smart device 10 receives the error, it starts a call to a security guard, allowing the user to talk to the security guard via the smart device 10. Alternatively, when the smart device 10 receives the error, it may start acquiring biometric information, making it possible to identify the user based on that biometric information.
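The verification performed by the entrance/exit management server 230 can be sketched as below. The table contents and IDs are illustrative assumptions; the logic (match the user's tenant against the door's tenant, unlock on a match, error otherwise) follows the description above.

```python
# Sketch of the entrance/exit management server's tenant-ID verification.

user_table = {"u0001": {"name": "Taro Shiga", "tenant": "t0006"}}
door_table = {"d0003": {"tenant": "t0006", "address": "192.0.2.21"}}

def handle_identification(user_id, device_id):
    user_tenant = user_table[user_id]["tenant"]
    door = door_table[device_id]
    if user_tenant == door["tenant"]:
        # send an unlock command to the door's control device
        return ("unlock", door["address"])
    # otherwise an error goes back to the smart device
    return ("error", device_id)

result = handle_identification("u0001", "d0003")
```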
  • the user can open the automatic door 210 and unlock the manual door 220 by simply performing a touch operation on the touch panel display 13 of the smart device 10.
  • the identification server 20 may transmit the identified user ID to the smart device 10 after executing the identification process (s22).
  • upon receiving the user ID, the smart device 10 extracts the name of the user corresponding to the user ID from the user table, and outputs the identification result, for example as a voice from the speaker saying "Good morning, Taro Shiga," or as a display on the touch panel display 13. This makes it possible to notify the user of the identification result.
  • when a smart device 10 is installed at each door, the smart device 10 may read a symbol in which the tenant ID, IP address, and device ID are encoded.
  • the smart device 10 reads the symbol with a camera and sets this information as its own terminal properties.
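The initial setup via a symbol can be sketched as follows. The patent does not specify the symbol's payload format; the ";"-separated key=value encoding below is an assumption purely for illustration.

```python
# Sketch: the smart device decodes a scanned symbol payload containing the
# tenant ID, IP address, and device ID, and stores them as its own
# terminal properties.

def decode_symbol(payload):
    # payload as it might come from a symbol (e.g. QR code) reader
    return dict(field.split("=", 1) for field in payload.split(";"))

class SmartDevice:
    def __init__(self):
        self.properties = {}

    def apply_symbol(self, payload):
        self.properties.update(decode_symbol(payload))

device = SmartDevice()
device.apply_symbol("tenant=t0006;address=192.0.2.21;device=d0003")
```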
  • the entrance/exit management server 230 can also communicate with the computer 3 of the security company, and may send an open command to all doors based on the emergency information received from the computer 3 of the security company. This will allow police officers and security staff to enter and exit the building by opening the doors in the event of an emergency.
  • the smart device 10 may display a button for transitioning to emergency mode.
  • when this button is pressed, a screen is displayed requesting the input of a character string such as a password, or of a QR code.
  • when a police officer or security staff member enters the password or presents the QR code, the smart device 10 sends an emergency release command to the specific server 20.
  • the specific server 20 forwards the received emergency release command to the entrance/exit management server 230.
  • the entrance/exit management server 230 transmits an open command to each door based on the emergency release command. This allows police officers and security staff to open the doors and enter the building in an emergency.
  • the smart device 10 may display a telephone button on the touch panel display 13 and place a call to the police or a security company when the telephone button is pressed.
  • the call may be a video call rather than only a voice call.
  • the smart device 10 is installed inside and outside the door, but the smart device 10 may be installed only on the outside of the door.
  • a POS system 300 is a system that aggregates and analyzes product sales at a retail store for each product type, and includes a reading device 310 that reads product identification information from a JAN code attached to a product, a cash register device 320 provided corresponding to the reading device 310, and a POS server 330 that can communicate with the cash register device 320.
  • the cash register device 320 generates accounting information based on the product identification information read by the reading device 310; after payment, the accounting information is sent to the POS server 330, where the product sales are totaled.
  • the above POS system can communicate with the specific system 100 of the present invention via the network N, and enables credit card payment based on the user information received from the specific system 100.
  • the POS server 330 stores information used for credit card payments in association with user IDs.
  • POS server 330 includes a credit table in its memory.
  • the credit table has a user ID field in which the user ID is registered, a name field in which the user's name is registered, a card number field in which the credit card number is registered, and an expiry date field in which the credit card's expiration date is registered. These fields are registered when the user applies for the service.
  • the smart device 10 of the specific system 100 is installed at the cash register counter where the reading device 310 and the cash register device 320 are arranged, and is connected to the cash register device 320 by wire or wirelessly.
  • the smart device 10 executes the above-mentioned imaging process (s12), extraction process (s13), input reception process (modification 1-A), and transmission process (s15).
  • the identification server 20 performs the identification process (s22) on the face image received from the smart device 10, performs an authentication process based on the password received from the smart device 10 to authenticate the user ID output by the identification process, and transmits the user ID to the smart device 10.
  • upon receiving the user ID, the smart device 10 transmits the user ID to the cash register device 320.
  • the specific server 20 may directly transmit the user ID to the cash register device 320 without going through the smart device 10.
  • the cash register device 320 requests the POS server 330 to send credit information corresponding to the received user ID, and acquires the credit information from the POS server 330.
  • the cash register device 320 then applies to the credit company for credit card payment based on the acquired credit information.
  • credit card payments can be made without presenting a credit card.
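The card-less payment flow can be sketched as below. The credit table contents, function names, and the returned strings are illustrative assumptions; in the real system the final step would be a request to the credit company.

```python
# Sketch: after identification, the cash register device asks the POS
# server for the credit information registered against the user ID and
# applies for credit card payment with it.

credit_table = {
    "u0001": {"name": "Taro Shiga",
              "card_number": "4111-XXXX-XXXX-1111",
              "expiry": "2026-04"},
}

def pos_server_lookup(user_id):
    # the POS server 330 returns the credit table entry, if any
    return credit_table.get(user_id)

def cash_register_checkout(user_id, amount):
    credit = pos_server_lookup(user_id)
    if credit is None:
        return "no credit information registered"
    # placeholder for the application to the credit company
    return f"charged {amount} yen to card {credit['card_number']}"

receipt = cash_register_checkout("u0001", 1980)
```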
  • the member management system includes a member's smartphone on which a member application is installed, and a management server that can communicate with the smartphone via network N.
  • when the CPU of the smartphone starts the member application, it displays the member information on the touch panel display and makes the member information changeable.
  • when the member inputs changed member information, the changed member information is sent to the management server, and the member information registered in the management server is updated.
  • the member's face image can also be changed.
  • the management server includes the changed face image in the member information and registers it, and then sends the changed face image to the specific server 20.
  • the specific server 20 executes the above registration process based on the received facial image. This allows members (users) to carry out maintenance on their own even if facial features change with age.
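The member-side maintenance flow can be sketched as follows; class and method names are illustrative assumptions, and the forwarding step stands in for the registration process described above.

```python
# Sketch: a changed face image submitted through the member application is
# stored by the management server and forwarded to the identification
# server for re-registration.

class IdentificationServer:
    def __init__(self):
        self.registered_faces = {}

    def register(self, member_id, face_image):
        # corresponds to the registration process on the specific server 20
        self.registered_faces[member_id] = face_image

class ManagementServer:
    def __init__(self, identification_server):
        self.members = {}
        self.identification_server = identification_server

    def update_member(self, member_id, info):
        self.members[member_id] = info
        if "face_image" in info:
            # forward the changed face image for re-registration
            self.identification_server.register(member_id, info["face_image"])

id_server = IdentificationServer()
mgmt = ManagementServer(id_server)
mgmt.update_member("m0001", {"name": "Taro Shiga", "face_image": "face_v2.jpg"})
```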

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Security & Cryptography (AREA)
  • Human Computer Interaction (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Computer Hardware Design (AREA)
  • Software Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Collating Specific Patterns (AREA)

Abstract

The problem addressed by the present invention is to identify a person without requiring the person to carry a smart card or the like. The solution according to the invention is an identification system (100) comprising a smart device (10) and a server (20) capable of communicating with the smart device (10), the smart device (10) being provided with an imaging unit for imaging a person and generating an image of the person, and a transmission unit for transmitting the image of the person to the server. The server (20) is provided with a reception unit for receiving the image of the person from the smart device, and an identification unit for identifying the person on the basis of the received image of the person. The server (20) is also provided with a transmission unit for transmitting the identified person to another system.
PCT/JP2022/017759 2022-04-13 2022-04-13 Système d'identification, système de gestion d'entrée/sortie et système pos WO2023199455A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/JP2022/017759 WO2023199455A1 (fr) 2022-04-13 2022-04-13 Système d'identification, système de gestion d'entrée/sortie et système pos
JP2022522641A JPWO2023199455A1 (fr) 2022-04-13 2022-04-13

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2022/017759 WO2023199455A1 (fr) 2022-04-13 2022-04-13 Système d'identification, système de gestion d'entrée/sortie et système pos

Publications (1)

Publication Number Publication Date
WO2023199455A1 true WO2023199455A1 (fr) 2023-10-19

Family

ID=88329351

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2022/017759 WO2023199455A1 (fr) 2022-04-13 2022-04-13 Système d'identification, système de gestion d'entrée/sortie et système pos

Country Status (2)

Country Link
JP (1) JPWO2023199455A1 (fr)
WO (1) WO2023199455A1 (fr)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2018101420A (ja) * 2014-12-29 2018-06-28 東芝テック株式会社 情報処理システムおよび情報処理プログラム
JP2019133347A (ja) * 2018-01-30 2019-08-08 富士通フロンテック株式会社 認証システムおよび認証方法
JP2019144695A (ja) * 2018-02-16 2019-08-29 グローリー株式会社 顔認証システム、顔認証サーバおよび顔認証方法
JP2020021459A (ja) * 2019-04-25 2020-02-06 株式会社メルカリ プログラム、情報処理方法、情報処理装置
WO2020149136A1 (fr) * 2019-01-15 2020-07-23 グローリー株式会社 Système d'authentification, dispositif de gestion, et procédé d'authentification
JP2020126544A (ja) * 2019-02-06 2020-08-20 株式会社メルカリ 情報処理方法、情報処理装置、及びプログラム
JP6951796B1 (ja) * 2020-05-15 2021-10-20 国立大学法人広島大学 画像認識装置、認証システム、画像認識方法及びプログラム


Also Published As

Publication number Publication date
JPWO2023199455A1 (fr) 2023-10-19

Similar Documents

Publication Publication Date Title
US11151819B2 (en) Access control method, access control apparatus, system, and storage medium
US9426432B2 (en) Remote interactive identity verification of lodging guests
JP5471533B2 (ja) 来訪者入退管理システム
JP2018151838A (ja) 入場管理システム
US11496471B2 (en) Mobile enrollment using a known biometric
KR102243963B1 (ko) 안면 인식을 이용한 근태 관리 시스템
JP2011077835A (ja) インターホンシステム
JP2007241501A (ja) 訪問者特定システムおよび訪問者特定方法
US20150295709A1 (en) Biometric validation method and biometric terminal
WO2021053882A1 (fr) Dispositif d'authentification d'utilisateur et support d'enregistrement
US20220351562A1 (en) Reception terminal
KR20160006126A (ko) 휴대용 인증 장치를 사용한 보안 장치
KR20170001416A (ko) 원격 계좌 개설 시스템
KR20090041619A (ko) 출입 통제 시스템
WO2023199455A1 (fr) Système d'identification, système de gestion d'entrée/sortie et système pos
TW201537521A (zh) 自主式訪客系統與其主機
JP2005036523A (ja) 電子ロック制御システム及びその方法並びにそれに用いる携帯情報端末及び認証装置
JP2023156968A (ja) 特定システム、入退場管理システム、及びposシステム
JP2022117025A (ja) 本人確認方法、プログラム、及び情報システム
JP6891355B1 (ja) 認証システム、認証装置、認証方法、及びプログラム
JP2006104801A (ja) 入退室管理システム
CN112489274A (zh) 一种门禁的控制方法以及系统
JP7164675B1 (ja) 入退管理システム
KR102639356B1 (ko) 안면인식을 이용한 신분인증시스템 및 방법
WO2023033057A1 (fr) Système de porte, système de sécurité et unité de capteur

Legal Events

Date Code Title Description
WWE Wipo information: entry into national phase

Ref document number: 2022522641

Country of ref document: JP

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22937434

Country of ref document: EP

Kind code of ref document: A1