WO2023084593A1 - Information processing device, information processing system, information processing method, and non-transitory computer-readable medium - Google Patents

Information processing device, information processing system, information processing method, and non-transitory computer-readable medium

Info

Publication number
WO2023084593A1
WO2023084593A1 PCT/JP2021/041170 JP2021041170W WO2023084593A1 WO 2023084593 A1 WO2023084593 A1 WO 2023084593A1 JP 2021041170 W JP2021041170 W JP 2021041170W WO 2023084593 A1 WO2023084593 A1 WO 2023084593A1
Authority
WO
WIPO (PCT)
Prior art keywords
user
information
information processing
authentication
photographed image
Prior art date
Application number
PCT/JP2021/041170
Other languages
English (en)
Japanese (ja)
Inventor
Tadanobu Nakayama (中山 忠信)
Original Assignee
NEC Corporation (日本電気株式会社)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by NEC Corporation (日本電気株式会社)
Priority to JP2023559225A (JPWO2023084593A5)
Priority to PCT/JP2021/041170 (WO2023084593A1)
Publication of WO2023084593A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q 10/00 Administration; Management
    • G06Q 10/10 Office automation; Time management

Definitions

  • The present disclosure relates to an information processing device, an information processing system, an information processing method, and a non-transitory computer-readable medium, and more particularly to an information processing device, an information processing system, an information processing method, and a non-transitory computer-readable medium that output the location of a user.
  • Patent Literature 1 discloses a position estimation system that estimates the position of a mobile object from the carrying state of a sensor terminal by the mobile object and the position of the sensor terminal.
  • Patent Literature 2 discloses a communication support system that detects the location of each of a plurality of employees based on information about the wireless AP to which each employee's mobile terminal is connected, generates screen data displaying the locations together with the employees' thumbnail images, and provides the data to the terminals of other users.
  • The technique of Patent Literature 1 has the problem that, even if the location of the target person is known, it is difficult to actually find the target person when there are multiple people on the same floor.
  • In view of the above problems, an object of the present disclosure is to provide an information processing device, an information processing system, an information processing method, and a non-transitory computer-readable medium that favorably assist in finding a user in a space.
  • An information processing device includes: authentication control means for controlling biometric authentication based on a photographed image of a first user; position estimation means for estimating, when the biometric authentication is successful, the position of the first user in a target space based on position information of a first user terminal used by the first user; and output control means for outputting output information in which the position of the first user is associated with appearance information including at least one of the photographed image of the first user and appearance data generated based on the photographed image.
  • An information processing system includes: a biometric authentication device that performs biometric authentication based on a photographed image of a first user; and an information processing device. The information processing device includes: authentication control means for acquiring a result of the biometric authentication from the biometric authentication device; position estimation means for estimating, when the biometric authentication is successful, the position of the first user in a target space based on position information of a first user terminal used by the first user; and output control means for outputting output information in which the position of the first user is associated with appearance information including at least one of the photographed image of the first user and appearance data generated based on the photographed image.
  • An information processing method includes: controlling biometric authentication based on a photographed image of a first user; estimating, when the biometric authentication is successful, the position of the first user in a target space based on position information of a first user terminal used by the first user; and outputting output information in which the position of the first user is associated with appearance information including at least one of the photographed image of the first user and appearance data generated based on the photographed image.
  • A non-transitory computer-readable medium stores a program for causing a computer to execute: a procedure for controlling biometric authentication based on a photographed image of a first user; a procedure for estimating, when the biometric authentication is successful, the position of the first user in a target space based on position information of a first user terminal used by the first user; and a procedure for outputting output information in which the position of the first user is associated with appearance information including at least one of the photographed image of the first user and appearance data generated based on the photographed image.
  • According to the present disclosure, it is possible to provide an information processing device, an information processing system, an information processing method, and a non-transitory computer-readable medium that favorably assist in finding a user in a space.
  • FIG. 1 is a block diagram showing the configuration of an information processing apparatus according to the first embodiment;
  • FIG. 2 is a flowchart showing the flow of an information processing method according to the first embodiment;
  • FIG. 3 is a block diagram showing the overall configuration of an information processing system according to the second embodiment;
  • FIG. 4 is a block diagram showing the configuration of a face authentication device according to the second embodiment;
  • FIG. 5 is a flowchart showing the flow of face information registration processing according to the second embodiment;
  • FIG. 6 is a flowchart showing the flow of face authentication processing according to the second embodiment;
  • FIG. 7 is a block diagram showing the configuration of a user terminal according to the second embodiment;
  • FIG. 8 is a block diagram showing the configuration of a server according to the second embodiment;
  • FIG. 9 is a sequence diagram showing the flow of user registration processing according to the second embodiment;
  • FIG. 10 is a sequence diagram showing the flow of position output processing according to the second embodiment;
  • FIG. 11 is a diagram showing an example of a display image of the display device according to the second embodiment;
  • FIG. 12 is a diagram showing an example of a display image of the display device according to the second embodiment;
  • FIG. 13 is a diagram showing an example of a display image of the display device according to the second embodiment;
  • FIG. 14 is a diagram showing an example of a display image of the display device according to the second embodiment;
  • FIG. 15 is a sequence diagram showing the flow of position output processing according to the third embodiment.
  • Patent Literature 1 describes that the carrying state may be detected when personal authentication is successful.
  • In the technique of Patent Literature 1, there is a problem that it is difficult to actually find the target person.
  • The method described in Patent Literature 2 has also been proposed, but even if the location of an employee is known, it takes time to actually find the employee unless the employee's appearance on that day is known. In particular, it becomes more difficult to find employees when their clothes or hairstyles change.
  • FIG. 1 is a block diagram showing the configuration of an information processing apparatus 10 according to the first embodiment.
  • the information processing device 10 is a computer device that assists another user (sometimes called a second user) to find a target user (sometimes called a first user) in a target space.
  • the information processing device 10 is connected to a network (not shown).
  • a network may be wired or wireless.
  • a first user terminal (not shown) used by a first user and a display device (not shown) are connected to the network.
  • the information processing device 10 includes an authentication control unit 13 , a position estimation unit 14 and an output control unit 16 .
  • the authentication control unit 13 is also called authentication control means.
  • the authentication control unit 13 controls biometric authentication based on the captured image of the first user.
  • Biometric authentication is iris authentication, face authentication, hand geometry authentication, or other biometric authentication that authenticates based on a photographed image of the user. Thereby, the authentication control unit 13 identifies the user.
  • the position estimation unit 14 is also called position estimation means.
  • the position estimation unit 14 estimates the position information of the first user terminal within the target space when the biometric authentication is successful.
  • the target space is a space in which a plurality of people can stay and each person can move inside.
  • the target space is a predetermined space. Then, the position estimation unit 14 estimates the position of the first user in the target space based on the position information of the first user terminal.
  • the output control unit 16 is also called output control means.
  • the output control unit 16 outputs output information in which the position of the first user terminal and the appearance information of the first user are associated with each other to a display device or another device such as a second user terminal used by the second user.
  • the appearance information of the first user is information indicating or suggesting the features or wearing conditions of the item worn by the first user.
  • the thing worn may be clothing, hairstyle, accessories, a mask, glasses, or the like.
  • the appearance information of the first user includes at least one of a photographed image generated by photographing the first user and data regarding appearance generated based on the photographed image.
  • Appearance-related data is also called appearance data, and is, for example, data indicating characteristics or wearing conditions of the wearable item.
  • the appearance data is text data or an illustration image indicating the characteristics or wearing conditions.
  • FIG. 2 is a flow chart showing the flow of the information processing method according to the first embodiment.
  • the authentication control unit 13 of the information processing device 10 controls biometric authentication based on the photographed image of the first user (S10). Controlling biometric authentication based on a captured image may mean performing biometric authentication based on the captured image or on feature information extracted from the captured image. It may also mean transmitting the captured image or the extracted feature information to a biometric authentication device (not shown) and obtaining an authentication result from the biometric authentication device.
  • the authentication control unit 13 determines whether biometric authentication has succeeded (S11).
  • Successful biometric authentication may indicate that the degree of matching between the feature information extracted from the captured image of the first user and the pre-registered feature information of the user is equal to or greater than a predetermined value.
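  • As a minimal illustrative sketch of this success criterion, assuming feature information is represented as fixed-length vectors and cosine similarity serves as the degree of matching (both are assumptions; the disclosure does not fix a particular feature representation or metric):

```python
import numpy as np

MATCH_THRESHOLD = 0.8  # the "predetermined value"; an assumed example

def degree_of_matching(a: np.ndarray, b: np.ndarray) -> float:
    """One possible degree of matching: cosine similarity of two feature vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def authentication_succeeds(captured_features: np.ndarray,
                            registered_features: np.ndarray) -> bool:
    """Authentication succeeds when the matching degree is at or above the threshold."""
    return degree_of_matching(captured_features, registered_features) >= MATCH_THRESHOLD
```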
  • If the biometric authentication succeeds (Yes in S11), the position estimation unit 14 estimates the position of the first user terminal within the target space (S12). For example, the position estimation unit 14 acquires the position of the access point (AP) used by the first user terminal or the GPS (Global Positioning System) information of the first user terminal, and may estimate the position of the AP or the GPS information as the position of the first user terminal. Also, the position estimation unit 14 may estimate the position of the first user terminal based on the position of the AP used by the first user terminal and the intensity of radio waves received from the first user terminal.
  • the position estimation unit 14 estimates the position of the first user within the target space based on the position information of the first user terminal within the target space (S13). For example, the position estimation unit 14 may estimate the position information of the first user terminal within the target space as the position of the first user within the target space. Further, for example, the position estimation unit 14 may estimate the position of the first user within a predetermined distance range from the position information of the first user terminal in the target space.
  • the output control unit 16 outputs output information in which the position information of the first user and the appearance information such as the photographed image of the first user are associated with each other to the other device (S14).
  • If the biometric authentication fails (No in S11), the information processing device 10 ends the process. In other words, the output control unit 16 does not output the location information of a user whose biometric authentication has failed to other devices as the location information of the first user.
  • the information processing apparatus 10 uses the location information of the first user who has succeeded in biometric authentication, so it is possible to prevent impersonation of the first user by others.
  • the information processing device 10 also outputs the appearance information such as the captured image of the first user in association with the position information of the first user. Therefore, the second user who has received the output information can grasp the features of the first user's appearance on that day as well as the position of the first user. Accordingly, the information processing device 10 can suitably assist the second user in finding the first user in the actual target space.
  • FIG. 3 is a block diagram showing the overall configuration of an information processing system 1000 according to the second embodiment.
  • the information processing system 1000 is a computer system that assists the second user in finding the first user in the target space TS.
  • the target space TS is the site of the target company as an example.
  • the target company's premises may include one or more floors.
  • the target space TS is not limited to this, and may be a shared office where a plurality of companies gather or a site of a school.
  • biometric authentication is face authentication as an example.
  • the information processing system 1000 includes a face authentication device 100, an information processing device (hereinafter referred to as a server) 200, a plurality of user terminals 300-1, 300-2, and 300-3, a plurality of APs 400-1 and 400-2, and a display device 500.
  • the face authentication device 100, server 200, AP 400 and display device 500 are connected to each other via a network N.
  • the network N is a wired or wireless communication line.
  • the network N is, for example, an intranet, and may be at least one of a LAN (Local Area Network), a WAN (Wide Area Network), and the Internet, or a combination thereof. Note that the number of user terminals and the number of APs are not limited to this.
  • the user terminal 300 is an information terminal used by a user, such as a personal computer, a smart phone, or a tablet terminal.
  • User terminal 300 transmits a user registration request to server 200 .
  • the user terminal 300 also transmits user information to the server 200 and causes the server 200 to register the user information.
  • the user terminal 300 requests face authentication when it is activated or released from a sleep state.
  • the user terminal 300 captures at least the face of the user from the front and transmits the captured image or the facial feature information extracted from the captured image to the face authentication device 100 via the server 200 to request face authentication.
  • the user terminal 300 may directly transmit the captured image or facial feature information to the face authentication device 100 .
  • When the user terminal 300 is located within the target space TS, it connects to one of the APs 400 via wireless LAN and connects to the network N via that AP 400.
  • the user terminal 300 connects to the nearest AP 400 via a wireless LAN and connects to the network N via the nearest AP 400 .
  • user terminals 300-1 and 300-2 are connected to AP 400-1 and connected to network N via AP 400-1.
  • a user terminal 300-3 is connected to the AP 400-2, and is connected to the network N via the AP 400-2.
  • APs 400-1 and 400-2 are wireless access points. APs 400-1 and 400-2 are installed in predetermined areas of target space TS. For example, APs 400-1 and 400-2 may be installed on each floor within the premises of the target company. Also, for example, APs 400-1 and 400-2 may be installed in different areas on the same floor within the premises of the target company.
  • the face authentication device 100 is a computer device that stores facial feature information of multiple people.
  • the face authentication device 100 has a face authentication function that, in response to a face authentication request received from the outside, compares the face image or face feature information included in the request with the face feature information of each user.
  • the face authentication device 100 registers facial feature information of the user at the time of user registration. Then, the face authentication device 100 acquires the photographed image of the user from the user terminal 300 via the server 200, and performs face authentication using the face area in the photographed image. The face authentication device 100 then returns the collation result (face authentication result) to the server 200 .
  • the server 200 is an example of the information processing device 10 described above.
  • the server 200 receives a user registration request including a registration image from the user terminal 300
  • the server 200 transmits the face registration request to the face authentication device 100 .
  • the server 200 registers the user information in association with the user ID issued by the face authentication device 100 .
  • the server 200 receives a photographed image for face authentication or face feature information from the user terminal 300 via the AP 400 , it sends a face authentication request to the face authentication device 100 .
  • the server 200 identifies the user based on the face authentication result, and estimates the position of the user terminal 300 and the position of the user based on the position of the AP 400 to which the user terminal 300 is connected.
  • the server 200 transmits to the display device 500 output information in which appearance information including the photographed image for face authentication of the user and the position of the user are associated with each other.
  • the display device 500 is a device having a display unit such as a digital signage or a tablet terminal.
  • the display device 500 is installed in the target space TS or at a remote location.
  • the display device 500 is installed at the floor entrance within the premises of the target company.
  • the display device 500 displays the position of each user included in the output information received from the server 200 in association with the captured image for face authentication of the user.
  • FIG. 4 is a block diagram showing the configuration of the face authentication device 100 according to the second embodiment.
  • the face authentication device 100 includes a face information database (face information DB) 110 , a face detection section 120 , a feature point extraction section 130 , a registration section 140 and an authentication section 150 .
  • the face information DB 110 associates and stores a user ID 111 and face feature information 112 of the user ID.
  • the facial feature information 112 is a set of feature points extracted from a facial image, and is an example of facial information.
  • the face authentication device 100 may delete the facial feature information 112 in the face information DB 110 in response to a request from the user who registered that facial feature information 112.
  • the face authentication device 100 may delete the facial feature information 112 after a certain period of time has passed since it was registered.
  • the face detection unit 120 detects a face area included in a registered image for registering face information, and supplies it to the feature point extraction unit 130 .
  • the feature point extraction unit 130 extracts feature points from the face area detected by the face detection unit 120 and supplies face feature information to the registration unit 140 .
  • the feature point extraction unit 130 also extracts feature points included in the captured image received from the server 200 and supplies facial feature information to the authentication unit 150 .
  • the registration unit 140 newly issues a user ID 111 when registering facial feature information.
  • the registration unit 140 associates the issued user ID 111 with the facial feature information 112 extracted from the registered image and registers them in the facial information DB 110 .
  • the authentication unit 150 performs face authentication using the facial feature information 112. Specifically, the authentication unit 150 collates the facial feature information extracted from the captured image with the facial feature information 112 in the face information DB 110, and returns to the server 200 whether or not the facial feature information matches, which corresponds to the success or failure of the authentication. Note that a match of facial feature information means that the degree of matching is equal to or greater than a predetermined value.
  • FIG. 5 is a flowchart showing the flow of face information registration processing according to the second embodiment.
  • the face authentication device 100 acquires the registered image of the user U included in the face registration request (S21). For example, the face authentication device 100 receives a face registration request via the network N from the server 200 that received the user registration request from the user terminal 300 . Note that the face authentication device 100 may receive a face registration request directly from the user terminal 300 without being limited to this.
  • face detection section 120 detects a face area included in the registered image (S22).
  • the feature point extraction unit 130 extracts feature points from the face area detected in step S22, and supplies face feature information to the registration unit 140 (S23).
  • the registration unit 140 issues the user ID 111, associates the user ID 111 with the facial feature information 112, and registers them in the facial information DB 110 (S24).
  • the face authentication device 100 may receive the face feature information 112 from the face registration requester and register it in the face information DB 110 in association with the user ID 111 .
  • FIG. 6 is a flowchart showing the flow of face authentication processing according to the second embodiment.
  • the feature point extraction unit 130 acquires facial feature information for authentication (S31).
  • the face authentication device 100 receives a face authentication request from the server 200 via the network N, and extracts facial feature information from the captured image included in the face authentication request in steps S21 to S23.
  • the face authentication device 100 may receive facial feature information from the server 200 .
  • the authentication unit 150 collates the acquired facial feature information with the facial feature information 112 of the facial information DB 110 (S32).
  • If there is matching facial feature information (Yes in S33), the authentication unit 150 identifies the user ID 111 of the user whose facial feature information matched (S34). Then, the authentication unit 150 returns to the server 200 a face authentication result indicating that the face authentication succeeded, together with the identified user ID 111 (S35). If there is no matching facial feature information (No in S33), the authentication unit 150 returns to the server 200 a face authentication result indicating that the face authentication failed (S36).
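  • A minimal sketch of this 1:N collation (S32 to S36), assuming the face information DB 110 is held as an in-memory mapping from user ID 111 to a feature vector and reusing cosine similarity as the matching degree (names and threshold are illustrative assumptions):

```python
from typing import Optional

import numpy as np

MATCH_THRESHOLD = 0.8  # assumed "predetermined value"

def collate(query: np.ndarray, face_info_db: dict[str, np.ndarray]) -> Optional[str]:
    """Collate query facial features against every registered entry (S32).

    Returns the user ID of the best match when its matching degree reaches the
    threshold (Yes in S33, then S34/S35); otherwise returns None (No in S33, S36).
    """
    best_user_id: Optional[str] = None
    best_score = 0.0
    for user_id, registered in face_info_db.items():
        score = float(np.dot(query, registered) /
                      (np.linalg.norm(query) * np.linalg.norm(registered)))
        if score > best_score:
            best_user_id, best_score = user_id, score
    return best_user_id if best_score >= MATCH_THRESHOLD else None
```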
  • FIG. 7 is a block diagram showing the configuration of the user terminal 300 according to the second embodiment.
  • the user terminal 300 includes a camera 310 , a storage section 320 , a communication section 330 , a display section 340 , an input section 350 and a control section 360 .
  • the camera 310 is an imaging device that performs imaging under the control of the control unit 360 .
  • the storage unit 320 is a storage device that stores programs for realizing each function of the user terminal 300 .
  • a communication unit 330 is a communication interface with the network N.
  • the display unit 340 is a display device.
  • the input unit 350 is an input device that receives input from the user.
  • the display unit 340 and the input unit 350 may be configured integrally like a touch panel.
  • the control unit 360 controls hardware of the user terminal 300 .
  • FIG. 8 is a block diagram showing the configuration of the server 200 according to the second embodiment.
  • the server 200 includes a storage unit 210 , a memory 220 , a communication unit 230 and a control unit 240 .
  • the storage unit 210 is a storage device such as a hard disk or flash memory.
  • Storage unit 210 stores program 211 , user database (DB) 212 , and access point database (APDB) 213 .
  • the program 211 is a computer program in which the processing of the information processing method according to the second embodiment is implemented.
  • the user DB 212 stores information about users. Specifically, the user DB 212 stores user information 2122 , position information 2123 and appearance information 2124 in association with the user ID 2121 .
  • a user ID 2121 is a user ID issued by the face authentication device 100 when face information is registered.
  • User information 2122 may include, for example, user name, employee number, cell phone number, email address, attribute information, or schedule-related information for the user.
  • the attribute information may include at least one of gender, job title, and department name.
  • the schedule-related information may be the schedule itself, or information for accessing the user terminal 300 or the scheduler operating on the cloud.
  • the position information 2123 is the user's position information estimated by the position estimation unit 244, which will be described later.
  • Appearance information 2124 is a photographed image for face authentication, but may also include appearance data.
  • the APDB 213 stores information about the AP 400. Specifically, the APDB 213 associates and stores an APID 2131 that identifies the AP 400 and location information 2132 where the AP 400 is installed.
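  • A minimal sketch of the records held in the user DB 212 and the APDB 213, assuming simple in-memory dataclasses (the disclosure does not prescribe a storage format; field names and types are assumptions):

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class UserRecord:
    """One entry of the user DB 212."""
    user_id: str                                    # user ID 2121 issued at face registration
    user_info: dict = field(default_factory=dict)   # user information 2122 (name, department, ...)
    position: Optional[tuple[float, float]] = None  # position information 2123
    appearance: Optional[bytes] = None              # appearance information 2124 (e.g. captured image)

@dataclass
class ApRecord:
    """One entry of the APDB 213."""
    ap_id: str                     # APID 2131 identifying the AP 400
    position: tuple[float, float]  # location information 2132 of the installed AP
```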
  • the memory 220 is a volatile storage device such as RAM (Random Access Memory), and is a storage area for temporarily holding information when the control unit 240 operates.
  • the communication unit 230 is a communication interface with the network N.
  • the control unit 240 is a processor that controls each component of the server 200, that is, a control device.
  • the control unit 240 loads the program 211 from the storage unit 210 into the memory 220 and executes the program 211 .
  • the control unit 240 realizes the functions of the registration unit 241 , the image acquisition unit 242 , the authentication control unit 243 , the position estimation unit 244 , the generation unit 245 and the output control unit 246 .
  • the registration unit 241 is also called registration means. Upon receiving a user registration request including a registration image from the user terminal 300, the registration unit 241 transmits a face registration request to the face authentication device 100. When the face authentication device 100 registers the face information and issues a user ID, the registration unit 241 registers the user ID in the user DB 212. The registration unit 241 also registers the user information of the user in the user DB 212 in association with the user ID of the user who uses the user terminal 300. Since the user ID is associated with the face information in the face authentication device 100, the registration unit 241 thereby registers each user's user information in association with that user's face information via the user ID.
  • the image acquisition unit 242 is also called image acquisition means.
  • the image acquisition unit 242 receives a captured image for face authentication from the user terminal 300 via the AP 400 and supplies the image to the authentication control unit 243 .
  • the authentication control unit 243 is an example of the authentication control unit 13 described above.
  • the authentication control unit 243 controls face authentication for the face area of the user U included in the captured image, and identifies the user. That is, the authentication control unit 243 causes the face authentication device 100 to perform face authentication on the captured image acquired from the user terminal 300 .
  • the authentication control unit 243 transmits a face authentication request including the acquired photographed image to the face authentication device 100 via the network N.
  • the authentication control unit 243 may extract the face area of the user U from the captured image and include the extracted image in the face authentication request.
  • the authentication control unit 243 may also extract facial feature information from the face area and include the facial feature information in the face authentication request.
  • the authentication control unit 243 then receives the face authentication result from the face authentication device 100 . Thereby, the authentication control unit 243 identifies the user ID of the user.
  • the position estimator 244 is an example of the position estimator 14 described above.
  • the position estimation unit 244 estimates the position of the user terminal 300 .
  • Methods for estimating the position of the user terminal 300 include, for example, the following (1) to (3).
  • (1) The position estimation unit 244 identifies the AP 400 via which the captured image was received, and identifies the position information of that AP 400 in the APDB 213.
  • The position estimation unit 244 then estimates the position information of the AP 400 as the position information of the user terminal 300 of the user specified by the authentication control unit 243.
  • (2) The position estimation unit 244 estimates the distance between the AP 400 and the user terminal 300, for example from the received radio wave intensity, and estimates the position of the user terminal 300 based on the position information of the AP 400 and that distance. (3) When GPS information is acquired from the user terminal 300, the position estimation unit 244 estimates the GPS information as the position information of the user terminal 300.
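  • A minimal sketch of method (2), assuming a log-distance path-loss model to convert received radio wave intensity into a distance; the model and its constants are assumptions, since the text only states that the distance between the AP 400 and the user terminal 300 is used:

```python
def distance_from_rssi(rssi_dbm: float, tx_power_dbm: float = -40.0,
                       path_loss_exponent: float = 2.5) -> float:
    """Estimate the AP-to-terminal distance in metres from RSSI (log-distance model)."""
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10 * path_loss_exponent))

def estimate_terminal_position(ap_position: tuple[float, float],
                               rssi_dbm: float) -> tuple[tuple[float, float], float]:
    """Method (2): the terminal is estimated to lie within a radius of the AP position."""
    return ap_position, distance_from_rssi(rssi_dbm)
```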
  • the position estimation unit 244 estimates the user's position based on the position of the user terminal 300 .
  • the position estimation unit 244 may estimate the position of the user terminal 300 as the user's position.
  • When there are multiple user terminals 300 used by the user, the position estimation unit 244 may estimate the position of the user terminal 300 that most recently performed face authentication, or the position of an active user terminal 300, as the position of the user.
  • the fact that there are multiple user terminals 300 used by the user may mean that there are multiple user terminals 300 for which the user performed face authentication within a predetermined period.
  • the position estimation unit 244 may determine which user terminal 300's position to use as the user's position based on the type of the user terminal 300. For example, if the same user has performed face authentication on both a PC and a smartphone within a predetermined period, the position estimation unit 244 may use the position of the PC as the user's position, and if the PC is outside the target space TS, it may set the position of the smartphone as the user's position. In either case, the position of the selected user terminal 300 may be used as the location of the user.
  • the generation unit 245 is also called generation means.
  • the generation unit 245 generates appearance information including a captured image for face authentication of the user, and stores it in the user DB 212 in association with the user ID 2121 . Appearance information may be the photographed image itself for face authentication.
  • the generating unit 245 also generates output information in which at least the user's appearance information stored in the user DB 212 and the user's location information are associated with each other.
  • the output information is a map indicating the target space TS.
  • the generation unit 245 generates, as output information, a map representing the target space TS, in which the user's appearance information is superimposed on the position corresponding to the user's position information. By superimposing the appearance information on the map, it becomes easier for the second user viewing the map to find the target user (first user) in the physical space.
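  • A minimal sketch of this superimposition, assuming the Pillow library is available and that the user's estimated position has already been converted to pixel coordinates on the map (both assumptions):

```python
from PIL import Image

def superimpose_icon(map_image: Image.Image, icon: Image.Image,
                     pixel_xy: tuple[int, int]) -> Image.Image:
    """Paste a user's appearance icon onto the target-space map at the user's position."""
    out = map_image.copy()
    x, y = pixel_xy
    out.paste(icon, (x - icon.width // 2, y - icon.height // 2))  # center the icon
    return out

# Usage (file names are placeholders):
# map_img = Image.open("target_space_map.png")
# icon = Image.open("icon_I1.png").resize((48, 48))
# superimpose_icon(map_img, icon, (120, 80)).save("display_image_900.png")
```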
  • the output control unit 246 is an example of the output control unit 16 described above.
  • the output control section 246 transmits the output information to the display device 500 and causes the display section of the display device 500 to display the output information.
  • FIG. 9 is a sequence diagram showing the flow of user registration processing according to the second embodiment.
  • the user terminal 300 takes a picture of the user (S500), and transmits a user registration request including the registration image generated by the picture to the server 200 (S501).
  • the registration unit 241 of the server 200 includes the registration image included in the received user registration request in the face registration request and transmits the face registration request to the face authentication device 100 (S502).
  • the face authentication device 100 registers face information (face feature information) of the user U based on the registration image included in the received face registration request (S503).
  • the face authentication device 100 notifies the server 200 of the issued user ID (S504).
  • the user terminal 300 accepts input of user information from the user and transmits the user information to the server 200 (S505).
  • the user information transmitted here includes, for example, the user name, attribute information, and schedule-related information.
  • the registration unit 241 of the server 200 associates the notified user ID and user information with each other and registers them in the user DB 212 (S506).
  • FIG. 10 is a sequence diagram showing the flow of position output processing according to the second embodiment.
  • the user terminal 300 takes an image of the user (S510), and transmits the taken image to the server 200 via the AP 400 to which it is connected (S511).
  • the image acquisition unit 242 of the server 200 acquires the captured image of the user.
  • the authentication control unit 243 of the server 200 transmits a face authentication request for the face area of the user U in the captured image to the face authentication device 100 (S512).
  • the face authentication device 100 performs face authentication on the face area of the user U in the captured image included in the received face authentication request (S513).
  • the face authentication device 100 transmits to the server 200 a face authentication result including the success of the face authentication and the user ID (S514).
  • the authentication control unit 243 of the server 200 identifies the user based on the user ID included in the face authentication result.
  • the authentication control unit 243 transmits the face authentication result to the user terminal 300 via the AP 400 (S515). As a result, the user terminal 300 wakes up or transitions from the sleep state to the normal state.
  • the position estimation unit 244 of the server 200 refers to the APDB 213, identifies the position information associated with the APID of the AP 400, and estimates the position of the user terminal 300 based on the position information of the AP 400 (S516). The position estimation unit 244 then estimates the position of the user based on the position of the user terminal 300 (S517), and stores the user's position information in the user DB 212 in association with the user ID.
  • the generating unit 245 of the server 200 generates a map that associates the user's appearance information including the photographed image for face authentication with the user's position information (S518). Then, the output control unit 246 of the server 200 transmits the map to the display device 500 via the AP 400 (S519), and causes the display unit of the display device 500 to display the map (S520).
  • FIG. 11 is a diagram showing an example of a display image 900 of the display device 500 according to the second embodiment.
  • the target space TS includes area A, area B, and area C, and AP 400 is installed in each area.
  • Icons I_1 and I_2 of the captured images for face authentication of the users whose user terminals 300 are connected to the AP 400 installed in area A are superimposed at the position of area A in the display image 900 shown in FIG. 11. Also, at the position of area B in the display image 900, an icon I_3 of the captured image for face authentication of the user whose user terminal 300 is connected to the AP 400 installed in area B is superimposed.
  • the display image 900 also shows the position where the display device 500 is installed as the current position.
  • one icon of a shot image for face authentication is displayed for each user, but a plurality of icons may be displayed for each user.
  • the plurality of icons may include a plurality of captured images for face authentication, captured images of the user's clothes, shoes, hairstyle, or back view of that day, or an illustration image representing the user's appearance of that day.
  • As described above, the server 200 uses the location information of a user whose face authentication has succeeded, so impersonation of the user by others can be prevented. Further, in the example of FIG. 11, the server 200 causes the display device 500 to display the captured image of the user in association with the position information. Therefore, other users who view the output information can grasp the features of the target user's appearance on that day as well as the target user's position. This allows the server 200 to suitably assist other users in finding the target user within the target space.
  • Since the captured image for face authentication is taken with the camera 310 of the user terminal 300, there is no need to install a new camera for face authentication. Also, when face authentication is performed at PC startup or on waking from sleep, the camera attached to the PC can photograph the face of the user sitting in front of the PC from the front. When the user is photographed from the front with the camera 310 in this manner, a high-quality captured image suitable for face authentication can be obtained.
  • Furthermore, since the server 200 estimates the user's position based on the position of the AP 400 or on the GPS information of the user terminal 300, there is no need to install new equipment such as dedicated transmitters and receivers.
  • Embodiment 2 can be modified as follows.
  • the appearance information included in the output information may be appearance data generated from a photographed image for face authentication instead of the photographed image for face authentication.
  • the output information may include a pre-registered registered image of the user.
  • the registered image may be a registered image for face authentication, a thumbnail image appropriately set by the user, or a face image on an employee ID card.
  • FIG. 12 is a diagram showing an example of a display image 900 of the display device 500 according to the second embodiment.
  • Icons R_1 and R_2 of registered images of users using the user terminals 300 connected to the AP 400 installed in area A are superimposed on the position of area A of the display image 900 shown in FIG. 12.
  • appearance data O_1 and O_2 generated from the photographed images for face authentication of the user are included in association with icons R_1 and R_2.
  • the appearance data O_1 indicates that the person is wearing "red clothes”
  • the appearance data O_2 indicates that the person is wearing "glasses”.
  • At the position of area B, an icon R_3 of the registered image of the user using the user terminal 300 connected to the AP 400 installed in area B and the corresponding appearance data O_3 are superimposed.
  • the output information may be information in which the location information and appearance information of the first user are further associated with the user information. That is, the generation unit 245 may generate output information including position information and appearance information of the first user, and user information. Also, the output information may be information in which the location information and appearance information of the first user are further associated with the schedule-related information. That is, the generation unit 245 may generate output information in which the first user's location information and appearance information are associated with the user's schedule information.
  • the user's schedule information may be obtained by the generation unit 245 extracting from the schedule-related information of the user information 2122 or by accessing the scheduler based on the schedule-related information.
  • FIG. 13 is a diagram showing an example of a display image 900 of the display device 500 according to the second embodiment.
  • Icons I_1 and I_2 of the captured images for face authentication of the users whose user terminals 300 are connected to the AP 400 installed in area A are superimposed at the position of area A in the display image 900 shown in FIG. 13.
  • the display image 900 includes user information U of users located in the area A.
  • For example, the user corresponding to icon I_1 belongs to the department "First Engineering Department" and is named "Nichiden Taro".
  • the user information U may also include input information input by a user operation.
  • The user can input information regarding his or her appearance that day, and the user terminal 300 that has received the input may transmit the input information to the server 200 to register it in the user DB 212 as the user's user information.
  • the user information U shown in FIG. 13 includes information “I am wearing red clothes” as user input information corresponding to the icon I_1.
  • the user may also input other input information, and the user terminal 300 that has received the input may transmit the input information to the server 200 and register it as the user's user information.
  • the user information U shown in FIG. 13 includes information "I am next to a round chair" as user input information corresponding to the icon I_3.
  • Input information registered in the user DB 212 as user information can be given an expiration date of one hour or one day, and may be deleted after the expiration date.
  • When the viewer of the display device 500 selects the icon I_1 of the captured image for face authentication, other information regarding the user may be displayed.
  • Other information about the user may be user information or schedule information.
  • Other information about the user may be another captured image of the user for face authentication, a captured image of the user's clothes, shoes, hairstyle, or back view of that day, or an illustration image expressing the characteristics of the user's appearance of that day. If the display device 500 includes a touch panel, selecting may be tapping.
  • the display device 500 may display the location of the user associated with the specific location. For example, when the display device 500 is installed in a room of a certain department, the display device 500 may display the whereabouts of the members of that department.
  • FIG. 14 is a diagram showing an example of a display image 900 of the display device 500 according to the second embodiment.
  • The display image 900 in FIG. 14 shows a map of the room of the First Engineering Department and the locations of the members belonging to that department.
  • the display device 500 displays icons I_1 to I_3 of the members' captured images for face authentication in association with the positions in the map corresponding to those members' locations in the room. In addition, for members who are not in the room, the display device 500 displays which area each member is in, together with icons I_4 to I_8 of their captured images for face authentication.
  • the output control unit 246 of the server 200 may output the position of a member determined to be outside the target space TS as, for example, "remote", distinguishing it from positions within the target space TS.
  • the position estimation unit 244 of the server 200 may determine whether a member's position is outside the target space TS from the GPS information of the user terminal 300, or by tracing the network path through which the communication passes. The position estimation unit 244 may also make this determination from the user's schedule information. In this case, the display device 500 displays the locations of members outside the target space TS as "remote", as shown in FIG. 14. This makes it easy to grasp which members are not in the target space TS.
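  • A minimal sketch of the GPS-based determination, approximating the target space TS with a latitude/longitude bounding box (an assumption; the disclosure does not specify the geometry, and the coordinates are example values only):

```python
# (lat_min, lat_max, lon_min, lon_max); example values only
TARGET_SPACE_BOUNDS = (35.530, 35.532, 139.695, 139.700)

def position_label(lat: float, lon: float) -> str:
    """Label a terminal's GPS position as "remote" when it lies outside the target space TS."""
    lat_min, lat_max, lon_min, lon_max = TARGET_SPACE_BOUNDS
    inside = lat_min <= lat <= lat_max and lon_min <= lon <= lon_max
    return "in target space" if inside else "remote"
```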
  • the display device 500 may also display breakdowns such as how many of the affiliated members are in the room, how many are in other areas, and how many are remote.
  • the display device 500 may display the same type of information regardless of whether the selected user is inside or outside the target space TS.
  • However, the display is not limited to this, and the display device 500 may display different types of information depending on whether the selected user is inside or outside the target space TS. For example, when a user inside the target space TS is selected, the display device 500 may display detailed appearance information and an extension number, and when a user outside the target space TS is selected, it may display a cell phone number.
  • Embodiment 3 is characterized in that the server 200 has a search function.
  • the server 200 uses a user name or user ID as a key to output output information including position information of the user.
  • When the output control unit 246 receives a search request regarding the position of the first user from the second user terminal 300-2 used by the second user, it outputs output information including the first user's appearance information, together with the position information, to the second user terminal 300-2.
  • the server 200 uses the area as a key to output the output information of the user in that area.
  • When receiving such a search request, the generation unit 245 refers to the user DB 212, identifies users whose position information is within the predetermined area, and generates output information for the identified users.
  • the output control unit 246 then outputs the specified user's output information to the second user terminal 300-2.
  • the server 200 uses a user attribute as a key to output output information of a user having that attribute.
  • For example, when the generation unit 245 receives from the second user terminal 300-2 a search request regarding the positions of users belonging to a predetermined department, it refers to the user DB 212 and identifies the users belonging to that department within the target space TS. The generation unit 245 then generates output information for the identified users, and the output control unit 246 outputs the output information to the second user terminal 300-2.
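  • A minimal sketch of the three search keys over the user DB 212, reusing the UserRecord sketch above; matching by name and department is simplified to exact field comparison, and the area test is passed in as a predicate (all assumptions):

```python
from typing import Callable

def search_by_name(db: list[UserRecord], name: str) -> list[UserRecord]:
    """Key 1: user name (or, analogously, user ID)."""
    return [u for u in db if u.user_info.get("name") == name]

def search_by_area(db: list[UserRecord],
                   in_area: Callable[[tuple[float, float]], bool]) -> list[UserRecord]:
    """Key 2: users whose estimated position lies within the given area."""
    return [u for u in db if u.position is not None and in_area(u.position)]

def search_by_attribute(db: list[UserRecord], department: str) -> list[UserRecord]:
    """Key 3: users having a given attribute (here, the department name)."""
    return [u for u in db if u.user_info.get("department") == department]
```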
  • FIG. 15 is a sequence diagram showing the flow of position output processing according to the third embodiment.
  • FIG. 15 shows a sequence when the second user searches for the position of the first user using the user name of the first user as a key.
  • second user terminal 300-2 used by the second user transmits a search request including the user name of the first user to server 200 (S531).
  • Here, the second user terminal 300-2 transmits the search request to the server 200 via the AP 400, but the request need not go through the AP 400.
  • the generation unit 245 of the server 200 that has received the search request refers to the user DB 212 and retrieves the position information and appearance information of the user (first user) corresponding to the specified user name, using the user name as a key (S532). Then, the generation unit 245 generates a map in which the location information and appearance information of the first user are associated (S533).
  • the output control unit 246 of the server 200 then transmits the map to the display device 500 via the AP 400 (S534), and causes the display unit of the display device 500 to display the map (S535).
  • In response to a search by the second user, the server 200 can thus output where the target user is, who is in a specific area, or where a user with a specific attribute is, in a form that makes the target user easy to find.
  • the present disclosure can implement arbitrary processing by causing a processor to execute a computer program.
  • the program includes instructions (or software code) that, when read into the computer, cause the computer to perform one or more of the functions described in the embodiments.
  • the program may be stored in a non-transitory computer-readable medium or tangible storage medium.
  • Computer-readable media or tangible storage media may include random-access memory (RAM), read-only memory (ROM), flash memory, solid-state drives (SSD) or other memory technology, CD-ROM, digital versatile discs (DVD), Blu-ray discs or other optical disc storage, magnetic cassettes, magnetic tape, magnetic disk storage, or other magnetic storage devices.
  • the program may be transmitted on a transitory computer-readable medium or communication medium.
  • transitory computer readable media or communication media include electrical, optical, acoustic, or other forms of propagated signals.
  • the computer mentioned above is composed of a computer system including a personal computer and a word processor.
  • the computer is not limited to this, and can be configured by a LAN (local area network) server, a computer (personal computer) communication host, a computer system connected to the Internet, or the like. It is also possible to distribute the functions to each device on the network and configure the computer over the entire network.
  • face authentication is performed when the user terminal 300 is activated or released from the sleep state, but the timing of face authentication is not limited to this.
  • face authentication may be performed at predetermined time intervals, or may be performed a predetermined number of times in a predetermined period.
  • face authentication may be performed when some input operation is performed. Spoofing can be further prevented when face authentication is performed multiple times a day.
  • the generation unit 245 may update the appearance information at each face authentication, or may set the appearance information based on a captured image taken at a certain time of day (for example, at the beginning of the day).
  • the generation unit 245 may also update the appearance information when a predetermined time has passed since the most recent face authentication. Updating the appearance information makes it easier to find the user even when clothing or hairstyle changes or a mask is removed. Further, the generation unit 245 may allow the user to select, by a selection operation, the appearance information of which time point is used for the output information.
  • the face authentication device 100 has the face authentication function, but instead of or in addition to the face authentication device 100, the server 200 may have the face authentication function.
  • (Appendix 1) An information processing device comprising: authentication control means for controlling biometric authentication based on a photographed image of a first user; position estimation means for estimating a position of the first user in a target space based on position information of a first user terminal used by the first user when the biometric authentication is successful; and output control means for outputting output information in which the position of the first user is associated with appearance information including at least one of the photographed image of the first user and data relating to appearance generated based on the photographed image.
  • (Appendix 2) The information processing device according to Appendix 1, wherein the biometric authentication is face authentication.
  • (Appendix 3) The information processing device according to Appendix 1 or 2, wherein the position estimation means estimates the position information of the first user terminal based on position information of an access point to which the first user terminal is connected.
  • (Appendix 4) The information processing device according to any one of Appendices 1 to 3, wherein the output information further associates the position and the appearance information of the first user with user information related to the first user.
  • (Appendix 5) The information processing device according to any one of Appendices 1 to 4, further comprising generation means for generating, as the output information, a map representing the target space in which the appearance information of the first user is superimposed at a position corresponding to the position of the first user.
  • (Appendix 6) The information processing device according to any one of Appendices 1 to 5, wherein the output control means causes a display device installed in the target space to display the output information.
  • (Appendix 7) The information processing device according to any one of Appendices 1 to 5, wherein the output control means outputs the output information of the first user to a second user terminal when receiving, from the second user terminal used by a second user, a search request regarding the position of the first user.
  • (Appendix 8) The information processing device according to , wherein the output control means outputs the position of the first user separately from the target space when determining that the position of the first user is outside the target space.
  • (Appendix 9) An information processing system comprising: a biometric authentication device that performs biometric authentication based on a photographed image of a first user; and an information processing device, wherein the information processing device comprises: authentication control means for acquiring a result of the biometric authentication from the biometric authentication device; position estimation means for estimating a position of the first user in a target space based on position information of a first user terminal used by the first user when the biometric authentication is successful; and output control means for outputting output information in which the position of the first user is associated with appearance information including at least one of the photographed image of the first user and data relating to appearance generated based on the photographed image.
  • (Appendix 10) The information processing system according to Appendix 9, further comprising a display device, wherein the output control means outputs the output information to the display device.
  • (Appendix 11) An information processing method comprising: controlling biometric authentication based on a photographed image of a first user; estimating, when the biometric authentication is successful, a position of the first user in a target space based on position information of a first user terminal used by the first user; and outputting output information in which the position of the first user is associated with appearance information including at least one of the photographed image of the first user and data relating to appearance generated based on the photographed image.
  • (Appendix 12) A non-transitory computer-readable medium storing a program for causing a computer to execute: a procedure for controlling biometric authentication based on a photographed image of a first user; a procedure for estimating, when the biometric authentication is successful, a position of the first user in a target space based on position information of a first user terminal used by the first user; and a procedure for outputting output information in which the position of the first user is associated with appearance information including at least one of the photographed image of the first user and data relating to appearance generated based on the photographed image.
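The following is an illustrative, non-normative sketch of the flow shared by Appendices 1, 3, and 11 above: the result of biometric authentication is checked, the first user terminal's position is estimated from the access point it is connected to, and output information associating that position with appearance information is produced. The table `AP_POSITIONS` and all function and field names are assumptions for illustration; the disclosure defines none of them.

```python
from dataclasses import dataclass
from typing import Dict, Optional, Tuple

# Hypothetical table mapping each access point in the target space to
# coordinates; the disclosure only says AP position information is used.
AP_POSITIONS: Dict[str, Tuple[float, float]] = {
    "ap-3f-east": (12.0, 34.0),
    "ap-3f-west": (3.0, 8.0),
}

@dataclass
class OutputInfo:
    position: Optional[Tuple[float, float]]  # estimated position in the target space
    appearance: bytes                        # photographed image or data derived from it
    user_info: dict                          # related user information (Appendix 4)

def estimate_position(connected_ap: str) -> Optional[Tuple[float, float]]:
    """Estimate the first user terminal's position from the AP it is connected to."""
    return AP_POSITIONS.get(connected_ap)

def handle_authentication(auth_success: bool,
                          connected_ap: str,
                          photographed_image: bytes,
                          user_info: dict) -> Optional[OutputInfo]:
    """Build output information only when the biometric authentication succeeded."""
    if not auth_success:
        return None
    return OutputInfo(estimate_position(connected_ap), photographed_image, user_info)
```

The resulting `OutputInfo` could then be rendered as appearance information superimposed on a map of the target space (Appendix 5), shown on a display device installed in the space (Appendix 6), or returned in response to a second user's search request (Appendix 7); a position that cannot be resolved inside the target space would be presented separately from it, in the spirit of Appendix 8.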


Abstract

An information processing device (10) comprises: an authentication control unit (13) for controlling biometric authentication based on a photographed image of a first user; a position estimation unit (14) that, when the biometric authentication succeeds, estimates a position of the first user in a target space based on position information of a first user terminal used by the first user; and an output control unit (16) that outputs output information in which the position of the first user is associated with appearance information comprising the photographed image of the first user and/or data relating to appearance generated based on the photographed image.
PCT/JP2021/041170 2021-11-09 2021-11-09 Information processing device, information processing system, information processing method, and non-transitory computer-readable medium WO2023084593A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
JP2023559225A JPWO2023084593A5 (ja) 2021-11-09 Information processing device, information processing system, information processing method, and program
PCT/JP2021/041170 WO2023084593A1 (fr) 2021-11-09 2021-11-09 Information processing device, information processing system, information processing method, and non-transitory computer-readable medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2021/041170 WO2023084593A1 (fr) 2021-11-09 2021-11-09 Information processing device, information processing system, information processing method, and non-transitory computer-readable medium

Publications (1)

Publication Number Publication Date
WO2023084593A1 true WO2023084593A1 (fr) 2023-05-19

Family

ID=86335288

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/041170 WO2023084593A1 (fr) Information processing device, information processing system, information processing method, and non-transitory computer-readable medium

Country Status (1)

Country Link
WO (1) WO2023084593A1 (fr)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2012038242A (ja) * 2010-08-11 2012-02-23 Kddi Corp Attendance management method and system
JP2019101566A (ja) * 2017-11-29 2019-06-24 株式会社 プロネット Information processing system, information processing method, information processing program, and information processing device
JP2019144917A (ja) * 2018-02-22 2019-08-29 パナソニックIpマネジメント株式会社 Stay status display system and stay status display method
WO2021186569A1 (fr) * 2020-03-17 2021-09-23 日本電気株式会社 Visit assistance device, visit assistance system, visit assistance method, and non-transitory computer-readable medium storing a program

Also Published As

Publication number Publication date
JPWO2023084593A1 (fr) 2023-05-19

Similar Documents

Publication Publication Date Title
EP3076320B1 (fr) Individual identification device and identification-threshold setting method
US10623959B1 (en) Augmented reality security access
JP6123653B2 (ja) Information processing device, information processing method, and program
JP2013041416A (ja) Information processing device and method, program, and information processing system
US20190147251A1 (en) Information processing apparatus, monitoring system, method, and non-transitory computer-readable storage medium
JP2019091395A (ja) Information processing device, monitoring system, method, and program
WO2022195815A1 (fr) Information provision device, information provision system, information provision method, and non-transitory computer-readable medium
JP6769475B2 (ja) Information processing system, method for managing authentication targets, and program
JP5813829B1 (ja) Crime prevention system
JP2015076044A (ja) Authentication server, authentication program, and authentication method
CN108334966A (zh) Visitor reservation method and system
JP6776700B2 (ja) Disaster information management system and disaster information management method
JP6993597B2 (ja) Information processing device, control method, and program
JP5532180B1 (ja) Image processing device and program
JP2015233204A (ja) Image recording device and image recording method
WO2021186569A1 (fr) Visit assistance device, visit assistance system, visit assistance method, and non-transitory computer-readable medium storing a program
WO2023084593A1 (fr) Information processing device, information processing system, information processing method, and non-transitory computer-readable medium
JP7067593B2 (ja) Information processing system, method for managing authentication targets, and program
JP2017152013A (ja) Information processing device, information processing method, and program
JP2019133566A (ja) Information processing device
JP6077930B2 (ja) Information management device, information management system, communication terminal, and information management method
JP2010231450A (ja) Photographic data authentication device, photographic data authentication system, photographic data authentication method, and program
WO2022113589A1 (fr) Server, terminal device, information processing program, management system, and management method
JP6435676B2 (ja) File management device, file management system, and program
JP6344984B2 (ja) Person estimation device, person estimation method, and program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21963956

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2023559225

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE