WO2023084593A1 - Information processing device, information processing system, information processing method, and non-transitory computer-readable medium - Google Patents

Information processing device, information processing system, information processing method, and non-transitory computer-readable medium

Info

Publication number
WO2023084593A1
WO2023084593A1 (PCT/JP2021/041170)
Authority
WO
WIPO (PCT)
Prior art keywords
user
information
information processing
authentication
photographed image
Prior art date
Application number
PCT/JP2021/041170
Other languages
French (fr)
Japanese (ja)
Inventor
Tadanobu Nakayama (忠信 中山)
Original Assignee
NEC Corporation (日本電気株式会社)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by NEC Corporation (日本電気株式会社)
Priority to PCT/JP2021/041170 priority Critical patent/WO2023084593A1/en
Publication of WO2023084593A1 publication Critical patent/WO2023084593A1/en

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06Q — INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00 — Administration; Management
    • G06Q10/10 — Office automation; Time management

Definitions

  • The present disclosure relates to an information processing device, an information processing system, an information processing method, and a non-transitory computer-readable medium, and more particularly to an information processing device, an information processing system, an information processing method, and a non-transitory computer-readable medium that output the location of a user.
  • Patent Literature 1 discloses a position estimation system that estimates the position of a mobile object based on whether a sensor terminal is carried by the mobile object and on the position of the sensor terminal.
  • Patent Literature 2 discloses a communication support system that detects the location of each of a plurality of employees based on information about the wireless AP to which each user's mobile terminal is connected, generates screen data displaying those locations together with the employees' thumbnail images, and provides the data to the terminals of other users.
  • The technique of Patent Literature 1 has the problem that even if the location of a target person is known, it is difficult to actually find that person when multiple people are on the same floor.
  • In view of the above problems, an object of the present disclosure is to provide an information processing device, an information processing system, an information processing method, and a non-transitory computer-readable medium that favorably assist in finding a user in a space.
  • An information processing device includes: authentication control means for controlling biometric authentication based on a photographed image of a first user; position estimation means for estimating, when the biometric authentication is successful, the position of the first user in a target space based on position information of a first user terminal used by the first user; and output control means for outputting output information in which the position of the first user is associated with appearance information including at least one of the photographed image of the first user and appearance data generated based on the photographed image.
  • An information processing system includes: a biometric authentication device that performs biometric authentication based on a photographed image of a first user; and an information processing device. The information processing device includes: authentication control means for acquiring a result of the biometric authentication from the biometric authentication device; position estimation means for estimating, when the biometric authentication is successful, the position of the first user in a target space based on position information of a first user terminal used by the first user; and output control means for outputting output information in which the position of the first user is associated with appearance information including at least one of the photographed image of the first user and appearance data generated based on the photographed image.
  • An information processing method includes: controlling biometric authentication based on a photographed image of a first user; estimating, when the biometric authentication is successful, the position of the first user in a target space based on position information of a first user terminal used by the first user; and outputting output information in which the position of the first user is associated with appearance information including at least one of the photographed image of the first user and appearance data generated based on the photographed image.
  • A non-transitory computer-readable medium according to the present disclosure stores a program for causing a computer to execute: a procedure for controlling biometric authentication based on a photographed image of a first user; a procedure for estimating, when the biometric authentication is successful, the position of the first user in a target space based on position information of a first user terminal used by the first user; and a procedure for outputting output information in which the position of the first user is associated with appearance information including at least one of the photographed image of the first user and appearance data generated based on the photographed image.
  • According to the present disclosure, it is possible to provide an information processing device, an information processing system, an information processing method, and a non-transitory computer-readable medium that favorably assist in finding a user in a space.
  • FIG. 1 is a block diagram showing the configuration of an information processing apparatus according to a first embodiment.
  • FIG. 2 is a flowchart showing the flow of an information processing method according to the first embodiment.
  • FIG. 3 is a block diagram showing the overall configuration of an information processing system according to a second embodiment.
  • FIG. 4 is a block diagram showing the configuration of a face authentication device according to the second embodiment.
  • FIG. 5 is a flowchart showing the flow of face information registration processing according to the second embodiment.
  • FIG. 6 is a flowchart showing the flow of face authentication processing according to the second embodiment.
  • FIG. 7 is a block diagram showing the configuration of a user terminal according to the second embodiment.
  • FIG. 8 is a block diagram showing the configuration of a server according to the second embodiment.
  • FIG. 9 is a sequence diagram showing the flow of user registration processing according to the second embodiment.
  • FIG. 10 is a sequence diagram showing the flow of position output processing according to the second embodiment.
  • FIG. 11 is a diagram showing an example of a display image of the display device according to the second embodiment.
  • FIG. 12 is a diagram showing an example of a display image of the display device according to the second embodiment.
  • FIG. 13 is a diagram showing an example of a display image of the display device according to the second embodiment.
  • FIG. 14 is a diagram showing an example of a display image of the display device according to the second embodiment.
  • FIG. 15 is a sequence diagram showing the flow of position output processing according to a third embodiment.
  • Patent Literature 1 describes that the carrying state may be detected when personal authentication is successful.
  • However, with the method described in Patent Literature 1, there is a problem that it is difficult to actually find the target person.
  • The method described in Patent Literature 2 has also been proposed, but even if the location of an employee is known, it takes time to actually find the employee unless that person's appearance on the day is known. In particular, it becomes increasingly difficult to find employees when their clothes or hairstyles change.
  • FIG. 1 is a block diagram showing the configuration of an information processing apparatus 10 according to the first embodiment.
  • The information processing device 10 is a computer device that assists another user (also called a second user) in finding a target user (also called a first user) in a target space.
  • the information processing device 10 is connected to a network (not shown).
  • a network may be wired or wireless.
  • a first user terminal (not shown) used by a first user and a display device (not shown) are connected to the network.
  • The information processing device 10 includes an authentication control unit 13, a position estimation unit 14, and an output control unit 16.
  • the authentication control unit 13 is also called authentication control means.
  • the authentication control unit 13 controls biometric authentication based on the captured image of the first user.
  • Biometric authentication is iris authentication, face authentication, hand geometry authentication, or other biometric authentication that authenticates based on a photographed image of the user. Thereby, the authentication control unit 13 identifies the user.
  • the position estimation unit 14 is also called position estimation means.
  • When the biometric authentication is successful, the position estimation unit 14 estimates the position of the first user terminal within the target space.
  • the target space is a space in which a plurality of people can stay and each person can move inside.
  • the target space is a predetermined space. Then, the position estimation unit 14 estimates the position of the first user in the target space based on the position information of the first user terminal.
  • the output control unit 16 is also called output control means.
  • The output control unit 16 outputs output information, in which the position of the first user and the appearance information of the first user are associated with each other, to another device such as a display device or a second user terminal used by the second user.
  • The appearance information of the first user is information indicating or suggesting the features of what the first user is wearing, or how it is worn.
  • the thing worn may be clothing, hairstyle, accessories, a mask, glasses, or the like.
  • the appearance information of the first user includes at least one of a photographed image generated by photographing the first user and data regarding appearance generated based on the photographed image.
  • Appearance-related data is also called appearance data, and is, for example, data indicating characteristics or wearing conditions of the wearable item.
  • the appearance data is text data or an illustration image indicating the characteristics or wearing conditions.
  • FIG. 2 is a flow chart showing the flow of the information processing method according to the first embodiment.
  • The authentication control unit 13 of the information processing device 10 controls biometric authentication based on the photographed image of the first user (S10). Controlling biometric authentication based on a captured image may mean performing biometric authentication based on the captured image or on feature information extracted from it. It may also mean transmitting the captured image or extracted feature information to a biometric authentication device (not shown) and obtaining an authentication result from that device.
  • the authentication control unit 13 determines whether biometric authentication has succeeded (S11).
  • Successful biometric authentication may indicate that the degree of matching between the feature information extracted from the captured image of the first user and the pre-registered feature information of the user is equal to or greater than a predetermined value.
  • The position estimation unit 14 estimates the position of the first user terminal within the target space (S12). For example, the position estimation unit 14 may acquire the position of the access point (AP) used by the first user terminal, or the GPS (Global Positioning System) information of the first user terminal, and estimate the AP's position or the GPS information as the position of the first user terminal. Also, the position estimation unit 14 may estimate the position of the first user terminal based on the position of the AP used by the first user terminal and the received radio wave intensity from the first user terminal.
  • the position estimation unit 14 estimates the position of the first user within the target space based on the position information of the first user terminal within the target space (S13). For example, the position estimation unit 14 may estimate the position information of the first user terminal within the target space as the position of the first user within the target space. Further, for example, the position estimation unit 14 may estimate the position of the first user within a predetermined distance range from the position information of the first user terminal in the target space.
  • The output control unit 16 outputs output information, in which the position information of the first user and appearance information such as the photographed image of the first user are associated with each other, to another device (S14).
  • If the biometric authentication fails (No in S11), the information processing device 10 ends the process. In other words, the output control unit 16 does not output the location information of a user whose biometric authentication failed to other devices as the location information of the first user.
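To make the flow of steps S10 to S14 concrete, here is a minimal Python sketch. It is not taken from the disclosure; the function and field names (authenticate, terminal_position, and so on) are illustrative assumptions.

```python
from dataclasses import dataclass
from typing import Callable, Optional, Tuple

@dataclass
class AuthResult:
    success: bool                 # outcome checked in S11
    user_id: Optional[str] = None

def locate_first_user(captured_image: bytes,
                      authenticate: Callable[[bytes], AuthResult],
                      terminal_position: Tuple[float, float]) -> Optional[dict]:
    """Sketch of S10-S14: authenticate, estimate position, build output information."""
    result = authenticate(captured_image)   # S10: control biometric authentication
    if not result.success:                  # S11: on failure, output nothing
        return None
    user_position = terminal_position       # S12/S13: terminal position stands in for the user
    return {"user_id": result.user_id,      # S14: position paired with appearance information
            "position": user_position,
            "appearance": captured_image}
```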
  • the information processing apparatus 10 uses the location information of the first user who has succeeded in biometric authentication, so it is possible to prevent impersonation of the first user by others.
  • the information processing device 10 also outputs the appearance information such as the captured image of the first user in association with the position information of the first user. Therefore, the second user who has received the output information can grasp the features of the first user's appearance on that day as well as the position of the first user. Accordingly, the information processing device 10 can suitably assist the second user in finding the first user in the actual target space.
  • FIG. 3 is a block diagram showing the overall configuration of an information processing system 1000 according to the second embodiment.
  • the information processing system 1000 is a computer system that assists the second user in finding the first user in the target space TS.
  • In this example, the target space TS is the premises of a target company.
  • the target company's premises may include one or more floors.
  • the target space TS is not limited to this, and may be a shared office where a plurality of companies gather or a site of a school.
  • biometric authentication is face authentication as an example.
  • The information processing system 1000 includes a face authentication device 100, an information processing device (hereinafter referred to as a server) 200, a plurality of user terminals 300-1, 300-2, and 300-3, a plurality of APs 400-1 and 400-2, and a display device 500.
  • the face authentication device 100, server 200, AP 400 and display device 500 are connected to each other via a network N.
  • the network N is a wired or wireless communication line.
  • the network N is, for example, an intranet, and may be at least one of a LAN (Local Area Network), a WAN (Wide Area Network), and the Internet, or a combination thereof. Note that the number of user terminals and the number of APs are not limited to this.
  • the user terminal 300 is an information terminal used by a user, such as a personal computer, a smart phone, or a tablet terminal.
  • User terminal 300 transmits a user registration request to the server 200.
  • the user terminal 300 also transmits user information to the server 200 and causes the server 200 to register the user information.
  • the user terminal 300 requests face authentication when it is activated or released from a sleep state.
  • For example, the user terminal 300 captures at least the user's face from the front and transmits the captured image, or the facial feature information extracted from it, to the face authentication device 100 via the server 200 to request face authentication.
  • Note that the user terminal 300 may transmit the captured image or facial feature information directly to the face authentication device 100.
  • When the user terminal 300 is located within the target space TS, it connects to one of the APs 400 via wireless LAN and connects to the network N via that AP 400.
  • the user terminal 300 connects to the nearest AP 400 via a wireless LAN and connects to the network N via the nearest AP 400 .
  • user terminals 300-1 and 300-2 are connected to AP 400-1 and connected to network N via AP 400-1.
  • a user terminal 300-3 is connected to the AP 400-2, and is connected to the network N via the AP 400-2.
  • APs 400-1 and 400-2 are wireless access points. APs 400-1 and 400-2 are installed in predetermined areas of target space TS. For example, APs 400-1 and 400-2 may be installed on each floor within the premises of the target company. Also, for example, APs 400-1 and 400-2 may be installed in different areas on the same floor within the premises of the target company.
  • the face authentication device 100 is a computer device that stores facial feature information of multiple people.
  • the face authentication device 100 has a face authentication function that, in response to a face authentication request received from the outside, compares the face image or face feature information included in the request with the face feature information of each user.
  • The face authentication device 100 registers facial feature information of the user at the time of user registration. Then, the face authentication device 100 acquires the photographed image of the user from the user terminal 300 via the server 200 and performs face authentication using the face area in the photographed image. The face authentication device 100 then returns the collation result (face authentication result) to the server 200.
  • the server 200 is an example of the information processing device 10 described above.
  • When the server 200 receives a user registration request including a registration image from the user terminal 300, it transmits a face registration request to the face authentication device 100.
  • The server 200 registers the user information in association with the user ID issued by the face authentication device 100.
  • When the server 200 receives a photographed image for face authentication, or facial feature information, from the user terminal 300 via the AP 400, it sends a face authentication request to the face authentication device 100.
  • the server 200 identifies the user based on the face authentication result, and estimates the position of the user terminal 300 and the position of the user based on the position of the AP 400 to which the user terminal 300 is connected.
  • the server 200 transmits to the display device 500 output information in which appearance information including the photographed image for face authentication of the user and the position of the user are associated with each other.
  • The display device 500 is a device having a display unit, such as a digital signage display or a tablet terminal.
  • the display device 500 is installed in the target space TS or at a remote location.
  • the display device 500 is installed at the floor entrance within the premises of the target company.
  • the display device 500 displays the position of each user included in the output information received from the server 200 in association with the captured image for face authentication of the user.
  • FIG. 4 is a block diagram showing the configuration of the face authentication device 100 according to the second embodiment.
  • The face authentication device 100 includes a face information database (face information DB) 110, a face detection unit 120, a feature point extraction unit 130, a registration unit 140, and an authentication unit 150.
  • The face information DB 110 stores a user ID 111 and the facial feature information 112 of that user in association with each other.
  • the facial feature information 112 is a set of feature points extracted from a facial image, and is an example of facial information.
  • The face authentication device 100 may delete the facial feature information 112 in the face information DB 110 in response to a request from the user who registered it.
  • the face authentication device 100 may delete the facial feature information 112 after a certain period of time has passed since it was registered.
  • The face detection unit 120 detects a face area included in a registered image for registering face information, and supplies it to the feature point extraction unit 130.
  • The feature point extraction unit 130 extracts feature points from the face area detected by the face detection unit 120 and supplies facial feature information to the registration unit 140.
  • The feature point extraction unit 130 also extracts feature points included in the captured image received from the server 200 and supplies facial feature information to the authentication unit 150.
  • the registration unit 140 newly issues a user ID 111 when registering facial feature information.
  • The registration unit 140 associates the issued user ID 111 with the facial feature information 112 extracted from the registered image and registers them in the face information DB 110.
  • The authentication unit 150 performs face authentication using the facial feature information 112. Specifically, the authentication unit 150 collates the facial feature information extracted from the captured image with the facial feature information 112 in the face information DB 110, and returns to the server 200 whether or not the facial feature information matches, which corresponds to the success or failure of the authentication. Note that a match of facial feature information means that the degree of matching is equal to or greater than a predetermined value.
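As a rough illustration of this collation step (not the actual algorithm of the disclosure), the "degree of matching" could be a cosine similarity between feature vectors compared against the predetermined threshold; the names and the threshold value below are assumptions.

```python
import numpy as np

MATCH_THRESHOLD = 0.8  # the "predetermined value"; the real threshold is not given

def collate(query: np.ndarray, face_info_db: dict[str, np.ndarray]) -> str | None:
    """Return the user ID 111 whose facial feature information 112 matches, or None on failure."""
    best_id, best_score = None, -1.0
    for user_id, registered in face_info_db.items():
        # cosine similarity as one possible "degree of matching"
        score = float(np.dot(query, registered) /
                      (np.linalg.norm(query) * np.linalg.norm(registered)))
        if score > best_score:
            best_id, best_score = user_id, score
    return best_id if best_score >= MATCH_THRESHOLD else None
```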
  • FIG. 5 is a flowchart showing the flow of face information registration processing according to the second embodiment.
  • The face authentication device 100 acquires the registered image of the user U included in the face registration request (S21). For example, the face authentication device 100 receives a face registration request via the network N from the server 200, which received the user registration request from the user terminal 300. Note that the face authentication device 100 is not limited to this and may receive a face registration request directly from the user terminal 300.
  • The face detection unit 120 detects a face area included in the registered image (S22).
  • the feature point extraction unit 130 extracts feature points from the face area detected in step S22, and supplies face feature information to the registration unit 140 (S23).
  • the registration unit 140 issues the user ID 111, associates the user ID 111 with the facial feature information 112, and registers them in the facial information DB 110 (S24).
  • The face authentication device 100 may receive the facial feature information 112 from the face registration requester and register it in the face information DB 110 in association with the user ID 111.
  • FIG. 6 is a flowchart showing the flow of face authentication processing according to the second embodiment.
  • the feature point extraction unit 130 acquires facial feature information for authentication (S31).
  • For example, the face authentication device 100 receives a face authentication request from the server 200 via the network N and extracts facial feature information from the captured image included in the face authentication request, in the same manner as in steps S21 to S23.
  • Alternatively, the face authentication device 100 may receive the facial feature information itself from the server 200.
  • the authentication unit 150 collates the acquired facial feature information with the facial feature information 112 of the facial information DB 110 (S32).
  • If there is matching facial feature information (Yes in S33), the authentication unit 150 identifies the user ID 111 of the user whose facial feature information matched (S34). Then, the authentication unit 150 returns, as the face authentication result, the fact that the face authentication succeeded together with the identified user ID 111 to the server 200 (S35). If there is no matching facial feature information (No in S33), the authentication unit 150 returns a face authentication result indicating that the face authentication failed to the server 200 (S36).
  • FIG. 7 is a block diagram showing the configuration of the user terminal 300 according to the second embodiment.
  • The user terminal 300 includes a camera 310, a storage unit 320, a communication unit 330, a display unit 340, an input unit 350, and a control unit 360.
  • the camera 310 is an imaging device that performs imaging under the control of the control unit 360 .
  • the storage unit 320 is a storage device that stores programs for realizing each function of the user terminal 300 .
  • a communication unit 330 is a communication interface with the network N.
  • the display unit 340 is a display device.
  • the input unit 350 is an input device that receives input from the user.
  • the display unit 340 and the input unit 350 may be configured integrally like a touch panel.
  • the control unit 360 controls hardware of the user terminal 300 .
  • FIG. 8 is a block diagram showing the configuration of the server 200 according to the second embodiment.
  • The server 200 includes a storage unit 210, a memory 220, a communication unit 230, and a control unit 240.
  • The storage unit 210 is a storage device such as a hard disk or flash memory.
  • The storage unit 210 stores a program 211, a user database (user DB) 212, and an access point database (APDB) 213.
  • the program 211 is a computer program in which the processing of the information processing method according to the second embodiment is implemented.
  • The user DB 212 stores information about users. Specifically, the user DB 212 stores user information 2122, position information 2123, and appearance information 2124 in association with the user ID 2121.
  • a user ID 2121 is a user ID issued by the face authentication device 100 when face information is registered.
  • User information 2122 may include, for example, user name, employee number, cell phone number, email address, attribute information, or schedule-related information for the user.
  • the attribute information may include at least one of gender, job title, and department name.
  • The schedule-related information may be the schedule itself, or information for accessing a scheduler running on the user terminal 300 or in the cloud.
  • the position information 2123 is the user's position information estimated by the position estimation unit 244, which will be described later.
  • Appearance information 2124 is a photographed image for face authentication, but may also include appearance data.
  • the APDB 213 stores information about the AP 400. Specifically, the APDB 213 associates and stores an APID 2131 that identifies the AP 400 and location information 2132 where the AP 400 is installed.
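One way to picture the two tables is the following SQLite sketch; the column names and types are assumptions and are not part of the disclosure.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE user_db (            -- user DB 212
    user_id    TEXT PRIMARY KEY,  -- user ID 2121, issued by the face authentication device 100
    user_info  TEXT,              -- 2122: name, employee number, attributes, schedule-related info
    position   TEXT,              -- 2123: estimated position, e.g. 'area A'
    appearance BLOB               -- 2124: photographed image for face authentication
);
CREATE TABLE ap_db (              -- APDB 213
    ap_id    TEXT PRIMARY KEY,    -- APID 2131 identifying the AP 400
    position TEXT                 -- 2132: installation location of the AP 400
);
""")
```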
  • the memory 220 is a volatile storage device such as RAM (Random Access Memory), and is a storage area for temporarily holding information when the control unit 240 operates.
  • The communication unit 230 is a communication interface with the network N.
  • the control unit 240 is a processor that controls each component of the server 200, that is, a control device.
  • the control unit 240 loads the program 211 from the storage unit 210 into the memory 220 and executes the program 211 .
  • The control unit 240 realizes the functions of the registration unit 241, the image acquisition unit 242, the authentication control unit 243, the position estimation unit 244, the generation unit 245, and the output control unit 246.
  • The registration unit 241 is also called registration means. Upon receiving a user registration request including a registration image from the user terminal 300, the registration unit 241 transmits a face registration request to the face authentication device 100. When the face authentication device 100 registers face information and issues a user ID, the registration unit 241 registers the user ID in the user DB 212. The registration unit 241 also registers the user information of the user in the user DB 212 in association with the user ID of the user who uses the user terminal 300. Since the user ID is associated with the face information in the face authentication device 100, the registration unit 241 thereby registers each user's user information in association with that user's face information via the user ID.
  • the image acquisition unit 242 is also called image acquisition means.
  • The image acquisition unit 242 receives a captured image for face authentication from the user terminal 300 via the AP 400 and supplies the image to the authentication control unit 243.
  • the authentication control unit 243 is an example of the authentication control unit 13 described above.
  • the authentication control unit 243 controls face authentication for the face area of the user U included in the captured image, and identifies the user. That is, the authentication control unit 243 causes the face authentication device 100 to perform face authentication on the captured image acquired from the user terminal 300 .
  • the authentication control unit 243 transmits a face authentication request including the acquired photographed image to the face authentication device 100 via the network N.
  • the authentication control unit 243 may extract the face area of the user U from the captured image and include the extracted image in the face authentication request.
  • the authentication control unit 243 may also extract facial feature information from the face area and include the facial feature information in the face authentication request.
  • The authentication control unit 243 then receives the face authentication result from the face authentication device 100. Thereby, the authentication control unit 243 identifies the user ID of the user.
  • the position estimator 244 is an example of the position estimator 14 described above.
  • The position estimation unit 244 estimates the position of the user terminal 300.
  • Methods for estimating the position of the user terminal 300 include, for example, the following (1) to (3).
  • (1) The position estimation unit 244 identifies the AP 400 via which the captured image was received, and looks up the position information of that AP 400 in the APDB 213. The position estimation unit 244 then estimates the position information of the AP 400 as the position information of the user terminal 300 used by the user specified by the authentication control unit 243.
  • (2) The position estimation unit 244 estimates the distance between the AP 400 and the user terminal 300, for example from the received radio wave intensity, and then estimates the position of the user terminal 300 based on the position information of the AP 400 and that distance.
  • (3) When GPS information is acquired from the user terminal 300, the position estimation unit 244 estimates the GPS information as the position information of the user terminal 300. (A sketch of these three methods follows.)
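A compact sketch of methods (1) to (3), assuming a log-distance path-loss model to turn received radio wave intensity into a distance in method (2); the calibration constants and all names are illustrative assumptions, not values from the disclosure.

```python
def estimate_terminal_position(ap_position, rssi_dbm=None, gps=None):
    """Methods (1)-(3): AP position, AP position plus RSSI-derived distance, or GPS."""
    if gps is not None:                            # (3) GPS information from the terminal
        return gps
    if rssi_dbm is not None:                       # (2) distance from received radio wave intensity
        tx_power_dbm, path_loss_exp = -40.0, 2.0   # assumed calibration values
        distance_m = 10 ** ((tx_power_dbm - rssi_dbm) / (10 * path_loss_exp))
        return {"ap": ap_position, "distance_m": round(distance_m, 1)}
    return ap_position                             # (1) AP position stands in for the terminal
```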
  • the position estimation unit 244 estimates the user's position based on the position of the user terminal 300 .
  • the position estimation unit 244 may estimate the position of the user terminal 300 as the user's position.
  • When there are a plurality of user terminals 300 used by the user, the position estimation unit 244 may estimate the position of the user terminal 300 that most recently performed face authentication, or the position of the currently active user terminal 300, as the position of the user.
  • A case where there are a plurality of user terminals 300 used by the user may mean, for example, that the user has performed face authentication on a plurality of user terminals 300 within a predetermined period.
  • Further, the position estimation unit 244 may determine which user terminal 300's position to use as the user's position based on the type of the user terminal 300. For example, if the same user has performed face authentication on both a PC and a smartphone within a predetermined period, the position estimation unit 244 may use the position of the PC as the user's position. However, if the PC is located outside the target space TS, the position estimation unit 244 sets the position of the smartphone as the user's position.
  • In this way, the position of a user terminal 300 located inside the target space TS may be used as the location of the user (see the sketch below).
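Here is one hypothetical way to encode that terminal-selection rule (PC preferred, but only terminals inside the target space TS are eligible); the record fields are invented for illustration.

```python
def choose_terminal(terminals: list[dict]) -> dict:
    """Pick the terminal whose position represents the user."""
    inside = [t for t in terminals if t["in_target_space"]]
    candidates = inside or terminals          # prefer terminals inside the target space TS
    pcs = [t for t in candidates if t["type"] == "PC"]
    pool = pcs or candidates                  # prefer a PC when one qualifies
    return max(pool, key=lambda t: t["last_auth_time"])  # newest face authentication wins

# Example: the PC is outside the TS, so the smartphone's position is used.
terminals = [
    {"type": "PC", "in_target_space": False, "last_auth_time": 100},
    {"type": "smartphone", "in_target_space": True, "last_auth_time": 90},
]
print(choose_terminal(terminals)["type"])  # -> smartphone
```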
  • The generation unit 245 is also called generation means.
  • The generation unit 245 generates appearance information including a captured image for face authentication of the user, and stores it in the user DB 212 in association with the user ID 2121. The appearance information may be the photographed image for face authentication itself.
  • the generating unit 245 also generates output information in which at least the user's appearance information stored in the user DB 212 and the user's location information are associated with each other.
  • the output information is a map indicating the target space TS.
  • the generation unit 245 generates, as output information, a map representing the target space TS, in which the user's appearance information is superimposed on the position corresponding to the user's position information. By superimposing the appearance information on the map, it becomes easier for the second user viewing the map to find the target user (first user) in the physical space.
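As a sketch of what the generation unit 245 might do here, the following uses Pillow to paste each user's face icon onto a floor-map image; the area-to-pixel mapping and the record fields are assumptions made for illustration.

```python
from PIL import Image

# Assumed mapping from an estimated area to pixel coordinates on the map image.
AREA_PIXELS = {"area A": (80, 120), "area B": (240, 120), "area C": (400, 120)}

def render_map(floor_map_path: str, users: list[dict]) -> Image.Image:
    """Superimpose appearance information (face icons) at the users' estimated positions."""
    floor_map = Image.open(floor_map_path).convert("RGBA")
    for user in users:
        icon = Image.open(user["appearance_image"]).resize((48, 48))
        floor_map.paste(icon, AREA_PIXELS[user["position"]])
    return floor_map
```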
  • the output control unit 246 is an example of the output control unit 16 described above.
  • The output control unit 246 transmits the output information to the display device 500 and causes the display unit of the display device 500 to display the output information.
  • FIG. 9 is a sequence diagram showing the flow of user registration processing according to the second embodiment.
  • the user terminal 300 takes a picture of the user (S500), and transmits a user registration request including the registration image generated by the picture to the server 200 (S501).
  • the registration unit 241 of the server 200 includes the registration image included in the received user registration request in the face registration request and transmits the face registration request to the face authentication device 100 (S502).
  • the face authentication device 100 registers face information (face feature information) of the user U based on the registration image included in the received face registration request (S503).
  • the face authentication device 100 notifies the server 200 of the issued user ID (S504).
  • the user terminal 300 accepts input of user information from the user and transmits the user information to the server 200 (S505).
  • the user information transmitted here includes, for example, the user name, attribute information, and schedule-related information.
  • the registration unit 241 of the server 200 associates the notified user ID and user information with each other and registers them in the user DB 212 (S506).
  • FIG. 10 is a sequence diagram showing the flow of position output processing according to the second embodiment.
  • the user terminal 300 takes an image of the user (S510), and transmits the taken image to the server 200 via the AP 400 to which it is connected (S511).
  • the image acquisition unit 242 of the server 200 acquires the captured image of the user.
  • the authentication control unit 243 of the server 200 transmits a face authentication request for the face area of the user U in the captured image to the face authentication device 100 (S512).
  • the face authentication device 100 performs face authentication on the face area of the user U in the captured image included in the received face authentication request (S513).
  • the face authentication device 100 transmits to the server 200 a face authentication result including the success of the face authentication and the user ID (S514).
  • the authentication control unit 243 of the server 200 identifies the user based on the user ID included in the face authentication result.
  • the authentication control unit 243 transmits the face authentication result to the user terminal 300 via the AP 400 (S515). As a result, the user terminal 300 wakes up or transitions from the sleep state to the normal state.
  • The position estimation unit 244 of the server 200 refers to the APDB 213, specifies the position information of the AP 400 associated with the APID of the AP 400, and estimates the position of the user terminal 300 based on the position information of the AP 400 (S516). The position estimation unit 244 then estimates the position of the user based on the position of the user terminal 300 (S517), and stores the position information of the user in the user DB 212 in association with the user ID.
  • the generating unit 245 of the server 200 generates a map that associates the user's appearance information including the photographed image for face authentication with the user's position information (S518). Then, the output control unit 246 of the server 200 transmits the map to the display device 500 via the AP 400 (S519), and causes the display unit of the display device 500 to display the map (S520).
  • FIG. 11 is a diagram showing an example of a display image 900 of the display device 500 according to the second embodiment.
  • the target space TS includes area A, area B, and area C, and AP 400 is installed in each area.
  • Icons I_1 and I_2 of captured images for face authentication of users using user terminals 300 connected to the AP 400 installed in area A are superimposed at the position of area A in the display image 900 shown in FIG. 11. Also, at the position of area B of the display image 900, an icon I_3 of a photographed image for face authentication of the user using the user terminal 300 connected to the AP 400 installed in area B is superimposed.
  • the display image 900 also shows the position where the display device 500 is installed as the current position.
  • In the example of FIG. 11, one icon of a captured image for face authentication is displayed for each user, but a plurality of icons may be displayed for each user.
  • The plurality of icons may include a plurality of photographed images for face authentication; a photographed image of the user's clothes, shoes, hairstyle, or back view taken that day; or an illustration image representing the user's appearance on that day.
  • In this way, the server 200 uses the location information of users whose face authentication has succeeded, so impersonation of a user by others can be prevented. Further, in the example of FIG. 11, the server 200 causes the display device 500 to display the captured image of each user in association with the position information. Therefore, other users who view the output information can grasp the features of the target user's appearance on that day as well as the target user's position. This allows the server 200 to suitably assist other users in finding the target user within the target space.
  • Since the captured image for face authentication is taken with the camera 310 of the user terminal 300, there is no need to install a new camera for face authentication. Also, when face authentication is performed at PC startup or on waking from sleep, the camera attached to the PC can photograph the face of the user sitting in front of the PC from the front. When the user is photographed from the front with the camera 310 in this manner, a high-quality captured image suitable for face authentication can be obtained.
  • Since the server 200 estimates the user's position based on the position of the AP 400 or on the GPS information of the user terminal 300, there is no need to install or introduce new equipment such as dedicated transmitters and receivers.
  • Embodiment 2 can be modified as follows.
  • the appearance information included in the output information may be appearance data generated from a photographed image for face authentication instead of the photographed image for face authentication.
  • the output information may include a pre-registered registered image of the user.
  • the registered image may be a registered image for face authentication, a thumbnail image appropriately set by the user, or a face image on an employee ID card.
  • FIG. 12 is a diagram showing an example of a display image 900 of the display device 500 according to the second embodiment.
  • Icons R_1 and R_2 of registered images of users using the user terminals 300 connected to the AP 400 installed in area A are superimposed at the position of area A in the display image 900 shown in FIG. 12.
  • In addition, appearance data O_1 and O_2 generated from the users' photographed images for face authentication are displayed in association with icons R_1 and R_2.
  • the appearance data O_1 indicates that the person is wearing "red clothes”
  • the appearance data O_2 indicates that the person is wearing "glasses”.
  • At the position of area B, an icon R_3 of the user using the user terminal 300 connected to the AP 400 installed in area B and appearance data O_3 are superimposed.
  • the output information may be information in which the location information and appearance information of the first user are further associated with the user information. That is, the generation unit 245 may generate output information including position information and appearance information of the first user, and user information. Also, the output information may be information in which the location information and appearance information of the first user are further associated with the schedule-related information. That is, the generation unit 245 may generate output information in which the first user's location information and appearance information are associated with the user's schedule information.
  • The generation unit 245 may obtain the user's schedule information by extracting it from the schedule-related information in the user information 2122, or by accessing the scheduler based on the schedule-related information.
  • FIG. 13 is a diagram showing an example of a display image 900 of the display device 500 according to the second embodiment.
  • Icons I_1 and I_2 of captured images for face authentication of users using the user terminals 300 connected to the AP 400 installed in area A are superimposed at the position of area A in the display image 900 shown in FIG. 13.
  • the display image 900 includes user information U of users located in the area A.
  • In this example, the user corresponding to icon I_1 belongs to the "First Engineering Department" and is named "Nichiden Taro".
  • the user information U may also include input information input by a user operation.
  • For example, the user can input information about his or her appearance that day, and the user terminal 300 that receives the input may transmit it to the server 200 to be registered in the user DB 212 as that user's user information.
  • the user information U shown in FIG. 13 includes information “I am wearing red clothes” as user input information corresponding to the icon I_1.
  • the user may also input other input information, and the user terminal 300 that has received the input may transmit the input information to the server 200 and register it as the user's user information.
  • the user information U shown in FIG. 13 includes information "I am next to a round chair" as user input information corresponding to the icon I_3.
  • Input information registered in the user DB 212 as user information may be given an expiration time, such as one hour or one day, and deleted after the expiration time passes (a sketch follows).
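A minimal sketch of such expiry handling, assuming the input information is held as a dict keyed by user ID with a registration timestamp (field names are assumptions):

```python
import time

TTL_SECONDS = 60 * 60  # one hour, one of the expiry examples given above

def purge_expired(input_info: dict[str, dict]) -> None:
    """Delete user-entered appearance notes whose expiry has passed."""
    now = time.time()
    expired = [uid for uid, rec in input_info.items()
               if now - rec["registered_at"] > TTL_SECONDS]
    for uid in expired:
        del input_info[uid]
```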
  • When the viewer of the display device 500 selects the icon I_1 of a photographed image for face authentication, other information about that user may be displayed.
  • Other information about the user may be user information or schedule information.
  • Other information about the user may be another photographed image of the user for face authentication; a photographed image of the user's clothes, shoes, hairstyle, or back view taken that day; or an illustration image expressing the features of the user's appearance on that day. If the display device 500 includes a touch panel, selecting may be done by tapping.
  • the display device 500 may display the location of the user associated with the specific location. For example, when the display device 500 is installed in a room of a certain department, the display device 500 may display the whereabouts of the members of that department.
  • FIG. 14 is a diagram showing an example of a display image 900 of the display device 500 according to the second embodiment.
  • The display image 900 in FIG. 14 shows a map of the First Engineering Department's room and the locations of members belonging to that department.
  • The display device 500 displays icons I_1 to I_3 of the members' captured images for face authentication in association with the positions in the map corresponding to those members' locations in the room. In addition, for members who are not in the room, the display device 500 displays which area they are in, together with icons I_4 to I_8 of their captured images for face authentication.
  • The output control unit 246 of the server 200 may output the position of a member whose location is determined to be outside the target space TS as, for example, "remote", distinguishing it from positions within the target space TS.
  • The position estimation unit 244 of the server 200 may determine whether or not a member's position is outside the target space TS from the GPS information of the user terminal 300, or by tracing the network through which the communication passes. The position estimation unit 244 may also make this determination from the user's schedule information. In this case, the display device 500 displays the locations of members outside the target space TS as "remote", as shown in FIG. 14. This makes it easy to identify members who are not in the target space TS.
  • the display device 500 may also display breakdowns such as how many of the affiliated members are in the room, how many are in other areas, and how many are remote.
  • the display device 500 may display the same type of information regardless of whether the selected user is inside or outside the target space TS.
  • However, the present disclosure is not limited to this, and the display device 500 may display different types of information depending on whether the selected user is inside or outside the target space TS. For example, when a user inside the target space TS is selected, the display device 500 may display detailed appearance information and an extension number, and when a user outside the target space TS is selected, it may display a mobile phone number.
  • Embodiment 3 is characterized in that the server 200 has a search function.
  • the server 200 uses a user name or user ID as a key to output output information including position information of the user.
  • When the output control unit 246 receives a search request regarding the position of the first user from the second user terminal 300-2 used by the second user, the output control unit 246 outputs output information including the first user's appearance information together with the position information to the second user terminal 300-2.
  • Also, the server 200 may use an area as a key to output output information for the users in that area.
  • When the generation unit 245 receives such a search request, it refers to the user DB 212, identifies the users whose position information is within the specified area, and generates output information for the identified users.
  • the output control unit 246 then outputs the specified user's output information to the second user terminal 300-2.
  • the server 200 uses a user attribute as a key to output output information of a user having that attribute.
  • When the generation unit 245 receives from the second user terminal 300-2 a search request regarding the positions of users belonging to a predetermined department, the generation unit 245 refers to the user DB 212 and identifies the users belonging to that department within the target space TS. The generation unit 245 then generates output information for the identified users, and the output control unit 246 outputs it to the second user terminal 300-2. (A sketch of such a search follows.)
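The three search keys (user name, area, attribute) could be combined in a single filter over the user DB, as in this sketch; the record fields are assumptions made for illustration.

```python
def search_users(user_db: list[dict], *, name: str | None = None,
                 area: str | None = None, department: str | None = None) -> list[dict]:
    """Filter by whichever key the second user supplied, then build output information."""
    hits = user_db
    if name is not None:
        hits = [u for u in hits if u["name"] == name]
    if area is not None:
        hits = [u for u in hits if u["position"] == area]
    if department is not None:
        hits = [u for u in hits if u["department"] == department]
    # Output information pairs each hit's position with its appearance information.
    return [{"position": u["position"], "appearance": u["appearance"]} for u in hits]
```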
  • FIG. 15 is a sequence diagram showing the flow of position output processing according to the third embodiment.
  • FIG. 15 shows a sequence when the second user searches for the position of the first user using the user name of the first user as a key.
  • second user terminal 300-2 used by the second user transmits a search request including the user name of the first user to server 200 (S531).
  • The second user terminal 300-2 transmits the search request to the server 200 via the AP 400, but it does not have to be via the AP 400.
  • The generation unit 245 of the server 200 that has received the search request refers to the user DB 212 and, using the specified user name as a key, searches for the position information and appearance information of the user (first user) corresponding to that user name (S532). Then, the generation unit 245 generates a map in which the location information and appearance information of the first user are associated (S533).
  • the output control unit 246 of the server 200 then transmits the map to the display device 500 via the AP 400 (S534), and causes the display unit of the display device 500 to display the map (S535).
  • In this way, in response to a search by the second user, the server 200 can output where the target user is, who is in a specific area, or where a user having a specific attribute is, in a form that makes the target user easy to find.
  • the present disclosure can implement arbitrary processing by causing a processor to execute a computer program.
  • the program includes instructions (or software code) that, when read into the computer, cause the computer to perform one or more of the functions described in the embodiments.
  • the program may be stored in a non-transitory computer-readable medium or tangible storage medium.
  • Examples of computer-readable media or tangible storage media include random-access memory (RAM), read-only memory (ROM), flash memory, solid-state drives (SSD) or other memory technologies, CD-ROM, digital versatile discs (DVD), Blu-ray discs or other optical disc storage, and magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices.
  • the program may be transmitted on a transitory computer-readable medium or communication medium.
  • transitory computer readable media or communication media include electrical, optical, acoustic, or other forms of propagated signals.
  • the computer mentioned above is composed of a computer system including a personal computer and a word processor.
  • The computer is not limited to this, and may be configured as a LAN (local area network) server, a host for computer (personal computer) communication, a computer system connected to the Internet, or the like. It is also possible to distribute the functions among the devices on a network and configure the computer across the entire network.
  • face authentication is performed when the user terminal 300 is activated or released from the sleep state, but the timing of face authentication is not limited to this.
  • face authentication may be performed at predetermined time intervals, or may be performed a predetermined number of times in a predetermined period.
  • face authentication may be performed when some input operation is performed. Spoofing can be further prevented when face authentication is performed multiple times a day.
  • The generation unit 245 may update the appearance information each time face authentication is performed, or may set appearance information based on a photographed image taken at a certain time of day (for example, at the beginning of the day).
  • the generation unit 245 may update the appearance information when a predetermined time has passed since the most recent face authentication. When the appearance information is updated, there is an effect that it is easy to find the user even when the clothing or hairstyle changes or the mask is removed. Further, the generation unit 245 may set which timing of the appearance information is to be used for the output information by the user's selection operation.
  • the face authentication device 100 has the face authentication function, but instead of or in addition to the face authentication device 100, the server 200 may have the face authentication function.
  • (Appendix 1) An information processing device comprising: authentication control means for controlling biometric authentication based on a photographed image of a first user; position estimation means for estimating, when the biometric authentication is successful, the position of the first user in a target space based on position information of a first user terminal used by the first user; and output control means for outputting output information in which the position of the first user is associated with appearance information including at least one of the photographed image of the first user and appearance data generated based on the photographed image.
  • (Appendix 2) The information processing device according to Appendix 1, wherein the biometric authentication is face authentication.
  • (Appendix 3) The information processing device according to Appendix 1 or 2, wherein the position estimation means estimates the position information of the first user terminal based on position information of an access point to which the first user terminal is connected.
  • (Appendix 4) The information processing device according to any one of Appendices 1 to 3, wherein the output information further associates the position and appearance information of the first user with user information related to the first user.
  • (Appendix 5) The information processing device according to any one of Appendices 1 to 4, further comprising generation means for generating, as the output information, a map representing the target space in which the appearance information of the first user is superimposed at a position corresponding to the position of the first user.
  • (Appendix 6) The information processing device according to any one of Appendices 1 to 5, wherein the output control means causes a display device installed in the target space to display the output information.
  • (Appendix 7) The information processing device according to any one of Appendices 1 to 5, wherein the output control means outputs the output information of the first user to a second user terminal when receiving a search request regarding the position of the first user from the second user terminal used by a second user.
  • (Appendix 8) The information processing device according to any one of Appendices 1 to 7, wherein the output control means outputs the position of the first user in a manner distinguished from the target space when determining that the position of the first user is outside the target space.
  • (Appendix 9) An information processing system comprising: a biometric authentication device that performs biometric authentication based on a photographed image of a first user; and an information processing device, wherein the information processing device comprises: authentication control means for acquiring a result of the biometric authentication from the biometric authentication device; position estimation means for estimating, when the biometric authentication is successful, the position of the first user in a target space based on position information of a first user terminal used by the first user; and output control means for outputting output information in which the position of the first user is associated with appearance information including at least one of the photographed image of the first user and appearance data generated based on the photographed image.
  • (Appendix 10) The information processing system according to Appendix 9, further comprising a display device, wherein the output control means outputs the output information to the display device.
  • (Appendix 11) An information processing method comprising: controlling biometric authentication based on a photographed image of a first user; estimating, when the biometric authentication is successful, the position of the first user in a target space based on position information of a first user terminal used by the first user; and outputting output information in which the position of the first user is associated with appearance information including at least one of the photographed image of the first user and appearance data generated based on the photographed image.
  • (Appendix 12) A non-transitory computer-readable medium storing a program for causing a computer to execute: a procedure for controlling biometric authentication based on a photographed image of a first user; a procedure for estimating, when the biometric authentication is successful, the position of the first user in a target space based on position information of a first user terminal used by the first user; and a procedure for outputting output information in which the position of the first user is associated with appearance information including at least one of the photographed image of the first user and appearance data generated based on the photographed image.


Abstract

An information processing device (10) includes an authentication control unit (13) for controlling biometric authentication based on a photographed image of a first user, a position estimation unit (14) that, when the biometric authentication is successful, estimates the position of the first user in a target space based on position information of a first user terminal used by the first user, and an output control unit (16) that outputs output information in which the position of the first user is associated with external appearance information including at least one of the photographed image of the first user and data relating to external appearance generated based on the photographed image.

Description

Information processing device, information processing system, information processing method, and non-transitory computer-readable medium

TECHNICAL FIELD
The present disclosure relates to an information processing device, an information processing system, an information processing method, and a non-transitory computer-readable medium, and more particularly to an information processing device, an information processing system, an information processing method, and a non-transitory computer-readable medium that output the location of a user.
In recent years, technologies have been proposed for ascertaining the locations of employees. For example, Patent Literature 1 discloses a position estimation system that estimates the position of a mobile object from the state in which a sensor terminal is carried by the mobile object and the position of the sensor terminal.

Further, for example, Patent Literature 2 discloses a communication support system that detects the location of each of a plurality of employees based on information about the wireless AP to which each user's mobile terminal is connected, and provides terminals of other users with data of a screen that displays the locations together with the employees' thumbnail images.

Patent Literature 1: JP 2008-294593 A
Patent Literature 2: JP 2018-032294 A
However, the method described in Patent Literature 1 has the problem that, even if the location of the target person is known, it is difficult to actually find that person when there are multiple people on the same floor.

Also, with the method described in Patent Literature 2, even if an employee's location is known, it takes time to actually find the employee unless the user's appearance on that day is known. In particular, it becomes increasingly difficult to find an employee when their clothes or hairstyle have changed.

In view of the above problems, an object of the present disclosure is to provide an information processing device, an information processing system, an information processing method, and a non-transitory computer-readable medium that favorably support finding a user in a space.
An information processing device according to one aspect of the present disclosure includes:
authentication control means for controlling biometric authentication based on a photographed image of a first user;
position estimation means for estimating, when the biometric authentication is successful, the position of the first user in a target space based on position information of a first user terminal used by the first user; and
output control means for outputting output information in which the position of the first user is associated with appearance information including at least one of the photographed image of the first user and data relating to appearance generated based on the photographed image.
An information processing system according to one aspect of the present disclosure includes:
a biometric authentication device that performs biometric authentication based on a photographed image of a first user; and
an information processing device,
wherein the information processing device includes:
authentication control means for acquiring a result of the biometric authentication from the biometric authentication device;
position estimation means for estimating, when the biometric authentication is successful, the position of the first user in a target space based on position information of a first user terminal used by the first user; and
output control means for outputting output information in which the position of the first user is associated with appearance information including at least one of the photographed image of the first user and data relating to appearance generated based on the photographed image.
An information processing method according to one aspect of the present disclosure includes:
controlling biometric authentication based on a photographed image of a first user;
estimating, when the biometric authentication is successful, the position of the first user in a target space based on position information of a first user terminal used by the first user; and
outputting output information in which the position of the first user is associated with appearance information including at least one of the photographed image of the first user and data relating to appearance generated based on the photographed image.
A non-transitory computer-readable medium according to one aspect of the present disclosure stores a program for causing a computer to execute:
a procedure for controlling biometric authentication based on a photographed image of a first user;
a procedure for estimating, when the biometric authentication is successful, the position of the first user in a target space based on position information of a first user terminal used by the first user; and
a procedure for outputting output information in which the position of the first user is associated with appearance information including at least one of the photographed image of the first user and data relating to appearance generated based on the photographed image.
According to the present disclosure, it is possible to provide an information processing device, an information processing system, an information processing method, and a non-transitory computer-readable medium that favorably support finding a user in a space.
Fig. 1 is a block diagram showing the configuration of an information processing apparatus according to a first embodiment.
Fig. 2 is a flowchart showing the flow of an information processing method according to the first embodiment.
Fig. 3 is a block diagram showing the overall configuration of an information processing system according to a second embodiment.
Fig. 4 is a block diagram showing the configuration of a face authentication device according to the second embodiment.
Fig. 5 is a flowchart showing the flow of face information registration processing according to the second embodiment.
Fig. 6 is a flowchart showing the flow of face authentication processing according to the second embodiment.
Fig. 7 is a block diagram showing the configuration of a user terminal according to the second embodiment.
Fig. 8 is a block diagram showing the configuration of a server according to the second embodiment.
Fig. 9 is a sequence diagram showing the flow of user registration processing according to the second embodiment.
Fig. 10 is a sequence diagram showing the flow of position output processing according to the second embodiment.
Figs. 11 to 14 are diagrams each showing an example of a display image of the display device according to the second embodiment.
Fig. 15 is a sequence diagram showing the flow of position output processing according to a third embodiment.
Hereinafter, embodiments of the present disclosure will be described in detail with reference to the drawings. In the drawings, identical or corresponding elements are given the same reference signs, and redundant description is omitted as necessary for clarity.
<Problem of the Embodiments>
Here, the problem addressed by the embodiments is explained again.
An increasing number of companies are adopting an office style in which employees do not have their own desks, the so-called free-address system. In such a free-address office, employees can carry out their duties anywhere in the office, which makes it difficult to ascertain the location of each individual employee. As a result, communication between employees may not proceed smoothly.
Therefore, in recent years, technologies for ascertaining the locations of employees have been proposed. For example, there is a method in which an employee carries a tag that emits a beacon, and the location of the receiver that received the beacon is taken as the employee's location. As another example, when an employee holds an employee ID card with an embedded IC chip over a reader, the installation location of the reader is taken as the employee's location.

However, with the tag-based method described above, there is a possibility of spoofing because someone else can carry the employee's tag around, and beacon receivers must be installed. With the method using an employee ID card, there is still a possibility of spoofing because someone else can use the employee's ID card. Moreover, if the installation location of the reader is taken as the employee's location, the error in the location is large, because the employee usually works at a place away from the reader.

Here, the method described in Patent Literature 1 mentioned above has been proposed. Patent Literature 1 describes that the carrying state may be detected when personal authentication is successful. However, as described above, there is the problem that it is difficult to find the target person in reality.

The method described in Patent Literature 2 mentioned above has also been proposed, but even if an employee's location is known, it takes time to actually find the employee unless the user's appearance on that day is known. In particular, it becomes increasingly difficult to find an employee when their clothes or hairstyle have changed.

The above issues apply not only to free-address offices but also to any other space where multiple people stay and can move around inside. The present embodiments have been made to solve such problems.
<Embodiment 1>
First, Embodiment 1 of the present disclosure will be described. Fig. 1 is a block diagram showing the configuration of an information processing apparatus 10 according to Embodiment 1. The information processing apparatus 10 is a computer device that assists another user (sometimes called a second user) in finding a target user (sometimes called a first user) in a target space. The information processing apparatus 10 is connected to a network (not shown). The network may be wired or wireless. A first user terminal (not shown) used by the first user and a display device (not shown) are connected to the network. The information processing apparatus 10 includes an authentication control unit 13, a position estimation unit 14, and an output control unit 16.
The authentication control unit 13 is also called authentication control means. The authentication control unit 13 controls biometric authentication based on the photographed image of the first user. The biometric authentication is iris authentication, face authentication, hand geometry authentication, or any other biometric authentication performed based on a photographed image of the user. Through this authentication, the authentication control unit 13 identifies the user.
The position estimation unit 14 is also called position estimation means. When the biometric authentication is successful, the position estimation unit 14 estimates the position information of the first user terminal within the target space. The target space is a predetermined space in which a plurality of people can stay and within which each person can move. The position estimation unit 14 then estimates the position of the first user in the target space based on the position information of the first user terminal.
The output control unit 16 is also called output control means. The output control unit 16 outputs output information, in which the position of the first user terminal is associated with the appearance information of the first user, to another device such as a display device or a second user terminal used by the second user. The appearance information of the first user is information that indicates or suggests the characteristics or wearing state of items the first user has on; such items may be clothing, a hairstyle, accessories, a mask, glasses, and so on. The appearance information of the first user includes at least one of a photographed image generated by photographing the first user and data relating to appearance generated based on the photographed image. The data relating to appearance is also called appearance data and is, for example, data indicating the characteristics or wearing state of the items mentioned above; as an example, the appearance data is text data or an illustration image representing those characteristics or that wearing state.
Fig. 2 is a flowchart showing the flow of the information processing method according to Embodiment 1. First, the authentication control unit 13 of the information processing apparatus 10 controls biometric authentication based on the photographed image of the first user (S10). Controlling biometric authentication based on a photographed image may mean performing biometric authentication based on the photographed image or on feature information extracted from it, or it may mean transmitting the photographed image or the extracted feature information to a biometric authentication device (not shown) and acquiring the authentication result from that device.

Next, the authentication control unit 13 determines whether the biometric authentication has succeeded (S11). Success of the biometric authentication may mean that the degree of matching between the feature information extracted from the photographed image of the first user and pre-registered feature information of the user is equal to or greater than a predetermined value.

If the biometric authentication succeeds (Yes in S11), the position estimation unit 14 estimates the position of the first user terminal within the target space (S12). For example, the position estimation unit 14 may acquire the position of the access point (AP) used by the first user terminal, or GPS (Global Positioning System) information of the first user terminal, and take the AP position or the GPS information as the estimated position of the first user terminal. The position estimation unit 14 may also estimate the position of the first user terminal based on the position of the AP used by the first user terminal and the received radio wave intensity from the first user terminal.

Next, the position estimation unit 14 estimates the position of the first user in the target space based on the position information of the first user terminal (S13). For example, the position estimation unit 14 may take the position information of the first user terminal as the position of the first user in the target space, or may estimate that the first user is within a predetermined distance of the position of the first user terminal.

Next, the output control unit 16 outputs, to another device, output information in which the position information of the first user is associated with appearance information such as the photographed image of the first user (S14).

On the other hand, if the biometric authentication fails (No in S11), the information processing apparatus 10 ends the processing. That is, the output control unit 16 does not output the position information of a user who failed biometric authentication to another device as the position information of the first user.
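For illustration only, the following is a minimal Python sketch of the S10 to S14 flow of Fig. 2 described above. The matcher, the positioning helper, and the threshold value are hypothetical stand-ins introduced for this sketch; the disclosure does not prescribe any particular implementation.

    from dataclasses import dataclass
    from typing import Optional, Tuple

    MATCH_THRESHOLD = 0.8  # assumed "predetermined value" for the degree of matching

    @dataclass
    class OutputInfo:
        position: Tuple[float, float]  # estimated position in the target space
        appearance: bytes              # e.g. the photographed image used for authentication

    def verify_face(photographed_image: bytes) -> float:
        """Hypothetical stand-in for the biometric matcher; returns a degree of matching."""
        return 0.9

    def terminal_position() -> Tuple[float, float]:
        """Hypothetical stand-in for terminal positioning (AP position or GPS)."""
        return (12.0, 34.0)

    def locate_user(photographed_image: bytes) -> Optional[OutputInfo]:
        # S10/S11: control biometric authentication and check the result
        if verify_face(photographed_image) < MATCH_THRESHOLD:
            return None  # a failed user's position is not output
        # S12/S13: estimate the terminal position and take it as the user position
        user_position = terminal_position()
        # S14: associate the position with the appearance information
        return OutputInfo(position=user_position, appearance=photographed_image)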
As described above, according to Embodiment 1, the information processing apparatus 10 uses the position information of a first user who has succeeded in biometric authentication, and can therefore prevent impersonation of the first user by others. In addition, the information processing apparatus 10 outputs appearance information, such as the photographed image of the first user, in association with the position information of the first user. The second user who receives the output information can therefore grasp not only the first user's position but also the characteristics of the first user's appearance on that day. In this way, the information processing apparatus 10 can suitably assist the second user in finding the first user in the actual target space.
<Embodiment 2>
Next, Embodiment 2 of the present disclosure will be described. Fig. 3 is a block diagram showing the overall configuration of an information processing system 1000 according to Embodiment 2. The information processing system 1000 is a computer system that assists a second user in finding a first user in a target space TS. In Embodiment 2, as an example, the target space TS is the premises of a target company, and the premises include one or more floors. The target space TS is not limited to this, however, and may be a shared office used by multiple companies or the grounds of a school. Also in Embodiment 2, as an example, the biometric authentication is face authentication.
The information processing system 1000 includes a face authentication device 100, an information processing device (hereinafter referred to as a server) 200, a plurality of user terminals 300-1, 300-2, and 300-3, a plurality of APs 400-1 and 400-2, and a display device 500. The face authentication device 100, the server 200, the APs 400, and the display device 500 are connected to one another via a network N. The network N is a wired or wireless communication line, for example an intranet, and may be at least one of a LAN (Local Area Network), a WAN (Wide Area Network), and the Internet, or a combination thereof. The numbers of user terminals and APs are not limited to those shown.
The user terminal 300 is an information terminal used by a user, such as a personal computer, smartphone, or tablet. The user terminal 300 transmits a user registration request to the server 200; as a result, the user's facial feature information is registered and a user ID is issued. The user terminal 300 also transmits user information to the server 200 and causes the server 200 to register it.

The user terminal 300 also requests face authentication when it starts up or wakes from a sleep state. For example, the user terminal 300 photographs at least the user's face from the front and transmits the photographed image, or facial feature information extracted from it, to the face authentication device 100 via the server 200, thereby requesting face authentication. The user terminal 300 may instead transmit the photographed image or facial feature information directly to the face authentication device 100.
When the user terminal 300 is located within the target space TS, it connects over a wireless LAN to one of the APs 400, typically the nearest one, and connects to the network N via that AP 400. In Fig. 3, the user terminals 300-1 and 300-2 connect to the network N via the AP 400-1, and the user terminal 300-3 connects to the network N via the AP 400-2.
The APs 400-1 and 400-2 are wireless access points, each installed in a predetermined area of the target space TS. For example, the APs 400-1 and 400-2 may be installed on different floors within the target company's premises, or in different areas on the same floor.
The face authentication device 100 is a computer device that stores the facial feature information of a plurality of people. It has a face authentication function: in response to a face authentication request received from outside, it collates the face image or facial feature information included in the request with the facial feature information of each user. In Embodiment 2, the face authentication device 100 registers a user's facial feature information at the time of user registration. The face authentication device 100 then acquires the user's photographed image from the user terminal 300 via the server 200, performs face authentication using the face region in the photographed image, and returns the collation result (face authentication result) to the server 200.
The server 200 is an example of the information processing apparatus 10 described above.

When the server 200 receives a user registration request including a registration image from the user terminal 300, it transmits a face registration request to the face authentication device 100, and registers the user information in association with the user ID issued by the face authentication device 100.

When the server 200 receives a photographed image for face authentication, or facial feature information, from the user terminal 300 via an AP 400, it transmits a face authentication request to the face authentication device 100. The server 200 identifies the user from the face authentication result, and estimates the position of the user terminal 300, and hence of the user, based on the position of the AP 400 to which the user terminal 300 is connected. The server 200 then transmits to the display device 500 output information in which appearance information, including the user's photographed image for face authentication, is associated with the user's position.
The display device 500 is a device having a display unit, such as a digital signage display or a tablet terminal. The display device 500 is installed within the target space TS or at a remote location; as an example, it is installed at a floor entrance within the target company's premises. The display device 500 displays the position of each user included in the output information received from the server 200, in association with that user's photographed image for face authentication.
Fig. 4 is a block diagram showing the configuration of the face authentication device 100 according to Embodiment 2. The face authentication device 100 includes a face information database (face information DB) 110, a face detection unit 120, a feature point extraction unit 130, a registration unit 140, and an authentication unit 150. The face information DB 110 stores user IDs 111 in association with the facial feature information 112 of those user IDs. The facial feature information 112 is a set of feature points extracted from a face image and is an example of face information. The face authentication device 100 may delete facial feature information 112 from the face information DB 110 in response to a request from the registered user, or after a certain period has elapsed since registration.
The face detection unit 120 detects the face region included in a registration image used for registering face information, and supplies it to the feature point extraction unit 130. The feature point extraction unit 130 extracts feature points from the face region detected by the face detection unit 120 and supplies the facial feature information to the registration unit 140. The feature point extraction unit 130 also extracts feature points from photographed images received from the server 200 and supplies the facial feature information to the authentication unit 150.
The registration unit 140 issues a new user ID 111 when registering facial feature information, and registers the issued user ID 111 in the face information DB 110 in association with the facial feature information 112 extracted from the registration image. The authentication unit 150 performs face authentication using the facial feature information 112: it collates the facial feature information extracted from a photographed image with the facial feature information 112 in the face information DB 110, and returns to the server 200 whether the facial feature information matches. Whether the information matches corresponds to the success or failure of the authentication; the facial feature information is regarded as matching when the degree of matching is equal to or greater than a predetermined value.
Fig. 5 is a flowchart showing the flow of the face information registration processing according to Embodiment 2. First, the face authentication device 100 acquires the registration image of a user U included in a face registration request (S21). For example, the face authentication device 100 receives the face registration request via the network N from the server 200, which in turn received a user registration request from the user terminal 300; alternatively, it may receive the face registration request directly from the user terminal 300. Next, the face detection unit 120 detects the face region included in the registration image (S22). Next, the feature point extraction unit 130 extracts feature points from the face region detected in step S22 and supplies the facial feature information to the registration unit 140 (S23). Finally, the registration unit 140 issues a user ID 111 and registers it in the face information DB 110 in association with the facial feature information 112 (S24). The face authentication device 100 may instead receive the facial feature information 112 from the face registration requester and register it in the face information DB 110 in association with the user ID 111.
Fig. 6 is a flowchart showing the flow of the face authentication processing according to Embodiment 2. First, the feature point extraction unit 130 acquires facial feature information for authentication (S31). For example, the face authentication device 100 receives a face authentication request from the server 200 via the network N and extracts facial feature information from the photographed image included in the request, as in steps S21 to S23; alternatively, it may receive the facial feature information from the server 200. Next, the authentication unit 150 collates the acquired facial feature information with the facial feature information 112 in the face information DB 110 (S32). If the facial feature information matches, that is, if the degree of matching is equal to or greater than the predetermined value (Yes in S33), the authentication unit 150 identifies the user ID 111 of the user whose facial feature information matched (S34), and returns to the server 200, as the face authentication result, an indication that the face authentication succeeded together with the identified user ID 111 (S35). If no matching facial feature information exists (No in S33), the authentication unit 150 returns to the server 200, as the face authentication result, an indication that the face authentication failed (S36).
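As an illustration of the collation in steps S32 and S33, the following minimal sketch models the facial feature information as fixed-length vectors and the degree of matching as cosine similarity. The threshold value and the vector representation are assumptions; the actual feature extractor is not specified by the disclosure.

    from typing import Dict, Optional
    import numpy as np

    THRESHOLD = 0.7  # assumed "predetermined value" for the degree of matching

    def authenticate(query: np.ndarray, face_db: Dict[str, np.ndarray]) -> Optional[str]:
        """Return the user ID whose registered features match the query, or None (S32-S36)."""
        best_id, best_score = None, -1.0
        for user_id, registered in face_db.items():
            # cosine similarity as the degree of matching
            score = float(np.dot(query, registered) /
                          (np.linalg.norm(query) * np.linalg.norm(registered)))
            if score > best_score:
                best_id, best_score = user_id, score
        return best_id if best_score >= THRESHOLD else None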
Fig. 7 is a block diagram showing the configuration of the user terminal 300 according to Embodiment 2. The user terminal 300 includes a camera 310, a storage unit 320, a communication unit 330, a display unit 340, an input unit 350, and a control unit 360.

The camera 310 is an imaging device that performs photographing under the control of the control unit 360. The storage unit 320 is a storage device that stores programs for realizing the functions of the user terminal 300. The communication unit 330 is a communication interface with the network N. The display unit 340 is a display device, and the input unit 350 is an input device that receives input from the user; the two may be configured integrally, as in a touch panel. The control unit 360 controls the hardware of the user terminal 300.
Fig. 8 is a block diagram showing the configuration of the server 200 according to Embodiment 2. The server 200 includes a storage unit 210, a memory 220, a communication unit 230, and a control unit 240. The storage unit 210 is a storage device such as a hard disk or flash memory, and stores a program 211, a user database (user DB) 212, and an access point database (AP DB) 213. The program 211 is a computer program in which the processing of the information processing method according to Embodiment 2 is implemented.
The user DB 212 stores information about users. Specifically, the user DB 212 stores user information 2122, position information 2123, and appearance information 2124 in association with a user ID 2121. The user ID 2121 is the user ID issued by the face authentication device 100 at the time of face information registration. The user information 2122 may include, for example, the user name, employee number, mobile phone number, e-mail address, attribute information, and schedule-related information. The attribute information may include at least one of gender, job title, and department name. The schedule-related information may be the schedule itself or information for accessing a scheduler running on the user terminal 300 or in the cloud. The position information 2123 is the user's position information estimated by the position estimation unit 244 described later. The appearance information 2124 is the photographed image used for face authentication, but may also include appearance data.

The AP DB 213 stores information about the APs 400. Specifically, the AP DB 213 stores an AP ID 2131 that identifies an AP 400 in association with position information 2132 indicating where that AP 400 is installed.
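A minimal sketch of the two tables held in the storage unit 210 follows, assuming simple in-memory dictionaries; the field names follow the description above, while the concrete types are assumptions for illustration.

    from dataclasses import dataclass
    from typing import Dict, Optional, Tuple

    @dataclass
    class UserRecord:                                    # one entry of the user DB 212
        user_id: str                                     # issued by the face authentication device
        user_info: Dict[str, str]                        # name, employee number, attributes, etc.
        position: Optional[Tuple[float, float]] = None   # estimated by the position estimation unit 244
        appearance: Optional[bytes] = None               # photographed image (and/or appearance data)

    @dataclass
    class ApRecord:                                      # one entry of the AP DB 213
        ap_id: str                                       # identifies the AP 400
        position: Tuple[float, float]                    # where the AP 400 is installed

    user_db: Dict[str, UserRecord] = {}                  # keyed by user ID 2121
    ap_db: Dict[str, ApRecord] = {}                      # keyed by AP ID 2131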
The memory 220 is a volatile storage device such as a RAM (Random Access Memory) and serves as a work area that temporarily holds information while the control unit 240 operates. The communication unit 230 is a communication interface with the network N.
The control unit 240 is a processor, that is, a control device, that controls each component of the server 200. The control unit 240 loads the program 211 from the storage unit 210 into the memory 220 and executes it, thereby realizing the functions of a registration unit 241, an image acquisition unit 242, an authentication control unit 243, a position estimation unit 244, a generation unit 245, and an output control unit 246.
The registration unit 241 is also called registration means. When the registration unit 241 receives a user registration request including a registration image from the user terminal 300, it transmits a face registration request to the face authentication device 100. When the face authentication device 100 registers the face information and issues a user ID, the registration unit 241 registers that user ID in the user DB 212, and also registers the user's user information in the user DB 212 in association with the user ID. Since the user ID is associated with the face information in the face authentication device 100, the registration unit 241 effectively registers, for each user, the user information linked to that user's face information via the user ID.
The image acquisition unit 242 is also called image acquisition means. The image acquisition unit 242 receives a photographed image for face authentication from the user terminal 300 via an AP 400 and supplies it to the authentication control unit 243.
The authentication control unit 243 is an example of the authentication control unit 13 described above. The authentication control unit 243 controls face authentication of the face region of the user U included in the photographed image, and identifies the user. That is, the authentication control unit 243 causes the face authentication device 100 to perform face authentication on the photographed image acquired from the user terminal 300, for example by transmitting a face authentication request including the acquired photographed image to the face authentication device 100 via the network N. The authentication control unit 243 may instead extract the face region of the user U from the photographed image and include the extracted image in the face authentication request, or extract facial feature information from the face region and include that information in the request. The authentication control unit 243 then receives the face authentication result from the face authentication device 100 and thereby identifies the user's user ID.
The position estimation unit 244 is an example of the position estimation unit 14 described above.

First, the position estimation unit 244 estimates the position of the user terminal 300. The position of the user terminal 300 can be estimated, for example, by any of the following methods (1) to (3) (a minimal sketch follows the list).
(1) The position estimation unit 244 identifies the AP 400 via which the photographed image was received, looks up the position information of that AP 400 in the AP DB 213, and takes the AP 400's position information as the position information of the user identified by the authentication control unit 243.
(2) When the position estimation unit 244 has acquired, from each AP 400, the radio wave intensity of the user terminal 300 connected to that AP 400, it calculates the distance between the AP 400 and the user terminal 300 based on the radio wave intensity, and estimates the position of the user terminal 300 from the AP 400's position information and that distance.
(3) When the position estimation unit 244 acquires GPS information from the user terminal 300, it takes the GPS information as the position information of the user terminal 300.
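The sketch below illustrates methods (1) to (3) under stated assumptions: the RSSI-to-distance conversion of method (2) uses a log-distance path-loss model with an assumed reference RSSI and path-loss exponent, and, since a single AP fixes only a distance and not a direction, the terminal is placed at that distance along one axis for simplicity.

    from typing import Optional, Tuple

    RSSI_AT_1M = -40.0   # assumed reference received power at 1 m [dBm]
    PATH_LOSS_N = 2.5    # assumed path-loss exponent for an indoor office

    def rssi_to_distance(rssi_dbm: float) -> float:
        """Log-distance path-loss model: d = 10 ** ((P0 - RSSI) / (10 * n))."""
        return 10 ** ((RSSI_AT_1M - rssi_dbm) / (10 * PATH_LOSS_N))

    def estimate_terminal_position(ap_position: Tuple[float, float],
                                   rssi_dbm: Optional[float] = None,
                                   gps: Optional[Tuple[float, float]] = None
                                   ) -> Tuple[float, float]:
        if gps is not None:        # method (3): use the GPS information as-is
            return gps
        if rssi_dbm is not None:   # method (2): AP position plus RSSI-derived distance
            d = rssi_to_distance(rssi_dbm)
            return (ap_position[0] + d, ap_position[1])  # direction assumed for simplicity
        return ap_position         # method (1): AP position as the terminal position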
Next, the position estimation unit 244 estimates the user's position based on the position of the user terminal 300; for example, it may take the position of the user terminal 300 as the user's position.
If there are multiple user terminals 300 used by the same user, that is, multiple user terminals 300 on which the user performed face authentication within a predetermined period, the position estimation unit 244 may take as the user's position the position of the user terminal 300 on which face authentication was performed most recently, or the position of the active user terminal 300. In this case, the position estimation unit 244 may also decide which user terminal 300's position to use based on the terminal type. For example, if the same user has been face-authenticated on both a PC and a smartphone within the predetermined period, the position estimation unit 244 may take the PC's position as the user's position. If one of the user terminals 300 is outside the target space TS, the position estimation unit 244 may take the smartphone's position as the user's position, and if all the user terminals 300 are inside the target space TS, it may take the PC's position as the user's position. A sketch of such a selection rule follows.
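The following sketch encodes the terminal-selection examples above as one possible rule; the data shapes and the exact precedence are assumptions, since the description leaves the concrete policy open.

    from dataclasses import dataclass
    from datetime import datetime
    from typing import List, Tuple

    @dataclass
    class Terminal:
        kind: str                       # "pc" or "smartphone"
        last_auth: datetime             # time of the most recent face authentication
        position: Tuple[float, float]
        in_target_space: bool

    def select_user_position(terminals: List[Terminal]) -> Tuple[float, float]:
        if all(t.in_target_space for t in terminals):
            pcs = [t for t in terminals if t.kind == "pc"]
            if pcs:                     # every terminal inside: prefer the PC
                return max(pcs, key=lambda t: t.last_auth).position
        phones = [t for t in terminals if t.kind == "smartphone"]
        if phones:                      # some terminal outside: prefer the smartphone
            return max(phones, key=lambda t: t.last_auth).position
        # otherwise fall back to the most recently authenticated terminal
        return max(terminals, key=lambda t: t.last_auth).position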
The generation unit 245 is also called generation means.

The generation unit 245 generates appearance information including the user's photographed image for face authentication, and stores it in the user DB 212 in association with the user ID 2121. The appearance information may be the photographed image for face authentication itself.

The generation unit 245 also generates output information in which at least the user's appearance information stored in the user DB 212 is associated with the user's position information. For example, the output information is a map of the target space TS: the generation unit 245 generates, as the output information, a map representing the target space TS on which each user's appearance information is superimposed at the position corresponding to that user's position information. Superimposing the appearance information on the map makes it easier for a second user viewing the map to find the target user (the first user) in the real space. A sketch of this map generation follows.
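A minimal sketch of the map generation follows, using Pillow as one possible imaging library; the scale between space coordinates and map pixels, and the icon size, are assumptions for illustration.

    from typing import Dict, Tuple
    from PIL import Image  # Pillow (pip install Pillow)

    PIXELS_PER_METER = 20  # assumed scale between space coordinates and map pixels

    def render_map(floor_map: Image.Image,
                   icons: Dict[str, Image.Image],
                   positions: Dict[str, Tuple[float, float]]) -> Image.Image:
        out = floor_map.copy()
        for user_id, (x, y) in positions.items():
            icon = icons[user_id].resize((32, 32))  # appearance information of the user
            px, py = int(x * PIXELS_PER_METER), int(y * PIXELS_PER_METER)
            out.paste(icon, (px, py))               # superimpose at the user's position
        return out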
The output control unit 246 is an example of the output control unit 16 described above. The output control unit 246 transmits the output information to the display device 500 and causes the display unit of the display device 500 to display it.
Fig. 9 is a sequence diagram showing the flow of the user registration processing according to Embodiment 2. First, the user terminal 300 photographs the user (S500) and transmits a user registration request including the registration image generated by the photographing to the server 200 (S501). The registration unit 241 of the server 200 includes the registration image from the received user registration request in a face registration request and transmits it to the face authentication device 100 (S502). The face authentication device 100 registers the face information (facial feature information) of the user U based on the registration image included in the received face registration request (S503), and notifies the server 200 of the issued user ID (S504). The user terminal 300 also accepts input of user information from the user and transmits the user information to the server 200 (S505); the user information transmitted here includes, for example, the user name, attribute information, and schedule-related information. The registration unit 241 of the server 200 registers the notified user ID and the user information in the user DB 212 in association with each other (S506).
Fig. 10 is a sequence diagram showing the flow of the position output processing according to Embodiment 2. First, the user terminal 300 photographs the user (S510) and transmits the photographed image to the server 200 via the AP 400 to which it is connected (S511). The image acquisition unit 242 of the server 200 thereby acquires the user's photographed image. Next, the authentication control unit 243 of the server 200 transmits a face authentication request for the face region of the user U in the photographed image to the face authentication device 100 (S512). The face authentication device 100 performs face authentication on the face region of the user U in the photographed image included in the received request (S513); here, it is assumed that a user ID was successfully authenticated. The face authentication device 100 transmits to the server 200 a face authentication result including an indication of success and the user ID (S514). The authentication control unit 243 of the server 200 identifies the user from the user ID included in the face authentication result, and transmits the face authentication result to the user terminal 300 via the AP 400 (S515). The user terminal 300 then starts up or transitions from the sleep state to the normal state.
Next, the position estimation unit 244 of the server 200 refers to the AP DB 213, identifies the position information associated with the AP ID of the AP 400, and estimates the position of the user terminal 300 based on the AP 400's position information (S516). The position estimation unit 244 then estimates the user's position based on the position of the user terminal 300 (S517), and stores the user's position information in the user DB 212 in association with the user ID.
Next, the generation unit 245 of the server 200 generates a map that associates the user's appearance information, including the photographed image for face authentication, with the user's position information (S518). The output control unit 246 of the server 200 then transmits the map to the display device 500 via the AP 400 (S519) and causes the display unit of the display device 500 to display it (S520).
 FIG. 11 is a diagram showing an example of a display image 900 of the display device 500 according to the second embodiment. For example, assume that the target space TS includes an area A, an area B, and an area C, and that an AP 400 is installed in each area. At the position of the area A in the display image 900 shown in FIG. 11, icons I_1 and I_2 of the photographed images used for face authentication of the users whose user terminals 300 are connected to the AP 400 installed in the area A are superimposed. Similarly, at the position of the area B in the display image 900, an icon I_3 of the photographed image used for face authentication of the user whose user terminal 300 is connected to the AP 400 installed in the area B is superimposed. The display image 900 also shows the position where the display device 500 is installed as the current position. Although FIG. 11 displays one face-authentication photographed-image icon per user, a plurality of icons may be displayed per user. The plurality of icons may include a plurality of photographed images for face authentication, photographed images of the user's clothes, shoes, hairstyle, or back view of the day, or an illustration image expressing the features of the user's appearance of the day.
 As described above, according to the second embodiment, the server 200 uses the position information of a user whose face authentication succeeded, so that impersonation of the user by another person can be prevented. In the example of FIG. 11, the server 200 also causes the display device 500 to display the photographed image of the user in association with the position information. Other users who view the output information can therefore grasp not only the position of the target user but also the features of the target user's appearance on that day. The server 200 can thus favorably support other users in finding the target user within the target space.
 Furthermore, since the photographed image for face authentication is captured with the camera 310 of the user terminal 300, there is no need to install a new camera for face authentication. When face authentication is performed at PC start-up or wake-from-sleep, the camera attached to the PC can photograph the face of the user sitting in front of the PC from the front. Photographing the user from the front with the camera 310 in this manner yields a high-quality photographed image suitable for face authentication.
 In addition, since the server 200 estimates the position of the user based on the position of the AP 400 or from the GPS information of the user terminal 300, there is no need to install or introduce new equipment such as dedicated transmitters and receivers.
 Embodiment 2 can also be modified as follows.
 For example, the appearance information included in the output information may be appearance data generated from the photographed image for face authentication, instead of the photographed image itself. In this case, the output information may include a pre-registered registered image of the user. The registered image may be the registered image used for face authentication, a thumbnail image set by the user as appropriate, or the face image on an employee ID card.
 FIG. 12 is a diagram showing an example of a display image 900 of the display device 500 according to the second embodiment. At the position of the area A in the display image 900 shown in FIG. 12, icons R_1 and R_2 of the registered images of the users whose user terminals 300 are connected to the AP 400 installed in the area A are superimposed. Furthermore, at the position of the area A in the display image 900, appearance data O_1 and O_2 generated from the photographed images used for those users' face authentication are shown in association with the icons R_1 and R_2. As an example, the appearance data O_1 indicates that the user is wearing "red clothes", and the appearance data O_2 indicates that the user is wearing "glasses". Similarly, for the area B, an icon R_3 of the user whose user terminal 300 is connected to the AP 400 installed in the area B and appearance data O_3 are superimposed.
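 How such appearance data might be derived from the photographed image is sketched below. This is a hypothetical outline only: the two detector functions are placeholders standing in for trained models (e.g. a clothing-color estimator and an eyewear classifier), which the disclosure does not specify.

    def detect_dominant_clothing_color(image: bytes) -> str:
        return "red"  # placeholder: e.g. a color histogram over the clothing region

    def detect_glasses(image: bytes) -> bool:
        return False  # placeholder: e.g. an eyewear classifier on the face region

    def extract_appearance_data(photographed_image: bytes) -> list[str]:
        # Turn the photographed image into short appearance labels such as
        # "red clothes" or "glasses", as in the appearance data O_1 and O_2.
        labels: list[str] = []
        color = detect_dominant_clothing_color(photographed_image)
        if color:
            labels.append(f"{color} clothes")
        if detect_glasses(photographed_image):
            labels.append("glasses")
        return labels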
 Also, for example, the output information may be information in which the position information and appearance information of the first user are further associated with the user information; that is, the generation unit 245 may generate output information including the position information and appearance information of the first user together with the user information. The output information may likewise be information in which the position information and appearance information of the first user are further associated with the schedule-related information; that is, the generation unit 245 may generate output information in which the position information and appearance information of the first user are associated with the user's schedule information. The generation unit 245 may obtain the user's schedule information by extracting it from the schedule-related information of the user information 2122, or by accessing a scheduler based on the schedule-related information.
 FIG. 13 is a diagram showing an example of a display image 900 of the display device 500 according to the second embodiment. At the position of the area A in the display image 900 shown in FIG. 13, icons I_1 and I_2 of the photographed images used for face authentication of the users whose user terminals 300 are connected to the AP 400 installed in the area A are superimposed. The display image 900 also includes the user information U of the users located in the area A. For example, the user corresponding to the icon I_1 belongs to the "First Engineering Department" and is named "Nichiden Taro". The user information U may also include input information entered by a user operation. In this case, the user can enter input information about today's appearance; the user terminal 300 that accepted the input transmits the input information to the server 200, which registers it in the user DB 212 as the user's user information. For example, the user information U shown in FIG. 13 includes the information "I am wearing red clothes" as the input information of the user corresponding to the icon I_1. The user may also enter other input information, which the user terminal 300 likewise transmits to the server 200 for registration as the user's user information. For example, the user information U shown in FIG. 13 includes the information "I am next to a round chair" as the input information of the user corresponding to the icon I_3. The input information registered in the user DB 212 as user information can be given an expiration period such as one hour or one day, and may be deleted after the expiration period has elapsed.
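 The expiring input information lends itself to a small sketch, shown below under the same caveats as before: the storage layout, the field names, and the read-time cleanup are all illustrative choices, not part of the disclosure.

    import time

    def register_input_info(user_id: str, text: str, ttl_seconds: int = 3600) -> None:
        # Input information such as "I am wearing red clothes" is stored
        # with an expiration time (e.g. one hour or one day).
        record = user_db[user_id]
        record.attributes.setdefault("input_info", []).append(
            {"text": text, "expires_at": time.time() + ttl_seconds}
        )

    def valid_input_info(user_id: str) -> list[str]:
        # Expired entries are dropped when read; a periodic cleanup job
        # could delete them instead.
        now = time.time()
        entries = user_db[user_id].attributes.get("input_info", [])
        entries[:] = [e for e in entries if e["expires_at"] > now]
        return [e["text"] for e in entries]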
 When the viewer of the display device 500 selects the icon I_1 of the photographed image for face authentication, other information about that user may be displayed. The other information about the user may be user information or schedule information. It may also be another photographed image of that user for face authentication, a photographed image of the user's clothes, shoes, hairstyle, or back view of the day, or an illustration image expressing the features of the user's appearance of the day. When the display device 500 includes a touch panel, selecting may be tapping.
 When the display device 500 is installed at a specific location, the display device 500 may display the whereabouts of the users associated with that location. For example, when the display device 500 is installed in the room of a certain department, the display device 500 may display the whereabouts of the members of that department.
 FIG. 14 is a diagram showing an example of a display image 900 of the display device 500 according to the second embodiment. The display image 900 in FIG. 14 shows a map of the room of the First Engineering Department and the whereabouts of the members belonging to the First Engineering Department.
 For the members who are in the room, the display device 500 displays the icons I_1 to I_3 of the photographed images used for those members' face authentication, in association with the positions in the map corresponding to where the members are located. For the members who are not in the room, the display device 500 displays information on which area each member is in, together with the icons I_4 to I_8 of the photographed images used for those members' face authentication.
 Note that, for a member whose location is determined to be outside the target space TS, the output control unit 246 of the server 200 may output the member's position in a manner distinguished from the target space TS, for example as "Remote". The position estimation unit 244 of the server 200 may determine whether a member's position is outside the target space TS from the GPS information of the user terminal 300, or by tracing the network through which the communication passed. The position estimation unit 244 may also make this determination from the user's schedule information. In this case, for a member who is outside the target space TS, the display device 500 displays the member's whereabouts as "Remote", as shown in FIG. 14. This makes it easy to grasp the whereabouts even of members who are not in the target space TS.
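 A toy version of this inside/outside judgment from GPS information is shown below; modeling the target space TS as a single bounding box, and the coordinates themselves, are assumptions made purely for illustration.

    def is_outside_target_space(lat: float, lon: float) -> bool:
        # Hypothetical bounding box for the target space TS.
        ts_lat = (35.680, 35.690)
        ts_lon = (139.760, 139.775)
        return not (ts_lat[0] <= lat <= ts_lat[1] and ts_lon[0] <= lon <= ts_lon[1])

    def position_label(user_id: str, lat: float, lon: float) -> str:
        # A member judged to be outside the target space TS is output as
        # "Remote", distinguished from the areas inside the space.
        if is_outside_target_space(lat, lon):
            return "Remote"
        return user_db[user_id].position or "Unknown"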
 The display device 500 may also display a breakdown such as how many of the members belonging to the department are in the room, how many are in other areas, and how many are remote.
 Also in FIG. 14, when the viewer of the display device 500 selects an icon, other information about the user corresponding to that icon may be displayed. The display device 500 may display the same kind of information regardless of whether the selected user is inside or outside the target space TS. The display device 500 is not limited to this, however, and may vary the kind of information displayed depending on whether the selected user is inside or outside the target space TS. For example, when a user inside the target space TS is selected, the display device 500 may display detailed information on the user's appearance and an extension number, whereas when a user outside the target space TS is selected, it may display a mobile phone number.
 <Embodiment 3>
 Next, a third embodiment of the present disclosure will be described. Embodiment 3 is characterized in that the server 200 has a search function.
 For example, the server 200 outputs, using a user name or user ID as a key, output information including the position information of that user. As an example, when the output control unit 246 receives a search request regarding the position of the first user from the second user terminal 300-2 used by the second user, the output control unit 246 outputs, to the second user terminal 300-2, output information including the appearance information of the first user together with the position information.
 Also, for example, the server 200 outputs, using an area as a key, the output information of the users in that area. As an example, when a search request regarding persons located in a predetermined area of the target space TS is received from the second user terminal 300-2, the generation unit 245 refers to the user DB 212, identifies the users whose position information falls within the predetermined area, and generates the output information of the identified users. The output control unit 246 then outputs the output information of the identified users to the second user terminal 300-2.
 Also, for example, the server 200 outputs, using a user attribute as a key, the output information of the users having that attribute. As an example, when the generation unit 245 receives from the second user terminal 300-2 a search request regarding the positions of users belonging to a predetermined department, the generation unit 245 refers to the user DB 212, identifies the users in the target space TS who belong to the predetermined department, and generates the output information of the identified users. The output control unit 246 then outputs the output information of the identified users to the second user terminal 300-2.
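 The three kinds of search keys described above can be summarized in one sketch. As with the earlier fragments, this is a hypothetical outline over the toy user_db, not the disclosed implementation.

    def search_users(key: str, value: str) -> list[UserRecord]:
        # key may be a user name, an area, or an attribute such as a
        # department; the output information of the matching users is then
        # generated and sent to the second user terminal 300-2.
        if key == "name":
            return [r for r in user_db.values() if r.name == value]
        if key == "area":
            return [r for r in user_db.values() if r.position == value]
        if key == "attribute":
            return [r for r in user_db.values() if value in r.attributes.values()]
        raise ValueError(f"unsupported search key: {key}")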
 FIG. 15 is a sequence diagram showing the flow of the position output processing according to the third embodiment. FIG. 15 shows the sequence in which the second user searches for the position of the first user using the first user's user name as a key.
 First, the same processing as S510 to S517 in FIG. 10 is executed. Next, the second user terminal 300-2 used by the second user transmits a search request including the user name of the first user to the server 200 (S531). In this figure, the second user terminal 300-2 transmits the search request to the server 200 via the AP 400, but the request need not pass through the AP 400.
 On receiving the search request, the generation unit 245 of the server 200 refers to the user DB 212 and, using the specified user name as a key, retrieves the position information and appearance information of the user corresponding to that user name (the first user) (S532). The generation unit 245 then generates a map in which the position information and appearance information of the first user are associated with each other (S533). The output control unit 246 of the server 200 transmits the map to the display device 500 via the AP 400 (S534) and causes the display unit of the display device 500 to display the map (S535).
 As described above, according to the third embodiment, the server 200 can output, in response to a search by the second user, where the target user is, who is in a specific area, and where the users having a specific attribute are, in a form that makes them easy for the second user to find.
 The present disclosure can realize any of the above processing by causing a processor to execute a computer program. In the examples described above, the program includes instructions (or software code) that, when loaded into a computer, cause the computer to perform one or more of the functions described in the embodiments. The program may be stored in a non-transitory computer-readable medium or a tangible storage medium. By way of example and not limitation, the computer-readable medium or tangible storage medium includes random-access memory (RAM), read-only memory (ROM), flash memory, a solid-state drive (SSD) or other memory technology, a CD-ROM, a digital versatile disc (DVD), a Blu-ray (registered trademark) disc or other optical disc storage, and a magnetic cassette, magnetic tape, magnetic disk storage, or other magnetic storage device. The program may be transmitted on a transitory computer-readable medium or a communication medium. By way of example and not limitation, the transitory computer-readable medium or communication medium includes an electrical, optical, acoustic, or other form of propagated signal.
 The above-described computer is constituted by a computer system including a personal computer, a word processor, or the like. The computer is not limited to this, however, and may be constituted by a LAN (local area network) server, a host for computer (personal computer) communication, a computer system connected to the Internet, or the like. It is also possible to distribute the functions among devices on a network and configure the computer across the network as a whole.
 The present disclosure is not limited to the above embodiments and can be modified as appropriate without departing from the gist of the disclosure. For example, in Embodiments 2 and 3, face authentication is performed when the user terminal 300 starts up or is released from the sleep state, but the timing of face authentication is not limited to this. For example, face authentication may be performed at predetermined time intervals, or a predetermined number of times in a predetermined period. Face authentication may also be performed when some input operation takes place. When face authentication is performed a plurality of times a day, impersonation can be prevented even more effectively. When face authentication is performed a plurality of times a day, the generation unit 245 may update the appearance information at every face authentication, or may set the appearance information to appearance information based on the image photographed at some point during the day (for example, the first photograph of the day). The generation unit 245 may also update the appearance information when a predetermined time has elapsed since the most recent face authentication. Updating the appearance information has the effect of making the user easy to find even when the user's clothes or hairstyle change during the day or the user removes a mask. The generation unit 245 may also let the user set, by a selection operation, the appearance information of which timing is used for the output information.
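 These update policies can be expressed compactly, as in the sketch below; the policy names and the four-hour threshold are illustrative assumptions, not values taken from the disclosure.

    import time

    def should_update_appearance(last_update: float, now: float, policy: str) -> bool:
        # Decide whether the appearance information is refreshed at this
        # face authentication, under one of the policies discussed above.
        if policy == "every_authentication":
            return True
        if policy == "first_of_day":
            # Keep the first image of the day: update only when the day changed.
            return time.localtime(last_update).tm_yday != time.localtime(now).tm_yday
        if policy == "elapsed":
            return now - last_update > 4 * 60 * 60  # e.g. four hours
        return False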
 In the above-described embodiments, the face authentication device 100 has the face authentication function, but the server 200 may have the face authentication function instead of, or in addition to, the face authentication device 100.
 Some or all of the above-described embodiments can also be described as in the following supplementary notes, but are not limited to the following.
   (Appendix 1)
 An information processing device comprising:
 authentication control means for controlling biometric authentication based on a photographed image of a first user;
 position estimation means for estimating, when the biometric authentication succeeds, a position of the first user in a target space based on position information of a first user terminal used by the first user; and
 output control means for outputting output information in which the position of the first user is associated with appearance information including at least one of the photographed image of the first user and data relating to appearance generated based on the photographed image.
   (Appendix 2)
 The information processing device according to appendix 1, wherein the biometric authentication is face authentication.
   (Appendix 3)
 The information processing device according to appendix 1 or 2, wherein the position estimation means estimates the position information of the first user terminal based on position information of an access point to which the first user terminal is connected.
   (Appendix 4)
 The information processing device according to any one of appendices 1 to 3, wherein the output information further associates the position and the appearance information of the first user with user information related to the first user.
   (Appendix 5)
 The information processing device according to any one of appendices 1 to 4, further comprising generation means for generating, as the output information, a map representing the target space in which the appearance information of the first user is superimposed at a position corresponding to the position of the first user.
   (Appendix 6)
 The information processing device according to any one of appendices 1 to 5, wherein the output control means causes a display device installed in the target space to display the output information.
   (Appendix 7)
 The information processing device according to any one of appendices 1 to 5, wherein the output control means outputs the output information of the first user to a second user terminal used by a second user when receiving, from the second user terminal, a search request regarding the position of the first user.
   (Appendix 8)
 The information processing device according to any one of appendices 1 to 7, wherein the output control means outputs, when determining that the position of the first user is outside the target space, the position of the first user in a manner distinguished from the target space.
   (Appendix 9)
 An information processing system comprising:
 a biometric authentication device that performs biometric authentication based on a photographed image of a first user; and
 an information processing device,
 wherein the information processing device includes:
 authentication control means for acquiring a result of the biometric authentication from the biometric authentication device;
 position estimation means for estimating, when the biometric authentication succeeds, a position of the first user in a target space based on position information of a first user terminal used by the first user; and
 output control means for outputting output information in which the position of the first user is associated with appearance information including at least one of the photographed image of the first user and data relating to appearance generated based on the photographed image.
   (Appendix 10)
 The information processing system according to appendix 9, further comprising a display device, wherein the output control means outputs the output information to the display device.
   (Appendix 11)
 An information processing method comprising:
 controlling biometric authentication based on a photographed image of a first user;
 estimating, when the biometric authentication succeeds, a position of the first user in a target space based on position information of a first user terminal used by the first user; and
 outputting output information in which the position of the first user is associated with appearance information including at least one of the photographed image of the first user and data relating to appearance generated based on the photographed image.
   (Appendix 12)
 A non-transitory computer-readable medium storing a program for causing a computer to execute:
 a procedure of controlling biometric authentication based on a photographed image of a first user;
 a procedure of estimating, when the biometric authentication succeeds, a position of the first user in a target space based on position information of a first user terminal used by the first user; and
 a procedure of outputting output information in which the position of the first user is associated with appearance information including at least one of the photographed image of the first user and data relating to appearance generated based on the photographed image.
 10, 200 information processing device (server)
 13, 243 authentication control unit
 14, 244 position estimation unit
 16, 246 output control unit
 100 face authentication device
 110 face information DB
 111 user ID
 112 facial feature information
 120 face detection unit
 130 feature point extraction unit
 140 registration unit
 150 authentication unit
 210 storage unit
 211 program
 212 user DB
 2121 user ID
 2122 user information
 2123 position information
 213 APDB
 2131 APID
 2132 position information
 220 memory
 230 communication unit
 240 control unit
 241 registration unit
 242 image acquisition unit
 245 generation unit
 300 user terminal
 310 camera
 320 storage unit
 330 communication unit
 340 display unit
 350 input unit
 360 control unit
 400 access point (AP)
 500 display device
 900 display image
 1000 information processing system
 TS target space
 I photographed image
 R registered image
 U user information
 O appearance data

Claims (12)

  1.  An information processing device comprising:
     authentication control means for controlling biometric authentication based on a photographed image of a first user;
     position estimation means for estimating, when the biometric authentication succeeds, a position of the first user in a target space based on position information of a first user terminal used by the first user; and
     output control means for outputting output information in which the position of the first user is associated with appearance information including at least one of the photographed image of the first user and data relating to appearance generated based on the photographed image.
  2.  The information processing device according to claim 1, wherein the biometric authentication is face authentication.
  3.  The information processing device according to claim 1 or 2, wherein the position estimation means estimates the position information of the first user terminal based on position information of an access point to which the first user terminal is connected.
  4.  The information processing device according to any one of claims 1 to 3, wherein the output information further associates the position and the appearance information of the first user with user information related to the first user.
  5.  The information processing device according to any one of claims 1 to 4, further comprising generation means for generating, as the output information, a map representing the target space in which the appearance information of the first user is superimposed at a position corresponding to the position of the first user.
  6.  The information processing device according to any one of claims 1 to 5, wherein the output control means causes a display device installed in the target space to display the output information.
  7.  The information processing device according to any one of claims 1 to 5, wherein the output control means outputs the output information of the first user to a second user terminal used by a second user when receiving, from the second user terminal, a search request regarding the position of the first user.
  8.  The information processing device according to any one of claims 1 to 7, wherein the output control means outputs, when determining that the position of the first user is outside the target space, the position of the first user in a manner distinguished from the target space.
  9.  An information processing system comprising:
     a biometric authentication device that performs biometric authentication based on a photographed image of a first user; and
     an information processing device,
     wherein the information processing device includes:
     authentication control means for acquiring a result of the biometric authentication from the biometric authentication device;
     position estimation means for estimating, when the biometric authentication succeeds, a position of the first user in a target space based on position information of a first user terminal used by the first user; and
     output control means for outputting output information in which the position of the first user is associated with appearance information including at least one of the photographed image of the first user and data relating to appearance generated based on the photographed image.
  10.  The information processing system according to claim 9, further comprising a display device, wherein the output control means outputs the output information to the display device.
  11.  An information processing method comprising:
     controlling biometric authentication based on a photographed image of a first user;
     estimating, when the biometric authentication succeeds, a position of the first user in a target space based on position information of a first user terminal used by the first user; and
     outputting output information in which the position of the first user is associated with appearance information including at least one of the photographed image of the first user and data relating to appearance generated based on the photographed image.
  12.  A non-transitory computer-readable medium storing a program for causing a computer to execute:
     a procedure of controlling biometric authentication based on a photographed image of a first user;
     a procedure of estimating, when the biometric authentication succeeds, a position of the first user in a target space based on position information of a first user terminal used by the first user; and
     a procedure of outputting output information in which the position of the first user is associated with appearance information including at least one of the photographed image of the first user and data relating to appearance generated based on the photographed image.
PCT/JP2021/041170 2021-11-09 2021-11-09 Information processing device, information processing system, information processing method, and non-transitory computer-readable medium WO2023084593A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2021/041170 WO2023084593A1 (en) 2021-11-09 2021-11-09 Information processing device, information processing system, information processing method, and non-transitory computer-readable medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2021/041170 WO2023084593A1 (en) 2021-11-09 2021-11-09 Information processing device, information processing system, information processing method, and non-transitory computer-readable medium

Publications (1)

Publication Number Publication Date
WO2023084593A1 true WO2023084593A1 (en) 2023-05-19

Family

ID=86335288

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/041170 WO2023084593A1 (en) 2021-11-09 2021-11-09 Information processing device, information processing system, information processing method, and non-transitory computer-readable medium

Country Status (1)

Country Link
WO (1) WO2023084593A1 (en)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2012038242A (en) * 2010-08-11 2012-02-23 Kddi Corp Attendance management method and system
JP2019101566A (en) * 2017-11-29 2019-06-24 株式会社 プロネット Information processing system, information processing method, information processing program, and information processing apparatus
JP2019144917A (en) * 2018-02-22 2019-08-29 パナソニックIpマネジメント株式会社 Stay situation display system and stay situation display method
WO2021186569A1 (en) * 2020-03-17 2021-09-23 日本電気株式会社 Visit assistance device, visit assistance system, visit assistance method, and non-transitory computer-readable medium having program stored thereon

Similar Documents

Publication Publication Date Title
EP3076320B1 (en) Individual identification device, and identification threshold setting method
JP5747116B1 (en) Security system
JP6123653B2 (en) Information processing apparatus, information processing method, and program
US9679152B1 (en) Augmented reality security access
CN107886602B (en) Method for unlocking and sharing house and system for unlocking and sharing house
JP6183132B2 (en) Authentication server, authentication program, and authentication method
JP2013041416A (en) Information processing device and method, program, and information processing system
JP6769475B2 (en) Information processing system, management method for authentication, and program
JP5813829B1 (en) Crime prevention system
JP2019091395A (en) Information processing device, monitoring system, control method, and program
US20190147251A1 (en) Information processing apparatus, monitoring system, method, and non-transitory computer-readable storage medium
JP6776700B2 (en) Disaster information management system and disaster information management method
JP6993597B2 (en) Information processing equipment, control methods, and programs
JP5532180B1 (en) Image processing apparatus and program
WO2023084593A1 (en) Information processing device, information processing system, information processing method, and non-transitory computer-readable medium
JP6631362B2 (en) Information processing apparatus, information processing method, and program
US10121058B2 (en) Facilitating monitoring of users
WO2019150954A1 (en) Information processing device
WO2022195815A1 (en) Information provision device, information provision system, information provision method, and non-transitory computer-readable medium
JP2017152013A (en) Information processing device, information processing method, and program
WO2021186569A1 (en) Visit assistance device, visit assistance system, visit assistance method, and non-transitory computer-readable medium having program stored thereon
JP6077930B2 (en) Information management apparatus, information management system, communication terminal, and information management method
JP2010231450A (en) Photographing data authentication device, photographing data authentication system, photographing data authentication method and program
WO2022113589A1 (en) Server, terminal device, information processing program, management system, and management method
JP6435676B2 (en) File management apparatus, file management system, and program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21963956

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2023559225

Country of ref document: JP

Kind code of ref document: A