WO2022176342A1 - Information processing device, information processing system, information processing method, and non-transitory computer-readable medium - Google Patents


Info

Publication number
WO2022176342A1
Authority
WO
WIPO (PCT)
Prior art keywords
information
subject
smile
image
degree
Prior art date
Application number
PCT/JP2021/045571
Other languages
French (fr)
Japanese (ja)
Inventor
秀典 北方
Original Assignee
日本電気株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 日本電気株式会社 filed Critical 日本電気株式会社
Priority to JP2023500566A priority Critical patent/JP7468771B2/en
Publication of WO2022176342A1 publication Critical patent/WO2022176342A1/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q50/10Services

Definitions

  • The present disclosure relates to an information processing device, an information processing system, an information processing method, and a non-transitory computer-readable medium.
  • Patent Document 1 discloses an image selection device that evaluates the degree of smile of a visitor who is experiencing an attraction installed in an amusement park or the like, and selects the still image with the highest degree of smile as an image for sales promotion.
  • Patent Document 2 discloses an information processing system that calculates a happiness level from a face image when entering and exiting a space, and calculates an evaluation value for the space based on the difference in the happiness level.
  • Patent Literature 1 mentioned above, however, does not describe analyzing the degree of satisfaction with the attractions. A detailed analysis of satisfaction with content would include analyzing satisfaction for each visitor attribute, but Patent Literature 2 mentioned above does not describe analyzing the evaluation value of the space for each visitor attribute. Moreover, such a detailed analysis requires collecting a large amount of visitor-related information, and neither document discloses a means of doing so.
  • An object of the present disclosure is to provide, in view of the above-described problems, an information processing device, an information processing system, an information processing method, and a non-transitory computer-readable medium that suitably analyze the degree of satisfaction with content at an event venue.
  • An information processing device includes: registration means for registering attribute information of a subject in association with biometric information of the subject when a request for use registration is acquired; authentication control means for performing biometric authentication using a captured image when the captured image of the subject, captured at a predetermined point associated with predetermined content, is acquired; recording control means for recording, when the biometric authentication succeeds, the degree of smile of the subject calculated from the captured image in association with the subject's attribute information and content information related to the content associated with the shooting location; output means for outputting information according to the degree of smile of the subject to an information terminal used by the subject whose biometric authentication succeeded; and evaluation means for aggregating degrees of smile based on the attribute information of subjects and the content information, and evaluating satisfaction with the content based on the aggregated degrees of smile.
  • An information processing system includes: an information terminal that generates a captured image of a target person; and an information processing device communicably connected to the information terminal.
  • The information processing device includes: registration means for registering attribute information of a subject in association with biometric information of the subject when use registration is accepted;
  • authentication control means for performing biometric authentication using a captured image when the captured image of the subject, captured at a predetermined point associated with predetermined content, is obtained from the information terminal;
  • recording control means for recording, when the biometric authentication succeeds, the degree of smile of the subject calculated from the captured image in association with the subject's attribute information and content information related to the content associated with the shooting location;
  • output means for outputting information according to the degree of smile of the subject to the information terminal used by the subject whose biometric authentication succeeded; and
  • evaluation means for aggregating degrees of smile based on the attribute information of subjects and the content information, and evaluating satisfaction with the content based on the aggregated degrees of smile.
  • An information processing method includes: registering, when a request for use registration is acquired, attribute information of a subject in association with biometric information of the subject; performing biometric authentication using a captured image when the captured image of the subject, captured at a predetermined point associated with predetermined content, is acquired; recording, when the biometric authentication succeeds, the degree of smile of the subject calculated from the captured image in association with the subject's attribute information and content information related to the content associated with the shooting location; outputting information according to the degree of smile of the subject to an information terminal used by the subject whose biometric authentication succeeded; and aggregating degrees of smile based on the attribute information of subjects and the content information, and evaluating satisfaction with the content based on the aggregated degrees of smile.
  • A non-transitory computer-readable medium stores a program that causes a computer to execute: a registration process of registering attribute information of a subject in association with biometric information of the subject when a request for use registration is acquired; an authentication control process of performing biometric authentication using a captured image when the captured image of the subject, captured at a predetermined point associated with predetermined content, is acquired; a recording control process of recording, when the biometric authentication succeeds, the degree of smile of the subject calculated from the captured image in association with the subject's attribute information and content information related to the content associated with the shooting location; an output process of outputting information according to the degree of smile of the subject to an information terminal used by the subject whose biometric authentication succeeded; and an evaluation process of aggregating degrees of smile based on the attribute information of subjects and the content information, and evaluating satisfaction with the content based on the aggregated degrees of smile.
  • According to the present disclosure, it is possible to provide an information processing device, an information processing system, an information processing method, and a non-transitory computer-readable medium that suitably analyze the degree of satisfaction with content at an event venue.
  • FIG. 1 is a block diagram showing the configuration of an information processing apparatus according to a first embodiment
  • FIG. 3 is a diagram showing the flow of an information processing method according to the first embodiment
  • FIG. 2 is a block diagram showing the overall configuration of an information processing system according to a second embodiment
  • FIG. 2 is a block diagram showing the configuration of an authentication device according to a second embodiment
  • FIG. 9 is a flowchart showing the flow of face information registration processing according to the second embodiment
  • 9 is a flow chart showing the flow of face authentication processing according to the second embodiment
  • FIG. 10 is a block diagram showing the configuration of a face authentication terminal according to the second embodiment
  • FIG. 8 is a block diagram showing the configuration of a user terminal according to the second embodiment
  • FIG. 2 is a block diagram showing the configuration of an information processing apparatus according to a second embodiment
  • FIG. 11 is a diagram illustrating an example of a data structure of aggregated information according to the second embodiment
  • FIG. 9 is a flowchart showing the flow of usage registration processing according to the second embodiment
  • 9 is a flowchart showing the flow of image output processing according to the second embodiment
  • 9 is a flowchart showing the flow of evaluation processing according to the second embodiment
  • FIG. 11 is a sequence diagram showing the flow of usage registration processing according to the second embodiment
  • FIG. 10 is a sequence diagram showing the flow of image output processing according to the second embodiment
  • FIG. 10 is a sequence diagram showing the flow of image output processing according to the second embodiment
  • FIG. 10 is a sequence diagram showing the flow of image output processing according to the second embodiment
  • FIG. 10 is a sequence diagram showing the flow of image output processing according to the second embodiment
  • FIG. 10 is a sequence diagram showing the flow of image output processing according to the second embodiment
  • FIG. 11 is a diagram showing an example of a synthesized image according to the second embodiment;
  • FIG. 11 is a diagram showing an example of a terminal screen according to the second embodiment;
  • FIG. 11 is a diagram showing an example of a terminal screen according to the second embodiment;
  • FIG. 11 is a diagram showing an example of an overall composite image according to the second embodiment;
  • FIG. 1 is a block diagram showing the configuration of an information processing apparatus 10 according to the first embodiment.
  • The information processing device 10 is an information processing device that analyzes the level of satisfaction with content provided at an event site.
  • The information processing device 10 is connected to a network (not shown).
  • The network may be wired or wireless.
  • The network is also connected to an information terminal (not shown) that photographs a target person, who is a visitor at the event venue, and generates a captured image.
  • The information processing device 10 is communicably connected to the information terminal via the network.
  • The information terminal is installed at a predetermined point associated with predetermined content.
  • The captured image includes at least the subject's face area.
  • The information terminal may be a face authentication terminal, digital signage with a camera, or the like.
  • the information processing device 10 includes a registration unit 11 , an authentication control unit 12 , a recording control unit 14 , an output unit 16 and an evaluation unit 17 .
  • The registration unit 11 is also called registration means.
  • The registration unit 11 registers the target person's attribute information in association with the target person's biometric information.
  • The subject's attribute information is information related to the subject.
  • The subject's attribute information may include at least one of the subject's age, gender, occupation, family structure, hobbies, companion information, and means of visit.
  • The companion information indicates the presence or absence of a companion and, if there is a companion, the attributes of the group including the target person and the companion (group attribute).
  • The group attribute is, for example, family (parent and child), couple, friends, or the like.
  • The companion information may include the identification information (ID) of the companion.
  • A means of visit is, for example, an automobile, a train, or the like.
  • The biometric information may be feature information of a face, fingerprint, iris, or vein, or other biometric information.
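The attribute items enumerated above can be modeled as a simple record. Below is a minimal sketch in Python; every field name and value is an illustrative assumption, not an identifier from the disclosure.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class CompanionInfo:
    # Presence of a companion and, if present, the group attribute
    # (e.g. "family", "couple", "friends") and companion IDs.
    present: bool
    group_attribute: Optional[str] = None
    companion_ids: List[str] = field(default_factory=list)

@dataclass
class SubjectAttributes:
    # Hypothetical record covering the attribute items listed above.
    age: Optional[int] = None
    gender: Optional[str] = None
    occupation: Optional[str] = None
    family_structure: Optional[str] = None
    hobby: Optional[str] = None
    companions: Optional[CompanionInfo] = None
    means_of_visit: Optional[str] = None  # e.g. "automobile", "train"

attrs = SubjectAttributes(age=30, gender="female",
                          companions=CompanionInfo(True, "family", ["U002"]),
                          means_of_visit="train")
```

Such a record would be stored in association with the subject's biometric information at use registration.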
  • The authentication control unit 12 is also called authentication control means.
  • When the authentication control unit 12 acquires, from the above-described information terminal, a photographed image of the target person photographed at the above-described predetermined point, it performs biometric authentication using the photographed image.
  • The recording control unit 14 is also called recording control means.
  • When the biometric authentication succeeds, the recording control unit 14 records the subject's degree of smile in association with the subject's attribute information and the content information related to the content associated with the shooting location. The degree of smile is calculated from the photographed image.
  • The output unit 16 is also called output means.
  • The output unit 16 outputs information corresponding to the degree of smile of the subject to the information terminal used by the subject whose biometric authentication has succeeded.
  • The information terminal used by the target person may be the information terminal that is installed at the above-mentioned predetermined point and photographs the target person, or may be a terminal owned by the target person.
  • The evaluation unit 17 is also called evaluation means.
  • The evaluation unit 17 aggregates the degrees of smile based on the subjects' attribute information and content information. Then, the evaluation unit 17 evaluates the degree of satisfaction with the content based on the aggregated degrees of smile.
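The aggregation performed by the evaluation unit 17 — grouping degrees of smile by content and attribute, then scoring satisfaction from the grouped values — can be sketched as follows. The record layout, the sample values, and the use of a simple mean as the satisfaction score are illustrative assumptions:

```python
from collections import defaultdict

# Each record: (content ID, attribute value, degree of smile on a 0-100 scale).
records = [
    ("lion_viewing", "20s", 80), ("lion_viewing", "20s", 60),
    ("lion_viewing", "30s", 90), ("penguin_viewing", "20s", 40),
]

def evaluate_satisfaction(records):
    """Aggregate smile degrees per (content, attribute) pair and
    score satisfaction as the mean of each bucket."""
    buckets = defaultdict(list)
    for content, attribute, smile in records:
        buckets[(content, attribute)].append(smile)
    return {key: sum(v) / len(v) for key, v in buckets.items()}

scores = evaluate_satisfaction(records)
```

With the sample data, `scores[("lion_viewing", "20s")]` is the mean of 80 and 60, letting an operator compare content satisfaction per visitor attribute.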
  • FIG. 2 is a diagram showing the flow of the information processing method according to the first embodiment.
  • The registration unit 11 receives service usage registration from a subject (S10). Then, the registration unit 11 registers the target person's attribute information in association with the target person's biometric information (S11). Subsequently, the authentication control unit 12 acquires from the information terminal a photographed image of the target person photographed by the information terminal at a predetermined point (S12). Then, the authentication control unit 12 performs biometric authentication using the captured image (S13); if the biometric authentication fails (No in S14), the process ends.
  • If the biometric authentication succeeds (Yes in S14), the recording control unit 14 records the subject's degree of smile calculated from the captured image in association with the subject's attribute information and the content information of the content associated with the shooting location (S15). Subsequently, the output unit 16 acquires information corresponding to the degree of smile and outputs it to the information terminal used by the subject (S16). Then, the evaluation unit 17 aggregates the degrees of smile based on the associated content information and attribute information of the subjects, and evaluates the degree of satisfaction with the content based on the aggregated degrees of smile (S17). The evaluation unit 17 then terminates the process.
  • As described above, the information processing apparatus 10 aggregates the degree of smile based on the attribute information of target persons and thereby performs a detailed analysis of content satisfaction. This makes it easy for the operator of the event site to take measures to improve satisfaction with the content.
  • Since information corresponding to the degree of smile can be obtained as a privilege, it readily serves as an incentive for target persons to provide their attribute information and to consent to its use.
  • As a result, the information processing apparatus 10 can easily acquire the attribute information of subjects, which facilitates detailed analysis. Therefore, the degree of satisfaction with content at the event site can be analyzed more suitably.
  • The information processing apparatus 10 includes a processor, a memory, and a storage device (not shown). The storage device stores a computer program in which the processing of the information processing method according to the present embodiment is implemented. The processor loads the computer program from the storage device into the memory and executes it, thereby realizing the functions of the registration unit 11, the authentication control unit 12, the recording control unit 14, the output unit 16, and the evaluation unit 17.
  • The registration unit 11, the authentication control unit 12, the recording control unit 14, the output unit 16, and the evaluation unit 17 may each be realized by dedicated hardware.
  • Part or all of the components of each device may be realized by general-purpose or dedicated circuitry, processors, or combinations thereof. These may be configured as a single chip or as multiple chips connected via a bus. Part or all of the components of each device may also be implemented by a combination of the above-described circuits and programs.
  • The processor may be, for example, a CPU (Central Processing Unit), a GPU (Graphics Processing Unit), or an FPGA (field-programmable gate array).
  • When part or all of the components of the information processing device 10 are realized by a plurality of information processing devices, circuits, and the like, the plurality of information processing devices, circuits, and the like may be arranged centrally or in a distributed manner.
  • The information processing devices, circuits, and the like may be implemented in a form in which they are connected via a communication network, such as a client-server system or a cloud computing system.
  • The functions of the information processing device 10 may be provided in a SaaS (Software as a Service) format.
  • FIG. 3 is a block diagram showing the overall configuration of an information processing system 1000 according to the second embodiment.
  • The information processing system 1000 provides a service to target persons (users U) who are visitors at an event site, collects captured images of the users U, and measures satisfaction with each content item provided at the event site.
  • The event venue is, for example, a zoo or an amusement park.
  • If the event venue is a zoo, the content may refer to viewing each animal or to the animals themselves.
  • If the event venue is an amusement park, the content may refer to experiencing each attraction or to the attractions themselves.
  • The above-mentioned service provides the user U with privilege information according to the user U's photographed images.
  • The information processing system 1000 includes an authentication device 100, an information processing device 200, an image storage server 300, face authentication terminals 400-1 to 400-n (n is a natural number of 2 or more), and a user terminal 500.
  • The authentication device 100, the information processing device 200, the image storage server 300, the face authentication terminals 400-1 to 400-n, and the user terminal 500 are each connected via a network N.
  • The network N is a wired or wireless communication line.
  • Face authentication terminals 400-1, 400-2, 400-3, ..., 400-n are installed at points A1, A2, A3, ..., An, respectively.
  • Points A1 to An are assumed to be different spots in a certain event site A. For example, point A1 may be the entrance to a zoo, and points A2 to An may be animal viewing spots.
  • The face authentication terminal 400 may be configured so as to be able to photograph the user U together with scenery, animals, or the like peculiar to the related content. Hereinafter, the face authentication terminals 400-1, 400-2, 400-3, ..., 400-n are simply referred to as the face authentication terminal 400 when they need not be distinguished.
  • User U registers for use of the service at point A1 and registers his/her own face information.
  • At that time, the user U's attribute information is also registered.
  • The use registration and face information registration may be performed at whichever face authentication terminal 400 or user terminal 500 the user U prefers.
  • The user U then visits one or more points.
  • The face authentication terminal 400 takes an image of the visiting user U.
  • When the face authentication using the captured image succeeds, the user U can acquire information according to the degree of smile in a predetermined manner.
  • The degree of smile is generated based on a photographed image for which face authentication has succeeded. In the following description, it is assumed that the information corresponding to the degree of smile is an image corresponding to the degree of smile.
  • The image corresponding to the degree of smile may be a composite image in which the user U's degree of smile is superimposed on the photographed image.
  • The image corresponding to the degree of smile may correspond to a photographed image in which the user U appears satisfied or happy.
  • For example, the image corresponding to the degree of smile may be a photographed image in which the user U's degree of smile is equal to or greater than a predetermined value, or a composite image in which the degree of smile is superimposed on such a photographed image.
  • The information according to the degree of smile may be information related to the content, such as sales promotion information, in addition to or instead of the above-described image.
  • The content may be the content related to the shooting location of a photographed image whose degree of smile is equal to or greater than the predetermined value.
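The selection rule described above — return only photographed images whose degree of smile meets a predetermined value, optionally annotated with the score for superimposition — might look like the following sketch. The threshold value and the record fields are assumptions for illustration:

```python
SMILE_THRESHOLD = 70  # assumed cutoff for "satisfied or happy"

# Hypothetical capture records for one user.
captured = [
    {"image_id": "img-001", "smile": 85, "location": "A2"},
    {"image_id": "img-002", "smile": 55, "location": "A3"},
    {"image_id": "img-003", "smile": 92, "location": "A2"},
]

def images_for_user(captured, threshold=SMILE_THRESHOLD):
    """Pick images at or above the threshold, each annotated with
    a label that could be superimposed on the photograph."""
    return [
        {"image_id": c["image_id"], "overlay": f"smile: {c['smile']}"}
        for c in captured if c["smile"] >= threshold
    ]

selected = images_for_user(captured)
```

The same filter could also drive selection of sales promotion information tied to the shooting locations of the images that pass the threshold.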
  • The authentication device 100 is an information processing device that stores facial feature information of a plurality of persons. In response to a face authentication request received from the outside, the authentication device 100 compares the face image or facial feature information included in the request with the facial feature information of each user, and returns the matching result (authentication result) to the requester.
  • FIG. 4 is a block diagram showing the configuration of the authentication device 100 according to the second embodiment.
  • The authentication device 100 includes a face information DB (DataBase) 110, a face detection unit 120, a feature point extraction unit 130, a registration unit 140, and an authentication unit 150.
  • The face information DB 110 stores a user ID 111 and the facial feature information 112 of that user in association with each other.
  • The facial feature information 112 is a set of feature points extracted from a face image.
  • The authentication device 100 may delete the facial feature information 112 in the face information DB 110 at the request of the user who registered it.
  • Alternatively, the authentication device 100 may delete the facial feature information 112 after a certain period has passed since it was registered.
  • The face detection unit 120 detects a face area included in a registration image for registering face information and outputs it to the feature point extraction unit 130.
  • The feature point extraction unit 130 extracts feature points from the face area detected by the face detection unit 120 and outputs facial feature information to the registration unit 140. The feature point extraction unit 130 also extracts feature points included in a face image received from the information processing apparatus 200 and outputs facial feature information to the authentication unit 150.
  • The registration unit 140 newly issues a user ID 111 when registering facial feature information.
  • The registration unit 140 associates the issued user ID 111 with the facial feature information 112 extracted from the registration image and registers them in the face information DB 110.
  • The authentication unit 150 performs face authentication using the facial feature information 112. Specifically, the authentication unit 150 collates the facial feature information extracted from the face image with the facial feature information 112 in the face information DB 110.
  • The authentication unit 150 replies to the information processing apparatus 200 as to whether the facial feature information matches. Whether the facial feature information matches corresponds to the success or failure of the authentication.
  • Matching of facial feature information means that the degree of matching is equal to or greater than a predetermined value.
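A "degree of matching equal to or greater than a predetermined value" can be illustrated with a cosine-similarity check over feature vectors. The similarity measure and the threshold below are assumptions for illustration, not the algorithm specified by the disclosure:

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

MATCH_THRESHOLD = 0.9  # assumed "predetermined value"

def features_match(query, enrolled, threshold=MATCH_THRESHOLD):
    """Authentication succeeds when the degree of matching
    (here: cosine similarity) meets the threshold."""
    return cosine_similarity(query, enrolled) >= threshold
```

Identical vectors yield a similarity of 1.0 and therefore match; orthogonal vectors yield 0.0 and fail.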
  • FIG. 5 is a flowchart showing the flow of face information registration processing according to the second embodiment.
  • First, the authentication device 100 acquires the registration image included in a face information registration request (S21).
  • For example, the authentication device 100 receives the face information registration request from the face authentication terminal 400, the user terminal 500, or the like via the network N.
  • The source of the face information registration request is not limited to these; it may be the information processing apparatus 200, which receives the use registration request from the face authentication terminal 400 or the user terminal 500.
  • Next, the face detection unit 120 detects the face area included in the registration image (S22).
  • Then, the feature point extraction unit 130 extracts feature points from the face area detected in step S22 and outputs facial feature information to the registration unit 140 (S23).
  • Finally, the registration unit 140 issues a user ID 111, associates the user ID 111 with the facial feature information 112, and registers them in the face information DB 110 (S24).
  • Alternatively, the authentication device 100 may receive the facial feature information 112 from the face information registration requester, associate it with a user ID 111, and register it in the face information DB 110.
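Steps S21 to S24 — receive a registration image, extract features, issue a user ID 111, and store the ID together with the facial feature information 112 — can be sketched as a minimal in-memory registry. The ID format and the storage layout are assumptions:

```python
import itertools

class FaceInfoDB:
    """In-memory stand-in for the face information DB 110."""

    def __init__(self):
        self._store = {}
        self._ids = itertools.count(1)

    def register(self, facial_features):
        # S24: issue a new user ID and store it with the feature data.
        user_id = f"U{next(self._ids):04d}"
        self._store[user_id] = facial_features
        return user_id

    def lookup(self, user_id):
        # Return the facial feature information for collation, if any.
        return self._store.get(user_id)

db = FaceInfoDB()
uid = db.register([0.12, 0.80, 0.33])  # features from steps S22-S23
```

A real deployment would persist this mapping and support the deletion policies described above (deletion on request or after a retention period).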
  • FIG. 6 is a flow chart showing the flow of face authentication processing by the authentication device 100 according to the second embodiment.
  • First, the feature point extraction unit 130 acquires facial feature information for authentication (S31).
  • For example, the authentication device 100 receives a face authentication request from the information processing device 200 via the network N and extracts facial feature information from the face image included in the face authentication request, as in steps S21 to S23.
  • Alternatively, the authentication device 100 may receive the facial feature information from the information processing device 200.
  • Next, the authentication unit 150 collates the acquired facial feature information with the facial feature information 112 in the face information DB 110 (S32).
  • If there is matching facial feature information (Yes in S33), the authentication unit 150 specifies the user ID 111 of the user whose facial feature information matches (S34) and returns the success of the face authentication, together with the specified user ID 111, to the information processing apparatus 200 (S35). If there is no matching facial feature information (No in S33), the authentication unit 150 returns to the information processing apparatus 200 that the face authentication has failed (S36).
  • The authentication unit 150 does not need to attempt matching against all the facial feature information 112 in the face information DB 110.
  • For example, the authentication unit 150 may preferentially attempt matching against facial feature information registered in the period from several days before up to the day the face authentication request is received. This can improve matching speed. If this preferential collation fails, the authentication unit 150 preferably then collates against all the remaining facial feature information.
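The recency-first strategy described here — collate first against facial feature information registered within the last few days, then fall back to the rest — might be implemented along these lines. The record fields, the window length, and the equality-based matcher stub are assumptions:

```python
from datetime import date, timedelta

def match_with_recency_priority(query, entries, today,
                                recent_days=3, matcher=None):
    """Two-pass collation: recently registered entries first,
    then every remaining entry as a fallback."""
    matcher = matcher or (lambda q, e: q == e["features"])  # stub comparison
    cutoff = today - timedelta(days=recent_days)
    recent = [e for e in entries if e["registered"] >= cutoff]
    rest = [e for e in entries if e["registered"] < cutoff]
    for entry in recent + rest:  # preferential pass, then fallback pass
        if matcher(query, entry):
            return entry["user_id"]
    return None

entries = [
    {"user_id": "U1", "features": [1, 2], "registered": date(2021, 12, 1)},
    {"user_id": "U2", "features": [3, 4], "registered": date(2021, 12, 9)},
]
hit = match_with_recency_priority([3, 4], entries, today=date(2021, 12, 10))
miss = match_with_recency_priority([9, 9], entries, today=date(2021, 12, 10))
```

Visitors typically register on the day of their visit, so the recent partition is small and is searched first, which is the speed-up the passage describes.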
  • Each of the face authentication terminals 400-1, 400-2, . . . 400-n is an information terminal including a camera and a display device.
  • The face authentication terminal 400 transmits a usage registration request for the user U to the information processing device 200 via the network N.
  • The face authentication terminal 400 also captures the registration image used for the user U's face authentication at the time of use registration.
  • The face authentication terminal 400 transmits a face information registration request including the registration image to the authentication device 100 via the network N.
  • Alternatively, the face authentication terminal 400 may transmit the face information registration request including the registration image to the authentication device 100 via the information processing device 200.
  • The user U may also use the user terminal 500 to perform usage registration and face information registration.
  • The face authentication terminal 400 captures a face image for authentication used for the user U's face authentication. For example, the face authentication terminal 400 uses images of the user U captured at each installation location as images for authentication.
  • The face authentication terminal 400 transmits a face authentication request including the image for authentication to the information processing device 200 via the network N.
  • The face authentication terminal 400 may include, in the face authentication request, a location ID that identifies the location where the face authentication terminal 400 is installed.
  • The face authentication terminal 400 may also include the shooting time in the face authentication request.
  • The face authentication terminal 400 receives the face authentication result and smile level information from the information processing device 200 via the network N, and displays the information on the screen as necessary. Further, when the face authentication terminal 400 receives input of shooting end information indicating that the user U wishes to end shooting, it transmits a shooting end request including the shooting end information to the information processing apparatus 200.
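A face authentication request carrying the authentication image reference, the location ID, and the shooting time, as described above, could be assembled like this. The field names and the JSON transport are assumptions, not a format defined by the disclosure:

```python
import json
from datetime import datetime

def build_face_auth_request(image_ref, location_id, shot_at):
    """Assemble the request body the terminal would send over network N."""
    return {
        "type": "face_auth_request",
        "image_ref": image_ref,             # reference to the authentication image
        "location_id": location_id,         # identifies the installation point
        "captured_at": shot_at.isoformat()  # shooting time
    }

req = build_face_auth_request("blob://img-001", "A2",
                              datetime(2021, 12, 10, 14, 30))
payload = json.dumps(req, sort_keys=True)  # serialized for transmission
```

On the server side, the location ID lets the recording control unit map the image to the content associated with that point, and the timestamp supports aggregation over time.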
  • FIG. 7 is a block diagram showing the configuration of the face authentication terminal 400 according to the second embodiment.
  • The face authentication terminal 400 includes a camera 410, a storage unit 420, a communication unit 430, a display unit 440, a control unit 450, and an input unit 460.
  • The camera 410 is a photographing device that takes pictures under the control of the control unit 450.
  • The storage unit 420 is a storage device that stores a program for realizing each function of the face authentication terminal 400.
  • The communication unit 430 is a communication interface with the network N.
  • The display unit 440 is a display device.
  • The input unit 460 is an input device that receives input from the user U.
  • The display unit 440 and the input unit 460 may be configured integrally.
  • For example, the display unit 440 and the input unit 460 are a touch panel.
  • The control unit 450 controls the hardware included in the face authentication terminal 400.
  • The control unit 450 includes an imaging control unit 451, a registration unit 452, an authentication control unit 453, and a display control unit 454.
  • the authentication control unit 453 is not essential in the face authentication terminal 400-1 installed at the point A1.
  • the registration unit 452 is not essential in the face authentication terminals 400-2 to 400-n installed at the points A2 to An.
  • the imaging control unit 451 controls the camera 410 to capture the user U's registration image or authentication image.
  • the registered image and the authentication image are images including at least the user's face area. Note that the photographed image (authentication image) at the point A2 or the like may include scenery, animals, or the like unique to the point in the background.
  • the imaging control section 451 outputs the registration image to the registration section 452 .
  • the imaging control unit 451 also outputs the authentication image to the authentication control unit 453 .
  • the registration unit 452 transmits a face information registration request including the registration image to the authentication device 100 via the network N.
  • the registration unit 452 may transmit the face information registration request to the information processing apparatus 200 via the network N.
  • the registration unit 452 transmits a usage registration request to the information processing apparatus 200 via the network N.
  • the registration unit 452 may transmit the user attribute information received by the input unit 460 to the information processing apparatus 200 via the network N at the time of registration for use.
  • the user attribute information may be included in the usage registration request.
  • Authentication control unit 453 transmits a face authentication request including an authentication image to information processing apparatus 200 via network N, receives the result of face authentication, and outputs the result to display control unit 454 .
  • the authentication control unit 453 outputs an input screen for information on completion of photographing to the display control unit 454 and prompts the user U to input.
  • the authentication control unit 453 transmits a shooting end request to the information processing apparatus 200 in response to receiving the shooting end information from the user U.
  • the display control unit 454 displays on the display unit 440 the display content according to the face authentication result and the degree of smile.
  • a user terminal 500 is an information terminal owned by a user U.
  • the user terminal 500 is, for example, a mobile phone terminal, a smart phone, a tablet terminal, a PC (Personal Computer) equipped with or connected to a camera, or the like.
  • the user terminal 500 is associated with the user U's user ID or facial feature information.
  • the user terminal 500 is a display terminal that can be specified by the user ID or facial feature information in the information processing apparatus 200 .
  • the user terminal 500 is a terminal logged in by the user U with his own user ID.
  • the user terminal 500 transmits a service usage registration request to the information processing device 200 via the network N. Also, the user terminal 500 transmits a registration image used for face authentication of the user U to the authentication device 100 and makes a face information registration request. Note that the user terminal 500 may transmit the facial feature information extracted from the registered image to the authentication device 100 to make a face information registration request. The user terminal 500 may transmit the registered image and facial feature information to the authentication device 100 via the information processing device 200. Also, the user terminal 500 acquires an image corresponding to the degree of smile from the information processing apparatus 200 via the network N.
  • FIG. 8 is a block diagram showing the configuration of the user terminal 500 according to the second embodiment.
  • User terminal 500 includes camera 510 , storage unit 520 , communication unit 530 , display unit 540 , control unit 550 , and input unit 560 .
  • the camera 510 is an image capturing device that performs image capturing under the control of the control unit 550 .
  • the storage unit 520 is a storage device in which programs for realizing each function of the user terminal 500 are stored.
  • a communication unit 530 is a communication interface with the network N.
  • The display unit 540 is a display device.
  • Input unit 560 is an input device that receives an input.
  • the display unit 540 and the input unit 560 may be configured integrally. As an example, the display unit 540 and the input unit 560 are touch panels.
  • the control unit 550 controls hardware of the user terminal 500 .
  • the control unit 550 includes an imaging control unit 551 , a registration unit 552 , an acquisition unit 553 and a display control unit 554 .
  • the photographing control unit 551 controls the camera 510 to photograph a registered image of the user U.
  • the imaging control section 551 outputs the registration image to the registration section 552 .
  • the registration unit 552 transmits a face information registration request including the registered image to the authentication device 100 via the network N.
  • the registration unit 552 may transmit the face information registration request to the information processing apparatus 200 via the network N.
  • the registration unit 552 transmits a service usage registration request to the information processing apparatus 200 via the network N.
  • the registration unit 552 may transmit the user attribute information received by the input unit 560 to the information processing apparatus 200 via the network N at the time of registration for use.
  • the user attribute information may be included in the usage registration request.
  • Acquisition unit 553 acquires an image corresponding to the degree of smile from information processing apparatus 200 via network N.
  • The acquisition unit 553 also outputs the image corresponding to the degree of smile to the display control unit 554.
  • the display control unit 554 displays an image corresponding to the degree of smile on the display unit 540 .
  • the information processing apparatus 200 is an information processing apparatus that provides the user U with an image corresponding to the degree of smile, using images of the user U captured at the point A1 and the like, and that analyzes the degree of satisfaction with content using the collected information.
  • the information processing apparatus 200 may be made redundant by a plurality of servers, and each functional block may be realized by a plurality of computers.
  • the image storage server 300 is one or more file servers for storing images generated by the information processing device 200 according to the degree of smile. Note that the image storage server 300 may store a photographed image for authentication. Upon receiving the image acquisition request, the image storage server 300 provides the requester with an image corresponding to the degree of smile or a photographed image.
  • FIG. 9 is a block diagram showing the configuration of an information processing apparatus 200 according to the second embodiment.
  • Information processing apparatus 200 includes storage unit 210 , memory 220 , communication unit 230 , and control unit 240 .
  • the storage unit 210 is a storage device such as a hard disk or flash memory.
  • Storage unit 210 stores program 211 , user information 212 , history information 213 , content information 214 , and consolidated information 215 .
  • the program 211 is a computer program in which the processing of the information processing method according to the second embodiment is implemented.
  • the user information 212 is basic information related to the user. That is, the user information 212 is user attribute information and the like. Specifically, the user information 212 is information in which a user ID 2121 and user attribute information 2122 are associated with each other.
  • the user ID 2121 is information for identifying the user U, and is a user ID notified when face information is registered in the authentication device 100 .
  • the user attribute information 2122 indicates the attribute information of the user U, and corresponds to the "subject's attribute information" according to the first embodiment.
  • the history information 213 is the shooting history of the user U using the face authentication terminal 400 at each location.
  • the history information 213 is information in which a user ID 2131, a spot ID 2132, a date and time 2133, a smile level 2134, and access information 2135 are associated with each other.
  • the user ID 2131 is information for identifying the user U, and is the user ID included in the face authentication result when the face authentication is successful.
  • the point ID 2132 is information for identifying the point where the face authentication terminal 400 that captured the captured image for face authentication is installed.
  • the date and time 2133 is the date and time when the captured image for face authentication was taken or the date and time when face authentication was performed.
  • the degree of smile 2134 is the degree of smile calculated based on the photographed image for face authentication.
  • the access information 2135 is access information of a storage destination of a captured image or a composite image generated based on the captured image.
  • the access information is, for example, link information to the Web content corresponding to the image.
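As an illustration, one record of the history information 213 described above might be modeled as follows. The Python types and the ISO 8601 date format are assumptions; the disclosure only names the five associated fields.

```python
from dataclasses import dataclass

@dataclass
class HistoryRecord:
    """One row of history information 213 (field names illustrative)."""
    user_id: str         # user ID 2131: identifies user U
    point_id: str        # point ID 2132: where the capturing terminal is installed
    date_time: str       # date and time 2133 (ISO 8601 assumed for simplicity)
    smile_degree: float  # degree of smile 2134 (a 0.0-1.0 range is assumed)
    access_info: str     # access information 2135: link to the stored image

record = HistoryRecord("U001", "A2", "2021-12-14T10:30:00", 0.82,
                       "https://img.example.com/U001/42")
```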
  • the content information 214 is information in which the point ID 2141, the content ID 2142, and the content attribute information 2143 are associated with each other.
  • the content ID 2142 is information identifying content associated with the point ID 2141 .
  • the content ID 2142 is the ID of content whose providing place exists within a predetermined distance from the point identified by the point ID 2141.
  • the content attribute information 2143 is attribute information of the content of the content ID 2142 .
  • the content attribute information 2143 may be the type of content. Examples of types of content may be lion/panda, carnivore/herbivore, or indoor/outdoor.
  • the content attribute information 2143 may be configured including the location ID 2141 .
  • the aggregated information 215 is information obtained by aggregating smile degrees based on user U's attribute information and content attribute information.
  • FIG. 10 is a diagram illustrating an example of the data structure of aggregated information 215 according to the second embodiment.
  • aggregated information 215 associates user attribute information, content attribute information, smile level, and content satisfaction level.
  • In the example of FIG. 10, the user attribute information is an age group (infants, junior high school students, ...), and the content attribute information is an animal type (lion, panda, ...).
  • the degree of smile is aggregated for each combination of user attribute information and content attribute information.
  • Content satisfaction is recorded for each combination of user attribute information and content attribute information.
  • the content satisfaction may be a statistic of the aggregated smile degrees (e.g., their mean).
  • the aggregation information 215 may include a content ID instead of the content attribute information.
  • the degree of smile may be aggregated for each combination of user attribute information and content ID, and content satisfaction may be recorded for each combination of user attribute information and content ID. From this figure, it can be seen, for example, that infants show higher content satisfaction for pandas than for lions, whereas junior high school students show higher content satisfaction for lions than for pandas.
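The aggregation described above, with smile degrees grouped per combination of user attribute information and content attribute information and content satisfaction taken as a statistic such as the mean, can be sketched as follows. The sample values are invented for illustration.

```python
from collections import defaultdict
from statistics import mean

# (user attribute, content attribute, smile degree) samples; values invented
samples = [
    ("infant", "panda", 0.9), ("infant", "panda", 0.8),
    ("infant", "lion", 0.5),
    ("junior_high", "lion", 0.85), ("junior_high", "panda", 0.6),
]

grouped = defaultdict(list)
for age_group, animal, smile in samples:
    grouped[(age_group, animal)].append(smile)   # aggregate per combination

# content satisfaction recorded per combination, here as the mean smile degree
satisfaction = {key: mean(values) for key, values in grouped.items()}
```

With these invented samples, infants score pandas above lions while junior high school students score lions above pandas, mirroring the tendency read off FIG. 10.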
  • the memory 220 is a volatile storage device such as a RAM (Random Access Memory), and is a storage area for temporarily holding information when the control unit 240 operates.
  • the communication unit 230 is a communication interface with the network N.
  • the control unit 240 is a processor that controls each component of the information processing device 200, that is, a control device.
  • the control unit 240 loads the program 211 from the storage unit 210 into the memory 220 and executes the program 211 .
  • the control unit 240 realizes the functions of the registration unit 241 , the authentication control unit 242 , the calculation unit 243 , the recording control unit 244 , the image generation unit 245 , the output unit 246 and the evaluation unit 247 .
  • the registration unit 241 is an example of the registration unit 11 described above.
  • the registration unit 241 registers the user ID 2121 and the user attribute information 2122 in the storage unit 210 when the face information of the user U is registered by the authentication device 100 at the time of service usage registration and the user ID is notified. Thereby, the attribute information of the user U is associated with the facial feature information 112 stored in the authentication device 100 via the user ID.
  • the user attribute information 2122 may be information that the information processing apparatus 200 receives via the network N the input data that the user U has input to the face authentication terminal 400 or the user terminal 500 .
  • the registration unit 241 registers the information provided by the user U in association with the user ID as the user U attribute information.
  • the user attribute information 2122 may be information generated by the registration unit 241 based on input data from the user U. For example, when the input data from the user U includes schedule information including flight times and an itinerary, the registration unit 241 may estimate the means of arrival and register the user attribute information 2122 including the means of arrival in the storage unit 210. Note that the registration unit 241 may acquire the schedule information from the schedule application of the user terminal 500 instead of from the input data of the user U. This saves the user U the trouble of inputting it.
  • the user attribute information 2122 may be information generated by the registration unit 241 based on a captured image of the user U.
  • the captured image may be a captured image captured for face authentication or face registration.
  • the registration unit 241 may estimate the age and gender of the user U from the captured image, and generate the user attribute information 2122 based on the estimated information.
  • the registration unit 241 may generate companion information and include it in the user attribute information 2122 .
  • the registration unit 241 may determine that the user U has a companion when a plurality of person areas are detected from the captured image.
  • For detecting person areas, the registration unit 241 can use an existing method. Further, the registration unit 241 may estimate the relationship (group attribute) between the user U and the companions from the captured image, and generate companion information including the group attribute. Note that when various types of attribute information of the user U are estimated from the photographed image, the entity performing the estimation is not limited to the registration unit 241 and may be the authentication device 100. In this case, the registration unit 241 receives the attribute information of the user U together with the user ID from the authentication device 100. In this way, by acquiring the attribute information of the user U from the captured image, the user U is saved the effort of inputting it.
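As a minimal sketch of the companion-information generation described above: assuming some existing person detector has already returned a list of person bounding boxes, companion presence can be derived from the number of detected person areas. The function and field names are hypothetical.

```python
def make_companion_info(person_boxes):
    """Companion information derived from detected person areas.

    `person_boxes` is the output of any existing person detector (a list
    of (x, y, w, h) bounding boxes); the detector itself is out of scope.
    A plurality of person areas is taken to mean a companion is present.
    """
    n = len(person_boxes)
    return {"has_companion": n > 1, "companion_count": max(n - 1, 0)}

info = make_companion_info([(10, 20, 80, 200), (120, 25, 90, 210)])
```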
  • Receiving a service usage registration request may be regarded as obtaining the attribute information of the user U together with permission to utilize the data of that attribute information, that is, permission for its use.
  • the authentication control unit 242 is an example of the authentication control unit 12 described above.
  • the authentication control unit 242 controls face authentication for the face area of the user U included in the captured image. That is, the authentication control unit 242 controls face authentication for the face area of the user U included in each photographed image photographed at each location. That is, the authentication control unit 242 causes the authentication device 100 to perform face authentication on the captured image acquired from the face authentication terminal 400 .
  • the authentication control unit 242 transmits a face authentication request including the acquired captured image, location ID, and shooting date and time to the authentication device 100 via the network N, and receives the face authentication result from the authentication device 100 .
  • the authentication control unit 242 may detect the face area of the user U from the captured image and include the image of the face area in the face authentication request.
  • the authentication control unit 242 may extract facial feature information from the face area and include the facial feature information in the face authentication request. The authentication control unit 242 then supplies the captured image to the calculation unit 243 and the image generation unit 245 and supplies the face authentication result to the recording control unit 244 .
  • the calculation unit 243 is also called calculation means.
  • the calculation unit 243 calculates the degree of smile of the user U based on the face area of the user U included in the captured image.
  • a specific example of calculating the degree of smile of the user U will be described.
  • the calculation unit 243 may extract feature points of facial features such as both eyes and mouth in the face region of the user U, and calculate the degree of smile based on the coordinates of each extracted feature point. Further, the calculation unit 243 may calculate the degree of smile using a learned discriminator.
  • the discriminator may be a support vector machine (SVM) that takes as input the feature amount of facial organs and outputs whether or not the degree of smile is greater than or equal to a predetermined value. Also, the discriminator may be a convolutional neural network (CNN) that receives the face area of the captured image as input and outputs the degree of smile. Note that the method for calculating the degree of smile is not limited to the above, and existing methods may be used.
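As a rough illustration of the feature-point approach mentioned above, the following heuristic scores raised mouth corners relative to the mouth's vertical midpoint; a trained SVM or CNN discriminator, as also described, would replace such a hand-made rule in practice. The landmark names and the normalization are assumptions.

```python
def smile_degree_from_landmarks(landmarks):
    """Heuristic smile degree in [0, 1] from facial feature points.

    `landmarks` maps names to (x, y) pixel coordinates, with y growing
    downward. Mouth corners raised above the mouth's vertical midpoint
    score above 0.5; drooping corners score below 0.5.
    """
    lx, ly = landmarks["mouth_left"]
    rx, ry = landmarks["mouth_right"]
    _, top_y = landmarks["mouth_top"]
    _, bottom_y = landmarks["mouth_bottom"]
    mid_y = (top_y + bottom_y) / 2
    corner_lift = mid_y - (ly + ry) / 2   # positive when the corners are raised
    width = max(rx - lx, 1)               # normalize by mouth width
    return max(0.0, min(1.0, 0.5 + corner_lift / width))

smiling = smile_degree_from_landmarks({
    "mouth_left": (40, 96), "mouth_right": (80, 96),
    "mouth_top": (60, 98), "mouth_bottom": (60, 106),
})
```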
  • When the captured image includes companions of the user U, the calculation unit 243 may calculate the degree of smile based on the degree of smile of each companion's face region in addition to the degree of smile of the face region of the user U. For example, the calculation unit 243 may calculate, as the degree of smile, the average value of the smile degree of the user U's face region and the smile degrees of the companions' face regions.
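The averaging over the user U and the companions can be sketched as:

```python
from statistics import mean

def group_smile_degree(user_smile, companion_smiles):
    """Average the user U's smile degree with those of any companions."""
    if not companion_smiles:
        return user_smile            # no companion: the user's degree alone
    return mean([user_smile, *companion_smiles])

overall = group_smile_degree(0.9, [0.6, 0.75])
```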
  • the calculation unit 243 supplies the smile degree information to the recording control unit 244 and the image generation unit 245 .
  • the recording control unit 244 is an example of the recording control unit 14 described above.
  • the recording control unit 244 associates the user ID, the location ID, and the date and time included in the face authentication result with the degree of smile calculated by the calculation unit 243, and stores the result in the storage unit 210 as the history information 213. Record.
  • the recording control unit 244 may record the position information of the photographing location by including it in the history information 213 .
  • the image generation unit 245 is also called image generation means.
  • the image generator 245 uses the degree of smile and the photographed image to identify or generate an image corresponding to the degree of smile. For example, the image generation unit 245 may generate a composite image in which the degree of smile is superimposed on the photographed image as an image corresponding to the degree of smile. Further, when the degree of smile is equal to or greater than a predetermined value, the image generation unit 245 may specify the photographed image as an image corresponding to the degree of smile. Further, when the degree of smile is equal to or greater than a predetermined value, the image generation unit 245 may generate a composite image in which the degree of smile is superimposed on the photographed image as an image corresponding to the degree of smile.
  • the image generator 245 may synthesize the generated or specified images using a predetermined template, and use the synthesized image as an image corresponding to the degree of smile.
  • the predetermined template may be a template related to content associated with the shooting location. For example, if the shooting location is associated with a panda, the template may include an image of a panda.
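The generation rules above (provide an image only when the degree of smile is at least the predetermined value, and compose it with a template tied to the shooting location) can be sketched as a composition plan. The threshold 0.7, the point-to-template map, and the layer representation are assumptions; pixel-level compositing with an imaging library is deliberately omitted.

```python
SMILE_THRESHOLD = 0.7                   # the "predetermined value"; 0.7 is assumed
TEMPLATES = {"A2": "panda_frame.png"}   # hypothetical point-ID-to-template map

def plan_smile_image(point_id, smile_degree):
    """Return a composition plan for the image corresponding to the smile
    degree, or None when the degree is below the threshold."""
    if smile_degree < SMILE_THRESHOLD:
        return None
    return {"layers": ["photo",
                       f"smile_overlay:{smile_degree:.2f}",          # superimposed degree
                       TEMPLATES.get(point_id, "plain_frame.png")]}  # location template

plan = plan_smile_image("A2", 0.82)
```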
  • the image generation unit 245 saves the image corresponding to the degree of smile in the image saving server 300 and records the access information of the saving destination in the storage unit 210 as the history information 213 .
  • the output unit 246 is an example of the output unit 16 described above.
  • the output unit 246 outputs an image corresponding to the degree of smile to the user terminal 500 via the network N when a predetermined provision condition is satisfied.
  • the predetermined provision condition may be that the communication unit 230 receives a photographing end request from the face authentication terminal 400 after successful face authentication based on the captured image.
  • Images corresponding to smile degrees are stored in the image storage server 300. Therefore, in this case, the output unit 246 may acquire an image corresponding to the degree of smile from the image storage server 300 using the access information corresponding to the user ID included in the history information 213, and transmit the image to the user terminal 500.
  • the images corresponding to the degree of smile acquired at this time may be a plurality of images corresponding to images taken at different shooting dates and times. Accordingly, when there are a plurality of images corresponding to the degree of smile, the user U can acquire the images collectively.
  • the predetermined provision condition may simply be that face authentication based on the captured image has succeeded.
  • the output unit 246 transmits information according to the degree of smile to the user terminal 500 when the face authentication based on the captured image is successful even if the request to end the photographing is not received.
  • the output unit 246 may determine information corresponding to the degree of smile to be output to the user terminal 500 according to the amount of attribute information provided by the user U.
  • the attribute information provided by the user U may be attribute information input by the user U or attribute information licensed by the user U.
  • The output unit 246 may determine the number of images to be output according to the amount of attribute information provided by the user U. As a result, the incentives for providing attribute information and permitting its use are more likely to work, and the acquisition of attribute information is further facilitated.
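One way to realize "the number of images determined according to the amount of attribute information" is a simple proportional rule, sketched below; the one-image-per-attribute-item rate is an assumption, as the disclosure only states that the count depends on the amount provided.

```python
def images_to_output(candidate_images, provided_attributes, per_attribute=1):
    """Scale the number of returned images with the amount of attribute
    information the user U provided (one image per attribute item here)."""
    limit = per_attribute * len(provided_attributes)
    return candidate_images[:limit]

selected = images_to_output(["img1", "img2", "img3"],
                            {"age": "30s", "arrival": "car"})
```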
  • the evaluation unit 247 is an example of the evaluation unit 17 described above.
  • the evaluation unit 247 uses the user information 212 , the history information 213 and the content information 214 to generate aggregate information 215 .
  • the evaluation unit 247 acquires the user attribute information of the user information 212 using the user ID corresponding to the degree of smile of the history information 213 .
  • the evaluation unit 247 acquires the content attribute information or the content ID of the content information 214 using the point ID corresponding to the degree of smile of the history information 213 .
  • the evaluation unit 247 associates the degree of smile, the user attribute information, and the content attribute information or the content ID, and records them as aggregated information 215 .
  • the evaluation unit 247 may aggregate the degree of smile based on at least one of the position information of the shooting location, the shooting time period, and the order of visits to the shooting locations.
  • the position information of the shooting location indicates the position information of the content.
  • the shooting time period indicates the time period in which the content is experienced.
  • the visit order of the shooting locations indicates the order in which the content is experienced.
  • the evaluation unit 247 calculates the degree of content satisfaction based on the degree of smile aggregated based on various parameters.
  • the evaluation unit 247 can evaluate the degree of satisfaction with the content along various evaluation axes, enabling detailed analysis. For example, it is possible to analyze that "when the panda viewing spot is closer to the entrance of the zoo, elderly people are more satisfied with the overall content than when it is farther away".
  • FIG. 11 is a flowchart showing the flow of usage registration processing according to the second embodiment.
  • the registration unit 241 receives a usage registration request via the network N (S401).
  • the request source is assumed to be either the face authentication terminal 400 or the user terminal 500 .
  • the registration unit 241 requests the authentication device 100 via the network N to register the face information in the face information DB 110 when the photographed image is received in addition to the use registration request.
  • the registration unit 241 acquires the user ID from the authentication device 100 via the network N (S402). Then, the registration unit 241 acquires the attribute information of the user U via the network N from the face authentication terminal 400 or the user terminal 500 that made the request (S403). As described above, instead of this, the registration unit 241 may generate attribute information of the user U based on the input data of the user U or the captured image. Then, the registration unit 241 registers the attribute information of the user U as the user attribute information 2122 of the user information 212 in association with the user ID 2121 (S404). Then, the registration unit 241 terminates the processing.
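The usage registration flow S401 to S404 can be condensed into the following sketch, in which `authentication_device` stands in for the authentication device 100 and is assumed to expose a `register_face()` call returning a user ID. All identifiers are illustrative, not part of the disclosure.

```python
class StubAuthDevice:
    """Stand-in for authentication device 100 (hypothetical interface)."""
    def register_face(self, image):
        return "U001"   # the notified user ID (S402)

def handle_usage_registration(request, authentication_device, user_info):
    """Sketch of S401-S404: register face information when a photographed
    image accompanies the request, obtain the user ID, and store the
    attribute information of user U keyed by that ID."""
    if request.get("image") is not None:
        user_id = authentication_device.register_face(request["image"])  # S401-S402
    else:
        user_id = request["user_id"]
    user_info[user_id] = request.get("attributes", {})                   # S403-S404
    return user_id

user_info = {}
uid = handle_usage_registration(
    {"image": b"jpeg-bytes", "attributes": {"age": "30s"}},
    StubAuthDevice(), user_info)
```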
  • FIG. 12 is a flowchart showing the flow of image output processing according to the second embodiment.
  • the authentication control unit 242 acquires a captured image from the face authentication terminal 400 via the network N (S411).
  • the authentication control unit 242 transmits a face authentication request to the authentication device 100 via the network N (S412).
  • the authentication control unit 242 includes at least one of the photographed image acquired in step S411, the face area extracted from the photographed image, or the facial feature information extracted from the face area in the face authentication request.
  • the authentication control unit 242 then receives the face authentication result from the authentication device 100 via the network N (S413).
  • the face authentication result includes an indication of success and the user ID when the face authentication is successful, and includes an indication of failure when the face authentication is unsuccessful.
  • the authentication control unit 242 determines whether face authentication has succeeded (S414). When it is determined that the face authentication has failed (No in S414), the authentication control unit 242 outputs that the face authentication has failed (S415). Specifically, the authentication control unit 242 transmits a message to the face authentication terminal 400 that provides the image via the network N to the effect that the face authentication has failed. The authentication control unit 242 then terminates the process.
  • When it is determined that the face authentication has succeeded (Yes in S414), the authentication control unit 242 identifies the user ID for which face authentication has succeeded (S416). Specifically, the authentication control unit 242 extracts the user ID included in the face authentication result. At this time, the authentication control unit 242 may output that the face authentication was successful. Specifically, the authentication control unit 242 may transmit a message to the face authentication terminal 400 that provides the image via the network N to the effect that the face authentication has succeeded.
  • the calculation unit 243 calculates the degree of smile based on the captured image (S417). Subsequently, the image generation unit 245 generates an image corresponding to the degree of smile based on the photographed image and the degree of smile (S418). The image generation unit 245 stores the generated image corresponding to the degree of smile in the image storage server 300 via the network N (S419). The image generation unit 245 then acquires the access information of the image storage destination from the image storage server 300 . Subsequently, the recording control unit 244 registers the history information 213 (S420).
  • the recording control unit 244 associates the user ID, the spot ID, the date and time, the degree of smile, and the access information, and records them in the storage unit 210 as the history information 213 .
  • the image generation unit 245 also outputs an image corresponding to the degree of smile to the face authentication terminal 400 via the network N (S421). As a result, the face authentication terminal 400 displays an image corresponding to the degree of smile on the display unit 440 .
  • the output unit 246 determines whether or not a predetermined provision condition is satisfied (S422). For example, the output unit 246 determines whether or not a predetermined provision condition is satisfied by determining whether or not the communication unit 230 has received a shooting end request from the face authentication terminal 400 .
  • When the output unit 246 determines that the predetermined provision condition is not satisfied (No in S422), the process ends.
  • When the output unit 246 determines that the predetermined provision condition is satisfied (Yes in S422), the output unit 246 acquires an image corresponding to the degree of smile from the image storage server 300 using the access information of the history information 213 (S423).
  • the output unit 246 then transmits the image corresponding to the degree of smile to the user terminal 500 via the network N (S424).
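Steps S422 to S424 (provision condition satisfied, fetch via access information, transmit) can be sketched as follows, with in-memory stand-ins for the image storage server 300 and the network transmission. The record layout matches nothing beyond the named history fields.

```python
def output_on_end_request(user_id, history, image_store, send):
    """Sketch of S422-S424: once the provision condition holds (e.g. a
    shooting end request arrived), fetch every stored image for the user
    via its access information and send the batch to the user terminal."""
    links = [rec["access_info"] for rec in history if rec["user_id"] == user_id]
    images = [image_store[link] for link in links]   # S423: fetch via access info
    send(user_id, images)                            # S424: transmit to the terminal
    return len(images)

# tiny in-memory stand-ins for the image storage server and the network send
store = {"link1": "imageA", "link2": "imageB"}
history = [{"user_id": "U001", "access_info": "link1"},
           {"user_id": "U002", "access_info": "link2"}]
sent = []
count = output_on_end_request("U001", history, store,
                              lambda uid, imgs: sent.append((uid, imgs)))
```

Because all of a user's access information is gathered before sending, images taken at different dates and times are delivered in one batch, as described above.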
  • the image corresponding to the degree of smile may be generated after the output unit 246 determines that the provision condition is satisfied.
  • In this case, step S418 is omitted, and instead of step S419, the image generator 245 stores the captured image in the image storage server 300 via the network N.
  • the image generation unit 245 acquires the access information of the image storage destination from the image storage server 300 .
  • In step S421, the image generator 245 may output only the smile level information to the face authentication terminal 400 via the network N.
  • the output unit 246 acquires the captured image corresponding to the user U from the image storage server 300 using the access information of the history information 213.
  • the image generation unit 245 generates an image corresponding to the degree of smile based on the captured image and the degree of smile that have been acquired.
  • the output unit 246 transmits the generated image corresponding to the degree of smile to the user terminal 500 via the network N.
  • FIG. 13 is a flowchart showing the flow of evaluation processing according to the second embodiment.
  • This evaluation process may be executed periodically, or may be executed when a predetermined execution condition is satisfied, such as when the history information 213 is updated.
  • the evaluation unit 247 determines whether or not to start the evaluation process (S431). If it is determined to start the process (Yes in S431), the evaluation unit 247 aggregates the smile degrees (S432) and records the aggregated information 215.
  • the evaluation unit 247 calculates the degree of content satisfaction based on the aggregated degree of smile (S433).
  • the evaluation unit 247 then outputs the calculated content satisfaction level to the outside in a predetermined format together with the user attribute information and content attribute information (or content ID) (S434).
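The aggregation and evaluation in steps S432 and S433 can be sketched as follows. This is a minimal illustration, not the patented implementation: the record layout, the attribute bands, and the use of the mean smile degree as the satisfaction score are all assumptions made for the example.

```python
from collections import defaultdict
from statistics import mean

# Hypothetical history records: user attributes, content ID, smile degree (0-100).
history = [
    {"age_band": "20s", "gender": "F", "content_id": "C1", "smile": 80},
    {"age_band": "20s", "gender": "F", "content_id": "C1", "smile": 90},
    {"age_band": "30s", "gender": "M", "content_id": "C1", "smile": 40},
    {"age_band": "30s", "gender": "M", "content_id": "C2", "smile": 70},
]

def aggregate_smiles(records):
    """S432: group smile degrees by content and user attributes."""
    groups = defaultdict(list)
    for r in records:
        key = (r["content_id"], r["age_band"], r["gender"])
        groups[key].append(r["smile"])
    return groups

def satisfaction(groups):
    """S433: derive a per-group satisfaction score; here simply the mean smile degree."""
    return {key: mean(smiles) for key, smiles in groups.items()}

scores = satisfaction(aggregate_smiles(history))
print(scores[("C1", "20s", "F")])  # mean of 80 and 90 -> 85
```

Grouping by attribute bands in this way is what lets the operator compare, for example, how content C1 is received by visitors in their 20s versus their 30s.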
  • FIG. 14 is a sequence diagram showing an example of the flow of usage registration processing according to the second embodiment.
  • face authentication terminal 400-1 transmits a service use registration request to information processing apparatus 200 (S500). Further, the face authentication terminal 400-1 photographs the user U (S501), and transmits a face information registration request including the photographed image to the authentication device 100 via the network N (S502). Then, the authentication device 100 registers face information (face feature information) of the user U based on the captured image included in the received face information registration request (S503). The authentication device 100 then notifies the information processing device 200 of the user ID via the network N (S504). The face authentication terminal 400-1 also transmits the attribute information of the user U to the information processing apparatus 200 via the network N (S505). The information processing apparatus 200 associates the notified user ID with the attribute information of the user U and registers them in the user information 212 (S506). Then, the user U moves from the point A1 to the point A2.
  • the face authentication terminal 400-2 takes an image of the user U for whom usage registration has been completed (S510), and transmits the taken image and the point ID to the information processing apparatus 200 via the network N (S511).
  • the information processing apparatus 200 transmits, to the authentication apparatus 100 via the network N, a face authentication request for the face area of the user U in the received captured image (S512).
  • the authentication device 100 performs face authentication on the face area of the user U in the captured image included in the received face authentication request (S513).
  • the authentication device 100 transmits, to the information processing device 200 via the network N, a face authentication result including an indication that the face authentication was successful and the user ID (S514).
  • the information processing apparatus 200 calculates the degree of smile from the photographed image (S515).
  • the information processing apparatus 200 then generates a composite image by superimposing the degree of smile on the photographed image (S516).
  • the information processing apparatus 200 saves the generated composite image in the image saving server 300 (S517).
  • the information processing apparatus 200 registers the user ID, the spot ID, the date and time, the degree of smile, and the access information of the save destination as the history information 213 (S518).
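A single entry of the history information 213 registered in S518 bundles the user ID, spot ID, date and time, smile degree, and save-destination access information. The following is a sketch of one such record; the field names and the URL form of the access information are illustrative assumptions, not taken from the disclosure.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class HistoryEntry:
    """One row of the history information 213 (field names are illustrative)."""
    user_id: str
    spot_id: str
    taken_at: datetime
    smile_degree: int   # e.g. 0-100
    access_info: str    # where the composite image was saved in S517

entry = HistoryEntry(
    user_id="U0001",
    spot_id="A2",
    taken_at=datetime(2021, 12, 1, 10, 30),
    smile_degree=82,
    access_info="https://image-server.example/store/U0001/0001",
)
print(entry.smile_degree)  # 82
```

Keeping the access information alongside the smile degree is what later allows the output unit to retrieve the stored composite without re-querying the image storage server for metadata.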
  • the information processing device 200 also transmits the generated composite image to the face authentication terminal 400-2 via the network N (S519). Face authentication terminal 400-2 that has received the synthesized image displays the synthesized image (S520).
  • the face authentication terminal 400-2 displays an input screen for shooting end information and accepts input from the user U.
  • here, it is assumed that the face authentication terminal 400-2 does not receive an input operation of shooting end information from the user U, but instead receives an input operation of continuation information indicating that the user U does not wish to end shooting (S521).
  • the face authentication terminal 400-2 transmits a continuation request including continuation information to the information processing device 200 via the network N (S522). Then, the user U moves from the point A2 to the point A3.
  • the face authentication terminal 400-3 takes an image of the user U (S530), and transmits the taken image and the point ID to the information processing device 200 via the network N (S531). Then, processing similar to steps S512 to S520 is executed.
  • the face authentication terminal 400-3 displays an input screen for the shooting end information and accepts input from the user U.
  • here, it is assumed that the face authentication terminal 400-3 receives an input operation of shooting end information from the user U (S532).
  • the face authentication terminal 400-3 transmits a photographing end request including photographing end information to the information processing apparatus 200 via the network N (S533).
  • the information processing apparatus 200 that has received the shooting end request accesses the image storage server 300 using the access information corresponding to the user ID (S534), and acquires the composite image (S535). The information processing apparatus 200 then transmits the acquired composite image to the user terminal 500 corresponding to the user ID (S536).
  • FIG. 17 is a diagram showing an example of a composite image 620 according to the second embodiment.
  • a composite image 620 shown in the figure is a composite image generated in step S516 of FIG. 15, displayed by the face authentication terminal 400 in step S520, and acquired by the user terminal 500 in step S536 of FIG. 16.
  • a composite image 620 is an image obtained by superimposing a degree of smile on a photographed image.
  • the composite image 620 includes in its background an image area of an animal representing the features of the content associated with the shooting location.
  • the information processing apparatus 200 aggregates smile degrees based on the attribute information and other information related to the user U, enabling detailed analysis of content satisfaction. This makes it easier for the operator of the event site to take measures to improve satisfaction with the content.
  • the information processing apparatus 200 can easily acquire the attribute information and facilitate detailed analysis. Therefore, it is possible to more preferably analyze the degree of satisfaction with the content at the event site.
  • the output unit 246 may transmit an image corresponding to the degree of smile to the user terminal 500 only when the user U desires.
  • the face authentication terminal 400 that has received the input operation of the shooting end information in step S532 of FIG. 16 may display an input screen for receiving input as to whether or not to obtain an image corresponding to the degree of smile.
  • FIG. 18 is a diagram showing an example of a terminal screen 610 according to the second embodiment.
  • a terminal screen 610 shown in the figure is displayed after step S532 in FIG. 16 is executed.
  • the terminal screen 610 includes an input area for receiving an input as to whether or not the user U wishes to have the displayed composite image 620 transmitted, along with the message "Can I send this picture?".
  • the face authentication terminal 400 transmits to the information processing apparatus 200 information indicating whether or not acquisition is desired together with a request to end shooting.
  • the terminal screen 610 shown in this figure may be displayed in step S520 of FIG. 15. In this case, the information processing apparatus 200 may record, for each photographed image, whether or not acquisition is desired. Then, in the processing of FIG. 16, the information processing apparatus 200 may acquire only the desired images from the image storage server 300 and transmit them to the user terminal 500.
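Recording a per-image acquisition preference and later filtering the saved images down to the ones the user asked for could look like the following sketch. The in-memory store and all names here are illustrative assumptions.

```python
# Hypothetical per-image preference store: for each user, which saved images
# the user asked to receive (recorded when the screen of FIG. 18 is answered).
wanted = {}  # (user_id, access_info) -> bool

def record_preference(user_id, access_info, desired):
    """Record whether the user wants the image saved under access_info."""
    wanted[(user_id, access_info)] = desired

def images_to_send(user_id, all_access_infos):
    """At the shooting-end request, keep only the images the user wanted."""
    return [a for a in all_access_infos if wanted.get((user_id, a), False)]

record_preference("U0001", "img-a2", True)
record_preference("U0001", "img-a3", False)
print(images_to_send("U0001", ["img-a2", "img-a3"]))  # ['img-a2']
```

Defaulting an unrecorded preference to `False` mirrors the opt-in flavor of the screen: nothing is transmitted unless the user explicitly asked for it.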
  • the output unit 246 transmits an image corresponding to the degree of smile to the user terminal 500 in order to provide it to the user U, but it may instead transmit access information for the image.
  • instead of steps S534 to S536 in FIG. 16, the output unit 246 may generate an image download guidance screen including the access information and transmit the generated guidance screen information.
  • the destination is not limited to the user terminal 500 and may be the face authentication terminal 400.
  • FIG. 19 is a diagram showing an example of a terminal screen 610 according to the second embodiment.
  • the terminal screen 610 includes access information 611.
  • a terminal screen 610 shown in this figure is displayed when the face authentication terminal 400 receives the information of the download guidance screen.
  • the access information 611 in this figure is shown as a QR code (registered trademark).
  • by displaying the access information 611, the face authentication terminal 400 presents the access information 611 to the user U.
  • the user terminal 500 reads the access information 611 in the terminal screen 610 displayed on the face authentication terminal 400 according to the user U's operation. Then, the user terminal 500 analyzes the read access information 611 and transmits an image acquisition request to the image storage server 300 via the network N based on the analysis result.
  • the image acquisition request may simply be an access (request message) from the user terminal 500 to a predetermined storage destination of the image storage server 300.
  • the image storage server 300 reads the synthetic image stored in the storage destination indicated by the image acquisition request from the user terminal 500 and transmits a response including the synthetic image to the user terminal 500 via the network N.
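The access-information round trip above can be sketched with a minimal in-memory stand-in for the image storage server 300: the user terminal parses the access information read from the QR code into a storage key, and the server answers an image acquisition request for that key. The URL format, key scheme, and store layout are assumptions for illustration only.

```python
# In-memory stand-in for the storage of the image storage server 300.
store = {"store/U0001/0001": b"<composite image bytes>"}

def parse_access_info(url):
    """User terminal side: extract the storage key from the access
    information (assumed here to be a URL containing the key)."""
    return url.split("example/", 1)[1]

def handle_image_request(key):
    """Server side: return the composite saved at the requested
    destination, or None if nothing is stored there."""
    return store.get(key)

key = parse_access_info("https://image-server.example/store/U0001/0001")
print(handle_image_request(key) is not None)  # True
```

Because the access information fully identifies the storage destination, the "request" can indeed be a plain fetch of that destination, as the text above notes.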
  • FIG. 20 is a diagram showing an example of an overall synthetic image 630 according to the second embodiment.
  • the overall composite image 630 includes a plurality of images (composite images 631, 632, 633, 634, and 635) corresponding to the degrees of smile of the user U whose face authentication has succeeded, each corresponding to an image taken at a different shooting date and time.
  • the overall composite image 630 is composited so that each of the composite images 631 to 635 can be individually identified.
  • the image generation unit 245 uses the access information corresponding to the user ID included in the history information 213 to acquire images (photographed images or images corresponding to the degree of smile) from the image storage server 300.
  • the image generation unit 245 synthesizes the plurality of images so as to be individually identifiable to generate an overall composite image.
  • the image generation unit 245 may select images shot at different shooting locations from among the plurality of images, and generate an overall composite image from the selected images. Further, the image generation unit 245 may select an image having a degree of smile equal to or greater than a predetermined value from among the plurality of images, and generate an overall composite image from the selected images.
  • the image generation unit 245 may select images in descending order of the degree of smile from among the plurality of images, and generate an overall composite image from the selected images. For example, the image generation unit 245 selects a predetermined number of images with the highest degree of smile from among the plurality of images, and generates an overall composite image from the selected images. The image generator 245 may also select an image with a degree of smile equal to or greater than a predetermined value or with the highest degree of smile for each shooting location, and generate an overall composite image from the selected images. The output unit 246 then provides the user U with the generated overall composite image. At this time, the output unit 246 may cause the user terminal 500 to display the generated overall composite image. The output unit 246 may cause the user terminal 500 to display individual images included in the overall composite image in an identifiable manner in addition to the overall composite image.
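The selection rules described above — a smile degree at or above a threshold, the top images in descending order of smile degree, or the best image per shooting location — can be sketched as simple filters. The record shape is an assumption for illustration.

```python
shots = [
    {"spot": "A1", "smile": 55, "img": "p1"},
    {"spot": "A2", "smile": 90, "img": "p2"},
    {"spot": "A2", "smile": 70, "img": "p3"},
    {"spot": "A3", "smile": 85, "img": "p4"},
]

def at_least(records, threshold):
    """Images with a smile degree equal to or greater than a predetermined value."""
    return [r for r in records if r["smile"] >= threshold]

def top_n(records, n):
    """A predetermined number of images with the highest smile degrees."""
    return sorted(records, key=lambda r: r["smile"], reverse=True)[:n]

def best_per_spot(records):
    """The image with the highest smile degree for each shooting location."""
    best = {}
    for r in records:
        if r["spot"] not in best or r["smile"] > best[r["spot"]]["smile"]:
            best[r["spot"]] = r
    return list(best.values())

print([r["img"] for r in top_n(shots, 2)])             # ['p2', 'p4']
print(sorted(r["img"] for r in best_per_spot(shots)))  # ['p1', 'p2', 'p4']
```

Whichever rule is used, the selected images would then be composited, individually identifiable, into the overall composite image 630.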
  • the user U who has visited each location can acquire an overall composite image that aggregates the captured images taken when face authentication was performed at each location by performing face authentication at a specific location. Therefore, the user U can easily comprehend the content that he/she likes out of the content that he or she has actually experienced. Furthermore, if it is an overall composite image, the user U can easily publish it on an SNS (Social Networking Service) or the like. Therefore, the recognition of the content by many people can contribute to the revitalization of the event site.
  • the authentication device 100 is connected to the information processing device 200 via the network N.
  • the functions of the face detection unit 120, the feature point extraction unit 130, the registration unit 140, and the authentication unit 150 of the authentication device 100 may be included in the control unit 240 of the information processing device 200.
  • the face information DB 110 of the authentication device 100 may be included in the storage unit 210 of the information processing device 200 .
  • the image storage server 300 is connected to the information processing apparatus 200 via the network N.
  • the functions of the image storage server 300 may be included in the storage unit 210 of the information processing apparatus 200.
  • the calculation unit 243 of the information processing apparatus 200 calculates the degree of smile based on the photographed image.
  • authentication device 100 may calculate the degree of smile based on the photographed image.
  • the calculation unit 243 of the information processing device 200 may supply the smile level information to the recording control unit 244 and the image generation unit 245 in response to receiving the smile level information from the authentication device 100.
  • although the hardware configuration has been described in the above embodiments, the configuration is not limited to this.
  • the present disclosure can also implement arbitrary processing by causing a CPU to execute a computer program.
  • the program includes instructions (or software code) that, when read into a computer, cause the computer to perform one or more of the functions described in the embodiments.
  • the program may be stored in a non-transitory computer-readable medium or tangible storage medium.
  • computer readable media or tangible storage media may include random-access memory (RAM), read-only memory (ROM), flash memory, solid-state drives (SSD) or other memory technology, CD-ROM, digital versatile disc (DVD), Blu-ray disc or other optical disc storage, magnetic cassette, magnetic tape, magnetic disc storage or other magnetic storage device.
  • the program may be transmitted on a transitory computer-readable medium or communication medium.
  • transitory computer readable media or communication media include electrical, optical, acoustic, or other forms of propagated signals.
  • (Appendix 1) An information processing apparatus comprising: a registration unit that registers attribute information of a subject in association with biometric information of the subject when a request for use registration is acquired; an authentication control unit that performs biometric authentication using a captured image when the captured image of the subject, captured at a predetermined point associated with predetermined content, is acquired; a recording control unit that, when the biometric authentication is successful, records the degree of smile of the subject calculated from the captured image in association with the attribute information of the subject and content information related to the content associated with the shooting location; an output unit that outputs information according to the degree of smile of the subject to an information terminal used by the subject whose biometric authentication has succeeded; and an evaluation unit that aggregates smile degrees based on subjects' attribute information and content information, and evaluates content satisfaction based on the aggregated smile degrees.
  • (Appendix 2) The information processing apparatus according to Appendix 1, wherein the subject's attribute information includes at least one of the subject's age, gender, occupation, family structure, hobby, companion information indicating the presence or absence of companions or group attributes, and means of visit.
  • (Appendix 3) The information processing apparatus according to Appendix 1 or 2, wherein the registration unit registers information provided by the subject in association with the subject as attribute information of the subject, and the output unit determines the information according to the degree of smile to be output to the information terminal used by the subject according to the amount of information provided by the subject.
  • (Appendix 4) The information processing apparatus according to any one of Appendices 1 to 3, wherein the output unit outputs information corresponding to the degree of smile of the subject to the information terminal in response to success of the biometric authentication based on the acquired photographed image, or outputs, to the information terminal, information corresponding to a plurality of smile degrees corresponding to images of the subject taken at different shooting dates and times when a photographing end request indicating the end of photographing is acquired.
  • (Appendix 5) The information processing apparatus according to any one of Appendices 1 to 4, wherein the information corresponding to the degree of smile is an image corresponding to the degree of smile generated based on a photographed image for which the biometric authentication has succeeded, and the image corresponding to the degree of smile is a composite image in which the degree of smile of the subject is superimposed on the photographed image, a photographed image in which the degree of smile of the subject is equal to or greater than a predetermined value, or a composite image corresponding to a photographed image in which the degree of smile of the subject is equal to or greater than the predetermined value.
  • (Appendix 6) The information processing apparatus according to Appendix 5, wherein the image corresponding to the degree of smile is an overall composite image including a plurality of images, corresponding to the degree of smile, that correspond to images of the subject taken at different shooting dates and times, and the output unit outputs the overall composite image to the information terminal used by the subject when the biometric authentication based on the acquired photographed image is successful and a photographing end request indicating the end of photographing is acquired.
  • (Appendix 7) The information processing apparatus according to any one of Appendices 1 to 6, wherein the recording control unit records the degree of smile in association with at least one of position information of the shooting location and the shooting date and time, and the evaluation unit aggregates the degree of smile based on at least one of position information of shooting locations, shooting time period, and visit order of shooting locations.
  • (Appendix 8) An information processing system comprising: an information terminal that generates a captured image of a subject; and an information processing device communicably connected to the information terminal, wherein the information processing device includes: a registration unit that registers attribute information of the subject in association with biometric information of the subject when use registration is accepted; an authentication control unit that performs biometric authentication using a captured image when the captured image of the subject, captured at a predetermined point associated with predetermined content, is obtained from the information terminal; a recording control unit that, when the biometric authentication is successful, records the degree of smile of the subject calculated from the captured image in association with the attribute information of the subject and content information related to the content associated with the shooting location; an output unit that outputs information according to the degree of smile of the subject to the information terminal used by the subject whose biometric authentication has succeeded; and an evaluation unit that aggregates smile degrees based on subjects' attribute information and content information, and evaluates content satisfaction based on the aggregated smile degrees.
  • (Appendix 9) The information processing system according to Appendix 8, wherein the subject's attribute information includes at least one of the subject's age, gender, occupation, family structure, hobby, companion information indicating the presence or absence of companions or group attributes, and means of visit.
  • (Appendix 10) An information processing method comprising: registering attribute information of a subject in association with biometric information of the subject when a request for use registration is obtained; performing biometric authentication using a captured image when the captured image of the subject, captured at a predetermined point associated with predetermined content, is obtained; recording, when the biometric authentication is successful, the degree of smile of the subject calculated from the captured image in association with the attribute information of the subject and content information related to the content associated with the shooting location; outputting information according to the degree of smile of the subject to an information terminal used by the subject whose biometric authentication is successful; and aggregating smile degrees based on subjects' attribute information and content information and evaluating content satisfaction based on the aggregated smile degrees.
  • the information processing device and information processing system according to this embodiment can be used, for example, to operate an event venue.

Landscapes

  • Business, Economics & Management (AREA)
  • Tourism & Hospitality (AREA)
  • Health & Medical Sciences (AREA)
  • Economics (AREA)
  • General Health & Medical Sciences (AREA)
  • Human Resources & Organizations (AREA)
  • Marketing (AREA)
  • Primary Health Care (AREA)
  • Strategic Management (AREA)
  • Physics & Mathematics (AREA)
  • General Business, Economics & Management (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Collating Specific Patterns (AREA)

Abstract

An information processing device (10) comprises a registration unit (11), an authentication control unit (12), a recording control unit (14), an output unit (16), and an evaluation unit (17). After acquiring a registration request, the registration unit (11) registers attribute information on a person in association with biometric information on the person. After acquiring a photographed image, which is obtained by photographing the person at a predetermined location associated with predetermined content, the authentication control unit (12) causes biometric authentication to be performed by using the photographed image. When the biometric authentication is successful, the recording control unit (14) records the degree of smile of the person, which is calculated from the photographed image, in association with the attribute information for the person and content information relating to content associated with the photography location. The output unit (16) outputs information according to the degree of smile of the person to an information terminal used by the person who has passed the biometric authentication. The evaluation unit (17) aggregates degrees of smile on the basis of the attribute information on the person and the content information, and evaluates the degree of satisfaction with the content on the basis of the aggregated degrees of smile.

Description

Information processing device, information processing system, information processing method, and non-transitory computer-readable medium
The present disclosure relates to an information processing device, an information processing system, an information processing method, and a non-transitory computer-readable medium, and in particular to an information processing device, an information processing system, an information processing method, and a non-transitory computer-readable medium for supporting the operation of an event venue.
Various applications have been proposed to increase visitors' satisfaction with the event content (hereinafter referred to as content) provided at an event venue. For example, Patent Document 1 discloses an image selection device that evaluates the degree of smile of a visitor experiencing an attraction installed in an amusement park or the like from still images, and selects the still image with the highest degree of smile as an image for sales promotion. Patent Document 2 discloses an information processing system that calculates a happiness level from face images taken when entering and exiting a space, and calculates an evaluation value for the space based on the difference in the happiness levels.
JP 2019-071553 A; WO 2017/022306
Here, in order to improve satisfaction with content, it is effective to first analyze the current level of satisfaction with the content in detail. However, Patent Document 1 does not describe analyzing the satisfaction level of attractions. One form of detailed analysis is to analyze content satisfaction for each visitor characteristic, but Patent Document 2 does not describe analyzing the evaluation value of the space for each visitor characteristic. Such detailed analysis requires collecting a large amount of information related to visitors, yet neither of the above patent documents discloses an approach for easily acquiring information related to visitors.
An object of the present disclosure is, in view of the above-described problems, to provide an information processing device, an information processing system, an information processing method, and a non-transitory computer-readable medium that suitably analyze the degree of satisfaction with content at an event venue.
An information processing device according to an aspect of the present disclosure includes:
registration means for registering attribute information of a subject in association with biometric information of the subject when a request for use registration is acquired;
Authentication control means for performing biometric authentication using the captured image when a captured image of a target person captured at a predetermined point associated with a predetermined content is acquired;
recording control means for recording, when the biometric authentication is successful, the degree of smile of the subject calculated from the photographed image in association with content information related to content associated with the subject's attribute information and a photographing location;
Output means for outputting information according to the degree of smile of the subject to the information terminal used by the subject whose biometric authentication is successful;
evaluation means for summarizing smile levels based on subject's attribute information and content information, and evaluating content satisfaction based on the summarized smile levels.
An information processing system according to one aspect of the present disclosure includes:
an information terminal that generates a captured image of a target person;
and an information processing device communicably connected to the information terminal.
The information processing device is
registration means for registering attribute information of a subject in association with biometric information of the subject when use registration is accepted;
Authentication control means for performing biometric authentication using the captured image when a captured image of a target person captured at a predetermined point associated with a predetermined content is obtained from the information terminal;
recording control means for recording, when the biometric authentication is successful, the degree of smile of the subject calculated from the photographed image in association with content information related to content associated with the subject's attribute information and a photographing location;
Output means for outputting information according to the degree of smile of the subject to the information terminal used by the subject whose biometric authentication is successful;
evaluation means for summarizing the degree of smile based on the attribute information of the subject and the content information, and evaluating satisfaction with the content based on the summarized degree of smile.
An information processing method according to an aspect of the present disclosure includes:
When the user registration request is obtained, registering the attribute information of the subject in association with the biometric information of the subject,
When a photographed image of a target person photographed at a predetermined point associated with predetermined content is acquired, biometric authentication is performed using the photographed image,
when the biometric authentication is successful, recording the degree of smile of the subject calculated from the captured image in association with content information related to content associated with the subject's attribute information and the shooting location;
Outputting information according to the degree of smile of the subject to the information terminal used by the subject whose biometric authentication is successful,
Smile levels are aggregated based on the subject's attribute information and content information, and content satisfaction is evaluated based on the aggregated smile levels.
A non-transitory computer-readable medium according to one aspect of the present disclosure stores a program that causes a computer to execute:
a registration process of registering attribute information of a subject in association with biometric information of the subject when a request for use registration is acquired;
Authentication control processing for performing biometric authentication using the captured image when a captured image of a target person captured at a predetermined point associated with predetermined content is acquired;
recording control processing for recording, when the biometric authentication is successful, the degree of smile of the subject calculated from the captured image in association with content information related to content associated with the subject's attribute information and a shooting location;
An output process for outputting information according to the degree of smile of the subject to the information terminal used by the subject whose biometric authentication has succeeded;
and an evaluation process of aggregating smile levels based on subject's attribute information and content information and evaluating content satisfaction based on the aggregated smile levels.
　本開示により、イベント会場におけるコンテンツの満足度を好適に分析する情報処理装置、情報処理システム、情報処理方法及び非一時的なコンピュータ可読媒体を提供できる。 According to the present disclosure, it is possible to provide an information processing device, an information processing system, an information processing method, and a non-transitory computer-readable medium that suitably analyze satisfaction with content at an event venue.
実施形態1にかかる情報処理装置の構成を示すブロック図である。 FIG. 1 is a block diagram showing the configuration of an information processing apparatus according to the first embodiment;
実施形態1にかかる情報処理方法の流れを示す図である。 FIG. 2 is a diagram showing the flow of an information processing method according to the first embodiment;
実施形態2にかかる情報処理システムの全体構成を示すブロック図である。 FIG. 3 is a block diagram showing the overall configuration of an information processing system according to the second embodiment;
実施形態2にかかる認証装置の構成を示すブロック図である。 FIG. 4 is a block diagram showing the configuration of an authentication device according to the second embodiment;
実施形態2にかかる顔情報登録処理の流れを示すフローチャートである。 FIG. 5 is a flowchart showing the flow of face information registration processing according to the second embodiment;
実施形態2にかかる顔認証処理の流れを示すフローチャートである。 FIG. 6 is a flowchart showing the flow of face authentication processing according to the second embodiment;
実施形態2にかかる顔認証端末の構成を示すブロック図である。 FIG. 7 is a block diagram showing the configuration of a face authentication terminal according to the second embodiment;
実施形態2にかかるユーザ端末の構成を示すブロック図である。 FIG. 8 is a block diagram showing the configuration of a user terminal according to the second embodiment;
実施形態2にかかる情報処理装置の構成を示すブロック図である。 FIG. 9 is a block diagram showing the configuration of an information processing apparatus according to the second embodiment;
実施形態2にかかる集約情報のデータ構造の一例を示す図である。 FIG. 10 is a diagram showing an example of the data structure of aggregated information according to the second embodiment;
実施形態2にかかる利用登録処理の流れを示すフローチャートである。 FIG. 11 is a flowchart showing the flow of usage registration processing according to the second embodiment;
実施形態2にかかる画像出力処理の流れを示すフローチャートである。 FIG. 12 is a flowchart showing the flow of image output processing according to the second embodiment;
実施形態2にかかる評価処理の流れを示すフローチャートである。 FIG. 13 is a flowchart showing the flow of evaluation processing according to the second embodiment;
実施形態2にかかる利用登録処理の流れを示すシーケンス図である。 FIG. 14 is a sequence diagram showing the flow of usage registration processing according to the second embodiment;
実施形態2にかかる画像出力処理の流れを示すシーケンス図である。 FIG. 15 is a sequence diagram showing the flow of image output processing according to the second embodiment;
実施形態2にかかる画像出力処理の流れを示すシーケンス図である。 FIG. 16 is a sequence diagram showing the flow of image output processing according to the second embodiment;
実施形態2にかかる合成画像の一例を示す図である。 FIG. 17 is a diagram showing an example of a composite image according to the second embodiment;
実施形態2にかかる端末画面の一例を示す図である。 FIG. 18 is a diagram showing an example of a terminal screen according to the second embodiment;
実施形態2にかかる端末画面の一例を示す図である。 FIG. 19 is a diagram showing an example of a terminal screen according to the second embodiment;
実施形態2にかかる全体合成画像の一例を示す図である。 FIG. 20 is a diagram showing an example of an overall composite image according to the second embodiment.
 以下では、本開示の実施形態について、図面を参照しながら詳細に説明する。各図面において、同一又は対応する要素には同一の符号が付されており、説明の明確化のため、必要に応じて重複説明は省略される。 Below, embodiments of the present disclosure will be described in detail with reference to the drawings. In each drawing, the same reference numerals are given to the same or corresponding elements, and redundant description will be omitted as necessary for clarity of description.
 <実施形態1>
 まず、本開示の実施形態1について説明する。図1は、実施形態1にかかる情報処理装置10の構成を示すブロック図である。情報処理装置10は、イベント会場で提供されるコンテンツの満足度を分析する情報処理装置である。ここで、情報処理装置10は、ネットワーク(不図示)に接続される。ネットワークは、有線であっても無線であってもよい。また、当該ネットワークには、イベント会場の来場者である対象者を撮影し、撮影画像を生成する情報端末(不図示)と接続されている。つまり、情報処理装置10は、ネットワークを介して情報端末と通信可能に接続される。ここで、情報端末は、所定コンテンツと関連付けられる所定地点に設置される。撮影画像は、少なくとも対象者の顔領域を含む。情報端末は、顔認証端末、カメラ付きのデジタルサイネージ等であって良い。
<Embodiment 1>
First, Embodiment 1 of the present disclosure will be described. FIG. 1 is a block diagram showing the configuration of an information processing apparatus 10 according to the first embodiment. The information processing apparatus 10 analyzes satisfaction with content provided at an event venue. Here, the information processing apparatus 10 is connected to a network (not shown). The network may be wired or wireless. The network is also connected to an information terminal (not shown) that photographs a target person, who is a visitor at the event venue, and generates a captured image. In other words, the information processing apparatus 10 is communicably connected to the information terminal via the network. Here, the information terminal is installed at a predetermined point associated with predetermined content. The captured image includes at least the target person's face area. The information terminal may be a face authentication terminal, a digital signage with a camera, or the like.
 情報処理装置10は、登録部11と、認証制御部12と、記録制御部14と、出力部16と、評価部17とを備える。 The information processing device 10 includes a registration unit 11 , an authentication control unit 12 , a recording control unit 14 , an output unit 16 and an evaluation unit 17 .
　登録部11は、登録手段とも呼ばれる。登録部11は、対象者の利用登録要求を取得した場合、その対象者の属性情報を、その対象者の生体情報に対応付けて登録する。ここで、対象者の属性情報は、対象者に関連する情報である。例えば対象者の属性情報は、対象者の年齢、性別、職業、家族構成、趣味、同伴者情報、及び来場手段のうち少なくとも1つを含んでよい。同伴者情報は、同伴者の有無や、同伴者がいる場合は、対象者及び同伴者を含む集団の属性（集団属性）を示す。一例として、集団属性は、家族（親子連れ）、カップル、又は友人等である。尚、同伴者情報は、同伴者の識別情報（ID）を含んでもよい。来場手段は、自動車又は電車等である。生体情報は、顔、指紋、虹彩若しくは静脈の特徴情報、又はその他の生体情報である。 The registration unit 11 is also called registration means. When the registration unit 11 acquires a use registration request for a target person, it registers the target person's attribute information in association with the target person's biometric information. Here, the target person's attribute information is information related to the target person. For example, the attribute information may include at least one of the target person's age, gender, occupation, family structure, hobbies, companion information, and means of visit. The companion information indicates the presence or absence of a companion and, if there is a companion, the attributes of the group including the target person and the companion (group attribute). As an example, the group attribute is a family (e.g., parents with children), a couple, or friends. The companion information may include identification information (ID) of the companion. The means of visit is, for example, an automobile or a train. The biometric information is face, fingerprint, iris, or vein feature information, or other biometric information.
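As one way to picture the registration performed by the registration unit 11, the attribute information can be modeled as a record keyed by the subject's biometric information. This is a minimal sketch; the field names, the opaque biometric identifier, and the in-memory store are illustrative assumptions, not part of the disclosure.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class AttributeInfo:
    """Attribute information of a subject (illustrative field set)."""
    age: Optional[int] = None
    gender: Optional[str] = None
    occupation: Optional[str] = None
    group_attribute: Optional[str] = None   # e.g. "family", "couple", "friends"
    companion_ids: List[str] = field(default_factory=list)
    means_of_visit: Optional[str] = None    # e.g. "car", "train"

# Registration unit 11: attribute information keyed by the subject's
# biometric information (represented here by an opaque identifier).
registry = {}

def register(biometric_id, attrs):
    registry[biometric_id] = attrs

register("bio-001", AttributeInfo(age=34, group_attribute="family",
                                  means_of_visit="car"))
assert registry["bio-001"].group_attribute == "family"
```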
 認証制御部12は、認証制御手段とも呼ばれる。認証制御部12は、上述の所定地点において対象者を撮影した撮影画像を、上述の情報端末から取得した場合、その撮影画像を用いて生体認証を行わせる。 The authentication control unit 12 is also called authentication control means. When the authentication control unit 12 acquires a photographed image of the target person photographed at the above-described predetermined point from the above-described information terminal, the biometric authentication is performed using the photographed image.
 記録制御部14は、記録制御手段とも呼ばれる。記録制御部14は、生体認証が成功した場合、対象者の笑顔度を、対象者の属性情報及び撮影地点に関連付けられるコンテンツに関するコンテンツ情報に対応付けて記録する。尚、笑顔度は、撮影画像から算出される。 The recording control unit 14 is also called recording control means. When the biometric authentication is successful, the recording control unit 14 records the subject's smile degree in association with the subject's attribute information and the content information related to the content associated with the shooting location. Incidentally, the degree of smile is calculated from the photographed image.
 出力部16は、出力手段とも呼ばれる。出力部16は、生体認証が成功した対象者が使用する情報端末に、その対象者の笑顔度に応じた情報を出力する。対象者が使用する情報端末は、上述の所定地点に設置され、対象者を撮影した情報端末であってもよいし、対象者が所有する対象者端末であってもよい。 The output unit 16 is also called output means. The output unit 16 outputs information corresponding to the degree of smile of the subject to the information terminal used by the subject whose biometric authentication has succeeded. The information terminal used by the target person may be an information terminal that is installed at the above-mentioned predetermined spot and takes an image of the target person, or may be a target person terminal owned by the target person.
 評価部17は、評価手段とも呼ばれる。評価部17は、対象者の属性情報及びコンテンツ情報に基づいて笑顔度を集約する。そして評価部17は、集約した笑顔度に基づいてコンテンツの満足度を評価する。 The evaluation unit 17 is also called evaluation means. The evaluation unit 17 summarizes the degree of smile based on the subject's attribute information and content information. Then, the evaluation unit 17 evaluates the degree of satisfaction with the content based on the aggregated degree of smile.
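A minimal sketch of the aggregation performed by the evaluation unit 17, assuming a 0–100 smile-degree scale and the mean as the satisfaction measure (both are illustrative assumptions; the disclosure does not fix a scale or a statistic):

```python
from collections import defaultdict
from statistics import mean

# Recorded entries: (attribute bucket, content ID, smile degree 0-100).
records = [
    ("family", "lion_area", 80), ("family", "lion_area", 60),
    ("couple", "lion_area", 90), ("family", "penguin_area", 40),
]

# Aggregate smile degrees per (attribute, content) pair.
buckets = defaultdict(list)
for attr, content, smile in records:
    buckets[(attr, content)].append(smile)

# Evaluate satisfaction as the mean of the aggregated smile degrees.
satisfaction = {key: mean(vals) for key, vals in buckets.items()}
assert satisfaction[("family", "lion_area")] == 70
```

Grouping by attribute as well as by content is what enables the per-segment analysis (e.g. families vs. couples) described above.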
　図2は、実施形態1にかかる情報処理方法の流れを示す図である。まず、登録部11は、対象者からサービスの利用登録を受け付ける（S10）。そして、登録部11は、対象者の属性情報を、その対象者の生体情報に対応付けて登録する（S11）。続いて、認証制御部12は、所定地点において情報端末が対象者を撮影した撮影画像を、上記情報端末から取得する（S12）。そして、認証制御部12は、撮影画像を用いて生体認証を行わせ（S13）、生体認証に失敗すれば（S14でNo）、処理を終了する。一方、生体認証に成功すれば（S14でYes）、記録制御部14は、撮影画像から算出される対象者の笑顔度を、対象者の属性情報及び撮影地点に関連付けられるコンテンツのコンテンツ情報に対応付けて記録する（S15）。続いて出力部16は、笑顔度に応じた情報を取得し、対象者が使用する情報端末に出力する（S16）。そして、評価部17は、笑顔度を、対応付けられたコンテンツ情報及び対象者の属性情報に基づいて集約し、集約した笑顔度に基づいて、コンテンツの満足度を評価する（S17）。そして評価部17は、処理を終了する。 FIG. 2 is a diagram showing the flow of the information processing method according to the first embodiment. First, the registration unit 11 receives a service use registration from a target person (S10). The registration unit 11 then registers the target person's attribute information in association with the target person's biometric information (S11). Subsequently, the authentication control unit 12 acquires, from the information terminal, a captured image of the target person photographed by the information terminal at a predetermined point (S12). The authentication control unit 12 then causes biometric authentication to be performed using the captured image (S13); if the biometric authentication fails (No in S14), the process ends. On the other hand, if the biometric authentication succeeds (Yes in S14), the recording control unit 14 records the target person's degree of smile calculated from the captured image in association with the target person's attribute information and the content information of the content associated with the shooting point (S15). Subsequently, the output unit 16 acquires information according to the degree of smile and outputs it to the information terminal used by the target person (S16). The evaluation unit 17 then aggregates the degrees of smile based on the associated content information and the target person's attribute information, and evaluates satisfaction with the content based on the aggregated degrees of smile (S17). The evaluation unit 17 then ends the process.
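The branch structure of steps S12–S16 can be sketched as follows. The biometric matching and smile calculation here are trivial stand-ins for the components described above, not the actual algorithms; all names are illustrative.

```python
# Minimal stand-ins for the components in FIG. 2 (all names are assumptions).
registered = {"face-abc": {"age": 30}}   # S11: attributes keyed by biometric ID
records = []                             # (user, content, smile) triples

def biometric_auth(image):
    # S13: trivially match on a tag carried by the test image.
    face = image.get("face")
    return face if face in registered else None

def calc_smile_degree(image):
    # Degree of smile calculated from the captured image (stubbed).
    return image.get("smile", 0)

def process(image, content):
    user = biometric_auth(image)                                # S13
    if user is None:                                            # S14: No -> end
        return False
    records.append((user, content, calc_smile_degree(image)))   # S15
    # S16: output information according to the smile degree (omitted here).
    return True

assert process({"face": "face-abc", "smile": 75}, "lion_area") is True
assert process({"face": "unknown"}, "lion_area") is False
assert records == [("face-abc", "lion_area", 75)]
```

Step S17 (aggregation and evaluation) would then run over the accumulated `records`.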
　このように、実施形態1にかかる情報処理装置10は、対象者の属性情報に基づいて笑顔度を集約し、コンテンツの満足度の詳細分析を行う。したがってイベント会場の運営者がコンテンツの満足度を向上させる措置をとることが容易となる。また、対象者に対しては、特典として笑顔度に応じた情報を取得できるため、自己の属性情報の提供や利用許諾のインセンティブが働きやすい。これにより、情報処理装置10は、対象者の属性情報を容易に取得でき、詳細分析が容易となる。したがって、イベント会場におけるコンテンツの満足度をさらに好適に分析できる。 In this way, the information processing apparatus 10 according to the first embodiment aggregates degrees of smile based on the target person's attribute information and performs a detailed analysis of satisfaction with the content. This makes it easy for the operator of the event venue to take measures to improve satisfaction with the content. In addition, since the target person can obtain information according to the degree of smile as a privilege, there is a strong incentive to provide his/her own attribute information and to consent to its use. As a result, the information processing apparatus 10 can easily acquire the target person's attribute information, which facilitates detailed analysis. Therefore, satisfaction with content at the event venue can be analyzed even more suitably.
 尚、情報処理装置10は、図示しない構成としてプロセッサ、メモリ及び記憶装置を備えるものである。また、当該記憶装置には、本実施形態にかかる情報処理方法の処理が実装されたコンピュータプログラムが記憶されている。そして、当該プロセッサは、記憶装置からコンピュータプログラムを前記メモリへ読み込ませ、当該コンピュータプログラムを実行する。これにより、前記プロセッサは、登録部11、認証制御部12、記録制御部14、出力部16及び評価部17の機能を実現する。 The information processing apparatus 10 includes a processor, a memory, and a storage device (not shown). Further, the storage device stores a computer program in which the processing of the information processing method according to the present embodiment is implemented. Then, the processor loads the computer program from the storage device into the memory and executes the computer program. Thereby, the processor implements the functions of the registration unit 11 , the authentication control unit 12 , the recording control unit 14 , the output unit 16 and the evaluation unit 17 .
 または、登録部11、認証制御部12、記録制御部14、出力部16及び評価部17は、それぞれが専用のハードウェアで実現されていてもよい。また、各装置の各構成要素の一部又は全部は、汎用または専用の回路(circuitry)、プロセッサ等やこれらの組合せによって実現されもよい。これらは、単一のチップによって構成されてもよいし、バスを介して接続される複数のチップによって構成されてもよい。各装置の各構成要素の一部又は全部は、上述した回路等とプログラムとの組合せによって実現されてもよい。また、プロセッサとして、CPU(Central Processing Unit)、GPU(Graphics Processing Unit)、FPGA(field-programmable gate array)等を用いることができる。 Alternatively, the registration unit 11, the authentication control unit 12, the recording control unit 14, the output unit 16, and the evaluation unit 17 may each be realized by dedicated hardware. Also, part or all of each component of each device may be realized by general-purpose or dedicated circuitry, processors, etc., or combinations thereof. These may be composed of a single chip, or may be composed of multiple chips connected via a bus. A part or all of each component of each device may be implemented by a combination of the above-described circuits and the like and programs. Moreover, CPU (Central Processing Unit), GPU (Graphics Processing Unit), FPGA (field-programmable gate array), etc. can be used as a processor.
　また、情報処理装置10の各構成要素の一部又は全部が複数の情報処理装置や回路等により実現される場合には、複数の情報処理装置や回路等は、集中配置されてもよいし、分散配置されてもよい。例えば、情報処理装置や回路等は、クライアントサーバシステム、クラウドコンピューティングシステム等、各々が通信ネットワークを介して接続される形態として実現されてもよい。また、情報処理装置10の機能がSaaS（Software as a Service）形式で提供されてもよい。 Further, when part or all of the components of the information processing apparatus 10 are realized by a plurality of information processing devices, circuits, or the like, the plurality of information processing devices, circuits, or the like may be arranged in a centralized or distributed manner. For example, the information processing devices, circuits, and the like may be realized in a form in which they are connected to one another via a communication network, such as a client-server system or a cloud computing system. The functions of the information processing apparatus 10 may also be provided in a SaaS (Software as a Service) format.
 <実施形態2>
 次に、本開示の実施形態2について説明する。図3は、実施形態2にかかる情報処理システム1000の全体構成を示すブロック図である。情報処理システム1000は、イベント会場において、来場者である対象者(ユーザU)に対してサービスを提供するとともに、ユーザUの撮影画像を収集し、イベント会場で提供される各コンテンツの満足度を分析するコンピュータシステムである。イベント会場は、動物園又は遊園地である。例えばイベント会場が動物園である場合、コンテンツは、各動物を見学することや動物自体を指してよい。例えばイベント会場が遊園地である場合、コンテンツは各アトラクションを体験することやアトラクション自体を指してよい。また上述のサービスは、ユーザUの撮影画像に応じた特典情報を、ユーザUに提供するものである。
<Embodiment 2>
Next, Embodiment 2 of the present disclosure will be described. FIG. 3 is a block diagram showing the overall configuration of an information processing system 1000 according to the second embodiment. The information processing system 1000 is a computer system that provides a service to a target person (user U) who is a visitor at an event venue, collects captured images of the user U, and analyzes satisfaction with each piece of content provided at the event venue. The event venue is a zoo or an amusement park. For example, if the event venue is a zoo, the content may refer to viewing each animal or to the animals themselves. If the event venue is an amusement park, the content may refer to experiencing each attraction or to the attractions themselves. The above-mentioned service provides the user U with privilege information according to a captured image of the user U.
 情報処理システム1000は、認証装置100、情報処理装置200、画像保存サーバ300、顔認証端末400-1~400-n(nは2以上の自然数。)、ユーザ端末500を備える。認証装置100、情報処理装置200、画像保存サーバ300、顔認証端末400-1~400-n及びユーザ端末500のそれぞれは、ネットワークNを介して接続されている。ここで、ネットワークNは、有線又は無線の通信回線である。 The information processing system 1000 includes an authentication device 100, an information processing device 200, an image storage server 300, face authentication terminals 400-1 to 400-n (n is a natural number of 2 or more), and a user terminal 500. Authentication device 100, information processing device 200, image storage server 300, face authentication terminals 400-1 to 400-n, and user terminal 500 are connected via network N, respectively. Here, the network N is a wired or wireless communication line.
　また、顔認証端末400-1、400-2、400-3、・・・400-nのそれぞれは、地点A1、A2、A3、・・・Anに設置されている。ここで、地点A1~Anは、あるイベント会場Aにおける異なるスポットであるとする。例えば地点A1は、動物園の入り口であり、地点A2~Anは、動物の見学スポットであってよい。尚、地点A2~Anにおいては、顔認証端末400が、ユーザUと、関連するコンテンツに特有の風景又は動物等とを一緒に撮影できるように、構成されていてよい。以下では、顔認証端末400-1、400-2、400-3、・・・400-nを区別せずに言及する場合、単に顔認証端末400と呼ぶことがある。 Further, the face authentication terminals 400-1, 400-2, 400-3, ... 400-n are installed at points A1, A2, A3, ... An, respectively. Here, points A1 to An are assumed to be different spots in a certain event venue A. For example, point A1 may be the entrance to a zoo, and points A2 to An may be animal viewing spots. At points A2 to An, the face authentication terminal 400 may be configured so that it can photograph the user U together with scenery, animals, or the like peculiar to the related content. Hereinafter, the face authentication terminals 400-1, 400-2, 400-3, ... 400-n may be referred to simply as the face authentication terminal 400 when they need not be distinguished.
　ユーザUは、地点A1においてサービスの利用登録をし、自身の顔情報を登録する。利用登録では、ユーザUの属性情報が登録される。尚、利用登録及び顔情報の登録は、ユーザUが希望する任意の顔認証端末400又はユーザ端末500で行ってよい。そしてユーザUは、1又は複数の地点を訪問する。各地点A1~Anにおいて、顔認証端末400は、訪問したユーザUを撮影する。撮影画像を用いた顔認証が成功した場合、ユーザUは、笑顔度に応じた情報を所定の態様で取得できる。笑顔度は、顔認証に成功した撮影画像に基づいて生成される。以下では、笑顔度に応じた情報は、笑顔度に応じた画像であるとして説明する。笑顔度に応じた画像は、撮影画像にユーザUの笑顔度を重畳した合成画像であってよい。また、笑顔度に応じた画像は、ユーザUが満足している様子や楽しそうな様子で写っている撮影画像に対応していてよい。例えば、笑顔度に応じた画像は、ユーザUの笑顔度が所定値以上の撮影画像であってもよいし、ユーザUの笑顔度が所定値以上の撮影画像に、ユーザUの笑顔度を重畳した合成画像であってもよい。尚、笑顔度に応じた情報は、上述の画像に加えて又は代えて、コンテンツに関連する情報、例えば販促情報であってもよい。この場合、上記コンテンツは、笑顔度が所定値以上の撮影画像の撮影地点に関連するコンテンツであってよい。 User U registers for use of the service at point A1 and registers his/her own face information. In the use registration, user U's attribute information is registered. The use registration and the face information registration may be performed at any face authentication terminal 400 or user terminal 500 that the user U desires. The user U then visits one or more points. At each of the points A1 to An, the face authentication terminal 400 photographs the visiting user U. When face authentication using the captured image succeeds, the user U can obtain information according to the degree of smile in a predetermined manner. The degree of smile is generated based on a captured image for which face authentication has succeeded. In the following description, the information according to the degree of smile is assumed to be an image according to the degree of smile. The image according to the degree of smile may be a composite image in which the user U's degree of smile is superimposed on the captured image. The image according to the degree of smile may also correspond to a captured image in which the user U appears satisfied or happy. For example, the image according to the degree of smile may be a captured image in which the user U's degree of smile is equal to or greater than a predetermined value, or may be a composite image in which the user U's degree of smile is superimposed on such a captured image. In addition to or instead of the above-described image, the information according to the degree of smile may be information related to content, for example sales promotion information. In this case, the content may be content related to the shooting point of a captured image whose degree of smile is equal to or greater than the predetermined value.
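Selecting the captured images "in which the degree of smile is equal to or greater than a predetermined value" amounts to a threshold filter. In this minimal sketch, the threshold value and record fields are illustrative assumptions:

```python
THRESHOLD = 70  # assumed "predetermined value" for the degree of smile

# Captured-image records for one user (fields are assumptions).
shots = [
    {"point": "A2", "smile": 85, "path": "img_001.jpg"},
    {"point": "A3", "smile": 40, "path": "img_002.jpg"},
]

# Keep only the shots in which the user appears to be enjoying the
# content, i.e. the degree of smile is at or above the threshold.
candidates = [s for s in shots if s["smile"] >= THRESHOLD]
assert [s["path"] for s in candidates] == ["img_001.jpg"]
```

The `point` field of each retained shot would then identify the content to which related information (e.g. sales promotion information) could be attached.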
　ここで、認証装置100は、複数の人物の顔特徴情報を記憶する情報処理装置である。また、認証装置100は、外部から受信した顔認証要求に応じて、当該要求に含まれる顔画像又は顔特徴情報について、各ユーザの顔特徴情報と照合を行い、照合結果（認証結果）を要求元へ返信する。 Here, the authentication device 100 is an information processing device that stores facial feature information of a plurality of persons. In response to a face authentication request received from the outside, the authentication device 100 collates the face image or facial feature information included in the request with the facial feature information of each user, and returns the matching result (authentication result) to the requester.
　図4は、実施形態2にかかる認証装置100の構成を示すブロック図である。認証装置100は、顔情報DB（DataBase）110と、顔検出部120と、特徴点抽出部130と、登録部140と、認証部150とを備える。顔情報DB110は、ユーザID111と当該ユーザIDの顔特徴情報112とを対応付けて記憶する。顔特徴情報112は、顔画像から抽出された特徴点の集合である。尚、認証装置100は、顔特徴情報112の登録ユーザからの要望に応じて、顔情報DB110内の顔特徴情報112を削除してもよい。または、認証装置100は、顔特徴情報112の登録から一定期間経過後に削除してもよい。 FIG. 4 is a block diagram showing the configuration of the authentication device 100 according to the second embodiment. The authentication device 100 includes a face information DB (DataBase) 110, a face detection unit 120, a feature point extraction unit 130, a registration unit 140, and an authentication unit 150. The face information DB 110 stores a user ID 111 and facial feature information 112 of the user ID in association with each other. The facial feature information 112 is a set of feature points extracted from a face image. Note that the authentication device 100 may delete the facial feature information 112 in the face information DB 110 at the request of the registered user of the facial feature information 112. Alternatively, the authentication device 100 may delete the facial feature information 112 after a certain period of time has elapsed since its registration.
 顔検出部120は、顔情報を登録するための登録画像に含まれる顔領域を検出し、特徴点抽出部130に出力する。特徴点抽出部130は、顔検出部120が検出した顔領域から特徴点を抽出し、登録部140に顔特徴情報を出力する。また、特徴点抽出部130は、情報処理装置200から受信した顔画像に含まれる特徴点を抽出し、認証部150に顔特徴情報を出力する。 The face detection unit 120 detects a face area included in a registration image for registering face information and outputs it to the feature point extraction unit 130 . Feature point extraction section 130 extracts feature points from the face area detected by face detection section 120 and outputs facial feature information to registration section 140 . Further, feature point extraction section 130 extracts feature points included in the facial image received from information processing apparatus 200 and outputs facial feature information to authentication section 150 .
 登録部140は、顔特徴情報の登録に際して、ユーザID111を新規に発行する。登録部140は、発行したユーザID111と、登録画像から抽出した顔特徴情報112とを対応付けて顔情報DB110へ登録する。認証部150は、顔特徴情報112を用いた顔認証を行う。具体的には、認証部150は、顔画像から抽出された顔特徴情報と、顔情報DB110内の顔特徴情報112との照合を行う。認証部150は、顔特徴情報の一致の有無を情報処理装置200に返信する。顔特徴情報の一致の有無は、認証の成否に対応する。尚、顔特徴情報が一致する(一致有)とは、一致度が所定値以上である場合をいうものとする。 The registration unit 140 newly issues a user ID 111 when registering facial feature information. The registration unit 140 associates the issued user ID 111 with the facial feature information 112 extracted from the registered image and registers them in the facial information DB 110 . The authentication unit 150 performs face authentication using the facial feature information 112 . Specifically, the authentication unit 150 collates the facial feature information extracted from the facial image with the facial feature information 112 in the facial information DB 110 . The authentication unit 150 replies to the information processing apparatus 200 whether or not the facial feature information matches. Whether the facial feature information matches or not corresponds to the success or failure of the authentication. Note that matching of facial feature information (matching) means a case where the degree of matching is equal to or greater than a predetermined value.
 図5は、実施形態2にかかる顔情報登録処理の流れを示すフローチャートである。まず、認証装置100は、顔情報登録要求に含まれる登録画像を取得する(S21)。例えば、認証装置100は、顔情報登録要求を、顔認証端末400又はユーザ端末500等からネットワークNを介して受け付ける。尚、顔情報登録要求元は、これに限らず、顔認証端末400又はユーザ端末500から利用登録要求を受けた情報処理装置200であってもよい。次に、顔検出部120は、登録画像に含まれる顔領域を検出する(S22)。次に、特徴点抽出部130は、ステップS22で検出した顔領域から特徴点を抽出し、登録部140に顔特徴情報を出力する(S23)。最後に、登録部140は、ユーザID111を発行し、当該ユーザID111と顔特徴情報112とを対応付けて顔情報DB110に登録する(S24)。なお、認証装置100は、顔情報登録要求元から顔特徴情報112を受信し、ユーザID111と対応付けて顔情報DB110に登録してもよい。 FIG. 5 is a flowchart showing the flow of face information registration processing according to the second embodiment. First, the authentication device 100 acquires the registration image included in the face information registration request (S21). For example, the authentication device 100 receives a face information registration request from the face authentication terminal 400, the user terminal 500, or the like via the network N. FIG. Note that the face information registration request source is not limited to this, and may be the information processing apparatus 200 that receives the use registration request from the face authentication terminal 400 or the user terminal 500 . Next, face detection section 120 detects a face area included in the registered image (S22). Next, the feature point extraction unit 130 extracts feature points from the face area detected in step S22, and outputs face feature information to the registration unit 140 (S23). Finally, the registration unit 140 issues the user ID 111, associates the user ID 111 with the facial feature information 112, and registers them in the facial information DB 110 (S24). Note that the authentication device 100 may receive the facial characteristic information 112 from the face information registration requester, associate it with the user ID 111 , and register it in the facial information DB 110 .
 図6は、実施形態2にかかる認証装置100による顔認証処理の流れを示すフローチャートである。まず、特徴点抽出部130は、認証用の顔特徴情報を取得する(S31)。例えば、認証装置100は、情報処理装置200からネットワークNを介して顔認証要求を受信し、顔認証要求に含まれる顔画像からステップS21からS23のように顔特徴情報を抽出する。または、認証装置100は、情報処理装置200から顔特徴情報を受信してもよい。次に、認証部150は、取得した顔特徴情報を、顔情報DB110の顔特徴情報112と照合する(S32)。顔特徴情報が一致した場合、つまり、顔特徴情報の一致度が所定値以上である場合(S33でYes)、認証部150は、顔特徴情報が一致したユーザのユーザID111を特定し(S34)、顔認証に成功した旨と特定したユーザID111とを情報処理装置200に返信する(S35)。一致する顔特徴情報が存在しない場合(S33でNo)、認証部150は、顔認証に失敗した旨を情報処理装置200に返信する(S36)。 FIG. 6 is a flow chart showing the flow of face authentication processing by the authentication device 100 according to the second embodiment. First, the feature point extraction unit 130 acquires facial feature information for authentication (S31). For example, the authentication device 100 receives a face authentication request from the information processing device 200 via the network N, and extracts facial feature information from the face image included in the face authentication request in steps S21 to S23. Alternatively, authentication device 100 may receive facial feature information from information processing device 200 . Next, the authentication unit 150 collates the acquired facial feature information with the facial feature information 112 of the facial information DB 110 (S32). If the facial feature information matches, that is, if the degree of matching of the facial feature information is equal to or greater than a predetermined value (Yes in S33), the authentication unit 150 specifies the user ID 111 of the user whose facial feature information matches (S34). , the fact that the face authentication was successful and the specified user ID 111 are returned to the information processing apparatus 200 (S35). If there is no matching facial feature information (No in S33), the authentication unit 150 returns to the information processing apparatus 200 that the face authentication has failed (S36).
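Steps S32–S34 amount to a best-match search over the stored feature information with a threshold on the degree of match. The sketch below uses cosine similarity as the matching score; the disclosure does not specify the similarity measure or the threshold value, so both are illustrative assumptions.

```python
import math

def cosine(a, b):
    """Cosine similarity between two feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

THRESHOLD = 0.8  # assumed "predetermined value" for the degree of match

# Face information DB 110: user ID 111 -> facial feature information 112.
face_db = {"user-1": [0.9, 0.1, 0.3], "user-2": [0.1, 0.8, 0.5]}

def authenticate(query):
    # S32-S34: return the user ID whose stored features match the query
    # with a degree of match at or above the threshold, if any.
    best_id, best = None, THRESHOLD
    for uid, feat in face_db.items():
        score = cosine(query, feat)
        if score >= best:
            best_id, best = uid, score
    return best_id  # None corresponds to S36 (authentication failed)

assert authenticate([0.88, 0.12, 0.31]) == "user-1"
assert authenticate([0.0, 0.0, 1.0]) is None
```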
 尚、ステップS32において、認証部150は、顔情報DB110内の全ての顔特徴情報112との照合を試みる必要はない。例えば、認証部150は、顔認証要求を受け付けた当日から数日前までの期間に登録が行われた顔特徴情報と優先的に照合を試みるとよい。これにより、照合速度が向上し得る。また、上記優先的な照合に失敗した場合、残り全ての顔特徴情報と照合を行うようにするとよい。 It should be noted that in step S32, the authentication unit 150 does not need to attempt matching with all the facial feature information 112 in the facial information DB 110. For example, the authentication unit 150 may preferentially attempt matching with facial feature information registered during the period from the day the face authentication request is received until several days before. This can improve matching speed. Also, if the preferential collation fails, it is preferable to collate with all the remaining facial feature information.
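The preferential collation described here can be pictured as a two-pass search: first over entries registered within the last few days, then over the remainder. The window length and the matching predicate below are illustrative assumptions.

```python
from datetime import date

# Entries in the face information DB: (user ID, features, registration date).
entries = [
    ("u-old", [1.0, 0.0], date(2021, 1, 5)),
    ("u-new", [0.0, 1.0], date(2021, 6, 1)),
]

def match(query, today, window_days=3, matches=lambda q, f: q == f):
    # Preferential pass over recently registered entries, then the rest.
    recent = [e for e in entries if (today - e[2]).days <= window_days]
    rest = [e for e in entries if e not in recent]
    for uid, feat, _ in recent + rest:
        if matches(query, feat):
            return uid
    return None

assert match([0.0, 1.0], date(2021, 6, 2)) == "u-new"   # hit in the fast pass
assert match([1.0, 0.0], date(2021, 6, 2)) == "u-old"   # falls back to older entries
```

When most visitors registered on the day of their visit, the first pass resolves almost all requests, which is the speed-up the text describes.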
　図3に戻り説明を続ける。顔認証端末400-1、400-2、・・・400-nのそれぞれは、カメラと表示装置を含む情報端末である。 Returning to FIG. 3, the description will be continued. Each of the face authentication terminals 400-1, 400-2, ... 400-n is an information terminal including a camera and a display device.
 顔認証端末400は、ユーザUの利用登録要求を、ネットワークNを介して情報処理装置200に送信する。また顔認証端末400は、利用登録時にユーザUの顔認証に用いる登録画像を撮影する。顔認証端末400は、登録画像を含めた顔情報登録要求をネットワークNを介して認証装置100へ送信する。顔認証端末400は、登録画像を含めた顔情報登録要求を、情報処理装置200を介して認証装置100に送信してもよい。尚、ユーザUは、ユーザ端末500を用いて利用登録や顔情報登録を行っても良い。 The face authentication terminal 400 transmits a usage registration request for the user U to the information processing device 200 via the network N. The face authentication terminal 400 also captures a registration image used for face authentication of the user U at the time of use registration. The face authentication terminal 400 transmits a face information registration request including the registered image to the authentication device 100 via the network N. FIG. The face authentication terminal 400 may transmit a face information registration request including the registered image to the authentication device 100 via the information processing device 200 . Note that the user U may use the user terminal 500 to perform usage registration and face information registration.
 顔認証端末400は、ユーザUの顔認証に用いる認証用の顔画像を撮影する。例えば、顔認証端末400は、設置された各地点においてユーザUを撮影した撮影画像を認証用の画像とする。顔認証端末400は、認証用の画像を含めた顔認証要求をネットワークNを介して情報処理装置200へ送信する。このとき、顔認証端末400は、自身が設置された地点を識別する地点IDを顔認証要求に含めてよい。また、顔認証端末400は、撮影時刻を顔認証要求に含めてもよい。また、顔認証端末400は、情報処理装置200からネットワークNを介して顔認証結果及び笑顔度の情報を受信し、必要に応じてこれらの情報を画面に表示する。また顔認証端末400は、ユーザUから撮影の終了を希望する旨を示す撮影終了情報の入力を受け付けた場合、撮影終了情報を含む撮影終了要求を、情報処理装置200へ送信する。 The face authentication terminal 400 captures a face image for authentication used for user U's face authentication. For example, the face authentication terminal 400 uses captured images of the user U at each installation location as images for authentication. The face authentication terminal 400 transmits a face authentication request including an image for authentication to the information processing device 200 via the network N. FIG. At this time, the face authentication terminal 400 may include in the face authentication request a location ID that identifies the location where the face authentication terminal 400 is installed. Moreover, the face authentication terminal 400 may include the shooting time in the face authentication request. In addition, the face authentication terminal 400 receives the face authentication result and smile level information from the information processing device 200 via the network N, and displays the information on the screen as necessary. Further, when the face authentication terminal 400 receives input of shooting end information indicating that the user U wishes to end shooting, the face authentication terminal 400 transmits a shooting end request including the shooting end information to the information processing apparatus 200 .
 次に、顔認証端末400について詳細に説明する。図7は、実施形態2にかかる顔認証端末400の構成を示すブロック図である。顔認証端末400は、カメラ410と、記憶部420と、通信部430と、表示部440と、制御部450と、入力部460とを備える。 Next, the face authentication terminal 400 will be explained in detail. FIG. 7 is a block diagram showing the configuration of the face authentication terminal 400 according to the second embodiment. The face authentication terminal 400 includes a camera 410 , a storage section 420 , a communication section 430 , a display section 440 , a control section 450 and an input section 460 .
The camera 410 is an imaging device that captures images under the control of the control unit 450. The storage unit 420 is a storage device that stores the programs for implementing the functions of the face authentication terminal 400. The communication unit 430 is a communication interface with the network N. The display unit 440 is a display device. The input unit 460 is an input device that receives input from the user U. The display unit 440 and the input unit 460 may be configured integrally; as one example, they form a touch panel. The control unit 450 controls the hardware of the face authentication terminal 400, and includes an imaging control unit 451, a registration unit 452, an authentication control unit 453, and a display control unit 454. Note that the authentication control unit 453 is not essential in the face authentication terminal 400-1 installed at the point A1, and the registration unit 452 is not essential in the face authentication terminals 400-2 to 400-n installed at the points A2 to An.
The imaging control unit 451 controls the camera 410 to capture a registration image or an authentication image of the user U. The registration image and the authentication image each include at least the face area of the user. Note that an image captured at the point A2 or the like (an authentication image) may include, in its background, scenery, animals, or the like unique to that point. The imaging control unit 451 outputs the registration image to the registration unit 452, and outputs the authentication image to the authentication control unit 453.
The registration unit 452 transmits a face information registration request including the registration image to the authentication device 100 via the network N. Note that the registration unit 452 may instead transmit the face information registration request to the information processing device 200 via the network N. The registration unit 452 also transmits a usage registration request to the information processing device 200 via the network N. At the time of usage registration, the registration unit 452 may transmit the user attribute information received by the input unit 460 to the information processing device 200 via the network N; the user attribute information may be included in the usage registration request. The authentication control unit 453 transmits a face authentication request including the authentication image to the information processing device 200 via the network N, receives the face authentication result, and outputs it to the display control unit 454. The authentication control unit 453 also outputs an input screen for the shooting-end information to the display control unit 454 to prompt the user U for input, and, upon receiving the shooting-end information from the user U, transmits a shooting-end request to the information processing device 200.
The display control unit 454 displays, on the display unit 440, display content corresponding to the face authentication result and the degree of smile.
Returning to FIG. 3, the description continues. The user terminal 500 is an information terminal carried by the user U, and is, for example, a mobile phone, a smartphone, a tablet terminal, or a PC (Personal Computer) equipped with or connected to a camera. The user terminal 500 is associated with the user ID or the facial feature information of the user U; in other words, the user terminal 500 is a display terminal that the information processing device 200 can identify by the user ID or the facial feature information. For example, the user terminal 500 is a terminal into which the user U has logged in with his or her own user ID.
The user terminal 500 transmits a service usage registration request to the information processing device 200 via the network N. The user terminal 500 also transmits a registration image used for the face authentication of the user U to the authentication device 100 to make a face information registration request. Note that the user terminal 500 may instead transmit the facial feature information extracted from the registration image to the authentication device 100 to make the face information registration request, and may transmit the registration image or the facial feature information to the authentication device 100 via the information processing device 200. In addition, the user terminal 500 acquires images corresponding to the degree of smile from the information processing device 200 via the network N.
Next, the user terminal 500 will be described in detail. FIG. 8 is a block diagram showing the configuration of the user terminal 500 according to the second embodiment. The user terminal 500 includes a camera 510, a storage unit 520, a communication unit 530, a display unit 540, a control unit 550, and an input unit 560.
The camera 510 is an imaging device that captures images under the control of the control unit 550. The storage unit 520 is a storage device that stores the programs for implementing the functions of the user terminal 500. The communication unit 530 is a communication interface with the network N. The display unit 540 is a display device. The input unit 560 is an input device that receives input. The display unit 540 and the input unit 560 may be configured integrally; as one example, they form a touch panel. The control unit 550 controls the hardware of the user terminal 500, and includes an imaging control unit 551, a registration unit 552, an acquisition unit 553, and a display control unit 554.
The imaging control unit 551 controls the camera 510 to capture a registration image of the user U, and outputs the registration image to the registration unit 552.
The registration unit 552 transmits a face information registration request including the registration image to the authentication device 100 via the network N. Note that the registration unit 552 may instead transmit the face information registration request to the information processing device 200 via the network N. The registration unit 552 also transmits a service usage registration request to the information processing device 200 via the network N. At the time of usage registration, the registration unit 552 may transmit the user attribute information received by the input unit 560 to the information processing device 200 via the network N; the user attribute information may be included in the usage registration request. The acquisition unit 553 acquires images corresponding to the degree of smile from the information processing device 200 via the network N, and outputs them to the display control unit 554. The display control unit 554 displays the images corresponding to the degree of smile on the display unit 540.
Returning to FIG. 3, the description continues. The information processing device 200 uses the images of the user U captured at the point A1 and elsewhere to provide the user U with images corresponding to the degree of smile, and analyzes content satisfaction using the collected information. The information processing device 200 may be made redundant across a plurality of servers, and each functional block may be realized by a plurality of computers.
The image storage server 300 is one or more file servers for storing the images, generated by the information processing device 200, that correspond to the degree of smile. Note that the image storage server 300 may also store the captured images used for authentication. Upon receiving an image acquisition request, the image storage server 300 provides the requester with an image corresponding to the degree of smile or a captured image.
Next, the information processing device 200 will be described in detail. FIG. 9 is a block diagram showing the configuration of the information processing device 200 according to the second embodiment. The information processing device 200 includes a storage unit 210, a memory 220, a communication unit 230, and a control unit 240. The storage unit 210 is a storage device such as a hard disk or a flash memory, and stores a program 211, user information 212, history information 213, content information 214, and aggregated information 215. The program 211 is a computer program in which the processing of the information processing method according to the second embodiment is implemented.
The user information 212 is basic information related to a user, namely the user's attribute information and the like. Specifically, the user information 212 associates a user ID 2121 with user attribute information 2122. The user ID 2121 is information identifying the user U, and is the user ID notified when the face information is registered in the authentication device 100. The user attribute information 2122 indicates the attribute information of the user U, and corresponds to the "attribute information of the subject" according to the first embodiment.
The history information 213 is the shooting history of the user U using the face authentication terminals 400 at the respective points. Specifically, the history information 213 associates a user ID 2131, a point ID 2132, a date and time 2133, a smile degree 2134, and access information 2135 with one another. The user ID 2131 is information identifying the user U, and is the user ID included in the face authentication result when the face authentication succeeds. The point ID 2132 is information identifying the point where the face authentication terminal 400 that captured the image for face authentication is installed. The date and time 2133 is the date and time when the image for face authentication was captured, or when the face authentication was performed. The smile degree 2134 is the smile degree calculated based on the image captured for face authentication. The access information 2135 is access information for the storage destination of the captured image or of a composite image generated based on the captured image; it is, for example, link information to web content corresponding to the image, specifically a URL or the like.
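As a minimal sketch, one record of the history information 213 described above could be modeled in Python as follows; the class name, field names, and sample values are illustrative assumptions, since the description does not prescribe a concrete schema.

```python
from dataclasses import dataclass

@dataclass
class HistoryRecord:
    """One record of the history information 213 (illustrative schema)."""
    user_id: str         # user ID 2131: identifies the user U
    point_id: str        # point ID 2132: identifies the shooting point
    shot_at: str         # date and time 2133, e.g. an ISO 8601 string
    smile_degree: float  # smile degree 2134, e.g. in [0.0, 1.0]
    access_info: str     # access information 2135, e.g. a URL of the saved image

record = HistoryRecord(
    user_id="U0001",
    point_id="A2",
    shot_at="2021-12-10T10:30:00",
    smile_degree=0.82,
    access_info="https://example.com/smile/U0001/0001.jpg",
)
```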
The content information 214 associates a point ID 2141, a content ID 2142, and content attribute information 2143 with one another. The content ID 2142 is information identifying the content associated with the point ID 2141; for example, when a content providing place exists within a predetermined distance of the point identified by the point ID 2141, the content ID 2142 is the ID of that content. The content attribute information 2143 is the attribute information of the content identified by the content ID 2142, and may be, for example, the type of the content. Examples of content types include lion/panda, carnivore/herbivore, and indoor/outdoor. Note that the content attribute information 2143 may be configured to include the point ID 2141.
The aggregated information 215 is information obtained by aggregating smile degrees based on the attribute information of the users U and the content attribute information. FIG. 10 is a diagram showing an example of the data structure of the aggregated information 215 according to the second embodiment. In this figure, the aggregated information 215 associates user attribute information, content attribute information, smile degree, and content satisfaction with one another: age groups (infant, junior high school student, ...) are recorded as the user attribute information, and animal types (lion, panda, ...) are recorded as the content attribute information.
In this figure, the smile degrees are aggregated for each combination of user attribute information and content attribute information, and a content satisfaction value is recorded for each such combination. The content satisfaction may be a statistic (for example, the mean) of the aggregated smile degrees. Note that the aggregated information 215 may include content IDs instead of the content attribute information; in this case, the smile degrees may be aggregated, and the content satisfaction recorded, for each combination of user attribute information and content ID. From this figure, it can be seen that infants' content satisfaction is higher for pandas than for lions, while junior high school students' content satisfaction is higher for lions than for pandas.
Returning to FIG. 9, the description continues. The memory 220 is a volatile storage device such as a RAM (Random Access Memory), and serves as a storage area for temporarily holding information while the control unit 240 operates. The communication unit 230 is a communication interface with the network N.
The control unit 240 is a processor, that is, a control device, that controls each component of the information processing device 200. The control unit 240 loads the program 211 from the storage unit 210 into the memory 220 and executes it, thereby realizing the functions of a registration unit 241, an authentication control unit 242, a calculation unit 243, a recording control unit 244, an image generation unit 245, an output unit 246, and an evaluation unit 247.
The registration unit 241 is an example of the registration unit 11 described above. When the face information of the user U is registered by the authentication device 100 at the time of service usage registration and the user ID is notified, the registration unit 241 registers the user ID 2121 and the user attribute information 2122 in the storage unit 210. The attribute information of the user U is thereby associated, via the user ID, with the facial feature information 112 stored in the authentication device 100.
Here, the user attribute information 2122 may be information that the information processing device 200 receives via the network N as input data that the user U entered into the face authentication terminal 400 or the user terminal 500. In this case, the registration unit 241 registers the information provided by the user U as the attribute information of the user U in association with the user ID.
The user attribute information 2122 may also be information generated by the registration unit 241 based on the input data from the user U. For example, when the input data from the user U includes schedule information such as flight times and an itinerary, the registration unit 241 may estimate the means by which the user will arrive and register user attribute information 2122 including that means of arrival in the storage unit 210. Note that the registration unit 241 may acquire the schedule information from a schedule application on the user terminal 500 instead of from the input data of the user U, which saves the user U the trouble of entering it.
The user attribute information 2122 may also be information generated by the registration unit 241 based on a captured image of the user U, which may be an image captured for face authentication or for face registration. For example, the registration unit 241 may estimate the age and gender of the user U from the captured image and generate the user attribute information 2122 based on the estimated information. As another example, when a companion is detected together with the user U in the captured image, the registration unit 241 may generate companion information and include it in the user attribute information 2122. Specifically, the registration unit 241 may determine that the user U has a companion when a plurality of person regions are detected in the captured image. Whether a person region other than the image region of the user U is the image region of a companion of the user U may be determined from the distance between the image region of the user U and the other person region, and from the sizes of the two regions; the registration unit 241 is not limited to this, however, and may use any existing method. The registration unit 241 may further estimate the relationship (group attribute) between the user U and the companion from the captured image and generate companion information including the group attribute. Note that when various kinds of attribute information of the user U are estimated from the captured image, the entity performing the estimation is not limited to the registration unit 241 and may be the authentication device 100; in this case, the registration unit 241 receives the attribute information of the user U together with the user ID from the authentication device 100. Acquiring the attribute information of the user U from the captured image in this way saves the user U the trouble of entering it.
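The distance-and-size test for companion determination described above can be sketched as follows, assuming Python; the bounding-box representation (x, y, width, height in pixels) and the threshold values are illustrative assumptions, not values given in the description.

```python
import math

def is_companion(user_box, other_box, max_center_dist=200, max_size_ratio=1.5):
    """Judge whether another detected person region likely belongs to a
    companion of the user U, from the distance between the two regions
    and their relative sizes. Boxes are (x, y, w, h); thresholds are
    illustrative assumptions."""
    ux, uy, uw, uh = user_box
    ox, oy, ow, oh = other_box
    # Distance between the centers of the two person regions
    dist = math.hypot((ux + uw / 2) - (ox + ow / 2),
                      (uy + uh / 2) - (oy + oh / 2))
    # Ratio of the region areas (larger area divided by smaller area)
    area_ratio = max(uw * uh, ow * oh) / min(uw * uh, ow * oh)
    return dist <= max_center_dist and area_ratio <= max_size_ratio
```

A nearby region of similar size is judged a companion; a distant region, or one whose size differs greatly (e.g. a bystander far behind the user), is not.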
Note that receiving the service usage registration request may mean, in addition to acquiring the attribute information of the user U, obtaining permission to utilize the attribute information of the user U for data analysis, that is, a usage license.
The authentication control unit 242 is an example of the authentication control unit 12 described above. The authentication control unit 242 controls face authentication for the face area of the user U included in a captured image; that is, it controls face authentication for the face area of the user U included in each image captured at each point. Specifically, the authentication control unit 242 causes the authentication device 100 to perform face authentication on the captured image acquired from the face authentication terminal 400. For example, the authentication control unit 242 transmits a face authentication request including the acquired captured image, the point ID, and the shooting date and time to the authentication device 100 via the network N, and receives the face authentication result from the authentication device 100. Note that the authentication control unit 242 may detect the face area of the user U in the captured image and include the image of the face area in the face authentication request, or may extract facial feature information from the face area and include that information in the face authentication request. The authentication control unit 242 then supplies the captured image to the calculation unit 243 and the image generation unit 245, and supplies the face authentication result to the recording control unit 244.
The calculation unit 243 is also referred to as calculation means. The calculation unit 243 calculates the smile degree of the user U based on the face area of the user U included in the captured image. A specific example of calculating the smile degree of the user U is as follows. In general, when a person smiles, features such as raised corners of the mouth, lowered outer corners of the eyes, narrowed eyes, and wrinkles around the mouth appear in the face area. The calculation unit 243 may therefore extract feature points of facial organs such as the eyes and mouth in the face area of the user U, and calculate the smile degree based on the coordinates of the extracted feature points. The calculation unit 243 may also calculate the smile degree using a trained classifier. The classifier may be a support vector machine (SVM) that takes the feature quantities of the facial organs as input and outputs whether or not the smile degree is at or above a predetermined level, or a convolutional neural network (CNN) that takes the face area of the captured image as input and outputs the smile degree. The method of calculating the smile degree is not limited to the above, and any existing method may be used.
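As one hedged illustration of the feature-point approach, the Python sketch below scores a smile from four mouth landmarks; the choice of landmarks and the normalization are assumptions, the coordinates would in practice come from a face-landmark detector, and a trained SVM or CNN could replace this heuristic as described above.

```python
def smile_degree(left_corner, right_corner, upper_lip, lower_lip):
    """Illustrative smile score in [0, 1] from four mouth landmarks,
    given as (x, y) with the image y-axis pointing down. It uses one
    cue named in the description: mouth corners raised relative to the
    mouth center. The normalization by mouth width is an assumption."""
    mouth_center_y = (upper_lip[1] + lower_lip[1]) / 2
    # Corner lift: positive when the corners sit above the mouth center
    lift = mouth_center_y - (left_corner[1] + right_corner[1]) / 2
    width = right_corner[0] - left_corner[0]
    if width <= 0:
        return 0.0
    score = 0.5 + lift / width  # 0.5 = neutral mouth, >0.5 = smiling
    return max(0.0, min(1.0, score))
```

A neutral mouth (corners level with the mouth center) scores 0.5, while raised corners push the score toward 1.0.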
When the captured image includes the face areas of both the user U and a companion, the calculation unit 243 may calculate the smile degree based on the smile degree of the companion's face area in addition to that of the user U's face area. For example, the calculation unit 243 may calculate, as the smile degree, the mean of the smile degree of the user U's face area and the smile degree of the companion's face area.
The calculation unit 243 supplies the smile-degree information to the recording control unit 244 and the image generation unit 245.
The recording control unit 244 is an example of the recording control unit 14 described above. When the face authentication succeeds, the recording control unit 244 associates the user ID, the point ID, and the date and time included in the face authentication result with the smile degree calculated by the calculation unit 243, and records them in the storage unit 210 as the history information 213. In addition, the recording control unit 244 may include the position information of the shooting point in the history information 213.
The image generation unit 245 is also referred to as image generation means. The image generation unit 245 uses the smile degree and the captured image to specify or generate an image corresponding to the smile degree. For example, the image generation unit 245 may generate, as the image corresponding to the smile degree, a composite image in which the smile degree is superimposed on the captured image. Alternatively, when the smile degree is at or above a predetermined value, the image generation unit 245 may specify that captured image as the image corresponding to the smile degree, or may generate a composite image in which the smile degree is superimposed on that captured image. The image generation unit 245 may further combine the generated or specified image with a predetermined template and use the resulting composite image as the image corresponding to the smile degree. The predetermined template may be one related to the content associated with the shooting point; for example, when the shooting point is associated with pandas, the template may include an image of a panda. The image generation unit 245 stores the image corresponding to the smile degree in the image storage server 300, and records the access information of the storage destination in the storage unit 210 as part of the history information 213.
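The specify-or-generate decision above can be sketched as follows, assuming Python. The function returns a descriptor of what the image generation unit 245 would produce for one shot; the threshold value and descriptor shape are assumptions, and the actual pixel compositing is omitted.

```python
def select_smile_image(shot_id, smile_degree, threshold=0.7, overlay=True):
    """Sketch of the image generation unit 245's selection logic:
    either composite the smile degree onto the shot, or keep only
    shots whose smile degree is at or above a threshold. The threshold
    and the returned descriptor are illustrative assumptions."""
    if overlay:
        # Composite image: the shot with the smile degree superimposed
        return {"source": shot_id, "type": "composite",
                "label": f"smile {smile_degree:.0%}"}
    if smile_degree >= threshold:
        # The captured image itself serves as the smile image
        return {"source": shot_id, "type": "original"}
    return None  # below the threshold: no image for this shot
```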
The output unit 246 is an example of the output unit 16 described above. When a predetermined provision condition is satisfied, the output unit 246 outputs the images corresponding to the smile degree to the user terminal 500 via the network N. The predetermined provision condition may be that the communication unit 230 receives a shooting-end request from the face authentication terminal 400 after face authentication based on a captured image has succeeded. Since the images corresponding to the smile degree are accumulated in the image storage server 300, the output unit 246 may in this case use the access information corresponding to the user ID included in the history information 213 to acquire the images corresponding to the smile degree from the image storage server 300 and transmit them to the user terminal 500. The images acquired at this time may be a plurality of images corresponding to captured images with different shooting dates and times, so that when there are multiple images corresponding to the smile degree, the user U can obtain them all at once. Note that the predetermined provision condition may simply be that face authentication based on a captured image has succeeded; in this case, the output unit 246 transmits the information corresponding to the smile degree to the user terminal 500 whenever face authentication based on a captured image succeeds, even if no shooting-end request has been received.
Here, the output unit 246 may determine the information corresponding to the degree of smile to be output to the user terminal 500 according to the amount of attribute information provided by the user U. The attribute information provided by the user U may be attribute information input by the user U or attribute information whose use the user U has licensed. For example, the output unit 246 may determine the number of images to be output according to the amount of attribute information provided by the user U. As a result, the incentive to provide attribute information or license its use works even more readily, and the acquisition of attribute information becomes even easier.
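One possible mapping from the amount of provided attribute information to the number of output images is sketched below. The function name and the linear-with-cap policy are assumptions for illustration only; the disclosure leaves the exact mapping open.

```python
# Illustrative sketch: more attribute items provided (or licensed) by the
# user -> more smile-degree images returned, up to a cap.

def output_image_count(num_attributes_provided, base=1, per_attribute=1, cap=10):
    """Hypothetical policy for output unit 246: one base image plus one
    extra image per provided attribute item, capped at `cap`."""
    return min(cap, base + per_attribute * num_attributes_provided)
```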
The evaluation unit 247 is an example of the evaluation unit 17 described above. The evaluation unit 247 generates the aggregated information 215 using the user information 212, the history information 213, and the content information 214. Specifically, the evaluation unit 247 acquires the user attribute information of the user information 212 using the user ID corresponding to the degree of smile in the history information 213. The evaluation unit 247 also acquires the content attribute information or the content ID of the content information 214 using the point ID corresponding to the degree of smile in the history information 213. The evaluation unit 247 then associates the degree of smile, the user attribute information, and the content attribute information or content ID with one another and records them as the aggregated information 215. The evaluation unit 247 then aggregates the degree of smile for each combination of user attribute information and content attribute information, or of user attribute information and content ID, and calculates content satisfaction based on the aggregated degrees of smile. This makes it possible to grasp, for each user attribute, the content IDs and content attributes that yield high satisfaction, for example, "among junior high school students, pandas are more popular than lions, and carnivores are more popular than herbivores."
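The aggregation described above can be sketched as follows. All record layouts and names (`history`, `user_id`, `point_id`, `smile_degree`) are hypothetical, and the mean is used here only as one plausible way to turn aggregated smile degrees into a satisfaction score; the disclosure does not fix the formula.

```python
from collections import defaultdict

def aggregate_satisfaction(history, users, contents):
    """Sketch of evaluation unit 247.

    history:  list of records {user_id, point_id, smile_degree}
    users:    user_id -> user attribute (e.g. age bracket)
    contents: point_id -> content attribute or content ID (e.g. 'panda')
    Returns mean smile degree per (user attribute, content attribute) pair.
    """
    buckets = defaultdict(list)
    for rec in history:
        key = (users[rec["user_id"]], contents[rec["point_id"]])
        buckets[key].append(rec["smile_degree"])
    return {k: sum(v) / len(v) for k, v in buckets.items()}
```

For example, smile degrees of junior high school students collected at the panda and lion spots would yield one satisfaction score per (attribute, content) pair.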
Note that the evaluation unit 247 may aggregate the degree of smile based on at least one of the position information of the shooting location, the shooting time period, and the order in which the shooting locations were visited, in addition to or instead of the user attribute information and the content information. Here, the position information of the shooting location indicates the position of the content, the shooting time period indicates the time period in which the content is experienced, and the visit order of the shooting locations indicates the order in which the content is experienced. The evaluation unit 247 then calculates content satisfaction based on the degrees of smile aggregated according to these various parameters. This allows the evaluation unit 247 to evaluate content satisfaction along a variety of evaluation axes, enabling detailed analysis. For example, an analysis such as "when the panda viewing spot is close to the zoo entrance, overall content satisfaction among elderly visitors is higher than when it is far away" becomes possible.
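Aggregating along these alternative axes amounts to swapping the grouping key. The sketch below assumes hypothetical record fields (`hour`, `visit_order`) and a two-hour time band purely for illustration.

```python
from collections import defaultdict

def time_band(hour, width=2):
    """Bucket a shooting hour into a band such as '10-12' (assumed width)."""
    start = (hour // width) * width
    return f"{start:02d}-{start + width:02d}"

def aggregate_by(history, key_fn):
    """Aggregate smile degrees under an arbitrary key (time band, visit
    order, shooting location, ...) and return the mean per key."""
    buckets = defaultdict(list)
    for rec in history:
        buckets[key_fn(rec)].append(rec["smile_degree"])
    return {k: sum(v) / len(v) for k, v in buckets.items()}
```

The same history can thus be evaluated by time band (`lambda r: time_band(r["hour"])`) or by visit order (`lambda r: r["visit_order"]`) without changing the aggregation itself.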
FIG. 11 is a flowchart showing the flow of the usage registration process according to the second embodiment. First, the registration unit 241 receives a usage registration request via the network N (S401). Here, the request source is assumed to be either the face authentication terminal 400 or the user terminal 500. Note that, if the registration unit 241 receives a captured image in addition to the usage registration request at this time, it requests the authentication device 100 via the network N to register the face information in the face information DB 110.
Subsequently, the registration unit 241 acquires the user ID from the authentication device 100 via the network N (S402). The registration unit 241 then acquires the attribute information of the user U via the network N from the face authentication terminal 400 or user terminal 500 that made the request (S403). As described above, the registration unit 241 may instead generate the attribute information of the user U based on input data from the user U or on the captured image. The registration unit 241 then registers the attribute information of the user U as the user attribute information 2122 of the user information 212 in association with the user ID 2121 (S404), and ends the process.
FIG. 12 is a flowchart showing the flow of the image output process according to the second embodiment. First, the authentication control unit 242 acquires a captured image from the face authentication terminal 400 via the network N (S411). Next, the authentication control unit 242 transmits a face authentication request to the authentication device 100 via the network N (S412). At this time, the authentication control unit 242 includes in the face authentication request at least one of the captured image acquired in step S411, the face area extracted from the captured image, or the facial feature information extracted from the face area. The authentication control unit 242 then receives the face authentication result from the authentication device 100 via the network N (S413). If face authentication succeeded, the face authentication result includes an indication to that effect and the user ID; if face authentication failed, it includes an indication to that effect.
The authentication control unit 242 determines whether face authentication has succeeded (S414). If it determines that face authentication has failed (No in S414), the authentication control unit 242 outputs an indication that face authentication has failed (S415). Specifically, the authentication control unit 242 transmits a message indicating that face authentication has failed to the face authentication terminal 400 that provided the image, via the network N. The authentication control unit 242 then ends the process.
On the other hand, if it determines that face authentication has succeeded (Yes in S414), the authentication control unit 242 identifies the user ID for which face authentication succeeded (S416). Specifically, the authentication control unit 242 extracts the user ID included in the face authentication result. At this time, the authentication control unit 242 may output an indication that face authentication has succeeded. Specifically, the authentication control unit 242 may transmit a message indicating that face authentication has succeeded to the face authentication terminal 400 that provided the image, via the network N.
Next, the calculation unit 243 calculates the degree of smile based on the captured image (S417). Subsequently, the image generation unit 245 generates an image corresponding to the degree of smile based on the captured image and the degree of smile (S418). The image generation unit 245 saves the generated image corresponding to the degree of smile in the image storage server 300 via the network N (S419), and then acquires the access information of the storage destination from the image storage server 300. Subsequently, the recording control unit 244 registers the history information 213 (S420). Specifically, the recording control unit 244 associates the user ID, the point ID, the date and time, the degree of smile, and the access information with one another and records them in the storage unit 210 as the history information 213. The image generation unit 245 also outputs the image corresponding to the degree of smile to the face authentication terminal 400 via the network N (S421), whereupon the face authentication terminal 400 displays the image corresponding to the degree of smile on the display unit 440. Subsequently, the output unit 246 determines whether a predetermined provision condition is satisfied (S422). For example, the output unit 246 determines whether the predetermined provision condition is satisfied by determining whether the communication unit 230 has received a shooting end request from the face authentication terminal 400. If the output unit 246 determines that the predetermined provision condition is not satisfied (No in S422), the process ends. On the other hand, if the output unit 246 determines that the predetermined provision condition is satisfied (Yes in S422), it acquires the image corresponding to the degree of smile from the image storage server 300 using the access information in the history information 213 (S423). The output unit 246 then transmits the image corresponding to the degree of smile to the user terminal 500 via the network N (S424).
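The provision-condition check at S422 (together with the variant where face authentication success alone suffices) can be expressed as a small predicate. The function and parameter names are hypothetical, chosen only to illustrate the two condition variants described above.

```python
# Illustrative sketch of the provision-condition check of output unit 246.

def should_provide(face_auth_ok, end_request_received, require_end_request=True):
    """Return True if the smile-degree image should be sent to the user
    terminal.  With require_end_request=True this models the default
    condition (success + shooting end request); with False it models the
    simpler condition where face authentication success alone suffices."""
    if not face_auth_ok:
        return False
    return end_request_received or not require_end_request
```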
Note that the image corresponding to the degree of smile may be generated after the output unit 246 determines that the provision condition is satisfied. In this case, specifically, step S418 is omitted, and instead of step S419 the image generation unit 245 saves the captured image in the image storage server 300 via the network N and then acquires the access information of the storage destination from the image storage server 300. In step S421, the image generation unit 245 may output only the smile degree information to the face authentication terminal 400 via the network N. Then, instead of step S423, the output unit 246 acquires the captured image corresponding to the user U from the image storage server 300 using the access information in the history information 213. The image generation unit 245 then generates an image corresponding to the degree of smile based on the acquired captured image and degree of smile. In step S424, the output unit 246 transmits the generated image corresponding to the degree of smile to the user terminal 500 via the network N.
FIG. 13 is a flowchart showing the flow of the evaluation process according to the second embodiment. This evaluation process may be executed periodically, or may be executed when a predetermined execution condition is satisfied, such as when the history information 213 is updated. First, the evaluation unit 247 determines whether to start the evaluation process (S431). If it determines to start (Yes in S431), the evaluation unit 247 aggregates the degree of smile for each combination of user attribute information and content attribute information (or content ID) (S432), and records the aggregated information 215. The evaluation unit 247 calculates content satisfaction based on the aggregated degrees of smile (S433). The evaluation unit 247 then outputs the calculated content satisfaction to the outside in a predetermined format together with the user attribute information and the content attribute information (or content ID) (S434).
FIG. 14 is a sequence diagram showing an example of the flow of the usage registration process according to the second embodiment. First, at point A1, the face authentication terminal 400-1 transmits a service usage registration request to the information processing apparatus 200 (S500). The face authentication terminal 400-1 also photographs the user U (S501) and transmits a face information registration request including the captured image to the authentication device 100 via the network N (S502). The authentication device 100 then registers the face information (facial feature information) of the user U based on the captured image included in the received face information registration request (S503). The authentication device 100 then notifies the information processing apparatus 200 of the user ID via the network N (S504). The face authentication terminal 400-1 also transmits the attribute information of the user U to the information processing apparatus 200 via the network N (S505). The information processing apparatus 200 associates the notified user ID with the attribute information of the user U and registers them in the user information 212 (S506). The user U then moves from point A1 to point A2.
FIGS. 15 and 16 are sequence diagrams showing an example of the flow of the image output process according to the second embodiment. At point A2, the face authentication terminal 400-2 photographs the user U, whose usage registration has been completed (S510), and transmits the captured image and the point ID to the information processing apparatus 200 via the network N (S511). The information processing apparatus 200 transmits a face authentication request for the face area of the user U in the received captured image to the authentication device 100 via the network N (S512). The authentication device 100 then performs face authentication on the face area of the user U in the captured image included in the received face authentication request (S513). Here, face authentication is assumed to have succeeded. The authentication device 100 transmits a face authentication result including an indication of success and the user ID to the information processing apparatus 200 via the network N (S514).
Subsequently, the information processing apparatus 200 calculates the degree of smile from the captured image (S515). The information processing apparatus 200 then generates a composite image in which the degree of smile is superimposed on the captured image (S516). The information processing apparatus 200 saves the generated composite image in the image storage server 300 (S517). The information processing apparatus 200 then registers the user ID, the point ID, the date and time, the degree of smile, and the access information of the storage destination as the history information 213 (S518). The information processing apparatus 200 also transmits the generated composite image to the face authentication terminal 400-2 via the network N (S519). On receiving the composite image, the face authentication terminal 400-2 displays it (S520). The face authentication terminal 400-2 displays an input screen for shooting end information and accepts input from the user U. Here, it is assumed that the face authentication terminal 400-2 does not receive an input operation of shooting end information from the user U, but instead receives, for example, an input operation of continuation information indicating that the user U does not wish to end (S521). The face authentication terminal 400-2 transmits a continuation request including the continuation information to the information processing apparatus 200 via the network N (S522). The user U then moves from point A2 to point A3.
At point A3, the face authentication terminal 400-3 photographs the user U (S530) and transmits the captured image and the point ID to the information processing apparatus 200 via the network N (S531). Processing similar to steps S512 to S520 is then executed. The face authentication terminal 400-3 displays an input screen for shooting end information and accepts input from the user U. Here, it is assumed that the face authentication terminal 400-3 receives an input operation of shooting end information from the user U (S532). The face authentication terminal 400-3 transmits a shooting end request including the shooting end information to the information processing apparatus 200 via the network N (S533). On receiving the shooting end request, the information processing apparatus 200 accesses the image storage server 300 using the access information corresponding to the user ID (S534) and acquires the composite image (S535). The information processing apparatus 200 then transmits the acquired composite image to the user terminal 500 corresponding to the user ID (S536).
FIG. 17 is a diagram showing an example of a composite image 620 according to the second embodiment. The composite image 620 shown in this figure is generated in step S516 of FIG. 15, displayed by the face authentication terminal 400 in step S520, and acquired by the user terminal 500 in step S536 of FIG. 16. The composite image 620 is an image in which the degree of smile is superimposed on the captured image. The composite image 620 includes, in its background, an image area of an animal representing the features of the content associated with the shooting location.
As described above, the information processing apparatus 200 according to the second embodiment aggregates degrees of smile based on user attribute information and other information related to the user U, and performs a detailed analysis of content satisfaction. This makes it easy for the operator of the event venue to take measures to improve content satisfaction. In addition, since the user U can obtain information corresponding to the degree of smile as a benefit, the incentive to provide his or her attribute information or license its use works readily. As a result, the information processing apparatus 200 can easily acquire attribute information, which facilitates detailed analysis. Content satisfaction at the event venue can therefore be analyzed even more suitably.
Note that the output unit 246 may transmit the image corresponding to the degree of smile to the user terminal 500 only when the user U so desires. In this case, the face authentication terminal 400 that received the input operation of shooting end information in step S532 of FIG. 16 may display an input screen that accepts an input as to whether the user wishes to acquire the image corresponding to the degree of smile.
FIG. 18 is a diagram showing an example of a terminal screen 610 according to the second embodiment. The terminal screen 610 shown in this figure is displayed after step S532 of FIG. 16 is executed. The terminal screen 610 includes, together with the message "May we send this photo?", an input area that accepts an input as to whether the user U wishes the displayed composite image 620 to be transmitted. Then, instead of step S533 of FIG. 16, the face authentication terminal 400 transmits information indicating whether acquisition is desired to the information processing apparatus 200 together with the shooting end request. Note that the terminal screen 610 shown in this figure may also be displayed in step S520 of FIG. 15. In this case, the information processing apparatus 200 may record, for each captured image, whether acquisition is desired. Then, when the information processing apparatus 200 receives the shooting end request in step S533 of FIG. 16, it may acquire the desired images from the image storage server 300 and transmit them to the user terminal 500.
In the above description, the output unit 246 transmits the image corresponding to the degree of smile to the user terminal 500 in order to provide it to the user U, but it may instead transmit the access information. For example, instead of steps S534 to S536 of FIG. 16, the output unit 246 generates an image download guidance screen including the access information and transmits information on the generated guidance screen. The destination is not limited to the user terminal 500 and may be the face authentication terminal 400.
FIG. 19 is a diagram showing an example of a terminal screen 610 according to the second embodiment. The terminal screen 610 includes access information 611. The terminal screen 610 shown in this figure is displayed when the face authentication terminal 400 receives the information on the download guidance screen. The access information 611 in this figure is shown as a QR code (registered trademark). In other words, the display of the access information 611 indicates that the access information 611 has been presented to the user U. Here, the user terminal 500 reads the access information 611 on the terminal screen 610 displayed on the face authentication terminal 400 in response to an operation by the user U. The user terminal 500 then analyzes the read access information 611 and, based on the analysis result, transmits an image acquisition request to the image storage server 300 via the network N. Note that the image acquisition request may simply be an access (request message) from the user terminal 500 to a predetermined storage destination on the image storage server 300. The image storage server 300 reads the composite image stored at the storage destination indicated by the image acquisition request from the user terminal 500 and transmits a response including the composite image to the user terminal 500 via the network N.
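The access-information round trip described above can be sketched in miniature. The URL scheme, the function `make_access_info`, and the `ImageStore` stand-in for the image storage server 300 are all hypothetical; a real deployment would encode the URL into the QR code and serve the image over HTTP.

```python
# Illustrative sketch of access information and image retrieval.

def make_access_info(server, user_id, image_id):
    """Build a hypothetical storage-destination URL; in the embodiment
    this string would be what the QR code on terminal screen 610 encodes."""
    return f"https://{server}/images/{user_id}/{image_id}"

class ImageStore:
    """In-memory stand-in for image storage server 300."""
    def __init__(self):
        self._images = {}

    def save(self, access_info, image):
        self._images[access_info] = image

    def fetch(self, access_info):
        """Resolve an image acquisition request; None if nothing stored."""
        return self._images.get(access_info)
```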
The image corresponding to the degree of smile provided to the user U in step S536 of FIG. 16 may also be an overall composite image in which a plurality of images of the user U accumulated in the image storage server 300 are consolidated into a single image.
FIG. 20 is a diagram showing an example of an overall composite image 630 according to the second embodiment. The overall composite image 630 includes a plurality of images corresponding to the degree of smile (composite images 631, 632, 633, 634, and 635) of the user U whose face authentication has succeeded, corresponding to captured images of different shooting dates and times. In the overall composite image 630, the composite images 631 to 635 are combined so as to be individually identifiable.
When a predetermined provision condition is satisfied, the image generation unit 245 acquires images (captured images or images corresponding to the degree of smile) from the image storage server 300 using the access information corresponding to the user ID included in the history information 213. When it acquires a plurality of images, the image generation unit 245 combines them so as to be individually identifiable and generates an overall composite image. At this time, the image generation unit 245 may select, from among the plurality of images, images whose shooting locations differ from one another and generate the overall composite image from the selected images. The image generation unit 245 may also select, from among the plurality of images, images whose degree of smile is equal to or greater than a predetermined value and generate the overall composite image from the selected images. The image generation unit 245 may also select images in descending order of degree of smile; for example, it may select a predetermined number of images with the highest degrees of smile and generate the overall composite image from them. The image generation unit 245 may also select, for each shooting location, an image whose degree of smile is equal to or greater than a predetermined value or whose degree of smile is the highest, and generate the overall composite image from the selected images. The output unit 246 then provides the generated overall composite image to the user U. At this time, the output unit 246 may display the generated overall composite image on the user terminal 500, and may also display, in addition to the overall composite image, the individual images identifiably included in it.
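The selection variants above (threshold filtering, best-per-location, top-N by degree of smile) can be captured in one sketch. The function and field names are hypothetical, and an image is again modeled as a dict rather than pixel data.

```python
# Illustrative sketch of image selection for the overall composite image.

def select_for_overall(images, per_point_best=True, top_n=None, min_smile=None):
    """images: list of records {point_id, smile_degree}.

    Applies, in order: an optional smile-degree threshold, optional
    best-image-per-shooting-location reduction, descending sort by
    smile degree, and an optional top-N cut.
    """
    pool = list(images)  # avoid mutating the caller's list
    if min_smile is not None:
        pool = [im for im in pool if im["smile_degree"] >= min_smile]
    if per_point_best:
        best = {}
        for im in pool:
            cur = best.get(im["point_id"])
            if cur is None or im["smile_degree"] > cur["smile_degree"]:
                best[im["point_id"]] = im
        pool = list(best.values())
    pool.sort(key=lambda im: im["smile_degree"], reverse=True)
    if top_n is not None:
        pool = pool[:top_n]
    return pool
```

The selected records would then be composited, individually identifiable, into the overall composite image 630.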
In this way, the user U who has visited the various points can, through face authentication at a particular point, acquire an overall composite image that consolidates the captured images taken when face authentication was performed at each point. The user U can therefore easily grasp which of the content he or she actually experienced was most enjoyable. Furthermore, an overall composite image can easily be published by the user U on an SNS (Social Networking Service) or the like. Having the content recognized by many people can thus contribute to the revitalization of the event venue.
 The present disclosure is not limited to the above embodiments and can be modified as appropriate without departing from its scope. For example, in the second embodiment described above, the authentication device 100 is connected to the information processing device 200 via the network N. Alternatively, the functions of the face detection unit 120, feature point extraction unit 130, registration unit 140, and authentication unit 150 of the authentication device 100 may be included in the control unit 240 of the information processing device 200, and the face information DB 110 of the authentication device 100 may be included in the storage unit 210 of the information processing device 200.
 Likewise, in the second embodiment described above, the image storage server 300 is connected to the information processing device 200 via the network N. Alternatively, the functions of the image storage server 300 may be included in the storage unit 210 of the information processing device 200.
 Also, in the second embodiment described above, the calculation unit 243 of the information processing device 200 calculates the degree of smile based on the captured image. Alternatively, the authentication device 100 may calculate the degree of smile based on the captured image. In that case, the calculation unit 243 of the information processing device 200 may supply the smile-degree information to the recording control unit 244 and the image generation unit 245 in response to receiving it from the authentication device 100.
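Wherever the smile degree is computed, the recording step that follows a successful authentication can be sketched as below. This is an illustrative sketch only: `estimate_smile` is a hypothetical stand-in for the actual smile estimator (whether it runs in the information processing device or the authentication device), and the history-record fields mirror the history information 213 (user ID, location ID, date and time, smile degree, access information) as assumptions.

```python
from datetime import datetime


def estimate_smile(image) -> float:
    # Hypothetical placeholder: a real system would run a facial-expression
    # model here and return a score (for example, on a 0-100 scale).
    return 75.0


def record_history(history, user_id, location_id, image, access_info):
    """Append one history record after a successful face authentication."""
    smile = estimate_smile(image)
    history.append({
        "user_id": user_id,
        "location_id": location_id,
        "datetime": datetime.now().isoformat(),
        "smile_degree": smile,
        "access_info": access_info,  # where the stored image can be retrieved
    })
    return smile
```

The recorded entries are later used both for generating the overall composite image and for the evaluation of content satisfaction.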
 Although the above embodiments have been described in terms of a hardware configuration, the present disclosure is not limited to this. Any of the processing described in the present disclosure can also be realized by causing a CPU to execute a computer program.
 In the above examples, the program includes a set of instructions (or software code) that, when loaded into a computer, causes the computer to perform one or more of the functions described in the embodiments. The program may be stored in a non-transitory computer-readable medium or a tangible storage medium. By way of example and not limitation, such media include random-access memory (RAM), read-only memory (ROM), flash memory, solid-state drives (SSD) or other memory technology, CD-ROMs, digital versatile discs (DVD), Blu-ray (registered trademark) discs or other optical disc storage, magnetic cassettes, magnetic tape, and magnetic disk storage or other magnetic storage devices. The program may also be transmitted on a transitory computer-readable medium or a communication medium. By way of example and not limitation, transitory computer-readable media or communication media include electrical, optical, acoustic, or other forms of propagated signals.
Some or all of the above-described embodiments can also be described as in the following appendices, but are not limited to the following.
(Appendix 1)
An information processing apparatus comprising:
a registration unit that registers attribute information of a subject in association with biometric information of the subject when a use registration request is acquired;
an authentication control unit that causes biometric authentication to be performed using a captured image when the captured image, in which the subject is photographed at a predetermined location associated with predetermined content, is acquired;
a recording control unit that, when the biometric authentication succeeds, records the degree of smile of the subject calculated from the captured image in association with the attribute information of the subject and content information on the content associated with the shooting location;
an output unit that outputs information corresponding to the degree of smile of the subject to an information terminal used by the subject whose biometric authentication has succeeded; and
an evaluation unit that aggregates degrees of smile based on subjects' attribute information and content information and evaluates satisfaction with the content based on the aggregated degrees of smile.
(Appendix 2)
The information processing apparatus according to Appendix 1, wherein the attribute information of the subject includes at least one of the subject's age, gender, occupation, family structure, hobbies, companion information indicating the presence or absence of companions or a group attribute, and means of arrival at the venue.
(Appendix 3)
The information processing apparatus according to Appendix 1 or 2, wherein the registration unit registers information provided by the subject in association with the subject as the attribute information of the subject, and
the output unit determines, according to the amount of information provided by the subject, the information corresponding to the degree of smile to be output to the information terminal used by the subject.
(Appendix 4)
The information processing apparatus according to any one of Appendices 1 to 3, wherein the output unit
outputs information corresponding to the degree of smile of the subject to the information terminal in response to success of the biometric authentication based on the acquired captured image,
or,
when a shooting-end request indicating the end of shooting is acquired, outputs to the information terminal information corresponding to a plurality of degrees of smile of the subject corresponding to images captured at different shooting dates and times.
(Appendix 5)
The information processing apparatus according to any one of Appendices 1 to 4, wherein the information corresponding to the degree of smile is an image corresponding to the degree of smile generated based on a captured image for which the biometric authentication succeeded, and
the image corresponding to the degree of smile is a composite image in which the degree of smile of the subject is superimposed on the captured image, the captured image in which the degree of smile of the subject is equal to or greater than a predetermined value, or the composite image corresponding to a captured image in which the degree of smile of the subject is equal to or greater than the predetermined value.
(Appendix 6)
The information processing apparatus according to Appendix 5, further comprising an image generation unit that generates an overall composite image of the subject,
wherein the overall composite image includes a plurality of the images corresponding to the degrees of smile corresponding to images of the subject captured at different shooting dates and times, and
the output unit outputs the composite image to the information terminal used by the subject when the biometric authentication based on the acquired captured image succeeds and a shooting-end request indicating the end of shooting is acquired.
(Appendix 7)
The information processing apparatus according to any one of Appendices 1 to 6, wherein the recording control unit records the degree of smile in association with at least one of position information of the shooting location and the shooting date and time, and
the evaluation unit aggregates degrees of smile based on at least one of position information of shooting locations, shooting time periods, and the order in which shooting locations were visited.
(Appendix 8)
An information processing system comprising:
an information terminal that generates a captured image of a subject; and
an information processing device communicably connected to the information terminal,
wherein the information processing device includes:
a registration unit that registers attribute information of a subject in association with biometric information of the subject when use registration is accepted;
an authentication control unit that causes biometric authentication to be performed using a captured image when the captured image, in which the subject is photographed at a predetermined location associated with predetermined content, is acquired from the information terminal;
a recording control unit that, when the biometric authentication succeeds, records the degree of smile of the subject calculated from the captured image in association with the attribute information of the subject and content information on the content associated with the shooting location;
an output unit that outputs information corresponding to the degree of smile of the subject to the information terminal used by the subject whose biometric authentication has succeeded; and
an evaluation unit that aggregates degrees of smile based on subjects' attribute information and content information and evaluates satisfaction with the content based on the aggregated degrees of smile.
(Appendix 9)
The information processing system according to Appendix 8, wherein the attribute information of the subject includes at least one of the subject's age, gender, occupation, family structure, hobbies, companion information indicating the presence or absence of companions or a group attribute, and means of arrival at the venue.
(Appendix 10)
An information processing method comprising:
a step of registering attribute information of a subject in association with biometric information of the subject when a use registration request is acquired;
a step of causing biometric authentication to be performed using a captured image when the captured image, in which the subject is photographed at a predetermined location associated with predetermined content, is acquired;
a step of, when the biometric authentication succeeds, recording the degree of smile of the subject calculated from the captured image in association with the attribute information of the subject and content information on the content associated with the shooting location;
a step of outputting information corresponding to the degree of smile of the subject to an information terminal used by the subject whose biometric authentication has succeeded; and
a step of aggregating degrees of smile based on subjects' attribute information and content information and evaluating satisfaction with the content based on the aggregated degrees of smile.
(Appendix 11)
A program for causing a computer to execute:
a registration process of registering attribute information of a subject in association with biometric information of the subject when a use registration request is acquired;
an authentication control process of causing biometric authentication to be performed using a captured image when the captured image, in which the subject is photographed at a predetermined location associated with predetermined content, is acquired;
a recording control process of, when the biometric authentication succeeds, recording the degree of smile of the subject calculated from the captured image in association with the attribute information of the subject and content information on the content associated with the shooting location;
an output process of outputting information corresponding to the degree of smile of the subject to an information terminal used by the subject whose biometric authentication has succeeded; and
an evaluation process of aggregating degrees of smile based on subjects' attribute information and content information and evaluating satisfaction with the content based on the aggregated degrees of smile.
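As an illustration of the evaluation process described in Appendices 1, 7, and 10, degrees of smile can be grouped by a chosen key (an attribute, content ID, shooting location, or time period) and averaged to give a per-group satisfaction score. This is a minimal sketch under assumed field names; it is not taken from the actual embodiment.

```python
from collections import defaultdict


def aggregate_smiles(records, key):
    """Average smile degree per value of `key` (e.g. "location_id", "age_group").

    records: list of dicts, each containing `key` and "smile_degree".
    """
    buckets = defaultdict(list)
    for rec in records:
        buckets[rec[key]].append(rec["smile_degree"])
    # Satisfaction per group: the mean smile degree of that group.
    return {k: sum(v) / len(v) for k, v in buckets.items()}
```

Running the same records through different keys (location, time period, visit order) yields the different aggregations contemplated in Appendix 7.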
 Although the present invention has been described above with reference to the embodiments, the present invention is not limited to the above. Various changes that a person skilled in the art can understand may be made to the configuration and details of the present invention within the scope of the invention.
 This application claims priority based on Japanese Patent Application No. 2021-023729, filed on February 17, 2021, the entire disclosure of which is incorporated herein.
 The information processing device and information processing system according to the present embodiments can be used, for example, to operate an event venue.
REFERENCE SIGNS LIST
10 information processing device
11 registration unit
12 authentication control unit
14 recording control unit
16 output unit
17 evaluation unit
100 authentication device
110 face information DB
111 user ID
112 facial feature information
120 face detection unit
130 feature point extraction unit
140 registration unit
150 authentication unit
200 information processing device
300 image storage server
400 face authentication terminal
410 camera
420 storage unit
430 communication unit
440 display unit
450 control unit
451 shooting control unit
452 registration unit
453 authentication control unit
454 display control unit
460 input unit
500 user terminal
510 camera
520 storage unit
530 communication unit
540 display unit
550 control unit
551 shooting control unit
552 registration unit
553 acquisition unit
554 display control unit
560 input unit
210 storage unit
211 program
212 user information
2121 user ID
2122 user attribute information
213 history information
2131 user ID
2132 location ID
2133 date and time
2134 degree of smile
2135 access information
214 content information
2141 location ID
2142 content ID
2143 content attribute information
215 aggregated information
220 memory
230 communication unit
240 control unit
241 registration unit
242 authentication control unit
243 calculation unit
244 recording control unit
245 image generation unit
246 output unit
247 evaluation unit
610 terminal screen
611 access information
620 composite image
630 overall composite image
631, 632, 633, 634, 635 composite image
1000 information processing system
A1, A2, A3, An location
N network
U user

Claims (11)

  1.  An information processing apparatus comprising:
      registration means for registering attribute information of a subject in association with biometric information of the subject when a use registration request is acquired;
      authentication control means for causing biometric authentication to be performed using a captured image when the captured image, in which the subject is photographed at a predetermined location associated with predetermined content, is acquired;
      recording control means for, when the biometric authentication succeeds, recording the degree of smile of the subject calculated from the captured image in association with the attribute information of the subject and content information on the content associated with the shooting location;
      output means for outputting information corresponding to the degree of smile of the subject to an information terminal used by the subject whose biometric authentication has succeeded; and
      evaluation means for aggregating degrees of smile based on subjects' attribute information and content information and evaluating satisfaction with the content based on the aggregated degrees of smile.
  2.  The information processing apparatus according to claim 1, wherein the attribute information of the subject includes at least one of the subject's age, gender, occupation, family structure, hobbies, companion information indicating the presence or absence of companions or a group attribute, and means of arrival at the venue.
  3.  The information processing apparatus according to claim 1 or 2, wherein the registration means registers information provided by the subject in association with the subject as the attribute information of the subject, and
      the output means determines, according to the amount of information provided by the subject, the information corresponding to the degree of smile to be output to the information terminal used by the subject.
  4.  The information processing apparatus according to any one of claims 1 to 3, wherein the output means
      outputs information corresponding to the degree of smile of the subject to the information terminal in response to success of the biometric authentication based on the acquired captured image,
      or,
      when a shooting-end request indicating the end of shooting is acquired, outputs to the information terminal information corresponding to a plurality of degrees of smile of the subject corresponding to images captured at different shooting dates and times.
  5.  The information processing apparatus according to any one of claims 1 to 4, wherein the information corresponding to the degree of smile is an image corresponding to the degree of smile generated based on a captured image for which the biometric authentication succeeded, and
      the image corresponding to the degree of smile is a composite image in which the degree of smile of the subject is superimposed on the captured image, the captured image in which the degree of smile of the subject is equal to or greater than a predetermined value, or the composite image corresponding to a captured image in which the degree of smile of the subject is equal to or greater than the predetermined value.
  6.  The information processing apparatus according to claim 5, further comprising image generation means for generating an overall composite image of the subject,
      wherein the overall composite image includes a plurality of the images corresponding to the degrees of smile corresponding to images of the subject captured at different shooting dates and times, and
      the output means outputs the composite image to the information terminal used by the subject when the biometric authentication based on the acquired captured image succeeds and a shooting-end request indicating the end of shooting is acquired.
  7.  The information processing apparatus according to any one of claims 1 to 6, wherein the recording control means records the degree of smile in association with at least one of position information of the shooting location and the shooting date and time, and
      the evaluation means aggregates degrees of smile based on at least one of position information of shooting locations, shooting time periods, and the order in which shooting locations were visited.
  8.  An information processing system comprising:
      an information terminal that generates a captured image of a subject; and
      an information processing device communicably connected to the information terminal,
      wherein the information processing device includes:
      registration means for registering attribute information of a subject in association with biometric information of the subject when use registration is accepted;
      authentication control means for causing biometric authentication to be performed using a captured image when the captured image, in which the subject is photographed at a predetermined location associated with predetermined content, is acquired from the information terminal;
      recording control means for, when the biometric authentication succeeds, recording the degree of smile of the subject calculated from the captured image in association with the attribute information of the subject and content information on the content associated with the shooting location;
      output means for outputting information corresponding to the degree of smile of the subject to the information terminal used by the subject whose biometric authentication has succeeded; and
      evaluation means for aggregating degrees of smile based on subjects' attribute information and content information and evaluating satisfaction with the content based on the aggregated degrees of smile.
  9.  The information processing system according to claim 8, wherein the attribute information of the subject includes at least one of the subject's age, gender, occupation, family structure, hobbies, companion information indicating the presence or absence of companions or a group attribute, and means of arrival at the venue.
  10.  An information processing method comprising:
      registering attribute information of a subject in association with biometric information of the subject when a use registration request is acquired;
      causing biometric authentication to be performed using a captured image when the captured image, in which the subject is photographed at a predetermined location associated with predetermined content, is acquired;
      when the biometric authentication succeeds, recording the degree of smile of the subject calculated from the captured image in association with the attribute information of the subject and content information on the content associated with the shooting location;
      outputting information corresponding to the degree of smile of the subject to an information terminal used by the subject whose biometric authentication has succeeded; and
      aggregating degrees of smile based on subjects' attribute information and content information and evaluating satisfaction with the content based on the aggregated degrees of smile.
  11.  A non-transitory computer-readable medium storing a program for causing a computer to execute:
      a registration process of registering attribute information of a subject in association with biometric information of the subject when a use registration request is acquired;
      an authentication control process of causing biometric authentication to be performed using a captured image when the captured image, in which the subject is photographed at a predetermined location associated with predetermined content, is acquired;
      a recording control process of, when the biometric authentication succeeds, recording the degree of smile of the subject calculated from the captured image in association with the attribute information of the subject and content information on the content associated with the shooting location;
      an output process of outputting information corresponding to the degree of smile of the subject to an information terminal used by the subject whose biometric authentication has succeeded; and
      an evaluation process of aggregating degrees of smile based on subjects' attribute information and content information and evaluating satisfaction with the content based on the aggregated degrees of smile.
PCT/JP2021/045571 2021-02-17 2021-12-10 Information processing device, information processing system, information processing method, and non-transitory computer-readable medium WO2022176342A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2023500566A JP7468771B2 (en) 2021-02-17 2021-12-10 Information processing device, information processing system, information processing method, and program

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2021023729 2021-02-17
JP2021-023729 2021-02-17

Publications (1)

Publication Number Publication Date
WO2022176342A1 true WO2022176342A1 (en) 2022-08-25

Family

ID=82931391

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/045571 WO2022176342A1 (en) 2021-02-17 2021-12-10 Information processing device, information processing system, information processing method, and non-transitory computer-readable medium

Country Status (2)

Country Link
JP (1) JP7468771B2 (en)
WO (1) WO2022176342A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2024062571A1 (en) * 2022-09-21 2024-03-28 日本電気株式会社 Information processing device, authentication system, authentication method, and non-transitory computer-readable medium

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2012009957A (en) * 2010-06-22 2012-01-12 Sharp Corp Evaluation information report device, content presentation device, content evaluation system, evaluation information report device control method, evaluation information report device control program, and computer-readable recording medium
JP2015011712A (en) * 2013-06-28 2015-01-19 アザパ アールアンドディー アメリカズ インク Digital information gathering and analyzing method and apparatus
JP2019132928A (en) * 2018-01-30 2019-08-08 株式会社デンソー Music providing device for vehicle
JP2020144443A (en) * 2019-03-04 2020-09-10 パナソニックIpマネジメント株式会社 Face authentication system and face recognition method


Also Published As

Publication number Publication date
JPWO2022176342A1 (en) 2022-08-25
JP7468771B2 (en) 2024-04-16

Similar Documents

Publication Publication Date Title
JP5300585B2 (en) Information processing apparatus and information processing method
JP6769475B2 (en) Information processing system, management method for authentication, and program
CN105893813A (en) Biometric Information Registration Apparatus And Biometric Information Registration Method
JP5671224B2 (en) Image processing apparatus and image processing method
JP2012252613A (en) Customer behavior tracking type video distribution system
WO2022176342A1 (en) Information processing device, information processing system, information processing method, and non-transitory computer-readable medium
JP2022022083A (en) Entry/exit management device, entry/exit management method, entry/exit management program and entry/exit management system
JP2019020882A (en) Life log utilization system, method and program
JP2024040203A (en) Person detection device, person tracking device, person tracking system, person detection method, person tracking method, person detection program, and person tracking program
JP7367847B2 (en) Recommended control devices, systems, methods and programs
JP7409483B2 (en) Recommended devices, systems, methods and programs
WO2021250817A1 (en) Image providing device, image providing system, image providing method, and non-temporary computer-readable medium
JP7452622B2 (en) Presentation control device, system, method and program
WO2022176339A1 (en) Information processing device, information processing system, information processing method, and non-transitory computer-readable medium
WO2021199171A1 (en) Information processing device, system, method, and non-transitory computer-readable medium having program stored thereon
CN113888763A (en) Information processing system, information processing apparatus, and information processing method
JP2021190051A (en) Behavior body identification system
WO2023095318A1 (en) Guidance device, system, method, and computer-readable medium
JPWO2021199114A5 (en) Recommendation device, system, method and program
US20230222834A1 (en) Image providing apparatus, image providing system, image providing method, and non-transitory computer readable medium
WO2022038661A1 (en) Recommendation device, recommendation system, recommendation method, and non-transitory computer-readable medium
JP7067593B2 (en) Information processing system, management method for authentication, and program
JP5760131B1 (en) Image collation system, image collation apparatus, image collation method, and computer program
JP7400944B2 (en) Recommended control devices, systems, methods and programs
WO2023187960A1 (en) Authentication system, authentication method, and computer-readable medium

Legal Events

Date Code Title Description
121 Ep: The EPO has been informed by WIPO that EP was designated in this application

Ref document number: 21926767

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 2023500566

Country of ref document: JP

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: PCT application non-entry in European phase

Ref document number: 21926767

Country of ref document: EP

Kind code of ref document: A1