WO2023073940A1 - Information processing device, information processing system, information processing method, and non-transitory computer-readable medium - Google Patents

Information processing device, information processing system, information processing method, and non-transitory computer-readable medium

Info

Publication number
WO2023073940A1
Authority
WO
WIPO (PCT)
Prior art keywords
user
information
play
point
information processing
Prior art date
Application number
PCT/JP2021/040059
Other languages
French (fr)
Japanese (ja)
Inventor
宏嘉 澤井
Original Assignee
NEC Corporation (日本電気株式会社)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by NEC Corporation
Priority to PCT/JP2021/040059
Publication of WO2023073940A1

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63B APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B69/00 Training appliances or apparatus for special sports
    • A63B69/36 Training appliances or apparatus for special sports for golf
    • A63B69/38 Training appliances or apparatus for special sports for tennis

Definitions

  • the present disclosure relates to an information processing device, an information processing system, an information processing method, and a non-transitory computer-readable medium, and in particular to an information processing device, an information processing system, an information processing method, and a non-transitory computer-readable medium that introduce sports-related information to users.
  • Patent Literature 1 discloses a method of acquiring swing analysis data of a user from a user terminal and providing a diagnostic result, including information on a recommended golf club type, to a terminal of a shop or the like based on the swing analysis data.
  • in Patent Literature 1, a user terminal is configured to receive measurement data from a sensor unit attached to a golf club, generate swing analysis data based on the measurement data, and transmit the swing analysis data to a server.
  • an object of the present disclosure is to provide an information processing device, an information processing system, an information processing method, and a non-transitory computer-readable medium that can easily collect measurement data in various situations of a user and make appropriate recommendations according to the skill of the user.
  • An information processing device includes: identifying means for identifying a user ID that identifies a user, based on a photographed image of at least the face of the user who has visited one of a plurality of specific points; recording control means for recording play data, obtained by measuring an action of the user in a specific sport at the point visited by the user, in a play DB in association with the user ID as a play history; selection means for selecting, when the point visited by the user is a specified output point, recommended information from information related to the sport based on at least a part of the play history associated with the user ID and accumulated in the play DB; and output control means for outputting the recommended information to a terminal device located at the specified output point.
  • An information processing system includes: measurement terminals, each positioned at one of a plurality of specific points, that capture at least the face of a user visiting the point and acquire measurement data of an action of the user playing a specific sport at the point; and an information processing device.
  • the information processing device includes: identifying means for identifying a user ID that identifies the user who has visited one of the specific points, based on a captured image captured by the measurement terminal positioned at that point; recording control means for recording play data, obtained from the measurement data acquired by the measurement terminal at the point visited by the user, in a play DB in association with the user ID as a play history; selection means for selecting, when the point visited by the user is a specified output point, recommended information from information related to the sport based on at least a part of the play history associated with the user ID and accumulated in the play DB; and output control means for outputting the recommended information to a terminal device located at the specified output point.
  • An information processing method includes: identifying a user ID that identifies a user, based on a photographed image of at least the face of the user who has visited one of a plurality of specific points; recording play data, obtained by measuring an action of the user in a specific sport at the point visited by the user, in a play DB in association with the user ID as a play history; selecting, when the point visited by the user is a specified output point, recommended information from information related to the sport based on at least a part of the play history associated with the user ID and accumulated in the play DB; and outputting the recommended information to a terminal device located at the specified output point.
  • A non-transitory computer-readable medium stores a program for causing a computer to execute: an identification procedure of identifying a user ID that identifies a user, based on a photographed image of at least the face of the user who has visited one of a plurality of specific points; a recording control procedure of recording play data, obtained by measuring an action of the user in a specific sport at the point visited by the user, in a play DB in association with the user ID as a play history; a selection procedure of selecting, when the point visited by the user is a specified output point, recommended information from information related to the sport based on at least a part of the play history associated with the user ID and accumulated in the play DB; and an output control procedure of outputting the recommended information to a terminal device located at the specified output point.
  • according to the present disclosure, it is possible to provide an information processing device, an information processing system, an information processing method, and a non-transitory computer-readable medium that can easily collect user measurement data in various situations and make appropriate recommendations according to the user's skill.
  • FIG. 1 is a block diagram showing the configuration of an information processing apparatus according to a first embodiment;
  • FIG. 2 is a flowchart showing the flow of an information processing method according to the first embodiment;
  • FIG. 3 is a block diagram showing the overall configuration of an information processing system according to a second embodiment;
  • FIG. 4 is a block diagram showing the configuration of a face authentication device according to the second embodiment;
  • FIG. 5 is a flowchart showing the flow of face information registration processing according to the second embodiment;
  • FIG. 6 is a flowchart showing the flow of face authentication processing according to the second embodiment;
  • FIG. 7 is a block diagram showing the configuration of a user terminal according to the second embodiment;
  • FIG. 8 is a block diagram showing the configuration of a measurement terminal according to the second embodiment;
  • FIG. 9 is a block diagram showing the configuration of a server according to the second embodiment;
  • FIG. 10 is a diagram showing an example of the data structure of a point table according to the second embodiment;
  • FIG. 11 is a sequence diagram showing an example of the flow of user registration processing according to the second embodiment;
  • FIG. 12 is a sequence diagram showing an example of the flow of information processing when the server according to the second embodiment receives measurement data from a measurement terminal other than a designated output point;
  • FIG. 13 is a sequence diagram showing an example of the flow of information processing when the server according to the second embodiment receives measurement data from a measurement terminal at a designated output point;
  • FIG. 14 is a diagram showing an example of display on the measurement terminal according to the second embodiment;
  • FIG. 15 is a diagram showing an example of display on the measurement terminal according to the second embodiment;
  • FIG. 16 is a diagram showing an example of display on the measurement terminal according to the second embodiment;
  • FIG. 17 is a diagram showing an example of display on the measurement terminal according to the second embodiment;
  • FIG. 18 is a flowchart showing an example of the flow of collected data provision processing according to the second embodiment;
  • FIG. 19 is a block diagram showing the overall configuration of an information processing system according to a third embodiment;
  • FIG. 20 is a sequence diagram showing the flow of information processing when the server according to the third embodiment receives measurement data from a measurement terminal at a designated output point.
  • a “specific point” is a point where a camera for capturing a captured image for specifying a user is installed.
  • the "point where the camera is installed” may include not only the point where the camera is actually installed, but also the area around the camera.
  • the area around the camera may be a room or section in which the camera is installed, or an area or space that can be photographed by the camera.
  • the particular point may be a predetermined location of a driving range, any location on a golf course, or a predetermined location of a particular sporting goods store, such as a trial corner.
  • the arbitrary location on the golf course may be the area around a camera that is fixedly installed at a predetermined location on the golf course, or the area around a camera that is installed on a golf cart traveling around the golf course.
  • a "specific sport” is golf, tennis, soccer, baseball or any other sport.
  • “Equipment” is a tool used for the sport. For example, if the sport is golf, the “equipment” may be a golf club or a golf ball. Also, for example, if the sport is tennis, the “equipment” may be a tennis racket or a tennis ball. Also, for example, if the sport is baseball, the “equipment” may be a bat, a glove, or a baseball.
  • the "equipment” may be clothing or shoes.
  • information related to sports is also referred to as sports-related information.
  • Sports-related information includes information about equipment, lessons, coaches, courses or driving ranges for the sport.
  • the sport-related information may be golf club product information or sale information, golf lesson video distribution information, lesson professional introduction information, or golf course or driving range introduction information.
  • “Outputting information (to a terminal device)” may refer to transmitting information to a terminal device and outputting the information to an output unit of the terminal device.
  • the output format may be a display format or an audio output format. In the case of the display format, the output section of the terminal device may also be referred to as the display section, and in the case of the audio output format, the output section of the terminal device may also be referred to as the audio output section.
  • FIG. 1 is a block diagram showing the configuration of an information processing apparatus 10 according to the first embodiment.
  • the information processing device 10 is a computer device that introduces information related to a specific sport to the user.
  • the information processing device 10 is connected to a network (not shown).
  • a network may be wired or wireless.
  • a terminal device (not shown) is connected to the network.
  • the terminal device is located at the designated output point.
  • the information processing device 10 includes a specifying unit 13 , a recording control unit 15 , a selection unit 16 and an output control unit 17 .
  • the identification unit 13 is also called identification means.
  • the specifying unit 13 specifies a user ID for identifying a user based on a photographed image of at least the face of the user who has visited one of the plurality of specific spots. For example, the identifying unit 13 identifies the user ID of the user by face authentication.
  • the user ID may be a user identification number or a user name.
  • the recording control unit 15 is also called recording control means.
  • the recording control unit 15 records the play data obtained by measuring the user's motion at the point visited by the user as a play history in a play DB (not shown) in association with the user ID.
  • the motions to be measured are body motions related to the sport: for example, the address posture and the swing motion if the sport is golf, or the stroke motion and the serve motion if the sport is tennis. Examples of the measurement data include the acceleration of the tool, the angular velocity of the tool, the positional relationship between the tool and the body, the position of the hitting point, the direction of the face, the direction of the hit ball, the speed of the hit ball, the flight distance, and the positions of various parts of the user's body.
  • the measured action may also include information about the score of the sport.
  • the play data may be measurement data of the user's motion itself, or may be analysis data generated by analyzing the motion based on the measurement data. Analytical data may include data regarding a user's skill level, behavior characteristics, or similarity of behavior to other users or models.
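To make the distinction between measurement data and play data concrete, the following is a minimal Python sketch; the field names (club_acceleration, skill_level, and so on) are illustrative assumptions and are not taken from the disclosure.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class MeasurementData:
    """Raw quantities measured at a specific point (illustrative fields only)."""
    club_acceleration: Optional[float] = None    # acceleration of the tool
    club_angular_velocity: Optional[float] = None
    impact_point: Optional[tuple] = None          # position of the hitting point
    face_direction: Optional[float] = None        # direction of the face
    ball_direction: Optional[float] = None        # direction of the hit ball
    ball_speed: Optional[float] = None
    carry_distance: Optional[float] = None        # flight distance
    body_keypoints: dict = field(default_factory=dict)  # positions of body parts

@dataclass
class PlayData:
    """Either the measurement data itself or analysis data derived from it."""
    measurement: Optional[MeasurementData] = None
    skill_level: Optional[str] = None             # e.g. "beginner" / "intermediate"
    motion_features: dict = field(default_factory=dict)  # behavior characteristics
    similarity_to_model: Optional[float] = None   # similarity to other users or models
```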
  • the selection unit 16 is also called selection means. First, the selection unit 16 determines whether or not the point visited by the user is the designated output point.
  • the designated output point is a point determined in advance as a point at which recommended information is to be output, and may be, for example, a trial hitting corner of a specific sporting goods store. If the point visited by the user is the designated output point, the selection unit 16 selects at least one piece of information from the sports-related information as recommended information, based on the play history associated with the user ID and accumulated in the play DB. As an example, the selection unit 16 may estimate the user's skill level based on the user's play history and select golf club product information corresponding to the skill level as the recommended information.
  • the play history used in the selection process may be all of the user's play history accumulated in the play DB, or may be only a part of the play history.
  • a part of the play history may be, for example, the play history of a specific period, or the play history of play at a specific point or with equipment of a specific manufacturer.
  • if the point visited by the user is not the designated output point, the selection unit 16 does not select recommended information.
  • the output control unit 17 is also called output control means.
  • the output control unit 17 causes the terminal device located at the designated output point to output the recommended information.
  • the terminal device may include a measurement terminal located at the designated output point.
  • the output control unit 17 causes the measurement terminal located at the specified output point to output the recommended information.
  • the terminal device may include a user terminal possessed by a user located at the designated output point.
  • the output control unit 17 may output the recommended information to a user terminal possessed by the user located at the designated output point.
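The following is a minimal Python sketch of how the four units of the information processing device 10 could fit together, including the S10-S15 flow described next for FIG. 2; the collaborator objects (face_matcher, play_db, sports_info, terminal_gateway) and their method names are hypothetical and introduced only for illustration.

```python
class InformationProcessingDevice:
    """Sketch of device 10 with units 13, 15, 16 and 17; collaborators are assumed."""

    def __init__(self, face_matcher, play_db, sports_info, output_points, terminal_gateway):
        self.face_matcher = face_matcher          # used by the identifying unit 13
        self.play_db = play_db                    # play DB storing play histories
        self.sports_info = sports_info            # candidate sports-related information
        self.output_points = set(output_points)   # IDs of designated output points
        self.terminal_gateway = terminal_gateway  # sends data to terminal devices

    def identify(self, captured_image) -> str:
        """Identifying unit 13: resolve a user ID from a face image."""
        return self.face_matcher.match(captured_image)

    def record(self, user_id: str, play_data) -> None:
        """Recording control unit 15: store play data as play history."""
        self.play_db.append(user_id, play_data)

    def select(self, user_id: str):
        """Selection unit 16: pick recommended information from the play history."""
        history = self.play_db.load(user_id)
        return self.sports_info.best_match(history)

    def handle_visit(self, point_id: str, captured_image, play_data) -> None:
        """Rough equivalent of the S10-S15 flow of FIG. 2."""
        user_id = self.identify(captured_image)          # S12
        self.record(user_id, play_data)                  # S13
        if point_id in self.output_points:               # S14
            recommended = self.select(user_id)           # S15
            self.terminal_gateway.output(point_id, recommended)  # output control unit 17
```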
  • FIG. 2 is a flow chart showing the flow of the information processing method according to the first embodiment.
  • the information processing device 10 determines whether or not the user has visited a specific spot (S10). If it is not determined that the user has visited the specific point (NO in S10), the process shown in S10 is repeated. On the other hand, if it is determined that the user has visited the specific point (YES in S10), the information processing apparatus 10 acquires a photographed image of the user at the specific point (S11).
  • the identification unit 13 of the information processing device 10 identifies the user ID of the user based on the captured image (S12).
  • the recording control unit 15 of the information processing device 10 records the play data obtained by measuring the user's motion at the point visited by the user as a play history in the play DB in association with the user ID. (S13).
  • the information processing apparatus 10 determines whether or not the point visited by the user is the specified output point (S14). If the point visited by the user is not the designated output point (NO in S14), the process returns to S10, and if the point visited by the user is the designated output point (YES in S14), the process proceeds to S15.
  • the selection unit 16 of the information processing device 10 selects recommended sports-related information from the sports-related information based on at least part of the play history.
  • the output control unit 17 of the information processing device 10 outputs the recommended information to the terminal device.
  • the information processing device 10 can collect measurement data in various scenes by linking it to the user specified by the captured image, so that the play history can be easily accumulated. A user's login operation is not required for linking. As a result, the information processing apparatus 10 can make an appropriate recommendation to the user who has visited the designated output point according to the skill of the user, taking into consideration the play history collected in the past.
  • FIG. 3 is a block diagram showing the overall configuration of an information processing system 1000 according to the second embodiment.
  • the information processing system 1000 is a computer system that introduces users to information related to specific sports.
  • the specific sport is golf, but the type of sport is of course not limited to this.
  • the information processing system 1000 makes recommendations to the user according to the skill of the user and, in addition, is configured to efficiently collect play data so as to stimulate product development by manufacturers and other activities of related companies.
  • the information processing system 1000 includes a user terminal 300, measurement terminals 400-1, 400-2, ..., 400-n located at respective points A1, A2, ..., An, a face authentication device 100, an information processing device (hereinafter referred to as a server) 200, and a manufacturer terminal 500.
  • Each device and terminal are connected to each other via a network N.
  • the network N is a wired or wireless communication line.
  • the user terminal 300 is an information terminal used by the user.
  • User terminal 300 transmits a user registration request to server 200 . Thereby, the user's facial feature information is registered and a user ID is issued.
  • the user terminal 300 also transmits user information, which is information about the user, to the server 200, and causes the server 200 to register the user information in association with the user ID.
  • Measurement terminals 400-1, 400-2, ..., 400-n are information terminals located at points A1, A2, ..., An, respectively. Points A1 to An are points at which cameras for capturing images for face authentication are installed. The measurement terminals 400-1, 400-2, ..., 400-n capture photographed images for face authentication of users who visit the respective points A1, A2, ..., An, and acquire measurement data of the actions of those users playing a specific sport.
  • the measurement data includes the captured image itself for face authentication or data related to physical quantities generated by image processing based on the captured image for face authentication, but is not limited to this; the measurement data may include, for example, output data of sensors attached to the user or the equipment.
  • the measurement terminal 400 associates the photographed image for face authentication with the measurement data and transmits them to the server 200 . When the measurement data is the captured image itself for face authentication, the measurement terminal 400 may transmit the captured image for face authentication to the server 200 .
  • the designated output points are points A1 and A2.
  • the specified output points A1 and A2 may be predetermined areas of pre-specified stores (first store, second store) that sell sporting goods.
  • the designated output points A1 and A2 may be the trial hitting corners of the first store and the second store.
  • the points other than the designated output points A1 and A2 may include a sporting goods store different from the first store and the second store, or a predetermined area of a sports practice field or game venue.
  • the measurement terminal 400 located at the designated output point has a recommended information output function for outputting recommended information to the user, in addition to the function of transmitting the photographed image for face authentication and the measurement data. When receiving the recommended information from the server 200, the measuring terminal 400 positioned at the designated output point displays or outputs the recommended information by voice.
  • the face authentication device 100 is a computer device that stores facial feature information of multiple people.
  • the face authentication device 100 has a face authentication function that, in response to a face authentication request received from the outside, compares the face image or face feature information included in the request with the face feature information of each user.
  • the face authentication device 100 registers facial feature information of the user at the time of user registration. Then, the face authentication device 100 acquires the photographed image of the user who visited the spot from the measurement terminal 400 at each spot via the server 200, and performs face authentication using the face area in the photographed image. The face authentication device 100 then returns the collation result (face authentication result) to the server 200 .
  • the server 200 is an example of the information processing device 10 described above.
  • the server 200 is a computer device that collects user measurement data and accumulates play data obtained from the measurement data in association with user IDs. For example, when the server 200 receives a photographed image for face authentication and measurement data from the measurement terminal 400, the server 200 requests the face authentication device 100 to perform face authentication for the face area.
  • the server 200 associates the user ID included in the face authentication result received from the face authentication device 100 with the play data generated based on the measurement data, and registers them in a play history DB (not shown) as play history. .
  • if the measurement terminal 400 that transmitted the measurement data is located at a designated output point, the server 200 selects recommended information based on the play history that is associated with the specified user ID and accumulated in the play history DB. The server 200 then transmits the recommended information to the measurement terminal 400 that is the transmission source.
  • the server 200 also transmits the accumulated play history of the user to the maker terminal 500 .
  • the maker terminal 500 performs product improvement, new product development, or marketing based on the user's play history.
  • FIG. 4 is a block diagram showing the configuration of the face authentication device 100 according to the second embodiment.
  • the face authentication device 100 includes a face information DB (DataBase) 110 , a face detection section 120 , a feature point extraction section 130 , a registration section 140 and an authentication section 150 .
  • the face information DB 110 associates and stores a user ID 111 and face feature information 112 of the user ID.
  • the facial feature information 112 is a set of feature points extracted from a facial image, and is an example of facial information.
  • the face authentication device 100 may delete the facial feature information 112 in the face information DB 110 in response to a request from the registered user of the facial feature information 112.
  • the face authentication device 100 may delete the facial feature information 112 after a certain period of time has passed since it was registered.
  • the face detection unit 120 detects a face area included in a registered image for registering face information, and supplies it to the feature point extraction unit 130 .
  • the feature point extraction unit 130 extracts feature points from the face area detected by the face detection unit 120 and supplies face feature information to the registration unit 140 .
  • the feature point extraction unit 130 also extracts feature points included in the captured image received from the server 200 and supplies facial feature information to the authentication unit 150 .
  • the registration unit 140 newly issues a user ID 111 when registering facial feature information.
  • the registration unit 140 associates the issued user ID 111 with the facial feature information 112 extracted from the registered image and registers them in the facial information DB 110 .
  • the authentication unit 150 performs face authentication using the facial feature information 112 . Specifically, the authentication unit 150 collates the facial feature information extracted from the captured image with the facial feature information 112 in the facial information DB 110 . Authentication unit 150 returns to server 200 whether or not the facial feature information matches. Whether the facial feature information matches or not corresponds to the success or failure of the authentication. Note that matching of facial feature information (matching) means a case where the degree of matching is equal to or greater than a predetermined value.
  • FIG. 5 is a flowchart showing the flow of face information registration processing according to the second embodiment.
  • the face authentication device 100 acquires the registered image of the user U included in the face registration request (S21). For example, the face authentication device 100 receives a face registration request via the network N from the server 200 that received the user registration request from the user terminal 300 . Note that the face authentication device 100 may receive a face registration request directly from the user terminal 300 without being limited to this.
  • face detection section 120 detects a face area included in the registered image (S22).
  • the feature point extraction unit 130 extracts feature points from the face area detected in step S22, and supplies face feature information to the registration unit 140 (S23).
  • the registration unit 140 issues the user ID 111, associates the user ID 111 with the facial feature information 112, and registers them in the facial information DB 110 (S24).
  • the face authentication device 100 may receive the face feature information 112 from the face registration requester and register it in the face information DB 110 in association with the user ID 111 .
  • FIG. 6 is a flowchart showing the flow of face authentication processing according to the second embodiment.
  • the feature point extraction unit 130 acquires facial feature information for authentication (S31).
  • the face authentication device 100 receives a face authentication request from the server 200 via the network N, and extracts facial feature information from the captured image included in the face authentication request in steps S21 to S23.
  • the face authentication device 100 may receive facial feature information from the server 200 .
  • the authentication unit 150 collates the acquired facial feature information with the facial feature information 112 of the facial information DB 110 (S32).
  • if there is matching facial feature information (Yes in S33), the authentication unit 150 identifies the user ID 111 of the user whose facial feature information matched (S34). Then, the authentication unit 150 returns, as the face authentication result, an indication that the face authentication was successful together with the identified user ID 111 to the server 200 (S35). If there is no matching facial feature information (No in S33), the authentication unit 150 returns a face authentication result to the server 200 to the effect that the face authentication has failed (S36).
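As one possible illustration of the collation in S32-S36, the sketch below matches query facial features against the face information DB 110 using cosine similarity and a fixed threshold; the similarity measure and the threshold value are assumptions, since the disclosure only requires that the degree of matching be equal to or greater than a predetermined value.

```python
import math
from typing import Dict, List, Optional, Tuple

MATCH_THRESHOLD = 0.8  # assumed "predetermined value" for the degree of matching

def cosine_similarity(a: List[float], b: List[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def authenticate(query_features: List[float],
                 face_info_db: Dict[str, List[float]]) -> Tuple[bool, Optional[str]]:
    """Collate query facial features against the face information DB (S32-S36).

    Returns (success, user_id). Feature extraction (S31) is assumed to have
    been performed elsewhere and is not modelled here.
    """
    best_id, best_score = None, 0.0
    for user_id, registered in face_info_db.items():
        score = cosine_similarity(query_features, registered)
        if score > best_score:
            best_id, best_score = user_id, score
    if best_score >= MATCH_THRESHOLD:     # Yes in S33 -> S34, S35
        return True, best_id
    return False, None                    # No in S33 -> S36
```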
  • FIG. 7 is a block diagram showing the configuration of the user terminal 300 according to the second embodiment.
  • User terminal 300 includes camera 310 , storage unit 320 , communication unit 330 , display unit 340 , input unit 350 , and control unit 360 .
  • the camera 310 is an imaging device that performs imaging under the control of the control unit 360 .
  • the storage unit 320 is a storage device that stores programs for realizing each function of the user terminal 300 .
  • a communication unit 330 is a communication interface with the network N.
  • the display unit 340 is a display device.
  • the input unit 350 is an input device that receives input from the user.
  • the display unit 340 and the input unit 350 may be configured integrally like a touch panel.
  • the control unit 360 controls hardware of the user terminal 300 .
  • FIG. 8 is a block diagram showing the configuration of the measurement terminal 400 according to the second embodiment.
  • the measurement terminal 400 includes a camera 410 , a storage section 420 , a communication section 430 , a display section 440 , an input section 450 and a control section 460 .
  • the camera 410 is an imaging device that performs imaging under the control of the control unit 460 .
  • the storage unit 420 is a storage device that stores programs for realizing each function of the measurement terminal 400 .
  • a communication unit 430 is a communication interface with the network N.
  • the display unit 440 is a display device.
  • Input unit 450 is an input device that receives an input.
  • the display unit 440 and the input unit 450 may be configured integrally like a touch panel.
  • the control unit 460 controls hardware included in the measurement terminal 400 . Note that the display unit 440 and the input unit 450 may be omitted for the measurement terminals 400 located at points other than the designated output points A1 and A2.
  • FIG. 9 is a block diagram showing the configuration of the server 200 according to the second embodiment.
  • this diagram shows the configuration of the server 200 when introducing equipment-related information related to golf equipment as information related to sports, but the present invention is not limited to this.
  • the server 200 includes a storage unit 210, a memory 220, a communication unit 230, and a control unit 240.
  • the storage unit 210 is a storage device such as a hard disk or flash memory.
  • Storage unit 210 stores program 211 , user DB 212 , play history DB 213 , equipment DB 214 , and location table 215 .
  • the program 211 is a computer program in which the processing of the information processing method according to the second embodiment is implemented.
  • the user DB 212 stores information related to users. Specifically, the user DB 212 stores user information 2122 and usage history 2123 in association with the user ID 2121 .
  • a user ID 2121 is a user ID issued by the face authentication device 100 when face information is registered.
  • the user information 2122 is information about the user, and may include, for example, user attribute information and questionnaire information about the user's degree of sports experience and tools in use. Attribute information may include age, address, gender, occupation, height, weight, or foot size.
  • the user information 2122 may also include personal information such as the user's credit card number.
  • the usage history 2123 may include the user's entrance history to each location, as well as payment history and trial hitting history at a predetermined store.
  • the play history DB 213 stores the user's play history. Specifically, the play history DB 213 stores play data 2132 and measurement conditions 2133 in association with the user ID 2131 .
  • the play data 2132 is the measurement data itself received from the measurement terminal 400 or analysis data generated by analyzing the measurement data.
  • the measurement condition 2133 is the measurement condition of the measurement data that forms the basis of the play data.
  • the measurement conditions 2133 may include information related to the date and time of measurement, the location of measurement, and the tool used by the user at the time of measurement. In the second embodiment, the date and time of measurement and the location of measurement are the same as the date and time of shooting and the location of the captured image for face authentication.
  • the information about the tool used may include the tool ID and the manufacturer ID.
  • the tool DB 214 stores information related to tools from various manufacturers. Specifically, the tool DB 214 stores a maker ID 2142, tool attributes 2143, and tool related information 2144 in association with the tool ID 2141.
  • the tool ID 2141 is information for identifying the tool, and may be a product model number or a combination of a manufacturer ID and a product model number.
  • the maker ID 2142 is information for identifying the maker that manufactured the tool, such as the maker's identification number or maker name.
  • the tool attribute 2143 is information indicating the attribute of the tool.
  • equipment attributes 2143 may include information indicating the type and number of golf clubs, such as drivers, irons or putters.
  • Equipment attributes 2143 may also include information regarding the length, weight or shape characteristics of the equipment.
  • equipment attributes 2143 may also include, for example, the target skill level such as beginner, intermediate, or advanced, the target user such as male, female, or children, or whether the equipment is for right-handed or left-handed users.
  • the location table 215 is a table that defines for each location whether or not the location is a location for which recommended information is to be output, that is, whether or not it is a designated output location.
  • FIG. 10 is a diagram showing an example of the data structure of the point table 215 according to the second embodiment. As shown in FIG. 10, the point table 215 associates a point ID with information indicating whether or not it is a specified output point.
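The storage unit 210 could be realized, for example, with relational tables along the lines of the following SQLite sketch; the column names are assumptions chosen to mirror the fields described above (user ID 2121, play data 2132, measurement conditions 2133, tool attributes 2143, and the point table 215 of FIG. 10).

```python
import sqlite3

# Minimal sketch of the storage-unit 210 tables; column names are assumptions.
SCHEMA = """
CREATE TABLE IF NOT EXISTS user_db (            -- user DB 212
    user_id       TEXT PRIMARY KEY,             -- user ID 2121 issued at face registration
    user_info     TEXT,                         -- user information 2122 (attributes, questionnaire)
    usage_history TEXT                          -- usage history 2123 (visits, payments, trial hits)
);
CREATE TABLE IF NOT EXISTS play_history_db (    -- play history DB 213
    user_id       TEXT,                         -- user ID 2131
    play_data     TEXT,                         -- play data 2132 (measurement or analysis data)
    measured_at   TEXT,                         -- measurement conditions 2133: date and time
    point_id      TEXT,                         --   location of measurement
    tool_id       TEXT,                         --   tool used at the time of measurement
    maker_id      TEXT
);
CREATE TABLE IF NOT EXISTS equipment_db (       -- equipment DB 214
    tool_id       TEXT PRIMARY KEY,             -- tool ID 2141
    maker_id      TEXT,                         -- maker ID 2142
    tool_attrs    TEXT,                         -- tool attributes 2143 (type, length, target level)
    tool_info     TEXT                          -- tool-related information 2144
);
CREATE TABLE IF NOT EXISTS point_table (        -- point table 215 (FIG. 10)
    point_id              TEXT PRIMARY KEY,
    is_designated_output  INTEGER               -- 1 if recommended information should be output
);
"""

conn = sqlite3.connect(":memory:")
conn.executescript(SCHEMA)
```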
  • the memory 220 is a volatile storage device such as a RAM (Random Access Memory), and is a storage area for temporarily holding information when the control unit 240 operates.
  • the communication unit 230 is a communication interface with the network N.
  • the control unit 240 is a processor that controls each component of the server 200, that is, a control device.
  • the control unit 240 loads the program 211 from the storage unit 210 into the memory 220 and executes the program 211 .
  • the control unit 240 realizes the functions of the registration unit 241 , the image acquisition unit 242 , the identification unit 243 , the generation unit 244 , the recording control unit 245 , the selection unit 246 , the output control unit 247 and the collected data providing unit 248 .
  • the registration unit 241 is also called registration means.
  • the registration section 241 transmits a face registration request to the face authentication device 100 .
  • the registration unit 241 registers the user ID in the user DB 212 .
  • the registration unit 241 registers the user information of the user in the user DB 212 in association with the user ID of the user used by the user terminal 300 .
  • the image acquisition unit 242 is also called image acquisition means. When receiving a record request from the measurement terminal 400 at any point, the image acquisition unit 242 acquires the captured image or facial feature information included in the record request. The image acquisition unit 242 supplies the captured image or facial feature information to the identification unit 243 . The image acquisition unit 242 also supplies the measurement data included in the recording request to the generation unit 244 .
  • the specifying unit 243 is an example of the specifying unit 13 described above.
  • the identifying unit 243 controls face authentication for the face area of the user U included in the captured image to identify the user. That is, the identifying unit 243 causes the face authentication device 100 to perform face authentication on the captured image acquired from the measurement terminal 400 .
  • the specifying unit 243 transmits a face authentication request including the acquired photographed image to the face authentication device 100 via the network N.
  • the specifying unit 243 may extract the face area of the user U from the captured image and include the extracted image in the face authentication request. Further, the specifying unit 243 may extract facial feature information from the face area and include the facial feature information in the face authentication request.
  • the specifying unit 243 then receives the face authentication result from the face authentication device 100 . Thereby, the identifying unit 243 identifies the user ID of the user.
  • the generation unit 244 analyzes the measurement data included in the recording request, and generates play data regarding the user's actions at the point visited by the user based on the measurement data.
  • the measurement data includes the photographed image for face authentication itself (that is, the photographed image used to specify the user ID) or data about physical quantities, such as positions and orientations, generated by image processing based on the photographed image for face authentication.
  • the method of generating play data is not limited to a particular one, but the following is one example. If the measurement data includes the positional relationship between the equipment and the body, the positions of the user's body parts, and the position of the center of gravity, the generation unit 244 analyzes the user's address posture and swing trajectory from these data and calculates feature amounts of the address posture and the swing trajectory. If the measurement data includes the position of the hitting point and the direction of the face, the generation unit 244 analyzes the direction and flight distance of the hit ball from these data and calculates a feature amount of the hit ball. Then, the generation unit 244 generates play data including the feature amounts of the user's address posture and swing trajectory and the feature amount of the hit ball. The generation unit 244 supplies the play data to the recording control unit 245.
  • the generating unit 244 may generate data regarding physical quantities based on the photographed image, and then generate play data based on the data regarding physical quantities. That is, in the second embodiment, play data generation and face authentication may be performed based on the same captured image. Therefore, there is no need to attach a dedicated sensor for measurement to the tool or the user himself/herself, and the trouble of pairing the sensor and the measurement terminal 400 can be saved. In addition, even if the user does not enter the user ID into the measurement terminal 400 to log in, the user ID of the user to be measured can be specified and the play data and the user ID can be easily linked.
  • the generation unit 244 may detect the tool used by the user from the captured image by image recognition, and specify the tool ID or manufacturer ID of the used tool. The generation unit 244 may then supply the recording control unit 245 with information on the measurement conditions including the date and time of measurement, the measurement location, and the tool ID or manufacturer ID of the tool used.
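A rough sketch of how the generation unit 244 might turn image-derived physical quantities into play data is shown below; the quantity names, the feature calculations, and the carry estimate are all crude stand-ins for illustration and do not reflect the actual analysis.

```python
from typing import Dict, Optional

def generate_play_data(physical_quantities: Dict[str, float],
                       detected_tool_id: Optional[str] = None) -> dict:
    """Illustrative sketch of the generation unit 244 (assumed quantity names)."""
    play_data = {}
    # Feature amount of the address posture / swing trajectory (illustrative)
    if "grip_position" in physical_quantities and "center_of_gravity" in physical_quantities:
        play_data["address_feature"] = (
            physical_quantities["grip_position"] - physical_quantities["center_of_gravity"]
        )
    # Feature amount of the hit ball: direction and a crude carry estimate
    if "impact_point" in physical_quantities and "face_direction" in physical_quantities:
        play_data["ball_feature"] = {
            "direction": physical_quantities["face_direction"],
            "carry": physical_quantities.get("ball_speed", 0.0) * 4.0,  # crude stand-in
        }
    # Measurement condition accompanying the play data (tool detected by image recognition)
    if detected_tool_id is not None:
        play_data["tool_id"] = detected_tool_id
    return play_data
```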
  • the recording control unit 245 is an example of the recording control unit 15 described above.
  • the recording control unit 245 records the generated play data as a play history in the play history DB 213 in association with the user ID specified by the specifying unit 243 . Also, the recording control unit 245 may record the information of the measurement conditions in the play history DB 213 in association with the user ID.
  • the selection unit 246 is an example of the selection unit 16 described above. First, the selection unit 246 uses the location table 215 to determine whether or not the location visited by the user, that is, the location where the measuring terminal 400 that requested the recording is located, is the specified output location. If the point of the measurement terminal 400 that requested the recording is the specified output point, the selection unit 246 extracts the play data associated with the user ID in the play history DB 213 . At this time, the selection unit 246 may extract the play data for the most recent predetermined period from the play data associated with the user ID in the play history DB 213, and may not extract the rest.
  • the selection unit 246 may also extract, from the play data associated with the user ID in the play history DB 213, only the play data whose measurement conditions include a predetermined tool ID, tool attribute, or manufacturer ID of the used tool, and need not extract the rest.
  • the selection unit 246 selects, as recommended information, the equipment-related information of equipment recommended to the user from among the equipment-related information of the various pieces of equipment registered in the equipment DB 214.
  • the selection unit 246 analyzes the user's current skill level, changes in the skill level, and the characteristics of the user's actions based on the extracted play data. Then, the selection unit 246 selects, from among the tool IDs stored in the tool DB 214, a tool ID having a tool attribute that matches the skill level, transition of the skill level, or characteristics of the action. The selection unit 246 then identifies the tool-related information associated with the selected tool ID as recommended information.
  • the selection unit 246 may select recommended information based on user information associated with the user ID in the user DB 212 in addition to the extracted play data.
  • as an example, the selection unit 246 may estimate the user's preferred manufacturer, price range, or design tendency from the user's attribute information, the degree of sports experience, or the information on the equipment currently owned, and may select an equipment ID based on that preference tendency, thereby selecting the recommended information.
  • the selection unit 246 may also select recommended information based on the usage history associated with the user ID in the user DB 212 in addition to the extracted play data. For example, the selection unit 246 may select the recommended information based on the user's equipment purchase history included in the usage history. As an example, the selection unit 246 may estimate the user's preferred manufacturer, price range, or design tendency from the equipment purchase history, and may select an equipment ID based on that preference tendency, thereby selecting the recommended information.
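The selection flow described above could look roughly like the following sketch, which filters the play history by a recent period and an optional maker, estimates a skill level, and matches equipment attributes while honoring an estimated maker preference; the thresholds and scoring are assumptions.

```python
from datetime import datetime, timedelta

def select_recommendation(play_history, equipment_db, user_info=None,
                          recent_days=90, maker_filter=None):
    """Sketch of the selection unit 246's flow; thresholds and scoring are assumed.

    play_history: list of dicts with 'measured_at' (datetime), 'maker_id', 'carry'
    equipment_db: list of dicts with 'tool_id', 'maker_id', 'tool_attrs', 'tool_info'
    """
    # 1. Extract only part of the play history (most recent period, optional maker filter).
    cutoff = datetime.now() - timedelta(days=recent_days)
    history = [p for p in play_history
               if p["measured_at"] >= cutoff
               and (maker_filter is None or p["maker_id"] == maker_filter)]
    if not history:
        return None

    # 2. Estimate the user's skill level from the extracted play data
    #    (here, naively, from the average carry distance).
    avg_carry = sum(p["carry"] for p in history) / len(history)
    skill = "advanced" if avg_carry > 230 else "intermediate" if avg_carry > 180 else "beginner"

    # 3. Pick equipment whose attributes match the skill level, preferring
    #    the maker estimated from the user information, if any.
    preferred_maker = (user_info or {}).get("preferred_maker")
    candidates = [e for e in equipment_db if e["tool_attrs"].get("level") == skill]
    candidates.sort(key=lambda e: e["maker_id"] == preferred_maker, reverse=True)
    return candidates[0]["tool_info"] if candidates else None
```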
  • the output control unit 247 is an example of the output control unit 17 described above.
  • the output control unit 247 transmits the recommended information selected by the selection unit 246 to the measurement terminal 400 at the designated output point that requested the recording, and causes the information to be output.
  • the collected data providing unit 248 is also called collected data providing means. Collected data providing unit 248 transmits the data accumulated in play history DB 213 to maker terminal 500 .
  • the timing of transmission may be periodically, when a predetermined amount of data is accumulated, or each time data is recorded.
  • the type of data to be transmitted may be all or any of the user ID, play data, and measurement conditions accumulated in the play history DB 213 .
  • the user ID may be omitted from the viewpoint of privacy protection, or user attribute information may be used instead of the user ID.
  • the data to be transmitted may be related to the manufacturer to which it is transmitted.
  • the data related to the manufacturer of the transmission destination may be data contained in records, among the records accumulated in the play history DB 213, in which the maker ID of the used equipment contained in the measurement conditions matches the maker ID of the transmission destination.
  • the data to be transmitted may also include data relating to competitors of the manufacturer to which it is transmitted.
  • data related to a competitor may be data contained in records accumulated in the play history DB 213 in which the maker ID of the used equipment contained in the measurement conditions matches the maker ID of the competitor. This allows the destination manufacturer to easily collect usage data of competing products.
  • FIG. 11 is a sequence diagram showing an example of the flow of user registration processing according to the second embodiment.
  • the user terminal 300 takes an image of the user U (S500), and transmits a user registration request including a registration image generated by the image taking to the server 200 (S501).
  • the registration unit 241 of the server 200 includes the registration image included in the received user registration request in the face registration request and transmits the face registration request to the face authentication device 100 (S502).
  • the face authentication device 100 registers face information (face feature information) of the user U based on the registration image included in the received face registration request (S503).
  • the face authentication device 100 notifies the server 200 of the issued user ID (S504).
  • the user terminal 300 accepts input of user information from the user and transmits the user information to the server 200 (S505).
  • the user information transmitted here includes user attribute information and information on golf clubs owned by the user.
  • the registration unit 241 of the server 200 associates the notified user ID and user information with each other and registers them in the user DB 212 (S506).
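Seen from the server side, the FIG. 11 registration flow reduces to something like the sketch below; face_auth_device.register_face and the dict-like user_db are hypothetical stand-ins for the face authentication device 100 and the user DB 212.

```python
def register_user(registration_image, user_information, face_auth_device, user_db):
    """Sketch of the server-side part of the FIG. 11 flow (S502, S504, S506)."""
    # S502-S504: forward the registration image and receive the issued user ID.
    user_id = face_auth_device.register_face(registration_image)
    # S506: associate the notified user ID with the user information in the user DB.
    user_db[user_id] = {"user_info": user_information, "usage_history": []}
    return user_id
```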
  • FIG. 12 is a sequence diagram showing an example of the flow of information processing when the server 200 according to the second embodiment receives measurement data from the measurement terminal 400 other than the designated output point.
  • the measurement terminal 400 positioned at a point other than the designated output point takes a picture of the visiting user (S520).
  • the measurement terminal 400 transmits a recording request to the server 200 (S521).
  • the recording request includes the photographed image generated by photographing, the point ID of the point where the measuring terminal 400 is located, and the measurement data. It is also assumed that the measurement data is generated from the captured image by the measurement terminal 400 .
  • the image acquisition unit 242 of the server 200 acquires the captured image of the user and the measurement data.
  • the specifying unit 243 of the server 200 transmits a face authentication request for the face area of the user U in the captured image to the face authentication device 100 (S522).
  • the face authentication device 100 performs face authentication on the face area of the user U in the captured image included in the received face authentication request (S523).
  • the face authentication device 100 transmits to the server 200 a face authentication result including the success of the face authentication and the user ID (S524).
  • the identifying unit 243 of the server 200 identifies the user based on the user ID included in the face authentication result.
  • the generation unit 244 of the server 200 generates play data based on the measurement data (S525).
  • the generation unit 244 may detect the tool used by the user from the captured image, and include the tool ID or manufacturer ID of the detected used tool in the measurement conditions.
  • the recording control unit 245 of the server 200 records the play data and the measurement condition information in the play history DB 213 in association with the identified user ID (S526).
  • the selection unit 246 refers to the point table 215 and determines whether or not the point ID of the measurement terminal 400 that requested the recording is the specified output point (S527).
  • in the case of FIG. 12, the point visited by the user is not a designated output point, so the server 200 ends the process.
  • FIG. 13 is a sequence diagram showing an example of the flow of information processing when the server 200 according to the second embodiment receives measurement data from the measurement terminal 400 at the designated output point.
  • S520 to S527 in FIG. 13 are the same as in FIG. 12.
  • in the case of FIG. 13, the point visited by the user is determined in S527 to be a designated output point, so the process proceeds to S528.
  • the selection unit 246 of the server 200 extracts the user information and usage history associated with the user ID from the user DB 212, and extracts play data (that is, play history) associated with the user ID from the play history DB 213. (S528). Then, the selection unit 246 analyzes the user information, the usage history, and the play history, and estimates the user's current skill level, changes in the skill level, motion characteristics, and user's preference trends. The selection unit 246 selects a recommended tool based on the analysis result and the tool attribute of each tool in the tool DB 214, and selects the tool-related information of the tool (S529). The selected equipment-related information becomes recommended information.
  • the output control unit 247 of the server 200 transmits the recommended information to the measurement terminal 400 that requested the recording (S530).
  • the measurement terminal 400 that has received the recommended information displays the recommended information (S531).
  • when the server 200 receives measurement data from the measurement terminal 400 at a designated output point, the server 200 may select and output recommended information based on the play history accumulated in the past before recording the play data based on the received measurement data. In this case, the processing shown in S525-S526 may be performed in parallel with S527-S530, or at any timing after S527 is performed.
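Combining FIG. 12 and FIG. 13, the server-side handling of a record request could be sketched as follows; the `server` object and its attribute names are hypothetical stand-ins for the collaborators described above (face authentication, play history DB 213, point table 215, selection unit 246).

```python
def handle_record_request(server, point_id, captured_image, measurement_data):
    """Sketch of the FIG. 12 / FIG. 13 server-side flow (assumed collaborators)."""
    # S522-S524: face authentication on the captured image.
    ok, user_id = server.face_auth.authenticate(captured_image)
    if not ok:
        return None
    # S525-S526: generate play data and record it with the identified user ID.
    play_data = server.generate_play_data(measurement_data)
    server.play_history_db.record(user_id, play_data, point_id)
    # S527: consult the point table.
    if not server.point_table.is_designated_output(point_id):
        return None                        # FIG. 12: end of processing
    # S528-S530: FIG. 13 only - select and output recommended information.
    recommendation = server.select_recommendation(user_id)
    server.send_to_terminal(point_id, recommendation)
    return recommendation
```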
  • FIG. 14 is a diagram showing an example of display on the measurement terminal 400 according to the second embodiment.
  • the display unit 440 of the measuring terminal 400 displays product information of "an iron made by XX company" corresponding to the tool ID selected as the tool suitable for the skill level of the user.
  • the display unit 440 of the measuring terminal 400 may be configured to display an input area of "try hitting", and the user can apply for a trial hitting of the tool using the input area.
  • the selection process and the output process are executed each time it is determined that the point visited by the user is the specified output point, but the execution timing of the selection process and the output process is not limited to this.
  • the execution timing of the selection process and the output process may be when the point visited by the user is the designated output point and a predetermined condition is satisfied.
  • the predetermined condition may be that the user has not purchased the same type of tool within the most recent predetermined period. This is because, for example, if the user recently purchased a driver, even if the server 200 causes the measurement terminal 400 to output the driver's product information and recommend a driver, the effect of the recommendation is considered to be small.
  • the predetermined condition may be that the tool in use and the user are not compatible with each other. Poor compatibility may mean that the value indicating compatibility is less than a predetermined threshold value, and good compatibility may mean that the value indicating compatibility is equal to or greater than a predetermined threshold value.
  • for example, the selection unit 246 may analyze the compatibility between the equipment in use and the user based on the play data 2132 and the equipment attributes of the equipment ID in use by the user, and may perform the selection processing and the output processing only when the equipment in use and the user are not a good fit.
  • in the above description, the server 200 transmits the equipment-related information of the recommended equipment to the measurement terminal 400 that requested the recording; however, when the predetermined condition is not satisfied, the server 200 need not transmit equipment-related information of recommended equipment to the measurement terminal 400. For example, if the equipment in use is a good fit for the user, the measurement terminal 400 may output a message such as "The equipment currently in use suits you. Keep practicing at this pace."
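The two example conditions above (no recent purchase of the same type of equipment, and poor compatibility with the equipment in use) could be combined into a single check such as the following sketch; the period and the compatibility threshold are assumed values.

```python
from datetime import datetime, timedelta

COMPATIBILITY_THRESHOLD = 0.7   # assumed "predetermined threshold value" for compatibility

def should_recommend(purchase_history, candidate_type, compatibility_score,
                     recent_days=180):
    """Sketch of the predetermined condition: recommend only when the user has not
    recently bought the same type of equipment and the equipment in use is not a
    good fit. All numeric values are illustrative assumptions."""
    cutoff = datetime.now() - timedelta(days=recent_days)
    recently_bought_same_type = any(
        p["type"] == candidate_type and p["purchased_at"] >= cutoff
        for p in purchase_history
    )
    good_compatibility = compatibility_score >= COMPATIBILITY_THRESHOLD
    return not recently_bought_same_type and not good_compatibility
```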
  • the information introduced to the user may be information about lessons, coaches, courses or practice ranges.
  • the equipment DB 214 of the server 200 may store information related to lessons, coaches, courses, or practice ranges instead of or in addition to information related to equipment.
  • the equipment DB 214 may store attributes of lessons, coaches, courses, or practice grounds and their introduction information in association with IDs of lessons, coaches, courses, or practice grounds.
  • the collected data providing unit 248 of the server 200 may transmit the collected play data to a terminal of the company that provides the lessons, the company to which the coach belongs, or the company that provides the course or practice range, instead of the maker terminal 500.
  • FIG. 15 shows an example of a display when the server 200 causes the measurement terminal 400 to output information about a coach as recommended information.
  • FIG. 15 is a diagram showing an example of display on the measurement terminal 400 according to the second embodiment.
  • the display unit 440 of the measurement terminal 400 displays information of "Professional YY in the XX practice field" as a lesson professional. Then, the user who viewed the screen shown in FIG. 15 may be able to view the displayed lesson schedule of the lesson professional and apply for a lesson reservation from this screen.
  • the selection unit 246 may select the lessons, coaches, courses, or practice ranges that serve as the basis for the recommended information based on the user's activity range. That is, the selection unit 246 may preferentially select lessons, coaches, courses, or practice ranges within or close to the user's activity range.
  • the user's action range may be estimated from attribute information such as the user's address, location information of the measuring terminal 400 that requested the recording, or the user's face authentication history (for example, location visit history).
  • the information to be introduced to the user may also be information on subscription services related to sports.
  • a subscription service allows a user to receive a target service for a predetermined period under predetermined conditions by paying a fixed usage fee.
  • Access to the subject service may be the ability to rent equipment, take lessons, or use a course or practice range at no additional charge.
  • being able to receive the target service may mean being able to reserve a course slot as a member of the golf course.
  • FIG. 16 shows an example of a display when the server 200 causes the measurement terminal 400 to output information about the subscription service as recommended information.
  • FIG. 16 is a diagram showing an example of display on the measurement terminal 400 according to the second embodiment.
  • the display unit 440 of the measuring terminal 400 displays information of "intermediate-level golf club subscription" for users having intermediate-level skills. Further, the display unit 440 of the measurement terminal 400 may be configured to display an input area of "apply", and the user can apply for the subscription service using the input area.
  • the information to be introduced to the user may be the information of the sporting goods manufacturer recommended to the user.
  • the selection unit 246 may select the sporting goods maker based on the user's current skill level and the characteristics of the user's motion analyzed based on the extracted play data.
  • the selection unit 246 may select a sporting goods maker according to the tendency of the user's preference estimated based on the user information associated with the user ID or the usage history, and may select information about that sporting goods maker as the recommended information. A rough sketch of such a maker selection follows below.
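A rough sketch of such a maker selection, combining a skill match with how often the user has used each maker's equipment in the past, might look like this. The scoring weights and the field names (`maker_id`, `target_skills`) are assumptions, not values from the disclosure.

```python
from collections import Counter

def select_maker(skill_level: str, maker_usage: list[str],
                 makers: list[dict]) -> dict | None:
    """Pick the maker whose assumed target skill band matches the user and
    whose equipment the user has tried most often (a rough preference proxy)."""
    if not makers:
        return None
    usage_counts = Counter(maker_usage)   # maker_id -> times that maker's equipment appears in the play history

    def score(maker: dict) -> float:
        skill_match = 1.0 if skill_level in maker["target_skills"] else 0.0
        preference = usage_counts.get(maker["maker_id"], 0)
        return 2.0 * skill_match + preference   # weights are arbitrary in this sketch

    return max(makers, key=score)
```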
  • FIG. 17 shows an example of a display when the server 200 causes the measuring terminal 400 to output information on recommended sporting goods makers as recommended information.
  • FIG. 17 is a diagram showing an example of display on the measurement terminal 400 according to the second embodiment.
  • the display unit 440 of the measurement terminal 400 displays information on "XX company” as a recommended manufacturer for the user.
  • the display unit 440 of the measurement terminal 400 also displays an input area for "View golf club sets from company XX", and the user may be able to use this input area to browse the details of the recommended maker's set products.
  • FIG. 18 is a flowchart showing an example of the flow of collected data provision processing according to the second embodiment.
  • the collected data providing unit 248 of the server 200 identifies, for each play history accumulated in the play history DB 213, the maker ID of the equipment used that is included in the measurement conditions, and sorts the accumulated play histories by maker (S540).
  • the collected data providing unit 248 then transmits the sorted play history data to the maker terminal of each maker (S541). A minimal sketch of this grouping follows below.
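A minimal sketch of the per-maker grouping (S540) and transmission (S541), with an injected `send` callback standing in for the actual transmission to the maker terminal, might look as follows. The record field names are assumptions.

```python
from collections import defaultdict
from typing import Callable

def provide_collected_data(play_history_db: list[dict],
                           send: Callable[[str, list[dict]], None]) -> None:
    """Group the accumulated play histories by the maker ID recorded in the
    measurement conditions, then hand each group to the transport callback."""
    by_maker: dict[str, list[dict]] = defaultdict(list)
    for record in play_history_db:
        maker_id = record["measurement_conditions"]["maker_id"]  # maker of the equipment used
        by_maker[maker_id].append(record)                        # S540: sort per maker
    for maker_id, records in by_maker.items():
        send(maker_id, records)                                  # S541: transmit to that maker's terminal
```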
  • the server 200 can collect measurement data in various scenes by linking it to the user specified by the captured image, so that the play history can be easily accumulated.
  • a user's login operation is not required for linking.
  • the server 200 can make appropriate recommendations to the user who has visited the designated output point according to the skill of the user, taking into consideration the play history collected in the past. This has the effect that the user can easily find out about sporting goods, lessons, coaches, courses, or practice ranges.
  • effective sales promotion can be expected to increase sales for sporting goods stores, lesson providers, and driving ranges.
  • sporting goods manufacturers can efficiently collect information for product development and marketing. Therefore, the cooperation of related companies can revitalize the business of the entire industry.
  • the measurement terminal 400 transmits the output data of the sensor attached to the user or the tool to the server 200 as the measurement data.
  • FIG. 19 is a block diagram showing the overall configuration of an information processing system 1000a according to the third embodiment.
  • An information processing system 1000a according to the third embodiment includes sensors 600-1, 600-2, . . . , 600-n in addition to the configuration of the information processing system 1000 according to the second embodiment.
  • Sensors 600-1, 600-2, . . . , 600-n are attached to the users visiting the points A1, A2, . . . , An, or to the equipment used by those users.
  • Sensors 600-1, 600-2, . . . , 600-n are connected to the measurement terminals 400-1, 400-2, . . . , 400-n so that they can communicate with each other.
  • the sensor 600 measures the motion of the user and transmits the resulting output data to the measurement terminal 400 as measurement data.
  • the measurement terminal 400 photographs the user visiting the spot with the camera 410 and generates a photographed image for face authentication. Then, the measurement terminal 400 transmits the photographed image for face authentication and the measurement data to the server 200 in such a manner that the correspondence can be understood.
  • the specifying unit 243 of the server 200 specifies the user ID of the user based on the photographed image for face authentication. Also, the generation unit 244 of the server 200 generates play data regarding the user's motion at the point visited by the user based on the measurement data of the sensor 600 . Then, the recording control unit 245 of the server 200 associates the identified user ID with the play data and records them in the play history DB 213 . A compact sketch of this flow follows below.
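A compact sketch of this sensor-based flow, with the face authentication and the play history storage passed in as plain Python objects, is shown below. All function and field names (`identify_user`, `head_speed`, and so on) are assumptions for illustration only.

```python
from typing import Callable

def handle_sensor_measurement(point_id: str, face_image: bytes,
                              sensor_samples: list[dict],
                              identify_user: Callable[[bytes], str],
                              play_history_db: dict[str, list[dict]]) -> None:
    """Identify the user from the face image (specifying unit 243), derive
    play data from the raw sensor output (generation unit 244), and append it
    to the play history keyed by the user ID (recording control unit 245)."""
    user_id = identify_user(face_image)   # result of face authentication
    play_data = {                         # toy aggregation of the raw samples
        "point_id": point_id,
        "swing_count": len(sensor_samples),
        "max_head_speed": max((s["head_speed"] for s in sensor_samples), default=0.0),
    }
    play_history_db.setdefault(user_id, []).append(play_data)
```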
  • FIG. 20 is a sequence diagram showing an example of the flow of information processing when the server 200 according to the third embodiment receives measurement data from the measurement terminal 400 at the designated output point.
  • the steps shown in FIG. 20 include S540 to S544 in place of S520 to S521.
  • the measurement terminals 400-1 and 400-2 of the designated output points A1 and A2 are installed at the entrances of the trial corners of the first store and the second store, respectively.
  • the measurement terminal 400 located at the designated output point takes a picture of the user when the user enters the trial hitting corner (S540). Thereby, the measurement terminal 400 generates a photographed image for face authentication.
  • the measurement terminal 400 transmits a recording start request to the server 200 (S541).
  • the recording start request includes the photographed image generated in S540 and the point ID of the point where the measuring terminal 400 is located.
  • the image acquisition unit 242 of the server 200 acquires the captured image of the user.
  • until the recording end request is received, the server 200 treats measurement data received from the measurement terminal 400 with the same point ID as measurement data linked to the user identified from the captured image.
  • the user entering the test hitting corner uses the equipment to test hit.
  • the sensor 600 attached to the user or the tool outputs measurement data to the measurement terminal 400 (S542).
  • the measurement terminal 400 then transmits the measurement data and the point ID to the server 200 (S543).
  • the image acquisition unit 242 of the server 200 acquires the measurement data associated with the captured image of the user.
  • the measurement terminal 400 transmits a recording end request including the point ID to the server 200 (S544).
  • the measurement terminal 400 may transmit the recording end request in response to receiving an input indicating the end of trial hitting from the user, or in response to the user exiting the trial hitting corner. Further, the measurement terminal 400 may transmit a recording end request for the previous user when the next user enters.
  • when the server 200 identifies the equipment used and records it in the measurement conditions, it may detect the equipment possessed by the user from the captured image acquired in S541 and identify the equipment ID or maker ID. Alternatively, the measurement terminal 400 may receive an input about the equipment used from the user or the store staff, and the server 200 may identify the equipment ID or maker ID of the equipment used from the information received by the measurement terminal 400. A minimal sketch of how measurement data can be tied to the session opened in S541 follows below.
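One way to realize the session handling of S541 to S544 is to keep a per-point mapping from the point ID to the user identified at the start of recording, as in the sketch below. The class and method names are illustrative only and are not taken from the disclosure.

```python
from typing import Callable

class RecordingSessions:
    """Bind incoming measurement data to the user photographed at the start of
    a trial-hitting session, keyed by point ID, until the end request arrives."""

    def __init__(self, identify_user: Callable[[bytes], str]):
        self._identify_user = identify_user   # callback wrapping face authentication
        self._active: dict[str, str] = {}     # point_id -> user_id of the current session

    def start(self, point_id: str, face_image: bytes) -> None:
        # S541: recording start request containing the captured image and the point ID
        self._active[point_id] = self._identify_user(face_image)

    def add_measurement(self, point_id: str, measurement: dict,
                        play_history_db: dict[str, list[dict]]) -> None:
        # S543: measurement data arriving with the same point ID is linked to
        # the user identified at the start of the session
        user_id = self._active[point_id]
        play_history_db.setdefault(user_id, []).append(measurement)

    def end(self, point_id: str) -> None:
        # S544: the recording end request closes the session for that point
        self._active.pop(point_id, None)
```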
  • the measurement data is automatically linked to the user ID specified in the photographed image for face authentication and collected, so the same effect as the second embodiment can be achieved.
  • in the above embodiments, the configuration has been described as a hardware configuration, but the present disclosure is not limited to this.
  • the present disclosure can also implement arbitrary processing by causing a processor to execute a computer program.
  • the program includes instructions (or software code) that, when read into a computer, cause the computer to perform one or more of the functions described in the embodiments.
  • the program may be stored in a non-transitory computer-readable medium or tangible storage medium.
  • computer-readable media or tangible storage media may include random-access memory (RAM), read-only memory (ROM), flash memory, solid-state drive (SSD) or other memory technology, CD-ROM, digital versatile disc (DVD), Blu-ray disc or other optical disc storage, magnetic cassette, magnetic tape, magnetic disc storage or other magnetic storage devices.
  • the program may be transmitted on a transitory computer-readable medium or communication medium.
  • transitory computer readable media or communication media include electrical, optical, acoustic, or other forms of propagated signals.
  • the computer mentioned above may be configured as a computer system including a personal computer, a word processor, or the like.
  • the computer is not limited to this, and may be configured by a LAN (local area network) server, a host of a personal computer communication service, a computer system connected to the Internet, or the like. It is also possible to distribute the functions among the devices on the network and configure the computer over the entire network.
  • the present disclosure is not limited to the above embodiments, and can be modified as appropriate without departing from the scope.
  • the measurement data may be a combination of data acquired from the captured image and sensor output data.
  • the measurement data at some points may be acquired from the captured image, and the measurement data at other points may be sensor output data.
  • the face authentication device 100 has the face authentication function, but instead of or in addition to the face authentication device 100, the server 200 may have the face authentication function.
  • Appendix 2 The information processing apparatus according to appendix 1, wherein the selection means does not select the recommended information when the point visited by the user is not the specified output point.
  • Appendix 3 The information processing device according to appendix 1 or 2, wherein the information related to the sport includes information related to equipment, lessons, coaches, courses, or practice grounds of the sport.
  • (Appendix 5) The information processing apparatus according to appendix 4, wherein the measurement data includes the captured image used to specify the user ID.
  • (Appendix 6) The information processing apparatus according to appendix 4 or 5, wherein the measurement data includes output data of a sensor attached to the user or a sensor attached to a tool used by the user.
  • (Appendix 7) The information processing apparatus, wherein the designated output point is a predetermined area of a sporting goods store.
  • (Appendix 8) The information processing apparatus, wherein the selection means selects the recommended information based on, in addition to the play history, the purchase history of the sports equipment, the user's attribute information, or the location visit history.
  • (Appendix 9) An information processing system comprising: a measurement terminal positioned at each of the specific points, capturing at least the face of a user visiting the point, and acquiring measurement data of the action of the user who visited the point playing a specific sport; and an information processing device, wherein the information processing device comprises: identifying means for identifying a user ID that identifies a user who has visited one of the specific points based on a captured image captured by the measurement terminal positioned at that point; recording control means for recording play data obtained from measurement data acquired by the measurement terminal at the point visited by the user, as a play history, in a play DB in association with the user ID; selection means for selecting, if the point visited by the user is the designated output point, recommended information from the information related to sports based on at least a part of the play history associated with the user ID and accumulated in the play DB; and output control means for outputting the recommended information to a terminal device positioned at the designated output point.
  • Appendix 10 The information processing system according to appendix 9, wherein the terminal device includes a measurement terminal positioned at the designated output point.
  • (Appendix 11) identifying means for identifying a user ID that identifies a user based on a photographed image of at least the face of a user who has visited one of the specific points; recording control means for recording play data obtained by measuring a specific sport action by the user at a point visited by the user, as a play history, in a play DB in association with the user ID; selection means for selecting, if the point visited by the user is the specified output point, recommended information from the information related to sports based on at least a part of the play history associated with the user ID and accumulated in the play DB; and output control means for outputting the recommended information to a terminal device located at the designated output point.
  • (Appendix 12) A non-transitory computer-readable medium storing a program for causing a computer to execute: an identification procedure of identifying a user ID that identifies the user based on a photographed image of at least the face of the user visiting any one of the specific points; a recording control procedure of recording play data obtained by measuring a specific sport action by the user at a point visited by the user, as a play history, in a play DB in association with the user ID; a selection procedure of selecting, if the point visited by the user is the specified output point, recommended information from the information related to sports based on at least a part of the play history associated with the user ID and accumulated in the play DB; and an output control procedure of outputting the recommended information to a terminal device located at the specified output point.

Landscapes

  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Physical Education & Sports Medicine (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

An information processing device (10) comprises: a specification unit (13) that specifies a user ID for identifying a user who visits a spot among specific spots, the user ID being specified on the basis of a captured image of at least the face of the user; a recording control unit (15) that records, in a play DB as a play history in association with the user ID, play data obtained by measuring a specific sports movement by the user at the spot that the user has visited; a selection unit (16) that selects, if the spot that the user has visited is a specific output spot, recommendation information from sports-related information on the basis of at least a portion of the play history stored in the play DB and associated with the user ID; and an output control unit (17) that causes a terminal device located at the specific output spot to output the recommendation information.

Description

Information processing device, information processing system, information processing method, and non-transitory computer-readable medium
TECHNICAL FIELD The present disclosure relates to an information processing device, an information processing system, an information processing method, and a non-transitory computer-readable medium, and in particular to an information processing device, an information processing system, an information processing method, and a non-transitory computer-readable medium that introduce sports-related information to users.
At a sporting goods store that sells sporting goods such as golf clubs, tennis rackets, and baseball bats, users try out the actual products to search for equipment that suits them. In recent years, methods have been proposed for recommending equipment that suits the user based on previously collected data. For example, Patent Literature 1 discloses a method of acquiring swing analysis data of a user from a user terminal and providing a diagnostic result including information on a recommended golf club type to a terminal such as a shop based on the swing analysis data. In Patent Literature 1, a user terminal is configured to receive measurement data from a sensor unit attached to a golf club, generate swing analysis data based on the measurement data, and transmit the swing analysis data to a server.
JP 2017-029460 A
In order to make appropriate recommendations according to the user's skill, it is required to collect, easily and in as many situations as possible, measurement data linked to the user. However, in the method described in Patent Literature 1, in order to associate the measurement data or swing analysis data with a user ID, it is necessary to pair the sensor unit with a logged-in user terminal in advance, which is time-consuming. Also, if the user terminal has not yet logged in, it is necessary to log in by entering a user ID or the like. Therefore, it is difficult to collect the user's measurement data in various situations.
In view of the above-described problems, an object of the present disclosure is to provide an information processing device, an information processing system, an information processing method, and a non-transitory computer-readable medium that can easily collect measurement data of a user in various situations and make appropriate recommendations according to the skill of the user.
An information processing device according to an aspect of the present disclosure includes:
identifying means for identifying a user ID that identifies a user based on a photographed image of at least the face of the user who has visited any one of specific points;
recording control means for recording play data, obtained by measuring a motion of a specific sport performed by the user at the point visited by the user, in a play DB as a play history in association with the user ID;
selection means for selecting, when the point visited by the user is a designated output point, recommended information from information related to the sport based on at least a part of the play history associated with the user ID and accumulated in the play DB; and
output control means for outputting the recommended information to a terminal device located at the designated output point.
An information processing system according to an aspect of the present disclosure includes:
measurement terminals each located at one of specific points, each measurement terminal photographing at least the face of a user who visits that point and acquiring measurement data of a motion of a specific sport performed by the user who visited that point; and
an information processing device,
wherein the information processing device includes:
identifying means for identifying a user ID that identifies a user who has visited any one of the specific points, based on a photographed image captured by the measurement terminal located at that point;
recording control means for recording play data, obtained from the measurement data acquired by the measurement terminal at the point visited by the user, in a play DB as a play history in association with the user ID;
selection means for selecting, when the point visited by the user is a designated output point, recommended information from information related to the sport based on at least a part of the play history associated with the user ID and accumulated in the play DB; and
output control means for outputting the recommended information to a terminal device located at the designated output point.
An information processing method according to an aspect of the present disclosure includes:
identifying means for identifying a user ID that identifies a user based on a photographed image of at least the face of the user who has visited any one of specific points;
recording control means for recording play data, obtained by measuring a motion of a specific sport performed by the user at the point visited by the user, in a play DB as a play history in association with the user ID;
selection means for selecting, when the point visited by the user is a designated output point, recommended information from information related to the sport based on at least a part of the play history associated with the user ID and accumulated in the play DB; and
output control means for outputting the recommended information to a terminal device located at the designated output point.
A non-transitory computer-readable medium according to an aspect of the present disclosure stores a program for causing a computer to execute:
an identification procedure of identifying a user ID that identifies a user based on a photographed image of at least the face of the user who has visited any one of specific points;
a recording control procedure of recording play data, obtained by measuring a motion of a specific sport performed by the user at the point visited by the user, in a play DB as a play history in association with the user ID;
a selection procedure of selecting, when the point visited by the user is a designated output point, recommended information from information related to the sport based on at least a part of the play history associated with the user ID and accumulated in the play DB; and
an output control procedure of outputting the recommended information to a terminal device located at the designated output point.
According to the present disclosure, it is possible to provide an information processing device, an information processing system, an information processing method, and a non-transitory computer-readable medium that can easily collect measurement data of a user in various situations and make appropriate recommendations according to the user's skill.
FIG. 1 is a block diagram showing the configuration of an information processing apparatus according to a first embodiment.
FIG. 2 is a flowchart showing the flow of an information processing method according to the first embodiment.
FIG. 3 is a block diagram showing the overall configuration of an information processing system according to a second embodiment.
FIG. 4 is a block diagram showing the configuration of a face authentication device according to the second embodiment.
FIG. 5 is a flowchart showing the flow of face information registration processing according to the second embodiment.
FIG. 6 is a flowchart showing the flow of face authentication processing according to the second embodiment.
FIG. 7 is a block diagram showing the configuration of a user terminal according to the second embodiment.
FIG. 8 is a block diagram showing the configuration of a measurement terminal according to the second embodiment.
FIG. 9 is a block diagram showing the configuration of a server according to the second embodiment.
FIG. 10 is a diagram showing an example of the data structure of a point table according to the second embodiment.
FIG. 11 is a sequence diagram showing an example of the flow of user registration processing according to the second embodiment.
FIG. 12 is a sequence diagram showing an example of the flow of information processing when the server according to the second embodiment receives measurement data from a measurement terminal other than a designated output point.
FIG. 13 is a sequence diagram showing an example of the flow of information processing when the server according to the second embodiment receives measurement data from a measurement terminal at a designated output point.
FIG. 14 is a diagram showing an example of display on the measurement terminal according to the second embodiment.
FIG. 15 is a diagram showing an example of display on the measurement terminal according to the second embodiment.
FIG. 16 is a diagram showing an example of display on the measurement terminal according to the second embodiment.
FIG. 17 is a diagram showing an example of display on the measurement terminal according to the second embodiment.
FIG. 18 is a flowchart showing an example of the flow of collected data provision processing according to the second embodiment.
FIG. 19 is a block diagram showing the overall configuration of an information processing system according to a third embodiment.
FIG. 20 is a sequence diagram showing the flow of information processing when the server according to the third embodiment receives measurement data from a measurement terminal at a designated output point.
Below, embodiments of the present disclosure will be described in detail with reference to the drawings. In each drawing, the same reference numerals are given to the same or corresponding elements, and redundant description will be omitted as necessary for clarity of description.
<Description of terms>
In the following description, the terms below are used as follows.
A "specific point" is a point where a camera for capturing a captured image for specifying a user is installed. The "point where the camera is installed" may include not only the point where the camera is actually installed, but also the area around the camera. The area around the camera may be a room or section in which the camera is installed, or an area or space that can be photographed by the camera. For example, the particular point may be a predetermined location of a driving range, any location on a golf course, or a predetermined location of a particular sporting goods store, such as a trial hitting corner. The arbitrary location on the golf course may be the area around the camera when the camera is fixedly installed at a predetermined location on the golf course, or the area around the camera when the camera is installed on a golf cart traveling around the golf course.
A "specific sport" is golf, tennis, soccer, baseball or any other sport.
"Equipment" is a tool used for the sport. For example, if the sport is golf, the "equipment" may be a golf club or a golf ball. Also, for example, if the sport is tennis, the "equipment" may be a tennis racket or a tennis ball. Also, for example, if the sport is baseball, the "equipment" may be a bat, a glove, or a baseball. Note that the "equipment" may be clothing or shoes.
"Sports-related information" is also called sports-related information. Sports-related information includes information about equipment, lessons, coaches, courses or driving ranges for the sport. For example, if the sport is golf, the sport-related information may be golf club product information or sale information, golf lesson video distribution information, lesson professional introduction information, or golf course or driving range introduction information.
“Outputting information (to a terminal device)” may refer to transmitting information to a terminal device and outputting the information to an output unit of the terminal device. The output format may be a display format or an audio output format. In the case of the display format, the output section of the terminal device may also be referred to as the display section, and in the case of the audio output format, the output section of the terminal device may also be referred to as the audio output section.
<Embodiment 1>
First, Embodiment 1 of the present disclosure will be described. FIG. 1 is a block diagram showing the configuration of an information processing apparatus 10 according to the first embodiment. The information processing device 10 is a computer device that introduces information related to a specific sport to the user. The information processing device 10 is connected to a network (not shown). A network may be wired or wireless. A terminal device (not shown) is connected to the network. The terminal device is located at the designated output point.
The information processing device 10 includes a specifying unit 13, a recording control unit 15, a selection unit 16, and an output control unit 17.
The specifying unit 13 is also called specifying means. The specifying unit 13 specifies a user ID for identifying a user based on a photographed image of at least the face of the user who has visited one of the plurality of specific points. For example, the specifying unit 13 identifies the user ID of the user by face authentication. The user ID may be a user identification number or a user name.
The recording control unit 15 is also called recording control means. The recording control unit 15 records the play data obtained by measuring the user's motion at the point visited by the user, as a play history, in a play DB (not shown) in association with the user ID. The motions to be measured are body motions related to the sport. As an example, if the sport is golf, they are the address posture and the swing motion, and if the sport is tennis, they are the stroke motion and the serve motion. Examples of the measurement data include the acceleration of the equipment, the angular velocity of the equipment, the positional relationship between the equipment and the body, the position of the hitting point, the direction of the face, the direction of the hit ball, the speed of the hit ball, the flight distance, the positions of various parts of the user's body, or the position of the center of gravity of the user. The measured motion may also include information about the score of the sport. The play data may be the measurement data of the user's motion itself, or may be analysis data generated by analyzing the motion based on the measurement data. The analysis data may include data regarding the user's skill level, characteristics of the motion, or similarity of the motion to other users or models.
The selection unit 16 is also called selection means. First, the selection unit 16 determines whether or not the point visited by the user is the designated output point. The designated output point is a point determined in advance so as to output the recommended information, and may be, for example, a test hitting corner of a specific sporting goods store.
Then, if the point visited by the user is the designated output point, the selection unit 16 selects at least one piece of information from the sports-related information as recommended information based on the play history associated with the user ID and accumulated in the play DB. As an example, the selection unit 16 may estimate the user's skill level based on the user's play history, and select golf club product information corresponding to the skill level as the recommended information. The play history used in the selection process may be all of the user's play history accumulated in the user DB, or may be a part of the play history. A part of the play history may be, for example, the play history of a specific period, or the play history of playing at a specific point or playing with equipment of a specific maker.
On the other hand, if the point visited by the user is not the designated output point, the selection unit 16 does not select the recommended information.
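As a concrete illustration of the selection described above, the following Python sketch estimates a skill level from past scores and picks a catalogue entry aimed at that level. The score field, the thresholds, and the catalogue structure are assumptions made only for illustration and are not part of the disclosure.

```python
def select_recommendation(point_id: str, designated_points: set[str],
                          play_history: list[dict],
                          catalogue: list[dict]) -> dict | None:
    """Return nothing unless the visited point is a designated output point;
    otherwise estimate a skill level from the play history and pick a
    catalogue item aimed at that level as the recommended information."""
    if point_id not in designated_points:
        return None                # no recommendation outside designated output points
    if not play_history:
        return None
    avg_score = sum(p["score"] for p in play_history) / len(play_history)
    # lower golf scores mean higher skill; the cut-offs are arbitrary here
    skill = "beginner" if avg_score > 100 else "intermediate" if avg_score > 85 else "advanced"
    for item in catalogue:
        if skill in item["target_skills"]:
            return item            # first matching product as the recommended information
    return None
```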
The output control unit 17 is also called output control means. The output control unit 17 causes the terminal device located at the designated output point to output the recommended information. The terminal device may include a measurement terminal located at the designated output point. For example, the output control unit 17 causes the measurement terminal located at the designated output point to output the recommended information. Alternatively, the terminal device may include a user terminal possessed by a user located at the designated output point. For example, the output control unit 17 may output the recommended information to a user terminal possessed by the user located at the designated output point.
FIG. 2 is a flowchart showing the flow of the information processing method according to the first embodiment. First, the information processing device 10 determines whether or not the user has visited a specific point (S10). If it is not determined that the user has visited a specific point (NO in S10), the process shown in S10 is repeated. On the other hand, if it is determined that the user has visited a specific point (YES in S10), the information processing device 10 acquires a photographed image of the user at the specific point (S11).
Next, the specifying unit 13 of the information processing device 10 identifies the user ID of the user based on the captured image (S12). Next, the recording control unit 15 of the information processing device 10 records the play data obtained by measuring the user's motion at the point visited by the user, as a play history, in the play DB in association with the user ID (S13). Next, the information processing device 10 determines whether or not the point visited by the user is a designated output point (S14). If the point visited by the user is not a designated output point (NO in S14), the process returns to S10, and if the point visited by the user is a designated output point (YES in S14), the process proceeds to S15.
In S15, the selection unit 16 of the information processing device 10 selects recommended information related to the sport from the sports-related information based on at least a part of the play history. Next, the output control unit 17 of the information processing device 10 causes the terminal device to output the recommended information.
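The flow of S10 to S15 can be summarized in the following sketch, in which the individual units are passed in as callables so that the example stays independent of any concrete implementation; all function names and signatures are placeholders rather than the actual implementation.

```python
def process_visit(photo: bytes, point_id: str, measurement: dict,
                  identify_user, record_play, select_recommendation,
                  output_to_terminal, designated_points: set[str]) -> None:
    """One pass through the flow of FIG. 2 for a user photographed at a point."""
    user_id = identify_user(photo)                    # S12: specify the user ID from the image
    record_play(user_id, point_id, measurement)       # S13: record play data as a play history
    if point_id not in designated_points:             # S14: not a designated output point
        return
    recommendation = select_recommendation(user_id)   # S15: choose recommended information
    if recommendation is not None:
        output_to_terminal(point_id, recommendation)  # have the terminal at the point output it
```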
As described above, according to the first embodiment, the information processing device 10 can collect measurement data in various scenes by linking it to the user specified by the captured image, so that the play history can be easily accumulated. A user's login operation is not required for linking. As a result, the information processing device 10 can make appropriate recommendations to the user who has visited the designated output point according to the skill of the user, taking into consideration the play history collected in the past.
<Embodiment 2>
Next, Embodiment 2 of the present disclosure will be described. FIG. 3 is a block diagram showing the overall configuration of an information processing system 1000 according to the second embodiment. The information processing system 1000 is a computer system that introduces information related to a specific sport to users. In the second embodiment, golf is described as the specific sport, but the type of sport is of course not limited to this. In the second embodiment, the information processing system 1000 is configured not only to make recommendations to the user according to the user's skill, but also to efficiently collect play data in order to stimulate product development by makers and other activities of related companies.
The information processing system 1000 includes a user terminal 300, measurement terminals 400-1, 400-2, . . . , 400-n located at points A1, A2, . . . , An, a face authentication device 100, an information processing device (hereinafter referred to as a server) 200, and a maker terminal 500. Each device and terminal are connected to each other via a network N. The network N is a wired or wireless communication line.
The user terminal 300 is an information terminal used by the user. The user terminal 300 transmits a user registration request to the server 200. Thereby, the user's facial feature information is registered and a user ID is issued. The user terminal 300 also transmits user information, which is information about the user, to the server 200, and causes the server 200 to register the user information in association with the user ID.
The measurement terminals 400-1, 400-2, . . . , 400-n are information terminals located at points A1, A2, . . . , An, respectively. Points A1 to An are points at which cameras for capturing images for face authentication are installed. The measurement terminals 400-1, 400-2, . . . , 400-n located at points A1, A2, . . . , An each acquire, from the camera, a photographed image for face authentication of a user who has visited that point. The measurement terminals 400-1, 400-2, . . . , 400-n also each acquire measurement data by measuring the motion of a user playing at that point. In the second embodiment, the measurement data includes the captured image itself for face authentication or data on physical quantities generated by image processing based on the captured image for face authentication, but is not limited to this and may include, for example, output data of sensors attached to the user or to the equipment. The measurement terminal 400 associates the photographed image for face authentication with the measurement data and transmits them to the server 200. When the measurement data is the captured image itself for face authentication, the measurement terminal 400 may simply transmit the captured image for face authentication to the server 200.
Here, at least one of the points A1 to An is predetermined as a designated output point. In the second embodiment, the designated output points are points A1 and A2. The designated output points A1 and A2 may be predetermined areas of pre-specified stores (a first store and a second store) that sell sporting goods. As an example, the designated output points A1 and A2 may be the trial hitting corners of the first store and the second store. Note that the points other than the designated output points A1 and A2 may include a sporting goods store different from the first store and the second store, or a predetermined area of a sports practice field or game venue. The measurement terminal 400 located at a designated output point has, in addition to the function of transmitting the photographed image for face authentication and the measurement data, a recommended information output function for outputting recommended information to the user. When receiving recommended information from the server 200, the measurement terminal 400 positioned at the designated output point displays the recommended information or outputs it by voice.
The face authentication device 100 is a computer device that stores facial feature information of a plurality of persons. In addition, the face authentication device 100 has a face authentication function that, in response to a face authentication request received from the outside, compares the face image or facial feature information included in the request with the facial feature information of each user. In the second embodiment, the face authentication device 100 registers the facial feature information of a user at the time of user registration. The face authentication device 100 then acquires, via the server 200, the photographed image of a user who visited a point from the measurement terminal 400 at that point, and performs face authentication using the face area in the photographed image. The face authentication device 100 then returns the collation result (face authentication result) to the server 200.
The server 200 is an example of the information processing device 10 described above. The server 200 is a computer device that collects measurement data of users and accumulates play data obtained from the measurement data in association with user IDs. For example, when the server 200 receives a photographed image for face authentication and measurement data from the measurement terminal 400, the server 200 requests the face authentication device 100 to perform face authentication on the face area. The server 200 associates the user ID included in the face authentication result received from the face authentication device 100 with the play data generated based on the measurement data, and registers them in a play history DB (not shown) as a play history.
In addition, when the measurement terminal 400 that has transmitted the photographed image for face authentication and the measurement data is located at the designated output point A1 or A2, the server 200 selects recommended information based on the play history associated with the identified user ID and accumulated in the play history DB. The server 200 then transmits the recommended information to the measurement terminal 400 that is the transmission source.
The server 200 also transmits the accumulated play histories of the users to the maker terminal 500.
The maker terminal 500 performs product improvement, new product development, or marketing based on the users' play histories.
FIG. 4 is a block diagram showing the configuration of the face authentication device 100 according to the second embodiment. The face authentication device 100 includes a face information DB (DataBase) 110, a face detection unit 120, a feature point extraction unit 130, a registration unit 140, and an authentication unit 150. The face information DB 110 stores a user ID 111 and facial feature information 112 of that user ID in association with each other. The facial feature information 112 is a set of feature points extracted from a face image, and is an example of face information. Note that the face authentication device 100 may delete the facial feature information 112 in the face information DB 110 in response to a request from the registered user of the facial feature information 112. Alternatively, the face authentication device 100 may delete the facial feature information 112 after a certain period of time has passed since it was registered.
The face detection unit 120 detects a face area included in a registration image for registering face information, and supplies it to the feature point extraction unit 130. The feature point extraction unit 130 extracts feature points from the face area detected by the face detection unit 120 and supplies facial feature information to the registration unit 140. The feature point extraction unit 130 also extracts feature points included in the captured image received from the server 200 and supplies facial feature information to the authentication unit 150.
The registration unit 140 newly issues a user ID 111 when registering facial feature information. The registration unit 140 associates the issued user ID 111 with the facial feature information 112 extracted from the registration image and registers them in the face information DB 110. The authentication unit 150 performs face authentication using the facial feature information 112. Specifically, the authentication unit 150 collates the facial feature information extracted from the captured image with the facial feature information 112 in the face information DB 110. The authentication unit 150 returns to the server 200 whether or not the facial feature information matches. Whether or not the facial feature information matches corresponds to the success or failure of the authentication. Note that a match of facial feature information means that the degree of matching is equal to or greater than a predetermined value.
FIG. 5 is a flowchart showing the flow of face information registration processing according to the second embodiment. First, the face authentication device 100 acquires the registration image of the user U included in the face registration request (S21). For example, the face authentication device 100 receives the face registration request via the network N from the server 200 that received the user registration request from the user terminal 300. The face authentication device 100 is not limited to this, and may receive the face registration request directly from the user terminal 300. Next, the face detection unit 120 detects a face area included in the registration image (S22). Next, the feature point extraction unit 130 extracts feature points from the face area detected in step S22, and supplies facial feature information to the registration unit 140 (S23). Finally, the registration unit 140 issues a user ID 111, associates the user ID 111 with the facial feature information 112, and registers them in the face information DB 110 (S24). Note that the face authentication device 100 may receive the facial feature information 112 from the face registration requester and register it in the face information DB 110 in association with the user ID 111.
FIG. 6 is a flowchart showing the flow of face authentication processing according to the second embodiment. First, the feature point extraction unit 130 acquires facial feature information for authentication (S31). For example, the face authentication device 100 receives a face authentication request from the server 200 via the network N, and extracts facial feature information from the captured image included in the face authentication request as in steps S21 to S23. Alternatively, the face authentication device 100 may receive the facial feature information from the server 200. Next, the authentication unit 150 collates the acquired facial feature information with the facial feature information 112 in the face information DB 110 (S32). If the facial feature information matches, that is, if the degree of matching of the facial feature information is equal to or greater than a predetermined value (Yes in S33), the authentication unit 150 identifies the user ID 111 of the user whose facial feature information matches (S34). The authentication unit 150 then returns to the server 200, as the face authentication result, the fact that the face authentication has succeeded together with the identified user ID 111 (S35). If there is no matching facial feature information (No in S33), the authentication unit 150 returns to the server 200, as the face authentication result, the fact that the face authentication has failed (S36).
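A minimal sketch of the collation in S32 to S36 might look as follows, using cosine similarity over feature vectors as a stand-in for the actual matching score. The threshold value and the feature representation are assumptions and are not taken from the disclosure.

```python
import math

MATCH_THRESHOLD = 0.8   # assumed "predetermined value" for the degree of matching

def cosine_similarity(a: list[float], b: list[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def authenticate(query_features: list[float],
                 face_info_db: dict[str, list[float]]) -> tuple[bool, str | None]:
    """Collate the extracted facial features against every registered entry
    (S32) and report success with the matching user ID (S34, S35) or failure
    (S36)."""
    best_id, best_score = None, 0.0
    for user_id, registered in face_info_db.items():
        score = cosine_similarity(query_features, registered)
        if score > best_score:
            best_id, best_score = user_id, score
    if best_score >= MATCH_THRESHOLD:   # S33: degree of matching >= predetermined value
        return True, best_id
    return False, None
```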
FIG. 7 is a block diagram showing the configuration of the user terminal 300 according to the second embodiment. User terminal 300 includes camera 310 , storage unit 320 , communication unit 330 , display unit 340 , input unit 350 , and control unit 360 .
The camera 310 is an imaging device that performs imaging under the control of the control unit 360 . The storage unit 320 is a storage device that stores programs for realizing each function of the user terminal 300 . A communication unit 330 is a communication interface with the network N. FIG. The display unit 340 is a display device. The input unit 350 is an input device that receives input from the user. The display unit 340 and the input unit 350 may be configured integrally like a touch panel. The control unit 360 controls hardware of the user terminal 300 .
 図8は、実施形態2にかかる計測端末400の構成を示すブロック図である。計測端末400は、カメラ410と、記憶部420と、通信部430と、表示部440と、入力部450と、制御部460とを備える。
 カメラ410は、制御部460の制御に応じて撮影を行う撮影装置である。記憶部420は、計測端末400の各機能を実現するためのプログラムが格納される記憶装置である。通信部430は、ネットワークNとの通信インタフェースである。表示部440は、表示装置である。入力部450は、入力を受け付ける入力装置である。表示部440及び入力部450は、タッチパネルのように一体的に構成されていてもよい。制御部460は、計測端末400が有するハードウェアの制御を行う。尚、指定出力地点A1,A2以外の地点に位置する計測端末400については、表示部440及び入力部450は省略されてもよい。
FIG. 8 is a block diagram showing the configuration of the measurement terminal 400 according to the second embodiment. The measurement terminal 400 includes a camera 410 , a storage section 420 , a communication section 430 , a display section 440 , an input section 450 and a control section 460 .
The camera 410 is an imaging device that performs imaging under the control of the control unit 460 . The storage unit 420 is a storage device that stores programs for realizing each function of the measurement terminal 400 . A communication unit 430 is a communication interface with the network N. FIG. Display unit 440 is a display device. Input unit 450 is an input device that receives an input. The display unit 440 and the input unit 450 may be configured integrally like a touch panel. The control unit 460 controls hardware included in the measurement terminal 400 . Note that the display unit 440 and the input unit 450 may be omitted for the measurement terminals 400 located at points other than the designated output points A1 and A2.
 図9は、実施形態2にかかるサーバ200の構成を示すブロック図である。本図は、説明のため、スポーツに関連する情報として、ゴルフの用具に関連する用具関連情報を紹介する場合のサーバ200の構成を示しているが、これに限らない。 FIG. 9 is a block diagram showing the configuration of the server 200 according to the second embodiment. For the sake of explanation, this diagram shows the configuration of the server 200 when introducing equipment-related information related to golf equipment as information related to sports, but the present invention is not limited to this.
 サーバ200は、記憶部210と、メモリ220と、通信部230と、制御部240とを備える。記憶部210は、ハードディスク、フラッシュメモリ等の記憶装置である。記憶部210は、プログラム211と、ユーザDB212と、プレー履歴DB213と、用具DB214と、地点テーブル215とを記憶する。プログラム211は、本実施形態2にかかる情報処理方法の処理が実装されたコンピュータプログラムである。 The server 200 includes a storage unit 210, a memory 220, a communication unit 230, and a control unit 240. The storage unit 210 is a storage device such as a hard disk or flash memory. Storage unit 210 stores program 211 , user DB 212 , play history DB 213 , equipment DB 214 , and location table 215 . The program 211 is a computer program in which the processing of the information processing method according to the second embodiment is implemented.
 ユーザDB212は、ユーザに関連する情報を記憶する。具体的には、ユーザDB212は、ユーザID2121に対応付けて、ユーザ情報2122及び利用履歴2123を記憶する。ユーザID2121は、顔情報登録時に顔認証装置100により発行されるユーザIDである。ユーザ情報2122は、ユーザに関する情報であり、例えばユーザの属性情報及びユーザのスポーツ経験の程度や使用中の用具に関するアンケート情報を含んでよい。属性情報は、年齢、住所、性別、職業、身長、体重、又は足のサイズを含んでよい。またユーザ情報2122には、ユーザのクレジットカード番号等の個人情報を含んでもよい。利用履歴2123は、ユーザの各地点への入場履歴の他、所定の店舗での決済履歴及び試打履歴を含んでよい。 The user DB 212 stores information related to users. Specifically, the user DB 212 stores user information 2122 and usage history 2123 in association with the user ID 2121 . A user ID 2121 is a user ID issued by the face authentication device 100 when face information is registered. The user information 2122 is information about the user, and may include, for example, user attribute information and questionnaire information about the user's degree of sports experience and tools in use. Attribute information may include age, address, gender, occupation, height, weight, or foot size. The user information 2122 may also include personal information such as the user's credit card number. The usage history 2123 may include the user's entrance history to each location, as well as payment history and trial hitting history at a predetermined store.
 プレー履歴DB213は、ユーザのプレー履歴を記憶する。具体的には、プレー履歴DB213は、ユーザID2131に対応付けて、プレーデータ2132及び計測条件2133を記憶する。プレーデータ2132は、計測端末400から受信した計測データそのもの、又は計測データを解析することにより生成された解析データである。計測条件2133は、プレーデータの基礎となる計測データの計測条件である。計測条件2133は、計測日時、計測地点、及び計測時にユーザが使用している用具に関する情報を含んでよい。本実施形態2では、計測日時及び計測地点は、顔認証用の撮影画像の撮影日時及び撮影地点と同じである。使用用具に関する情報は、用具ID及びメーカIDを含んでよい。 The play history DB 213 stores the user's play history. Specifically, the play history DB 213 stores play data 2132 and measurement conditions 2133 in association with the user ID 2131 . The play data 2132 is the measurement data itself received from the measurement terminal 400 or analysis data generated by analyzing the measurement data. The measurement condition 2133 is the measurement condition of the measurement data that forms the basis of the play data. The measurement conditions 2133 may include information related to the date and time of measurement, the location of measurement, and the tool used by the user at the time of measurement. In the second embodiment, the date and time of measurement and the location of measurement are the same as the date and time of shooting and the location of the captured image for face authentication. The information about the tool used may include the tool ID and the manufacturer ID.
 用具DB214は、各種メーカの用具に関連する情報を記憶する。具体的には、用具DB214は、用具ID2141に対応付けて、メーカID2142、用具属性2143及び用具関連情報2144を記憶する。用具ID2141は、用具を識別する情報であり、製品型番、あるいはメーカID及び製品型番の組み合わせであってよい。メーカID2142は、その用具を製造したメーカを識別する情報であり、例えばメーカの識別番号又はメーカ名である。用具属性2143は、その用具の属性を示す情報である。例えば用具属性2143は、ドライバ、アイアン又はパター等のゴルフクラブの種類及び番手を示す情報を含んでよい。また用具属性2143は、その用具の長さ、重量又は形状の特徴に関する情報を含んでよい。また用具属性2143は、例えば初級者用、中級者用又は上級者用の別、男性用、女性用又は子供用の別、あるいは右利き用又は左利き用の別を含んでよい。 The tool DB 214 stores information related to tools from various manufacturers. Specifically, the tool DB 214 stores a maker ID 2142, tool attributes 2143, and tool related information 2144 in association with the tool ID 2141. FIG. The tool ID 2141 is information for identifying the tool, and may be a product model number or a combination of a manufacturer ID and a product model number. The maker ID 2142 is information for identifying the maker that manufactured the tool, such as the maker's identification number or maker name. The tool attribute 2143 is information indicating the attribute of the tool. For example, equipment attributes 2143 may include information indicating the type and number of golf clubs, such as drivers, irons or putters. Equipment attributes 2143 may also include information regarding the length, weight or shape characteristics of the equipment. Equipment attributes 2143 may also include, for example, beginner, intermediate, or advanced, male, female, or children, or right-handed or left-handed.
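For illustration, the record layouts of the user DB 212, the play history DB 213, and the equipment DB 214 described above could be modeled as follows. This is a minimal sketch assuming flat records; the field names follow the description, while the types and the concrete layout are assumptions.

```python
from dataclasses import dataclass, field

@dataclass
class UserRecord:                    # one row of the user DB 212
    user_id: str                     # user ID 2121 issued by the face authentication device 100
    user_info: dict                  # user information 2122 (attributes, questionnaire answers)
    usage_history: list[dict] = field(default_factory=list)  # usage history 2123

@dataclass
class PlayRecord:                    # one row of the play history DB 213
    user_id: str                     # user ID 2131
    play_data: dict                  # play data 2132 (measurement data or analysis data)
    conditions: dict                 # measurement conditions 2133 (date/time, point, equipment)

@dataclass
class EquipmentRecord:               # one row of the equipment DB 214
    equipment_id: str                # equipment ID 2141 (e.g. product model number)
    maker_id: str                    # maker ID 2142
    attributes: dict                 # equipment attributes 2143 (club type, length, skill level, ...)
    related_info: str                # equipment-related information 2144 (introduction text, etc.)
```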
The point table 215 is a table that defines, for each point, whether or not the point is a point at which recommended information is to be output, that is, whether or not the point is a designated output point. FIG. 10 is a diagram showing an example of the data structure of the point table 215 according to the second embodiment. As shown in FIG. 10, the point table 215 associates a point ID with information indicating whether or not the point is a designated output point.
Returning to FIG. 9, the description is continued. The memory 220 is a volatile storage device such as a RAM (Random Access Memory) and is a storage area for temporarily holding information while the control unit 240 operates. The communication unit 230 is a communication interface with the network N.
The control unit 240 is a processor, that is, a control device, that controls each component of the server 200. The control unit 240 loads the program 211 from the storage unit 210 into the memory 220 and executes the program 211. The control unit 240 thereby realizes the functions of a registration unit 241, an image acquisition unit 242, a specifying unit 243, a generation unit 244, a recording control unit 245, a selection unit 246, an output control unit 247, and a collected data providing unit 248.
The registration unit 241 is also referred to as registration means. When the registration unit 241 receives a registration image from the user terminal 300, it transmits a face registration request to the face authentication device 100. When the face authentication device 100 registers the face information and issues a user ID, the registration unit 241 registers that user ID in the user DB 212. When the registration unit 241 receives a user registration request from the user terminal 300, it registers the user information of the user in the user DB 212 in association with the user ID of the user who uses that user terminal 300.
The image acquisition unit 242 is also referred to as image acquisition means. When the image acquisition unit 242 receives a recording request from the measurement terminal 400 at any of the points, it acquires the captured image or the facial feature information included in the recording request. The image acquisition unit 242 supplies the captured image or the facial feature information to the specifying unit 243. The image acquisition unit 242 also supplies the measurement data included in the recording request to the generation unit 244.
The specifying unit 243 is an example of the specifying unit 13 described above. The specifying unit 243 controls face authentication of the face area of the user U included in the captured image, thereby identifying the user. That is, the specifying unit 243 causes the face authentication device 100 to perform face authentication on the captured image acquired from the measurement terminal 400. For example, the specifying unit 243 transmits a face authentication request including the acquired captured image to the face authentication device 100 via the network N. Alternatively, the specifying unit 243 may extract the face area of the user U from the captured image and include the extracted image in the face authentication request, or may extract facial feature information from the face area and include the facial feature information in the face authentication request. The specifying unit 243 then receives the face authentication result from the face authentication device 100 and thereby identifies the user ID of the user.
The generation unit 244 analyzes the measurement data included in the recording request and, based on the measurement data, generates play data concerning the user's actions at the point the user visited. As described above, in the second embodiment the measurement data includes the captured image used for face authentication (that is, the captured image used to identify the user ID) itself, or data on physical quantities such as positions and orientations generated by image processing based on that captured image. The method of generating the play data is not limited to the following, but one example is as follows. When the measurement data includes the positional relationship between the equipment and the body, the positions of the user's body parts, and the position of the center of gravity, the generation unit 244 analyzes the user's posture at address and the trajectory of the swing from these data and calculates feature amounts of the address posture and the swing trajectory. When the measurement data includes the position of the impact point and the orientation of the club face, the generation unit 244 analyzes the direction and carry distance of the hit ball from these data and calculates feature amounts of the hit ball. The generation unit 244 then generates play data including the feature amounts of the user's address posture and swing trajectory and the feature amounts of the hit ball, and supplies the play data to the recording control unit 245.
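As a rough illustration of this generation step, the sketch below turns per-frame physical quantities (club head position, center of gravity, impact point, face orientation) into the feature amounts mentioned above. The input format, the choice of features, and the ballistic approximation for carry distance are all assumptions made for the example; the patent leaves the concrete analysis method open.

```python
import math

def generate_play_data(frames: list[dict]) -> dict:
    """Derive play data (feature amounts) from measurement data.

    Each frame is assumed to hold physical quantities obtained by image
    processing, e.g. {"club_head": (x, y), "center_of_gravity": (x, y),
    "impact": bool, "face_angle_deg": float, "ball_speed_mps": float}.
    """
    # Swing trajectory feature: horizontal extent of the club head path.
    trajectory = [f["club_head"] for f in frames]
    swing_width = max(p[0] for p in trajectory) - min(p[0] for p in trajectory)

    # Address posture feature: center-of-gravity position in the first frame.
    address_cog = frames[0]["center_of_gravity"]

    # Hit-ball features taken from the impact frame, if any.
    impact = next((f for f in frames if f.get("impact")), None)
    if impact is not None:
        direction_deg = impact["face_angle_deg"]
        # Very rough no-drag projectile estimate of carry distance (assumption).
        launch_rad = math.radians(14.0)          # assumed launch angle
        v = impact["ball_speed_mps"]
        carry_m = v * v * math.sin(2 * launch_rad) / 9.81
    else:
        direction_deg, carry_m = None, None

    return {
        "address_cog": address_cog,
        "swing_width": swing_width,
        "direction_deg": direction_deg,
        "carry_m": carry_m,
    }
```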
When the measurement data includes the captured image used for face authentication, the generation unit 244 may generate the data on physical quantities from the captured image and then generate the play data from the data on physical quantities. In other words, in the second embodiment, the generation of play data and the face authentication may be performed based on the same captured image. Therefore, there is no need to attach a dedicated measurement sensor to the equipment or to the user, and the trouble of pairing a sensor with the measurement terminal 400 is eliminated. In addition, even if the user does not log in by entering a user ID into the measurement terminal 400, the user ID of the user being measured can be identified and the play data can easily be linked to that user ID.
Furthermore, the generation unit 244 may detect the equipment used by the user from the captured image by image recognition and specify the equipment ID or the maker ID of the equipment in use. The generation unit 244 may then supply the recording control unit 245 with measurement condition information including the measurement date and time, the measurement point, and the equipment ID or maker ID of the equipment in use.
The recording control unit 245 is an example of the recording control unit 15 described above. The recording control unit 245 records the generated play data, as a play history, in the play history DB 213 in association with the user ID identified by the specifying unit 243. The recording control unit 245 may also record the measurement condition information in the play history DB 213 in association with the user ID.
The selection unit 246 is an example of the selection unit 16 described above. First, the selection unit 246 uses the point table 215 to determine whether the point the user visited, that is, the point where the measurement terminal 400 that sent the recording request is located, is a designated output point. When the point of the measurement terminal 400 that sent the recording request is a designated output point, the selection unit 246 extracts the play data associated with the user ID from the play history DB 213. At this time, the selection unit 246 may extract only the play data of the most recent predetermined period from among the play data associated with the user ID in the play history DB 213, and may leave the rest unextracted. The selection unit 246 may also extract only the play data whose measurement conditions contain a predetermined equipment ID, equipment attribute, or maker ID of the equipment in use, and may leave the rest unextracted.
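A compact sketch of this first stage of the selection processing, combining the point-table lookup with the filtering of the play history, might look as follows. The point table is modeled as a simple mapping from point ID to a designated-output flag; the record shape (dicts with a "conditions" entry) and the filter parameters (length of the recent period, target maker ID) are example assumptions.

```python
from datetime import datetime, timedelta

# Point table 215 modeled as point ID -> designated output point flag (assumed IDs).
POINT_TABLE = {"A1": True, "A2": True, "A3": False}

def is_designated_output_point(point_id: str) -> bool:
    return POINT_TABLE.get(point_id, False)

def extract_play_history(play_db: list[dict], user_id: str,
                         recent_days: int = 90,
                         maker_id: str | None = None) -> list[dict]:
    """Extract the user's play data, optionally limited to the most recent
    predetermined period and to a predetermined maker of the equipment in use."""
    cutoff = datetime.now() - timedelta(days=recent_days)
    selected = []
    for rec in play_db:
        if rec["user_id"] != user_id:
            continue
        if rec["conditions"]["measured_at"] < cutoff:
            continue  # outside the most recent predetermined period
        if maker_id and rec["conditions"].get("maker_id") != maker_id:
            continue  # equipment in use does not match the predetermined maker
        selected.append(rec)
    return selected
```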
The selection unit 246 then selects, based at least on the extracted play data and the equipment attributes in the equipment DB 214, the equipment-related information of equipment to recommend to the user, as the recommended information, from among the equipment-related information of the various pieces of equipment registered in the equipment DB 214.
As one example of the selection method, the selection unit 246 analyzes the user's current skill level, the transition of the skill level, and the characteristics of the user's motion based on the extracted play data. The selection unit 246 then selects, from among the equipment IDs stored in the equipment DB 214, an equipment ID whose equipment attributes match the skill level, the transition of the skill level, or the motion characteristics, and specifies the equipment-related information associated with the selected equipment ID as the recommended information.
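The sketch below illustrates one way such a skill-matched selection could be realized. The skill-level heuristic (average carry distance thresholds) and the attribute matching rule are purely illustrative assumptions; the patent does not prescribe how skill level or motion characteristics are computed. The field names mirror the records sketched earlier but are again assumptions.

```python
def estimate_skill_level(history: list[dict]) -> str:
    """Crude skill estimate from the extracted play data (assumed heuristic)."""
    carries = [r["play_data"]["carry_m"] for r in history
               if r["play_data"].get("carry_m") is not None]
    avg = sum(carries) / len(carries) if carries else 0.0
    if avg >= 200:
        return "advanced"
    if avg >= 150:
        return "intermediate"
    return "beginner"

def select_recommendation(history: list[dict],
                          equipment_db: list[dict]) -> str | None:
    """Return the equipment-related information of equipment whose attributes
    match the user's estimated skill level."""
    level = estimate_skill_level(history)
    for item in equipment_db:
        if item["attributes"].get("skill_level") == level:
            return item["related_info"]
    return None
```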
As a first modification of the selection method, the selection unit 246 may select the recommended information based not only on the extracted play data but also on the user information associated with the user ID in the user DB 212. As one example, the selection unit 246 may estimate the maker, price range, or design the user tends to prefer from the user's attribute information, degree of sports experience, or information on the equipment the user currently owns, select an equipment ID based on the estimated preference, and thereby select the recommended information.
As a second modification of the selection method, the selection unit 246 may select the recommended information based not only on the extracted play data but also on the usage history associated with the user ID in the user DB 212. For example, the selection unit 246 may select the recommended information based on the user's equipment purchase history included in the usage history. As one example, the selection unit 246 may estimate the maker, price range, or design the user tends to prefer from the user's equipment purchase history, select an equipment ID based on the estimated preference, and thereby select the recommended information.
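One simple way to fold such preference signals into the selection is to score candidate equipment by how often its maker appears in the user's purchase history and how close its price is to the user's usual price band, as sketched below. The scoring weights and field names are assumptions for illustration only.

```python
from collections import Counter

def preference_score(item: dict, purchase_history: list[dict]) -> float:
    """Score a candidate equipment item by the user's estimated preferences."""
    maker_counts = Counter(p["maker_id"] for p in purchase_history)
    prices = [p["price"] for p in purchase_history]
    score = 0.0
    # Prefer makers the user has bought from before (assumed weight: 1.0 per purchase).
    score += maker_counts.get(item["maker_id"], 0)
    # Prefer items close to the user's usual price range (assumed weighting).
    if prices:
        avg_price = sum(prices) / len(prices)
        score += 1.0 / (1.0 + abs(item["price"] - avg_price) / max(avg_price, 1.0))
    return score

def rank_by_preference(candidates: list[dict], purchase_history: list[dict]) -> list[dict]:
    return sorted(candidates, key=lambda c: preference_score(c, purchase_history), reverse=True)
```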
The output control unit 247 is an example of the output control unit 17 described above. The output control unit 247 transmits the recommended information selected by the selection unit 246 to the measurement terminal 400 at the designated output point that sent the recording request, and causes that terminal to output the recommended information.
The collected data providing unit 248 is also referred to as collected data providing means. The collected data providing unit 248 transmits the data accumulated in the play history DB 213 to the maker terminal 500. The transmission may be performed periodically, when a predetermined amount of data has been accumulated, or every time data is recorded. The data to be transmitted may be all or any of the user IDs, the play data, and the measurement conditions accumulated in the play history DB 213. From the viewpoint of privacy protection, the user ID may be omitted, or the user's attribute information may be transmitted in place of the user ID. The transmitted data may also be limited to data related to the destination maker. The data related to the destination maker may be the data contained in those records accumulated in the play history DB 213 whose measurement conditions contain a maker ID of the equipment in use that matches the maker ID of the destination. This allows the destination maker to easily collect usage data on its own products. The transmitted data may also include data on competitors of the destination maker. The data on competitors may be the data contained in those records accumulated in the play history DB 213 whose measurement conditions contain a maker ID of the equipment in use that matches a competitor's maker ID. This allows the destination maker to easily collect usage data on competing products.
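The following sketch shows one way the collected data providing unit 248 could assemble such a per-maker export, replacing the user ID with attribute information for privacy. The grouping key and the anonymization rule follow the description above; everything else (function names, record shapes) is assumed.

```python
from collections import defaultdict

def build_maker_exports(play_db: list[dict], user_db: dict) -> dict[str, list[dict]]:
    """Group accumulated play records by the maker ID of the equipment in use
    and strip the user ID, substituting the user's attribute information."""
    exports: dict[str, list[dict]] = defaultdict(list)
    for rec in play_db:
        maker_id = rec["conditions"].get("maker_id")
        if maker_id is None:
            continue  # no equipment maker recorded for this measurement
        exports[maker_id].append({
            "attributes": user_db.get(rec["user_id"], {}).get("user_info", {}),  # instead of the user ID
            "play_data": rec["play_data"],
            "conditions": rec["conditions"],
        })
    return exports

# A maker terminal 500 for maker "M001" would then receive exports.get("M001", []).
```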
FIG. 11 is a sequence diagram showing an example of the flow of the user registration processing according to the second embodiment. First, the user terminal 300 photographs the user U (S500) and transmits a user registration request including the registration image generated by the photographing to the server 200 (S501). The registration unit 241 of the server 200 includes the registration image contained in the received user registration request in a face registration request and transmits the face registration request to the face authentication device 100 (S502). The face authentication device 100 registers the face information (facial feature information) of the user U based on the registration image included in the received face registration request (S503). The face authentication device 100 then notifies the server 200 of the issued user ID (S504). The user terminal 300 also accepts input of user information from the user and transmits the user information to the server 200 (S505). The user information transmitted here includes the user's attribute information and information on the golf clubs the user owns. The registration unit 241 of the server 200 registers the notified user ID and the user information in the user DB 212 in association with each other (S506).
Next, the flow of the information processing of the server 200 according to the second embodiment will be described. FIG. 12 is a sequence diagram showing an example of the flow of the information processing in the case where the server 200 according to the second embodiment receives measurement data from a measurement terminal 400 at a point other than a designated output point. First, the measurement terminal 400 located at a point other than a designated output point photographs the visiting user (S520). Next, the measurement terminal 400 transmits a recording request to the server 200 (S521). Here, the recording request includes the captured image generated by the photographing, the point ID of the point where the measurement terminal 400 is located, and the measurement data, and the measurement data is assumed to have been generated from the captured image by the measurement terminal 400. The image acquisition unit 242 of the server 200 thereby acquires the captured image of the user and the measurement data. Next, the specifying unit 243 of the server 200 transmits a face authentication request for the face area of the user U in the captured image to the face authentication device 100 (S522). The face authentication device 100 performs face authentication on the face area of the user U in the captured image included in the received face authentication request (S523). Here, it is assumed that there is a user ID for which the face authentication has succeeded. The face authentication device 100 transmits to the server 200 a face authentication result indicating that the face authentication has succeeded and including the user ID (S524). The specifying unit 243 of the server 200 identifies the user from the user ID included in the face authentication result.
Next, the generation unit 244 of the server 200 generates play data based on the measurement data (S525). In addition, the generation unit 244 may detect the equipment used by the user from the captured image and include the equipment ID or the maker ID of the detected equipment in the measurement conditions. The recording control unit 245 of the server 200 then records the play data and the measurement condition information in the play history DB 213 in association with the identified user ID (S526). Next, the selection unit 246 refers to the point table 215 and determines whether the point ID of the measurement terminal 400 that sent the recording request is a designated output point (S527). Here, since the point ID of the measurement terminal 400 that sent the recording request is the point ID of a point other than a designated output point, the server 200 ends the processing.
Next, FIG. 13 is a sequence diagram showing an example of the flow of the information processing in the case where the server 200 according to the second embodiment receives measurement data from a measurement terminal 400 at a designated output point. S520 to S527 in FIG. 13 are the same as in FIG. 12. Here, since the point ID of the measurement terminal 400 that sent the recording request is the point ID of a designated output point, the processing proceeds to S528.
In S528, the selection unit 246 of the server 200 extracts the user information and the usage history associated with the user ID from the user DB 212 and extracts the play data (that is, the play history) associated with the user ID from the play history DB 213 (S528). The selection unit 246 then analyzes the user information, the usage history, and the play history and estimates the user's current skill level, the transition of the skill level, the characteristics of the user's motion, and the user's preference tendencies. Based on the analysis results and the equipment attributes of each piece of equipment in the equipment DB 214, the selection unit 246 selects the equipment to be recommended and selects the equipment-related information of that equipment (S529). The selected equipment-related information becomes the recommended information.
The output control unit 247 of the server 200 transmits the recommended information to the measurement terminal 400 that sent the recording request (S530). The measurement terminal 400 that has received the recommended information displays the recommended information (S531).
Note that, in the case where the server 200 receives measurement data from a measurement terminal 400 at a designated output point, the server 200 may select and output the recommended information based on the previously accumulated play history before recording the play data based on the received measurement data. In this case, the processing shown in S525 to S526 may be performed in parallel with S527 to S530 or at any timing after S527 is executed.
FIG. 14 is a diagram showing an example of a display on the measurement terminal 400 according to the second embodiment. For example, the display unit 440 of the measurement terminal 400 displays product information of an "iron made by company XX" corresponding to the equipment ID selected as equipment suited to the user's skill level. The display unit 440 of the measurement terminal 400 may also display an input area labeled "Try hitting" so that the user can apply for a trial hitting of the equipment using that input area.
In the above description, the selection processing and the output processing are executed every time the point visited by the user is determined to be a designated output point, but the execution timing of the selection processing and the output processing is not limited to this. For example, the selection processing and the output processing may be executed when the point visited by the user is a designated output point and a predetermined condition is also satisfied.
As one example, the predetermined condition may be that the user has not purchased equipment of the same type within the most recent predetermined period. This is because, for example, if the user has recently purchased a driver, the recommendation is expected to have little effect even if the server 200 causes the measurement terminal 400 to output product information on drivers. On the other hand, if the user has been using a beginner's driver for a long time and there is no purchase record, introducing a new driver is considered effective.
As another example, the predetermined condition may be that the compatibility between the equipment in use and the user is not good. Poor compatibility may mean that a value indicating the compatibility is less than a predetermined threshold, and good compatibility may mean that the value indicating the compatibility is equal to or greater than the predetermined threshold. For example, the selection unit 246 may analyze the compatibility between the equipment in use and the user based on the play data 2132 and the equipment attributes of the equipment ID of the equipment the user is using, and may execute the selection processing and the output processing when the compatibility between the equipment in use and the user is not good. In other words, when the compatibility between the equipment in use and the user is not good, the server 200 transmits the equipment-related information of the recommended equipment to the measurement terminal 400 that sent the recording request, and when the compatibility is good, the server 200 does not need to transmit the equipment-related information of the recommended equipment to that measurement terminal 400. For example, when the equipment in use suits the user well, the server 200 may cause the measurement terminal 400 to output a message such as "Your current equipment suits you well. Keep practicing as you are."
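A minimal sketch of this gating logic is shown below: the recommendation is produced only when the visited point is a designated output point and either the user has not bought the same type of equipment recently or the compatibility score of the equipment in use falls below a threshold. The compatibility scoring itself, the 90-day window, and the function names are assumptions.

```python
from datetime import datetime, timedelta

COMPAT_THRESHOLD = 0.6  # assumed "predetermined threshold" for compatibility

def recently_bought_same_type(purchase_history: list[dict], club_type: str,
                              days: int = 90) -> bool:
    cutoff = datetime.now() - timedelta(days=days)
    return any(p["club_type"] == club_type and p["bought_at"] >= cutoff
               for p in purchase_history)

def should_output_recommendation(point_id: str, purchase_history: list[dict],
                                 club_type: str, compatibility: float) -> bool:
    """Decide whether the selection and output processing should run."""
    if not is_designated_output_point(point_id):   # see the earlier point-table sketch
        return False
    if recently_bought_same_type(purchase_history, club_type):
        return False   # a fresh recommendation for the same club type has little effect
    return compatibility < COMPAT_THRESHOLD        # recommend only when compatibility is poor
```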
The above description explains a configuration in which equipment-related information on golf equipment is introduced to the user. However, the information introduced to the user may instead be information on lessons, coaches, courses, or practice ranges. In this case, the equipment DB 214 of the server 200 may store information related to lessons, coaches, courses, or practice ranges instead of or in addition to the information related to equipment. For example, the equipment DB 214 may store, in association with the ID of a lesson, coach, course, or practice range, the attributes of that lesson, coach, course, or practice range and its introduction information. In this case, the collected data providing unit 248 of the server 200 may transmit the collected play data to the terminal of the company that provides the lessons, the company to which the coach belongs, or the company that provides the course or practice range, instead of the maker terminal 500.
For example, FIG. 15 shows an example of a display in the case where the server 200 causes the measurement terminal 400 to output information on a coach as the recommended information. FIG. 15 is a diagram showing an example of a display on the measurement terminal 400 according to the second embodiment. The display unit 440 of the measurement terminal 400 displays information on "Pro YY of the XX practice range" as a teaching professional. The user who views the screen shown in FIG. 15 may be able to view the lesson schedule of the displayed teaching professional and apply for a lesson reservation from this screen.
When information related to lessons, coaches, courses, or practice ranges is introduced to the user, the selection unit 246 may select the lesson, coach, course, or practice range that serves as the basis of the recommended information based on the user's activity range. That is, the selection unit 246 may preferentially select a lesson, coach, course, or practice range that is within or close to the user's activity range. The user's activity range may be estimated from attribute information such as the user's address, the position information of the measurement terminal 400 that sent the recording request, or the user's face authentication history (for example, the user's history of visits to the points).
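For illustration, the activity-range preference could be realized by estimating a center from the user's visit history and ranking candidate practice ranges by distance from that center, as sketched below. The centroid estimate and the haversine distance are assumptions, since the patent only states that the activity range is estimated from the address, the terminal position, or the visit history.

```python
import math

def haversine_km(a: tuple[float, float], b: tuple[float, float]) -> float:
    """Great-circle distance in kilometers between two (lat, lon) points."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
    h = (math.sin((lat2 - lat1) / 2) ** 2 +
         math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371.0 * math.asin(math.sqrt(h))

def estimate_activity_center(visit_points: list[tuple[float, float]]) -> tuple[float, float]:
    """Centroid of the points appearing in the user's visit history (assumed estimate)."""
    lat = sum(p[0] for p in visit_points) / len(visit_points)
    lon = sum(p[1] for p in visit_points) / len(visit_points)
    return lat, lon

def rank_practice_ranges(candidates: list[dict],
                         visit_points: list[tuple[float, float]]) -> list[dict]:
    """Prefer practice ranges closer to the user's estimated activity range."""
    center = estimate_activity_center(visit_points)
    return sorted(candidates, key=lambda c: haversine_km(center, c["location"]))
```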
The information introduced to the user may also be information on a sports-related subscription service. A subscription service is a service in which, by paying a fixed usage fee, the user can receive the target service under predetermined conditions during a predetermined period. Receiving the target service may mean being able to rent equipment, take lessons, or use a course or practice range at no additional charge. Receiving the target service may also mean being able to reserve a course play slot as a member of a golf course.
FIG. 16 shows an example of a display in the case where the server 200 causes the measurement terminal 400 to output information on a subscription service as the recommended information. FIG. 16 is a diagram showing an example of a display on the measurement terminal 400 according to the second embodiment. The display unit 440 of the measurement terminal 400 displays information on a "subscription for intermediate-level golf clubs" to a user having intermediate-level skills. The display unit 440 of the measurement terminal 400 may also display an input area labeled "Apply" so that the user can apply for the subscription service using that input area.
The information introduced to the user may also be information on a sporting goods maker recommended for the user. For example, the selection unit 246 may select a sporting goods maker based on the user's current skill level and the characteristics of the user's motion analyzed from the extracted play data. As another example, the selection unit 246 may select a sporting goods maker according to the user's preference tendencies estimated from the user information or the usage history associated with the user ID. The selection unit 246 may then select the information on the sporting goods maker as the recommended information.
FIG. 17 shows an example of a display in the case where the server 200 causes the measurement terminal 400 to output information on a recommended sporting goods maker as the recommended information. FIG. 17 is a diagram showing an example of a display on the measurement terminal 400 according to the second embodiment. The display unit 440 of the measurement terminal 400 displays information on "company XX" as a recommended maker for the user. The display unit 440 of the measurement terminal 400 may also display an input area labeled "View golf club sets from company XX" so that the user can browse the details of the recommended maker's club sets using that input area.
FIG. 18 is a flowchart showing an example of the flow of the collected data provision processing according to the second embodiment. For each play history accumulated in the play history DB 213, the collected data providing unit 248 of the server 200 identifies the maker ID of the equipment in use included in the measurement conditions and sorts the play histories accumulated in the play history DB 213 by maker (S540). The collected data providing unit 248 then transmits the sorted play history data to the maker terminal of each maker (S541).
As described above, according to the second embodiment, the server 200 can collect measurement data from various situations while linking it to the user identified from the captured image, and can therefore easily accumulate play histories. No login operation by the user is required for this linking. The server 200 can thereby make appropriate recommendations, suited to the user's skill and taking the previously collected play history into account, to a user who visits a designated output point. For the user, this has the effect that suitable sporting goods, lessons, coaches, courses, or practice ranges can easily be found. For sporting goods stores, lesson providers, and practice ranges, improved sales through effective sales promotion can be expected. For sporting goods makers, information for product development and marketing can be collected efficiently. The cooperation of related companies can therefore revitalize the business of the entire industry.
<Embodiment 3>
Next, Embodiment 3 of the present disclosure will be described. In Embodiment 3, the measurement terminal 400 transmits, as the measurement data, output data of a sensor attached to the user or to the equipment to the server 200.
FIG. 19 is a block diagram showing the overall configuration of an information processing system 1000a according to the third embodiment. The information processing system 1000a according to the third embodiment includes sensors 600-1, 600-2, ..., 600-n in addition to the configuration of the information processing system 1000 according to the second embodiment. The sensors 600-1, 600-2, ..., 600-n are attached to the users who visit the points A1, A2, ..., An, respectively, or to the equipment used by those users.
The sensors 600-1, 600-2, ..., 600-n are connected, by wire or by wireless communication, to the measurement terminals 400-1, 400-2, ..., 400-n located at the points A1, A2, ..., An, respectively. For example, the sensors 600-1, 600-2, ..., 600-n are connected to the measurement terminals 400-1, 400-2, ..., 400-n by short-range wireless communication such as Bluetooth (registered trademark).
The sensor 600 measures the user's motion and transmits the resulting output data to the measurement terminal 400 as measurement data. Meanwhile, the measurement terminal 400 photographs the user who visits the point with the camera 410 and generates a captured image for face authentication. The measurement terminal 400 then transmits the captured image for face authentication and the measurement data to the server 200 in a manner that makes their correspondence identifiable.
The specifying unit 243 of the server 200 identifies the user ID of the user based on the captured image for face authentication. The generation unit 244 of the server 200 generates play data concerning the user's motion at the point the user visited based on the measurement data from the sensor 600. The recording control unit 245 of the server 200 then records the identified user ID and the play data in the play history DB 213 in association with each other.
FIG. 20 is a sequence diagram showing an example of the flow of the information processing in the case where the server 200 according to the third embodiment receives measurement data from a measurement terminal 400 at a designated output point. The steps shown in FIG. 20 include S540 to S544 in place of S520 to S521 in FIG. 13. In this example, the measurement terminals 400-1 and 400-2 at the designated output points A1 and A2 are assumed to be installed at the entrances of the trial-hitting corners of a first store and a second store, respectively.
First, the measurement terminal 400 located at the designated output point photographs the user when the user enters the trial-hitting corner (S540). The measurement terminal 400 thereby generates a captured image for face authentication. Next, the measurement terminal 400 transmits a recording start request to the server 200 (S541). Here, the recording start request includes the captured image generated in S540 and the point ID of the point where the measurement terminal 400 is located. The image acquisition unit 242 of the server 200 thereby acquires the captured image of the user. Until it receives a recording end request, the server 200 treats any measurement data received from a measurement terminal 400 with the same point ID as measurement data linked to the user identified from the captured image.
Next, processing similar to S522 to S524 is executed.
Meanwhile, the user who has entered the trial-hitting corner makes trial shots using the equipment. At this time, the sensor 600 attached to the user or to the equipment outputs measurement data to the measurement terminal 400 (S542). The measurement terminal 400 transmits the measurement data and the point ID to the server 200 (S543). The image acquisition unit 242 of the server 200 thereby acquires the measurement data linked to the captured image of the user. When the user finishes the trial shots, the measurement terminal 400 transmits a recording end request including the point ID to the server 200 (S544). The measurement terminal 400 may transmit the recording end request in response to receiving an input from the user indicating the end of the trial shots, or in response to the user leaving the trial-hitting corner. The measurement terminal 400 may also transmit the recording end request for the previous user in response to the entry of the next user.
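On the server side, the pairing of sensor measurement data with the user identified at the entrance can be kept in a small session table keyed by point ID, opened by the recording start request and closed by the recording end request. The sketch below shows this idea; the class and method names are assumptions and error handling is omitted.

```python
class RecordingSessions:
    """Associate measurement data with the user identified at record start (S541-S544)."""

    def __init__(self) -> None:
        self._sessions: dict[str, dict] = {}   # point ID -> open session

    def start(self, point_id: str, user_id: str) -> None:
        # Recording start request: the captured image has already been resolved
        # to user_id via the face authentication device (S522-S524).
        self._sessions[point_id] = {"user_id": user_id, "measurements": []}

    def add_measurement(self, point_id: str, data: dict) -> None:
        # Sensor output forwarded by the measurement terminal (S542-S543).
        session = self._sessions.get(point_id)
        if session is not None:
            session["measurements"].append(data)

    def end(self, point_id: str) -> dict | None:
        # Recording end request (S544): return the user-linked measurement data
        # so play data can be generated and recorded in the play history DB 213.
        return self._sessions.pop(point_id, None)
```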
Then, the processing shown in S525 to S531 is executed.
When the server 200 identifies the equipment in use and records the measurement conditions, it may detect the equipment the user carries from the captured image acquired in S541 and identify the equipment ID or the maker ID. Alternatively, the measurement terminal 400 may accept an input about the equipment in use from the user or from store staff, and the server 200 may identify the equipment ID or the maker ID of the equipment in use by receiving the equipment information accepted by the measurement terminal 400.
The above description shows an example of the flow of the information processing in the case where the server 200 receives measurement data from a measurement terminal 400 at a designated output point. However, even when the server 200 receives measurement data from a measurement terminal 400 at a point other than a designated output point, the flow from the measurement terminal 400 photographing the user (S540) to the server 200 determining the point (S527) is the same.
As described above, according to the third embodiment, the measurement data is collected while being automatically linked to the user ID identified from the captured image for face authentication, so the same effects as in the second embodiment are obtained.
Although the above embodiments have been described in terms of hardware configurations, the present disclosure is not limited to these. The present disclosure can also realize any of the processing by causing a processor to execute a computer program.
In the above examples, the program includes instructions (or software code) that, when loaded into a computer, cause the computer to perform one or more of the functions described in the embodiments. The program may be stored in a non-transitory computer-readable medium or a tangible storage medium. By way of example and not limitation, the computer-readable medium or tangible storage medium includes a random-access memory (RAM), a read-only memory (ROM), a flash memory, a solid-state drive (SSD) or other memory technology, a CD-ROM, a digital versatile disc (DVD), a Blu-ray (registered trademark) disc or other optical disc storage, a magnetic cassette, magnetic tape, magnetic disk storage, or another magnetic storage device. The program may be transmitted on a transitory computer-readable medium or a communication medium. By way of example and not limitation, the transitory computer-readable medium or communication medium includes electrical, optical, acoustic, or other forms of propagated signals.
The above computer is configured as a computer system including a personal computer, a word processor, and the like. The computer is not limited to this, however, and may be configured by a LAN (local area network) server, a host for computer (personal computer) communication, a computer system connected to the Internet, or the like. It is also possible to distribute the functions among devices on a network and configure the computer as the network as a whole.
Note that the present disclosure is not limited to the above embodiments and can be modified as appropriate without departing from the gist of the disclosure. In the above second and third embodiments, the case where the measurement data at each point is data acquired from the captured image and the case where the measurement data at each point is the output data of a sensor have been described. However, the measurement data may be a combination of data acquired from the captured image and the output data of a sensor. In addition, the measurement data at some points may be acquired from the captured image while the measurement data at other points is the output data of a sensor.
For example, although in the above embodiments the face authentication device 100 has the face authentication function, the server 200 may have the face authentication function instead of or in addition to the face authentication device 100.
 Some or all of the above-described embodiments can also be described as in the following supplementary notes, but are not limited thereto. (An illustrative code sketch of the overall processing flow follows these notes.)
   (Appendix 1)
 An information processing apparatus comprising:
 identification means for identifying a user ID that identifies a user, based on a captured image of at least the face of the user who has visited any one of specific points;
 recording control means for recording, in a play DB in association with the user ID, play data obtained by measuring an action of a specific sport performed by the user at the point visited by the user, as a play history;
 selection means for selecting, when the point visited by the user is a designated output point, recommended information from information related to the sport, based on at least a part of the play history accumulated in the play DB in association with the user ID; and
 output control means for causing a terminal device located at the designated output point to output the recommended information.
   (Appendix 2)
 The information processing apparatus according to Appendix 1, wherein the selection means does not select the recommended information when the point visited by the user is not the designated output point.
   (Appendix 3)
 The information processing apparatus according to Appendix 1 or 2, wherein the information related to the sport includes information on equipment, lessons, coaches, courses, or practice ranges for the sport.
   (Appendix 4)
 The information processing apparatus according to any one of Appendices 1 to 3, further comprising generation means for generating play data on the action of the user at the point visited by the user, based on measurement data of the action acquired from a measurement terminal located at the point visited by the user.
   (Appendix 5)
 The information processing apparatus according to Appendix 4, wherein the measurement data includes the captured image used to identify the user ID.
   (Appendix 6)
 The information processing apparatus according to Appendix 4 or 5, wherein the measurement data includes output data of a sensor attached to the user or a sensor attached to equipment used by the user.
   (Appendix 7)
 The information processing apparatus according to any one of Appendices 1 to 6, wherein the designated output point is a predetermined area of a sporting goods store, and the specific points include a sporting goods store different from that sporting goods store, or a predetermined area of a practice range or a competition venue for the sport.
   (Appendix 8)
 The information processing apparatus according to any one of Appendices 1 to 7, wherein the selection means selects the recommended information based on, in addition to the play history, a purchase history of equipment for the sport, attribute information of the user, or a visit history of points.
   (Appendix 9)
 An information processing system comprising:
 measurement terminals each located at one of specific points, each measurement terminal capturing at least the face of a user who has visited the point and acquiring measurement data of an action of a specific sport performed by that user; and
 an information processing apparatus,
 wherein the information processing apparatus includes:
 identification means for identifying a user ID that identifies the user who has visited any one of the specific points, based on a captured image captured by the measurement terminal located at that point;
 recording control means for recording, in a play DB in association with the user ID, play data obtained from the measurement data acquired by the measurement terminal at the point visited by the user, as a play history;
 selection means for selecting, when the point visited by the user is a designated output point, recommended information from information related to the sport, based on at least a part of the play history accumulated in the play DB in association with the user ID; and
 output control means for causing a terminal device located at the designated output point to output the recommended information.
   (Appendix 10)
 The information processing system according to Appendix 9, wherein the terminal device includes the measurement terminal located at the designated output point.
   (Appendix 11)
 An information processing method comprising:
 identification means for identifying a user ID that identifies a user, based on a captured image of at least the face of the user who has visited any one of specific points;
 recording control means for recording, in a play DB in association with the user ID, play data obtained by measuring an action of a specific sport performed by the user at the point visited by the user, as a play history;
 selection means for selecting, when the point visited by the user is a designated output point, recommended information from information related to the sport, based on at least a part of the play history accumulated in the play DB in association with the user ID; and
 output control means for causing a terminal device located at the designated output point to output the recommended information.
   (Appendix 12)
 A non-transitory computer-readable medium storing a program for causing a computer to execute:
 an identification procedure of identifying a user ID that identifies a user, based on a captured image of at least the face of the user who has visited any one of specific points;
 a recording control procedure of recording, in a play DB in association with the user ID, play data obtained by measuring an action of a specific sport performed by the user at the point visited by the user, as a play history;
 a selection procedure of selecting, when the point visited by the user is a designated output point, recommended information from information related to the sport, based on at least a part of the play history accumulated in the play DB in association with the user ID; and
 an output control procedure of causing a terminal device located at the designated output point to output the recommended information.
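 To make the data flow summarized in the supplementary notes easier to follow, the sketch below chains the identification, recording control, selection, and output control steps. It is illustrative only: the class names, the PlayRecord shape, and the swing-speed-based selection rule are hypothetical and are not taken from the disclosure.

```python
from dataclasses import dataclass
from typing import Dict, List, Optional

@dataclass
class PlayRecord:
    point_id: str
    swing_speed_kmh: float   # hypothetical play data derived from measurement data

class InformationProcessingApparatus:
    """Illustrative composition of the identification, recording, selection, and output means."""

    def __init__(self, face_auth, designated_output_points: set) -> None:
        self._face_auth = face_auth                      # e.g., the FaceAuthenticator sketched earlier
        self._designated = designated_output_points
        self._play_db: Dict[str, List[PlayRecord]] = {}  # user ID -> play history

    def handle_visit(self, point_id: str, face_image: bytes,
                     play_record: PlayRecord) -> Optional[str]:
        # Identification: specify the user ID from the captured face image.
        user_id = self._face_auth.authenticate(face_image)
        if user_id is None:
            return None

        # Recording control: append the play data to the play history for this user ID.
        self._play_db.setdefault(user_id, []).append(play_record)

        # Selection: only when the visited point is a designated output point.
        if point_id in self._designated:
            return self._select_recommendation(self._play_db[user_id])
        return None

    @staticmethod
    def _select_recommendation(history: List[PlayRecord]) -> str:
        # Hypothetical rule: recommend stiffer-shaft clubs to faster swingers.
        avg_speed = sum(r.swing_speed_kmh for r in history) / len(history)
        return "stiff-shaft driver" if avg_speed >= 90 else "flexible-shaft driver"
```

 In this sketch the recommendation is simply returned; in the disclosed system the output control means would cause the terminal device located at the designated output point to output it.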
1000, 1000a Information processing system
12, 242 Image acquisition unit
13, 243 Identification unit
15, 245 Recording control unit
16, 246 Selection unit
17, 247 Output control unit
100 Face authentication device
110 Face information DB
111 User ID
112 Facial feature information
120 Face detection unit
130 Feature point extraction unit
140 Registration unit
150 Authentication unit
10, 200 Information processing apparatus (server)
210 Storage unit
211 Program
212 User DB
2121 User ID
2122 User information
2123 Usage history
213 Play history DB
2131 User ID
2132 Play data
2133 Measurement conditions
214 Equipment DB
2141 Equipment ID
2142 Manufacturer ID
2143 Equipment attributes
2144 Equipment-related information
215 Location table
220 Memory
230 Communication unit
240 Control unit
241 Registration unit
244 Generation unit
248 Collected data provision unit
300 User terminal
310 Camera
320 Storage unit
330 Communication unit
340 Display unit
350 Input unit
360 Control unit
400 Measurement terminal
410 Camera
420 Storage unit
430 Communication unit
440 Display unit
450 Input unit
460 Control unit
500 Manufacturer terminal
600 Sensor

Claims (12)

  1.  An information processing apparatus comprising:
     identification means for identifying a user ID that identifies a user, based on a captured image of at least the face of the user who has visited any one of specific points;
     recording control means for recording, in a play DB in association with the user ID, play data obtained by measuring an action of a specific sport performed by the user at the point visited by the user, as a play history;
     selection means for selecting, when the point visited by the user is a designated output point, recommended information from information related to the sport, based on at least a part of the play history accumulated in the play DB in association with the user ID; and
     output control means for causing a terminal device located at the designated output point to output the recommended information.
  2.  The information processing apparatus according to claim 1, wherein the selection means does not select the recommended information when the point visited by the user is not the designated output point.
  3.  The information processing apparatus according to claim 1 or 2, wherein the information related to the sport includes information on equipment, lessons, coaches, courses, or practice ranges for the sport.
  4.  The information processing apparatus according to any one of claims 1 to 3, further comprising generation means for generating play data on the action of the user at the point visited by the user, based on measurement data of the action acquired from a measurement terminal located at the point visited by the user.
  5.  The information processing apparatus according to claim 4, wherein the measurement data includes the captured image used to identify the user ID.
  6.  The information processing apparatus according to claim 4 or 5, wherein the measurement data includes output data of a sensor attached to the user or a sensor attached to equipment used by the user.
  7.  The information processing apparatus according to any one of claims 1 to 6, wherein the designated output point is a predetermined area of a sporting goods store, and the specific points include a sporting goods store different from that sporting goods store, or a predetermined area of a practice range or a competition venue for the sport.
  8.  The information processing apparatus according to any one of claims 1 to 7, wherein the selection means selects the recommended information based on, in addition to the play history, a purchase history of equipment for the sport, attribute information of the user, or a visit history of points.
  9.  An information processing system comprising:
     measurement terminals each located at one of specific points, each measurement terminal capturing at least the face of a user who has visited the point and acquiring measurement data of an action of a specific sport performed by that user; and
     an information processing apparatus,
     wherein the information processing apparatus includes:
     identification means for identifying a user ID that identifies the user who has visited any one of the specific points, based on a captured image captured by the measurement terminal located at that point;
     recording control means for recording, in a play DB in association with the user ID, play data obtained from the measurement data acquired by the measurement terminal at the point visited by the user, as a play history;
     selection means for selecting, when the point visited by the user is a designated output point, recommended information from information related to the sport, based on at least a part of the play history accumulated in the play DB in association with the user ID; and
     output control means for causing a terminal device located at the designated output point to output the recommended information.
  10.  The information processing system according to claim 9, wherein the terminal device includes the measurement terminal located at the designated output point.
  11.  An information processing method comprising:
     identification means for identifying a user ID that identifies a user, based on a captured image of at least the face of the user who has visited any one of specific points;
     recording control means for recording, in a play DB in association with the user ID, play data obtained by measuring an action of a specific sport performed by the user at the point visited by the user, as a play history;
     selection means for selecting, when the point visited by the user is a designated output point, recommended information from information related to the sport, based on at least a part of the play history accumulated in the play DB in association with the user ID; and
     output control means for causing a terminal device located at the designated output point to output the recommended information.
  12.  A non-transitory computer-readable medium storing a program for causing a computer to execute:
     an identification procedure of identifying a user ID that identifies a user, based on a captured image of at least the face of the user who has visited any one of specific points;
     a recording control procedure of recording, in a play DB in association with the user ID, play data obtained by measuring an action of a specific sport performed by the user at the point visited by the user, as a play history;
     a selection procedure of selecting, when the point visited by the user is a designated output point, recommended information from information related to the sport, based on at least a part of the play history accumulated in the play DB in association with the user ID; and
     an output control procedure of causing a terminal device located at the designated output point to output the recommended information.
PCT/JP2021/040059 2021-10-29 2021-10-29 Information processing device, information processing system, information processing method, and non-transitory computer-readable medium WO2023073940A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2021/040059 WO2023073940A1 (en) 2021-10-29 2021-10-29 Information processing device, information processing system, information processing method, and non-transitory computer-readable medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2021/040059 WO2023073940A1 (en) 2021-10-29 2021-10-29 Information processing device, information processing system, information processing method, and non-transitory computer-readable medium

Publications (1)

Publication Number Publication Date
WO2023073940A1 (en)

Family

ID=86157605

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/040059 WO2023073940A1 (en) 2021-10-29 2021-10-29 Information processing device, information processing system, information processing method, and non-transitory computer-readable medium

Country Status (1)

Country Link
WO (1) WO2023073940A1 (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002000787A (en) * 2000-07-19 2002-01-08 Sumitomo Rubber Ind Ltd Golf product selecting device
JP2017029460A (en) * 2015-08-03 2017-02-09 セイコーエプソン株式会社 Diagnostic server, diagnostic system, diagnostic method, diagnosis program, and recording medium
JP2021029738A (en) * 2019-08-27 2021-03-01 株式会社プロギア Shot management system, shot management method and program

Similar Documents

Publication Publication Date Title
US11941915B2 (en) Golf game video analytic system
US8512162B2 (en) Golf simulation apparatus and method for the same
US10607349B2 (en) Multi-sensor event system
US10923224B2 (en) Non-transitory computer-readable recording medium, skill determination method, skill determination device and server
US9039527B2 (en) Broadcasting method for broadcasting images with augmented motion data
US8371989B2 (en) User-participating type fitness lecture system and fitness training method using the same
CA2781992C (en) Virtual golf simulation device, system including the same and terminal device, and method for virtual golf simulation
US11839805B2 (en) Computer vision and artificial intelligence applications in basketball
JP2018530804A (en) Multi-sensor event detection and tagging system
JP2006247023A (en) Golf club information providing system, method and program
Myers et al. The Sony Smart Tennis Sensor accurately measures external workload in junior tennis players
US11806579B2 (en) Sports operating system
KR101221065B1 (en) Practicing method of golf swing motion using motion overlap and practicing system of golf swing motion using the same
CN103328053A (en) Apparatus for providing golf content, system for providing golf content using same, apparatus for virtual golf simulation, method for providing golf content, and method for virtual golf simulation
JP6186542B1 (en) Information processing apparatus, information processing method, and information processing program
WO2023073940A1 (en) Information processing device, information processing system, information processing method, and non-transitory computer-readable medium
KR101864039B1 (en) System for providing solution of justice on martial arts sports and analyzing bigdata using augmented reality, and Drive Method of the Same
US20140330412A1 (en) Computerized interactive sports system and a method of assessing sports skills of users
JP7164334B2 (en) Motion analysis device, motion analysis method and motion analysis program
US20090017945A1 (en) Golf ball selection assisting method and selection assisting apparatus
KR20100034809A (en) Method and system for colligate management of golf information
KR101504539B1 (en) Cloud system for golf information, cloud server apparatus and cloud database used to the same and cloud service providing method for golf information using the same
WO2021172459A1 (en) Movement analysis system, server, movement analysis method, control program, and recording medium
KR102344739B1 (en) System of operating golf course and server performing the same
KR102599302B1 (en) Golf game management device and method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21962481

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2023556045

Country of ref document: JP

Kind code of ref document: A