WO2022196059A1 - Information processing device, information processing method, and program - Google Patents

Information processing device, information processing method, and program

Info

Publication number
WO2022196059A1
WO2022196059A1 (PCT/JP2022/000897)
Authority
WO
WIPO (PCT)
Prior art keywords
information
user
posture
unit
reference motion
Prior art date
Application number
PCT/JP2022/000897
Other languages
French (fr)
Japanese (ja)
Inventor
嘉寧 呉
順 横野
嘉昭 岩井
明香 渡辺
夏子 尾崎
Original Assignee
ソニーグループ株式会社
Application filed by ソニーグループ株式会社
Publication of WO2022196059A1

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/103 Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B 5/107 Measuring physical dimensions, e.g. size of the entire body or parts thereof
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/103 Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B 5/11 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63B APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B 69/00 Training appliances or apparatus for special sports
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16Y INFORMATION AND COMMUNICATION TECHNOLOGY SPECIALLY ADAPTED FOR THE INTERNET OF THINGS [IoT]
    • G16Y 10/00 Economic sectors
    • G16Y 10/65 Entertainment or amusement; Sports
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16Y INFORMATION AND COMMUNICATION TECHNOLOGY SPECIALLY ADAPTED FOR THE INTERNET OF THINGS [IoT]
    • G16Y 20/00 Information sensed or collected by the things
    • G16Y 20/40 Information sensed or collected by the things relating to personal data, e.g. biometric data, records or preferences
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16Y INFORMATION AND COMMUNICATION TECHNOLOGY SPECIALLY ADAPTED FOR THE INTERNET OF THINGS [IoT]
    • G16Y 40/00 IoT characterised by the purpose of the information processing
    • G16Y 40/20 Analytics; Diagnosis

Definitions

  • The present disclosure relates to an information processing device, an information processing method, and a program.
  • For example, Patent Literature 1 discloses a technique for one-leg standing exercise that evaluates the exercise posture based on how high and for how long the user raises a leg, and feeds the evaluation result back to the user. It also discloses that personalized information manually entered by the user, such as height and weight, is used when evaluating the exercise posture.
  • However, the technique of Patent Literature 1 does not recognize physique feature information arising from the user's build, such as the length of each body part or the presence or absence of limbs, and therefore does not use physique feature information, which differs from user to user, when evaluating the exercise posture.
  • Therefore, the present disclosure proposes a new and improved information processing device, information processing method, and program capable of providing personalized feedback according to the user's physique feature information.
  • According to the present disclosure, there is provided an information processing device including: a posture information acquisition unit that acquires posture information indicating a posture of a user; a calculation unit that calculates physique feature information of the user by comparing the user's posture information with predetermined posture information; a correction unit that corrects posture information prepared as reference motion information according to the user's physique feature information; and an output unit that outputs guidance information for guiding the user's motion based on the reference motion information corrected by the correction unit.
  • According to the present disclosure, there is also provided a computer-implemented information processing method including: obtaining posture information indicating a posture of a user; calculating physique feature information of the user by comparing the user's posture information with predetermined posture information; correcting posture information prepared as reference motion information according to the user's physique feature information; and outputting guidance information for guiding the user's motion based on the corrected reference motion information.
  • Further, according to the present disclosure, there is provided a program that causes a computer to realize: a posture information acquisition function for acquiring posture information indicating a posture of a user; a calculation function for calculating physique feature information of the user by comparing the user's posture information with predetermined posture information; a correction function for correcting posture information prepared as reference motion information according to the user's physique feature information; and an output function for outputting guidance information for guiding the user's motion based on the reference motion information corrected by the correction function.
  • FIG. 1 is an explanatory diagram showing an information processing system according to an embodiment of the present disclosure.
  • FIG. 2 is an explanatory diagram for explaining a functional configuration example of the information processing terminal 10 according to the present disclosure.
  • FIG. 3 is an explanatory diagram for explaining a functional configuration example of the server 20 according to the present disclosure.
  • FIG. 4 is an explanatory diagram for explaining an example of skeleton data, which is the user's posture information.
  • FIG. 5 is an explanatory diagram for explaining an example of object detection according to the present disclosure.
  • FIG. 6 is an explanatory diagram for explaining an example of a method for calculating the user's physique feature information.
  • FIG. 7 is an explanatory diagram for explaining an example of correction of reference motion information.
  • FIG. 8 is an explanatory diagram for explaining Feedback Example 1 according to the present disclosure.
  • FIG. 9 is an explanatory diagram for explaining Feedback Example 2 according to the present disclosure.
  • FIG. 10 is an explanatory diagram for explaining Feedback Example 3 according to the present disclosure.
  • FIG. 11 is an explanatory diagram for explaining Feedback Example 4 according to the present disclosure.
  • FIG. 12 is an explanatory diagram for explaining Feedback Example 5 according to the present disclosure.
  • FIG. 13 is an explanatory diagram for explaining an operation processing example related to feedback of the information processing terminal 10 according to the present disclosure.
  • FIG. 14 is an explanatory diagram for explaining an operation processing example of the information processing terminal 10 when another user's motion is acquired as reference motion information.
  • FIG. 15 is a block diagram showing the hardware configuration of the information processing terminal 10.
  • To visualize information on the movement of a moving body such as a human or an animal, skeleton data expressed by a skeleton structure indicating the structure of the body is used as the user's posture information.
  • Skeleton data includes information on parts.
  • The parts in the skeleton structure correspond to, for example, terminal parts and joint parts of the body.
  • Skeleton data may also include bones, which are line segments connecting parts.
  • The bones in the skeleton structure can correspond to, for example, human bones, but the positions and numbers of bones do not necessarily have to match the actual human skeleton.
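The structure described above can be summarized in a small data model. The following is a minimal sketch, assuming hypothetical class and field names that do not appear in the disclosure, of skeleton data made up of parts and bones:

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class Part:
    # A terminal part or joint of the body, e.g. "right_wrist".
    name: str
    position: Tuple[float, float, float]                 # 3D position of the part
    orientation: Tuple[float, float, float, float]       # posture as a quaternion

@dataclass
class Bone:
    # Line segment connecting two parts; need not match a real anatomical bone.
    start_part: str
    end_part: str
    length: float      # skeletal feature: bone length
    thickness: float   # skeletal feature: bone thickness

@dataclass
class SkeletonData:
    parts: List[Part]
    bones: List[Bone]

# Time-series skeleton data is simply an ordered list of per-frame skeletons.
SkeletonSequence = List[SkeletonData]
```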
  • The position and posture of each part in the skeleton data can be obtained by sensors that detect the user's movements. For example, there are techniques that detect the position and posture of each body part based on time-series image data acquired by an imaging sensor, and techniques that obtain the position information of a motion sensor attached to a body part based on the time-series data acquired by that motion sensor.
  • The uses of skeleton data are diverse. For example, time-series data of skeleton data is used for form improvement in sports and for applications such as VR (Virtual Reality) or AR (Augmented Reality).
  • Time-series data of skeleton data is also used to generate an avatar image imitating a user's movement, and the avatar image is then distributed.
  • FIG. 1 is an explanatory diagram showing an information processing system according to an embodiment of the present disclosure.
  • As shown in FIG. 1, the information processing system according to an embodiment of the present disclosure includes an information processing terminal 10 and a server 20, which communicate via a network 1.
  • The network 1 is a wired or wireless transmission path for information transmitted from devices connected to it.
  • The network 1 may include a public line network such as the Internet, a telephone line network, or a satellite communication network, various LANs (Local Area Networks) including Ethernet (registered trademark), WANs (Wide Area Networks), and the like.
  • The network 1 may also include a dedicated line network such as an IP-VPN (Internet Protocol-Virtual Private Network).
  • The information processing terminal 10 is an example of an information processing device.
  • The information processing terminal 10 acquires posture information indicating the posture of a user U, and calculates the physique feature information of the user U by comparing that posture information with predetermined posture information.
  • The information processing terminal 10 then corrects the posture information prepared as reference motion information by the server 20 according to the physique feature information of the user U, and outputs guidance information that guides the motion of the user U based on the corrected reference motion information.
  • Although FIG. 1 shows a smartphone as the information processing terminal 10, the information processing terminal 10 may be another information processing device such as a notebook PC (Personal Computer) or a desktop PC.
  • The server 20 holds standard skeleton data containing predetermined skeleton information. In addition, the server 20 holds time-series data of posture information prepared as at least one piece of reference motion information.
  • The server 20 transmits the standard skeleton data or the reference motion information described above to the information processing terminal 10 in response to a request signal received from the information processing terminal 10.
  • FIG. 2 is an explanatory diagram for explaining a functional configuration example of the information processing terminal 10 according to the present disclosure.
  • The information processing terminal 10 includes a data acquisition unit 110, an operation display unit 120, a communication unit 130, a storage unit 140, and a control unit 150.
  • The data acquisition unit 110 acquires motion information of the user and of objects.
  • The data acquisition unit 110 may be, for example, an imaging sensor, a ToF (Time of Flight) sensor, or an antenna that transmits and receives radio waves.
  • Hereinafter, an imaging sensor will be described as the main example of the data acquisition unit 110.
  • The operation display unit 120 functions as a display unit that displays guidance information for guiding the user's motion under the control of the display control unit 171. The operation display unit 120 also functions as an operation unit for the user to input physical information and health information, which will be described later.
  • Further, the operation display unit 120 functions as an operation unit that allows the user to input the sports event, points of interest, and the like. For example, the user inputs or selects "baseball" and "swing" on the operation display unit 120.
  • In this case, the information processing terminal 10 may output guidance information that guides the user's motion based on the posture information obtained when the user swings and on reference motion information prepared as a model swing.
  • The function of the display unit is realized by, for example, a CRT (Cathode Ray Tube) display device, a liquid crystal display (LCD) device, or an OLED (Organic Light Emitting Diode) device.
  • The function of the operation unit is realized by, for example, a touch panel, a keyboard, or a mouse.
  • The information processing terminal 10 has a configuration in which the functions of the display unit and the operation unit are integrated, but it may instead have a configuration in which these functions are separated.
  • The communication unit 130 communicates various information with the server 20 via the network 1.
  • For example, the communication unit 130 transmits to the server 20 a request signal requesting the standard skeleton data or the reference motion information.
  • The communication unit 130 also receives the standard skeleton data and the reference motion information transmitted from the server 20 in response to the request signal.
  • The storage unit 140 holds software and various data.
  • For example, the storage unit 140 holds the physique feature information of the user.
  • The control unit 150 controls the overall operation of the information processing terminal 10.
  • The control unit 150 includes a posture estimation unit 151, an object detection unit 155, a physique feature information calculation unit 159, a conversion unit 163, a motion comparison unit 167, and a display control unit 171.
  • The posture estimation unit 151 estimates part information indicating the position and posture of each part of the user based on the time-series data acquired by the data acquisition unit 110.
  • The posture estimation unit 151 then generates skeleton data including the position information and posture information of each part in the skeleton structure based on the part information. Details of the skeleton data will be described later.
  • The object detection unit 155 detects object information and object motion information based on the time-series data acquired by the data acquisition unit 110.
  • The physique feature information calculation unit 159 calculates the physique feature information of the user by comparing the user's skeleton data generated by the posture estimation unit 151 with the standard skeleton data held by the standard skeleton data storage unit 231.
  • A specific example of the method for calculating the physique feature information will be described later.
  • The conversion unit 163 is an example of a correction unit, and corrects the reference motion information acquired from the server 20 according to the user's physique feature information.
  • In the following, the reference motion information corrected according to the user's physique feature information may be referred to as post-correction reference motion information.
  • The conversion unit 163 may also convert the skeletal features of each part of the user's skeleton data into the skeletal features of each part of the standard skeleton data.
  • The skeletal features referred to in this specification include the length and thickness of bones.
  • Further, the conversion unit 163 may convert the skeletal features of each part of the plurality of skeleton data included in the post-correction reference motion information into the skeletal features of each part of the user's skeleton data.
  • The motion comparison unit 167 compares the user motion information, which is time-series data of the user's skeleton data, with the reference motion information converted by the conversion unit 163, and outputs the comparison result to the conversion unit 163 or the display control unit 171.
  • Alternatively, the motion comparison unit 167 compares one frame of the user motion information with one frame of the reference motion information converted by the conversion unit 163, and outputs the comparison result to the conversion unit 163 or the display control unit 171.
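As a rough illustration of the per-frame comparison, the sketch below computes, for each part, the direction and distance between the user's skeleton data and the corrected reference skeleton data. The function and field names are hypothetical (reusing the Part/SkeletonData sketch above), and the disclosure does not prescribe this particular metric:

```python
import numpy as np

def compare_frame(user_frame, reference_frame):
    """Compare one frame of user skeleton data with one frame of reference
    motion information; return, per part, the offset vector and its distance."""
    result = {}
    for part in user_frame.parts:
        # Find the corresponding part in the reference frame by name.
        ref_part = next(p for p in reference_frame.parts if p.name == part.name)
        offset = np.array(ref_part.position) - np.array(part.position)
        result[part.name] = {
            "direction": offset,                         # which way the part should move
            "distance": float(np.linalg.norm(offset)),   # how far it is from the target
        }
    return result
```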
  • The display control unit 171 is an example of an output unit, and causes the operation display unit 120 to display guidance information that guides the user's motion based on the reference motion information corrected by the conversion unit 163.
  • A specific example of the guidance information will be described later.
  • FIG. 3 is an explanatory diagram for explaining a functional configuration example of the server 20 according to the present disclosure.
  • The server 20 includes a communication unit 210, a control unit 220, and a storage unit 230.
  • The communication unit 210 communicates various information with the information processing terminal 10 via the network 1.
  • For example, the communication unit 210 receives from the information processing terminal 10 a request signal requesting the standard skeleton data or the reference motion information. Under the control of the control unit 220, the communication unit 210 transmits the standard skeleton data or the reference motion information to the information processing terminal 10 in response to the received request signal.
  • The control unit 220 controls the overall operation of the server 20.
  • For example, the control unit 220 causes the communication unit 210 to transmit various information held in the storage unit 230 in response to a request signal received from the information processing terminal 10.
  • The storage unit 230 holds software and various data. As shown in FIG. 3, the storage unit 230 includes a standard skeleton data storage unit 231 and a reference motion storage unit 235.
  • The standard skeleton data storage unit 231 holds standard skeleton data having predetermined skeletal features for each part.
  • The standard skeleton data is an example of the predetermined posture information; however, the predetermined posture information according to the present disclosure is not limited to this example.
  • For example, the predetermined posture information according to the present disclosure may be part information having predetermined skeletal features.
  • The reference motion storage unit 235 holds reference motion information as time-series data of skeleton data representing a certain motion. Details of the reference motion information will be described later.
  • The functional configuration example according to the present disclosure has been described above. Next, details of the processing according to the present disclosure will be described with reference to FIGS. 4 to 13.
  • FIG. 4 is an explanatory diagram for explaining an example of skeleton data including user posture information.
  • The posture estimation unit 151 acquires skeleton data US including position information and posture information of each part in the skeleton structure, based on, for example, the time-series data acquired by the data acquisition unit 110.
  • The posture estimation unit 151 may generate the skeleton data US of the user U using machine learning technology such as a DNN (Deep Neural Network). More specifically, the posture estimation unit 151 may generate the skeleton data US of the user U using an estimator obtained by machine learning that uses, for example, pairs of image data of photographed persons and corresponding skeleton data as teacher data.
  • The skeleton data US may include bone information (position information, posture information, skeletal feature information, and the like) in addition to the information on the parts.
  • Although FIG. 4 shows the skeleton data of the whole body of the user U, the posture estimation unit 151 may generate skeleton data of only some parts as necessary.
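As a loose illustration of this step, the sketch below applies a pre-trained pose estimator frame by frame to build a skeleton sequence. The estimator interface is hypothetical; the disclosure specifies only that a DNN-based estimator trained on pairs of person images and skeleton data may be used, not a particular model or API:

```python
def estimate_skeleton_sequence(image_frames, pose_estimator):
    """Run a trained pose estimator over time-series image data and collect
    per-frame skeleton data (positions and postures of each part)."""
    sequence = []
    for image in image_frames:
        # pose_estimator is assumed to map one image to one SkeletonData instance,
        # e.g. a DNN trained on (person image, skeleton data) pairs.
        sequence.append(pose_estimator(image))
    return sequence
```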
  • FIG. 5 is an explanatory diagram for explaining an example of object detection according to the present disclosure.
  • The object detection unit 155 detects object information and object motion information of various objects o based on the time-series data acquired by the data acquisition unit 110.
  • The object information includes the type of object (for example, a ball or a racket).
  • For example, the object detection unit 155 recognizes the wheelchair o1 in which the user U sits and the racket o2 and ball o3 held by the user based on the time-series data acquired by the data acquisition unit 110, and further detects the motion direction and size of each object.
  • The object detection unit 155 may also detect the relationship between each of the objects o1 to o3 and the user U. For example, when the user U is in a wheelchair, the object detection unit 155 may recognize the wheelchair, which moves together with the user, as part of the user's body. In this case, the posture estimation unit 151 does not need to generate skeleton data of, for example, the lower limbs of the user U.
  • Further, the object detection unit 155 may detect objects corresponding to the input sports event. For example, when the user inputs "baseball" using the operation display unit 120, the object detection unit 155 may detect objects related to baseball (for example, a batting tool and a ball).
  • FIG. 6 is an explanatory diagram for explaining an example of a method for calculating user's physique characteristic information.
  • The physique feature information calculation unit 159 calculates the physique feature information of the user by comparing the posture information of each part of the user estimated by the posture estimation unit 151 with the predetermined posture information held by the standard skeleton data storage unit 231.
  • For example, the physique feature information calculation unit 159 calculates, as the user's physique feature information, the difference between each piece of skeletal feature information included in the user's skeleton data US and the corresponding piece of skeletal feature information included in the standard skeleton data RS.
  • For example, the physique feature information calculation unit 159 may calculate, as the physique feature information, the difference D between the length of a bone US1 of the right arm included in the user's skeleton data US and the length of the corresponding bone RS1 of the right arm included in the standard skeleton data RS.
  • Further, the physique feature information calculation unit 159 may calculate the differences between the skeletal feature information of a plurality of parts included in the user's skeleton data US and the skeletal feature information of the corresponding parts included in the standard skeleton data RS, and calculate the user's physique feature information based on each calculated difference.
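A minimal sketch of this difference-based calculation is shown below, reusing the hypothetical SkeletonData/Bone structure from the earlier sketch; the disclosure does not fix the exact set of skeletal features that are compared:

```python
def calc_physique_features(user_skeleton, standard_skeleton):
    """Compute the user's physique feature information as per-bone differences
    between the user's skeleton data and the standard skeleton data."""
    features = {}
    for user_bone in user_skeleton.bones:
        # Match bones by the pair of parts they connect.
        std_bone = next(
            b for b in standard_skeleton.bones
            if (b.start_part, b.end_part) == (user_bone.start_part, user_bone.end_part)
        )
        features[(user_bone.start_part, user_bone.end_part)] = {
            # e.g. the difference D between the right-arm bone lengths US1 and RS1
            "length_diff": user_bone.length - std_bone.length,
            "thickness_diff": user_bone.thickness - std_bone.thickness,
        }
    return features
```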
  • The physique feature information calculation unit 159 may also correct the user's physique feature information according to the user's physical information or health information that is not included in the skeleton information of the skeleton data.
  • Physical information includes, for example, information about the user's body, such as height, weight, or muscle mass.
  • Health information includes information about the user's health, such as medical history, age, and physical disabilities.
  • Information input through the operation display unit 120 may be used as the user's physical information and health information.
  • Further, the physique feature information calculation unit 159 may compare the position of the center of gravity of the user's skeleton data US with the position of the center of gravity of the standard skeleton data RS, and calculate the user's physique feature information based on the comparison result.
  • The physique feature information calculation unit 159 may also calculate the user's physique feature information by combining a plurality of the calculation methods described above. Next, an example of a correction method for correcting each of the plurality of reference skeleton data included in the reference motion information in accordance with the user's physique feature information will be described.
  • FIG. 7 is an explanatory diagram for explaining an example of correction of reference motion information.
  • First, the conversion unit 163 executes a conversion process C1 that converts the skeletal feature information (for example, bone length and thickness) of each part of the plurality of reference skeleton data CS1 included in the reference motion information into the skeletal feature information of each part of the standard skeleton data, thereby generating post-conversion skeleton data CS2.
  • Next, the conversion unit 163 executes a correction process C2 that corrects the post-conversion skeleton data CS2 according to the user's physique feature information, physical information, and health information calculated by the physique feature information calculation unit 159.
  • The conversion unit 163 also executes the conversion process C1 to convert the skeletal feature information of each part of the user's skeleton data into the skeletal feature information of the standard skeleton data. This enables the motion comparison unit 167 to compare the motions of skeleton data having the same skeletal feature information.
  • Note that the conversion process C1 may instead be a process of converting each piece of skeletal feature information of the plurality of skeleton data included in the reference motion information into the skeletal feature information included in the user's skeleton data. In that case, the conversion unit 163 does not have to perform the conversion process C1 on the user's skeleton data.
  • Further, the conversion unit 163 may execute the correction process C2 to correct the post-conversion skeleton data CS2 based on a weight parameter set for each event of the reference motion information, in addition to the user's physique feature information, thereby generating post-correction skeleton data CS3. For example, when the user is practicing batting, the weight parameters of parts that have a particularly large effect on the motion (for example, the arms and hips) may be set large, and the weight parameters of parts that have a small effect (for example, the head) may be set small.
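As a rough sketch of the correction process C2 (the conversion process C1 would be an analogous rescaling of each bone toward the standard skeleton's features), the snippet below shifts each bone of the reference frames by the user's physique difference, scaled by a per-part weight parameter. This scaling rule is an assumption for illustration, not the method fixed by the disclosure; it reuses the hypothetical Bone and SkeletonData classes and the physique-feature dictionary from the earlier sketches:

```python
def correct_reference_motion(reference_sequence, physique_features, weights):
    """Correction process C2 (sketch): shift the skeletal features of each bone in
    the post-conversion reference frames by the user's physique difference, scaled
    by the per-part weight parameter set for the training event."""
    corrected_sequence = []
    for frame in reference_sequence:
        corrected_bones = []
        for bone in frame.bones:
            key = (bone.start_part, bone.end_part)
            diff = physique_features.get(key, {"length_diff": 0.0, "thickness_diff": 0.0})
            w = weights.get(key, 1.0)  # larger for influential parts (e.g. arms), smaller for others
            corrected_bones.append(Bone(
                start_part=bone.start_part,
                end_part=bone.end_part,
                length=bone.length + w * diff["length_diff"],
                thickness=bone.thickness + w * diff["thickness_diff"],
            ))
        corrected_sequence.append(SkeletonData(parts=frame.parts, bones=corrected_bones))
    return corrected_sequence
```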
  • In this way, the information processing terminal 10 can provide various types of feedback to the user. Specific examples of feedback will be described below with reference to FIGS. 8 to 12.
  • FIG. 8 is an explanatory diagram for explaining Feedback Example 1 according to the present disclosure.
  • For example, the display control unit 171 may cause the operation display unit 120 to display an image in which the skeleton data US of the user U and the reference skeleton data CS are superimposed on the displayed image of the user.
  • The reference skeleton data CS in this feedback example is reference skeleton data obtained by converting the post-correction skeleton data CS3 shown in FIG. 7 into the skeletal features of the user's skeleton data.
  • The motion comparison unit 167 compares the time-series data of the skeleton data US of the user U with the reference skeleton data CS, for example, computes the direction and distance of the difference between them, and outputs the result to the display control unit 171.
  • The display control unit 171 may then cause the operation display unit 120 to display an arrow or the like indicating the direction and distance of the difference between the user's skeleton data US and the reference skeleton data CS. The display of such reference skeleton data CS and arrows is an example of guidance information that guides the user's posture.
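To make the arrow display concrete, the fragment below turns the per-part comparison result (see the earlier compare_frame sketch) into simple arrow annotations. The threshold and the output format are assumptions for illustration, since the disclosure does not specify how the operation display unit 120 renders the guidance:

```python
def guidance_arrows(comparison_result, min_distance=0.02):
    """Build arrow annotations for parts whose deviation from the reference
    skeleton exceeds a small threshold (units depend on the coordinate system)."""
    arrows = []
    for part_name, diff in comparison_result.items():
        if diff["distance"] >= min_distance:  # ignore negligible deviations
            arrows.append({
                "part": part_name,
                "direction": diff["direction"] / diff["distance"],  # unit vector
                "length": diff["distance"],
            })
    return arrows  # each entry can then be drawn over the superimposed skeletons
```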
  • FIG. 9 is an explanatory diagram for explaining Feedback Example 2 according to the present disclosure.
  • The information processing terminal 10 may acquire the relationship between the reference motion information and the muscle load amount of each part from a database.
  • In this case, the conversion unit 163 may correct the muscle load amount of each part corresponding to the reference motion information acquired from the database according to the user's physique feature information.
  • The display control unit 171 then causes the operation display unit 120 to display the muscle load amount of each part of the user as guidance information.
  • This allows the user to check the degree of achievement toward the muscle mass set in advance as a target, and to consider the type of training according to the degree of achievement.
  • FIG. 10 is an explanatory diagram for explaining Feedback Example 3 according to the present disclosure.
  • For example, the conversion unit 163 may correct the position, motion direction, or speed of the ball included in a heading motion as the reference motion information, according to the user's physique feature information, the object information, and the object motion information.
  • The motion comparison unit 167 then compares the user's skeleton data with the position, motion direction, or speed of the ball included in the reference motion information, and outputs various information such as the impact point, direction, or force of the ball to the display control unit 171. The display control unit 171 may then cause the operation display unit 120 to display, as guidance information, the various information such as the impact point, direction, or force of the ball input from the motion comparison unit 167.
  • FIG. 11 is an explanatory diagram for explaining Feedback Example 4 according to the present disclosure.
  • The solid-line skeleton data in FIG. 11 is the user's skeleton data, and the dashed-line skeleton data is the reference skeleton data.
  • The display control unit 171 may cause the operation display unit 120 to display an image in which time-series data of the user's skeleton data generated over a plurality of trials and the reference motion information are superimposed. This allows the user to check habits and the like included in the user's motion.
  • Alternatively, the display control unit 171 may cause the operation display unit 120 to display an image in which the reference motion information is superimposed on time-series data of a single set of skeleton data obtained by averaging the skeleton data from a plurality of repetitions of the same motion by the user. Such display of the reference motion information is an example of guidance information.
  • FIG. 12 is an explanatory diagram for explaining Feedback Example 5 according to the present disclosure.
  • For example, the conversion unit 163 may correct the ball launching method of the skeleton data included in the reference motion information according to the user's physique feature information, the use of a wheelchair, the shape and type of the racket, and the like.
  • The motion comparison unit 167 compares the user's skeleton data with the reference motion information, and outputs the impact range of the racket and the movement direction of the wheelchair to the display control unit 171 as the comparison result.
  • The display control unit 171 may cause the operation display unit 120 to display the impact range of the racket and the movement direction of the wheelchair, in addition to the user's skeleton data and the range of the wheelchair in which the user sits.
  • The impact range of the racket and the movement direction of the wheelchair are examples of guidance information.
  • FIG. 13 is an explanatory diagram for explaining an operation processing example related to feedback of the information processing terminal 10 according to the present disclosure.
  • The data acquisition unit 110 photographs the user and acquires time-series image data (S101).
  • The posture estimation unit 151 generates the user's skeleton data from the acquired time-series image data (S105).
  • The object detection unit 155 detects object information from the acquired time-series image data (S109).
  • The communication unit 130 receives the standard skeleton data held in the standard skeleton data storage unit 231 (S113).
  • The physique feature information calculation unit 159 compares the user's skeleton data with the standard skeleton data, and calculates the result of the comparison as the user's physique feature information (S117).
  • The conversion unit 163 converts the skeleton information included in the user's skeleton data into the skeleton information included in the standard skeleton data (S121).
  • The communication unit 130 receives the reference motion information held in the reference motion storage unit 235 (S125).
  • The conversion unit 163 corrects the time-series data of the skeleton data included in the reference motion information according to the user's physique feature information (S129).
  • The motion comparison unit 167 compares the time-series data of the user's skeleton data with the time-series data of the skeleton data included in the reference motion information (S133).
  • The display control unit 171 causes the operation display unit 120 to display the result of the comparison (S137), and the information processing terminal 10 ends the processing.
  • Note that the reference motion information may be stored in advance in the reference motion storage unit 235 as model data based on dynamics.
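Putting steps S101 to S137 together, the end-to-end flow on the terminal side can be sketched as follows, reusing the hypothetical helpers from the earlier sketches. The network exchanges with the server 20 are reduced to plain function arguments, and object detection (S109) and the user-side conversion (S121) are omitted for brevity:

```python
def feedback_pipeline(image_frames, pose_estimator, standard_skeleton,
                      reference_sequence, weights):
    # S101-S105: photograph the user and estimate per-frame skeleton data
    user_sequence = estimate_skeleton_sequence(image_frames, pose_estimator)
    # S113-S117: compare against the standard skeleton received from the server 20
    # (here the physique features are taken from the first frame for simplicity)
    physique_features = calc_physique_features(user_sequence[0], standard_skeleton)
    # S125-S129: correct the reference motion information for this user
    corrected_reference = correct_reference_motion(reference_sequence,
                                                   physique_features, weights)
    # S133: frame-by-frame comparison of user motion and corrected reference motion
    comparisons = [compare_frame(u, r)
                   for u, r in zip(user_sequence, corrected_reference)]
    # S137: the display control unit 171 would render these results as guidance information
    return comparisons
```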
  • FIG. 14 is an explanatory diagram for explaining an example of operation processing of the information processing terminal 10 when acquiring another user's operation as reference operation information.
  • The data acquisition unit 110 photographs another user and acquires time-series image data (S201).
  • The posture estimation unit 151 generates the other user's skeleton data from the acquired time-series image data (S205).
  • The object detection unit 155 detects object information from the acquired time-series image data (S209).
  • The communication unit 130 receives the standard skeleton data held in the standard skeleton data storage unit 231 (S213).
  • The physique feature information calculation unit 159 compares the other user's skeleton data with the standard skeleton data, and calculates the result of the comparison as the other user's physique feature information (S217).
  • The conversion unit 163 corrects the other user's skeleton data based on the standard skeleton data and the other user's physique feature information (S221).
  • The conversion unit 163 then corrects the corrected skeleton data of the other user according to the user's physique feature information (S225).
  • The motion comparison unit 167 compares the time-series data of the user's skeleton data with the corrected time-series data of the other user's skeleton data (S229).
  • The display control unit 171 causes the operation display unit 120 to display the result of the comparison (S233), and the information processing terminal 10 ends the processing.
  • As described above, the information processing terminal 10 provides the user with guidance information based on reference motion information corrected according to the user's physique feature information. Therefore, the information processing terminal 10 can provide guidance information that takes into consideration the range of motion of each part, which may differ from user to user.
  • Further, the reference motion information is corrected according to the user's physical information or health information.
  • Therefore, the information processing terminal 10 can provide the user with guidance information based on reference motion information corrected within a range that the user can handle, according to the user's age, medical history, and the like.
  • The information processing terminal 10 also compares the user's motion information with the reference motion information. Accordingly, the information processing terminal 10 can provide the user with guidance information that quantifies the difference between the user's motion information and the target reference motion information.
  • The reference motion information may also be time-series data of another user's posture information.
  • Thereby, the user can perform various types of training while aiming at, for example, the motion information of a leader such as a teacher or an instructor.
  • Further, the information processing terminal 10 may detect object information and object motion information, and correct the reference motion information according to the detected information. As a result, the information processing terminal 10 can provide guidance information that takes into consideration the effects on the user's training of, for example, a wheelchair, an artificial arm, or an artificial leg.
  • The information processing terminal 10 may also correct the reference motion information according to weight parameters of the user's parts, which are set for each training event. Thereby, the information processing terminal 10 can provide the user with guidance information that reflects the degree of importance of each part, which may differ depending on the type of training.
  • <<Hardware configuration example>> The embodiments of the present disclosure have been described above. Information processing such as the generation of skeleton data and the extraction of feature amounts described above is realized by cooperation between software and the hardware of the information processing terminal 10 described below. Note that the hardware configuration described below can also be applied to the server 20.
  • FIG. 15 is a block diagram showing the hardware configuration of the information processing terminal 10.
  • The information processing terminal 10 includes a CPU (Central Processing Unit) 1001, a ROM (Read Only Memory) 1002, a RAM (Random Access Memory) 1003, and a host bus 1004.
  • The information processing terminal 10 also includes a bridge 1005, an external bus 1006, an interface 1007, an input device 1008, an output device 1010, a storage device (HDD) 1011, a drive 1012, and a communication device 1015.
  • The CPU 1001 functions as an arithmetic processing device and a control device, and controls the overall operation within the information processing terminal 10 according to various programs.
  • The CPU 1001 may be a microprocessor.
  • The ROM 1002 stores programs, calculation parameters, and the like used by the CPU 1001.
  • The RAM 1003 temporarily stores programs used in the execution by the CPU 1001, parameters that change as appropriate during that execution, and the like. These are interconnected by the host bus 1004, which includes a CPU bus or the like. The functions of, for example, the physique feature information calculation unit 159 and the motion comparison unit 167 described with reference to FIG. 2 can be realized by cooperation between software and the CPU 1001, the ROM 1002, and the RAM 1003.
  • The host bus 1004 is connected via the bridge 1005 to an external bus 1006 such as a PCI (Peripheral Component Interconnect/Interface) bus.
  • Note that the bridge 1005 and the external bus 1006 do not necessarily have to be configured separately, and their functions may be implemented in one bus.
  • The input device 1008 includes input means for the user to input information, such as a mouse, a keyboard, a touch panel, buttons, a microphone, switches, and levers, and an input control circuit that generates an input signal based on the user's input and outputs it to the CPU 1001.
  • By operating the input device 1008, the user of the information processing terminal 10 can input various data to the information processing terminal 10 and instruct processing operations.
  • The output device 1010 includes display devices such as a liquid crystal display device, an OLED device, and lamps.
  • The output device 1010 also includes audio output devices such as speakers and headphones.
  • The output device 1010 outputs, for example, reproduced content.
  • The display device displays various information such as reproduced video data as text or images.
  • The audio output device converts reproduced audio data and the like into audio and outputs the audio.
  • The storage device 1011 is a device for storing data.
  • The storage device 1011 may include a storage medium, a recording device that records data on the storage medium, a reading device that reads data from the storage medium, a deletion device that deletes data recorded on the storage medium, and the like.
  • The storage device 1011 is composed of, for example, an HDD (Hard Disk Drive).
  • The storage device 1011 drives a hard disk and stores programs executed by the CPU 1001 and various data.
  • The drive 1012 is a reader/writer for storage media, and is built into or externally attached to the information processing terminal 10.
  • The drive 1012 reads out information recorded in an attached removable storage medium 30 such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory, and outputs the information to the RAM 1003.
  • The drive 1012 can also write information to the removable storage medium 30.
  • The communication device 1015 is, for example, a communication interface configured with a communication device or the like for connecting to the network 1.
  • The communication device 1015 may be a wireless-LAN-compatible communication device, an LTE (Long Term Evolution)-compatible communication device, or a wired communication device that performs wired communication.
  • As a supplement, the posture estimation unit 151 may generate a mesh model representing the surface shape of the user's body based on the time-series data acquired by the data acquisition unit 110. The physique feature information calculation unit 159 may then calculate the difference between the user's mesh model and a predetermined mesh model as the user's physique feature information. Further, the display control unit 171 may cause the operation display unit 120 to display a screen in which the user's mesh model and the mesh model of the reference motion information corrected according to the physique feature information are superimposed on the user. This allows the user to recognize more clearly how much the user's whole body deviates from the reference motion information.
  • The information processing terminal 10 may further include all or part of the functional configuration of the server 20 according to the present disclosure.
  • In that case, the information processing terminal 10 can perform the series of processes according to the present disclosure without communicating via the network 1.
  • Conversely, the server 20 may acquire the user's skeleton data generated by the information processing terminal 10. The server 20 may then calculate the user's physique feature information and correct the reference motion information according to the calculated physique feature information. The server 20 may then transmit to the information processing terminal 10 guidance information that guides the user's motion.
  • The display control unit 171 may also divide the degree of achievement into stages and cause the operation display unit 120 to display guidance information for each stage. This allows the user to experience multiple successes, which can improve the user's motivation.
  • Note that each step in the processing of the information processing terminal 10 and the server 20 in this specification does not necessarily have to be processed in chronological order in the order described in the flowcharts.
  • For example, each step in the processing of the information processing terminal 10 and the server 20 may be processed in an order different from the order described in the flowcharts.
  • (1) An information processing device comprising: a posture information acquisition unit that acquires posture information indicating a posture of a user; a calculation unit that calculates physique feature information of the user by comparing the user's posture information with predetermined posture information; a correction unit that corrects posture information prepared as reference motion information according to the user's physique feature information; and an output unit that outputs guidance information for guiding the user's motion based on the reference motion information corrected by the correction unit.
  • (2) The information processing device according to (1), wherein the calculation unit calculates, as the physique feature information of the user, differences between the pieces of skeleton information included in the posture information and the corresponding pieces of skeleton information included in the predetermined posture information.
  • The information processing device, wherein the correction unit corrects the posture information prepared as the reference motion information according to physical information or health information of the user that is not included in the posture information.
  • The information processing device, wherein the output unit outputs, as the guidance information, display information obtained by superimposing the reference motion information corrected by the correction unit and the posture information of the user.
  • The information processing device, further comprising a motion comparison unit that compares the user's posture information with the reference motion information corrected by the correction unit, wherein the output unit outputs guidance information for guiding the user's motion based on the result of the comparison by the motion comparison unit.
  • The information processing device according to any one of (1) to (7), wherein the reference motion information includes posture information of another user.
  • (9) The information processing device according to any one of (1) to (8), further comprising an object information acquisition unit that acquires feature information of an object and motion information of the object, wherein the correction unit corrects the reference motion information according to the feature information of the object and the motion information of the object.
  • (10) The information processing device according to any one of (1) to (9), wherein the correction unit corrects the reference motion information according to a weight parameter set for each part of the user, in addition to the physique feature information.
  • (11) The information processing device according to (10), wherein the weight parameter is set for each event of the reference motion information.
  • (12) A computer-implemented information processing method comprising: obtaining posture information indicating a posture of a user; calculating physique feature information of the user by comparing the user's posture information with predetermined posture information; correcting posture information prepared as reference motion information according to the user's physique feature information; and outputting guidance information for guiding the user's motion based on the corrected reference motion information.
  • (13) A program for causing a computer to realize: a posture information acquisition function for acquiring posture information indicating a posture of a user; a calculation function for calculating physique feature information of the user by comparing the user's posture information with predetermined posture information; a correction function for correcting posture information prepared as reference motion information according to the user's physique feature information; and an output function for outputting guidance information for guiding the user's motion based on the reference motion information corrected by the correction function.

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Medical Informatics (AREA)
  • Physics & Mathematics (AREA)
  • Dentistry (AREA)
  • Biophysics (AREA)
  • Pathology (AREA)
  • Engineering & Computer Science (AREA)
  • Veterinary Medicine (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • Public Health (AREA)
  • Physical Education & Sports Medicine (AREA)
  • Physiology (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)

Abstract

[Problem] To provide a novel and improved information processing device, information processing method, and program capable of providing personalized feedback according to physique feature information about a user. [Solution] Provided is an information processing device comprising: a posture information acquisition unit that acquires posture information indicating the posture of a user; a calculation unit that calculates physique feature information about the user through a comparison between the posture information about the user and predetermined posture information; a correction unit that corrects posture information prepared as reference movement information according to the physique feature information about the user; and an output unit that outputs guidance information for guiding movements by the user on the basis of the reference movement information corrected by the correction unit.

Description

Information processing device, information processing method, and program
 The present disclosure relates to an information processing device, an information processing method, and a program.
 In recent years, technology has been developed that uses a sensor to acquire a user's posture information during exercise and provides feedback for improving the user's posture according to that posture information.
 For example, Patent Literature 1 discloses a technique for one-leg standing exercise that evaluates the exercise posture based on how high and for how long the user raises a leg, and feeds the evaluation result back to the user. It also discloses that personalized information manually entered by the user, such as height and weight, is used when evaluating the exercise posture.
 Patent Literature 1: JP 2017-38959 A
 However, the technique described in Patent Literature 1 does not recognize physique feature information arising from the user's build, such as the length of each body part or the presence or absence of limbs, and therefore does not use physique feature information, which differs from user to user, when evaluating the exercise posture.
 Therefore, the present disclosure proposes a new and improved information processing device, information processing method, and program capable of providing personalized feedback according to the user's physique feature information.
 According to the present disclosure, there is provided an information processing device including: a posture information acquisition unit that acquires posture information indicating a posture of a user; a calculation unit that calculates physique feature information of the user by comparing the user's posture information with predetermined posture information; a correction unit that corrects posture information prepared as reference motion information according to the user's physique feature information; and an output unit that outputs guidance information for guiding the user's motion based on the reference motion information corrected by the correction unit.
 Also according to the present disclosure, there is provided a computer-implemented information processing method including: obtaining posture information indicating a posture of a user; calculating physique feature information of the user by comparing the user's posture information with predetermined posture information; correcting posture information prepared as reference motion information according to the user's physique feature information; and outputting guidance information for guiding the user's motion based on the corrected reference motion information.
 Further according to the present disclosure, there is provided a program that causes a computer to realize: a posture information acquisition function for acquiring posture information indicating a posture of a user; a calculation function for calculating physique feature information of the user by comparing the user's posture information with predetermined posture information; a correction function for correcting posture information prepared as reference motion information according to the user's physique feature information; and an output function for outputting guidance information for guiding the user's motion based on the reference motion information corrected by the correction function.
 FIG. 1 is an explanatory diagram showing an information processing system according to an embodiment of the present disclosure. FIG. 2 is an explanatory diagram for explaining a functional configuration example of the information processing terminal 10 according to the present disclosure. FIG. 3 is an explanatory diagram for explaining a functional configuration example of the server 20 according to the present disclosure. FIG. 4 is an explanatory diagram for explaining an example of skeleton data, which is the user's posture information. FIG. 5 is an explanatory diagram for explaining an example of object detection according to the present disclosure. FIG. 6 is an explanatory diagram for explaining an example of a method for calculating the user's physique feature information. FIG. 7 is an explanatory diagram for explaining an example of correction of reference motion information. FIG. 8 is an explanatory diagram for explaining Feedback Example 1 according to the present disclosure. FIG. 9 is an explanatory diagram for explaining Feedback Example 2 according to the present disclosure. FIG. 10 is an explanatory diagram for explaining Feedback Example 3 according to the present disclosure. FIG. 11 is an explanatory diagram for explaining Feedback Example 4 according to the present disclosure. FIG. 12 is an explanatory diagram for explaining Feedback Example 5 according to the present disclosure. FIG. 13 is an explanatory diagram for explaining an operation processing example related to feedback of the information processing terminal 10 according to the present disclosure. FIG. 14 is an explanatory diagram for explaining an operation processing example of the information processing terminal 10 when another user's motion is acquired as reference motion information. FIG. 15 is a block diagram showing the hardware configuration of the information processing terminal 10.
 Preferred embodiments of the present disclosure will be described in detail below with reference to the accompanying drawings. Note that, in the present specification and the drawings, constituent elements having substantially the same functional configuration are denoted by the same reference numerals, and redundant description thereof is omitted.
 In addition, the "Description of Embodiments" will be given in the following order.
  1. Overview of information processing system
  2. Functional configuration examples
   2-1. Functional configuration example of information processing terminal
   2-2. Functional configuration example of server
  3. Details
   3-1. Posture estimation
   3-2. Object detection
   3-3. Calculation of physique feature information
   3-4. Correction of reference motion information
   3-5. Feedback examples
  4. Operation examples
   4-1. First operation example of information processing terminal
   4-2. Second operation example of information processing terminal
  5. Examples of effects
  6. Hardware configuration example
  7. Supplement
 <<1. Overview of information processing system>>
 As posture information of a user, skeleton data expressed by a skeleton structure indicating the structure of a body is used, for example, in order to visualize information on the movement of a moving body such as a human or an animal. The skeleton data includes information on parts. Note that the parts in the skeleton structure correspond to, for example, end parts and joint parts of the body. The skeleton data may also include bones, which are line segments connecting the parts. Although the bones in the skeleton structure may correspond to, for example, human bones, the positions and the number of the bones do not necessarily have to match the actual human skeleton.
 The position and posture of each part in the skeleton data can be acquired by a sensor that detects the movement of the user. For example, there are a technique of detecting the position and posture of each part of the body on the basis of time-series data of image data acquired by an imaging sensor, and a technique of attaching a motion sensor to a part of the body and acquiring position information of the motion sensor on the basis of time-series data acquired by the motion sensor.
 The uses of skeleton data are diverse. For example, time-series data of skeleton data is used for form improvement in sports and for applications such as VR (Virtual Reality) or AR (Augmented Reality). Time-series data of skeleton data is also used to generate an avatar video imitating the movement of a user and to distribute the avatar video.
 In the following, as an embodiment of the present disclosure, a configuration example of an information processing system will be described that calculates physique feature information of a user from skeleton data calculated from time-series data of the movement of the user's whole body, and corrects reference motion information in accordance with the physique feature information. Note that, although a human will mainly be described below as an example of a moving body, the embodiment of the present disclosure is similarly applicable to other moving bodies such as animals and robots.
 FIG. 1 is an explanatory diagram showing an information processing system according to an embodiment of the present disclosure. As shown in FIG. 1, the information processing system according to an embodiment of the present disclosure includes an information processing terminal 10 and a server 20.
 The information processing terminal 10 and the server 20 are connected via a network 1. The network 1 is a wired or wireless transmission path for information transmitted from devices connected to the network 1. For example, the network 1 may include a public line network such as the Internet, a telephone network, or a satellite communication network, and various LANs (Local Area Networks) and WANs (Wide Area Networks) including Ethernet (registered trademark). The network 1 may also include a dedicated line network such as an IP-VPN (Internet Protocol-Virtual Private Network).
 (Information processing terminal 10)
 The information processing terminal 10 is an example of an information processing device. The information processing terminal 10 acquires posture information indicating the posture of a user U, and calculates physique feature information of the user U by comparing the posture information with predetermined posture information.
 Further, the information processing terminal 10 corrects posture information prepared as reference motion information by the server 20 in accordance with the physique feature information of the user U, and outputs guidance information for guiding a motion of the user U on the basis of the corrected reference motion information.
 Note that, although FIG. 1 shows a smartphone as the information processing terminal 10, the information processing terminal 10 may be another information processing device such as a notebook PC (Personal Computer) or a desktop PC.
 (Server 20)
 The server 20 holds standard skeleton data containing predetermined skeleton information. The server 20 also holds time-series data of posture information prepared as at least one piece of reference motion information.
 For example, the server 20 transmits the above-described standard skeleton data or reference motion information to the information processing terminal 10 in response to a request signal received from the information processing terminal 10.
 The overview of the information processing system according to the present disclosure has been described above. Next, functional configuration examples of the information processing terminal 10 and the server 20 according to the present disclosure will be described.
 <<2. Functional configuration examples>>
 <2-1. Functional configuration example of information processing terminal>
 FIG. 2 is an explanatory diagram for explaining a functional configuration example of the information processing terminal 10 according to the present disclosure. As shown in FIG. 2, the information processing terminal 10 includes a data acquisition unit 110, an operation display unit 120, a communication unit 130, a storage unit 140, and a control unit 150.
 (Data acquisition unit 110)
 The data acquisition unit 110 acquires motion information of the user and of objects. For example, the data acquisition unit 110 may be an imaging sensor, a ToF (Time of Flight) sensor, or an antenna that transmits and receives radio waves. In this specification, an imaging sensor will be described as the main example of the data acquisition unit 110.
 (Operation display unit 120)
 The operation display unit 120 has a function as a display unit that displays guidance information for guiding the motion of the user under the control of the display control unit 171. The operation display unit 120 also has a function as an operation unit with which the user inputs physical information and health information, which will be described later.
 The operation display unit 120 further has a function as an operation unit with which the user inputs a sport event, a point to focus on, and the like. For example, the user may input or select "baseball" and "swing" on the operation display unit 120. In this case, the information processing terminal 10 according to the present disclosure may output guidance information for guiding the motion of the user on the basis of posture information obtained when the user swings and reference motion information prepared as a model of the swing.
 The function as the display unit is realized by, for example, a CRT (Cathode Ray Tube) display device, a liquid crystal display (LCD) device, or an OLED (Organic Light Emitting Diode) device.
 The function as the operation unit is realized by, for example, a touch panel, a keyboard, or a mouse.
 Note that, although the information processing terminal 10 in FIG. 1 has a configuration in which the functions of the display unit and the operation unit are integrated, it may have a configuration in which the functions of the display unit and the operation unit are separated.
 (Communication unit 130)
 The communication unit 130 communicates various kinds of information with the server 20 via the network 1. For example, the communication unit 130 transmits, to the server 20, a request signal requesting standard skeleton data or reference motion information. The communication unit 130 also receives the standard skeleton data or the reference motion information transmitted from the server 20 in response to the request signal.
 (Storage unit 140)
 The storage unit 140 holds software and various kinds of data. For example, the storage unit 140 holds the physique feature information of the user.
 (Control unit 150)
 The control unit 150 controls the overall operation of the information processing terminal 10. As shown in FIG. 2, the control unit 150 includes a posture estimation unit 151, an object detection unit 155, a physique feature information calculation unit 159, a conversion unit 163, a motion comparison unit 167, and a display control unit 171.
 The posture estimation unit 151 estimates part information indicating the position and posture of each part of the user on the basis of the time-series data acquired by the data acquisition unit 110.
 The posture estimation unit 151 then generates skeleton data including position information and posture information of each part in the skeleton structure on the basis of the part information. Details of the skeleton data will be described later.
 The object detection unit 155 detects object information and object motion information on the basis of the time-series data acquired by the data acquisition unit 110.
 The physique feature information calculation unit 159 calculates the physique feature information of the user by comparing the skeleton data of the user generated by the posture estimation unit 151 with the standard skeleton data held by the standard skeleton data storage unit 231. A specific example of the method of calculating the physique feature information will be described later.
 The conversion unit 163 is an example of a correction unit, and corrects the reference motion information acquired from the server 20 in accordance with the physique feature information of the user. In the following description, reference motion information corrected in accordance with the physique feature information of the user may be referred to as corrected reference motion information.
 The conversion unit 163 may also convert the skeletal features of each part of the skeleton data of the user into the skeletal features of each part of the standard skeleton data. Note that the skeletal features in this specification include the lengths and thicknesses of bones.
 The conversion unit 163 may also convert the skeletal features of each part of the plurality of pieces of skeleton data included in the corrected reference motion information into the skeletal features of each part of the skeleton data of the user.
 The motion comparison unit 167 compares user motion information, which is time-series data of the skeleton data of the user, with the reference motion information converted by the conversion unit 163, and outputs the result of the comparison to the conversion unit 163 or the display control unit 171.
 For example, the motion comparison unit 167 may compare one frame of the user motion information with one frame of the reference motion information converted by the conversion unit 163, and output the result of the comparison to the conversion unit 163 or the display control unit 171.
 The display control unit 171 is an example of an output unit, and causes the operation display unit 120 to display guidance information for guiding the motion of the user on the basis of the reference motion information corrected by the conversion unit 163. Specific examples of the guidance information will be described later.
 The functional configuration example of the information processing terminal 10 has been described above. Next, a functional configuration example of the server 20 will be described with reference to FIG. 3.
 <2-2. Functional configuration example of server>
 FIG. 3 is an explanatory diagram for explaining a functional configuration example of the server 20 according to the present disclosure. As shown in FIG. 3, the server 20 includes a communication unit 210, a control unit 220, and a storage unit 230.
 (Communication unit 210)
 The communication unit 210 communicates various kinds of information with the information processing terminal 10 via the network 1. For example, the communication unit 210 receives, from the information processing terminal 10, a request signal requesting standard skeleton data or reference motion information. In addition, under the control of the control unit 220, the communication unit 210 transmits the standard skeleton data or the reference motion information to the information processing terminal 10 in response to the request signal received from the information processing terminal 10.
 (Control unit 220)
 The control unit 220 controls the overall operation of the server 20. For example, the control unit 220 causes the communication unit 210 to transmit various kinds of information held in the storage unit 230 in response to a request signal received from the information processing terminal 10.
 (Storage unit 230)
 The storage unit 230 holds software and various kinds of data. As shown in FIG. 3, the storage unit 230 includes a standard skeleton data storage unit 231 and a reference motion storage unit 235.
 The standard skeleton data storage unit 231 holds standard skeleton data in which each part has predetermined skeletal features.
 Note that the standard skeleton data is an example of the predetermined posture information, but the predetermined posture information according to the present disclosure is not limited to this example. For example, the predetermined posture information according to the present disclosure may be part information having predetermined skeletal features.
 The reference motion storage unit 235 holds reference motion information as time-series data of skeleton data containing a certain motion. Details of the reference motion information will be described later.
 The functional configuration examples according to the present disclosure have been described above. Next, details of the processing according to the present disclosure will be described in order with reference to FIGS. 4 to 13.
 <<3. Details>>
 <3-1. Posture estimation>
 FIG. 4 is an explanatory diagram for explaining an example of skeleton data including posture information of a user. The posture estimation unit 151 acquires skeleton data US including position information and posture information of each part in the skeleton structure, for example, on the basis of the time-series data acquired by the data acquisition unit 110.
 For example, the posture estimation unit 151 may generate the skeleton data US of the user U using a machine learning technique such as a DNN (Deep Neural Network). More specifically, the posture estimation unit 151 generates the skeleton data US of the user U using, for example, an estimator obtained by a machine learning technique that uses, as teacher data, sets of image data obtained by photographing a person and corresponding skeleton data.
 Note that the skeleton data US may include bone information (position information, posture information, skeletal feature information, and the like) in addition to the information on the parts.
 Note also that, although FIG. 4 shows skeleton data of the whole body of the user U, the posture estimation unit 151 may generate skeleton data of only the necessary parts.
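 As a rough, non-authoritative illustration of how such skeleton data could be organized in software, the following Python sketch packages estimated joint positions into per-part skeleton data with derived bone lengths. The joint names, the bone list, and the SkeletonFrame structure are assumptions made for this sketch, not the structure used in the present disclosure.

from dataclasses import dataclass
from typing import Dict, List, Tuple

# Hypothetical joint names; the actual skeleton structure may differ.
JOINTS = ["head", "neck", "r_shoulder", "r_elbow", "r_wrist",
          "l_shoulder", "l_elbow", "l_wrist", "pelvis",
          "r_knee", "r_ankle", "l_knee", "l_ankle"]

# Bones are line segments connecting two parts (parent, child).
BONES: List[Tuple[str, str]] = [("neck", "head"), ("neck", "r_shoulder"),
                                ("r_shoulder", "r_elbow"), ("r_elbow", "r_wrist"),
                                ("neck", "l_shoulder"), ("l_shoulder", "l_elbow"),
                                ("l_elbow", "l_wrist"), ("neck", "pelvis"),
                                ("pelvis", "r_knee"), ("r_knee", "r_ankle"),
                                ("pelvis", "l_knee"), ("l_knee", "l_ankle")]

@dataclass
class SkeletonFrame:
    """Skeleton data for one frame: part positions plus derived bone lengths."""
    positions: Dict[str, Tuple[float, float]]   # part name -> (x, y)
    bone_lengths: Dict[Tuple[str, str], float]  # bone -> length

def to_skeleton(joint_positions: Dict[str, Tuple[float, float]]) -> SkeletonFrame:
    """Build a SkeletonFrame from joint positions produced by a pose estimator."""
    lengths = {}
    for parent, child in BONES:
        if parent not in joint_positions or child not in joint_positions:
            continue  # parts not estimated (e.g. only the necessary parts are generated)
        px, py = joint_positions[parent]
        cx, cy = joint_positions[child]
        lengths[(parent, child)] = ((px - cx) ** 2 + (py - cy) ** 2) ** 0.5
    return SkeletonFrame(positions=joint_positions, bone_lengths=lengths)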
 <3-2. Object detection>
 FIG. 5 is an explanatory diagram for explaining an example of object detection according to the present disclosure. The object detection unit 155 detects object information and object motion information of various objects o on the basis of the time-series data acquired by the data acquisition unit 110. Note that the object information includes the type of the object (a ball, a racket, or the like).
 For example, as shown in FIG. 5, the object detection unit 155 recognizes a wheelchair o1 in which the user U rides, a racket o2 held by the user, and a ball o3 on the basis of the time-series data acquired by the data acquisition unit 110, and further detects the motion direction and size of each object.
 The object detection unit 155 may also detect the relationship between each of the objects o1 to o3 and the user U. For example, when the user U rides in a wheelchair, the object detection unit 155 may recognize the wheelchair, which is related to the movement of the user, as part of the user's body. In this case, the posture estimation unit 151 does not need to generate, for example, skeleton data of the lower limbs of the user U.
 In addition, when the user inputs a sport event using the operation display unit 120, the object detection unit 155 may detect objects corresponding to the input sport event. For example, when the user inputs "baseball" using the operation display unit 120, the object detection unit 155 may detect objects related to baseball (for example, a bat, a ball, and the like).
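 A minimal sketch of the event-dependent filtering described above is shown below. The DetectedObject fields and the mapping from a sport event to relevant object types are illustrative assumptions; the disclosure does not specify this representation.

from dataclasses import dataclass
from typing import List, Optional, Tuple

@dataclass
class DetectedObject:
    kind: str                                   # e.g. "ball", "racket", "wheelchair"
    box: Tuple[float, float, float, float]      # bounding box (x, y, w, h)
    velocity: Tuple[float, float]               # per-frame displacement of the box center

# Hypothetical mapping from a sport event entered by the user to object types worth tracking.
SPORT_OBJECTS = {"baseball": {"ball", "bat"},
                 "tennis": {"ball", "racket", "wheelchair"}}

def filter_by_event(detections: List[DetectedObject], event: Optional[str]) -> List[DetectedObject]:
    """Keep only the detected objects relevant to the sport event entered by the user."""
    if event is None or event not in SPORT_OBJECTS:
        return detections
    wanted = SPORT_OBJECTS[event]
    return [d for d in detections if d.kind in wanted]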
 <3-3. Calculation of physique feature information>
 FIG. 6 is an explanatory diagram for explaining an example of a method of calculating physique feature information of a user. The physique feature information calculation unit 159 calculates the physique feature information of the user by comparing the posture information of each part of the user estimated by the posture estimation unit 151 with the predetermined posture information held by the standard skeleton data storage unit 231.
 For example, the physique feature information calculation unit 159 calculates, as the physique feature information of the user, the differences between each piece of skeletal feature information included in the skeleton data US of the user and the corresponding piece of skeletal feature information included in the standard skeleton data RS.
 More specifically, the physique feature information calculation unit 159 may calculate, as the physique feature information of the user, the difference D between the length of a bone US1 of the right arm included in the skeleton data US of the user and the corresponding bone RS1 of the right arm included in the standard skeleton data RS.
 The physique feature information calculation unit 159 may also calculate the difference between each piece of skeletal feature information of a plurality of parts included in the skeleton data US of the user and the corresponding piece of skeletal feature information of the plurality of parts included in the standard skeleton data RS, and calculate the physique feature information of the user on the basis of the calculated differences.
 The physique feature information calculation unit 159 may also modify the physique feature information of the user in accordance with physical information or health information of the user that is not included in the skeleton information of the skeleton data. The physical information includes, for example, information about the user's body such as height, weight, or muscle mass. The health information includes, for example, information about the user's health history such as medical history, age, and physical disabilities. Note that information input via the operation display unit 120 may be used as the physical information and the health information of the user.
 The physique feature information calculation unit 159 may also compare the position of the center of gravity of the skeleton data US of the user with the position of the center of gravity of the standard skeleton data RS, and calculate the physique feature information of the user on the basis of the result of the comparison.
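 A minimal sketch of the bone-length comparison described in this section is shown below, reusing the SkeletonFrame bone-length dictionaries from the earlier sketch. Representing the physique feature per bone as both an absolute difference D and a length ratio is an assumption for illustration; the disclosure only states that differences are calculated.

from typing import Dict, Tuple

Bone = Tuple[str, str]

def physique_features(user_lengths: Dict[Bone, float],
                      standard_lengths: Dict[Bone, float]) -> Dict[Bone, Dict[str, float]]:
    """Compare the user's bone lengths with the standard skeleton's bone lengths.

    Returns, per bone, the absolute difference D and the length ratio, which can
    later be used to adapt reference motion data to the user's physique.
    """
    features = {}
    for bone, std_len in standard_lengths.items():
        user_len = user_lengths.get(bone)
        if user_len is None or std_len == 0.0:
            continue  # bone not observed for this user (e.g. lower limbs of a wheelchair user)
        features[bone] = {"difference": user_len - std_len,
                          "ratio": user_len / std_len}
    return features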
 An example of the method of calculating the physique feature information of the user has been described above; the physique feature information calculation unit 159 may also calculate the physique feature information of the user by combining a plurality of the calculation methods described above. Next, an example of a correction method of correcting each of the plurality of pieces of reference skeleton data included in the reference motion information in accordance with the physique feature information of the user will be described.
 <3-4. Correction of reference motion information>
 FIG. 7 is an explanatory diagram for explaining an example of correction of reference motion information. The conversion unit 163 executes conversion processing C1, which converts the skeletal feature information (for example, the lengths and thicknesses of bones) of each part of the plurality of pieces of reference skeleton data CS1 included in the reference motion information into the skeletal feature information of each part of the standard skeleton data, and generates converted skeleton data CS2.
 Subsequently, the conversion unit 163 executes correction processing C2, which corrects the converted skeleton data CS2 in accordance with the physique feature information, the physical information, and the health information of the user calculated by the physique feature information calculation unit 159, and generates corrected skeleton data CS3.
 The conversion unit 163 also executes the conversion processing C1 of converting the skeletal feature information of each part of the skeleton data of the user into the skeletal feature information of the standard skeleton data. This enables the motion comparison unit 167 to compare motions between pieces of skeleton data having the same skeletal feature information.
 Alternatively, the conversion processing C1 may be processing of converting each piece of skeletal feature information of the plurality of pieces of skeleton data included in the reference motion information into the skeletal feature information included in the skeleton data of the user. In this case, the conversion unit 163 does not need to execute the conversion processing C1 on the skeleton data of the user.
 Further, the conversion unit 163 may execute the correction processing C2 of correcting the converted skeleton data CS2 on the basis of weight parameters set for each event of the reference motion information, in addition to the physique feature information of the user, and generate the corrected skeleton data CS3. For example, when the user practices batting, the weight parameters of the parts of the user that have a particularly large influence (for example, the arms and the hips) may be set large, and the weight parameters of the parts of the user that have a small influence (for example, the head) may be set small.
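 The following sketch illustrates one way the conversion C1 and the correction C2 could be combined for bone lengths: reference bone lengths are first normalized to the standard skeleton and then adapted toward the user's physique, with per-bone weight parameters controlling how strongly each part is adjusted. The linear blending and the weighting scheme are assumptions made for illustration, not the disclosed implementation.

from typing import Dict, Tuple

Bone = Tuple[str, str]

def correct_reference_lengths(reference_lengths: Dict[Bone, float],
                              standard_lengths: Dict[Bone, float],
                              user_ratio: Dict[Bone, float],
                              weights: Dict[Bone, float]) -> Dict[Bone, float]:
    """Apply conversion C1 followed by correction C2 to one frame of reference motion.

    C1: rescale each reference bone to the standard skeleton's length.
    C2: blend that length toward the user's physique (ratio to the standard),
        using a per-bone weight in [0, 1] set for the training event.
    """
    corrected = {}
    for bone, ref_len in reference_lengths.items():
        std_len = standard_lengths.get(bone, ref_len)            # C1: normalize to standard
        ratio = user_ratio.get(bone, 1.0)                        # user length / standard length
        w = weights.get(bone, 1.0)                               # larger for influential parts
        corrected[bone] = std_len * (1.0 + w * (ratio - 1.0))    # C2: adapt toward the user
    return corrected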
 The example of correction of reference motion information according to the present disclosure has been described above. Next, specific examples of feedback based on the corrected reference motion information provided by the display control unit 171 will be described.
 <3-5. Feedback examples>
 The information processing terminal 10 according to the present disclosure can provide a wide variety of feedback to the user. Specific examples of the feedback will be described in order below with reference to FIGS. 8 to 12.
 (Feedback Example 1)
 FIG. 8 is an explanatory diagram for explaining Feedback Example 1 according to the present disclosure. As shown in FIG. 8, the display control unit 171 may cause the operation display unit 120 to display a video in which the skeleton data US of the user U and reference skeleton data CS are superimposed on the displayed user. More specifically, the reference skeleton data CS in the feedback examples is the reference skeleton data obtained after the corrected skeleton data CS3 shown in FIG. 7 has been converted by the conversion unit 163 into the skeletal feature information of each part included in the skeleton data of the user.
 In addition, the motion comparison unit 167 compares the time-series data of the skeleton data US of the user U with the reference skeleton data CS, and outputs, for example, the direction and distance of the difference between the time-series data of the skeleton data US and the reference skeleton data CS to the display control unit 171 as the comparison result. The display control unit 171 may then cause the operation display unit 120 to display arrows or the like indicating the direction and distance of the difference between the skeleton data US of the user and the reference skeleton data CS. The display of such reference skeleton data CS and arrows is an example of guidance information for guiding the posture of the user.
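 To make the comparison in Feedback Example 1 concrete, the following sketch computes, per part, the difference vector between the user's current frame and the corrected reference frame, which could then be drawn as guidance arrows. The minimum-distance threshold and the tuple layout are illustrative assumptions.

from typing import Dict, List, Tuple

Point = Tuple[float, float]

def guidance_arrows(user_frame: Dict[str, Point],
                    reference_frame: Dict[str, Point],
                    min_distance: float = 5.0) -> List[Tuple[str, Point, float]]:
    """For each part, return (part name, direction vector, distance) whenever the
    user's joint is farther than min_distance from the reference joint."""
    arrows = []
    for part, (rx, ry) in reference_frame.items():
        if part not in user_frame:
            continue
        ux, uy = user_frame[part]
        dx, dy = rx - ux, ry - uy
        dist = (dx ** 2 + dy ** 2) ** 0.5
        if dist >= min_distance:
            arrows.append((part, (dx, dy), dist))  # arrow points from the user toward the reference
    return arrows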
 (Feedback Example 2)
 FIG. 9 is an explanatory diagram for explaining Feedback Example 2 according to the present disclosure. The information processing terminal 10 may acquire, from a database, the relationship between the reference motion information and the muscle load amount of each part. For example, as shown in FIG. 9, when the user performs an exercise of raising and lowering a dumbbell, the conversion unit 163 may correct the muscle load amount of each part corresponding to the reference motion information acquired from the database in accordance with the physique feature information of the user. The display control unit 171 then causes the operation display unit 120 to display the muscle load amount of each part of the user as guidance information. This allows the user to check the degree of achievement toward the muscle mass set in advance as a target, and to consider training events according to the degree of achievement.
 (Feedback Example 3)
 FIG. 10 is an explanatory diagram for explaining Feedback Example 3 according to the present disclosure. The conversion unit 163 may correct the position, motion direction, speed, or the like of the ball included in a heading motion as the reference motion information in accordance with the physique feature information of the user, the object information, and the object motion information.
 The motion comparison unit 167 then compares the skeleton data of the user with the position, motion direction, speed, or the like of the ball included in each piece of the reference motion information, and outputs various kinds of information such as the impact point on, direction of, or force applied to the ball to the display control unit 171 as the comparison result. The display control unit 171 may then cause the operation display unit 120 to display, as guidance information, the various kinds of information such as the impact point, direction, or force input from the motion comparison unit 167.
 (Feedback Example 4)
 FIG. 11 is an explanatory diagram for explaining Feedback Example 4 according to the present disclosure. The solid-line skeleton data in FIG. 11 is the skeleton data of the user, and the broken-line skeleton data is the reference skeleton data. For example, when the user has performed the same motion a plurality of times, the display control unit 171 may cause the operation display unit 120 to display a video in which the time-series data of the skeleton data of the user generated over the plurality of times and the reference motion information are superimposed. This allows the user to check habits and the like included in the user's motion.
 The display control unit 171 may also cause the operation display unit 120 to display a video in which time-series data of a single piece of skeleton data, obtained by averaging the pieces of skeleton data generated while the user performs the same motion a plurality of times, and the reference motion information are superimposed on the user. Note that such display of the reference motion information is an example of the guidance information.
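 One simple way to realize the averaging described in Feedback Example 4 is to align the repetitions frame by frame and average each joint position, as in the sketch below. Aligning by simple truncation to the shortest repetition is an assumption made for brevity; time warping or resampling could be used instead.

from typing import Dict, List, Tuple

Point = Tuple[float, float]
Frame = Dict[str, Point]          # part name -> (x, y)
Sequence = List[Frame]            # one repetition of the motion

def average_repetitions(repetitions: List[Sequence]) -> Sequence:
    """Average several repetitions of the same motion into one skeleton sequence."""
    if not repetitions:
        return []
    n_frames = min(len(rep) for rep in repetitions)  # truncate to the shortest repetition
    averaged: Sequence = []
    for t in range(n_frames):
        frame: Frame = {}
        for part in repetitions[0][t]:
            xs = [rep[t][part][0] for rep in repetitions if part in rep[t]]
            ys = [rep[t][part][1] for rep in repetitions if part in rep[t]]
            frame[part] = (sum(xs) / len(xs), sum(ys) / len(ys))
        averaged.append(frame)
    return averaged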
 (Feedback Example 5)
 FIG. 12 is an explanatory diagram for explaining Feedback Example 5 according to the present disclosure. The conversion unit 163 may correct the way the ball is hit in the skeleton data included in the reference motion information in accordance with the physique feature information of the user, the use of a wheelchair, the shape and type of the racket, and the like.
 The motion comparison unit 167 then compares the skeleton data of the user with the reference motion information, and outputs the impact range of the racket and the motion direction of the wheelchair to the display control unit 171 as the comparison result. The display control unit 171 may then cause the operation display unit 120 to display the impact range of the racket and the motion direction of the wheelchair in addition to the skeleton data of the user and the extent of the wheelchair in which the user is riding. Such an impact range of the racket and motion direction of the wheelchair are examples of the guidance information.
 The details according to the present disclosure have been described above. Next, examples of the operation processing of the system according to the present disclosure will be described.
 <<4. Operation examples>>
 <4-1. First operation example of information processing terminal>
 FIG. 13 is an explanatory diagram for explaining an example of operation processing related to feedback of the information processing terminal 10 according to the present disclosure.
 First, the data acquisition unit 110 photographs the user and acquires time-series data of images (S101).
 Subsequently, the posture estimation unit 151 generates skeleton data of the user from the acquired time-series data of the images (S105).
 The object detection unit 155 also detects object information from the acquired time-series data of the images (S109).
 Next, the communication unit 130 receives the standard skeleton data held in the standard skeleton data storage unit 231 (S113).
 Then, the physique feature information calculation unit 159 compares the skeleton data of the user with the standard skeleton data, and calculates the result of the comparison as the physique feature information of the user (S117).
 Next, the conversion unit 163 converts the skeleton information included in the skeleton data of the user into the skeleton information included in the standard skeleton data (S121).
 The communication unit 130 then receives the reference motion information held in the reference motion storage unit 235 (S125).
 The conversion unit 163 then corrects the time-series data of the skeleton data included in the reference motion information in accordance with the physique feature information of the user (S129).
 The motion comparison unit 167 then compares the time-series data of the skeleton data of the user with the time-series data of the skeleton data included in the reference motion information (S133).
 The display control unit 171 then causes the operation display unit 120 to display the result of the comparison (S137), and the information processing terminal 10 ends the processing.
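 The flow of FIG. 13 can be pictured as the pipeline sketched below, which simply chains the helper functions from the earlier sketches (to_skeleton, physique_features, correct_reference_lengths). The function names, the omission of object detection (S109), and the overall orchestration are illustrative assumptions rather than the disclosed implementation.

# Hypothetical orchestration of steps S101-S137, reusing the earlier sketches.
# `joint_frames` stands in for the per-image pose estimation results (S101, S105).
def feedback_pipeline(joint_frames, standard_lengths, reference_sequence, event_weights):
    user_sequence = [to_skeleton(frame) for frame in joint_frames]
    user_lengths = user_sequence[-1].bone_lengths                              # latest frame
    features = physique_features(user_lengths, standard_lengths)               # S113, S117
    ratios = {bone: f["ratio"] for bone, f in features.items()}
    corrected_reference = [
        correct_reference_lengths(ref_frame, standard_lengths, ratios, event_weights)
        for ref_frame in reference_sequence                                    # S125, S129
    ]
    return corrected_reference, user_sequence   # compared (S133) and displayed (S137) downstream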
 Note that the reference motion information may be held in advance in the reference motion storage unit 235 as model data based on dynamics, but a motion of another user, such as an instructor or a teacher, may also be prepared as the reference motion information.
 An example of acquiring a motion of another user as the reference motion information will be described below with reference to FIG. 14.
 <4-2. Second operation example of information processing terminal>
 FIG. 14 is an explanatory diagram for explaining an example of operation processing of the information processing terminal 10 when a motion of another user is acquired as reference motion information.
 First, the data acquisition unit 110 photographs the other user and acquires time-series data of images (S201).
 Subsequently, the posture estimation unit 151 generates skeleton data of the other user from the acquired time-series data of the images (S205).
 The object detection unit 155 also detects object information from the acquired time-series data of the images (S209).
 Next, the communication unit 130 receives the standard skeleton data held in the standard skeleton data storage unit 231 (S213).
 Then, the physique feature information calculation unit 159 compares the skeleton data of the other user with the standard skeleton data, and calculates the result of the comparison as the physique feature information of the other user (S217).
 Next, the conversion unit 163 corrects the skeleton data of the other user on the basis of the standard skeleton data and the physique feature information of the other user (S221).
 The conversion unit 163 then corrects the corrected skeleton data of the other user in accordance with the physique feature information of the user (S225).
 The motion comparison unit 167 then compares the time-series data of the skeleton data of the user with the corrected time-series data of the skeleton data of the other user (S229).
 The display control unit 171 then causes the operation display unit 120 to display the result of the comparison (S233), and the information processing terminal 10 ends the processing.
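 The double correction in S221 and S225 can be pictured as first removing the other user's physique from the captured motion and then imposing the target user's physique, for example by composing the two bone-length ratios as sketched below. The composition by ratio is an assumption made for illustration.

from typing import Dict, Tuple

Bone = Tuple[str, str]

def retarget_other_user_lengths(other_lengths: Dict[Bone, float],
                                other_ratio: Dict[Bone, float],
                                user_ratio: Dict[Bone, float]) -> Dict[Bone, float]:
    """Normalize the other user's bone lengths to the standard skeleton (S221),
    then scale them to the target user's physique (S225)."""
    retargeted = {}
    for bone, length in other_lengths.items():
        standard_length = length / other_ratio.get(bone, 1.0)          # undo the other user's physique
        retargeted[bone] = standard_length * user_ratio.get(bone, 1.0) # apply the target user's physique
    return retargeted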
 Examples of the operation processing of the system according to the present disclosure have been described above. Next, examples of effects according to the present disclosure will be described.
 <<5. Examples of effects>>
 According to the present disclosure described above, various effects are obtained. For example, the information processing terminal 10 provides the user with guidance information based on reference motion information corrected in accordance with the physique feature information of the user. The information processing terminal 10 can therefore provide the user with guidance information that takes into account the range of motion of each part, which may differ from user to user.
 In addition, the reference motion information is corrected in accordance with the physical information or health information of the user. As a result, the information processing terminal 10 can provide the user with guidance information based on reference motion information corrected to a range that the user can achieve, according to, for example, the user's age or medical history.
 The information processing terminal 10 also compares the motion information of the user with the reference motion information. As a result, the information processing terminal 10 can provide the user with guidance information that quantifies the difference between the motion information of the user and the target reference motion information.
 The reference motion information may also be time-series data of posture information of another user. This allows the user to perform various kinds of training with, as a target, the motion information of a leader such as a teacher or an instructor.
 The information processing terminal 10 may also detect object information and object motion information, and correct the reference motion information in accordance with the detected information. As a result, the information processing terminal 10 can provide the user with guidance information that takes into account the influence that, for example, a wheelchair, a prosthetic arm, or a prosthetic leg may have when the user trains.
 The information processing terminal 10 may also correct the reference motion information in accordance with weight parameters, set for each training event, for the parts of the user. As a result, the information processing terminal 10 can provide the user with guidance information that reflects the degree of importance of each part, which may differ depending on the training event.
 <<6. Hardware configuration example>>
 The embodiments of the present disclosure have been described above. The information processing described above, such as the generation of skeleton data and the extraction of feature amounts, is realized by the cooperation of software and the hardware of the information processing terminal 10 described below. Note that the hardware configuration described below is also applicable to the server 20.
 FIG. 15 is a block diagram showing the hardware configuration of the information processing terminal 10. The information processing terminal 10 includes a CPU (Central Processing Unit) 1001, a ROM (Read Only Memory) 1002, a RAM (Random Access Memory) 1003, and a host bus 1004. The information processing terminal 10 also includes a bridge 1005, an external bus 1006, an interface 1007, an input device 1008, an output device 1010, a storage device (HDD) 1011, a drive 1012, and a communication device 1015.
 The CPU 1001 functions as an arithmetic processing device and a control device, and controls the overall operation in the information processing terminal 10 according to various programs. The CPU 1001 may also be a microprocessor. The ROM 1002 stores programs, calculation parameters, and the like used by the CPU 1001. The RAM 1003 temporarily stores programs used in the execution by the CPU 1001, parameters that change as appropriate during the execution, and the like. These are interconnected by the host bus 1004, which includes a CPU bus and the like. Functions such as those of the physique feature information calculation unit 159 and the motion comparison unit 167 described with reference to FIG. 2 can be realized by the cooperation of the CPU 1001, the ROM 1002, the RAM 1003, and software.
 The host bus 1004 is connected to the external bus 1006, such as a PCI (Peripheral Component Interconnect/Interface) bus, via the bridge 1005. Note that the host bus 1004, the bridge 1005, and the external bus 1006 do not necessarily have to be configured separately, and their functions may be implemented in a single bus.
 The input device 1008 includes input means for the user to input information, such as a mouse, a keyboard, a touch panel, buttons, a microphone, switches, and levers, and an input control circuit that generates an input signal on the basis of the input by the user and outputs the input signal to the CPU 1001. By operating the input device 1008, the user of the information processing terminal 10 can input various kinds of data to the information processing terminal 10 and instruct it to perform processing operations.
 The output device 1010 includes display devices such as a liquid crystal display device, an OLED device, and lamps. The output device 1010 further includes audio output devices such as a speaker and headphones. The output device 1010 outputs, for example, reproduced content. Specifically, the display device displays various kinds of information such as reproduced video data as text or images, while the audio output device converts reproduced audio data and the like into audio and outputs the audio.
 The storage device 1011 is a device for storing data. The storage device 1011 may include a storage medium, a recording device that records data on the storage medium, a reading device that reads data from the storage medium, a deletion device that deletes data recorded on the storage medium, and the like. The storage device 1011 is configured by, for example, an HDD (Hard Disk Drive). The storage device 1011 drives a hard disk and stores programs executed by the CPU 1001 and various kinds of data.
 The drive 1012 is a reader/writer for a storage medium, and is built into or externally attached to the information processing terminal 10. The drive 1012 reads information recorded on a mounted removable storage medium 30 such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory, and outputs the information to the RAM 1003. The drive 1012 can also write information to the removable storage medium 30.
 The communication device 1015 is, for example, a communication interface configured by a communication device or the like for connecting to the network 1. The communication device 1015 may be a wireless LAN compatible communication device, an LTE (Long Term Evolution) compatible communication device, or a wire communication device that performs wired communication.
 <<7. Supplement>>
 Although the preferred embodiments of the present disclosure have been described in detail above with reference to the accompanying drawings, the present disclosure is not limited to these examples. It is obvious that a person having ordinary knowledge in the technical field to which the present disclosure belongs can conceive of various changes or modifications within the scope of the technical idea described in the claims, and it is understood that these also naturally belong to the technical scope of the present disclosure.
 For example, the posture estimation unit 151 may generate a mesh model indicating the surface shape of the user's body on the basis of the time-series data acquired by the data acquisition unit 110. The physique feature information calculation unit 159 may then calculate the difference between the mesh model of the user and a predetermined mesh model as the physique feature information of the user. The display control unit 171 may also cause the operation display unit 120 to display a screen in which the mesh model of the user and a mesh model of the reference motion information corrected in accordance with the physique feature information are superimposed on the user. This allows the user to recognize more clearly the amount of deviation between the reference motion information and the user's entire body.
 The information processing terminal 10 may further include all or part of the functional configuration of the server 20 according to the present disclosure. When the information processing terminal 10 includes the entire functional configuration of the server 20 according to the present disclosure, the information processing terminal 10 can execute the series of processes according to the present disclosure without communicating via the network 1.
 The server 20 may also acquire the skeleton data of the user generated by the information processing terminal 10. The server 20 may then calculate the physique feature information of the user and correct the reference motion information in accordance with the calculated physique feature information. The server 20 may then transmit guidance information for guiding the motion of the user to the information processing terminal 10.
 In addition, when the difference between the time-series data of the skeleton data of the user and the reference motion information is equal to or greater than a predetermined amount, the display control unit 171 may divide the degree of achievement into stepwise stages and cause the operation display unit 120 to display the guidance information accordingly. This allows the user to experience multiple achievements, which can improve the user's motivation.
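 A minimal sketch of the stepwise display decision mentioned above: when the overall deviation from the reference exceeds a threshold, intermediate target stages are generated between the user's motion and the reference. The linear spacing and the number of stages are assumptions chosen for illustration.

from typing import List

def staged_targets(deviation: float, threshold: float, n_stages: int = 3) -> List[float]:
    """Return intermediate deviation targets, shown to the user one stage at a time.

    If the deviation is below the threshold, guide directly to the reference (one stage);
    otherwise split the gap into n_stages progressively smaller targets.
    """
    if deviation < threshold:
        return [0.0]
    step = deviation / n_stages
    return [deviation - step * (i + 1) for i in range(n_stages)]  # e.g. [2/3, 1/3, 0] of the gap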
 また、本明細書の情報処理端末10およびサーバ20の処理における各ステップは、必ずしもフローチャートとして記載された順序に沿って時系列に処理する必要はない。例えば、情報処理端末10およびサーバ20の処理における各ステップは、フローチャートとして記載した順序と異なる順序で処理されてもよい。 Also, each step in the processing of the information processing terminal 10 and the server 20 in this specification does not necessarily have to be processed in chronological order according to the order described as the flowchart. For example, each step in the processing of the information processing terminal 10 and the server 20 may be processed in an order different from the order described as the flowchart.
It is also possible to create a computer program for causing hardware such as the CPU, ROM, and RAM built into the information processing terminal 10 to exhibit functions equivalent to those of the components of the information processing terminal 10 described above. A storage medium storing the computer program is also provided.
The effects described in this specification are merely illustrative or exemplary and are not limiting. In other words, the technology according to the present disclosure can produce other effects that are obvious to those skilled in the art from the description of this specification, in addition to or instead of the effects described above.
Note that the following configurations also belong to the technical scope of the present disclosure.
(1)
An information processing device comprising:
a posture information acquisition unit that acquires posture information indicating a posture of a user;
a calculation unit that calculates physique feature information of the user by comparing the posture information of the user with predetermined posture information;
a correction unit that corrects posture information prepared as reference motion information according to the physique feature information of the user; and
an output unit that outputs guidance information for guiding a motion of the user based on the reference motion information corrected by the correction unit.
(2)
The information processing device according to (1), wherein the calculation unit calculates, as the physique feature information of the user, a difference between each piece of skeleton information included in the posture information and the corresponding piece of skeleton information included in the predetermined posture information.
(3)
The information processing device according to (1) or (2), wherein the correction unit corrects the posture information prepared as the reference motion information according to physical information or health information of the user that is not included in the posture information.
(4)
The information processing device according to any one of (1) to (3), wherein the output unit outputs, as the guidance information, display information in which the reference motion information corrected by the correction unit and the posture information of the user are superimposed.
(5)
The information processing device according to any one of (1) to (3), further comprising a motion comparison unit that compares the posture information of the user with the reference motion information corrected by the correction unit, wherein the output unit outputs guidance information for guiding the motion of the user based on a result of the comparison by the motion comparison unit.
(6)
The information processing device according to (5), further comprising a posture transformation unit that applies, to the posture information of the user, a posture transformation based on the predetermined posture information, wherein the motion comparison unit compares the posture information of the user transformed by the posture transformation unit with the reference motion information corrected by the correction unit.
(7)
The information processing device according to (5) or (6), wherein the output unit outputs, as the guidance information, difference information between the posture information of the user and the reference motion information corrected by the correction unit.
(8)
The information processing device according to any one of (1) to (7), wherein the reference motion information includes posture information of another user.
(9)
The information processing device according to any one of (1) to (8), further comprising an object information acquisition unit that acquires feature information of an object and motion information of the object, wherein the correction unit corrects the reference motion information according to the feature information of the object and the motion information of the object.
(10)
The information processing device according to any one of (1) to (9), wherein the correction unit corrects the reference motion information according to a weight parameter set for each body part of the user, in addition to the physique feature information.
(11)
The information processing device according to (10), wherein the weight parameter is set for each exercise type of the reference motion information.
(12)
An information processing method executed by a computer, the method comprising:
acquiring posture information indicating a posture of a user;
calculating physique feature information of the user by comparing the posture information of the user with predetermined posture information;
correcting posture information prepared as reference motion information according to the physique feature information of the user; and
outputting guidance information for guiding a motion of the user based on the corrected reference motion information.
(13)
A program for causing a computer to realize:
a posture information acquisition function of acquiring posture information indicating a posture of a user;
a calculation function of calculating physique feature information of the user by comparing the posture information of the user with predetermined posture information;
a correction function of correcting posture information prepared as reference motion information according to the physique feature information of the user; and
an output function of outputting guidance information for guiding a motion of the user based on the reference motion information corrected by the correction function.
10  Information processing terminal
20  Server
110 Data acquisition unit
120 Operation display unit
130 Communication unit
140 Storage unit
150 Control unit
 151 Posture estimation unit
 155 Object detection unit
 159 Physique feature information calculation unit
 163 Conversion unit
 167 Motion comparison unit
 171 Display control unit
210 Communication unit
220 Control unit
230 Storage unit
 231 Standard skeleton data storage unit
 235 Reference motion storage unit

Claims (13)

1. An information processing device comprising:
   a posture information acquisition unit that acquires posture information indicating a posture of a user;
   a calculation unit that calculates physique feature information of the user by comparing the posture information of the user with predetermined posture information;
   a correction unit that corrects posture information prepared as reference motion information according to the physique feature information of the user; and
   an output unit that outputs guidance information for guiding a motion of the user based on the reference motion information corrected by the correction unit.
2. The information processing device according to claim 1, wherein the calculation unit calculates, as the physique feature information of the user, a difference between each piece of skeleton information included in the posture information and the corresponding piece of skeleton information included in the predetermined posture information.
3. The information processing device according to claim 2, wherein the correction unit corrects the posture information prepared as the reference motion information according to physical information or health information of the user that is not included in the posture information.
4. The information processing device according to claim 3, wherein the output unit outputs, as the guidance information, display information in which the reference motion information corrected by the correction unit and the posture information of the user are superimposed.
5. The information processing device according to claim 3, further comprising a motion comparison unit that compares the posture information of the user with the reference motion information corrected by the correction unit, wherein the output unit outputs guidance information for guiding the motion of the user based on a result of the comparison by the motion comparison unit.
6. The information processing device according to claim 5, further comprising a posture transformation unit that applies, to the posture information of the user, a posture transformation based on the predetermined posture information, wherein the motion comparison unit compares the posture information of the user transformed by the posture transformation unit with the reference motion information corrected by the correction unit.
7. The information processing device according to claim 6, wherein the output unit outputs, as the guidance information, difference information between the posture information of the user and the reference motion information corrected by the correction unit.
8. The information processing device according to claim 7, wherein the reference motion information includes posture information of another user.
9. The information processing device according to claim 8, further comprising an object information acquisition unit that acquires feature information of an object and motion information of the object, wherein the correction unit corrects the reference motion information according to the feature information of the object and the motion information of the object.
10. The information processing device according to claim 9, wherein the correction unit corrects the reference motion information according to a weight parameter set for each body part of the user, in addition to the physique feature information.
11. The information processing device according to claim 10, wherein the weight parameter is set for each exercise type of the reference motion information.
12. An information processing method executed by a computer, the method comprising:
   acquiring posture information indicating a posture of a user;
   calculating physique feature information of the user by comparing the posture information of the user with predetermined posture information;
   correcting posture information prepared as reference motion information according to the physique feature information of the user; and
   outputting guidance information for guiding a motion of the user based on the corrected reference motion information.
13. A program for causing a computer to realize:
   a posture information acquisition function of acquiring posture information indicating a posture of a user;
   a calculation function of calculating physique feature information of the user by comparing the posture information of the user with predetermined posture information;
   a correction function of correcting posture information prepared as reference motion information according to the physique feature information of the user; and
   an output function of outputting guidance information for guiding a motion of the user based on the reference motion information corrected by the correction function.
PCT/JP2022/000897 2021-03-19 2022-01-13 Information processing device, information processing method, and program WO2022196059A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2021046158 2021-03-19
JP2021-046158 2021-03-19

Publications (1)

Publication Number Publication Date
WO2022196059A1 true WO2022196059A1 (en) 2022-09-22

Family

ID=83320213

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2022/000897 WO2022196059A1 (en) 2021-03-19 2022-01-13 Information processing device, information processing method, and program

Country Status (1)

Country Link
WO (1) WO2022196059A1 (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2011062352A (en) * 2009-09-17 2011-03-31 Koki Hashimoto Exercise motion teaching device and play facility
US9350951B1 (en) * 2011-11-22 2016-05-24 Scott Dallas Rowe Method for interactive training and analysis
US20140280219A1 (en) * 2013-03-15 2014-09-18 FitStar, Inc. Identifying available exercises for customizing an exercise session
JP2020108823A (en) * 2015-09-29 2020-07-16 ソニー株式会社 Information processing device, information processing method, and program
JP2019024550A (en) * 2017-07-25 2019-02-21 株式会社クオンタム Detection device, detection system, processing device, detection method and detection program


Legal Events

121 Ep: The EPO has been informed by WIPO that EP was designated in this application. (Ref document number: 22770821; Country of ref document: EP; Kind code of ref document: A1)
NENP: Non-entry into the national phase. (Ref country code: DE)
122 Ep: PCT application non-entry in European phase. (Ref document number: 22770821; Country of ref document: EP; Kind code of ref document: A1)
NENP: Non-entry into the national phase. (Ref country code: JP)