WO2022030619A1 - Guidance support system - Google Patents

Guidance support system

Info

Publication number
WO2022030619A1
Authority
WO
WIPO (PCT)
Prior art keywords
information
evaluation
user
image
tool
Prior art date
Application number
PCT/JP2021/029301
Other languages
French (fr)
Japanese (ja)
Inventor
侑也 高久
Original Assignee
株式会社Sportip
Priority date
Filing date
Publication date
Application filed by 株式会社Sportip
Priority to JP2022541749A (JPWO2022030619A1)
Publication of WO2022030619A1

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61H PHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
    • A61H1/00 Apparatus for passive exercising; Vibrating apparatus; Chiropractic devices, e.g. body impacting devices, external devices for briefly extending or aligning unbroken bones
    • A61H1/02 Stretching or bending or torsioning apparatus for exercising
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63B APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B69/00 Training appliances or apparatus for special sports
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63B APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B71/00 Games or sports accessories not covered in groups A63B1/00 - A63B69/00
    • A63B71/06 Indicating or scoring devices for games or players, or for other sports activities
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00 Systems or methods specially adapted for specific business sectors, e.g. utilities or tourism
    • G06Q50/10 Services
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H50/00 ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • G16H50/30 ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for calculating health indices; for individual health risk assessment
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16Y INFORMATION AND COMMUNICATION TECHNOLOGY SPECIALLY ADAPTED FOR THE INTERNET OF THINGS [IoT]
    • G16Y10/00 Economic sectors
    • G16Y10/65 Entertainment or amusement; Sports
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16Y INFORMATION AND COMMUNICATION TECHNOLOGY SPECIALLY ADAPTED FOR THE INTERNET OF THINGS [IoT]
    • G16Y20/00 Information sensed or collected by the things
    • G16Y20/40 Information sensed or collected by the things relating to personal data, e.g. biometric data, records or preferences
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16Y INFORMATION AND COMMUNICATION TECHNOLOGY SPECIALLY ADAPTED FOR THE INTERNET OF THINGS [IoT]
    • G16Y40/00 IoT characterised by the purpose of the information processing
    • G16Y40/20 Analytics; Diagnosis

Definitions

  • The present invention relates to a guidance support system.
  • Guidance and coaching for physical exercise are widely provided in fields such as competitive and other sports, strength training, and rehabilitation.
  • Guidance and coaching are often provided by experienced and knowledgeable supporters (including instructors, coaches, physiotherapists, and care workers), and it is useful for users who perform physical exercise to hear objective opinions. However, such guidance is often based on the supporter's intuition and experience, which the supporter sometimes cannot verbalize, so the effectiveness of the guidance and coaching can be low.
  • As a method of teaching sports skills, Patent Document 1 discloses an invention that supports instruction by displaying the movement of a user and the movement of an instructor side by side for comparison.
  • With Patent Document 1, although the difference between the user and the instructor becomes clear, knowledge of how to close that difference and of what kind of training is required still depends on the instructor. In addition, since moving images must be recorded and then compared, guidance cannot be given immediately after or during the physical exercise, and the effect is limited.
  • The present invention has been made in view of this background, and its purpose is to provide technology that can easily detect and evaluate the movement of a body part or a tool part from an image of the user's physical exercise, and that can support exercise by giving advice for further improving skills and the effects of training, rehabilitation, and the like.
  • The present invention is a guidance support system that supports the guidance given by a supporter to a user who performs physical exercise. It is characterized by including a reference value storage unit that stores a reference value relating to the position of at least one body part or tool part, a part identification unit that identifies the body part or tool part by analyzing an image of the user's physical exercise, an evaluation unit that determines an evaluation value of the physical exercise by comparing the position of the part in the image with the reference value, and a supporter terminal having an imaging function for capturing an image of the user's physical exercise and a display function for displaying the evaluation value.
  • The guidance support system according to an embodiment of the present invention has the following configuration.
  • A reference value storage unit that stores a reference value relating to the position of at least one body part or tool part;
  • a part identification unit that identifies the body part or tool part by analyzing an image of the user's physical exercise;
  • an evaluation unit that determines an evaluation value of the physical exercise by comparing the position of the part in the image with the reference value; and
  • a supporter terminal having an imaging function for capturing an image of the user's physical exercise and a display function for displaying the evaluation value.
  • In one aspect, the supporter terminal is a wearable computer.
  • A guidance support method according to an embodiment includes a reference value storage step of storing a reference value relating to the position of at least one body part or tool part, a part identification step of identifying the body part or tool part by analyzing an image of the user's physical exercise, and an evaluation step of comparing the position of the part in the image with the reference value to determine an evaluation value of the physical exercise, and uses a supporter terminal having an imaging function for capturing an image of the user's physical exercise and a display function for displaying the evaluation value.
  • The guidance support system evaluates the movement of the body and of the tool in the physical exercise performed by the user (including exercise performed using a tool), and thereby supports the guidance given by the supporter.
  • The physical exercise performed by the user includes exercise that does not use tools, such as gymnastics, fitness, walking, running, yoga, bodyweight training, and rehabilitation, as well as exercise that uses tools: ball sports such as soccer, volleyball, basketball, baseball, tennis, table tennis, and golf; sports that use tools other than balls, such as kendo and fencing; physical exercise using tools such as juggling props and barbells; exercise using training equipment and machines; various forms of rehabilitation using tools; movements of a care recipient using tools such as canes; and playing musical instruments. It is, however, not limited to these.
  • The guidance support system of the present embodiment identifies the body and its parts and the tool and its parts from an image of the user performing physical exercise (a still image or a moving image may be used; in the present embodiment a moving image is assumed), and evaluates the movement of the body and of the tool based on the absolute position of each part and on the relative positional relationships among multiple different parts. The movement of the tool may also be evaluated based on the relative positional relationships between body parts and tool parts.
  • FIG. 1 is a diagram showing an overall configuration example of the guidance support system according to the present embodiment.
  • The guidance support system of the present embodiment includes a user terminal 10 and a server device 20, and may further include an imaging terminal 50.
  • The user terminal 10, the server device 20, and the imaging terminal 50 are connected so as to be able to communicate with one another via a communication network 30.
  • The communication network 30 is, for example, the Internet or a LAN (Local Area Network), and is constructed from a public telephone line network, a dedicated telephone line network, a mobile telephone network, Ethernet (registered trademark), a wireless communication path, or the like.
  • The user terminal 10 is a computer operated by a user who performs physical exercise or by the user's supporter.
  • The user terminal 10 is, for example, a smartphone, a tablet computer, a personal computer, a wearable computer, or the like.
  • The user terminal 10 includes an imaging device such as a camera that can capture images of the user's body during exercise. In the present embodiment, it is assumed that a moving image of the user's body during exercise is transmitted from the user terminal 10 to the server device 20. The user terminal 10 may also have an imaging function that captures images close to the user's field of view and an output function that makes characters, images, video, and the like visible by a virtual image projection method, a retinal projection method, or another method.
  • A hat-type computer, an HMD (head-mounted display), or the like may be used, or a separate terminal may be responsible for the imaging function and the output function.
  • The user terminal 10 may also serve as the imaging terminal 50 and the output terminal 60, its imaging function may serve as the imaging terminal 50, and its output function may serve as the output terminal 60. Further, although only one user terminal 10 is shown in FIG. 1, there may of course be a plurality of user terminals 10.
  • The server device 20 is a computer that evaluates the movements of the body and of tools.
  • The server device 20 is, for example, a workstation, a personal computer, a virtual computer logically realized by cloud computing, or the like.
  • The server device 20 receives the moving image taken by the user terminal 10, analyzes the received moving image, and evaluates the physical exercise.
  • The server device 20 also makes proposals for improving the physical exercise. Details of the evaluation of physical exercise and of the proposal of improvement measures will be described later.
  • The supporter terminal 40 is a computer operated by a person (supporter) who provides guidance, teaching, explanation, support, or the like to a user who performs physical exercise, such as a trainer, a physiotherapist, or a caregiver.
  • The supporter terminal 40 is, for example, a smartphone, a tablet computer, a personal computer, a wearable computer, or the like.
  • The supporter terminal 40 may have an imaging function that captures images close to the supporter's field of view and an output function that makes characters, images, video, and the like visible by a virtual image projection method, a retinal projection method, or another method.
  • A lens-type or hat-type computer, an HMD (head-mounted display), or the like may be used, and a separate terminal may be responsible for the imaging function and the output function.
  • The supporter terminal 40 may also serve as the imaging terminal 50 and the output terminal 60, its imaging function may serve as the imaging terminal 50, and its output function may serve as the output terminal 60. Further, although only one supporter terminal 40 is shown in FIG. 1, there may of course be a plurality of supporter terminals 40.
  • The imaging terminal 50 is a computer that captures images of the user, a camera having a communication function, or the like, installed at or attached to the place where the user exercises.
  • The imaging terminal 50 is, for example, a smartphone, a tablet computer, a personal computer, a wearable computer, or the like.
  • The imaging terminal 50 includes an imaging device such as a camera that can capture images of the user's body during exercise. In the present embodiment, it is assumed that a moving image of the exercising user's body is transmitted from the imaging terminal 50 to the server device 20.
  • The captured data stored in the imaging terminal 50 may be input to the server device 20 directly by the user, the user's supporter, or a business operator using the guidance support system, or may be input via the communication network 30.
  • Imaging terminals 50 may be installed at a plurality of places to image the subject simultaneously or in turn. Further, the imaging terminal 50 may also serve as the user terminal 10 or the output terminal 60, or may provide the imaging function of the user terminal 10 or the supporter terminal 40.
  • The output terminal 60 may be, for example, a smartphone, a tablet computer, a personal computer, a wearable computer, or the like. Alternatively, a computer may be connected to an output device such as a display, and information may be output to the output terminal 60 in response to instructions from the server device 20 through communication between that computer and the server device 20.
  • The output terminal 60 may include an audio output device, such as headphones, earphones, or a neck speaker worn by the user, so that information can be output as audio.
  • The output terminal 60 may also be an eyeglass-type, contact-lens-type, hat-type, or other type of terminal, or an HMD (head-mounted display), that makes characters, images, video, and the like visible by a virtual image projection method, a retinal projection method, or another method. Information can also be output to the user or the supporter via an interface such as a BMI (Brain Machine Interface), which conveys characters, images, video, and the like by direct stimulation of the brain without using the sensory organs.
  • The output terminal 60 may also serve as the user terminal 10 or the supporter terminal 40, or may provide the output function of the user terminal 10 or the supporter terminal 40.
  • FIG. 2 is a diagram showing a hardware configuration example of the user terminal 10.
  • The user terminal 10 includes a CPU 101, a memory 102, a storage device 103, a communication interface 104, a touch panel display 105, and a camera 106.
  • The storage device 103 stores various data and programs and is, for example, a hard disk drive, a solid state drive, or a flash memory.
  • The communication interface 104 is an interface for connecting to the communication network 30 and is, for example, an adapter for connecting to Ethernet (registered trademark), a modem for connecting to a public telephone network, a wireless communication device for performing wireless communication, a USB (Universal Serial Bus) connector or RS-232C connector for serial communication, or the like.
  • The touch panel display 105 is a device for inputting and outputting data.
  • The user terminal 10 may further include input devices such as a keyboard, a mouse, buttons, and a microphone, and output devices such as a speaker and a printer.
  • FIG. 3 is a diagram showing a software configuration example of the user terminal 10.
  • The user terminal 10 includes an imaging unit 111, an evaluation request transmission unit 112, an evaluation information receiving unit 113, an evaluation display unit 114, a checkpoint display unit 115, an improvement measure request transmission unit 116, an improvement measure information receiving unit 117, an improvement measure information display unit 118, a reference value request transmission unit 119, and a reference value selection information transmission unit 120, as well as a physical information storage unit 130, an image storage unit 131, an evaluation information storage unit 132, and an improvement measure storage unit 133.
  • Each of the above functional units is realized by the CPU 101 of the user terminal 10 reading a program stored in the storage device 103 into the memory 102 and executing it, and each of the above storage units is implemented as part of the storage areas provided by the memory 102 and the storage device 103 of the user terminal 10.
  • The imaging unit 111 captures images, including moving images, of the user performing physical exercise. By controlling the camera 106, the imaging unit 111 can acquire a moving image of the body movement.
  • The user or the user's supporter places the user terminal 10 on a flat surface, a wall, or the like, points the optical axis of the camera 106 at the place where the user exercises, and instructs the start of video recording.
  • In response, the imaging unit 111 operates the camera 106 to acquire a moving image.
  • The imaging unit 111 stores the acquired moving image in the image storage unit 131.
  • The image storage unit 131 stores the images captured by the imaging unit 111.
  • In the present embodiment the image is a moving image, but the image is not limited thereto.
  • The image storage unit 131 can store a moving image as a file, for example.
  • The physical information storage unit 130 stores information related to the user's body, physical abilities, factors affecting the training effect, and the like (hereinafter referred to as physical information).
  • FIG. 4 is a diagram showing a configuration example of the physical information stored in the physical information storage unit 130.
  • The physical information includes height, weight, gender, dominant hand, arm length, leg length, hand size, finger length, grip strength, muscle strength, flexibility, and shoulder strength, and may also include the genome, epigenome, gene polymorphisms, intestinal flora, diet, and the like.
  • The type and number of physical exercises performed, as determined by the image analysis performed by the image analysis unit 212, may also be stored.
  • The evaluation request transmission unit 112 transmits to the server device 20 a request for evaluating physical exercise using a tool (hereinafter referred to as an evaluation request), based on the image captured by the imaging unit 111.
  • FIG. 5 is a diagram showing a configuration example of the evaluation request transmitted by the evaluation request transmission unit 112 to the server device 20.
  • The evaluation request includes a user ID, a mode, physical information, and image data.
  • The user ID is information that identifies the user.
  • The mode is information indicating the physical exercise performed by the user.
  • The mode can be, for example, "walking", "yoga: cat pose", "bench press", "tennis serve", "walking rehabilitation", and the like.
  • The mode is selected from predetermined options.
  • The physical information is the physical information stored in the physical information storage unit 130.
  • The image data is the moving image data acquired by the imaging unit 111.
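  • As a rough illustration only (the publication does not specify a wire format), an evaluation request like the one in FIG. 5 could be assembled as follows; the field names, the encoding, and the helper function are assumptions, not part of the disclosure.

      # Hypothetical sketch of the evaluation request of FIG. 5; field names and
      # encoding are assumptions, not taken from the publication.
      import base64
      import json
      from dataclasses import dataclass, asdict

      @dataclass
      class EvaluationRequest:
          user_id: str         # identifies the user
          mode: str            # e.g. "bench press", "tennis serve"
          physical_info: dict  # height, weight, dominant hand, ...
          image_data: str      # captured video, base64-encoded here

      def build_request(user_id: str, mode: str, physical_info: dict, video_path: str) -> str:
          with open(video_path, "rb") as f:
              encoded = base64.b64encode(f.read()).decode("ascii")
          return json.dumps(asdict(EvaluationRequest(user_id, mode, physical_info, encoded)))

      # payload = build_request("user-001", "weightlifting",
      #                         {"height_cm": 172, "weight_kg": 68}, "lift.mp4")
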
  • The evaluation information receiving unit 113 receives information related to the evaluation of the physical exercise (hereinafter referred to as evaluation information) that the server device 20 returns in response to the evaluation request.
  • The evaluation information receiving unit 113 registers the received evaluation information in the evaluation information storage unit 132.
  • FIG. 6 is a diagram showing a configuration example of the evaluation information received from the server device 20 by the evaluation information receiving unit 113.
  • The evaluation information includes a mode, a user ID, tool position information, body position information, posture information, movement information, and checkpoint information.
  • The user ID and the mode are the user ID and the mode included in the evaluation request.
  • The captured image is an image of the user's body performing the exercise indicated by the mode.
  • The tool position information indicates the position in the image of each tool part (for example, for a baseball bat: the whole bat, both ends, the grip, the point where the ball meets the bat, the center of gravity, or any other part; for a barbell: the whole barbell, the shaft, the plates (weight parts), the gripped part, the center of the shaft, the center of gravity, or any other part; the whole tool may also be treated as a part).
  • The tool position information includes each tool part and the position of that part in association with a time point on the time axis of the moving image. Based on the tool position information, the movement of the tool and its relationship to body parts can be displayed; that is, for example, a figure (for example, a circle) indicating a tool part can be superimposed and displayed on the image at the position indicated by the tool position information.
  • The positions of a plurality of parts may be included at one time point.
  • The tool position information need not include a position for the portion connecting two parts.
  • The tool position information may be included for each frame constituting the video, for each key frame (including the frames for the checkpoints described later), for every arbitrary number of frames, or for random time points.
  • The body position information indicates the position in the image of each body part (for example, the head, shoulders, elbows, hips, knees, ankles, etc.).
  • The body position information includes a part and the position of that part in association with a time point on the time axis of the moving image.
  • The state of the skeleton (bones) of the body can be displayed based on the body position information. That is, for example, a figure (for example, a circle) indicating a part can be superimposed and displayed on the image at the position indicated by the body position information. The positions of a plurality of parts may be included at one time point.
  • The position information need not be included for a portion connecting two parts (for example, the forearm connecting the wrist and the elbow, or the thigh connecting the hip and the knee). In this case, the portion connecting the two parts can be represented by connecting the pair of marks (such as circles) indicating those parts with a line.
  • The position information may be included for each frame constituting the video, for each key frame (including the frames for the checkpoints described later), for every arbitrary number of frames, or for random time points. If a frame does not contain position information, the bones can be displayed based on the position information at the nearest preceding time point.
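  • A minimal sketch of the kind of skeleton overlay described above, assuming the body position information is available as per-frame keypoint coordinates; the keypoint names and the connection list are illustrative, and OpenCV is used only as one possible drawing library.

      # Illustrative skeleton overlay; keypoint names, connections, and data layout
      # are assumptions, not taken from the publication.
      import cv2  # pip install opencv-python

      CONNECTIONS = [("shoulder", "elbow"), ("elbow", "wrist"),
                     ("hip", "knee"), ("knee", "ankle")]

      def draw_bones(frame, keypoints):
          """keypoints: dict mapping part name -> (x, y) pixel position."""
          for name, (x, y) in keypoints.items():
              cv2.circle(frame, (int(x), int(y)), 5, (0, 255, 0), -1)  # mark each part
          for a, b in CONNECTIONS:
              if a in keypoints and b in keypoints:
                  pa = tuple(map(int, keypoints[a]))
                  pb = tuple(map(int, keypoints[b]))
                  cv2.line(frame, pa, pb, (255, 0, 0), 2)  # portion connecting two parts
          return frame
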
  • The tool orientation information is information related to the orientation of the tool used by the user and the direction in which each tool part is facing.
  • The tool orientation information includes the tool part to be evaluated, a tool orientation value, an evaluation rank, an evaluation comment, and the like in association with a time point on the time axis of the moving image.
  • The tool orientation value is a value indicating the orientation of the tool.
  • The tool orientation value is, for example, the distance from the ground to a certain tool part, the distance between two tool parts, or an angle defined by tool parts (for example, by a first tool part and the part where the user grips the tool).
  • The evaluation rank is a value representing the evaluation value as a rank.
  • The evaluation rank is expressed, for example, in five levels from 1 to 5, as A/B/C, or the like.
  • The evaluation comment is a comment related to the evaluation of the tool orientation. For example, if the mode is "upright row" and the distance from the ground differs between the right and left ends of the barbell, it may contain an evaluation comment such as "different forces are applied to the left and right".
  • The tool movement information is information related to the movement of the tool used by the user.
  • The tool movement information includes the tool part to be evaluated, a list of tool orientation values, an evaluation rank, and an evaluation comment in association with a period on the time axis of the moving image.
  • The list of tool orientation values is a time series of tool orientation values within the period.
  • The evaluation comment is a comment related to the evaluation of the tool movement. For example, if the mode is "upright row" and the barbell does not move up and down sufficiently, it may include an evaluation comment such as "the barbell has not been lifted sufficiently".
  • The posture information is information related to the posture of the user's body.
  • The posture information includes the part to be evaluated, a posture value, an evaluation rank, and an evaluation comment in association with a time point on the time axis of the moving image.
  • The posture value is a value representing the posture.
  • The posture value is, for example, the distance from the ground to a part, the distance between two parts, or the angle at a joint part (the angle formed by the straight line from a first end part to the joint part and the straight line from a second end part to the joint part).
  • The evaluation rank is a value representing the evaluation value as a rank.
  • The evaluation rank is expressed, for example, in five levels from 1 to 5, as A/B/C, or the like.
  • The evaluation comment is a comment related to the evaluation of the posture. For example, if the mode is "lifting" and the flexion is not sufficient, an evaluation comment such as "not sitting down enough" may be included.
  • The body movement information is information related to the movement of the user's body.
  • The body movement information includes the part to be evaluated, a list of posture values, an evaluation rank, and an evaluation comment in association with a period on the time axis of the moving image.
  • The list of posture values is a time series of posture values within the period.
  • The evaluation comment is a comment related to the evaluation of the movement. For example, if the mode is "lifting" and the knee extension is not smooth, it may include an evaluation comment such as "knee movement is not smooth".
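  • As an informal illustration of the joint-angle type of posture value mentioned above (the part names and pixel coordinates in the usage note are assumptions), the angle at a joint part can be computed from the positions of the two end parts and the joint part:

      # Illustrative computation of a posture value as the angle at a joint part,
      # given 2D positions of a first end part, the joint part, and a second end part.
      import numpy as np

      def joint_angle_deg(end1, joint, end2):
          """Angle (degrees) between the lines joint->end1 and joint->end2."""
          v1 = np.asarray(end1, dtype=float) - np.asarray(joint, dtype=float)
          v2 = np.asarray(end2, dtype=float) - np.asarray(joint, dtype=float)
          cos_a = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2))
          return float(np.degrees(np.arccos(np.clip(cos_a, -1.0, 1.0))))

      # e.g. knee angle from hip, knee, and ankle pixel positions:
      # joint_angle_deg((410, 300), (420, 380), (415, 460))
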
  • The relationship information indicates information on one or more positional relationships between the tool and the body.
  • The relationship information includes information on the relationship between information such as the position, orientation, and movement of tool parts and information such as body parts, posture, and movement, in association with a time point on the time axis of the moving image.
  • The positions of a plurality of parts may be included at one time point.
  • The position information need not be included for a portion connecting a tool part and a body part (for example, between the tip of the bat and the center of the portion where the user grips the bat).
  • In that case, the connection between those parts can be represented by joining the marks that indicate the tool part and the body part with a line.
  • The relationship information may be included for each frame constituting the video, for each key frame, for each checkpoint (described in detail later), for every arbitrary number of frames, or for random time points. If a frame does not contain position information, a figure showing the parts and the lines connecting them can be displayed based on the position information at the nearest preceding time point.
  • The checkpoint information is information indicating points (hereinafter referred to as checkpoints) at which the orientation of the tool, the posture of the body, and the like should be checked in the movement of the tool used by the user or in a series of movements of the user's body.
  • Checkpoints include, for example, when the mode is "weightlifting", the point where the barbell reaches its highest position, the point where it reaches its lowest position, and the point where it is lifted; and when the mode is "pitching", the point where the foot is raised, the point where the raised foot is lowered to shift the weight, the point where the ball is released, and the like.
  • The checkpoint information stores information indicating a checkpoint (hereinafter referred to as a checkpoint ID) in association with a time point on the time axis of the moving image. That is, it is possible to identify the frame (still image) of the moving image in which the checkpoint indicated by the checkpoint ID appears.
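  • As a rough sketch of how the still image for a checkpoint could be pulled out of the recorded video (the mapping of checkpoint IDs to time points is assumed to be available, and OpenCV is used only as one possible way to read the video):

      # Hypothetical helper: extract the still image for a checkpoint time point.
      # The checkpoint_info layout {checkpoint_id: time_in_seconds} is an assumption.
      import cv2

      def extract_checkpoint_frame(video_path, checkpoint_info, checkpoint_id):
          t = checkpoint_info[checkpoint_id]          # time point on the video's time axis
          cap = cv2.VideoCapture(video_path)
          cap.set(cv2.CAP_PROP_POS_MSEC, t * 1000.0)  # seek to the checkpoint time
          ok, frame = cap.read()
          cap.release()
          return frame if ok else None

      # frame = extract_checkpoint_frame("lift.mp4", {"barbell_highest": 3.2}, "barbell_highest")
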
  • The evaluation display unit 114 displays the evaluation information. For example, the evaluation display unit 114 superimposes on the moving image the tool position, tool orientation, tool movement information, body position, posture, body movement information, and the like included in the evaluation information; by displaying figures representing tool parts and body parts (for example, circles representing the ends of the tool or the center of gravity of the body, and lines connecting them), the movement of the tool parts or body parts can be overlaid on the video. Further, the evaluation display unit 114 can display, for example, graphs of changes over time in the position of tool parts, the orientation and movement of the tool, and the position, posture, and movement of body parts.
  • The evaluation display unit 114 can display the evaluation rank and the evaluation comment together with the moving image, based on the tool orientation information and the tool movement information included in the evaluation information. For example, when the playback time of the moving image is near the time point included in the tool orientation information (the window can be of arbitrary length, for example around five seconds), the evaluation display unit 114 can display the evaluation rank and evaluation comment contained in that tool orientation information. Further, when the playback time of the moving image falls within the period included in the tool movement information, the evaluation display unit 114 can display the evaluation rank and evaluation comment included in the tool movement information. The evaluation display unit 114 can also display a graph of the time series of tool orientation values based on the list of tool orientation values included in the tool movement information.
  • The evaluation display unit 114 can likewise display the evaluation rank and the evaluation comment together with the moving image, based on the posture information and the movement information included in the evaluation information.
  • For example, when the playback time of the moving image is near the time point included in the posture information (the window can be of arbitrary length, for example around five seconds), the evaluation display unit 114 can display the evaluation rank and evaluation comment contained in that posture information. Further, when the playback time of the moving image falls within the period included in the movement information, the evaluation display unit 114 can display the evaluation rank and evaluation comment included in the movement information. The evaluation display unit 114 can also display the posture value included in the posture information.
  • The evaluation display unit 114 can display a graph of the time series of posture values based on the list of posture values included in the movement information.
  • These evaluation ranks and evaluation comments may be displayed together with the evaluation ranks and evaluation comments displayed according to the tool orientation information and the tool movement information described in the preceding paragraph.
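  • As an informal illustration of the kind of time-series graph described above (the data layout, a list of (time, posture value) pairs, is an assumption; matplotlib is used only as one possible plotting library):

      # Illustrative plot of posture values over time; the data layout is assumed.
      import matplotlib.pyplot as plt

      def plot_posture_values(posture_series, reference=None):
          """posture_series: list of (time_in_seconds, posture_value) pairs."""
          times = [t for t, _ in posture_series]
          values = [v for _, v in posture_series]
          plt.plot(times, values, label="measured")
          if reference is not None:
              plt.axhline(reference, linestyle="--", label="reference value")
          plt.xlabel("time [s]")
          plt.ylabel("posture value (e.g. knee angle [deg])")
          plt.legend()
          plt.show()
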
  • The checkpoint display unit 115 can extract checkpoint images from the moving image and display them.
  • The checkpoint display unit 115 can read the frame corresponding to a time point included in the checkpoint information from the moving image data stored in the image storage unit 131 and display it as a still image. Further, the checkpoint display unit 115 may, for example, extract and display only the body portion from the read frame.
  • The improvement measure request transmission unit 116 transmits to the server device 20 a request for acquiring improvement measures regarding how to handle the tool and how to perform the physical exercise (hereinafter referred to as an improvement measure request).
  • FIG. 7 is a diagram showing a configuration example of an improvement measure request.
  • The improvement measure request includes a user ID, a mode, a purpose, and the like.
  • The purpose is what the user wants to improve.
  • The purpose can be, for example, "to increase ball speed", "to increase muscle strength", "to stabilize the lower body", and the like.
  • The purpose is also selected from predetermined options.
  • The improvement measure information receiving unit 117 receives information on improvement measures (hereinafter referred to as improvement measure information) transmitted from the server device 20 in response to the improvement measure request.
  • The improvement measure information receiving unit 117 stores the received improvement measure information in the improvement measure storage unit 133.
  • FIG. 8 shows a configuration example of the improvement measure information.
  • The improvement measure information includes a purpose, advice, and reference information.
  • The advice is assumed to be a character string representing the improvement measure, but it may also be content that presents the improvement measure as an image, a moving image, or the like.
  • The reference information includes a suitable tool orientation and movement (position and orientation of each part, movement, speed, angle, etc.), a suitable body posture (position and orientation of each part, movement, speed, angle, etc.), and a suitable relationship between the tool and the body.
  • The improvement measure information receiving unit 117 may also receive improvement measure information that the improvement measure information transmitting unit 216 transmits based on the evaluation result and the reference value, even when no improvement measure request has been made.
  • The improvement measure information display unit 118 displays the improvement measures.
  • The improvement measure information display unit 118 displays the advice included in the improvement measure information. Further, when the improvement measure information includes a suitable position or angle of a part, each of these may be superimposed on the image and displayed.
  • The reference value request transmission unit 119 transmits to the server device 20 a request for acquiring the reference value, or candidates for the reference value, targeted by the user (hereinafter referred to as a reference value request).
  • FIG. 9 is a diagram showing a configuration example of the reference value request.
  • The reference value request includes a user ID, a mode, a purpose, the image information, the physical information, and the evaluation information.
  • The purpose is what the user wants to improve.
  • The purpose can be, for example, "to increase ball speed", "to increase muscle strength", "to stabilize the lower body", and the like.
  • The purpose is also selected from predetermined options.
  • The reference value selection information transmission unit 120 transmits to the server device 20 information on the reference value selected from the one or more reference values presented by the reference information determination unit 220.
  • The reference value selection information transmission unit 120 may also transmit to the server device 20 an option requesting that other options be presented.
  • FIG. 10 is a diagram showing a hardware configuration example of the server device 20.
  • The server device 20 includes a CPU 201, a memory 202, a storage device 203, a communication interface 204, an input device 205, and an output device 206.
  • The storage device 203 stores various data and programs and is, for example, a hard disk drive, a solid state drive, or a flash memory.
  • The communication interface 204 is an interface for connecting to the communication network 30 and is, for example, an adapter for connecting to Ethernet (registered trademark), a modem for connecting to a public telephone network, a wireless communication device for performing wireless communication, a USB (Universal Serial Bus) connector or RS-232C connector for serial communication, or the like.
  • The input device 205 is, for example, a keyboard, a mouse, a touch panel, buttons, a microphone, or the like for inputting data.
  • The output device 206 is, for example, a display, a printer, a speaker, or the like for outputting data.
  • FIG. 11 is a diagram showing a software configuration example of the server device 20.
  • The server device 20 includes an evaluation request receiving unit 211, an image analysis unit 212, an evaluation unit 213, an evaluation information transmitting unit 214, an improvement measure request receiving unit 215, an improvement measure information transmitting unit 216, a group analysis unit 217, a group analysis information presentation unit 218, a reference information request reception unit 219, and a reference information determination unit 220, as well as an image data storage unit 231, a reference information storage unit 232, an evaluation condition information storage unit 233, an improvement measure information storage unit 234, and a group analysis information storage unit 235.
  • Each of the above functional units is realized by the CPU 201 of the server device 20 reading a program stored in the storage device 203 into the memory 202 and executing it, and each of the above storage units is implemented as part of the storage areas provided by the memory 202 and the storage device 203 of the server device 20.
  • The evaluation request receiving unit 211 receives the evaluation request transmitted from the user terminal 10.
  • The evaluation request receiving unit 211 registers information including the image data included in the received evaluation request (hereinafter referred to as image information) in the image data storage unit 231.
  • FIG. 12 is a diagram showing a configuration example of the image information stored in the image data storage unit 231. As shown in the figure, the image information includes image data in association with a user ID indicating the photographed user. The image data is the data that was included in the evaluation request.
  • The reference information storage unit 232 stores information including reference values related to physical exercise using the tool (hereinafter referred to as reference information), such as the tool position, the tool movement (orientation and motion of the tool), the posture (positions and angles of parts, etc.), and values derived from the relationship between the tool and the body.
  • FIG. 13 is a diagram showing a configuration example of the reference information stored in the reference information storage unit 232. As shown in the figure, the reference information includes information on the absolute position of tool parts and on how the tool parts should move when the physical exercise using the tool is performed (movement speed and movement distance).
  • The position reference information is reference information on the absolute position of a part or on its position relative to other parts or other reference objects.
  • The angle reference information is reference information on the angle formed by the straight lines connecting each of two parts to a joint part.
  • Reference values are prepared for each physical exercise (mode), and for each mode there may be multiple reference values, for example for each purpose, for each characteristic of the physical information, for each characteristic of the evaluation information, or values based on the physical exercise of a specific individual (including, but not limited to, a particular athlete, a professional athlete, an athlete at a given level, or an experienced practitioner).
  • The tool position reference information includes a tool part and the reference position of that part in association with a mode and a checkpoint ID.
  • The vertical position may be, for example, the height from the ground or the distance from one of the toes.
  • When the mode is "weightlifting", it may also be a distance from a body part or from a line connecting body parts, such as the distance between the line connecting both shoulders and the shaft.
  • The horizontal position of the part may be a distance from a predetermined reference object (for example, the plate on a pitcher's mound, a mark on the floor, etc.) or from a reference body part such as the shoulder, chest, or foot. The position reference information is assumed to be registered in advance.
  • The tool movement reference information includes, in association with a mode and a checkpoint ID, reference values for information such as the movement speed and distance of a tool part, the direction of movement at a certain point in time, and the trajectory of movement over a certain period.
  • The position reference information includes a body part and the reference position of that part in association with a mode and a checkpoint ID. There may be a plurality of parts.
  • The vertical position may be, for example, the height from the ground or the distance from one of the toes.
  • The horizontal position may be a distance from a predetermined reference object (for example, the plate on a pitcher's mound, a mark on the floor, etc.) or from a reference body part such as the shoulder, chest, or foot. The position reference information is assumed to be registered in advance.
  • The angle reference information includes, in association with a mode and a checkpoint ID, two parts (a part 1 and a part 2), one joint part, and a reference value for the angle between the straight line connecting part 1 and the joint part and the straight line connecting part 2 and the joint part.
  • The relationship reference information includes, in association with a mode and a checkpoint ID, information related to reference values represented by the relationship between tool parts and body parts.
  • The relationship reference information includes information obtained from the movement speed, movement distance, angle, and the like of one or more tool parts and body parts in association with a mode and a checkpoint ID. For example, when the mode is batting, the relationship reference information includes the movement speed of the tip of the bat at the moment it meets the ball and the angle between the bat and the dominant arm holding the bat.
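  • The publication does not specify how the reference information is stored; purely as an illustration, an angle reference record keyed by mode and checkpoint ID might look like the following (the field names and the example angle are made up for the sketch).

      # Hypothetical in-memory stand-in for part of the reference information
      # storage unit 232; record fields and the example value are illustrative only.
      from dataclasses import dataclass

      @dataclass(frozen=True)
      class AngleReference:
          mode: str
          checkpoint_id: str
          part1: str
          joint: str
          part2: str
          reference_angle_deg: float

      ANGLE_REFERENCES = {
          ("weightlifting", "barbell_lowest"): AngleReference(
              "weightlifting", "barbell_lowest", "hip", "knee", "ankle", 70.0),
      }

      def lookup_angle_reference(mode, checkpoint_id):
          return ANGLE_REFERENCES.get((mode, checkpoint_id))
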
  • The evaluation condition information storage unit 233 stores information used for performing the evaluation (hereinafter referred to as evaluation condition information).
  • FIG. 14 is a diagram showing a configuration example of the evaluation condition information stored in the evaluation condition information storage unit 233.
  • The evaluation condition information includes a category, a condition, an evaluation rank, and a comment.
  • The category is a category of evaluation.
  • The category can be, for example, "muscle strength", "ball speed", "control", and the like.
  • The condition is a condition on the position, orientation, or movement (change of position over time) of each tool part in the image, or on the position or movement (change of position over time) of each body part.
  • Such conditions can be set in the evaluation condition information.
  • The evaluation rank is the evaluation value given when the condition is satisfied.
  • The comment is an explanation of the posture and movement of the body when the condition is satisfied.
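  • As a hypothetical sketch of how such evaluation condition information could be represented and matched (the predicate-style encoding of a "condition" and the example entry are assumptions, not part of the disclosure):

      # Hypothetical condition matching against evaluation condition information
      # (category / condition / evaluation rank / comment); representation assumed.
      from dataclasses import dataclass
      from typing import Callable, Optional

      @dataclass
      class EvaluationCondition:
          category: str
          condition: Callable[[dict], bool]  # takes measured values, True if satisfied
          evaluation_rank: str
          comment: str

      CONDITIONS = [
          EvaluationCondition(
              category="posture",
              condition=lambda m: m.get("left_right_height_diff_cm", 0) > 5,
              evaluation_rank="C",
              comment="different forces are applied to the left and right",
          ),
      ]

      def evaluate(measured: dict) -> Optional[EvaluationCondition]:
          # Return the first evaluation condition whose condition is satisfied.
          for c in CONDITIONS:
              if c.condition(measured):
                  return c
          return None
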
  • The image analysis unit 212 (part identification unit) analyzes the image data.
  • The image analysis unit 212 analyzes the image data, extracts feature amounts of each tool part and each body part, and identifies the position of each part in the image. Further, the image analysis unit 212 analyzes the image data, extracts feature amounts of each tool part, and identifies the direction in which each part is facing.
  • A general method is adopted as the image analysis method used by the image analysis unit 212, and a detailed description thereof is omitted here.
  • The image analysis unit 212 may analyze the image data frame by frame, key frame by key frame, at each checkpoint, or at random timings.
  • The image analysis unit 212 also compares, for each checkpoint ID, the position of each part extracted from the image data with the position reference information stored in the reference information storage unit 232, and identifies the closest time point as the time point of that checkpoint.
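  • One way to read the checkpoint identification just described, sketched under assumed data layouts (per-frame keypoints and a registered reference position), is to pick the frame whose measured part position is closest to the reference position:

      # Illustrative checkpoint detection: choose the frame whose measured part
      # position is closest to the registered reference position for a checkpoint.
      import math

      def find_checkpoint_time(frames, reference_pos, part_name):
          """frames: list of (time_in_seconds, {part_name: (x, y)}) tuples."""
          best_time, best_dist = None, float("inf")
          for t, keypoints in frames:
              if part_name not in keypoints:
                  continue
              x, y = keypoints[part_name]
              dist = math.hypot(x - reference_pos[0], y - reference_pos[1])
              if dist < best_dist:
                  # the closest frame is treated as the checkpoint's time point
                  best_time, best_dist = t, dist
          return best_time
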
  • The evaluation unit 213 evaluates the movement of the tool used by the user based on the image data.
  • The evaluation unit 213 searches the evaluation condition information storage unit 233 for evaluation condition information whose condition is satisfied by the position and movement of each tool part identified from the image data, and if such evaluation condition information exists, acquires the evaluation rank and comment included in it.
  • The evaluation unit 213 may also evaluate the movement of the tool and count the number of repetitions of the physical exercise.
  • The evaluation unit 213 likewise evaluates the movement of the user's body based on the image data.
  • The evaluation unit 213 searches the evaluation condition information storage unit 233 for evaluation condition information whose condition is satisfied by the position and movement of each body part identified from the image data, and if such evaluation condition information exists, acquires the evaluation rank and comment included in it.
  • The evaluation unit 213 may also evaluate the movement of the body and count the number of repetitions of the physical exercise.
  • The evaluation unit 213 further evaluates the combined movement of the tool used by the user and the body based on the image data.
  • The evaluation unit 213 searches the evaluation condition information storage unit 233 for evaluation condition information whose condition is satisfied by the positions of each tool part and each body part identified from the image data and by the movement of, or relationship between, those parts, and if such evaluation condition information exists, acquires the evaluation rank and comment included in it.
  • The evaluation unit 213 may also evaluate the movement of the tool and the body and count the number of repetitions of the physical exercise.
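  • The publication states that repetitions can be counted but does not say how; as one illustrative approach only, repetitions could be counted from the time series of a tracked height (for example, the barbell shaft), with the threshold below being an arbitrary example value.

      # Illustrative repetition counter over a time series of a tracked height;
      # the minimum rise threshold is an arbitrary example value.
      def count_repetitions(heights, min_rise=0.25):
          """heights: list of heights (in metres) sampled over time."""
          reps, ascending, low = 0, False, None
          for h in heights:
              if low is None or h < low:
                  low = h                    # track the most recent low point
              if not ascending and h - low >= min_rise:
                  ascending = True           # risen enough from the low: count one rep
                  reps += 1
              if ascending and h - low < min_rise / 2:
                  ascending, low = False, h  # back down again: ready for the next rep
          return reps

      # count_repetitions([0.9, 1.0, 1.4, 1.6, 1.0, 0.9, 1.5, 1.7, 1.0])  # -> 2
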
  • The evaluation information transmission unit 214 transmits the evaluation information to the user terminal 10.
  • The evaluation information transmission unit 214 generates tool position information including the time points on the time axis of the moving image identified by the image analysis unit 212 and the position of each tool part. For the evaluation ranks and comments acquired by the evaluation unit 213, if the condition is satisfied by the position of a tool part, tool orientation information including the time point, the part, the tool orientation value, and the evaluation rank and comment is generated; if the condition is satisfied by the movement of the part (change of position over time), tool movement information including the period, the part, the list of tool orientation values, and the evaluation rank and comment is generated.
  • The evaluation information transmission unit 214 also generates checkpoint information including the time point corresponding to each checkpoint identified by the image analysis unit 212 and the checkpoint ID indicating that checkpoint.
  • The evaluation information transmission unit 214 creates evaluation information including the generated tool position information, tool orientation information, tool movement information, and checkpoint information, and transmits it to the user terminal 10.
  • The evaluation unit 213 and the evaluation information transmission unit 214 may correspond to the comment output unit of the present invention.
  • Likewise, the evaluation information transmission unit 214 transmits body-related evaluation information to the user terminal 10.
  • The evaluation information transmission unit 214 generates position information including the time points on the time axis of the moving image identified by the image analysis unit 212 and the position of each body part. For the evaluation ranks and comments acquired by the evaluation unit 213, if the condition is satisfied by the position of a part, posture information including the time point, the part, the posture value, and the evaluation rank and comment is generated; if the condition is satisfied by the movement of the part (change of position over time), movement information including the period, the part, the list of posture values, and the evaluation rank and comment is generated.
  • The evaluation information transmission unit 214 also generates checkpoint information including the time point corresponding to each checkpoint identified by the image analysis unit 212 and the checkpoint ID indicating that checkpoint.
  • The evaluation information transmission unit 214 creates evaluation information including the generated position information, posture information, movement information, and checkpoint information, and transmits it to the user terminal 10.
  • Here too, the evaluation unit 213 and the evaluation information transmission unit 214 may correspond to the comment output unit of the present invention.
  • The improvement measure information storage unit 234 stores information related to improvement measures (hereinafter referred to as improvement measure information).
  • FIG. 15 is a diagram showing a configuration example of improvement measure information stored in the improvement measure information storage unit 234.
  • The improvement measure information includes advice in association with a purpose, a category, and a condition.
  • The condition may be a condition on the tool itself (the weight of the barbell, etc.), on how the tool is used, on the physical information (flexibility, etc.), on the position and orientation of tool parts, on movement, or on the position and movement of body parts.
  • The improvement measure request receiving unit 215 receives the improvement measure request transmitted from the user terminal 10.
  • Among the improvement measure information corresponding to the mode and the purpose included in the improvement measure request, the improvement measure information transmission unit 216 searches for entries whose conditions are satisfied by the user's physical information included in the evaluation request and by the position, orientation, and movement of each part identified by the image analysis unit 212.
  • The improvement measure information transmission unit 216 acquires the advice of the retrieved improvement measure information, creates improvement measure information in which the purpose and the advice are set, and returns the created improvement measure information to the user terminal 10.
  • The improvement measure information transmission unit 216 may also include in the improvement measure information, and transmit, the position, orientation, speed, angle, and the like of each part included in the reference information.
  • The improvement measure information transmission unit 216 may also search for improvement measures based on the evaluation information and the reference values even without an improvement measure request, and transmit them to the user terminal 10.
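  • As a hypothetical sketch of how advice could be selected from the improvement measure information (purpose / category / condition / advice); the representation, the example entry, and the threshold are assumptions, not part of the disclosure.

      # Hypothetical selection of advice from improvement measure information;
      # the condition encoding and the example measure are illustrative only.
      from dataclasses import dataclass
      from typing import Callable, List

      @dataclass
      class ImprovementMeasure:
          purpose: str
          category: str
          condition: Callable[[dict], bool]  # applied to evaluation results / physical info
          advice: str

      MEASURES = [
          ImprovementMeasure(
              purpose="to stabilize the lower body",
              category="posture",
              condition=lambda m: m.get("knee_angle_deg", 180) > 100,
              advice="Bend the knees more deeply at the lowest point of the lift.",
          ),
      ]

      def select_advice(purpose: str, measured: dict) -> List[str]:
          # Return the advice of every measure matching the purpose whose condition holds.
          return [m.advice for m in MEASURES
                  if m.purpose == purpose and m.condition(measured)]
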
  • FIG. 17 is a diagram showing an example of the flow of processing executed in the guidance support server of the present embodiment.
  • the imaging unit 111 receives the input of the mode, images the body during the exercise of the user, and acquires the moving image data (S321).
  • the evaluation request transmission unit 112 transmits an evaluation request including a user ID indicating a user, a received mode, physical information, and moving image data to the server device 20 (S322).
  • the image analysis unit 212 analyzes the moving image data to extract the feature amount (S323) and specifies the position of each part and each part (S324).
  • the image analysis unit 212 may specify the position on the image, or may use the body information to determine the actual size position (height from the ground, distance from the reference point such as the center of gravity of the body, etc.). You may try to specify.
  • the evaluation unit 213 acquires an evaluation rank and a comment from the evaluation condition information in which the position or part of each part, the position or part of each part, or the movement of the part (change in time series of position) satisfies the condition (S325).
  • the evaluation information transmission unit 214 creates evaluation information and transmits it to the user terminal 10 (S326).
  • the evaluation display unit 114 displays the position, posture, movement, position, orientation, movement, etc. of the body on the moving image data based on the received evaluation information (S327). Further, in the user terminal 10, the evaluation display unit 114 may display the position (bone) of each unit indicating the posture of the body, and may display the evaluation rank and the comment (S327). Here, the evaluation display unit 114 may display a graph of the position, orientation, movement, etc. of the portion, and the time-series change of the position, movement, etc. of the portion. Further, the checkpoint display unit 115 may extract the checkpoint image from the moving image and display it. The improvement measure request transmission unit 116 transmits the improvement measure request to the server device 20 in response to an instruction from the user (S328).
• The improvement measure information transmission unit 216 searches for improvement measure information that satisfies the conditions, acquires the advice included in the retrieved improvement measure information (S329), creates improvement measure information including the acquired advice, and transmits it to the user terminal 10 (S330).
• The improvement measure information display unit 118 displays the advice included in the received improvement measure information and can superimpose the suitable way of using the tool on the moving image data (S331).
• The improvement measure information display unit 118 can also display the advice included in the received improvement measure information and superimpose a suitable body posture, in the form of a bone, on the moving image data (S331).
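By way of illustration of the evaluation-condition lookup performed in step S325, the following is a minimal Python sketch. The rank boundaries, the measured quantity (a joint-angle deviation), and the comments are assumptions for the sketch and do not come from the publication.

    # Hypothetical evaluation-condition table: each row maps a range of a
    # measured quantity (here, deviation of a joint angle from the reference,
    # in degrees) to an evaluation rank and a comment.
    EVALUATION_CONDITIONS = [
        {"max_abs_deviation": 5.0,  "rank": "A", "comment": "Posture closely matches the reference."},
        {"max_abs_deviation": 15.0, "rank": "B", "comment": "Posture is acceptable but can be tightened."},
        {"max_abs_deviation": 1e9,  "rank": "C", "comment": "Posture deviates noticeably from the reference."},
    ]

    def evaluate(measured_angle_deg, reference_angle_deg):
        """Return (rank, comment) for the first condition row that the
        deviation from the reference satisfies."""
        deviation = abs(measured_angle_deg - reference_angle_deg)
        for row in EVALUATION_CONDITIONS:
            if deviation <= row["max_abs_deviation"]:
                return row["rank"], row["comment"]
        return None, None

    print(evaluate(98.0, 90.0))   # ('B', 'Posture is acceptable but can be tightened.')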
• According to the guidance support server of the present embodiment, physical exercise can be evaluated easily.
• For physical exercise related to sports, since the positional relationships and movements of each portion of the tool and each part of the body can be evaluated, it is easier to arrive at concrete improvement efforts, and improved results can be expected.
• Since the guidance support server of the present embodiment also provides comments and advice, the user can easily grasp the current situation and the improvement measures.
  • FIG. 18 is a diagram showing an example of a screen displaying an evaluation of physical exercise using a tool.
  • the figure illustrates the case where a moving image is captured in the weightlifting mode.
  • the screen 41 displays a mark 411 indicating the position of the barbell shaft.
  • the movement of the barbell shaft is indicated by line 412.
  • FIG. 19 is another diagram showing an example of a screen displaying an evaluation of physical exercise using a tool.
  • the figure illustrates the case where a moving image is captured in the weightlifting mode.
• In the weightlifting mode, for example, the inclination of the barbell shaft, the moving distance of the shaft, and the moving speed (line 421) evaluated by the evaluation unit 213 are displayed. The reference value (line 422) and the actual measurement result (line 423) may be displayed, and the difference from the reference value may be shown as a numerical value, a graph, or the like. The user can refer to this to consider which movements and postures to correct.
  • FIG. 20 is another diagram showing an example of a screen displaying an evaluation of physical exercise using a tool.
  • the figure illustrates the case where a moving image is captured in the weightlifting mode.
• The evaluation result (line 432) is displayed together with a reference value (line 431), such as the angle of a body joint at the lowest point of the barbell.
• The difference from the reference value may be shown as a numerical value, a graph, or the like.
• An evaluation result based on the relationship between the body and the tool may also be displayed. The user can refer to this to consider which movements and postures to correct.
  • FIG. 21 is another diagram showing an example of a screen displaying an evaluation of physical exercise using a tool.
  • the figure illustrates the case where a moving image is captured in the weightlifting mode.
• Line 441 shows a bone drawn by connecting, with lines, predetermined positions of the tool portions and body parts specified from the image. The bone may be superimposed on the captured image when displayed.
• Line 442 indicates the acceleration of each part of the body. A count result, such as how many times the weight has been lifted, may also be displayed. Further, as shown by line 443, the evaluation result and the next training content according to the purpose and the like may be shown.
  • FIG. 22 is another diagram showing an example of a screen displaying an evaluation of physical exercise using a tool.
  • the figure illustrates the case where a moving image is captured in the weightlifting mode.
  • the evaluation (line 441) obtained by comparison with various reference values is displayed.
• The group analysis unit 217 is intended for situations in which multiple users use this system at the same time and a small number of supporters provide guidance remotely (for example, training guidance using online meeting tools or online communication tools, remote rehabilitation support, and the like). In such situations there may be many users per supporter. Although the terminal used by the supporter displays image information of multiple users in real time, it is difficult for the supporter to grasp detailed user status, such as which user is training effectively and which user is tired, and as a result the effect is diminished. In addition, it is difficult for the supporter and the user to communicate based on the content of the training actually performed, and continuity tends to be low. The group analysis unit 217 addresses these problems.
• The group analysis unit 217 analyzes the evaluation information produced by the evaluation unit 213 for each user's image, and stores the analyzed information in the group analysis information storage unit 235.
• Based on the evaluation information produced by the evaluation unit 213 for the physical exercise performed by each user (including exercise using a tool), the group analysis unit 217 creates a ranking of users whose physical exercise was effective (or ineffective), for example in order of the number of repetitions performed or of closeness to (or distance from) the reference value. The exercise currently being performed may also be analyzed in real time.
  • each user may be categorized into a high-ranking group, a low-ranking group, and the like.
• It may be determined that the user's fatigue is increasing when the position, movement speed, movement distance, joint angle, and the like of the tool portions and body parts move away from the reference values as the physical exercise continues, and the degree of fatigue may be estimated from the degree of deviation from the reference values. Furthermore, the time required to complete a predetermined number of repetitions may be measured for each set, and the degree of fatigue may be determined from an increase in the completion time, or estimated by comparing the completion time with that of the previous set.
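The following is a minimal Python sketch of one possible fatigue estimate along these lines. The cues (growing deviation from the reference and per-set slowdown), the weights, and the thresholds are assumptions for the sketch, not values from the publication.

    # Sketch of fatigue estimation (the weighting and thresholds are assumptions).
    # Fatigue is estimated from (1) how far the tracked quantities drift from
    # their reference values as the exercise continues and (2) how much the
    # per-set completion time grows relative to the first set.

    def fatigue_score(deviations, set_times):
        """deviations: per-repetition relative deviations from the reference
        (e.g. 0.1 == 10% off); set_times: completion time of each set in seconds."""
        drift = 0.0
        if len(deviations) >= 2:
            half = len(deviations) // 2
            early = sum(deviations[:half]) / half
            late = sum(deviations[half:]) / (len(deviations) - half)
            drift = max(0.0, late - early)          # growing deviation suggests fatigue
        slowdown = 0.0
        if len(set_times) >= 2 and set_times[0] > 0:
            slowdown = max(0.0, set_times[-1] / set_times[0] - 1.0)  # e.g. 0.3 == 30% slower
        # Combine the two cues into a 0..1 score (weights are arbitrary).
        return min(1.0, 0.5 * min(drift / 0.2, 1.0) + 0.5 * min(slowdown / 0.5, 1.0))

    print(fatigue_score([0.05, 0.06, 0.12, 0.18], [42.0, 47.0, 55.0]))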
• The group analysis unit 217 may also evaluate the time from the supporter's signal to start or end the exercise until the physical exercise actually starts, and estimate the user's degree of concentration. Further, the group analysis unit 217 may calculate a deviation value (standard score) from the evaluation information among users who are participating in training at the same time, or create a ranking based on that deviation value, and may likewise calculate a deviation value or create a ranking among users who performed the same physical exercise at different timings. From the ranking, the degree of fatigue, the degree of concentration, and the like, it may identify users who need guidance, or users for whom a direct comment is considered more effective from the viewpoint of continuity. The group analysis unit 217 may also compare the evaluation information from when the user previously performed the same physical exercise with the evaluation information evaluated from the image of the newly performed physical exercise.
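As an illustration of the deviation value (standard score) and ranking mentioned above, the following is a minimal Python sketch; the per-user evaluation scores are assumed to be numeric summaries produced elsewhere, and the example values are invented.

    # Sketch of the per-user deviation value (standard score, mean 50, SD 10)
    # and ranking used for group analysis.
    import statistics

    def deviation_values(scores):
        """scores: {user_id: numeric evaluation score}. Returns {user_id: standard score}."""
        values = list(scores.values())
        mean = statistics.mean(values)
        sd = statistics.pstdev(values)
        if sd == 0:
            return {u: 50.0 for u in scores}
        return {u: 50.0 + 10.0 * (s - mean) / sd for u, s in scores.items()}

    def ranking(scores):
        """Users ordered from most to least effective exercise."""
        return sorted(scores, key=scores.get, reverse=True)

    scores = {"user_a": 82, "user_b": 65, "user_c": 91, "user_d": 58}
    print(ranking(scores))                  # ['user_c', 'user_a', 'user_b', 'user_d']
    print(deviation_values(scores))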
• The group analysis information storage unit 235 stores the group analysis information produced by the group analysis unit 217.
  • FIG. 16 is a diagram showing a configuration example of group analysis information stored in the group analysis information storage unit 235.
  • Group analysis information largely includes ranking information, status information, and caution information.
• The ranking information includes the ranking based on the evaluation information and category information such as whether the user belongs to the upper group or the lower group.
• The status information includes the degree of fatigue and the degree of concentration. The caution information includes information such as users who particularly need to be instructed.
  • the group analysis information presentation unit 218 presents the group analysis information stored in the group analysis information storage unit 235 to the user terminal 10 and the supporter terminal 40.
• For the user terminal 10, the group analysis information presentation unit 218 presents rankings, deviation values, and categories such as upper group and lower group, both among users who are performing physical exercise at the same time and among users including those who performed similar physical exercise at different timings. The group analysis information presentation unit 218 may also present the information stored in the evaluation information storage unit 132. For the supporter terminal 40, the group analysis information presentation unit 218 may present each user's ranking and deviation value information in an easy-to-see manner, for example by coloring the users at the top (or bottom) of the ranking, displaying the screen of such a user in a larger size, or presenting it at the top.
  • FIG. 23 is a diagram showing an example of a screen created by presenting the group analysis information presentation unit 218 to the user terminal 10.
  • the figure illustrates the case where a moving image is captured in squat mode.
• Line 461 displays the user's rank within their own group.
  • the line 462 shows the number of exercises performed.
  • Line 463 displays a real-time video (or recorded video) of the trainer as a model.
  • FIG. 24 is a diagram showing an example of a screen created by the group analysis information presentation unit 218 presenting to the supporter terminal 40.
  • the evaluation result of each user, the group analysis information, and the like are displayed together with the moving image of each user.
  • FIG. 25 is a diagram showing an example of a screen created by the group analysis information presentation unit 218 presenting to the supporter terminal 40. Images of users are arranged on the screen 47, and users who need more guidance (including support) in the group are displayed in an easy-to-understand manner by evaluation and group analysis. On the screen 47, as an example, a mark is displayed on the upper right of the screen showing a user who needs guidance.
  • FIG. 26 is a diagram showing an example of a screen created by the group analysis information presentation unit 218 presenting to the supporter terminal 40.
• The screen 48 shows the details of each user, and includes the evaluation results (the bone, posture values, tool orientation values, the number of repetitions of the physical exercise, etc.), the group analysis results (the ranking within the group, changes in it, etc.), the advice transmitted by the improvement measure information transmitting unit 216 for the user, and a function for viewing a recorded image (or part of the moving image, such as a few seconds before and after the time when the evaluation dropped).
  • the screen may be a combination of the screen 47 and the screen 48.
  • the reference information request receiving unit 219 receives the reference information request transmitted from the user terminal 10.
  • the reference information request receiving unit 219 registers information including image data included in the received reference information request (hereinafter, referred to as image information) in the image data storage unit 231.
• FIG. 12 is a diagram showing a configuration example of the image information stored in the image data storage unit 231. As shown in the figure, the image information includes image data in association with a user ID indicating the photographed user. The image data is the data that was included in the reference information request. A reference information request that does not include image data is also assumed.
• The reference information determination unit 220 searches for reference information satisfying the user's needs based on the mode, purpose, and image information included in the reference information request, the physical information, the evaluation information, and the like, and transmits it to the user terminal 10. The reference information determination unit 220 may also receive the selection, transmitted by the reference value selection information transmission unit 120, of one of the transmitted reference information items, and the image analysis unit 212 may perform analysis based on the selected reference information. Further, even without a reference information request, the reference information determination unit 220 may transmit candidate reference information to the user based on the purpose, physical information, basic information, and the like.
• The reference information determination unit 220 may present, together with the reference value options, an option for the user to request other reference values; in that case, options for selecting a new reference value, such as "increase the strength" or "continue for a longer time," may be presented.
• For example, when the reference information request includes the weightlifting mode and the purpose "muscle hypertrophy of the upper part of the pectoralis major muscle," the reference information determination unit 220 selects, from the reference information stored in the reference information storage unit 232, the reference information that matches training of the upper part of the pectoralis major. In this case, for example, reference information 1 "for training the upper part of the pectoralis major: push the arm diagonally upward," reference information 2 "for training the inside of the pectoralis major: close the arms in front," and reference information 3 "for training the lower part of the pectoralis major: push the arm diagonally downward" are stored in the reference information storage unit 232.
• The reference information determination unit 220 transmits the reference information matching the purpose to the user terminal 10, and when the user selects one of the reference information items, the reference information determination unit 220 accepts the selection and the evaluation unit 213 uses that reference information for evaluation.
• Alternatively, when the reference information request includes the weightlifting mode and the purpose "muscle hypertrophy of the upper part of the pectoralis major muscle," the reference information determination unit 220 selects, from the plurality of reference information items stored in the reference information storage unit 232, the reference information suitable for training the upper part of the pectoralis major, and may further narrow the selection based on the physical information. In this case, for example, reference information 1 "for training the upper part of the pectoralis major: push the arm diagonally upward," reference information 2 "for training the inside of the pectoralis major: close the arms in front," and reference information 3 "for training the lower part of the pectoralis major: push the arm diagonally downward" are stored, and it is assumed that training the upper part of the pectoralis major has little effect unless the inside of the pectoralis major has been trained to some extent. The reference information determination unit 220 may then further use the user's physical information at that time (for example, muscle strength, in particular the strength of the inside of the pectoralis major) and transmit reference information 2 "for training the inside of the pectoralis major: close the arms in front" to the user terminal 10. Training steps and advice such as "First train the inside of the pectoralis major (that is, reference information 2), and then train the upper part of the pectoralis major (that is, reference information 1)" may also be transmitted.
• Similarly, when the reference information request includes the weightlifting mode and the purpose "muscle hypertrophy of the upper part of the pectoralis major muscle," the reference information determination unit 220 selects, from the plurality of reference information items stored in the reference information storage unit 232, the reference information suitable for training the upper part of the pectoralis major, and may further narrow the selection based on the evaluation information. In this case, for example, reference information 1 "for training the upper part of the pectoralis major: push the arm diagonally upward," reference information 2 "for training the inside of the pectoralis major: close the arms in front," and reference information 3 "for training the lower part of the pectoralis major: push the arm diagonally downward" are stored. The reference information determination unit 220 may then further use the user's evaluation information at that time, which can be regarded as the user's skill (for this purpose, the image information included in the reference information request may be analyzed by the image analysis unit 212 and evaluated by the evaluation unit 213), such as information on the movement of the tool (for example, whether the barbell can be raised and lowered smoothly), and first transmit the lower-difficulty reference information 2 "for training the inside of the pectoralis major: close the arms in front" to the user terminal 10. Training steps and advice such as "First practice the movement that trains the inside of the pectoralis major (that is, reference information 2), and then train the upper part of the pectoralis major (that is, reference information 1)" may also be transmitted.
• Likewise, when the reference information request includes the weightlifting mode and the purpose "muscle hypertrophy of the upper part of the pectoralis major muscle," the reference information determination unit 220 selects, from the plurality of reference information items stored in the reference information storage unit 232, the reference information suitable for training the upper part of the pectoralis major, and may further narrow the selection based on past training history information included in the physical information. In this case, for example, reference information 1 "for training the upper part of the pectoralis major: push the arm diagonally upward," reference information 2 "for training the inside of the pectoralis major: close the arms in front," and reference information 3 "for training the lower part of the pectoralis major: push the arm diagonally downward" are stored. The reference information determination unit 220 may then further use the user's past training history information (for example, the number of times training of the inside of the pectoralis major has been performed, or the fact that it has already been trained sufficiently) and transmit reference information 1 "for training the upper part of the pectoralis major: push the arm diagonally upward" to the user terminal 10. Training steps and advice such as "Since the inside of the pectoralis major (that is, reference information 2) has already been trained sufficiently, let's train the upper part of the pectoralis major (that is, reference information 1)" may also be transmitted.
• Further, when the reference information request includes the weightlifting mode and the purpose "muscle hypertrophy of the upper part of the pectoralis major muscle," the reference information determination unit 220 selects, from the plurality of reference information items, the reference information matching training of the upper part of the pectoralis major, but it may also transmit a physical exercise menu or the like to the user terminal 10 so that the user performs one or more physical exercises, and select the reference information based on the evaluation of the physical exercise captured by the user. In this case, for example, reference information 1 "for upper pectoralis major training A: push the arm diagonally upward with a 20 kg barbell," reference information 2 "for upper pectoralis major training B: push the arm diagonally upward with a 40 kg barbell," and reference information 3 "pectoral muscle training basics: 30 push-ups" are stored, with the next step being taken once the pectoral muscles have been trained to some extent with respect to each reference information item. The reference information determination unit 220 first transmits an exercise menu (for example, pull-ups) to the user terminal 10, has the user capture an image of the physical exercise, and the image analysis unit analyzes the received image and counts the number of repetitions and the like. If the result exceeds the standard, the reference information determination unit 220 transmits reference information 1 "for upper pectoralis major training A: push the arm diagonally upward with a 20 kg barbell" to the user terminal 10 as the next step. If the standard is not exceeded, reference information 3 "pectoral muscle training basics: 30 push-ups," which is for clearing the standard, may be transmitted to the user terminal 10. In this case, training steps and advice such as "To train the upper part of the pectoralis major (that is, reference information 1 or 2), let's first do basic pectoral training with push-ups (reference information 3)" may also be transmitted.
• The reference information determination unit 220 may present the reference information to the user together with a recommendation level based on the mode, purpose, and image information included in the reference information request, the physical information, the evaluation information, and the like. The recommendation level may be indicated as a numerical value, such as a score out of 100 points or a percentage, as a degree such as 3 out of 5 levels, by a mark such as a double circle, circle, or triangle, a thumbs-up, or a thumbs-down, or by a symbol or color conveying an impression of the degree of suitability. The number of users who have selected the reference value in the past, and the evaluations of the reference value obtained from those users, may also be presented.
• The reference information determination unit 220 may derive the recommendation level of the reference information presented to the user from the degree of match between the user's purpose and the purpose set for the reference value; from the relationship between the evaluation information and the reference value (for example, if the evaluation information indicates a fast bat swing speed, a reference value intended for users with a fast swing speed has a high degree of suitability); or from the relationship between the physical information and the reference value (for example, if the physical information indicates a left-handed batter, a batting reference value for left-handed batters has a high degree of suitability); and these may be used alone or in combination to derive the recommendation level.
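The following is a minimal Python sketch of one way such a recommendation level could be combined from purpose match, evaluation fit, and physical fit. The attribute names, the fit rules, and the weights are assumptions for the sketch, not a definition from the publication.

    # Sketch of a recommendation level for a candidate reference value,
    # combining purpose match, fit to the evaluation information, and fit to
    # the physical information. The attributes and weights are assumptions.

    def recommendation_level(reference, purpose, evaluation, physical,
                             weights=(0.5, 0.3, 0.2)):
        """Return a 0-100 recommendation level for one reference value."""
        purpose_match = 1.0 if reference["purpose"] == purpose else 0.0
        # e.g. a reference value aimed at fast swings fits a user whose measured
        # swing speed is at least the reference's assumed speed.
        eval_fit = 1.0 if evaluation.get("swing_speed", 0) >= reference.get("min_swing_speed", 0) else 0.5
        # e.g. a reference value for left-handed batters fits a left-handed user.
        phys_fit = 1.0 if physical.get("handedness") == reference.get("handedness", physical.get("handedness")) else 0.5
        w1, w2, w3 = weights
        return round(100 * (w1 * purpose_match + w2 * eval_fit + w3 * phys_fit))

    ref = {"purpose": "increase bat speed", "min_swing_speed": 30.0, "handedness": "left"}
    print(recommendation_level(ref, "increase bat speed",
                               {"swing_speed": 33.5}, {"handedness": "left"}))   # 100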
• In the present embodiment, the image is analyzed by the server device 20, but the present invention is not limited to this; the user terminal 10 may analyze the image and specify the positional relationships of the parts and portions.
• In the present embodiment, the positions of the parts and portions are assumed to be positions on the two-dimensional image, but the present invention is not limited to this; three-dimensional positions may be used.
• The three-dimensional positions of the parts and portions can be specified based on the image from the camera 106 and a depth map from a depth camera.
• A three-dimensional image may also be estimated in order to specify the three-dimensional position of a part or portion. It is also possible to provide a depth camera instead of the camera 106 and specify the three-dimensional position from the depth map alone.
• The depth map may be transmitted from the user terminal 10 to the server device 20 together with, or instead of, the image data, and the image analysis unit 212 of the server device 20 can then analyze the three-dimensional positions.
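By way of illustration, the following is a minimal Python sketch of recovering a three-dimensional position for a detected keypoint from its pixel coordinates and a depth value, using a pinhole camera model. The camera intrinsics and the example numbers are assumptions for the sketch.

    # Sketch of recovering a 3-D position for a detected keypoint from its pixel
    # coordinates and a depth map, using a pinhole camera model. The intrinsics
    # (fx, fy, cx, cy) are assumed to be known from calibration.

    def pixel_to_3d(u, v, depth_m, fx, fy, cx, cy):
        """(u, v): pixel coordinates of the keypoint; depth_m: depth at that pixel
        in metres. Returns (X, Y, Z) in the camera coordinate system."""
        x = (u - cx) * depth_m / fx
        y = (v - cy) * depth_m / fy
        return (x, y, depth_m)

    # Example: a wrist keypoint detected at pixel (700, 300) with 1.8 m depth,
    # for a camera with fx = fy = 900 and principal point (640, 360).
    print(pixel_to_3d(700, 300, 1.8, 900.0, 900.0, 640.0, 360.0))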
• In the present embodiment, an image of the body of the user exercising with the tool is transmitted from the user terminal 10 to the server device 20, but the present invention is not limited to this. The user terminal 10 may extract the feature amounts from the image and transmit the feature amounts to the server device 20. Alternatively, the user terminal 10 may estimate the tool portions and body parts based on the feature amounts, acquire the absolute positions of the parts and portions (which may be positions in the XY coordinates of the image, actual-size distances from a reference position such as the ground, the toes, the head, or the body's center of gravity, or positions in any other coordinate system) and, for a plurality of parts and portions, the relative positional relationships among them, and transmit these absolute positions and relative positional relationships to the server device 20.
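The following is a minimal Python sketch of the kind of compact payload a terminal could send instead of raw video: absolute keypoint positions plus a few relative relationships (a distance and a joint angle). The keypoint names and coordinates are assumptions for the sketch.

    # Sketch of compact features sent instead of raw video: absolute keypoint
    # positions (image coordinates) plus relative relationships between them.
    import math

    keypoints = {               # pixel coordinates from a pose estimator
        "shoulder": (410, 220),
        "elbow":    (452, 310),
        "wrist":    (505, 365),
        "barbell_center": (500, 150),
    }

    def distance(a, b):
        return math.dist(a, b)

    def joint_angle(a, b, c):
        """Angle at point b (degrees) formed by segments b->a and b->c."""
        v1 = (a[0] - b[0], a[1] - b[1])
        v2 = (c[0] - b[0], c[1] - b[1])
        dot = v1[0] * v2[0] + v1[1] * v2[1]
        norm = math.hypot(*v1) * math.hypot(*v2)
        return math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))

    payload = {
        "absolute": keypoints,
        "relative": {
            "wrist_to_barbell": distance(keypoints["wrist"], keypoints["barbell_center"]),
            "elbow_angle_deg": joint_angle(keypoints["shoulder"], keypoints["elbow"], keypoints["wrist"]),
        },
    }
    print(payload["relative"])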
• In the present embodiment, content prepared on the server device 20 side is provided as the improvement measure information, but the present invention is not limited to this. For example, the reference value may be included, and marks and bones representing the correct movement and posture based on the reference value may be superimposed and displayed on the moving image or on a still image extracted from it. This makes it easy to grasp what kind of movement and posture should be adopted.
• In the present embodiment, the position and orientation of the tool portions and the position and movement of the body parts are evaluated, but the present invention is not limited to this; the position of a tool worn by the user may also be specified and evaluated.
• For example, the server device 20 may store a reference value for the tool and the size of the tool (length, etc.) in association with the user's physical information (height, weight, etc.), extract the feature amounts of the tool from the image data, specify the shape of the tool, and estimate the size of the tool from the shape and from the user's size included in the physical information (for example, height). If the difference between the estimated size of the tool and the reference value is equal to or greater than a predetermined threshold, a tool of the reference size can be recommended.
• A tool suited to the user's purpose may also be recommended from information such as conditions on the tool itself (barbell weight, etc.), how the tool is used, physical conditions (flexibility, etc.), and the position, orientation, and movement of the tool portions.
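The following is a minimal Python sketch of estimating a tool's real-world size from the image by using the user's known height as a scale reference, then recommending the reference size when the difference exceeds a threshold. All names and numbers are illustrative assumptions.

    # Sketch of tool-size estimation from the image using the user's height as
    # a scale reference; all thresholds and example values are assumptions.

    def estimate_tool_length_cm(tool_length_px, user_height_px, user_height_cm):
        """Scale the tool's pixel length by the cm-per-pixel ratio derived
        from the user's height."""
        cm_per_px = user_height_cm / user_height_px
        return tool_length_px * cm_per_px

    def recommend_if_mismatch(estimated_cm, reference_cm, threshold_cm=10.0):
        if abs(estimated_cm - reference_cm) >= threshold_cm:
            return f"A tool of about {reference_cm:.0f} cm is recommended."
        return "The current tool size is within the recommended range."

    estimated = estimate_tool_length_cm(tool_length_px=380, user_height_px=620, user_height_cm=172)
    print(round(estimated, 1), recommend_if_mismatch(estimated, reference_cm=120.0))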
• In the present embodiment, content such as advice is provided as the improvement measures, but, for example, the physical exercise being performed may also be interrupted.
• For example, the server device 20 stores, in association with the user's physical information (purpose, height, weight, etc.), a reference value at which the physical exercise should be interrupted, and determines from the image data the number of repetitions and the speed of the physical exercise performed by the user (for example, the barbell lifting speed has become extremely slow, or the barbell has been lifted too many times). When the physical exercise should be interrupted, a comment telling the user to stop may be sent to the user terminal 10, the user may be notified by a change in the display such as turning off the screen, a sound such as an alert may be emitted, or the user may be notified by vibration.
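By way of illustration, the following is a minimal Python sketch of such an interruption check, triggering when the lifting speed drops far below the reference or the repetition count exceeds a limit. The thresholds and the notification placeholder are assumptions for the sketch.

    # Sketch of an interruption check: stop the exercise when the lifting speed
    # has dropped far below the reference or the repetition count has exceeded
    # a stored limit. Thresholds are illustrative assumptions.

    def should_interrupt(rep_speeds_mps, rep_count, reference_speed_mps,
                         min_speed_ratio=0.5, max_reps=30):
        """rep_speeds_mps: average bar speed of each repetition so far."""
        too_slow = bool(rep_speeds_mps) and (rep_speeds_mps[-1] < min_speed_ratio * reference_speed_mps)
        too_many = rep_count > max_reps
        return too_slow or too_many

    def notify(user_terminal, message):
        # Placeholder for sending a comment, dimming the screen, playing an
        # alert sound, or vibrating the user terminal.
        print(f"[{user_terminal}] {message}")

    if should_interrupt(rep_speeds_mps=[0.52, 0.48, 0.21], rep_count=12, reference_speed_mps=0.50):
        notify("user_terminal_10", "Lifting speed has dropped sharply. Consider ending this set.")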
• In the present embodiment, content such as advice is provided as the improvement measures, but, for example, physical exercise for determining an illness and for improving it may also be presented.
• For example, the server device 20 may extract candidate diseases that the user is presumed to have from the symptoms entered by the user in the physical information and from the evaluation information, present a screening test for narrowing down the disease name, and recommend items such as consulting a doctor, exercises for improvement, tools for the physical exercise, and meals.
• The server device 20 can estimate the speed, acceleration, moving distance, trajectory, and the like of the tool. Further, by extracting patterns in the time-series change of the tool's position, the server device 20 can estimate the number of pattern occurrences as the number of operations performed using the tool.
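The following is a minimal Python sketch of counting repetitions from the time series of the tool's vertical position by detecting sufficiently prominent local maxima. The sample series and the prominence threshold are illustrative assumptions.

    # Sketch of counting repetitions from the time series of the tool's vertical
    # position (e.g. the barbell shaft height per frame): one repetition is
    # counted for each local maximum that rises by more than `prominence`
    # above the preceding valley.

    def count_reps(heights, prominence=0.15):
        reps = 0
        last_valley = heights[0]
        rising = False
        for prev, cur in zip(heights, heights[1:]):
            if cur < last_valley:
                last_valley = cur
            if cur > prev:
                rising = True
            elif cur < prev and rising:
                # A peak occurred at `prev`; count it if it rose enough above the valley.
                if prev - last_valley >= prominence:
                    reps += 1
                    last_valley = prev       # reset; the next descent lowers this again
                rising = False
        return reps

    # Barbell height (metres) over time for three lifts and one aborted attempt.
    heights = [0.9, 1.1, 1.4, 1.6, 1.3, 1.0, 0.9, 1.2, 1.5, 1.6, 1.2, 0.95,
               1.0, 1.05, 1.0, 0.9, 1.3, 1.6, 1.0]
    print(count_reps(heights))   # 3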
• In the present embodiment, the movement is evaluated, but the present invention is not limited to this; when a certain posture or movement is detected, a task for that movement may be proposed.
• In this case, the server device 20 may store the task in place of the evaluation comment, in association with one posture or movement or a series of them, and output the task.
• In the present embodiment, the exercise is evaluated, but the present invention is not limited to this. When the movement of a certain tool portion, the orientation of a tool portion, a posture, or the movement of a body part is detected, content for improving the physical exercise according to the purpose may be presented, such as training or rehabilitation to be performed, or preparatory stages thereof such as stretching, strength training, and posture work.
• In this case, the server device 20 may store the content of the training or the like in place of the evaluation comment, in association with the movement of one or a series of tool portions, the orientation of the tool portions, the body posture, or the movement of body parts, and output that content.
• In the present embodiment, the motion is evaluated, but the present invention is not limited to this; the motion performed by the user can also be detected automatically.
• For example, the server device 20 stores, as reference information, the positions of the tool portions and the body posture (the positions of the body parts) for a predetermined motion such as a shot or a pass, and by comparing the positions of the tool portions and body parts analyzed from the image with the reference information, it can identify the motion performed by the user in the image.
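The following is a minimal Python sketch of identifying which stored motion a detected pose most resembles by nearest-template comparison. The keypoint names, the reference poses, and the distance threshold are assumptions for the sketch.

    # Sketch of identifying which predefined motion (e.g. "shot" vs "pass") a
    # frame most resembles, by comparing the detected keypoint positions with
    # stored reference poses and picking the closest one.
    import math

    REFERENCE_POSES = {
        "shot": {"wrist": (0.55, 0.15), "elbow": (0.50, 0.35), "ball": (0.58, 0.10)},
        "pass": {"wrist": (0.60, 0.55), "elbow": (0.52, 0.60), "ball": (0.70, 0.55)},
    }

    def pose_distance(pose_a, pose_b):
        """Mean Euclidean distance over the keypoints both poses share."""
        keys = pose_a.keys() & pose_b.keys()
        return sum(math.dist(pose_a[k], pose_b[k]) for k in keys) / len(keys)

    def identify_motion(observed_pose, max_distance=0.2):
        best_name, best_dist = None, float("inf")
        for name, ref in REFERENCE_POSES.items():
            d = pose_distance(observed_pose, ref)
            if d < best_dist:
                best_name, best_dist = name, d
        return best_name if best_dist <= max_distance else None

    observed = {"wrist": (0.57, 0.18), "elbow": (0.51, 0.33), "ball": (0.60, 0.12)}
    print(identify_motion(observed))   # 'shot'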
• In the present embodiment, an image captured in the past is analyzed to evaluate the motion, but the present invention is not limited to this; the analysis may be performed in real time, and when a predetermined motion is detected, the tactic that should be executed next may be recommended. In this case, the tactic may be stored in place of the evaluation comment in association with the posture or movement, and output in real time.
  • the supporter terminal 40 may also serve as the image pickup terminal 50 or may also serve as the output terminal 60.
• For example, the supporter terminal 40 may have the shape of glasses, contact lenses, a hat, an HMD (head-mounted display), or the like; the image pickup function of the supporter terminal 40 captures a range close to the supporter's field of view, and the captured image is sent to the server device 20 via the communication network 30.
• The evaluation unit 213 and the group analysis unit 217 process the image, and the result is sent to the supporter terminal 40 or the user terminal 10 via the communication network 30. As shown in the figure, the supporter terminal 40 outputs information to the supporter or the user using, for example, a virtual image projection method, a retinal projection method, or another method, or via an interface such as a BMI (brain-machine interface) that uses brain activity such as electroencephalograms to input characters, images, video, and the like by direct stimulation of the brain without going through the sensory organs.
• While the user's physical movement is seen as a real image, the results of processing by the evaluation unit 213 and the group analysis unit 217 may be superimposed on the field of view that the supporter sees with the naked eye through the supporter terminal 40.
• When the supporter terminal 40 is an HMD or the like, the image captured by the supporter terminal 40 may be presented to the supporter with the results of processing by the evaluation unit 213 and the group analysis unit 217 superimposed on it.
  • the user terminal 10 may also serve as the image pickup terminal 50 or may also serve as the output terminal 60.
• For example, the user terminal 10 may have the shape of glasses, contact lenses, a hat, an HMD (head-mounted display), or the like; its image pickup function captures a range close to the user's field of view, and the captured image is sent to the server device 20 via the communication network 30.
• The evaluation unit 213 and the group analysis unit 217 process the image, and the result is sent to the user terminal 10 or the supporter terminal 40 via the communication network 30. As shown in the figure, the supporter terminal 40 outputs information to the user or the supporter using, for example, a virtual image projection method, a retinal projection method, or another method, or via an interface such as a BMI (brain-machine interface) that uses brain activity such as electroencephalograms to input characters, images, video, and the like by direct stimulation of the brain without going through the sensory organs.
  • a plurality of reference values may exist, and the user may select the reference value for a fee.
  • the reference value may include one created based on physical exercise (including one performed by using a tool) of a skilled person such as a professional athlete.
• The user terminal 10, the supporter terminal 40, the image pickup terminal 50, and the output terminal 60 may be wearable terminals such as eyeglass-type devices, and the results of imaging, analysis, and evaluation performed with a wearable terminal may be output to the wearable terminal. Non-wearable terminals, such as a mobile user terminal 10, supporter terminal 40, image pickup terminal 50, or output terminal 60, may exist at the same time, and the results of the analysis and evaluation may also be displayed on those non-wearable terminals.
• In the present embodiment, the execution of the predetermined functions and the storage of information are performed by the user terminal 10 or the server device 20, but the present invention is not limited to this; either device may execute the functions and store the information, or the functional units and the storage units may be distributed in a form different from that of the present embodiment.

Abstract

[Problem] To easily evaluate physical exercise. [Solution] Provided is a guidance support system which supports, by a supporter, guidance given to a user who performs physical exercise, and comprises: a reference value storage unit which stores a reference value related to the position of at least one part of a body or a portion of a tool; a part and portion specifying unit which specifies the part or the portion by analyzing an image of the user's physical exercise; an evaluation unit which determines an evaluation value of the physical exercise by comparing the position of the part or the portion in the image with the reference value; and a supporter terminal having an imaging function for capturing an image of the user's physical exercise and a display function for displaying the evaluation value.

Description

Guidance support system
The present invention relates to a guidance support system.
Guidance and coaching for physical exercise are widely practiced in areas such as competitive sports, strength training, and rehabilitation.
Guidance and coaching are often provided by experienced and knowledgeable supporters (including instructors, coaches, physiotherapists, and care workers), and they are useful because users who perform physical exercise can obtain objective opinions. However, such guidance is often based on the supporter's intuition and experience, and when the supporter cannot verbalize it, the effectiveness of the guidance and coaching is reduced.
As a method of teaching sports skills, Patent Document 1 discloses an invention that supports instruction by juxtaposing and comparing the movement of a user and the movement of an instructor.
Japanese Unexamined Patent Publication No. 2007-313362
However, in the technology of Patent Document 1, although the difference between the user and the instructor becomes clear, the knowledge of how to close that difference and what kind of training is required still depends on the instructor. In addition, since moving images must be shot and then compared, guidance cannot be given immediately after or during the physical exercise, which limits the effect.
The present invention has been made in view of this background, and its purpose is to provide technology that supports exercise by easily detecting and evaluating the movement of body parts and tool portions from an image of the user's physical exercise and by giving advice aimed at further improving skills and at increasing the effect of training, rehabilitation, and the like.
The present invention is a guidance support system that supports guidance given by a supporter to a user who performs physical exercise, the system comprising: a reference value storage unit that stores a reference value relating to the position of at least one body part or tool portion; a part and portion specifying unit that identifies the part or the portion by analyzing an image of the user's physical exercise; an evaluation unit that determines an evaluation value of the physical exercise by comparing the position of the part or the portion in the image with the reference value; and a supporter terminal having an imaging function for capturing an image of the user's physical exercise and a display function for displaying the evaluation value.
Other problems disclosed in the present application and their solutions will be clarified in the description of the embodiments of the invention and the drawings.
According to the present invention, the skill and effect of physical exercise using a tool can be easily evaluated, and support that contributes to improving skill and effect can be provided.
FIG. 1 shows an example of the overall configuration of the physical exercise support system according to the present embodiment.
FIG. 2 shows an example of the hardware configuration of the user terminal 10.
FIG. 3 shows an example of the software configuration of the user terminal 10.
FIG. 4 shows a configuration example of the physical information stored in the physical information storage unit 130.
FIG. 5 shows a configuration example of the evaluation request transmitted by the evaluation request transmission unit 112 to the server device 20.
FIG. 6 shows a configuration example of the evaluation information received by the evaluation information receiving unit 113 from the server device 20.
FIG. 7 shows a configuration example of the improvement measure request transmitted by the improvement measure request transmission unit 116 to the server device 20.
FIG. 8 shows a configuration example of the reference value request transmitted by the reference value request transmission unit 119 to the server device 20.
FIG. 9 shows a configuration example of the improvement measure information.
FIG. 10 shows an example of the hardware configuration of the server device 20.
FIG. 11 shows an example of the software configuration of the server device 20.
FIG. 12 shows a configuration example of the image information stored in the image data storage unit 231.
FIG. 13 shows a configuration example of the reference information stored in the reference information storage unit 232.
FIG. 14 shows a configuration example of the evaluation condition information stored in the evaluation condition information storage unit 233.
FIG. 15 shows a configuration example of the improvement measure information stored in the improvement measure information storage unit 234.
FIG. 16 shows a configuration example of the group analysis information stored in the group analysis information storage unit 235.
FIG. 17 shows the flow of processing executed in the guidance support server of the present embodiment.
FIG. 18 shows an example of a moving-image evaluation screen in the weightlifting mode.
FIGS. 19 to 22 show other examples of moving-image evaluation screens in the weightlifting mode.
FIG. 23 shows an example of a moving-image evaluation screen in the squat mode presented on the user terminal 10.
FIG. 24 shows an example of a moving-image evaluation screen for group training presented on the supporter terminal 40.
FIGS. 25 and 26 show other examples of moving-image evaluation screens for group training presented on the supporter terminal 40.
FIG. 27 shows an example of an image viewed by a supporter through the supporter terminal 40.
The contents of the embodiments of the present invention will be listed and described. The guidance support server according to the embodiment of the present invention has the following configuration.
[Item 1]
A guidance support system that supports guidance given by a supporter to a user who performs physical exercise, the system comprising:
a reference value storage unit that stores a reference value relating to the position of at least one body part or tool portion;
a part and portion specifying unit that identifies the part or the portion by analyzing an image of the user's physical exercise;
an evaluation unit that determines an evaluation value of the physical exercise by comparing the position of the part or the portion in the image with the reference value; and
a supporter terminal having an imaging function for capturing an image of the user's physical exercise and a display function for displaying the evaluation value.
[Item 2]
The guidance support system according to claim 1, wherein the supporter terminal is a wearable computer.
[Item 3]
The guidance support system according to claim 1 or 2, wherein a plurality of the reference values exist, and the system further comprises a reference value determination unit that presents one or more reference values to the user based on at least one of the user's purpose, physical information, and the evaluation value.
[Item 4]
The guidance support system according to any one of claims 1 to 3, further comprising a reference value selection information transmission unit with which the user selects from among reference value candidates.
[Item 5]
A guidance support method for supporting guidance given by a supporter to a user who performs physical exercise, the method comprising:
a reference value storage step of storing a reference value relating to the position of at least one body part or tool portion;
a part and portion specifying step of identifying the part or the portion by analyzing an image of the user's physical exercise; and
an evaluation step of determining an evaluation value of the physical exercise by comparing the position of the part or the portion in the image with the reference value,
the method using an imaging function for capturing an image of the user's physical exercise and a supporter terminal having a display function for displaying the evaluation value.
The guidance support server according to an embodiment of the present invention evaluates the movement of the body and the movement of the tool in physical exercise performed by the user (including exercise performed using a tool), and supports the supporter in giving guidance based on that evaluation.
Physical exercise performed by the user includes, for example, exercise that does not use tools, such as gymnastics, fitness, walking, running, yoga, bodyweight training, and rehabilitation, and exercise that uses tools: sports in which only a ball is used as a tool, such as soccer, volleyball, and basketball; sports that use a ball together with other tools, such as baseball, tennis, table tennis, and golf; sports that use tools other than a ball, such as kendo and fencing; physical exercise performed with tools, such as juggling; physical exercise using barbells and other training equipment and machines; various kinds of rehabilitation using tools; movements of a care recipient using a tool such as a cane; and performance using a musical instrument. The physical exercise is not limited to these.
From an image capturing the user performing physical exercise (the image may be a still image or a moving image; in the present embodiment it is assumed to be a moving image), the guidance support server of the present embodiment identifies the body and its parts and the tool and its portions, and evaluates the movement of the body and the tool based on the absolute positions of those parts and portions and on the relative positional relationships among a plurality of different parts and portions. The movement of the tool may also be evaluated based on the relative positional relationships between a plurality of different body parts and tool portions.
FIG. 1 is a diagram showing an example of the overall configuration of the guidance support server according to the present embodiment. As shown in the figure, the guidance support server of the present embodiment includes a user terminal 10 and a server device 20, and may further include an image pickup terminal 50. The user terminal 10, the server device 20, and the image pickup terminal 50 are connected so as to be able to communicate with one another via the communication network 30. The communication network 30 is, for example, the Internet or a LAN (Local Area Network), and is constructed from a public telephone network, a dedicated telephone network, a mobile telephone network, Ethernet (registered trademark), a wireless communication path, or the like.
== User terminal 10 ==
The user terminal 10 is a computer operated by a user who performs physical exercise or by a supporter of that user. The user terminal 10 is, for example, a smartphone, a tablet computer, a personal computer, or a wearable computer. The user terminal 10 includes an image pickup device such as a camera, with which it can image the user's body during exercise. In the present embodiment, the moving image of the user's body during exercise is transmitted from the user terminal 10 to the server device 20. The user terminal 10 may also be a glasses-type, contact-lens-type, hat-type, or HMD (head-mounted display) computer that has an image pickup function capturing a range close to the user's field of view and an output function for presenting characters, images, video, and the like by a virtual image projection method, a retinal projection method, or another method, and the image pickup function and the output function may be handled by separate terminals. The user terminal 10 may also serve as the image pickup terminal 50 or the output terminal 60, the image pickup function of the user terminal 10 may serve as the image pickup terminal 50, and the output function of the supporter terminal 40 may serve as the output terminal 60. Further, although only one user terminal 10 is shown in FIG. 1, it goes without saying that there may be more than one.
== Server device 20 ==
The server device 20 is a computer that evaluates the movements of the body and tools. The server device 20 is, for example, a workstation, a personal computer, a virtual computer logically realized by cloud computing, or the like. The server device 20 receives the moving image taken by the user terminal 10, analyzes the received moving image, and evaluates the physical exercise. The server device 20 also makes a proposal for improving physical exercise. Details of the evaluation of physical exercise and the proposal of improvement measures will be described later.
== Supporter terminal 40 ==
The supporter terminal 40 is a computer operated by a person (supporter), such as a trainer, physiotherapist, or caregiver, whose role is to provide guidance, teaching, explanation, and support to a user who performs physical exercise. The supporter terminal 40 is, for example, a smartphone, a tablet computer, a personal computer, or a wearable computer. The supporter terminal 40 may also be a glasses-type, contact-lens-type, hat-type, or HMD (head-mounted display) computer that has an image pickup function capturing a range close to the supporter's field of view and an output function for presenting characters, images, video, and the like by a virtual image projection method, a retinal projection method, or another method, and the image pickup function and the output function may be handled by separate terminals. The supporter terminal 40 may also serve as the image pickup terminal 50 or the output terminal 60, the image pickup function of the supporter terminal 40 may serve as the image pickup terminal 50, and the output function of the supporter terminal 40 may serve as the output terminal 60. Further, although only one supporter terminal 40 is shown in FIG. 1, it goes without saying that there may be more than one.
== Imaging terminal 50 ==
The imaging terminal 50 is a computer that captures images of the user, a camera having a communication function, or the like, installed or mounted at a place where the user performs physical exercise. The imaging terminal 50 is, for example, a smartphone, a tablet computer, a personal computer, a wearable computer, or the like. The imaging terminal 50 includes an imaging device such as a camera, which can capture the user's body during exercise. In the present embodiment, it is assumed that a moving image of the body of the exercising user is transmitted from the imaging terminal 50 to the server device 20. The data capturing the user that is stored in the imaging terminal 50 may be input directly to the server device 20 by the user, the user's supporter, or a business operator that runs a business using the guidance support server, or may be input via the communication network 30. Further, although only one imaging terminal 50 is shown in FIG. 1, it goes without saying that there may be a plurality of imaging terminals 50. Furthermore, imaging terminals 50 may be installed at a plurality of places to capture the subject simultaneously or in sequence. The imaging terminal 50 may also serve as the user terminal 10 and the output terminal 60, or may take on the imaging functions of the user terminal 10 and the supporter terminal 40.
== Output terminal 60 ==
The output terminal 60 outputs information under the control of the server device 20 via the communication network 30. The output terminal 60 may be, for example, a smartphone, a tablet computer, a personal computer, a wearable computer, or the like, or a computer may be connected to an output device such as a display so that the computer communicates with the server device 20 and information is output to the output terminal 60 in response to instructions from the server device 20. The output terminal 60 may also include an audio output device; the user may wear an audio output device such as headphones, earphones, or a neck speaker, and information may be output as audio. Information may also be output to the user or the supporter through a glasses-type, contact-lens-type, hat-type, or other type of output terminal or an HMD (head-mounted display) that presents characters, images, videos, and the like by a virtual image projection method, a retinal projection method, or another method, or through an interface such as a BMI (brain-machine interface) that uses brain activity such as brain waves to input characters, images, videos, and the like by direct stimulation of the brain without passing through the sensory organs. Further, although only one output terminal 60 is shown in FIG. 1, it goes without saying that there may be a plurality of output terminals 60, and video output, audio output, and other outputs may be performed by a single output terminal or by separate output terminals. Furthermore, the output terminal 60 may also serve as the user terminal 10 and the supporter terminal 40, or may take on the output functions of the user terminal 10 and the supporter terminal 40.
FIG. 2 is a diagram showing a hardware configuration example of the user terminal 10. The user terminal 10 includes a CPU 101, a memory 102, a storage device 103, a communication interface 104, a touch panel display 105, and a camera 106. The storage device 103 stores various data and programs, and is, for example, a hard disk drive, a solid state drive, or a flash memory. The communication interface 104 is an interface for connecting to the communication network 30, and is, for example, an adapter for connecting to Ethernet (registered trademark), a modem for connecting to a public telephone network, a wireless communication device for performing wireless communication, or a USB (Universal Serial Bus) connector or RS232C connector for serial communication. The touch panel display 105 is a device for inputting and outputting data. The user terminal 10 may further include input devices such as a keyboard, a mouse, buttons, and a microphone, and output devices such as a speaker and a printer.
FIG. 3 is a diagram showing a software configuration example of the user terminal 10. The user terminal 10 includes, as functional units, an imaging unit 111, an evaluation request transmission unit 112, an evaluation information receiving unit 113, an evaluation display unit 114, a checkpoint display unit 115, an improvement measure request transmission unit 116, an improvement measure information receiving unit 117, an improvement measure information display unit 118, a reference value request transmission unit 119, and a reference value selection information transmission unit 120, and, as storage units, a physical information storage unit 130, an image storage unit 131, an evaluation information storage unit 132, and an improvement measure storage unit 133.
Each of the above functional units is realized by the CPU 101 of the user terminal 10 reading a program stored in the storage device 103 into the memory 102 and executing it, and each of the above storage units is realized as part of the storage area provided by the memory 102 and the storage device 103 of the user terminal 10.
The imaging unit 111 captures images, including moving images, of the physical exercise performed by the user. By controlling the camera 106, the imaging unit 111 can acquire a moving image capturing the physical exercise. For example, the user or the user's supporter places the user terminal 10 on a flat surface, a wall, or the like, points the optical axis of the camera 106 toward the place where the user exercises, and instructs the start of video recording; in response, the imaging unit 111 operates the camera 106 to acquire the moving image. The imaging unit 111 stores the acquired moving image in the image storage unit 131.
The image storage unit 131 stores the images captured by the imaging unit 111. In the present embodiment, the images are moving images, but the images are not limited thereto. The image storage unit 131 can store a moving image as a file, for example.
The physical information storage unit 130 stores information on the user's body, physical abilities, factors affecting training effects, and the like (hereinafter referred to as physical information). FIG. 4 is a diagram showing a configuration example of the physical information stored in the physical information storage unit 130. As shown in the figure, the physical information includes height, weight, gender, dominant hand, arm length, leg length, hand size, finger length, grip strength, muscle strength, flexibility, shoulder strength, genome, epigenome, gene polymorphisms, intestinal flora, diet, and so on. The physical information storage unit 130 may also store the types of physical exercises performed and the number of repetitions obtained by the image analysis performed by the image analysis unit 212.
The evaluation request transmission unit 112 transmits to the server device 20 a request to evaluate physical exercise using a tool (hereinafter referred to as an evaluation request) based on the images captured by the imaging unit 111.
FIG. 5 is a diagram showing a configuration example of the evaluation request transmitted by the evaluation request transmission unit 112 to the server device 20. As shown in the figure, the evaluation request includes a user ID, a mode, physical information, and image data. The user ID is information identifying the user. The mode is information indicating the physical exercise performed by the user. The mode can be, for example, "walking", "yoga: cat pose", "bench press", "tennis serve", "walking rehabilitation", and the like. The mode is selected from predetermined options. The physical information is the physical information stored in the physical information storage unit 130. The image data is the moving image data acquired by the imaging unit 111.
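As a concrete illustration, an evaluation request of the kind shown in FIG. 5 could be assembled on the user terminal 10 roughly as in the following minimal sketch. The field names, the JSON encoding, and the HTTP endpoint URL are assumptions introduced only for illustration; the embodiment does not prescribe a particular transport format.

# Minimal sketch of building and sending an evaluation request (cf. FIG. 5).
# Field names, JSON/HTTP transport and the endpoint URL are illustrative assumptions.
import base64
import json
import urllib.request

def build_evaluation_request(user_id: str, mode: str, physical_info: dict, video_path: str) -> dict:
    with open(video_path, "rb") as f:
        video_bytes = f.read()
    return {
        "user_id": user_id,                  # identifies the user
        "mode": mode,                        # e.g. "bench press", selected from predefined options
        "physical_info": physical_info,      # height, weight, dominant hand, ...
        "image_data": base64.b64encode(video_bytes).decode("ascii"),
    }

def send_evaluation_request(request: dict, url: str = "https://server.example/evaluate") -> bytes:
    body = json.dumps(request).encode("utf-8")
    req = urllib.request.Request(url, data=body, headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:  # server device 20 returns evaluation information
        return resp.read()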
The evaluation information receiving unit 113 receives information on the evaluation of the physical exercise (hereinafter referred to as evaluation information) returned from the server device 20 in response to the evaluation request. The evaluation information receiving unit 113 registers the received evaluation information in the evaluation information storage unit 132.
FIG. 6 is a diagram showing a configuration example of the evaluation information that the evaluation information receiving unit 113 receives from the server device 20. As shown in the figure, the evaluation information includes a mode, a user ID, tool position information, body part position information, posture information, movement information, and checkpoint information.
The user ID and mode are the user ID and mode included in the evaluation request; they indicate that the captured images show the user's body performing the exercise indicated by the mode.
The tool position information indicates the positions, in the images, of parts of the tool (for example, the entire bat used in baseball, both ends of the bat, the grip, the spot where the ball was met, the center of gravity, or any other part; or the entire barbell, the shaft, the plates (the weight parts), the gripped portions, the center of the shaft, the center of gravity, or any other part; the whole tool is also included). The tool position information includes each part of the tool and the position of that part in association with a time point on the time axis of the moving image. Based on the tool position information, the movement of the tool and its relationship with body parts can be displayed. That is, for example, a figure indicating a part (for example, a circle) can be displayed superimposed on the image at the position indicated by the tool position information. Note that positions of a plurality of parts may be included for one time point. Further, for a portion connecting two parts (for example, the midpoint between the portions of a barbell gripped by the right hand and the left hand), tool position information need not be included; in this case, the portion connecting two predetermined parts can be represented by drawing a line between the pair of marks (such as circles) indicating those parts. The tool part position information may be included for every frame constituting the moving image, for every key frame (including frames related to the checkpoints described later), for every arbitrary number of frames, or for random time points. When a frame does not include position information, the figure can be displayed based on the position information at the nearest past time point.
The body position information indicates the position, in the images, of each body part (for example, the head, shoulders, elbows, hips, knees, ankles, and the like). The body position information includes a part and the position of that part in association with a time point on the time axis of the moving image. Based on the body position information, the state of the body's skeleton (bones) can be displayed. That is, for example, a figure indicating a part (for example, a circle) can be displayed superimposed on the image at the position indicated by the body position information. Note that positions of a plurality of parts may be included for one time point. For a part connecting two parts (for example, the forearm connecting the wrist and the elbow, or the thigh connecting the hip and the knee), position information need not be included; in this case, the part connecting two predetermined parts can be represented by drawing a line between the pair of marks (such as circles) indicating those parts. The position information may be included for every frame constituting the moving image, for every key frame (including frames related to the checkpoints described later), for every arbitrary number of frames, or for random time points. When a frame does not include position information, the bones can be displayed based on the position information at the nearest past time point.
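The superimposed display described in the last two paragraphs (circles at the reported positions, with lines between predetermined pairs representing bones or connecting portions) could be rendered, for example, as in the sketch below. The use of OpenCV, the part names, and the bone pairs are assumptions chosen only for illustration.

# Sketch: drawing body keypoints and connecting bones on one video frame.
# The use of OpenCV, the part names and the bone pairs are illustrative assumptions.
import cv2

BONES = [("shoulder", "elbow"), ("elbow", "wrist"), ("hip", "knee"), ("knee", "ankle")]

def draw_pose(frame, keypoints: dict[str, tuple[int, int]]):
    """keypoints maps a body part name to its (x, y) pixel position in this frame."""
    for name, (x, y) in keypoints.items():
        cv2.circle(frame, (x, y), 6, (0, 255, 0), thickness=-1)   # a circle for each reported part
    for a, b in BONES:
        if a in keypoints and b in keypoints:                     # a bone is a line between a pair of parts
            cv2.line(frame, keypoints[a], keypoints[b], (255, 0, 0), thickness=2)
    return frame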
The tool orientation information is information on the orientation of the tool used by the user, the direction in which a part of the tool is facing, and so on. The tool orientation information includes, in association with a time point on the time axis of the moving image, the part of the tool to be evaluated, a tool orientation value, an evaluation rank, an evaluation comment, and the like. The tool orientation value is a value representing the orientation of the tool. The tool orientation value is, for example, the distance from the ground to a certain part of the tool, the distance between two parts of the tool, the angle of a part (the angle formed by a straight line from a first part of the tool to, for example, the portion where the user grips the tool, and a straight line from a second part of the tool to, for example, the portion where the user grips the tool), the movement of a certain part of the tool, and the like. The evaluation rank is a value expressing the evaluation as a rank. The evaluation rank is expressed, for example, on a five-point scale of 1 to 5, or as A, B, C, and so on. The evaluation comment is a comment relating to the evaluation of the tool orientation. For example, when the mode is "upright row" and the distances from the ground to the right end and the left end of the barbell differ, an evaluation comment such as "different forces are being applied on the left and right" may be included.
The tool movement information is information on the movement of the tool used by the user. The tool movement information includes, in association with a period on the time axis of the moving image, the part of the tool to be evaluated, a list of the tool orientation values, an evaluation rank, and an evaluation comment. The list of tool orientation values is a time series of tool orientation values within the period. The evaluation comment is a comment relating to the evaluation of the movement of the tool. For example, when the mode is "upright row" and the up-and-down movement of the barbell is insufficient, an evaluation comment such as "the barbell is not lifted high enough" may be included.
The posture information is information on the posture of the user's body. The posture information includes, in association with a time point on the time axis of the moving image, the body part to be evaluated, a posture value, an evaluation rank, and an evaluation comment. The posture value is a value representing the posture. The posture value is, for example, the distance from the ground to a part, the distance between two parts, or the angle of a joint part (the angle formed by a straight line from a first end part to the joint part and a straight line from a second end part to the joint part). The evaluation rank is a value expressing the evaluation as a rank. The evaluation rank is expressed, for example, on a five-point scale of 1 to 5, or as A, B, C, and so on. The evaluation comment is a comment relating to the evaluation of the posture. For example, when the mode is "lifting" and the flexion is insufficient, an evaluation comment such as "the hips are not lowered" may be included.
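The joint angle described above (the angle formed by the straight lines from two end parts to the joint part) can be computed from the detected positions, for instance, as in the following sketch; the use of 2-D pixel coordinates is an assumption.

# Sketch: a posture value as a joint angle, e.g. the elbow angle from shoulder-elbow-wrist positions.
# Assumes 2-D pixel coordinates of the detected parts.
import math

def joint_angle(end1: tuple[float, float], joint: tuple[float, float], end2: tuple[float, float]) -> float:
    """Angle in degrees between the line joint->end1 and the line joint->end2."""
    v1 = (end1[0] - joint[0], end1[1] - joint[1])
    v2 = (end2[0] - joint[0], end2[1] - joint[1])
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    norm = math.hypot(*v1) * math.hypot(*v2)
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))

# Example: an elbow angle close to 180 degrees means the arm is almost straight.
print(joint_angle((0, 0), (1, 0), (2, 0.1)))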
The body movement information is information on the movement of the user's body. The body movement information includes, in association with a period on the time axis of the moving image, the body part to be evaluated, a list of posture values, an evaluation rank, and an evaluation comment. The list of posture values is a time series of posture values within the period. The evaluation comment is a comment relating to the evaluation of the movement. For example, when the mode is "lifting" and the knee extension is not smooth, an evaluation comment such as "the knee movement is not smooth" may be included.
The relationship information indicates information on one or more positional relationships between the tool and the body. The relationship information includes, in association with a time point on the time axis of the moving image, information on the relationship between information such as the positions, orientations, and movements of the tool parts and information such as the body parts, posture, and movements. For example, based on the tool position information and the body position information, the positional relationship between the two can be displayed. That is, for example, a figure indicating a tool part (for example, a circle) can be superimposed on the image at the position indicated by the tool position information, and a figure indicating a body part (for example, a circle) can be displayed at the position indicated by the body position information. Note that positions of a plurality of tool parts and body parts may be included for one time point. For a location connecting a tool part and a body part (for example, the tip of the bat and the center point of the part where the user grips the bat), position information need not be included; in this case, the location connecting a predetermined tool part and body part can be represented by drawing a line between the pair of figures (such as circles) indicating them. The relationship information may be included for every frame constituting the moving image, for every key frame, for every checkpoint (described in detail later), for every arbitrary number of frames, or for random time points. When a frame does not include position information, the figures indicating the tool parts and body parts, and the lines connecting them, can be displayed based on the position information at the nearest past time point.
The checkpoint information is information indicating points at which the orientation of the tool, the posture of the body, and the like should be checked (hereinafter referred to as checkpoints) within the movement of the tool used by the user or a series of movements of the user's body. For example, when the mode is "weightlifting", checkpoints include the moment the barbell reaches its highest position, the moment it reaches its lowest position, and the moment it is lifted. When the mode is "pitching", checkpoints include the moment the leg is raised, the moment the raised leg is lowered and the weight is shifted, and the moment the ball is released. The checkpoint information stores information indicating a checkpoint (hereinafter referred to as a checkpoint ID) in association with a time point on the time axis of the moving image. That is, the frame (still image) in the moving image in which the checkpoint indicated by the checkpoint ID appears can be identified.
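Because each checkpoint ID is associated with a time point on the moving image's time axis, the corresponding still frame can be pulled out of the stored video, for example, as below. The use of OpenCV and of millisecond-based time points is an assumption for illustration.

# Sketch: reading the still frame that corresponds to a checkpoint time point.
# OpenCV and time points expressed in milliseconds are illustrative assumptions.
import cv2

def frame_at_checkpoint(video_path: str, time_ms: float):
    cap = cv2.VideoCapture(video_path)
    cap.set(cv2.CAP_PROP_POS_MSEC, time_ms)  # seek to the checkpoint time on the time axis
    ok, frame = cap.read()
    cap.release()
    return frame if ok else None             # still image showing the checkpoint, or None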
The evaluation display unit 114 displays the evaluation information. For example, based on the tool position, tool orientation, and tool movement information and the body position, posture, and body movement information included in the evaluation information, the evaluation display unit 114 can superimpose on the moving image figures representing tool parts and body parts (for example, circles representing the ends of the tool or the center of gravity of the body, and lines connecting them), thereby displaying the movements of the tool parts and body parts overlaid on the moving image. The evaluation display unit 114 can also display, as graphs, time-series changes such as the positions of tool parts, the orientation of the tool, the movement of the tool, and the positions, posture, and movements of body parts.
Further, based on the tool orientation information and the tool movement information included in the evaluation information, the evaluation display unit 114 can display the evaluation rank and the evaluation comment together with the display of the moving image. For example, when the playback time of the moving image comes near the time point included in the tool orientation information (for example, within an arbitrary length such as about 5 seconds before or after), the evaluation display unit 114 can display the evaluation rank and the evaluation comment included in the tool orientation information. In addition, when the playback time of the moving image falls within the period included in the tool movement information, the evaluation display unit 114 can display the evaluation rank and the evaluation comment included in the tool movement information. The evaluation display unit 114 can also display the posture value included in the posture information. Furthermore, based on the list of tool orientation values included in the tool movement information, the evaluation display unit 114 can display the time-series change of the tool orientation values as a graph.
Further, based on the posture information and the movement information included in the evaluation information, the evaluation display unit 114 can display the evaluation rank and the evaluation comment together with the display of the moving image. For example, when the playback time of the moving image comes near the time point included in the posture information (for example, within an arbitrary length such as about 5 seconds before or after), the evaluation display unit 114 can display the evaluation rank and the evaluation comment included in the posture information. In addition, when the playback time of the moving image falls within the period included in the movement information, the evaluation display unit 114 can display the evaluation rank and the evaluation comment included in the movement information. The evaluation display unit 114 can also display the posture value included in the posture information. Furthermore, based on the list of posture values included in the movement information, the evaluation display unit 114 can display the time-series change of the posture values as a graph. These evaluation ranks and evaluation comments may be displayed together with the evaluation ranks and evaluation comments that are displayed, in accordance with the display of the moving image, based on the tool orientation information and the tool movement information described in the preceding paragraph.
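The behavior of showing an evaluation rank and comment when playback comes within, say, about 5 seconds of the associated time point (or inside the associated period) could be organized roughly as below; the list-of-dicts layout of the evaluation entries is an assumption for illustration.

# Sketch: picking which evaluation ranks/comments to show for the current playback time.
# The list-of-dicts layout of the evaluation information is an illustrative assumption.
def comments_to_show(playback_s: float, posture_items: list[dict], motion_items: list[dict],
                     window_s: float = 5.0) -> list[tuple[str, str]]:
    shown = []
    for item in posture_items:                          # entries tied to a single time point
        if abs(item["time_s"] - playback_s) <= window_s:
            shown.append((item["rank"], item["comment"]))
    for item in motion_items:                           # entries tied to a period [start, end]
        if item["start_s"] <= playback_s <= item["end_s"]:
            shown.append((item["rank"], item["comment"]))
    return shown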
The checkpoint display unit 115 can extract checkpoint images from the moving image and display them. The checkpoint display unit 115 can read, from the image data of the moving image stored in the image storage unit 131, the frame corresponding to the time point included in the checkpoint information and display it as a still image. The checkpoint display unit 115 may also, for example, extract and display only the body portion from the read frame.
The improvement measure request transmission unit 116 transmits to the server device 20 a request for acquiring improvement measures regarding how to handle the tool and the physical exercise (hereinafter referred to as an improvement measure request). FIG. 7 is a diagram showing a configuration example of the improvement measure request. As shown in the figure, the improvement measure request includes a user ID, a mode, a purpose, and the like. The purpose is the objective for which the user makes improvements. The purpose can be, for example, "increase the speed of the ball", "increase muscle strength", "stabilize the lower body", and the like. The purpose is also selected from predetermined options.
The improvement measure information receiving unit 117 receives information on improvement measures (hereinafter referred to as improvement measure information) transmitted from the server device 20 in response to the improvement measure request. The improvement measure information receiving unit 117 stores the received improvement measure information in the improvement measure storage unit 133. FIG. 8 shows a configuration example of the improvement measure information. As shown in the figure, the improvement measure information includes a purpose, advice, and reference information. In the present embodiment, the advice is assumed to be a character string expressing an improvement measure, but it may be content that presents the improvement measure by an image, a moving image, or the like. The reference information describes suitable tool orientations and movements (the position, orientation, movement, speed, angle, and so on of each part), suitable body postures (the position, orientation, movement, speed, angle, and so on of each part), and the relationship between the tool and the body (the positions, orientations, movements, speeds, angles, and so on of the parts, and their relationships). Note that the improvement measure information receiving unit 117 may receive, even without an improvement measure request, improvement measure information transmitted by the improvement measure information transmission unit 216 based on the evaluation result and the reference values.
The improvement measure information display unit 118 displays the improvement measures. The improvement measure information display unit 118 displays the advice included in the improvement measure information. When the improvement measure information includes suitable positions, angles, and the like of body parts, these may be displayed superimposed on the image information.
The reference value request transmission unit 119 transmits to the server device 20 a request for acquiring a reference value targeted by the user or candidates thereof (hereinafter referred to as a reference value request). FIG. 8 is a diagram showing a configuration example of this request. As shown in the figure, the reference value request includes a user ID, a mode, a purpose, the image information, the physical information, and the evaluation information. The purpose is the objective for which the user makes improvements. The purpose can be, for example, "increase the speed of the ball", "increase muscle strength", "stabilize the lower body", and the like. The purpose is also selected from predetermined options.
The reference value selection information transmission unit 120 transmits to the server device 20 information on the selection of one of the one or more reference values presented by the reference information determination unit 220. The reference value selection information transmission unit 120 may also transmit to the server device 20 a selection requesting that other options be presented.
FIG. 10 is a diagram showing a hardware configuration example of the server device 20. The server device 20 includes a CPU 201, a memory 202, a storage device 203, a communication interface 204, an input device 205, and an output device 206. The storage device 203 stores various data and programs, and is, for example, a hard disk drive, a solid state drive, or a flash memory. The communication interface 204 is an interface for connecting to the communication network 30, and is, for example, an adapter for connecting to Ethernet (registered trademark), a modem for connecting to a public telephone network, a wireless communication device for performing wireless communication, or a USB (Universal Serial Bus) connector or RS232C connector for serial communication. The input device 205 is a device for inputting data, for example, a keyboard, a mouse, a touch panel, buttons, or a microphone. The output device 206 is a device for outputting data, for example, a display, a printer, or a speaker.
FIG. 11 is a diagram showing a software configuration example of the server device 20. As shown in the figure, the server device 20 includes, as functional units, an evaluation request receiving unit 211, an image analysis unit 212, an evaluation unit 213, an evaluation information transmission unit 214, an improvement measure request receiving unit 215, an improvement measure information transmission unit 216, a group analysis unit 217, a group analysis information presentation unit 218, a reference information request receiving unit 219, and a reference information determination unit 220, and, as storage units, an image data storage unit 231, a reference information storage unit 232, an evaluation condition information storage unit 233, an improvement measure information storage unit 234, and a group analysis information storage unit 235.
Each of the above functional units is realized by the CPU 201 of the server device 20 reading a program stored in the storage device 203 into the memory 202 and executing it, and each of the above storage units is realized as part of the storage area provided by the memory 202 and the storage device 203 of the server device 20.
The evaluation request receiving unit 211 receives the evaluation request transmitted from the user terminal 10. The evaluation request receiving unit 211 registers information including the image data contained in the received evaluation request (hereinafter referred to as image information) in the image data storage unit 231. FIG. 12 is a diagram showing a configuration example of the image information stored in the image data storage unit 231. As shown in the figure, the image information includes the image data in association with the user ID indicating the photographed user. The image data is the data that was included in the evaluation request.
The reference information storage unit 232 stores information including reference values (hereinafter referred to as reference information) relating to physical exercise using a tool, such as tool positions, tool movements (the orientation and movement of the tool, etc.), postures (the positions and angles of body parts, etc.), and relationships derived from the relationship between the tool and the body. FIG. 13 is a diagram showing a configuration example of the reference information stored in the reference information storage unit 232. As shown in the figure, the reference information includes, but is not limited to: information on the absolute position of a tool part and how the tool part moved (movement speed, movement distance, direction of movement, and so on) when physical exercise using the tool was performed; reference information on the absolute position of a body part, or on its position relative to other parts or other reference objects (hereinafter referred to as position reference information); reference information, for three parts including a joint part, on the angle formed by the straight lines connecting each of the two parts to the joint part (hereinafter referred to as angle reference information); and information on the relationship between tool parts and body parts. Reference values are prepared for each physical exercise (each mode), and for each physical exercise (mode) there may be a plurality of reference values, for example for each purpose, for each characteristic of the physical information, for each characteristic of the evaluation information, or reference values derived from the physical exercise of a specific individual (expected to be, but not limited to, an expert such as an athlete with proven results, a professional athlete, a ranked player, or an experienced person).
The tool position reference information includes, in association with a mode and a checkpoint ID, a tool part and the reference position of that part. There may be a plurality of parts. Regarding the position of a part, the vertical position may be, for example, the height from the ground or the distance from one of the toes. For example, when the mode is "weightlifting", it may be the distance from a body part or from a line connecting body parts, such as the distance between the line connecting both shoulders and the shaft. The horizontal position of a part may be the distance from a predetermined reference object (for example, the mound plate or a mark on the floor) or the distance from a reference body part such as the shoulder, chest, or foot. The position reference information is assumed to be registered in advance.
The tool movement reference information includes, in association with a mode and a checkpoint ID, reference values for information such as the movement speed and movement distance of a tool part, the direction of movement at a certain time point, and the trajectory of movement over a certain period.
The position reference information includes, in association with a mode and a checkpoint ID, a body part and the reference position of that part. There may be a plurality of parts. Regarding the position, the vertical position may be, for example, the height from the ground or the distance from one of the toes. The horizontal position may be the distance from a predetermined reference object (for example, the mound plate or a mark on the floor) or the distance from a reference body part such as the shoulder, chest, or foot. The position reference information is assumed to be registered in advance.
The angle reference information includes, in association with a mode and a checkpoint ID, two parts (part 1 and part 2), one joint part, and a reference value for the angle between the straight line connecting part 1 and the joint part and the straight line connecting part 2 and the joint part.
The relationship reference information includes, in association with a mode and a checkpoint ID, information on references expressed by the relationship between tool parts and body parts. The relationship reference information includes, in association with a mode and a checkpoint ID, information obtained from the movement speed, movement distance, angle, and so on of one or more tool parts and body parts. For example, when the mode is batting, the relationship reference information includes, as reference information, the movement speed of the tip of the bat at the moment the ball is met and the angle formed by the bat and the dominant arm holding the bat.
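As one way to obtain such relationship values from tracked positions, the sketch below derives a bat-tip speed around the moment of impact and the angle between the bat and the dominant forearm. The coordinate units (pixels per frame converted to per second), the frame-rate handling, and the part names are assumptions for illustration.

# Sketch: relationship values for batting - bat tip speed at impact and the bat/forearm angle.
# Coordinates, units and part names are illustrative assumptions.
import math

def tip_speed(tip_prev: tuple[float, float], tip_now: tuple[float, float], fps: float) -> float:
    """Bat-tip speed around the impact frame, in pixels per second."""
    return math.hypot(tip_now[0] - tip_prev[0], tip_now[1] - tip_prev[1]) * fps

def bat_arm_angle(grip: tuple[float, float], tip: tuple[float, float],
                  elbow: tuple[float, float], wrist: tuple[float, float]) -> float:
    """Angle in degrees between the bat (grip->tip) and the dominant forearm (elbow->wrist)."""
    bat = (tip[0] - grip[0], tip[1] - grip[1])
    arm = (wrist[0] - elbow[0], wrist[1] - elbow[1])
    cos = (bat[0] * arm[0] + bat[1] * arm[1]) / (math.hypot(*bat) * math.hypot(*arm))
    return math.degrees(math.acos(max(-1.0, min(1.0, cos))))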
The evaluation condition information storage unit 233 stores information for performing the evaluation (hereinafter referred to as evaluation condition information). FIG. 14 is a diagram showing a configuration example of the evaluation condition information stored in the evaluation condition information storage unit 233. The evaluation condition information includes a category, a condition, an evaluation rank, and a comment. The category is the category of the evaluation. The category can be, for example, "muscle strength", "ball speed", "control", and the like. The condition is a condition on the position, orientation, or movement (change of position over time) of each part of the tool in the images, or on the position or movement (change of position over time) of each body part. For example, when analyzing a weightlifting movement, conditions on the elbow angle, the speed of extending the arms, and so on at the checkpoint of the moment the barbell is lifted, and conditions on the movement of the shaft and its up-and-down speed during the period of lifting and lowering the barbell, can be set in the evaluation condition information. When analyzing a pitching form, conditions on the elbow angle, the rotation speed of the arm, and so on at the checkpoint of releasing the ball can be set in the evaluation condition information. The evaluation rank is the evaluation value assigned when the condition is satisfied. The comment is a description of the posture or movement of the body when the condition is satisfied.
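A minimal sketch of how such conditions could be checked against measured values is given below. Representing each condition as a metric name, a comparison operator, and a threshold, and the example entries themselves, are assumptions made only for illustration.

# Sketch: matching measured values against evaluation condition information (cf. FIG. 14).
# Representing each condition as (metric, operator, threshold) is an illustrative assumption.
import operator

OPS = {"<": operator.lt, "<=": operator.le, ">": operator.gt, ">=": operator.ge}

EVALUATION_CONDITIONS = [
    # (category, metric, op, threshold, rank, comment) - example entries, not from the embodiment
    ("muscle strength", "elbow_angle_at_lift_deg", "<", 80, "2", "The elbows are not extended enough."),
    ("muscle strength", "barbell_lift_height_px", ">=", 300, "5", "The barbell is lifted high enough."),
]

def evaluate(measured: dict[str, float]) -> list[tuple[str, str, str]]:
    """Return (category, rank, comment) for every condition satisfied by the measured values."""
    results = []
    for category, metric, op, threshold, rank, comment in EVALUATION_CONDITIONS:
        value = measured.get(metric)
        if value is not None and OPS[op](value, threshold):
            results.append((category, rank, comment))
    return results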
The image analysis unit 212 (tool part and body part identification unit) analyzes the image data. The image analysis unit 212 analyzes the image data, extracts feature values of each part of the tool and each part of the body, and identifies the position of each of those parts in the images. The image analysis unit 212 also analyzes the image data, extracts feature values of each part of the tool, and identifies the direction in which each part is facing. A general method is adopted for the image analysis performed by the image analysis unit 212, and a detailed description thereof is omitted here. The image analysis unit 212 may analyze the image data for every frame or every key frame, for every checkpoint, or at random timings.
For each checkpoint ID, the image analysis unit 212 also compares the position of each part extracted from the image data with the position reference information and the like stored in the reference information storage unit 232, and identifies the closest time point as the time point of that checkpoint.
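One simple way to pick such a checkpoint time point, i.e. the frame whose extracted part positions lie closest to the position reference information for that checkpoint ID, is sketched below. The per-frame position dictionaries keyed by part name are an assumption for illustration.

# Sketch: choosing the frame closest to the position reference information of a checkpoint.
# Per-frame position dictionaries keyed by part name are an illustrative assumption.
import math

def find_checkpoint_time(frames: list[tuple[float, dict[str, tuple[float, float]]]],
                         reference: dict[str, tuple[float, float]]) -> float:
    """frames: list of (time_s, {part: (x, y)}). Returns the time whose positions best match the reference."""
    def distance(positions: dict[str, tuple[float, float]]) -> float:
        common = set(positions) & set(reference)
        return sum(math.dist(positions[p], reference[p]) for p in common) / max(len(common), 1)
    best_time, _ = min(((t, distance(pos)) for t, pos in frames), key=lambda x: x[1])
    return best_time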
The evaluation unit 213 evaluates the movement of the tool used by the user based on the image data. In the present embodiment, the evaluation unit 213 searches the evaluation condition information storage unit 233 for evaluation condition information containing conditions satisfied by the positions and movements of the tool parts identified from the image data, and if there is evaluation condition information whose conditions are satisfied, acquires the evaluation rank and comment included in it. The evaluation unit 213 may also evaluate the movement of the tool and count the number of repetitions of the physical exercise.
The evaluation unit 213 evaluates the movement of the user's body based on the image data. In the present embodiment, the evaluation unit 213 searches the evaluation condition information storage unit 233 for evaluation condition information containing conditions satisfied by the positions and movements of the body parts identified from the image data, and if there is evaluation condition information whose conditions are satisfied, acquires the evaluation rank and comment included in it. The evaluation unit 213 may also evaluate the movement of the body and count the number of repetitions of the physical exercise.
The evaluation unit 213 evaluates the tool used by the user and the movement of the body based on the image data. In the present embodiment, the evaluation unit 213 searches the evaluation condition information storage unit 233 for evaluation condition information containing conditions satisfied by the positions of the tool parts and body parts identified from the image data and by the movements of, or relationships between, those parts, and if there is evaluation condition information whose conditions are satisfied, acquires the evaluation rank and comment included in it. The evaluation unit 213 may also evaluate the movements of the tool and the body and count the number of repetitions of the physical exercise.
The evaluation information transmission unit 214 transmits the evaluation information to the user terminal 10. The evaluation information transmission unit 214 generates tool position information including the time points on the time axis of the moving image identified by the image analysis unit 212 and the positions of the parts of the tool. For the evaluation ranks and comments acquired by the evaluation unit 213, when the position of a tool part satisfies a condition, the evaluation information transmission unit 214 generates tool orientation information including the time point, the part, and the tool orientation value together with the evaluation rank and comment; when the movement of a part (change of position over time) satisfies a condition, it generates tool movement information including the time point, the part, and the list of tool orientation values together with the evaluation rank and comment. The evaluation information transmission unit 214 also generates checkpoint information including, for each checkpoint analyzed by the image analysis unit 212, the corresponding time point and the checkpoint ID indicating that checkpoint. The evaluation information transmission unit 214 creates evaluation information including the generated tool position information, tool orientation information, tool movement information, and checkpoint information, and transmits it to the user terminal 10. The evaluation unit 213 and the evaluation information transmission unit 214 may correspond to the comment output unit of the present invention.
The evaluation information transmission unit 214 transmits the evaluation information to the user terminal 10. The evaluation information transmission unit 214 generates position information including the time points on the time axis of the moving image identified by the image analysis unit 212 and the positions of the body parts. For the evaluation ranks and comments acquired by the evaluation unit 213, when the position of a body part satisfies a condition, the evaluation information transmission unit 214 generates posture information including the time point, the part, and the posture value together with the evaluation rank and comment; when the movement of a part (change of position over time) satisfies a condition, it generates movement information including the time point, the part, and the list of posture values together with the evaluation rank and comment. The evaluation information transmission unit 214 also generates checkpoint information including, for each checkpoint analyzed by the image analysis unit 212, the corresponding time point and the checkpoint ID indicating that checkpoint. The evaluation information transmission unit 214 creates evaluation information including the generated position information, posture information, movement information, and checkpoint information, and transmits it to the user terminal 10. The evaluation unit 213 and the evaluation information transmission unit 214 may correspond to the comment output unit of the present invention.
The improvement measure information storage unit 234 stores information on improvement measures (hereinafter referred to as improvement measure information). FIG. 15 is a diagram showing a configuration example of the improvement measure information stored in the improvement measure information storage unit 234. As shown in the figure, the improvement measure information includes advice in association with a purpose, a category, and a condition. The condition may be a condition on the tool itself (such as the weight of the barbell), on how the tool is used, or on physical conditions (such as flexibility), a condition on the position, orientation, or movement of a tool part, or a condition on the position or movement of a body part.
The improvement measure request receiving unit 215 receives the improvement measure request transmitted from the user terminal 10.
From among the improvement measure information corresponding to the mode and purpose included in the improvement measure request, the improvement measure information transmission unit 216 searches for improvement measure information whose conditions are satisfied by the user's physical information included in the evaluation request and by the positions, orientations, movements, and so on of the tool parts and body parts identified by the image analysis unit 212. The improvement measure information transmission unit 216 acquires the advice of the retrieved improvement measure information, creates improvement measure information in which the purpose and the advice are set, and returns the created improvement measure information to the user terminal 10. The improvement measure information transmission unit 216 also includes in the transmitted improvement measure information the positions, orientations, speeds, angles, and so on of the parts included in the reference information. Note that the improvement measure information transmission unit 216 may search for improvement measures based on the evaluation information and the reference values even without an improvement measure request, and may transmit those improvement measures to the user terminal 10.
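The retrieval described above, selecting advice whose purpose, mode, and conditions match the measured values, could look roughly like the following. The flat record layout, the predicate-style condition, and the example entry are assumptions for illustration only.

# Sketch: looking up improvement-measure advice (cf. FIG. 15) for a given mode, purpose and measured values.
# The flat record layout, the condition predicate and the example entry are illustrative assumptions.
IMPROVEMENT_MEASURES = [
    {
        "mode": "bench press",
        "purpose": "increase muscle strength",
        "condition": lambda m: m.get("elbow_angle_at_lift_deg", 180) < 80,
        "advice": "Extend the elbows more fully at the top of the lift.",
    },
]

def find_advice(mode: str, purpose: str, measured: dict[str, float]) -> list[str]:
    return [rec["advice"] for rec in IMPROVEMENT_MEASURES
            if rec["mode"] == mode and rec["purpose"] == purpose and rec["condition"](measured)]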
 FIG. 17 is a diagram showing an example of the flow of processing executed in the guidance support server of the present embodiment.
 In the user terminal 10, the imaging unit 111 accepts an input of the mode, captures images of the user's body during exercise, and acquires moving image data (S321). The evaluation request transmission unit 112 transmits an evaluation request including the user ID identifying the user, the accepted mode, the physical information, and the moving image data to the server device 20 (S322).
 When the evaluation request receiving unit 211 in the server device 20 receives the evaluation request, the image analysis unit 212 analyzes the moving image data to extract feature amounts (S323) and identifies the position of each tool part and each body part (S324). Here, the image analysis unit 212 may identify positions on the image, or may use the physical information to identify actual-scale positions (for example, the height from the ground or the distance from a reference point such as the body's center of gravity). The evaluation unit 213 acquires an evaluation rank and a comment from the evaluation condition information whose conditions are satisfied by the positions of the tool parts and body parts or by their movements (time-series changes in position) (S325). The evaluation information transmission unit 214 creates evaluation information and transmits it to the user terminal 10 (S326).
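 A highly simplified sketch of the server-side steps S325 and S326 is shown below, assuming that the per-frame part positions have already been identified in S323 and S324 by some pose estimator; the condition format and the bar-path rule are illustrative assumptions.

```python
from typing import Dict, List, Tuple

Positions = List[Dict[str, Tuple[float, float]]]  # per frame: part -> (x, y)

def evaluate(positions: Positions, conditions: List[dict]) -> List[dict]:
    """S325: pick up the rank and comment of every evaluation condition that
    is satisfied by the positions or their time-series change."""
    return [{"rank": c["rank"], "comment": c["comment"]}
            for c in conditions if c["test"](positions)]

def handle_evaluation_request(positions: Positions,
                              conditions: List[dict]) -> dict:
    """S325-S326 in miniature: evaluate the identified positions and
    assemble the evaluation information to send back."""
    evaluations = evaluate(positions, conditions)                  # S325
    return {"positions": positions, "evaluations": evaluations}   # S326

# Hypothetical condition: the barbell shaft should not drift sideways by
# more than 30 px between the first and last frame.
conditions = [{
    "rank": "B",
    "comment": "Keep the bar path more vertical.",
    "test": lambda p: abs(p[-1]["bar"][0] - p[0]["bar"][0]) > 30,
}]
frames = [{"bar": (100.0, 400.0)}, {"bar": (140.0, 120.0)}]
print(handle_evaluation_request(frames, conditions))
```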
 In the user terminal 10, the evaluation display unit 114 displays the position, posture, and movement of the body and the position, orientation, and movement of the tool on the moving image data based on the received evaluation information (S327). The evaluation display unit 114 may also display the positions of the body parts (the bones) indicating the body's posture, together with the evaluation rank and the comments (S327). Here, the evaluation display unit 114 may display graphs of the positions, orientations, and movements of the tool parts and of the time-series changes in the positions and movements of the body parts. The checkpoint display unit 115 may also extract checkpoint images from the moving image and display them. The improvement measure request transmission unit 116 transmits an improvement measure request to the server device 20 in response to an instruction from the user (S328).
 In the server device 20, when the improvement measure request receiving unit 215 receives the improvement measure request transmitted from the user terminal 10, the improvement measure information transmission unit 216 searches for improvement measure information whose conditions are satisfied, acquires the advice contained in the retrieved improvement measure information (S329), creates improvement measure information including the acquired advice, and transmits it to the user terminal 10 (S330).
 When the improvement measure information receiving unit 117 in the user terminal 10 receives the improvement measure information, the improvement measure information display unit 118 displays the advice contained in the received improvement measure information and can display a suitable way of using the tool superimposed on the moving image data (S331).
 When the improvement measure information receiving unit 117 in the user terminal 10 receives the improvement measure information, the improvement measure information display unit 118 displays the advice contained in the received improvement measure information and can display a suitable body posture, in the form of bones, superimposed on the moving image data (S331).
 As described above, the guidance support server of the present embodiment makes it possible to evaluate physical exercise easily. In particular, for physical exercise related to sports, the positional relationships and movements of each part of the tool and each part of the body can be evaluated, which readily leads to concrete improvement efforts and can be expected to improve performance. In addition, since the guidance support server of the present embodiment also provides comments and advice, the user can easily grasp the current situation and the improvement measures.
 FIG. 18 is a diagram showing an example of a screen displaying an evaluation of physical exercise using a tool. The figure illustrates the case where a moving image is captured in the weightlifting mode. As shown in the figure, the screen 41 displays a mark 411 indicating the position of the barbell shaft, and the movement of the barbell shaft is indicated by a line 412.
 FIG. 19 is another diagram showing an example of a screen displaying an evaluation of physical exercise using a tool. The figure illustrates the case where a moving image is captured in the weightlifting mode. As a result of the evaluation performed by the evaluation unit 213 in the weightlifting mode, for example, the inclination of the barbell shaft, the distance the shaft has traveled, and its movement speed (line 421) are displayed. In the figure, the reference value (line 422) is also displayed. The actual measurement result (line 423) is displayed, and the difference from the reference value may further be shown as a numerical value, as a graph, or the like. The user can refer to these to consider which movements or postures should be corrected.
 FIG. 20 is another diagram showing an example of a screen displaying an evaluation of physical exercise using a tool. The figure illustrates the case where a moving image is captured in the weightlifting mode. As a result of the evaluation performed by the evaluation unit 213 in the weightlifting mode, for example, the evaluation result (line 432) is displayed together with a reference value (line 431) such as the angle of a body joint at the lowest point of the barbell. The difference from the reference value may also be shown as a numerical value, as a graph, or the like. Furthermore, an evaluation result based on the relationship between the body and the tool may be displayed. The user can refer to these to consider which movements or postures should be corrected.
 FIG. 21 is another diagram showing an example of a screen displaying an evaluation of physical exercise using a tool. The figure illustrates the case where a moving image is captured in the weightlifting mode. As a result of the evaluation performed by the evaluation unit 213 in the weightlifting mode, line 441 shows bones drawn by connecting predetermined positions of the tool parts and body parts identified from the image. The bones may be displayed superimposed on the captured image. Line 442 shows the acceleration of each body part. A count result, such as how many times the weight has been lifted, may also be displayed. Furthermore, as shown at line 443, the next training matched to the evaluation result, the purpose, and so on may be shown.
 FIG. 22 is another diagram showing an example of a screen displaying an evaluation of physical exercise using a tool. The figure illustrates the case where a moving image is captured in the weightlifting mode. As a result of the evaluation performed by the evaluation unit 213 in the weightlifting mode, the evaluation obtained by comparison with various reference values (line 441) is displayed.
 The group analysis unit 217 is intended for scenes in which a plurality of users use the present system at the same time and a small number of supporters provide guidance or the like remotely (for example, training guidance or remote rehabilitation support using an online meeting tool or an online communication tool). In such scenes there may be a plurality of users for each supporter. Although image information capturing the plurality of users is displayed in real time on the terminal used by the supporter, it is difficult to grasp each user's detailed state, such as which users are training effectively and which users are fatigued, and as a result the effectiveness of the guidance is reduced. There is also the problem that it is difficult for the supporter and the users to communicate based on the training actually performed, which lowers continuity. The group analysis unit 217 solves these problems.
 The group analysis unit 217 analyzes the evaluation information produced by the evaluation unit 213 for each user's images and stores the analyzed information in the group analysis information storage unit 235. Based on the evaluation information evaluated by the evaluation unit 213 for the physical exercise performed by each user (including exercise using tools), the group analysis unit 217 creates a ranking of the users for whom the exercise performed (or being performed, which may be analyzed in real time) was effective (or ineffective), for example by the number of repetitions of the training performed or by the order of closeness to (or distance from) the reference values. Each user may also be categorized, for example into an upper-ranking group and a lower-ranking group. In addition, as the positions, movement speeds, movement distances, joint angles, and so on of the tool parts and body parts drift away from the reference values while the exercise continues, the group analysis unit 217 may determine that the user's fatigue is increasing, and may estimate the degree of fatigue from the degree of deviation from the reference values. Furthermore, the time taken to complete a predetermined number of repetitions may be measured for each set, and the degree of fatigue may be determined from an increase in the completion time, or estimated from the amount of the increase or by comparison with the completion time of the previous set.
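 By way of illustration only, fatigue estimation from the drift away from a reference value could be sketched as follows; the window size, the bar-speed figures, and the scoring rule are assumptions introduced for this example.

```python
from statistics import mean
from typing import List

def deviation_from_reference(observed: List[float], reference: float) -> List[float]:
    """Absolute deviation of an observed quantity (e.g. a joint angle or bar
    speed) from its reference value, one value per repetition."""
    return [abs(x - reference) for x in observed]

def estimate_fatigue(observed: List[float], reference: float,
                     window: int = 3) -> float:
    """Very rough fatigue score: how much the average deviation over the last
    `window` repetitions exceeds that over the first `window` repetitions.
    Positive values suggest the user is drifting away from the reference."""
    dev = deviation_from_reference(observed, reference)
    if len(dev) < 2 * window:
        return 0.0
    return mean(dev[-window:]) - mean(dev[:window])

# Hypothetical bar speeds (m/s) over ten repetitions, reference 0.8 m/s.
speeds = [0.78, 0.80, 0.79, 0.77, 0.74, 0.72, 0.69, 0.66, 0.62, 0.60]
print(round(estimate_fatigue(speeds, 0.8), 3))  # grows as the speed drops off
```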
 Furthermore, the group analysis unit 217 may evaluate, for example, the time until the physical exercise actually starts after the supporter's signal to start or end the exercise, and estimate the user's degree of concentration. The group analysis unit 217 may also calculate deviation scores from the evaluation information among users participating in the training or the like at the same time, and may create a ranking based on the deviation scores. It may likewise calculate deviation scores from the evaluation information among users who performed the same physical exercise at different times, and may create a ranking based on those deviation scores. Furthermore, from the rankings, the degree of fatigue, the degree of concentration, and so on, it may identify users who need guidance, or users for whom a direct comment is considered more effective in terms of continuity and the like. The group analysis unit 217 may also compare the evaluation information from when the user previously performed the same physical exercise with the evaluation information evaluated from images of the newly performed physical exercise.
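 The deviation scores and rankings mentioned above could, for example, be computed as in the following sketch; the per-user scores and the mean-50, step-10 scaling convention are assumptions for illustration.

```python
from statistics import mean, pstdev
from typing import Dict, List

def deviation_scores(scores: Dict[str, float]) -> Dict[str, float]:
    """Standard scores scaled so that the group mean is 50 and one standard
    deviation corresponds to 10 points, computed over the users in the group."""
    values: List[float] = list(scores.values())
    mu, sigma = mean(values), pstdev(values)
    if sigma == 0:
        return {u: 50.0 for u in scores}
    return {u: 50.0 + 10.0 * (v - mu) / sigma for u, v in scores.items()}

def ranking(scores: Dict[str, float]) -> List[str]:
    """User IDs ordered from best to worst evaluation score."""
    return sorted(scores, key=scores.get, reverse=True)

# Hypothetical per-user evaluation scores for one session.
session = {"user_a": 72.0, "user_b": 65.0, "user_c": 80.0, "user_d": 58.0}
print(ranking(session))
print({u: round(s, 1) for u, s in deviation_scores(session).items()})
```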
 The group analysis information storage unit 235 stores the group analysis information produced by the group analysis unit 217. FIG. 16 is a diagram showing a configuration example of the group analysis information stored in the group analysis information storage unit 235. The group analysis information broadly includes ranking information, status information, and attention information. The ranking information includes ranks and rankings based on the evaluation information, and category information such as whether a rank falls in the upper group or the lower group. The status information includes the degree of fatigue, the degree of concentration, and so on. The attention information includes information such as users who particularly need to be given guidance.
 The group analysis information presentation unit 218 presents the group analysis information stored in the group analysis information storage unit 235 to the user terminal 10 and the supporter terminal 40. To the user terminal 10, the group analysis information presentation unit 218 presents the user's ranking, deviation score, and category (such as upper or lower) among the users performing physical exercise at the same time, and also among users including those who performed the same physical exercise at different times. The group analysis information presentation unit 218 may additionally present the information stored in the evaluation information storage unit 132. To the supporter terminal 40, the group analysis information presentation unit 218 may present each user's ranking and deviation score information, and may present them in a way that is easy to see on the screen, for example by coloring the users at the top (or bottom) of the ranking, by enlarging the tile showing a given user, or by placing it at the top of the screen.
 FIG. 23 is a diagram showing an example of a screen that the group analysis information presentation unit 218 creates and presents on the user terminal 10. The figure illustrates the case where a moving image is captured in the squat mode. As shown in the figure, the screen 46 displays the user's rank within the group at line 461. Line 462 shows the number of repetitions of the exercise performed. Line 463 displays a real-time (or recorded) video of a trainer serving as a model.
 FIG. 24 is a diagram showing an example of a screen that the group analysis information presentation unit 218 creates and presents on the supporter terminal 40. On the supporter terminal 40, each user's evaluation results, group analysis information, and the like are displayed together with the moving image capturing each user.
 FIG. 25 is a diagram showing an example of a screen that the group analysis information presentation unit 218 creates and presents on the supporter terminal 40. On the screen 47, images capturing the users are arranged side by side, and, based on the evaluation and the group analysis, users in the group who particularly need guidance (including support and the like) are displayed so that the supporter can recognize them easily. On the screen 47, as an example, a mark is displayed at the upper right of the tile showing a user who needs guidance.
 FIG. 26 is a diagram showing an example of a screen that the group analysis information presentation unit 218 creates and presents on the supporter terminal 40. The screen 48 shows the details of each user and displays the evaluation results (bones, posture values, tool orientation values, the number of repetitions of the physical exercise, and so on), the group analysis results (the rank within the group, its changes, and so on), the advice and the like transmitted to the user by the improvement measure information transmission unit 216, and a function for viewing a recording of the captured images (or a partial video such as the few seconds before and after the point where the evaluation dropped). The screen 47 and the screen 48 may also be combined into a single screen.
 The reference information request receiving unit 219 receives a reference information request transmitted from the user terminal 10. The reference information request receiving unit 219 registers the information including the image data contained in the received reference information request (hereinafter referred to as image information) in the image data storage unit 231. FIG. 12 is a diagram showing a configuration example of the image information stored in the image data storage unit 231. As shown in the figure, the image information includes image data associated with the user ID of the photographed user. The image data is what was contained in the reference information request. Note that the reference information request may not contain image data.
 The reference information determination unit 220 searches for reference information that satisfies the user's needs based on the mode, the purpose, and the image information contained in the reference information request, together with the physical information, the evaluation information, and the like, and transmits it to the user terminal 10. The reference information determination unit 220 also accepts information indicating which of the reference information transmitted by the reference value selection information transmission unit 120 has been selected, and the image analysis unit 212 may perform analysis based on the selected reference information. Furthermore, even without a reference information request, the reference information determination unit 220 may transmit candidate reference information to the user based on the purpose, the physical information, the basic information, and so on. In addition, the reference information determination unit 220 may present, together with the reference value choices, an option by which the user requests that other reference values be presented; in that case it may further present options that serve as guidance for selecting the newly presented reference values, such as "increase the intensity" or "can be continued for a long time".
 For example, when the reference information request contains the weightlifting mode and the purpose "hypertrophy of the upper pectoralis major", the reference information determination unit 220 selects, from the plurality of pieces of reference information stored in the reference information storage unit, the reference information suited to training the upper pectoralis major. In this case, for example, reference information 1 "for upper pectoralis major training: push the arms diagonally upward", reference information 2 "for inner pectoralis major training: close the arms in front", and reference information 3 "for lower pectoralis major: push the arms diagonally downward" are stored in the reference information storage unit 232. The reference information determination unit 220 transmits the reference information corresponding to the purpose to the user terminal 10; when the user selects that reference information, the reference information determination unit 220 accepts the selection, and it becomes the reference information used by the evaluation unit 213 for the evaluation.
 Also, when the reference information request contains, for example, the weightlifting mode and the purpose "hypertrophy of the upper pectoralis major", the reference information determination unit 220 selects, from the plurality of pieces of reference information stored in the reference information storage unit, the reference information suited to training the upper pectoralis major, and it may further select the reference information based on the physical information. In this case, for example, reference information 1 "for upper pectoralis major training: push the arms diagonally upward", reference information 2 "for inner pectoralis major training: close the arms in front", and reference information 3 "for lower pectoralis major: push the arms diagonally downward" are stored, and there may be a relationship between them such that, unless the inner pectoralis major has been trained to some extent, training the upper pectoralis major is not very effective or carries a risk of injury. In that case, the reference information determination unit 220 may, based on the user's physical information at that time (for example, muscle strength, or the strength of the inner pectoralis major), first transmit reference information 2 "for inner pectoralis major training: close the arms in front" to the user terminal 10. In this case, it may additionally transmit training steps, advice, and the like such as "first train the inner pectoralis major (that is, reference information 2), and then train the upper pectoralis major (that is, reference information 1)".
 Also, when the reference information request contains, for example, the weightlifting mode and the purpose "hypertrophy of the upper pectoralis major", the reference information determination unit 220 selects, from the plurality of pieces of reference information stored in the reference information storage unit, the reference information suited to training the upper pectoralis major, and it may further select the reference information based on the evaluation information. In this case, for example, reference information 1 "for upper pectoralis major training: push the arms diagonally upward", reference information 2 "for inner pectoralis major training: close the arms in front", and reference information 3 "for lower pectoralis major: push the arms diagonally downward" are stored, and there may be a relationship between them such that the movement for training the upper pectoralis major is more difficult, so it is better to practice first with the movement that trains the inner pectoralis major. In that case, the reference information determination unit 220 may, based on information such as tool movement information (for example, whether the barbell is raised and lowered smoothly) contained in the user's evaluation information at that time (which can be regarded as the user's skill; for this purpose, the image analysis unit 212 may analyze the image information contained in the reference information request and the evaluation unit 213 may evaluate it), first transmit the lower-difficulty reference information 2 "for inner pectoralis major training: close the arms in front" to the user terminal 10. In this case, it may additionally transmit training steps, advice, and the like such as "first practice the movement that trains the inner pectoralis major (that is, reference information 2), and then train the upper pectoralis major (that is, reference information 1)".
 Also, when the reference information request contains, for example, the weightlifting mode and the purpose "hypertrophy of the upper pectoralis major", the reference information determination unit 220 selects, from the plurality of pieces of reference information stored in the reference information storage unit, the reference information suited to training the upper pectoralis major, and it may further select the reference information based on the past training history information contained in the physical information. In this case, for example, reference information 1 "for upper pectoralis major training: push the arms diagonally upward", reference information 2 "for inner pectoralis major training: close the arms in front", and reference information 3 "for lower pectoralis major: push the arms diagonally downward" are stored, and there may be a relationship between them such that, unless the inner pectoralis major has been trained to some extent, training the upper pectoralis major is not very effective or carries a risk of injury. In that case, when a training history of at least a certain number of times is confirmed based on, for example, the user's past training history information (for example, the number of times inner pectoralis major training has been performed, or the number of times the improvement measure information transmission unit 216 has already transmitted instructions to the user terminal 10 telling the user to perform inner pectoralis major training), the reference information determination unit 220 may transmit reference information 1 "for upper pectoralis major training: push the arms diagonally upward" to the user terminal 10. In this case, it may additionally transmit training steps, advice, and the like such as "the inner pectoralis major (that is, reference information 2) has already been trained sufficiently, so train the upper pectoralis major (that is, reference information 1)".
 Furthermore, when the reference information request contains, for example, the weightlifting mode and the purpose "hypertrophy of the upper pectoralis major", the reference information determination unit 220 selects, from the plurality of pieces of reference information stored in the reference information storage unit, the reference information suited to training the upper pectoralis major, but it may also send a menu or the like of one or more physical exercises to the user terminal 10 so that the user performs them, and select the reference information based on the evaluation of the physical exercise captured by the user. In this case, for example, reference information 1 "for upper pectoralis major training A: push the arms diagonally upward with a 20 kg barbell", reference information 2 "for upper pectoralis major training B: push the arms diagonally upward with a 40 kg barbell", and reference information 3 "pectoral training basics: 30 push-ups" are stored, and there may be a relationship between them such that the user can proceed to the next step once the pectoral muscles have been trained to some extent. In that case, the reference information determination unit 220 first transmits "pull-ups" to the user terminal 10 and has the user capture images of that physical exercise, and the image analysis unit receives the images and counts the number of repetitions and so on. If a certain standard is exceeded, the reference information determination unit 220 transmits reference information 1 "for upper pectoralis major training A: push the arms diagonally upward with a 20 kg barbell" to the user terminal 10 as the next step. If the standard is not exceeded, it may transmit reference information 3 "pectoral training basics: 30 push-ups" to the user terminal 10 in order to clear that standard. In this case, it may additionally transmit training steps, advice, and the like such as "to train the upper pectoralis major (that is, reference information 1 or 2), first do basic pectoral training with push-ups (reference information 3)".
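 One way to express the selection logic of the preceding examples (purpose-based selection constrained by a prerequisite drawn from physical information, evaluation information, or training history) is sketched below; the record fields, the prerequisite rule, and the threshold of five sessions are illustrative assumptions.

```python
from dataclasses import dataclass
from typing import Dict, List, Optional

@dataclass
class ReferenceInfo:
    ref_id: int
    purpose: str                          # which training goal this reference serves
    label: str
    prerequisite: Optional[int] = None    # ref_id that should be mastered first
    min_history: int = 0                  # sessions of the prerequisite needed

def choose_reference(purpose: str,
                     candidates: List[ReferenceInfo],
                     history: Dict[int, int]) -> ReferenceInfo:
    """Pick the reference matching the purpose; if its prerequisite has not
    been trained often enough, fall back to the prerequisite reference."""
    by_id = {r.ref_id: r for r in candidates}
    for ref in candidates:
        if ref.purpose != purpose:
            continue
        if ref.prerequisite is None:
            return ref
        if history.get(ref.prerequisite, 0) >= ref.min_history:
            return ref
        return by_id[ref.prerequisite]    # train the prerequisite first
    raise ValueError("no reference information for this purpose")

refs = [
    ReferenceInfo(1, "upper pectoralis", "press the arms diagonally upward",
                  prerequisite=2, min_history=5),
    ReferenceInfo(2, "inner pectoralis", "close the arms in front"),
]
print(choose_reference("upper pectoralis", refs, history={2: 1}).label)
# -> "close the arms in front" (the inner pectoralis training comes first)
```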
 Furthermore, the reference information determination unit 220 may present, together with the reference information presented to the user, a degree of recommendation based on the mode, the purpose, and the image information contained in the reference information request, the physical information, the evaluation information, and so on. In this case, the degree of recommendation may be indicated by a numerical value, such as a score out of 100 or a percentage, by a level such as 3 out of 5, by marks such as ○, ×, and △, or by symbols or colors conveying an impression of suitability such as a thumbs-up or thumbs-down, and the number of users who selected that reference value in the past and the evaluations of that reference value obtained from those users may also be presented together with it.
 Furthermore, the reference information determination unit 220 may derive the degree of recommendation of the reference information presented to the user from the degree of match between the purpose and the purpose set for the reference value; from the relationship between the evaluation information and the reference value (for example, if the evaluation information shows a fast bat swing speed, a reference value intended for people with a fast bat swing gives a high degree of suitability); or from the relationship between the physical information and the reference value (for example, if the physical information shows that the user is left-handed, a batting reference value for left-handed batters gives a high degree of suitability, and if the user's muscle strength is high, a reference value for which greater strength is more effective gives an even higher degree of suitability). These may be used alone or in combination to derive the degree of recommendation.
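 By way of illustration only, the combination of the suitability signals described above into a single degree of recommendation could be sketched as follows; the weights and the 0-to-1 normalization of the individual fits are assumptions introduced for this example.

```python
from typing import Dict, Optional

def recommendation_score(purpose_match: bool,
                         evaluation_fit: float,
                         physical_fit: float,
                         weights: Optional[Dict[str, float]] = None) -> float:
    """Combine the three suitability signals into a single 0-100 recommendation
    score.  The individual fits are assumed to be normalized to 0..1 beforehand."""
    w = weights or {"purpose": 0.5, "evaluation": 0.25, "physical": 0.25}
    score = (w["purpose"] * (1.0 if purpose_match else 0.0)
             + w["evaluation"] * evaluation_fit
             + w["physical"] * physical_fit)
    return round(100.0 * score, 1)

# Hypothetical: the purpose matches, the user's bat speed fits this reference
# fairly well, and the user's handedness fits it very well.
print(recommendation_score(True, evaluation_fit=0.7, physical_fit=0.9))  # 90.0
```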
 Although the present embodiment has been described above, the above embodiment is intended to facilitate the understanding of the present invention and is not intended to limit its interpretation. The present invention can be modified and improved without departing from its spirit, and the present invention also includes equivalents thereof.
 For example, in the present embodiment the image analysis is performed in the server device 20, but this is not a limitation; the image may be analyzed in the user terminal 10 to identify the positional relationships of the tool parts and body parts.
 Also, in the present embodiment the positions of the tool parts and body parts are assumed to be positions on the two-dimensional image, but this is not a limitation; three-dimensional positions may be used. For example, when the user terminal 10 includes a depth camera in addition to the camera 106, the three-dimensional positions of the parts can be identified based on the image from the camera 106 and the depth map from the depth camera. Alternatively, for example, three dimensions may be estimated from the two-dimensional image to identify the three-dimensional positions of the parts. A depth camera may also be provided in place of the camera 106, and the three-dimensional positions may be identified only from the depth map from the depth camera. In this case, the depth map may be transmitted from the user terminal 10 to the server device 20 together with, or instead of, the image data, and the image analysis unit 212 of the server device 20 may analyze the three-dimensional positions.
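 As an illustration of combining an image keypoint with a depth map to obtain a three-dimensional position, a standard pinhole back-projection is sketched below; the camera intrinsics and the wrist keypoint values are placeholder figures, since the embodiment does not specify a camera model.

```python
from typing import Tuple

def backproject(u: float, v: float, depth_m: float,
                fx: float, fy: float, cx: float, cy: float
                ) -> Tuple[float, float, float]:
    """Convert a pixel (u, v) with depth (meters) into camera-space XYZ using
    the pinhole model: X = (u - cx) * Z / fx, Y = (v - cy) * Z / fy."""
    x = (u - cx) * depth_m / fx
    y = (v - cy) * depth_m / fy
    return (x, y, depth_m)

# Placeholder intrinsics for a 640x480 depth camera.
FX, FY, CX, CY = 525.0, 525.0, 319.5, 239.5
wrist_px = (400.0, 220.0)   # wrist keypoint found in the color image
wrist_depth = 1.8           # meters, read from the aligned depth map
print(backproject(*wrist_px, wrist_depth, FX, FY, CX, CY))
```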
 Also, in the present embodiment an image of the user's body exercising with a tool is transmitted from the user terminal 10 to the server device 20, but this is not a limitation. The user terminal 10 may extract feature amounts from the image and transmit the feature amounts to the server device 20; or the user terminal 10 may estimate the tool parts and body parts based on the feature amounts, obtain the absolute positions of the parts (which may be positions on the XY coordinates of the image, actual-scale distances from a reference position (for example, the ground, the toes, the head, or the body's center of gravity), or positions in any other coordinate system) or the relative positional relationships between multiple tool parts, between multiple body parts, or between tool parts and body parts, and transmit these absolute positions or relative positional relationships to the server device 20.
 Also, in the present embodiment content prepared on the server device 20 side is provided as the improvement measure information, but this is not a limitation. For example, a reference value may be included, and marks or bones representing the correct movement or posture based on the reference value (the positions, orientations, and so on of the tool parts and the positions, angles, and so on of the body parts) may be displayed superimposed on the moving image or on a still image extracted from the moving image. This makes it easy to grasp what kind of movement or posture should be adopted.
 Also, in the present embodiment the orientations of the tool parts and the positions or movements (positions over time) of the tool parts and body parts are evaluated, but this is not a limitation; the position of a tool worn by the user may also be identified and evaluated.
 Also, in the present embodiment content such as advice is provided as the improvement measure, but, for example, a tool recommendation may be made. In this case, the server device 20 stores, in association with the user's physical information (height, weight, etc.), reference values for tools and their sizes (lengths, etc.), extracts the feature amounts of the tool the user is using from the image data to identify the shape of the tool, estimates the size of the tool based on that shape and the user's size contained in the physical information (for example, height), and, if the difference between the estimated tool size and the reference value is at least a predetermined threshold, can recommend a tool of the reference-value size. Furthermore, a tool suited to the purpose may be recommended based on information such as conditions on the tool itself (the weight of the barbell, etc.), how the tool is used, physical conditions (flexibility, etc.), and the positions, orientations, and movements of the tool parts.
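 By way of illustration only, the size-based tool recommendation described above could be sketched as follows; the 5 cm threshold and the bat-length figures are assumptions introduced for this example.

```python
def recommend_tool_size(estimated_length_cm: float,
                        reference_length_cm: float,
                        threshold_cm: float = 5.0) -> str:
    """If the tool length estimated from the image differs from the reference
    length for this user's body size by at least the threshold, recommend a
    tool of the reference size."""
    diff = abs(estimated_length_cm - reference_length_cm)
    if diff >= threshold_cm:
        return (f"A tool of about {reference_length_cm:.0f} cm is recommended "
                f"(the current tool is off by {diff:.0f} cm).")
    return "The current tool size is within the acceptable range."

# Hypothetical: a 170 cm tall user is estimated (from the image and the user's
# height) to be using an 84 cm bat, while the reference for that height is 78 cm.
print(recommend_tool_size(estimated_length_cm=84.0, reference_length_cm=78.0))
```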
 Also, in the present embodiment content such as advice is provided as the improvement measure, but, for example, the physical exercise being performed may be interrupted. In this case, the server device 20 stores, in association with the user's physical information (purpose, height, weight, etc.), reference values at which the physical exercise should be interrupted, and interrupts the physical exercise when the number of repetitions, the speed, and the like of the exercise the user is performing, determined from the image data, fall outside the reference values (for example, when the speed of lifting the barbell drops sharply, or when too many repetitions are performed at once). In this case, a comment instructing the user to stop may be output to the user terminal 10, the user may be notified by changing the display, for example by turning off the screen, a sound such as an alert sound may be emitted, or the user may be notified by vibration.
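 By way of illustration only, the interruption decision described above could be sketched as follows; the minimum bar speed and the maximum repetition count are hypothetical per-user reference values.

```python
from typing import List

def should_interrupt(lift_speeds: List[float],
                     min_speed: float,
                     reps_done: int,
                     max_reps: int) -> bool:
    """Interrupt the exercise when the most recent lift has become extremely
    slow compared to the allowed minimum, or when too many repetitions have
    been performed at once."""
    too_slow = bool(lift_speeds) and lift_speeds[-1] < min_speed
    too_many = reps_done > max_reps
    return too_slow or too_many

# Hypothetical thresholds stored per user: minimum bar speed 0.3 m/s, 12 reps.
print(should_interrupt([0.8, 0.6, 0.4, 0.25], min_speed=0.3,
                       reps_done=10, max_reps=12))   # True -> stop the set
```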
 Also, in the present embodiment content such as advice is provided as the improvement measure, but, for example, a determination of illness and physical exercise aimed at improving it may be presented. In this case, the server device 20 extracts candidate illnesses that the user is presumed to have from the symptoms entered by the user in the physical information and from the evaluation information, and presents a screening test for narrowing them down. When the user has performed the screening test and the candidate illnesses have been narrowed down, the system may recommend seeing a doctor, physical exercise for improvement, tools for performing that exercise, or items such as meals.
 Also, by estimating the positions of the tool parts, the server device 20 can estimate the speed, acceleration, movement distance, trajectory, and so on of the tool. Furthermore, by extracting patterns of change in the position of the tool over time, the server device 20 can estimate the number of occurrences of the pattern as the number of motions performed with the tool.
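 By way of illustration only, estimating the tool's speed and counting repetitions from its time-series positions could be sketched as follows; the sampling rate, the height threshold, and the barbell heights are assumptions introduced for this example.

```python
from typing import List, Tuple

def speeds(track: List[Tuple[float, float]], dt: float) -> List[float]:
    """Frame-to-frame speed of a tracked tool part from its (x, y) positions."""
    return [((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5 / dt
            for (x0, y0), (x1, y1) in zip(track, track[1:])]

def count_reps(heights: List[float], threshold: float) -> int:
    """Count repetitions as the number of upward crossings of a height
    threshold by the tool (e.g. the barbell shaft)."""
    return sum(1 for h0, h1 in zip(heights, heights[1:])
               if h0 < threshold <= h1)

# Hypothetical barbell heights (m) sampled at 4 Hz over two lifts.
heights = [0.5, 0.9, 1.4, 1.8, 1.3, 0.7, 0.6, 1.0, 1.6, 1.9, 1.2, 0.6]
track = [(0.0, h) for h in heights]
print(round(max(speeds(track, dt=0.25)), 2), "m/s peak speed")
print(count_reps(heights, threshold=1.5), "repetitions")
```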
 Also, in the present embodiment exercise is evaluated, but this is not a limitation; when a certain posture or movement is detected, a task for that motion may be proposed. In this case, the server device 20 may store a task, in place of the evaluation comment, in association with a single posture or movement or a series of them, and output that task.
 Also, in the present embodiment exercise is evaluated, but this is not a limitation; when the movement of a certain tool, the orientation of a tool, a posture, or the movement of a body part is detected, content that improves the physical exercise according to the purpose or the like may be presented, such as training, rehabilitation, or performance to be carried out, or stretching, strength training, posture work, and the like as preparation for them. In this case, the server device 20 may store the content of the training or the like to be carried out, in place of the evaluation comment, in association with a single movement of a tool part or a series of them, the orientation of the tool part, the body posture, or the movement of the body part, and output that content.
 Also, in the present embodiment exercise is evaluated, but this is not a limitation; motions performed by the user may be detected automatically. In this case, the server device 20 stores, as reference information, the positions of each tool part and the postures (the positions of each body part) for predetermined motions such as a shot or a pass, and can identify the motion performed by the user in the image by comparing the positions of the tool parts and body parts analyzed from the image with the reference information.
 Also, in the present embodiment images captured in the past are analyzed to evaluate exercise, but this is not a limitation; the analysis may be performed in real time, and when a predetermined motion is detected, the tactic to be taken next may be recommended. In this case, tactics may be stored in place of the evaluation comments in association with postures or movements, and the tactics may be output in real time.
 Also, in the present embodiment the supporter terminal 40 may double as the imaging terminal 50 or as the output terminal 60. In this case, the supporter terminal 40 takes the form of, for example, glasses, contact lenses, a hat, or an HMD (head-mounted display); a range close to the supporter's field of view is captured by the imaging function of the supporter terminal 40, and the captured image is sent to the server device 20 via the communication network 30. The evaluation unit 213 and the group analysis unit 217 process the image, and the results are sent to the supporter terminal 40 or the user terminal 10 via the communication network 30. As shown in FIG. 27, the supporter terminal 40 outputs information to the supporter or the user via, for example, a virtual image projection method, a retinal projection method, or another method, or via an interface such as a BMI (brain-machine interface) that uses brain activity such as brain waves to input characters, images, video, and the like to the brain by direct stimulation without going through the sensory organs. By performing this communication and processing at high speed, the supporter can, while watching the user's physical exercise, check the evaluation of that exercise and the in-group evaluation almost in real time. The user can likewise check the evaluation of their own physical exercise and the in-group evaluation almost in real time. In the supporter's field of view, the user's physical exercise is seen as a real image, and the results of the processing by the evaluation unit 213 and the group analysis unit 217 may be made visible to the supporter through the supporter terminal 40, superimposed on the field of view the supporter sees with the naked eye. Furthermore, when the supporter terminal 40 is an HMD or the like, the results of the processing by the evaluation unit 213 and the group analysis unit 217 may be presented to the supporter superimposed on the image captured by the supporter terminal 40.
 Also, in the present embodiment the user terminal 10 may double as the imaging terminal 50 or as the output terminal 60. In this case, the user terminal 10 takes the form of, for example, glasses, contact lenses, a hat, or an HMD (head-mounted display); a range close to the user's field of view is captured by the imaging function of the user terminal 10, and the captured image is sent to the server device 20 via the communication network 30. The evaluation unit 213 and the group analysis unit 217 process the image, and the results are sent to the user terminal 10 or the supporter terminal 40 via the communication network 30. As shown in FIG. 25, the supporter terminal 40 outputs information to the user or the supporter via, for example, a virtual image projection method, a retinal projection method, or another method, or via an interface such as a BMI (brain-machine interface) that uses brain activity such as brain waves to input characters, images, video, and the like to the brain by direct stimulation without going through the sensory organs. By performing this communication and processing at high speed, the user can, when viewing their own physical exercise through a mirror or the like, check the evaluation of that exercise and the in-group evaluation almost in real time.
Buying and selling of reference information
 Also, in the present embodiment, a plurality of reference values may exist, and the user may select one of those reference values for a fee.
 Also, in the present embodiment, the reference values may include ones created based on the physical exercise (including exercise performed using tools) of experts such as professional athletes.
 Also, in the present embodiment, the user terminal 10, the supporter terminal 40, the imaging terminal 50, and the output terminal 60 may be wearable terminals such as glasses-type terminals, and the results of capturing, analyzing, and evaluating with such a wearable terminal may be output to the wearable terminal; however, non-wearable user terminals 10, supporter terminals 40, imaging terminals 50, and output terminals 60 such as mobile terminals may exist at the same time, and the results of the analysis and evaluation may also be displayed on those non-wearable terminals.
 In the present embodiment, the execution of a given function and the storage of information are performed by the user terminal 10 or the server device 20; however, this is not limiting, and either one of these devices alone may execute the function and store the information. Alternatively, the functional units and the storage units may be provided in a distributed manner in a form different from that of the present embodiment.
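 The distributed arrangement of functional units and storage units can be sketched as a deployment map recording which device hosts which unit. The mapping shown is only an example and not a configuration defined in this document.
```python
# Hypothetical sketch: a deployment map assigning functional units and
# storage units to devices. The unit names follow the reference numerals
# used in this document; the assignment itself is an illustrative example
# of the "distributed" arrangement mentioned above.
DEPLOYMENT = {
    "user_terminal_10": ["imaging_unit_111", "evaluation_display_unit_114"],
    "server_device_20": ["image_analysis_unit_212", "evaluation_unit_213",
                         "group_analysis_unit_217", "reference_info_storage_232"],
}

def host_of(unit: str) -> str:
    """Return the device that hosts the given functional or storage unit."""
    for device, units in DEPLOYMENT.items():
        if unit in units:
            return device
    raise KeyError(unit)
```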
  10  User terminal
  20  Server device
  30  Communication network
  40  Supporter terminal
  50  Imaging terminal
  60  Output terminal
  101 CPU
  102 Memory
  103 Storage device
  104 Communication interface
  105 Output device
  106 Input device such as a camera
  111 Imaging unit
  112 Evaluation request transmission unit
  113 Evaluation information reception unit
  114 Evaluation display unit
  115 Checkpoint display unit
  116 Improvement measure request transmission unit
  117 Improvement measure information reception unit
  118 Improvement measure display unit
  119 Reference value request transmission unit
  120 Reference value selection information transmission unit
  130 Physical information storage unit
  131 Image storage unit
  132 Evaluation information storage unit
  133 Improvement measure storage unit
  201 CPU
  202 Memory
  203 Storage device
  204 Communication interface
  205 Output device
  206 Input device such as a camera
  211 Evaluation request reception unit
  212 Image analysis unit
  213 Evaluation unit
  214 Evaluation information transmission unit
  215 Improvement measure request reception unit
  216 Improvement measure information transmission unit
  217 Group analysis unit
  218 Group analysis presentation unit
  219 Reference value request reception unit
  220 Reference value determination unit
  231 Image data storage unit
  232 Reference information storage unit
  233 Evaluation condition information storage unit
  234 Improvement measure information storage unit
  235 Group analysis information storage unit

Claims (5)

  1.  A guidance support system for supporting guidance given by a supporter to a user who performs physical exercise, the system comprising:
      a reference value storage unit that stores a reference value relating to the position of at least one body part or tool part;
      a part identification unit that analyzes an image of the physical exercise of the user to identify the body part or the tool part;
      an evaluation unit that compares the position of the body part or the tool part in the image with the reference value to determine an evaluation value of the physical exercise; and
      a supporter terminal having an imaging function that captures the image of the physical exercise of the user and a display function that displays the evaluation value.
  2.  The guidance support system according to claim 1, wherein the supporter terminal is a wearable computer.
  3.  The guidance support system according to claim 1 or 2, wherein a plurality of the reference values exist, and the system further comprises a reference value determination unit that presents one or more of the reference values to the user on the basis of at least one of the user's purpose, the user's physical information, and the evaluation value.
  4.  The guidance support system according to any one of claims 1 to 3, further comprising a reference value selection information transmission unit with which the user selects a reference value from among candidate reference values.
  5.  A guidance support method for supporting guidance given by a supporter to a user who performs physical exercise, the method comprising:
      a reference value storage step of storing a reference value relating to the position of at least one body part or tool part;
      a part identification step of analyzing an image of the physical exercise of the user to identify the body part or the tool part; and
      an evaluation step of comparing the position of the body part or the tool part in the image with the reference value to determine an evaluation value of the physical exercise,
      wherein a supporter terminal has an imaging function that captures the image of the physical exercise of the user and a display function that displays the evaluation value.
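For orientation only, the comparison in the evaluation step of claims 1 and 5 can be sketched as a distance between each identified part position and the stored reference position, turned into a score. The normalization and the scoring formula below are assumptions and do not appear in the claims.
```python
# Hypothetical sketch of the evaluation step in claims 1 and 5: compare the
# position of each identified body part (or tool part) in the image with the
# stored reference value and turn the total deviation into a score. The
# normalized coordinates and the scoring formula are assumptions, not claim
# language.
from math import hypot

def evaluation_value(detected: dict[str, tuple[float, float]],
                     reference: dict[str, tuple[float, float]]) -> float:
    """Return a score in [0, 100]; 100 means every part matches the reference."""
    deviations = [
        hypot(x - rx, y - ry)
        for part, (x, y) in detected.items()
        if part in reference
        for (rx, ry) in [reference[part]]
    ]
    if not deviations:
        return 0.0
    mean_dev = sum(deviations) / len(deviations)  # average normalized distance
    return max(0.0, 100.0 * (1.0 - mean_dev))
```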
PCT/JP2021/029301 2020-08-07 2021-08-06 Guidance support system WO2022030619A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2022541749A JPWO2022030619A1 (en) 2020-08-07 2021-08-06

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2020-134403 2020-08-07
JP2020134403 2020-08-07

Publications (1)

Publication Number Publication Date
WO2022030619A1 true WO2022030619A1 (en) 2022-02-10

Family

ID=80118135

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/029301 WO2022030619A1 (en) 2020-08-07 2021-08-06 Guidance support system

Country Status (2)

Country Link
JP (1) JPWO2022030619A1 (en)
WO (1) WO2022030619A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7359349B1 (en) 2022-07-07 2023-10-11 Co-Growth株式会社 Evaluation support system, information processing device control method, and information processing device control program

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180104541A1 (en) * 2016-09-28 2018-04-19 Bodbox, Inc. Evaluation And Coaching Of Athletic Performance
WO2019008771A1 (en) * 2017-07-07 2019-01-10 りか 高木 Guidance process management system for treatment and/or exercise, and program, computer device and method for managing guidance process for treatment and/or exercise
JP2019016254A (en) * 2017-07-10 2019-01-31 株式会社FiNC Method and system for evaluating user posture
JP2019058285A (en) * 2017-09-25 2019-04-18 パナソニックIpマネジメント株式会社 Activity support method, program, and activity support system
JP2019180539A (en) * 2018-04-03 2019-10-24 パナソニック インテレクチュアル プロパティ コーポレーション オブアメリカPanasonic Intellectual Property Corporation of America Moving image list creation method and server
JP2020080096A (en) * 2018-11-14 2020-05-28 Kddi株式会社 Object identifying apparatus, identifying system and identifying method

Also Published As

Publication number Publication date
JPWO2022030619A1 (en) 2022-02-10

Similar Documents

Publication Publication Date Title
US9878206B2 (en) Method for interactive training and analysis
US9025824B2 (en) Systems and methods for evaluating physical performance
CA2819067C (en) Systems and methods for performance training
Velloso et al. Qualitative activity recognition of weight lifting exercises
KR100772497B1 (en) Golf clinic system and application method thereof
US9223936B2 (en) Fatigue indices and uses thereof
US20130178960A1 (en) Systems and methods for remote monitoring of exercise performance metrics
JP2020174910A (en) Exercise support system
JP2022043264A (en) Motion evaluation system
US20140039353A1 (en) Apparatus and Method of Analyzing Biomechanical Movement of an Animal/Human
US20220180634A1 (en) Method for Teaching Precision Body Movements and Complex Patterns of Activity
WO2022030619A1 (en) Guidance support system
WO2021261529A1 (en) Physical exercise assistance system
JP2005111178A (en) Motion training display system
US20230285806A1 (en) Systems and methods for intelligent fitness solutions
US20170312577A1 (en) System and Method for Sport Performance Monitoring, Analysis, and Coaching
Velloso et al. Towards qualitative assessment of weight lifting exercises using body-worn sensors
US20220370853A1 (en) J-sleeve system
WO2022145563A1 (en) User-customized exercise training method and system
WO2023127870A1 (en) Care support device, care support program, and care support method
KR20220088862A (en) Quantified motion feedback system
AU2014232710A1 (en) Systems and methods for evaluating physical performance
JP2021068069A (en) Providing method for unmanned training
WO2023275940A1 (en) Posture estimation device, posture estimation system, posture estimation method
Noorbhai A systematic review of the batting backlift technique in cricket

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21854537

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2022541749

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21854537

Country of ref document: EP

Kind code of ref document: A1